When I run the command:
exec master..xp_cmdshell 'NET USE'
from the analyzer, the box responds that there are no entries in the list.
After that, I run the command:
exec master..xp_cmdshell 'NET USE Z: /DELETE'
after which the box responds with "The network connection could not be found."
So far, that is all okay.
The weird thing is:
exec master..xp_cmdshell 'NET USE Z: \\MACHINE\SHARENAME'
results in "The local device name is already in use."
The machine in this particular case is the box itself. I have no problem accessing disks on other systems. I can see the share using the view command. There's no connection limit on the share itself, and I can connect to the share from another SQL box with the same user.
I don't know why it won't budge; it worked like a charm before. After six months or so it just stopped. Has anyone seen or solved this behaviour?
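Put together, the sequence I am running is roughly this (the machine and share names below are placeholders for the real ones):

-- All three commands run in the security context of the account the SQL Server service runs as,
-- not the login issuing the query.
exec master..xp_cmdshell 'NET USE'
exec master..xp_cmdshell 'NET USE Z: /DELETE'
exec master..xp_cmdshell 'NET USE Z: \\MACHINE\SHARENAME'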
I have two identical SQL tasks in different packages. The connection manager is defined for the same database in both. One of the SQL tasks works, and the other one throws this error: [Execute SQL Task] Error: Failed to acquire connection "pdsprod.pdsdataread". Connection may not be configured correctly or you may not have the right permissions on this connection.
This is completely mind-boggling. I have compared both SQL tasks property by property and they are exactly the same. What is going on?
By the way, I am on a 64-bit box with Run64BitRuntime = false.
I am using SQL Express for unit testing my application. In the unit tests, I use a local database file that is attached automatically in SQL Express when the unit test uses it.
FYI, in the unit test I use the following connection string:
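It is the usual auto-attach style of connection string, along these lines (the exact file path and options in my tests may differ slightly):

Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\MyUnitTestDatabase.mdf;Integrated Security=True;User Instance=True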
By accident, the MyUnitTestDatabase.mdf file was marked as ReadOnly. So, after executing the unit test several times, the attached databases appear greyed out in SQL Server Management Studio Express. That's normal! The problem I want to report here occurs when I execute the following script in SQL Server Management Studio Express:
use master
go
sp_MSForEachDB 'Print ''?'''
go
In .\SQLExpress, I currently have 8 databases (the last three are the ones attached by the unit tests; they are ReadOnly):
- master
- model
- msdb
- tempdb
- MyUnitTestDatabase (the original db copied and used by the unit tests; it is not ReadOnly)
- 1E6AA4A60F3733D37F016842D4626B8B_X34058MYSERVICETESTRESULTSX34058_N17400 2008-03-12 17_22_03OUTMYUNITTESTDATABASE.MDF
- ADA9F382DFBC95C8334EF95336C98274_X34058MYSERVICETESTRESULTSX34058_N17400 2008-03-12 17_13_57OUTMYUNITTESTDATABASE.MDF
- F00BF38C8BB8F07D37FCC4E918CF815E_X34058MYSERVICETESTRESULTSX34058_N17400 2008-03-12 17_10_04OUTMYUNITTESTDATABASE.MDF
When executed, the script above sometimes displays all the databases and sometimes only the first 4?!
I did a demo for various colleagues here, pressing F5 many times in the script window. It seems that it displays 4 names or 8 names "at random"... (I always wait for the message "Query executed successfully" before pressing F5 again).
I have to understand the problem here because I use sp_MSForEachDB to detach all the databases at the end of the Unit Tests and it also fails from time to time...
Thanks in advance for any tip that could help me find the origin of this problem.
V.
PS: FYI, here is the stored proc I use to automatically detach the databases at the end of the unit tests:
declare @spid int
declare @killstatement nvarchar(10)
IF @database like '%TESTRESULTS%'
BEGIN
    -- Declare a cursor to select the users connected to the specified database
    declare c1 cursor for
        select request_session_id from sys.dm_tran_locks
        where resource_type = 'DATABASE' AND DB_NAME(resource_database_id) = @database
    open c1
    fetch next from c1 into @spid
    while @@FETCH_STATUS = 0
    begin
        -- Don't kill the connection of the user executing this statement
        IF @@SPID <> @spid
        begin
            -- Construct dynamic sql to kill spid
            set @killstatement = 'KILL ' + cast(@spid as varchar(3))
            exec sp_executesql @killstatement
        end
        fetch next from c1 into @spid
    end
    close c1
    deallocate c1

    exec msdb.dbo.sp_delete_database_backuphistory @database
    exec master.dbo.sp_detach_db @database, 'true'
END
END
Possibly I was not detaching the databases in a "clean" way and my system databases are now corrupted? In such a case, what should I do in addition to the code above to correctly detach the databases?
Hi all, I'm encountering a weird behaviour: I have a .NET application that should allow the user to select the Server and Catalog where a DB will be installed.
I create a form similar to the one in SQL Server, so I have to fill the Server DropDownList. This is my code:

SqlDataSourceEnumerator instance = SqlDataSourceEnumerator.Instance;
System.Data.DataTable table = instance.GetDataSources();
foreach (System.Data.DataRow row in table.Rows)
{
    string strDBName, strInstance = row[1].ToString();
    if ((strInstance == null) || (strInstance == ""))
        strDBName = String.Format("{0}", row[0]);
    else
        strDBName = String.Format("{0}\\{1}", row[0], strInstance);
    Program.WriteLog("Row: " + row[0] + ", " + row[1] + " (" + row[3] + ")");
}

Program.WriteLog just writes a string to the log file.
Sometimes I need to restart SQL Server, so I run (with the obvious checks):

CtrlDatabase = new ServiceController(strServiceName);
CtrlDatabase.Stop();
CtrlDatabase.Start();
Well, the first time, before restarting, the log file shows:

Row: WKS08, COMPACS (8.00.194)

However, if I run Stop & Start, the log file shows:

Row: WKS08, ()

SqlDataSourceEnumerator no longer seems able to read the data. Why? How can I fix it?
Hi, this thread is a reformulation of a prior thread. I created a login 'Network Service' at server level in Management Studio Express. I use Windows authentication. Then I defined a user for my database which is associated with the login 'Network Service', because the ASP.NET application uses that account (IIS 6.0). This user received the db_datareader and db_datawriter roles. This works.

Now I experimented a little bit: I removed the login 'Network Service' from the logins at server level. Result: the application still works. Then I removed the BUILTIN\Users login from the login list at server level. Result: I get the error "login failed for Network Service". I then recreated the login 'Network Service' at server level, but not the BUILTIN\Users login. Result: it works again.

My conclusion is: one of the two logins must be in the list, Network Service or BUILTIN\Users. Is this right? Why do I get that error only when both logins are removed, and not when only Network Service is removed? Thanks
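For reference, the setup described above amounts to roughly this (a sketch; the database name is a placeholder for my real one):

USE master;
CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS;
GO
USE MyDatabase;
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
EXEC sp_addrolemember 'db_datareader', 'NT AUTHORITY\NETWORK SERVICE';
EXEC sp_addrolemember 'db_datawriter', 'NT AUTHORITY\NETWORK SERVICE';
GO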
This is the actual statement displayed from Response.Write in classic ASP:

INSERT INTO WOTasks (WoNum,TaskNum,TaskDesc,TaskMemo,Account,ModifyDate,Estimate,TaskHours,Unit,UnitCost,TotalCost)
SELECT '06-012497',TaskNum,TaskDesc,TaskMemo,Account,'2006-Oct-3',1,TaskHours,Unit,UnitCost,TotalCost
FROM Tasks WHERE procnum = '000002'

There are 4 records returned from the SELECT part of the statement. In some situations, 4 records are inserted into the WOTasks table; in others, only 1 record is inserted. I can't find out why 1 record, instead of 4, is inserted.

A form page submits the form to the save page using the post method. The above statement is contained in the save page. When one of the form textboxes is filled, 1 record is inserted. When the textbox is not filled, 4 records are inserted. You may think the textbox has something to do with the behaviour. I also think so, but the content of the textbox does not affect the SQL statement. In both cases, the insert statement is the same. In the actual code, only the strings in quotes are variables and the rest is hardcoded. When I run the statement in SQL Server, 4 records are affected. There is no such problem when connected to Access.

The actual code is below:

Sub AddTask(ByVal proc, ByVal wonum)
    Dim sSQL
    sSQL = "INSERT INTO WOTasks (WoNum,TaskNum,TaskDesc,TaskMemo,Account,ModifyDate,Estimate,TaskHours,Unit,UnitCost,TotalCost) SELECT '" & wonum & _
        "',TaskNum,TaskDesc,TaskMemo,Account,'" & curDate & "',1,TaskHours,Unit,UnitCost,TotalCost FROM Tasks WHERE procnum='" & proc & "'"
    'Response.Write sSQL:Response.End
    conn.Execute sSQL, , 128
End Sub
I am using MSDE and Analysis Services (latest packs), and the same installation on the same machine has been working great for the last 18 months or so, until yesterday. Whenever I try to open a DTS package (in order to edit it), the machine just goes into a coma... I have tried restarting many times, but to no avail.

Can someone kindly guide me on what I should look for in order to solve this?
Hi there, I wonder if one of you worthy folks can help me out with some strange behaviour exhibited by a piece of SQL. It's my first post here, so please be gentle. :)
Here is my simple example :-
<my test table>
create table test (ind int, message varchar(255))
insert into test (ind, message) values (1,'date=01/06/2006')
insert into test (ind, message) values (1,'date=20/12/2005')
insert into test (ind, message) values (2,'test')
The first query is
select * from test t1 where t1.ind in (select max(ind) from test t2 where t2.ind = t1.ind and t2.message like 'date=%' )
Fine... 2 rows.

The second query:
select * from test t1 where t1.ind =1 and convert(datetime, (SUBSTRING(Message, CHARINDEX('=',Message,0)+1, 10)),103) > getdate()
Fine, the same 2 rows...

But if I try to combine the two clauses in
select * from test t1 where t1.ind in (select max(ind) from test t2 where t2.ind = t1.ind and t2.message like 'date=%' ) and convert(datetime, (SUBSTRING(Message, CHARINDEX('=',Message,0)+1, 10)),103) > getdate()
I get:

Server: Msg 241, Level 16, State 1, Line 1
Syntax error converting datetime from character string.
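For reference, a version of the combined query with the conversion guarded by a CASE (just a sketch against the same test table; I have not confirmed whether it avoids the error) would be:

select *
from test t1
where t1.ind in (select max(ind) from test t2
                 where t2.ind = t1.ind and t2.message like 'date=%')
  and case when t1.message like 'date=%'
           then convert(datetime, substring(t1.message, charindex('=', t1.message, 0) + 1, 10), 103)
      end > getdate()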
Hello All,

The following script is reproducing the problem, assuming you have the Northwind database on the server. Please note it gives you the error message on line 12.

USE tempdb
GO
sp_addlinkedserver 'Test17'
GO
sp_setnetname 'Test17', @@SERVERNAME
GO
IF EXISTS (SELECT 1 FROM dbo.sysobjects WHERE id = object_id(N'[dbo].[This_works]') and OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE [dbo].[This_works]
GO
CREATE PROCEDURE This_works
    @UseLinkedServer bit = 0
    -- WITH RECOMPILE -- Does not help
AS
SET NOCOUNT ON
IF @UseLinkedServer = 1 -- Linked Server
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.sysobjects where id = object_id(N'[dbo].[Orders_TMP]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
        DROP TABLE dbo.Orders_TMP
    SELECT * INTO dbo.Orders_TMP FROM Test17.Northwind.dbo.Orders
END
ELSE -- Local
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.sysobjects where id = object_id(N'[dbo].[Orders_TMP]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
        DROP TABLE dbo.Orders_TMP
    SELECT * INTO dbo.Orders_TMP FROM Northwind.dbo.Orders
    SELECT 1 FROM dbo.Orders_TMP WHERE 1 = 2 -- Why do I need this line?
END
BEGIN TRANSACTION
Select 'Line 25'
SELECT COUNT(*) FROM dbo.Orders_TMP
COMMIT
GO
IF EXISTS (SELECT 1 FROM dbo.sysobjects WHERE id = object_id(N'[dbo].[This_does_not]') and OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE [dbo].[This_does_not]
GO
CREATE PROCEDURE This_does_not
    @UseLinkedServer bit = 0
    -- WITH RECOMPILE -- Does not help
AS
SET NOCOUNT ON
IF @UseLinkedServer = 1 -- Linked Server
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.sysobjects where id = object_id(N'[dbo].[Orders_TMP]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
        DROP TABLE dbo.Orders_TMP
    SELECT * INTO dbo.Orders_TMP FROM Test17.Northwind.dbo.Orders
END
ELSE -- Local
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.sysobjects where id = object_id(N'[dbo].[Orders_TMP]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
        DROP TABLE dbo.Orders_TMP
    SELECT * INTO dbo.Orders_TMP FROM Northwind.dbo.Orders
    --SELECT 1 FROM dbo.Orders_TMP WHERE 1 = 2 -- Why do I need this line?
END
BEGIN TRANSACTION
Select 'Line 25'
SELECT COUNT(*) FROM dbo.Orders_TMP
COMMIT
GO
PRINT 'This_works'
EXECUTE This_works 0
PRINT ' '
PRINT 'This_does_not'
EXECUTE This_does_not 0

Thanks for any help or hint,
Igor Raytsin
We have an application that executes a few queries against an SQL Server 2005 (64-bit) database. Since there can be several instances of the application running at any given time, and parts of the logic must be serialized, we've been using sp_getapplock and sp_releaseapplock. This has all been working fine since RC1, on which the system was released. However, after installing SP2 about a week ago, we have been having problems. The serialized portion of the code almost always stalls now.
To see what is happening we've been using both Management Studio and Profiler. We have two applications running, let's call them A and B. Both create a prepared statement which begins with a call to sp_getapplock and ends with sp_releaseapplock. In between, some tables are queried and inserts may be made into others. The accessed tables are never used anywhere else but in the serialized code. This is what is happening:
Application A: Calls sp_getapplock.
Application A: Queries a table.
Application B: Calls sp_getapplock.
Application A: Inserts a row into a table.
Application A: Calls sp_releaseapplock.
Application B: Waits indefinitely (or at least more than 4 hours, after which we killed the spid).
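For context, the serialized section follows roughly this pattern (a minimal sketch with a placeholder lock name and placeholder tables, not our actual statements):

DECLARE @result int;
EXEC @result = sp_getapplock
    @Resource    = 'MySerializedSection',   -- placeholder lock name
    @LockMode    = 'Exclusive',
    @LockOwner   = 'Session',
    @LockTimeout = -1;                      -- wait for the lock indefinitely

IF @result >= 0
BEGIN
    -- placeholder work: the real code queries some tables and inserts into others
    SELECT TOP (1) 1 FROM dbo.SomeTable;
    INSERT INTO dbo.SomeOtherTable (SomeColumn) VALUES (1);

    EXEC sp_releaseapplock
        @Resource  = 'MySerializedSection',
        @LockOwner = 'Session';
END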
Profiler cannot detect any deadlocks when this is happening. There are no blocking operations according to Management Studio. I can see the application lock having been set when I look at the spid for Application B in Management Studio.
Since this started to occur frequently after installing SP2 and had not been seen before, we are wondering if any changes have been made that could cause this behaviour. Has anyone else had problems using application locks, where a query stalls indefinitely waiting for the lock to be released? If so, how did you resolve it?
Any suggestions or ideas are welcome, Thanks, Lars
I was trying to achieve paging using a CTE, but ran into the following weird thing. The CTE allows me to use a variable as the ORDER BY field, although the CTE doesn't seem to care at all what is in there. Has anyone seen this, or can anyone maybe explain it?
USE AdventureWorks;
GO
DECLARE @SortExpression Varchar(50)
Set @SortExpression = 'SalesPersonID ASC';
WITH Sales_CTE (RowNumber, SalesPersonID, NumberOfOrders, MaxDate)
AS
(
SELECT
ROW_NUMBER() OVER(Order by @SortExpression) RowNumber,
SalesPersonID, COUNT(*), MAX(OrderDate)
FROM Sales.SalesOrderHeader
GROUP BY SalesPersonID
)
Select * From Sales_CTE;
WITH Sales_CTE1 (RowNumber, SalesPersonID, NumberOfOrders, MaxDate)
AS
(
SELECT
ROW_NUMBER() OVER(Order by SalesPersonID ASC) RowNumber,
When I run the package from the Business Intelligence development environment, the DT_DBDATE cast converts my date column (correctly) into the European date format dd/mm/yyyy, and it is inserted into the SQL Server table as such.

When I run the package as a job, the same date is inserted into the database as mm/dd/yyyy.

So, if I have 3rd January 2007 in the source, in the first case I'll find 03/01/2007 in the database. When I run the package as a job, I find 01/03/2007 in the db.
The problem comes when I run various select statements: the 01/03/2007 behaves as if it were 1st March 2007.
How can I avoid inserting the American date format into the db?
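To illustrate the ambiguity, the same literal flips meaning with the session's DATEFORMAT, while an ISO-style literal does not (a small demo, not the package code):

SET DATEFORMAT dmy;
SELECT CONVERT(datetime, '03/01/2007');   -- 3rd January 2007

SET DATEFORMAT mdy;
SELECT CONVERT(datetime, '03/01/2007');   -- 1st March 2007

SELECT CONVERT(datetime, '20070103');     -- always 3rd January 2007, regardless of DATEFORMAT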
Hi to all! I have tried to install ADS on a Windows Mobile 5 device. I am able to connect to the pocket database. However, when I try to connect to the database that's on the desktop, I see a little window (probably a message) with nothing in it, and when I tap OK I see a message telling me that the connection to the desktop database was not made! I had done the configuration beforehand. Can you tell me what is going wrong?
using System;
using System.Data.SqlServerCe;
using System.Data;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            using (SqlCeConnection conn = new SqlCeConnection(@"data source=d:\mydb.sdf"))
            {
                conn.Open();

                int counter = 0;
                using (SqlCeCommand cmd = new SqlCeCommand("products", conn))
                {
                    using (SqlCeDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            counter++;
                        }
                    }
                }

                conn.Close();
                Console.WriteLine("Row count {0}", counter); // should show row count 2 but shows row count 3 instead
            }

            Console.ReadKey();
        }
    }
}

I'm setting the starting range to a date and a product name ("PC 2"), but it seems that the data reader only sees the first array member and completely ignores the product name, so it shows all 3 rows instead of 2.
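The range call I am making is along these lines (a sketch with a placeholder index name and date; not the exact code):

cmd.CommandType = CommandType.TableDirect;
cmd.IndexName = "IX_Date_ProductName";   // placeholder name for the composite index on (date, product name)
cmd.SetRange(DbRangeOptions.InclusiveStart,
             new object[] { new DateTime(2007, 1, 1), "PC 2" },   // start of the range: date + product name
             null);                                               // no end of the range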
We have this website which uses SQL Express as its database engine. Sometimes certain features of the website stop working, like the membership provider and other database-related things. I have described the problem in more detail here: http://forums.asp.net/t/1172253.aspx

In short, the problem is: one query with fixed inputs does not always return the same results, even though the data has not changed; you restart SQL Express and the problem resolves itself! I think it's a problem with SQL Express, because when you restart SQL Express everything starts working again.

Our database is kind of big, above 500 MB, with up to 50 concurrent users. Our machine has a 3.2 GHz CPU and 512 MB of RAM, and our application is the only application running there. What do you think, please?
I have a scheduled job that inserts some records into a table. It fails with the following message:

Violation of PRIMARY KEY constraint 'PK_FuturesOut'. Cannot insert duplicate key in object 'FuturesOut'. [SQLSTATE 23000] (Error 2627) The statement has been terminated. [SQLSTATE 01000] (Error 3621). The step failed.
The strange thing about this is, if I copy the SQL statement from the job and paste it into Query Analyzer, it works without any modifications.
If anyone can explain this I would be most grateful.
My question: if I add an index on column B of tblTable (B is not used in the view's WHERE clause, but it is used in the s-proc's WHERE B > 6 against the view), will it give a performance improvement, assuming that this condition would benefit from the index if it were in the view itself?
I guess I could also put it this way: can an index on a column of a table improve the performance of a condition applied to a view that uses that table?
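A minimal sketch of the kind of setup I am asking about (names are placeholders, not the real objects):

CREATE TABLE dbo.tblTable (A int, B int, C varchar(50));
GO
CREATE VIEW dbo.vwTable
AS
SELECT A, B, C FROM dbo.tblTable WHERE A = 1;   -- the view filters on A, not on B
GO
CREATE PROCEDURE dbo.usp_GetRows
AS
SELECT A, B, C FROM dbo.vwTable WHERE B > 6;    -- the s-proc adds the condition on B
GO
CREATE INDEX IX_tblTable_B ON dbo.tblTable (B); -- the index in question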
I use bcp fairly often in SQL Server 2000 but have never run across this before. In a 512-length SQLCHAR column containing notes, when two consecutive spaces are encountered in the data, bcp is replacing them with something else in the output.
I figured it was a problem with my format file, but I have not found enough specific info on the MSDN site to resolve this.
This is my bcp command:
bcp "SELECT DivCode,CommCode,ContactID,substring(Notes, 0, 512) from frep.dbo.BeBackExtract" queryout "D:BeBackDetail.dat" -f "D:BeBackDetail.fmt" -e "D:BeBackDetailErrors.dat" -U user -P pass -S server
Incidentally, I also tried using the REPLACE function, which seems to work great until I try to replace two spaces with one space, in which case it doesn't do anything.
I have a DTS package that users of the database can run which basically acts like a 'live update' (as the database is based on values produced by another system), and it takes roughly 30 or so seconds to run...
The DTS package is not going to have a particularly high hit rate, but I am interested in knowing what will happen if a user (User 1) attempts to run the package when it is already in use by another user (User 2).
Will User 1 simply have to wait until the package completes for User 2?
OR
Will a new, separate 'instance' of the package be used by User 1?
Hello all,

I have what I hope is a simple question: does SQL Server have an 'all-or-nothing' locking policy? Or does it acquire as many locks as it can and then sit and wait for the rest?

Example:

SELECT * FROM TABLE_A INNER JOIN TABLE_B ON TABLE_A.dbid = TABLE_B.dbid

Normally a SHARED lock would be acquired on both objects (please correct me if I'm wrong). But let's say TABLE_B was being updated by another process at the same time, and so we couldn't get the shared lock. Would the dbms go ahead and acquire the shared lock on TABLE_A and then wait for the other lock, or would it not acquire any lock at all until locks on both TABLE_A and TABLE_B were available?

I ask because I'm investigating a deadlocking problem that's driving me mad :)

Thanks,
Tommy.
Hi,

In order to establish a security-enhanced SQL Server setup, I tried to switch off network access by disabling all networking protocols, so that the server can only be reached through a pipe. Nevertheless, the server was visible in the network and could be accessed from all clients. Does anybody know what is going on here?

Georg
Hi everybody,

I have a complex view that includes a GROUP BY clause. I'm trying to join this view with a table in a very simple query.

The problem is that the optimizer is not using the table data as input for the view (I expect this because I have arguments on the table, but not on the view), but is executing the view in a separate step and then joining it to the table with a merge/hash join. This is obviously very slow.

I tried to force nested loops by using hints, but it still doesn't use the table data as input. Has anybody ever seen this?

Thanks in advance...
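The shape of the query is roughly this (placeholder names; the real view is considerably more complex than this):

CREATE VIEW dbo.vwOrderTotals
AS
SELECT CustomerID, COUNT(*) AS OrderCount, SUM(Amount) AS Total
FROM dbo.Orders
GROUP BY CustomerID;
GO

-- The filter is on the table side only; the hope is that the optimizer pushes the
-- matching CustomerIDs into the view instead of aggregating everything first.
SELECT c.CustomerID, c.Region, v.OrderCount, v.Total
FROM dbo.Customers c
JOIN dbo.vwOrderTotals v ON v.CustomerID = c.CustomerID
WHERE c.Region = 'North';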
I wonder if it's possible to keep the connections previously opened in your SQL Server Management Studio.
I mean, I open it and connect to sql1, sql2, sql3, etc., without running queries or anything like that, and then close and re-open it. What happens? None of them appears again. You could create a project, of course... but it's not the same, because the .ssmssln project file you create is intended for keeping .sql files.
I'm talking about the same behaviour as when you ran the old Enterprise Manager inside the Microsoft Management Console and you had the ability to make groups and so on.
I am seeing strange results with a query. I have two tables, let's call them Table1 and Table2. Table1 has an ID field; Table2 does not have an ID field. To be sure I wasn't blind, the query
'SELECT ID FROM Table2'
returns: Invalid column name 'ID'. OK. Now when I run the query
'SELECT * FROM Table1 WHERE ID IN (SELECT ID FROM Table2)'
it returns all the records from Table1.
What gives? Is this a bug, or am I missing something?
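For anyone who wants to reproduce it, a minimal setup looks something like this (placeholder tables; the real ones obviously have more columns):

CREATE TABLE Table1 (ID int, Name varchar(50));
CREATE TABLE Table2 (OtherCol int);

INSERT INTO Table1 VALUES (1, 'a');
INSERT INTO Table1 VALUES (2, 'b');
INSERT INTO Table2 VALUES (99);

SELECT ID FROM Table2;                                      -- fails: Invalid column name 'ID'
SELECT * FROM Table1 WHERE ID IN (SELECT ID FROM Table2);   -- returns every row of Table1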
I am facing a strange problem. SSIS is executing a piece of code which is inside a comment!!!

I have got a master package which executes a number of other packages. When I right-click a package and click "Execute task", SSIS respects the comments and ignores the code inside them.

But when I execute the master package, SSIS executes the piece of code which is inside the comment!!!

The only thing I can think of is that, when executed from the master package, SSIS somehow picks up a very old precompiled binary. BTW, PreCompile = false for the script component, so it shouldn't be using a precompiled binary at all.

I am sure it will be fixed if I delete and recreate this particular script component with the same script. I (luckily!) picked up this problem, but the thought of SSIS not executing the intended binary scares the hell out of me.

Any thoughts??? Or is it just me, having happened to use SSIS on a Friday?
I have the following bit of code that is an onclick event to save information from text boxes and list boxes. I cut out a bunch that was irrelevant, because ALL items in the text boxes save fine. My biggest question, which should lead to the answers for the rest of the code, is about lines 53-57... That stored procedure does not appear to run. All it is set to do is delete, and the code is DEFINITELY getting passed into it, but it almost seems like it isn't, and therefore isn't doing anything. I get no errors, just nothing happens. I would have thought that, at the very least, it should run this procedure and delete the information from the table, even if it wouldn't save the new information based on the list boxes. I am not sure if I am explaining this correctly, but if anyone has any thoughts, I would greatly appreciate it.

The delete procedure that isn't actually making a change:

ALTER PROCEDURE [dbo].[Delete_Team_Data]
(
    @Code as int
)
AS
    DELETE from tblSectyData2 where Code = @Code
    DELETE from tblSectyData2 where SectyCode = @Code

The actual code:

1
2
3    protected void SaveChanges(object sender, EventArgs e)
4    {
5
6        string selectedEmployee = "";
7        selectedEmployee = EmployeeList.SelectedValue;
8
9
10
11       string fName = "";
12       //Snipped out a bunch of code relating to the text boxes
13       string secretaryCode = "";
14
15       fName = txtFName.Text;
16       //Snipped out a bunch of code relating to the text boxes
17       secretaryCode = TeamList2.SelectedValue.ToString();
18       int selemp = Convert.ToInt32(selectedEmployee);
19
20
21
22
23       String Conn = (string)Application["Facebook"];
24       SqlConnection IntranetConnection;
25       SqlDataReader IntranetReader;
26       IntranetConnection = new SqlConnection(Conn);
27       //SaveEmpChanges works properly.
28       SqlCommand SaveEmpChanges = new SqlCommand("Exec dbo.Edit_Employee_Data '" + prefix + "','" + lName + "','" + fName + "','" + mName + "','" + pos + "','" + dept + "','" + directdial + "','" + ext + "','" + fax + "','" + hphone + "','" + cphone + "','" + partner + "','" + timekeeper + "','" + notary + "','" + practice + "','" + saddress + "','" + sphone + "','" + lnl + "','" + bar + "','" + oemail + "','" + haddresscom + "','" + haddress + "','" + hcity + "','" + hstate + "','" + hzip + "','" + school + "','" + degree + "','" + status + "','" + floor + "','" + code + "','" + email + "','" + language + "'", IntranetConnection);
29       //Delete Team does NOT work
30       SqlCommand DeleteTeam = new SqlCommand("Exec dbo.Delete_Team_Data '" + selemp + "'", IntranetConnection);
31       //GetEmployeeType does work
32       SqlCommand GetEmployeeType = new SqlCommand("Select EmpType from tblMain2 where Code = '" + selectedEmployee + "'", IntranetConnection);
33
34       IntranetConnection.Open();
35       IntranetReader = SaveEmpChanges.ExecuteReader();
36       IntranetReader.Close();
37       IntranetConnection.Close();
38
39
40
41
42       int count = 0;
43       string LinkT = "";
44       string LinkT2 = "";
45
46       string etype = "";
47
48       lselemp.Text = selectedEmployee;
49       //Basically, while the page loads fine, it does NOTHING below this line... I've commented out sections, and I have also put in a bunch of labels to show the variables being passed; it all seems fine.
50
51
52
53       IntranetConnection.Open();
54       IntranetReader = DeleteTeam.ExecuteReader();
55       IntranetReader.Close();
56       IntranetConnection.Close();
57
58
59       IntranetConnection.Open();
60       IntranetReader = GetEmployeeType.ExecuteReader();
61       while (IntranetReader.Read())
62       {
63           etype = IntranetReader["EmpType"].ToString();
64       }
65       IntranetReader.Close();
66       IntranetConnection.Close();
67
68       int end = Convert.ToInt32(TeamList2.Items.Count);
69       string teamcode = "";
70
71       while (count < end)
72       {
73
74           teamcode = TeamList2.Items[count].Value.ToString();
75
76           if (etype == "S")
77           {
78               LinkT = "Secretary";
79               LinkT2 = "Works with";
80           }
81
82           else if (etype == "O")
83           {
84               LinkT = "Works with";
85               LinkT2 = "Secretary";
86           }
87
88
89
90           //This command does not work
91           SqlCommand saveTeam = new SqlCommand("Exec dbo.Add_Team_Data '" + selemp + "','" + teamcode + "','" + LinkT + "','" + LinkT2 + "'", IntranetConnection);
92
93
94           IntranetConnection.Open();
95           IntranetReader = saveTeam.ExecuteReader();
96           IntranetReader.Close();
97           IntranetConnection.Close();
98
99
100          count++;
101      }
102
103
104
105
106      Response.Redirect("ManageEmployeeDirectory.aspx");
107  }
108
Hi all,

I want to share an experience I had in the last few days and would like to hear your comments about it. I am developing an ASP.NET 2.0 web application using SQL Server 2005 on my local system. After implementation was done, I had to deploy the application to the production server. Because of license issues, the Express edition of SQL Server is installed on the server.

The system worked fine for about 2 months. But last week we noticed that there were deadlocks in the application. After searching a while, I noticed that there were a lot of open connections. When you open SQL Server Management Studio and look at Management > Activity Monitor, you can see all opened connections in the connection pool. So the problem was that with every request a new connection was created, instead of reusing the existing ones, even if the state of the connections was sleeping. On SQL Express, if a specific limit of connections is reached, it waits for a connection to be released, but there is no release, so it threw a timeout error. Surprisingly, on SQL Server a lot of connections were also created, but there was never a deadlock, which I can't explain. I also can't explain why it worked for 2 months on SQL Express.

The architecture: I have data classes which implement IDisposable. In the Dispose method, I call Dispose on the connection and set it to null. In code I instantiate my data classes in using blocks, so on reaching the end of the using block the data class instance is disposed, and in its Dispose method the connection is disposed. I thought that everything would work fine, but it didn't. The problem was solved by calling Close() on the connection in the Dispose method of my data class, just before calling conn.Dispose().

So does this make sense to you? The fact that it solved my problem makes me believe in that solution, but I can't really say why. If you have any ideas or knowledge, I'd love to hear it.

Regards,
Koray
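The pattern I am describing is roughly this (a stripped-down sketch; the class name and query details are placeholders, not the real data classes):

using System;
using System.Data.SqlClient;

public class CustomerData : IDisposable
{
    private SqlConnection conn;

    public CustomerData(string connectionString)
    {
        conn = new SqlConnection(connectionString);
        conn.Open();
    }

    public void Dispose()
    {
        if (conn != null)
        {
            conn.Close();    // this extra Close() before Dispose() is what made the problem go away
            conn.Dispose();
            conn = null;
        }
    }
}

// Callers use the data class in a using block, so Dispose runs at the end of the block:
// using (CustomerData data = new CustomerData(connStr)) { /* run queries */ }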
We have a view looking at all the columns of a table on a SQL Server 7.0 db. When we link this view to MS Access, we are seeing the old data which was previously in the original table, not the latest data! We recompiled the table and the views, and also re-linked the view to MS Access, with no help; we still see only the old data. This is happening for only one of the views, not for all. Any help??? sa.