Adding A New Record Takes Longer And Longer -- Archive? (was Table Help)
Mar 1, 2005
Hi, we have a table with about 400,000 records in it. It's starting to take longer and longer to add a new record. I was thinking of creating another identical table and archiving off most of the records every month (we are now adding about 4,000 records a day). Is this the best thing to do?
I don't know a lot about SQL Server, so any help or suggestions would be great.
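A minimal sketch of the monthly archive approach, assuming the table has a datetime column that identifies old rows (all names below are hypothetical):

-- Hypothetical names: dbo.Orders is the busy table, dbo.OrdersArchive an identical copy.
BEGIN TRAN
INSERT INTO dbo.OrdersArchive
SELECT * FROM dbo.Orders
WHERE created_date < '20050201'   -- cut-off for the month being archived

DELETE FROM dbo.Orders
WHERE created_date < '20050201'

IF @@ERROR = 0
    COMMIT TRAN
ELSE
    ROLLBACK TRAN

That said, 400,000 rows is not large for SQL Server, so it is also worth checking whether the table has suitable indexes before assuming size is the problem.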
Hi, is there any way to audit or record in SQL Server 2000 which queries consume the most resources on the server, so I can focus on them and improve them?
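One approach in SQL Server 2000 is a server-side Profiler trace that records the text, duration, reads, writes and CPU of every completed batch; a minimal sketch (the output path is an assumption):

-- Create a server-side trace writing to a file (SQL Server 2000).
DECLARE @traceid int, @maxsize bigint
SET @maxsize = 50   -- max trace file size in MB
EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\traces\expensive_queries', @maxsize, NULL

-- Event 12 = SQL:BatchCompleted; columns: 1 TextData, 13 Duration, 16 Reads, 17 Writes, 18 CPU.
EXEC sp_trace_setevent @traceid, 12, 1, 1
EXEC sp_trace_setevent @traceid, 12, 13, 1
EXEC sp_trace_setevent @traceid, 12, 16, 1
EXEC sp_trace_setevent @traceid, 12, 17, 1
EXEC sp_trace_setevent @traceid, 12, 18, 1

-- Start the trace; stop it later with sp_trace_setstatus @traceid, 0.
EXEC sp_trace_setstatus @traceid, 1

The resulting .trc file can be opened in Profiler (or queried with fn_trace_gettable) and sorted by Duration or Reads to find the most expensive queries.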
I could use a little help here. We have a stored procedure that runs on SQL2000 and for a large dataset only takes 1-2 minutes. On SQL2005 however, it takes around 25 minutes. Any advice or insight anyone could give would be great.
Here's the stored procedure:
CREATE PROCEDURE daa_upd_relationship_balance_hist AS
begin tran
insert fldarts..daa_relationship_bal_hist
select <-- list snipped -->
from daa_relationship_bal drb, daa_user_review dur
where drb.acct_no = dur.acct_no
and drb.control_2 = dur.control_2
and drb.nb_gl_cost_ctr = dur.nb_gl_cost_ctr
and drb.nb_dda_sav_type = dur.nb_dda_sav_type
and drb.acct_no + drb.control_2 + drb.nb_gl_cost_ctr + drb.nb_dda_sav_type + convert(char(10), dur.activity_date, 101)
    not in (select acct_no + control_2 + nb_gl_cost_ctr + nb_dda_sav_type + convert(char(10), activity_date, 101)
            from fldarts..daa_relationship_bal_hist)
if @@error = 0
    commit tran
else
begin
    rollback tran
    print '!!!Error (daa_relationship_bal_hist) : Relationship Balance History not updated'
end
return
GO
So we have three tables. Here's a schema for each and the indexes on them. I've omitted columns from the tables that are not utilized in this query.
daa_relationship_bal:
CREATE TABLE [daa_relationship_bal] ( [control_2] [char] (3) NOT NULL , [nb_gl_cost_ctr] [char] (7) NOT NULL , [acct_no] [char] (14) NOT NULL , [nb_dda_sav_type] [char] (3) NOT NULL )
index:
idx_upd_balance_hist nonclustered located on PRIMARY acct_no, control_2, nb_gl_cost_ctr, nb_dda_sav_type
daa_user_review:
CREATE TABLE [daa_user_review] ( [control_2] [char] (3) NOT NULL , [nb_gl_cost_ctr] [char] (7) NOT NULL , [acct_no] [char] (14) NOT NULL , [nb_dda_sav_type] [char] (1) NOT NULL , [activity_date] [datetime] NULL )
index:
PK_daa_user_review_1__37 nonclustered, unique, primary key located on INDEXES control_2, nb_gl_cost_ctr, acct_no, nb_dda_sav_type
daa_relationship_bal_hist:
CREATE TABLE [daa_relationship_bal_hist] ( [control_2] [char] (3) NOT NULL , [nb_gl_cost_ctr] [char] (7) NOT NULL , [acct_no] [char] (14) NOT NULL , [nb_dda_sav_type] [char] (3) NOT NULL , [activity_date] [datetime] NOT NULL )
index:
PK_daa_rel_bal_hist_1__37 nonclustered, unique, primary key located on PRIMARY control_2, nb_gl_cost_ctr, acct_no, nb_dda_sav_type, activity_date
Any help on this would be great. If more information is needed, please let me know.
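One thing that may be worth testing, sketched here with no guarantee: the NOT IN compares concatenated strings built from the key columns, which prevents the optimizer from using the indexes listed above, and SQL 2005 may choose a much worse plan for that construct than 2000 did. An equivalent NOT EXISTS over the actual key columns (assuming activity_date carries no time portion, since the original compares only the converted date) lets the existing indexes be used:

insert fldarts..daa_relationship_bal_hist
select <-- list snipped -->
from daa_relationship_bal drb, daa_user_review dur
where drb.acct_no = dur.acct_no
and drb.control_2 = dur.control_2
and drb.nb_gl_cost_ctr = dur.nb_gl_cost_ctr
and drb.nb_dda_sav_type = dur.nb_dda_sav_type
and not exists (select 1
                from fldarts..daa_relationship_bal_hist h
                where h.acct_no = drb.acct_no
                and h.control_2 = drb.control_2
                and h.nb_gl_cost_ctr = drb.nb_gl_cost_ctr
                and h.nb_dda_sav_type = drb.nb_dda_sav_type
                and h.activity_date = dur.activity_date)

Updating statistics on all three tables after the move to 2005 is also a cheap first check.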
I'm running an ISP database in SQL 6.5 which has a table 'calls'. When the new month starts I create a new table with the same fields and move the data of the previous month into that table and delete it from calls. So 'calls' holds the data of only the current month. For example, at the start of November 2003 I ran the queries:

Create Table Oct2003Calls {................................}

/* Now insert data of October into new table */
INSERT Oct2003Calls
SELECT *
FROM calls
WHERE calldate < '11/1/03'

/* Finally delete October data from calls table */
DELETE FROM calls
WHERE calldate < '11/1/03'

The problem is that while the insert query takes about 2 minutes to execute, the delete query takes over 10 minutes to affect the same number of rows. Why is that? This causes problems because user authentication stops while this query is running, which means users can't connect to the internet.
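A common workaround for the blocking, sketched under the assumption that batching is acceptable: a single long DELETE holds its locks for the whole run, whereas deleting in small batches releases them between rounds. In SQL 6.5 this can be done with SET ROWCOUNT:

-- Delete the old month's rows in batches of 10000 so locks are released between batches.
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE FROM calls WHERE calldate < '11/1/03'
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0   -- restore normal behavior

The DELETE is also typically slower than the INSERT because it must log every removed row and maintain every index on calls, while the freshly created Oct2003Calls table likely has no indexes to maintain.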
Hi there... I wrote a SP to check for different types of exceptions in a few database tables. When I was writing the scripts, everything seemed to execute fairly quickly and I was satisfied with the performance. When I completed the scripts and compiled them into a stored procedure and ran it (using Exec), it took a lot longer to run than I thought it would. So I went through each section of the script and ran each portion individually to see which part was taking so long.... but all the scripts ran very quickly. The individual scripts, run separately, took a combined total of 0:26 to run.... but the SP was taking 1:30 to run. (????) So then I took ALL the script contained in the SP and ran it by itself in the Query Analyzer.... it took 0:27 to run. (??????)
So basically... the script that I wrote takes 27 seconds to execute when run by itself in the Query Analyzer... but when I take that very same script and turn it into a stored procedure and run it, it takes a minute and a half.
Any ideas why?? I thought SP's were supposed to run faster because they're pre-compiled.
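Two common causes are worth ruling out here, sketched as quick tests rather than a diagnosis: the procedure may be reusing a cached plan built for different conditions, or it may have been created under different SET options (e.g. ANSI_NULLS) than the Query Analyzer session. The procedure name below is hypothetical:

-- WITH RECOMPILE forces a fresh plan for this run only.
EXEC dbo.usp_CheckExceptions WITH RECOMPILE

-- On a test server only: clear all cached plans, then time the procedure again.
DBCC FREEPROCCACHE

If the WITH RECOMPILE run matches the 27-second script time, plan reuse is the likely culprit, and recreating the procedure WITH RECOMPILE is one fix.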
The database in question is only about 5 GB on a 450 GB partition.
At the beginning of the month I run:
BACKUP LOG [objectstore] TO DISK = 'D:\Backups\Prod\backup_objectstore.BAK' WITH NOFORMAT, INIT, NAME = N'objectstore backup'
and then every 10 minutes (within working hours) for the rest of the month I run:
BACKUP LOG [objectstore] TO DISK = 'D:\Backups\Prod\backup_objectstore.BAK' WITH NOFORMAT, NOINIT, NAME = N'objectstore backup'
The amount of data that gets backed up is the same throughout the month, and the load on the server as a whole also stays constant throughout the month - NOTHING increases throughout the month that would affect this server in any way. Yet at the beginning of the month the backup takes 10 seconds, and at the end it gets up to 5-6 minutes.
why?
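A possible explanation consistent with these symptoms: WITH NOINIT appends every 10-minute backup to the same file, so by month-end the media set holds hundreds of backup sets, and each new append has to read and maintain an ever larger file. A hedged alternative is to start a fresh, date-stamped file each day (path and naming are illustrative):

-- Build a per-day file name so no single file accumulates a month of log backups.
DECLARE @file nvarchar(260)
SET @file = N'D:\Backups\Prod\backup_objectstore_'
          + CONVERT(nvarchar(8), GETDATE(), 112) + N'.BAK'

BACKUP LOG [objectstore] TO DISK = @file
WITH NOFORMAT, NOINIT, NAME = N'objectstore backup'

The first run of a day creates the file; subsequent runs append to a file that only ever holds one day of backups.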
Thanks
Alastair Jones.
"A computer once beat me at chess - but it was no match for me at kick boxing" - Emo Phillips.
We are importing XER files through the wizard into a SQL Server database and it takes up to 35-45 mins for each import (a single project). Is there any option to reduce the time? Are there any other import options which can give us faster results?
The following code is taking longer and longer to run. I am not talking about a gradual increase in size. This job normally takes 30-40 mins, and in the last few days it has gone from 1 hr to 2 hr to 3 hr... Any ideas why this is happening? I cannot see any other jobs running at this time.
declare @filename varchar(255)
set @filename = (
    select top 1 physical_device_name
    from ****.msdb.dbo.backupset bs, ****.msdb.dbo.backupmediafamily bf
    where bs.media_set_id = bf.media_set_id
    and database_name = 'Live_PRD'
    and backup_start_date > getdate() - 1
    and type = 'D'
    order by backup_start_date desc)

restore database REPORTS_REP from disk = @filename
with move 'LIVE_PRD_Data' to 'T:\SOUTH\REPORTS_REP_Data.mdf',
     move 'LIVE_PRD_Log' to 'U:\SOUTH\REPORTS_REP_Log.ldf',
     move 'LIVE_PRD_Log2' to 'U:\SOUTH\REPORTS_REP_Log2.ldf',
     replace, stats = 2, recovery
Problem Summary: Merge Statement takes several times longer to execute than equivalent Update, Insert and Delete as separate statements. Why?
I have a relatively large table (about 35,000,000 records, approximately 13 GB uncompressed and 4 GB with page compression - including indexes). A MERGE statement pretty consistently takes two or three minutes to perform an update, insert and delete. At one extreme, updating 82 (yes 82) records took 1 minute, 45 seconds. At the other extreme, updating 100,000 records took about five minutes. When I changed the MERGE to the equivalent separate UPDATE, INSERT & DELETE statements (embedded in an explicit transaction) the entire update took only 17 seconds. The query plans for the separate UPDATE, INSERT & DELETE statements look very similar to the query plan for the combined MERGE. However, all the row count estimates for the MERGE statement are way off.
Obviously, I am going to use the separate UPDATE, INSERT & DELETE statements. The actual query plans for the four statements ( combined MERGE and the separate UPDATE, INSERT & DELETE ) are attached. SQL Code to create the source and target tables and the actual queries themselves are below. I've also included the statistics created by my test run. Nothing else was running on the server when I ran the test.
Server Configuration:
SQL Server 2008 R2 SP1, Enterprise Edition
3 x Quad-Core Xeon Processor
Max Degree of Parallelism = 8
148 GB RAM
SQL Code:
Target Table:

USE TPS;
IF OBJECT_ID('dbo.ParticipantResponse') IS NOT NULL
    DROP TABLE dbo.ParticipantResponse;
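For reference, the separate-statement pattern described above, sketched with hypothetical key and column names (the poster's real tables have more columns) and wrapped in one transaction so it stays atomic like the MERGE:

-- Hypothetical schema: dbo.Target / dbo.Source keyed on ID with one data column.
BEGIN TRAN;

UPDATE t
SET t.Payload = s.Payload
FROM dbo.Target t
JOIN dbo.Source s ON s.ID = t.ID;

INSERT INTO dbo.Target (ID, Payload)
SELECT s.ID, s.Payload
FROM dbo.Source s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.ID = s.ID);

DELETE t
FROM dbo.Target t
WHERE NOT EXISTS (SELECT 1 FROM dbo.Source s WHERE s.ID = t.ID);

COMMIT TRAN;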
I have a field which was varchar before, but I changed it to text.
But I can't write as much text into it as I wish. This field is important because it holds SQL sentences which I parse later in my application and then execute.
It says <Long Text> and I can't enter any more characters.
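If the limit being hit is Enterprise Manager's grid editor (which shows <Long Text> and cannot edit text columns in place) rather than the column itself, the data can still be written from Query Analyzer. A sketch with hypothetical names, appending to a text column with UPDATETEXT:

-- Hypothetical table: dbo.SqlSentences (id int, sentence text).
DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(sentence) FROM dbo.SqlSentences WHERE id = 1

-- NULL insert_offset appends at the end; delete_length 0 removes nothing.
UPDATETEXT dbo.SqlSentences.sentence @ptr NULL 0 ' AND another clause appended here'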
Why does it take longer and longer for the same code to run? Very simply, I have 8,000,000 records I want to delete from a table. I have tried a few options.
Option 1: a while loop which deletes 10,000 rows per loop, starting from the earliest until it hits the cut-off number I have set. THIS TOOK 5 HOURS.
Option 2: created an SP which found the oldest 100,000 records and then deleted them. If I run this SP manually it takes 30-60 secs, which I thought was much better than above. So I put this SP in a while loop to run 80-odd times, thinking the time it would take would be 80 mins, a huge improvement.
But every time this SP is called it takes longer and longer (36, 30, 32, 39, 37, 37, 123, 163, 155, 182... and so on, in seconds).
All the SP is doing is as follows (8,860,000 is just to ensure I don't delete too much). This SP is then called from within a while loop.
set @recnumber = (
    select top 1 recnumber
    from (select top 100000 recnumber
          from TabletodeleteFROM
          where recnumber < 8860000
          order by recnumber asc) TabletodeleteFROM
    order by recnumber desc)

delete TabletodeleteFROM where recnumber < @recnumber
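A hedged guess at the slowdown: each call re-reads the oldest 100,000 keys through a derived-table sort before deleting, and as the loop progresses that scan has to wade through more and more ghost records left by the earlier deletes. Deleting the batch directly avoids the extra pass (SQL 2000 idiom):

-- Delete at most 100000 of the remaining qualifying rows per call.
SET ROWCOUNT 100000
DELETE TabletodeleteFROM WHERE recnumber < 8860000
SET ROWCOUNT 0

If the times still creep up, transaction log growth during the loop is another common culprit; log backups (or CHECKPOINTs under the simple recovery model) between batches tend to keep the per-batch time flat.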
How is it possible that, for a 133 MB SQL 7 database, the backup of the database itself takes 2 seconds, but the transaction log backup takes 25 minutes??? We are doing a log backup every 10 minutes, and appending.
Hello All, To reproduce one of our customer's problems, I need to make the SQL run for more than a minute before it returns the result set. I do not have a large amount of data in the database to simulate the delay. Is there a way in SQL to cause a delay while returning the result set? Thanks for the help. Regards, Raj
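T-SQL has a built-in way to do exactly this: WAITFOR DELAY pauses the batch before the result set is returned. For example, to make any small query take just over a minute:

-- Pause 65 seconds, then return the rows (table name is illustrative).
WAITFOR DELAY '00:01:05'
SELECT * FROM authors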
Hello, I've been having an issue with Symantec Backup Exec 12 for the past few months trying to back up the database files on an x64 Windows 2003 SP2 server. After talking to about 6 techs from Symantec, I finally came across this error and was told to contact Microsoft.
I had created a .udl file to test the connection between the media server and the SQL server. It tested successfully from the media -> SQL. Then I created a .udl on the SQL server to test it there; however, when I tried to select the OLE DB provider for Microsoft SQL Server, I got the following error: "Microsoft Data Link Error: Provider is no longer available. Ensure that the provider is installed properly."
If anyone knows how to fix this issue easily that would be great. Thank you in advance.
Could anybody please give me the details of how to use 'ALLOW_DUP_ROW' in a CREATE CLUSTERED INDEX statement?
I tried executing the below statement but it throws an error "CREATE INDEX option 'ALLOW_DUP_ROW' is no longer supported."(both in SQL Server 2000 and SQL Server 2005)
CREATE CLUSTERED INDEX index121 ON raj(j) WITH ALLOW_DUP_ROW
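ALLOW_DUP_ROW was a SQL Server 6.5 option; from SQL Server 7.0 onward a non-unique clustered index accepts duplicate key values automatically (the engine adds an internal uniqueifier), so the option can simply be dropped:

-- Equivalent on SQL Server 2000/2005: duplicates are allowed by default
-- on a clustered index that is not declared UNIQUE.
CREATE CLUSTERED INDEX index121 ON raj(j)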
I am having a problem running a SQL 2000 Reporting Services script on my new SQL 2005 server. It seems that the SessionHeader() and rs.Render methods are no longer supported. Can someone help me figure out what I need to do to continue automating my report runs?
Dim skipreport as string = nothing
Dim skip as integer = 0
Dim allbranchreport as string = nothing
Dim allbranch as integer = 0
Dim report as CatalogItem
Dim repname as string = nothing
Dim specificreport as string = nothing
Dim filepath as string = nothing
Dim errorpath as string = nothing
Dim basefilepath as string = nothing
Dim baseerrorpath as string = nothing
Dim errorlog as integer = 0
Dim parameters() As ParameterValue = nothing
Dim paramcount as integer = 0

' Render arguments
Dim result As Byte() = Nothing
Dim reportPath As String = nothing
'Dim format As String = "PDF"
Dim format as string = "EXCEL"
Dim historyID As String = Nothing
Dim devInfo as string = Nothing
Dim credentials As DataSourceCredentials() = Nothing
Dim showHideToggle As String = Nothing
Dim encoding As String
Dim mimeType As String
Dim warnings As Warning() = Nothing
Dim reportHistoryParameters As ParameterValue() = Nothing
Dim streamIDs As String() = Nothing
Dim sh As New SessionHeader()
rs.SessionHeaderValue = sh
Dim omitdocmap as string = "True"

'**************************************
'Repset run
Dim repset as string = ""
for each repset in repsetlist
    specificreports = specificreportsstart
    if specificreports = 0 then
        select case repset.tolower
            case "all"
                specificreports = 0
            case "brkctr"
                specificreports = 1
                reportcount = reportset_brkctr.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_brkctr(reportloop)
                next
            case "cashmgmt"
                specificreports = 1
                reportcount = reportset_cashmgmt.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_cashmgmt(reportloop)
                next
            case "com"
                specificreports = 1
                reportcount = reportset_com.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_com(reportloop)
                next
            case "com_vw"
                specificreports = 1
                reportcount = reportset_com_vw.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_com_vw(reportloop)
                next
                repset = "com"
            case "comcons"
                specificreports = 1
                reportcount = reportset_comcons.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_comcons(reportloop)
                next
            case "cons"
                specificreports = 1
                reportcount = reportset_cons.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_cons(reportloop)
                next
            case "cons_vw"
                specificreports = 1
                reportcount = reportset_cons_vw.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_cons_vw(reportloop)
                next
                repset = "cons"
            case "dep"
                specificreports = 1
                reportcount = reportset_dep.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_dep(reportloop)
                next
            case "staff"
                specificreports = 1
                reportcount = reportset_staff.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_staff(reportloop)
                next
                omitdocmapbranch = "False"
            case "staff2"
                specificreports = 1
                reportcount = reportset_staff2.getupperbound(0)
                redim specificreportslist(reportcount)
                for reportloop = 0 to reportcount
                    specificreportslist(reportloop) = reportset_staff2(reportloop)
                next
                omitdocmapbranch = "False"
            case else
                specificreports = 1
                redim specificreportslist(0)
                specificreportslist(0) = "Undefined"
        end select
    end if

    if repset.tolower = "staff2" then
        repset = "staff"
    end if

    'Reports List
    Dim reports() as CatalogItem
    reports = rs.ListChildren(basereportpath, False)

    'Execution
    if datasourceok = 1 then
        for each report in reports

            skip = 0
            allbranch = allbranchrun

            for each skipreport in skipreports
                if report.name.tolower = skipreport.tolower then skip = 1
            next

            if skip = 0 then
                if specificreports = 1 then
                    skip = 1
                    for each specificreport in specificreportslist
                        if report.name.tolower = specificreport.tolower then skip = 0
                    next
                end if
            end if

            if skip = 0 then
                for each allbranchreport in allbranchreports
                    if report.name.tolower = allbranchreport.tolower then allbranch = 1
                next

                Redim parameters(1)
                parameters(0) = New ParameterValue()
                parameters(0).Name = "branchall"
                parameters(1) = New ParameterValue()
                parameters(1).Name = "branch"

                '5 parameters reports
                repname = report.name
                if (repname.substring(0,5).tolower = "loans" or repname.substring(0,4).tolower = "locs") then paramcount = 4
                if (repname.length >= 13) then
                    if repname.substring(0,13).tolower = "loans_beacons" then paramcount = 1
                end if

                if paramcount = 4 then
                    paramcount = 0
                    redim preserve parameters(4)

                    parameters(2) = New ParameterValue()
                    parameters(2).Name = "comconsall"
                    if repset.tolower = "cons" or repset.tolower = "com" then parameters(2).value = 0 else parameters(2).value = 1

                    parameters(3) = New ParameterValue()
                    parameters(3).Name = "comcons"
                    if repset.tolower = "com" then parameters(3).value = "commercial" else parameters(3).value = "consumer"

                    parameters(4) = New ParameterValue()
                    parameters(4).Name = "brokered"
                    parameters(4).value = 0
                    if repset.substring(0,3).tolower = "brk" then parameters(4).value = 1
                end if

                'All Branches Reports
                if allbranch = 1 then

                    parameters(0).value = "1"
                    parameters(1).value = "admin"
                    if client = "CSCU" then parameters(1).value = "newwst"
                    if clienttype = "vw" then parameters(1).value = "1"

                    Try
                        result = rs.Render(reportPath, format, historyID, devInfo, parameters, _
                            credentials, showHideToggle, encoding, mimeType, reportHistoryParameters, warnings, streamIDs)
                        sh.SessionId = rs.SessionHeaderValue.SessionId
                    Catch e As SoapException
                        errorlog = 1
                        Console.WriteLine(e.Detail.OuterXml)
                    Catch f As Exception
                        errorlog = 1
                        Console.WriteLine(f.Message)
                    End Try

                    ' Create an error log
                    If errorlog <> 0 then
                        Try
                            Dim stream As FileStream = File.Create(errorpath, 1024)
                            Console.WriteLine("Errorlog created: " & errorpath)
                            stream.Close()
                        Catch g As Exception
                            Console.WriteLine(g.Message)
                        End Try
                    End if

                    ' Write the contents of the report to a file.
                    If errorlog <> 1 then
                        Try
                            Dim stream As FileStream = File.Create(filepath, result.Length)
                            stream.Write(result, 0, result.Length)
                            Console.WriteLine("Result written to file: " & filepath)
                            stream.Close()
                        Catch e As Exception
                            Console.WriteLine(e.Message)
                        End Try
                    end if

                'Other Reports
                else
                    parameters(0).value = "0"

                    for each branch in branches
                        parameters(1).value = branch

                        errorlog = 0
                        if branch <> "h/o" then
                            filepath = basefilepath & branch & "\" & report.name & ".xls"
                        else
                            filepath = basefilepath & "ho" & report.name & ".xls"
                        end if
                        if branch <> "h/o" then
                            errorpath = baseerrorpath & branch & "_" & report.name & ".txt"
                        else
                            errorpath = baseerrorpath & "ho_" & report.name & ".txt"
                        end if
                        reportpath = basereportpath & "/" & report.name
                        omitdocmap = omitdocmapbranch
                        devinfo = "<DeviceInfo><OmitDocumentMap>" & omitdocmap & "</OmitDocumentMap></DeviceInfo>"

                        Try
                            result = rs.Render(reportPath, format, historyID, devInfo, parameters, _
                                credentials, showHideToggle, encoding, mimeType, reportHistoryParameters, warnings, streamIDs)
                            sh.SessionId = rs.SessionHeaderValue.SessionId
                        Catch e As SoapException
                            errorlog = 1
                            Console.WriteLine(e.Detail.OuterXml)
                        Catch f As Exception
                            errorlog = 1
                            Console.WriteLine(f.Message)
                        End Try

                        ' Create an error log
                        If errorlog <> 0 then
                            Try
                                Dim stream As FileStream = File.Create(errorpath, 1024)
                                Console.WriteLine("Errorlog created: " & errorpath)
                                stream.Close()
                            Catch g As Exception
                                Console.WriteLine(g.Message)
                            End Try
                        End if

                        ' Write the contents of the report to a file.
                        If errorlog <> 1 then
                            Try
                                Dim stream As FileStream = File.Create(filepath, result.Length)
                                stream.Write(result, 0, result.Length)
                                Console.WriteLine("Result written to file: " & filepath)
                                stream.Close()
                            Catch e As Exception
                                Console.WriteLine(e.Message)
                            End Try
                        end if
The specified script failed to compile with the following errors:

J:\PRA Publisher> "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\vbc.exe" /t:exe /main:MainModule /utf8output /R:"System.dll" /R:"System.Xml.dll" /R:"System.Web.Services.dll" /R:"C:\Program Files\Microsoft SQL Server\90\Tools\binn\rs.exe" /out:"C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.exe" /debug- "C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.0.vb" "C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb"

Microsoft (R) Visual Basic Compiler version 8.0.50727.42 for Microsoft (R) .NET Framework version 2.0.50727.42
Copyright (c) Microsoft Corporation. All rights reserved.

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(253) : error BC30002: Type 'SessionHeader' is not defined.

    Dim sh As New SessionHeader()
                  ~~~~~~~~~~~~~

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(254) : error BC30456: 'SessionHeaderValue' is not a member of 'Microsoft.SqlServer.ReportingServices2005.ReportingService2005'.

    rs.SessionHeaderValue = sh
    ~~~~~~~~~~~~~~~~~~~~~

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(434) : error BC30456: 'Render' is not a member of 'Microsoft.SqlServer.ReportingServices2005.ReportingService2005'.

    result = rs.Render(reportPath, format, historyID, devInfo, parameters, _
                ~~~~~~~~~

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(436) : error BC30456: 'SessionHeaderValue' is not a member of 'Microsoft.SqlServer.ReportingServices2005.ReportingService2005'.

    sh.SessionId = rs.SessionHeaderValue.SessionId
                   ~~~~~~~~~~~~~~~~~~~~~

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(496) : error BC30456: 'Render' is not a member of 'Microsoft.SqlServer.ReportingServices2005.ReportingService2005'.

    result = rs.Render(reportPath, format, historyID, devInfo, parameters, _
                ~~~~~~~~~

C:\Documents and Settings\etop\Local Settings\Temp\1\fw7qn9k.1.vb(498) : error BC30456: 'SessionHeaderValue' is not a member of 'Microsoft.SqlServer.ReportingServices2005.ReportingService2005'.
We are running a Windows application (.NET 2.0) against MS SQL Server 2000 on WinXP. The application is able to successfully connect to the database and execute "smaller stored procedures". The application also must execute a stored proc that returns 2M rows for export into a flat file. When executing the stored proc from a remote box with 512 MB or less RAM (slow box), the application receives the error:
A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
If a box has 1 GB of RAM or more, the application seems to run fine.
Now, I have the application running on a slow box, watching memory usage, and it seems fine - the memory acquired by the application never exceeded 39 MB of RAM, so I'm still not clear why the connection is being closed. Also, there seem to be no messages from MS SQL Server in the Event Log.
The problem is that I have to make the app run on a "slow box".
Any ideas or comments are appreciated (somebody, please, help me ;-) )
We moved a 2000 database to another platform by restoring the database. It took a lot longer than I expected. Would it take less time to restore it a second time to the same target database since the allocations are already there?
Currently I have a server that has two instances of SQL Server that are both heavily used. We are moving the databases that are on the default instance of this server to a new server. Since the old server will only need one instance (the current named instance) of SQL Server, is there a good way to remove the default instance and make the named instance the default? If this change is possible, will the name of the current named instance need to be changed? Will all of the programs currently accessing the named instance need to change their connection string?
Hi folks, A DTS package we have run for years now no longer works. The specific part that is not working is a subquery in the SOURCE object of a transformation. The source is based on a Microsoft Data Link to a Sybase database (the DSN changed a couple months ago but the connection string was updated successfully for the new 12.51 version of ASE) and the destination is a link to a local SQL Server 2000 database. The transformation has always worked, and when I remove the subquery everything works OK. The problem is that I need the subquery! Does anyone have a clue what is going on? Here is the full query.

select TableKey = RVSN_TYPE_ID,
    TableCode = RVSN_TYPE,
    RevisionDate = RVSN_DATE,
    RevisionReasonCode = RSN_CODE,
    RevisionGroup = RVSN_GRP_ID,
    RevisedField = (select L.FieldID
                    from tempdb.guest.lkpRevisedField L
                    where L.TableID = R.RVSN_TYPE
                    and L.FieldName = R.CHNG_FLD),
    RevisedValue = OLD_FLD_VAL,
    RevisionTimestamp = RVSN_TIMESTAMP
from RVSN R,
    tempdb.guest.MaxTimeStamp TS
where R.RVSN_TIMESTAMP > TS.Rtimestamp
and R.RVSN_TIMESTAMP is NOT NULL

John H.
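A hedged guess at a workaround, in case the updated ASE driver is rejecting the correlated subquery in the select list: the same lookup can be expressed as an outer join, which returns NULL for FieldID when no lookup row matches, just like the scalar subquery (assuming lkpRevisedField has at most one row per TableID/FieldName pair):

select TableKey = RVSN_TYPE_ID,
    TableCode = RVSN_TYPE,
    RevisionDate = RVSN_DATE,
    RevisionReasonCode = RSN_CODE,
    RevisionGroup = RVSN_GRP_ID,
    RevisedField = L.FieldID,
    RevisedValue = OLD_FLD_VAL,
    RevisionTimestamp = RVSN_TIMESTAMP
from RVSN R
join tempdb.guest.MaxTimeStamp TS
    on R.RVSN_TIMESTAMP > TS.Rtimestamp
left join tempdb.guest.lkpRevisedField L
    on L.TableID = R.RVSN_TYPE
    and L.FieldName = R.CHNG_FLD
where R.RVSN_TIMESTAMP is NOT NULL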
I have a table with some sensitive customer data in it. I am now keeping all the data in another table, and encrypting it. I want to get rid of the original unencrypted data and be sure that it is no longer anywhere on disk. Should I drop the table, or first delete the rows and then do a dump tran? I'm not sure how to know if the data is actually physically deleted from disk, or if it's still there, but just in blocks that get marked as available. Any guidance would be greatly appreciated. Thanks, Bruce
I have SQL Server 2005 installed on my machine and I am running the following query to insert 1500 records into a simple table with just one column.
Declare @i int
Set @i = 0
While (@i < 1500)
Begin
    Insert into test2 values (@i)
    Set @i = @i + 1
End
Here goes the table definition,
CREATE TABLE [dbo].[test2]( [int] NULL ) ON [PRIMARY]
Now the problem with this is that on one of my servers this query takes just 500 ms to run, while on my production and other test servers it takes more than 25 seconds. The same problem occurs with updates. I have checked the configurations of both servers and found them to be the same. Also, there are no indexes defined on either of the tables. I was wondering what the possible reason for this could be. If any of you have any pointers regarding this, they would be really useful.
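One difference that often explains a constant per-row gap like this: each INSERT runs in its own implicit transaction, and every commit forces a synchronous flush of the log to disk, so the loop's speed tracks the log disk's write latency rather than the CPU. As a quick test, wrapping the loop in one explicit transaction reduces 1500 flushes to one:

-- One commit (one synchronous log flush) for all 1500 rows.
Declare @i int
Set @i = 0
BEGIN TRAN
While (@i < 1500)
Begin
    Insert into test2 values (@i)
    Set @i = @i + 1
End
COMMIT TRAN

If the slow servers speed up dramatically inside a transaction, compare the disks (and their write caching) hosting the transaction logs on the two machines.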
I must have hit some setting to turn off the debug output, but I can't find how to get it back on. The debug output window displays when I execute a package, I just don't get the execution output I had been getting. Sorry to bother you with such trivia, but any help you can provide is appreciated. Thanks.
Hi. I got an error when I connect to a SQL Server 2005 DB using SQL Server Management Studio.
A connection was successfully established with the server, but then an error occurred during the login process. (provider: TCP Provider, error: 0 - The specified network name is no longer available.) (.Net SqlClient Data Provider)(Microsoft Sql Error - 64)...
I have a merge (pull) replication between SQL Server 2005 and SQL Server Express clients.
Data synchronisation is ok, and I already made some schema changes (like adding new columns) at the publisher database which were applied to the subscriber as they should.
Now this doesn't work anymore. New columns added at the publisher (with ALTER TABLE or Management Studio) are no longer replicated to the subscribers.
I don't get any error messages, and I can't find any hints in the logs why this has stopped working.
The "Replicate_Schema_Changes" is still set to TRUE (in general, none of the publication options had been changed).
I tried sp_enumeratependingschemachanges to find out about any "bad changes", but the SP returns nothing.
I remember that the last thing I did was adding a default constraint, which was replicated successfully to the subscribers.
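It may be worth confirming the publication's DDL setting directly from T-SQL as well, since it can differ from what the Management Studio dialog shows; a sketch (the publication name is hypothetical):

-- Run at the publisher, in the published database.
EXEC sp_helpmergepublication @publication = N'MyPublication'

-- Re-enable schema-change replication if replicate_ddl has been switched off.
EXEC sp_changemergepublication
    @publication = N'MyPublication',
    @property = N'replicate_ddl',
    @value = N'true'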
I am in the process of migrating DTS 2000 packages via DTS Migration Wizard.
I ran all of the packages through the wizard. Then, I successfully completed changing the remaining issues in two of my packages to SSIS. Everything was fine. The second package I completed is called from a SQL Agent Job. I managed to get the job step converted to use SSIS package. It ran successfully.
Now, I am trying to open other packages to work on them and I get "Microsoft Visual Studio is unable to load this document: Value does not fall within the expected range." and a message window that says "Object reference not set to an instance of an object." This actually happens with the two that I migrated, also. I can no longer access them.
What happened?!?!?!? Does anyone have any experience with this?
Hello, I had a notification set up using xp_sendmail working fine for a while. Recently I updated the SQL Server to SP3a and we moved the mailbox that I was using to an Exchange 2003 server. I can still send my notification if I use the domain ID that runs the SQL service and the sa ID, but not the NT IDs that were running it before. I have users use NT authentication from a domain that's different from the one in which the SQL server resides. There is a trust and nothing has changed with that. Below are the results when I try to run the notification using an NT ID. This ID has full permissions over the SQL server.

SQL Mail session started.
ODBC error 8198 (42000) Could not obtain information about Windows NT group/user 'INTERNAL\xxx.xxx'.
Stopped SQL Mail session.

As you can see, SQL Mail starts and stops OK, but I get the error on the xp_sendmail itself. I can run xp_logininfo to return all of the IDs using this NT login. But if I run xp_logininfo just on the problem ID, I get the following results.

EXEC master..xp_logininfo @acctname = 'INTERNAL\xxx.xxx'

Server: Msg 8198, Level 16, State 24, Procedure xp_logininfo, Line 58
Could not obtain information about Windows NT group/user 'INTERNAL\xxx.xxx'.

Here's when it works.

EXEC master..xp_logininfo

BUILTIN\Administrators group admin BUILTIN\Administrators NULL
INTERNAL\xxx.xxx user admin INTERNAL\xxx.xxx NULL
INTERNAL\Sxx Axx group admin INTERNAL\Sxx Axx NULL
SISDOM\sxx.dxx.axx user admin SISDOM\sxx.dxx.axx NULL
SISDOM\Sxx.Dxx.Axx group admin SISDOM\Sxx.Dxx.Axx NULL
INTERNAL\lxxx.rxxx user user INTERNAL\lxxx.rxxx NULL

The ones in bold work for everything. Please advise?

Julie