Does the time it takes to execute sp_repldone depend on the size of the transaction log it's working on?
Basically, I am executing sp_repldone to mark all of the transactions as replicated, so I am wondering: does sp_repldone go through each unreplicated transaction and mark it individually, or does it just go to the last unreplicated transaction and mark only that one, with all of the previous ones automatically considered replicated?
Reason for asking: our transaction log is around 27 GB right now, although normally it's approximately 1 GB. The sp_repldone command has been executing for the past 30 hours, and since it doesn't give any information about the status of the execution, I'm wondering if it's doing anything at all... :confused: I also checked the process, and it is not blocked by anything.
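For reference, the documented call that marks everything in the log as distributed is the reset form:

EXEC sp_repldone @xactid = NULL, @xact_seqno = NULL, @numtrans = 0, @time = 0, @reset = 1

One way to see whether a long-running sp_repldone is actually doing work is to watch its CPU and I/O counters climb between samples (the spid 62 below is hypothetical; substitute the one running sp_repldone):

SELECT spid, status, waittype, cpu, physical_io
FROM master..sysprocesses
WHERE spid = 62
-- run this a few times; rising cpu/physical_io means it is still working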
I have transactional replication going between servers A and B, both running SQL Server 2005 64-bit.
A is the Publisher; B is the Distributor and Subscriber.
Now I am getting the following Log Reader Agent error message:
The process could not execute 'sp_repldone/sp_replcounters' on 'A'. (Source: MSSQL_REPL, Error number: MSSQL_REPL20011) Get help: http://help/MSSQL_REPL20011
Only one Log Reader Agent or log-related procedure (sp_repldone, sp_replcmds, and sp_replshowcmds) can connect to a database at a time. If you executed a log-related procedure, drop the connection over which the procedure was executed or execute sp_replflush over that connection before starting the Log Reader Agent or executing another log-related procedure. (Source: MSSQLServer, Error number: 18752) Get help: http://help/18752
The process could not set the last distributed transaction. (Source: MSSQL_REPL, Error number: MSSQL_REPL22017) Get help: http://help/MSSQL_REPL22017
The process could not execute 'sp_repldone/sp_replcounters' on 'A'. (Source: MSSQL_REPL, Error number: MSSQL_REPL22037) Get help: http://help/MSSQL_REPL22037
Can anybody help me solve this issue? I read that we have to delete the existing publication and subscription.
Is there any way to solve this problem without deleting the existing subscription and publication?
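For what it's worth, the error text itself points at the usual recovery: drop or flush the stale log-reader connection, then restart the Log Reader Agent; dropping the publication is normally not required. A minimal sketch, assuming the published database on server A is called PubDB (a hypothetical name):

USE PubDB
GO
EXEC sp_replflush   -- flushes the article cache and releases the stale log-reader connection
GO
-- then restart the Log Reader Agent from Replication Monitor or SQL Server Agent;
-- if the error persists, locate the session holding the connection with sp_who2 and kill it first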
Does anyone know why SSIS sometimes just sits in the Pre-Execute phase of a data flow and does nothing? It doesn't seem to matter how elaborate the data flow is or how much data is involved; the Pre-Execute phase can sometimes take 80% of the task's run time.
Here's a little SP to break up those long-running, massively-locking, bring-app-to-a-halt queries. By default it does 500 rows at a time and allows for a maximum SQL query size of 4000 characters; it should be trivial to adjust those.
Cheers -b
CREATE PROCEDURE p_BatchExecute (@vcSQL varchar(4000)) AS
SET NOCOUNT ON
DECLARE @iRows int
SELECT @iRows = 1
SET ROWCOUNT 500            -- cap every statement in this session at 500 rows
WHILE @iRows > 0
BEGIN
    PRINT 'Executing batch of 500...'
    EXEC (@vcSQL)           -- runs in this session, so the ROWCOUNT cap applies
    SET @iRows = @@ROWCOUNT -- zero once the statement affects no more rows
END
SET ROWCOUNT 0              -- reset so the cap doesn't leak into later statements
GO
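A hypothetical invocation, pruning old rows 500 at a time. Note the loop relies on @@ROWCOUNT falling to zero, so it only suits statements such as DELETE or UPDATE that eventually run out of rows to touch (a plain SELECT would loop forever):

EXEC p_BatchExecute 'DELETE FROM dbo.SessionLog WHERE created < GETDATE() - 30'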
How can I execute a long-running transaction using something similar to the fire-and-forget pattern? I intend to start the execution of a very long stored proc from within IIS. I would like to execute a SQL script that starts the job and returns immediately, so that it doesn't hold an IIS thread.
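One pattern that fits: wrap the proc in a SQL Server Agent job and start it with msdb.dbo.sp_start_job, which queues the job and returns immediately. A sketch, assuming a pre-created job named 'LongRunningProc' (hypothetical) whose single step executes the stored proc:

EXEC msdb.dbo.sp_start_job @job_name = N'LongRunningProc'
-- returns as soon as the job is queued; the proc then runs under the
-- SQL Server Agent, so the IIS connection and thread are released at once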
I got the error below when executing a DELETE SQL query in an SSIS Execute SQL Task:
Error: 0xC002F210 at DelAFKO, Execute SQL Task: Executing the query "DELETE FROM [CQMS_SAP].[dbo].[AFKO]" failed with the following error: "The transaction log for database 'CQMS_SAP' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
But my disk has more than 6 GB of free space, and when I query the log_reuse_wait_desc column in sys.databases it returns "NOTHING".
This has me confused; does anyone have experience with this?
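For what it's worth, a single DELETE of a whole table is one fully logged transaction, which is usually what fills the log regardless of free disk space. If the table is being emptied anyway, TRUNCATE TABLE is minimally logged; otherwise a batched delete keeps each transaction, and therefore the active log, small. A sketch using the table name from the error (SQL Server 2005 syntax):

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM [CQMS_SAP].[dbo].[AFKO]
    IF @@ROWCOUNT = 0 BREAK
END
-- in FULL recovery, take log backups between batches so the space can be reused;
-- if every row goes and no foreign keys reference the table:
-- TRUNCATE TABLE [CQMS_SAP].[dbo].[AFKO]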
I'm looking for a way to refer to a package variable within any Transact-SQL code included in either an Execute SQL or Execute T-SQL task. If this can be done, I need to know the technique to use - whether it's something similar to a parameter placeholder question mark or something else.
FYI - I've been able to successfully execute Transact-SQL statements within the Execute SQL task, so I don't think the Execute T-SQL task is even necessary for this purpose.
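For the record, with an OLE DB connection manager the Execute SQL task does use question-mark placeholders, bound by ordinal on the task's Parameter Mapping tab (an ADO.NET connection uses @name parameters instead). A sketch; the table and variable names here are made up for illustration:

UPDATE dbo.LoadControl
SET LastRunDate = ?      -- Parameter Mapping: User::StartDate, ordinal 0
WHERE ProcessName = ?    -- Parameter Mapping: User::ProcessName, ordinal 1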
I have a master package, which executes child packages that are located on a SQL Server. The Child packages execute other child packages which are also located on the SQL server.
Everything works fine when I execute in process. But when I set ExecuteOutOfProcess = True on the master package's ExecutePackageTask, I get the following errors:
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Row Count" (5349).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Custom Split" (6399).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "SCR Data Source" (5100).
Error: 0xC00470FE at DFT Load Data, DTS.Pipeline: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "DST_SCR Load Data" (6149).
The child packages all run fine when executed directly, and the master package runs fine if Execute Out of Process is False.
I have an SSIS package that contains an "Execute SQL Task". The SQL will either raise an error or succeed. However, it seems the package doesn't pick up the raised error?
Or is it possible to conditionally run other control flow items according to the status of the SQL task execution?
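For reference, the Execute SQL task fails when the batch raises an error of severity 11 or higher, and Success/Failure precedence constraints on the task can then branch the control flow. A minimal sketch with a hypothetical validation table:

IF EXISTS (SELECT 1 FROM dbo.StagingErrors)   -- hypothetical check
    RAISERROR('Staging validation failed.', 16, 1)
-- severity 16 fails the task; attach a Failure precedence constraint
-- (or an OnError event handler) to react to it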
I am trying to execute an SP in the Execute SQL task in SSIS 2005, but I keep getting an error:
SSIS package "Package.dtsx" starting. Error: 0xC002F210 at Load_Gs_Modifier_1, Execute SQL Task: Executing the query "exec Load_GS_Modifier_1 ?, ?" failed with the following error: "Could not find stored procedure 'exec Load_GS_Modifier_1 ?, ?'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. Task failed: Load_Gs_Modifier_1 SSIS package "Package.dtsx" finis
I have set up two user parameters: startdate and enddate. I am not sure what I am doing wrong?
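One reading of the error is that the whole string, the word exec included, was treated as a procedure name, which is what happens when the task expects a bare procedure name (for instance IsQueryStoredProcedure = True on an ADO.NET connection). Two combinations known to work, sketched with your parameter names as assumptions:

-- OLE DB connection: SQLStatement is the call itself,
EXEC Load_GS_Modifier_1 ?, ?
-- with User::startdate and User::enddate mapped at ordinals 0 and 1
-- on the Parameter Mapping tab.

-- ADO.NET connection with IsQueryStoredProcedure = True:
-- SQLStatement is just the bare name,
-- Load_GS_Modifier_1
-- with the parameters mapped by name (e.g. @startdate, @enddate).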
SQL Server seems to think my SQL statement is too long, and I'm not sure why. My statement:

"INSERT INTO tblChangeControls (Initiator,BPM,AddRemMod,ChangeType,SpecificChange,ChangeDescription,TechnicalDescriptionOfChange,DateInitiated) select 'Benjamin Short' ,'6' ,'2' ,'Printer' ,'40' ,'' ,'';"

Error message: The identifier that starts with 'INSERT INTO tblChangeControls (Initiator,BPM,AddRemMod,ChangeType,SpecificChange,ChangeDescription,TechnicalDescriptionOfChange,' is too long. Maximum length is 128.
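The double quotes are the likely culprit: with QUOTED_IDENTIFIER ON (the default), a double-quoted string is parsed as an identifier, and identifiers max out at 128 characters, which is exactly the error shown. Written with ordinary literals the statement parses fine:

INSERT INTO tblChangeControls
    (Initiator, BPM, AddRemMod, ChangeType, SpecificChange,
     ChangeDescription, TechnicalDescriptionOfChange)
SELECT 'Benjamin Short', '6', '2', 'Printer', '40', '', ''
-- note: the original column list also named DateInitiated (eight columns
-- against seven values); the counts must match, so it is dropped here,
-- assuming the table supplies a default for it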
Hey, I've got a list of events that have occurred on my site, updates etc., and I'm trying to show how long ago the updates happened. For instance, say the date of an update is 16/10/2007 15:16:03; I want the label to say "Happened over 2 months ago" etc. Now I've tried to use an IF statement but I can't seem to get it right:

// parse with a day-first culture so 16/10/2007 doesn't throw
DateTime dt = DateTime.Parse("16/10/2007 15:16:03",
    new System.Globalization.CultureInfo("en-GB"));
TimeSpan age = DateTime.Now - dt;   // compare DateTime/TimeSpan values, not ToShortDateString() strings

if (age.TotalMinutes < 1)
{
    UpdateLabel.Text = "About A Minute Ago";     // happened within a minute
}
else if (dt.Date == DateTime.Now.Date)
{
    UpdateLabel.Text = "Updated Today";          // happened earlier today
}
else if (dt.Date == DateTime.Now.Date.AddDays(-1))
{
    UpdateLabel.Text = "Updated Yesterday";      // happened yesterday
}
else if (age.TotalDays <= 7)
{
    UpdateLabel.Text = "Updated Last Week";
}
else
{
    UpdateLabel.Text = "Happened over " + (int)(age.TotalDays / 30) + " months ago";
}

Any ideas where I'm going wrong? I'm probably staring it straight in the face, but I can't see it. Thanks in advance. John
SQL 7.0 SP1 - NT 4.00 EE SP6a - Cluster - Multiprocessor (2) - VB 6.0 SP4. Maybe a very stupid question, but I need to know if it is possible to assign a low priority to a VB exe that queries for a long time, eating all the CPU (100% in Task Manager). Thank you. Franco
I have a challenge and am not very versed in coding this situation. I am trying to put long text (more than 255 characters) into an ntext field in SQL 7, but the maximum I can input is only 255 characters. Can anyone advise on how to put more than 255 characters into a field, as well as how to select back a record with more than 255 characters?
Please give advice if you have encountered this problem before.
Does anyone have a sample of how to put in and retrieve long text (more than 255 characters)? Please comment.
------------ Ray Miao at 7/19/00 8:24:08 AM
You should use READTEXT and WRITETEXT to access a text column. By the way, nchar and nvarchar can hold up to 4000 characters.
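A minimal sketch of the pointer-based access described above, assuming a hypothetical table dbo.Docs(id int, content ntext) whose content column was initialized with a non-NULL value (TEXTPTR needs that):

DECLARE @ptr binary(16)
SELECT @ptr = TEXTPTR(content) FROM dbo.Docs WHERE id = 1
WRITETEXT dbo.Docs.content @ptr N'A body far longer than 255 characters can be written here'

SELECT @ptr = TEXTPTR(content) FROM dbo.Docs WHERE id = 1
READTEXT dbo.Docs.content @ptr 0 500   -- read 500 characters starting at offset 0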
How long exactly is the text datatype in non-unicode characters for SQLS7? I guess I need to find out how long my input is and break it up into multiple text fields, but what's the max length?
I have created a database which has a table whose name is 128 characters long; this table has a column whose name is 128 characters long. When I issue the command sp_helpconstraint table_name, SQL Server 7 returns the message "string or binary data would be truncated."
Hi. When writing a stored procedure, sometimes a line of SQL code can be very long, more than 128 characters. I got an error message saying that the code is very long and the maximum is 128. But at the same time, this is erratic in MS SQL Server, because it sometimes works even with a line of code of more than 800 characters... Have you ever experienced the same problem? What is the solution? Thanks
I am upgrading SQL 7 to SQL 2000. The upgrade process has been running for over 3 hours. I have about 9 GB in databases on the server. Does anyone know if this is taking too long? Compaq ProLiant 7000, dual Xeon II processors, 786 MB memory. "Setup is updating your system..." has been running for 3 hours... TIA Janet
I have a query that takes 1.5 seconds to execute, but only 150 ms of CPU. The query is quite simple: just one WHERE clause against a clustered index.
SQL Server Execution Times: CPU time = 156 ms, elapsed time = 1595 ms.
SELECT column1, column3, column4, ..., column10 FROM table WHERE column2 IN (37, 41, 43, 45, 49, 53, 55) ORDER BY column3 DESC
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC)) |--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=37 OR [u].[searchparamid]=41 OR [u].[searchparamid]=43 OR [u].[searchparamid]=45 OR [u].[searchparamid]=49 OR [u].[searchparamid]=53 OR [u].[searchparamid]=55 OR [u].[searchparamid]=59) ORDERED FORWARD)
I have tried to rewrite the query to an INNER JOIN instead.
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC)) |--Nested Loops(Inner Join, OUTER REFERENCES:([spal].[number])) |--Index Seek(OBJECT:([MP].[dbo].[__search_parameters_lookup].[IX___search_parameters_lookup] AS [spal]), SEEK:([spal].[hash]=-1726604993) ORDERED FORWARD) |--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=[spal].[number]) ORDERED FORWARD)
but the query still takes 1.5 seconds.
According to the execution plan, it spends 59% of the time sorting, 14% on the index seek of the __search_parameters_lookup table, and 24% on a clustered index seek of the __searchtest table.
How come it uses so little CPU yet still takes 1.5 seconds? It seems to be reading from memory as well, so it shouldn't be an I/O problem.
The index I have on the table is a clustered index on (column2).
Any ideas on how I can improve this? I have tried the DTA, and also a nonclustered index on column3.
If I remove some columns from the SELECT list, the query executes a lot faster:
SQL Server Execution Times: CPU time = 32 ms, elapsed time = 32 ms.
Both the CPU and the elapsed time go down and now appear to be more normal.
So there seems to be a problem caused by data transfer. I tried normalizing the table, and when I do, the query executes with 400 ms of CPU and 400 ms total, returning the exact same result. So why does it spend only 400 ms fetching the data when the tables are normalized, but 1500 ms when denormalized?
Any ideas?
I am running Microsoft SQL Server 2000 - 8.00.2039
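If the data-transfer theory above is right, keeping the result set on the server should make elapsed time collapse to CPU time. A quick test, assigning the columns to variables instead of streaming them to the client (only two assignments shown; in practice every selected column would be assigned so the row width stays comparable):

SET STATISTICS TIME ON

DECLARE @c1 sql_variant, @c3 sql_variant
SELECT @c1 = column1, @c3 = column3
FROM [table]
WHERE column2 IN (37, 41, 43, 45, 49, 53, 55)
ORDER BY column3 DESC
-- if elapsed time now tracks CPU time, the missing 1.3 seconds is the
-- cost of shipping wide rows to the client, not of running the query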
I have transferred my database from Access to SQL Server. In some cases where there is very long text in the "content" field (that's the name of the field), the sign <Long Text> appears, while in other "content" fields the text appears normally...
I have used ntext 16 as the datatype... How can I make this sign disappear from the field? Is there any way? Thank you
Maybe a 'dumb' question, but I can't figure it out. I can't tell from the logs since they are periodically truncated, so how can I tell when the system was last restarted?
Specifically, I would like to know for what period of time a DMV has been accumulating data about a particular view.
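One version-independent trick: tempdb is rebuilt every time the service starts, so its creation date is the last restart time, which is also the moment the DMV counters began accumulating:

SELECT create_date AS last_restart
FROM sys.databases
WHERE name = 'tempdb'

-- SQL Server 2000 equivalent:
-- SELECT crdate FROM master..sysdatabases WHERE name = 'tempdb'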
The following ALTER takes about 2 hours in my environment. The total number of records is about 2.8 million. Is this typical? Is there a way to speed up this process?

BEGIN TRANSACTION
SET QUOTED_IDENTIFIER ON
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
SET ARITHABORT ON
SET NUMERIC_ROUNDABORT OFF
SET CONCAT_NULL_YIELDS_NULL ON
SET ANSI_NULLS ON
SET ANSI_PADDING ON
SET ANSI_WARNINGS ON
COMMIT
BEGIN TRANSACTION
ALTER TABLE dbo.PERSON ADD
    FL_CNSL_NTFY char(1) NOT NULL CONSTRAINT DF_PERSON_FL_CNSL_NTFY DEFAULT '',
    CD_INTRP_NEED smallint NOT NULL CONSTRAINT DF_PERSON_CD_INTRP_NEED DEFAULT 0
GO
COMMIT

Thanks for any tips on this issue....
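For context: on SQL Server 2005 and earlier, adding a NOT NULL column with a default rewrites every one of the 2.8 million rows inside that one transaction, which accounts for the two hours. A common workaround, sketched for one of the two columns and assuming a brief window of NULLs is acceptable:

-- 1. Add the column as NULLable (metadata-only, near-instant):
ALTER TABLE dbo.PERSON ADD FL_CNSL_NTFY char(1) NULL

-- 2. Backfill in short transactions instead of one huge one:
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    UPDATE dbo.PERSON SET FL_CNSL_NTFY = '' WHERE FL_CNSL_NTFY IS NULL
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0

-- 3. Enforce NOT NULL (a validation scan, but no row rewrite) and attach the default:
ALTER TABLE dbo.PERSON ALTER COLUMN FL_CNSL_NTFY char(1) NOT NULL
ALTER TABLE dbo.PERSON ADD CONSTRAINT DF_PERSON_FL_CNSL_NTFY DEFAULT '' FOR FL_CNSL_NTFY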
Hi, I am still not very proficient in SQL Server, so apologies if the question sounds basic. We have a script to clean out old, unwanted data. It basically deletes all rows which are more than 2 weeks old. It deletes data from 33 tables, and the number of rows in each table runs into a few millions. What I see in the script (not written by me :-) ) is that all data is deleted within a single BEGIN TRANSACTION and COMMIT TRANSACTION. As I have a background in Informix, such an action in Informix may result in a "LONG TRANSACTION" problem. Does SQL Server have a similar concept? Also, won't it have a performance problem if all rows are kept locked until they are committed? TIA.
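For what it's worth, SQL Server has no direct analogue of Informix's long-transaction abort, but the single big transaction does pin the log and hold locks until the commit, so batching with a commit per batch is the usual approach. A sketch against a hypothetical table, using the script's two-week cutoff:

SET ROWCOUNT 5000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.EventLog WHERE logged_at < DATEADD(wk, -2, GETDATE())
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0
-- each DELETE commits on its own, so locks are released and the log can
-- truncate (or be backed up) between batches instead of growing throughout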
Msg 103, Level 15, State 4, Line 3 The identifier that starts with ' "C:\Program Files\Microsoft SQL Server\90\COM\tablediff" -sourceserver "VSNET1" -sourceuser "sa" -sourcepassword "sysadm" -sour' is too long. Maximum length is 128.
When I split the identifier up as follows:
SET QUOTED_IDENTIFIER ON
DECLARE @sFull varchar(1000)
DECLARE @s1 varchar(1000)
DECLARE @s2 varchar(1000)
DECLARE @s3 varchar(1000)
DECLARE @s4 varchar(1000)
SET @s1 = '""C:\Program Files\Microsoft SQL Server\90\COM\tablediff""'
SET @s2 = ' -sourceserver "VSNET1" -sourceuser "sa" -sourcepassword "sysadm" -sourcedatabase "PMSTEMP" '
SET @s3 = ' -sourcetable "DEMAT" -destinationserver "VSNET1" -destinationuser "sa" -destinationpassword '
SET @s4 = ' "sysadm" -destinationdatabase "PMSRESTORE" -destinationtable "DEMAT" -f "c:\Diff"'
SET @sFull = @s1 + @s2 + @s3 + @s4
EXEC XP_CMDSHELL @sFull
This executes fine, but the output is:
The input line is too long. NULL
What is this? How could I execute such a long command using xp_cmdshell?
In short: Maintenance Plans are broken, Database Mail is broken, and the Reporting Services Report Viewer web part displays US-format dates instead of the report design's English-Australian format. In detail:
Microsoft SQL Server 2005 SP2 was downloaded from Microsoft.com. This download occurred after 5 March 2007 and the file version was 9.0.3042.1 (SQLServer2005SP2-KB921896-x86-ENU.exe). During the installation of SP2 by a Domain Administrator, an error occurred on the Database Engine. A Hotfix Exception was raised and the setup completed with errors. The error log reported the problem:
----------------------------------------------------------------------------------
Product : Database Services (MSSQLSERVER)
Product Version (Previous): 2047
Product Version (Final) :
Status : Failure
Log File : C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Hotfix\SQL9_Hotfix_KB921896_sqlrun_sql.msp.log
Error Number : 29534
Error Description : MSP Error: 29534 Service 'MSSQLSERVER' could not be started. Verify that you have sufficient privileges to start system services. The error code is (1067) The process terminated unexpectedly.
The SQL Server service could not be manually started. An error was reported in the event viewer: One or more files do not match the primary file of the database. If you are attempting to attach a database, retry the operation with the correct files. If this is an existing database, the file may be corrupted and should be restored from a backup.
It was noted that a new file mssqlsystemresource.ldf had been copied to the %data% folder. Note: as part of separating data and log files, both the master and mssqlsystemresource databases had previously been moved to another volume (%data%), whilst their respective log files had been moved to a separate volume (%log%). This had occurred after SP1 was initially installed and months prior to the SP2 install, using Microsoft KB articles to detach and reattach the databases. It appears that the SP2 installer does not work correctly if the mssqlsystemresource mdf and ldf files are not in the same location. Whilst the Engine could not be started, the new mssqlsystemresource.ldf file was manually moved to the %log% folder, overwriting the previous version. The SQL Server service was now able to start without any errors. Since the SQL Server service was started, two major errors have become apparent. The following error is recorded when trying to create a new maintenance plan: TITLE: Microsoft SQL Server Management Studio ------------------------------
Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
------------------------------
Invalid column name 'from_msx'. Invalid column name 'has_targets'. (Microsoft SQL Server, Error: 207)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=207&LinkId=20476
Also when using the maintenance plan wizard the following error is recorded when the Wizard attempts to save the plan at the end: TITLE: Maintenance Plan Wizard Progress ------------------------------
Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
------------------------------
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
------------------------------
Invalid column name 'msx_job_id'. (Microsoft SQL Server, Error: 207)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=207&LinkId=20476
------------------------------
Database Mail is also broken on the server. The Database Mail log reports an "Activation Failure" when a test email is sent (the email does not arrive). The SQL log and event viewer also report the following message: The activated proc [dbo].[sp_sysmail_activate] running on queue msdb.dbo.ExternalMailQueue output the following: 'Error executing extended stored procedure: Invalid Parameter'
Another error affects Reporting Services reports that are accessed using the SharePoint V2.0 web parts (rsWebparts.cab). The date picker in the Report Viewer web part now only recognises US-format dates. It is important to note that the same reports run through the Report Manager web site work correctly, using the DD/MM/YYYY (English-Australian) date format for the date picker. It is noted that the rsWebparts.cab file was not updated by the SQL 2005 SP2 install; however, the behaviour of the Report Viewer web part has definitely been affected by it. Are the first two issues related? Are they due to the fact that the mssqlsystemresource mdf and ldf files had been moved to an unexpected location prior to SP2? How can this be fixed? Are the Reporting Services SharePoint (V2) web parts going to be updated? We are not in a position to upgrade our Portal 2003/SharePoint Services V2 installation at the present time.
Here is a fairly basic question on which I'd like assistance. In determining a standard for a schema, my thinking is that, for simplicity, using the long data type would meet our purposes.
Are there any thoughts on mapping out where we would not need to take advantage of long vs. short? I realize the space implications, but for this particular table, space won't be an issue.
I have a program that allows various users to log in to the SQL Server. On the login window there is a dropdown that lists all the databases on the server, and the user can select which database they wish to log in to. To get the list of databases on the server I use a login created for the program, which we will call 'worker'. The program logs in as worker and runs "SELECT * FROM sys.databases". Before I load the login window, I check whether the worker login exists and has not been corrupted (in case a user deletes it or changes the password, etc.); if it has been corrupted or deleted, I recreate it with "CREATE LOGIN worker WITH PASSWORD = '123'".
The problem occurs after I recreate the worker login. The login is created successfully, but when the login window appears and logs in as worker to get the list of databases, I get an error saying the login failed for worker. If I open the login window again, everything works fine (the worker login isn't created again, as it already exists). Further, if I run the same code but put a breakpoint in and step through everything, it works the first time.
Is there some amount of time that is necessary to wait for CREATE LOGIN to take effect?
I have a pretty complex query that aggregates lots of data and inserts multiple rows of that data into a reporting table. When I call this SPROC from SQL Server Management Studio, it executes in under 3 seconds. When I try to execute the same SPROC using .NET's SqlCommand object the query runs indefinitely until the CommandTimeout is reached. Why would this SPROC behave differently with the same inputs, but being called from .NET? Thanks for your help!
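One frequent cause of exactly this symptom: SSMS runs with SET ARITHABORT ON while SqlClient defaults to OFF, so the two callers compile separate cached plans and can sniff very different parameter values. A quick way to try to reproduce the slow plan inside Management Studio (the proc and parameter names below are hypothetical; use your own):

SET ARITHABORT OFF   -- mimic the SqlClient session settings
EXEC dbo.usp_BuildReport @StartDate = '2007-01-01', @EndDate = '2007-12-31'
-- if this now runs as slowly as the .NET call, compare the two cached
-- plans and consider OPTION (RECOMPILE), or assigning parameters to local
-- variables inside the proc, to blunt the parameter sniffing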