I have an application that allows users to synchronize SQL CE data with the server. I would like to implement functionality whereby, if the sync process hits an error, the whole synchronization rolls back.
Is it possible to roll back the synchronization process? I am using merge replication over the web in the application.
I wrote a stored procedure and it works properly, but I want to catch any error that occurs while executing it, roll back on error, and send the error back through an OUTPUT parameter. How can I roll back the command below, and how can I return the error over an OUTPUT parameter? execute(@cmdS) Thanks.
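One hedged way to do this on SQL Server 2005 and later is to wrap the dynamic command in TRY/CATCH inside a procedure that exposes an OUTPUT parameter; the procedure and parameter names below are made up for illustration, only execute(@cmdS) comes from the question.

-- Hypothetical wrapper; @cmdS holds the dynamic SQL from the question.
CREATE PROCEDURE dbo.usp_RunCommand
    @cmdS         NVARCHAR(4000),
    @ErrorMessage NVARCHAR(4000) OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    SET @ErrorMessage = NULL;

    BEGIN TRY
        BEGIN TRANSACTION;
        EXECUTE (@cmdS);                -- the command from the question
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;       -- undo everything the command changed
        SET @ErrorMessage = ERROR_MESSAGE();  -- hand the error text back to the caller
    END CATCH
END

On SQL Server 2000 there is no TRY/CATCH, so you would check @@ERROR after the EXECUTE instead.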
Hi, I'm using a data flow set up as flat file source -> derived column -> data conversion -> OLE DB destination. I have fifty thousand records in a delimited file. While processing, I got an error at the 45,000th row. In the database there are only 40,000 records (why aren't all 45,000 there?). Is there any way to roll back all 40,000 records?
Hello, I do not know how to implement transaction rollback in an ASP.NET application. I am using the SqlHelper class to communicate with my SQL database. Thanks, junior
Hi, I'm planning to install the cumulative hotfix (build 2187) on my SQL 2000 clustered server (SP4, 2040), and I would like to know whether the cumulative hotfix can be rolled back. If possible, please provide me any information about that. Thanks in advance.
Hey guys, I need help urgently. I just ran an UPDATE statement without a WHERE clause by mistake and I need to roll back the changes.
I'm running SQL Express SP2 and the database is set to the simple recovery model.
I have the MDF and LDF files; the MDF is 1.5 GB, the LDF is 65 MB, and the last-changed date is the same time I ran the UPDATE statement, so I think the changes are there in the LDF file. I just need to roll back to one minute before I ran the update.
My question is how to get the identity (auto-increment) primary key ID to roll back when the application fails. I am using TransactionScope with a single connection in the data object. I am trying to insert rows into two DataTables, each using its own TableAdapter (two TableAdapters). I have a Product table with ProductID as an identity primary key, and that ProductID is used to insert a row into the ProductHistory table. Let's say the Product table's last ProductID = 8 (8 rows), so the next ProductID will be 9. When I insert a row into both tables and the second table's insert fails, both get rolled back (which is good), but when I insert again the ProductID is 10, not 9. Is there any way to roll back the ProductID in the Product table so that the next insert gets the next number in the sequence instead of leaving a gap?
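For what it's worth, identity values are handed out outside the transaction and are not returned on rollback, so the gap is expected behaviour. If the gap really matters, about the only option is to reseed after the rollback; a sketch (table name from the question, and only safe if nothing else can insert into Product concurrently):

-- Reseed the identity to the current highest ProductID so the next insert continues from there.
DECLARE @maxId INT;
SELECT @maxId = ISNULL(MAX(ProductID), 0) FROM dbo.Product;
DBCC CHECKIDENT ('dbo.Product', RESEED, @maxId);

Most designs simply tolerate the gaps, since an identity column only needs to be unique, not contiguous.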
I have a script containing multiple statements that update multiple tables. How can I make sure that either all statements execute successfully or no changes are applied to the tables (in case one or more errors occur)? I've been searching on the Internet and it seems like I need to use BEGIN TRANSACTION and ROLLBACK.
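That's right; a sketch of the usual pattern on SQL Server 2005 and later is shown below (the table and column names are placeholders, not from the question). SET XACT_ABORT ON makes most runtime errors abort and roll back the whole transaction, and TRY/CATCH handles the rest.

SET XACT_ABORT ON;   -- abort and roll back the transaction on most runtime errors
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.TableA SET Col1 = 'x' WHERE Id = 1;   -- placeholder statements
    UPDATE dbo.TableB SET Col2 = 'y' WHERE Id = 2;
    UPDATE dbo.TableC SET Col3 = 'z' WHERE Id = 3;

    COMMIT TRANSACTION;          -- everything succeeded
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;    -- any failure undoes every statement above
    DECLARE @msg NVARCHAR(4000);
    SET @msg = ERROR_MESSAGE();
    RAISERROR (@msg, 16, 1);     -- re-raise so the caller still sees the error
END CATCH;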
I followed Remus' post about not doing 'fire and forget'.
I have two queues, ProcessingSendQueue and ProcessingReceiveQueue.
Once I receive from ProcessingReceiveQueue, the activation SP gets called on ProcessingSendQueue and ends the conversation.
However, if I then get an exception, the action of the activation SP (i.e. the ending of the conversation) does not get rolled back... is this possible? I would have thought that the action of the activation SP would get rolled back too.
My ProcessingSendQueue activation SP is as follows:
ALTER PROCEDURE [dbo].[ProcessingSendQueue_AP]
AS
BEGIN
    DECLARE @dh UNIQUEIDENTIFIER;
    DECLARE @message_type SYSNAME;
    DECLARE @message_body NVARCHAR(4000);

    RECEIVE @dh = [conversation_handle],
            @message_type = [message_type_name],
            @message_body = CAST([message_body] AS NVARCHAR(4000))
    FROM [ProcessingSendQueue];

    IF @dh IS NOT NULL
    BEGIN
        IF @message_type = N'http://schemas.microsoft.com/SQL/ServiceBroker/Error'
        BEGIN
            RAISERROR (N'Received error %s from service [ProcessingReceiveQueue]', 10, 1, @message_body) WITH LOG;
        END

        END CONVERSATION @dh;
    END
END
Goofed up and ran an update query. It messed up all the data in a single table. I'm trying not to restore the table from a previous backup since the backup is more than 20 GB. It's going to take forever to restore it. Any advice would be much appreciated!
Can I roll back an insert/update query executed in one page if an insert/update query in another page fails, in ASP.NET? (I am using SQL Server 2000 as the back end.) Scenario: in WebPage1 I have an insert query into a master table, and in WebPage2 I have an insert query that stores data in a sub table. I need to roll back the insert into the sub table if the insert into the master table in WebPage1 fails. (The query in WebPage2 executes first, and only then the query in WebPage1.) Can I use System.Transactions to solve this? Thanks in advance
There is some strange behaviour I've recently noticed while watching synchronization progress in Replication Monitor on SQL Server 2005 Standard with merge replication configured. The merge process seems to repeat several times.
This is the initial synchronization (reinitialization at the subscriber). The client is using Microsoft.SQLServer.Replication objects from the .NET Framework assemblies.
The synchronization starts normally (status is "Running"). The last message of the selected session box shows (among other messages): "Beginning evaluating partial replication filters", then "Finished evaluating partial replication filters", and finally "Merge completed after processing xxx changes... etc." after a few seconds. The status changes to "Completed" and then... the merge process starts again! "Beginning evaluating partial replication filters" etc. And this repeats about 15-20 times.
So the whole process takes about 15 minutes instead of the roughly 45 seconds it should take to complete the initial synchronization. The number of changes in the "Merge completed after processing ..." message never changes after the first such message.
Is this some bug in web synchronization or some invalid configuration setting? Why does the merge process repeat itself so many times?
Hello, I'm trying to create a simple backup in the SQL maintenance plan that will make a single backup copy of all databases every night at 10 pm. I'd like the previous night's file to be overwritten, so there will be only a single backup file for each database (the tape backup runs every night, so each day's backup will be saved on tape). Every night the maintenance plan makes a backup of all the databases to a new file with a datetime stamp, meaning the previous night's file still exists. Even when I check "Remove files older than 22 hours" the previous night's file still exists. Is there any way to create a backup file without the datetime stamp so it overwrites the previous night's file? Thanks! Rick
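If the maintenance plan UI won't cooperate, one workaround is to replace the backup step with a T-SQL job step that backs up to a fixed file name and overwrites it each night; a sketch with a placeholder database name and path:

-- The file name is fixed and INIT overwrites the existing backup set in it,
-- so only one backup file per database is kept on disk.
BACKUP DATABASE [MyDatabase]
TO DISK = N'D:\Backups\MyDatabase.bak'
WITH INIT;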
I'm new to database mirroring and I have a question about the principal database server. I have a database mirroring setup configured for high-safety mode with automatic failover, using a witness.
When a failover occurs because of a loss of communication between the principal and mirror, the mirror server takes on the role of principal. When communication to the original principal server is restored, does the database that was previously the principal automatically go back to being the principal at some point?
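As far as I know, no: after an automatic failover the old principal rejoins as the mirror, and the roles are not swapped back automatically. Once the databases are synchronized again you can fail back manually from the current principal, roughly like this (the database name is a placeholder):

-- Run on the server currently acting as principal, once the session shows SYNCHRONIZED.
ALTER DATABASE [MyMirroredDb] SET PARTNER FAILOVER;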
I need to run two reports, each of A5 size, back to back and print them on a single A4 sheet: the sale bill printed in the first half and the gate pass printed in the second half. Both reports will be on the same page, and their size and shape should be maintained. How do I do it?
When you drill down on MonthYear you get the detail data:
Month Number of Sales Total Sales
- Jan 2007 10 $610.00
1 $10.00
1 $20.00
1 $30.00
1 $40.00
1 $50.00
1 $60.00
1 $70.00
1 $80.00
1 $100.00
1 $150.00
My question is: I added a filter to the detail data to give the bottom 75% of sales, so my detail data only displays the following rows:
Month Number of Sales Total Sales
- Jan 2007 10 $610.00
1 $10.00
1 $20.00
1 $30.00
1 $40.00
1 $50.00
My problem is the group still displays the total of my dataset (as seen above), but I want it to display the total of the detail data group, like below:
Month Number of Sales Total Sales
- Jan 2007 5 $150.00
1 $10.00
1 $20.00
1 $30.00
1 $40.00
1 $50.00
If I change the fields in the group to look at the detail data, for instance =Count(Fields!NumberofSales.Value, "Details_Group"), I get a scope error.
How can I display the totals of the detail data in the parent group after I added a filter to the detail data?
I have a situation where I need to insert or update data from a flat file into a SQL Server database. The flat file contains nearly one lakh (100,000) records.
I am using transactions. If all the rows are inserted or updated successfully, I commit. If there is any error, I roll back the transaction.
When rolling back the transaction, it takes more than 3 to 4 hours.
Hello, I am hoping you can help me with the following problem; I need to process the following steps every couple of hours in order to keep our SQL 2000 database as small as possible (the transaction log is 5x bigger than the db):
1. back up the entire database
2. truncate the log
3. shrink the log
4. back up once again.
As you may have determined, I am relatively new to managing a SQL Server database, and while I have found multiple articles online about the topics I need to accomplish, I cannot find any actual examples that explain where I input the code used to accomplish the above-mentioned steps. I do understand the theory behind the steps; I just do not know how to accomplish them! If you know of a well-documented tutorial, please point me in the right direction. Regards.
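A rough sketch of the four steps in T-SQL, with placeholder database name, logical log file name and backup paths (this could be pasted into Query Analyzer or a scheduled SQL Agent job); on SQL 2000, backing up the log is what truncates its inactive portion:

-- 1. Back up the entire database
BACKUP DATABASE [MyDb] TO DISK = N'D:\Backups\MyDb.bak' WITH INIT;

-- 2. Back up the transaction log (this also truncates the inactive part of the log)
BACKUP LOG [MyDb] TO DISK = N'D:\Backups\MyDb_log.trn' WITH INIT;

-- 3. Shrink the log file (use the logical file name reported by sp_helpfile)
USE [MyDb];
DBCC SHRINKFILE (MyDb_Log, 100);   -- target size in MB

-- 4. Back up once again
BACKUP DATABASE [MyDb] TO DISK = N'D:\Backups\MyDb_after_shrink.bak' WITH INIT;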
Hi again, In ASP.net, is there any elegant way to handle a set of time inserts from a form when the 2nd time is past midnight? Specifically, I have a form with 2 textboxes on it (startTime and endTime) that are set up to accept time values (using AJAX MaskedEditExtender for formatting/validation - pretty cool). This data is posted to a sub that enters the data into a table (T_Details). However, I've noticed that the data inserted as part of the record (SQL field is smalldatetime) doesn't take into account the fact that a time value past 23:59:59 in the "endTime" textbox is a time on the next day - it simply rolls to an A.M. date for the same day as the date for the pre-midnight value from the "startTime" textbox. I'm sure that I can simply do some conditional coding and modify the date if necessary but is there a better way to do it? Thanks as always...this forum is a great resource
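One alternative is to fix it up on the SQL side at insert time: if the end value sorts before the start value, assume it belongs to the next day and add a day. A sketch with assumed parameter and column names (only T_Details comes from the post):

-- @startTime and @endTime are the smalldatetime values built from the form,
-- both initially stamped with the same date.
IF @endTime < @startTime
    SET @endTime = DATEADD(day, 1, @endTime);   -- the end time was past midnight

INSERT INTO dbo.T_Details (StartTime, EndTime)
VALUES (@startTime, @endTime);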
The package is corrupt; who knows why, who knows what. It was running fine, but there was a bug in an active script. Having failed miserably to debug this (note: the ability to put a breakpoint in a script would be nice, and dear god, do not ask for edit-and-continue in here, it is bad enough as it is!), I eventually added a reference to the WinForms namespace and debugged with a hundred MessageBox.Show statements. I did try to fire an event that would write to an event log, and I'm pretty sure that is where the main problem lies, as the package is still trying to load/use/connect to some log options, but it did run for a while after I abandoned this path. Either way, I need some clue as to which bits of the XML file I can hack, because that is all I can get access to. Four miles of XML, and the errors below relate to something somewhere within it. Clue me in please, or it's curtains. Note also, the package is corrupt everywhere - in bin and in deployment. Couldn't the last good compile be saved somewhere for recovery purposes, as opposed to being overwritten with the busted one?
Error 1 Error loading Copy of pckSplitSourceNQ.dtsx: Element "{AD446434-4355-42C7-AD07-3D64A37C1671}" does not exist in collection "LogProviders". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 2 Error loading Copy of pckSplitSourceNQ.dtsx: Element "{F43808C7-03B1-4A09-86B1-08274073DA5C}" does not exist in collection "Executables". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 3 Error loading Copy of pckSplitSourceNQ.dtsx: Error loading value "<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts" IDREF="{F43808C7-03B1-4A09-86B1-08274073DA5C}" DTS:IsFrom="0"/>" from node "DTS:Executable". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 4 Error loading Copy of pckSplitSourceNQ.dtsx: Error loading value "<DTS:PrecedenceConstraint xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="Value">0</DTS:Property><DTS:Property DTS:Name="EvalOp">2</DTS:Property><DTS:Property DTS:Name="LogicalAnd">-1</DTS:Property><DTS:Property DTS:Name="Expression"></" from node "DTS:PrecedenceConstraint". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 5 Error loading Copy of pckSplitSourceNQ.dtsx: Error loading value "<DTS:SelectedLogProvider xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:InstanceID="{AD446434-4355-42C7-AD07-3D64A37C1671}"/>" from node "DTS:SelectedLogProvider". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 6 Error loading Copy of pckSplitSourceNQ.dtsx: Error loading value "<DTS:LoggingOptions xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="LoggingMode">0</DTS:Property><DTS:Property DTS:Name="FilterKind">0</DTS:Property><DTS:Property DTS:Name="EventFilter" DTS:DataType="8">2,10,OnProgress,23,ScriptComponen" from node "DTS:LoggingOptions". G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 7 Error loading Copy of pckSplitSourceNQ.dtsx: Error loading a task. The contact information for the task is "Performs high-performance data extraction, transformation and loading;Microsoft Corporation; Microsoft SQL Server v9; (C) 2004 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1". This happens when loading a task fails. G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Error 8 Error loading 'Copy of pckSplitSourceNQ.dtsx' : The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails. . G:Visual Studio 2005 ProjectsSSISDataCleanseDataCleanseCopy of pckSplitSourceNQ.dtsx 1 1
Hello, I'm designing SQL Server 2005 SSIS packages. According to my requirements I have a sequence container. It has a few data flow tasks; on success of one, the next one runs. If any one of them fails, then the whole transaction should be rolled back. Each data flow task transfers data from one server to a similar table on another server.
There are special instructions for installing service packs on desktop edition installations -- command line options etc. How do you install hotfix roll-ups? Does build 2187 apply to desktop edition?
Hello, I have a problem with inserting via multiple queries into different tables for a single record. I have the main record for a candidate, and now I want to insert the candidate's labour info and the candidate's passport details into different tables, such as candidatLabour and candidatePassport. I used two stored procedures for this and wrote the code for it, and it works fine. But I am worried that if one SP executes and one record is inserted, and then some problem occurs and the 2nd SP does not execute, then... So please help me. Thanks
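One option, assuming both stored procedures are called on the same connection and you are on SQL Server 2005 or later, is to wrap the two calls in a single transaction so that if the second one fails the first insert is undone as well; the procedure and parameter names below are placeholders:

DECLARE @CandidateID INT;
SET @CandidateID = 123;   -- example value

BEGIN TRY
    BEGIN TRANSACTION;

    EXEC dbo.usp_InsertCandidateLabour   @CandidateID = @CandidateID;   -- 1st SP
    EXEC dbo.usp_InsertCandidatePassport @CandidateID = @CandidateID;   -- 2nd SP

    COMMIT TRANSACTION;          -- both inserts succeeded
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;    -- the 2nd SP failed, so the 1st insert is rolled back too
END CATCH;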
I am trying to implement a log shipping scenario in SQL 2005 where the secondary server is in standby mode, with the ability to change roles during a failover.
With the help of BOL (ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/udb9/html/2d7cc40a-47e8-4419-9b2b-7c69f700e806.htm) I can implement my scenario in recovery mode, but not in standby mode. I use the following SQL to put my primary in standby:
BACKUP LOG [database] TO DISK = @filename
WITH STANDBY = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Backup\ROLLBACK_UNDO_database.BAK'
GO
which works, but then my restore job fails on the last step. How can I put my primary db in standby mode in such a way that the log shipping restore will work?
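For the restore side, the secondary's restore job has to apply each shipped log backup WITH STANDBY (pointing at an undo file) instead of WITH NORECOVERY; a sketch with placeholder file names:

-- Applied on the secondary for each shipped log backup.
RESTORE LOG [database]
FROM DISK = N'D:\LogShip\database_tlog.trn'
WITH STANDBY = N'D:\LogShip\ROLLBACK_UNDO_database.BAK';   -- the undo file keeps the db readable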
I missed the window to restore to a point in time (10/23 6 PM) due to our purge cycle in our production environment, but I was able to obtain the 10/18 full backup, the 10/23 differential backup, and the four 10/23 transaction log backups. I moved all the aforementioned files to a staging environment, and now I am trying to restore all of the files to 10/23 6 PM and I get:
"The log or differential backup cannot be restored because no files are ready to rollforward" error.
Does anybody know of a way to roll back SQL Server 2005 databases to SQL Server 2000? Is there a way of doing it without resorting to the Copy Database Wizard? I'd love to find a way of attaching a SQL Server 2005 database to a SQL Server 2000 instance without any issues.
I recently upgraded to SS 2005 and I am very unhappy with it, and I want to roll back to SS 2000, which was a lot more stable. I am having several major issues that are affecting my whole company's day-to-day operations, and the managers are not happy.
Some of the issues include the night-time batch running very sluggishly for no apparent reason. This is the biggest problem because it only occurs about once a week and disturbs the daily activities when the night-time processing isn't completed on time. The rest of the time, the batch processing runs great, even a little better than on SS 2000. I don't believe it is a matter of my application needing to be retuned, because if that were the case, why isn't it running sluggishly every night? Also, it's never the same day that the sluggish behaviour occurs. If it were occurring on the same night, I would have something to investigate within our application, but it isn't.
Another issue involves a night-time job that restores a copy of the production database to the data warehouse server, to be used for updating the data warehouse. Again, most of the time it runs great (~2 1/2 hours), but once or twice a week it goes stupid and takes 6 1/2 hours for no apparent reason. Again, it does not happen on the same day either, which could give me something to investigate. On SS 2000 this same job ran flawlessly; I never ran into a situation where the database restore took that long.
Yet another issue involves a SQL Server Agent job that was put into a suspended state. What's a suspended state, and how can I get the job out of it? I can find no information about the suspended state in BOL; I did a Google search and nothing came up. If this suspended state was put in for security reasons, great, but then tell me how I can remove it.
I am also not happy with the fact that I can't get accurate information about the queries that are actively running at a particular moment. In SS 2000, when I noticed high CPU usage on the server, I would run the sp_who2 active stored proc and it would show me all the active threads and how much CPU each was consuming. I would then find the running threads with the highest CPU numbers and investigate the query to see if we could improve it. Now in SS 2005, I get into the same situation, run the sp_who2 stored proc, and there is no smoking gun: all of the active threads show very little CPU usage, which I am very suspicious of. What the heck happened to sp_who2? I looked at some of the other ways of viewing running processes (e.g. sys.sysprocesses) and they don't appear to give the information that I need.
I am very unhappy and I just want to roll back to SS 2000 and wait a couple of years before I upgrade to SS 2005.
Is it possible in SQL Server to have replication happen immediately as changes are made? That is, a change is made on server A, and that change is automatically applied to server B, rather than the replication happening at set intervals? Thanks.