I have a query batch "update" script that upgrades my users database from,
say, version 0 to version 1, or from version 1 to version 2. I would like to
know how I can wrap the entire script in a transaction, so that either the
whole thing succeeds or none of it does.
For example:
BEGIN TRANSACTION
.....
..... Alter some tables
.....
GO
.....
..... Alter a stored procedure
.....
GO
.....
..... Create a new stored procedure
.....
GO
COMMIT TRANSACTION
or
ROLLBACK TRANSACTION
GO
(how do I get to the "ROLLBACK TRANSACTION" if an error occurs in the update
script?)
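One well-known pattern for this (a sketch; the ALTER below is a hypothetical upgrade step): a transaction can span the GO separators, so set NOEXEC ON as soon as a batch fails, let it skip every remaining batch, and commit only if the transaction survived. SSMS generates its own change scripts in this style.

BEGIN TRANSACTION
GO

ALTER TABLE dbo.Users ADD SchemaVersion int NULL  -- hypothetical upgrade step
GO
IF @@ERROR <> 0 SET NOEXEC ON  -- on failure, skip all remaining batches
GO

-- ... each further "alter table / alter procedure / create procedure" batch
-- is followed by the same IF @@ERROR guard ...

IF @@TRANCOUNT > 0 COMMIT TRANSACTION  -- skipped when NOEXEC is ON
GO
SET NOEXEC OFF  -- still processed while NOEXEC is ON
GO
IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION  -- a batch failed; undo everything
GO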
Hello, I have a problem inserting a single record into multiple tables. I have the main record for a candidate, and now I want to insert the candidate's labour info and passport details into different tables (candidatLabour and candidatePassport). I use two stored procedures for this and wrote code for it, and it works fine. But I worry that if the first procedure executes and inserts its record, and then some problem occurs so the second procedure never executes, the data is left half-inserted. So please help. Thanks
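A minimal sketch of the usual fix, assuming SQL Server 2005 or later and hypothetical procedure names: wrap both calls in one transaction inside TRY/CATCH, so a failure in either procedure undoes both inserts.

DECLARE @CandidateId int
SET @CandidateId = 42  -- id of the already-inserted main record (hypothetical)

BEGIN TRY
    BEGIN TRANSACTION
    EXEC dbo.InsertCandidateLabour   @CandidateId = @CandidateId  -- hypothetical proc
    EXEC dbo.InsertCandidatePassport @CandidateId = @CandidateId  -- hypothetical proc
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION  -- neither row survives if either insert fails
END CATCH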
Here the SELECT query fetches the records corresponding to ITEM_DESCRIPTION in 5 separate transactions. How can I change the cursor so that the 5 records are returned at one time, in a single transaction?
CREATE TABLE #ITEMS (
    ITEM_ID uniqueidentifier NOT NULL,
    ITEM_DESCRIPTION VARCHAR(250) NOT NULL
)

INSERT INTO #ITEMS
VALUES
    (NEWID(), 'This is a wonderful car'),
    (NEWID(), 'This is a fast bike'),
    (NEWID(), 'This is a expensive aeroplane'),
    (NEWID(), 'This is a cheap bicycle'),
    (NEWID(), 'This is a dream holiday')

---

DECLARE @ITEM_ID uniqueidentifier
DECLARE ITEM_CURSOR CURSOR
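For what it's worth, a single set-based SELECT returns all five rows in one statement, and therefore in one (implicit) transaction, with no cursor at all; a minimal sketch against the temp table above:

SELECT ITEM_ID, ITEM_DESCRIPTION
FROM #ITEMS
ORDER BY ITEM_DESCRIPTION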
We have a scenario in a batch job. There are 3 stored procedures which are executed for every record in a table. After the first procedure executes, the second executes depending on the result of the first; similarly for the 2nd and 3rd procedures.
Now if any procedure's execution fails, the whole transaction for that record in the table has to be rolled back.
Can this whole process of executing the multiple stored procedures inside a single transaction be accomplished using Service Broker, with either a single queue or multiple queues?
The fan-out looks like this: sp1 processes 1 record; sp2 produces 3 records for each record from sp1; similarly, for each record from sp2, sp3 executes for multiple records.
Or, in other words: if processing of any message in a queue fails, can all the messages that have already been processed be rolled back, with no further execution happening?
I would also like to know whether a conversation group can be rolled back if processing of any message in the conversation group fails. I am asking because we can club sp2 and sp3 together to get the results directly and then try for parallel processing.
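On the rollback question, the usual shape is to do the RECEIVE and all downstream work inside one transaction, so a failure puts the message back on the queue; a rough sketch (the queue name is hypothetical, and note that five consecutive rollbacks on the same queue mark the messages as poison and disable the queue):

DECLARE @handle uniqueidentifier, @msgType sysname, @body varbinary(max)

BEGIN TRANSACTION
BEGIN TRY
    WAITFOR (
        RECEIVE TOP (1)
            @handle  = conversation_handle,
            @msgType = message_type_name,
            @body    = message_body
        FROM dbo.RecordQueue        -- hypothetical queue
    ), TIMEOUT 5000

    IF @handle IS NOT NULL
    BEGIN
        EXEC dbo.sp1 @body          -- the three dependent procedures
        EXEC dbo.sp2 @body
        EXEC dbo.sp3 @body
        END CONVERSATION @handle
    END
    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION        -- the message returns to the queue
END CATCH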
I have over 500 transaction records handled within a single SQL transaction (Begin, Commit, Rollback and End) in one DB process. Is there any limit on the number of records (e.g., over 500) that the following rollback function can handle?

Public Shared Sub RollBackTransaction()
    Dim transactionObj As Object
    Try
        transactionObj = SqlTransaction.GetExistingTransaction
        If (Not IsNothing(transactionObj)) Then
            CType(transactionObj, SqlTransaction).RollBack()
        End If
    Catch ex As Exception
        Throw New Exception(ex.Message)
    End Try
End Sub
I have a transaction table that stores all the transactions that happened on a single day. As per my requirement, I wrote the query like this:

SELECT CurrencyCode, TransactionCode, TransactionAmount, COUNT(TransactionCode) AS [No. Of Trans]
FROM TransactionDetails
WHERE TransactionCode IN ('BNT', 'BCN', 'BTC', 'STC', 'SCN', 'SNT')
GROUP BY TransactionCode, CurrencyCode, TransactionAmount
ORDER BY CurrencyCode

I got the result like this
Now I want to show the result like this (one block per currency, here ARS):

ARS
BNT    0    0
BCN    0    0
SCN    1    12
BTC    0    0
STC    0    0
SNT    0    0
and so on for all the currencies in the list. How can I achieve this?
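A sketch of one way to get the zero rows (assuming the column names above, and SQL Server 2008+ for the VALUES row constructor): build every currency/code pair first, then left-join the details so missing combinations count as zero.

SELECT c.CurrencyCode,
       t.Code                              AS TransactionCode,
       COUNT(d.TransactionCode)            AS [No. Of Trans],
       ISNULL(SUM(d.TransactionAmount), 0) AS TransactionAmount
FROM (SELECT DISTINCT CurrencyCode FROM TransactionDetails) AS c
CROSS JOIN (VALUES ('BNT'), ('BCN'), ('BTC'), ('STC'), ('SCN'), ('SNT')) AS t(Code)
LEFT JOIN TransactionDetails AS d
       ON d.CurrencyCode = c.CurrencyCode AND d.TransactionCode = t.Code
GROUP BY c.CurrencyCode, t.Code
ORDER BY c.CurrencyCode, t.Code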
First of all, I do not know whether this is the right forum to ask this question, but let me describe the scenario. I am generating XML files at a particular place and sending them to a server:

xml1 ---> dataset1 ---> adapter1.Update(dataset1)
xml2 ---> dataset2 ---> adapter2.Update(dataset2)
xml3 ---> dataset3 ---> adapter3.Update(dataset3)

All three updates should happen in one transaction; if any one of the updates fails, the transaction should roll back. Can anyone tell me a way to do it? I am desperately in search of any way to do this. Can anybody help, please?
Goofed up and ran an update query. It messed up all the data in a single table. I'm trying not to restore the table from a previous backup since the backup is more than 20 GB. It's going to take forever to restore it. Any advice would be much appreciated!
I am using Enterprise Manager to generate a script of the stored procedures on my database. Under File|Options, I selected Windows Text(ANSI). The file which is generated contains all stored procedures, but there are some procedures which are so long that they wrap to the next line.
When I then try to run this script from ISQL in a batch file, the procedures that wrap error out. Is there any way to generate the script in a more controlled manner, avoiding the wrapping?
In some text boxes within the table header, I want to force the text to wrap. This would be comparable to using Alt+Enter in Excel. For example, the text box may read like this...
I have a report that contains a table. I would like the table to wrap to a second column on the report page to decrease the number of pages the report spans. Is there a way to tell the table to span to the second column before going onto the next page?
Visual Studio .NET seems to default to single-batch mode when running SQL scripts. Does anyone know how to change this behavior? Typical batches will include object-existence, drop, and create batches prior to processing. I have attempted removing the graphical plan, using the batch separator 'go', and the VBA trick using ';', to no avail. TIA
I'm expecting to revamp some stored procs so that their selects are executed on a dynamic string that always returns the same columns but varies the sources.
I'm concerned that the bread and butter of products like RS and SSIS is the ability to predict what columns, and what column types to expect from a query, but that introducing dynamic sql will complicate using them.
I'm motivated not to use temp tables or table vars if possible. I'm also somewhat motivated to learn of a solution that works equally well in 2000 and 2005.
I've tried wrapping dynamic calls in a select as shown below but to no avail...
After…
declare @sqlString nvarchar(4000)
set @sqlString = 'select * from [' + @dbName + '].[dbo].[activity]'
I've already tried things like…

select a.* from exec sp_executesql @sqlString a
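For reference, selecting directly from EXEC is not valid T-SQL; the usual workaround, even though the poster hopes to avoid temp tables, is INSERT ... EXEC. A minimal sketch (the database name and column list are hypothetical), which works on both 2000 and 2005:

DECLARE @dbName sysname
SET @dbName = N'SomeDb'  -- hypothetical

DECLARE @sqlString nvarchar(4000)
SET @sqlString = N'select ActivityID, ActivityName from [' + @dbName + N'].[dbo].[activity]'

CREATE TABLE #activity (ActivityID int, ActivityName varchar(100))  -- hypothetical columns
INSERT INTO #activity
EXEC sp_executesql @sqlString

SELECT a.* FROM #activity AS a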
I am using SQL Server Express and Visual Studio 2005. I am new to batches and am trying to understand how they work. I am trying to write a query that creates an assembly and the functions that are contained in it. Here is my query:
USE ProductsDRM
GO
IF NOT EXISTS (SELECT 'True' FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
BEGIN
CREATE ASSEMBLY ComputedColumnFunctions
FROM 'C:\Websites\AssemblyTest\StoredFunctions\StoredFunctions\bin\StoredFunctions.dll'
GO
CREATE FUNCTION fImageFileName (
    @ProductID int,
    @ImageSizeCode nvarchar(4000)
)
RETURNS nvarchar(4000)
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].ImageFileName
GO

CREATE FUNCTION fTestInt (
    @ProductID int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt
GO

CREATE FUNCTION fTestInt2 (
    @TestInt int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt2
END
ELSE
BEGIN
PRINT 'The assembly named "ComputedColumnFunctions" already exists. No new assembly was created.'
END
GO
I read in a book on SQL Server 2005 about including a test for whether the object (an assembly in this case) exists before trying to create it. If I include only the CREATE ASSEMBLY statement and the FROM line below it, and delete everything from the next GO down through the last CREATE FUNCTION (just before the END ELSE), it works fine. If I leave it as is, I get a runtime error on the GO line just after the CREATE ASSEMBLY statement. What am I doing wrong?
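The error comes from the GO after CREATE ASSEMBLY: GO is a client-side batch separator, not a T-SQL statement, so it cannot sit inside an IF ... BEGIN ... END block, and CREATE FUNCTION must be the first statement of its own batch anyway. One common fix, sketched below, is to guard each object in its own batch and create the functions through EXEC():

IF NOT EXISTS (SELECT 1 FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
    CREATE ASSEMBLY ComputedColumnFunctions
    FROM 'C:\Websites\AssemblyTest\StoredFunctions\StoredFunctions\bin\StoredFunctions.dll'
ELSE
    PRINT 'The assembly named "ComputedColumnFunctions" already exists. No new assembly was created.'
GO

IF OBJECT_ID('dbo.fImageFileName') IS NULL
    EXEC('CREATE FUNCTION fImageFileName (@ProductID int, @ImageSizeCode nvarchar(4000))
          RETURNS nvarchar(4000)
          AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].ImageFileName')
GO
-- fTestInt and fTestInt2 get the same treatment, each in its own batch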
I am using FOR XML to generate the HTML, and the following is my query. I am using td { white-space: nowrap; } so that the columns do not wrap, but there are two columns which I do need to wrap, and I have tried lots of options without success.
In my report, I have a Matrix control placed next to a table. It renders properly and displays data aligned in two controls in Visual Studio Preview. However, when I deploy to production, it wraps the matrix control below the table, in fact puts the entire matrix control underneath the table. Why such strange behavior and not in the Preview of the report but only in Production? Any ideas how to fix this?
Hello I was very happy to have found the thread http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=491642&SiteID=1 that explained how to get the text to display Bottom-to-Top/Left-to-Right.
The solution was to set up a function that creates a bitmap of the text to be displayed. This works well and correctly displays the text image in HTML and PDF. (Excel, XML and CSV won't export background images.)
Extending the solution to wrap text requires a few additional lines...
Code Snippet

Function LoadImage3(ByVal stText As String)
    ' Pad the text so short strings still produce a usable image
    stText = stText.PadRight(10)
    stText = stText.PadLeft(10)

    Dim iMaxLength As Integer = 180
    Dim iMaxWidth As Integer = 180
    Dim f As Drawing.Font = New Drawing.Font("Arial", 7, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point)
    Dim bmpImage As New Drawing.Bitmap(1, 1)
    Dim MyGraphics As Drawing.Graphics = Drawing.Graphics.FromImage(bmpImage)
    Dim imageSize As Drawing.SizeF = MyGraphics.MeasureString(stText, f)

    ' Draw the text rotated 270 degrees into a fixed-size bitmap,
    ' letting DrawString wrap it within the bounding rectangle
    Dim i As New System.Drawing.Bitmap(iMaxWidth, iMaxLength)
    Dim g As Drawing.Graphics = Drawing.Graphics.FromImage(i)
    Dim rectF1 As New Drawing.Rectangle(-(iMaxWidth / 2), -(iMaxLength / 2), iMaxWidth, iMaxLength)
    g.FillRectangle(Drawing.Brushes.White, 0, 0, i.Width, i.Height)
    g.TranslateTransform((iMaxWidth / 2), (iMaxLength / 2))
    g.RotateTransform(270.0F) ' flip the image 270 degrees
    g.DrawString(stText, f, Drawing.Brushes.Black, rectF1)
    g.DrawRectangle(New Drawing.Pen(Drawing.Color.White, 1), rectF1)
    g.Flush()

    ' Serialize the bitmap to a JPEG byte array for the report image
    Dim stream As IO.MemoryStream = New IO.MemoryStream
    Dim bitmapBytes As Byte()
    i.Save(stream, System.Drawing.Imaging.ImageFormat.Jpeg)
    bitmapBytes = stream.ToArray
    stream.Close()
    i.Dispose()
    Return bitmapBytes
End Function
The items highlighted in yellow in the original post reflect the changes made to the original solution.
I developed a report with some values in text boxes. I want the output not to wrap around to the next line, but to be truncated if it is longer than the text box. Is there any setting I can use for this? Because values are going to the second line when they are too long; for example, printing a first and last name causes the last name to go to the second line.
We had finished converting a lot of our reports from Crystal Reports to SSRS. Upon doing so, one report, the customer's invoice, has comments that contain "(s)" at the end of a word to indicate singular or plural. Crystal Reports handled the wrap correctly and kept the word together, including the "(s)", but in SSRS the "(s)" is being wrapped to the next line, which doesn't look good at all.
I have an application that processes a large number of input files in a CSV format and then posts the data to a table on SQL Server Express.
The data table can end up very large and I have no requirements to store all the data locally. I have included the table in my DataSet using visual studio express so I have access to the schema, but will not run Fill() on it.
Ideally I would like to process a CSV file at a time. I can add records to my local (empty) data table and when I am happy, I can call tableAdapter.Update() or dataSet.DataTable.AcceptChanges() to generate lots of SQL 'INSERT' commands to update the physical database at the server end.
I would then like to empty my local data table (so it doesn't get too big) and repeat the same process over again for each CSV file.
How can I empty my local table without causing it to generate a load of SQL 'DELETE' commands? I want to empty the table and fool ADO.NET into thinking that everything is synchronised, as if it has just done an update but actually hasn't.
Hello SQL Team! I've been stuck on this problem for days and need help. The problem is with the GO keyword. I know that GO causes the batch to be executed and that all local variables are lost, but I can't seem to find a workaround. I would like each stored procedure to be executed in a separate batch.
create table sql_cmd (cmd nvarchar(255))

insert into sql_cmd (cmd) values ('exec user_sp param1,''param2''')
insert into sql_cmd (cmd) values ('exec user_sp2 param1,''param2''')

declare @sql nvarchar(255)
declare c_sql cursor -- small table, performance not a problem; also open to other suggestions
    for select cmd from sql_cmd
open c_sql
fetch next from c_sql into @sql
while @@fetch_status = 0
begin
    print @sql
    exec sp_executesql @sql
    GO
    fetch next from c_sql into @sql
end
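Since GO is only a separator recognized by client tools (SSMS, sqlcmd, osql), it can never appear inside a WHILE loop; but it also isn't needed here, because each exec sp_executesql call already runs its command in its own scope with its own local variables. A sketch of the loop without the GO:

DECLARE @sql nvarchar(255)
DECLARE c_sql CURSOR FOR SELECT cmd FROM sql_cmd
OPEN c_sql
FETCH NEXT FROM c_sql INTO @sql
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @sql
    EXEC sp_executesql @sql   -- executes in its own scope, much like its own batch
    FETCH NEXT FROM c_sql INTO @sql
END
CLOSE c_sql
DEALLOCATE c_sql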
Recently I was stumped by a problem where I was granting permissions to a user from within a script that was creating a stored procedure. Then I would check the permissions on the procedure, but the permissions were empty. It turned out that I was missing a "go" statement to separate the create procedure statement from the grant statement. That got me thinking about what other kinds of statements must be separated into their own batches. I would think that anything being created must be separated from any statements that grant or alter permissions, because the items need to exist first.
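For the record, the rule here is that CREATE PROCEDURE (like CREATE FUNCTION, CREATE VIEW and CREATE TRIGGER) must be the only statement in its batch, so anything that follows it, including a GRANT, needs its own batch. A minimal sketch with hypothetical names:

CREATE PROCEDURE dbo.MyProc   -- hypothetical procedure
AS
    SELECT 1
GO
GRANT EXECUTE ON dbo.MyProc TO SomeUser   -- hypothetical user; runs in a separate batch
GO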
So I tweaked a stored procedure that did a one-hour update for a specific countryId.
If that procedure was called at the same time with two different countryIds, one update took place and then the other.
Since all the rows are distinct, I switched to a batch update that updates only 10,000 rows at a time (and not all of the 2 million).
The general locking looked better afterwards, but now I get strange deadlocks.
My theory:
TX1 updates a row on page P1 with a row lock. TX2 also does this on P1. Now TX1 decides to escalate to a page lock. TX1 waits for TX2 to be done with its row so it can lock the page. TX2 waits for TX1 to leave the page, since TX2 may also want a page lock.
Everybody waits for everybody else, so we have a deadlock. Is that plausible? Or is there another common problem when doing batch updates on the same table with distinct rows (which can of course still land on the same data page)?
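For reference, the batched-update pattern being described looks roughly like this (table and column names are hypothetical). One caveat on the theory: SQL Server escalates row locks to table locks, not page locks, but an equivalent cycle between row and page/table locks can still arise, and processing the batches in clustered-key order is the usual mitigation.

DECLARE @countryId int
SET @countryId = 1  -- hypothetical

DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) dbo.BigTable   -- hypothetical table
    SET    Processed = 1
    WHERE  CountryId = @countryId
      AND  Processed = 0
    SET @rows = @@ROWCOUNT
END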
Hi, I would like to delete data from a 750-million-row table in chunks of 10,000, without blocking the users. As ours is a 24/7 shop, I do not want to block the users for a long time. An answer to this is highly appreciated. Thanks, Samna
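The common shape for this is a loop of small DELETE TOP batches (SQL Server 2005+ syntax; the table and filter below are hypothetical), so each statement is its own short transaction and any blocking is brief:

DECLARE @cutoff datetime
SET @cutoff = '20100101'  -- hypothetical retention boundary

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.BigTable   -- hypothetical table
    WHERE CreatedDate < @cutoff

    IF @@ROWCOUNT = 0 BREAK
END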
I want to update tableToUpdate in batches of 5000 rows per batch, setting lastenecryptionDT to NULL based on a join to tableValues on the ENCRYPTIONID column, and also output the updated rows into another table, in case I need to do a rollback.
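A sketch of that, reusing the names from the post (the backup table is hypothetical): UPDATE TOP caps each batch at 5000 rows, and the OUTPUT clause captures the pre-update values so a manual rollback stays possible.

DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    UPDATE TOP (5000) t
    SET    t.lastenecryptionDT = NULL
    OUTPUT deleted.ENCRYPTIONID, deleted.lastenecryptionDT
    INTO   dbo.tableToUpdate_Backup (ENCRYPTIONID, lastenecryptionDT)   -- hypothetical
    FROM   dbo.tableToUpdate AS t
    INNER JOIN dbo.tableValues AS v
            ON v.ENCRYPTIONID = t.ENCRYPTIONID
    WHERE  t.lastenecryptionDT IS NOT NULL

    SET @rows = @@ROWCOUNT
END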
I need to group the records randomly into 'n' batches. That can be done with NTILE, but I want similar records grouped into a single batch.
Say, for example, the following is the list of records in my table, which I want to group into 5 batches:
A123 A124 A124 A123 A127
After NTILE I will get the below:
The desired output is like NTILE's, but all rows with the same ID should reside in a single batch.
Even with n = 5, at most 3 batches are possible here, since there are only 3 distinct IDs.
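One way to get that (a sketch; the table and column names are hypothetical): apply NTILE to the distinct IDs only, then join the result back, so every copy of an ID lands in the same batch. With only 3 distinct IDs, NTILE(5) naturally yields just 3 batches.

WITH ranked AS (
    SELECT Id,
           NTILE(5) OVER (ORDER BY Id) AS BatchNo   -- ORDER BY NEWID() for a random split
    FROM (SELECT DISTINCT Id FROM dbo.SourceTable) AS d
)
SELECT s.Id, r.BatchNo
FROM dbo.SourceTable AS s
JOIN ranked AS r ON r.Id = s.Id
ORDER BY r.BatchNo, s.Id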
In another forum post, a poster was deleting large numbers of rows from a table in batches of 50,000.
In the bad old days ('80s - '90s), I used to have to delete rows in batches of 500, then 1000, then 5000, due to the size of the transaction rollback segments (yes - Oracle).
I always found that increasing the number of deleted rows in a single statement/transaction improved overall process speed - up to some magic point, at which some overhead in the system began slowing the deletes down, so that deleting a single batch of 10,000 rows took more than twice as much time as deleting two batches of 5,000 rows each.
Are there good rule-of-thumb numbers (or even better, some actual statistics and/or explanations) as to how many records should be deleted in a single transaction/statement for optimum speed? 50,000? 100,000? 1,000,000? Unlimited? Are there significant differences between 2008, 2012 and 2014?
We are using the Transfer SQL Server Objects Task to transfer a large table. The trans log is filling up for this table. Is there a method to split the Data Transfer Task into smaller batches? (Smaller tables are transferring without issue.)
We have a SQLServer 2005 Enterprise merge replication publication with SQL Mobile 3.0 subscribers (Windows Mobile 5.0 and 6.0). We do not use pre-computed partitions due to trigger performance issues with an SSIS/ETL application that supplies data to the merge database. We do use the "Optimize" (=true) option, though we have tried this both ways with no significant differences. We use filters and joins for each worker ID (as HOST_ID) from the subscriptions.
The sync times become increasingly worse after we run the snapshot and bring the publication online. I have tried rerunning the snapshots, this helps little, as it often behaves like the subscription was set to reinitialize and forces a big sync (reload of all data) to the subscriber. We have tried much of the obvious (e.g., flattening filters and joins, adding indexes, etc.).
When users are synchronizing, we watch replication monitor and notice that a lot of time is spent processing "enumerating inserts and updates for article [any article]", especially processing the many generations and batches. This is true for any follow-up syncs after the 1st big sync (initializing the subscription).
I read several posts regarding the batches and generations of changes, and decided to try increasing the "DownloadGenerationsPerBatch" setting. I tried adding this parameter to the snapshot agent job, and the job fails each time with a vague message, even with the default value of 100. How do you change this parameter for SQL Server 2005 Enterprise?