Please consider this example:
BEGIN TRAN
    INSERT INTO OrderDetail (fields) VALUES (values)
    INSERT INTO Orders (fields) VALUES (values)
COMMIT TRAN
There is a trigger on Orders that takes some fields from OrderDetail and puts them into some other table.
When the trigger fires, can it see the OrderDetail rows even though the COMMIT has not occurred yet?
I'm wondering if I should use isolation level READ UNCOMMITTED.
After all, if the whole transaction rolls back, my trigger activity will be rolled back as well. Comments?
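For context, the trigger I have in mind is roughly like this (just a sketch; every name other than Orders and OrderDetail is made up):

CREATE TRIGGER trg_Orders_AfterInsert ON Orders
AFTER INSERT
AS
BEGIN
    -- copy detail information for the newly inserted orders into another table
    INSERT INTO OrderAudit (OrderID, DetailCount)
    SELECT i.OrderID, COUNT(d.OrderID)
    FROM inserted AS i
    LEFT JOIN OrderDetail AS d ON d.OrderID = i.OrderID
    GROUP BY i.OrderID;
END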
I have a database that gets refreshed every day from a client's data, and we need to pull heavy data from it every day for reports, so only SELECTs happen on that database.
We do a daily population of some tables in other databases from this daily-refreshed DB.
Will READ UNCOMMITTED or NOLOCK with the SELECT queries retrieve the data faster?
There should be no dirty reads, since there are no DML operations in that database. So for the SELECTs that run concurrently on these tables, will NOLOCK work?
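The kind of query I mean is simply a hinted SELECT, or the session-level setting (sketch; the table and column names are placeholders):

-- per-query table hint
SELECT t.Col1, t.Col2
FROM dbo.ClientReportTable AS t WITH (NOLOCK);

-- or, for the whole session
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT t.Col1, t.Col2
FROM dbo.ClientReportTable AS t;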
I have a question about the locking pattern of the read committed snapshot isolation level: when two transactions update two different records, why do they block each other even though a previously committed value (an old version of the record) is available?
I executed the batch below from a query window in SSMS:
--Session 1:
use adventureworks
create table marbles (id int primary key, color char(5))
insert marbles values(1, 'Black')
insert marbles values(2, 'White')
alter database adventureworks set read_committed_snapshot on
set transaction isolation level read committed
begin tran
update marbles set color = 'Black' where color = 'White'
--commit tran
Before committing the first transaction, I executed the query below from a second query window in SSMS:
--Session 2:
use adventureworks
set transaction isolation level read committed
begin tran
update marbles set color = 'White' where color = 'Black'
commit tran
Here the first session blocks the second session. The same two transactions run simultaneously under snapshot isolation. So my question is: why is this blocking required under read committed with snapshot isolation?
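For reference, the snapshot isolation variant I compared against simply enables snapshot isolation on the database and switches the session isolation level; a sketch of session 2 under that setup (session 1 was changed the same way):

--Session 2, snapshot variant:
alter database adventureworks set allow_snapshot_isolation on
set transaction isolation level snapshot
begin tran
update marbles set color = 'White' where color = 'Black'
commit tran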
I have several databases set to the read committed snapshot isolation level. Tempdb is configured according to best practices, but I don't see it being used much.
The application uses EF6, and it calls the stored procedures in the following way
I receive Error: 3967, Severity: 17, State: 1: "Insufficient space in tempdb to hold row versions." We have 8 data files for tempdb, currently 10210 GB in size, with 10240 GB set as the max size.
As MS suggests, to calculate the tempdb file size and growth rate we need to monitor the performance counters Free Space in tempdb (KB) and Version Store Size (KB) in the Transactions object.
Basic formula: [Size of Version Store] = 2 * [Version store data generated per minute] * [Longest running time (minutes) of your transaction]
My disk utilization report says tempdb is full. I think I need to shrink the file.
I am still confused about calculating the size. My performance counters give me the following data:
Free Space in tempdb (KB): 279938496
Version Generation rate (KB/s): 53681040
Version Cleanup rate (KB/s): 53422320
Version Store Size (KB): 258720
Version Store unit count: 22
Version Store unit creation: 774
Version Store unit truncation: 752
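For reference, I pull these numbers with a query like the one below rather than PerfMon (sketch; the LIKE filter is deliberately loose to cover named instances):

SELECT counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Transactions%'
  AND counter_name IN ('Free Space in tempdb (KB)',
                       'Version Store Size (KB)',
                       'Version Generation rate (KB/s)',
                       'Version Cleanup rate (KB/s)');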
I am inserting/updating a few tables under snapshot isolation and reading the same set of tables for reporting under read committed. I am seeing some deadlocks; I think that is expected in this situation, since X locks are not compatible with S and IS locks.
I have state, day_date, error, and text columns. If there is data, it shows all the columns. But if there are no comments, I would like to show "no comments" in the text field. Currently I have this stored procedure:
CREATE PROCEDURE dbo.up_daily_quad_text
@DAY_DATE datetime
AS
BEGIN
SELECT
dbo.adins_database.ZONE_NAME + ', ' + dbo.adins_database.ZONE_STATE AS [STATE AND LOCATION],
dbo.COMMENT_CATEGORY.NAME,
dbo.COMMENT.TEXT
FROM
dbo.adins_database,
dbo.COMMENT_CATEGORY,
dbo.COMMENT,
dbo.DIM_DATE
WHERE
( dbo.adins_database.adins_id=dbo.COMMENT.adinsdb_id OR (dbo.COMMENT.adinsdb_id is Null) )
AND ( dbo.COMMENT.comment_dt=dbo.DIM_DATE.DAY_DATE )
AND ( dbo.COMMENT_CATEGORY.category_id=dbo.COMMENT.category_id )
AND
dbo.DIM_DATE.DAY_DATE = @DAY_DATE
END
GO
GRANT EXECUTE ON dbo.up_daily_quad_text TO AdIns_SSRS
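One direction I was considering inside the procedure, rewritten with explicit joins so COMMENT can be outer-joined and a placeholder substituted when there is no comment (just a sketch, not verified against the real schema or the TEXT column's data type):

SELECT
    ad.ZONE_NAME + ', ' + ad.ZONE_STATE AS [STATE AND LOCATION],
    cc.NAME,
    ISNULL(c.TEXT, 'no comments') AS TEXT
FROM dbo.adins_database AS ad
CROSS JOIN dbo.DIM_DATE AS dd
LEFT JOIN dbo.COMMENT AS c
       ON (c.adinsdb_id = ad.adins_id OR c.adinsdb_id IS NULL)
      AND c.comment_dt = dd.DAY_DATE
LEFT JOIN dbo.COMMENT_CATEGORY AS cc
       ON cc.category_id = c.category_id
WHERE dd.DAY_DATE = @DAY_DATE;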
I have a package loading rows of employees from an Excel file. An employee may appear in the file multiple times. The package does a lookup, and if the employee is not found it inserts the employee. When the employee is found it does an update.
So employee A gets inserted on the first occurrence in the file. On the second occurrence, the lookup does not find the employee, tries to insert a second time, and I get a unique constraint violation. How do I make the first insert commit so that it is seen by the lookup on the second occurrence?
Hello, I am developing an application with a SQL Express database. The database contains a very simple table. I have added a DataSet object to the solution and generated a plain adapter object in the DataSet (using the standard VS wizard capabilities) to operate on one of the tables in the database, with just plain SELECT, INSERT operations, etc. The strange thing is, the data changes caused by the adapter's generated Insert command (which calls the INSERT statement on the database) are only visible while the application runs (or so it seems). For example:
- I start off with the table containing 2 records, and start debugging
- perform a COUNT within the code: 2 records
- call the Insert function once from code
- perform a COUNT within the code: 3 records
- stop debugging
- inspect the table: the 3rd record doesn't exist, just the 2 records
So it seems the changes don't get committed, although I am not sure it is a transaction/commit problem. I haven't been able to figure out what I can do to fix this. I have used adapters before on a SQL database with no problems there. Any comments appreciated. Cheers, JP
I have a queue in Service Broker which receives messages from a stored procedure invoked by a VS application. The messages are properly stored in the queue. I want to receive these messages one after the other and put them into a table. I am fetching the messages in a while loop which breaks when the messages run out.
The first record is inserted properly. In the next iteration of the loop I want to append the next message's data to that same record. This continues for approximately 10 minutes, after which a new record is created. When I attempt to append the data to the first record, that record is not being read even though I have used isolation level READ UNCOMMITTED.
What am I missing to achieve the desired results?
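The pattern I am following is roughly this (sketch; the queue, table, and column names are placeholders, and the 10-minute window logic is left out):

DECLARE @handle uniqueidentifier, @body varbinary(max);

WHILE 1 = 1
BEGIN
    WAITFOR (
        RECEIVE TOP (1)
            @handle = conversation_handle,
            @body   = message_body
        FROM dbo.TargetQueue              -- placeholder queue name
    ), TIMEOUT 1000;

    IF @@ROWCOUNT = 0 BREAK;              -- no more messages

    -- append to the current row if it exists, otherwise start a new one
    UPDATE dbo.MessageLog
       SET MessageData = MessageData + CAST(@body AS varchar(max))
     WHERE ConversationId = @handle;

    IF @@ROWCOUNT = 0
        INSERT dbo.MessageLog (ConversationId, MessageData)
        VALUES (@handle, CAST(@body AS varchar(max)));
END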
I have a problem on a production server at a client site (no backup copy, not replicated; it's a SQL Server 2000 Enterprise server). By mistake the client ran an UPDATE without a WHERE condition and updated hundreds of thousands (lakhs) of rows (in SQL Server, autocommitted). Now I need to recover this data from the log file (.ldf). I tried the trial version of Log Explorer (a third-party tool) and recovered data from the default databases (Northwind, Pubs), but the client is not willing to purchase this software for such a simple case. How can we recover the data from the log file?
1. Can we write a program in C# to read the SQL Server log and show past transactions?
2. Are there any stored procedures in SQL Server to read the day's transactions in the log file and take a backup?
3. How do I read the transaction log file in SQL Server 2000?
It's very urgent; I am not an expert in SQL Server 2000.
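From what I have found so far, the only built-in ways to look at the log are undocumented and unsupported, and they show raw log records rather than the old column values, so I am not sure they help (sketch only):

-- undocumented / unsupported; run in the context of the affected database
DBCC LOG ('YourDatabase', 3)           -- 'YourDatabase' is a placeholder
SELECT * FROM ::fn_dblog (NULL, NULL)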
After watching a video on how to create MS SQL replication, I configured distribution and created the publication. The problem comes with the subscription. If I create a PUSH subscription, it works fine. However, when I create a PULL subscription, I get the following error. Where should I look for a solution to this error?
Hi everyone. I've taken a while off from developing sites in ASP.NET, but I have a site I want to upgrade a little and need some help. What I have currently is a website about a person, with videos and images of that person. To view the videos, I have a "view_video.aspx?ID=" page that plays the video from YouTube by looking up the ID, YouTube URL, name, and description in my database. What I want to do is create a "comments" table so visitors can add comments for each video, and then display all the comments on the page.
So far, my "comments" table looks like this: ID, otherID, name, email, comment, type, date. ID is the id of the comment, otherID is the foreign key to the "videos" table, name is the name of the person leaving the comment, email is for that person, comment is the text, type is the type of video (tutorial, sampler, random video), and date is a timestamp for when the user leaves the comment.
So hopefully I have a good start. I don't have any code to show right at the moment since I am at work, but if anyone has any ideas or critiques so far, I'd love to hear them. This is an interesting project for my friend and I'd love to implement it sometime. Thanks, TetrisSmalls
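For what it's worth, a first sketch of that table as T-SQL might look like this (the data types are guesses on my part, and it assumes the videos table has an int ID primary key):

CREATE TABLE dbo.comments (
    ID      int IDENTITY(1,1) PRIMARY KEY,
    otherID int           NOT NULL REFERENCES dbo.videos (ID),  -- FK to videos
    name    nvarchar(100) NOT NULL,
    email   nvarchar(256) NOT NULL,
    comment nvarchar(max) NOT NULL,
    type    nvarchar(50)  NULL,      -- tutorial, sampler, random video
    [date]  datetime      NOT NULL DEFAULT GETDATE()
);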
I am working on a project that lets visitors to my webpage post comments. Their name, city, etc. are stored in a SQL database. This all works until I put text into the comments field of my form; then I get the message "String or binary data would be truncated" when it tries to execute. I was wondering what data type to use. I have used char and varchar and neither one is working. Is there anything else I could try to fix this?
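The fix I am considering is just widening the column so longer text fits, something like this (the table and column names are only examples, and varchar(max) needs SQL Server 2005 or later; on older versions a larger varchar(n) would be the equivalent):

ALTER TABLE dbo.Comments
    ALTER COLUMN CommentText varchar(max) NULL;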
We're migrating from Access to MS SQL Server 7.0. In the design view of an Access table, the Description column can also be used to hold comments beyond the description of the field, e.g. an explanation of the field's use ("pay: pay rate per hour in USD") or its allowed values ("status: O=open, C=closed"). I cannot find anything similar in SQL 7.0. If this option is not available, what other alternatives are there?
Patrick
PS. What is the CTEXT field inside the SYSCOMMENTS table?
I am trying to plan out a database that will have users and comments on those users. How should I structure this? Should each comment be attached to the user it describes or should each user have a list of the comments that describe it?
On quick observation, it seems to me DTS has a strange way of interpreting the file system. Environment: NT OS, SQL Server 2000. When a data import package (whose source points to c:\XYZdir) is executed at the EM level, SQL Server seems to treat "c:\XYZdir" as the current user's "c:\XYZdir". However, if the same package is scheduled and executed as a job, SQL Server seems to treat "c:\XYZdir" as the SQL Server installation machine's c:\XYZdir. It's quite inconsistent. Please let me know your findings about this.
Also, I use ActiveX scripts to perform exception handling (error checking) for certain transactions. For instance, before a package imports a file, an ActiveX script checks the source file's existence and format, and if they are not as expected it halts there (does not execute the package). Now, the two cases of (a) file does not exist and (b) incorrect file format can't be distinguished just by the package not running. Probably the ActiveX script should capture each case by creating a record/row during the file checking. What are your thoughts? TIA.
I'm trying to do SharePoint DR with log shipping, and everything is configured except one thing: switching the WSS_Content database (Standby/Read-Only) to be ready and writable.
I tried from the GUI, and also:
ALTER DATABASE [WSS_Content] SET READ_WRITE WITH NO_WAIT
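The other option I came across is to recover the standby database rather than toggling READ_WRITE (sketch; note this ends further log shipping restores for that database):

-- bring a log-shipped standby/read-only database fully online
RESTORE DATABASE [WSS_Content] WITH RECOVERY;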
I have two database files, one .mdf and one .ndf. The creator of these files marked them read-only. I want to attach these files to a new database, but cannot do so because they are read-only. I get this message:
Server: Msg 3415, Level 16, State 2, Line 1 Database 'TestSprintLD2' is read-only or has read-only files and must be made writable before it can be upgraded.
What command(s) are needed to make these files read_write?
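What I am planning to try (sketch; the paths are placeholders) is clearing the Windows read-only attribute on the files first and then attaching them:

-- 1) in Windows, clear the read-only flag on the files, e.g.:
--    attrib -R "C:\Data\TestSprintLD2*.*"
-- 2) then attach (SQL Server 2000 style):
EXEC sp_attach_db
    @dbname    = N'TestSprintLD2',
    @filename1 = N'C:\Data\TestSprintLD2.mdf',
    @filename2 = N'C:\Data\TestSprintLD2_1.ndf';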
I am reading about the RESTORE command for restoring to a point in time using logs. I would like to know the minimum (earliest) point in time I can recover to for a given backup image using T-SQL before applying a log restore, and what log ranges are needed during the restore.
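The sequence I am trying to understand looks roughly like this (sketch; the database name, paths, and STOPAT time are placeholders). My understanding so far is that the earliest point I can stop at is when the full backup finished, and the applied log backups must form an unbroken chain from that backup's last LSN up to the STOPAT time:

-- inspect the backup's FirstLSN/LastLSN and finish time
RESTORE HEADERONLY FROM DISK = N'C:\Backups\MyDb_full.bak';

-- restore the full backup without recovering
RESTORE DATABASE MyDb FROM DISK = N'C:\Backups\MyDb_full.bak' WITH NORECOVERY;

-- roll forward to a point in time and recover
RESTORE LOG MyDb FROM DISK = N'C:\Backups\MyDb_log_1.trn'
    WITH STOPAT = '20240101 12:30:00', RECOVERY;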
OBJECTIVE: I would like to read a text file stored in SQL Server 2000, read the file's content, and load its contents into a RichTextBox.
THINGS I'VE DONE AND HAVE WORKING:
1) I've successfully loaded a text file (e.g. textFile.txt) into a SQL Server database table column (with data type Image).
2) I'm also able to serve the file using a handler, as below:

using System;
using System.Web;
using System.Data.SqlClient;

public class HandlerImage : IHttpHandler
{
    string connectionString;

    public void ProcessRequest(HttpContext context)
    {
        connectionString = System.Configuration.ConfigurationManager.ConnectionStrings["NWS_ScheduleSQL2000"].ConnectionString;
        int ImageID = Convert.ToInt32(context.Request.QueryString["id"]);
        SqlConnection myConnection = new SqlConnection(connectionString);
        string Command = "SELECT [Image], Image_Type FROM Images WHERE Image_Id=@Image_Id";
        SqlCommand cmd = new SqlCommand(Command, myConnection);
        cmd.Parameters.Add("@Image_Id", System.Data.SqlDbType.Int).Value = ImageID;
        SqlDataReader dr;
        myConnection.Open();
        cmd.Prepare();
        dr = cmd.ExecuteReader();
        if (dr.Read())
        {
            // write the stored bytes back to the browser
            context.Response.ContentType = dr["Image_Type"].ToString();
            context.Response.BinaryWrite((byte[])dr["Image"]);
        }
        myConnection.Close();
    }

    public bool IsReusable { get { return false; } }
}

<a href='<%# "HandlerDocument.ashx?id=" + Eval("Doc_ID") %>'>File</a>

Clicking this link lets me download or view the file.
WHAT I WANT TO DO, BUT HAVE A PROBLEM WITH:
I would like to read the CONTENT of this file and load it into a string, as below:

StreamReader SR = new StreamReader("File.txt");
string contentText = SR.ReadLine();
txtBox.Text = contentText;

But this only works for files on the server's file system. I would like to read the file contents from SQL Server. Please help, I really appreciate it.
I did some searching and didn't find what I'm looking for. I'm pretty new to SQL Server (most of my experience is on DB2 for z/OS). I'm building some new tables and want a way to add comments to the metadata for a column. In DB2 the syntax is:

COMMENT ON COLUMN TB_CREATOR.TB_NAME.COLUMN_NAME IS 'comments here';

or

COMMENT ON TB_CREATOR.TB_NAME (COLUMN1 IS 'comment here', COLUMN2 IS 'comment here');

Is there anything like this in SQL Server? Thanks!
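From what I've found so far, the closest equivalent seems to be extended properties; a sketch reusing the names from the DB2 example above (and assuming the table lives in the dbo schema):

EXEC sys.sp_addextendedproperty
     @name       = N'MS_Description',
     @value      = N'comments here',
     @level0type = N'SCHEMA', @level0name = N'dbo',
     @level1type = N'TABLE',  @level1name = N'TB_NAME',
     @level2type = N'COLUMN', @level2name = N'COLUMN_NAME';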
Hi everyone, I have a requirement. I am writing a very large stored procedure and have written many comments in it. I want to generate a document from the comments I have written in that sp. I know a similar thing can be done with VB.NET comments. Is it possible to do such a thing with a .sql file?
That is, all the comments I have written in my .sql file should be extracted to a document.
I have a table with almost a million rows, although it's quite slim, with just ID, date, userID, JobID, etc.
Now I want the ability to add comments to some (probably less than 1%) of those rows.
The question is whether to create a separate comments table to join to it, or to add a comments field to the existing table. The comments field would default to NULL, so it wouldn't bloat the table unnecessarily if I add it (right?), and it would always be selected along with the row from that table, so I'm leaning towards the latter alternative.
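For comparison, the separate-table option I am weighing would look something like this (sketch; the parent table and column names are made up):

CREATE TABLE dbo.JobRunComment (
    JobRunID int           NOT NULL PRIMARY KEY
        REFERENCES dbo.JobRun (ID),        -- assumed parent table/key
    Comment  nvarchar(max) NOT NULL
);

-- comments still come back with the main rows via an outer join
SELECT jr.ID, jr.[date], jr.userID, jr.JobID, c.Comment
FROM dbo.JobRun AS jr
LEFT JOIN dbo.JobRunComment AS c ON c.JobRunID = jr.ID;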