The Fastest Way To Perform An Update ... Advice Needed :)
Nov 12, 2006
Hi all,
I have a situation where my Visual C# application presents a number of fields. In order to update a student object, I wish to call a stored proc. One or more fields can be updated, and if one is left null, I don't want to update it; instead I want to keep the old value.
I am really wondering if I am approaching this the right way. The following stored proc does what I want it to do, however I'm thinking there may be a faster way...
Here it is:
-- Update a student, by ID.
DROP PROCEDURE p_UpdateStudent
GO
CREATE PROCEDURE p_UpdateStudent
@ID INT,
@NewFName VARCHAR(25),
@NewOName VARCHAR(25),
@NewLName VARCHAR(25),
@NewDOB DATETIME,
@NewENumber VARCHAR(10),
@NewContactAID INT,
@NewContactBID INT
AS
BEGIN
SET NOCOUNT ON;
-- DECLARE THE OLD VALUES
DECLARE @FName AS VARCHAR(25)
DECLARE @OName AS VARCHAR(25)
DECLARE @LName AS VARCHAR(25)
DECLARE @DOB AS DATETIME
DECLARE @ENumber AS VARCHAR(10)
DECLARE @ContactAID AS INT
DECLARE @ContactBID AS INT
-- Get all of the old values
SELECT @FName = FName,
       @OName = OName,
       @LName = LName,
       @DOB = DOB,
       @ENumber = ENumber,
       @ContactAID = ContactAID,
       @ContactBID = ContactBID
FROM TBL_Student
WHERE ID = @ID
-- USE ISNULL to set all of the new parameters to the provided values only if they are not null
-- Keep the old ones otherwise.
SET @NewFName = ISNULL(@NewFName, @FName)
SET @NewOName = ISNULL(@NewOName, @OName)
SET @NewLName = ISNULL(@NewLName, @LName)
SET @NewDOB = ISNULL(@NewDOB, @DOB)
SET @NewENumber = ISNULL(@NewENumber, @ENumber)
SET @NewContactAID = ISNULL(@NewContactAID, @ContactAID)
SET @NewContactBID = ISNULL(@NewContactBID, @ContactBID)
-- Do the update
UPDATE TBL_Student
SET FName = @NewFName,
OName = @NewOName,
LName = @NewLName,
DOB = @NewDOB,
ENumber = @NewENumber,
ContactAID = @NewContactAID,
ContactBID = @NewContactBID
WHERE
ID = @ID
END
GO
So yes, it works. But as you can see, I wish to keep an old copy of the values to perform checks pre-update...
Is there any faster way, or am I on the right track? I need a pro's advice :) (before I write all of my procs!)
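One simpler pattern worth considering (a minimal sketch only, reusing the same table and parameter names as above): apply ISNULL directly inside the UPDATE, which removes the pre-SELECT and the local variables entirely.

UPDATE TBL_Student
SET FName      = ISNULL(@NewFName, FName),
    OName      = ISNULL(@NewOName, OName),
    LName      = ISNULL(@NewLName, LName),
    DOB        = ISNULL(@NewDOB, DOB),
    ENumber    = ISNULL(@NewENumber, ENumber),
    ContactAID = ISNULL(@NewContactAID, ContactAID),
    ContactBID = ISNULL(@NewContactBID, ContactBID)
WHERE ID = @ID

If the old values really are needed for pre-update checks, a single SELECT into all seven variables (as in the proc above) is enough; the in-line ISNULL just avoids carrying them around when no check is required.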
Hi everyone, my host supports ASP.NET but does not yet support SQL 2005, only SQL 2000 and Access. I want to use one of the starter kits. I am confused about what is needed to make either SQL 2000 or Access work. I know enough that the connection strings will need to change; my concern is the code. Are there strings that I will need to find and change in the application? I sure would appreciate any and all advice. Thank you for your help. DKB
I'd appreciate some advice about the following data mining scenario. A bank institution has customers and keeps track of the customer demographic data (e.g. income, age, etc.). A customer could have purchased one or more products (checking, savings, CD, and other accounts). My task is to predict the likelihood of a customer purchasing a new product. For example, if the customer has checking and savings accounts, what's the likelihood that the customer will purchase a CD account, so we can offer this product to the customer.
My questions:
What data mining task (segmentation, classification, etc.) does this requirement represent? Which algorithm(s) are best suited for this task? For a given customer, will I be able to predict the purchase probability for all products, or do I need to query the mining model to predict one product at a time?
I have a master table which has demographic data such as name, DOB and location, along with a primary key ID. It will have about 10-12,000 records. We get a refresh file every hour, with about 3,000 records, which may or may not have corrections for these records. I put this data into a table, and this data should always be considered correct. To handle the update to the master table I need to create an update process. I can take one of two approaches: just update all the records in the master table regardless of whether they are correct or not, or do some type of left join on those that do not match (in other words, only update the ones where the names or DOB don't match). There is an underlying update trigger on the patient master which will also fire if these values are changed. Any opinions on the best approach?
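For what it's worth, a hedged sketch of the "update only what changed" approach; the table, key and column names here (PatientMaster, Refresh, PatientId, Name, DOB) are placeholders, not the real schema:

UPDATE m
SET    m.Name = r.Name,
       m.DOB  = r.DOB
FROM   PatientMaster AS m
JOIN   Refresh       AS r ON r.PatientId = m.PatientId
WHERE  m.Name <> r.Name
   OR  m.DOB  <> r.DOB
-- NOTE: if these columns allow NULLs, the <> tests need ISNULL wrappers,
-- otherwise rows where one side is NULL are silently skipped.

Filtering to changed rows keeps the update trigger from firing for the bulk of records that did not actually change.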
I need some advice on a project that I am working on... First, here is what I am trying to achieve: a Web Form with two controls: a DropDownList with two items added at design time (Fruits and Vegetables) and an empty ListBox. When the user chooses a "category" from the DropDownList, the ListBox will be populated with a list of either "Fruits" or "Vegetables" retrieved from a SQL database. (Note: Since the data in the SQL database must be converted and formatted programmatically, simply databinding the ListBox will not work here.) I believe that I can do this with the following code (stolen from an MSDN article):

'Create ADO.NET objects.
Private myConn As SqlConnection
Private myCmd As SqlCommand
Private myReader As SqlDataReader
Private results As String

'Create a Connection object.
myConn = New SqlConnection("Initial Catalog=Northwind;" & _
    "Data Source=localhost;Integrated Security=SSPI;")

'Create a Command object.
myCmd = myConn.CreateCommand
myCmd.CommandText = "SELECT FirstName, LastName FROM Employees"

'Open the connection.
myConn.Open()

myReader = myCmd.ExecuteReader()

'Concatenate the query result into a string.
Do While myReader.Read()
    results = results & myReader.GetString(0) & vbTab & _
        myReader.GetString(1) & vbLf
Loop

'Display results.
MsgBox(results)

'Close the reader and the database connection.
myReader.Close()
myConn.Close()

Now here is the part that I am not sure about: is the Form Load event the best place to put this code? If I do, is this not a lot of overhead (creating, opening and closing a connection) every time there is a page refresh/PostBack? Would I be better off putting this code in the DropDownList SelectedIndexChanged event? Although that seems like it could make the process of selecting a category take a fairly long time. Finally, if there is a better way of doing this, I am certainly open to suggestions. All advice is greatly appreciated.
Our backups run using a stored procedure called sp_fullbackup. This takes a dump of every database to a device each night. The only problem is that it overwrites the previous night's backup, due to the way it has been written:
DUMP DATABASE master TO @bkupdevice WITH NOUNLOAD , INIT , SKIP
How do I change the syntax so that it will retain the previous 3 backups?
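Two hedged options, sketched with the same @bkupdevice variable (how well either fits depends on how sp_fullbackup builds its device names): append to the device instead of initialising it, or dump each night to a dated file and prune old files separately.

-- Append each night's dump to the device instead of overwriting it:
DUMP DATABASE master TO @bkupdevice WITH NOUNLOAD , NOINIT , SKIP

-- Or back up to a dated file (the path is a placeholder) and delete files
-- older than three days in a separate cleanup step:
DECLARE @file VARCHAR(255)
SET @file = 'D:\Backups\master_' + CONVERT(VARCHAR(8), GETDATE(), 112) + '.bak'
BACKUP DATABASE master TO DISK = @file WITH INIT

Appending makes the device grow, so something still has to trim old backup sets; the dated-file variant keeps exactly as many nights as the cleanup step allows.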
I could use some sound advice regarding replication. (Or a better avenue to take if available)
I have a production server and a *live* server. 90% of the updates are done on my production server. When needed I update the live server by completely overwriting everything with the data on the local machine. (This is because there have been numerous changes to the database design as well as the data, but this should no longer be the case.)
Now updates are taking place on both servers meaning they should both be identical.
The problem I see is that something like snapshot or merge replication could never be done. Since the same IDs can be created on both machines, I see no way that the software can tell them apart.
So is it best to update one server and then send the request to the other to update the same record (or insert/delete etc) or is it better to use something like DTS or transactional replication to accomplish the same task?
I dunno, kinda in the dark at present but any advice on the subject would be very helpful.
We are developing software for a logistics company, which will have around 1,000 branches, and we need to synchronize the database (SQL Server) between all the branches using the database at the web server, i.e. the branches can get new data from the web server as well as push data to the web server. Please tell us how we can accomplish this task.
Dear Group,

A while ago I asked how paging is possible and have read the articles on aspfaq.com. Great work, especially the speed comparison of all the different techniques. Thank you to everyone who responded to my post. Somehow Google didn't let me reply to the thread!

All of the methods described work perfectly fine until I try to implement a WHERE or ORDER BY DESC on a particular field. E.g. create all pages but sort on a date field DESC: the pages don't start with the lowest date on the first page and the highest date on the last page, but have all dates mixed up.

As a workaround I am at first using a cursor to populate a temporary table, e.g. SELECT * FROM MyTable WHERE MyField1 = Condition ORDER BY MyField2, which only contains the data I would like to use to create the pages. I'm using a cursor since I read that a normal SELECT to populate the temporary table doesn't always guarantee that rows are inserted in the correct order. Next I'm creating the individual pages with the Count and Page.Rank method off the temporary table.

This method is not the best in performance and I'm sure there must be another way to perform paging with filtering and sorting. I'm grateful for any tips you have.

Thanks for your time & efforts!
Martin
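If the server is SQL Server 2005 or later, one way to page with both a filter and a descending sort, without the cursor or the temporary table, is ROW_NUMBER() over the desired sort order. A sketch using the column names from the post; @Condition, @PageSize and @PageNumber stand in for the filter value and paging parameters:

SELECT *
FROM (SELECT *,
             ROW_NUMBER() OVER (ORDER BY MyField2 DESC) AS RowNum
      FROM   MyTable
      WHERE  MyField1 = @Condition) AS numbered
WHERE RowNum BETWEEN (@PageNumber - 1) * @PageSize + 1
               AND    @PageNumber * @PageSize
ORDER BY RowNum

The row numbers are assigned after the WHERE filter and in the DESC order, so each page comes back already sorted the way the final result should read.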
I'm hoping somebody can help me here as i'm struggling to find any information elsewhere on the net. We have recently purchased a new server, the rough specs are:
2 x Quad-Core Xeon E7320 2.13GHz, 4MB cache; 32GB PC2-5300 DDR II RAM
We are planning to install the 64 Bit version of SQL Server 2005. We want to use the server for a number of purposes.
Building and weekly processing of 2 complex data marts (approx. size is 1TB each)
Processing and querying of 2 Analysis Services databases that will be built from these data marts. These will be queried by no more than about 15 users (no more than about 5 simultaneously).
Relational querying of the data marts themselves (same users as above). My problem is that I am not sure of the best way to configure SQL Server. Should I use 2 separate instances? How should the processors/memory be shared between SQL Server and Analysis Services? My main priority is the performance of the OLAP querying. However, I also want the weekly processing and any ad-hoc SQL queries run against the marts to be efficient.
This message is similar to a previous post, but no one seems to have an answer to it so I thought I should try to solve my problem in another way.
I have to use a pass-through query to access DB2. I cannot get the four-part name to work.
I want to simplify things for the developers. I can't figure out how to make one stored procedure or view handle all the columns the developers may need to filter by or return. Is the best practice just to make multiple views or stored procedures in each database to handle each query needed?
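When the four-part name refuses to cooperate, a pass-through via OPENQUERY often does. A rough sketch; DB2LINK is a placeholder linked-server name and MYLIB.CUSTOMER and the columns are placeholders too, with the inner statement written in DB2's own SQL dialect:

SELECT *
FROM OPENQUERY(DB2LINK, 'SELECT CUSTNO, CUSTNAME FROM MYLIB.CUSTOMER WHERE REGION = ''01''')

The text inside OPENQUERY is sent to DB2 unparsed, so schema and column names follow DB2's rules rather than SQL Server's, which is sometimes exactly what makes a stubborn four-part name work; wrapping such queries in views or procs is also one way to hide the details from the developers.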
I am writing a report in RS and my query is supposed to pull transactional records from a certain date (yesterday).
The table I am querying contains roughly 1.5 million records. This is on an AS/400 DB2 system. Mostly I use ODBC with no problems; however, this query ran for about two hours and was still running before I killed it.
I am thinking of using SSIS to copy the records into SQL. If I go this approach (which should be much faster), is there a way to just copy over differential records each day?
Or does anybody have any other suggestions? What is Linking a Server?
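On the differential idea: once the history sits in SQL Server, each daily SSIS run can land the day's pull in a staging table and insert only the rows that are not already present. A hedged sketch; the table and column names (TransHistory, TransStaging, TransId, TransDate, Amount) are placeholders:

INSERT INTO dbo.TransHistory (TransId, TransDate, Amount)
SELECT s.TransId, s.TransDate, s.Amount
FROM   dbo.TransStaging AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.TransHistory AS t
                   WHERE  t.TransId = s.TransId)

As for the last question: a linked server is just a connection registered inside SQL Server to a remote data source (here the AS/400), which then lets you query it with four-part names or OPENQUERY instead of going through the report's own ODBC connection.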
Hello, what is the fastest way to update 20 million records in our database? I have tried a simple update statement like this:

update trail_log with (tablockx, holdlock)
set trail_log.entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id

but it takes 10-plus hours to run since it cannot commit the transactions until the very end. So I was thinking that I need to commit in batches, like after 50K, but that is slow as well:

Set rowcount 50000
Declare @rc int
Set @rc=50000
While @rc=50000
Begin
    Begin Transaction
    update trail_log With (tablockx, holdlock)
    set trail_log.entry_by = users.user_identity
    from users
    where trail_log.entry_by = users.user_id
    and trail_log.entry_by not like '%[0-9]%'
    Select @rc=@@rowcount
    --Commit the transaction
    Commit
End
go

I have let the above statement run for 1.5 hours and it only updated 450,000 rows. Any ideas... Maybe I'm doing it wrong. Please help!!
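A hedged variation on the batching loop, assuming SQL Server 2005 or later for UPDATE TOP (on 2000 the SET ROWCOUNT form has to stay); the extra predicate keeps each pass from rescanning rows that are already converted:

DECLARE @rc INT
SET @rc = 1
WHILE @rc > 0
BEGIN
    BEGIN TRAN
    UPDATE TOP (50000) t
    SET    t.entry_by = u.user_identity
    FROM   trail_log AS t
    JOIN   users     AS u ON u.user_id = t.entry_by
    WHERE  t.entry_by <> u.user_identity     -- only rows not yet converted
    SET @rc = @@ROWCOUNT
    COMMIT TRAN
END

Committing per batch only helps if the log space can actually be reused between batches (SIMPLE or BULK_LOGGED recovery, or frequent log backups), and an index on trail_log.entry_by is likely to matter far more here than the TABLOCKX hint.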
How do most people handle database searches for things like a product database? What I mean by that is your typical product table may look like: ProductID, ProductTitle, ProductDescription, ProductCategoryID, etc. Assuming you have a Full-Text Catalog set up on the ProductDescription field, would you use dynamic SQL or a stored procedure? I'm wrestling with this because I haven't found a good way to parse a parameter in SQL to make a stored procedure work with the same flexibility as dynamic T-SQL, and I would prefer not to have any direct access to the table from the application. My other option (which to me isn't a great one) is to have a fairly large number of optional "keyword" parameters in the stored procedure and then parse them on the application side... this is less than favorable in that, in theory, someone could pass more keywords than I've allotted for in my stored procedure. How would you approach this challenge? Thanks in advance. Ryan
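CONTAINS will accept a variable for the whole search condition, which gives a stored procedure nearly the same flexibility as dynamic SQL while keeping the table hidden behind the proc. A sketch, with dbo.Products standing in for the real table name:

CREATE PROCEDURE dbo.SearchProducts
    @SearchText NVARCHAR(500)   -- e.g. '"mountain bike" OR "road*"'
AS
BEGIN
    SET NOCOUNT ON;
    SELECT ProductID, ProductTitle, ProductDescription, ProductCategoryID
    FROM   dbo.Products
    WHERE  CONTAINS(ProductDescription, @SearchText);
END

The application (or the proc itself) still has to build a well-formed CONTAINS expression from the user's keywords, but there is no fixed limit on how many terms it can hold, so the pile of optional keyword parameters goes away.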
Hi, I am thinking of changing my AJAX slideshow so that it gets the data from the database. Currently I am finding it hard to add the text functionality I want to the slideshow. My query is: if I switch to using a DataList, can I add JavaScript functionality to the data being retrieved? For example, currently I have written some JavaScript so that a series of text is displayed one after the other in a sequence from just one button click. So if I'm pulling data out of a database, can I still add this JavaScript functionality to it? I hope this makes sense; if it doesn't then I am willing to elaborate. Please can anyone offer any advice, examples or suggestions on how I can do this? Any help is much appreciated as I am struggling to find a solution; I originally wanted to be able to add this JavaScript functionality with the play button of the slideshow but I couldn't find a solution. Also I think it's better to use some kind of database as I can use the editing functions Visual Web Developer offers. Thank you
I can't connect to any instance of MSDE on my laptop using my web app. I duplicate the whole process on another local machine and have no problems. I'm using DNN, but it's not a problem with a connection string.
Over the last week I've had to rebuild this laptop - reformatting and all. I'm running Win XP Pro, VS '03. I've tried this on msde2000sp3a & msde2000rela. Everything is fully patched, updated, and current.
I'm working on two PCs at the same time with identical environments, doing one step on one keyboard and then the other: I take one set of fresh DNN files and install them on each PC, create a new db and a user with dbo rights, create a new virtual directory in IIS, alter dotnetnuke.sln and dotnetnuke.vbproj.webinfo to reflect the path to the local site, alter the web.config file with the connection string, and browse to the website (this is how DNN launches, creating the db from scripts, etc.).
At this point my desktop pc works great and I've got a new instance of DNN.
My laptop does launch dnn, but it can't connect to the db and says there's a problem with the connection string - but the only difference in the strings is the instance name - which isn't spelled wrong!!!!
So I'm led to believe that there is something wrong with the setup on the laptop which was just rebuilt.
It's got .net 1.1, msde2000sp3a, winxppro sp1. I've uninstalled and reinstalled all the components I can think of.
What would you do??? I'm completely out of ideas on this.
Do I format the HD and start again? That would hurt, but I've tried everything else I can think of.
Oh, here's the other rub. I can create a system DSN, test the connection - and it works! I know that points toward DNN, but like I've said, I believe I've ruled that out.
Hi, hoping I can get a few views on a question I have relating to the above.
I am new to stored procedures and triggers and I am trying to understand 'best practice' a little better. Here is my question: if I have a table that stores information, and any field in that table is updated (and changes), I would like to inactivate the row prior to the change and then add the change by way of a new, active row. This way I can see what it was before (and that it's inactive), and what the active value is.
Hope this makes sense, if this is the wrong way to manage change history any suggestions would be appreciated.
A second question I have is as follows: if I have a table that stores a number, what would be the best way to create new records in a different table that pulls from the first table, where the number stored in table 1 represents how many times the record is to be created in the second table?
Thanks. If anyone needs more data, please feel free to ask, I will help as best as I can and appreciate any advice & comments that you can give.
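On the second question, one common pattern is a numbers (tally) table joined against the quantity column, so each source row expands into that many detail rows. A hedged sketch; SourceTable(Id, Qty), Detail(SourceId, Seq) and dbo.Numbers(Number) are all assumed names:

-- dbo.Numbers is a one-column table holding 1, 2, 3, ... up to the largest
-- quantity expected; it only has to be populated once.
INSERT INTO dbo.Detail (SourceId, Seq)
SELECT s.Id, n.Number
FROM   dbo.SourceTable AS s
JOIN   dbo.Numbers     AS n ON n.Number BETWEEN 1 AND s.Qty;

Keeping the Seq value in the detail table also makes it easy to tell the generated rows apart later.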
Hello!

I'm seeking advice on a rather complex type of query I need to build in an Access ADP (SQL Server 7). There are four tables:

tblPeople
ID(PK)  PRENAME
1       Thomas
2       Frank
3       Chris

tblInventoryClasses
ID(PK)  INVENTORYCLASS
1       Car
2       Phone

tblInventoryItems
ID(PK)  relInvClass(FK)  ITEM
1       1                Dodge Viper
2       1                Chrysler
3       2                Nokia
4       2                Samsung

tblPeopleInventory
ID(PK)  relPeople(FK)  relInvItem(FK)
1       1              2
2       1              3
3       2              1
4       2              3
5       3              4

In this example the last table tells me that Thomas owns a Chrysler (class Car) and a Nokia (class Phone).

Can someone tell me how to write a query or a stored procedure which produces a result set like this:

qryOwners
People   Car          Phone
Thomas   Chrysler     Nokia
Frank    Dodge Viper  Nokia
Chris    [NULL]       Samsung

The main idea is that I need to be able to iterate such a collection. It is guaranteed that one "People" only owns one or zero "Car" and one or zero "Phone". I guess that it might be impossible to design a stored procedure with such a variable number of columns (in this case, each item from tblInventoryClasses would mean another column). Are there any possibilities of accomplishing this without creating temporary tables?

Any help would be really appreciated ;-)

Greetings,
Christoph Bisping
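Since each person owns at most one item per class, a grouped CASE cross-tab can produce exactly the qryOwners shape without temporary tables. A sketch against the tables above; note the column list is fixed to Car and Phone here, so a truly variable set of classes would still need dynamic SQL:

SELECT p.PRENAME AS People,
       MAX(CASE WHEN c.INVENTORYCLASS = 'Car'   THEN i.ITEM END) AS Car,
       MAX(CASE WHEN c.INVENTORYCLASS = 'Phone' THEN i.ITEM END) AS Phone
FROM tblPeople AS p
LEFT JOIN tblPeopleInventory  AS pi ON pi.relPeople = p.ID
LEFT JOIN tblInventoryItems   AS i  ON i.ID = pi.relInvItem
LEFT JOIN tblInventoryClasses AS c  ON c.ID = i.relInvClass
GROUP BY p.ID, p.PRENAME

The LEFT JOINs keep people with no items in a class (Chris gets NULL under Car), and the result set is a plain rowset that can be iterated like any other.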
Hello Specialists!

Please help me - I need advice on importing textual data to SQL Server. I am using DTS with a simple process: Text (source) -> Connection. I want to increase the speed of the import because I have to import 4GB of data (1000-character line length). Do you have any tips for me?

Best regards,
Manu
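If the file is plain delimited text, BULK INSERT (or the bcp utility) is usually much faster than a row-by-row DTS transform. A rough sketch; the path, target table and terminators are placeholders:

BULK INSERT dbo.ImportTarget
FROM 'D:\load\bigfile.txt'
WITH (FIELDTERMINATOR = '\t',
      ROWTERMINATOR   = '\n',
      TABLOCK,
      BATCHSIZE = 100000)

Dropping or disabling nonclustered indexes on the target table for the duration of the load, and using the SIMPLE or BULK_LOGGED recovery model, tends to help as much as the choice of load tool.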
Hi, is there a programmatic way to convert the results of a SQL data set to XLS, CSV, etc.? Ideally a user would be able to make a selection to view the data (the result set has, e.g., make, model, year, condition, viewed in a DataGrid or similar control) and then be able to export the file to the format they choose, and have a download box pop up from the browser to download the file. E.g. Export this data to: __ XLS __ CSV __ TXT. I know DTS can do this, but any advice on how to encapsulate this in a C# web app would be greatly appreciated! Thanks!
HI, I'm still on the steep side of the learning curve with ASP.NET. I've looked through a number of threads on this forum, and have gotten pieces of the answer, but need help getting past a roadblock.
I haven't been able to get a simple test application to connect to the Pubs database loaded on my system running MSDE. The only control in the application is a WebDataForm created by the wizard. Everything works well (even the Preview Data from the Data menu loads and displays the correct data), except when the form is viewed in a browser. Click the load button and I get an error: Login failed for user 'NT AUTHORITY\NETWORK SERVICE'.
Similar stories are common on this forum, and I tried to address the problem. I'm quite sure it's an authorization or authentication problem. I have Windows Server 2003 (with IIS 6.0), VS.NET 03 and MSDE as the SQL server on the same box. The MSDE server is using Windows security mode. I created a new user called ASPNET on the Windows server. I added a login to the SQL Server, 'WinServerName\ASPNET', using Windows auth. Still get the same login failed message.
Is there something I'm missing? Do I have to add a new user to IIS?
For the record, what users/settings do I need to have in Windows, SQL Server and IIS to get past this login problem?
Thanks for answering this basic question - one more time.
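One likely fix, offered as a hedged sketch for SQL 2000/MSDE under IIS 6: grant the actual worker-process account (NT AUTHORITY\NETWORK SERVICE, which is the account named in the error, rather than a manually created ASPNET user) a login and access to the Pubs database used above:

EXEC sp_grantlogin 'NT AUTHORITY\NETWORK SERVICE'
GO
USE Pubs
GO
EXEC sp_grantdbaccess 'NT AUTHORITY\NETWORK SERVICE'
EXEC sp_addrolemember 'db_datareader', 'NT AUTHORITY\NETWORK SERVICE'
GO

The role to add depends on what the pages need; db_datareader is just the minimum for a read-only test form.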
I'm very green with SQL so I could do with some advice please.
I need to pull some data from a table based on the year portion of a datetime field, so far I've got this...
USE MfgSys803
SELECT orderdate, ordernum FROM orderhed
WHERE ((SELECT CONVERT(VARCHAR(4),GETDATE(),111)) = (SELECT CONVERT(VARCHAR(4),ORDERDATE,111) FROM orderhed))
... the field 'orderdate' is the datetime. The purpose of the WHERE statement is to get the current year from the system and then compare this to the year of the field 'orderdate'.
Unfortunately I get the error...
Msg 512, Level 16, State 1, Line 3 Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <= , >, >= or when the subquery is used as an expression.
... the 'SELECT CONVERT' portions of the WHERE work fine on their own but I can't use them together.
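The comparison can be written without subqueries by converting (or taking the year of) each side directly in the WHERE clause; a sketch against the same table:

USE MfgSys803
SELECT orderdate, ordernum
FROM   orderhed
WHERE  YEAR(orderdate) = YEAR(GETDATE())
-- equivalent, keeping the original CONVERT style:
-- WHERE CONVERT(VARCHAR(4), orderdate, 111) = CONVERT(VARCHAR(4), GETDATE(), 111)

Because the expression is evaluated per row, there is no subquery to return more than one value, which is what triggered error 512.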
We have a couple of MS SQL Server 2000/2005 databases with a bunch of .NET clients written in C#, but may want to replace the db server with Postgres instead. The clients will still run on Windows, hopefully with as few changes as possible. We don't have any stored procedures or triggers, so all we need to port are the table/index definitions.

What are the most common issues/problems people run into on the client side? My guesses are stuff like:
- identity columns
- transaction handling (autocommitted vs. implicit)
- date and datetime
- general error handling and error codes

Anything else? I've googled for migration guides and howtos, but without success. Pointers to such are appreciated.

Boa
In my environment, the DBA just migrated SQL Server 2000 databases from Windows 2000 to Windows 2003 Enterprise Edition, keeping the same SQL version. The problem I need to analyse:
1. Why is the full backup only 70GB when the DB size is 120GB?
Here is the situation
Full backup taken once every day ---- 70GB
Diff backup taken every 3 hrs till 5 PM ----- size is 50GB
Transaction log backed up every 10 min up till 8 PM ----- not a big size
I am really confused as to why the full DB backup is only 70GB. Can someone please shed some light on how SQL Server 2000 backups work?
Hello everyone, here's my situation... I'm running a web service which involves 51 separate servers and databases. There are fifty licensee servers (one for each US state) and one corporate server. Each night I need to upload sales and membership data from the licensees' databases to the corporate database to compile reports. The application platform I'm using is ASP.NET 2.0 and the database is SQL 2005 Express. I want this process to run automatically, so I believe it's a scheduled Windows service I need to set up in .NET to make the data transfers. If anyone has already set something up like this, or knows the steps to take, I would love to have your input. Thanks in advance, Robert
I am working on two versions of an application, one of which will be a Windows Forms application (which will need to be redistributable) and the other will be a web application.

I have MS Visual Studio 2005 (along with the developer's edition of MS SQL Server), but not MS Access. I also have MySQL, PostgreSQL, Sun's application server, Tomcat and Apache web server. I am working on Windows XP Pro, and have installed the .NET 3 SDK and all relevant related products I could find (e.g. 2 extension packages for Visual Studio).

I have one MS Access database, to which my users should have read-only access. I have, and have used, a tool for importing MS Access databases into MySQL. I expect that SQL Server has a similar utility hidden somewhere (where I haven't yet looked, though I HAVE been looking - obviously in the wrong places). I have located a similar utility for importing MS Access databases into PostgreSQL. I have not yet decided which servers to use for the web version, but that is another story, for which I may raise another thread in due course (but I welcome suggestions which may reduce the effort required given the required effort for the Windows Forms app).

My problem is with the Windows Forms application (intended for use by a single family). I expect to use ADO.NET. The question is, should I import the Access database into MS SQL and redistribute it along with MS SQL Server Express (or is that necessary), or distribute it just as an Access database and use the Jet engine to access it? A related question is, "Does ADO.NET support creating new databases for a given engine?" Imagine a recipe database. It is easy enough to create a SQL script that creates all the required tables, indices, foreign keys, &c., but can I submit that SQL script to an ADO.NET object, along with a file name, and have it create, e.g., an Access database with the supplied name? Or do I have to create a database file with nothing in it other than the schema?

I have more questions, but they'll have to wait.

Thanks
Ted
I am currently working on a PHP-based website that needs to be able to draw from Oracle, MS SQL Server, MySQL and, given time and demand, other RDBMSs. I took a lot of time and care creating a flexible and solid wrapper and am deep into coding. The only problem is I noticed VARCHAR fields being drawn from SQL Server 2000 are being truncated to 255 characters.

I searched around php.net and found the following:

Note to Win32 Users: Due to a limitation in the underlying API used by PHP (MS DbLib C API), the length of VARCHAR fields is limited to 255. If you need to store more data, use a TEXT field instead. (http://www.php.net/manual/en/functi...ield-length.php)

The only problem with this advice is TEXT fields seem to be limited to 16 characters in length, and I am having similar results in terms of truncation with other character-based fields that can store more than 255 characters. I am using PHP 4.3.3 running on IIS using the php_mssql.dll extension and the functions referenced here: http://www.php.net/manual/en/ref.mssql.php.

What are my options here? Has anybody worked around this or am I missing something obvious?

James
Hi all, I hope you guys can help me with the following bit of T-SQL. I already have a solution but I really don't like it and I've been trying to find a simpler more elegant way of doing the same thing.
Firstly, let me present you with a brief explanation of what I am trying to do together with some sample data for you to play with and hopefully assist me in finding a better solution than the one I've come up with.
insert into #VehMake
select 222, 'FORD' union all
select 210, 'FORD (USA)' union all
select 223, 'FORD (AUS)' union all
select 269, 'HONDA' union all
select 253, 'NISSAN' union all
select 280, 'VOLKSWAGEN'
This contains various vehicle makes which I'm sure you'll recognise!
insert into #VehicleHistory (PersonId, VehMakeVehModel)
select 1, 'FORD (USA) MUSTANG' union all
select 2, 'HONDA CIVIC' union all
select 3, 'NISSAN ALMERA' union all
select 4, 'VOLKSWAGEN PASSAT'
As you can see, in the second table, the second column contains a string of the vehicle Make and Model in one string. What I need to do is to split the Make and Model in to separate columns with an update statement.
This seems easy enough with a simple LIKE comparison:
VehMakeVehModel like VehMake+' %'
....BUT if you notice, there are two records in the #VehMake table that are similar but not the same. These are the 'FORD (USA)' and 'FORD (AUS)'. The update statement would return two records from the #VehMake table when trying to match with the first record in my #VehicleHistory table.
As I said, I did come up with a solution but it seems over complicated and I have a feeling that there is a way of doing this with an update. Maybe use the LEN() function but I'm not sure.
Your help would be much appreciated.
BTW, once I've identified the correct Make, I can easily populate my model as all I have to do is use the replace function on VehMakeVehModel column and remove the matched make to get the full model name.
Hope that makes sense and thanks for any help in advance.
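One way to resolve the FORD / FORD (USA) ambiguity in a single update is to take, for each history row, the longest make that prefixes the string. A hedged sketch, assuming #VehMake's columns are (Id, Make) and that a Make column has been added to #VehicleHistory to receive the result:

UPDATE h
SET    h.Make = (SELECT TOP 1 m.Make
                 FROM   #VehMake AS m
                 WHERE  h.VehMakeVehModel LIKE m.Make + ' %'
                 ORDER BY LEN(m.Make) DESC)
FROM   #VehicleHistory AS h

Ordering the candidate makes by length means 'FORD (USA)' wins over plain 'FORD' for 'FORD (USA) MUSTANG', and the model can then be derived with REPLACE exactly as described.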
Update [table] Set collumnName = Substring(collumnName, 1, 2) = '11' AND Substring(collumnName, 3, 5) = '00000' Where (conditional statement)
SQL is not letting me perform the update because of the AND statement between the two Substrings. Is there any way around this... is there anything else I can do to accomplish what I'm trying to do?
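A single SET expression can build the whole new value, which is probably what the AND was reaching for; a sketch reusing the placeholders from the post (it overwrites characters 1-2 with '11' and 3-7 with '00000', keeping whatever follows position 7):

UPDATE [table]
SET    collumnName = '11' + '00000' + SUBSTRING(collumnName, 8, LEN(collumnName))
WHERE  (conditional statement)
-- the same thing with STUFF, replacing positions 1-2 and then 3-7 in place:
-- SET collumnName = STUFF(STUFF(collumnName, 1, 2, '11'), 3, 5, '00000')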
I would like to create a database for keeping track of payroll data for employees where the supervisors (job coaches) on our workshop floor can use a Pocket PC device to record the hourly employee data on the fly. Then at the end of the day, the supervisor can place the device in a cradle of some sort and synch the newly entered data into the main database.
I'm guessing that SQL Server Compact edition would be perfect for this type of task? Is that correct? Can someone give me recommendations on how to go about setting this up? What should I use as the main database? SQL Server? Access? Any advice is appreciated!
I open Query Analyzer and on one tab I update a record in a transaction and hold it:
begin tran
update customers set territory = 'x' where customer = 'A00001'
--rollback tran
In a second tab I attempt to update all records in the table
update customers set carrier = ''
Clearly this fails because of the lock placed during the first script, and this is fine.
However, is there a way to get the second script to ignore the locked rows and just update as many as it can? The obvious answer seemed to be the READPAST hint, like follows...
update customers with (READPAST) set carrier = ''
...but this is still blocked by the original lock. I've tried combining it with all sorts of other table hints but all seem to get blocked.
The following does work, ignoring the lock and not returning the data
Select * from customers with (READPAST) where customer = 'A00001'
I've tried combining this with the update like so...
update customers set carrier = '' from customers with (READPAST) where customer = 'A00001'
...but this is blocked too.
I'm so desperate I tried moving the update into a cursor and updating one row at a time. Nothing worked. I thought I might be able to do something like this:
If (Select count(*) from customers with (READPAST) where customer = 'A00001') > 0 --then perform update
...but this returns a value of 1 even though the following returns no rows.
Select * from customers with (READPAST) where customer = 'A00001'
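One possible explanation, if this is SQL Server 2000: as far as I recall, that release honours the READPAST hint only in SELECT statements (2005 extended it to UPDATE/DELETE), which would fit the behaviour above. A hedged workaround sketch: capture the unlocked keys with a SELECT first, then update only those keys. It assumes customer is an indexed key, and it only avoids the blocked row if the update plan seeks on that index rather than scanning the table:

SELECT customer
INTO   #unlocked
FROM   customers WITH (READPAST)

UPDATE c
SET    c.carrier = ''
FROM   customers AS c
JOIN   #unlocked AS u ON u.customer = c.customer

DROP TABLE #unlocked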
Dear All - I am facing a problem passing a parameter to an update Paradox table statement in the OLE DB Command.
Briefly, an OLE DB Command sits between an OLE DB Source and an OLE DB Destination; all I want is to send the data that flows from the OLE DB Source to the update Paradox table statement in the OLE DB Command.
I use the parameter notation, which is "?", to refer to the parameter in the update WHERE clause, and it succeeds when I use the OLE DB Command to connect to a SQL Server DB. But when I use it with the Paradox table, an error appears saying that there is more than one data source column with the name "?" and the data source column names must be unique.
I don't know what that means. Does it mean that it considers "?" a column? Or is there another way to send a parameter to the update statement for a Paradox table that differs from the SQL Server update statement, which uses "?"?
I hope someone can help me as soon as possible because I need it for work soon. Thanks, Maylo
I am trying to run an update statement against a vendor's database that houses HR information. If I run a regular select statement against the database with the following query, it returns without error:
SELECT "QUDDAT_DATA"."QUDDAT-INT", "NAME"."INTERNET-ADDRESS", "QUDDAT_DATA"."QUDFLD-FIELD-ID", "QUDDAT_DATA"."QUDTBL-TABLE-ID" FROM "SKYWARD"."PUB"."NAME" "NAME", "SKYWARD"."PUB"."QUDDAT-DATA" "QUDDAT_DATA" WHERE ("NAME"."NAME-ID"="QUDDAT_DATA"."QUDDAT-SRC-ID") AND "QUDDAT_DATA"."QUDTBL-TABLE-ID"=0 AND "QUDDAT_DATA"."QUDFLD-FIELD-ID"=16 AND "QUDDAT_DATA"."QUDDAT-INT"=11237When I try to convert it into an
[Code] ....
I am assuming I am receiving this error because it doesn't know where to find QUDDAT-INT? How can I fix that?
The "QUDDAT-INT" column houses the employee number. So in the case of the SELECT query above, I am testing against a specific employee number.