I need to update the ilocationid from Table 1 on all Table 2 records related to Table 1, but there is no direct relation from Table 1 to Table 2. I need Table 3 to make the connection from Table 1 to Table 2.
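A join through the linking table is the usual pattern here. A minimal sketch, assuming Table3 carries one key pointing at Table1 and one pointing at Table2 (the key column names below are assumptions, not taken from the original schema):

-- table1id and table2id are assumed linking columns in Table3
UPDATE t2
SET    ilocationid = t1.ilocationid
FROM   Table2 t2
       INNER JOIN Table3 t3 ON t3.table2id = t2.id
       INNER JOIN Table1 t1 ON t1.id = t3.table1id;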
I'm trying to update a checkbox from "False" to "True" within a single table for multiple records. I can update a single record using the script below. However, I'm having trouble applying additional IDs to the string.
(Works) - Update Name_Demo set KEY_CONTACT = 'true' where ID = 225249
(doesn't work) - Update Name_Demo set KEY_CONTACT = 'true' where ID = '225249, 210014, 216543'
It says the query executes successfully but returns no rows.
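The second statement compares ID to the single quoted string '225249, 210014, 216543' rather than to three separate values, so nothing matches. The usual fix is to list the IDs in an IN clause:

-- one statement covering all three IDs
Update Name_Demo set KEY_CONTACT = 'true' where ID IN (225249, 210014, 216543)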
I am writing a query for the following: I need to collapse continuous records. If the termdate for an ID is one day less than the effdate of the next record (for the same ID), I need to collapse the records. See the example below... How should I write the query that will give me the desired output, i.e., get MIN(effdate) and MAX(termdate) when a termdate is one day less than the effdate of the next record?
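One common approach is the "gaps and islands" pattern. The sketch below assumes SQL Server 2012 or later (for LAG) and a table named Coverage with the ID, effdate and termdate columns described above; the table name is an assumption:

-- flag rows that start a new continuous range, then number the ranges and aggregate
WITH flagged AS (
    SELECT ID, effdate, termdate,
           CASE WHEN DATEDIFF(day,
                    LAG(termdate) OVER (PARTITION BY ID ORDER BY effdate),
                    effdate) = 1
                THEN 0 ELSE 1 END AS is_new_island
    FROM Coverage
), grouped AS (
    SELECT ID, effdate, termdate,
           SUM(is_new_island) OVER (PARTITION BY ID ORDER BY effdate
                                    ROWS UNBOUNDED PRECEDING) AS island
    FROM flagged
)
SELECT ID, MIN(effdate) AS effdate, MAX(termdate) AS termdate
FROM   grouped
GROUP  BY ID, island;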
How can I update each record in a table, based on a value in another table, with a single SQL statement? For example, suppose I have the following two tables:

Table1
Name   Something  Color
-----------------------------------------
John   GHAS       Blue
John   DDSS       Blue
John   EESS       Blue
Paul   xxxx       Red
Ringo  HJKS       Red
Ringo  FFFS       Red
Sara   hjkd       Purple
Sara   TTHE       Purple
Jimi   sdkjls     Green

Table2
Name   Color
------------------------
John   ?
Paul   ?
Ringo  ?
Sara   ?
Jimi   ?

How can I update the color field in Table2 to correspond with the color field in Table1 (so I can normalize the db and delete the color field from Table1)? I know I could open Table2 and loop through within my app; I'm just wondering about a single SQL statement that would do it. I need a similar technique in other places as part of my app.

Thanks,
Calan
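The single-statement form being asked about can be written with T-SQL's UPDATE ... FROM join syntax; each name in Table2 picks up the color of its matching rows in Table1 (which, per the sample data, is the same for every row of a given name):

UPDATE t2
SET    Color = t1.Color
FROM   Table2 t2
       INNER JOIN Table1 t1 ON t1.Name = t2.Name;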
I have a really simple query I'm trying to execute. I want to replace all instances of int X (8) in Column A (LastActivityID) with int Y (27) but the following SQL returns the said error. I understand the error, but not sure how to script the SQL differently to get the intended result. Thanks in advance.
Code Block
UPDATE Item SET LastActivityID = 27 WHERE LastActivityID = 8

Error: Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <=, >, >= or when the subquery is used as an expression.
Dear all,

I need to update one field in a table for a given record and visit number. Below is how the table looks:

SID  VISIT  DLCO
101  0      12
101  1      16
102  0      18
102  2      10
103  1      12
103  2      14

Here is how I would like it to look. The changes are the starred items.

SID  VISIT  DLCO
101  0      14*
101  1      16
102  0      18
102  2      16*
103  1      12*
103  2      14

I know it is an UPDATE statement, but I am not sure how to use it when I need to update more than one record.

Thanks for the help in advance.
Jeff
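A single UPDATE with a CASE expression can change several specific rows at once. A sketch covering the two rows whose values actually change above, assuming the table is called PFT (the table name is an assumption):

UPDATE PFT
SET    DLCO = CASE WHEN SID = 101 AND VISIT = 0 THEN 14
                   WHEN SID = 102 AND VISIT = 2 THEN 16
              END
WHERE (SID = 101 AND VISIT = 0)
   OR (SID = 102 AND VISIT = 2);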
I have a situation where deleting old records is blocking updates to the latest records on a highly transactional table, and the application is getting timeout errors.
In detail, I have one table called Tran_table1 in an OLTP database. Tran_table1 is a highly transactional table; it receives inserts and updates continuously.
While archiving records older than 2 years from Tran_table1 into Tran_table1_archive in batches (using DELETE with an OUTPUT INTO clause), any UPDATEs on Tran_table1 get blocked, and the result is timeout errors in the application.
Are there any SQL Server hints to avoid the blocking?
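Hints alone rarely solve this; what usually helps is keeping each archive batch small and committing it separately, so locks stay short-lived and below the lock-escalation threshold (roughly 5,000 rows). A sketch, where the CreatedDate filter, the 1,000-row batch size, and the assumption that the archive table matches the source's columns are all assumptions:

DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    -- each small batch commits on its own, releasing locks quickly
    DELETE TOP (1000) FROM Tran_table1
    OUTPUT deleted.* INTO Tran_table1_archive
    WHERE CreatedDate < DATEADD(year, -2, GETDATE());

    SET @rows = @@ROWCOUNT;
END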
I need to list all the records in Table2 which don't have matching field values in Table1.
This is the exact opposite of what I need: SELECT DISTINCT Field1, Field2, Field3, Field4, Field5 FROM [Table1] WHERE EXISTS( SELECT DISTINCT FieldA, FieldB, FieldC, FieldD, FieldE FROM [Table2] )
The above seems to give me all records in Table1 in which the five fields match the five fields specified in Table2. What does not show up is the test record I put in Table2 which is not in Table1.
What I need, however, is the exact opposite.
I tried the above using NOT EXISTS but I get no records at all.
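The EXISTS/NOT EXISTS above is uncorrelated, so it evaluates to the same answer for every row (all rows or none). A correlated version, matching the five fields row by row, is the usual way to get the "opposite" result described:

SELECT t2.FieldA, t2.FieldB, t2.FieldC, t2.FieldD, t2.FieldE
FROM   [Table2] t2
WHERE  NOT EXISTS (
          SELECT 1
          FROM   [Table1] t1
          WHERE  t1.Field1 = t2.FieldA
            AND  t1.Field2 = t2.FieldB
            AND  t1.Field3 = t2.FieldC
            AND  t1.Field4 = t2.FieldD
            AND  t1.Field5 = t2.FieldE);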
I have a table;

CREATE TABLE theLiterals (
    theKey varchar (255) NOT NULL,
    theValue varchar (255) NULL
)
INSERT INTO theLiterals VALUES('defaultServer','\MyServer')
INSERT INTO theLiterals VALUES('defaultShare','MyShare')
INSERT INTO theLiterals VALUES('defaultFolder','MyFolder')
INSERT INTO theLiterals VALUES('defaultFile','MyFile.dat')

I then try;

SELECT
    defaultServer = CASE WHEN theKey = 'defaultServer' THEN theValue END,
    defaultShare  = CASE WHEN theKey = 'defaultShare'  THEN theValue END,
    defaultFolder = CASE WHEN theKey = 'defaultFolder' THEN theValue END,
    defaultFile   = CASE WHEN theKey = 'defaultFile'   THEN theValue END
FROM theLiterals

and I get;

defaultServer  defaultShare  defaultFolder  defaultFile
\MyServer      NULL          NULL           NULL
NULL           MyShare       NULL           NULL
NULL           NULL          MyFolder       NULL
NULL           NULL          NULL           MyFile.dat

but I want it COALESCEd like this;

defaultServer  defaultShare  defaultFolder  defaultFile
\MyServer      MyShare       MyFolder       MyFile.dat

...but my syntax is incorrect. Is there an efficient way of doing this? I want to have a script/UDF where I can say...

GetLiteralsFor('defaultServer','defaultShare','defaultFolder','defaultFile')

and then my one-row recordset will be...

RS(0) will = '\MyServer'
RS(1) will = 'MyShare'
RS(2) will = 'MyFolder'
RS(3) will = 'MyFile.dat'

Thanks for any help!
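Wrapping each CASE in an aggregate collapses the four rows into one and discards the NULLs, which matches the desired one-row output:

SELECT
    defaultServer = MAX(CASE WHEN theKey = 'defaultServer' THEN theValue END),
    defaultShare  = MAX(CASE WHEN theKey = 'defaultShare'  THEN theValue END),
    defaultFolder = MAX(CASE WHEN theKey = 'defaultFolder' THEN theValue END),
    defaultFile   = MAX(CASE WHEN theKey = 'defaultFile'   THEN theValue END)
FROM theLiterals;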
I want to compare two tables and log the differences in a new table with the fields (old value, new value, column name). The column name should identify which column's value changed.
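A minimal sketch of one way to do this, assuming both tables share a key column Id, that Col1 is one of the compared columns, and that the log table is ChangeLog(OldValue, NewValue, ColumnName); all of those names are assumptions, and one such SELECT per compared column can be combined with UNION ALL:

INSERT INTO ChangeLog (OldValue, NewValue, ColumnName)
SELECT CAST(o.Col1 AS varchar(255)), CAST(n.Col1 AS varchar(255)), 'Col1'
FROM   OldTable o
       INNER JOIN NewTable n ON n.Id = o.Id
WHERE  ISNULL(CAST(o.Col1 AS varchar(255)), '') <> ISNULL(CAST(n.Col1 AS varchar(255)), '');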
What is the best way to use SSIS to deal with a CSV file whose records contain different numbers of fields?
I could just load each line as a single column delimited by <CRLF>, but then I would have to write the code to parse the line, effectively detokenising the columns myself; and if that is the case, why use SSIS at all?
I'm having a bit of a problem putting together a query that will update records in one table from another table. I've got two tables; let's call them tblA and tblB. In tblA there are EID(int), QID(int), and OID(int); in tblB, OID(int) and QID(int). There is also an input parameter @EID.

What I want to happen is that when someone inputs @EID, tblB gets updated from tblA. To give you a heads up, there are no PKs or FKs in either of the tables. If there is an OID in tblA, it takes the QID from tblA and places it in tblB where the OID from tblA and tblB match.

Hopefully this makes sense. I thought that I could do something like this:

CREATE PROCEDURE test3 (@EID int)
AS
UPDATE tblUsed
SET QuestionID = (select QuestionID from tblExamQuestions where ExamID = @EID)
WHERE OrderID = (select OrderID from tblExamQuestions where ExamID = @EID)
GO

But I get an error that says that too many records are being returned by the subqueries.

Winston
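A join-based UPDATE avoids the single-value subqueries by pairing each tblUsed row with its matching tblExamQuestions row on OrderID; a sketch using the names from the procedure above:

UPDATE u
SET    QuestionID = q.QuestionID
FROM   tblUsed u
       INNER JOIN tblExamQuestions q ON q.OrderID = u.OrderID
WHERE  q.ExamID = @EID;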
I have a table with 35,000 records in it. I want to update a value in column A for only the first 5000 records, leaving the value in column A for the remaining 30,000 records as it is now. What would be the command I would use to update column A for the first 5000 records?
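"First 5,000" needs an explicit ordering to be well defined. A sketch that updates through a CTE, assuming the table is called MyTable, the ordering column is Id, and 'new value' stands in for the real value (all assumptions):

WITH first5000 AS (
    SELECT TOP (5000) ColumnA
    FROM   MyTable
    ORDER  BY Id          -- defines which 5,000 rows count as "first"
)
UPDATE first5000
SET    ColumnA = 'new value';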
I am trying to update a large table which consists of 45 million records; the update is taking more than 2 days. Below is my approach:
1. The table has only one clustered index and no other indexes.
2. I am updating in batches of about 20,000 records at a time.
3. I changed the recovery model to bulk-logged, the auto-growth size is set to 300 MB, and there is enough space on my disk for the transaction log.
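One pattern that often helps is walking the clustered key in ranges, so each batch seeks straight to its rows instead of rescanning rows that were already updated. A sketch, where the table name, key column, batch size and the SET/WHERE logic are all assumptions:

DECLARE @minId bigint = 0, @maxId bigint, @batch int = 20000;
SELECT @maxId = MAX(Id) FROM BigTable;

WHILE @minId <= @maxId
BEGIN
    -- each batch touches only one key range of the clustered index
    UPDATE BigTable
    SET    SomeColumn = 'new value'
    WHERE  Id > @minId AND Id <= @minId + @batch;

    SET @minId = @minId + @batch;
END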
I have many large tables with millions of records in a SQL Server database. They all use an Identity column which is the clustered index. We haven't been deleting any records until recently because disk space is now becoming a problem.
Assuming I delete a lot of old records, I'm thinking that the freed up space in the data pages won't be reused. New records will be added at the end of the table because of the Identity column being the clustered index. So the table will keep getting bigger even though there is lots of free space.
Assuming I'm right, then how do I recapture this unused space? Is an alter table rebuild the best way? Or rebuilding the clustered index?
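Rebuilding the clustered index (or the table) compacts the pages and makes the freed space available for reuse inside the database; shrinking the data file itself is a separate step and is usually avoided because it fragments indexes. A sketch, with the table name as an assumption:

-- rebuilding the clustered index rebuilds the table and reclaims the deleted-row space
ALTER INDEX ALL ON dbo.MyLargeTable REBUILD;

-- on SQL Server 2008 and later the table can also be rebuilt directly
ALTER TABLE dbo.MyLargeTable REBUILD;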
Hi,

Currently we're building a metadata-driven data warehouse in SQL Server 2000. We're investigating updating tables with an enormous number of updates and inserts, and the use of checkpoints (for simple recovery, and Backup Log for full recovery). On several websites people talk about the transaction log filling up because it can't keep up with the pace of the updates. Therefore we want to create a script which flushes the dirty pages to disk. It's not quite clear to me how this works. The questions we have are:

* How does the process of updating, inserting and deleting work in SQL Server 2000 with respect to the log cache, log file, buffer cache, commit, checkpoint, etc.? What happens when?
* As far as I can see now, I'm thinking of creating chunks of 1000 records with a checkpoint after the query. SQL Server has the default of implicit transactions and so it will not need a commit. Something like this?
* How do I create chunks of 1000 records automatically without creating an identity field or something? Is there something like SELECT NEXT 1000?

Greetz,
Hennie
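On SQL Server 2000 the usual way to process in chunks without adding an identity column is SET ROWCOUNT, which caps how many rows each statement touches. A sketch, where the table, column and filter are assumptions standing in for the real update:

SET ROWCOUNT 1000;             -- limit each UPDATE to 1000 rows
WHILE 1 = 1
BEGIN
    UPDATE FactTable
    SET    Processed = 1
    WHERE  Processed = 0;      -- the filter must exclude rows already done, so the loop advances

    IF @@ROWCOUNT = 0 BREAK;

    CHECKPOINT;                -- flush dirty pages between chunks, as described above
END
SET ROWCOUNT 0;                -- reset the limit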
I have an SSIS package that simply moves data from a SQL database A to another SQL database B. I have updated (increased) the size of an nvarchar column on both A and B. I am wondering if there is a way to "refresh" the SSIS package somehow so I don't have to rebuild and redeploy it. The error I get now is a truncation error: "Text was truncated or one or more characters had no match in the target code page".
So to the above table I have added a new field/column named "ProdLongDescr" (varchar, NULL).
So, I need to populate this newly added column with specific values for each row depending on "ProductCode", which is different for every row. The problem is that I have 25 rows. So instead of writing 25 individual update scripts, is there a way a single query will do the same job instead of writing one update query for each row? If so, can someone guide me on how to achieve that, or point me to a good resource?
Below are a couple of the individual update scripts I wrote. "ProductCode" is different for all 25 rows.
Update tblValAdPackageElement SET ProdLongDescr = 'Slideshows' WHERE ProductCode = 'SLID' And szElementDescr='Slideshow' if @@error <> 0 begin goto ErrPos end
Update tblValAdPackageElement SET ProdLongDescr = 'CategorySlideshows' WHERE ProductCode = 'SLDC' And szElementDescr='CategorySlideshow' if @@error <> 0 begin goto ErrPos end
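On SQL Server 2008 and later, one way to fold the 25 statements into a single UPDATE is to join the table to an inline VALUES list of code/description pairs. The sketch below includes only the two pairs shown above; the remaining 23 would be added to the list:

UPDATE t
SET    ProdLongDescr = v.ProdLongDescr
FROM   tblValAdPackageElement t
       INNER JOIN (VALUES
           ('SLID', 'Slideshow',         'Slideshows'),
           ('SLDC', 'CategorySlideshow', 'CategorySlideshows')
       ) AS v (ProductCode, szElementDescr, ProdLongDescr)
         ON  v.ProductCode   = t.ProductCode
         AND v.szElementDescr = t.szElementDescr;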
I have installed SQL Server Express Edition. I have migrated a set of tables from Oracle 10g (by using Microsoft's Migration Tool Kit). While I am trying the following simple update command, the session hangs and it never finishes!
I am trying to figure out whether what I am attempting to do is possible and whether or not my approach is wrong to begin with.
I am trying to build a custom report for our accounting system, which is Traverse from Open Systems. This is what I have done in the stored procedure thus far:
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO

ALTER PROCEDURE rptArFLSalesByCustItemized_sp
    @custId pCustID,
    @dateFrom datetime,
    @dateThru datetime,
    @itemIdFrom pItemId,
    @itemIdThru pItemId
as
set nocount on

-- define some variables for previous year
declare @LYqty int, @LyAmt money, @LYfrom datetime, @LYthru datetime

-- set defaults
SET @itemIdFrom = ISNULL(@itemIdFrom, (SELECT MIN(itemId) FROM tblInItem))
SET @itemIdThru = ISNULL(@itemIdThru, (SELECT MAX(itemId) FROM tblInItem))
SET @LYfrom = DATEADD(YEAR, -1, @dateFrom)
SET @LYthru = DATEADD(YEAR, -1, @dateThru)
-- create small temp table to hold customer info
Create Table #tmpArCustInfo
(
    custId pCustID,
    custName VARCHAR (30)
)
-- populate customer temp table with info
Insert into #tmpArCustInfo
select custId, custName from tblArCust WHERE custId = @custId

-- create a temp table to hold the Data for each Item
Create Table #tmpArSalesItemized
(
    itemId pItemId,
    productLine VARCHAR (12),
    pLineDesc VARCHAR (35),
    descr VARCHAR (35),
    LYQtySold int,
    LYTDQtySold int,
    QtySold int,
    LYTDsales money,
    totalSales money,
    LastInvDate datetime
)
-- populate the temp table with all of the inventory items
insert into #tmpArSalesItemized
select ii.itemId, ii.productLine, ip.Descr, ii.Descr, 0, 0, 0, 0, 0, NULL
from tblInItem ii, tblInProductLine ip
where ip.productLine = ii.productLine
  AND ii.itemId BETWEEN @itemIdFrom AND @itemIdThru

-- update table with this years quantities
update #tmpArSalesItemized
SET QtySold = (select SUM(QtyOrdSell)
               from tblArHistDetail hd
               where TransId IN (select TransId from tblArHistHeader where custId = @custId)
                 AND orderDate IN (select OrderDate from tblArHistHeader where OrderDate BETWEEN @dateFrom AND @dateThru)
                 AND hd.partId BETWEEN @itemIdFrom AND @itemIdThru
               GROUP BY hd.partId)
-- Return the temp tables results
select * from #tmpArSalesItemized, #tmpArCustInfo
drop table #tmpArSalesItemized, #tmpArCustInfo
return
GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO
My problems begin where I want to start updating all of the quantities in the QtySold field. I have managed to get it to write the same sum in every row, but I cannot figure out how to update each row based on the sum of the quantity found for that item in the tblArHistDetail table. The trouble is that there is no reference to the custId in that table either; the custId resides in tblArHistHeader and is linked to the detail table via the TransId column. So really I need to update many rows based on criteria from two other tables.
Can anyone please help? I don't have a clue how to make this work, and most of what I have learned about SQL thus far has been from opening other stored procs in the accounting system and reading to see how the developers have done things.
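One way to make the sum row-specific is to correlate the subquery with the outer row on partId/itemId, and to reach custId and the order date by joining tblArHistDetail to tblArHistHeader on TransId, as described above. A sketch of the reworked update (the exact join columns are assumptions based on that description):

update si
SET QtySold = ISNULL((select SUM(hd.QtyOrdSell)
                      from tblArHistDetail hd
                           join tblArHistHeader hh ON hh.TransId = hd.TransId
                      where hh.custId = @custId
                        AND hh.OrderDate BETWEEN @dateFrom AND @dateThru
                        AND hd.partId = si.itemId), 0)   -- correlate on the outer row's item
from #tmpArSalesItemized si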
I have 4 databases on the same server. My users use all of them and use the same username and password to log into all 4 databases. In each of these databases I have put a control table that lets me keep track of all users who have to reset their passwords.
The control table consists of the username and flag fields. When the flag is ON (1) the user is forced to reset their password, and when the flag is OFF (0) they are not.
When a user logs into any one of these databases and has to reset their password, how do I make sure the control tables in the other databases are also updated, so that the user is not forced to reset their password again when they later log into those other databases, since they are using the same username and password for all databases?
I am planning to use a stored procedure which I will put into all four databases; when a user logs in and has to reset their password, that sproc is called and automatically updates the control tables in the other 3 databases.
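A sketch of such a procedure using three-part names, assuming the databases are called DB1 through DB4 and each holds the same control table dbo.PasswordControl(username, flag); all of those names are assumptions:

CREATE PROCEDURE dbo.ClearPasswordResetFlag
    @username varchar(128)
AS
BEGIN
    SET NOCOUNT ON;

    -- clear the reset flag in every database so the user is only prompted once
    UPDATE DB1.dbo.PasswordControl SET flag = 0 WHERE username = @username;
    UPDATE DB2.dbo.PasswordControl SET flag = 0 WHERE username = @username;
    UPDATE DB3.dbo.PasswordControl SET flag = 0 WHERE username = @username;
    UPDATE DB4.dbo.PasswordControl SET flag = 0 WHERE username = @username;
END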
I have a scenario where I have to read 2 files that update the same table (a temp staging table). This comes from the source system's limitation on the number of columns it can export. As a workaround we split the data into 2 files, where the 2nd file contains the first file's primary key so we know which record to update.
Here is my problem...
The table that needs to be updated contains 9 columns. File one contains 5 of them and file2 contains 4 of them.
File 1 inserts 100 rows and leaves the other 4 columns as NULLs, ready for file 2 to update them. File 2 updates 10 rows but fails on 90 rows due to incorrect data. Thus only 10 rows are successfully updated and ready to be processed, but 90 are incorrect. I want to still do processing on the existing 10 but can't afford to try to do processing on the broken ones...
The easy solution would be to remove the incorrect rows from the temp table whenever an error occurs in the 2nd file's package, by running a SQL query on the table using the primary keys that exist in both files; but when the error occurs on the Flat File source, I can't get the primary key.
What would be the best suggestion? Should I rather fail the whole package if 1 row bombs out? I can't put any logic in the following package that does the master file update/insert from the temp table because of the nature of the data. I
Hello all, my first post here... hope it goes well. I'm currently working on a stored procedure where I translated some reporting language into T-SQL.

The logic: I have a group of tables containing important values for calculation. I run various sum calculations on various fields in order to retrieve cost calculations, etc.

1) There is a select statement which gathers all the "records" which need calculations. For example:

select distinct Office from Offices where OfficeDesignation = 'WE' or OfficeDesignation = 'BE'...etc.

As a result I get a list of, let's say, 5 offices which need to be calculated!

2) A calculation select statement is then run in a loop for each of the 5 offices returned above (an @OfficeName cursor is used here!). An example looks like this (note that @WriteOff is a variable storing the result):

select @WriteOff = sum(linecost * (-1))
From Invtrans, Inventory
Where ( transtype in ('blah', 'blah', 'blah') )
and ( storeloc = @OfficeName )
and ( Invtrans.linecost <= 0 )
and ( Inventory.location = Invtrans.storeloc )
and ( Inventory.itemnum = Invtrans.itemnum )

...etc. This sample statement returns a value which is assigned to the variable @WriteOff (for each of the 5 offices mentioned in step 1). This is done around 9 times per loop iteration (9 calculations).

3) At the end of each loop (for each office), we do an insert statement into a table in the database.
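For what it's worth, the per-office loop can often be replaced by one set-based statement: a single GROUP BY computes the write-off for every qualifying office and inserts all the results at once. A sketch, where the target table OfficeWriteOffs is an assumption and the other names come from the post above:

INSERT INTO OfficeWriteOffs (Office, WriteOff)
SELECT it.storeloc, SUM(it.linecost * (-1))
FROM   Invtrans it
       INNER JOIN Inventory inv
               ON inv.location = it.storeloc
              AND inv.itemnum  = it.itemnum
WHERE  it.transtype IN ('blah', 'blah', 'blah')
  AND  it.linecost <= 0
  AND  it.storeloc IN (SELECT DISTINCT Office
                       FROM Offices
                       WHERE OfficeDesignation IN ('WE', 'BE'))
GROUP  BY it.storeloc;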