How To Insert More Than Once When Processing One Row In Script Component

Feb 14, 2006

I am trying to call DirectRowToOutput0 more than once to achieve this, but it throws a COM error. I could create more than one destination instead, but in my situation that would mean creating 8 destinations. Is there an easier way?

Thanks

View 5 Replies



The Component Metadata For Component DataReader Source (1113) Could Not Be Upgraded To The Newer Version Of The Component.

Oct 26, 2007

Hello,

I have a package that has a data flow task. This task imports data from a DB2 database (using the IBM OLE DB provider for DB2) and adds it to a SQL Server database table. This package was created on the server. Then, through version control (using TFS source control), I checked out the package on my local machine, and when I open the package I get the following 3 errors.

Error 1 Validation error. Import Account Num from BMGP_BDR: DTS.Pipeline: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.

Error 2 Error loading BMAG Download Xref Tables - bmag.dtsx: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of component "DataReader Source" (1113) is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]] at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper90 wrapper, Int32 lPipelineVersion)

Error 3 Error loading BMAG Download Xref Tables - bmag.dtsx: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.


Please advise.
Thank you.





View 7 Replies View Related

The Component Metadata For Component DataReader Source Could Not Be Upgraded To The Newer Version Of The Component.

Jan 23, 2007

Hi,

I have a package which reads an Access file from a folder. My connection manager for this file uses the .NET Providers for OLE DB\Microsoft Jet 4.0 OLE DB Provider.

Package works from my computer. But when I execute it on the server as a SQL Agent job, I get







The component metadata for "component "DataReader Source" (1) could not be upgraded to the newer version of the component. The PerformUpgrade method failed.  

I copied the mdb file to a folder on the server which my packages have no problem reading data from.

My packages run under the same domain account as defined in proxies.

I'd appreciate any help.

Gulden

 

 

View 4 Replies View Related

Interrupt Processing Of A Large Insert Process In SQL Server 2000.

Jul 20, 2005

I'm running a resource-intensive stored procedure, which reads a file with about 50,000 lines with a BULK INSERT into a temp table, then goes through it and inserts a record for each line into another table. While this procedure is running, SQL Server stops accepting any other requests coming from the website.

Question: Is there a way to make SQL Server "listen", or emulate an "interrupt" to other requests, while in the middle of a long intensive process?

I really appreciate your replies.

Thank you,
Oleg.
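Not from the original thread, but one common workaround is to commit the second step in smaller batches so other requests can get in between batches. A rough sketch, assuming an identity column RowId is added to the temp table; the target table and column names are illustrative:

DECLARE @BatchSize int, @MaxId int, @LastId int
SELECT @BatchSize = 1000
SELECT @MaxId = MAX(RowId), @LastId = MIN(RowId) - 1
FROM #import                                        -- #import: the BULK INSERT temp table, RowId added as identity

WHILE @LastId < @MaxId
BEGIN
    INSERT INTO dbo.TargetTable (LineText)          -- illustrative target table/column
    SELECT LineText
    FROM #import
    WHERE RowId > @LastId AND RowId <= @LastId + @BatchSize

    SELECT @LastId = @LastId + @BatchSize
    WAITFOR DELAY '00:00:01'                        -- brief pause so queued requests can run
END

Each batch holds locks only briefly, so other connections get a chance to run between iterations.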

View 5 Replies View Related

Transact SQL :: Truncate And Insert Operation - Avoid No Rows In Table During Processing

Oct 22, 2015

I have a job which executes hourly.

Essentially:

Begin
Truncate table A
Insert into A
(Col1,
Col2,
Col3...
)
Select Value1,
Value2,
Value3...
From Table B
End

The insert query takes approximately 3.5 minutes to execute. What's occurring is that the table is immediately truncated, and there are no rows in the table for those 3.5 minutes.

How can I avoid this gap - where there are no rows in the table for that period of time during the job execution? The table could be locked, but that doesn't seem like the best solution.
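Not part of the original post, but one pattern that avoids the gap is to load a shadow copy of the table and then swap the names in a near-instant transaction. The names dbo.A_Staging / dbo.A_Old and the columns below are illustrative:

TRUNCATE TABLE dbo.A_Staging;              -- shadow copy of A with the same schema

INSERT INTO dbo.A_Staging (Col1, Col2, Col3)
SELECT Value1, Value2, Value3
FROM dbo.TableB;                           -- the slow 3.5-minute load runs here while dbo.A stays queryable

BEGIN TRAN;
    EXEC sp_rename 'dbo.A', 'A_Old';
    EXEC sp_rename 'dbo.A_Staging', 'A';
    EXEC sp_rename 'dbo.A_Old', 'A_Staging';
COMMIT TRAN;                               -- the swap itself takes only milliseconds

sp_rename prints a caution about renaming objects, and any views, permissions or foreign keys referencing the table would need checking, so treat this as a sketch rather than a drop-in fix.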

View 8 Replies View Related

Insert Two Rows At The End On Every Component In Table

May 17, 2014

I have a table that stores a structure, with the main material and its components. The components are numbered.

Now I want to insert two components at the end of the component list for each main material in the table. These two new components are the same for all materials. How do I do this the best way?

Ex:

Material----Component----RowRn----
----------------------------------
Label2------EVO_121---------1-----
Label2------EVO_243---------2-----
Label2------EVO_638---------3-----
Label3------EVO_311---------1-----
Label3------EVO_120---------2-----
Label3------EVO_081---------3-----
Label3------EVO_987---------4-----

The two new components are NewComp1 and NewComp2

I want it to look like this:

Material----Component----RowRn----
----------------------------------
Label2------EVO_121---------1-----
Label2------EVO_243---------2-----
Label2------EVO_638---------3-----
Label2------NewComp1--------4-----
Label2------NewComp2--------5-----
Label3------EVO_311---------1-----
Label3------EVO_120---------2-----
Label3------EVO_081---------3-----
Label3------EVO_987---------4-----
Label3------NewComp1--------5-----
Label3------NewComp2--------6-----
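A sketch of one way to do this in a single set-based insert, assuming the table is called dbo.Structure and has exactly the three columns shown above:

INSERT INTO dbo.Structure (Material, Component, RowRn)
SELECT s.Material,
       n.NewComponent,
       s.MaxRn + n.Offset                       -- continues each material's numbering
FROM  (SELECT Material, MAX(RowRn) AS MaxRn
       FROM dbo.Structure
       GROUP BY Material) AS s                  -- current highest row number per material
CROSS JOIN
      (SELECT 'NewComp1' AS NewComponent, 1 AS Offset
       UNION ALL
       SELECT 'NewComp2', 2) AS n;              -- the two components to append everywhere

The CROSS JOIN pairs every material with both new components, so each material gets NewComp1 at MaxRn + 1 and NewComp2 at MaxRn + 2.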

View 7 Replies View Related

Use Of A SSIS Variable Of Type "Object" Inside Script Component And Task Component

Mar 16, 2007

In a Data Flow, I need to use a SSIS variable of type "Object" inside a Script Component and assign to it the contents of 'n' variables of string type.
On exiting from the script, the variable of type Object should contain something like the following lines:
AAAAAAAAAAAAAAAAAAAAAAAAAAAAA
BBBBBBBBBBBBBBBBBBBBBBBBBBBBB
CCCCCCCCCCCCCCCCCCCCCCCCCCCCC
DDDDDDDDDDDDDDDDDDDDDDDDDDDDD
...
...
On exiting from the data flow, I will use the variable of type Object in a Script Task, reading each element in a cyclic fashion.
Has anyone experienced something like this? Could anyone provide an example of it?
Thanks in advance!

View 3 Replies View Related

A Custom Component For Use As A VIEW In SSIS- Is It Possible To Create One MERGE Like Component With More Than 2 Inputs

Aug 13, 2007

Hi all
I'm on a project which uses a lot of views for joining 2 or more tables. Using the MERGE component in SSIS would be a huge effort because it only has 2 inputs and I have to SORT the input too.
Isn't it possible to have a VIEW-like component that joins more than 2 tables and doesn't need sorting?
(I've thought about creating views in the database engine, but that breaks my data flow in SSIS and isn't a practical solution.)

View 4 Replies View Related

Reference To Preceding Component From Custom Dataflow Transformation Component

Mar 30, 2006

I am writing a custom dataflow transformation component and I need to get the name of the preceding component.

I have been trying to find a way to get a reference to the Package object, MainPipe object or IDTSPath90 object (connecting to the IDTSInput90 of my component) from my component because I think from there I can get to the information I want.

Does anyone have any suggestions?

TIA . . . Ed

View 7 Replies View Related

Serious Script Component Bug - Clears Out All Code Inside Component

Nov 27, 2007



No idea where this bug crept in from. Have been using SSIS for 1.5 years now without hitting this problem.

I had a script component opening an XML document and parsing it using XPATH. I added some code that uses StreamReader / Streamwriter (closing one stream before starting the other). The code works without issue in my C# app.

And it ran without issue 2-3 times in SSIS. Then suddenly after running my package again, the script component says it completes successfully, yet nothing happens. I set a breakpoint on the first line of code - it never hits it. I add a msgbox as the first line of code - and it never displays.

I then close my package / exit out of ssis ... and then re-open it. When i open my script component, all of my code is GONE. All references that I added are gone.

I tried adding the streamreader/writer process to a dll I created from my c# app ... and added the DLL to the package -- same result.

I can reproduce this on 2 different computers.

Anyone experience this problem ? Any idea how to stop it ? Or debug it ?


Here is a slimmed down code sample of what causes the error :


Public Class ScriptMain
Public Sub Main()
Try
Dim xmlDoc As New XmlDocument
xmlDoc.Load("c:ulkasync_86281519_20070628045850225_4.xml")
MsgBox("xmlLoaded") --this doesn't display once the package starts "acting up"
Catch ex As Exception
MsgBox(ex.Message)
UpdateXML("c:ulkasync_86281519_20070628045850225_4.xml", ex.Message)
End Try
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub UpdateXML(ByVal fileName As String, ByVal message As String)
Try
Dim invalidChar As String = message.Trim().Substring(message.Trim().IndexOf("0x"), 4)
Dim rd As StreamReader = New StreamReader(fileName)
Dim xml As String = rd.ReadToEnd()
Xml = Xml.Replace(invalidChar, String.Empty)
xml = xml.Replace("", String.Empty)
xml = xml.Replace("<![CDATA[<![CDATA[", "<![CDATA[")
xml = xml.Replace("]]>]]>", "]]>")
MsgBox("replaced")
rd.Close()
Dim wr As StreamWriter = New StreamWriter(fileName)
wr.Write(xml)
wr.Close()
Dim xdoc As XmlDocument = New XmlDocument()
xdoc.Load(fileName)
Catch ex As Exception
UpdateXML(fileName, ex.Message)
End Try
End Sub
End Class

View 4 Replies View Related

Enable Error Handling When Writing Custom Source Component /custom Error Handling Component.

Apr 21, 2006

1) We are writing a custom Source component for Oracle with OCI calls. Could someone please let me know how to enable error handling for the same?

2) Is it possible to write a custom error handling component for SSIS? If yes, could you please help me with how to write it?

Thanks in advance.

View 1 Replies View Related

DTS PROCESSING

Aug 3, 2001

I have a DTS package that imports data from an Oracle database into SQL Server.
Doesn't the processing mostly occur on the SQL Server, not on the
Oracle database from which the data is being imported?
The Oracle database is vendor provided and they are saying
our SQL Server DTS package is killing their server.
Any insight is appreciated.
Thanks

View 1 Replies View Related

XML Processing

Mar 27, 2008

Hey All,

I've got a process that creates records in my database based on XML input that I've received. What I am doing is giving this XML to a stored procedure to handle a specific task, then modifying the XML and sending it to the next stored procedure.

For instance, the XML could hold header records with detail records, I would first send the XML to a stored procedure that creates the header records, then updates the XML so the XML now knows the identity values of the header records I have just created, and then send the XML to the next stored procedure to create the details for those headers.

All works great and fine, but I have a problem with writing the identity values back to the XML. It seems I can only change one item in the XML at a time and thus need to loop this. For many records this really takes a long time.

Here is some sample code of what I'm doing (please excuse any typos, this is a simplified version of the code) :

declare @lvSeq numeric(15)
declare @lvRowNo int
declare @lvNumRows int

insert into myHeaderTable (
recid, recdesc
) select
ref.value('@recid', 'nvarchar(25)') recid,
ref.value('@recdesc', 'nvarchar(250)') recdesc
from @pXML.nodes('//headers/header') R(ref)

select @lvRowNo=1, @lvNumRows = @pXML.value('count(//headers/header)', 'int')
while (@lvRowNo<=@lvNumRows) begin
select @lvSeq = recseq
from myHeaderTable
where recid = @pXML.value('(//headers/header[position()=sql:variable("@lvRowNo")]/@recid)[1]', 'nvarchar(25)')

set @pXML.modify('replace value of (//headers/header[position()=sql:variable("@lvRowNo")]/@recseq)[1] with sql:variable("@lvSeq")')

select @lvRowNo=@lvRowNo+1
end


Obviously I am looking for a better way to update the XML with the sequences. The insert takes a second, the loop takes minutes with large XML sets. I guess MSSQL is searching the whole XML to find the item to update.

It would be nice if I didn't have to loop through the XML. One solution I was thinking of is to store the XML in a temporary table with a single record per header item. Then I could do the modify in one go and recreate the XML by simply selecting the contents of the temporary table. I have no idea if this is possible.

So something like this:

select
ref.value('@recid','nvarchar(25)') recid,
ref.value('.','XML') XMLData -- this gives an error
into #TMP_XML
from @pXML.nodes('//headers/header') R(ref)

insert into myHeaderTable (
recid, recdesc
) select
recid,
ref.value('@recdesc', 'nvarchar(250)') recdesc
from #TMP_XML CROSS APPLY XMLData.nodes('/header') R(ref)

update #TMP_XML
set XMLData.modify('replace ....')
from myheadertable
where #TMP_XML.recid = myheadertable.recid

-- recreate XML here, not sure how....
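One way to sidestep the per-item modify loop entirely (a sketch, not from the original post) is to never write the identities back into the XML: shred the headers and the details in the same batch, and pick up the generated recseq with a join on recid. The detail element and attribute names below, and the detail table, are assumptions:

-- After the header insert above, resolve each detail row's header identity with a join
INSERT INTO myDetailTable (recseq, detid, detdesc)           -- myDetailTable and its columns are illustrative
SELECT  t.recseq,
        D.ref.value('@detid',   'nvarchar(25)'),
        D.ref.value('@detdesc', 'nvarchar(250)')
FROM    @pXML.nodes('//headers/header') AS H(hdr)
CROSS APPLY H.hdr.nodes('detail')       AS D(ref)            -- detail rows nested under each header
INNER JOIN myHeaderTable AS t
        ON t.recid = H.hdr.value('@recid', 'nvarchar(25)')   -- map back to the identity created earlier

This assumes recid is unique per header within the batch; if it is not, some other natural key from the XML would be needed for the join.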

View 1 Replies View Related

SQL Mail Processing

Sep 1, 1998

Is it possible to read a mail message,
store its msgtext in a variable,
decrypt the retrieved string,
and execute the string (which is a SQL statement)?

If it is possible, which direction should I take
to implement the above?

View 1 Replies View Related

Row Vs Column Processing Help!!

Jun 15, 2005

Hello friends,
I needed a suggestion. I am currently working on a reporting website that generates reports, and I need to store all the reports in the database.

I usually go with row-wise processing as it can be easily controlled, but the problem is there will be a lot of reports - an estimated 30,000 rows a month - and I'm not sure if SQL Server can hold more than 2 billion rows per table.

Please Help!!

View 2 Replies View Related

Regarding Records Processing???

Mar 1, 2006

Hi…all

I will just explain the whole scenario of the tricky problem I am facing.

We have XML files coming in at regular intervals from another source into SQL Server 2000, carrying roughly 10,000 to 70,000 records daily. We have a job scheduled to run at regular intervals, applying some filter criteria. The flow is: staging table into secondary table, and then finally into the primary table.
We designed the DTS package accordingly, i.e. take the records from the staging table, put them into the secondary table and then into the primary table (nearly 8 tasks are involved in it).
Suppose an XML file arrives at 8:30 am and our DTS package runs at 9:00 am, then at 11:00 am, then 1:00 pm, and so on. What I have been observing for many days is that after the job runs successfully at 9:00 am, some good data is still pending in the secondary table, not processed into the primary table. But when the job runs again at 11:00 am, it processes those pending good records into the primary table. Sometimes when I run this job manually from the DTS designer, the pending good data in the secondary table gets processed!

My question is: why does this job not process all the good records in a single shot?

T.I.A

Papillon

View 3 Replies View Related

XML Data Processing

Dec 30, 2005

Hello everyone. I need help regarding the following. Given the following table:

CREATE TABLE T1 (C1 nvarchar(10), C2 money)
INSERT INTO T1 VALUES ('A',1)
INSERT INTO T1 VALUES ('B',2)
INSERT INTO T1 VALUES ('C',3)

Let's say that I have this table on a local server and I want to upload it to a remote server, and on the remote server upload it into a database that contains the same table. The uploading part can be done by another application on the remote server, but what I need is a way to transfer the data in the fastest possible way. What steps do I need to follow?

TIA,
Rey Guerrero

View 10 Replies View Related

Processing Dataset As A Whole, And Not Per Row

Sep 20, 2007

Which component should be used to process a dataset as a whole, and not on a per-row basis? I need to process a dataset conditionally (the condition is based on the dataset), e.g. if a special row is present in the dataset then the dataset should be processed in a special way. Should I maybe use one Script Transformation to determine whether the dataset satisfies the condition (and store that result in a variable), and then based on that condition (variable value) perform or skip the processing (using a Conditional Split and a Script Transformation)?

View 7 Replies View Related

Need Help With Row By Row Processing In SSIS

Feb 28, 2007

Hello all,

I am having trouble trying to construct the following process in SSIS/SQL
2005:

1. Grab a set of unprocessed rows (ProcessDT = null) in an 'Action' table
2. For each of these rows, execute multiple stored procedures base on the
action type
If actiontype = 1, exec spAct1a @param1, @parm2
exec spAct1b @param1, @parm2, @param3, @param4
If actiontype = 2, exec spAct2a @param1, @parm2, @param3
exec spAct2b @param1, @parm2, @param3
etc....
3. Update ProcessDT so it's not processed again
4. Repeat until all rows are processed

Note - all sp params are contained in additional columns in the Action
table. Basically the Action table is a store for post-event processing of
sorts
but is order dependent, hence the row by row processing. And some of my
servers are 2000 so Service Broker is not an option (yet).

I first attempted to do this totally within the control flow - using an ado
recordset/foreach loop control, but I could not figure out how to run
conditional
process paths based on the ActionTypeID. I then tried to do this within the
dataflow using an OLE DB data source, a conditional split, and an OLE DB
command control,
which almost got me there - the problem being that for each row I need to execute
multiple sp's, and it appears as if the OLE DB Command only gives me one sp.

Any help would be appreciated!

Thanks in advance!
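Since every server involved can run T-SQL, one option (a sketch only; the ActionID key and Param1..Param4 columns are assumptions based on the description above) is to do the whole ordered loop in a stored procedure called from a single Execute SQL Task, rather than in the data flow:

DECLARE @ActionID int, @ActionTypeID int, @Param1 int, @Param2 int, @Param3 int, @Param4 int

DECLARE action_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT ActionID, ActionTypeID, Param1, Param2, Param3, Param4
    FROM dbo.[Action]
    WHERE ProcessDT IS NULL
    ORDER BY ActionID                              -- preserves the required ordering

OPEN action_cur
FETCH NEXT FROM action_cur INTO @ActionID, @ActionTypeID, @Param1, @Param2, @Param3, @Param4
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @ActionTypeID = 1
    BEGIN
        EXEC dbo.spAct1a @Param1, @Param2
        EXEC dbo.spAct1b @Param1, @Param2, @Param3, @Param4
    END
    ELSE IF @ActionTypeID = 2
    BEGIN
        EXEC dbo.spAct2a @Param1, @Param2, @Param3
        EXEC dbo.spAct2b @Param1, @Param2, @Param3
    END
    -- ...further action types follow the same pattern

    UPDATE dbo.[Action] SET ProcessDT = GETDATE() WHERE ActionID = @ActionID   -- mark as processed

    FETCH NEXT FROM action_cur INTO @ActionID, @ActionTypeID, @Param1, @Param2, @Param3, @Param4
END
CLOSE action_cur
DEALLOCATE action_cur

A cursor is slower than a set-based approach, but it keeps the order dependency and allows any number of procedure calls per row, which is where the OLE DB Command falls short.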

View 3 Replies View Related

Processing One Row At A Time

Jul 11, 2007

I'm populating a new table based on information in an existing table. The new table is a list of all "items" and contains a primary key. The old table is a database of receipts where items can appear many times in any order.



I have put together the off-the-shelf components to do this, using a lookup transformation to see if the item is already in the new table. Problem is, because there's so much repetition in the old table I need to process the old table one row at a time. Batch processing is generating errors because the lookup doesn't detect duplicates within the buffer.



I tried setting the "DefaultBufferMaxRows" property of the task to 1, but that doesn't seem to have any effect.



To get the data from the old table, I'm using an OLE DB source. To get the data into the new table, I'm using the OLE DB Command transformation with parameters to execute an INSERT statement.



This is a job I have to do exactly once, so I don't care if I have to run it overnight. I'd rather have a simple, easy to understand but inefficient script so I understand what it's doing completely.



Any help on forcing SSIS to process one row at a time?



Lee Silverman
JackRabbit Sports
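For a strictly one-off load, a purely set-based alternative (a sketch; dbo.Items, dbo.Receipts and ItemName are illustrative names) avoids the in-buffer duplicate problem altogether by letting the database deduplicate:

INSERT INTO dbo.Items (ItemName)
SELECT DISTINCT r.ItemName                   -- collapse the repetition in the old receipts table
FROM dbo.Receipts AS r
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Items AS i
                  WHERE i.ItemName = r.ItemName);   -- skip anything already loaded

DISTINCT removes the duplicates inside the old table and NOT EXISTS skips anything already present in the new one, so the statement can simply be re-run if it is interrupted.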

View 8 Replies View Related

Processing Time

Sep 25, 2007

Hi,



I am verifying my reports processing time. I get the information from the Reporting Service DB - [ExecutionLogs] table. I have the following information:



[TimeEnd] - time that report generation ends.

[TimeStart] - time that reports generation starts.

[TimeDataRetrieval] - amount of time spent running the data sources.

[TimeProcessing] - time spent processing the report.

[TimeRendering] - time spent generating the output format.



If this information is correct the following statement should be true:



([TimeEnd] - [TimeStart]) = ([TimeDataRetrieval] + [TimeProcessing] + [TimeRendering])



But it isn't, ([TimeEnd] - [TimeStart]) is always bigger then ([TimeDataRetrieval] + [TimeProcessing] + [TimeRendering]).



Why does this happen?



Regards,


Rodrigo
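The three phase columns do not cover everything the report server does for a request (the remainder is likely spent on work outside those phases, such as session handling and writing the log entry itself), so a residual is expected. A hedged query for measuring that residual, assuming the standard ExecutionLog columns where the three Time* values are in milliseconds:

SELECT  ReportID,
        TimeStart,
        DATEDIFF(ms, TimeStart, TimeEnd)                             AS TotalMs,
        TimeDataRetrieval + TimeProcessing + TimeRendering           AS AccountedMs,
        DATEDIFF(ms, TimeStart, TimeEnd)
          - (TimeDataRetrieval + TimeProcessing + TimeRendering)     AS OverheadMs   -- the unexplained gap
FROM    dbo.ExecutionLog
ORDER BY TimeStart DESC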

View 5 Replies View Related

Query Processing

Apr 6, 2006

Can anyone explain how query processing works with database mirroring?

View 1 Replies View Related

AS Processing Task

Nov 27, 2005

When using the AS processing task with a connection to "an Analysis Services project in this solution", only some processing options are available for processing dimensions. For instance, it is not possible to select "Process Update". Once I change the connection manager to point to the deployed cube database, I can choose from all the options. Is this by design?

View 1 Replies View Related

Lot And Fifo Processing In Sql

Feb 12, 2008

I need help from sql experts for the following problem

DOCTYPE DATE QTY PRD LOT
Purchase 1 jan 20+ AA 2007FW
Purchase 4 jan 50+ AA 2007SS
Purchase 9 jan 10+ AA 2007FW
Sale 3 jan 10- AA
Sale 4 jan 20- AA
Returned Good 4 feb 10 AA


As you can see I don't have the LOT code in sales records, so I must update these records with the following logic:

I have to assign LOT code to sales in FIFO order:
from the table above the 3rd of January I find the first sale of 10, take the first LOT (ascending date order), check for qty on hand (20+), consume 10 from LOT (10 remaining), update sale record with 2007FW LOT code.

Then I find next sale of 20, as before I take the first LOT with qty on hand to consume, again it's the first record with only 10 remaining so I set LOT qty on hand to zero but I have 10 more to allocate to a LOT code. So I find next available LOT of 50 and consume 10, with 40 remaining.

It's important to remember that I can sell part of a "lot"; I have to track the remaining goods per LOT.

Is it possible to do this in SQL in batch mode? How? If I have to split a sale record consuming 2 or more LOTs, how can I do that? Can you show me SQL or good hints?

Thanx in advance
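Here is a hedged, set-based sketch of one way to do the FIFO matching in a single statement (SQL Server 2005 or later). It assumes the rows live in a table called dbo.Movements with a surrogate key RowID, that QTY is stored as a positive number with DOCTYPE distinguishing purchases from sales, and it ignores returned goods. The idea is to compute cumulative ranges for purchases and for sales per product and take the overlap:

WITH Purchases AS (
    SELECT  p.RowID, p.PRD, p.LOT, p.QTY,
            RunBefore = (SELECT ISNULL(SUM(p2.QTY), 0)            -- quantity purchased before this lot row
                         FROM dbo.Movements p2
                         WHERE p2.DOCTYPE = 'Purchase'
                           AND p2.PRD = p.PRD
                           AND (p2.[DATE] < p.[DATE]
                                OR (p2.[DATE] = p.[DATE] AND p2.RowID < p.RowID)))
    FROM dbo.Movements p
    WHERE p.DOCTYPE = 'Purchase'
),
Sales AS (
    SELECT  s.RowID, s.PRD, s.QTY,
            RunBefore = (SELECT ISNULL(SUM(s2.QTY), 0)            -- quantity sold before this sale row
                         FROM dbo.Movements s2
                         WHERE s2.DOCTYPE = 'Sale'
                           AND s2.PRD = s.PRD
                           AND (s2.[DATE] < s.[DATE]
                                OR (s2.[DATE] = s.[DATE] AND s2.RowID < s.RowID)))
    FROM dbo.Movements s
    WHERE s.DOCTYPE = 'Sale'
)
SELECT  s.RowID AS SaleRowID,
        s.PRD,
        p.LOT,
        -- quantity of this sale drawn from this lot = overlap of the two cumulative ranges
        CASE WHEN s.RunBefore + s.QTY < p.RunBefore + p.QTY
             THEN s.RunBefore + s.QTY ELSE p.RunBefore + p.QTY END
      - CASE WHEN s.RunBefore > p.RunBefore
             THEN s.RunBefore ELSE p.RunBefore END AS QtyFromLot
FROM Sales s
JOIN Purchases p
  ON  p.PRD = s.PRD
  AND s.RunBefore < p.RunBefore + p.QTY
  AND p.RunBefore < s.RunBefore + s.QTY
ORDER BY s.PRD, s.RowID, p.RunBefore

The result is one row per sale per lot it consumes, which also covers the splitting question; updating the sale rows (or writing a sale-to-lot allocation table) can then be done from this result set.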

View 3 Replies View Related

Cube Processing

May 22, 2008



Hi All,

I have 3 cubes in a single SSAS database and these cubes should be processed using the following schedule

Cube 1 - Every Day
Cube 2 - Every Week
Cube 3 - Every Month
Cube 4 - Every Day

The issue that I face is that these cubes share dimensions, and so I can't do a FULL process of these SHARED dimensions as it will affect the other cubes.

I can expect additions and deletions to my dimension data, but the structure remains the same. It would be great if someone could suggest how to go about processing the dimensions. I am confused by the number of options (Process Incremental, Process Update etc.) available for processing the dimensions.

I will be creating a SSIS package to automate the processing. One more question: say Cube 2 fails during a day and Cube 1 has successfully processed earlier the same day - how do I revert back to the old state of Cube 2? Does this mean that I need to do a backup of the SSAS database before processing each cube?

Apologies if I had missed anything.

Thanks!

View 1 Replies View Related

Availability During ETL Processing

Dec 5, 2006

Hi,

I am in the midst of designing a new Data Warehouse system. As we get further into the design of the system, we are realising how complex our ETL is going to be and that the amount of time it will take to run could be significant, i.e. a few days! Obviously I don't want downtime in my relational system for this long, preventing my users from accessing the data for days at a time each month. So what functionality should I be looking at to allow me to maintain a working copy of the data that users can query whilst I perform database updates, and then perform a quick promotion of the updated data to users for querying?

If you can point me in the direction of the right functionality in SQL Server 2005 and possible some relevant white papers that cover this sort of scenario I would be grateful.

regards

Colin

View 1 Replies View Related

Need Help Processing SP Output Parameter

May 7, 2007

Stored procedure works:
CREATE PROCEDURE dbo.VerifyZipCode
    @CZip nvarchar(5),
    @CZipVerified nvarchar(5) output
AS
    IF (EXISTS(SELECT ZIPCode FROM ZipCensus11202006 WHERE ZIPCode = @CZip))
        SET @CZipVerified = 'Yes'
    ELSE
        SET @CZipVerified = 'Not Valid Zip'
RETURN
 Need help calling and processing sp information:
 
protected void C_Search_btn_Click(object sender, EventArgs e)
{
    SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["localhomeexpoConnectionString2"].ConnectionString);
    SqlCommand cmd = new SqlCommand("VerifyZipCode", con);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@CZip", UserZipCode_tbx.Text);

    // . . . how do you set up the output parameter, call the SP and capture the output . . . ?

    if (@CZipVerified == "Not Valid Zip")
    {
        TextBox5.Text = "Zip code not valid please re-enter zip";
    }
    else
    {
        // Continue processing page
    }
}
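For reference, this is how the output parameter behaves when the procedure is called from T-SQL (a sketch, not the page code); on the C# side the equivalent is adding a parameter whose Direction is ParameterDirection.Output, calling ExecuteNonQuery, and then reading that parameter's Value. Note that because @CZipVerified is declared nvarchar(5), the string 'Not Valid Zip' will be silently truncated to 5 characters unless the parameter is widened:

DECLARE @Verified nvarchar(15)
EXEC dbo.VerifyZipCode @CZip = N'30303', @CZipVerified = @Verified OUTPUT
SELECT @Verified AS ZipCheckResult      -- 'Yes', or the (truncated) not-valid message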

View 5 Replies View Related

Processing Incoming Emails

Sep 14, 2004

How would I go about checking incoming e-mails? For example, on a certain e-mail address, I would get e-mails formatted in a certain way. According to the response, some scripts need to run/ some sql tables updates etc. How can one do this in (ASP) .NET with SQL Server? Anyone did this kind of stuff before?

View 3 Replies View Related

Multi-rec Processing In A Trigger & Sp

Mar 28, 2001

SQL 7 SP3

Hi.

I have a trigger that fires for any updates/deletes/adds and logs information via a sp call.

If I have more than one record, is there a better way to process them other than using a cursor ?

Yes, they have to be processed through the sp.

Thanks,

Craig
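If the stored procedure really must be called once per row, a cursor over the inserted (or deleted) pseudo-table is the usual pattern; a sketch with illustrative table, column and procedure names:

CREATE TRIGGER trg_MyTable_Log ON dbo.MyTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    DECLARE @ID int, @SomeCol varchar(50)

    DECLARE row_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT ID, SomeCol FROM inserted        -- use "deleted" for the delete case
    OPEN row_cur
    FETCH NEXT FROM row_cur INTO @ID, @SomeCol
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.usp_LogChange @ID, @SomeCol    -- the existing logging proc, one call per row
        FETCH NEXT FROM row_cur INTO @ID, @SomeCol
    END
    CLOSE row_cur
    DEALLOCATE row_cur
END

Rewriting the procedure's logic as a set-based insert from the inserted/deleted tables would normally be preferred, but when the procedure must stay as-is a LOCAL FAST_FORWARD cursor keeps the overhead as low as a cursor can.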

View 3 Replies View Related

Processing Cube Error 211

Aug 14, 2000

When I process a new cube I receive the error "Error (211): Unknown dimension member ' 8'; Time: 8/11/00 1:09:45 PM". Any ideas about what this error is about?

View 2 Replies View Related

Urgent !!!Processing Time

Jun 25, 2003

hii...

Is there anybody out there who can help me figure out the processing time taken by one transaction using SQL Query Analyzer?

1) For example, I want to run an update in Analyzer and I would like to know the time taken to do this update.

2) How can I reduce the processing time of stored procedures that use cursors? I have added some commit statements in my update statement. Are there any other ways?

Please advise

View 14 Replies View Related

Delays Processing Queries

Dec 28, 2001

I've noticed lately that my queries through ADO/VB are taking a lot longer to process at certain times. The query and the result information never change, only that at certain times the query takes a lot longer than usual. I thought that I possibly need more licenses, or it might be network traffic. I currently use MS SQL Server 2000 Small Business Edition (5 CALs).

Has anyone any information about performance problems due to licensing issues?


Thanks,

Dave Cohoon

View 1 Replies View Related

Daily File Processing Help

Jan 9, 2004

Hello DBA's:

I need to extract a particular file from our AS 400 system on a daily basis and do some processing on it. Also I want to do the daily processing only on those records that have been added/updated since the initial load.
Here is the approach and possible implementation.

Approach
I have the DB2 source data containing 10 columns (Col1 to Col5 together form the key) and the rest are attributes. I am interested only in the key and two attributes. So I load my table with only Col1 to Col7 ( 5 for key and the two attributes). I then do my processing on this table.

Here is the implementation given by a member of dbforums -



You'll then have to deal with 3 potential actions,

INSERT: New records on the file.
DELETES: Records that don't exist.
UPDATES: Records that are on the file, but attributes have changed.


You had given me this code template.

CREATE TABLE myTable99 (
Col1 char(1)
, Col2 char(1)
, Col3 char(1)
, Col4 char(1)
, Col5 char(1)
, CONSTRAINT myTable99_pk PRIMARY KEY (Col1, Col2))

CREATE TABLE myTable00 (
Col1 char(1)
, Col2 char(1)
, Col3 char(1)
, Col4 char(1)
, Col5 char(1)
, CONSTRAINT myTable00_pk PRIMARY KEY (Col1, Col2))
GO

INSERT INTO myTable99(Col1,Col2,Col3,Col4,Col5)
SELECT '1','1','a','b','c' UNION ALL
SELECT '1','2','d','e','f' UNION ALL
SELECT '1','3','g','h','i' UNION ALL
SELECT '1','4','j','k','l'
--DELETED

INSERT INTO myTable00(Col1,Col2,Col3,Col4,Col5)
SELECT '1','1','a','b','c' UNION ALL--NO CHANGE
SELECT '1','2','x','y','z' UNION ALL-- UPDATE (My comment - Instead of an update I want to insert a new record)
SELECT '1','3','g','h','i' UNION ALL--NO CHANGE
SELECT '2','3','a','b','c'--INSERT
GO

SELECT * FROM myTable99
SELECT * FROM myTable00
GO

--DO DELETES FIRST (My comment - Before deleting, I want to copy the rows that I am going to delete on a separate table to maintain history. Then I want to delete from a). I don't get the logic. If the rows of the old extract are not in new extract then delete them. So shouldn't it be <> instead of =. why the where clause condition)
DELETE FROM a
FROM myTable99 a
LEFT JOIN myTable00 b
ON a.Col1 = b.Col1
AND a.Col2 = b.Col2
WHERE b.Col1 IS NULL AND b.Col2 IS NULL

-- INSERT (My comment - I don't get the logic of the where. If the rows of the old extract are not in new extract then delete them. So shouldn't it be <> instead of =)

INSERT INTO myTable99(Col1,Col2,Col3,Col4,Col5)
SELECT a.Col1, a.Col2, a.Col3, a.Col4, a.Col5
FROM myTable00 a
LEFT JOIN myTable99 b
ON a.Col1 = b.Col1
AND a.Col2 = b.Col2
WHERE b.Col1 IS NULL AND b.Col2 IS NULL

-- UPDATE

UPDATE a
SET Col3 = b.Col3
, Col4 = b.Col4
, Col5 = b.Col5
FROM myTable99 a
INNER JOIN myTable00 b
ON a.Col1 = b.Col1
AND a.Col2 = b.Col2
AND ( a.Col3 <> b.Col3
OR a.Col4 <> b.Col4
OR a.Col5 <> b.Col5)
GO
------------

Can anybody look at my comments and answer them, or revise this code template if need be?
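On the history comment in the DELETE section: the LEFT JOIN keeps the equality on the key columns, and the WHERE b.Col1 IS NULL AND b.Col2 IS NULL part is what picks out rows of myTable99 that have no match in myTable00 (an anti-join), so = is correct there; a <> join would pair almost every row with almost every other row. Copying the doomed rows to a history table is then an extra statement just before the DELETE, for example (myTable99_History is an illustrative name):

-- capture the rows that are about to be deleted
INSERT INTO myTable99_History (Col1, Col2, Col3, Col4, Col5)
SELECT a.Col1, a.Col2, a.Col3, a.Col4, a.Col5
FROM myTable99 a
LEFT JOIN myTable00 b
  ON a.Col1 = b.Col1
 AND a.Col2 = b.Col2
WHERE b.Col1 IS NULL AND b.Col2 IS NULL

The INSERT block in the template uses the same anti-join pattern in the other direction (rows of the new extract not yet in myTable99), and for the "insert instead of update" comment the UPDATE block could be replaced by an INSERT that selects the changed rows from myTable00.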

Brett Kaiser - I sent you an e-mail on this. Can you respond to it when time permits.

Thanks

View 9 Replies View Related






