Unexpected Behavior When Setting A Script Task Breakpoint In Package W/ Multiple Script Tasks
Apr 16, 2006
I ran into a variety of problems trying to set a script task breakpoint in a package containing multiple script tasks. The debugger apparently treats the breakpoint as if it were set in ALL tasks in the package, not just the one in which it is actually set.
At best this results in hitting breakpoints in scripts where they are not set and at worst the debugger brings up the "Send error report" dialog and quits (while the package continues to run). The latter seems to happen most often when the script with the breakpoint has more lines than an earlier script and the breakpoint is set at a line number that exceeds the number of lines in the earlier script--it bombs when the earlier, shorter script starts.
To get the debugger to work under these circumstances I had to add some nonsense code like
While False
Dim i as Integer = 0
End While
to every script, at the same line number near the beginning of the script (line 40, for example). I then set a breakpoint on the middle statement in one of the scripts (doesn't matter which) to cause the debugger to open at runtime. It doesn't hit the breakpoint because the line is never executed. If the breakpoint is set on a line that can be executed in any script in the package, bad things tend to happen.
I then add a "stop" statement to the script that I want to debug. This only works if the debugger is already open, hence the dummy breakpoint above.
This workaround is usable, but I am debugging a package that has quite a few scripts and having to insert dummy code in all of them at a fixed line number is rather inconvenient. I would really like to see breakpoints work the way one would expect--only in the scripts where they are set.
Is there some other, easier way around this problem? Is there at least an easier way to get the debugger to open so that "stop" will work?
View 3 Replies
Mar 15, 2007
I cannot set a breakpoint in a Script task and have it complete successfully. I am running Vista 32-bit and only the workstation tools for SQL Server 2005 SP2. The steps to recreate are:
1) Create a new SSIS project/package.
2) Add a Script Task.
3) Set a breakpoint in the Script Task.
4) Run the package.
When I remove the breakpoint, the script task succeeds. When I put it back the task fails. The execution results say "Error: The script files failed to load." I completely uninstalled all SQL Server 2005-related entries using the Programs and Features control panel, rebooted, reinstalled the Workstation Tools, then applied a freshly-downloaded SQL Server 2005 SP2 and rebooted. If I change "PrecompileScriptIntoBinaryCode" to false, the script fails whether there's a breakpoint or not. If it's true (the default), the script only fails when a breakpoint is set. I would like to be able to set a breakpoint to assist in debugging the package. For the time being, I can move the package to a different (non-Vista) OS, where it works fine, but I would like to be able to develop and debug on Vista.
View 4 Replies
View Related
Jan 20, 2006
I'm using a For Loop container with an Execute Package Task inside, looping until a folder is empty. I've noticed some strange behaviors:
1. The child package keeps creating new connections. I start with 3 connections to the DB and when the For Loop container is done I've got 364 connections.
2. The Execute Package Task is pulling the wrong version of the package I've specified. I'm using a package saved to the File System and there's only one copy on the drive. I've verified the path is going to the correct location.
Does anyone have a work-around for the 'connection generation' issue?
TIA
Eric
View 1 Replies
View Related
Apr 22, 2007
SQL Server 2000 SP4. Running the script below prints 'Unexpected':
-----------------------------
DECLARE @String AS varchar(1)
SELECT @String = 'z'
IF @String LIKE '[' + CHAR(32) + '-' + CHAR(255) + ']'
PRINT 'Expected'
ELSE
PRINT 'Unexpected'
-----------------------------
If the @String variable is set to 'y' (or in fact any ANSI character other than 'z'), the result is 'Expected'. The comparison also evaluates as expected if CHAR(255) is replaced with CHAR(254). The server collation, if that matters, is SQL_Latin1_General_CP1_CI_AS. It would be helpful to find the explanation of this behavior. Thanks.
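One likely explanation, offered here as an assumption rather than a confirmed answer: in a dictionary collation such as SQL_Latin1_General_CP1_CI_AS, the range inside a LIKE pattern follows the collation's sort order, not code-point order, and CHAR(255) ('ÿ') sorts with 'y', before 'z', so 'z' falls just outside the range. A minimal sketch that forces code-point ordering instead:
-- assumes the collation explanation above; Latin1_General_BIN compares by code point
DECLARE @String AS varchar(1)
SELECT @String = 'z'
IF @String LIKE '[' + CHAR(32) + '-' + CHAR(255) + ']' COLLATE Latin1_General_BIN
PRINT 'Expected'
ELSE
PRINT 'Unexpected'
-- prints 'Expected', because 'z' (122) is below CHAR(255) in binary order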
View 2 Replies
View Related
Jun 11, 2007
I've recently been struggling with moving a bigint data result from a query into an SSIS variable through the Execute SQL Task. Now, I could just be doing this incorrectly, but I couldn't get it to work at all if I made the variable int64 in SSIS. So here's what I found:
1. bigint from the query into int64 SSIS variable doesn't work at all. It fails.
2. bigint from the query into a string SSIS variable works, BUT it truncates the value. All of a sudden the number 840550000000000 becomes 8405500000, which is a really unexpected result to me.
3. Now, if I cast the bigint to varchar in the query and put it into a string SSIS variable, it works perfectly.
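For reference, a minimal sketch of that third approach (the table and column names are illustrative, not from the original post):
-- cast the bigint in the query itself, then map the single-row result to a String package variable
SELECT CAST(SomeBigIntColumn AS varchar(20)) AS SomeBigIntValue
FROM dbo.SomeTable;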
Is this a bug? If not, how is this expected behavior? Or maybe I'm just nuts after hours of futzing with this.
Thanks,
Michael
View 14 Replies
View Related
Jan 26, 2006
I have several sequence containers in one package that fire off execute package tasks. I would like each of the sequence containers to start at the same time when the job starts running. However, when I set them up to do that, I get an error that the variable cannot be read because it is locked. I have the variables set up as read-only, so I'm not sure why they are being locked. When I run the package and have each sequence container fire off after the previous one ends, it runs fine.
Any ideas on how to get around this?
Thanks!
View 1 Replies
View Related
Jul 25, 2007
What are the advantages and disadvantages of having multiple data flow tasks in one SSIS package?
Is this a good idea at all considering the workflow may be similar now but may change in the future? Should it be left as one data flow per package?
View 1 Replies
View Related
Jul 12, 2007
I am still pretty new to SSIS, SQL, and DOT NET. I came from the UNIX world. I have a SSIS package. On the Control Flow one of the Items I have is a Script Task. I was able to successfully set breakpoints in the Script Task and they worked fine. I could step through the script and check values in variables. Life was good.
Now something happened. I set the breakpoint, from the menu I select "start with debugging" and I get the following window:
-----------------------------------------------------------
Visual Studio Just-In-Time Debugger
An unhandled exception ('System.Runtime.InteropServices.COMException') occurred in DTAttach.exe [3380].
Possible Debuggers:
New instance of Microsoft CLR Debugger 2003
New instance of Visual Studio .NET 2003
New instance of Visual Studio 2005
[_] set the currently selected debugger as the default
[_] Manually choose the debugging engines
Do you want to debug using the selected debugger?
-----------------------------------------------------------
I have tried selecting Yes, but that doesn't work. If I delete all breakpoints the package runs fine. I greatly appreciate your help.
View 3 Replies
View Related
Feb 13, 2007
Hi,
Let's say I have a package taking as parameter "InvoiceID". I want to execute this package as a child in another package. The parent package gets the list of invoices to produce and calls the child package for each entry of the list.
How do I pass the InvoiceID to the child? I know I can use the parent's variables from the child but I don't want the child package to be dependant on the parent package. For example, I might want to execute the "child" package as a stand-alone in development (providing it with a predefined InvoiceID). I might also want to call the same child package from another parent package with other variable names.
What I would like to do is "push" the value instead of "pulling" it. I know it's possible using the command line and the /SET option (ex.: /SET \Package.Variables[InvoiceID].Value;184084)... Is it possible using the Execute Package Task?
Thanks
View 4 Replies
View Related
Sep 23, 2007
I have a child package which is executed several times within the same SSIS ETL. I have placed a break point on one of the child package's tasks, set to trigger on a PreExecute() event. The first time the child package is invoked, the breakpoint is triggered. However, on each successive invocation the breakpoint is ignored. Does anybody know if this behaviour is normal? Thanks in advance!
View 1 Replies
View Related
May 24, 2007
Hi there,
I'm creating an SSIS package that will execute legacy dts packages. The package to be executed is decided at runtime using a sql query task.
Executing a dts package statically works fine, but when I try to set the details of the dts to be run via expressions, I get the error below.
To make it dynamic, I created a variable of type string, and put the package name in here. I also have a string variable for the packageid. Then I set up an expression on the Execute DTS 2000 Package Task that sets the PackageName & PackageID property to this variable.
The PackageId is a string variable I've retrieved using:
select top 1 name,
cast(id AS varchar(50)) id,
cast(versionid AS varchar(50)) versionid
from sysdtspackages
where name = @PackageName
order by createdate desc
When I use the task to set the package id it works fine (by selecting a dts, then changing the name), but when I try to provide the package id I get this message:
Error: 0x0 at Execute DTS 2000 Package Task: System.Runtime.InteropServices.COMException (0x8004040E): Invalid GUID specified.
at DTS.PackageClass.LoadFromSQLServer(String ServerName, String ServerUserName, String ServerPassword, DTSSQLServerStorageFlags Flags, String PackagePassword, String PackageGuid, String PackageVersionGuid, String PackageName, Object& pVarPersistStgOfHost)
at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.PackageUtils.LoadDTS8Package(String pkgSrc, String sourceUser, String sourcePwd, Int32 srcType, Boolean bUseTrustedConnection, String packageName, String packagePassword, String packageID, String packageVersionGUID)
at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.Exec80PackageTask.LoadPackage()
at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.Exec80PackageTask.ExecuteThread()
Task failed: Execute DTS 2000 Package Task
but the method signature specifies String PackageGuid, and it is a string..
any ideas??
I tried casting the variable like so:
(DT_GUID) @[User::TSPkgId], as the versionid is a uniqueidentifier in sysdtspackages. It didn't like that at all (can't cast from type DT_WSTR to DT_GUID, error code 0xC00470C2).
View 8 Replies
View Related
May 22, 2008
I have a dimension with a hierarchy. In this hierarchy it is possible for a non leaf member to get loaded with a fact measure.
Using a sales organization as an example, let's say
(1) Bill and Ted are sales reps, and they both report to John.
(2) John, not only manages Bill and Ted's sales, but he also makes sales of his own.
(3) Bill sells 10 units, Ted sells 8 units, and John sells 5 units.
In this example, John is the non-leaf member because he is above Bill and Ted in the hierarchy. When this data is aggregated in the hierarchy, we want to make sure that 10 for Bill plus 8 for Ted do not overwrite the 5 for John. Instead, we want to include the 5 for John with Bill and Ted's total of 18, thus getting a total of 23 for John.
Here is my question. What is the default aggregation behavior? The behavior I get is that the non leaf data is overwritten by the sum of the leaf data. So in the example above, I get 18 for John, not 23.
Now, from my research, I think the MembersWithData setting on a dimension attribute has something to do with this, but I can't nail it down.
View 6 Replies
View Related
Jan 21, 2008
We're trying to get a better understanding of how RS behaves when parameters are being set. We see quirky behavior that is a little difficult to describe. Right now we assume that if the revolving green circle (with the phrase "Report is being generated" beneath it) doesn't appear, the report really wasn't rendered properly, even if the report region changes.
One peculiarity that seems pretty consistent is on reports we've prototyped with "from" and "to" date parameters. It seems that when we set one date (doesn't matter which is 1st) things progress normally, i.e. no "report clearing event" occurs as a result of setting cursor focus in the calendar control and changing its value. The report region doesn't change from what showed previously. But trying to set focus on the 2nd (doesn't matter if it's the "from" date or the "to" date, just that it's the 2nd date being set) always seems to trigger some kind of event that 1) doesn't allow focus to be on that text box, and 2) blanks out the report region including headings. Only after this "event" occurs can we set focus on the 2nd date, change the value and click the "view report" button for rerendering.
We see similar types of behavior with other types of parameters that include multi value dropdowns and booleans. The toughest part of this is trying to explain it to our users. On some parameters, the event always occurs every time they are changed. On other parameters, it appears that the event only occurs if another parameter was changed beforehand.
I believe we've even seen headings rendered with no data, thinking temporarily that no rows were returned, just to find out by clicking the "view report" button that there really was data to be reported based on the current filters. Unfortunately I can't reproduce this scenario when I want to.
View 3 Replies
View Related
Oct 22, 2006
I got this error message during SQL Server Express install with no error code on Server 2003 SP1. I'm installing on an older development machine and although I do have a very large log file somewhere, I wanted to post before I even dug into it. I noticed in IIS under my Webs that the MSSQLExpress webs were there at first, but when I clicked on them to get any info or Properties... they disappeared. So they were almost installed but they no longer exist. Also, I do not have a SQL Service Manager icon down near the clock.
I almost took a step to fix this. After reading a few posts about similar problems I thought maybe this happened because the Default Web Site's IP # was not set to (All Unassigned). I was going to try a repair install from Add/Remove Programs but the temporary setup files are already gone. I really don't want to uninstall all 5 SQL components from Add/Remove Programs and start from scratch.
As good security practice we usually remove the Default Web, or as in this development server's case we changed the IP from (All Unassigned) to an IP that is only accessible from certain machines on the internal network. I obviously should have read more pre-setup guides. Anyway, how can I fix this without reinstalling everything?
View 2 Replies
View Related
Aug 9, 2007
I would like to know what happens when a very large reference data set for a lookup transform with full caching enabled is getting loaded during package execution and the computer memory runs out or is very low.
Does SSIS
a) give an out of memory error of some sort
b) resort to a no caching or partial caching mode
c) maintain the full caching mode but switch to using the paging file (virtual memory).
I think it will resort to using the page file, in which case the benefits of in-memory lookups are lost and performance would suffer. If I cannot upgrade the memory or shrink the reference set somehow, I should switch that lookup task to use partial caching or no caching with an indexed lookup table. Would this make sense?
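If the fallback does end up being partial or no caching, an index on the reference table's join key is what keeps per-row lookups tolerable; a minimal sketch with illustrative names (ReferenceTable and BusinessKey stand in for the actual lookup table and join column):
-- supports the "no caching with an indexed lookup table" option mentioned above
CREATE NONCLUSTERED INDEX IX_ReferenceTable_BusinessKey
ON dbo.ReferenceTable (BusinessKey);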
View 1 Replies
View Related
Apr 3, 2008
I can't find information on how you are suppose to handle the TransactionOption setting inside of a custom task.
I have a custom task that I have developed, and it basically calls a COM+ object that writes data to a database. When I have the task inside of a container that has the TransactionOption set to Required, and my custom task is set to Supported, if one of the X items fails to execute in my custom task I am telling my task to fail the parent, which I thought would roll back everything. But it does not.
Is there someplace that I need to write rollback code in custom SSIS tasks? If there is I can't find any mention of it anywhere. Any examples out there on how to build custom SSIS tasks that support the TransactionOption parameter?
Thanks!
Matt
View 4 Replies
View Related
Jul 23, 2005
I've got a DTS package that runs an ActiveX script. The script is simple - it runs a stored procedure and saves the results to a CSV file. I kept getting this error message when trying to run it saying that the recordset object I was using could not be used when closed. Well, it didn't make a whole lot of sense to me as to why that was happening, and it doesn't relate to my question except to give you a sense of what I'm trying to do. After spending an inordinate amount of time on that... I decided to just create a SQL Server connection object and an Excel object and then use a transformation to load the query results. Simple enough, or so I thought. So in the transformation object under the Source tab, I typed in the query to run the stored procedure:
Declare @S nvarchar(30)
Declare @E nvarchar(30)
SET @S = Convert(nvarchar(30), GetDate()-1, 101) + ' 12 AM'
SET @E = Convert(nvarchar(30), GetDate()-1, 101) + ' 11:59:59 PM'
exec CTI_REPORT_Q_ACTIVITY_DETAIL @S, @E
And then I clicked on the preview button. I got the message that no rowset was returned. In a way, that explains the issue with the ActiveX script. BUT, I know darn well it returns data. It returns 438 rows of data when I run this in Query Analyzer.
So, here's my question... how could that be? Is there some issue that DTS packages have with temporary tables? I do use a couple in the stored procedure. Without having to post the stored procedure and tables, etc., could someone let me know if they've run into something like this before?
Thanks,
Jennifer
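One commonly suggested cause (an assumption here, not verified against this package): when a procedure works with temporary tables, the extra "rows affected" messages can keep DTS from detecting the rowset during preview. A hedged sketch of the usual first thing to try, adding SET NOCOUNT ON ahead of the existing query:
-- minimal sketch; the procedure call is the one from the post above
SET NOCOUNT ON
Declare @S nvarchar(30)
Declare @E nvarchar(30)
SET @S = Convert(nvarchar(30), GetDate()-1, 101) + ' 12 AM'
SET @E = Convert(nvarchar(30), GetDate()-1, 101) + ' 11:59:59 PM'
exec CTI_REPORT_Q_ACTIVITY_DETAIL @S, @E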
View 1 Replies
View Related
Jul 5, 2006
In Executing "E:\EmailDelivery.exe" "EP12A 4" at "", The process exit code was "-532459699" while the expected was "0".
View 4 Replies
View Related
May 1, 2007
Hello,
I am looking for some confirmation on a behavior of the SSIS Script Task. I have a custom script task that takes an input file, and archives it after it has been processed into the database.
When I run this package in the Visual Studio GUI, if the destination drive is full, it throws an exception telling me that there is not enough disk space. So, my questions are:
1) If this happens when the package is running through the command line, would this exception still be thrown? (I am thinking it will be)
2) Also, do I need to explicitly fail the script task in the event handler in order to ensure this .NET exception being thrown will cause the component to fail? (I am fairly certain I do, since this is what I had to do inside of the Visual Studio GUI, but does anyone know if this same behavior would occur when running from the command line?)
Thanks,
Chris
View 1 Replies
View Related
Jan 3, 2008
Hello all-
I am currently trying to deploy a production package to our servers and am seeing some very strange behavior. I am attempting to use a File System Task inside a Foreach Loop Container that renames (i.e. moves) a file from a processing location, appends a date, and places it in a different directory. The source file is coming from a connection manager whose file name is generated dynamically. The destination file is coming from a variable that is set in the foreach as a preceding task. I experience two different behaviors for the File System task depending on where/how I run the package...
Inside the VS solution on the server & from the SSIS MSDB store on the server
The package behaves as intended. The file is correctly processed then renamed/moved to the correct location.
From the SSIS MSDB store on my local machine
The file is correctly processed, however strange behavior occurs in the renaming task. The informational message in the execution window says that it deleted the newly renamed file (the same message displayed in the VS solution results). Upon checking the file system, the file is neither in the source directory nor in the destination.
Does anyone have some insight as to why these might be behaving differently? My only guess is that it's some sort of file system permission (even though I have domain credentials for a local admin on the server), but I can't imagine why it would allow me to delete the file and not rename it... Though finding the source of the problem would be nice, my biggest question is whether or not the package will behave correctly if I setup a schedule for it.
View 6 Replies
View Related
Dec 20, 2007
Hello Everyone,
I've seen many entries about trailing spaces but have not found one like this.
In the Control Flow I am using an "Execute SQL Task" to populate some SSIS local variables (type string) by: (1) executing a SQL stored proc with output variables (type varchar(100)) to (2) be mapped to the local variable name (the parameter mapping Data Type is VARCHAR).
One of these mapped outputs is used as a path for subsequent operation in the Control Flow. At execution the sproc fires, populating the local variable with the path but with trailing spaces out to 255. Later in the "Script Task" when that path is used I receive an error telling me that the path is too long, and something about 260 or 246 characters.
Here's the oddity. I have two desktop environments running XP and a server environment (Server 2003). This package runs just fine on the server - no trailing space issue, no need to trim. But on both my desktops I get the errors. By adding trim statements I can get back the correct path, but varchars should not include trailing spaces, and the sproc return variable is a varchar(100).
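For completeness, the defensive form of that trim, applied where the procedure assigns its output parameter (an assumption; @OutPath is an illustrative name, not the one in the original procedure):
-- inside the stored procedure, just before returning, so padding never reaches the package
SET @OutPath = RTRIM(@OutPath)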
I know this sounds like numerous other posts which indicate the solution is to trim, but I think the question I am asking is why does it work on the server but not the desktop? Is the SSIS variable type string experiencing a bug on different OSs?
Not to further complicate the issue, but it used to work on my laptop too; through a horrible sequence of events I had to reload the studio, after which the error started happening there as well.
View 6 Replies
View Related
Mar 28, 2006
How does database connection pooling work for multiple "copies" of the same web application on the same server? My IIS 5 win2K server has 32 sites, all pointed to the same code. Each site responds to a different IP/URL(www.a.com, www.b.com, etc.) for SSL reasons. DB is SQL 2000.
Based on perf stats, as a new site spins up, a new pool is created. So when I hit site 1, I see 10 pooled connections. Site 2 bumps it up to 20, etc. However, I don't see it go up to 320, so it isn't linear.
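A hedged way to observe this from the SQL 2000 side (the database name is illustrative): count current connections per client host and application, and watch how the totals grow as each site spins up.
SELECT hostname, program_name, COUNT(*) AS connection_count
FROM master.dbo.sysprocesses
WHERE dbid = DB_ID('MyAppDb')
GROUP BY hostname, program_name
ORDER BY connection_count DESC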
BTW, I'm trying to build a case for having 1 site/application that all URL's point to in order to conserve memory, connection pooling, caching, etc.
Any help is greatly appreciated.
-Allen
View 2 Replies
View Related
Jun 29, 2006
I am using an Execute Package Task to execute another package. I am giving the connection string for the package to execute. It works fine on my development machine, but when I try to run it on another server after deploying it, it looks for the data source path of the DTSX file in the same location.
How do I set the path according to each server where the dtsx file is stored, or is there any other method of storing it, like a connection string?
If I store it in the parent package variable, where should I point to...
thanks
aa
View 1 Replies
View Related
Sep 8, 2006
Hi,
I am making use of the DtUtil tool to deploy my package to SQL Server.
Following is my configuration:
32-bit machine and 32-bit named instance of Yukon.
I have some package variables which need to be set in the code.
Previously I did it as follows:
Set the package variables in the code. For example:
pkgFile.Variables["User::DestinationServerName"].Value = <myvalue>
Deploy the package as follows:
applnObj.SaveToSqlServer(pkgFile, null,
destinationServer, null, null);
Here the package was successfully deployed, and when I open those packages using BIDS I am able to see that the variables are set to the values as done in the code.
Because of one problem I am not using the SaveToSQLServer method, so I switched to the DTUtil tool.
Now I am doing it this way:
Set the package variables as before.
Deploy the package to SQL Server using DTUtil tool.
Now here is the problem:
The package is successfully deployed. But the variables are not set to the value that I have specified in the code.
I also tried the DTexec utility to set the package variable. Even that doesn't work.
Can anyone help me out? Is there any alternate method to set package variables?
Thanks,
Sandhya
View 8 Replies
View Related
Dec 11, 2014
I have a scenario where I have to run an update task on multiple servers in parallel, and once all of them are completed (success or failure) another task is to be run on another server.
1. In a maintenance plan, if we add tasks which are not joined, will they run in parallel at the same time?
2. If we link the last task to all the tasks with link type 'Completed', will the last task run after all tasks are completed, or when any one of them is completed? (I have a big doubt here.)
The business requirement behind this is to bring data from multiple servers into shadow copies locally and then process them together. It's OK if some server's data transfer fails, but it's not OK to start processing centrally while data transfer is going on. Further, we want to run the data transfer from multiple servers in parallel to save time.
View 0 Replies
View Related
Mar 29, 2008
Thanks in advance in reading this thread.
I have developed a big SSIS package to extract data from flat files (200+ data flows).
The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files. I'm running these validations in parallel, so that when a file arrives it enters the "validation process" and extraction of the file starts.
When I run the SSIS package from BIDS it works the way I designed it... but when I run the package on the server, the tables that are loaded through the process are only "available" when the SSIS package ends. It is imperative that, throughout the process, when a table receives new data it becomes ready, and is not just available when the SSIS package finishes...
I have attached a .jpeg.
It is important for the tables to be available, so that the stored procedures (outside the SSIS package) that depend on some tables start working before the SSIS package ends.
Thanks in Advance.
View 4 Replies
View Related
Jun 26, 2007
Hello,
I have done a search and have read some of the posts, but am left more confused than before. I am fairly new to SSIS. Here is my situation and what i am trying to accomplish.
I have a package that has a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails, when i re-run the package it will run the last task as well as all the other tasks. What I am looking to accomplish is when the package is re-run, have the SQL tasks that failed ran and not the previous successful tasks.
I think the best way would be via disabling tasks on successful completion of a task, where it writes the name of the SQL task to a temp table, but I am skeptical.
Can anyone point me in a direction to help me accomplish what I am looking for please.
Thanks in Advance.
View 4 Replies
View Related
Feb 6, 2007
Hi,
I have a Flat File Source and I want to retrieve a few of its properties in a Script Component. How do I do that?
Also, how could I make the file path of the Flat File Source or connection manager dynamic, or configurable through some file?
any input is appreciated.
Fahad
View 1 Replies
View Related
Feb 7, 2007
Hi everyone,
Once I've accessed the package by means of the LoadFromSqlServer method, how do I read its SQL Tasks, for example?
I'm trying with the Executables property, but with unsuccessful results:
pkg.Executables.Item(0)
Thanks in advance,
View 14 Replies
View Related
Feb 12, 2007
Hi,
We want to develop an error handling process that will log the errors to multiple destinations (event log, text files or SQL database) depending upon a variable set in the package. We also want this error handling process to be initiated by all the tasks in the package on error.
Is this possible? Can the same event handler be called from multiple tasks in the package? Also in the event handler can we call another package which actually does the error handling. This way we have only one place to change our error handling process in case required.
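For the SQL database destination, a minimal sketch of a log table that a shared error-handling package could write to (all names here are illustrative, not an existing schema):
CREATE TABLE dbo.PackageErrorLog (
LogID int IDENTITY(1,1) PRIMARY KEY,
PackageName nvarchar(200),
TaskName nvarchar(200),
ErrorDescription nvarchar(4000),
LoggedAt datetime DEFAULT GETDATE()
);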
Thanks in advance for your help.
$wapnil
View 4 Replies
View Related
Mar 31, 2008
This may sound a little anal-retentive, but I have a number of SSIS packages that, when I open them, the first thing I have to do is scroll to the left or up to get to where the tasks are displayed. Even if I move the tasks right or down, they still end up in that initial position. This happens even if I use auto arrange.
Is there a way for me to set the package so it has a consistent point or display at which it opens?
View 6 Replies
View Related
Feb 5, 2008
I have put together the stored-procedure below to carry out update, delete and insert queries in one visit. The code has been pieced together from the pages listed at the bottom. I pass the procedure three XML strings and after testing it a few times it seems to work fine. I’m fairly new to stored procedures though, so I was hoping someone would answer these questions:
1. Is this an acceptable way to do this? Can you foresee any problems?
2. I want to make this an ‘all-or-nothing’ event, i.e. if any part of the procedure fails, it must all fail. How would I achieve that?
3. I want to know in my calling code what the result is. I’ve used output parameters before, but I’m unsure how to combine one with 2 above (see the sketch after the procedure below).
Sorry this is a long script, but I’ve removed most of the column names and values to shorten it. Thanks in advance.
CREATE PROCEDURE amendPageRecords
@PagesToUpdate xml,
@PagesToDelete xml,
@PagesToInsert xml
AS
BEGIN
-- UPDATING RECORDS ------------------------------------------------------------
DECLARE @UpdateTable TABLE (PageID int, PageType varchar(10))
INSERT INTO @UpdateTable (PageID, PageType)
SELECT PageID = UpdatePages.Item.value('@PageID', 'int'),
PageType = UpdatePages.Item.value('@PageType', 'varchar(10)')
FROM @PagesToUpdate.nodes('Pages/Page') AS UpdatePages(Item)
UPDATE page
SET page.page_type = UP.PageType
FROM page INNER JOIN @UpdateTable UP ON page.page_id = UP.PageID
-- DELETING RECORDS ------------------------------------------------------------
DECLARE @DeleteTable TABLE (PageID int)
INSERT INTO @DeleteTable (PageID)
SELECT PageID = DeletePages.Item.value('@PageID', 'int')
FROM @PagesToDelete.nodes('Pages/Page') AS DeletePages(Item)
DELETE FROM page
FROM page INNER JOIN @DeleteTable DP ON page.page_id = DP.PageID
-- INSERTING RECORDS -----------------------------------------------------------
DECLARE @InsertTable TABLE (SiteID int, PageType varchar(10))
INSERT INTO @InsertTable (SiteID, PageType)
SELECT SiteID = InsertPages.Item.value('@SiteID', 'int'),
PageType = InsertPages.Item.value('@PageType', 'varchar(10)')
FROM @PagesToInsert.nodes('Pages/Page') AS InsertPages(Item)
INSERT INTO page
SELECT SiteID, PageType
FROM @InsertTable
END
GO
Code taken from:
http://weblogs.asp.net/jgalloway/archive/2007/02/16/passing-lists-to-sql-server-2005-with-xml-parameters.aspx
http://www.eggheadcafe.com/articles/20030627c.asp
http://www.sommarskog.se/arrays-in-sql-2005.html
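Regarding questions 2 and 3 above, a hedged sketch (an outline assuming SQL Server 2005's TRY/CATCH is available, not the poster's final code): wrap the three blocks in one transaction so they succeed or fail together, and report the outcome through an output parameter.
-- sketch only; the body keeps the UPDATE / DELETE / INSERT blocks from the procedure above
CREATE PROCEDURE amendPageRecords
@PagesToUpdate xml,
@PagesToDelete xml,
@PagesToInsert xml,
@Result int OUTPUT -- 0 = success, 1 = failure (assumed convention)
AS
BEGIN
SET NOCOUNT ON
BEGIN TRY
BEGIN TRANSACTION
-- ... the UPDATE / DELETE / INSERT blocks go here ...
COMMIT TRANSACTION
SET @Result = 0
END TRY
BEGIN CATCH
IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION
SET @Result = 1
END CATCH
END
GO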
View 10 Replies
View Related
Aug 16, 2007
I'm trying to do some custom SSIS logging using event handlers, similar to the ideas provided by Jamie Thomson in the past. My problem is that when I use System::SourceID as one of the items to be logged, I can't match up the SourceID to any of the GUIDs that are displayed in the property window for the various tasks in my package.
Where is this sourceID coming from and how can I track it down?
Thanks for any insight on this.
John Woods
View 14 Replies
View Related