How To Know What BRKR TASK (or Any Background Task For That Matter) Is Doing?
Apr 3, 2008
Sometimes we have some pretty serious CPU spikes, and I would like to be able to pinpoint the exact cause. Through the DMVs I have been able to associate a session with an operating system thread. That's great if the problem is a batch with a non-NULL sql_handle. However, how do I further narrow down what the background Service Broker tasks are doing using the DMVs? Or am I just at the blind mercy of SQL Server at this point? What can I query to know what is going on?
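For reference, here is roughly the session-to-thread mapping I am using; a sketch against the SQL Server 2005 DMVs, and sys.dm_broker_activated_tasks is the only broker-specific DMV I have found so far:

SELECT t.session_id,
       th.os_thread_id,
       r.command,
       r.sql_handle
FROM sys.dm_os_tasks AS t
JOIN sys.dm_os_workers AS w ON t.worker_address = w.worker_address
JOIN sys.dm_os_threads AS th ON w.thread_address = th.thread_address
LEFT JOIN sys.dm_exec_requests AS r ON t.session_id = r.session_id
WHERE t.session_id IS NOT NULL;

-- Broker-activated tasks (activation procedures), if any, show up here:
SELECT * FROM sys.dm_broker_activated_tasks;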
OK. I give up and need help. Hopefully it's something minor ...
I have a dataflow which returns email addresses to a recordset.
I pass this recordset into a ForEachLoop configuring the enumerator as (Foreach ADO Enumerator). I also map the email address as a variable with index 0.
I then have an Execute SQL Task which receives this email address as a varchar variable (parameter 0), which I use in my SQL command to limit the rows returned. To troubleshoot this problem, I have commented out the WHERE clause and returned all rows regardless of email address. In either event, I then use a result set to store the query result, of type Object and result name 0.
I then pass this result-set variable into a Script Task to start parsing the SQL rows returned as type Object. (I assume this is the correct way to do this, based on other prior posts.)
The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or the query doesn't return anything. However, I am certain the query works as it executes just fine at the command prompt.
My intent is to email the query results to each email address, passing the parsed data from the script to a Send Mail Task, with the following type of data. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource, and define the MessageSourceType as a variable in the mail task.
part number leadtime
x 5
y 9
....
Does anyone have any idea what I might be doing wrong?
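For reference, the parsing pattern I am attempting in the Script Task looks roughly like this; a sketch, where "EmailQueryResults" is my object variable (listed in the task's ReadOnlyVariables), "MessageSource" is in ReadWriteVariables, and the column names are placeholders:

Imports System.Data
Imports System.Data.OleDb

Public Sub Main()
    ' Load the ADO recordset held in the object variable into a DataTable.
    Dim dt As New DataTable()
    Dim adapter As New OleDbDataAdapter()
    adapter.Fill(dt, Dts.Variables("EmailQueryResults").Value)

    ' Build the mail body from the rows.
    Dim body As New System.Text.StringBuilder()
    For Each row As DataRow In dt.Rows
        body.AppendLine(row("PartNumber").ToString() & " " & row("LeadTime").ToString())
    Next

    ' Hand the text to the Send Mail Task via the string variable.
    Dts.Variables("MessageSource").Value = body.ToString()
    Dts.TaskResult = Dts.Results.Success
End Sub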
I'm using SSIS in Visual Studio 2012. My Execute SQL Task calls a Stored Procedure where I have a TRY-CATCH. Last week there was a problem and the CATCH was executed and logged an error to my error table, but for some reason the Execute SQL Task didn't fail. Is there a setting to make the Execute SQL Task fail when an SP encounters a failure?
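For illustration, what I would expect to surface the failure is a CATCH block that re-raises after logging; a sketch, where dbo.ErrorLog stands in for my logging table (THROW is available since SQL Server 2012):

BEGIN TRY
    -- ... the procedure's real work ...
    SELECT 1 / 0;  -- placeholder that forces an error
END TRY
BEGIN CATCH
    INSERT INTO dbo.ErrorLog (ErrorNumber, ErrorMessage, LoggedAt)
    VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), SYSDATETIME());
    THROW;  -- re-raise so the Execute SQL Task sees the error and fails
END CATCH;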
I am trying to create a simple BI Application for SSIS. In Visual Studio 2005 I just get a Data Flow Task from the toolbar and add it to the project. When I double click it I get the following error:
The task with the name "Data Flow Task" and the creation name "DTS.Pipeline.1" is not registered for use on this computer.
Then when I try to delete it, it gives this other error:
Cannot remove the specified item because it was not found in the specified Collection.
I am creating this application in an administrator account in this computer, so I doubt the problem is related to permissions. I am running SQL Server 2005 and Visual Studio 2005 in WinXP Tablet PC Edition.
Any suggestions why this is happening and how to fix it?
I am using the "Transfer SQL Server Objects Task" to copy some tables from database A to database B including data.
The tables, primary key constraints, foreign keys, and data all transfer nicely, except for the DEFAULT constraints on the tables.
I have failed to find any option in the "Transfer SQL Server Objects Task" to explicitly say "copy default constraints", so I would expect it to happen automatically, but it doesn't. I hope it is not a bug :-)
I am using SQL 2005 SSIS. I am joining several large tables and then moving the result into another table in the same database.
I would like know which method is faster:
Use an Execute SQL Task to insert the result set into the target table, or
Use a Data Flow Task to insert the result set into the target table (use an OLE DB source to execute the SQL command, then a SQL Server destination). Could you tell me why one is faster and the other slower?
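For context, the Execute SQL Task option boils down to a single set-based statement along these lines (the table and column names are placeholders); because the source and target live in the same database, it runs entirely inside the engine, whereas the data flow pulls every row out through the pipeline and back in:

INSERT INTO dbo.TargetTable (ColA, ColB, ColC)
SELECT a.ColA, b.ColB, c.ColC
FROM dbo.LargeTableA AS a
JOIN dbo.LargeTableB AS b ON b.KeyA = a.KeyA
JOIN dbo.LargeTableC AS c ON c.KeyB = b.KeyB;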
I have a SQL Task that calls a stored procedure and returns an output parameter. The task fails with the error "Value does not fall within the expected range." The stored procedure is defined as follows:

CREATE PROCEDURE [dbo].[TestOutputParms]
    @InParm INT, @OutParm INT OUTPUT
AS
    SET @OutParm = @InParm + 5

The task uses an OLE DB connection and has a source type of Direct Input. The SQL statement is:

EXEC TestOutputParms 7, ? OUTPUT

The parameter mapping is:

Variable Name: User::OutParm | Direction: Output | Data Type: LONG | Parameter Name: @OutParm
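In case it is the mapping: with an OLE DB connection, the Execute SQL Task identifies parameters by 0-based ordinal rather than by @ name, so a configuration like this sketch may be what is needed (untested):

SQL Statement: EXEC TestOutputParms 7, ? OUTPUT
Parameter Mapping: Variable Name: User::OutParm | Direction: Output | Data Type: LONG | Parameter Name: 0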
In the Control Flow tab, I have an Execute SQL Task that outputs a full result set into a variable of type Object. Now, how can I write the contents of the full result set into a text file using a Script Task? I also want to format it the following way while I write it out:
Column Name 1 : Column Value
Column Name 2: Column Value and so on
I tried writing the contents of the Object variable into a file, but the file contained a single word: System.__ComObject.
Code for Writing the Full Result Set into a Text File
Dim RSsqloutput as String = Dts.Variables("objVariable").Value.ToString
Dim strVal as String = "File completed on " & Now() & vbCrLf & "------------------------------------------------------" & vbCrLf
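The underlying issue with the snippet above is that ToString on the COM recordset just prints the type name; the recordset has to be loaded into a DataTable first. A sketch of what I am aiming for (the output path is a placeholder):

Imports System.Data
Imports System.Data.OleDb
Imports System.IO

Public Sub Main()
    ' Load the result set stored in the object variable into a DataTable.
    Dim dt As New DataTable()
    Dim adapter As New OleDbDataAdapter()
    adapter.Fill(dt, Dts.Variables("objVariable").Value)

    Using writer As New StreamWriter("C:\output\resultset.txt")
        writer.WriteLine("File completed on " & Now())
        writer.WriteLine("------------------------------------------------------")
        For Each row As DataRow In dt.Rows
            ' One "Column Name : Column Value" line per column.
            For Each col As DataColumn In dt.Columns
                writer.WriteLine(col.ColumnName & " : " & row(col).ToString())
            Next
        Next
    End Using

    Dts.TaskResult = Dts.Results.Success
End Sub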
In short, does the "Transfer SQL Server Objects Task" support distributed transactions?
I'm trying to use a "Transfer SQL Server Objects Task" in a container, with a transaction on the container. The task is set to support the transaction. It is set up to copy table data from several tables on a non-domain server (SQL Server 2000) to a domain-based server (SQL Server 2005). I get an error stating, "This task can not participate in a transaction".
I am wondering if it means exactly what it says: this task in SSIS can't participate at all. Or does it mean that it won't in this scenario for some reason? I attempted a simple copy of data from MSSQL 2005 to MSSQL 2005 (the same server) and the task still failed. MSDTC appears to be running properly on my machine (I can do a simple distributed transaction across a linked server to the 2000 server in Query Analyzer (QA)). Also, MSDTC appears to be working on both servers, verified with distributed transaction query tests in QA.
Here's the error info…
SSIS package "Development BusinessContacts and Products Migration.dtsx" starting. Information: 0x4001100A at Copy BusinessContacts Data: Starting distributed transaction for this container. Error: 0xC002F319 at Copy BusinessContacts database table data 1, Transfer SQL Server Objects Task: This task can not participate in a transaction. Task failed: Copy BusinessContacts database table data 1 Information: 0x4001100C at Copy BusinessContacts database table data 1: Aborting the current distributed transaction. Information: 0x4001100C at Copy BusinessContacts Data: Aborting the current distributed transaction. SSIS package "Development BusinessContacts and Products Migration.dtsx" finished: Failure. The program '[4700] Development BusinessContacts and Products Migration.dtsx: DTS' has exited with code 0 (0x0).
I have an application that fetches records from a database (MS Access 2000), and I have to use the results in a Script Task. At present the record-fetching query and the connection string live in the script itself, and I would like to keep them independent of it. Is there any control flow tool (like the Execute SQL Task) that can fetch the result set from Access so that the results can be used in a Script Task?
I have a stored procedure that is executed via an Execute SQL Task and returns a full result set, which I map to a variable of type Object. Is there a way to use this variable as a data source in a subsequent Data Flow Task?
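What I have in mind is a Script Component configured as a source inside the Data Flow Task, along these lines; a sketch, assuming the object variable is named ResultSet and listed in the component's ReadOnlyVariables, and that Output0 has a matching PartNumber column defined:

Public Overrides Sub CreateNewOutputRows()
    ' Load the recordset held in the object variable into a DataTable.
    Dim dt As New DataTable()
    Dim adapter As New OleDb.OleDbDataAdapter()
    adapter.Fill(dt, Me.Variables.ResultSet)

    ' Emit one pipeline row per DataTable row.
    For Each row As DataRow In dt.Rows
        Output0Buffer.AddRow()
        Output0Buffer.PartNumber = row("PartNumber").ToString()
    Next
End Sub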
I have made a package which extracts data from the source, does transformations, and submits the data to the destination. It then also updates the required control files.
Now I want to add some functionality:
If the package is executed again, it should check the status of the previous execution in the control file: if it was a success, mark all tasks disabled and stop;
if it was a failure, mark all tasks enabled and start extracting data again, continuing with the execution.
I was able to attain similar functionality in SQL Server 2000 using an ActiveX script. What code do I need to write in a Script Task to attain the above functionality?
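The direction I am considering is a small Script Task at the start of the package that reads the control file into a Boolean variable, with the other tasks' Disable properties bound to that variable through expressions; a sketch (the path, status convention, and variable name are all placeholders):

Public Sub Main()
    ' Read the status written by the previous run.
    Dim controlFile As String = "C:\etl\control\last_run_status.txt"
    Dim status As String = ""
    If System.IO.File.Exists(controlFile) Then
        status = System.IO.File.ReadAllText(controlFile).Trim().ToUpperInvariant()
    End If

    ' Downstream tasks bind their Disable property to this variable via expressions.
    Dts.Variables("PreviousRunSucceeded").Value = (status = "SUCCESS")
    Dts.TaskResult = Dts.Results.Success
End Sub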
For the first time I'm testing this task, and surprisingly, when I try the "Edit Package" option I get:
1) The DTS host failed to load or save the package properly
2) The selected package cannot be opened
3) Error HRESULT E_FAIL has been returned from a call to a COM component
But after these messages you can see all the tasks, except that they have no names!
It seems as if the RCW mechanism between managed and unmanaged code has partially failed.
I don't dare to do anything further; I don't know whether the package is correctly loaded from there or not.
My control flow is: 1: an Execute SQL Task that issues a Truncate Table command; 2: Data Flow Task: DataReader source -- Script Component -- OLE DB destination (SQL Server 2005, a single table, always around 600,000 rows).
How do I set up a transaction so that, if there is a failure, the Truncate Table command will roll back and the OLE DB destination (a single SQL Server table) will be left the same as it was before the load started?
Another question: with that volume of data, 600,000 rows, is a truncate table practical in a transaction?
Hi, I need to trigger some packages upon the existence of specific files in a particular directory. Sounds like the File Watcher Task (from SQLIS) would do the job, but I am wondering what the difference is between using this tool and a Foreach Loop container. I mean, if a file exists in a directory, the Foreach Loop container will detect it. Since the File Watcher is not a service, the package containing it needs to be scheduled on a regular basis for the File Watcher to detect the file, right? So a Foreach Loop container would do the job? What, then, would be the advantage of the File Watcher Task?
A common issue that I run across with clients is that they only want to process a file once it has finished transmitting to the server. This SQL Server 2005 task reads the properties of a file and writes the values to a series of variables. For example, you can use this task to determine whether the file is in use (still being uploaded or written to) and then conditionally run the Data Flow Task to load the file if it's not. You can also use it to read when the file was created, in order to decide whether the file must be archived.
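Without that task, a Script Task can approximate the in-use check by attempting an exclusive open; a sketch (the file path and variable name are placeholders):

Public Sub Main()
    Dim inUse As Boolean = False
    Try
        ' An exclusive open fails while another process is still writing the file.
        Using fs As New System.IO.FileStream("C:\inbound\data.csv", _
                System.IO.FileMode.Open, System.IO.FileAccess.Read, System.IO.FileShare.None)
        End Using
    Catch ex As System.IO.IOException
        inUse = True
    End Try

    Dts.Variables("FileInUse").Value = inUse
    Dts.TaskResult = Dts.Results.Success
End Sub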
I'm trying to get a record count out of a database using an OLE DB source and a Row Count task, but I keep getting an error. I set up a variable as Int32 and select the variable name in the Row Count task, but when I go to the Input Columns tab to select a field to count, it gives me this error:
Error at Data Flow Task[Row Count[505]]: The component "Row Count" (505) has forbidden the requested use of the input column with lineage ID 32.
Hi, I have been researching how to access package tasks and components using the SqlServer.Dts.Runtime classes, and so far I haven't found any solution. For example, suppose your package has a Script Task and a Data Flow Task (which contains a data source component). Is it possible to use the Script Task to access the data source component in the Data Flow Task and manipulate its properties, like SqlCommand, etc.?
I've created my own posting for this. The original post was here, I apologize: http://forums.microsoft.com/forums/ShowPost.aspx?PostID=2906512&SiteID=1
According to the poster it's not possible. But there has to be some way to do it? Reflection (don't know how)?
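From what I have pieced together so far, walking a loaded package down to its pipeline components would look roughly like this; a sketch for SSIS 2005, referencing Microsoft.SqlServer.ManagedDTS and Microsoft.SqlServer.DTSPipelineWrap (the package path is a placeholder, and only top-level executables are walked):

Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper

Module WalkPipeline
    Sub Main()
        Dim app As New Application()
        Dim pkg As Package = app.LoadPackage("C:\packages\MyPackage.dtsx", Nothing)

        For Each exe As Executable In pkg.Executables
            Dim host As TaskHost = TryCast(exe, TaskHost)
            ' A Data Flow Task's inner object is the pipeline (MainPipe).
            If host IsNot Nothing AndAlso TypeOf host.InnerObject Is MainPipe Then
                Dim pipe As MainPipe = CType(host.InnerObject, MainPipe)
                For Each comp As IDTSComponentMetaData90 In pipe.ComponentMetaDataCollection
                    Console.WriteLine(comp.Name)
                    ' Source components typically expose SqlCommand as a custom property.
                    For Each prop As IDTSCustomProperty90 In comp.CustomPropertyCollection
                        Console.WriteLine("  " & prop.Name & " = " & Convert.ToString(prop.Value))
                    Next
                Next
            End If
        Next
    End Sub
End Module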
I need to get a reference to the task host in an SSIS Task component.
Basically the scenario is this:
I have a custom task I have created. I would like to validate that the ExecValueVariable is in fact a string variable during the task's Validate event. I know how to verify it's a string variable, but I can't figure out how to read what the user selected (such as User::Myvariable). The only way I've been able to figure out works only if you open my custom task's UI.
What I did is this:
I've implemented IDtsTaskUI and during the initialize section I wrote:
Sub Initialize(ByVal taskHost As TaskHost, ByVal serviceProvider As IServiceProvider) Implements IDtsTaskUI.Initialize
    ' Store the TaskHost of the task.
    Me.taskHostValue = taskHost
    Dim myTask As CustomTask = CType(taskHost.InnerObject, CustomTask)
    myTask.myTaskHost = taskHost
End Sub
My Task is named: CustomTask. I have a public variable in my task as follows:
Public NotInheritable Class CustomTask
    Inherits Task
    Implements IDTSComponentPersist

    Public myTaskHost As TaskHost = Nothing
I thereby pass the TaskHost value back to the CustomTask class, and voilà, I have it.
The problem is, this only works if the custom task calls the Initialize method, and that only happens when you open the custom editor.
I then do the validation in my CustomTask class and it works fine, but myTaskHost is null/Nothing until you actually open the custom task UI.
I created a package with SQL 2005. The package gets the Access DB and then inserts it into SQL Server.
If I open the package in .NET, I can see the SQL Task and the Data Flow Task. The SQL Task has a property, SqlStatementSource, which holds the necessary SQL code to create the tables.
How can I tell the SQL Task to recompile the SQL code if I give it another DB name, since the tables differ and don't map in the Data Flow Task?
Using "Integration Services Project" template in Business Intelligence Studio. Using platforms Visual Studio 2005 along with SQL Server 2005.
Getting the error while trying to execute package after loading it programmaticaly.
I've just one task "Transfer SQL Server Objects Task" on my Integration Services package. But when I try to execute it from VS 2005 project programmaticaly, it gives the above mentioned error.
The commands I use:
Application a = new Application();  // the Dts runtime Application used to load the package
Package pkg = a.LoadPackage(@"C:\Documents and Settings\abc\My Documents\Visual Studio 2005\Projects\lSSIS\SSISPackage.dtsx", null, true);
DTSExecResult dResult = pkg.Execute();
The error that comes back is: error: 0xc0012024: The task "Transfer SQL Server Objects Task" cannot run on this edition of Integration Services. It requires a higher level edition.
I have a developer here who created an SSIS package that contains a Send Mail Task. When this developer runs the package in the Business Intelligence Development Studio (BIDS), the Send Mail Task runs without issue. But when he tries to run it from the command line using the DTEXEC program, it errors out with the following message:
Error: 2007-08-01 15:57:44.37
Code: 0xC0012024
Source: Send Mail Task
Description: The task "Send Mail Task" cannot run on this edition of Integration Services. It requires a higher level edition.
End Error
Warning: 2007-08-01 15:57:44.37
Code: 0x80019002
Source: ELMSFeed
Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
End Warning
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 3:57:24 PM
Finished: 3:57:44 PM
Elapsed: 19.922 seconds
Here are the details of his machine: Visual Studio 2005 Version: 8.0.50727.762 (SP.050727-7600)
Under the Installed Products section it reads: SQL Server Integration Services version: 9.00.3042.00
Once we promoted the package to a production server it runs fine. I can also run the same package from my machine without issue. So, I'm pretty sure that it's specific to his machine, but I have no idea where to start looking.
I have a SQL 2000 DTS which I imported in SSIS. The data transformation tasks were not converted, but saved as SQL 2000 transformation tasks. I want to re-write this, so that it is in compliance with 2005 standards. My problem is, I can't figure out how to capture data from a SQL task, and refer to its result set in a script task. Specifically, I want to capture data by querying a view, then refer to 2 fields in that result set in a script. I want to then transform a code into a full description, and write the transformed data to an Excel destination. The vbScript code that I used in 2000 is as follows:
' Copy each source column to the destination column
Function Main()
    Dim vEssenCode, vEssenDesc
    vEssenCode = DTSSource("Essential_Code")
    If vEssenCode = "E" Then
        vEssenDesc = "EOC - Able to Perform"
    ElseIf vEssenCode = "D" Then
        vEssenDesc = "Dept - Able To Perform"
    ElseIf vEssenCode = "1" Then
        vEssenDesc = "EOC - Not Able To Perform"
    ElseIf vEssenCode = "2" Then
        vEssenDesc = "Dept - Not Able To Perform"
    Else
        vEssenDesc = "Non-Applicable"
    End If
    DTSDestination("Essen_Code") = vEssenDesc
    Main = DTSTransformStat_OK
End Function
This worked fine until recently, but now it always returns "Non-Applicable". Does anyone have any suggestions as to how I can accomplish this? I think I have to create a Data Flow Task with an OLE DB source, send its output to a SQL task, then send its result set to the script, do the transformation, and finally send the output to an Excel destination. Any suggestions will be greatly appreciated!!
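For what it's worth, in SSIS the same code-to-description lookup could live in a Script Component placed between the OLE DB source and the Excel destination; a sketch, assuming an input column Essential_Code and an output column Essen_Code added to the component (SSIS strips the underscores in the generated buffer property names):

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Translate the code into its full description.
    Select Case Row.EssentialCode
        Case "E"
            Row.EssenCode = "EOC - Able to Perform"
        Case "D"
            Row.EssenCode = "Dept - Able To Perform"
        Case "1"
            Row.EssenCode = "EOC - Not Able To Perform"
        Case "2"
            Row.EssenCode = "Dept - Not Able To Perform"
        Case Else
            Row.EssenCode = "Non-Applicable"
    End Select
End Sub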
How can I execute an Execute SQL Task from a Script Task? Both tasks are part of the same package. The Script Task is actually part of the error handler; the Execute SQL Task is part of the control flow in the package.
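As far as I know, one task cannot directly invoke another, so the workaround I am considering is running the SQL itself from the script through a connection manager; a sketch, assuming an ADO.NET connection manager named "MyAdoNetConn" and a placeholder procedure:

Imports System.Data.SqlClient

Public Sub Main()
    Dim cm As ConnectionManager = Dts.Connections("MyAdoNetConn")
    ' An ADO.NET connection manager hands back a usable SqlConnection.
    Dim conn As SqlConnection = CType(cm.AcquireConnection(Dts.Transaction), SqlConnection)
    Try
        Dim cmd As New SqlCommand("EXEC dbo.LogError", conn)
        cmd.ExecuteNonQuery()
    Finally
        cm.ReleaseConnection(conn)
    End Try
    Dts.TaskResult = Dts.Results.Success
End Sub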
I cannot set a breakpoint in a Script task and have it complete successfully. I am running Vista 32-bit and only the workstation tools for SQL Server 2005 SP2. The steps to recreate are:
1) Create a new SSIS project/package.
2) Add a Script Task.
3) Set a breakpoint in the Script Task.
4) Run the package.
When I remove the breakpoint, the script task succeeds. When I put it back the task fails. The execution results say "Error: The script files failed to load." I completely uninstalled all SQL Server 2005-related entries using the Programs and Features control panel, rebooted, reinstalled the Workstation Tools, then applied a freshly-downloaded SQL Server 2005 SP2 and rebooted. If I change "RecompileScriptIntoBinaryCode" to false the script fails whether there's a breakpoint or not. If it's true (the default), the script only fails when a breakpoint is set. I would like to be able to set a breakpoint to assist in debugging the package. For the time being, I can move the package to a different (non-Vista) OS, where it works fine, but I would like to be able to develop and debug on Vista.
I have 2 questions regarding the file system task and the FTP task
File System: Is there a way to delete a group of files within a directory? I.e., I want to delete all files matching c:\logs\*.txt; is that possible?
If not, is it possible to call a BAT file without user interaction? I have a batch file that deletes files from a network share, but when I run my SSIS package the user gets prompted, which I can't have, because the package is going to be scheduled and run daily before anyone gets into the office.
FTP: Again, a batch file question. I have a batch file that FTPs files over from a directory. Is it possible to use the FTP Task to do this? I noticed that it can do one file at a time, but is there a way to bring everything over within that directory, or do I need to create a new FTP Task for all 200 files?
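For the delete, a Foreach Loop with a File enumerator plus a File System Task would work without code, but a Script Task also does it in a few lines; a sketch using the directory from the question:

Public Sub Main()
    ' Delete every .txt file in the log directory.
    For Each f As String In System.IO.Directory.GetFiles("C:\logs", "*.txt")
        System.IO.File.Delete(f)
    Next
    Dts.TaskResult = Dts.Results.Success
End Sub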
I have a scenario where I want the FTP Task to connect to a particular FTP server and download files. But if the files are not available on the FTP server, it should wait for 10 minutes and then try again, continuing until 3:00 AM; if the files are still not there by then, it should raise an error notification.
Do I have to create a custom task for this? Or can it be achieved using the existing SSIS components?
Your guidance is required to start me in the correct direction.
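One option that avoids a custom task is a Script Task that polls until the deadline and then fails; a sketch using FtpWebRequest, where the server, credentials, file name, and deadline handling are all placeholders for the actual scenario:

Imports System.Net

Public Sub Main()
    ' 3:00 am today if we start after midnight, otherwise 3:00 am tomorrow.
    Dim deadline As Date = Date.Today.AddHours(3)
    If Date.Now >= deadline Then deadline = deadline.AddDays(1)

    Dim found As Boolean = False
    While Not found AndAlso Date.Now < deadline
        Dim request As FtpWebRequest = _
            CType(WebRequest.Create("ftp://ftp.example.com/incoming/data.csv"), FtpWebRequest)
        request.Method = WebRequestMethods.Ftp.GetDateTimestamp
        request.Credentials = New NetworkCredential("user", "password")
        Try
            Using response As FtpWebResponse = CType(request.GetResponse(), FtpWebResponse)
                found = True  ' the file exists on the server
            End Using
        Catch ex As WebException
            ' Not there yet: wait ten minutes and try again.
            System.Threading.Thread.Sleep(TimeSpan.FromMinutes(10))
        End Try
    End While

    If found Then
        Dts.TaskResult = Dts.Results.Success  ' a precedence constraint then runs the FTP Task
    Else
        Dts.Events.FireError(0, "FTP poll", "Files never arrived before 3:00 am.", "", 0)
        Dts.TaskResult = Dts.Results.Failure
    End If
End Sub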
Hi, I'm trying to implement an incremental data pull (Oracle to SQL) based on Andy's blog: http://sqlblog.com/blogs/andy_leonard/archive/2007/07/09/ssis-design-pattern-incremental-loads.aspx
My development machine is decent: 1.86 GHz Intel Core 2 CPU, 3 GB of RAM. However, the data flow task seems to hang whenever I test the package against the ~6 million row source, as can be seen from these screenshots. I have no memory limitations on the lookup transformation. After the rows have been cached, nothing happens. Memory for the dtsdebug process hovers around 1.8 GB and it continuously uses 1-6 percent of CPU. I am not using fast load to insert new records into my SQL target table. (In the screenshots I am right-clicking Sequence Container 3 and executing just this container, NOT the entire package.)
The same package works fine against a similar test table with 150k rows. http://i248.photobucket.com/albums/gg168/boston_sql92/7.jpg http://i248.photobucket.com/albums/gg168/boston_sql92/8.jpg
The weird thing is it only takes 24 minutes to do a full refresh of the entire source table from Oracle to the SQL target table. Any hints or advice would be appreciated.
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command, and the data loads into the TableName table with no error messages:

BULK INSERT TableName FROM 'C:\Data\Db\TableName.bcp' WITH (DATAFILETYPE = 'widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following: DataFileType: DTSBulkInsert_DataFileType_WideNative; RowTerminator: {CR}{LF}.
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help. Paul