Integration Services :: Logging Record Counts To Execute Task For Table Update
Jun 20, 2015
Have an SSIS package running great in 2008R2. It generates several flat files based on inline database queries. The first step of the package inserts a record into a log stats table and the last step of the package updates this record with the package name, run time and execution status. Now I need to add the record counts for each flat file to the log table.
Is there a way I can update one field for run counts with each of the counts for each file? So the [run counts] table column would look something like:
file1: 43522
file2: 645367
file3: 7883
Is it possible to store the record counts and flat file names in variables then concat them at the end when updating this record?
Or, is a better way to just insert/update a new record for each flat file step and log the counts for that file for its own record?
In either case, how can I capture the record count for each file and pass it to the update statement?
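A common pattern for this (not from the original post, just a sketch): put a Row Count transformation in each file's data flow to capture the rows written into a package variable, build the "file1: nnn" text with an expression into a single string variable, and pass that string to a parameterized update in the final Execute SQL Task. Assuming a hypothetical log table dbo.PackageRunLog:

    -- ? placeholders map, in order, to the concatenated counts string and the
    -- log record's key in the task's Parameter Mapping tab
    UPDATE dbo.PackageRunLog
    SET [run counts] = ?
    WHERE log_id = ?;

The alternative in the last question, one log row per flat file, works the same way except the task does an INSERT per file with the file name and its Row Count variable as parameters.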
The SSIS Package deletes the current week's data, reloads it with fresh data, then calculates the difference between the current week and last week.
The problem is that randomly, instead of deleting the current week, it will delete the previous week. This happens maybe 5-10% of the time. At least it does until I rebuild the package and import it into SQL Server again.
I'm guessing that the Execute SQL Task is not updating the value of the Week variable before it executes. I started with the source type being a variable. Then I decided to try Direct input and pass in the Week as a parameter (OLE DB Connection Type). That didn't work either.
Most recently I tried writing the Week variable to a table first, then having a sequence container with all the tasks second. Slightly better but I still saw the date was wrong 2 times in about 90 executions. I was hoping that writing the Week variable out to the database would force an update of any associated connections to it, but that didn't seem to work.
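For what it's worth, a minimal sketch of the parameterized form (dbo.WeeklyData and WeekStart are assumed names), with ? mapped to User::Week in Parameter Mapping on an OLE DB connection:

    -- Deletes only the week passed in from the package variable
    DELETE FROM dbo.WeeklyData
    WHERE WeekStart = ?;

If the wrong week still gets deleted with this form, the suspect is usually how User::Week itself is populated (for example an expression evaluated at a different time than expected) rather than the Execute SQL Task.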
I've got a package in SSIS 2012 that has an Execute SQL Task at the control flow level.
The SQL in question does an upsert via the SQL MERGE statement. What I want to do is return the count of records inserted and records updated (no deletes going on here to worry about). I'm using the OUTPUT option to write the changed records to a table variable.
I've tried returning the values as:
Select Count(*) as UpdateCount from @mergeOutput where Action = 'Update'
and
Select Count(*) as InsertCount from @mergeOutput where Action = 'Insert'
I've tried setting the result set to both Single row and Full result set, but I'm not seeing anything returned to the package variables I've set for them (intInsertcount and intUpdatecount).
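One way this is often handled (a sketch; the table and column names are placeholders, not from the post) is to end the batch with a single SELECT that returns both counts in one row, set ResultSet to Single row, and map the two result columns by name to intInsertcount and intUpdatecount. With two separate SELECT statements only the first result set is returned to the task, which may be why the variables stay empty.

    SET NOCOUNT ON;  -- avoid extra row-count messages interfering with the result set
    DECLARE @mergeOutput TABLE ([Action] nvarchar(10));

    MERGE dbo.TargetTable AS t
    USING dbo.SourceTable AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.SomeColumn = s.SomeColumn
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, SomeColumn) VALUES (s.Id, s.SomeColumn)
    OUTPUT $action INTO @mergeOutput ([Action]);

    -- One row, two columns; map UpdateCount and InsertCount in the Result Set tab
    SELECT
        COUNT(CASE WHEN [Action] = 'UPDATE' THEN 1 END) AS UpdateCount,
        COUNT(CASE WHEN [Action] = 'INSERT' THEN 1 END) AS InsertCount
    FROM @mergeOutput;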
Using SSIS 2012 (within Visual Studio) on Windows 7.
Before allowing my Data Flow task to fire, I'd like to check the target table (OLE DB Destination) for a specific date value in a specific field. I've seen how the Lookup Task is commonly used to check for dupes before inserting, but I'm not able to use that method because the data value I want to search the table for is contained in a Global Variable (let's say "MyVariableDate").
Is there any way to check for any records in a target table where Date1 = MyVariableDate (i.e. scanning the entire table for any occurrence of MyVariableDate in the Date1 field)?
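One way to do this (a sketch; only Date1 and MyVariableDate come from the post, the rest is assumed): an Execute SQL Task with ResultSet set to Single row, the query below with ? mapped to User::MyVariableDate, the result column mapped to an Int32 variable such as User::MatchCount, and then a precedence constraint into the Data Flow whose evaluation operation is Expression, e.g. @[User::MatchCount] == 0 (or > 0, whichever direction is wanted).

    -- Counts rows in the target table that already carry the variable's date
    SELECT COUNT(*) AS MatchCount
    FROM dbo.TargetTable
    WHERE Date1 = ?;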
I am using SSIS integration between two databases; both databases are SQL Server 2008. I am using many integrations but am getting a problem in only two of them. Both execute perfectly and the output does not show any error,
but nothing is inserted or updated in the destination table.
The first problem integration uses a Data Flow task with an OLE DB source and destination. The second one uses an Execute task inside a Foreach Loop container.
I'm using SSIS in Visual Studio 2012. My Execute SQL Task calls a Stored Procedure where I have a TRY-CATCH. Last week there was a problem and the CATCH was executed and logged an error to my error table, but for some reason the Execute SQL Task didn't fail. Is there a setting to make the Execute SQL Task fail when an SP encounters a failure?
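This is usually the procedure's doing rather than a task setting: once the CATCH block handles the error, the procedure completes successfully, so the Execute SQL Task sees a clean run. A common pattern (a sketch, assuming SQL Server 2012 or later) is to log inside CATCH and then re-raise so the task fails as well:

    BEGIN CATCH
        -- keep the existing logging
        INSERT INTO dbo.ErrorLog (ErrorNumber, ErrorMessage, LoggedAt)
        VALUES (ERROR_NUMBER(), ERROR_MESSAGE(), GETDATE());

        THROW;  -- re-raise the original error to the caller (the Execute SQL Task)
    END CATCH

On older versions, RAISERROR with severity 16 in the CATCH block serves the same purpose.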
This is for SSIS 2008r2. I am generating flat files successfully with a datetime stamp (filename_yyyymmddhhmm). Now I need to append a MAX(FILEDATE) from the file. I have a query to do this, but am not sure about two things:
1) Is it advised to put the query in a Script Task (with db conn and so forth) or is it better to put it in an Execute SQL Task? I am thinking the latter but am not 100% sure.
2) How would I pass the result of this query (yyyymm) to the filename. The filename format would be (filename_yyyymm). I am assuming that I would probably need to pass the result into a variable/expression but am not quite sure of the steps involved.
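For what it's worth, the Execute SQL Task is the usual choice here. A sketch (dbo.MyStagingTable is a placeholder for wherever FILEDATE lives): return the period as a Single row result set, map it to a String variable such as User::FilePeriod, and then put an expression on the Flat File connection manager's ConnectionString property along the lines of "filename_" + @[User::FilePeriod] + ".txt".

    -- Style 112 formats the date as yyyymmdd; the first six characters give yyyymm
    SELECT CONVERT(char(6), MAX(FILEDATE), 112) AS FilePeriod
    FROM dbo.MyStagingTable;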
I have an SSIS package which calls a command line app. When run in BIDS, it executes normally; the command line app is passed the arguments and does what it needs to do. When called as a SQL Agent Job (by the agent, or by me) it fails when calling the app, giving an exit code of 2 (which is an exception trapped by a try-catch). The SQL Agent service is running under my user (it's a test environment). The argument passed (from the log) is valid, and I've run it against the app; it provides the appropriate output. I can't for the life of me figure out what's going wrong. The app is passed an argument of a path and a password, and applies the password to the file, using interop.
In an SSIS 2012 package, I'm trying to specify SELECT TOP ? myColumn FROM myTable inside an Execute SQL Task, but unsuccessfully. Is it possible to parameterize the TOP clause?
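It is possible, with one caveat: TOP only accepts a parameter or variable when it is wrapped in parentheses. A sketch for an OLE DB connection, with ? mapped to an integer variable in Parameter Mapping:

    SELECT TOP (?) myColumn
    FROM myTable;

    -- If the provider cannot derive the parameter's type, assign it to a local variable first:
    -- DECLARE @n int = ?;
    -- SELECT TOP (@n) myColumn FROM myTable;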
While I am trying to unzip files using the Execute Process Task, I am getting the error below:
[Execute Process Task] Error: In Executing "C:\Program Files\7-Zip\7z.exe" "a -tzip D:\excel.zip D:\unzipfile\excel.xls" at "", The process exit code was "1" while the expected was "0".
Warning: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
I want to know more about zipping and unzipping files and folders using the Execute Process Task.
Zip executable: C:\Program Files\7-Zip\7z.exe. SQL version: SQL Server 2008 R2.
I do not have WinRAR, so please explain how to do it using 7z. It's quite interesting to work with, but I don't know how to get the desired result.
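One thing that stands out in the error above: 7-Zip's "a" command adds files to an archive, while "x" (or "e") extracts, so the arguments shown are zipping rather than unzipping. A sketch of the Execute Process Task settings for both directions, reusing the paths from the post (note there is no space between -o and the output folder):

    Executable:        C:\Program Files\7-Zip\7z.exe
    Arguments (zip):   a -tzip D:\excel.zip D:\unzipfile\excel.xls
    Arguments (unzip): x D:\excel.zip -oD:\unzipfile -y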
I have tried to incorporate 638 lines of SQL code inside the Execute SQL Task editor, but it is truncating the code inside the Execute SQL Task (SQL command) in SSIS. How can I write all 639 lines of code in the Execute SQL Task Editor in SSIS?
Is it possible to get the output of an Execute SQL Task into an Excel destination? I have a query which compares the data differences between two databases. It compares all tables in both databases and lists out the differences in data by each table. I need to run this query using SSIS and get the output to an Excel sheet. I have used the Data Flow task to run this query, but my query gives an error when used with the Data Flow task. So I have used an Execute SQL Task and need to write the output to an Excel sheet.
Hi, I need to execute a stored procedure in Informix from SQL Server Integration Services. I use the IBM Informix Provider 3.2 driver and have a linked server to Informix, and I use an Execute SQL Task to execute the procedure. Is it possible?
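It should be possible through the linked server. A sketch (the linked server name InformixLinked, the procedure name and the argument are placeholders) using pass-through execution from an Execute SQL Task that points at the SQL Server connection:

    -- Runs the statement on the linked Informix server; ? is filled from the argument list
    EXEC ('EXECUTE PROCEDURE informix.my_informix_proc(?)', 'some_value') AT InformixLinked;

This requires RPC Out to be enabled on the linked server.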
I have created a Foreach container to call all the packages in a folder, as below, and also created a variable.
Then I added an Execute Package Task inside the Foreach container, selected File system as the location, and in the connection pointed it at the package name. Finally, in the connection properties I added the variable I created, mapped in the Foreach Loop container, in an expression. I referred to the link below.
[URL] ....
All the packages are running, but it doesn't end once all the packages have executed; it re-runs and continues the running process. How do I stop it once all the packages have executed?
I have an execute process task set up to run ftp.exe and a script argument. The ftp.exe is referenced in the executable field without a qualified path. The package just seems to know it's there relatively. I need to change this to run a secured FTP executable that I recently installed on my PC. I put the new executable in the Windows\System32 folder where the old ftp.exe is stored. But when I put the new executable in the executable field, it says the 'File/Process "FTPS.exe" is not in path'. I get the same error when I fully qualify the path. Is there something I need to do with the new executable for SSIS to pick it up without having to fully qualify the path?
In my SSIS 2012 package, I have an Object-type variable holding some table-like records. I want to do some SQL operations like insert/update on the records in another table based on this Object-type variable's records. Basically I want to use a MERGE statement between another physical table and the records in the Object-type variable. How do I map/use the Object-type variable in an Execute SQL Task? I am not good with the Script Task. How can I utilize this Object variable in an Execute SQL Task?
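One script-free pattern (a sketch; the table and column names are invented): put a Foreach Loop with the ADO enumerator over the Object variable, map each row's columns to package variables, and run a parameterized MERGE per row in an Execute SQL Task inside the loop, with the ? markers mapped to those per-row variables:

    MERGE dbo.TargetTable AS t
    USING (SELECT ? AS Id, ? AS SomeValue) AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.SomeValue = s.SomeValue
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, SomeValue) VALUES (s.Id, s.SomeValue);

For larger record sets it is usually faster to land the Object variable's rows into a staging table first and run a single set-based MERGE against it.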
I'm trying to put together an SSIS package that will look in one directory for 2 distinctly named .csv files. The files will then be loaded into 2 tables.
The first task I execute is a SQL Task that checks to see if the tables I'm loading the data into are empty. If the tables have data, the package stops executing. If they are empty, the package continues to execute.
What is the best way to send an email out from SSIS if the package stops on the first step?
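One way to wire this up (a sketch; the table names are placeholders): have that first Execute SQL Task return the combined row count as a Single row result set into an Int32 variable such as User::ExistingRows, then use two precedence constraints with expressions: @[User::ExistingRows] == 0 continues to the loads, and @[User::ExistingRows] > 0 leads to a Send Mail Task (or an Execute SQL Task calling msdb.dbo.sp_send_dbmail) before the package ends.

    -- Total rows already present in either target table
    SELECT (SELECT COUNT(*) FROM dbo.Table1)
         + (SELECT COUNT(*) FROM dbo.Table2) AS ExistingRows;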
There is one SP with parameters varchar, int, varchar and int. It runs perfectly in SSMS. Now I've moved it into an Execute SQL Task with four IN parameters. In the parameter mapping, the data types are varchar, long, varchar, long and the parameter names are 0, 1, 2, 3. It worked before, which is great. Today I opened this package for more testing but it failed with the following error: "Incorrect syntax near ';'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I know SP is fine as tested in SSMS. I could not understand where it could go wrong in the ssis package. Currently using Reporting Service 2000; Visual Studio .NET 2003; Visual Source Safe SSIS 2008 SSAS 2008, SVN --
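For reference, the form that normally works with an OLE DB connection is ? markers in the SQL statement and ordinal parameter names 0-3 in Parameter Mapping (the procedure name below is a placeholder); a stray ';' inside the statement, or a mismatch between the markers and the mapped parameters, can produce exactly this 'Incorrect syntax near ';'' error:

    EXEC dbo.MyProcedure ?, ?, ?, ?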
I am using SSIS 2012 to dynamically back up stored procedures on a list of servers and databases. Here are the steps in my package:
1. Execute SQL Task: Captures a result set (configured to save the data set in an Object variable) with all the Servers and Databases on which stored procedures exist.
2. For Each Loop that is configured to get each row (server name @[User::Server_Name] and database name @[User::DataBase_Name]) from the object variable (@[User::Connection_Strings]) and pass it to a connection manager that has an expression for server name and database name.
2a) Within the For Each Loop, I have an Execute Process Task that is configured as:
i) Executable: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe ii) Arguments: Configured to fetch the value from an expression. The expression I am using is: 'C:\batch - Copy\PowerShell Scripts to Backup Stored Procedures\ScriptOutSPs.ps1' -$Server_Name " + @[User::Server_Name] + " -$Database_Name " + @[User::DataBase_Name]
Note: @[User::Server_Name] is the server name from the object variable, and likewise @[User::DataBase_Name] for the database name. The Execute Process Task is there to run a command line that triggers a PowerShell script with parameters. Here is the PowerShell script that I am using:
When I execute the package, passing the parameters through the arguments, it executes successfully but nothing happens. Am I passing the wrong arguments in the expression?
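Two things worth checking in that expression, assuming the script declares param($Server_Name, $Database_Name): PowerShell named parameters are passed as -Server_Name value rather than -$Server_Name, and the script path is normally handed to powershell.exe with -File; also, inside an SSIS expression string literal the backslashes have to be doubled. A sketch of the Arguments expression under those assumptions:

    "-ExecutionPolicy Bypass -File \"C:\\batch - Copy\\PowerShell Scripts to Backup Stored Procedures\\ScriptOutSPs.ps1\" -Server_Name " + @[User::Server_Name] + " -Database_Name " + @[User::DataBase_Name]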
I'm trying to execute a simple VBS file from the Executable command line in the Execute Process Task Editor.
My line is this: cscript.exe "c:\convertcsvssis\XlsToCsv.vbs"
SSIS keeps saying there are illegal characters here. I've Googled and looked at about 20 articles and I can't resolve it.
I have a ForEach that loops through Excel files and changes them to CSV files using code i found. This script takes an original Excel file and transfers it to a new CSV file in a new directory.
So in DOS at the CMD line I would type : XlsTocsv.vbs originalfile.xls newfile.csv
I have the original file and new file in the Arguments line, so I'm assuming that after the script executes it will look at the file paths in the loop and loop through them. I want it to do this when it runs:
XlsTocsv.vbs [User::@ExcelFile] [User::@CSVFile]
I just can't get it to execute and I keep getting illegal characters.
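The 'illegal characters' message often appears when the Executable property contains more than just the executable. One layout that tends to work (a sketch, reusing the variables from the post, written with SSIS variable syntax @[User::ExcelFile] / @[User::CSVFile]): keep Executable as just cscript.exe and build the Arguments property from an expression, with the embedded quotes escaped and backslashes doubled:

    Executable: cscript.exe
    Arguments (expression):
    "\"c:\\convertcsvssis\\XlsToCsv.vbs\" \"" + @[User::ExcelFile] + "\" \"" + @[User::CSVFile] + "\""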
I have a table where I am grouping on one field and would like an individual (separate) count of values from another field of same table. So for example, I have following data:
instance_id  area   values
101          North  1
102          North  2
103          East   2
104          East   2
I would like to report on Area, and count of rows with different Values types, for example:
Area   Value - 1   Value - 2   Value - 3
North  1           1           0
East   0           2           0
I am not sure what the technical term is for such a report, but I can group by the Area column; it's aggregating the counts of the different Value types that I am having difficulty performing in SSRS.
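This kind of report is usually called a cross-tab or pivot; in SSRS it maps to a matrix (a Tablix with a row group on area and a column group on values). It can also be done directly in the dataset query with conditional aggregation; a sketch, assuming the source table is dbo.InstanceData:

    SELECT area,
           COUNT(CASE WHEN [values] = 1 THEN 1 END) AS [Value - 1],
           COUNT(CASE WHEN [values] = 2 THEN 1 END) AS [Value - 2],
           COUNT(CASE WHEN [values] = 3 THEN 1 END) AS [Value - 3]
    FROM dbo.InstanceData
    GROUP BY area;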
I've a Data Flow task in a For Each Loop container at the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one, and populates a variable with the current file name. Note that the tables where I would like to push the data from each CSV file also have the same names as the CSV files. On the Data Flow task, I've a Flat File component as a source; this component uses the above variable to read the data of a particular file.
Now, here my question comes: how can I move the data to the destination SQL table using the same variable name? I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.
Which is the best way to set up the Data Flow task for this requirement? Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
I got a problem with a user who needs to log on to Integration Services. The user is "sysadmin" on the SQL Server and I granted him rights in the DCOM configuration for MSSQL Integration Services (remote and local log-on and remote and local execute and start). But if the user tries to log on to Integration Services it says "Access denied". Why?
We run Standard 2008 R2. When I deploy and run a package from the catalog, how can I get the flat-file system log we always instructed SSIS to write when we ran from the command line? I believe it was the /L param. Not sure at this point if I'll use SQL Agent or somehow employ Task Scheduler to kick off the package.
I'm using the 'Analysis Services Processing Task' as part of an SSIS package to refresh the cube. In the property page,
the 'LoggingMode' is set to 'Enabled', but there are no records in the sysdtslog90 table, while all other tasks are logged in the table. How can I get it to log to the sysdtslog90 table?
I would like to fire a pre-execution event, grab the name of the stored procedure (the source of the SQL task), insert a record with the name and datetime, and then fire a post-execution event that would update the record with a modified date.
What is the best way to capture the source value name in the Execute SQL Task?
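A sketch of one way to do this without touching each task (the log table and columns are invented): add Execute SQL Tasks to the package's OnPreExecute and OnPostExecute event handlers and map the system variables @[System::SourceName] (the name of the task that raised the event) and @[System::SourceID] to the ? markers. SourceName gives the task's name rather than the procedure it runs, so naming each Execute SQL Task after its stored procedure is one simple way to capture that value.

    -- OnPreExecute handler: ? markers map to System::SourceName and System::SourceID
    INSERT INTO dbo.TaskRunLog (TaskName, SourceId, StartedAt)
    VALUES (?, ?, GETDATE());

    -- OnPostExecute handler: close out the open row for the same task
    UPDATE dbo.TaskRunLog
    SET ModifiedAt = GETDATE()
    WHERE SourceId = ? AND ModifiedAt IS NULL;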