I have three SQL tasks executing in parallel in an Integration Services package.
  +-B-+
A-+-C-+-E
  +-D-+
It starts with task A; then B, C, and D all execute in parallel; and finally task E runs after BCD are done.
B, C, and D are all Execute SQL tasks, all with the same connection manager. Here is their code:
B) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_B
C) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_C
D) SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM temp_D
Each one sets a binary value in a package variable (via its Result Set settings) based on the record count from a different table.
This works with no problems when I run it against one server (development). But when I switch to the production server, tasks B and D both fail. I've checked to make sure all of the temp tables exist in the database for that connection manager and that all three tasks use the same connection manager - all is okay.
Here's the trickier part. When I'm still pointing to the production server and I run these tasks individually, they are all successful. It is only when they are attempting to run in parallel that they fail.
Here is the Output error:
Error: 0xC002F210 at Process Med?, Execute SQL Task: Executing the query "SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process FROM temp_B" failed with the following error: "Invalid object name 'temp_B'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
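Since "Invalid object name" is a name-resolution error, here is task B's query written with an explicit two-part name, purely for reference (this assumes the tables live in the dbo schema, which I have not confirmed):
SELECT CASE WHEN COUNT(*) = 0 THEN 0 ELSE 1 END AS Process
FROM dbo.temp_B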
I have done a search and have read some of the posts, but am left more confused than before. I am fairly new to SSIS. Here is my situation and what I am trying to accomplish.
I have a package that has a sequence container, in which there are multiple SQL tasks (about 20) running in parallel. I have checkpoints enabled, and FailPackageOnFailure enabled as well. If the package fails and I re-run it, it runs the last task as well as all the other tasks. What I am looking to accomplish is that when the package is re-run, only the SQL tasks that failed are run, not the previously successful tasks.
I think the best way would be to disable tasks based on successful completion, where each SQL task writes its name to a temp table on success (a rough sketch of what I mean follows below), but I am skeptical.
Can anyone point me in the right direction, please?
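To make the idea concrete, here is a minimal sketch of the kind of tracking table I have in mind (all of the names here are made up, not from the actual package):
-- Hypothetical tracking table: each Execute SQL task records its own name on success
CREATE TABLE dbo.PackageTaskLog (
    RunDate  datetime     NOT NULL DEFAULT (GETDATE()),
    TaskName varchar(128) NOT NULL
)
-- Final statement of, say, a task named 'Load Customers':
INSERT INTO dbo.PackageTaskLog (TaskName) VALUES ('Load Customers')
-- On re-run, a first step could read the table and set a Boolean package variable
-- that a precedence constraint or the task's Disable expression then checks:
SELECT CASE WHEN EXISTS (SELECT 1 FROM dbo.PackageTaskLog WHERE TaskName = 'Load Customers')
            THEN 1 ELSE 0 END AS AlreadyRan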
I have a master package with a sequence container containing around 10 Execute Package tasks (for child packages), all in parallel. Checkpoints have been enabled in the master package. For the Execute Package tasks FailParentOnFailure is set to true, and for the sequence container FailPackageOnFailure is set to true.
The problem I am facing is as follows. One of the parallel tasks fails, and at the time of failure some of the parallel tasks (say set S1) have completed successfully and a few are still executing (say set S2), which eventually complete successfully. The container fails after all the tasks complete execution and fails the package. When the package is restarted, the task which failed is not executed, but the tasks in set S2 are executed.
If FailPackageOnFailure is set to true for the Execute Package task, then whatever the FailParentOnFailure value, on restart the failed package is executed, but the tasks in set S2 are executed as well.
Please let me know if there is any setting so that only the failed task executes on restart.
I have a SQL Server 2000 instance running on a Windows Server 2003 box with 4 processors. SQL Server is configured to use all 4 processors, and use all available processors for parallelism.
I have created a simple DTS package which has 2 "execute external process" tasks with no precedence constraints between them. There are no connections required or defined for the two tasks (sequential processing is forced on tasks sharing connections). The DTS package properties have the "limit the number of tasks to execute in parallel" set to 4.
However, despite the above configuration, the two steps are never executed in parallel, but always sequentially.
Does anyone have any ideas as to why these tasks are not being executed in parallel?
I've created a package that currently uses 5 Data Flow tasks connected in series to get data from 5 different files and place that information into 5 different temp tables. Each Data Flow task contains only an OLE DB source, a Row Count, and an OLE DB destination. My question is: is it normal practice to keep each of these separate, or should I put them all into a single Data Flow? The package should only continue if all five Data Flow tasks complete successfully.
I have a scenario where I have to run an update task on multiple servers in parallel, and once all of them are completed (success or failure), another task is to be run on another server.
1. In a maintenance plan, if we add tasks which are not joined, will they run in parallel at the same time?
2. If we link the last task to all the other tasks with constraint type 'Completed', will the last task run after all of the tasks are completed, or as soon as any one of them is completed? (I have a big doubt here.)
The business requirement behind this is to bring data from multiple servers into local shadow copies and then process them together. It's OK if the data transfer from some server fails, but it's not OK to start processing centrally while a data transfer is still going on. Further, we want to run the data transfer from multiple servers in parallel to save time.
I've made a query like the one in MSDN (SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE Targetinstance ISA "CIM_DirectoryContainsFile" and TargetInstance.GroupComponent= "Win32_Directory.Name="e:\\temp""). I have 20 similar tasks watching different folders, but when there are too many tasks in parallel, it doesn't work anymore. I changed the number of executables to 128 (in the general properties of the package, to test), but it doesn't seem to work.
I don't understand why it works when there are only 1 or 2 tasks (6 seems to be the maximum) and not when there are more than 6.
Could you help me with this issue?
Configuration : Windows Server 2003, SQL Server 2005, SSIS, Sql Server Agent
I created a package to download 4 FTP files at once. I set MaxConcurrentExecutables for the SSIS package to 4, so in BIDS it downloads 4 files at the same time.
However, when I started the job I noticed that only 3 files were being downloaded at a time (looking at the temp files in the download directory).
Solution: Sure enough, after digging around for a while, I found that in the step properties for the SSIS package there is an Execution tab, and "Maximum Concurrent Executables" was -1 (which for some reason defaults to 3 concurrent processes even on our dual-CPU server). After changing that value to 4 - tada - all 4 files download in parallel.
Can we execute multiple instances of the same SSIS package simultaneously? If yes, how? If not, what is the workaround to simulate such functionality?
I have about 30 different reports that I want to pull into a dashboard. I need to make sure that they don't execute serially, so I get good performance.
There are two ways I can approach it:
1) I can create a stored procedure for each report and then make asynchronous calls for each of the reports from the web site. So, basically, this will be controlled from the web end.
2) I can do this from the SQL Server database, if there is some way to execute these stored procedures in parallel.
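On option 2, one database-side sketch is to wrap each report's stored procedure in its own SQL Agent job and start them all at once; sp_start_job returns immediately, so the jobs run concurrently (the job and procedure names below are placeholders, not real objects):
-- Kick off the report procedures asynchronously via SQL Agent jobs
EXEC msdb.dbo.sp_start_job @job_name = N'Report01 Refresh'
EXEC msdb.dbo.sp_start_job @job_name = N'Report02 Refresh'
EXEC msdb.dbo.sp_start_job @job_name = N'Report03 Refresh'
-- ...one call per report; each job's single step just runs EXEC dbo.usp_Report01, etc.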
Hi, all. I'm writing test cases in C# for a few methods that make changes in a database. To prevent making changes I used BeginTransaction-Rollback, and everything was good. But this doesn't work if the tested method has BeginTransaction-Rollback code itself. An error appears in NUnit: System.InvalidOperationException : SqlConnection does not support parallel transactions. Does somebody know how to solve the problem?
This is my code in VB.NET with a SQL transaction. I am using InsertCommand and UpdateCommand to execute the SQL queries in consecutive transactions, as follows. How can I achieve parallel transactions in SQL?
------------------start of code---------------------
Try
    bID = Convert.ToInt32(Session("batchID"))
    strSQL = ""
    strSQL = "Insert into sessiondelayed (batchid,ActualEndDate) values (" & bID & ",'" & Format(d1, "MM/dd/yyyy") & "')"
    sqlCon = New System.Data.SqlClient.SqlConnection(ConfigurationSettings.AppSettings("conString"))
    Dim s1 As String = sqlCon.ConnectionString.ToString
    sqlDaEndDate = New System.Data.SqlClient.SqlDataAdapter("Select * from sessiondelayed", sqlCon)
    dsEndDate = New DataSet
    sqlDaEndDate.Fill(dsEndDate)
    dbcommandBuilder = New SqlClient.SqlCommandBuilder(sqlDaEndDate)
    'sqlCon.BeginTransaction()
    'sqlDaEndDate.InsertCommand.Transaction = tr
    If sqlCon.State = ConnectionState.Closed Then
        sqlCon.Open()
    End If
    sqlDaEndDate.InsertCommand = sqlCon.CreateCommand()
    tr = sqlCon.BeginTransaction(IsolationLevel.ReadCommitted)
    sqlDaEndDate.InsertCommand.Connection = sqlCon
    sqlDaEndDate.InsertCommand.Transaction = tr
    sqlDaEndDate.InsertCommand.CommandText = strSQL
    sqlDaEndDate.InsertCommand.CommandType = CommandType.Text
    sqlDaEndDate.InsertCommand.ExecuteNonQuery()
    tr.Commit()
    sqlDaEndDate.Update(dsEndDate)
    sqlCon.Close()
Catch es As Exception
    Dim s2 As String = es.Message
    If sqlCon.State = ConnectionState.Closed Then
        sqlCon.Open()
    End If
    strSQL = " update SessionDelayed set ActualEndDate= '" & Format(d1, "MM/dd/yyyy") & "' where batchid=" & bID & ""
    sqlDaEndDate.UpdateCommand = sqlCon.CreateCommand()
    tr1 = sqlCon.BeginTransaction(IsolationLevel.ReadCommitted)
    sqlDaEndDate.UpdateCommand.Connection = sqlCon
    sqlDaEndDate.UpdateCommand.Transaction = tr1
    sqlDaEndDate.UpdateCommand.CommandText = strSQL
    sqlDaEndDate.UpdateCommand.CommandType = CommandType.Text
    sqlDaEndDate.UpdateCommand.ExecuteNonQuery()
    tr1.Commit()
    sqlDaEndDate.Update(dsEndDate)
    sqlCon.Close()
End Try
I cannot find any information on this error. It occurs on packages that are writing to the same table using a SQL Server destination. I suppose it would be a good exercise in error handling, but I'd rather avoid it.
Got this error running a query with SSMS 2014 on SQL Server 2014, but the db has compatibility level 100 (SQL 2008).
There is no SQL error code in the message and no error found in Event Viewer (SQL Server log, Windows log, application log, etc.). When I run that query I get this message in the "Messages" tab, and in the Results tab I get a strange result; indeed, the value returned doesn't exist in the table.
For example, table1.field1 has possible values 2 and 3, but it returns 1113344.
We got that error through JDBC too. That query uses a lot of joins; unfortunately I can't post it here. Today I got this error again, so I made a copy of one of the involved tables into another db on the same instance, re-ran the query on the original db, and it works fine.
Hi, I have an SSIS package which gives the following error when executed:
Error: 0xC002F210 at Create Linked Server, Execute SQL Task: Executing the query "exec (?)" failed with the following error: "Syntax error or access violation". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.Task failed: Create Linked Server
The package has a single Execute SQL task with the properties listed below:
General Properties
Result Set : None
ConnectionType : OLEDB
Connection : Connected to a local database (DB1)
SQLSourceType : Direct Input
SQL Statement : exec(?)
IsQueryStoredProcedure : False
BypassPrepare : False
Parameter Mapping Properties
Variable Name      Direction   Data Type   Parameter Name
User::AddLinkSql   Input       Varchar     0
'AddLinkSql' is a global variable of package scope, of type String, with the value: Exec sp_AddLinkedServer 'Srv1','','SQLOLEDB.1',@DataSrc='localhost',@catalog ='DB1'
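For completeness, the same statement written with every parameter named (just a restatement of the value above, not a verified fix) would be:
EXEC sp_AddLinkedServer
    @server     = N'Srv1',
    @srvproduct = N'',
    @provider   = N'SQLOLEDB.1',
    @datasrc    = N'localhost',
    @catalog    = N'DB1'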
When I try to execute the query task, it fails with the above error. Also, the above SQL statement cannot be parsed and gives the error "The query failed to parse. Syntax or access violation".
I would like to add that the above package was migrated from DTS, where it runs without any error, even though it gives the same parse error message.
I would appreciate it if anybody could help me out with this issue by suggesting where the problem is.
I have a simple Error row redirection (from an OLEDB Command) to redirect all rows in error to a Transform script and thereafter to a Flat file destination. This is via the red arrow (DF path) output from the OLEDB Command.
I don't understand why this leg executes even when there are no rows in error. Zero-byte flat files get written out when there are no errors.
How come? Why would a path with a red arrow execute even when there are no errors?
Part 2: When I introduce some errors in the data to cause an integrity violation, and I hook up an OnError event handler, it is never raised, even though the error rows are successfully redirected and written out to the Flat File destination.
So what constitutes an error for a Data Flow Task? Does an error raised by SQL Server for an integrity violation bubble up as an error in the SSIS package?
How would you handle logging in a massive row load? I'm having problems because every row error (caused by casting, format, or lookup failures) in a transformation task is redirected to a text file as a log. This is fine when there is only one error per row, but when I have two errors in the same row, detected by different transformation tasks, only the first one is reported to the text file. I have to wait for the next load, after I correct the first error, to find the second one. I need to capture as many errors as exist per row in the same load...
Which component or strategy can I use in an SSIS package to achieve this?
I have this job that downloads 4 files once a month from the same server. The files are sizable and I need to download them in less than 5 hours total. In 2000 I used an ActiveX script to generate the FTP script and then execute it. All four files download at the same time in 4 different tasks with no issues.
I rewrote the process in 2005 so that it uses the IS FTP task, but when all 4 FTP tasks kick off they all fail... instantly. They initially shared the same FTP connection manager, so I created a different one for each, and still the same result.
The error is one that relates to changing directories... Now, if I just run one of the tasks it runs fine; it is just when more than one tries to run at once. I ended up putting in 10-second delays between each FTP task kicking off and it works just fine...
Does this sound like a bug?
Also... I am on SQL 2005 Enterprise SP1 on Windows 2003 Enterprise SP1.
I am getting this error: "An error occurred while executing batch. Error message is: The directory name is invalid."
in SQL Server Management Studio.
I found that if I switch to Results to Text I don't get the error, but with Results to Grid I get it no matter what I query in any database on the server.
Hi all, my problem is as follows: when I try to configure an ODBC DataAdapter using the configuration wizard,
I use the query builder to select some fields from multiple tables. The displayed info depends on a user input. When I define the input parameter by writing "= ?" in the criteria column of the input field and continue, it displays an error:
Error H4000: Error while executing the query, error near ')' near character (214, for example).
So please, if anyone knows the solution to this error, reply soon.
The import from Flat File Source fails: Error 0xc02020a1: Data Flow Task 1: Data conversion failed.
The data conversion for column "ArticleName" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)
I have changed the size of the column "ArticleName" (varchar) to max but the error comes up again.
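Roughly, the change to the destination column was along these lines (the table name dbo.Articles is a placeholder, not the real table):
-- Widen the destination column (placeholder table name)
ALTER TABLE dbo.Articles ALTER COLUMN ArticleName varchar(max) NULL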
The data I want to import came in multiple flat files. All the others imported properly, but this one is a problem.
I am facing this error when using an SP called by a DTS package (using an ADO connection). The SP takes three parameters and looks like the following (just the major part of it):
SET @STR = 'SET ROWCOUNT 10000'
         + ' WHILE 1 = 1 '
         + 'BEGIN'
         + ' BEGIN TRAN'
SET @STR = @STR + ' DELETE FROM ' + @TableToBeCleaned + ' WHERE ' + @SELECTEDCOLUMN + ' <
Server: Msg 266, Level 16, State 2, Line 1
Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 4, current count = 5.
Server: Msg 266, Level 16, State 2, Procedure p_DBCleanerDeleteRows, Line 56
Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 4, current count = 5.
I use SET ROWCOUNT to execute the delete in batches. When checking with SELECT @@TRANCOUNT it returns 5. It is clear to me that @@TRANCOUNT exited the SP with a different value than it had when entering it, but there is a COMMIT in the dynamic SQL statement and I do not know what is wrong.
Is it necessary to include BEGIN TRAN and COMMIT in the statement at all, or can I omit them?
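For comparison, here is a minimal static (non-dynamic) sketch of the same batched-delete pattern in which every BEGIN TRAN is matched by a COMMIT, so @@TRANCOUNT is the same on exit as on entry (the table and column names are made up):
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    BEGIN TRAN
    DELETE FROM dbo.SomeLogTable WHERE LogDate < '20060101'
    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRAN   -- commit the empty batch too, so the count stays balanced
        BREAK
    END
    COMMIT TRAN
END
SET ROWCOUNT 0        -- reset so later statements are not limited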
Perhaps this has already been answered. If so, I apologize. I'm a newbie at this stuff. I have a DTS package to import some Visual FoxPro data. Yes, I said Visual FoxPro. :) The DTS package works beautifully when I right-click on the package and choose "Execute." And it works beautifully when I run dtsrun.... at the DOS prompt. But it doesn't work when I try to execute it with code in Query Analyzer. I've tried exec master..xp_cmdshell 'dtsrun ...' and I've tried scheduling the package as a job, disabling the job, and then using msdb..sp_start_job to run it. Neither of those works. I get a "DSN not found and no default driver specified." My DSN stuff looks OK to me, but perhaps some of that is wrong. What does the DSN have to look like in order for the code in Query Analyzer to find it? If the DSN is screwed up, why would it work OK when executed from some places, but not in others? What am I missing? Any help would be much appreciated. :) thanks... -emily
I hope this forum is suitable for posting my question.
I have a site in ASP and VB with an Access database. It works just fine. I want to convert my database from Access to MSSQL Express, so I converted it with Microsoft SQL Server Migration Assistant for Access. I also installed MSSQL Server on my computer, so I can test my site at home. But now I am in trouble; I've received many errors like:
[Microsoft][ODBC SQL Server Driver][SQL Server]Incorrect syntax near '#'. (This appears when I try to insert a date and time into the database, which was no problem with the Access database; see the sketch after this list.)
ODBC driver does not support the requested properties. (This appears when I make a selection in the database where I use an inner join.)
And many more ...
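On the '#' error: Access (Jet) delimits date literals with # signs, while SQL Server expects a quoted string, so the insert has to change along these lines (the table and column names here are placeholders, not from my actual site):
-- Access / Jet syntax that SQL Server rejects:
--   INSERT INTO Orders (OrderDate) VALUES (#01/24/2007 14:30:00#)
-- SQL Server syntax (quoted string, ideally in an unambiguous format):
INSERT INTO Orders (OrderDate) VALUES ('2007-01-24T14:30:00')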
Do I have to rewrite the code in all my files? Is working with MSSQL that different compared with an Access database?
From what I read, I understood that after converting the database from Access to MSSQL, I would only have to change the connection string and the site would have no problems. But it seems it's not so easy. I'm pretty disappointed.
Can you give me some advice, or some links where I can read something that can help me?
This is the error I get if the login "reports" does not exist on the destination server
Error: The Execute method on the task returned error code 0x80131500 (ERROR : errorCode=-1073548784 description=Executing the query "EXEC dbo.sp_grantdbaccess @loginame = N'reports', @name_in_db = N'reports' " failed with the following error: "'reports' is not a valid login or you do not have permission.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. helpFile= helpContext=0 idofInterfaceWithError= {8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}). The Execute method must succeed, and indicate the result using an "out" parameter.
This is the error I get if the login "reports" does exist on the destination server
Error: The Execute method on the task returned error code 0x80131500 (ERROR : errorCode=-1073548784 description=Executing the query "EXEC dbo.sp_grantdbaccess @loginame = N'reports', @name_in_db = N'reports' " failed with the following error: "User, group, or role 'reports' already exists in the current database.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. helpFile= helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}). The Execute method must succeed, and indicate the result using an "out" parameter.
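A minimal guard around the call that would avoid the second error (a sketch only, assuming SQL Server 2005 system views; this check is not part of the failing package) would be:
-- Only grant access when the database user does not already exist
IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'reports')
    EXEC dbo.sp_grantdbaccess @loginame = N'reports', @name_in_db = N'reports'
-- Note: the server-level login 'reports' still has to exist for this call to succeed (the first error).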
Hello, today we successfully integrated Reporting Services with SharePoint services. After that I was trying to implement the reports we made in Reporting Services on the SharePoint site. The first report was OK; everything went well, there were no errors, and the report executed successfully. Then the second report: it is a bit bigger and a more advanced report. It also has to be executed against another server. Let me explain the server setup. We have 2 servers, an x32 and an x64 server. The x64 server is the "Live" server, and the x32 is our "Test" server. On the x32 server we have MS SQL Server 9.0.3200 (SP2), the same as on the x64 server. When I try to execute the report on the x32 server, I get an error. The error I get from the report server is:
An internal error occurred on the report server. See the error log for more details. So I checked the error log, and I found the following message: EventType sql90exception, P1 w3wp.exe, P2 6.0.3790.3959, P3 45d6968e, P4 reportingservicesnativeclient.ni.dll, P5 2005.90.3042.0, P6 45cd6edb, P7 0, P8 00005283, P9 00000000, P10 NIL. For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
After I found the error, I searched on Google for what the message means. I found some topics on other forums that said "Install SP2", but that is not an option since we are already on SP2.
Some additional information: I just tried to run the first report on the x32 server, and it worked. So the first report runs on the x32 server as well as the x64 server.
Hello, I was finally able to upload the package, and from the Management Studio interface I ran the package and it worked perfectly.
When I created a job with a single step to execute that package, the job failed.
When I go to the history it doesn't give me any details of what failed in the package or in the job:
Date 24/01/2007 12:30:28
Log Job History (Carga datos ACH)
Step ID 1
Server ATLANTESQL2005
Job Name Carga datos ACH
Step Name Carga de datos de ach
Duration 00:00:02
Sql Severity 0
Sql Message ID 0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted 0
Message Executed as user: ATLANTESYSTEM. The package execution failed. The step failed.
Maybe it is the user that it tried to execute the package as?
I wrote a package using Integration Services and tried to execute it from Visual Studio 2003 (Visual Basic). The package was loaded, but the Execute method failed without any error messages. When I tried to run it from Visual Studio 2005, execution was successful.
Code:
Dim App As DTSLib.IDTSApplication90 = New DTSLib.Application
Dim p As DTSLib.IDTSPackage90 = App.LoadPackage("d:\tempPackage.dtsx", True, Nothing)
p.Execute()