Project REAL: SQL Agent Is Ignoring @[User::RootDir] + "\" And Finding An Old Package Path Somewhere In BIDS SLN Metadata
May 12, 2008
Executed as user: xxxxxx-sql. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 4:14:05 p.m. Error: 2008-05-12 16:14:07.84 Code: 0xC00220DE Source: EPT Dimensions - x Description: Error 0x80070003 while loading package file "E:\ETL\LoadGroup_Dimensions_Daily.dtsx". The system cannot find the path specified. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 4:14:05 p.m. Finished: 4:14:07 p.m. Elapsed: 2.25 seconds. The package execution failed. The step failed.
Hi, I'm using the Project REAL framework and have BIDS installed on the same server as SQL Server 2005. When I run from BIDS, all
is good; when I run from SQL Server Agent via the file system, the first package that the master package calls fails.
I have double-checked the environment variables in Project REAL and they are all good, because otherwise BIDS would have failed too,
and of course in each package the expression uses @[User::RootDir] to point to the package file path:
@[User::RootDir] + "\\LoadGroup_Dimensions_Daily.dtsx"
The issue is that the package under SQL Agent is ignoring @[User::RootDir] + "\\" and finding an old package path somewhere in the BIDS SLN metadata.
Any clues on how I can refresh the package's metadata for deployment?
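One thing worth ruling out first: SQL Agent is a service, so it only sees system environment variables as they existed when the service last started, while BIDS, running in your interactive session, sees the current ones. A quick sketch to see what the server process actually resolves (assuming xp_cmdshell is enabled; REAL_Root_Dir is the Project REAL variable discussed later in this thread):

EXEC master..xp_cmdshell 'echo %REAL_Root_Dir%';
-- If this prints the literal %REAL_Root_Dir% or a stale path, restart the
-- SQL Server and SQL Agent services so they pick up the new system variables.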
Hello, I am reading the Microsoft Project REAL ETL part. I find that it uses neither transactions nor checkpoints throughout the packages. I am wondering, if something goes wrong with the process and the process aborts in the middle, how are we going to handle this? Any particular reason for not using transactions? Is it best practice to avoid transactions for performance considerations and log volume? I am new to SSIS. I would like to follow the best practice. Thanks for advice.
I am trying to load Project REAL. I get warnings when opening the RecurringETL package. However, I created the two environment variables it's looking for and have confirmed these variables are functional. I am unsure how to troubleshoot this and am afraid to do anything given these warnings. Where is the configurations collection?
Warning 1 Warning loading TestHarness.dtsx: The configuration environment variable was not found. The environment variable was: "REAL_Configuration". This occurs when a package specifies an environment variable for a configuration setting but it cannot be found. Check the configurations collection in the package and verify that the specified environment variable is available and valid. C:\Microsoft Project REAL\ETL\TestHarness.dtsx 1 1
Warning 2 Warning loading TestHarness.dtsx: The configuration environment variable was not found. The environment variable was: "REAL_Root_Dir". This occurs when a package specifies an environment variable for a configuration setting but it cannot be found. Check the configurations collection in the package and verify that the specified environment variable is available and valid. C:\Microsoft Project REAL\ETL\TestHarness.dtsx 1 1
Warning 3 Warning loading TestHarness.dtsx: Failed to load at least one of the configuration entries for the package. Check configuration entries and previous warnings to see descriptions of which configuration failed. C:\Microsoft Project REAL\ETL\TestHarness.dtsx 1 1 Thanks
SQL 2012 - Convert BIDS project and DTUTIL cannot load the package
I just converted a really simple 2008 BIDS project to 2012, and the 2012 dtutil will not load it. Tried both the 32-bit and 64-bit dtutil.
Get the following error message.
'Could not load package "c:\temp\SSISPackage.dtsx" because of error 0x80131534. Description: the package failed to load due to error 0x80131534 "<null>". This occurs when CPackage::LoadFromXML fails.'
I can import the package in SQL Management Studio, and I can deploy it using the manifest and the deployment wizard. But dtutil chokes on it. It is the dtsx straight from the 2012 BIDS build.
All: As the subject suggests I am encountering an error while running a package through an agent. Unfortunately the error does not provide much information for me to diagnose the problem, and hence the post.
I have pasted the error below and appreciate help from anyone.
Thank you,
Message Executed as user: EPSILON\SYSTEM. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:16:31 AM Error: 2007-08-02 10:16:32.25 Code: 0xC002F304 Source: File System Task File System Task Description: An error occurred with the following error message: "Could not find a part of the path 'P:\Finance\Items Sold Below Cost\Items Sold Below Cost_2007-08-01.csv'.". End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:16:31 AM Finished: 10:16:32 AM Elapsed: 1.047 seconds. The package execution failed. The step failed.
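Worth noting for this error: P: looks like a mapped drive, and drive mappings belong to an interactive user session, so the account the Agent job runs under typically has no P: drive at all. A hedged check from the server side (assumes xp_cmdshell is enabled; the path comes from the error text):

EXEC master..xp_cmdshell 'dir "P:\Finance"';
-- If this fails, point the File System Task at the UNC path
-- (\\server\share\Finance\...) instead of the mapped drive letter.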
Hi, I have been searching through the posts hoping to find an answer to my problem. Most errors and problems seem to deal with Agent not being able to run packages because of permission issues.
My problem has to do with SQL Server Agent not finding the package to run. I have a proxy with a high enough level of permissions (SqlAgentOperatorRole).
The error message I get is:
Executed as user: ___/____. The package could not be found. The step failed.
The package is in a folder on our network. When I create the step I use the browser to go find and select the package. So why does it not find it at runtime? I can run it manually.
My group is using 2005 for the first time. Any help you can provide would be very much appreciated.
I have packages stored in the SQL store. I was letting users run the packages from a .NET app that I made with
Microsoft.SqlServer.Dts.Runtime
Now I have noticed this causes the packages to run on the client PC's CPU, and the network traffic also goes through the client PC; in my particular case this is slow.
From the docs and from this forum I have found that you can run a package on the server's CPU through SQL Agent, by letting packages be run in a SQL job; after that you can start a package from an application with sp_start_job.
But how do you set a User:: variable in a package if you have to start the package from a SQL Agent job?
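sp_start_job itself takes no package parameters, so one common workaround is to rewrite the job step's dtexec-style arguments with a /SET override before starting the job. A minimal sketch, assuming an existing SSIS job step; the job, package, and variable names are made up for illustration:

-- point the step's command at the package and override the variable
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'RunMyPackage',   -- hypothetical job
    @step_id  = 1,
    @command  = N'/SQL "\MyPackage" /SET "\Package.Variables[User::MyVar].Properties[Value]";"NewValue"';

-- then kick the job off (asynchronously) from the application
EXEC msdb.dbo.sp_start_job @job_name = N'RunMyPackage';

Keep in mind sp_start_job returns immediately, and concurrent callers would overwrite each other's step command, so this pattern only suits one-at-a-time execution.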
Timings... sometimes there are almost too many ways to do the same thing.

The only significant findings I see from all the below timings are:
1) Integer math is generally fastest, naturally. Bigint math isn't much slower, for integers that all fit within an integer.
2) Converting float to varchar is relatively slow, and should be avoided if possible. Converting from integer to varchar or varchar to int is several times faster.
3) Most significantly, and less obvious, CASE expr WHEN ... recomputes expr for each WHEN condition, unfortunately, and is the same speed (or perhaps slightly slower) as listing WHEN expr = value for each condition. Perhaps an indexed computed column (somehow materialized) would be advisable when possible to avoid repeated computations in CASE..WHEN expressions (if that helps).

Note that if you divide by COUNT(*), most timings below are below one microsecond per row, so this all may not be very significant in most applications, unless you do frequent aggregations of some sort.

COUNT(*) FROM [my_sf_table] = 477446 rows
The result from each query = either 47527 or 47527.0
Platform: Athlon 2000 XP w/512MB RAM, table seems to be cached in RAM, SQL 2000, all queries run at least 3 times and minimum timings shown (msec).
SRP is a REAL (4 bytes). Fastest ones are near the end.

CPU SQL (ms)

-- Convert to varchar (implicitly) and compare right two digits
-- (original version -- no I didn't write it)
4546 select sum(case right(srp,2) when '99' then 1 when '49' then 1 else 0 end) from sf

-- Use LIKE for a single comparison instead of two, much faster
-- Note that the big speedup indicates that
-- CASE expr WHEN y then a WHEN z then b ...
-- recalculates expr for each WHEN clause
2023 select sum(case when srp like '%[49]9' then 1 else 0 end) from sf

-- Floating point method of taking a modulus (lacking fmod/modf)
2291 select sum(case round(srp - 100e*floor(srp*.01e),0) when 99 then 1 when 49 then 1 else 0 end) from sf

-- Round to nearest 50 and compare with 49
1322 select sum(case round(srp-50e*floor(srp*.02e),0) when 49 then 1 else 0 end) from sf

-- Divide by 49 by multiplying by (slightly larger than) 1e/49e
811 select sum(floor((cast(srp as integer)%50)*2.04082E-2)) from sf

-- Integer approach without using CASE
731 select sum(coalesce(nullif(sign(cast(srp as integer)%50-48),-1),0)) from sf

-- My original integer approach
651 select sum(case cast(srp as integer)%100 when 99 then 1 when 49 then 1 else 0 end) from sf

-- Modulus 50 integer approach without CASE
481 select sum((cast(srp as integer)%50)/49) from sf

-- Modulus 50 integer approach
460 select sum(case cast(srp as integer)%50 when 49 then 1 else 0 end) from sf

-- bigint without CASE
531 select sum((cast(srp as bigint)%50)/49) from sf

-- bigint with CASE
521 select sum(case cast(srp as bigint)%50 when 49 then 1 else 0 end) from sf

-- get SIGN to return -1 or 0, then add 1
-- much better than the coalesce+nullif approach
500 select sum(sign(cast(srp as integer)%50-49)+1) from sf

-- SIGN with BIGINT
551 select sum(sign(cast(srp as bigint)%50-49)+1) from sf

BTW, I know srp should be int to begin with for this to be faster...
Okay, so...

select cast(srp as int) srp into sf from [my_real_sf_table]

720 select sum(case when srp like '%[49]9' then 1 else 0 end) from sf
339 select sum(1+sign(srp%50-49)) from sf
310 select sum(srp%50/49) from sf
300 select sum(case srp%50 when 49 then 1 else 0 end) from sf

What if it were a char(7)?

select cast(cast(srp as integer) as char(7)) srp into sf2 from [my_sf_table]

801 select sum(case right(rtrim(srp),2) when '49' then 1 when '99' then 1 else 0 end) from sf2
717 select sum(case when srp like '%[49]9' then 1 else 0 end) from sf2
405 select sum(srp%50/49) from sf2
391 select sum(case srp%50 when 49 then 1 else 0 end) from sf2

How about varchar(7)?

drop table sf2
select cast(cast(srp as integer) as varchar(7)) srp into sf2 from [my_sf_table]

581 select sum(case right(srp,2) when '49' then 1 when '99' then 1 else 0 end) from sf2
569 select sum(case when srp like '%[49]9' then 1 else 0 end) from sf2

LIKE is faster on VARCHAR than on CHAR columns... Apparently it has to effectively RTRIM the trailing spaces during the LIKE operation.

Is binary collation any faster?

drop table sf2
select cast(cast(srp as integer) as varchar(7)) COLLATE Latin1_General_BIN srp into sf2 from tbl_superfile

561 select sum(case right(srp,2) when '49' then 1 when '99' then 1 else 0 end) from sf2
530 select sum(case when srp like '%[49]9' then 1 else 0 end) from sf2

Binary collation comparisons are slightly faster, though it's not a big difference (with just two characters being compared).

662 select sum(case convert(binary(2),right(srp,2)) when 0x3439 then 1 when 0x3939 then 1 else 0 end) from sf2

-----------

5037 select right(srp,2) srp, count(*) from my_sf_table group by right(srp,2) order by right(srp,2)
920 select cast(srp as int)%100 srp, count(*) from my_sf_table group by cast(srp as int)%100 order by cast(srp as int)%100

---

On the one hand, premature optimization can be a waste of time and energy. On the other hand, understanding the performance implications of various operations can help write more efficient systems.

In any case, an indexed computed column, or one updated by a trigger, could virtually eliminate the need for any of these calculations to be performed except upon insertion or update, so maybe my comparisons aren't very meaningful for most applications, considering we're talking about less than 3 microseconds per row here worst-case.

But the results remind me, some recommend avoiding Identity when it's not necessary. I find Identity(int,1,1) to be a nice, compact surrogate key that is useful for quick comparisons, grouping, and so on. Also, it seems most appropriate as the primary key for all lookup tables in a star schema in OLAP data warehousing. (?) Of course, in some situations it's not appropriate, particularly when having a surrogate key violates data integrity by allowing duplicates that would not be allowed with a proper primary key constraint, or when the surrogate key is completely redundant with a (especially short) single-column unique key value that would be a better selection as the primary key. With multi-column primary keys, I think it's sometimes convenient to have a surrogate Identity, if only for IN clauses that reference that identity column (though EXISTS can usually replace those, so maybe that's a weak excuse for an extra column.)
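To make the indexed-computed-column idea above concrete, here is a minimal sketch against the sf/srp names used in the timings (assuming SQL Server 2005 or later, since PERSISTED is what allows indexing an expression derived from a REAL column):

ALTER TABLE sf ADD srp_mod50 AS (CAST(srp AS int) % 50) PERSISTED;
CREATE INDEX IX_sf_srp_mod50 ON sf (srp_mod50);

-- the modulus is now computed once per write instead of once per read:
SELECT SUM(CASE srp_mod50 WHEN 49 THEN 1 ELSE 0 END) FROM sf;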
I have a data flow task with a single source and a single destination. The source's table name comes from a variable expression, and the destination table name is also set from a variable expression. I'm running this under 3 scenarios, each with a different source and destination table. They differ in name but are close in structure, with the exception of one column. The metadata for the source flow path seems to be "sticky", in that it does not modify the source table structure in the flow to account for that different column. I'm not sure how to adjust this. Any ideas? I've modified several properties in the task and the data flow, but nothing seems to make this adjustment at run time.
Hi, I have been working on an SSIS project for some time now. The project files are located on a remote server. Suddenly I am not able to open the solution: I get a lot of error messages and all the data flow tasks are gone. I later found out that SSIS encrypts packages so that other users will not be able to see them. Fine, but I have been using the same Windows user account for months now. What could be the problem? This is what I get when trying to open the solution:
There were errors while the package was being loaded. The package might be corrupted. See the Error List for details.
And the error list also contains messages saying "Could not load from xml".
I am new to SQL 2005/SSIS and BIDS. I have successfully imported an old DTS package, modified it to no longer use DTS (fully SSIS), and created a build / sent it back to MSDB. Now I have a couple of questions. What do most people do with projects after they are promoted? I see each remains in your project list and under the My Documents folder. Do you delete them, keep them for history, or re-use them in the future?
Let me give you our environment; it might help you suggest what would be best for our situation. We have just one server for the data warehouse (no development server). The development DB will most likely be on the production server with just a different name.
We use DTS (which will now be SSIS) to import info into the DB on a daily basis, once a day, and we have scheduled jobs that run these SSIS packages.
Is there any info out there on good practices for BIDS and promoting, etc.?
Most of the installation went well. I got to the point of attaching the source database, but had problems attaching the warehouse database:
Database 'REAL_Warehouse_Sample_V6' cannot be started in this edition of SQL Server because it contains a partition function 'pf_Range_Fact'. Only Enterprise edition of SQL Server supports partitioning.
I am running Standard Edition on my development workstation, the only one we have running. I do not necessarily need to see partitioning working, but it sounds like I will need to install Enterprise Edition on my development computer to go down this path of discovery. Has anyone been able to do otherwise?
Is there any way to get in and look at the stored procedures at least, or the schemas?
Even after going back, it was not clear to me from the prerequisite list that Enterprise Edition was required for this.
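For what it's worth, Developer Edition carries the Enterprise feature set (including partitioning), so it is the usual way to explore this without an Enterprise license. A quick hedged check of what an instance supports, plus a look at the partitioning objects once the database is attached on a suitable edition:

SELECT SERVERPROPERTY('Edition')       AS Edition,
       SERVERPROPERTY('EngineEdition') AS EngineEdition;  -- 3 = Enterprise (and Developer)

-- once attached on Developer/Enterprise, inspect the partitioning objects:
SELECT name, type_desc FROM sys.partition_functions;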
I have SQL Server 2005 Dev and BIDS installed. I want to create a dtsx package, but when I go to File > New > Project, the template to create an Integration Services package is not there. The template for OLAP projects is missing also. It used to be present before I upgraded my system to Vista. Could this be a Vista issue, or do I just need to install a patch or something simple like that?
Can someone explain how the Project REAL test harness is supposed to function? When I run it, with and without debugging and by using dtexec, other processes can start and stop before the console prompt for the date and number of days to process has been answered. If you just look at the TestHarness and the LoadGroupFullDaily, the increment date and SQL audit steps will finish prematurely, before the console data is even entered. Is there a property that I am missing?
This is my first time deploying an ASP.NET 2 web site. Everything is working fine on my local computer, but when I published the web site on a remote computer I get the error "Failed to generate a user instance of SQL Server due to failure in retrieving the user's local application data path. Please make sure the user has a local user profile on the computer. The connection will be closed" (only on pages that try to access the database). Help please
Hi List, I'm trying to set up an implementation of Project REAL. It works like this: create two system environment variables called REAL_Root_Dir and REAL_Configuration with the values given below. Click on Start -> Control Panel -> System. Go to the Advanced panel, click the Environment Variables button, then New in the System variables box.
If the Project REAL files were installed at C:\Microsoft Project REAL, then the variable values will be:
The package OLEDB connections work like this. First, read the environment variable to get the location of the config file. Next, read the config file to get the connection string for the Config database:
<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[SQL - Configuration].Properties[ConnectionString]" ValueType="String">
    <ConfiguredValue>Data Source=(local);Initial Catalog=DataWarehouseABC;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
Next, read the Config database to get the connection strings for the Source and Destination databases.
The Destination database is called "DataWarehouseABC"; the Source database is called "SnapshotABC".
The Source database OLEDB connection works 100%; however, for the Destination OLEDB connection we get the error below. PS: both source and destination databases are on the same development machine, but both were restored from .bak files taken on another production machine.
Error 1 Error loading LoadGroup_Daily.dtsx: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Login failed for user 'xxxxxx'.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Cannot open database "DataWarehouseABC" requested by the login. The login failed.".
Any ideas on how one OLEDB connection in this package can get this corruption?
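Since both databases are restored .bak files from another machine, one likely cause is an orphaned database user: the user inside the restored DataWarehouseABC no longer maps to a login SID on the development server, so only the connection that happens to use a matching login works. A hedged check and repair for SQL 2005 (the login name here is just the placeholder from the error message):

USE DataWarehouseABC;

-- list database users whose SIDs match no server login
EXEC sp_change_users_login 'Report';

-- re-link a listed user to the like-named server login
EXEC sp_change_users_login 'Auto_Fix', 'xxxxxx';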
I am on a project that searches for an optimal route for a user, from a starting point to his/her destination, on a map stored in my SQL Server 2005. I have created two versions to test the performance of the path-finding algorithm. I have a few classes, which are:
PriorityQueue class, which is implemented as a List() object plus code to sort it in order
PathNode class, whose instances are the nodes of the search tree, carrying information on heuristic values
DataSource class, which stores data retrieved from SQL Server 2005 in RAM for faster execution of the path finding
PathFinding class, which implements the path-searching algorithm (based on the A* algorithm), with a PriorityQueue as the open list, a List() object as the closed list, PathNode as the nodes in both lists to store information, and lastly retrieving data from the DataSource object that loads the whole table from SQL Server 2005
In the first version, I simply used SELECT queries to retrieve each corresponding node's data from SQL Server 2005, which makes the performance very low, as I verified with SQL Server Profiler. In the current version, I load all the data into RAM to speed up execution, which has successfully achieved <1 sec as opposed to the first version's ~8 secs.
Now, my problem is to port the algorithm part into my SQL Server 2005 as a SQL CLR integration, to achieve better results without burdening the client PC. My question is: how am I going to do this? I tried before and hit several errors, like needing to serialize my current PathNode class, which I did. Do I need to make every class UDT-compatible, or...?
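On the SQL side, only the types that cross the SQL/CLR boundary need to be UDT-compatible; internal classes like PriorityQueue and PathNode can stay ordinary classes inside the assembly, with the algorithm exposed as a CLR stored procedure (or table-valued function) that returns the route. A rough sketch of the registration, with the assembly path and all names invented for illustration:

-- enable CLR integration once per instance
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;

-- register the compiled algorithm (hypothetical path and names)
CREATE ASSEMBLY PathFinding
FROM 'C:\builds\PathFinding.dll'
WITH PERMISSION_SET = SAFE;
GO

-- expose the entry point; the method must be a Shared/static member in the assembly
CREATE PROCEDURE dbo.usp_FindPath
    @startNodeId int,
    @endNodeId   int
AS EXTERNAL NAME PathFinding.[PathFinding.StoredProcedures].FindPath;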
Dear list, can anyone figure out a workaround for why the OLEDB provider MSDORA cannot store passwords? I have all the info stored in a table (Project REAL best practice). The user id I have stored in the ConfiguredValue string gets transferred to the OLEDB provider MSDORA connection named SQL_REAL_Source_myoradb, but not the password. To work around this bug, which happens only with MSDORA, can anyone suggest a setting I should use for package security? I.e., the default is EncryptSensitiveWithUserKey.
thanks Dave
CREATE TABLE [admin].[Configuration](
    [ConfigurationFilter] [nvarchar](255) NOT NULL,
    [ConfiguredValue] [nvarchar](255) NULL,
    [PackagePath] [nvarchar](255) NOT NULL,
    [ConfiguredValueType] [nvarchar](20) NOT NULL
) ON [PRIMARY]
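One workaround consistent with that table: since SSIS strips sensitive properties when it writes a connection's ConnectionString out to a configuration, you can store the complete connection string, password included, as the configured value yourself, and accept that it then sits in the table in clear text. A hedged illustration (all values are made up; the real filter and package path must match your package):

INSERT INTO [admin].[Configuration]
    (ConfigurationFilter, ConfiguredValue, PackagePath, ConfiguredValueType)
VALUES
    (N'SQL_REAL_Source_myoradb',
     N'Provider=MSDAORA.1;Data Source=myoradb;User ID=some_user;Password=some_password;',
     N'\Package.Connections[SQL_REAL_Source_myoradb].Properties[ConnectionString]',
     N'String');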
Hi all, hope someone can help - please bear with me, I'm new to this. Basically I have had to change PCs, so I copied my ASPNET.MDF and LDF from my old PC to the new PC, including the web pages/apps etc. I had created. However, now all I get is "Failed to generate a user instance of SQL Server due to failure in retrieving the user's local application data path". I just don't understand what permissions the DB should have; does it need to match the SQL Express owner?
I installed my ASP.NET 2.0 web application and SQL Express June edition on my Windows 2003 server. When the application tries to reach the database I get the following error: "Failed to generate a user instance of SQL Server due to failure in retrieving the user's local application data path. Please make sure the user has a local user profile on the computer. The connection will be closed." I can understand that it has something to do with user rights, but beyond that I am pretty lost. Can anyone translate this error for me and give me some tips on what to do? I first checked the user running the SQL Express service and saw it was the NETWORK SERVICE user. I changed it to the local system account, but that did not make any difference. The application is running just fine on my local computer (Win XP SP2). That one has SQL Express April edition, though.
I am new to SSIS. I followed the directions of the Creating a Simple ETL Package tutorial in Books Online. I have tried more than five times, doing exactly as the tutorial suggests, but it does not work.
1)[Lookup [30]] Error: Row yielded no match during lookup.
2) [Lookup [30]] Error: The "component "Lookup" (30)" failed because error code 0xC020901E occurred, and the error row disposition on "output "Lookup Output" (32)" specifies failure on error. An error occurred on the specified object of the specified component.
3) [DTS.Pipeline] Error: The ProcessInput method on component "Lookup" (30) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
4) [DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0209029.
Can someone help me with this tutorial error? Or am I doing something wrong?
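"Row yielded no match during lookup" simply means some source rows have key values that do not exist in the reference table, and the default error disposition (fail component) then kills the data flow. A hedged way to spot the offending values with plain T-SQL, modeled on the tutorial's DimCurrency lookup in AdventureWorksDW (the staging table is hypothetical; adjust column names to your flat file):

SELECT s.CurrencyID, COUNT(*) AS unmatched_rows
FROM dbo.StagingSampleData AS s            -- hypothetical landing table for the flat file
LEFT JOIN dbo.DimCurrency  AS d
       ON d.CurrencyAlternateKey = s.CurrencyID
WHERE d.CurrencyKey IS NULL                -- no match = the rows the Lookup rejects
GROUP BY s.CurrencyID;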
Up to this point only two of us, both server administrators, have developed reports. Now we need to add two new report analysts who can develop reports in BIDS but have limited access to other server folders, etc. Our report source code is located at G:\Visual Studio 2005 Source Code\Projects, and all report folders, .sln files, etc. are located under the Projects sub-folder. I have shared the Projects sub-folder and given read & execute / list folder contents / read / write / modify rights. They can see the report folders and .sln files via BIDS or Explorer but get "access denied" when they attempt to execute one of them. Any thoughts on what I may be missing?
Hello, I'm using BIDS to create a package, and I need to pass in a variable value. How does this new DTS process work, in terms of deploying the package and passing in a custom value at runtime? Thanks.
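In SSIS (the DTS successor), the standard way to push a custom value in at runtime is dtexec's /SET switch, which overrides a variable by its property path; the same /SET arguments also work in a SQL Agent job step. A hedged sketch, wrapped in xp_cmdshell only so it can be tried from a query window (package path and variable name are invented):

EXEC master..xp_cmdshell
    'dtexec /FILE "C:\packages\MyPackage.dtsx" /SET "\Package.Variables[User::MyValue].Properties[Value]";"hello"';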
I have all the package logging tickboxes checked but in the log file I only get this: OnPreValidate,<machinename>,<account>,<packagename>,{7741AD7F-1941-4F4C-AE9D-08068C8856E4},{F6924552-600A-450B-995F-C24AB5C49FC3},6/25/2007 5:21:12 PM,6/25/2007 5:21:12 PM,0,0x,(null)
I'm working on a fairly straightforward data transfer package and have found that the package runs dramatically faster when I run it inside BIDS than with DTExec. When I run the package on the server using debug in BIDS, the job completes 1 million rows in around 6 minutes. When I run DTExec with the same package on the same server it is much slower, and the package takes roughly 25 minutes to complete.
I know this sounds crazy, and it's supposed to be the other way around with DTExec running much faster, but I'm stumped as to what could be causing the issue. The machine this is running on is a two-processor, dual-core CPU with GB of RAM, and I'm using Terminal Server to log in and create the package with BIDS on SQL Server 2005 SP2.
The main feature of this package is a Foreach container that uses an ADO record set to loop over a set of values from a control table. There are a large number of iterations so the package loops frequently, but the data flow task is fairly simple and uses an OLEDB source and OLEDB destination to transfer data between two SQL Server 2005 databases.
The package works in either BIDS or DTExec, but I'm really puzzled why it would run so much faster inside BIDS.
I have a problem that's baffling me. I have a package that loads some files into the database. If I run it from BIDS, it works fine. But if I run the package from the job, I get this error:
Cannot open the datafile "D:\myFolder\myFile.TXT".
It seems like a permissions issue, but the job runs under a local admin account; not to mention, the very same package/job reads and loads files from this exact same directory with no problems.
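To pin down exactly which account the step executes under (job owner, proxy, or the Agent service account itself), a hedged query against msdb may help; the job name is a placeholder:

SELECT j.name                    AS job_name,
       SUSER_SNAME(j.owner_sid)  AS job_owner,
       s.step_name,
       s.subsystem,
       p.name                    AS proxy_name  -- NULL: SSIS/CmdExec steps fall back to the Agent service account
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobsteps s ON s.job_id = j.job_id
LEFT JOIN msdb.dbo.sysproxies p ON p.proxy_id = s.proxy_id
WHERE j.name = N'LoadDataFiles';  -- hypothetical job name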
I am running WinXP Pro SP2 with all current updates and also VS2005 Team Edition for Developers. VS2005 is installed on the D drive, as are nearly all of my development tools. SQL Server 2000 SP4 is on the C drive, and I just installed SQL 2005 Express with Advanced Services to the D drive. I then attempted to install the Express toolkit (BIDS) to the D drive, only to learn it's hard-coded (really stupid not to check for an existing VS 2005) to install on the C drive only. I've gotten past the devenv.exe issue.
The issue now is that when I open VS2005 with the normal shortcut or the Business Intelligence Development Studio shortcut, open any project that contains Crystal Reports reports, and attempt to open a report, I get package load failures for ReportDesignerPackage and the Datawarehouse VSIntegration Layer Package. I also get this same error if I try to create a BIDS report project.
I thought maybe VS2005 has a search path variable in Tools/Options, or maybe a system environment variable that could be tweaked to tell VS2005 to also look in the IDE folder of the dummy VS install on the C drive. If there is, I have not discovered it yet.
Second thought was to copy the files in the IDE folder of the dummy VS install on the C drive to the IDE folder where my VS2005 is actually installed. I saw a post last night by someone who had done that with apparent success. That solution seems a little suspect, since the BIDS package files are registered at the C drive paths, so you certainly don't want to delete or move those files from where they were installed.
I'm nervous about side effects on my existing VS2005 projects, during development and deployment, which aren't even using BIDS.
So, now the question is: how does one resolve this conundrum?
Most of the packages I've created in BIDS will NOT run in SQL Server 2005. The simplest one fails during a script task that calls external managed code. I've done all the steps outlined in "Referencing Other Assemblies...", but I'm still getting "Object reference not set to an instance of an object." Here's a sample of a script that's having the problem; the line in green is the one that seems to be the cause of the error. This is extremely frustrating. This code will even run from a command-line console without error. Why is it so difficult to deploy one of these projects with managed code?
Code Snippet
Public Sub Main()
    Dim variable1 As String = DirectCast(Dts.Variables("packagevariable1").Value, String)
    Dim variable2 As String = DirectCast(Dts.Variables("packagevariable2").Value, String)
    Dim variable3 As Integer = DirectCast(Dts.Variables("packagevariable3").Value, Integer)
    Dim variable4 As String = DirectCast(Dts.Variables("packagevariable4").Value, String)
    Dim filePath As String = DirectCast(Dts.Variables("filePath").Value, String)
    Dim variable5 As String = DirectCast(Dts.Variables("packagevariable5").Value, String)
    Dim results As Boolean
    Dim fileGenerator As IProviderInterface
    Dim intFactory As integrationServiceFactory = New ProviderIntegrationServiceFactory()
I have developed a simple SSIS package that exports data from an AS400 iSeries server to a flat file. When I try to debug the package I receive the error below. I have tried changing the security level of the package to EncryptAllWithPassword and specified a password. For some reason the password for the connection to the AS400 is not being retained: when I enter the password the error disappears, but when I try to debug the package the error returns. Does anyone know how to correct this? Thanks in advance for your help.
Things to also know:
I am using a Native OLE DB\IBM AS400 OLE DB Provider with user name and password (Allow saving password checked). Test Connection succeeded.
I am using an OLE DB Source to extract the data, with a Data Access Mode of Table or View. When I try to select a table I am prompted by the AS400 to enter the password. Then I can see the tables.
I can select the columns I need and click OK to save.
Error 1 Validation error. Data Flow Task: OLE DB Source [39]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SERVERNAME.USERNAME" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. SSIAS400DataExport.dtsx 0 0
I have a 7-step SSIS package that manipulates some data on a DB2 database. The package executes perfectly in Business Intelligence Development Studio. I save the package to my SSIS store and then point my scheduled task to it, and it fails after about 9 seconds every time. I have an identical job that works with a different DB2 database without any problem. The only difference is the database it's pointing to.
The package is executing as the same user who created it, which has sysadmin to both the SSIS store and the SQL instance the package is executing on. When I saved the package I selected "Rely on server storage roles for access control" for the protection level.
This one is driving me crazy; I can't figure it out. Any ideas?
2 SQL Execute Tasks, one loop container, 2 data flow tasks, 1 Foreach loop container, 1 FTP task. The data flow tasks have 1 OLEDB source, 1 flat file source, 1 row count transformation, 1 recordset destination and 1 OLEDB destination.
When I load the package into BIDS it takes 125 MB of memory, and then everything is slow: the Properties panel slides in and out slowly, the objects in the packages are not painted properly, and making changes and running takes a lot of time.
Am I doing anything wrong here? Why is it consuming so much memory?
I have an SSIS package that contains a DTS 2000 package in it. The DTS 2000 package imports data into several tables from an ODBC data source. When I execute the package through BIDS, there are no problems; everything works great. I am now trying to execute the SSIS package from my stored procedure, and it gives me the following error: Error: 2007-01-30 11:54:24.06 Code: 0x00000000 Source: Populate IncrTables Description: System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user. at DTS.PackageClass.Execute() at Microsoft.SqlServer.Dts.Tasks.Exec80PackageTask.Exec80PackageTask.ExecuteThread() End Error
I did a search for this and found KB 904796. It had the exact error message, but I don't believe my package uses SQL 2000 Meta Data Services. Just to be safe, I reinstalled the backward compatibility features and the DTS 2000 tools on the server. That still did not fix anything. I found another forum post that suggested loading the DTS 2000 package internally, which I did, and it did not fix anything. I am using a password for the protection level, so that is not causing my issue. Does anyone else have any suggestions as to what I might be able to try?
SQL 2005 Dev Ed SP1 & post-SP1 hotfixes installed, Win 2k3 server. Thanks! John
I configured my packages to use package configurations, but when I try to debug a package in BIDS I get the following error in the Execution Results tab:
[Connection manager "XXXXXXXXXX"] Error: An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E4D Description: "Login failed for user 'XXXXXXX'.".
Edit.
Now the debug works, but after I deploy the package to the SQL Server, the package execution fails and the following is written into the SQL Server errorlog:
2008-03-17 15:56:31.71 Logon Error: 18456, Severity: 14, State: 8. 2008-03-17 15:56:31.71 Logon Login failed for user 'XXXXXXX'. [CLIENT: <local machine>]
State 8 indicates a bad password, but it isn't one, because logging into SQL Server with that user and password works.