Reading DBASE IV Files (dbf) - AcquireConnection Fails
Apr 14, 2006
Has anyone had any trouble moving a package that uses an OLE DB Connection Manager to read DBASE IV files? While developing I never had a problem; the configuration string described in this thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=76237&SiteID=1) worked just fine. But since I enabled Configurations, my package always fails when trying to read the dbf file. I've gone through just about every setting in the config file that I can think of.
Information: 0x4004300A at MyDataFlow, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at MyPackage, Connection manager "MyDBASEIVConnManager": An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at MyDataFlow, OLEDBFileSource [15]: The AcquireConnection method call to the connection manager "MyDBASEIVConnManager" failed with error code 0xC0202009.
Error: 0xC0047017 at MyDataFlow, DTS.Pipeline: component "OLEDBFileSource" (15) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at MyDataFlow, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at MyDataFlow: There were errors during task validation.
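For reference, a Jet connection string for dBASE files of the kind that thread describes looks like the sketch below (the folder path is a placeholder; the Data Source must be the folder that contains the .dbf files, not a single file):

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyDbfFolder;Extended Properties=dBASE IV;

If the package works in the designer but fails once Configurations are enabled, it is worth opening the XML configuration file and checking that the ConnectionString value it applies still carries the Extended Properties=dBASE IV part verbatim.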
I have a SSIS package that reads the data from an Excel file using an Excel Connection Manager and imports the data to a table on a SQL Server 2005 DB instance.
When I run this package locally on the server, with the package stored on the file system, it executes perfectly. When I upload the package to the msdb database and run it from there, it still executes successfully.
Now I schedule the package to run as a SQL Server Agent job and the package fails. With logging enabled, I see this in the log file:
OnError,WEB-INTSQL,NT AUTHORITY\SYSTEM,Copy to CRN-ALLOCATION_COMMENTS_TEMP,{40A6BF6E-7121-448B-A49D-DED58FDC746A},{BD991566-F4BD-41BC-AEBF-264032D8D0D3},5/9/2006 1:54:52 PM,5/9/2006 1:54:52 PM,-1071611876,0x,The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.
OnError,WEB-INTSQL,NT AUTHORITY\SYSTEM,Copy to CRN-ALLOCATION_COMMENTS_TEMP,{40A6BF6E-7121-448B-A49D-DED58FDC746A},{BD991566-F4BD-41BC-AEBF-264032D8D0D3},5/9/2006 1:54:52 PM,5/9/2006 1:54:52 PM,-1073450985,0x,component "Allocation Comments" (1) failed validation and returned error code 0xC020801C.
I am wondering why the AcquireConnection method call fails when the package is scheduled. I am running the step as the SQL Agent Service Account; the Local System account starts up both SQL Server Agent and SQL Server, and it is an Administrator on the box.
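One thing worth ruling out (an assumption on my part; it only applies if the server is 64-bit, since the Jet/Excel provider exists only in 32-bit) is that the Agent job is launching the 64-bit runtime. A CmdExec job step can force the 32-bit dtexec instead; the package path here is a placeholder:

"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /SQL "\MyPackage" /SERVER "(local)"

If the server is 32-bit, the more common culprit is simply that Local System cannot reach the Excel file's location, so checking permissions on that path for the Agent's account is the other avenue.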
I've created an SSIS package that contains a Sequence Container with TransactionOption = Required. Within the container, a number of Execute Package Task components run serially and perform "Upserts" to dimension and fact tables on our production server. The destination db configuration is loaded into each of these packages using an XML configuration file. These "Upsert" packages are nearly identical in structure, yet some execute correctly and others fail. Those that fail all produce the same error messages.
These messages appear during Pre-Execute:
[Insert new dimension record [1627]] Error: The AcquireConnection method call to the connection manager "DW" failed with error code 0xC0202009.
[DTS.Pipeline] Error: component "Insert new dimension record" (1627) failed the pre-execute phase and returned error code 0xC020801C.
... which are followed by
[Connection manager "DW"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction.".
[Connection manager "DW"] Error: An OLE DB error has occurred. Error code: 0x8004D00A.
While still in debug mode, I can check the properties of the "DW" connection and successfully test the connection within the packages that fail.
The same packages run successfully when tested outside the container (i.e. no transaction) or when the configuration file is modified to point the "DW" connection to a development version of the db which is running on the same server as the source database.
I have successfully used DTCtester to verify that transactions from the source to the destination server are working correctly. I also tried setting DelayValidation = True, with no change. I have opened a case with Microsoft and am awaiting a reply, so I thought I'd throw a post out here to see if anyone else has encountered this and might have a resolution. Here's some more on the environment:
Source Server: Windows Server 2003 Enterprise Edition SP1; SQL Server 2005 Enterprise Edition SP0
Destination Server: Windows Server 2003 Enterprise Edition SP1; SQL Server 2000 Enterprise Edition SP3 (clustered)
Thank you in advance for any feedback you might be able to provide.
I've got an SSIS package that I've been testing all week and it runs fine in Visual Studio. The package uses an OLE DB connection manager to access the dBase files using the Jet Provider. The dBase files are stored on a remote file share. There are a series of identical files held in different dated folders. I use a ForEach loop to move through the folder list and import the data as I go. When I deploy it to the server and try to run it, I get the following:
The AcquireConnection method call to the connection manager "Aloha" failed with error code 0xC0202009.
Aloha is the connection manager for the dBase files. I've searched everywhere I can for ideas on how to solve this. I've made sure the SQL Agent service is running under the same account I used to author the package. I've made sure that the package is using Server Storage. Everything looks like it should be working.
Can anyone give me any other items I can look into?
I am trying to use linked tables to connect SQL Server 2000 to a legacy system using dBase III files. (I need real-time, read-only access to these files.)
I have created a linked table from SQL Server to a folder on the C drive which contains the dBase III files, using an ODBC DSN that uses the "Microsoft dBase Driver (*.dbf)". The DSN tested successfully using Excel. The linked server connection is then created using the "Microsoft OLE DB Provider for ODBC Drivers".
The dBase tables appear OK in Enterprise Manager, but I cannot get a query to work in SQL Query Analyzer using the 4-part name syntax.
My query is just: SELECT * FROM LinkedTable...customers
The error message is "Invalid schema or catalog specified for provider 'MSDASQL'". Now, I am pretty sure dBase files do not support any sort of schema/catalog setup, so I suspect SQL Server is looking for something it is not going to get.
One clue might be that in Enterprise Manager, under the catalog column, I see the pathname to the dBase file, i.e. c:\customers.dbf, which I cannot enter in the 4-part syntax.
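One workaround worth trying (a sketch; 'LinkedTable' is the linked server name from the post and 'customers' the dBase table) is OPENQUERY, which hands the whole statement to the ODBC driver and avoids the schema/catalog parts of the four-part name:

SELECT * FROM OPENQUERY(LinkedTable, 'SELECT * FROM customers')

Since the pass-through query is evaluated by the dBase driver itself, SQL Server never needs a catalog or schema for the provider.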
We're trying to read DBASE IV files as a source, but can't find any providers for that format. Will these be included in the final release? Is there another way? DBASE has always been supported, so it's kind of strange.
I saw this post by dterrie in the Wishlist thread and I just wanted to second it:
"How about bringing back a simple dBase import. The SSIS guys are clearly FAR out of touch with reality if they think people who handle data no longer need to work with dbf files. I've seen alot of dumb stuff in my day, bit this is just sheer brilliance. I just love the advice of first importing into Access and then importing the Access table. Gee, why didn't I think of such a convenient solution. I could have had a V-8."
I've been struggling with this for the last couple of days and finally decided to import the dBase III file into Access and then import that into SQL Server 2005. Imagine my surprise when I discovered this is the currently recommended method.
That's just ridiculous. Can someone tell me why they would reduce some of the functionality of SQL Server from 2000 to 2005? This was a very easy process in SQL Server 2000...
I think we don't have an option to read the transaction log file in SQL Server other than using Log Explorer. Does Log Explorer work well for auditing SQL Server? We are planning to buy Log Explorer. Is it a good product to buy?
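For a quick look at the active log there is also an undocumented, unsupported table-valued function (a sketch; whether its raw output meets an auditing requirement is an open question, and being undocumented it can change between builds):

SELECT [Current LSN], Operation, [Transaction ID] FROM ::fn_dblog(NULL, NULL)

The two NULL arguments mean "from the start of the available log to its end". A commercial log reader still adds a great deal on top of this, such as reconstructing readable statements from the log records.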
I need to read a text file into a SQL Server 6.5 table. The file has variable-length fields, and the fields are separated by plus signs ("+"). Any ideas? Thanks for your time.
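On 6.5 the usual tool for this is bcp in character mode, which lets you name an arbitrary field terminator. A sketch (server, database, table, path and login are all placeholders, and the table must already match the file's column layout):

bcp mydb..mytable in C:\data\myfile.txt -c -t "+" -S myserver -U sa -P mypassword

-c selects character mode and -t "+" makes the plus sign the field terminator; rows are assumed to end with the default newline.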
I have created a C2 audit .trc file in SQL Server 2000 on Windows. I would like to be able to read it. I have tried the following, but I get a message that the file does not exist, is not a recognizable trace file format, or there was an error opening the file.
SELECT * FROM ::fn_trace_gettable('D:\Program Files\Microsoft SQL Server\MSSQL\Data\audittrace_20070207073931.trc', default)
GO
I know the file and the path are valid. What else could be the problem?
I have 2 tables, EmployeeA (English) and EmployeeB (Spanish), kept in separate .mdb files. I want to add the records into a SQL Server 2005 table called StateEmployee.
Procedure:
1. Loop through 2 folders: one containing table EmployeeA in an .mdb, the other containing table EmployeeB in a different .mdb.
2. Pick a file from EmployeeA and EmployeeB, both at the same time.
3. Count the total number of rows in both files. If equal, proceed.
4. Compare the employeeid of one row of EmployeeA to the employeeid of EmployeeB.
5. If the employeeid matches, load both rows into SQL Server; otherwise file them to the error table.
6. Loop through all rows simultaneously until the end.
7. Go to the next .mdb.
How do I go about this step by step? I am fairly new to SSIS. I asked friends too, but their answers were complex and I couldn't follow them. I hope someone can give an easy-to-understand solution with a sample.
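One simple approach is to use Foreach Loop containers with Data Flows that land EmployeeA and EmployeeB into staging tables, then let set-based T-SQL do steps 4 and 5. A sketch, assuming staging tables StagingEmployeeA and StagingEmployeeB and placeholder columns:

-- Steps 4/5: matched employees go to the target table.
INSERT INTO StateEmployee (employeeid, employeename)
SELECT a.employeeid, a.employeename
FROM StagingEmployeeA a
JOIN StagingEmployeeB b ON b.employeeid = a.employeeid;

-- Rows without a match on the other side go to the error table.
INSERT INTO EmployeeErrors (employeeid, sourcetable)
SELECT a.employeeid, 'EmployeeA' FROM StagingEmployeeA a
WHERE NOT EXISTS (SELECT 1 FROM StagingEmployeeB b WHERE b.employeeid = a.employeeid)
UNION ALL
SELECT b.employeeid, 'EmployeeB' FROM StagingEmployeeB b
WHERE NOT EXISTS (SELECT 1 FROM StagingEmployeeA a WHERE a.employeeid = b.employeeid);

The row-count check in step 3 then falls out of comparing SELECT COUNT(*) on the two staging tables in an Execute SQL Task before running the inserts.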
I am trying to read a 36-byte file that contains compressed data. I create my Flat File data source and SSIS reads it fine UNTIL it hits an x00 in the file. Then it stops reading and I can't get any data after it. There is data after the x00. Here is the entire hex string: C7 C7 CF 6A 00 00 05 02 3D 03 21 01 E0 02 00 00 00 00 00 00 00 00 3D 3C 1E FD 02 C8 00 00 00 AE 41 E3 28 7C
To test, I changed the two x00 bytes in positions 5 and 6 to x01, and SSIS read until the next x00.
I have a challenge I am trying to overcome; hopefully someone has come across this issue before.
I am creating a DTS package that will be scheduled to run at a certain time every day. A source folder receives a set of new files every day. The DTS package will then read each file and copy the data into a load table in my database. The challenge is this:
I am trying to load files from a source folder into my load table. Within each file, the entries are in a specific format, using pipes to separate the data that goes into each column, e.g.
example of a file entry:
column1 | column2 | column3
data1 | data2 | data3
data1 | data2 | data3
data1 | data2 | data3
I am using DTS to specify the file format and map the columns as appropriate to my table. All this is well and good, but my problem is that each file has a different name as well as being timestamped. How do I use DTS to specify the source folder, open each file sequentially, and copy the entries into my table, inserting new data from each file into my load table as well as overwriting old data loaded from earlier files? Is there a way to specify a source folder in DTS rather than specifying a single file in the menu options (in the Transform Data Task properties), or do I need to write a script to read the files?
Can someone please suggest a solution and how to approach this?
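One server-side way to handle the changing file names is to enumerate the folder and BULK INSERT each file. A sketch, assuming xp_cmdshell is available, C:\SourceFolder is a placeholder path visible to the SQL Server service account, and LoadTable matches the three pipe-delimited columns:

-- Collect the file names in the source folder.
CREATE TABLE #files (fname VARCHAR(255))
INSERT INTO #files EXEC master..xp_cmdshell 'dir /b C:\SourceFolder\*.txt'
DELETE FROM #files WHERE fname IS NULL

DECLARE @fname VARCHAR(255), @sql VARCHAR(1000)
DECLARE file_cursor CURSOR FOR SELECT fname FROM #files
OPEN file_cursor
FETCH NEXT FROM file_cursor INTO @fname
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Load each pipe-delimited file, skipping the header row.
    SELECT @sql = 'BULK INSERT LoadTable FROM ''C:\SourceFolder\' + @fname
                + ''' WITH (FIELDTERMINATOR = ''|'', ROWTERMINATOR = ''\n'', FIRSTROW = 2)'
    EXEC (@sql)
    FETCH NEXT FROM file_cursor INTO @fname
END
CLOSE file_cursor
DEALLOCATE file_cursor
DROP TABLE #files

This can run as an Execute SQL task inside the DTS package; the overwrite-versus-insert decision would then be a separate update/insert step from the load table into the destination.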
Hi. During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are: 1) Can SQL CE 3.5 handle a 4-6 GB file (read it and parse it with SQL)? 2) Can SQL CE 3.5 act as a standalone client in which a user can view a large (4-6 GB) text file, or will I need a small .NET client to read it? More info: the text file will reside on the machine where SQL CE 3.5 is installed. There is no pull to get the data.
We have a package that uses a ForEach Loop container to access files on a network drive. For some reason I am getting a message that the ForEach enumerator is empty and did not find any files that matched the pattern. For the pattern I left the default *.* for testing purposes. I have specified the file folder as \\remoteserver\fileshare\subfolder and also as \\remoteserver\c$\fileshare\subfolder and have gotten the same message. However, when I map a network drive and set the file folder to the network drive, it finds the files. Is this a permissions issue?
After I finish processing the file I want to move it to a new directory. Once this is deployed in production, the package will not be running under a domain account and probably won't have access to the network folder. Is there any way to specify in the connection manager itself that it should use a specific account to access the folder?
My requirement is to read 2 sets of files from a folder: for example, all files starting with either 'a' or 'b' only. In the Foreach Loop, if I specify 'a*,b*' it does not work. Instead of the comma (,) I also tried colon, semi-colon and pipe characters; none of them work. So I am using 2 loops now, but I would like to know whether there is any way to do it using a single loop.
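The Foreach File enumerator only takes a single wildcard, so one alternative is to keep the mask at *.* and filter inside the loop. A sketch, assuming the enumerator maps the current file name into a variable named User::FileName: connect the loop's first task to the real work with a precedence constraint whose evaluation operation is set to Expression, using

SUBSTRING(LOWER(@[User::FileName]), 1, 1) == "a" || SUBSTRING(LOWER(@[User::FileName]), 1, 1) == "b"

so files that do not start with 'a' or 'b' simply skip the downstream tasks on that iteration.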
I have two FTP Tasks configured in my SSIS package. One is for "Receive files" and the other is set for "Delete remote files." Both use variables for the source/destination paths. My remote path variable contains a wild card in the name field such as /usr/this/is/my/path/*.ext and it is working to FTP all the .ext files to my working directory. I then rename the files and want to remove the original files from the FTP server. I use the same variable as the remote path variable in the delete as I do in the receive.
Using the same FTP connection manager for both tasks, I always get a failure on the delete. The FTP connection manager is set up to use the root user. Using a terminal I am able to open an FTP connection to the server and remove the files manually. There doesn't seem to be any detailed documentation on the FTP Task configured for "Delete remote files," so I'm hoping someone might have some insight into the problem.
I receive the same message for each of the files that was downloaded: Error: 0xC001602A at MyPackage, Connection manager "FTP Connection Manager": An error occurred in the requested FTP operation. Detailed error description: 550 /usr/this/is/my/path/datafile1.ext: No such file or directory. The attempt to delete file "/usr/this/is/my/path/datafile1.ext" failed. This may occur when the file does not exist, the file name was spelled incorrectly, or you do not have permissions to delete the file.
Since I can delete the files manually as the root user, I don't understand how this could be a permissions issue; the file does exist and its name is spelled correctly.
I have a database that consists of 4 files (1 MDF, 2 NDFs and 1 LDF). I can confirm that by running:
select name from sys.database_files
The result is: Adam_Data, Adam_Log, Adam_FullText, Adam_History
About a year ago this database had 3 more NDF files, but we moved all the objects stored in those secondary data files to the other data files and removed them completely from the database. This operation worked correctly; otherwise, I assume, the sys.database_files view would have returned more files.
Everything works perfectly with this database until I try to detach and reattach it. If I detach this database and immediately try to attach it again on the same server using SQL Server Management Studio, it wants me to attach 7 database files: the 4 database files that are still being used by this database (which is perfectly valid), plus the 3 old database files which obviously don't exist anymore and aren't even reported by the sys.database_files view. Is there anything I can do to make sure that an attach of this database will work again? How can I completely remove these old database files?
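Before detaching again, it might be worth comparing the database-level view quoted above with the server-level catalog, since the attach dialog is clearly getting its file list from somewhere other than sys.database_files. A sketch ('Adam' is a guess at the database name, based on the logical file names), to be run while the database is still attached:

SELECT name, physical_name, state_desc
FROM sys.master_files
WHERE database_id = DB_ID('Adam')

If the three removed NDF files still appear here, the stale entries live in server-level metadata; if not, they are coming from somewhere else, which would at least narrow things down for a support case.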
I have an FTP task in my control flow that downloads files from an FTP server. This FTP task is inside a Foreach container that loops over an ADO recordset for the file name. The files that the FTP task pulls are huge. If the FTP task fails, I want it to restart and download only those files that have not yet been downloaded. Is this possible?
What configuration changes do I need to make to the Foreach container and the FTP task?
I am experiencing an issue with a SQL Server 2000 Maintenance Plan. The DB backup job fails to delete old backup files from the file server (I am backing up to a network share, actually a DFS). The backup part of the maintenance plan succeeds, but the cleanup part then fails.
I made sure that the service account under which SQL Server Agent runs has sufficient privileges on the network share, by logging in as that account and successfully deleting the files in question.
I was not able to locate any log entries either on the SQL Server machine or on the file server machine that would indicate the root of the problem. Even though I turned on auditing for Delete operations for the destination folder, its subfolders and files, I could not find anything in the Security event log.
I would appreciate any ideas on how to troubleshoot and correct this problem.
One of the steps in a SQL Agent job executes an SSIS package that copies files from a remote shared location to the local server where SQL Server is installed. The account SQL Agent runs as has access to the remote shared location. However, the SSIS package always fails with the following error:
An error occurred with the following error message: "Access to the path '\\sharedlocation\BCKTST105092008.csv' is denied.".
We have tried using proxies, but the same issue occurs with a proxy as well. When using a proxy, we created one for a user who has access to the shared folder on the Windows server from which the files are to be copied. If we log in to the SSIS server as that user and execute the SSIS package manually, it works fine. However, if we log in as a different user and run the job via SQL Agent using that user's proxy, the job fails with the same error as above.
Hi, we have a problem installing SQL Server Express Edition SP1 on Vista. The installation stops while installing the setup support files, with the following message:
Errors occurred during the installation: Error 2 installing Microsoft SQL Server 2005 Setup Support Files. See log file for more detailed information. Det går inte att hitta filen. (Translated: The file cannot be found.) The log file points to the support.log file with the following content:
=== Verbose logging started: 2007-03-13 17:34:23 Build type: SHIP UNICODE 4.00.6000.00 Calling process: C:\cd nät40\..\SQLEXPR\Setup.exe ===
MSI (c) (C4:F4) [17:34:23:450]: Resetting cached policy values
MSI (c) (C4:F4) [17:34:23:450]: Machine policy value 'Debug' is 0
MSI (c) (C4:F4) [17:34:23:450]: ******* RunEngine: ******* Product: C:\cd nät40\..\SQLEXPR\Setup\SqlSupport.msi ******* Action: ******* CommandLine: **********
MSI (c) (C4:F4) [17:34:23:450]: Client-side and UI is none or basic: Running entire install on the server.
MSI (c) (C4:F4) [17:34:23:450]: Grabbed execution mutex.
MSI (c) (C4:F4) [17:34:23:591]: Cloaking enabled.
MSI (c) (C4:F4) [17:34:23:591]: Attempting to enable all disabled privileges before calling Install on Server
MSI (c) (C4:F4) [17:34:23:607]: Incrementing counter to disable shutdown. Counter after increment: 0
MSI (s) (00:10) [17:34:23:622]: Grabbed execution mutex.
MSI (s) (00:1C) [17:34:23:622]: Resetting cached policy values
MSI (s) (00:1C) [17:34:23:622]: Machine policy value 'Debug' is 0
MSI (s) (00:1C) [17:34:23:622]: ******* RunEngine: ******* Product: C:\cd nät40\..\SQLEXPR\Setup\SqlSupport.msi ******* Action: ******* CommandLine: **********
MSI (s) (00:1C) [17:34:23:622]: Machine policy value 'DisableUserInstalls' is 0
MSI (s) (00:1C) [17:34:23:669]: SRSetRestorePoint skipped for this transaction.
MSI (s) (00:1C) [17:34:23:669]: Note: 1: 1402 2: HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer 3: 2
MSI (s) (00:1C) [17:34:23:669]: Note: 1: 1324 2: .. 3: 1
MSI (s) (00:1C) [17:34:23:669]: MainEngineThread is returning 2
MSI (s) (00:10) [17:34:23:669]: No System Restore sequence number for this installation.
MSI (c) (C4:F4) [17:34:23:669]: Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1
MSI (c) (C4:F4) [17:34:23:669]: MainEngineThread is returning 2
=== Verbose logging stopped: 2007-03-13 17:34:23 ===
The funny thing here is that it works fine when the installer is run from a network share, i.e. \\testserver\setup\setup.exe, but when the same folder is burned to a CD or copied to a local folder it fails. I have verified that the folders are identical.
I'm not sure what info is required to pinpoint the problem, but here are some log files and a screenshot of the message.
Any help or pointers would be greatly appreciated.
It seems to me that files created on Unix machines with the line terminator \n, or chr(10), cannot be imported using the Bulk Insert statement. Is this a bug, or an oversight by Microsoft? Does this mean that unless one replaces every \n with \r\n, there is no way to use Bulk Insert to import Unix files? This is very strange behavior by MSSQL. Even lesser programs such as Excel and Word automatically recognize chr(10) as a line-termination character. Am I missing something, or is this just the way MSSQL is?
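For what it's worth, BULK INSERT can be pointed at a bare-LF file by smuggling the literal chr(10) into the ROWTERMINATOR through dynamic SQL. A sketch (table name, field terminator and path are placeholders):

DECLARE @sql VARCHAR(500)
SELECT @sql = 'BULK INSERT MyTable FROM ''C:\data\unixfile.txt'''
            + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + CHAR(10) + ''')'
EXEC (@sql)

The reason the plain statement fails is that the written escape '\n' is taken to mean the full Windows \r\n pair, whereas CHAR(10) passes the single line-feed byte through unchanged.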
I have an SSIS package which reads an Excel file and loads data into a table, using a Script Component (C#) as the source. The package runs without any errors when I run it manually on my machine and on the server, but it fails when run as a SQL Server Agent job.
I tried all the possible fixes I found on the web but still can't get it to work.
I get the following message when I execute a maintenance plan to delete files older than 1 day.
Error # -1073548784
Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'',N'',N'2007-09-30T07:56:09' " failed with the following error: "Error executing extended stored procedure: Invalid Parameter". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
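For comparison, a call that succeeds normally carries a real folder path and file extension instead of the empty N'' strings shown in the failing command. A sketch (xp_delete_file is undocumented; the folder and extension here are placeholders, the leading 0 selects backup files, and the date is the "delete if older than" cutoff):

EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', N'2007-09-30T07:56:09'

When a maintenance plan emits empty folder/extension arguments like the ones above, it usually means the cleanup task's folder or extension box was left blank in the designer.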
I have a simple SSIS package that reads a flat file and copies it into a SQL Server table.
When the flat file is on the C drive I have no problem running this package from SQL Server Agent, but as soon as I update the path to a network location the package only works when I run it manually, and it fails when executed via the SQL Server Agent job.
The error says "cannot open the datafile", while the datafile location is valid.
Is this a limitation of SQL Server Agent, such that only local files can be processed?
In an SSIS package I have defined a connection manager to a SQL Server 2005 instance. Now I am trying to query data in a Visual Studio for Applications script.
I tried the following:

Dim local_SQLConnectionManager As Microsoft.SqlServer.Dts.Runtime.ConnectionManager
Dim local_SQLConnection As New System.Data.SqlClient.SqlConnection
Dim local_SQLDataReader As System.Data.SqlClient.SqlDataReader
Dim local_SQLCommand As New System.Data.SqlClient.SqlCommand

' Reconstructed from the description below: the post omits the failing line itself,
' and the connection manager name is a placeholder.
' Note: this cast can only succeed for an ADO.NET connection manager; an OLE DB
' connection manager's AcquireConnection returns a native COM object.
local_SQLConnectionManager = Dts.Connections("MyConnectionManager")
local_SQLConnection = CType(local_SQLConnectionManager.AcquireConnection(Nothing), System.Data.SqlClient.SqlConnection)

local_SQLCommand.Connection = local_SQLConnection
local_SQLCommand.CommandText = "SELECT something"
local_SQLDataReader = local_SQLCommand.ExecuteReader()
If local_SQLDataReader.Read Then
    Dts.Variables("Just a variable").Value = local_SQLDataReader.GetString(0)
End If
local_SQLDataReader.Close()
It always stops on the line with the AcquireConnection call: Unable to cast COM object of type 'System.__ComObject' to class type 'System.Data.SqlClient.SqlConnection', and so on.
I have written some code to programmatically create an SSIS package. The code runs well if I just stick it in a windows form button event handler (e.g. OnClick). The code works well if I wrap it up in a DLL and call the DLL from a windows application. The code works well if I suck the DLL into a Windows Service and a windows client calls the service to create and run the SSIS package. In other words, the code always works when I call from a windows (Forms) application - regardless of how the code is hosted.
Now, if I take the code and stick it in the event handler of a Web Form button, it fails. If I use my DLL that wraps the code and call the DLL from a web form, it fails. If I make a remoting call to a windows service that then calls the DLL, it fails.
Note, in my windows service I have taken every possible measure to disassociate the call from the identity of the client. The windows service has a listener thread that places the arguments for the package creation on a queue. The listener thread then signals an event. A different thread then pulls the arguments off of the queue and creates the package.
Mind you, all of this works great if the client to the windows service is a windows application. In fact, as I said, the code always runs if it is called from a windows application (directly or via the windows service). But if the client to my windows service is an ASP.Net application, then the code always fails. I have no idea why the service would care who its client is - but it seems to. I printed out the identity of the user that owns the thread that the windows service uses to create and execute the package; it is not NETWORK SERVICE or ASPNET or any other such user. It is the windows service user, which happens to be the administrator account on my machine.
The package creation always fails in the same place. A call to AcquireConnections for an OleDBDestination fails with COM error 0xC020801C. I assumed this was some sort of permissions error - though with my windows service running as administrator and doing all of the work related to the package, I cannot really understand why.
So, going down the path of permissions, I gave every user on my machine administrative privileges on the machine and the database. That did not work. I then created a custom ASP.Net application pool so I could assign a new user. I gave that new user Administrator permissions, and it still does not work.
I am running Windows 2003, IIS6, SQL 2005, all 32bit, all on the same local machine which is not part of a domain.
This is truly magical. The code always works when called from windows and it never works when called from ASP.Net - even with a windows service in between.
I am totally stumped. Any help would be greatly appreciated.