Trying To Access A dBase III File With An SSIS Package
May 4, 2006
I am trying to get a handle on the new SQL Server Integration Services in SQL Server 2005. There is a legacy DTS package that I need to get working on our new server (using SQL Server 2005) but I can't seem to get it to work.
First, I migrated it and attempted to run the package. It gave me many errors, so I thought I would just rebuild it in Integration Services. Everything was going smoothly until I got to the real data transformation part.
I added a data flow task and it took me to the Data Flow tab. Then I added a DataReader Source. I am trying to read from a dBase III file and I assumed I would be using an ODBC driver for that. Well... it has been hell trying to get it to see the dBase III file and access it.
I have tried two solutions from the internet, this one and this one (by Wenyang), with no positive results.
No matter what I do I still get an error in the bottom of the Advanced Editor for DataReader Source that reads: "Error at Sales Transformation [DataReader Source [81]]: Cannot acquire a managed connection from the run-time connection manager."
Has anyone attempted anything like this before? Are there any SSIS experts around here that can drop me a few pearls of wisdom?
I have an SSIS package that has an "execute process task" that executes a batch file. The package has been deployed to the msdb database, and is called from a stored procedure using xp_cmdshell dtexec ...
I can execute the package just fine if I'm logged onto the server as a system administrator, by running the stored procedure from a query window.
However, if I log on to the server as a non-admin user, the package attempts to run, but breaks at the file system task, with "Access Denied". It can't run the batch file. It seems to be a permissions issue at the file system level.
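For reference, the call chain looks roughly like the sketch below; the server and package names are placeholders, not the actual ones. One thing worth noting: when a sysadmin calls xp_cmdshell it runs under the SQL Server service account, while non-sysadmin callers run under the xp_cmdshell proxy account, so whichever of those accounts applies needs file-system rights to the batch file and anything else the package touches.
-- Minimal sketch: run a package stored in msdb via xp_cmdshell (names are placeholders).
DECLARE @cmd varchar(1000);
SET @cmd = 'dtexec /SQL "MyPackage" /SERVER MYSERVER';
EXEC master.dbo.xp_cmdshell @cmd;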
I am developing a package to restore a database from a backup file on a remote server. I am having problems accessing the remote backup file when it is addressed via the admin share, in this case N$. It runs okay if a specific share is created, but for some unknown reason it fails via the admin share.
I am executing the package job with a proxy account that is a member of the local administrators group on the remote server.
It appears that access via a remote admin share isn't possible from within an SSIS package. Is this the case?
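For context, the statement behind the restore is presumably something along these lines (an illustration only; server, share, database, and file names are placeholders). Keep in mind that whichever Windows account actually touches the file (the proxy for file operations, or the SQL Server service account if the path is handed to a RESTORE statement) needs rights to that admin share, and admin shares only admit administrative accounts regardless of NTFS permissions.
-- Illustration only; all names are placeholders.
RESTORE DATABASE SalesDB
FROM DISK = '\\RemoteServer\N$\Backups\SalesDB.bak'
WITH REPLACE;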
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Data Source=D:\Account;Extended Properties=DBASE III;', TRANS.DBF)
and I got the following message:
OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Could not find installable ISAM.". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".
I have an MS SQL database that I use MS Access to add data to and also to query.
My question is how can I set up the MS Access database so that it will not allow duplicate entries to be added to the MS SQL database? I know how I could do it using PHP via a webpage, but I need to be able to do this using the MS Access program directly, since that's what we use to add to and search the database with.
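One server-side approach, sketched below with hypothetical table and column names, is to put the duplicate check in SQL Server itself: a unique constraint (or unique index) rejects duplicate rows no matter which front end does the insert, and Access will simply surface the violation as an error when a duplicate is entered.
-- Hypothetical names: enforce uniqueness on the SQL Server table so Access (or PHP, or anything
-- else) cannot insert a duplicate value into CustomerCode.
ALTER TABLE dbo.Customers
ADD CONSTRAINT UQ_Customers_CustomerCode UNIQUE (CustomerCode);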
I created several SSIS packages on a 32-bit system, using ODBC to access the dBase source file. Finally, after setting up a proxy, etc., we got the package to run and import data. We then needed to move the packages to a 64-bit system. So, we built our manifests, and installed all of our packages. We are using several dtsConfig files for our packages, and in there we specify the DSN that we will be accessing. I understand that MS does not support ODBC drivers in a 64-bit world (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=85703&SiteID=1). Has anyone been faced with this issue? How did you get past it? I accessed the 32-bit ODBC drivers on the 64-bit system, and I set up a DSN that points to the dBase file, but when I run the job, the log file has an error that the DSN was not found. However, when I run the package itself, it runs as expected...
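One workaround that is often suggested for this situation (a sketch only; the path and package file below are placeholders) is to make the job step call the 32-bit dtexec explicitly, so the package runs in a 32-bit process that can see the 32-bit ODBC DSN instead of being resolved by the 64-bit Agent:
"C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\dtexec.exe" /FILE "C:\Packages\ImportDbf.dtsx"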
I am trying to create and later read a data file from a package deployed in SSISDB; it creates the file successfully, but it does not read it back. The same package runs successfully when executed from the file system. Also, generating the ispac and deploying it to SSISDB runs for an infinite time. Is it a permission issue?
I am having trouble connecting to a .dbf file in dBase II format. I can connect to dBase III and dBase IV files in an SSIS package, but I can't connect to dBase II. I hope you can help!
I am somewhat new to SSIS, so please forgive my confusion.
I have a situation and I have two possible scenarios that will work, but I'm not sure if either of them is feasible. I have a database that is updated using about 8 text files extracted from the company's main software package. I have set up an SSIS package that takes these 8 files and updates the appropriate tables. This package works pretty well. My question surrounds the automation of the extraction of the files from our software. I am able to do the extraction in one of two ways: a) manually opening up an instance of the tenet program used to run the program with a script built into it, or b) using VBA in an Access application to open up the program to run the built-in script. In the past, when the platform for the database was Access, I ran the extraction and database update from the same VBA module, but now I'm not sure how to do it.
Is it possible to either:
A) Initiate the run of an SSIS package from Microsoft Access (see the sketch after this list), or
B) Somehow open up the program that runs the company's software using SSIS. And this brings up another question: if this is a step in an SSIS package, will the package wait to move to the next step until the entire extraction process is complete?
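On option A, one possibility (a sketch only; the job name is hypothetical) is to wrap the package in a SQL Server Agent job and start that job from Access through a pass-through query or an ADO command:
-- Starts the Agent job that runs the SSIS package. Note that sp_start_job returns as soon as the
-- job is queued, so the caller does not wait for the package (or the extraction) to finish.
EXEC msdb.dbo.sp_start_job @job_name = N'Nightly extraction import';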
Hi,
SQL Server 2000 SP3
Windows 2000 Server SP4
I have a DTS package that imports data from a dBase IV database with files located in two folders (dBF1 and dBF2). I use a transform data task to transform the data. They were running properly, but last week we installed W2K SP4, and now the transform tasks for files from dBF2 are not working properly. I have two transform tasks to extract data from files in the dBF2 folder. If I double click to open the transform data task of either of them, Enterprise Manager crashes with the error:
mmc.exe Application Error: The instruction "xxxx" referenced memory at "xxx". The memory could not be read.
Although the transform task for one of the files will run, the other will not run, giving the message:
Error Source: DTS package
Catastrophic failure
Also, I have an Access database that has links to the same dBase files. For files from the dBF2 folder, I'm able to see the data from one of the files, but if I double click to see the data from the other, Access crashes with no specific error message. Nothing has changed on the dBase related files (permission wise). The transform tasks to extract data from files in the other dBase folder (dBF1) are working fine, and data is accessible from Access. Any advice on how to tackle this one?
Has anyone had any experience using the openrowset function to access a dbase IV file? I can access the file using ADO from within VB but not getting anywhere using openrowset. If anyone knows the syntax or has an example it would be greatly appreciated.
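For what it's worth, one commonly cited form of the syntax is sketched below; the folder and table (file) names are placeholders, and it assumes the 32-bit Jet 4.0 provider is available and ad hoc distributed queries are enabled, so treat it as a starting point rather than a guaranteed recipe.
-- Sketch only; the folder holding the .dbf files acts as the "database" and the file name
-- (without extension) is the table.
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'dBase IV;Database=C:\Data\DbfFolder',
                'SELECT * FROM CUSTOMER');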
and was successful at importing ".dbf" files from a dBase 7.0 folder.
However, this import was quite slow: approx. 2000 rows per second. Some of the tables I must import have over 8,000,000 rows, which would then take 4,000 seconds, or over one hour. This import rate seems unaffected by whether or not I put a "Copy Column" Data Flow Transformation task between the "OLE DB Source" and "OLE DB Destination" objects.
In SQL Server 2000 it was much faster to import such large tables. The 8 million row table takes only about 13 minutes (800 seconds) with SQL Server 2000 -- giving a transfer rate approximately 5 times as fast.
Is there any way to speed up this process in SQL Server 2005? (In SQL Server 2000 there was a "Use fast load" option on the "Options" tab of the "Transform Data Task Properties." I have not yet found any similar option for SQL Server 2005.)
My receiving database is using the "Simple" model, so there is no need to create transaction entries for loading these large tables. There are also no indexes on these tables: the data are being loaded into new tables created in SSIS.
Due to some legacy requirements, I'm using an OLEDB connection with the Jet driver and creating dBASE III tables (.DBFs) as the destination for a data export. The source is SQL Server 2000. The SQL table has a bunch of varchar columns, and I can't send them directly to the .DBF because SSIS complains that it can't (implicitly) convert from non-Unicode to Unicode. UNICODE?! dBASE III is ancient. Why would Jet/SSIS assume that these .DBFs are Unicode? As it stands, I have to put a data conversion task between the source and destination and convert all the columns. It's a real pain.
And no, I don't want to make all my SQL Server columns nvarchar. Is there any setting I can put in the connection manager or the connection string to prevent this error? I've already set AlwaysUseDefaultCodePage to True on the OLEDB destination component and the default code page is 1252. That didn't work; I still got the can't convert error.
Response.Write("The following exception occurred: " & ex.Message)
End Try
End If
End Sub
I've tried both "sql" and "dts" with all the various permutations that are listed (and commented out) above. I either get an error that says: The top level folder "VisBridges" is not found. ("dts" "VisBridges") -- or -- Invalid package name or location: MSDB\VisBridges\ExportTime ("dts" "MSDB\VisBridges") -- or -- Cannot find folder "MSDB\VisBridges\ExportTime" ("sql" "MSDB\VisBridges") -- or -- The specified package could not be loaded from the SQL Server database ("sql" "VisBridges")
Dear friends, I store several configurations in the main database for my SSIS packages. I need to get the server name from an XML or txt file in order to retrieve those configurations stored in my database. What do you think is the best way to do that? Using a Flat File Source to read the file and a script to save the value into an SSIS variable? I can't do it with package configurations... maybe I just don't know how, but I can save the SSIS variable into the configuration file, whereas what I need is the inverse: read the configuration file and save the value into the SSIS variable. What is the best way you would suggest? Regards!! Thanks.
Hi all, I have created an SSIS package to export rows of data from SQL to Access. The package is executed from an ASP.NET web application. Below is what I want to achieve:
- The user enters a date range.
- The SSIS package exports data between the date range from SQL to the Access database.
- When the user enters another date range, I want to clear the contents of the Access database first (I'm using an Execute SQL Task with DELETE tablename).
The problem is that when I look at the table after the second user request, the fields show #deleted. Only after I click refresh will the new data appear. How can I make the data appear without manually refreshing the Access table? Thanks a lot.
My current project requires me to both rename the MDB file for an Access database and rename the table it contains. The Access files come in with random names, each containing one table with a specific name. Based on the table name it contains, I rename both the file and the interior table to a standard name which a later package in the process references.
A foreach container loops through all the mdb files in the applicable directory, containing a script task and a file system task. The script task uses GetOleDbSchemaTable to extract the table name, then loops through an array of table names from the client's configuration, comparing it to a similar array of constant names and getting the matching one. The file system task then uses that found name (or the original table name if a conversion is not found) to rename the file to match that standard name. So far, so good.
Now I have to rename the table within the file as well. All of the examples of code I'm finding on the 'net reference ADOX, but I haven't been able to figure out how to use that in a script task, assuming that's what I want to do in the first place.
Anyone have any experience with doing things like this?
If I create a job with an OS step with the text below
"c:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\"dtexec /DTS "MSDBew Import" /SERVER HAYDN /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING V
I keep getting the following error:
Message Executed as user: XYZ\Administrator. The process could not be created for step 1 of job 0x2161C39C2C34C54AA850602A482E82DF (reason: Access is denied). The step failed.
HOWEVER...
If I put the above text into a batch file (foo.cmd) and change the step to execute "c:\foo.cmd", it works fine.
What am I missing here? It seems weird that it works one way and not the other.
I need to run a make-table query against an Access database out of an SSIS package. I tried to do this with an OLE DB Command Task but it fails to create the table even though the task execution comes back successful. Any thoughts???
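For what it's worth, a make-table query in Jet SQL is just SELECT ... INTO, and the OLE DB Command transformation runs once per data-flow row, so an Execute SQL Task in the control flow pointed at the Access (Jet) connection manager is usually a better fit for this kind of statement. A sketch with hypothetical table and column names:
-- Creates a new table in the Access database from an existing one; it fails if ArchivedOrders
-- already exists, so drop or rename the old table first if the package reruns.
SELECT CustomerID, OrderDate
INTO ArchivedOrders
FROM Orders;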
I'm creating an SSIS package in the designer view of SQL Server BI Dev. Studio (SQL Server 2005).
I need to import a whole table from MS Access into my local SQL Server. (This task will be performed weekly, so once it's working I'll schedule a job for it.)
I've created a 'FILE' connection to MS Access in the 'Connection Managers'.
When I'm on the 'Data Flow' tab I can't find a Data Flow Item to use as a MS Access connection. (available on the 'Data Flow Sources' are only: DataReader, Excel, Flat File, OLE DB, Raw File and XML Sources)
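For comparison, the data flow's OLE DB Source expects an OLE DB connection manager rather than a FILE connection; for an .mdb file that normally means the Jet 4.0 provider, with a connection string along these lines (the path is a placeholder):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MyDatabase.mdb;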
Hello, For packages where an MS Access database is the destination, what are some ways to detect whether or not the file is 'in use'? I know there is a lock file associated with Access databases. Could it be as simple as using something from the FileSystem Object to discover whether or not there is a lock file with the same name as the destination database? I'd like to stop the package if the file is in use.
I saw this post by dterrie in the Wishlist thread and I just wanted to second it:
"How about bringing back a simple dBase import. The SSIS guys are clearly FAR out of touch with reality if they think people who handle data no longer need to work with dbf files. I've seen a lot of dumb stuff in my day, but this is just sheer brilliance. I just love the advice of first importing into Access and then importing the Access table. Gee, why didn't I think of such a convenient solution. I could have had a V-8."
I've been struggling with this the last couple days and finally decided to import the dBase III file into Access and then import that into SQL Server 2005. Imagine my surprise when I discovered this was the current recommended method.
That's just ridiculous. Can someone tell me why they would reduce some of the functionality of SQL Server from 2000 to 2005? This was a very easy process in SQL Server 2000...
S: Running SQL Server Express
V: Running SSIS package in VS.Net
F: Shared folder hosting Excel files
And an openrowset SQL statement: select * from openrowset(..... \Fexcel.xls....). This statement can be run in SS management studio connecting to S using my Windows logon(integration security) without any problem.
However, the same SQL running inside SSIS package (integration security using my Windows account) get the following error:
Error: 0x0 at Check headers: OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "The Microsoft Jet database engine cannot open the file '\\F\excel.xls'. It is already opened exclusively by another user, or you need permission to view its data.".
Error: 0xC002F210 at Check headers, Execute SQL Task: Executing the query "....openrowset....." failed with the following error: "Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
(My Windows account is an administrator of Windows and a sysadmin of SQL Server Express on S.)
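For reference, the general shape of the Jet/Excel OPENROWSET call under discussion is sketched below; the UNC path, workbook, and sheet names are placeholders rather than the poster's actual values.
-- Sketch only: reading an Excel sheet over a UNC path through the Jet provider.
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=\\FileServer\Share\Book1.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');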
I am trying to execute an SSIS package from an MS Access 2003 database that imports a table from the Access database into a target table in SQL 2005. I saved the package in SQL 2005 and tested it out. If I run it from the Management Studio Console with Run->Execute ... everything works just fine. However, if I try to run it using "Exec master.dbo.xp_cmdshell 'DTExec /SER DATAFORCE /DTS SQL2005TestPackage /CHECKPOINTING OFF /REPORTING V'" the execution will always fail when the Access database is open (shared mode). It will only work when the Access database is not open. The connection manager looks like this: "Data Source=E:\Test.mdb;Provider=Microsoft.Jet.OLEDB.4.0;Persist Security Info=False;Jet OLEDB:Global Bulk Transactions=1". The error is listed below:
Code: 0xC0202009 Source: NewPackage Connection manager "SourceConnectionOLEDB" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not use ''; file already in use.".
I need to convert an Access 2000 database to Access 2013 and then load the data into a SQL Server 2012 database. So I am looking for any URLs (links) that will show me how to accomplish the following in an SSIS package:
1. Convert an Access 2000 database to an Access 2013 database?
2. Load the converted Access 2013 database into SQL Server 2012?
I have an SSIS package which identifies duplicate records in an Access database. I have staged the Access database into SQL Server and created the SSIS package. Now, I have a final list of records which need to be deleted from the Access database and new records which are to be inserted into the Access database.
What do I need to do if I want to delete those duplicate records directly from the Access database using SSIS? I cannot truncate the whole Access database and reload. I just have to delete duplicate rows from the Access db and add new records.
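One way to do targeted deletes without truncating (a sketch only; table and key names are hypothetical) is an Execute SQL Task, or an OLE DB Command in the data flow, that uses the Access (Jet OLE DB) connection manager and a parameterized DELETE fed by the duplicate keys identified in the staging table:
-- The ? is the OLE DB parameter marker; map it to the column or variable holding the duplicate key.
DELETE FROM Customers WHERE CustomerID = ?;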
Ok everybody. I am new to SQL. I have an MS SQL staging database that pulls data from a MySQL database. Then once a day I run an SSIS package that moves the data to a live database and also creates a flat file that is posted to an FTP site, then truncates the table. One problem I am running into is that if the MS SQL staging database has no records, the flat file is still created. How do I stop it?
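One way to skip the file when staging is empty (a sketch; the table and variable names are hypothetical) is to count the rows first and gate the export with a precedence-constraint expression:
-- Run in an Execute SQL Task with ResultSet set to "Single row", mapping the value to an SSIS
-- variable such as User::StagingRowCount; then put the expression @[User::StagingRowCount] > 0
-- on the precedence constraint leading into the flat-file export step.
SELECT COUNT(*) AS StagingRowCount FROM dbo.Staging;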
I want to unzip a file through "Execute Process Task".
For compressing (zipping) a file I write the following.
In the Process tab:
In Executable: C:\Program Files\WinZip\WINZIP32.EXE
In Arguments: -min -a "C:\file.zip" "C:\file.mdb"
What arguments should I write to unzip a file? And in case I want to copy a file from one location to another through the Execute Process Task, what should I do?
Is there a way to unzip files in a SQL Server 2005 SSIS package? I know I can do it using WinZip and executing the procedure master..xp_cmdshell, but that is not what I am preferring at the moment. Any direction in this regard will be much appreciated.