DTS Works, Job Fails! (Data Source FoxPro and Destination SQL Server 2000)
Sep 26, 2006
The DTS package works perfectly when I run it manually. However, when I run it
as a job it fails. Before you ask whether I'm running it under a different
security context: I have already made sure of that. I was logged into
the server through a remote viewer when I created and ran the package,
as well as when I scheduled the job, so the accounts they run under
are consistent. They are the same account the SQL Agent runs
under, and it's the sysadmin account.
The data source is a FoxPro database, and the package pulls two tables. I am
using the Microsoft OLE DB FoxPro driver as my source connection and an OLE DB
connection for SQL Server as my destination. I am doing a simple table-to-table
transformation. The path of my connection is a mapped drive:
E:\Main. There are other packages and jobs within my job queue that
point to the same database, and they seem to run fine using the above
mapped drive. The ONLY difference between this package and the other
packages is that they are a few months old and this one was created last
night. I have also enabled logging on this package, and here is the
error from the log when the job fails:
Package Steps execution information:
Step 'DTSStep_DTSDataPumpTask_1' failed
Step Error Source: Microsoft OLE DB Provider for Visual FoxPro
Step Error Description:Invalid path or file name.
Step Error code: 80040E21
Step Error Help File:
Step Error Help Context ID:0
Step Execution Started: 9/23/2006 11:39:17 AM
Step Execution Completed: 9/23/2006 11:39:17 AM
Total Step Execution Time: 0.031 seconds
Progress count in Step: 0
An SSIS package that transfers data from a DB instance on SQL Server 2005 to SQL Server 2000 is extremely slow. The package uses an OLE DB Source to an OLE DB Destination, and the transfer is basically one table from SQL Server 2005 to SQL Server 2000. The job takes 5 minutes to transfer about 400 rows at night when there is very little activity on the server. During the day the job almost always times out.
On SQL Server 2000 instances the old 2000 package ran in minutes.
Is there an alternative to this? The Transfer Objects task does not work, as there is apparently a defect according to Microsoft. Please let me know if there is any option other than using an Execute DTS 2000 Package task or an ActiveX Script to read records from the source and insert them into the destination, which I am not certain how long it would take or how viable it would be.
Hi all,
I am fairly new to using triggers and was seeking some help from those that have experience with them. I am looking to transfer data from a SQL 2000 database to a Visual FoxPro database on another computer. I would like to transfer about three fields of data to a VFP table each time an insert is made on the SQL table. I am somewhat familiar with the structure of creating the trigger, but here is what I would like help with: selecting the SQL data to transfer, connecting to the VFP database, and inserting the SQL data into the VFP table.
CREATE TRIGGER [xyz] ON [dbo].[AAA]
FOR INSERT
??? Select a,b,c from SQL table
??? Connect to VFP Database and Table
??? Insert into VFP table Values a,b,c
Any information, tips, or even an example trigger procedure would help and be greatly appreciated.
Thank you,
Brett
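For what it's worth, here is a minimal sketch of the shape such a trigger could take. It assumes a linked server named VFP_LINK has already been created with the Visual FoxPro OLE DB provider, and vfp_table with columns a, b, c stands in for the real VFP table; whether an insert through OPENQUERY works depends on the provider supporting updates.

CREATE TRIGGER trg_AAA_CopyToVFP ON dbo.AAA
FOR INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Push the three columns of every newly inserted row to the VFP table.
    -- VFP_LINK is a hypothetical linked server; vfp_table and a, b, c are placeholders.
    INSERT INTO OPENQUERY(VFP_LINK, 'SELECT a, b, c FROM vfp_table')
    SELECT i.a, i.b, i.c
    FROM inserted AS i;
END

Keep in mind that a distributed insert inside a trigger makes every insert on AAA depend on the VFP machine being reachable; some people instead queue the rows in a local table and push them across on a schedule.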
I created an updateable partitioned view of a very large table. Now I get an error when I attempt to declare a CURSOR that SELECTs from the view with a FOR UPDATE argument in the declaration.
The error generated is:
Server: Msg 16957, Level 16, State 4, Line 3
FOR UPDATE cannot be specified on a READ ONLY cursor
Here is the cursor declaration:
declare some_cursor CURSOR
for
select *
from part_view
FOR UPDATE
Any ideas, guys? Thanks in advance for knocking your head against this one.
PS: Since I tested the updateability of the view, there are no issues with primary keys, uniqueness, or missing indexes. Also, unfortunately, the dreaded cursor is required, so set-based alternatives are not an option - it's from within PeopleSoft.
I created a data model for Report Builder. Since it won't let me use views, I tried to put all the tables I'm using in one of my already-made views into a data source view, to mimic my SQL view. But even when I connect everything the same way, my report model still doesn't narrow my search. I only want to see breed information, but since I have a Breeder table and a Customer table, it's showing me all customers instead of limiting it to the ones that are breeder customers.
Here are the tables:
FROM dbo.Tbl_Customer
INNER JOIN dbo.Tbl_Customer_Breeder ON dbo.Tbl_Customer.Customer_Code = dbo.Tbl_Customer_Breeder.Customer_Code
INNER JOIN dbo.Tbl_Customer_Detail ON dbo.Tbl_Customer.Customer_Code = dbo.Tbl_Customer_Detail.Customer_Code
INNER JOIN dbo.Tbl_Customer_Category ON dbo.Tbl_Customer.Customer_Category_Id = dbo.Tbl_Customer_Category.Customer_Category_Id
INNER JOIN dbo.Tbl_Customer_Classification ON dbo.Tbl_Customer.Customer_Classification_Id = dbo.Tbl_Customer_Classification.Customer_Classification_Id
INNER JOIN dbo.Tbl_Customer_In_Breed ON dbo.Tbl_Customer.Customer_Code = dbo.Tbl_Customer_In_Breed.Customer_Code
INNER JOIN dbo.Tbl_Breed ON dbo.Tbl_Customer_In_Breed.Breed_Id = dbo.Tbl_Breed.Breed_Id
What am I doing wrong, and are there any suggestions?
I need to use a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab in the Choose Toolbox Items pane for adding the data source/data destination.
I want to process data from a source table to a destination table without using a cursor.
At this point we are creating a temp table and inserting data from the source into the temp table. Once we get the data into the temp table, we use a while loop to process the records one by one into the destination table.
While executing the stored procedure, we noticed that there are a few records in the source table that are invalid, and the stored procedure terminates on such records.
Is there a better approach to log the INVALID data and resume processing with the next record instead of terminating?
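One hedged set-based sketch follows; the table names (SourceTable, DestTable, InvalidRowLog) and the ISNUMERIC rule are placeholders for whatever actually makes a record invalid here. The idea is to log the failing rows first and then insert only the rows that pass, so a single bad record no longer terminates the procedure.

-- Log rows that fail validation (rule is a stand-in; replace with the real checks).
INSERT INTO dbo.InvalidRowLog (SourceId, Reason, LoggedAt)
SELECT s.SourceId, 'Amount is not numeric', GETDATE()
FROM dbo.SourceTable AS s
WHERE ISNUMERIC(s.Amount) = 0;

-- Insert only the rows that pass the same rule.
INSERT INTO dbo.DestTable (SourceId, Amount)
SELECT s.SourceId, CAST(s.Amount AS decimal(18, 2))
FROM dbo.SourceTable AS s
WHERE ISNUMERIC(s.Amount) = 1;

Note that ISNUMERIC is only illustrative (it accepts values like '$' that still fail a decimal cast); the point is the pattern of splitting valid and invalid rows in two set-based statements.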
I have read only permission in the source (OLTP) database. The source database is running in SQL Server 2000 and the size is more than 200 GB. I need to pull data from source and load target which is running in SQL server 2005.
Following are the objectives I want to achieve.
Data should be loaded on incremental basis.
Whatever changes take place (Update/Delete) in source, that should be replicated to already uploaded data.
Here I want to mention that the source database does not have any identification key or timestamp column like Updated_Date by which I can filter the data that was recently inserted or updated in the source and upload just that. The source does not maintain any history data either, so I have no track of deleted records.
I don't have any scope to change the schema in the source. In this scenario, can anybody suggest the best approach to achieve the above objectives?
Can I retrieve only the recently updated or inserted data from the transaction log backup? Can log shipping provide a solution?
One more question. Say I have a table and I am exporting/importing all the data from/to my target table using SSIS or DTS. In this scenario, does using a query versus using the table directly affect performance?
I need to write a T-SQL script that will insert, update, and delete rows in a production DB from new, updated, and deleted data in a staging DB. They are on different servers and I can only use T-SQL. Basically, the production DB needs to be synchronized with the staging DB.
I was thinking of using dynamic SQL, cursors, and linked servers. Any ideas?
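A hedged sketch of the three statements, assuming a linked server named STAGESRV pointing at the staging server, a shared key column Id, and hypothetical table names StageDB.dbo.Items (staging) and dbo.Items (production):

-- Delete production rows that no longer exist in staging.
DELETE p
FROM dbo.Items AS p
WHERE NOT EXISTS (SELECT 1 FROM STAGESRV.StageDB.dbo.Items AS s WHERE s.Id = p.Id);

-- Update production rows whose values changed in staging.
UPDATE p
SET p.Name = s.Name,
    p.Amount = s.Amount
FROM dbo.Items AS p
JOIN STAGESRV.StageDB.dbo.Items AS s ON s.Id = p.Id
WHERE p.Name <> s.Name OR p.Amount <> s.Amount;

-- Insert staging rows that are not yet in production.
INSERT INTO dbo.Items (Id, Name, Amount)
SELECT s.Id, s.Name, s.Amount
FROM STAGESRV.StageDB.dbo.Items AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Items AS p WHERE p.Id = s.Id);

Dynamic SQL and cursors shouldn't be needed for the data movement itself; copying the staging table into a local working table first and running the three statements against that copy usually performs better than hitting the linked server three times.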
I want to compare the source and destination data. My source is an OLE DB source and my destination is a SQL Server destination. I want to compare the data in the destination with the source and also add validation like this: if any row is in the destination but not in the source, delete that row; if any row is in the source but not in the destination, insert that row; and compare source with destination on the primary key, and if there is a change in any column of that row, update it, otherwise move on to the next record. Please tell me what the suitable object is and what to do. Please help me out. Thanks, Sandeep
I am new to SSIS, can you help me out with this? I have a source (flat file) and a target on SQL Server 2005. The source has 2 columns, Col1 and Col2, and the target also has 2 columns, with Col1 being the PK. I have created a package that checks the target: if the record exists it updates it, and if it does not it inserts it. My package looks like this:
Source - Lookup on target - Conditional Split - Derived Column, and then 2 OLE DB Destinations, 1 for inserts and 1 for updates.
I have created a relationship in the Lookup with Col1 from the source and Col1 from the target, with Col2 as the lookup column, and connected the red output to one OLE DB Destination for inserts, with rows redirected to it. The green output I have taken to the Conditional Split with a condition like: Col2 from the source is not equal to the lookup Col2. When I run the package like this, it inserts new records but does not update existing ones.
I tried to add a Derived Column after the Conditional Split, but I am stuck on writing an expression that updates the Col2 records if they changed at the source. Can you help me out with this scenario? I would appreciate it if you could get back to me ASAP.
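One common pattern for the update path (offered here only as a sketch, not confirmed against this exact package) is to replace the second OLE DB Destination with an OLE DB Command transform running a parameterized UPDATE, rather than a Derived Column. With dbo.TargetTable as a hypothetical target name:

UPDATE dbo.TargetTable
SET Col2 = ?        -- first ? maps to the pipeline's Col2 column in the component's column mappings
WHERE Col1 = ?;     -- second ? maps to the pipeline's Col1 column

The rows coming out of the Conditional Split (Col2 changed) would flow into that transform, so each changed row issues one UPDATE against the target.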
I am a complete newbie to SSIS. I can create a simple package to transfer data between SQL instances, and that's about it.
I have TableA (source data) and TableB (destination data). TableA has 4 columns and TableB has 5. I want to transfer all of the columns from TableA into TableB, but the 5th column in TableB needs to be populated with the server instance name of the server TableA sits on. Do I need multiple data sources to achieve this? I have tried, but no matter how I set it up, the column in the destination is set to Ignore.
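One way to do this without a second data source (a sketch only, with hypothetical column names) is to add the fifth column in the source query itself, since @@SERVERNAME evaluates on the instance the source query runs against:

SELECT Col1, Col2, Col3, Col4,
       CAST(@@SERVERNAME AS nvarchar(128)) AS SourceInstance  -- instance TableA lives on
FROM dbo.TableA;

The extra column then appears as an ordinary pipeline column and can be mapped to the 5th column of TableB instead of being left on Ignore.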
Inside a data flow task, I have an OLE DB source and destination. In my situation, I need to pull data from a table in the source but also hard-code some columns myself, which means my source is a blend of data from the table and hard-coded data, which will then have to be mapped to columns in the OLE DB destination. Does anyone know which option to choose in the OLE DB source dropdown for the data access mode? Keep in mind I do need to run a SELECT query as well as get data from a table. Is it possible to use multiple OLE DB sources and connect them to one destination? Because that is really what I intend to do here. I am not sure how it will work, or even if it's possible. Basically my source access mode needs to be a blend of SQL command and table columns; how would that be implemented? Any help or advice is appreciated.
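If the hard-coded values are the same for every row, one approach (a sketch only, with hypothetical table and column names) is to choose the "SQL command" data access mode and put the literals straight into the SELECT, so a single OLE DB source feeds the destination:

SELECT t.CustomerId,           -- columns pulled from the table
       t.OrderDate,
       'WEB' AS ChannelCode,   -- hard-coded values become ordinary pipeline columns
       0     AS IsReprocessed
FROM dbo.SourceTable AS t;

Alternatively, a Derived Column transform placed after the source can add the constant columns, which keeps the source as a plain table selection.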
I'm trying to build a DTSX package that FTP's an XML file to the local file system and then imports it into an existing table.
My "Data Flow" for the package starts with an XML Source component and then goes to Data Conversion component and then to an OLE DB Destination component.
It all executes without error and seems to work fine, but when I look at the data in the table after I've run the package, it has inserted the appropriate number of rows from the XML file but all of the column values are NULL.
All of the data in the XML file is surrounded by <![CDATA[ ]]> and I discovered that if I remove the CDATA wrapper by hand then it inserts the data properly. The only problem is that I'm not in a position to have the data provider remove the CDATA tags and some of the data in the XML file needs the CDATA wrapper or else it will not validate.
Anyone know of anything I can do to get the CDATA to import properly?
I have an ASP.NET webform.
This connection works: "Server=localhost;uid=sa;pwd=;database=pubs"
but this connection DOES NOT work: "Server=dnrsqlt1;uid=sa;pwd=;database=pubs"
dnrsqlt1 is a SQL server on my network. Do I have to do something to the ASPNET or IUSER_Machinename users on the remote machine?
I have a new problem when I import data from an XML source file into two destination tables. The two tables are linked by a foreign key; for example:
table MOTHER (MOTHER_ID, MOTHER_NAME)
table CHILD (CHILD_ID, MOTHER_ID, MOTHER_NAME)
After a lot of transformations, data is inserted into the MOTHER table, and I want to insert other fields of the data flow into the CHILD table. To do this, I need the MOTHER_ID field that is auto-incremented in the MOTHER table.
My problem is chaining the insertion into the CHILD table after the insertion into the MOTHER table, to be sure that the related row in the MOTHER table has really been inserted. I haven't found any way to chain another transformation task after my flow destination "Insert into MOTHER table".
The only solution I have found is to create a new data flow to insert data into the CHILD table, using a Lookup transformation to bind to the MOTHER table... But with this solution all my data flow transformations are performed twice...
Is there a solution to chain two insertions with a foreign key constraint in a data flow?
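As a point of comparison, here is a hedged T-SQL sketch of the same chaining done outside the data flow: land the rows in a staging table (dbo.StagingRows is hypothetical), insert the mothers first, then resolve the generated MOTHER_ID by joining back on an assumed unique business key (MOTHER_NAME here). This is essentially what the Lookup does, without running the upstream transforms twice.

-- Insert mothers that don't exist yet (MOTHER_ID is the identity column).
INSERT INTO dbo.MOTHER (MOTHER_NAME)
SELECT DISTINCT s.MOTHER_NAME
FROM dbo.StagingRows AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.MOTHER AS m WHERE m.MOTHER_NAME = s.MOTHER_NAME);

-- Insert children, picking up the generated MOTHER_ID via the business key.
INSERT INTO dbo.CHILD (MOTHER_ID, MOTHER_NAME)
SELECT m.MOTHER_ID, s.MOTHER_NAME
FROM dbo.StagingRows AS s
JOIN dbo.MOTHER AS m ON m.MOTHER_NAME = s.MOTHER_NAME;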
I would like to replace the data of some tables in a DEV database with data from STG daily using an SSIS package. Should I use the "Transfer SQL Server Objects Task" to do that? Thanks.
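The Transfer SQL Server Objects Task can copy data as well as definitions, but when only the data needs refreshing, an Execute SQL Task running a truncate-and-reload is a common alternative. A hedged sketch, assuming STG and DEV are databases on the same instance and dbo.Orders is a hypothetical table with an identical schema in both:

-- Clear the DEV copy, then reload it from STG.
TRUNCATE TABLE DEV.dbo.Orders;

INSERT INTO DEV.dbo.Orders (OrderId, CustomerId, Amount)
SELECT OrderId, CustomerId, Amount
FROM STG.dbo.Orders;

If the databases are on different servers, the same statements work through a linked server, or a simple data flow per table does the job.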
I am unsure if this is the correct forum to send this question, but I can't seem to find any information regarding this problem. If this is the wrong place, please direct me to the correct spot.
I am attempting to import data from a free-table FoxPro database into SQL 2000 using a DTS package that has worked correctly every day for the past 2 years. Yesterday, I got an error.
The package has around 10 tables that it deletes, re-creates, and populates with data from FoxPro. All of the tables except one work correctly.
When I try to do an explicit import using the ODBC connection to populate that one table, I get the following error: Context: Error calling Openrowset on the provider.
I created an access database on my local computer and setup an ODBC connection and link tables to the database to see if it would work, and it did. So I thought there might be something wrong with the ODBC data source on the SQL Server, so I deleted it and created a new one, used it and I receive the same error.
I thank you in advance for any assistance or direction you can provide me for finding an answer.
I am extracting data from an Oracle database and have installed the latest x64 ODAC. When I run the 64-bit dtexec in cmd, the package runs fine with no errors, but when I create and execute an SSIS job in the Agent, it fails. I've tried creating the job using CmdExec as well, and I get the same errors:
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 64-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 11:12:43 AM
Error: 2007-08-15 11:12:45.63 Code: 0xC0202009 Source: Load Dimension Data Connection manager "ora" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x800703E6. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x800703E6 Description: "Invalid access to memory location.". End Error
Error: 2007-08-15 11:12:45.65 Code: 0xC020801C Source: CopyCustomers CustomerSource [1] Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "ora" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. End Error
Error: 2007-08-15 11:12:45.65
The CmdExec job step uses the _exact_ same command as the one I execute successfully in cmd: "C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe" /FILE "C:\Packages\Load Dimension Data.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW
I have managed to find two work arounds, but they are quite ugly:
1) Schedule the package execution using Windows Task Scheduler, basically creating a bat file which runs the package.
2) Schedule the package in SQL Server Agent, but as a T-SQL script using xp_cmdshell, i.e. EXEC xp_cmdshell 'dtexec /FILE "C:\Packages\Load Dimension Data.dtsx" /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING EW'
Both work and use the 64-bit dtexec. Not that elegant, though.
CmdExec with the 32-bit version of dtexec works, but that is not good enough for me, since we bought new hardware and 64-bit software just to be able to address more memory. Has anyone managed to execute an SSIS package successfully using the Agent and the 64-bit OraOLEDB.Oracle.1?
Any help would be greatly appreciated. Thanks!
Btw, I am using Windows Server 2003 x64 and SQL Server 2005 with SP2.
I have two calls to stored procedures in an SSIS package that fail silently. They are simply not executed in production but work fine in test; nothing happens, and the SQL Server Agent reports that everything went just fine.
In test they have 1 server with db A and B. No issue here.
In prod they have 2 servers with DBs A and B. On server 1, SQL Server Agent executes a job that includes an SSIS package that runs a couple of SPs on server 2. That user is db owner of DB B on server 2, and yet nothing happens. The SPs are not executed.
If I run the job manually in prod, then it works, but not when it is run with the SQL Server Agent account, which, as I said, is even db owner.
I have a package that does a simple export from an Excel sheet to a table. I used a Data Flow task with Excel Source and OLE DB Destination components, and I created package configurations for the source and destination components. After that, when I execute the package I get the following error.
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\LPL_Config2.dtsConfig".
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\DBCon2.dtsConfig".
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at ProductDetails_Import, Connection manager "Excel Connection Manager": SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at Data Flow Task, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Error: 0xC0047017 at Data Flow Task, DTS.Pipeline: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at Data Flow Task, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Data Flow Task: There were errors during task validation.
We are in the process of upgrading a SQL Server 2000 database to 2005. For the time being, we're trying it out on SQL Server Express 2005. My question is: can reports in SSRS 2005 pull data from SQL Server 2000 if both are installed side by side on the same machine, with the 2000 version as the default instance and the 2005 Express edition as a named instance?
How do I retrieve the connections (connection managers) collection from a custom data flow destination? ComponentMetadata.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow task.
I came across code in which it was possible to access the Connections collection using IDtsConnectionService for a custom task (destination). The custom task has access to the serviceProvider, which can be used to get access to the IDtsConnectionService interface, but the custom data flow task does not.
We have a single generic SSIS package that is used to import several hundred iSeries tables into SQL. I am not looking to rewrite the process. But I am looking for ways to improve performance.
I have tried Retain Same Connection, Maximum Insert Commit Size, table lock (TABLOCK), removing some large columns, and playing with the log file location and size, and now I am working to tweak DefaultBufferMaxRows.
To describe the data flow task: there are six data flow tasks (DFTs) working at the same time. Each DFT has its own list of iSeries tables and columns and the corresponding generic SQL table names. Each DFT determines its list of tables based on the number of columns to import. So there is dft30 (the iSeries table has 1-30 columns to import), dft60 (the iSeries table has 31-60 columns to import), etc. The destination SQL tables are generically called Staging30, Staging60, etc. Each column in the generic Staging tables is varchar(100). The DFTs are comprised of an OLE DB Source and an OLE DB Destination.
The OLE DB Source uses a SQL Command from Variable to build a SELECT statement. The OLE DB Source uses a connection manager that uses an IBM iAccess IBMDA400 provider. The SQL command ends up looking like this for dft30. This specific example is importing from the iSeries table TDACLR, and it only has two columns, so it will be copied to the Staging30 table.
select TCREAS AS C1,TCDESC AS C2,0 AS C3,0 AS C4,0 AS C5,0 AS C6,0 AS C7,0 AS C8,0 AS C9,0 AS C10,0 AS C11,0 AS C12,0 AS C13,0 AS C14,0 AS C15,0 AS C16,0 AS C17,0 AS C18,0 AS C19,0 AS C20,0 AS C21,0 AS C22,0 AS C23,0 AS C24,0 AS C25,0 AS C26,0 AS C27,0 AS C28,0 AS C29,0 AS C30,''TDACLR'' AS T0 from Store01.TDACLR
The OLE DB Source variable value looks like the following, but I am not showing the full 30 columns:
select cast(0 AS varchar(100)) AS C1,cast(0 AS varchar(100)) AS C2,cast(0 AS varchar(100)) AS C3,cast(0 AS varchar(100)) AS C4,cast(0 AS varchar(100)) AS C5, ... cast(0 AS varchar(100)) AS C30.
The OLE DB Destination uses OpenRowSet Using FastLoad From Variable. The insert into Staging30 ends up looking like this.
Of course we then copy and transform the Staging30 data to the SQL table that equals T0.
But back to DefaultBufferMaxRows. Previously the DFTs had default values of 10000 for DefaultBufferMaxRows and 10485760 for DefaultBufferSize. I added a SQL task to SUM the iSeries column sizes (TCREAS and TCDESC in this example) and set DefaultBufferMaxRows by dividing 10485760 by the SUM of the columns' max_length. But I did not see a performance improvement. Do you think that redefining the columns as varchar(100) for the insert is significant? Should I possibly SUM the actual number of columns (2) as 2x100, or SUM 30x100?
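As a quick check of the arithmetic in question (hedged: SSIS sizes buffers from the data flow's own column metadata, and here the pipeline carries 30 varchar(100) columns plus T0, not just the two source columns, so dividing by the source max_length alone likely overestimates the rows that fit in a buffer):

SELECT 10485760 / (2 * 100)  AS RowsIfSizedBySourceColumns,   -- 52428 rows per buffer
       10485760 / (30 * 100) AS RowsIfSizedByStagingWidth;    -- 3495 rows per buffer

So if the staging width is what actually drives buffer size, a DefaultBufferMaxRows computed from the two source columns would be over an order of magnitude too high, which could explain why the tweak showed no improvement.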
We have tables with many image columns. We fill these image columns via ODBC and SQLPutData as described in MSDN etc. (using SQL_LEN_DATA_AT_EXEC(...), calling SQLParamData, and sending the data in chunks of 4096 bytes when receiving SQL_NEED_DATA).
The SQLPutData call fails under the following conditions with SQLSTATE 08S01:
- The database resides on SQL Server 2000
- The driver is SQL Native Client
- The table consists e.g. of one identity column (key column) and nine image columns
- The data to be inserted are nine blocks of data with the following byte sizes:
1: 6781262
2: 119454
3: 269
4: 7611
5: 120054
6: 269
7: 8172
8: 120054
9: 269
The content of the data does not matter (it happens also if only zero bytes are written), nor does the data origin (file or memory).
All data blocks up to and including no. 7 are inserted. When the first chunk of data block 8 is written with SQLPutData, the function fails and the connection is broken. There are errors such as "broken pipe" or "I/O error" depending on the network protocol used.
If data block 7 consists of 8173 bytes instead of 8172, everything works again. (Changing the 4096-byte chunk size does not help.)
Has anybody encountered this or a similar phenomenon?
Here's my situation: I have an ODBC DSN set up for Timberline Data (an accounting package that uses Pervasive.SQL) on my SQL box. I set up a linked server using the supplied Timberline ODBC driver. I have two SQL instances set up, the default instance and a named instance. On the default instance, the linked server works great no matter who is logged in using it (all authentication is NT integrated). However, on the named instance only the NT account that the MSSqlserver$NAMED service is logged in under can utilize the linked server. All others get an ODBC error:
"error 7399: OLE DB Provider 'MSDASQL' reported an error. Access Denied. OLE DB error trace [OLE/DB Provider 'MSDASQL' IUnknown::QueryInterface returned 0x80070005: Access Denied.]."
As far as I can tell, both instances are set up the same, except that one is the default and one is a named instance. Why the different results for the default instance vs. a named instance? Any ideas? Thanks