Disconnected Recordset Error On OLE DB Destination Data Flow
Oct 2, 2007
I have an update query in an OLE DB Destination (access mode: SQL Command) that updates a table with an INNER JOIN from another table in another database. I'm getting the error, "No disconnected recordset available for the specified SQL statement". Does this have to do with the SQL query trying to access the other database? How can I get around this error?
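A likely cause: the OLE DB Destination's SQL command access mode expects a statement that returns a rowset to insert into, so a plain UPDATE can trigger exactly this error. A minimal sketch of the usual alternative, running the UPDATE in an Execute SQL Task (or an OLE DB Command transformation) instead, with hypothetical table and column names and the other database referenced by a three-part name:
UPDATE t
SET t.Amount = s.Amount
FROM dbo.TargetTable AS t
INNER JOIN OtherDatabase.dbo.SourceTable AS s
ON s.KeyCol = t.KeyCol;
If both databases are on the same server, the three-part name works directly; across servers a linked server (four-part name) would be needed.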
View 4 Replies
May 12, 2006
I am getting the following error running a data flow that splits the input data into multiple streams and writes the results of each stream to the same destination table:
"This operation conflicts with another pending operation on this transaction. The operation failed."
The flow starts with a single source table with one row per student and multiple scores for that student. It does a few lookups and then splits the stream (using Multicast) across several layers, ultimately generating 25 destinations (one for each score to be recorded), all going to the same table (like a fact table). All of this runs under a package-level transaction that is distributed to a separate machine.
Apparently, I cannot have all of these streams inserting data into the same table at one time. I don't understand why not. In an OLTP system, many transactions are inserting records into the same table at once. Why can't I do that within the same transaction?
I suppose I could use a Union All to merge the streams back together before writing to a single destination, but that seems like an unnecessary waste and clutters the flow. Can anyone offer a different solution, or a reason why this fails in the first place?
Thanks in advance.
View 3 Replies
View Related
Mar 7, 2008
Hi all,
I have a package that does a simple export from an Excel sheet to a table.
I used a Data Flow task with Excel Source and OLE DB Destination components,
and I created package configurations for the Source and Destination components.
After that, when I execute the package I get the following error.
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\LPL_Config2.dtsConfig".
Information: 0x40016041 at ProductDetails_Import: The package is attempting to configure from the XML file "D:\TEST_ETL\DBCon2.dtsConfig".
SSIS package "ProductDetails_Import.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at ProductDetails_Import, Connection manager "Excel Connection Manager": SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at Data Flow Task, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
Error: 0xC0047017 at Data Flow Task, DTS.Pipeline: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at Data Flow Task, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Data Flow Task: There were errors during task validation.
SSIS package "ProductDetails_Import.dtsx" finished: Failure.
The program '[2416] ProductDetails_Import.dtsx: DTS' has exited with code 0 (0x0).
I have been trying to troubleshoot the error messages given above since yesterday evening, but I could not figure out what is causing this error to occur.
Please help!
Any pointers/suggestions would be highly appreciated.
Thanks & Regards
View 3 Replies
View Related
Dec 14, 2007
Hi there,
I have a Data Flow task which uses an XML File Source with six parallel outputs, each going first to a Data Conversion task, with the results of each stream ending in a SQL Server Destination object (all using the SQL Native Client).
To explain this further: the XML file contains 6 different types of elements, and the data flow splits out each type of element and processes them into different tables. The Data Conversion object exists only because the XML fields are Unicode and the table fields are varchar, not nvarchar.
Initially with this setup I found that the connection would time out using the SQL Native Client, so I changed the timeout on the destination objects to 0. This fixed the problem to some degree; at present I can run the package in the Visual Studio environment and everything works fine, no problem. But when I run the .dtsx file using the SQL Server Agent, I get the error below...
Error: 2007-12-14 14:33:19.16 Code: 0xC0202009 Source: Import XML File to SQL SQL - CP [2746] Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from
I understand that this error is somewhat of a catch-all and that the way the SQL Native Client connection object works makes error capturing difficult. I have tried a few things, which I will list since I'm sure they will be suggested...
I have played around with the 'MaxInsertCommitSize' property of the SQL Server Destination objects to no avail (changing it to 50000, 10000, and 1000 all resulted in the same problem).
I am running the SSIS package from the server which is the destination.
As mentioned above, the timeout on the SQL Server Destination objects is set to 0.
What I have already mentioned and still don't quite understand is that I can run the job successfully from the Visual Studio environment, but run as a job off the SQL Server it fails...
Can anyone help?
View 8 Replies
View Related
Mar 10, 2004
Hi ...
This is a C++ / ADO / SQL question. Maybe not the right forum but I am guessing there are some programmers out there ...
I am trying to use ADO disconnected recordset to insert data into a sql table. I am using AddNew(vField, vValue) with UpdateBatch(). The code below does not throw any exceptions ... but does not add data to the table.
Any comments are appreciated,
Thanks,
Chris
void CTestApp::TestDatabaseUpdateBatch1a(void)
{
int nDataCount = 0;
long nIndex = 0;
long nIndex2 = 0;
CString csMessage;
CString csErrorMessage;
CString csTemp;
CString csSQL;
BOOL bIsOpen;
BOOL bIsEmpty;
long nCount = 0;
int nTemp = 0;
int nLimit = 0;
int nTempInt = 0;
long nTempLong = 0;
double nTempDouble = 0;
HRESULT hResult;
SYSTEMTIME st;
int i = 0;
string strTemp;
_variant_t sval;
_variant_t vNull;
vNull.vt = VT_ERROR;
vNull.scode = DISP_E_PARAMNOTFOUND;
COleSafeArray colesaFieldList;
COleSafeArray colesaDataList;
vector<COleSafeArray> *pvecDataList;
pvecDataList = new vector<COleSafeArray>;
COleDateTime oledtCurrentDate = COleDateTime::GetCurrentTime();
// Convert the COleDateTime to the variant
// COleVariant vCurrentDateTime(oledtCurrentDate);
COleVariant vCurrentDateTime;
CMxTextParse *pMxTextParse = NULL;
CMainFrame *pMainFrame = (CMainFrame *)AfxGetMainWnd();
CFrameWnd* pChild = pMainFrame->GetActiveFrame();
CTestMeteorlogixView* pView = (CTestMeteorlogixView*)pChild->GetActiveView();
pView->WriteLog("Start TestDatabaseUpdateBatch1a.");
pView->WriteLog("Load table using AddNew() and UpdateBatch().");
// Define ADO connection pointers
_ConnectionPtr pConnection = NULL;
_RecordsetPtr pRecordset = NULL;
try
{
// When we open the application we will open the ADO connection
pConnection.CreateInstance(__uuidof(Connection));
// Replace Data Source value with your server name.
bstr_t bstrConnect("Provider='sqloledb';Data Source='SQLDEV';"
"Initial Catalog='AlphaNumericData';"
"User Id=cmacgowan;Password=cmacgowan");
// Open the ado connection
pConnection->Open(bstrConnect,"","",adConnectUnspecified);
// Create an instance of the database
pRecordset.CreateInstance(__uuidof(Recordset));
// Select the correct sql string. Note that we are creating an
// empty string by doing a select on the primary key. We are only
// doing inserts and we do not want to bring data back from the
// server
csSQL = "SELECT * FROM dbo.AAMacgowanTest WHERE RecordId IS NULL";
// csSQL = "SELECT * FROM dbo.DICastRaw1Hr";
pRecordset->PutRefActiveConnection(pConnection);
pRecordset->CursorLocation = adUseClient;
pRecordset->Open(csSQL.AllocSysString(), vNull, adOpenStatic, adLockOptimistic, -1);
// Test to see if the recordset is connected
if(pRecordset->GetState() != adStateClosed)
{
// The recordset is connected, we will see if we are
// at the end
if((pRecordset->BOF) && (pRecordset->GetadoEOF()))
{
// The recordset is empty
bIsEmpty = true;
}
if(pRecordset->GetadoEOF())
{
bIsOpen = false;
}
else
{
// disconnect the database
pRecordset->PutRefActiveConnection(NULL);
}
}
// disconnect the database
// pRecordset->PutRefActiveConnection(NULL);
// Disassociate the connection from the recordset.
pRecordset->PutRefActiveConnection(NULL);
// Set the count
nCount = 1;
// now we will scroll through the file
while(nCount > 0)
{
nCount--;
nDataCount = 10;
// test that we got some data
if (nDataCount >= 0)
{
// Start the insert process
// m_pRecordset->AddNew();
COleSafeArray warningList;
//int index, listIndex = -1, bitIndex; // indexing variables
// long lowIndex, highIndex, arrayIndex[2];
VARIANT vFieldList[25];
VARIANT vValueList[25];
int nFieldIndex = 0;
int nValueIndex = 0;
// Setup the fields
vFieldList[nFieldIndex].vt = VT_BSTR;
vFieldList[nFieldIndex].bstrVal = ::SysAllocString(L"Name");
nFieldIndex++;
vFieldList[nFieldIndex].vt = VT_BSTR;
vFieldList[nFieldIndex].bstrVal = ::SysAllocString(L"Section");
nFieldIndex++;
vFieldList[nFieldIndex].vt = VT_BSTR;
vFieldList[nFieldIndex].bstrVal = ::SysAllocString(L"Code");
nFieldIndex++;
vFieldList[nFieldIndex].vt = VT_BSTR;
vFieldList[nFieldIndex].bstrVal = ::SysAllocString(L"Latitude");
nFieldIndex++;
vFieldList[nFieldIndex].vt = VT_BSTR;
vFieldList[nFieldIndex].bstrVal = ::SysAllocString(L"Longitude");
nFieldIndex++;
pView->WriteLog("Set data using AddNew() ...");
// COleDateTime is a wrapper for VARIANT's DATE type. COleVariant is
// a wrapper for VARIANTs themselves. If you need to create a
// variant, you can say:
COleDateTime oledtCurrentDate2 = COleDateTime::GetCurrentTime();
// Convert the COleDateTime to the variant
COleVariant vCurrentDateTime2(oledtCurrentDate2);
//Set the DATE variant data type.
memset(&st, 0, sizeof(SYSTEMTIME));
st.wYear = 2000;
st.wMonth = 1;
st.wDay = 1;
st.wHour = 12;
// vect is a vector of COleSafeArrays containing the records
for(i = 0; i < 10; i++)
{
// Setup the data
nValueIndex = 0;
vValueList[nValueIndex].vt = VT_BSTR;
vValueList[nValueIndex].bstrVal = ::SysAllocString(L"BLUE");
nValueIndex++;
vValueList[nValueIndex].vt = VT_BSTR;
vValueList[nValueIndex].bstrVal = ::SysAllocString(L"KSTP");
nValueIndex++;
vValueList[nValueIndex].vt = VT_I4;
vValueList[nValueIndex].lVal = 100 + nFieldIndex; // VT_I4 values are stored in lVal, not dblVal
nValueIndex++;
vValueList[nValueIndex].vt = VT_R8;
vValueList[nValueIndex].dblVal = 11.11 + nFieldIndex;
nValueIndex++;
vValueList[nValueIndex].vt = VT_R8;
vValueList[nValueIndex].dblVal = 22.22 + nFieldIndex;
nValueIndex++;
// Add the record to the recordset
pRecordset->AddNew(vFieldList, vValueList);
}
pView->WriteLog("Call UpdateBatch().");
// Re-connect.
pRecordset->PutRefActiveConnection(pConnection);
// Send updates.
pRecordset->UpdateBatch(adAffectAll);
// Close the recordset and the connection
pRecordset->Close();
pConnection->Close();
}
}
}
catch(_com_error *e)
{
CString Error = e->ErrorMessage();
AfxMessageBox(e->ErrorMessage());
pView->WriteLog("Error processing TestDatabase().");
}
catch(...)
{
csMessage = "Undefined exception handled. Error message details ";
hResult = GetAdoErrorMessage(pConnection,
&csErrorMessage);
csMessage += csErrorMessage;
csMessage += "method: CTestMeteorlogixApp::OnTestDatabaseAdoBulkload()";
AfxMessageBox(csMessage);
}
csTemp.Format("Last Row %03d DIcastId = %s ", nIndex, strTemp.c_str());
pView->WriteLog(csTemp);
pView->WriteLog("End TestDatabaseUpdateBatch1.");
}
View 3 Replies
View Related
May 16, 2007
It's been almost 10 years since I have had to do work with good old RecordSet objects...
I am filling a RecordSet with data returned from a SQL server via a stored procedure. I then set the ActiveConnection property to Nothing in order to disconnect it so I can make changes to it.
But when I try to set the value on a given row I get back a "Multiple-step operation generated errors. Check each status value" error message. My understanding is that this is indicative of trying to use the wrong datatype. I have verified that the type is correct (I am dealing with integers) so I am at a loss for what the problem could be.
Here is my code:
Set rsPackages = CreateObject("ADODB.RecordSet")
rsPackages.CursorLocation = adUseClient
rsPackages.LockType = adLockBatchOptimistic
rsPackages.Open "EXECUTE stp_FetchPackageData", myConn
rsPackages.ActiveConnection = Nothing
Response.Write("Value: " & rsPackages("TotalCount")) ' returns 0
Response.Write("Data Type : " & rsPackages.Fields("TotalCount").Type) ' returns 3 = adInteger
rsPackages("TotalCount") = 1 ' throws multi-step error
Oddly enough, when I look at the Attributes property of the field I get back a value of 112. When you break it down, I think that value indicates the row value is read-only? (Could that be my problem? Just a really bad/unhelpful error message?) But if I try to change it, I get back a message saying it cannot be done since the RecordSet is already open.
Thanks,
Jason
View 2 Replies
View Related
May 17, 2007
I have been using a recordset destination in a data flow where I need to perform some complex manipulation on a dataset, including combining some information from a web service and updating records that already exist, vs. inserting them.
I have a script task that modifies the dataset as needed, and then saves it back to the variable it came from.
However, when it comes time to write the data to the database, I couldn't find an appropriate tool - there's no "recordset source" object in the data flow task, and use of a "for each" loop with a sql call to a stored proc takes 20 minutes for a few thousand rows.
The best way I could find around this was as follows:
Call the .NET ".GetXml" method on the dataset and put the resulting XML data into a string variable
Generate an XSD for that XML (it comes out like <NewDataSet><Table1>...)
Use an XML source in the data flow task. This works, and the same data insert that took 20 minutes via the loop / stored procs now takes under 10 seconds.
It seems horribly inefficient to have to do this - there should be a way to just dump my dataset back into a table natively without all that extra stuff.
Has anyone done anything similar?
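One possible variation that skips the XSD and XML source entirely (a hedged sketch, assuming SQL Server 2005+ and hypothetical table/column names): pass the GetXml output as a parameter to a stored procedure and shred it in T-SQL with nodes():
CREATE PROCEDURE dbo.usp_ImportDataSet
    @DataSetXml xml
AS
BEGIN
    -- Shred the <NewDataSet><Table1>...</Table1></NewDataSet> shape that GetXml produces
    INSERT INTO dbo.TargetTable (ColA, ColB)
    SELECT T.c.value('(ColA)[1]', 'int'),
           T.c.value('(ColB)[1]', 'varchar(50)')
    FROM @DataSetXml.nodes('/NewDataSet/Table1') AS T(c);
END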
View 1 Replies
View Related
Mar 31, 2004
Hello, the code below returns the following error:
Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done
<%
Const adLockBatchOptimistic = 4
Const adUseClient = 3
strDataBase = "somedb.mdb"
set cnTraining = server.CreateObject("ADODB.Connection")
cnnstr="Provider=Microsoft.Jet.OLEDB.4.0;User ID=Admin;Data Source=" & server.mappath(strDataBase) & ";Persist Security Info=False"
cnTraining.Mode = 3
cnTraining.Open cnnstr
set rsSearch = Server.CreateObject("ADODB.Recordset")
rsSearch.CursorLocation = adUseClient
rsSearch.LockType = adLockBatchOptimistic
sSql = "some sql stmt" 'this works fine on its own
rsSearch.Open sSql, cnTraining
set rsSearch.ActiveConnection = nothing
cnTraining.Close
set cnTraining = nothing
%>
My goal is to get a disconnected recordset. The problem here occurs when I try to use the adUseClient value (3) for the CursorLocation. If I don't use 3 for the CursorLocation, it works fine. However, I'm almost certain I have to use 3 in order to disconnect the recordset. Any ideas? Is my connection string not set up properly to get a disconnected recordset?
View 2 Replies
View Related
Feb 21, 2007
Hi there,
I need to develop a module and wondering what would be the best implementation...
I get a list of files from a text file and store it in a Recordset Destination (object variable "CUST_INV_LIST").
I need to check that all the files in a directory are in the list. I can loop through the files in the directory using a ForEach container, but how do I check if it is in the CUST_INV_LIST recordset?
I thought about using another ForEach container to loop through the recordset, check if the physical file is equal to that row, and if so set a flag, but that's neither elegant nor effective.
Another option would be to use a Script Task to search for the physical file name in the recordset. I tried with the Data.OleDb.OleDbDataAdapter, etc., but I get an error when I declare Data.DataTable. Anyway, that method of accessing a recordset is not recommended and seems complicated.
Any suggestion?
Thanks
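For what it's worth, a swapped-in alternative (a sketch under the assumption that both lists can be staged into SQL tables; the table and column names are hypothetical): write the CUST_INV_LIST rows and the directory's file names into two staging tables, then let EXCEPT find the discrepancies:
-- dbo.DirectoryFiles and dbo.CustInvList are hypothetical staging tables
-- Files found on disk that are not in the list:
SELECT FileName FROM dbo.DirectoryFiles
EXCEPT
SELECT FileName FROM dbo.CustInvList;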
View 12 Replies
View Related
Aug 1, 2006
I can't seem to find a way to make the Data Flow Destination in a Business Intelligence Visual Studio project output an MDB file for Microsoft Access.
View 4 Replies
View Related
Apr 3, 2014
I need to use a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding data sources/destinations in the Choose Toolbox Items pane.
View 4 Replies
View Related
Dec 10, 2007
Is there a default destination component used when a new data flow is created? The reason I ask is simply curiosity. I have an xml file with 2 pieces of data: item A and item B. A should simply get copied out of the file. B should undergo a quick transform. I set up an XML source such that two columns are mapped correctly to the XML source data of A and B. I set up my data transform task as well. So, if I leave those two components on the .dtsx page with no other components, then will there be a default data flow destination already created? ...OR, do you always have to have a destination component?
Thanks for the input. I am just curious.
View 4 Replies
View Related
Sep 26, 2007
I have a series of data flow tasks that I want to output to a temp table. I've set RetainSameConnection on the data source and DelayValidation on the Data Flows. The OLE DB data source inside the Data Flow works fine, but the data destinations don't offer a # or ## table as a target. I've tried every destination that sounds logical, without success.
Any pointers? ... Thanks!
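Local (#) temp tables are hard for the designer to validate, which is why they never show up as targets; a global (##) temp table usually works when combined with RetainSameConnection=True and DelayValidation, provided the table name is typed directly into the OLE DB Destination rather than picked from the dropdown. A minimal sketch of the Execute SQL Task that would precede the data flows (the table definition here is hypothetical):
-- Execute SQL Task, run before the data flows on the retained connection
IF OBJECT_ID('tempdb..##Staging') IS NOT NULL
    DROP TABLE ##Staging;
CREATE TABLE ##Staging
(
    Id     int           NOT NULL,
    Amount decimal(18,2) NULL
);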
View 6 Replies
View Related
Jan 19, 2006
I have a data flow that reads multiple rows from a table and then inserts to another table for each row. I use an OLE DB destination for my inserts. However, after that insert I need to do other table inserts, and I can't figure out how to continue the data flow with the fields in the pipeline, since the only output from the destination is the error flow. Is there a way to do this?
thx
View 5 Replies
View Related
Mar 14, 2007
I would like to extact data from a source system even if it has errors. Then I can transform it and handle the errors in the appropriate manner. Are there any loosely-typed Data Flow Destinations?
View 11 Replies
View Related
Mar 13, 2007
I am populating a table using a SQL command. very simple.
SELECT RISKID
,RISKIDREN
,RISKIDEND
,PREMIUMSUBTOTAL
,PREMIUMTOTAL
,SURCHARGE1
,SURCHARGE2
,SURCHARGE3
,SURCHARGE4
,SURCHARGE5
,COMMPREMIUM1
FROM PREMIUM
However, the first three columns are not being populated in the destination table. The other columns come over fine.
The SQL stmt. returns data as expected when run against the source database.
I deleted the source and destination and recreated the flow to prevent metadata mapping issues. In the source editor preview I see all of the columns and data. In the destination editor preview, the first three columns of data are null ???.
It appears that the columns are not mapping properly even though they are in the source and destination of the mapping editor.
I have made sure that the destination mapping contains all the columns in the UI.
The source and destination have the columns represented in the advanced editor metedata. I also checked the XML to verify that the columns are in the destination.
There is a row count between the source and destination, which should have no effect.
This is part of a larger DW load where I have 10 other tables populated within the data flow. I also do not get any validation or error messages, so I have eliminated truncation errors and the like.
I am really puzzled. Has anyone run across anything like this?
View 5 Replies
View Related
Nov 28, 2007
I have 5 or more tables to join to get a particular output which has to be sent to a destination table. Among the 5 tables, some joins are inner joins and some are left outer joins. I am opting for a stored procedure at this point. But I would like to know how this can be done with data flow transformations that have multiple sources and merge joins, or with any other alternates. I tried using Merge Join, but it does not accept more than two inputs.
I saw this simple post which kick-started me to move from stored procedures to SSIS transformations, but I encountered an issue.
http://www.mssqltips.com/tip.asp?tip=1322
I get the error
"The destination component does not have any available inputs for use in creating a path".
Please advise on alternates.
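One alternate that avoids Merge Join altogether (a hedged sketch with hypothetical table and key names): since all five tables live on the source side, the joins can be done in the OLE DB Source's SQL command, leaving the data flow with a single path into the destination:
SELECT a.KeyCol, a.Col1, b.Col2, c.Col3, d.Col4, e.Col5
FROM dbo.TableA AS a
INNER JOIN dbo.TableB AS b ON b.KeyCol = a.KeyCol
INNER JOIN dbo.TableC AS c ON c.KeyCol = a.KeyCol
LEFT OUTER JOIN dbo.TableD AS d ON d.KeyCol = a.KeyCol
LEFT OUTER JOIN dbo.TableE AS e ON e.KeyCol = a.KeyCol;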
View 3 Replies
View Related
Nov 9, 2006
Hi
Inside a data flow task, I have an OLE DB source and destination. In my situation, I need to pull data from a table in the source, but also hard-code some columns myself, which means my source is a blend of data from a table and hard-coded data, which will then have to be mapped to columns in the OLE DB destination. Does anyone know which option to choose in the OLE DB source dropdown for the data access mode? Keep in mind I do need to run a select query as well as get data from a table. Is it possible to use multiple OLE DB sources connected to one destination? That is really what I intend to do here, but I am not sure how it will work, or even if it's possible. Basically, my source access mode needs to be a blend of SQL command and table columns; how would that be implemented? Any help or advice is appreciated.
MA
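For what it's worth, hard-coded columns usually don't require a second source at all: the SQL command access mode can blend table columns with literals in one statement. A small sketch (the table, columns, and literal values are hypothetical):
SELECT CustomerId,
       CustomerName,
       'BATCH_LOAD' AS LoadSource, -- hard-coded value
       42 AS RegionCode            -- hard-coded value
FROM dbo.Customers;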
View 4 Replies
View Related
Jun 26, 2007
I want to insert data by calling a stored procedure from a Data Flow destination object. Is it possible?
I understand that the OLE DB Command transformation can call a stored procedure, but it will not roll back in the event of an error in the middle.
I understand that the OLE DB Destination will roll back in the middle of an import, but I don't see how to do the insert by calling a stored procedure. The "SQL Command" option in the OLE DB Destination does not seem to offer a solution to the problem.
Am I missing something here, or is SSIS / Microsoft demanding that an insert stored procedure not be used when using a Data Flow destination object to insert data into a target table?
View 8 Replies
View Related
May 18, 2008
SSIS Newbie Question:
I have a simple Control Flow setup that checks to see if a particular table exists. If the table does not exists, the table is created in an alternate path, if it does exist, the table is truncated before moving to a file import Data Flow that uses an OLE DB Destination to output the imported data.
My problem is that I get OLE DB package errors if the table the OLE DB Destination container references does not exist when I load the package.
How can I over come this issue? I need to be able to dynamically create the table in an earlier step, then use that table to import data into in a later step in the workflow.
Is there a switch I can use to turn off checking in the OLE DB Destination Container so that it will allow me to hook up the table creation step?
Seems like this would be a common task...
Steps:
1. Execute SQL Task to see if the required table exists
2. Use expresions to test a variable to check the results of step 1
3. If table exists, truncate the table and reload it from file in Data Flow using OLE DB Destination
4. If table does not exist, 1st create it, then follow the normal Data Flow
Can someone help me with this?
Signed: Clueless with a deadline approaching...
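A sketch of what steps 1 and 4 could share (the table name is hypothetical). For the load-time validation error, the usual companions are DelayValidation=True on the Data Flow task and ValidateExternalMetadata=False on the OLE DB Destination, so the package stops checking for the table before it exists:
-- Execute SQL Task: create the target if missing, truncate it otherwise
IF OBJECT_ID('dbo.ImportTarget', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.ImportTarget
    (
        Id      int          NOT NULL,
        Payload varchar(255) NULL
    );
END
ELSE
BEGIN
    TRUNCATE TABLE dbo.ImportTarget;
END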
View 3 Replies
View Related
May 16, 2008
Hi,
How do I retrieve the connections (connection managers) collection from a custom Data Flow destination? ComponentMetaData.RuntimeConnectionCollection is empty. I would like to be able to access all the connections defined in the package from the custom data flow component.
I came across code in which it was possible to access the Connections collection using IDtsConnectionService for a custom task (destination). A custom task has access to a service provider, which can be used to get the IDtsConnectionService interface, but a custom data flow component does not.
Any help appreciated.
Thanks
Naveen
View 5 Replies
View Related
Oct 18, 2015
We have a single generic SSIS package that is used to import several hundred iSeries tables into SQL. I am not looking to rewrite the process. But I am looking for ways to improve performance.
I have tried retain same connection, maximum insert commit size, lock table (tablock), removed some large columns, played with the log file location and size, and now I am working to tweak the defaultbuffermaxrows.
To describe the data flow task: there are six data flow tasks (DFTs) working at the same time. Each DFT has its own list of iSeries tables and columns and the corresponding generic SQL table names. Each DFT determines its list of tables based on the number of columns to import. So there is dft30 (iSeries tables with 1-30 columns to import), dft60 (iSeries tables with 31-60 columns to import), etc. The destination SQL tables are generically called Staging30, Staging60, etc. Each column in the generic Staging tables is varchar(100). The DFTs are comprised of an OLE DB Source and an OLE DB Destination.
The OLE DB Source uses a SQL Command from Variable to build a SELECT statement. The OLE DB Source uses a connection manager that uses an IBM iAccess IBMDA400 provider. The SQL Command ends up looking like this for dft30. This specific example imports from the iSeries table TDACLR, which only has two columns, so it will be copied to the Staging30 table.
select TCREAS AS C1,TCDESC AS C2,0 AS C3,0 AS C4,0 AS C5,0 AS C6,0 AS C7,0 AS C8,0 AS C9,0 AS C10,0 AS C11,0 AS C12,0 AS C13,0 AS C14,0 AS C15,0 AS C16,0 AS C17,0 AS C18,0 AS C19,0 AS C20,0 AS C21,0 AS C22,0 AS C23,0 AS C24,0 AS C25,0 AS C26,0 AS C27,0 AS
C28,0 AS C29,0 AS C30,''TDACLR'' AS T0 from Store01.TDACLR
The OLE DB Source variable value looks like the following (not showing the full 30 columns):
select cast(0 AS varchar(100)) AS C1,cast(0 AS varchar(100)) AS C2,cast(0 AS varchar(100)) AS C3,cast(0 AS varchar(100)) AS C4,cast(0 AS varchar(100)) AS C5, ... cast(0 AS varchar(100)) AS C30.
The OLE DB Destination uses OpenRowSet Using FastLoad From Variable. The insert into Staging30 ends up looking like this.
insert bulk STAGE30([C1] varchar(100) ,[C2] varchar(100) ,[C3] varchar(100) ,[C4] varchar(100) ,[C5] varchar(100) , ... ,[C30] varchar(100) ,[T0] varchar(20)
Of course we then copy and transform the Staging30 data to the SQL table that equals T0.
But back to DefaultBufferMaxRows. Previously the DFTs had default values of 10000 for DefaultBufferMaxRows and 10485760 for DefaultBufferSize. I added a SQL task to SUM the iSeries column sizes (TCREAS and TCDESC in this example) and set DefaultBufferMaxRows by dividing the SUM of the columns' max_length into 10485760. But I did not see a performance improvement. Do you think that redefining the columns as varchar(100) for the insert is significant? Should I possibly SUM the actual number of columns (2) as 2x100, or SUM the 30x100?
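Since the rows actually land in the generic Staging tables as varchar(100), the buffer arithmetic arguably should use the destination row width (roughly 30 x 100 plus T0) rather than the two source columns. A hedged sketch of that calculation on the SQL side (Staging30 is from the post; the constant mirrors the 10485760 DefaultBufferSize):
-- Rows per buffer if sized by the destination's actual row width
SELECT 10485760 / SUM(c.max_length) AS EstimatedBufferMaxRows
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('dbo.Staging30');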
View 4 Replies
View Related
Mar 8, 2008
I have a data flow that consists of
OLE DB source which calls a stored proc that returns a result set
data conversion
Excel destination
I am in design mode in Business Intelligence Studio. My Excel destination (with an Excel connection) shows no sheet name, though I have an Execute SQL task before the data flow to create the Excel table called SHEET1. Needless to say, there are no output columns visible to do any mappings. I did go to the Excel connection to set the OpenRowset property to SHEET1, but it seems to have no effect.
I can do the export in SQL Server Management Studio and that works fine, but it is basic and does not meet my requirements. I have to customize the package to allow dynamic Excel filenames based on account names, and I have to split my result set into multiple Excel sheets because Excel 2003 has a max of 65536 rows per sheet. Also, when I use the export wizard the source is a table, and eventually the source has to be a stored proc with input parms.
What am I missing or doing wrong? Thanks in advance
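For reference, the Execute SQL Task that creates the sheet runs Jet DDL against the Excel connection manager; a minimal sketch (the column names and types here are hypothetical, in the style the Excel Destination's New button generates):
CREATE TABLE `SHEET1`
(
    `AccountName` LongText,
    `Amount` Double
)
If the sheet is created at run time, DelayValidation on the data flow is usually needed so the not-yet-existing sheet doesn't fail validation when the package loads.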
View 6 Replies
View Related
Feb 1, 2007
Hi all,
Can a Recordset destination be used as the source for a ForEach loop?
Correct me if I'm wrong, but the Recordset is stored in a variable of type Object? So what stops my ForEach loop from iterating?
Regards,
Pieter
View 3 Replies
View Related
Jul 11, 2007
I have a csv file which I am reading into a Recordset Destination, and I then want to use a ForEach loop to cycle through those records and do some things. The problem I am having is that after defining the variable name of the recordset as Results, and then going into the ForEach loop, choosing Collection, and using the Foreach ADO enumerator, I don't see anything in the dropdown under "ADO object source variable". I am new to SSIS, but I basically want to parse through this file, change some columns in each line, and then either update or insert the data in a SQL table.
View 4 Replies
View Related
Mar 3, 2006
I'm curious to know how other people are handling the Recordset Destination when it is to be processed by a ForEach Loop container. It seems a little odd to me that you can only map the parameters by knowing the index of the columns of the recordset; however, the order in which you built the recordset destination doesn't stay the same. I've been debugging for a while, only to find that after I saved my recordset destination the order of the fields changed, to some order without a clear logic. I'm going to guess it might be the lineage ID.
The bigger problem was that this was a really large recordset with 60 or so columns. To try and debug the problem of finding the indexes, I added a Multicast transform and saved the output to an Excel Destination. Of course, the order I set up the Excel file in was the order I got the fields. Why would this not be the case with the Recordset Destination?
View 4 Replies
View Related
Aug 29, 2007
Hello,
Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?
Thanks,
Yoann
View 15 Replies
View Related
Mar 28, 2008
Hi All,
I want to export data from SQL Server 2005 to an Excel spreadsheet through a "Data Flow Task". I am using OLE DB for SQL Server for the source connection and a connection to Excel as my destination. The Excel spreadsheet (2003) exists and has the first row with column names. I don't get any warnings before trying to execute.
The SQL table fields are
i) ID - int
ii) RefID
iii) txtRemarks - nvarchar(MAX)
iv) ddlWaterLevel - nvarchar(50)
While executing the tasks, I got the error
Error: 0xC0202025 at Data Flow Task, Excel Destination [427]: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
Error: 0xC004701A at Data Flow Task, DTS.Pipeline: component "Excel Destination" (427) failed the pre-execute phase and returned error code 0xC0202025.
After analysing, I found in the Data Flow --> Excel Destination --> Advanced Editor for Excel Destination that the default data type for txtRemarks shows as "Unicode string [DT_WSTR]". But this is supposed to be "Unicode text stream [DT_NTEXT]". Even if I change the data type at design time, it doesn't accept the change.
Please do help me out.
thanks
Sanra
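One workaround worth trying, if the remarks never exceed a bounded length (a hedged sketch; the source table name is hypothetical): cast the nvarchar(MAX) column in the source query so the pipeline types it as DT_WSTR end to end instead of DT_NTEXT:
SELECT ID,
       RefID,
       CAST(txtRemarks AS nvarchar(4000)) AS txtRemarks, -- bounded cast avoids DT_NTEXT
       ddlWaterLevel
FROM dbo.SourceTable;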
View 4 Replies
View Related
Sep 22, 2003
Hi ,
I have a maintenance plan to rebuild indexes and reorganize data on all DBs.
It fails with the error:
[Microsoft SQL-DMO (ODBC SQLState: 01000)] Error 0: This server has been disconnected. You must reconnect to perform this operation.
What could be the problem?
Thank you
Alex
View 10 Replies
View Related
Jun 4, 2007
I want to execute a stored procedure in an OLE DB source in a Data Flow Task.
The stored procedure has a parameter.
When I put in an SQL command as :
EXEC sp_readcustomers 1
(thus passing 1 as the parameter value) I can use the Preview button and I get a list of columns being returned.
The data flow task should run inside a ForEach Loop where I assign the value of an ADO resultset to a variable, and it is this variable I want to pass to the SP:
EXEC sp_readcustomers ?
In the Parameter mapping I then name the parameter @par_company_id (which is the exact same name as in the SP) and map it to the variable var_company_id.
@par_company_id has been declared as int, var_company_id as Int16
When I now try to Parse or Preview the query from the OLE DB Source Edit window I get the following errors :
Parse : Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Preview : No value given for one or more required parameters (Microsoft SQL Native Client)
I have already installed SQL 2005 SP2 and VS2005 SP1.
I have tried everything I know, please help ?!?!
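A workaround that often helps the designer derive metadata for a parameterized EXEC (a hedged sketch; the procedure name is from the post): route the parameter marker through a declared variable so the parser sees an ordinary batch:
DECLARE @par_company_id int;
SET @par_company_id = ?; -- the ? still maps to var_company_id in Parameter mapping
EXEC dbo.sp_readcustomers @par_company_id;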
View 3 Replies
View Related
May 4, 2008
Hi,
We have two disconnected tables as shown below.
Table A:
Class_NO  Class_Name  State
1         x           S
2         XR          S
Table B:
Class_No  ColA  ColB  ColC  State
3         xO    e     e     S
4         UI    re    er    S
9         YU    w     re    S
8         OP    we    we    S
We want to display the data from Table A and Table B side by side, as below.
Class_No  Class_Name  State  Class_No  ColA  ColB  ColC  State
1         x           S      3         xO    e     e     S
2         XR          S      4         UI    re    er    S
                             9         YU    w     re    S
                             8         OP    we    we    S
Can it be done? What should the SQL query be?
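Since the tables share no key, the rows can only be lined up by position. A sketch using ROW_NUMBER (SQL Server 2005+) and a FULL OUTER JOIN, with the column names taken from the post:
WITH A AS (
    SELECT ROW_NUMBER() OVER (ORDER BY Class_NO) AS rn,
           Class_NO, Class_Name, State
    FROM TableA
),
B AS (
    SELECT ROW_NUMBER() OVER (ORDER BY Class_No) AS rn,
           Class_No, ColA, ColB, ColC, State
    FROM TableB
)
SELECT A.Class_NO, A.Class_Name, A.State,
       B.Class_No, B.ColA, B.ColB, B.ColC, B.State
FROM A
FULL OUTER JOIN B ON A.rn = B.rn;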
View 8 Replies
View Related
Dec 12, 2007
Hi all, I am getting the following error when trying to import text columns from Excel to SQL Server 2005. Any ideas?
Error at Data Flow Task [Destination]: Columns "blar" and "Blar_name" cannot convert between unicode and non-unicode string types.
Any help would be great.
Thanks
Dave
Dave Dunckley says there is a law for the rich and a law for the poor and a law for
Dirty Davey.
View 3 Replies
View Related
Jan 7, 2008
Hi,
I have SQL Server 2005 Express Edition on my machine. In an SSIS project in BIDS, when I drag a "Data Flow Task" onto the package it returns the following error:
The designer could not be initialized. (Microsoft.DataTransformationServices.Design)
Does this have anything to do with the fact that I don't have SSIS installed on my machine?
I thought that SSIS was only needed (on my machine) for the runtime, just to run the packages. Do I need to install SSIS on my machine to create and edit the packages too? That doesn't make sense; maybe it's another problem.
Can anyone help me on this?
Thank you,
Rafael Augusto
View 10 Replies
View Related