OLEDB Source Issues: <Column Name> Cannot Be Found In Data Source
Oct 4, 2007
Good morning,
I have written a package which accepts variables for the server, initial catalog, and table name.
I execute SQL to drop the following stored procedure, then the following SQL statement to create it.
================================================================
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE procedure [dbo].[SP_CreateMatchProc]
@sTable varchar(300)
as
BEGIN
SET NOCOUNT ON
declare @cmd nvarchar(2000)
set @cmd = ''
Set @cmd = 'SELECT REPLACE(field1 + field2 + field3 + field4 + field5, '' '', '''') AS dBString '
+ 'FROM ' + @sTable + ' ORDER BY <table>_ID COLLATE Latin1_General_CI_AS'
exec (@cmd)
END
GO
================================================================
Then in the OLE DB Source (ValidateExternalMetadata = False) I use "SQL command from variable" with a variable value of "SP_CreateMatchProc '<tableName>'".
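For reference, the text the variable resolves to at run time would look something like the sketch below (the table name is a placeholder; the SET FMTONLY OFF prefix is a workaround commonly suggested when an OLE DB Source cannot derive a stored procedure's output metadata outside the designer):
-- Sketch only: 'Customers' is a hypothetical table name standing in for <tableName>.
-- SET FMTONLY OFF forces the provider to execute the procedure for real during
-- metadata discovery instead of relying on FMTONLY, which often fails for
-- procedures built around dynamic SQL.
SET FMTONLY OFF;
EXEC dbo.SP_CreateMatchProc 'Customers';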
The package runs fine in the IDE regardless of variable values, but when I run it from a batch file which calls dtexec I get a failure:
Error: 2007-10-04 08:46:42.82
Code: 0xC0202005
Source: Data Flow Task OLE DB Source [310]
Description: Column "dBString" cannot be found at the datasource.
End Error
Log:
Name: OnError
Start Time: 2007-10-04 08:46:42
End Time: 2007-10-04 08:46:42
End Log
Log:
Name: OnError
Start Time: 2007-10-04 08:46:42
End Time: 2007-10-04 08:46:42
End Log
Error: 2007-10-04 08:46:42.82
Code: 0xC004701A
Source: Data Flow Task DTS.Pipeline
Description: component "OLE DB Source" (310) failed the pre-execute phase and returned error code 0xC0202005.
End Error
With ValidateExternalMetadata set to TRUE I get:
Error: 2007-10-04 09:21:35.20
Code: 0xC004706B
Source: Data Flow Task DTS.Pipeline
Description: "component "OLE DB Source" (10621)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
End Error
The most notable thing I see is that the component shows a different ID: (310) without validation and (10621) with it.
Any help would be greatly appreciated.
View 8 Replies
Oct 10, 2006
Hi all,
I get an error with an OLE DB Source pointing to a SQL 2000 database and executing a SQL query inside the source. The OLE DB Source feeds an OLE DB Destination that points to a SQL 2005 database.
This is the error I get:
Error at Data Flow Task [OLE DB Destination [245]]: the column firstname cannot be processed because more than one code page (936 and 1252) are specified for it.
Error at Data Flow Task [DTS.Pipeline]: "component "OLE DB destination" (245)" failed validation and returned validation status "VS_ISBROKEN".
Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.
Error at Data Flow Task: There were errors during task validation.
(Microsoft.DataTransformationServices.VsIntegration)
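One way around code-page conflicts like this is to cast the offending column to a Unicode type in the source query, so the destination no longer sees two ANSI code pages for the same column; a minimal sketch (the column name is taken from the error message above, the table name is assumed):
-- Sketch: returning firstname as Unicode removes the 936 vs. 1252 ambiguity.
SELECT CAST(firstname AS NVARCHAR(100)) AS firstname
FROM dbo.SourceTable;   -- hypothetical table name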
View 5 Replies
View Related
Apr 4, 2004
Hi, I am very new to SQL Server; previously I was using MySQL.
Now I am trying to connect my project to SQL Server, but I wasn't able to.
I keep getting errors:
System.Data.Odbc.OdbcException: ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified at System.Data.Odbc.OdbcConnection.Open() at icms.DB.q(String mySTR) in C:\icms\DB.vb:line 30
Below is my web.config.
I am not very sure about the value for the user; with MySQL I can get it from my control centre, but what about SQL Server? Where should I get my value?
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<appSettings>
<add key="db" value="icms" />
<add key="db_user" value="sqladmin" />
<add key="db_server" value="server" />
<add key="db_pwd" value="*****" />
<add key="session_timeout" value="600" />
</appSettings>
<system.web>
<compilation defaultLanguage="vb" debug="true" />
<trace enabled="true"/>
<customErrors mode="RemoteOnly" />
</system.web>
</configuration>
======================
Dim myCMD As New SqlCommand
Public Sub New()
Dim db_server = AppSettings("db_server")
Dim db = AppSettings("db")
Dim db_user = AppSettings("db_user")
Dim db_pwd = AppSettings("db_pwd")
'<==== is my driver correct?
Dim DBConnection As String = "DRIVER={SQL Server};" & _
"SERVER=" & db_server & ";" & _
"DATABASE=" & db & ";" & _
"UID=" & db_user & ";" & _
"PASSWORD=" & db_pwd & ";" & _
"OPTION=3;"
myDB.ConnectionString = DBConnection
myCMD.Connection = myDB
End Sub
'Public Function q(ByVal mySTR As String) As OdbcDataReader 'SQL Query
Public Function q(ByVal myStr As String) As SqlDataReader
myCMD.CommandText = myStr
Try
myDB.Open()
q = myCMD.ExecuteReader(Data.CommandBehavior.CloseConnection)
Catch ex As Exception
Err(ex.ToString)
End Try
End Function
Public Sub c(ByVal mySTR As String)
' COMMAND ONLY
Try
myCMD.Connection.Open()
myCMD.CommandText = mySTR
myCMD.ExecuteNonQuery()
myCMD.Connection.Close()
Catch ex As Exception
Err(ex.ToString)
End Try
End Sub
View 4 Replies
View Related
Mar 18, 2008
Hi,
I have a proprietary OLEDB data source that works alongside a SQL database. Both are pretty big. I am trying to find a fast way of copying the data from the OLE DB source into a table in the SQL database. I am working in ASP.NET, and the easiest way is to have two connections, open the OLEDB source, fill a DataTable, then go through the table line by line and do a parameterized insert. That works, but it takes far too long.
I am looking for something along the lines of "SELECT INTO tblB FROM tblA", or even "INSERT INTO tblB SELECT * FROM tblA" - anything that is faster than the above. Can you do this in SQL while accessing an OLEDB data source?
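If the proprietary provider supports ad hoc access, one server-side option is OPENROWSET, which lets the INSERT ... SELECT happen entirely inside SQL Server instead of row by row from ASP.NET; a sketch under those assumptions (provider name, connection string, and table names are placeholders):
-- Sketch: requires 'Ad Hoc Distributed Queries' to be enabled and an OLE DB
-- provider that can be invoked via OPENROWSET.
INSERT INTO dbo.tblB
SELECT *
FROM OPENROWSET('MyOleDbProvider',              -- hypothetical provider name
                'Provider-specific connection string',
                'SELECT * FROM tblA');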
Cheers,
Dave
View 1 Replies
View Related
Nov 3, 2006
We have an application using the WinForms report viewer, and it displays all our reports perfectly until I need to redeploy a report.
As soon as a report is re-deployed a refresh of the report in the viewer shows the following error message:
An error has occurred during report processing. The data source 'mydatasource' cannot be found.
This error occurs irrespective of electing to re-deploy or not re-deploy the data source, and the only solution seems to be to close the report viewer down and restart it.
I can reproduce the same problem hosting the reports in a web browser as well, with a slightly different error message
An error has occurred during report processing. (rsProcessingAborted). The data source 'mydatasource' cannot be found. (rsDataSourceNotFound)
Under RS2000 any changes that were made to a report, would automatically be shown to the user if the report was refreshed, without having to close the browser/application down and restart it.
Can I configure RS2005 to prevent this error occurring? I have read that RS2005 seems to work a lot more within the IIS session for the user and cache things it thinks are useful, so can I turn this behaviour off, or make it run like RS2000 did, which provided me with a stable reporting platform?
If I cannot do this from the RS end of the system, is there any advice on using the WinForms control to get around this issue? I've only just started using the control, so I am not familiar with all its aspects!
Thanks
Andy
View 3 Replies
View Related
Apr 9, 2006
I have a small archiving SSIS package that I use to archive off old orders. Using OLEDB I found that I was deleting about 1 order (plus all its related records) per second. Trying to find some speed, I switched to the ADO.NET connection and turned pooling on with a minimum of 10. Now I am getting about 100 order records deleted per second.
Has anyone else found that the ADO.NET connection is faster than OLEDB? (I would have thought it would be the other way around.)
The control flow items I am using are Execute SQL Tasks, and the queries are like this:
Insert into arcprod.wh1.pickdetail
Select * from wh1.pickdetail where status='9' and orderkey = @Pram1 and
pickdetailkey not in (select pickdetailkey from arcprod.wh1.pickdetail)
...
delete from wh1.pickdetail where orderkey = @Pram1
etc
View 1 Replies
View Related
Aug 21, 2007
All:
I am trying to code the following SQL into an OLEDB data source, but it is not allowing me to do so - I think because the parameters are nested inside sub-queries. I have seen other posts that suggest using a variable to store the SQL, but I am not sure how that would work.
I would also like to mention that the OLEDB source executes from within a For Each loop that is actually passing the values for the variables, which was one of the reasons I got stumped on how I could have a variable store the SQL.
Here is the SQL:
SELECT b.ProgramID, b.ProductCode, b.BuyerID, b.Vendor, SUM(a.Ordered) AS Qty_Purchased
FROM SXE..POLine a INNER JOIN
(SELECT VIR_Program.ProgramID, VIR_ActiveSKU.ProductCode, VIR_ActiveSKU.BuyerID, Vendor
FROM VIR_Program INNER JOIN
VIR_ActiveSKU ON VIR_Program.ProgramID = VIR_ActiveSKU.ProgramID
INNER JOIN Vendor ON VIR_Program.VendorID = Vendor.VendorID
WHERE ProgramFreq = ?) b
ON a.ProductCode = b.ProductCode
WHERE a.TransDate >= ? AND
a.TransDate < ?
GROUP BY b.ProgramID, b.ProductCode, b.BuyerID, b.Vendor
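One workaround sometimes suggested for parameters inside derived tables is to bind each ? once in a simple SET statement at the top of the command and reference local variables everywhere else; a sketch of that shape (the parameter data types are assumptions):
-- Sketch: the three ? markers are mapped, in order, to the For Each loop's
-- variables in the OLE DB Source parameter mapping.
DECLARE @ProgramFreq int, @FromDate datetime, @ToDate datetime;
SET @ProgramFreq = ?;
SET @FromDate = ?;
SET @ToDate = ?;

SELECT b.ProgramID, b.ProductCode, b.BuyerID, b.Vendor, SUM(a.Ordered) AS Qty_Purchased
FROM SXE..POLine a
INNER JOIN (SELECT VIR_Program.ProgramID, VIR_ActiveSKU.ProductCode, VIR_ActiveSKU.BuyerID, Vendor
            FROM VIR_Program
            INNER JOIN VIR_ActiveSKU ON VIR_Program.ProgramID = VIR_ActiveSKU.ProgramID
            INNER JOIN Vendor ON VIR_Program.VendorID = Vendor.VendorID
            WHERE ProgramFreq = @ProgramFreq) b
    ON a.ProductCode = b.ProductCode
WHERE a.TransDate >= @FromDate
  AND a.TransDate < @ToDate
GROUP BY b.ProgramID, b.ProductCode, b.BuyerID, b.Vendor;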
Thanks!
View 5 Replies
View Related
Jan 15, 2008
I had a package which used a source query in an OLE DB Source, fetched the records, ran a couple of lookups, and then used an OLE DB Command to insert/update the records in the table via an SP. I changed the source query (in fact the whole package), removed one lookup, and called a different SP similar to the old one. But now the package, which previously took only minutes to update 50,000 records, is taking more than 2 hours. The problem is that the number of records it fetches from the source each time is very low: hardly 500 records at a time, compared to nearly 2,500 before. Where am I going wrong? Any suggestion greatly appreciated.
View 5 Replies
View Related
Feb 12, 2008
Hi,
I am running an MDX query in SSIS, but I don't know what the best way of doing this is, performance-wise.
I know I can run the MDX query through an OPENQUERY in the OLE DB Source, or run it through a DataReader with no OPENQUERY needed.
I know the DataReader is normally slower because of .NET, but in this case the OLE DB Source is running an OPENQUERY against a linked server, which won't be as fast as running regular SQL.
If anyone knows which of these two runs faster in this scenario, I'd appreciate it if you let me know.
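For comparison, the OPENQUERY variant would look roughly like this (the linked server name, cube, and MDX text are placeholders):
-- Sketch: the linked server 'SSAS_LINKED' points at the Analysis Services
-- instance; OPENQUERY sends the MDX there and returns a flat rowset that the
-- OLE DB Source can consume like any other SQL result.
SELECT *
FROM OPENQUERY(SSAS_LINKED,
               'SELECT [Measures].[Sales Amount] ON COLUMNS,
                       [Date].[Calendar Year].MEMBERS ON ROWS
                FROM [Adventure Works]');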
View 1 Replies
View Related
May 11, 2007
Hi,
Firstly - I am new to SSIS, so if my approach could be improved then I welcome suggestions.
Scenario: I have a large SSIS package that consolidates / summarizes work week information from several data sources. Currently each data flow task in the control flow calculates the from and to date that is filtered on, for example:
DECLARE @FromDT AS DATETIME
SET @FromDT = CAST(FLOOR( CAST( DATEADD(D, -7, GETDATE()) AS FLOAT ) ) AS DATETIME)
DECLARE @ToDT AS DATETIME
SET @ToDT = CAST(FLOOR( CAST( GETDATE() AS FLOAT ) ) AS DATETIME)
I would like to remove these statements that appear in most steps and replace them with a global variable that is used throughout the package. This statement would only appear once & it would make the package much easier to run after failure etc.
Problem: I am using Data Reader Source with the 'SQLCommand' property specified. It looks like parameters are only supported if an OleDB connection is used?
So I switched to an OleDB connection and no parameters are recognised in the string - a forum search reveals that parameters in sub queries are not always found properly. The solution to this problem appears to be, to set 'Bypass Prepare' to True but this is a property for the Execute SQL task, not the Data Flow Task source.
Questions:
Does the Data Reader Source control from Data Flow Source toolbox section support parameters?
Can anyone suggest a fix to the OleDB Source issue with Parameters?
Is there a better way to solve my problem e.g. Using Execute SQL Task instead of Data Flow tasks etc
Example SQL:
This SQL is an example of the SQL for the OleDB Data Source (within a Data Flow task)
------------------------------
--RADIUS LOGINS
------------------------------
DECLARE @FromDT AS DATETIME
SET @FromDT = CAST(FLOOR( CAST( DATEADD(D, -7, GETDATE()) AS FLOAT ) ) AS DATETIME)
DECLARE @ToDT AS DATETIME
SET @ToDT = CAST(FLOOR( CAST( GETDATE() AS FLOAT ) ) AS DATETIME)
DECLARE @Attempts AS BIGINT
SET @Attempts =
(SELECT COUNT(*)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN @FromDT AND @ToDT)
DECLARE @Failures AS BIGINT
SET @Failures =
(SELECT COUNT(*)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN @FromDT AND @ToDT
AND Authen_Failure_Code IS NOT NULL)
DECLARE @Successes AS BIGINT
SET @Successes = @Attempts - @Failures
DECLARE @OcaV1Hits AS BIGINT
SET @OcaV1Hits = (SELECT COUNT(DISTINCT LoginName)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN
@FromDT AND @ToDT
AND EAPTypeID = 25)
DECLARE @OcaV2Hits AS BIGINT
SET @OcaV2Hits = (SELECT COUNT(DISTINCT LoginName) AS OcaV2Hits
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN
@FromDT AND @ToDT
AND EAPTypeID = 13)
SELECT
@Attempts AS ConnectionAttempts,
@Failures AS ConnectionFailures,
(CAST(@Successes AS DECIMAL(38,2)) / CAST(@Attempts AS FLOAT) * 100) AS SuccessRate,
@OcaV1Hits AS OcaV1Hits,
@OcaV2Hits AS OcaV2Hits
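For what it's worth, a common pattern for removing the repeated date arithmetic is to expose the two dates as ? parameters and map them to package-level variables (e.g. User::FromDT and User::ToDT; the names are illustrative) in the OLE DB Source, so the top of each query shrinks to something like the sketch below:
-- Sketch: the two ? markers are mapped to package-level datetime variables,
-- so the reporting window is computed once per package run instead of in
-- every data flow source.
DECLARE @FromDT datetime, @ToDT datetime;
SET @FromDT = ?;
SET @ToDT = ?;
-- ...the remainder of the query stays as written, still using @FromDT/@ToDT...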
Please remember, I'm new to SSIS - so be detailed in your response. Thanks for your help!
View 5 Replies
View Related
Jun 21, 2006
I am migrating data from one database to another. I want to supply the source and target database names through globally declared user variables (@sourcename, @targetname).
How can I map the variables in the OLE DB Source? I didn't find any option for that.
Can somebody help?
Thanks
Kumar
View 1 Replies
View Related
May 21, 2008
Hello,
I'm developing an ASP website, but I'm getting this error:
Microsoft OLE DB Provider for ODBC Drivers error '80004005'
[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
/default.asp, line 76
I have 2 servers: the website is running from RACS-IIS-001, and the SQL Server is RACS-SQL-001.
I don't know if I'm using the wrong driver; I'm using SQL Native Client.
This is my connection string:
objConn.Open "Driver={SQL Native Client}; Server=RACS-SQL-001; database=RentACar; uid=sa; pwd=PASSWORD;"
Can anyone help?
View 4 Replies
View Related
Jul 27, 2015
1. My current SQL Server Analysis Services database consists of two cubes (Cube A and Cube B). Cube A and Cube B share the same data source view (DSV1). The source for both cubes is the same data source (DS1).
2. As per the requirement I need to create a third cube, Cube C. Is it possible to create a second data source view (DSV2)? The source of the second data source view (DSV2) would be the same data source (DS1).
I am thinking of creating the second data source view (DSV2) for Cube C because the existing layout of DSV1 has become complex. I want to know the pros and cons of creating multiple data source views over the same data source.
View 3 Replies
View Related
Jan 10, 2012
I am putting the query below into an OLE DB Source through a variable (it is a SELECT statement with a WHERE clause from one date to another).
"select TestRecordtype, request_id from department
where LOAD_TMSTP between
(select max(END_TMSTP) LOG) and
(TO_DATE("+RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , @[System::StartTime] ), 2) + "-"
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , @[System::StartTime] ), 2) + "-"
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "yy" , @[System::StartTime] ), 2) + " "
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "hh" , @[System::StartTime] ), 2) + "."
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "mi" , @[System::StartTime] ), 2) + "."
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "ss" , @[System::StartTime] ), 2) +",'DD-MM-YY HH24.MI.SS')) "
View 1 Replies
View Related
Apr 29, 2008
I am debugging a Data Flow task in my SSIS package. When I run the package in debug mode, one of the OLEDB Data Sources turns red. I have rerouted all Error Output to a flat file, and put a Data Viewer on that path: no rows get sent. When I click the Preview button on this component in Design mode, I see the expected data and get no error messages. The connection does a simple table access...no SQL command. I don't see anything different between this component and other OLEDB sources in the same package that don't trigger any errors. I've tried dropping and re-creating the component with the same results.
What else can I do to debug this?
View 7 Replies
View Related
Mar 16, 2007
Guys,
I am having a nightmarish time getting an Oracle Connection Manager working as a source in my SSIS package.
The CM is called "OLTP_SOURCE". When I inspect the configuration and test connection, it succeeds, however when I go to run the package (both in debug mode and via DTEXECUI) I get the following error:
The AcquireConnection method call to the connection manager "OLTP_SOURCE" failed with error code 0xC0202009
After this happens, if I go into an OLE DB Source within a DFT, I get the following:
No disconnected record set is available for the specified SQL statement.
Now, if I go back into the CM, enter the password and test, it succeeds. From this point, I will go to preview the data in the OLE DB Source, and it comes back fine. However, when I go to run the package, I get the same error time and time again:
The AcquireConnection method call to the connection manager "OLTP_SOURCE" failed with error code 0xC0202009
The quick reader will suggest that the password is not being persisted. To this end, I have tried each of the following techniques to no avail:
1. Double, Triple and Quadruple check that the "save" password option in the CM is checked.
2. Hardcode the connection string in the dtsx XML-behind.
3. Enable Package Configurations and hardcode the connection string in the dtsConfig file.
4. Run the dtsx file using DTEXECUI, providing it with the configuration (that includes the hard-coded password).
5. Run the dtsx file using DTEXECUI, providing it the connection string in the Connection Manager override UI.
Can anyone help shed some light on what might be going on? So far, it is obvious that there has to be something that I am doing wrong because (syntax dialect differences aside) I can't imagine that Oracle sources should be this much of a headache.
Thanks,
Rick
View 3 Replies
View Related
Apr 20, 2006
A little background first. I have a header table and a detail table in my staging area/ods. I need to join them together to flatten them out for load. The Detail Table is pretty deep - approx 100 million rows.
If I use the setting (table or view) and set the table name (say, the detail table), the package starts up nicely. But if I switch the OLE DB Source to using a SQL Statement and then join the tables in the SQL, then the Pre-Execute phase of the package takes a VERY long time. I have waited as long as 30 minutes for this phase to complete, but it never finished.
Another twist... If I take the join select statement out of the OLEDB Source and put it in a view on the server, then switch the OLE DB Source to look at the view using the (table or view) mode, the package gets through the Pre-Execute phase just fine.
Can someone go into detail as to what the Pre-Execute phase does and why a deep table might make it take a long time? I know already that the pre-execute phase caches the lookups, but not much else.
Any help?
Mark
http://spaces.msn.com/mgarnerbi
View 3 Replies
View Related
Jul 18, 2006
Hi,
at first...
Yes, I have seen this post http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=366077&SiteID=1 and yes, my Stored Procedure contains a "Set nocount on"...
:-)
My stored procedure contains dynamic SQL that checks a table (given by parameter) against some master data tables, and I would like to write the result of the check to a flat file so I can send it via mail.
But invoking the stored procedure brings up no metadata for the Flat File Destination, although the preview works...
Any other hints?
Except for doing the check completely in SSIS? :-P
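One workaround sometimes suggested (beyond SET NOCOUNT ON) is to give the provider an explicit description of the result shape at the top of the procedure via a branch that never executes; a sketch, with procedure, parameter, and column names invented for illustration:
-- Sketch only: the IF 1 = 0 block never runs, but metadata discovery can read
-- the column names and types from it even though the real result comes from
-- dynamic SQL further down.
ALTER PROCEDURE dbo.usp_CheckAgainstMasterData
    @TableName sysname
AS
BEGIN
    SET NOCOUNT ON;

    IF 1 = 0
        SELECT CAST(NULL AS varchar(128)) AS CheckName,
               CAST(NULL AS int)          AS ErrorCount;

    -- ... existing dynamic check logic, returning the same two columns ...
END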
cheers
Markus
View 1 Replies
View Related
Mar 14, 2008
Hello All,
I understand that the parser is having issues parsing my second command, but I was wondering if anyone had stumbled on a workaround for this:
-- OLEDB Source has no problem with this:
SELECT *
FROM some_table
where
last_update_dt = ?
-- OLEDB Source errors on this:
SELECT *
FROM some_table
where
Convert( char(12), last_update_dt, 112 ) = Convert( char(12), Cast( ? as DateTime), 112 )
The first SQL statement won't return the desired rows because of the wonderful time stamp; the second works well, but the provider won't parse it. I have used "SQL command from variable" to work around this, but we have some tables with 200+ columns, so the only way to use "SQL command from variable" is to do a SELECT *, which I'm trying to avoid for both performance reasons and company standards.
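For reference, a shape that keeps the statement parseable and avoids converting the column is a half-open date range, with both ? markers mapped to the same date variable (the DATEADD shift is the assumption here):
-- Sketch: returns rows from the whole calendar day of the parameter date,
-- without wrapping last_update_dt in CONVERT.
SELECT *
FROM some_table
WHERE last_update_dt >= ?
  AND last_update_dt < DATEADD(day, 1, ?);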
Any thoughts?
Thanks,
Raymond
View 4 Replies
View Related
Mar 31, 2008
I need to select data using a very complex SQL statement. When I use an OLE DB Source component and choose SQL command as the data access mode, the process never ends, but when I put the SQL statement in an Execute SQL Task it works fine and fast. Isn't an OLE DB Source always based on a SQL statement (SELECT *)? So how is it possible that this component becomes so slow?
View 11 Replies
View Related
Sep 15, 2006
Hi All,
Is it possible that an OLEDB Data Flow Source is imposing locks on the source tables? The source is a SQL Server OLTP environment, and although the package will be scheduled to run nightly when the application sees little to no use, I want to be sure that the process isn't impacting any application functions.
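If dirty reads are acceptable for a nightly extract, one low-impact option is a NOLOCK hint (or the READ UNCOMMITTED isolation level) on the source query; a sketch with placeholder names:
-- Sketch: the hint keeps the extract from holding shared locks on the OLTP
-- table while the data flow reads it; the trade-off is the usual dirty-read risk.
SELECT OrderID, OrderDate, Status
FROM dbo.Orders WITH (NOLOCK);   -- hypothetical table and columns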
Thanks for the advice!
Rocco
View 1 Replies
View Related
Jul 21, 2005
I am trying to call a stored procedure as part of my OLE DB Source. It takes two parameters: @StartDate datetime and @EndDate datetime.
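The usual shape for this in the OLE DB Source SQL command text is an EXEC with two ? markers, which are then mapped to date variables in the Parameters dialog; a sketch (the procedure name is invented, since it isn't given above):
-- Sketch: Parameter 0 maps to the start-date variable, Parameter 1 to the
-- end-date variable.
EXEC dbo.usp_GetOrders ?, ?;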
View 9 Replies
View Related
Jul 31, 2006
Hi,
I have created a lastUpdatedDate variable at package level. I run a SQL task and store a date in that variable.
Now I am trying to pass that variable as a parameter to the OLE DB Source (using SQL command). It seems that we can't pass a parameter into any sub-query or derived table in the query; it only works in the outer query. As soon as we place ? in the WHERE clause of an inner query, it starts throwing a 'Syntax Error' saying the connection provider might not support that.
Any idea?
I don't want to use command variables, as my query is going to be quite big.
Note: I have tried SQL Server Native Client and the OLE DB Provider for SQL Server, and this behaviour seems to be consistent in both.
Thanks,
Furrukh baig
View 2 Replies
View Related
Jun 7, 2007
Hi,
I'm using an OLE DB Source to select some data and then putting it into a Flat File Destination.
However, when I look at the data in the OLE DB Source, the columns are ordered like this:
1. id
2. name
3. address
...but in the flat file they come out in the wrong order.
How can I fix this?
Thank you so much.
View 1 Replies
View Related
May 23, 2007
I have a datasource view DSV1. It points to a datasource DS1 that is considered the "primary".
I have created a Report Model that uses DSV1 (and thus uses DS1)
I created a new datasource, DS2, that I would like to use instead of DS1. (I can't just modify DS1, because the modification would be overwritten when we go to our Production environment and would break that datasource.)
So, I can go into DSV1 and change all the references from DS1 to DS2.
But that's where the problem lies.
When I try to build, I get the following error:
"The Table property of the Entity "E1" refers to the Table "dbo_View", which is not in the primary data source."
Somehow, the entity is tied to the "primary" datasource. When I change it back to DS1, everything works fine. Any thoughts? What can I do?
View 1 Replies
View Related
Nov 6, 2007
This might be an ignorant question, but I can't figure this out.
What is the purpose of the Error output data flow from the OLEDB Source? I am trying to come up with an example of what kind of "error" would cause a row to go down this path, and I can't think of one.
Does anyone have a good example of how this could be useful?
View 5 Replies
View Related
Mar 17, 2008
hello guys,
I have 10 tables: table1, table2, table3, table4, ... table10. All these tables have different structures.
From each of these tables I want to extract data and dump it into a flat file (CSV).
So I have an OLE DB Source and a Flat File Destination.
If I write a separate data flow task for each of the tables, there is no issue.
But I want to use a single data flow task for all these tables. For this, I use a variable @SQLStr and dynamically set its value to select * from table1, select * from table2, ... select * from table10.
So in the OLE DB Source I select the data access mode "SQL command from variable" and use @SQLStr as the variable.
For the destination, I dynamically generate the flat file for each of the tables.
But this doesn't work; I get validation errors.
So, first, can I use a single data flow task to dynamically change the source (different source tables) and destination? If so, what am I missing in the above process?
Any help appreciated.
Thanks
View 8 Replies
View Related
Feb 2, 2008
Slow OLEDB Source in Data Flow
Hi All,
I have a simple data flow task, composed of only an OLEDB Source, a Conditional Split, and two Execute SQL statements (both insert statements, one after the other). When I run my package in Visual Studio for debugging, I noticed that after executing around ~9,800 records in the first and another ~9,800 records in the second insert statement, the OLEDB Source takes around 3 or 4 minutes to fetch another set of ~9,800 records. I have set the DefaultBufferMaxRows property of the Data Flow to 10000. My query to retrieve those 700,000 records runs for about 2-3 minutes (which I think should be decent enough). Is this expected behavior for SSIS? The expected number of records to be retrieved is 700,000, and it takes forever to finish transferring these records. Please help.
View 4 Replies
View Related
Nov 26, 2007
Hi,
I have an Excel source, which I have hooked up to a Data Conversion task. I have defined "Output Aliases" for all my columns in the Data Conversion task.
However, when I try to map the columns from the Data Conversion task to the table columns, there is a list of column names, which do not correspond to the names I defined as "Output Aliases."
For example, one of the Output Aliases is "col1." However, when I go to map it, the column name is not "col1" but "My Excel file.col1".
Why is this happening? I have not had this problem before.
Thanks
View 3 Replies
View Related
Mar 14, 2006
We have a complicated select query that needs to build a couple temporary work tables that are then used in the final select statement (in an OLEDB Source data flow control). We can click preview and see the resultset, but if we click on the Columns view there are no columns. We can save and close the OLEDB Source control but downstream from it there are messages saying that there are no input columns. The T-SQL looks something like this (abbreviated):
SELECT fieldlist INTO #temp1 FROM table
SELECT fieldlist INTO #temp2 FROM table
SELECT fieldlist FROM table INNER JOIN #temp1 INNER JOIN #temp2
DROP TABLE #temp1; DROP TABLE #temp2
Has anyone been able to use temp tables in a source SQL statement in a data flow? Are we doing something wrong or incomplete?
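One workaround that usually keeps the Columns view populated is to swap the #temp tables for table variables, since they don't break the provider's metadata discovery; a sketch of the same shape (column lists and table names are placeholders):
-- Sketch: DECLARE + INSERT ... SELECT replaces SELECT ... INTO #temp, and the
-- final join is unchanged apart from the @ names.
DECLARE @temp1 TABLE (KeyCol int, Value1 varchar(50));
DECLARE @temp2 TABLE (KeyCol int, Value2 varchar(50));

INSERT INTO @temp1 (KeyCol, Value1)
SELECT KeyCol, Value1 FROM dbo.Table1;

INSERT INTO @temp2 (KeyCol, Value2)
SELECT KeyCol, Value2 FROM dbo.Table2;

SELECT t.KeyCol, t1.Value1, t2.Value2
FROM dbo.Table3 t
INNER JOIN @temp1 t1 ON t1.KeyCol = t.KeyCol
INNER JOIN @temp2 t2 ON t2.KeyCol = t.KeyCol;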
Thanks, Gordy
View 3 Replies
View Related
Aug 24, 2006
I am adding two OLE DB Source components to my pipeline and giving them different names
(via the Sourcename variable), but it seems they keep the default name and description, i.e. "OLE DB Source", which causes the following error message on opening the package that was generated:
The package contains two objects with the duplicate name of "component "OLE DB Source" (37)" and "component "OLE DB Source" (1)"
Here is how I am doing it. In the debugger the names in both instances seem to be set correctly based on the variable, but when the package is saved they are lost. What am I doing wrong here?
IDTSComponentMetaData90 source = dataFlowTask.ComponentMetaDataCollection.New();
source.Name = Sourcename;
source.ComponentClassID = "DTSAdapter.OleDbSource.1";
source.Description = Sourcename;
Thanks
View 1 Replies
View Related
Jun 25, 2007
hi,
I have a number of interfaces in which I have used an OLE DB Source.
The problem I am facing is that the OLE DB Source component's code page is not configurable. Now, if I want to deploy the interface in a different environment which has a database with a different collation, it gives an error that the OLE DB Source needs new metadata.
Has anybody faced this problem before? Please give me a solution to this problem.
Thanks in advance.
srikanth
View 1 Replies
View Related
Mar 11, 2008
I'm importing from a SQL table that has data fields typed as numeric(18,2), and the OLEDB data source component converts the data to integers (as viewed in the data viewer). I've preceded the column names with (DT_NUMERIC,18,2) with no results. When the data gets saved to a table with the field typed as money, it appends .00. The truncation of pennies (decimals) reduces the daily results by as much as $1,000. How do I pass the pennies through the OLEDB data source component? Is this truncation by default, or is there something I'm missing in the configuration? Thanks.
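One thing worth trying is making the scale explicit in the source query itself, so the pipeline column is created as DT_NUMERIC(18,2) rather than inferred; a sketch with placeholder names:
-- Sketch: the CAST pins the external column type, which should carry the
-- pennies through to the money destination column.
SELECT CAST(SaleAmount AS numeric(18, 2)) AS SaleAmount   -- hypothetical column
FROM dbo.DailySales;                                       -- hypothetical table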
Dan
View 4 Replies
View Related