OLEDB Source Running Full MDX Query When Validating
Feb 18, 2008
Hi,
I have an Integration Services project that creates a flat file report from Analysis Services. I'm using an OLE DB data source and running an OPENQUERY in the SQL statement.
The problem is that Integration Services runs the query twice before writing the data to the flat file. I know this because the query appears twice in Profiler, and because the same query takes half the time when run in Management Studio.
Integration Services is running the whole query when validating. How can I disable this validation, or better, make it validate properly?
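For reference, the pattern I'm describing is an OPENQUERY over a linked Analysis Services server. A minimal sketch (the linked server name SSAS_LINK, the cube, and the MDX below are placeholders, not my actual query):
SELECT *
FROM OPENQUERY(SSAS_LINK,
    'SELECT [Measures].[Sales Amount] ON COLUMNS,
            [Date].[Calendar Year].MEMBERS ON ROWS
     FROM [Adventure Works]');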
thanks
View 11 Replies
Jan 24, 2007
One of our developers has written a view which executes completely (returns ~38,000 rows) in approximately 1 minute from SQL Server Management Studio (results start at 20 seconds and complete by 1:10 consistently).
However, if he adds a data flow task in SSIS, adds an OLE DB data source, sets the data access mode to "Table or view" and then selects the same view, it consistently takes over 30 minutes (at which point we've been killing it). I can see the activity in the Activity Monitor; it is doing a SELECT * from that view and is runnable the whole time.
If we modify the view to SELECT TOP 10, it returns in a short time.
Has anyone run into this problem? Any suggestions? It is very problematic, because if the view changes we have to hack around this problem.
Thanks for any responses.
Jeff
View 5 Replies
View Related
Jan 10, 2012
I am putting the query below into an OLE DB source through a variable (it is a SELECT statement with a WHERE clause filtering from one date to another).
"select TestRecordtype, request_id from department
where LOAD_TMSTP between
(select max(END_TMSTP) LOG) and
(TO_DATE("+RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , @[System:tartTime] ), 2) + "-"
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , @[System:tartTime] ), 2) + "-"
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "yy" , @[System:tartTime] ), 2) + " "
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "hh" , @[System:tartTime] ), 2) + "."
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "mi" , @[System:tartTime] ), 2) + "."
+RIGHT("0" + (DT_STR,4,1252)DATEPART( "ss" , @[System:tartTime] ), 2) +",'DD-MM-YY HH24.MI.SS')) "
View 1 Replies
View Related
Jul 5, 2006
Provider cannot derive parameter information and SetParameterInfo has not been called. (Microsoft OLE DB Provider for Oracle)
I am getting the above error when opening the parameters dialog in an OLE DB source for Oracle, using the SQL command option as the data access mode. Can anyone please help me troubleshoot this problem?
View 8 Replies
View Related
Mar 23, 2006
Can I extend the "Query Builder" dialog of the OLE DB Source Editor when developing a custom source component?
View 1 Replies
View Related
Nov 2, 2006
Learning how to use SSIS...
I have a data flow that uses an OLEDB Source Component to read data from a table. The data access mode is SQL Command. The SQL Command is:
select lpartid, iCallNum, sql_uid_stamp
from call where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare)
I want to add additional clauses to the WHERE clause.
The problem is that I want the SQL command to reference a package variable, so that the variable's value at the time of package execution is used.
The package variable is called [User::Date_BeginningYesterday]
select lpartid, iCallNum, sql_uid_stamp
from call where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare) and record_modified < [User::Date_BeginningYesterday]
I have looked at various forum messages and been through BOL, but I seem to be missing something to make this work properly.
http://msdn2.microsoft.com/en-us/library/ms139904.aspx
That article is, I believe, the closest I have come to finding a solution. I am sure the solution is so easy that it is staring me in the face and I just don't see it. Thank you for your assistance.
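For the record, the commonly suggested shape for this (a sketch, not something I have confirmed here) is a ? placeholder in the outer WHERE clause, mapped to User::Date_BeginningYesterday on the Parameters page of the OLE DB Source Editor:
select lpartid, iCallNum, sql_uid_stamp
from call
where sql_uid_stamp not in (select sql_uid_stamp from import_callcompare)
  and record_modified < ?   -- parameter 0, mapped to User::Date_BeginningYesterday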
...cordell...
View 4 Replies
View Related
Oct 26, 2007
Hi,
I'm trying to use query parameters with an Oracle OLE DB source in a data flow task and I'm having problems.
I've tried formatting the query each of the following ways...
--
select
frq_code,
frq_name,
update_frq,
uptime_frq
from frequency_bcs
where update_frq > ?
and update_frq <= ?
--
Parameters cannot be extracted from the SQL command. The provider might not help to parse parameter information from the command. In that case, use the "SQL command from variable" access mode, in which the entire SQL command is stored in a variable.
Additional information:
---> Provider cannot derive parameter information and SetParameterInfo has not been called. (Microsoft OLE DB Provider for Oracle).
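The usual workaround, as the message itself suggests, is the "SQL command from variable" access mode: build the statement in a string variable (via an expression) so that the finished command contains literal date values and the Oracle provider never has to parse a ? placeholder. A sketch of what the evaluated command text might look like (the dates shown are placeholders, not real boundary values):
select frq_code,
       frq_name,
       update_frq,
       uptime_frq
from frequency_bcs
where update_frq >  TO_DATE('2007-10-19 00:00:00', 'YYYY-MM-DD HH24:MI:SS')
  and update_frq <= TO_DATE('2007-10-26 00:00:00', 'YYYY-MM-DD HH24:MI:SS')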
View 7 Replies
View Related
Apr 26, 2007
Hi,
Urgent help required...
Can anyone explain the steps for sending a parameterized query to Oracle?
If you know of any other component that can do this, rather than the OLE DB source, please let me know.
THanks
View 18 Replies
View Related
Oct 10, 2006
Hi all,
I got an error when using an OLE DB source pointing to a SQL 2000 database and executing a SQL query inside the OLE DB source. The OLE DB source points to an OLE DB destination, which is a SQL 2005 database.
But I got the error below:
Error at Data Flow Task [OLE DB Destination [245]]: the column firstname cannot be processed because more than one code page (936 and 1252) are specified for it.
Error at Data Flow Task [DTS.Pipeline]: "component "OLE DB destination" (245)" failed validation and returned validation status "VS_ISBROKEN".
Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.
Error at Data Flow TaSK: There were errors during task validation.
(Microsoft.DataTransformationServices.VsIntegration)
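One workaround that is sometimes used for this kind of code-page mismatch (a sketch; the column length and the table name dbo.Customers are illustrative, not from the original post) is to cast the offending column to nvarchar in the source query so the destination receives Unicode data rather than two conflicting code pages. Setting AlwaysUseDefaultCodePage on the OLE DB destination is another option people try.
SELECT CAST(firstname AS nvarchar(50)) AS firstname   -- 50 is a guess; match the real column length
FROM dbo.Customers                                     -- hypothetical source table name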
View 5 Replies
View Related
Dec 13, 2006
I'm totally new to SSIS and need some direction. I've worked with the Import/Export wizard to create a package that imports a text file into a SQL Server table. However, I'm told the format of the text file changes over time, and that's not good. I need to program in a format validation check on the source file before it gets imported. If the format changes, I'm supposed to throw an alert or something.
Let's say the file has the columns: field1 (string[10]), field2 (date), field3 (integer), field4 (decimal).
I did some testing and tried to change the data to a longer string in field1, and SSIS recognizes that and errors out. How do I get it to send the bad record to a bad-record file? Do I just set a destination file connection for bad records and connect the red arrow from the source file to the destination file? I forget whether the source file connection recognizes a bad date and errors out; I'll have to check again.
But when I changed the data in the data file for field3 from integer to decimal, it didn't recognize that as an error. It read it in "successfully". That's not good. A similar thing happened when I changed field4 from decimal to integer in the data file, but I'm not too worried about that.
Any hints on how to do this, or a better approach to checking for file format changes, would be appreciated.
Ken
View 2 Replies
View Related
Sep 23, 2007
Greetings!!
I have an MS Access db containing a table called Employees which I am transferring to a staging table in SQL Server 2005. Everything is working fine. I am using a Foreach File enumerator and uploading the files one by one. However, I now plan to validate the schema of the MS Access table before uploading it. For example, my Employee table in MS Access is as follows:
Code Snippet
Employees
empId int,
empName varchar(60),
empAge int
Since the files come from different vendors, while looping I want to check whether empId or empAge are of a larger type such as long, or of type string, etc. If they are of type smallint, I have no problem.
However, if they are larger data types than the ones in the SQL Server table, then the file needs to be logged in the db with the reason and moved to the error folder. In short, if the data types in the Access table are smaller than or equal to those in SQL Server, allow it; otherwise reject it. The schema of the SQL Server table is the same as that of Employees in MS Access.
I want to compare the schema of the incoming Access table fields with my desired schema, and all mdb files having data types that are larger than or incompatible with the desired schema should be moved to the error folder.
How do I do this?
Thanks ,
Ron
View 1 Replies
View Related
Mar 5, 2007
I am getting the following error when running a query that uses a full-text index:
select *
from workitemlongtexts
where Contains(words, 'test');
==error message==
Msg 7619, Level 16, State 1, Line 1
The execution of a full-text query failed. "Service is not running."
I have verified that
1. msftesql service is running,
2. I even rebuilt the full-text index and it didn't help
3. my full-text crawl job is running fine
4. my DB is full-text enabled
Can someone explain what "Service" the error refers to? I am using SQL 2005 Ent SP1.
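A few server-side checks (a sketch using standard SQL Server metadata functions; the catalog name MyCatalog is a placeholder) that can help narrow down which piece the error means by "Service":
-- 1 = full-text is installed on the instance
SELECT FULLTEXTSERVICEPROPERTY('IsFullTextInstalled') AS FullTextInstalled;
-- 1 = this database is enabled for full-text
SELECT DATABASEPROPERTYEX(DB_NAME(), 'IsFullTextEnabled') AS DbFullTextEnabled;
-- catalog population status (0 = idle); replace MyCatalog with the real catalog name
SELECT FULLTEXTCATALOGPROPERTY('MyCatalog', 'PopulateStatus') AS PopulateStatus;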
View 1 Replies
View Related
Jan 15, 2008
I had a package which used a source query in an OLE DB source component to fetch records, a couple of lookups, and then an OLE DB Command to insert/update the records in the table via a stored procedure. I changed the source query (in fact, the package), removed one lookup, and called a different stored procedure similar to the old one. But my problem is that the package, which before took only minutes to update 50,000 records, is now taking more than 2 hours. The problem is that the number of records it fetches from the source each time is very low: it is fetching hardly 500 records at a time, compared to nearly 2,500 records before. Where am I going wrong? Any suggestion is greatly appreciated.
View 5 Replies
View Related
Feb 12, 2008
Hi,
I am running an MDX query in SSIS but I don't know what the best way of doing this is, performance-wise.
I know I can run the MDX query through an OPENQUERY in the OLE DB source, and I can also run it through a DataReader source, with no OPENQUERY needed.
I know the DataReader is normally slower because of .NET, but in this case the OLE DB source is running an OPENQUERY against a linked server, which won't be as fast as running regular SQL.
If anyone knows which of these two runs faster in this scenario, I'd appreciate it if you let me know.
View 1 Replies
View Related
May 11, 2007
Hi,
Firstly, I am new to SSIS, so if my approach could be improved then I welcome suggestions.
Scenario: I have a large SSIS package that consolidates/summarizes work-week information from several data sources. Currently each data flow task in the control flow calculates the from and to dates that are filtered on, for example:
DECLARE @FromDT AS DATETIME
SET @FromDT = CAST(FLOOR( CAST( DATEADD(D, -7, GETDATE()) AS FLOAT ) ) AS DATETIME)
DECLARE @ToDT AS DATETIME
SET @ToDT = CAST(FLOOR( CAST( GETDATE() AS FLOAT ) ) AS DATETIME)
I would like to remove these statements, which appear in most steps, and replace them with a global variable that is used throughout the package. The calculation would then appear only once, and it would make the package much easier to re-run after a failure, etc.
Problem: I am using a DataReader source with the 'SqlCommand' property specified. It looks like parameters are only supported if an OLE DB connection is used?
So I switched to an OLE DB connection, and no parameters are recognised in the string. A forum search reveals that parameters in subqueries are not always found properly. The suggested solution to this appears to be to set 'BypassPrepare' to True, but that is a property of the Execute SQL task, not of the data flow source.
Questions:
Does the DataReader source from the Data Flow Sources toolbox section support parameters?
Can anyone suggest a fix for the OLE DB source issue with parameters?
Is there a better way to solve my problem, e.g. using an Execute SQL task instead of data flow tasks, etc.?
Example SQL:
This SQL is an example of the SQL for the OleDB Data Source (within a Data Flow task)
------------------------------
--RADIUS LOGINS
------------------------------
DECLARE @FromDT AS DATETIME
SET @FromDT = CAST(FLOOR( CAST( DATEADD(D, -7, GETDATE()) AS FLOAT ) ) AS DATETIME)
DECLARE @ToDT AS DATETIME
SET @ToDT = CAST(FLOOR( CAST( GETDATE() AS FLOAT ) ) AS DATETIME)
DECLARE @Attempts AS BIGINT
SET @Attempts =
(SELECT COUNT(*)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN @FromDT AND @ToDT)
DECLARE @Failures AS BIGINT
SET @Failures =
(SELECT COUNT(*)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN @FromDT AND @ToDT
AND Authen_Failure_Code IS NOT NULL)
DECLARE @Successes AS BIGINT
SET @Successes = @Attempts - @Failures
DECLARE @OcaV1Hits AS BIGINT
SET @OcaV1Hits = (SELECT COUNT(DISTINCT LoginName)
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN
@FromDT AND @ToDT
AND EAPTypeID = 25)
DECLARE @OcaV2Hits AS BIGINT
SET @OcaV2Hits = (SELECT COUNT(DISTINCT LoginName) AS OcaV2Hits
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN
@FromDT AND @ToDT
AND EAPTypeID = 13)
SELECT
@Attempts AS ConnectionAttempts,
@Failures AS ConnectionFailures,
(CAST(@Successes AS DECIMAL(38,2)) / CAST(@Attempts AS FLOAT) * 100) AS SuccessRate,
@OcaV1Hits AS OcaV1Hits,
@OcaV2Hits AS OcaV2Hits
Please remember, I'm new to SSIS - so be detailed in your response. Thanks for your help!
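One way to centralize the dates (a sketch, assuming two package-level DateTime variables such as User::FromDT and User::ToDT that are populated once, e.g. by an Execute SQL task or variable expressions) is to strip the date calculation out of each source query and pass the boundaries in as mapped parameters, so every data flow shares the same values. Whether the provider derives the parameters cleanly can vary, so treat this as something to try rather than a guaranteed fix:
-- the leading DECLARE/SET block is replaced by two mapped parameters
DECLARE @FromDT AS DATETIME;
DECLARE @ToDT AS DATETIME;
SET @FromDT = ?;   -- parameter 0 -> User::FromDT
SET @ToDT = ?;     -- parameter 1 -> User::ToDT
SELECT COUNT(*) AS ConnectionAttempts
FROM dbo.Radius_Login_Records
WHERE LoggedAt BETWEEN @FromDT AND @ToDT;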
View 5 Replies
View Related
Jun 21, 2006
I am migrating data from one database to another. I want to supply the source and target database names through globally declared user variables (@sourcename, @targetname).
How can I map the variables in the OLE DB source? I didn't find any option for that.
Can somebody help?
Thanks
Kumar
View 1 Replies
View Related
Apr 29, 2008
I am debugging a Data Flow task in my SSIS package. When I run the package in debug mode, one of the OLEDB Data Sources turns red. I have rerouted all Error Output to a flat file, and put a Data Viewer on that path: no rows get sent. When I click the Preview button on this component in Design mode, I see the expected data and get no error messages. The connection does a simple table access...no SQL command. I don't see anything different between this component and other OLEDB sources in the same package that don't trigger any errors. I've tried dropping and re-creating the component with the same results.
What else can I do to debug this?
View 7 Replies
View Related
Mar 16, 2007
Guys,
I am having a nightmarish time getting an Oracle Connection Manager working as a source in my SSIS package.
The CM is called "OLTP_SOURCE". When I inspect the configuration and test connection, it succeeds, however when I go to run the package (both in debug mode and via DTEXECUI) I get the following error:
The AcquireConnection method call to the connection manager "OLTP_SOURCE" failed with error code 0xC0202009
After this happens, if I go into an OLE DB Source within a DFT, I get the following:
No disconnected record set is available for the specified SQL statement.
Now, if I go back into the CM, enter the password and test, it succeeds. From this point, I will go to preview the data in the OLE DB Source, and it comes back fine. However, when I go to run the package, I get the same error time and time again:
The AcquireConnection method call to the connection manager "OLTP_SOURCE" failed with error code 0xC0202009
The quick reader will suggest that the password is not being persisted. To this end, I have tried each of the following techniques to no avail:
1. Double, Triple and Quadruple check that the "save" password option in the CM is checked.
2. Hardcode the connection string in the dtsx XML-behind.
3. Enable Package Configurations and hardcode the connection string in the dtsConfig file.
4. Run the dtsx file using DTEXECUI, providing it with the configuration (that includes the hard-coded password).
5. Run the dtsx file using DTEXECUI, providing it the connection string in the Connection Manager override UI.
Can anyone help shed some light on what might be going on? So far, it is obvious that there has to be something that I am doing wrong because (syntax dialect differences aside) I can't imagine that Oracle sources should be this much of a headache.
Thanks,
Rick
View 3 Replies
View Related
Apr 20, 2006
A little background first. I have a header table and a detail table in my staging area/ods. I need to join them together to flatten them out for load. The Detail Table is pretty deep - approx 100 million rows.
If I use the setting (table or view) and set the table name (say, the detail table), the package starts up nicely. But if I switch the OLE DB Source to using a SQL Statement and then join the tables in the SQL, then the Pre-Execute phase of the package takes a VERY long time. I have waited as long as 30 minutes for this phase to complete, but it never finished.
Another twist... If I take the join select statement out of the OLE DB source and put it in a view on the server, then switch the OLE DB source to look at the view using the (table or view) mode, the package gets through the Pre-Execute phase just fine.
Can someone go into detail as to what the Pre-Execute phase does and why a deep table might make it take a long time? I know already that the pre-execute phase caches the lookups, but not much else.
Any help?
Mark
http://spaces.msn.com/mgarnerbi
View 3 Replies
View Related
Jul 18, 2006
Hi,
at first...
Yes, I have seen this post http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=366077&SiteID=1 and yes, my Stored Procedure contains a "Set nocount on"...
:-)
My stored procedure contains dynamic SQL that checks a table (given by parameter) against some master data tables, and I would like to write the result of the check to a flat file so I can send it via mail.
But invoking the stored procedure brings up no metadata for the Flat File destination, although the preview works...
any other hints?
except doing the check completely in SSIS?
:-P
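One trick that is sometimes suggested when a stored procedure source yields no metadata (a sketch; dbo.usp_CheckTable and its parameter are hypothetical names, and be aware the procedure may effectively execute during metadata discovery as well as at run time) is to prepend SET FMTONLY OFF to the SQL command:
SET FMTONLY OFF;
EXEC dbo.usp_CheckTable @TableName = ?;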
cheers
Markus
View 1 Replies
View Related
Mar 14, 2008
Hello All,
I understand that the parser is having issues parsing my second command, but I was wondering if anyone has stumbled on a workaround for this:
-- OLEDB Source has no problem with this:
SELECT *
FROM some_table
where
last_update_dt = ?
-- OLEDB Source errors on this:
SELECT *
FROM some_table
where
Convert( char(12), last_update_dt, 112 ) = Convert( char(12), Cast( ? as DateTime), 112 )
The first SQL statement won't return the desired rows because of the wonderful time stamp; the second works well, but the provider won't parse it. I have used "SQL command from variable" to work around this, but we have some tables with 200+ columns, so the only way to use "SQL command from variable" is to do a SELECT *, which I'm trying to avoid, both for performance reasons and for company standards.
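One pattern that sometimes avoids the parsing problem altogether (a sketch, assuming the intent of the CONVERT is to match rows for the calendar day held in the parameter, and that the two boundary values are computed in the package and mapped as two parameters) is to leave the column untouched and compare it to a half-open date range, which the provider can usually parse and which keeps any index on last_update_dt usable:
SELECT *
FROM some_table
WHERE last_update_dt >= ?   -- start of the target day
  AND last_update_dt <  ?   -- start of the following day, computed in the package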
Any thoughts?
Thanks,
Raymond
View 4 Replies
View Related
Mar 31, 2008
I need to select data using a very complex SQL statement. When I use an OLE DB source component and choose SQL command as the data access mode, the process never ends. But when I put the same SQL statement in an Execute SQL task, it works fine and fast. Isn't an OLE DB source always based on a SQL statement (SELECT *)? So how is it possible that this component becomes so slow?
View 11 Replies
View Related
Sep 15, 2006
Hi All,
Is it possible that an OLEDB Data Flow Source is imposing locks on the source tables? The source is an SQL Server OLTP environment, and although the package will be scheduled to run nightly when the application sees little to no use, I want to be sure that the process isn't impacting any application functions.
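For what it's worth, an OLE DB source simply runs whatever SELECT it is given under the connection's default isolation level, so it can take shared locks like any other reader. If that is a concern, one option (a sketch; the table and column names are illustrative, and the usual dirty-read caveats of NOLOCK apply) is to make the source query explicit about it:
SELECT OrderID, OrderDate              -- illustrative columns
FROM dbo.Orders WITH (NOLOCK)          -- hypothetical table; NOLOCK = READUNCOMMITTED, dirty reads possible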
Thanks for the advice!
Rocco
View 1 Replies
View Related
Jul 21, 2005
I am trying to call a stored procedure as part of my OLE DB source. It takes two parameters: @StartDate datetime and @EndDate datetime.
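A common shape for this (a sketch; dbo.usp_GetData is a hypothetical procedure name, and parameters 0 and 1 are mapped to package date variables on the Parameters page of the OLE DB Source Editor):
EXEC dbo.usp_GetData @StartDate = ?, @EndDate = ?;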
View 9 Replies
View Related
Jul 31, 2006
Hi,
I have created a lastUpdatedDate variable at package level. I have run a SQL task and stored a date in that variable.
Now I am trying to pass that variable as a parameter to the OLE DB source (using a SQL command). It seems that we can't pass a parameter in any subquery or derived table in the query; it only works in the outer query. As soon as we place ? in the WHERE clause of an inner query, it starts throwing a 'Syntax Error' saying that the connection provider might not support that.
Any idea?
I don't want to use command variables, as my query is going to be quite big.
Note: I have tried SQL Server Native Client and the OLE DB Provider for SQL Server, and this behaviour seems to be the same in both.
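One workaround that keeps the SQL command access mode (a sketch; whether the provider accepts the top-level assignment varies, so treat it as something to try, and the table and column names here are hypothetical) is to bind the ? placeholder to a local variable first and then use that variable inside the derived table or subquery:
DECLARE @lastUpdatedDate datetime;
SET @lastUpdatedDate = ?;            -- parameter 0 -> User::lastUpdatedDate
SELECT t.*
FROM (SELECT *
      FROM dbo.SomeTable             -- hypothetical table name
      WHERE ModifiedDate > @lastUpdatedDate) AS t;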
Thanks,
Furrukh baig
View 2 Replies
View Related
Jun 7, 2007
Hi,
I'm using an OLE DB source to select some data and then putting it in a Flat File destination.
However, when I look at the data in the OLE DB source, the columns are in this order:
1. id
2. name
3. address
...but in the flat file they come out in the wrong order.
How can I fix this?
Thank you so much.
View 1 Replies
View Related
Nov 6, 2007
This might be an ignorant question, but I can't figure this out.
What is the purpose of the Error output data flow from the OLE DB source? I am trying to think of an example of what kind of "error" would cause a row to go down this path, and I can't come up with one.
Does anyone have a good example of how this could be useful?
View 5 Replies
View Related
Mar 17, 2008
hello guys,
I have 10 tables: table1, table2, table3, table4 ... table10. All these tables have different structures.
From each of these tables I want to extract data and dump it into a flat CSV file.
So I have an OLE DB source and a Flat File destination.
If I write a separate data flow task for each of the tables, there is no issue.
But I want to use a single data flow task for all these tables. So I use a variable @SQLStr and dynamically set its value to select * from table1, select * from table2 ... select * from table10.
So in the OLE DB source I select the data access mode "SQL command from variable" and use @SQLStr as the variable.
For the destination, I dynamically generate the flat file for each of the tables.
But this doesn't work; I get validation errors.
So, first, can I use a single data flow task to dynamically change the source (different source tables) and destination? If so, what am I missing in the above process?
Any help appreciated.
Thanks
View 8 Replies
View Related
Feb 2, 2008
Slow OLEDB Source in Data Flow
Hi All,
I have a simple data flow task, composed of only an OLEDB Source, a Conditional Split, and two Execute SQL statements (both insert statements, one after the other). When I run my package in Visual Studio for debugging, I noticed that after executing around ~9800 in the first and another ~9800 records in the second insert statements, the OLEDB Source will take around 3 or 4 minutes to fetch another set of ~9800 records. I have set the DefaultBufferMaxRows property of the Data Flow to 10000. My query to retrieve those 700,000 records runs for about 2-3 mins to finish (which I think should be decent enough). Is this an expected behavior of SSIS? The expected number of records to be retrieved is 700,000, and it takes forever to finish the transfer of these records. Please help
View 4 Replies
View Related
Nov 26, 2007
Hi,
I have an Excel source, which I have hooked up to a Data Conversion task. I have defined "Output Aliases" for all my columns in the Data Conversion task.
However, when I try to map the columns from the Data Conversion task to the table columns, there is a list of column names, which do not correspond to the names I defined as "Output Aliases."
For example, one of the Output Aliases is "col1." However, when I go to map it, the column name is not "col1" but "My Excel file.col1".
Why is this happening? I have not had this problem before.
Thanks
View 3 Replies
View Related
Mar 14, 2006
We have a complicated select query that needs to build a couple temporary work tables that are then used in the final select statement (in an OLEDB Source data flow control). We can click preview and see the resultset, but if we click on the Columns view there are no columns. We can save and close the OLEDB Source control but downstream from it there are messages saying that there are no input columns. The T-SQL looks something like this (abbreviated):
SELECT fieldlist INTO #temp1 FROM table
SELECT fieldlist INTO #temp2 FROM table
SELECT fieldlist FROM table INNER JOIN #temp1 INNER JOIN #temp2
DROP TABLE #temp1; DROP TABLE #temp2
Has anyone been able to use temp tables in a source SQL statement in a data flow? Are we doing something wrong or incomplete?
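One workaround that often restores the column metadata (a sketch of the same abbreviated shape; the column lists and table names are placeholders, just as "fieldlist" is in the original) is to swap the #temp tables for table variables, which the metadata pass can usually cope with, and to put SET NOCOUNT ON at the top:
SET NOCOUNT ON;
DECLARE @temp1 TABLE (id int, val varchar(50));   -- column lists are placeholders
DECLARE @temp2 TABLE (id int, val varchar(50));
INSERT INTO @temp1 (id, val) SELECT id, val FROM dbo.Table1;   -- hypothetical source tables
INSERT INTO @temp2 (id, val) SELECT id, val FROM dbo.Table2;
SELECT t.id, t1.val AS val1, t2.val AS val2
FROM dbo.Table3 AS t
INNER JOIN @temp1 AS t1 ON t1.id = t.id
INNER JOIN @temp2 AS t2 ON t2.id = t.id;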
Thanks, Gordy
View 3 Replies
View Related
Aug 24, 2006
I am adding two OLE DB source components to my pipeline and giving them different names (via the Sourcename variable), but they are assuming the default name and description, i.e. "OLE DB Source", which causes the following error message on opening the generated package:
The package contains two objects with the duplicate name of "component "OLE DB Source" (37)" and "component "OLE DB Source" (1)"
Here is how I am doing it. In the debugger, the names in both instances seem to have been set correctly based on the variable, but when saved they are lost. What am I doing wrong here?
IDTSComponentMetaData90 source = dataFlowTask.ComponentMetaDataCollection.New();
source.Name = Sourcename;
source.ComponentClassID = "DTSAdapter.OleDbSource.1";
source.Description = Sourcename;
Thanks
View 1 Replies
View Related
Jun 25, 2007
hi,
I have a number of interfaces in which I have used an OLE DB source.
The problem I am facing is that the OLE DB source component's code page is not configurable. If I want to deploy the interface in a different environment, which has a database with a different collation, it gives an error that the OLE DB source needs new metadata.
Has anybody faced this problem before? Please give me a solution to this problem.
Thanks in advance.
srikanth
View 1 Replies
View Related