The problem I've been struggling with is the following. I have a set of text files (around 70), each with different column counts and types. I define a Flat File Connection Manager for each of them, where I can nicely rename columns, set data types, and omit certain columns. I do this once, and it will be the basis for the rest of the data process (doing it programmatically would be nice too, actually). I would like to pump each of these text files into SQL Server tables using CREATE TABLE and BULK INSERT (because doing it one by one is really a pain). The question is:
is there a way to obtain column information (from a Script Task) from a Connection Manager, so I can run the CREATE TABLE statements? I just need the name and data type of each column, nothing fancy...
(I bumped into interfaces like IDTSConnectionManagerFlatFileColumns90, which I cannot handle from the Script Task.)
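To make it concrete, here is the kind of thing I am trying to write in the Script Task (sketched as SSIS 2008 C# — on 2005 the Script Task is VB.NET and the interfaces are the ...90 equivalents; the connection name, variable name and table name are made up, and I have not got this to run):

// Sketch, untested: read column metadata from a Flat File Connection Manager.
// Assumes a connection named "MyFlatFile", a ReadWrite variable User::CreateTableSql,
// and a reference to Microsoft.SqlServer.DTSRuntimeWrap.
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using Wrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;

public void Main()
{
    ConnectionManager cm = Dts.Connections["MyFlatFile"];
    var ff = (Wrapper.IDTSConnectionManagerFlatFile100)cm.InnerObject;

    var ddl = new StringBuilder("CREATE TABLE [MyTable] (");
    foreach (Wrapper.IDTSConnectionManagerFlatFileColumn100 col in ff.Columns)
    {
        string name = ((Wrapper.IDTSName100)col).Name; // the column name lives on IDTSName100
        // col.DataType is an SSIS DataType (DT_STR, DT_I4, ...); it still needs
        // mapping to a SQL Server type before the DDL is usable.
        ddl.AppendFormat("[{0}] /* {1}({2}) */, ", name, col.DataType, col.ColumnWidth);
    }
    Dts.Variables["User::CreateTableSql"].Value = ddl.ToString().TrimEnd(',', ' ') + ")";
    Dts.TaskResult = (int)ScriptResults.Success;
}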
We have a Main package which calls two other packages. The first package contains a connection, and we are using a Data Flow task. The Data Flow task has an OLE DB source which gets its columns from a stored procedure, and the output needs to be written to a flat file.
The second package contains the same things (the same tasks, database, and stored procedure call). The difference is in the stored procedure parameters: based on the parameters, the stored procedure returns different columns and rows. When we try to get the second package's output in the OLE DB source, it shows all the columns from the first package's output, because it stores external metadata.
So my understanding is that the connection to the same database keeps the external metadata information with the connection, and because of that the OLE DB source in the second package always gets the same output columns.
How do I get the correct output from the second package in this case? Or, if we don't want to store external metadata with the connection, is that possible? If yes, how?
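To illustrate the setup (all names invented; the real procedure is more involved), the procedure is shaped roughly like this — one parameter, two different column sets:

-- Invented names: one procedure whose result shape depends on its parameter.
CREATE PROCEDURE dbo.GetReportData @Mode int
AS
BEGIN
    IF @Mode = 1
        SELECT CustomerId, CustomerName FROM dbo.Customers;   -- the shape the first package sees
    ELSE
        SELECT OrderId, OrderDate, Amount FROM dbo.Orders;    -- the shape the second package needs
END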
I'm having an issue running the clustering algorithm in the data mining view of Visual Studio. The databases connect properly and the data subsequently loads. However, upon clicking on the "Mining Model Viewer" tab, I receive the following error message:
Errors in the metadata manager. The 'DM sample ~MC' cube has no measure groups. Errors in the metadata manager. An error occurred when loading the 'DM sample ~MC' cube, from the file, '\\?\C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\newDM_sample.0.db\DM sample ~MC.2.cub.xml'.
We are not using a data cube, so I am assuming that this file is being called by the clustering algorithm. Furthermore, I have run the same process on different systems successfully. The only difference I can detect is that this error occurred on a 64-bit system.
Hello, I'm getting the following error when I use Enterprise Manager to view a database in taskpad mode:
Internet Explorer Script Error: An error has occurred in the script on this page. Line: 307 Char: 2 Error: Unspecified Error Code: 0 URL: res://C:\Program%20Files\Microsoft%20SQL%20Server\80\Tools\Binn\Resources\1033\sqlmmc.rll/Tabs.html
If anyone has seen this before and knows the cause and a fix, your help would be greatly appreciated. If it helps, the O/S is Win2K, SQL Server 2000 Standard Edition with IE 5.5. Thanks and cheers.
I have a package that uses a For Loop to iterate through an unknown number of Excel files and pull their data into a table. However, there will be cases when a file is corrupted or has some other problem, so that either the transformation will fail or the Excel source will fail.
I have it so that on each iteration, if the transform was successful, the file is moved to an archive directory, and if it fails, the file is moved to a different directory.
But I don't want the package to be marked as failed. For the control flow tasks I have set the individual components to FailPackageOnFailure = False, and for the Data Flow tasks I have set ValidateExternalMetadata = False.
It's no use setting MaximumErrorCount higher, because I can't guarantee how many files will be processed or how many might fail.
Could anyone suggest a clean way to trap these errors? Specifically, the "Cannot acquire connection from Connection Manager" error from the Excel connection.
I am frequently getting errors in Report Manager. When I open the log files, I can see some old log files from 2013. So now, how do I enable that logging feature again?
I.e., did they enable logging on the report server before, and has it since been disabled?
For some reason, when I try to delete old DB connections from the connection manager, it causes compilation errors because the references are not removed from the metadata. I get Object ID errors where it is still trying to find the connection after it has been deleted. I have tried going into the code-behind, searching for the invalid reference and deleting it, but that still does not work. I have connections for two totally different databases that are no longer used in the SSIS package, and I would like to get rid of them.
As I was developing my SSIS package, I created several variables and tasks (FTP, WMI Reader Task). I am now cleaning up, deleting unwanted variables and connections in the design window. I save and build the package, and when I load it, I get warnings that these variables are referenced but can't be found, and errors that the WMI connection is not found.
When a package calls a sub-package, it stores the absolute path of the child package in its dtsx XML file, in a ConnectionString property. How annoying! When I deploy this to another machine with a different file structure, it becomes a problem. Why can't it store the path relative to the parent package, which would typically be in a sub-directory under the parent?
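(What I would like is to be able to drive the child package's file connection from an expression on its ConnectionString property, something like the line below — User::PackageFolder is an invented variable that would hold the parent's folder:)

@[User::PackageFolder] + "\\ChildPackage.dtsx"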
These last 2 days have been nothing but frustration and my deadline is slipping. Any help is appreciated.
Purpose: I need to import Excel source data into SQL Server 2005 tables. The Excel source data comes in multiple Excel files with the same structure but different data. I would appreciate someone taking a look at the following information and letting me know what I am doing incorrectly.
I inserted a Foreach Loop container, a Data Flow task located inside the Foreach Loop container, and Excel and SQL Server 2005 connections.
After trying multiple times, I went to the following URL and followed the step-by-step directions on how to connect Excel workbooks dynamically: http://msdn2.microsoft.com/en-us/library/ms345182.aspx . I also used http://www.sqlstrings.com/ as a reference when creating the connection string.
Creating a Foreach Loop Container:
1. Opened the Foreach Loop container. 2. Set the enumerator to "Foreach File Enumerator" and configured it by setting the directory location and file base name to E:\Clients\Dep Comm\BEA\BEA_Test_Source and *PersonnelExpense*.xls respectively. 3. Clicked Variable Mappings; created two variables called "ExcelFile" and "ExtProperties", and closed out of the Foreach Loop container.
I. Created Excel Connection:
Created an Excel connection called "Dynamic Excel Connection Manager" that initially pointed to one of the Excel workbooks. Went to the connection properties by right-clicking the connection manager. Expanded Expressions and clicked the ellipsis button to bring up the property expressions. Chose ConnectionString in the Property column. Clicked the Expression ellipsis button. Put the following inside the Expression multi-line text box: A. "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFile] + ";Extended Properties=\"" + @[User::ExtProperties] + "\""
Clicked the Evaluate Expression button to get the following:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=;Extended Properties="" Clicked the OK button. Inserted a Data Flow task inside the Foreach Loop container.
II. Configured the tasks associated with the Dynamic Excel Connection Manager and the package:
Set the Foreach Loop container's DelayValidation to True. Set the Data Flow task's DelayValidation to True. Set the Dynamic Excel Connection Manager's DelayValidation to True. Set the SQL Server connection manager's DelayValidation to True. Set the package's DelayValidation to True. Set the package LocaleID to English.
Ran the package after connecting the Excel source to the OLE DB destination, and have inserted part of the error in this post. Please see below.
Error: 0xC0202009 at Package, Connection manager "Dynamic Excel Connection Manager": An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not find installable ISAM.".
I modified the connection string after receiving the error by removing the extended properties. The following is the modified connection string: "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFile]
I repeated step I.6 above and received the following expression: Provider=Microsoft.Jet.OLEDB.4.0;Data Source=
I ran the package and received the following error in part: OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Unrecognized database format 'E:\Clients\Dep Comm\BEA\BEA_Test_Source\PersonnelExpense_OCCs_051007.xls'."
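For reference, my understanding (and it is only my understanding) is that for .xls workbooks the expression needs the Excel provider string back in Extended Properties, with the embedded quotes escaped, and with the ExtProperties variable holding something like Excel 8.0;HDR=YES:

"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFile] + ";Extended Properties=\"" + @[User::ExtProperties] + "\""

With the variables populated, that should evaluate to something like:

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=E:\Clients\Dep Comm\BEA\BEA_Test_Source\PersonnelExpense_OCCs_051007.xls;Extended Properties="Excel 8.0;HDR=YES"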
I did not find anything helpful when I searched for the above errors, and would very much appreciate anyone's assistance, as this issue needs to be taken care of ASAP.
Does anyone have any ideas as to why I received this error and what I can do to resolve this issue?
Your assistance in this matter is truly appreciated! Thanks!! Lee
I am in the process of importing an Excel spreadsheet using the native Excel connection manager in SSIS 2008. There is one column, however, that is causing me grief.
SSIS has chosen DT_WSTR(255) for the problematic column by default. I have gone into the Excel connection manager's Advanced Editor and altered it to DT_WSTR(1000) (see image 1 attached).
I am still getting truncation issues though, as per image 2 attached. Any idea why this is?
We are going insane trying to start Report Manager on a SQL Reporting Services 2000 installation. The programmer/admin who originally set this up for us is gone.
We recently upgraded a database/application server to a new server, causing the data source being used by reports in reporting services to no longer be valid. Unfortunately, we do not have access to the original report project to 're-deploy' with the corrected data source.
We desperately need to get the installed reports to retrieve their data from the new database location/machine. We understand this can be done by specifying a new data source in Report Manager. (To clarify, we have NOT moved the report server installation or database; these remain in place. It's just the deployed reports that no longer point to the correct data source.)
For some reason, our Report Manager will no longer run. When we try to launch it, we get a Windows login dialog, and nothing works there. We've tried both local and domain admins and constantly get 401.3 ACCESS IS DENIED error messages saying we do not have permission (problems with ACLs).
We've gone so far as to allow EVERYONE read/write access to the ReportManager and ReportServer folders and the RS virtual directories, but nothing seems to help.
Can anyone help with this? Ideas on how to change our data source, or how to get back into Report Manager?
Since we are somewhat technical, but not experts, and don't have much more time to waste, we are willing to pay $500 for this project to someone willing to access the server and fix the problem so that the reports point to the correct database and restore access to Report Manager.
I recently updated the data type of a sproc parameter from bit to tinyint. When I executed the sproc with the updated parameter, it appeared to succeed and returned "1 row(s) affected" in the console. However, the update triggered by the sproc did not actually work.
The table column was a bit, which only allows 0 or 1, and the sproc was passing a value of 2, so the table was rejecting this value. However, the sproc did not return an error and appeared to report success. So is there a way to configure the database or sproc to return an error message when this type of error occurs?
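To illustrate the kind of guard I am after (names invented; a minimal sketch, not my actual proc): validate the parameter up front and raise an error instead of reporting success:

-- Invented names; minimal sketch of the validation I am after.
CREATE PROCEDURE dbo.SetFlag
    @Id int,
    @Flag tinyint
AS
BEGIN
    IF @Flag NOT IN (0, 1)
    BEGIN
        -- surface an error instead of letting the value be silently rejected
        RAISERROR('Value %d is out of range for a bit column.', 16, 1, @Flag);
        RETURN;
    END
    UPDATE dbo.MyTable SET FlagCol = @Flag WHERE Id = @Id;
END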
I have a parent package that calls child packages inside a For Each container. When I debug/run the parent package (from VS), I get the following error message: Warning: The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
It appears to be failing while executing the child package. However, the logs (via the "Progress" tab) for both the parent package and the child package show no errors other than the one listed above (and that shows in the parent package log). The child package appears to validate completely without error (all components are green and there are no error messages in the log). I turned on SSIS logging to a text file and see nothing in there either.
If I bump the MaximumErrorCount in the parent package, and in the Execute Package Task that calls the child package, up to 4 (one above the error count indicated in the message above), the whole thing executes successfully. I don't want to leave the MaximumErrorCount set like this. Is there something I am missing? For example, are there errors that do not get logged by default? I get some warnings; do a certain number of warnings equal an error?
Starwin writes: "When I execute DBCC CHECKDB and DBCC CHECKCATALOG, I received the following errors. How do I solve them?
Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -2093955965, index ID 711, page ID (3:2530). The PageId in the page header = (34443:343146507). . . . . . . . .
CHECKDB found 0 allocation errors and 1 consistency errors in table '(Object ID -1635188736)' (object ID -1635188736). CHECKDB found 0 allocation errors and 1 consistency errors in table '(Object ID -1600811521)' (object ID -1600811521).
. . . . . . . .
Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -8748568, index ID 50307, page ID (3:2497). The PageId in the page header = (26707:762626875). Server: Msg 8909, Level 16, State 1, Line 1 Table error: Object ID -7615284, index ID 35836, page ID (3:2534). The PageId in the page heade"
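(For context, the usual advice for consistency errors like these is to restore from a clean backup if at all possible; failing that, the sequence below is the commonly cited sketch — database name invented, and note that REPAIR_ALLOW_DATA_LOSS does exactly what its name says:)

ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
DBCC CHECKDB ('MyDb', REPAIR_ALLOW_DATA_LOSS)
GO
ALTER DATABASE MyDb SET MULTI_USER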
A user was created with limited privileges under the USERS group. Once this user logged in to Report Manager, he acted like an Admin and Content Manager, though he had not been given even a Browser role.
Why do you think this guy is acting like a super user, even though he is restricted to a Browser role in Report Manager?
On one of our machines, all of the SQL Server 2000 components except for the main Server component (the SQL Server core) itself were installed (management tools, etc.) a while ago, and everything was running fine. Now I go and add/install the Server component and then Service Pack 3a. It seems that Service Manager won't start up (I get an hourglass cursor), and now I find that Enterprise Manager won't run as well. No error messages appeared, and I don't think I saw anything unusual in the log file. However, I can use Enterprise Manager on a different machine and connect to the database (so the database itself seems to be running). Any suggestions as to what the problem might be and how to fix it? I'd like to see if I can repair this without having to do a reinstall. Thanks. PF
If someone can assist me: every time I load up Enterprise Manager, the Service Manager turns off, and Enterprise Manager can't connect to the local database. But every time I turn it back on and try to connect again, it shuts off, and around and around we go. Help would be appreciated. Thanks.
I recently added a column with a getdate() default to an existing table. When doing a query from that server, everything works fine. When a query is run from a remote server, I get an SQLOLEDB error message saying 'inconsistent metadata'. I've tried dropping the remote server and reconnecting, but that didn't seem to resolve the problem. Can anyone tell me how to resolve this error? I believe the error number is 7353.
Hi, is it possible to get metadata (i.e. descriptions of tables etc.) in SQL Server? In Oracle you can retrieve this information with tables like all_objects, user_tables, user_views etc. For example, this query selects the owner of the table 'ret_ods_test' (in Oracle!):

select owner
from all_objects
where object_name = 'ret_ods_test'

What's the equivalent in SQL Server? Thanks a lot.
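(For comparison, a minimal sketch of the SQL Server side using the standard INFORMATION_SCHEMA views, where the schema plays the role of Oracle's owner:)

select table_schema as owner, table_name
from information_schema.tables
where table_name = 'ret_ods_test'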
Is there a way to find out which user-defined procs, child packages, etc. are called by SSIS packages, using some metadata? The idea is to have a document which lists the packages called, and which sprocs and child packages are executed by those packages.
I have checked the SSIS metadata whitepaper, but that is too generic.
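(One low-tech angle, since a .dtsx file is plain XML: scan the packages for task types. A sketch in C# — the DTS namespace URI and attribute names are from memory for SSIS 2005/2008, so verify them against one of your own packages:)

// Sketch: list the task types declared in a .dtsx package file.
using System;
using System.Xml;

class ListTasks
{
    static void Main(string[] args)
    {
        var doc = new XmlDocument();
        doc.Load(args[0]); // path to a .dtsx file
        var ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts");
        foreach (XmlNode node in doc.SelectNodes("//DTS:Executable", ns))
        {
            XmlAttribute type = node.Attributes["DTS:ExecutableType"];
            if (type != null)
                Console.WriteLine(type.Value); // e.g. Execute Package or Execute SQL tasks
        }
    }
}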
[OLE DB Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "objectName1" needs to be updated in the external metadata column collection.
A corollary question: what does right-clicking a package in Solution Explorer and clicking "Reload with Upgrade" do?
- We are using SSIS packages for various kinds of data loads from Excel sources. - If there is any change in the data types or format of the Excel file, the package cries about a metadata mismatch. - At design time, if you accept the metadata changes, everything works fine.
But in our case we have deployed the packages to the production server, and now the Excel file format/data has changed. The packages are expecting different metadata, so they are not working at all.
Do you have any suggestions for the above problem? Thanks, Vijay.
I am using the IColumnsRowset interface to get some metadata about the columns in a rowset, but I never get the unique and primary key columns in the columns rowset; they always return null. When using SqlCeDataAdapter I simply add the AddWithKey option to the adapter to determine them, but I don't know how to do it using the OLE DB interfaces. I have tried to set the DBCOLUMN_KEYCOLUMN flag to true on CCommand<CDynamicAccessor, CRowset>, but it seems to reject that, generating an unknown error; the error object says almost nothing except the text 'errors occurred[,,,,,]'.
Can someone tell me how I can retrieve a columns rowset with the unique and primary key sections filled in?
Hello, I would like to know if it's possible to automatically generate a Word or Excel document that contains all the metadata definitions, for example the source column names, their data types, and the destination columns with their data types, so that it would be easy to create a data dictionary.
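(For the destination side, a minimal sketch of the kind of query that could feed such a document, using the standard INFORMATION_SCHEMA views; the Word/Excel generation itself would sit on top of this:)

select table_schema, table_name, column_name, data_type,
       character_maximum_length, is_nullable
from information_schema.columns
order by table_schema, table_name, ordinal_position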
It's now been quite some time that one particular behaviour of SSIS has really been frustrating me, and I would like to know if I'm the only one experiencing this problem or if other people have it too. The issue I'm talking about is SSIS's dependency on what is written in the XML files describing the flows, particularly the data types of columns. Let me explain: imagine you are developing a flow containing several numeric(18,0) columns, and during the flow you have to perform a lookup on an integer field. Of course this operation is not allowed, as a numeric is not mappable to an integer. (This is, in my opinion, nonsense, as an implicit conversion ought to be possible.)

As a result of this behaviour, I decide to change the data type (numeric) in my source query to an integer and use it in the Lookup, which of course succeeds, but now I have a second problem: each lookup in my flow has an error-handling branch which I join back using a Union transform. And there we have the second irritation: the Union transform doesn't replicate the data type changes that occurred upstream in the flow. Worse: it has no interface to let you modify the data types, like the advanced editor of some transforms or data sources. (I've just lost a complete data flow while trying to modify it manually in the XML file directly :-( For those who are considering modifying the XML directly, don't!! You are asking for trouble and a lot of frustration when you switch back to the designer to see the effects.)

My question is now: am I misusing SSIS? Is there an option somewhere to activate in order to get this behaviour fixed? Has anyone else experienced this problem? How are you solving it? Are there any plans in the future to lose this dependency on the data types, or at least add some implicit conversions?
Thanks in advance for your replies, suggestions, questions and other thoughts about this subject :-)
I have a question regarding my metadata information. I finally set up my fixed-width file, which took some time. Is there a way that I can back up my metadata so I won't have to recreate these settings again? I'm thinking the format of the file is stored in the metadata, so if I have a user running the SSIS package from the Business Intelligence Studio they won't reset all of my columns. Is there a file I can restore or back up if this should happen?
I'm running the following query in SQL Server 2005: select name from master..syslogins; It is being executed from within a stored procedure.
For user 'sa', I get the complete list of logins. For a user (say 'user1') with NO sysadmin privilege, I get only two names: 'sa' and 'user1'.
Is there a way for me to retrieve the complete list even for 'user1', without making any changes to his profile (or making only very MINIMAL changes)?
I don't want to give this user the sysadmin role. I know 'GRANT VIEW ANY DEFINITION TO public' works, but I don't want to do that either.
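(The only narrower approach I am aware of is signing the stored procedure with a certificate and granting VIEW ANY DEFINITION to a login created from that certificate, so the permission travels with the proc rather than the user. A sketch — all names and the password invented, untested, and assuming the proc lives in master; if it lives elsewhere, the certificate has to be copied into that database before signing:)

USE master;
CREATE CERTIFICATE LoginListCert
    ENCRYPTION BY PASSWORD = 'Str0ngP@ssw0rd!'
    WITH SUBJECT = 'Sign proc that lists logins';
CREATE LOGIN LoginListCertLogin FROM CERTIFICATE LoginListCert;
GRANT VIEW ANY DEFINITION TO LoginListCertLogin;
ADD SIGNATURE TO dbo.MyListLoginsProc
    BY CERTIFICATE LoginListCert WITH PASSWORD = 'Str0ngP@ssw0rd!';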
I wanted to see if anyone has explored posting a report KB based on metadata in the RS portal. If so, have you found a way to post the information on the RS portal?
What I am really looking for is to add additional properties to a report, other than author and description. I would like to add 2 to 3 more fields that would feed over to the Catalog table on the report server, and then write a few reports that allow for definitions, metadata, and links to the reports.
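(For reference, the existing properties can be read straight out of the ReportServer database; the Catalog table is undocumented, so the column names below are from memory and may differ by version:)

select Path, Name, Type, Description
from ReportServer.dbo.Catalog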
Does anyone have any ideas for something like this?