Is there a way to find out which user-defined sprocs, child packages, etc. are being called by SSIS packages using some metadata? The idea is to have a document that lists the number of packages called, and which sprocs and child packages are executed by those packages.
I have checked the SSIS metadata whitepaper but that is too generic.
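A rough sketch of one way to get at this, assuming the packages are saved to the msdb store on SQL Server 2005 (the double CAST of packagedata and the LIKE patterns below are illustrative guesses, not a supported metadata API; file-system packages would need their .dtsx XML loaded some other way):

    -- Search SSIS 2005 packages stored in msdb for child-package and stored-procedure references.
    SELECT pkg.package_name
    FROM (
        SELECT p.[name] AS package_name,
               CAST(CAST(CAST(p.packagedata AS VARBINARY(MAX)) AS XML) AS NVARCHAR(MAX)) AS package_text
        FROM msdb.dbo.sysdtspackages90 AS p
    ) AS pkg
    WHERE pkg.package_text LIKE '%ExecutePackageTask%'   -- Execute Package Task = child package call
       OR pkg.package_text LIKE '%EXEC %';               -- crude hint of stored procedure calls in SQL tasks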
Can anyone tell me which metadata model Microsoft supports with SSIS 2005? DTS in SQL Server 2000 supported OIM, but I have not been able to find any information on this topic for SSIS 2005.
The reason I need to know is that I want to import some metadata from Business Objects Data Integrator into SSIS 2005, and Business Objects supports CWM.
Does anyone know whether a simple SSIS package (moving data from a source table to a target table) or task can be called repeatedly using a variable that obtains its value, one at a time, from a metadata table containing only table names? Basically, I would like to pass a table name variable to the SSIS package or task to start the ETL for different tables. Thanks a lot!
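As a sketch of the table-driven pattern being asked about (the table, column, and variable names here are made up): an Execute SQL task can load a small metadata table into an object variable, a Foreach Loop (ADO enumerator) can iterate it, and each iteration assigns the current table name to a string variable used in expressions for the source and destination.

    -- Hypothetical driver table: one row per table to process.
    CREATE TABLE dbo.ETL_TableList
    (
        TableName SYSNAME NOT NULL PRIMARY KEY
    );

    INSERT INTO dbo.ETL_TableList (TableName) VALUES ('CustomerStaging');
    INSERT INTO dbo.ETL_TableList (TableName) VALUES ('OrderStaging');

    -- Query for the Execute SQL task (full result set) feeding the Foreach Loop,
    -- which maps each row to a string variable such as User::CurrentTable.
    SELECT TableName FROM dbo.ETL_TableList ORDER BY TableName;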
I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.
I need to loop through each file in SSIS with a Foreach Loop container, get the file name (say abc1), and load the data from abc1.xml into the abc1 table in the SQL Server DB.
The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.
On each iteration, the XML Source should also point to the corresponding XSD file.
The tables are already created in the DB.
I have solved the problem up to getting the file name from each iteration and assigning it to a variable; in the OLE DB Destination I set the data access mode to 'Table name or view name variable', so the corresponding table is selected for the data insert.
I just want to know how I can read the corresponding XSD file for each XML data file during the iteration.
I have a scenario where I need to create SQL Server tables dynamically.
I have multiple XML data files in a particular location and want to load the XML data into SQL Server tables, but the metadata of the XML data files is not the same.
Hence the approach is:
1. Pick the first file from that location.
2. Create a table according to that XML data file's metadata.
3. Load the data into the newly created table.
4. Pick up the next XML data file.
5. Loop through until no XML data files remain in that location.
I recently added a column to an existing table with a getdate() default. When running a query from that server everything works fine. When a query is run from a remote server, I get an SQLOLEDB error message saying 'inconsistent metadata'. I've tried dropping the remote server and reconnecting, but that didn't seem to resolve the problem. Can anyone tell me how to resolve this error? I believe the error number is 7353.
Hi, is it possible to get metadata (i.e. descriptions of tables etc.) in SQL Server? In Oracle you can retrieve this information with tables like all_objects, user_tables, user_views etc. For example, this query selects the owner of the table 'ret_ods_test' (in Oracle!):

select owner
from all_objects
where object_name = 'ret_ods_test'

What's the equivalent in SQL Server? Thanks a lot.
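For what it's worth, a sketch of the SQL Server side (the table name is just the one from the question): the SQL Server 2005 catalog views, or the INFORMATION_SCHEMA views on 2000 and 2005, expose similar information.

    -- SQL Server 2005 catalog views: schema/owner of a given table.
    SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
    FROM sys.objects AS o
    JOIN sys.schemas AS s ON s.schema_id = o.schema_id
    WHERE o.name = 'ret_ods_test';

    -- ANSI-style alternative that also works on SQL Server 2000.
    SELECT TABLE_SCHEMA, TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_NAME = 'ret_ods_test';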
[OLE DB Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "objectName1" needs to be updated in the external metadata column collection.
A corollary question: what does right-clicking a package in Solution Explorer and clicking "Reload with Upgrade" do?
- We are using SSIS packages for various kinds of data loads from Excel sources.
- If there is any change in the data type or format of the Excel file, the package complains about a metadata mismatch.
- At design time, if you accept the metadata changes, everything works fine.
But in our case we have deployed the packages to the production server, and now the Excel file format/data has changed. The packages are expecting different metadata, so they are not working at all.
Do you have any suggestions for the above problem? Thanks, Vijay.
I am using the IColumnsRowset interface to get some metadata about the columns in a rowset, but I never get the unique and primary key columns in the columns rowset; they always return NULL. When using SqlCeDataAdapter I simply add the AddWithKey option to the adapter to determine this, but I don't know how to do it using the OLE DB interfaces. I have tried setting the DBCOLUMN_KEYCOLUMN flag to true on CCommand<CDynamicAccessor, CRowset>, but it seems to reject it, generating an unknown error; the error object says almost nothing except the text 'errors occurred [,,,,,]'.
Can someone tell me how I can retrieve the columns rowset with the unique and primary key columns filled in?
Hello, I would like to know if it's possible to automatically generate a Word or Excel document containing all the metadata definitions, for example the source column names and their data types, and the destination columns with their data types, so that it would be easy to create a data dictionary.
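As a starting point (only a sketch, not a ready-made Word/Excel export), the column-level metadata for the source or destination database can be pulled from INFORMATION_SCHEMA and then imported into Excel or mail-merged into Word:

    -- Column-level metadata suitable for a simple data dictionary.
    SELECT TABLE_SCHEMA,
           TABLE_NAME,
           COLUMN_NAME,
           DATA_TYPE,
           CHARACTER_MAXIMUM_LENGTH,
           NUMERIC_PRECISION,
           NUMERIC_SCALE,
           IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;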
It's now quite some time that one particular behaviour of SSIS has really been frustrating me, and I would like to know if I'm the only one experiencing this problem or if other people have it too. The issue I'm talking about is SSIS's dependency on what is written in the XML files describing the flows, particularly the data types of columns.

Let me explain: imagine you are developing a flow containing several numeric(18,0) columns. During the flow you have to perform a lookup on an integer field. Of course this operation is not allowed, as a numeric is not mappable to an integer (this is, in my opinion, nonsense, as an implicit conversion ought to be possible). As a result of this behaviour, I decide to change the data type (numeric) in my source query to an integer and use it in the Lookup, which of course succeeds, but now I have a second problem: each lookup in my flow has an error-handling branch which I join back using a Union All transform. And there we have the second irritation: the Union All transform doesn't replicate the data type changes that occurred upstream in the flow. Worse, it has no interface to let you modify the data types, unlike the Advanced Editor of some transforms or data sources. (I've just lost a complete data flow while trying to modify it manually in the XML file directly :-( For those who are considering editing the XML directly: don't!! You are asking for trouble and a lot of frustration when you switch back to the designer to see the effects.)

My questions are: Am I misusing SSIS? Is there an option somewhere to activate in order to get this behaviour fixed? Has anyone else experienced this problem? How are you solving it? Are there any plans in the future to lose this dependency on the data types, or at least add some implicit conversions?
Thanks in advance for your replies, suggestions, questions and other thoughts about this subject :-)
I have a question regarding my metadata information. I finally set up my fixed-width file, which took some time. Is there a way that I can back up my metadata so I won't have to recreate these settings again? I'm thinking the format of the file is stored in the metadata, so if I have a user running the SSIS package from Business Intelligence Development Studio they won't reset all of my columns. Is there a file I can restore or back up should this happen?
I'm running the following query in SQL Server 2005: select name from master..syslogins; It is being executed from within a stored procedure.
For user 'sa' I get the complete list of logins. For a user (say 'user1') with no sysadmin privilege, I get only two names: 'sa' and 'user1'.
Is there a way for me to retrieve the complete list of logins even for 'user1' without making any changes to his profile (or making only very minimal changes to it)?
I don't want to give the sysadmin role to this user. I know GRANT VIEW ANY DEFINITION TO public works, but I don't want to do that either.
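One minimal-change option worth testing (a sketch only; verify the security implications before using it): SQL Server 2005 also allows VIEW DEFINITION to be granted on individual logins rather than server-wide, which widens what 'user1' can see in syslogins without granting to public or handing out sysadmin.

    -- Grant visibility on specific logins only; [SomeOtherLogin] is a placeholder, repeat per login.
    GRANT VIEW DEFINITION ON LOGIN::[SomeOtherLogin] TO [user1];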
I wanted to see if anyone has explored posting a Report KB based on metadata in the RS portal? If so, have you found a way to post the information on the RS portal?
I really am looking to add additional properties to a report, other than author and description. I would like to add two or three more fields that would feed over to the Catalog table on the report server, then write a few reports that allow for definitions, metadata, and links to the reports.
Does anyone have any ideas for something like this?
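A hedged starting point for the reporting side: the ReportServer catalog table already carries a few of these fields, so a custom 'report KB' report could join against it; the Type = 2 filter (reports) and the column names should be checked against your ReportServer version, and any extra fields would still have to live in a table you add yourself.

    -- Sketch: list deployed reports from the ReportServer catalog (verify the Type codes for your version).
    SELECT c.[Path],
           c.[Name],
           c.Description,
           c.CreationDate,
           c.ModifiedDate
    FROM ReportServer.dbo.[Catalog] AS c
    WHERE c.[Type] = 2              -- 2 = report in SSRS 2005
    ORDER BY c.[Path];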
Hi, I'm posting a new question about metadata refresh, even though I have seen other threads that deal with a similar problem.
I have an SSIS package that imports a huge XML file (500 MB). These are the main steps:
1) Generate an XSD file from the XML using the xsd.exe utility.
2) Using an XML Task, diff the old XSD against the new one.
3) If there are no differences, I start the Data Flow task that imports the XML into SQL Server; otherwise I stop all the tasks, edit the Data Flow task, change the XSD reference in the Advanced Editor, and then do many "double click / OK" steps on every single flow.
The underlying idea is that the XML file changes because some columns are added, but these columns are not relevant for my processing, so I can ignore the new columns and work without mapping them.
What I'm looking for is a way to perform, via SSIS, the "double click / OK" steps; in other words, to update the metadata.
Could anybody suggest a way? Something like a macro or keyboard recorder... I'm trying to study XML package configurations; is this a good approach?
Another way would be to give the end user the task of updating the metadata; to do that I would need to open the package editor (Visual Studio!) in a more comfortable environment. For example, is it possible to edit the SSIS package in MS Access? I probably already know the answer...
I'm trying to create a Data Dictionary view from system table info like table name, field name, data type, etc. I can find all that I need except for the "description" field, which is displayed in the Enterprise Manager / Repository metadata pane.
How can I locate this field so I can reference it in a view? Isn't it stored in a system table? Master, model? Where is it?
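If this is the SQL Server 2000 'Description' column shown in Enterprise Manager, it is stored as the MS_Description extended property rather than in a regular system-table column; a sketch of reading it (dbo.MyTable is a placeholder):

    -- SQL Server 2000/2005: column descriptions stored as the MS_Description extended property.
    SELECT objname AS column_name, value AS description
    FROM ::fn_listextendedproperty('MS_Description',
                                   'user',  'dbo',
                                   'table', 'MyTable',
                                   'column', default);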
Hey there, I'm wondering if there is a way to determine which views in my database use the ORDER BY clause. The reason I need this is that we need to migrate over to MS SQL 2005, where the ORDER BY statements are ignored within the views themselves. Now (in MSSQL 2005) you need to explicitly state the ORDER BY when calling a view, i.e. select * from [viewname] order by column x, y desc, z instead of i.e. select * from [viewname], where the view already had the applicable sorting done within it. If those ORDER BY statements are ignored, some production software which relies on the ordered data will break. Please let me know if there's a way to query the actual database and determine which views have ORDER BY statements in them. Thanks.
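A hedged way to scan the view definitions before migrating (note that syscomments splits long definitions into 4000-character chunks, so this is a rough filter rather than a real parse):

    -- Find views whose definition text contains ORDER BY (works on SQL Server 2000 and 2005).
    SELECT DISTINCT o.name AS view_name
    FROM sysobjects AS o
    JOIN syscomments AS c ON c.id = o.id
    WHERE o.xtype = 'V'
      AND c.[text] LIKE '%order by%';

    -- SQL Server 2005 alternative using the newer catalog views.
    SELECT v.name AS view_name
    FROM sys.views AS v
    JOIN sys.sql_modules AS m ON m.object_id = v.object_id
    WHERE m.definition LIKE '%order by%';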
Did some searching and didn't seem to find what I'm looking for. I'm pretty new to SQL Server (most of my experience is on DB2 for z/OS). I'm building some new tables, and want to find a way to add comments to the metadata for a column. In DB2 the syntax is:

COMMENT ON COLUMN TB_CREATOR.TB_NAME.COLUMN_NAME IS 'comments here';

or

COMMENT ON TB_CREATOR.TB_NAME (COLUMN1 IS 'comment here', COLUMN2 IS 'comment here', ...);

Is there anything like this in SQL Server? Thanks!
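The closest SQL Server equivalent I know of is extended properties; by convention the MS_Description property plays the role of the DB2 comment (the dbo/TB_NAME/COLUMN_NAME identifiers below just follow the names in the question):

    -- Add a comment to a column as the MS_Description extended property (SQL Server 2005 syntax).
    EXEC sys.sp_addextendedproperty
         @name       = N'MS_Description',
         @value      = N'comments here',
         @level0type = N'SCHEMA', @level0name = N'dbo',
         @level1type = N'TABLE',  @level1name = N'TB_NAME',
         @level2type = N'COLUMN', @level2name = N'COLUMN_NAME';

    -- Read it back.
    SELECT objname, value
    FROM fn_listextendedproperty(N'MS_Description',
                                 N'SCHEMA', N'dbo',
                                 N'TABLE',  N'TB_NAME',
                                 N'COLUMN', N'COLUMN_NAME');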
I'm new to DBA'ing, so go easy on me. I'm looking after a brand new installation of SQL Server 2005. I've created a simple DB with one table, but when I go to MS Access (as a newly created user) and create a linked table to the SQL database, all the metadata is visible and I can't stop it.
I've tried going through securables, but it's still visible no matter how much I try to deny. From what I've read this shouldn't be possible, as hiding metadata is supposed to be standard. Can anyone throw any light on this?
Be careful when implementing views (in SQL Server 7.0/2000). SQL Server stores the metadata for the view at creation time (or the last time it was saved). This means if you have: SELECT * FROM table1, it will put all the fields of table1 in the view's metadata. If you then change table1 and add (for example) another field, this field will not be visible in the view until you open it in design view and click save (to update it).
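A hedged note on the fix: for non-schema-bound views, sp_refreshview updates the stored metadata without opening the view in the designer (the view name below is a placeholder):

    -- Refresh a view's metadata after the underlying table has changed.
    EXEC sp_refreshview 'dbo.MyView';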
I had a previous thread on a metadata KB but didn't get a response. So I was wondering if anyone has added a web-based help file app to SSRS? Something that you could use to assign additional metadata to each report. Something that would be searchable.
At this point SSRS doesn't provide enough fields for metadata. I am looking to assign something like:
1. Owner/requestor of the report
2. Fields in the report
3. Description (other than the one provided)
4. etc.
Does anyone have any ideas for an add-on, utility, or piece of software?
I've tried looking in sys.syscolumns and sys.syscomments, but I can't seem to find where the description information is retained for a field in the system tables -- any hints?
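Assuming the 'Description' in question is the MS_Description extended property (which is where the table designers store it), it lives in sys.extended_properties rather than in syscolumns or syscomments; a sketch:

    -- Column descriptions stored as extended properties (SQL Server 2005).
    SELECT t.name   AS table_name,
           c.name   AS column_name,
           ep.value AS description
    FROM sys.extended_properties AS ep
    JOIN sys.tables  AS t ON t.object_id = ep.major_id
    JOIN sys.columns AS c ON c.object_id = ep.major_id
                         AND c.column_id = ep.minor_id
    WHERE ep.class = 1                  -- object/column scope
      AND ep.name  = 'MS_Description';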
When using the Unpivot transformation, what exactly does this error denote?
"Incorrect UnPivot metadata. In an UnPivot transform, all input columns with a PivotKeyValue that is set, and are pointing to the same DestinationColumn, must have metadata that exactly matches "
Is it possible to capture runtime metadata relating to the number of rows processed by a data flow, the number of rows inserted/updated in the target, and so on, using SSIS?
I have only seen metadata extensions for start time, machine name, etc., but not for this.
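A common workaround (only a sketch, assuming you are willing to add your own audit table; the table, column, and variable names are made up): a Row Count transformation writes the total into a package variable during the data flow, and an Execute SQL task logs it afterwards.

    -- Hypothetical audit table populated by an Execute SQL task after each data flow.
    CREATE TABLE dbo.PackageRunAudit
    (
        AuditID       INT IDENTITY(1, 1) PRIMARY KEY,
        PackageName   NVARCHAR(260) NOT NULL,
        RowsProcessed INT           NOT NULL,
        RowsInserted  INT           NULL,
        RunStart      DATETIME      NOT NULL,
        LoggedAt      DATETIME      NOT NULL DEFAULT GETDATE()
    );

    -- Parameterized statement for the Execute SQL task; the ? markers map to package variables
    -- such as System::PackageName, User::RowsProcessed, User::RowsInserted, System::StartTime.
    INSERT INTO dbo.PackageRunAudit (PackageName, RowsProcessed, RowsInserted, RunStart)
    VALUES (?, ?, ?, ?);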
Does anyone know how to obtain the physical server name that a SQL failover cluster instance is running on through the system tables or other database commands? Thanks in advance.
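For SQL Server 2005, SERVERPROPERTY exposes the physical node the instance is currently running on, and sys.dm_os_cluster_nodes lists the possible nodes; a quick sketch:

    -- Virtual (instance) name vs. the physical node currently hosting the clustered instance.
    SELECT SERVERPROPERTY('MachineName')                 AS virtual_server_name,
           SERVERPROPERTY('ComputerNamePhysicalNetBIOS') AS physical_node_name,
           SERVERPROPERTY('IsClustered')                 AS is_clustered;

    -- All nodes in the failover cluster.
    SELECT NodeName FROM sys.dm_os_cluster_nodes;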
Usually, the XML Source will refresh its metadata when I don't want it to, and will cause me to repair all downstream metadata when it does. This time, I actually want it to refresh, as the schema changes I've made are significant.
Yet, I don't see any "Refresh" button in the XML Source UI. Even clicking "Browse" and browsing to the schema doesn't force a refresh. I've had to change the schema name (it has to be to an existing schema!) then change it back in order to force a refresh.
Is there a "Right" way to do this?
And is there any way to "right-size" the amount of metadata disruption this causes? I was pleased to see that updating a database schema limits the SQL Server Destination to updating only the changed metadata. Yet, any change at all to the XML Schema or XML file will cause all of the XML Source metadata to be invalidated, a process that takes several minutes at 100% cpu usage!