SQL 2012 :: SSIS - Updating Source And Target Data Structure
Feb 17, 2015
I have an SSIS package that simply moves data from a SQL database A to another SQL database B. I have updated (increased) the size of an nvarchar column on both A and B. I am wondering if there is a way to "refresh" the SSIS package somehow so I don't have to rebuild and redeploy it. The error I get now is a truncation error: "Text was truncated or one or more characters had no match in the target code page".
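For reference, a quick way to confirm that both databases really have the new length (a minimal sketch; the table and column names are placeholders):

    -- run against both database A and database B and compare the results
    SELECT TABLE_NAME, COLUMN_NAME, CHARACTER_MAXIMUM_LENGTH
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'MyTable'      -- placeholder
      AND COLUMN_NAME = 'MyColumn';   -- placeholder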
Can anyone help me out in getting information on the execution progress of a package, like "number of records migrated", "which component is executing at present", etc., when we migrate the data using a package which we have created programmatically and are trying to execute programmatically? We can see this information in the "progress tab" when we execute the package using BIDS in SSIS.
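If the package runs through the SSIS 2012 catalog, the SSISDB views expose exactly this kind of progress information; a minimal sketch, assuming the package was started through the catalog:

    -- currently running executions (status 2 = running)
    SELECT execution_id, package_name, status, start_time
    FROM SSISDB.catalog.executions
    WHERE status = 2;

    -- rows moved per data flow path for one execution
    SELECT source_component_name, destination_component_name, rows_sent, created_time
    FROM SSISDB.catalog.execution_data_statistics
    WHERE execution_id = @execution_id;  -- the execution_id of the run you are watching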
Hi, I am less of a technical person and more of an analyst, and right now I am investigating various tools/options for the new conversion project I will be leading at an insurance client. One of the tools the client wants to use is SSIS, but the source and target databases are not on SQL Server; the plan is to build a staging SQL Server database for transformation. Does SSIS support this kind of ETL process, where both the source and target systems are non-SQL Server?
I need to use a newly installed SSIS component inside an SSIS 2012 project, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding a data source/data destination in the Choose Toolbox Items pane.
I have all the config files for my various packages in a folder called configs. It seems that I cannot add folders to the SSIS solution. I can add config files, but they get copied into the root folder, which breaks the directory structure that I have.
Am I missing something? Is there a recommended way to manage config files, and other external files used by the SSIS package, in SourceSafe and in the SSIS solution? Thank you.
I am a complete newbie to SSIS. I can create a simple package to transfer data between SQL instances, and that's about it.
I have tableA (source data) and tableB (destination data). TableA has 4 columns and tableB has 5. I want to transfer all of the columns from tableA into tableB, but the 5th column in tableB needs to be populated with the ServerInstance name of the server tableA sits on. Do I need to have multiple data sources to achieve this? I have tried, but no matter how I set it up, the column in the destination is set to ignore.
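One approach that avoids a second data source, assuming the OLE DB Source can use a SQL command (the column names are placeholders): add the instance name in the source query itself, so it arrives as a fifth column ready to map to the destination.

    SELECT col1, col2, col3, col4,
           @@SERVERNAME AS SourceInstance   -- the instance tableA sits on
    FROM dbo.tableA;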
I'm encountering a very peculiar situation when trying to compare source and target data using a Conditional Split. The following is the Data Flow and how I'm trying to achieve this.
Source Data:
Col_A (PK)   Col_B
1            100
8            500

Target Data:
Col_A (PK)   Col_B
1            100
3            700
8            500

Look up the Target on Col_A to check for existing records. Now we have four columns in the Lookup match output: Col_A, Col_B, Lkp_Col_A (Target Col), Lkp_Col_B (Target Col).
Conditional Split: Compare Col_B with Lkp_Col_B
Update the target if there is any change in the existing value of Col_B. When I'm running the package for every record in the source, the conditional split fails, and even when there is no change in Col_B, some of the records (not all, and quite randomly) get updated with the same value. If I run the package for a few records, it works absolutely fine.
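For clarity, the set-based equivalent of the split's "changed" branch is sketched below (generic Source/Target names). Note that NULLs have to be handled explicitly, which is a frequent cause of rows landing in the wrong branch of a Conditional Split:

    SELECT s.Col_A, s.Col_B, t.Col_B AS Lkp_Col_B
    FROM Source AS s
    JOIN Target AS t ON t.Col_A = s.Col_A
    WHERE s.Col_B <> t.Col_B
       OR (s.Col_B IS NULL AND t.Col_B IS NOT NULL)
       OR (s.Col_B IS NOT NULL AND t.Col_B IS NULL);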
I've got quite a strange problem with a mining structure for an OLAP data source, though. The problem is: I am not able to edit the mining structure in the mining structure editor. The whole data source view within the mining structure editor is greyed out. Could anyone here please give me any advice on that?
Thank you very much in advance for any help for that.
I am able to use a custom script task to receive an MSMQ package and save the package contents to a flat file.
I can also use the bulk load task to push the flat file contents into a SQL table.
However, I would like to save the package contents to a variable (done, it works), and then pass that string variable to a data flow task for SQL upload. In other words, I don't see any reason to persist the MSMQ package contents to disk.
My question is: which data flow source can I use that will accept a string variable? The string variable will then need to be processed with bulk load or an Execute SQL task.
By the way, the content of the string variable is a CSV-style string:
Hello all, I ran into a problem using the "SQL data source updating" method. I'm using a FormView with about 20 parameters that are bound to controls in the FormView and 5 other parameters whose values I set in the "SQL data source updating" method. Everything updates fine except for those last 5 parameters. My stored procedure (SP) updates and inserts into multiple tables. For some UNKNOWN reason, the tables that use the parameters set in the "SQL data source updating" method are inserting multiple records when only 1 record should be inserted. Has anyone had this problem?
I have a SQL data source that is filtered by a date range. The results are then presented in a GridView. What I am hoping to do is add a button that will update all the filtered rows in the SQL data source. Is this possible? The data source is shown below:

    <asp:SqlDataSource ID="SqlDataSource3" runat="server"
        ConnectionString="<%$ ConnectionStrings:MYSTR%>"
        SelectCommand="SELECT * FROM [dbo_cheques] WHERE (([chq_banked] >= @chq_banked) AND ([chq_banked] <= @chq_banked2))"
        UpdateCommand="UPDATE dbo_cheques SET chq_printed = @chq_printed WHERE (chq_id = @chq_id)">
        <SelectParameters>
            <asp:ControlParameter ControlID="Calendar1" Name="chq_banked" PropertyName="SelectedDate" Type="DateTime" />
            <asp:ControlParameter ControlID="Calendar2" Name="chq_banked2" PropertyName="SelectedDate" Type="DateTime" />
        </SelectParameters>
        <UpdateParameters>
            <asp:Parameter Name="chq_printed" />
            <asp:Parameter Name="chq_id" Type="Int32" />
        </UpdateParameters>
    </asp:SqlDataSource>
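One way to do the button's work in a single statement, sketched under the assumption that chq_printed is just a flag to set: issue one UPDATE that reuses the same date filter as the SelectCommand.

    UPDATE dbo_cheques
    SET chq_printed = 1              -- assumed flag value; adjust as needed
    WHERE chq_banked >= @chq_banked
      AND chq_banked <= @chq_banked2;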
Hi all, I have a GridView that is bound to a SqlDataSource called SqlDataSourceGridView. I have enabled Edit in my GridView, but I do all the updating in code-behind with a stored procedure (using OnRowUpdating). Now each time I click "Update", the update goes through and all the data gets updated, but I receive this error: Updating is not supported by data source 'SqlDataSourceGridView' unless UpdateCommand is specified.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.NotSupportedException: Updating is not supported by data source 'SqlDataSourceGridView' unless UpdateCommand is specified. What do I need to do to fix this problem? Thanks a lot.
Updating a recordset contained in a System.Object variable at runtime.
I am trying to execute multiple file actions (plus parsing those files into a set of staging tables) at separate locations in parallel. I know I can do this in C# but I have a business requirement to use SSIS for all ETL operations.
Any one site can have 0 to many files of 1 to 3 file types. I would like to run multiple sites at the same time, so that when all files of all types are completed at one site, it goes on to the next site in the list. I know I can do a single site at a time in a Foreach Loop, but if I can run, let's say, 3-5 sites concurrently, then I should be able to save execution time.
My thought is to have a recordset of the sites; when any 1 of the 3 (or more) "control flows" is open, update the recordset to mark that site as being actioned, and when that site is complete, update the recordset to mark the site as completed, and so on. Or am I running in the wrong direction?
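A set-based sketch of that bookkeeping, assuming a SiteQueue control table created for the purpose: each parallel branch atomically claims the next unclaimed site, so two branches can never grab the same one.

    -- claim the next pending site (table and column names are assumptions)
    UPDATE TOP (1) dbo.SiteQueue
    SET Status = 'InProgress'
    OUTPUT inserted.SiteId
    WHERE Status = 'Pending';

    -- when the branch has finished every file for the site
    UPDATE dbo.SiteQueue
    SET Status = 'Completed'
    WHERE SiteId = @SiteId;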
I am working on a task where I need to load data into the DWH from OLTP. The load needs to be an incremental load. For this I am planning to use CDC to track changes and only load data that has new inserts/updates/deletes. But to load the data to the destination, there are a couple of joins which I need to do. Is there a way in CDC where I can use a query as the source?
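The CDC table-valued functions can be wrapped in an ordinary query, so joins are fine. A sketch, where the capture instance dbo_Orders and the joined Customer table are assumptions:

    DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
    DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

    -- net changes joined to a lookup table before loading the destination
    SELECT c.__$operation, c.OrderId, c.Amount, cust.CustomerName
    FROM cdc.fn_cdc_get_net_changes_dbo_Orders(@from_lsn, @to_lsn, 'all') AS c
    JOIN dbo.Customer AS cust ON cust.CustomerId = c.CustomerId;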
In this situation, do I need a proxy or forwarder at both ends to prevent connection issues? Are there plans to handle this in future SSSB upgrades? Thanks.
I need to update the ilocationid from Table 1 to all Table 2 records related to Table 1, but there is no direct relation from Table 1 to Table 2. I needed Table 3 to make the connection from Table 1 to Table 2.
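An UPDATE joined through the bridge table would look something like this (a sketch; the join columns are placeholders since the actual keys weren't given):

    UPDATE t2
    SET t2.ilocationid = t1.ilocationid
    FROM Table2 AS t2
    JOIN Table3 AS t3 ON t3.Table2Key = t2.Table2Key   -- placeholder keys
    JOIN Table1 AS t1 ON t1.Table1Key = t3.Table1Key;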
I need to create a bulk upload utility using ASP.NET and SQL Server. Below is the process for the uploads:
There is an Excel template wherein the user will enter the details, and a tab-delimited output file will be generated from it using VBA. There are 2 tables: one is a temp table, which is a replica of the final table, and the second is the final table. Using File.OpenText(filePath).ReadLine(), all the rows from the tab-delimited data file will be inserted into a DataTable.
Using SqlBulkCopy, the data from the tab-delimited file will be inserted into the temp table.
Data will be validated based on what was inserted into the temp table. If the data has errors, the temp table will be cleared; otherwise the data will be inserted from the temp table into the final table.
My issue is that in both tables there is a column (Name: PeopleKey (int, primary key)). If the user enters an alphabetic value, the bulk utility fails. Below are the two options in my mind:
1. I can change the data type in the temp table from INT to VARCHAR, so the data can be inserted at first and then I can validate and get the data corrected. But I am not sure whether it is the right way to fix the issue, as the source and target tables' columns would then differ.
2. The data is inserted into the DataTable in step 3 above, so once it is in the DataTable I can validate it there. Thus the source and target tables' data types will be the same.
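If option 1 is taken, SQL Server 2012's TRY_CONVERT makes the validation step straightforward; a sketch against assumed table and column names:

    -- rows whose PeopleKey is not a valid integer (TempPeople is an assumed name)
    SELECT *
    FROM dbo.TempPeople
    WHERE TRY_CONVERT(INT, PeopleKey) IS NULL;

    -- if no bad rows were found, move the data across
    INSERT INTO dbo.FinalPeople (PeopleKey, Col1, Col2)   -- columns assumed
    SELECT CONVERT(INT, PeopleKey), Col1, Col2
    FROM dbo.TempPeople;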
Passing a date parameter to a stored procedure through an OLE DB Source (data flow task): we have a multipurpose stored proc in which one of the many expected parameters is a date. It keeps giving me the following error: Description: "Operand type clash: int is incompatible with date", although there is no integer.
One way to fix it would be to change the data type from date to datetime in the proc, but I can't do that since it is a multi-purpose proc. I didn't find anything good on the net.
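One commonly suggested workaround, sketched with placeholder names: declare a local DATE variable in the source's SQL command and convert the incoming parameter explicitly, so the provider never has to guess the parameter's type.

    -- SQL command text of the OLE DB Source; ? maps to the package parameter
    DECLARE @RunDate DATE;
    SET @RunDate = CONVERT(DATE, ?);
    EXEC dbo.usp_MultiPurpose @ReportDate = @RunDate;  -- proc and parameter names are placeholders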
I have a DB deployed in the field at numerous customer sites. I have made changes to the DB in my dev environment and now I need a way to update the DBs out in the field to the new DB structure. Examples of updates are adding a new column to a table, or adding some more code to a stored proc.
So what do I do? I need to be able to email something to the customers that they can run to make my recent updates to their DB structure. It can't screw up their existing data either.
So what's the solution? A big SQL script with ALTER TABLE commands? How do I update stored procs through a script? I seriously need help here. What is the accepted industry-standard method of updating deployed DBs when a new version is released? Please help.
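One widely used approach is exactly that: an idempotent upgrade script that guards every change, so it is safe to run more than once and never touches existing data. A sketch (all object names are placeholders):

    -- add a column only if it is missing
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID('dbo.Customer') AND name = 'MiddleName')
        ALTER TABLE dbo.Customer ADD MiddleName NVARCHAR(50) NULL;
    GO

    -- re-create the proc with the new code
    IF OBJECT_ID('dbo.usp_GetCustomer', 'P') IS NOT NULL
        DROP PROCEDURE dbo.usp_GetCustomer;
    GO
    CREATE PROCEDURE dbo.usp_GetCustomer @Id INT
    AS
        SELECT * FROM dbo.Customer WHERE CustomerId = @Id;
    GO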
What could be simpler: map a flat file record structure, extract the data, and populate essentially the same flat file record structure in an Oracle table. Let the fun begin.
Specifically: the flat file record structure is fixed-length, 196 bytes. A particular field consists of 4 bytes of integer data; Integration Services deals very nicely with the definition, and there does not appear to be any issue with that. The issue is trying to get the 4 bytes of integer to map and load into the Oracle table. The data type in the flat file definition is DT_UI4. The data type in the Oracle target is DT_NUMERIC. One would think that perhaps a simple transform and voila?! I've defined the transform, but it does not seem to matter: whatever I try yields the same results.
I've tried many different source/target data type definitions, but all yield the same results.
Execution Results from debug:
Everything validates and then...
[kcd [8671]] Error: Data conversion failed. The data conversion for column "load_time_min" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
[kcd [8671]] Error: The "output column "load_time_min" (11050)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "load_time_min" (11050)" specifies failure on error. An error occurred on the specified object of the specified component.
Hi, I am copying records in a table. The source table and the target table are the same. I need the value of the id field from both the source and the target rows. Is there a way to do this with one query?
I tried the following, but it doesn't seem to work:
    INSERT tableOne (value1, value2, value3)
    OUTPUT source.id, inserted.id
    SELECT value1, value2, value3
    FROM tableOne AS source
    WHERE ID = @number
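That fails because the OUTPUT clause of an INSERT can only reference the inserted row, not the source. A MERGE can reference both; a sketch of the same copy written that way:

    MERGE tableOne AS target
    USING (SELECT id, value1, value2, value3
           FROM tableOne WHERE ID = @number) AS source
    ON 1 = 0                              -- never matches, so every source row inserts
    WHEN NOT MATCHED THEN
        INSERT (value1, value2, value3)
        VALUES (source.value1, source.value2, source.value3)
    OUTPUT source.id AS source_id, inserted.id AS new_id;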
Was wondering if there is a best-practice minimum set of permissions for a SQL login to use when setting up a new shared data source for SSRS Report Manager?
Something along the lines of it being a data reader for the DB, plus permission to update tempdb?
I would have thought it not advisable for the login to be able to update the main DB...
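A minimal setup along those lines might look like this (the login name and database are placeholders). Note that tempdb access generally doesn't need to be granted explicitly, since every login can create temporary objects there:

    CREATE LOGIN ssrs_reader WITH PASSWORD = 'placeholder-password';
    GO
    USE ReportSourceDb;   -- placeholder database
    CREATE USER ssrs_reader FOR LOGIN ssrs_reader;
    ALTER ROLE db_datareader ADD MEMBER ssrs_reader;   -- read-only access
    GO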
Can anyone give me ideas on how I can update a table with new data while keeping the old data, i.e. just append the new data, in an SSIS job? I've been trying to use the OLE DB Command transformation, but I can't seem to get it working, and since I have about 10,000,000 rows that method would be slow anyway.
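For a plain append, a set-based INSERT ... SELECT (or, inside SSIS, an OLE DB Destination with fast load instead of a row-by-row OLE DB Command) avoids the per-row overhead; a sketch with placeholder names:

    INSERT INTO dbo.TargetTable (Col1, Col2, Col3)   -- placeholder names
    SELECT Col1, Col2, Col3
    FROM dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Col1 = s.Col1);
    -- the NOT EXISTS guard is optional if duplicates are acceptable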
I am using DTS and VBScript in DataPump tasks in order to transfer large amounts of data from text files to an SQL database.
As the database uses a normalized schema, there is often the case of inserting multiple records in a destination table from various fields of the same record of the source text file.
For example, the source record contains information about goods sold (date, customer, item code, item name and total amount) for a maximum of 3 goods per sale (row), and therefore has the structure:
I have tried using a DataPump task and VBScript, and I guess it has to do with the DTSTransformStat_**** constants, but none of those I used seems to work.
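For what it's worth, the same normalization can be expressed set-side once the raw rows are staged. A sketch that assumes the repeating groups are named item1_code/item1_name and so on, since the actual structure wasn't shown:

    SELECT sale_date, customer, item1_code AS item_code, item1_name AS item_name
    FROM staging.Sales WHERE item1_code IS NOT NULL
    UNION ALL
    SELECT sale_date, customer, item2_code, item2_name
    FROM staging.Sales WHERE item2_code IS NOT NULL
    UNION ALL
    SELECT sale_date, customer, item3_code, item3_name
    FROM staging.Sales WHERE item3_code IS NOT NULL;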
I am new to SSIS. Does anyone know how to verify the number of records that I load from a CSV file into a SQL database table?
For example, the source file is called product.csv and the target table, in a database named DSS, is PRODUCT. I load data from the flat file to the table, and then I need verification: if the counts between source and target do not match, send an e-mail to me.
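One common shape for this: capture the file count with a Row Count transformation into a package variable, then compare it in an Execute SQL Task. A sketch, where the variable mapping and the mail profile are assumptions:

    -- ? maps to the package variable filled by the Row Count transformation
    DECLARE @FileCount INT, @TableCount INT;
    SET @FileCount = ?;
    SELECT @TableCount = COUNT(*) FROM DSS.dbo.PRODUCT;
    IF @FileCount <> @TableCount
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'default',          -- assumed Database Mail profile
            @recipients = 'me@example.com',     -- placeholder address
            @subject = 'PRODUCT load count mismatch';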
Hi. I have a question that I have been thinking about for some time. Keep in mind that I'm somewhat of a SQL newbie. I work on a few web applications that are database-driven. When we decide to change something in the database, we normally make the changes on our replica development database, and after some testing the changes are supposed to be implemented in the live, public database. Now, what is the best way, or the recommended way, to do this? Today we do it manually, but at one time I experimented with the export/import wizard to see if it was possible. What I want to do is change the design of a table without losing the original data, and without getting the data from the development database into the live database. To export a stored procedure, I usually just create a SQL script for the procedures in question and then run the script against the public DB using the Query Analyzer. But if I do the same for a table, I lose all the data in the table. Do I make any sense at all? Any tips/ideas/best practices on this subject? /Johan Christensson
I came across the following interview question. I provided an answer like the one below, from my experience. I don't know if it is correct or needs supplementing. Thank you for your help.
Question: How do you identify existing data for updating in SSIS?
Answer: If you have the same key columns, such as a primary key or business key, you just use them to identify existing records from the data source in the destination.
If you use different key columns between the data source and destination, you can create a permanent link table which stores the business keys for both the data source and the destination, and you can compare records against the linking table when you update data.
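In set-based terms, the identification step boils down to a join on the business key; a sketch with placeholder tables (in a package this is typically a Lookup transformation instead):

    -- source rows that already exist in the destination: candidates for UPDATE
    SELECT s.*
    FROM staging.Customer AS s
    JOIN dbo.DimCustomer AS d ON d.BusinessKey = s.BusinessKey;

    -- source rows with no match: candidates for INSERT
    SELECT s.*
    FROM staging.Customer AS s
    LEFT JOIN dbo.DimCustomer AS d ON d.BusinessKey = s.BusinessKey
    WHERE d.BusinessKey IS NULL;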
I am not able to see the Data Sources window in Visual Studio 2012. I have tried several things, like 'Alt + Shift + D', which is meant to bring up the Data Sources tab, but it does not. I have also looked in View -> Other Windows, but it's not displayed there.
I have SSIS solution which has multiple packages and one Main package which starts all others.
I also have 2 Data Sources which I use on each package.
I need to do my testing on different Data Sources. When I try to change a Data Source from one database to another, SSIS gets confused and fails on the connections, or starts to insert into the previous database.
Could you please explain the steps for doing this? What is the problem here? Can I use Windows Authentication in the data source, or should I use a special account?
There is also some confusion here with the config file. Should I create a config file just for the Main package, or for each package? For now I am working with the source code, but later I would need to change the connection string before running the executable. If I change the string in my Main package, would it carry through to all the others?
Hi, I've just installed SQL Server and then IIS and the SQL CE tools. I created a virtual directory and was trying to update the NTFS permissions from SQL Server Connectivity Management when I got the following error: Access Control List (structure) invalid. Has anyone come across this, and if so, what did you do to fix it? Thanks, Lyn
Please, this is an urgent call; maybe someone has had the same problem I do:
When I'm trying to import an XML file that contains a diffgram, using the XML Source task in SSIS, I choose the Inline Schema option and everything goes well... the tables and columns are displayed, and the import task into a database table succeeds.
The problem is that it doesn't load any data into the table, though there is plenty.