Xml Data Transformation Into Single Data Row
May 22, 2008
I'm attempting to put data from an XML file (formatted as if it contains different tables) into a database table as a single data row. I've tried the 'Union All' and 'Merge' transformations, but these insert the data as multiple rows. Is this possible?
View 8 Replies
Jun 3, 2000
I have a very complex (for me, anyway) data transformation problem.
I've been given a flat-file of physician data from another system which must be automated for entry into the SQL server on a regular basis.
This was no problem until we discovered that several fields (all of which we wanted to use) had multiple pieces of information in them, separated by semicolons.
Well, this didn't seem to be too big a problem, so I wrote a DTS ActiveX script to handle it. This is what I originally wrote:
'*******************
Function Transform()
    'Declare variables
    Dim strOffice
    Dim strOfficeNew
    Dim cChar
    Dim x
    Dim y
    Dim z

    'Scrub values into new rows
    strOffice = DTSSource("Col050")
    x = 1
    y = Len(strOffice)
    z = 1
    While x <= y
        cChar = Mid(strOffice, x, 1)
        If cChar <> ";" Then
            strOfficeNew = strOfficeNew & cChar
        Else
            DTSDestination("Phys_No") = DTSSource("Col001")
            DTSDestination("Addr_No") = z
            DTSDestination("Addr_Office") = strOfficeNew
            strOfficeNew = ""
            z = z + 1
        End If
        x = x + 1
    Wend

    'Insert final record after last semicolon
    If strOffice <> "" Then
        DTSDestination("Phys_No") = DTSSource("Col001")
        DTSDestination("Addr_No") = z
        DTSDestination("Addr_Office") = strOfficeNew
    End If

    Transform = DTSTransformStat_OK
End Function
'*********************
This, of course, didn't work. What I got was only the last piece of the parsed data, which for the first record was the second address in the field.
I searched around and found the following script, which is supposed to produce multiple rows from a single source row, but I can't seem to merge the two and still get the data out cleanly.
'**********
Dim nCounter
nCounter = 4

Function Main()
    If nCounter > 0 Then
        Main = DTSTransformStat_SkipFetch
        DTSDestination("PatientNumber") = DTSSource("PatientNumber")
        Select Case nCounter
            Case 1
                DTSDestination("PhysicianType") = "Admitting"
                DTSDestination("PhysicianId") = DTSSource("AdmittingPhysician")
            Case 2
                DTSDestination("PhysicianType") = "Attending"
                DTSDestination("PhysicianId") = DTSSource("AttendingPhysician")
            Case 3
                DTSDestination("PhysicianType") = "Referring"
                DTSDestination("PhysicianId") = DTSSource("ReferringPhysician")
            Case 4
                DTSDestination("PhysicianType") = "Consulting"
                DTSDestination("PhysicianId") = DTSSource("ConsultingPhysician")
        End Select
        nCounter = nCounter - 1
    Else
        nCounter = 4
        Main = DTSTransformStat_SkipInsert
    End If
End Function
'**************
I'm not a VBScript expert, so there's probably something very simple that I'm missing here... if someone could point it out, I'd greatly appreciate it.
Jaysen
View 4 Replies
View Related
Jun 11, 2007
My vendor requires data to be sent in Excel format. Some of my tables have more than 65,536 rows, so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000, and I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing to use DTS to export from SQL to Excel 2007? Thanks in advance.
View 3 Replies
View Related
Nov 17, 2006
Hi,
I am trying to set up transactional replication with an updating subscriber on SQL 2000. One column on a few tables contains data with single quotes (').
How do I handle this case? Has anyone come across such a case?
Can I change the default QUOTED IDENTIFIER from ' (single quote) to something else (@@@) on SQL 2000?
If yes, how do I do that?
Thanks
mka
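For what it's worth, a single quote in the data does not by itself require changing the QUOTED IDENTIFIER delimiter; in T-SQL a quote inside a string literal is escaped by doubling it. A minimal sketch of that general rule (the table and column names are made up, and whether this resolves the replication-specific side is a separate question):
-- Sketch only: a doubled quote stores a literal single quote.
INSERT INTO dbo.Customers (CustomerName) VALUES ('O''Brien')   -- stored as O'Brien
-- When the value is concatenated into dynamic SQL, double any embedded quotes first.
DECLARE @name varchar(50), @sql nvarchar(500)
SET @name = 'O''Brien'
SET @sql = N'SELECT * FROM dbo.Customers WHERE CustomerName = ''' + REPLACE(@name, '''', '''''') + ''''
EXEC (@sql)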
View 1 Replies
View Related
Nov 25, 2015
I have a view that gives me the data for all the batches. Now I am using a query on the view to get data for a single batch. When I run the query directly it takes 0 seconds, but when I go through the view with "select * from myView where batched=2" it takes 30 minutes.
View 3 Replies
View Related
May 12, 2006
I am getting the following error running a data flow that splits the input data into multiple streams and writes the results of each stream to the same destination table:
"This operation conflicts with another pending operation on this transaction. The operation failed."
The flow starts with a single source table with one row per student and multiple scores for that student. It does a few lookups and then splits the stream (using Multicast) in several layers, ultimately generating 25 destinations (one for each score to be recorded), all going to the same table (like a fact table). This all is running under a transaction at the package level, which is distributed to a separate machine.
Apparently, I cannot have all of these streams inserting data into the same table at one time. I don't understand why not. In an OLTP system, many transactions are inserting records into the same table at once. Why can't I do that within the same transaction?
I suppose I can use a UnionAll to join them back together before writing to a single destination, but that seems like an unnecessary waste and clutters the flow. Can anyone offer a different solution or a reason why this fails in the first place?
Thanks in advance.
View 3 Replies
View Related
Aug 4, 2006
Hi,
I am pretty new to SSIS. I am trying to create a package which can accept data in any of several formats (i.e. CSV, Excel, or a SQL Server database/table) and import the data into my destination database.
So far I've managed to get this working OK. However, I am now totally stuck. I'm currently trying to concentrate on just the data sources being a CSV file (using a Flat File Data Source) and/or an Excel spreadsheet.
I can get the data in and to my destination using a UNION ALL component and mapping the data sources to it, so long as both the CSV file and the Excel spreadsheet exist.
My problem is that I need my package to handle the possibility that only the CSV file exists and there is no Excel spreadsheet. In that case I'd like the package to ignore the Excel data source completely. Currently, if either of my data sources does not exist, I get errors and the package terminates.
Is there any way in SSIS that I can check all my data sources to see which ones exist (i.e. are valid)? If they exist I want to use them; if one doesn't exist I'd like to discard it (without error - as long as there is a single data source the package should run).
I've tried using the AcquireConnection method in a script task on each of my connections, hoping that it would error if the file/data source did not exist. It doesn't, though (in the case of an Excel data source it just creates an empty Excel file for me).
The only other option I can come up with is to have separate packages depending on the type of data we want to import, and then run a particular package depending on the format of the source data. This seems a bit long-winded. I am pretty sure I must be able to do what I want to achieve, but I can't work out how.
I'll be grateful to anyone who can send me any tips/hints/links on how I can achieve this.
Many thanks
Rob Gibson
View 5 Replies
View Related
Jan 9, 2007
Hi,
In my project I want a report. In that report, the data comes from more than one data source (system). While creating the data source view I used named queries for both the primary and secondary data sources. But at the time of creating the "Report Model" I am getting the error below.
An error occurred while executing a command.
Message: Invalid object name 'Table2'.
Command:
SELECT COUNT(*) FROM (SELECT SerialNum, ModelNum AS com_model
FROM Table2) t
Is there any way to create a report with multiple data sources?
View 8 Replies
View Related
Apr 4, 2006
How can I combine my data into a single row? All data are in a single table, sorted by employee number and date.
Code:
Employee No   Date         SALARY
1             10/30/2006   500
1             11/30/2006   1000
2             10/25/2006   800
3             10/26/2006   900
4             10/28/2006   1000
4             11/01/2006   8000
Should Appear
Code:
EmployeeNo   Date1        OLDSALARY   Date2        NEWSALARY
1            10/30/2006   500         11/30/2006   1000
2            10/25/2006   800
3            10/26/2006   900
4            10/28/2006   1000        11/01/2006   8000
Please help - I really need the right query for this output.
Thanks in advance.
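A minimal T-SQL sketch of one way to produce that shape, assuming at most two salary rows per employee and a table named dbo.Salaries(EmployeeNo, SalaryDate, Salary) (all names here are made up):
-- Sketch only: pair the earliest (old) and latest (new) salary row per employee.
SELECT  o.EmployeeNo,
        o.SalaryDate AS Date1,
        o.Salary     AS OldSalary,
        n.SalaryDate AS Date2,
        n.Salary     AS NewSalary
FROM    dbo.Salaries o
LEFT JOIN dbo.Salaries n
       ON  n.EmployeeNo = o.EmployeeNo
       AND n.SalaryDate > o.SalaryDate
WHERE   o.SalaryDate = (SELECT MIN(s.SalaryDate)
                        FROM dbo.Salaries s
                        WHERE s.EmployeeNo = o.EmployeeNo)
ORDER BY o.EmployeeNo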
View 3 Replies
View Related
Sep 14, 2000
We are transferring data between AS/400 and SQL Server 7.0 using DTS. Some of these transfers may need to be very close to real time. It doesn't seem like a continuously running job is the best solution for that.
Do you know any tools or utilities that can help us to move the data?
Thank you,
Anastasia.
View 2 Replies
View Related
Feb 19, 2003
I have something like this:
select * from accounts
name type amount
==== ==== ======
mary saving 123.00
mary chequing 246.00
mary investment 135.00
john saving 678.00
john chequing 987.00
john investment 0.00
What should I do to present the data in the following format?
name saving cheq investment
==== ====== ==== ==========
mary 123.00 246.00 135.00
john 678.00 987.00 0.00
Thanks.
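A minimal cross-tab sketch in T-SQL, assuming the accounts table and columns shown above:
-- Sketch only: pivot the account types into columns with CASE expressions.
SELECT  name,
        SUM(CASE WHEN type = 'saving'     THEN amount ELSE 0 END) AS saving,
        SUM(CASE WHEN type = 'chequing'   THEN amount ELSE 0 END) AS chequing,
        SUM(CASE WHEN type = 'investment' THEN amount ELSE 0 END) AS investment
FROM    accounts
GROUP BY name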
View 3 Replies
View Related
Jan 12, 2008
Hi, newbie here with a simple?(maybe)question.
I have an Access database that I have imported into SQL Server 2000, and that worked great, but now I have to get it into 2005. My question is: how can I get the tables and all the info in the tables into a SQL script so I can run that script on the 2005 server?
SQL 2000 is on my dev server and I have all the tools (Enterprise Manager, Query Analyzer, etc...), but the 2005 server is GoDaddy's and they only have the basic web interface. I can run SQL files and create databases and tables, but that's about it.
View 2 Replies
View Related
Jul 18, 2007
Hi
I was told that using DTS will allow me to schedule stored procedures to keep a SQL database up to date. For example, if a user registers but does not activate the registration, his details will be removed by a stored procedure which is scheduled to run every 24 hours. I used to use the global.asax file to fire an update by using a file containing the date of the last update; by adding 24 hours to it, it would execute an SP to delete unwanted data.
I have tried to install DTS with no success. I am running the following:
Visual Web Studio Express
SQL 2005 Express (from SQLExpr_exe), and I have told it to install all the extra components
Installed SQLEXPR_Toolkit.exe with all its options
Installed SQLServer2005_DTS.MSI
When I go into SQL Server using MS SQL Server Management Studio Express, I cannot see the Data Transformation Services node. I have also just installed server reports, which I had no problems installing.
Can somebody please help me.
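As an aside, once the scheduling itself is sorted out, the cleanup is a one-statement job. A minimal T-SQL sketch (the table and column names are made up, not from the original post):
-- Sketch only: remove registrations that were never activated within 24 hours.
-- dbo.Registrations, Activated and RegisteredOn are hypothetical names.
DELETE FROM dbo.Registrations
WHERE Activated = 0
  AND RegisteredOn < DATEADD(hour, -24, GETDATE())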
View 2 Replies
View Related
Jan 31, 2008
I have begun using SSIS and I am a little taken aback by the complexity of it especially since I just want to do a simple data transformation such as in DTS.
Are there any tutorials for data transformation for SSIS on the web/this forum and what if I want to do a simple transformation from Access to SQL Server?
View 1 Replies
View Related
Nov 8, 2000
I tried transforming data from one server to another using DTS, and then I got an error as below:
-----------------------------------------------------------------------------
Details: Error Source: Pump Data Step
Details: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.
Facility: 4, Severity 1,Code: 1176, HRESULT:0x80040498
Desc: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.
Source : Microsoft Data Transformation Services(DTS) Package
Source : Microsoft Data Transformation Services(DTS) Package
Code: 0x80040428
Description: Package failed because step'Pump data step' failed.
Error Message: IDispatch error #552
--------------------------------------------------------------------------
This is the full description of the error dialog...
Please suggest me some solutions....
Thanks in advance,
Kamalesh D
View 2 Replies
View Related
Jul 20, 2005
I'm running a DTS package on SQL Server. The source is MS Access and the target is Oracle. On a "Drop Table" command the process just hangs. There are no foreign keys on the table. Several tables have already been processed successfully by this time. I think I've ruled out corruption by dropping and recreating the target database on Oracle. Any ideas?
M Man
View 1 Replies
View Related
Mar 31, 2008
I have a lookup transformation that retrieves a key for a certain column of values, in this case, a name. So, I go in to the lookup table with a name and come out with its key. I had it working and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting the matches for the new names. But I am still getting the matches for the names that existed before I added the new ones.
I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?
View 4 Replies
View Related
Sep 6, 2006
I have a student table that needs some cleanup. My first task is to remove all the periods (.) from the middle name column. Some people have two-character middle initials with two periods. Can this be done via an SSIS package (I'm sure it can, but I don't know how)? I can't think of a simple update statement that would accomplish the same thing.
Any direction would be appreciated.
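If the change only has to happen once, a plain UPDATE with REPLACE may be enough. A sketch with made-up table and column names:
-- Sketch only: strip every period from the middle-name column.
-- dbo.Students and MiddleName are hypothetical names.
UPDATE dbo.Students
SET    MiddleName = REPLACE(MiddleName, '.', '')
WHERE  MiddleName LIKE '%.%'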
View 3 Replies
View Related
Oct 23, 2002
I am creating a DTS package for an import of data from MS Access 97 to SQL Server 2000. I am quite new to DTS, so bear with me please.
Everything was fairly simple until I got stumped by the following problem:
In the source database there is a table with multiple fields (let's say A, B, C), each record containing their values (let's say 1, 2, 3).
Now, in the destination database the table is built differently. It is a table containing the field definition and its data.
So basically I need to turn this
ID A B C
0 1 2 3
1 4 5 6
into this
ID FieldName FieldVal
0 A 1
0 B 2
0 C 3
1 A 4
1 B 5
1 C 6
I am not sure how to go about that... any thoughts? :confused:
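One way to express the reshaping in T-SQL (the table names below are placeholders; on SQL Server 2000 a UNION ALL does the job where UNPIVOT is not available):
-- Sketch only: turn columns A, B and C into FieldName/FieldVal rows.
-- dbo.SourceTable and dbo.DestTable are hypothetical names.
INSERT INTO dbo.DestTable (ID, FieldName, FieldVal)
SELECT ID, 'A', A FROM dbo.SourceTable
UNION ALL
SELECT ID, 'B', B FROM dbo.SourceTable
UNION ALL
SELECT ID, 'C', C FROM dbo.SourceTable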
View 5 Replies
View Related
Jun 8, 2006
I am developing a custom transfer component, where I am building a custom object and want it to be transferred from the ComponentUI to the component. I looked into this issue and found that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is I have not been able to make use of this interface. If anybody has faced the same issue and found the solution, please let me know.
Thanks in advance
Karun
View 1 Replies
View Related
Apr 10, 2007
Hi,
I am having a simple difficulty regarding my transformation component that I have created.
I followed all of the relevant documentation to create the component, UI and such, and when I run a package that transfers data through my component (without it doing anything to the data) to a flat file destination, all I get in the flat file is ,,,,,,,,,,
Meaning that the data is not actually being passed through my component to the destination, but it does recognize the correct number of columns. I get this warning for each column: [DTS.Pipeline] Warning: The output column "PurchaseOrderDetailID" (20) on output "OLE DB Source Output" (11) and component "Source - PurchaseOrderDetail" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
To fix this issue, all I need to do is "check" the boxes in the advanced editor for my component. However, I want to be able have these boxes "checked" automatically.
My question is, when you "check" a box next to a column name in the advanced editor, what does that exactly do that allows you to transfer the data? What do I need to program in order for it to replicate that? I actually want it to happen automatically, so by default, all data in all columns are transfered through my component out the other side, so I just need to know how to do this by code, and not how to replicate the UI.
My component, I thought, did that automatically, as I specified everything that I thought was required based on all of the documentation I read. Obviously, I am missing something. Here are the methods that I believe would all be involved.
Please let me know what I am missing.
int[] inputColumnBufferIndexes; // ...
int[] outputColumnBufferIndexes; // ... used in PreExecute
PipelineBuffer outputBuffer; // used in ProcessInput

public override void OnInputPathAttached(int inputID)
{
    IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
    IDTSVirtualInput90 vInput = input.GetVirtualInput();

    foreach (IDTSVirtualInputColumn90 vCol in vInput.VirtualInputColumnCollection)
    {
        IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
        outCol.Name = vCol.Name;
        outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage);
    }
}

public override void PreExecute()
{
    //base.PreExecute();
    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    IDTSOutput90 output = ComponentMetaData.OutputCollection[0];

    inputColumnBufferIndexes = new int[input.InputColumnCollection.Count];
    outputColumnBufferIndexes = new int[output.OutputColumnCollection.Count];

    for (int x = 0; x < input.InputColumnCollection.Count; x++)
    {
        IDTSInputColumn90 column = input.InputColumnCollection[x];
        inputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
    }

    for (int x = 0; x < output.OutputColumnCollection.Count; x++)
    {
        IDTSOutputColumn90 column = output.OutputColumnCollection[x];
        outputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(output.Buffer, column.LineageID);
    }
}

public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
    //base.PrimeOutput(outputs, outputIDs, buffers);
    if (buffers.Length != 0)
    {
        outputBuffer = buffers[0];
    }
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    //base.ProcessInput(inputID, buffer);
    if (!buffer.EndOfRowset)
    {
        IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
        while (buffer.NextRow())
        {
            // TODO: Examine the columns in the current row.
            // Add a row to the output buffer.
            outputBuffer.AddRow();
            for (int x = 0; x < inputColumnBufferIndexes.Length; x++)
            {
                // Copy the data from the input buffer column to the output buffer column.
                outputBuffer[outputColumnBufferIndexes[x]] = buffer[inputColumnBufferIndexes[x]];
            }
        }
    }
    else
    {
        // EndOfRowset on the input buffer is true.
        // Set EndOfRowset on the output buffer.
        outputBuffer.SetEndOfRowset();
    }
}
View 6 Replies
View Related
Dec 20, 2006
I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow task, but not in any data flow item.
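For reference, the kind of statement in question looks roughly like the sketch below (table, column, and parameter names are made up); whether the OLE DB Command editor accepts this form is exactly the open question here:
-- Sketch only: a CTE in front of a parameterized statement,
-- the general form one would paste into the SqlCommand property.
WITH RecentOrders AS
(
    SELECT CustomerID, MAX(OrderDate) AS LastOrderDate
    FROM dbo.Orders
    GROUP BY CustomerID
)
UPDATE c
SET    c.LastOrderDate = r.LastOrderDate
FROM   dbo.Customers c
JOIN   RecentOrders r ON r.CustomerID = c.CustomerID
WHERE  c.CustomerID = ?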
View 7 Replies
View Related
Aug 8, 2006
Does anyone know how to install or make available the Data Transformation Project Template in SQL Server 2005? I can not find it using Integration Services.
View 1 Replies
View Related
Mar 6, 2008
Hi all,
I've got to change values in my source database as follows:
Source   Target
X        1
Y        1
Z        2
Can I create a lookup table and use a Lookup task in SSIS to do this, or do I need to script it?
Thanks
F
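A lookup table for this can be tiny. A T-SQL sketch (all names are made up) of the mapping table, plus the join an SSIS Lookup would effectively perform if you scripted it instead:
-- Sketch only: a small mapping table for the Lookup transformation to reference.
CREATE TABLE dbo.CodeMap
(
    SourceValue char(1) NOT NULL PRIMARY KEY,
    TargetValue int NOT NULL
)
INSERT INTO dbo.CodeMap (SourceValue, TargetValue) VALUES ('X', 1)
INSERT INTO dbo.CodeMap (SourceValue, TargetValue) VALUES ('Y', 1)
INSERT INTO dbo.CodeMap (SourceValue, TargetValue) VALUES ('Z', 2)
-- Equivalent set-based translation (dbo.SourceTable and SourceColumn are hypothetical):
SELECT s.*, m.TargetValue
FROM   dbo.SourceTable s
JOIN   dbo.CodeMap m ON m.SourceValue = s.SourceColumn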
View 1 Replies
View Related
Jul 19, 2006
If I understand correctly, SSIS is kinda the "new DTS." And to use it, if I get the gist, I am supposed to open "SQL Server Business Intelligence Development Studio," select a new project, highlight the "Business Intelligence Projects" heading, and then select the "Data Transformation Project" template. But there is just one problem.
That template is missing. It isn't there.
What am I doing wrong?
View 1 Replies
View Related
Feb 29, 2000
Is there a way, for example, to script a DTS package so that it can be deleted and recreated at a later date if necessary? I have quite a lot of these, but few are used regularly. The msdb database is now up to 80 MB. However, I don't want to delete them and have no way to recreate them. If I took a backup of msdb and then deleted the packages, would restoring msdb at a later date restore the packages?
View 1 Replies
View Related
May 16, 2008
In SQL Server 2000 I'm using Data Transformation Services. What is the similar DTS tool in SQL Server 2005?
View 4 Replies
View Related
Feb 14, 2007
How can I comment out my script component? It is isolated; I just want to check something without losing the long script written inside.
Thanks,
Fahad
View 6 Replies
View Related
Nov 8, 2007
Hi,
I have one data flow. The source is SQL Server and the destination is a flat file destination. I have one derived column placed in between these two. This functionality works fine. I would like to sum one column's data and count the total number of columns, and put the results in global variables. How can I achieve this?
Thanks,
View 7 Replies
View Related
Sep 18, 2006
Hi,
I'm creating a custom data flow transformation in c#.
I would like to use expressions within this component in the same way as in the Derived Column component: specifying the expression as a custom property of an output column, then evaluating this expression for each row of the buffer and using the evaluated expression to populate my output column values.
So I've added a custom expression property on my output column and set its expression type to CPET_NOTIFY:
IDTSCustomProperty90 exp = col.CustomPropertyCollection.New();
exp.Name = "Expression";
exp.ExpressionType = DTSCustomPropertyExpressionType.CPET_NOTIFY;
But in the ProcessInput method I can't manage to get the evaluated expressions; when I use exp.Value I get my expression definition and not its evaluation.
Is there a way to get these evaluated expressions?
Thanks,
Stéphane
View 6 Replies
View Related
Oct 4, 2007
I am trying to load a SQL 2005 table which contains an XML data type column using SSIS. The source is a text data file which is driven by a record type (i.e. 01 = course; 02 = conference) that will be used to determine the format of the XML. Not being too familiar with XML, I am struggling with the proper way to generate the XML, which I can then output to a string column.
Do I use XmlTextWriter for this or some other method?
Is there a way to output the formatted XML as a string column from my Script Transformation? Should I be using a stream data column?
Are there any good examples that I can refer to? As you can probably tell, I am clueless about how to handle the XML. Can I even move the XML result to a string?
Any help would be appreciated.
Thanks,
J Dubo
View 3 Replies
View Related
May 31, 2007
Hi All,
I'm using a DTS package to transfer data from a txt file to a database (Bulk Insert).
The Bulk Insert task provides an efficient way to copy large amounts of data into a SQL Server table or view. It seems that the Bulk Insert task supports only OLE DB connections for the destination database, but I want to use SQL Server authentication, whereas the OLE DB connection requires Windows authentication.
So can the bulk insert be done using SQL Server authentication? If yes, then please help me.
I have given the code snippet below.
Code Sample:
Dim oPackage As New DTS.Package2()
Dim oConnection As DTS.Connection
Dim oStep As DTS.Step2
Dim oTask As DTS.Task
Dim oCustomTask As DTS.BulkInsertTask

Try
    oConnection = oPackage.Connections.New("SQLOLEDB")
    oStep = oPackage.Steps.New
    oTask = oPackage.Tasks.New("DTSBulkInsertTask")
    oCustomTask = oTask.CustomTask

    With oConnection
        oConnection.Catalog = "pubs"
        oConnection.DataSource = "(local)"
        oConnection.ID = 1
        oConnection.UseTrustedConnection = True
        oConnection.UserID = "Tony Patton"
        oConnection.Password = "Builder"
    End With

    oPackage.Connections.Add(oConnection)
    oConnection = Nothing

    With oStep
        .Name = "GenericPkgStep"
        .ExecuteInMainThread = True
    End With

    With oCustomTask
        .Name = "GenericPkgTask"
        .DataFile = "c:dtsauthors.txt"
        .ConnectionID = 1
        .DestinationTableName = "pubs..authors"
        .FieldTerminator = "|"
        .RowTerminator = vbCrLf  ' the original listing had a literal line break between the quotes
    End With

    oStep.TaskName = oCustomTask.Name

    With oPackage
        .Steps.Add(oStep)
        .Tasks.Add(oTask)
        .FailOnError = True
    End With

    oPackage.Execute()
Catch ex As Exception
    MsgBox("Error: " & CStr(Err.Number) & vbCrLf _
           & Err.Description, vbExclamation, oPackage.Name)
Finally
    oConnection = Nothing
    oCustomTask = Nothing
    oTask = Nothing
    oStep = Nothing
    If Not (oPackage Is Nothing) Then
        oPackage.UnInitialize()
    End If
End Try
View 1 Replies
View Related
Feb 9, 2007
Hello,
I have some data coming from a SQL Server source, and I have to add it to another SQL table based on what the destination table already contains.
For instance, if there is already an entry for a particular record (based on a matching primary key value), then I just need to update 3 columns; otherwise I need to insert.
The problem is, I know how to do it using a script component, but I am wondering if there is a better tool that saves me writing code.
Any idea?
Thanks,
Fahad
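If a set-based approach after loading to a staging table is acceptable, the update-then-insert pattern needs no hand-written component code. A T-SQL sketch (table and column names are made up):
-- Sketch only: update matching rows, then insert the rows that do not exist yet.
-- dbo.Target, dbo.Staging and the column names are hypothetical.
UPDATE t
SET    t.Col1 = s.Col1,
       t.Col2 = s.Col2,
       t.Col3 = s.Col3
FROM   dbo.Target t
JOIN   dbo.Staging s ON s.KeyCol = t.KeyCol

INSERT INTO dbo.Target (KeyCol, Col1, Col2, Col3)
SELECT s.KeyCol, s.Col1, s.Col2, s.Col3
FROM   dbo.Staging s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)
Inside a data flow, the common no-code pattern is a Lookup against the destination keys that routes matched rows to an OLE DB Command (the update) and unmatched rows straight to the destination (the insert).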
View 15 Replies
View Related