KSAM Data Integration Advice
Aug 1, 2007
Hi there,
I have to retrieve data from a KSAM (Kerridge) database and can only use a file DSN ODBC connection to connect to it.
I can get access to the database through Excel and thereby see the tables, but when I try to open a connection through the development environment, it keeps my machine busy for what seems like an eternity with no result.
I want to use SSIS to extract the data to a SQL 2005 database, but I will need to get my connections/connection managers to work.
Does anyone have advice on the best approach for this data extract?
Regards
Mike
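If the SSIS connection manager never comes back, one fallback worth testing is to pull the data through the file DSN from SQL Server 2005 itself, using the MSDASQL (OLE DB for ODBC) provider with OPENROWSET, and let SSIS move it on from a staging table. This is only a sketch; the DSN path and table name are placeholders, and whether MSDASQL cooperates with the Kerridge/KSAM ODBC driver would need to be verified:
-- Ad hoc distributed queries must be enabled first (SQL Server 2005).
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

SELECT *
INTO dbo.Staging_KsamTable                -- land the rows in a local staging table
FROM OPENROWSET(
    'MSDASQL',                            -- OLE DB provider for ODBC
    'FILEDSN=C:\DSN\kerridge.dsn;',       -- hypothetical file DSN path
    'SELECT * FROM SOME_KSAM_TABLE');     -- hypothetical remote table
If the same hang happens here, that points at the ODBC driver itself rather than the SSIS connection manager.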
View 2 Replies
Nov 24, 2006
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they occupy too much space. I really need guidance on this.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
View Related
Jun 19, 2007
I've been following Scott Mitchell's tutorials on Data Access, and in Tutorial 1 (Step 5) he suggests using SQL subqueries in TableAdapters in order to pick up extra information for display using a datasource.
I have two tables for a gallery system I'm building. One called Photographs and one called MS_Photographs which has extra information about certain images. When reading the MS_Photograph data I also want to include a couple of fields from the related Photographs table. Rather than creating a table adapter just to pull this data I wanted to use the existing MS_Photographs adapter with a query such as:
SELECT CAR_MAKE, CAR_MODEL,
    (SELECT DATE_TAKEN
     FROM PHOTOGRAPHS
     WHERE (PHOTOGRAPH_ID = MS_PHOTOGRAPHS.PHOTOGRAPH_ID)) AS DATE_TAKEN,
    (SELECT FORMAT
     FROM PHOTOGRAPHS
     WHERE (PHOTOGRAPH_ID = MS_PHOTOGRAPHS.PHOTOGRAPH_ID)) AS FORMAT,
    (SELECT REFERENCE
     FROM PHOTOGRAPHS
     WHERE (PHOTOGRAPH_ID = MS_PHOTOGRAPHS.PHOTOGRAPH_ID)) AS REFERENCE,
    DRIVER1, TEAM, GALLERY_ID, PHOTOGRAPH_ID
FROM MS_PHOTOGRAPHS
WHERE (GALLERY_ID = @GalleryID)
This works, but I wanted to know if there's a way to get all of the fields using one subquery instead of three? I did try it, but it gave me errors for everything I could think of. Is using a subquery like the above the best way when you want this many fields from a secondary table, or should I be using another approach? I'm using classes for the BLL as well and wondered if there's a way to do it at that stage instead?
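Since all three subqueries correlate on the same PHOTOGRAPH_ID, one alternative worth trying is a single join instead of three correlated subqueries. This is only a sketch built from the column names in the post, not the tutorial's own solution:
SELECT m.CAR_MAKE, m.CAR_MODEL,
       p.DATE_TAKEN, p.FORMAT, p.REFERENCE,
       m.DRIVER1, m.TEAM, m.GALLERY_ID, m.PHOTOGRAPH_ID
FROM MS_PHOTOGRAPHS AS m
LEFT JOIN PHOTOGRAPHS AS p
       ON p.PHOTOGRAPH_ID = m.PHOTOGRAPH_ID   -- one join replaces the three correlated subqueries
WHERE m.GALLERY_ID = @GalleryID
The LEFT JOIN keeps MS_PHOTOGRAPHS rows even when no matching PHOTOGRAPHS row exists, which matches the behaviour of the original subqueries (they would return NULL in that case).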
View 7 Replies
View Related
Dec 27, 2007
I have 2 flat files to load into a datamart via SSIS.
I need to implement the following:
1. How can I prevent loading of same file again?
2. If by chance wrong data has been loaded how can I rollback?
Kindly guide me as soon as possible, as I have to implement these.
JigJan
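One common pattern for both points is a load-audit table plus a load ID stamped on the loaded rows. The sketch below is only an illustration; the table and column names (FileLoadLog, FactSales, LoadId) are assumptions, not anything from the post:
-- Hypothetical audit table: one row per file successfully loaded.
CREATE TABLE dbo.FileLoadLog (
    LoadId   int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL UNIQUE,   -- the UNIQUE constraint blocks loading the same file twice
    LoadedAt datetime      NOT NULL DEFAULT GETDATE()
);

-- 1. Before the data flow, skip the file if it was already loaded
--    (in SSIS this check typically drives a precedence constraint).
IF EXISTS (SELECT 1 FROM dbo.FileLoadLog WHERE FileName = @FileName)
    RAISERROR('File already loaded.', 16, 1);

-- 2. Stamp every row loaded from the file with its LoadId, so a bad load
--    can be rolled back later with a targeted delete:
DELETE FROM dbo.FactSales WHERE LoadId = @BadLoadId;
In SSIS the IF EXISTS check would usually run in an Execute SQL Task before the data flow, and the LoadId could be added with a Derived Column.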
View 1 Replies
View Related
Feb 1, 2007
Hi, is there a programmatic way to convert the results of a SQL data set to XLS, CSV, etc.? Ideally a user would be able to make a selection to view the data (the result set has, e.g., make, model, year, condition, viewed in a datagrid or similar control) and then be able to export the file to the format they choose, and have a download box pop up from the browser to download the file. E.g. "Export this data to: __ XLS __ CSV __ TXT". I know DTS can do this, but any advice on how to encapsulate this in a C# web app would be greatly appreciated! Thanks!
View 1 Replies
View Related
Feb 29, 2008
Hello, I'm looking into offering a custom data-driven web app that I wrote for an organization that I'm a part of to other similar organizations. I would be hosting the data and web application code on my dedicated server. This application uses the membership API supplied in .NET 2.0 and also has my own custom data tables within it. My question is, what would be the best way to add clients to this? Should I simply create a new database for each new client, like so: ACME_Database, ABC_Database, AAA_Database, etc.? Or should I add some sort of client "tag" (tag meaning a column within each data table) to these databases and then update my SQL queries to process them accordingly? I imagine I could do both, but I guess I need some advice from people that already have experience with providing this kind of service. Thanks! Jason
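For comparison, here is a minimal sketch of the "client tag" option; all names (Clients, Orders, ClientId) are placeholders, and whether it beats database-per-client depends mostly on how much isolation, backup and per-client customization each organization needs:
-- Shared-schema approach: every tenant-owned table carries a ClientId.
CREATE TABLE dbo.Clients (
    ClientId   int IDENTITY(1,1) PRIMARY KEY,
    ClientName nvarchar(100) NOT NULL
);

CREATE TABLE dbo.Orders (
    OrderId   int IDENTITY(1,1) PRIMARY KEY,
    ClientId  int NOT NULL REFERENCES dbo.Clients(ClientId),  -- the "tag" column
    OrderDate datetime NOT NULL
);

-- Every query then filters on the tenant:
SELECT OrderId, OrderDate
FROM dbo.Orders
WHERE ClientId = @ClientId;
The separate-database option avoids the risk of forgetting a ClientId filter, at the cost of maintaining one schema copy per client.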
View 4 Replies
View Related
Jul 6, 2006
Hello all,
I have recently been tasked with rewriting a database that holds large volumes of data, whilst ensuring that queries can be run in optimal time. Having never really delved into this sort of thing before, I hoped you guys might be able to offer some advice and guidance.
The design I have inherited is based around 2 main tables:
[captured_traps]
[id] [int] IDENTITY (1, 1) NOT NULL
[snmp_version] [int] NULL
[community_name] [varchar] (255)
[packet_type] [varchar] (50)
[oid] [varchar] (500)
[source_ip] [varchar] (15)
[generic] [int] NULL
[specific] [int] NULL
[time_stamp] [varchar] (15)
[trap_entered] [datetime] NULL
[status] [int] NULL
[captured_varbinds]
[id] [int] IDENTITY (1, 1) NOT NULL
[captured_trap_id] [int] NOT NULL
[varbind_oid] [varchar] (500)
[varbind_text] [varchar] (500)
The relationship between the two tables is on the "captured_traps (id)" to "captured_varbinds (captured_trap_id)". Currently the "captured_traps" table contains around 350 million rows, the "captured_varbinds" table contains around 900 million rows.
Now as you can probably gather, this model runs like a... well, it sort of hobbles more than runs, hence the need to redesign.
My current thoughts on this are:
- Normalising all varchars - there are a lot of duplicate values in most of the varchar fields.
- Full Text Indexing
However, beyond that I am not sure which route to go down. After googling for most of today I have come across a number of "solutions", but I do not want to go steaming down the track of one of these only to discover that it is fatally flawed somewhere.
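To make the first bullet concrete, here is a sketch of normalising one of the duplicate-heavy varchar columns (community_name) into a lookup table; the same idea applies to oid, source_ip and the varbind OIDs. It is only an outline, not a tested migration script for tables of this size:
-- Lookup table for the distinct community_name values.
CREATE TABLE dbo.community_names (
    community_name_id int IDENTITY(1,1) PRIMARY KEY,
    community_name    varchar(255) NOT NULL UNIQUE
);

INSERT INTO dbo.community_names (community_name)
SELECT DISTINCT community_name
FROM dbo.captured_traps
WHERE community_name IS NOT NULL;

-- Add the small surrogate key to the big table and populate it,
-- so the 255-byte varchar can eventually be dropped from 350M rows.
ALTER TABLE dbo.captured_traps ADD community_name_id int NULL;

UPDATE t
SET    t.community_name_id = c.community_name_id
FROM   dbo.captured_traps t
JOIN   dbo.community_names c ON c.community_name = t.community_name;
On tables of 350 and 900 million rows, the UPDATE would normally be run in batches to keep the transaction log under control.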
View 6 Replies
View Related
Aug 29, 2007
Hello everyone,
Here's my situation...
I'm running a web service which involves 51 separate servers and databases.
There are fifty licensee servers (one for each US state) and one corporate server.
Each night I need to upload sales and membership data from the licensees' databases to the corporate database to compile reports.
The application platform I'm using is ASP.NET 2.0 and the database is SQL Server 2005 Express.
I want this process to run automatically, so I believe it's a scheduled Windows service I need to set up in .NET to make the data transfers.
If anyone has already set up something like this, or knows the steps to take, I would love to have your input.
Thanks in advance,
Robert
View 5 Replies
View Related
Aug 2, 2004
Hi,
I'm about to embark on writing some code in Perl or VBScript that automatically synchronises a constantly updated Access database with an MSSQL database.
I know MSSQL has an import tool built into Enterprise Manager, but I'm wondering if there's a stored procedure that does this?
The way I'm thinking of doing it is to read all the Access tables into separate hash arrays and then INSERT them into the MSSQL database after checking for any duplicates. This all sounds a bit time-consuming (there are a large number of tables) and processor intensive.
If anyone's done anything like this before, I'd love to hear their views!
Thanks!
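As an alternative to reading every table into hashes in Perl/VBScript, SQL Server can query the Access file directly with OPENROWSET and insert only the rows it doesn't already have. The sketch below is illustrative only; the file path, table and key column are placeholders, and it assumes the Jet OLE DB provider is available on the server:
-- Query the Access .mdb directly and insert only missing rows.
INSERT INTO dbo.Customers (CustomerId, CustomerName)
SELECT a.CustomerId, a.CustomerName
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\Source.mdb';          -- hypothetical Access file
                'admin'; '',
                Customers) AS a                -- hypothetical Access table
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.Customers c
                  WHERE c.CustomerId = a.CustomerId);   -- duplicate check
One such statement per table, wrapped in a stored procedure and scheduled as a SQL Server Agent job, would keep the whole synchronisation inside the database engine.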
View 9 Replies
View Related
May 21, 2008
I have an application that automatically reads a lot of data from a third-party application into my database, via XML. For example, I might read a couple thousand rows-worth of XML data, one row at a time in a foreach loop.
To reduce the load on their server and database, I thought about putting a 2 second delay in between each of my automatic requests. Would this really help much, or is there enough overhead (setting up/tearing down connections, etc) with each request that it wouldn't reduce server load much anyway?
Is 2 seconds enough? Too little or too much?
View 3 Replies
View Related
Jan 30, 2008
Greetings!
I would like to create a database for keeping track of payroll data for employees where the supervisors (job coaches) on our workshop floor can use a Pocket PC device to record the hourly employee data on the fly. Then at the end of the day, the supervisor can place the device in a cradle of some sort and synch the newly entered data into the main database.
I'm guessing that SQL Server Compact edition would be perfect for this type of task? Is that correct? Can someone give me recommendations on how to go about setting this up? What should I use as the main database? SQL Server? Access? Any advice is appreciated!
View 1 Replies
View Related
Nov 10, 2015
I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found that it is missing some data during the transmission. Please see the screenshots below:
SQL Server:
Oracle:
DDL:
create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);
Result:
I followed this article: [URL] ....
VB Script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
[Code] ..........
View 8 Replies
View Related
Sep 28, 2015
I set up this package to import data from a SharePoint list to a SQL Server data table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
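One common workaround is to land the list into a keyless staging table first and then insert only one row per Title into the keyed table. The sketch below is just an outline; the staging/target table names, the extra columns and the Modified column used to pick a "winner" are all assumptions:
-- Keep the most recently modified row per Title, and only insert Titles
-- that are not already in the target table.
;WITH ranked AS (
    SELECT  Title, Col1, Col2,
            ROW_NUMBER() OVER (PARTITION BY Title
                               ORDER BY Modified DESC) AS rn
    FROM    dbo.Staging_SharePointList
)
INSERT INTO dbo.TargetTable (Title, Col1, Col2)
SELECT r.Title, r.Col1, r.Col2
FROM   ranked r
WHERE  r.rn = 1
  AND  NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.Title = r.Title);
The SSIS data flow then only has to fill the staging table; the de-duplication runs in an Execute SQL Task afterwards.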
View 4 Replies
View Related
Oct 14, 2005
Does anyone know of any cross-references between SQL Server data types and the new data types introduced with SQL Server Integration Services?
View 6 Replies
View Related
Aug 28, 2015
I have to populate [CreateDate] in the data pump from my Flat File Source into my OLE DB Destination SQL Server table. Should I do this with a variable within the SSIS package, or with a Derived Column task within the Data Flow between the Flat File Source and the OLE DB Destination?
View 2 Replies
View Related
Aug 18, 2015
How should I purge data in a transaction table? Or can we delete some data and store it in a separate table in the data warehouse?
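One common pattern does both at once: archive the old rows and delete them in a single statement. This is only a sketch; the table names, the date column and the two-year cutoff are placeholders:
-- Move rows older than the cutoff into an archive table and remove them
-- from the transaction table in one statement.
DELETE FROM dbo.TransactionTable
OUTPUT deleted.*
INTO dbo.TransactionTable_Archive      -- must have the same column layout as the source
WHERE TransactionDate < DATEADD(YEAR, -2, GETDATE());
On large tables this is usually run in a loop with DELETE TOP (n) so each batch keeps the transaction log manageable.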
View 7 Replies
View Related
Jun 5, 2007
Please help! I am trying to import data from an ODBC data source to a SQL Server database using Integration Services. I am new to SQL Server 2005 but all was working happily on 2000 using DTS.
I am trying to follow the tutorials using a data flow task but cannot get my ODBC database into the connection managers tab, because OLE DB for ODBC isn't one of the options! Am I missing something? Any help on this would be greatly appreciated as I am struggling to come to terms with 2005 and cannot migrate the 2000 DTS packages
Many thanks
View 5 Replies
View Related
Apr 5, 2007
Hi, I have a question regarding the Integration Services Data Types.
From http://msdn2.microsoft.com/en-us/library/ms141036(d-printer).aspx, I found a table that shows me the Mapping of Integration Services Data Types to Database Data Types.
For example, how the DT_BOOL Data Type maps to bit for SQL Server.
In this case, I am okay, as I know exactly what the mapping is, however, for some of the datatypes, I do not.
Here is an example. The DT_CY datatype maps to smallmoney and money ... how do I know which one to map to? For me, which one I map to does indeed matter because their representation is different.
DT_NUMERIC maps to decimal and numeric ... this one does not matter as much
DT_STR/DT_WSTR ... I need to know whether it's char, varchar, nchar, or nvarchar, for padding purposes mostly.
Any help would be gladly appreciated.
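For DT_CY the practical difference is range: smallmoney only covers about -214,748.3648 to 214,748.3647, while money covers roughly +/- 922 trillion, so money is the safe default unless the values are guaranteed small. A quick illustration (the variable names are just for the demo):
DECLARE @m money, @sm smallmoney;
SET @m  = 300000.00;   -- fine: well within the money range
SET @sm = 300000.00;   -- fails: arithmetic overflow converting numeric to smallmoney
For DT_STR/DT_WSTR, DT_STR corresponds to the non-Unicode types (char/varchar) and DT_WSTR to the Unicode types (nchar/nvarchar); whether the fixed-length or variable-length flavour is appropriate depends on whether the column should be padded to its full width.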
View 5 Replies
View Related
Nov 18, 2015
I have a requirement where I need to load the correct data into the target table and save the bad data for analysis. How can I do that in SSIS?
View 6 Replies
View Related
May 30, 2007
Hi friends,
Can anybody tell me how I can update a particular table using Integration Services? I just need to update some column values.
If I write queries, I need to write approximately 35 update statements.
So is there any way I can replace the existing data with my current data?
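One common approach is to have the SSIS data flow load the new values into a staging table and then run a single set-based update in an Execute SQL Task instead of 35 separate statements. This is only a sketch; the table, key and column names are placeholders:
-- Replace many per-row UPDATE statements with one update from a staging table.
UPDATE t
SET    t.ColumnA = s.ColumnA,
       t.ColumnB = s.ColumnB
FROM   dbo.TargetTable t
JOIN   dbo.StagingTable s
       ON s.BusinessKey = t.BusinessKey;
An OLE DB Command transformation can also do updates inside the data flow, but it issues one statement per row and is much slower for anything beyond a trivial number of rows.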
View 1 Replies
View Related
May 11, 2015
I've got 4 massive pipe-delimited flat files that should go into 4 different SQL Server database tables. They constantly throw errors.
What is the best way to get the data into the database: varchar(max)? varchar(100)?
I just want the data to load. What is my best bet before they cart me off to the loony bin?
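A common "just get it loaded" pattern is an all-text staging table that accepts anything, with conversion and validation done afterwards in T-SQL. This is only a sketch with placeholder column names:
-- Land the raw pipe-delimited columns as generous text first.
CREATE TABLE dbo.Staging_File1 (
    Col1 varchar(max) NULL,
    Col2 varchar(max) NULL,
    Col3 varchar(max) NULL
);

-- Convert on the way to the real table, where bad values can be inspected:
SELECT CAST(Col1 AS int)      AS Id,
       CAST(Col2 AS datetime) AS LoadDate,
       LEFT(Col3, 100)        AS Description
FROM   dbo.Staging_File1;
Widening the column definitions in the flat file connection manager usually stops the truncation errors; the trade-off is that typing problems only surface at the conversion step.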
View 7 Replies
View Related
May 24, 2007
Hello all,
I have an ETL process like this:
- Step 1: extract data from the DB2/AS400 database to a SQL Server staging database (with the "ADO.NET ODBC" provider)
- Step 2: extract data from the SQL Server staging database to SQL Server
(with the "Native OLE DB\SQL Native Client" provider)
During step 2, if I have a table with a lot of rows (1 million or more), the data flow takes a lot of time (several minutes) before it starts extracting the data. That's quite frustrating, and it affects the overall time needed to integrate the data...
I don't know whether SSIS scans the table before integrating it. Moreover, I'm almost sure it was not like this some time ago (and I'm wondering what could have changed...).
I should also mention that the two SQL Server databases are on the same server.
Any idea is welcome.
Thanks,
Guillaume
View 1 Replies
View Related
May 25, 2015
I'm using - Destination - Oracle driver - OraOLEDB.Oracle.1 (native OLE DB Oracle Provider for OLE DB).
Source - SQL driver - Microsoft OLE DB Provider for SQL Server. I want to import data from SQL Server to Oracle. The challenge is, I have 1 million records in Oracle and 100 records in SQL Server (this count of 100 records will change daily). So I thought of using a Lookup task, taking each record from MS SQL and fetching the corresponding record from Oracle. But when I use the Lookup, all records from Oracle are loaded into the cache, which takes approx 3 hrs.
View 4 Replies
View Related
May 25, 2015
I have a requirement to compare data between two tables in SQL Server.
What is the fastest way to do it using SSIS? There are approx 6~7 millions of records in each table.
My solution: read both tables and store the data in Object-type variables, then run an EXCEPT query. But I am stuck at the EXCEPT query part. How do I implement it?
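EXCEPT cannot be run against SSIS Object variables directly; it is a T-SQL operator, so the simplest route is to let the server do the comparison (for example in an Execute SQL Task) rather than pulling 6-7 million rows into variables. A sketch with placeholder table and column names:
-- Rows that exist in TableA but are missing or different in TableB:
SELECT Col1, Col2, Col3 FROM dbo.TableA
EXCEPT
SELECT Col1, Col2, Col3 FROM dbo.TableB;

-- And the reverse direction:
SELECT Col1, Col2, Col3 FROM dbo.TableB
EXCEPT
SELECT Col1, Col2, Col3 FROM dbo.TableA;
If the tables live on different servers, a common alternative is to stage one of them across and then run the same EXCEPT locally.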
View 5 Replies
View Related
Oct 5, 2015
I'm new to SSIS. I want to develop a package for data validation.
FirstName
1. Mandatory field checking: if Null, reject the record
2. If field length > 50, then reject the record
SSN
1. If field length > 12, then reject the record
2. If SSN is not in a valid format, issue a warning and process the record without the SSN value.
3. Valid format: a 9-digit numeric value should be present after stripping off all non-numeric characters.
4. Only send 9 digits to MDM
Like these, I have 30 rules. And I have to show an error message if the validation fails, like "Mandatory field is missing".
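In SSIS these rules usually end up in a Derived Column plus Conditional Split (good rows to the destination, rejects to an error output with the message). To make the first two rule sets concrete, here is a T-SQL sketch of the same logic run against a staging table; the table name, column names and the dbo.fn_DigitsOnly helper (which strips non-numeric characters) are all hypothetical:
SELECT  s.*,
        CASE
            WHEN s.FirstName IS NULL OR LTRIM(RTRIM(s.FirstName)) = '' THEN 'Reject: mandatory FirstName missing'
            WHEN LEN(s.FirstName) > 50                                 THEN 'Reject: FirstName longer than 50'
            WHEN LEN(s.SSN) > 12                                       THEN 'Reject: SSN longer than 12'
            WHEN LEN(dbo.fn_DigitsOnly(s.SSN)) <> 9                    THEN 'Warning: invalid SSN, load without SSN'
            ELSE 'OK'
        END AS ValidationResult,
        LEFT(dbo.fn_DigitsOnly(s.SSN), 9) AS CleanSSN   -- only the 9 digits go on to MDM
FROM    dbo.Staging_Person s;
With 30 rules, it is also worth keeping the rule text in a table so the messages shown to users stay in one place.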
View 2 Replies
View Related
Mar 10, 2008
Hi everyone,
I've got a problem retrieving data from an XML Source.
Basically, I call a method from a Web Service which gives me an XML file.
The problem is that the XML structure is not really good, but we can't touch it.
Here is the XML file:
Code Snippet
<?xml version="1.0" encoding="utf-16"?>
<ArrayOfWSTargetVO xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<WSTargetVO>
<ProjectId>
<Value>131</Value>
</ProjectId>
<Id>
<Value>Toto</Value>
</Id>
<Name>
<Value>bateau</Value>
</Name>
</WSTargetVO>
<WSTargetVO>
<ProjectId>
<Value>131</Value>
</ProjectId>
<Id>
<Value>Tata</Value>
</Id>
<Name>
<Value>F35</Value>
</Name>
</WSTargetVO>
...
</ArrayOfWSTargetVO>
As you can see, for each WSTargetVO we have a ProjectId, an Id and a Name. But the value is not put directly into these nodes; it is nested in a child node: <Value>.
That causes my problem. Here is the XSD file generated by Visual Studio:
Code Snippet
<?xml version="1.0"?>
<xsd:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsd="http://www.w3.org/2001/XMLSchema" attributeFormDefault="unqualified" elementFormDefault="qualified">
<xs:element name="ArrayOfWSTargetVO">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" maxOccurs="unbounded" name="WSTargetVO">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="ProjectId">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="Value" type="xs:unsignedByte" />
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element minOccurs="0" name="Id">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="Value" type="xs:string" />
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element minOccurs="0" name="Name">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="Value" type="xs:string" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xsd:schema>
And when I try to use the output results from the XML file, I can't see how I can get a data table with three columns corresponding to ProjectId, Id and Name.
Integration Services only asks me to choose between WSTargetVO, ProjectId, Id or Name, and gives me the <Value> value.
I don't know if it is possible to modify the contents of the XML file or do something else using XPath.
Of course, if I try to modify the XSD file and delete the Value node to have a simple structure, I see my three columns but I can't get any data.
I'm aware that the XML file is pretty bad but it is impossible for me to change it.
If somebody has an idea, I would be happy to hear it :-)
(I'm a beginner in Integration Services)
Thank you,
Radik
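If the raw XML can be landed in SQL Server (for example written to a table or variable before the load), the extra <Value> layer is easy to step over with XQuery. This is only a sketch using the structure posted above, not a change to the XML Source component itself:
DECLARE @x xml;
SET @x = N'<ArrayOfWSTargetVO>
             <WSTargetVO>
               <ProjectId><Value>131</Value></ProjectId>
               <Id><Value>Toto</Value></Id>
               <Name><Value>bateau</Value></Name>
             </WSTargetVO>
           </ArrayOfWSTargetVO>';

SELECT  t.c.value('(ProjectId/Value)[1]', 'int')          AS ProjectId,
        t.c.value('(Id/Value)[1]',        'nvarchar(50)') AS Id,
        t.c.value('(Name/Value)[1]',      'nvarchar(50)') AS Name
FROM    @x.nodes('/ArrayOfWSTargetVO/WSTargetVO') AS t(c);
Staying inside SSIS, the usual alternative is an XML Task running a small XSLT that flattens the <Value> elements before the XML Source reads the file.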
View 3 Replies
View Related
Nov 5, 2015
I am loading incremental data from SQL Server to Oracle using SSIS, and during the data conversion it says the data types don't match.
SQL column data type: smallint (SQL Server 2008 R2).
Oracle data type: Number(5) (Oracle 10g).
View 5 Replies
View Related
Aug 24, 2015
I have data in an Excel sheet which is to be loaded to a SQL table. The column called seq_num has data with leading 0s, and these 0s are dropped while loading through SSIS.
For example, if seq_num is 0099988, the SQL table gets 99988. How do I get the whole value without losing anything?
FYI: seq_num on the Excel source has a data type of DT_R8.
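Because DT_R8 is a numeric type, the leading zeros are already gone by the time the value leaves the Excel source; the usual fixes are to make the Excel connection treat the column as text (e.g. IMEX=1 in the connection string plus a text-formatted column) or to re-pad the value afterwards. A re-padding sketch, assuming seq_num is always a 7-digit code as in the 0099988 example and that the target table name is a placeholder:
-- Restore a fixed-width, zero-padded representation after the load.
SELECT RIGHT(REPLICATE('0', 7) + CAST(CAST(seq_num AS int) AS varchar(7)), 7) AS seq_num_padded
FROM   dbo.TargetTable;
The same expression can also be built as a Derived Column in the data flow if the padding should happen before the row reaches SQL Server.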
View 5 Replies
View Related
Jun 2, 2015
I'm trying to extract SharePoint list data into an Excel file using an SSIS package. I'm using SharePoint 2013 and BI 2008 R2. How do I do it?
View 5 Replies
View Related
Nov 20, 2015
So I have to make a fairly dynamic data flow. I will get most of the configuration from a database table. I will look up the name of the procedure to run as a source (I can use expressions or a script component source for this), and I will look up column names from a database table. I can use expressions (maybe) or a destination script component for the destination, including the destination table name and column names; these will be looked up in a database table. What I am not sure about is how I will do the mapping. How can I make this dynamic? The logic for mapping will be in the database as well. Could I create a custom data flow all in one script - a source, destination and mappings all in one script? Is there an example of this out there? My task is to make the data flow completely dynamic; all config info would be kept in a SQL Server database. A complete custom script component data flow task.
View 3 Replies
View Related
Apr 25, 2006
When executing my data flow package, which contains only one source and one destination
OLE DB source -> SQL Server destination
the following errors occur in my output:
Error: 0xC0202009 at Data Flow Task(infraction action), SQL Server Destination [3600]: An OLE DB error has occurred. Error code: 0x80040E14.
Error: 0xC0202071 at Data Flow Task(infraction action), SQL Server Destination [3600]: Unable to prepare the SSIS bulk insert for data insertion.
Error: 0xC004701A at Data Flow Task(infraction action), DTS.Pipeline: component "SQL Server Destination" (3600) failed the pre-execute phase and returned error code 0xC0202071.
I've checked the structure of my source and destination tables but nothing seems to be wrong.
If someone has ever faced these errors, please help me :D
View 22 Replies
View Related
Nov 4, 2015
I have two tables that I UNION to retrieve data for users. A combination of these should yield only one row per employee. The problem is that a unique ID is created for the position of instructors, while the other table holds all employees with an employee number. Some data, such as username, email address, etc., does not change. So even though UNION should remove duplicates, I still have duplicates, because the username, which is what I'm filtering on, is the same in each table. In the combined table I'm only selecting specific employees based on job class and job code. The employee ID in the first table is preceded by 'B', and in the second by 'T' (this is only to identify which table the data is taken from). Here is what I am getting when I UNION both tables.
query
SELECT
distinct 'B-'+ Employee_ID
as Employee_ID
, Username
,Email
[code]...
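Since the username is the value that stays the same across both tables, one way to collapse the output to one row per person is to rank the unioned rows by Username and keep a single row. The sketch below follows the visible columns from the post; the two table names and the preference for the 'B-' row are assumptions:
;WITH combined AS (
    SELECT 'B-' + Employee_ID AS Employee_ID, Username, Email FROM dbo.EmployeeTable
    UNION ALL
    SELECT 'T-' + Employee_ID, Username, Email FROM dbo.InstructorTable
),
ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY Username
                              ORDER BY Employee_ID) AS rn   -- 'B-...' sorts before 'T-...'
    FROM combined
)
SELECT Employee_ID, Username, Email
FROM   ranked
WHERE  rn = 1;
UNION only removes rows that are identical in every selected column, which is why the differing employee IDs keep both copies in the original query.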
View 8 Replies
View Related