Data Loading

Apr 10, 2000

What is the best way to load large amounts of data? I am working on a project where I will need to load data into approx. 20 tables. Into several of the tables I will need to load around 400,000 records. I am familiar with the concepts involved in using BCP but was hoping I could avoid the step of going to text files. I am pulling data from Access (either 97 or 2000). Any suggestions would be welcome.
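If the goal is to avoid the intermediate text files, the DTS Import Wizard can copy Access tables straight into SQL Server, and (on SQL Server 7.0 or later) the Jet OLE DB provider can also be queried directly from T-SQL. A minimal sketch, with an assumed .mdb path and made-up table/column names:

-- Assumed names throughout; one statement per target table
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\data\source.mdb'; 'Admin'; '',   -- path to the Access file, default Admin user, no password
                AccessTableName);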

View 1 Replies



Integration Services :: SSIS VB Script Loading Data Into Oracle DB Missing Some Data

Nov 10, 2015

I'm using a Script Component to load data into an Oracle DB due to a performance issue. Now I have found that some data goes missing during the transmission. Please see the screenshots below:

SQL Server:
Oracle:
DDL:

create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);

Result:

I followed this article: [URL] ....

VB Script: 
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

[Code] ..........

View 8 Replies View Related

Master Data Services :: Error Code 8 While Loading Data From MDS Stage To Model

Apr 22, 2015

I am getting ErrorCode 8 while loading the data from stage to model. I have checked my error view; it states "Member Code is Inactive".

Initially I loaded the same set of data into the model from the MDS stage table, but then deleted it with ImportType = 5, which removed all the data from the MDM model.

Now I want to load it back, but it gives Error Code 8. Before loading the same data I changed the stage table's ImportType to 2 and ImportStatus_ID to 0.
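For reference, a sketch of how the batch can be re-staged and re-run, assuming an entity named Product (the stg.Product_Leaf table, stg.viw_Product_MemberErrorDetails view, and stg.udp_Product_Leaf procedure follow the standard MDS staging naming convention; substitute your own entity, version, and batch tag):

-- Inspect the detailed error rows for the failed batch (entity name assumed)
SELECT * FROM stg.viw_Product_MemberErrorDetails;

-- Flag the rows for another attempt and process them as a fresh batch
UPDATE stg.Product_Leaf
SET ImportType = 2, ImportStatus_ID = 0, BatchTag = 'Reload_01'
WHERE BatchTag = 'Initial_Load';   -- placeholder batch tags

EXEC stg.udp_Product_Leaf @VersionName = 'VERSION_1', @LogFlag = 1, @BatchTag = 'Reload_01';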

View 6 Replies View Related

Do We Have To Always Use Slowly Changing Dimensions (SCD) Component In The Data Flow For The Loading Of Table Data?

Feb 28, 2008

Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse to handle changed rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,
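No, the SCD Wizard is only one option; a common alternative for larger dimensions is to stage the incoming rows and apply the changes in set-based T-SQL. A minimal Type 1 (overwrite) sketch with assumed table and column names:

-- Assumes SQL Server 2008 or later for MERGE; on 2005, use a separate UPDATE plus INSERT instead
MERGE dbo.DimCustomer AS tgt
USING dbo.Customer_Staging AS src
    ON tgt.CustomerKey = src.CustomerKey
WHEN MATCHED AND (tgt.City <> src.City OR tgt.Segment <> src.Segment) THEN
    UPDATE SET City = src.City, Segment = src.Segment        -- Type 1: overwrite changed attributes
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, City, Segment)
    VALUES (src.CustomerKey, src.City, src.Segment);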

View 4 Replies View Related

Problem Loading Data From Flat File Source: Data For Column Overflowed The Disk I/O Buffer

Sep 10, 2007



Hi, I am trying to do a straightforward load from a flat file source. I have defined the columns according to the lengths defined in the data dictionary provided, but when I try to run the task I encounter this error:

The column data for column "Column 20" overflowed the disk I/O buffer.

I tried adding another column 21 at the end and truncating it or leaving that column unmapped to the destination, but the same problem occurs for column 21. What should I do to overcome this?

Also, in the case of bad data, how do I clean up the source? Please help me with this.

View 5 Replies View Related

Loading Data From A Text File To SQL Database

Sep 10, 2007

Hello!! Searching for information about how to migrate some data from an old database (any type) into SQL, I've found this:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name.txt'
[REPLACE | IGNORE]
INTO TABLE tbl_name
[FIELDS
[TERMINATED BY 'string']
[[OPTIONALLY] ENCLOSED BY 'char']
[ESCAPED BY 'char' ]
]
[LINES
[STARTING BY 'string']
[TERMINATED BY 'string']
]
[IGNORE number LINES]
[(col_name_or_user_var,...)]
[SET col_name = expr, ...]
Does anybody know how it works and how to use it? I'd like to know because I have to load data from a text file into a SQL database, and this seems to be the fastest and easiest way to do it... Thanks!!!! Bye!
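That LOAD DATA syntax is MySQL, so SQL Server will not accept it; the closest T-SQL equivalent is BULK INSERT. A rough mapping of the clauses, with assumed table and file names:

BULK INSERT tbl_name
FROM 'C:\file_name.txt'
WITH (
    FIELDTERMINATOR = ',',    -- FIELDS TERMINATED BY
    ROWTERMINATOR   = '\n',   -- LINES TERMINATED BY
    FIRSTROW        = 2       -- roughly IGNORE 1 LINES (skip a header row)
);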

View 1 Replies View Related

SSIS - Data Loading Job -- Update Col B With Col A If Col B Is NULL In The Data File?

May 10, 2007

How do you achieve this during the SSIS data load execution itself?

Update Col B with Col A value if Col B is NULL in the Data File?
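Inside the data flow this is usually handled with a Derived Column transformation using an expression such as ISNULL(ColB) ? ColA : ColB (configured to replace ColB). If it is easier to fix up after the load instead, a simple post-load statement does the same thing; table and column names below are placeholders:

UPDATE dbo.LoadedTable
SET ColB = ColA
WHERE ColB IS NULL;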

View 1 Replies View Related

Data Missing When Loading Data Into Sql Server 2005

Jan 17, 2008



Hi, Experts

The project is a C/S data analysis system built with .NET 2.0 in a Windows environment. OS: Microsoft Windows 2003 R2 Standard Edition Service Pack 2; the database used in this project is SQL Server 2005. As a data analysis system, we need to load a large amount of data from file into the database. We do it by creating a DTS package and then doing the data loading by executing "m_Package.Execute(null, variables, m_PackageEvents, null, null)".

The problem is that we found DTS sometimes misses some data randomly, and we haven't been able to find a pattern so far. For example, we have data as follows in the data file, with all data fields split by '|':
11234|26341|2007-09-03 00:00|0|0|0.0|0|0.011470833793282509|1|0.045497223734855652|0|0|1|0|3|2929|13130|43|0|2|0|0|40|1|0|0|0|0|0|1||0|0|3|0|0|0|0|0||0|3|0|0|43|43|0|41270|0|3|0|0|10|3|0|0|0|0|0||0|1912|0|0|3|0|0|0|0|0|0|0|3|0|0|5|0|40|0|9|0|0|0|0|0|0|0|0|29|1|1|24|24.0|16|16.0|0|0|0.0|0|0|24|23.980693817138672|0|0.0|0|0.0|0|0.0|0|0.0|11|2.0764460563659668|43|2|0|0|30|11|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|3|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|6|0|0|45|1|0|0|0|2|42|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|2|0|0|0|0|0|0|51|47|85|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||0|0|0|0|0|97.117401123046875|0|0|83|57|||0.011738888919353485|0|1|0.065955556929111481|0|4|||0.00026658669230528176|1|0.00014440112863667309|1|68|53|12|2|1|2.0562667846679688|10|94|2|0|0|30|11|47|4|13902|7024|6878|18|85|4.9072666168212891|5|0.0|0|0.0|0|0.0|0|0.0|0|358|6448|6324|0|0|0|0||0||462|967|0|41|39|2|0|0|0|1|0|0|0|0|0|0|0|0|3|0|0|3|0|0|0|0|0|0|0|0|0|3|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|46|0|1|0|1|37|0|0|46|0|1|0|1|37|0|0|0|0|0|0|0|0|0.0|0|0|6|4|2|0|0|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|1|0.012290795333683491|0|44|44.0|0|0.0|0|0|0|30|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|27|0|0|2112|411|411|45|437|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|4|0|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|6|6|0|3|2|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|600|600|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|0|6|0|9|1|2|2|3|0|1|0|0|0|0|0|0|0|0|0|0|0|13|3|2|5|1|1|1|0|0|0|102|0|1|1|0|0|0|3|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|46.0|46|0.0|0|0.0|0|0.011469460092484951|1|0.0|0|0.0|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|100.0|100.0|0|1|0|1|0|0|0.02481505274772644|1|0.014951236546039581|1|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|4695.5556640625|42260|7126.66650390625|62040|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11213|26341|2007-09-03 00:00|0|0|0.0|0|0.068584725260734558|2|0.14375555515289307|27|0|2|1|11|3966|13162|97|0|13|0|0|83|3|2|3|0|0|0|26||0|0|11|0|0|0|1|0||0|1|0|3|97|98|0|246795|0|11|1|0|3|14|0|0|0|0|0||0|1912|0|0|12|0|0|0|0|0|0|0|12|0|0|17|0|83|2|2|2|0|0|0|0|0|0|0|73|3|1|24|24.0|16|16.0|0|0|0.0|3|0|24|23.905364990234375|2|0.0040000001899898052|0|0.0|0|0.0|0|0.0|11|2.0772171020507812|97|7|0|0|80|10|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|12|12|0|0|0|0|0|0|0|0|0|41|0|0|0|0|0|41|0|0|99|25|0|0|0|0|74|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0|0|0|0|0|177|158|332|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|3|||||||||||||0|0|0|0|0|0.0|0|0|321|198|||0.041950233280658722|0|2|0.1999288946390152|0|5|||0.00088306842371821404|1|0.00051651173271238804|1|529|113|4|8|2|2.0671226978302002|10|274|7|0|0|80|10|66|17|13934|7024|6910|31|332|4.7173333168029785|5|0.000033333333703922108|1|0.000033333333703922108|1|0.000033333333703922108|1|0.0|0|358|6448|6356|0|0|0|0||0||1609|3423|0|83|78|5|0|0|26|0|0|0|0|0|0|0|0|0|2|0|1|1|0|0|0|0|3|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|65|0|1|0|1|72|0|0|65|0|1|0|1|72|0|0|0|0|0|0|0|0|0.0|0|0|12|7|0|2|3|16|5|5|6|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|2|0.04131799191236496|0|48|48.0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|1|0|0|0|0|0|0|9|0|5|1|0|0|0|1|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|4|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|3|0|0|0|0|0|0|0|0|121|0|1410|6046|558|1400|192|2467|10|0|5|1|0|0|0|2|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|15|0|10|0|0|0|0|3|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|21|9|12|10|3|1|0|1|4|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|163|4|144|91|92|2|92|0|0|0|0|0|101|92|0|0|0|0|101|0|0|0|0|600|596|1|0|0|3|0|0|0|0|0|0|0|0|0|0|0|9|0|0|1|0|0|0|8|0|34|3|4|14|7|2|3|0|1|0|0|0|0|0|0|0|0|0|41|6|4|23|5|2|1|0|0|0|289|0|7|7|0|0|0|11|11|0|0|4|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|4|0|0|0|0|0|4||||||||||3|0|0|0|0|0|3||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|55.814689636230469|47.931411743164062|48|0.0|0|0.0|0|0.068584725260734558|2|0.0|0|0.0|0|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|0|0|1|0|0|0|0|0|||0|100.0|100.0|0|26|26|0|0|0|0.088263392448425293|2|0.056161552667617798|2|0|0|0|0|0|0|0|0|0|5|22|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|16308.888671875|146780|23162.22265625|184840|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11220|26341|2007-09-03 00:00|0|0|0.0|0|0.309184730052948|2|0.17757916450500488|0|0|7|4|18|3925|13682|172|0|19|0|0|164|10|5|4|0|0|0|2||0|0|20|0|0|1|4|0||0|5|0|4|172|172|0|1165085|0|20|4|0|20|8|0|0|0|0|0||0|1912|0|0|24|0|0|1|0|0|0|0|23|0|0|30|0|164|4|6|8|0|0|0|0|0|0|0|121|10|15|24|24.0|16|16.0|0|0|0.0|4|0|24|23.646148681640625|1|0.0040013887919485569|0|0.0|0|0.0|0|0.0|11|2.0849723815917969|172|5|0|0|123|44|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|26|24|0|0|0|2|0|0|0|0|0|192|1|0|0|0|0|191|0|0|190|12|0|0|0|0|178|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|0|1|0|0|0|0|1008|953|2758|0|5|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|4|||||||||||||0|0|0|0|0|84.418106079101562|0|0|2626|1420|||0.29022222757339478|0|5|1.5045944452285767|0|5|||0.0058597764000296593|2|0.0046600494533777237|2|1340|1114|80|119|27|2.2584490776062012|10|1180|5|0|0|123|44|953|55|14462|7024|7438|52|2758|3.0037333965301514|5|0.021266667172312737|1|0.00036666667438112199|1|0.0|0|0.0|0|362|6440|6880|0|0|0|0||0||13711|27667|0|159|149|10|0|0|1|1|0|0|0|0|0|0|0|0|7|0|0|7|0|0|0|0|4|0|0|0|0|7|0|7|0|0|0|0|0|0|0|0|0|2|3|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|842|0|111|0|102|1702|0|1|842|0|111|0|0|1703|0|0|0|0|0|0|0|0|0.0|0|0|47|26|11|3|7|37|1|20|16|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|4|0.24921548366546631|0|44|44.0|0|0.0|0|0|0|1003|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|10|0|8|2|0|0|0|0|0|0|0|0|0|0|81|1|60|10|10|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|25|16|4|2|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|53|27|17|4|5|0|0|0|0|0|0|0|0|0|421|0|8685|67179|2138|12104|80|26285|104|1|73|16|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|87|1|77|7|2|0|0|0|0|0|0|0|0|0|1|0|0|1|0|0|0|0|0|0|0|0|0|0|16|0|9|5|2|0|0|0|0|0|0|0|0|0|155|155|0|105|51|32|9|13|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|102|0|0|0|0|0|600|445|4|20|32|63|16|15|4|1|0|0|0|0|0|0|0|37|0|0|5|0|0|0|32|0|230|7|10|99|68|22|21|0|3|0|0|0|0|0|0|0|0|0|286|18|10|182|53|17|6|0|0|0|2528|0|10|10|0|0|0|22|22|0|0|25|25|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|30|0|0|0|0|0|30||||||||||23|0|0|0|0|0|23||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0.0|45.998283386230469|46|0.0|0|0.0|0|0.30917638540267944|2|0.0|0|0.0|0|8|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|1|0|0|0|0|0|0|0|||0|100.0|100.0|0|2|1|0|0|1|0.73375397920608521|5|0.41600942611694336|6|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|98115.5546875|865520|176626.671875|1159360|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

We found that some of the data fields become NULL after the load action finishes. If we load the same data again, the problem disappears. We can't reproduce this issue 100% of the time, and we don't know why. Can anybody here help us solve this issue or give us some clue?


View 3 Replies View Related

Deleting Existing Data Before Loading New Data

Apr 10, 2007

I have a package which loads data from a flat file (csv) to 4 tables in a database.
Now, the load is incremental.

I want to clear the data in all 4 tables (in the database) before loading the data from the flat file every time. How can I do this?
I am using 4 OLE DB destinations, 1 multicast, and 1 source component to do this.
Also, can it happen as a transaction? Because if it deletes the existing data and then can't load the new data, there will be a problem! How do I avoid this?
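One way is to put an Execute SQL Task before the data flow that clears the four tables, then set the package's TransactionOption to Required (leaving the tasks at Supported) so the delete and the load commit or roll back together; this relies on the Distributed Transaction Coordinator being available. A sketch of the clearing statement, with placeholder table names:

-- Run in a single Execute SQL Task before the data flow (table names are placeholders)
DELETE FROM dbo.Table1;
DELETE FROM dbo.Table2;
DELETE FROM dbo.Table3;
DELETE FROM dbo.Table4;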

View 4 Replies View Related

Loading Data

May 18, 2001

Hi,
I am loading data from a mainframe to SQL Server on Windows NT.
Normally the DTS job took about 35 minutes.
For the last two days it has run for more than six hours and the job still doesn't finish.
I am at a loss as to what to do and how to fix this problem. The mainframe people said SQL Server is fetching the data very slowly.
If anyone knows a solution, please post it.

View 5 Replies View Related

Loading Data From Xml

Sep 17, 2006

I have an xml file I want to use as the source. It's not overly complicated, but not simple either. It has one hierarchy and one optional field, and looks like this (the tags were stripped by the forum, so only the values remain):



a
    1 'text'
    2 'text'
b
    1 'text'
    2 'text' 'optional field'



Ok, now I want the data to load like this:

a,1,text

a,2,text

b,1,text

b,2,text, optional field



but when I try to use the XML Source it won't create the XSD... is there anything I can do?
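If the XML Source can't infer an XSD, one workaround is to supply a hand-written XSD; another is to bypass the XML Source entirely and shred the file in T-SQL (SQL Server 2005 and later). A sketch with made-up element names, since the real ones aren't shown:

-- Load the file and shred it; '/root/group', 'item', 'value', 'optional' are assumed names
DECLARE @x xml;
SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\source.xml', SINGLE_BLOB) AS f;

SELECT
    g.value('@name',         'varchar(10)')  AS group_key,      -- "a" / "b"
    i.value('@id',           'int')          AS item_key,       -- 1 / 2
    i.value('(value)[1]',    'varchar(100)') AS item_text,
    i.value('(optional)[1]', 'varchar(100)') AS optional_field  -- NULL when the element is absent
FROM @x.nodes('/root/group') AS G(g)
CROSS APPLY g.nodes('item')  AS I(i);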

View 9 Replies View Related

Help Loading XML Data

May 1, 2008



I am trying to enhance an existing package that does the XML-to-table load. This package uses a Script Component to parse and load the XML file into multiple tables (3 tables) using VB.NET code. I want to add a component to it so that any XML rows that error out get redirected to a different table. This log table has only one column, and the erroring XML records are supposed to be loaded into that column in XML format. This is the existing design and I have to live with it, and at the same time I am not a big .NET coder, so any help is appreciated.

Thanks

View 5 Replies View Related

Loading Data To Db2

Jun 29, 2006

Hi,

trying to load data from sql2000 to DB2 UDB 7.2 ,

building SSIS package using wizard

case 1:

source ole db -sql2000 destination IBM ole db provider for DB2

Data Conversion component placed on the data flow page; the package runs fine.

case 2

source ole db -sql2000 destination microsoft ole db provider for DB2

Data Conversion component NOT placed on the data flow page, just source and destination; the package fails (can't detect the DB2 code page).

questions

if I transfer 1000000 rows from sql to db2 ,what will happen ?

in case 1 :

is it single transaction ?

Is it possible to set "MaxInsertCommitSize" or another property and commit every N rows?

case 2:

If I remember correctly, the Microsoft OLE DB driver for DB2 (used in Host Integration Server) forced a commit on every row. (I might be wrong.)

Again, is there any property that could be set to commit every N rows?

Thanks

Alex



View 1 Replies View Related

What Is My Best Strategy For Loading Data.

Apr 22, 2007

I have been developing a genealogy application using a SQL Server 2000 database and ASP .NET 2.0.  In this application a process, Ged.Parse, converts data from the GEDCOM standard format (a hierarchical file format that looks as if it was designed for 80-column cards) into my SQL Server database.
As we started to load reasonable quantities of data into the system we found that the on-line response became abysmal.  This problem was fixed by defining a number of secondary indexes (response times dropped to under a second, from previously exceeding 2 minutes and often timing out).  Unfortunately however the processing time of Ged.Parse then tripled, and it may now take up to an hour to process a GEDCOM. I believe that this is a byproduct of defining several indexes that are not needed by Ged.Parse itself, but which are of course maintained as Ged.Parse inserts new records into the database.  
I am wondering what my best strategy is, apart from putting Ged.Parse into a background task and just letting it trickle away.  (I will probably do this anyway). What I'd like to be able to do is to have Ged.Parse load records without creating the secondary indexes, and then create the indexes for the newly-added records as a penultimate step just before it makes them available for general use.  Of course there is no way that you can do this:  records in a table are either indexed or they are not.
Proposed change: recode Ged.Parse to load data into temporary tables, say NewPeople, NewFacts, etc., with these tables having only the indexes required by Ged.Parse. Then, as the last process in Ged.Parse, run a SQL procedure with code like:

Insert into People Select * From NewPeople
Delete from NewPeople
etc.
This is a reasonable amount of programming, so before I make this change could somebody tell me:  will this be significantly faster overall, or is this likely to make little or no improvement compared to the present process in which Ged.Parse loads data directly into People, Facts, etc?   Two facts that may influence the answer.  First, all record relationships are through GUIDs, so records in NewPeople, NewFacts, etc would already have their final key values.  Second: although Ged.Parse needs to form relationships between records, these relationships are only within the new records (created from the same GEDCOM), and Ged.Parse does not need to relate any of these new records to earlier records.
Thank you,
Robert Barnes.
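For what it's worth, the proposed final step usually does pay off: a single set-based INSERT ... SELECT maintains the secondary indexes far more cheaply than thousands of individual inserts, and the staging tables keep Ged.Parse's own lookups fast. A sketch of that last step, wrapped in a transaction so a failure leaves both tables consistent (table names taken from the description above, the rest assumed):

BEGIN TRAN;

INSERT INTO People SELECT * FROM NewPeople;
DELETE FROM NewPeople;

INSERT INTO Facts SELECT * FROM NewFacts;
DELETE FROM NewFacts;

COMMIT TRAN;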
 

View 2 Replies View Related

Loading Data Into SQL7

Jun 26, 2001

Hi!
I need to load text data into SQL7. The tricky part (at least for me) is that this data may be duplicated.
How can I load this data discarding the duplicated rows? AFAIK, the sql job (or DTS) will fail if a primary key is violated.
TIA,
Fabio Aneas
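One common approach is to load the file into a staging table that has no primary key (so the load itself can't fail on duplicates), and then copy across only the rows that aren't already in the target. A sketch with assumed table and column names:

INSERT INTO dbo.Target (KeyCol, Col1, Col2)
SELECT DISTINCT s.KeyCol, s.Col1, s.Col2
FROM dbo.Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.KeyCol = s.KeyCol);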

View 1 Replies View Related

Can I Automate Data Loading Through DTS?

Sep 19, 2000

Using SQL Server 7.0, I need to watch for a file to be placed in a directory and then load it automatically. What is the best way to do this? I have the Bulk Insert process set up in DTS and would like to just add another process if possible.
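One option is a scheduled job step that checks for the file and only then runs the load. The sketch below uses the undocumented xp_fileexist procedure; the path and table name are placeholders:

DECLARE @exists int;
EXEC master.dbo.xp_fileexist 'C:\incoming\feed.txt', @exists OUTPUT;

IF @exists = 1
BEGIN
    -- run the existing load here, e.g. the DTS Bulk Insert package, or a plain BULK INSERT:
    BULK INSERT dbo.ImportTarget FROM 'C:\incoming\feed.txt' WITH (FIELDTERMINATOR = ',');
    -- then move or rename the file so the next run does not load it again
END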

View 2 Replies View Related

Loading Data Into Sql Table

Apr 25, 2008

I want to read the file c:\abc.txt through SQL programming and insert this file's data into a table.

Suppose c:\abc.txt has the following data:
aman 10 50,000
sumesh 20 40,000

Suppose I have created the table abc (a varchar(30), b int, c int).
How will I do it?

Thanks
Ajay
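A minimal sketch, assuming the file is space-delimited and the third value keeps its thousands separator (so it is staged as text and converted afterwards):

CREATE TABLE #stage (a varchar(30), b varchar(10), c varchar(20));

BULK INSERT #stage
FROM 'c:\abc.txt'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n');

INSERT INTO abc (a, b, c)
SELECT a, CAST(b AS int), CAST(REPLACE(c, ',', '') AS int)   -- strip the comma from values like 50,000
FROM #stage;

DROP TABLE #stage;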

View 3 Replies View Related

Loading Data Into A Database

Sep 13, 2007

Greetings,

I am new at SQL and I have been asked to load some data into a database. I was given a file that has an extension of .sql. Shown below are the first few lines:

TRUNCATE TABLE henrylee.[HenryLee]
GO
INSERT INTO henrylee.[HenryLee] ( [custno], [company], [address1], [address2], [city], [state], [zip], [phone], [email] ) VALUES ( 'C00001', 'SUPREME NOVELTY', '5954 S PULASKI ROAD', '', 'CHICAGO', 'IL', '60629', '', '' )

several more rows after this

This looks to me like it was built to be scripted in or use some function of SQL to create and populate the table... does that make sense? Anyway, is there an easy way to insert this data into a table?

Thanks
-B
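Yes, that makes sense: the file is just a T-SQL script, so the simplest route is to open it in Query Analyzer or Management Studio and execute it against the right database. It can also be run from the command line with osql; the server, database, and path below are placeholders, and -E uses Windows authentication:

osql -S YourServer -d YourDatabase -E -i C:\scripts\henrylee.sql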

View 15 Replies View Related

Loading Data To A Table

Feb 20, 2008

Hi,
I created a package to load a fact table
which loads more than 7 million records.
When loading the table it took nearly 15 minutes.
Then indexes were created for the table at the DB level, after which the time to load the same number of records increased.
How can I resolve this time delay?
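One common pattern is to disable the nonclustered indexes before the load and rebuild them afterwards, so the bulk insert doesn't have to maintain them row by row. A sketch with assumed table and index names (leave the clustered index/primary key alone, since disabling it makes the table unreadable):

ALTER INDEX IX_Fact_Date    ON dbo.FactTable DISABLE;
ALTER INDEX IX_Fact_Product ON dbo.FactTable DISABLE;

-- ... run the SSIS load here ...

ALTER INDEX ALL ON dbo.FactTable REBUILD;   -- rebuilding also re-enables the disabled indexes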

View 4 Replies View Related

Loading Data From Sql To Access

Oct 25, 2007

Hi the scenario is,

I have a remote website which uses a SQL Server database. A SQL job runs on a server called goofy, which imports the data from the remote SQL Server into an Access database sitting on another server called dumpy. I get the error below when trying to import the data. I checked, but no one is accessing this data.


Executed as user: GOOFYSYSTEM. ...e Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 7:45:29 PM Progress: 2007-10-25 19:45:29.44 Source: Data Flow Task 1 Validating: 0% complete End Progress Progress: 2007-10-25 19:45:29.44 Source: Data Flow Task 1 Validating: 33% complete End Progress Progress: 2007-10-25 19:45:29.60 Source: Data Flow Task 1 Validating: 66% complete End Progress Error: 2007-10-25 19:45:29.68 Code: 0xC0202009 Source: importdata Connection manager "SponsorshipData 1" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "The Microsoft Jet database engine cannot open the file '\dumpyd$CompanyCCNASponsorshipData.mdb'. It is already opened exclusively by another user, or you need permission to view its data.". End Error Error: 20. The step failed.

My other question is: there is a column of type Memo in Access, but in the remote SQL Server I have that field as ntext. How can I pull the ntext data into the Memo field? I saw posts in forums but found them too complex to understand. Is there any step-by-step guide to pulling ntext data into a Memo field in SSIS?

thanks in advance.

View 3 Replies View Related

Loading Data From XML To Tables

Dec 1, 2007

Hi,

I have got an XML file with a size of more than 2 GB. I have to load this file into tables. On a 32-bit platform I am unable to load this file using SSIS. RAM is 8 GB, but it is still bombing out. As I understand it, SSIS uses an XML DOM parser and tries to shred the file in memory, and because of the memory limitation it fails. Although I have already written code in C# using the XmlTextReader object (an implementation of a SAX-style parser) to load the data into tables, I want to keep this loading process within the limits of the DBAs.

I am stuck. Can someone guide me through the situation?

Thanks for your help!

Navnish

View 8 Replies View Related

Not Loading Data From Script Component

May 17, 2006

Dear all,

I've created a Data Flow scenario as follow:

At first I've got a Flat File Source, then a Script Component, and then an OLE DB Destination, linked by arrows, of course. When I run the SSIS package, all of them execute successfully except the last task. Why? I don't know, and it doesn't report any error.

649 rows are passed to Script Component from the file but they aren't going to my Sql table.

Let me know any advice or thoughts regarding this.

Thanks a lot,

View 2 Replies View Related

Loading Data From Excel

May 20, 2008

Can this be done in SSIS?

I have an xls spreadsheet with multiple worksheets. From each worksheet I need to load the data into sql server. The data I need to retrieve from each worksheet is not row by row, but cell by cell. For example, I need to load data from cells C44, G65, I23, A78, etc. in all the worksheets. Is this possible using SSIS?
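Yes, this is possible. In an Excel Source you can switch the data access mode to a SQL command and query a specific range such as [Sheet1$C44:C44]; the same range syntax also works from T-SQL through the Jet provider. A sketch with an assumed file path and sheet name (ad hoc distributed queries must be enabled):

-- Single cell C44 from Sheet1; with HDR=NO the column comes back as F1
SELECT F1 AS CellValue
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Workbook.xls;HDR=NO',
                'SELECT * FROM [Sheet1$C44:C44]');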

View 3 Replies View Related

Loading Old Data From Staging

Jan 3, 2007

I'm populating a table (B) in SQL Server from a staging table (A) using a stored procedure. At any point in time, the staging table holds 60 months of old data. In the first load of the destination table B, I get 13 months of old data, whereas for every subsequent load I need to load the data for the most recent month and delete the data for the first (oldest) month. For example, if the load procedure runs on December 02, 2006, it should pick data for the month of November 2006 from the staging table and delete data for the oldest month.

I have a column DATA_MONTH_KEY in table B which maps to the column DATA_MONTH in my staging table A. I get the data for the first 13 months using:

(B.DATA_MONTH_KEY BETWEEN ( DATEADD(month,-13,@startdate)) AND @startdate) where startdate is the current date on which the procedure for populating table B is run. I get the value of startdate from a function.

How do I get the data for the most recent month and delete the oldest month in subsequent loads?

Any help appreciated. Thanks.
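A sketch of the incremental run, using the column and table names from the post; @startdate is a placeholder for the value returned by your existing function, and the other columns are elided:

DECLARE @startdate datetime;
SET @startdate = GETDATE();   -- in the real procedure this comes from the existing function

DECLARE @monthstart datetime, @monthend datetime;
SET @monthend   = DATEADD(month, DATEDIFF(month, 0, @startdate), 0);   -- first day of the current month
SET @monthstart = DATEADD(month, -1, @monthend);                       -- first day of the previous month

-- add the most recent complete month from staging
INSERT INTO B (DATA_MONTH_KEY /* , other columns */)
SELECT A.DATA_MONTH /* , other columns */
FROM A
WHERE A.DATA_MONTH >= @monthstart AND A.DATA_MONTH < @monthend;

-- drop the oldest month so the window stays at 13 months
DELETE FROM B
WHERE DATA_MONTH_KEY < DATEADD(month, -13, @monthend);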



View 7 Replies View Related

SQL Management Express - Loading A Lot Of Data

Oct 5, 2006

Hello, I'm trying to load a lot of data into a SQL Server Express database using SQL Server Management Studio Express. How can I import data? Because Express doesn't come with BIDS (the new DTS), I can't create a package to do the import. How can I do it? Thanks.

View 1 Replies View Related

Loading Data Into Columns 75 - N In Tables?

Jun 23, 2004

I'm having a very irritating time trying to migrate data from a COBOL system to SQL Server.

One of the A/R Master files has approx. 200 columns.

I can export this file any number of ways that will (sometimes) load partially into my database, but always when the load succeeds, columns 75 through N simply contain NULL, even though there is data in the file. When the load fails in DTS, the error is always missing column delimiter. Using BULK INSERT the error is always something like data too long at column 75.

Putting all this together, I have deduced that something isn't working if I try
to load a staging table with more than 74 columns of data. This seems like a way-too-low threshold for a robust database, especially since SQL Server is supposed to be able to handle up to 1,024 columns per table.

Has anyone ever encountered this problem?

Thanks in advance for any help

View 2 Replies View Related

Node ID - Loading Hierarchical Data

Aug 3, 2013

CREATE TABLE #Source
(
Id int identity(1,1)
,category int
,Leaf_Node_code varchar(10) --
,Level1_Name varchar(20)
,Level2_Name varchar(20)

[Code] ....

Here category 1 has 3 levels, category 2 has 4 levels, and category 3 has 5 levels.

Below is the target table. Here Leaf_Node_code should be populated only for the leaf nodes of each category. I need to populate Node_id with hierarchical data.

I am unable to frame a SQL query to handle the different levels; in future, #Source may have more levels.

How do I handle multiple hierarchy levels? Here only the leaf nodes should have a Leaf_Node_code.

CREATE TABLE TARGET_TABLE
(
ID INT IDENTITY(1,1) primary key
,Node_id HIERARCHYID
,category int
,Parent_id int references TARGET_TABLE(id)
,Leaf_Node_code varchar(10)
,Name varchar(20)
)

Here is the expected output:

ID  category  Parent_id  Leaf_Node_code  Name   Node_id
1   1         NULL       NULL            World
2   1         1          NULL            Asia
3   1         2          101             India
4   2         NULL       NULL            a
5   2         4          NULL            aa
6   2         5          NULL            aaa
7   2         6          102             aaaa
8   3         NULL       NULL            b
9   3         8          NULL            bb
10  3         9          NULL            bbb
11  3         10         NULL            bbbb
12  3         11         103             bbbb
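One way to populate Node_id without hard-coding the number of levels is to insert the rows first (ID and Parent_id as in the expected output above), then derive the hierarchyid from Parent_id with a recursive CTE. A sketch against the TARGET_TABLE definition above; it builds the canonical '/1/2/3/' path per row and casts it to hierarchyid, so it works for any depth:

;WITH h AS
(
    SELECT ID, Parent_id,
           CAST('/' + CAST(ID AS varchar(10)) + '/' AS varchar(4000)) AS NodePath
    FROM TARGET_TABLE
    WHERE Parent_id IS NULL                               -- roots
    UNION ALL
    SELECT c.ID, c.Parent_id,
           CAST(p.NodePath + CAST(c.ID AS varchar(10)) + '/' AS varchar(4000))
    FROM TARGET_TABLE AS c
    JOIN h AS p ON c.Parent_id = p.ID                     -- walk down level by level
)
UPDATE t
SET Node_id = CAST(h.NodePath AS hierarchyid)
FROM TARGET_TABLE AS t
JOIN h ON h.ID = t.ID;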

View 1 Replies View Related

Loading Data From Xml To Sql Server 2005

Aug 13, 2007

Please find the code below (which I am using).
1)
CREATE TABLE rssFeeds( feedXML XML)
SELECT * FROM RSSFEEDS
DECLARE @xmlDoc XML
SET @xmlDoc = ( SELECT * FROM OPENROWSET ( BULK 'C:\Test.xml', SINGLE_CLOB ) AS xmlData)

2)
INSERT INTO rssFeeds (feedXML) VALUES (@xmlDoc)

I am able to get the contents of the XML file into SQL Server, but only into one field of the table. Now I am asked to make a table whose schema is the same as the XML file's. How do I proceed? Please let me know some URL for this.

Regards,
Ashish Johri
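Once the XML sits in the feedXML column, it can be shredded into a relational table with the nodes() and value() methods. A sketch that assumes a standard RSS layout (/rss/channel/item with title, link, pubDate); adjust the paths and types to the real schema:

SELECT
    item.value('(title/text())[1]',   'nvarchar(200)') AS title,
    item.value('(link/text())[1]',    'nvarchar(400)') AS link,
    item.value('(pubDate/text())[1]', 'nvarchar(50)')  AS pubDate
INTO dbo.FeedItems                                      -- creates the new table from the result
FROM rssFeeds
CROSS APPLY feedXML.nodes('/rss/channel/item') AS F(item);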

View 1 Replies View Related

Odbcbcp Error Loading Data

Sep 5, 2007

I'm receiving the error, "The Microsoft Jet database engine could not find the object 'tri_actfile.txt'. Make sure the object exists and that you spell its name and the path name correctly."

How do I call a table name in sql after the -Q?


SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO


alter Procedure dbo.sp_cmdshell_test
(@ClientAbbrev nchar(4), @FileDate nvarchar(8), @SessionId nvarchar(50) )
AS

DECLARE @DataUpdates nvarchar(255)
DECLARE @Common nvarchar(255)
DECLARE @UServer nvarchar(255)
DECLARE @FullFileName nvarchar(68)
DECLARE @FileName nvarchar(64)
DECLARE @FileExt nvarchar(3)
DECLARE @SQL varchar(3500)
DECLARE @triFileName nvarchar(255)
Declare @DataArchive nvarchar(255)

Declare @Ip_FileName nvarchar(255)
Declare @Ip_Active char(1)




Set @UServer = '\AO3D$'

select @Common = sd_value from Msystem.dbo.Msysdata WHERE SD_Property = 'CommonDir'
select @DataUpdates = sd_value from Msystem.dbo.Msysdata WHERE SD_Property = 'DataUpdate'



Declare ActiveFile Cursor For
Select Ip_FileName, Ip_Active from tbl_InputFiles
where Ip_Active = 'Y'

Open ActiveFile
Fetch Next From ActiveFile
Into @Ip_FileName, @Ip_Active


While @@Fetch_Status = 0
Begin


Exec('truncate table tri_'+ @Ip_FileName)

Set @FullFileName = @Ip_Filename + '.csv'

Set @sql = 'master..xp_cmdshell ''' + @UServer + @Common + 'scriptsodbcbcp -D CCBM_TEST -i ' + @UServer + @DataUpdates + '' + @ClientAbbrev + '' + @FullFileName + ' -Q tri_' + @Ip_FileName +''''

Print(@sql)
Exec(@SQL)

Fetch Next From ActiveFile
Into @Ip_FileName, @Ip_Active

End

CLOSE ActiveFile
DEALLOCATE ActiveFile

return


GO
SET QUOTED_IDENTIFIER OFF
GO
SET ANSI_NULLS ON
GO

View 1 Replies View Related

Loading Data From Text Files

Aug 4, 2006

In MySQL, I use "LOAD DATA INFILE 'my_path/data_file.txt'" to load data from a plain text file. Of course, the actual statement is a bit more complex once one considers the various options (e.g. comma delimited vs tab delimited, record termination strings, &c.).

My problem is that I have yet to find the equivalent within MS SQL Server. I did find a LOAD statement in T-SQL, but at first glance it seems to do something completely different.

How does one normally load data from a plain text file into a table in MS SQL? This needs to be relatively efficient since, once in production, it will be used to load tens of megabytes of data into the database (a feed from a data provider). Is it flexible enough to allow me to specify whether the fields are tab delimited vs comma delimited, optionally enclosed by quotes, record termination characters, &c.?

All I really need is direction to the right part of the T-SQL reference (MS SQL Server 2005). Anything else, such as examples, is icing on the cake.

Thanks
Ted
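The statement to look up in the SQL Server 2005 Books Online is BULK INSERT (bcp.exe and OPENROWSET(BULK ...) are the related utilities). It handles field and row terminators directly; optionally quote-enclosed fields are the one gap, and generally need a format file or a pre-processing step. A sketch for a tab-delimited feed, with assumed table and path names:

BULK INSERT dbo.ProviderFeed
FROM 'D:\feeds\provider_feed.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,        -- skip a header row, if there is one
    BATCHSIZE       = 50000,    -- commit in chunks on large feeds
    TABLOCK                     -- table lock for faster bulk loading
);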

View 2 Replies View Related

Loading Data From Xml Http Stream

May 8, 2007

Hi, I have a problem which I don't entirely know how to tackle: essentially, is it possible to query a web service (via HTTP) using SQL Server 2000, and then import that data into the database?

I have seen many posts on OPENXML and SQL Server's bulk load facilities, but nobody seems to mention whether you can open an HTTP stream and read the XML in from there.

Any help would be greatly appreciated.

Thanks.
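On SQL Server 2000 one workaround is the OLE Automation procedures with MSXML, then OPENXML to shred the response. A sketch only; the URL, element names, and target table are placeholders, the OLE Automation procedures must be enabled, and the varchar(8000) limit means large responses need a different approach (e.g. pulling the file in a DTS step instead):

DECLARE @obj int, @xml varchar(8000), @doc int;

EXEC sp_OACreate 'MSXML2.ServerXMLHTTP', @obj OUT;
EXEC sp_OAMethod @obj, 'open', NULL, 'GET', 'http://example.com/service?query=1', 'false';
EXEC sp_OAMethod @obj, 'send';
EXEC sp_OAGetProperty @obj, 'responseText', @xml OUT;   -- truncated at 8000 characters on SQL 2000
EXEC sp_OADestroy @obj;

-- shred the response with OPENXML (element names assumed)
EXEC sp_xml_preparedocument @doc OUTPUT, @xml;
INSERT INTO dbo.ImportedRows (Id, Name)
SELECT Id, Name
FROM OPENXML(@doc, '/rows/row', 2) WITH (Id int, Name varchar(100));
EXEC sp_xml_removedocument @doc;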

View 3 Replies View Related

Loading Data Into SQL Server From A SAS Dataset

Jul 20, 2005

Hi,

SQL Server 2000 SP3. Has anyone ever successfully loaded data into SQL Server from a SAS dataset? I have tried using DTS and the SAS OLE DB drivers but get the following error:

Error Description: A provider specific error occurred (%1:%2)
Context: Error calling GetRowSet to get DBSCHEMA_TABLES schema info. Your provider does not support all the schema rowsets required by DTS.

It does seem to me to be a problem with the OLE DB providers, but if anyone has seen this issue with DTS previously, let me know. Any help is appreciated.

Thanks in advance,
Reg

View 1 Replies View Related

Loading Huge Volume Data

May 31, 2007

Hi, good morning to all,

My day started with loading a huge volume of data, and my data flow task failed to do so.

My data flow has a flat file connected to an OLE DB target. This is a one-to-one mapping. My source file contains 50 lac (5 million) records and is 500 MB in size.

I'm processing the data with all the default buffer settings. I have 4 CPUs on my server.

The system process DTSDebug.exe is using more than 2 GB of memory. My average CPU usage is around 70%, with one of the CPUs hitting 100% utilization.

I'm very new to SSIS, so please give me some info on how to set my buffers. Is there any PDF on performance and tuning for SSIS?

Also, is there any bulk load transformation in SSIS to load into DB2 UDB?

If so, how do I get it installed?

Thanks in advance,

Suresh N

View 2 Replies View Related