Why Is My Data Import Going So Slowly?

Jul 20, 2005

Hi group,

I have a 175MM record table, with a record length of 200 bytes (about 20
columns).

Sometimes when I run a very simple DTS package to import our monthly text file
of new records (about 10 million rows), it really flies, finishing in 45
minutes to an hour.

However, sometimes it takes forever, running 6 or 7 hours before finishing.

When it takes forever I run sp_who and don't see any blocking
processes...

To ensure that things move as quickly as possible I always drop the
primary key and the indexes before importing, so it shouldn't be getting
tied up trying to update the indexes.

What sort of things could be holding up what I would expect to be a very
simple process of appending the records to the end of the current
table...?
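
For anyone hitting the same pattern, two common culprits for an intermittently slow bulk append are transaction log autogrow and waits that sp_who doesn't surface. A diagnostic sketch (the spid value is a placeholder for the session running the import):

-- How full is the transaction log? Repeated autogrows during a
-- 10-million-row load can easily add hours to the run.
DBCC SQLPERF(LOGSPACE);

-- What is the import session actually waiting on?
-- (sp_who shows blocking, but not I/O, log, or file-growth waits.)
SELECT spid, waittype, lastwaittype, waittime, physical_io, cpu
FROM master..sysprocesses
WHERE spid = 57   -- placeholder: the spid of the import

If the log (or data file) is growing during the run, pre-sizing it, or loading under the Bulk-Logged recovery model, is worth trying.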

Thanks,

Warren Wright
Experian-Scorex




Do We Always Have To Use The Slowly Changing Dimension (SCD) Component In The Data Flow For Loading Table Data?

Feb 28, 2008

Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse to handle changed rows?
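
No; for straightforward Type 1 (overwrite) behaviour, a set-based alternative against a staging table does the same job without the SCD component. A sketch, with placeholder table and column names:

-- Overwrite changed attributes in place
UPDATE d
SET d.CustomerName = s.CustomerName,
    d.City = s.City
FROM dbo.DimCustomer AS d
JOIN dbo.StagingCustomer AS s ON s.CustomerBK = d.CustomerBK
WHERE d.CustomerName <> s.CustomerName
   OR d.City <> s.City;

-- Insert business keys not seen before
INSERT INTO dbo.DimCustomer (CustomerBK, CustomerName, City)
SELECT s.CustomerBK, s.CustomerName, s.City
FROM dbo.StagingCustomer AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.DimCustomer AS d
                  WHERE d.CustomerBK = s.CustomerBK);

(The <> comparisons above ignore NULLs; nullable attribute columns need ISNULL wrappers.)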
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,


Sql Server 2005 Loads Data Slowly

Oct 1, 2007

This is the code i use.

Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset

Private Sub Form_Load()
    Set cnn = New ADODB.Connection
    cnn.ConnectionString = "driver={SQL Server};" & _
        "server=SCHS-SQL;uid=sa;pwd=sa;database=Library"
    cnn.Open

    Call loadrst
End Sub

Public Sub loadrst()
    Set rst = New ADODB.Recordset
    Dim sql1 As String
    sql1 = "select * from Books order by srno"

    ' adOpenDynamic asks for a server-side dynamic cursor, which can cost
    ' a round trip per row while the grid loads. For a read-mostly grid,
    ' a client-side static cursor is usually far faster over the network:
    '     rst.CursorLocation = adUseClient
    '     rst.Open sql1, cnn, adOpenStatic, adLockBatchOptimistic, adCmdText
    rst.Open sql1, cnn, adOpenDynamic, adLockOptimistic, adCmdText

    If rst.EOF = True Then
        MsgBox ("No records are present")
        Command1.Enabled = False
    Else
        Call display          ' fills the grid (defined elsewhere)
        Command1.Enabled = True
    End If
End Sub



This is the code I use to connect my VB6 application to SQL Server 2005. I started using SQL Server instead of Access only recently. So far none of the programs have had problems, since their databases hold at most 120 records, but the one this code connects to has about 5,200. I imported the tables from Access into SQL Server; the database was around 17.67 MB, so I shrank it down to 4 MB. It still takes roughly 2 minutes for the user to see the records in the grid. Could you tell me what to do?
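
If changing the cursor settings (see the comments in the code above) doesn't fix it, it's also worth making sure the ORDER BY has an index to work with; a sketch, assuming srno has no index yet:

CREATE CLUSTERED INDEX IX_Books_srno ON dbo.Books (srno);

With only 5,200 rows, though, the server-side dynamic cursor is the more likely culprit.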

Sheldon


Data Warehousing :: Foreign Key Relationship Between Two Slowly Changing Dimensions?

Apr 20, 2015

Should we have foreign key relationship between two slowly changing dimensions?

Most of the cases we've talked about cover how to build a single SCD; is there any example of two SCDs with an FK relationship in a snowflake schema?
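
For what it's worth, when both dimensions are Type 2 the usual snowflake pattern has the child reference the parent's surrogate key rather than its business key; a DDL sketch (all names are placeholders):

CREATE TABLE dbo.DimRegion (
    RegionSK int IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    RegionBK varchar(20) NOT NULL,           -- business key
    RegionName varchar(100),
    StartDate datetime NOT NULL,
    EndDate datetime NULL                    -- NULL = current version
);

CREATE TABLE dbo.DimStore (
    StoreSK int IDENTITY(1,1) PRIMARY KEY,
    StoreBK varchar(20) NOT NULL,
    RegionSK int NOT NULL REFERENCES dbo.DimRegion (RegionSK),
    StartDate datetime NOT NULL,
    EndDate datetime NULL
);

The trade-off: a Type 2 change in the parent can cascade new versions into the child, which is why some designs store the parent's business key instead and resolve the current parent row at query time.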


SQL Server Import And Export Wizard Fails To Import Data From A View To A Table

Feb 25, 2008

A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and runs successfully on the server.
I want to import the data the view produces into a table using the SQL Server Import and Export Wizard.
However, when I run the wizard on the server, it gives me the following error message and stops at the step Setting Source Connection:


Operation stopped...

- Initializing Data Flow Task (Success)

- Initializing Connections (Success)

- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
(SQL Server Import and Export Wizard)

Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)


- Setting Destination Connection (Stopped)

- Validating (Stopped)

- Prepare for Execute (Stopped)

- Pre-execute (Stopped)

- Executing (Stopped)

- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)

- Post-execute (Stopped)

Has anyone encountered this problem before, and does anyone know what is happening?
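
If the wizard keeps failing at the source connection, one workaround is to skip it entirely and populate the table with plain T-SQL on the server; a sketch, assuming the target table's columns line up with the view's output:

-- Copy the view's output into the existing report table
INSERT INTO [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No]
SELECT * FROM dbo.Viw_Labour_Cost_By_Service_Order_No;

-- Or create the table from scratch:
-- SELECT * INTO dbo.Report_Labour_Cost_By_Service_Order_No
-- FROM dbo.Viw_Labour_Cost_By_Service_Order_No;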

Thanks for any kind reply.

Best regards,
Calvin Lam


Import Data From MS Access Databases To SQL Server 2000 Using The DTS Import/Export

Oct 16, 2006

I am attempting to import data from Microsoft Access databases to SQL Server 2000 using the DTS Import/Export Wizard, and I am getting a few errors.

Error at Destination for Row number 1. Errors encountered so far in this task: 1.
Insert error column 152 ('ViewMentalTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 150 ('VRptTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 147 ('ViewAppTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 144 ('VPreTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 15 ('Time', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
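
These DBTYPE_DBTIMESTAMP overflows usually mean the Access columns hold dates outside SQL Server's datetime range (before 1753-01-01) or non-date text. One way to find the offending rows is to land the columns as text in a staging table and validate them before converting; a sketch (staging table and column names are placeholders):

-- Rows whose value will not cast to datetime
SELECT ID, VRptTime
FROM dbo.StagingTable
WHERE VRptTime IS NOT NULL
  AND ISDATE(VRptTime) = 0;

-- Convert only the valid values; the rest become NULL for review
SELECT ID,
       CASE WHEN ISDATE(VRptTime) = 1
            THEN CAST(VRptTime AS datetime)
       END AS VRptTime
FROM dbo.StagingTable;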

Could you please look into this and guide me?
Thanks in advance
venkatesh
imtesh@gmail.com


IMPORT New Data Since Last IMPORT - DTS/Stored Procs?

Jan 7, 2004

Hello:

I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here because ultimately I will need this backend data for my frontend .aspx pages:

On a weekly basis, I need to IMPORT some data located on a remote Oracle DB into SQL Server 2000. Since there is so much data to transfer, I would like to transfer only the data that is new to the table since the last IMPORT (i.e. a week ago) and leave behind the OLD data.

Is DTS the correct way to go, or do I get more control by combining DTS with STORED PROCEDURES? Does anyone have any good references for me?

On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement an UPDATE TRIGGER on the first table?
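
One common pattern is to keep a watermark of the last successful load, filter on it when pulling from Oracle, and capture the audit rows in the same batch instead of relying on a trigger; a sketch using a linked server (the linked server name, tables, columns, and criteria are all placeholders):

DECLARE @LastLoad datetime;
SELECT @LastLoad = LastLoadDate FROM dbo.LoadControl;

-- Pull only rows added since the last import
INSERT INTO dbo.OracleImport (Id, Amount, CreatedDate)
SELECT src.Id, src.Amount, src.CreatedDate
FROM OPENQUERY(ORACLE_LINK,
     'SELECT Id, Amount, CreatedDate FROM remote_schema.remote_table') AS src
WHERE src.CreatedDate > @LastLoad;

-- Export the newly acquired rows matching the audit criteria
INSERT INTO dbo.ImportAudit (Id, Amount, CreatedDate)
SELECT Id, Amount, CreatedDate
FROM dbo.OracleImport
WHERE CreatedDate > @LastLoad
  AND Amount > 10000;   -- placeholder criteria

UPDATE dbo.LoadControl SET LastLoadDate = GETDATE();

Note that OPENQUERY cannot take a variable inside the pass-through text, so the date filter above is applied after the rows cross the link; for very large source tables the filter should be embedded in the query string with dynamic SQL so Oracle does the filtering.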

Any advice will be greatly appreciated!


Import Data From Excel-Sheet Via OleDb In VB.Net - How To Get A Columns Data As String?

Oct 25, 2007

Hello,

I want to import data from an Excel sheet into a database. While reading from the Excel sheet, OleDb automatically guesses the data type of each column. My problem is the first column (A), which contains ~240 lines: 210 lines are numbers, the last 30 contain strings. When I use this code:







Dim sConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
    conf_path_current & file_to_import & _
    ";Extended Properties=""Excel 8.0;HDR=NO;IMEX=1"""
    ' IMEX=1 makes the driver treat mixed-type columns as text, so the
    ' 30 string cells no longer come back as DBNull
Dim oConn As New OleDb.OleDbConnection(sConn)
oConn.Open()   ' the connection must be opened before ExecuteReader
Dim cmd1 As New System.Data.OleDb.OleDbCommand("Select * From [Table$]", oConn)
Dim rdr As OleDb.OleDbDataReader = cmd1.ExecuteReader
Do While rdr.Read()
    ' Convert.ToString returns "" for DBNull instead of throwing
    Console.WriteLine(Convert.ToString(rdr(0)))
Loop




it continues to read fine until the string lines arrive.
When using Item(0), it crashes trying to convert a DBNull to a String; when using rdr(0).ToString() it just gives me no value.

So my question is: how do I tell OleDb that I want that column to be read completely as String/varchar?

Thanks for Reading

- Pierre from Berlin


[seems i got redirected into the wrong forum, please move into the correct one]


Exported Flat File Data Will Not Import To Same Table Without Extensive Data-type Manipulation

Jul 13, 2007

I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."



Seems simple, right?



I'm hitting all sorts of surprising data conversion errors. I used the export wizard to create the export package. This works fine. However using the same flat file definition, the import package fails -- even when I have no destination. That is I have just one data flow task that contains only one control: the Flat File source. When I run the package the flat file definition fails with data type conversion and truncation errors. One of the obvious errors is for boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, the output of the data are literal text values "TRUE" and "FALSE". So SSIS converts a sql datatype of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?



Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
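
One workaround is to stop asking the flat-file source to parse the booleans at all: declare the column as a string (DT_STR) in the flat file connection, land the file in a staging table, and convert in T-SQL; a sketch (table and column names are placeholders):

INSERT INTO dbo.TargetTable (Id, IsActive)
SELECT Id,
       CASE Flag WHEN 'TRUE' THEN 1
                 WHEN 'FALSE' THEN 0
       END   -- implicitly cast to the bit column
FROM dbo.StagingTable;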




How To Optimize Data Import With Huge Volumes And Joins Across Data Sources Not All SQL Server Based?

Jun 7, 2006

I need to periodically import a (HUGE) table of data from an external data source (not SQL Server) into SQL Server, with the following scenarios:

1. Some of the records in the external data source may not exist in SQL Server.
2. Some of the records in the external data source may have a different value at different imports, but these records are identified uniquely by the same primary key in the external data source and in SQL Server.
3. Some of the records in the external data source may be the same as in SQL Server.

Due to the massive volume of the import, I would like to import only the records which are different from what I have in SQL Server (cases 1 and 2 above). In fact case 2 is the most critical.

I thought of making a query with a left outer join between the data in the external data source table (SOURCE) and the data in the SQL Server table (DESTIN). The join is done on the respective primary keys (composed keys of up to 10 columns) and one of the WHERE conditions will be that the value in SOURCE is different from the value in DESTIN.

The result of this query would be exactly what I need to import.
How to do this in SSIS??? I couldn't figure out how to join tables in different data sources yet.

In fact I cannot write a stored procedure to do it directly, since one of the sources is not a SQL Server data source.
I have seen the Lookup transformation in this article http://www.sqlis.com/default.aspx?311 but this is not exactly what I want to do.
Another possibility is to use the Merge Join, but due to the sorting I believe its performance would be terrible!
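
If the external rows can be landed in a staging table first (even with a plain bulk copy), the delta query described above is straightforward; a sketch with a single-column key (the real keys are composite, up to 10 columns, so each key column goes into the ON clause):

-- Cases 1 and 2: rows that are new, or whose value changed
SELECT s.*
FROM dbo.Staging AS s
LEFT OUTER JOIN dbo.Destin AS d ON d.KeyCol = s.KeyCol
WHERE d.KeyCol IS NULL             -- case 1: not in SQL Server yet
   OR s.ValueCol <> d.ValueCol;    -- case 2: changed since last import

Nullable value columns need ISNULL wrappers (or an explicit IS NULL check), otherwise changed-to-NULL rows are silently dropped.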

Thanks in advance for your suggestions!


Where Can I Find The Import Data/Export Data Options If I Only Have SQL Server Management Studio Express?

Oct 4, 2007



Hi all,

It looks like these options are only available in the full SQL Server Management Studio? I installed SQL Server Management Studio Express and I can't even find DTSWizard.exe on my machine.

Can you please help: how can I import data from Excel, or where can I download the full SQL Server Management Studio?

Your prompt response is greatly appreciated.

Thanks!!
Tram


Memo Data Type Import Error While Importing Data From Access File Into SQL Server 2005

Sep 10, 2007

I have one column in SQL Server 2005 of data type VARCHAR(4000).

I exported SQL Server 2005 database data into an mdb file. After the export, the above column's data type was converted to Memo in the Access database.

Now, when I try to import the data from this MS Access file (db1.mdb) into another SQL Server 2005 database, I get a Unicode conversion error on the Memo data type in the Export/Import data wizard.

Could you please let me know the reason? I understand that the Memo data type is not supported in SQL Server 2005.

I am on SQL Server 2005 Standard Edition with SP2.

Please help me understand this issue correctly.
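
One workaround is to bypass the wizard's type mapping and cast the Memo column back explicitly while reading the mdb through the Jet provider; a sketch (the file path, table, and column names are placeholders, and ad hoc distributed queries must be enabled on the server):

SELECT Id,
       CAST(MemoCol AS nvarchar(4000)) AS MemoCol   -- back to the original type
INTO dbo.ImportedTable
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\db1.mdb'; 'Admin'; '',
                'SELECT * FROM SomeTable') AS src;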


Transact SQL :: Excel Data Import Truncate Data Length To 255

Jul 29, 2015

I am trying to import data from an Excel sheet into a SQL database using OPENROWSET. After the import I found that every cell holding more than 255 characters was truncated to 255. Researching it, I found that values longer than 255 characters need to appear in the first 8 rows of the Excel sheet (the driver's type-sampling window); that worked for me too, but in the real scenario I can't do that manual work on the Excel file. I tried a .NET utility and an SSIS package as well, and the truncation is still there.

INSERT INTO tmp_Test
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
     -- IMEX=1 makes mixed-type columns come through as text; widening the
     -- 8-row sampling window is controlled by the TypeGuessRows registry
     -- value (0 = scan all rows) under the ACE engine's Excel key.
     'Excel 12.0;Database=D:\Book1.xlsx;IMEX=1',
     [Sheet1$])


Import Data Where Current Data Has Unique Identity

Aug 10, 2000

I have data for an online catalogue in SQL 7.0. The web programmer asked me to add a unique key for reference. I used an int data type with an identity seed of 1 and an increment of 1. This works fine, BUT when I try to import new data I get an error because the csv file has no column, and therefore no value, for the unique field, which will not allow NULL by definition.

How can I maintain a unique field to act as primary key in my data when
I want to add (and delete) data that doesn't have this field?

I tried adding a uniqueidentifier field but this gives an error message.

The only workaround is to delete the unique field altogether, add the new data, and afterwards create a new unique field. At 600,000+ lines of data, this is time- and memory-consuming.
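
One way around this is to bulk load into a staging table that matches the csv exactly (no identity column), then append, letting the identity value generate itself; a sketch (file path and names are placeholders):

BULK INSERT dbo.CatalogueStaging
FROM 'C:\import\catalogue.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- The identity column is simply omitted from the column list;
-- SQL Server fills it in on insert
INSERT INTO dbo.Catalogue (ItemCode, Description, Price)
SELECT ItemCode, Description, Price
FROM dbo.CatalogueStaging;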

Any help appreciated,Thanks, Keith


Import Data From Excel Data Sheets

Jan 6, 2007

Hello

Looks like I'm the first one to post in this forum!

But I'm a forward guy, so let's get to the problem.


A problem with my replication system has occurred.

I have a working SQL Server that can replicate through the internet; everything works.

The problem is when I try to import a large amount of data (10,000 rows) into my database on the SQL Server:

the subscriber on my client doesn't get the rows. That is to say, the imported data is not being replicated;

only rows that I have manually inserted get replicated.

I used the import wizard that came with SQL Server.

Is there a solution to this problem?
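
If this is merge replication, the likely cause is that merge tracking relies on triggers, and bulk/fast-load imports skip triggers by default, so the wizard's rows are invisible to the agent. Importing with triggers enabled may help; a sketch (table and file names are placeholders):

BULK INSERT dbo.MyTable
FROM 'C:\import\newrows.txt'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRE_TRIGGERS);   -- let the replication tracking triggers fire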


Import Excel Data - Missing Data

Mar 4, 2008

Hello all, I am using the Import Wizard to pull in data from an Excel spreadsheet. One column in particular SQL Server sees as a float data type, but it contains varchar data, so I change this in the wizard. Yet some of those values are missing when I SELECT * FROM Sheet1$ in SQL Server 2005. Any ideas why this would happen? I have formatted the particular column as text in Excel.



Why Would This Query Run Slowly?

Jan 20, 2005

Why could the following query be going slowly (over 30 minutes and running)? The ID is the PK and clustered index on the table. The estimated execution plan is a basic clustered index seek.


select * from recipients where id = 3006392307
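
One thing worth checking: 3006392307 is larger than int's maximum (2,147,483,647), so if id is an int column the literal is typed as numeric and the comparison can force a conversion instead of a clean seek. If id is bigint, casting the literal makes the types match explicitly; a sketch:

select * from recipients where id = cast(3006392307 as bigint)

If the plan really is a seek, it's also worth running sp_who2 to rule out the session simply being blocked or waiting on I/O.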


Astonishingly Slow

Apr 12, 2007

hi everyone,



Primary platform: XP SP2, 2 GB RAM. SP2 for SQL Server 2005 not applied yet.

Does anyone have any tips on how to speed up development when you have plenty of tasks? My dtsx has several loop containers (For Each) and ten tasks, split between SQL and Script tasks.

Indeed, it is difficult to work this way.



TIA,


COUNT And COUNT_BIG Are Too Slow, What To Do?!

Mar 31, 2007

hello..
I used COUNT_BIG to get the number of rows but it was too slow.
I have 100,000 rows in the table.

select COUNT_BIG (*) from topicstbl where partID like '" & PagePartID & "'"

The first execution of this query took 5 seconds; after that it took 1 second.

That means when the row count reaches 1 million it will take 10 seconds or more!

What do I have to do?
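
Two things usually help here: an index on the filter column, and an equality predicate instead of LIKE with string concatenation, so the index can actually be used (and the query can be parameterized); a sketch:

CREATE INDEX IX_topicstbl_partID ON dbo.topicstbl (partID);

SELECT COUNT_BIG(*)
FROM dbo.topicstbl
WHERE partID = @PagePartID;   -- parameter instead of a concatenated LIKE

With the index in place, the count touches only the narrow index pages rather than the whole table, so the time should grow much more slowly with row count.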


Slowly Changing Dimensions

Jan 29, 2007

I am new to SSIS and I am investigating using the Slowly Changing Dimension transform.

The data source that I receive is a daily snapshot of the external source system table. I need to store the history of the entity attributes (Type 2 SCD) and I am using the Start / End Date mechanism.

When an entity (identified by the business key) is no longer received in the source snapshot, I would like the data flow to update the End Date of the current row to show that the entity has now expired.

Does anyone have any suggestions for a good way to achieve this ?

NB: Changing the source system extract to include and flag expired entities is not an option for me.
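
One common pattern is an Execute SQL task after the data flow that expires every current row whose business key did not arrive in today's snapshot, using the staged snapshot as the reference; a sketch (table and column names are placeholders):

UPDATE d
SET d.EndDate = GETDATE()
FROM dbo.DimEntity AS d
WHERE d.EndDate IS NULL            -- current rows only
  AND NOT EXISTS (SELECT 1
                  FROM dbo.StagingSnapshot AS s
                  WHERE s.BusinessKey = d.BusinessKey);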

Many thanks

Graham


Report Running Really Really Slowly!!

Jun 11, 2007

Hi, I have a report with 18 cascading report parameters. Each report parameter has a unique dataset which passes the value of the previous parameter into the SQL string. As I select the report parameters, the queries take longer the further down the list I go. I think this is because of the number of WHERE conditions being passed through the SQL query; the last report parameter passes 17 WHERE conditions.

In Access, when I did this, the parameters near the bottom refreshed quicker than the ones at the top - why is it the opposite way round in Reporting Services? Any ideas on how to speed this process up?


About Slowly Changing Dimension

Oct 25, 2007

I have one question regarding the Slowly Changing Dimension component in SSIS. Does the SCD also delete records in the warehouse if they no longer exist in the source, or does the SCD only insert new and update existing records? Can someone explain a little more about inferred members? Thanks.


Slow Seek In Access

Nov 30, 2006

Hi

I have an Access front end and a SQL Server back end.
It works fine, but when I search by name in my customer table
it takes a very, very long time. I use ODBC for the connection.

Is there any way to do this better?
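
If the name column has no index on the SQL Server side, every search scans the whole table across the ODBC link; a sketch (table and column names are placeholders):

CREATE INDEX IX_Customer_Name ON dbo.Customer (CustomerName);

It also helps to make sure the search reaches the server as a WHERE clause (for example via a pass-through query) instead of Access pulling the table down and filtering locally.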



regards

alvin


SQL Server V7 Running Slowly - CPU Hogging

Jul 7, 1999

I've just had a call saying that a SQL Server v7 box on a Compaq 7000, installed about 6 months ago, is now running slowly.
A look at the CPU shows SQL Server grabbing 50% and upwards of the (dual) processors even when the front-end application is not running.
Is there anything I should be checking?
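
A first step is to see which sessions are burning the CPU; on SQL Server 7 the sysprocesses table is the place to look; a sketch:

SELECT spid, status, hostname, program_name, cpu, physical_io
FROM master..sysprocesses
ORDER BY cpu DESC;

-- then, for the top spid:
-- DBCC INPUTBUFFER(<spid>)   -- shows the last statement it ran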

Thanks

Richard


SQL Server Used As A Client Running Slowly

Apr 18, 1999

I have a SQL Server that had Named Pipes, Multiprotocol, IPX/SPX and TCP/IP all loaded. It is running on a 10 Mbps network.
I ran setup and turned off all protocols except TCP/IP to increase speed. This has increased speed on all the workstations, but the server itself now runs very, very slowly.
If anyone out there has any ideas, please let me know.

Thanks in advance
Joe Gentile


Type I Slowly Changing Dimension

May 16, 2008

I have a Type I SCD situation, ie, insert if new (by checking the business ID) and update if any attributes for a given business ID has changed.

The way I usually do this (and I believe this is how most people do it) is to use a LOOKUP task to determine whether the business ID exists in the target table. If it doesn't, then I insert. If the business ID exists, then I bring back the associated attributes and use a CONDITIONAL SPLIT task to compare whether any of the incoming attributes are different. If there are changes, then I update.

In doing this comparison, I often run into situations where I end up comparing a NULL value to something, which does not result in FALSE but in a NULL result. To get around this, I first check for NULLs and convert them into something valid before I do the comparison, but this results in a messy comparison expression, especially if I have to compare a lot of attributes.

So, how do you guys handle this?
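
For what it's worth, when the comparison can be pushed into set-based T-SQL (by staging the source rows first), INTERSECT gives a NULL-safe comparison for free, since set operators treat two NULLs as equal; a sketch (table and column names are placeholders):

UPDATE d
SET d.City = s.City,
    d.Phone = s.Phone
FROM dbo.DimCustomer AS d
JOIN dbo.Staging AS s ON s.CustomerBK = d.CustomerBK
WHERE NOT EXISTS (SELECT d.City, d.Phone
                  INTERSECT
                  SELECT s.City, s.Phone);  -- true only when some attribute differs, NULLs included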

As an alternative, I am looking into the SLOWLY CHANGING DIMENSION TASK, which I also have some questions on, but I would like to first address the above. Thank you.



Package With Slowly Changing Dimension.

Jan 15, 2008

I have a package using the Slowly Changing Dimension transform in the data flow task. It works fine when the number of records is small, but for a large file the package fails with a "Violation of Primary Key" error even though there are no duplicate records in the table.
For example, I have an employee table with a composite primary key comprising Name and Employee Id. I need to do an UPSERT based on the Name and Employee Id combination. I have a file with 100,000 records, and when I try to execute the package it gives a "cannot insert duplicate data" error even though the combination does not exist in the database.
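
When the target genuinely has no duplicates, the usual culprit is the input file containing the same business key twice, so the same batch tries to insert it twice. Worth checking against a staged copy of the file before anything else; a sketch (staging table name is a placeholder):

SELECT Name, EmployeeId, COUNT(*) AS cnt
FROM dbo.StagingEmployee
GROUP BY Name, EmployeeId
HAVING COUNT(*) > 1;   -- any rows here will violate the composite PK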

Please help.

Aashna Behl


Slowly Changing Dimension With 600,000 Rows

Jan 17, 2006

Hi,

We have been using tasks generated by the SCD wizard. Our smaller dimensions (under 30,000 rows) work well, but our Product dimension package is giving us performance problems: 7 hours to process 600,000 rows, of which 80,000 are updates and the rest new inserts. It is built like the smaller dimensions: several columns are Type 1 and issue update statements; several are Type 2, doing updates and inserts.

The package originally had a complicated view as the initial task, but we have since changed it to a SQL command with a variable. The initial read now appears quick, but it chunks through in 10,000-record (actually 9,990, I think) increments and takes the 7 hours (previously we never let it finish). The package is pretty basic now: read a source, a small Derived Column and Data Conversion, a small Lookup (30,000 rows, cached) for a description, then the SCD.

Before I start replacing what the SCD generates with stored procedures, does anyone have suggestions as to what the issue might be? We believe we have increased the number of Type 2 columns, and the SCD definitely has more to do than just an insert or update, but 7 hours for 600,000 records seems excessive. Interestingly, the source task never turns green. Previously, with a Merge Join, the read completed and bottlenecked at a Sort and the Merge Join; now that has been removed and simplified, and all tasks stay yellow, with the chunks appearing at the source and then at the SCD before the next chunk is read. This is on the general release (not the beta). Thanks in advance!
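
For what it's worth, the usual fix is the one being considered: the SCD transform pushes every update through an OLE DB Command, one round trip per row (80,000 of them here). Redirecting the changed rows to a staging table and running one set-based UPDATE afterwards typically collapses hours into minutes; a sketch (table and column names are placeholders):

-- After the data flow has landed changed rows in dbo.ProductUpdates:
UPDATE d
SET d.Description = u.Description,   -- Type 1 columns
    d.Category = u.Category
FROM dbo.DimProduct AS d
JOIN dbo.ProductUpdates AS u ON u.ProductBK = d.ProductBK;

TRUNCATE TABLE dbo.ProductUpdates;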


Why Does Report Manager Open So Slowly

Apr 22, 2007

I am using SQL Server 2005 Developer Edition and have created several reports, deployed them to SQL Server and tested them.

Everything works correctly, but when I open the Report Manager using localhost/Reports, the page takes several minutes to connect and open. I am doing this on a new laptop I use for development and testing, so I can see where it would be a little slow, but a few minutes seems extreme. I don't have any problem with SQL Server itself, nor with VS 2005.

Are there any tips or setup issues which might affect the startup speed?

Many thanks
Mike Thomas


Slowly Changing Dimension [58] Error

Sep 21, 2006

I am using the SCD to insert or update. My source and destination tables are Oracle and I am using the Oracle OLE DB provider. I am getting the following error while executing the package; what could the solution be?

[Slowly Changing Dimension [58]] Error: An OLE DB error has occurred. Error code: 0x80040E5D. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80040E5D Description: "Parameter name is unrecognized.".


Thanks

Jegan


Parent ID In A Slowly Changing Dimension

Dec 7, 2006

Hi There,

Just wondering if any of you have implemented a (Kimball Type 2) dimension structure in which a ParentID column points to a record in the same dimension table, using the SCD objects in SSIS. The ParentID column would have to be "historical".

The challenge is that you would need to go through the table twice somehow, because if I did a lookup of the parent record in the first run, I wouldn't be sure I got the right parent record.
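
The two-pass idea can be kept inside one package by loading the rows with the parent's business key only, then resolving the surrogate key in a second step once all current rows are in place; a sketch (table and column names are placeholders):

-- Pass 2: point each current row at its parent's current surrogate key
UPDATE c
SET c.ParentSK = p.EntitySK
FROM dbo.DimEntity AS c
JOIN dbo.DimEntity AS p
     ON p.EntityBK = c.ParentBK
    AND p.EndDate IS NULL          -- the parent's current version
WHERE c.EndDate IS NULL
  AND c.ParentSK IS NULL;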

Thnx, Jeroen.


Slowly Changing Dimension Wizard

Jun 6, 2007

I have a Company Dimension table that is built from various sources. One source provides address information, another provides industry info, etc. I created a historical-load package that pulls all of this together so that I have all the necessary data related to a company in one record. All is well.



Since my company data is coming from various sources, how can I tell the SCD to update certain fields but not others for a Type 2 change? In essence, I would like to "pull forward" the data that was in the original database row and then update it with only the changes coming from the proper source data. For example, if an address changed, I will get the new address from the source but will not have the industry info. I would like to create the new record with the new address but also keep the industry data intact. Is this possible?



Currently I will get the new record with the new address but will have null values for the industry data.
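
The SCD transform won't pull the old row forward by itself, but a post-load step can backfill the columns this source didn't supply from the just-expired version of the same business key; a sketch, assuming a Start/End Date Type 2 mechanism (names are placeholders):

-- Fill industry columns on the new current row from the prior version
UPDATE cur
SET cur.IndustryCode = prev.IndustryCode,
    cur.IndustryName = prev.IndustryName
FROM dbo.DimCompany AS cur
JOIN dbo.DimCompany AS prev
     ON prev.CompanyBK = cur.CompanyBK
    AND prev.EndDate = cur.StartDate   -- the row this one replaced
WHERE cur.EndDate IS NULL
  AND cur.IndustryCode IS NULL;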







Thanks


