How To Speed Up Table Data Transfer Through BCP IN?
Jan 30, 2008
For bcp in:
1. Use a fixed-length format file or a delimited file?
2. Load into a table without indexes, including the primary key?
3. Sort the text file before bcp in (will it speed up index creation after the data upload)?
Which of these points will or will not improve the overall bcp in processing?
Thanks...
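For concreteness, a minimal sketch of a fast load (table, file, and column names are hypothetical); BULK INSERT accepts the same hints that bcp exposes as -h "TABLOCK, ORDER(...)" and -b for batch size:

BULK INSERT dbo.TargetTable
FROM 'C:\load\data.txt'
WITH (
    FIELDTERMINATOR = '|',   -- delimited input; a format file (FORMATFILE = '...') also works
    TABLOCK,                 -- allows minimally logged loads into a heap
    ORDER (id ASC),          -- declare presorted input so a clustered key needs no re-sort
    BATCHSIZE = 100000       -- commit in batches to keep the transaction log in check
);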
May 30, 2008
Is it possible/advisable, when transferring very large amounts of data from server to server, to:
1. transfer the data to a new table first
2. then alter the new table, adding indexes, defaults, etc. based on the original table?
If it is, which flow item would be used to transfer/alter the indexes and defaults?
I'm very new to SSIS, so the more detail you can give the better.
Thanks
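In T-SQL terms the pattern looks like the sketch below (all names are hypothetical); in SSIS the index and constraint statements would sit in an Execute SQL Task placed after the Data Flow Task:

-- Load into an unindexed heap first (e.g. via a linked server source).
SELECT *
INTO   dbo.NewTable
FROM   SourceServer.SourceDb.dbo.OriginalTable;

-- Rebuild constraints and indexes only after the load completes.
ALTER TABLE dbo.NewTable
    ADD CONSTRAINT PK_NewTable PRIMARY KEY CLUSTERED (Id);
CREATE NONCLUSTERED INDEX IX_NewTable_Name ON dbo.NewTable (Name);
ALTER TABLE dbo.NewTable
    ADD CONSTRAINT DF_NewTable_Created DEFAULT (GETDATE()) FOR Created;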
Oct 8, 2015
I have a table with around 100 million rows. Every day an ETL process truncates and reloads the table. Will partitioning the table increase the insert speed?
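Partitioning by itself does not usually make the inserts themselves faster; what it enables is loading into a separate staging table and then switching it in as a metadata-only operation. A hedged sketch, assuming the staging table matches the target's structure and filegroup and that your edition supports SWITCH:

-- Bulk load dbo.BigTable_Stage first, minimally logged if possible.
TRUNCATE TABLE dbo.BigTable;                            -- target must be empty
ALTER TABLE dbo.BigTable_Stage SWITCH TO dbo.BigTable;  -- metadata-only swap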
Jan 14, 2014
What is the best way to transfer data from the staging table into the main table.
Example:
Staging Table Name: TableA_stage (# of rows - millions)
Main Table Name: TableA_main (# of rows - billions)
Note: Staging table may have some data same as the main table.
Currently I am doing:
- Load data into staging table (TableA_stage)
- Remove any duplication of rows from the staging table (TableA_stage)
- Disable all indexes on main table (TableA_main)
- Insert into main table (TableA_main) from staging table (TableA_stage)
- Remove any duplication of rows from the main table using CTE (TableA_main)
- Rebuild indexes on main_table (TableA_main)
The problem with the above method is that it takes a lot of time and the log file grows very big.
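One alternative worth testing is filtering against the main table up front instead of de-duplicating afterwards, and inserting in batches so the log can truncate between commits. A sketch only, assuming a KeyId column identifies a row in both tables:

DECLARE @batch INT = 500000;
WHILE 1 = 1
BEGIN
    INSERT INTO dbo.TableA_main WITH (TABLOCK) (KeyId, Col1, Col2)
    SELECT TOP (@batch) s.KeyId, s.Col1, s.Col2
    FROM   dbo.TableA_stage AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.TableA_main AS m
                       WHERE m.KeyId = s.KeyId);

    IF @@ROWCOUNT < @batch BREAK;  -- done when a batch comes up short
END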
Jun 9, 2013
I have two databases (MYDB1, MYDB2) on two different servers (SERVER1, SERVER2). I want to create a stored procedure in MYDB1 on SERVER1 that gets some data from a table in MYDB2 on SERVER2. How can I do this?
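One common approach, sketched below with a hypothetical table name: register SERVER2 as a linked server on SERVER1, then reference the remote table by its four-part name inside the procedure.

EXEC sp_addlinkedserver @server = N'SERVER2';   -- run once on SERVER1
GO
CREATE PROCEDURE dbo.usp_GetRemoteData          -- created in MYDB1
AS
BEGIN
    SELECT t.Col1, t.Col2                       -- SomeTable and its columns are examples
    FROM   SERVER2.MYDB2.dbo.SomeTable AS t;    -- server.database.schema.table
END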
Nov 15, 2006
I have an Excel sheet whose columns match a table in a SQL database. I want to transfer this data from the sheet to the table from the business logic code layer, not from Enterprise Manager via the wizard. What can I do? Please help, this is urgent.
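One option the application layer can call through an ordinary command is OPENROWSET with the Jet provider. A hedged sketch (file path, sheet name, and columns are examples, and ad hoc distributed queries must be enabled on the server):

INSERT INTO dbo.MyTable (Col1, Col2)
SELECT F.Col1, F.Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\Book1.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]') AS F;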
Nov 26, 2015
I am a newbie to SQL MDS and want to know how to transfer tables from two different SQL databases into MDS. Please suggest the steps to proceed, with any examples.
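A hedged sketch of the usual entity-based staging pattern (the entity name "Customer" and all columns are examples; the stg.Customer_Leaf table and stg.udp_Customer_Leaf procedure are generated by MDS when the entity is created with staging enabled):

INSERT INTO stg.Customer_Leaf (ImportType, ImportStatus_ID, BatchTag, Code, Name)
SELECT 2, 0, 'Batch1', c.CustomerCode, c.CustomerName   -- ImportType 2 = create or update
FROM   SourceDb.dbo.Customers AS c;

EXEC stg.udp_Customer_Leaf                              -- processes the staged batch
     @VersionName = 'VERSION_1',
     @LogFlag     = 1,
     @BatchTag    = 'Batch1';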
Sep 7, 2005
I need to transfer data from one table to another. I will be using the SQL Query Analyzer to do this. This is not a simple transfer of data to identically structured tables; these tables are completely different, for the most part.
For instance, I will be selecting certain fields of one table:
SELECT fldOne, fldTwo FROM someTable
I need to take this information, one row at a time, and input it into a different type of table. So, something like this to insert into the other table:
INSERT INTO otherTable (fld1, fld2) VALUES (value1, value2)
I've looked around for a sample to achieve this, but may have overlooked it. Anyone have a link or a quick sample?
Thanks all,
Zath
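Assuming the source fields map straight onto the target columns, a single set-based INSERT ... SELECT (using the post's own names) avoids looping one row at a time:

INSERT INTO otherTable (fld1, fld2)
SELECT fldOne, fldTwo
FROM   someTable;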
Dec 9, 2007
Hello,
Just requesting for help about transferring data from one table to another.
For example:
In Table1, there are 3 columns:
Col1 Col2 Col3
T1C1R1 T1C2R1 T1C3R1
T1C1R2 T1C2R2 T1C3R2
T1C1R3 T1C2R3 T1C3R3
T1C1R4 T1C2R4 T1C3R4
In the target Table2, Col1 should be a string constant, e.g. "A", while Col2 comes from Col1 of Table1 and Col3 from Col2 of Table1:
Col1 Col2 Col3
A T1C1R1 T1C2R1
A T1C1R2 T1C2R2
A T1C1R3 T1C2R3
A T1C1R4 T1C2R4
I found something like
"INSERT INTO TABLE2 (COL1, COL2, COL3) SELECT COL1, COL4, COL7 FROM TABLE1"
but this code requires that column one of Table1 be selected as well, while column one of the target Table2 requires a constant string.
I also tried this, but it doesn't seem to work:
Code Block
dtTable1 = ds.Tables(0) ' Table1
bSQL.Length = 0
For Each dr In dtTable1.Rows
    bSQL.Append("INSERT INTO Table2 (Col1, Col2, Col3) ")
    bSQL.Append(" VALUES (")
    bSQL.Append("'" & "A" & "',")               ' Constant for column 1 of Table2
    bSQL.Append("'" & dr(0).ToString() & "',")  ' Data from the first column of Table1
    bSQL.Append("'" & dr(1).ToString() & "')")  ' Data from the second column of Table1
    .......
    .......
Next
.....
.....
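For comparison, the same transfer in one set-based T-SQL statement (using the post's table names), with the constant supplied directly in the SELECT list rather than built row by row:

INSERT INTO Table2 (Col1, Col2, Col3)
SELECT 'A', Col1, Col2
FROM   Table1;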
Thanks,
K
Apr 21, 2008
Guys
I have two databases on my SQL Server, and I am trying to create a report in which both databases need to share their data. At the moment I have simply created a view in one database to access the table in the other database. Does anyone have a better idea of how I can transfer data from one database to another? I think creating a temp table might resolve this, but when the other database is on another server, how would I do that?
Please share any ideas if you get my point.
Thanks and looking forward.
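For reference, a hedged sketch of both cases (all object names are examples):

-- Same server, different database: three-part names are enough.
INSERT INTO ReportDb.dbo.ReportTable (Col1, Col2)
SELECT Col1, Col2
FROM   SourceDb.dbo.SourceTable;

-- Different server: register a linked server once, then use four-part names.
EXEC sp_addlinkedserver @server = N'REMOTESRV';
INSERT INTO ReportDb.dbo.ReportTable (Col1, Col2)
SELECT Col1, Col2
FROM   REMOTESRV.SourceDb.dbo.SourceTable;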
Feb 19, 2008
Hi, I am new to writing stored procedures, so please help me out.
I have to transfer data from one table to three other new tables. If there are any duplicates in the original table, I have to send them to a duplicates table; the remaining data should be sent to the three other tables.
Can anyone help with writing a stored procedure for this?
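A hedged sketch of such a procedure (the table names, the KeyCol key, and the routing rule are all placeholders, since the post does not specify them):

CREATE PROCEDURE dbo.usp_SplitSource
AS
BEGIN
    -- Rows whose key appears more than once go to the duplicates table.
    INSERT INTO dbo.DuplicateRows (KeyCol, Col1, Col2)
    SELECT KeyCol, Col1, Col2
    FROM   dbo.SourceTable
    WHERE  KeyCol IN (SELECT KeyCol
                      FROM   dbo.SourceTable
                      GROUP  BY KeyCol
                      HAVING COUNT(*) > 1);

    -- Remaining rows are routed by some business rule; a range on Col1
    -- is used here purely as a placeholder.
    INSERT INTO dbo.Target1 (KeyCol, Col1, Col2)
    SELECT KeyCol, Col1, Col2
    FROM   dbo.SourceTable
    WHERE  KeyCol NOT IN (SELECT KeyCol FROM dbo.DuplicateRows)
      AND  Col1 < 100;
    -- ...similar INSERTs for dbo.Target2 and dbo.Target3...
END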
thanks for your suggestions and answers
regards
ramya.
Aug 29, 2006
Is there a way to transfer ntext data from one table to another? I tried this:
UPDATE [projects]
SET [description] = (SELECT [description_ntext] FROM [table] WHERE [id] = 1)
WHERE [id_project] = 1;
and this:
DECLARE @DESCRIPTION ntext
SET @DESCRIPTION = (SELECT [bids].[bid_conditions]
                    FROM [bids], [projects]
                    WHERE [bid_accepted_id] = [bids].[id_bid]
                      AND [id_project] = @ID_PROJECT);
UPDATE [projects]
SET [description] = @DESCRIPTION
WHERE [id_project] = 1;
Neither works in MSSQL 2000; the error reported is "The text, ntext, and image data types are invalid for local variables."
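The error is specific to ntext local variables; on SQL 2000 a join-based UPDATE sidesteps the variable entirely (a sketch using the post's own table and column names):

UPDATE p
SET    p.[description] = b.[bid_conditions]
FROM   [projects] AS p
       JOIN [bids] AS b ON p.[bid_accepted_id] = b.[id_bid]
WHERE  p.[id_project] = 1;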
Aug 8, 2006
Hello. I want to ask about the possibility of copying both a table structure and its contents from a SQL Server table to a table within MS Access. The problem cannot be solved with a permanent table structure at the target location: the names of the columns are essentially data to the application and so are subject to change. I am targeting a solution using SQL Query Manager.
The approach I have tried (with failure) is
SELECT * INTO <linkedserver table> FROM <local table>
This should create and copy. However, I am not sure if this is achievable with this approach. Refer to the dialogue:
-------------------------------------------------------
USE MASTER
GO
EXEC sp_addlinkedserver
    @SERVER = 'Freddie',
    @PROVIDER = 'Microsoft.Jet.OLEDB.4.0',
    @SRVPRODUCT = 'OLE DB Provider for Jet',
    @DATASRC = 'C:\temp\HMIS_Recipe.mdb'
-- I am not sure if this is required
EXEC sp_addlinkedsrvlogin 'Freddie', false, 'sa', 'Admin', NULL

SELECT * FROM Freddie...FRED            -- This is OK
SELECT * INTO #Temp FROM Freddie...FRED -- This is OK

-- This fails - refer error
SELECT * INTO Freddie.FRED65 FROM #temp
-- Server: Msg 2760, Level 16, State 1, Line 1
-- Specified owner name 'Freddie' either does not exist or you do not have permission to use it.

-- This also fails, and I thought it reflected the above select with naming - refer error
SELECT * INTO Freddie...FRED65 FROM #temp
-- Server: Msg 117, Level 15, State 1, Line 2
-- The object name 'Freddie...' contains more than the maximum number of prefixes. The maximum is 2.

EXEC sp_dropserver 'Freddie', @droplogins = 'droplogins'
------------------------------------------------------------
Thank you. Regards, JC
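For what it's worth, SELECT ... INTO cannot create a table through a linked server, which is why both forms fail. INSERT ... SELECT into a table that already exists in the .mdb does work (FRED65 would have to be created in Access first):

INSERT INTO Freddie...FRED65
SELECT * FROM #Temp;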
Sep 14, 1999
Hello:
I am running on MSSQL 6.5, SP4. We have been trying to use EM Transfer Manager to move one test database on one server to another database on another server.
We are dealing with 135 tables in this database. The transfer works up until about the 80th table and then just dies; the scheduled task says it failed and to check the error log. The transfer creates the tables on the destination database but only loads the data up to this one table.
We use all of the options in EM Transfer Manager, which are set as defaults.
There is no one on the source or destination databases locking this table.
Other smaller databases were successfully transferred from one database on one server to the other database on the other server without any problems today and yesterday.
Has any one run across something like this?
Thanks.
David Spaisman
Apr 2, 2008
Hi All,
Can anybody help me create an SSIS package to transfer data from a SQL table in the Database Engine to an OLAP cube in Analysis Services?
Thanks in Advance.
Archana
Sep 18, 2015
I have a SSIS 2008 parent/child package solution to manage data transfers between two different data sources, so we can copy multiple tables and capture how many rows were transferred and the duration of each transfer. This solution was working fine up until last week, when I made some changes to allow the package to perform a source count using standard SQL determined by an expression, or SQL provided from configuration tables; I also changed the package to truncate (or not) the destination table, again controlled by configuration settings in a table. The child packages which perform the data flows have not changed!
The day after the controlling package was promoted to live, I saw the bizarre behaviour of the package log stating all rows were transferred, while the actual table counts were not what the log stated (see attached file). The package solution works OK on other servers and was OK in DEV, though with fewer tables and rows transferred. Re-running the package gave the same errors, but on some of the same tables and some different ones.
As it is the child packages doing the transfers, and nothing has changed in them, I cannot see how the log can say all rows were transferred when not all of the rows were actually moved.
Attached: the process output, where you can see the counts and the log; the table transfer controller (as txt, not dtsx);
and an example of the data transfer child packages (as txt, not dtsx). When I set ExecuteOutOfProcess = True the package worked fine; unfortunately, this is not a good solution, as SSIS 2008 does not tidy up the DtsHost.exe processes it starts, and I'd be left with a memory issue after a very short time, since we transfer hundreds of tables each day. (I could write a .NET script in the controlling package to kill the child processes, but that would still leave hundreds of processes running before I could end them, as we have three parallel streams to allow a bit better performance.)
Oct 18, 2007
Hi,
I have several databases on a server (SQL Server 2000 only, no web server installed), and lately, as the company keeps growing, my users complain that the server gets slow (these DBs are well designed and receive optimizations and integrity checks, etc.). Because of this, I'm thinking about getting a new server to replace my old ProLiant ML 330, which was bought 4 years ago, but I'm concerned about which server architecture or characteristic can best help me improve response performance: is it HD speed? Processor speed? Or more RAM? I want to make a good decision, so I'd really appreciate your help...
Thanks, Luis Luevano
Oct 2, 2007
I have the following scenario (with SQL Server 2005 Express):
I have a table with 160,000 records.
I have to retrieve 10,000 - 40,000 records by their ids (< 3 seconds would be sufficient).
I first used single requests, then one single command as a batch (I simply joined the single commands into one string).
But that was very slow (30 seconds even when cached), so I created one big statement:
select myfields
from mytable where id in(1,2,..,35000)
If everything is cached, the speed is fine (< 1 second), but if I retrieve the data for the first time it takes 15-30 seconds; that's a bit too slow.
The total database size is 100MB, so a file scan should be faster, I thought at least.
So HERE is the problem why I post this: to force the table scan I used
Select myFields From mytable With (Index(0)) ...
and that took > 3 minutes.
I tested the raw IO time; that was 2.5-3 seconds with the db file.
Does SQL Server have a problem with the 35,000 items in the condition?
(If it loops 35,000 x 160,000 times instead of using a hash for the items, that would explain the slow speed.)
Or, another reason: is table scanning always much slower than raw IO operations?
The id index is not clustered and (I really don't know why) not marked as primary key, but that shouldn't have any impact on a file scan, I guess.
Has anyone faced (and solved) a similar problem?
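One pattern that often beats a 35,000-item IN list: load the ids into a keyed temp table once, then join (names follow the post):

CREATE TABLE #ids (id INT PRIMARY KEY);
-- ...insert the 35,000 ids into #ids (in batches or via bulk copy)...
SELECT t.myfields
FROM   mytable AS t
       JOIN #ids AS i ON i.id = t.id;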
Jan 21, 2008
I have inherited a half-finished SQL Server based project from a recently departed coworker. The critical piece of this project is one app thread that reads barcodes, queries a single table in the database for the one record with that code as its primary key, and makes decisions based on that record. The faster I can make that go, the better the process will run, up to a max rate as high as 20 queries per second if that were possible. I have a limited general knowledge of SQL, but very little of SQL Server Express.
My question is: what is the best way with SQL Server to maximize my single-table request rate? On some other databases I could create an in-memory temp copy of the table with trigger events on the main table to keep the copy in sync, or I could do an initial select on the entire table to hopefully get it into cache memory, or I could use some kind of ADO-like table on the app side (but do I really gain much of anything doing this?).
With SQL Server, what is my best approach to maximize my throughput under these conditions?
FYI: the C++ app uses direct ODBC calls to a localhost database. The table could theoretically have 75,000 ever-changing records in it. There are 5 or 6 other processes also hitting this table, but at a far more lackadaisical rate (say once every 10 seconds).
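A baseline sketch under those conditions (table and column names are hypothetical): a unique clustered index on the barcode turns each lookup into a single index seek, and 20 point lookups per second is then a very modest load; on the app side, one prepared parameterized statement reused across calls avoids per-query compile cost.

CREATE UNIQUE CLUSTERED INDEX IX_Items_Barcode ON dbo.Items (Barcode);

-- Prepared once on the ODBC side and re-executed per scan:
-- SELECT Col1, Col2 FROM dbo.Items WHERE Barcode = ?;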
Apr 27, 2007
Hello,
I'm trying to pull 1.2 million rows down from an Oracle database into SS2005. It has been painfully slow (about 20-30 minutes). Is there a way I could speed this up?
While I am connecting to Oracle 9, I tried downloading the latest (10g) drivers from Oracle because I thought I could use the ODP.NET drivers. Unfortunately, that did a number on my system, and I could no longer even ping one of the databases.
I've seen this question sort of answered, but the only answer I've found seems to be a custom connector developed by http://www.persistentsys.com. It is great that they have developed something, but I would expect to be able to use Oracle's and/or Microsoft's drivers to get similar performance. (OK, I just don't want to pay for it.)
I was thinking I could do a bulk dump from Oracle into a flat file and then bulk load it into SQL Server, but I don't have (or know about) tools that can bulk dump from Oracle.
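On the SQL Server side the load half of that idea is straightforward; a hedged sketch, assuming a delimited file spooled from Oracle (for example with SQL*Plus) at an example path:

BULK INSERT dbo.TargetTable
FROM 'C:\dumps\oracle_rows.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);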
Thank you for the help.
-Gumbatman
Mar 19, 2014
In the full recovery model, if I run a transaction that inserts 10MB of data into a table, then 10MB of data is moved into the data file. Does this mean the log file will also grow by exactly 10MB?
I understand that all transactions are logged to enable rollback and point-in-time recovery, but what is actually physically stored in the log record for such a transaction? Is it the text of the command, or the actual physical data from the transaction?
I ask because, say I have two drives: one with a 5MB/s write speed for the log file and one with a 10MB/s write speed for the data file. If I start inserting 10MB of data per second into the table, am I going to be limited to 5MB/s by the log file drive, or will SQL Server not try to log all 10MB each second to the log file?
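One way to answer this empirically rather than in theory: compare log usage before and after the insert. A minimal sketch:

DBCC SQLPERF(LOGSPACE);   -- note "Log Space Used (%)" for your database
-- ...run the 10MB INSERT here...
DBCC SQLPERF(LOGSPACE);   -- the delta is roughly the data plus per-row log overhead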
May 8, 2008
I'm pretty new to .NET and am looking for advice on how to speed up a simple stored procedure that returns 35,000-plus rows. It's a super simple query that returns a client list; it's just that there are soooooo many rows that the page is very slow to load.
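If the page never shows all 35,000 rows at once, paging inside the procedure is the usual fix; a sketch for SQL Server 2005+ with illustrative names:

WITH numbered AS (
    SELECT ClientId, ClientName,
           ROW_NUMBER() OVER (ORDER BY ClientName) AS rn
    FROM   dbo.Clients
)
SELECT ClientId, ClientName
FROM   numbered
WHERE  rn BETWEEN 1 AND 50;   -- page 1 at 50 rows per page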
Jul 9, 2015
I have just been running a query which I was planning to improve by removing a redundant GROUP BY (there are about 20 columns, and one of the returned columns is atomic, so the GROUP BY will never manage to group any of the data), but when I modified the query to remove the grouping, it actually seems to run slower, and I can't see why.
Both queries return the same number of rows (69,000), as I expected, and the query plans look nearly identical, except that the grouped plan starts with a "stream aggregate" and a "sort". The estimated data size is 64MB for the non-grouped query (runs in 6 min 41 sec) vs 53MB for the aggregated query (runs in 5 min 31 sec), and the estimated row size is smaller when aggregated.
Can anyone rationalise this? In my mind, the data being pulled is identical, plus there is extra computation for the unnecessary aggregation, so the aggregated query should be unquestionably slower, but the database engine has other ideas; it seems able to work more quickly when it has unnecessary work to do :) Perhaps it is an inefficient query plan for the non-aggregated query? I would have thought looking at the actual execution plan would make this apparent, but both plans look very similar.
Edit: more information. The GROUP BY query had two aggregations: a count of one of the columns and an average of another. I changed the count to just "1", and replaced the average with the expression inside the aggregate, since the aggregation effectively does nothing.
Nov 30, 2005
Hi,
I am expanding our data warehouse solution with new filegroups on several subsystems.
I want to know which idea is better:
- create clustered indexes on the tables to 'move' them to the new filegroups
- create these tables on the new filegroups directly
The background of this question is as follows:
- we want the whole data on the new filegroups
- we want to know if there is any difference in performance between the 2 solutions
Thanks in advance,
Danny
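For the first option, the usual mechanics are to (re)build the clustered index on the new filegroup (illustrative names):

CREATE UNIQUE CLUSTERED INDEX PK_FactSales
ON dbo.FactSales (SaleId)
WITH DROP_EXISTING
ON [NewFileGroup];   -- the table's data pages follow the clustered index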
Mar 2, 2008
I have been trying to determine which is more efficient, with regards to speed, between a view and a common/nested table expression when used in a join.
I have a query which could be represented as an indexed view or a common table expression, which will then be used to join against another table.
The indexed view will use its indexes when performing the join. Is there a way to make the common table expression faster than an indexed view?
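For reference, a sketch of the indexed-view side of the comparison (illustrative names). A CTE is expanded inline at compile time, so it cannot carry its own index; the indexed view's clustered index, by contrast, can be read directly with the NOEXPAND hint:

CREATE VIEW dbo.vSales
WITH SCHEMABINDING
AS
SELECT CustomerId, COUNT_BIG(*) AS OrderCount   -- COUNT_BIG is required for indexed views
FROM   dbo.Orders
GROUP  BY CustomerId;
GO
CREATE UNIQUE CLUSTERED INDEX IX_vSales ON dbo.vSales (CustomerId);
GO
SELECT s.CustomerId, s.OrderCount
FROM   dbo.vSales AS s WITH (NOEXPAND)          -- read the materialized rows directly
       JOIN dbo.Customers AS c ON c.CustomerId = s.CustomerId;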
Jun 11, 2007
My vendor requires data to be sent in Excel format. Some of my tables have over 65,536 rows, so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing to use DTS to export from SQL to Excel 2007? Thanks in advance.
Aug 29, 2006
I have created a SSIS package that transfers data from a FoxPro database to an instance of SQL Server 2005 Express. I used the wizard to create the package, but I load and execute the package within a custom application that I have written in C#.
The way the custom application is intended to work is that the user can have the database in any location on the computer; all he has to do is specify the location, then the application programmatically changes the location of the source on the loaded package and executes it. When I initially run the package the first time (using the original path), it works fine and transfers the data. However, every subsequent time I run the application and specify a different path, the database on the SQL Server side gets created as expected but the data is not transferred!
Where am I going wrong? Do I need to save the package after I modify the source, then reload and run it again, or do I need to change something else in the Data Flow to make this work?
May 8, 2007
Hi,
I have a SSIS project where I am transferring data from a DB2 table to a SQL Server table. There is a column called REC_ID which I need to encrypt before we store it in SQL Server. SQL Server has built-in encryption functionality, and we need to use that, as there are views that will decrypt this column and serve the data to authenticated users.
So the question is: is there any way I can encrypt the column data in my SSIS package while transferring, using my target SQL Server database key and the SQL Server EncryptByKey function?
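One hedged approach: let SSIS land REC_ID in clear text in a staging table, then encrypt server-side with the target database's key in a follow-up Execute SQL Task (key, certificate, table, and column names below are all examples):

OPEN SYMMETRIC KEY RecIdKey DECRYPTION BY CERTIFICATE RecIdCert;

INSERT INTO dbo.TargetTable (RecIdEncrypted, OtherCol)
SELECT EncryptByKey(Key_GUID('RecIdKey'), CONVERT(VARCHAR(30), s.REC_ID)),
       s.OtherCol
FROM   staging.Db2Rows AS s;

CLOSE SYMMETRIC KEY RecIdKey;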
Thanks,
Ujjaval
Feb 1, 2007
Hi,
Can I transfer data between two data flows?
Is it possible in any way?
Thanks
Dharmbir
Mar 5, 2014
I created a schema, Admin. I have to transfer a table from the dbo schema to the Admin schema. I keep getting an error that I do not have permission or the table does not exist.
Simply looking for confirmation here - is my syntax correct?
ALTER SCHEMA Admin TRANSFER MyShop.Addresses;
(MyShop is the Database, Addresses is the table)
NOTE: When I created the schema, I did not create an inner table. The syntax for that was simply CREATE SCHEMA Admin;
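For what it's worth, ALTER SCHEMA TRANSFER takes a schema-qualified object, not a database-qualified one, so from within the MyShop database the statement would be:

USE MyShop;
GO
ALTER SCHEMA Admin TRANSFER dbo.Addresses;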
Jan 25, 2007
My problem is very simple: I'm trying to copy some tables between databases, but these tables are in different schemas.
let's say I have
dbo.tableA
sch1.tableA
sch2.tableA
sch3.tableA
And I just want to copy, let's say, sch1.tableA to a different DB.
If I use the Transfer SQL Server Objects task, select the table, save the package, and then try to open the task again, all the tables named tableA are selected! It seems that although it does show the schema (when I am selecting the table manually), it does not store the schema detail in the TablesList collection property of the task.
Could you please recommend any other way to achieve this?
Many Thanks in advance
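One plain-T-SQL workaround for a single schema-qualified table (TargetDB is an example name; the sch1 schema must already exist there):

SELECT *
INTO   TargetDB.sch1.tableA
FROM   sch1.tableA;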
May 7, 2008
I've got a DB2 database that contains a lot of tables. I need to extract the data from some of them and put it in a SQL Server database. The number of tables needed may change, so I need an easy way of controlling this.
So, I created a lookup table in the target SQL Server DB that lists the tables I want with any selection criteria needed.
I have a For Each loop that goes through every row in the table. Using the current row, it deletes the contents of the destination table (done and working), and then tries to load the source into it. The destination OLE DB uses a variable for the table name to insert into.
The problem I've got is that whilst I can set up the OLE DB Source to use a SQL command from a variable, I have to do it in the advanced editor or in the properties, as the normal editor gives an OLE DB error. This is probably because the variable has nothing in it at this point. I've turned ValidateExternalMetadata off to avoid design-time errors, but when attempting to run I still get:
Code Snippet
[DTS.Pipeline] Error: "output "OLE DB Source Output" (11)" contains no output columns. An asynchronous output must contain output columns.
Which I understand, as there are none until runtime. How do I get around this problem of not having any columns to map until I know which table I am processing?
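When the column set varies per table, a data flow may simply be the wrong tool; one hedged alternative is to build the whole copy as dynamic SQL in an Execute SQL Task (the linked server name DB2SRV and the variable handling below are illustrative):

DECLARE @TableName SYSNAME;
SET @TableName = N'MYTABLE';   -- supplied by the For Each loop in practice
DECLARE @sql NVARCHAR(4000);
SET @sql = N'INSERT INTO dbo.' + QUOTENAME(@TableName) +
           N' SELECT * FROM OPENQUERY(DB2SRV, ''SELECT * FROM ' +
           @TableName + N''')';
EXEC (@sql);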
Thx
Rob
Jan 10, 2007
Hello,
I am trying to transfer a database from one server to another using the Import/Export wizard in SSIS, and I am consistently getting this error on 2 different tables so far.
- Execute the transfer with the TransferProvider. (Error)
Messages
* ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of
"output column "ErrorCode" (79)" and "output column "ErrorCode" (14)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC} (Microsoft.SqlServer.DtsTransferProvider)
This error message is beyond cryptic, and when I click on the link it sends me to a web page that just tells me there is no information available for my issue. I am transferring the tables to an empty database, so I do not understand why I am receiving this error. I have to say that I am not impressed with SSIS at all. I know a lot of developers think it's the best thing since sliced bread; however, either I am doing something wrong or Microsoft needs to come out with a service pack that fixes these bugs...
Any help would be appreciated...
Thanks,
David