SSIS Performance Tuning
Aug 28, 2007
Hello Everyone,
Can anyone update me on performance tuning of SSIS, and what difference it would make if I change the default values of these two properties in each Data Flow?
DefaultBufferMaxRows
DefaultBufferSize
Also let me know what these properties are used for.
Thank you
View 3 Replies
May 1, 2007
All:
When creating a package, SSIS treats varchar columns as Unicode (DT_WSTR), so before loading data into the target tables I have to perform a data conversion from DT_WSTR to DT_STR.
Is there any way to turn Unicode off so I do not need to do the conversion? Please advise...
Rohan
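One commonly suggested workaround, assuming the source is a relational query (the table and column names below are illustrative): cast the columns to a non-Unicode type in the source SELECT so the provider reports them as DT_STR and no separate Data Conversion transformation is needed. Whether this works depends on the source provider.
-- Illustrative sketch: casting in the source query makes the columns arrive
-- as non-Unicode strings (DT_STR) instead of DT_WSTR.
SELECT CAST(CustomerName AS varchar(100)) AS CustomerName,
       CAST(City AS varchar(60)) AS City
FROM dbo.SourceCustomers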
View 4 Replies
View Related
Sep 30, 2015
I have a list (in an input file) where each row is about 20K in size (so it can't be stored in a SQL table). I want to convert the list into a table as shown below:
before:
---------------
pk1, c1, d1, c2, d2, c3, d3, ...
pk2, c1, d1, c2, d2, c3, d3, ...
where "pk" denotes a primary key and, within a given row, (c1, c2, c3, ...) are all distinct ("c" values are column names, "d" values are the associated data).
after: (desired conversion)
---------------
pk1,c1,d1
pk1,c2,d2
pk1,c3,d3
...
pk2,c1,d1
pk2,c2,d2
pk2,c3,d3
....
I was planning to have SSIS pull in the "before" data, run a custom C# program in SSIS against it to massage the data into the vertical (3-column) format, then export the massaged data to a new text file. The new text file would later be imported into a SQL table using SSIS.
View 5 Replies
View Related
Jan 5, 2007
Hi All,
Our client is a multinational Cigar company and we have been building SSIS packages for them for extraction of data into data marts.
We also implemented all the tasks as T-SQL procedures and compared the performance of the procedures and the SSIS packages.
We found that the procedures performed better than the SSIS packages (in time taken). Why is this so?
We expected the SSIS packages to be much quicker, but that was not the case...
Thanks,
Prakash.P
View 2 Replies
View Related
Feb 7, 2006
hi,
I have a scenario in which I have to handle millions of rows.
The data should be read in chunks and written to a custom destination in chunks. How can I achieve this?
Thanks
Dharmbir
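One possible approach, sketched here with assumed table, key, and batch-size values rather than anything from the original post: drive the source extraction with a keyset-style query so each pass reads a bounded number of rows, and loop until no rows come back.
-- Sketch: read rows in 50,000-row chunks keyed on an ever-increasing Id.
-- @LastId would be carried between iterations, e.g. in an SSIS variable.
DECLARE @LastId int
SET @LastId = 0   -- highest key already processed

SELECT TOP (50000) Id, Col1, Col2
FROM dbo.BigSourceTable
WHERE Id > @LastId
ORDER BY Id
-- After each chunk, set @LastId to the highest Id returned and repeat
-- until the query returns no rows.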
View 1 Replies
View Related
Jun 19, 2006
I have been working on a project for the last few months. I developed the project on my laptop, which is reasonably powerful. It runs through fine, in about 9 minutes, with the sample data set.
If I replicate the same environment on a 64 Bit machine with 32 Bit Win 2003 and SP1, it takes just over 7 mins.
If I rerun it on a 64 Bit machine with 64 Bit Win 2003, it takes between 21 and 24 mins.
We are executing the packages via dtexec on a command prompt.
Has anyone seen the same behaviour?
Thanks
Sutha
View 19 Replies
View Related
Jan 31, 2007
I am currently in the process of migrating DTS packages to SSIS. I am finding that most of the packages are running faster, but some of them are taking longer to execute.
The DTS package copies data from our Production server to Development. It uses a Copy SQL Server Objects Task to copy only the data from about 50 tables, which takes about 3.5 minutes. I created the exact same package in SSIS using the Transfer SQL Server Objects Task, and it runs in about 5 minutes.
Another package I am having this problem with is only copying data from 1 table using a Copy SQL Server Objects Task. This package executes in 19 minutes. I have created the exact same package twice. Once using Transfer SQL Server Objects and once with a data flow task. The Transfer SQL Server Objects package takes about 50 minutes while the Data Flow package takes about 40 minutes.
As I said, most of the packages are faster with SSIS, which is why I am confused about these couple that are just copying data.
Any help is much appreciated.
Thanks,
Sam
View 3 Replies
View Related
Feb 12, 2007
Hi:
I've written new SSIS packages to do what DTS packages did and the
performance I'd say is about 20 times slower. In this package, I have
a loop that loops through different servers based on server entries in
a SQL database. Each loop pumps 10 tables. The source query is set by
a variable and the destination table is also set by a variable, since
all this data goes to the same tables on the SQL server and the
definitions are all the same on the source server (Sybase). It's still going and has taken about 12 hours to pull roughly 5 million records.
The source query ends up being:
SELECT *, 'ServerName' FROM SourceTable1 WHERE Date >= Date
The 'ServerName', the "SourceTable1" and the "Date" are all set by variables, which in turn build the source query variable.
Anyway, I just mention this for completeness; I wouldn't think setting the variables has anything to do with the pump's performance. How can I check to see where the performance is getting held up?
Also, I have checked the latency to the 3 servers via ping. The slowest one pings at about 62 ms, the fastest at 1 ms, and the other somewhere in between.
View 7 Replies
View Related
Feb 26, 2007
Hi,
I have 4 packages to execute, say A, B, C, D. These packages in turn contain child packages for data transfer:
Package A - 15 packages
Package B - 15 packages
Package C - 20 packages
Package D - 20 packages
When I run these packages one at a time in the order A, B, C, D, the total execution time is around 17 minutes.
If I make a parent package and put A, B, C, D in sequence in it, executing this parent package increases the execution time to around 50 minutes.
Packages A, B, C, D do not run in parallel; they run one at a time. So I was wondering why there is so much time difference.
Please let me know if there are configuration settings that would make the parent package efficient.
All the test conditions are the same in both cases. The source and the target are SQL Server databases on the same machine as SSIS.
Thanks,
Vipul
View 5 Replies
View Related
Aug 16, 2006
I am trying to return 0 when the result is NULL. I thought that Sum(ISNULL(Payment, 0)) would do the trick, but I was wrong... any ideas?
MonthlyCredit =
(
    SELECT SUM(ISNULL(Payment, 0))
    FROM [transaction] AS t
    WHERE AccountNumber = @AccountNumber
      AND MONTH(t.PayDate) = @Month
)
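The usual explanation is that SUM over an empty set returns NULL no matter what ISNULL does to the column, so the ISNULL has to wrap the aggregate (or the whole subquery). A hedged sketch, keeping the original names:
MonthlyCredit =
(
    SELECT ISNULL(SUM(Payment), 0)   -- an empty set now yields 0, not NULL
    FROM [transaction] AS t
    WHERE t.AccountNumber = @AccountNumber
      AND MONTH(t.PayDate) = @Month
)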
View 6 Replies
View Related
Jan 13, 1999
Is there any way to stop a database from writing to the log file when you alter the design of a table?
View 1 Replies
View Related
Jan 16, 2002
Is it possible to turn a trigger off in SQL Server, as you can in Oracle? If so, how?
Thanks
David
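A hedged sketch (the table and trigger names are placeholders): ALTER TABLE can switch an individual trigger, or all triggers on a table, off and back on.
-- Names are illustrative.
ALTER TABLE dbo.Orders DISABLE TRIGGER trg_Orders_Audit
-- ... do the work that should not fire the trigger ...
ALTER TABLE dbo.Orders ENABLE TRIGGER trg_Orders_Audit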
View 1 Replies
View Related
Aug 22, 2001
I'm trying to do an INSERT ... SELECT statement in the following manner:
INSERT INTO DB1.dbo.TABLE
SELECT *
FROM dbo.TABLE1
JOIN dbo.TABLE2 ON dbo.TABLE1.column = dbo.TABLE2.column
and I'm given this error message:
An explicit value for the identity column in table 'DB1.dbo.TABLE' can only be specified when a column list is used and IDENTITY_INSERT is ON
So if anyone knows how to turn it on, it would be a great help.
Sincerely,
Matt
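A hedged sketch of the usual pattern (the column names are placeholders): switch IDENTITY_INSERT on for the target table, supply an explicit column list that includes the identity column, and switch it back off afterwards.
-- Column names are illustrative; the identity column must appear in the list.
SET IDENTITY_INSERT DB1.dbo.[TABLE] ON

INSERT INTO DB1.dbo.[TABLE] (Id, Col1, Col2)
SELECT t1.Id, t1.Col1, t2.Col2
FROM dbo.TABLE1 AS t1
JOIN dbo.TABLE2 AS t2 ON t1.[column] = t2.[column]

SET IDENTITY_INSERT DB1.dbo.[TABLE] OFF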
View 3 Replies
View Related
Feb 15, 2005
Hi, is it possible to run a certain SQL statement against SQL Server and ask it not to fire any triggers? Or would it be better to disable the trigger and then re-enable it afterward? If so, how? Thanks, Ed
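There is no per-statement switch, but here is a hedged sketch of the disable/re-enable approach (the table name and statement are placeholders):
-- Illustrative names. Disable every trigger on the table, run the statement,
-- then turn the triggers back on.
ALTER TABLE dbo.Customers DISABLE TRIGGER ALL

UPDATE dbo.Customers
SET Status = 'Migrated'
WHERE LegacyFlag = 1

ALTER TABLE dbo.Customers ENABLE TRIGGER ALL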
View 3 Replies
View Related
Nov 28, 2006
I have a device application that simply needs to upload data to a server. The preferred DB server is Oracle, but I've made it work using RDA and SQL Server. The problem I'm having is that it just needs to upload data, which I send using the RDA.Push() method. The data arrives just fine the first time. With every subsequent upload, all of the previous data is deleted from the server. Apparently RDA is tracking the deletion of the previously uploaded data locally and, on the next Push, deleting that data from the server.
My question is: is it possible to prevent RDA from deleting data on SQL Server? I attempted to delete the rows from the __sysDeletedRows/__sysRowTrack tables but got a "Data is read only" error.
View 1 Replies
View Related
May 12, 2008
--Environment: SQL Server 2000
I am running the following query to find who is reaching age 65/75 in May 2008, but some Member_DOBs or Spouse_DOBs show a different month. The month should be '05' because I only want results for people reaching age 65/75 in May 2008.
Please help in this regard.
--Query:
Select m.empid, m.dob "Member_DOB",
round(datediff(dd,m.dob,'05/30/2008')/365.25,1) Member_Age,
d.dob "Spouse_DOB",
round(datediff(dd,d.dob,'05/30/2008')/365.25,1) Spouse_Age
from member m
left outer join (SELECT * FROM depend where depcode = 'S' and activestatus = 1)d
on m.empid = d.empid
where (datepart(yy,m.dob) in (1933,1943) or
datepart(yy,d.dob) in (1933,1943))
and round(datediff(dd,m.dob,'05/30/2008')/365.25,1) in (65,75)
or round(datediff(dd,d.dob,'05/30/2008')/365.25,1) in (65,75)
--Results:
Empid Member_DOB Member_Age Spouse_DOB Spouse_Age
000000033 1931-12-07 00:00:00.000 76.500000 1933-06-16 00:00:00.000 75.000000
000000085 1933-05-23 00:00:00.000 75.000000 1938-03-10 00:00:00.000 70.200000
000000695 1933-06-10 00:00:00.000 75.000000 1934-07-08 00:00:00.000 73.900000
000000792 1931-01-15 00:00:00.000 77.400000 1933-06-05 00:00:00.000 75.000000
000002406 1933-05-27 00:00:00.000 75.000000 NULL NULL
000004149 1933-05-20 00:00:00.000 75.000000 NULL NULL
Desired results:
The Member_DOB and Spouse_DOB values should show month '05', i.e. 1931-05-07, 1931-05-15.
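A hedged rewrite, keeping the original table and column names: the OR needs its own parentheses so the year filter applies to both branches, and filtering directly on birth month and year is simpler than testing a rounded day count, since anyone turning 65 or 75 in May 2008 was born in May 1943 or May 1933.
SELECT m.empid, m.dob AS Member_DOB,
       ROUND(DATEDIFF(dd, m.dob, '05/30/2008') / 365.25, 1) AS Member_Age,
       d.dob AS Spouse_DOB,
       ROUND(DATEDIFF(dd, d.dob, '05/30/2008') / 365.25, 1) AS Spouse_Age
FROM member m
LEFT OUTER JOIN (SELECT * FROM depend WHERE depcode = 'S' AND activestatus = 1) d
    ON m.empid = d.empid
WHERE (DATEPART(yy, m.dob) IN (1933, 1943) AND DATEPART(mm, m.dob) = 5)
   OR (DATEPART(yy, d.dob) IN (1933, 1943) AND DATEPART(mm, d.dob) = 5)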
View 2 Replies
View Related
Feb 26, 2007
Does anyone know if there is an easy way to keep the "Select All" option from appearing on reports with multi-select parameters? I am going to have a hard time getting the development staff to update all of our reports AGAIN after making them conform to SP1.
Please let me know if there is a way before I install SP2.
Thanks.
View 8 Replies
View Related
Feb 10, 2007
I need to turn off validation and I've seen some threads saying this is not possible but my situation has a twist.
A customer needs the package to connect to different modem dialup connections to connect to different servers (they use dialup for security reasons). We have written two VB script tasks at the beginning and end of a loop, with data flows in between. Before the loop the dialup connection info is read into a recordset along with Data Source connection information. The first script uses this information to dialup and the last script hangs up the connection. The problem is the package tries to validate the data connections and the package has not dialed up yet, so it fails.
We managed to confirm it works in a test environment by putting a break in the first script, manually VPNing into the test network (to allow validation of the data flow to work), and then manually disconnecting from VPN during the break. The script dials in and pumps the data. But this won't be an option in production.
So if anyone has figured out a way to turn off validation, great. Otherwise, any ideas to make this work? I was thinking about setting up a dummy connection that would be connected outside the package before running, just for validation (and then the script would disconnect to begin with), but I would prefer to handle all of this within SSIS.
Any help? While I see the point of validation, it's a bummer that MSFT didn't put this in the hands of the user.
Thanks, Kayda
View 4 Replies
View Related
Feb 7, 2008
I have a bunch of packages that take views and create tables from them. Some of the views are rather complex, but the packages themselves are very simple... drop and re-create a table using the data from a view on the same server. We create a new DB for each year, and this year we've upgraded to a new server with SQL 2005, so our DTS packages on the 2000 SQL server had to be recreated in SSIS on the new server. No problem, as I said the packages are really simple. But when I create the packages in SSIS they now take an extremely long time to execute, and I cannot figure out why.
For instance, one DTS package would take approximately 5 minutes to run when the view contained hundreds of thousands of rows and the underlying tables contained millions. But now, even with MUCH smaller tables (since it's the beginning of the year, new DB) the SSIS package I created on the new server takes over an hour, literally. The view that the SSIS package is using to create the table only takes about 15 seconds to execute in management studio (only about 16,000 rows). How can this possibly take so long??
The new server is virtually the same hardware-wise: 4 x 2400 MHz CPUs, 4 GB RAM, Windows Server 2003.
View 14 Replies
View Related
Aug 20, 2007
Hi All,
I'm working on a conversion project and I'm trying to compare the performance of SSIS with other ETL tools, especially Informatica PowerCenter. Which one do you think is the better ETL performer when the source and destination are SQL Server databases? Is there any benchmark available?
Thanks.
View 3 Replies
View Related
Sep 27, 2006
Hi,
our package design is like this:
OLE DB Source -> Derived Column -> Lookup
    Lookup matching records -> OLE DB Command (Update)
    Lookup unmatched records -> OLE DB Destination (Insert)
Our source and destination tables are Oracle. When we execute the package the performance is very low, and sometimes it shows as processing (yellow) for as long as an hour. What could be the problem? Can anyone help us? Is there any reason why using an Oracle database would slow down the performance of the package?
Jegan
View 3 Replies
View Related
Nov 3, 2006
Hello again,
I'll just throw out my question: how can I increase SSIS performance?
I have a really heavy job with thousands of records in my base selection; then I perform some lookups (I replaced most of them with SQL) and derived columns (again, I replaced as much as possible with SQL). Finally, after a Slowly Changing Dimension task, I do an update/insert on a given table. Is there a trick to speed up lookups and inserts (something like manipulating the buffer sizes, just asking)?
The fact is that I replaced a script task with pure SQL joins and gained back 6 of the 12 hours this job took.
Any ideas?
Greets,
Tom
View 2 Replies
View Related
Oct 5, 2007
Dear Friends,
I always use this forum to find support and to try help others.
But this time I need your feedback on my package, which will be in production in a few weeks.
So... could you give me your opinions? I would prefer the comments be written on the blog, but you can write them here too...
http://pedrocgd.blogspot.com/2007/10/bicasestudy-package-v2.html
Kind regards and thanks!!
Good work!!
View 1 Replies
View Related
Apr 23, 2008
Have you faced any performance issues with SSIS?
View 6 Replies
View Related
Feb 19, 2008
I created a data flow that transferred about 1 million records from a SQL database on one server to a different SQL database on the same server. The processing took about 30 minutes. I used the Fast Load option.
I then created an Execute SQL Task with a "SELECT * INTO" statement, and this processing took about 30 to 60 seconds.
Can someone tell me why the Data Flow Task would take so much longer, or explain the differences between the two options above? Can someone give some pointers on how to make a Data Flow Task more efficient?
Thanks.
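For reference, a sketch of the kind of statement the Execute SQL Task would run (database and table names are assumptions): because source and destination are on the same server, the copy stays entirely inside the engine instead of pulling every row out through the SSIS pipeline and back in, and SELECT ... INTO can be minimally logged depending on the recovery model.
-- Illustrative names: same-server copy done entirely inside the engine.
SELECT *
INTO TargetDB.dbo.CustomerCopy
FROM SourceDB.dbo.Customer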
View 11 Replies
View Related
Nov 14, 2005
I have a multi-threaded C# application that loads a bunch of tables into ADO.NET DataSets in memory for surrogate key lookups. Depending on what else is going on, it can process 100,000 to 170,000 rows per minute and usually utilizes 20-30% of each CPU.
View 8 Replies
View Related
Mar 7, 2008
Hi,
I created a procedure which completes execution in 20 minutes in SQL Server 2005, but if I put the same procedure in an Execute SQL Task in SSIS and execute it, it takes 3 hours.
Is there any way to improve the performance of the above?
Any help would be appreciated
Thanks
Dinesh
View 9 Replies
View Related
Oct 11, 2006
On 32 bit SSIS installations, both of the following performance counter objects are visible in perfmon.
SQLServer:SSIS Service
SQLServer:SSIS Pipeline
On 64 bit SSIS installations, only the following is available.
SQLServer:SSIS Service
The SQLServer:SSIS Pipeline counters are nowhere to be found.
Should I re-install? Is this a known issue with 64-bit SSIS?
P.S. Remote or local administrative access with perfmon makes no difference; the "SQLServer:SSIS Pipeline" performance counters don't appear in the list box when connecting to the Windows 2003 x64 server.
View 4 Replies
View Related
Jul 27, 2006
Hello,
I have been running massive ssis packages and testing the performance.
This is my execution design:
I have a main package that gets a list of packages to execute from a table.
Then, using a foreach loop, I send each package to execute (somewhere in the middle I delete the corresponding old log file for that package); each of the packages configures itself from the parent package variables.
What I have been analysing tells me that, for example, a package runs in 2 minutes and then the time wasted between the end of that package and the start of the next task is on average 3 to 6 minutes... that's a lot, since I run about 20x12 packages, which adds up to about 20 hours of wasted time.
My question is: what could be causing the delay between the end of one package and the start of the next?
The task types I am using in the execution controller package are:
Foreach loops, For Loop, File System task, Execute Package Task and some Execute SQL Tasks
Best Regards,
Luis Simões
View 15 Replies
View Related
Apr 1, 2008
Hi,
Can anyone tell me how to check the performance of an SSIS package?
Thanks
View 4 Replies
View Related
Nov 17, 2006
Hi,
Can anyone tell me the best way in SSIS to log performance at the control flow level, i.e. per task in my control flow, and what performance characteristics it is possible to log?
Thanks in advance
View 1 Replies
View Related
Dec 28, 2007
We have SSIS installed and everything is working great. We are now to the point of wanting to tune one of our longer-running packages, and the performance counters are not working at all. They show up OK, but the counters always read 0. Is there anything special I have to do to get this to work?
One comment I found was that the Performance Logs and Alerts service needs to be running to see these counters. I tried to start it and it immediately quit. I set it to automatic startup and ran a package. The counters still read 0.
Is there anything else I can try to get these counters to return something? Thanks.
Aaron
View 9 Replies
View Related
Apr 15, 2007
I am in a scenario where my tables are refreshed every morning by a batch update. I have built a few views off of one table. To increase speed, I would like to take all the rows from one of the views and insert them into their own table. I know this can be done with some T-SQL, but I'm a noob and don't know specifically how to do it. Any detailed help would be greatly appreciated. -Nate
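A hedged sketch (the view and table names are placeholders): SELECT ... INTO creates the table from the view in one statement, and dropping any existing copy first lets the batch be re-run after each morning's refresh.
-- Illustrative names: re-create a table from a view after the morning refresh.
IF OBJECT_ID('dbo.DailySnapshot', 'U') IS NOT NULL
    DROP TABLE dbo.DailySnapshot

SELECT *
INTO dbo.DailySnapshot
FROM dbo.vw_DailySnapshot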
View 1 Replies
View Related