Any Problems With Large Packages?

Aug 28, 2007

I've been suffering with a variety of symptoms lately, and I don't know why:



Can only debug once in BIDS. After that, it can never attach to DtsDebugHost.exe. DtsDebugHost.exe never receives any CPU time, and never grows past its initial memory allocation.

Save takes a long time with 100% CPU usage. This can be as simple as opening a SQL Server Destination component to correct the metadata after a column was added to the underlying table, then saving. It takes several minutes at 100% CPU.

I've had several Visual Studio crashes today (DW has reported these).
I've read here about people with 10 MB or 17 MB packages, so I feel bad complaining about my 3.2 MB package! However, something is going on, and I'd like to know what it is, so I can debug again.

Thanks for any help.

View 4 Replies



Problem With Saving Large SSIS Packages

Jan 25, 2008



When I work with large dtsx files I have problems saving them in Business Intelligence Studio.
The problems arise when a dtsx file is larger than 7-8 MB.
Any hints about it?

View 5 Replies View Related

Running A Large Number Of SSIS Packages (with Dtexec Utility) In Parallel From A SQL Server Agent Job Produces Errors

Jan 11, 2008

Hi,

I have stumbled on a problem with running a large number of SSIS packages in parallel, using the "dtexec" command from inside an SQL Server job.

I've described the environment, the goal and the problem below. Sorry if it's a bit too long, but I tried to be as clear as possible.

The environment:
Windows Server 2003 Enterprise x64 Edition, SQL Server 2005 32bit Enterprise Edition SP2.

The goal:
We have a large number of text files that we're loading into a staging area of a data warehouse (based on SQL Server 2k5, as said above).

We have one "main" SSIS package that takes a list of files to load from an XML file, loops through that list and, for each file in the list, starts an SSIS package by using the "dtexec" command. The command is started asynchronously by using the system.diagnostics.process.start() method. This means that a large number of SSIS packages are started in parallel. These packages perform the actual loading (with BULK insert).

I have successfully run the loading process from the command prompt (using the dtexec command to start the main package) a number of times.

In order to move the loading to a production environment and schedule it, we have set up an SQL Server Agent job. We've created a proxy user with the necessary rights (the same user that runs the job from the command prompt) and created the SQL Agent job (there is one step of type "CmdExec" that runs the "main" SSIS package with the "dtexec" command).
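For reference, a minimal sketch of that kind of job setup is below (job, proxy, and file names are hypothetical, not the actual ones used here; sp_add_jobserver and the schedule are omitted):

-- Hypothetical job with one CmdExec step that runs under the proxy account
EXEC msdb.dbo.sp_add_job
    @job_name = N'Load staging files';

EXEC msdb.dbo.sp_add_jobstep
    @job_name   = N'Load staging files',
    @step_name  = N'Run main SSIS package',
    @subsystem  = N'CmdExec',
    @proxy_name = N'SsisLoadProxy',
    @command    = N'dtexec /FILE "D:\Packages\MainLoader.dtsx"';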

If the input XML file for the main package contains a small number of files (for example 10), the SQL Server Agent job works fine – the SSIS packages are started in parallel and they finish work successfully.

The problem:
When the number of the concurrently started SSIS packages gets too big, the packages start to fail. When a large number of SSIS package executions are already taking place, the new dtexec commands fail after 0 seconds of work with an empty error message.

Please bear in mind that the same loading still works perfectly from command prompt on the same server with the same user. It only fails when run from the SQL Agent Job.

I've tried to understand the limit at which the packages start to fail, and I believe that the threshold is 80 parallel executions (I understand that it might not be desirable to start so many SSIS packages at once, but I'd like to do it despite this).

Additional information:

The dtexec utility provides an error message where the package variables are shown and the fact that the package ran 0 seconds, but the "Message" is empty ("Message: ").
Turning the logging on in all the packages does not provide an error message either, just a lot of run-time information.
The try-catch block around the process.start() script in the main package's script task also does not reveal any errors.
I've increased the "max worker threads" number for the CmdExec subsystem in the msdb.dbo.syssubsystems table to a safely high number and restarted the SQL Server, but this had no effect either.

The request:

Can anyone give ideas what could be the cause of the problem?
If you have any ideas about how to further debug the problem, they are also very welcome.
Thanks in advance!

Eero Ringmäe

View 2 Replies View Related

Calling SSIS Packages From ASP.NET - Packages With File System Tasks End Abruptly

Jan 9, 2007

I've run into a problem with SSIS packages wherein tasks that write or copy files, or create or delete directories, quit execution without any hint of an error or a failure message, when called from an ASP.NET 2.0 application running on any machine other than the one where the package was created. By all indications it appears to be an identity/permissions problem.

Our application involves a separate web server and database server. Both have SQL Server 2005 installed, but the application server originally had only Integration Services. The packages are deployed to the file system on the application server, and are called using Microsoft.SqlServer.Dts.Runtime methods. For all packages that involve file system tasks, the above problem occurs.

When the above packages are run using the command prompt (either DTEXEC or DTEXECUI) the packages execute just fine. This is expected since we are using an administrative account. However when a ShellExecute of the same command is called from ASP.NET, the same problem occurs.

I've tried giving administrative permissions to the ASPNET worker process user to no avail.

I have likewise attempted to use the SQL Server Agent job approach but that approach might not be acceptable for our clients since it means installing SQL Server 2005 Database services on the application server.

I have read the relevant threads in this forum, namely http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1044739&SiteID=1 and http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=927084&SiteID=1 but failed to find any solution appropriate for our set up.

Anybody got any idea on how to go about this?

View 33 Replies View Related

Integration Services :: Remotely Execute Packages On SSIS Server - Packages Are Deployed In File System

Apr 22, 2015

We manage some SSIS servers, which have only SSIS and the SSIS tools installed on them and not the SQL Server database engine.

SSIS packages and configuration files are deployed on a NAS. We run the SSIS packages through DTEXEC by logging in to the server.

We want to allow developers to run their packages on their own on the server, but at the same time we don't want to give them physical access to the server, i.e. we do not want to add them to the RDP users list in the server properties. We want to allow them to run their packages remotely on the server.

One way we could think of is PowerShell remoting, and we are working on that. But is there any other way, or any tool already available, for this?

View 4 Replies View Related

Upgrading System To Run DTSX Packages Instead Of DTS Packages

Aug 2, 2006

Hi all

Our data management system currently runs DTS packages using DTSPKG.dll.

I am currently looking at the possibility of replacing the DTS packages and SQL 2000 with DTSX packages using SSIS in SQL 2005.

Do I need a new dll, or will the current dtspkg.dll handle the new DTSX packages?

Many thanks in advance!

View 1 Replies View Related

Large DataBase

Jul 23, 2007

What happens if you have a website and the hard drive on your server is, say, 250GB, and then the database exceeds that and grows to 300GB in size? How would you spread your database across two different hard drives? Thanks

View 2 Replies View Related

Large Tables In SQL 7.0

Jul 12, 2001

We currently have a data warehouse running on SQL 7.0, SP2. One of our primary fact tables is now well over 155 million rows in it. The table is not very wide, as it only contains 17 columns, most of which are defined as integers. The entire database is only 20 GB.

The issue is that the loads from the staging table to this fact table have significantly deteriorated over the last month or so, dropping from over 400 transactions per second to around 85. We drop all the indexes on the fact table before we load the data into it.

Are there issues with a manageable table size in SQL 7.0 that we need to be concerned about? And should we consider partitioning the table into several smaller tables and join them with a "union all" view?
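For illustration, the kind of UNION ALL view being considered might look like this (table names and layout are hypothetical; each member table would hold one date range enforced by a CHECK constraint):

-- Partitioned view over monthly member tables
CREATE VIEW dbo.FactSales AS
SELECT * FROM dbo.FactSales_200101
UNION ALL
SELECT * FROM dbo.FactSales_200102
UNION ALL
SELECT * FROM dbo.FactSales_200103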

I really need to get this performance issue resolved, as our IT support vendor is pushing us to port the data warehouse to UDB because they tell us that SQL server is not scalable enough to handle this volume of data.

Thanks for any help you can provide.

George M. Parker

View 6 Replies View Related

Maintenance A Very Large Db

Aug 31, 2001

Hi, anyone administering a pretty big database (not less than 30 GB, with the average number of rows in a table around 2M and more), please share your experience with the maintenance of such a db. Especially I'm interested in:

1) Index maintenance (when and how - just regular dbcc, a maintenance plan, or some script to split the job in two, and so on).

2) Removing unused space from the db (not major).

The server works 24*7, and it's a transactional environment. SQL 7 SP3 on a cluster.

I run the sp to rebuild all the indexes; it takes about 2-3 hrs to determine the objects with fragmentation less than 80% and actually rebuild them, and during this process the users experience performance problems (especially for update/insert). It looks like I need to change the plan or strategy for doing this. Any thoughts appreciated!

Thanks in advance.
Dmitri

View 2 Replies View Related

Large Tables

Aug 10, 2000

Hi,

How can I partition large tables so that the inserts and updates I am doing on them take less time?

I want to know how I can partition large tables, and if I do that, how the performance is going to be improved.

Thanks.

View 1 Replies View Related

Log Files Too Large.

Oct 20, 2000

Hi,

I have inherited some databases with extremely large log files.
I tried to truncate the transaction log but it did not work.
Can somebody please tell me how to truncate these log files.
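For reference, on SQL Server 7.0/2000 the usual sequence is to truncate the inactive part of the log and then shrink the physical file (database and logical log file names below are hypothetical; sp_helpfile shows the logical name):

BACKUP LOG MyInheritedDb WITH TRUNCATE_ONLY
DBCC SHRINKFILE (MyInheritedDb_Log, 100)   -- target size in MB

Bear in mind that truncating the log this way breaks the log backup chain, so a full backup should follow.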

Thanks in advance.

Attaullah

View 2 Replies View Related

Large Database

Apr 17, 2003

Hello,

I would like to know whether MS SQL Server 2000 can handle a large database, around 28 GB in size. Currently we are using Sybase but we want to migrate to SQL Server 2000. Are there any performance issues?

Any help is appreciated.

Thanks

Sejal

View 1 Replies View Related

Large Tables

Mar 13, 2001

How can I find largest 5 or 10 tables in a database?
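One minimal sketch, assuming the SQL Server 7.0/2000 system tables (reserved space is counted in 8 KB pages):

-- Top 10 user tables by reserved space
SELECT TOP 10
    o.name              AS table_name,
    SUM(i.reserved) * 8 AS reserved_kb
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id
WHERE o.type = 'U'
  AND i.indid IN (0, 1, 255)
GROUP BY o.name
ORDER BY reserved_kb DESC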

Thanks in advance
Chan

View 2 Replies View Related

Hi Availability && DR For Large No. Of DBs

Feb 11, 2005

Hi,

I want some help for this particular scenario.

I have more than 350 databases, growing at an average of 5 databases per month.

I am concerned about the DR strategy to be adopted.

Can anyone help me?

View 1 Replies View Related

Large BCP Problem

Oct 6, 2005

OS - Window 2003 Standard
SQL - 2000 Standard SP4
2 CPU 4 GB Memory

We have a table that is approximately 155 GB with 450 million rows, 90 days' worth of data, and normalized pretty tight. Obviously this beast is too much to handle (reporting, maintenance) so we are partitioning the data off into weekly tables. I will never need to store more than 90 days' worth of data, so at the end of each week we will truncate the oldest week's table and change the constraint on it to handle the next week. Anyway, we have done this and it is working well in 2 testing environments with a lot less data.

I need to migrate the data from the 1 table of 450 million rows to the 13 weekly tables that will approximately make up the 90 days' worth of data. I hoped to BCP out the data and Bulk Insert it (fast mode). Even 1 week (approximately 35 million rows) takes forever, and I don't even see it start to output. I have changed the batch size to be larger. Is there a better/faster way to do this, or am I just going to have to wait? I'm open to any suggestions.
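For illustration, the bulk-load half of that approach would look roughly like this per weekly table (file path and table name are hypothetical):

-- Native-format file produced by bcp out; TABLOCK can allow minimally logged loads,
-- recovery model permitting
BULK INSERT dbo.FactWeek01
FROM 'E:\Export\Week01.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 1000000, TABLOCK)

An alternative worth testing is a direct INSERT ... SELECT per week, which avoids the round trip through files entirely.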

Note: Hardware is all RAID 1+0, reading from 1 array and writing to a different array. Performance just seems slow, but I'm not sure what to expect from something this big.

Ideas.

View 1 Replies View Related

Database Is Is Too Large Need Help

Aug 17, 2006

We have many installations of our shopping cart database. One specific database is huge, now about 25 GB, compared to the others that range from 20 to 75 MB. The server this one resides on has three other instances of the same database that are normal size.

In a particular table in the large database there are 9,700 rows taking 380 MB; the same table in a normal db has 162,000 rows and takes 6 MB. The tables are identical and the indexes are the same.

Any ideas out there?

View 6 Replies View Related

Large DB Backup

Jun 19, 2008

I have this large DB, around 250GB, in which 70% of the space is occupied by one single table. This is not a high-transaction DB; it has only a few users.

Backup is taking 2:30 hours!

Is there a way I can reduce this time?

In desperation, I am thinking of doing,

1. detach the DB (off peak hours)
2. copy .mdf and .ldf files (backup)
3. attach the DB
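A minimal T-SQL sketch of steps 1 and 3 above (database name and file paths are hypothetical; the database is unavailable between the two steps):

-- Step 1: detach (requires no active connections)
EXEC sp_detach_db 'BigDb'

-- Step 3: reattach after the files have been copied
EXEC sp_attach_db 'BigDb',
    'D:\Data\BigDb.mdf',
    'E:\Log\BigDb_log.ldf'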

Any suggestion ?



------------------------
I think, therefore I am - Rene Descartes

View 20 Replies View Related

Error.log Too Large

Jan 10, 2007

This file is getting up to 110 GB. Why?

View 4 Replies View Related

Transaction Log Too LARGE

Jan 30, 2007

I often in my job come across the following scenario:

Client rings up and says they have run out of server space because the SQL 2000 transaction log has consumed all the space, or a very large portion of it.

What is the correct procedure for resolving this ASAP when working with full-recovery-mode SQL 2000 databases? I ask because I have other
guys within my company who all do different things, with good results and sometimes bad results.

The procedure i use is the following:

METHOD 1

1. Back up both the database and the transaction log.
2. Right-click the database and select Detach, which from my understanding is a clean detach method that ensures uncommitted transactions are committed to the database.
3. Rename the old transaction log to .ldfOLD.
4. Reattach the database, which creates a new transaction log.

METHOD 2

1. I don't use this method, but I'm pretty sure it's risky; if possible,
can someone provide me with the reasons why:

1. Change the database recovery model from full to simple mode, shrink the logs,
and then change back to full mode.
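For reference, METHOD 2 in T-SQL would roughly be the following (database and logical log file names are hypothetical; the full backup at the end restarts the log backup chain):

ALTER DATABASE MyClientDb SET RECOVERY SIMPLE
DBCC SHRINKFILE (MyClientDb_Log, 500)   -- target size in MB
ALTER DATABASE MyClientDb SET RECOVERY FULL
BACKUP DATABASE MyClientDb TO DISK = 'E:\Backups\MyClientDb_full.bak'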



Basically what I am asking is: what is the fastest way to sort out the above issue most effectively, and with the ability to roll back successfully?

PLEASE don't comment on why the transaction log is so big, as I don't want to look into that now. All I am asking is what is the most effective method to shrink the log down and save space.

View 13 Replies View Related

PK On A Large Table

Nov 16, 2007

I am developing an application that has a table with lots of records (network traffic), but the data is summarized every so often to create summary records (old records are deleted). The problem is that I have a PK based on an autoincrement ID (int) that will run out of numbers. However, this ID is not referenced anywhere (it is not a foreign key from another table, it is not used for deletion, and there are no updates in this table whatsoever).

So my possibilities are:
1.- reseed the id when it is about to run out.
2.- make the id bigint.
3.- remove the id and change the PK to 2 other fields.
4.- remove the id and go without a PK.
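For reference, option 1 is a one-liner (table name is hypothetical); option 2 would also mean dropping and recreating the PK constraint, since the data type of a primary key column cannot be altered in place:

-- Option 1: reseed once the old rows have been summarized and deleted
DBCC CHECKIDENT ('dbo.TrafficLog', RESEED, 1)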

I am leaning toward option 4, because I do not see the need for a PK, but I understand that it is quite out of the norm. So I would like to hear from other people (I do not have much experience with DBs).

I also like option 3. I already have a index on one of the other fields (time).

Any input will be appreciated.

Claudio Robles

View 7 Replies View Related

Large Datasets

Mar 27, 2008

Hi,

I have a table which has over 450,000 records in it. I have now split this into 4 so each table has around 100,000 records in it, but I'm still having the problem of the data being returned really slowly.

What I need to do to this data is group it by a code and show the total for each code for every month of the year (this is currently based on one column and selecting the data accordingly). I have created views and put some indexes onto my table but the results are still being returned slowly. Does anyone have any suggestions of how I can speed this up?
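For illustration, the grouping described above is essentially this query (table and column names are hypothetical):

SELECT Code,
       DATEPART(yyyy, EntryDate) AS EntryYear,
       DATEPART(mm, EntryDate)   AS EntryMonth,
       SUM(Amount)               AS MonthTotal
FROM dbo.LargeTable
GROUP BY Code, DATEPART(yyyy, EntryDate), DATEPART(mm, EntryDate)

An index covering the code and date columns is typically the first thing to check for this shape of query.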

Thanks

Gemma

View 5 Replies View Related

Large SQL Export

Mar 30, 2008

I'm trying to export data from an SQL table using classic ASP. In the past, I have created tab delimited text files using code similar to the following code:

' Send the response as a downloadable tab-delimited text file
Response.AddHeader "Content-Disposition", "attachment;filename=results.txt"
Response.ContentType = "application/octet-stream"

' Header row
Response.Write "Field1" & vbTab
Response.Write "Field2" & vbTab
Response.Write "Field3" & vbTab
Response.Write vbCrLf

' Open a static recordset (3 = adOpenStatic, adLockOptimistic) over the existing connection "con"
RSb.Open "SELECT * FROM tblData1 WHERE ID=1 ORDER BY txtField1", con, 3, 3
Do Until RSb.EOF
    Response.Write RSb("txtField1") & vbTab
    Response.Write RSb("txtField2") & vbTab
    Response.Write RSb("txtField3") & vbTab
    Response.Write vbCrLf

    RSb.MoveNext
Loop
RSb.Close

This always worked fine in the past because the tables were small. Now, the tables are exporting 100,000 records and getting errors. I'm setting the page timeout and the sql connection timeout, but I'm still getting errors. I'm not exactly sure what's happening, but the export stops after exporting about 5Mb. It varies where it stops, so I'm thinking it's timing out somewhere.

Is there a better way to do this? Possibly use an export function in MS SQL that would export faster?

TIA

- PR

View 4 Replies View Related

BCP Large Table.

Jul 23, 2005

If I use BCP to export a very large table, will that table be blocked for writes during the export process? I don't want to prevent users from accessing that table during the bcp process. Thank You, TFD.

View 1 Replies View Related

Large Inserts

Jul 5, 2007

Dear Experts, What is the best way to do a large insert WITHOUT having direct access to the machine SQL Server is running on? For example, imagine I want to insert something like 20,000 records. If I were to have access to the server, I could BULK INSERT into a temp table and then insert into the destination table. But if I can't create a file on the server to use for BULK INSERT, what is the next best alternative to doing lots of 1-record insert statements? Thanks, -Emin

View 4 Replies View Related

Best Practices For Large DB

Jul 20, 2005

Hi All, My question is: what are the best practices for administering large DBs? (My coworker is the DB administrator. I'm more of the developer. But slowly being sucked in.) My main concern is that we have some DBs that take approx 3 hrs a night just to rebuild the indexes. I know that with MSSQL 2000, I can use partitioned views to break out the table(s) into smaller databases and tables. But we also have an older server that runs MSSQL 7. Lastly, how do you handle drive space issues? Do you spread out the DB across multiple MDF files on different drives? Thanks in advance.

View 1 Replies View Related

Large Text

Mar 18, 2008



Hi,
I have one column which contains some large data, more than 30,000 characters. When I exported the data to an Excel file, some data in this column showed as #######. These # characters are causing a problem in the SSIS package. If I delete that data (####) then the SSIS package works fine.
How do I solve this problem?

Thanks in advance.

View 1 Replies View Related

Log File Is Too Large

May 28, 2008

I administer an application that runs on SQL 2000, without being an SQL expert myself. I have observed that if I create a maintenance plan then the log file gets trimmed, but otherwise the log file keeps growing. All was going well at one site, but recently the log file has become huge. The server is also running out of disk space. Is that the problem? I'm not sure how to control this. Can I delete a log and then it will re-create? Two smaller databases share the same maintenance plan and these logs are small. Thank you.

View 6 Replies View Related

Database Has Gotten Relatively Large

Sep 12, 2006

I upgraded a database from SQL2000 to SQL2005 and it went pretty smooth. After the transition was made, I backed up the DB. The size of the database was as expected, about 5 GB. About 5 hours later, a maintenance plan executed a few optimization jobs in the following order: Reorganize Index, Rebuild Index, Shrink Database, Update Statistics. Soon after that, another job backed up the database and it was then 32 GB. During the time after the first backup, no one was really using the database.

I've been trying to track this down for several days. Does anyone have any ideas for me?

Thanks,
Bobby Crawford

View 1 Replies View Related

Why Is Database So Large?

Dec 4, 2006

Hello!
I have a database with one table inside. The table has six columns with the following datatypes:
col1 -->smallint (2byte)
col2 -->int (4byte)
col3 -->smallint (2byte)
col4 -->smallint (2byte)
col5 -->smallint (2byte)
col6 -->int (4byte)
I have inserted 1.359.320 rows of data and the size of the sdf file is 40.116.224 bytes.
According to my calculation:
1.359.320 * 2byte + 1.359.320 * 4byte + 1.359.320 * 2byte + 1.359.320 * 2byte +1.359.320 * 2byte + 1.359.320 * 4byte = 21.749.120 byte
I hope somebody can explain to me why the database is so large.

Thank you
Sascha

View 4 Replies View Related

How To... Backup Large DB...

Jul 23, 2007

hardware configuration:



2 node cluster, active-passive

+

for backup: direct attach tape library (Dell PowerVault 132T with 2 tape drives) on 1 node. Tape type is LTO (100 GB, and with compression 200 GB). Last time, one filegroup backup ended with errors:



Job 'BackupFG_index_ToTape1' : Step 1, 'step' : Began Executing 2007-07-02
Job 'EscortBackupFG_index_ToTape1' : Step 1, 'step' : Began Executing 2007-07-16 01:00:01

10 percent backed up. [SQLSTATE 01000]
20 percent backed up. [SQLSTATE 01000]
30 percent backed up. [SQLSTATE 01000]
40 percent backed up. [SQLSTATE 01000]
50 percent backed up. [SQLSTATE 01000]
60 percent backed up. [SQLSTATE 01000]
70 percent backed up. [SQLSTATE 01000]
Msg 4028, Sev 16: End of tape has been reached. Remove tape '\\.\tape1' and mount next tape for BACKUP DATABASE...FILE=<name> of database 'pakdb'. [SQLSTATE 42000]


I'm using native SQL Server backup; the T-SQL script:



BACKUP DATABASE pakdb
FILE = 'pakdb_index',
FILEGROUP = 'db_index'
TO TAPE = '\\.\tape1' WITH INIT, FORMAT, STATS, NOUNLOAD


I have 2 ways of solving this problem:

1) use some backup software (Veritas, for example)

2) break the large filegroup into several small filegroups - not a good option



Question:

How do you solve this kind of problem? Using Veritas (for example) is a problem too: if the server dies completely and the Veritas repository with it, how can I restore the information from my tapes? (That's my concern.)

View 4 Replies View Related

Very Large Recordset

Aug 10, 2007



I have a stored proc which returns a very large recordset (150,000 rows +). Each row has up to 65-95 fields. I need to do row transformations on each row.

I am using an Execute SQL Task to execute the stored procedure and return the results into an Object variable, which I then shred inside a Source Script Component (by casting it into a recordset/datatable, looping it, and assigning to the Output columns).

Obviously the recordset variable is enormous - stuffed with 150K large rows. But what choices do I have in this scenario? Can I persist the recordset variable to disk? I cannot stream the result of a stored proc execution. So what do I do about the sluggish performance of this enormous variable?

thanks in advance

View 3 Replies View Related

Large Lookups

Apr 28, 2008

Hello,

I have a source table with a few million rows in it. As part of the transformation, I need around 10 lookups against 10+ different tables, all of them having a few million rows each. I am looking for an approach that would be reasonably speedy and easy to manage for future changes.

Here are some of the things that I have tried...
(1) If I implement them as Lookup components, they cause the developer machine to go really slow and it takes forever to run.
(2) I tried having the OLE DB Source query fetch the required data up front. But the source query becomes very complicated, which will be even harder to maintain for future changes. And this big query causes SQL Server to become unresponsive.
(3) Update queries on the target table are also causing the server to be unresponsive.

What would you guys suggest for this type of implementation?

Kapil

View 5 Replies View Related

Updating A Large Set Of Data

Oct 4, 2006

Hello guys, here is my problem: I am developing an ASP.NET web app in .NET 2.0. I have some sensitive data in my database, which is encrypted using DES (with a key which is known only by the top-level authorities). Now there is an option of changing the secret key. On changing the key, the sensitive data has to be decrypted using the old key and then encrypted again using the new key. Now if the number of records increases, I am afraid that it might take a long time and the application might look as if it has hung. Guys, I have no clue on how to do this. If you have any idea on how to implement this, please let me know. Any help would be appreciated. Vignesh

View 7 Replies View Related






