Import Large Field From SSIS

Apr 20, 2007

Hi,

I am making an SSIS package that imports data from an application using a custom ODBC driver. The field in the application is set to be a "longvarchar" type field and can hold anywhere from 2 characters to 2 MB of data.

I've created an ODBC data connection in the SSIS package and use a "DataReader Source" to read the data I need. The SQL statement is very simple:


Select log from tablename
When I try to run the SSIS package with that statement, it just goes to yellow on the DataReader Source and stops, and it stays like that until I stop it. If I select other fields, excluding that field, it works fine. I've also been able to retrieve the log field successfully when I select a log record that's not too big: the largest one I've been able to get is 800 characters, but one with 2,500 characters just stops on yellow.
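
To narrow down where the size threshold sits, one diagnostic sketch (assuming the custom ODBC driver supports SUBSTRING, which is not a given; log and tablename are the names from the query above) is to bound the column in the source query and raise the cap until the hang reappears:

Select SUBSTRING(log, 1, 8000) As log_part From tablename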

In the Progress log the last line says:

[DTS.Pipeline] Information: Execute phase is beginning.
Does anyone have any ideas on how to resolve this?


DTS: Import Large Database

Oct 8, 2004

Importing data from an Access database, I cannot get past a limit of 1,000 records.
In DTS I "copy one or more tables", select the tables, and run, but I cannot see my 1,052 entries.
Where can I set a maximum size of ~1,500 in my SQL target database?
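
Before changing any setting, it may be worth checking whether all 1,052 rows actually arrived and only the grid is cutting the display off at 1,000, which some tools do. A quick check (the table name is a placeholder):

SELECT COUNT(*) FROM dbo.MyImportedTable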


V Large Data Import

Mar 19, 2001

I was wondering if anyone can help me.

I am trying to import data into SQL Server 7. The table will be 700-800 columns, and the data will be about 150,000 records at a time.
The data source is a flat file.

First I create the table using a database schema, and secondly I would like to populate the table.
The problem is that most of the data is numeric and is to be used for statistical analysis.

So far I have tried BULK INSERT, bcp, and DTS.
DTS is the only method that has worked in any way, shape or form, but it requires importing each column as a varchar. Importing into my pre-created table doesn't work, because DTS interprets some of the source columns as character data and refuses to insert them into an int field.
BULK INSERT and bcp both give error messages, and I wonder whether that is because of the size of the insert statement required to handle so many fields.

For the moment I am just trying to import the data in any way, but eventually, it will have to be run as an automated process, with the table structure probably needing to be altered as well.
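
One pattern worth trying (a sketch under assumptions: the file path, delimiters, and column names are hypothetical, and the staging DDL would be generated from the schema rather than typed by hand) is to BULK INSERT everything into an all-varchar staging table first and convert in a second step, so type guessing never sees the raw file:

-- Stage 1: land every field as varchar so nothing is rejected on type.
CREATE TABLE dbo.StatsStaging (
    col001 varchar(50),
    col002 varchar(50)
    -- ... one varchar column per source field
)

BULK INSERT dbo.StatsStaging
FROM 'C:\data\stats.txt'   -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- Stage 2: convert into the real table; ISNUMERIC can flag bad rows first.
INSERT INTO dbo.Stats (col001, col002)
SELECT CAST(col001 AS int), CAST(col002 AS int)
FROM dbo.StatsStaging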

Any help/suggestions would be very gratefully received.


Import Large Access Database

Jun 21, 2007

Hello, All:

I have many, many Access databases that are roughly 1.5 to 3 GB each, and they have millions of records. Each MS Access database file corresponds to one database in SQL Server. I'm trying to simply transfer the data as it is in Access to MS SQL 2005.

I'm using the 64-bit version of Windows Server 2003 and the 64-bit version of SQL 2005. The server is running four dual-core AMD Opteron processors and has 8 GB of RAM with a 1 TB RAID 5 configuration. I think the hardware should be sufficient, but the SQL Server Import and Export Wizard can't seem to handle the large number of tables/records. If I do one table at a time, it works well; however, it produces the following error message whenever I try to import the entire database:

Pre-execute (Error)
Messages
Error 0xc0202009: {5A5BF7AD-E86B-4316-AD43-1912358C56F4}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Unspecified error".
(SQL Server Import and Export Wizard)

Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)

Error 0xc004701a: Data Flow Task: component "Source 64 - District Corporal Punishment Class" (5743) failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard)
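
Since single tables go through fine, a per-table pull driven from T-SQL may be a workable fallback. A sketch (the .mdb path and target table name are placeholders; OPENROWSET requires the 'Ad Hoc Distributed Queries' option to be enabled, and the Jet provider exists only in 32-bit form, which may matter on a 64-bit instance):

SELECT *
INTO dbo.DistrictCorporalPunishment
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\data\district.mdb'; 'Admin'; '',
                'SELECT * FROM [District Corporal Punishment Class]')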

Any ideas would be much appreciated!
Thank you,
Cody


SQL Import Of Large Amount Of Data

Oct 7, 2005

This is a general question on the best way to import a large amount of data to a MS-SQL DB. I can have the data in just about any format I need, I just don't know how to import it. I have some experience with SQL but not much. There are about 1,500 to 2,000 lines of data, and I am looking for the best way to get this amount of data in on a monthly basis. Any help is greatly appreciated!! Mike Charney
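
For a modest monthly load like this, BULK INSERT from a delimited file is a common route (a sketch; the table, path and delimiters are hypothetical):

BULK INSERT dbo.MonthlyData
FROM 'C:\imports\monthly.csv'   -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n',
      FIRSTROW = 2)             -- skips a header row, if there is one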


VERY Large Binary Import/export Headache

Oct 13, 2006

Hi,

I am currently importing (and exporting) binary flat files to and from Db fields using the TEXTPTR and UPDATETEXT (or READTEXT for export) functions. This allows me to fetch/send the data in manageable packet sizes without the need to load complete files into RAM first.

Given that some files can be up to 1 GB in size, I am keen to find a new way of doing this since the announcement that TEXTPTR, READTEXT and UPDATETEXT are going to be removed from T-SQL.

I had a quick foray into SSIS but couldn't find anything suitable which brings me back to T-SQL. If anyone knows a nice elegant way of doing this and is prepared to share, that would be grand.
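
The documented successors are the max data types: a varbinary(max) column written packet by packet with the .WRITE clause and read back in slices with SUBSTRING, which keeps the same incremental pattern. A minimal sketch (table, column, key and sizes are hypothetical):

DECLARE @chunk varbinary(max), @offset bigint
SET @chunk = 0x0102030405   -- next packet arriving from the client
SET @offset = 1             -- SUBSTRING offsets are 1-based

-- Append one packet at a time; a NULL offset means append at the end.
UPDATE dbo.Files SET Blob.WRITE(@chunk, NULL, NULL) WHERE FileID = 1

-- Read the value back in 64 KB slices instead of all at once.
SELECT SUBSTRING(Blob, @offset, 65536) FROM dbo.Files WHERE FileID = 1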

Thanks for your time,
Paul


SQL 2012 :: Create Script That Will Import Large XML Files?

Jul 28, 2014

I need to create a script that will import large XML files (500 - 7GB) on a daily basis and store the data in a relational DB structure.

What is the best and fastest way of importing such files? I have played around with smaller files and found the following.

1. SSIS XML Data Source: it doesn't seem to like the complex element types and throws out the file.
2. Bulk file import, storing the file in an XML variable and using XQuery to parse it (sketched below): this works, but it can't take a file of more than 2 GB, so I can't use this method.
3. C# + XML serialization: this also works, but seems terribly slow. I open the DB connection once, so it doesn't open and close for each DB call, but it still seems to take a long time.

So how do I import large XML quickly into a relational table structure?
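
For reference, a minimal sketch of method 2 as described above (the file path, element names and target table are hypothetical; the xml data type's 2 GB cap is exactly the limit mentioned):

DECLARE @doc xml
SELECT @doc = BulkColumn
FROM OPENROWSET(BULK 'C:\feeds\daily.xml', SINGLE_BLOB) AS f

INSERT INTO dbo.Orders (OrderID, Amount)
SELECT n.value('(OrderID/text())[1]', 'int'),
       n.value('(Amount/text())[1]', 'decimal(18,2)')
FROM @doc.nodes('/Orders/Order') AS t(n)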


Field Size Too Large

May 9, 2008

I have set up transaction replication between two databases. Data from a table in the first database is replicated to the same table in another database.

The table at the publisher already has some data in it. The table at the subscriber is empty. When the replication is synchronizing, I get the following errors in the replication monitor:
*The process could not bulk copy into table "dbo"."virtualdatalocations_waitingqueues". (Source: MSSQL_REPL, Error number: MSSQL_REPL20037) Get help: http://help/MSSQL_REPL20037
*Field size too large

The table looks like this:
CREATE TABLE virtualdatalocations_waitingqueues (
    dataid int,
    personid int,
    queueid int,
    CONSTRAINT FK_vw_dataid
        FOREIGN KEY (dataid) REFERENCES datalocations(id) ON DELETE CASCADE,
    CONSTRAINT FK_vw_personid
        FOREIGN KEY (personid) REFERENCES persons(id),
    CONSTRAINT FK_vw_queueid
        FOREIGN KEY (queueid) REFERENCES waitingqueues(id)
);

It used to run fine in the past. I couldn't find any help on Google or in the forums.

Any help or comments are greatly appreciated.


Importing A Large Text Field

May 3, 2002

I'm importing a large text field from an Excel spreadsheet into my SQL database using Enterprise Manager, and I'm getting the error message "Data for source column 31 'fieldname' is too large for the specified buffer size." How do I go about changing the buffer size to allow for larger text fields? Thank you.


Truncation Error On Large Field

Apr 3, 2007

I have a tab-delimited file that I'm trying to load into a database using SSIS. The database has a column called Comments that can hold up to 1000 Unicode characters (nvarchar(1000)).

I have appropriately defined the flat file connection and marked every field with the intended length, but every time I run it, it gives me the following error:

Data conversion failed. The data conversion for column "COMMENTS" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."

All the columns have a match, and this specific column has no value longer than 1000 (the largest record has 528 characters in it); the fields are defined as Unicode string in the file connection.

I have already run out of ideas as to what may be causing this error. Does anyone have an idea of what else to try?


Use Of Large Field Definitions For Small Values

Aug 2, 2007

Hi

This is a question of "what does it cost me". Let's say I have an integer value which would fit into a smallint field, but the field is actually defined as int, or even larger, as bigint. What would that "cost" me? How would definitions larger than I need for the values in the field affect me?

It's obvious that the volume of the database would grow, but with the resources we have nowadays disk space isn't the problem it used to be, and I/O is much faster, so many people would tell me "who cares". Or IS it a problem? How does it affect the performance of retrievals? Searches? Updates and inserts? How would it affect all DB access if tables point at each other with foreign keys?

Thanks!
David Greenberg
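
The raw per-value storage difference is easy to see (a quick sketch; DATALENGTH returns the storage size in bytes, which is 2, 4 and 8 for smallint, int and bigint):

SELECT DATALENGTH(CAST(1 AS smallint)) AS smallint_bytes,  -- 2
       DATALENGTH(CAST(1 AS int))      AS int_bytes,       -- 4
       DATALENGTH(CAST(1 AS bigint))   AS bigint_bytes     -- 8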


SQL Server 2008 :: Replication Error - Field Size Too Large

Feb 1, 2011

I've got two databases on the same server and replicate some tables from one database to another. The replication is configured not to drop the table if it exists, but to delete the data based on the filter, if one exists. There are two tables on the subscriber that have some extra columns.

I get a "field size too large" error when trying to replicate them. Is there a workaround that does not require making the publisher and subscriber tables identical in schema?


SSIS: Excel Import: SSIS Not Reading Dates

Apr 26, 2008

Hi. I need to import an Excel file into a database. I first need to do an unpivot task. The column names are dates, and SSIS seems to be unable to pick up the column names; they are replaced by F2, F3, F4, etc. Can you advise a solution? Thanks, Ken
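
One thing worth ruling out (an assumption about the setup, not a confirmed fix): whether the connection reads the first row as column names at all, since F2, F3, ... are the names the Jet provider invents when it does not. In SSIS this is the Excel connection manager's "First row has column names" checkbox; the equivalent HDR flag can be seen from T-SQL (path and sheet name are hypothetical):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\book.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]')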


Large XML File Source In SSIS????

Jan 19, 2007

Hi,

I have a problem where I want to import a 1.6 GB XML file into a SQL Server database with SSIS. My hunch is that SSIS is not very good at handling such a large amount of XML data. My test shows that SSIS tries to read all of the file into memory.

Does anyone know of a solution to this memory problem? I want to take this source XML file, import it into a database, make some transformations on it (eliminate duplicates etc.), then produce a NEW XML file as output in a different XSD format.

Is really SSIS the right tool for this operation?

The source XML file also has mixed content on complex types, which seems to be a problem for SSIS as well.

Best regs,

//Patrick


SSIS Break Up A Large Package

Jan 2, 2008

Hello,

I have a package that has over 80 sequence containers, and as I added more sequence containers to the package I started encountering a System.OutOfMemoryException error. I am planning to break this package up into six small packages. Is there a quick and safe way to do this? I have tried the copy-and-paste method, creating a new project and moving a list of sequence containers over, but that seemed to create even more work. Also, if the main package is split up into six smaller packages (the main package has the global variables that need to pass their values to the child packages), how do the variable values get passed on to the child packages?

Thank you so much in advance,
Andrew


Import Data Into Field

Apr 11, 2008

Hi,

I am new here, and looking for some help with SQL import.

I am using SQL Server 2000.

Problem: I have a table called 'People', with a field (Town) that I wish to populate.
I already have 500 records in the table.

I have a separate CSV file containing the primary key (ID) information and the information that I wish to put into the field (Town).

How can I import the information into the 'Town' field by matching on the ID from the CSV file? (There are some records that I do not have to import, so I cannot just insert straight in.)
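
A common pattern for this in SQL Server 2000 (a sketch; the staging table, file path and delimiters are hypothetical) is to bulk load the CSV into a staging table and then update by joining on the key, so only the IDs present in the file are touched:

CREATE TABLE dbo.TownStaging (ID int, Town varchar(50))

BULK INSERT dbo.TownStaging
FROM 'C:\imports\towns.csv'   -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

UPDATE p
SET p.Town = s.Town
FROM dbo.People AS p
JOIN dbo.TownStaging AS s ON s.ID = p.ID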

Best Regards


Problem With Saving Large SSIS Packages

Jan 25, 2008

When I work with large dtsx files I have problems saving them in Business Intelligence Studio.
The problem arises when a dtsx file is larger than 7-8 MB.
Any hints?


SSIS Create Large Temp Files!!!

Oct 22, 2007

Hello,

I created an SSIS solution for reading data from dbase files and storing them in SQL Server. In a ForEachDirectory loop, up to one thousand dbase files are read and stored. The system where the packages are running has 16 GB of RAM.
For the first few hundred dbase files everything goes fine, but then the RAM seems to no longer suffice and a temp file is created (I changed the path in BufferTempStoragePath).

How can it be that there is a need to create temp files if there is so much RAM available?
Why is the RAM filled more and more during the SSIS package execution?
Is there anything I can do to release some of it? (it is running in a loop and there is no need to store all the data)
Could it be caused by dbase?? (I use Microsoft Jet 4.0 OLE DB Provider)

Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
Sufficient permissions are set, but the temp file is still created in the user temp folder...

Any kind of help is very much appreciated!

Best Regards,
Stefan


How To Pass Large Queries Into Variables In SSIS?

Nov 26, 2007

Hi,
I want to pass the query below into a variable:

"if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[ <POS_MONTH>]]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
DROP TABLE [dbo].[ <POS_DATE>]
GO

SELECT * INTO [dbo].[<POS_DATE>] ]
FROM SG_POS_Template
WHERE 1 = 0;
GO"
where [<POS_DATE>] is a placeholder whose value will be assigned dynamically. Can anybody please help me out?
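
One way to run such a statement with a dynamic table name from T-SQL itself (a sketch; the variable value is hypothetical, and note that GO is a batch separator understood by query tools, not by T-SQL, so it cannot live inside a string executed as a single batch) is to build the text and hand it to sp_executesql:

DECLARE @pos_date sysname, @sql nvarchar(4000)
SET @pos_date = N'POS_20071126'   -- hypothetical value

SET @sql =
    N'IF OBJECT_ID(N''dbo.' + QUOTENAME(@pos_date) + N''', N''U'') IS NOT NULL ' +
    N'DROP TABLE dbo.' + QUOTENAME(@pos_date) + N'; ' +
    N'SELECT * INTO dbo.' + QUOTENAME(@pos_date) + N' FROM SG_POS_Template WHERE 1 = 0;'

EXEC sp_executesql @sql

In SSIS the same text can be assembled in a string variable with an expression and fed to an Execute SQL Task.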


Import Problem With Varchar(max) Field

Oct 30, 2006

I'm trying to import some assessor data from a text file into a table, and the field for Legal Description (column 2 in the source text file) is of data type varchar(max) because some of the data goes over the 8K size. I get an error on the first row of the import that refers to column 2 (see 'Initial errors' below). I read the related post and changed the size of input column 2 to 8000, and got the error shown under 'Setting Destination Connection'. Finally I set the size of input column 2 to 4000 and it ran. So I'm thinking there is a limit on the size of varchar data that can be imported. I just want to clarify what that limit is and how I might go about importing this data.

Thanks, John

Error with input column 2 set to size of 8000:

Setting Destination Connection (Error)

Messages

Error 0xc0204016: DTS.Pipeline: The "output column "Column 2" (388)" has a length that is not valid. The length must be between 0 and 4000.
(SQL Server Import and Export Wizard)


Exception from HRESULT: 0xC0204016 (Microsoft.SqlServer.DTSPipelineWrap)


Initial errors:

Executing (Error)

Messages

Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Column 2" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)


Error 0xc020902a: Data Flow Task: The "output column "Column 2" (18)" failed because truncation occurred, and the truncation row disposition on "output column "Column 2" (18)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)


Error 0xc0202092: Data Flow Task: An error occurred while processing file "\Scux00assrdumpsSQLServerDBexportsql.txt" on data row 1.
(SQL Server Import and Export Wizard)


Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - exportsql_txt" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
(SQL Server Import and Export Wizard)


Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038.
(SQL Server Import and Export Wizard)


Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
(SQL Server Import and Export Wizard)


Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039.
(SQL Server Import and Export Wizard)
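
For what it's worth, the wizard's string column types cap at 4000 characters for Unicode strings (DT_WSTR) and 8000 for non-Unicode (DT_STR), which matches the 4000 limit in the error above; anything longer has to travel as a text-stream type (DT_TEXT/DT_NTEXT) or through another path, such as BULK INSERT into a varchar(max) staging table. A sketch (table, path and delimiters are hypothetical):

CREATE TABLE dbo.AssessorStaging (
    ParcelID  varchar(50),
    LegalDesc varchar(max)   -- holds the >8000-character descriptions
)

BULK INSERT dbo.AssessorStaging
FROM 'C:\imports\exportsql.txt'   -- hypothetical path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')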



Large Fixed Width Text Files Using SSIS

Aug 13, 2007

What is the easiest way to get the definition of a large fixed-width text file (200 columns) into SSIS? Having to define each column with the ruler would be very cumbersome.


Data Import / Truncation - Very Long Field

Dec 7, 2007

Hi,

I am having issues importing data from a text file into a SQL database table using the import wizard in SQL Server 2005.

The text file contains polygon coordinates and a polygon name and looks like this:

"42.2342342,-121.1351398|42.3467984752,-122.2349234, ... ..., 42.1897498174,-122.131983","Polygon Name 1"

"42.2342342,-121.1351398|42.3467984752,-122.2349234, ... ..., 42.1897498174,-122.131983","Polygon Name 2"

and the SQL table looks like this:

polycoordinates varbinary(MAX)
polygonname varchar(50)

The length of the longest polygon coordinates record is about 115,000 characters. I believe the varbinary(MAX) type should hold that data, but SQL throws a truncation error every time I try to import the data.

Any suggestions?

Thanks,
Jay


.dbf File Import (duplicate Field Names)

Mar 27, 2006

I am importing a file created by an application which exports the file in .dbf format. Very unfortunately, this .dbf file can have fields with IDENTICAL column names. Utilizing ActiveX, I create an ADO connection to the .dbf file using a Visual FoxPro driver. However, and not unexpectedly, I cannot do a 'select *' from the file if there are duplicate names.

Can anyone make recommendations here that might help?

Oh, this is SQL 2000, in case that impacts what you might advise!


Update/Insert Date Field, Which Did Not Import From Access

Jul 12, 2007

First off, it has been a few years since I have done extensive work with SQL, and that was using Oracle. But I am trying to develop a simple asset database for work, as we have nothing in place. I started out with Access and decided to move to SQL Express for many reasons.

What I have now is that I imported my data from my Access 97 database to Excel, only my AssetTbl did not import dates, I assume because Access and SQL Express handle dates differently... so at the time I just ignored that column.

Is it possible to insert the dates into the now-populated SQL Express AssetTbl where the AssetIDs match? Here is what I have.

Sql Express Database Name: BAMS
Table Name: AssetTbl
fields: AssetID, SerialNum ...(many other fields)... DateAcq <- currently Null

Excel file: AssetDateAcq.xls
fields: AssetID, DateAcq (in format 07/12/2007)

To me it sounds like I need a short script/program to loop through the file, read an AssetID and DateAcq from the Excel file, and then do an update on the DateAcq field. But it has been so long since I've done any work with SQL that I'm finding there is a lot of "dust" to blow off, and I don't know if I'm heading down the right track... or completely off course.
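
No loop should be needed: a set-based UPDATE joined to the spreadsheet can fill the column in one pass. A sketch, assuming the Jet provider can reach the file, that the data sits on a sheet named Sheet1, and that 'Ad Hoc Distributed Queries' is enabled (the file path and sheet name are assumptions):

UPDATE a
SET a.DateAcq = x.DateAcq
FROM dbo.AssetTbl AS a
JOIN OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\AssetDateAcq.xls;HDR=YES',
                'SELECT AssetID, DateAcq FROM [Sheet1$]') AS x
    ON x.AssetID = a.AssetID
WHERE a.DateAcq IS NULL   -- only fill the still-empty rows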

Thank you.


Question About By Import Oracle Data And Field Is Null

Jan 23, 2007

Hi, I have a question about importing Oracle data where some fields are null. I get the error 'Conversion failed because the data value overflowed the specified type'. When I look at the preview of the query result, via OLE DB Source editor > Preview, this field contains '<value too big to display>'.

What am I doing wrong? Can somebody help me?

Thanks in advance

Olaf


Can I Insert/Update Large Text Field To Database Without Bulk Insert?

Nov 14, 2007

I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() method in my UPDATE statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL 2005 without resorting to BULK INSERT? It bloats the transaction log, and turning the logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
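
One common cause of silent cut-offs worth ruling out (an assumption, since the calling code isn't shown): the variable or parameter holding each piece must itself be nvarchar(max); declared as nvarchar(4000), every chunk is quietly truncated before .WRITE ever sees it. A minimal append sketch with hypothetical names:

DECLARE @chunk nvarchar(max), @id int
SET @chunk = N'...next piece of the user''s text...'
SET @id = 1

-- A NULL offset appends the chunk to the end of the existing value.
UPDATE dbo.Submissions
SET Body.WRITE(@chunk, NULL, NULL)
WHERE SubmissionID = @id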


Identity Field Settings Not Copied By Import Wizard (2005)

Feb 26, 2006

It appears that when you use the import/export wizard from within Microsoft SQL Server Management Studio, the identity attributes of the table being copied are not transferred. For example, say the source table has a column

[ref] [int] IDENTITY ( 1 , 1 ) NOT NULL,

When the import wizard is done the destination table will have a column named ref, but will not be an identity column. The column definition will be

[ref] [int] NOT NULL,

instead. Is there a way to change this behavior somewhere in the GUI? When doing the import, the only option seems to be 'Enable Identity Insert', but checking it does not affect the definition of the column.

-Eric


DTS Import Of MDB In SQL Server 2000 Drops Memo Field Data

May 22, 2006

I have used DTS in SQL Server 2000 to import an MDB file (MS Access) containing a table. When the table is imported, the primary key is lost and the memo field data is completely gone.

I use the transformation option in the DTS wizard to add the primary key and make sure the data type for the memo field is varchar with a size of 8000. I need that large size since I am storing lots of HTML code.

When I preview the data I see the HTML code that is supposed to get imported. However, when I return all rows from the table in Enterprise Manager, the field is empty.

So I tried to manually copy the data from the MS Access database into SQL Server. I could not figure out whether SQL Server has an interface like MS Access to simply copy data into a table, so I linked from MS Access to the SQL Server table.

When I opened the linked table I saw the data in the description field. However, if I return the rows from within SQL Server, no data is present.

I have some ASP code trying to read the data in the SQL Server table. However, nothing is returned when I run the SQL statement, which returns all rows. All the other data is present, but nothing is in the description field.

What am I doing wrong? Any suggestions anyone, please!

TIA


Error With Text Qualifier In Qualified Field During Flat File Import

Nov 8, 2007

We have a flat file import process which imports data from a series of Unicode flat files.

The files have text qualifiers and are being imported to a table with the following format:
CREATE TABLE [dsa].[OBS](
[Kundenummer] [nvarchar](10) NULL,
[Navn] [nvarchar](60) NULL,
[Adresse] [nvarchar](50) NULL,
[PostnrBynavn] [nvarchar](50) NULL,
[Kursusdato] [datetime] NULL,
[Varighed] [decimal](18, 2) NULL,
[Kursustype] [nvarchar](100) NULL,
[Risikokoder] [nvarchar](50) NULL
) ON [PRIMARY]

In one of our files we have two rows that looks like this:
"19298529";"THIS IS ROW 1";"ADDRESS 9 -13";"4200 SLAGELSE";"02-05-2006";8.00;"Kombikursus Førstehjælp - Brand 8 lek.";"37"
"19448242";"THIS IS ROW 2";"ADDRESS 50";"4140 BORUP";"04-05-2006";4.00;""Fra vil selv - til kan selv". Om børn 1½ - 3 Ã¥r";"22"


Both rows are OK according to the format, but the second row actually contains the text qualifier inside one of the qualified fields (""Fra vil selv - til kan selv". Om børn 1½ - 3 år"). It's the title of a course with a comment.
The process fails on this file and won't even redirect the row, as it does for other erroneous rows in other files we import.

We believe this is valid text, but apparently SSIS doesn't.
Is this a bug, or is this record not allowed?
Is there a workaround, and why won't SSIS redirect the row?

We believe the reason is that the preceding field is not text qualified (which is of course specified in the connection manager).

Thanks in advance,

Lasse


Need Help For SSIS Package Creation With INSERT,UPDATE Large Amount Of Records Through Business Intelligence Studio

Jun 1, 2006

Hi,

How do I insert and update a large number of records (4 lakh, i.e. 400,000) into a destination table through Business Intelligence Studio using an SSIS package? I tried a left outer join and a conditional split, but the problem is that it's not able to insert and update records simultaneously. Can anyone give me a sample?
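
Since SQL Server 2005 has no MERGE statement, one widely used alternative to the lookup/split route (a sketch; the staging and destination table and column names are hypothetical) is to land all the rows in a staging table with a data flow, then run the update and the insert as two set-based Execute SQL Task statements:

-- 1) Update rows that already exist in the destination.
UPDATE d
SET d.Amount = s.Amount
FROM dbo.DestTable AS d
JOIN dbo.StageTable AS s ON s.BusinessKey = d.BusinessKey

-- 2) Insert rows that are not there yet.
INSERT INTO dbo.DestTable (BusinessKey, Amount)
SELECT s.BusinessKey, s.Amount
FROM dbo.StageTable AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.DestTable AS d
                  WHERE d.BusinessKey = s.BusinessKey)
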
Thanks & Regards
Jeyakumar.M


SSIS Import

Feb 10, 2007

In MS Excel, the ability exists to run a "web query." This function is accessed via the Data menu's Import External Data option. The web query wizard accepts a URL address and is then able to import the data from that address into an Excel worksheet.

What I would like to do is use SSIS to import data from the same web site. In other words, I now use Excel's web query functionality to import data from a website with a URL of xyz.asp. I save the Excel workbook and then run a DTS package to import the data into SQL Server. I would like to entirely bypass Excel, instead of using it as an intermediary to bring data from the ASP site into SQL Server. However, I can't figure out how to set up the connection from SQL Server. I am using SQL Server 2005, SSIS. Is there a way to have SSIS open the ASP website, extract the data and import it into a SQL Server table?

For the record, this is not XML data.

Thanks.


SSIS Import

Nov 15, 2007

I am trying to import either .csv or Excel files using the SSIS import/export tool in Visual Studio and/or Management Studio. The data does import, but I am losing some data from the files, most often date fields.


Running A Large Number Of SSIS Packages (with Dtexec Utility) In Parallel From A SQL Server Agent Job Produces Errors

Jan 11, 2008

Hi,

I have stumbled on a problem with running a large number of SSIS packages in parallel, using the "dtexec" command from inside an SQL Server job.

I've described the environment, the goal and the problem below. Sorry if it's a bit too long, but I tried to be as clear as possible.

The environment:
Windows Server 2003 Enterprise x64 Edition, SQL Server 2005 32bit Enterprise Edition SP2.

The goal:
We have a large number of text files that we're loading into a staging area of a data warehouse (based on SQL Server 2k5, as said above).

We have one "main" SSIS package that takes a list of files to load from an XML file, loops through that list and for each file in the list starts an SSIS package by using the "dtexec" command. The command is started asynchronously by using the system.diagnostics.process.start() method. This means that a large number of SSIS packages are started in parallel. These packages perform the actual loading (with BULK INSERT).

I have successfully run the loading process from the command prompt (using the dtexec command to start the main package) a number of times.

In order to move the loading to a production environment and schedule it, we have set up an SQL Server Agent job. We've created a proxy user with the necessary rights (the same user that runs the job from the command prompt) and created the SQL Agent job (there is one step of type "cmdexec" that runs the "main" SSIS package with the "dtexec" command).

If the input XML file for the main package contains a small number of files (for example 10), the SQL Server Agent job works fine: the SSIS packages are started in parallel and they finish work successfully.

The problem:
When the number of the concurrently started SSIS packages gets too big, the packages start to fail. When a large number of SSIS package executions are already taking place, the new dtexec commands fail after 0 seconds of work with an empty error message.

Please bear in mind that the same loading still works perfectly from command prompt on the same server with the same user. It only fails when run from the SQL Agent Job.

I've tried to understand the limit at which the packages start to fail, and I believe that the threshold is 80 parallel executions (I understand that it might not be desirable to start so many SSIS packages at once, but I'd like to do it despite this).

Additional information:

The dtexec utility provides an error message where the package variables are shown and the fact that the package ran 0 seconds, but the "Message" is empty ("Message: ").
Turning the logging on in all the packages does not provide an error message either, just a lot of run-time information.
The try-catch block around the process.start() script in the main package's script task also does not reveal any errors.
I've increased the "max worker threads" number for the cmdexec subsystem in the msdb.dbo.syssubsystems table to a safely high number and restarted the SQL Server, but this had no effect either.

The request:

Can anyone give ideas what could be the cause of the problem?
If you have any ideas about how to further debug the problem, they are also very welcome.
Thanks in advance!

Eero Ringmäe







