I backed up my database on the live server, copied it to the backup server, and restored it successfully. I then set up a DTS export on the live server to copy objects and data to the backup db. The job keeps failing. Each time it fails it reports an "ALTER TABLE statement conflicted with COLUMN FOREIGN KEY constraint ......" error. However, there is nothing wrong with any foreign key; I have checked, and the data in the tables and the key itself are definitely fine (I removed the key and added it again, checking the data). I am able to DTS the tables that are supposed to have problems as a separate DTS job. I believe the job is inserting data into the table which has the foreign key before it inserts data into the table with the primary key, and this is causing the problem. As an aside, this job used to work perfectly well.
Can someone offer a suggestion on what I should do?
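If the export really is loading the foreign-key table before the primary-key table, one workaround (aside from reordering the copy) is to disable constraint checking on the destination table before the transfer and re-enable it, with re-validation, afterwards. A minimal sketch, with a hypothetical connection string and table name; the same two ALTER TABLE statements could equally be run from Execute SQL tasks at the start and end of the DTS package:
-----
using System.Data.SqlClient;

static class FkToggle
{
    // Hypothetical connection string for the backup server/database.
    const string ConnStr = "Data Source=BACKUPSRV;Initial Catalog=BackupDb;Integrated Security=SSPI;";

    public static void SetConstraintChecking(string table, bool enabled)
    {
        // NOCHECK before the transfer; WITH CHECK CHECK afterwards so the key is re-validated.
        string sql = enabled
            ? "ALTER TABLE " + table + " WITH CHECK CHECK CONSTRAINT ALL"
            : "ALTER TABLE " + table + " NOCHECK CONSTRAINT ALL";

        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

// Example (hypothetical table name):
// FkToggle.SetConstraintChecking("dbo.ChildTable", false);  // before the copy
// FkToggle.SetConstraintChecking("dbo.ChildTable", true);   // after the copy
---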
I have an SSIS job, and one of the last steps it performs is to execute a SQL 2000 DTS package. This has to be done as a SQL 2000 DTS package, as it is performing rebuilds of SQL 2000 Analysis Services dimensions and cubes. We've found that when the DTS fails, the SSIS job happily completes and shows as a success; we would prefer to know it went wrong.
As far as I'm aware, SSIS merely starts the DTS off and doesn't care about its result. I've taken a look into turning on logging for the Execute DTS 2000 Package task and thought that ExecuteDTS80PackageTaskTaskResult would give me the answer I need... but it is merely written to the log, not available to an event handler. It also looks like it is not safe to put a SQL task in as the next item to go and check the SQL 2000 system tables for the DTS package's log, as the SSIS documentation warns that the DTS package can continue to run after the Execute DTS 2000 Package task has ended.
Ideally I want any error raised within the DTS package to cascade up and become an error in the SSIS job, which I can then handle appropriately. I cannot find a way to do this. Is there a way?
If not, can anyone suggest how, in the remainder of the SSIS tasks, I can be sure that the DTS has completed before I start any other tasks that will check the SQL 2000 log of its execution?
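One workaround I have seen suggested (not verified end to end) is to skip the Execute DTS 2000 Package Task and run the DTS package from your own .NET code instead, via the SQL 2000 DTS COM library ("Microsoft DTSPackage Object Library"). Execute() there is synchronous, so you know the DTS has finished, and you can inspect each step's result and raise the failure yourself. A rough sketch with hypothetical server and package names; the exact LoadFromSQLServer signature should be checked against the interop assembly you generate:
-----
using System;
using DTS; // Interop.DTS.dll, generated from the Microsoft DTSPackage Object Library

static class Dts2000Runner
{
    public static void RunOrThrow()
    {
        Package2Class pkg = new Package2Class();
        object persist = null;

        // Hypothetical server and package names; uses a trusted connection.
        pkg.LoadFromSQLServer("SQL2000SERVER", null, null,
            DTSSQLServerStorageFlags.DTSSQLStgFlag_UseTrustedConnection,
            null, null, null, "CubeRebuildPackage", ref persist);

        pkg.Execute(); // synchronous: control returns only when the DTS package is done

        foreach (Step step in pkg.Steps)
        {
            if (step.ExecutionResult == DTSStepExecResult.DTSStepExecResult_Failure)
            {
                pkg.UnInitialize();
                throw new Exception("DTS step failed: " + step.Name);
            }
        }
        pkg.UnInitialize();
    }
}
---
Throwing from a Script Task (or returning a failure result) then makes the SSIS job fail the way you want.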
I have developed an SSIS package for ETL purposes. I am invoking the SSIS package through a .NET console application by referencing the ManagedDTS assembly. I am able to execute the package on SQL Server 2005 Developer Edition and it runs fine to completion.
But when I try to execute the package on SQL Server 2005 Standard Edition, by invoking it through the .NET console application, the status of the package is failure.
Can anyone help me work out how to overcome this problem?
I have an SSIS package with an "Execute Package Task" to call a child package. I am trying to have the master/parent package complete its execution regardless of the outcome (failure or success) of the child package. The overall structure of the master package is:
1. Perform Pre-load tasks (stored procedure).
2. Execute Package Task (call child package)
3. Perform Post-load Tasks (stored procedure)
I have tried everything and cannot get the results that I want. I have tried combinations of "FailParentOnFailure", "ForceExecutionValue", etc. The master package stops at the child's failure; I would like it to resume and run to completion.
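The piece that usually matters in this layout is the precedence constraint between step 2 and step 3: if it evaluates on Completion rather than Success, the post-load task runs whichever way the child goes (you may still need to raise the parent's MaximumErrorCount so the child's failure doesn't halt the parent outright). In the designer that is just double-clicking the connector and choosing Completion; a rough sketch of the same change through the runtime API, with hypothetical package path and task names, would look something like this:
-----
using Microsoft.SqlServer.Dts.Runtime;

class FixConstraint
{
    static void Main()
    {
        Application app = new Application();
        // Hypothetical path and task names; adjust to the real master package.
        Package master = app.LoadPackage(@"C:\Packages\Master.dtsx", null);

        Executable child = master.Executables["Execute Package Task"];
        Executable postLoad = master.Executables["Perform Post-load Tasks"];

        foreach (PrecedenceConstraint pc in master.PrecedenceConstraints)
        {
            // Fire the post-load step on Completion, regardless of the child's outcome.
            if (pc.PrecedenceExecutable == child && pc.ConstrainedExecutable == postLoad)
                pc.Value = DTSExecResult.Completion;
        }

        app.SaveToXml(@"C:\Packages\Master.dtsx", master, null);
    }
}
---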
When I try to run any DTS package that tries to copy stored procedures from one db to another, it fails and I get this message:
Need to run the object to perform this operation [SQL-DMO]Code execution exception: EXCEPTION_ACCESS_VIOLATION
The problem appeared a couple of weeks ago, the same package that now fails ran with no problems for some 6 months or so.
When I searched the Microsoft site I found a bugfix that fit the problem exactly, but the workaround and the SP3 installation (which supposedly fixes the problem) didn't work. Please help!
I have a DTS package which keeps failing with the error message "A connection with the transaction manager was lost". The DTS package uses two different servers as connections, and it has Use Transaction, Rollback Transaction on Failure, and Join Transaction If Present ticked. I have checked that the MSDTC service is running on both servers, and the following options have been checked on both servers: Network DTC Access, Allow Inbound and Allow Outbound. Does anybody know what is causing this error message?
Because of recent changes in the company, I have inherited the SQL Server. I have a little bit of experience, but I will be knee deep in study for the foreseeable future. In the meantime, I do have to manage the SQL Server and deal with all of the problems. It has been fairly self-sufficient until now. I have a regularly scheduled SSIS package that has recently been erroring out. I just don't even know where to start figuring this out. When I look at the History of this two step job, I see that it's failing on the second step, which is a T-SQL script. I am way more of a GUI kinda guy (at least for now) and I am lost when looking at the script. I know that I am not giving enough info in order for someone to help me, but my problem is I just don't know where to even look. Can someone at the very least point me to a place where I can get a better clue as to what is happening and why???
Thanks for listening and God bless anyone who understands at all what I'm even trying to ask!!!!!
Package Load Failure: Package 'Microsoft.SqlServer.Tools.PublishWizard.VSCommands.MenuCommandsPackage, MenuAndCommands, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' has failed to load properly (GUID = {A38CF415-6AD2-45F6-8132-9CC33EBE0629}).
I get this error after I have opened VS and then open the Server Explorer Window.
I have been using VS2005 Professional since December of 2005 and have never encountered this before. I have never had a beta version of anything on this PC. I have been searching the Net but haven't found a solution. At this point, I haven't tried reinstalling VS because even if it does fix it, what is the guarantee that I will not encounter it again? I would rather find and fix the cause.
Does anyone have any ideas?
Additional: I have now uninstalled and reinstalled VS2005 and SQL Server 2005 including service packs and am still getting the error.
Additional: When I ran devenv.exe, the section of the resulting log referencing the GUID indicated an invalid Package Load Key.
I'm using a Foreach container to run certain tasks, which are inside a Sequence container. I'd like to know if there is a way to have a task fail without the Foreach failing; I don't need the whole process to fail, just the current iteration, so I can log it.
Anyone seen this error or know what this could be? I got this when executing a package. It works on my dev machine, however failed with this error on another test machine/environment.
The session was canceled. (Exception from HRESULT: 0x800700F0) (Microsoft Visual Studio)
------------------------------ Program Location:
at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo) at Microsoft.DataWarehouse.VsIntegration.Interop.NativeMethods.ThrowOnFailure(Int32 hr, Int32[] expectedHRFailure) at Microsoft.DataWarehouse.VsIn
We have an application comprising a number of SSIS packages that run every few hours. When a package fails, we would like an email to be sent to a pre-configured address with the required information. I would like to use the Send Mail task with an SMTP server like smtp.sbcglobal.yahoo.com, but this server requires authentication and I need to provide my user_id and password for that. I have not been able to figure out where I can configure the user_id and password for getting authenticated by the SMTP server. If the SMTP connection cannot be used, is there any other way to notify the admin of the failure?
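The stock SMTP connection manager only supports Windows or anonymous authentication, so a common workaround is to replace the Send Mail task with a Script Task (or a small helper assembly) that sends the mail through System.Net.Mail and supplies the credentials itself. A minimal sketch, with hypothetical server, addresses, and credentials:
-----
using System.Net;
using System.Net.Mail;

static class FailureMailer
{
    public static void Send(string subject, string body)
    {
        // Hypothetical SMTP settings; store the password somewhere safer than source code.
        SmtpClient client = new SmtpClient("smtp.sbcglobal.yahoo.com", 25);
        client.Credentials = new NetworkCredential("user_id", "password");
        client.EnableSsl = false; // set to true if the server requires SSL/TLS

        MailMessage msg = new MailMessage("etl@example.com", "admin@example.com", subject, body);
        client.Send(msg);
    }
}

// Example call from an OnError event handler Script Task:
// FailureMailer.Send("SSIS package failed", "Package X failed; see log for details.");
---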
I am running a DTS package from a stored procedure using xp_cmdshell. The DTS package begins with a SQL task to delete records from 2 tables (this works fine), but the data transfer task for importing records from a SQL Anywhere 5.0 database gives me the error 'Unable to connect to database server: Unable to start database engine'. The weird thing is that from Enterprise Manager I can execute the DTS package and it works fine. What am I missing here?
Trying to build a deployment package. I have a number of dtsx in a project that share a connection config file. When I build, the error states: 'Could not copy file "whatever.dtsconfig" to the deployment utility output directory. ... The file already exists'
I have a package that is failing because of a truncation error. Now, by default (and I leave this for ALL my packages) if one row fails processing the entire package should fail and nothing gets loaded into db. But instead I am actually getting a partial db load.
I have confirmed the "Rows per batch" value (blank) and the "Maximum insert commit size" value (0) in the OLE DB Destination Editor, so I have no idea what is going on. Are there any other properties I should be checking?
I am developing an SSIS package and need the execution of the package to continue even if one of the tasks within the package fails. I have an OnError event handler for this task which fires when it fails but want the rest of the package to continue.
I have a job that moves a file to an import directory where the SSIS package picks it up and processes it. 95% of the time it works flawlessly and fast.
The other 5% of the time the process fails. It appears that the file is inaccessible to SQL Server. I run it again and it works perfectly. It appears to be completely hit and miss.
The last step in the package is, on completion, to delete the file. When the package fails, the file (as directed) is not deleted.
I've included a snippet of the error log. I can place the log out here if it would help more.
Any ideas what could possibly be going wrong? I hate inconsistent failures.
Warning: 2008-04-06 15:05:30.80 Code: 0x80070002 Source: Data Flow Task Flat File Source [317] Description: The system cannot find the file specified. End Warning Error: 2008-04-06 15:05:30.80 Code: 0xC020200E Source: Data Flow Task Flat File Source [317] Description: Cannot open the datafile "D:Processes 40720080105260140306.txt". End Error Error: 2008-04-06 15:05:30.80 Code: 0xC004701A Source: Data Flow Task DTS.Pipeline Description: component "Flat File Source" (317) failed the pre-execute phase and returned error code 0xC020200E. End Error
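If the 5% failures are a race between the copy job and the package (the file still being written or locked when the Flat File Source opens it), a cheap guard is a wait-and-retry check before the data flow, for example in a Script Task. A minimal sketch; the path and retry settings are hypothetical:
-----
using System.IO;
using System.Threading;

static class FileGate
{
    // Returns true once the file exists and can be opened exclusively,
    // i.e. nothing else is still writing or copying it.
    public static bool WaitForFile(string path, int attempts, int delayMs)
    {
        for (int i = 0; i < attempts; i++)
        {
            try
            {
                if (File.Exists(path))
                {
                    using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                        return true;
                }
            }
            catch (IOException)
            {
                // still locked - fall through and retry
            }
            Thread.Sleep(delayMs);
        }
        return false;
    }
}

// Example: fail the step if the file is not ready within a minute (hypothetical path).
// bool ready = FileGate.WaitForFile(@"D:\Processes\inputfile.txt", 12, 5000);
---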
I had a Send Mail task linked to my Sequence containers in my package and it was working fine. Every time a container failed, it would send an email to me.
At some point all Failure constraints stopped working. Failure constraints work if I add brand new tasks, but with the existing tasks, they don't work. The Task which fails, turns red and execution stops. Next failure task is not executed.
I am not sure what triggered it to stop working. I cannot see anything in the log.
I have a package that contains a Foreach Loop container; in this container I have SQL tasks and Execute Package tasks, ending with a Send Mail task. If there is something wrong with the SMTP server, or it's down, the Send Mail task fails the package. I don't want this to happen; what I want is that if the Send Mail task fails, the package continues its execution.
I thought of using an event handler... but I don't know if it would work.
That is, I am using the Application object and the Package object to load the package from its file path.
So my code looks something like this:
-----
// Requires a reference to Microsoft.SqlServer.ManagedDTS and
// using Microsoft.SqlServer.Dts.Runtime;
string pkgLocation = @"<package_path>/Package1.dtsx";
Application app = new Application();
Package pkg = app.LoadPackage(pkgLocation, null);
DTSExecResult pkgResults = pkg.Execute();
Console.WriteLine(pkgResults.ToString());
Console.ReadKey();
---
My package reads in a flat file (located on another server) and transforms it and saves it to a database (on that same server). And mind you, I can execute this package just fine when I do it manually from within BIDS.
But when I try to execute the above mentioned code in my C# solution (compiled to a command line executable), I always get a "Failure".
Can somebody point out what I am doing wrong here?
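A plain "Failure" from Execute() doesn't say much on its own, but the package's Errors collection usually does. A small helper along these lines (same ManagedDTS runtime API), called after Execute(), should show whether the command-line run is tripping over the flat file path, the connection, or something else:
-----
using System;
using Microsoft.SqlServer.Dts.Runtime;

static class PackageErrorDump
{
    // Call after pkg.Execute() when the result is DTSExecResult.Failure.
    public static void Dump(Package pkg)
    {
        foreach (DtsError err in pkg.Errors)
        {
            // Source and Description usually point at the failing component or connection.
            Console.WriteLine("{0}: {1}", err.Source, err.Description);
        }
    }
}

// Usage in the snippet above:
// if (pkgResults == DTSExecResult.Failure) PackageErrorDump.Dump(pkg);
---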
I am using SQL Server 2012 SP1. I have built an SSIS package that imports flat file data from various files into SQL Server. I have got it to do everything I want when things are going well, and am now working on what I want it to do when it encounters a failure executing specific tasks and containers. For example, I have a Foreach Loop container that executes a dedicated stored procedure for each csv file in the target folder. If any of the stored procedures fails to run for any reason, I want to carry out certain actions.
For the most part I think I will be fine using the Event Handlers. What I can't seem to find is how to tell the package to stop executing on a Failure event after carrying out the actions defined by the relevant Event Handler. Or, perhaps it isn't necessary as that would be the default behaviour on a failure?
Hi everybody, I'm a newbie with SSIS. I have been using DTS in SQL 2000 and am trying to learn how to execute an SSIS package from the C# code of an ASP.NET web server. Here's my case:
1. An SSIS package with a simple data transformation from one table to CSV is stored in SQL Server 2005 storage.
2. The CSV is, for simplicity, placed in C: as a .txt file.
3. I haven't used SSIS configuration files.
4. Protection level of the package = EncryptSensitiveWithUserKey.
5. It executes OK from Business Intelligence Development Studio.
6. I've created a console application with this code:
7. When I connect through a Remote Desktop Connection, I successfully execute this console application on the SQL Server host machine.
8. When I try execution from the computer where I develop the package, and where I successfully executed it from Business Intelligence Development Studio, I get FAILURE as the result.
Connection params for SQL are the same in the console application and in the SSIS project in Business Intelligence Development Studio.
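For reference, a console runner for a package held in SQL Server 2005 storage would look roughly like the sketch below; this is not the poster's actual code, the server name and package path are placeholders, and the LoadFromSqlServer parameter order should be checked against the ManagedDTS documentation. Dumping pkg.Errors, as shown, is the quickest way to see why the run from the other machine returns FAILURE:
-----
using System;
using Microsoft.SqlServer.Dts.Runtime;

class RunFromSqlStorage
{
    static void Main()
    {
        Application app = new Application();

        // Hypothetical package path within SSIS storage (MSDB) and server name.
        Package pkg = app.LoadFromSqlServer(@"\MyFolder\ExportToCsv", "SQLSERVER2005", null, null, null);

        DTSExecResult result = pkg.Execute();
        Console.WriteLine(result);

        // Print the reasons behind a Failure result.
        foreach (DtsError err in pkg.Errors)
            Console.WriteLine("{0}: {1}", err.Source, err.Description);
    }
}
---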
(1) The package contains a For Loop task (in which all the logic is contained) that loops through a particular folder for Excel files. WITHIN THE FOR LOOP:
(2) Pulls data from an Excel file into SQL tables (Data Transformation task).
(3) Runs a stored proc to validate the data (Execute SQL task).
(4) ON SUCCESS of executing the SQL task: Script task - move the file to the success or reject folder based on the value returned from the sproc.
(5) ON FAILURE of executing the stored proc: Script task - move the file to the bad-format failure folder.
NOTE: I have modified the MaximumErrorCount property of (1) the For Loop, (3) the Execute SQL task, and the package itself to 0, in order to deal with badly formatted Excel files. I do not want the package to stop for every missing tab in an Excel file or data entry error; I simply want the badly formatted file to be moved to a special folder.
PROBLEM: The on-failure logic is never executed. I have 2 options after step (3): on success do step (4), on failure do step (5). When step (3) fails, it simply iterates to the next file and step (5) is never executed.
Is this because I changed the MaximumErrorCount property? What am I doing wrong with the precedence logic?
I am attempting to run an SSIS package from a web service. Right now both the service and the package are on my local machine, which is running XP. I have accessed the web service from a client application in debug mode. I am not sure if it is actually running under aspnet_wp.exe, because it is XP and a development environment (separate question). The package fails with a series of OnError messages similar to:
The result of the expression ""/c DEL /F /Q "" + @DeployFolder + "\catalog.diff.lz""" on property "Arguments" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
An initial supposition is that the permissions of the web service are inadequate for the package. I have the authentication set to "Windows" and <identity impersonate="true" /> in the Web.config file. When I break in the debugger, Environment.UserName and Environment.UserDomainName are mine, and I am an Admin on the box. The authorization element is <deny users="?" />.
The article that describes basic implementation of this in a Web Service states:
With its default settings for authentication and authorization, a Web service generally does not have sufficient permissions to access SQL Server or the file system to load and execute packages. You may have to assign appropriate permissions to the Web service by configuring its authentication and authorization settings in the web.config file and assigning database and file system permissions as appropriate. A complete discussion of Web, database, and file system permissions is beyond the scope of this topic.
And how!
Note that the load is fine, that this is a run-time error, and that the package runs correctly when run manually from SQL Server using the 'Run Package' menu item in the Object Explorer tree of SQL Server Management Studio.
I need to know if this is an ASP.NET issue per se or XP or if this is even a security issue. And how to solve it! This is critical path so an expeditious reply with a solution would be greatly appreciated.
We are using SSIS to load some 100k records from a flat file to an Oracle destination. We are using the Oracle 10g client. But during the execution, after some 5 or 6 hours with 900k records uploaded, we get the message "Package execution completed". In the execution results there is no message related to success or failure, and the tasks in the Data Flow were yellow in color. What might be the problem? Any information regarding this case will be helpful for us.
I apologize in advance for posting yet another connection failure issue. I went through quite a few posts and could not find the actual answer, so here is the issue. I have a main SSIS package that calls five other packages. This seems to work fine on my BIDS workstation; however, when I copied it, including the bat file that was created by the dtexecui utility, to the production environment (which runs on 64-bit - probably nothing to do with the 64-bit platform), the main package executes fine but the child packages fail with this error: Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Login timeout expired". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.".
It seems that the child packages are still using the old connection strings from my workstation. My question is: how can I pass the connection string from the batch file to all the child packages that the parent (main) package is using?
I have a package with 10 synchronous dataflows which, combined, load about 300MB of flat file data to a database. This package would run successfully on 2 of our database servers, but would regularly fail on a third. The server on which it was failing is a 4-processor box with 16GB RAM, with Windows Server 2003, SQL 2005, SSIS and SSRS installed - much more robust than one of the others that the package worked on. The SSIS error messages returned alternated between the following (with no apparent reason why one would show up rather than another, though the first was the most common):
"The file name "\Server1Folder1File1.txt" specified in the connection was not valid."
"The file name property is not valid. The file name is a device or contains invalid characters."
"An error occurred while initializing the flat file parser."
For the first error message, the error would report different connection managers and their associated file as invalid from run to run. All of the files across the 10 dataflows resided in the same network folder, and the package would read in and process a few of them before failing, so the problem was definitely not the connection string.
Searching the forums, etc. for these errors provided no useful information - given the real cause of the problem, these error messages are worse than unhelpful, they send you looking in the wrong direction. It was only when trying to track down another problem on the same server that I discovered the issue. When trying to copy database backups greater than 12GB over the network to this server, the operation would fail with an "Insufficient System Resources" message.
Some research led to the discovery that the problem was caused by the /3GB switch in the boot.ini file of the server (don't let your server team use that switch if you have 16GB of memory or more). Removing the switch and setting SQL to utilize AWE fixed both the file copy problem AND the SSIS package failure problem. The SSIS package failed not due to a bad connection string, but rather due to insufficient server resources (read: memory) to handle the simultaneous connections.
I hope this may help any others trying to track down this kind of SSIS package failure.
I will also provide here what I have gleaned about setting up memory usage for SQL Server 2005 running on 32-bit Windows Server 2003 (with the caveat that I am no expert; corrections and additional information are welcome).
The following links got me started in my research (thanks to the folks who provided such useful information): http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=55191 http://articles.techrepublic.com.com/5100-10878_11-6091280.html http://www.simple-talk.com/community/blogs/brian_donahue/archive/2007/09/30/37747.aspx http://blogs.technet.com/askperf/archive/2007/03/23/memory-management-demystifying-3gb.aspx http://www.modhul.com/2007/11/10/optimising-system-memory-for-sql-server-part-i/
Also, search BOL for: "Server Memory Options", "Enabling Memory Support for Over 4 GB of Physical Memory", and "Enabling AWE Memory for SQL Server".
Windows Server 2003 provides access to 4GB of virtual address space. By default, 2GB is assigned to the OS and 2GB to applications. This default can be changed to 1GB for the OS and 3GB for applications by use of the /3GB switch in the boot.ini file.
Physical memory over 4GB can be addressed by enabling Physical Address Extensions (PAE), which is done by setting the /PAE switch in the boot.ini file. This does not increase the system's virtual address space; rather, it increases the size of the page table (which is maintained within the virtual address space), adding entries to reference the physical memory above 4GB.
It is important to note that these two switches are not interdependent (they do different things and you can turn each on or off regardless of the other's status), though the combination of them has an impact on server performance and the maximum amount of physical memory which can be addressed.
The /3GB switch only impacts the allocation of the first 4GB of memory (virtual address space) between the OS and applications (default 50/50 split; with the switch on, 25% OS and 75% applications). The /PAE switch enables the system to reference/manage physical memory above 4GB, but does not alter the allocation percentages of the first 4GB of memory between the OS and applications. However, when PAE is enabled, the OS requires more memory within the first 4GB to manage the physical memory above 4GB (due to increased page table entries). With the /3GB switch, the OS has only 1GB of virtual address space, and only enough space to manage a total of 16GB of physical memory. If 32GB of physical memory is installed, 16GB of it will go to waste.
Address Windowing Extensions (AWE) is an API that allows an application to address more than the 2-3GB of memory that is available to applications within the virtual address space (first 4GB of memory). SQL Server can utilize AWE to take advantage of memory above the first 4GB that is made available via PAE, and can even reserve portions for its own use. I believe (though I can't remember where I got this bit) that SQL utilizes AWE memory only for the page cache (buffer pool, which seems to be a misnomer), and not for other operations.
To enable AWE, see the BOL references above.
The big question: what are the recommended settings for all of these? That all depends on what you have running on the server. You need to leave space for the OS, SQL Server and any other applications you have.
The hard and fast rules: If you have more than 4GB of RAM, you must use the /PAE switch in order to take advantage of it. If you have more than 16GB of RAM, you must NOT use the /3GB switch in order to take advantage of it.
Based on anecdotal evidence, I've noticed the following generally recommended guidelines, assuming the server is dedicated to SQL.
Use of the /3GB switch seems to be a generally accepted practice if you have 8GB of RAM or less. For between 8 and 16GB, some say never use the /3GB switch, others say you can use it up to 12GB and still others up to 16GB. I interpret this to mean that it all depends on what types of loads are being placed on the server and that testing on individual servers will be required to determine whether or not to use the switch. Certainly that was my experience - the /3GB switch worked fine with 16GB RAM, until the server encountered a certain workload. For me, no more /3GB switch.
For setting SQL to use AWE, most seem to agree that it should be enabled if you have more than 4GB RAM. The setting of max server memory is more complicated. BOL seems to suggest (the 'Server Memory Options' entry) a formula of total physical memory minus 1-2GB for the operating system. Based on a desire to be a bit more conservative, I am now using the following formula:
max server memory = total physical memory
minus
4GB for the OS and application processes (since the AWE memory is utilized for page cache, not SQL processes)
minus
AWE memory required by other applications, including other instances of SQL Server
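As a purely hypothetical worked example of that formula: on a dedicated SQL box with 16GB of physical RAM, /PAE and AWE enabled, and nothing else of note running, max server memory = 16GB - 4GB - 0GB = 12GB. If a second SQL instance on the same box were allowed 4GB of AWE memory, the first instance's ceiling would drop to 16GB - 4GB - 4GB = 8GB.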
If anyone has additional insight, or a more refined equation, I could certainly benefit from it.
I've had a DTS Package scheduled to run every morning since June last year, however on Monday and again this morning, the DTS package has failed on two of its steps. This DTS package runs on SQL Server 2000 and the server is a Windows 2000 Server.
Basically, the DTS Package steps that fail are populating a "cache" table from a "live" table and then performing Inserts / Updates on the "live" table from a CSV file depending upon whether data exists in the "cache" table and whether the CSV file contains an "Insert" or "Update" flag. The live table has just over 800,000 rows of data and our nightly CSV extract, containing both inserts and updates, has about 6000 rows.
The error / failure has only happened in the last two days and I'm not too sure if there's any "timeout" feature in DTS and if there is, how I can modify it, or whether this may be something to do with long transactions or temp dbspace running out.
The time that the DTS package runs was modified last week to run two hours earlier (03:00 instead of 05:00) and it ran okay for a few days. I don't know whether this change is a red herring, as I have checked other DTS packages on the machine and none appear to run at the same time.
Here's the message from the third step in the DTS Package (second one that's failing).
Executed as user: SOEincaservice. ... Start: DTSStep_DTSExecuteSQLTask_1 DTSRun OnStart: DTSStep_DTSExecuteSQLTask_10 DTSRun OnError: DTSStep_DTSExecuteSQLTask_1, Error = -2147467259 (80004005) Error string: Timeout expired Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 Error Detail Records: Error: -2147467259 (80004005); Provider Error: 0 (0) Error string: Timeout expired Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1 DTSRun OnError: DTSStep_DTSExecuteSQLTask_10, Error = -2147467259 (80004005) Error string: Timeout expired Error source: Microsoft OLE DB Provider for SQL Server Help file: Help context: 0 Error Detail Records: Error: -2147467259 (80004005); Provider Error: 0 (0) Error string: Timeout expired Error source: Micros... Process Exit Code 2. The step failed.
Any help is greatly appreciated as I've searched these forums and the web and can't really find any answers.
I have been looking for a solution to "Communication link failure" for many months on Google, but no luck. I am running an SSIS package to load data; the job fails many times with the error 'Communication link failure'. I have searched everywhere but found nothing.
Below is the complete error description when job failed.
OS - Windows Server 2008 R2 Enterprise Edition, RAM - 198GB, SQL Server 2008 R2 Enterprise Edition. The error description is below:
Started: 6:22:40 AM Error: 2015-08-19 18:50:32.70 Code: 0xC0202009 Source: Data Flow Task Lookup [193] Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available.
I have a parent package that calls child packages inside a For Each container. When I debug/run the parent package (from VS), I get the following error message: Warning: The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
It appears to be failing while executing the child package. However, the logs (via the "progress" tab) for both the parent package and the child package show no errors other than the one listed above (and that shows in the parent package log). The child package appears to validate completely without error (all components are green and no error messages in the log). I turned on SSIS logging to a text file and see nothing in there either.
If I bump up the MaximumErrorCount in the parent package, and in the Execute Package Task that calls the child package, to 4 (one above the error count indicated in the message above), the whole thing executes successfully. I don't want to leave the max error count set like this. Is there something I am missing? For example, are there errors that do not get logged by default? I get some warnings; does a certain number of warnings equal an error?
I have a serious problem with my SSIS Package while executing using 32-bit DTExec and 64-bit DTExec.
Here are the details:
Environment:
Windows Server 2003 64-bit (Build 3790: Service Pack 2)
SSIS 32-bit & 64-bit installed
SQL Server 2005 (Microsoft SQL Server 2005 - 9.00.1399.06 (X64) - RTM)
SSIS Package details (compiled in 64 bit)
Script tasks only, Microsoft Visual Basic .NET (using a TRY...CATCH block)
PreCompileScriptIntoBinaryCode = TRUE
Run64BitRunTime = TRUE
Execution
Batch file that uses DTExec to execute the Package.
SCENARIO: I am trying to execute the above SSIS package using both the 32-bit and 64-bit DTExec, and to make it fail by providing an invalid connection string. Here are the details.
Wrong connection String using 32-bit Execution
While establishing the connection, the error message is nicely captured in my exception block and written to the log file.
Wrong connection String using 64-bit Execution
While establishing the connection, the error is not captured anywhere (although I have a TRY...CATCH block) and it halts right there with the message "Process is terminated due to StackOverflowException". Later I found that the error is due to the connection string, combined with an unhandled exception.
Please advise on either of the findings below; any other advice would also be very much appreciated.
1. Shall I go ahead and fix the issue by handling those unhandled errors (e.g. AppDomain/application-level handlers)? I have tried several approaches but it is still not working with the 64-bit DTExec.
2. Shall I go ahead and use the 32-bit DTExec to execute the package? If so, is there any other major issue, like performance or another bug?
P.S.: We cannot apply any service pack for SQL Server 2005 at the moment, sorry about that. If you have a specific hotfix for DTExec (without affecting SQL Server) then we can decide.
Sorry for the lengthy post, and thanks very much for your help in advance.
I am using ODBC to connect to a SQL Server 2000 + SP4 server running on Windows 2003 Standard Edition Server + SP1. Sporadically, my application server's connectivity to the DB fails and I receive the following error messages:
[Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionWrite (WrapperWrite()). [Microsoft][ODBC SQL Server Driver][DBNETLIB]General network error. Check your network documentation.) In D:de.cpp 702 Apr 12 2006 20:34:47 [Microsoft][ODBC SQL Server Driver]Communication link failure) In D:de.cpp 702 Apr 12 2006 20:34:47 20071017 08:23:10 TID (00000ff0) Sev (3) Err (0) Msg (Read failure. General SQL error.
The strange thing is that I receive this error only sporadically. Please advise!
I copied an existing package and added it as a new package to a project, and have been having trouble with settings reverting to those of the original package after I modify and save the changes to the new package. Sometimes it happens with the save itself; other times it happens when I close and re-open the package. Most cases involve connections that revert back to the original file reference, but there are also control flow and data flow elements that keep reverting back to either settings from the original package or defaults that leave the re-opened package in error. Not sure how to get around this issue short of developing the new package from scratch, which I'd rather not do since it is fairly complex. Any help anyone can provide is appreciated. Thanks.