We're having a performance problem with a package ever since an error occurred. The original error came from the Job Manager, which was unable to start a thread. Our understanding is that this is a memory-related issue, and we'll deal with that.
What is really odd is what happened after that. The package executes from a SQL Agent job that includes 3 other packages. Each package is stored on the file system. Execution time for the affected package changed from less than a minute to over 5 minutes. The other packages continued to execute normally.
In checking the logs, there is a large time gap, about 4 minutes, between the start of the SQL Agent step and the first PreValidate message, as if there is some issue loading the package. The issue is ongoing. Does anyone know what happens between the start of the SQL Agent job and the first PreValidate message? Is this some type of caching issue?
SQL Agent step start: 4:05:27
First PreValidate message: 4:09:24
Package execution start: 4:09:57
Execution end: 4:10:59
Is there a way (and if so, what is the syntax?) to set up a DTS package that loads a table that has an identity column? I am trying to load the data from another table, leaving the identity field unmapped and de-selecting "Enable identity insert" on the Advanced tab of the Data Transformation Properties window. I keep getting errors because the table does not allow null values. I tried using the SET IDENTITY_INSERT command, but this still did not work. Any help would be appreciated. TB
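For reference, here is a minimal T-SQL sketch of the identity-insert pattern; the tables dbo.Source and dbo.Dest and their columns are hypothetical, not from the original post:

    -- Option 1: let the destination generate new identity values
    -- (leave the identity column out of the column list entirely).
    INSERT INTO dbo.Dest (Name, Amount)
    SELECT Name, Amount
    FROM dbo.Source;

    -- Option 2: preserve the source identity values explicitly.
    SET IDENTITY_INSERT dbo.Dest ON;
    INSERT INTO dbo.Dest (ID, Name, Amount)  -- identity column must be listed
    SELECT ID, Name, Amount
    FROM dbo.Source;
    SET IDENTITY_INSERT dbo.Dest OFF;

Note that a "table does not allow null values" error usually means a NOT NULL column was left unmapped, so checking the column mappings is worth doing before touching identity settings.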
I have a project with 48 packages. When I press the green Start Debugging button, it seems like Integration Services is re-loading, or some type of validation occurs (the screen flashes as each package is highlighted in Solution Explorer), for all packages in the project before the specific package executes.
Also, if I close all the packages except the one to be executed and press the green Start Debugging button, all the packages are still loaded before the specific package executes. There is always a delay before the actual package runs when using the green Start Debugging button.
If I right-click the specific package in Solution Explorer and execute it, the package executes immediately, without any delay.
I did not have this issue until I tried to build and deploy the package.
We've recently upgraded to SQL Server 2014 and are now using SSIS integrated with Visual Studio. We have an SSIS project containing about 20 packages, which are nested in Sequence Containers and executed concurrently. These packages have been set up as project references.
The problem is that when I press the start button to run the packages, they all light up green, reporting completion, before the data has finished loading into the SQL database. If I press the stop button without waiting a sufficient length of time, not all of the data gets loaded, i.e. a certain number of rows will be missing from some of the SQL tables.
If I click through to the individual package items and check the data flow progress while running, some of the data flows appear to hang at a certain number of rows without ever reaching completion. The number of rows indicated in the data flow is incorrect - i.e. it will count up to ~150,000 and stay there indefinitely in the running state, when in actual fact there are ~500,000 rows to load.
To clarify, the main package will show all items green and display the "Finished: Success" message in the log window, however when I drill through to certain packages in the set, they'll be stuck in the yellow running state, with no way of knowing whether they've actually completed or not.
My current workaround is to just wait a certain length of time before pressing the stop button. This bug doesn't seem to inhibit rows being loaded - it just incorrectly identifies the point when the load finishes, causing people to terminate the load prematurely.
This issue only occurs if I run the project from the main package container. If I execute the child packages individually, they correctly report the number of rows being loaded and light up green once complete.
I'm new to Integration Services. I want to create a centralized reporting system for our customers. Some customers have up to 1,000 sites, and some are expected to grow past 5,000 sites. The sites are running POS applications, and I want to extract the POS sales data from these sites. Is it practical to expect that SSIS can handle extracting data from this many sites and loading it into a central SQL database? The POS sales data at the sites is stored in SQL Server Express databases, but the data is also available in XML format. If it's practical for Integration Services to do this, at what frequency is it possible to pull this data? I realize the amount of data is relative, but I'm wondering if anyone is attempting to do this with Integration Services. If not with Integration Services, what methods are available and used to extract data from this many remote sites?
I'm having serious problems with the IDE for SSIS on projects that contain more than 5 packages, especially if the packages call each other with an Execute Package Task configured with a file connection. Especially annoying are the 20+ "Document contains one or more extremely long lines of text..." messages that pop up during loading/validation. For my project of around 30 packages it takes me around 10 minutes to click through all these pop-ups, in addition to the long loading time.
Anyone got any tips on this specifically or how to improve performance in the IDE in general? As it is now, the product is a REAL pain to work with for large projects.
I have a general question concerning the performance impact of massively parallel data imports in one SSIS package.
We have a database on a SQL Server 2005 SP1 server (2 Xeons at 3.8 GHz, 4 GB of RAM) for a reporting web app which is updated every day with data for the last year/3 years. The data is extracted from several different databases on multiple machines at different locations. Right now, there are imports/transformations from 7 companies at 3 locations. The table has ~80 columns and about 2 million rows. I built an SSIS package with one company's import and added the others by copy-pasting all the tasks in the package and changing connection parameters and values. Soon there will be 6 more companies to import from, and possibly about 20 some day.
Now, when these 7 imports run in parallel, there are 3 simultaneous imports from the same source server. Sometimes one of these imports seems to hang. I cannot reproduce it; when I run the package 2 or 3 times, it's gone. So I serialized some of the imports to reduce the parallel tasks to 4, and then the problem disappears. The MaxConcurrentExecutables value is set to 6, and "Retain same connection" is set to TRUE.
My questions, regarding stability and performance, are:
1.) Is it better to do these imports in separate packages, and if so, can I schedule multiple packages to execute in parallel from SQL Server Agent?
2.) Or should they be combined in one package, running (partly) in parallel?
3.) What is an appropriate value for MaxConcurrentExecutables, and what options do I have to speed up these imports?
I've run into a problem with SSIS packages wherein tasks that write or copy files, or create or delete directories, quit execution without any error or failure message when called from an ASP.NET 2.0 application running on any machine other than the one where the package was created. By all indications it appears to be an identity/permissions problem.
Our application involves a separate web server and database server. Both have SQL Server 2005 installed, but the application server originally had only Integration Services. The packages are file-system-deployed on the application server and are called using Microsoft.SqlServer.Dts.Runtime methods. The above problem occurs for all packages that involve file system tasks.
When the above packages are run from the command prompt (either DTEXEC or DTEXECUI), they execute just fine. This is expected, since we are using an administrative account. However, when a ShellExecute of the same command is called from ASP.NET, the same problem occurs.
I've tried giving administrative permissions to the ASPNET worker process user to no avail.
I have likewise attempted to use the SQL Server Agent job approach but that approach might not be acceptable for our clients since it means installing SQL Server 2005 Database services on the application server.
I have read the relevant threads in this forum, namely http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1044739&SiteID=1 and http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=927084&SiteID=1 but failed to find any solution appropriate for our set up.
We manage some SSIS servers, which have only SSIS and the SSIS tools installed on them, not the SQL Server database engine.
SSIS packages and configuration files are deployed on a NAS. We run the SSIS packages through DTEXEC by logging in to the server.
We want to allow developers to run their packages on the server on their own, but at the same time we don't want to give them direct access to the server, i.e. we do not want to add them to the RDP users list in the server properties. We want to allow them to run their packages on the server remotely.
One way we could think of is PowerShell remoting, and we are working on that. But is there any other way, or any existing tool, for the same?
Hello everyone, I have a very complex performance issue with our production database. Here's the scenario: we have a production web server and a development web server, both running SQL Server 2000.
I encountered various performance issues on the production server with a particular query. It would take approximately 22 seconds to return 100 rows, i.e. about 0.22 seconds per row. Note: I ran the query in single-user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plans and found that they were exactly the same in both cases. Then I took a look at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming the issue is related to some external hardware factor such as disk space or memory, or perhaps to OS/software issues such as service packs or SQL Server configuration. Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There is 55 GB of free space on the disk.
2. Applied the SQL Server SP4 service pack.
3. Defragmented the primary hard drive.
4. Applied all Windows Server 2003 updates.
Prod server specs: 2x Intel Xeon 2.67 GHz; 2 GB total physical memory (815 MB available); Windows Server 2003 SE w/ SP1.
Dev server specs: 2x Intel Xeon 2.80 GHz; 2 GB DDR2 SDRAM; Windows Server 2003 SE w/ SP1.
I'm not sure what else to do; the query performance differs by an order of magnitude and I can't explain it. To me it is a hardware or operating system related issue. Any ideas would help me greatly!
Thanks, Brian T
Would using the RESTORE command be the best option to take a backup from one database (db1) and apply it to another database (db2), even though they have different names?
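If it helps, here is a minimal sketch of that kind of restore; the file names and paths are hypothetical (the logical names can be found with RESTORE FILELISTONLY):

    RESTORE DATABASE db2
    FROM DISK = N'C:\Backups\db1.bak'
    WITH MOVE 'db1_Data' TO N'C:\Data\db2.mdf',
         MOVE 'db1_Log'  TO N'C:\Data\db2_log.ldf',
         RECOVERY;

The MOVE clauses matter when restoring under a new name, because otherwise the restore tries to reuse db1's original file paths.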
I am trying to load many XML files into SQL Server 2005 Standard Edition. Each XML file contains many individual XML items, each starting with: <?xml version="1.0" encoding="UTF-8"?> The files do not have a .xml extension. I do have a schema for the data. I can also get hold of a dtd file for the data.
Since I am new both to XML and to databases I have been trying to follow examples wherever I can find them. I have not found the right example or tutorial. The closest that I found was in Kirk Haselden's book (a very nice book to learn from, by the way). On page 73 he sets up an OLE DB Source. His example works fine. But when I try to use XML Source instead of OLE DB as the source, I cannot get it to work. A typical error that I get reads as follows:
"Error at Data Flow Task [XML Source [1]]: There was an error setting up the mapping. The 'http://www.w3.org/XML/1998/namespace:lang' attribute is not declared."
Evidently I am doing something wrong, but what? Can someone point me to an example of using SSIS to load many XML documents that I could use as a model?
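As an aside, if the immediate goal is just to get each document into the database (to be shredded with XQuery later), a different approach from the XML Source is to bulk-load files into an xml column with T-SQL. A minimal sketch, assuming SQL Server 2005 and hypothetical file/table names, and assuming each document has been split into its own file (a file holding many concatenated documents, each with its own <?xml?> declaration, is not a single well-formed document):

    -- Hypothetical staging table with an xml column.
    CREATE TABLE dbo.XmlStaging (Id INT IDENTITY PRIMARY KEY, Doc XML);

    -- Load one file as a single blob and cast it to xml.
    INSERT INTO dbo.XmlStaging (Doc)
    SELECT CAST(BulkColumn AS XML)
    FROM OPENROWSET(BULK N'C:\Data\file001.dat', SINGLE_BLOB) AS src;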
We are trying to load flat text files with upwards of 7 million records into a table in SQL Server. The table has a clustered index on 3 fields. We are sometimes able to complete smaller tables (500,000-750,000 records) with the indexes built prior to importing the data; however, when we try the larger tables, an error occurs:
Error at Destination for row number 6785496. Errors encountered so far in this task: 1
Location: somerge.c:1573 Expression: mrP->mrStatus!=MERGERUN::NONE SPID: 11 Process ID: 173
None of the records end up importing. The row number it gives is always the total number of records in the text file I was trying to import. I tried importing the text files first and then building the clustered indexes, but a table with only 300,000 records ran for nearly 4 days without completing before we killed it.
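For loads of this size, a common pattern is to load into a heap (no clustered index) and build the index afterwards, committing the load in batches to keep the log manageable. A minimal T-SQL sketch, with hypothetical table, file, and column names, assuming SQL Server 2005+ syntax:

    -- 1. Drop the clustered index before the load.
    DROP INDEX IX_BigTable_Clustered ON dbo.BigTable;

    -- 2. Bulk-load the flat file; BATCHSIZE commits in chunks, and TABLOCK
    --    allows minimally logged inserts under the right recovery model.
    BULK INSERT dbo.BigTable
    FROM N'C:\Data\bigfile.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n',
          TABLOCK, BATCHSIZE = 100000);

    -- 3. Rebuild the clustered index once the data is in.
    CREATE CLUSTERED INDEX IX_BigTable_Clustered
        ON dbo.BigTable (Col1, Col2, Col3);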
Hi, I am loading data from a mainframe to SQL Server on Windows NT. Normally the DTS job takes 35 minutes. For the last two days it has run for more than six hours and still doesn't finish. I am at a loss to know what to do and how to fix this problem. The mainframe people said SQL Server is fetching the data very slowly. If anyone knows the solution, please post it.
Hi! Maybe this question is very common, but I can't resolve it. I have a DB marked 'loading', and nothing seems to change this state. Is there any way to revert to a normal state, or is the only way to destroy and re-create it? Thanks in advance.
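For context: a database stays marked 'loading' when a restore was run WITH NORECOVERY (or was interrupted) and has never been recovered. A minimal sketch of bringing it online, with a hypothetical database name:

    -- Recover the database without restoring anything further.
    RESTORE DATABASE MyDb WITH RECOVERY;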
What is the best way to load large amounts of data? I am working on a project where I will need to load data into approx. 20 tables. Into several of the tables I will need to load around 400,000 records. I am familiar with the concepts involved in using BCP but was hoping I could avoid the step of going to text files. I am pulling data from Access (either 97 or 2000). Any suggestions would be welcome.
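One option that avoids intermediate text files is to query the Access file directly through the Jet OLE DB provider from T-SQL. A minimal sketch with hypothetical paths and table names, assuming ad hoc distributed queries are permitted and the Jet provider is installed on the server:

    INSERT INTO dbo.TargetTable (Col1, Col2)
    SELECT Col1, Col2
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                    'C:\Data\source.mdb'; 'Admin'; '',
                    AccessTable);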
I gave the command to restore a database and cancelled it in the middle of recovery. Now the database shows 'loading' and I am not able to do anything with it. Please suggest how to make it operational. I also want the data up to the time I gave the restore command. Thanks.
How would you fix these two problems? 1. In the Enterprise Manager window, under Databases, one of the databases shows as manag_dat (recovering) rather than manag_dat (manag_dat is the name of the database). What could cause the problem, and how do I fix it? 2. Same place as above, but instead it shows manag_dat (loading). I tried to delete its device and the database, and it hung. How does this happen, and how do I fix it?
I have a server X with 100 logins. Periodically, server X gets new logins created and existing users' passwords reset. Aside from this, I have a server Y. What I would like to do is have all the logins from server X created on server Y, with the same login/password, at the end of every day.
I've tried BCP, but I don't know if BCP will work for this or not. Is there a way to do this?
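BCP moves table data, not server logins, so it won't carry passwords across. Microsoft's documented route is the sp_help_revlogin script (KB 918992), which generates CREATE LOGIN statements preserving hashed passwords and SIDs. A minimal hand-rolled sketch of the same idea, assuming SQL Server 2008 or later (the style-1 binary-to-string CONVERT is not available earlier):

    -- Generate CREATE LOGIN statements that preserve password hash and SID.
    SELECT 'CREATE LOGIN ' + QUOTENAME(name)
         + ' WITH PASSWORD = '
         + CONVERT(VARCHAR(514),
                   CAST(LOGINPROPERTY(name, 'PasswordHash') AS VARBINARY(256)), 1)
         + ' HASHED, SID = ' + CONVERT(VARCHAR(200), sid, 1) + ';'
    FROM sys.sql_logins
    WHERE name NOT LIKE '##%';

Running the generated script on server Y from a nightly scheduled job would keep the logins in sync.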
I have just restored one of my databases. Now that database is no longer available to me; it has been marked as 'Database (Loading)'.
I have reviewed and tried the suggestions given from previous posts of this same question, and they have not been successful (i.e. 'RESTORE DATABASE DPeople WITH RECOVERY' as well as re-restoring the database).
I am getting the following errors when I try to load a database dump from Prod into another database. The first error occurred on the first run and the second error occurred on the second run. The database is in the 'loading' status. Any help would be great. Thanks, Glen
2823, Severity: 20, State: 1, Process 10 tried to remove a PROC_BUF 0x2262848 named 'sp_DPSegVolsByDealID' that it does not hold in the hold procedure linked list..
Error : 2827, Severity: 20, State: 1, Procedure 'sp_DPSegVolsByDealID' in procedure buffer 0x2262848 is not properly hashed..
I am running SQL 2000 SP2 on a Windows 2000 box. I attempted to restore a dB a couple of times and the restore failed, but the dB now appears in EM twice, once marked Loading and the other Read-Only. I stopped SQL Server and attempted to delete the LDF and MDF files, but there was a sharing violation. How can I detach these dBs from the SQL Server? Detach is not an option, because the 'OK' button is not clickable. HELP anyone. Larry...
I have to load a 4-level XML file into a relational database.
1. The input XML file contains a DTD definition reference (it's an external file).
2. When I tried using an XML Source, it prompted me to set the ProhibitDtd setting to false on the XmlReader. How do I do this?
3. I have primary keys (identity columns) which are foreign keys at the next level, and these primary keys can be generated only once I insert the record into the table. What would be the best approach for this kind of data transformation? (One common pattern is sketched below.)
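On point 3, the usual relational pattern (whether run from an SSIS Execute SQL Task or plain T-SQL) is to insert the parent level first, capture the generated identity values, and join them back when inserting the children. A minimal T-SQL sketch with hypothetical staging and target tables, keyed on a business key carried in the XML:

    DECLARE @NewParents TABLE (ParentId INT, BusinessKey NVARCHAR(50));

    -- Insert parents and capture the identity values as they are generated.
    INSERT INTO dbo.Parent (BusinessKey, Name)
    OUTPUT inserted.ParentId, inserted.BusinessKey INTO @NewParents
    SELECT BusinessKey, Name
    FROM dbo.ParentStaging;

    -- Use the captured keys as foreign keys for the next level down.
    INSERT INTO dbo.Child (ParentId, Detail)
    SELECT np.ParentId, cs.Detail
    FROM dbo.ChildStaging AS cs
    JOIN @NewParents AS np ON np.BusinessKey = cs.BusinessKey;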
I have been getting the error below from an SSIS package that has been running successfully in production for 2 months. This started after we installed SP2. We had errors during the SP2 installation, during the Notification Services and Workstation Tools installs, but the installation program reported that the installations were successful, so I am not sure if this is related.
Here is the error message:
Date: 6/12/2007 4:53:12 PM
Log: Job History (SchedulerEMailer)
Step ID: 1
Server: MISSRV7
Job Name: SchedulerEMailer
Step Name: new step
Duration: 00:00:08
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Message: Executed as user: XSRV7SYSTEM. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 4:53:12 PM Error: 2007-06-12 16:53:19.90 Code: 0xC0010018 Source: Description: Error loading value "<DTS:LogProvider xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="ConfigString">MISSRV7.SSISConfig.ssisuser</DTS:Property><DTS:Property DTS:Name="DelayValidation">0</DTS:Property><DTS:Property DTS:Name="ObjectName">SSIS log provider for " from node "DTS:LogProvider". End Error Could not load package "ApptEmailer" because of error 0xC0010014. Description: One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors. Source: Started: 4:53:12 PM Finished: 4:53:19 PM Elapsed: 7 seconds. The package could not be loaded. The step failed.