SSIS Package Validation Taking A Long Time
Apr 19, 2006
I have SSIS projects that take a long time to open when their packages contain a large number of data flows. Is there a way to turn off validation of metadata when a package opens? Or to turn off validation during execution on the SSIS service (after the package has already been validated in dev)? Or, in general, to control when validation takes place?
In one of my packages (1 of 5) I have 43 data flows (each with a single source-to-target mapping) in 4 sequence containers, and it takes approximately 2-3 seconds per source-to-target mapping and sequence container to validate, which translates to 1½ to 2½ minutes just to open the package. When the project with all 100+ tables for the data warehouse goes through validation, I can make coffee in the time it takes to open the project. I have to delete the *.suo file (or verify all packages are closed in the designer and save the project file), and when I open the project, I have to jump immediately to SSIS > Work Offline so it does not validate the metadata, just to be able to work in a timely fashion. DelayValidation=TRUE does not help much.
Running in debug mode also causes packages that were not open and validated to go through validation, even though I am not running those packages. Validate once during design and run forever.
Even if I re-open a package that I just closed in the designer and that had already gone through validation, it goes through the validation process again.
It would be great if there could be an on-demand option off the menu bar to allow one to control when validation can take place for a project, or a more granular validation option for a specific data flow or container.
View 7 Replies
Feb 1, 2008
I would like to know how to disable validation of an SSIS package when it runs on the server.
The package spends about 20% of its run time validating the environment.
Is there any configuration in SSIS to tell it that the environment is stable and no validation is required?
Thanks.
View 4 Replies
View Related
Apr 14, 2008
Hi,
I have a DTS package in SQL Server 2000 that imports some data from a remote server (SQL 2000) to a local server (SQL 2000). This works fine; it takes at most 1 minute to execute the package.
Now I have created the same package in SQL Server 2005 (SSIS), where the source server is SQL Server 2000 and the destination server is 2005, and I have built the SSIS package on that server with the same logic I had in SQL 2000. But here it takes almost 10 minutes to execute the package, whereas the same package in SQL Server 2000 takes at most 1 minute. Why is this happening? Is there any configuration needed to execute the SSIS package?
View 12 Replies
View Related
Jul 25, 2007
I am using SSIS to populate a star schema.
The issue is in the data flow for loading and setting the Fact table dimension keys (the dimensions are all loaded fine). After 16 rather pedestrian Lookup Transformations, I have an escalating problem adding additional Lookup transforms to the Data Flow. The problem is not in execution; the problem is adding more transforms in design mode.
Lookup #    Fields in Data Flow    Time to validate that lookup
< 17        47                     Sub-second
17          48                     2 sec
18          49                     4 sec
19          50                     8 sec
20          51                     16 sec
21          52                     32 sec
22          53                     64 sec
While I'm intrigued by the mathematical progression forming here, the issue is that I have at least 6 more Lookups to perform. I hope you can see my dilemma.
I have gotten to the point where it takes a little over 4 minutes each to validate a lookup transform and its associated Derived Column transform and Union transform (12 minutes total). Not only does this add many idle minutes to each design step, BUT it breaks the debugger, because it pre-validates the ENTIRE data flow before it ever switches into debugging mode.
Some notes:
1. It doesn't matter what order the Lookup transforms occur in; the timings are exactly the same.
2. I tried many Data Flow execution optimizations, but they don't improve the validation times (or even get a chance to improve the execution times!)
I realize this may be somewhat of a unique problem.
Thanks for any help you are able to lend.
-Dave
View 3 Replies
View Related
Sep 7, 2006
I load an SSIS package using the following code:
using Microsoft.SqlServer.Dts.Runtime;
// Load the package from the msdb package store on the given server (no credentials, no events listener)
Application app = new Application();
Package pac = app.LoadFromSqlServer(packageName, serverName, null, null, null);
For a simple package containing 2 tasks, this code takes about 20 seconds to execute.
If I load an old-version (SQL2K) package from SQL 2000, it takes 5 seconds.
Is there any way to increase loading speed for SSIS packages?
View 9 Replies
View Related
Sep 4, 2007
Hi:
I have a query which returns approximately 50,000 records. I am using a linked server to connect to two databases and retrieve data. For some reason it is taking a little more than an hour to execute, whereas in a SQL Server query window the results start coming back after a few minutes, but the query still runs for a long time.
How can I expedite my query execution?
Environment details
Database: MS Sql Server 64bit 2005
MS Sql jar file: sqljdbc_1.2.jar
OS: Windows both server and client.
Connect String in java code:
jdbc:sqlserver://sample_server:1433;databaseName=sample_db;user=admin_user;password=admin_pwd
and use PreparedStatement and ResultSet.
Regards
Arup
View 2 Replies
View Related
May 16, 2007
I'm running a query (see below) on my development server and it's taking around 45 seconds. It hosts 18 user databases ranging from 3 MB to 400 MB. The production server, which is very similar but with only one 25 MB user database, runs the query in less than 1 second. Both servers have been running on VMWare for almost 1 year with no problems. However last week I applied SP 2 to the development server, and yesterday I applied Critical Update KB934458. The production server is still running SQL Server 2005 Standard SP 1. Other than that, both servers are identical and running Windows 2003 Server Standard SP 1. I'm not seeing this discrepancy with other queries running against user databases.
use MyDatabase
GO
select db_name(database_id) as 'Database', o.name as 'Table',
s.index_id, index_type_desc, alloc_unit_type_desc, index_level, i.name as 'Index Name',
avg_fragmentation_in_percent, fragment_count, avg_fragment_size_in_pages,
page_count, avg_page_space_used_in_percent, record_count,
ghost_record_count, min_record_size_in_bytes, avg_record_size_in_bytes, forwarded_record_count,
schema_id, create_date, modify_date from sys.dm_db_index_physical_stats (null, null, null, null, 'DETAILED') s
join sys.objects o on s.object_id = o.object_id
join sys.indexes i on i.object_id = s.object_id and i.index_id = s.index_id
where db_name(database_id) = 'MyDatabase'
order by avg_fragmentation_in_percent desc
--order by avg_fragment_size_in_pages desc
--order by page_count desc
--order by record_count desc
--order by avg_record_size_in_bytes desc
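For comparison, passing null as the first argument makes sys.dm_db_index_physical_stats walk every database on the instance in DETAILED mode before the WHERE clause filters the output, so a narrower call may behave very differently. A rough sketch scoped to the current database; LIMITED mode omits leaf-level columns such as record_count, so the column list is trimmed:
-- Scope the DMV to the current database and use the cheaper LIMITED scan mode
select db_name(s.database_id) as 'Database', o.name as 'Table', i.name as 'Index Name',
    s.index_id, s.avg_fragmentation_in_percent, s.page_count
from sys.dm_db_index_physical_stats (db_id(), null, null, null, 'LIMITED') s
join sys.objects o on s.object_id = o.object_id
join sys.indexes i on i.object_id = s.object_id and i.index_id = s.index_id
order by s.avg_fragmentation_in_percent desc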
View 4 Replies
View Related
May 31, 2007
I have a SQL 2000 db of about 120 GB. It is taking about 10-12 hours to restore on the same disk as a new database.
The server configuration is good.
When I try to restore another db of about 10 GB, it restores in about 5 minutes.
View 2 Replies
View Related
Oct 13, 2003
Sometimes it is necessary to stop MSSQLServerOLAPService to do a full backup of my OLAP Server disks. After the backup has finished I try to start MSSQLServerOLAPService again, but it takes almost 30 minutes for the service to start.
What can be causing that?
Paulo
View 5 Replies
View Related
Aug 2, 1999
I have a stored procedure that normally takes about 5 hours to complete:
DELETE tblX WHERE PROC_DT < dateadd(day, -93 , getdate())
tblX has about 55 million records and has an index on PROC_DT.
I have this running as a scheduled task. Over the weekend, the task executed and it is still running 56+ hours later. Does anybody have any ideas as to where I should look for the problem? I am afraid to kill the process because of the rollback time.
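A pattern often used for purges this size is to delete in batches so each transaction stays small and can be rolled back quickly. A rough sketch against the same table and predicate; the 10,000-row batch size is an arbitrary figure to tune:
-- Repeat the delete in 10,000-row batches until no rows qualify
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE tblX WHERE PROC_DT < dateadd(day, -93, getdate())
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0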
View 4 Replies
View Related
Mar 7, 2002
Hi,
I have a table with 48 million rows. When I executed the following update query it took 10 HOURS in SQL SERVER 2000 with SP1.
Whereas when I executed the same query against the same table in SQL SERVER 7.0 it took 13 MINUTES. As for the machines, the SQL 2000 server has more processors and more memory than the SQL 7.0 machine.
It looks strange, but it is true. Has anyone faced such a problem? Is there any bug in SQL 2000?
Here is Query::
update cus_pay_jan_dist set univ_regdate = b.dayid
from cus_pay_jan_dist a with (nolock), tm_dayids b with (nolock)
where a.univ_regdate = b.dayidnum and a.univ_regdate like '2001%'
Thanks
Ananth
View 6 Replies
View Related
Sep 19, 2007
Hi All,
Scenario:
There are two applications running on different servers, say ServerA and ServerB. Both applications use the same database server, SQL Server 2005, on ServerB. I will call the applications ApplicationA and ApplicationB, matching their server names.
This means the database is remote for ServerA and local for ServerB.
Both applications are Java applications and use a datasource to connect to the database. The driver used is the SQL Server 2000 driver (which consists of 3 jars). You may ask why the 2000 driver is used against 2005: the reason is that the application on ServerA gets a driver error when using the SQL Server 2005 driver.
Problem Area:
When ApplicationB (local to the database) is doing some DB operations (which include a select and then a batch insert), ApplicationA (remote) tries to insert a record, and that insert takes too long (around 40 sec.). This causes a timeout in ApplicationA.
ApplicationA is inserting data into the same table from which ApplicationB is selecting data.
Any help????
Cheers
Nitin
View 2 Replies
View Related
Mar 27, 2008
I have an update statement in my SSIS package that used to take 10 minutes as a SQL 2000 DTS package and now takes 1 hour 15 minutes in SQL 2005.
This is my SQL update statement:
Update WeeklySalesHistory set
weekendingdate =
(SELECT LastTransDateTime from ReplicationControl
where TableName = 'WEEKHST')
where weekendingdate is null
It uses an OLE DB connection. It is updating about 36,000 records.
I have read that OLE DB can be slow and that a staging table should be used. Does that mean that on all updates like this I have to use a staging table and then insert? I didn't have to do this in SQL 2000. Has something changed? Are there any other options?
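One alternative to a staging table is to read the single lookup value into a variable first and then run a plain update, so the subquery is not evaluated as part of the update itself. A rough sketch using the table and column names from the statement above; it assumes ReplicationControl holds one row for 'WEEKHST':
-- Fetch the lookup value once, then apply it to the unpopulated rows
declare @LastTransDateTime datetime

select @LastTransDateTime = LastTransDateTime
from ReplicationControl
where TableName = 'WEEKHST'

update WeeklySalesHistory
set weekendingdate = @LastTransDateTime
where weekendingdate is null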
any input greatly appreciated.
View 7 Replies
View Related
Mar 30, 2007
Hello,
I'm trying to figure out why my transaction log backup is taking up to an hour to complete. I started off with the full recovery model, with a full database backup every Sunday, differential backups every Tuesday/Thursday and log backups every 5 minutes. I would have thought that the log file backups would execute much quicker because I'm backing them up more often.
Here is my backup statement, I'm hoping I've got a wrong option that you can point out to me:
BACKUP LOG [xxxx] TO [LogFilexxxxBackups] WITH NOINIT , NOUNLOAD , NAME = N'xxxx log backup', SKIP , STATS = 10, NOFORMAT
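If it helps to see where the hour is going, the backup history in msdb records how long each log backup actually ran and how much it wrote. A quick check might look like this, assuming the standard msdb history tables are intact:
-- Duration and size of recent transaction log backups for this database
select database_name, backup_start_date, backup_finish_date,
    datediff(second, backup_start_date, backup_finish_date) as duration_seconds,
    backup_size
from msdb.dbo.backupset
where type = 'L' and database_name = 'xxxx'
order by backup_start_date desc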
View 1 Replies
View Related
Apr 30, 2008
The process is as follows:
The destination table is truncated and indexes are dropped before loading and after data being inserted we re-create the indexes.
Before this, a view extracts data from more than 22 tables from a staging database and tries to insert this data in the destination table.
It used to take 12-15 mins, but since yesterday loading one particular table never completes. While loading, the database is set to Simple recovery. There is no blocking. It is part of a daily batch that loads 6 GB of data every day, but loading this one table just keeps running for hours. I tried rebuilding the indexes and restarting SQL Server, but to no avail.
Any help is much appreciated, as this is a production batch job.
Thanks in advance.
View 4 Replies
View Related
Sep 1, 2006
Good afternoon everyone. I have written a view that pulls customer demographic information as well as data from multiple scalar-valued functions. I am using this view to pull and send data from one database to another on the same SQL Server. The problem I am having is that I am running this import as a scheduled job in Windows, and the job is taking almost 24 hours to complete. The total number of records being pulled is around 21,000+. I have tried removing the functions from the view and it then takes only 20 seconds to pull the demographic information from the same 21,000+ records, but when I add the function calls the time to complete goes through the roof. Has anyone encountered this before? If so, what would you suggest doing? Any help would be appreciated. Here is the syntax for my view:
SELECT TOP 100 PERCENT
    CUS_EMAIL AS Email, CUS_CUSTNUM AS MemberID, CUS_PREFIX AS Prefix, CUS_FNAME AS FirstName,
    CUS_LNAME AS LastName, CUS_SUFFIX AS Suffix, CUS_TITLE AS Title, CUS_STATE AS State,
    CUS_COUNTRY AS Country, CUS_ZIP AS ZipCode, CUS_SEX AS Gender,
    CAST(CUS_DEMCODEA AS nvarchar(20)) + ',' + CAST(CUS_DEMCODEB AS nvarchar(20))
        + ',' + CAST(CUS_DEMCODEC AS nvarchar(20)) + ',' + CAST(CUS_DEMCODED AS nvarchar(20)) AS DemoCodes,
    dbo.GetSubScribedDateMLA(CUS_CUSTNUM, CUS_EMAIL) AS MLASubscribedDate,
    dbo.GetSubScribedDateMLP(CUS_CUSTNUM, CUS_EMAIL) AS MLPSubscribedDate,
    dbo.GetSubScribedDateLDC(CUS_CUSTNUM, CUS_EMAIL) AS LDCSubscribedDate,
    dbo.GetMLAExpiration(CUS_CUSTNUM, CUS_EMAIL) AS MLASubExpireDate,
    dbo.GetMLPExpiration(CUS_CUSTNUM, CUS_EMAIL) AS MLPSubExpireDate,
    dbo.GetLDCExpiration(CUS_CUSTNUM, CUS_EMAIL) AS LDCSubExpireDate,
    dbo.IsProspect(CUS_CUSTNUM, CUS_EMAIL) AS AGMProspect,
    dbo.IsCurrentCustomer(CUS_CUSTNUM, CUS_EMAIL) AS AGMCurrentCustomer,
    dbo.IsMLAMember(CUS_CUSTNUM, CUS_EMAIL) AS MLAMember,
    dbo.IsMLPMember(CUS_CUSTNUM, CUS_EMAIL) AS MLPMember,
    dbo.IsLDCMember(CUS_CUSTNUM, CUS_EMAIL) AS LDCMember,
    dbo.CalculateTotalRevenue(CUS_CUSTNUM, CUS_EMAIL) AS AGMTotalRevenue,
    dbo.GetPubCodes(CUS_CUSTNUM, CUS_EMAIL) AS ProductsPurchased,
    dbo.GetEmailType(CUS_CUSTNUM, CUS_EMAIL, CUS_RENT_EMAIL) AS EmailType,
    CUS_COMPANY AS Company, CUS_CITY AS City
FROM dbo.CUS
WHERE (CUS_EMAIL IS NOT NULL) AND (CUS_EMAIL <> '') AND (CUS_EMAIL_VALID = 'Y') AND (CUS_EMAIL LIKE '%@%.%')
    AND (CUS_RENT_EMAIL = 'Y' OR CUS_RENT_EMAIL = 'R' OR CUS_RENT_EMAIL = 'I')
    AND (CHARINDEX(' ', CUS_EMAIL) = 0) AND (CUS_EMAIL NOT LIKE '@%')
Thanks in advance Michael Reyeros
View 9 Replies
View Related
Apr 18, 2002
I have a backup maintenance plan that does a full backup daily at 03:00 and then transaction log backups every 2 minutes throughout the day to a RAIDed hard drive (set to overwrite after 2 weeks). When I go into Enterprise Manager and select the database to restore, it just seems to take too long to read the backup history in. Can this time be reduced? I need to be able to restore the database A.S.A.P. but still need a point-in-time restore to within 2 minutes of going down.
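One workaround, if Enterprise Manager is slow just reading the history, is to script the point-in-time restore directly, which skips the history scan entirely. A rough sketch with placeholder file names and an example STOPAT time:
-- Restore the last full backup without recovering, then roll the log forward
-- to a point in time just before the failure
-- (earlier log backups in the chain are restored WITH NORECOVERY first)
RESTORE DATABASE MyDatabase
    FROM DISK = 'E:\Backups\MyDatabase_full.bak'
    WITH NORECOVERY
RESTORE LOG MyDatabase
    FROM DISK = 'E:\Backups\MyDatabase_log.trn'
    WITH STOPAT = '2002-04-18 14:32:00', RECOVERY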
Thanks
View 1 Replies
View Related
Jan 14, 2006
Hi,
Why is it taking so long to open/create/render the reports the first time? Is there any configuration to change this? I don't think this behavior is related to report execution or caching! I think there is something else going on! Thanks.
View 12 Replies
View Related
Dec 11, 2007
Does the validation time of a DTSX package depend on the amount of data?
I have a data flow task which contains an OLE DB source reading data from a SQL Server 2005 view. The data then goes through 3 Lookup transformations before going into another view, also on SQL Server 2005 but in a different database.
The purpose of this package is to load fact data, so it has to deal with a few million rows. Before setting DelayValidation to true, it took a few minutes just to open the package in BI Studio. Now with DelayValidation set to true, I can open the package without any problem, but it takes more than 5 minutes during the Validation, Prepare for Execution and Pre-execute phases. During that time the memory usage and CPU time on the SQL Server go up significantly, though the CPU doesn't hit 100%. My client machine doesn't show any significant activity.
I have similar packages (OLE DB view -> 3 or 4 Lookup transformations -> OLE DB table) but dealing with dimension data. With DelayValidation set to false, those packages open in BI Studio within a few seconds. They also take only a few seconds during those 3 phases before the actual execution phase starts.
So I have the impression that the validation time depends on the amount of data in the database. Shouldn't it depend only on the metadata?
View 5 Replies
View Related
Nov 9, 2007
Dear friends,
Our package normally takes 2-2:30 minutes to fetch data over a VPN with some select queries. But sometimes its job runs for hours and hours. What could be the exact reason behind this? I guess its queries get stuck somewhere, but this never happens when we run the package from BIDS or run the job manually.
Please help. Thanks.
View 6 Replies
View Related
Nov 22, 2007
I have two columns in an Informix database: one has a data type of date, and the other has a data type of string.
The time is stored in string format. I have to validate that both are correct, not null, and greater than 1753, and concatenate them to get one datetime field to transfer to SQL Server.
Right now I am doing it in a Script Component, as I need to log an error if anything is wrong.
Is there a better way to do it (Derived Column or any other component) so that I can still log the error?
Thanks
Dharmbir
View 7 Replies
View Related
Jan 19, 2006
Whenever I open a saved SSIS package, validation takes over and it can take a long time to do that. Is there a way to disable the validation process when opening the SSIS package? Thanks.
View 6 Replies
View Related
Jun 11, 2007
Is there a way you can open an SSIS package without validating it?
The reason is that when I take a package from PROD and open it in DEV, initially all the settings and variables are wrong, and the validation takes a heck of a long time in that case. Then I need to change the variables and reload the package. And bloody hell - if I forget to change a variable, I sometimes have to validate the package 3 times. And sometimes I only need a visual look at the package - so why do I always need to wait for validation...
View 6 Replies
View Related
Jan 20, 2006
We are going to be running a package repeatedly 24/7. The same package against the same data store, filtered using a "stageFlag" so as not to read rows previously processed. We have various timing statistics and have yet to fine tune; but on the surface it appears that it takes approximately three minutes to validate and another three minutes to run. If we have no additional data on the second run it still takes three minutes to complete - to do nothing but skip rows already processed.
Is it possible to set this up to run repeatedly without the validation on each iteration?
Any ideas as to how this would be accomplished would be greatly appreciated.
View 17 Replies
View Related
Mar 14, 2006
My SSIS package just hangs (does nothing) after validation of the package tasks. I realised that it does 2 validations. It then hits "starting executing" and then nothing. I mean nothing. It just stays the same. When I look in the log file, I see the same messages as in the output window. My package has parallel extracts of data from the same data source, but from different tables. I don't know if that's the problem, but I really doubt it because I have done parallel table downloads countless times in version 2000. When I go into the data flow window, the source task does not even indicate that it is downloading (color yellow). Is there any reason why this would happen? Ooo, but the tasks execute just fine when I execute them individually (right click > execute).
View 17 Replies
View Related
May 21, 2008
A few pointers would be appreciated.
I am looking at building multiple SSIS packages. There will be some similarities, but flexibility is of the highest importance. The main packages will need to connect to SQL Server1 as a source and SQL Server2 as a destination to transfer dimension data from multiple databases. (Other SSIS packages may need to use SQL Server2 as a source and SQL Server1 as a destination.)
For a single dimension table containing the column dim_id on the target server (SQLServer2), I need to take the results of the following SQL and insert them into SQLServer2.database.dim_table:
select dim_id
from SQLServer1.database08.dim_table
union
select dim_id
from SQLServer1.database07.dim_table
union
select dim_id
from SQLServer1.database06.dim_table
Now next year the names of the databases on SQLServer1 will be database09, database08, database07!
So far my best thought is creating views in my destination SQL Server, so I need some way of dropping and recreating the views. Previously in DTS I would expect to see a SQL Server connection that I could use as source and destination; now I can see a SQL Server destination but not a source? Also, how do I just use SSIS to run some SQL, i.e. execute a stored procedure, or drop and create views?
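On that last point, the statements an Execute SQL Task could run to rebuild such a view each year might look like the sketch below. The view name, the dbo schema and the linked-server style names are all placeholders based on the example above, and the drop and create would sit in separate tasks (or separate batches):
-- Drop the old view if it exists
if object_id('dbo.vw_dim_ids') is not null
    drop view dbo.vw_dim_ids

-- Recreate it against the current three yearly databases on the source server
create view dbo.vw_dim_ids
as
select dim_id from SQLServer1.database09.dbo.dim_table
union
select dim_id from SQLServer1.database08.dbo.dim_table
union
select dim_id from SQLServer1.database07.dbo.dim_table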
Many thanks,
Ells
P.S. Flexibility is the key; in the last three months all the IP addresses and server names have changed more than once, so I need to be as flexible as possible.
View 2 Replies
View Related
Jan 19, 2007
When I push my SSIS packages up to my production server (which has a different data source than my development environment) and I try to open a package on the production server, it takes forever to validate all the steps of the SSIS package, because it is trying to validate against a data source that isn't there, so it just waits for each element it is validating to time out. This is exceptionally annoying.
Is there a way to turn off this validation 'feature'?
View 14 Replies
View Related
Oct 24, 2007
Hi
I have created a package which executes every 10 minutes. Last weekend, for maintenance purposes, I shut down my database. Now, as part of its initial execution, my package runs the default validation steps, and the database connection validation step fails. As this is default SSIS functionality, I am not able to capture this error. Is there any way to capture this error inside the SSIS package?
Thanks in advance.
Gnan
View 3 Replies
View Related
Dec 15, 2014
I have an SSIS package authored in SSDT for VS 2013 that cancels itself immediately after validation completes and execution commences. This behavior occurs when executed either in VS 2013 or from within SQL Server. No error messages are thrown in either the debug window or the log output (log is capturing everything). The only thing that occurs differently on this package as compared to another package I am able to execute successfully is that a command line window briefly flashes when the package cancels itself—but it is gone so fast I cannot read it. The last several lines of the debug output are as follows:
-----------------
Information: 0x40043006 at Merge Info, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Merge Info, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Merge Info, All Users CSV [2]: The processing of file "C:...AllUsers.csv" has started.
Information: 0x400490F4 at Merge Info, Lookup Org [47]: Lookup Org has cached 957 rows.
Information: 0x400490F5 at Merge Info, Lookup Org [47]: Lookup Org has cached a total of 26719 rows.
[Code] ....
Under what circumstances would an SSIS package cancel itself without throwing any errors?
View 2 Replies
View Related
Jul 20, 2007
I am the creator of this package. It used to work fine both from the studio and deployed on the server. I have come back to this project, but can't even get the package running in debug in the studio with ProtectionLevel set to EncryptSensitiveWithUserKey or EncryptSensitiveWithPassword.
Has anyone seen this problem before?
Here is my error message:
OnError,PC6071,SLCNTFJ3845,Get Address Parcel Route,{948E2EC3-4B1D-4465-B5B9-2DD95F91B1B3},{35E95EAA-0C59-4D79-A07E-6E876D603253},7/20/2007 9:54:27 AM,7/20/2007 9:54:27 AM,-1071611876,0x,The AcquireConnection method call to the connection manager "GEODB" failed with error code 0xC0202009.
OnError,PC6071,SLCNTFJ3845,Get Address Parcel Route,{948E2EC3-4B1D-4465-B5B9-2DD95F91B1B3},{35E95EAA-0C59-4D79-A07E-6E876D603253},7/20/2007 9:54:27 AM,7/20/2007 9:54:27 AM,-1073450985,0x,component "get parcel from Sub Struct" (75) failed validation and returned error code 0xC020801C.
OnError,PC6071,SLCNTFJ3845,Get Address Parcel Route,{948E2EC3-4B1D-4465-B5B9-2DD95F91B1B3},{35E95EAA-0C59-4D79-A07E-6E876D603253},7/20/2007 9:54:27 AM,7/20/2007 9:54:27 AM,-1073450996,0x,One or more component failed validation.
OnError,PC6071,SLCNTFJ3845,Get Address Parcel Route,{948E2EC3-4B1D-4465-B5B9-2DD95F91B1B3},{35E95EAA-0C59-4D79-A07E-6E876D603253},7/20/2007 9:54:27 AM,7/20/2007 9:54:27 AM,-1073594105,0x,There were errors during task validation.
Thanks!
View 1 Replies
View Related
Feb 13, 2008
Hello.
I have a query that takes 1.5 seconds to execute, but only 150 ms of CPU. The query is quite simple, just one WHERE clause against a clustered index.
SQL Server Execution Times:
CPU time = 156 ms, elapsed time = 1595 ms.
SELECT column1, column3, column4, ..., column10 FROM table WHERE column2 IN (37, 41, 43, 45, 49, 53, 55) ORDER BY column3 DESC
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC))
|--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=37 OR [u].[searchparamid]=41 OR [u].[searchparamid]=43 OR [u].[searchparamid]=45 OR [u].[searchparamid]=49 OR [u].[searchparamid]=53 OR [u].[searchparamid]=55 OR [u].[searchparamid]=59) ORDERED FORWARD)
I have tried to rewrite the query to an INNER JOIN instead.
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC))
|--Nested Loops(Inner Join, OUTER REFERENCES:([spal].[number]))
|--Index Seek(OBJECT:([MP].[dbo].[__search_parameters_lookup].[IX___search_parameters_lookup] AS [spal]), SEEK:([spal].[hash]=-1726604993) ORDERED FORWARD)
|--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=[spal].[number]) ORDERED FORWARD)
but the query still takes 1.5 seconds.
It spends 59% of its time (according to the execution plan) on sorting, 14% on the index seek of the __search_parameters_lookup table, and 24% on a clustered index seek of the __searchtest table.
How come it uses so little CPU but still takes 1.5 seconds? It seems to be reading from memory as well, so it shouldn't be an I/O problem?
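The logical versus physical read counts would show directly whether any of that elapsed time is going to disk; they can be captured alongside the timings with a sketch like this:
-- Show CPU/elapsed time and logical vs physical reads for the statement
SET STATISTICS TIME ON
SET STATISTICS IO ON
-- run the query from above here; the output lists logical reads (memory)
-- and physical reads (disk) per table
SET STATISTICS TIME OFF
SET STATISTICS IO OFF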
The index I have on the table is a clustered index on (column 2).
Any ideas of how I can improve this? I have tried with DTA, also with a non clustered index on column3.
If I remove some columns from the SELECT list the query executes a lot faster:
SQL Server Execution Times:
CPU time = 32 ms, elapsed time = 32 ms.
Both the CPU and the elapsed time go down and now appear to be more normal.
So there seems to be a problem caused by data transfer.
I tried to normalize the table as a test, and when I do that the query executes with 400 ms of CPU and 400 ms total. It is still the exact same result, so why does it spend only 400 ms fetching the data when the tables are normalized but 1500 ms when they are denormalized?
Any ideas?
I am running Microsoft SQL Server 2000 - 8.00.2039
View 6 Replies
View Related
Jan 28, 2008
Hello.
I have a query that takes 1.5 seconds to execute, but only 150 ms of CPU. The query is quite simple, just one WHERE clause against a clustered index.
SQL Server Execution Times:
CPU time = 156 ms, elapsed time = 1595 ms.
Code Snippet
SELECT column1, column3, column4, ..., column10 FROM table WHERE column2 IN (37, 41, 43, 45, 49, 53, 55) ORDER BY column3 DESC
Code Snippet
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC))
|--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=37 OR [u].[searchparamid]=41 OR [u].[searchparamid]=43 OR [u].[searchparamid]=45 OR [u].[searchparamid]=49 OR [u].[searchparamid]=53 OR [u].[searchparamid]=55 OR [u].[searchparamid]=59) ORDERED FORWARD)
I have tried to rewrite the query to an INNER JOIN instead.
Code Snippet
|--Sort(TOP 1000, ORDER BY:([u].[LastActivityDate] DESC))
|--Nested Loops(Inner Join, OUTER REFERENCES:([spal].[number]))
|--Index Seek(OBJECT:([MP].[dbo].[__search_parameters_lookup].[IX___search_parameters_lookup] AS [spal]), SEEK:([spal].[hash]=-1726604993) ORDERED FORWARD)
|--Clustered Index Seek(OBJECT:([MP].[dbo].[__searchtest].[cix___searchtest_] AS [u]), SEEK:([u].[searchparamid]=[spal].[number]) ORDERED FORWARD)
but the query still takes 1.5 seconds.
It spends 59% of its time (according to the execution plan) on sorting, 14% on the index seek of the __search_parameters_lookup table, and 24% on a clustered index seek of the __searchtest table.
How come it uses so little CPU but still takes 1.5 seconds? It seems to be reading from memory as well, so it shouldn't be an I/O problem?
The index I have on the table is a clustered index on (column 2).
Any ideas of how I can improve this? I have tried with DTA, also with a non clustered index on column3.
If I remove some columns from the SELECT list the query executes a lot faster:
SQL Server Execution Times:
CPU time = 32 ms, elapsed time = 32 ms.
Both the CPU and the elapsed time go down and now appear to be more normal.
So there seems to be a problem caused by data transfer.
I tried to normalize the table as a test, and when I do that the query executes with 400 ms of CPU and 400 ms total. It is still the exact same result, so why does it spend only 400 ms fetching the data when the tables are normalized but 1500 ms when they are denormalized?
Any ideas?
I am running Microsoft SQL Server 2000 - 8.00.2039
View 7 Replies
View Related
Jun 16, 2015
I've built an SSIS package in SSDT 2014. The package was running successfully.
When I close and reopen the package, it hangs on validation of a single task out of all of them:
OLEDB DESTINATION Task
I have tried closing and reopening SSDT several times, but I face the same issue.
This seems like a bug in SSIS/SSDT. What would cause the relocation of some script code to hang the validation process like this?
View 2 Replies
View Related