HELP!! How To Estimate The Time It Take...
Nov 1, 2004
to do a database repair on SQL Server v7.
This is what I did:
dbcc checkdb ('mydatabase', REPAIR_REBUILD) with estimateonly
It gave me the error:
estimateonly is not an option.
What did I do wrong? Thank you for your help!
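For what it's worth, in the versions I'm familiar with ESTIMATEONLY only estimates the tempdb space CHECKDB needs (not the elapsed time), and it is used on its own rather than combined with a repair option. A minimal sketch, using the database name from the post and sp_dboption for the single-user step that a v7 repair normally needs:
-- Estimate the tempdb space DBCC CHECKDB will need; no repair option in the same statement
DBCC CHECKDB ('mydatabase') WITH ESTIMATEONLY

-- The repair itself runs separately, with the database in single-user mode
EXEC sp_dboption 'mydatabase', 'single user', 'true'
DBCC CHECKDB ('mydatabase', REPAIR_REBUILD)
EXEC sp_dboption 'mydatabase', 'single user', 'false'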
View 7 Replies
Dec 3, 2003
I have an order system with repeating orders. Each order has an order line
with a frequency and an interval type (day, week, month and so on)
indicating how often a task should be done.
When a task/job is done, an equivalent receipt with receipt lines is
generated.
When a new job/task should be suggested, it has to calculate the
shortest next interval length for each repeating order line. This
interval length must then be added to the date of the last receipt line.
I want to do this all in a SPROC. My result should be the one next
order line that should be carried out, with the date for when this is.
RESULT
OrderID   TaskID         TaskDescr                         EstDate
5000      GETGROCERIES   Pick up groceries for miss Lama   19.12.2003
************************************************************************
Is it possible to do all of this in a SPROC, without having to do any of it in .NET
(which this is developed in)? Can you help me with some ideas?
************************************************************************
To see my starting point, see the diagram at:
http://www.promotion.no/div/diagram.gif
My mind is stuck, and this is as far as I've reached:
ALTER PROCEDURE dbo.GetNextOrders
AS
CREATE TABLE #Interval
(
ivID int IDENTITY(1,1),
ivTaskID char(20),
ivTaskDescr nvarchar(200),
ivHour int
)
INSERT INTO #Interval (ivTaskID, ivTaskDescr)   -- insert into the temp table created above; adjust the column mapping to your schema
SELECT RepID, RepIntName
FROM RepeatInterval
SELECT InvoiceDetails.iID, InvoiceDetails.ReceiptID,
InvoiceDetails.OrderID, InvoiceDetails.TaskID,
InvoiceDetails.TaskDescr, InvoiceDetails.DateRegistered
FROM InvoiceDetails LEFT OUTER JOIN
Orders ON InvoiceDetails.OrderID = Orders.OrderID
WHERE (Orders.DateStart < GETDATE()) AND
(Orders.DateEnd > GETDATE() OR Orders.DateEnd IS NULL)
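A rough sketch of one way to finish this, heavily hedged because the order-line table is only shown in the diagram: it assumes a hypothetical OrderLines table with Frequency and IntervalType columns (values like 'day', 'week', 'month'), takes each repeating line's last receipt date from InvoiceDetails, adds the interval, and returns the single line that is due soonest.
SELECT TOP 1
    o.OrderID,
    ol.TaskID,
    ol.TaskDescr,
    CASE ol.IntervalType
        WHEN 'day'   THEN DATEADD(day,   ol.Frequency, lastrec.LastDate)
        WHEN 'week'  THEN DATEADD(week,  ol.Frequency, lastrec.LastDate)
        WHEN 'month' THEN DATEADD(month, ol.Frequency, lastrec.LastDate)
    END AS EstDate
FROM Orders o
JOIN OrderLines ol                                   -- hypothetical table/column names
  ON ol.OrderID = o.OrderID
JOIN (SELECT OrderID, TaskID, MAX(DateRegistered) AS LastDate
      FROM InvoiceDetails
      GROUP BY OrderID, TaskID) lastrec              -- last receipt date per order line
  ON lastrec.OrderID = ol.OrderID AND lastrec.TaskID = ol.TaskID
WHERE o.DateStart < GETDATE()
  AND (o.DateEnd > GETDATE() OR o.DateEnd IS NULL)
ORDER BY EstDate
This kind of SELECT can live entirely inside the SPROC, so nothing needs to be computed in .NET.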
View 2 Replies
View Related
Jan 19, 2007
I have a 2.5TB database that we need to run checkdb on. Is there a formula to figure out how long it would take with no options? Doing research, it appears that we are stuck with doing PHYSICAL_ONLY due to time constraints.
View 7 Replies
View Related
Jan 28, 2008
As in the title. Is there any tool? I'm asking because I have some big databases, and processing may take a lot of time (at least a few hours), and I'd be glad if it were possible to know the estimated time before the query runs. I'm using MS SQL 2005 Developer Edition.
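I don't know of a tool that predicts wall-clock time, but a sketch of one related option in SQL Server 2005: request the estimated execution plan without running the statement, which at least exposes the optimizer's own cost and row estimates (the query below is just a placeholder):
SET SHOWPLAN_XML ON
GO
-- The statement is compiled but not executed; the estimated plan is returned instead
SELECT COUNT(*) FROM dbo.SomeBigTable   -- placeholder query
GO
SET SHOWPLAN_XML OFF
GO
Note that the estimated subtree cost in the plan is expressed in abstract cost units, not seconds.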
View 14 Replies
View Related
Apr 30, 2008
How long does it take to convert a DTS package to an SSIS package?
View 9 Replies
View Related
Jun 12, 2007
I would like to ask for some help estimating the necessary hardware for a database. I can tell you the following about the usage profile:
- 2 tables, approximately 100,000 records; the tables are joined in a 1:1 relation; one record is approximately 500 bytes long
- typical scenario:
- a select that returns 1 or zero records
- if it returned 1 record, an update is performed on this record
- the event is logged in a third table (one insert)
- this typical scenario needs to run 100-300 times every second
Can you give me any advice on sizing the hardware for this? We can rely on a well designed database structure (proper indexes, etc...)
Thanks in advance
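As a rough back-of-the-envelope check from the numbers above (my arithmetic, not the poster's):
100,000 rows x ~500 bytes ≈ 50 MB per table, so on the order of 100 MB of data for the two joined tables plus indexes.
That comfortably fits in memory on any current server, so at 100-300 runs of the scenario per second the bottleneck is likely to be log/write throughput for the update and the insert, i.e. a few hundred small write transactions per second.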
View 1 Replies
View Related
Apr 15, 2008
Hi, I've read that a SQL Server Express database is limited to 4 GB. If I have a database with about 120 tables which store only simple datatypes (int, money, datetime, varchar, etc.), no blobs or large files, how can I estimate the amount of data or number of rows it can hold before it runs out?
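One hedged way to get a handle on it: load a representative sample of data, check the actual space used, and extrapolate toward the 4 GB cap (the table name below is a placeholder):
-- Database-level totals (database size, data + index space used)
EXEC sp_spaceused
-- Per-table breakdown for the biggest tables
EXEC sp_spaceused 'dbo.SomeLargeTable'   -- placeholder name
-- Rough scaling: max rows ≈ sample rows * (4 GB / current reserved data+index size)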
View 11 Replies
View Related
Sep 3, 2007
Hello, how can I estimate the size (KB) of a temp table?
Thank you,
silas
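On SQL Server 2005 or later, one sketch (assuming the temp table is called #MyTemp and was created in the current session):
SELECT SUM(ps.reserved_page_count) * 8 AS reserved_kb,   -- 8 KB per page
       SUM(ps.used_page_count) * 8     AS used_kb
FROM tempdb.sys.dm_db_partition_stats AS ps
WHERE ps.object_id = OBJECT_ID('tempdb..#MyTemp')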
View 7 Replies
View Related
Sep 16, 2014
I was having an interesting discussion about estimating the log file size with a fellow colleague who happens to be quite knowledgeable.
He told me that if we identify the most frequently hit tables for a database and then take (the sum of their sizes * 1.5), for OLAP we get a rough estimate of the disk space to be allocated for the log file.
View 1 Replies
View Related
Feb 4, 2015
I have a table like below,
CREATE TABLE Student
(
Id BIGINT not null
,Name NCHAR(20) not Null
,Branch NVARCHAR (64) null
)
The table contains 100,000 rows.
I need to get the details below; a rough worked estimate follows the list:
1)Number of rows in a data page
2)Total number of pages required for the table
3)Total Table size in KB or MB
4)Total file size in Kb or MB
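A rough worked estimate, assuming a heap with no indexes and Branch fully populated (so this is toward the upper end):
-- Per-row size: 8 (bigint) + 40 (nchar(20)) + up to 128 (nvarchar(64)) + ~11 bytes row overhead
--   ≈ 187 bytes, plus a 2-byte slot array entry ≈ 189 bytes
-- Rows per 8,096-byte data page ≈ 8096 / 189 ≈ 42
-- Pages for 100,000 rows ≈ 100000 / 42 ≈ 2,400 pages
-- Table size ≈ 2,400 * 8 KB ≈ 19 MB (the data file will be at least this plus free space)
-- Actual figures after loading the data:
EXEC sp_spaceused 'dbo.Student'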
View 1 Replies
View Related
Jan 31, 2007
My client's website database is hosted by a third party. I need to alter one of the column definitions for the largest table in the database. Unfortunately, the transaction log fills up if I try to alter the table. I've done all the usual stuff like truncating the log, etc., but the simple fact is that the operation requires more log space than we have available. Therefore, we need to purchase additional disk space for the database.
What I'm looking for is a way to roughly estimate how much log space will be required to alter this table so that we purchase enough but not too much additional space. The table has an identity primary key and 4 other single column indexes: one int, one datetime and two varchar(30) columns.
Any suggestions? Thanks in advance.
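A hedged rule of thumb rather than a formula: an ALTER that has to rewrite every row typically needs log space roughly on the order of the table's data size, so checking that (and rehearsing on a restored copy, if at all possible) gives a defensible number. Table name below is a placeholder:
EXEC sp_spaceused 'dbo.YourLargeTable'   -- the 'data' figure approximates what must be rewritten and logged
DBCC SQLPERF (LOGSPACE)                  -- run before and after a rehearsal to see how much log was actually used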
View 4 Replies
View Related
Oct 16, 2014
How does SQL Server calculate the estimated row count when < or > is used in the WHERE clause of a statement? For example:
WHERE Created_Datetime_utc > CONVERT(DATETIME,'2014-10-14 10:00:00',102)
I know how the estimated number of rows is calculated when an = is used, but I've been googling and cannot find anything about < and >.
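For range predicates the estimate comes from the statistics histogram on the column; you can inspect the histogram directly and sum the rows above the boundary yourself (the statistics name below is an assumption):
DBCC SHOW_STATISTICS ('dbo.YourTable', 'IX_Created_Datetime_utc') WITH HISTOGRAM
-- RANGE_HI_KEY / RANGE_ROWS / EQ_ROWS show how many rows fall in each histogram step;
-- the optimizer sums the steps above '2014-10-14 10:00:00' to estimate the > predicate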
View 2 Replies
View Related
Feb 4, 2015
I have a table like below,
CREATE TABLE Student
(
Id BIGINT not null
,Name NCHAR(20) not Null
,Branch NVARCHAR (64) null
)
The table contains 100,000 rows. I need to get the details below:
1)Number of rows in a data page
2)Total number of pages required for the table
3)Total Table size in KB or MB
4)Total file size in Kb or MB
View 4 Replies
View Related
Aug 30, 2007
Hi,
I need to estimate the effort required in writing some SSIS packages.
Could anyone provide some pointers to the various factors (E.g. number of tables, source, etc.) which influence the effort estimate for SSIS packages?
Thanks in advance.
Regards,
B@ns
View 4 Replies
View Related
Jan 31, 2005
I need to full-text index a table so that I can easily search the text fields of that table. The table has about 21,000 rows, and I was wondering how long it might take to full-text index it?
thanks
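For reference, a minimal SQL Server 2005+ sketch of setting it up and polling the population (object names are placeholders; on 21,000 rows a full population normally finishes in minutes, depending on the size of the text columns):
CREATE FULLTEXT CATALOG SearchCatalog
CREATE FULLTEXT INDEX ON dbo.MyTable (MyTextColumn)
    KEY INDEX PK_MyTable ON SearchCatalog          -- KEY INDEX must be a unique, single-column index
SELECT FULLTEXTCATALOGPROPERTY('SearchCatalog', 'PopulateStatus')  -- 0 = idle, i.e. population finished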
View 1 Replies
View Related
Mar 24, 2015
How do I estimate the size of the data file while creating the database?
View 2 Replies
View Related
Jun 10, 2015
I've been asked to put together an estimation for the performance impact that replication would have on our database server during a particular operation. I know that this depends on a lot of different factors, including:
* Number of articles being replicated
* Types of articles being replicated
* Number of DML transactions that would result in delivery of replicated data
Any way to turn this into a meaningful metric?
View 0 Replies
View Related
Apr 22, 2015
What is the best way to forecast/estimate space for a non-clustered index on a table?
Example :
Table name: Test123
Rows: 170,000,000
Reserved: 18,000,000 KB
Data: 70,000,000 KB
Index: 40,000,000 KB
Note: Test123 already has a clustered index and 2 non-clustered indexes.
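One hedged way to estimate it without working through the size formulas: build the candidate index on a small sampled copy and scale up (names and the 1% sample are illustrative):
SELECT TOP (1) PERCENT * INTO dbo.Test123_sample FROM dbo.Test123           -- roughly 1.7 million rows
CREATE NONCLUSTERED INDEX IX_candidate ON dbo.Test123_sample (SomeColumn)   -- hypothetical key column
EXEC sp_spaceused 'dbo.Test123_sample'   -- index_size here, multiplied by ~100, approximates the real index
DROP TABLE dbo.Test123_sample
One caveat: the sampled copy is a heap, so its non-clustered index stores an 8-byte RID as the row locator instead of the clustered key; if the real table's clustered key is wider than that, the estimate will run a little low.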
View 7 Replies
View Related
Aug 7, 2007
Hi all,
I have created a report in SSRS 2005 which is being viewed by users from different Time Zones.
I have a dataset which has a field of type datetime (UTC). Now I would like to display this Date according to the User Time Zone.
For example if the date is August 07, 2007 10:00 AM UTC,
then I would like to display it as August 07, 2007 03:30 PM IST if the user Time Zone is IST.
Similarly for other Time Zones it should display the time accordingly.
Is this possible in SSRS 2005?
Any pointers will be useful...
Thanks in advance
sudheer racha.
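SSRS 2005 has no built-in time-zone handling, so one hedged workaround is to pass the user's UTC offset as a report parameter and shift the value in the dataset query (parameter and column names are assumptions):
-- @UserOffsetMinutes would come from an SSRS report parameter, e.g. 330 for IST (UTC+05:30)
SELECT DATEADD(MINUTE, @UserOffsetMinutes, EventDateUtc) AS EventDateLocal
FROM dbo.MyReportSource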
View 5 Replies
View Related
Apr 25, 2014
Sample Table
USE [Testing]
GO
/****** Object: Table [dbo].[Testing] Script Date: 4/25/2014 11:08:18 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ....
It seems to work fine with one million records.
Each primary key is unique, but the begindate is non-unique, and I guess even if I use datetime2 and add nanoseconds, from what I have read there is a chance that I could have a duplicate datetime, since the date is imported via XML from multiple sources.
View 7 Replies
View Related
Jun 11, 2015
Is there a way to keep track in real time of how long a stored procedure has been running? What I want to do is fire off a trace in a stored procedure if that stored procedure has been running for over 5 minutes.
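A sketch of one way to see it in near real time on SQL Server 2005 and later, which could also drive an alert job (the 5-minute threshold is taken from the post):
SELECT r.session_id,
       r.start_time,
       DATEDIFF(MINUTE, r.start_time, GETDATE()) AS minutes_running,
       t.text AS running_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID
  AND DATEDIFF(MINUTE, r.start_time, GETDATE()) >= 5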
View 5 Replies
View Related
Oct 22, 2015
I am trying to load previous days data at 3 am via a SSIS job.
The Date variable is initiated as DATEADD("dd",-1, GETDATE()) in the for loop.
Now, as this job runs at 3 am and I set the variable as GETDATE() - 1, it excludes the data from 12 am to 3 am from the result set, because Date is set to YYYY-MM-DD 03:00:00:000 and I need it to be set to YYYY-MM-DD 00:00:00:000.
How can I do this?
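The usual fix is to strip the time portion when setting the variable. In T-SQL terms the idiom is the one below; in an SSIS expression, the equivalent trick (hedged, from memory) is to cast the DATEADD result through DT_DBDATE, e.g. (DT_DATE)(DT_DBDATE)DATEADD("dd", -1, GETDATE()):
-- Midnight of the previous day, regardless of when the job runs
SELECT DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()) - 1, 0) AS PreviousDayMidnight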
View 2 Replies
View Related
Oct 3, 2015
I hope to update a DateTime column value with a Time input parameter. Poor attempt below, but it looks like the @ApptTime param is coming in as 10:45:00.0000000 and I might have an existing @SendOnDate such as 2015-10-05 07:00:00.000. I hope to end up with 2015-10-05 10:45:00.000.
ALTER PROCEDURE [dbo].[SendEditUPDATE]
@QuePoolID int=null
,@ApptTime time(7)
,@SendOnDate datetime
[code]...
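A minimal sketch of the date/time combination itself, outside the procedure (values taken from the post; works on SQL Server 2008+ where the time type exists):
DECLARE @SendOnDate datetime, @ApptTime time(7)
SET @SendOnDate = '2015-10-05 07:00:00.000'
SET @ApptTime = '10:45:00'

-- Strip the time from @SendOnDate, then add the time-of-day back as a datetime offset
SELECT CAST(CAST(@SendOnDate AS date) AS datetime) + CAST(@ApptTime AS datetime) AS CombinedDateTime
-- Returns 2015-10-05 10:45:00.000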
View 14 Replies
View Related
Dec 15, 2006
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQL CE 3.0. My PPC hardware runs at 400MHz.
The question is: when the program tries to insert the first record into the .sdf database after each time the program starts, it takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts and do all the "Insert", "Update", "Delete" operations in this dataset, then fill it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn) 'SQL = Select * From Table
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it takes about 0.08s to fill one record into the database normally. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I take these four steps every time, the filling time is almost 1s or even more!
Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to fill back a dataset which only inserted one record while the program is running (even when all the inserted records are exactly the same in data, just different in the integer key).
However, when I give up the dataset and use the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn) ' I have build the insert SQL before (Insert Into Table values(XXXXXXXXXXXXXXX All field)
cmd.CommandType = CommandType.Text
cmd.ExecuteNonQuery()
cn.Close()
StartTime = Environment.TickCount
I found that it is still the case that the first inserted record takes more time, but only about 0.2s, and the normal insert time is around 0.02s. It is 4 times faster!
View 1 Replies
View Related
Apr 21, 2015
SELECT
    CONVERT(VARCHAR(10), attnc_chkin_dt, 101) AS INDATE,
    CONVERT(VARCHAR(10), attnc_chkin_dt, 108) AS TimePart
FROM pmt_attendance
Output:
indate: 04/18/2015
time part: 17:45:00
I need to convert this 17:45:00 to 12-hour time format...
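One hedged option is convert style 100, which renders the time as h:miAM/PM, and then take just the time portion:
SELECT CONVERT(VARCHAR(10), attnc_chkin_dt, 101) AS indate,
       LTRIM(RIGHT(CONVERT(VARCHAR(20), attnc_chkin_dt, 100), 7)) AS time_12h   -- e.g. '5:45PM'
FROM pmt_attendance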
View 8 Replies
View Related
Nov 12, 2007
Hi,
We need to select rows from the database that have been recently inserted/updated. We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred). Triggers on the primary table are used to populate the update table.
If we insert or update the primary table in a transaction, we would expect that the datetime of the insert/update would be at the commit, however it seems that the insert/update statement is cached and getdate() is executed at the time of the cache instead of the commit. This causes problems as we select rows based on LAST_UPDATE and a commit may occur later but the earlier insert timestamp is saved to the database and we miss that update.
We would like to know if there is anyway to tell the SQL Server to not execute the function getdate() until the commit, or any other way to get the commit to create the correct timestamp.
We are using default isolation level. We have tried using getdate(), current_timestamp and even {fn Now()} with the same results. SQL Queries that reproduce the problem are provided below:
/* Different functions to get current timestamp - all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp ) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
declare @time datetime
select @time = getdate()
insert into COMMIT_TEST_UPDATE (PKEY,LAST_UPDATE)
select PKEY, getdate()
from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
declare @time datetime
select @time = getdate()
update COMMIT_TEST_UPDATE
set LAST_UPDATE = getdate()
from COMMIT_TEST_UPDATE, deleted, inserted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur, so we don't log when rows get modified; we just delete them from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
if ( select count(*) from deleted ) > 0
begin
delete COMMIT_TEST_UPDATE
from COMMIT_TEST_UPDATE, deleted
where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
end
GO
/* Delete any previous inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table - we see that the timestamp is 10 seconds older than the commit, in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT as it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE
Any help would be appreciated. We understand we could make changes to the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missing deals (assuming the commit time is larger than some artificial time window).
Regards,
Mark
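For what it's worth, the pattern I've seen used for exactly this problem is a rowversion watermark rather than a datetime: the reader excludes rows newer than the oldest still-active transaction, so nothing uncommitted can be skipped. A heavily hedged sketch, assuming the commented-out ROW_VERSION rowversion column is added to COMMIT_TEST_UPDATE and that MIN_ACTIVE_ROWVERSION() is available (later SQL Server versions only):
DECLARE @LastSync binary(8), @Current binary(8)
SET @LastSync = 0x0                          -- watermark saved from the previous sync
SET @Current = MIN_ACTIVE_ROWVERSION()       -- lowest rowversion an open transaction could still write

SELECT *
FROM COMMIT_TEST_UPDATE
WHERE ROW_VERSION >= @LastSync
  AND ROW_VERSION < @Current                 -- rows from still-open transactions are picked up on the next run
-- Persist @Current as the @LastSync value for the next run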
View 8 Replies
View Related
Sep 21, 2015
I need to do a time test for restoring an Azure SQL database from a point in time. Can I automate this through PowerShell?
View 3 Replies
View Related
Dec 26, 2014
I need to take a temporary table that has various times stored in a text field (4:30 pm, 11:00 am, 5:30 pm, etc.), convert them to military time, and then cast them as integers with an update statement, something like:
Update myTable set MovieTime = REPLACE(CONVERT(CHAR(5),GETDATE(),108), ':', '')
How can this be done while my temp table is still in session?
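A hedged sketch, assuming the temp table is #MyTemp and the text column is MovieTime: casting the 12-hour text to datetime lets style 108 render it as 24-hour hh:mm:ss, and REPLACE strips the colons so the result can be treated as an integer. The temp table only needs to be updated from the same session/procedure that created it:
UPDATE #MyTemp
SET MovieTime = REPLACE(CONVERT(CHAR(8), CAST(MovieTime AS datetime), 108), ':', '')
-- '4:30 pm' -> '16:30:00' -> '163000', which casts cleanly to int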
View 2 Replies
View Related
Oct 7, 2015
I have a table called employee_punch_record that we use to store employee time clock punches.
The columns are:
employeeid,
punch_timestamp,
punch_type (In / Out),
closed (bit used as status for open or closed pay periods),
ident
Here are some examples of a record:
bkingery6    2015-10-06 16:59:04.000   In    0
bkingery7    2015-10-06 16:59:09.000   Out   0
bkingery8    2015-10-06 16:59:13.000   In    0
bkingery9    2015-10-06 18:22:44.000   Out   0
bkingery10   2015-10-06 18:22:46.000   In    0
bkingery11   2015-10-06 18:22:48.000   Out   0
bkingery12   2015-10-06 18:22:51.000   In    0
tfeller5     2015-10-05 17:00:05.000   In    0
We are using SQL Server 2008 as our database and use Access as a GUI. I am looking to create a form in Access where employees can access their time card and request changes from management. I want to use the format from the attached screen shot for the form. I pretty much know how to do it all, the only point of complication is trying to figure out the easiest way to get the transaction punch record data on employee_punch_record into a format where I can easily populate the form in the horizontal format you see in the screen shot.
I am not super strong in SQL, but I figure I can do it using a formatting table of some sort. Is there a quick and easy way to move the transaction records into a more horizontally oriented record?
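A hedged starting point for the horizontal layout: pair each In punch with the next Out punch for the same employee and pivot from there (column names from the post; assumes punches alternate In/Out):
SELECT i.employeeid,
       CAST(i.punch_timestamp AS date) AS punch_date,
       i.punch_timestamp               AS punch_in,
       MIN(o.punch_timestamp)          AS punch_out   -- earliest Out punch after this In punch
FROM employee_punch_record AS i
LEFT JOIN employee_punch_record AS o
       ON o.employeeid = i.employeeid
      AND o.punch_type = 'Out'
      AND o.punch_timestamp > i.punch_timestamp
WHERE i.punch_type = 'In'
GROUP BY i.employeeid, i.punch_timestamp
ORDER BY i.employeeid, i.punch_timestamp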
View 0 Replies
View Related
Feb 19, 2008
Hi all,
I have a very simple time series model whose processing works fine without any problem. However, when I run the following query:
SELECT
[TimeSeries].[PriceChange],
[TimeSeries].[Symbol],
PredictTimeSeries(PriceChange, -3, 2)
From
[TimeSeries]
WHERE
[TimeSeries].[Symbol] = 'x'
I get the following error:
TITLE: Microsoft SQL Server 2005 Analysis Services
------------------------------
Error (Data mining): A time series prediction was requested with a start time further in the past than the internal models of the mining model, TimeSeries, specified in the HISTORIC_MODEL_GAP and HISTORIC_MODEL_COUNT parameters can process.
The following is the excerpt of the mining model script related to the two parameters:
<AlgorithmParameters>
<AlgorithmParameter>
<Name>MISSING_VALUE_SUBSTITUTION</Name>
<Value xsi:type="xsd:string">Previous</Value>
</AlgorithmParameter>
<AlgorithmParameter>
<Name>HISTORIC_MODEL_GAP</Name>
<Value xsi:type="xsd:int">1</Value>
</AlgorithmParameter>
<AlgorithmParameter>
<Name>HISTORIC_MODEL_COUNT</Name>
<Value xsi:type="xsd:int">10</Value>
</AlgorithmParameter>
</AlgorithmParameters>
These HISTORIC_MODEL_GAP (1) and HISTORIC_MODEL_COUNT (10) should accommodate PredictTimeSeries(PriceChange, -3, 2). Could anyone shed some light on this?
View 3 Replies
View Related
Apr 30, 2015
We have problems with our SQL Server Reporting Services 2012 (SSRS) server. We have set up Kerberos delegation between SSRS and the database server (SQL Server AlwaysOn cluster) so users are authenticated down to the database. The issue occurs from time to time that SSRS loses the ability to delegate the user credentials to the database. At that point the Report Server logs contain rejected database connections because of ANONYMOUS LOGON. After restarting SSRS the problem is gone.
View 2 Replies
View Related
May 26, 2005
Hi,
I have a table which has a few fields, one being "datetime_traded". I need to write a query which returns the row which has the closest time (down to second) given a date/time. I'm using MS SQL.
Here's what I have so far:
Code:
select * from TICK_D
where datetime_traded = (select min( abs(datediff(second,datetime_traded , Convert(datetime,'2005-05-30:09:31:09')) ) ) from TICK_D)
But I get an error - "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value.".
Does anyone know how I could do this? Thanks a lot for any help!
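For what it's worth, a hedged rewrite: the conversion error comes from the colon between the date and the time in the literal (a space is expected there), and the closest row can be picked directly with TOP 1:
SELECT TOP 1 *
FROM TICK_D
ORDER BY ABS(DATEDIFF(SECOND, datetime_traded, CONVERT(datetime, '2005-05-30 09:31:09')))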
View 2 Replies
View Related
Aug 17, 2007
Ok, so I have some horribly convoluted SQL that I would love to optimize. I'm not happy leaving it in its current state, that's for sure!
I'm currently working on our test bed servers, so obviously my stats are out because of the "crap-ness" (yes, that's the technical term) of the hardware, but still, it should NEVER need to take this long!!
Basically, the issue arises in the nasty join to the career table (one employee can have multiple career lines). Just to make things complicated, employees can have any number of career records on any given date, and these can even be input for future career events. The following SQL picks out the latest current career date for each employee, based on the career_date being <= GetDate() and the date of entry for this date being the greatest.
E.g.
career_date | datetime_created
2009-01-01 | 2006-05-05 13:55:21.000
2007-01-01 | 2006-05-05 13:54:18.000
2007-01-01 | 2006-05-05 13:52:55.000
From the above we want to return
2007-01-01 | 2006-05-05 13:54:18.000
SET STATISTICS IO ON
SET STATISTICS TIME ON
SELECT a.sAMAccountName AS 'sAMAccountName'
     , a.userPrincipalName AS 'userPrincipalName'
     , 'TRUE' AS 'Modify'
     , RTRIM(e.unique_identifier) AS 'employeeID'
     , RTRIM(e.employee_number) AS 'employeeNumber'
     , RTRIM(e.known_as)
       + CASE WHEN RTRIM(e.surname) IS NOT NULL THEN ' ' + RTRIM(e.surname) ELSE NULL END AS 'displayName'
     , RTRIM(e.known_as) AS 'givenName'
     , RTRIM(e.surname) AS 'sn'
     , RTRIM(c.job_title) AS 'title'
     , RTRIM(c.division) AS 'company'
     , RTRIM(c.department) AS 'department'
     , RTRIM(l.description) AS 'physicalDeliveryOfficeName'
     , RTRIM(REPLACE(am.dn, '\', '')) AS 'manager'
     , t.full_mobile
       + CASE WHEN RTRIM(t.mobile_number) IS NOT NULL THEN ' (DD: ' + RTRIM(t.mobile_number) + ')' ELSE NULL END AS 'mobile'
     , t.mobile_number AS 'otherMobile'
     , ad.address_ad_country AS 'c'
     , ad.address_ad_address1
       + CASE WHEN ad.address_ad_address2 IS NOT NULL THEN ', ' + ad.address_ad_address2 ELSE NULL END
       + CASE WHEN ad.address_ad_address3 IS NOT NULL THEN ', ' + ad.address_ad_address3 ELSE NULL END
       + CASE WHEN ad.address_ad_address4 IS NOT NULL THEN ', ' + ad.address_ad_address4 ELSE NULL END
       + CASE WHEN ad.address_ad_address5 IS NOT NULL THEN ', ' + ad.address_ad_address5 ELSE NULL END AS 'streetAddress'
     , ad.address_ad_pobox AS 'postOfficeBox'
     , ad.address_ad_city AS 'l'
     , ad.address_ad_County AS 'st'
     , ad.address_ad_postcode AS 'postalCode'
     , RTRIM(ad.address_ad_telephone)
       + CASE WHEN RTRIM(a.othertelephone) IS NOT NULL
                   AND RTRIM(ad.address_ad_telephone) IS NOT NULL THEN ' (Ext: ' + RTRIM(a.othertelephone) + ')'
              ELSE CASE WHEN RTRIM(a.othertelephone) IS NOT NULL
                             AND RTRIM(ad.address_ad_telephone) IS NULL THEN 'Ext: ' + RTRIM(a.othertelephone)
                        ELSE NULL
                   END
         END AS 'telephoneNumber'
FROM employee e
LEFT JOIN career c
       ON c.parent_identifier = e.unique_identifier
      AND c.career_date = (
              SELECT max(c2.career_date)
              FROM pwa_master.career c2
              WHERE c2.parent_identifier = c.parent_identifier
                AND c2.career_date <= GetDate()
          )
      AND c.datetime_created = (
              SELECT max(c3.datetime_created)
              FROM pwa_master.career c3
              WHERE c3.parent_identifier = c.parent_identifier
                AND c3.career_date = c.career_date
          )
LEFT OUTER JOIN AD_Import am
       ON am.employeeNumber = c.manager_number
INNER JOIN AD_Import a
       ON a.employeeID = e.unique_identifier
LEFT JOIN AD_Telephone t
       ON t.unique_identifier = e.unique_identifier
LEFT JOIN AD_Address ad
       ON ad.address_pwa_location = e.location
LEFT JOIN xlocat l
       ON l.code = c.location
WHERE (a.employeeNumber IS NOT NULL
   OR a.employeeID IS NOT NULL)
SQL Server Execution Times:
CPU time = 0 ms, elapsed time = 0 ms.
(1706 row(s) affected)
Table 'AD_Import'. Scan count 4, logical reads 106, physical reads 0, read-ahead reads 0.
Table 'AD_Address'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0.
Table 'AD_Telephone'. Scan count 2, logical reads 10, physical reads 0, read-ahead reads 0.
Table 'Worktable'. Scan count 868, logical reads 956, physical reads 0, read-ahead reads 0.
Table 'xlocat'. Scan count 2, logical reads 8, physical reads 0, read-ahead reads 0.
Table 'career'. Scan count 5088, logical reads 2564843, physical reads 0, read-ahead reads 0.
Table 'people'. Scan count 1697, logical reads 5253, physical reads 0, read-ahead reads 0.
Table 'Worktable'. Scan count 826, logical reads 914, physical reads 0, read-ahead reads 0.
SQL Server Execution Times:
CPU time = 15203 ms, elapsed time = 8114 ms.
Any advice on what I can do to optimize?
Oh, just to point out that "employee" is a view on the "people" table.
EDIT: I know it's pointing out the obvious, but I'm pulling out the manager's "DN" from AD_Import based on the manager_number and employeeNumber matching.
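For what it's worth, a hedged sketch of one rewrite that usually removes this kind of per-row correlated lookup, if the server is SQL Server 2005 or later: rank the career rows once with ROW_NUMBER() and join to the top-ranked row instead of using the two correlated MAX() subqueries.
SELECT lc.*
FROM (
    SELECT c.*,
           ROW_NUMBER() OVER (PARTITION BY c.parent_identifier
                              ORDER BY c.career_date DESC, c.datetime_created DESC) AS rn
    FROM pwa_master.career AS c
    WHERE c.career_date <= GETDATE()
) AS lc
WHERE lc.rn = 1
-- Join this derived table (or a CTE built from it) to employee in place of the correlated lookups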
View 14 Replies
View Related