Query Performance With DateTime Versus Int Condition
Jul 10, 2007
Good day,
The following query performs acceptably (2 seconds against 126,000,000 rows in the main table):
SELECT Count(*)
FROM
Message1_2_3 INNER JOIN
VDMVDO ON Message1_2_3.VDMVDO_ID = VDMVDO.VDMVDO_ID INNER JOIN
NMEA ON VDMVDO.NMEA_ID = NMEA.NMEA_ID
WHERE
NMEA.NMEA_ID BETWEEN 14000000 AND 14086000 AND
VDMVDO.RepeatIndicator = 0 AND
NMEA.SentenceFormatterID = 'VDM'
When we change the first condition from an Int column to a DateTime as in:
NMEA.TimeDate BETWEEN CONVERT(DATETIME, '2007-07-09 8:30:00', 102) AND CONVERT(DATETIME, '2007-07-09 9:30:00', 102)
the query performance falls to 14 seconds, even though both columns are indexed and a similar number of rows are found. When the select clause changes from a simple Count to a complex Max expression, response time falls to over a minute!
Any thoughts on optimizing the DateTime search would be greatly appreciated...
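For what it's worth, one thing often tried for this kind of filter is a composite index covering both predicate columns. A minimal sketch, using the column names from the query above (the index name is made up, and whether it actually helps depends on the execution plan):
CREATE NONCLUSTERED INDEX IX_NMEA_SentenceFormatter_TimeDate
ON NMEA (SentenceFormatterID, TimeDate);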
View 4 Replies
Nov 6, 2014
I want to be able to return the rows from a table that have been updated since a specific time. My query returns results in less than 1 minute if I hard-code the reference timestamp, but it keeps spinning if I load the reference timestamp from a table. See the examples below (the "Reference" table has only one row, with the value 2014-09-30 00:00:00.000).
select * from A where ReceiptTS > '2014-09-30 00:00:00.000'
select * from A where ReceiptTS > (select ReferenceTS from Reference)
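One workaround commonly suggested for this pattern (a sketch using the table and column names from the post) is to pull the reference value into a variable first, so the outer query compares against a known scalar:
DECLARE @ReferenceTS datetime;
SELECT @ReferenceTS = ReferenceTS FROM Reference;   -- single-row table, per the post
SELECT * FROM A WHERE ReceiptTS > @ReferenceTS;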
View 5 Replies
View Related
Nov 1, 2000
I have a complex(long) SQL statement inside of a stored procedure which feeds several variables from 2 tables. Something like
SELECT @var1 = table1.col1, @var2 = table2.col2, ...
FROM table1, table2
WHERE table1.id = table2.id
Is there any advantage to creating a view for this statement
and selecting from that, even though this resides in a stored procedure?
View 1 Replies
View Related
Jul 23, 2005
Hi,
My company has a scenario where we would like to change the data type of an existing primary key from an integer to a char, but we are concerned about the performance implications of doing so. The script for the two tables that we need to modify is listed below. Table FR_Sessions contains a column named TransmissionID which is currently an integer. This table contains about 1 million rows of data. Table FR_VTracking also contains TransmissionID as part of its primary key and it contains about 35 million rows of data. These two tables are frequently joined on TransmissionID (FR_Sessions is the parent). The TransmissionID column is used primarily for joins and is not typically displayed.
We would like to change the TransmissionID data type from int to char(7), and I had a few questions:
1) Would this introduce significant performance degradation? I have read that char keys/indexes are slower than int/numeric.
2) Are there collation options (or any other optimizations) that we could use to minimize the performance hit of the char(7)... if so, which ones?
I am a software architect by trade, not a database guru, so please go easy on me if I overlooked something obvious :)
Any suggestions or information would be greatly appreciated.
Thanks,
Tim
-------------------
CREATE TABLE [FR_Sessions] (
[TransmissionID] [int] IDENTITY (1, 1) NOT NULL ,
[PTUID] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[PortNum] [numeric](6, 0) NOT NULL CONSTRAINT [DF_FR_Sessions_PortNum] DEFAULT (0),
[CloseStatus] [varchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[RecvBytes] [int] NULL ,
[SendBytes] [int] NULL ,
[EndDT] [datetime] NULL CONSTRAINT [DF_FR_Sessions_EndDT] DEFAULT (getutcdate()),
[LocalEndDT] [datetime] NULL ,
[TotalTime] [int] NULL ,
[OffenderID] [numeric](9, 0) NULL ,
[UploadStatus] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_FR_Sessions_UploadStatus] DEFAULT ('N'),
[SchedBatchID] [numeric](18, 0) NULL ,
[SWVersion] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[DLST] [bit] NULL ,
[TZO] [smallint] NULL ,
[Processed] [bit] NOT NULL CONSTRAINT [DF_FR_Sessions_Processed] DEFAULT (0),
[CallerID] [varchar] (13) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[PeerIP] [varchar] (16) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[XtraInfo] [varchar] (1024) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[IdType] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
CONSTRAINT [PK_FR_Sessions] PRIMARY KEY CLUSTERED ([TransmissionID]) WITH FILLFACTOR = 90 ON [PRIMARY]
) ON [PRIMARY]
CREATE TABLE [FR_VTracking] (
[TransmissionID] [int] NOT NULL ,
[FrameNum] [int] NOT NULL ,
[LatDegrees] [float] NOT NULL ,
[LonDegrees] [float] NOT NULL ,
[Altitude] [float] NOT NULL ,
[Velocity] [float] NOT NULL ,
[NumPositions] [smallint] NOT NULL ,
[NavMode] [smallint] NOT NULL ,
[Units] [smallint] NOT NULL ,
[GPSTrackingID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[dtStamp] [datetime] NULL ,
CONSTRAINT [PK_FR_VTracking] PRIMARY KEY CLUSTERED ([TransmissionID],[FrameNum]) WITH FILLFACTOR = 90 ON [PRIMARY]
) ON [PRIMARY]
View 5 Replies
View Related
May 15, 2008
I was hoping I wouldn't be another poster with performance issues after migrating to SQl 2005 from SQL 2000 but here I am.
I am in the process of testing out our databases on Sql Server 2005 for migration from SQL Server 2000 and there are certain portions of code that have been affected negatively. I have read thru many of the posts here and have tried out most of the recommendations. I will start out with things I've done and then provide the actual SQL.
1) I have rebuilt all indexes (using DBCC DBREINDEX with the table option).
2) Updated the db engine to latest hot fix (build 3239) that addresses speed related fixes.
3) I also ran sp_createstats using the 'fullscan' option to create stats on all columns of all tables (minus indexed columns)
4) Since nothing seemed to work, I even ran UPDATE STATISTICS WITH FULLSCAN on all tables, even though I should not have needed to, as the rebuild would have created the stats. But I was willing to try anything (steps 1, 3 and 4 are sketched below).
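For reference, steps 1, 3 and 4 above correspond roughly to the following commands (the table name is a placeholder; this is a sketch, not the exact script that was run):
DBCC DBREINDEX ('dbo.LabSample');                 -- rebuild all indexes on one table
EXEC sp_createstats @fullscan = 'fullscan';       -- create column statistics with a full scan
UPDATE STATISTICS dbo.LabSample WITH FULLSCAN;    -- full-scan statistics update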
I have confirmed that the execution plans are different even though the data on both sql 2000 and sql 2005 are identical (i put a copy on 2005). The plans themselves are huge as the queries are huge. Here is the query.
SELECT InterimView.* ,TestView.*
FROM View_LabDataExport_TestFormData_55 TestView
RIGHT OUTER JOIN ( SELECT ReqView.*, CDView.*
FROM View_LabDataExport_FormData_55 ReqView
LEFT OUTER JOIN View_LabDataExport_FormData_CD_55 CDView
ON ( CDView.DB_SubjectID_CD = ReqView.DB_SUbjectID )
) InterimView
ON ( InterimView.DB_FormID = TestView.DB_FormID_T AND
InterimView.DB_LabSampleID = TestView.DB_LabSampleID_T )
The above query takes about 8 seconds to run on 2000 and about 1 minute to run on 2005. This is for a small dataset, and on larger datasets the difference will only be more pronounced (as confirmed by other teams in my company that have already migrated). Another point worth mentioning: if I remove TestView.* from the select list, it runs in 5 to 6 seconds. Is there an issue with SQL 2005 and a large number of columns, or anything of that sort? On 2000 the time remains about 8 seconds if I remove this from the select list.
Here is the STATISTICS IO output on 2005:
(21234 row(s) affected)
Table 'Worktable'. Scan count 75490, logical reads 3676867, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabTestToReportPanel'. Scan count 476, logical reads 1524, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabReportPanel'. Scan count 0, logical reads 260, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'DiscreteValue'. Scan count 1, logical reads 176106, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabReleasedSampleTest'. Scan count 1, logical reads 2078, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabSample'. Scan count 1360, logical reads 18567, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Form'. Scan count 2302, logical reads 8225, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabTest'. Scan count 1, logical reads 23, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabSampleDef'. Scan count 1, logical reads 10530, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabArea'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Lab'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Location'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Study'. Scan count 0, logical reads 6, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Item'. Scan count 1335, logical reads 32940, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'ObjectState'. Scan count 1, logical reads 10972, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Object'. Scan count 0, logical reads 20674, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Subject'. Scan count 0, logical reads 3293, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'FormDef'. Scan count 2, logical reads 70, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'PrintedLabSampleLabel'. Scan count 0, logical reads 13144, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'PrintedForm'. Scan count 0, logical reads 4219, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'StudySite'. Scan count 0, logical reads 2756, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'StudyEvent'. Scan count 18, logical reads 40, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'StudyEventDef'. Scan count 0, logical reads 36, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'FormDefToStudyEventDef'. Scan count 1, logical reads 43, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'LabSampleDefToFormDef'. Scan count 1, logical reads 255, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Here is the STATISTICS IO output on 2000:
Table 'LabTestToReportPanel'. Scan count 2123, logical reads 4820, physical reads 44, read-ahead reads 0.
Table 'LabReportPanel'. Scan count 130, logical reads 260, physical reads 0, read-ahead reads 0.
Table 'DiscreteValue'. Scan count 103914, logical reads 208214, physical reads 0, read-ahead reads 0.
Table 'Location'. Scan count 19031, logical reads 38062, physical reads 2, read-ahead reads 0.
Table 'Lab'. Scan count 19031, logical reads 38062, physical reads 0, read-ahead reads 0.
Table 'LabArea'. Scan count 19031, logical reads 38062, physical reads 0, read-ahead reads 0.
Table 'LabSampleDef'. Scan count 24670, logical reads 49340, physical reads 0, read-ahead reads 0.
Table 'LabTest'. Scan count 19406, logical reads 39575, physical reads 0, read-ahead reads 0.
Table 'LabReleasedSampleTest'. Scan count 4289, logical reads 73865, physical reads 1014, read-ahead reads 24.
Table 'Study'. Scan count 4291, logical reads 8582, physical reads 0, read-ahead reads 0.
Table 'LabSample'. Scan count 5647, logical reads 31382, physical reads 308, read-ahead reads 4.
Table 'Form'. Scan count 4291, logical reads 9272, physical reads 2, read-ahead reads 10.
Table 'PrintedLabSampleLabel'. Scan count 4289, logical reads 17097, physical reads 114, read-ahead reads 308.
Table 'ObjectState'. Scan count 6860, logical reads 13760, physical reads 1, read-ahead reads 0.
Table 'Object'. Scan count 6860, logical reads 23559, physical reads 90, read-ahead reads 701.
Table 'PrintedForm'. Scan count 1375, logical reads 4505, physical reads 40, read-ahead reads 16.
Table 'StudySite'. Scan count 1378, logical reads 2756, physical reads 4, read-ahead reads 0.
Table 'Subject'. Scan count 1599, logical reads 3332, physical reads 2, read-ahead reads 0.
Table 'StudyEvent'. Scan count 18, logical reads 52, physical reads 0, read-ahead reads 0.
Table 'StudyEventDef'. Scan count 18, logical reads 54, physical reads 0, read-ahead reads 2.
Table 'FormDefToStudyEventDef'. Scan count 1, logical reads 69, physical reads 0, read-ahead reads 23.
Table 'FormDef'. Scan count 2, logical reads 78, physical reads 1, read-ahead reads 4.
Table 'LabSampleDefToFormDef'. Scan count 1, logical reads 308, physical reads 1, read-ahead reads 306.
Table 'Item'. Scan count 1335, logical reads 36510, physical reads 140, read-ahead reads 1047.
(21234 row(s) affected)
(147 row(s) affected)
One difference between the two is the work table that 2005 creates versus 2000. I can attach the plans but they are huge. I will attach it if you ask.
What I was looking for was suggestions on what I could do short of rewriting code or any suggestions in general.
FYI, this has also been posted on the SQL Server Engine forum.
Thanks
View 10 Replies
View Related
Sep 22, 2015
I have a cube with 2 many-to-many dimensions where one particular MDX query takes about 5 seconds. When I resolve the many-to-many relationships by multiplying out the data in the fact table, the same query takes 21 seconds.
In general do many-to-many dimensions slow down query performance of a cube?
Without the many-to-many dimensions of course the fact table has much more rows. Could this be the reason for the performance loss?
How can query performance of a cube be tuned in general?
View 3 Replies
View Related
Feb 21, 2007
Can anyone comment on the engine performance difference between SQL2005 Enterprise Edition and Standard? I'm talking about generalized performance of the engine, not admin features (parallel index operations) or scaled storage (partitioning).
( http://www.microsoft.com/sql/editions/enterprise/comparison.mspx )
The marketing literature makes note of two things:
Enterprise can use more than 4 processors
Enhanced read-ahead and scan (super scan)
(note: I cannot find anything about this 'feature')
One un-noted Feature:
only Enterprise supports 'lock pages in memory'
We are in the process of migrating from SQL2000 to SQL2005 in an OLTP environment. Based on the marketing literature; I would have chosen SQL2005-standard. But based on our limited testing, we are seeing some strange differences.
Query Performance
With MaxDOP=1 and using a large batch query (select top 1500000), SQL2005-Enterprise is twice as fast as SQL2005-Standard.
(Note: this difference persists regardless of lock-pages-in-memory setting)
CPU Utilization
In addition, Task Manager shows that SQL2005-Enterprise uses a single processor at ~90%, while SQL2005-Standard shows a single processor at ~20%.
Lock Behavior
We are also seeing lock-behavior differences. A single DML statement that attempts to modify ~5000 rows will cause Table-locks on SQL2005-Standard but obtain normal row-locks on SQL2005-Enterprise.
These empirical differences make me wonder if the engine codebase is fundamentally different between the two?
Any insight would be appreciated.
View 4 Replies
View Related
Jul 20, 2005
I use a datetime condition in a SQL query. For example:
select something from sometable
where date between '06/15/04 00:00:00' and '06/15/04 23:59:59'
and the result on two synchronized servers is different. On the USA server the result contains only rows where the date equals '06/14/04', but the Russian server gives rows where the date equals '06/15/04'. I read the SQL Server Books Online article 'Writing International Transact-SQL Statements' and did not find any recommendations on how to solve this problem. Thanks for any help.
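One commonly recommended workaround (a sketch using the post's table and column names) is the language-neutral unseparated yyyymmdd format, which SQL Server interprets the same way regardless of the session's language or DATEFORMAT settings, combined with an open-ended upper bound:
select something
from sometable
where date >= '20040615'
  and date < '20040616'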
View 2 Replies
View Related
Jun 12, 2007
Hi,
I'm trying to check whether a row was created yesterday.
This does not seem to work:
(MyId == "10") && DATEPART("dd",GETDATE()) == (DATEPART("dd",MyDateTimeColumn) - 1)
Does anybody know how I can accomplish this?
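One alternative sometimes suggested for this kind of check (a sketch only; it assumes the line above is an SSIS expression and that "yesterday" means one calendar day back) is to compare whole-day differences rather than day-of-month values, which also works across month boundaries:
(MyId == "10") && (DATEDIFF("dd", MyDateTimeColumn, GETDATE()) == 1)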
Many thanks.
View 5 Replies
View Related
Jul 6, 2004
Hi,
I've a problem with my alerts on a SQL 2000 SP3.
I cannot add SQL Server performance condition alerts. The table master..sysperfinfo is empty.
And the existing ones do not work properly.
I've already tried unlodctr & lodctr and rebooted the server more than once, but it didn't work.
Can someone help me with this problem ? thanks !!!
View 6 Replies
View Related
Jul 23, 2005
I am trying to set up a Performance Condition Alert, and on this one SQL 2000 server the 'Type:' drop-down does not give me the option to choose "SQL Server performance condition alert". "SQL Server event alert" is the only item listed in the drop-down. Is there some setting on the server that I am missing that needs to be activated? I am connected with the SA account; Windows 2000 SP4; SQL 2000 SP2; SQL LiteSpeed (this is the only server with LiteSpeed). Thank you.
View 1 Replies
View Related
Feb 4, 2004
I'm pretty new to MSSQL. We have SQL 2K and I was wondering if anyone has seen this issue before.
When I create/modify an Alert there is no option for "SQL Server Performance Condition Alert" in the Type under General tab.
Has anyone seen this before? What can I do to get this type of Alerts?
Just a note -- on my other SQL servers they have the options.
thanks in advance !
View 5 Replies
View Related
Mar 3, 2004
I have a SQL 2000 SP3 installation on an Active-Active cluster. I wanted to create a performance alert, but when I tried, the SQL Server Performance Condition Alert choice is missing from the drop-down menu on the new alert setup screen. Does anyone know why? Any help will be greatly appreciated.
View 3 Replies
View Related
Jul 20, 2005
Hello,
I have a test database with a table A containing 10,000 rows and a table B containing 100,000 rows. Rows in B are "children" of rows in A - each row in A has 10 related rows in B (i.e. B has a foreign key to A). Using ODBC I am executing the following loop 10,000 times, expressed below in pseudo-code:
"select * from A order by a_pk option (fast 1)"
"fetch from A result set"
"select * from B where fk_to_a = 'xxx' order by b_pk option (fast 1)"
"fetch from B result set" repeated 10 times
In the above pseudo-code 'xxx' is the primary key of the current A row. NOTE: it is not a mistake that we are repeatedly doing the A query and retrieving only the first row.
When the queries use fast-forward-only cursors this takes about 2.5 minutes. When the queries use dynamic cursors this takes about 1 hour. Does anyone know why the dynamic cursor is killing performance? Because of the SQL Server ODBC driver it is not possible to have nested/multiple fast-forward-only cursors, hence I need to explore other alternatives.
I can only assume that a different query plan is getting constructed for the dynamic cursor case versus the fast-forward-only cursor, but I have no way of finding out what that query plan is.
All help appreciated.
Kevin
View 1 Replies
View Related
Jul 8, 2002
Hello,
Specifically, I'm having an issue that I can't resolve using the database space utilization procedures recently submitted by Paul Matthews. The servers are appropriately linked and the procedures have been created on the required servers (mix of 7.0 and 2000 systems).
Executing the sp_dbspaceall procedure from query analyzer is successful but it fails when called from a SQL Agent job. It only returns the information of the local server in that instance.
The error message tells nothing of the problem and the logs show nothing at all about the incident. Can someone help me out?
View 3 Replies
View Related
Aug 15, 2007
We have an interesting problem. We are attempting to migrate from SQL 2000 to SQL 2005. The schema we have is exactly the same, and the new 2005 box is more powerful than our 2000 box.
here is our schema:
tbl_Items
ItemID int pk
ReferenceID int
sessionid varchar(255)
StatusID int
tbl_ItemsStatus
statusid int pk
isinternalstatus bit
There is an index on (ReferenceID, SessionID, StatusID) and another on (SessionID, StatusID).
This is the query:
DECLARE @referenceid INTEGER
SET @referenceid = 1019
SELECT MAX(i2.itemid)
FROM tbl_Items i2 (NOLOCK)
JOIN tbl_ItemsStatus s (NOLOCK)
ON i2.StatusID = s.StatusID
WHERE
s.IsInternalStatus = 0
AND i2.referenceid = @referenceid
AND i2.sessionid IN (
SELECT i3.sessionid
FROM tbl_Items i3 (NOLOCK)
WHERE
i3.referenceid = @referenceid
AND i3.status <> 7
AND i3.status <> 8
AND i3.status <> 10
AND i3.itemid IN (
SELECT max(i4.itemid)
FROM tbl_Items i4 (NOLOCK)
WHERE i4.referenceid = @referenceid
GROUP BY i4.sessionid
)
AND i3.itemid NOT IN (
SELECT MAX(i7.itemid )
FROM tbl_Items i7 (NOLOCK)
WHERE
i7.referenceid = @referenceid
AND i7.SessionID IN (
SELECT i5.SessionID
FROM tbl_Items i5 (NOLOCK)
WHERE
i5.status <> 11
AND i5.referenceid = @referenceid
AND i5.itemid IN (
SELECT MAX(i6.itemid)
FROM tbl_Items i6 (NOLOCK)
WHERE
i6.referenceid = @referenceid
AND i6.status IN (7,11,8)
GROUP BY i6.sessionid
)
)
GROUP BY i7.SessionID
)
)
GROUP BY i2.sessionid
We know this query is pretty bad and can be optimized. However, if we run it as is on 2005 it takes about 2 hours; if we run the exact same query on 2000 it takes 9 seconds.
If we omit either the s.IsInternalStatus = 0 condition or the i2.referenceid = @referenceid condition, it runs in about 9 seconds on 2005 as well.
Why would this be? It makes no sense that omitting one of those WHERE conditions would change the runtime from 2 hours to seconds. We know it's a bad query, but this doesn't make sense.
Has anyone else run into this problem?
View 1 Replies
View Related
Mar 17, 2007
Hey :) I'm facing a lot of trouble trying to create a new pause/break system. Right now I'm building up the query that counts how many records fall between 2 fields. Let me first show you my table:
ID (int) | stamp_start (Type: DateTime) | stamp_end (Type: DateTime) | Username (varchar)
0 | 17-03-07 12:00:00 | 17-03-07 12:30:00 | Hovgaard
The client will enter a start time and an end time, and this query should then count how many records fall inside this period of time.
Example: The client enters start time 12:05 and end time 12:35. The query should then return 1 record found. The same thing if the user enters 12:20 and 12:50. My current query looks like this:
SELECT COUNT(ID) AS Expr1 FROM table WHERE (start_stamp <= @pausetime_start) AND (end_stamp >= @pausetime_end)
But this will only count if I enter the exact same times as the ones inside the table. Any ideas how I can figure this out?
Thanks for your time so far :)
/Jonas Hovgaard - Denmark
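For reference, the usual test for "these two time ranges overlap" is that each range starts before the other ends - a sketch built on the column and parameter names from the query above:
SELECT COUNT(ID) AS Expr1
FROM [table]
WHERE (start_stamp <= @pausetime_end)
  AND (end_stamp >= @pausetime_start)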
View 2 Replies
View Related
Apr 12, 2007
Dear Friends,
I have a doubt relating to converting a field.
I'm converting a string to datetime inside a SQL command in an OLE DB Source, using this:
convert(datetime,[Maturity],103) As MaturityDate
Is it better to do this inside the OLE DB Source, or is it better to convert it with a Data Conversion transformation?
Regards!!
Thanks
View 11 Replies
View Related
Jan 25, 2015
We are trying to create some alerts in our SQL Server 2014 BI edition. The issue is that after I choose "Type" as "SQL Server performance condition alert", nothing is listed in the "Object" list box. SQL Server event alerts are working; the issue is only with "SQL Server performance condition alert".
View 3 Replies
View Related
Jun 23, 2006
Hello Everyone,
I have a very complex performance issue with our production database. Here's the scenario. We have a production web server and a development web server. Both are running SQL Server 2000.
I encountered various performance issues on the production server with a particular query. It would take approximately 22 seconds to return 100 rows, that's about 0.22 seconds per row. Note: I ran the query in single-user mode. So I tested the query on the development server by taking a backup (.dmp) of the database and moving it onto the dev server. I ran the same query and found that it ran in less than a second.
I took a look at the query execution plan and I found that they were the exact same in both cases. Then I took a look at the various indexes, and again I found no differences in the table indices.
If both databases are identical, I'm assuming that the issue is related to some external hardware issue like disk space, memory etc. Or could it be OS/software related issues, like service packs, SQL Server configurations etc.
Here's what I've done to rule out some obvious hardware issues on the prod server:
1. Moved all extraneous files to a secondary hard drive to free up space on the primary hard drive. There is 55 GB of free space on the disk.
2. Applied SQL Server SP4 service packs
3. Defragmented the primary hard drive
4. Applied all Windows Server 2003 updates
Here are the prod server's system specs:
2x Intel Xeon 2.67GHz
Total Physical Memory 2GB, Available Physical Memory 815MB
Windows Server 2003 SE /w SP1
Here are the dev server's system specs:
2x Intel Xeon 2.80GHz
2GB DDR2-SDRAM
Windows Server 2003 SE /w SP1
I'm not sure what else to do; the query performance is an order of magnitude difference and I can't explain it. To me it is a hardware or operating system related issue. Any ideas would help me greatly!
Thanks,
Brian T
*** Sent via Developersdex http://www.developersdex.com ***
View 2 Replies
View Related
Nov 1, 2007
Hi,
I am creating a stored procedure which takes a few parameters, one of which is a string of comma-separated codes. At the end of my select query the codes will be used in the where clause.
What I'm doing is parsing the comma separated string and populating a temp table called #codes. So my query will look something like this:
select * from tableA
where tableA.col1 = 'something'
and tableA.code in (select * from #codes)
and....
However, the code parameter can be null, and if this is the case I want the query to run for all codes - i.e. I effectively want to remove the "and tableA.code in (select * from #codes)" part of the where clause.
Is there any clever way of doing this other than having an if... else... and writing 2 queries - one with that condition and one without it?
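For illustration, one common pattern for an optional filter keeps a single query and short-circuits on the raw parameter - a sketch (it assumes @codes is the original comma-separated parameter and that #codes has a column named code):
select * from tableA
where tableA.col1 = 'something'
and (@codes is null or tableA.code in (select code from #codes))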
Hope this is clear.
Thanks,
Wallace
View 6 Replies
View Related
Jan 4, 2008
Hello all,
I've got a problem with the following query:
select
subscriber_id, sum(convert(int,n_inboundcalls)) as totalinbound
from calldiversityratedaily
where totalInbound < 10
group by subscriber_id
The error message provided is:
Server: Msg 207, Level 16, State 3, Line 1
Invalid column name 'totalInbound'.
I'm using MSSQL server 2000, version 8.00.194 and it appears that I can't use the name I give to the sum in the where clause.
Could anyone help me please?
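For reference, a column alias defined in the SELECT list is not visible to the WHERE clause, and since totalInbound is an aggregate the filter belongs in HAVING - a sketch based on the query above:
select subscriber_id, sum(convert(int, n_inboundcalls)) as totalinbound
from calldiversityratedaily
group by subscriber_id
having sum(convert(int, n_inboundcalls)) < 10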
View 1 Replies
View Related
Aug 1, 2004
My table is as follows, and I have a string 'U2,U4'.
CustomerID | UserID | OtherUsers
C1 | U1 | #U3~#U5~
C2 | U2 |
C3 | U3 |
C4 | U4 | #U1~#U2~
C5 | U5 |
C6 | U1 | #U2~#U5~
C7 | U2 |
C8 | U3 | #U1~#U4~
Now I need to issue a SQL statement which extracts the CustomerID where UserID is in (U2, U4) or where U2 or U4 appears in OtherUsers.
i.e. I should get the CustomerIDs C2, C4, C6, C8.
I am having trouble coming up with the SQL. Does anyone have an idea how to achieve this?
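For illustration, one possible shape for the query, given the '#'/'~' delimiters shown above (the table name is a placeholder, and the user list would normally be built dynamically rather than hard-coded):
SELECT CustomerID
FROM CustomerUsers
WHERE UserID IN ('U2', 'U4')
   OR OtherUsers LIKE '%#U2~%'
   OR OtherUsers LIKE '%#U4~%'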
View 4 Replies
View Related
Jun 12, 2008
create function stzipcustinfo
( @campaign varchar(50), @startdate datetime, @enddate datetime,@audience_type char(10) )
RETURNS TABLE
AS
RETURN
(Select distinct t.campaign,t.Sopnumbe,t.Custnmbr,t.Custname,
t.Address1,t.Address2,t.City,t.State,t.Country,t.Zipcode, t.Phone,sum(t.totalquantity) as Totalquantity,
t.Audience_type
from
(
select distinct
v.sopnumbe, v.campaign, v.custnmbr, v.custname, v.address1, v.address2,v.city,v.state,v.country,v.zipcode,
v.Phone, v.totalquantity as totalquantity,
case @audience_type
when v.custclas then v.custclas
end as audience_type
from vwstzipcustaudienceinfo v
Where (v.itemnmbr = '' or v.itemnmbr=v.itemnmbr) and v.Campaign = @Campaign and (v.docdate between @startdate and @enddate)
) as t
where t.audience_type is not null
group by t.campaign,t.Sopnumbe,t.Custnmbr,t.Custname,
t.Address1,t.Address2,t.City,t.State,t.Country,t.Zipcode, t.Phone, t.Audience_type
)
--select * from mystatezipcustaudienceinfo('cpd','1/1/2008','5/15/2008','INDPATIENT') -- getting 500 rows
--select * from mystatezipcustaudienceinfo('Cpd','1/1/2008','5/15/2008','INST-HOSPITAL') -- not getting a single row
The problem is the audience types that have a '-' in them (like 'INST-HOSPITAL'); I'm not getting any result for those, so what should that condition (the one highlighted in the original post) be?
thanks
View 7 Replies
View Related
Sep 10, 2014
I do have a table type as input
num | columnname | value
1   | Name       | 3X5900
2   | employeeID | null
3   | JOD        | null
4   | Address    | null
From this I have to form a where clause like "Name LIKE '3X5900'". This will be dynamic, as the search criteria may vary on the next call.
How do I implement this dynamic WHERE clause in the code below? It has to be added where I have marked '---- Where clause has to be added'. Sometimes the table input will have values for all of the search columns. How can I implement this in my query with good performance?
DECLARE @Where varchar(4000);
Select @where = 'v.Name LIKE ''3X5900'''
SELECT * from
(SELECT Row_number()Over(Order By V.ProjectId) Rownum,
v.Firstname,
[Code] ....
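A rough sketch of one way to build such a WHERE clause dynamically (it assumes @SearchInput is the table-type parameter described above, and dbo.Projects with alias v is a placeholder for the real table or view being searched):
DECLARE @Where nvarchar(4000) = N'';

SELECT @Where = @Where
              + CASE WHEN @Where = N'' THEN N'' ELSE N' AND ' END
              + N'v.' + QUOTENAME(columnname)
              + N' LIKE N''' + REPLACE(value, N'''', N'''''') + N''''
FROM @SearchInput
WHERE value IS NOT NULL;              -- only the criteria that were actually supplied

DECLARE @Sql nvarchar(max) =
      N'SELECT * FROM dbo.Projects v'
    + CASE WHEN @Where = N'' THEN N'' ELSE N' WHERE ' + @Where END;

EXEC sp_executesql @Sql;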
View 1 Replies
View Related
Jul 7, 2007
How can I update/insert records from a column in table1 to a column in table2, with a condition?
I want to insert/update records like this.
update employee
set joine_date = select joine_date from master_table
where emp_id.employee=emp_id.master_table
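For reference, in T-SQL this would typically be written as an UPDATE ... FROM with a join - a sketch using the table and column names from the pseudo-code above:
UPDATE e
SET e.joine_date = m.joine_date
FROM employee e
INNER JOIN master_table m
    ON e.emp_id = m.emp_id;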
regards
Martin
View 2 Replies
View Related
Sep 12, 2007
Hi,
hope someone can help or suggest something to help me with the issue.
I have a list of projects. This list contains all master (parent) and all subprojects (child). When I click on a project I want to be able to retrieve information about that project and its subprojects. Here is my dilemma: I want my SQL query to check whether the project is a master or a sub. If it is a master, then get all data for this project and its subprojects; but if it's a sub, then get data only for that sub. Below is sample data that I hope clears things up.
parent_ID child_id Type projname
-----------------------------------------------
100 100 P parent_X_proj
100 25 C child_X_proj
100 29 C child_X_proj2
200 200 P parent_Z
300 300 P parent_Y
etc................
This is how my table is constructed. My application passes child_id, and what I want is: if someone clicks on parent_X_proj I want to retrieve the three projects (100, 25, 29), but if someone clicks on a child project (29) then I want only that project.
So I want my query to look at the type; if the type is P (parent) then get all projects where parent_id = 100 (for example), but if the type is C then get child_id = 29.
I know it could be done in a stored procedure, but my application cannot execute SPs, only SQL statements.
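For illustration, one plain-SQL shape this could take is a self-join against the row that was clicked - a sketch (the table name projects and the @child_id parameter are assumptions):
SELECT p.parent_ID, p.child_id, p.Type, p.projname
FROM projects p
JOIN projects sel
    ON sel.child_id = @child_id
WHERE (sel.Type = 'P' AND p.parent_ID = sel.child_id)
   OR (sel.Type = 'C' AND p.child_id = sel.child_id)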
Thank you for any help
View 3 Replies
View Related
Sep 28, 2007
Hi all,
I have a query, as below, that creates a count in the field total. I then want to be able to say: only show me where there are more than 2 incidents. But I can't get it to recognise my created field. Any advice?
Thanks.
select a.vchcompanyname, count(*) as total
from company a inner join incident b
on b.iownerid = a.icompanyid
where b.iincidentcategory = 1
and a.icompanytypecode = 102165
--and total > 2 (Wont recognise Total)
group by b.iownerid, a.icompanyid, a.vchcompanyname
order by 2 desc
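For reference, the alias total cannot be referenced in WHERE; filtering on the aggregate is normally done in HAVING - a sketch based on the query above:
select a.vchcompanyname, count(*) as total
from company a inner join incident b
on b.iownerid = a.icompanyid
where b.iincidentcategory = 1
and a.icompanytypecode = 102165
group by b.iownerid, a.icompanyid, a.vchcompanyname
having count(*) > 2
order by 2 desc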
View 3 Replies
View Related
Aug 12, 2007
First I have to apologize for posting the problem here, since it doesn't actually relate to SQL Server.
I am really new to database programming.
I use ADO (the ActiveX one, not ADO.NET) in an MFC application to access an MS Access database.
I can insert a date into a table using a statement like this:
insert into tableXXX (date_field) values ('1/2/2007')
but when I try to query it using this statement
select * from tableXXX where date_field = #1/2/2007#
nothing is returned.
I also tried #1-2-2007# and #01-02-2007#, but it's all the same.
Can someone tell me how to successfully use a date as a query condition?
Thanks in advance.
View 2 Replies
View Related
Dec 16, 2007
Hi all,
I have to write a select query which needs some logic to be implemented.
The query is something like:
select name,number,active,
if active ="true" then
select @days=days from tbl_shdsheet
else
@days=''
end
from tbl_emp
In the above query, there will be a days row in tbl_shdsheet for that employee if active is true; otherwise there won't be any data for that employee in tbl_shdsheet.
So how can I write a query for this?
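For illustration, this kind of conditional lookup is often expressed with a LEFT JOIN plus CASE rather than procedural IF logic - a sketch (the join column emp_id is an assumption):
select e.name, e.number, e.active,
       case when e.active = 'true' then s.days else '' end as days
from tbl_emp e
left join tbl_shdsheet s
       on s.emp_id = e.emp_id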
View 3 Replies
View Related
Feb 20, 2001
Hi, I was wondering if someone could help out. I need to create a stored procedure, but before I do so, I need to know if I can store a conditional expression in a string and just use that variable as my condition for the WHERE clause. The reason I ask is that I am trying to create a stored procedure that queries different things depending on what the inputs are.
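For reference, a string variable cannot be used directly as a predicate; it has to be spliced into dynamic SQL - a minimal sketch (the condition text and dbo.Orders are placeholders):
DECLARE @Condition nvarchar(500)
SET @Condition = N'Status = ''Open'''          -- built from the procedure inputs

DECLARE @Sql nvarchar(4000)
SET @Sql = N'SELECT * FROM dbo.Orders WHERE ' + @Condition

EXEC sp_executesql @Sql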
View 2 Replies
View Related
Oct 11, 2013
I have one table which contains 3 columns; they are:
1. emp
2. monthyear
3. amount
In that table I have 6 rows, like below:
emp monthyear amount
1 102013 1000
2 102013 1000
1 112013 1000
1 112013 1000
2 122013 1000
2 122013 0000
I want a total, grouped by each employee, covering the data from Nov to Dec 2013 only.
The output should be like this:
1 2000
2 1000
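For illustration, one way to get that result - a sketch (the table name is a placeholder, and it assumes monthyear is stored as an int in MMYYYY form as shown above):
select emp, sum(amount) as total
from EmpMonthAmount
where monthyear in (112013, 122013)
group by emp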
View 9 Replies
View Related
Jan 28, 2015
I wish to write a query with an IF condition, implemented in a SQL Server 2008 R2 database.
I would like to set up a FEFO (first expired, first out) system based on batch number and expiry dates, but I am struggling to put the query together. The principle is:
If the movement quantity (QTEMVT) >= the quantity entered by the user via the user interface, then withdraw the quantity entered by the user from the batch (NUMLOT) of the item that expires first (execution of my query).
Else, if the movement quantity (QTEMVT) < the quantity entered by the user, then take the difference between the quantity entered by the user and the movement quantity (QTEMVT) (execution of my query), with the following conditions:
the movement quantity (QTEMVT) equals the movement quantity (QTEMVT) of that item stored in my database and the movement quantity (QTEMVT) <> 0,
taking the remaining difference directly from the batch (NUMLOT) of the item that expires next.
Basically I want to set up an item management query based on batch number and expiration dates.
ps: QTEMVT = quantity of the item stored in my database
NUMLOT = batch number items
DATFABRIC = manufacturing date items
DATPEREMP = expiry date items
Here is my query:
SELECT f.NUMLOT,f.DATFABRIC,f.DATPEREMP,f.QTEMVT
FROM FAIRE f INNER JOIN mouvement m ON f.CODMVT = m.CODMVT
INNER JOIN TYPE_MOUVEMENT tm ON tm.CODTYPMVT = m.CODTYPMVT
INNER JOIN SOUS_SITE ss ON ss.CODSOUSIT = m.CODSOUSIT
INNER JOIN SITE s ON s.CODSITE = ss.CODSITE
[Code] ....
View 9 Replies
View Related