OpenWithServiceComponents 5 Times Slower For Oracle Than For SQL Server?
Apr 28, 2004
Hi,
I am experiencing some problems accessing an Oracle database through
OLE DB from an MTS application using OpenWithServiceComponents (which
is supposed to give me connection pooling).
When I connect to a SQL Server database it only takes 3.1 ms to open a
connection, while with Oracle it takes 15.5 ms (both DB's running on
the same machine, I made 1000 calls and took the average).
Am I doing something wrong, am I missing something here?
I have tried to use the plain Open method, but in this case it takes
5.4 ms with SQL Server and 31.4 ms with Oracle.
Is Oracle really that much slower when accessing it through OLE DB?
In order to return a table for a specific input parameter, I am using a function, but the performance is just awful! When I tried the same code as a stored procedure, the whole thing ran in under 1 second (around 0.5 sec), while the function takes about 10 times as long (4-6 sec). I know that in SQL 2000 a function is slower than a stored procedure, but surely it cannot be as bad as 10 times slower.
Now, in order to use that table from the stored procedure, I have to create a temp table and insert the result into it before I can use any "select" statement against it. Any explanation here? Or is there a way to "select" from a stored procedure directly?
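For the last question, the usual SQL 2000 workaround is exactly the temp-table pattern described above (INSERT ... EXEC); and if the logic can be written as an inline table-valued function instead of a multi-statement one, it can be queried directly and usually runs much closer to stored-procedure speed. A minimal sketch with hypothetical object names (dbo.GetOrdersByRegion, dbo.Orders, dbo.fn_OrdersByRegion):

-- All object names here are hypothetical, for illustration only.

-- Option 1: capture a stored procedure's result set so it can be queried.
CREATE TABLE #Results (OrderID int, CustomerID int, OrderTotal money)

INSERT INTO #Results (OrderID, CustomerID, OrderTotal)
EXEC dbo.GetOrdersByRegion @Region = 'West'

SELECT CustomerID, SUM(OrderTotal) AS Total
FROM #Results
GROUP BY CustomerID

DROP TABLE #Results
GO

-- Option 2: an inline table-valued function can be selected from directly
-- and is usually much faster than a multi-statement table-valued function.
CREATE FUNCTION dbo.fn_OrdersByRegion (@Region varchar(20))
RETURNS TABLE
AS
RETURN
(
    SELECT OrderID, CustomerID, OrderTotal
    FROM dbo.Orders
    WHERE Region = @Region
)
GO

SELECT CustomerID, SUM(OrderTotal) AS Total
FROM dbo.fn_OrdersByRegion('West')
GROUP BY CustomerID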
I've found a very interesting case where SQL 2005 is more than 4 times slower than its predecessor, despite superior hardware.
To compare, I have some logs that the ISA Server has stored in the database, which are then filtered against my blocklists. The result is the most popular sites I have neither blocked nor trusted. I have done some tuning on the query to exploit the data patterns in the database, which allows much better scaling to large datasets: the engine can now use hash matches instead of nested loops over the entire datasets.
It currently runs in 1 to 1.5 minutes on SQL 2000, which is powered by a VIA C3 Nehemiah at 1 GHz with only 350 MB of RAM to spare under optimistic conditions (the remainder of the 1 GB is used by many other applications); the server is limited to 384 MB of memory usage. The other box, running SQL 2005, is an Athlon 800 MHz, which has superior processing power, a larger cache and more memory bandwidth, and has 1.25 GB of RAM, where SQL Server can use the needed 629 MB without any problem. Despite this, my query takes more than 4 times longer on the SQL 2005 box: 1.5 minutes compared to 7 to 8 minutes.
SQL 2000 Database Schema:
CREATE TABLE [dbo].[WebProxyLog](
    [ClientIP] [bigint] NOT NULL,
    [ClientUserName] [nvarchar](514) NOT NULL,
    [ClientAgent] [varchar](128) NOT NULL,
    [ClientAuthenticate] [smallint] NOT NULL,
    [logTime] [datetime] NOT NULL,
    [service] [smallint] NOT NULL,
    [servername] [nvarchar](32) NOT NULL,
    [referredserver] [varchar](32) NOT NULL,
    [DestHost] [varchar](255) NOT NULL,
    [DestHostIP] [bigint] NOT NULL,
    [DestHostPort] [int] NOT NULL,
    [processingtime] [int] NOT NULL,
    [bytesrecvd] [bigint] NOT NULL,
    [bytessent] [bigint] NOT NULL,
    [protocol] [varchar](12) NOT NULL,
    [transport] [varchar](8) NOT NULL,
    [operation] [varchar](24) NOT NULL,
    [uri] [varchar](2048) NOT NULL,
    [mimetype] [varchar](32) NOT NULL,
    [objectsource] [smallint] NOT NULL,
    [resultcode] [int] NOT NULL,
    [CacheInfo] [int] NOT NULL,
    [rule] [nvarchar](128) NOT NULL,
    [FilterInfo] [nvarchar](128) NOT NULL,
    [SrcNetwork] [nvarchar](128) NOT NULL,
    [DstNetwork] [nvarchar](128) NOT NULL,
    [ErrorInfo] [int] NOT NULL,
    [Action] [varchar](32) NOT NULL,
    [GmtLogTime] [datetime] NOT NULL
)
CREATE TABLE [dbo].[TrustedHosts](
    [Hostname] [varchar](60) NOT NULL,
    [Comment] [varchar](500) NULL,
    CONSTRAINT [PK_TrustedHosts] PRIMARY KEY CLUSTERED ( [Hostname] ASC ) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[DeniedHosts](
    [ReasonId] [smallint] NOT NULL,
    [Hostname] [varchar](80) NOT NULL,
    [Path] [varchar](50) NOT NULL CONSTRAINT [DF_DeniedHosts_Path] DEFAULT ('%'),
    [Comment] [varchar](500) NULL,
    CONSTRAINT [PK_DeniedHosts] PRIMARY KEY CLUSTERED ( [Hostname] ASC, [ReasonId] ASC, [Path] ASC ) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [FK_DeniedHosts_DenyReason] FOREIGN KEY([ReasonId]) REFERENCES [dbo].[DenyReason] ([ReasonId])
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [FK_DeniedHosts_DenyReason]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [CK_DeniedHosts_HostName_NoTwoDots] CHECK ((((not([Hostname] like '%..%')))))
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [CK_DeniedHosts_HostName_NoTwoDots]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [CK_DeniedHosts_NoPercentDot] CHECK (([Hostname] <> '%.'))
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [CK_DeniedHosts_NoPercentDot]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [CK_DeniedHosts_NoWildcardMiddle] CHECK ((((not([Hostname] like '_%[%]%')))))
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [CK_DeniedHosts_NoWildcardMiddle]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [CK_DeniedHosts_ValidPath] CHECK (([Path] is null or [Path] = '%' or [Path] like '/%' and [Path] <> '' and [Path] <> '/%'))
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [CK_DeniedHosts_ValidPath]
GO
ALTER TABLE [dbo].[DeniedHosts] WITH NOCHECK ADD CONSTRAINT [CK_DeniedHosts_WildcardStart] CHECK ((((not([Hostname] like '[%]%'))) or ([Hostname] like '[%].%' or [Hostname] = '%')))
GO
ALTER TABLE [dbo].[DeniedHosts] CHECK CONSTRAINT [CK_DeniedHosts_WildcardStart]
I'm not going to give you the actual data, but for size: WebProxyLog contains 1,223,878 rows (524 MB; a few more rows in the SQL 2000 database), DeniedHosts contains 52,338 rows (3 MB), and TrustedHosts contains 2,183 rows (<1 MB).
The relevant query is:
CREATE PROCEDURE [dbo].[GetTrustedHosts]
AS
SELECT Hosts, Requests --, DistinctRequests
FROM (
    SELECT AA.Hosts, COUNT(*) AS Requests, COUNT(DISTINCT Path) AS DistinctRequests
    FROM (
        SELECT CASE WHEN CHARINDEX(':', SUBSTRING(URI, 8, CHARINDEX('/', URI, 8)-8), 8) <> 0
                    THEN SUBSTRING(URI, 8, CHARINDEX(':', URI, 8)-8)
                    ELSE SUBSTRING(URI, 8, CHARINDEX('/', URI, 8)-8)
               END AS Hosts,
               SUBSTRING(URI, CHARINDEX('/', URI, 8), 50) AS Path
        FROM dbo.WebProxyLog wpl
        WHERE URI LIKE 'http://%/%'
          AND ResultCode BETWEEN 200 AND 399
          AND (Service = 1) -- filter for only forward proxy
    ) AA
    WHERE NOT EXISTS (
            SELECT *
            FROM dbo.DeniedHosts dhp
            WHERE Path <> '%'
              AND LEFT(AA.Path, 3) = LEFT(dhp.Path COLLATE SQL_Latin1_General_CP1_CI_AS, 3)
              AND AA.Hosts LIKE (dhp.Hostname COLLATE SQL_Latin1_General_CP1_CI_AS)
              AND ( AA.Path LIKE (dhp.Path COLLATE SQL_Latin1_General_CP1_CI_AS)
                 OR AA.Path LIKE ((dhp.Path COLLATE SQL_Latin1_General_CP1_CI_AS) + '[?]%') )
          )
      AND NOT Hosts IS NULL -- this seems to give a speed advantage
    GROUP BY AA.Hosts
    HAVING COUNT(*) >= 25
) A
WHERE NOT Hosts IN (
        SELECT Hostname COLLATE SQL_Latin1_General_CP1_CI_AS
        FROM dbo.TrustedHosts thc
        --WHERE NOT thc.Hostname LIKE '%[%]%'
      )
  AND NOT Hosts IN (
        SELECT Hostname COLLATE SQL_Latin1_General_CP1_CI_AS
        FROM dbo.DeniedHosts dhc
        WHERE dhc.Path = '%'
      )
  AND NOT EXISTS (
        SELECT *
        FROM dbo.TrustedHosts thh
        WHERE A.Hosts LIKE (thh.Hostname COLLATE SQL_Latin1_General_CP1_CI_AS)
          AND thh.Hostname LIKE '[%]%'
          -- generates a hash join instead of a nested loop
          AND RIGHT(A.Hosts, 6) = (RIGHT(thh.Hostname, 6) COLLATE SQL_Latin1_General_CP1_CI_AS)
      )
  AND NOT EXISTS (
        SELECT *
        FROM dbo.DeniedHosts dhh
        WHERE dhh.Path = '%' -- this reduces the cost of the most expensive query
          AND dhh.Hostname LIKE '[%]%'
          AND A.Hosts LIKE (dhh.Hostname COLLATE SQL_Latin1_General_CP1_CI_AS)
          -- generates a hash join instead of a nested loop
          -- performance difference is significant due to volume
          AND RIGHT(A.Hosts, 6) = (RIGHT(dhh.Hostname, 6) COLLATE SQL_Latin1_General_CP1_CI_AS)
      )
ORDER BY Requests DESC
There is some mess with collations, but these don't seem to hurt performance.
The query plans differ only slightly. SQL 2000 does a Clustered Index Scan over WebProxyLog with a predicate on resultcode, followed by a filter for the LIKE operator; the cost is 78% for the scan and 10% for the filter. SQL 2005 combines both, with the scan costing 93%.
However, the key does not seem to lie in this, as the plans are nearly equivalent, with the SQL 2005 execution plan looking slightly better (the table scan is a larger part of the execution).
How can these differences, especially of this magnitude, be explained? And further, how can the query be optimized for decent performance on SQL 2005? What am I doing wrong?
I am a beginner at best, so I hope someone more experienced can help. I have a stored procedure that updates a view, which I wrote using two cursors (kind of an inner loop). I wrote it this way because I couldn't do it using regular Transact-SQL. The problem is that this procedure is taking longer and longer to run - up to 5 hours now! It is analyzing about 30,000 records, and I think it grows partly because we add new records every month.
The procedure works like this: the first cursor stores each unique account and due-date combination from the view. It then finds all the rows in the view that have that account/due-date combo and loads them into cursor 2, which groups them together for data manipulation. The accounts have to be grouped this way because an account can have different due dates and multiple records within each account/due-date combo, and they need to be looked at as little singular groups.
Here is my procedure; I hope someone can shed some light on this. My boss is giving me heck about it (I think he thinks girls can't code!). I got this far; I hope someone can help me optimize it further.
CREATE PROCEDURE dbo.sp_PromiseStatus
AS
BEGIN
SET NOCOUNT ON

/* Global variables */
DECLARE @tot_pay money
DECLARE @rec_upd VARCHAR(1)
DECLARE @todays_date varchar(12)
DECLARE @mActivityDate2_temp datetime
DECLARE @tot_paydate datetime

/* variables for cursor ACT_CUR1 */
DECLARE @mAcct_Num1 BIGINT
DECLARE @mDueDate1 datetime

/* variables for ACT_CUR2 */
DECLARE @mAcct_Num2 BIGINT
DECLARE @mActivity_Date2 datetime
DECLARE @mPromise_Amt_1 money
DECLARE @mPromise_Status varchar(3)
DECLARE @mCurrent_Due_Amt money
DECLARE @mDPD int
DECLARE @mPromise_Date datetime

SELECT @todays_date = '' + CAST(DATEPART(mm, getdate()) AS varchar(2)) + '/'
                         + CAST(DATEPART(dd, getdate()) AS varchar(2)) + '/'
                         + CAST(DATEPART(yyyy, getdate()) AS varchar(4)) + ''

DECLARE ACT_CUR1 CURSOR FOR
    SELECT DISTINCT A.ACCT_NUM, A.DUE_DATE
    FROM VWAPPLICABLEPROMISEACTIVITYRECORDS A

OPEN ACT_CUR1
FETCH NEXT FROM ACT_CUR1 INTO @mAcct_Num1, @mDueDate1

WHILE (@@FETCH_STATUS = 0)
BEGIN
    SELECT @rec_upd = 'N '

    DECLARE ACT_CUR2 CURSOR FOR
        SELECT B.ACCT_NUM, B.ACTIVITY_DATE, B.PROMISE_AMT_1, B.PROMISE_STATUS,
               B.CURRENT_DUE_AMT, B.DAYS_DELINQUENT_NUM, B.PROMISE_DATE_1
        FROM VWAPPLICABLEPROMISEACTIVITYRECORDS B (UPDLOCK)
        WHERE B.ACCT_NUM = @mAcct_Num1
          AND B.DUE_DATE = @mDueDate1
        ORDER BY B.ACCT_NUM, B.DUE_DATE, B.ACTIVITY_DATE,
                 CASE B.Time_Obtained WHEN 0 THEN 0 ELSE 1 END DESC,
                 B.Time_Obtained

    OPEN ACT_CUR2
    FETCH NEXT FROM ACT_CUR2 INTO @mAcct_Num2, @mActivity_Date2, @mPromise_Amt_1,
                                  @mPromise_Status, @mCurrent_Due_Amt, @mDPD, @mPromise_Date

    WHILE (@@FETCH_STATUS = 0)
    BEGIN
        ---- CHECK ----
        DECLARE @PrintVariable2 VARCHAR(8000)
        --SELECT @PrintVariable2 = CAST(@MACCT_NUM2 AS VARCHAR) + '' + CAST(@MACTIVITY_DATE2 AS VARCHAR) + ' '
        --    + CAST(@MPROMISE_AMT_1 AS VARCHAR) + ' ' + CAST(@MPROMISE_STATUS AS VARCHAR) + ''
        --    + CAST(@mCurrent_Due_Amt AS VARCHAR) + ' ' + CAST(@mDPD AS VARCHAR) + '' + CAST(@mPromise_Date AS VARCHAR)
        --PRINT @PrintVariable2
        ---- END CHECK ----

        IF @mDPD >= 30
        BEGIN
            SELECT @tot_pay = SUM(CONVERT(FLOAT, C.PAY_AMT))
            FROM vwAplicablePayments C
            WHERE C.ACCT_NUM = @mAcct_Num2
              AND C.ACTIVITY_DATE >= @mActivity_Date2
              AND C.ACTIVITY_DATE < @mActivity_Date2 + 15

            ---- CHECK ----
            DECLARE @PrintVariable3 VARCHAR(8000)
            --SELECT @PrintVariable3 = 'Greater=30 DOLLARS COLLECTED'
            --PRINT @PrintVariable3
            ---- END CHECK ----
        END
        ELSE IF @mDPD < 30
        BEGIN
            SELECT @tot_pay = SUM(CONVERT(FLOAT, C.PAY_AMT))
            FROM vwAplicablePayments C
            WHERE C.ACCT_NUM = @mAcct_Num2
              AND C.ACTIVITY_DATE >= @mActivity_Date2
              AND C.ACTIVITY_DATE BETWEEN @mActivity_Date2 AND @mPromise_Date + 5

            ---- CHECK ----
            DECLARE @PrintVariable4 VARCHAR(8000)
            --SELECT @PrintVariable4 = 'Less 30 DOLLARS COLLECTED'
            --PRINT @PrintVariable4
            ---- END CHECK ----
        END

        ---------------------------------------- MY REVISED LOGIC ----------------------------------------
        IF @rec_upd = 'N'
        BEGIN
            IF @mDPD >= 30
            BEGIN
                SELECT @mActivityDate2_temp = @mActivity_Date2 + 15
                --DECLARE @PrintVariable5 VARCHAR(8000)
                --SELECT @PrintVariable5 = ' GREATER= 30 USING ACTVITY_DATE+15'
                --PRINT @PrintVariable5
            END
            ELSE IF @mDPD < 30
            BEGIN
                SELECT @mActivityDate2_temp = @mPromise_Date + 5
                --DECLARE @PrintVariable6 VARCHAR(8000)
                --SELECT @PrintVariable6 = ' LESS 30 USING PROMISE_DATE+5'
                --PRINT @PrintVariable6
            END

            IF @tot_pay >= 0.9 * @mCurrent_Due_Amt --used to be promise amt
            BEGIN
                UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                SET PROMISE_STATUS = 'PK', TOTAL_DOLLARS_COLL = @tot_pay
                WHERE CURRENT OF ACT_CUR2

                --This statement updates the time that the status was placed into PK.
                IF @mPromise_Status IN ('PTP','OP')
                BEGIN
                    UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                    SET Status_Date = @todays_date
                    WHERE CURRENT OF ACT_CUR2
                END

                SELECT @rec_upd = 'Y '
            END

            IF ((@tot_pay < 0.9 * @mCurrent_Due_Amt) OR @tot_pay IS NULL)
               AND (@mActivityDate2_temp > @todays_date) --need to put 1st day of month here for snapshot 9/01/2004
            BEGIN
                UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                SET PROMISE_STATUS = 'OP'
                WHERE CURRENT OF ACT_CUR2

                --This statement updates the time that the status was placed into OP, which is the original Activity Date.
                --The record will hold this date until it goes into PK, PB, or IP.
                IF @mPromise_Status IN ('PTP','OP')
                BEGIN
                    UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                    SET Status_Date = @mActivity_Date2
                    WHERE CURRENT OF ACT_CUR2
                END
            END
            ELSE IF ((@tot_pay < 0.9 * @mCurrent_Due_Amt) OR @tot_pay IS NULL)
                    AND (@mActivityDate2_temp <= @todays_date) --need to put 1st day of month here for snapshot 9/01/2004
            BEGIN
                UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                SET PROMISE_STATUS = 'PB',
                    TOTAL_DOLLARS_COLL = CASE WHEN @tot_pay IS NULL THEN 0 ELSE @tot_pay END
                WHERE CURRENT OF ACT_CUR2

                --This statement updates the time that the status was placed into PB.
                IF @mPromise_Status IN ('PTP','OP')
                BEGIN
                    UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                    SET Status_Date = @todays_date
                    WHERE CURRENT OF ACT_CUR2
                END
            END
        END
        ELSE IF @rec_upd = 'Y'
        BEGIN
            UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
            SET PROMISE_STATUS = 'IP', TOTAL_DOLLARS_COLL = 0
            WHERE CURRENT OF ACT_CUR2

            --This statement updates the time that the status was placed into IP.
            IF @mPromise_Status NOT IN ('IP')
            BEGIN
                UPDATE VWAPPLICABLEPROMISEACTIVITYRECORDS
                SET Status_Date = @todays_date
                WHERE CURRENT OF ACT_CUR2
            END
        END

        FETCH NEXT FROM ACT_CUR2 INTO @mAcct_Num2, @mActivity_Date2, @mPromise_Amt_1,
                                      @mPromise_Status, @mCurrent_Due_Amt, @mDPD, @mPromise_Date
    END

    CLOSE ACT_CUR2
    DEALLOCATE ACT_CUR2

    FETCH NEXT FROM ACT_CUR1 INTO @mAcct_Num1, @mDueDate1
END

CLOSE ACT_CUR1
DEALLOCATE ACT_CUR1

SET NOCOUNT OFF
END
GO
I have an SSIS package, and inside it I do something like the following:
1. I have a SQL component that gives back an object to store the records. 2. I have a VB script component, and I direct the object I got in step 1 into the script as a dataset.
My problem is: I run the package from a SQL Server 2005 stored procedure like this:
do dtexec.exe package.dtsx; loop until i > t
I control it so that it runs 30 times, but I found that it gets slower and slower: the first run takes about 600 s, while the last one takes about 1800 s.
Why? Doesn't the package drop the objects it creates during the loop in the stored procedure? Thanks!
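For reference, the kind of loop described above might look like the hedged sketch below when driven from T-SQL; the package path, iteration count, and use of xp_cmdshell are assumptions for illustration only.

-- Hypothetical sketch only: the package path and count are placeholders,
-- and xp_cmdshell must be enabled for this to run.
DECLARE @i int, @t int, @cmd varchar(500)
SET @i = 1
SET @t = 30

WHILE @i <= @t
BEGIN
    SET @cmd = 'dtexec.exe /F "C:\Packages\package.dtsx"'
    EXEC master.dbo.xp_cmdshell @cmd   -- each call is a separate dtexec process
    SET @i = @i + 1
END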
We have an extensive stress test for our OLTP application that runs slower after upgrading to SQL Server 2005.
We have done the following:
- applied SP1 and SP2
- updated statistics (sp_updatestats)
- recompiled all stored procedures
Some behavior we notice in Perfmon over the course of the 1-hour test is:
- Average CPU Utilization is up from 22% to 32% and we're trying to figure out why
- Compilations per second are up from 200 to 250
- Recompilations per second are down from 5 to 1 (which is good).
- Memory utilization is identical.
- We see exaggerated spikes in response times. The general performance seems pretty good - but there are periodic spikes that hurt our averages.
- I examined perfmon for waits and nothing jumped out.
- A small handful of application workflows are much slower, while most are roughly identical.
I realize there could be a zillion different causes. But if anyone has experience upgrading a large OLTP solution which uses a mixture of dynamic SQL and stored procedures to SS2005 - and tuning the upgraded database - please offer any suggestions.
Our current plan includes (a) running the Best Practices analyzer which already uncovered a trace flag we may want to use since we're running on an 8-way, (b) rebuilding indexes, and (c) examine a SQL Trace this time around too since it appears the slowness is isolated to certain application workflows.
Please offer any suggestions that have worked for your team.
Thanks so much!
Mike
PS I will post all the things we end up doing in the end once we get performance equal or better - hopefully the latter!!!
Hi, I am surprised to see that in Business Intelligence Studio debugging mode my packages run faster. When I run them through the Agent on the SQL Server where the data actually resides, they take around 70% longer.
The package is now very close to the data and the database engine itself, whereas in BIDS it wasn't.
Does anybody know why this happens? Do I need to tune something?
My ASP.NET project with a database connection inside (data-bound controls, or whatever the name is) makes my website load slowly: there is a delay of about 8 to 9 seconds on its first load from localhost. After that first load, my ASP.NET project runs very fast and takes 1 second or less to reload. The problem shows up again after I restart my computer and load the project from localhost for the first time. How do I make my ASP.NET project load in 1 second or less even on the first request from localhost?
We have recently upgraded our production server from a dual Pentium II 400 MHz server with 384 MB of RAM to a triple 700 MHz Pentium III system with 2 GB of RAM. However, since switching over, all of our Access queries against the SQL 7 databases are either running extremely slowly or not at all, despite the DSNs being set up correctly. Does anyone have any ideas why, and more importantly, how I can resolve this?
Thanks
Pete Burton (IT Support) Durham Aged Mineworkers Homes Association
DECLARE @SdxLastName char(4) DECLARE @TheFirstName varchar(30) DECLARE @HitCountLastNameOnly int DECLARE @HitCountFirstName int DECLARE @ThresholdLastName int DECLARE @ThresholdFirstName int
-- Set minimum row hit count for when to use the first name in addition to last name for search SET @ThresholdLastName = 2 --if we get at least this number of rows back, ignore the first name completely -- Set maximum row hit count for when to use the exact first name instead of first name soundex in addition to last name soundex for search SET @ThresholdFirstName = 12 --if we get over this number of rows back, look for exact first name match
IF @AN_CustID < 1 OR @LastName = '' RETURN --required params
-- Whenever a doctor's first name is unknown, the user will enter "DR"; therefore, do not soundex the first name in this case IF @FirstName='DR.' OR @FirstName='DR' SET @FirstName=''
-- Get the soundex of the specified physician SELECT @SdxLastName = dbo.aif_Soundex(@LastName), @TheFirstName = dbo.aif_Soundex(@FirstName)
-- Determine how many rows are returned based solely on similarity of the last name SELECT @HitCountLastNameOnly = COUNT(*) FROM PhysicianMstr p (NOLOCK) WHERE p.AN_CustID = @AN_CustID AND p.Active = 1 AND @SdxLastName = dbo.aif_Soundex(RTRIM(p.LastName))
-- Check if using just the last name soundex returns too few rows IF @HitCountLastNameOnly < @ThresholdLastName BEGIN -- Determine how many rows are returned based on last name soundex or first name soundex SELECT @HitCountFirstName = COUNT(*) FROM PhysicianMstr p (NOLOCK) WHERE p.AN_CustID = @AN_CustID AND p.Active = 1 AND ( @SdxLastName = dbo.aif_Soundex(RTRIM(p.LastName)) OR @TheFirstName = dbo.aif_Soundex(RTRIM(p.FirstName)) ) -- If too many rows, switch to exact match on first name instead of soundex of first name IF @HitCountFirstName > @ThresholdFirstName BEGIN SET @TheFirstName = @FirstName SET @HitCountLastNameOnly = -1 --negative value indicates first name exact match, not soundex END END
-- List physicians with similar sounding names SELECT p.AN_PhysicianID, pm.AN_CustID, RTRIM(p.LastName) AS LastName, RTRIM(p.FirstName) AS FirstName, RTRIM(p.StateLicNo) AS UPIN, RTRIM(mis.Specialty) AS Specialty1, pm.Specialty2, p.Active, RTRIM(pm.Degree) AS Degree, pm.Extra, p.CreatedBy, p.CreatedOn, p.ModifiedBy, p.ModifiedOn, p.PLEPhysicianMIID FROM PLEPhysicianMstrMI p (NOLOCK) INNER JOIN PhysicianMstr pm (NOLOCK) ON p.AN_PhysicianID = pm.AN_PhysicianID LEFT JOIN PLEPhysicianSpecialtyMstrMI mis (NOLOCK) ON p.PLEPhysicianSpecialtyMIID = mis.PLEPhysicianSpecialtyMIID WHERE pm.AN_CustID = @AN_CustID AND p.Active = 1 AND ( @SdxLastName = dbo.aif_Soundex(RTRIM(p.LastName)) OR @TheFirstName = CASE -- Include first name only when we don't have the minimum number of rows WHEN @HitCountLastNameOnly < 0 THEN p.FirstName --exact first name match WHEN @HitCountLastNameOnly < @ThresholdLastName THEN dbo.aif_Soundex(RTRIM(p.FirstName)) ELSE '****' --force false for first name test END ) ORDER BY p.LastName, p.FirstName
We recently got a scenario where we need to get data from Oracle tables that live on third-party servers. We have SQL Server installed on our servers, so they have created a DBLINK in the Oracle server to our SQL Server and published the DBLINK name.
What are the next steps I need to follow on my SQL Server in order to access the Oracle tables?
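The Oracle DB link works in the other direction (Oracle reaching your SQL Server); to read Oracle tables from the SQL Server side, the usual route is to define a linked server of your own. A hedged sketch, assuming the Oracle client and OraOLEDB provider are installed on the SQL Server box; the names, TNS alias and credentials below are placeholders.

-- Placeholder names and credentials throughout.
EXEC sp_addlinkedserver
    @server = 'ORA_LINK',              -- local name you will use in queries
    @srvproduct = 'Oracle',
    @provider = 'OraOLEDB.Oracle',
    @datasrc = 'ORCL'                  -- TNS alias from tnsnames.ora

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'ORA_LINK',
    @useself = 'FALSE',
    @rmtuser = 'scott',
    @rmtpassword = 'tiger'

-- Four-part name (the Oracle schema takes the place of the database part):
SELECT * FROM ORA_LINK..SCOTT.EMP

-- Or pass the query through to Oracle:
SELECT * FROM OPENQUERY(ORA_LINK, 'SELECT empno, ename FROM emp')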
Microsoft SQL Server Management Studio 9.00.2047.00 Microsoft Analysis Services Client Tools 2005.090.2047.00 Microsoft Data Access Components (MDAC) 2000.086.1830.00 (srv03_sp1_rtm.050324-1447) Microsoft MSXML 2.6 3.0 4.0 6.0 Microsoft Internet Explorer 6.0.3790.1830 Microsoft .NET Framework 2.0.50727.42 Operating System 5.2.3790
I have the OraOLEDB.Oracle provider installed to the C:\oraclexe directory.
I am having problems querying a linked Oracle server. When I set up Oracle as a linked server and purposely enter an incorrect password, the query I run tells me I have an incorrect password, so it at least knows that much. When I set the correct password and run a query, I get this error:
(i replaced the real server name with "someServer".)
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "OraOLEDB.Oracle" for linked server "SomeServer" reported an error. The provider did not give any information about the error.
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "OraOLEDB.Oracle" for linked server "SomeServer".
I want to enter times for running event performances (like 10:26.35 for a person's 2-mile time, for instance). I am using sql server 2000. This is an ongoing concern for me since I do a lot of work with running events. Can someone give me some direction as to how I can best deal with this?
The datetime data types do a lot of converting to 'time-of-day' and that is not what I want. For instance, if I enter a 2-mile time of 10:26.35 (a pretty typical 2-mile time) my db converts it to 10:26:35 AM.
If I use a varchar data type, there is a ton of code to write whenever I want to sort the data.
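One approach that avoids both problems is to store the performance as an integer number of hundredths of a second and format it only on output; sorting and arithmetic then work on the raw int. A hedged sketch with hypothetical table and column names:

-- Hypothetical schema: 10:26.35 is stored as 62635 hundredths of a second.
CREATE TABLE RaceResult
(
    ResultID   int IDENTITY PRIMARY KEY,
    RunnerName varchar(50) NOT NULL,
    EventName  varchar(30) NOT NULL,
    Hundredths int NOT NULL
)

INSERT INTO RaceResult (RunnerName, EventName, Hundredths)
VALUES ('J. Smith', '2 Mile', 10 * 6000 + 26 * 100 + 35)   -- 10:26.35

-- Format mm:ss.hh on the way out; sorting uses the raw int.
SELECT RunnerName,
       CAST(Hundredths / 6000 AS varchar(4)) + ':'
       + RIGHT('0' + CAST((Hundredths % 6000) / 100 AS varchar(2)), 2) + '.'
       + RIGHT('0' + CAST(Hundredths % 100 AS varchar(2)), 2) AS DisplayTime
FROM RaceResult
ORDER BY Hundredths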
I am using an MS Access front end to input data into a SQL Server database. From Access I am generating an update query utilising an inline function. My problem is that when I update a time value, instead of updating the field with something like 08:00:00 it updates it to 1/1/1900 08:00:00. Is there a way to force the update to only use the hours, minutes and seconds and ignore the date part? Any help would be greatly appreciated. Regards, Trevor
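If the column is a datetime, a time-only value will always carry the 1900-01-01 base date; you can either convert to a time string on output or keep the stored date and replace only the time of day. A hedged sketch with placeholder table and column names (dbo.Shifts, StartTime):

-- Show only the time portion of a datetime on output ('hh:mm:ss'):
SELECT CONVERT(varchar(8), StartTime, 108) AS TimeOfDay
FROM dbo.Shifts

-- Update the time of day while preserving whatever date is already stored:
UPDATE dbo.Shifts
SET StartTime = DATEADD(day, DATEDIFF(day, 0, StartTime), 0)  -- existing date at midnight
              + CONVERT(datetime, '08:00:00')                  -- plus the new time of day
WHERE ShiftID = 42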
I need to create a linked server that can access more than one database. I assume the only way this can be done is by creating two separate links from the local server using the Microsoft OLE DB Provider for SQL Server. But the problem is the server name: I cannot use the same server name twice, and if I use a name that is not an existing SQL Server name I get a "server not found" error.
So how do I add/link the same server multiple times to access different databases on that server?
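A single linked server definition normally reaches every database on the remote instance through four-part names, and the local name does not have to match the real server name as long as @datasrc points at it. A hedged sketch with placeholder server and database names:

-- Placeholder names throughout.
EXEC sp_addlinkedserver
    @server = 'REMOTE1',              -- local alias; need not match the real name
    @srvproduct = '',
    @provider = 'SQLOLEDB',
    @datasrc = 'RealServerName'       -- actual network name of the SQL Server

-- One linked server reaches every database on the remote instance:
SELECT * FROM REMOTE1.Sales.dbo.Orders
SELECT * FROM REMOTE1.Accounting.dbo.Invoices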
I have an application that connects to an externally hosted SQL Server. The first time I go to the web page, the application times out. I can then immediately refresh and everything works. After a period of about 20-30 minutes of inactivity, the issue repeats itself. This is very hard to troubleshoot. Does anyone know what to do?
For security reasons customer wants to put a SQL database on an encrypted thumb drive (IronKey). Here's the rub though. He wants to be able to work with the data on a workstation. Then, if he takes his laptop out of the office he wants to be able to simply plug that thumb drive into the laptop and fire up SQL on the laptop and use that same database. Procedurally this would work in that the database can be created so that the location is the same from both machine viewpoints, however will the two different SQL instances allow moving the database back and forth like this?
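Assuming both machines run compatible SQL Server versions and the thumb drive keeps the same drive letter (or the paths are adjusted at attach time), the usual way to move the same files between instances is detach/attach. A hedged sketch with placeholder database name and paths:

-- On the machine you are leaving:
EXEC sp_detach_db @dbname = 'FieldData'

-- On the machine you are moving to (paths are placeholders on the thumb drive):
EXEC sp_attach_db @dbname = 'FieldData',
    @filename1 = 'E:\Data\FieldData.mdf',
    @filename2 = 'E:\Data\FieldData_log.ldf'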
We've migrated from SQL Server 7.0 EE to SQL Server 2000 on an 8-processor, 8 GB RAM server running Windows 2000.
Everything seems to be going OK, but after reorganizing indexes, when we want to recover free space in the different files using DBCC SHRINKFILE, those recoveries take three times as long as they did with the previous SQL Server 7.0.
Shrinking big files (6 GB, recovering 1.5 GB), which previously took 3 hours, now needs 9 hours.
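For reference, the operation being timed is of this form (the logical file name and target size in MB are placeholders):

USE MyBigDB
DBCC SHRINKFILE (MyBigDB_Data1, 4500)   -- shrink the 6 GB file toward ~4.5 GB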
I am building a query. I have a table with 4 columns and need to try to put the times together. There are some inconsistencies with this, and I'm hoping to exclude them. Here is a sample table:
Function 1 = Clock In Type 1 Function 2 = Clock Out Type 1 Function 3 = Clock In Type 2 Function 4 = Clock Out Type 2
Basically what I need to do is take the time from rows with Function 1 and match it with Function 2 so I can get a total time for the clock-in. Function 3 rows need to match up with Function 4 rows so I can get another set of total times. There may be more clock-in rows than clock-out rows, or more clock-out rows than clock-in rows, and there may be multiple clock-ins and clock-outs per day per employee. I'm basically trying to get totals for each clock in/out type.
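One way to pair them, assuming SQL Server 2005 or later and a table shaped roughly like Punches(EmployeeID, PunchTime, FunctionCode), is to number the ins and outs per employee, day and clock type and join on that sequence; unmatched punches simply fall out of the join. A hedged sketch (table and column names are assumptions):

-- Function 1/3 = clock in, Function 2/4 = clock out.
WITH Ins AS
(
    SELECT EmployeeID,
           CASE FunctionCode WHEN 1 THEN 1 ELSE 2 END AS ClockType,
           PunchTime,
           ROW_NUMBER() OVER (PARTITION BY EmployeeID,
                                           CONVERT(varchar(8), PunchTime, 112),
                                           FunctionCode
                              ORDER BY PunchTime) AS Seq
    FROM dbo.Punches
    WHERE FunctionCode IN (1, 3)
),
Outs AS
(
    SELECT EmployeeID,
           CASE FunctionCode WHEN 2 THEN 1 ELSE 2 END AS ClockType,
           PunchTime,
           ROW_NUMBER() OVER (PARTITION BY EmployeeID,
                                           CONVERT(varchar(8), PunchTime, 112),
                                           FunctionCode
                              ORDER BY PunchTime) AS Seq
    FROM dbo.Punches
    WHERE FunctionCode IN (2, 4)
)
SELECT i.EmployeeID,
       i.ClockType,
       SUM(DATEDIFF(minute, i.PunchTime, o.PunchTime)) AS TotalMinutes
FROM Ins i
JOIN Outs o
  ON o.EmployeeID = i.EmployeeID
 AND o.ClockType  = i.ClockType
 AND o.Seq        = i.Seq
 -- pairs are matched within the same calendar day; overnight shifts need extra handling
 AND CONVERT(varchar(8), o.PunchTime, 112) = CONVERT(varchar(8), i.PunchTime, 112)
GROUP BY i.EmployeeID, i.ClockType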
I've got the query below, and several variations of it, and still can't seem to find a reliable way to query the server to bring back the last full backup per database. I'm seeing multiple records in the backupset table with type = 'D'. From what I read online, type D = Database, which I assumed meant a full database backup. Apparently not. Try running the query below on one of your databases, then check whether the date is actually the last backup date.
DECLARE @db_name VARCHAR(100) SELECT @db_name = DB_NAME() -- Get Backup History for required database SELECT TOP ( 30 ) s.database_name, m.physical_device_name,
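Type 'D' in msdb.dbo.backupset does mean a full database backup, but there is one row per full backup ever taken, so the latest one has to be aggregated out per database. A hedged sketch:

-- Last completed full backup per database:
SELECT  s.database_name,
        MAX(s.backup_finish_date) AS last_full_backup
FROM    msdb.dbo.backupset s
WHERE   s.type = 'D'
GROUP BY s.database_name
ORDER BY s.database_name

-- Same thing, but also showing the media file of that newest backup:
SELECT  s.database_name, s.backup_finish_date, m.physical_device_name
FROM    msdb.dbo.backupset s
JOIN    msdb.dbo.backupmediafamily m ON m.media_set_id = s.media_set_id
WHERE   s.type = 'D'
  AND   s.backup_finish_date =
        (SELECT MAX(s2.backup_finish_date)
         FROM msdb.dbo.backupset s2
         WHERE s2.database_name = s.database_name AND s2.type = 'D')
ORDER BY s.database_name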
This is my 10th attempt. I've uninstalled all the .net components and then reinstalled .net 2.0 and then various versions of SQL Server 2005 Express. Here's what all 10 error logs said:
Microsoft SQL Server 2005 Setup beginning at Sun Mar 16 18:26:46 2008
Process ID : 3808
c:fc0a616bb7e15d80b498fce92asetup.exe Version: 2005.90.3042.0
Running: LoadResourcesAction at: 2008/2/16 18:26:45
Complete: LoadResourcesAction at: 2008/2/16 18:26:45, returned true
Running: ParseBootstrapOptionsAction at: 2008/2/16 18:26:45
Loaded DLL: c:fc0a616bb7e15d80b498fce92axmlrw.dll Version: 2.0.3609.0
Complete: ParseBootstrapOptionsAction at: 2008/2/16 18:26:46, returned false
Error: Action "ParseBootstrapOptionsAction" failed during execution. Error information reported during run:
Could not parse command line due to datastore exception.
Source File Name: utillibpersisthelpers.cpp
Compiler Timestamp: Wed Jun 14 16:30:14 2006
Function Name: writeEncryptedString
Source Line Number: 124
----------------------------------------------------------
writeEncryptedString() failed
Source File Name: utillibpersisthelpers.cpp
Compiler Timestamp: Wed Jun 14 16:30:14 2006
Function Name: writeEncryptedString
Source Line Number: 123
----------------------------------------------------------
Error Code: 0x80070002 (2)
Windows Error Text: The system cannot find the file specified.
Source File Name: cryptohelpercryptsameusersamemachine.cpp
Compiler Timestamp: Wed Jun 14 16:28:04 2006
Function Name: sqls::CryptSameUserSameMachine::ProtectData
Source Line Number: 50
2
Could not skip Component update due to datastore exception.
Source File Name: datastorecachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "InstallMediaPath" {"SetupBootstrapOptionsScope", "", "3808"} in cache
Source File Name: datastorepropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.InstallMediaPath
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupBootstrapOptionsScope"
Running: ValidateWinNTAction at: 2008/2/16 18:26:46
Complete: ValidateWinNTAction at: 2008/2/16 18:26:46, returned true
Running: ValidateMinOSAction at: 2008/2/16 18:26:46
Complete: ValidateMinOSAction at: 2008/2/16 18:26:46, returned true
Running: PerformSCCAction at: 2008/2/16 18:26:46
Complete: PerformSCCAction at: 2008/2/16 18:26:46, returned true
Running: ActivateLoggingAction at: 2008/2/16 18:26:46
Error: Action "ActivateLoggingAction" threw an exception during execution. Error information reported during run:
Datastore exception while trying to write logging properties.
Source File Name: datastorecachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "primaryLogFiles" {"SetupStateScope", "", ""} in cache
Source File Name: datastorepropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupStateScope.primaryLogFiles
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupStateScope"
00DFCFC0 Unable to proceed with setup, there was a command line parsing error. : 2
Error Code: 0x80070002 (2)
Windows Error Text: The system cannot find the file specified.
Source File Name: datastorepropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.InstallMediaPath
Source Line Number: 44
Class not registered.
Failed to create CAB file due to datastore exception
Source File Name: datastorecachedpropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:27:59 2006
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
----------------------------------------------------------
Failed to find property "HostSetup" {"SetupBootstrapOptionsScope", "", "3808"} in cache
Source File Name: datastorepropertycollection.cpp
Compiler Timestamp: Wed Jun 14 16:28:01 2006
Function Name: SetupBootstrapOptionsScope.HostSetup
Source Line Number: 44
----------------------------------------------------------
No collector registered for scope: "SetupBootstrapOptionsScope"
Message pump returning: 2
I'm a student and need this to work for projects. Any help will be greatly appreciated.
I have created a big list of packages, some calling others. They all work fine from my computer using Visual Studio.
When I try to deploy them (building them with deployment turned on and running them either directly from Management Studio or as a job) I get errors about the passwords in the connection strings. From what I've read so far, it's the encryption process that kills it.
I have tried to add a password to some packages, but it still didn't work (it only works when run directly on my computer in Management Studio after deploying to SQL Server, not as a job).
I have tried to change ProtectionLevel to SecurityStorage, but it wouldn't let me save in Visual Studio (I understand it is not allowed in VS because you are saving to the file system - but how am I supposed to save it to anything else? Why is it showing there if it's not even valid?).
If anyone can give me the steps to doing this properly, that would be awesome. I simply need to run the packages from SQL Server - that's all! I have no idea why it has to be so difficult.
I have a request where I would like to take the start date/time and end date/time and flag (with an int) which hours (24-hour clock) have values between the two dates. Example: a car comes into service on 2013-12-25 at 0800 and leaves 2013-12-25 at 1400; the difference is 6 hours, and I need my table to show those hours flagged.
As I'm working away at it, I'm trying to figure out how I could use a time dimension table for this, but I don't really see much. So far I have the difference between the two times in hours (hour_diff) and the start hour (min_hour), so I would like to do something where I update the first hour (min_hour) and then update columns based on the number of hours (hour_diff).
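One hedged way to do it, assuming a table shaped like dbo.ServiceVisit(VisitID, StartTime, EndTime) and same-day start/end as in the example: expand each visit against a small 0-23 hours table, which yields one flagged row per clock hour covered (those rows can then be pivoted into per-hour columns if needed).

-- Hypothetical names throughout; multi-day spans would need the date handled too.
DECLARE @Hours TABLE (Hr int)
DECLARE @h int
SET @h = 0
WHILE @h < 24
BEGIN
    INSERT INTO @Hours (Hr) VALUES (@h)
    SET @h = @h + 1
END

SELECT v.VisitID,
       h.Hr,
       1 AS InService
FROM dbo.ServiceVisit v
JOIN @Hours h
  ON h.Hr >= DATEPART(hour, v.StartTime)
 AND h.Hr <  DATEPART(hour, v.EndTime)      -- 0800-1400 flags hours 8 through 13
ORDER BY v.VisitID, h.Hr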
I am not sure if this is the correct forum but here goes!
We rebuilt our SqlServer 2000 Trans replication the other night. It used to run in 3 hours but now it runs in 9.5 hours (7 hours bcp out, 2.5 hours bcp in). We have a dedicated distributor box (4 processors), a 4 processor publisher, and a 2 processor subscriber. None of the systems exhibited any processor stress or unusual disk activity. The network tests OK (tested with file xfers). But the bcp's wrote data at 2.5 to 4 minutes per 100k rows, and they loaded the data at about 100k rows in 10 seconds or less.
As you know, Replication Snapshot uses bcp on each source table to build a collection of flat files. Then it uses bcp to load those files into the subscribing tables. Because bcp is the workhorse here, I decided against posting this in the Replication forum.
The only change I know of is increasing MaxBcpThreads from 4 to 8. This parameter specifies the number of bulk-copy operations that can be performed in parallel. I was thinking that 8 bcp's might somehow be killing the drive where all the bcp files are written.
I installed SP2 two days ago, and it seems like my SSIS packages now take longer than before - the very opposite of what I was hoping for.
Anyway, here are some data from runs on our performance environment. No new data is added to the source database between the runs, but I do a full process of the cubes every time (time is in seconds):
Process cubes: 2100 (before SP2), 2600 (after SP2)
So, as you can see, everything is going slower with SP2. I have yet to look into whether there are any specific steps in the packages that take longer than before, but it's odd that all packages take longer to execute. It especially surprises me that cube processing is slower.
We have a view with many left joins. The original creators of this view might have been lazy or sloppy, I don't know. I have rewritten the query to proper inner joins where required and also nested left joins.
So rather than the following exemplary fragment

select <many items>
from A
left join B on B.id_A = A.id
left join C on C.id_B = B.id

this now looks like

select <many items>
from A
left join (B join C on C.id_B = B.id) on B.id_A = A.id
Compilation time of the original view was 18s, of the new rewritten view 4s. The performance of execution is also better (not counting the compile of course). The results of the query are identical. There are about 30 left joins in the original view.
I can imagine that the optimizer has difficulty with all these left joins, but 14s is quite a big difference. I haven't looked at the execution plans in detail yet. I did notice that in both cases the Reason for Early Termination of Statement Optimization was Time Out.