Massive UPDATE And SELECT TOP 1 QUERIES, Slowing Down...
Apr 10, 2007
Background
SQL Server 2005 Standard 9.0.1399 64bit
Windows 2003 64-bit
8GB RAM
RAID-1 70GB HD 15K SCSI (Log Files, OS)
RAID-10 1.08TB HD 10K SCSI (Data Files)
Runs approximately 800 Transactions/Second (_Total)
We deliver approximately 70-80 million ad views / day
8 clustered Windows 2003 32-bit IIS servers running ASP.NET 2.0 websites
All 8 servers talking to the one SQL server via a private network (server backbone).
In SQL Server Profiler, I see the following SQL statements with durations of 2000 - 7000:
select top 1 keywordID, keyword, hits, photo, feed from dbo.XXXX where hits > 0 order by hits
and
UPDATE XXXX SET hits=1906342 WHERE keywordID = 7;
The hits number is incremented by one each time that keyword ID is selected.
Sometimes these happen so frequently that the server stops accepting new connections, and I have to restart SQL Server or reboot.
Any ideas on why this is happening?
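For reference, a hedged sketch of how this pattern is often restructured on SQL Server 2005; the real table definition and existing indexes are not shown in the post, so both statements below are assumptions, not a confirmed fix:

-- Sketch only: an index that covers the "WHERE hits > 0 ORDER BY hits" read
CREATE NONCLUSTERED INDEX IX_XXXX_hits
    ON dbo.XXXX (hits)
    INCLUDE (keywordID, keyword, photo, feed);

-- Sketch only: the SELECT TOP 1 and the follow-up UPDATE folded into one atomic statement,
-- so the hits value is never hard-coded by the application
WITH nextKeyword AS
(
    SELECT TOP 1 keywordID, keyword, hits, photo, feed
    FROM dbo.XXXX
    WHERE hits > 0
    ORDER BY hits
)
UPDATE nextKeyword
SET hits = hits + 1
OUTPUT inserted.keywordID, inserted.keyword, inserted.hits, inserted.photo, inserted.feed;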
Regards,
Joe
View 6 Replies
Jun 3, 2003
hello,
I am getting a problem: suddenly the DB in SQL 2000 has slowed down drastically,
and when I have checked the logs the error is:
SuperSocket info: [SpnRegister]: Error 1355
Please help me.
Pavan
pavan
View 2 Replies
View Related
Dec 4, 2006
I'm running into an issue when developing a CLR project. I open the project to make changes and apparently it hoses up the SQL Server that I'm connecting to. I'm opening the CLR project from a mapped drive on another computer. I can't figure out why just opening the project causes the slowdown. Any ideas?
Thanks,
Josh
View 1 Replies
View Related
Feb 11, 2004
Is there any way to use a graphical designer to build your INSERT and UPDATE SQL statements in Enterprise Manager? I mean, Access has an EASY way to build them; surely SQL Server does too?
I would just build them in Access and copy the SQL, but then I'm stuck replacing all the "dbo_" with "dbo." and other little nuances.
View 3 Replies
View Related
Mar 21, 2004
I have a page that will require several hundred update queries to be sent to the database. How much of a performance increase will I get by joining them all into one statement and sending them as a batch instead of running them one by one?
Thanks.
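For what it's worth, a minimal sketch of what "sending them as a batch" means here, using a hypothetical Widgets table: the statements still run one after another on the server, but the client pays the command/round-trip overhead only once.

-- Hypothetical example: several UPDATEs submitted as a single batch (one command, one round trip)
UPDATE dbo.Widgets SET Price = 9.99  WHERE WidgetID = 1;
UPDATE dbo.Widgets SET Price = 14.50 WHERE WidgetID = 2;
UPDATE dbo.Widgets SET Price = 3.25  WHERE WidgetID = 3;
-- ...and so on for the remaining rows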
View 5 Replies
View Related
Jul 23, 2005
In Access, if I want to update one table with information from another, all I need to do is to create an Update query with the two tables, link the primary keys and reference the source table(s)/column(s) with the destination table(s)/column(s). How do I achieve the same thing in SQL?
Regards
Colin
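For reference, the usual T-SQL shape of such an update (a minimal sketch; the table and column names here are made up):

UPDATE d
SET    d.SomeColumn = s.SomeColumn
FROM   dbo.Destination AS d
       INNER JOIN dbo.Source AS s
           ON s.PrimaryKey = d.PrimaryKey;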
View 3 Replies
View Related
May 29, 2001
We have a process here where a ~45GB database is copied from DB1 to DB2 (both on the same server) using the Export Data wizard and choosing "Transfer objects and data between SQL Server 7.0 databases" - all the defaults are left as is.
Under 6.5 this method took around 24 - 26 hours; now with SQL 7.0 (SP3) it takes about 16 hours. That's great, except towards the end of the copy it seems to be hogging most of the server resources - although the processors aren't maxed out.
Users of the other databases are having queries take 5+ minutes when they normally come back in < 5 seconds. As soon as the copy finishes, their query times return to normal.
Anyone have any ideas on where to start looking or how to analyse this problem?
Thanks in advance.
Rob Elmes
View 2 Replies
View Related
Mar 14, 2007
hi guys,
the following test script works fine and displays a list of cars from the fairly small database, but if I specify the sort order in the querystring, the page takes ages to display and usually times out. Can someone look over it please and tell me where I can fine-tune it for performance or remove redundant code?
thanks
M
<%@LANGUAGE="VBSCRIPT" CODEPAGE="1252"%>
<%
Dim oRS,oConn,strOrder,strSQL
Set oRS = Server.CreateObject("ADODB.Recordset")
Set oConn = Server.CreateObject("ADODB.Connection")
'next, a couple of test lines to prevent timeout (seems to have no effect)
oConn.CommandTimeout = 0
Server.ScriptTimeout = 0
strOrder = Request.QueryString("Order") 'no Set here: strOrder holds a string, not an object
oConn.ConnectionString = "Provider=MSDASQL;DRIVER=SQL Server;SERVER=address;UID=userID;PWD=password;DATABASE=name"
oConn.Open
%>
</head>
<body>
<%
strSQL = "Select make,model,price from vehicles where cat = 'car' AND active = 'yes'"
if strOrder <> "" then
strSQL = strSQL & " ORDER BY " & strOrder
end if
oRS.Open strSQL, oConn, 2, 3
oRS.moveFirst
Do while not oRS.eof
make = oRS("make")
model = oRS("model")
price = oRS("price")
%>
<%=make%> <%=model%> <%=price%><BR>
<%
oRS.MoveNext
loop
oRS.close
set oRS= nothing
oConn.close
set oConn=nothing
%>
</body>
</html>
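On the SQL side, one hedged way to avoid concatenating the querystring into ORDER BY (a SQL injection vector, and worth tightening regardless of the timeout) is to pass the sort column as a parameter and map it with CASE. A sketch only, assuming the same vehicles table; the procedure name and parameter are hypothetical:

-- Sketch: @SortOrder is validated by the CASE expressions themselves;
-- anything unrecognised falls back to sorting by make.
CREATE PROCEDURE dbo.GetCars
    @SortOrder varchar(20) = NULL
AS
SELECT make, model, price
FROM   dbo.vehicles
WHERE  cat = 'car' AND active = 'yes'
ORDER BY
    CASE WHEN @SortOrder = 'price' THEN price END,
    CASE WHEN @SortOrder = 'model' THEN model END,
    make;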
View 14 Replies
View Related
Jul 20, 2005
I've been doing some experiments with speeding up copying tables of approximately 1 million rows between databases using BCP and BULK INSERT. I noticed that the total time for removing the indexes (non-clustered) and then recreating them after the BULK INSERT was significantly less than just doing the BULK INSERT with the indexes left there, even though I specified TABLOCK.
I would have expected SQL Server not to update the index until the insert completed (given the table lock), and so removing the indexes would have no effect. Can anyone explain why removing the indexes should speed it up? This is on SQL Server 7.
Cheers
Dave
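For readers comparing the same thing, a hedged sketch of the two-step sequence being described (all object and file names are hypothetical, and the exact BULK INSERT options depend on the data file):

-- Drop the non-clustered index, load, then rebuild it afterwards (SQL Server 7 syntax)
DROP INDEX BigTable.IX_BigTable_CustomerID;

BULK INSERT dbo.BigTable
FROM '\\fileserver\share\bigtable.dat'
WITH (TABLOCK);

CREATE NONCLUSTERED INDEX IX_BigTable_CustomerID
    ON dbo.BigTable (CustomerID);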
View 2 Replies
View Related
Oct 2, 2007
We're having problems with our SQL 2000 SP3 Standard server (on Win 2k3). Our quasi-data-warehouse application does data processing on feeds of approximately 7 million records. Once the data is loaded, queries against that data and updates against large tables will often wait with PAGEIOLATCH_SH contention. To give an example: over 7 million rows, I was testing something out in development and issued UPDATE <Table> SET <VarcharColumn> = null, which took forever (over an hour) with the PAGEIOLATCH problem... meanwhile, someone else using another database was completely blocked from making an update during that time. It seems like something is very wrong.
The server has 4 drives spinning at 15k rpm with some sort of a high end raid controller, so I'm sure it's not a slow i/o subsystem.
Has anyone experienced this behavior before? Is this an issue that's resolved in 2000 SP4?
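One commonly suggested mitigation for this kind of long single-statement update (offered only as a sketch; the table and column names are placeholders, and it does not answer the underlying I/O question) is to run it in smaller batches so each transaction, and each stretch of latch waits, stays short. SET ROWCOUNT still limits UPDATE on SQL Server 2000:

SET ROWCOUNT 50000;
WHILE 1 = 1
BEGIN
    UPDATE dbo.BigFeedTable
    SET    VarcharColumn = NULL
    WHERE  VarcharColumn IS NOT NULL;   -- ensures each pass makes progress

    IF @@ROWCOUNT = 0 BREAK;
END
SET ROWCOUNT 0;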
Thanks in advance,
-Mark
View 3 Replies
View Related
Jan 22, 2008
Hello there,
I have a SQL Server 2000 with a dozen of databases. The databases are rather small (all sum up to 10 GB).
The entire server gets extremely slow from time to time (lasting a few days when it happens and suddenly coming back to normality). A profiler trace doesn't show anything strange (besides a lot of SQL Agent entries).
I pretty much tried to isolate every single application that makes use of the databases in that server and see if it was the cause of the problem, but I couldn't find any correlation.
I know the computer where this server runs is quite fragmented. Is there any way I could make sure this is the cause of my performance issues?
I don't know if this might help, but once the server simply went down for some 3 hours, and I wasn't able to bring it up in any way. It eventually started working again by itself. The only thing I did in the meantime was to run DBCC CHECKDB a few times, always getting the response "No errors found on the database".
Any hint on that?
View 3 Replies
View Related
Jul 28, 1999
I'm having problems executing TOP n queries on a database that was migrated from 6.5 to 7. I can get it to work on the Authors table in pubs, not in my other dbs. Here is an example:
CREATE TABLE dbo.tblsapParentCust (
Parent char (10) NOT NULL ,
Name varchar (40) NULL,
IsSoldTo bit NOT NULL DEFAULT 0,
CONSTRAINT PK_tblsapParentCust PRIMARY KEY CLUSTERED
(Parent)
)
GO
<load in some data>
SELECT TOP 10 * FROM tblsapParentCust
The select statement results in a syntax error:
Server: Msg 170, Level 15, State 1, Line 1
Line 1: Incorrect syntax near '10'.
I can switch over to pubs and change the query to reference the Authors table, and it runs fine.
If anyone can explain this behavior to me, I would appreciate it.
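One thing worth checking (an assumption, not a confirmed diagnosis): a database migrated from 6.5 often keeps compatibility level 65, and TOP is only accepted at level 70 or higher. A sketch, with a placeholder database name:

EXEC sp_dbcmptlevel 'YourMigratedDb';       -- reports the current compatibility level
EXEC sp_dbcmptlevel 'YourMigratedDb', 70;   -- switches the database to SQL Server 7.0 behaviour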
Thanks,
Buddy
View 1 Replies
View Related
May 22, 2008
Hi,
Trying to update a single value within a table, thus eliminating NULLs. In other words, if the value is NULL, update it with the nearest preceding non-null value (or, for leading NULLs, the next one that follows). In this example, 1 should be CO, 2 should be CO, 6 should be CO, 8 should be TT, and 10 should be TT.
For example,
1 NULL
2 NULL
3 CO
4 CO
5 CO
6 NULL
7 TT
8 NULL
9 TT
10 NULL
Any ideas? Thanks.
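A hedged sketch of one way to do it, against a hypothetical table #demo(id int, val char(2)) mirroring the sample data (the real table and column names are not given in the post):

CREATE TABLE #demo (id int PRIMARY KEY, val char(2) NULL);
INSERT #demo (id, val)
SELECT 1, NULL UNION ALL SELECT 2, NULL UNION ALL SELECT 3, 'CO' UNION ALL
SELECT 4, 'CO' UNION ALL SELECT 5, 'CO' UNION ALL SELECT 6, NULL UNION ALL
SELECT 7, 'TT' UNION ALL SELECT 8, NULL UNION ALL SELECT 9, 'TT' UNION ALL SELECT 10, NULL;

UPDATE t
SET val = COALESCE(
        (SELECT TOP 1 p.val FROM #demo p
         WHERE p.id < t.id AND p.val IS NOT NULL
         ORDER BY p.id DESC),                  -- nearest preceding non-NULL value
        (SELECT TOP 1 n.val FROM #demo n
         WHERE n.id > t.id AND n.val IS NOT NULL
         ORDER BY n.id))                       -- leading NULLs (rows 1 and 2) fall back to the next value
FROM #demo t
WHERE t.val IS NULL;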
View 2 Replies
View Related
Jul 23, 2005
I'm new to ADP with SQL Server, but I have to use it on a project I'm doing... One of the MUSTS for this project is the ability to update a 00 - 09 text value with the appropriate text description from another table... Easy as pie in .mdb. Of course, in the stored procedure it barks at me and tells me that an update query can only have one table... ouch, that hurts... I'm currently reading on the subject, but this group has been very helpful in the past...
I found this link: http://www.sqlservercentral.com/col...stheeasyway.asp
Unfortunately I'm using MSDE, not Enterprise, so I don't think I can use the Query Analyzer... but I tried it in my Access ADP anyway and it barked at me.
I tried to go from this:
SELECT dbo.LU_SEX.SEX_CODE, dbo.TEST.DEFECTS_DP1
FROM dbo.TEST INNER JOIN dbo.LU_SEX ON dbo.TEST.SEX_DP1 = dbo.LU_SEX.SEX_DEC
To this:
UPDATE dbo.TEST.SEX_DP1
SET dbo.TEST.SEX_DP1 = dbo.LU_SEX.SEX_CODE
FROM dbo.LU_SEX INNER JOIN dbo.TEST ON dbo.LU_SEX.SEX_DEC = dbo.TEST.SEX_DP1
Maybe I need a good book on this?
Thanks,
Charles
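For reference, the usual T-SQL shape of that statement names only the target table after UPDATE, not the column (a sketch only, untested against the poster's actual schema):

UPDATE dbo.TEST
SET    SEX_DP1 = dbo.LU_SEX.SEX_CODE
FROM   dbo.LU_SEX
       INNER JOIN dbo.TEST
           ON dbo.LU_SEX.SEX_DEC = dbo.TEST.SEX_DP1;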
View 2 Replies
View Related
Jan 28, 2005
Hi,
In my database I have a field with the type decimal. In the select query I want to return true if this field is smaller than 1 and false if this field is 1. How can I do this?
I need something like that:
Select id, name, (mydecimalField < 1) from mytable
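A hedged sketch of the usual workaround: T-SQL has no boolean expression in a select list, so a CASE returning a bit is the standard stand-in (column and table names taken from the post):

SELECT id,
       name,
       CASE WHEN mydecimalField < 1 THEN CAST(1 AS bit) ELSE CAST(0 AS bit) END AS IsLessThanOne
FROM   mytable;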
KaaN
View 2 Replies
View Related
Oct 8, 2014
I have 4 tables which I am extracting 2 distinct values;
Select Distinct UniId, PID from dbo.Event1
Select Distinct UniId, PID from dbo.Event2
Select Distinct UniId, PID from dbo.Event3
Select Distinct UniId, PID from dbo.Event4
Then, I want to select Distinct between these 4 tables (Event1, Event2, Event3 and Event4)
Then insert the distinct records of the 4 tables to the final table - dbo.EventLookup .
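A minimal sketch of that, assuming dbo.EventLookup has matching UniId and PID columns: UNION (without ALL) already removes duplicates across the four sources, so no separate DISTINCT step is needed.

INSERT INTO dbo.EventLookup (UniId, PID)
SELECT UniId, PID FROM dbo.Event1
UNION
SELECT UniId, PID FROM dbo.Event2
UNION
SELECT UniId, PID FROM dbo.Event3
UNION
SELECT UniId, PID FROM dbo.Event4;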
View 2 Replies
View Related
May 5, 2008
I need to create a temporary table using dynamic queries, and then I have to use the temporary table for data manipulation.
Can someone help me out with this?
e.g.
sp_executesql N'Select top 1 * into #tmp from table1'
select * from #tmp
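The usual snag with that exact example is scope: a #temp table created inside the dynamic batch is dropped as soon as sp_executesql returns. A hedged sketch of the common workaround, creating the table in the outer scope and only populating it dynamically (the column list is hypothetical):

CREATE TABLE #tmp (col1 int, col2 varchar(50));

EXEC sp_executesql N'INSERT INTO #tmp (col1, col2) SELECT TOP 1 col1, col2 FROM table1';

SELECT * FROM #tmp;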
View 5 Replies
View Related
Jul 20, 2005
Help, please. I am trying to update a table with this structure:
CREATE TABLE Queue (PropID int, EffDate smalldatetime, TxnAmt int)
INSERT Queue (PropID) SELECT 1
INSERT Queue (PropID) SELECT 2
INSERT Queue (PropID) SELECT 3
...from this table...
CREATE TABLE Txns (PropID int, TxnDate smalldatetime, TxnType char(1), TxnAmt int)
INSERT Txns SELECT 1, '20000201', 'B', 100000
INSERT Txns SELECT 1, '20020515', 'B', 110000
INSERT Txns SELECT 1, '20020515', 'A', 120000
INSERT Txns SELECT 1, '20020615', 'C', 130000
...only certain txn types are okay, and they have an order of preference...
CREATE TABLE GoodTxnTypes (GoodTxnType char(1), Pref int)
INSERT GoodTxnTypes SELECT 'A', 1
INSERT GoodTxnTypes SELECT 'B', 2
The idea is to fill in the NULL fields in the Queue table, according to a rule -- the transaction must be the latest transaction within a date window, it must be one of the good txn types, and if there are two txns on that date, choose the txn by the preferred txn type (A is preferred over B, according to the field Pref).
If the time window were 20020101 to 20030101, the txn selected to update the Queue table would be this one:
INSERT Txns SELECT 1, '20020515', 'A', 120000 -- there are two in the time window that are type A or B; they are both on the same day, so the 'A' is preferred.
If the time window were 20000101 to 20010101, this would be selected because it is the only A or B type txn in the interval:
INSERT Txns SELECT 1, '20000201', 'B', 100000
I'm looking for a statement that starts...
UPDATE Queue SET EffDate = ..., TxnAmt = ... (EffDate, in this table, is the same as TxnDate in the Txn table). Assume we have @FirstDate and @LastDate available.
Help, please. I'm getting stuck with (a) a sub-query to find the relevant Txn records, and (b) another sub-query within that to find the MAX(TxnDate) within the time window. Filtering the Txn records on the basis of the GoodTxnTypes table is easy, as is ordering what is returned. But I'm having trouble joining the sub-queries back to the Queue table on the basis of PropId.
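A possible shape for that statement, offered only as an untested sketch built from the definitions above (latest good-type transaction in the window per PropID, ties on the date broken by the lowest Pref):

DECLARE @FirstDate smalldatetime, @LastDate smalldatetime;
SELECT @FirstDate = '20020101', @LastDate = '20030101';

UPDATE q
SET    EffDate = t.TxnDate,
       TxnAmt  = t.TxnAmt
FROM   Queue q
       JOIN Txns t         ON t.PropID = q.PropID
       JOIN GoodTxnTypes g ON g.GoodTxnType = t.TxnType
WHERE  t.TxnDate BETWEEN @FirstDate AND @LastDate
  AND  t.TxnDate = (SELECT MAX(t2.TxnDate)
                    FROM   Txns t2
                           JOIN GoodTxnTypes g2 ON g2.GoodTxnType = t2.TxnType
                    WHERE  t2.PropID = q.PropID
                      AND  t2.TxnDate BETWEEN @FirstDate AND @LastDate)
  AND  g.Pref = (SELECT MIN(g3.Pref)
                 FROM   Txns t3
                        JOIN GoodTxnTypes g3 ON g3.GoodTxnType = t3.TxnType
                 WHERE  t3.PropID  = q.PropID
                   AND  t3.TxnDate = t.TxnDate);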
View 1 Replies
View Related
Sep 8, 2007
I'm writing an application for Windows Mobile 5 / Pocket PC using VB.NET 2005. The database is connected using an instance of SqlCeConnection and updated by an SqlCeCommand.
The application can perform select queries on data originally entered into the database through Visual Studio, or perform update / insert queries at run time. Anything inserted or updated can be returned by a select query whilst the application is running; however, anything I have inserted or updated doesn't appear to be written to the SDF file and hence is not in the database after restarting the application.
Am I missing something that's different between performing queries on an SQL CE database on Pocket PC and an ODBC source in a normal Windows application?
View 13 Replies
View Related
Jan 15, 2008
Hello:
Interestingly enough, I haven't come across this before. I have a SQL stored procedure which takes four parameters: periodstartdate (datetime), periodenddate (datetime), hsgradyearstart (int), hsgradyearend (int).
[dbo].[CalculateActivityTotal]
-- Add the parameters for the stored procedure here
@periodstartdate datetime = '2007-01-01',
@periodenddate datetime = '2007-01-08',
@hsgradyearstart int = 1900,
@hsgradyearend int = 2007
AS...
If I run the stored procedure and pass the parameters using EXEC or
sp_executesql "CalculateActivityTotal '2008-01-04 12:00:00', '2008-01-11 12:00:00', 1900, 2008"
the stored proc takes well over ten minutes to run (it does a bunch of aggregation). If I modify the stored procedure to take no parameters, however, and hardcode the dates in the stored proc using DECLARE and SET, then it runs in 13 seconds. What could be causing my problem, and how can I go about resolving this? I need to pass the parameters via the report server. Thanks!
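The symptom (fast with hard-coded dates, slow with parameters) matches classic parameter sniffing, though that is an inference rather than something shown in the post. A sketch of the usual local-variable workaround, keeping the existing signature:

ALTER PROCEDURE [dbo].[CalculateActivityTotal]
    @periodstartdate datetime = '2007-01-01',
    @periodenddate   datetime = '2007-01-08',
    @hsgradyearstart int = 1900,
    @hsgradyearend   int = 2007
AS
BEGIN
    -- Copy the parameters into locals and reference only the locals in the queries,
    -- so the plan is built for "unknown" values instead of the first sniffed ones.
    DECLARE @startdate datetime, @enddate datetime, @yearstart int, @yearend int;
    SELECT @startdate = @periodstartdate,
           @enddate   = @periodenddate,
           @yearstart = @hsgradyearstart,
           @yearend   = @hsgradyearend;

    -- ...the original aggregation queries, rewritten to use the local variables...
END
-- On SQL Server 2005, adding OPTION (RECOMPILE) to the expensive statements is another common alternative.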
View 2 Replies
View Related
Oct 15, 2004
Currently I have huge amounts of data going into a table.
I'm sending an XML doc and using OPENXML with a cursor to seed them.
The question I have is whether to let duplicate-keyed data rows bounce, check @@ERROR, and then do an update on the non-keyed field, or to do a select on the keyed field and then do an insert or update based on the select's results.
Speed is my goal.
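A third variant that often gets suggested for this (a sketch only; the key and column names are placeholders): try the UPDATE first and only INSERT when nothing was touched, which avoids both the error-and-retry path and the extra SELECT probe.

DECLARE @key int, @value varchar(100);
SELECT @key = 42, @value = 'example';

UPDATE dbo.Target
SET    NonKeyColumn = @value
WHERE  KeyColumn = @key;

IF @@ROWCOUNT = 0
    INSERT INTO dbo.Target (KeyColumn, NonKeyColumn)
    VALUES (@key, @value);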
View 3 Replies
View Related
Feb 6, 2007
Rather than posting twice, I thought I would put both issues I'm having in one. Our server is Windows Server 2003 and we're running SQL Server 2005.
The first issue is this: we have several databases and I have scheduled their backups to run nightly, which works just fine. A couple of weeks ago, one database's .bak file grew from about 500 MB to 2 GB overnight. Then, just a few days ago, it went from 2 GB to 3.5 GB. There is nothing unusual going on in the live db that would warrant such an increase in the .bak file. All the dbs are in the same backup job schedule, but this is the only one affected. Additionally, I had autogrowth enabled on all the dbs but today disabled it for this particular db. Any ideas?
The second issue is my tempdb.mdf file on my C drive. It will go from just a few hundred KB's to 4.5GB overnight consuming most of what is left on my C drive. I'm afraid I'm in for a system crash if it continues. I have to stop SQL Server and restart it to clear the size. Is there a way to move the location of the tempdb.mdf file to my F drive?
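Moving tempdb is done with ALTER DATABASE ... MODIFY FILE and takes effect at the next service restart. A sketch; the logical names below (tempdev, templog) are the defaults and the F: paths are only an example:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'F:\SQLData\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'F:\SQLData\templog.ldf');
-- The files are created at the new location the next time the SQL Server service starts.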
I don't know if these two issues are related or not but certainly would like to hear from someone.
Sorry, in advance, for the large post.
Dave
View 20 Replies
View Related
Sep 6, 2005
Hello, I have a huge database (2 GB / month) and after a while it is becoming non-operational (time-outs, etc.). So I have written a SQL statement (a delete) that can reduce around 60% of the db size without compromising the application's data needs. The problem is that when I execute it, the db does reduce its size by 60%, but the transaction log increases at the same rate. Can I execute the statement in a "commit" or "transaction" mode so as to prevent SQL Server from writing to the log?
Thanks for the help!
Antonio
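SQL Server always logs the delete itself, but a common way to keep the log from growing by the full amount in one go is to delete in batches and let log space be reused between them. A sketch only; the table, column, and backup path are placeholders, and the BACKUP LOG step assumes the database is not in SIMPLE recovery (under SIMPLE, a CHECKPOINT between batches plays the same role):

DECLARE @purgeBefore datetime;
SET @purgeBefore = '20050101';

SET ROWCOUNT 10000;
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigHistoryTable
    WHERE  CreatedDate < @purgeBefore;

    IF @@ROWCOUNT = 0 BREAK;

    BACKUP LOG MyBigDb TO DISK = 'D:\Backups\MyBigDb_log.trn';
END
SET ROWCOUNT 0;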
View 6 Replies
View Related
Feb 6, 2006
Hi,
I wanted to know a query which will create a final result table from a combination of select queries.
The select query is like :
1. select col1 , col2 , null from table1
2. select null , col2 , null from table2
3. select null , null , col3 from table3
NULLs are inserted because I want a single select query that will merge all the columns from all the tables and finally create a result table.
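A minimal sketch of one way to do that (the result table name is hypothetical; column and table names otherwise follow the post). The INTO goes on the first SELECT, and NULL placeholders keep the column counts aligned:

SELECT col1, col2, NULL AS col3
INTO   dbo.ResultTable
FROM   table1
UNION ALL
SELECT NULL, col2, NULL FROM table2
UNION ALL
SELECT NULL, NULL, col3 FROM table3;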
Thanks in advance.
View 1 Replies
View Related
Mar 19, 2007
I wrote a simple select query that counts the number of records I have in certain zip codes. How can I get a total of the "count" column at the bottom of the results? For example, my results may look like this:
ZIP | (no column name for "count")
_____________________________________
89502 | 10
89509 | 15
89521 | 25
What statement would I use to get the total of 50 displayed in the results? Thank you in advance.
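A hedged sketch, assuming the counts come from a simple GROUP BY on the zip column (the table name is made up): WITH ROLLUP adds a grand-total row, shown with a NULL zip, at the end of the results.

SELECT ZIP, COUNT(*) AS RecordCount
FROM   dbo.Addresses
GROUP BY ZIP WITH ROLLUP;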
-Lance
View 6 Replies
View Related
Jan 24, 2008
The server being used is an Intel Xeon E5310 Clovertown 1.6GHz 2 x 4MB L2 Cache Socket 771 80W Quad-Core 2U Passive Processor.
The problem is that this server is slowing down every time about 1000 users log into a forum which the server is running. I think that the server should be able to handle this many users with no problems, but I am not sure if that is the case. The problem probably has something to do with the SQL on the server, I am guessing. The server is not mine, but I want to help the owner of the server as well as the users who are trying to access this forum but can't because of this server issue. If I was able to get the SQL, would I be able to fix this problem? I doubt you need this, but the server URL is www.smashboards.com
I am fairly new to servers and have never really set one up myself yet. Forgive me for my lack of knowledge about them.
-Thank you!
View 1 Replies
View Related
Mar 3, 2000
When updating large sets a row at a time, the performance is lacking in comparison to 6.5. When using PeopleSoft, which uses cursors with a begin transaction, a loop inside, and a commit after the loop completes, SQL 6.5 with page locking could handle a 300,000-row transaction in 3-4 hours. 7.0 took 17.5 hours. The difference is that 6.5 used 50,000 locks and 7.0 used 300,000 locks.
Does anybody have solution short of rewriting PeopleSoft ?
View 2 Replies
View Related
Apr 5, 2006
Hi Everyone,
We have a large and active MSSQL 2000 database. Recently, after a rebuild of the server, we had a problem with the SQL service SQLSERVERAGENT. The service could not start as the service account lost local permission to the registry. During this time, all of the data being sent to the database from our application accumulated in the database .ldf file. By the time we were able to get the service restarted, our .ldf file was approx. 28 GB. When the service restarted, the .ldf file shrank down to its regular size, about 40 MB, and the .trn tlog backup file grew to 28 GB for that specific period (new file every hour).
The problem is, the database file (database.mdf) stayed about the same size as it was before the service was restarted. When the .ldf contents transferred to the .trn, none of the 28 GB of data got stored in the database. What does this mean? Perhaps with the service stopped, the application using the db saw problems and did not commit the data, making it all useless? Or is it possible that the data in the .trn log just needs to be forced to commit to the .mdf?
Is there any way to verify the data in the 28 GB .trn file and figure out if we should get it stored to the database? If yes, how would we go about verifying it, and after that, how would we force it to commit to the .mdf file? Am I on the right track here, or is it not as I see it?
Thanks!
Mike
View 4 Replies
View Related
Jul 2, 2006
Hi
Would you say that it's OK for a web site's code to make ALL of its access to a db through SPs and views? And I mean everything, including inserting new records and updating others, with no inline SQL in the code.
The advantage would be very strict control over the access, but in order to achieve this it would take many, many SPs and views to cover all types of actions. Can you think of a disadvantage besides all the work of creating those SPs? What about the server resources and performance - how demanding would it be?
Thanks,
Inon.
View 7 Replies
View Related
Jan 3, 2006
Hello,
I thought this was a neat solution I came up with, but I'm sure it's been thought of before. Anyway, it's my first post here.
We have a process for importing data which generates a SELECT statement based on the user's stored configuration. Since the resulting SELECT statement can be massive, it's created and stored in a text field in a temp table.
So how do I run this huge query after creating it? In my tests, I was getting a datalength > 20000, requiring 3 varchar(8000) variables in order to use the execute command. Thing is, I don't know how big it could possibly get; I wanted to be able to execute it regardless.
Here's what I came up with, it's very simple:
Table is named #IMPORTQUERY, one field SQLTEXT of type TEXT.
>>
declare @x int, @s varchar(8000)
select @x = datalength(sqltext) / 8000 + 1, @s = 'execute('''')' from #importquery
while @x > 0
select @s = 'declare @s' + cast(@x as varchar) + ' varchar(8000) ' +
'select @s' + cast(@x as varchar) +
'=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery ' +
replace(@s,'execute(','execute(@s' + cast(@x as varchar) + '+')
, @x = @x - 1
set @s = 'declare @x int set @x=1 ' + @s
execute(@s)
<<
At the end, I execute the "@s" variable, which is SQL that builds and executes the massive query. Here's what @s looks like at the end:
>>
declare @x int set @x=1
declare @s1 varchar(8000)
select @s1=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s2 varchar(8000)
select @s2=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s3 varchar(8000)
select @s3=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
execute(@s1+@s2+@s3+'')
<<
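For what it's worth, on SQL Server 2005 or later (an assumption; the post is built around a TEXT column and the varchar(8000) limit) the chunking becomes unnecessary, since varchar(max) removes the 8000-character ceiling:

DECLARE @sql varchar(max);
SELECT @sql = CAST(SQLTEXT AS varchar(max)) FROM #IMPORTQUERY;
EXEC (@sql);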
View 4 Replies
View Related
Jul 23, 2005
Our database server has started acting weird, and at this point I'm either too sleep-deprived or too close to the problem to adequately diagnose the issue. Basically, to put it simply... when I look at the read disk queue length, the disk queues are astronomical.
Normally we're seeing a disk queue length of 0-1 on the disks that contain the DB data and index (i.e. non-clustered indexes are on a disk of their own). Writes are just fine.
Problem is, all our databases are on the same drive, and I can't seem to nail down which DB, let alone which table, is the source of all our reads.
Now, to really make things weirder... during the busier times of the day today (say 1:00 PM to 4:00 PM) things were fine. At 4:20 PM or so it was like someone hit a switch and read disk queue length jumped from 0-1 up to 100-200+... with spikes up to 1500 for a split second or so.
What's the best way folks know to nail this down? Thanks.
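One hedged way to narrow down which database the reads are hitting on SQL Server 2000 is fn_virtualfilestats; the database name and file id below are placeholders, and the numbers need sampling before and after a spike to be meaningful:

DECLARE @dbid int, @fileid int;
SELECT @dbid = DB_ID('SuspectDb'), @fileid = 1;

SELECT DbId, FileId, NumberReads, BytesRead, IoStallMS
FROM   ::fn_virtualfilestats(@dbid, @fileid);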
View 9 Replies
View Related
Dec 29, 2007
If I remove the TOP 200, this query returns about 2.5 million rows. It combines a lot of records and turns them into much more programmer-friendly results. The query has slowed down from 2 seconds to about 13 seconds as the data has grown from about 10k rows to the current couple of million.
Code Block
SELECT TOP 200 *
FROM
(
SELECT
[UserProfile].[UserId]
,[aspnet_Users].[UserName]
,[City]
,[State]
,[RoleName]
,[ProfileItemType].[Name] AS pt_name
,[ProfileItem].[Value]
FROM
[UserCriteria]
,[aspnet_Users]
,[aspnet_Roles]
,[aspnet_UsersInRoles]
,[Location]
,[ProfileType]
,[ProfileTypeItem]
,[ProfileItem]
INNER JOIN [UserProfile]
ON [ProfileItem].[ProfileId] = [UserProfile].[ProfileId]
INNER JOIN [ProfileItemType]
ON [ProfileItem].[ProfileItemTypeId] = [ProfileItemType].[ProfileItemTypeId]
WHERE [UserProfile].[UserId] IN (
SELECT [UserCriteria].[UserId]
FROM [UserCriteria]
WHERE
Zipcode IN (
SELECT [Zipcode]
FROM [ZipcodeProximitySQR] ('89108' , 150))
)
AND [UserProfile].[UserId] = [aspnet_Users].[UserId]
AND [UserCriteria].[UserId] = [UserProfile].[UserId]
AND [Location].[Zipcode] = [UserCriteria].[Zipcode]
AND [aspnet_UsersInRoles].[UserId] = [aspnet_Users].[UserId]
AND [aspnet_UsersInRoles].[RoleId] = [aspnet_Roles].[RoleId]
) AS t
PIVOT
(
MIN([Value])
FOR pt_name IN ([field1],[field2],[field3],[field4])
) AS pvt
ORDER BY RoleName DESC, NEWID()
In the line FOR pt_name IN ([field1],[field2],[field3],[field4]) I changed the values from the long names to read field1, field2... because the real names were irrelevant but confusing.
Here is the showplan text
Code Block
|--Sequence
|--Table-valued function(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
|--Top(TOP EXPRESSION:((200)))
|--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId], [aous].[dbo].[aspnet_Users].[UserName], [aous].[dbo].[Location].[City], [aous].[dbo].[Location].[State], [aous].[dbo].[UserCriteria].[Birthdate], [aous].[dbo].[aspnet_Roles].[RoleName]) DEFINE:([Expr1039]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'height' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1040]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'bodyType' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1041]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'hairColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1042]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'eyeColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END)))
|--Nested Loops(Inner Join)
|--Nested Loops(Inner Join)
| |--Sort(ORDER BY:([aous].[dbo].[UserCriteria].[UserId] ASC, [aous].[dbo].[Location].[City] ASC, [aous].[dbo].[Location].[State] ASC, [aous].[dbo].[UserCriteria].[Birthdate] ASC, [aous].[dbo].[aspnet_Roles].[RoleName] ASC))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserCriteria].[Zipcode])=([Expr1043]), RESIDUAL:([Expr1043]=[aous].[dbo].[UserCriteria].[Zipcode]))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[ProfileItemType].[ProfileItemTypeId])=([aous].[dbo].[ProfileItem].[ProfileItemTypeId]))
| | | |--Index Scan(OBJECT:([aous].[dbo].[ProfileItemType].[ProfileTypes]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[ProfileId]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserProfile].[UserId])=([aous].[dbo].[aspnet_UsersInRoles].[UserId]), RESIDUAL:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[aspnet_UsersInRoles].[UserId]))
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | |--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | | |--Nested Loops(Left Semi Join, WHERE:([aous].[dbo].[UserCriteria].[Zipcode]=[Expr1044]))
| | | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId] < {guid'E3D72D56-731A-410E-BCB1-07A87A312137'} OR [aous].[dbo].[UserCriteria].[UserId] > {guid'E3D72D56-731A-410E-BCB1-07A87A312137'}), WHERE:([aous].[dbo].[UserCriteria].[Male]=(1) AND [aous].[dbo].[UserCriteria].[SeekingMale]=(0)) ORDERED FORWARD)
| | | | | | | | |--Compute Scalar(DEFINE:([Expr1044]=CONVERT_IMPLICIT(nvarchar(5),[aous].[dbo].[ZipcodeProximitySQR].[Zipcode],0)))
| | | | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
| | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserProfile].[UserProfileIds]), SEEK:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[UserCriteria].[UserId]) ORDERED FORWARD)
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[aspnet_Roles].[RoleId]))
| | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[aspnet_Roles].[aspnet_Roles_index1]))
| | | | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_UsersInRoles].[aspnet_UsersInRoles_index]), SEEK:([aous].[dbo].[aspnet_UsersInRoles].[RoleId]=[aous].[dbo].[aspnet_Roles].[RoleId]) ORDERED FORWARD)
| | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_Users].[_dta_index_aspnet_Users_5_37575172__K2_K1_K4_3]), SEEK:([aous].[dbo].[aspnet_Users].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | |--Index Seek(OBJECT:([aous].[dbo].[ProfileItem].[_dta_index_ProfileItem_5_1714105147__K2_K1_K3_4]), SEEK:([aous].[dbo].[ProfileItem].[ProfileId]=[aous].[dbo].[UserProfile].[ProfileId]) ORDERED FORWARD)
| | |--Compute Scalar(DEFINE:([Expr1043]=CONVERT_IMPLICIT(nchar(5),[aous].[dbo].[Location].[Zipcode],0)))
| | |--Index Scan(OBJECT:([aous].[dbo].[Location].[CityLocation]))
| |--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileType].[PKProfileTypeProfileTypeId]))
|--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileTypeItem].[ProfileTypeItem]))
Here is a link to the execution plan from Microsoft SQL Server management Studio.
http://epi.cc/BasicUserSearch.zip
There are no table scans, but the Hash Match from the inner join is pretty bad.
Can anyone give me a pointer or two?
View 1 Replies
View Related
Nov 8, 2006
Hi
Will somebody please explain how to combine ASP.NET dropdown lists to write a SQL database select query? I am using Visual Web Developer and C#.
For example, say I have 3 dropdownlists on my webpage as below,
List 1, Cities, London, Rome, Barcelona etc
List 2, Restaurants by Type, Italian, chinese, Indian etc
List 3, Number of tables/ seats 10-20, 20- 40, 50 -100
I want someone to be able to search for a restaurant by selecting an item from each dropdownlist
such as, "Barcelona" "Italian" "50-100"
This search query would return all the Italian restaurants in Barcelona with 50-100 tables/seats.
I would also like the select query to work even if one of the dropdown list items is not selected.
Hope somebody can clear this up?
Also would sql injection attacks be a threat by doing it this way?
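For reference, a hedged sketch of how this is often handled on the SQL side: a parameterized stored procedure where an unselected dropdown passes NULL. All procedure, table, and column names below are hypothetical. Because the values travel as parameters rather than concatenated text, these inputs are not a SQL injection vector.

CREATE PROCEDURE dbo.SearchRestaurants
    @City    nvarchar(50) = NULL,   -- pass NULL when the dropdown has no selection
    @Cuisine nvarchar(50) = NULL,
    @Seats   nvarchar(20) = NULL    -- e.g. '50-100'
AS
SELECT Name, City, Cuisine, SeatRange
FROM   dbo.Restaurants
WHERE  (@City    IS NULL OR City      = @City)
  AND  (@Cuisine IS NULL OR Cuisine   = @Cuisine)
  AND  (@Seats   IS NULL OR SeatRange = @Seats);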
Thanks all
View 9 Replies
View Related