Efficiency Of Query
Jul 20, 2005
I have the following 2 tables:
location:
place      lft   rgt
--------------------
Europe       0    99
England      1    10
France      11    20
Italy       21    30
Asia       100   199
London      12    12
staff:
name      locLft
----------------
Edwards      0
Smith        1
Leveil      11
Rossi       21
Lee         12
Chan       100
location uses the Celko hierarchy model.
I wish to retrieve for a location the names of all staff within it and
the hierarchy of place associated with that member of staff, eg a
query for Europe should return all staff in Europe, and for Lee I wish
to return Lee-London, Lee-England, Lee-Europe etc.
I can achieve this using a subquery, ie
SELECT name, place
FROM staff, location
WHERE name IN (SELECT name
FROM staff, location
WHERE place='Europe' And locLft>=location.lft And
locLft<=location.rgt)
AND locLft>=lft AND locLft<=rgt
But is this the most efficient way of doing so?
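One alternative worth comparing is to drop the IN subquery and do the filtering and the ancestor lookup in a single statement with two joins against location. This is only a sketch against the tables above; the execution plans and the indexes on lft/rgt will decide which is actually faster:

SELECT s.name, anc.place
FROM staff AS s
     INNER JOIN location AS filt
             ON filt.place = 'Europe'
            AND s.locLft BETWEEN filt.lft AND filt.rgt   -- staff inside the chosen location
     INNER JOIN location AS anc
             ON s.locLft BETWEEN anc.lft AND anc.rgt     -- every location containing that staff member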
Thanks
View 1 Replies
Feb 21, 2015
Suppose I have two tables(Customer and Order) which are as follows:
Code:
Customer
customer_id
first_name
[Code]....
Another thing I am concerned about is that in the line INNER JOIN Order ON Customer.customer_id = Order.customer_id, I have written Customer.customer_id on the left-hand side. Is that correct, or should I write it on the right-hand side of the equals sign?
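For what it's worth, the two forms of the join condition are interchangeable; the equality is symmetric and the optimizer treats them the same. A minimal sketch (order_id is an assumed column used only for illustration, and Order is bracketed because it is a reserved word):

SELECT c.first_name, o.order_id
FROM Customer AS c
     INNER JOIN [Order] AS o
             ON c.customer_id = o.customer_id   -- identical in effect to: ON o.customer_id = c.customer_id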
View 3 Replies
View Related
Jul 7, 2001
Hi,
Could someone confirm that the following query:
update table set x=1, y=1, z=1 where a = 1 or b = 1 or c = 1 or d = 1
is more efficient than this query:
update table set x=1, y=1, z=1 where e <> 1 and f <> 1 and g <> 1
Thanks!
View 1 Replies
View Related
Aug 11, 2007
Hi,
Please have a look at the following two queries, the purpose of which is to find which ten users (represented by 'Username') have created the most records which contain the term 'foo':
SELECT TOP 10 Username, COUNT(*) AS [Count] FROM Options
WHERE FREETEXT(*, 'foo')
GROUP BY Username
ORDER BY [Count] DESC
SELECT TOP 10 Username, COUNT(*) AS [Count] FROM Options
JOIN FREETEXTTABLE (Options, *, 'foo', 500) ct
ON OptionID = ct.[KEY]
GROUP BY Username
ORDER BY [Count] DESC
They both produce the same result set. However, I am wondering which is more performant. At first glance, it would seem the first one would be. It doesn't involve a JOIN and should, therefore, be more efficient.
But this depends on how the FREETEXT expression is evaluated. My concern is that internally, SQL Server would generate an entire recordset based on 'WHERE FREETEXT(*, 'foo')', which could be thousands of records, and only then restrict this to the TOP 10 by COUNT.
If this does happen, then it would be better to join to a FREETEXTTABLE, where I can at least restrict the result set using the 'top_n_by_rank' parameter (which is set as '500' in this case, as this seems a good balance of performance against the likely number of duplicates I will get in my FREETEXTTABLE results).
So... am I worrying about this unnecessarily? Should I just use the simpler first version?
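One way to settle it rather than guess (a sketch to run on a test box): turn on the time and I/O statistics, run both versions, and compare elapsed time and logical reads, along with the actual execution plans.

SET STATISTICS TIME ON;
SET STATISTICS IO ON;
-- run version 1 here, then version 2, and compare the output
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;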
Any thoughts appreciated.
Thanks
View 3 Replies
View Related
Sep 20, 2006
I want to sum one field from a table, but only for rows that satisfy conditions referring to 5 tables, such as A.FIELD1 = B.FIELD1 AND B.FIELD2 = C.FIELD3 AND .... Should I use the flat form, "select sum(a.amount) from a, b, c, ... where a.field1 = b.field1 and b.field2 = c.field2 and ...", or nested subqueries, "select sum(a.amount) from (select b.field1 from (select c.field2 from ...))"? Which case is more efficient? Thanks!
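Generally the flat form is the one to prefer, and the optimizer will usually produce the same plan for an equivalent nested form anyway. Written with explicit join syntax it looks roughly like this (a sketch with placeholder tables a..e and columns, since the real names aren't shown):

SELECT SUM(a.amount)
FROM a
     INNER JOIN b ON a.field1 = b.field1
     INNER JOIN c ON b.field2 = c.field2
     INNER JOIN d ON c.field3 = d.field3
     INNER JOIN e ON d.field4 = e.field4;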
View 2 Replies
View Related
Jan 29, 2007
Hello, I am looking at optimizing site searching on a web application. I have two thoughts on the idea:
1. Create views with full-text indexes combining records from multiple tables.
2. Create a table with an xml column and a primary XML index. I understand the xml column type has the overhead of a BLOB under the hood, but that a primary XML index can "shred" the contents and improve parsing. I also read that the xml column is actually searched as a tree, providing some variant of log(n) run time.
Does anyone know of good literature on this subject? The more big-O notation, runtime-analysis types of posts the better.
Thanks
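For option 2, the setup is roughly the following (a sketch with hypothetical table and column names; a primary XML index requires the table to have a clustered primary key):

CREATE TABLE dbo.SearchDocs
(
    DocID  INT IDENTITY(1,1) PRIMARY KEY,   -- clustered key required by the XML index
    DocXml XML NOT NULL
);

CREATE PRIMARY XML INDEX PXML_SearchDocs_DocXml
    ON dbo.SearchDocs (DocXml);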
View 5 Replies
View Related
Jul 24, 2007
Hi guys,
Since the project I'm developing is rapidly growing, the pages seem to be getting slower every time you view them. I would like to ask if the code below would be efficient enough for several simultaneous requests of data; if you have any other suggestions, you are welcome to add them:
Public Shared Function QueryDatabase(ByVal sql As String) As DataTable

    ' SQL Server Connection Object Variable
    Dim _oConnection As SqlConnection
    ' SQL Server Command Object Variable
    Dim _oCommand As SqlCommand
    ' SQL Server Data Adapter Object Variable
    Dim _oAdapter As SqlDataAdapter
    ' DataTable Object Variable (Early Binding)
    Dim _oDataTable As New DataTable

    ' Instantiate Connection Object with connection string
    _oConnection = New SqlConnection("Data Source=XXX.XXX.XXX.XXX;Initial Catalog=XXXXXX;User=XXX;Pwd=XXX;")
    ' Instantiate Command Object with SQL String and Connection Object
    _oCommand = New SqlCommand(sql, _oConnection)
    ' Instantiate Data Adapter Object with Command Object
    _oAdapter = New SqlDataAdapter(_oCommand)
    ' Fill the DataTable Object with the retrieved records
    _oAdapter.Fill(_oDataTable)

    ' Release resources used by DataAdapter Object
    _oAdapter.Dispose()
    ' Release resources used by Command Object
    _oCommand.Dispose()
    ' Close the connection of the Connection Object from SQL Server
    _oConnection.Close()
    ' Release resources used by Connection Object
    _oConnection.Dispose()

    ' Return the retrieved records
    Return _oDataTable

End Function

Thanks a lot.
View 2 Replies
View Related
Nov 13, 2005
Hey, I am developing a website which will be used by a large number of people, so I am concerned about efficiency. Sorry for the three posts, but anyone with any info would be appreciated.
The database has the following tables, linked by one-to-many relationships: FACILITY, MEETING, USERS, MEETING_INVITE, REMINDER and CONTACTS.
When the user logs in I use their username to access the rest of the tables. I get all of the user's information out of the database in one go and store it in a dataset, so when a user accesses their meetings page, I pass the dataset to that page with a server transfer.
Question 1 > Is it more efficient to open the database once, read all the information and pass it between pages, or to access the database on the individual pages and avoid passing the information around?
In order to access the information I use 6 SELECT statements in a row. Here is an example of my select statements:
SELECT * FROM USERS WHERE email = textbox_email
SELECT FACILITY.* FROM FACILITY, USERS WHERE FACILITY.email = USERS.email AND USERS.email = textbox_email
By the time I get to the REMINDER table I am combining all the tables and my query is eight lines long.
Question 2 > Is there a way of combining the results of a previous select to access information?
Question 3 > What do you think of my table design? The links between the tables are one-to-many relationships. If you can give me any tips on databases please do.
Thanks for your time,
Padraic
View 2 Replies
View Related
Nov 21, 2005
Hello all,I am developing a website which may be used by a large number of people in the future and I am concerned about performance.
Is it better to have one table with 50,000 rows or 5,000 tables with 10 rows each?
Is there a way to divide a table in two if the table reaches a certain size?
Is there a limit on the size of tables?
Is there a limit on the number of tables?
Is it possible to create tables from vb.net?
Is it possible to program checks into sql server? For example, could I delete data that has passed a certain date or send an automated email when a time is reached?
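On the last question: yes, both are possible. A SQL Server Agent job can run checks like these on a schedule, and Database Mail (or SQL Mail on older versions) can send the email. A minimal sketch of the date-based cleanup, with hypothetical table and column names:

DELETE FROM dbo.MEETING
WHERE MeetingDate < DATEADD(day, -30, GETDATE());   -- purge meetings more than 30 days old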
Thanks for your time,Padraic
View 2 Replies
View Related
Sep 7, 2000
Hey people
I'd be really grateful if someone can help me with this. Could someone explain the following:
If the following code is executed, it runs instantly:
declare @SellItemID numeric (8,0)
select @SellItemID = 5296979
SELECT distinct s.sell_itm_id
FROM stor_sell_itm s
WHERE (s.sell_itm_id = @SellItemID )
However, if I use this WHERE clause instead -
WHERE (@SellItemID = 0 OR s.sell_itm_id = @SellItemID)
- it takes 70 microseconds. When I join a few more tables into the statement, the difference is 4 seconds!
This is an example of a technique I'm using in loads of places - I only want the statement to return all records if the filter is zero, otherwise the matching record only. I think that by checking the value of the variable in the WHERE clause, a table scan is used instead of an index. This seems nonsensical since the variable is effectively a constant. Wrapping the entire select statement in an IF or CASE works, but when I've got 10 filters I'd need 100 select statements.
I DON'T GET IT!! There must be a simple answer, HELP!!
Jo
PS this problem seems to occur both in 6.5 and 7.0
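One workaround that often keeps the index seek without duplicating the statement per filter is to fold the "zero means all" logic into a range the optimizer can still seek on. A sketch for this column (it relies on sell_itm_id being non-negative and NUMERIC(8,0), so 99999999 is the ceiling):

SELECT DISTINCT s.sell_itm_id
FROM stor_sell_itm s
WHERE s.sell_itm_id BETWEEN CASE WHEN @SellItemID = 0 THEN 0        ELSE @SellItemID END
                        AND CASE WHEN @SellItemID = 0 THEN 99999999 ELSE @SellItemID END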
View 1 Replies
View Related
Aug 31, 2004
Hi All,
I tried my luck in the Access forum and I've search the web and MSDN for an answer with little luck.
Simply, is it better to update a table via an UPDATE query or Recordset manipulation?
I have read that if you were to update 10,000 records an UPDATE query is more efficient (obviously), but does that still hold down at, say, 1-10 updates?
i.e. There are six unique updates I want to make to 6 different rows. Should I code the backend VB to execute 6 different queries or seek and update a recordset?
It's a MS Access XP app with ADO 2.8.
My gut feeling on this is that making 6 update queries is more efficient, both with system resources and record-locking issues; I'd just like another opinion on the matter.
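If the round trips themselves ever become the concern, the six statements can also be collapsed into a single UPDATE, for example with a CASE per row. A sketch with made-up table, column and key values, since the real ones aren't shown:

UPDATE tblOrders
SET    Status = CASE OrderID
                    WHEN 101 THEN 'Shipped'
                    WHEN 102 THEN 'Cancelled'
                    WHEN 103 THEN 'On Hold'
                    ELSE Status
                END
WHERE  OrderID IN (101, 102, 103);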
I appreciate your help!
Thanks,
Warren
View 2 Replies
View Related
Apr 8, 2008
I am using nested cursors in my script below, and wonder if there is a more efficient way please?
USE ar
GO
DECLARE @mortgage INT,
@mortgage_sequence int,
@getMortgage CURSOR,
@notes_1 varchar(MAX),
@notes_2 varchar(MAX),
@notes_3 varchar(MAX),
@notes_4 varchar(MAX),
@notes_5 varchar(MAX),
@notes_6 varchar(MAX),
@notes_7 varchar(MAX),
@notes_8 varchar(MAX),
@notes_9 varchar(MAX),
@notes_10 varchar(MAX),
@notes_11 varchar(MAX),
@notes_12 varchar(MAX),
@notesComplete varchar(MAX),
@addedUser varchar(255),
@addedDate varchar(255),
@amendedUser varchar(255),
@amendedDate varchar(255),
@sequence int,
@getDetail CURSOR
SET @getMortgage = CURSOR FOR
SELECT DISTINCT Mortgage_Number, Mortgage_Note_Sequence_No
FROM format_additional_notes
GROUP BY Mortgage_Number, Mortgage_Note_Sequence_No
ORDER BY Mortgage_Number ASC
OPEN @getMortgage
FETCH NEXT
FROM @getMortgage INTO @mortgage, @mortgage_sequence
WHILE @@FETCH_STATUS = 0
BEGIN
SET @getDetail = CURSOR FOR
SELECT ltrim(rtrim(Additional_Text_1)),
ltrim(rtrim(Additional_Text_2)),
ltrim(rtrim(Additional_Text_3)),
ltrim(rtrim(Additional_Text_4)),
ltrim(rtrim(Additional_Text_5)),
ltrim(rtrim(Additional_Text_6)),
ltrim(rtrim(Additional_Text_7)),
ltrim(rtrim(Additional_Text_8)),
ltrim(rtrim(Additional_Text_9)),
ltrim(rtrim(Additional_Text_10)),
ltrim(rtrim(Additional_Text_11)),
ltrim(rtrim(Additional_Text_12)),
Mortgage_Note_Sequence_No,
Extra_Added_by_User,
Extra_Added_on_Date,
Extra_Amended_By_User,
Extra_Amended_By_Date
FROM format_additional_notes
WHERE Mortgage_Number = @mortgage AND Mortgage_Note_Sequence_No = @mortgage_sequence
ORDER BY Mortgage_Note_Sequence_No
OPEN @getDetail
SET @notesComplete = ''
FETCH NEXT FROM @getDetail INTO @notes_1,
@notes_2,
@notes_3,
@notes_4,
@notes_5,
@notes_6,
@notes_7,
@notes_8,
@notes_9,
@notes_10,
@notes_11,
@notes_12,
@sequence,
@addedUser,
@addedDate,
@amendedUser,
@AmendedDate
WHILE (@@FETCH_STATUS = 0)
BEGIN
SET @notesComplete = @notesComplete +
ISNULL(@notes_1,'') + ' ' +
ISNULL(@notes_2,'') + ' ' +
ISNULL(@notes_3,'') + ' ' +
ISNULL(@notes_4,'') + ' ' +
ISNULL(@notes_5,'') + ' ' +
ISNULL(@notes_6,'') + ' ' +
ISNULL(@notes_7,'') + ' ' +
ISNULL(@notes_8,'') + ' ' +
ISNULL(@notes_9,'') + ' ' +
ISNULL(@notes_10,'') + ' ' +
ISNULL(@notes_11,'') + ' ' +
ISNULL(@notes_12,'')
FETCH NEXT FROM @getDetail INTO @notes_1,
@notes_2,
@notes_3,
@notes_4,
@notes_5,
@notes_6,
@notes_7,
@notes_8,
@notes_9,
@notes_10,
@notes_11,
@notes_12,
@sequence,
@addedUser,
@addedDate,
@amendedUser,
@AmendedDate
END
INSERT INTO format_additional_notes_1
(Mortgage_Number,
Mortgage_Note_Sequence_No,
Additional_Text,
Extra_Added_By_User,
Extra_Added_on_Date,
Extra_Amended_By_User,
Extra_Amended_By_Date)
VALUES
( @mortgage,
@sequence,
@notesComplete,
@addedUser,
@addedDate,
@amendedUser,
@amendedDate)
CLOSE @getDetail
DEALLOCATE @getDetail
FETCH NEXT
FROM @getMortgage INTO @mortgage, @mortgage_sequence
END
CLOSE @getMortgage
DEALLOCATE @getMortgage
GO
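For comparison, the whole thing can usually be done set-based, which tends to beat nested cursors by a wide margin on anything but tiny tables. The sketch below uses the same tables and the FOR XML PATH trick for the row-by-row concatenation (SQL Server 2005+); it assumes the order of concatenation within a Mortgage_Number / sequence group does not matter, and the MAX() calls simply pick one user/date value per group rather than whichever row the cursor happened to fetch last. Additional_Text_3 through Additional_Text_11 follow the same pattern and are elided to keep the sketch short.

INSERT INTO format_additional_notes_1
       (Mortgage_Number, Mortgage_Note_Sequence_No, Additional_Text,
        Extra_Added_By_User, Extra_Added_on_Date,
        Extra_Amended_By_User, Extra_Amended_By_Date)
SELECT g.Mortgage_Number,
       g.Mortgage_Note_Sequence_No,
       (SELECT ltrim(rtrim(ISNULL(n.Additional_Text_1, '')))  + ' ' +
               ltrim(rtrim(ISNULL(n.Additional_Text_2, '')))  + ' ' +
               -- ... Additional_Text_3 through Additional_Text_11 ...
               ltrim(rtrim(ISNULL(n.Additional_Text_12, ''))) + ' '
        FROM   format_additional_notes AS n
        WHERE  n.Mortgage_Number           = g.Mortgage_Number
        AND    n.Mortgage_Note_Sequence_No = g.Mortgage_Note_Sequence_No
        FOR XML PATH(''), TYPE).value('.', 'varchar(max)'),
       MAX(g.Extra_Added_by_User),
       MAX(g.Extra_Added_on_Date),
       MAX(g.Extra_Amended_By_User),
       MAX(g.Extra_Amended_By_Date)
FROM   format_additional_notes AS g
GROUP BY g.Mortgage_Number, g.Mortgage_Note_Sequence_No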
View 6 Replies
View Related
May 6, 2008
I would like to use MVJ's formula for creating a date table.
I would like to use it with our main ERP database. However, I am reluctant to make changes to it because I fear that at some point, when we upgrade that software and its database, the upgrade program will delete my table.
So, here is my question. Performance wise, does it matter whether I add the date table to our ERP database or if I create another database (on the same server) for the custom date table? Does linking between databases take substantially longer than linking within the same database?
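Cross-database queries on the same server just use three-part names and are generally close in cost to same-database joins (nothing like the overhead of a linked server), so a separate utility database is usually a safe home for the date table. A sketch with hypothetical database, table and column names:

SELECT o.OrderDate, d.FiscalQuarter
FROM   ERPDB.dbo.Orders AS o
       INNER JOIN UtilityDB.dbo.DateTable AS d
               ON d.CalendarDate = o.OrderDate;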
View 1 Replies
View Related
Jun 12, 2008
Hi,
okay, so I'm refactoring some code at the moment, and I'm working on a search screen. This search screen lets the user enter a number of criteria; the code I'm working on drags data from a view and then programmatically filters it according to the search filters.
This is obviously inefficient and non-scalable, as the view drags out every entry and returns it to the data layer, which then filters it.
I'm wondering what the best way to refactor this is? I'm thinking the best way is to tell the db what to filter on, so it'll only drag out the right amount of data.
Therefore, should I keep the view? Is there any way of entering parameters into views, or am I going to need to change this into a stored proc?
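Plain views can't take parameters, but an inline table-valued function can, and it behaves like a parameterized view: the filter is applied inside the engine instead of in the data layer. A sketch with hypothetical names; a stored procedure works just as well if you don't need to compose the result with further queries:

CREATE FUNCTION dbo.SearchOrders (@CustomerID int, @FromDate datetime)
RETURNS TABLE
AS
RETURN
    SELECT o.OrderID, o.CustomerID, o.OrderDate
    FROM   dbo.Orders AS o
    WHERE  o.CustomerID = @CustomerID
    AND    o.OrderDate >= @FromDate;
GO

-- usage:
SELECT * FROM dbo.SearchOrders(42, '20080101');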
View 2 Replies
View Related
Sep 21, 2006
hi, All, could you tell me which case is more efficient? (My tables have no indexes.) And is there any other approach that would be more efficient?
case1:
"select sum(Invoice_Production.Quantity) from Invoice_Production,
  (select [dat_Item].ItemCode from [dat_Item],
    (select [dat_MachineType].MachineTypeID from [dat_MachineType] "&subQuery&") as T3
   where [dat_Item].MachineTypeID = T3.machinetypeid) as T1,
  (select [Invoice].InvoiceNo from Invoice,
    (select [users].user_id from [users] where [Users].User_ID = '"& rs2(0) &"') as T4
   where T4.User_ID = invoice.dealerno
     and Invoice.Cyear >= "&startYear&" and Invoice.Cyear <= "&endYear&"
     and Invoice.Cmonth >= "&startMonth&" and Invoice.Cmonth <= "&endMonth&") as T2
 where invoice_production.ItemCode = T1.ItemCode and T2.invoiceno = invoice_production.invoiceno"
case2:
"select sum(Invoice_Production.Quantity) from [Invoice_Production],[Invoice],[dat_MachineType],[dat_Item],[users]
 where [users].user_id = [invoice].DealerNo
   and [dat_Item].ItemCode = [Invoice_Production].ItemCode
   and [dat_Item].MachineTypeID = [dat_MachineType].MachineTypeID
   and [Invoice_Production].InvoiceNo = [Invoice].InvoiceNo
   and [Users].User_ID = '"& rs2(0) &"'
   and Invoice.Cyear
View 2 Replies
View Related
Jul 20, 2005
How efficient is it to use join views in a database? I'm developing an e-commerce system and using join views to join the product, product category and product review tables, and I'm wondering if this would have any adverse effect on performance. Thanks in advance
View 3 Replies
View Related
Nov 20, 2007
Greetings all,
I need to determine a hierarchy from a table with EmpIDs and SupIDs. Basically, the President doesn't have a SupID, so it will be null. I need to determine the hierarchy programmatically, to keep it simple.
I have code that works and I was hoping for advice on optimizing it 'cuz it uses a cursor.
Also, It only deals with less than 300 records.
Code Block
CREATE TABLE Employee(fName varchar(30), EmpID int, SupID int)
INSERT INTO Employee SELECT 'Adam', 1, 4
INSERT INTO Employee SELECT 'Joe', 2, 4
INSERT INTO Employee SELECT 'John', 3, 4
INSERT INTO Employee SELECT 'Frank', 4, 10
INSERT INTO Employee SELECT 'Jane', 5, 10
INSERT INTO Employee SELECT 'Kristy', 6, 10
INSERT INTO Employee SELECT 'Angie', 10, 11
INSERT INTO Employee SELECT 'Ron', 11, NULL
--=====================================================================================
-- CODE
--=====================================================================================
CREATE TABLE #temp(Hierarchy int, myName varchar(30), SupID int, EmpID int)
INSERT INTO #temp SELECT 1, fName, SupID, EmpID FROM Employee WHERE SupID IS NULL
--NULL SupID means that they are at the top most branch
DECLARE @Counter int --Counter is used to increment
SET @Counter = 1
DECLARE MY_CURSOR Cursor
FOR
SELECT SupID, EmpID, fName
FROM Employee
ORDER BY SupID --ORDER BY SupID to bring NULLs to top
Open My_Cursor
DECLARE @EmpID int, @SupID int, @Name varchar(30)
FETCH NEXT FROM MY_Cursor INTO @EmpID, @SupID, @Name
WHILE (@@FETCH_STATUS = 0)
BEGIN
SET @Counter = (SELECT MAX(Hierarchy) FROM #temp) + 1 --Get the highest hierarchy ID and increment by 1
INSERT INTO #temp
SELECT @Counter, fName, SupID, EmpID
FROM Employee
WHERE SupID IN (SELECT EmpID FROM #temp WHERE EmpID = @SupID)
FETCH NEXT FROM MY_CURSOR INTO @EmpID, @SupID, @Name
END
CLOSE MY_CURSOR
DEALLOCATE MY_CURSOR
SELECT * FROM #temp
DROP TABLE #temp
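If you are on SQL Server 2005 or later, a recursive CTE produces the same levels without the cursor or the temp table; a sketch against the Employee table above:

WITH EmpHierarchy (Hierarchy, myName, SupID, EmpID) AS
(
    SELECT 1, fName, SupID, EmpID
    FROM   Employee
    WHERE  SupID IS NULL                    -- top of the tree
    UNION ALL
    SELECT h.Hierarchy + 1, e.fName, e.SupID, e.EmpID
    FROM   Employee AS e
           INNER JOIN EmpHierarchy AS h
                   ON e.SupID = h.EmpID     -- walk down one level at a time
)
SELECT Hierarchy, myName, SupID, EmpID
FROM   EmpHierarchy
ORDER BY Hierarchy;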
Thanks in advance,
Adam
View 1 Replies
View Related
Aug 2, 2007
Howdy folks!
I've got a database that needs to run 24/7. I'm looking into maintanence options and wanted to run the following by y'all:
Ok, I've read the MSDN "Maintaining databases" article and noticed the following statement about autoshrinking: "This technique uses almost no processor time and memory". I also searched these forums and found that many users say autoshrinking heavily lags down sql transfers. So who's right? And if it does lag transfers, by how much?
Another question I have about autoshrink is fragmentation. It would seem to me that over time solely depending on autoshrink would cripple a server in terms of fragmentation; is this the case?
Also, does autoshrink (or manual shrinking or compacting) update the statistics?
Final question!!! I'm programming in native c++, is there a way for me to run commands such as "DBCC SHRINKDATABASE" in native OLE DB code?
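On the last point: DBCC SHRINKDATABASE is just a Transact-SQL batch, so from native OLE DB it can be sent through an ordinary command object (ICommandText) the same way as any other statement, for example:

DBCC SHRINKDATABASE (MyDatabase, 10);   -- hypothetical database name; leave 10 percent free space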
Thanks!
View 4 Replies
View Related
Apr 14, 2008
I have a website that is probably going to hold a sizable amount of data. The data will be specific to groups of users based on login credentials. Would it be more efficient to create a whole new database for each group of users, or create new tables for the groups in the existing database?
Any thoughts on the topic would be appreciated.
Thanks
View 3 Replies
View Related
Mar 30, 2006
I'm working on a "comments" section for our application suite.
My thoughts are to have 1 Comments table, which is then linked to a comment Log table.
For each section: Item, Group, Section, User, Package (these can all have comments on them) I will link them to the comments in a many to many relationship.
Example:
Comments table: CommentID, UserID
UserComments: UserID, CommentID
User_Table: UserID, ...
Would doing that be more efficient than having one separate comments table and log table for each area I want to have comments?
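For reference, the shared-comments shape sketched above would look roughly like this in DDL (column types are assumed, and only the User link table is shown; Item, Group, Section and Package would each get an equivalent link table):

CREATE TABLE Comments
(
    CommentID   int IDENTITY(1,1) PRIMARY KEY,
    CommentText varchar(max) NOT NULL,
    UserID      int NOT NULL             -- the author of the comment
);

CREATE TABLE UserComments                 -- many-to-many link: comments attached to users
(
    UserID    int NOT NULL,
    CommentID int NOT NULL,
    PRIMARY KEY (UserID, CommentID)
);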
View 1 Replies
View Related
Jul 20, 2005
Hello All,
I have a SQL query with multiple correlated subqueries in it. When it gets executed it runs rather slow due to the size of the QT table. Does anybody have any suggestions on how to alter this query to make it run faster, or any index suggestions to assist it? The query is as follows:
SELECT SH_ORDER, SH_CUST, SH_ADD_DATE, SH_CUST_REF, SH_DESC, SH_EXCL,
  (SELECT SUM(QT_CHARGE) AS QT_CHARGE_SUM
   FROM QT INNER JOIN JU ON QT_PROC_CODE = JU_PROC_CODE
   WHERE (QT_NUMBER = ' ' + SH_NOTE_2) AND (JU_PROC_GRP < 2) AND (QT_QUOTE_JOB = 0)) AS [PREPCOST],
  (SELECT SUM(QT_CHARGE) AS QT_CHARGE_SUM
   FROM QT INNER JOIN JU ON QT_PROC_CODE = JU_PROC_CODE
   WHERE (QT_NUMBER = ' ' + SH_NOTE_2) AND (QT_QUOTE_JOB = 0) AND (JU_PROC_GRP > 1) AND (JU_CATEG = 1)) AS [MATCOST],
  (SELECT SUM(QT_CHARGE) AS QT_CHARGE_SUM
   FROM QT INNER JOIN JU ON QT_PROC_CODE = JU_PROC_CODE
   WHERE (QT_NUMBER = ' ' + SH_NOTE_2) AND (QT_QUOTE_JOB = 0) AND (JU_PROC_GRP > 1) AND (JU_CATEG = 3)) AS [OUTCOST],
  (SELECT SUM(QT_CHARGE) AS QT_CHARGE_SUM
   FROM QT INNER JOIN JU ON QT_PROC_CODE = JU_PROC_CODE
   WHERE (QT_NUMBER = ' ' + SH_NOTE_2) AND (QT_QUOTE_JOB = 0) AND (JU_PROC_GRP > 1)
     AND ((JU_CATEG = 0) OR (JU_CATEG = 2) OR (JU_CATEG = 4))) AS [LABCOST]
FROM SH
WHERE SH_ADD_DATE = '5/FEB/2004'
thanks a lot for any help
Jason
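Since all four subqueries scan the same QT/JU join and differ only in the JU_PROC_GRP / JU_CATEG tests, one option is a single pass with conditional aggregation, joined back to SH. A sketch using the same tables and columns (check the results against the original for your data; an index on QT_NUMBER plus QT_PROC_CODE should help either version):

SELECT sh.SH_ORDER, sh.SH_CUST, sh.SH_ADD_DATE, sh.SH_CUST_REF, sh.SH_DESC, sh.SH_EXCL,
       t.PREPCOST, t.MATCOST, t.OUTCOST, t.LABCOST
FROM   SH AS sh
       LEFT JOIN
       (SELECT qt.QT_NUMBER,
               SUM(CASE WHEN ju.JU_PROC_GRP < 2 THEN qt.QT_CHARGE END)                              AS PREPCOST,
               SUM(CASE WHEN ju.JU_PROC_GRP > 1 AND ju.JU_CATEG = 1 THEN qt.QT_CHARGE END)          AS MATCOST,
               SUM(CASE WHEN ju.JU_PROC_GRP > 1 AND ju.JU_CATEG = 3 THEN qt.QT_CHARGE END)          AS OUTCOST,
               SUM(CASE WHEN ju.JU_PROC_GRP > 1 AND ju.JU_CATEG IN (0, 2, 4) THEN qt.QT_CHARGE END) AS LABCOST
        FROM   QT AS qt
               INNER JOIN JU AS ju ON qt.QT_PROC_CODE = ju.JU_PROC_CODE
        WHERE  qt.QT_QUOTE_JOB = 0
        GROUP BY qt.QT_NUMBER) AS t
         ON t.QT_NUMBER = ' ' + sh.SH_NOTE_2
WHERE  sh.SH_ADD_DATE = '5/FEB/2004'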
View 1 Replies
View Related
Apr 13, 2007
Hi all.
I am new to reporting services and I am having an efficiency problem when loading my report.
I would like to know how Reporting Services handles its datasets.
1: Lets say I have 3 parameters. All set to retrieve data from the same dataset. Does reporting services execute the Query 3 times to get the results for each parameter ? If so, is there a way around this ? I am having a great performance hit with this if it is the case.
2: I am also having an issue with a data processing extension: when my multi-valued parameter reads the fields from the dataset, it inserts duplicates rather than distinct values. Do I need to explicitly select distinct values in the data processing extension, or should Reporting Services do this automatically?
Any help is greatly appreciated.
Regards,
Neil
View 8 Replies
View Related
Oct 12, 2004
Well met,
Let's say I have a web form that allows users to insert and update records in a SQL database. Is it better to set empty web controls (textbox, etc.) to DBNull or let it go as an empty string into the database?
Example code:
if (tboxNAME.Text == "")
{
sqlCommand1.Parameters["@NAME"].Value = DBNull.Value;
}
else
{
sqlCommand1.Parameters["@NAME"].Value = tboxNAME.Text;
}
Is the above overkill? It seems like it would be a good idea to set fields which the user empties back to Null rather than an empty string.
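The other place the same mapping can live is on the SQL side, so every caller gets it for free; NULLIF turns an empty string into NULL in one step. A sketch inside a hypothetical insert procedure that receives @NAME:

INSERT INTO dbo.SomeTable (NAME)
VALUES (NULLIF(@NAME, ''));   -- '' becomes NULL, anything else passes through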
Thanks,
-Tony
View 2 Replies
View Related
Apr 6, 2000
Hello,
I regularly create stored procedures and use them like functions within other stored procedures.
I've never had any difficulty but then I never ran any metrics on it.
Does anyone know if there is an efficiency difference between that approach and just doing an inline query? How much of a difference is it? minimal? impractically large?
For example, if I define an sp like:
create proc isValidUser @userID int , @result int OUTPUT as
if exists(select * from user where userid = @userID)
set @result = 1
View 1 Replies
View Related
Oct 5, 1999
I'm having problems testing the effectiveness of changes I'm making to a stored procedure. I need to test the runtime, but each time I run the sp it gets progressively faster because the data has already been cached. Is there a way for me to clear the data from cache before each run? I can't stop SQL Server.
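You don't have to stop SQL Server; the caches can be cleared between runs (on a test server only, since everything gets re-read from disk and recompiled afterwards):

CHECKPOINT;              -- write dirty pages so the clean buffers can actually be dropped
DBCC DROPCLEANBUFFERS;   -- empty the data cache
DBCC FREEPROCCACHE;      -- empty the procedure/plan cache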
View 2 Replies
View Related
May 6, 2005
Hi,
I have two servers that i want to create a SQL RS report on.
On one server there is an HR database with our staff details, on the other server there is a database of assets.
In order to report on the assets assigned to each user, I am thinking that I will have to:
1) link the servers
2) create a view in the HR database exposing the fields needed
3) create a view in the assets database exposing the assets information joined to the view from the other server
4) create my reports on the view on the assets server.
Is this right, or am I barking up the wrong tree?
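That is roughly the standard approach. As a sketch of steps 1 and 3 (the server, database, view and column names here are all made up), once the HR server has been added as a linked server, the assets-side view can reach the HR view with a four-part name:

CREATE VIEW dbo.vAssetsWithOwner
AS
SELECT a.AssetTag, a.AssetDescription, h.EmployeeName, h.Department
FROM   dbo.Assets AS a
       INNER JOIN HRSERVER.HRDB.dbo.vStaffDetails AS h
               ON h.EmployeeID = a.AssignedEmployeeID;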
View 9 Replies
View Related
Feb 10, 2006
My general question is whether there is anything to be gained by having 50 tables in one database versus 5 tables each in 10 databases. I have a number of different databases running on a server (SQL Server 2k). The different databases represent different functional groups, for instance car maintenance, cab reservation/dispatch, cab accounting, limo reservation/dispatch, limo accounting, etc. There is some crossover, for instance the cab dispatch system would look to car maintenance to validate the car number entered.
A friend who happens to be IT Director at the local university suggested that the server would run more efficiently if there was only one database, rather than the roughly 12 I have now. His belief is that each separate database carries a certain amount of overhead, and combining them into one would be advantageous.
Is he all wet, or would there be gains to be made? TIA
View 7 Replies
View Related
Jul 1, 2006
Hi there,
when MS SQL Server 2005 hit the streets I was full of hope that developers had finally got an efficient tool for things like string manipulation, an area where T-SQL had some substantial efficiency problems...
After installing 2k5 server I wrote my first simple CLR stored procedure. It took three parameters (Integers) and added them and returned the result in an output parameter:
declare @x int
declare @p1 int, @p2 int, @p3 int   -- the two inputs and the output variable
set @x = 1
set @p1 = 1                          -- sample input values
set @p2 = 2
while (@x < 100000)
begin
exec testSP
@p1 = @p1,
@p2 = @p2,
@p3 = @p3 output
set @x = @x + 1
end
I executed the script and waited... and waited... 10 seconds! That means a single iteration took about 0.1 ms! :shocked: That was NOT what I had expected... Since then I've been experimenting with lots of different combinations of parameters... passing them byVal, byRef, as String, as SqlString, Integer, SqlInt32 and so on... all of this was just so damn slow :eek: I had been expecting something completely different.
Could somebody explain it to me?
View 14 Replies
View Related
Feb 5, 2007
Hi all,
Does JOIN order effect efficiency?
If I have three large tables to join together, should I join the two that I know will cut the number of rows down a lot first and then join the third table, or does it make no difference (i.e. if I join the first and third, which I know will produce a large result set, and then join the second)?
Thanks in advance,
Chiz.
View 1 Replies
View Related
Sep 17, 2005
Not sure what happened to my post, it seems to have disappeared. Here we go again. I have a stored procedure that I would like some feedback on, as I feel it may be inefficient as coded:
@ZUserID varchar(10)
AS
SET NOCOUNT ON
DECLARE @counter int
SET @counter = 0
WHILE @counter < 10
BEGIN
SET @counter = @counter + 1
IF EXISTS(SELECT * FROM tblWork WHERE UserID = @ZUserID And LineNumber = @counter)
BEGIN
UPDATE tblWork SET
TransID = Null,
TransCd = Null,
InvoiceNo = Null,
DatePaid = Null,
Adjustment = Null,
Vendor = Null,
USExchRate = Null
WHERE
UserID = @ZUserID And LineNumber = @counter
END
ELSE
INSERT INTO tblWork
(LineNumber,TransCd,UserID)
VALUES
(@counter,'P',@ZUserID)
END
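The loop can also be replaced with two set-based statements: one UPDATE covering every existing line and one INSERT for the missing line numbers. A sketch against the same table (the derived table just stands in for the numbers 1 to 10):

UPDATE tblWork
SET    TransID = NULL, TransCd = NULL, InvoiceNo = NULL, DatePaid = NULL,
       Adjustment = NULL, Vendor = NULL, USExchRate = NULL
WHERE  UserID = @ZUserID
AND    LineNumber BETWEEN 1 AND 10

INSERT INTO tblWork (LineNumber, TransCd, UserID)
SELECT n.LineNumber, 'P', @ZUserID
FROM   (SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5
        UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 UNION ALL SELECT 10) AS n (LineNumber)
WHERE  NOT EXISTS (SELECT 1 FROM tblWork AS w
                   WHERE w.UserID = @ZUserID AND w.LineNumber = n.LineNumber)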
View 2 Replies
View Related
Oct 12, 2007
Hi all,
I have a sql script that updates records in a table with 40 million records.
There is some functionality in the script that could be put away in functions for code reuse/elegance.
Functions would cause execution overhead.
What else could I use besides functions that would allow me the code reuse without compromising the execution overhead? Is there anything like includes in T-SQL that would allow me to do so?
TIA..
View 4 Replies
View Related
Jun 13, 2008
I have a function which is writing thousands of records coming from Access database and I was wondering if someone can suggest how can reconstruct my code for faster processing. Here my code: Public Sub MigrateNFSData(ByVal calcTbl As DataTable, ByVal strDBConnection As String) Dim sqlServerConn As New SqlConnection(strDBConnection) 'Define stored procedures Dim command As New SqlCommand Dim getAccID As New SqlCommand("GetAccountID", sqlServerConn) Dim getActionID As New SqlCommand("GetActionID", sqlServerConn) Dim getExchangeID As New SqlCommand("GetExchangeID", sqlServerConn) 'Dim getParrentAccID As New SqlCommand("GetParentAccID", sqlServerConn) Dim getStatusID As New SqlCommand("GetStatusID", sqlServerConn) Dim getTraderID As New SqlCommand("GetTraderID", sqlServerConn) Dim getGroupID As New SqlCommand("GetGroupID", sqlServerConn) Dim getGroupIDByIP As New SqlCommand("GetGroupIDByIP", sqlServerConn) Dim getTraderIDByIP As New SqlCommand("GetTraderIDByIP", sqlServerConn) 'Define insert records stored procedures Dim insertAcc As New SqlCommand("InsertAccount", sqlServerConn) insertAcc.CommandType = CommandType.StoredProcedure Dim insertAction As New SqlCommand("InsertAction", sqlServerConn) insertAction.CommandType = CommandType.StoredProcedure Dim insertExchange As New SqlCommand("InsertExchange", sqlServerConn) insertExchange.CommandType = CommandType.StoredProcedure Dim insertGroup As New SqlCommand("InsertGroup", sqlServerConn) insertGroup.CommandType = CommandType.StoredProcedure Dim insertStatus As New SqlCommand("InsertStatus", sqlServerConn) insertStatus.CommandType = CommandType.StoredProcedure Dim insertTrader As New SqlCommand("InsertTrader", sqlServerConn) insertTrader.CommandType = CommandType.StoredProcedure Try sqlServerConn.Open() Catch ex As Exception MessageBox.Show("Connection failed to open!") End Try 'Set parameters to helper Get Stored Procedures to retreive Id's getAccID.Parameters.Add("@AccName", SqlDbType.NVarChar) getAccID.CommandType = CommandType.StoredProcedure getActionID.Parameters.Add("@ActionName", SqlDbType.NVarChar) getActionID.CommandType = CommandType.StoredProcedure getExchangeID.Parameters.Add("@ExchName", SqlDbType.NVarChar) getExchangeID.CommandType = CommandType.StoredProcedure 'getParrentAccID.Parameters.Add("@ParentName", SqlDbType.NVarChar) 'getParrentAccID.CommandType = CommandType.StoredProcedure getStatusID.Parameters.Add("@StatusName", SqlDbType.NVarChar) getStatusID.CommandType = CommandType.StoredProcedure getTraderID.Parameters.Add("@TraderName", SqlDbType.NVarChar) getTraderID.CommandType = CommandType.StoredProcedure getGroupID.Parameters.Add("@GroupName", SqlDbType.NVarChar) getGroupID.CommandType = CommandType.StoredProcedure getGroupIDByIP.Parameters.Add("@IP", SqlDbType.NVarChar) getGroupIDByIP.CommandType = CommandType.StoredProcedure getTraderIDByIP.Parameters.Add("@IP", SqlDbType.NVarChar) getTraderIDByIP.CommandType = CommandType.StoredProcedure command = New SqlCommand("InsertTradeTransaction", sqlServerConn) command.CommandType = CommandType.StoredProcedure 'Set Parameters for Insert stored procedures insertAcc.Parameters.Add("@Account", SqlDbType.Text) insertAction.Parameters.Add("@ActionName", SqlDbType.Text) insertExchange.Parameters.Add("@Exchange", SqlDbType.Text) insertGroup.Parameters.Add("@Group", SqlDbType.Text) insertGroup.Parameters.Add("@ACCID", SqlDbType.Int) insertGroup.Parameters.Add("@GroupID", SqlDbType.UniqueIdentifier) insertStatus.Parameters.Add("@StatusName", SqlDbType.Text) 
insertTrader.Parameters.Add("@Group", SqlDbType.UniqueIdentifier) insertTrader.Parameters.Add("@IP", SqlDbType.Text) insertTrader.Parameters.Add("@TraderName", SqlDbType.Text) insertTrader.Parameters.Add("@TraderID", SqlDbType.UniqueIdentifier) 'Adding stored Get Stored Procedure's parameters----------------------- command.Parameters.Add("@OrderNum", SqlDbType.Text) command.Parameters.Add("@ACC_ID", SqlDbType.Int) command.Parameters.Add("@Group_ID", SqlDbType.UniqueIdentifier) command.Parameters.Add("@Trader_ID", SqlDbType.UniqueIdentifier) command.Parameters.Add("@Exch_ID", SqlDbType.Int) command.Parameters.Add("@Date", SqlDbType.DateTime) command.Parameters.Add("@Time", SqlDbType.DateTime) command.Parameters.Add("@ActionID", SqlDbType.Int) command.Parameters.Add("@StatusID", SqlDbType.Int) command.Parameters.Add("@TimeSent", SqlDbType.DateTime) command.Parameters.Add("@Qty", SqlDbType.Int) command.Parameters.Add("@Product", SqlDbType.Text) command.Parameters.Add("@MMYYY", SqlDbType.Text) command.Parameters.Add("@ExchOrderID", SqlDbType.Text) command.Parameters.Add("@TimeTicks", SqlDbType.Int) command.Parameters.Add("@W2G", SqlDbType.Int) command.Parameters.Add("@W2Exch", SqlDbType.Int) command.Parameters.Add("@G2ExchDerived", SqlDbType.Int) command.Parameters.Add("@Msg", SqlDbType.NVarChar) 'command.Parameters.Add("@ExchDate", SqlDbType.DateTime) 'command.Parameters.Add("@ParentID", SqlDbType.Int) 'Paremeters Defenition-------------------------------------- 'Parsing DateTime Objects Dim formaterA As IFormatProvider formaterA = New System.Globalization.CultureInfo("en-GB", True) Dim dateObj As Date 'DEBUG 'Dim rows = calcTbl.Rows.Count Dim colValues = GetColumnsValues(calcTbl) 'Write table with computed NFS data to sql server DB For Each dr As DataRow In calcTbl.Rows Dim orderNo = dr.Item("Order No").ToString() command.Parameters("@OrderNum").Value = dr.Item("Order No").ToString() getAccID.Parameters("@AccName").Value = dr.Item("Acct").ToString() If getAccID.ExecuteScalar() = Nothing Then insertAcc.Parameters("@Account").Value = dr.Item("Acct").ToString() insertAcc.ExecuteNonQuery() getAccID.Parameters("@AccName").Value = dr.Item("Acct").ToString() command.Parameters("@ACC_ID").Value = getAccID.ExecuteScalar() Else command.Parameters("@ACC_ID").Value = Int32.Parse(getAccID.ExecuteScalar()).ToString() End If getGroupID.Parameters("@GroupName").Value = dr.Item("Group ID").ToString() If getGroupID.ExecuteScalar() = Nothing Then 'Find Group by IP address if input Data Table doesn't have group getGroupIDByIP.Parameters("@IP").Value = dr.Item("IP").ToString() If getGroupIDByIP.ExecuteScalar() = Nothing Then insertGroup.Parameters("@GroupID").Value = Guid.NewGuid insertGroup.Parameters("@Group").Value = dr.Item("Group ID") insertGroup.Parameters("@ACCID").Value = getAccID.ExecuteScalar() insertGroup.ExecuteNonQuery() command.Parameters("@Group_ID").Value = getGroupID.ExecuteScalar() Else command.Parameters("@Group_ID").Value = getGroupIDByIP.ExecuteScalar() End If Else command.Parameters("@Group_ID").Value = getGroupID.ExecuteScalar() End If getTraderID.Parameters("@TraderName").Value = dr.Item("Trd ID").ToString() If getTraderID.ExecuteScalar() = Nothing Then getTraderIDByIP.Parameters("@IP").Value = dr.Item("IP").ToString() If getTraderIDByIP.ExecuteScalar() = Nothing Then insertTrader.Parameters("@Group").Value = getGroupID.ExecuteScalar() insertTrader.Parameters("@IP").Value = dr.Item("IP").ToString() insertTrader.Parameters("@TraderName").Value = dr.Item("Trd ID").ToString() 
insertTrader.Parameters("@TraderID").Value = Guid.NewGuid insertTrader.ExecuteNonQuery() command.Parameters("@Trader_ID").Value = getTraderID.ExecuteScalar() Else command.Parameters("@Trader_ID").Value = getTraderIDByIP.ExecuteScalar() End If Else command.Parameters("@Trader_ID").Value = getTraderID.ExecuteScalar() End If getExchangeID.Parameters("@ExchName").Value = dr.Item("Exch").ToString() If getExchangeID.ExecuteScalar() = Nothing Then insertExchange.Parameters("@Exchange").Value = dr.Item("Exch").ToString() insertExchange.ExecuteNonQuery() command.Parameters("@Exch_ID").Value = getExchangeID.ExecuteScalar() Else command.Parameters("@Exch_ID").Value = getExchangeID.ExecuteScalar() End If getActionID.Parameters("@ActionName").Value = dr.Item("Action").ToString() If getActionID.ExecuteScalar() = Nothing Then insertAction.Parameters("@ActionName").Value = dr.Item("Action").ToString() command.Parameters("@ActionID").Value = getActionID.ExecuteScalar() Else command.Parameters("@ActionID").Value = getActionID.ExecuteScalar() End If getStatusID.Parameters("@StatusName").Value = dr.Item("Status").ToString() If getStatusID.ExecuteScalar() = Nothing Then insertStatus.Parameters("@StatusName").Value = dr.Item("Status").ToString() insertStatus.ExecuteNonQuery() command.Parameters("@StatusID").Value = getStatusID.ExecuteScalar() Else command.Parameters("@StatusID").Value = getStatusID.ExecuteScalar() End If 'getParrentAccID.Parameters("@ParentName").Value = "" 'If getParrentAccID.ExecuteScalar() = 0 Then 'insert parent acc 'Else 'command.Parameters("@ParentID").Value = getParrentAccID.ExecuteScalar() dateObj = Date.Parse(dr.Item("Exch Date").ToString(), formaterA) command.Parameters("@Date").Value = dateObj command.Parameters("@Time").Value = DateTime.Parse(dr.Item("Time").ToString()) command.Parameters("@TimeSent").Value = DateTime.Parse(dr.Item("Time Sent").ToString()) If (dr.Item("Qty").Equals(System.DBNull.Value)) Then command.Parameters("@Qty").Value = System.DBNull.Value Else command.Parameters("@Qty").Value = Int32.Parse(dr.Item("Qty").ToString()) End If command.Parameters("@Product").Value = dr.Item("Product").ToString() command.Parameters("@MMYYY").Value = dr.Item("MMMYY").ToString() command.Parameters("@ExchOrderID").Value = dr.Item("Exchange Order ID").ToString() If (dr.Item("Time Ticks").Equals(System.DBNull.Value)) Then command.Parameters("@TimeTicks").Value = System.DBNull.Value Else command.Parameters("@TimeTicks").Value = Int32.Parse(dr.Item("Time Ticks").ToString()) End If 'command.Parameters("@ExchDate").Value = Date.Parse(dr.Item("Exch Date").ToString()) 'command.Parameters("@ExchDate").Value = Convert.ToDateTime(dr.Item("Exch Date").ToString()) 'DEBUG 'Dim strW2G = dr.Item("W2G").ToString() 'Dim strW2E = dr.Item("W2E").ToString() If (dr.Item("W2G").Equals(System.DBNull.Value)) Then command.Parameters("@W2G").Value = System.DBNull.Value Else command.Parameters("@W2G").Value = Int32.Parse(dr.Item("W2G").ToString()) End If If dr.Item("W2E").Equals(System.DBNull.Value) Then command.Parameters("@W2Exch").Value = System.DBNull.Value Else command.Parameters("@W2Exch").Value = Int32.Parse(dr.Item("W2E").ToString()) End If 'command.Parameters("@G2ExchDerived").Value = Int32.Parse(dr.Item("Time Delta G2E").ToString()) If (dr.Item("Time Delta G2E").Equals(System.DBNull.Value)) Then command.Parameters("@G2ExchDerived").Value = System.DBNull.Value Else command.Parameters("@G2ExchDerived").Value = Int32.Parse(dr.Item("Time Delta G2E").ToString()) End If command.Parameters("@Msg").Value = 
dr.Item("Msg").ToString() command.ExecuteNonQuery() Next sqlServerConn.Close() End Sub
View 3 Replies
View Related
Jul 26, 2007
Hi
We have an application running on SQL Server 2005 which needs to browse/search a text field. Does anyone know if SQL Server's search/browse performance on text fields is better than Oracle's?
The table the application will search on is a customer table with 10,000 records in it. Does this size of table cause a performance problem for SQL Server 2005 if I index the text field?
Please advise, thanks for your help!
Li
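If the column really is character data ((n)text or (n)varchar(max)), the usual answer in SQL Server 2005 is a full-text index rather than LIKE scans, and 10,000 rows is small for it. A sketch with hypothetical table, column and key index names:

CREATE FULLTEXT CATALOG CustomerCatalog;

CREATE FULLTEXT INDEX ON dbo.Customer (Notes)
    KEY INDEX PK_Customer          -- the table's unique, non-nullable, single-column key index
    ON CustomerCatalog;

-- query:
SELECT CustomerID
FROM   dbo.Customer
WHERE  CONTAINS(Notes, 'invoice');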
View 4 Replies
View Related