Fastest Way To Handle The Search
Jul 23, 2005
Hi!
We have SQL Server 2000 on our server (NT 4). Our database now has about
350,000 rows with information on images. The table has a lot of columns,
including the image name, keywords, location, price, color mode, etc. The
database doesn't include the images themselves, just a path to the location
of every image. The keywords field holds data like this:
cat,animal,pet,home,child with pet,child. Our search currently uses Full-Text
Search, which sounded like a good idea in the beginning, but it has had
problems that really reduce our search engine's performance. The search
results are also not exact enough. Some of our images also have the
photographer's name in the keywords column, and if the photographer's name
is, for example, Peter Moss, his pictures appear on the web page when a
customer searches for "moss" (nature) pictures.
Another problem is that Full-Text Search has become very slow when the query
result contains thousands of rows. When the search term returns at most 3,000
rows, the search is fast, but larger searches take 6 to 20 seconds to
finish, which is not good. I have also noticed that the first search is
always very slow, but subsequent ones are faster. It seems the engine is just
"warming up" when the first query runs.
Is there a better and faster way to handle the queries? Would it be better to
restructure the database somehow and use another search method than Full-
Text Search? I don't know how else to model the database when every
image has about 10 to even 50 different keywords to search.
We built the web interface and search code with ColdFusion; ColdFusion
Server then takes care of sending all queries to SQL Server.
I hope somebody has an idea how to speed up our picture search.
--
Message posted via http://www.sqlmonster.com
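One alternative hinted at in the question, sketched minimally: move the comma-separated keywords into a child table so each keyword can be matched exactly against an index. Table and column names below are hypothetical, not from the original schema.

CREATE TABLE ImageKeyword (
    ImageID int NOT NULL REFERENCES Image(ImageID),
    Keyword varchar(100) NOT NULL,
    PRIMARY KEY (Keyword, ImageID)
)

-- exact match on an indexed keyword: 'Peter Moss' no longer matches 'moss'
SELECT i.*
FROM Image i
JOIN ImageKeyword k ON k.ImageID = i.ImageID
WHERE k.Keyword = 'moss'

With 10 to 50 keywords per image this is roughly 3.5 to 17 million narrow rows, which an index seek handles well, and it also removes the false hits from photographer names.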
View 2 Replies
Nov 5, 2004
I have a master table which has demographic data such as name, dob, and location, along with a primary key id. It will have about 10-12,000 records. We get a refresh file every hour which may or may not have corrections for these records, with about 3,000 records. I put this data into a table. This data should always be considered correct. To handle the update to the master table I need to create an update process. I can take one of two approaches: just update all the records in the master table regardless of whether they are correct or not, or do some type of left join on those that do not match (in other words, only update the ones where the names or dob don't match). There is an underlying update trigger on the patient master which will also fire if these values are changed. Any opinions on the best approach?
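A minimal sketch of the second approach, with hypothetical table and column names; only rows whose values actually differ are touched, so the trigger fires only for real changes:

UPDATE m
SET m.name = r.name,
    m.dob  = r.dob
FROM PatientMaster m
JOIN RefreshStage r ON r.patient_id = m.patient_id
WHERE m.name <> r.name
   OR m.dob  <> r.dob
-- note: <> never matches when one side is NULL; nullable columns
-- need extra (m.col IS NULL AND r.col IS NOT NULL) style terms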
View 1 Replies
Feb 15, 2007
Good day all, I'm looking for suggestions on how to handle calculating scores for search results. Primarily, would it be best to calculate the score on the SQL Server side, or in the application's logic after the results have been retrieved? I already have an idea of the calculations I want to do, which will be pretty simple: just a basic point system for containing all the queried terms, with additional points for the number of times those words appear on the page. Feedback or links to articles on this would be appreciated. I'm leaning towards doing it on the SQL side myself, but my SQL skills are not as polished as I would like for jumping into that, so examples would be good.
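A minimal sketch of the SQL-side version under stated assumptions: a hypothetical Pages table with a Body column and two hard-coded terms. One point per term present, plus a point per occurrence via the LEN/REPLACE counting trick:

SELECT PageID,
       CASE WHEN Body LIKE '%cat%' THEN 1 ELSE 0 END
     + CASE WHEN Body LIKE '%pet%' THEN 1 ELSE 0 END
     + (LEN(Body) - LEN(REPLACE(Body, 'cat', ''))) / LEN('cat')   -- occurrences of 'cat'
     + (LEN(Body) - LEN(REPLACE(Body, 'pet', ''))) / LEN('pet')   -- occurrences of 'pet'
       AS Score
FROM Pages
WHERE Body LIKE '%cat%' OR Body LIKE '%pet%'
ORDER BY Score DESC

In practice the term list would be built dynamically or passed to a stored procedure, but the shape of the scoring expression stays the same.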
View 3 Replies
Mar 3, 2006
I am currently faced with developing a search function (ASP.NET 1.1
based user interface) to a database (SQL Server 2000 ), and I am
wondering how best to handle the logistics of it.
The search can take anywhere from one to 9 parameters. The search would be executed against three related tables.
The main problem, as I see it at this stage, is how to handle
parameters that have not been selected. It's easy enough to do a search
with all 9 parameters, but as soon as there is an unpredictable,
variable number of search criteria, I just don't know how best to
handle this in a stored procedure, for example.
I would be very grateful for some guidance.
Thanks.
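One common pattern for this, as a hedged sketch (procedure, table and parameter names are made up): give every parameter a NULL default and let each predicate pass when its parameter was not supplied.

CREATE PROCEDURE SearchOrders
    @CustomerID int = NULL,
    @City varchar(50) = NULL,
    @FromDate datetime = NULL
AS
SELECT *
FROM Orders
WHERE (@CustomerID IS NULL OR CustomerID = @CustomerID)
  AND (@City IS NULL OR City = @City)
  AND (@FromDate IS NULL OR OrderDate >= @FromDate)

The trade-off is that one cached plan serves every combination of criteria, which is not always optimal; building the WHERE clause dynamically (still parameterized) is the usual alternative for many optional filters.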
View 1 Replies
Dec 3, 2007
Hi
I'm using Service Broker and keep getting errors in the log even though everything is working as expected
SQL Server 2005
Two databases
Two end points - 1 in each database
Two stored procedures:
SP1 is activated when a message enters the sending queue. It inserts a new row in a table.
SP2 is activated when a response is sent from the receiving queue. It cleans up the sending queue.
I have a table with an update trigger
In that trigger, if the updated row meets a certain condition, a dialog is created and a message is sent to the sending queue.
I know that SP1 and SP2 are behaving properly because I get the expected results.
SP1 is inserting the expected data into the table.
SP2 is cleaning up the sending queue.
In the SQL Server log, however, I'm getting errors on both of the stored procs.
error #1
The activated proc <SP 1 Name> running on queue Applications.dbo.ffreceiverQueue output the following: 'The conversation handle is missing. Specify a conversation handle.'
error #2
The activated proc <SP 2 Name> running on queue ADAPT_APP.dbo.ffsenderQueue output the following: 'The conversation handle is missing. Specify a conversation handle.'
I would appreciate anybody's help with why I'm getting this. Have I set up the stored procs incorrectly?
I can provide the code of the stored procs if that helps.
thanks.
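One frequent cause of 'The conversation handle is missing' in an activated proc is a RECEIVE that returns no row (activation can fire when nothing is left to read) followed by an END CONVERSATION on a NULL handle. A hedged sketch of the guard, with the queue name taken from the error text and the rest hypothetical:

DECLARE @handle uniqueidentifier, @msgtype sysname

WAITFOR (
    RECEIVE TOP (1)
        @handle = conversation_handle,
        @msgtype = message_type_name
    FROM dbo.ffsenderQueue
), TIMEOUT 1000

IF @handle IS NULL
    RETURN   -- nothing received; skip any END CONVERSATION / SEND

IF @msgtype = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
    END CONVERSATION @handle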
View 10 Replies
Aug 9, 2006
I'm just wonder if this is a bug in MS Search or am I doing something wrong.
I have a query below
declare @search_clause varchar(255)
set @Search_Clause = ' "hepatitis b" and "hepatocellular carcinoma"'
select * from results
where contains(finding,@search_clause)
I don't get the correct result at all.
If I change my search_clause to "hepatitis" and "hepatocellular carcinoma" -- without the "b"
then i get the correct result.
It seems MS Search doesn't like a phrase containing a single letter or something of the sort. Or is it a known bug?
Anyone know?
Thanks
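The usual explanation is that single letters such as "b" sit in the full-text noise-word list, so the phrase match is thrown off. If this is SQL Server 2005, one commonly suggested option (a hedged sketch; it is an advanced server-wide setting, so treat with care) is to have noise words transformed instead of failing the match:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'transform noise words', 1
RECONFIGURE

On SQL Server 2000 the equivalent workaround is usually editing the noise-word file (e.g. noiseENU.txt) to remove the single letters and then repopulating the catalog.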
View 3 Replies
Jan 18, 2008
We have implemented our service broker architecture using conversation handle reuse per MS/Remus's recommendations. All of a sudden we have started receiving the 'conversation handle not found' errors in the SQL log every hour or so (which makes perfect sense considering the dialog timer is set for 1 hour). My question is: is this expected behavior when you have employed conversation recycling? Should you expect to see these messages pop up every hour, while the logic in the queuing proc retries after deleting from your conversation handle table, so the message is enqueued as expected?
Second question: I think I know why we were not receiving these errors before and wanted to confirm this theory as well. In the queuing proc I was not initializing the variable @Counter to 0, so when it came down to the retry logic it could not add 1 to NULL and never entered that part of the code. I am guessing that with this setup it would actually output the error to the application calling the queuing proc and NOT into the SQL error logs. Is this a correct assumption?
I have attached an example of one of the queuing procs below:
Code Block
DECLARE @conversationHandle UNIQUEIDENTIFIER,
@err int,
@counter int,
@DialogTimeOut int,
@Message nvarchar(max),
@SendType int,
@ConversationID uniqueidentifier
select @Counter = 0 -- THIS PART VERY IMPORTANT LOL :)
select @DialogTimeOut = Value
from dbo.tConfiguration with (nolock)
where keyvalue = 'ConversationEndpoints' and subvalue = 'DeleteAfterSec'
WHILE (1=1)
BEGIN
-- Lookup the current SPIDs handle
SELECT @conversationHandle = [handle] FROM tConversationSPID with (nolock)
WHERE spid = @@SPID and messagetype = 'TestQueueMsg';
IF @conversationHandle IS NULL
BEGIN
BEGIN DIALOG CONVERSATION @conversationHandle
FROM SERVICE [InitiatorQueue_SER]
TO SERVICE 'ReceiveTestQueue_SER'
ON CONTRACT [TestQueueMsg_CON]
WITH ENCRYPTION = OFF;
BEGIN CONVERSATION TIMER ( @conversationHandle )
TIMEOUT = @DialogTimeOut
-- insert the conversation in the association table
INSERT INTO tConversationSPID
([spid], MessageType,[handle])
VALUES
(@@SPID, 'TestQueueMsg', @conversationHandle);
SEND ON CONVERSATION @conversationHandle
MESSAGE TYPE [TestQueueMsg] (@Message)
END
ELSE IF @conversationHandle IS NOT NULL
BEGIN
SEND ON CONVERSATION @conversationHandle
MESSAGE TYPE [TestQueueMsg] (@Message)
END
SELECT @err = @@ERROR;
-- if succeeded, exit the loop now
IF (@err = 0)
BREAK;
SELECT @counter = @counter + 1;
IF @counter > 10
BEGIN
-- Refer to http://msdn2.microsoft.com/en-us/library/ms164086.aspx for severity levels
EXEC spLogMessageQueue 20002, 8, 'Failed to SEND on a conversation for more than 10 times. Error %i.'
BREAK;
END
-- We tried on the said conversation, but failed
-- remove the record from the association table, then
-- let the loop try again
DELETE FROM tConversationSPID
WHERE [spid] = @@SPID;
SELECT @conversationHandle = NULL;
END;
View 2 Replies
May 14, 2008
Right now I have a stored procedure that goes through each of the Line and Body fields using a cursor. The problem is that this method is very slow. How would you experts solve this problem? Any hints or suggestions? (Before/after data below; a set-based sketch follows the example.)
BEFORE

Example  Part    Line         Body    Series/Engine  Year
1        1234    A,B          -       WETC           1998
2        56789   91,93,94,95  -       WET0           1997
3        345656  S,R          5,6,12  WENC           1995

AFTER

Example  Part    Line  Body  Series/Engine  Year
1        1234    A     -     WETC           1998
1        1234    B     -     WETC           1998
2        56789   91    -     WET0           1997
2        56789   93    -     WET0           1997
2        56789   94    -     WET0           1997
2        56789   95    -     WET0           1997
3        345656  S     5     WENC           1995
3        345656  S     6     WENC           1995
3        345656  S     12    WENC           1995
3        345656  R     5     WENC           1995
3        345656  R     6     WENC           1995
3        345656  R     12    WENC           1995
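A hedged sketch of a set-based alternative to the cursor, assuming a hypothetical source table named Parts and a Numbers utility table holding integers 1..8000. Each comma list is split with SUBSTRING/CHARINDEX; doing the same for Body and joining the two derived tables on Part reproduces the AFTER rows without row-by-row processing:

SELECT p.Part,
       SUBSTRING(p.Line, n.n, CHARINDEX(',', p.Line + ',', n.n) - n.n) AS LineValue
FROM Parts p
JOIN Numbers n
  ON n.n <= LEN(p.Line)
 AND SUBSTRING(',' + p.Line, n.n, 1) = ','   -- n marks the start of each element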
View 4 Replies
Sep 29, 2000
Hi,
In my SQL Server 7.0, I have about 250 stored procedures in each database.
Before using them for my application, I want to encrypt them all.
I would have to add "WITH ENCRYPTION" to each SP in every database, which would take a long time. Is there a faster way to encrypt all SPs in all DBs? Does anyone have a utility SP (or any other way) to do this?
Thanks in advance.
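A hedged sketch of a generator approach using the SQL 7.0/2000 system tables. It assumes each procedure's source fits in one syscomments row (true only below 4,000 characters; longer procs span rows and need stitching), and it prints the modified definitions for review rather than executing them blindly:

DECLARE @name sysname, @text nvarchar(4000)
DECLARE sp_cur CURSOR FOR
    SELECT o.name, c.text
    FROM sysobjects o
    JOIN syscomments c ON c.id = o.id
    WHERE o.type = 'P' AND c.colid = 1
OPEN sp_cur
FETCH NEXT FROM sp_cur INTO @name, @text
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @text = REPLACE(@text, 'CREATE PROCEDURE', 'ALTER PROCEDURE')
    -- naive: assumes the first ' AS' is the body keyword, so review the output
    IF CHARINDEX(' AS', @text) > 0
        SET @text = STUFF(@text, CHARINDEX(' AS', @text), 0, ' WITH ENCRYPTION')
    PRINT @text   -- once satisfied, EXEC (@text) instead
    FETCH NEXT FROM sp_cur INTO @name, @text
END
CLOSE sp_cur
DEALLOCATE sp_cur

Keep an unencrypted copy of the sources first; WITH ENCRYPTION is effectively one-way.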
View 1 Replies
Mar 10, 2008
What is the fastest way for a stored procedure to copy a table from a linked server?
I would like to tune this statement, possibly with hints or other logging options. Assume that table_A and table_B have exactly the same table structure and that I want to preserve table_A and all its indexes and constraints. The table will be truncated before this load, if that helps in any way.
insert into table_A select * from OpenQuery(Server,'select * from Table_B')
TIA, Mike
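For what it's worth, a hedged note: INSERT ... SELECT is fully logged on this era of SQL Server regardless of hints, so the main cheap win is a table lock on the freshly truncated target; truly minimal logging needs one of the bulk paths (bcp/BULK INSERT, or DTS with fast load).

TRUNCATE TABLE table_A

INSERT INTO table_A WITH (TABLOCK)
SELECT * FROM OPENQUERY(Server, 'select * from Table_B')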
View 4 Replies
Jul 20, 2005
In relation to my last post, I have a question for the SQL gurus. I need to update 70k records, and mark all those updated in a special column for further processing by another system.

So, if the record was
Key1, foo, foo, ""
it needs to become
Key1, fap, fap, "U"
if and only if the data values are actually different (as above, foo becomes fap); otherwise it must stay
Key1, foo, foo, ""

Is it quicker to:
1) get the row of the destination table, inspect all values programmatically, and determine IF an update query is needed
OR
2) just do an update on all rows, but adding
and (field1 <> value1 or field2 <> value2)
to the update query, that is

update myTable
set field1 = 'foo', markField = 'u'
where key = 'mykey' and (field1 <> 'foo')

The first one will not generate new update queries if the record has not changed, on account of doing a select, whereas the second version always runs an update, but some of them will not affect any rows. Will I need a full index on the second version?

Thanks in advance,
Asger Henriksen
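A hedged sketch of option 2 done as one set-based statement (table and column names hypothetical; note that <> never matches when either side is NULL, so nullable columns need explicit IS NULL terms):

UPDATE t
SET t.field1 = s.field1,
    t.field2 = s.field2,
    t.markField = 'U'
FROM myTable t
JOIN sourceTable s ON s.[key] = t.[key]
WHERE t.field1 <> s.field1
   OR t.field2 <> s.field2

For 70k rows the single filtered UPDATE is normally far quicker than 70k row-by-row select-then-maybe-update round trips, and an index on [key] (plus ideally the compared fields) keeps the filter cheap.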
View 2 Replies
Dec 26, 2000
Hi,
1) I need to transfer 500 GB of data from one server to another; which is faster: DTS, BCP, or backup/restore?
2) What are the best methods for checking blocking, deadlocks & indexes?
Thank you all in advance
Richard..
View 5 Replies
Feb 2, 2006
Hi,
I have a production server that has an 8Gb db. It is a dual Xeon with 5x HDD - 2 mirrored and 3 striped. The db is on the stripe, log and OS on the mirror. 2x Gb network cards.
The application goes slow (i.e. users notice) when a backup is running, so I have placed a crossover cable from one NIC to a test server so that it can back up to a HDD on that server, and then to tape. The test server has 2x Gb NICs and the link between the two servers is on a separate subnet. However, in the first trial of this the backup and verify take 3 minutes longer.
Is this because the target server doesn't have a disk stripe?
What is the best config for the production server (i.e. would a slower backup to another server be less load to contend with the application)?
thanks
Fatherjack
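For reference, a minimal sketch of the network-backup side of this setup (share and file names hypothetical):

BACKUP DATABASE ProdDB
TO DISK = '\\TestServer\Backups\ProdDB.bak'
WITH INIT

RESTORE VERIFYONLY FROM DISK = '\\TestServer\Backups\ProdDB.bak'

If the extra 3 minutes come from a single target spindle rather than the wire, striping the backup across several target files is one lever; the trade being weighed here is elapsed backup time versus I/O contention on the production stripe.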
View 2 Replies
May 22, 2007
I've got a view that is driven from an 80 million record table in a data warehouse. I am trying to populate an aggregate table in a datamart, but am running into performance problems. The datamart table needs to be updated daily. I understand there are many factors that affect performance, but in general would the fastest approach be:
1) Truncate the datamart table
2) Perform a bcp of the view to a text file
3) Bulk Insert to the datamart table
If you need more information to answer this please let me know.
Thanks,
Matt
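A hedged sketch of steps 2 and 3 (paths and names hypothetical; -T is a trusted connection, -c character mode):

-- step 2, from the command line:
--   bcp warehouse.dbo.vAggregate out c:\temp\agg.txt -c -T -S WarehouseServer

TRUNCATE TABLE datamart.dbo.AggTable

BULK INSERT datamart.dbo.AggTable
FROM 'c:\temp\agg.txt'
WITH (TABLOCK, ROWS_PER_BATCH = 1000000)

With TABLOCK and the bulk-logged (or simple) recovery model the load can be minimally logged, which is usually where this approach beats a plain INSERT ... SELECT across servers.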
View 7 Replies
Oct 21, 2007
I'm trying to dedupe a table with only one field on it. The table has 40 million records in it. What is the fastest way?
1) create a table with a unique constraint on it and insert into that table?
2) create a table without a unique constraint on it and use insert into table select distinct un from table2?
3) another way?
Michael
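A hedged sketch of option 2 via SELECT INTO, which avoids per-row constraint checks and is minimally logged under the simple or bulk-logged recovery model (names hypothetical):

SELECT DISTINCT un
INTO dbo.table2_dedup
FROM dbo.table2

-- then swap the tables:
-- DROP TABLE dbo.table2
-- EXEC sp_rename 'dbo.table2_dedup', 'table2'

Option 1 forces an index maintenance and uniqueness check on every one of the 40 million inserts, so the distinct-then-swap route is generally the faster of the two.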
View 1 Replies
Jan 10, 2008
Can anyone point to a reference which documents the pros and cons of the various connection protocols, such as Shared Memory vs. TCP/IP? I thought I saw something indicating that shared memory is fastest, which would explain why this protocol is tried first, but now I can't find it. This resource has information on creating connection strings, but not the advantages and disadvantages.
http://msdn2.microsoft.com/en-us/library/ms187662.aspx
View 7 Replies
Jan 11, 2007
Hi,
I'd like to know the fastest way to add named instances to SQL server 2005 (I need to add 5 named instances)
Thank you!
John
View 4 Replies
Nov 17, 2006
I am trying to find some info on the fastest connection transport for an app that is running on the same box as a SQL 2005 instance. The app does a large number of updates (high volume of data), Win32 using native ODBC. I am trying to find info on which connection mechanism is best: sockets, pipes, etc. I read somewhere that there is a file-mapped method available but I cannot find info on that either. Also, is there any performance difference between the old SQL ODBC driver and the new SQL Native Client ODBC driver?
Thanks.
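For a same-box client, shared memory is the local transport; a hedged sketch of forcing it explicitly via the protocol prefix in an ODBC connection string (server and database names hypothetical):

Driver={SQL Native Client};Server=lpc:MYSERVER\MYINSTANCE;Database=mydb;Trusted_Connection=yes;

(np: would force named pipes and tcp: TCP/IP for comparison testing; the file-mapped mechanism referred to is essentially what the shared memory protocol provides.)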
View 2 Replies
Jul 30, 2007
Hi All,
I'm bulk loading a ton of data into MS SQL Server 2005 Standard Edition. I used to do this process in version 2000. It seems there is some more overhead in 2005. Is there a way to drop logging to almost nothing to speed up the insert?
This is my current sql statment to load data.
EXEC sp_dboption 'my_stuff', 'select into/bulkcopy', 'true'
SET ANSI_WARNINGS OFF
BULK INSERT mystuff.dbo.[v1]
FROM 'c:\myfile.txt'
WITH
(
FIRSTROW = 1,
FORMATFILE = 'c:\scripts\v1.fmt',
MAXERRORS=2000,
ROWS_PER_BATCH=100000
)
Thanks,
Mike
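A hedged note: in 2005, sp_dboption is deprecated and minimal logging wants the bulk-logged (or simple) recovery model plus a TABLOCK hint. A sketch of those two changes applied to the statement above (it also assumes the target is a heap or empty, which minimal logging requires):

ALTER DATABASE my_stuff SET RECOVERY BULK_LOGGED

BULK INSERT mystuff.dbo.[v1]
FROM 'c:\myfile.txt'
WITH
(
    FIRSTROW = 1,
    FORMATFILE = 'c:\scripts\v1.fmt',
    MAXERRORS = 2000,
    ROWS_PER_BATCH = 100000,
    TABLOCK
)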
View 2 Replies
Nov 14, 2007
I'm writing a program that allows users to upload a csv file. This file is then separated into 4 datatables based on certain criteria, then each datatable is uploaded into my database. I'm essentially adding new rows to the datatables, then running an update command on each using a tableadapter. The problem is that these csv files can be large and can end up with 4000+ new records being added to the database, and the update commands take a while to do it. I've sat for about five minutes on one run while it updated. I put in some time variables to see where all the time is spent: it takes only seconds to parse the data and separate it into the datatables, but minutes on the update commands. Is there a more efficient way to insert this much data?
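TableAdapters issue one INSERT round trip per row, which is where the minutes go. A hedged server-side alternative (all names hypothetical): bulk load the raw file into a staging table, then do the four-way split with set-based inserts. On the .NET side, SqlBulkCopy fills the same role for an in-memory DataTable.

BULK INSERT dbo.csv_staging
FROM 'c:\uploads\batch.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)

-- one set-based insert per destination instead of 4000+ single-row commands
INSERT INTO dbo.TableA (col1, col2)
SELECT col1, col2
FROM dbo.csv_staging
WHERE category = 'A'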
View 4 Replies
Mar 5, 2008
Hi, I have a table with one XML type column. This column holds custom field information. It's used as a way of storing ad hoc fields and data that don't fit the DB design.

<?xml version="1.0"?><contact><Reference>A39390TFH</Reference><Misc>all kinds of stuff go in here</Misc></contact>

I want to provide a way of displaying the data stored in this column in the same DataTable as normal relational data from the same table. I have been able to achieve this goal BUT I want to know if the community has any ideas on how I could speed up the process. I am using the XML value() function. It allows me to extract the data I need.

SELECT Name, Number, Reference FROM
(
SELECT Name, Number, xmlvalues.value(' (contact/Reference)[1] ', ' varchar(40) ') as Reference
FROM MyTable
WHERE Name = 'Some Dude'
AND xmlvalues is not null
) T
GROUP BY Name, Number, Reference

Does anyone know if there are better XML functions to get this data out of the XML column? There is no schema, because each xml fragment has different tags and different values.
Regards, Niall
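One commonly suggested speed-up, as a hedged sketch (XML indexes require the table to have a clustered primary key): a primary XML index lets value() read pre-shredded node data instead of parsing every blob at query time.

CREATE PRIMARY XML INDEX IX_MyTable_xmlvalues
ON MyTable (xmlvalues)

A secondary PATH index on top of it can help further when the same paths, like (contact/Reference)[1], are queried repeatedly.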
View 1 Replies
Mar 15, 2004
If there are 13 million records in one table and 40 thousand records in another table, then what is the fastest way of joining these two tables?
This was a question put to me by somebody which I couldn't answer properly. Could anybody give the answer, with the proper reasons behind it?
Thanx.
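For what it's worth, a hedged sketch of the usual answer (names hypothetical): with an index on the big table's join key the optimizer can drive an index nested-loops join from the 40k-row side; without one it will typically hash join, scanning the 13M rows once. Either way the point is that the small table should be the driving/build side and the big table should never be probed per-row without an index.

CREATE INDEX IX_BigTable_JoinKey ON dbo.BigTable (JoinKey)

SELECT s.JoinKey, b.SomeColumn
FROM dbo.SmallTable s
JOIN dbo.BigTable b ON b.JoinKey = s.JoinKey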
View 7 Replies
Nov 22, 1999
We have a table that we BCP into, the data is then processed and inserted into its appropriate table.
Then the table or its data needs to be removed. This seems to be a very slow operation to remove
the table or table data. I have tried drop table, and truncate table and it takes nearly as long as
the bcp operation. The table has 12 million rows. I didn't think either operation wrote to the
transaction log except for page extent management. Why are the drop and truncate so slow? Suggestions?
View 1 Replies
Mar 19, 2008
Hello,
What is the fastest way to update 20 million records in our database?
I have tried to do a simple update statement like this:
update trail_log with (tablockx, holdlock)
set trail_log.entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id
but it takes 10 plus hours to run since it cannot commit the transactions until the very end. So I was thinking that I need to commit in batches, like after every 50K rows, but that is slow as well.
Set rowcount 50000
Declare @rc int
Set @rc=50000
While @rc=50000
Begin
Begin Transaction
update trail_log With (tablockx, holdlock)
set trail_log.entry_by = users.user_identity
from users
where trail_log.entry_by = users.user_id
and trail_log.entry_by not like '%[0-9]%'
Select @rc=@@rowcount
--Commit the transaction
Commit
End
go
I have let the above statement run for 1.5 hours and it only updated 450,000 rows. Any ideas...
Maybe I'm doing it wrong. Please Help!!
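A hedged observation on the batch version above: the not like '%[0-9]%' predicate cannot use an index, so every 50K batch rescans the table from scratch, which fits the 450,000 rows in 1.5 hours. A supporting index on the join column (name hypothetical) at least lets each batch seek to the remaining rows:

CREATE INDEX IX_trail_log_entry_by ON dbo.trail_log (entry_by)

Dropping the index once this one-off update is finished avoids paying for it afterwards.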
View 1 Replies
Jul 23, 2005
What is the fastest way to generate a SQL 2000 database out of SQL scripts? The SQL scripts contain the create tables, views, stored procedures, triggers, constraints, and the tables' DATA records. What are my options? isql? osql? Are there other ways? Thank you
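A hedged sketch of the osql route (server, database and file names hypothetical; -E is a trusted connection, -i the input script, -b aborts on error):

osql -S MyServer -E -d master  -i create_db.sql      -b
osql -S MyServer -E -d MyNewDb -i create_objects.sql -b
osql -S MyServer -E -d MyNewDb -i load_data.sql      -b

For the data portion specifically, bcp or BULK INSERT from data files is usually much faster than running INSERT scripts.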
View 6 Replies
Jul 20, 2005
I'd like to use a stored procedure to insert large amounts of records into a table. My field A should be filled with a given range of numbers. I do the following ... but I'm sure there is a better (faster) way:

select @start = max(A) from tbl where B = 'test1' and C = 'test2'
while @start <= 500000
begin
insert into tbl (A, B, C)
values (@start, 'test1', 'test2')
set @start = @start + 1
end

Another question is: how can I prevent another user from inserting the same numbers into field A?
Thanks a lot for any help!
ratu
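A hedged sketch of a set-based version, assuming a Numbers utility table holding integers 1..500000 (a common helper table; create it once and reuse it). One insert replaces the loop, and a unique constraint on A answers the concurrency question by making duplicate numbers fail outright:

ALTER TABLE tbl ADD CONSTRAINT UQ_tbl_A UNIQUE (A)

INSERT INTO tbl (A, B, C)
SELECT n.n, 'test1', 'test2'
FROM Numbers n
WHERE n.n > @start
  AND n.n <= 500000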
View 5 Replies
Jul 31, 2007
Hi All,
I have NUnit tests. In each test I create a temp db with data and drop it at the end of the test.
Since I have several connections to it during the test, I do "set single_user" on the db and then drop it, because I can't drop it while others are connected ("others" means other connections).
This takes quite some time. I tried detach but no improvement.
What would be the fastest way to drop a database?
Thanks
Avi
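A hedged sketch of the usual fastest drop (database name hypothetical): disconnect the other sessions as part of the single-user switch, then drop in the same batch, rather than waiting for them to drain.

ALTER DATABASE TestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE TestDb

WITH ROLLBACK IMMEDIATE kicks the other connections off at once; if the drop is still slow, the time usually goes into rolling back whatever work those sessions had open.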
View 3 Replies
Jul 10, 2006
I need to load a lot of data into a sql table as fast as possible. (All data is string data, CSV file). I have read that the "fastest" way to load data is using the "Fast Load" option of the OLEDB Destination but I have also read that the "SQL Server" Destination is fast.
1. Is there a general consensus as to which way is the quickest load?
2. The file is a CSV file. Would a fixed-format file be faster to read?
TIA,
barkingdog
View 3 Replies
Aug 13, 2015
I have a big table (> 40,000,000 rows) in a SQL Server 2008 R2 database. I added a new column of type int to the table. What is the fastest way to populate the column with the value -1? The table can be locked.
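Two hedged sketches, table name hypothetical. Since the table can be locked, adding the column NOT NULL with a default populates it in one statement; if the column already exists as nullable, a batched loop keeps any one transaction (and the log) small:

-- option 1: populate at ADD time
ALTER TABLE dbo.BigTable
ADD NewCol int NOT NULL CONSTRAINT DF_BigTable_NewCol DEFAULT (-1)

-- option 2: column already added as nullable
WHILE 1 = 1
BEGIN
    UPDATE TOP (100000) dbo.BigTable
    SET NewCol = -1
    WHERE NewCol IS NULL
    IF @@ROWCOUNT = 0 BREAK
END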
View 2 Replies
Mar 3, 2006
I was wondering what is the fastest way to UPDATE lots of records. I heard the fastest way to perform lots of inserts is to use SqlCeResultSet. Would this also be the fastest way to update already existing records? If so, is this the fastest way to do that:
1. Create a SqlCeCommand object.
2. Set the CommandText to select the data I want to update
3. Call the command object's ExecuteResultSet method to create a SqlCeResultSet object
4. Call the result set object's Read method to advance to the next record
5. Use the result set object to update the values using the SqlCeResultSet.SetValue method and the Update method.
6. repeat steps 4 and 5
Also, I was wondering: do I call the SqlCeResultSet.Update method once per row, or just once? And would it be possible and faster to wrap all that in a transaction?
Would parameterized updates be faster?
Any help will be appreciated.
View 3 Replies
Jan 11, 2007
Hello SQL colleagues,
I need to add 5 new named instances on a SQL Server 2005 Enterpise Edition.
Is there a fast way to do it?
Best Regards,
John
View 3 Replies
Jul 27, 2006
If I create an index on a field in SQL Server, what will be the most efficient (fastest) field type to index a field? (This field will be a "Pointer" to a child table that will contain a list of codes, and their description.)
Would a Numeric field be quicker than a VarChar field?
VarChar would make it easier for a Human to decipher the raw records. (For example, if I used a numeric the code would be 42 or 47, while the VarChar could be 'savings' or 'checking'.)
Basically I will have the following "Master" table:

Field          Type
-------------  ----------------
ID             Int
Name           VarChar
Status         Int -or- VarChar
Customer_Type  Int -or- VarChar
If Customer_Type is a code that can be looked up in another table, and I index that field, would I want the "Code" to be an Int or VarChar?
SQL: Select *
From Master
Where Customer_Type = <42> or <'savings'>
My Where clause would depend on the field type.
Thank you, Bryan
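For what it's worth, a hedged sketch of the int-code version (names hypothetical): the 4-byte int keeps the index on Master narrow, and the human-readable text lives once in the lookup table instead of in every row:

CREATE TABLE Customer_Type (
    Customer_Type_ID int PRIMARY KEY,
    Description varchar(20) NOT NULL   -- 'savings', 'checking', ...
)

CREATE INDEX IX_Master_Customer_Type ON Master (Customer_Type)

SELECT m.*
FROM Master m
JOIN Customer_Type ct ON ct.Customer_Type_ID = m.Customer_Type
WHERE ct.Description = 'savings'

The narrower int key means more index rows per page, so the int version is generally at least as fast, and the join restores readability when humans need it.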
View 1 Replies
Mar 29, 2004
ASP.NET and MSSQL run on the same machine, a Win2000 server, and the physical memory limit of MSSQL is set to 192MB.
Does anyone have any good idea(s)? Please share them here
:)
View 4 Replies
View Related