13GB MySQL Table...Processing Queries Faster...
May 23, 2007
I have a 13 GB MySQL table with the following definition:
Code:
CREATE TABLE `WikiParagraphs` (
`ID` int(10) unsigned NOT NULL auto_increment,
`Paragraph` text NOT NULL,
PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=73035281 ;
I query it like this:
Code:
select Paragraph from WikiParagraphs where ID in (1,2,3,4)
The 1,2,3,4 bit comes from the Sphinx full-text engine, which gives me the IDs that match my query within about 500 milliseconds.
But retrieving the data itself takes approximately 3-6 seconds.
Obviously, I'd like to speed this query up.
Any ideas?
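Something like this might help narrow down where the time goes (just a sketch, using stock MySQL/InnoDB commands and settings):
Code:
-- Confirm the lookup really is a primary-key lookup and not a scan
EXPLAIN SELECT Paragraph FROM WikiParagraphs WHERE ID IN (1,2,3,4);
-- Check how much of the 13 GB table can actually be cached in memory;
-- if innodb_buffer_pool_size is small, every lookup turns into random disk I/O
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';
If the EXPLAIN shows the primary key being used, the remaining cost is most likely random I/O on the Paragraph text column, so a bigger buffer pool (or fetching the IDs in sorted order) may be where the win is.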
G-Man
View 1 Replies
Feb 13, 2008
I'm reporting from a Microsoft SQL database (poorly documented unfortunately) and would like to find a 3rd party application to assist me in rapidly making Select queries. The ability to browse data in a field from the interface would be a plus.
What are the best alternatives for rapidly creating these queries from some sort of builder or wizard?
TIA.
View 7 Replies
Dec 28, 2001
I've noticed lately that my queries through ADO/VB are taking a lot longer to process at certain times. The query and the result information never change, only that at certain times the query takes a lot longer than usual. I thought that I possibly need more licenses, or it might be network traffic. I currently use MS SQL Server 2000 Small Business Ed. (5 CALs).
Has anyone any information about performance problems due to licensing issues?
Thanks,
Dave Cohoon
View 1 Replies
Jul 22, 2007
I'm working on an app that contains a series of 100 very compute-intensive queries that altogether take 30 minutes on my laptop machine. The app may be atypical in that there is just this series of expensive queries, and not lots of simultaneous simple queries.
To speed up testing, I bought a Dell PE2900 Quad-Core Xeon with 16GB of RAM and four 15000rpm hard drives. It runs 64-bit Windows 2003 server and 64-bit SQL Server 2005 dev edition.
Unfortunately performance is worse than on my laptop. Task manager reports (almost constant) 13GB of available physical memory, while processor usage hovers around 30%. The page file usage is constant at 3 GB.
Where do I start trouble-shooting this problem?
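One place to start might be the wait statistics, to see whether the queries are waiting on disk, memory grants, or parallelism rather than CPU (a sketch against the SQL Server 2005 DMVs):
Code:
-- Top waits since the instance started; CXPACKET suggests parallelism issues,
-- PAGEIOLATCH_* suggests disk I/O, RESOURCE_SEMAPHORE suggests memory grants
SELECT TOP 10 wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;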
View 1 Replies
Dec 23, 2005
If you were doing paging of results on a web page and were interested in grabbing, say, records 10-20 of a result set, but also wanted to know the total # of records in the result set (so you could know the total # of pages in the set):
Would it be better to query the DB table twice, once for Count(*) and again for the records for the current page? Or better to create a temp table, select the records into it, and then get Count(*) and the page results from the temp table?
I saw an example in a book that made a temp table to do this, and to me it seemed like it would be slower. I don't get the reason for a temp table. Anyone have any ideas?
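On SQL Server 2005 one option is to get the page and the total count in a single statement with ROW_NUMBER() and COUNT(*) OVER(); a sketch, with a hypothetical Articles table used purely for illustration:
Code:
-- One round trip: page rows 11-20 plus the total row count, no temp table
WITH Numbered AS (
    SELECT  ID, Title,
            ROW_NUMBER() OVER (ORDER BY ID) AS RowNum,
            COUNT(*)     OVER ()            AS TotalRows
    FROM    Articles                 -- hypothetical table, for illustration only
)
SELECT ID, Title, TotalRows
FROM   Numbered
WHERE  RowNum BETWEEN 11 AND 20;
On SQL Server 2000 the usual pattern is simply two queries (a SELECT COUNT(*) plus the page query); both tend to be cheaper than materializing the whole result set into a temp table.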
View 5 Replies
Dec 13, 1999
Hello,
I was wondering if there was a different approach I should take in appending data to a table...
My destination table has about 94+ million records in it, and I have been taking two approaches to getting new files into this table:
1) I do a data pump task in a DTS to import the file to a trans (temp) table, which is truncated every time, and then do an INSERT INTO statement from the temp table to my destination table.
The import to the trans table only takes a few minutes (about 1-2 million records per file, with short record lengths), but when I do the INSERT INTO statement, it takes upwards of 6 hrs to append.
2) I have tried doing a bulk insert task, going directly to the destination table (which defeats the purpose of my trans table to check out the data beforehand, but I feel the data is clean at this point).
I am running the bulk insert right now, and it's been running for over 3 hours...so I'm going to assume this will take just as long as the INSERT INTO statement did before.
My destination table does not have any indexes on it at all, and I don't need to do any transformations to the data when bringing it into SQL since the data is clean. Also, I have a default value constraint on one of my fields on the destination table.
Plus there are other people and applications hitting the server which could impact the overall processing, but nothing out of the ordinary is going on the server today. I know there are only so many ways to get a file into a table...but maybe someone knows a different way I should try this.
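If the destination really has no indexes, a minimally logged load with a table lock and batching might be worth trying, something like this (a sketch; the file path and delimiters are placeholders, and minimal logging also depends on the database recovery model):
Code:
BULK INSERT dbo.DestinationTable
FROM 'D:\loads\datafile.txt'          -- placeholder path
WITH (
    FIELDTERMINATOR = ',',            -- placeholder format
    ROWTERMINATOR   = '\n',
    TABLOCK,                          -- allows minimal logging on a heap
    BATCHSIZE       = 100000          -- commit in chunks instead of one huge transaction
);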
Thanks for anyone's suggestions!
Kael.
View 1 Replies
Apr 23, 2007
Hello
DTS is notoriously faster than running the following statement:
insert into synonym_MyRemoteTable
select * from myLocalTable
Why is it so?
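One thing that sometimes narrows the gap is pushing the insert to the remote side with OPENQUERY instead of inserting through the synonym/four-part name, so rows are not handed to the provider one at a time (a sketch; the linked server and column names are placeholders):
Code:
INSERT INTO OPENQUERY(MyRemoteServer,
    'SELECT Col1, Col2 FROM dbo.MyRemoteTable WHERE 1 = 0')
SELECT Col1, Col2
FROM myLocalTable;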
Thanks a lot.
View 6 Replies
Jun 21, 2007
hi!
I am using SQL Server 2005 with SP1.
I want to delete 30-40 million rows from a transactional table. What's the fastest way to delete these rows? Just deleting 300,000 rows takes 30 minutes, and I don't want to truncate the table.
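A common approach is to delete in small batches so each transaction (and the log it generates) stays small; a sketch for SQL Server 2005, with the table name and filter as placeholders:
Code:
-- Delete in 10,000-row chunks until nothing is left to delete
WHILE 1 = 1
BEGIN
    DELETE TOP (10000)
    FROM   dbo.MyTransactionTable        -- placeholder table name
    WHERE  TransactionDate < '20070101'; -- placeholder filter

    IF @@ROWCOUNT = 0 BREAK;
END
Frequent log backups (or simple recovery) between batches keep the transaction log from ballooning.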
any help would be appreciated
Thanks
View 9 Replies
Oct 1, 2007
I have a query that has the following structure
Select *
From Table
Where Condition And ... (some 'Exists' conditions)
When I run the query using field names the query gets much slower, and I cannot understand Why!
Select MyField
From Table
Where Condition And ... (some 'Exists' conditions)
I'm talking about three times slower using the Select MyField syntax.
Any ideas???
View 8 Replies
Feb 12, 2007
Good Morning
Has anyone successfully used Cherry's OLE DB provider for MySQL to create a linked server from MS SQL Server 2005 to a Linux Red Hat platform running MySQL?
I cannot get it to work.
I've created a UDL which tests fine. It looks like this:
[oledb]
; Everything after this line is an OLE DB initstring
Provider=OleMySql.MySqlSource.1;Persist Security Info=False;User ID=testuser;
Data Source=databridge;Location="";Mode=Read;Trace="""""""""""""""""""""""""""""";
Initial Catalog=riverford_rhdx_20060822
Can anyone help me convert this to the correct syntax for the stored procedure
sp_addlinkedserver?
I've tried the script below, but it does not work; I just get an error saying it cannot create an instance of OleMySql.MySqlSource.
I used SQL server management studio to create the linked server then just scripted this out below.
I seem to be missing the user ID, but don't know where to put it in.
EXEC master.dbo.sp_addlinkedserver @server = N'DATABRIDGE_OLEDB', @srvproduct=N'mysql', @provider=N'OleMySql.MySqlSource', @datasrc=N'databridge', @catalog=N'riverford_rhdx_20060822'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'collation compatible', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'data access', @optvalue=N'true'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'dist', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'pub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'rpc', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'rpc out', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'sub', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'connect timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'collation name', @optvalue=null
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'lazy schema validation', @optvalue=N'false'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'query timeout', @optvalue=N'0'
GO
EXEC master.dbo.sp_serveroption @server=N'DATABRIDGE_OLEDB', @optname=N'use remote collation', @optvalue=N'false'
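The user ID normally goes into a linked-server login mapping rather than into sp_addlinkedserver itself; something along these lines might be the missing piece (the password is a placeholder):
Code:
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'DATABRIDGE_OLEDB',
     @useself     = 'false',
     @locallogin  = NULL,            -- map all local logins
     @rmtuser     = N'testuser',
     @rmtpassword = N'********';     -- placeholder password
GO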
Many Thanks
David Hills
View 7 Replies
Sep 8, 2006
We have an XML column in a SQL Server 2005 table. Each row of this table contains one XML document.
I want to shred values from the XML documents and process these within a Data Flow. I want the Data Flow to execute once across a record set comprised of all of the XML documents.
I can shred the XML using a For-Each loop and XML Task. I'm kinda stuck on how I then get the data from the variables into a Recordset or similar so that I can process this within a single iteration of a Data Flow.
Or - is my approach incorrect? I seem to be building a verbose and clunky solution to this problem. I know I could accomplish the same in a pretty simple SQL statement using .value on the XML column... am I missing something? Is a SQL query just better suited to this problem?
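For what it's worth, the pure T-SQL version the last paragraph alludes to would look roughly like this (a sketch; the table, column, XPath and element names are assumptions):
Code:
-- Shred every XML document in the table into a relational rowset
SELECT  t.ID,
        x.Item.value('(Name)[1]',  'varchar(100)') AS Name,
        x.Item.value('(Value)[1]', 'int')          AS Value
FROM    dbo.MyXmlTable AS t                          -- hypothetical table
CROSS APPLY t.XmlCol.nodes('/Root/Item') AS x(Item); -- hypothetical XPath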
Any help much appreciated.
James
View 1 Replies
Mar 25, 2008
Hi guys,
I am trying to transfer data from a table in a SQL Server database to an equivalent table in a MySQL database that resides on a different server.
I have previously used a PHP script to do the job, but it proved very slow. I am dealing with a table that is growing and has 9.5 million records in this case, and the procedure may need to be carried out a number of times per day, so I am hoping to find a better solution.
I am wondering if anyone can point me in the direction of an alternative method.
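One approach that tends to be much faster than row-by-row scripts is exporting to a flat file with bcp and loading it with MySQL's bulk loader; a sketch, with paths, names and connection details as placeholders:
Code:
-- 1) On the SQL Server box, dump the table to a tab-delimited file (command line):
--    bcp MyDatabase.dbo.MyTable out C:\transfer\mytable.txt -c -T -S SQLSERVERNAME
-- 2) Copy the file to the MySQL server, then bulk load it:
LOAD DATA INFILE '/tmp/mytable.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
LINES  TERMINATED BY '\r\n';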
Thanks,
Derick
View 1 Replies
Apr 21, 2006
Hello.
I use VS2005 and I was trying to make a little program able to connect to a MySQL 5.0.18 database through MyODBC 3.51.12. I can connect to the database and I can load a table in the DataGridView control, but I can't update the table: the wizard didn't generate the Update method. Even if I reconfigure the TableAdapter, the wizard generates only:
SELECT statement
table mappings
Fill method
Get method
The table is a very simple table: it has 3 columns and the first is a primary key.
If I create an identical table using Access database and make a program that connects through ODBC, I don't have the problem: the wizard generates also the UPDATE, INSERT and DELETE statements.
Where is the difference? What is the problem?
Thank you so much for the help.
Anthony R.
View 7 Replies
Jul 20, 2005
Hello,
I have an application that will be logging user activity from several Windows 2003 terminal servers to a SQL Server 2000 database. This information will be retrieved by monitoring the Security logs of these servers (this part I know how to accomplish already).
A table in the database, tblLogEntries, will contain the following fields:
- ID = autoincrementing int
- LogTime = Date/Time the user activity was recorded in the security log
- Username = User's login ID that the activity was recorded with
- Type = int, referencing a lookup table with the values of Logon, Logoff, and possibly other future items
- Server = The name of the server the activity was recorded on
The only question I have is: can you offer a way to process the total user login time during a given range using T-SQL?
For example, given the table data:
ID LogTime             Username Type   Server
1  10-10-2003 8:30:00  Tom      Logon  SERVER-A
2  10-10-2003 8:45:00  Sarah    Logon  SERVER-A
3  10-10-2003 16:45:00 Tom      Logoff SERVER-A
4  10-10-2003 17:00:00 Sarah    Logoff SERVER-A
5  10-11-2003 8:30:00  Tom      Logon  SERVER-A
6  10-11-2003 8:45:00  Sarah    Logon  SERVER-A
7  10-11-2003 16:30:00 Sarah    Logoff SERVER-A
8  10-11-2003 17:15:00 Tom      Logoff SERVER-A
How would you receive the output:
User  Logon Total Time for SERVER-A
Tom   17.0 hrs
Sarah 16.0 hrs
I know I can handle this type of processing on my ASP.NET front-end, but I'm curious as to how easily it can be done by the database itself.
Thanks in advance for your assistance.
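A rough T-SQL sketch of one way to do it, assuming every Logon row has a later matching Logoff row for the same user and server, and showing Type as text for readability (the real table would join to the lookup):
Code:
SELECT  Username,
        SUM(SessionMinutes) / 60.0 AS TotalHours
FROM (
        SELECT  l.Username,
                DATEDIFF(minute, l.LogTime,
                    (SELECT MIN(o.LogTime)
                     FROM   tblLogEntries o
                     WHERE  o.Username = l.Username
                       AND  o.Server   = l.Server
                       AND  o.Type     = 'Logoff'
                       AND  o.LogTime  > l.LogTime)) AS SessionMinutes
        FROM    tblLogEntries l
        WHERE   l.Type   = 'Logon'
          AND   l.Server = 'SERVER-A'
     ) AS sessions
GROUP BY Username;
A date-range filter on l.LogTime would slot into the inner WHERE clause.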
View 4 Replies
Sep 22, 2013
I want to process data from a source table to a destination table without using a cursor.
At this point we are creating a temp table and inserting data from the source into it. Once we have the data in the temp table, we process the records one by one into the destination table using a while loop.
While executing the stored procedure, we noticed that there are a few records in the source table which are invalid, and the stored procedure terminates on such records.
Is there a better approach to log the INVALID data and resume processing with the next record instead of terminating?
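A set-based alternative might be to split the work into two inserts driven by the same validation rule, so bad rows are logged instead of stopping the batch (a sketch; the validation condition, columns and error table are assumptions):
Code:
-- Good rows go to the destination...
INSERT INTO dbo.DestinationTable (Col1, Col2)
SELECT  s.Col1, s.Col2
FROM    dbo.SourceTable s
WHERE   s.Col1 IS NOT NULL;          -- placeholder validation rule

-- ...bad rows go to an error/log table for review
INSERT INTO dbo.InvalidRowsLog (Col1, Col2, LoggedAt)
SELECT  s.Col1, s.Col2, GETDATE()
FROM    dbo.SourceTable s
WHERE   s.Col1 IS NULL;              -- the inverse of the rule above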
View 7 Replies
Sep 5, 2007
I want to do conditional processing depending on values in the rows of a CTE. For example, is the following kind of thing possible with a CTE?:
WITH Orders_CTE (TerritoryId, ContactId)
AS
(
SELECT TerritoryId, ContactId
FROM Sales.SalesOrderHeader
WHERE (ContactId < 200)
)
IF Orders_CTE.TerritoryId > 3
BEGIN
/* Do some processing here */
END
ELSE
BEGIN
/* Do something else here */
END
When I try this, I get a syntax error near the keyword 'IF'
Any ideas? I know this kind of thing can be done with a cursor but wanted to keep with the times and avoid using one!
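A CTE can only be followed by a single statement, so the branching usually has to move inside that statement, for example as a CASE expression (a sketch built on the CTE above):
Code:
WITH Orders_CTE (TerritoryId, ContactId)
AS
(
    SELECT TerritoryId, ContactId
    FROM   Sales.SalesOrderHeader
    WHERE  ContactId < 200
)
SELECT  TerritoryId,
        ContactId,
        CASE WHEN TerritoryId > 3
             THEN 'Do some processing here'
             ELSE 'Do something else here'
        END AS Action
FROM    Orders_CTE;
If the two branches really need to run different statements, the usual workaround is to materialize the CTE's rows into a table variable or temp table first and then branch on that.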
View 3 Replies
Aug 2, 2006
Dear All,
I have the table creation script and insert record script.
These are in MySQL format.
What changes do I have to make so that I can run these scripts on SQL Server 2000?
If anybody has successfully done this, please tell me the procedure.
CREATE TABLE `activity` (
`id` bigint(20) NOT NULL auto_increment,
`object_type` varchar(60) default NULL,
`object_id` varchar(20) default NULL,
`person_id` bigint(20) NOT NULL default '0',
`activity_dtm` datetime NOT NULL default '0000-00-00 00:00:00',
`activity_type_cd` varchar(25) NOT NULL default '',
`description_code` varchar(200) default NULL,
PRIMARY KEY (`id`),
KEY `FK9D4BF30FB4715636` (`activity_type_cd`),
KEY `FK9D4BF30F270CDEE0` (`person_id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=1 ;
--
-- Dumping data for table `activity`
--
-- --------------------------------------------------------
--
-- Table structure for table `actv_type`
--
CREATE TABLE `actv_type` (
`code` varchar(25) NOT NULL default '',
`description` varchar(100) NOT NULL default '',
`void_ind` char(1) NOT NULL default '',
PRIMARY KEY (`code`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
--
-- Dumping data for table `actv_type`
--
INSERT INTO `actv_type` VALUES ('job_create', 'Created job', 'F');
INSERT INTO `actv_type` VALUES ('job_update', 'Changed job', 'F');
INSERT INTO `actv_type` VALUES ('job_void', 'Voided job', 'F');
INSERT INTO `actv_type` VALUES ('job_activate', 'Activated job', 'F');
INSERT INTO `actv_type` VALUES ('job_deactivate', 'Changed job to deactive', 'F');
INSERT INTO `actv_type` VALUES ('job_appl_create', 'Created application', 'F');
INSERT INTO `actv_type` VALUES ('job_appl_update', 'Updated application', 'F');
INSERT INTO `actv_type` VALUES ('intrv_create', 'Created interview', 'F');
INSERT INTO `actv_type` VALUES ('intrv_update', 'Updated interview', 'F');
INSERT INTO `actv_type` VALUES ('person_update', 'Update person', 'F');
INSERT INTO `actv_type` VALUES ('person_create', 'Create person', 'F');
INSERT INTO `actv_type` VALUES ('company_void', 'Voided company', 'F');
INSERT INTO `actv_type` VALUES ('company_create', 'Created company', 'F');
INSERT INTO `actv_type` VALUES ('company_update', 'Updated Company', 'F');
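For what it's worth, the smaller table converts fairly mechanically: backticks become brackets (or just go away) and the ENGINE/CHARSET clauses are dropped. A sketch of the SQL Server 2000 version:
Code:
CREATE TABLE [actv_type] (
    [code]        varchar(25)  NOT NULL DEFAULT '',
    [description] varchar(100) NOT NULL DEFAULT '',
    [void_ind]    char(1)      NOT NULL DEFAULT '',
    PRIMARY KEY ([code])
)

INSERT INTO [actv_type] VALUES ('job_create', 'Created job', 'F')
-- ...and so on for the remaining INSERT rows, with the backticks removed
The activity table needs a bit more: bigint(20) NOT NULL auto_increment maps to [id] bigint IDENTITY(1,1) NOT NULL, the '0000-00-00 00:00:00' datetime default has no SQL Server equivalent and needs a real default (or NULL), and the two KEY lines become separate CREATE INDEX statements.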
Thanks in Advance.
View 12 Replies
Mar 20, 2007
Does anyone know of a reference site where I can find a reference table to get a better idea of data type conversions that I should be using?
I have a MySQL 5.0 database which had a lot of tables (mostly empty) that I already have gotten transferred to SQL Server 2005. However, I am suspicious of some of the data type conversions that SQL Server did.
I would really like a good web site to bookmark to help me with this if there is such a reference. Can anyone help?
If not, the most specific example I have right now is a MySQL column that is expecting to accept currency and the MySQL data type is "Double". SQL Server 2005 translated this as a "float" data type. I normally use a "decimal" data type.
- - - -
- Will -
- - - -
http://www.strohlsitedesign.com
http://www.servicerank.com/
View 2 Replies
Dec 29, 2003
Can anybody tell me how to import a table with a text column from SQL Server 2000 to MySQL 4.0.17?
I tried this using ODBC connection but got an error message saying, "Query-based Insertion or updating of BLOB values is not supported".
View 9 Replies
Oct 22, 2015
I have a job which executes hourly.
Essentially:
Begin
Truncate table A
Insert into A
(Col1,
Col2,
Col3...
)
Select Value1,
Value2,
Value3...
From Table B
End
The insert operation query takes approximately 3.5 minutes to execute. What's occurring is the Table is immediately truncated, and there are no rows in the table for those 3.5 minutes.
How can I avoid having this gap - where there are no rows in the table for that period of time during the job execution ? The table could be locked, but that doesn't seem like the best solution.
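One pattern that avoids the empty window is to load a staging copy of the table and then swap it in at the very end, so the switch itself takes only an instant (a sketch using sp_rename and an assumed staging table; a synonym repoint or partition switch are variations on the same idea):
Code:
-- Load the new data into a staging table while A stays fully readable
TRUNCATE TABLE A_Staging;

INSERT INTO A_Staging (Col1, Col2, Col3)
SELECT Value1, Value2, Value3
FROM   TableB;

-- Swap the tables in one short transaction; readers see old or new data, never nothing
BEGIN TRAN;
    EXEC sp_rename 'A',         'A_Old';
    EXEC sp_rename 'A_Staging', 'A';
    EXEC sp_rename 'A_Old',     'A_Staging';
COMMIT TRAN;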
View 8 Replies
Jun 15, 2004
In Oracle, an ALTER statement with a MODIFY clause can change more than one column at a time. Is this possible in SQL Server?
Eg :-
ALTER TABLE test1
MODIFY col1 VARCHAR2(1024)
MODIFY col2 VARCHAR2(256)
MODIFY col3 VARCHAR2(256)
Please give the equivalent for the above in SQL Server. Can all of this exist in a single statement in SQL Server?
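In SQL Server the clause is ALTER COLUMN, and it only takes one column at a time, so the equivalent is three statements (they can still be sent as one batch):
Code:
ALTER TABLE test1 ALTER COLUMN col1 VARCHAR(1024);
ALTER TABLE test1 ALTER COLUMN col2 VARCHAR(256);
ALTER TABLE test1 ALTER COLUMN col3 VARCHAR(256);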
View 2 Replies
Jul 23, 2005
In Access, if I want to update one table with information from another, all I need to do is create an Update query with the two tables, link the primary keys, and reference the source table(s)/column(s) with the destination table(s)/column(s). How do I achieve the same thing in SQL?
Regards,
Colin
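In T-SQL the same thing is usually written as an UPDATE with a FROM/JOIN clause; a sketch with hypothetical table and column names:
Code:
UPDATE  d
SET     d.Description = s.Description          -- destination column = source column
FROM    dbo.DestinationTable AS d
JOIN    dbo.SourceTable      AS s
        ON s.ID = d.SourceID;                  -- link the keys, as in the Access query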
View 3 Replies
Mar 3, 2005
I have an internal Project Management and Scheduling app that I wrote internally for my company. It was written to use MySQL running on a Debian server, but I am going to move it to SQL Server 2000 and integrate it with our Accounting software. The part I am having trouble with is the user login portion. I previously used this:
PHP Code:
$sql = "SELECT * FROM users WHERE username = "$username" AND user_password = password("$password")";
Apparently the password() function is not available when accessing SQL Server via ODBC. Is there an equivalent function I could use isntead so the passwords arent plaintext in the database? I only have 15 people using the system so a blank pwd reset wouldn't be too much trouble.
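There is no documented password() equivalent in SQL Server 2000; if the move ends up on SQL Server 2005 or later, HASHBYTES can fill the same role (a sketch only; on 2000 the common fallback is hashing in the application before the value reaches the database):
Code:
-- @username / @password are parameters from the application
-- user_password would need to be varbinary(20) to hold a SHA-1 hash
UPDATE users
SET    user_password = HASHBYTES('SHA1', @password)
WHERE  username = @username;

-- ...and compare the same way at login time
SELECT *
FROM   users
WHERE  username      = @username
  AND  user_password = HASHBYTES('SHA1', @password);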
View 7 Replies
Nov 20, 2006
The dynamic SQL is used for a linked server. Can someone help? I'm getting an error.
CREATE PROCEDURE GSCLink
( @LinkCompany nvarchar(50), @Page int, @RecsPerPage int )
AS
SET NOCOUNT ON
--Create temp table
CREATE TABLE #TempTable
( ID int IDENTITY, Company nvarchar(50), AcctID int, IsActive bit )
INSERT INTO #TempTable (Name, AccountID, Active)
--dynamic sql
DECLARE @sql nvarchar(4000)
SET @sql = 'SELECT a.Name, a.AccountID, a.Active
FROM CRMSBALINK.' + @LinkCompany + '.dbo.AccountTable a
LEFT OUTER JOIN CRM2OA.dbo.GSCCustomer b
ON a.AccountID = b.oaAccountID
WHERE oaAccountID IS NULL
ORDER BY Name ASC'
EXEC sp_executesql @sql
--Find out the first and last record
DECLARE @FirstRec int
DECLARE @LastRec int
SELECT @FirstRec = (@Page - 1) * @RecsPerPage
SELECT @LastRec = (@Page * @RecsPerPage + 1)
--Return the set of paged records, plus an indication of more records or not
SELECT *, (SELECT COUNT(*) FROM #TempTable TI WHERE TI.ID >= @LastRec) AS MoreRecords
FROM #TempTable
WHERE ID > @FirstRec AND ID < @LastRec
Error:
Msg 156, Level 15, State 1, Procedure GSCLink, Line 22
Incorrect syntax near the keyword 'DECLARE'.
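The parser stops because INSERT INTO ... (column list) has to be followed immediately by its SELECT/VALUES/EXEC, not by a DECLARE. Building the string first and then feeding the EXEC to the INSERT is one way around it (a sketch; note the insert column list is also made to match the temp table's actual column names):
Code:
-- Build the dynamic statement first...
DECLARE @sql nvarchar(4000)
SET @sql = 'SELECT a.Name, a.AccountID, a.Active
            FROM CRMSBALINK.' + @LinkCompany + '.dbo.AccountTable a
            LEFT OUTER JOIN CRM2OA.dbo.GSCCustomer b
                 ON a.AccountID = b.oaAccountID
            WHERE b.oaAccountID IS NULL
            ORDER BY a.Name ASC'

-- ...then let the INSERT consume its result set directly
INSERT INTO #TempTable (Company, AcctID, IsActive)
EXEC sp_executesql @sql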
View 3 Replies
May 4, 2007
What is the best approach to handle this situation? I have three different databases, each of which has its own stored procedure. I need to call them all at page load and piece the data together. The common denominator is the date.
2007     JAN   FEB   MAR   APR
row 1     50    60    89    63
row 2     44    21    62    46

2006     JAN   FEB   MAR   APR
row 1     60    90    65    41
row 2    984   650   452   762
Row 1 and Row 2 come from two different databases and stored procedures.
How can I query the data and present it as it's shown above?
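One way is to capture each stored procedure's output with INSERT ... EXEC into a common temp table keyed by year and row label, then select it back in the shape the grid expects. A sketch only; the procedure names, parameters and column shapes below are placeholders:
Code:
CREATE TABLE #Results (
    SourceYear int,
    RowLabel   varchar(20),
    JAN int, FEB int, MAR int, APR int
);

-- Each proc is assumed to return rows shaped like the columns above
INSERT INTO #Results EXEC Database1.dbo.usp_GetRow1 @Year = 2007;
INSERT INTO #Results EXEC Database2.dbo.usp_GetRow2 @Year = 2007;

SELECT *
FROM   #Results
ORDER BY SourceYear DESC, RowLabel;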
Thank you!
View 10 Replies
Feb 6, 2006
Hi,
I wanted to know a query which will create a final result table from a combination of select queries.
The select query is like :
1. select col1 , col2 , null from table1
2. select null , col2 , null from table2
3. select null , null , col3 from table 3.
NULLs are inserted because I wanted a single select query which will merge all the columns from all the tables and finally create a result table.
Thanks in advance.
View 1 Replies
Jan 4, 2015
I was messing around with stored procedures and I was wondering if creating a SP that populates a single table for reporting is a good idea?
I have ~10 queries that I currently have to run manually, and was hoping to drop their output into a physical table and then leverage that single table to pull into Excel.
Some of my queries use virtual tables or CTEs; this is to get the aggregates set up correctly.
Essentially I am working out of a data warehouse and would like to eventually get all my queries into one SP or something similar and then call that for the insert.
Speaking of which, could you create an SP that has several selects and then drops that output into a single table by using an INSERT INTO query, so the data from all the queries lines up in the right columns?
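Yes; a stored procedure can simply run each of the existing queries as its own INSERT INTO ... SELECT, as long as every SELECT returns columns in the same order and types as the reporting table. A sketch with placeholder names:
Code:
CREATE PROCEDURE dbo.usp_RefreshReportTable
AS
BEGIN
    SET NOCOUNT ON;

    TRUNCATE TABLE dbo.ReportTable;          -- placeholder reporting table

    -- Each existing query (CTEs included) just gains an INSERT on top
    INSERT INTO dbo.ReportTable (Metric, Period, Amount)
    SELECT 'Sales', Period, SUM(Amount)
    FROM   dbo.FactSales                     -- placeholder source
    GROUP BY Period;

    INSERT INTO dbo.ReportTable (Metric, Period, Amount)
    SELECT 'Returns', Period, SUM(Amount)
    FROM   dbo.FactReturns                   -- placeholder source
    GROUP BY Period;
END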
View 1 Replies
Feb 9, 2015
I am trying to insert query text into a table such as the following, but I am not able to insert it.
create table test (txt Varchar(200))
insert into test values('select *from master.dbo.sysprocesses WHERE status NOT IN('sleeping','background')')
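The insert fails because the single quotes inside the string terminate it early; doubling them up is the usual fix:
Code:
insert into test values('select * from master.dbo.sysprocesses WHERE status NOT IN(''sleeping'',''background'')')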
View 4 Replies
Oct 17, 2013
I have 3 queries pulling from the same table, trying to get a count for each criterion. I have the data pulling, but the data is in multiple rows. I want the data in one row, with the counts in separate columns. Also, I need to set up a flag if a client placed an order within 30 days of their last purchase.
I am doing this select for each of the credit card, check and cash purchases. I do not know how to set up a flag where the client may have ordered and paid by check or cash after 30 days from a credit card purchase. Is this something that can be done?
select
clientnumber,count(distinct clientnumber) as cccnt,
0 as ckcnt, 0 as cacnt
from dbo.purchases
where orderdate >= 20120101 and orderdate <= 20121231 and
payment_type = 'CC'
group by clientnumber;
OUTPUT currently looks like this:
1234 2 0 0
1234 0 1 0
1234 0 0 4
Is it possible to result in this, along with a flag with the criteria above?:
1234 2 1 4 Y
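The three separate counts can be collapsed into one row per client with conditional aggregation (a sketch; the check and cash type codes are assumptions). The 30-day flag needs each order's previous order date, for example via a self-join on the same table, so it is left out of the sketch:
Code:
SELECT  clientnumber,
        SUM(CASE WHEN payment_type = 'CC'    THEN 1 ELSE 0 END) AS cccnt,
        SUM(CASE WHEN payment_type = 'CHECK' THEN 1 ELSE 0 END) AS ckcnt,  -- assumed type code
        SUM(CASE WHEN payment_type = 'CASH'  THEN 1 ELSE 0 END) AS cacnt   -- assumed type code
FROM    dbo.purchases
WHERE   orderdate >= 20120101 and orderdate <= 20121231
GROUP BY clientnumber;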
View 3 Replies
May 29, 2006
Hi friends,
when I write multi-table queries which involve two tables which are joined via a bridging table (M:N),
do I just join the tables or do I have to reference the bridging table as well in the queries?
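The bridging table does have to appear in the query; it is what carries the foreign keys that connect the two sides. A sketch using the classic students/courses shape (the names are purely illustrative):
Code:
SELECT  s.StudentName, c.CourseName
FROM    Students       s
JOIN    StudentCourses sc ON sc.StudentID = s.StudentID   -- bridging table
JOIN    Courses        c  ON c.CourseID   = sc.CourseID;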
Cm
View 3 Replies
Jul 23, 2005
I'm new to ADP w/ SQL Server but I have to use it on a project I'm doing... One of the MUSTS for this project is the ability to update a 00 - 09 text value with the appropriate text description from another table... Easy as pie in .mdb. Of course, in the stored procedure it barks at me and tells me that an update query can only have one table... ouch that hurts...
I'm currently reading on the subject but this group has been very helpful in the past.
I found this link: http://www.sqlservercentral.com/col...stheeasyway.asp
Unfortunately I'm using MSDE, not Enterprise, so I don't think I can use the Query Analyzer. But I tried it in my Access ADP anyway and it barked at me.
I tried to go from this:
SELECT dbo.LU_SEX.SEX_CODE, dbo.TEST.DEFECTS_DP1
FROM dbo.TEST INNER JOIN dbo.LU_SEX ON dbo.TEST.SEX_DP1 = dbo.LU_SEX.SEX_DEC
To this:
UPDATE dbo.TEST.SEX_DP1
SET dbo.TEST.SEX_DP1 = dbo.LU_SEX.SEX_CODE
FROM dbo.LU_SEX INNER JOIN dbo.TEST ON dbo.LU_SEX.SEX_DEC = dbo.TEST.SEX_DP1
Maybe I need a good book on this?
Thanks,
Charles
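For what it's worth, the usual T-SQL shape puts only the table being changed after UPDATE and does the join in the FROM clause, so the attempt above would become something like this (a sketch; it assumes SEX_DP1 currently holds the code that matches LU_SEX.SEX_DEC):
Code:
UPDATE  dbo.TEST
SET     SEX_DP1 = dbo.LU_SEX.SEX_CODE
FROM    dbo.TEST
        INNER JOIN dbo.LU_SEX
            ON dbo.LU_SEX.SEX_DEC = dbo.TEST.SEX_DP1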
View 2 Replies
Nov 1, 2007
Hi,
I currently have three queries running separately which I'd like to join up.
However I'm sure it can be done.
Here goes:
I have one table which lists payments made (it stores a PaymentID from another table, a payment amount and an InvoiceID).
Therefore there can be several records with the same PaymentID referencing different invoices (i.e. a user paying off several invoices with one payment).
I have one query which lists all of the invoices paid against one payment.
SELECT PaymentID, InvoiceID, PaymentAmount WHERE PaymentID = xxx AND PayType <> 'Credit'
I then have a second query which is ran against each individual invoice which shows the other payments which have been made against this invoice already
SELECT PaymentID, InvoiceID, PaymentAmount WHERE PaymentID <> xxx AND PayType <> 'Credit' AND InvoiceID = XXX
and final I have one query which lists the credits
SELECT PaymentID, InvoiceID, PaymentAmount WHERE PayType = 'Credit' AND InvoiceID = XXX
All of the above lets me see a payment, which invoices have been paid against that payment, and then for each invoice, any other payments which were made beforehand, and finally any credits against that invoice.
I run these from an ASP page in a loop, which is a pretty inefficient way of doing it.
I would much prefer to amalgamate the three queries above so I could see what I am paying now, what had already been paid and what was credited against each invoice for a PaymentID... all in one query.
Is that possible?
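One way to fold the three queries into a single result set is to keep the shared filters, classify each row, and group by invoice on the page; a sketch, assuming the rows live in a single Payments table (the snippets above omit the FROM clause, so the table name is a placeholder):
Code:
SELECT  InvoiceID,
        PaymentID,
        PaymentAmount,
        CASE
            WHEN PayType = 'Credit'     THEN 'Credit'
            WHEN PaymentID = @PaymentID THEN 'ThisPayment'
            ELSE                             'PriorPayment'
        END AS PaymentRole
FROM    dbo.Payments                               -- assumed table name
WHERE   InvoiceID IN (SELECT InvoiceID             -- every invoice touched by this payment
                      FROM   dbo.Payments
                      WHERE  PaymentID = @PaymentID
                        AND  PayType  <> 'Credit')
ORDER BY InvoiceID, PaymentRole;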
Thanks
mtm81
View 5 Replies
Jul 26, 2007
I'm working with a table with about 60 million records. This monster is growing every minute of the day as well, by 200,000 - 300,000 records/day. It's 11 columns wide, and has one index on a datetime column. My task is to create some custom reports based on three of these columns, including the datetime one.
The problem is response time. Any query executed on this table takes forever--anywhere between 30 seconds and 4 minutes. Queries such as this one below, as simple as it is, can take a minute or more:
select
count(dt_date) as Searches
from
SearchRecords
where
datediff(day,getdate(),dt_date)=0
As the table gets larger and larger, the response time is going to get worse and worse. Long story short, what are my options to get the speed of queries down to just a few seconds with a table this big? So far the best I can come up with is to index any other appropriate columns (of which there is one for sure, maybe two).
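Part of the cost is that wrapping dt_date in DATEDIFF stops the existing index on dt_date from being used; rewriting the filter as a plain range usually lets the query seek instead of scan. A sketch of the same "today only" report:
Code:
-- Same filter, but sargable: the index on dt_date can be used
DECLARE @today datetime
SET @today = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0)   -- midnight today

SELECT  COUNT(dt_date) AS Searches
FROM    SearchRecords
WHERE   dt_date >= @today
  AND   dt_date <  DATEADD(day, 1, @today);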
View 6 Replies