Help With Query To Reduce Records
Dec 13, 2007
I want to write a quick one-time query to create a new table based on an existing table. The idea is to make the new table more efficient by reducing the number of records: consecutive id ranges that share the same country should be collapsed into a single row. See the example below (a sketch of one possible query follows the example).
Original table:
id1 id2 country
1 2 US
3 4 AU
5 6 US
7 8 US
9 10 PE
11 12 PE
13 14 US
15 16 US
17 18 US
19 20 US
21 22 US
Desired result:
id1 id2 country
1 2 US
3 4 AU
5 8 US
9 12 PE
13 22 US
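One possible query (a sketch, not tested against your schema; the names dbo.SourceTable and dbo.CollapsedTable are assumptions) uses the gaps-and-islands trick: the overall row number minus the per-country row number stays constant within each consecutive run of the same country, so grouping on that difference collapses each run.
;WITH Ordered AS (
    SELECT id1, id2, country,
           ROW_NUMBER() OVER (ORDER BY id1) AS rn,
           ROW_NUMBER() OVER (PARTITION BY country ORDER BY id1) AS rn_country
    FROM dbo.SourceTable
)
SELECT MIN(id1) AS id1, MAX(id2) AS id2, country
INTO dbo.CollapsedTable
FROM Ordered
GROUP BY country, rn - rn_country -- rows in the same consecutive run share this value
This needs SQL Server 2005 or later for ROW_NUMBER; on SQL 2000 a temp table with an IDENTITY column can play the same role.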
View 10 Replies
May 22, 2007
Hi There,
I'm relatively new to AS2005, so you'll have to excuse me if there is a simple solution that has been overlooked.
Here is the situation: we have a fact table linked to several dimensions, one of them being a generic date dimension shared by a number of fact tables. The Date dimension has all dates from 1901 to 2100, but the fact table I am querying only has records from 2005 to the present. Once the cube is processed and I open it in Excel (for example), it lists every date from the Date dimension. Is there any way to limit the dates processed into the cube to just those that appear in the fact table (in essence, what would be an INNER JOIN in T-SQL)?
Hope this makes sense,
Regards,
Tobias
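One common workaround (a sketch only, not AS2005-specific guidance; the table and column names below are assumptions) is to base this cube's Date dimension on a view that only returns dates actually present in the fact table, which gives the INNER JOIN effect at processing time:
CREATE VIEW dbo.vw_DimDate_Used
AS
SELECT d.*
FROM dbo.DimDate AS d
WHERE EXISTS (SELECT 1 FROM dbo.FactSales AS f WHERE f.DateKey = d.DateKey)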
View 7 Replies
Feb 26, 2008
Hi, I have a task at hand to reduce the time taken for a search query to execute. The query fetches records which then have to be sorted by degrees away from the logged-in user. I have a function which calculates the degrees, but using it in the search query slows the execution down to about 10 seconds, which is unacceptable. Please advise; your help is much appreciated. For more details please see: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=97021
Thanks, Isfaar
View 6 Replies
Nov 13, 2013
I have a SELECT query with 10 joins across different tables and I want to improve its performance. How do I combine data from these tables without using so many joins?
View 9 Replies
Mar 20, 2014
I need help writing a query for the following: I need to collapse continuity. If the termdate of a record is one day less than the effdate of the next record (for the same ID), I need to collapse the records. See the example below. How should I write a query that gives the desired output, i.e., returns min(effdate) and max(termdate) for each chain of records where each termdate is one day before the next effdate? (A sketch of one approach follows the example.)
ID effdate termdate
556868 1999-01-01 1999-06-30
556868 1999-07-01 1999-10-31
556869 2002-10-01 2004-01-31
556872 1999-02-01 2000-08-31
556872 2000-11-01 2004-01-31
556872 2004-02-01 2004-02-29
output should be ......
ID effdate termdate
556868 1999-01-01 1999-10-31
556869 2002-10-01 2004-01-31
556872 1999-02-01 2000-08-31
556872 2000-11-01 2004-02-29
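A sketch of one approach (SQL Server 2005 syntax; the table name dbo.Enrollment is an assumption): flag each row that does not continue the previous row, use a running count of those flags as a group number, then aggregate per group.
;WITH Ordered AS (
    SELECT ID, effdate, termdate,
           ROW_NUMBER() OVER (PARTITION BY ID ORDER BY effdate) AS rn
    FROM dbo.Enrollment
),
Grouped AS (
    SELECT o.ID, o.effdate, o.termdate,
           (SELECT COUNT(*)
            FROM Ordered p
            WHERE p.ID = o.ID AND p.rn <= o.rn
              AND NOT EXISTS (SELECT 1 FROM Ordered q
                              WHERE q.ID = p.ID AND q.rn = p.rn - 1
                                AND DATEADD(DAY, 1, q.termdate) = p.effdate)) AS grp
    FROM Ordered o
)
SELECT ID, MIN(effdate) AS effdate, MAX(termdate) AS termdate
FROM Grouped
GROUP BY ID, grp
ORDER BY ID, MIN(effdate)
Against the sample data this returns the four rows shown in the desired output.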
View 0 Replies
Apr 29, 2015
I want to create an index on a hash (temporary) table (#TEMPJOIN2) to reduce the run time of an update query, but I am getting the warning: "The maximum key length is 900 bytes. The index 'R5IDX_TMP' has maximum length of 1013 bytes. For some combination of large values, the insert/update operation will fail." What is the right way to create an index on a temporary table?
The update query runs for 6 hours 30 minutes without the index; my aim is to reduce that run time by creating one.
I am also not sure whether creating an index on more columns will cause problems.
The update query and index query are attached.
CREATE NONCLUSTERED INDEX [R5IDX_TMP] ON #TEMPJOIN2
(
[PART] ASC,
[ORG] ASC,
[SPLRNAME] ASC,
[REPITEM] ASC,
[RFQ] ASC,
[Code] ....
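One way to stay under the 900-byte limit (a sketch; which columns belong in the key depends on the UPDATE's join and WHERE clauses, so the split below is illustrative) is to keep only the narrow columns used for seeking in the index key and carry the wide ones as INCLUDEd columns, which do not count toward the key-length limit:
CREATE NONCLUSTERED INDEX [R5IDX_TMP] ON #TEMPJOIN2
(
[PART] ASC,
[ORG] ASC,
[REPITEM] ASC
)
INCLUDE ([SPLRNAME], [RFQ])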
View 7 Replies
Jan 18, 2008
I have a query similar to the following. The intent of this query is to retrieve the top 6 records meeting the specified criteria (LOGTYPENAME = 'Process Status Start' OR LOGTYPENAME = 'Process Status End') based on the most recent dates. Please keep in mind that I expect to return up to 6 records for each unique LogProcessName; there could be thousands of different LogProcessNames, with up to 6 records for each.
1) The table I am executing against is currently very large, so queries against it take a long time. It would seem there must be a more efficient query to get the results I am looking for.
2) CTE doesn't work on SQL 2000. I need a query that does.
3) I cannot modify the database itself in the process.
;WITH cte AS (
SELECT [LogProcessName], [LogBody], [LogDate], [LogGUID], row_number()
OVER(PARTITION BY [LogProcessName]
ORDER BY [LogDate] DESC)
AS RN
FROM [LOGTABLE]
WHERE [LogTypeGUID] IN (
SELECT LogTypeGUID
FROM LOGTYPE
WHERE LogTypeName = 'Process Status Start'
OR LogTypeName = 'Process Status End' ) )
SELECT *
FROM cte
WHERE RN = 1 OR RN = 2 OR RN = 3 OR RN = 4 OR RN = 5 OR RN = 6
ORDER BY [LogProcessName] DESC, [LogDate] DESC
Does anybody else have any idea that would yield the results that I am looking for and take into account items 1-3 above?
Thanks in advance.
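A SQL 2000-compatible sketch of the same idea (no CTE or ROW_NUMBER; it reuses the table and column names from the query above, so treat it as a starting point rather than a drop-in): keep a row only if it is among the 6 most recent qualifying rows for its LogProcessName.
SELECT l.[LogProcessName], l.[LogBody], l.[LogDate], l.[LogGUID]
FROM [LOGTABLE] AS l
WHERE l.[LogTypeGUID] IN (SELECT LogTypeGUID FROM LOGTYPE
                          WHERE LogTypeName IN ('Process Status Start', 'Process Status End'))
  AND l.[LogGUID] IN (SELECT TOP 6 l2.[LogGUID]
                      FROM [LOGTABLE] AS l2
                      WHERE l2.[LogProcessName] = l.[LogProcessName]
                        AND l2.[LogTypeGUID] IN (SELECT LogTypeGUID FROM LOGTYPE
                                                 WHERE LogTypeName IN ('Process Status Start', 'Process Status End'))
                      ORDER BY l2.[LogDate] DESC)
ORDER BY l.[LogProcessName] DESC, l.[LogDate] DESC
On a large table this only performs acceptably with an index on (LogProcessName, LogDate); without one the correlated TOP 6 subquery will scan.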
View 4 Replies
Aug 17, 2001
I need to reduce the size of a database from its originally allocated size of 2.0 GB to 1.0 GB so that I can allocate more space to another database. How can I do this? Thanks,
Ravi.
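A sketch of the usual command (the logical data file name below is an assumption; SELECT name FROM sysfiles run in that database shows the real one), assuming the data actually fits in 1 GB:
DBCC SHRINKFILE (MyDatabase_Data, 1024) -- target size in MB, run in the context of that database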
View 2 Replies
Mar 16, 2007
Basically, what I'm doing is storing answers to questions in a survey. I have two ways I can organize the table:
1. Just having a table with lots of columns - one for each question
2. A table with only about maybe 3 columns:
SurveyID
QuestionID
QuestionAnswer
In this second case, the primary key would be SurveyID and QuestionID combined, of course.
I don't fully know the pros and cons of the two approaches, and both look like they would work. Right now, I'm using option 1, but I keep wondering if option 2 might be better. Whenever I change the questions, I currently have to drop and recreate the table (altering it is too much effort), and I know option 2 would be a way of avoiding that. By the way, the questions themselves are stored in an xml file, if it means anything. Anyhow, once the survey is being used, there shouldn't be any further changing of questions. And there's also just too much I don't know about (how is performance affected, for example?).
Any ideas which is better and why?
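For reference, option 2 is just a narrow answers table keyed by survey and question (a sketch; names and types are illustrative):
CREATE TABLE dbo.SurveyAnswer
(
SurveyID int NOT NULL,
QuestionID int NOT NULL,
QuestionAnswer nvarchar(1000) NULL,
CONSTRAINT PK_SurveyAnswer PRIMARY KEY (SurveyID, QuestionID)
)
Adding or removing a question then only changes the question metadata (your XML file), never the table definition.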
View 9 Replies
Apr 10, 2008
I have a search function, which searches across many tables.
It's a pretty heavy SPROC, and I'm wondering, in general, what are the best ways to reduce deadlocks? It's used a fair bit, and although I haven't noticed problems with it myself, there are definitely a decent number of deadlocks showing up in the log files.
I've always assumed this is something really difficult, and avoided it like the plague.
Any tips are much appreciated !
thanks,
mike123
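One option that often cuts reader/writer deadlocks (a sketch, not a blanket recommendation; MyDatabase is a placeholder, and it requires SQL Server 2005 or later plus extra tempdb space for the version store) is to switch the database to read committed snapshot so readers no longer block writers:
ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE
Beyond that, the usual advice applies: keep transactions short, touch tables in a consistent order, and make sure the search predicates are covered by indexes so the SPROC locks fewer rows.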
View 9 Replies
May 16, 2007
I'm new to SQL Server maintenance. I need some disk space now. In my database, I have some tables with 7-year-old data that I can get rid of without any consequences. Could you please tell me the best course of action? I was thinking about:
1. Dropping those tables.
2. Recreating them again
3. Running ShrinkDB on entire DB.
However, this solution does not sound right or elegant to me. Could you please help me with that ? Thanks
View 12 Replies
Aug 16, 2006
Hello to everyone
I am running MS SQL 2005 Express and I get 2-4 hacker attacks per day trying to log in as 'sa'.
Sometimes there are 37 attempts per second, and one attack went on for 4 days.
Is there some setting in MS SQL 2005 to reduce that?
Can you recommend a good firewall for DDoS attacks?
Is there some legal action I can take against these people? I have their IPs; most are from the US and Canada.
Thank you in advance
val
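The usual first steps on the SQL Server side (a sketch; the replacement name is just an example) are to rename and disable the well-known sa login so the brute-force attempts cannot succeed, and to stop exposing port 1433 directly to the internet:
ALTER LOGIN sa WITH NAME = [sa_renamed];
ALTER LOGIN [sa_renamed] DISABLE;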
View 9 Replies
Jun 7, 2008
Hi, my database's .ldf (log) file is very big. How can I decrease its size?
Thanks.
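A sketch of the usual sequence (the database and logical log file names are assumptions; in SIMPLE recovery the log backup step is unnecessary): back up the log so its inactive portion can be reused, then shrink the physical file.
BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.trn'
DBCC SHRINKFILE (MyDatabase_Log, 512) -- target size in MB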
View 2 Replies
Mar 29, 2001
Hello. Can anybody help me with this?
I have a SQL Server 7.0 instance where the tempdb database has a size of 21 GB, and the space available is only 2 MB less than 21 GB (so it is nearly empty).
How can I shrink, reduce, or compact it?
I have just tried backups and truncating the log, but nothing changed.
Bye, JuanSa.
View 9 Replies
Jan 13, 2000
I'm working with SQL Server 7.0 and I have created a database.
I have objects and data in my database.
My database takes 300 MB and I only need about 50.
How can I reduce my database to 50 MB?
Is it possible?
Any advice will be greatly appreciated.
Nauj
View 2 Replies
Mar 19, 2000
My database's transaction log has grown to 1.7 GB. Can I reduce its size? I have tried shrinking the database, setting the truncate-on-checkpoint option, and taking a backup afterwards, but nothing helps. Please advise.
View 1 Replies
Dec 9, 1999
I am using SQL 7, SP1 on NT 4. The .LDF file has grown to 1.1 GB. I ran DBCC SQLPERF(LOGSPACE), and the used portion of the log is 2%. When I run DBCC SHRINKDATABASE and DBCC SHRINKFILE, the log file does not reduce in size. How do I get the inactive virtual log files released back to the system? Is there a way to tell whether all the virtual log files are active and therefore preventing the file from shrinking? Any help is greatly appreciated.
View 3 Replies
Aug 17, 2000
I had a database that’s comprised of different file groups and log files spread out among different hard drives. I have recently upgraded the database to SQL 7.0 on a RAID 10 volume. I would like to consolidate all the file groups and files as well as various log files into one primary datafile and logfile. How do I do that? Thanks in advance.
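A sketch of the usual mechanism (file names are assumptions): DBCC SHRINKFILE with EMPTYFILE migrates a data file's contents to the other files in the same filegroup, after which the empty file can be dropped. Objects sitting in secondary filegroups have to be rebuilt or moved onto PRIMARY first, since data only moves within a filegroup.
DBCC SHRINKFILE (MyDatabase_Data2, EMPTYFILE)
ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_Data2
Extra log files can be dropped with ALTER DATABASE ... REMOVE FILE once they no longer contain active log records.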
View 2 Replies
Feb 5, 1999
Can anyone suggest a method to reduce the size of one of our log devices? The DB was set up initially at 500 MB with a log size of 1 GB (a typo by the client). We would like to reduce the log to 100 MB if possible. Our environment is SQL Server 6.5 with Service Pack 3.
thanks,,,brad
View 1 Replies
May 25, 2002
Dear All,
I am running SQL 2000 (EV) on NT 4.0. My problem is that the transaction log (*.ldf) file is 3 GB in size and I want to reduce it to 2 GB. Can anyone help me with this?
Note: I have already taken a backup of the transaction log with the truncate-after-backup option, but the size did not get reduced.
View 4 Replies
Aug 9, 2004
HI all,
I have a database with the following allocation:
Data: 7300 MB
Log: 2000 MB
But only used
Data: 5500MB
Log: 50MB
How can I free the unused space in the transaction log? The database is getting too big.
Thank you for your suggestion.
View 1 Replies
Feb 11, 2004
Hi all,
I know this topic has been discussed in the past, but I still don't quite get it, so please be patient with me.
The database was created with automatic growth of 10% and unrestricted file growth.
database size = 5 Gb used 4.5 GB (taskpad)
transaction log=8GB used 54MB (taskpad) 7.5 GB free
database run in FULL mode
full backup nightly, transaction log backup every 30 min
What should I do to free up the space that is not used in the transaction log?
Thanks for your help.
View 7 Replies
Apr 2, 2008
Hello all,
My function below takes a lot of time on every execution: it takes 0.18 sec to retrieve 984 rows.
Can anyone help me reduce the execution time?
"Create function [dbo].[Fn_Get_Consensus_Curve_41_Data]
(@p_Location_Code nvarchar(10), @p_Sector_Id int, @p_Match_Date DateTime ,@p_UserID int , @p_CustId int)
RETURNS @Temp_Curve_Submission_Data table
(
Location_Code nvarchar(10),
Sector_Id int,
MatchDate datetime,
EntityId int,
CustomerId int,
MaturityDate datetime,
Cust_Price float,
Bid_Price float,
Offer_Price float,
Consensus_Mid_Price float,
Ticker nvarchar(20),
Cust_Mnemonic nvarchar(50),
Currency_Id int
)
as
BEGIN
/*
GO
IF (EXISTS (SELECT name
FROM sysobjects
WHERE (name = N'Fn_Get_Consensus_Curve_41_Data')
AND ((type = 'P') OR (type = 'IF') OR (type = 'TF') OR (type = 'FN'))))
DROP FUNCTION [dbo].Fn_Get_Consensus_Curve_41_Data
GO
*/
declare @p_ENTITYID INT
declare @p_CUSTOMERID INT
Declare @p_Login_Type int
Declare @p_Result_Status int
set @p_Login_Type = (SELECT DBO.GET_USER_LOGIN_TYPE_ID(@p_UserID))
If @p_Login_Type=1 and not (@p_CustId is null or @p_CustId='')
Set @p_Result_Status = 1
Else if @p_Login_Type > 1
Set @p_Result_Status = 2
Else
Set @p_Result_Status = 0
If @p_Result_Status > 0 -- if user is valid and given enough parameters than
Begin
If @p_Result_Status = 1 -- if User is trader and gives customer id
Begin
Declare Cur_Fetch_Curve_Cust_Data cursor for
Select Distinct Customerid
From PricesRR PRR
Where
Convert(Nvarchar,Matchdate,101) = Convert(Nvarchar,@p_Match_Date,101) And
Sector_Id = @p_Sector_Id And
Location_Code = @p_Location_Code And
CustomerID = @p_CustId And
--CustomerID <> 0
--CustomerID not in (0, -1, -2, -3, -100, -200)
CustomerId Not In (Select CustomerId From Fn_Get_PricesRR_Not_To_Include_Cust_Id('V'))
and isnull(PRR.Record_Last_Action,'N') <> 'D'
and Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
Declare Cur_Fetch_Curve_Entity_Data cursor for
Select Distinct EntityID
From PricesRR PRR
Where
Convert(Nvarchar,Matchdate,101) = Convert(Nvarchar,@p_Match_Date,101) And
Sector_Id = @p_Sector_Id And
Location_Code = @p_Location_Code
AND EntityId IN ( Select Distinct Entity_Id from Fn_Get_Allowed_Entity_List(@p_Location_Code , @p_Sector_Id , @p_Match_Date ,@p_UserID ))
and isnull(PRR.Record_Last_Action,'N') <> 'D'
and Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
End
Else If @p_Result_Status = 2 -- if User is higher than trader.. means broker or higher
Begin
Declare Cur_Fetch_Curve_Cust_Data cursor for
Select Distinct Customerid
From PricesRR PRR
Where
Convert(Nvarchar,Matchdate,101) = Convert(Nvarchar,@p_Match_Date,101) And
Sector_Id = @p_Sector_Id And
Location_Code = @p_Location_Code And
--CustomerID <> 0
--CustomerID not in (0, -1, -2, -3, -100, -200)
CustomerId Not In (Select CustomerId From Fn_Get_PricesRR_Not_To_Include_Cust_Id('V'))
and isnull(PRR.Record_Last_Action,'N') <> 'D'
--and Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
Declare Cur_Fetch_Curve_Entity_Data cursor for
Select Distinct EntityID
From PricesRR PRR
Where
Convert(Nvarchar,Matchdate,101) = Convert(Nvarchar,@p_Match_Date,101) And
Sector_Id = @p_Sector_Id And
Location_Code = @p_Location_Code
and isnull(PRR.Record_Last_Action,'N') <> 'D'
--and Version = dbo.GET_PRICESRR_MAX_VERSION(@p_Location_Code, @p_Sector_Id, @p_Match_Date, PRR.EntityID, @p_CustId, PRR.Date)
End
delete from @Temp_Curve_Submission_Data
-----------------------
-----------------------
Open Cur_Fetch_Curve_Cust_Data
fetch next from Cur_Fetch_Curve_Cust_Data
into @p_CUSTOMERID
WHILE @@FETCH_STATUS = 0
BEGIN
IF @@FETCH_STATUS <> 0 break
BEGIN
-----------------------
-----------------------
Open Cur_Fetch_Curve_Entity_Data
fetch next from Cur_Fetch_Curve_Entity_Data
into @p_ENTITYID
WHILE @@FETCH_STATUS = 0
BEGIN
IF @@FETCH_STATUS <> 0 break
-----------------------
Insert Into @Temp_Curve_Submission_Data
(
Location_Code ,
Sector_Id,
MatchDate ,
EntityId ,
CustomerId ,
MaturityDate ,
Cust_Price ,
Bid_Price,
Offer_Price,
Consensus_Mid_Price ,
Ticker ,
Cust_Mnemonic ,
Currency_Id
)
select
@p_Location_Code Location_Code,
@p_Sector_Id Sector_Id,
X.MatchDate Match_Date,
X.EntityId Entity_Id,
X.CustomerId Customer_Id,
X.MaturityDate Maturity_Date,
X.Price Cust_Price,
X.BidValue Bid_Price,
X.OfferValue Offer_Price,
DBO.GET_Consensus_MID ('V',@p_Location_Code , @p_Sector_Id , @p_Match_Date, @p_ENTITYID ,x.MaturityDate) Consensus_Mid_Price,
--DBO.GET_Consensus_MID ('B1',@p_Location_Code , @p_Sector_Id , @p_Match_Date, @p_ENTITYID ,x.MaturityDate) Consensus_Mid_Price,
X.Ticker Ticker,
X.Mnemonic Cust_Mnemonic,
X.Currency_Id
from
(
SELECT
row_number() over (order by maturitydate) Line_No,
a.* ,
b.ticker,
c.mnemonic
from Fn_Get_Tot_Curve_41_Date(@p_Location_Code, @p_Sector_Id, @p_Match_Date, @p_ENTITYID , @p_CUSTOMERID ,@p_UserID ) a,
referenceentity b,
(
select customerid, mnemonic
from customersrr
group by customerid,mnemonic
) c
where
a.customerid = c.customerid and
a.EntityID=b.CMAID
--order by maturitydate
) X
-----------------------
Fetch Next From Cur_Fetch_Curve_Entity_Data
Into @p_ENTITYID
END
CLOSE Cur_Fetch_Curve_Entity_Data
END
-----------------------
-----------------------
Fetch Next From Cur_Fetch_Curve_Cust_Data
Into @p_CUSTOMERID
END
deallocate Cur_Fetch_Curve_Entity_Data
CLOSE Cur_Fetch_Curve_Cust_Data
deallocate Cur_Fetch_Curve_Cust_Data
End
return
end
"
Prashant Hirani
View 1 Replies
Apr 29, 2008
Hi All,
On my server, the C drive is 34 GB. Right now tempdb is 22 GB, which is filling up the C drive. How can I reduce it? I don't want to move tempdb to another drive; I am only looking for a way to reduce its size.
Please help quickly....
Thanks,
Zee
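A sketch for tempdb specifically (sizes are examples; tempdev is the default logical name of the primary tempdb data file): an online shrink sometimes works, but the reliable route is to set a smaller initial size and restart the instance, since tempdb is recreated at its configured size on every restart.
USE tempdb
DBCC SHRINKFILE (tempdev, 1024) -- try an online shrink to about 1 GB first
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 1024MB) -- takes effect at the next restart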
View 8 Replies
Jun 17, 2008
Hi all;
I am using SQL 2000 and have a database with an 86 GB mdf and a 56 GB ldf. I shrank the ldf and it was reduced to 32 MB; however, I did not do anything to the mdf file, yet its size has dropped to 28 GB. I just would like to check: is this correct? Why is the mdf size reduced when I only shrank the ldf? Hope you can help, thanks.
View 2 Replies
Jul 23, 2005
The database log of my DB is around 2 GB. The database is using the FULL recovery option. I want to reduce the file size of the log because it takes up a lot of space. I do a full database backup and then back up the transaction log as well, both backups performed with the option "clear inactive entries from transaction log" checked. But after the backup, the database log is still 2 GB. What should I do to reduce the log file size? Should I use:
Dump Tran databaseName with no_log
DBCC shrinkdatabase(databaseName, 30)
Is that safe to use on a production server?
Peter CCH
View 14 Replies
Jul 20, 2005
Hi, my database size has grown out of control and I need help with the following issues (I am very new to databases). I am storing financial tick data in one of the tables, and after two months the database has grown to 30 GB. I do not need a permanent record of this tick data after it has been processed, and I tried to remove all rows from the table (DELETE FROM Tickdata); however, SQL does not take kindly to removing millions of rows and the operation seems to time out. The only solution I could come up with was to delete the table. Secondly, after managing to clear out these tables I have noticed that the database size is still 30 GB, despite 29 GB being available. Is there any way to reduce the size of the database from 30 GB? I tried the shrink database option but it does not do anything. Any ideas? Thanks.
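A sketch of the usual pattern (the Tickdata name comes from your post; the filter column and logical file name are placeholders): if no rows need to be kept, TRUNCATE TABLE is minimally logged and nearly instant; if some must stay, delete in small batches so each transaction stays short; and reclaiming the 29 GB of free space afterwards needs an explicit file shrink.
TRUNCATE TABLE Tickdata -- when the whole table can go
-- or, to delete selectively in batches (works on SQL 2000 and 2005):
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE FROM Tickdata WHERE TickDate < '20050101' -- illustrative filter
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0
DBCC SHRINKFILE (MyDatabase_Data, 2048) -- logical file name and target MB are placeholders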
View 5 Replies
Jun 9, 2006
Hi,
I have a table with a column containing values like '123 345 678 143 648'. What I need to do is take each code value and insert it as a new record in another table. If I run 'SELECT SUBSTRING(column_name, 1, 3) FROM table' it is very fast (a fraction of a second). But since I need to take each code, and the number of codes in each record may vary, I am using a WHILE loop: I declared a variable @i and the statement becomes 'SELECT SUBSTRING(column_name, @i, 3) FROM table'. Interestingly, this select statement now takes almost 2 minutes for each iteration.
Why is that? Is there any way I can reduce the time taken for each iteration?
Thanks.
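A set-based sketch that avoids the loop entirely (SourceTable, code_list, and the Numbers table are assumptions; Numbers is just a table of integers 1, 2, 3, ...): the join generates every valid start position, so all codes from all rows come back in one statement, and an INSERT INTO ... SELECT can load the second table directly.
SELECT t.id, SUBSTRING(t.code_list, n.n, 3) AS code
FROM SourceTable AS t
JOIN Numbers AS n
  ON n.n <= LEN(t.code_list)
 AND (n.n - 1) % 4 = 0 -- codes are 3 characters wide, separated by one space
This runs once instead of once per code, so the per-iteration cost disappears.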
View 6 Replies
Dec 25, 2007
Hi,
I got a problem.
I installed Microsoft SQL Server Management Studio Express 2005 version.
And I created a Compact database.
I created a connection in SSMSE to the database and opened a query window.
Then I ran the following SQL:
Select * from Table1
It returned 3 records to me.
After that, I used a program to insert a record into this table.
Then I ran this SQL again, and it still showed me 3 records.
I closed the query window, created a new one, and ran the SQL again; this time it returned 4 records.
Why? It's very strange and awkward to work with, right?
Does anyone know how to make SSMSE return all current records without having to close and re-create the query window?
Thanks a lot!
And Merry Xmas!
View 4 Replies
Feb 5, 2008
protected void Buy_Click(object sender, EventArgs e)
{
    SqlConnection conn = new SqlConnection();
    SqlCommand cmd = new SqlCommand();
    try
    {
        String connStr = ConfigurationManager.ConnectionStrings["project_DataFile"].ConnectionString;
        conn.ConnectionString = connStr;
        conn.Open();
        cmd.Connection = conn;
        ArrayList cart = (ArrayList)Session["Cart"];

        // Read the current warehouse quantity. (The original code passed the SQL text
        // itself to Int32.TryParse, which always yields 0 -- the query has to be executed.)
        cmd.CommandText = "SELECT Qty_warehouse FROM Producttable";
        int sel = Convert.ToInt32(cmd.ExecuteScalar());

        // Quantity being bought = number of items in the cart.
        int numQty = cart.Count;

        // Subtract and write the new quantity back. A WHERE clause on the product id
        // would normally be needed so only the purchased product is updated.
        int nQty_warehouse = sel - numQty;
        cmd.CommandText = "UPDATE Producttable SET Qty_warehouse = @qty";
        cmd.Parameters.AddWithValue("@qty", nQty_warehouse);
        cmd.ExecuteNonQuery();   // the original never executed the UPDATE
    }
    catch (Exception ex)
    {
        Response.Write(ex.ToString());
    }
    finally
    {
        if (conn.State == ConnectionState.Open)
        {
            conn.Close();
            Response.Redirect("end.aspx");   // the original only wrote the string "end.aspx" to the page
        }
    }
}
From this code, I want to reduce the product's quantity in the database: when I click the Buy button, it should automatically subtract the cart quantity from the old warehouse quantity.
Please advise.
View 3 Replies
Dec 6, 1999
Hi guys.
I am having timing issues while backing up my database.
The database is around 50 GB and the backup takes around 5 hours.
Is there any way to reduce the 5-hour backup time to 3 hours or less?
Thanks in advance
MAK
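One option worth testing (a sketch; the paths are examples): stripe the backup across several files on different physical drives so SQL Server writes them in parallel, which often shortens the backup window when disk throughput is the bottleneck.
BACKUP DATABASE MyBigDatabase
TO DISK = N'E:\Backups\MyBigDatabase_1.bak',
   DISK = N'F:\Backups\MyBigDatabase_2.bak',
   DISK = N'G:\Backups\MyBigDatabase_3.bak'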
View 1 Replies
Dec 10, 2001
Hi, I'm a newbie to SQL Server and have recently set up a test SQL 7 server. I used all the defaults at the beginning, and now the MDF file is about 500 MB and the LDF file is of similar size.
I'm still trying to figure out how to reduce the size of the transaction log file. Currently I only have a full backup of the database once a week, and there is no backup of the transaction log.
As of this moment the transaction log is of not much use to me, but I really want to get it reduced as I'm running out of disk space.
I'd also greatly appreciate it if someone could suggest good DBA practice for the proper setup and handling of transaction logs (how to balance disk-space usage and still be able to use the transaction logs for recovery).
I'll soon be setting up a SQL 7 server where about 10 active users are expected at any one time. I've read that the transaction log file should be about 40% to 50% of the estimated size of the database file and should be allowed to auto-grow. So what happens if more space is required by the transaction logs? Does a full backup purge the transaction logs (the way it does in Exchange Server)?
Thanks!
View 2 Replies
Jul 19, 2002
Hi,
My transaction log file's physical size has exceeded 2 GB. How can I reduce it?
Is there any way this file's size can be controlled?
View 1 Replies