Huge Dataset, Only Need A Little

Dec 7, 2007

Hello, newbie here...

I have a huge dataset in MS SQL Server: over 11 million records of machine data taken every four seconds. I really don't need samples at this interval. I'd like to run a query that retrieves my data at 30-minute intervals. I know enough SQL and DTS to SELECT what fields I want and how to direct the output to a CSV file, but I'm not sure how to manipulate the TimeStamp field in the query to pull only every 450th record.

TIA
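
A hedged sketch of one way to do this (table and column names are assumptions): bucket the TimeStamp into 30-minute intervals and keep the first reading in each bucket, rather than trying to count off every 450th record.

-- Minimal sketch, assuming a table MachineData with a record id (RecID)
-- and a datetime column (TimeStamp); works on SQL 2000/2005.
SELECT d.*
FROM   MachineData AS d
JOIN  (SELECT MIN(RecID) AS RecID              -- first reading in each 30-minute bucket
       FROM   MachineData
       GROUP BY DATEADD(minute,
                        DATEDIFF(minute, '20000101', [TimeStamp]) / 30 * 30,
                        '20000101')
      ) AS b
  ON   b.RecID = d.RecID
ORDER BY d.[TimeStamp]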

View 20 Replies



Huge Deletes In A Huge Table

Apr 3, 2000

SQL 7 SP1 NT4 SP5

I have a TRANSACTION table with 150 million rows.

I have a USER table.

Each user has about 600 records in the TRANSACTION table.

The TRANSACTION table's clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.

The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.

I want to go through the USER table and delete all users who have not visited me in a while.

I want to do this without substantially hindering performance in a production environment. I can perform this over a week period or two if needed.

The best way I thought of doing this was to grab x number of users in a cursor and loop through deleting their corresponding TRANSACTION records.

Does anyone have any ideas on a better way? What is going to happen to my indexes during this time?

Thanks !!!
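
A hedged alternative to the cursor approach, deleting in fixed-size batches so each transaction stays short, blocking is limited, and the log can be backed up between passes (table, column, and cut-off names below are assumptions; expect the indexes to fragment and plan a rebuild afterwards):

SET ROWCOUNT 5000                      -- SQL 7-style batching (use DELETE TOP (n) on 2005+)
WHILE 1 = 1
BEGIN
    DELETE T
    FROM   TRANS AS T                  -- the 150-million-row transaction table (name assumed)
    JOIN   USERS AS U ON U.UserID = T.UserID
    WHERE  U.LastVisit < DATEADD(month, -12, GETDATE())
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0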

View 3 Replies View Related

SQL Server 2008 :: Populate One Dataset In SSRS Based On Results From Another Dataset Within Same Project?

May 26, 2015

I have a report with multiple datasets, the first of which pulls in data based on user-entered parameters (sales date range and property use codes). Dataset1 pulls property IDs and other sales data from a table (2014_COST) based on the user's parameters. I have set up another table (AUDITS) that I would like to use in dataset6. This table has 3 columns (Property ID, Sales Price and Sales Date). I would like dataset6 to pull the property IDs that are NOT contained in the results from dataset1. In other words, I'd like the results of dataset6 to show me the property IDs that are in the AUDITS table but are not being pulled into dataset1. Both tables are in the same database.
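
A hedged sketch of a query dataset6 could run (column and parameter names are assumptions; the subquery should mirror whatever dataset1 actually executes):

SELECT a.PropertyID, a.SalesPrice, a.SalesDate
FROM   AUDITS AS a
WHERE  NOT EXISTS (SELECT 1
                   FROM   [2014_COST] AS c
                   WHERE  c.PropertyID = a.PropertyID
                     AND  c.SaleDate BETWEEN @SalesDateFrom AND @SalesDateTo
                     AND  c.UseCode IN (@UseCodes))   -- SSRS expands multi-value parameters here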

View 0 Replies View Related

Integration Services :: Perform Lookup On Large Dataset Based On A Small Dataset

Oct 1, 2015

I have a small number of rows in a dataset, Table 1.  There is a CLOB on a large dataset, Table 2.  They join on a PK.  I would like to retrieve this CLOB and add it to the data flow for Table1.  In short I want to emulate the following:

Table 1:  Small table without CLOB, 10 rows. 
Table 2: Large table with CLOB, 10,000,000 rows

select CLOB
from table2
where pk in (select pk from table1)

I want this to return the CLOBs for the small number of rows in Table 1.  The PK is indexed, obviously, so it should be a fast lookup.

Table 1 and Table 2 live on different Oracle databases.  How do I perform this operation efficiently in SSIS?  It seems the Lookup and Merge Join won't do this.
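
A hedged workaround, since neither Lookup nor Merge Join can push the small key set into the Oracle query on its own: read Table 1 first, build its handful of PK values into a comma-separated SSIS string variable, and use that variable in an expression-based source query on the Table 2 connection, so only those rows (and their CLOBs) ever leave the large table. The generated query would look roughly like this (names and key values are placeholders):

SELECT pk, clob_col
FROM   table2
WHERE  pk IN (101, 204, 305)   -- list built from Table 1 via the SSIS variable

The small result set can then be combined back with the Table 1 flow, for example with a Merge Join on pk, since both inputs are now tiny.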

View 2 Replies View Related

Reporting Services :: Populate One Dataset In SSRS Based On Results From Another Dataset Within Same Project?

May 27, 2015

I have a report with multiple datasets, the first of which pulls in data based on user-entered parameters (sales date range and property use codes). Dataset1 pulls property IDs and other sales data from a table (2014_COST) based on the user's parameters.

I have set up another table (AUDITS) that I would like to use in dataset6. This table has 3 columns (Property ID, Sales Price and Sales Date). I would like dataset6 to pull the property IDs that are NOT contained in the results from dataset1. In other words, I'd like the results of dataset6 to show me the property IDs that are in the AUDITS table but are not being pulled into dataset1. Both tables are in the same database.
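
As in the similar thread above, a hedged anti-join sketch for dataset6 (column and parameter names are assumptions), shown here in LEFT JOIN form:

SELECT a.PropertyID, a.SalesPrice, a.SalesDate
FROM   AUDITS AS a
LEFT JOIN [2014_COST] AS c
       ON  c.PropertyID = a.PropertyID
       AND c.SaleDate BETWEEN @SalesDateFrom AND @SalesDateTo
       AND c.UseCode IN (@UseCodes)
WHERE  c.PropertyID IS NULL            -- keep only audit rows with no match in dataset1's data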

View 3 Replies View Related

How Can I Use SQL Reporting Services To Get A Dynamic Dataset From Another Web Service As My Reports Dataset?

May 21, 2007

I found out the data I need for my SQL report is already defined in a dynamic dataset on another web service. Is there a way to use web services to call another web service to get the dataset I need to generate a report? Examples would help if you have any. Thanks for looking.

View 2 Replies View Related

Listing Datasets In Report (dataset Name, Dataset's Commands)

Oct 12, 2007



I would like to list the report's datasets (each dataset's name and its command text). Is there any way to display this information in the report?

Thanks

View 3 Replies View Related

Dataset.Tables.Count = 0 When There Are 2 Rows In The Dataset

May 7, 2008

Hi,
I have a stored procedure, attached below. It returns 2 rows in SQL Management Studio when I execute MyStorProc 0, 28, but in my program, which uses ADOHelper, it returns a dataset with Tables.Count = 0.
If I comment out the line If @Status = 0, it returns the rows. Apparently execution never enters the If @Status = 0 branch, even though I pass @Status = 0. What am I doing wrong?
Any help is appreciated.


ALTER PROCEDURE [dbo].[MyStorProc]
(
    @Status smallint,
    @RowCount int = NULL,
    @FacilityId numeric(10,0) = NULL,
    @QueueID numeric(10,0) = NULL,
    @VendorId numeric(10,0) = NULL
)
AS
SET NOCOUNT ON
SET CONCAT_NULL_YIELDS_NULL OFF

If @Status = 0
BEGIN
    SELECT ......
END
If @Status = 1
BEGIN
    SELECT......
END

View 4 Replies View Related

How To Transfer Data From One Dataset To Other Dataset

Apr 11, 2008

I have two datasets. One dataset has old data from some other database; the second dataset has the original data from a SQL Server 2005 database. Both have the same fields, with an id as the primary key. I want to transfer all the data from the first dataset into the new dataset while retaining the existing data, but if the old dataset has the same id (primary key) as a row in the new one, that row should not transfer. If a matching id has changed values, though, those fields should be updated with the old dataset's data. How can I do that?
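
One route is to do the merge in the database rather than between the DataSet objects: if the old data can be staged into a SQL Server table, a hedged T-SQL sketch of that logic (all table and column names are assumptions) is:

-- 1) update matching Ids whose values differ
UPDATE n
SET    n.Col1 = o.Col1, n.Col2 = o.Col2
FROM   NewTable AS n
JOIN   OldStaging AS o ON o.Id = n.Id
WHERE  n.Col1 <> o.Col1 OR n.Col2 <> o.Col2

-- 2) insert old rows whose Id does not yet exist in the new table
INSERT NewTable (Id, Col1, Col2)
SELECT o.Id, o.Col1, o.Col2
FROM   OldStaging AS o
WHERE  NOT EXISTS (SELECT 1 FROM NewTable AS n WHERE n.Id = o.Id)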
 

View 4 Replies View Related

Filter One Dataset With Values In Another Dataset?

Dec 19, 2006

Hi,

I have two datasets in my report, D1 and D2.

D1 is a list of classes with classid and title

D2 is a list of data. Each row in D2 has a classid. D2 may or may not have all the classids in D1; all classids in D2 must be in D1.

I want to show fields from D2, group the data by the classids in D1, and show every group as a separate table. If no data in D2 is available for a classid, it should show an empty table.

Is there any way to do this in RS2005?
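
One hedged approach is to combine the two sources in a single dataset so every class in D1 appears even when D2 has nothing for it, then group the report table on classid (table and column names below are assumptions):

SELECT c.classid, c.title,
       d.col1, d.col2                  -- whatever fields D2 actually exposes
FROM   Classes  AS c                   -- D1's source
LEFT JOIN DataRows AS d                -- D2's source
       ON d.classid = c.classid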

View 2 Replies View Related

Reporting Services :: IF Statement If Dataset Field Value Equals Value Of Dataset Field

Sep 3, 2015

Using this IIF statement:

=CountDistinct(IIF(Fields!Released_DT.Value = Fields!Date2.Value, Fields!Name.Value,
Nothing))
Released_DT = a date, e.g. 09/03/2015 or 09/02/2015
Date2 = returns another date value, in this case 09/03/2015

What I'm trying to do is count the distinct number of people (Fields!Name.Value) where Released_DT = Date2. My IIF statement is returning a zero value.

View 4 Replies View Related

Huge Log

May 27, 2007

Dear all,



I have a problem with the log: SQL Server has written a 4 GB log file. How can I eliminate, or at least shrink, this huge log?

View 1 Replies View Related

Huge Log Files

May 9, 2008

 I have SQL Server 2005 Express Edition with Advanced Services running on a small web server. It all runs fine, but every now and then the log files grow and grow and eventually use up all the disk space of 30GB. As a quick fix, restarting SQL a couple of times clears out the logs and everything is up and running again. Any ideas on how to stop this happening?
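
If point-in-time restores aren't needed on that server, one hedged fix is to switch the database to the SIMPLE recovery model (so the log truncates on checkpoint) and shrink the file once; database and logical log file names below are assumptions. If full recovery is required instead, the growth usually means transaction log backups aren't being taken, and scheduling regular BACKUP LOG jobs is what keeps the file in check.

ALTER DATABASE MyWebDb SET RECOVERY SIMPLE
GO
USE MyWebDb
GO
DBCC SHRINKFILE (MyWebDb_log, 500)     -- shrink the log file to roughly 500 MB
GO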

View 5 Replies View Related

Huge Transaction Log - Do I Need This?

Nov 19, 2004

I just peeked at my DNN setup and I found that I have a transaction log about 98 gigs large, compared to a DNN database that is only about 250 megs. Crazy, huh?

Do you happen to know what I need that transaction log for? Can I just delete it or will it break my SQL db? Is there a way that I can keep only maybe a week of transactions in it so it doesn't grow so dang large?

Thanks in advance for your response!!

Tim x 4
(always learning!)
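
The log file can't simply be deleted while the database is using it, but under the FULL recovery model it only stays small if it is backed up regularly. A hedged sketch (database, logical file, and path names are assumptions): schedule routine log backups so the space can be reused, then shrink once to reclaim the 98 GB already allocated.

BACKUP LOG DNN TO DISK = 'D:\Backups\DNN_log.trn'
GO
USE DNN
GO
DBCC SHRINKFILE (DNN_log, 1024)        -- one-time shrink; target size in MB
GO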

View 2 Replies View Related

Huge Transaction Log

Aug 10, 2000

One of my production databases is currently 51 mb. The transaction log is well over 5 gig. I have tried truncating and then shrinking the log through the use of SQL utilities. This does not work! How can I quickly resolve this problem without tampering with the production environment?

Thanks in advance.
Ray Reinders

View 1 Replies View Related

Huge Transaction Log

Sep 30, 2002

I am not a DBA and I run a personal web site that has gotten pretty large. I have never done anything to maintain my SQL Server, and now my transaction log is 10 GB while my data is only about 300 MB. I am starting to get a memory leak with the SQL service. What should I do? Is it bad to have a huge transaction log? I am not familiar with any of this stuff, so someone please point me in the right direction.

View 3 Replies View Related

What Should I Do With The Huge Log File

Oct 11, 2004

Hi all,
I found my database log file is 26 GB and the database file is just about 280 MB. We are doing a full backup every day. However, my SQL Server seems to be running very slowly now; please advise:

1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be the reason my SQL Server is slowing down?
3. Could anyone point me to more information on the transaction log?
Thank you, much appreciated!

View 4 Replies View Related

Log File HUGE!!

Oct 30, 2007

Hi guys, it's my first post! It's also just about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005, and one of our log files is 255 GB and needs to be made smaller very fast!! We are almost out of disk space and the log is growing fast.

I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things; I am just worried and don't want to mess anything up. I know this is probably a simple task, but like I said, with the truncate command I was reading about, I don't even know where to go to type it in!!! If someone could please help it would be much appreciated. Thanks so much.
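
These commands are typed into a new query window in SQL Server Management Studio while connected to that instance. A hedged emergency sequence for SQL 2005, assuming the database and logical log file names shown: back the log up (which lets the space be reused), then shrink the file.

BACKUP LOG [WSS_Content] TO DISK = 'E:\Backups\WSS_Content_log.trn'   -- names and path assumed
GO
USE [WSS_Content]
GO
DBCC SHRINKFILE (WSS_Content_log, 2048)                               -- shrink to roughly 2 GB
GO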

View 14 Replies View Related

Huge Issue - Well For Me Anyway

Dec 26, 2007

Hey everyone.

I have a problem and I am sure someone here can help. Last night my DB was working fine, as it has been. This morning I get to the office and now everything has gone to hell in a handbasket.

I can no longer connect to my sql2005 DB I get this error when trying to place an order on our order page.

"There was a problem with the website:An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it.)
The error has been logged."

I also notice the SQL Agent will not start. Also, the former owner of the company had an eval version of SQL Management Studio that must have just expired (could this be causing it?).

Any help anyone can offer would be great.

View 12 Replies View Related

Huge .BAK FILE

Jan 28, 2008

I have a .bak file of 72 GB, but my database size is only 32 GB (I got this value from sp_spaceused). Does anyone know why the .bak file is so big? Is it possible to reduce the size? How could I reduce it?



http://www.sqlserverstudy.com
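
One common cause, offered as a hedged guess since it depends on how the backup job is written: each backup is appended to the same backup set instead of overwriting it, so the .bak accumulates every previous copy. If so, overwriting fixes it:

-- WITH INIT overwrites the existing backup set instead of appending to it (names assumed).
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak' WITH INIT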

View 9 Replies View Related

Huge T-SQL - Out Of Memory

Mar 25, 2008

Hello,
I have a very big T-SQL script (~24mb) and I need the SQL Server on my hosted site to execute it.

Right now, I have a web page that uses Sql.Connection.ExecuteNonQuery () to do it, but the .NET process runs out of memory when I load the file into a C# string.

How would I go about executing this T-SQL script on the server?
Is there such a command:
EXECUTE SCRIPT "myscript.sql" FROM DISC
?

thanks.

View 1 Replies View Related

Huge LDF File

Nov 13, 2007



Hi
This is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file is increasing at a very high rate: 40 GB, 50 GB, and now 57 GB. The MDF file is around 15 MB. We created a backup and tried to restore it to another system, but it asks for 57 GB of free space. How do we proceed with the file recovery? We have backups, but the restore asks for more space for the log file. How can we retrieve the data?
rgds
Pramod

View 3 Replies View Related

Huge SQL Queries In SQL 2005

Mar 20, 2008

Good evening:
We're porting an old app written in ASP.NET 1.1 and SQL 2000 to ASP.NET 2.0 and SQL 2005. In the old app we have a few data grids that are populated from a dataset pulled from the database. We use a SQL query that we build based on more than 10 different user inputs, the result of which is an enormously complicated SQL string. We'd like to move this processing into a SPROC in the 2005 database.
Rather than writing stored procedures to create the SQL SELECT statement, is it possible to pass an entire SELECT statement to a SPROC and have it executed within? We're trying to capitalize on paging in 2005 using ...ROW_NUMBER() OVER (ORDER BY PM ASC)..., and building the string with IF ELSE statements is mind-numbingly complex and tedious.
Any suggestions would be great.
Thanks,
Brad
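
Yes; a stored procedure can execute a statement handed to it as text via dynamic SQL. A hedged sketch (the procedure name is made up for illustration):

CREATE PROCEDURE dbo.RunSearch
    @SelectSql nvarchar(max)           -- the fully built SELECT, including the ROW_NUMBER() paging
AS
    SET NOCOUNT ON
    EXEC sp_executesql @SelectSql      -- or EXEC (@SelectSql)

The usual caveats about building SQL from user input apply (injection, plan reuse), so parameterize the pieces where possible.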

View 4 Replies View Related

SQL Backup File Is HUGE!!!

Sep 25, 2004

One of my databases is approximately 1.5 GB, but when I run a backup the backup file explodes to 38 GB.

What is the proper usage of Shrink Database? Is it safe, or is there another method to reduce the size?

View 2 Replies View Related

The Log File For My Database Is Huge

May 20, 2006

The log file for my database in SQL Server 2005 is huge. How do I empty it, or in effect shrink it or start it over? Thanks

View 1 Replies View Related

Problem On Huge Table.

Mar 14, 2001

We have a huge table with 12 million records, and when I run the following script, it takes 50 hours. Is there anyone who can help? Thanks.

update TableA
set In=e.In, EA=e.ea, We=e.we
from TableA c, TableB e
where c.code=e.code

TableA 12,000,000 records.
TableB 750,000 records.
Both tables have a clustered index on the code field.
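
A hedged variant that sometimes helps here (the batch size is an assumption, and column names are bracketed as in the post): skip rows that already hold the right values and update in batches, so each pass is smaller and the log does not balloon.

SET ROWCOUNT 100000                    -- SQL 7/2000-era batching
WHILE 1 = 1
BEGIN
    UPDATE c
    SET    c.[In] = e.[In], c.EA = e.ea, c.We = e.we
    FROM   TableA AS c
    JOIN   TableB AS e ON e.code = c.code
    WHERE  c.[In] <> e.[In] OR c.EA <> e.ea OR c.We <> e.we   -- extend for NULLs if the columns are nullable
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0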

View 6 Replies View Related

Alter A Huge Table

May 18, 2001

I need to alter a table (expand a column from varchar(10) to varchar(255)), and the table has 200 million rows.
Please suggest the best and fastest method to achieve this. The database is on SQL 7.0.


Thanks
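
For what it's worth, widening a varchar is normally a metadata-only change, so on SQL 7.0 a plain ALTER COLUMN should not have to rewrite the 200 million rows. A hedged sketch (table and column names are assumptions; keep the column's existing NULL/NOT NULL setting, or the change becomes far more expensive):

ALTER TABLE dbo.BigTable
      ALTER COLUMN Description varchar(255) NOT NULL   -- match the column's current nullability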

View 1 Replies View Related

Huge Backup File

May 12, 2000

I am using SQL Server 7 and have about 5 databases. One of them has a data file of about 10 Meg, and most of the others are larger. I do a nightly backup to both a local and mapped drive. On both, the size of the backup file for this database is more than 500 Meg, but the rest appear to be an appropriate size. Does anyone know why this would be happening? The database works fine, it does not get a lot of insert/delete activity and I run DBCC every weekend. If anyone has any ideas I would sure like to hear from them.

View 1 Replies View Related

Huge Memory Usage With BCP?

Oct 17, 2000

Does anybody know why BCP on v6.5 grabs so much memory for SQL Server? I have a few table imports where the BCP process will consume over 460MB of RAM during the imports.

The BCP cmd file is executed via an xp_cmdshell call. The server has 2+ GB of RAM, but the BCP process effectively flushes large amounts of data from the buffer. It takes quite a long time for the cache to recover from this, and afterwards the rest of the nightly processes run much slower, as they end up having to hit the drives to retrieve information that should already be in cache.

If anyone can shed some light on this it would be much appreciated.

Ian Dundas
DBA
Assante Asset Management

View 2 Replies View Related

Huge Select Into.. (Can I Choose Not To Log?)

Dec 12, 2000

I have a job that selects a lot of data from one database into another.

Can I choose not to log this operation? (It doesn't need to be logged, and the log fills up before it's done.)

Thanks..

type of code:

Insert into database_1
Select * from database_2
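
A hedged sketch of a minimally logged alternative for the SQL 7/2000 era, assuming the target table can be created fresh by the load and the names shown: INSERT ... SELECT is always fully logged, but SELECT ... INTO is minimally logged once the 'select into/bulkcopy' option is on.

EXEC sp_dboption 'database_1', 'select into/bulkcopy', 'true'
GO
SELECT *
INTO   database_1.dbo.target_table
FROM   database_2.dbo.source_table
GO
EXEC sp_dboption 'database_1', 'select into/bulkcopy', 'false'
GO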

View 1 Replies View Related

Huge Log File On SQL Server 7

Feb 21, 2001

I have a huge log file (285M) on SQL Server 7.
The database itself is about 10M.
How can I reduce the log file?
Is it possible to build it again from scratch?

I tried the Truncate Transaction Log but it didn't help.

View 2 Replies View Related

What Is The Best Way To Upload Huge File Into SQL 2K

Mar 13, 2002

I have a 2 GB text file (semicolon delimited) which I need to pump into SQL 2K. What is the best way to achieve this in the shortest time?

I tried it with DTS BCP; it took 1 minute 14 seconds to transfer 13,457 records. This is only 1/2000 of the records I need to transfer. Please help.

Thanks.

regards,
Terry
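
A hedged sketch of a usually much faster route on SQL 2K: BULK INSERT with a table lock and batching (table name and file path are assumptions; loading into a bare staging table with no indexes helps further).

BULK INSERT dbo.StagingTable
FROM 'D:\load\datafile.txt'
WITH (FIELDTERMINATOR = ';',
      ROWTERMINATOR   = '\n',
      TABLOCK,
      BATCHSIZE       = 100000)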

View 2 Replies View Related

Transaction Logs Are Huge

Mar 19, 2008

Hi all, our IT admin is having issues with the backups. He was doing full backups every 4 hours with Backup Exec, which is way too much. Now he's trying to switch to simple recovery, because obviously the transaction logs have not been truncated. It's a big mess. I'm not involved in this part; they handle the backups and permissions. The transaction logs are huge??

Any suggestions??

View 12 Replies View Related






