Log File Fills HD In Minutes. Help!

Apr 9, 2008

Does anyone know what would cause my log file (.LDF) to grow at over 1 MB per second and quickly fill up the hard drive? My experience is in Oracle, but I'm assuming you can at least set a maximum size for the log file for starters? I'm not sure why it would be growing at this rate in the first place anyway. I could use a quick answer on this one. Thanks!
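
One possible starting point (a sketch only; the database and log file names are placeholders, and the real fix depends on what is keeping the log from being truncated):

-- Cap the log file so it cannot fill the drive (the statement will error instead once the cap is hit)
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_log, MAXSIZE = 10GB, FILEGROWTH = 512MB);

-- See how full the log is and what is preventing truncation (SQL 2005 and later)
DBCC SQLPERF(LOGSPACE);
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDatabase';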

View 4 Replies


T-SQL (SS2K8) :: Display Row As 2 Days Ago / 1 Hours 34 Minutes Ago / 11 Minutes Ago

Apr 21, 2015

My table has data as follows:

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[table_Data]') AND type in (N'U'))
DROP TABLE [dbo].[table_Data]
GO
/****** Object: Table [dbo].[table_Data] Script Date: 04/21/2015 22:07:49 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[table_Data]') AND type in (N'U'))

[code].....
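
A sketch of one way to format the age of each row (this assumes table_Data has a datetime column, called CreatedDate here, which is an assumption since the full script is truncated above):

SELECT CreatedDate,
       CASE
           WHEN DATEDIFF(MINUTE, CreatedDate, GETDATE()) < 60
               THEN CAST(DATEDIFF(MINUTE, CreatedDate, GETDATE()) AS varchar(10)) + ' minutes ago'
           WHEN DATEDIFF(HOUR, CreatedDate, GETDATE()) < 24
               THEN CAST(DATEDIFF(HOUR, CreatedDate, GETDATE()) AS varchar(10)) + ' hours ago'
           ELSE CAST(DATEDIFF(DAY, CreatedDate, GETDATE()) AS varchar(10)) + ' days ago'
       END AS Age
FROM dbo.table_Data;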

View 6 Replies View Related

Append Result Set Every 15 Minutes On Same Text File

Jul 11, 2013

I am running a SQL job that appends its result set to the same text file every 15 minutes.

But the job also writes a lot of extra information to the text file, such as job log info and timestamps.

I only need to see the raw data I am querying. How can I fix this?

View 1 Replies View Related

Trying To Run A Scheduled DTS Package Every 20 Minutes To Recreate A Table With Updated Data Into An Excel File

Jun 7, 2007

The only way the job succeeds is if I select the 'append rows' option, which copies all rows, even the ones that are already there. I tried the 'drop and recreate the table' option but it just doesn't run that way. Help please...

View 1 Replies View Related

Delete Fills Logs.

May 10, 2007

I'm trying to do a very large delete. I set the database to simple recovery mode, thinking the transaction log would not fill during the delete.

But it still fills. I thought that in simple mode the log isn't used?

Any thoughts?
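
Simple recovery still logs every row of the delete; it only lets the log space be reused once the transaction commits. One common workaround is to delete in batches so each batch commits and the log can be truncated at the next checkpoint (a sketch; the table name and filter are placeholders, and DELETE TOP needs SQL 2005 or later):

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.BigTable
    WHERE ArchiveDate < '20060101';

    IF @@ROWCOUNT = 0 BREAK;

    CHECKPOINT;  -- in SIMPLE recovery, lets the inactive log portion be reused between batches
END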

View 1 Replies View Related

SQL Server 2008 :: What Happens When The Xact Log Fills Up

May 5, 2015

I'm wondering how SQL Server handles a full transaction log. I'm populating a temp table with a lot of rows (roughly 70M), and when I run DBCC SQLPERF(LOGSPACE) it says the tempdb log is 99% full. It doesn't throw an error, so is it still working? Does it do some sort of disk/memory swapping of the transaction log? Is occasionally filling up the xact log OK, or does it have dire consequences for other operations as well?
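
The log is reused in place rather than swapped to memory; if it cannot grow any further, the statement fails with error 9002 ("the transaction log is full"). A quick way to watch it while the load runs (standard views in SQL 2008):

DBCC SQLPERF(LOGSPACE);   -- percent of each database's log currently in use

SELECT name, log_reuse_wait_desc  -- what, if anything, is preventing tempdb log truncation
FROM sys.databases
WHERE name = 'tempdb';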

View 3 Replies View Related

SQL Server Error Log Fills Disk

May 30, 2007

Hello,

I have the following problem:

I am running SQL Server 2005 Express Advanced Services on a Windows 2003 server in a hosted environment. Sometimes SQL Server starts writing entries to C:\Programme\Microsoft SQL Server\MSSQL.1\MSSQL\LOG\ERRORLOG until the disk is full. After that, I have to delete the error log file (several GB in size), restart the server, and everything runs fine until the log file runs amok again.

I have installed SQL Server Management Studio.

With SQL Server 2005 Standard I can configure or disable error logging in Management Studio, but with the Express Edition that does not seem to be possible.

What I want to do is (maybe with system stored procedures)

limit the number of error log files by cycling them, e.g. keep 5 files and delete the older ones
limit the size of each log file, e.g. 100 MB

Is there an option to configure this in the Express edition of SQL Server 2005?
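
As far as I know there is no size cap for the error log in SQL Server 2005, but recycling the log regularly keeps each file small, and the number of retained logs can be set through the registry value the instance reads. A sketch (on Express, which has no Agent, the first call can be run from a Windows scheduled task via sqlcmd; the value 10 below is just an example):

-- Close the current ERRORLOG and start a new one (old logs are renamed ERRORLOG.1 ... ERRORLOG.n)
EXEC sp_cycle_errorlog;

-- Keep up to 10 old error logs instead of the default 6 (instance-aware registry write)
EXEC xp_instance_regwrite
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'NumErrorLogs',
     REG_DWORD,
     10;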



Thanks in advance

Regards
Rolf

View 9 Replies View Related

Package Execution Fills C Drive

Mar 31, 2008



When I run an SSIS package that moves up to 2 GB of data from an ODBC source (FoxPro) to SQL Server, a large temp file is created at:

c:\Documents and Settings\username\Local Settings\Temp\1\filename.tmp

where username is obviously my domain user name, and filename is a seemingly random filename.

My C drive has limited space, but I have other local drives with plenty of space. Can I dictate where these temp files are created?

Thanks,
Chris

View 7 Replies View Related

Memory Fills Up Then A Buffer Error Is Generated.

Jun 22, 2007

I have SSIS SP2 running on a Windows 2003 64-bit server with 4 processors and 16 GB of RAM. I am trying to load 1 billion rows of data into 10 tables. The source data is found in 12 different 50 GB fixed-width flat files stored on 2 different file servers. The destination is 10 different tables in a single SQL Server 2000 database which has 1 TB of space allocated to it. I use the MS SQL OLE DB connection for each destination table.



The SSIS package is pretty straightforward. Everything takes place in one data flow. The 12 sources each flow through 12 different Row Count transformations into a single Union All transformation. From the Union All transformation the data goes into another Row Count transformation and then into a Conditional Split transformation. The data is split into 10 streams based on the last digit of one of the ID fields in the data. The 10 streams are fed to the 10 destination tables.



Every time I run the package (Start Without Debugging) the available physical memory goes from around 15 GB to 0 in about 2 minutes. The % committed bytes in use goes from 5% to 100% in about 5 minutes. Once at 100% it stays there for around 5 minutes before it finally gives me the following error message:



The system reports 98 percent memory load. There are 17178939392 bytes of physical memory with 189382656 bytes free. There are 8796092891136 bytes of virtual memory with 8742748930048 bytes free. The paging file has 54388109312 bytes with 16056320 bytes free.



This message is followed by a bunch of other messages:



SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Union All" (2073) failed with error code 0x8007000E. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.


SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread1" has exited with error code 0x8007000E. There may be error messages posted before this with more information on why the thread has exited.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.


SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Dr 2" (663) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.


...



SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Dr 3" (898) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.


SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread2" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.



I have tried adjusting the Engine threads down from 5 to 4 to 2. I have tried adjusting the FastLoadMaxInsertCommitSize from 1000000 to 100000 to 1000 (Destinations are tablocked and Check Constraints). I have tried moving the DefaultBufferMax up to 16500 and down to 2000.



Nothing works. The package fails every time within 20 minutes of starting.



I would prefer not to have to rewrite the package to process each file sequentially, as that would take forever.



Any ideas would be greatly appreciated.



Thanks.

-Scott

View 5 Replies View Related

SQL Backup Continuosly Grows And Fills Up Drive Issue

Nov 1, 2007

I currently have a SQL backup process that backs up my databases over the network to a backup drive on a separate system. I recently began having a strange issue in which the backup continually writes to the backup drive until the drive fills up and then the job fails. I also noticed that when I kill the job on the host server, the backup file drops back to its normal size. The normal file size is 300 GB, but it has grown to over 400 GB. I looked at various logs and even ran several backup tests successfully.

I am trying to figure out whether this is a known SQL Server issue or an issue with the OS.

Specs:

Windows 64-bit / SQL Server 2005 64-bit

Thanks in advance for the help.

View 5 Replies View Related

T-SQL (SS2K8) :: Query With Several Joins Fills Up TempDB And Won't Finish

Aug 14, 2014

I have a query that I need for a report. Originally it was 4 queries used in Crystal Reports. Now I want to create the same report with SSRS, so I combined all the queries into one in order not to use subreports [URL].....

Tempdb fills up to nearly 90 GB. I am running SQL Server on a local box, so I am sure there is no other traffic. Here is the query:

SELECT AdHaupt.NSprache_ID
,AdHaupt.mengentext AS mengentextHaupt
,AdHaupt.Einzelpreis
,AdHaupt.Anzeigebezeichnung
,AdHaupt.Gesamtpreis

[Code] ...

I ran it with TOP 10 as well, just to see if it would finish at all, but it never did (it has been running for an hour now).
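
To see which session is actually allocating the tempdb space while the query runs (typically sort and hash work tables spilled by the joins), the space-usage DMV can be useful. A sketch using the standard SQL 2008 views:

SELECT session_id,
       SUM(internal_objects_alloc_page_count) AS internal_pages_allocated,  -- sorts, hash joins, spools
       SUM(user_objects_alloc_page_count)     AS user_pages_allocated       -- temp tables, table variables
FROM sys.dm_db_task_space_usage
GROUP BY session_id
ORDER BY internal_pages_allocated DESC;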

View 9 Replies View Related

Tempdb Grows Rapidly And Fills Up Disk Space

May 10, 2006

Hi,

The tempdb file on one of our servers grew very large and used all available disk space. This is SQL Server 2000 SP4. I have installed hotfix version 8.00.2187. I opened a Profiler trace but still can't get to the root of the problem. Any help will be appreciated.

Egbon

View 1 Replies View Related

LOG Fills All Drive Space In Simple Recovery Mode

Dec 21, 2007



I was amazed to see this morning that the log file is consuming the whole disk, even though the database is in simple recovery mode.

What could cause the log to fill the disk even in simple recovery mode?

View 10 Replies View Related

SQL 2012 :: How Server Fills Empty Spaces In Data Files

Oct 1, 2015

I understand that we shouldn't shrink data files, as it can cause heavy fragmentation along with high log usage, IO, CPU, etc.

In a DB in which a lot of DML transactions occur, there will be empty space whenever deletions occur.

Will SQL Server fill that space with data when new insertions occur?

View 4 Replies View Related

How To Re-set A Job To Run 10 Minutes Later

Aug 12, 2005

Hi,

I have a job where the first step starts and checks for a condition. If it's not true, I want the job to reset itself and start again in 10 minutes. I'm using sp_stop_job and sp_update_jobschedule and, initially, it looks like it works. But since it's a daily job, the 'Next Run Date' increments to the following day. Even though I'm using sp_update_jobschedule to keep active_start_date on the same day, it still increments. I've tried updating sysjobschedules directly, but get the same results.

Any thoughts much appreciated! Here's my code:
USE msdb


--This is the part that goes in the job step
--and increments the next_run_time if the condition is true.

If
(Select count('x') from mytable (NoLock)
Where PublicationDate > getdate()) < 1
BEGIN
Declare @ActiveStartDate int
Declare @ActiveStartTime int

Select @ActiveStartDate = active_start_date from msdb.dbo.sysjobschedules (NoLock)
Where schedule_id = 61
Select @ActiveStartTime = active_start_time from msdb.dbo.sysjobschedules (NoLock)
Where schedule_id = 61

EXEC msdb.dbo.sp_stop_job @job_name = 'Owners'
Select @ActiveStartTime = @ActiveStartTime + 1000

EXEC sp_update_jobschedule @job_id = '46C074D4-908D-46AE-8B0C-A23E3AD4A4F6',
@name = 'Daily',
@active_start_date = @ActiveStartDate,
@active_start_time = @ActiveStartTime
End

View 3 Replies View Related

Min/max Of X Minutes

Mar 27, 2006

I am trying to develop a SQL statement that will create a recordset of the min (or max) values in x-minute increments over a period of time.

E.g., over a period of 7 days I have data that was collected at 1-minute intervals. I need to know the min (or max) value in each 10-minute interval over that same period of time.

Is there an efficient way of doing this?
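
A sketch of one approach (the table and column names Readings, ReadingTime and Value are assumptions): round each timestamp down to its 10-minute bucket and group on that.

SELECT DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, ReadingTime) / 10) * 10, 0) AS bucket_start,
       MIN(Value) AS min_value,
       MAX(Value) AS max_value
FROM dbo.Readings
GROUP BY DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, ReadingTime) / 10) * 10, 0)
ORDER BY bucket_start;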

View 3 Replies View Related

Group On Minutes

Mar 4, 2002

Hi,
I need to write a query which gives me a count for 0-15 minutes and then for 0-720 minutes.
I don't know how to group it.
Any help appreciated.
TIA
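
Since the two ranges overlap, conditional aggregation may be simpler than GROUP BY. A sketch (DurationMinutes and the table name are assumed):

SELECT SUM(CASE WHEN DurationMinutes BETWEEN 0 AND 15  THEN 1 ELSE 0 END) AS count_0_to_15,
       SUM(CASE WHEN DurationMinutes BETWEEN 0 AND 720 THEN 1 ELSE 0 END) AS count_0_to_720
FROM dbo.MyTable;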

View 1 Replies View Related

Datetime To Minutes?

Aug 3, 2004

Hey all, I need to find the ratio of the difference between 2 datetime variables to the difference between another 2 datetime variables. I figured the best way to do it is to convert both differences to a number of minutes.

Can anyone help?
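
A sketch (the variable names are placeholders): DATEDIFF gives each difference in minutes, and casting to float avoids integer division.

DECLARE @start1 datetime, @end1 datetime, @start2 datetime, @end2 datetime;
-- ... assign the four values ...

SELECT CAST(DATEDIFF(MINUTE, @start1, @end1) AS float)
       / NULLIF(DATEDIFF(MINUTE, @start2, @end2), 0) AS ratio;  -- NULL instead of a divide-by-zero error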

View 1 Replies View Related

CURRENT_TIMESTAMP - 5 Minutes

Sep 25, 2006

Hi,

I want to use CURRENT_TIMESTAMP minus 5 minutes in a SELECT and WHERE clause.

I have tried using

CURRENT_TIMESTAMP - 0.004 AS [Time_-6]

But this does not round off to a whole minute.

Also tried

CURRENT_TIMESTAMP, dateadd(minute, datediff(minute, 0, CURRENT_TIMESTAMP) / 5 * 5, 0)

But this does not keep the whole seconds, e.g.

CURRENT_TIMESTAMP = 10.03.33

CURRENT_TIMESTAMP, dateadd(minute, datediff(minute, 0, CURRENT_TIMESTAMP) / 5 * 5, 0) = 10.00.00

Can anyone help??
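
DATEADD keeps the seconds intact, so there is no need to approximate with a day fraction or round to a boundary. A sketch:

SELECT DATEADD(MINUTE, -5, CURRENT_TIMESTAMP) AS five_minutes_ago;

-- In a WHERE clause:
-- WHERE SomeDateColumn >= DATEADD(MINUTE, -5, CURRENT_TIMESTAMP)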

View 2 Replies View Related

MS SQL Performance From 10 To 3 Minutes

Sep 27, 2006

Hello all !

I am running an SQL query from a .NET application.
It normally returns the rows in 10 seconds,
but from time to time it takes 2 or 3 minutes and the application nearly crashes (or does crash),

with exactly the same data in the database.

What could the reasons be?

thank you

View 8 Replies View Related

Got Seconds But Need Minutes

Apr 16, 2004

I have a result that comes out as a number of seconds, but I need to see it converted to hours, minutes and seconds. Is there a convert function that would do this?

Thanks,
Dan
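
A sketch (assumes an integer column of elapsed seconds, called ElapsedSeconds here):

SELECT ElapsedSeconds,
       ElapsedSeconds / 3600        AS hours,
       (ElapsedSeconds % 3600) / 60 AS minutes,
       ElapsedSeconds % 60          AS seconds,
       CONVERT(varchar(8), DATEADD(SECOND, ElapsedSeconds, 0), 108) AS hh_mm_ss  -- only valid for durations under 24 hours
FROM dbo.MyResults;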

View 3 Replies View Related

Date - 15 Minutes

May 11, 2008

How can I get my date minus 15 minutes in SQL?
Is there a DATEADD function?

View 2 Replies View Related

Group By Every 15 Minutes?

Sep 20, 2013

I have this script that runs against a LOGGING database and finds the hourly requests for a particular firm.

The date format is "2013-08-19 13:44:50.177".

How can I group it by every 15 minutes?

select
LocalDate [date],
LocalHour [Hour],
count(*) [requests],
avg(web_request_duration) [avg response time],
min(web_request_duration) [min response time],
max(web_request_duration) [max response time]

[Code] ....
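
A sketch, assuming the table also holds the full timestamp behind that date format in a datetime column (called LogTime here; the table name dbo.RequestLog is likewise an assumption): round each timestamp down to its 15-minute boundary and group on it.

SELECT DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, LogTime) / 15) * 15, 0) AS interval_start,
       COUNT(*)                  AS [requests],
       AVG(web_request_duration) AS [avg response time],
       MIN(web_request_duration) AS [min response time],
       MAX(web_request_duration) AS [max response time]
FROM dbo.RequestLog
GROUP BY DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, LogTime) / 15) * 15, 0)
ORDER BY interval_start;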

View 4 Replies View Related

Set Time Up To Minutes

Oct 28, 2014

I am using the query below to get today's date and time (2 hours ahead of the actual time):

select dateadd(HOUR, 2, getdate()) as time_added

The result of the above query is "2014-10-28 13:19:09.343", but I want the time rounded down to the hour, as shown below:

"2014-10-28 13:00"

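A sketch: truncate to the hour with the usual DATEDIFF/DATEADD trick, then format with style 120 (yyyy-mm-dd hh:mi:ss) and keep the first 16 characters.

SELECT CONVERT(varchar(16),
               DATEADD(HOUR, DATEDIFF(HOUR, 0, DATEADD(HOUR, 2, GETDATE())), 0),
               120) AS time_added;   -- e.g. "2014-10-28 13:00"
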
View 7 Replies View Related

All Records Within X Minutes Of Each Other

Aug 30, 2005

Consider a table that holds Internet browsing history for users/machines, date/timed to the minute. The object is to tag all times that are separated from the previous and subsequent times by x number of minutes or less (it could vary, and wouldn't necessarily be a convenient round number). This will enable reporting "active time" for users (a dubious inference, but hey).

There are a lot of derivative ways of seeing this information that might be good to get to. What's the first and last of these sets of times? What percentage of a given period is spanned by active times, and not? What is the average duration of such periods? What is the average interval between web hits during such periods? During other times?

Blah, blah. The basic problem is my principal problem. I don't have much experience with cursors, but from what I understand it would be very good indeed to spare them, given the number of records I anticipate working with.

I'd be glad of any pointers.

--Scott
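
A cursor-free sketch of the basic tagging step (the names Hits, UserId and HitTime are assumptions, with @gap as the separation in minutes): a hit counts as "active" if another hit by the same user falls within @gap minutes of it.

DECLARE @gap int;
SET @gap = 10;

SELECT h.UserId, h.HitTime
FROM dbo.Hits AS h
WHERE EXISTS (SELECT 1
              FROM dbo.Hits AS p
              WHERE p.UserId = h.UserId
                AND p.HitTime <> h.HitTime
                AND ABS(DATEDIFF(MINUTE, h.HitTime, p.HitTime)) <= @gap);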

View 29 Replies View Related

The DB Freezes For 5 Or 10 Minutes

Oct 9, 2007

Hi,
The SQL database freezes and no one can access it for 5 or 10 minutes. Even SELECT queries don't execute; nothing is displayed.

Only when we kill the blocking process ID in the SQL Activity Monitor is the database released so people can work again.
What does it mean that no query executes until we kill that process ID? What could it be?

Also, we recently created indexes and ran the Tuning Advisor. Is it possible that creating the indexes caused the database to freeze?

Thanks a lot for your help
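
When it happens again, it is worth capturing who is blocking whom before killing anything. A sketch using the SQL 2005 DMVs:

SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;   -- sessions currently being blocked, and by whom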

View 6 Replies View Related

Cannot Connect Again After 15 Minutes

Nov 19, 2006

Hi,

I have the full version of SQL Server 2005 installed on a remote server, and when I start my computer I can connect to the database from SQL Server Management Studio and from a program I'm writing. After about 15 minutes I find that I can no longer connect with either program, but if I enter the server's IP address in Management Studio I can connect (this does not work with my program). I have to log off my computer before I can connect again.

Is anyone having the same problem, or does anyone know how to fix it?

Thanks

PQSIK

View 1 Replies View Related

Execute Trigger Every Minutes

Feb 13, 2008

I want to create a trigger that will run every 5 minutes.
This trigger would actually run a stored procedure named, say, "SP_SetRooms".
This SP will in turn run and update the Rooms table.
Would anyone know how to help me get started on creating a trigger with the info I've provided?
Thank you,
Dharmendra parihar
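
Triggers only fire in response to data changes, not on a timer, so the usual route is a SQL Server Agent job whose step runs the procedure, scheduled every 5 minutes. A sketch (the job itself must already exist and the names are placeholders):

EXEC msdb.dbo.sp_add_jobschedule
     @job_name             = N'Run SP_SetRooms',
     @name                 = N'Every 5 minutes',
     @freq_type            = 4,    -- daily
     @freq_interval        = 1,
     @freq_subday_type     = 4,    -- sub-day units of minutes
     @freq_subday_interval = 5,    -- every 5 minutes
     @active_start_time    = 0;    -- starting from midnight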

View 4 Replies View Related

SqlCacheDependency Works Just For 2-3 Minutes

Jan 31, 2006

I have the following simple code bound to a button on a Web page:

string time = (string)Cache["KEY"];
if (time == null)
{

SqlConnection sqlConnection = new SqlConnection(@"Server=BIZYUSUFSQL2005;Database=Deneme;User Id=sa;Password=;");
SqlCommand command = new SqlCommand(@"select KOLON1 from dbo.CACHE", sqlConnection);
sqlConnection.Open();

SqlCacheDependency dependency = new SqlCacheDependency(command);
time = System.DateTime.Now.ToString();
Cache.Insert("KEY", time, dependency);
command.ExecuteNonQuery();
sqlConnection.Close();
}
return time;
This code has to return the time value from the cache. When a record is inserted into the CACHE table, the cached item has to be invalidated and a new time value returned.
The code works properly for 2-3 minutes. But when there is no activity for 5 minutes, the cache invalidation no longer works.
 

View 1 Replies View Related

How To Subtract Minutes From The Date?

Oct 18, 2007

I want to fetch all the records which are older than 2 minutes.
How can I do it?
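
A sketch (the table name and the datetime column, called CreatedDate here, are assumptions):

SELECT *
FROM dbo.MyTable
WHERE CreatedDate <= DATEADD(MINUTE, -2, GETDATE());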

View 2 Replies View Related

Replicate Every 15 Minutes: Insane?

Jan 15, 2008

The company I work for has no experience with replication, and neither have I.
Now I received a functional specifications document, and apparently what they want is to use replication to maintain a copy of a production database on the data warehouse server.
The database is 70GB, and they want to do it every 15 minutes (so their reports will contain up-to-date data).

As I said, I know nothing about replication, but to me this sounds like madness. I fear that it will (at least) have a negative impact on the performance of the production database.

So is it possible? Does replication have a big impact on the database or is it hardly noticable? I expect about 500 new records every 15 minutes; production database and data warehouse are on different servers; both are SQL2005.

View 6 Replies View Related

8000 Records In Over 12 Minutes?

May 28, 2014

I have the following SQL select query:

Code:
SELECT
inbound_emails.inbound_email_id,
inbound_emails.campaign_id,
campaign_header.campaign_name,
customers.customer_name,
inbound_emails.date_received,

[Code] ....

Without any indexes (other than PK's) this query would take 1min 21 secs to return all 8500 records.

I wasn't satisfied with that so I added indexes to the tables.

Indexes are;

inbound_emails:

PK: inbound_email_id
1. campaign_id (non-unique, non-clustered)
2. email_from (non-unique, non-clustered)
3. email_to (non-unique, non-clustered)

campaign_header:

PK: campaign_id
1. campaign_name (unique, non-clustered)

customers:
PK: customer_id
1. customer_name (unique, non-clustered)

contracts:
PK: contract_id
1. contract_name (unique, non-clustered)
2. customer_id (non-unique, non-clustered)

Have I made a hash of the indexes? Or is the query statement badly written? What is a reasonable amount of time to expect to retrieve 8500 records?

I think email_from and email_to on inbound_emails are the main culprits, but I don't understand why.

View 11 Replies View Related






