Huge Log File On SQL Server 7

Feb 21, 2001

I have a huge log file (285 MB) on SQL Server 7.
The database itself is only about 10 MB.
How can I reduce the log file? Is it possible to rebuild it from scratch?

I tried the Truncate Transaction Log but it didn't help.
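For reference, a minimal sketch of the usual SQL Server 7 fix (database and file names here are placeholders). Truncation only frees space inside the log; the physical file shrinks only with DBCC SHRINKFILE, which is likely why Truncate alone appeared to do nothing:

USE MyDatabase
EXEC sp_helpfile                           -- note the logical name of the log file
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY   -- discard inactive log records (breaks the log backup chain)
DBCC SHRINKFILE (MyDatabase_Log, 10)       -- shrink the physical file to roughly 10 MB

Take a full database backup afterwards, since TRUNCATE_ONLY invalidates later log backups.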




Insert Huge Text File Into SQL Server

Dec 16, 2004

I am writing a .NET service, and I need to insert files into a SQL database for temporary storage.

I have never inserted a file into a SQL database before. I am thinking about using the image field type. Has anyone done this before? How did you do it?
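For what it's worth, a sketch of a holding table built around the image type (all names here are hypothetical); the .NET service then supplies the file's bytes through a parameterized insert:

CREATE TABLE dbo.FileStore (
    FileID   int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    Contents image NOT NULL,               -- image holds binary data up to 2 GB
    AddedOn  datetime NOT NULL DEFAULT GETDATE()
)

-- From the .NET service, run a parameterized command and pass the file's byte
-- array as a parameter of type SqlDbType.Image:
-- INSERT INTO dbo.FileStore (FileName, Contents) VALUES (@name, @bytes)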


SQL Server 2008 :: Huge Volume Of Records To Copy To Excel File Through SSIS

Oct 22, 2015

I am copying data from a database to an Excel file through SSIS; the database is MS SQL 2005 and BIDS is also 2005. However, the job doing this task fails every time. As per investigation, the query returns more than 100,000 rows, and Excel's .xls format has a limit of 65,536 rows per sheet. Is there a way of raising the limit, or a better approach, so that everything gets copied to the Excel file successfully?

The PrimeOutput method on component "Source - Query" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error

Error: 2015-10-22 04:34:58.05 Code: 0xC0047021
Source: Data Flow Task
Description: SSIS Error Code DTS_E_THREADFAILED.

Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited. End Error

DTExec: The package execution returned DTSER_FAILURE (1). Started: 4:30:00 AM Finished: 4:35:05 AM Elapsed: 304.844 seconds. The package execution failed. The step failed.
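One workaround, sketched with assumed table and column names: number the rows in the source query so the data flow can be split into Excel-safe chunks (the .xls format caps out at 65,536 rows per sheet):

SELECT SomeKey, Col1, Col2,
       (ROW_NUMBER() OVER (ORDER BY SomeKey) - 1) / 65000 AS SheetNo
FROM dbo.SourceTable

-- In the SSIS data flow, a Conditional Split on SheetNo can then route each
-- chunk of 65,000 rows to its own Excel destination (worksheet).

Alternatively, an Excel 2007 (.xlsx) destination raises the per-sheet cap to 1,048,576 rows.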


What Should I Do With The Huge Log File

Oct 11, 2004

Hi all,
I found that my database log file is 26 GB while the database file is just about 280 MB. We do a full backup every day. However, my SQL Server seems to be running very slowly now, so please advise:

1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be the reason my SQL Server is slowing down?
3. Could anyone point me to more information on the transaction log?
Thank you, much appreciated!
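On points 1 and 3: a full database backup never truncates the log; in FULL recovery, only a log backup does, which is why it keeps growing. A minimal sketch (database, file, and path names are placeholders) of the statement a scheduled job would run, plus a one-time shrink of the physical file:

BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'  -- frees inactive log space for reuse
DBCC SHRINKFILE (MyDatabase_log, 500)                            -- one-time shrink, target size in MB

With log backups running on a schedule, the file should stop regrowing to anything like 26 GB.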


Log File HUGE!!

Oct 30, 2007

Hi guys, it's my first post! It's also about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005; one of our log files is 255 GB and needs to be made smaller very fast! We are almost out of disk space and the log is growing quickly.

I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things; I am just worried and don't want to mess anything up. I know this is probably a simple task, but as I said, with the truncate command I was reading about, I don't even know where to type it in! If someone could please help, it would be much appreciated. Thanks so much.
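Commands like these go into a query window: open SQL Server Management Studio, connect to the server, and click New Query. A minimal sketch with placeholder database and file names; if nothing ever takes log backups, switching to SIMPLE recovery stops the log from growing unchecked:

ALTER DATABASE WSS_Content SET RECOVERY SIMPLE
USE WSS_Content
EXEC sp_helpfile                          -- note the log file's logical name
DBCC SHRINKFILE (WSS_Content_log, 1024)   -- shrink the log to roughly 1 GB

If the database must stay in FULL recovery for point-in-time restores, schedule regular BACKUP LOG jobs instead of switching to SIMPLE.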


Huge .BAK FILE

Jan 28, 2008

I have a .bak file of 72 GB, but my database size is only 32 GB (the value reported by sp_spaceused).
Does anyone know why the .bak file is so big? Is it possible to reduce its size, and how?
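A common cause is that each backup run appends another backup set to the same file. A quick check, and the fix, sketched with placeholder names and paths:

RESTORE HEADERONLY FROM DISK = 'D:\Backups\MyDB.bak'            -- one row per backup set stored in the file
BACKUP DATABASE MyDB TO DISK = 'D:\Backups\MyDB.bak' WITH INIT  -- INIT overwrites instead of appending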



http://www.sqlserverstudy.com


Huge LDF File

Nov 13, 2007



Hi,
this is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file keeps growing rapidly: 40 GB, 50 GB, and now 57 GB, while the .mdf file is around 15 MB. We created a backup and tried to restore it to another system, but the restore asks for 57 GB of free space. How do we proceed with the recovery? We have backups, but they demand that much space for the log file. How can we retrieve the data?
Regards,
Pramod


SQL Backup File Is HUGE!!!

Sep 25, 2004

One of my databases is approximately 1.5 GB, but when I run a backup the backup file explodes to 38 GB.

What is the proper usage of Shrink Database? Is it safe, or is there another method to reduce the size?


The Log File For My Database Is Huge

May 20, 2006

The log file for my database in SQL Server 2005 is huge. How do I empty it, or in effect shrink it, or start it over? Thanks


Huge Backup File

May 12, 2000

I am using SQL Server 7 and have about 5 databases. One of them has a data file of about 10 MB, and most of the others are larger. I do a nightly backup to both a local and a mapped drive. On both, the size of the backup file for this database is more than 500 MB, but the rest appear to be an appropriate size. Does anyone know why this would be happening? The database works fine; it does not get a lot of insert/delete activity, and I run DBCC every weekend. If anyone has any ideas, I would sure like to hear them.


What Is The Best Way To Upload Huge File Into SQL 2K

Mar 13, 2002

I have a 2 GB text file (semicolon-delimited) which I need to pump into SQL 2000. What is the best way to achieve this in the shortest time?

I tried DTS BCP; it took 1 minute 14 seconds to transfer 13,457 records, and that is only 1/2000 of the records I need to transfer. Please help.
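BULK INSERT (or bcp run directly, outside DTS) is usually far faster than a DTS data pump for a flat file. A sketch with assumed path and table names:

BULK INSERT dbo.TargetTable
FROM 'C:\data\bigfile.txt'
WITH (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    TABLOCK,           -- needed for minimally logged loads into an empty table
    BATCHSIZE = 50000  -- commit in chunks so the log does not balloon
)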

Thanks.

regards,
Terry


Import Of Huge XML File

May 22, 2013

I have an XML file of 44 GB (not MB, it's really GB), delivered by the Danish customs authorities.

My problem is simple: how do I import such a beast? I have seen a limit of 2.1 GB mentioned everywhere.


Delete Huge Log File

Feb 15, 2006

How can I delete a very huge log file to free up some hard disk space?
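The safe route is BACKUP LOG followed by DBCC SHRINKFILE, which empties and shrinks the file without deleting it. If the goal really is to rebuild the log from nothing, one old (and risky) pattern is detach / delete / reattach; a sketch with placeholder names, to be attempted only after a full backup and with no users connected:

EXEC sp_detach_db 'MyDatabase'
-- delete the old .ldf file in Windows, then:
EXEC sp_attach_single_file_db 'MyDatabase', 'C:\Data\MyDatabase.mdf'  -- builds a new, small log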


'transaction Log Backup' File Is Huge

May 19, 2007

Hello all,

I have inherited a SQL 2000 server, and am therefore an absolute beginner with SQL 2000.

I know this has been covered before, but I don't know how to apply the KB article, as I don't know how to run the commands/scripts:
http://support.microsoft.com/kb/272318/

I can no longer back up the SQL database because the 'transaction log backup' file is about 17 GB. The SQL database is only about 2 GB! The partition it is backed up onto fills up every day.

Can anyone help me?

Thanks
Puskas


Turning Off Logging / Avoiding Huge Log File

Nov 15, 1999

Help,

I have a user who needs to make many data changes to a huge table with over 2 million rows. He hopes to do this by running a series of queries.

Last time, our log grew gargantuan. I know 7.0 is wonderful and dynamic, but after an hour of rambling through Books Online I'm more confused than ever.

Is there a way we can code hard checkpoints or log backups into his query at intervals? Do we need to DBCC SHRINKFILE the log file?
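One pattern that keeps the log small (a sketch; the table, column, and path names are invented): cap each statement with SET ROWCOUNT and back up the log between batches, so the log only ever holds one batch's worth of changes:

SET ROWCOUNT 50000                 -- SQL 7-era way to limit rows touched per statement
DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    UPDATE dbo.BigTable
    SET SomeCol = 'NewValue'
    WHERE SomeCol <> 'NewValue'    -- predicate must exclude rows already updated
    SET @rows = @@ROWCOUNT
    BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.bak'
END
SET ROWCOUNT 0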

Thanks,

Mark Blackburn
mark@mbari.org


Database Script Creates Huge Log File

Mar 20, 2008

I am creating a database for an application through a script. After the tables, views, and stored procedures are created, the database is populated with data. After all of this (and before the application is even run), the log file is about 700 MB. If I shrink the database, it takes the log down to 1 MB. The .mdf file is about 165 MB before and after the shrink.

I have two questions:
1. Is there something I should look for in my database scripts, or is there a setting that could prevent the log from growing so large?

2. Is there a script I can run after the database has been created and populated to shrink it?
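On the second question, a sketch (database and file names assumed): keep the database in SIMPLE recovery while the population scripts run, so the log truncates itself at every checkpoint, then shrink whatever remains and restore the normal model:

ALTER DATABASE MyAppDB SET RECOVERY SIMPLE
-- ...run the table/view/procedure creation and the data population here...
USE MyAppDB
CHECKPOINT
DBCC SHRINKFILE (MyAppDB_log, 10)
ALTER DATABASE MyAppDB SET RECOVERY FULL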

Thank you for your help


Move Current Huge Data File To New GPT Drive?

Oct 31, 2015

I have a database data file at almost 2 TB, maxing out a Windows drive with only 16 GB left. Should I just add another data file on another Windows drive for growth, or move the current huge data file to a new GPT drive? Or both: add another data file and also move the existing one to its own new GPT drive?

Primary objective is to make do for now.
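Either option is only a couple of statements; a sketch with invented names and paths. Adding a file is an online operation, while relocating the existing file needs a brief outage for the copy:

-- Option A: add a second data file on the new drive (new allocations spread across both)
ALTER DATABASE BigDB
ADD FILE (NAME = BigDB_data2, FILENAME = 'E:\SQLData\BigDB_data2.ndf',
          SIZE = 100GB, FILEGROWTH = 10GB)

-- Option B: repoint the existing file, copy it while the database is offline, bring it back
ALTER DATABASE BigDB MODIFY FILE (NAME = BigDB_data, FILENAME = 'E:\SQLData\BigDB_data.mdf')
ALTER DATABASE BigDB SET OFFLINE WITH ROLLBACK IMMEDIATE
-- ...copy the .mdf to E:\SQLData in Windows, then:
ALTER DATABASE BigDB SET ONLINE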


Question About Log File Size When Alter Huge Table

Oct 31, 2005

I have the following question, and I would like to hear what you think about it, and whether there is a better solution to my problem. I have a huge table with 60 GB of data (image files). The problem always happens when I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4); SQL Server then performs the ALTER TABLE command, which must be something similar to "insert into a new table + drop the actual table". For that I need about 60 GB of space for my log file, and it takes hours to complete the operation. Is this the only way to alter a single field in my table? I would like to hear your opinions. Thanks. Alberto
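It is not the only way. Because char is fixed-width, widening the column in place rewrites every row, but adding a nullable column is a metadata-only change. A sketch of the lower-impact route (table and column names invented):

ALTER TABLE dbo.BigImages ADD CodeNew char(4) NULL   -- instant, metadata-only
-- copy CodeNew = Code in small batches (see the SET ROWCOUNT loop pattern
-- earlier on this page), then swap the columns:
ALTER TABLE dbo.BigImages DROP COLUMN Code
EXEC sp_rename 'dbo.BigImages.CodeNew', 'Code', 'COLUMN'

The batched copy still writes the same data over time, but each batch commits separately, so the log can be reused between batches instead of holding the whole rewrite.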


Simple Recovery Model Database - Huge Log File

Nov 3, 2015

I have a database in "Simple" recovery mode whose .ldf has grown to 270 GB. This database is a data warehouse, so "full" is not required. I put it in simple mode a month ago and shrank the log down, and now it has filled up the disk again.

What steps can I take to mitigate this in future? I've read that this is caused by long-running transactions, which keep the log from being truncated. Should I put the database back into full mode and back up/truncate the log daily?

The auto-growth is set to 128 MB, which is very low.
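Before changing the model, it is worth checking what is pinning the log: in SIMPLE recovery it is reused automatically, but only past the oldest open transaction. A sketch, assuming the database is named MyWarehouse:

DBCC SQLPERF(LOGSPACE)          -- how much of each log is actually in use
DBCC OPENTRAN('MyWarehouse')    -- oldest active transaction holding the log, if any
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyWarehouse'      -- SQL Server's own reason the log cannot be truncated

If log_reuse_wait_desc says ACTIVE_TRANSACTION, the fix is to batch the offending warehouse loads rather than to change the recovery model.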


HELP: How To Avoid Ridiculous System Resource Issues When Connecting To A Huge Excel File

Sep 20, 2007

Hi all,

I have a 400 MB Excel file that I consume from another automated process (don't ask). I copy this file down locally to my server, and I am attempting to create an SSIS package that points to it via a connection manager. My computer starts gobbling up massive amounts of memory (devenv.exe gets up to about 800 MB or so, then drops back down to 100 MB) even when I simply attempt to rename the connection on the Connection Managers tab.

I have set all BypassPrepare properties to TRUE and all ValidateExternalMetadata properties to FALSE, and still it can take 3 to 6 minutes for BI Dev Studio to respond. My specs:

Intel Centrino Duo 2.00 GHz
2GB RAM
XP Pro SP2

There MUST be a way for me to work effectively on a file of this size. Please help! Thanks much for any assistance.

Sincerely,

Brian Pulliam


Huge Deletes In A Huge Table

Apr 3, 2000

SQL 7 SP1 NT4 SP5

I have a TRANSACTION table with 150 million rows.

I have a USER table.

Each user has about 600 records in the TRANSACTION table.

The TRANSACTION clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.

The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.

I want to go through the USER table and delete all users who have not visited me in a while.

I want to do this without substantially hindering performance in a production environment. I can perform this over a week period or two if needed.

The best way I thought of doing this was to grab x amount of users in a cursor and loop through deleting their corresponding TRANSACTION records.

Does anyone have any ideas on a better way? What is going to happen to my indexes during this time?
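A set-based batched delete avoids the cursor while keeping each transaction short (a sketch; the LastVisit column and cutoff date are invented). Because the clustered index leads on USERID, the join reaches each user's rows directly:

SET ROWCOUNT 10000
DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    DELETE t
    FROM [TRANSACTION] t
    JOIN [USER] u ON u.USERID = t.USERID
    WHERE u.LastVisit < '19991001'   -- hypothetical inactivity cutoff
    SET @rows = @@ROWCOUNT
END
SET ROWCOUNT 0

The indexes stay usable throughout; each batch just performs ordinary index maintenance, so expect slower deletes rather than any need to rebuild.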

Thanks !!!


SQL Server Huge Bug On Multiplication ?

Nov 28, 2007

When I run the following code:

DECLARE @var1 float,
        @var2 int;
SET @var1 = 32.91;
SET @var2 = 100;
SELECT CAST(@var1 * @var2 AS int);

the result is... 3290 (whereas it should be 3291!).

The same happens for @var1 set to 33.91, 34.91, 35.91, 36.91, 37.91, 38.91, and 39.91, but not for other values above 40.

Is that a SQL Server bug?
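It looks like floating-point truncation rather than a bug: float is binary, so 32.91 is stored as a value fractionally below 32.91; multiplied by 100 it yields 3290.999..., and CAST to int truncates toward zero. A sketch of the two standard fixes:

DECLARE @var1 decimal(10,2), @var2 int
SET @var1 = 32.91
SET @var2 = 100
SELECT CAST(@var1 * @var2 AS int)            -- 3291: decimal stores 32.91 exactly
SELECT CAST(ROUND(32.91e0 * 100, 0) AS int)  -- 3291: round a float before casting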


Loading Huge XML To SQL Server 2005

Jan 13, 2008

Hi,
I am using VS2003 and SQL Server 2005. I receive a huge XML document from a web service and need to load it into a SQL Server 2005 table as records.
I found the solution below on the Microsoft site: load the XML into a DataSet, and from the DataSet into SQL Server 2005.
Note: If you call ReadXml to load a very large file, you may encounter slow performance. To ensure the best performance for ReadXml on a large file, call the DataTable.BeginLoadData method for each table in the DataSet, then call ReadXml. Finally, call DataTable.EndLoadData for each table in the DataSet.
Is there any better solution for loading huge XML into SQL Server 2005?
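If the XML can be written to disk first, SQL Server 2005 can load it without a DataSet round-trip (a sketch; the path, table, and element names are invented, and the xml type caps out at 2 GB per document):

DECLARE @doc xml
SELECT @doc = BulkColumn
FROM OPENROWSET(BULK 'C:\data\feed.xml', SINGLE_BLOB) AS x

INSERT INTO dbo.Orders (OrderID, Amount)
SELECT n.value('(OrderID)[1]', 'int'),
       n.value('(Amount)[1]', 'decimal(12,2)')
FROM @doc.nodes('/Orders/Order') AS t(n)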


SQL Server 2008 :: How To Migrate Huge Databases

Sep 27, 2015

I would like to migrate around 15 databases from the production server to a new production server. The biggest database is 80 GB.

Is there a fast way to do this with very little downtime?

What I can think of is to shrink the database files and perform a detach and attach.
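A lower-downtime alternative to detach/attach is to restore ahead of time and cut over with a final log backup (a sketch; server names and paths are invented, and WITH COMPRESSION needs a 2008 edition that supports it):

-- Ahead of time, per database, on the old server:
BACKUP DATABASE BigDB TO DISK = '\\NewServer\Stage\BigDB.bak' WITH COMPRESSION
-- On the new server:
RESTORE DATABASE BigDB FROM DISK = 'D:\Stage\BigDB.bak' WITH NORECOVERY
-- At cutover, on the old server (backs up the tail of the log and closes the database):
BACKUP LOG BigDB TO DISK = '\\NewServer\Stage\BigDB_tail.trn' WITH NORECOVERY
-- Finally, on the new server:
RESTORE LOG BigDB FROM DISK = 'D:\Stage\BigDB_tail.trn' WITH RECOVERY

Downtime is then roughly the tail-log backup and restore, not the full 80 GB copy.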


PULL Huge Table From SQL Server Problem

Jul 12, 2007



Hey guys,

I have met the same problem too.

I created a table with 1,380,000 rows of data; the real database size is about 114 MB, and the primary key is nchar(6). When I use RDA pull, I found that the primary key disappears on the PDA, so queries took a long time to respond. But when I deleted some rows, down to 680,000 rows of data, the primary key did come across in the pull from SQL Server.

PS: I didn't change any code, just deleted some rows.

Is that a SQL Mobile bug?

PS:
1. The Database and Temp Database limits are both 384 MB.
2. If I use Query Analyzer to add the primary key, it works. So strange!
3. The pull process returns "S_OK".
4. After the pull process finishes, the DB connection is still alive, so it does not look like a timeout problem.
5. Local connection string: "Data Source='%s\%s';SSCE:Database Password='%s';SSCE:Encrypt Database='true';SSCE:Max Database Size=384;SSCE:Temp File Max size=384;SSCE:Temp File Directory=%s"




Huge Performance Difference When Running UDF In Workstation Vs Server

Dec 13, 2007

Hi,

I created a CLR UDF that returns a large number of rows. When I run it from my VPC (XP, SQL Server Developer Edition, and 1 GB memory) it takes approximately 2 minutes 30 seconds to start displaying the rows (using Management Studio). When I run the same query on our development server (Win 2003, SQL Server Enterprise Edition, 8 GB memory, and 8 processors) it takes more than 15 minutes to start displaying the results. Does anybody have an idea why this is happening?

Thanks in advance


SQL Server 2014 :: Huge Number Of Rows In Table Spool

Jul 6, 2015

I have a CTE query against a table with 32K rows that runs fine in 2008 R2. I am running it in 2014 Std Ed. against the same data and it runs very slowly. Looking at the execution plan, I think I see what's contributing to the slowness.

Note that the "actual number of rows" is some 351M. How is this possible?

the query:

declare @amts table (claim int, allowed decimal(12,2), copay decimal(12,2), deductible decimal(12,2), coins decimal(12,2));

with unpaid (claimID) as
    (select claimID from claim where amt + copay + disct + mm + ded = 0)
insert @amts
select lineID, sum(rc), sum(copay), sum(deduct),
       case when sum(mm) > 0 and (sum(mm) < sum(mmamt)) then sum(mm) else 0 end
from claimln
where status is null
  and lineID not in (select claimID from unpaid)
group by lineID

It's like there's some massively recursive process going on?
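Not recursion, but rescanning: in an actual plan, "actual number of rows" is summed over every re-execution of an operator, and a table spool re-read once per outer row inflates it into the millions. A common fix (a sketch of the same query) is replacing NOT IN over the CTE with a correlated NOT EXISTS, which usually plans as a single anti-semi-join:

insert @amts
select c.lineID, sum(c.rc), sum(c.copay), sum(c.deduct),
       case when sum(c.mm) > 0 and sum(c.mm) < sum(c.mmamt) then sum(c.mm) else 0 end
from claimln c
where c.status is null
  and not exists (select 1 from claim cl
                  where cl.claimID = c.lineID
                    and cl.amt + cl.copay + cl.disct + cl.mm + cl.ded = 0)
group by c.lineID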


How To Work With Huge Amount Of Records In A Table Using MSSQL Server 2000?

Dec 21, 2005

In one of our forthcoming projects, using ASP.NET/C#/MS SQL Server, we have to deal with a business table holding about 15 million records. We want to know which methodologies we should adopt, from both a front-end and a back-end perspective, so the site gives optimised performance. Also, in place of a dedicated server, the hosting company provides MSDE (which comes with .NET). Will this create any problem for a project with such a huge table? Should we go for some advanced database techniques, such as clustering, splitting tables, etc.?

The business table contains the following fields:

ID, CategoryID (which comes from a Category table; each business is under a category), BusinessName, SignupDate, Address1, Address2, PhoneNumber, HoursOfOperation, YearsInBusiness, LicenseNumber, DiscountCoupon, Website


SQL Server Admin 2014 :: Huge Traffic Between Share Point Front End Servers To Content Database?

Feb 20, 2014

We have a customized SharePoint application with very minimal data usage, and we have used only 5 or 6 lists and libraries in SharePoint.

The configuration is:

Clients -- firewall --- load balancer ---- WF1 and WF2 --- SQL DB

Routing is via the firewall.

Suddenly the site went dead slow, and we are unable to trace the problem, as everything looks fine.

We checked with the firewall team and they stated it is fine from their end. We have also verified the counters: CPU, memory, page life expectancy, and buffer counters all look good, and we do not have huge data in the database. Only 50 concurrent users are working...
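When the usual counters look healthy, the server's cumulative wait statistics often show where the time is really going (blocking, I/O stalls, network waits). A sketch; the list of benign wait types to exclude is abbreviated here:

SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC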


How To Optimize Data Import With Huge Volumes And Joins Across Data Sources Not All SQL Server Based?

Jun 7, 2006

I need to periodically import a (HUGE) table of data from an external data source (not SQL Server) into SQL Server, with the following scenarios:

1. Some of the records in the external data source may not exist in SQL Server.
2. Some of the records may have a different value at different imports, but these records are identified unequivocally by the same primary key in the external data source and in SQL Server.
3. Some of the records in the external data source may be unchanged from what is in SQL Server.

Due to the massive volume of the import, I would like to import only the records which differ from what I have in SQL Server (cases 1 and 2 above). In fact, case 2 is the most critical.

I thought of making a query with a left outer join between the data in the external source table (SOURCE) and the data in the SQL Server table (DESTIN). The join is done on the respective primary keys (composite keys of up to 10 columns), and one of the WHERE conditions will be that the value in SOURCE is different from the value in DESTIN.

The result of this query would be exactly what I need to import.
How can I do this in SSIS? I couldn't figure out how to join tables in different data sources yet.

In fact, I cannot write a stored procedure to do it, since one of the sources is not a SQL Server data source.
I have seen the Lookup transformation in this article, http://www.sqlis.com/default.aspx?311, but it is not exactly what I want to do.
Another possibility is to use the Merge Join transformation, but due to the sorting it requires I believe its performance would be terrible!
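One workable pattern (a sketch with invented table and column names): let a plain SSIS data flow land the external rows in a SQL Server staging table, then compute the delta set-based in T-SQL, where the composite-key join is cheap:

-- Case 1: rows that do not exist in the destination yet
INSERT INTO dbo.Destin (PK1, PK2, Val)
SELECT s.PK1, s.PK2, s.Val
FROM dbo.Staging s
LEFT JOIN dbo.Destin d ON d.PK1 = s.PK1 AND d.PK2 = s.PK2
WHERE d.PK1 IS NULL

-- Case 2: rows whose value changed since the last import
UPDATE d
SET d.Val = s.Val
FROM dbo.Destin d
JOIN dbo.Staging s ON s.PK1 = d.PK1 AND s.PK2 = d.PK2
WHERE d.Val <> s.Val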

Thanks in advance for your suggestions!


Huge Log

May 27, 2007

Dear all,



I have a problem with the log.

MSSQL has written a 4 GB log, so how can I eliminate the huge log size?


Huge Log Files

May 9, 2008

I have SQL Server 2005 Express Edition with Advanced Services running on a small web server. It all runs fine, but every now and then the log files grow and grow and eventually use up all the disk space of 30 GB. As a quick fix, restarting SQL a couple of times clears out the logs and everything is up and running again. Any ideas on how to stop this happening?
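If a restart frees the space, the culprit may be the SQL Server error logs rather than database logs: a restart starts a fresh error log, and only a fixed number of archives are kept. That is a guess worth testing; this cycles the error log without a restart and can be scheduled:

EXEC sp_cycle_errorlog   -- closes the current error log and starts a new one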


Huge Transaction Log - Do I Need This?

Nov 19, 2004

I just peeked at my DNN setup and found that I have a transaction log about 98 GB large, compared to a DNN database that is only about 250 MB. Crazy, huh?

Do you happen to know what I need that transaction log for? Can I just delete it, or will that break my SQL DB? Is there a way I can keep only about a week of transactions in it so it doesn't grow so dang large?

Thanks in advance for your response!!

Tim x 4
(always learning!)







