Hi all, I found my database log file is 26 GB and the database file is just about 280 MB. We are doing a full backup every day. However, my SQL Server seems to be running very slowly now, so please advise:
1. How can I decrease/truncate my log file?
2. Could the huge size of the log file be the reason my SQL Server has slowed down?
3. Could anyone point me in the right direction for learning more about the transaction log?
Thank you, much appreciated!
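A minimal T-SQL sketch of the usual fix, assuming the FULL recovery model and hypothetical names (MyDb for the database, MyDb_log for the log file's logical name, which you can confirm with sp_helpfile): back up the log regularly so the inactive portion can be reused, then shrink the physical file once.

```sql
USE MyDb;
GO
-- 1. Back up the transaction log so the inactive portion becomes reusable.
BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb_log.trn';
GO
-- 2. Shrink the physical log file to a sensible target size (in MB).
DBCC SHRINKFILE (MyDb_log, 1024);
GO
```

If point-in-time recovery is not needed, switching the database to the SIMPLE recovery model stops the log growing like this in the first place; otherwise schedule log backups more frequently than the full backup.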
Hi guys, it's my first post! It's also about my first time really diving into SQL. We are using SharePoint on site here along with SQL Server 2005, and one of our log files is 255 GB and needs to be made smaller very fast!! We are almost out of disk space and the log is growing fast.
I am very new to SQL and don't even know where to go to enter commands, so you'll have to bear with me here. I've read about truncating and shrinking and some other things, but I am worried and don't want to mess anything up. I know this is probably a simple task, but like I said, with the truncate command I was reading about, I don't even know where to go to type it in!!! If someone could please help, it would be much appreciated. Thanks so much.
I have a .bak file of 72 GB, but my database size is only 32 GB (I got this value from sp_spaceused). Does anyone know why the .bak file is so big? Is it possible to reduce the size, and if so, how?
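One common cause of a .bak file much larger than the database is that every nightly backup is appended to the same file, so it holds many backup sets. A sketch for checking and fixing that (names and paths are hypothetical):

```sql
-- List the backup sets stored inside the file; several rows means backups are being appended.
RESTORE HEADERONLY FROM DISK = N'D:\Backups\MyDb.bak';
GO
-- WITH INIT overwrites the file instead of appending to it.
BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH INIT;
GO
```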
Hi, this is regarding SQL Server 2000 (it was upgraded from SQL 7). Its log file is growing at a very high rate: 40 GB, 50 GB, and now 57 GB. The MDF file is around 15 MB. We created a backup and tried to restore it to another system, but it asks for 57 GB of free space. How do we proceed with the file recovery? We have backups, but the restore asks for more space for the log file. How do we retrieve the data? Regards, Pramod
I am using SQL Server 7 and have about 5 databases. One of them has a data file of about 10 Meg, and most of the others are larger. I do a nightly backup to both a local and mapped drive. On both, the size of the backup file for this database is more than 500 Meg, but the rest appear to be an appropriate size. Does anyone know why this would be happening? The database works fine, it does not get a lot of insert/delete activity and I run DBCC every weekend. If anyone has any ideas I would sure like to hear from them.
I have a huge log file (285 MB) on SQL Server 7. The database itself is about 10 MB. How can I reduce the log file? Is it possible to build it again from scratch?
I tried the Truncate Transaction Log but it didn't help.
I have inherited a SQL 2000 server, and am therefore an absolute beginner with SQL 2000.
I know this has been covered before, but I don't know how to use the KB as I don't know how to run the commands/script. http://support.microsoft.com/kb/272318/
I can no longer back up the SQL database because the 'transaction log backup' file is about 17 GB. The SQL database is only about 2 GB! The partition that it is backed up onto fills up every day.
I am writing a .NET service, and I need to insert files in a SQL DB for temporary storage.
I have never inserted a file into a SQL database before. I am thinking about using the image field type. Has anyone done this before? How did you do it?
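A minimal T-SQL sketch of the storage side, assuming SQL Server 2005 or later (varbinary(max) is generally preferred over the older image type). The table and file names are hypothetical, and from a .NET service you would pass the file's bytes as a varbinary parameter instead of using OPENROWSET:

```sql
CREATE TABLE dbo.FileStore (
    FileId   int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260)  NOT NULL,
    Content  varbinary(max) NOT NULL,
    AddedAt  datetime       NOT NULL DEFAULT (GETDATE())
);
GO
-- Loading a file directly from T-SQL, just to show the column in use:
INSERT dbo.FileStore (FileName, Content)
SELECT N'report.pdf', BulkColumn
FROM OPENROWSET(BULK N'C:\temp\report.pdf', SINGLE_BLOB) AS f;
```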
I am creating a database for an application through a script. After the tables, views, and stored procedures are created, the database is populated with data. After all of this (and before the application is even run), the log file is about 700 MB. If I shrink the database, it takes the log down to 1 MB. The MDF file is about 165 MB before and after it has been shrunk.
I have two questions:
1. Is there something I should look for in my database scripts, or is there a setting that could prevent the log from growing so large?
2. Is there a script I can run in my SQL code after the database has been created and populated to shrink it? (See the sketch below.)
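A sketch of one way to handle both points, assuming SQL 2005+ and hypothetical names (MyAppDb, MyAppDb_log): switch the new database to SIMPLE (or BULK_LOGGED) recovery before the population step so the log does not retain every insert, then shrink the log at the end of the script.

```sql
ALTER DATABASE MyAppDb SET RECOVERY SIMPLE;
GO
-- ... run the table/view/procedure creation and data population scripts here ...
GO
USE MyAppDb;
GO
DBCC SHRINKFILE (MyAppDb_log, 10);          -- logical log file name, target size in MB
GO
ALTER DATABASE MyAppDb SET RECOVERY FULL;   -- only if full recovery is actually needed
GO
```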
I have a database data file at almost 2 TB, maxing out a Windows drive with only 16 GB left. Should I just add another data file on another Windows drive for growth? Or move the current huge data file to a new GPT drive? Or do both: add another data file and move the existing one to its own new GPT drive?
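A sketch of the "add another data file" option, with hypothetical database, file, and path names; new allocations are then spread across the files in the filegroup by proportional fill, so nothing has to be moved immediately:

```sql
ALTER DATABASE MyBigDb
ADD FILE (
    NAME = N'MyBigDb_data2',
    FILENAME = N'F:\Data\MyBigDb_data2.ndf',
    SIZE = 100GB,
    FILEGROWTH = 10GB
) TO FILEGROUP [PRIMARY];
GO
```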
I have the following question, and I would like to hear what you think about it, and whether there is a better solution for "my problem". Here is the question: I have a huge table with 60 GB of data (image files). The problem happens whenever I try to ALTER the structure of the table. For example, I change a field from char(3) to char(4); SQL Server then performs the "alter table" command, which must be something similar to "insert into a new table + drop the current table", and for that I need about 60 GB of space for my LOG file, and the operation takes hours to complete. Is this the only way to alter a single field in my table? I would like to hear your opinions... Thanks, Alberto
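One workaround people use instead of a single ALTER COLUMN on a table this size is to add the wider column, copy the values over in small batches (so each transaction, and therefore the log, stays small, particularly under SIMPLE or BULK_LOGGED recovery), then drop and rename. A sketch assuming SQL 2005+ and hypothetical names (dbo.Images with a NOT NULL column Code3):

```sql
ALTER TABLE dbo.Images ADD Code4 char(4) NULL;
GO
-- Copy values in batches of 10,000 rows until nothing is left to copy.
DECLARE @rows int;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) dbo.Images
       SET Code4 = Code3
     WHERE Code4 IS NULL;
    SET @rows = @@ROWCOUNT;
END
GO
ALTER TABLE dbo.Images DROP COLUMN Code3;
GO
EXEC sp_rename 'dbo.Images.Code4', 'Code3', 'COLUMN';
GO
```

Any NOT NULL constraint, default, or index on the original column would need to be re-created on the new one afterwards.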
I have a database that's in "Simple" recovery mode whose .ldf has grown to 270 GB. This database is a data warehouse, so "full" is not required. I put it in simple mode a month ago and shrank the log down, and now it has filled up the disk again.
What steps can I take to mitigate this in the future? I've read that this is caused by long-running transactions, which fill the log for DR purposes. Should I put the database back into full mode and back up/truncate the log daily?
The auto-growth is set to 128 MB, which is very low.
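A sketch of the usual triage for this, with hypothetical names (MyDw, MyDw_log). In SIMPLE recovery the log truncates at checkpoints, so a 270 GB .ldf usually means a long-running transaction or something else was pinning it when it grew:

```sql
DBCC SQLPERF (LOGSPACE);            -- how full each database's log really is
SELECT name, log_reuse_wait_desc
FROM sys.databases;                 -- why the log cannot currently be reused
DBCC OPENTRAN;                      -- oldest active transaction in the current database
GO
USE MyDw;
GO
DBCC SHRINKFILE (MyDw_log, 10240);  -- shrink back to ~10 GB once nothing is pinning it
GO
```

A larger fixed auto-growth increment (for example 1-4 GB instead of 128 MB) also reduces the number of growth events while the log is legitimately busy.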
I am copying data from a database to an Excel file through SSIS. The database is MS SQL 2005 and BIDS is also 2005. However, the job doing this task fails every time. As per our investigation, the result of the query is more than 100,000 rows, and we know that Excel has a limit of about 65,000 rows of data. Is there a way of raising that limit, or perhaps a better approach, so that everything is copied to the Excel file successfully?
The PrimeOutput method on component "Source - Query" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error Error: 2015-10-22 04:34:58.05 Code: 0xC0047021
Source: Data Flow Task Description: SSIS Error Code DTS_E_THREADFAILED.
Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 4:30:00 AM Finished: 4:35:05 AM Elapsed: 304.844 seconds. The package execution failed. The step failed. "
I have a 400MB Excel file that I consume from another automated process (don't ask). I copy this file down locally to my server, and I am attempting to create an SSIS package that points to this file via a connection manager. My computer starts gobbling up massive amounts of memory (devenv.exe gets up to about 800MB or so, then drops back down to 100MB) even when I attempt to rename the connection in the connection managers tab.
I have set all BypassPrepare to TRUE and ValidateExternalMetadata properties to FALSE, and still it can take up to 3 to 6 minutes for BI Dev Studio to respond. My specs:
Intel Centrino Duo 2.00 GHz, 2 GB RAM, XP Pro SP2
There MUST be a way for me to work effectively on a file of this size. Please help! Thanks much for any assistance.
Just like in DTS, where we can add an error file so that if the DTS package fails we can see what caused it to fail, do we have anything like an error logging file in SSIS? I greatly appreciate your help on this. Thanks!!
I've created a SQL Server job that executes my IS package on a regular schedule. Is there a way to redirect IS logging (the Progress tab output when the package is executed from VS) to a flat file in the same path as my package?
When my job fails I want to know which particular step failed, and the Progress tab also contains useful info like the duration of each step.
Can someone tell me if SQL server logs individual inserts when copying from Flat File to OLE DB Destination? If so, can this be turned off?
ULTIMATELY, I would like to use a Bulk Insert task, but it looks like it is pretty specific to a single DB and a single table in the DB. I would like the Bulk Insert task to be generic enough to where I can specify (via Parameters) the DB and table for the Bulk Insert because this task will be performed in an iteration of many different flat files.
Sorry I never seem to have a "simple" question. My DBA is breathing down my neck here..
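A sketch of one way to make the bulk load generic: wrap BULK INSERT in a procedure and let the database, table, and file path arrive as SSIS variables mapped to the parameters of an Execute SQL Task. All names here are hypothetical, and the schema is assumed to be dbo:

```sql
CREATE PROCEDURE dbo.usp_BulkLoad
    @DatabaseName sysname,
    @TableName    sysname,
    @FilePath     nvarchar(260)
AS
BEGIN
    DECLARE @sql nvarchar(max);
    SET @sql =
        N'BULK INSERT ' + QUOTENAME(@DatabaseName) + N'.dbo.' + QUOTENAME(@TableName) +
        N' FROM ''' + REPLACE(@FilePath, '''', '''''') + N'''' +
        N' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', TABLOCK);';
    EXEC sys.sp_executesql @sql;
END
```

Whether the inserts are minimally logged depends on the target database's recovery model and on hints such as TABLOCK, not on the destination component itself.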
After logging is configured in an SSIS package, it seems that after each execution the output is appended to the log file (we are talking about the log provider for text files in this case). As a result the file just keeps on growing. I would like to overwrite the old information with each run, but I can't find where to configure this. Does anybody know?
I have a big stored procedure which is going to alter many tables, insert data, and make a lot of changes in general.
So, I want to have a text file (or any log file) which will show all the changes the stored procedure has made (they don't want Profiler output).
Does anybody know how to log the results of a stored procedure's execution to a text file?
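A minimal sketch of one approach: write progress rows to a log table from inside the procedure, then export that table to a text file afterwards (for example with bcp queryout or sqlcmd -o). All names are hypothetical:

```sql
CREATE TABLE dbo.ProcRunLog (
    LogId    int IDENTITY(1,1) PRIMARY KEY,
    LoggedAt datetime       NOT NULL DEFAULT (GETDATE()),
    Message  nvarchar(4000) NOT NULL
);
GO
-- Inside the big stored procedure, after each step:
DECLARE @rows int;

UPDATE dbo.Customers SET IsActive = 1 WHERE IsActive IS NULL;  -- example step
SET @rows = @@ROWCOUNT;

INSERT dbo.ProcRunLog (Message)
VALUES (N'Updated dbo.Customers.IsActive, rows affected: ' + CAST(@rows AS nvarchar(10)));
```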
I am looking to promote a SSIS package to various servers. The name and location of the text Log File setup in BIDS for the package is expected to change. The SSIS package will be executed through a 3rd party scheduler as a batch command file.
I am trying to use this command to change the location: dtexec /F DASD_Database_List_export.dtsx /L "DTS.LogProviderTextFile.1;E:Log.txt"
For which I am getting this error:
Description: The connection manager "E:log.txt" is not found. A component failed to find the connection manager in the Connections collection.
I tried Set command approach from http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=72104
It did not work.
Has anyone been successful in using the dtexec log parameter DTS.LogProviderTextFile.1? Could you post how you got it to work? Thanks
I am trying to use a conditional split task so that I can check for specific fields. If the value doesn't exist I am piping the records to a derived field task, where I add an error. I then try to send these records to a flat file destination so that I can keep track of them. However, when I execute the SSIS data flow task I get the following error
[Log Invalid Records [5496]] Warning: The process cannot access the file because it is being used by another process.
This file isn't being used by any other process as far as I can tell, and the only process using it is the SSIS task trying to write to it.
If anyone has any ideas, then I would really really appreciate it
I am developing a package on my local workstation. I have defined two logging service providers. One is for SQL Server and the other is for the Windows Event Log. I am using the Dts.Log method in a script task to write log entries.
Logging is working properly with the SQL Server provider and rows are being inserted into the sysdtslog90 table. However, the only events that are being logged in the Windows Event Log are the package start and end events which I believe SSIS is doing automatically anyway.
Is there something I need to do to enable Windows Event Log logging other than defining a log provider and making sure it is checked active? Won't SSIS write to two different logs with one Dts.Log call? Any ideas on what might be going wrong with my approach?
We run Standard 2008 R2. When I deploy and run a package from the catalog, how can I get that flat file system log we always instructed SSIS to write to when we ran from the command line? I believe it was the /L param. Not sure at this point if I'll use SQL Agent or somehow employ Task Scheduler to kick off the package.
I set up merge replication across the servers, but each morning I find that replication synchronization has stopped with the following error:
The merge process was unable to access row metadata at the 'Subscriber'. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to write to, or use SQL Profiler to determine the source of the failure. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147200996)
To work around this, I manually restart the sync agent by stopping and then starting the agent.
Please tell me how I can fix this message, or tell me how I can start/stop the merge agent from the command line, so that I can make a script that starts and stops the agent as a job and configure it to run every morning.
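On the start/stop part: each merge agent normally runs as a SQL Server Agent job, so it can be controlled from T-SQL (and therefore from a scheduled script). A sketch with a hypothetical job name (copy the exact name from msdb.dbo.sysjobs or the Jobs folder):

```sql
USE msdb;
GO
-- Stop the merge agent job (this only succeeds if it is currently running)...
EXEC dbo.sp_stop_job  @job_name = N'PUBSRV-MyPublication-SUBSRV-MyDb-Merge';
GO
WAITFOR DELAY '00:00:10';
GO
-- ...and start it again.
EXEC dbo.sp_start_job @job_name = N'PUBSRV-MyPublication-SUBSRV-MyDb-Merge';
GO
```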
Hi, I decided to use the SQL Server log provider to store the logging data of all my Integration Services packages. I also created some reports on this data for operational purposes. I have a problem: the name of the executing package is not always written to the log, only the name of the single task which failed. That is not very useful information for operations, because I do not see any way to get the name of the package from the information logged in the sysdtslog90 table in the database which I defined for SSIS logging.
How do I configure the package to always log the package information into the table, too?
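One workaround that is often enough: every row a package run writes to sysdtslog90 shares the same executionid, and the PackageStart event carries the package name in its source column, so a self-join can attach the package name to every row. A sketch, assuming the default dbo.sysdtslog90 table:

```sql
SELECT p.source AS package_name,
       l.source AS task_name,
       l.event,
       l.starttime,
       l.message
FROM dbo.sysdtslog90 AS l
JOIN dbo.sysdtslog90 AS p
  ON  p.executionid = l.executionid
  AND p.event = 'PackageStart'
ORDER BY l.starttime;
```

PackageStart entries are written automatically whenever logging is enabled for the package, so each run should have a matching row for the join to pick up.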