ForEach File Enumerator Does Not Execute Remaining Files If One File Fails

Oct 10, 2007



Hi All,


In my requirement I need to read all the files from a folder (one by one) and insert data from those files into a database table. I am facing one issue here: if any of the files fails while executing, the remaining files are not executed and the package stops execution.

I cannot use the Redirect Rows option because, as per the requirement, if a file has some data problem I am supposed to ignore the file instead of individual data rows.

Is there any property in the ForEach File task for this? Kindly suggest.

Regards
Shagun

View 1 Replies



Flat File Source - If An Error Occurs, Continue Parsing The Remaining Columns In The Row Before Failing

Jan 14, 2008

Hello everyone,


I have a package that extracts data from a Flat File. If any errors or truncation occur during the extraction of the input data, the package should fail. All fields that have erroneous values should be reported in the log file.


My Solution:
- I have created a Data Flow Task that contains a Flat File Source Adapter and a dummy destination.

- I have left the default "Error Output" configuration of the Flat File Source adapter, namely if a truncation or an error occurs for a certain column, the reaction is "Fail Component".


Problem:
This configuration gives me only the first erroneous column in the row being processed.


Question:
Is it possible to make the Flat File Source adapter continue parsing the current row before it fails? This way, I would be able to get all the erroneous columns in the row in one shot.


Thanks in advance...
Samar

View 6 Replies View Related

Integration Services :: Move Multiple Files Based On File Names Listed In A Spreadsheet / File?

May 27, 2015

I need to move specific files from a server to another server on a monthly basis.  There are hundreds of files in the source directory and I need to move approximately 40 of those to the destination server.  I would like to easily add or delete files from the list as needed.  I have seen approaches where several variables were created, one for each file name (and one for the path), and the ForEach Loop would go through them.  With 40 or more, I was thinking that I could make a connection to an Excel spreadsheet or text file with a record for each file name, read it in, move to the next record, and make that value become the content of a "FileName" variable.  Then if I wanted to add another file name I could just add or remove a record in the spreadsheet/text file and the package would handle it automatically....

View 10 Replies View Related

Split The Existing MDF File Into Multiple Files As A File Group?

Jul 30, 2007

I have a huge MDF file - a 120 GB file (set up as a single MDF initially) -- I did not anticipate that the DB would grow to that size!!

Anyway, I heard that general performance would improve if I split it into "File Groups"..

Is there any way to split the existing MDF file into multiple files as a file group?

Where should I start? Can someone please direct me..
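One approach (a sketch, not a definitive fix - the database, path, table and index names below are all assumptions): you cannot split an existing MDF in place, but you can add a new filegroup with its own file, move large tables onto it by rebuilding their clustered indexes there, and then shrink the original MDF.

-- Add a new filegroup with one file on another drive:
ALTER DATABASE MyDb ADD FILEGROUP FG2;
ALTER DATABASE MyDb ADD FILE
    (NAME = MyDb_FG2_1, FILENAME = 'E:\Data\MyDb_FG2_1.ndf', SIZE = 10240MB)
    TO FILEGROUP FG2;

-- Rebuild a table's clustered index onto the new filegroup; this moves
-- the table's data out of the original MDF:
CREATE UNIQUE CLUSTERED INDEX PK_BigTable
    ON dbo.BigTable (Id)
    WITH DROP_EXISTING
    ON FG2;

-- Reclaim the now-empty space in the original MDF (target size in MB):
DBCC SHRINKFILE (MyDb_Data, 10240);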

View 1 Replies View Related

SQL 2012 :: DBCC Shrinkfile Empty File Not Distributing Data Evenly In Primary File Group With Multiple Files

Apr 29, 2014

Why does SHRINKFILE with EMPTYFILE not redistribute data evenly in the primary file group when it has multiple files?

Please run the script attached to see what the end result is.

This is what I set up last night on my test machine.

1) Create database [FGTest], size 200MB
2) Create a table called TEST on PRIMARY
3) Insert 40MB of data into TEST
4) Create another file called temp in PRIMARY, size 200MB
5) Shrinkfile('FGTest', emptyfile) so that all data is transferred from FGTest into the temp file
6) Add another 2 files called DATA2 and DATA3. Both are 200MB.
7) We now have 3 empty files that I want data distributed evenly on: FGTest, DATA2 & DATA3
8) Shrinkfile('temp', emptyfile) to move all the data from temp over the 3 files evenly

I would expect at this stage to have the following:

FGTest = 13MB,
DATA2 = 13MB,
DATA3 = 13MB

(40MB of data over 3 files should be about 13 MBish in each file)

What I actually end up with is this:

FGTest = 20MB
DATA2 = 10MB
DATA3 = 10MB

It looks as though SQL Server allocates 50% of the data to the original file and then spreads the remaining 50% evenly over the other files in PRIMARY.
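Since the attached script did not come through, here is a minimal reconstruction of the steps described above (file paths are placeholders):

-- A sketch of the repro described above.
CREATE DATABASE FGTest
ON PRIMARY (NAME = FGTest, FILENAME = 'D:\Data\FGTest.mdf', SIZE = 200MB)
LOG ON (NAME = FGTest_log, FILENAME = 'D:\Data\FGTest_log.ldf', SIZE = 50MB);
GO
USE FGTest;
CREATE TABLE dbo.TEST (id INT IDENTITY PRIMARY KEY, filler CHAR(8000));
INSERT INTO dbo.TEST (filler)  -- ~40MB: 5000 rows x 8KB
SELECT TOP (5000) 'x' FROM sys.all_objects a CROSS JOIN sys.all_objects b;
GO
-- Add a second file to PRIMARY and empty the original file into it:
ALTER DATABASE FGTest ADD FILE (NAME = temp, FILENAME = 'D:\Data\temp.ndf', SIZE = 200MB);
DBCC SHRINKFILE ('FGTest', EMPTYFILE);
-- Add two more files, then empty 'temp' back over the three files:
ALTER DATABASE FGTest ADD FILE (NAME = DATA2, FILENAME = 'D:\Data\DATA2.ndf', SIZE = 200MB);
ALTER DATABASE FGTest ADD FILE (NAME = DATA3, FILENAME = 'D:\Data\DATA3.ndf', SIZE = 200MB);
DBCC SHRINKFILE ('temp', EMPTYFILE);
-- Check the resulting distribution:
SELECT name, size/128 AS size_mb, FILEPROPERTY(name, 'SpaceUsed')/128 AS used_mb
FROM sys.database_files;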

View 3 Replies View Related

For Loop - Iterate From Older Files To Newer Files Based On File's Timestamp

Mar 13, 2008

In the ForEach Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are older files in the folder, they should be processed first, and then it should continue with the newer ones.

Any Suggestions?

View 3 Replies View Related

Bcp Fails For 39gb File

Apr 7, 2008

I have a rather large text file that I am importing with BCP. It is known to have over 600 million rows. However, when I import it I get this:

SQLState = HY000, NativeError = 0
Error = [Microsoft][SQL Native Client]Unexpected EOF encountered in BCP data-file
214698811 rows copied.

so only about a third of the rows make it in. I am fairly certain that the file does not end after row 214698811. My certainty is based on the file size - other files with similar size and exactly the same schema managed to import fully and they have over 600m rows.

My question is, anyone have any ideas how I might be able to diagnose the problem with this file? Maybe a super-fantastic text editor I could view a 39gb text file with, and jump straight to row 214698811 to see if there is any weirdness there?



elsasoft.org
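One hedged suggestion rather than a text editor: use the FIRSTROW/LASTROW options of BULK INSERT (or bcp's -F/-L switches) to pull a window of rows just before the failure point into a one-column staging table and inspect the raw text there. A sketch, with the path and terminators as assumptions:

-- Load only the rows leading up to the failure point into a wide
-- single-column staging table so the raw lines can be inspected.
CREATE TABLE dbo.probe (line VARCHAR(8000));

BULK INSERT dbo.probe
FROM 'D:\data\bigfile.txt'   -- hypothetical path
WITH (
    FIELDTERMINATOR = '\0',  -- null terminator: keep each whole line in one column
    ROWTERMINATOR = '\n',
    FIRSTROW = 214698800,
    LASTROW = 214698811
);

SELECT * FROM dbo.probe;

The scan still has to read the whole file up to that point, so it takes a while, but it avoids opening 39GB in an editor.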

View 8 Replies View Related

DTSX Package Fails On .CSV File

Dec 13, 2007

I am having trouble with a dtsx package that truncates a table, then inserts the contents of a .csv file.
The package is being executed off the local filesystem, reading a csv on the same filesystem, and inserting into a remote SQL 2k5 server. If I run the package alone in BI it runs perfectly; if I run the package from a console app in Visual Studio, it truncates the table but does not insert any of the data from the csv file. When running from DTExec I receive the following error on the CSV portion after the table is truncated:



Code: 0xC00470FE
Source: Data Flow Task DTS.Pipeline
Description: SSIS Error Code DTS_E_PRODUCTLEVELTOLOW. The product level is insufficient for component "Source - My_File_CSV" (1).

I have tried all the workarounds I can find without any luck. All help will be appreciated.

View 4 Replies View Related

Job Fails With The System Cannot Find The File Specified

Oct 31, 2007

Hello,
I have a package that copies data FROM an MS Access database table to a SQL Server 2005 table. 'Run64BitRunTime' has been set to 'False'. The package has been saved to SQL Server. I have a Job that runs the package using an operating system command. The following is the command syntax:

C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\dtexec.exe /SQL "\Rebates\Rebates_TotalSecurity" /SERVER bwdbfin1 /MAXCONCURRENT " -1 " /CHECKPOINTING OFF /REPORTING E

I created the package on a machine other than bwdbfin1. I can run the package from Visual Studio. I can run the package from Integration Services. I have sysAdmin rights on bwdbfin1. I've tried running the job using two different proxy accounts and the sql agent account. I have the location of the Access database. No matter what I do, the Job fails with the following error:

The process could not be created for step 1 of job 0xD947EF76ACD96340B12279FEDDC580CE (reason: The system cannot find the file specified). The step failed.

I have an identical package that copies data TO an Access database. The database addressed in that package and this package are in the same location. The 'CreatorName' of both packages is the same. I have logging enabled for every category, but nothing is written to the sysdtslog90 table when the Job runs. I set up error output in the DataFlow task, and have also tried to 'ignore' errors. I have searched the forum, done a web search, and I can't find a reason for the failure.

Is dtexec the file that is not found? If that were the case, then why can a Job run my other package?

Any ideas?

Thank you for your help!

cdun2

View 4 Replies View Related

Package Configuration Using Xml-file Fails

Jun 29, 2006

Hallo,

I use SSIS Version 9.00.1399.00 and keep getting problems trying to use package configuration.



I choose XML configuration file as the type and browse a path on our LAN to create the configuration file.

Then I select the properties of an OLE DB connection manager as Properties to Export.

Doing so I obtain this configuration:

Name: JACBE_IF_CONFIG

Type: Configuration File

New configuration file will be created.

File name:
L:\Projects\Vinci\SSIS\DVL\FMC loader Import\FMC Loader Import\FMC Loader Import\JACBE_IF_CONFIG.xml

Properties:
Package.Connections[JACBE_IF].Properties[UserName]
Package.Connections[JACBE_IF].Properties[ServerName]
Package.Connections[JACBE_IF].Properties[RetainSameConnection]
Package.Connections[JACBE_IF].Properties[ProtectionLevel]
Package.Connections[JACBE_IF].Properties[Password]
Package.Connections[JACBE_IF].Properties[Name]
Package.Connections[JACBE_IF].Properties[InitialCatalog]
Package.Connections[JACBE_IF].Properties[Description]
Package.Connections[JACBE_IF].Properties[ConnectionString]

The system creates a XML file but when I run the package I get the following error in the output pane.
Information: 0x40016041 at FMC_People: The package is attempting to configure from the XML file "L:\Projects\Vinci\SSIS\DVL\FMC loader Import\FMC Loader Import\FMC Loader Import\JACBE_IF_CONFIG.xml".
SSIS package "FMC_People.dtsx" starting.
Information: 0x4004300A at Dataprocessing_PEOPLE, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at FMC_People, Connection manager "JACBE_IF": An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at Dataprocessing_PEOPLE, FMC_ARE_PRESENT_destination 1 [22338]: The AcquireConnection method call to the connection manager "JACBE_IF" failed with error code 0xC0202009.
Error: 0xC0047017 at Dataprocessing_PEOPLE, DTS.Pipeline: component "FMC_ARE_PRESENT_destination 1" (22338) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at Dataprocessing_PEOPLE, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Dataprocessing_PEOPLE: There were errors during task validation.
SSIS package "FMC_People.dtsx" finished: Failure.

I don't get it. Where do I go wrong?

I tried the same with a DtsConfig file instead of an XML but to no avail. The way of working described in BOL and in the book Professional SQL Server 2005 Integration Services seems to me exactly the same.

Any ideas anyone? I would be most grateful.

Kind regards,



Paul Baudouin

View 10 Replies View Related

Checking For Existence Of File Fails During Validation

Aug 29, 2007

OK. Here's my situation. I check for the existence of a dummy .txt file using a script. I send an e-mail if it does not exist and exit the package. The .txt file only exists if another .xls file, which I import, is present. However, during the validation phase of the package, the package fails because the .xls file does not exist. Is there a way to bypass the validation step? The only solution I came up with is to have a two-step job. The first runs the file-check step and sends the e-mail. The second attempts to run the package and fails. Not a very graceful exit.

View 3 Replies View Related

Openrowset(bulk '\\server1\c$\file.txt') Fails!

Aug 3, 2007

select * from openrowset(bulk '\\server1\c$\file.txt', SINGLE_BLOB) as t
works from the SQL Server machine itself, but doesn't work from any other machine. I get
"Operating system error code 5(Access is denied.)."
I am running as the domain admin, and file.txt has Full Control for Everyone. In server1's event log, I see an Anonymous Logon.


Please help!

Thanks,

Bo

View 3 Replies View Related

MS Access Destination Fails Package When File 'in Use'

Nov 30, 2007

Hello,
For packages where an MS Access database is the destination, what are some ways to detect whether or not the file is 'in use'? I know there is a lock file associated with Access databases. Could it be as simple as using something from the FileSystem Object to discover whether or not there is a lock file with the same name as the destination database? I'd like to stop the package if the file is in use.

Any ideas?

Thank you for your help!

cdun2

View 10 Replies View Related

Differential Backup File Fails To Restore.

Jul 19, 2007

We are doing the following steps:

1. Weekly full backup

2. Daily differential backup

3. Hourly log backup

The following commands work fine for all the backups, which means the backup file is fine:

RESTORE FILELISTONLY FROM DISK = 'D:\Backup\new\DB_full.BAK'

RESTORE HEADERONLY FROM DISK = 'D:\Backup\new\DB_full.BAK'

RESTORE LABELONLY FROM DISK = 'D:\Backup\new\Test_full.BAK'

RESTORE VERIFYONLY FROM DISK = 'D:\Backup\new\Test_full.BAK'



Also, the full backup restoration works fine:

RESTORE DATABASE Test3 FROM DISK = 'D:\Backup\new\Test_full.BAK'

WITH MOVE 'Test2_Data' TO 'F:\MSSQL2K\MSSQL\data\Test2Net_Data.MDF',

MOVE 'Test2_Log' TO 'F:\MSSQL2K\MSSQL\data\Test2Net_Log.LDF',

NORECOVERY

GO

/* RESULT :

Processed 1032 pages for database 'Test3', file 'test2_Data' on file 1.

Processed 1 pages for database 'Test3', file 'test2_Log' on file 1.

RESTORE DATABASE successfully processed 1033 pages in 1.907 seconds (4.433 MB/sec).

*/

But while restoring the differential file it throws an error:

RESTORE DATABASE Test3 FROM DISK = 'D:\Backup\new\test2_Diff1.bak'

WITH NORECOVERY

GO



Msg 3136, Level 16, State 0, Line 1

Cannot apply the backup on device 'D:\Backup\new\test2_Diff1.bak' to database 'Test3'.

Msg 3013, Level 16, State 1, Line 1

RESTORE DATABASE is terminating abnormally.

I checked the SQL Server log; there is no error message corresponding to the differential backup restore.



I don't know how to proceed further. Can anyone guide me on how to overcome this?



Thanks !
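One thing worth checking (an assumption, not a confirmed diagnosis): Msg 3136 commonly means the differential is based on a different full backup than the one just restored - for example, another full backup ran in between. RESTORE HEADERONLY can confirm this; the differential's DatabaseBackupLSN must match the CheckpointLSN of the full backup it is based on:

-- Note CheckpointLSN from the full backup's header:
RESTORE HEADERONLY FROM DISK = 'D:\Backup\new\Test_full.BAK'

-- Note DatabaseBackupLSN from the differential's header;
-- the two values must be equal for the diff to apply:
RESTORE HEADERONLY FROM DISK = 'D:\Backup\new\test2_Diff1.bak'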

View 2 Replies View Related

DTS To Write To Text File (fails When Scheduled) (Urgent!!!)

Aug 21, 2001

Recently converted from 7 to sql 2k.

Running NT 4.0 SP6a and SQL 2k. I have a DTS job that works just fine if I go to 'design view' and then 'execute' it. But if I schedule it, the following error appears.. My question is... why? I also modified the path from the current one below (\\denapp02\cmc$\cd01.txt) to the local path of the server, and still came up with the same problem. ?? TIA! ::

DTSRun: Loading... DTSRun: Executing... DTSRun OnStart: Copy Data from Results to \\denapp02\cmc$\cd01.txt Step DTSRun OnError: Copy Data from Results to \\denapp02\cmc$\cd01.txt Step, Error = -2147467259 (80004005) Error string: Error opening datafile: Access is denied. Error source: Microsoft Data Transformation Services Flat File Rowset Provider Help file: DTSFFile.hlp Help context: 0 Error Detail Records: Error: 5 (5); Provider Error: 5 (5) Error string: Error opening datafile: Access is denied.

View 2 Replies View Related

Writing Into The FLAT FILE When Derived Column Fails

Aug 30, 2007

A flat file is the source used to load data into a table. I am using the "Derived Column" component for data validation.

"Derived Column Component" Fails then i am writing/redirecting the records into the Flat File using "Flat File Destination" component.

It works fine except for the following issue.

Issue:
The derived column value (that causes the error) does not get inserted into the flat file.

Scenario:
The data comes in as "000000" and I am trying to convert it to date format:
(DT_DATE)("20" + RIGHT(Check_Date,2) + "/" + SUBSTRING(Check_Date,1,LEN(Check_Date) - 4) + "/" + SUBSTRING(Check_Date,LEN(Check_Date) - 3,2))

The above expression is working fine, except that the data 000000 is not passed to the Flat File Destination.

Pls advise. Thank you.

View 1 Replies View Related

Visual Studio 2013 Fails To Connect To MDF File?

Oct 20, 2015

I'm attempting to connect to a database file through Visual Studio 2013 Ultimate.  The .mdf file is located on my local drive inside the App_Data folder of the project.  However when I try to connect to the file it fails and throws an error message (see below).

“The attempt to attach to the database failed with the following information: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 50 - Local Database Runtime error occurred. The specified LocalDB instance does not exist.)”

View 8 Replies View Related

Using UDL Files As File Name

May 9, 2006

I am using a UDL file to connect to an ORACLE database. In the UDL GUI the connection test is OK. However, in the Connection Manager, when I set the File Name property to the UDL file name and I test the connection, I get the message 'The connection failed because of an error in initializing provider. The ConnectionString property has not been initialized.'

View 2 Replies View Related

Upload .docx File (Word 2007) Fails With SQL Error

May 9, 2008

When uploading a Microsoft Word 2007 file (*.docx) I get the following error:

Message: String or binary data would be truncated.
Line Number: 1
Source: .Net SqlClient Data Provider
Procedure:
Message: The statement has been terminated.
Line Number: 1
Source: .Net SqlClient Data Provider
Procedure:

Using FileUpload, MS SQL Server, VB.NET
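That error usually just means a column in the target table is too small for the value being inserted - 2007-format files tend to be larger, and their names longer, than what the column was sized for. A hedged sketch (the table and column names below are hypothetical):

-- Widen the content and name columns so larger Office 2007 files
-- and their file names fit:
ALTER TABLE dbo.UploadedFiles ALTER COLUMN FileData VARBINARY(MAX)
ALTER TABLE dbo.UploadedFiles ALTER COLUMN FileName NVARCHAR(260)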

View 9 Replies View Related

Flat File, Fixed Width Import With Nulls Always Fails

Dec 12, 2006

More SSIS woes. DTS was so much easier.

I have a flat file. It's fixed-width with CRLF record delimiters (a.k.a. Ragged Right format).

Some fields are null, and represented by the text NULL.

I'm trying to import the file into SQL via an OLE DB connection. The target table is a SQL 2000 data table. Two of the fields in the target database are of type smallint.

When I run PREVIEW on the data source (Flat File), everything looks good & correct. I added the convert columns task to convert my strings to smallint. This is where things go haywire.

After linking everything up, the conversion gives me a "Cannot convert because of a possible loss of data." All of my numbers are < 50, so I know this isn't the case. Another bogus SSIS error.

My first instinct is that SSIS doesn't understand that NULL means null. I edited the file and replaced all instances of NULL with 4 empty string chars. Still no good. It seems to have a hard time parsing the file now.

I dropped the convert task and tried editing the data source, and set the two smallint fields to smallint instead of string (SSIS formats). I get the same conversion error.

Changing the NULL values to 0 fixed the problem, but they're not 0. They're null.

Short of creating another script that converts all zeros to NULL using the aforementioned hack, I'm out of ideas.



Am I missing something, or is SSIS just incapable of handling nulls in fixed-width flat file formats?

TIA

View 7 Replies View Related

Bulk Insert Fails. Column Is Too Long In The Data File

Jun 27, 2006

Hi,

for testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT with the following code:

BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
  WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
  )

I have two versions of "rsk.txt" one with 1.9mill rows and one with the first 2000 rows only. The files have one column only with 115 characters that I'll split in to several columns later using SUBSTRING. The one with 2000 rows fires in to the database with no problems whatsoever using this exact code, the other one throws the following error:

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

How can I resolve this problem?

EDIT: I tried several different row and field terminators, but this exact one works for the small data file so I assume it should also work for the large one... the large one is however copied directly using binary ftp from a unix filesystem, and the small one was manually copied into a new txt file using UltraEdit.
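One assumption worth testing, given that the large file came straight off a unix filesystem: unix files end lines with a bare LF, while specifying ROWTERMINATOR = '\n' makes SQL Server look for CRLF. Building the statement dynamically lets you pass a literal CHAR(10) as the terminator; a sketch:

-- Use a bare line feed (CHAR(10)) as the row terminator instead of
-- the CRLF that '\n' implies:
DECLARE @sql NVARCHAR(4000)
SET @sql = N'BULK INSERT rsk_staging
FROM ''c:\temp\bulk\rsk.txt''
WITH (
    ROWTERMINATOR = ''' + CHAR(10) + N''',
    CODEPAGE = ''RAW'',
    DATAFILETYPE = ''char'',
    TABLOCK
)'
EXEC (@sql)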

View 1 Replies View Related

Cannot Shrink Log File 2 Because Files Are In Use.

May 24, 2004

After backing up the log with TRUNCATE_ONLY,
DBCC SHRINKDATABASE('ABC', 10)
failed with the following error:

Cannot shrink log file 2 (ABC_Log) because all logical log files are in use.

I have put the above statements in a job that runs every night, and just got the same error on SHRINKDATABASE.

Is there a way to shrink the log without stopping and restarting SQL Server?

-D
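A sketch of what usually works without a restart (hedged, since it depends on where the active portion of the log happens to sit): check the active virtual log files, truncate, and shrink the log file itself rather than the whole database:

-- Rows with Status = 2 are active virtual log files:
DBCC LOGINFO ('ABC')

-- Truncate, then shrink just the log file:
BACKUP LOG ABC WITH TRUNCATE_ONLY
DBCC SHRINKFILE ('ABC_Log', 10)

-- If the active VLF sits at the end of the file, generate a little
-- log activity and run the two steps above again.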

View 4 Replies View Related

For Each File - Limiting Files

Nov 21, 2006

Thanks everyone

I have another question. If I use the ForEach Loop Container (ForEach File Enumerator), it will select all the files in that folder. What if I want to select just 100 files (assuming there are 500 files in the folder)?

How do I do this??



Thanks

View 1 Replies View Related

Files And File Extensions

Mar 6, 2007

Hello.

I am attempting to document the various files that are incorporated into a Reporting Services project and need a more official explanation or definition of each particular file and its purpose. I understand what most of the files are and what they serve, but would prefer to document it using Microsoft's official explanation so that we can decide which files we may need to source control.

I have tried searching MSDN for 'file extensions' and 'file types', and typing in the individual .xxx extensions, to see if there is a documented definition for those files, but seemed to get results that do not give me an official definition, didn't come close, or were entirely unrelated.

.sln, .suo, .rds, .rdl, .rdl.data, .rptproj, .rptproj.vspscc, etc

Any links to the official explanation or definition of the files that make up a Reporting Services report project and their function/importance to the project would be appreciated.

Thanks.

View 1 Replies View Related

Source Files In A Zip File

Aug 26, 2007

I am designing an SSIS package where my source is flat files from a zip file. I am not sure how to work with flat files that are inside a zip file...

View 5 Replies View Related

Upload Files And Downloading A Pdf File

Mar 5, 2008

What data type should I use for my uploadedFiles column in my database... uploaded files are in document format or .txt.
 Also, how can I have those files converted into pdf files, and enable users to download them?
tnx!!!
forums.asp.net = "great help"

View 7 Replies View Related

Generate File Task With 2 Files

Sep 18, 2007

Hi Guys,

I've one Data Flow task where I'm getting data from an OLE DB source, doing some scripting using a Script Component, and then generating a file.
Now I need to get the same data, apply some different things, and generate another file.
Can I use this same task for doing the secondary work? If yes, how would I put that in place? I would need to get the same data, but use separate scripting and generate a separate file.

TA
Gemma

View 1 Replies View Related

DB Design :: Merge MDF And LDF Files Into One File

Jul 14, 2015

In my environment I have one database with 6 NDF files, 5 LDF files and one MDF file. What I am actually looking to do is merge the 6 NDF files into one NDF file and the 5 LDF files into one LDF file. Is it possible to do this? I tried using the MOVE ... TO option while the backup was restoring, but am getting the below error messages.

ERROR:
Msg 3176, Level 16, State 1, Line 4
File 'J:\NDF\abc.ndf' is claimed by 'Finance_data2'(4) and 'Finance_data1'(3). The WITH MOVE clause can be used to relocate one or more files.
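MOVE during a restore can only relocate files, not merge them. The usual approach (a sketch; the database name is inferred from the error message and the log file name is hypothetical) is to empty each extra file into the remaining ones and then remove it:

-- Repeat for each extra data file (Finance_data2, Finance_data3, ...);
-- EMPTYFILE can only move data between files in the same filegroup:
DBCC SHRINKFILE ('Finance_data2', EMPTYFILE)
ALTER DATABASE Finance REMOVE FILE Finance_data2

-- Repeat for each extra log file (a log file must be empty to be removed):
DBCC SHRINKFILE ('Finance_log2', EMPTYFILE)
ALTER DATABASE Finance REMOVE FILE Finance_log2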

View 4 Replies View Related

Combine Multiple RDL Files Into One RDL File

Aug 31, 2007



Hello,
I need to generate a report which should display 4 reports: two tables and some charts. I have all of these reports (I mean the .RDL files) individually, and I can render the reports separately. But now the need is to combine these reports into one RDL file. Is this possible? If yes, how?

Also, I tried to create a stored procedure which would call all these 4 SPs in turn and provide 4 result sets. I thought of having an RDL calling only this SP, which would give 4 result sets. But unfortunately, it gave only the first SP's result set. So, I have to combine the 4 RDL files into one to show on the Reporting Console. Can anyone please help me with this? Help would be greatly appreciated.

Thanks a lot. Let me know if the question is not clear.

Mannu.

View 5 Replies View Related

Reporting Services :: Parsing SSRS Config File And Dynamically Changing File Path Of Config File In Code

Sep 2, 2015

Currently I have a single hard-coded file path to the SSRS config file, which parses the file and provides the Reporting Services web service URL.  My question is: how would I run this same query against hundreds of servers that may or may not share the same file path as the one hard-coded?

Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.

I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically I need to inventory all of my Reporting Services URLs.

View 2 Replies View Related

SSIS Data Flow Task Fails To Load All Text From Flat File

Jan 2, 2007

Hi Guys,

I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.

The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc. present in the middle.

Previously I used SQL 2000 DTS to load the files in, and it was just a Column Transformation with Col001 from the text file loading straight to my table column. After the load, if I select len(col) it gives me 750 for all rows.

Once I started to migrate this to SSIS, I allocated the Control Flow Task and specified the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the data flow task it copies only 181 or 231 characters out of the 750 required. I feel it stops where it finds the SPACES and skips the rest.

I specified row delimiters of CR and LF. I checked the file under UltraEdit and there were no special characters in the file that would cause the problem.

Any suggestions how I can get it to load the full data?

Thanks

View 26 Replies View Related

Can I Split A Long .sql File Into Multiple Files?

May 9, 2008

I have one really long .sql file I'm working on.  It's actually a data conversion type script.  It's gotten really cumbersome to work on as long as it is.  I would like to split up the various logical parts of the script into their own .sql files. How can I have one file (.bat, .sql or whatever) call each .sql file in the order I specify? Hoping this is easy. Thanks
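One simple way (a sketch; the file names are hypothetical) is a master script run in SQLCMD mode, where :r pulls in each part in the order listed. The same files can also be run from a .bat with one sqlcmd -i line each.

-- master.sql - run with: sqlcmd -S MyServer -d MyDb -E -b -i master.sql
-- (or enable "SQLCMD Mode" from the Query menu in Management Studio)
:r 01_schema.sql
:r 02_transform.sql
:r 03_load.sql
:r 04_cleanup.sql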

View 3 Replies View Related

Cannot Shrink Log File 2 (ABC_Log) Because All Logical Log Files Are In Use....

Nov 6, 2003

A small database ABC has only 5 MB of data, but its log is growing by around 20 MB every day. I want to shrink its size on a daily basis, like for the other databases:

1. backup log ABC with truncate_only
2. DBCC SHRINKDATABASE (ABC, 10)
but got the following error:
<<Cannot shrink log file 2 (ABC_Log) because all logical log files are in use.>>

I also tried WITH NO_LOG, but get the same error from DBCC SHRINKDATABASE..
any idea?

thanks
-D

View 8 Replies View Related






