DTS Large Flat Files
Dec 5, 2000
I have some large flat files that I need to import into my SQL Server database. The file sizes range from 16 MB to 116 MB. I've tried saving the files to an Excel spreadsheet and then importing them in that format, but that didn't work. Does anyone have any suggestions?
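A minimal sketch of one alternative, assuming a tab-delimited file and a pre-created target table (both names are hypothetical): BULK INSERT loads flat files of this size directly, with no Excel intermediate (Excel 97-2003 worksheets cap out at 65,536 rows, which may be why that route failed).
-- Hypothetical table and path; adjust the terminators to match the file layout.
BULK INSERT dbo.ImportTarget
FROM 'C:\Data\bigfile.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 50000,   -- commit in chunks so the transaction log stays manageable
    TABLOCK
)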
View 1 Replies
Dec 7, 2000
Here's my dilemma: I have a file that's 308 bytes wide by 5.7 million records. The record length is fixed, and the position and width of each field within the record are known. When I run DTS I receive this error from the Microsoft DTS flat file provider: "Error creating file mapping view: not enough storage is available to process this command." When I then try to continue with the wizard, it will not allow me to separate the data into the format that I need. Is there any other way to import this file using DTS?
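One workaround for the DTS file-mapping limit, sketched here with hypothetical table, column, and field positions: bulk-load each 308-byte record into a single-column staging table, then carve the fields out by position with SUBSTRING.
-- Stage each fixed-width record as one raw row
-- (assumes records end with a newline and contain no tab characters).
CREATE TABLE dbo.StageRaw (RawLine char(308) NOT NULL)

BULK INSERT dbo.StageRaw
FROM 'C:\Data\bigfile.txt'
WITH (ROWTERMINATOR = '\n', BATCHSIZE = 100000, TABLOCK)

-- Carve fields by known position and width (these positions are made up).
INSERT INTO dbo.Target (CustomerID, Amount, LoadDate)
SELECT SUBSTRING(RawLine, 1, 10),
       SUBSTRING(RawLine, 11, 12),
       SUBSTRING(RawLine, 23, 8)
FROM dbo.StageRaw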
View 1 Replies
Jul 16, 2007
Hi ,
Is there any method by which I can divide a large flat file into a certain number of smaller files, keeping the header in each of the sub-files?
Regards,
Prash
View 4 Replies
Oct 20, 2000
Hi,
I have inherited some databases with extremely large log files.
I tried to truncate the transaction log, but it did not work.
Can somebody please tell me how to truncate these log files?
Thanks in advance.
Attaullah
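For SQL Server 7.0/2000 (current at the time of this post), a commonly used sequence is sketched below; the database and logical log file names are hypothetical, and TRUNCATE_ONLY discards log records, so take a full backup afterwards.
USE MyDatabase
EXEC sp_helpfile   -- the logical name of the log file appears here

BACKUP LOG MyDatabase WITH TRUNCATE_ONLY   -- discard inactive log records (breaks the log backup chain)
DBCC SHRINKFILE (MyDatabase_log, 100)      -- then shrink the file; target size in MB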
View 2 Replies
Apr 4, 2007
I have a few tables using SQL Server 2005 Express. Currently they hold roughly 30-40k records each. I have my log files set to restricted growth at 90 MB. While I'm not close to reaching that, I would like my tables to be able to scale up to possibly millions of records. Based on that, I figure the transaction log file will probably need a higher threshold (unrestricted growth). For those with experience with tables holding millions of records, what average log file size could I expect?
Is it a bad idea to just shrink the log file every night during off-peak hours, so that regardless of the number of records I have, I always start the day with a minimal log file?
Do large log files have any effect on SQL performance?
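One way to approach the sizing question empirically (a hedged aside, not a definitive answer): DBCC SQLPERF(LOGSPACE) shows how much of each log file is actually in use, which helps pick a sensible fixed size. A nightly shrink tends to be counterproductive because the file simply regrows (and fragments) during the day; a large log file by itself has little effect on query performance, while growth events do hurt.
-- Reports log size (MB) and percent used for every database on the instance.
DBCC SQLPERF(LOGSPACE)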
View 3 Replies
Aug 16, 2006
We have SQL Server running on a Windows 2003 server, only because Backup Exec requires it. At the location C:\Program Files\Microsoft SQL Server\MSSQL\Data
there is this file: SuperVISorNet_log.LDF, which is 15 GB and is accessed daily. I apologize because I don't know what this is!
My question is: can this file be 'pruned' (for want of a better word)? It's taking up a lot of backup space.
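That .LDF file is the database's transaction log. A hedged sketch of one common fix, assuming the database is named SuperVISorNet (inferred from the file name) and that no transaction log backups are being taken:
-- SIMPLE recovery lets the server reuse log space automatically
-- (gives up point-in-time restore; on SQL 2000 the equivalent is the
-- 'trunc. log on chkpt.' database option).
ALTER DATABASE SuperVISorNet SET RECOVERY SIMPLE

-- Then shrink the oversized log file once (logical file name assumed).
DBCC SHRINKFILE (SuperVISorNet_log, 500)   -- target ~500 MB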
View 17 Replies
Jan 25, 2008
I am trying to run a query that deletes duplicate records from a table with 24m records. The problem is that each time I run it the log file fills up and I get an error saying the log file is full. For this reason the query never finishes.
Is there any way to turn off logging when running a query?
I think it also has to do with the disk drive running out of space, as the log file grows to over 12 GB.
The database is already running in simple mode.
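Logging can't be switched off for a DELETE, but breaking the delete into small batches lets simple recovery reuse the log between transactions instead of growing it. A sketch assuming SQL Server 2005 and hypothetical table/column names; the PARTITION BY list defines what counts as a duplicate:
WHILE 1 = 1
BEGIN
    ;WITH d AS (
        SELECT ROW_NUMBER() OVER (PARTITION BY Col1, Col2 ORDER BY Id) AS rn
        FROM dbo.BigTable
    )
    DELETE TOP (10000) FROM d WHERE rn > 1   -- each batch is its own small transaction

    IF @@ROWCOUNT = 0 BREAK                  -- stop once no duplicates remain
END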
View 11 Replies
Jun 11, 2008
Hello,
I have decided to use Linq for my current ASP.NET project and so far it has been good, but now I am implementing a system that will allow users to upload binary content such as pictures and videos. For ease of management and security, I have decided to store this content directly in the database. The performance hit is a minor concern because very few user-uploaded images/videos will be seen on any given page (usually just one).
From the limited tutorials I have seen on the internet, Linq supports the SQL Server varbinary column through its System.Linq.Binary class. This class does not appear to support streaming and instead loads all of the content into memory. This content can then be converted to an array of bytes, which can then be output to the browser via the response stream. This is not good. What if I am sending a video that is very large? Varbinary supports up to 2 GB. I can't have a 2 GB video sitting in memory. It makes a lot more sense to stream it via a small buffer.
Obviously, I am going to limit the size of the content that users can upload, but the core problem remains. If I limit content size to 2 MB and I have 2 GB of memory on the server, then I can only serve 1000 users concurrently. In reality, that number would be much less because of other processes running on the server.
Is there no way to stream data from a varbinary column with Linq using a small buffer of bytes?
Do I need to implement some custom logic on my Linq classes? Since these classes are automatically generated, how would I do such a thing?
Thanks.
View 1 Replies
Apr 9, 2008
How do I insert a large chunk of text into a table column? My project is to build a news website where people can go and read news articles. The articles are provided by the author in Word format, so how do I insert a news article into the table's column? Any help would be appreciated.
thanks
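A minimal sketch, assuming SQL Server 2005 and a hypothetical table: nvarchar(max) (ntext on SQL 2000) holds up to 2 GB of text, far more than any article needs. The Word document's contents would be extracted as plain text or HTML before the insert.
CREATE TABLE dbo.Articles (
    ArticleId int IDENTITY(1,1) PRIMARY KEY,
    Title     nvarchar(200) NOT NULL,
    Body      nvarchar(max) NOT NULL   -- holds up to 2 GB of article text
)

INSERT INTO dbo.Articles (Title, Body)
VALUES (N'Sample headline', N'Full article text goes here...')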
View 2 Replies
Feb 9, 2006
I have a table that I'm inserting a file into, using the Image data type to store the binary object. Now, the code below works fine for files around 1.5 MB, but for anything larger it's as if the code won't even execute, and I get a Page Not Found error.
I'm in the process of running some traces to find out what's going on in the backend, but I'm assuming there's something amiss with my code. The Image data type should handle files that size with no problem but for some reason it isn't.
Does anyone see anything wrong?
Thanks
Dim iLength As Integer = CType(File1.PostedFile.InputStream.Length, Integer)
If iLength = 0 Then Exit Sub 'not a valid file
Dim sContentType As String = File1.PostedFile.ContentType
Dim sFileName As String, i As Integer
Dim bytContent As Byte()
ReDim bytContent(iLength - 1) 'byte array sized to the file; VB upper bounds are zero-based, so length - 1
'strip the path off the filename
i = InStrRev(File1.PostedFile.FileName.Trim, "\")
If i = 0 Then
sFileName = File1.PostedFile.FileName.Trim
Else
sFileName = Right(File1.PostedFile.FileName.Trim, Len(File1.PostedFile.FileName.Trim) - i)
End If
conn = New SqlConnection(eco)
conn.Open()
cmd = New SqlCommand("INSERT INTO ECO_Attachments (ECOID, FromType, DocName,OldRev,NewRev,NtLogin,DisplayName, FileName, FileSize, FileData, ContentType) VALUES (@ECOID, @FromType,@DocName,@OldRev,@NewRev,@NtLogin,@DisplayName, @FileName, @FileSize, @FileData, @ContentType) ")
cmd.Connection = conn
Try
File1.PostedFile.InputStream.Read(bytContent, 0, iLength)
With cmd
.Parameters.Add("@ECOID", SqlDbType.Int)
.Parameters.Add("@FromType", SqlDbType.NVarChar, 50)
.Parameters.Add("@DocName", SqlDbType.NVarChar, 250)
.Parameters.Add("@OldRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NewRev", SqlDbType.NVarChar, 50)
.Parameters.Add("@NTLogin", SqlDbType.NVarChar, 100)
.Parameters.Add("@DisplayName", SqlDbType.NVarChar, 200)
.Parameters.Add("@FileName", SqlDbType.NVarChar, 255)
.Parameters.Add("@FileSize", SqlDbType.Real)
.Parameters.Add("@FileData", SqlDbType.Image)
.Parameters.Add("@ContentType", SqlDbType.NVarChar, 50)
.Parameters("@ECOID").Value = ECOID
.Parameters("@FromType").Value = From
.Parameters("@DocName").Value = DocName
.Parameters("@OldRev").Value = OldRev
.Parameters("@NewRev").Value = NewRev
.Parameters("@NTLogin").Value = NTLogon
.Parameters("@DisplayName").Value = DisplayName
.Parameters("@FileName").Value = sFileName
.Parameters("@FileSize").Value = iLength
.Parameters("@FileData").Value = bytContent
.Parameters("@ContentType").Value = sContentType
.ExecuteNonQuery()
'.ExecuteScalar()
End With
Catch ex As Exception
Response.Write(ex)
'Handle your database error here
Finally
conn.Close() 'close the connection whether or not the insert succeeded
End Try
View 1 Replies
Sep 20, 2005
Hi, my data files sit in the default directories and I think they are causing my partition to run out of space. I mainly use one database that I created, but I don't use the others (i.e. master, model, tempdb, etc.), yet I see their MDF and LDF files growing. What can I do to shrink them down, or perhaps move them off to a larger partition after shrinking?
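A hedged sketch of the classic shrink-then-move procedure for a user database on SQL 2000 (names and paths hypothetical); master, model, and tempdb normally stay small, and moving system databases takes extra steps, so relocating the user database usually suffices:
DBCC SHRINKFILE (MyDb_Data, 500)   -- shrink the data file first (logical name assumed)

EXEC sp_detach_db 'MyDb'
-- ...copy MyDb.mdf / MyDb_log.ldf to the larger partition, then reattach:
EXEC sp_attach_db 'MyDb',
    'E:\SQLData\MyDb.mdf',
    'E:\SQLData\MyDb_log.ldf'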
View 6 Replies
Dec 19, 2007
Hi…
During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are:
1) Can SQL CE 3.5 handle a 4–6 GB file
- Read
- Parse (SQL)
2) Can SQL CE 3.5 act as a standalone client with which a user can view a large (4-6 GB) text file?
- Will I need a .NET (small) client to read the large (4-6 GB) text file?
More info:
The text file will reside on the machine where the SQL CE 3.5 is installed. There is no pull to get the data.
Thank you (in advance)…
SQL CE 3.5
View 3 Replies
Oct 22, 2007
Hello,
I created an SSIS solution for reading data from dbase files and storing them in SQL Server. In a ForEach directory loop, up to one thousand dbase files are read and stored. The system where the packages are running has 16 GB of RAM.
For the first few hundred dbase files everything goes fine, but then the RAM seems to no longer suffice and a temp file is created (I changed the path in BufferTempStoragePath).
How can it be that there is a need to create temp files if there is so much RAM available?
Why is the RAM filled more and more during the SSIS package execution?
Is there anything I can do to release some of it? (it is running in a loop and there is no need to store all the data)
Could it be caused by dbase?? (I use Microsoft Jet 4.0 OLE DB Provider)
Another thing is that the temp file is not stored in the path I set in BufferTempStoragePath.
Sufficient permissions are set, but the temp file is still created in the user's temp folder...
Any kind of help is very much appreciated!
Best Regards,
Stefan
View 5 Replies
Apr 13, 2007
How can I export data from a SQL Server 2005 table to a fixed-length flat file, without using the xp_cmdshell option, from a SQL Server stored procedure?
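A hedged sketch of one approach (all names hypothetical): pad each column to its fixed width inside a view, then export that view with bcp from a SQL Agent CmdExec job step or a scheduled task, keeping xp_cmdshell out of the stored procedure entirely.
-- Each CAST to char(n) pads (or truncates) the value to a fixed width.
CREATE VIEW dbo.vFixedWidthExport AS
SELECT CAST(CustomerID   AS char(10))
     + CAST(CustomerName AS char(30))
     + CAST(Amount       AS char(12)) AS FixedRow
FROM dbo.Customers

-- Then, outside the procedure (Agent CmdExec step or scheduled task):
-- bcp "SELECT FixedRow FROM MyDb.dbo.vFixedWidthExport" queryout C:\Export\data.txt -c -T -S MyServer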
View 7 Replies
Feb 7, 2007
hi all
I'm using SSIS to import flat files and I need some help.
How can I import flat files from a folder that includes many files, finishing one and then starting the next, and so on?
Also, can I select from the flat files and add a condition,
like SELECT * FROM..... WHERE ......?
thanks
View 5 Replies
Jan 24, 2008
Hello Everyone.
I am a bit new to SQL Server, but not to DBA work or programming per se. I am having difficulties getting either an Excel or text flat file to import properly.
I guess it would be best to ask: using either SSIS or BULK INSERT, what options need to be entered for a typical Excel flat file?
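For the Excel case specifically, a hedged alternative sketch (path and sheet name hypothetical; requires the Ad Hoc Distributed Queries option on SQL 2005): the Jet provider can read an .xls directly, leaving BULK INSERT for the plain text files.
-- Pull Sheet1 of an .xls straight into a new table via the Jet OLE DB provider.
SELECT *
INTO dbo.ImportTarget
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Import\data.xls',
                'SELECT * FROM [Sheet1$]')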
View 2 Replies
Oct 18, 2006
In my application I am allowing the users to attach files. I found the data type "Image"; will this also allow regular file attachments?
Thanks,
Steve C.
View 4 Replies
Jul 28, 2014
I need to create script that will import large XML files (500 - 7GB) on a daily basis and store the data in a relational db structure.
What is the best and fastest way of importing such files. I have played around with smaller files and found the following.
1. SSIS XML Data Source: It doesn't seem to like the complex elements types and throws out the file.
2. Using Bulk File Import, storing the file in an XML variable and using XQuery to parse the file: This works, but it can't take a file more than 2 GB in size, so I can't use this method.
3. C# + XML Serialization: This also works, but seems to be terribly slow. I open the DB connection once, so it doesn't open and close for each db call, but still seems like it takes a long time.
How can I import large XML files quickly into a relational table structure?
View 9 Replies
Aug 13, 2007
What is the easiest way to get a large fixed-width text file definition (200 columns) into SSIS? Having to define each column with the ruler would be very cumbersome.
View 5 Replies
Sep 25, 2005
How can I store flat files in SQL Server?
Actually I am planning to prepare a repository of different files like .xls, .pdf, .doc, .ppt, etc., and then I will have a web interface to access these files. Can anybody guide me on how I can store these files in the database?
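A minimal sketch assuming SQL Server 2005 (names hypothetical): store each document in a varbinary(max) column (image on SQL 2000), keeping the original file name and MIME type so the web interface can serve the bytes back with the right headers.
CREATE TABLE dbo.FileRepository (
    FileId      int IDENTITY(1,1) PRIMARY KEY,
    FileName    nvarchar(255)  NOT NULL,
    ContentType nvarchar(100)  NULL,    -- e.g. 'application/pdf'
    FileData    varbinary(max) NOT NULL
)

-- One way to load a file from the server's file system (SQL 2005+):
INSERT INTO dbo.FileRepository (FileName, ContentType, FileData)
SELECT N'report.pdf', N'application/pdf', BulkColumn
FROM OPENROWSET(BULK 'C:\Files\report.pdf', SINGLE_BLOB) AS b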
View 3 Replies
Aug 1, 2006
I'm trying to input a few thousand flat files into a few thousand tables in a SQL database. I'm using Integration Services with a ForEach loop to read all the files in a directory. The problem is I can only insert the data from all the files into one table. Does anyone know a way to do multiple tables? Maybe using some sort of variable?
View 1 Replies
Nov 3, 2006
Hello,
I have a package that contains 22 data flow tasks, one for each flat file that I need to process and import. I decided against making each import a separate package because I am loading the package in an external application and calling it from there.
Now, everything works beautifully when all my text files are exported from a datasource beyond my control. I have an application that processes a series of files encoded using EBCDIC, and I am not always guaranteed that all the flat files will be exported. (There may not have been any data for the day.)
I am looking for suggestions on how to handle files that do not exist. I have tried making a package-level error handler (Script task) that checks the error code ("System::ErrorCode") and, if it tells me that the file cannot be found, returns Dts.TaskResult = Dts.Results.Success, but that is not working for me; the package still fails. I have also thought about programmatically disabling the tasks that do not have a corresponding flat file, but it seems like overkill.
So I guess my question is this; if the file does not exist, how can I either a) skip the task in the package, or b) quietly handle the error and move on without failing the package?
Thanks!
Lee.
View 9 Replies
Aug 3, 2007
Hi,
I'm trying to write the content of variables to flat files... is that possible?
Thanks!
View 7 Replies
Aug 2, 2006
I'm trying to input a few thousand flat files into a few thousand tables in a SQL database, using SQL Server Business Intelligence Development Studio.
I'm using a ForEach loop to read all the files in a directory.
The problem is I can only insert the data from all the files into one table.
Does anyone know a way to do multiple tables? Maybe using some sort of variable?
View 1 Replies
Jun 29, 2006
Our ETL process involves some pre-load validation, and I'm wondering how best to implement it in SSIS.
Some details on my situation: I need to import 30 flat files with different data formats into 30 destination tables. In addition, these files share a common header and footer row format, and I need to validate these headers and footers before using the imported data downstream. (For example, the footer contains a record count, and fields in the header and footer should match some user variables.) My first approach was to write a Perl script that splits each file into three (header, data, and footer), but while that makes it easy to import the data section, it's more complicated to validate the header and footer and work them into the control flow. I think I'd also have to copy the same logic for all 30 data flows, which is less than ideal.
It looks like implementing this logic directly in SSIS is a little ugly (though that could be my lack of experience speaking). As I thought about this some more, I came up with a couple other solutions -- any critiques or comments?
1) Write a custom source adapter (which will probably contain the default flat file adapter) that knows how to validate my header and footer. I'd be able to read the file formats from an XML file, which might make my scripts more generic, and I might even be able to handle some custom data conversions more elegantly than I'm doing right now. (These files represent null numerics as whitespace rather than an empty field.)
2) Beef up the Perl splitter to validate the header and footer. If the cleanest approach is to say "assume that SSIS is only loading pre-validated data", this makes the problem entirely external.
Or am I entirely missing the mark here? Any thoughts?
View 5 Replies
May 18, 2007
For some reason I am having a really hard time grasping IS and I have a task that I would imagine is easy.
I have a flat file source with 6 columns, and I would like to import this file into two flat files: one containing columns 1, 2, 3, 5 and the second containing 2, 4, 5, 6. I created the connection managers for both destination files, but I can't determine which transformation I need to accomplish this task. Could you help?
View 3 Replies
Jun 13, 2007
Hi
I have two text files with data. I need to compare the two files, and if there is any inconsistency between them I need to display it as a report using SQL Reporting Services. I do not know how to do that.
Any source code or suggestion.
Thanx in advance
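A hedged sketch of one way to feed such a report (staging tables hypothetical): load each file into its own table, e.g. with BULK INSERT, then EXCEPT (SQL 2005+) surfaces the lines unique to each side and the report simply renders the combined result.
(SELECT Line, 'Missing from file B' AS Discrepancy FROM dbo.FileA_Lines
 EXCEPT
 SELECT Line, 'Missing from file B' FROM dbo.FileB_Lines)
UNION ALL
(SELECT Line, 'Missing from file A' FROM dbo.FileB_Lines
 EXCEPT
 SELECT Line, 'Missing from file A' FROM dbo.FileA_Lines)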
View 5 Replies
Aug 29, 2007
I have several databases that have grown to 300 GB, and I would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore into it, or am I stuck with unloading the data table by table?
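A restore keeps the original file layout (WITH MOVE can relocate files, but it cannot re-spread the data into new ones), so a hedged alternative sketch, with names, sizes, and paths hypothetical: add files on the new drives to the existing database and rebuild the big tables onto the new filegroup.
ALTER DATABASE BigDb ADD FILEGROUP FG2

ALTER DATABASE BigDb ADD FILE
    (NAME = BigDb_Data2, FILENAME = 'E:\SQLData\BigDb_2.ndf', SIZE = 50GB),
    (NAME = BigDb_Data3, FILENAME = 'F:\SQLData\BigDb_3.ndf', SIZE = 50GB)
TO FILEGROUP FG2

-- Rebuilding a table's clustered index on FG2 moves that table's data there
-- (assumes an existing clustered index named CIX_BigTable on column Id).
CREATE CLUSTERED INDEX CIX_BigTable ON dbo.BigTable (Id)
    WITH (DROP_EXISTING = ON) ON FG2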
View 3 Replies
Sep 11, 2007
Hello,
I am attempting to restore the database from within a VB.NET application. I am making the following 3 calls:
RESTORE FileListOnly FROM DISK = 'C:\MyDatabase.dat'
USE Master RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat' WITH NORECOVERY, MOVE 'MyDatabase' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase.mdf', MOVE 'MyDatabase_log' TO 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\LDF\MyDatabase.ldf', REPLACE
RESTORE DATABASE MyDatabase FROM DISK = 'C:\MyDatabase.dat'
using SMO. This logic works fine with small *.dat files; however, when using a *.dat file of about 4 GB I get an error on the 3rd RESTORE DATABASE call:
ExecuteNonQuery failed for Database 'master'.
An exception occurred while executing a Transact-SQL statement or batch.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Operator aborted backup or restore. See the error messages returned to the console for more details.
ExecuteNonQuery failed for Database 'master'.
An exception occurred while executing a Transact-SQL statement or batch.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
Operator aborted backup or restore. See the error messages returned to the console for more details.
The same program/logic also works fine when I use MS SQL 2005, and it runs fine from the MS SQL 2005 Query Analyzer for both 2005 and 2000 databases. There seems to be a problem only with MS SQL 2000 from within VB.NET. Does anybody have any idea? I'd appreciate any response. Thanks
Eugene
View 6 Replies
Feb 21, 2012
So we are running SQL 2000, and we use a DTS package to load about 100 txt files into different tables in a SQL DB.
In DTS packages you define the connection (database), and the different source files connect to that connection.
You define the destination on the flow arrow. My question is: how do I accomplish this in SQL 2008 using SSIS (we are upgrading SQL)?
View 9 Replies
Jan 4, 2008
I had to use SSIS 2005 in a short project recently and had little time to work it out. I was importing a whole bunch of flat files into SQL Server tables, with many derived columns and transformations in between.
It seems to automatically map columns from the flat file to columns in the SQL table where the names of the columns are equal. But can it also do it automatically by position, so that flat file column 1 goes to SQL table column 1, etc.? In each flat file I had to manually click and drag the columns across to map them, which took a very long time as there were hundreds of columns in some tables!
Thanks.
View 3 Replies
Apr 23, 2008
Hi everyone,
I have multiple flat files in a source folder (they have naming conventions with today's date, e.g. Flatfile_20082204_1, Flatfile_20082204_2, Flatfile_20082204_3).
I need to extract each and every file dynamically, transform the flat file, and then load it into the destination folder with a standard prefix and today's date with a sequence letter, e.g. Flatfile_20082304_A, Flatfile_20082304_B, Flatfile_20082304_C.
Please help me.
Thanks In Advance.
View 20 Replies
Sep 20, 2006
Hi.
I've tried to create an SSIS package to simply export a bunch of tables as flat files, and am having trouble because when the ForEach loop hits the second table, the column mappings in the flat file destination are not synchronised with its schema.
I created a for each loop with an enumerator that returns the table names and sets a user variable.
I created a data flow task which dynamically connects to the table name variable.
In the Flat File Destination there is a column mapping property, but I don't know how to reset these mappings on each iteration.
Any ideas?
View 3 Replies