Hey guys,
I have a dilemma and hope someone can help.
I don't know of any utilities or commands in SQL Server that do this, but I hope someone does.
What I need to do is something like a bcp import of a text file. I can do that with DTS as well, but what I want is for the table to be created on import. So let's say I am importing a tab-delimited file called ax.txt, with the column names as the first row. On import, it would create the table ax with the column names from the file and then import the data into that table.
I hope I explained it clearly. Please let me know if there is anything I can use to do this without writing lots of code.
I have an idea of how to do it the long way, but hope there is a utility that already does it.
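One possibility, offered as a minimal sketch rather than a definitive answer: the Jet text driver can read the header row and work out the columns, so a single SELECT ... INTO through OPENROWSET both creates the table and loads it. This assumes SQL 2005 with ad hoc distributed queries enabled, and that ax.txt sits in C:\data (the path, and the FMT=TabDelimited hint, are assumptions; some Jet builds only pick up tab delimiters from a schema.ini file in the same folder):

    -- one-time setup: allow ad hoc OPENROWSET queries (SQL Server 2005)
    EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
    EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

    -- create table ax from the header row and load the data in one statement
    SELECT *
    INTO dbo.ax
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                    'Text;Database=C:\data\;HDR=Yes;FMT=TabDelimited',
                    'SELECT * FROM ax.txt');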
I am a relative newbie to SQL, but I've written many queries for VB.NET/.NET code... I'm not an absolute beginner.
I'd like to import a text file into a SQL database so that I can use SQL Server Reporting Services to report on the data. Here is a sample of the first 8 text file records. All 6 potential database fields are separated by commas, with no spaces:
The field datatypes of the first record above are:
- 1 (this is an autonumber; should be a number, for ordering)
- 12/4/06 4:12:11 PM (date and time; can be converted to text if necessary)
- 67.13 (number, 2 decimal places)
- 70.50 (number, 2 decimal places)
- 71.56 (number, 2 decimal places)
- 8.23 (number, 2 decimal places)
The text file is big: 97K records. I have SQL 2005 installed on my PC.
Can anyone out there please help me with the import or SQL statement to create a SQL table from this? Any help would be greatly appreciated!
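A minimal sketch of one way to do it, with assumed table/column names, an assumed file path, and types matching the field descriptions above:

    -- table shaped to the six fields described above (all names are assumptions)
    CREATE TABLE dbo.SampleData (
        RowID     int            NOT NULL,  -- the autonumber, used for ordering
        ReadingAt datetime       NOT NULL,  -- e.g. 12/4/06 4:12:11 PM
        Value1    decimal(10, 2) NOT NULL,  -- e.g. 67.13
        Value2    decimal(10, 2) NOT NULL,  -- e.g. 70.50
        Value3    decimal(10, 2) NOT NULL,  -- e.g. 71.56
        Value4    decimal(10, 2) NOT NULL   -- e.g. 8.23
    );

    -- load all 97K rows in one pass
    BULK INSERT dbo.SampleData
    FROM 'C:\data\sample.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

The datetime parse assumes a US session language; if the load complains about the date column, stage it as varchar and CAST afterwards.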
Hi guys, I need to import a text file into a SQL table in SQL Server 2005 using a query. How do I import a text file into a SQL table? I need a query; I don't want to go through the Import/Export options.
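For a query-only route, BULK INSERT is the usual tool. A minimal sketch, assuming the target table already exists with columns matching the file, and an assumed path and delimiter:

    BULK INSERT dbo.MyTable
    FROM 'C:\data\myfile.txt'
    WITH (
        FIELDTERMINATOR = ',',  -- use '\t' for tab-delimited files
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2     -- skip a header row; drop this line if there is none
    );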
I have a table schema and a text data file. I create the table first and then import the text data file; however, the import always overwrites the table schema.
How can I import the text file into the existing table schema, leaving the schema as I defined it?
I have a text file which needs to be loaded into a table (let's call it the DataFile table). For now I'm just doing a manual DTS import of the .txt into SQL Server to create the table, which works. But here's my problem...
I need to extract data from DataFile table, here's my query:
select * from dbo.DataFile where DF_SC_Case_Nbr not like '0000%';
Then I need to create a new table for the extracted data; let's call it ExtractedDataFile. But I don't know how to create the new table and insert the data selected above into it.
Also, can the extraction and the creation of the new table be done in just one stored procedure? Or is there another way of doing all this (including the import of the text file)?
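A minimal sketch: SELECT ... INTO creates the new table and copies the filtered rows in one statement, and it wraps cleanly in a stored procedure (only DataFile, ExtractedDataFile, and the WHERE clause come from the post; the procedure name is an assumption):

    CREATE PROCEDURE dbo.usp_ExtractDataFile
    AS
    BEGIN
        -- SELECT ... INTO fails if the target exists, so drop any previous copy
        IF OBJECT_ID('dbo.ExtractedDataFile') IS NOT NULL
            DROP TABLE dbo.ExtractedDataFile;

        -- create and fill the new table in one step
        SELECT *
        INTO dbo.ExtractedDataFile
        FROM dbo.DataFile
        WHERE DF_SC_Case_Nbr NOT LIKE '0000%';
    END

The text-file import itself could be folded in as a BULK INSERT into DataFile at the top of the same procedure.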
I have records in an Excel file (Excel 2010) and I would like to bulk import them into SQL Server 2008, and while importing, have SQL Server automatically create a new table based on the header row of the source file.
I am not sure if SQL Server 2008 has this capability.
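One hedged possibility: if the ACE OLEDB provider (the Office 2010 driver) is installed on the server and ad hoc distributed queries are enabled, a SELECT ... INTO through OPENROWSET creates the table from the header row and loads it in one statement. The path, sheet name, and target table name below are assumptions:

    SELECT *
    INTO dbo.ImportedFromExcel
    FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                    'Excel 12.0;Database=C:\data\records.xlsx;HDR=YES',
                    'SELECT * FROM [Sheet1$]');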
Hi, I want to create a text file and write to it by calling a CLR assembly from a stored procedure. Full detail is given below.
I wrote a class to create a text file and write text to it.

1) I created a class in Visual Basic .NET 2005, whose code is given below:

    Imports System
    Imports System.IO
    Imports Microsoft.VisualBasic
    Imports System.Diagnostics

    Public Class WLog

        Public Shared Sub LogToTextFile(ByVal LogName As String, ByVal newMessage As String)
            Dim w As StreamWriter = File.AppendText(LogName)
            LogIt(newMessage, w)
            w.Close()
        End Sub

        Public Shared Sub LogIt(ByVal logMessage As String, ByVal wr As StreamWriter)
            wr.Write(ControlChars.CrLf & "Log Entry:")
            ' stamp each entry with the time and date
            wr.WriteLine("{0} {1}", DateTime.Now.ToLongTimeString(), DateTime.Now.ToLongDateString())
            wr.WriteLine(" :")
            wr.WriteLine(" :{0}", logMessage)
            wr.WriteLine("---------------------------")
            wr.Flush()
        End Sub

        Public Shared Sub LogToEventLog(ByVal errorMessage As String)
            Dim log As System.Diagnostics.EventLog = New System.Diagnostics.EventLog
            log.Source = "My Application"
            log.WriteEntry(errorMessage)
        End Sub

    End Class
2) I built and registered the assembly in SQL Server 2005.

3) I created the stored procedure given below:

    CREATE PROCEDURE dbo.SP_LogTextFile
    (
        @LogName    nvarchar(255),
        @NewMessage nvarchar(255)
    )
    AS EXTERNAL NAME [asmLog].[WriteLog.WLog].[LogToTextFile]
4) I execute the stored procedure as:

    EXECUTE SP_LogTextFile 'C:\Test.txt', 'Message1'

5) I then get the following error:

    Msg 6522, Level 16, State 1, Procedure SP_LogTextFile, Line 0
    A .NET Framework error occurred during execution of user defined routine or aggregate 'SP_LogTextFile':
    System.UnauthorizedAccessException: Access to the path 'C:\Test.txt' is denied.
    System.UnauthorizedAccessException:
       at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
       at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
       at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options)
       at System.IO.StreamWriter.CreateFile(String path, Boolean append)
       at System.IO.StreamWriter..ctor(String path, Boolean append, Encoding encoding, Int32 bufferSize)
       at System.IO.StreamWriter..ctor(String path, Boolean append)
       at System.IO.File.AppendText(String path)
       at WriteLog.WLog.LogToTextFile(String LogName, String newMessage)
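A likely cause, offered as a guess rather than a diagnosis: a CLR assembly can only reach the file system when it is cataloged with PERMISSION_SET = EXTERNAL_ACCESS (the default, SAFE, forbids all I/O) and the SQL Server service account has NTFS write access to the target path. A minimal sketch of re-cataloging the assembly, with an assumed DLL path and database name:

    -- EXTERNAL_ACCESS requires a trustworthy database (or a signed assembly)
    ALTER DATABASE MyDb SET TRUSTWORTHY ON;

    -- drop the dependent procedure, re-catalog the assembly, then recreate the procedure
    DROP PROCEDURE dbo.SP_LogTextFile;
    DROP ASSEMBLY asmLog;
    CREATE ASSEMBLY asmLog
    FROM 'C:\assemblies\WriteLog.dll'   -- assumed path to the compiled DLL
    WITH PERMISSION_SET = EXTERNAL_ACCESS;
    -- ...then re-run the CREATE PROCEDURE from step 3

Writing to the root of C:\ is also commonly blocked for non-admin service accounts, so a subfolder the service account owns is a safer test target.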
I have a small project that involves importing a series of CSV files held within an FTP directory into our data warehouse. Every day a series of CSV files will be added to the directory. These will be named something like:
Audit1.csv,Audit2.csv etc.
I would like to automate this process, as it can involve up to 400 files at a time. The procedure would need to identify a valid file, import it into the database, delete the file, and then move on to the next one.
Does anyone know of a way to achieve this? I was thinking along the lines of using a cursor and bcp, but I'm not sure how to identify the files to the database, i.e. how do I make it step through the directory and process the files?
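A minimal sketch of that cursor idea, using xp_cmdshell (which must be enabled) to list the directory; the local path, staging table, and file pattern are all assumptions:

    -- capture the directory listing
    CREATE TABLE #Files (FileName varchar(260));
    INSERT INTO #Files EXEC master..xp_cmdshell 'dir /b C:\ftp\Audit*.csv';
    DELETE FROM #Files WHERE FileName IS NULL OR FileName NOT LIKE '%.csv';

    DECLARE @f varchar(260), @cmd varchar(1000);
    DECLARE file_cur CURSOR FOR SELECT FileName FROM #Files;
    OPEN file_cur;
    FETCH NEXT FROM file_cur INTO @f;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- import the file...
        SET @cmd = 'BULK INSERT dbo.AuditStaging FROM ''C:\ftp\' + @f
                 + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
        EXEC (@cmd);

        -- ...then delete it so it is not processed twice
        SET @cmd = 'del C:\ftp\' + @f;
        EXEC master..xp_cmdshell @cmd;

        FETCH NEXT FROM file_cur INTO @f;
    END
    CLOSE file_cur;
    DEALLOCATE file_cur;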
In a DTS package I have a text file import object, a data pump, and a SQL object. The text file import object has been set up to splice a 500-character-wide file into 20 columns. The data pump task does a copy column for all the columns into the appropriate table. What I need is a way of changing the file name I specify in the text import object. I have 12 months' worth of data in separate files (DBF0199.TXT, DBF0299.TXT, DBF0399.TXT, etc.) which all use the same format. Is there a way to change the text import object's file name inside the package, using an ActiveX script task or something?
I'm trying to import a fixed-field text file into SQL Server using DTS, but whenever I go past 3640 characters I am no longer able to add, delete, or move column breaks. Is anyone else experiencing this problem, and does anyone know of a workaround? Any help would be appreciated. Thanks!
Hi all, I'm having a problem importing an Excel file into SQL Server 2000. Here is the scenario with my data: one of the columns has mixed data, which is putting NULLs into some rows of the SQL Server table. I found on Microsoft TechNet that this is a bug in SQL Server 7.0/2000, and the workaround (according to Microsoft) is to get the data into a text file and import that into SQL Server.

Now the question: my data contains some currency fields and numeric fields in addition to the char and date fields. When I import the table using the DTS wizard, it fails. I've tried conversion functions like CDate and CLng, and DTS still fails. What I noticed is that when I import into a table with varchar for all columns, it works fine, but then the data is of no use.

I would appreciate it if anyone could help me solve this problem. Thanks, Sammy.
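One hedged workaround building on the observation above: keep the all-varchar import as a staging table, then convert into the properly typed table with T-SQL casts, where bad rows can be filtered instead of failing the whole load. All table and column names here are assumptions:

    INSERT INTO dbo.FinalTable (Amount, Quantity, CustomerName, CreatedOn)
    SELECT CAST(Amount    AS money),
           CAST(Quantity  AS int),
           CustomerName,
           CAST(CreatedOn AS datetime)
    FROM dbo.Staging
    WHERE ISNUMERIC(Amount) = 1     -- guard rows that would fail the casts
      AND ISDATE(CreatedOn) = 1;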
Quick advice question. I import lots of text files -- many with 50-plus data columns. Few come with a table layout, other than perhaps the first row holding a set of column names.
When I pull them into SQL Server, the columns default to varchar(8000). Is anyone aware of a tool (part of SQL Server or otherwise) that can scan a column of data and recommend a data type and size?
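Not a tool, but a minimal sketch of the idea in plain T-SQL: after a throwaway varchar import, probe each column and let the counts suggest a type. ISNUMERIC and ISDATE are loose tests, so treat the result as a hint; the table and column names are assumptions:

    SELECT MAX(LEN(Col01))                                       AS MaxLen,
           SUM(CASE WHEN ISNUMERIC(Col01) = 0 THEN 1 ELSE 0 END) AS NonNumericRows,
           SUM(CASE WHEN ISDATE(Col01)    = 0 THEN 1 ELSE 0 END) AS NonDateRows
    FROM dbo.RawImport;
    -- NonNumericRows = 0 suggests a numeric type, NonDateRows = 0 suggests datetime,
    -- otherwise varchar(MaxLen) (with some headroom) is a reasonable size

Repeating that over 50-plus columns is tedious by hand, but a cursor over INFORMATION_SCHEMA.COLUMNS can generate the probes.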
I have created a DTS package that imports a text file to my data table. I get errors whenever I run it, since there are fields in the table that are numeric. I understand that I need to create an ActiveX script to convert those fields on import. Does anyone have any guidance?
Does anyone know if it's possible to use the wizard or DTS Designer to accept a source file with the following simplified format?

    <field1label>: <record1field1value>
    <field2label>: <record1field2value>
    - - - - - - -
    <fieldNlabel>: <record1fieldNvalue>
    <field1label>: <record2field1value>
    <field2label>: <record2field2value>
    etc.

i.e. each input record is delimited by {LF}{LF}, and each column by {LF}. Or will it be necessary to write a Perl script (say) to convert it first into a .csv file? Thanks, Dave
Something I find myself wanting to do frequently is the traditional foreach loop through a directory of files, importing each one (which works great in SSIS) -- only I don't want to import data I have already imported.
In a previous job I used Perl for things like this, and the structure would be as follows:
For Each File:
1. Get filename and timestamp
2. Query a db table holding the already-imported filenames and timestamps. If the filename is not in the table, or is in the table with an older date, return 1 (import); if the file has already been imported, return 0 (skip).
3. Based on the result of step 2, either import or skip to the next file.
Any recommendations on how to do something similar in SSIS? I get stuck when I try to get the timestamp of the file, and I also can't figure out how to do the conditional inside the Foreach container... I am also open to other ideas on how to import only files I have not already imported.
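A minimal sketch of the database side of that structure, assuming a Foreach loop that maps the current file name into an SSIS variable. Every name below is an assumption:

    -- one row per file ever imported
    CREATE TABLE dbo.ImportLog (
        FileName   nvarchar(260) NOT NULL PRIMARY KEY,
        FileDate   datetime      NOT NULL,   -- the file's modified timestamp
        ImportedAt datetime      NOT NULL DEFAULT GETDATE()
    );

    -- run by an Execute SQL Task inside the loop; the ? placeholders map to the
    -- SSIS variables holding the current file's name and timestamp, and the
    -- single-row result is stored into a ShouldImport variable
    SELECT CASE
             WHEN EXISTS (SELECT 1 FROM dbo.ImportLog
                          WHERE FileName = ? AND FileDate >= ?)
             THEN 0 ELSE 1
           END AS ShouldImport;

For the two sticking points: the Foreach file enumerator only hands back the name, so reading the timestamp usually takes a small Script Task (File.GetLastWriteTime), and the conditional is done with an expression on the precedence constraint into the data flow (e.g. @ShouldImport == 1) rather than a branching task.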
Hi, I'm new to DTS and need to be able to import a text file into a table each day.
The main problem I have is that the file is datestamped, so the name of the file changes each day.
Today it would be called file20070419.txt; tomorrow it would be file20070420.txt. When I select a text file as a source I have to pick a valid file, so how can I get around this?
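One way round it, as a sketch, is to sidestep the fixed DTS source and build the day's file name in T-SQL; CONVERT style 112 yields yyyymmdd, which matches the names above. The folder and target table are assumptions:

    DECLARE @sql varchar(1000);
    -- builds e.g. C:\data\file20070419.txt for today's date
    SET @sql = 'BULK INSERT dbo.DailyImport FROM ''C:\data\file'
             + CONVERT(char(8), GETDATE(), 112)
             + '.txt'' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'')';
    EXEC (@sql);

Alternatively, the same CONVERT idea inside a DTS ActiveX script can rewrite the text file connection's DataSource before the pump runs.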
Hi all, I am looking for examples of scripts that will help me do these things (see the sketch below):
- import a text file delimited with the character "*", representing a new month of data, for example data from March 2007
- create a new table with the structure of an existing one to receive the imported data, for example Data_March_2007
- alter an existing totals table, adding a new column for the newly imported month, e.g. a column for March 2007.
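A minimal sketch of all three steps, with assumed names for the template table (Data_Template), the totals table (MonthlyTotals), the new column's type, and the file path:

    -- 1) clone the structure of an existing table (WHERE 1 = 0 copies no rows)
    SELECT * INTO dbo.Data_March_2007 FROM dbo.Data_Template WHERE 1 = 0;

    -- 2) import the *-delimited file into the new table
    BULK INSERT dbo.Data_March_2007
    FROM 'C:\data\march2007.txt'
    WITH (FIELDTERMINATOR = '*', ROWTERMINATOR = '\n');

    -- 3) add the new month's column to the totals table
    ALTER TABLE dbo.MonthlyTotals ADD March_2007 decimal(10, 2) NULL;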
Hello everyone, and thanks for your help in advance. I am having a problem importing a text file into SQL Server 2005 and can't figure out where the issue is. The file is in CSV format with " as the text qualifier. When attempting to import the file into a SQL Server 2005 table, the import fails due to numerous truncation errors. I have tried importing into an existing table, but have also tried letting the import create the table; I receive the same failures in both cases. For whatever reason, when the import creates the table, each column is created as nvarchar(50), even though the column sizes vary widely.

Oddly enough, when importing into a SQL Server 2000 table with the correct layout, the file imports perfectly fine, which verifies there is nothing wrong with the data source. It also creates the table with appropriate column sizes in SQL 2000. I'm really at a loss as to what is going on. Any assistance would be greatly appreciated. Thanks.
Every day, users FTP text files to a directory on the server. During the night, a job runs to import these text files into a table.
First, the job needs to read the file name, then open the file, read the first line and insert it into the table, and so on until the end of the file. Then the second file is read... and so on until there are no more files.
I am using VB to read the file name and open the file. My question is: how do I pass the file name to the second step, which is in T-SQL?
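One hedged option: make the T-SQL step a stored procedure that takes the file name as a parameter, so the VB side passes it like any other ADO parameter, and the line-by-line loop in VB disappears entirely (BULK INSERT reads the whole file itself). The staging table and delimiters are assumptions:

    CREATE PROCEDURE dbo.usp_ImportFile
        @FileName varchar(260)
    AS
    BEGIN
        -- BULK INSERT cannot take a variable directly, so build the statement
        DECLARE @sql varchar(1000);
        SET @sql = 'BULK INSERT dbo.NightlyStaging FROM ''' + @FileName
                 + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
        EXEC (@sql);
    END

    -- called from VB (or tested from Query Analyzer) as:
    EXEC dbo.usp_ImportFile 'D:\ftp\todaysfile.txt';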
If you have both invoice header lines and invoice detail lines in a comma-delimited file, how can the data in the file be imported into two different tables? I can produce a text file, e.g.:

    1,20,10/03/2002,39 High Street Any Town,,
    2,20,Fluffy Slippers,2,Red,10.99
    2,20,Pyjamas,3,Black,15.99
    2,20,Trousers,1,Lilac,24.99
    1,21,10/03/2002,11 Gibson Close,
    2,21,Sandles,1,Black,12.99
    2,21,Shoes,4,Blue,23.99
    1,22,13/03/2002,45 Mill Street,
    2,22,Womble Feet,4,White,16.99
    2,22,Glass Slipper,1,Transparent,23.99

Lines with 1 in the first column should go in the InvoiceHeader table; lines with 2 in the first column should go in the InvoiceDetails table.
I have tried with DTS but to no avail; ActiveX scripts in the Transform Data task can only seem to access one data destination, i.e. one table, not two.
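A hedged two-step alternative: let DTS load the raw file into a single wide varchar staging table whose first column is the record type, then split it with two INSERT ... SELECT statements. The staging column names are assumptions; the date cast assumes dd/mm/yyyy dates, hence SET DATEFORMAT dmy:

    SET DATEFORMAT dmy;  -- 10/03/2002 is 10 March here

    INSERT INTO dbo.InvoiceHeader (InvoiceNo, InvoiceDate, Address)
    SELECT Col2, CAST(Col3 AS datetime), Col4
    FROM dbo.InvoiceStaging
    WHERE RecType = '1';

    INSERT INTO dbo.InvoiceDetails (InvoiceNo, Item, Qty, Colour, Price)
    SELECT Col2, Col3, CAST(Col4 AS int), Col5, CAST(Col6 AS money)
    FROM dbo.InvoiceStaging
    WHERE RecType = '2';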
I have a fixed-field text file I am trying to set up an import for in SQL 2000. If I use Access 2000 the file reads fine, but if I use the SQL import feature or DTS, SQL doesn't line up the records correctly. I tried all the combinations of row delimiters with no luck, but Access works just fine. Any ideas?
Dear all, I have a text file that contains data I need to insert into SQL Server. The file size is about 800 MB and it contains about 17,000,000 lines. Someone told me there is a way in SQL Server to import this data automatically by writing some scripts. The file looks like this:

    xxxxxxxxx
    xxxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxxxxx
    yyyy .. yyyyy
    yyyy .. yyyyy
    xxxxxxxxxxxxxxxxx
    xxxxxxxxxxxxxxxxx

I need only the y-lines; I don't care about the rest of the file.
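A minimal sketch of one approach: bulk-load every line into a one-column staging table, then keep only the matching lines. The LIKE pattern below is a placeholder; substitute whatever actually distinguishes the y-lines:

    CREATE TABLE dbo.RawLines (Line varchar(8000));

    -- one pass over the 800 MB file; TABLOCK speeds up the load
    BULK INSERT dbo.RawLines
    FROM 'C:\data\bigfile.txt'
    WITH (ROWTERMINATOR = '\n', TABLOCK);

    -- keep only the wanted lines (placeholder pattern)
    SELECT Line
    INTO dbo.WantedLines
    FROM dbo.RawLines
    WHERE Line LIKE 'y%';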
I have created a table with 3 indexed (non-clustered) fields: key (100 bytes), error (integer), and status (6 bytes). I would like to know how to import 3 million or more records quickly. If I don't index the fields, the operation takes 3 hours; with indexed fields it takes overnight. How can I optimise the import operation from a .txt file?
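The usual trick, sketched with assumed index, table, and file names: drop the non-clustered indexes, bulk load with TABLOCK and a batch size, then rebuild the indexes once at the end; one rebuild over 3 million rows is far cheaper than 3 million incremental index updates:

    DROP INDEX MyTable.IX_key;     -- repeat for each of the three indexes
    DROP INDEX MyTable.IX_error;
    DROP INDEX MyTable.IX_status;

    BULK INSERT dbo.MyTable
    FROM 'C:\data\records.txt'
    WITH (TABLOCK, BATCHSIZE = 100000,
          FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

    CREATE INDEX IX_key    ON dbo.MyTable ([key]);
    CREATE INDEX IX_error  ON dbo.MyTable (error);
    CREATE INDEX IX_status ON dbo.MyTable (status);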
Hi, I am using DTS to import data from a text file into database tables in SQL Server 2005. There is one big column and I don't know what data type to use to import it; I tried both nvarchar(MAX) and varchar(MAX), but I am getting errors in both cases.
For nvarchar(MAX), I get:
Error 0xc02020ed: Data Flow Task: Columns "Column 2" and "VendorDesc" cannot convert between unicode and non-unicode string data types. (SQL Server Import and Export Wizard)
For varchar(MAX), I get:
Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Column 2" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)
Could you please let me know what data type I can use to handle this column?
This was a no-brainer in SQL Server 2000, but I just can't figure out where the heck it is in SQL Server 2005 Enterprise Edition. Where is the wizard that will import a text file and create a database from it?
First off, I am not a DBA, not even remotely close. Anywho, I have been given the task of figuring out how to import a comma-delimited text file into 2 columns of an existing database. The task is as follows:

- A daily text file is created by a Unix DB and placed in a folder local to the SQL Server.
- I am to take this file and import it into an existing MS SQL 2005 DB that has 3 columns: AccountID, AccountName, DateRecordCreated.
- The imported data has to overwrite all existing SQL DB data.
- This is to run automated on a daily schedule.

Being a SysAdmin, this sounds super simple to do, but I have wasted 2 full days trying to figure out how to make this happen using SSIS. All I want to know is whether I am on the right track in focusing on SSIS for a solution. Any additional how-tos would be greatly appreciated. BTW, the text file looks something like this...

    AccountID,AccountName
    A123456,Joe Smith, M.D.
    A234567,John H. Dude,M.D.
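SSIS will certainly do it, but for a two-column file a plain T-SQL job step (schedulable in SQL Server Agent) may be simpler. A minimal sketch with assumed names, plus one caveat: the sample names contain unquoted commas ("Joe Smith, M.D."), which a comma field terminator will split, so if the Unix side can emit a pipe or tab delimiter instead, change FIELDTERMINATOR to match:

    -- staging table matching the two file columns
    CREATE TABLE dbo.AccountsStaging (AccountID varchar(20), AccountName varchar(100));

    -- daily job step: reload staging from the file
    TRUNCATE TABLE dbo.AccountsStaging;
    BULK INSERT dbo.AccountsStaging
    FROM 'C:\imports\accounts.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);  -- skip header

    -- overwrite the real table, stamping the load date
    TRUNCATE TABLE dbo.Accounts;
    INSERT INTO dbo.Accounts (AccountID, AccountName, DateRecordCreated)
    SELECT AccountID, AccountName, GETDATE()
    FROM dbo.AccountsStaging;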