I am able to import a CSV file into a temporary table as long as I know the number of fields in the CSV file. Here is what I would like to do:
I would like to have a CSV file which has UP to 6 entries per row. I would like to insert each row into a table; if there are three fields, then I want to insert them into the first three columns of the temporary table. If there are four, then insert into the first four columns. Is this possible?
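One way to handle a ragged file like this is sketched below, under assumptions (a hypothetical file path, nvarchar data, Windows line endings, no quoted fields containing commas): bulk-load each whole line into a one-column staging table, then split on commas, leaving missing trailing fields NULL.

-- Minimal sketch: load whole lines, then split into up to six columns.
CREATE TABLE #Raw (Line nvarchar(4000));

BULK INSERT #Raw
FROM 'C:\data\input.csv'             -- hypothetical path
WITH (ROWTERMINATOR = '\r\n');       -- one row per line; no field terminator needed

CREATE TABLE #Target
(
    c1 nvarchar(255), c2 nvarchar(255), c3 nvarchar(255),
    c4 nvarchar(255), c5 nvarchar(255), c6 nvarchar(255)
);

-- Pad each line with six extra commas so the six CHARINDEX probes always
-- succeed; NULLIF turns the padded-out empty fields back into NULL.
INSERT INTO #Target (c1, c2, c3, c4, c5, c6)
SELECT NULLIF(f.v1, ''), NULLIF(f.v2, ''), NULLIF(f.v3, ''),
       NULLIF(f.v4, ''), NULLIF(f.v5, ''), NULLIF(f.v6, '')
FROM #Raw
CROSS APPLY (SELECT Line + ',,,,,,' AS s) p
CROSS APPLY (SELECT CHARINDEX(',', p.s) AS i1) a
CROSS APPLY (SELECT CHARINDEX(',', p.s, a.i1 + 1) AS i2) b
CROSS APPLY (SELECT CHARINDEX(',', p.s, b.i2 + 1) AS i3) c
CROSS APPLY (SELECT CHARINDEX(',', p.s, c.i3 + 1) AS i4) d
CROSS APPLY (SELECT CHARINDEX(',', p.s, d.i4 + 1) AS i5) e
CROSS APPLY (SELECT
        SUBSTRING(p.s, 1, a.i1 - 1)                         AS v1,
        SUBSTRING(p.s, a.i1 + 1, b.i2 - a.i1 - 1)           AS v2,
        SUBSTRING(p.s, b.i2 + 1, c.i3 - b.i2 - 1)           AS v3,
        SUBSTRING(p.s, c.i3 + 1, d.i4 - c.i3 - 1)           AS v4,
        SUBSTRING(p.s, d.i4 + 1, e.i5 - d.i4 - 1)           AS v5,
        SUBSTRING(p.s, e.i5 + 1,
                  CHARINDEX(',', p.s, e.i5 + 1) - e.i5 - 1) AS v6) f;

A row like a,b,c fills only c1 through c3; the padded-out fields come back as empty strings and NULLIF maps them to NULL.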
I have a web form with a text field that needs to take in as much as the user decides to type, and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() method in my update statement, but it cuts off the text after a while. Is there a way to do this insert/update in SQL Server 2005 without resorting to BULK INSERT? That bloats the transaction log, and turning logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
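For reference, a minimal sketch of the .WRITE append form with hypothetical table and column names; a NULL offset means "append at the end". One common cause of cut-off text is a parameter declared as a fixed-size nvarchar instead of nvarchar(max), so that is worth checking too.

-- Minimal sketch (hypothetical names): append a chunk to an
-- nvarchar(max) column. Note .WRITE requires the column to be non-NULL.
DECLARE @chunk nvarchar(max), @id int;
SET @chunk = N'...more user text...';
SET @id = 1;

UPDATE dbo.Posts
SET Body.WRITE(@chunk, NULL, 0)   -- NULL offset appends; length is ignored
WHERE PostId = @id;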
I have imported an event log file from an NT server into a test database. The table was created automatically with some 10 columns named Col001, Col002, and so on up to Col010. Now I want to copy the event log data into specific columns. So I created a table with columns named evnt, date, time, server and evntdescription, so that I can execute a simple query like:
Select * from tablename where type = 'app' AND server = 'test1'
and get all the results for server 'test1' with type 'application'.
The problem is how to insert that event log file into the table I created with different column names, so that the data with 'App' goes to the column 'type', the data with the server name goes to the column 'server', and so on.
I tried every way I could think of but could not succeed. Can I get some help in this regard, please, through stored procedures or through DTS, if possible?
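Once the raw import table exists, the copy itself can be a plain INSERT ... SELECT. A minimal sketch, with the Col001-to-target mapping guessed purely for illustration:

-- Minimal sketch (hypothetical mapping of ColXXX to the named columns):
INSERT INTO dbo.EventLog (evnt, [date], [time], [server], evntdescription)
SELECT Col001, Col002, Col003, Col004, Col005
FROM dbo.ImportedEventLog;   -- the auto-created Col001...Col010 table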
I import a group of INSERT statements from a text file ... test
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
Insert Into XXXXX Values('UUUUUU','3')
...
The file contains about 1,000 INSERT statements; I read the file line by line and execute each insert.
In VS.NET 2003 it works correctly and the process consumes little memory, but in VS.NET 2005 the Pocket PC runs out of space.
How can I specify the growth factor of a SQL Mobile database?
The files have pipe delimiters and double quotes as text qualifiers. I can get the file to import with a BULK INSERT statement, but it brings the double quotes in as well. What setting can be set to indicate what the text qualifiers are?
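In SQL Server 2005/2008, BULK INSERT has no text-qualifier option, but when every field is quoted you can fold the quotes into the terminators. A minimal sketch under that assumption, with a hypothetical path and table:

BULK INSERT dbo.Staging
FROM 'C:\data\input.txt'          -- hypothetical path
WITH (
    FIELDTERMINATOR = '"|"',      -- closing quote + pipe + opening quote
    ROWTERMINATOR   = '"\n'       -- closing quote + newline
);
-- The opening quote of each row's first field still comes through;
-- strip it afterwards, e.g.:
-- UPDATE dbo.Staging SET Col1 = REPLACE(Col1, '"', '');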
I have created an SSIS package, in my VS2005 solution, that bulk inserts a CSV file (see example below):
"100",2006-10-03 00:00:00,"HEX012",1
"101",2006-10-03 00:00:00,"DS00130",1
I have a Bulk Insert Task that uses a Flat File Connection Manager to import my CSV file into my SQL 2005 database. My source CSV file (see example above) has double quotation marks surrounding any text fields. I have set the Flat File Connection Manager's 'Text Qualifier' to double quotation marks. The Bulk Insert works OK, but ignores the Text Qualifier; my database table is left with the original quotation marks in every text field. Any help appreciated. Regards, Paul.
Hello, I need to write a proc to load data from txt files I receive into a table. It works fine when I specify BULK INSERT ... FROM 'myfilename.txt', BUT my filename will always change, so I store it in a variable @filename.
When I try to run the BULK INSERT instruction ... FROM @filename, it doesn't work. Do you know why?
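BULK INSERT won't take a variable for the file name; the usual workaround is dynamic SQL. A minimal sketch with a hypothetical table and options (it also assumes @filename comes from a trusted source, since it is concatenated straight into the statement):

DECLARE @filename nvarchar(260);
DECLARE @sql nvarchar(max);
SET @filename = N'C:\data\myfilename.txt';   -- hypothetical path

SET @sql = N'BULK INSERT dbo.MyTable FROM ''' + @filename + N''' '
         + N'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';

EXEC sp_executesql @sql;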
OK, I've read many posts on this problem but have seen no resolution.
Here's the deal. When on the actual SQL box (there are 2 in the cluster) the bulk insert works fine. No errors, and the event log on the machine hosting the text file shows that the account SQL Server runs under has accessed the file. This account is a domain account and is in the local Administrators group of the SQL server and of the remote box hosting the text file.
That's great if you want your developers testing and accessing your SQL box as administrators. We don't work that way. Our developers connect via SQL Server Management Studio and test their stuff that way. That's where the problem rears its ugly head.
When anyone runs a BULK INSERT command that accesses a text file on a remote server, they get an "Access Denied". Now, I did a lot of testing and found that when a user executes the BULK INSERT command from SQL Management Studio on their desktop, they connect to the SQL box with their credentials (OK, that's correct, right?); SQL then runs the BULK INSERT command, which reaches out to the remote file server but gets the "Access Denied". I checked the logs, and they show that "Anonymous" was trying to access the file.
Why is "Anonymous" coming over as the credentials for SQL on a BULK INSERT? I'm no idiot, but this sounds like a big crappy bug that M$ will not fess up to. I followed many suggestions: made sure NTFS and share-level permissions were correct (that was the first thing...), made sure the account running SQL Server within the cluster was the same on both nodes (that wasn't it), and I even created an SPN for SQL to run and automatically register in AD with the credentials SQL runs as. NOTHING!!!
Has anyone gotten their bulk insert to work when inserting from a file that is NOT local to the SQL box? Any help is appreciated, but putting the text files on the local SQL box IS NOT AN OPTION.
How do I do a bulk insert into a temp table from a text file? The text file looks like this:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp] FROM 'C:\v2.txt'
I keep receiving errors. The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
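That error is consistent with the header line ("ver_id TYPE") being read as data: the text "TYPE" cannot convert to smallint. A minimal sketch of the fix, assuming tab-separated fields and Windows line endings:

BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- assumption: tab between ver_id and TYPE
    ROWTERMINATOR   = '\r\n',
    FIRSTROW        = 2       -- skip the "ver_id TYPE" header line
);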
I am using Microsoft SQL Server 2005, and I need to do a BULK INSERT from a .csv I just downloaded from PayPal. I can't edit some of the columns that are given in the report. I am trying to load specific columns from the file:
BULK INSERT Orders
FROM 'C:\Users\*******\Desktop\DownloadURL123.csv'
WITH ( FIELDTERMINATOR = ',', FIRSTROW = 2, ROWTERMINATOR = '\n' )
So where would I state which column names (from row #1 of the .csv file) map to which specific columns in the table?
I saw this on one of the sites, and it seemed to guide me towards the answer, but I failed. Here you go, it might help you:
FORMATFILE [ = 'format_file_path' ]
Specifies the full path of a format file. A format file describes the data file that contains stored responses created using the bcp utility on the same table or view. The format file should be used in cases in which:
The data file contains greater or fewer columns than the table or view.
The columns are in a different order.
The column delimiters vary.
There are other changes in the data format. Format files are usually created by using the bcp utility and modified with a text editor as needed. For more information, see bcp Utility.
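To make that concrete: a minimal sketch of a non-XML format file for a data file with three CSV fields, where the middle field is skipped by giving it server column position 0. All names are hypothetical, and the collation slot is left empty:

9.0
3
1   SQLCHAR   0   100   ","      1   OrderDate     ""
2   SQLCHAR   0   100   ","      0   SKIP_THIS     ""
3   SQLCHAR   0   100   "\r\n"   2   GrossAmount   ""

Saved as, say, C:\paypal.fmt, it would be referenced like this:

BULK INSERT Orders
FROM 'C:\Users\*******\Desktop\DownloadURL123.csv'
WITH (FORMATFILE = 'C:\paypal.fmt', FIRSTROW = 2);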
We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'wide table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on a wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors. Is there any proper way to load this kind of file into a db table?
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file to an empty SQL table that has an identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
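One workaround that also works with plain BULK INSERT, sketched here with hypothetical names: bulk insert into a view that leaves the identity column out, so the table numbers the rows itself.

-- dbo.Target has: RowId int IDENTITY, plus Col1, Col2, Col3
CREATE VIEW dbo.vTargetNoId
AS
SELECT Col1, Col2, Col3 FROM dbo.Target;
GO
BULK INSERT dbo.vTargetNoId
FROM 'C:\data\input.txt'                       -- hypothetical path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\r\n');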
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained through suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98
My table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML format file? I am stumbling over how to tell it where to start picking up data for a specific column. Is there any way to trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
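A minimal sketch of an XML format file for this layout, assuming widths of 7 characters for ProductID and 20 for Name (adjust the LENGTH values to the supplier's real spec). Fixed-width fields use xsi:type="CharFixed" with a LENGTH, and the last field runs to the line break:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="7"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="20"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="10"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ProductID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="Cost" xsi:type="SQLMONEY"/>
  </ROW>
</BCPFORMAT>

As for trimming, the fixed-width read keeps the padding, so the simplest route is an RTRIM after the load: UPDATE dbo.Products SET Name = RTRIM(Name).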
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
SqlConnection conn = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
SqlCommand cmd = new SqlCommand();
cmd.Connection = conn;       // the command must be wired to the connection
cmd.CommandText = "...";     // the SQL text to execute goes here
conn.Open();
cmd.ExecuteNonQuery();
conn.Close();
RefreshData();
I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
I have to update a field within a table of 60 or so records. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update, like a bulk insert, but I don't recall that being possible.
Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
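The staging-table route is straightforward; a minimal sketch with hypothetical names and path (the Excel sheet saved as CSV first):

CREATE TABLE #Staged (RecordID int, NewValue varchar(100));

BULK INSERT #Staged
FROM 'C:\data\values.csv'        -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

UPDATE t
SET    t.MyField = s.NewValue
FROM   dbo.MyTable AS t
JOIN   #Staged AS s ON s.RecordID = t.RecordID;

For only 60 rows, pasting the values into a hand-written UPDATE ... CASE, or generating 60 single-row UPDATE statements in an Excel formula column, is just as practical.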
I have a table containing 8 million records. I need to replace 2 million of these records with a scaled-down query that goes something like:
SELECT 1, ShareholderID, Assets1 FROM MyTable (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2 FROM MyTable (yields approx. 200,000 records)
...
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9 FROM MyTable (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way:
SELECT the 6 million records that don't need to be replaced into a #TempTable;
use the statements above to SELECT into the same #TempTable;
DROP and recreate the original table;
SELECT the 6 + 2 million records INTO the original table.
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
Hi, I'm trying to do a full-text search on my site to add a weighting score to my results. I have the following database structure:
Documents: DocumentID (int, PK), Title (varchar), Content (text), CategoryID (int, FK)
Categories: CategoryID (int, PK), CategoryName (varchar)
I need to create a full-text index which searches the Title, Content and CategoryName fields. Since I needed to search the CategoryName field, I figured I would create an indexed view. I tried to execute the following query:
CREATE VIEW vw_Documents WITH SCHEMABINDING AS
SELECT dbo.Documents.DocumentID, dbo.Documents.Title, dbo.Documents.[Content],
       dbo.Documents.CategoryID, dbo.Categories.CategoryName
FROM dbo.Categories INNER JOIN dbo.Documents
     ON dbo.Categories.CategoryID = dbo.Documents.CategoryID
GO
CREATE UNIQUE CLUSTERED INDEX vw_DocumentsIndex ON vw_Documents(DocumentID)
But this gave me the error:
Cannot create index on view 'dbname.dbo.vw_Documents'. It contains text, ntext, image or xml columns.
I tried converting the Content column to varchar(max) within my view, but it still didn't like it. I'd appreciate it if someone could tell me how this can be done, as surely what I'm trying to do is not groundbreaking.
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages: BULK INSERT TableName FROM 'C:DataDbTableName.bcp' WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in bcp widenative format. The properties of the Bulk Insert Task are the following: DataFileType: DTSBulkInsert_DataFileType_WideNative, RowTerminator: {CR}{LF}.
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help. Paul
I have followed many tutorials on selecting and replacing text in text fields, varchar fields and char fields, but I have yet to find a single script that will do all 3 based on field type. Let's assume for a moment that I don't know where in my database a certain value that I need changed resides, i.e., the data's table name and field name. How would I go about doing the following, or more importantly, is this even possible in a SQL-only procedure?
1) Loop over the entire database and get all user tables
2) Loop over all user tables and get all fields
3) Loop over all fields and determine the field type
4) Switch between field types and change a string of text from 'a' to 'b'
Please be gentle, I'm a procedure newb.
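It is possible in pure T-SQL. A minimal sketch under assumptions (SQL Server 2005+, replacing 'a' with 'b', only char/varchar/text columns): build one dynamic UPDATE per matching column from INFORMATION_SCHEMA and run it. text columns are CAST to varchar(max) first, since REPLACE does not take text directly.

DECLARE @sql nvarchar(max);
DECLARE cur CURSOR FOR
    SELECT 'UPDATE ' + QUOTENAME(c.TABLE_SCHEMA) + '.' + QUOTENAME(c.TABLE_NAME)
         + ' SET ' + QUOTENAME(c.COLUMN_NAME)
         + ' = REPLACE(CAST(' + QUOTENAME(c.COLUMN_NAME) + ' AS varchar(max)), ''a'', ''b'')'
         + ' WHERE ' + QUOTENAME(c.COLUMN_NAME) + ' LIKE ''%a%'''
    FROM INFORMATION_SCHEMA.COLUMNS AS c
    JOIN INFORMATION_SCHEMA.TABLES AS t
         ON t.TABLE_SCHEMA = c.TABLE_SCHEMA AND t.TABLE_NAME = c.TABLE_NAME
    WHERE t.TABLE_TYPE = 'BASE TABLE'
      AND c.DATA_TYPE IN ('char', 'varchar', 'text');

OPEN cur;
FETCH NEXT FROM cur INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @sql;        -- one UPDATE per column
    FETCH NEXT FROM cur INTO @sql;
END
CLOSE cur;
DEALLOCATE cur;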
I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that the end of each line is CR LF, so the terminator is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp FROM 'M:DataASXImportTest.txt' WITH (FORMATFILE='M:DataASXSQLFormatImport.Fmt')
I have to upload about a million records from a text file using DTS packages. The file has a predefined format which maps to the fields in the database. Some of the fields' widths exceed the widths in the database. Does that mean it will truncate those fields, or will the whole bulk upload fail?
Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know some considerations:
- Performance: how does it compare with T-SQL's BULK INSERT and the bcp utility?
- SQL Server resource usage: when running a memory-based bulk copy, how does it affect the server's resources?
- Server-side behavior: when the server is busy, does delayed update mean IRowsetFastLoad::Commit(true) can insert right away?
- Row count: is there a limit on how many rows can be inserted with the IRowsetFastLoad::InsertRow() method before IRowsetFastLoad::Commit?
I have a data table in my project that has a "text" (not varchar) field in it. I am trying to load this field with little paragraphs of text for showing on my pages. How do I get the free text loaded into the table? The Database Explorer and all the data grid controls cut the text off at one line. There doesn't seem to be any way to get multiline text into this table.
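A plain INSERT handles multiline values fine; it is the grid tooling that flattens them. A minimal sketch with a hypothetical table, embedding a line break with CHAR(13) + CHAR(10):

INSERT INTO dbo.Pages (Body)
VALUES ('First little paragraph.' + CHAR(13) + CHAR(10)
      + 'Second little paragraph.');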
I have imported a whole bunch of tables. Most of them have an ID (int) column. Is there a way to set the ID columns across all tables to auto-incrementing primary keys in bulk?
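The primary-key half can be scripted in bulk; a minimal sketch that generates one ALTER TABLE per table with an ID column. (Making an existing column an IDENTITY is a different job, since that requires rebuilding the column.)

-- Generate the statements, review the output, then execute it.
-- Note: each ID column must be NOT NULL before it can be a primary key.
SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
     + ' ADD CONSTRAINT ' + QUOTENAME('PK_' + TABLE_NAME)
     + ' PRIMARY KEY (ID);'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = 'ID';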
An INSERT statement was not inserting all the data into a table. I found it very strange, as the other fields in the row were inserted. I ran SQL Profiler and found that the SQL statement had all the fields in the INSERT statement, but some of the fields were not inserted. Below is the SQL statement, which is created dynamically by an ASP.NET C# class. The columns which are not inserted are 'totaltax' and 'totalamount', while 'shipto_name' etc. were inserted. No errors were thrown. The SQL from the code cannot be shown here as it is dynamically built from C# class files. It works fine on another test database which uses the same DLLs. The only difference I found was a difference in date formats. @totalamount=1625.62 and @totaltax=125.62 are not inserted into the database. Below is the statement copied from SQL Profiler:
exec sp_executesql N'INSERT INTO salesorder (billto_city, billto_country, billto_line1, billto_line2, billto_name, billto_postalcode, billto_stateorprovince, billto_telephone, contactid, CreatedOn, customerid, customeridtype, DeletionStateCode, discountamount, discountpercentage, ModifiedOn, name, ordernumber, pricelevelid, salesorderId, shipto_city, shipto_country, shipto_line1, shipto_line2, shipto_name, shipto_postalcode, shipto_stateorprovince, shipto_telephone, StateCode, submitdate, totalamount, totallineitemamount, totaltax)
VALUES (@billto_city, @billto_country, @billto_line1, @billto_line2, @billto_name, @billto_postalcode, @billto_stateorprovince, @billto_telephone, @contactid, @CreatedOn, @customerid, @customeridtype, @DeletionStateCode, @discountamount, @discountpercentage, @ModifiedOn, @name, @ordernumber, @pricelevelid, @salesorderId, @shipto_city, @shipto_country, @shipto_line1, @shipto_line2, @shipto_name, @shipto_postalcode, @shipto_stateorprovince, @shipto_telephone, @StateCode, @submitdate, @totalamount, @totallineitemamount, @totaltax)',
N'@billto_city nvarchar(8), @billto_country nvarchar(13), @billto_line1 nvarchar(3), @billto_line2 nvarchar(4), @billto_name nvarchar(15), @billto_postalcode nvarchar(5), @billto_stateorprovince nvarchar(8), @billto_telephone nvarchar(3), @contactid uniqueidentifier, @CreatedOn datetime, @customerid uniqueidentifier, @customeridtype int, @DeletionStateCode int, @discountamount decimal(1,0), @discountpercentage decimal(1,0), @ModifiedOn datetime, @name nvarchar(33), @ordernumber nvarchar(18), @pricelevelid uniqueidentifier, @salesorderId uniqueidentifier, @shipto_city nvarchar(8), @shipto_country nvarchar(13), @shipto_line1 nvarchar(3), @shipto_line2 nvarchar(4), @shipto_name nvarchar(15), @shipto_postalcode nvarchar(5), @shipto_stateorprovince nvarchar(8), @shipto_telephone nvarchar(3), @StateCode int, @submitdate datetime, @totalamount decimal(6,2), @totallineitemamount decimal(6,2), @totaltax decimal(5,2)',
@billto_city=N'New York', @billto_country=N'United States', @billto_line1=N'454', @billto_line2=N'Road', @billto_name=N'Hillary Clinton', @billto_postalcode=N'10001', @billto_stateorprovince=N'New York', @billto_telephone=N'124', @contactid='8DAFE298-3A25-42EE-B208-0B79DE653B61', @CreatedOn='2008-04-18 13:37:12:013', @customerid='8DAFE298-3A25-42EE-B208-0B79DE653B61', @customeridtype=2, @DeletionStateCode=0, @discountamount=0, @discountpercentage=0, @ModifiedOn='2008-04-18 13:37:12:013', @name=N'E-Commerce Order (Before billing)', @ordernumber=N'BRKV-CC-OKRW5764YS', @pricelevelid='B74DB28B-AA8F-DC11-B289-000423B63B71', @salesorderId='9CD0E11A-5A6D-4584-BC3E-4292EBA6ED24', @shipto_city=N'New York', @shipto_country=N'United States', @shipto_line1=N'454', @shipto_line2=N'Road', @shipto_name=N'Hillary Clinton', @shipto_postalcode=N'10001', @shipto_stateorprovince=N'New York', @shipto_telephone=N'124', @StateCode=0, @submitdate='2008-04-18 14:37:10:140', @totalamount=1625.62, @totallineitemamount=1500.00, @totaltax=125.62
Hello, I am wondering: is the transaction log written differently for BULK INSERT vs. INSERT? Performance-wise, which operation is generally faster given the same amount of data inserted?
Hi, I have 3 questions about memory-based bulk copy.
1. What is the limit on the number of IRowsetFastLoad::InsertRow() calls before IRowsetFastLoad::Commit(true)? For example, how many rows can be inserted in the sample below (what is the maximum value of nCount)?
for (i = 0; i < nCount; i++)
{
    pIFastLoad->InsertRow(hAccessor, (void*)(&BulkData));
}
2. In the code sample above, is there a method for inserting a prepared array all at once (the BulkData array directly, not a for loop)?
3. In OLE DB memory-based bulk copy, what is the equivalent of the T-SQL bulk copy options below?
BULK INSERT database_name.schema_name.table_name FROM 'data_file' WITH (ROWS_PER_BATCH = rows_per_batch, TABLOCK);
-------------------------------------------------------
My solution is like this. Is it correct?
// CoCreateInstance(...);    // create the data source object
// Create session
I have records in an Excel file (Excel 2010) and I would like to bulk import them into SQL Server 2008, and while importing, have SQL Server automatically create a new table based on the header row of the source file.
I am not sure if SQL Server 2008 has this capability.
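One way to get exactly that behavior, sketched under assumptions (the ACE OLE DB provider installed on the server, 'Ad Hoc Distributed Queries' enabled, hypothetical path and sheet name): SELECT ... INTO creates the table on the fly, and HDR=YES takes the column names from the header row.

SELECT *
INTO   dbo.ImportedSheet          -- created automatically by SELECT ... INTO
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Excel 12.0;Database=C:\data\records.xlsx;HDR=YES',
                  'SELECT * FROM [Sheet1$]');

The SQL Server Import and Export Wizard can also do the same thing interactively, including creating the destination table.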