SQL Server Bulk Uploads
Sep 13, 2007
I have tried to upload about 1 million records from a FoxPro table to SQL Server, but it only loaded about 700,000 records.
I found out that the drive where the SQL Server transaction log file was located had grown to 15 GB and I ran out of disk space.
I guess that is the reason it could not load the rest of the records.
What options should I set in my SQL Server database when doing bulk uploads?
Should I drop the table indexes before uploading?
Should I set the recovery model to Simple?
Thanks
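For reference, a rough sketch of the kind of setup that keeps the log small during a load; database, table, and file names below are placeholders, not the poster's actual objects:
-- Switch to a recovery model that allows minimal logging for the duration of the load.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;   -- or SIMPLE
-- A TABLOCK hint (and no nonclustered indexes on the target) lets the load be
-- minimally logged instead of writing every row to the transaction log.
BULK INSERT dbo.MyTable
FROM 'C:\import\foxpro_export.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK, BATCHSIZE = 100000);
-- Switch back and rebuild any dropped indexes afterwards.
ALTER DATABASE MyDb SET RECOVERY FULL;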
View 1 Replies
Mar 14, 2007
Hi,
I need to create two database instances and transfer incremental uploads from one database to the other without having to transfer the entire table of data again and again. How should I go about it? What commands should I use?
Thanks in advance,
ramya.
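For reference, a rough sketch of one incremental approach, assuming the source table carries a LastModified column and the two databases are SourceDB and TargetDB (all names here are made up):
-- Copy only the rows changed since the last transfer, based on the newest
-- LastModified value already present on the target side.
INSERT INTO TargetDB.dbo.MyTable (Id, Col1, Col2, LastModified)
SELECT s.Id, s.Col1, s.Col2, s.LastModified
FROM SourceDB.dbo.MyTable AS s
WHERE s.LastModified > (SELECT ISNULL(MAX(LastModified), '19000101')
                        FROM TargetDB.dbo.MyTable);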
View 1 Replies
View Related
Jan 12, 2008
Hi, I created a small application to upload small video files and tested it to work perfectly against my local SQL Server database. I then changed the connection string to point to the remote database. When I do this and try to upload, the application hangs and the video file is not uploaded. The file is small, only about 10 MB or less. When I upload the same file locally, it works perfectly and uploads quickly without hanging.
Here is a sample of my web.config file: http://www.bleusolutions.com/sampleconfig.txt
Here is the page: http://www.bleusolutions.com/foto/uploadvideo.aspx
I believe this may be a database issue. I am running the application locally (http://localhost:4796/BleuS/foto/uploadvideo.aspx) and all I do is change the web.config connection string to point to my local database or to the remote database. When I point to my local database the application works fine, as you can see: http://www.bleusolutions.com/images/localupload.jpg
When I run the application locally and change the connection string to point to the remote database, that is when I have problems with the same video file. The application hangs for a minute or two and then finally finishes the upload. When I go to the database to view the table, the table takes a minute or two to open and then I get this message: http://www.bleusolutions.com/images/remoteupload.jpg
And this is what the table looks like after I click OK: http://www.bleusolutions.com/images/remoteupload2.jpg
Any help on why this is happening or how I can fix this issue will be greatly appreciated. Thanks
View 1 Replies
View Related
Jan 17, 2008
I'm having some issues with BULK INSERT.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the SQL:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
So yeah, pretty simple. But whatever I do I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong?
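For what it's worth, one commonly suggested variant for this situation is to drop the CODEPAGE option (it is documented as applying to char/varchar data, not widechar files) and strip the surrounding quotes after the load. This is only a sketch; the path below is the reconstructed one from the post.
BULK INSERT dbo.tmp_GA_status
FROM 'C:\temp\TextDump\GA_status.dta'
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');
-- The quotes in the data file land inside GA_desc and can be removed afterwards.
UPDATE dbo.tmp_GA_status
SET GA_desc = REPLACE(GA_desc, '"', '');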
View 13 Replies
View Related
Sep 27, 2007
I have to update a field within a table of 60 or so records, and each record gets a different value. The field is of type varchar. I was given an Excel file with the values and was thinking of a bulk update along the lines of BULK INSERT, but I don't recall that being possible.
Is the only way to create a staging table, bulk insert into it, and then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
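The staging-table route is straightforward; a minimal sketch, with all table, column, and file names made up:
-- Load the Excel data (saved as CSV) into a temporary staging table...
CREATE TABLE #staging (KeyCol INT, NewValue VARCHAR(100));
BULK INSERT #staging
FROM 'C:\import\values.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
-- ...then update the real table by joining on the key.
UPDATE t
SET t.SomeField = s.NewValue
FROM dbo.TargetTable AS t
JOIN #staging AS s ON s.KeyCol = t.KeyCol;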
View 1 Replies
View Related
Oct 11, 2000
I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (yields approx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (yields approx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (yields approx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way.
SELECT the 6 million records that don't need to be replaced into a #TempTable
Use the statements above to SELECT INTO the same #TempTable
DROP and recreate the original table
SELECT the 6 + 2 million records INTO the original table
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
Any comments are appreciated,
-Marc
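For reference, a rough sketch of building the replacement set in one pass instead of ten separate loads (column names are taken loosely from the post):
-- Build all ten ~200,000-row slices into the temp table with a single SELECT INTO.
SELECT Slice, ShareholderID, Assets
INTO #Replacement
FROM (
    SELECT 1 AS Slice, ShareholderID, Assets1 AS Assets FROM MyTable
    UNION ALL
    SELECT 2, ShareholderID, Assets2 FROM MyTable
    -- ... repeat the UNION ALL for slices 3 through 9 ...
    UNION ALL
    SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 FROM MyTable  -- sum continues through Assets9
) AS x;
Inserting the combined set back with a TABLOCK hint, with the indexes dropped, keeps the operation minimally logged and avoids the round trip through a file that bcp / BULK INSERT would need.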
View 3 Replies
View Related
Mar 24, 2008
Hi All,
Please, someone help me. I have to insert a CSV file into a table in SQL Server, but the problem is that the file is on one server and SQL Server 2005 is on another server.
How do I insert the file?
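For reference, a minimal sketch: expose the file on a share that the SQL Server machine can reach and point BULK INSERT at the UNC path (server, share, table, and file names are made up):
BULK INSERT dbo.TargetTable
FROM '\\FileServer\ImportShare\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
-- The path is resolved by the SQL Server machine, so the SQL Server service account
-- (or the caller, if delegation is configured) needs read access to that share.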
View 1 Replies
View Related
Jun 29, 2015
I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNCHAR for the first column. I've checked that each line ends with CR LF, therefore the '\r\n' terminator is correct for line 7, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
[code]...
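For reference: in a non-XML format file the host data type describes the data file, not the table column, so for a plain ANSI text file the type stays SQLCHAR even though the target column is nvarchar(6); SQLNCHAR is for Unicode (UTF-16) data files. A minimal sketch along those lines (the second column, widths, terminators, and collation are placeholders, the version on the first line should match the SQL Server release in use, and a real file needs one entry per field in the data file):
9.0
2
1   SQLCHAR   0   12    "\t"      1   ASXCode       SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   "\r\n"    2   SecondColumn  SQL_Latin1_General_CP1_CI_AS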
View 5 Replies
View Related
Feb 1, 2007
Hi~,
Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to understand a few things:
- Performance: how does it compare with T-SQL's BULK INSERT and the bcp utility?
- SQL Server resource usage: how much does a memory-based bulk copy load the server while it runs?
- Server-side behavior: when the server is busy, does the delayed update mean IRowsetFastLoad::Commit(true) can insert right away?
- Row count: is there a limit on how many rows can be inserted with IRowsetFastLoad::InsertRow() before calling IRowsetFastLoad::Commit?
- Any other guidelines.
View 1 Replies
View Related
Apr 18, 2008
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
Thanks.
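For reference, one workaround outside the Bulk Insert task's own settings is to load through a view that omits the identity column, so the file's fields line up one-to-one (all names below are made up):
CREATE VIEW dbo.v_Target_NoIdentity
AS
SELECT Col1, Col2, Col3        -- every column except the identity column
FROM dbo.TargetTable;
GO
BULK INSERT dbo.v_Target_NoIdentity
FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');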
View 8 Replies
View Related
Apr 26, 2004
Hello Everyone,
I am currently struggling with a problem bulk inserting data into SQL Server.
The application I am writing is multi-threaded and downloads about 2000 records every 2 seconds on x number of threads (depending on bandwidth).
What I would like to do is find a 'friendly' way to insert this data into SQL Server without hammering the CPU. I have tried the following with little success:
1. Using single inserts with thread.Sleep(x * 20) to allow for the massive data input. Although this made the application more stable and lowered CPU usage to very little, the data takes about 60 times longer to download and process into the database.
2. Using a SqlDataAdapter and DataSet and updating the database via the data adapter's .Update method. Simply put, this was awful and took forever to process the data into the database (about 30 seconds to process 2000 records, with a high command timeout).
3. Using OpenXML in SQL Server and passing the data as an XML string (ntext). Although this method is fast, it is still very CPU intensive, and I have to set the command timeouts very high to allow for this approach (because of the multi-threaded nature of the app).
Does anyone have any ideas on a CPU-friendly approach to this problem?
Thanks in advance..
Gary.
View 4 Replies
View Related
Mar 7, 2008
Hello,
I need help regarding bulk insert into SQL Server. Basically I'm building a customized web application that collects all the files and directories, saves them in an ArrayList, and then needs to save the contents of that ArrayList into a SQL Server table. My questions are:
1) Should I use an ArrayList for this purpose (i.e., is it fast enough given that my data is quite big)?
2) How do I save all the data from the ArrayList into the SQL Server table with a bulk insert or something like that? I don't want to use an INSERT query while looping through the ArrayList, as it will take a lot of time and resources.
Hope you get my query.
View 4 Replies
View Related
Oct 25, 2006
Does anyone know how to do a bulk insert in SQL Server Express? Thanks, Marcel
View 2 Replies
View Related
Dec 27, 2007
Hi!
I'm building a web application. I need to read data from a text or Excel file, process the data, and then store the resulting records in the database. The number of records is big. I can store the records in the database (SQL Server 2005) one at a time, but I think that's slow. Is there any way to insert the data in bulk?
Thanks!
ccy
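For reference, one server-side option for the Excel case is OPENROWSET with the Jet provider, which pulls in the whole sheet with a single set-based insert. It needs ad hoc distributed queries enabled and a matching 32/64-bit Jet/ACE provider installed; the file path, sheet, and column names here are made up:
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\import\data.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');
-- For a delimited text file, BULK INSERT with the appropriate FIELDTERMINATOR
-- does the same job in one statement.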
View 4 Replies
View Related
Aug 1, 2000
Hi,
I am working on a SQL Server bulk copy program. I am getting data files from our vendors for shares and stocks. The data files are pipe-separated values; for example, the ASCII file format is:
8388182|"ACC consultanats"|"rating for the current financial year"|23
My question is: I have four columns in my SQL Server table named stocks.
Table structure:
serialno   numeric
caption    nvarchar(255)
memo1      ntext (16)
sno        int
The third column, memo1, carries a large volume of data in the data file; it may be up to one full A4-size page, and the data in that column is not formatted.
Since it is very urgent, let me know what the format file for this type of data file and the bulk copy program utility would look like.
Kindly let me know as early as possible.
Regards
Sivaramakrishnan
View 1 Replies
View Related
Nov 7, 2001
I am running the following:
BULK INSERT DB.dbo.[stblCLIENT]
FROM '\\SERVER1\download\Client.txt'
WITH
(
FIELDTERMINATOR = 'Ř',
ROWTERMINATOR = ''
)
DB.dbo.[stblCLIENT] is on SERVER2. I receive the following error:
"Could not bulk insert because file 'SERVER1downloadClient.txt' could not be opened. Operating system error code 53(The network path was not found.)."
I am able to run a DTS package that imports the same text file from SERVER1 with no error.
Is BULK INSERT limited to importing text files from the server on which SQL Server is running or should I be able to BULK INSERT from another server on my LAN?
View 1 Replies
View Related
Sep 4, 2006
I'm trying to import data from a flat file into a table and have a few problems.
1. The field delimiter is ',' (comma). If ',' occurs in a quoted string it is still treated as a field delimiter. Is this a bug, or expected?
2. The table has a datetime field that can be null, but BULK INSERT reports an error if the flat file contains null or ''. It's OK only when a real date is specified.
Table:
create table AttachmentList (
Code integer not null,
ClassID integer null,
Description varchar(200) null,
ValidUntil datetime null,
constraint PK_ATTACHMENTLIST primary key (Code)
)
Flat file:
1,13,'Naputak, CU 261098', ''
Thanks in advance
Davor
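For reference, on SQL Server 2017 and later the built-in CSV parser can deal with both issues directly; a hedged sketch (the file path is made up, and on 2005-era versions a format file or post-processing is needed instead):
BULK INSERT dbo.AttachmentList
FROM 'C:\import\attachments.csv'
WITH (FORMAT = 'CSV',
      FIELDQUOTE = '''',          -- values in the sample are quoted with single quotes
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      KEEPNULLS);                 -- keep empty fields as NULL rather than column defaults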
View 3 Replies
View Related
Aug 29, 2006
I have a web page that prompts a user to select a CSV file. Using BULK INSERT, the data is loaded into a SQL Server 2005 table.
I have been using BULK INSERT with SQL Server 2000 with no problems, but with 2005 I am getting the error "You do not have permission to use the bulk load statement".
My web.config file has the following connection string:
[code]
<add key="connectionString" value="Server=(local);Database=BroadCastOne;trusted_connection=true" />
[/code]
I've given bulkAdmin role to the ASPNET user. It's still not working. What am I doing wrong?
Any help is greatly appreciated,
Ninel
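For reference, a quick sketch of what granting the bulk-load right looks like server-side. The account name is a placeholder: with a trusted connection it is whatever Windows account the ASP.NET worker process runs as, and it must already exist as a SQL login.
-- Either add the account to the bulkadmin server role...
EXEC sp_addsrvrolemember 'MYSERVER\ASPNET', 'bulkadmin';
-- ...or, on SQL Server 2005 and later, grant the permission directly.
GRANT ADMINISTER BULK OPERATIONS TO [MYSERVER\ASPNET];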
View 3 Replies
View Related
Sep 9, 2007
Hey There,
Here is an example of a bulk insert into a SQL Server table.
From the application you pass an XML string to a stored procedure, and it inserts all the data into the table using that XML.
Example stored procedure:
CREATE PROCEDURE StoredProcName
(
@strXML varchar(8000)
)
AS
Declare @intPointer int
exec sp_xml_preparedocument @intPointer output, @strXML
INSERT into tbl_plnd_insertion
SELECT Column1, Column2, Column3, Column4, Column5
FROM OpenXml(@intPointer,'/root/tbl_plnd_insertion',2)
WITH (Column1 varchar(20) '@Column1' , Column2 varchar(20) '@Column2', Column3 varchar(20) '@Column3' , Column4 varchar(50) '@Column4', Column5 varchar(50) '@Column5')
exec sp_xml_removedocument @intPointer
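A quick sketch of how this might be called, assuming the rows carry their values as attributes (which is what the '@Column1'-style patterns in the WITH clause map to):
EXEC StoredProcName @strXML = '
<root>
  <tbl_plnd_insertion Column1="A1" Column2="B1" Column3="C1" Column4="D1" Column5="E1" />
  <tbl_plnd_insertion Column1="A2" Column2="B2" Column3="C2" Column4="D2" Column5="E2" />
</root>';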
Thanks !!!!!
View 10 Replies
View Related
May 2, 2007
I need to update a number of SQL Server tables, the data sources for these coming from a number of stored procedures. I want a generic way of getting the data and then passing it to the tables. I am thinking of doing this for each table:
- Populating a DataSet
- Writing this DataSet to XML
- Using SQLXML Bulk Load to pass this XML to the database to update it
I can create the XML data file with:
dataset.WriteXml("C:\data.xml")
The problem I have is that the example I looked at (http://support.microsoft.com/default.aspx/kb/316005/en-us) relies on the schema being defined:
<?xml version="1.0" ?>
<Schema xmlns="urn:schemas-microsoft-com:xml-data"
        xmlns:dt="urn:schemas-microsoft-com:xml:datatypes"
        xmlns:sql="urn:schemas-microsoft-com:xml-sql">
  <ElementType name="CustomerId" dt:type="int" />
  <ElementType name="CompanyName" dt:type="string" />
  <ElementType name="City" dt:type="string" />
  <ElementType name="ROOT" sql:is-constant="1">
    <element type="Customers" />
  </ElementType>
  <ElementType name="Customers" sql:relation="Customer">
    <element type="CustomerId" sql:field="CustomerId" />
    <element type="CompanyName" sql:field="CompanyName" />
    <element type="City" sql:field="City" />
  </ElementType>
</Schema>
Is there any way I can create the schema 'on the fly', similar to how I did for the data source file? I could then pass both files to the database:
objBL.Execute("schema.xml", "data.xml");
View 1 Replies
View Related
Jun 19, 2014
I found loads of things but nothing seems to work...
I'm trying to get XML data from a link inside a page into a table, but I can't find anything that works.
View 9 Replies
View Related
Nov 15, 2006
I have an SSIS job that is pumping hundreds of gigabytes of raw text files into a SQL Server Destination. Today I received this strange error; does anyone have insight? Also, how would I make the data-flow tasks more stable and robust so that this doesn't cause a package failure (retries, or something)?
[SQL Server Destination [4076]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".
View 20 Replies
View Related
Sep 29, 2015
I need to create a T-SQL script to create logins in bulk so users can access views on the database. The usernames, as used in the local domain, are all stored in a table.
- The users already exist in the domain (let's say MYDOMAIN).
- ADSI.dbo.Ad.username is the column which contains the usernames as used in the Active Directory domain (+/- 350 of them).
- The script needs to check whether a login already exists, to make sure no login is added twice.
- If possible, give all these users the db_datareader role in, let's say, MYDATABASE.
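For reference, a rough cursor-based sketch along those lines. MYDOMAIN, MYDATABASE, and ADSI.dbo.Ad come from the post; everything else is an assumption, and error handling is omitted.
USE MYDATABASE;
DECLARE @name SYSNAME, @acct SYSNAME, @sql NVARCHAR(MAX);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT username FROM ADSI.dbo.Ad;
OPEN c;
FETCH NEXT FROM c INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @acct = N'MYDOMAIN\' + @name;
    -- Skip logins that already exist.
    IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = @acct)
    BEGIN
        SET @sql = N'CREATE LOGIN ' + QUOTENAME(@acct) + N' FROM WINDOWS;';
        EXEC (@sql);
    END
    -- Map the login into the database and grant read access.
    IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = @acct)
    BEGIN
        SET @sql = N'CREATE USER ' + QUOTENAME(@acct) + N' FOR LOGIN ' + QUOTENAME(@acct) + N';';
        EXEC (@sql);
        EXEC sp_addrolemember 'db_datareader', @acct;
    END
    FETCH NEXT FROM c INTO @name;
END
CLOSE c;
DEALLOCATE c;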
View 7 Replies
View Related
Nov 20, 2015
SQL Server 2012
I want to be able to run the following command from SSMS (as an ad-hoc query).
BULK INSERT Database_Name.dbo.Table_Name FROM '\\server\file.txt' WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '0x0a', MAXERRORS = 0);
When I do I get:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "\\server\file.txt" could not be opened. Operating system error code 5 (Access is denied.).
I have full access to the file. I can run the same command successfully if the file is stored on a local drive on the server.
According to my DBA I can not run it with a remote file location because I don't have the SA permission. His solution is for me to create a job that runs the command. I have done so and the job works correctly.
Is he correct that there is no way for me to be able to run it from SSMS without SA permissions?
View 5 Replies
View Related
Dec 11, 2006
Hello
I have four tables as XML. I'm successfully bulk loading them into 4 tables using the SQL Server Destination, with check constraints unchecked.
The process manages to load all the data without any FK issue.
After that, how can I check the constraints? (I've read a bit about is_not_trusted and sys.check_constraints.)
Thibaut
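For reference, a short sketch of re-validating the constraints after the load so they become trusted again (repeat per loaded table; the table name is a placeholder):
-- Re-check every FK and check constraint on the table; this fails if any row violates one.
ALTER TABLE dbo.LoadedTable WITH CHECK CHECK CONSTRAINT ALL;
-- Anything still untrusted shows up here.
SELECT name, is_not_trusted FROM sys.check_constraints WHERE is_not_trusted = 1;
SELECT name, is_not_trusted FROM sys.foreign_keys     WHERE is_not_trusted = 1;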
View 4 Replies
View Related
Sep 29, 2006
Hi All,
I have an ASP.NET 2.0 app that needs to bulk load data from an XML file into a SQL Server (Express) table. Is there an easy way to do this?
Thanks,
Claude.
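For reference, one T-SQL-only sketch: read the file as a blob and shred it with the xml type's nodes() method. The file path, element names, and column names are all made up, and the caller needs bulk-operation rights for OPENROWSET(BULK ...).
DECLARE @x XML;
-- Pull the whole file into an xml variable...
SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\import\data.xml', SINGLE_BLOB) AS b;
-- ...then insert one row per <row> element.
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT n.value('(Col1)[1]', 'varchar(50)'),
       n.value('(Col2)[1]', 'varchar(50)')
FROM @x.nodes('/root/row') AS t(n);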
View 3 Replies
View Related
Dec 28, 2013
I'm able to successfully import data from a tab-delimited .txt file using the following statement:
BULK INSERT ImportProjectDates FROM 'C:\tmp\ImportProjectDates.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
However, in order to import the text file, I had to add columns to the text file to match the columns that exist in the table. The original file is an export out of another database and contains all but 5 of the columns in my db.
How do I control which columns BULK INSERT actually imports when working with a .txt file? I've tried using a FORMAT FILE, however I kept getting errors, which I tracked down to being a case of not using it correctly with a .txt file.
Yes, I could have the DBA add in the missing columns to the query from the other DB to create the columns, however I'd like to know a little bit more about this overall.
View 9 Replies
View Related
Jul 2, 2014
I've been reading about the "table lock on bulk load" option and TABLOCK hint.
My understanding is that by default only row locks are taken out, and other queries can read/write data while the bulk load is going on. However, if you were doing parallel bulk loads with overlapping keys into a clustered index, they may block each other.
But if the option is enabled, you can do the parallel bulk loads without blocking because a table lock is taken out; however, other processes can't read/write the data until they're all done.
Is that the gist of it? I think I got confused by some misinformation. Don't all those row locks eventually escalate to a table lock anyway, though?
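For reference, the option being discussed is set per table; a one-line sketch (the table name is a placeholder):
-- With this ON, BULK INSERT/bcp take a bulk update (BU) table lock by default, which lets
-- multiple bulk loads into the same heap run in parallel without blocking one another,
-- while ordinary readers/writers are blocked for the duration of the loads.
EXEC sp_tableoption 'dbo.TargetTable', 'table lock on bulk load', 'ON';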
View 1 Replies
View Related
Jul 29, 2014
I need to load the following data into a SQL table. This is how the vendor is able to provide it to us.
CRCorp Daily Report,,,,,,
,,,,,,
Facility,Location,Purchase Order #,Vendor,Inventory #,Date Ordered,Extended Cost
09-Mowtown 495 CRST,09-402A Women's Imaging,327937,"BARD PERIPHERAL VASCULAR, INC.",113989,7/25/2014,650
09-Mowtown 495 CRST,09-402A Women's Imaging,327936,"WB MASON CO., INC.",112664,7/25/2014,8.64
01-Mowtown 499 CRST,01-302B Oncology,327947,McKesson General Medical,n/a,7/25/2014,129.02
[Code] ....
I have attempted to bulk insert it into this table with no luck.
CREATE TABLE POMaster
(Facility VARCHAR(75),
Location VARCHAR(75),
PONum INT,
VendorNm INT,
INVENTORYNUM VARCHAR(25),
orderDte DATE,
ExtendedPrice NUMERIC(10,2)
)
GO
It does not like the double quotes. How do I make this format work? Do I need a format file?
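For reference, a hedged sketch: on SQL Server 2017 and later the built-in CSV parser handles the quoted, comma-containing vendor names, and FIRSTROW skips the header lines. The file path is made up; note too that the sample vendor values are text, so VendorNm may need to be a varchar rather than INT for the rows to load.
BULK INSERT dbo.POMaster
FROM 'C:\import\CRCorpDailyReport.csv'
WITH (FORMAT = 'CSV',          -- honours the double quotes around embedded commas
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRSTROW = 4);           -- skip the report title, the blank line, and the column-header line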
View 2 Replies
View Related
Mar 5, 2015
I have to perform a bulk import on a regular basis and have created a script to do this. The problem is that the .csv file has 12 columns and the table to import into has 14. To work around this discrepancy I have decided to use a format file. The problem is how to create one.
View 3 Replies
View Related
Mar 23, 2015
I want to bulk insert data into a table named scd_event_tab inside a database named rdb.
When I do select * from rdb.dbo.scd_event_tab, I get these columns:
JOB_ID, RUN_ON, PRIORITY, PAYLOAD, TIMEOUT_INTERVAL, STATUS, PICKUP_TIME, SCD_TYPE, SCHEDULE_ID, DB_ADMIN_LOGIN_REQUIRED_YN
I saved the result into a csv file and then truncated the table. Now, I am trying to bulk insert the data into the table. So I used:
bulk insert
rdb.dbo.scd_event_tab from 'C:\users\sluintel.ctr\desktop\eventtab.csv'
with
(
codepage = 'RAW',
datafiletype = 'native',
fieldterminator = ' ',
keepidentity,
keepnulls
);
go
However, I get this error:
Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
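For reference, a hedged sketch of a load that matches what the file actually is (a character-mode CSV rather than a native-format file); the path is the reconstructed one from the post and the terminators are assumptions:
BULK INSERT rdb.dbo.scd_event_tab
FROM 'C:\users\sluintel.ctr\desktop\eventtab.csv'
WITH (DATAFILETYPE = 'char',     -- the file was saved as text, not in SQL Server native format
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n',
      FIRSTROW = 2,              -- skip the header row if the CSV export included one
      KEEPIDENTITY,
      KEEPNULLS);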
View 9 Replies
View Related