Transact SQL :: Bulk Insert From Remote Server?
Nov 20, 2015
SQL Server 2012
I want to be able to run the following command from SSMS (as an ad-hoc query).
BULK INSERT Database_Name.dbo.Table_Name FROM 'serverfile.txt' WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '0x0a', MAXERRORS = 0);
When I do I get:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "serverfile.txt" could not be opened. Operating system error code 5(Access is denied.).
I have full access to the file. I can run the same command successfully if the file is stored on a local drive on the server.
According to my DBA, I cannot run it with a remote file location because I don't have the SA (sysadmin) permission. His solution is for me to create a job that runs the command. I have done so and the job works correctly.
Is he correct that there is no way for me to be able to run it from SSMS without SA permissions?
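For reference, sysadmin is not the only route: BULK INSERT needs INSERT on the target table plus the server-level ADMINISTER BULK OPERATIONS permission (membership in the bulkadmin role grants it). A minimal check of what the current login holds:
[code]
-- a minimal sketch: does the current login hold the server-level permission BULK INSERT needs?
SELECT HAS_PERMS_BY_NAME(NULL, NULL, 'ADMINISTER BULK OPERATIONS') AS has_admin_bulk_ops,
       IS_SRVROLEMEMBER('bulkadmin') AS is_bulkadmin;
[/code]
Even with that permission, an Access denied error on a remote path under Windows authentication is usually the delegation (double-hop) problem discussed further down in this thread, so the Agent job workaround may still be the practical answer.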
View 5 Replies
Nov 7, 2001
I am running the following:
BULK INSERT DB.dbo.[stblCLIENT]
FROM '\\SERVER1\download\Client.txt'
WITH
(
FIELDTERMINATOR = '',
ROWTERMINATOR = ''
)
DB.dbo.[stblCLIENT] is on SERVER2. I receive the following error:
"Could not bulk insert because file 'SERVER1downloadClient.txt' could not be opened. Operating system error code 53(The network path was not found.)."
I am able to run a DTS package that imports the same text file from SERVER1 with no error.
Is BULK INSERT limited to importing text files from the server on which SQL Server is running or should I be able to BULK INSERT from another server on my LAN?
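BULK INSERT opens the file from the SQL Server machine itself, so a file on another LAN server is reachable only through a UNC path that the executing account (the service account, or the caller under Windows authentication) can read. A hedged sketch, assuming tab and newline as terminators since the originals were lost:
[code]
-- a minimal sketch; FIELDTERMINATOR and ROWTERMINATOR are assumptions
BULK INSERT DB.dbo.[stblCLIENT]
FROM '\\SERVER1\download\Client.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
[/code]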
View 1 Replies
View Related
Mar 24, 2008
Hi,
I have to load a CSV file into one of the tables in SQL Server 2005 using the BULK INSERT command, but the CSV file is on a remote system.
Please help me.
View 1 Replies
View Related
Jul 20, 2005
Hi all,
We have an application through which we are bulk inserting rows into a view. The definition of the view is such that it selects columns from a table on a remote server. I have added the servers using sp_addlinkedserver on both database servers. When I call the Commit API of OLE DB I get the following error:
Error state: 1, Severity: 19, Server: TST-PROC22, Line#: 1, msg: SqlDumpExceptionHandler: Process 66 generated fatal exception c0000005 EXCEPTION_ACCESS_VIOLATION. SQL Server is terminating this process.
I would like to know if we can bulk insert rows into a view that accesses a table on the remote server using the "bulk insert" or bcp command. I tried a small test through SQL Query Analyzer to use "bulk insert" on such a view. The test that I performed was the following:
On database server 1:
create table iqbal (var1 int, var2 int)
On database server 2 (remote server):
create view iqbal as select var1, var2 from [DBServer1].[SomeDB].[dbo].[iqbal]
set xact_abort on
bulk insert iqbal from '\\Machine\Iqbal\iqbaldata.txt'
The bulk insert operation failed with the following error message:
[Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionCheckForData (CheckforData()).
Server: Msg 11, Level 16, State 1, Line 0
General network error. Check your network documentation.
Connection Broken
The file iqbaldata.txt contents were:
1 1
2 2
3 3
If the table that the view references is on the same server then we are able to bulk insert successfully. Is there a way by which I should be able to bulk insert rows into a view that selects from a table on a remote server? If not, then could anyone suggest a workaround. I would actually like to know some workaround to get the code working using OLE DB. Due to unavoidable reasons I cannot output the records to a file and then use bcp to bulk insert the records into the remote table. I need to have some way of doing it using OLE DB.
Thanks in advance
Iqbal
View 7 Replies
View Related
Jan 20, 2015
SQL Server 2012 running under a domain Managed Service Account. (Server A)
File located on a Windows 2012 server in a directory which has been shared to user A. (Server B).
User A is a domain account and is using SSMS on his laptop (laptop C) to run a bulk insert command.
User A (Bulk Insert from laptop SSMS Client) --- > SQL Server (server A) --- > File Server (Server B)
The command fails, returning Access Denied on the file/folder share on Server B.
Running the same command on the SQL Server itself (Server A) works fine, so this is a double-hop Kerberos issue.
If I use a SQL Login from Laptop C, then the command works fine as the SQL Server will use the SQL's Managed service account to connect to the file share, which is set up for delegation and impersonation.
I am struggling to work out why a domain user cannot bulk insert a file from a remote location. I have checked that the user is connected with Kerberos authentication, and they are. All the articles seem to talk about setting up SPNs for the SQL Server so that SQL login authentication can work over remote bulk insert, and just say to set up the file share properties properly if using a domain account.
What am I missing to allow domain accounts to bulk insert from a remote file share?
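For anyone following along, the usual first check that the SSMS session really is Kerberos (NTLM cannot be delegated) is:
[code]
-- the laptop's session must show KERBEROS, not NTLM, for the second hop to be possible
SELECT net_transport, auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
[/code]
Beyond that, the piece the SQL login path sidesteps is that the SQL Server service account (here the managed service account) must be trusted for delegation to the file service (cifs) on Server B; without that, the hop from Server A to Server B is made anonymously and the share denies it.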
View 2 Replies
View Related
Feb 5, 2007
OK, Ive read many posts on this problem but have seen no resolution.
Here's the deal. When on the actual SQL box (there are 2 in the cluster) the bulk insert works fine. No errors, and the event log on the machine that is hosting the text file shows that the account that SQL runs under has accessed the file. This account is a domain account and is in the local Administrators group on the SQL server and on the remote box hosting the text file.
That's great if you want your developers testing and accessing your SQL box as Administrators. We don't work that way. Our developers connect via SQL Management Studio and test their stuff that way. That's where the problem rears its ugly head.
When anyone runs a Bulk Insert command that accesses a text file that is on a remote server, they get an "Access Denied". Now, I did a lot of testing and found that when a user executes the Bulk Insert command from SQL Management Studio on their desktop, they connect to the SQL box with their credentials (OK, that's correct, right?), SQL then runs the Bulk Insert command, which reaches out to the remote file server but gets the "Access Denied". I checked the logs and they show that "Anonymous" was trying to access the file.
Why is Anonymous coming over as the credentials for SQL on a Bulk Insert? I'm no idiot, but this sounds like a big crappy bug that M$ will not fess up to. I followed many suggestions: made sure NTFS and share-level permissions were correct (that was the first thing...), made sure the account that SQL Server runs as was the same on both nodes in the cluster (that wasn't it), and I even created an SPN for SQL to run and automatically register in AD with the credentials that SQL runs as. NOTHING!!!
Has anyone gotten their bulk insert to work when inserting from a file that is NOT local to the SQL box? Any help is appreciated, but putting the text files on the local SQL box IS NOT AN OPTION.
Thanks
View 28 Replies
View Related
Jun 10, 2007
We use the Bulk Insert Task extensively in our solution. Everything was working great till we deployed it to the staging environment. The database server is different from the application servers. We have ASP.NET web services driving SSIS packages on the application server. After struggling through several security issues to get this working (we ended up creating an application pool with a domain account) we are now stuck with this problem. On a different note, I still don't understand what specific security permission available to the domain account makes it work.
I read in a blog that SQL Server 2005 SP2 Beta had this (Bulk Insert) fixed, but not the final production version. Is there a specific reason why this is so?
SSIS and the API are quite easy to work with, but the associated security and deployment issues are not always clear. A lot of answers seem to be coming from end users (thanks a lot to all for sharing your experiences) and sadly are not presented clearly in the SSIS documentation.
View 2 Replies
View Related
May 15, 2015
I have a file which has some wind data that I am trying to import into a SQL database through bulk insert. If the script worked as it is supposed to, I should see 144 rows affected, but I see 0 rows affected.
BULK INSERT TOWER.RAWINTERFACE_1058 FROM 'C:\Temp\900020150427583.txt'
WITH (CHECK_CONSTRAINTS, CODEPAGE='RAW', DATAFILETYPE='char', FIELDTERMINATOR=' ', ROWTERMINATOR='\n', FIRSTROW=172)
The code works if the file is large, but if it is small, 0 rows are affected; also, if I remove the header rows then the file works again. I want to understand what is going on here. I am including a screen shot of the file in Notepad++. I have tried changing the row terminator (several variants) and also tried to change the codepage, but nothing seems to work. No error file is being generated either, if I give the error file option.
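One thing to note: FIRSTROW = 172 tells BULK INSERT to skip everything before the 172nd row, so a small file with fewer rows than that loads nothing, which matches the "large file works, small file gives 0 rows" symptom. A hedged sketch with a smaller FIRSTROW and an error file for diagnostics (the FIRSTROW value and error-file path are assumptions):
[code]
-- a minimal sketch: skip only the header row(s) and capture rejected rows for inspection
BULK INSERT TOWER.RAWINTERFACE_1058
FROM 'C:\Temp\900020150427583.txt'
WITH (CHECK_CONSTRAINTS, CODEPAGE = 'RAW', DATAFILETYPE = 'char',
      FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n',
      FIRSTROW = 2,                              -- assumption: one header row
      ERRORFILE = 'C:\Temp\wind_errors.log');    -- hypothetical path
[/code]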
View 7 Replies
View Related
May 16, 2012
I need an fmt (format) file for the values below:
"Stuid","Stuname","Class","DOJ","English","Math","Science"
"S1","Ram","10/31/2011,Monday",40,32,50
"S2","Bala","10/31/2011,Monday",50,45,69
"S3","Sam","10/31/2011,Monday",74,78,79
"S4","Jon","10/31/2011,Monday",65,58,89
"S5","Jos","10/31/2011,Monday",41,25,69
"S6","Jim","10/31/2011,Monday",74,41,41
"S7","Jack","10/31/2011,Monday",98,57,47
"S8","Sate","10/31/2011,Monday",87,73,45
"S9","Brb","10/31/2011,Monday",47,89,65
"S10","Jom","10/31/2011,Monday",14,100,47
View 15 Replies
View Related
Aug 11, 2015
I need a way to add an extra column every time I bulk insert data from a new flat file into an existing table, so that I can identify the file from which, or the time when, each row was inserted.
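BULK INSERT itself cannot add columns that are not in the file, so the usual pattern is to load into a staging table shaped like the file and then copy into the real table, stamping each row with the file name and load time. A minimal sketch (all table, column and file names are hypothetical):
[code]
-- a minimal sketch: stage the file, then stamp each row with its source and load time
BULK INSERT dbo.Staging_Flat
FROM '\\fileserver\drop\extract_20150811.txt'   -- hypothetical path
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

INSERT INTO dbo.Target_Table (col1, col2, SourceFile, LoadedAt)
SELECT col1, col2, 'extract_20150811.txt', SYSDATETIME()
FROM dbo.Staging_Flat;

TRUNCATE TABLE dbo.Staging_Flat;
[/code]
Alternatively, for the load-time stamp alone, a DEFAULT SYSDATETIME() constraint on the extra column plus a format file that maps only the file's columns avoids the second step; the file name still has to be supplied by an INSERT...SELECT.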
View 2 Replies
View Related
Nov 5, 2010
I have a CSV file that I am trying to bulk load into a temp table. The data in the file is all jumbled together, as in, there does not appear to be a row terminator. However, I do see a bunch of little rectangular boxes that I assume are the row terminators.
When I run the bulk insert, the data is treated as one string. For example... If I have 10 columns in the table, the 10 columns will be populated, but the remainder of the data is dumped into the last column.
Here are the row terminators I have used so far that haven't worked, including CRLF.
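The little rectangular boxes are usually bare line-feed (0x0A) characters from a Unix-style file, which Notepad renders as boxes; in that case the hex form of the row terminator is what tends to work. A hedged sketch (table and path are hypothetical, and the hex terminator syntax depends on the SQL Server version):
[code]
-- a minimal sketch: the file appears to use a bare LF (0x0A) as its row terminator
BULK INSERT #StagingTemp
FROM 'C:\import\source.csv'        -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a');
[/code]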
View 31 Replies
View Related
Aug 5, 2015
USE TEST
GO
/****** BULK INSERT ******/
BULK
INSERT [Table01]
FROM 'C:empdata.csv'
[code]....
I am using the above code to insert CSV file data which includes Arabic data as well. The upload is successful; however, the Arabic field data is loaded with invalid characters and I get the following error: Msg 4864, Level 16, State 1, Line 3... Bulk load data conversion error (type mismatch or invalid character for the specified codepage).
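Invalid characters on Arabic text usually mean the file encoding and the CODEPAGE/DATAFILETYPE options do not match, and the target columns need to be nvarchar. Two hedged options depending on how the CSV was saved (paths, terminators and the choice between them are assumptions):
[code]
-- option 1: the file was saved as Unicode (UTF-16)
BULK INSERT [Table01]
FROM 'C:\import\data.csv'          -- hypothetical path
WITH (DATAFILETYPE = 'widechar', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- option 2: the file was saved as ANSI text using the Arabic (Windows-1256) code page
BULK INSERT [Table01]
FROM 'C:\import\data.csv'
WITH (CODEPAGE = '1256', FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
[/code]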
View 15 Replies
View Related
Sep 15, 2015
How do I do a bulk insert into a temp table from a text file. Text file looks like that:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
I keep receiving errors.
The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
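A row 1, column 2 conversion error is consistent with the header line being read as data (the literal text TYPE cannot convert to smallint). A hedged sketch that skips the header and assumes the columns are tab separated:
[code]
-- a minimal sketch: skip the header row; the tab field terminator is an assumption
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
[/code]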
View 2 Replies
View Related
May 14, 2015
I am using a BCP format file to import a CSV file. The file looks like the following:
"01","02"
The format file looks like the following:
6.0
2
1 SQLCHAR 0 0 """ 0 ""
2 SQLINT 0 0 "","" 1 MROS
3 SQLINT 0 0 ""
" 2 MROF
When both fields are set to the SQLCHAR data type, the data imports successfully without the quotes, as 01 and 02. These fields will always be numbers and I want them as integers, so I set the data type to int in the database and SQLINT in the format file. The result was that the 01 became 12592 and the 02 became 12848. Where are these numbers coming from?
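The numbers come from reading the character bytes as a native binary integer: SQLINT in a format file describes how the value is stored in the data file, not the target column's type, so the ASCII bytes '0' (0x30) and '1' (0x31) are read as the little-endian value 0x3130 = 12592, and '0','2' give 0x3230 = 12848. For a character file the host fields stay SQLCHAR even when the table column is int, and SQL Server converts during the load. A hedged sketch of the adjusted format file (the escaped quotes assume the quoting pattern shown above, with CRLF line ends):
[code]
6.0
3
1  SQLCHAR  0  0   "\""      0  ""
2  SQLCHAR  0  12  "\",\""   1  MROS
3  SQLCHAR  0  12  "\"\r\n"  2  MROF
[/code]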
View 7 Replies
View Related
Dec 3, 2015
I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple windows domain user (nothing special, no admin rights, etc.) I am having some issues with using Bulk Insert when the data file is on a network share when using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:
Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile='FormatFile.xml')
Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:
Msg 4861, Level 16, State 1, Line 2 Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5(Access is denied.).
I found this snip in BULK INSERT (Transact-SQL), under Security Account Delegation (Impersonation), which says, in part (emphasis mine):
To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.
We tried unconstrained delegation and I rebooted the SQL Server, but it still does not work. Later we tried constrained delegation and it still does not work.
I have verified the SPNs:
C:\>setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
MSSQLSvc/SQLQA.ad.local:1433
MSSQLSvc/SQLDev.ad.local:1433
MSSQLSvc/SQLQA.ad.local
MSSQLSvc/SQLDev.ad.local
I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.
C:\>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go
net_transport auth_scheme
------------- -----------
TCP           KERBEROS
(1 rows affected)
1>
If I move the source file to a local drive (on the SQL server), all works fine, but I must be able to read from a file share?
View 8 Replies
View Related
Jun 5, 2015
I try to import data with bulk insert. Here is my table:
CREATE TABLE [data].[example](
col1 [varchar](10) NOT NULL,
col2 [datetime] NOT NULL,
col3 [date] NOT NULL,
col4 [varchar](6) NOT NULL,
col5 [varchar](3) NOT NULL,
[Code] ....
My format file:
10.0
7
1 SQLCHAR 0 10 "@|@" 2 Col2 ""
1 SQLCHAR 0 10 "@|@" 3 Col3 ""
2 SQLCHAR 0 6 "@|@" 4 Col4 Latin1_General_CI_AS
[Code] .....
The first field in the file should be stored twice in my table (in both col2 and col3).
My file:
Col1,Col2,Col3,Col4,Col5,Col6,Col7
2015-04-30@|@MDDS@|@ADP@|@EUR@|@185.630624@|@2015-04-30@|@MDDS
2015-04-30@|@MDDS@|@AED@|@EUR@|@4.107276@|@2015-04-30@|@MDDS
My command:
bulk insert data.example
from 'R:\epool\example.csv'
WITH (FORMATFILE = 'R:\cfg\example.fmt', FIRSTROW = 2)
Get error:
Msg 4823, Level 16, State 1, Line 2
Cannot bulk load. Invalid column number in the format file "R:\cfg\example.fmt".
I changed some things, such as:
used ";" and "," as the column delimiter
changed the file type from UNIX to DOS and adjusted the format file with "\r\n" for the row delimiter
Removed this line from format file
1 SQLCHAR 0 10 "@|@" 2 Col2 ""
Nothing works ....
View 7 Replies
View Related
Feb 3, 2010
we can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data to my db table using BCP or BULK INSERT commands, as maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'Wide Table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip these columns, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of files into the db table?
View 6 Replies
View Related
Apr 18, 2008
Hello,
I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file to an empty SQL table that has an Identity column defined. How do I tell the Bulk Insert Task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to have the identity column in the table too.
Thanks.
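With plain BULK INSERT (and the same idea applies to the Bulk Insert Task), the usual answer is a format file that never maps a file field to the identity column; with KEEPIDENTITY left off, SQL Server generates the identity values itself. A hedged sketch for a hypothetical table whose column 1 is the identity and whose remaining two columns come from a tab-delimited file (names, widths and the 9.0 version line are assumptions):
[code]
9.0
2
1  SQLCHAR  0  100  "\t"    2  FirstName  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  100  "\r\n"  3  LastName   SQL_Latin1_General_CP1_CI_AS
[/code]
The load is then BULK INSERT dbo.MyTable FROM 'file.txt' WITH (FORMATFILE = 'file.fmt'), and the skipped identity column is filled in automatically.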
View 8 Replies
View Related
Nov 2, 2007
Does anyone know how to do a bulk insert using just the script task? I've been searching everyehere but can't seem to find a sample.
View 6 Replies
View Related
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do I get this;
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong ?
View 13 Replies
View Related
Sep 29, 2015
I need to create a T-sql script to create logins in bulk so users can access views on the database. Usernames as used in the local domain are all stored in a table.
Users already exist in the domain (let's say: MYDOMAIN). ADSI.dbo.Ad.username is the column which contains the usernames as used in the Active Directory domain (+/- 350). The script needs to check if a login already exists, to make sure a login is not added twice. If possible, give all these users the db_datareader role in, let's say, MYDATABASE.
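A hedged sketch of the kind of script described, assuming ADSI.dbo.Ad.username holds bare account names without the domain prefix, that the caller can create logins and users, and that the server is 2012 or later (for ALTER ROLE ... ADD MEMBER); it only handles the case where neither the login nor the database user exists yet:
[code]
DECLARE @name sysname, @login sysname, @sql nvarchar(max);

DECLARE user_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT username FROM ADSI.dbo.Ad;          -- source of AD account names, per the post

OPEN user_cur;
FETCH NEXT FROM user_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @login = N'MYDOMAIN\' + @name;
    IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = @login)
    BEGIN
        SET @sql = N'CREATE LOGIN ' + QUOTENAME(@login) + N' FROM WINDOWS; '
                 + N'USE MYDATABASE; '
                 + N'CREATE USER ' + QUOTENAME(@login) + N' FOR LOGIN ' + QUOTENAME(@login) + N'; '
                 + N'ALTER ROLE db_datareader ADD MEMBER ' + QUOTENAME(@login) + N';';
        EXEC (@sql);
    END;
    FETCH NEXT FROM user_cur INTO @name;
END;
CLOSE user_cur;
DEALLOCATE user_cur;
[/code]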
View 7 Replies
View Related
Sep 27, 2007
I have to update a field within a table of 60 records or so. Each record has a different field value. It's type varchar. I was given an Excel file with the field values and was thinking of a bulk update, like a bulk insert, but I don't recall that being possible that way.
Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
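There is indeed no "bulk update"; the staging-table-plus-UPDATE join is the standard route. A minimal sketch, assuming the Excel sheet is saved as a CSV containing a key column and the new value (all names and the path are hypothetical):
[code]
-- a minimal sketch: load the ~60 rows into a staging table, then update by key
CREATE TABLE #stage (RecordKey int, NewValue varchar(100));

BULK INSERT #stage
FROM 'C:\import\values.csv'        -- hypothetical path (Excel sheet saved as CSV)
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

UPDATE t
SET    t.TargetField = s.NewValue
FROM   dbo.TargetTable AS t
JOIN   #stage AS s ON s.RecordKey = t.RecordKey;
[/code]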
View 1 Replies
View Related
Mar 7, 2008
Hello,
I need help regarding bulk insert into SQL Server. Basically I am making a customized web application that saves all the files and directories into an ArrayList, and then I want to save the contents of that ArrayList into a SQL Server table. Now my questions are:
1) Should I use an ArrayList for this purpose (I mean, is it fast enough, as my data is quite big)?
2) How do I save all the data from the ArrayList into the SQL Server table with bulk insert or something like that? I don't want to use an INSERT query looping through the ArrayList, as it will take a lot of time and resources.
I hope you get my query.
View 4 Replies
View Related
Oct 11, 2000
I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (Yields appx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (Yields appx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assest2 + Assets3 + ... + Assets9
FROM MyTable (Yields appx. 200,000 records)
Updates and cursors just seem to be too slow.
So far I have done the following, but was wondering if anyone could think of a better way.
SELECT 6 million records that don't need to be deleted into a #TempTable
Use statements above to select into same #TempTable
DROP and recreate Original Table
SELECT 6 + 2 million records INTO original table.
This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
Any comments are appreciated,
-Marc
View 3 Replies
View Related
Oct 25, 2006
Does anyone know how to do a bulk insert in SQL Server Express? Thanks, Marcel
View 2 Replies
View Related
Dec 27, 2007
Hi!
I'm building a web application. I need to read data from a text or Excel file, process the data, and then store the resulting records in the database. The number of records is big. I can store the data records in the database (SQL Server 2005) one at a time, but I think that is slow. Is there any way to insert the data in bulk?
Thanks!
ccy
View 4 Replies
View Related
Sep 4, 2006
I'm trying to import data from a flat file into a table and have a few problems.
1. The field delimiter is ',' (comma). If ',' occurs in a quoted string it is still treated as a field delimiter. Is this a BUG, or ... ?
2. In the table I have a datetime field that can be null, but bulk insert reports an error if the flat file contains null or ''. It's OK only when a real date is specified.
Table:
create table AttachmentList (
Code integer not null,
ClassID integer null,
Description varchar(200) null,
ValidUntil datetime null,
constraint PK_ATTACHMENTLIST primary key (Code)
)
Flat file:
1,13,'Naputak, CU 261098', ''
Thanks in advance
Davor
View 3 Replies
View Related
Aug 29, 2006
I have a web page that prompts a user to select a csv file. Using a Bulk Insert the data is loaded into a SQL Server 2005 table.
I have been using the Bulk Insert with SQL Server 2000 with no problems, but with 2005 I am getting the error "You do not have permission to use the bulk load statement".
My web.config file has the following connection string:
[code]
<add key="connectionString" value="Server=(local);Database=BroadCastOne;trusted_connection=true" />
[/code]
I've given bulkAdmin role to the ASPNET user. It's still not working. What am I doing wrong?
Any help is greatly appreciated,
Ninel
View 3 Replies
View Related
Sep 9, 2007
Hey There,
Here is an example of a bulk insert into a SQL Server table.
From the application you pass an XML string to a stored procedure, and it inserts all the data into the table using that XML.
Example SP.
CREATE PROCEDURE StoredProcName
(
@strXML varchar(8000)
)
AS
Declare @intPointer int
exec sp_xml_preparedocument @intPointer output, @strXML
INSERT into tbl_plnd_insertion
SELECT Column1, Column2, Column3, Column4, Column5
FROM OpenXml(@intPointer,'/root/tbl_plnd_insertion',2)
WITH (Column1 varchar(20) '@Column1' , Column2 varchar(20) '@Column2', Column3 varchar(20) '@Column3' , Column4 varchar(50) '@Column4', Column5 varchar(50) '@Column5')
exec sp_xml_removedocument @intPointer
Thanks !!!!!
View 10 Replies
View Related
Dec 28, 2013
I'm able to successfully import data in a tab-delimited .txt file using the following statement.
BULK INSERT ImportProjectDates FROM "C:\tmp\ImportProjectDates.txt"
WITH (FIRSTROW=2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
However, in order to import the text file, I had to add columns to the text file to match the columns that exist in the table. The original file is an export out of another database and contains all but 5 columns from my db.
How would I control which column BULK INSERT actually imports when working with a .txt file? I've tried using a FORMAT FILE, however I kept getting errors which I tracked down to being a case of not using it with a .txt file.
Yes, I could have the DBA add in the missing columns to the query from the other DB to create the columns, however I'd like to know a little bit more about this overall.
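Besides a format file, another documented route is to bulk insert into a view that exposes only the columns that actually exist in the .txt file; the five missing table columns then just need to be nullable or have defaults. A hedged sketch (the column subset is hypothetical):
[code]
-- a minimal sketch: the view lists only the columns present in the text file
CREATE VIEW dbo.vImportProjectDates
AS
SELECT ProjectID, StartDate, EndDate           -- hypothetical subset of ImportProjectDates
FROM   dbo.ImportProjectDates;
GO

BULK INSERT dbo.vImportProjectDates
FROM 'C:\tmp\ImportProjectDates.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
[/code]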
View 9 Replies
View Related
Jul 29, 2014
I need to load the following data into a SQL table. This is how the vendor is able to provide it to us.
CRCorp Daily Report,,,,,,
,,,,,,
Facility,Location,Purchase Order #,Vendor,Inventory #,Date Ordered,Extended Cost
09-Mowtown 495 CRST,09-402A Women's Imaging,327937,"BARD PERIPHERAL VASCULAR, INC.",113989,7/25/2014,650
09-Mowtown 495 CRST,09-402A Women's Imaging,327936,"WB MASON CO., INC.",112664,7/25/2014,8.64
01-Mowtown 499 CRST,01-302B Oncology,327947,McKesson General Medical,n/a,7/25/2014,129.02
[Code] ....
I have attempted to bulk insert it into this table with no luck.
CREATE TABLE POMaster
(Facility VARCHAR(75),
Location VARCHAR(75),
PONum INT,
VendorNm INT,
INVENTORYNUM VARCHAR(25),
orderDte DATE,
ExtendedPrice NUMERIC(10,2)
)
GO
It does not like the double quotes. How do I make this format work? Do I need a format file?
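On SQL Server 2017 or later, BULK INSERT handles quoted CSV natively through FORMAT = 'CSV' and FIELDQUOTE; on the 2014-era versions this post dates from, a format file that uses the quote-comma sequences as terminators (like the earlier sketch in this thread) is the usual workaround. Note also that the sample has vendor names in the Vendor column while the table declares VendorNm INT, so that column likely needs to be VARCHAR too. A hedged sketch of the 2017+ form; FIRSTROW = 4 assumes the title line, the blank line and the header line come first, as in the sample, and the path is hypothetical:
[code]
-- a minimal sketch, SQL Server 2017+ only: native quoted-CSV handling
BULK INSERT dbo.POMaster
FROM '\\fileserver\drop\CRCorpDaily.csv'       -- hypothetical path
WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIRSTROW = 4,
      FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
[/code]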
View 2 Replies
View Related
Mar 5, 2015
I have to perform a bulk import on a regular basis and have created a script to do this. The problem is that the .csv file has 12 columns and the table to import into has 14. To work around this discrepancy I have decided to use a format file. The problem is how to create one.
View 3 Replies
View Related