SQL Tools :: Cannot Bulk Load In Execute Task If File Is On Network (2014)
May 19, 2015
SELECT from OPENROWSET(BULK '\\SERVERNAME\somepath\somefile.csv'... fails, while SELECT from OPENROWSET(BULK 'c:\somepath\somefile.csv' ... works.
I am running the task as a specific SQL Server user. If I run the same query in Management Studio using EXECUTE AS LOGIN = 'batchuser', it also works for any path.
How can I make this work without an extra step that moves the data to the local server first? That would add administrative overhead.
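For reference, a minimal repro of the pattern in question (a sketch; the UNC path and login are placeholders, and SINGLE_CLOB stands in for whatever BULK options the real query uses):

-- Works in SSMS under the batch login, but fails from the Execute SQL Task:
EXECUTE AS LOGIN = 'batchuser';
SELECT BulkColumn
FROM OPENROWSET(BULK '\\SERVERNAME\somepath\somefile.csv', SINGLE_CLOB) AS f;
REVERT;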
I have an application that fetches records from a database (MS Access 2000), and I have to use the results in a Script Task. At present I have the record-fetching query and connection string in the script itself. I would like to keep them independent. Is there a Control Flow tool (like the Execute SQL Task) to fetch the result set from Access, so that the fetched results can be used in a Script Task?
I am trying to load the data from one database to another using Tasks -> Generate Scripts -> tables (data only). I saved the data script, but when I ran it against the destination DB I got the error below.
Error: Visual Studio has encountered an exception. This may be caused by an extension. You can get more information by running the application together with the /log parameter on the command line, and then examining the file 'C:\Users\KK67\AppData\Roaming\Microsoft\AppEnv\10.0\ActivityLog.xm'. When I tried opening the folder using the above path, I don't see an AppData folder there.
We have about 1 TB across three different databases, including the .mdf and log files, and also some SQL jobs (daily and weekly backup jobs). What is the best option to copy these databases and SQL jobs? Is there any tool available that can copy these files over the network, and how much time will it take?
1) Should we go for the full backup and restore option?
2) Take the SQL Server databases offline and copy and paste the .mdf, .ldf and log files to the different location?
3) What would happen to the database login accounts, cross-database ownership chaining, etc.?
I'm trying to use BULK INSERT for the first time and am getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, so the row terminator on line 7 of the format file should be correct, right?
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
BULK INSERT tbl_ASX_Data_temp FROM 'M:\Data\ASX\ImportTest.txt' WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')
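One note, for what it's worth: for a non-Unicode text data file, the host-file data type in a format file is normally SQLCHAR for every field, even when the target column is nvarchar; SQLNCHAR applies to Unicode (UTF-16) data files. A sketch of a non-XML format file of that shape (the widths, terminators and second column here are invented, not the actual layout):

9.0
2
1  SQLCHAR  0  6    ","      1  ASXCode   SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  100  "\r\n"   2  OtherCol  SQL_Latin1_General_CP1_CI_AS

A truncation error on column 1 usually means the first terminator was not found where expected, so the field overran its declared 6 characters.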
Declare @xml VARCHAR(MAX)
Declare @i as int
select @xml = BulkColumn from openrowset(bulk 'C:\Documents and Settings\Kasi\Desktop\ote.xml', single_clob) as cse
EXEC sp_xml_preparedocument @i OUTPUT, @xml
Select * From OpenXML(@i, '/college/cse', 2) With (name varchar(50), rollno int, year int)
I got the output, but I want to give the path as a parameter when executing the procedure. Can anyone help me?
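OPENROWSET(BULK ...) only accepts a string literal for the file name, so the usual workaround is dynamic SQL. A sketch (the procedure name and parameter are hypothetical):

CREATE PROCEDURE dbo.LoadCollegeXml   -- hypothetical name
    @path NVARCHAR(260)               -- file path supplied at execution time
AS
BEGIN
    DECLARE @xml VARCHAR(MAX), @i INT, @sql NVARCHAR(MAX);

    -- Build the OPENROWSET call dynamically because BULK needs a literal path
    SET @sql = N'SELECT @xmlOut = BulkColumn FROM OPENROWSET(BULK ''' +
               REPLACE(@path, '''', '''''') + N''', SINGLE_CLOB) AS cse;';
    EXEC sp_executesql @sql, N'@xmlOut VARCHAR(MAX) OUTPUT', @xmlOut = @xml OUTPUT;

    EXEC sp_xml_preparedocument @i OUTPUT, @xml;
    SELECT * FROM OPENXML(@i, '/college/cse', 2)
             WITH (name VARCHAR(50), rollno INT, year INT);
    EXEC sp_xml_removedocument @i;    -- release the parsed document handle
END

It would then be called as, e.g., EXEC dbo.LoadCollegeXml @path = 'C:\Documents and Settings\Kasi\Desktop\ote.xml';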
I have a file which has * as the field delimiter and ~ as the record delimiter, but I don't know how many columns each row will have. The only thing known is the maximum, which can be 15.
The file looks something like: A1*A2*A3*~B1*B2~C1*C2*C3*C4*C5~
So I have created a table with 15 columns (since 15 can be the max), but now when I try to insert the file into that table, it inserts the entire file into one single column.
The command which I am using is: BULK INSERT tablename FROM 'filename' WITH (FIELDTERMINATOR = '*', ROWTERMINATOR = '~')
but this is not giving the correct output.
The output expected is:
A1 A2 A3
B1 B2
C1 C2 C3 C4 C5
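One workaround (a sketch; the table and file names are made up) is to load each ~-terminated record whole into a one-column staging table and split on * afterwards:

CREATE TABLE dbo.RawStage (line VARCHAR(MAX));

-- No FIELDTERMINATOR: each ~-terminated record lands in the single column
BULK INSERT dbo.RawStage
FROM 'C:\data\input.txt'
WITH (ROWTERMINATOR = '~');

-- Split each record on * (STRING_SPLIT with an ordinal needs SQL Server 2022;
-- earlier versions need a custom splitter function)
SELECT s.ordinal, s.value
FROM dbo.RawStage AS r
CROSS APPLY STRING_SPLIT(r.line, '*', 1) AS s
WHERE s.value <> '';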
Is there a way to read a file name automatically from a network folder? I can successfully bulk insert from this particular folder. The next step is as I add files, I wish to bulk insert the latest file added so the program must make that determination and import that specific file. I can delete the older files if necessary and save them elsewhere but it would still be nice to be able to read the file name. I then wish to store the name of this file, whatever it is, into a field called "SourceFileName" in my table that I am bulk inserting into. Does anyone have an example in dynamic SQL? Thanks.
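A sketch of one way to do it with xp_cmdshell (which must be enabled; the folder, table and column names here are placeholders):

DECLARE @file NVARCHAR(260), @sql NVARCHAR(MAX);

-- Capture a directory listing sorted newest-first; the IDENTITY preserves order
CREATE TABLE #files (id INT IDENTITY(1, 1), fname NVARCHAR(260));
INSERT INTO #files (fname)
EXEC xp_cmdshell 'dir /b /o-d "\\server\share\incoming\*.csv"';

SELECT TOP (1) @file = fname
FROM #files
WHERE fname IS NOT NULL AND fname <> 'File Not Found'
ORDER BY id;

-- Bulk insert the newest file via dynamic SQL
SET @sql = N'BULK INSERT dbo.MyTable FROM ''\\server\share\incoming\' + @file +
           N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
EXEC (@sql);

-- Stamp the rows just loaded with the source file name
UPDATE dbo.MyTable
SET SourceFileName = @file
WHERE SourceFileName IS NULL;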
I am running Microsoft SQL Server 2012 SP on a Windows Server 2008 R2 Standard SP1 box. The SQL Server service is running as a simple Windows domain user (nothing special, no admin rights, etc.). I am having some issues with using Bulk Insert when the data file is on a network share and I am using Windows Authentication. What is known is that the SQL Server service account has access to the network resource, which is shown by logging into SQL Server with a SQL account and doing the Bulk Insert. I also have rights to the files on the share, as shown by the fact that I put the files there. My SQL is in the form of:
Bulk Insert [table name] From '\\[server]\[share]\[filename]' With (FirstRow = 2, FormatFile = 'FormatFile.xml')
Now, when connecting to SQL Server with Windows Authentication and running the Bulk Insert I get the following error:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\[server]\[share]\[filename]" could not be opened. Operating system error code 5 (Access is denied.).
I found this snip in BULK INSERT (Transact-SQL), under Security Account Delegation (Impersonation), which says, in part (emphasis mine):
To resolve this error [4861], use SQL Server Authentication and specify a SQL Server login that uses the security profile of the SQL Server process account, or configure Windows to enable security account delegation. For information about how to enable a user account to be trusted for delegation, see How to Configure the Server to be Trusted for Delegation.
We tried unconstrained delegation and I rebooted the SQL server, but it still does not work. Later we tried constrained delegation and it still does not work.
I have verified the SPNs:
C:\>setspn ad\svc_sql
Registered ServicePrincipalNames for CN=SVC_SQL,OU=Service Accounts,OU=Users,OU=ad domain,DC=ad,DC=local:
    MSSQLSvc/SQLQA.ad.local:1433
    MSSQLSvc/SQLDev.ad.local:1433
    MSSQLSvc/SQLQA.ad.local
    MSSQLSvc/SQLDev.ad.local

I have verified that my SQL connection is TCP and I am getting/using a Kerberos security token.

C:\>sqlcmd -S tcp:SQLQA.ad.local,1433 -E
1> Select dec.net_transport, dec.auth_scheme From sys.dm_exec_connections As dec Where session_id = @@Spid;
2> go
net_transport auth_scheme
------------- -----------
TCP           KERBEROS

(1 rows affected)
1>
If I move the source file to a local drive (on the SQL Server machine), all works fine, but I need to be able to read from a file share. What am I missing?
Hi All, I have an ASP.NET 2.0 app that needs to bulk load data from an XML file into a SQL Server (Express) table. Is there an easy way to do this? Thanks, Claude.
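One server-side option (a sketch; the file path, table and element names are invented) is to read the file with OPENROWSET(BULK ..., SINGLE_BLOB) and shred the repeating elements with nodes():

DECLARE @x XML;

-- Read the whole file server-side into an XML variable
SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\data\orders.xml', SINGLE_BLOB) AS src;

-- Shred repeating elements into rows
INSERT INTO dbo.Orders (OrderId, CustomerName)
SELECT r.value('(OrderId/text())[1]', 'INT'),
       r.value('(CustomerName/text())[1]', 'NVARCHAR(100)')
FROM @x.nodes('/Orders/Order') AS t(r);

From the ASP.NET side, SqlBulkCopy (after reading the XML into a DataSet) is the other common route.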
I am new to SSIS but I have average working knowledge of SQL. My problem is as follows: I have a pipe-delimited text file in some folder, and the number of columns and the names of the columns are not consistent. It can have any number of columns, and it can have any column names. I need to load this text file's data into a SQL table. All I want is to load this file into the SQL database under some temp name. Once I get the table into the SQL database, I can match the column names of the target table and this temp table and push only those columns which match the target table. For this I can frame dynamic SQL. This part is clear to me.
Now the problem is, I developed an SSIS package to push the text file to a SQL table, and I am able to do this. But if I change the column names or add a new column, SSIS is not able to push the new columns. Is this functionality available in SSIS? Can it be dynamic like this?
I hope I am clear with my problem. If you need any clarification, please let me know.
I am able to run SSIS packages as SQL Server Agent jobs with the Control Flow item "File System Task" if I move a file (test.txt) from a drive (C:) on the server (where the SQL Agent jobs run) to a subdirectory on the same drive. But if I try to move a file on a network drive, the package fails.
In the Control Flow tab, I have an Execute SQL Task that outputs a full result set into a variable of the Object type. Now how can I write the contents of the full result set into a text file using a Script Task? I also want to format it the following way while I output to a file:
Column Name 1: Column Value
Column Name 2: Column Value
and so on.
I tried writing the contents of the object variable into a file, but the file had an output of a single word: System.__ComObject.
Code for writing the full result set into a text file:
' The Object variable wraps a COM ADO recordset, so ToString() only returns
' the COM type name ("System.__ComObject"), not the rows themselves.
Dim RSsqloutput As String = Dts.Variables("objVariable").Value.ToString()
Dim strVal As String = "File completed on " & Now() & vbCrLf & "------------------------------------------------------" & vbCrLf
I can't use DTS nor the DTS wizard, as I need to put it in a .sql file and run it through the command line via a .bat file (it's more for the users).
Each row ends with an EOL character and the fields are all fixed width, but I have a little problem here: some rows are empty, containing just an EOL character.
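A sketch of that setup (the server, table and file names are assumed): the .bat calls sqlcmd with the script, and the script does the BULK INSERT.

rem run_load.bat
sqlcmd -S MYSERVER -E -i "C:\scripts\load_data.sql"

-- load_data.sql
BULK INSERT dbo.MyStagingTable
FROM 'C:\data\fixedwidth.txt'
WITH (FORMATFILE = 'C:\data\fixedwidth.fmt');

The empty rows are a separate problem: with fixed-width parsing they will not match the expected record length, so they generally have to be stripped from the file beforehand.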
I wasn't sure where to put this topic, so I put it here, since I figured it is a question that would apply to virtually any version, even though I am using SQL Server 2005.
We have a vendor that sends us a fixed-width text file every day that needs to be imported to our database in 3 different tables. I am trying to import all of the data to a staging table and then plan on merging/inserting select data from the staging table into the 3 tables. The file has 77 columns of data and 20,000+ records. I created an XML format file, sampled below:
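(For reference, the general shape of such a format file; this is a generic sketch with placeholder names and lengths, not the actual 77-column layout:)

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="10"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="25"/>
    <!-- ... one FIELD per fixed-width column, 77 in all ... -->
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="Col1" xsi:type="SQLCHAR"/>
    <COLUMN SOURCE="2" NAME="Col2" xsi:type="SQLCHAR"/>
  </ROW>
</BCPFORMAT>

One thing to check with fixed-width files: if each record actually ends in CR/LF that no FIELD accounts for, the declared lengths will not add up to the true record length, which typically produces the "unexpected end of file" error shown below.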
The data file is a fixed width file with no column delimiters or row delimiters that I can tell. When I run the following insert statement I get the error below it.
BULK INSERT myStagingTable FROM '.........myDataSource.txt' WITH ( FORMATFILE = '.........myFormatFile.xml', ERRORFILE = '.........errorlog.log' );
Here is the error:
Msg 4832, Level 16, State 1, Line 1 Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1 The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1 Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I am facing an issue with bulk upload on the Test Server.
Issue: when the OPENROWSET command is run from a SQL Server other than the Test Server, the query runs fine; when the same command is run from the Test Server, it gives an error.
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "\ServerNameinputFileName.csv" could not be opened. Operating system error code 5(Access is denied.).
For example: if the command is run from System A connecting to the SQL Server instance on the Test Server, it gives this error. If the same command with the same rights is run from any other SQL Server instance, say Dev1, it runs fine.
If the command is run from the Test Server connecting to any SQL Server instance, including the Test Server, it runs fine.
Tried:
1) Giving read/write rights on the shared folder to the user under which the SQL Server service is running on the Test Server.
2) Giving read/write rights on the shared folder to everyone.
Query:
SELECT DISTINCT * FROM OPENROWSET
(
BULK '\\ServerName\input\FileName.csv',
FORMATFILE='\\ServerName\Format.xml'
)
AS FileList
Please provide me with some solution. What can be the reason for such behaviour?
I am trying to load an XML file into a string variable using an XML Task. The file is 67,936 KB. When I execute the package in the dev environment I get the following error:
[XML Task] Error: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Has anybody run into this before? In the next step I pass the XMLData string to a stored procedure, which loads it into a table with an XML data type column.
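One way around the memory limit (a sketch; the path and table names are assumed) is to skip the SSIS string variable entirely and let the server read the file:

-- Read the large file server-side, straight into an XML column
INSERT INTO dbo.XmlStage (doc)
SELECT CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK N'\\server\share\bigfile.xml', SINGLE_BLOB) AS src;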
With my SSIS package I have to read several flat files. These files are stored in different folders on a Unix machine. In my loop task I first have a script that checks whether the file exists. It does, so I set the path into a variable. In the connection manager for the flat file I have set the source for the file to that variable. The Bulk Insert task starts, reads the file, and everything is cool. But sometimes the package fails with this message:
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "\\Owpu0kas 0 0 5 eceivedw000005.090" could not be opened. Operating system error code 53 (The network path was not found.)."
I stop debugging mode, restart debugging, and what happens? It runs... maybe for the next one, two or three files, and then the error comes again, but never on the same file. I'm going crazy; what can I do?
I am having problems executing a child package from a parent package using the Execute Package Task. I am attempting to run the master package through a SQL Server Agent job.
The SQL Server Agent job is owned by sa. The step that runs the parent package is configured to load the package from the SSIS Package Store on the same server that the job is running.
I have the Execute Package Task configured as follows:
Location: SQL Server
ExecuteOutOfProcess: True
Connecting as a SQL Server login (let's say TestEtl)
I have added the db_dtsoperator database role to both the TestEtl login and the login that SQL Server Agent connects through. I have also configured the child package's reader role to include db_dtsoperator. Per http://msdn2.microsoft.com/en-US/library/ms141053.aspx, this should allow these logins to run the child package.
I have enabled logging of all events in both the parent and child packages. I see the following in the logs when the Execute Package Task executes (omitted portions unrelated to the execution of the child package task):
450939  OnPreExecute    ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450940  OnPreValidate   ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450941  OnPostValidate  ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450942  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_pre: The object is ready to make the following external request: 'IDataInitialize::GetDataSource'.
450943  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_post: 'IDataInitialize::GetDataSource succeeded'. The external request has completed.
450944  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_pre: The object is ready to make the following external request: 'IDBInitialize::Initialize'.
450945  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_post: 'IDBInitialize::Initialize succeeded'. The external request has completed.
450946  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_pre: The object is ready to make the following external request: 'IDBCreateSession::CreateSession'.
450947  User:Diagnostic ETL  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  ExternalRequest_post: 'IDBCreateSession::CreateSession succeeded'. The external request has completed.
450948  OnError         ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  Error 0x80070005 while preparing to load the package. Access is denied.
450949  OnError         ParentPackage  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  Error 0x80070005 while preparing to load the package. Access is denied.
450950  OnTaskFailed    ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450951  OnPostExecute   ChildPackage   2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450952  OnWarning       ParentPackage  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
450953  OnPostExecute   ParentPackage  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000
450954  PackageEnd      ParentPackage  2007-06-08 13:35:17.000  2007-06-08 13:35:17.000  End of package execution.
I am sure that what I am doing is quite common, and I obviously have something misconfigured somewhere - but I'm not sure what my misconfiguration is. Can anyone enlighten me?
I have an SSIS package doing a bulk insert from a file. Later on I try to delete that file (in a File System Task), but I get an error:
[File System Task] Error: An error occurred with the following error message: "The process cannot access the file 'xyz' because it is being used by another process.".
I'm wondering if there isn't some way to 'tweak' the bulk insert syntax so that it doesn't lock the file?
I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.
The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc. present in the middle.
Previously I used SQL 2000 DTS to load the files in, and it was just a column transformation with Col001 from the text file loading straight into my table column. After the load, if I select len(col) it gives me 750 for all rows.
Once I started to migrate this to SSIS, I set up the Data Flow task, specified the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the Data Flow task it copies only 181 or 231 characters out of the 750 required. I feel it stops where it finds the spaces and skips the rest.
I specified row delimiters of CR and LF. I checked the file under UltraEdit and there were no special characters in the file that would cause the problem.
Any suggestions how I can get it to load the full data?
I have designed DTS packages. I have a Script Component that picks up the 'Path' of a stored file. The path is a column in the database, obtained from an OLE DB source.
Now I need to zip the files in the path given above and then store them in some location. How do I do that?
I'm importing a large csv file two different ways - one with Bulk Import Task and the other way with the Data Flow Task (flat file source -> OLE DB destination).
With the Bulk Import Task I'm putting all the csv rows in one column. With the Data Flow Task I'm mapping each csv value to its own column in the SQL table.
I used two different flat file sources and got the following:
-The first task in that package (an Exec SQL Task) retrieves a timestamp value from an external source
-That timestamp value is used in data-flows to pull data that has been changed since then
-At the end of the package the value in the db is updated
Now, if something fails, then on the next execution the checkpoint file will kick in. But this means that the first task (which retrieves the timestamp) will not execute, and therefore all the data-flows will be pulling the wrong data.
The only way I can think of getting around this problem is to specify that a task should execute regardless of the presence of a checkpoint file. Unfortunately it seems this cannot be done.
Another option might be to put the timestamp retrieval in an OnPreExecute event handler on the package.
Anyone got any advice about how I should progress? Short of a suggestion from elsewhere I'm going to go with the tactic of using the OnPreExecute to retrieve the timestamp.
I have a remote batch file on machine B that I need to execute using an 'Execute Process Task' from a package on machine A. The batch file uses PGP software and encrypts a file sitting on machine B itself. The reason my batch file sits on machine B is that the PGP software is on machine B.
If I execute the batch file by itself from machine B, the script runs fine. I refer to the same batch file via a UNC path from my package on machine A, but that does not work, since the 'Working directory' is still machine A. I cannot set machine B's folder as the working directory because it does not accept a UNC path. So I say, OK, let me map that UNC location as drive 'Z:'. But if I do so, I will still be running the process on machine A, and the batch file will look for the PGP software on machine A, and hence fail.
I have tried a third-party remote batch execution tool (PsExec) but have not had success, not because of SSIS limitations, but simply because the PGP executable, when run through the PsExec tool, does not identify the location of the public keys on machine B and hence gives an encryption failure.
How do I get the remote batch file to execute such that it runs with its own environment? Is there a better remote execution tool I can try, or are there any other features of SSIS I can use to get around this issue? I need the results of the batch file and hence do not want to make it an asynchronous process.
I am new to this, but have scoured the web and not found an answer to my question...
I have an execute process task that runs a simple batch file. When this batch file completes with an ERRORLEVEL greater than 0, I would like the task to fail. I thought this simply meant setting the "FailTaskIfReturnCodeIsNotSuccessValue" property to true, and setting the SuccessValue to 0. However, this does not appear to work.
Even with a simple batch file forcing the error code to 1, as follows, the task still completes "successfully".
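A minimal batch file of that kind would be something like:

@echo off
rem force a non-zero exit code
exit 1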
I'm trying to use "findstr.exe" to extract some lines of interest from a data file, which I will later load to a table. I'd like to issue this form of a command:
findstr.exe "^SEARCHSTRING" "srcfile" > "dstfile"
I build the arguments using expressions, and both the search string and source file get correctly set. However, the ">" seems to be ignored; I can see the lines spitting out to the temporary window when I run under VS.
QUESTION: how do you redirect the output of a command run under an Execute Process Task?
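The Execute Process Task launches the executable directly rather than through a command shell, so the ">" is passed to findstr as just another argument instead of being interpreted as redirection. One workaround (a sketch; the paths are placeholders) is to make cmd.exe the executable and let the shell do the redirecting:

Executable:  C:\Windows\System32\cmd.exe
Arguments:   /C findstr.exe "^SEARCHSTRING" "srcfile" > "dstfile"

Alternatively, the task's StandardOutputVariable property can capture stdout into a package variable, which a later task can write to the file.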