Need To Set A Conditional Default Value For Bulk Import

Feb 27, 2008

Let me preface this request with the info that I am relatively new to SQL Server, so I may be asking something that is really basic and/or not a best practice, but here goes... I need to import data from an Excel spreadsheet; one of the columns may be null or may have an integer value. I'd like to replace any null values with a 1 during import so that calculations can be done with the field once the data are imported. Can someone give me an example of how to do this? I had planned to use the bulk insert option, but if there's a better way please let me know. Thanks in advance for any advice.
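One possible approach (a hedged sketch, not from the original thread: the staging table dbo.StagingImport, the column QtyColumn, and the file path are assumptions, and the spreadsheet is assumed to have been saved as CSV) is to bulk load into a staging table and then replace the NULLs before running any calculations:

-- Load the raw rows, then substitute 1 for NULL in the nullable integer column.
BULK INSERT dbo.StagingImport
FROM 'C:\Import\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

UPDATE dbo.StagingImport
SET QtyColumn = 1
WHERE QtyColumn IS NULL;

Alternatively, ISNULL(QtyColumn, 1) can be applied at query time instead of rewriting the stored values.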

View 3 Replies



Conditional Bulk Insert

Sep 19, 2007



Hi

I am doing a bulk insert as follows. The @LastUpdate, @filePath, and @formatFile values come in as parameters to the stored proc:

INSERT INTO Categories
SELECT CategoryId, @LastUpdate
FROM OPENROWSET
(
    BULK @filePath,
    FORMATFILE = @formatFile,
    FIRSTROW = 2
) AS a

This works fine for me.

But my new requirement is that I shouldn't insert a CategoryId if it already exists in the table.

How can I make the bulk insert conditional? I am using bulk insert because the file might have millions of category IDs.

Please suggest an approach that executes as fast as possible.

Best Regards,
~Mohan Babu
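One common pattern (a hedged sketch, not the original poster's code; note that OPENROWSET(BULK ...) needs literal paths, so with @filePath and @formatFile as parameters the statement would have to be built with dynamic SQL) is to filter the rowset with NOT EXISTS so that only new CategoryIds are inserted:

INSERT INTO Categories
SELECT a.CategoryId, @LastUpdate
FROM OPENROWSET
(
    BULK 'C:\Import\categories.dat',
    FORMATFILE = 'C:\Import\categories.fmt',
    FIRSTROW = 2
) AS a
WHERE NOT EXISTS
(
    SELECT 1 FROM Categories c WHERE c.CategoryId = a.CategoryId
);

For very large files, bulk loading into a staging table first and inserting from there with the same NOT EXISTS filter is another common option.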

View 1 Replies View Related

Bulk Insert + Conditional Split

Mar 12, 2008



Hello all,

I just wanted to know whether it is possible to use Bulk Insert and Conditional Split together in one transformation.

Regards,
Kapadia Shalin P.

View 3 Replies View Related

SQL Server 2014 :: Why Don't Bulk Imports TABLOCK By Default

Jul 2, 2014

I've been reading about the "table lock on bulk load" option and TABLOCK hint.

So my understanding is by default only row locks are taken out and other queries can read/write data while the bulk load is going on. However if you were doing parallel bulk loads with overlapping keys from a clustered index then they may block each other.

But if the option is enabled, you can do the parallel bulk loads without blocking because a table lock is taken out, however, other processes couldn't read/write the data until they're all done.

Is that the gist of it? I think I got confused by some misinformation. Wouldn't all those row locks likely escalate to a table lock eventually anyway?
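For reference, the hint is requested per statement; a minimal hedged sketch (table name and file path are placeholders):

BULK INSERT dbo.TargetTable
FROM 'C:\Import\data.txt'
WITH (TABLOCK, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

The equivalent table-level setting can also be turned on persistently with sp_tableoption 'dbo.TargetTable', 'table lock on bulk load', 1.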

View 1 Replies View Related

Bulk CSV File Import?

May 25, 2004

Hi, don't know how to do this,

I've got 100+ .CSV text files (50 MB in total, not each) that I need to import into a SQL Server database, but I don't know how.

I've tried hunting the web for a file concatenation program but only came across pay-for software, which didn't help.

Any ideas? Would the BCP command do it?
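BCP and BULK INSERT both work one file at a time, so the load is usually scripted as a loop over the files. A hedged sketch for a single file (the table name, path, and the FIRSTROW = 2 header assumption are placeholders):

BULK INSERT dbo.TargetTable
FROM 'C:\Import\file001.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);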

View 14 Replies View Related

Bulk Import Error

Feb 26, 2013

This is my source data in CSV format:

4,23,2AY5623,7235623
4,23,2GP1207,1451207
4,23,2GQ6689,4186689

Table:

CREATE TABLE [dbo].[Table1](
[idCodeLevel] [int] NOT NULL,
[idFirm] [int] NOT NULL,
[valCodeFrom] [varchar](15) NOT NULL,
[valCodeTo] [varchar](15) NOT NULL
) ON [PRIMARY]

[code]....

I googled and found out that I might have to use a format (.fmt) file. But how can I create a .fmt file for a CSV file? I have only seen code to create a format file from a SQL table.
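If the CSV columns line up one-to-one with the table columns, a format file may not be needed at all; a hedged sketch (the file path is a placeholder):

BULK INSERT dbo.Table1
FROM 'C:\Import\codes.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

A format file is normally generated from the table (for example with the bcp utility's format option) rather than converted from the CSV itself.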

View 5 Replies View Related

BULK IMPORT Stress

Jul 26, 2006

I am trying to import a data file, which is tab delimited, using BULK INSERT. I have used BCP to create a format file, since the destination table has around 20 columns but the data file has only three.

Here's the problem: the columns I am trying to import comprise ID (an int identity column), Name (a varchar(255) column) and Status (a smallint column). The data file contains identity values for the first column, so I am using the KEEPIDENTITY modifier. The Status column is mandatory, so I have set all rows in the data file to zero for that column. All of the other columns in the destination table either allow NULL or have default values. When I BULK INSERT the file using the format file, the identity values are NOT imported and the Status column gets the value 3376. The Name column is the only one that gets imported correctly.

Here's the format file (sorry, it's a bit messy):

8.0
3
1 SQLINT 0 4 "\t" 1 ID ""
2 SQLCHAR 0 0 "\t" 2 Name SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 "" 4 Status ""

Where is 3376 coming from, and why are my identity values for column ID not being imported?

View 3 Replies View Related

Bulk Import From A Remote Machine

Jul 11, 2001

Hi folks,

We are trying to import a flat file from a remote machine using the BULK INSERT command.

We have mapped the remote directory on the server.

we have used the following command.

BULK INSERT table_name FROM '\\my_server\my_share\filename.txt'

The error it gives is operating system error number 5 (Access denied).

We are able to access the same file from Windows Explorer. We also referred to Books Online and, as suggested,
have set the system path to the same share name.

But it does not work.

Could you help us in this regard.

Thanks.

-Rajesh

View 2 Replies View Related

Import Data With Bulk Insert

Jun 25, 2014

I have imported data into my table using the BULK INSERT command. I needed to fill specific columns of my table with that data, so I used a view to put the values into the columns I wanted.

The table looks like this now:

id | id_param | val_param
---+----------+-----------
 1 | no_tel   | 742062141
 2 | sex      | 1
 3 | age      | 23
 4 | no_tel   | 765234157
 5 | sex      | 1
 6 | age      | 34

When I want to select only the rows where val_param = 1 for id_param = 'sex', using this query:

select * from bd_rox where id_param='sex' and val_param='1'

it returns no rows and I don't know why. The wanted result should look like this:

id | id_param | val_param
---+----------+-----------
 2 | sex      | 1
 5 | sex      | 1
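A hedged diagnostic sketch: bulk-loaded text often carries trailing spaces or carriage-return characters, which make exact string comparisons fail. Checking the raw values and lengths, and trimming if padding shows up, usually explains the missing rows:

SELECT id, '[' + id_param + ']' AS id_param_raw, DATALENGTH(val_param) AS val_len
FROM bd_rox
WHERE id_param LIKE '%sex%';

SELECT *
FROM bd_rox
WHERE LTRIM(RTRIM(id_param)) = 'sex'
  AND LTRIM(RTRIM(REPLACE(val_param, CHAR(13), ''))) = '1';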

View 9 Replies View Related

Bulk Insert, Bcp For Import Log File In Sql

Mar 11, 2008

I have a log file and I am trying to import data from it into SQL so that I can analyze (query) the data, but that task seems impossible.
In fact the log file contains a varying number of column fields (errors and various other logged events demand varying numbers of columns). On top of that, the fields themselves are hard to extract.

An example of the data in my log (xxxxxxxx is some alphanumeric chars):

2008-01-09 20:16:05,4784E36F.req,10.1.1.26,xxxxxxxxx,OK -- SMPP - xxxxxxx:xxxxx,Sender=xxxx;SMSCMsgId=2028eecc;Binary=1;DCS=8;Data=xxxxxxxxxxxxxx...
2008-01-09 20:16:05,4784E338.req,10.1.1.26,xxxxxxxx,Retry Pending - ERROR: Timeout waiting for response from server or lost connection -- SMPP - xxxxxxxxxxx:xxxxx,Sender=xxxxx........



I could use regular expressions to extract the data and then regular INSERTs to put it into the right table, but that amounts to doing a bulk insert by hand (and it may take much more time), which seems strange. Is there some additional tool (in the SQL package or external) that could assist with this?

Thanks, and sorry if this is double posted!
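A hedged sketch of one workable approach (table, column, and path names are assumptions): load each raw log line into a single wide column, then parse the comma-delimited prefix with string functions afterwards.

CREATE TABLE dbo.RawLog (LogLine NVARCHAR(MAX));

BULK INSERT dbo.RawLog
FROM 'C:\Logs\app.log'
WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\0');  -- whole line as one field

SELECT LEFT(LogLine, CHARINDEX(',', LogLine) - 1) AS LogTime,
       SUBSTRING(LogLine, CHARINDEX(',', LogLine) + 1, 4000) AS RestOfLine
FROM dbo.RawLog
WHERE CHARINDEX(',', LogLine) > 0;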

View 1 Replies View Related

Bulk Import Data To SQL Mobile

Jul 26, 2006

Hi!.

Is there something like BCP utility in SQL CE?

I need to periodically import a large amount of data into my CE database. When I tested the import over the network it took a lot of time. That's why I decided to send the raw data in ASCII files (because of their small size) and import the files into the CE database.

Certainly, it's not a problem to write such a CLI tool myself, but I'm interested in whether someone has already done this...

Thanks, Sandr

View 3 Replies View Related

SQL Server 2014 :: Bulk Import XML To Table

Jun 19, 2014

I found loads of things but nothing seems to work...

I'm trying to get XML data that sits behind a link/page into a table, but I can't find anything that works.
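A hedged sketch for loading an XML file into rows (the file path, element names, and target table are assumptions, not from the post; OPENROWSET(BULK ...) reads from a local or UNC path rather than a URL, so the XML is assumed to be saved to a file or share first):

DECLARE @x XML;

SELECT @x = CONVERT(XML, BulkColumn)
FROM OPENROWSET(BULK 'C:\Import\data.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.TargetTable (ItemId, ItemName)
SELECT n.value('@id', 'INT'),
       n.value('@name', 'NVARCHAR(100)')
FROM @x.nodes('/root/item') AS t(n);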

View 9 Replies View Related

Bulk Import Errors - Dbl Quotes Maybe Problem

Mar 18, 2008

Hi Guys, I'm trying to do a Bulk Insert but I am receiving the following error:

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 8 (Phone).
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 8 (Phone).
Task failed: dbo.tblNoName

The data is comma delimited, which I stipulated in my Import Connection section. But the data also has quotes around it (i.e. the field LName nvarchar(17) = "Brown" and the field Phone nvarchar(10) = "12345678921"). Is there a way to ignore the quotes, or do I have to remove them before I import?

Or is my problem something else all together?

The connection is solid:
Format = "Specify"
RowDelimiter = {CR}{LF}
columnDelimiter = Comma {,}

No other options are set.

The data looks like:
"tstLName","tstFname","000 N Tst DR","IDAHO sp","ID","00000000",

Thank you,
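One hedged workaround (not the only fix; the staging table and column names are placeholders) is to load the quoted values as-is into a staging table and strip the double quotes afterwards:

UPDATE dbo.tblNoName_Staging
SET LName = REPLACE(LName, '"', ''),
    Phone = REPLACE(Phone, '"', '');

In SSIS the cleaner route is usually to set the flat file connection's text qualifier to the double-quote character so the quotes never reach the table.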

View 2 Replies View Related

Bulk Import With Header In Text File

Nov 28, 2005

I would appreciate some help on a procedure that I have. Using Bulk Insert, I would like to import records from a text file. The issue I have is that the file contains a header - '1AMC_TO_Axiz' - and a footer - '1AMC_TO_Axiz2'. Using a format file, I can get the import to work by editing the file and removing these two entries. Is there a way to set up the format file to skip these two entries? My format file currently looks like this:

7.0
16
1 SQLCHAR 0 50 "|" 1 keyMemberNo
2 SQLCHAR 0 10 "|" 2 fldEffdate
.................
16 SQLCHAR 0 10 "\r\n" 16 fldNewRecord

Thanks,
Charles

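A hedged sketch of the usual workaround (path, table name, and the footer's row number are placeholders): FIRSTROW skips the header line, and if the total line count is known, LASTROW can skip the footer; otherwise the footer generally has to be stripped before the load.

BULK INSERT dbo.MemberImport
FROM 'C:\Import\members.txt'
WITH (FORMATFILE = 'C:\Import\members.fmt', FIRSTROW = 2, LASTROW = 100000);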
View 3 Replies View Related

Default Datatypes Using DTS Import

Dec 30, 2005

I have been inserting data from Access 2000 using the DTS import wizard. By default, I get the following datatype conversions:

Access 2000    SQL Server 2000
datetime       smalldatetime
text           nvarchar
memo           ntext

Is there a way to change the defaults? I would prefer to have varchar data in SQL Server instead of nvarchar, and datetime instead of smalldatetime.

Miranda

View 3 Replies View Related

Bulk Import Images To Binary Object In SQL 2000

Apr 3, 2006

I have a web application that I am rebuilding. I have many picture files that I want to take off the file system and move into SQL as blobs. I will create an index of UIDs against the file names, but I need a good way to bulk add the files to the database... any hints on code or tools would be a great help.
Thanks
 
Bill
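A hedged sketch of the T-SQL side (this needs SQL Server 2005 or later rather than SQL 2000, and the table, column, and path names are placeholders); the surrounding loop over file names would live in application code or dynamic SQL:

INSERT INTO dbo.Pictures (FileName, ImageData)
SELECT 'photo001.jpg', BulkColumn
FROM OPENROWSET(BULK 'C:\Images\photo001.jpg', SINGLE_BLOB) AS img;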

View 1 Replies View Related

Transact SQL :: Import Bulk CSV Files In Database Table

Nov 12, 2015

I have more than 500 CSV files with a similar structure [same column names and same data format]. I would like to load these files into a table on a SQL Server 2014 database.
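A hedged sketch (the file-list table, target table, and CSV layout are assumptions): given a pre-populated list of file paths, loop over it and run a dynamic BULK INSERT for each file.

DECLARE @path NVARCHAR(260), @sql NVARCHAR(MAX);

DECLARE file_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT FilePath FROM dbo.ImportFileList;

OPEN file_cur;
FETCH NEXT FROM file_cur INTO @path;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT dbo.TargetTable FROM ''' + @path
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM file_cur INTO @path;
END

CLOSE file_cur;
DEALLOCATE file_cur;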

View 21 Replies View Related

Simple Or Bulk Logged Recovery Model For Fastest Import ?

Dec 21, 2006

We have a SQL 2005 x64 database (data warehouse related), essentially a work area for us, that we truncate and re-populate via BCP weekly. (We don't back up the database at all.) From the perspective of data-import speed, what is the best recovery model to use: Bulk-Logged or Simple? (I have read the SQL 2005 BOL and don't find it particularly clear on this point.)

Barkingdog

P.S. Anyone know of an article listing "best practices" for high-speed data import?
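For reference, a hedged sketch of switching the recovery model around the weekly load (the database name is a placeholder); both SIMPLE and BULK_LOGGED allow minimally logged bulk operations, and SIMPLE also avoids the need for log backups:

ALTER DATABASE StagingDW SET RECOVERY BULK_LOGGED;
-- ... run the BCP / BULK INSERT loads here ...
ALTER DATABASE StagingDW SET RECOVERY SIMPLE;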

View 1 Replies View Related

Distributed Query: Import XML Using OpenRowSet Bulk From UNC - Access Denied

Feb 26, 2008

I'm experiencing issues importing XML data using a distributed query with the following statement, which is run from an XP client named WorkstationA connecting to SQL 2005 SP2 ServerB; the XML data is located on ServerC.


Ad hoc queries using OPENROWSET have been enabled and verified.


The SQL Server service is running using a domain user account with permissions to read the remote files. I have logged in locally to the SQL server and verified this. It still fails even if the SQL services are running using LocalSystem.

User on Workstation A is authenticated with Integrated security (SQL Admin) and has rights to read the XML files on ServerC.

WorkStationA = SQL2005 Mgt Studio running the query
ServerB = SQL2005 SP2
ServerC = XML data files


DECLARE @xml XML
SELECT @xml =CONVERT(XML, bulkcolumn, 2)
FROM OPENROWSET(BULK '\\SERVERC\SHAREPATH\DATAFILE.XML', SINGLE_BLOB) AS x
SELECT @xml


Results: Msg 4861, Level 16, State 1, Line 2

Cannot bulk load because the file "\\SERVERC\SHAREPATH\DATAFILE.XML" could not be opened. Operating system error code 5 (Access Denied).


The query fails when it is run from WorkstationA connected to SQL ServerB querying data on ServerC via a UNC path.
The query is successful when it is run from the local SQL ServerB. The problem is with distributed queries.
The query is successful when the XML files are local to the SQL server, including when referencing them via a local UNC path.

Thank you for any responses.



Hamish

View 4 Replies View Related

Bulk Insert Fails To Import Data Files Created On Unix

Sep 21, 2006

It seems to me that files created on Unix machines with the line terminator \n, or chr(10), cannot be imported using the Bulk Insert statement. Is this a bug, or an oversight by Microsoft? Does this mean that unless one replaces every \n with \r\n, there is no way to use Bulk Insert to import Unix files? This is very strange behavior by MSSQL. Even lesser programs such as Excel and Word automatically recognize chr(10) as a line termination character. Am I missing something, or is this just the way MSSQL is?
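A hedged sketch of the usual workaround (table name and path are placeholders): build the statement dynamically so the row terminator can be a bare line feed, CHAR(10); newer versions also accept the hexadecimal form ROWTERMINATOR = '0x0a' directly.

DECLARE @cmd NVARCHAR(1000);
SET @cmd = N'BULK INSERT dbo.TargetTable FROM ''C:\Import\unixfile.txt'' '
         + N'WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''' + CHAR(10) + N''')';
EXEC sp_executesql @cmd;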

View 7 Replies View Related

Csv File Import: Bulk Insert Data Conversion Error (type Mismatch)

Sep 27, 2004

Hi,

I am trying to import data from a CSV file into my table in SQL Server 2000. My table is called temp_table and consists of 3 fields.

column datatype
-------- -----------
program nvarchar(20)
description nvarchar(50)
pId int

pId has been set to primary key with auto_increment.

My csv file has 2 columns of data and it looks like follows:

program, description
"prog1", "this is program1"
"prog2", "this is program2"
"prog3", "this is program3"


Now I use BULK INSERT like this:

"BULK INSERT ord_programs FROM 'C:\datafile.csv' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)"

to import data into my table in SQL server and it gives me this error

"Bulk insert data conversion error (type mismatch) for row 2, column 3 (pId)"

I guess I have to use a format file or something, since I don't have anything for the pId field in the CSV file, to make it work...

Please help me out guys and please post a snippet of code if you have.

Thank You.
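A hedged sketch of one common workaround: bulk insert through a view that exposes only the two columns present in the CSV, so the identity column pId is simply not part of the load (the view name is an assumption).

CREATE VIEW dbo.v_temp_table_import
AS
SELECT program, description FROM dbo.temp_table;
GO

BULK INSERT dbo.v_temp_table_import
FROM 'C:\datafile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

The quotes around the CSV values would still come in as part of the data and may need stripping separately.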

View 2 Replies View Related

Default To NULL Instead Of Blank/empty String During Flat File Import

Aug 15, 2006

Hi,

In an SSIS flat file import using fast load, I'm trying to import data into previously created SQL 2005 tables.

The tables may contain columns that are NULLable BUT there is NO DEFAULT for them.

If the incoming data from the flat files contains nothing between the delimiters, how can I have a NULL value inserted in the column instead of a blank/empty string?

I didn't find an easy flag unless I'm doing something wrong. I know of at least two ways to do it the hard way:

1- set the DEFAULT(NULL) for EVERY column that needs this behaviour

2-set up some Derived Column option in the package to return NULL if the value is missing.

Both of the above are time consuming since I'm dealing with many tables. Is there a quick option to default the value to NULL when there is no data, else insert the data itself? So, the same behavior that I have right now, except that I want NULL in place of the empty string/blank in the varchar(x) columns.



Thanks

Anatole
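A hedged post-load alternative (not the SSIS-side flag being asked about; table and column names are placeholders): convert empty strings to NULL after the fast load completes.

UPDATE dbo.TargetTable
SET SomeVarcharCol = NULLIF(SomeVarcharCol, '');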

View 9 Replies View Related

SQL Server 2008 :: Bulk Import And Create New Table Based On Header Fields Of Imported File (XLXS)

Sep 11, 2015

I have data in an Excel file (Excel 2010) and I would like to bulk import it into SQL Server 2008 and also, while importing, have SQL Server automatically create a new table based on the header fields (first row) of the source file.

I am not sure if SQL Server 2008 has this capability.
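One hedged option (an assumption-laden sketch, not a wizard feature: it needs the Microsoft ACE OLE DB provider installed and 'Ad Hoc Distributed Queries' enabled, and the path and sheet name are placeholders) is SELECT ... INTO from OPENROWSET, which creates the new table using the spreadsheet's header row as column names:

SELECT *
INTO dbo.NewImportedTable
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Import\data.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');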

View 0 Replies View Related

SQL Tools :: Import Export Wizard Default Data Source Causing Error / Crash

Sep 1, 2015

I have several versions of SQL Server and have been using SQL 2008 on a regular basis due to this issue. In our SQL 2014, when I do the Import Data process it opens the dialog window, I hit Next, and the data source currently defaults to ".NET Framework Data Provider for IBM i" - when it does this it immediately errors out with:

"An error occurred which the SQL Server Integration Services Wizard was not prepared to handle.

Additional Information:
> Exception has been thrown by the target of an invocation (mscrolib)
>> Failed to find or load the registered .NET Framework Data Provider (System.Data)"

It immediately crashes/closes the Import/Export Wizard, leaving me unable to change the data source to what I need it to be.

My 2008 defaults to SQL Server Native Client 10.0 and does allow me to change to that same option (at which point it errors) but it does not close the wizard.

I need a way to either:

> Default the starting Data Source to be something else
> Fix whatever error is causing it to crash - I am at a loss as to what the error is looking for
> Not have the wizard crash whenever it defaults to this source.

Any of the above solutions would work fine - but at the moment I am unable to use the Import/Export wizard at all in SQL 2014.

View 3 Replies View Related

Conditional Subscription / Conditional Execution Of Report

Mar 7, 2008



Hello everyone,

Is there a way to execute a subscribed report based on certain criteria?

For example, send the report to users when the report contains data; if no data is returned by the query
executed by the report, then do not send the report to the users.

My current situation is that users say this should not happen: since no pertinent information is contained in the report, why would they receive an email with blank data in it?


Any help or suggestions will be much appreciated.

Thanks,
Larry
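One hedged approach is a data-driven subscription (typically an Enterprise-edition feature) whose recipient query returns no rows when the report would be empty, so no email goes out; the table and column names below are placeholders:

SELECT u.EmailAddress
FROM dbo.ReportUsers AS u
WHERE EXISTS (SELECT 1
              FROM dbo.ReportData
              WHERE ReportDate >= DATEADD(DAY, DATEDIFF(DAY, 0, GETDATE()), 0));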

View 6 Replies View Related

Bulk Insert - Bulk Load Data Conversion Error

Jan 17, 2008

I'm having some issues with bulk insert.

This is the table:

CREATE TABLE [dbo].[tmp_GA_status](

[GA_recno] [int] NOT NULL,

[GA_desc] [varchar](40) NULL

)


This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"


and this is the sql:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'

with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')



So yeah, pretty simple. But whatever I do I get this:

Msg 4864, Level 16, State 1, Line 1

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).



So what am I doing wrong ?

View 13 Replies View Related

I Don't Suppose BULK UPDATE Exists?... Like BULK INSERT?

Sep 27, 2007

I have to update a field within a table of 60 records or so. Each record has a different field value; its type is varchar. I was given an Excel file with the field values and was thinking of a bulk update like BULK INSERT, but I don't recall that being possible.

Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?

Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
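A hedged sketch of the staging route being described (table and column names are placeholders): bulk insert the spreadsheet data into a staging table, then update the target with a join.

UPDATE t
SET t.SomeField = s.NewValue
FROM dbo.TargetTable AS t
INNER JOIN dbo.StagingUpdates AS s
    ON s.RecordId = t.RecordId;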

View 1 Replies View Related

Cannot Fetch A Row From OLE DB Provider BULK With Bulk Insert Task

Nov 23, 2005

Hi, folks:

View 18 Replies View Related

Pros: How To Bulk Delete And Bulk Insert?

Oct 11, 2000

I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (Yields appx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (Yields appx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (Yields appx. 200,000 records)

Updates and cursors just seem to be too slow.

So far I have done the following, but was wondering if anyone could think of a better way.
SELECT 6 million records that don't need to be deleted into a #TempTable
Use statements above to select into same #TempTable
DROP and recreate Original Table
SELECT 6 + 2 million records INTO original table.

This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?


Any comments are appreciated,

-Marc

View 3 Replies View Related

Conditional Formatting - Not So Conditional??

Dec 15, 2006

I have the following code in the color property of a textbox. However, when I run my report all of the values in this column display in green regardless of their value.

=SWITCH(Fields!Wrap.Value >= 3, "Red", Fields!Wrap.Value < 3, "Green")

I already tried =iif(Fields!Wrap.Value >= 3 , "Red", "Green") and got the same results.

Is it because this is a matrix report? What am I doing wrong?

Thanks in advance . . .

View 4 Replies View Related

SQL Server 2012 :: Use Of Default Keyword As Parameter Default - What Value Is It

Aug 11, 2015

@pvColumnName  VARCHAR(100) = Default,  

However, I am unable to determine what the value of Default actually is. Is it ''?

Default is not permitted as a constant - below fails to parse:

WHERE t2.TABLE_TYPE = 'BASE TABLE'
AND (@pvColumnName = Default OR t1.[COLUMN_NAME] Like @vColumnName)

View 4 Replies View Related

BULK INSERT ERROR Using Format File - Bulk Load Data Conversion Error

Jun 29, 2015

I'm trying to use BULK INSERT for the first time and am getting the following error. I think it might have something to do with my format file; from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the "\r\n" terminator is correct for line 7, right?

Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

BULK INSERT tbl_ASX_Data_temp
FROM 'M:\Data\ASX\ImportTest.txt'
WITH (FORMATFILE = 'M:\Data\ASX\SQLFormatImport.Fmt')

[code]...

View 5 Replies View Related

Questions About Bulk Copy Insert Using 'Memory Based Bulk Copy Operations'

Feb 1, 2007

Hi~,

Before implementing memory-based bulk copy inserts with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know about some considerations:

- performance: how does it compare with T-SQL's "BULK INSERT ..." and the bcp utility?

- SQL Server resource usage: what is the impact on server resources while a memory-based bulk copy is running?

- server-side behavior: when the server is busy, does the delayed update mean IRowsetFastLoad::Commit(true) can still insert right away?

- row count: is there a limit on how many rows can be inserted with IRowsetFastLoad::InsertRow() before IRowsetFastLoad::Commit?

- any other guidelines

View 1 Replies View Related






