BULK IMPORT Stress
Jul 26, 2006
I am trying to import a data file, which is tab delimited, using BULK
INSERT. I have used BCP to create a format file, since the destination
table has around 20 columns, but the data file has only three.
Here's the problem: The columns I am trying to import comprise ID (an
int identity column), Name (a varchar(255) column) and Status (a
smallint column). The data file contains identity values for the first
column, so I am using the KEEPIDENTITY modifier. The Status column is
mandatory, so I have set all rows in the data file to zero for that
column. All of the other columns in the destination table either allow
NULL or have default values. When I BULK INSERT the file using the
format file, the identity values are NOT imported and the Status column
gets the value 3376. The Name column is the only one that gets imported
correctly. Here's the format file:
correctly. Here's the format file:
8.0
3
1 SQLINT      0 4 "\t"   1 ID     ""
2 SQLCHAR     0 0 "\t"   2 Name   SQL_Latin1_General_CP1_CI_AS
3 SQLSMALLINT 0 2 "\r\n" 4 Status ""
Sorry it's a bit messy.
Where is 3376 coming from, and why are my identity values for column ID
not being imported?
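For reference, a minimal sketch of the statement being described (the table name and paths are placeholders, not values from the post):
BULK INSERT dbo.DestTable
FROM 'C:\data\import.txt'  -- hypothetical path to the tab-delimited file
WITH (FORMATFILE = 'C:\data\import.fmt',
      KEEPIDENTITY);  -- keep the ID values supplied in the file
One thing worth checking: in a format file, SQLINT and SQLSMALLINT declare the file fields as raw binary, so for a plain text data file every field is normally declared SQLCHAR, whatever the destination column type is.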
May 25, 2004
Hi, I don't know how to do this:
I've got 100+ .CSV text files (50 MB in total, not each) that I need to import into a SQL Server database, but I don't know how.
I've tried hunting the web for a file concatenation program but just came up against pay-for software, which didn't help.
Any ideas? Would the BCP command do it?
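One way to do it without extra software, assuming xp_cmdshell is available (a sketch; every name and path here is hypothetical): list the files with dir, then run one BULK INSERT per file:
-- collect the file names from the folder
DECLARE @files TABLE (fname varchar(260));
INSERT INTO @files EXEC master..xp_cmdshell 'dir /b C:\csv\*.csv';
DECLARE @name varchar(260), @sql varchar(1000);
DECLARE file_cur CURSOR FOR SELECT fname FROM @files WHERE fname LIKE '%.csv';
OPEN file_cur;
FETCH NEXT FROM file_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- load each file into the same destination table
    SET @sql = 'BULK INSERT dbo.Target FROM ''C:\csv\' + @name
             + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC (@sql);
    FETCH NEXT FROM file_cur INTO @name;
END
CLOSE file_cur;
DEALLOCATE file_cur;
BCP called in a batch-file loop would do the same job from the command line, so yes, it could do it too.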
Feb 26, 2013
This is my source data in CSV format:
4,23,2AY5623,7235623
4,23,2GP1207,1451207
4,23,2GQ6689,4186689
Table:
CREATE TABLE [dbo].[Table1](
[idCodeLevel] [int] NOT NULL,
[idFirm] [int] NOT NULL,
[valCodeFrom] [varchar](15) NOT NULL,
[valCodeTo] [varchar](15) NOT NULL
) ON [PRIMARY]
[code]....
I googled and found out that I might have to use a format (.fmt) file. But how can I create one for a CSV file? I have only seen code to create an fmt file from a SQL table.
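Since the CSV here has the same four fields as the table, a format file may not even be needed; a sketch (the path is hypothetical):
BULK INSERT dbo.Table1
FROM 'C:\data\codes.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
-- and if a format file is wanted anyway, bcp can generate one from the table:
-- bcp YourDb.dbo.Table1 format nul -c -t, -f Table1.fmt -S yourserver -T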
Jul 11, 2001
Hi folks,
We are trying to import a flat file from a remote machine using the BULK INSERT command.
We have mapped the remote directory on the server.
We have used the following command:
Bulk insert table_name from '\\my_server\my_share\filename.txt'
The error it gives is operating system error number 5 (Access denied).
We are able to access the same file from Windows Explorer. We also referred to Books Online and, as suggested,
have set the system path to the same share name.
But it does not work.
Could you help us in this regard.
Thanks.
-Rajesh
Jun 25, 2014
I have imported data in my table using the bulk insert command. I was supposed to fill specific columns of my table with that data so I used a view to put them in the column I wanted.
The table looks like this now:
id | id_param | val_param
+-----------+--------------+
1 | no_tel | 742062141
2 | sex | 1
3 | age | 23
4 | no_tel | 765234157
5 | sex | 1
6 | age | 34
When I want to select only the val_param that is = 1 for id_param = 'sex', using this query:
select * from bd_rox where id_param='sex' and val_param='1'
it returns no rows and I don't know why. The wanted result should look like this:
id | id_param | val_param
+-----------+--------------+-
2 | sex | 1
5 | sex | 1
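A diagnostic sketch (using the bd_rox table named above): bulk-loaded values often carry invisible trailing characters that a plain equality test then misses, and DATALENGTH exposes them:
SELECT id, id_param, val_param,
       DATALENGTH(id_param)  AS id_param_bytes,   -- 'sex' should be 3 bytes in varchar (6 in nvarchar)
       DATALENGTH(val_param) AS val_param_bytes
FROM bd_rox;
-- if a stray carriage return from the data file is the culprit, this should match:
SELECT * FROM bd_rox
WHERE REPLACE(RTRIM(id_param), CHAR(13), '') = 'sex'
  AND REPLACE(RTRIM(val_param), CHAR(13), '') = '1';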
Mar 11, 2008
I have a log file, and I am trying to import its data into SQL in order to analyze it (be able to query the data), however that task seems impossible.
In fact the log file contains a varying number of column fields (errors logged and the various types of data logged demand varying numbers of columns). More than that, the fields themselves are hard to extract.
An example of data in my log (xxxxxxxx is some alphanumeric chars):
2008-01-09 20:16:05,4784E36F.req,10.1.1.26,xxxxxxxxx,OK -- SMPP - xxxxxxx:xxxxx,Sender=xxxx;SMSCMsgId=2028eecc;Binary=1;DCS=8;Data=xxxxxxxxxxxxxx...
2008-01-09 20:16:05,4784E338.req,10.1.1.26,xxxxxxxx,Retry Pending - ERROR: Timeout waiting for response from server or lost connection -- SMPP - xxxxxxxxxxx:xxxxx,Sender=xxxxx........
I could use regular expressions to extract the data and maybe a regular INSERT to put it in the right table, but that amounts to a manual Bulk Insert (and it may take much more time), which seems strange. Can I use some additional tool (in the SQL package or external) to assist with this?
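One pattern that sidesteps the varying column count (a sketch; the table and path are hypothetical): bulk load each whole line into a one-column staging table, then pick the lines apart with string functions, or with regular expressions in client code:
-- staging table holds one raw log line per row
CREATE TABLE dbo.LogStaging (raw_line varchar(4000) NULL);
BULK INSERT dbo.LogStaging
FROM 'C:\logs\app.log'
WITH (ROWTERMINATOR = '\n');  -- assumes the lines contain no tabs, the default field terminator
-- example parse of the fixed leading fields
SELECT LEFT(raw_line, 19) AS logged_at,
       SUBSTRING(raw_line, 21, CHARINDEX(',', raw_line, 21) - 21) AS request_file
FROM dbo.LogStaging;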
Thanks, and sorry if this is double posted!
Feb 27, 2008
Let me preface this request with the info that I am relatively new to SQL Server, so I may be asking something that is really basic and/or not a best practice, but here goes... I need to import data from an Excel spreadsheet; one of the columns may be null or may have an integer value. I'd like to replace any null values with a 1 during import so that calculations can be done with the field once the data are imported. Can someone give me an example of how to do this? I had planned to use the bulk insert option, but if there's a better way please let me know. Thanks in advance for any advice.
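A sketch of the staging-table approach (all names are hypothetical): load into a table that accepts NULLs, then substitute the 1 on the way to the real table:
CREATE TABLE dbo.StageSheet (item_code varchar(50) NULL, qty int NULL);
-- the spreadsheet saved as CSV; FIRSTROW = 2 skips the header row
BULK INSERT dbo.StageSheet
FROM 'C:\data\sheet.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
-- NULLs become 1 as the rows move into the destination table
INSERT INTO dbo.RealTable (item_code, qty)
SELECT item_code, ISNULL(qty, 1)
FROM dbo.StageSheet;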
Jul 26, 2006
Hi!.
Is there something like BCP utility in SQL CE?
I need to import (periodically) a large amount of data into my CE database. When I tested the import over the network it took a lot of time. That's why I decided to send the raw data in ASCII files (because of the small size) and to import the files into the CE database.
Certainly, it's not a problem to write that CLI myself, but it's interesting to know if someone has already done this...
Thanks, Sandr
Jun 19, 2014
I found loads of things but nothing seems to work...
I'm trying to get a link with XML data inside the page into a table, but I can't find anything that works.
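BULK INSERT and OPENROWSET cannot fetch a URL themselves, but if the XML can be saved to a file first, a sketch like this loads and shreds it (the path, table, and element names are all hypothetical):
DECLARE @x xml;
-- read the whole file as a single value
SELECT @x = CONVERT(xml, BulkColumn)
FROM OPENROWSET(BULK 'C:\data\feed.xml', SINGLE_BLOB) AS src;
-- shred the repeating elements into rows
INSERT INTO dbo.Items (item_id, item_name)
SELECT n.value('(id)[1]', 'int'),
       n.value('(name)[1]', 'varchar(100)')
FROM @x.nodes('/root/item') AS t(n);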
Mar 18, 2008
Hi guys, I'm trying to do a Bulk Insert but I am receiving the following error:
"Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 8 (Phone). Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 8 (Phone)."
Task failed: dbo.tblNoName
The data is comma delimited, which I stipulated in my Import Connection section. But the data also has quotes around it (i.e. the field Lname nvarchar(17) = "Brown" and the field Phone nvarchar(10) = "12345678921"). Is there a way to ignore the quotes, or do I have to remove them before I import?
Or is my problem something else all together?
The connection is solid;
Format = "Specify"
RowDelimiter = {CR}{LF}
columnDelimiter = Comma {,}
No other options are set.
The data looks like:
"tstLName","tstFname","000 N Tst DR","IDAHO sp","ID","00000000",
Thank you,
Nov 28, 2005
I would appreciate some help on a procedure that I have. Using Bulk Insert, I would like to import records from a text file. The issue I have is the file contains a header, '1AMC_TO_Axiz', and a footer, '1AMC_TO_Axiz2'. Using a format file, I can get the import to work by editing the file and removing these two entries. Is there a way to set up the format file to skip these two entries? My file currently looks like this:
7.0
16
1 SQLCHAR 0 50 "|" 1 keyMemberNo
2 SQLCHAR 0 10 "|" 2 fldEffdate
...
16 SQLCHAR 0 10 "\r\n" 16 fldNewRecord
Thanks
Charles
Apr 3, 2006
I have a web application that I am rebuilding. I have many picture files that I want to take off the file system and move into SQL as blobs. I will create an index of uids against the file names, but I need a good way to bulk add the files to the database... any hints on code or tools would be a great help.
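If the server is SQL 2005, one hint: OPENROWSET with SINGLE_BLOB reads a whole file as one value, so a driver script over the file-name index can insert each picture with a statement like this (table, columns, and path are hypothetical):
INSERT INTO dbo.Pictures (uid, file_name, img)
SELECT NEWID(), 'photo1.jpg', BulkColumn
FROM OPENROWSET(BULK 'C:\images\photo1.jpg', SINGLE_BLOB) AS b;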
Thanks
Bill
Nov 12, 2015
I have more than 500 CSV files with a similar structure [same column names and same data format]. I would like to load these files into a table in a SQL Server 2014 database.
Dec 21, 2006
We have a SQL 2005 x64 database (data-warehouse related), essentially a work area for us, that we truncate and re-populate via BCP weekly. (We don't back up the database at all.) From the perspective of data-import speed, what is the best recovery model to use: Bulk-Logged or Simple? (I have read the SQL 2005 BOL and don't find it particularly clear on this point.)
Barkingdog
P.S. Anyone know of an article listing "best practices" for high-speed data import?
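For what it's worth, a sketch of the switch itself (the database name is hypothetical); under either model the BCP loads stay minimally logged as long as they take a table lock and the tables aren't replicated, and with no backups being taken at all, Simple also removes any log-backup bookkeeping:
ALTER DATABASE WorkDW SET RECOVERY SIMPLE;  -- or BULK_LOGGED
-- ... then truncate and re-run the BCP loads (with TABLOCK) ...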
Feb 26, 2008
I'm experiencing issues importing XML data using a distributed query with the following statement which is run from an XP client named WorkstationA connecting to SQL2005 SP2 ServerB, the XML data is located on ServerC.
AdHoc Queries using OpenRowSet has been enabled and verified.
The SQL Server service is running using a domain user account with permissions to read the remote files. I have logged in locally to the SQL server and verified this. It still fails even if the SQL services are running using LocalSystem.
User on Workstation A is authenticated with Integrated security (SQL Admin) and has rights to read the XML files on ServerC.
WorkStationA = SQL2005 Mgt Studio running the query
ServerB = SQL2005 SP2
ServerC = XML data files
DECLARE @xml XML
SELECT @xml =CONVERT(XML, bulkcolumn, 2)
FROM OPENROWSET(BULK '\\SERVERC\SHAREPATH\DATAFILE.XML', SINGLE_BLOB) AS x
SELECT @xml
Results: Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\SERVERC\SHAREPATH\DATAFILE.XML" could not be opened. Operating system error code 5 (Access denied).
The query fails when it is run from Workstation A connected to SQL ServerB querying data on ServerC via a UNC.
The query is successful when it is run from the local SQL ServerB. The problem is with distributed queries.
The query is successful when the XML files are local to the SQL server, including referencing them via a local UNC path.
Thank you for any responses.
Hamish
Sep 21, 2006
It seems to me that files created on Unix machines with the line terminator LF, or chr(10), cannot be imported using the Bulk Insert statement. Is this a bug, or an oversight by Microsoft? Does this mean that unless one replaces every LF with CR/LF, there is no way to use Bulk Insert to import Unix files? This is a very strange behavior by MSSQL. Even lesser programs such as Excel and Word automatically recognize chr(10) as a line termination character. Am I missing something, or is this just the way MSSQL is?
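A commonly cited workaround (a sketch; the table and path are hypothetical): pass the terminator through dynamic SQL so that it is a literal LF byte, since the string '\n' in a BULK INSERT clause is treated as CR/LF:
DECLARE @sql varchar(500);
SET @sql = 'BULK INSERT dbo.UnixData FROM ''C:\data\unix.txt'''
         + ' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''' + CHAR(10) + ''')';
EXEC (@sql);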
Sep 27, 2004
Hi,
I am trying to import data from a csv file into my table in SQL Server 2000. My table is called temp_table and consists of 3 fields.
column datatype
-------- -----------
program nvarchar(20)
description nvarchar(50)
pId int
pId has been set to primary key with auto_increment.
My csv file has 2 columns of data and it looks like follows:
program, description
"prog1", "this is program1"
"prog2", "this is program2"
"prog3", "this is program3"
Now I use BULK INSERT like this:
"BULK INSERT ord_programs FROM 'C:\datafile.csv' WITH (FIELDTERMINATOR=',', ROWTERMINATOR='\n', FIRSTROW=2)"
to import data into my table in SQL Server, and it gives me this error:
"Bulk insert data conversion error (type mismatch) for row 2, column 3 (pId)"
I guess I have to use a format file or something, since I don't have anything for the pId field in the csv file, to make it work...
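A format file is indeed one way through this: map only the two file fields and leave pId out entirely, so the identity generates itself. A sketch (the lengths and collation are assumed, and the quotes in the data will still arrive as literal characters unless the terminators account for them):
8.0
2
1 SQLCHAR 0 20 ","     1 program     SQL_Latin1_General_CP1_CI_AS
2 SQLCHAR 0 50 "\r\n"  2 description SQL_Latin1_General_CP1_CI_AS
-- then: BULK INSERT ord_programs FROM 'C:\datafile.csv'
--       WITH (FORMATFILE = 'C:\programs.fmt', FIRSTROW = 2)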
Please help me out guys and please post a snippet of code if you have.
Thank You.
Sep 11, 2015
I have records in Excel format (Excel 2010) and I would like to bulk import them into SQL Server 2008, and also have SQL Server, while importing, automatically create a new table based on the header fields or first row of the source file.
I am not sure if SQL Server 2008 has this capability.
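BULK INSERT never creates the destination table, but SELECT ... INTO over the Excel OLE DB provider can, taking the column names from the header row. A sketch (it assumes the ACE provider is installed on the server and ad hoc distributed queries are enabled; the path and sheet name are hypothetical):
SELECT *
INTO dbo.ImportedSheet
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\data\records.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');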
May 1, 2001
Hi,
Is there any way to do load testing/stress testing on SQL Server? I want to execute a stored procedure with concurrent users.
Thanks
JK
Oct 26, 2000
Can anyone tell me where I can download a stress testing tool for SQL Server 7.0?
Thank You,
Piyush Patel
Dec 24, 2002
Anyone have a tool or script to run to test a SQL server under heavy stress?
Sep 22, 1998
What exactly is `Stress Management`?
Are there any tools for assessing Stress Management?
May 18, 2001
Anyone know a decent stress-testing tool for SQL Server 2000? I need to simulate many concurrent users executing queries...
Regards,
/S
Jan 28, 2005
What do you recommend for stress-testing the performance of key stored procedures (they have been identified) for our application? The parameters can be programmatically selected, for example:
Select 'exec my_proc @id = ' + Cast(id As varchar)
From myTable
Where foo = 'bar'
I have the Support Tools Available For Stress Testing & Performance Analysis (http://www.microsoft.com/downloads/details.aspx?FamilyId=5691AB53-893A-4AAF-B4A6-9A8BB9669A8B&displaylang=en) from Microsoft's site and they are pretty good.
Recommendations appreciated, and thanks in advance!
Nov 20, 2006
Hi,
I have a new server with 32 GB of RAM installed, and I have user databases on this server. I am using SQL Server 2000 Enterprise Edition and the platform is Windows 2003 Advanced Server, which supports up to 128 GB of memory.
sp_configure 'awe enabled' is set to 1 and, at the OS level, AWE is enabled as well.
max server memory (MB) is 2147483647
I was doing some stress tests on this server but memory usage doesn't go beyond 180 MB... can someone suggest a test for physical RAM?
How can I make sure that the application will make full use of the available physical memory?
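One quick check that works on SQL 2000 (a sketch; it assumes a default instance, whose performance objects are named SQLServer:...): compare the memory the server has actually committed against its target, since Task Manager's working-set figure is known to ignore AWE-mapped pages:
SELECT RTRIM(counter_name) AS counter_name, cntr_value / 1024 AS mb
FROM master.dbo.sysperfinfo
WHERE object_name = 'SQLServer:Memory Manager'
  AND counter_name LIKE '%Server Memory%';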
Rgds
Wilson
Dec 27, 2007
Hi all,
We have configured a DR server for our production server for database mirroring.
But before bringing the DR server live, we will set up mirroring between our DR and TEST servers for stress testing on the new DR server.
So, what is the best way to stress test the DR server before bringing it live?
Thanks.
Mar 20, 2001
I am looking to purchase a tool with which I can perform stress tests on SQL 7.0.
Any recommendation will be appreciated.
Mar 8, 2002
I am testing different RAID setups in a data warehousing environment, e.g.:
3 x RAID 5 (with four filegroups)
3 x RAID 10 (with four filegroups)
At the moment I have the chance to trial different hardware configurations,
and I want to measure what the performance differences are.
I need to stress test these configs. Are there any tools I can use to do this?
Are there any loads I can use? What should I measure?
Are there any articles for this kind of environment?
Any help would be great
Pete
Jan 11, 2005
I have been asked to perform a performance stress test on a SQL server with new hardware that we are going to be receiving.
How have some of you performed your stress analysis against new or existing hardware?
The hardware that I am going to receive will have to be configured within a high availability environment. I want to take this opportunity to really put a beat down on this server.
Thank you all for your suggestions.
Apr 2, 2007
I have a Windows 2003 Server running SQL 2005. The server has 32 GB of memory and I have enabled AWE in SQL. I have also configured the min and max SQL memory as 1 GB and 28 GB, respectively. However, this server currently has very low activity so I'm not sure whether my AWE-related changes worked. SQLSERVR.EXE process takes up about 100 MB of memory. Is there any tool or scripts that I can use to memory stress SQL to confirm that AWE is really in effect ?
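A sketch of a direct check on SQL 2005 (no load needed): the memory-clerk DMV reports how much memory has been allocated through AWE, and a zero total means AWE is not actually mapping anything:
SELECT SUM(awe_allocated_kb) / 1024 AS awe_allocated_mb
FROM sys.dm_os_memory_clerks;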
Dec 5, 2006
Hi,
We have built two testing apps for sending and receiving files across the network reliably, using SQL Express as the database backend. The apps seem to be working fine under light load. However, during stress tests, we always get the following exception:
"System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."
During stress test, both the sender and receiver are running on the same machine. Sender creates file fragments, store them in the sender database and then send out to the network. File fragments will be deleted from the sender database when the sender receives acknowledgement from the receiver. On the receiver side, file fragments will be stored in the receiver database as they are coming in from the network. Corresponding file fragments will be deleted from the receiver database when a complete file is received.
There is a maximum of about 1500 updates and 1500 deletes per second on the sender database. On the receiver side, the maximum is about 300 updates and 300 deletes per second. Our goal is to send 30 GB of data (it should run for about 10 hrs). As said before, we have never had a successful complete test run; a "timeout" exception is always thrown from the sender app (when it tries to end a transaction). It could happen as early as 1.5 hrs after we start the test. Note that although we are sending 30 GB of data, at any point in time the database shouldn't be too big (it should stay well within the 4 GB limit) because we delete file fragments relatively soon.
Next we changed the "Query Wait" setting (in the Management Studio Advanced settings) from the default "-1" to a very big number, and then we had a successful run of sending 30 GB of data.
- First of all, are we not doing this properly in terms of dealing with SQL Express? Is SQL Express able to handle long-running heavy-load transactions for hours?
- We also noticed that even before we got the timeout exception, the memory usage of sqlserver.exe keeps growing. Maybe it doesn't have a chance to clean up internally. If the app hammers SQL Express for hours, I wonder how it handles fragmentation? I assume it needs some sort of defragmentation, otherwise performance will degrade significantly...
- It seems like the Query Wait setting plays an important role here; any guideline on how to pick a reasonable value? Or should we pick a relatively small number and then retry in our app when we get timeout exceptions?
- Is it possible that we are running into some SQL Express resource limits? Any idea of how we can tell, other than the VM size of sqlserver.exe?
Any help or suggestions would be greatly appreciated!
Thanks very much
W Wong
Mar 2, 2006
I am currently working on a project which needs to load over 1000 XML files. The files are stored across 10 subfolders. I am using a foreach loop with a file enumerator, which is configured at the top of the folder structure and traverses the subfolders. This loops through the files, loads the data and then moves each file to another folder. The package executes fine for a few hundred files but then hangs; this happens after a different number of processed files each time the package is run. While trying to resolve the problem we ran performance counters and noticed that the number of open handles increased significantly just about the time DTExec looked like it had hung, and DTExec also then started taking a lot of the CPU processing time. Has anyone else come across similar situations?
Jan 17, 2008
I'm having some issues with bulk insert.
This is the table:
CREATE TABLE [dbo].[tmp_GA_status](
[GA_recno] [int] NOT NULL,
[GA_desc] [varchar](40) NULL
)
This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"
and this is the sql:
bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')
so yeah, pretty simple. But whatever I do I get this:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).
So what am I doing wrong?