How To Import Data Using BCP Without A Format File Or Table Layout
Mar 20, 2008
I have several bcp output files I need to import into tables. I do not have format files for them. As far as I know they are in native format. I do not know the layout of the destination table they would populate.
1) How can I determine, from the bcp file itself, the schema of the destination table? Once I know that, I should be able to import the data into the table.
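As far as I know, a native-mode bcp file carries no column names or types; it is just row data in SQL Server's internal binary layout, so the destination schema cannot be read back out of the file itself. The layout has to come from the source system, ideally as a format file generated against the original table. A sketch of the relevant commands, with server, database, and table names as placeholders:

rem generate a format file from the original table, if it still exists anywhere
bcp SourceDb.dbo.SourceTable format nul -n -f SourceTable.fmt -S sourceserver -T

rem once a matching destination table has been created, import the native file
bcp TargetDb.dbo.TargetTable in datafile.dat -n -S targetserver -T

If the source table is gone, the practical route is to recover its CREATE TABLE definition from documentation or a backup, recreate it, and then run the -n import.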
I have a huge Excel file that I want to export to a SQL Server database. One of the fields contains a mix of numeric and alphanumeric values. When I import the Excel file into the SQL Server database, the numeric values import successfully but the alphanumeric values do not import at all.
Does anyone know how to solve my problem?
I am using SQL 2k and the spreadsheet is excel 2003
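What usually bites here is that the Jet/Excel driver guesses each column's type from the first few rows, so a mostly-numeric column gets typed as numeric and the alphanumeric values come through as NULL. A hedged workaround is to read the sheet with IMEX=1, which treats mixed columns as text (the file path and sheet name below are examples, and 'Ad Hoc Distributed Queries' has to be enabled):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\imports\data.xls;HDR=YES;IMEX=1',
                'SELECT * FROM [Sheet1$]');

The TypeGuessRows registry setting controls how many rows the driver samples before deciding on a type.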
I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."
Seems simple, right?
I'm hitting all sorts of surprising data conversion errors. I used the export wizard to create the export package, and that works fine. However, using the same flat file definition, the import package fails -- even when I have no destination. That is, I have just one data flow task that contains only one component: the Flat File source. When I run the package, the flat file definition fails with data type conversion and truncation errors. One of the obvious errors involves boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, and the exported data contains the literal text values "TRUE" and "FALSE". So SSIS converts a SQL data type of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?
Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
I have tried this (file name vendor_insertFromFile2.sql):

BULK INSERT (dbo.vendors VendorName, Street, City, Region, Country, PostalCode, Telephone, PortalId, Fax, Email, Website, CreatedDate, Unit, LastName, Cell)
from 'D:AccessNAICSxPortal0_SampleData.txt'

and get this error message:

Server: Msg 170, Level 15, State 1, Line 2
Line 2: Incorrect syntax near '('.

The data in the file looks like this:

"CompanyName","Address","City","State","country","fullzip","FullPhone","Portal","Fullfax","Email","URL","date","ID","Contact_1","800No"
"DAVID STRAWN","460 FLOWERING TRL","GRAYSON","GA","United States","30017","770-277-8709",0,"","bkroom@bellsouth.net","WWW.BACKROOM4MEN.COM","09/02/05",1,"DAVID STRAWN",
"BRADRUSS","3353 S MAIN ST STE 130","SALT LAKE CITY","UT","United States","84115","801-975-0374",0,"","rusbrad@msn.com","WWW.BADANDNASTY.COM","09/02/05",2,"BRAD JENSEN",

Thanks in advance for any pointers on what is wrong, or a link with working examples.
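The Msg 170 syntax error is because BULK INSERT does not accept a column list in parentheses; it takes only the target table, the file, and a WITH clause, and any column mapping has to be done through a format file or a staging table. A minimal sketch of the statement shape, assuming the file's columns line up with the table's and the quoted header row is skipped with FIRSTROW (the path is kept exactly as in the post, which appears to have lost its backslashes):

BULK INSERT dbo.vendors
FROM 'D:AccessNAICSxPortal0_SampleData.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

Because every value is wrapped in double quotes, the quotes will land in the data; a format file whose terminators include the quotes, or a staging table plus REPLACE, is the usual way to strip them.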
If I make an XML format file which looks like this:

<?xml version="1.0"?>
<BCPFORMAT ...>
  <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=" " MAX_LENGTH="20"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ABLBELNR" xsi:type="SQLNVARCHAR"/>
  </ROW>
</BCPFORMAT>

then the import fails with the following error message:

[Microsoft][SQL Native Client]All bound columns are not read-only
If I delete the identity column, then the import works correctly.
What can I do, so that I can import with the identity column ?
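One workaround that sidesteps the identity column entirely: bulk load into a staging table that has only the data columns, then INSERT ... SELECT into the real table and let the identity be generated there. A sketch, assuming the real table is dbo.TargetTable with an identity Id plus the ABLBELNR column (all names and paths here are illustrative, not from the post):

CREATE TABLE dbo.TargetTable_staging (ABLBELNR NVARCHAR(20));

BULK INSERT dbo.TargetTable_staging
FROM 'D:\data\ablbelnr.dat'
WITH (FORMATFILE = 'D:\data\ablbelnr.xml');

INSERT INTO dbo.TargetTable (ABLBELNR)
SELECT ABLBELNR FROM dbo.TargetTable_staging;

If the identity values are themselves in the data file and need to be preserved, bcp's -E switch (or BULK INSERT's KEEPIDENTITY option) is the relevant setting instead.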
Hi! I am new to SQL Server... looking for some veteran assistance.
"Data Integrity Report"
I need a Stored Procedure that takes a table name as a parameter and returns a cursor suitable as a data source for a pre-built Report Services report (I guess Report Services would call the SP?).
The cursor/report needs to have the following columns:

1. Ordinal_Position (i.e. column number)
2. Column_Name
3. Number of blank rows (how many missing values for this column in this table)
4. Difference (between total rowcount and the population of this column)
5. Data_Type
6. Column_Length (either Character_Maximum_Length or the numeric widths rolled up with COALESCE?)
7. Sample Data (the contents of the "first" row in the table, based on a TOP(1) and ORDER BY xxx)

The report should look like this (for a table with 100 rows):

Col Num  Col Name  # Blanks  Difference  Data Type  Col Length  Sample Data
1        Name      12        88          varchar    30          Sally Smith
2        Address   34        66          varchar    45          123 Main St Apt 45
3        Acct_ID   0         100         varchar    4           AB12345
Using the "Information_Schema.Columns" I can get everything I need except for #3 (blanks count) and #7 (Sample data).
Is it possible to do this as 1 query, with a CTE or APPLY or something, or do I need to do a table variable based on the Information_Schema and then use dynamic SQL and row-by-row COUNT(*) for each column? And the same for the Sample Data.
Sorry for the long post, and thanks in advance! John
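As far as I can tell this cannot be collapsed into one plain set-based query, because both the blank count and the sample value need each column name spliced into the SQL text; a loop over INFORMATION_SCHEMA.COLUMNS plus sp_executesql is the straightforward route. A minimal sketch under those assumptions (table assumed to be in dbo, "blank" treated as NULL or empty after trimming, and image/varbinary columns will not cast cleanly):

CREATE PROCEDURE dbo.usp_DataIntegrityReport
    @TableName sysname
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Results TABLE
    (
        Ordinal_Position int,
        Column_Name      sysname,
        Blank_Rows       int,
        Difference       int,
        Data_Type        nvarchar(128),
        Column_Length    int,
        Sample_Data      nvarchar(4000)
    );

    DECLARE @Total int, @sql nvarchar(max);
    DECLARE @Ord int, @Col sysname, @Type nvarchar(128), @Len int;
    DECLARE @Blanks int, @Sample nvarchar(4000);

    -- total row count for the table
    SET @sql = N'SELECT @cnt = COUNT(*) FROM dbo.' + QUOTENAME(@TableName) + N';';
    EXEC sp_executesql @sql, N'@cnt int OUTPUT', @cnt = @Total OUTPUT;

    DECLARE col_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT ORDINAL_POSITION, COLUMN_NAME, DATA_TYPE,
               COALESCE(CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION)
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = @TableName;

    OPEN col_cur;
    FETCH NEXT FROM col_cur INTO @Ord, @Col, @Type, @Len;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- #3: blanks = NULL or empty string after trimming
        SET @sql = N'SELECT @b = COUNT(*) FROM dbo.' + QUOTENAME(@TableName)
                 + N' WHERE ' + QUOTENAME(@Col) + N' IS NULL OR LTRIM(RTRIM(CAST('
                 + QUOTENAME(@Col) + N' AS nvarchar(4000)))) = N'''';';
        EXEC sp_executesql @sql, N'@b int OUTPUT', @b = @Blanks OUTPUT;

        -- #7: sample value; add an ORDER BY on whatever key defines the "first" row
        SET @sql = N'SELECT TOP (1) @s = CAST(' + QUOTENAME(@Col) + N' AS nvarchar(4000)) FROM dbo.'
                 + QUOTENAME(@TableName) + N';';
        EXEC sp_executesql @sql, N'@s nvarchar(4000) OUTPUT', @s = @Sample OUTPUT;

        INSERT @Results
        VALUES (@Ord, @Col, @Blanks, @Total - @Blanks, @Type, @Len, @Sample);

        FETCH NEXT FROM col_cur INTO @Ord, @Col, @Type, @Len;
    END
    CLOSE col_cur;
    DEALLOCATE col_cur;

    SELECT * FROM @Results ORDER BY Ordinal_Position;
END

Reporting Services can then point a dataset at EXEC dbo.usp_DataIntegrityReport 'SomeTable'.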
I need to import a few tables from MS Access into MS SQL Server, but the table structure in Access is always different, and I would like the destination table in SQL Server to match it.
Therefore I would like the table to be created in SQL Server at runtime, according to the structure of the Access table being read.
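SELECT ... INTO over OPENROWSET will create the destination table at runtime with whatever columns the Access table happens to have. A sketch, assuming the Jet OLE DB provider is installed on the SQL Server machine and 'Ad Hoc Distributed Queries' is enabled (the .mdb path and table names are examples):

SELECT *
INTO dbo.ImportedFromAccess   -- created on the fly with the source layout
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\Source.mdb'; 'Admin'; '',
                'SELECT * FROM SomeAccessTable');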
Hi, I have a column of data whose value is 15.678, but in Excel I format it to 15.68 (two decimal places, so in Excel I see 15.68). When I import the data from Excel into SQL Server using an ODBC connection, it still comes through as 15.678. How can I get 15.68 instead of 15.678 (what I see is what I get)? Thanks for the help.
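Excel's cell format only changes what is displayed; the stored value is still 15.678, and that is what the ODBC driver hands to SQL Server. Rounding therefore has to happen on the SQL side, for example (table and column names are illustrative):

SELECT ROUND(Amount, 2) AS Amount FROM dbo.ImportedData;
-- or, once the data is already in the table:
UPDATE dbo.ImportedData SET Amount = ROUND(Amount, 2);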
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and runs successfully on the server. I want to import the data drawn from the view into a table using the SQL Server Import and Export Wizard. However, when I run the wizard on the server, it gives me the following error message and stops on the step Setting Source Connection:
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)

Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed. (SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Has anyone encountered this problem before, and do you know what is happening?
Hi, I am trying to use BULK INSERT with a format file. All of our data files have a few bytes of header which I would like to skip before doing the BULK INSERT. Is it possible to write a format file that skips these few bytes of header? For example, I have a 1 GB data file with a 1000-byte header. Except for the first 1000 bytes, the rest of the data is fine for BULK INSERT. Thanks in advance. Sorry if it is really a dumb question, as I am new to BULK INSERT and still practicing. Bob
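As far as I know a format file only describes the fields inside each row, so it cannot skip a block of bytes at the start of the file. If the 1000-byte header happens to end with the same row terminator as the data, FIRSTROW can jump over it; otherwise stripping the header in a preprocessing step before the load is the usual workaround. A sketch of the first case (paths and names are examples):

BULK INSERT dbo.TargetTable
FROM 'D:\data\bigfile.dat'
WITH (FORMATFILE = 'D:\data\bigfile.fmt', FIRSTROW = 2);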
I have one column in SQL Server 2005 of data type VARCHAR(4000).
I imported SQL Server 2005 database data into an .mdb file. After the import, that column's data type was converted to the Memo type in the Access database.
Now when I try to import the data from this MS Access file (db1.mdb) into another SQL Server 2005 database, I get a Unicode conversion error for the Memo data type in the Export/Import Data Wizard.
Could you please let me know what is the reason?
I know that the Memo data type is not supported in SQL Server 2005.
I am with SQL Server 2005 Standard Edition with SP2.
Please help me to understand this issue correctly.
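One thing worth trying, as a hedged workaround rather than a definitive answer: instead of letting the wizard map the Memo column, pull the Access data through OPENROWSET with an explicit cast, so the Memo (which comes across as ntext) is converted to nvarchar(max) on the way in. Provider, path, and names below are assumptions:

SELECT CAST(MemoColumn AS nvarchar(max)) AS MemoColumn, OtherColumn
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\db1.mdb'; 'Admin'; '',
                'SELECT * FROM SomeTable');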
Hi! My problem concerns both VB and SQL. I have a text file delimited with semicolons, and I have to load this data into a SQL table. For example: item description;code1;code2;code3; and there are many lines like this. The code should insert these values into the table. The table name is products and the column names are desc, code1, code2, code3. I am using VB.NET and SQL Server. Thanks all!
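One option that avoids writing row-by-row VB.NET code is to let SQL Server read the file directly with BULK INSERT and a semicolon field terminator. A sketch, assuming the file sits somewhere the SQL Server service can reach (the path is an example, and the trailing ';' on each line means the file layout has to match the four columns exactly or be cleaned first):

BULK INSERT dbo.products
FROM 'C:\data\products.txt'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n');

From the VB.NET side, the alternative is SqlBulkCopy over a DataTable built by splitting each line on ';'.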
Hi all, I need to export/generate a data file in dbf format from a SQL Server 2000 table. I wonder how this can be done inside SQL Server 2000? Would DTS help? Please advise.
I backed up my database through SQL Server 2000 SQL Query Analyzer: "BACKUP DATABASE Mysite TO DISK = 'd:BatabaseBackupMysite.bak'", got a .bak file, and imported it to the hoster's server. Now I need to bring the database back from that hosted server to my local server. I got a .bak file called "MysiteExport.bak" and transferred it to my computer as d:BatabaseBackupMysiteExport.bak. Then I used the SQL Server Import Wizard to import "MysiteExport.bak" and it failed.
How can I import/restore the database from this "MysiteExport.bak", which is already on my computer? If I use SQL Query Analyzer, what is the correct statement?
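The Import Wizard cannot read a .bak file, because a backup is not a data source; the counterpart to BACKUP DATABASE is RESTORE DATABASE, run from Query Analyzer. A sketch (the physical paths and logical file names are examples; RESTORE FILELISTONLY shows the real logical names stored inside the backup):

RESTORE FILELISTONLY FROM DISK = 'd:\DatabaseBackup\MysiteExport.bak';

RESTORE DATABASE Mysite
FROM DISK = 'd:\DatabaseBackup\MysiteExport.bak'
WITH MOVE 'Mysite_Data' TO 'd:\Data\Mysite.mdf',
     MOVE 'Mysite_Log'  TO 'd:\Data\Mysite_log.ldf',
     REPLACE;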
I have no previous experience with MS SQL and I have this little problem: I import data into a table from a txt file (a log file from a process) and I want to put the date/time string into a column with the datetime data type, but I can't make it work; there is an error message (can't store char data into a datetime field...). Any help? Thanks in advance :P
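The usual pattern is to load the raw text into a varchar staging column first and then convert it with an explicit style that matches how the log writes its dates. A sketch (style 103 = dd/mm/yyyy is only an example; table and column names are placeholders):

CREATE TABLE dbo.log_staging (event_time_raw varchar(30), message varchar(4000));
-- ... import the txt file into dbo.log_staging ...
INSERT INTO dbo.log_events (event_time, message)
SELECT CONVERT(datetime, event_time_raw, 103), message
FROM dbo.log_staging;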
I have a .csv file which is imported into my database using DTS. When I look at the DTS package in design mode, I can see that it creates a table and then copies the data from the file into the created table.
Nothing wrong there.
But when the contents of the file changes (e.g. column added or removed) the create table script raises an error. The script is not dynamic, it is created when the package is created and only changes when the package itself is re-created.
Is there a way to make the create table script dynamic, so that the table is dropped and re-created according to the format of the .csv file?
I want to put every value between double quotes into a particular column of a table (let's call it the IMPORT table). The problem is that the values in quotes refer to hours, so I'd like "184:30" to come out as 184.5 hours.
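A sketch of the conversion, assuming the raw text lands in a varchar column first (column and table names are illustrative): strip the quotes, split on the colon, keep the hours, and turn the minutes into a decimal fraction.

SELECT CAST(LEFT(v, CHARINDEX(':', v) - 1) AS decimal(10,2))
       + CAST(SUBSTRING(v, CHARINDEX(':', v) + 1, 2) AS decimal(10,2)) / 60.0 AS hours
FROM (SELECT REPLACE(HoursText, '"', '') AS v FROM dbo.IMPORT) AS t;

-- '184:30' -> 184 + 30/60 = 184.5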
I have 12 different SQL Server tables, all bound to one GridView. I have one admin user who can view the details of the 12 tables in the GridView (pagesize=5). Now the admin wants to generate 12 CSV files from the 12 tables: he has to choose the table name from a dropdown and then click a "Generate" button, and the CSV file name should be "<table_name>_<date>". I also want to put links (added dynamically after those CSV files are generated) on a separate page to download them. How do I do this in ASP.NET (C#)? It's urgent.
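A server-side alternative, sketched under the assumption that xp_cmdshell is enabled and the SQL Server service account can write to the export folder (database, folder, and table names are placeholders); the ASP.NET page would only need to run this for the chosen table and then render a link to the resulting file:

DECLARE @table sysname, @file varchar(260), @cmd varchar(1000);
SET @table = 'Table1';
SET @file  = 'C:\exports\' + @table + '_' + CONVERT(char(8), GETDATE(), 112) + '.csv';
SET @cmd   = 'bcp "SELECT * FROM MyDb.dbo.' + @table
           + '" queryout "' + @file + '" -c -t, -T -S ' + @@SERVERNAME;
EXEC master.dbo.xp_cmdshell @cmd;

The pure C# route would be a SqlDataReader plus a StreamWriter per table, but the bcp approach keeps the heavy lifting on the server.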
I want to import an XML file into a table. The XML file is on a different network location than the database server I'm connected to, and I have access to this location with my Windows credentials.
I connect from my local PC to the database server "RS1" with my Windows credentials.
There I run this script:
CREATE TABLE XMLwithOpenXML
(
    Id INT IDENTITY PRIMARY KEY,
    XMLData XML,
    LoadedDateTime DATETIME
)

INSERT INTO XMLwithOpenXML (XMLData, LoadedDateTime)
SELECT CONVERT(XML, BulkColumn) AS BulkColumn, GETDATE()
FROM OPENROWSET(BULK '<Networkname><MAP>Name_.xml', SINGLE_BLOB) AS x;
I get the error:

Msg 4861, Level 16, State 1, Line 10
Cannot bulk load because the file "<Networkname><MAP>Name_.xml'" could not be opened. Operating system error code 5 (Access is denied.).
(<Networkname><MAP>Name_.xml' is not the real name )
So it looks like the openrowset connects to the network location with other credentials.
Where can I find (and change) the credentials that are used to connect to the network location?
I found this article on MSDN (Security Considerations) but cannot find a solution [URL] .....
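For what it's worth, the documented behaviour is roughly this: when you connect with a Windows login, OPENROWSET(BULK ...) opens the file under your own Windows account, but for a remote share that only works if Kerberos delegation lets the server forward your credentials; when you connect with a SQL login, the SQL Server service account is used instead. So the credentials to check are either your own (plus delegation) or the service account's. On newer builds you can see the service account with (sketch):

SELECT servicename, service_account
FROM sys.dm_server_services;

Granting that account read access to the network share is usually the quickest fix.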
Hi all, I'm trying to import CSV file values using BULK INSERT but I'm getting an error in my code, so can anyone help me with this? The following is the code I have created:

--create table
CREATE TABLE CSVTest
(ID INT,
 FirstName VARCHAR(40),
 LastName VARCHAR(40),
 BirthDate SMALLDATETIME)

--import from CSV using bulk insert
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO
I'm getting an error like:

Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "c:\csvtest.txt" does not exist.

The thing is, I have created a CSV file on the C drive with some values.
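One thing to check, sketched below: the path in BULK INSERT is resolved on the SQL Server machine, not on the PC running the query, so 'c:\csvtest.txt' has to exist on the server's own C: drive and be readable by the service account. xp_fileexist (undocumented but handy) shows whether the server itself can see the file:

EXEC master.dbo.xp_fileexist 'c:\csvtest.txt';

If the "File Exists" column comes back 0, move the file to the server or use a UNC path the service account can read.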
Hi folks. I have an Excel file. I need to import this file into the database and update some other tables with the data contained in it. I would like to automate this process as much as possible. Right now I am just using the SQL Server Import Wizard to create a table and then running an update query. Is there any (more automated) way to do this?
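One way to automate the whole thing without the wizard is to read the workbook directly from T-SQL and wrap the import plus the follow-up updates in a stored procedure that a SQL Agent job runs on a schedule. A sketch, assuming the Jet provider is available on the server and 'Ad Hoc Distributed Queries' is enabled (provider string, path, sheet, and table names are assumptions):

INSERT INTO dbo.StagingTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\imports\data.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');

-- ...followed by the existing UPDATE statements against the other tables.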