Hi, I'm new to SQL Server, and would appreciate some advice on the quickest way to import data from a CSV file.
I've created a database using Visual Web Developer Express and added a couple of tables. The Help file in SQL Server Express (which is installed on the same PC) indicates that I should use BULK INSERT to populate the table. The only snag is, I couldn't find anywhere to enter the commands! Eventually, I found out about the SQLCMD command, which I entered in a Windows command window. I successfully connected to the default (SQLEXPRESS) server instance this way, but when I typed USE <my database name> I got an error back saying it couldn't find the database. I know that Visual Web Developer Express by default creates user-specific instances of the database, but I've turned that off (I think!) via the connection string. So, please could someone tell me how I can connect to my database via the SQLCMD command, or alternatively please let me know how else I can bulk import data from a CSV file. Many thanks in advance.
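For reference, this is roughly what I've been attempting; the database, table, and file names below are just placeholders I've made up here. At the command prompt (Windows authentication, connecting straight to a specific database):

sqlcmd -S .\SQLEXPRESS -E -d MyDatabase

Then, inside sqlcmd:

BULK INSERT dbo.MyTable
FROM 'C:\Data\MyFile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
GO

Is that the right general shape, once the USE / database problem is sorted out?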
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and runs successfully on the server. I want to import the data drawn from the view into a table using the SQL Server Import and Export Wizard. However, when I run the wizard on the server, it gives me the following error message and stops at the "Setting Source Connection" step:
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed. (SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Has anyone run into this problem before? Any idea what is happening?
Hi, I have a situation where I have a drop-down list. The user selects a value from it, and then a query is generated behind the scenes based on the value from the list. I want the result of the query to be displayed in a textbox. I've checked a few examples, but I don't really have a good understanding of how to generate this query and get the result into the textbox. I think you use the SqlDataSource, but I still need some help... Can anyone help? Thanks.
With classic asp I would do something like this:

set conn = server.createobject("ADODB.Connection")
conn.open "PROVIDER=SQLOLEDB;DATA SOURCE=sqlservername;UID=username;PWD=password;DATABASE=databasename"
set rs = conn.execute("select count(*) from myTable")
response.write rs(0)

How can I do this with asp.net in the fewest number of lines?
What is the easiest way to insert a new row into an existing SQL table through Visual Web Developer (Visual Basic)? E.g. a database called Names with the columns Firstname and Surname, and you want to insert "anna", "johnsson"?
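The SQL statement itself would presumably just be something like the line below (the table name is made up, since I only mentioned the database name); what I'm unsure about is the cleanest way to execute it from the VB code:

INSERT INTO NamesTable (Firstname, Surname) VALUES ('anna', 'johnsson');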
I'm pretty new to SQL and I'm just reading up on full-text searches... I need to do a full-text search on one table I have in the database. However, I'm reading about full-text indexing/searching, and a lot of pages say that it uses a lot of resources when searching. I was wondering how bad it really is. We have about ~100 users who would access the database, with a probable peak of 75 at a time. Would people using full-text searching slow it down a lot? The server is a Dell PowerEdge 1750, dual 2.8 GHz Xeon, 1 GB RAM. Also, about incremental population: if I read right, it populates the catalog each time an item in the table is deleted/inserted/modified, so would that use a lot of resources as well?
I'm just trying to see if it's worth enabling full-text indexing for searches on the database, if it doesn't slow down the server too much... or are there any better/easier ways to perform searches?
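For context, the kind of query I'd be running is roughly the first one below (table and column names are made up), as opposed to the LIKE scan I'd otherwise fall back on:

SELECT ArticleID, Title
FROM dbo.Articles
WHERE CONTAINS(Body, '"mountain bike"');

-- the non-full-text alternative, which can't use an index for a leading wildcard
SELECT ArticleID, Title
FROM dbo.Articles
WHERE Body LIKE '%mountain bike%';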
I am trying to single out the committee member "Adams" and sum up the total amounts that were donated while he was the committee member.
The Table:
CREATE TABLE contribution_list (
    contrib_date DATE NOT NULL,
    donor_name VARCHAR(30) NOT NULL,
    amount NUMBER(8,2),
    program VARCHAR2(30),
    committee_member VARCHAR2(20) NOT NULL,
    PRIMARY KEY (contrib_date, donor_name, committee_member)
);
The code that is giving me the error ("not a GROUP BY expression"):
CREATE or REPLACE VIEW adams_conrtibution_total as
select program, committee_member, sum(amount)
from contribution_list
where committee_member = 'Adams'
group by program;
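Is the problem simply that committee_member appears in the SELECT list but not in the GROUP BY (and that the SUM needs a column alias to be usable in a view)? I think the corrected version would look roughly like this, but I'd appreciate confirmation:

CREATE OR REPLACE VIEW adams_conrtibution_total AS
SELECT program,
       committee_member,
       SUM(amount) AS total_amount
FROM contribution_list
WHERE committee_member = 'Adams'
GROUP BY program, committee_member;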
I'm trying to find the easiest way to get a list of every data source on an RS instance, including the name of the SQL Server that each data source points to. We want to make sure none of the data sources point at our primary OLTP server.
I looked at the DataSource table in the ReportServer DB, but of course the connection string info is encrypted. I'm hoping that I don't need to resort to writing .net code to hit the web service.
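The most I've been able to come up with so far is listing the data source items by name from the Catalog table (this assumes the usual convention that Type = 5 marks a shared data source), but that still doesn't tell me which server each one points at:

SELECT c.Name, c.Path
FROM dbo.Catalog AS c
WHERE c.Type = 5;   -- shared data sources, if I have the type code right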
I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."
Seems simple, right?
I'm hitting all sorts of surprising data conversion errors. I used the Export Wizard to create the export package. This works fine. However, using the same flat file definition, the import package fails, even when I have no destination. That is, I have just one data flow task that contains only one component: the Flat File source. When I run the package, the flat file source fails with data type conversion and truncation errors. One of the obvious errors involves boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, and the exported data contains the literal text values "TRUE" and "FALSE". So SSIS converts a SQL data type of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?
Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
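The workaround I'm considering, unless there's a proper SSIS-side fix, is to bring the boolean column in as plain text and convert it myself in a staging step, roughly like this (table and column names are made up):

-- flat file column landed as varchar in a staging table
INSERT INTO dbo.Target (ID, IsActive)
SELECT ID,
       CASE WHEN IsActiveText = 'TRUE' THEN 1 ELSE 0 END   -- back to bit
FROM   dbo.Staging;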
Hello. I have created a table in MS SQL 2000 which holds details of names etc. I have also included categories of interest. However, the table is growing very big and unmanageable as the list of interests expands. Instead, I would like to create separate tables for each category of interest within the same database and populate each table with names taken from the Names_Table; I could then indicate yes or no if a name is interested in that category, for example: Art_category. However, I am unsure how I can import the column of names from the Names_Table to populate the NameID column in the Art_category table. I would appreciate advice and possibly a link to a step-by-step tutorial. Thanks. Lynn
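Would something along these lines be the right idea? (I'm guessing at the exact column names here.)

INSERT INTO Art_category (NameID)
SELECT NameID
FROM Names_Table;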
Hi everyone, I have some data in a CSV file, and I have to import it into a table. For some reason, I am supposed to import this data into a temp table first and then move it to the original table, converting it to the right data types as I do so. Is there a better way to do this, and how can I give custom error messages saying, e.g., that the data type cannot be converted, that the right number of records is not present, etc.? Thanks for the help.
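This is the rough shape I have in mind (all names, types, and the file path below are placeholders); the part I'm least sure about is the custom error messages:

CREATE TABLE #staging (OrderDate varchar(50), Amount varchar(50));

BULK INSERT #staging
FROM 'C:\Data\import.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- raise a custom error if anything won't convert, otherwise load the real table
IF EXISTS (SELECT * FROM #staging WHERE ISDATE(OrderDate) = 0 OR ISNUMERIC(Amount) = 0)
    RAISERROR('One or more rows contain values that cannot be converted.', 16, 1);
ELSE
    INSERT INTO dbo.Orders (OrderDate, Amount)
    SELECT CONVERT(datetime, OrderDate), CONVERT(decimal(10, 2), Amount)
    FROM #staging;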
I am very new to the entire world of SQL Server databases. I am starting from scratch.
Currently I have a little website I am doing for myself that is .asp based and will allow users to query some sports box scores. I hope to create a user interface that will allow folks to separate team results based on certain criteria...
It is just a hobby of mine that I have been doing for years with Excel, and now I hope to let others like me do it as well.
Here is what I've got:
An MSSQL 2005 server with a database. I am using SQL Server 2005 Express with Management Studio Express, so I do not have access to SSIS or DTS or anything like that.
However, I want to import several hundred records into a DB I created (hosted by Crystal Tech). Since I don't have access to the server's root directory, I can't use the BULK INSERT statement.
I am looking for a method to query an Excel file (or a .csv, or something similar) that is stored on my local drive and upload the data to the server's DB tables.
I would like to do this either through SQL with a query, or by adding VB code to the current VB that I use in my Excel file.
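Since the server can't see my local drive, the only SQL-only idea I've come up with so far is generating a plain INSERT script from the spreadsheet and running it in a query window, something like the sketch below (table and column names are invented). I'd love to hear if there's a cleaner way:

INSERT INTO dbo.BoxScores (GameDate, Team, Score)
SELECT '2007-04-01', 'Team A', 5
UNION ALL SELECT '2007-04-01', 'Team B', 3
UNION ALL SELECT '2007-04-02', 'Team C', 7;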
How do I refresh table data on SQL Server without DTS or a backup/restore of the whole database? In SQL Server, is there a way to export table data to a file (like Oracle's import/export) and import it into another, unconnected server/DB?
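Is bcp in native format the usual way to do this? Something like the following pair of commands is what I have in mind (server, database, table, and path names are placeholders):

bcp MyDb.dbo.MyTable out C:\Data\MyTable.dat -n -S SourceServer -T
bcp MyDb.dbo.MyTable in C:\Data\MyTable.dat -n -S TargetServer -T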
I'm trying to use DTS to import data from an XLS into a SQL table. It works fine in that it INSERTs the data. However, I need it to UPDATE the table, based upon a ProjectID. Can this be done? Can a DTS package be fired from an SP using parameters? E.g. UPDATE tProjects SET MyField1=XLS.Sheet1.CellA1, MyField2=XLS.Sheet2.CellA1 WHERE ProjectID = @ProjectID. Also, it must handle dynamic XLS file names, e.g. 981-Budget.xls, 513-Budget.xls, xyz-Budget.xls. Is this the best way to go? Other suggestions most welcome. Thanks everyone in advance!
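One idea I'm toying with is letting DTS land the spreadsheet in a staging table and then doing the update with a join, roughly like this (the staging table and column names are invented):

UPDATE p
SET    p.MyField1 = s.MyField1,
       p.MyField2 = s.MyField2
FROM   tProjects AS p
JOIN   ProjectBudgetStaging AS s
       ON s.ProjectID = p.ProjectID;

That still leaves the dynamic file name question, though.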
Hi! I have to develop an application for transferring data from an Excel file into a SQL table. The Excel file is uploaded to a server. The database (and the table) is on another server. At first, I used OPENROWSET for transferring data to the table. My SQL command looked like this (in my ASP page):
SQLstr = "SELECT * INTO dbo.shopping_TSR FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database="+Server.MapPath("upload/tmb2.xls")+";hdr=yes', 'SELECT * FROM [Sheet1$]')"
I kept getting this error: [Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB error trace [OLE/DB Provider 'Microsoft.Jet.OLEDB.4.0'IDBInitialize::Initialize returned 0x80004005: The provider did not give any information about the error.]
After reading a few articles, I think the cause of my error is that the Excel file is uploaded into the folder where the ASP script is located. I have two servers: one running the ASP scripts and one containing the database. Is my error generated by the fact that the Excel file is on a different server than the SQL Server? How could I make this work?
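From what I've read, OPENROWSET is executed by SQL Server itself, so the path in the provider string has to be reachable from the database server, not the web server. Is pointing it at a share on the web server the way to go, assuming the SQL Server service account can read that share? The SQL part would then look something like this (the share name is made up):

SELECT * INTO dbo.shopping_TSR
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=\\webserver\upload\tmb2.xls;hdr=yes',
                'SELECT * FROM [Sheet1$]')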
I am new to SSIS, and was only a novice to intermediate skill level with SQL 2000 DTS, so please excuse me if this is an easy question. I am trying to import data from a table in one DB into a table in another. After insertion, I need to store the newly created ID (an identity seed) in a separate table that maps to the original DB's row id. My eventual goal is to import a bunch of related tables from the old DB into the new DB, and maintain relationships, so the mapping of newly created IDs is necessary to make sure data is imported with the correct relationships. Any advice would be greatly appreciated!
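To show what I mean, here is roughly the mapping step I'm picturing in T-SQL (every table and column name here is made up); what I don't know is how to express the same thing in an SSIS data flow:

-- capture the new identity values as rows are inserted,
-- keyed on a business column that also exists in the old table
DECLARE @new TABLE (NewID int, BusinessKey varchar(50));

INSERT INTO NewDb.dbo.Customers (BusinessKey, Name)
OUTPUT inserted.CustomerID, inserted.BusinessKey INTO @new (NewID, BusinessKey)
SELECT BusinessKey, Name
FROM   OldDb.dbo.Customers;

-- build the old-ID -> new-ID map by joining back on the business key
INSERT INTO dbo.IDMap (OldID, NewID)
SELECT o.CustomerID, n.NewID
FROM   OldDb.dbo.Customers AS o
JOIN   @new AS n ON n.BusinessKey = o.BusinessKey;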
I have a table in a SQL 2000 DB and want to import data from an Excel sheet into the table. My table = Table1; Excel file = data.xls. Is there a simple method where I can import data from the sheet into the existing table?
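Would something like OPENROWSET against the workbook do the job here, assuming ad hoc queries through the Jet provider are allowed on the server? (The path and sheet name are my guesses.)

INSERT INTO dbo.Table1
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');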
I was wondering if there was a different approach I should take in appending data to a table...
My destination table has about 94+ million records in it, and I have been taking two approaches to getting new files into this table:
1) I do a Data Pump task in a DTS package to import the file to a trans (temp) table, which is truncated every time, and then do an INSERT INTO statement from the temp table to my destination table.
The import to the trans table only takes a few minutes (about 1-2 million records per file, with short record lengths), but when I do the INSERT INTO statement, it takes upwards of 6 hrs to append.
2) I have tried doing a Bulk Insert task, going directly to the destination table (which defeats the purpose of my trans table for checking out the data beforehand, but I feel the data is clean at this point).
I am running the bulk insert right now, and it's been running for over 3 hours... so I'm going to assume this will take just as long as the INSERT INTO statement did before.
My destination table does not have any indexes in it at all, and I don't need to do any transformations to the data when bringing it into SQL since the data is clean. Also, I have a default value constraint on one of my fields on the destination table.
Plus there are other people and applications hitting the server, which could impact the overall processing, but nothing out of the ordinary is going on with the server today. I know there are only so many ways to get a file into a table... but maybe someone knows a different way I should try this.
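In case it helps, the append step is basically the first statement below (names simplified); what I'm wondering is whether something like a TABLOCK hint or a batched bulk insert, as in the second sketch, would behave any better:

-- current approach, simplified
INSERT INTO dbo.Destination (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM   dbo.TransTemp;

-- alternative I'm wondering about: bulk insert in batches with a table lock
BULK INSERT dbo.Destination
FROM 'C:\Data\file.dat'
WITH (TABLOCK, BATCHSIZE = 100000);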
I have a remote DB I am working with at present. The DBA has provided me with a non-owner LOGIN, so I can't copy tables from the live DB to the staging DB as objects; I can only copy tables and data.
The PKEY and IDENTITY columns get reset to just regular columns on each table. I can restore the PKEY constraint and have come across DBCC CHECKIDENT to get the new identity value. I just can't figure out how to set a column to be an identity. The ALTER TABLE command isn't having any of it.
I am obviously missing the right bit in Books Online.
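As far as I can tell, ALTER TABLE simply can't turn an existing column into an identity column, and the workaround is to rebuild the table and copy the data across with identity insert enabled. This is the sketch I'm working from (table and column names are made up), in case someone can confirm it's the right direction:

CREATE TABLE dbo.MyTable_new (
    ID   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name varchar(50) NOT NULL
);

SET IDENTITY_INSERT dbo.MyTable_new ON;
INSERT INTO dbo.MyTable_new (ID, Name)
SELECT ID, Name FROM dbo.MyTable;
SET IDENTITY_INSERT dbo.MyTable_new OFF;

DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.MyTable_new', 'MyTable';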
Our SQL 2008 R2 relational database has tables with foreign key relationships for part numbers. We receive production data from a separate program and we need to import the CSV data into our database application.
The problem is our separate program creates a CSV file with the actual part number "362S162-33". In our database we have a separate parts table (example: 362S162-33 has identity "15").
We need to import data into a production table that has a "part number" (FK) column.
How can we, when importing, cross-reference the parts table to convert the part number to its identity number? We have thousands of parts, so we need this conversion from the part number column to the FK identity value to happen automatically on import.
Production Table (example row):
idComponent (PK): 1000
ComponentName: Assembly108
idPartNumber (FK): 15
ComponentLength: 230.5
UserMessage: Assembly is 230.5 inches using 362S162-33
Qty: 1
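The pattern we're picturing is to land the CSV in a staging table that still holds the raw part number text, then translate it through a join against the parts table when loading the production table. A sketch (the staging table and parts-table column names are assumptions on our side):

INSERT INTO dbo.Production (ComponentName, idPartNumber, ComponentLength, UserMessage, Qty)
SELECT s.ComponentName,
       p.idPart,                          -- identity looked up from the parts table
       s.ComponentLength,
       s.UserMessage,
       s.Qty
FROM   dbo.ProductionStaging AS s
JOIN   dbo.Parts AS p
       ON p.PartNumber = s.PartNumber;    -- e.g. '362S162-33' resolves to 15

Is there a way to get the Import Wizard or SSIS to do this lookup for us, or is a staging table the standard answer?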
I am using the Import Wizard in SQL Server 2008 R2 to import data from an Excel spreadsheet into a table I have created.
The spreadsheet contains 3 columns that SQL recognises as DOUBLE, and they contain a 1 or 0. What data type do the corresponding fields in the SQL table need to be? I have tried BIT, INT and FLOAT but keep getting an error (I can't view the details of the error because I get chucked out every time it pops up). I know the problem is with the DOUBLE data because when I 'ignore' those columns the import works fine.
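If there's no clean mapping, the fallback I'm considering is letting those three columns come in as FLOAT into a staging table and converting them afterwards, along these lines (table and column names invented):

INSERT INTO dbo.Target (Flag1, Flag2, Flag3)
SELECT CAST(Flag1 AS bit),
       CAST(Flag2 AS bit),
       CAST(Flag3 AS bit)
FROM   dbo.ExcelStaging;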
I am looking for solutions to import CSV data into my SQL database table, but we want to collect the data from specific columns in the CSV file (NOT the whole CSV file) into the SQL database table.
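The approach I keep coming back to is to bulk load the whole file into a staging table and then pick out just the columns we need, as in the sketch below (all names and the path are placeholders). Is there a more direct way, for example with a format file?

CREATE TABLE #csv_staging (Col1 varchar(100), Col2 varchar(100), Col3 varchar(100), Col4 varchar(100));

BULK INSERT #csv_staging
FROM 'C:\Data\import.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

INSERT INTO dbo.TargetTable (ColumnA, ColumnB)
SELECT Col1, Col3            -- only the columns we care about
FROM   #csv_staging;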
I need to make a script in SQL 2005 to import data from an Excel sheet into a SQL table. I am using the wizard to import now, importing from Excel 2000. The first row of the Excel sheet has column names. The Excel file name is EXL.xls and the sheet name is Sheet1. The destination SQL database name is NM and the table name is Sht1. I use SQL Server Authentication to access the database, with user name ABC and password DEF. The database name is DB. I am using the following settings when importing now:
- Delete rows in destination table
- Enable identity insert
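Is this roughly what the script should look like? (The file path and column names are placeholders, the Excel 8.0 provider string is my guess for an Excel 2000 workbook, and I understand ad hoc access to the Jet provider has to be allowed on the server.)

-- mimic the wizard settings: delete rows, enable identity insert
DELETE FROM NM.dbo.Sht1;

SET IDENTITY_INSERT NM.dbo.Sht1 ON;

INSERT INTO NM.dbo.Sht1 (Col1, Col2, Col3)     -- identity insert needs an explicit column list
SELECT Col1, Col2, Col3
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\EXL.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]');

SET IDENTITY_INSERT NM.dbo.Sht1 OFF;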
I need to import data from more than 10 Excel files, all having the same format, into a single SQL Server table.
I tried to use
INSERT INTO MyTempTable SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 11.0;Database=C:\Book1.xls', [Sheet1$])
but got the error below:
Ad hoc access to OLE DB provider 'Microsoft.Jet.OLEDB.4.0' has been denied. You must access this provider through a linked server.
If a DTS package is used, then I am not sure how I can point it at 10 Excel files at a time so that they can be picked up one by one and the data imported into the table.
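Since the error says the provider must be accessed through a linked server, is defining one linked server per workbook (or re-pointing a single one in a loop) the expected route? This is the kind of thing I mean; the linked-server name is made up, and I'd repeat it for each of the 10 files:

EXEC sp_addlinkedserver
     @server = 'ExcelBook1',
     @srvproduct = 'Excel',
     @provider = 'Microsoft.Jet.OLEDB.4.0',
     @datasrc = 'C:\Book1.xls',
     @provstr = 'Excel 8.0';

INSERT INTO MyTempTable
SELECT * FROM ExcelBook1...[Sheet1$];

(Alternatively, I understand there is a server-level setting that permits ad hoc OPENROWSET access to the Jet provider, if the DBA will allow it.)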
I've imported data from an Excel spreadsheet to a table that has fields to match the destination table I'm trying to populate. The destination table has an Insert trigger with several checks on certain fields to make sure they have corresponding records in other tables.
If I do a statement like "INSERT INTO destinationTable ( ItemId, Product, SuperID, etc etc ) SELECT * FROM oldtable" it runs for a while then gives me error messages from the trigger and rolls back the Insert.
The trigger has code such as "IF (SELECT COUNT(*) FROM inserted WHERE ((inserted.Product Is Not Null))) != (SELECT COUNT(*) FROM tblInProduct, inserted WHERE (tblInProduct.Product = inserted.Product)) BEGIN (Error message code goes here) END"
So, do I need to do an INNER JOIN to each of the related tables? When I try that, I get this error: "Msg 121, Level 15, State 1, Line 2 The select list for the INSERT statement contains more items than the insert list. The number of SELECT values must match the number of INSERT columns." Is SQL counting the foreign key fields as separate fields, or what?
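Re-reading the error, I wonder if the real problem is simply that SELECT * returns every column of the old table while my INSERT names only some of them. Would making both sides explicit, as below, be the fix? (Column list abbreviated.)

INSERT INTO destinationTable (ItemId, Product, SuperID)
SELECT o.ItemId, o.Product, o.SuperID
FROM   oldtable AS o;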
Hi guys, I need to import all the data from an Excel spreadsheet into a SharePoint content database (SQL Server). Please suggest the best way to do this. When I run the Import Wizard under Tasks --> Import in Management Studio 2005, it asks me to choose the database name etc., but how do I use the Import/Export Wizard to move data from a .xls source into an existing table in a database? That is, I need to append/insert my Excel data into an existing table.
I have tried the following (file name: vendor_insertFromFile2.sql):

BULK INSERT (dbo.vendors VendorName, Street, City, Region, Country, PostalCode, Telephone, PortalId, Fax, Email, Website, CreatedDate, Unit, LastName, Cell) from 'D:AccessNAICSxPortal0_SampleData.txt'

Error message:

Server: Msg 170, Level 15, State 1, Line 2
Line 2: Incorrect syntax near '('.

Data in the file:

"CompanyName","Address","City","State","country","fullzip","FullPhone","Portal","Fullfax","Email","URL","date","ID","Contact_1","800No"
"DAVID STRAWN","460 FLOWERING TRL","GRAYSON","GA","United States","30017","770-277-8709",0,"","bkroom@bellsouth.net","WWW.BACKROOM4MEN.COM","09/02/05",1,"DAVID STRAWN",
"BRADRUSS","3353 S MAIN ST STE 130","SALT LAKE CITY","UT","United States","84115","801-975-0374",0,"","rusbrad@msn.com","WWW.BADANDNASTY.COM","09/02/05",2,"BRAD JENSEN",

Thanks in advance for any pointers on what is wrong, or a link with working examples.
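From the documentation, I suspect the syntax problem is that BULK INSERT doesn't accept a column list at all; the statement should look more like the sketch below. (The path and terminators are my guesses for this file, and since the data is quoted and has a header row, I gather a format file or a staging table may still be needed to handle the quotes and the column mapping.)

BULK INSERT dbo.vendors
FROM 'D:\SampleData\xPortal0_SampleData.txt'   -- placeholder path
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2                               -- skip the header row
);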
I have several bcp output files I need to import into tables. I do not have format files for them. As far as I know they are in native format. I do not know the layout of the destination table they would populate.
1) How can I determine, from the bcp file itself, the schema of the destination table? Once I know that, I should be able to import the data into the table.
I created a file, People.txt, containing firstName and LastName separated by a pipe.
------------------content-----------
John | Doe
Mike | James
Adam | Smith
-----------------------------------------
and another one called gender.txt
------------------content-----------
M
---------------------------------------
I would like to create an Integration Services package that combines each record of the first file with the record of the second file and inserts the result into a table.
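In T-SQL terms, what I want is effectively a cross join of the two files once they are loaded; a sketch with made-up staging table names is below. What I don't know is the right way to express this in an SSIS data flow:

INSERT INTO dbo.People (FirstName, LastName, Gender)
SELECT p.FirstName, p.LastName, g.Gender
FROM   dbo.PeopleStaging AS p
CROSS JOIN dbo.GenderStaging AS g;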
I have someone who is sending me .htm documents with a table in them, and I was wondering if there is a way to import the data from those tables into a SQL table, probably using an SSIS package.