I'm trying to use DTS to import data from an XLS file into a SQL table.
It works fine in that it INSERTs the data. However, I need it to
UPDATE the table, based on a ProjectID. Can this be done?
Can a DTS package be fired from a SP using parameters?
Eg UPDATE tProjects SET MyField1=XLS.Sheet1.CellA1,
MyField2=XLS.Sheet2.CellA1 WHERE ProjectID = @ProjectID.
Also, it must handle dynamic XLS file names, e.g. 981-Budget.xls,
513-Budget.xls, xyz-Budget.xls.
Is this the best way to go? Other suggestions are most welcome.
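In case it helps, here is a rough T-SQL sketch of the kind of UPDATE you could run from a stored procedure instead of DTS. It assumes the workbook is readable from the machine SQL Server runs on, that ad hoc access to the Jet provider is allowed, and that the path, sheet names and single-cell ranges (C:\Imports\..., Sheet1, A1) are placeholders for your own:

-- Sketch only: with HDR=NO a single-cell range comes back as column F1.
DECLARE @ProjectID int, @File varchar(260), @Sql nvarchar(2000)
SET @ProjectID = 981
SET @File = 'C:\Imports\' + CAST(@ProjectID AS varchar(10)) + '-Budget.xls'

SET @Sql = N'
UPDATE tProjects
SET    MyField1 = (SELECT F1 FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
                     ''Excel 8.0;Database=' + @File + N';HDR=NO'',
                     ''SELECT * FROM [Sheet1$A1:A1]'')),
       MyField2 = (SELECT F1 FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
                     ''Excel 8.0;Database=' + @File + N';HDR=NO'',
                     ''SELECT * FROM [Sheet2$A1:A1]''))
WHERE  ProjectID = ' + CAST(@ProjectID AS nvarchar(10))

EXEC sp_executesql @Sql

Because the whole statement is built as a string, the file name can be assembled from the ProjectID parameter at run time.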
Hi, I'm working with MRS and I've got a table with a lot of entries. For each value in the table I'm trying to get the text colour set to red when the value of the cell is less than 0; otherwise it should remain black.
I can do this by setting the colour property cell by cell. But I have a lot of cells in the table. Is there a way to set the statement to apply to ALL cells in the table?
Basically I'm asking if there is a way to set the property in bulk instead of going through tediously cell by cell.
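If this is Reporting Services, you can usually avoid going cell by cell: select all of the detail cells (or the whole detail row) at once and set the Color property to an expression such as =IIf(Fields!MyValue.Value < 0, "Red", "Black"), where MyValue is a placeholder for your own field name. The expression is then applied to every selected cell in one step rather than one at a time.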
I have a column defined as smalldatetime, default length (4), and "allow NULLS" is checked. In the Enterprise Manager UI, when I enter data into that table row, if I just tab past that column, all is well, and the value is represented in the UI as <NULL>.

The problem comes once I ever enter a date into that column. Say I have entered a date (all is well), and now I want to remove that entry and go back to NULL (after the date value has been committed, in a different entry session, say). How is that done?

It seems to me that once a date has ever been entered into that column, if I try to remove it, I get the error "The value you entered is not consistent with the data type or length of the column, or over grid buffer limit". I have tried deleting the value, entering spaces, entering the string NULL or the string <NULL>, and maybe some other tries as well, but none works; I always get that error message and am not allowed to proceed past that cell until I restore a date value to it. I want to get back to <NULL>.

Anybody know? Thank you. Tom
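In the Enterprise Manager grid you can usually put a cell back to NULL by selecting it and pressing CTRL+0 (zero) rather than deleting the text. Failing that, a plain UPDATE does it; a minimal sketch, assuming placeholder table, column and key names:

UPDATE MyTable
SET    MyDateColumn = NULL   -- an explicit NULL, not an empty string or the text 'NULL'
WHERE  MyID = 42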
Hi, I have an Excel sheet with some data. I want to import that data (cell by cell, with manipulation) into SQL Server tables by using a stored procedure (if possible). If anybody has done a similar type of job or knows about it, please let me know. Thanks in advance. T.S.Negi
I uploaded custtable into the database, and the data looks fine except that the name that appears has a lot of space in it, e.g.
it should be:
firstname lastname
however the format appears very strange:
firstname          lastname
firstname          lastname
firstname          lastname
The same is the case with the address. I need to adjust or format the appearance of the data in the cell. Is there a way / SQL statement to format the data in the table so that the appearance looks okay?
I will really appreciate any sort of help on this one.
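A minimal cleanup sketch, assuming the extra "distance" comes from stray carriage returns, line feeds or padding picked up during the upload (custtable and FullName are placeholder names):

-- Strips CR/LF characters and trims leading/trailing blanks.
UPDATE custtable
SET    FullName = LTRIM(RTRIM(REPLACE(REPLACE(FullName, CHAR(13), ' '), CHAR(10), ' ')))

If the problem is runs of repeated spaces between the first and last name, a couple of nested REPLACE(FullName, '  ', ' ') passes over the column usually collapses them.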
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server. I want to import the data which the view returns into a table using the SQL Server Import and Export Wizard. However, when I run the wizard on the server, it gives me the following error message and stops at the step Setting Source Connection:
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed. (SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Has anyone encountered this problem before, and do you know what is happening?
I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."
Seems simple, right?
I'm hitting all sorts of surprising data conversion errors. I used the export wizard to create the export package. This works fine. However, using the same flat file definition, the import package fails -- even when I have no destination. That is, I have just one data flow task that contains only one control: the Flat File source. When I run the package, the flat file definition fails with data type conversion and truncation errors. One of the obvious errors is for boolean types. The SQL field is a bit, SSIS defined the column as DT_BOOL, and the data in the file are the literal text values "TRUE" and "FALSE". So SSIS converts a SQL data type of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?
Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
I have a report where in I want to show each record on a separate page.
So, to achieve that I took a single cell from the table control, expanded it and used all the controls in that single cell. This looks nice so far.
Now, I also have to show a sub grid on each record. So I took a table control, added it in that same single cell, and tried to add a parent group to the table row.
When I preview, it throws this error.
"The tablix has a detail member with inner members. Detail members can only contain static inner members."
What am I doing wrong? How can I achieve table grouping inside a table cell?
Hello. I have created a table in MS SQL 2000 which holds details of names etc. I have also included categories of interest. However, the table is growing very big and unmanageable as the list of interests expands.

Instead, I would like to create separate tables for each category of interest within the same database and populate each table with names taken from the Names_Table; I could then indicate yes or no if any name is interested in this category. For example: Art_category. However, I am unsure how I can import the column of names from the Names_Table to populate the NameID column in the Art_category table.

I would appreciate advice and possibly a link to a step by step tutorial. Thanks. Lynn
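A minimal sketch of the copy itself, assuming Art_category has a NameID column plus a yes/no flag (the flag column name "Interested" is a guess based on your description):

-- Copy every NameID from the main table into the category table,
-- defaulting the flag to 'N' and skipping names already present.
INSERT INTO Art_category (NameID, Interested)
SELECT NameID, 'N'
FROM   Names_Table
WHERE  NameID NOT IN (SELECT NameID FROM Art_category)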
Hi everyone, I have some data in a CSV file, and I have to import it into a table. For some reason, I am supposed to import this data into a temp table and then move it to the original table, and I have to convert it to the right data types while I do this. Is there a better way to do this, and how can I give custom error messages saying, e.g., that the data type cannot be converted or that the right number of records is not present? Thanks for the help.
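One common shape for this, sketched with placeholder file, table and column names (and assuming SQL Server 2005 or later for RAISERROR-based messages): stage everything as text, run your own checks, then convert.

-- Stage the raw CSV rows as varchar.
CREATE TABLE #staging (CustomerID varchar(20), OrderDate varchar(30), Amount varchar(30))

BULK INSERT #staging
FROM 'C:\Import\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

-- Custom validation with your own message before the real insert.
IF EXISTS (SELECT 1 FROM #staging WHERE ISDATE(OrderDate) = 0)
    RAISERROR('One or more OrderDate values cannot be converted to datetime.', 16, 1)
ELSE
    INSERT INTO dbo.Orders (CustomerID, OrderDate, Amount)
    SELECT CustomerID, CAST(OrderDate AS datetime), CAST(Amount AS money)
    FROM   #staging

A row-count check against an expected value can be added the same way before the final INSERT.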
Hi, I'm new to SQL Server, and would appreciate some advice on the quickest way to import data from a CSV file.
I've created a database using Visual Web Developer Express, and added a couple of tables. The Help file in SQL Server Express (which is installed on the same PC) indicates that I should use BULK INSERT to populate the table. The only snag is, I couldn't find anywhere to enter the commands! Eventually, I found out about the SQLCMD command, which I entered in a Windows command window. I successfully connected to the default (SQLEXPRESS) server instance this way, but when I typed USE <my database name> I got an error back saying it couldn't find the database. I know that Visual Web Developer Express by default creates user-specific instances of the database, but I've turned that off (I think!) via the connection string. So, please could someone tell me how I can connect to my database via the SQLCMD command, or alternatively please let me know how else I can bulk import data from a CSV file. Many thanks in advance.
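A minimal sketch, with placeholder database, table and file names: you can pick the database at connection time with the -d switch, and run SELECT name FROM sys.databases first to see what the instance actually calls your database (it may have been attached under the .mdf path rather than the friendly name).

sqlcmd -S .\SQLEXPRESS -E -d MyDatabase

Then at the sqlcmd prompt:

BULK INSERT dbo.MyTable
FROM 'C:\Data\MyData.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
GO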
I am very new to the entire world SQL Server databases. I am starting from scratch.
Currently I have a little website I am doing for myself that is .asp based and will allow users to query some sports boxscores. I hope to create a user interface that will allow folks to separate team results based on certain criteria...
It is just a hobby of mine that I have been doing for years with Excel, and now I hope to let others like me do it as well.
Here is what I've got:
MSSQL 2005 Server with a database. I am using SQL Server 2005 Express Studio, and therefore do not have access to SSIS or DTS or anything like that.
However, I want to import several hundred records into a db I created (hosted by Crystal Tech). Since I don't have access to the server root directory, I can't use the BULK INSERT statement.
I am looking for a method to query an Excel file (or .csv, something like that) that is stored on my local drive and upload it to the server db tables.
I would like to do this either through SQL with a query, or by adding VB code to the current VB that I use in my Excel file.
How can I refresh table data on SQL Server without DTS or a backup/restore of the whole database? In SQL Server, is there a way to export table data to a file (like import/export in Oracle) and then import it into another, unconnected server/db?
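The closest built-in equivalent to Oracle's exp/imp for a single table is probably the bcp command-line utility. A minimal sketch, with placeholder server, database, table and file names (run the first line against the source server, move the file, then run the second against the target):

bcp SourceDb.dbo.MyTable out C:\Transfer\MyTable.dat -n -S SourceServer -T
bcp TargetDb.dbo.MyTable in  C:\Transfer\MyTable.dat -n -S TargetServer -T

Here -n is native format, -S names the server and -T uses a trusted connection; swap -T for -U/-P if you need SQL authentication.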
Hi! I have to develop an application for transferring data from an Excel file into a SQL table. The Excel file is uploaded to a server. The database (and the table) is on another server. At first, I used OPENROWSET for transferring data to the table. My SQL command looked like this (in my ASP page):
SQLstr = "SELECT * INTO dbo.shopping_TSR FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database="+Server.MapPath("upload/tmb2.xls")+";hdr=yes', 'SELECT * FROM [Sheet1$]')"
I kept getting this error: [Microsoft][ODBC SQL Server Driver][SQL Server]OLE DB error trace [OLE/DB Provider 'Microsoft.Jet.OLEDB.4.0'IDBInitialize::Initialize returned 0x80004005: The provider did not give any information about the error.]
After reading a few articles, I think the cause of my error is that the Excel file is uploaded into the folder where the ASP script is located. I have 2 servers: one running the ASP scripts and one containing the database. Is my error generated by the fact that the Excel file is on a different server than the SQL Server? How could I make this work?
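That is very likely the issue: the Database= path in the OPENROWSET connection string is opened by the SQL Server machine, not by the web server, so a path that only exists on the web server (which is what Server.MapPath produces) cannot be found. One hedged way around it is to point at a share on the web server that the SQL Server service account can read; \\webserver\upload below is a hypothetical share name:

SQLstr = "SELECT * INTO dbo.shopping_TSR FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=\\webserver\upload\tmb2.xls;hdr=yes', 'SELECT * FROM [Sheet1$]')"

Alternatively, copy the uploaded file to a folder local to the SQL Server before running the query.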
I am new to SSIS, and was only a novice to intermediate skill level with SQL 2000 DTS, so please excuse me if this is an easy question. I am trying to import data from a table in one DB into a table in another. After insertion, I need to store the newly created ID (an identity seed) in a separate table that maps to the original DB's row id. My eventual goal is to import a bunch of related tables from the old DB into the new DB, and maintain relationships, so the mapping of newly created IDs is necessary to make sure data is imported with the correct relationships. Any advice would be greatly appreciated!
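One plain T-SQL way to keep the mapping, sketched with placeholder database, table and column names, is to carry the old key through the insert and then build the map from it (this assumes you can temporarily add a column to the destination):

-- Temporarily hold the source key alongside the new identity.
ALTER TABLE NewDb.dbo.Customers ADD OldCustomerID int NULL

INSERT INTO NewDb.dbo.Customers (Name, City, OldCustomerID)
SELECT Name, City, CustomerID
FROM   OldDb.dbo.Customers

-- Build the old-to-new mapping from the freshly inserted rows.
INSERT INTO NewDb.dbo.CustomerIDMap (OldID, NewID)
SELECT OldCustomerID, CustomerID
FROM   NewDb.dbo.Customers
WHERE  OldCustomerID IS NOT NULL

ALTER TABLE NewDb.dbo.Customers DROP COLUMN OldCustomerID

The mapping table can then drive the imports of the related child tables so the foreign keys point at the new identity values.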
I have a table in a SQL 2000 db and want to import data from an Excel sheet into the table. My table is Table1 and the Excel file is data.xls. Is there a simple method where I can import data from the sheet into the existing table?
I was wondering if there was a different approach I should take in appending data to a table...
My destination table has about 94+ million records in it, and I have been taking two approaches to getting new files into this table:
1) I do a data pump task in a DTS to import the file to a trans (temp) table, which is truncated every time, and then do an INSERT INTO statement from the temp table to my destination table.
The import to the trans table only takes a few minutes (about 1 - 2 million records per file, but with short record lengths), but when I do the INSERT INTO statement, it takes upwards of 6 hrs to append.
2) I have tried doing a bulk insert task, going directly to the destination table (which defeats the purpose of my trans table to check out the data prior, but I feel the data is clean at this point.)
I am running the bulk insert right now, and it's been running for over 3 hours... so I'm going to assume this will take just as long as the INSERT INTO statement did before.
My destination table does not have any indexes in it at all, and I don't need to do any transformations to the data when bringing it into SQL since the data is clean. Also, I have a default value constraint on one of my fields on the destination table.
Plus there are other people and applications hitting the server which could impact the overall processing, but nothing out of the ordinary is going on the server today. I know there are only so many ways to get a file into a table... but maybe someone knows a different way I should try this.
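Two settings that are often worth checking on loads like this are a table lock on the destination and a batch size, so the load is not paying per-row locking and one giant transaction. A minimal sketch, with a placeholder path, table name and batch size:

BULK INSERT dbo.DestinationTable
FROM 'D:\Loads\newfile.dat'
WITH (TABLOCK, BATCHSIZE = 100000, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

For the trans-table route, the equivalent hint is INSERT INTO dbo.DestinationTable WITH (TABLOCK) SELECT ... FROM dbo.TransTable, and inserting in key-range batches rather than all 1 - 2 million rows in one statement can also keep the transaction log from becoming the bottleneck.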
I have a remote DB I am working with at present. The DBA has provided me with a non-owner LOGIN, so I can't copy tables from the live to the staged DB as objects; I can only copy tables and data.
The PKEY and IDENTITY COLUMNS get reset to just regular columns on each table. I can restore the PKEY constraint and have come across the DBCC CHECKIDENT to get the new ident value. I just can't figure out how to set a column to be an identity. The ALTER TABLE command isn't having any of it.
I am obviously missing the right bit in Books Online.
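As far as I know, ALTER TABLE cannot turn an existing column into an identity; the usual options are to add a brand-new identity column, or to rebuild the table with the identity declared and copy the existing values across. A sketch of the rebuild route, with placeholder table and column names:

CREATE TABLE dbo.MyTable_New (
    MyID int IDENTITY(1, 1) NOT NULL PRIMARY KEY,
    Col1 varchar(50) NULL,
    Col2 varchar(50) NULL
)

-- Keep the existing key values while the column becomes an identity.
SET IDENTITY_INSERT dbo.MyTable_New ON
INSERT INTO dbo.MyTable_New (MyID, Col1, Col2)
SELECT MyID, Col1, Col2 FROM dbo.MyTable
SET IDENTITY_INSERT dbo.MyTable_New OFF

DROP TABLE dbo.MyTable
EXEC sp_rename 'dbo.MyTable_New', 'MyTable'

After that, DBCC CHECKIDENT can confirm (or reseed) the next identity value.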
Our SQL 2008 R2 relational database has tables with foreign key relationships for part numbers. We receive production data from a separate program and we need to import the CSV data into our database application.
The problem is our separate program creates a CSV file with the actual part number "362S162-33". In our database we have a separate parts table (example: 362S162-33 has identity "15").
We need to import data into a production table that has a "part number" (FK) column.
How can we, when importing, cross-reference the "parts table" to convert the part number to the identity number? We have thousands of parts, so we need this change of the part number column to the FK identity to happen automatically on import.
Production Table:
idComponent (PK): 1000
ComponentName: Assembly108
idPartNumber (FK): 15
ComponentLength: 230.5
UserMessage: Assembly is 230.5 inches using 362S162-33
Qty: 1
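A minimal sketch of the translation step, assuming the CSV is first landed in a staging table and that the parts table exposes the text part number alongside its identity (the staging table and the Parts column names idPart / PartNumber are placeholders):

INSERT INTO dbo.Production (ComponentName, idPartNumber, ComponentLength, UserMessage, Qty)
SELECT s.ComponentName,
       p.idPart,                        -- identity looked up from the parts table
       s.ComponentLength,
       s.UserMessage,
       s.Qty
FROM   dbo.Staging_Production AS s
JOIN   dbo.Parts AS p
       ON p.PartNumber = s.PartNumber   -- e.g. '362S162-33' resolves to 15

A LEFT JOIN plus a check for unmatched rows is a useful extra step so part numbers missing from the parts table are reported rather than silently dropped.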
I am using the import wizard in SQL Server 2008 R2 to import data from an Excel spreadsheet into a table I have created.
The spreadsheet contains 3 columns that SQL recognises as DOUBLE and they contain a 1 or 0. What data type do the corresponding fields in SQL table need to be? I have tried BIT, INT and FLOAT but keep getting an error (can't view details of the error because I get chucked out every time the error pops up). I know the problem is with the DOUBLE data because when I 'ignore' those columns the import works fine.
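One hedged workaround is to let the wizard land those three columns as float in a staging table (matching the DOUBLE it detects) and convert afterwards, since a float of 0 or 1 casts cleanly to bit; table and column names below are placeholders:

INSERT INTO dbo.FinalTable (Flag1, Flag2, Flag3)
SELECT CAST(Flag1 AS bit), CAST(Flag2 AS bit), CAST(Flag3 AS bit)
FROM   dbo.Excel_Staging   -- the flag columns were imported here as float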
I am looking for solutions to import CSV data into my SQL database table, BUT we want to collect the data from specific columns in the CSV file (NOT the whole CSV file) into the SQL database table.
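One simple approach, sketched with placeholder names: load the whole file into a staging table whose columns mirror the CSV, then move only the columns you care about (a bcp format file that skips fields is the alternative if you want to avoid the staging step).

BULK INSERT dbo.Csv_Staging
FROM 'C:\Import\source.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

INSERT INTO dbo.TargetTable (CustomerID, Amount)
SELECT CustomerID, Amount
FROM   dbo.Csv_Staging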
I need to make a script in SQL 2005 to import data from an Excel sheet into a SQL table. I am using the wizard to import now.
Import from: Excel 2000. The first row of the Excel sheet has column names.
Excel file name: EXL.xls, sheet name: Sheet1
Destination SQL database name: NM, table name: Sht1
I use SQL Server Authentication to access the database. User name: ABC, password: DEF. Database name: DB
I am using the following settings when importing now:
- Delete rows in destination table
- Enable identity insert
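A rough translation of those wizard settings into a script, assuming EXL.xls is readable from the SQL Server machine, that ad hoc access to the Jet provider is allowed, and that the folder path and column names are placeholders for your own:

-- "Delete rows in destination table"
DELETE FROM NM.dbo.Sht1

-- "Enable identity insert": the column list must then name every column, including the identity.
SET IDENTITY_INSERT NM.dbo.Sht1 ON
INSERT INTO NM.dbo.Sht1 (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'Excel 8.0;Database=C:\Path\EXL.xls;HDR=YES',
                  'SELECT * FROM [Sheet1$]')
SET IDENTITY_INSERT NM.dbo.Sht1 OFF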
I need to import data from more than 10 Excel files having the same format into a single SQL Server table.
I tried to use
INSERT INTO MyTempTable SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 11.0;Database=C:\Book1.xls', [Sheet1$])
but got the error below:
Ad hoc access to OLE DB provider 'Microsoft.Jet.OLEDB.4.0' has been denied. You must access this provider through a linked server.
If a DTS package is used, then I am not sure how I can set it up for 10 Excel files at a time so that they can be picked up one by one and the data imported into the table.
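On SQL Server 2005 that error usually means the 'Ad Hoc Distributed Queries' option is switched off; enabling it (which needs appropriate server-level rights) generally lets the OPENROWSET call through:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE

On SQL Server 2000 the equivalent switch is the provider's DisallowAdhocAccess option, or you can define a linked server to the workbook instead.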
I've imported data from an Excel spreadsheet to a table that has fields to match the destination table I'm trying to populate. The destination table has an Insert trigger with several checks on certain fields to make sure they have corresponding records in other tables.
If I do a statement like "INSERT INTO destinationTable ( ItemId, Product, SuperID, etc etc ) SELECT * FROM oldtable" it runs for a while then gives me error messages from the trigger and rolls back the Insert.
The trigger has code such as "IF (SELECT COUNT(*) FROM inserted WHERE ((inserted.Product Is Not Null))) != (SELECT COUNT(*) FROM tblInProduct, inserted WHERE (tblInProduct.Product = inserted.Product)) BEGIN (Error message code goes here) END"
So, do I need to do an INNER JOIN to each of the related files? When I try that, I get this error: "Msg 121, Level 15, State 1, Line 2 The select list for the INSERT statement contains more items than the insert list. The number of SELECT values must match the number of INSERT columns." Is SQL counting the foreign key fields as separate fields, or what?
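On the last error at least, the cause is the SELECT *: once the INSERT names specific columns, the SELECT has to list the same number of columns in the same order rather than every column of the old table. A minimal sketch with the column names from your example:

INSERT INTO destinationTable (ItemId, Product, SuperID)
SELECT ItemId, Product, SuperID
FROM   oldtable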
Hi guys, I need to import all data from an Excel spreadsheet to a SharePoint content database (SQL Server). Please suggest the best way to do this. When I run the Import wizard under Tasks --> Import in Management Studio 2005... it asks me to choose the database name etc.... but how do I use the Import/Export Wizard to export data from a .xls source to an existing table in a database? That is, I need to append/insert my Excel data into an existing table.
I have tried the following in my file vendor_insertFromFile2.sql:

BULK INSERT (dbo.vendors VendorName, Street, City, Region, Country, PostalCode, Telephone, PortalId, Fax, Email, Website, CreatedDate, Unit, LastName, Cell) from 'D:AccessNAICSxPortal0_SampleData.txt'

Error message:
Server: Msg 170, Level 15, State 1, Line 2
Line 2: Incorrect syntax near '('.

Data in the file:
"CompanyName","Address","City","State","country","fullzip","FullPhone","Portal","Fullfax","Email","URL","date","ID","Contact_1","800No"
"DAVID STRAWN","460 FLOWERING TRL","GRAYSON","GA","United States","30017","770-277-8709",0,"","bkroom@bellsouth.net","WWW.BACKROOM4MEN.COM","09/02/05",1,"DAVID STRAWN",
"BRADRUSS","3353 S MAIN ST STE 130","SALT LAKE CITY","UT","United States","84115","801-975-0374",0,"","rusbrad@msn.com","WWW.BADANDNASTY.COM","09/02/05",2,"BRAD JENSEN",

Thanks in advance for pointing out what is wrong, or a link with working examples.
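The syntax error comes from the column list: BULK INSERT does not take one, it just needs the table name and the file, and the file's fields must line up with the table's column order. A hedged corrected shape (terminators are assumptions about your file; the path is shown as in the post):

BULK INSERT dbo.vendors
FROM 'D:AccessNAICSxPortal0_SampleData.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

Note that BULK INSERT will not strip the double quotes around the values; a format file, or a varchar staging table plus REPLACE, is usually needed for quoted CSV data like this.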
I have several bcp output files I need to import into tables. I do not have format files for them. As far as I know they are in native format. I do not know the layout of the destination table they would populate.
1) how can I determine from the bcp file itself the schema of the destination table? Once I know that I should be able to import the data into the table.
I created a file People.txt containing firstName, LastName separated by a pipe.
------------------content-----------
John | Doe
Mike | James
Adam | Smith
-----------------------------------------
and another one called gender.txt
------------------content-----------
M
---------------------------------------
I would like to create an Integration Services package that combines each record of the first file with the record of the second file and inserts the result into a table.
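If plain T-SQL is an acceptable alternative (or a way to prototype what the package should do), the same result is a cross join of the two files after loading each into a staging table; paths, staging tables and the target table below are placeholders:

BULK INSERT dbo.People_Staging FROM 'C:\Import\People.txt'
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
BULK INSERT dbo.Gender_Staging FROM 'C:\Import\gender.txt'
    WITH (ROWTERMINATOR = '\n')

INSERT INTO dbo.PeopleWithGender (FirstName, LastName, Gender)
SELECT LTRIM(RTRIM(p.FirstName)), LTRIM(RTRIM(p.LastName)), g.Gender
FROM   dbo.People_Staging AS p
CROSS JOIN dbo.Gender_Staging AS g

In SSIS terms, the equivalent is two Flat File sources feeding a join-style transformation with no join key, which is what the cross join expresses directly here.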
I'm looking at a system where formulas have been added into fields in a table, and I need to look at the field to see what formula to use when selecting, e.g. this + this, this / this, etc. Here's a basic table I have knocked up to try different things...
CREATE TABLE #HeaderOrder (
    [HeaderCode] [varchar](10) NOT NULL,
    [HeaderCode2] [varchar](10) NOT NULL,
    [FormulaCode] [varchar](10) NOT NULL
)