DTS Issue -- Dtsrun Can't Find My Text File That I'm Inserting Into My SQL Table
Apr 21, 2005
Here is the error message that I'm getting:
Error string: Error opening datafile: The system cannot find the path specified.
The file it's bombing out on is the text file that I'm importing into one of my tables through a DTS package (which is called by the dtsrun statement that produces this message). If anyone knows what might be causing this, please let me know.
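A likely culprit here is that the file path stored in the package's text file connection is resolved on the machine where dtsrun executes, so a path that works on the designer's workstation can be missing on the server. A minimal check, with a hypothetical package name and server name:

EXEC master.dbo.xp_cmdshell 'dtsrun /S "MYSERVER" /E /N "LoadTextFile"'
-- In the package's Text File (Source) connection, use a UNC path the
-- SQL Server service account can read (e.g. \\fileserver\share\data.txt)
-- rather than a mapped drive letter, which may not exist for that account.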
I have an application that requires the use of a table. The device this application runs on has local memory that does not allow me to insert the 800,000 records that I need. Therefore I have two approaches:
1. Insert fewer records into my local memory database, e.g. 40,000, but not row by row; a bulk insert would be better. How do I do the bulk insert?
2. This is the preferable way: find a way to insert all 800,000 records into a table on the storage card, which is 1 GB. What do you suggest? Would using threads be helpful? Any ideas?
I use C# from VS 2005, SQL ME, Compact Framework 2.0, and Windows 4.2.
Hello friends... I am looking for two things (using C#.NET or VB.NET and SQL Server 2000): 1. Convert data from a SQL Server 2000 database (say, the Customers table from the Northwind database) to a text file (separated by commas or just plain spaces). 2. Insert the data from the text file back into the database. Can someone please give me detailed code to achieve this? I really need it on an urgent basis. Thank you.
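A minimal sketch of both directions with bcp called from T-SQL (this assumes xp_cmdshell is available and a trusted connection; the path and the CustomersImport staging table are examples, not anything from the original post):

EXEC master.dbo.xp_cmdshell 'bcp "Northwind.dbo.Customers" out "C:\customers.txt" -c -t, -T'
-- -c = character mode, -t, = comma field terminator, -T = trusted connection

EXEC master.dbo.xp_cmdshell 'bcp "Northwind.dbo.CustomersImport" in "C:\customers.txt" -c -t, -T'
-- the target table must already exist with a matching column layout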
Hello all, I have a question regarding importing text file data into SQL Server. I'm hoping someone can point me in the right direction, as my searches haven't turned up anything specific enough.

I'm trying to parse a large (24 MB) text file. It's a fixed-width file with multiple columns. I need to parse this file, check whether a record already exists, and then import the data into the database. But I don't need to insert every column; there are only a few columns from the file I need. This parsing also needs to occur at regular intervals (daily). I looked at BULK INSERT, but I can't find an example that uses only some of the columns; every example uses all columns, and the file is delimited, not fixed-width.

Is there anything within SQL Server that can accomplish this? The only other solution I can think of is an application that parses the file and inserts the data into the database. But can I schedule that application to run every night at midnight (for example) through SQL Server? I'm not too familiar with SQL Server, so I appreciate any help offered. Thanks, Jay
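BULK INSERT can do both fixed-width parsing and column skipping with a bcp format file: each field gets a fixed data length and an empty terminator, and any field mapped to server column 0 is discarded. A sketch under assumed names and widths (three fields, only two kept, loading into a hypothetical staging table dbo.DailyStage):

8.0
3
1  SQLCHAR  0  10  ""      1  AcctNum   SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  25  ""      0  Unused    SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  8   "\r\n"  2  AsOfDate  SQL_Latin1_General_CP1_CI_AS

BULK INSERT dbo.DailyStage
FROM 'C:\feeds\daily.txt'
WITH (FORMATFILE = 'C:\feeds\daily.fmt');

For the nightly run, a SQL Server Agent job can execute the BULK INSERT (or an external parsing application) on any schedule, such as midnight daily.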
If this is the wrong place for this question, would someone please tell me so. I am new to SQL Server and still feeling out resources. I have a few books on SQL Server but none cover this question.
I have a text file of dates and numbers that I want to insert into a table. There are way too many rows of data in the file to do this by hand.
Question: How can I essentially insert the text file into my table?
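BULK INSERT does exactly this. A minimal sketch, assuming a comma-delimited file and hypothetical table and path names:

BULK INSERT dbo.MyDates
FROM 'C:\data\dates.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
-- each line in the file becomes one row; the table's columns must
-- line up with the fields in the file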
I am working on an SSIS project where I create two flat files for submission to a data contractor. This contractor requires a control record to be the first line in the file. I create the control record based on the table information being exported.
What I would like to know is: is it possible to use the Header section of the Flat File Destination Editor to insert the control record? And, as it is dynamic, what kind of coding must I do to make use of this functionality?
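Header is an expressionable property of the Flat File Destination, surfaced through the Data Flow task's Expressions collection, so it can be fed from a package variable at run time. One sketch: build the control record with an Execute SQL Task (the record layout below is hypothetical), store the single-row result in a string variable such as User::ControlRecord, then add a property expression on the Data Flow task mapping [Flat File Destination].[Header] to @[User::ControlRecord].

SELECT 'CTL'
     + CONVERT(char(8), GETDATE(), 112)                         -- file date
     + RIGHT('0000000000' + CAST(COUNT(*) AS varchar(10)), 10)  -- record count
       AS ControlRecord
FROM dbo.ExportTable;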
Newbie here. In my database I'm needing to automate some data imports. I have the import set up as a DTS package and it works wonderfully, but I'm having trouble kicking it off from a stored procedure, or even from Query Analyzer. I used dtsrunui to get a proper connection string, but when I enter

EXEC xp_cmdshell 'dtsrun /S "(local)" /N "MyPackage" /A "KeyNum":"19"="19687627" /W "0" /E'

I get an error that says "Could not find stored procedure 'xp_cmdshell'". xp_cmdshell is indeed there, under master, Extended Procedures. I tried calling it dbo.xp_cmdshell, but that didn't help. I'm guessing that I need to point the command to the location of the SP, but I have no idea how to do that. Anyone willing to shed a little light would get my eternal gratitude. :) Thanks in advance, maddman
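The usual fix for that message is to fully qualify the call as master.dbo.xp_cmdshell (or run it while connected to master); unlike sp_-prefixed procedures, xp_-prefixed extended procedures are not resolved in master automatically:

EXEC master.dbo.xp_cmdshell 'dtsrun /S "(local)" /N "MyPackage" /A "KeyNum":"19"="19687627" /W "0" /E'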
I have a text file which has 7 rows. I want to insert the data into a SQL table using SSIS. In the text file we have a column whose values are Y or N... I want to take only those rows which are Y... But we have only 6 rows in the SQL table; it does not have the column with Y or N.
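In the SSIS data flow this is a job for a Conditional Split transform: a condition such as FlagColumn == "Y" routes only the wanted rows to the destination, and the flag column is simply left unmapped. The staging-table equivalent in T-SQL looks like this (all names here are hypothetical):

INSERT INTO dbo.TargetTable (Col1, Col2, Col3, Col4, Col5, Col6)
SELECT Col1, Col2, Col3, Col4, Col5, Col6
FROM dbo.StagingTable
WHERE FlagColumn = 'Y';   -- the flag itself is not copied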
When we insert text into a field in a table, SQL Server apparently replaces apostrophes with question marks -- is there a way to not have this occur? We don't have this happen with the MySQL databases that we also support. Much help appreciated.
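That substitution usually means a Unicode character (often a curly apostrophe pasted from Word) is being stored in a varchar column or passed as a plain '...' literal, so it is converted through the database code page and comes out as '?'. A sketch of the fix, using nvarchar plus N'...' literals:

DECLARE @t TABLE (txt nvarchar(100));
INSERT INTO @t VALUES (N'O''Brien’s “smart” quotes');  -- the N prefix preserves them
SELECT txt FROM @t;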
I am trying to insert an NTEXT value from one table (T1) into another table (T2) within a database. The table structures are as below:
CREATE TABLE T1 (T1ID INT NOT NULL, SourceColumn NTEXT NULL)
CREATE TABLE T2 (T2ID INT NOT NULL, TargetColumn NVARCHAR(MAX) NULL)
Every time a long text value is inserted into T2.TargetColumn, an unwanted character '?' is appended either at the beginning or at the end of the text string. The same problem happens even when I try to update T2.TargetColumn = T1.SourceColumn. Because of this, the column never matches the source column and gets updated every time, even when there is no change...
I am also converting the NTEXT column to NVARCHAR(MAX) and replacing CHAR(10) with ''.
I am using SQL Server 2012. How can I avoid inserting '?' into T2.TargetColumn? Is there any setting I need to set on the target table?
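It helps to see what the '?' really is; it is often not a literal question mark but an unconvertible or invisible Unicode character that merely displays as one. A diagnostic-then-cleanup sketch, where the assumption that the stray character is a zero-width BOM (code point 65279) is only an example:

SELECT T2ID,
       UNICODE(LEFT(TargetColumn, 1))  AS first_char_code,
       UNICODE(RIGHT(TargetColumn, 1)) AS last_char_code
FROM dbo.T2;

UPDATE dbo.T2
SET TargetColumn = REPLACE(TargetColumn, NCHAR(65279), N'');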
I have a problem with inserting records into a table when an indexed view is based on it. The table has a text field (without it there is no problem, but I need it). Here is a sample code:

USE test
GO
CREATE TABLE dbo.aTable ([id] INT NOT NULL, [text] TEXT NOT NULL)
GO
CREATE VIEW dbo.aView
WITH SCHEMABINDING AS
SELECT [id], CAST([text] AS VARCHAR(8000)) [text]
FROM dbo.aTable
GO
CREATE TRIGGER dbo.aTrigger ON dbo.aView INSTEAD OF INSERT
AS
BEGIN
    INSERT INTO aTable
    SELECT [id], [text]
    FROM inserted
END
GO

Do the insert into aTable (also through aView):

INSERT INTO dbo.aTable VALUES (1, 'a')
INSERT INTO dbo.aView VALUES (2, 'b')

Still no problem. But when I need an index on the view:

CREATE UNIQUE CLUSTERED INDEX [id] ON dbo.aView ([id])
GO

I get the following error while inserting a record into aTable:

-- Server: Msg 8626, Level 16, State 1, Procedure aTrigger, Line 4
-- Only text pointers are allowed in work tables, never text, ntext, or image columns. The query processor produced a query plan that required a text, ntext, or image column in a work table.

Does anyone know what causes the error?
Question please. I have an MS SQL local package that exports data from a SQL table to an Excel file. My question is: how can I erase all the records in my Excel file before I export the new data from the SQL table?
What I want is to delete the rows in the destination file before inserting the new records.
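In DTS a common pattern is an Execute SQL task pointed at the Excel connection, run before the export step: DROP TABLE clears the sheet's data and CREATE TABLE rebuilds the header row. A sketch with a hypothetical sheet layout:

DROP TABLE [Sheet1$]

CREATE TABLE [Sheet1$] (CustomerID char(10), CustomerName char(50))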
I have an Access database (Access 95, version 7) dumping a delimited text file onto my server. I am then using DTS in SQL 2000 to import the file into a table.
My issue is that each time the DTS package runs, it imports the whole text file again, which is causing duplicate records.
So I created a transformation script as follows :
Function Main()
    If DTSSource("counter") <= DTSDestination("counter") Then Main = DTSTransformStat_SkipRow Else Main = DTSTransformStat_OK
End Function
The theory behind the If statement is that if the counter field is less than or equal to what is already there, it should skip the record and move forward. For some reason this is not working.
Does anyone have a workaround or another solution to this problem?
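One note on why the script approach fails: in a DTS ActiveX transformation, DTSDestination("counter") is the output row currently being built, not a lookup into rows already in the destination table, so the comparison never sees existing data. A more reliable pattern is to land the file in a staging table and insert only the new counters (names hypothetical):

INSERT INTO dbo.Target (counter, col1, col2)
SELECT s.counter, s.col1, s.col2
FROM dbo.Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.counter = s.counter);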
I have to find the latest file in a folder and export its data to a table in SQL Server. The code should be something that can be incorporated into a T-SQL stored procedure.
The file name would be, for example, abc_defYYYYMMDD.xls. Would I be able to find the latest file in the folder using the datestamp (YYYYMMDD) in the filename?
Please note I would have files in other formats, with names that also have a datestamp attached, so the code has to pick the specific files whose names start with 'abc_def'.
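A sketch of the stored-procedure approach, assuming xp_cmdshell is permitted and a hypothetical folder. Because the datestamp is YYYYMMDD, the alphabetically largest matching name is also the newest:

CREATE TABLE #files (fname varchar(260));

INSERT INTO #files
EXEC master.dbo.xp_cmdshell 'dir /b C:\imports\abc_def*.xls';

DECLARE @latest varchar(260);
SELECT @latest = MAX(fname)
FROM #files
WHERE fname LIKE 'abc_def%.xls';
-- @latest can now be spliced into the import command or OPENROWSET call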
I have portions of data coming in as text files containing new records and updates of existing records. The solution I've figured out so far is to import each portion of data into an intermediate table and then run a stored procedure to migrate the data into the real table. Any ideas how to do this more efficiently? Thanks in advance,
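The staging-table approach is in fact the standard pattern; the migration step stays efficient if it is written as a set-based two-statement upsert rather than row-by-row logic (a sketch, all names hypothetical):

-- refresh records that already exist
UPDATE t
SET t.Payload = s.Payload
FROM dbo.RealTable t
JOIN dbo.Staging s ON s.KeyCol = t.KeyCol;

-- then add the genuinely new ones
INSERT INTO dbo.RealTable (KeyCol, Payload)
SELECT s.KeyCol, s.Payload
FROM dbo.Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.RealTable t WHERE t.KeyCol = s.KeyCol);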
I am familiar with the MySQL LOAD DATA command for loading an external ASCII file into a database table, but am having trouble finding an equivalent T-SQL command that doesn't require creating an executable... any help would be appreciated...
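BULK INSERT is the closest T-SQL counterpart of LOAD DATA, and it needs no external executable. A sketch with hypothetical names, skipping a header line:

BULK INSERT dbo.MyTable
FROM 'C:\data\extract.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);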
Anybody, please help. I am trying to load a text file into a table using Enterprise Manager and All Tasks, but the process keeps adding strange characters, like squares, at the end of each line, and also replaces each empty line in the text file with a record containing that square-type character. I used the following code to delete all rows with that character (as a workaround) but no joy. I am losing hope.
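The "square" is typically an unprintable control character, most often a stray CHAR(13) left behind when the import's row terminator is \n but the file actually ends lines with \r\n. A cleanup sketch (table and column names hypothetical):

-- rows that were blank lines in the file
DELETE FROM dbo.ImportTable
WHERE LastColumn = CHAR(13) OR LastColumn = '';

-- strip the trailing carriage return from the surviving rows
UPDATE dbo.ImportTable
SET LastColumn = REPLACE(LastColumn, CHAR(13), '');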
How do I convert a SQL table into a text file? I have a table and I want to extract the values, with the field names above them, to a text file. The query should also allow me to define the starting position of each field in the text file.
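One sketch: build each output line as a single padded string, add the field names as row zero, and write the query out with bcp queryout (names and path hypothetical):

SELECT line
FROM (SELECT 0 AS ord, 'CustomerID   CustomerName' AS line
      UNION ALL
      SELECT 1, CAST(CustomerID AS char(13)) + CAST(CustomerName AS char(30))
      FROM Northwind.dbo.Customers) x
ORDER BY ord;

-- written to disk with, e.g.:
-- bcp "the query above" queryout "C:\customers.txt" -c -T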
Is there an example anywhere of how to output selected fields from a SQL table to a text file with fixed-length fields, i.e., pad the data out to the required length?
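CAST to char(n) is the padding mechanism: a char(n) value is space-filled to exactly n characters, so concatenated casts give every field a fixed starting position (a sketch with hypothetical widths):

SELECT CAST(OrderID AS char(10))             -- positions 1-10
     + CAST(CustomerID AS char(8))           -- positions 11-18
     + CONVERT(char(12), OrderDate, 112)     -- positions 19-30, YYYYMMDD
       AS fixed_line
FROM Northwind.dbo.Orders;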
I am writing a program in VC++ using SQL-DMO calls. My problem is that when I transfer (import) a comma-separated text file into SQL Server through the SQL-DMO method ImportData (a method of the BulkCopy object), it is not able to convert the date field in the text file to the corresponding datetime value in SQL Server, whereas the other data types work perfectly.
This is the record I need to convert:
90,MichaelB,Wintriss,Inspection,Paper,11,Job101,1, {ts '2000-12-10 15:54:56.000'},D:public233 and 247233.mcs,
and this is the date field: {ts '2000-12-10 15:54:56.000'}
Whereas if I export a table from SQL Server in binary mode and then import the file back, it works; but when I do it as text, it gives the above error.
Please help me with this; I would be very thankful.
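The {ts '...'} wrapper is an ODBC timestamp escape; the bulk-copy path treats the field as plain text instead of parsing the escape, which is why only the datetime column fails. One workaround sketch: land that field in a varchar staging column and peel the escape off in T-SQL (staging names hypothetical):

-- RawTs holds the literal string {ts '2000-12-10 15:54:56.000'}
SELECT CONVERT(datetime, SUBSTRING(RawTs, 6, 23)) AS event_time
FROM dbo.ImportStage;
-- SUBSTRING(..., 6, 23) pulls out 2000-12-10 15:54:56.000 from inside the quotes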
I need to export data from a table to a text file, where the data in the table is deleted after being written to the file. It is simple using DTS, but I want to do the export in "chunks" of data, committing the delete, say, after every 1000 rows.
My thought was that a stored procedure would be easy enough for this (I've done these in Oracle many times), but I don't know the quickest way to export a row of data from a stored procedure to a text file. Isn't using a command-line shell too slow? What are my options?
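On SQL Server 2000 the batching half is commonly done with SET ROWCOUNT: in autocommit mode each DELETE below is its own committed transaction of at most 1000 rows. The file itself can be written first with bcp queryout. A sketch (table name hypothetical):

-- export first, e.g.:
-- bcp "SELECT * FROM MyDb.dbo.ExportTable" queryout "C:\export.txt" -c -T

SET ROWCOUNT 1000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.ExportTable
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0   -- always reset afterwards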
Hey guys, I have a dilemma and hope someone can help.
I don't know of any utilities or commands in SQL that do this but I hope someone does.
What I need to do is something like a bcp import of a text file. I can do that with DTS as well. But what I want to do is create the table on import. So let's say I am importing a tab-delimited file called ax.txt, with column names as the first row. On import, it would create the table ax with the column names from the file and then import the data into that table.
I hope I explained it clearly. Please let me know if there is anything I can use to do this without writing lots of code.
I have an idea how to do it the long way but hope there is a utility that already does it.
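There is a near-utility way in T-SQL: the Jet text driver reads column names from the file's first row, and SELECT INTO creates the table on the fly. A sketch; note the file is addressed as ax#txt, and a tab-delimited file needs a schema.ini with Format=TabDelimited alongside it:

SELECT *
INTO dbo.ax
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\data\;HDR=Yes',
                'SELECT * FROM ax#txt');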
I am very, very new to SQL Server 2005. I want to create an SSIS package to transfer my text file into a table. How do I do this, where will I find the SSIS packages (like in SQL Server 2000, where we get them in Enterprise Manager under DTS), and how do I schedule these packages?
Dear MSSQL experts, I have a strange problem with SQL Server 2000. I tried to export a table of about 40,000 lines into a text file using the Enterprise Manager export assistant. I was astonished to get an exported text file of about 400 MB instead of the 16 MB which is the normal size of that data. By examining the file with a text editor, I found that it included, alongside the data of my table, MANY zeros, which caused the big file size. Does anyone have an idea what could cause the export of trillions of zeros into my text file, and how to export only the significant data of my table? Best regards, Daniel
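The zero bytes are often padding: if the export writes fixed-width or binary columns, every value is emitted at its full declared width, so a table with wide char/binary columns can balloon far past its logical size. Checking the declared widths is a quick first step (a sketch; the table name is an example):

SELECT c.name AS column_name, t.name AS type_name, c.length
FROM syscolumns c
JOIN systypes t ON t.xusertype = c.xusertype
WHERE c.id = OBJECT_ID('dbo.MyBigTable')
ORDER BY c.colid;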