Hopefully one of you gurus can offer a solution to this problem:
First off, I do not have the ability to use SQL Mail, so that is not an option.
I have a DTS package that performs the following steps to populate an Excel sheet with some data and email it off to a coworker:
1) An FTP task that copies an Excel sheet template from one folder to another on the same machine
2) A data pump between the (local) SQL Server connection and the copied Excel sheet
3) A SQL script task that truncates a table on the SQL Server
4) An ActiveX script task containing the following code, which uses CDO to attach the Excel sheet that was just populated to an outgoing email:
Function Main()
    ' Build the CDO message
    Dim objMail
    Set objMail = CreateObject("CDO.Message")

    objMail.From = "from@from.com"
    objMail.To = "to@to.com"
    objMail.Subject = "subject"
    objMail.HTMLBody = "Body"

    ' Attach the Excel sheet the data pump just populated
    objMail.AddAttachment "F:\copiedexcelsheet.xls"

    objMail.Send
    Set objMail = Nothing

    Main = DTSTaskExecResult_Success
End Function
The result of DTS execution is a failure on the ActiveX step:
Error Description: The process cannot access the file because it is being used by another process.
I assume this means Excel still has its grubby paws on the file. I've tried to alleviate this by (1) using a WAITFOR DELAY between the Excel connection and the ActiveX script task, and (2) using the Execute Package task to run a separate package that contains only the ActiveX mail-sender script (thinking the parent package, and with it the Excel process, would terminate before the next package ran - maybe I'm wrong here?).
I am writing a package to process Perfmon logs. The issue I have come across is that the Perfmon process holds onto the log file, and SSIS fails because it wants exclusive read access. Is there any way to stop SSIS taking an exclusive read lock on the file?
I have a package that looks for any Excel files in a folder, moves the data to a SQL table, then archives the file to one of two archive folders--a success folder or an error folder. I have an OnError handler on the Data Flow that sets a flag that lets the archive process know where to move the file.
This works when the processing is successful. It also works when the error in the Data Flow occurs right off the bat, i.e., in the Source. When the error occurs later on, say in the Destination, it doesn't work correctly. In this case, the OnError sets the flag, but when the archive process tries to move the Excel file, it can't because it's locked. I assume this is because OnError interrupted the Data Flow before the Excel file could be closed properly.
Any ideas on how I can avoid this problem? Can I manually get the Data Flow to close the Excel connection somehow?
When trying to import files to our database server from a client, I keep getting an error:
- Validating (Error)
Messages
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Source_txt" (1). (SQL Server Import and Export Wizard)
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Data Conversion 1" (175). (SQL Server Import and Export Wizard)
... doing the same import while logged on to the server itself hasn't given me any errors - how come? From my client I can import tables from other DB servers without trouble, but whenever the source is a file it won't do it.
I tried, as mentioned in other threads, rerunning setup to re-install SSIS, but as it was already installed it wouldn't re-install. My next move would be a clean install, but I'm not sure it would help, as I think this is a bug.
I uninstalled my SQL Server and installed it again, and I want the old database back. How can I import the old database files (.mdf and .ldf)? I tried copying these two files into the Data folder, but the database doesn't show up in the server.
Please help me out; I hadn't taken any backup before the uninstallation.
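One way to get the database back: copying the files into the Data folder isn't enough on its own; the database has to be attached. A minimal sketch in Query Analyzer, assuming the files landed in the default Data folder and the database was called MyDatabase (the names and paths are placeholders - substitute your own):

-- Attach the copied data and log files
EXEC sp_attach_db
    @dbname    = N'MyDatabase',
    @filename1 = N'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase.mdf',
    @filename2 = N'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MyDatabase_log.ldf'

If only the .mdf survived, sp_attach_single_file_db can rebuild the log, provided the database was shut down cleanly before the uninstall.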
I'm looking to write a DTS script to import a CSV file into an existing table. I have made sure that all the columns correspond. Ideally I want to create a stored procedure with a variable for the file name.
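A possible shape for that, skipping DTS entirely: a stored procedure that wraps BULK INSERT in dynamic SQL, since BULK INSERT won't accept a variable as the file name directly. The table name and options below are placeholders; this assumes a plain comma-delimited file the server can reach:

CREATE PROCEDURE dbo.ImportCsv
    @FileName varchar(255)
AS
BEGIN
    DECLARE @sql varchar(1000)
    -- BULK INSERT cannot take a variable as the file name, so build the statement dynamically
    SET @sql = 'BULK INSERT dbo.MyTable FROM ''' + @FileName + ''' '
             + 'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
END
GO

-- Example call
EXEC dbo.ImportCsv 'C:\Imports\latest.csv'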
I'm new to the SQL environment and am looking for some help. I have a client that has a CSV file that we need to import to a table but we need to make this part of an existing job. I'm wondering if anyone has any scripts for importing a CSV file into a DB. I've looked at the Import Wizard but don't think that is going to work the way I want it to.
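For making it part of the existing job, a plain BULK INSERT pasted into a T-SQL job step may be enough - no separate script needed. A hedged sketch with made-up names, assuming a comma-delimited file with a header row:

-- One-off CSV load; could be the body of a T-SQL job step
BULK INSERT dbo.ClientData
FROM 'C:\Imports\client.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2    -- skip the header row; drop this if the file has none
)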
Env: SQL Server 2000 on Windows NT 5.x. Job: import multiple flat files into several tables daily. Catch: one or two of the several flat files might be empty.
First thought/test: use the [first row as fields] option for the import process. Problem: DTS can't complete (as a package).
As an alternative, I could probably detect whether a file is empty and then decide what to do with it; with a VB ActiveX script it might be feasible. Question: VB has a command for "FileExists" - is there a "FileLen" or the like for determining the length of a file? TIA.
I have a SQL Server 2005 database *.mdf file that I would like to import into another SQL Server 2005 engine on another computer. I don't see how to do this.
How can I import a *.mdf file to create a copy of an existing database from another computer?
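One route, sketched for SQL Server 2005 with hypothetical paths: detach the database on the source machine (or stop its service), copy the .mdf and .ldf across, then attach them on the destination engine:

-- On the destination server, attach the copied files
CREATE DATABASE MyDatabase
ON (FILENAME = N'D:\Data\MyDatabase.mdf'),
   (FILENAME = N'D:\Data\MyDatabase_log.ldf')
FOR ATTACH

Backup and restore (BACKUP DATABASE on the source, RESTORE DATABASE on the destination) achieves the same thing without taking the source database offline.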
I am trying to import a CSV file into SQL, but my problem is that I get an error message on one of the columns. Suggestions?
- Executing (Error)
Messages
Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "INTVAL" returned status value 2 and status text "The value could not be converted because of a potential loss of data.". (SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task: The "output column "INTVAL" (22)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "INTVAL" (22)" specifies failure on error. An error occurred on the specified object of the specified component. (SQL Server Import and Export Wizard)
Error 0xc0202092: Data Flow Task: An error occurred while processing file "E:\minessight\blastinfoattrib.csv" on data row 2. (SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - blastinfoattrib_csv" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. (SQL Server Import and Export Wizard)
Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038. (SQL Server Import and Export Wizard)
Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. (SQL Server Import and Export Wizard)
Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039. (SQL Server Import and Export Wizard)
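Status value 2 here generally means some row in the INTVAL column holds text that won't convert to the integer type the wizard inferred. One workaround, sketched with hypothetical table names: land the file in a varchar staging table first, convert the clean rows, then inspect the leftovers:

-- 1) Stage the raw file with INTVAL as text so every row loads
CREATE TABLE dbo.BlastInfoStaging (
    INTVAL varchar(50)
    -- remaining columns, also as varchar
)

-- 2) Move the rows that convert cleanly into the real table
INSERT INTO dbo.BlastInfo (INTVAL)
SELECT CAST(INTVAL AS int)
FROM dbo.BlastInfoStaging
WHERE ISNUMERIC(INTVAL) = 1   -- note: ISNUMERIC is permissive, so treat this as a first pass

-- 3) Inspect whatever failed the check
SELECT * FROM dbo.BlastInfoStaging
WHERE ISNUMERIC(INTVAL) <> 1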
Hi guys, I have been trying to find a free ASP or ASP.NET script that will allow me to upload a .csv file and import it into an MS SQL database. It's going to be a product catalog, and pricing changes nearly every second day, so I want a script I can put in the admin panel on my site to upload a .csv file and import it into the MS SQL database. I will be updating fields as well as adding new products, so the upload script would need to handle both of those things. Are there any good free scripts around that people can recommend? Thanks, Matthew
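Whatever handles the upload half, the database half of "update existing plus add new" can be done with a staging table and two statements once the CSV is loaded. A rough sketch with entirely hypothetical table and column names:

-- Assumes the uploaded CSV has been bulk-loaded into dbo.ProductStaging
-- Update products that already exist
UPDATE p
SET p.Price = s.Price, p.Description = s.Description
FROM dbo.Products p
JOIN dbo.ProductStaging s ON s.ProductCode = p.ProductCode

-- Insert products that are new
INSERT INTO dbo.Products (ProductCode, Description, Price)
SELECT s.ProductCode, s.Description, s.Price
FROM dbo.ProductStaging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Products p
                  WHERE p.ProductCode = s.ProductCode)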
Hi! My problem concerns both VB and SQL. I have a text file delimited with semicolons, and I have to enter this data into a SQL table. For example: item description;code1;code2;code3; - there are many lines like this. The code should insert these values into the table. The table name is Products and the column names are desc, code1, code2, code3. I am using VB.NET and SQL Server. Thanks all!
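If the file can be placed somewhere the SQL Server can read, BULK INSERT may remove the need for any VB at all. A sketch assuming each line really does end with a trailing semicolon before the line break (the path is made up):

BULK INSERT dbo.Products
FROM 'C:\Imports\products.txt'
WITH (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR   = ';\n'   -- the trailing semicolon is part of the row terminator
)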
Hi all, I have to import a CSV file into a SQL database. I searched the net and found plenty of examples, but all of them use an MS Access database; I did not find any example with SQL Server. Does anybody have the code or a link? Here is one example using the Jet text driver:
Dim cn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\;Extended Properties=""Text;HDR=No;FMT=Delimited""")
Dim da As New OleDbDataAdapter()
Dim ds As New DataSet()
Dim cd As New OleDbCommand("SELECT * FROM Test.csv", cn)
cn.Open()
da.SelectCommand = cd
ds.Clear()
da.Fill(ds, "CSV")
dg.DataSource = ds.Tables(0)
cn.Close()
But when I am using SQL Server, how do I get the CSV data by setting the extended properties? Regards, Brijesh Singh
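One answer on the SQL Server side: the same Jet text driver is reachable from T-SQL through OPENROWSET, so the rows can land straight in a table without a DataSet. A sketch assuming ad hoc distributed queries are allowed and a target table dbo.CsvImport exists (the folder goes in the provider string, the file name in the query):

INSERT INTO dbo.CsvImport
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\;HDR=No;FMT=Delimited',
                'SELECT * FROM Test.csv')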
I backed up my database through SQL Server 2000 Query Analyzer: "BACKUP DATABASE Mysite TO DISK = 'd:\DatabaseBackup\Mysite.bak'", got a .bak file, and imported it to the hoster's server. Now I need to import (restore) the database from that hoster's server back to my local server. I got a .bak file called "MysiteExport.bak" and transferred it to my computer at d:\DatabaseBackup\MysiteExport.bak. Then I used the SQL Server Import Wizard to import "MysiteExport.bak", and it failed.
How can I import/restore the database from this "MysiteExport.bak", which is already on my computer? If I use Query Analyzer, what is the correct statement?
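The Import Wizard can't read a .bak file; it has to be restored with RESTORE DATABASE. A Query Analyzer sketch - run RESTORE FILELISTONLY first to find the logical file names inside the backup, then plug them into WITH MOVE (the logical names and target paths below are guesses):

-- Discover the logical file names inside the backup
RESTORE FILELISTONLY
FROM DISK = 'd:\DatabaseBackup\MysiteExport.bak'

-- Restore, relocating the files to the local data folder
RESTORE DATABASE Mysite
FROM DISK = 'd:\DatabaseBackup\MysiteExport.bak'
WITH MOVE 'Mysite_Data' TO 'd:\Data\Mysite.mdf',
     MOVE 'Mysite_Log'  TO 'd:\Data\Mysite_log.ldf'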
Hello, I'm trying to import data from a flat file into a table that has smalldatetime data types. I tried creating triggers on the smalldatetime columns to convert the data from a string to a datetime value, but the import is still unsuccessful. What should I do?
Col002 looks like this in my flat file (ex: 20000112). With DTSDestination("entry_dt") = DTSSource("Col002") I get an error when trying to put the value of Col002 into entry_dt.
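Since 20000112 is the ISO yyyymmdd format, T-SQL can convert it directly once the raw value is staged as text. A sketch with hypothetical table names, in case a trigger-free route helps:

-- Load Col002 into a varchar staging column first, then convert;
-- style 112 is the ISO yyyymmdd format
INSERT INTO dbo.Entries (entry_dt)
SELECT CONVERT(smalldatetime, Col002, 112)
FROM dbo.EntriesStaging
WHERE ISDATE(Col002) = 1   -- skip rows that would fail the conversion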
I have a small project on hand that involves importing a series of CSV files held within an FTP directory into our data warehouse. Every day a series of CSV files will be added to the directory. These will be named something like:
Audit1.csv, Audit2.csv, etc.
I would like to automate this process, as it can involve up to 400 files at a time. The procedure would need to identify a valid file, import it into the database, delete the file, and then move on to the next one.
Does anyone know of a way to achieve this? I was thinking along the lines of using a cursor and bcp, but I'm not sure how to identify these files to the database, i.e., how do I make it step through the directory and process the files?
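A cursor-based approach along those lines might look like the sketch below: use xp_cmdshell to list the directory into a temp table, then loop through it, BULK INSERT each file, and delete it afterwards. This assumes xp_cmdshell is available and the SQL Server service account can see the folder; the paths and table names are placeholders:

-- Collect the file names from the drop directory
CREATE TABLE #Files (FileName varchar(255))
INSERT INTO #Files
EXEC master..xp_cmdshell 'dir /b C:\FtpDrop\Audit*.csv'
DELETE FROM #Files WHERE FileName IS NULL OR FileName NOT LIKE '%.csv'

DECLARE @f varchar(255), @sql varchar(1000)
DECLARE file_cur CURSOR FOR SELECT FileName FROM #Files
OPEN file_cur
FETCH NEXT FROM file_cur INTO @f
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Import the current file
    SET @sql = 'BULK INSERT dbo.AuditImport FROM ''C:\FtpDrop\' + @f + ''' '
             + 'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)

    -- Delete it once loaded
    SET @sql = 'del C:\FtpDrop\' + @f
    EXEC master..xp_cmdshell @sql

    FETCH NEXT FROM file_cur INTO @f
END
CLOSE file_cur
DEALLOCATE file_cur
DROP TABLE #Files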
In a DTS package I have a text file import object, a data pump, and a SQL object. The text file import object has been set up to splice a 500-character-wide file into 20 columns. The data pump task does a copy column for all the columns into the appropriate table. What I need is a way of changing the file name specified in the text import object. I have 12 months' worth of data in separate files (DBF0199.TXT, DBF0299.TXT, DBF0399.TXT, etc.), all of which use the same format. Is there a way to change the text import object's file name inside the package using an ActiveX script task or something?
I'm trying to import a fixed-field text file into SQL Server using DTS, but every time I go past 3640 characters I am no longer able to add, delete, or move column breaks after that point. Is anyone else experiencing this problem and does anyone know of a workaround? Any help would be appreciated. Thanks!