Insert Each CSV File Into Its Respective Table (was Newbie Need Help)
Apr 25, 2006
Hi all,
I am very new to databases, so pardon me for some basic questions.
I have 3 tables: Table1, Table2, and Table3.
Table1 and Table2 have 5 columns each, and Table3 has 4 columns.
I have 6 or 7 CSV files (the number of files can change) uploaded to an upload directory.
Each CSV filename contains the name of its target table, for example table1_ab.csv, table1_cd.csv, table2_ef.csv, table2_gh.csv, table3_aa.csv, table3_bb.csv.
A person goes to a web page and clicks a button.
This is what the button should do:
1) Check how many files are in the upload directory.
2) Use BULK INSERT to individually insert each CSV file into its respective table (the name of the table is in the file name).
3) Copy the CSV file into a backup directory and delete the file from the upload directory.
I am using VB.NET. Any help would be much appreciated.
Thanks
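A minimal T-SQL sketch of step 2, assuming the VB.NET loop passes in each file name it finds; the table name is taken from the part of the file name before the first underscore, and the upload path and comma delimiter are assumptions:
-- Hypothetical sketch: derive the table name from the file name, then run a dynamic BULK INSERT.
-- @FileName would be supplied per file by the VB.NET loop; the path and delimiter are assumptions.
DECLARE @FileName varchar(200), @TableName sysname, @Sql nvarchar(1000)
SET @FileName = 'table1_ab.csv'
SET @TableName = LEFT(@FileName, CHARINDEX('_', @FileName + '_') - 1)
SET @Sql = 'BULK INSERT dbo.' + QUOTENAME(@TableName) +
           ' FROM ''C:\Upload\' + @FileName + '''' +
           ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC (@Sql)
Steps 1 and 3 (listing the directory, then copying to backup and deleting) would stay in VB.NET with System.IO.Directory.GetFiles and File.Copy/File.Delete.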
View 2 Replies
Sep 5, 2007
Hello All
Firstly, thanks a lot to Phil and Jamie for such a helpful article on "Checking to see if a record exists and if so update else insert".
Here is my question.
I have about 10 tables and their respective working tables.
For example: A, B, C, D, E.... and WorkA, WorkB, WorkC....
Notes:
1) When I execute a package, these work tables (WorkA, WorkB ...) get populated with a certain number of rows, say about 5.
2) It's not the case that all the work tables are populated on every execution.
3) Tables A, B, C... have thousands of records in them.
4) Each work table has the same structure as its parent table, e.g. WorkA has the same structure as A.
5) Table A and WorkA, and so on, are linked by a KeyID.
Now I want to build an SSIS package that can
1) Get the data from these multiple tables (WorkA, WorkB...)
2) Process each row of these tables WorkA, WorkB...
3) Depending upon the KeyID of WorkA, WorkB, etc., update a Flag column of table A, B... where the KeyID is equal to the KeyID of the work table
4) After updating, insert that processed row of WorkA, WorkB... into table A, B...
I can do this if I have one source table and one destination table. Here I have about 10 source tables, each going to its respective destination table.
All I could think of was creating 10 different packages as advised in Jamie's article, but I am sure there must be some other alternative.
Can somebody advise me on the best practice for doing this? Thanks a lot in advance.
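A set-based T-SQL sketch of steps 3 and 4 for one table pair, assuming the flag column is literally called Flag and the parent table's other columns are Col1 and Col2 (these names are assumptions); an SSIS Execute SQL Task could run one such statement pair per work table instead of ten separate data flows:
-- Hypothetical sketch for the A / WorkA pair; Flag, Col1 and Col2 are assumed column names.
UPDATE A
SET    A.Flag = 1
FROM   dbo.A AS A
       INNER JOIN dbo.WorkA AS W ON W.KeyID = A.KeyID;

INSERT INTO dbo.A (KeyID, Col1, Col2)   -- list the real columns of A here
SELECT W.KeyID, W.Col1, W.Col2
FROM   dbo.WorkA AS W;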
View 5 Replies
View Related
Dec 24, 2004
I have an assignment and need to do something that should be simple: basically, output the contents of a table to a text file.
I have been trying this syntax:
bcp "dbo.items_with_constraints_tbl" out "J:items.txt" -c
But I keep on getting this error message:
Server: Msg 179, Level 15, State 1, Line 1
Cannot use the OUTPUT option when passing a constant to a stored procedure.
I am completely lost!! :confused:
Can anyone help me please?
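A hedged sketch of what usually fixes this: the error appears when the bcp line is typed into Query Analyzer, where SQL Server parses "out" as the T-SQL OUTPUT keyword. bcp is a command-line utility, so it is run from a Windows command prompt, normally with a three-part table name and login switches; the database name, server name, and trusted-connection switch below are assumptions:
rem Run from a command prompt, not from a query window; MyDatabase and MyServer are placeholders.
bcp MyDatabase.dbo.items_with_constraints_tbl out "J:\items.txt" -c -S MyServer -T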
View 12 Replies
View Related
Oct 12, 2007
Hi,
I have a file which contains data as below:
3
123||
456||
789||
I am reading the file using BULK INSERT and inserting these phone numbers into a table having one column, as below.
BULK INSERT TABLE_NAME FROM 'FILE_PATH'
WITH (KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '||')
But I want to insert the data into a table having two columns. If I try to insert the data into a table with two columns, it does not insert.
can anyone help me how to do this?
Thanks,
-Badri
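One common workaround, sketched below: keep the BULK INSERT exactly as it is but load into a single-column staging table, then copy the rows into the two-column table, letting the second column (here called Status, an assumed name) take NULL or a default. The other usual option is a bcp format file that maps the single file field to one of the two table columns.
-- Hypothetical staging sketch; the target table and column names are assumptions.
CREATE TABLE #PhoneStage (PhoneNumber varchar(20));

BULK INSERT #PhoneStage FROM 'FILE_PATH'
WITH (KEEPNULLS, FIRSTROW = 2, ROWTERMINATOR = '||');

INSERT INTO dbo.TwoColumnTable (PhoneNumber, Status)
SELECT PhoneNumber, NULL
FROM   #PhoneStage;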
View 5 Replies
View Related
Aug 5, 2007
I have the following procedure in SQL 2005:
PROCEDURE [dbo].[uspInsert_Blob] (
@fName varchar(60),
@fType char(5),
@fID numeric(18, 0),
@bID char(3),
@fPath nvarchar(60)
)
as
DECLARE @QUERY VARCHAR(2000)
SET @QUERY = "INSERT INTO tblDocTable(FileName, FileType, ImportExportID, BuildingID, Document)
SELECT '"+@fName+"' AS FileName, '"+@fType+"' AS FileType, " + cast(@fID as nvarchar(18)) + " as ImportExportID, '"+@bID+"' AS BuildingID, * FROM OPENROWSET( BULK '" +@fPath+"' ,SINGLE_BLOB)
AS Document"
EXEC (@QUERY)
This puts some values including a pdf or .doc file into a table, tblDocTable.
Is it possible to change this so that I can get the values from a table rather than as parameters? The query would be in the form of: insert into tblDocTable (a, b, c, d) select a, b, c, d from tblImportExport.
tblImportExport has the path for the document (DocPath), so I would substitute that field, i.e. DocPath, for the @fPath variable.
Otherwise, the only thing I can see is doing a FETCH NEXT from tblImportExport, putting every field into a variable and then running this EXEC query on those, thus looping through every row in tblImportExport.
Any ideas how to do this?
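A sketch of the cursor approach described above, assuming tblImportExport exposes columns named FileName, FileType, ImportExportID, BuildingID and DocPath (all names except DocPath are assumptions) and that each row is fed to the existing procedure:
-- Hypothetical loop over tblImportExport; the column names other than DocPath are assumptions.
DECLARE @fName varchar(60), @fType char(5), @fID numeric(18,0), @bID char(3), @fPath nvarchar(60)

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName, FileType, ImportExportID, BuildingID, DocPath
    FROM   dbo.tblImportExport

OPEN cur
FETCH NEXT FROM cur INTO @fName, @fType, @fID, @bID, @fPath
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.uspInsert_Blob @fName, @fType, @fID, @bID, @fPath
    FETCH NEXT FROM cur INTO @fName, @fType, @fID, @bID, @fPath
END
CLOSE cur
DEALLOCATE cur
A single set-based INSERT ... SELECT is not possible here because OPENROWSET(BULK ...) needs a literal file path rather than a variable, which is why the per-row dynamic SQL remains.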
View 1 Replies
View Related
Jan 12, 2004
I'm trying to insert a blob (*.ICO file) into a SQL Server table.
How can I do this?
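A minimal sketch, assuming SQL Server 2005 or later (where OPENROWSET ... SINGLE_BLOB is available) and an image or varbinary(max) column; the table, column and file path below are placeholders:
-- Hypothetical example; table, column names and path are placeholders.
INSERT INTO dbo.IconTable (IconName, IconData)
SELECT 'app.ico', BulkColumn
FROM   OPENROWSET(BULK 'C:\Icons\app.ico', SINGLE_BLOB) AS ico;
On SQL Server 2000 the usual routes were the textcopy utility or writing the bytes from client code.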
View 3 Replies
View Related
Apr 22, 2004
Is it possible to load files (*.bmp, *.jpg, etc.) into a table (field type IMAGE) using BULK INSERT?
Or is it better to do it another way?
Thanks
View 5 Replies
View Related
Apr 15, 2006
Hi. My web server's MySQL manager allows inserting data from a text file into a table.
Currently I have a table with the following fields:
email name country
So I want to insert data from a text file into this table.
My question is: how should the text file be formatted?
i.e. the first record would be > xxxx@yahoo.com john us
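A sketch of one common layout, assuming the import tool expects one record per line, fields in the same order as the table columns (email, name, country), and a comma as the field separator (the separator is an assumption; match whatever delimiter the manager's import screen is set to):
xxxx@yahoo.com,john,us
yyyy@hotmail.com,mary,uk
zzzz@gmail.com,pierre,fr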
View 2 Replies
View Related
Aug 7, 2007
Hi Guys,
I hope this is easy stuff for you..
First of all, I searched the forum but didn't find exactly what I am searching for.
I have a file folder which contains 1..n files of the same type.
The files contain a date value at the beginning of each row. I now want to read the first record of the file, extract the date value, and search my import table for any records with that date. If there are already records with this date, I know I already imported that file and skip it in my Foreach File container. If no records are found, I want to copy the rows from the file to the table.
So I have a flat file source and thought I would just add an OLE DB Command task afterwards which looks like "Select count(*) from Import where Processdate = ?", and then a conditional split on whether the count is 0 or not. But I have problems getting the count value out of the OLE DB Command task, because every time I try to add an output column I get the message "An Output cannot be added to the output collection", and there is no way to map an expression to the result.
I tried to work around the problem using a Lookup task, but that seems to be the wrong way.
thanks for your help
bye
AS
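One approach, sketched in T-SQL: run the count in an Execute SQL Task in the control flow (result set "Single row", mapped to a package variable) rather than an OLE DB Command in the data flow, and let a precedence constraint expression on that variable decide whether the file gets loaded; Import and Processdate are the names from the post, and the parameter would be the date read from the file's first record:
-- Hypothetical Execute SQL Task query; ? is mapped to the date extracted from the file.
SELECT COUNT(*) AS AlreadyLoaded
FROM   dbo.Import
WHERE  Processdate = ?;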
View 3 Replies
View Related
May 20, 2008
Hi,
Here are the steps I should take:
1- Check the log table and find the run status (there is a date field which tells the last day run).
2- Let's say the last day was 2008-05-15. So I have to check whether an A1.YYYYMMDD file exists in the folder for each
day, like A1.20080516, A1.20080517 and A1.20080518 (until today).
3- If the A1.20080516 text file exists, I have to move it into the table, and the same for the other dates,
e.g. if A1.20080517 exists I have to load it into the table, and so on.
It looks like a Foreach loop: first I have to get the last date, then I have to check whether the file exists for each date, and
if the file for that date exists I have to load it into the table.
Please tell me how I can do this. It looks like complex looping...
thanks,
J
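A T-SQL sketch of that loop, assuming the job runs on the SQL Server itself; master.dbo.xp_fileexist is an undocumented but commonly used procedure, and the folder path, log table, and target table names are assumptions (the same logic also maps onto an SSIS Foreach loop with a File System task):
-- Hypothetical sketch; folder, log table and target table names are assumptions.
DECLARE @LastRun datetime, @d datetime, @File varchar(260), @Exists int, @Sql nvarchar(500)

CREATE TABLE #FileCheck (FileExists int, IsDirectory int, ParentDirExists int)

SELECT @LastRun = MAX(RunDate) FROM dbo.LoadLog          -- e.g. 2008-05-15
SET @d = DATEADD(day, 1, @LastRun)

WHILE @d <= GETDATE()
BEGIN
    SET @File = 'C:\Inbound\A1.' + CONVERT(char(8), @d, 112)   -- A1.YYYYMMDD

    DELETE #FileCheck
    INSERT #FileCheck EXEC master.dbo.xp_fileexist @File
    SELECT @Exists = FileExists FROM #FileCheck

    IF @Exists = 1
    BEGIN
        SET @Sql = 'BULK INSERT dbo.TargetTable FROM ''' + @File + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
        EXEC (@Sql)
        INSERT dbo.LoadLog (RunDate) VALUES (@d)
    END

    SET @d = DATEADD(day, 1, @d)
END

DROP TABLE #FileCheck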
View 4 Replies
View Related
Nov 25, 2007
Can someone tell me how to insert PowerPoint file into SQL Server database?
Thanks,
Peter
View 1 Replies
View Related
Apr 12, 2006
Hi
I need to insert the records from a .dbf file into a SQL table.
How should I go about it?
regards,
Kiran
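One commonly cited pattern is sketched below, with caveats: it relies on the 32-bit Jet OLE DB provider with ad hoc distributed queries enabled, and the folder, file, target table and column names are all placeholders; a DTS/SSIS import task or exporting the .dbf to CSV first are the safer alternatives.
-- Hypothetical sketch; requires the Jet provider and ad hoc distributed queries.
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'dBASE IV;DATABASE=C:\DbfFolder',
                'SELECT * FROM myfile')   -- myfile.dbf, referenced without the extension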
View 6 Replies
View Related
Jun 17, 2007
Hey all
I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. The table has a 47th column that I want updated later on by a stored proc, saying whether the import into the system was successful or not. I have the ROWTERMINATOR set as '"', thinking that would tell SQL to begin on the next row and leave the import-status column null, but I still receive an error.
First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data in the flat file.
If this is not possible, is it possible to bulk insert into a temp table?
Thanks
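A sketch of one standard workaround: create a view over the 46 file columns and bulk insert into the view, so the 47th column is simply left NULL; the column and file names below are placeholders, and a bcp format file is the other usual option (bulk inserting into a temp table that mirrors the file and then copying across also works).
-- Hypothetical sketch; column list and file path are placeholders.
CREATE VIEW dbo.vImportTarget
AS
SELECT Col01, Col02, Col03   -- ...continue through Col46, omitting the ImportStatus column
FROM   dbo.ImportTarget;
GO
BULK INSERT dbo.vImportTarget
FROM 'C:\Import\flatfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');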
View 1 Replies
View Related
Aug 31, 2015
I have been tasked to create a data table and stored procedure to extract a specially formatted XML file that arrives as an attachment inside a standard XML envelope. The XML file is an attachment in a node within the XML wrapper. There are other MIME files (PDFs) that are handled by a separate procedure, but I need to extract just the XML file attached along with those and put it into the data table with some other PK/FK fields.
Is a blob the best datatype? How do I insert that XML file into it?
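A sketch, assuming the extracted attachment can be staged on disk and that the xml datatype (rather than a varbinary blob) is acceptable; the table, column, key value and path below are placeholders:
-- Hypothetical sketch; table, columns, key value and path are placeholders.
CREATE TABLE dbo.XmlAttachment
(
    AttachmentID int IDENTITY(1,1) PRIMARY KEY,
    EnvelopeKey  int NOT NULL,          -- FK back to the envelope record
    Payload      xml NOT NULL
);

INSERT INTO dbo.XmlAttachment (EnvelopeKey, Payload)
SELECT 42, CAST(BulkColumn AS xml)
FROM   OPENROWSET(BULK 'C:\Drop\attachment.xml', SINGLE_BLOB) AS src;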
View 2 Replies
View Related
Mar 12, 2015
I am running a set of SQL statements on a SQL server, to insert flat file data into a SQL table. The flat file is already FTP'ed to the SQL server. I seem to be getting an error, which is possibly pointing to a permissions issue
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:jedox_dailyjdcom4401.txt'
WITH
(
FIRSTROW = 2,
MAXERRORS = 0,
FIELDTERMINATOR = '|',
ROWTERMINATOR = '
'
)
GO
The error is :
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "c:jedox_dailyjdcom4401.txt" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 1815)
If it is a permissions issue, how do I overcome this?
View 1 Replies
View Related
Mar 28, 2008
Hi my friends,
I want to read data from a text file and write it to my table in SQL.
Please help me.
M.O.H.I.N
View 6 Replies
View Related
May 25, 2007
Hi everybody,
In my project I want to read data from a CSV file and insert it into a SQL Server database table. The CSV file may contain any number of columns, but I want only certain columns from it to be inserted into the database table. How do I achieve this? Please help me, it is urgent. Thanks in advance.
Thanks and regards
Biju.S.G
View 1 Replies
View Related
Mar 24, 2006
Ok, I'm not quite sure how to approach this one. This is a VB.NET console app in which I want to capture each row and throw it into a table. The reason being, they want a report on what was processed... which I'll be able to do easily in Reporting Services 2005 once this crap is in a table where it should be.
1) What should I use to do this, a dataset? I want to use stored procedures also, not inline SQL.
The function here takes an incoming file and splits it up into separate files. I want to insert each row that is successfully split.
Public Sub ProcessFiles(ByVal sIncomingfile As String, ByVal sOutputDirectory As String)
If sIncomingfile <> "" And sOutputDirectory <> "" Then
        Dim f As New Security.Permissions.FileIOPermission(Security.Permissions.PermissionState.None)
        f.AllLocalFiles = Security.Permissions.FileIOPermissionAccess.Read

        Dim file As New IO.FileInfo(sIncomingfile)
        Dim filefs As IO.FileStream = Nothing
        If file.Exists Then
            Try
                filefs = New IO.FileStream(file.FullName, IO.FileMode.Open) 'Place: 1
            Catch ex As Exception
                SendEmail("Incoming .mnt or .naf Filename Invalid or not found", "Place: 1")
                Application.Exit()
            End Try
        End If

        Dim reader As New IO.StreamReader(filefs)
        Dim counter As Integer = 0
        Dim CurrentFS As IO.FileStream
        Dim CurrentWriter As IO.StreamWriter
        Dim extension As String = IO.Path.GetExtension(file.FullName)

        If extension = ".mnt" Then
            While Not reader.Peek < 0
                Dim Line As String = reader.ReadLine
                If IsNumeric(Line.Substring(0, 1)) Then
                    Dim Parts() As String = Line.Split(" "c) ' split row into parts
                    If Parts(0).Length = 8 Then
                        ' if first part is 8 then we know we hit another header so cut and then write to file
                        counter += 1
                        If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
                        CurrentFS = New IO.FileStream(IO.Path.Combine(IO.Path.GetDirectoryName(sOutputDirectory), Line.Substring(59, 4) & "[" & counter.ToString & "]" & Now.ToString("MM-dd-yyyy") & IO.Path.GetExtension(file.FullName)), IO.FileMode.Create)
                        CurrentWriter = New IO.StreamWriter(CurrentFS)
                    End If
                    If Not CurrentWriter Is Nothing Then
                        CurrentWriter.WriteLine(Line)
                    End If
                End If
            End While
            If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
            MoveFilesFTP(sOutputDirectory, "mnt")
        ElseIf extension = ".naf" Then
            While Not reader.Peek < 0
                Dim Line As String = reader.ReadLine
                If Not IsNumeric(Line.Substring(0, 1)) Then
                    ' if first part is not a number, then we know it's a header so split the file
                    counter += 1
                    If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
                    CurrentFS = New IO.FileStream(IO.Path.Combine(IO.Path.GetDirectoryName(sOutputDirectory), Line.Substring(6, 4) & "[" & counter.ToString & "]" & Now.ToString("MM-dd-yyyy") & IO.Path.GetExtension(file.FullName)), IO.FileMode.Create)
                    CurrentWriter = New IO.StreamWriter(CurrentFS)
                End If
                If Not CurrentWriter Is Nothing Then
                    CurrentWriter.WriteLine(Line)
                End If
            End While
            If Not CurrentWriter Is Nothing Then CurrentWriter.Flush() : CurrentWriter.Close()
            MoveFilesFTP(sOutputDirectory, "naf")
        End If
    Else 'input file not valid
        SendEmail("Incoming .mnt or .naf Filename Invalid", "Place: 1")
    End If
End Sub
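A minimal sketch of the stored-procedure side, assuming one log row per successfully split line and a few illustrative columns (the table, procedure and column names are assumptions); the VB.NET code above could call it through a SqlCommand with CommandType.StoredProcedure right after each CurrentWriter.WriteLine:
-- Hypothetical logging table and procedure; all names are assumptions.
CREATE TABLE dbo.ProcessedRows
(
    RowID       int IDENTITY(1,1) PRIMARY KEY,
    SourceFile  varchar(260) NOT NULL,
    RowText     varchar(1000) NOT NULL,
    ProcessedAt datetime NOT NULL DEFAULT GETDATE()
);
GO
CREATE PROCEDURE dbo.usp_LogProcessedRow
    @SourceFile varchar(260),
    @RowText    varchar(1000)
AS
    INSERT INTO dbo.ProcessedRows (SourceFile, RowText)
    VALUES (@SourceFile, @RowText);
GO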
View 2 Replies
View Related
Sep 15, 2015
How do I do a bulk insert into a temp table from a text file? The text file looks like this:
ver_id TYPE
E57AB326-803C-436E-B491-398A255C919A 58
34D2A601-6BBA-46B1-84E1-3A805FDA3812 58
986E140C-62F1-48F1-B428-3571EBF00DA0 58
My statement looks like this:
CREATE TABLE [dbo].[tblTemp]([ver_id] [nvarchar](255), [TYPE] [smallint])
GO
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
I keep receiving errors.
The error I receive is: Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (TYPE).
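A sketch of the usual fix, assuming the file is tab-delimited as it appears above: skip the header line and tell BULK INSERT the field and row terminators (the terminators are assumptions; adjust them to the real file):
-- Hypothetical sketch; the delimiters are assumptions based on how the file looks.
BULK INSERT [dbo].[tblTemp]
FROM 'C:\v2.txt'
WITH
(
    FIRSTROW = 2,               -- skip the "ver_id TYPE" header line
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
);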
View 2 Replies
View Related
Jul 9, 2015
I have an Excel file with a .csv extension. It has one sheet named Sheet1.
Now I'm trying to insert this Excel data into one #temp table. I tried this syntax:
----------------
Exec sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
Exec sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO
EXEC master.dbo.sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0' , N'AllowInProcess' , 1;
[Code] ...
But I'm getting this error:
The OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" reported an error. The provider did not give any information about the error.
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".
If I execute this for an .xls file, the statement works fine and the rows are inserted into the #temp table. How do I handle an Excel file with a .csv extension?
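One hedged sketch: a .csv file is plain text rather than a workbook, so the ACE provider needs its Text driver pointed at the folder, with the file name used as the table; the folder path below is a placeholder, and plain BULK INSERT or bcp is often the simpler route for a .csv:
-- Hypothetical sketch using the ACE text driver; the folder path is a placeholder.
SELECT *
INTO   #temp
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Text;Database=C:\CsvFolder;HDR=YES',
                  'SELECT * FROM [MyFile.csv]');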
View 3 Replies
View Related
Apr 16, 2008
Hi!
I am creating a scheduling web application. I have managed to insert data into the database. The code is as follows:
Dim insertSQLDatasource As New SqlDataSource()
insertSQLDatasource.ConnectionString = ConfigurationManager.ConnectionStrings("ScheduleConnectionString").ToString
insertSQLDatasource.InsertCommandType = SqlDataSourceCommandType.Text
insertSQLDatasource.InsertCommand = "INSERT INTO Demo_Theatre_DB (StartTime, EndTime, CompanyName, Purpose, AccountManager, Presenter, ColorCode, Status, Comments) VALUES (@StartTime, @EndTime, @CompanyName, @Purpose, @AccountManager, @Presenter, @ColorCode, @Status, @Comments)"
insertSQLDatasource.InsertParameters.Add("StartTime", StartTimeTextBox.Text)
insertSQLDatasource.InsertParameters.Add("EndTime", EndTimeTextBox.Text)
insertSQLDatasource.InsertParameters.Add("CompanyName", CompanyNameTextBox.Text)
insertSQLDatasource.InsertParameters.Add("Purpose", PurposeTextBox.Text)
insertSQLDatasource.InsertParameters.Add("AccountManager", AccountManagerTextBox.Text)
insertSQLDatasource.InsertParameters.Add("Presenter", PresenterTextBox.Text)
insertSQLDatasource.InsertParameters.Add("ColorCode", ColorDDL.SelectedValue.ToString)
insertSQLDatasource.InsertParameters.Add("Status", StatusRadioButtonList.SelectedValue)
insertSQLDatasource.InsertParameters.Add("Comments", CommentTextBox.Text)
Try
insertSQLDatasource.Insert()
Catch ex As Exception
Panel1.Visible = True
InsertMsgLabel.Text = ex.Message.ToString
Finally
insertSQLDatasource = Nothing
End Try
Now I am trying to select the values for a particular event. For example, if the user selects an event with ID 40, the values that correspond to that row will be read and filled into the respective controls. I tried the following code, but it doesn't work.
Dim sqlConn As SqlConnection = New SqlConnection(ConfigurationManager.ConnectionStrings("ScheduleConnectionString").ConnectionString)
Dim com As SqlCommand = New SqlCommand("SELECT * FROM Demo_Theatre_DB WHERE ID=@ID", sqlConn)
com.Parameters.AddWithValue("@ID", selectedEventID) ' selectedEventID is a placeholder for the ID of the event the user picked, e.g. 40
sqlConn.Open()
Dim r As SqlDataReader = com.ExecuteReader()
While r.Read()
StartTimeTextBox.Text = r("StartTime")
EndTimeTextBox.Text = r("EndTime")
CompanyNameTextBox.Text = r("CompanyName")
PurposeTextBox.Text = r("Purpose")
AccountManagerTextBox.Text = r("AccountManager")
PresenterTextBox.Text = r("Presenter")
ColorDDL.SelectedValue = r("ColorCode")
StatusRadioButtonList.SelectedValue = r("Status")
CommentTextBox.Text = r("Comments")
End While
I hope someone can help me with this.
Your help will be appreciated.
Thanks!
View 8 Replies
View Related
Feb 29, 2008
My Question: Does anyone know of a decent way (i.e. I do not want to loop to insert each row and check the SCOPE_IDENTITY() field or anything like that) to copy parent/child rows to their same respective table besides using the method I have listed below (my manager does not really like the idea of the "PreviousID" field)? More details are listed below.
My Table Situation: I have a parent table and a child table. Both tables have an identity column as the primary key. The relationship between the tables is established using the parent table's primary key to the column in the child table that stores the relationship. The identity column in both tables is the only column that is unique in the tables.
Sample Data (made up and simplified for visualization purposes):
ParentTable - "Items"
ID, ItemCategory, Price
1,T-Shirt,$20
2,Blue Jeans,$50
3,T-Shirt,$40
ChildTable - "Components
ID ItemID Component
1,1,Fabric
2,1,ScreenPrinting
3,2,Fabric
4,2,Zipper
5,3,ThickFabric
6,3,ScreenPrinting
7,3,Elastic
My Need: I need to make a copy of the records (keeping the parent/child relationship intact) to the same table as the source records. For example, in my data example above, I may need to make a copy of all the "T-Shirt" items and their child records. As the parent records are copied, they will be assigned new keys since the primary key is an identity. Obviously, this new key needs to be used when creating the child records, but I need to somehow associate this new key to the new child records.
Possible Solution: I know this can be achieved by adding another column to the parent table to store the "PreviousID" (INT NULL). Using this new field, when I want to copy the "T-Shirt" items, I would insert the new records and store the ID of the source records (i.e. the identity value of the row that was copied would be stored in this "PreviousID" field). Once the parent records have been copied, I could then insert the child records, joining on the "PreviousID" field to get the new ID to use when inserting the copies of the child records.
Thanks for reading this and for any help offered.
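One alternative to the PreviousID column, sketched below under the assumption that SQL Server 2008's MERGE statement is available: unlike a plain INSERT ... OUTPUT, the OUTPUT clause of MERGE can reference source columns, so the old-to-new key mapping can be captured in a table variable and used to copy the child rows. The table and column names follow the sample data above.
-- Hypothetical sketch; assumes SQL Server 2008 or later for MERGE ... OUTPUT with source columns.
DECLARE @Map TABLE (OldID int, NewID int);

MERGE dbo.Items AS tgt
USING (SELECT ID, ItemCategory, Price FROM dbo.Items WHERE ItemCategory = 'T-Shirt') AS src
ON 1 = 0                                   -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (ItemCategory, Price) VALUES (src.ItemCategory, src.Price)
OUTPUT src.ID, inserted.ID INTO @Map (OldID, NewID);

INSERT INTO dbo.Components (ItemID, Component)
SELECT m.NewID, c.Component
FROM   dbo.Components AS c
       INNER JOIN @Map AS m ON m.OldID = c.ItemID;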
View 2 Replies
View Related
Sep 5, 2015
My report looks like this:
In the above report, the maximum total quantity is 125929 and its respective date is 7/1/2013.
How can I insert a row (top or bottom) that gives the date of the maximum quantity?
View 7 Replies
View Related
Jul 22, 2013
Overall goal: Write a Bulk Insert statement using the UNC path of a filetable directory.
Issue: When using the UNC path of the filetable directory in a Bulk Insert statement, I receive "Operating system error code 50 (The request is not supported.)". I am looking for confirmation as to whether this is truly not supported.
Environment: SQL Server 2012 Standard. Windows Server 2008 R2 Standard
View 9 Replies
View Related
Feb 23, 2008
Hello Guys,
I am working at a company and currently I am assigned to a new project for data migration from company X to our company Y using SSIS. I am totally new to this, and I have just completed the 5 tutorials given on the MSDN website.
Basically, the client is going to send us a first flat file with 1 million records, with header, detail and trailer records.
I want to create a package in such a way that it dumps this first load into 7 to 8 different tables at a time.
We also have to include functionality for validation and error checking.
On a successful load, the error file should contain only the header and trailer but no detail records.
If there are any errors, then the error file should contain the header, the detail records which could not be loaded, plus the trailer, which we have to send back to the client.
When the 2nd file comes, we have to check whether each record is new or a change (update), depending on a flag which tells us.
This is basically the high-level idea of the package I need to create. If you guys have any questions, let me know.
I know you guys are very experienced. If anyone could give me some detailed ideas on it I would really appreciate it.
I have a very limited timeline for it.
Thanks
Shah
View 4 Replies
View Related
Aug 2, 2007
Hi,
I'm new to SQL Server 2005 SSIS. I'm trying to do something very simple, but I cannot figure it out, please help!
I have a flat file, which I read and then insert into a database table; that works fine. The problem is that I don't want to insert duplicate records. For example, if I run the package again, it will append to the table. What I need is that if the package runs again, it checks whether the record already exists, based on two columns, date and hour, and does not insert the record.
Thank you,
Aldo
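One common approach, sketched as T-SQL and assuming the flat file is first landed in a staging table (the table and column names are placeholders); within the data flow itself, the equivalent is a Lookup transformation against the destination on the date and hour columns, sending only the non-matching rows to the OLE DB destination:
-- Hypothetical sketch; staging/destination table and column names are placeholders.
INSERT INTO dbo.Destination (LoadDate, LoadHour, Value)
SELECT s.LoadDate, s.LoadHour, s.Value
FROM   dbo.Staging AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.Destination AS d
                   WHERE  d.LoadDate = s.LoadDate
                     AND  d.LoadHour = s.LoadHour);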
View 1 Replies
View Related
May 18, 2015
We are planning to move all of our System Center databases, which currently reside on the front-end servers of each System Center application, to a centrally located SQL 2012 server. We'd like to centralize everything and have our DBA care for the server. Here is our setup:
SCOM has 1 monitoring and 1 data warehouse server. SCCM has 1 server with all roles on it. The DPM database is on the same server as the application. Same with SCVMM. I have 2 questions regarding this move:
1. Can I have all these databases running on 1 SQL instance?
2. Is there a best practice document that highlights the steps and "gotchas"?
View 4 Replies
View Related
Mar 18, 2015
Can we bulk insert only the desired columns from a flat file to a table?
I am using SSIS to bulk insert from a file with more than 200 columns. I am trying to find a way to bulk insert them into multiple tables through SSIS.
The one way I can think of is to pre-map the columns from the file to the destination tables and build numerous Bulk Insert tasks to achieve that, but I'm not sure if SSIS will let me do that.
View 4 Replies
View Related
Jun 20, 2006
Hi ,
I used the following data flow for my package, using Oracle 8i:
Flat File Source -------------> Data Conversion ---------------> OLE DB Destination (Oracle data table)
I use the above data flow to map the source file to the destination. When I run the SSIS package I get the following error:
"Truncation may occur due to inserting data from data flow column "ColumnName" with a length of 50"
Regarding this error, I understood it happens because of the data length, so I changed the source column length to exactly match the destination table.
I am still getting this error. Can anyone give me a solution? Should I change the data type as well?
Please give your suggestions.
Thanks & Regards
Jeyakumar.M
chennai
View 3 Replies
View Related
Apr 29, 2015
I have a data flow task in a Foreach Loop container in the control flow of my SSIS package. This Foreach Loop container reads the CSV files from a specified location one by one and populates a variable with the current file name. Note that the tables where I would like to push the data from each CSV file also have the same names as the CSV files. In the data flow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file.
Now, here my question comes: how can I move the data to the destination SQL table using the same variable name? I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time. It does not change the mappings as per the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.
Which is the best way to set up the data flow task for this requirement? Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
View 10 Replies
View Related
Jul 15, 2004
I have some log files that are getting extremely large as I teach myself importing/exporting/transformations and such. I need to shrink the log file to a manageable size. The data file is 16 GB and the log file is 16 GB. Help...
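A hedged sketch of the usual approach on SQL 2000-era systems: either take a real log backup or truncate the log if point-in-time recovery is not needed, then shrink the physical file; the database name and logical log file name below are placeholders, and truncating the log breaks the log-backup chain:
-- Hypothetical sketch; database and logical log file names are placeholders.
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY      -- or BACKUP LOG MyDatabase TO DISK = '...' to keep the chain
DBCC SHRINKFILE (MyDatabase_Log, 500)         -- target size in MB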
View 2 Replies
View Related
Feb 3, 2010
We can easily load a file into db tables. However, my main concern here is the number of columns in the file. A text file TEXT_1400.txt has 1400 columns. I am unable to load data into my db table using BCP or BULK INSERT commands, as a maximum of 1024 columns are allowed per table in SQL Server 2008.
We can still go ahead and create a 'Wide Table' (a special table that holds up to 30,000 columns; the maximum size of a wide table row is 8,019 bytes). But when operating on the wide table, BCP/BULK INSERT commands still fail. After a few hours of scratching my head over BCP and BULK INSERT, I observed that while inserting, BCP/BULK INSERT commands are unable to identify SPARSE columns and skip them, which disturbs the column mapping and results in data conversion and truncation errors.
Is there any proper way to load this kind of files into the db table?
View 6 Replies
View Related