Data Access :: Bulk Fetch Records And Insert / Update Same In Other Table With Some Business Logic
Apr 21, 2015
I am currently working with C and SQL Server 2012. My requirement is to bulk fetch records and insert/update them in another table with some business logic. How do I do this?
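One set-based option in SQL Server 2012 is to bulk load the fetched records into a staging table and then apply them to the target with MERGE, expressing the business logic in the match and update conditions. This is only a sketch; all table and column names here are hypothetical:

MERGE dbo.TargetTable AS t
USING dbo.StagingTable AS s
    ON t.BusinessKey = s.BusinessKey
WHEN MATCHED AND s.Amount <> t.Amount THEN        -- example business rule: only touch changed rows
    UPDATE SET t.Amount     = s.Amount,
               t.ModifiedOn = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Amount, ModifiedOn)
    VALUES (s.BusinessKey, s.Amount, GETDATE());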
How do I insert and update a large number of records (4 lakh, i.e. 400,000) into a destination table through Business Intelligence Development Studio using an SSIS package? How do I achieve this? I tried a left outer join and a Conditional Split, but the problem is that it's not able to insert and update records simultaneously. Can anyone give me a sample? Thanks & Regards, Jeyakumar.M
I have got a business logic update conflict handler working, but I have had to work round what appears to be a bug.
Please can someone confirm if this is indeed a bug, and if it is a known bug?
My conflict handler needs to take some columns from the publisher row and some from the subscriber row in the event of conflict.
I can quite happily generate a custom dataset which contains the winning row that I want; I can see that because I can step through the conflict handler with debug when a conflict occurs.
However, just returning ActionOnUpdateConflict.AcceptCustomConflictData from the UpdateConflictsHandler method does not set the publisher and subscriber columns correctly. I end up with different values on the two databases.
I have found that the only way to get the correct rows on both publisher and subscriber is to create a new ADO connection to the publisher and actually perform an update, updating all the modified columns. This now works reliably in my testing.
Fortunately, due to business rules, update conflicts are likely to be very infrequent, but I would very much like to avoid having to do the 'unnecessary' update.
Notes:
I am using column level tracking, but I have seen the problem with row level tracking too.
I have mainly been using SP1, but have repeated the test on a configuration using the SP2 CTP, and the problem occurs there too.
The problem is not due to complex logic in my code. If the method just sets customDataSet = publisherDataSet.Copy and then returns ActionOnUpdateConflict.AcceptCustomConflictData, the changed and winning publisher values are not sent to the subscriber.
I have a master securities table which has 7 fields. As part of the daily process I am uploading flat files into database tables. The flat files contain the master (static) security data as well as the analytics (transaction) data. I need to:
1) separate the master (static) data from the flat files,
2) check whether that data is present in the master table; if not, insert that data into the master table,
3) if the data is present, move the existing record to a history table and then update the main master table.
All the 7 fields need to be checked to uniquely identify a single record in the master table.
How can this be done? Can we use a combination of data flow items, or should we write a SQL procedure to do all this?
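A rough T-SQL shape for steps 2 and 3, assuming the flat file has already been landed in a staging table and that the seven fields are named F1 through F7 (all names here are hypothetical); in SSIS the same split is often done with a Lookup transformation feeding separate insert and update paths:

-- 2) insert rows not yet present in the master table (matched on all 7 fields)
INSERT INTO dbo.SecurityMaster (F1, F2, F3, F4, F5, F6, F7)
SELECT DISTINCT s.F1, s.F2, s.F3, s.F4, s.F5, s.F6, s.F7
FROM dbo.SecurityStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.SecurityMaster AS m
                  WHERE m.F1 = s.F1 AND m.F2 = s.F2 AND m.F3 = s.F3 AND m.F4 = s.F4
                    AND m.F5 = s.F5 AND m.F6 = s.F6 AND m.F7 = s.F7);

-- 3) copy rows that do already exist into the history table before refreshing them
INSERT INTO dbo.SecurityMasterHistory (F1, F2, F3, F4, F5, F6, F7, ArchivedOn)
SELECT m.F1, m.F2, m.F3, m.F4, m.F5, m.F6, m.F7, GETDATE()
FROM dbo.SecurityMaster AS m
WHERE EXISTS (SELECT 1 FROM dbo.SecurityStaging AS s
              WHERE s.F1 = m.F1 AND s.F2 = m.F2 AND s.F3 = m.F3 AND s.F4 = m.F4
                AND s.F5 = m.F5 AND s.F6 = m.F6 AND s.F7 = m.F7);

The update of the master rows in step 3 would then follow the same correlated WHERE pattern as the history insert.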
I have three tables: Student, Daily_Attendance_Master and Daily_Attendence_Details.
I want to run SQL to insert or update student attendance (absent or present) in Daily_Attendence_Details based on Daily_Attendance_Master_Id and Student_Id (from one roll number to another).
If both are present in table Daily_Attendence_Details, then I want to update the attendance from one roll number to another roll number in Daily_Attendence_Details on the basis of Daily_Attendence_Details_Id.
And if either or both are not present, I want to insert the student attendance from one roll number to another roll number in Daily_Attendence_Details.
I give below the structure of the three tables: Student, Daily_Attendance_Master and Daily_Attendance_Details.
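The insert-or-update step can be written as an IF EXISTS check on the two keys mentioned above. A minimal sketch; the status column, parameter names and sample values are hypothetical, since the actual table structures are not reproduced here:

DECLARE @MasterId int, @StudentId int, @Status varchar(10);
SET @MasterId = 1;            -- hypothetical values for the roll number being processed
SET @StudentId = 101;
SET @Status = 'Present';      -- or 'Absent'

IF EXISTS (SELECT 1 FROM Daily_Attendence_Details
           WHERE Daily_Attendance_Master_Id = @MasterId
             AND Student_Id = @StudentId)
    UPDATE Daily_Attendence_Details
    SET Attendance_Status = @Status
    WHERE Daily_Attendance_Master_Id = @MasterId
      AND Student_Id = @StudentId;
ELSE
    INSERT INTO Daily_Attendence_Details (Daily_Attendance_Master_Id, Student_Id, Attendance_Status)
    VALUES (@MasterId, @StudentId, @Status);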
I receive the following error message when I try to use the Bulk Insert Task to load BCP data into a table:
Error: 0xC002F304 at Bulk Insert Task, Bulk Insert Task: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 4. Verify that the field terminator and row terminator are specified correctly. Bulk load data conversion error (overflow) for row 1, column 1 (rowno).".
Task failed: Bulk Insert Task
In SSMS I am able to issue the following command and the data loads into a TableName table with no error messages: BULK INSERT TableName FROM 'C:DataDbTableName.bcp' WITH (DATAFILETYPE='widenative');
What configuration is required for the Bulk Insert Task in SSIS to make the data load? BTW, the TableName.bcp file is a bulk copy file in the bcp widenative data type. The properties of the Bulk Insert Task are the following: DataFileType: DTSBulkInsert_DataFileType_WideNative, RowTerminator: {CR}{LF}.
Any help getting the bcp file to load would be appreciated. Let me know if you require any other information, thanks for all your help. Paul
What is the way we could implement business logic from a table, by storing its statements in a column and defining an Execute SQL step to execute it? Some legal requirements make it difficult for us to create/modify stored procedures, so I want to have a process where we create new rows in a table and execute them to execute business logic. All views are welcome. Having one table or two tables, different approaches to store statements and execute them... Case logic, how to implement it? Flags in the table, columns in the table, etc. Thanks, Ajay Garg
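One common shape for this is a rules table holding the statement text plus a small driver that runs each active row through sp_executesql. This is only a sketch with invented table and column names, and dynamic SQL stored in a table still deserves the same change control and permission review a stored procedure would get:

CREATE TABLE dbo.BusinessRule (
    RuleId   int IDENTITY(1,1) PRIMARY KEY,
    RuleSql  nvarchar(max) NOT NULL,
    IsActive bit NOT NULL DEFAULT (1)
);

DECLARE @id int, @sql nvarchar(max);
DECLARE rule_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT RuleId, RuleSql FROM dbo.BusinessRule WHERE IsActive = 1 ORDER BY RuleId;
OPEN rule_cur;
FETCH NEXT FROM rule_cur INTO @id, @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sys.sp_executesql @sql;     -- run the stored statement
    FETCH NEXT FROM rule_cur INTO @id, @sql;
END;
CLOSE rule_cur;
DEALLOCATE rule_cur;

Case logic and flags can then be extra columns on the rules table that the driver's WHERE clause (or the stored statements themselves) consult.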
I am getting inconsistent results when BULK INSERTing data from a tab-delimited text file. As part of my testing, I run the same code on the same file again and again, and I get different results every time! I get this on SQL 2005 and SQL 2012 R2. We have an application that imports data from a spreadsheet. The sheet contains section headers with account numbers and detail rows with transactions by date:
AAAA.1234 /* (account number) */
1/1/2015     $150        First Transaction
1/3/2015     $24.233     Second Transaction

BBBB.5678
1/1/2015     $350        Third Transaction
1/3/2015     $24.233     Fourth Transaction
My import program saves this spreadsheet as tab-delimited text, then I use BULK INSERT to bring the data into a generic table full of varchar(255) fields. There are about 90,000 rows in each day's data; after the BULK INSERT about half of them are removed for various reasons. Next I add a RowID column to the table with the IDENTITY (1,1) property. This gives my raw data unique row numbers.
I then run a routine that converts and copies those records into another holding table that's a copy of the final destination table. That routine parses through the data, assigning the account number in the section header to each detail row. It ends up looking like this:
AAAA.1234    1/1/2015    $150       First Purchase
AAAA.1234    1/3/2015    $24.233    Second Purchase
BBBB.5678    1/1/2015    $350       Third Purchase
BBBB.5678    1/3/2015    $24.233    Fourth Purchase
My technique: I use a cursor to get the starting RowID for each Account Number, and I then use the upper and lower RowIDs to do an INSERT into the final table. The query looks like this:
SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber FROM GenericTable WHERE RowHeader LIKE '____.____%'
Results look like this:
But every time I run the routine, I get different numbers! My results are not accurate; I get inconsistent results every time. Here is my code, with table, field and account names changed for business confidentiality. This is a high-profile project at my company.
TRUNCATE TABLE GenericImportTable;
ALTER TABLE GenericImportTable DROP COLUMN RowID;
BULK INSERT GenericImportTable FROM 'SERVERGeneralAppnameDataFile.2015.05.04.tab.txt'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = ' ', FIRSTROW = 6)
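For what it's worth, the account-number assignment itself can be done without a cursor. A sketch, assuming RowID reliably reflects file order; note that an IDENTITY column added after the load does not guarantee that ordering, which may itself be the source of the run-to-run differences:

;WITH Headers AS (
    SELECT RowID, SUBSTRING(RowHeader, 6, 4) + '.UBC1' AS AccountNumber
    FROM GenericImportTable
    WHERE RowHeader LIKE '____.____%'
)
SELECT d.RowID, d.RowHeader, a.AccountNumber
FROM GenericImportTable AS d
CROSS APPLY (
    -- take the account number from the closest header row at or above this row
    SELECT TOP (1) h.AccountNumber
    FROM Headers AS h
    WHERE h.RowID <= d.RowID
    ORDER BY h.RowID DESC
) AS a
WHERE d.RowHeader NOT LIKE '____.____%';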
I have to update a field within a table of 60 records or so. Each record has a different field value; it's type varchar. I was given an Excel file with the field values and was thinking of a bulk update like bulk insert, but I don't recall that it's possible that way.
Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?
Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
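The staging route is the usual pattern, and for 60 rows it is quick: load the key/value pairs from the spreadsheet (saved as CSV) into a work table, then run one UPDATE with a join. A sketch with invented names and paths:

CREATE TABLE #ExcelValues (RecordKey int, NewValue varchar(255));

BULK INSERT #ExcelValues
FROM 'C:\temp\values.csv'                        -- hypothetical path to the exported spreadsheet
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

UPDATE t
SET t.TargetField = v.NewValue
FROM dbo.TargetTable AS t
INNER JOIN #ExcelValues AS v ON v.RecordKey = t.RecordKey;

For only 60 rows, a hand-built multi-row script would also work; the staging/UPDATE route just keeps it set-based and repeatable.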
I'd like to do the following with a data flow task:
Get all the records from a source (for example, customers from a text file via a Flat File Source). Then check for each record whether the customer already exists in a table, for example by customerID. If not, insert the record into the table (OLE DB Destination); otherwise, copy the customer that's already in the table to another table (a history table) and update the record with the customer from the text file.
Is this possible, and what kind of data flow transformation do I need?
I have a bulk insert situation that would be nice to be able to pull off. I have a flat file with 46 columns that are to go into a table. In the table, I want to have a 47th column to be updated later on by means of a stored proc saying whether the import into the system was successful or not. I have the rowterminator set as '"', thinking that would tell SQL to begin on the next row, leaving the importstatus column null, but I still receive an error.
First of all, is this idea possible within this insert statement? Secondly, if so, what would be the syntax to tell the insert statement to skip that particular column? It is the last column listed in the table, so it just needs to start on the next row after it inserts the last bit of data from the flat file.
If this is not possible, is it possible to bulk insert into a temp table?
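Yes, bulk inserting into a temp (or staging) table is possible and is a common way around this: BULK INSERT cannot skip a destination column on its own, but a staging table that matches the 46 file columns can be loaded and then copied across, leaving the 47th column NULL. A rough sketch with hypothetical names and path (an alternative is a format file that maps only the 46 file fields):

-- staging table mirrors the 46 file columns (only two shown here)
CREATE TABLE #ImportStage (Col1 varchar(255), Col2 varchar(255) /* ... 46 columns total ... */);

BULK INSERT #ImportStage
FROM 'C:\import\datafile.txt'                    -- hypothetical path
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

-- ImportStatus is simply not listed, so it stays NULL for the stored proc to fill in later
INSERT INTO dbo.TargetTable (Col1, Col2 /* ... */)
SELECT Col1, Col2 /* ... */
FROM #ImportStage;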
I saved the result into a csv file and then truncated the table. Now, I am trying to bulk insert the data into the table. So I used:
bulk insert rdb.dbo.scd_event_tab
from 'C:userssluintel.ctrdesktopeventtab.csv'
with (codepage = 'RAW', datafiletype = 'native', fieldterminator = ' ', keepidentity, keepnulls);
go
However, I get this error:
Msg 4867, Level 16, State 1, Line 1
Bulk load data conversion error (overflow) for row 1, column 1 (JOB_ID).
Msg 4866, Level 16, State 5, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 3. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I am running a set of SQL statements on a SQL Server to insert flat file data into a SQL table. The flat file has already been FTP'ed to the SQL Server. I seem to be getting an error which possibly points to a permissions issue.
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:jedox_dailyjdcom4401.txt'
WITH ( FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = ' ' )
GO
The error is: Msg 4861, Level 16, State 1, Line 1 Cannot bulk load because the file "c:jedox_dailyjdcom4401.txt" could not be opened. Operating system error code 3 (failed to retrieve text for this error. Reason: 1815)
If it is permissions issue, how do I overcome this?
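Operating system error code 3 is "path not found" rather than "access denied", and with BULK INSERT the path is resolved on the SQL Server machine under the SQL Server service account, not on the machine running the script. Two things worth checking, sketched below; the UNC path is hypothetical, and xp_fileexist is undocumented but handy for seeing what the server itself can reach:

-- can the SQL Server service see the file at all?
EXEC master..xp_fileexist 'c:\jedox_daily\jdcom4401.txt';    -- hypothetical reconstruction of the path

-- or load from a share the service account has been granted read access to
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM '\\fileserver\jedox_daily\jdcom4401.txt'                -- hypothetical UNC path
WITH (FIRSTROW = 2, MAXERRORS = 0, FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');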
On my site users can register using the ASP.NET Membership CreateUserWizard control. I am also using the wizard control to design a simple question-and-answer form that logged-in users have access to. It has 2 questions, with a text box for Q1 and a dropdown list for Q2. I have a table in my database called "Players" which has 3 columns: UserId (primary key, of type uniqueidentifier), PlayerName (type string), PlayerGenre (type string).
On completing the wizard and clicking the Finish button, I want the data to be inserted into the SQL Express Players table. I am having problems getting this to work and keep getting exceptions. It would be very helpful if somebody could check the code and advise where the problem is.
To match the answers to the user, I get the UserId and insert this into the database too.

protected void Wizard1_FinishButtonClick(object sender, WizardNavigationEventArgs e)
{
    SqlDataSource DataSource = (SqlDataSource)Wizard1.FindControl("InsertArtist1");
    MembershipUser myUser = Membership.GetUser(this.User.Identity.Name);
    Guid UserId = (Guid)myUser.ProviderUserKey;
    String Gender = ((DropDownList)Wizard1.FindControl("PlayerGenre")).SelectedValue;
    DataSource.InsertParameters.Add("UserId", UserId.ToString());
    DataSource.InsertParameters.Add("PlayerGenre", Gender.ToString());
    DataSource.Insert();
}
Our system runs a SQL Server 2012 DB; it has a table (table_a) which has over 10M records. Our system has to receive a data file from the previous system daily, which contains approximately 3M updated or new records for table_a. My job is to update table_a with the new data.
The initial solution is:
1 Create a table (table_b) whose structure is the same as table_a's
2 Use BCP to import updated records into table_b
3 Remove outdated data from table_a: DELETE a FROM table_a AS a INNER JOIN table_b AS b ON a.key_fields = b.key_fields
4 Append updated or new data into table_a: insert into table_a select * from table_b
In testing, this solution is very inefficient; step 3 alone costs several hours, for example. How can I improve it?
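Much of the cost in step 3 typically comes from an unindexed join plus one enormous logged delete. A sketch of two common mitigations, assuming key_fields is the join key (the index name and batch size are illustrative); replacing steps 3 and 4 with a single MERGE is another option:

-- index the staging table on the join key so the delete can seek instead of scan
CREATE INDEX IX_table_b_key ON table_b (key_fields);

-- delete in chunks to keep locking and transaction-log growth manageable
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (50000) a
    FROM table_a AS a
    INNER JOIN table_b AS b ON a.key_fields = b.key_fields;
    SET @rows = @@ROWCOUNT;
END;

INSERT INTO table_a SELECT * FROM table_b;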
Hi, I have a data file which consists of data as below, 4 PPU_FFA7485E0D|| T_GLR_DET_11||
While I am inserting into the table using bulk insert, this pipe (||) is also getting inserted into the table. Here is the query I am using to insert the data using bulk insert.
BULK INSERT TABLE_NAME FROM FILE_PATH WITH (FIELDTERMINATOR = ''||'''+',KEEPNULLS,FIRSTROW=2,ROWTERMINATOR = '''')
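If the intent is just to treat the two-character sequence || as the separator, BULK INSERT accepts a multi-character FIELDTERMINATOR directly, so the nested quoting above should not be needed. A minimal sketch; the path and row terminator are placeholders:

BULK INSERT TABLE_NAME
FROM 'C:\data\datafile.txt'          -- placeholder path
WITH (FIELDTERMINATOR = '||', ROWTERMINATOR = '\n', KEEPNULLS, FIRSTROW = 2);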
I have a web form with a text field that needs to take in as much as the user decides to type and insert it into an nvarchar(max) field in the database behind it. I've tried using the new .WRITE() method in my update statement, but it cuts off the text after a while. Is there a way to insert/update this in SQL 2005 without resorting to BULK INSERT? It bloats the transaction log, and turning the logging off requires a call to sp_dboption (or a straight-up ALTER DATABASE), which I'd like to avoid if I can.
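For reference, the usual append pattern is an UPDATE with the .WRITE clause and an nvarchar(max) parameter; truncation is often caused by declaring the parameter (or an intermediate variable) as a fixed-length nvarchar(n) rather than nvarchar(max). A sketch with hypothetical table and column names:

DECLARE @chunk nvarchar(max);
SET @chunk = N'...more user-supplied text...';

UPDATE dbo.UserText                              -- hypothetical table
SET Body.WRITE(@chunk, NULL, 0)                  -- NULL offset appends @chunk to the end of Body
WHERE UserTextId = 1;                            -- hypothetical key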
I want to update table2.message based on the criteria of table1.name. For example, all records named John will be updated with 'Msg1' in table2.message. I am using MS SQL 2000, and below is the scenario.
table1 columns ID Name
table2 columns ID Message
Select a.Id, a.name, b.message from table1 a, table2 b where a.id =b.id
a.id  a.name  b.message
1     John    Msg1
2     Steve   Msg2
3     Scott   Msg3
4     John    NULL  - update b.message to 'Msg1'
5     Steve   NULL  - update b.message to 'Msg2'
6     Scott   NULL  - update b.message to 'Msg3'
7     John    NULL  - update b.message to 'Msg1'
8     Steve   NULL  - update b.message to 'Msg2'
If I update the records per name, I use the query below, and I am pre-selecting all the existing names.
update table2 b set b.message=(Select top 1 b.message from table1 a, table2 b where a.id =b.id
[Code] ...
How to update this in bulk without preselecting all the names?
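A set-based approach that works on SQL 2000 is to derive, per name, one known message from the rows that already have one, and apply it to the rows where the message is still NULL. Column and table names follow the scenario above:

UPDATE b
SET b.message = src.message
FROM table2 AS b
INNER JOIN table1 AS a ON a.ID = b.ID
INNER JOIN (
    -- one known message per name, taken from rows that already have one
    SELECT a1.Name, MAX(b1.message) AS message
    FROM table1 AS a1
    INNER JOIN table2 AS b1 ON a1.ID = b1.ID
    WHERE b1.message IS NOT NULL
    GROUP BY a1.Name
) AS src ON src.Name = a.Name
WHERE b.message IS NULL;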
Or maybe it can't find something it's looking for to make my addition symmetrical with everything else. This is from one of my business logic files. What causes Visual Studio to not allow me to add something here and have it behave like the other parameters? I'm talking about ShortDesc. Visual Studio inserted () and Get instead of GET like all the others. It would not allow me to capitalize the "get". There has to be a reason for this. Currently my program is adding all of the fields except ShortDesc to the table in my database. Does anyone have any thoughts as to why this is the case?
#Region "Pvt Members" Private _ArticleID As System.Int32 Private _UserID As System.Int32 Private _WebSiteID As System.Int32 Private _ArticleTitle As System.String Private _Author As System.String Private _ShortDesc As System.String Private _ArticleText As System.String Private _AnchorText As System.String Private _ShowInDirectory As System.Boolean Private _Active As System.Boolean Private _DateAdded As System.DateTime#End Region#Region "Properties" Public Property ArticleID As System.Int32 GET Return _ArticleID End Get Set(ByVal Value As System.Int32) _ArticleID= Value End Set End Property Public Property UserID As System.Int32 GET Return _UserID End Get Set(ByVal Value As System.Int32) _UserID= Value End Set End Property Public Property WebSiteID As System.Int32 GET Return _WebSiteID End Get Set(ByVal Value As System.Int32) _WebSiteID= Value End Set End Property Public Property ArticleTitle As System.String GET Return _ArticleTitle End Get Set(ByVal Value As System.String) _ArticleTitle= Value End Set End Property Public Property Author As System.String GET Return _Author End Get Set(ByVal Value As System.String) _Author= Value End Set End Property Public Property ShortDesc() As System.String Get Return _ShortDesc End Get Set(ByVal value As System.String) End Set End Property Public Property ArticleText As System.String GET Return _ArticleText End Get Set(ByVal Value As System.String) _ArticleText= Value End Set End Property Public Property AnchorText As System.String GET Return _AnchorText End Get Set(ByVal Value As System.String) _AnchorText= Value End Set End Property Public Property ShowInDirectory As System.Boolean GET Return _ShowInDirectory End Get Set(ByVal Value As System.Boolean) _ShowInDirectory= Value End Set End Property Public Property Active As System.Boolean GET Return _Active End Get Set(ByVal Value As System.Boolean) _Active= Value End Set End Property Public Property DateAdded As System.DateTime GET Return _DateAdded End Get Set(ByVal Value As System.DateTime) _DateAdded= Value End Set End Property #End Region
I have a csv that contains attendance records that I get daily from a 3rd party grade book solution. I need to import it directly into the attend table in our student database.
The file is set up as follows: school year, school number, student_id, absence date, absence code, course number, section number.
I need to check the student schedule to see if they are scheduled for that class when the import runs. So if they had a schedule change in the middle of the day it won't post attend to a dropped class.
I have done something similar to this before with the way I export teachers out to our grade book. I have it check the master schedule to see if the teacher is teaching at least one class, that way it won't export tutors and office staff to the grade book. I used the script below to do that, but I'm not sure how to apply it to a bulk insert.
Code: Script used to export teachers (note the last four lines, which check the master schedule)
USE [GSchool]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ....
Secondly, I need to check the date of the record and overwrite a record if one already exists for that exact course and section for that student. I need this because if they change a previous day from an unexcused absence to excused, I need to get rid of the unexcused record by overwriting it.
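A rough sketch of the overall shape, assuming the CSV is first bulk inserted into a staging table and that the schedule lives in a table keyed by student, course and section. All table and column names below are placeholders, not the real GSchool schema:

-- 1) land the raw CSV in a staging table
BULK INSERT dbo.AttendStage
FROM 'C:\imports\attendance.csv'                 -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- 2) remove any existing attend row for the same student/date/course/section,
--    so a corrected absence code overwrites the old one
DELETE a
FROM dbo.attend AS a
INNER JOIN dbo.AttendStage AS s
    ON  a.student_id   = s.student_id
    AND a.absence_date = s.absence_date
    AND a.course_num   = s.course_num
    AND a.section_num  = s.section_num;

-- 3) insert only rows where the student is still scheduled for that class
INSERT INTO dbo.attend (student_id, absence_date, absence_code, course_num, section_num)
SELECT s.student_id, s.absence_date, s.absence_code, s.course_num, s.section_num
FROM dbo.AttendStage AS s
INNER JOIN dbo.schedule AS sch
    ON  sch.student_id  = s.student_id
    AND sch.course_num  = s.course_num
    AND sch.section_num = s.section_num;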
One more thing that would be nice but is optional: is there a way to send a log of errors on the import via email?
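If Database Mail is configured, one option is to have BULK INSERT write rejected rows to an error file via its ERRORFILE option and then attach that file with msdb.dbo.sp_send_dbmail. Extending the staging load from the sketch above, with placeholder paths, recipients and mail profile:

BULK INSERT dbo.AttendStage
FROM 'C:\imports\attendance.csv'                 -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2,
      MAXERRORS = 100,
      ERRORFILE = 'C:\imports\attendance_errors.log');

EXEC msdb.dbo.sp_send_dbmail
    @profile_name     = 'ImportAlerts',                  -- placeholder Database Mail profile
    @recipients       = 'attendance@school.example',
    @subject          = 'Attendance import errors',
    @body             = 'Rejected rows are attached.',
    @file_attachments = 'C:\imports\attendance_errors.log';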
I need to write a SQL statement to insert data into a table. Not a problem. But can I insert more than one row/record at a time?
e.g. if I have a 'Person' table with 'First_Name' and 'Last_Name' fields and 50 people to put in, can I do them all in the same statement, or do I have to keep reusing the query 50 times?
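On SQL Server 2008 and later, a single INSERT can take multiple rows via the VALUES row constructor (up to 1000 rows per statement); on older versions the equivalent is INSERT ... SELECT with UNION ALL. Two short sketches using the Person table described above:

-- SQL Server 2008+: one INSERT, many rows
INSERT INTO Person (First_Name, Last_Name)
VALUES ('John', 'Smith'),
       ('Jane', 'Doe'),
       ('Alan', 'Jones');

-- Older versions: UNION ALL as the row source
INSERT INTO Person (First_Name, Last_Name)
SELECT 'John', 'Smith'
UNION ALL SELECT 'Jane', 'Doe'
UNION ALL SELECT 'Alan', 'Jones';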
Hey guys, I have to import a CSV file into my database and then add some values to other fields based on these values. At present I use a stored procedure that does a bulk insert, and then I use an update and fetch inside the SQL to perform these updates. I am not sure if this is the best way. Is this bad, since some amount of business logic is in the stored procedure? If it is, how can I do this better? Thanks for the answer.
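Keeping derivation logic like this in the database is a common and defensible choice; the part most worth revisiting is usually the fetch, i.e. a row-by-row cursor, which can normally be replaced with one set-based UPDATE over the freshly imported rows. A minimal sketch, with entirely hypothetical table and column names:

-- after the BULK INSERT into dbo.ImportStage, derive the extra fields in one pass
UPDATE s
SET s.Region    = r.Region,
    s.PriceBand = CASE WHEN s.Amount >= 1000 THEN 'High' ELSE 'Standard' END
FROM dbo.ImportStage AS s
INNER JOIN dbo.RegionLookup AS r
    ON r.PostalCode = s.PostalCode;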