Integration Services :: Purge Data In Transaction Table Or Delete Some Data And Store In Separate Table

Aug 18, 2015

How can we purge data in a transaction table, or delete some data and store it in a separate table, in a data warehouse?

View 7 Replies



Integration Services :: Load Incremental Data Into Fact Table When Source Table Not Have Timestamp And Unique Key

Sep 24, 2015

I have a transaction table with about 40 crore (400 million) rows in the source. It has no timestamp or unique key columns, only Bill_month and Bill_Year columns. For loading this table into staging I added a new datetime column (bill_date) built from those columns, with the day defaulted to 01. Then:

* First we delete the last 3 months of data from the staging table.
* Get the last 3 months of data from the source table.
* Load those 3 months of data from the source into the staging table.

We do this because we only receive updates for the last three months of data. Now I have to include this transaction table as a fact table in the DW. What is the best practice for loading the fact table from the staging table? We also have to look up dimension keys to populate the foreign keys.

* Should I implement the same method of deleting the last 3 months of records and loading them again?
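
A minimal T-SQL sketch of the rolling three-month delete-and-reload into the fact table, assuming hypothetical names (FactBilling, StgBilling, DimCustomer) and SQL Server 2012+ for DATEFROMPARTS:

---------------------
DECLARE @WindowStart DATE =
    DATEADD(MONTH, -3, DATEFROMPARTS(YEAR(GETDATE()), MONTH(GETDATE()), 1));

-- Remove the rolling three-month window from the fact table.
DELETE FROM dbo.FactBilling
WHERE bill_date >= @WindowStart;

-- Reload the same window from staging, resolving dimension foreign keys on the way in.
INSERT INTO dbo.FactBilling (CustomerKey, bill_date, Amount)
SELECT ISNULL(dc.CustomerKey, -1),   -- -1 = "unknown" dimension member for failed lookups
       s.bill_date,
       s.Amount
FROM dbo.StgBilling AS s
LEFT JOIN dbo.DimCustomer AS dc
       ON dc.CustomerCode = s.CustomerCode
WHERE s.bill_date >= @WindowStart;
---------------------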

View 3 Replies View Related

Transact SQL :: How To Purge Data From A Table

Nov 4, 2015

We have a staging table into which data is dumped from files. The staging table is truncated for every load. In order to retain the data from the staging table, we are creating a staging_purge table to hold it. What is the fastest way to copy data from the staging table to the purge table without impacting the load process?
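
One fast option (a sketch, assuming staging and staging_purge have identical column lists and the database is in SIMPLE or BULK_LOGGED recovery) is a minimally logged insert run just before the truncate; if both tables are partition-aligned, ALTER TABLE ... SWITCH is a near-instant metadata-only alternative:

---------------------
-- TABLOCK allows the insert to be minimally logged, which keeps it fast
-- and keeps the impact on the load window small.
INSERT INTO dbo.staging_purge WITH (TABLOCK)
SELECT *
FROM dbo.staging;

TRUNCATE TABLE dbo.staging;   -- the regular load then proceeds as usual
---------------------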

View 13 Replies View Related

Integration Services :: Data Synchronization Using Table Partitioning

Jul 14, 2015

We have a requirement: we have databases on different servers and need to synchronize the data between them. The tables involved are partitioned.

We tried some options:

1) Snapshot replication
2) CDC (Change Data Capture)

What is the best approach to achieve this?
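
If CDC is chosen, enabling it is only a little T-SQL (a sketch; the database and table names are assumptions), and the captured changes can then be shipped to the other server by an SSIS package or agent job:

---------------------
USE SalesDB;

-- Enable CDC at the database level (requires sysadmin).
EXEC sys.sp_cdc_enable_db;

-- Enable CDC on the table that has to be synchronized.
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;     -- NULL = no gating role

-- Changed rows are then available from the generated change function,
-- e.g. cdc.fn_cdc_get_all_changes_dbo_Orders, for applying to the remote server.
---------------------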

View 2 Replies View Related

Automatic Purge Of Transaction Log Data?

Jan 25, 2007

Hello,

I have a database that I am setting up in SQL Server 2005. Initially, I am doing very large imports of data. Every time I run an import, I am having to increase the size of my transaction logs, and now they are approaching 2 GB. Should these be purging themselves? I have to keep increasing the max size of the log so that I can get my data in. While this will work for now, it is not a long-term solution, because I can see the log size growing quite large, and the amount of space on the server obviously isn't infinite. Is there a setting that I can change so they will automatically purge? If not, how do I purge this information myself?

Thanks so much!

Christine
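
The log does not purge itself under the FULL recovery model unless log backups are taken. A sketch of the usual options, assuming the database is named MyDB and point-in-time recovery is not needed during the bulk loads:

---------------------
-- Option 1: SIMPLE recovery lets committed log space be reused automatically.
ALTER DATABASE MyDB SET RECOVERY SIMPLE;

-- Option 2: stay in FULL recovery but back up the log regularly so space can be reused.
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_log.trn';

-- Once the log has been emptied, the physical file can be shrunk if required.
-- (The logical log file name below is an assumption; check sys.database_files.)
DBCC SHRINKFILE (MyDB_log, 512);   -- target size in MB
---------------------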

View 4 Replies View Related

Integration Services :: Import Data From One Table To Another Server

May 27, 2015

I need to import data from one table to another. Both tables have different schemas. Let's say:

1. Employee(Empid, ename, address, designation, Joindate, DOB)
2. Person(name, address, DOB)

I need to import the Person table data into the Employee table.

Note:

1. Empid is auto-increment.
2. If Person.DOB is not present, insert NULL into Employee.DOB.
3. JoinDate should be initialized with the current date.

Currently I am using

1. OLE DB Source
2. Data Conversion
3. OLE DB Destination
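
For reference, the set-based equivalent of that data flow looks roughly like this (a sketch; it could also be run from an Execute SQL Task when both tables live on the same server):

---------------------
INSERT INTO dbo.Employee (ename, address, designation, Joindate, DOB)
SELECT p.name,
       p.address,
       NULL,        -- designation is not supplied by Person
       GETDATE(),   -- JoinDate initialized with the current date
       p.DOB        -- missing Person.DOB values simply arrive as NULL
FROM dbo.Person AS p;
-- Empid is an IDENTITY column, so it is left out of the column list.
---------------------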

View 11 Replies View Related

Integration Services :: How To Load Data From CSV File To Dynamic Table

Jun 11, 2015

I have a requirement to load a bulk set of CSV files into a SQL table. Sometimes some columns do not come in a CSV file (sometimes 100 columns, sometimes 80 columns), and then the package fails. How can I create a table dynamically based on the CSV file structure?

View 8 Replies View Related

Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS SQL Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also published in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed a 3-5 hour outage window each week.

So far I've tested several methods of deleting, following all best practices (batch deletes, using indexes in the delete's WHERE clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on an 8 GB RAM, 4-processor machine that is not currently used or replicated.

I'm at a loss for a way to pare down the data with a delete, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production DB, copy the data to it, then rename the table.

Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.
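
For what it's worth, a typical SQL Server 2000 batched-delete loop looks like the sketch below (table name, cutoff, batch size, and delay are all assumptions); small batches keep each transaction, the log, and replication latency manageable:

---------------------
-- SET ROWCOUNT caps the batch size on SQL Server 2000
-- (on 2005+ the same idea is written as DELETE TOP (n)).
SET ROWCOUNT 5000;

DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.BigTable
    WHERE CreatedDate < '20070101';   -- purge cutoff

    SET @rows = @@ROWCOUNT;

    -- Brief pause so log backups and the replication log reader can keep up.
    WAITFOR DELAY '00:00:01';
END

SET ROWCOUNT 0;   -- reset the session
---------------------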

View 6 Replies View Related

Integration Services :: Importing Data From Word Document (Table Format)?

Aug 4, 2015

How do I import data from a Word document (in table format) into a SQL Server table using SSIS?

View 18 Replies View Related

Integration Services :: Insert Data Into Header And Detail Table From XML Through SSIS Package

Jun 2, 2015

I need to insert data into a Header and a Detail table. As shown in the XML below,

RecordID is an identity column, incremented by 1 after a new record is saved into the Header table. I need to assign the same RecordID to the Detail rows as well.

The expected output should be as shown below:

How can we accomplish this requirement?
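
A common T-SQL pattern for this (a sketch; the table and column names are assumptions) captures the header identity with SCOPE_IDENTITY() and reuses it for the detail rows; in an SSIS package the same idea is usually an Execute SQL Task that returns SCOPE_IDENTITY() into a package variable used by the detail insert:

---------------------
DECLARE @RecordID INT;

INSERT INTO dbo.Header (OrderDate, CustomerName)
VALUES ('2015-06-01', 'Acme');

SET @RecordID = SCOPE_IDENTITY();   -- identity value generated for the header row

INSERT INTO dbo.Detail (RecordID, ItemCode, Qty)
VALUES (@RecordID, 'A100', 5),
       (@RecordID, 'B200', 2);
---------------------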

View 8 Replies View Related

Integration Services :: Importing Text File Into Table - Random Data Order

Aug 3, 2015

I'm importing comma-delimited text files into a SQL table. The data imports in a seemingly random order. One time I import and the lines appear one way and the next time I import they import another way.

Is there a way to force the text files to import in the same order the data is found in the file?
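
Rows in a SQL table have no guaranteed order on their own; the usual fix (a sketch, with a hypothetical staging table) is an IDENTITY column that records arrival order, plus an ORDER BY when reading the data back:

---------------------
CREATE TABLE dbo.ImportStage (
    LineId INT IDENTITY(1,1) NOT NULL,   -- preserves the order in which rows were inserted
    Col1   VARCHAR(100) NULL,
    Col2   VARCHAR(100) NULL
);

-- The data flow maps only Col1/Col2; LineId is assigned automatically.

SELECT Col1, Col2
FROM dbo.ImportStage
ORDER BY LineId;   -- reproduces the file order on retrieval
---------------------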

View 10 Replies View Related

Integration Services :: SSIS Data Transfer Says Rows Copied But Count On Table Is Less?

Sep 18, 2015

I have an SSIS 2008 parent/child package solution to manage data transfers between two different data sources, so we can copy multiple tables and capture how many rows were transferred and the duration of each transfer. This solution was working fine up until last week, when I made some changes to allow the package to perform a source count using standard SQL determined by an expression, or SQL provided from configuration tables. I also changed the package to truncate (or not) the destination table, again controlled by configuration settings in a table. The child packages which perform the data flows have not changed!

The day after the controlling package was promoted to live, I saw the bizarre behaviour of the package log stating that all rows were transferred, but the actual table counts were not what the log stated (see attached file). The package solution works OK on other servers and was OK in DEV, but with fewer tables and rows transferred. Re-running the package gave the same errors, but on some of the same tables and some different ones.

As it is the child packages doing the transfers, and nothing has changed in them, I cannot see how the log can say all rows were transferred when not all of the rows were actually moved.

Process output - where you can see the counts and the log
Table transfer controller (as txt, not dtsx)

An example of the data transfer child packages (as txt, not dtsx)

When I set ExecuteOutOfProcess = True the package worked fine. Unfortunately, this is not a good solution, as SSIS 2008 does not tidy up the Dtshost.exe processes it starts, and I'd be left with a memory issue after a very short time; we transfer hundreds of tables each day. (I could write a .NET script in the controlling package to kill the child processes, but that would still leave hundreds of processes running before I could end them, as we have three parallel streams to allow a bit better performance.)

View 6 Replies View Related

Integration Services :: Dataflow Task Read CSV File And Insert Data To Table

Apr 29, 2015

I have a data flow task in a For Each Loop container at the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables into which I would like to push the data from each CSV file have the same names as the CSV files. On the data flow task, I have a Flat File component as a source; this component uses the above variable to read the data of a particular file.

Now here my question comes: how can I move the data to the destination SQL table using the same variable name? I have tried to set up the OLE DB Destination component dynamically, but it executes well only the first time. It does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimum way.

What is the best way to set up the data flow task for this requirement? Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.

View 10 Replies View Related

Integration Services :: Check Table For Existing Record Before Data Flow Task

Jun 1, 2015

Using SSIS 2012 (within Visual Studio) on Windows 7.

Before allowing my Data Flow task to fire, I'd like to check the target table (OLE DB Destination) for a specific date value in a specific field. I've seen how the Lookup Task is commonly used to check for dupes before inserting, but I'm not able to use that method because the data value I want to search the table for is contained in a Global Variable (let's say "MyVariableDate"). 

Is there any way to check for any records in a target table where Date1 = MyVariableDate (i.e. scanning the entire table for any occurrence of MyVariableDate in the Date1 field)?
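
This is normally handled with an Execute SQL Task placed ahead of the data flow. A sketch of the parameterized query (the table and column names are assumptions; ? is the OLE DB parameter marker mapped to User::MyVariableDate on the task's Parameter Mapping page):

---------------------
SELECT COUNT(*) AS MatchCount
FROM dbo.TargetTable
WHERE Date1 = ?;
---------------------

The single-row result is stored in a package variable, and a precedence constraint with an expression such as @[User::MatchCount] == 0 then decides whether the Data Flow task runs.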

View 12 Replies View Related

Integration Services :: Validating Data Loaded From Flat File Into Table Points

Oct 16, 2015

In my SSIS package I have flat files as a source, and I have to load a number of flat files into a SQL target table. I am using a For Each Loop container for that, and it works correctly. My aim is to validate the source data from all angles before writing it into the target SQL table. I am using the points below to validate the source data; if I find any bad data, I redirect those rows to an error output.

Checks:

1. Whether each column has the expected data type.
2. Whether the BusinessKey column is null.

Is there anything I am missing when validating the source data?

Screen shot for reference

View 10 Replies View Related

Integration Services :: Store Multiple Config In Single Table Using SSIS Configurations?

Oct 20, 2015

I want to store multiple configurations in a single table, but I am unable to do it.
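
Multiple SQL Server configurations can share one table as long as each uses its own ConfigurationFilter value. A sketch of the table the configuration wizard creates (column sizes are approximate and should be treated as assumptions):

---------------------
CREATE TABLE dbo.[SSIS Configurations] (
    ConfigurationFilter NVARCHAR(255) NOT NULL,   -- distinguishes each configuration set
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- Each package configuration points at this same table but supplies a different
-- filter, e.g. ConfigurationFilter = 'PackageA_Conn' vs 'PackageB_Conn'.
---------------------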

View 8 Replies View Related

Integration Services :: Truncate Or Delete Access Table Using SSIS?

Sep 4, 2015

I have to truncate an Access table before I insert records into the Access database table.

I tried using DELETE FROM Table_Name and TRUNCATE TABLE Table_Name through the Microsoft Jet 4.0 OLE DB Provider, but it does not seem to work. I read some posts on forums; none of the suggestions worked for truncating or deleting records from the Access database table.

Is there an easy way to empty the Access table without using a Script Component and VB scripts? I do not know how to write VB scripts, so I am trying to find alternatives.

[URL]
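
Jet/Access SQL does not support TRUNCATE TABLE, which is why that attempt fails; a plain DELETE issued from an Execute SQL Task over the same Jet 4.0 OLE DB connection normally works (a sketch; the table name is an assumption):

---------------------
-- SQLStatement of an Execute SQL Task whose connection manager uses
-- Provider=Microsoft.Jet.OLEDB.4.0 pointing at the .mdb file.
DELETE FROM MyAccessTable;
---------------------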

View 5 Replies View Related

Integration Services :: Loading Data From Multiple Excel Sheets To Server 2014 Table

Aug 5, 2015

I have one Excel workbook containing 50 sheets with different names. Is it possible to load all the sheets into SQL Server using SSIS?

View 2 Replies View Related

Integration Services :: Create SSIS Package Dynamically For Inserting Data From Flat File To Table?

Sep 30, 2015

I have a requirement to develop a dynamic package for inserting data from a flat file into a table.

See the points below for more clarification:

1) If I change the flat file name and values in the source variable, the table name should also change based on the variable value.

2) The columns should be mapped dynamically against the source file, as we have to insert the data into the target table.

See the diagram below for more clarification.

View 10 Replies View Related

Integration Services :: Handling Empty Text File Load Into Table Through SSIS Data Flow?

Jun 16, 2015

We have created an SSIS package to load a text file into a table. The source system shares 10 text files, and recently it stopped generating data for one of the text files (it comes through empty); after a few months it will start generating data for that file again.

The issue here is that the Data Flow task fails while loading the empty text file into the table. How can this empty-file load be handled in the SSIS package?

View 3 Replies View Related

Sampling Data Set Via Integration Services Data Flow For Data Mining Models Without Saving Training And Test Data Set?

Nov 24, 2006

Hi, all here,

Thank you very much for your kind attention.

I am wondering whether it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as that would occupy too much space. I really need guidance on that.

Thank you very much in advance for any help.

With best regards,

Yours sincerely,

View 5 Replies View Related

SQL 2012 :: Count Records From Each Separate Data Table Corresponding To Feed

Nov 5, 2014

I have 1 table that is just a list of feeds: A, B, C, D, etc. (15 rows in total), and each feed has other information attached to it such as full name, location, etc. I am only interested in the Feed column.

Each feed then has a corresponding data table which contains a list of records. E.g. Feed A's data is contained in TableA, Feed B's data is contained in TableB, and so on.

Basically, what I need is a combined table that shows the list of feeds in the first column (so A, B, C, D, ...) and then a second column which counts the records from each separate data table corresponding to that feed.

So the end result would look something like this:

Feed------No of Records
A----------4 (from TableA)
B----------7 (from TableB)
C----------8 (from TableC)
D----------1 (from TableD)

Possible?
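
A straightforward version (a sketch, assuming the fixed per-feed table names) is a UNION ALL of counts; with 15 feeds, a dynamic SQL variant built from the feed list table is also an option:

---------------------
SELECT 'A' AS Feed, COUNT(*) AS NoOfRecords FROM dbo.TableA
UNION ALL
SELECT 'B', COUNT(*) FROM dbo.TableB
UNION ALL
SELECT 'C', COUNT(*) FROM dbo.TableC
UNION ALL
SELECT 'D', COUNT(*) FROM dbo.TableD;
-- ...one branch per feed table
---------------------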

View 2 Replies View Related

Transact SQL :: Querying Data From Master Table Into Transaction Table?

Oct 13, 2015

I am stuck in the following scenario.

Tbl_Loan (Trans_ID, Emp_ID, Guarantor_1_ID, Guarantor_2_ID)
Sample data: TRN_01, EMP_01, ...

[code]....

View 5 Replies View Related

SQL Server 2008 :: Store Long Values To Be Used In (IN) Statement In Separate Table?

Jul 14, 2015

I have several reports that are looking for a code within a certain set of codes or ranges. The specific list of codes to be included is determined by the end user. Currently my "IN" statement can be a hundred lines long, listing several ranges, lists of specific codes, etc. I am constantly getting asked what codes it includes, whether this code is included, etc. Sometimes they'll give me a printed 10-page list of codes and want me to compare it to what I have included in the report. Not ideal in the slightest.

What I'd like to do is have a table or a file of some kind somewhere where the end user can view the codes contained, add new ones, and delete ones they no longer want. Then I'd like to be able to just reference that table in my IN statement, leaving the responsibility for listing the correct codes with them.
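
A user-maintained code table turns the hundred-line IN clause into a one-liner (a sketch; dbo.ReportCodes, its columns, and the claims table are assumptions):

---------------------
CREATE TABLE dbo.ReportCodes (
    ReportName VARCHAR(50) NOT NULL,   -- lets one table serve several reports
    Code       VARCHAR(20) NOT NULL,
    PRIMARY KEY (ReportName, Code)
);

-- The report's WHERE clause then becomes:
SELECT c.*
FROM dbo.Claims AS c
WHERE c.DiagnosisCode IN (SELECT rc.Code
                          FROM dbo.ReportCodes AS rc
                          WHERE rc.ReportName = 'MonthlyClaims');
---------------------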

View 9 Replies View Related

Produce Data -> Store In Table?

Sep 4, 2007

Hi!

I'm quite new to T-SQL and am about to build a small reporting DB using SQL. Most of the data I can move with a normal INSERT INTO ... SELECT, but there are some tables that I want to produce using T-SQL. For example, I want to build a Date table like:

Date
Year
Quarter
Month
WeekDay
...

With some precalculated values for each date.

I've searched the forum but have not found any information on how to produce table contents in a good manner. I would appreciate it if someone had the time to point me in the right direction. I do not want to do this with code in my application.

My first thought is to use some kind of Insert Cursor in a While loop...

Pseudo:
---------------------
declare cursor ex for 'Insert table ... '

while
begin

(produce data)
insert data
end

close cursor
---------------------

While browsing the net I've got the feeling that cursors are used less in SQL Server than in other DB engines...


Have a nice day!
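
That instinct about cursors is right: a date table is normally produced set-based, not row by row. A sketch (assuming SQL Server 2005 or later; the date range and target table name are assumptions), using the column list from the question:

---------------------
DECLARE @Start DATETIME, @End DATETIME;
SET @Start = '20000101';
SET @End   = '20201231';

WITH Dates AS (
    SELECT @Start AS [Date]
    UNION ALL
    SELECT DATEADD(DAY, 1, [Date])
    FROM Dates
    WHERE [Date] < @End
)
INSERT INTO dbo.DimDate ([Date], [Year], [Quarter], [Month], [WeekDay])
SELECT [Date],
       YEAR([Date]),
       DATEPART(QUARTER, [Date]),
       MONTH([Date]),
       DATENAME(WEEKDAY, [Date])   -- e.g. 'Monday'
FROM Dates
OPTION (MAXRECURSION 0);           -- lift the default 100-level recursion cap
---------------------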

View 3 Replies View Related

Solution To Store Million Of Data In A Table.

Jun 27, 2007

Dear all,

I need to design a database table which will store suppliers' demand information. One supplier will probably have 10,000 records, and there is a possibility that there are 10,000 suppliers. So, in total, the number of records will be 10,000 * 10,000 = 100 million, which is a very large number of records to insert into a table. So, how can I design a table and structure to cater for this scenario? Thanks.

Hope to hear from you..
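
100 million narrow rows is well within SQL Server's range, provided the clustered key matches the main access path. A sketch (names and types are assumptions); on SQL Server 2005+ Enterprise the table can additionally be partitioned, for example by date:

---------------------
CREATE TABLE dbo.SupplierDemand (
    SupplierId INT      NOT NULL,
    DemandDate DATETIME NOT NULL,
    ItemId     INT      NOT NULL,
    Quantity   INT      NOT NULL,
    CONSTRAINT PK_SupplierDemand
        PRIMARY KEY CLUSTERED (SupplierId, DemandDate, ItemId)
);

-- Queries that always filter by supplier then seek straight into one range
-- of the clustered index instead of scanning all 100 million rows.
---------------------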

View 10 Replies View Related

Transact SQL :: Format To Store Data In Table

May 10, 2015

I would like to know what is the preferred format for form submissions to a SQL table. 

View 5 Replies View Related

Reporting Services :: Table Data Types For Data Driven Subscriptions

Jun 11, 2015

I am trying to find a reference for a client that lists the fields available to be substituted into a data driven subscription from the query, along with the expected data types.  For example, the field on whether or not to include a link to the report seems to be expecting a bit data type.I have searched and can't seem to find anything.  I guess I could walk through the interface and try different data types, but if  a list exists, that would be better. 

View 4 Replies View Related

SQL Delete Table Data

Aug 8, 2000

How can I erase all data from every table in a SQL Server 7.0 database and leave all constraints and relationships intact? I want to have just the structure or framework with no data in any table. There are over 130 tables, so I need to automate this. Any suggestions?
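
One widely used approach (a sketch; sp_MSforeachtable is an undocumented but long-standing system procedure, so treat it accordingly) disables every foreign key, deletes, then re-enables and re-checks the constraints:

---------------------
-- Disable all foreign key constraints so delete order does not matter.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

-- Delete every row from every user table (DELETE rather than TRUNCATE,
-- because TRUNCATE is not allowed on tables referenced by foreign keys).
EXEC sp_MSforeachtable 'DELETE FROM ?';

-- Re-enable and re-validate the constraints.
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';
---------------------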

View 3 Replies View Related

Data Type To Store A Version Number In A Table

May 8, 2012

I have a C# app linked to a SQL db, and I need to store its version number in a table (it could be something like 1.2.789), but I cannot find any data type which allows me to do this.

I could create three fields in the table for each number but I don't want to.
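
There is no built-in version data type. A common compromise (a sketch) stores the string once and adds computed columns so the parts can still be compared and sorted numerically:

---------------------
CREATE TABLE dbo.AppVersion (
    VersionString VARCHAR(32) NOT NULL,   -- e.g. '1.2.789'
    Major AS CAST(PARSENAME(VersionString, 3) AS INT) PERSISTED,
    Minor AS CAST(PARSENAME(VersionString, 2) AS INT) PERSISTED,
    Build AS CAST(PARSENAME(VersionString, 1) AS INT) PERSISTED
);

INSERT INTO dbo.AppVersion (VersionString) VALUES ('1.2.789');
-- PARSENAME splits on the dots, so this works for up to four components.
---------------------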

View 3 Replies View Related

Build Hierarchy - Store Data Link In Table

Sep 3, 2013

I have one table, Emp_Master, with two columns: ID and Sup_Id. I need to create a table where I can store data like this:

Id-Hreid

The only difference is that I need to store the data as a flattened hierarchy view, so if Emp1 reports to Emp2 and Emp2 reports to Emp3, then in the new table I should get:

Emp3-Emp2
Emp3-Emp1
Emp2-Emp1
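
A recursive CTE can flatten Emp_Master into every ancestor/report pair (a sketch; the target table name is an assumption):

---------------------
WITH Pairs AS (
    -- Direct manager/report pairs.
    SELECT e.Sup_Id AS AncestorId, e.ID AS ReportId
    FROM dbo.Emp_Master AS e
    WHERE e.Sup_Id IS NOT NULL

    UNION ALL

    -- Walk upward: each report also rolls up to its manager's managers.
    SELECT m.Sup_Id, p.ReportId
    FROM Pairs AS p
    JOIN dbo.Emp_Master AS m ON m.ID = p.AncestorId
    WHERE m.Sup_Id IS NOT NULL
)
INSERT INTO dbo.Emp_Hierarchy (AncestorId, ReportId)
SELECT AncestorId, ReportId
FROM Pairs;
-- For Emp1 -> Emp2 -> Emp3 this produces (Emp2, Emp1), (Emp3, Emp2), (Emp3, Emp1).
---------------------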

View 2 Replies View Related

Master Data Services :: How To Transfer Data From Table To MDS

Nov 26, 2015

I am a newbie to SQL Server MDS. I want to know how we can transfer tables from two different SQL databases into MDS. What steps should I follow, ideally with examples?

View 2 Replies View Related

Delete All Data From An Sql Table Using Asp.net 2.0 Vb.net 2005

Nov 6, 2007

Hi, how do I empty a table (delete all the data in it) using SQL commands in ASP.NET 2.0 and VB.NET 2005? Please help. Thanks.
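
The SQL itself is a one-liner sent through the page's existing command/connection (a sketch; the table name is an assumption):

---------------------
TRUNCATE TABLE dbo.MyTable;   -- fastest, but not allowed if foreign keys reference the table
-- or
DELETE FROM dbo.MyTable;      -- fully logged and slower, but works even with foreign keys
---------------------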

View 3 Replies View Related






