Integration Services Extraction/Loading Throughput/Performance

Apr 28, 2006

I'm new to Integration Services.
I want to create a centralized reporting system for our customers. Some customers have up to 1,000 sites, and some are expected to grow past 5,000 sites. The sites run POS applications, and I want to extract the POS sales data from them. Is it practical to expect that SSIS can handle extracting data from this many sites and loading it into a central SQL database? The POS sales data at the sites is stored in SQL Express databases, but the data is also available in XML format.
If it's practical for Integration Services to do this, how frequently could this data be pulled?
I realize that the amount of data is relative, but I'm just wondering if anyone is attempting to do this with Integration Services.
If not with Integration Services, then what method(s) are available and used to extract data from this many remote sites?
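For what it's worth, the pattern usually suggested for fan-in at this scale is a control table of site endpoints driving a Foreach Loop, with the data flow's connection string set per iteration by an expression. A minimal sketch of such a control table, with purely illustrative names:

-- one row per remote site; the loop walks this table and repoints
-- a single data flow at each site in turn
CREATE TABLE dbo.SiteEndpoints (
    SiteId           int           NOT NULL PRIMARY KEY,
    ConnectionString nvarchar(512) NOT NULL,
    LastExtractedAt  datetime      NULL   -- per-site high-water mark
);

Incremental pulls keyed on LastExtractedAt keep each visit small, which tends to matter more than raw engine speed once you are past 1,000 sites.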

View 3 Replies



Integration Services :: Data Extraction From Hyperion Financial Management Using SSIS

Jul 21, 2015

I want to build a data import process with SSIS, sourcing Hyperion Financial Management. As far as I know, Star Integration Server (from Star Analytics, acquired by IBM in February 2013) used to do this extraction job and could be used in SSIS.

As this product is no longer available, how can this be done?

View 3 Replies View Related

Finding SQL Server Performance Throughput Load

Aug 31, 2007

Hello guys,

I am using SQL Server 2005 Enterprise Edition in a clustered environment with NetApp storage (one server). I monitor the server performance, basically using Windows Performance Monitor counters such as avg. disk queue length, avg. disk reads/sec, avg. disk writes/sec, % processor time, available MBytes, cache hit ratio, etc., and using some SQL Server DMVs.

At this point all looks good.

My problem is that I don't know how much more load SQL Server can handle. How much is it being utilized now? Has it reached its limits? How do I know when it is just about to reach its limits?

How much throughput is SQL Server handling at present, and how much more can it handle if incoming traffic increases?

Basically, I want to establish performance baselines and find out how much more it can handle, so that I can plan for the future.

How do I measure all this?
What different methods are available if I want to handle increased incoming traffic (e.g., adding multiple servers, etc.)?
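One hedged way to put a number on current throughput is to sample the cumulative Batch Requests/sec counter in sys.dm_os_performance_counters twice and divide by the interval (works on SQL Server 2005 and later):

-- sample the counter, wait ten seconds, sample again
DECLARE @v1 bigint, @v2 bigint;
SELECT @v1 = cntr_value FROM sys.dm_os_performance_counters
WHERE counter_name = 'Batch Requests/sec';
WAITFOR DELAY '00:00:10';
SELECT @v2 = cntr_value FROM sys.dm_os_performance_counters
WHERE counter_name = 'Batch Requests/sec';
SELECT (@v2 - @v1) / 10 AS batch_requests_per_sec;

Tracking that figure over time alongside the Perfmon counters already being collected gives a baseline to compare against peak load.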
It would be really great if someone could share their experience on this.

Is there any article/white paper on this?

Any suggestions/help appreciated.

Thanks

View 1 Replies View Related

Integration Services :: Incremental Loading Data In SSIS

Aug 30, 2015

I am looking to load data incrementally from staging to the Spectrum database.

Master = staging table
Detail = Spectrum table

The logic is below:

If the record is not found in Detail (the Spectrum table),
    then insert the record into the Spectrum table
    and set status_flag to 'A' for active;
else update the record (replace all old values with new values)
    and set status_flag to 'A' for active.

If the record is no longer present in Master (the staging table),
    then do a soft delete:
    set status_flag to 'D' for deleted.
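A minimal T-SQL sketch of this logic using MERGE (available from SQL Server 2008); the table, key, and column names are assumptions based on the post:

MERGE dbo.Spectrum AS d                      -- Detail
USING staging.Master AS m                    -- Master
    ON d.business_key = m.business_key
WHEN MATCHED THEN                            -- replace old values, keep active
    UPDATE SET d.col1 = m.col1, d.status_flag = 'A'
WHEN NOT MATCHED BY TARGET THEN              -- record missing from Spectrum
    INSERT (business_key, col1, status_flag)
    VALUES (m.business_key, m.col1, 'A')
WHEN NOT MATCHED BY SOURCE THEN              -- gone from staging: soft delete
    UPDATE SET d.status_flag = 'D';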

View 2 Replies View Related

Integration Services :: Loading PGP File Into Table Using SSIS

Oct 13, 2015

I have to load a zipped folder which is PGP-encrypted into a SQL table. How do I decrypt it, unzip it, and load it into the SQL table using SSIS?

View 4 Replies View Related

Integration Services :: Data Folder Not Loading With New Project

Jun 30, 2015

When creating a new Integration Services project, the Data folder for creating a new data source does not load.

View 5 Replies View Related

Integration Services :: For-each Loop Container For Loading Excel Sheet

Aug 10, 2015

I have used a Foreach Loop container to load an Excel workbook that contains multiple sheets with the same structure. It loads data into the SQL table even when there is no data in a sheet.

View 3 Replies View Related

Integration Services :: Loading Data To Destination With Foreign Key Relationship

Apr 22, 2015

I have to load data into a destination table that has foreign key relationships to two different tables, a person table and an organization table. Sample data to be loaded looks like:

person_id   organization_id
1           NULL
2           NULL
NULL        1
NULL        NULL

The person and organization tables don't have NULL values in them. When I try to load this data, none of the rows are loaded; I know that either person_id or organization_id having a NULL value is failing the foreign key constraint. But I want to transfer all the rows except the ones where both are NULL. How can this be achieved?
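In the data flow, the same test can be written as a Conditional Split expression, for example !(ISNULL(person_id) && ISNULL(organization_id)). The set-based equivalent, with the table names assumed, is:

-- load every row except those where both keys are NULL
INSERT INTO dbo.Destination (person_id, organization_id)
SELECT person_id, organization_id
FROM staging.Source
WHERE NOT (person_id IS NULL AND organization_id IS NULL);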

View 7 Replies View Related

Integration Services :: How To Download Files From Web Page Before Loading Into Server

Oct 26, 2015

How do I download files from a web page before loading them into SQL Server tables? I have the following URL, and under the Downloads & Resources section I have different file formats.

Hovering over the download tab for each file type, I see that there is a link associated with it, just like the following:

For CSV - [URL] ....
For XML - [URL] ....

The above is just an example for your reference/understanding. For the sample data from the internal website I have, I need to do a similar operation. The only difference is that I would have multiple XLS files, each with a description.

Example:
Sales Q1 - <xls download tab>
Sales Q2 - <xls download tab>
Sales Q3 - <xls download tab>
Sales Q4 - <xls download tab>

<li>
<sub>Sales for Calendar Year 2015--All Countries </sub>
<a href="/Data/Downloads/Documents/Sales/Sales_Quarter1.xlsx">
<sub>[XLS]</sub></a><sub> , <a href="/Data/Downloads/Documents/Sales/Sales_Quarter1.pdf"><sub>[PDF]</sub></a><sub>​</sub></sub>
</li>

I need to download the file based on the month/quarter every time.

View 7 Replies View Related

Integration Services :: Date Timestamp Not Loading Correctly Into CSV File As Destination

Nov 5, 2015

I have a simple package to load data from a SQL Server DB into a flat file. I have a date field in the source database (data type DATETIME); when I open the CSV file, some records show the exact timestamp and some show just fractions of a second, like 00:00:0.7. I used CAST and CONVERT but still have the same issue.

AppliedDate
00:00.6
00:00.6
10/2/2015 0:00
10/2/2015 0:00
00:00.3
00:00.3
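One hedged workaround is to format the datetime as text in the source query itself, so the flat file always carries the same literal no matter how a viewer such as Excel chooses to render the column (style 120 yields yyyy-MM-dd hh:mm:ss; the table name is an assumption):

SELECT CONVERT(varchar(19), AppliedDate, 120) AS AppliedDate
FROM dbo.SourceTable;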

View 9 Replies View Related

Integration Services :: Decrypt XML Node Failed On Loading Master Package

Dec 24, 2013

We run 2012 Enterprise. When I open my project on a different machine than the one I used to create the project, I get the following warnings. I'm concerned about 1) checking in source from different machines, and 2) what is going to happen when we run this in production. All of the project params are sensitive=false and required=true. The master package stageprototype.dtproj has no package params and no configs.

The project's protection level is EncryptSensitiveWithUserKey, but as far as I know there is nothing sensitive in this collection of master and sub packages. I'm concerned that if I change this to DontSaveSensitive, I'll be looking for a needle in a haystack, specifically the thing or things SSIS thinks are sensitive right now.

Warning 1 Warning loading StagePrototype.dtproj: Warning: Failed to decrypt an encrypted XML node. Verify that the project was created by the same user. Project load will attempt to continue without the encrypted information.

 StagePrototype.dtproj 0 0
 
Warning 2 Warning loading StagePrototype.dtproj: Warning: Failed to decrypt sensitive data in project with a user key. You may not be the user who encrypted this project, or you are not using the same machine that was used to save the project. If the sensitive data is a parameter value, the value may be required to run the package on the Integration Services server.  

StagePrototype.dtproj 0 0 

View 2 Replies View Related

Integration Services :: Loading Multiple Flat Files Into Different Tables Using SSIS?

Oct 25, 2015

I have been tasked to do the following using SSIS.

We receive two CSV files each week, and we would like to load these files into two different SQL Server tables using SSIS.

These files should be archived into a folder after each load.  

How can I achieve this?

View 6 Replies View Related

Integration Services :: Loading Datetime Field Give Wrong Dates In SSIS

May 30, 2015

I am using SQL Server 2012. I have a table which has a field of type DATETIME (it is a table in Dynamics CRM 2011, so I have no control over the data type). Say this field is called BisStartDate. If I run this query in Management Studio:

select
BisStartDate, BisStartDateutc
from myTable
where _bisnumber=10375

I will get:

BisStartDate                                             BisStartDateutc               

2014-07-29 00:00:00.000                      2014-07-29 05:00:00.000

*In CRM, a datetime is saved in two fields: one is the local time, the other is the UTC time.

You can see the offset  between the datetime and utc is 5 hours.

However, when the same statement runs inside an SSIS package on the server, the result returned is:

BisStartDate                                             BisStartDateutc               

2014-07-28 23:00:00.0000000            2014-07-29 05:00:00.000

And if I do

datediff(MINUTE,CRMAF_BisSection.ttc_section_startdatetimeutc,CRMAF_BisSection.ttc_section_startdatetime)

I will get -5 if I run it in Management Studio and -6 when it runs in the server package (running inside Visual Studio gives -5, the same as running the query in Management Studio).

I think that when the record was saved, the date was 5 hours offset from UTC, but now the system uses the current UTC offset, which is 6 hours. I just want to use BisStartDate as it is. How do I make SSIS turn off the conversion?

The same datetime is saved in another system, and we compare the two to check the data entry. Now, because of this one-hour difference, sometimes the day will be different.
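A hedged workaround, if the stored value should pass through untouched, is to read the datetime as text so that no provider or time-zone conversion can be applied in transit (column and table names are from the post; style 121 keeps the milliseconds):

SELECT CONVERT(varchar(23), BisStartDate, 121)    AS BisStartDate,
       CONVERT(varchar(23), BisStartDateutc, 121) AS BisStartDateutc
FROM myTable
WHERE _bisnumber = 10375;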

View 8 Replies View Related

Integration Services :: Loading Flat Files Without Duplicate Rows Into Destination Server

Sep 25, 2015

I have some duplicate records in my flat file, but I don't want to load those duplicate rows into my destination.
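In the data flow, a Sort transformation with "Remove rows with duplicate sort values" checked is the usual answer. A hedged set-based alternative is to land the file in a staging table first and keep one row per key with ROW_NUMBER; the table and column names here are assumptions:

WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY BusinessKey
                              ORDER BY BusinessKey) AS rn
    FROM staging.FlatFileRows
)
INSERT INTO dbo.Destination (BusinessKey, Col1, Col2)
SELECT BusinessKey, Col1, Col2
FROM ranked
WHERE rn = 1;   -- one arbitrary survivor per duplicate group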

View 2 Replies View Related

Integration Services :: Loading XLSB File Using Excel Source Component In SSIS

Aug 5, 2015

How do I load an .xlsb file using the Excel Source component in SSIS? Below is the connection string I see in the Properties window.

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=;Extended Properties="Excel 8.0;HDR=YES";

Do I need to change any values here to process an .xlsb file?
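The Jet 4.0 provider only reads the legacy .xls format, so it is expected to fail on .xlsb. The ACE provider (installed with the Access Database Engine redistributable) handles the binary workbook format; a hedged example of the adjusted string, with the path illustrative:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\MyFile.xlsb;Extended Properties="Excel 12.0;HDR=YES";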

View 2 Replies View Related

Integration Services :: Loading Multiple XML File With Different Metadata In Server Tables With SSIS 2008

Feb 17, 2011

I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.

I need to loop through each file in SSIS with a Foreach Loop container, get the file name (say abc1), and load the data of abc1.xml into the abc1 table in the SQL Server DB.

The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.

On each iteration, the XML Source should also point to the corresponding XSD file.

The tables are already created in the DB.

I have solved my problem up to getting the file name from each iteration and assigning it to a variable; in the OLE DB destination, for the data access mode I select "Table name or view name variable", and the corresponding table gets selected for data insertion.

I just want to know how I can point to the matching XSD file for each XML data file during iteration.
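The XML Source exposes its XMLSchemaDefinition property through the Data Flow task's Expressions collection, so the XSD path can be derived from the file-name variable the loop already populates. A hedged sketch, with the variable name assumed:

Property:   [XML Source].[XMLSchemaDefinition]
Expression: "C:\\XMLData\\" + @[User::FileName] + ".xsd"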

View 12 Replies View Related

Integration Services :: Loading Data From Multiple Excel Sheets To Server 2014 Table

Aug 5, 2015

I have one Excel workbook that contains 50 sheets with different names. Is it possible to load all the sheets into SQL Server using SSIS?
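One route is a Foreach Loop with the ADO.NET Schema Rowset enumerator to list the sheet names, each sheet being addressed as [SheetName$] in the Excel Source. As a hedged illustration of that addressing, a single sheet can also be read from T-SQL via OPENROWSET (requires the ACE provider and 'Ad Hoc Distributed Queries' enabled; file and sheet names are assumptions):

SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\Workbook.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');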

View 2 Replies View Related

Integration Services :: Loading Tables Created In Previous Sequence Into Local Archive File - SSIS Path Error

Oct 5, 2015

I've been working on an SSIS package trying to load some data and the archive sequence is faulty. I've been trying to load a few tables created in a previous sequence into a local archive file and I've been getting the error "Could not find a part of the path."

The results aren't telling me what it found last, so I don't know where to start.

And the source DOES have data in it. It's something between the source and the destination.

View 2 Replies View Related

Integration Services :: SSIS VB Script Loading Data Into Oracle DB Missing Some Data

Nov 10, 2015

I'm using a Script Component to load data into an Oracle DB because of a poor performance issue. Now I have found that some data goes missing during the transfer. (The original post compared screenshots of the SQL Server and Oracle row counts, which are not preserved here.)

DDL:

create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);

Result: (screenshot not preserved)

I followed this article: [URL] ....

VB Script: 
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

[Code] ..........

View 8 Replies View Related

Integration Services :: Data Flow Task Failed After Loading 29000 Rows Out Of 234567 Rows

Oct 13, 2015

I am facing an issue where a Data Flow task fails after loading 29,000 rows out of 234,567 rows.

I am loading data from a .csv file to an OLE DB destination.

This Data Flow task is placed inside a Foreach Loop container.

Is this issue caused by a performance setting in the SSIS package, such as buffer size?

The error is below:

DFT Load Data from FlatFile:Error: The conditional operation failed.
DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. 

The "DER Add Calc Columns" failed because error code 0xC0049063 occurred, and the error row disposition on "DER Add Calc Columns.Outputs[Derived Column Output].Columns[M_VALUE_NUM]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

DFT Load Data from FlatFile:Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "DER Add Calc Columns" (48) failed with error code 0xC0209029 while processing input "Derived Column Input" (49). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[code]....

View 8 Replies View Related

Integration Services :: Could Not Open Global Shared Memory To Communicate With Performance DLL

Nov 17, 2009

I am getting the following warning for my SSIS 2008 package: "Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console." I did look into this warning in SSIS 2008 but didn't find any solution. The package processes data and executes fine, so why do I see this warning? When I run this package on my machine I see no such warning; it's only when I deploy it to our DEV SSIS server that I get it.

View 7 Replies View Related

Integration Services :: Parameterized Bulk Copy - Disabling Indexes For Performance Causes Timeout Exception

Sep 23, 2015

My requirement is to sling a rowset from one place in SQL server into a table in another place in the most performant way. I want this to be parameterizable -  I want to provide just a connection string and some SQL for the source and a connection string and a table name for the destination.  The package should do the rest. 

The solution I chose was a 2014 SSIS package with source and destination as ADO.NET connections configured from project variables. The package has a script task to bulk copy the data. For performance, I disable the non-clustered indexes first.

But this performance precaution causes the bulk copy to time out after delivering the correct row count to the destination table. What can I do to avoid this error?

Here's my script code:

// get hold of the source connection and build the command for the reader
SqlConnection sqlconnSource =
    (SqlConnection)Dts.Connections["source"].AcquireConnection(Dts.Transaction);
SqlCommand sourcesqlCommand = new SqlCommand(SourceSQL, sqlconnSource);
// timeout for the source query only, in seconds
sourcesqlCommand.CommandTimeout = 1500;
// note: the SqlBulkCopy created later (elided below) has its own
// BulkCopyTimeout property, which defaults to 30 seconds and is the
// likely cause of a timeout thrown after all rows have been delivered

[Code] ....

This takes 128 seconds to put 13 million thin rows into my empty destination table and then throws an exception with this message:

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

View 5 Replies View Related

Integration Services :: Rebuild Index / Refresh Index And Stats Improves Ssis Package Performance

Oct 28, 2015

My SSIS package is running very slowly and taking a long time to execute; one task takes 2 hours to insert 100k records, and even though I have disabled unused indexes it still takes this long. I rebuild/refresh indexes and stats once a month; if I do this on a daily basis, will it improve my SSIS package performance?
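Rebuild frequency aside, what usually helps a slow bulk insert is taking the nonclustered indexes offline for the duration of the load itself, instead of having the engine maintain them row by row. A hedged sketch with placeholder object names (never disable the clustered index, since that takes the whole table offline):

-- before the load: stop per-row index maintenance
ALTER INDEX IX_Target_SomeColumn ON dbo.TargetTable DISABLE;
-- ... run the SSIS data flow here ...
-- after the load: rebuild re-enables the index in one pass
ALTER INDEX IX_Target_SomeColumn ON dbo.TargetTable REBUILD;
UPDATE STATISTICS dbo.TargetTable WITH FULLSCAN;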

View 2 Replies View Related

Performance Problem Loading Packages

Jul 17, 2006



We're having a performance problem with a package since an error occurred. The original error came from the Job Manager, which was unable to start a thread. Our understanding is that this is a memory-related issue, and we'll deal with that.

What is realy odd is what happened after that. The package executes from a SQL Agent job that includes 3 other packages. Each package is stored on the file system. Package execution time for the affected package changed from less than a minute to over 5 minutes. The other packages continued to execute normally.

In checking the logs, there is a large time gap between the start of the SQL Agent step and the first pre-validation message that accounts for 4 minutes, as if there is some issue in loading the package. The issue is ongoing. Does anyone know what happens between the start of the SQL Agent job and the first pre-validate message? Is this some type of caching issue?

SQL Agent step starts at 4:05:27
First Prevalidate message: 4:09:24
Package Execution start: 4:09:57
End Execution 4:10:59

thanks

Peter

View 1 Replies View Related

Slow Performance While Loading The Reports..

Mar 27, 2008



Hi,
When I run my reports in Report Manager or Report Server, it takes 2 minutes for them to pop up. Can someone tell me why this happens?

Any help will be appreciated.
Regards,
Karen

View 11 Replies View Related

How To Optimize Integration Packages Or Best Practices For Integration Services

Sep 11, 2007

Hello friends.
I managed to design an Integration Services package, but the desired level of performance has not been achieved (i.e., it is performing slowly).
So I want to know the best practices for an optimized solution.
In my package I'm extracting data from an XML file and storing it in a SQL Server database, with some processing during the data flow.

I'm using:
1) Two Script Task controls - in these controls, I open the connection to the XML file through VB.NET code and iterate one record at a time.
2) Two OLE DB Commands - each record fetched by the Script Task component is processed by an OLE DB Command through a stored procedure and then inserted into the database.
3) One For Loop - this loop contains the two Script Task controls and two OLE DB Command controls (mentioned above), fetching a single record and inserting it into the database.
4) One Derived Column
5) One Multicast
6) One Character Map
7) One OLE DB Source

With my current performance I'm able to insert one record every 0.5 seconds (which is well below acceptable limits).
Does a control lying disabled on the SSIS designer pane also affect execution performance?

View 4 Replies View Related

Backup Slow Throughput

Nov 16, 2006

Hi Team,

I have a SQL 2000 instance with 46 databases (all databases put together are about 15 GB in size). I am running SQL backups using third-party software.

My full backup of the SQL instance, which backs up 15 GB of data, finishes within 30-45 minutes. But my differential backup of the same instance, which backs up only 150-250 MB of data, takes 12 hours.

I found a Knowledge Base article on the MS Support site which says a differential backup can take more time than a full backup in a few scenarios.

http://support.microsoft.com/default.aspx?scid=kb;en-us;196658

But the above document is for SQL 7.0. Will it be the same for SQL 2000 and 2005 too? If yes, can you please tell me whether I can increase the speed of the differential backup in my environment? Should I modify any SQL parameters?

Please let me know your thoughts on this..

Thanks
Santhosh

View 4 Replies View Related

Capturing Component Throughput

Dec 19, 2007

[Microsoft follow-up]

I've just been reading this thread by a guy asking about capturing the throughput of a dataflow. I suggested that there is no real notion of capturing throughput of a dataflow but I believe there IS a notion of capturing the throughput of a component or an execution tree.

I believe all the information one would need to capture the throughput of a component (apart from the name of the component, that is) is available in the OnPipelineRowsSent event. If there were an OnPipelineRowsSent eventhandler, and the OnPipelineRowsSent event contained the name of the component, then I think we would be able to capture the throughput of a component. So, some questions:


Why is there no eventhandler for the OnPipelineRowsSent event?

Can the name of the component be added to the information in the OnPipelineRowsSent event?

Following on from this... I once had a conversation here with Kirk Haselden about capturing pipeline throughput. He thought it was a good idea and suggested I raise a DCR for it, which I did, but that was in the old pre-Connect days and it seems as though that DCR (like SO many other things) didn't make it across to Connect. So, some more questions:

Can you find any Connect DCRs relating to capturing throughput? I've found this: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=152162 that I raised 18 months ago but which hasn't even had a single comment from anyone at Microsoft.

Do you think that capturing throughput would be useful? I can foresee huge advantages by capturing this in the debugger. (Note that Informatica does this and has done for years. It has a very nice GUI that shows the throughput of each destination in the mapplet.)
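In the meantime, enabling the OnPipelineRowsSent event with the SQL Server log provider at least puts the raw material somewhere queryable: the sysdtslog90 table in SSIS 2005. A hedged query; the component path and row count are embedded in the message column and have to be parsed out:

-- list OnPipelineRowsSent entries in execution order; pairing the
-- timestamps and row counts per component yields a rough rows/sec figure
SELECT source, starttime, message
FROM dbo.sysdtslog90
WHERE event = 'OnPipelineRowsSent'
ORDER BY starttime;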
I'd welcome any thoughts around this. It's a big ask, and it fits in very nicely with my constant, nay INCESSANT, requests for debugging enhancements, so maybe this is one for Darvey to have a read of???

Thanks
Jamie

View 16 Replies View Related

At What Throughput Should I Leave A Connection Open ?

Feb 3, 2005

I am used to writing applications that hit the database "every so often" and am happy with opening and closing the connection to SQL Server for each one.

I am now writing an application that monitors a table where rows are written by a 3rd-party application at possibly several rows per second. My job is to pick up those rows, analyse the data, and move them to different tables. This will be done with a timer which is currently set to tick every second.

My question is: At what stage should I start to think about keeping open a permanent connection to the database ?

1 row per second ?

100 rows per second ?

Any suggestions appreciated.

Steve.

View 2 Replies View Related

Boilerplate Activation SQL For High-throughput

Apr 9, 2007



OK, so assume I am recycling dialogs in my client code, and assume I am doing something similar to get a dialog handle in my T-SQL. What should the activated stored procedure that processes my queue look like if I am expecting thousands of messages per second? Assume also that there is a small bit of logic needed to process each individual message. I am building for a high-throughput scenario and would like to get as much as possible out of each second-tier Service Broker server before the aggregated data is moved up the chain to a master. The first tier is Express on a web server and exists primarily as a forwarding mechanism.
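A hedged sketch of the batched pattern generally recommended for this workload: loop inside the activated procedure, RECEIVE messages in bulk into a table variable, and process each batch set-based rather than one message at a time (queue and procedure names are assumptions):

CREATE PROCEDURE dbo.ProcessTargetQueue
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @batch TABLE (
        conversation_handle uniqueidentifier,
        message_type_name   nvarchar(256),
        message_body        varbinary(max));
    WHILE 1 = 1
    BEGIN
        DELETE FROM @batch;
        BEGIN TRANSACTION;
        WAITFOR (
            RECEIVE TOP (1000)
                conversation_handle, message_type_name, message_body
            FROM dbo.TargetQueue
            INTO @batch
        ), TIMEOUT 1000;              -- give up after 1s of silence
        IF @@ROWCOUNT = 0
        BEGIN
            COMMIT TRANSACTION;
            BREAK;                    -- activation fires again on new traffic
        END;
        -- apply the small per-message logic set-based over the whole batch,
        -- and END CONVERSATION for any EndDialog or Error message types
        COMMIT TRANSACTION;
    END;
END;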

View 1 Replies View Related

Help Loading Data From Excel To Analysis Services

Jul 20, 2005

I have a 40 MB database in Excel format. I need to use it in Analysis Services. I imported the data by DTS (Data Transformation Services); everything is working and I can see the database, but I can't load it in Analysis Services. I have to construct a cube, but I can't see the database in any way I tried. Thank you; I hope my message is quite clear. I hope to find somebody who can answer me in Italian, but English is good as well.

View 3 Replies View Related

Reporting Services :: Add Message During The Loading Of Report?

Sep 28, 2015

I want to display a message while the window loads. Case: if my report takes more than 10 seconds to fetch the records from the database, then I want to show a message while the report is loading.

I want to display this message before the report loads (that is, during processing time).

Is this possible in Reporting Services?

View 5 Replies View Related

Reporting Services :: PPS (Performance Point Services) Filter Does Not Refresh

Aug 6, 2012

I am facing a severe refresh issue in PPS reports. I have two dashboards:

Dashboard_One and Dashboard_Two

I have a few filters on both dashboards.

In Dashboard_One I have 2 filters:

1) Year filter, where Year 2012 is my default value
2) City filter, where "CityOne" is the default filter value

If I  select Year"2010" in Period Filter and "CityTwo" In City Filter.I see Related reports . Now I navigate to Dashboard_Two to See Other Reports where I have few Other Filter Like  "Country" where I select "CountryThree" . When I navigate Back to Dashboard_one I do not see Dashboard with Default value given to them

I still see Filter value Year=2010 and CountryFilter="CountryTwo" in Dashboard Dashboard_One .. where as I should have  seen it based on the Default  value given to the Filter.. How should I resolve this refresh issue which I am facing in PPS Dashboard. I do not see Default value in the Filter ,It always give the filter value which was selected later when Navigated back.

View 3 Replies View Related






