Problem Running Package With 'larger' Amount Of Data
Jun 9, 2006
Dear,
I created a package that gets data from files and database sources, does some transformations, retrieves dimension IDs, and then inserts the rows into a fact table.
Running this package with a limited amount of data (a few hundred thousand records) does not result in any errors, and everything goes fine.
Now, running the same package (still in debug mode) with more data (about 2,000,000 rows) doesn't produce any errors either, but it just stops running. In fact, it doesn't really stop, but it doesn't continue either. If I had only been waiting for some minutes or hours, I could believe it was still processing, but I have waited for about a day and it is still 'processing' the same step.
Any ideas on how to dig further into this in order to find the problem? Or is this a known problem?
Thanks for your ideas,
Jievie
View 4 Replies
May 9, 2007
I am trying to sum a goal amount that is repeated on each record. But what is the formula to sum only the distinct goal amounts?
Example:
Month        Year   Goal   Other Value
March        2007   500    5568
March        2007   500    5568
March        2007   500    5569
April        2007   600    5568
April        2007   700    5569

Total (what I receive):
March-April  2007   1600   5568
March-April  2007   1200   5569

Total (what I expect):
March-April  2007   1100   5568
March-April  2007   1200   5569
I haven't found anything online to help.
Thanks,
Sam
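One hedged approach: collapse the duplicates to a single goal per month before summing. The table and column names (dbo.Goals, OtherValue) are assumptions, and MAX is used on the premise that the goal is identical within a month:

SELECT OtherValue, SUM(MonthGoal) AS TotalGoal
FROM (SELECT [Month], [Year], OtherValue, MAX(Goal) AS MonthGoal  -- one goal per month
      FROM dbo.Goals
      GROUP BY [Month], [Year], OtherValue) AS g
GROUP BY OtherValue;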
View 5 Replies
Dec 22, 2014
I need to calculate a running total which resets when it reaches 16, and I also need to calculate the remaining amount as paid.
I have attached sample data: we have two columns, Earned and Used, and we need to calculate the Balance and Paid columns.
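Without the attached sample this is a guess, but one common reading is: the balance accrues until it reaches 16, at which point it is paid out and the running balance resets. A hedged recursive-CTE sketch, where dbo.LeaveLedger and EntryDate are placeholder names:

;WITH ordered AS
(
    SELECT ROW_NUMBER() OVER (ORDER BY EntryDate) AS rn, Earned, Used
    FROM dbo.LeaveLedger
),
running AS
(
    -- anchor row: pay out and reset if the first balance already reaches 16
    SELECT rn, Earned, Used,
           CASE WHEN Earned - Used >= 16 THEN 0 ELSE Earned - Used END AS Balance,
           CASE WHEN Earned - Used >= 16 THEN Earned - Used ELSE 0 END AS Paid
    FROM ordered
    WHERE rn = 1
    UNION ALL
    -- carry the previous balance forward row by row
    SELECT o.rn, o.Earned, o.Used,
           CASE WHEN r.Balance + o.Earned - o.Used >= 16 THEN 0 ELSE r.Balance + o.Earned - o.Used END,
           CASE WHEN r.Balance + o.Earned - o.Used >= 16 THEN r.Balance + o.Earned - o.Used ELSE 0 END
    FROM running AS r
    JOIN ordered AS o ON o.rn = r.rn + 1
)
SELECT rn, Earned, Used, Balance, Paid
FROM running
ORDER BY rn
OPTION (MAXRECURSION 0);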
View 4 Replies
Nov 28, 2012
I have a sales table which looks as below.
Columns: DEPARTMENT, Barnd_Name, Item_Group, S_DATE, S_AMOUNT
Sample row: Administration, IBM, ...
[code]....
Now I need month-wise running totals, but they should accumulate within the following groups, checked in this order (a hedged sketch follows the list):
1) DEPARTMENT
2) Brand
3) Item Group
4) Month
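A hedged sketch for SQL Server 2012 or later, where SUM() OVER supports running totals; dbo.Sales is an assumed table name and the column spellings follow the post. On older versions a correlated subquery is needed instead:

SELECT DEPARTMENT,
       Barnd_Name,
       Item_Group,
       DATEADD(MONTH, DATEDIFF(MONTH, 0, S_DATE), 0) AS SalesMonth,  -- first day of the month
       SUM(S_AMOUNT) AS MonthAmount,
       SUM(SUM(S_AMOUNT)) OVER (
           PARTITION BY DEPARTMENT, Barnd_Name, Item_Group
           ORDER BY DATEADD(MONTH, DATEDIFF(MONTH, 0, S_DATE), 0)
           ROWS UNBOUNDED PRECEDING) AS RunningTotal               -- accumulates month by month
FROM dbo.Sales
GROUP BY DEPARTMENT, Barnd_Name, Item_Group,
         DATEADD(MONTH, DATEDIFF(MONTH, 0, S_DATE), 0);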
View 12 Replies
Jul 20, 1998
Is it possible to make a text field, or something like that, which can contain more than 255 characters?
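For SQL Server of that era the TEXT datatype is the usual answer (it holds up to 2 GB of character data); on modern versions VARCHAR(MAX) replaces it. A minimal sketch with an assumed table name:

CREATE TABLE dbo.Notes
(
    NoteId INT NOT NULL,
    Body   TEXT NULL   -- not limited to 255 characters like CHAR/VARCHAR of the time
);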
View 1 Replies
Jan 8, 2008
I have written code to combine and delete redundant data in my system. The table structure remains the same, except I changed some INTs to TINYINTs.
When I run sp_spaceused, it tells me that the number of rows is smaller (which is correct), but the data size and index_size are significantly larger AFTER the deletions.
I tried using shrink, but that doesn't seem to change anything.
When I right-click the database, and choose PROPERTIES, it also confirms that the database got significantly larger.
I am confused about how deleting data and changing to TINYINTs could make my database bigger. What would cause this?
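A hedged guess at the cause: deletes leave pages allocated but half empty, and changing a column to TINYINT does not rewrite existing rows, so the space only comes back once the indexes are rebuilt. dbo.MyTable below is a placeholder:

-- SQL 2000-era syntax; on SQL 2005 and later use: ALTER INDEX ALL ON dbo.MyTable REBUILD
DBCC DBREINDEX ('dbo.MyTable');
EXEC sp_spaceused 'dbo.MyTable';  -- re-check data and index sizes afterwards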
View 9 Replies
Nov 23, 2010
My package is connecting to an external data provider using an OLE DB driver. The package runs fine in debug mode. When I tried to run the same package from SQL Server Agent, it failed to acquire the connection. The OLE DB provider does not contain too much information (connection string, initial catalog, blank user name and password). The same package executes successfully if I run it using dtexec in a BAT file. But if I use the dtexec command in a SQL Server job step as an operating system command and try to run it, the job fails reporting "cannot acquire the connection".
View 2 Replies
Jun 1, 2015
We have been trying for a while to use the Power BI tools (Power Query, Power BI Designer (Desktop) and Power BI Designer (Cloud)) with larger data sources (2-7M records in the fact table), unfortunately up to now with unsatisfactory results. It seems that these tools are just not designed to handle these data volumes?
To my understanding, the approach of the Power BI components is to always create a local (or cloud-based) cache that has only a limited capacity, and 2M+ records already seem to exceed this limit. So instead of firing off a query on demand (for example, one based only on the distinct filter options rather than the entire fact set), only the intermediate cache of the model can be used.
Our initial focus was to use a normalized table in SQL Server with around 4M records. The first problem is that Power Query/Power BI Designer fails to provide a complete list of the distinct filter items.
This I can understand in some way, as doing a distinct on a large data set like that is not trivial. As a workaround I could imagine setting up a star-schema dimension table with the distinct "dimension members", i.e. the filters are driven by this table, which has only a few thousand rows, and the filters there are then applied to the fact table. I haven't found a way to do this effectively with Power Query.
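On the SQL Server side, the dimension table for that workaround is cheap to build; a hedged sketch, with dbo.FactSales and Region as placeholder names:

SELECT DISTINCT Region
INTO dbo.DimRegion
FROM dbo.FactSales;
-- point the filter/slicer at dbo.DimRegion (a few thousand rows) and let it
-- drive the query against the multi-million-row fact table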
Another option we have tried was using the cloud-based Power BI service in conjunction with a SQL cloud service holding the same data set. That unfortunately didn't work either: the setup of the source works fine, but as soon as we try to start a query by dragging a value field onto the dashboard, errors occur.
View 5 Replies
Mar 10, 2008
Hi,
I created an SSIS package which generates an output file and places it on a remote file-share location, which looks something like this:
\\RemoteServerName\RemoteFilePath
The package executes fine when I run it through BIDS or through the Execute Package Utility, writing the output file to the remote file-share location.
I then created a SQL job for the package and ran the job. It throws an error saying:
Executed as user: Domain\User. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:33:06 AM Error: 2008-03-10 10:33:22.22 Code: 0xC020200E Source: DFT_Generate Output File Description: Cannot open the datafile "\\RemoteServerName\RemoteFilePath\OutputFileName.txt". End Error Error: 2008-03-10 10:33:22.34 Code: 0xC004701A Source: DFT_Generate Output File DTS.Pipeline Description: component "FF_DST_Output" (160) failed the pre-execute phase and returned error code 0xC020200E. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:33:06 AM Finished: 10:33:22 AM Elapsed: 15.891 seconds. The package execution failed. The step failed.
Domain\User has all the permissions on the remote file-share location.
SQL Server Agent is running with the log-on account Domain\User (same as the above).
Could anyone help me in resolving this issue?
Thanks & Regards,
Sriram.
View 8 Replies
Mar 19, 2014
In the full recovery model, if I run a transaction that inserts 10MB of data into a table, then 10MB of data is moved in the data file. Does this mean that the log file will grow by exactly 10MB as well?
I understand that all transactions are logged to the log file to enable rollback and point-in-time recovery, but what is actually physically stored in the log file for this transaction's record? Is it the text of the command from the transaction, or the actual physical data from that transaction?
I ask because, say I have two drives, one with 5MB/s write speed for the log file and one with 10MB/s write speed for the data file: if I start trying to insert 10MB of data per second into the table, am I going to be limited to 5MB/s by the log file drive, or is SQL Server not going to try to log all 10MB each second to the log file?
View 6 Replies
Apr 23, 2015
Where is a package visible when running the Data Import/Export wizard, choosing to save a package, and choosing "SQL Server" as the location? When I make an SSIS connection in Management Studio I do not see the package under the "MSDB" node.
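If the wizard saved where expected, the package lands in the msdb database of the instance you pointed it at, and in Management Studio it only appears when you connect to Integration Services (not the database engine) under Stored Packages > MSDB. A hedged way to confirm from T-SQL, using the SQL 2008+ catalog table:

SELECT name, description, createdate
FROM msdb.dbo.sysssispackages;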
View 4 Replies
Sep 25, 2015
I have created an event handler that contains a Data Flow task with an OLE DB source and an Excel destination.
This event handler is triggered by the failure of an Execute SQL task in the control flow's sequence container.
However, when I execute the Data Flow task of the event handler on its own, it runs successfully, but it fails when I execute the whole package.
I get the below error message:
[OLE DB Source [21]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "TK463DW" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I have tried setting the 'DelayValidation' property to 'True' on all the control flow and data flow tasks in the package and on the event handler, but still I could not fix this. Not sure what I am missing.
View 4 Replies
Aug 20, 2015
I have been looking for a solution for "Communication link failure" for many months, but with no luck. I am running an SSIS package to load data, and the job fails many times with the error 'Communication link failure'. I have searched everywhere but found nothing.
Below is the complete error description from when the job failed.
OS - Windows Server 2008 R2 Enterprise Edition
RAM - 198GB
SQL Server 2008 R2 Enterprise Edition; the error description is below:
Started: 6:22:40 AM Error: 2015-08-19 18:50:32.70 Code: 0xC0202009
Source: Data Flow Task Lookup [193]
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred.
Error code: 0x80004005. An OLE DB record is available.
[Code] ....
View 5 Replies
Jun 1, 2006
Hi,
How do I insert and update a large number of records (4 lakh, i.e. 400,000) into a destination table through Business Intelligence Development Studio using an SSIS package? How do I achieve this? I tried a left outer join and a conditional split, but the problem is that it's not able to insert and update records simultaneously. Can anyone give me a sample?
Thanks & Regards
Jeyakumar.M
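One common pattern at this volume, as a hedged sketch: bulk-load everything into a staging table with a plain OLE DB destination, then let an Execute SQL task run the update and the insert as two set-based statements (MERGE only exists from SQL Server 2008, so it is avoided here). All table and column names below are assumptions:

-- update rows that already exist in the destination
UPDATE d
SET    d.SomeColumn = s.SomeColumn
FROM   dbo.Destination AS d
JOIN   dbo.Staging     AS s ON s.BusinessKey = d.BusinessKey;

-- insert rows that do not exist yet
INSERT INTO dbo.Destination (BusinessKey, SomeColumn)
SELECT s.BusinessKey, s.SomeColumn
FROM   dbo.Staging AS s
LEFT JOIN dbo.Destination AS d ON d.BusinessKey = s.BusinessKey
WHERE  d.BusinessKey IS NULL;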
View 3 Replies
May 2, 2008
Hi All,
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8GB of memory on Win 2003 SP2), and the process has been very frustrating to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
Other times I get this error message:
.NET Runtime version 2.0.50727.1433 - Fatal Execution Engine Error (79FFEE24) (80131506)
And still other times
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
Does anyone have any other suggestions to try?
Thanks.
View 4 Replies
Dec 6, 2007
Hi there
We have an SSIS run which runs as follows:
The master package has a configuration file, specifying the connect strings
The master package passes these connect-strings to the child packages in a variable
Both the master package and the child packages have connection managers, set up to use localhost. This is done deliberately, to be able to test the packages on individual development PCs.
We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed, they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253\uedw).
Case 1:
The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. We then get the following in the sysdtslog:
Error - "cannot connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2:
When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), then everything works fine.
Case 3:
When we run exactly the same, again against the dkms5253\uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4:
When we then stop the SQL Server on the default instance, the package faults again, this time with:
Error - "timeout when connect to database xxx"
Info - "package is preparing to get connection string from parent ..."
and then it continues as in the first case.
From all this we conclude that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
Thanx,
/Nils M - Copenhagen
View 3 Replies
Nov 5, 2007
Hi all,
I am creating a DTS package to export files from one database to another database.
I tried to search for ways to execute the files and found out that I need to add a
reference to Microsoft.SqlServer.ManagedDTS. However, I cannot find this reference in
my references list. Do I have other alternatives to run this file?
View 8 Replies
Apr 26, 2006
I am receiving an error on my master package that executes a number of other packages. The individual packages work fine when executed by themselves. However, I am getting the following error when I attempt to execute it from another package:
Error: Failed to acquire connection "conneciton". Connection may not be configured correctly or you may not have the right permissions on this connection.
Thanks in advance for your help.
View 1 Replies
Feb 10, 2007
Hi, I'm PCV.
I want to know how to calculate the amount of data (in MB) that is transferred from one server to another (Publisher ---> Subscriber) using merge replication. I know that the amount of data depends on the number of rows and the size of the columns; I only want to know how to calculate that amount of data. I am using SQL Server 2000 on Windows XP Professional. Thank you.
PCV
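A hedged starting point rather than an exact formula: sp_spaceused reports a table's data and index size, so running it on the published articles before and after a synchronization approximates the volume moved. The table name is a placeholder:

EXEC sp_spaceused 'dbo.MyPublishedArticle';
-- compare 'data' plus 'index_size' before vs. after a sync to bound what replication transferred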
View 2 Replies
Mar 18, 2008
I'm creating a temporary table in a SQL 2005 stored procedure that contains the transaction amounts entered in periods <= the period the user enters.
I can return that amount in my result set. But I also need to separate out, by account, the amounts in just the period = the period the user enters. There can be many entries or no entries in any period. I populate the temporary table this way:
SELECT
t.gl7accountsid,
a.accountnumber,
a.description,
a.category,
t.POSTDATE,
t.poststatus,
t.TRANSACTIONTYPE,
t.AMOUNT,
case
when t.transactiontype=2 then amount * (-1)
else amount
end as transamount,
t.ENCUMBRANCESTATUS,
t.gl7fiscalperiodsid
FROM
UrsinusCollege.dbo.gl7accounts a
join
ursinuscollege.dbo.gl7transactions t on
a.gl7accountsid=t.gl7accountsid
where
(t.gl7fiscalperiodsid >= 97
And
t.gl7fiscalperiodsid<=@FiscalPeriod_identifier)
And poststatus in (2,3)
and left(a.accountnumber,5) between '2-110' and '2-999'
And right(a.accountnumber,4) > 7149
And not(right(a.accountnumber,4)) in ('7171','7897')
order by a.accountnumber
Later I create a temporary table that contains budget information. I join these 2 temporary tables to produce my result set. But I don't know how to get the information for just one period. For example, if the user enters 99 as the FiscalPeriod_identifier, I need a separate field that contains only those amounts (if any) that were entered for each account in period 99.
Can anyone help? It may be that I am not seeing the forest for the trees, but I can't figure it out.
Thanks very much.
Sue
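A hedged sketch of the usual trick, conditional aggregation: keep the <= filter for the running amount, and add a SUM(CASE ...) that only counts the single period the user entered. It reuses the post's own table, column, and parameter names:

SELECT a.accountnumber,
       SUM(CASE WHEN t.transactiontype = 2 THEN t.amount * (-1) ELSE t.amount END)
           AS runningamount,   -- everything up to the entered period
       SUM(CASE WHEN t.gl7fiscalperiodsid = @FiscalPeriod_identifier
                THEN CASE WHEN t.transactiontype = 2 THEN t.amount * (-1) ELSE t.amount END
                ELSE 0 END) AS currentperiodamount   -- only the entered period, e.g. 99
FROM UrsinusCollege.dbo.gl7accounts a
JOIN UrsinusCollege.dbo.gl7transactions t
  ON a.gl7accountsid = t.gl7accountsid
WHERE t.gl7fiscalperiodsid >= 97
  AND t.gl7fiscalperiodsid <= @FiscalPeriod_identifier
  AND t.poststatus IN (2, 3)
GROUP BY a.accountnumber;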
View 6 Replies
Nov 20, 2006
Hi, I have an application, let's call it simply "A", which creates two huge tables in a Microsoft SQL database; let's call them "table1" and "table2". It saves a really large amount of data into these tables. After application "A" has finished, another application is executed which deletes these two tables. Then application "A" is started again, and it creates the two tables again, but the amount of data becomes bigger. It can only proceed if the tables were deleted completely beforehand and the database is empty. This is the procedure I repeat very often, but every time the amount of data becomes bigger (table1 and table2 grow). A couple of times it works fine, but at some point the data seems to become too big and application "A" fails, most likely because the data wasn't removed correctly/completely. This is my code for deleting the two tables; maybe there is something I have to change:

try
{
    // build the connection string; named instances need the backslash, e.g. mycomputer\dbname
    SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder(
        "Server=mycomputer\\dbname;Integrated Security=SSPI;Initial Catalog=testing");
    builder["Server"] = "(local)\\dbname";
    builder["Connect Timeout"] = 10;
    builder["Trusted_Connection"] = true;
    builder["Initial Catalog"] = ((ComponentConfiguration)this.componentConfig).Persistency.DatabaseName;

    SqlConnection sqlconnection = new SqlConnection();
    sqlconnection.ConnectionString = builder.ConnectionString;
    sqlconnection.Open();

    SqlCommand cmd1 = new SqlCommand("DROP TABLE table1"); // TODO: delete all tables
    SqlCommand cmd2 = new SqlCommand("DROP TABLE table2"); // TODO: delete all tables
    cmd1.Connection = sqlconnection;
    cmd2.Connection = sqlconnection;

    cmd1.ExecuteNonQuery();
    Thread.Sleep(7000);
    cmd2.ExecuteNonQuery();
    Thread.Sleep(7000);

    sqlconnection.Close();
    Thread.Sleep(3000);
}
catch { } // empty catch: any failure of the DROPs is silently swallowed

Thanks for help! mulata
View 6 Replies
Jul 2, 2007
Generally, on any screen, we design the filter screen by allowing the user to specify a range or a single value to search on.
But sometimes, on some screens, it is more convenient if the user can specify any number of values to search on.
For example, take a screen which has information about people in various provinces.
The user would like to search by selecting any number of provinces.
There are checkboxes for the provinces in the filter section, which the user can tick.
Hence, sometimes the user picks 3 provinces (LA, Michigan, Washington DC) to see descriptions for only those 3 chosen provinces,
and sometimes the user picks 2 provinces (LA, Michigan) to see descriptions for only those 2 chosen provinces.
Please give me any idea for creating a stored procedure, or any technique, to accomplish this.
Help me Pleaseeeee
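One hedged way to do this in a stored procedure: pass the checked provinces as a single comma-separated parameter and match with CHARINDEX, which works on SQL 2000 and later. dbo.People and the Province column are placeholder names:

CREATE PROCEDURE dbo.GetPeopleByProvince
    @Provinces VARCHAR(1000)   -- e.g. 'LA,Michigan,WashingtonDC'
AS
BEGIN
    SELECT p.*
    FROM dbo.People p
    -- wrap both sides in commas so 'LA' cannot accidentally match a longer name
    WHERE CHARINDEX(',' + p.Province + ',', ',' + @Provinces + ',') > 0;
END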
View 1 Replies
Jan 12, 2006
hello
I have just created a test database and now need to insert a large number of records into one of the tables. We are thinking of about 1 million records. Has anyone got a SQL script that I could use to create these records?
cheers
john
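On SQL Server 2005 a set-based generator is much faster than a loop; a hedged sketch, where the target table and columns are placeholders (master..spt_values has roughly 2,500 rows, so one cross join yields well over a million combinations):

INSERT INTO dbo.TestTable (Id, Payload)
SELECT TOP 1000000
       ROW_NUMBER() OVER (ORDER BY (SELECT NULL)),  -- sequential ids
       NEWID()                                      -- random-looking payload
FROM master..spt_values a
CROSS JOIN master..spt_values b;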
View 6 Replies
Aug 6, 2007
I need to make a job that will update up to 8000 rows with the list description of 'berkhold' to 'berknew' in SQL 2000. This is something that I have to do with several projects manually every day by doing the following 2 steps.
SELECT ListDescription, CRRecordID
FROM dbo.BerkleyGroupInventory
WHERE ListDescription = 'BerkHold' AND CRCallDateTime < '1/1/2003' AND CRCallResultCode = 'CC'
ORDER BY CRRecordID
I then scroll to the 8000th row, copy the CRRecordID, and run the following query:
UPDATE dbo.BerkleyGroupInventory
SET ListDescription = 'BerkNew'
WHERE ListDescription = 'BerkHold' AND CRRecordID <= 5968432 AND CRCallDateTime < '1/1/2003' AND CRCallResultCode = 'CC'
I'm sure there's an easier way to do this, but I'm very new to SQL and haven't figured it out yet
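On SQL 2000 both steps can collapse into one batch: SET ROWCOUNT caps the UPDATE at 8000 rows, so there is no need to look up the 8000th CRRecordID first. A hedged sketch reusing the post's own names:

SET ROWCOUNT 8000;           -- cap the next statement at 8000 rows

UPDATE dbo.BerkleyGroupInventory
SET ListDescription = 'BerkNew'
WHERE ListDescription = 'BerkHold'
  AND CRCallDateTime < '1/1/2003'
  AND CRCallResultCode = 'CC';

SET ROWCOUNT 0;              -- always reset, or later statements stay capped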
View 11 Replies
Oct 7, 2005
This is a general question on the best way to import a large amount of data to a MS-SQL DB.
I can have the data in just about any format I need to; I just don't know how to import the data. I have some experience with SQL but not much.
There are about 1500 to 2000 lines of data. I am looking for the best way to get this amount of data in on a monthly basis.
Any help is greatly thanked!! Mike Charney
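For 1500-2000 rows a month, a flat file plus BULK INSERT is usually the simplest route; a hedged sketch, where the table name and file path are assumptions:

BULK INSERT dbo.MonthlyImport
FROM 'C:\imports\monthly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
-- FIRSTROW = 2 skips a header line; the DTS/import wizard works just as well at this volume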
View 5 Replies
May 18, 2001
I need to delete data from a particular table which has more than half a million records. The data needs to be deleted is more than 200,000 records from the table. What is the best way to delete the data from the table other than importing into a temporary table and performing the same operation?
Let me know if the strategy to be followed is okay.
1. Drop all the triggers
2. Drop all the indexes
3. Write a procedure with a loop, setting ROWCOUNT to 1000, and delete the records (since if I try to delete all the rows at once it will give a timeout error).
The above procedure will delete 1000 records for each batch inside the loop till it wipes out all the data for the specified condition.
4. Recreate Indexes and Triggers.
Please let me know if there are any other optimal solutions.
Thanx,
Zombie
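The loop in step 3 could look like the following hedged sketch for SQL 2000-era servers; dbo.BigTable and the WHERE clause are placeholders:

SET ROWCOUNT 1000;           -- caps each DELETE at 1000 rows
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable
    WHERE SomeCondition = 1;

    IF @@ROWCOUNT = 0 BREAK; -- nothing left to delete
END
SET ROWCOUNT 0;              -- always reset afterwards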
View 2 Replies
Jul 23, 2005
Hi, I am using SQL 2000 and have a table that contains more than 2 million rows of data (and growing). Right now I have encountered 2 problems:
1) Sometimes, when I try to query against this table, I get a SQL command timeout. Hence, I did more testing with Query Analyzer, and found that the same queries would not always take about the same time to execute. Could anyone please tell me what affects the speed of a query, and which is the most important among all the factors? (I can think of the open connections, the server's CPU/memory...)
2) I am not sure whether 2 million rows is considered a lot or not; however, it is starting to take 5-10 seconds for me to finish some simple queries. I am wondering what the best practices are for handling this amount of data while keeping decent performance?
Thank you,
Charlie Chang
[Charlies224@hotmail.com]
View 5 Replies
Jul 23, 2005
I have SQL 2000 and need to retrieve a fairly large amount of data (~50,000 characters) in XML format and then insert it into a field of the text type.
As 'FOR XML' can't be used with local variables, INSERT INTO, or SELECT INTO, this makes "XML support" quite useless in many aspects.
Can anyone please help me in solving this? Thanks a lot for your help and time. Pavel
View 1 Replies
Jul 6, 2007
I am trying to run an SSIS package (called "Parent") that calls another package (called "Child"). The packages are stored in SQL Server (the MSDB database) and are called from a console app by the code below:
' Requires a reference to Microsoft.SqlServer.ManagedDTS.dll
Imports Microsoft.SqlServer.Dts.Runtime

Public Shared Function RunPackage(ByVal server As String, _
                                  ByVal userName As String, _
                                  ByVal password As String, _
                                  ByVal packageName As String, _
                                  ByVal packagePassword As String) As String
    Dim app As New Application
    Dim result As String = ""

    ' Decrypt the package with its password, load it from MSDB and execute it
    app.PackagePassword = packagePassword
    Dim pkg As Package = app.LoadFromSqlServer(packageName, server, userName, password, Nothing)
    Dim pkgResults As DTSExecResult = pkg.Execute()

    ' Collect the overall result plus any task-level errors
    result = pkgResults.ToString() & vbNewLine
    For Each err As DtsError In pkg.Errors
        result += "Task: " & err.Source & vbNewLine
        result += ParseErrorMessage(err.Description) & vbNewLine
    Next
    Return result
End Function
If I just run the Child package with this code, then it works! But if I run the Parent package from the same code, then it fails with this error: "Error 0x80004002 while preparing to load the package. No such interface supported".
The only difference between the 2 packages is that the Parent calls the Child, and the Child calls nothing.
The Connection Manager in Parent used to connect to Child uses SQL Server Authentication and it needs to be that.
My question: From VB.NET, how do I run a SSIS package that calls another package?
View 1 Replies
Sep 30, 2005
I need to retrieve a large amount of data from the SQL Server database, make changes to one field, and put the data back using a console application. How do I do it?
View 3 Replies
May 5, 2003
Hi,
My application needs to retrieve data from a table which has more than 15 lakh (1.5 million) records. The records keep increasing by thousands every 15 days.
Is there any way I can reduce the retrieval time? Basically I have a SELECT statement with a few conditions and a clause for the IDs of these records.
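If the columns in those conditions are not indexed yet, an index covering them is the usual first step; a hedged sketch, where the table and column names are placeholders for the actual WHERE-clause columns:

CREATE NONCLUSTERED INDEX IX_MyTable_Filter
ON dbo.MyTable (StatusId, RecordId);
-- lets the SELECT seek the few matching rows instead of scanning 1.5M+ rows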
View 2 Replies
Jan 30, 2007
Hi Experts:
We have several databases linked via merge replication. Due to business requirements, we needed to delete 5M rows in one table, and we did it on one subscriber. However, the publisher keeps uploading the deletion operations from the subscriber and is blocking any downloading operations from publisher to subscriber. How can we accelerate the replication now? This has already been running for 2 days and will continue for another 1-2 days. Is it possible to make the publisher do the downloading before the uploading? How can we speed up large data deletion operations in a replication environment?
Thanks in advance!
Ron
View 4 Replies
Sep 1, 2006
I have an interesting challenge: we are not allowed to give users direct access to data on our SQL Server. Audit requires us to take the data out of our production server and pass it to the user. My situation is that I have a table in SQL with over 100,000 records and growing. I want to pass that to an Access database. I am utilizing DTS and a data transform.
Is there a better way? The package is running slowly and even appears to freeze, and I see this amount of data growing as well.
Don S
View 1 Replies