SSIS Using MASSIVE Amounts Of Memory

Feb 8, 2008

Hi,

I have a series of SSIS packages, all of which are ultimately executed by a parent package.

I'm consistently getting "OutOfMemory" errors when working with the packages. Closing Visual Studio and re-opening the package(s) solves this temporarily, but the solution is short-lived: the OutOfMemory error recurs quite quickly after re-opening, often after doing nothing more than altering a variable's default value and attempting to save the package.

The average size of the packages in question (.dtsx files) is around 7,000 KB, with the largest being 12,500 KB. The total size of all the solution's packages is ~75,000 KB.

The Processes tab in Task Manager shows a Mem Usage counter for devenv.exe *32 of around 20,000 KB when Visual Studio is first opened. However, when a single ~6,000 KB .dtsx file is opened, this counter jumps to over 300,000 KB, and when the entire solution is opened (when the parent package is executed), the Mem Usage counter for devenv.exe *32 is a massive 800,000 KB+!

Is this normal SSIS behaviour or do I have a major problem? Any tips or suggestions as to how to resolve this issue would be gratefully received.

FYI, "SELECT @@VERSION" gives me "Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2) "

My Server is Windows Server 2003 R2 Enterprise x64 SP2 with 8GB of RAM.

Thanks in advance.

Leigh.

View 7 Replies



Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16GB or 32GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003 I typically see pagefile usage of no more than 12% for a 2GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32GB of physical memory and make the pagefile only 1.5 x 32GB, I have a 48GB pagefile. 10% of this is 4.8GB, which I would hope I never see consumed.

Any thoughts?

Thanks, Dave

View 11 Replies View Related

DtsDebugHost Process Consumes Huge Amounts Of Memory, Then Fails

Jul 18, 2006

I gave up on the Script Tasks and decided to use custom tasks instead. Problem again. My code opens a 400MB file and reads it line by line using a StreamReader. Each line is approximately the same length. For each line there is some processing, and then the line is written into another file using a StreamWriter. I am watching the DtsDebugHost process with Task Manager open, and here is what happens:

Initially it's all good. Then, when it has read through the first 150MB+ of the input file, the memory usage of DtsDebugHost shoots up dramatically - about 1GB (both virtual and physical memory). Then the task fails with an OutOfMemoryException. I thought the problem was with my code, but it still happens even if I only read a line from one file and write it to another, without any processing!

When I invoke the same code from an Execute Process task, it's all good - no problems at all whatsoever.

Any ideas?

View 1 Replies View Related

Would Max Memory Include SSIS And SSAS Memory?

Mar 27, 2008

Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server memory usage, thereby leaving some memory for the OS; this is based on the fact that if the max memory is not specified, SQL will use whatever memory is available and eventually crash the system.

My question is: when a server has the SSIS and SSAS services installed along with the SQL service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?

Thanks,
fshguo.

View 1 Replies View Related

Memory Issues, SSIS Package Out Of Memory Help

Dec 6, 2006

I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current appdomain. I can see that one particular assembly is increasing on every loop: VBAssembly increases by 6 every time it hits the script component, and along with it the memory climbs. What is this VBAssembly being used for? Is there an update to SQL Server Integration Services that I need?

Thanks! Aaron B.

View 6 Replies View Related

Massive Inserts

Oct 15, 2004

Currently I have huge amounts of data going into a table. I'm sending an XML doc and using OPENXML with a cursor to seed them.

The question I have is whether to let duplicate keyed rows bounce and then check @@ERROR and do an update on the non-keyed field, or to do a select on the keyed field and then do an insert or update based on the select's results (see the sketch below).

Speed is my goal.
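A third pattern worth weighing, sketched here with hypothetical table and column names: skip both the error check and the extra SELECT by attempting the UPDATE first and inserting only when nothing matched.

-- Try the update first; @@ROWCOUNT tells us whether the key already existed.
UPDATE dbo.Target
SET nonkeyed_field = @value
WHERE keyed_field = @key;

IF @@ROWCOUNT = 0
    INSERT INTO dbo.Target (keyed_field, nonkeyed_field)
    VALUES (@key, @value);

This costs one statement when the row exists and two when it doesn't, with no round trip spent on a separate SELECT.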

View 3 Replies View Related

Massive .bak File

Feb 6, 2007

Rather than posting twice, I thought I would put both issues I'm having in one. Our server is Windows Server 2003 and we're running SQL Server 2005.

The first issue is this: We have several databases and I have scheduled their backups to run nightly, which works just fine. A couple of weeks ago, one of the databases' .bak files grew from about 500MB to 2GB overnight. Then, just a few days ago, it went from 2GB to 3.5GB. There is nothing unusual going on in the live DB that would warrant such an increase in the .bak file. All the DBs are in the same backup job schedule, but this is the only one affected. Additionally, I had autogrowth enabled on all the DBs, but today disabled it for this particular DB. Any ideas?

The second issue is my tempdb.mdf file on my C drive. It will go from just a few hundred KB to 4.5GB overnight, consuming most of what is left on my C drive. I'm afraid I'm in for a system crash if it continues. I have to stop SQL Server and restart it to clear the size. Is there a way to move the tempdb.mdf file to my F drive?
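(For reference, tempdb can be relocated with ALTER DATABASE; the new path takes effect at the next service restart. A minimal sketch, with hypothetical file paths and the default logical names:)

-- Point tempdb's files at the F: drive; SQL Server creates them there on the
-- next restart, after which the old files on C: can be deleted.
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'F:\MSSQL\Data\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'F:\MSSQL\Data\templog.ldf');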

I don't know if these two issues are related or not but certainly would like to hear from someone.

Sorry, in advance, for the large post.

Dave

View 20 Replies View Related

Massive Delete In DB

Sep 6, 2005

Hello,

I have a huge database (2 GB / month) and after a while it becomes non-operational (time-outs, etc.), so I have written a SQL statement (a delete) that can reduce the DB size by around 60% without compromising the application's data needs. The problem is that when I execute it, the DB does reduce its size by 60%, but the transaction log grows at the same rate. Can I execute the statement in some "commit" or "transaction" mode so as to keep SQL Server from writing to the log?

Thanks for the help!

Antonio
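SQL Server will always log the delete itself, but running it in batches keeps the active portion of the log small, since the log can be truncated between batches (under the SIMPLE recovery model or with regular log backups). A minimal sketch in SQL 2000 syntax, with a hypothetical table and filter:

SET ROWCOUNT 10000  -- cap how many rows each DELETE touches
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable WHERE purge_flag = 1  -- hypothetical criteria
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0  -- restore normal behavior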

View 6 Replies View Related

Max Memory For SSIS

May 3, 2006

If we have a SQL Server 2005 Standard Edition / Windows 2003 Standard environment on a 32-bit server with 2 dual-core processors (a Dell 2850), what is the maximum amount of memory we could possibly use?



Thank you,

Shiva

View 4 Replies View Related

SSIS Using Up All My Memory

Nov 22, 2005

I have a WO-WO (no sorting, aggregating, etc.) SSIS package that reads ~26,000,000 rows from a SQL DataReader source, looks up a bunch of surrogate keys, derives a couple of values, and writes out to two SQL Server Destinations. The process gradually consumes all of the memory on the server, slowing it to a crawl. I reconfigured SQL Server to use a maximum of 8000 MB to mitigate the problem; however, the memory is not released unless I stop and restart SQL Server. Is this expected behavior for SSIS?
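For reference, the cap described above is the 'max server memory' setting; a minimal sketch of applying the 8000 MB limit with sp_configure:

-- Cap the buffer pool at 8000 MB so SSIS and the OS keep some headroom.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8000;
RECONFIGURE;

Note that this caps only the database engine's buffer pool; the SSIS pipeline allocates its memory outside that cap.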

View 4 Replies View Related

SQL 7.0 Massive Row Locking Performance

Mar 3, 2000

When updating large sets a row at a time, the performance is lacking in comparison to 6.5. PeopleSoft uses cursors with a BEGIN TRANSACTION, a loop inside, and a COMMIT after the loop completes; SQL 6.5 with page locking could handle a 300,000-row transaction in 3-4 hours, while 7.0 took 17.5 hours. The difference is that 6.5 used 50,000 locks and 7.0 used 300,000 locks.

Does anybody have a solution short of rewriting PeopleSoft?

View 2 Replies View Related

Massive TRN File, But Small DB

Apr 5, 2006

Hi Everyone,

We have a large and active MSSQL 2000 database. Recently, after a rebuild of the server, we had a problem with the SQL service SQLSERVERAGENT. The service could not start, as the service account had lost local permission to the registry. During this time, all of the data being sent to the database from our application accumulated in the database's .ldf file. By the time we were able to get the service restarted, our .ldf file was approx. 28GB. When the service restarted, the .ldf file shrank down to its regular size, about 40MB, and the .trn log files grew to 28GB for that specific period (new file every hour).

The problem is, the database file (database.mdf) stayed about the same size as it was before the service was restarted. When the .ldf transferred to the .trn, none of the 28GB of data got stored in the database. What does this mean? Perhaps with the service stopped, the application using the DB saw problems and did not commit the data, making it all useless? Or is it possible that the data in the .trn log just needs to be forced to commit to the .mdf?

Is there any way to verify the data in the 28GB .trn file and figure out whether we should get it stored to the database? If yes, how would we go about verifying it, and after that how would we force it to commit to the .mdf file? Am I on the right track here, or is it not as I see it?
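If those hourly .trn files are ordinary transaction log backups, they can be inspected and, if necessary, replayed onto a restored copy of the database. A sketch with hypothetical file paths:

-- See what a log backup contains (backup dates, LSN range, sizes):
RESTORE HEADERONLY FROM DISK = 'D:\Backups\database_0400.trn'

-- To replay: restore the last full backup WITH NORECOVERY, then apply each
-- log backup in sequence, using RECOVERY on the final one:
RESTORE LOG MyDatabase
FROM DISK = 'D:\Backups\database_0400.trn'
WITH NORECOVERY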

Thanks!
Mike

View 4 Replies View Related

General Question About Massive SP Use

Jul 2, 2006

Hi

Would you say that it's OK for a web site's code to make ALL of its database access through stored procedures and views? And I mean everything, including inserting new records and updating others, with no inline SQL in the code.

The advantage would be very strict control over access, but in order to achieve this it would take many, many SPs and views to cover all types of actions. Can you think of a disadvantage other than all the work of creating those SPs? What about server resources and performance; how demanding would it be?

Thanks,
Inon.

View 7 Replies View Related

Execute Massive SQL Statement

Jan 3, 2006

Hello,

I thought this was a neat solution I came up with, but I'm sure it's been thought of before. Anyway, it's my first post here.

We have a process for importing data which generates a SELECT statement based on a user's stored configuration. Since the resulting SELECT statement can be massive, it's created and stored in a text field in a temp table.

So how do I run this huge query after creating it? In my tests, I was getting a datalength > 20000, requiring 3 varchar(8000) variables in order to use the EXECUTE command. Thing is, I don't know how big it could possibly get; I wanted to be able to execute it regardless.

Here's what I came up with, it's very simple:

Table is named #IMPORTQUERY, one field SQLTEXT of type TEXT.


>>
declare @x int, @s varchar(8000)

select @x = datalength(sqltext) / 8000 + 1, @s = 'execute('''')' from #importquery

while @x > 0
select @s = 'declare @s' + cast(@x as varchar) + ' varchar(8000) ' +
'select @s' + cast(@x as varchar) +
'=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery ' +
replace(@s,'execute(','execute(@s' + cast(@x as varchar) + '+')
, @x = @x - 1

set @s = 'declare @x int set @x=1 ' + @s

execute(@s)
<<

At the end, I execute the "@s" variable which is SQL that builds and
executes the massive query. Here's what @s looks like at the end:

>>
declare @x int set @x=1
declare @s1 varchar(8000)
select @s1=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s2 varchar(8000)
select @s2=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s3 varchar(8000)
select @s3=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
execute(@s1+@s2+@s3+'')
<<
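For what it's worth, on SQL Server 2005 and later the 8000-character ceiling disappears, so the chunking isn't needed at all; a minimal sketch against the same #IMPORTQUERY table:

-- varchar(max) holds the whole statement, however large it gets:
declare @sql varchar(max)
select @sql = cast(sqltext as varchar(max)) from #importquery
execute(@sql)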

View 4 Replies View Related

Massive Amounts Of Reading

Jul 23, 2005

Our database server has started acting weird, and at this point I'm either too sleep deprived or too close to the problem to adequately diagnose the issue. Basically, to put it simply... when I look at the read disk queue length, the disk queues are astronomical.

Normally we're seeing a disk queue length of 0-1 on the disks that contain the DB data and index (i.e. non-clustered indexes are on a disk of their own). Writes are just fine.

Problem is, all our databases are on the same drive, and I can't seem to nail down which DB, let alone which table, is the source of all our reads.

Now, to really make things weirder... during the busier times of the day today (say 1:00 PM to 4:00 PM) things were fine. At 4:20 PM or so it was like someone hit a switch, and read disk queue length jumped from 0-1 up to 100-200+... with spikes up to 1500 for a split second or so.

What's the best way folks know to nail this down?

Thanks.
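One way to narrow down which database the reads are hitting is the per-file I/O statistics function. A sketch (the -1 arguments meaning "all databases/files" are the SQL 2005 form; on SQL 2000 you may need to pass explicit database and file IDs instead):

-- Cumulative reads per database file since startup; the top offenders show
-- which database (and drive) is generating the read load.
SELECT DB_NAME(DbId) AS database_name, FileId, NumberReads, BytesRead
FROM ::fn_virtualfilestats(-1, -1)
ORDER BY NumberReads DESC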

View 9 Replies View Related

Massive Slowdown With Query

Dec 29, 2007

If I remove the TOP 200, this query returns about 2.5 million rows. It combines a lot of records and turns them into much more programmer-friendly results. The query has slowed down from 2 seconds to about 13 seconds as the data has grown from about 10k rows to the current couple of million.



Code Block

SELECT TOP 200 *
FROM
(
SELECT
[UserProfile].[UserId]
,[aspnet_Users].[UserName]
,[City]
,[State]
,[RoleName]
,[ProfileItemType].[Name] AS pt_name
,[ProfileItem].[Value]
FROM
[UserCriteria]
,[aspnet_Users]
,[aspnet_Roles]
,[aspnet_UsersInRoles]
,[Location]
,[ProfileType]
,[ProfileTypeItem]
,[ProfileItem]
INNER JOIN [UserProfile]
ON [ProfileItem].[ProfileId] = [UserProfile].[ProfileId]
INNER JOIN [ProfileItemType]
ON [ProfileItem].[ProfileItemTypeId] = [ProfileItemType].[ProfileItemTypeId]
WHERE [UserProfile].[UserId] IN (
SELECT [UserCriteria].[UserId]
FROM [UserCriteria]
WHERE
Zipcode IN (
SELECT [Zipcode]
FROM [ZipcodeProximitySQR] ('89108' , 150))
)

AND [UserProfile].[UserId] = [aspnet_Users].[UserId]
AND [UserCriteria].[UserId] = [UserProfile].[UserId]
AND [Location].[Zipcode] = [UserCriteria].[Zipcode]
AND [aspnet_UsersInRoles].[UserId] = [aspnet_Users].[UserId]
AND [aspnet_UsersInRoles].[RoleId] = [aspnet_Roles].[RoleId]
) AS t
PIVOT
(
MIN([Value])
FOR pt_name IN ([field1],[field2],[field3],[field4])
) AS pvt
ORDER BY RoleName DESC, NEWID()





In the line "FOR pt_name IN ([field1],[field2],[field3],[field4])" I changed the values from the long names to read field1, field2, ... because the real names were irrelevant but confusing.

Here is the showplan text



Code Block
|--Sequence
|--Table-valued function(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
|--Top(TOP EXPRESSION:((200)))
|--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId], [aous].[dbo].[aspnet_Users].[UserName], [aous].[dbo].[Location].[City], [aous].[dbo].[Location].[State], [aous].[dbo].[UserCriteria].[Birthdate], [aous].[dbo].[aspnet_Roles].[RoleName]) DEFINE:([Expr1039]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'height' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1040]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'bodyType' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1041]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'hairColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1042]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'eyeColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END)))
|--Nested Loops(Inner Join)
|--Nested Loops(Inner Join)
| |--Sort(ORDER BY:([aous].[dbo].[UserCriteria].[UserId] ASC, [aous].[dbo].[Location].[City] ASC, [aous].[dbo].[Location].[State] ASC, [aous].[dbo].[UserCriteria].[Birthdate] ASC, [aous].[dbo].[aspnet_Roles].[RoleName] ASC))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserCriteria].[Zipcode])=([Expr1043]), RESIDUAL:([Expr1043]=[aous].[dbo].[UserCriteria].[Zipcode]))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[ProfileItemType].[ProfileItemTypeId])=([aous].[dbo].[ProfileItem].[ProfileItemTypeId]))
| | | |--Index Scan(OBJECT:([aous].[dbo].[ProfileItemType].[ProfileTypes]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[ProfileId]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserProfile].[UserId])=([aous].[dbo].[aspnet_UsersInRoles].[UserId]), RESIDUAL:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[aspnet_UsersInRoles].[UserId]))
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | |--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | | |--Nested Loops(Left Semi Join, WHERE:([aous].[dbo].[UserCriteria].[Zipcode]=[Expr1044]))
| | | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId] < {guid'E3D72D56-731A-410E-BCB1-07A87A312137'} OR [aous].[dbo].[UserCriteria].[UserId] > {guid'E3D72D56-731A-410E-BCB1-07A87A312137'}), WHERE:([aous].[dbo].[UserCriteria].[Male]=(1) AND [aous].[dbo].[UserCriteria].[SeekingMale]=(0)) ORDERED FORWARD)
| | | | | | | | |--Compute Scalar(DEFINE:([Expr1044]=CONVERT_IMPLICIT(nvarchar(5),[aous].[dbo].[ZipcodeProximitySQR].[Zipcode],0)))
| | | | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
| | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserProfile].[UserProfileIds]), SEEK:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[UserCriteria].[UserId]) ORDERED FORWARD)
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[aspnet_Roles].[RoleId]))
| | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[aspnet_Roles].[aspnet_Roles_index1]))
| | | | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_UsersInRoles].[aspnet_UsersInRoles_index]), SEEK:([aous].[dbo].[aspnet_UsersInRoles].[RoleId]=[aous].[dbo].[aspnet_Roles].[RoleId]) ORDERED FORWARD)
| | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_Users].[_dta_index_aspnet_Users_5_37575172__K2_K1_K4_3]), SEEK:([aous].[dbo].[aspnet_Users].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | |--Index Seek(OBJECT:([aous].[dbo].[ProfileItem].[_dta_index_ProfileItem_5_1714105147__K2_K1_K3_4]), SEEK:([aous].[dbo].[ProfileItem].[ProfileId]=[aous].[dbo].[UserProfile].[ProfileId]) ORDERED FORWARD)
| | |--Compute Scalar(DEFINE:([Expr1043]=CONVERT_IMPLICIT(nchar(5),[aous].[dbo].[Location].[Zipcode],0)))
| | |--Index Scan(OBJECT:([aous].[dbo].[Location].[CityLocation]))
| |--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileType].[PKProfileTypeProfileTypeId]))
|--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileTypeItem].[ProfileTypeItem]))




Here is a link to the execution plan from Microsoft SQL Server Management Studio:
http://epi.cc/BasicUserSearch.zip

There are no table scans, but the Hash Match from the inner join is pretty bad.

Can anyone give me a pointer or two?

View 1 Replies View Related

Out Of Memory. Error In SSIS

Jun 27, 2007

Hi!

I am currently encountering an error when testing an SSIS package on the server. The package runs fine on my laptop, but not on the server. I'd appreciate any of your input, comments, and suggestions.

The package populates records (150k rows) from a DB2 table and inserts them into another DB2 table (12,754,715 rows).

The server:

Windows Server 2003 Enterprise
SQL Server 2005 SP1
8 CPUs
6 GB RAM
Native OLE DB\IBM OLE DB Provider for DB2

My laptop:

Windows 2000 Professional
SQL Server 2005 SP1
2 CPUs
2 GB RAM
Native OLE DB\IBM OLE DB Provider for DB2

It fails on the insertion part (see the error message at the bottom). I have been playing with the "DefaultBufferMaxRows" and "DefaultBufferSize" properties, but still no luck so far.

For testing purposes, I even selected only 2 rows from the source table, but it still fails with the same error message. And strangely, it still takes a very long time to process (for just two rows); on my laptop, it only takes a few seconds to finish.

I have been really pulling my hair out trying to figure this out. Any of your help/input is highly appreciated! Thanks!



======================

Error Message

[POLICY [1683]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".

View 2 Replies View Related

Deleting Massive Data From A Table

Jan 20, 2014

I have to delete a ton of data from a SQL table. I have a unique identifier called the version. I would like to say "if not in these versions, then delete". I tried using the statement below, but learned the hard way that it created an error; this is the message I got:

Msg 9002, Level 17, State 4, Line 3...

The transaction log for database 'MonthEnds' is full due to 'ACTIVE_TRANSACTION'.

I was reading about TRUNCATE, but I am not sure how I would do this or how I would set up the statement.

DELETE Products
WHERE version NOT IN ('48459CED-871F-4971-B888-5083990332BC','D550C8D3-58C7-4C74-841D-1C1675F19AE3','C77C7817-3F04-4145-98D3-37BB1610DB35',
'21FE83FA-476D-4604-80EF-2ED57DEE2C16','F3B50B81-191A-4D71-A406-011127AEFBE1','EFBD48E7-E30F-4047-909E-F14DCAEA4181','BD9CCC41-D696-406B-
'C8BEBFBC-D362-4D0F-A555-B281FC2B3023','EFA64956-C2CF-41FC-8E21-F060597DAFCB','77A8DE56-6F7F-4490-8BED-AA6809B947EF','0F4C1E5F-B689-4DCB-

[code]....
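For the purge itself, deleting in batches lets the log clear between transactions (given the SIMPLE recovery model or regular log backups), which avoids the Msg 9002 above. A sketch reusing the table from the post, with the keep-list elided and the column name assumed:

WHILE 1 = 1
BEGIN
    -- Each batch is its own small transaction, so the log can truncate
    -- between batches instead of growing for the whole purge.
    DELETE TOP (10000) FROM Products
    WHERE version NOT IN ('48459CED-871F-4971-B888-5083990332BC' /* , ... */)
    IF @@ROWCOUNT = 0 BREAK
END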

View 2 Replies View Related

SQL Server Memory Consumption With SSIS?

Aug 7, 2006

I've been working on several packages for the past hour, and after finishing for the night I quickly wanted to check how much memory SQL Server was consuming on my laptop. It was using almost 500MB of memory. It typically hovers around 50-100MB when I'm not doing anything with it. Is this normal?

View 3 Replies View Related

SSIS Performance And Memory Usage

Sep 24, 2007

Hi -

I am facing 2 problems :
PROBLEM 1 :
We have a few packages that run pretty fast on a desktop server with 2GB RAM and a dual processor (approx 4-5 hours). But the same packages run very, very slowly on another server containing 8 CPUs and 12GB RAM (they ran for 24 hours without completing).

PROBLEM 2 :
On the desktop server, the CPU% ranges from 40-80% and PF usage is stagnant at 2GB for the same package. But on the 8-CPU server, the CPU% ranges from 0-10% while PF usage rises from 750MB to 8GB.

This has become critical to our application.

TIA,

Shabs

View 4 Replies View Related

SSIS Package Out Of Memory Exception

Aug 23, 2007

I have an SSIS Package that loads data from a log file. Prior to loading the data I need to prepare the file. I run a script that cleans the file. Then I import the flat file into SQL Server.

Log File Management Task
1. Run Unix Log File Task
2. Import the new log file (flat file) into SQL Server

Error
i.Unix.dtsx
Message: The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown.

Is this because the system is running out of memory? The RAM on the server is 4GB. Below is a sample of the script. The job doesn't always fail; there are times when it executes successfully and other times when it fails.

Script Source Code
-----------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Collections.Generic
Imports System.IO
Imports System.Text
Imports System.Diagnostics
Imports System.Globalization
Imports Microsoft.VisualBasic
Imports System.Text.RegularExpressions
Public Class ScriptMain
'********** Begin Error Log Settings **********
'Dim sSource As String = "i.SSIS.Unix.FileManager"
'Dim sLog As String = "Application"
'Dim sMachine As String = "."
'Dim ELog As New EventLog(sLog, sMachine, sSource)
'********** End Error Log Settings **********

Public Sub Main()
'variables for the unix log file
Dim newFile As String = "D:\iLog\unixlog.txt"
Dim copyFile As String = "\\server16\iLog\unixlog.txt"
'variables for working log files
Dim oldFile As String = "D:\i\temp\unixlog.txt"
Dim difFile As String = "D:\i\temp\unixdiff.txt"
Dim trimdiff As String = "D:\i\temp\unixdifft.txt"
Dim formatTemp As String = "D:\i\temp\unixlog_formatted.txt"
Dim errorFile As String = "D:\i\temp\unixlog_bad.txt"

'delete unixlog.txt copy unixlog.txt
'if the file is on the local server delete it and copy the new file over
'if the file is not present copy the new file over
Try
If File.Exists(newFile) Then
File.Delete(newFile)
File.Copy(copyFile, newFile)
Else
File.Copy(copyFile, newFile)
End If
While Not File.Exists(newFile)
System.Threading.Thread.Sleep(1000)
End While
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'open the old file; read backwards until we reach the carriage
'return and store that "seek" position; now open the new file and
'seek to that stored position. finally, read the rest of the file
'and write that data to the difference file.
' determine position of last line in the old file
Dim lastLine As Long = GetLastLinePosition(oldFile)
' get all data in new file starting at position determined above
Dim fi As New FileInfo(newFile)
Dim buffer(fi.Length - lastLine) As Byte
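' NOTE: this sizes one array to hold *everything* in the new file after the
' last processed position; if the log grows by hundreds of MB between runs,
' this single allocation is a plausible source of the OutOfMemoryException.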
Dim fs As New FileStream(newFile, FileMode.Open)
Try
fs.Seek(lastLine, SeekOrigin.Begin)
fs.Read(buffer, 0, buffer.Length)
fs.Close()
' write that new data to the difference file
fs = New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None)
fs.Write(buffer, 0, buffer.Length)
fs.Close()
'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try

'remove the partial row from the difference file
Try
TrimFinal(difFile, trimdiff)
'ELog.WriteEntry("TrimFinal.Call.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Call.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
'perform the file formatting
sFormatFile(trimdiff, formatTemp, errorFile)
'
Dts.TaskResult = Dts.Results.Success
End Sub

Function GetLastLinePosition(ByVal fileName As String) As Long
Dim pos As Long = -1
Dim fs As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
Try
fs.Seek(-2, SeekOrigin.End) ' -2 to skip a potential vbcrlf at the end of file
While fs.Position > 0
fs.Seek(-1, SeekOrigin.Current)
If fs.ReadByte = 10 Then
pos = fs.Position
Exit While
Else
fs.Seek(-1, SeekOrigin.Current)
End If
End While
fs.Close()
'ELog.WriteEntry("GetLastLinePosition.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("GetLastLinePosition.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
Return pos
End Function

Sub TrimFinal(ByVal difFile As String, ByVal trimdiff As String)
Dim fi2 As New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Read)
Dim fo2 As New FileStream(trimdiff, FileMode.OpenOrCreate, FileAccess.Write)
Dim sr2 As New StreamReader(fi2)
Dim sw2 As New StreamWriter(fo2)
Dim line2 As String
Try
Do While sr2.Peek <> -1
line2 = sr2.ReadLine()
If (sr2.Peek <> -1) Then
sw2.WriteLine(line2)
End If
Loop
sw2.Flush() : sw2.Close()
sr2.Close()
fi2.Close() : fo2.Close()
'ELog.WriteEntry("TrimFinal.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("TrimFinal.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
End Try
End Sub

Sub sFormatFile(ByVal currentFile As String, ByVal tempFile As String, ByVal errorFile As String)
Dim tfp As New Microsoft.VisualBasic.FileIO.TextFieldParser(currentFile)
Dim sw As New System.IO.StreamWriter(tempFile)
Dim swErrorFile As New System.IO.StreamWriter(errorFile)
tfp.TextFieldType = FileIO.FieldType.Delimited
tfp.SetDelimiters(",")
tfp.HasFieldsEnclosedInQuotes = True
tfp.TrimWhiteSpace = True
Dim fields() As String
Try
While Not tfp.EndOfData
Try
fields = tfp.ReadFields()
If fields.Length <> 23 Then
'write bad rows to error-file
swErrorFile.WriteLine(String.Join(",", fields))
Else
If fields(3) = "" And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") And fields(13) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") And fields(3) = "" Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") _
And IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") Then
sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
Else
swErrorFile.WriteLine(String.Join(",", fields))
End If
End If
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.TFP.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
End Try
End While
tfp.Close()
sw.Close()
swErrorFile.Close()
File.Delete(currentFile)
File.Move(tempFile, currentFile)
'ELog.WriteEntry("sFormatFile.Success".ToString(), EventLogEntryType.SuccessAudit, 0, CType(4, Short))
Catch ex As Exception
'ELog.WriteEntry("sFormatFile.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
Finally
GC.Collect()
End Try
End Sub
End Class
-------------------------

Does my script seem okay for releasing the server memory usage?

Thanks.

View 1 Replies View Related

Memory Goes To One GB And More When My Ssis Package Strated

Feb 28, 2008

Dear Friends,

I have an SSIS package, and when I run it SQL Server memory usage shoots up a great deal. How do I check and solve this problem? When I run the same query as a stored procedure, it does not take that much memory. Please help me figure out how to control this SQL Server memory spike.

View 3 Replies View Related

Massive DTS Delete/import: Logging Problems

Feb 14, 2005

Hey all,

We are using SQL Server 7 on Win 2K, and there are some DTS packages set up which empty some large tables (DELETE FROM) and then import some data files. The imported files are about 13GB, and during the process the log file gets to about 10GB and then runs out of disk space.

Is there a trick to empty a table without logging it (a la LOAD REPLACE FROM NULL in DB2)? How can I keep the log file size down during this operation? (See the sketch below.)

I think the DB is set to autocommit; the trunc. log on chkpt. option is on, as is select into/bulk copy (although I'm reasonably sure we aren't availing ourselves of bulk copy for the import).
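If the tables can be emptied wholesale, TRUNCATE TABLE is the usual trick: it logs only the page deallocations rather than every row, so the log barely grows (it can't be used on tables referenced by foreign keys). A minimal sketch with a hypothetical table name:

-- Deallocates all pages with minimal logging, unlike a row-by-row DELETE.
TRUNCATE TABLE dbo.LargeStagingTable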

Help? :)

View 2 Replies View Related

Massive Data Import, How To Avoid Dublicates?

Dec 4, 2006

Hello,

I am currently working on a project where I have to import a huge amount of data from CSV files into a database. I don't want to have duplicate keys in my table, but my CSV files contain them, and a line nearer the end of the file contains the more up-to-date information that I have to store.

I have been trying to fix this problem for several weeks, but my algorithm is very slow and blocks all other processes on the server. At the moment I am copying all records that occur more than once in the CSV file into a temp table. After that I run through this table line by line and check whether the key already exists in the target table, and then do either an insert or an update.

Does somebody know a better process?

I hope somebody can help me... :(
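A set-based alternative, sketched under the assumption that the CSV is bulk-loaded into a staging table with an IDENTITY column that preserves file order (all names hypothetical):

-- Keep only the last occurrence of each key (highest load_id) in one pass,
-- instead of probing the target row by row (ROW_NUMBER is SQL 2005+).
WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY business_key
                              ORDER BY load_id DESC) AS rn
    FROM dbo.Staging
)
SELECT * INTO dbo.Deduped
FROM ranked
WHERE rn = 1;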

View 5 Replies View Related

Delete Records From A Massive Table (heap)

May 21, 2013

So I've stumbled across an audit table on one of our systems that has reached a hearty 180M rows in size.

The table is a heap (no indexes whatsoever).

Each record has a datetime value indicating when it was created.

I need to delete everything that was created prior to the last 6 months; what is my best plan of attack?
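When most of a heap is being discarded, one common plan is to copy the keepers out and swap tables, which involves far less logging than deleting the bulk of 180M rows in place. A sketch with hypothetical table and column names:

-- Copy the six months being kept, then swap the new table in.
SELECT *
INTO dbo.AuditLog_keep
FROM dbo.AuditLog
WHERE created_date >= DATEADD(month, -6, GETDATE());

BEGIN TRAN;
DROP TABLE dbo.AuditLog;
EXEC sp_rename 'dbo.AuditLog_keep', 'AuditLog';
COMMIT;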

View 12 Replies View Related

Massive UPDATE And SELECT TOP 1 QUERIES, Slowing Down...

Apr 10, 2007

Background

SQL Server 2005 Standard 9.0.1399 64bit

Windows 2003 64-bit

8gb RAM

RAID-1 70gb HD 15K SCSI (Log Files, OS)

RAID-10 1.08TB HD 10K SCSI (Data Files)

Runs approximately _Total 800 Transactions/Second

We deliver approximately 70-80 million ad views / day

8 clustered Windows 2003 32-bit OS IIS servers running ASP.NET 2.0 websites, all 8 talking to the one SQL server via a private network (server backbone).

In SQL Server Profiler, I see the following SQL statements with durations of 2000-7000:

select top 1 keywordID, keyword, hits, photo, feed from dbo.XXXX where hits > 0 order by hits

and

UPDATE XXXX SET hits=1906342 WHERE keywordID = 7;

where the hits number is incremented by one each time the row is selected for that keyword ID.

Sometimes these happen so frequently that the server stops accepting new connections, and I have to restart the SQL server or reboot.

Any ideas on why this is happening?
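For what it's worth, on SQL 2005 the read-then-update pair can be collapsed into one atomic statement, which removes a round trip and the race between the SELECT and the UPDATE; a sketch using the table from the post:

-- Increment and return the row in one statement (OUTPUT is SQL 2005+).
UPDATE dbo.XXXX
SET hits = hits + 1
OUTPUT inserted.keywordID, inserted.keyword, inserted.hits
WHERE keywordID = 7;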



Regards,

Joe

View 6 Replies View Related

Low Virtual Memory When Running SSIS Package As SQL Job

Oct 18, 2007

I see the following error when I execute an SSIS package as part of a job from within SQL Server:


OnInformation,006-CIS-SQL,apdsvcPM2SQL,VistaMain,{F902B487-D543-4F31-AC80-EF088CD0CBA4},{74325B35-DC59-4B51-AE8E-756BCC879633},10/18/2007 6:15:12 AM,10/18/2007 6:15:12 AM,1074036748,0x,The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

SQL Server has 6 GB memory allocated to it. How can I best troubleshoot this issue?

View 6 Replies View Related

Out Of Memory Error (bUFFER SWAPPING )- SSIS

Feb 27, 2007

Hi,

Need some quick-fix help.

I have been trying to load data from AS400 to DB2 (Windows) using an ADO.NET connection in a DataReader source and an OLE DB destination (IBM OLE DB provider).

The files I'm trying to load have more than 15 million rows.

On execution of the package I get an Out of Memory error (see below).

My destination box is a 4GB+ RAM, 4-CPU box.

There seems to be some buffer- and swapping-related issue which I'm not able to figure out. It says that the system is unable to allocate memory.

Please help me on the same.

Thanks in advance,

Amit S

SSIS package "ABCDE 1.dtsx" starting.

Information: 0x4004300A at ABCDE 2003 to 2004, DTS.Pipeline: Validation phase is beginning.

Information: 0x4004300A at ABCDE 2003 to 2004, DTS.Pipeline: Validation phase is beginning.

Information: 0x40043006 at ABCDE 2003 to 2004, DTS.Pipeline: Prepare for Execute phase is beginning.

Information: 0x40043007 at ABCDE 2003 to 2004, DTS.Pipeline: Pre-Execute phase is beginning.

Information: 0x4004300C at ABCDE 2003 to 2004, DTS.Pipeline: Execute phase is beginning.

Error: 0xC0202009 at ABCDE 2003 to 2004, OLE DB Destination [12]: An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".

Error: 0xC0047022 at ABCDE 2003 to 2004, DTS.Pipeline: The ProcessInput method on component "OLE DB Destination" (12) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

Error: 0xC0047021 at ABCDE 2003 to 2004, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0202009.

Error: 0xC02090F5 at ABCDE 2003 to 2004, DataReader Source [61]: The component "DataReader Source" (61) was unable to process the data.

Error: 0xC0047038 at ABCDE 2003 to 2004, DTS.Pipeline: The PrimeOutput method on component "DataReader Source" (61) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

Error: 0xC0047021 at ABCDE 2003 to 2004, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.

Information: 0x40043008 at ABCDE 2003 to 2004, DTS.Pipeline: Post Execute phase is beginning.

Information: 0x40043009 at ABCDE 2003 to 2004, DTS.Pipeline: Cleanup phase is beginning.

Information: 0x4004300B at ABCDE 2003 to 2004, DTS.Pipeline: "component "OLE DB Destination" (12)" wrote 289188 rows.

Task failed: ABCDE 2003 to 2004

Warning: 0x80019002 at ABCDE 1: The Execution method succeeded, but the number of errors raised (6) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.

Executing ExecutePackageTask: C:\Documents and Settings\Administrator\My Documents\Visual Studio 2005\Projects\Integration Services Project1\Integration Services Project1\ABCDE 2.dtsx

Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.

Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.

Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.

Information: 0x40043006 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Prepare for Execute phase is beginning.

Information: 0x40043007 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Pre-Execute phase is beginning.

Information: 0x4004300C at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Execute phase is beginning.

Information: 0x4004800D at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The buffer manager failed a memory allocation call for 10484320 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

Error: 0xC0047012 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: A buffer failed while allocating 10484320 bytes.

Error: 0xC0047011 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The system reports 63 percent memory load. There are 4294660096 bytes of physical memory with 1548783616 bytes free. There are 2147352576 bytes of virtual memory with 227577856 bytes free. The paging file has 6268805120 bytes with 3607072768 bytes free.

Error: 0xC02090F5 at ABCDE 2005_04 to 2005_11, DataReader Source [61]: The component "DataReader Source" (61) was unable to process the data.

Error: 0xC0047038 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The PrimeOutput method on component "DataReader Source" (61) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

Error: 0xC0047021 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.

Error: 0xC0047039 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.

Error: 0xC0047021 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.

Information: 0x40043008 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Post Execute phase is beginning.

Information: 0x40043009 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Cleanup phase is beginning.

Information: 0x4004300B at ABCDE 2005_04 to 2005_11, DTS.Pipeline: "component "OLE DB Destination" (12)" wrote 0 rows.

Task failed: ABCDE 2005_04 to 2005_11

Warning: 0x80019002 at ABCDE: The Execution method succeeded, but the number of errors raised (7) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.

Executing ExecutePackageTask: C:\Documents and Settings\Administrator\My Documents\Visual Studio 2005\Projects\Integration Services Project1\Integration Services Project1\ABCDE 3.dtsx

Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.

Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.

Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.

Information: 0x40043006 at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Prepare for Execute phase is beginning.

Information: 0x40043007 at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Pre-Execute phase is beginning.

…….

………

View 11 Replies View Related

SSIS - On Execute Package Out Of Memory Error

Feb 20, 2007

Hi,

When I am trying to execute a package in SSIS, the errors given below come up many times. How can I fix this? Can anybody help?

In SSIS the default buffer size is 10 MB.

The source is iSeries DB2 on AS400 on the production server, and the destination is DB2 UDB on Windows on the dev server.

The userspace page size in DB2 is 16-32k.

The server has 4 GB RAM and runs Windows Server 2003 Standard Edition.

The errors are:

Information: 0x4004800D at CHDRPF 312-315, DTS.Pipeline: The buffer manager failed a memory allocation call for 15728400 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.
Error: 0xC0047012 at CHDRPF 312-315, DTS.Pipeline: A buffer failed while allocating 15728400 bytes.
Error: 0xC0047011 at CHDRPF 312-315, DTS.Pipeline: The system reports 83 percent memory load. There are 3488509952 bytes of physical memory with 558743552 bytes free. There are 2147352576 bytes of virtual memory with 222920704 bytes free. The paging file has 7416537088 bytes with 3703283712 bytes free.
Error: 0xC0047056 at CHDRPF 312-315, DTS.Pipeline: The Data Flow task failed to create a buffer to call PrimeOutput for output "DataReader Source" (15437) on component "DataReader Output" (15442). This error usually occurs due to an out-of-memory condition.
Error: 0xC0047021 at CHDRPF 312-315, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0x8007000E.
Error: 0xC0047039 at CHDRPF 312-315, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at CHDRPF 312-315, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.



So what do I need to do to fix this problem?

View 10 Replies View Related

SSIS Transformations When Data Exceeds Available Memory

May 25, 2006

I've read that SSIS tries to do all transformations in memory as a way of enhancing processing speed. What happens though if the amount of data processed exceeds the available RAM? Are raw files then used (similar to staging tables) or is an error generated?

Barkingdog



View 1 Replies View Related

ASAP Help Needed Need Sql Guru To Help With Massive Script Issue

May 3, 2007

I need some help. I have this massive SQL script, and the problem is that when I try to put it into the query string box in my SQL reports, it will not take it. The script will run if I break it up, but I think it is too large. Is there a SQL guru out there who can show me how to reduce the size of this script, maybe by using an output parameter to a stored procedure? I just don't know what to do, and I need to produce the report from this script. Below is the entire script.
SELECT  'Prior Year All ' as 'qtr', COUNT(JOB.JOBID) AS 'transcount',  COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',  SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',  SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',  AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',  SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',  SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',  JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                         PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                         JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE AS Expr1,                         INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
            (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS  'CompletedSuccessfullyItems',
             (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',                                                  (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with Situation') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithSituationItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Not Completed') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'NotCompletedItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled Prior to service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledPriorToServiceItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled During Service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledDuringServiceItems',
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Awaiting for completion') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'AwaitingforcompletionItems',
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Pending for review') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'PendingforreviewItems'
FROM JOB
     INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
     LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
     LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
     LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
     LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
     LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
     LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
WHERE (INVOICE_AR.AMOUNT_DUE > 0) AND
      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
      (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
      (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
      (INVOICE_AR.INVOICE_NO like '%T')
GROUP BY JOB.JURISDICTION,
         PAYER.PAY_GROUPNAME,
         PAYER.PAY_COMPANY,
         PAYER.PAY_CITY,
         PAYER.PAY_STATE,
         PAYER.PAY_SALES_STAFF_ID,
         JOB.PATIENTID,
         JOB.INVOICE_DATE,
         JOB.JOBOUTCOMEID,
         JOB.SERVICEOUTCOME,
         INVOICE_AR.INVOICE_NO,
         INVOICE_AR.INVOICE_DATE,
         INVOICE_AR.AMOUNT_DUE,
         INVOICE_AR.CLAIMNUMBER,
         PATIENT.LASTNAME,
         PATIENT.FIRSTNAME,
         PATIENT.EMPLOYERNAME,
         JOB_OUTCOME.DESCRIPTION,
         SERVICE_TYPE.DESCRIPTION,
         PAT_SERVICES_HISTORY.TRANSPORT_TYPE
UNION ALL
SELECT 'Current Year 2007 All ' as 'qtr',
       COUNT(JOB.JOBID) AS 'transcount',
       COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',
       SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',
       SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',
       AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',
       SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',
       SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',
       SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',
       SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',
       JOB.JURISDICTION,
       PAYER.PAY_GROUPNAME,
       PAYER.PAY_COMPANY,
       PAYER.PAY_CITY,
       PAYER.PAY_STATE,
       PAYER.PAY_SALES_STAFF_ID,
       JOB.PATIENTID,
       JOB.INVOICE_DATE,
       JOB.JOBOUTCOMEID,
       JOB.SERVICEOUTCOME,
       INVOICE_AR.INVOICE_NO,
       INVOICE_AR.INVOICE_DATE AS Expr1,
       INVOICE_AR.AMOUNT_DUE,
       INVOICE_AR.CLAIMNUMBER,
       PATIENT.LASTNAME,
       PATIENT.FIRSTNAME,
       PATIENT.EMPLOYERNAME,
       JOB_OUTCOME.DESCRIPTION,
       SERVICE_TYPE.DESCRIPTION,
       PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedSuccessfullyItems',
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
(SELECT COUNT(JOB.JOBOUTCOMEID)
 FROM JOB
      INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
      LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
      LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS
      LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID
      LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
      LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME
      LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
 WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND
       (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
       (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND
       (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
       (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
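As an aside, those scalar subqueries all share the same joins and filters and differ only in the JOB_OUTCOME.DESCRIPTION literal, so they could likely be computed in a single pass with conditional aggregation rather than one pass per column. A minimal sketch against the same tables (only the joins the filters actually use are shown; any omitted join that multiplies rows would need to be kept):

-- Sketch only: one scan produces all the outcome counts that the repeated
-- (SELECT COUNT(...)) subqueries above each compute with their own scan.
SELECT SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Not Completed' THEN 1 ELSE 0 END) AS NotCompletedItems,
       SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Cancelled Prior to service' THEN 1 ELSE 0 END) AS CancelledPriorToServiceItems,
       SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Cancelled During Service' THEN 1 ELSE 0 END) AS CancelledDuringServiceItems,
       SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Pending for review' THEN 1 ELSE 0 END) AS PendingforreviewItems
FROM JOB
     INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
     LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
     LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
WHERE (INVOICE_AR.AMOUNT_DUE > 0) AND
      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year,0,@startate) and DATEADD(year,0,@endate)) AND
      (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND
      (INVOICE_AR.INVOICE_NO like '%T')

The grouped outer query could then cross join to this one-row result, or read the totals from variables, instead of re-running the nearly identical subqueries for every output column.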

View 8 Replies View Related

Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large table in an MS SQL Server 2000 database; the table has 15 indexes and roughly 180 million rows representing 240 GB worth of data. Due to its massive size we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defragment the indexes (which are 30-50% fragmented). To complicate matters, the table is also published in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed a 3-5 hour outage window per week.

So far I've tested several delete methods following all the best practices (batched deletes, using indexes in the delete's WHERE clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete each 500-row batch, on an 8 GB RAM, 4-processor machine that is not currently in use or replicated.
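For context, the batched-delete pattern being described looks roughly like this on SQL Server 2000; the table name, date column, and cutoff value below are placeholders, not the real schema:

-- Sketch of a batched purge: SET ROWCOUNT caps each DELETE at 500 rows
-- (SQL Server 2000 idiom; on 2005+ use DELETE TOP (500) instead).
SET ROWCOUNT 500
DECLARE @deleted INT
SET @deleted = 1
WHILE @deleted > 0
BEGIN
    BEGIN TRANSACTION
    -- the WHERE clause should be covered by an index so each batch seeks rather than scans
    DELETE FROM BigTable WHERE CreatedDate < '20070101'
    SET @deleted = @@ROWCOUNT
    COMMIT TRANSACTION
END
SET ROWCOUNT 0  -- reset, or every later statement in the session stays capped at 500 rows

Committing per batch keeps each transaction, and the command stream pushed to the subscriber, small.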

I'm at a loss for a way to pare down the data with deletes, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is to truncate the table and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to keep into it, then rename the table.
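That copy-and-rename option would look something like the following sketch (all names are placeholders, and replication on the table would have to be dropped and re-created around the swap):

-- Copy the ~40 million rows to keep into a new table, then swap names.
SELECT *
INTO BigTable_New
FROM BigTable
WHERE CreatedDate >= '20070101'
-- re-create the 15 indexes on BigTable_New here, then:
EXEC sp_rename 'BigTable', 'BigTable_Old'
EXEC sp_rename 'BigTable_New', 'BigTable'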

Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.

View 6 Replies View Related

SQL Server 2008 :: SSIS Warning - Global Shared Memory

Apr 8, 2009

I'm busy rewriting DTS packages as SSIS packages. As and when I finish a package, I run it in debug mode via Microsoft Visual Studio and then examine the Execution Results to see the messages generated.

Now, it may or may not matter how I run the package, but the following warning has been generated:

[SSIS.Pipeline] Warning: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available. To resolve, run this package as an administrator, or on the system's console.
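For what it's worth, this particular warning only disables the data flow performance counters; the package itself still executes normally. Following the message's own advice, running the package from an elevated command prompt, e.g. dtexec /File "C:\Packages\MyPackage.dtsx" (the path here is hypothetical), or launching Visual Studio via Run as administrator, should make the warning go away.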

View 9 Replies View Related






