Massive Inserts

Oct 15, 2004

Currently I have huge amounts of data going into a table.
I'm sending an XML doc and using OPENXML with a cursor to seed it.

The question I have is whether to let duplicate-keyed rows bounce off the primary key, check @@ERROR, and then do an UPDATE on the non-key fields,
or
to do a SELECT on the key first and then do an INSERT or UPDATE based on the SELECT's result.

Speed is my goal.
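One common pattern, sketched here with made-up table, column and variable names, is to try the UPDATE first and fall back to an INSERT when no row matched. That avoids both the error trapping and the extra SELECT:

-- Hypothetical table: dbo.Target(KeyCol INT PRIMARY KEY, DataCol VARCHAR(100))
-- Upsert sketch: UPDATE first, INSERT only if the key wasn't already there.
UPDATE dbo.Target
SET    DataCol = @DataCol
WHERE  KeyCol = @KeyCol

IF @@ROWCOUNT = 0
    INSERT INTO dbo.Target (KeyCol, DataCol)
    VALUES (@KeyCol, @DataCol)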

View 3 Replies



Massive .bak File

Feb 6, 2007

Rather than posting twice, I thought I would put both issues I'm having in one. Our server is Windows Server 2003 and we're running SQL Server 2005.

The first issue is this: We have several databases and I have scheduled their backups to run nightly, which works just fine. A couple of weeks ago, one of the databases' .bak files grew from about 500 MB to 2 GB overnight. Then, just a few days ago, it went from 2 GB to 3.5 GB. There is nothing unusual going on in the live db that would warrant such an increase in the .bak file. All the dbs are in the same backup job schedule but this is the only one affected. Additionally, I had autogrowth enabled on all the dbs but today disabled it for this particular db. Any ideas?

The second issue is my tempdb.mdf file on my C drive. It will go from just a few hundred KB's to 4.5GB overnight consuming most of what is left on my C drive. I'm afraid I'm in for a system crash if it continues. I have to stop SQL Server and restart it to clear the size. Is there a way to move the location of the tempdb.mdf file to my F drive?
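Moving tempdb is done with ALTER DATABASE ... MODIFY FILE; the new location takes effect the next time the SQL Server service starts. A sketch (the F:\ paths are only examples; the default logical names are usually tempdev and templog, which sp_helpfile will confirm):

-- Find tempdb's logical file names first.
USE tempdb
EXEC sp_helpfile

-- Point the data and log files at the new drive (paths are illustrative).
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'F:\SQLData\tempdb.mdf')

ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'F:\SQLData\templog.ldf')

-- Restart the SQL Server service; tempdb is recreated at the new location.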

I don't know if these two issues are related or not but certainly would like to hear from someone.

Sorry, in advance, for the large post.

Dave

View 20 Replies View Related

Massive Delete In DB

Sep 6, 2005

Hello,

I have a huge database (growing by 2 GB / month) and after a while it becomes non-operational (time-outs, etc.). So I have written a SQL statement (a DELETE) that can reduce the db size by around 60% without compromising the application's data needs. The problem is that when I execute it, the db does shrink by 60%, but the transaction log grows at the same rate. Can I execute the statement in some "commit" or "transaction" mode that stops SQL Server from writing to the log?

Thanks for the help!
Antonio
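A fully logged DELETE can't bypass the log, but the log growth can be kept in check by deleting in batches and backing up the log (or checkpointing, if the db uses simple recovery / trunc. log on chkpt.) between batches. A sketch only, with an invented table, cut-off date and backup path:

-- Batch the delete so the log space can be reused between batches.
SET ROWCOUNT 10000          -- limit each DELETE to 10,000 rows (SQL 2000 style)

DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    DELETE FROM dbo.HistoryTable
    WHERE  CreatedDate < '20050101'

    SET @rows = @@ROWCOUNT

    BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn'   -- or CHECKPOINT under simple recovery
END

SET ROWCOUNT 0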

View 6 Replies View Related

SQL 7.0 Massive Row Locking Performance

Mar 3, 2000

When updating large sets a row at a time the performance is lacking in comparison to 6.5. When using PeopleSoft which uses cursors with a begin transaction with a loop inside and a commit after the loop completes, SQL 6.5 with Page locking could handle a 300,000 row transaction in 3-4 hours. 7.0 took 17.5 hours. The difference is 6.5 used 50,000 locks and 7.0 used 300,000 locks.

Does anybody have a solution short of rewriting PeopleSoft?

View 2 Replies View Related

Massive TRN File, But Small DB

Apr 5, 2006

Hi Everyone,

We have a large and active MSSQL 2000 database. Recently, after a rebuild of the server, we had a problem with the SQL Server Agent service (SQLSERVERAGENT). The service could not start because the service account had lost local permission to the registry. During this time, all of the data being sent to the database from our application accumulated in the database .ldf file. By the time we were able to get the service restarted, our .ldf file was approx. 28 GB. When the service restarted, the .ldf file shrank back down to its regular size, about 40 MB, and the .trn log files for that specific period grew to 28 GB (a new file every hour).

The problem is, the database file (database.mdf) stayed about the same size as before the service was restarted. When the .ldf contents were transferred to the .trn files, none of the 28 GB of data appeared to get stored in the database. What does this mean? Perhaps with the service stopped the application using the db saw problems and did not commit the data, making it all useless? Or is it possible that the data in the .trn log just needs to be forced to commit to the .mdf?

Is there any way to verify the data in the 28 GB of .trn files and figure out whether we should get it stored in the database? If yes, how would we go about verifying it, and after that how would we force it to commit to the .mdf file? Am I on the right track here, or is it not as I see it?
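There's no way to read a .trn file directly, but its contents can be inspected by restoring it onto a copy of the database, assuming a full backup from before the incident exists. A rough sketch (database, logical file names and paths are illustrative); note that if the application never actually committed those transactions, the restore will roll them back rather than surface them:

-- Restore a copy from the last full backup, leaving it able to accept log restores.
RESTORE DATABASE MyDb_Copy
FROM DISK = 'D:\Backups\MyDb_Full.bak'
WITH MOVE 'MyDb_Data' TO 'D:\Data\MyDb_Copy.mdf',
     MOVE 'MyDb_Log'  TO 'D:\Data\MyDb_Copy.ldf',
     NORECOVERY

-- Apply the .trn file(s); STANDBY leaves the copy readable so the data
-- can be checked before deciding what to do with it.
RESTORE LOG MyDb_Copy
FROM DISK = 'D:\Backups\MyDb_20060405.trn'
WITH STANDBY = 'D:\Data\MyDb_Copy_undo.dat'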

Thanks!
Mike

View 4 Replies View Related

General Question About Massive SP Use

Jul 2, 2006

Hi

Would you say that it's OK for a web site's code to make ALL of its database access through stored procedures and views? And I mean everything, including inserting new records and updating others, with no inline SQL in the code.

The advantage would be very strict control over access, but achieving this would take many, many stored procedures and views to cover all types of actions. Can you think of a disadvantage besides all the work of creating those procedures? What about server resources and performance? How demanding would it be?

Thanks,
Inon.

View 7 Replies View Related

Execute Massive SQL Statement

Jan 3, 2006

Hello,

I thought this was a neat solution I came up with, but I'm sure it's
been thought of before. Anyway, it's my first post here.

We have a process for importing data which generates a SELECT statement
based on user's stored configuration. Since the resulting SELECT statement
can be massive, it's created and stored in a text field in a temp table.

So how do I run this huge query after creating it? In my tests, I was
getting a datalength > 20000, requiring 3 varchar(8000) variables in
order to use the execute command. Thing is, I don't know how big it could
possibly get, I wanted to be able to execute it regardless.

Here's what I came up with, it's very simple:

Table is named #IMPORTQUERY, one field SQLTEXT of type TEXT.


>>
declare @x int, @s varchar(8000)

select @x = datalength(sqltext) / 8000 + 1, @s = 'execute('''')' from #importquery

while @x > 0
select @s = 'declare @s' + cast(@x as varchar) + ' varchar(8000) ' +
'select @s' + cast(@x as varchar) +
'=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery ' +
replace(@s,'execute(','execute(@s' + cast(@x as varchar) + '+')
, @x = @x - 1

set @s = 'declare @x int set @x=1 ' + @s

execute(@s)
<<

At the end, I execute the "@s" variable which is SQL that builds and
executes the massive query. Here's what @s looks like at the end:

>>
declare @x int set @x=1
declare @s1 varchar(8000)
select @s1=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s2 varchar(8000)
select @s2=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s3 varchar(8000)
select @s3=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
execute(@s1+@s2+@s3+'')
<<
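For what it's worth, if the server is SQL Server 2005 or later the chunking isn't needed at all, since the whole query can be pulled into a varchar(max) variable (the TEXT column converts implicitly) and executed directly. A minimal sketch reusing the same temp table:

DECLARE @sql varchar(max)

SELECT @sql = sqltext FROM #importquery

EXECUTE (@sql)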

View 4 Replies View Related

Massive Amounts Of Reading

Jul 23, 2005

Our database server has started acting weird and at this point I'm either too sleep deprived or too close to the problem to adequately diagnose the issue.

Basically, to put it simply: when I look at the read disk queue length, the disk queues are astronomical. Normally we see a disk queue length of 0-1 on the disks that contain the DB data and indexes (i.e. non-clustered indexes are on a disk of their own). Writes are just fine.

Problem is, all our databases are on the same drive, and I can't seem to nail down which DB, let alone which table, is the source of all our reads.

Now, to really make things weirder, during the busier times of the day today (say 1:00 PM to 4:00 PM) things were fine. At 4:20 PM or so it was like someone hit a switch and the read disk queue length jumped from 0-1 up to 100-200+, with spikes up to 1500 for a split second or so.

What's the best way folks know to nail this down?

Thanks.
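One place to start, if the instance is SQL Server 2005 or later, is the per-file I/O statistics, which at least narrow the reads down to a database and file (on SQL 2000 the older fn_virtualfilestats function plays the same role). A sketch:

-- Which database files have done the most read I/O since the instance started?
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.num_of_bytes_read,
       vfs.io_stall_read_ms
FROM   sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN   sys.master_files AS mf
       ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY vfs.num_of_bytes_read DESC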

View 9 Replies View Related

Massive Slowdown With Query

Dec 29, 2007

If I remove the TOP 200 this query returns about 2.5 million rows. It combines a lot of records and turns them into much more programmer-friendly results. The query has slowed from 2 seconds to about 13 seconds as the data has grown from about 10k rows to the current couple of million.



Code Block

SELECT TOP 200 *
FROM
(
SELECT
[UserProfile].[UserId]
,[aspnet_Users].[UserName]
,[City]
,[State]
,[RoleName]
,[ProfileItemType].[Name] AS pt_name
,[ProfileItem].[Value]
FROM
[UserCriteria]
,[aspnet_Users]
,[aspnet_Roles]
,[aspnet_UsersInRoles]
,[Location]
,[ProfileType]
,[ProfileTypeItem]
,[ProfileItem]
INNER JOIN [UserProfile]
ON [ProfileItem].[ProfileId] = [UserProfile].[ProfileId]
INNER JOIN [ProfileItemType]
ON [ProfileItem].[ProfileItemTypeId] = [ProfileItemType].[ProfileItemTypeId]
WHERE [UserProfile].[UserId] IN (
SELECT [UserCriteria].[UserId]
FROM [UserCriteria]
WHERE
Zipcode IN (
SELECT [Zipcode]
FROM [ZipcodeProximitySQR] ('89108' , 150))
)

AND [UserProfile].[UserId] = [aspnet_Users].[UserId]
AND [UserCriteria].[UserId] = [UserProfile].[UserId]
AND [Location].[Zipcode] = [UserCriteria].[Zipcode]
AND [aspnet_UsersInRoles].[UserId] = [aspnet_Users].[UserId]
AND [aspnet_UsersInRoles].[RoleId] = [aspnet_Roles].[RoleId]
) AS t
PIVOT
(
MIN([Value])
FOR pt_name IN ([field1],[field2],[field3],[field4])
) AS pvt
ORDER BY RoleName DESC, NEWID()





In the line FOR pt_name IN ([field1],[field2],[field3],[field4]) I changed the values from the long names to read field1, field2, ... because the real names were irrelevant here but would have been confusing.

Here is the showplan text



Code Block
|--Sequence
|--Table-valued function(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
|--Top(TOP EXPRESSION:((200)))
|--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId], [aous].[dbo].[aspnet_Users].[UserName], [aous].[dbo].[Location].[City], [aous].[dbo].[Location].[State], [aous].[dbo].[UserCriteria].[Birthdate], [aous].[dbo].[aspnet_Roles].[RoleName]) DEFINE:([Expr1039]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'height' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1040]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'bodyType' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1041]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'hairColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1042]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'eyeColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END)))
|--Nested Loops(Inner Join)
|--Nested Loops(Inner Join)
| |--Sort(ORDER BY:([aous].[dbo].[UserCriteria].[UserId] ASC, [aous].[dbo].[Location].[City] ASC, [aous].[dbo].[Location].[State] ASC, [aous].[dbo].[UserCriteria].[Birthdate] ASC, [aous].[dbo].[aspnet_Roles].[RoleName] ASC))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserCriteria].[Zipcode])=([Expr1043]), RESIDUAL:([Expr1043]=[aous].[dbo].[UserCriteria].[Zipcode]))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[ProfileItemType].[ProfileItemTypeId])=([aous].[dbo].[ProfileItem].[ProfileItemTypeId]))
| | | |--Index Scan(OBJECT:([aous].[dbo].[ProfileItemType].[ProfileTypes]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[ProfileId]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserProfile].[UserId])=([aous].[dbo].[aspnet_UsersInRoles].[UserId]), RESIDUAL:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[aspnet_UsersInRoles].[UserId]))
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | |--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | | |--Nested Loops(Left Semi Join, WHERE:([aous].[dbo].[UserCriteria].[Zipcode]=[Expr1044]))
| | | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId] < {guid'E3D72D56-731A-410E-BCB1-07A87A312137'} OR [aous].[dbo].[UserCriteria].[UserId] > {guid'E3D72D56-731A-410E-BCB1-07A87A312137'}), WHERE:([aous].[dbo].[UserCriteria].[Male]=(1) AND [aous].[dbo].[UserCriteria].[SeekingMale]=(0)) ORDERED FORWARD)
| | | | | | | | |--Compute Scalar(DEFINE:([Expr1044]=CONVERT_IMPLICIT(nvarchar(5),[aous].[dbo].[ZipcodeProximitySQR].[Zipcode],0)))
| | | | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
| | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserProfile].[UserProfileIds]), SEEK:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[UserCriteria].[UserId]) ORDERED FORWARD)
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[aspnet_Roles].[RoleId]))
| | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[aspnet_Roles].[aspnet_Roles_index1]))
| | | | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_UsersInRoles].[aspnet_UsersInRoles_index]), SEEK:([aous].[dbo].[aspnet_UsersInRoles].[RoleId]=[aous].[dbo].[aspnet_Roles].[RoleId]) ORDERED FORWARD)
| | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_Users].[_dta_index_aspnet_Users_5_37575172__K2_K1_K4_3]), SEEK:([aous].[dbo].[aspnet_Users].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | |--Index Seek(OBJECT:([aous].[dbo].[ProfileItem].[_dta_index_ProfileItem_5_1714105147__K2_K1_K3_4]), SEEK:([aous].[dbo].[ProfileItem].[ProfileId]=[aous].[dbo].[UserProfile].[ProfileId]) ORDERED FORWARD)
| | |--Compute Scalar(DEFINE:([Expr1043]=CONVERT_IMPLICIT(nchar(5),[aous].[dbo].[Location].[Zipcode],0)))
| | |--Index Scan(OBJECT:([aous].[dbo].[Location].[CityLocation]))
| |--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileType].[PKProfileTypeProfileTypeId]))
|--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileTypeItem].[ProfileTypeItem]))




Here is a link to the execution plan from Microsoft SQL Server management Studio.
http://epi.cc/BasicUserSearch.zip

There are no table scans, but the Hash Match from the inner join is pretty bad.

Can anyone give me a pointer or two?

View 1 Replies View Related

Deleting Massive Data From A Table

Jan 20, 2014

I have to delete a ton of data from a SQL table. I have a unique identifier called the version. I would like to say "if not in these versions then delete". I tried using the statement below, but learned the hard way that it created an error. This is the message I got:

Msg 9002, Level 17, State 4, Line 3...

The transaction log for database 'MonthEnds' is full due to 'ACTIVE_TRANSACTION'.

I was reading about TRUNCATE, but I am not sure how I would do this or how I would set up the statement.

Delete Products
where versions were not in (('48459CED-871F-4971-B888-5083990332BC','D550C8D3-58C7-4C74-841D-1C1675F19AE3','C77C7817-3F04-4145-98D3-37BB1610DB35',
'21FE83FA-476D-4604-80EF-2ED57DEE2C16','F3B50B81-191A-4D71-A406-011127AEFBE1','EFBD48E7-E30F-4047-909E-F14DCAEA4181','BD9CCC41-D696-406B-
'C8BEBFBC-D362-4D0F-A555-B281FC2B3023','EFA64956-C2CF-41FC-8E21-F060597DAFCB','77A8DE56-6F7F-4490-8BED-AA6809B947EF','0F4C1E5F-B689-4DCB-

[code]....
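Msg 9002 here just means the single DELETE was too big for the log to hold as one open transaction. One hedged workaround is to delete in batches, each in its own transaction, so the log can be truncated or backed up between batches; a sketch only, with the version list abbreviated and the batch size invented:

-- Delete in chunks so no single transaction fills the log.
DECLARE @rows INT
SET @rows = 1

WHILE @rows > 0
BEGIN
    DELETE TOP (50000)
    FROM   Products
    WHERE  [Version] NOT IN ('48459CED-871F-4971-B888-5083990332BC',
                             'D550C8D3-58C7-4C74-841D-1C1675F19AE3'
                             /* ...rest of the version list... */)

    SET @rows = @@ROWCOUNT

    CHECKPOINT   -- simple recovery; use BACKUP LOG instead under full recovery
END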

View 2 Replies View Related

SSIS Using MASSIVE Amounts Of Memory

Feb 8, 2008

Hi,

I have a series of SSIS packages, all of which are ultimately executed by a parent package.

I'm consistently getting "OutOfMemory" errors when working with the packages, which is temporarily solved by closing Visual Studio and re-opening the package(s)... This solution is short lived however, as the OutOfMemory error occurs quite quickly after re-opening, often after doing nothing other than altering a variable's default value and attempting to save the package.

The average size of the packages in question (.dtsx files) is around 7,000kb with the largest being 12,500kb. The total size of all the solution's packages is ~75,000kb.

The Processes tab in Task Manager shows a Mem Usage counter for devenv.exe *32 of around 20,000kb when Visual Studio is first opened however, when a single ~6,000kb dtsx file is opened this counter jumps to +300,000kb and when the entire solution is opened (When the parent package is executed), the Mem Usage counter for devenv.exe *32 is a massive +800,000kb!!!

Is this normal SSIS behaviour or do I have a major problem? Any tips or suggestions as to how to resolve this issue would be gratefully received.

FYI, "SELECT @@VERSION" gives me "Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2) "

My Server is Windows Server 2003 R2 Enterprise x64 SP2 with 8GB of RAM.

Thanks in advance.

Leigh.

View 7 Replies View Related

Massive DTS Delete/import: Logging Problems

Feb 14, 2005

Hey all,

We are using SQL Server 7 on Win 2k and there are some DTS packages set up which empty some large tables (delete from) and then import some datafiles.
The imported files are about 13 GB and during the process the log file gets to about 10GB and then runs out of disk space.

Is there a trick to empty a table without logging it? (a la LOAD Replace from Null in DB2)?
How can I go about keeping the log file size down during this operation?

I think the DB is set to autocommit; the 'trunc. log on chkpt.' option is on, as is 'select into/bulk copy' (although I'm reasonably sure we aren't availing of the bulk copy option for the import).
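For the "empty a table without logging every row" part, TRUNCATE TABLE is the usual answer: it deallocates the pages and logs only the deallocations, so it is far lighter on the log than DELETE FROM (it does require that no foreign keys reference the table). A sketch of the emptying step, with an invented table name:

-- Minimal-logging alternative to "DELETE FROM dbo.LargeTable" in the DTS package.
TRUNCATE TABLE dbo.LargeTable

For the import itself, a bulk load (bcp or the Bulk Insert task) with 'select into/bulk copy' enabled keeps the logging for the load phase down as well.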

Help? :)

View 2 Replies View Related

Massive Data Import, How To Avoid Duplicates?

Dec 4, 2006

Hello,

I am currently working on a project where I have to import a huge amount of data from CSV files into a database.
I don't want duplicate keys in my table, but my CSV file contains them. When a key appears more than once, the line nearer the end of the file holds the more up-to-date information, and that is the version I have to store.

I have been trying to fix this problem for several weeks, but my algorithm is very slow and blocks all other processes on the server. At the moment I copy all records that occur more than once in the CSV file into a temp table. After that I run through this table line by line, check whether the key already exists in the target table, and then either do an insert or an update.

Does somebody know a better process?
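A set-based sketch, assuming SQL Server 2005, a staging table with an identity column that preserves the file order, and entirely made-up names: keep only the last occurrence of each key with ROW_NUMBER(), then update existing rows and insert the rest in two statements instead of row by row.

-- Staging table has a LoadSeq IDENTITY column so "later in the file" is knowable.
-- Keep only the last occurrence of each key.
SELECT KeyCol, Col1, Col2
INTO   #latest
FROM  (SELECT KeyCol, Col1, Col2,
              ROW_NUMBER() OVER (PARTITION BY KeyCol ORDER BY LoadSeq DESC) AS rn
       FROM   dbo.Staging) AS s
WHERE  rn = 1

-- Update rows whose key already exists in the target.
UPDATE t
SET    t.Col1 = l.Col1,
       t.Col2 = l.Col2
FROM   dbo.Target AS t
JOIN   #latest    AS l ON l.KeyCol = t.KeyCol

-- Insert the keys that are new.
INSERT INTO dbo.Target (KeyCol, Col1, Col2)
SELECT l.KeyCol, l.Col1, l.Col2
FROM   #latest AS l
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Target AS t WHERE t.KeyCol = l.KeyCol)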

I hope somebody can help me... :(

View 5 Replies View Related

Delete Records From A Massive Table (heap)

May 21, 2013

So I've stumbled across an audit table on one of our systems that has reached a hearty 180M rows in size.

The table is a heap (no indexes whatsoever).

Each record has a datetime value indicating when it was created.

I need to delete everything that was created prior to the last 6 months; what is my best plan of attack?
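Since most of the 180M rows are going and the table has no indexes, one hedged approach is not to delete at all: copy the rows worth keeping into a new table, then swap the tables. A sketch with made-up names:

-- Copy the keepers (roughly the last 6 months) into a new table.
SELECT *
INTO   dbo.AuditLog_keep
FROM   dbo.AuditLog
WHERE  CreatedDate >= DATEADD(MONTH, -6, GETDATE())

-- Swap the tables during a quiet window.
BEGIN TRAN
    EXEC sp_rename 'dbo.AuditLog',      'AuditLog_old'
    EXEC sp_rename 'dbo.AuditLog_keep', 'AuditLog'
COMMIT

-- Once verified, drop the old heap (or keep it around as an archive).
-- DROP TABLE dbo.AuditLog_old

If deleting in place is required (permissions, concurrent writers), batched DELETE TOP (...) on the datetime column works too, but without an index on that column every batch scans the heap.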

View 12 Replies View Related

Massive UPDATE And SELECT TOP 1 QUERIES, Slowing Down...

Apr 10, 2007

Background

SQL Server 2005 Standard 9.0.1399 64bit

Windows 2003 64-bit

8gb RAM

RAID-1 70gb HD 15K SCSI (Log Files, OS)

RAID-10 1.08TB HD 10K SCSI (Data Files)

Runs approximately 800 transactions/second (_Total)

We deliver approximately 70-80 million ad views / day



8 Clustered Windows 2003 32-bit OS IIS Servers running Asp.net 2.0 websites

All 8 servers talking to the one SQL server via a private network (server backbone).



In SQL Server Profiler, I see the following SQL statements with durations of 2000 - 7000:



select top 1 keywordID, keyword, hits, photo, feed from dbo.XXXX where hits > 0 order by hits



and



UPDATE XXXX SET hits=1906342 WHERE keywordID = 7;



The hits number is incremented by one each time that row is selected for that keyword ID.



Sometimes these happen so frequently the server stops accepting new connections, and I have to restart the SQL Server service or reboot.



Any ideas on why this is happening?
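One thing worth checking, though it's only a guess from the two statements shown: without an index leading on hits, the SELECT TOP 1 ... WHERE hits > 0 ORDER BY hits has to scan and sort the table on every call, and the constant single-row UPDATEs then fight those scans for locks. A hedged sketch of a covering index for the SELECT (column names are taken from the query; the table name is masked in the post):

-- Turns the TOP 1 ... ORDER BY hits into a small range seek instead of a scan + sort
-- (SQL Server 2005 syntax).
CREATE NONCLUSTERED INDEX IX_XXXX_hits
ON dbo.XXXX (hits)
INCLUDE (keywordID, keyword, photo, feed)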



Regards,

Joe







View 6 Replies View Related

ASAP Help Needed Need Sql Guru To Help With Massive Script Issue

May 3, 2007

I need some help. I have this massive SQL script, and the problem is that when I try to put it into the query string box in my SQL reports, it will not take it. The script will run if I break it up, but I think it is too large. Is there a SQL guru out there who can show me how to reduce the size of this script, maybe by using an output parameter to a stored procedure? I just don't know what to do, and I need to produce the report from this script. Below is the entire script:
SELECT  'Prior Year All ' as 'qtr', COUNT(JOB.JOBID) AS 'transcount',  COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',  SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',  SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',  AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',  SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',  SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',  JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                         PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                         JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE AS Expr1,                         INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
            (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS  'CompletedSuccessfullyItems',
             (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',                                                  (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with Situation') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithSituationItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Not Completed') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'NotCompletedItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled Prior to service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledPriorToServiceItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled During Service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledDuringServiceItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'AwaitingforcompletionItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Pending for review') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like'%T ')) AS 'PendingforreviewItems'
FROM JOB                   INNER JOIN INVOICE_AR                                  ON JOB.JOBID = INVOICE_AR.JOBID                   LEFT OUTER JOIN PAYER                                 ON PAYER.PAYERID = JOB.PAYERID                  LEFT OUTER JOIN STATES                                 ON JOB.JURISDICTION = STATES.INITIALS                LEFT OUTER JOIN PATIENT                                ON PATIENT.PATIENTID = JOB.PATIENTID                LEFT OUTER JOIN JOB_OUTCOME                                ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                LEFT OUTER JOIN SERVICE_TYPE                                ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME               LEFT OUTER JOIN PAT_SERVICES_HISTORY                                ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
WHERE                 (INVOICE_AR.AMOUNT_DUE > 0)AND                 (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                 (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12))AND                (PAYER.PAY_GROUPNAME like '%' + @Company + '%')AND                (INVOICE_AR.INVOICE_NO like '%T')  
GROUP BY                         JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                        PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                        JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE,                        INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE
UNION ALL
SELECT  'Current Year 2007 All ' as 'qtr', COUNT(JOB.JOBID) AS 'transcount',  COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',  SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',  SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',  AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',  SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',  SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',  JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                         PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                         JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE AS Expr1,                         INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
            (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startDate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS  'CompletedSuccessfullyItems',
             (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',                                                  (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
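The script (which appears truncated above) repeats essentially the same ten-table joined subquery once per job outcome, and that repetition is most of its bulk. A hedged sketch of the usual way to collapse that pattern is conditional aggregation: join once and count each outcome with a CASE expression. Note that the original subqueries count over the whole filtered set rather than per GROUP BY group, so in practice these counts would likely live in their own derived table and be joined back in; only the shape is shown here, and the remaining outcomes, columns and filters would carry over unchanged.

-- One pass over the joined, filtered rows; each outcome counted with CASE
-- instead of its own correlated subquery.
SELECT COUNT(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Completed Successfully'   THEN 1 END) AS CompletedSuccessfullyItems,
       COUNT(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Completed with complaint' THEN 1 END) AS CompletedWithComplaintItems,
       COUNT(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Completed with No Show'   THEN 1 END) AS CompletedWithNoShowItems,
       COUNT(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Not Completed'            THEN 1 END) AS NotCompletedItems
       -- ...remaining outcomes follow the same pattern...
FROM JOB
     INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
     LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
     LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
WHERE INVOICE_AR.AMOUNT_DUE > 0
  AND INVOICE_AR.INVOICE_DATE BETWEEN DATEADD(year, 0, @startate) AND DATEADD(year, 0, @endate)
  AND PAYER.PAY_GROUPNAME LIKE '%' + @Company + '%'
  AND INVOICE_AR.INVOICE_NO LIKE '%T'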

View 8 Replies View Related

Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS Sql Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up the query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also a publisher in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly so I'm only allowed about a 3-5 hour period to take an outage a week.

So far I've tested several methods of delete following all best practices (batch deletes, using indexes in delete's where clause), and have come up with deleting/commiting 500 rows at a time. The problem is that it still takes 3-4 seconds to delete this many rows, on a 8 GB RAM, 4 processor machine that is not currently used or replicated.

I'm at a loss on a way to pare down the data with a delete, as the current purge script will take 7 hours a day for about 3 months. Another option I'm considering is to do a truncate and copy the data back over from the replicated database, but again this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.

Any one have experience with purging such a massive amount of data? Any help would be greatly appreciated.

View 6 Replies View Related

Inserts Per Second

Mar 3, 2006

Just curious,

I have four 72 GB Drive on a RAID 5

Disk Specs
IO/Second = 130 per disk
Speed RPM = 15 K

When I did a load test of inserting data into a table
with four Columns

Col1 INT
Col2 VARCHAR(32)
Col3 VARCHAR(4000)
Col4 DATETIME

I could do around 1044 inserts per second, whereas I thought I could do a max of 520 inserts (130 * 4), because each disk can only take 130 IOs; multiplied by 4 disks that gives me a theoretical limit of 520.

Also, how does Query Analyzer connect to the database server? Does it use ODBC?

Thx
Venu

View 1 Replies View Related

No Of Inserts

Mar 21, 2006

I am doing a simple IO Test with the below script ...

Just wanted to keep things simple and check how many inserts I can do on a given SQL Server.
I am running the below script from QA for 1 minute and then dividing the number of rows inserted by 60.

Will doing this give me approximate results?

Actually the data (.MDF) files are sitting on a single drive where the manufacturer's specs say it will handle 130 IOs per disk. With the below script I am getting around 147 inserts per second.

But my boss says that he is getting 2000 inserts per second on his laptop from a ... Am I missing something?

DECLARE @lnRowCnt INT
SELECT @lnRowCnt = 100000

WHILE @lnRowCnt > 0
BEGIN
    SET NOCOUNT ON
    INSERT INTO CTMessages..Iotest
    SELECT @lnRowCnt , 'VENU' , REPLICATE ( 'V' , 4000 ) , 1000000

    SELECT @lnRowCnt = @lnRowCnt - 1
END

Thx
Venu

View 3 Replies View Related

Two Inserts In One InsertCommand

Apr 11, 2007

I'm using a SQLDataSource and trying to do two inserts into two different tables with one InsertCommand, but it's not working. Here's the code I'm trying to use. Do you see anything wrong with the syntax? I keep getting an error that says error near ','  but I can't figure out why. Thanks
 
InsertCommand="INSERT INTO [OurProjects] ([Title], [Description], [Location], [Anchors], [Size], [Developer], [DesignBuilder], [Architect], [ImageName], [MapName], [ProjectTypeAbbrev], [Deleted]) VALUES (@Title, @Description, @Location, @Anchors, @Size, @Developer, @DesignBuilder, @Architect, @ImageName, @MapName, @ProjectTypeAbbrev, @Deleted),
INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName]) VALUES (@OurProjectsID, @ImageMonthName, @SwfName)"
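If the two statements are meant to run as one batch, the comma between them is most likely what the error is pointing at: T-SQL statements in a batch are separated by a semicolon (or just whitespace), not a comma. A sketch of the same InsertCommand with only the separator changed (the first column list is elided here):

InsertCommand="INSERT INTO [OurProjects] ([Title], ..., [Deleted]) VALUES (@Title, ..., @Deleted);
INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName]) VALUES (@OurProjectsID, @ImageMonthName, @SwfName)"

Note that @OurProjectsID still has to come from somewhere; if it is the identity value generated by the first insert, SCOPE_IDENTITY() could be used in its place within the same batch.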

View 2 Replies View Related

SQL TimeSpan And Inserts

May 27, 2008

I need to add a check in the database to ensure that a user can only enter up to 20 entries in a period of 10 minutes. Basically, to guard against people using scripts to add data to the database (instead of using a CAPTCHA on the front end), we want to restrict a user to entering at most 20 transactions in 10 minutes. How do I handle this in SQL Server 2005?

What I figure I'll do is, right after I do an INSERT into the table, select the last 20 entries in that same table, calculate the total time it took to add those 20 transactions, and set the right flag.
1) How do I select the last 20 entries in a table?
2) How do I calculate the total time that elapsed between adding the first of those 20 records and the last?
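A sketch of both steps on SQL Server 2005, assuming the table has an identity column and a datetime column recording when each row was added (all names are invented):

-- 1) Last 20 entries for this user, newest first.
SELECT TOP 20 EntryId, CreatedAt
FROM   dbo.Entries
WHERE  UserId = @UserId
ORDER BY EntryId DESC

-- 2) Seconds elapsed between the oldest and newest of those 20 rows;
--    if the 20 most recent rows span fewer than 600 seconds, the user has hit the limit.
SELECT DATEDIFF(SECOND, MIN(CreatedAt), MAX(CreatedAt)) AS ElapsedSeconds
FROM  (SELECT TOP 20 CreatedAt
       FROM   dbo.Entries
       WHERE  UserId = @UserId
       ORDER BY EntryId DESC) AS last20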
Thanks in Advance

View 6 Replies View Related

Next Inserts ID Number, Someone Please Help

Sep 21, 2004

G'day,
I have a table with a primary key that is a bigint and it's set to auto increment (or identity or whatever MS calls it). Is there any way I can get the ID number that will be assigned to the next insert before I insert it? I want to use that ID number within another field when the row is inserted.
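There's no reliable way to know the identity value before the insert (a concurrent insert can take it first), but the usual workaround is to insert, read the value back with SCOPE_IDENTITY(), and then update the other column in the same transaction. A sketch with made-up names:

DECLARE @NewId BIGINT

BEGIN TRAN

    INSERT INTO dbo.MyTable (SomeColumn)
    VALUES ('some value')

    SET @NewId = SCOPE_IDENTITY()   -- identity value just generated in this scope

    -- Store the ID in the other field of the same row.
    UPDATE dbo.MyTable
    SET    OtherColumn = 'prefix-' + CAST(@NewId AS VARCHAR(20))
    WHERE  Id = @NewId

COMMIT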

I hope that makes sense.

Thanks for any help.

Robbo

View 3 Replies View Related

Multiple Inserts

Jul 14, 2005

Hi, I'm trying to create a form where new names can be added to a database. The webform looks like this:

<body MS_POSITIONING="GridLayout">
    <form id="Form1" method="post" runat="server">
        Name: <asp:TextBox ID="newName" runat="server" />
        <INPUT id="NewUserBtn" type="button" value="Create New User" name="NewUserBtn" runat="server"
            onServerClick="NewBtn_Click">&nbsp;
    </form>

And the code behind looks like this:

Public Sub NewBtn_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles NewUserBtn.ServerClick
    Dim DS As DataSet
    Dim MyConnection As SqlConnection
    Dim MyCommand As SqlDataAdapter

    MyConnection = New SqlConnection("server=databaseserver;database=db;uid=uid;pwd=pwd")
    MyCommand = New SqlDataAdapter("insert into certifications (name) values ('" & newName.Text & "'); select * from certifications", MyConnection)

    DS = New DataSet
    MyCommand.Fill(DS, "Titles")

    Response.Redirect("WebForm1.aspx", True)
End Sub

When I try to insert one name it works. When I try to insert a second name, it overwrites the old one. Why is that?

Thanks.
James

View 3 Replies View Related

Identity Inserts

Apr 15, 2006

Hey All,
I was trying to use a typed dataset to create a very simple DAL. I found that the code generated for the INSERT statement includes an identity field the table has. That can obviously never work (unless IDENTITY_INSERT is set, which it is not). My question is whether it is possible to control how this INSERT statement is generated. Is there a property I am missing somewhere? My workaround was to change the INSERT statement on the DataTableAdapter, but it seems awkward to have to do that.
Thanks,
Yuval

View 3 Replies View Related

Updates, Inserts

Jan 22, 2001

I have a number of columns with a predefined character length, but the user can input more from the GUI. I want to truncate automatically to the desired length and then insert or update the database. Right now it does not allow me to update or insert the values. Can I do it, and how? This is urgent.
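If truncating on the database side is acceptable, wrapping each value in LEFT() (or CASTing to the column's declared length, which truncates silently) before the INSERT or UPDATE is the simplest fix. A sketch with invented table and column names and lengths:

-- Truncate the incoming values to each column's defined length before writing them.
UPDATE dbo.Customers
SET    Notes = LEFT(@Notes, 200)         -- Notes is varchar(200)
WHERE  CustomerId = @CustomerId

INSERT INTO dbo.Customers (Name, Notes)
VALUES (LEFT(@Name, 50), LEFT(@Notes, 200))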

View 2 Replies View Related

Volume Inserts

Jan 24, 2000

We have a 4 processor 350 MHz NT 4.0 SQL server. Currently we have an application
that is inserting rows one at a time; each row insert is a separate transaction.
Currently we are averaging 2500 rows a second, with each row 56 bytes wide.
The data and the log are on one string of RAID disks. We plan to get another controller
and RAID string to separate the data and the log onto separate controllers.
The developer is modifying the application to insert the data in blocks. What is the
impact on the transaction log? He seems to think that by inserting rows in page-sized blocks
there would be less data going into the transaction log. Why would this be so?
Does anyone have any information on practical limits for inserts and log truncation
with similar machine configurations? He would like to get to around 150,000 rows a second.
Has anyone accomplished inserts at this rate? What type of machine configuration?
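Each row is still fully logged either way; what batching mainly saves is the per-transaction overhead, above all the synchronous log flush on every commit. A hedged sketch of the kind of change, expressed in T-SQL (table, values and batch size are invented), wrapping a block of single-row inserts in one transaction:

-- One log flush per 1000 rows instead of one per row.
DECLARE @i INT
SELECT @i = 0

BEGIN TRAN
WHILE @i < 1000
BEGIN
    INSERT INTO dbo.LoadTarget (Col1, Col2)
    VALUES (@i, 'x')

    SELECT @i = @i + 1
END
COMMIT TRAN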

View 1 Replies View Related

Inserts Overwriting

Nov 21, 2001

Hi,
I have a small web application managing complaints. During multiuser testing we noticed that when complaints were added at "exactly" the same time, one complaint's text seemed to be overwriting the other, and the current max value of the table's ID was returned as the current complaint number.

I tested in my development environment and was able to recreate it reasonably easily (1 go out of 3 recreated the issue). The ID column itself is an auto-increment (primary key), so I can't think of a conceivable reason why one record should overwrite another. I should say that I am assuming the record is overwritten; perhaps there is a clash and one complaint is ignored by the database.

Has anyone encountered this in the past?


Thanks

View 1 Replies View Related

Inserts Across Databases

Mar 6, 2002

Hi,
I have a procedure that I call in one database, and one of its steps is to write to a table in another database on the same server. The user exists in both databases, but I keep getting errors when I try to write to this second database. I know I can fix this by giving the user INSERT permissions on the table in the second database, but I do not want that for security purposes. Any other ideas on how to accomplish this?

View 1 Replies View Related

Looped Inserts Sql

Dec 5, 2005

i have a {date value}
i have a {frequency value}
1 = yearly
4 = quarterly
12 = monthly

i need to select an item
then check the frequency

then do a loop insert based on the frequency

if frequency = 1

insert item, date into table where date = {date value}

elseif frequency = 4

Per item -- insert 4 new entrys
insert item, date into table where date = {date value}
insert item, date into table where date = {date value + quarter}
insert item, date into table where date = {date value+ 2 quarters}
insert item, date into table where date = {date value + 3 quarters}

' Below is how I calculate quarterly values from a date in VB.NET; I just need to do the same within SQL.

Dim QuarterLoop As Integer
For QuarterLoop = 0 To 3
    FormatDateTime(DateAdd("q", QuarterLoop, MyDate), DateFormat.LongDate)
Next

elseif frequency = 12
--- per item insert 12 new entrys
insert 12 items into the table looping from date and then in 12 increments of 1 month values
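A T-SQL sketch of the same idea (table, column and variable names invented): derive the number of rows and the month step from the frequency, then loop with DATEADD, which covers the yearly, quarterly and monthly cases with one piece of code.

-- @Frequency: 1 = yearly, 4 = quarterly, 12 = monthly
DECLARE @i INT, @MonthStep INT
SELECT @i = 0, @MonthStep = 12 / @Frequency   -- 12, 3 or 1 months between entries

WHILE @i < @Frequency
BEGIN
    INSERT INTO dbo.Schedule (ItemId, ScheduleDate)
    VALUES (@ItemId, DATEADD(MONTH, @i * @MonthStep, @DateValue))

    SELECT @i = @i + 1
END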

View 9 Replies View Related

Dynamic Inserts

May 27, 2008

Hello All,

I have to create dynamic INSERT statements for a table. For example, there are DevTableA and ProdTableA tables. I wrote a SQL query to get the new records that were added to DevTableA but are not in ProdTableA. The result gives me a list of rows. These tables have columns 'LanguageID' and 'LText'.

The compare result has records only for LanguageID = 0. Once I see the compare result, I am supposed to create INSERT statements for LanguageID = 1, 2, 5 and 6 and update the LText for those languages. The LText for the other languages is in a spreadsheet.

Can anyone advise me how to create the INSERT statements from the compare result and add 4 more INSERT statements for LanguageID = 1, 2, 5 and 6 with their respective LText?

So far I thought I could create a #table. It looks like I need more than one #table.

Thanks in advance
-S

View 4 Replies View Related

Bulk Inserts

Mar 3, 2006

I'm trying to perform a bulk insert as shown below. It's problematic because it's not updating the identity fields correctly and we're getting dups. I think, but I'm not sure, that ON UPDATE CASCADE would solve all this, because we wouldn't have to concern ourselves with touching the identity fields at all; they would be autogenerated. Can someone shed some light? I'm pretty confused.


CREATE PROCEDURE AddMiamirecords AS

BEGIN TRANSACTION

--USERS
INSERT INTO [Undex_Production].[dbo].[USERS]([LastName], [UserName], [EmailAddress], [Address1], [WorkPhone], [Company], [CompanyWebsite], [pword], [IsAdmin], [IsRestricted],[AdvertiserAccountID])
SELECT dbo.fn_ReplaceTags (convert (varchar (8000),Advertisername)), [AdvertiserEmail], [AdvertiserEmail],[AdvertiserAddress], [AdvertiserPhone], [AdvertiserCompany], [AdvertiserURL], [AccountNumber],'3',0, [AccountNumber]
FROM Miami
WHERE not exists (select * from users Where users.Username = miami.AdvertiserEmail)
AND validAD=1


--PROPERTY
INSERT INTO [Undex_Production].[dbo].[Property]([ListDate],[CommunityName],[TowerName],[PhaseName],[Unit], [Address1], [City], [State], [Zip],[IsActive],[AdPrintId])
SELECT [FirstInsertDate],[PropertyBuilding],[PropertyStreetAddress],PropertyCity + ' ' + PropertyState + ' ' + PropertyZipCode as PhaseName,[PropertyUnitNumber],[PropertyStreetAddress],[PropertyCity], [PropertyState], [PropertyZipCode],'0',[AdPrintId]
FROM [Undex_Production].[dbo].[miami]
WHERE miami.AdvertiserEmail IS NOT NULL
AND validAD=1


--ITEM
INSERT INTO [Undex_Production].[dbo].[ITEM] ([SellerID],[Price],[StartDate],[EndDate], [HomePageFeatured],[Classified],[IsClosed])
SELECT USERS.UserID, miami.PropertyPrice, convert(datetime,miami.FirstInsertDate), dateadd(day, 30, miami.FirstInsertDate)as EndDate, 1, convert (int,AdNumber) as Classified, 0
FROM USERS RIGHT OUTER JOIN
miami ON USERS.UserName = miami.AdvertiserEmail
WHERE validAD=1


--PROPERTYITEM
INSERT INTO [Undex_Production].[dbo].[propertyItem]( [propertyId], [ItemId])
SELECT Property.propertyId, ITEM.ItemID
FROM ITEM RIGHT OUTER JOIN
miami ON ITEM.StartDate = miami.FirstInsertDate AND ITEM.Price = miami.PropertyPrice AND ITEM.Classified = convert(int,miami.AdNumber) LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--CONDOFEATURES
INSERT INTO [Undex_Production].[dbo].[CondoFeatures](PropertyId,[Bedrooms], [Area], [PropertyDescription], [Bathrooms], [NumOfFloors])
SELECT Property.propertyId, [PropertyBedrooms], [PropertySquareFeet], dbo.fn_ReplaceTags (convert (varchar (8000),PropertyDescription)),
[PropertyBathrooms], [PropertyTotalFloors]
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--COMMUNITY FEATURES
INSERT INTO [Undex_Production].[dbo].[CommunityFeatures](PropertyId,[totalFloors],isComplete1)
SELECT Property.propertyId, miami.propertyTotalFloors,'0' as IsComplete
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--UNITDISCLOSURES
INSERT INTO [Undex_Production].[dbo].[UnitDisclosures]([propertyId],[monthcondoasso])
SELECT Property.propertyId, [propertyassocfee]
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1


--BROKERDEVELOPER
INSERT INTO [Undex_Production].[dbo].[BrokerDeveloper]([IsFSBO],[FSBOName],
[FSBOEmail],[FSBOWebsite],[IsDeveloper],[DeveloperName],[DeveloperWebsite],[IsBroker],[BrokerName],[BrokerageWebsite],
[propertyId],[brokercommission],[isComplete])SELECT
CASE AdvertiserType when 'FSBO' THEN 1 else 0 end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserEmail] else NULL end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserURL] else NULL end,
CASE AdvertiserType when 'Developer' THEN 1 else 0 end,
CASE AdvertiserType when 'Developer' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'Developer' THEN [AdvertiserURL] else NULL end,
CASE AdvertiserType when 'Realtor' THEN 1 when 'Broker' THEN 1 else 0 end,
CASE AdvertiserType when 'Realtor' THEN [AdvertiserName] when 'Broker' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'Realtor' THEN [AdvertiserURL] when 'Broker' THEN [AdvertiserName] else NULL end,
Property.propertyId,[PropertyCommBroker],'0' as IsComplete
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1
IF @@ERROR <> 0
BEGIN
ROLLBACK TRAN
RETURN
END
COMMIT TRANSACTION
GO

View 2 Replies View Related

Generate Inserts

Nov 24, 2007

Is there any easy way I can take a SELECT statement
(such as select from payments where datetime > '20071122') and output a SQL INSERT statement for these records?

I basically need to move a specific set of records from one SQL Server to another (both SQL Server 2005).
Any suggestions for the best way to do this?
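One low-tech sketch: build the INSERT statements as strings with a SELECT, then copy the output to the other server and run it. The column list and types here are invented; string values would need their embedded quotes doubled, and dates and numbers need explicit conversion.

-- Produces one INSERT statement per qualifying row; run the output on the target server.
SELECT 'INSERT INTO payments (PaymentId, Amount, PaymentDate) VALUES ('
       + CAST(PaymentId AS VARCHAR(20)) + ', '
       + CAST(Amount AS VARCHAR(20)) + ', '''
       + CONVERT(VARCHAR(23), PaymentDate, 121) + ''')'
FROM   payments
WHERE  PaymentDate > '20071122'

For larger sets, a linked-server INSERT ... SELECT or bcp out/in avoids the string building entirely.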

View 4 Replies View Related

Speeding Up Inserts

Jul 23, 2005

Hello everybody,

Just a short question: I have tables which are only log tables (used very little for selects), but there is a lot of writing. I would like to get as much speed as possible when writing data into these tables.

create table [tbl] (
    [IDX] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
    [Time_Stamp] [datetime] NOT NULL ,
    [Source] [varchar] (64) COLLATE Latin1_General_CI_AS NULL ,
    [Type] [varchar] (16) COLLATE Latin1_General_CI_AS NULL ,
    [MsgText] [varchar] (512) COLLATE Latin1_General_CI_AS NULL ,
    CONSTRAINT [tbl] PRIMARY KEY NONCLUSTERED ([IDX]) ON [PRIMARY]
) ON [PRIMARY]
GO

Questions:
Is it better for inserts to remove the PK but keep the identity?
How do I make this table optimized for writing?
If I set the fill factor of the table to 0%, will I gain much?

One more piece of information: old data will be deleted from this table nightly, depending on row count (the oldest IDs will be deleted each night).

Thank you in advance
Mateusz

View 2 Replies View Related






