Massive Data Import, How To Avoid Duplicates?

Dec 4, 2006

Hello,

I am currently working on a project where I have to import a huge amount of data from CSV files into a database.
I don't want duplicate keys in my table, but my CSV file contains them. When a key appears more than once, the line closer to the end of the file carries the more up-to-date information, and that is the version I have to store.

I have been trying to fix this problem for several weeks, but my algorithm is very slow and blocks all other processes on the server. At the moment I copy into a temp table all records that occur more than once in the CSV file. After that I run through this table line by line, check whether the key already exists in the target table, and then either insert or update.

Does somebody know a better process?

I hope somebody can help me... :(
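(For later readers, a minimal set-based sketch of one common answer, assuming the CSV is first bulk-loaded into a staging table #Staging that has an IDENTITY column LineNo preserving file order; all table and column names here are hypothetical:)

-- Keep only the last occurrence of each key, i.e. the most up-to-date row.
DELETE s
FROM #Staging s
JOIN (SELECT KeyCol, MAX(LineNo) AS LastLine
      FROM #Staging
      GROUP BY KeyCol) AS latest
  ON latest.KeyCol = s.KeyCol
WHERE s.LineNo < latest.LastLine

-- Update rows whose key already exists in the target...
UPDATE t
SET t.Payload = s.Payload
FROM dbo.Target t
JOIN #Staging s ON s.KeyCol = t.KeyCol

-- ...and insert the rest: two set-based statements instead of a per-row loop.
INSERT INTO dbo.Target (KeyCol, Payload)
SELECT s.KeyCol, s.Payload
FROM #Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = s.KeyCol)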




Massive DTS Delete/import: Logging Problems

Feb 14, 2005

Hey all,

We are using SQL Server 7 on Win 2k and there are some DTS packages set up which empty some large tables (delete from) and then import some datafiles.
The imported files are about 13 GB and during the process the log file gets to about 10GB and then runs out of disk space.

Is there a trick to empty a table without logging it? (a la LOAD Replace from Null in DB2)?
How can I go about keeping the log file size down during this operation?

I think the DB is set to autocommit; the 'trunc. log on chkpt.' option is on, as is 'select into/bulk copy' (although I'm reasonably sure we aren't availing of the bulk copy for the import).
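(A sketch of the usual workaround, with hypothetical table and file names: TRUNCATE TABLE deallocates pages and logs only the deallocations, unlike DELETE FROM, which logs every row, so it behaves much like DB2's LOAD REPLACE trick for the emptying step. Note it cannot be used on a table referenced by foreign keys:)

TRUNCATE TABLE dbo.LargeStagingTable

-- With 'select into/bulk copy' enabled, a bulk load keeps logging minimal:
BULK INSERT dbo.LargeStagingTable
FROM 'D:\imports\datafile.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK)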

Help? :)


Deleting Massive Data From A Table

Jan 20, 2014

I have to delete a ton of data from a SQL table. I have a unique identifier called the version. I would like to say "if not in these versions, then delete". I tried using the statement below, but learned the hard way that it raised an error. This is the message I got...

Msg 9002, Level 17, State 4, Line 3...

The transaction log for database 'MonthEnds' is full due to 'ACTIVE_TRANSACTION'.

I was reading about TRUNCATE, but I am not sure how it would apply here or how I would set up the statement.

DELETE FROM Products
WHERE Version NOT IN ('48459CED-871F-4971-B888-5083990332BC','D550C8D3-58C7-4C74-841D-1C1675F19AE3','C77C7817-3F04-4145-98D3-37BB1610DB35',
'21FE83FA-476D-4604-80EF-2ED57DEE2C16','F3B50B81-191A-4D71-A406-011127AEFBE1','EFBD48E7-E30F-4047-909E-F14DCAEA4181','BD9CCC41-D696-406B-
'C8BEBFBC-D362-4D0F-A555-B281FC2B3023','EFA64956-C2CF-41FC-8E21-F060597DAFCB','77A8DE56-6F7F-4490-8BED-AA6809B947EF','0F4C1E5F-B689-4DCB-

[code]....
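(The 9002 error means the single DELETE runs as one giant transaction that fills the log. A sketch of the usual batching workaround, assuming the keep-list is loaded into a helper table #KeepVersions, which is hypothetical:)

WHILE 1 = 1
BEGIN
    -- Small transactions let the log truncate between batches.
    DELETE TOP (10000) FROM Products
    WHERE NOT EXISTS (SELECT 1 FROM #KeepVersions k WHERE k.Version = Products.Version)

    IF @@ROWCOUNT = 0 BREAK

    CHECKPOINT   -- under SIMPLE recovery this lets the log reuse space
END

TRUNCATE TABLE, by contrast, removes all rows and cannot take a WHERE clause, so it only helps if the rows to keep are copied out first.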


Duplicates

Apr 26, 2004

Can someone please explain to me how to append data to current database tables?

If I have information from Access and want to add the (NEW) information to current SQL tables, how do I append it without overwriting current table information and without creating duplicates if the information already exists within the table?

I would like to keep the current table information and append only what is new.
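(A minimal sketch, assuming the Access data is first brought into a staging table and that dbo.Customers is the target keyed on CustomerID; all names are hypothetical. Only rows whose key is not already present get appended, so nothing is overwritten and no duplicates are created:)

INSERT INTO dbo.Customers (CustomerID, Name, City)
SELECT s.CustomerID, s.Name, s.City
FROM dbo.AccessStaging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers c
                  WHERE c.CustomerID = s.CustomerID)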

Thanks


Massive Bulk Delete / Data Purge Problem

Jan 28, 2008

I've got a large MS SQL Server 2000 database that has 15 indexes, with roughly 180 million rows representing 240 GB worth of data. Due to the massive size of the database we are trying to purge it down to a smaller dataset, about 40 million rows, in order to speed up query performance and to be able to defrag the indexes (which are 30-50% fragmented). To complicate the matter, this table is also a publisher in a transactional replication setup, with one subscriber. Also, the system needs to be up constantly, so I'm only allowed a 3-5 hour outage window per week.

So far I've tested several delete methods following all best practices (batch deletes, using indexes in the delete's WHERE clause), and have settled on deleting/committing 500 rows at a time. The problem is that it still takes 3-4 seconds to delete that many rows, on an 8 GB RAM, 4-processor machine that is not currently used or replicated.

I'm at a loss for a way to pare down the data with a delete, as the current purge script would take 7 hours a day for about 3 months. Another option I'm considering is a truncate followed by copying the data back over from the replicated database, but this has its own set of problems, i.e. network latency and slow insert times. Yet another option would be to create a replica of the table on the production db, copy the data to it, then rename the table.

Anyone have experience with purging such a massive amount of data? Any help would be greatly appreciated.
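(A sketch of that last copy-and-rename option, with hypothetical names and a hypothetical keep-criterion; replication would have to be re-initialized against the new table afterwards:)

-- Copy only the 40 million rows to keep; SELECT INTO is minimally logged
-- under the simple or bulk-logged recovery model.
SELECT *
INTO dbo.BigTable_New
FROM dbo.BigTable
WHERE CreatedDate >= '20071001'

-- Build the 15 indexes on dbo.BigTable_New here, then swap inside the window:
BEGIN TRAN
EXEC sp_rename 'dbo.BigTable', 'BigTable_Old'
EXEC sp_rename 'dbo.BigTable_New', 'BigTable'
COMMIT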


Integration Services :: Validate And Avoid Invalid Row When Import Flat File In SSIS

Sep 8, 2015

I have a flat file which has some record data, e.g.

id name team
1 "A"my" "Bl"ue"s"
2 "Bob" "Reds"
3 "Chuck" "Blues"
4 "Dick" "Blues"

In the above example the first record contains invalid data (stray embedded quotes), so the complete flat file will not import because of that one invalid row. Is there any way to detect the invalid row, ignore it (writing a log entry about the invalid record), and carry on importing the rest of the flat file?


How To Avoid For..each When There's No Data

Oct 25, 2007



Hi everyone,

As a first task I've got a Data Flow which loads a set of data into a .NET recordset.
After that, flow execution immediately goes to a For Each Loop. I'd like to avoid going in that direction when the Data Flow returns zero rows.

How can I do such a thing?

I've tried this in the Precedence Constraint Editor but it doesn't work; it doesn't recognize the EOF keyword:

@[User::ResulSet] == EOF
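(One common pattern, an assumption rather than anything from this thread: drop a Row Count transformation into the Data Flow writing to an Int32 variable, say the hypothetical @[User::RowCount], then set the precedence constraint to Expression and Constraint with:)

@[User::RowCount] > 0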


Thanks in advance for your input,

Enric


How To Avoid Duplicate Data

May 7, 2015

set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[sectionexpenses]
(@sectionname varchar(30),
@ExpensesName varchar(max),

[code]....
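(Since the procedure body is cut off above, here is a generic duplicate-proof insert sketch for inside such a procedure; the table dbo.SectionExpenses and its columns are hypothetical:)

IF NOT EXISTS (SELECT 1 FROM dbo.SectionExpenses
               WHERE SectionName = @sectionname
                 AND ExpensesName = @ExpensesName)
BEGIN
    INSERT INTO dbo.SectionExpenses (SectionName, ExpensesName)
    VALUES (@sectionname, @ExpensesName)
END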


Need To Avoid Repeated Data In A DataGrid

Apr 15, 2004

Hi there :)

I am developing a system for my uni course and I am stuck on a little problem...

Basically it's all about lecturers, students, modules etc. A student has many modules, a module has many students, a lecturer has many modules and a module has many lecturers.

I am trying to get a list of lecturers that run modules associated with a particular student. I am able to get a list of the appropriate lecturers, but some lecturers are repeated because they teach more than one module that the student is associated with.

How can I stop the repeats?

Here's my SQL select code in my .cs file:

string sqlDisplayLec = "SELECT * FROM student_module sm, lecturer_module lm, users u WHERE sm.user_id=" + myUserid + " AND lm.module_id = sm.module_id AND u.user_id = lm.user_id";
SqlCommand sqlc2 = new SqlCommand(sqlDisplayLec,sqlConnection);
sqlConnection.Open();
lecturersDG.DataSource = sqlc2.ExecuteReader(CommandBehavior.CloseConnection);
lecturersDG.DataBind();

And here is a pic of my Data Model:
Data Model Screenshot

Any ideas? Many thanks :) !
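(One likely fix, an assumption based on the query above: select DISTINCT lecturer columns only, instead of *, so the one-row-per-module repeats collapse. Explicit JOIN syntax is shown for readability, and u.username stands in for whatever name columns the users table really has:)

SELECT DISTINCT u.user_id, u.username
FROM student_module sm
JOIN lecturer_module lm ON lm.module_id = sm.module_id
JOIN users u ON u.user_id = lm.user_id
WHERE sm.user_id = @studentId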


How Can I Avoid Displaying Data When Using A Cursor?

Oct 16, 1998

I defined a stored procedure with a cursor inside for updating data.
When I call it from an MS Access client, it fails.
When I execute it directly in an ISQL/w window, it doesn't fail, but it displays the data (which is the reason it fails from MS Access).
Does somebody know how I could do it without displaying the data on the screen?
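(A guess at the cause, not confirmed by the thread: a FETCH without an INTO clause sends every fetched row to the client as a result set. Fetching into local variables, plus SET NOCOUNT ON, keeps the procedure silent. All names are hypothetical:)

SET NOCOUNT ON   -- also suppresses the 'n rows affected' messages

DECLARE @id int, @amount money

DECLARE upd_cur CURSOR FOR
    SELECT id, amount FROM dbo.SomeTable

OPEN upd_cur
FETCH NEXT FROM upd_cur INTO @id, @amount   -- INTO suppresses the row output

WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.SomeTable SET amount = @amount * 1.1 WHERE id = @id
    FETCH NEXT FROM upd_cur INTO @id, @amount
END

CLOSE upd_cur
DEALLOCATE upd_cur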


SQL Server Import And Export Wizard Fails To Import Data From A View To A Table

Feb 25, 2008

A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server.
I want to import the data the view draws into a table using the SQL Server Import and Export Wizard.
However, when I run the wizard on the server, it gives me the following error message and stops on the step Setting Source Connection:


Operation stopped...

- Initializing Data Flow Task (Success)

- Initializing Connections (Success)

- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
(SQL Server Import and Export Wizard)

Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)


- Setting Destination Connection (Stopped)

- Validating (Stopped)

- Prepare for Execute (Stopped)

- Pre-execute (Stopped)

- Executing (Stopped)

- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)

- Post-execute (Stopped)

Has anyone encountered this problem before and does anyone know what is happening?

Thanks for your kind reply.

Best regards,
Calvin Lam


Import Data From MS Access Databases To SQL Server 2000 Using The DTS Import/Export

Oct 16, 2006

I am attempting to import data from Microsoft Access databases to SQL Server 2000 using the DTS Import/Export Wizard, and I am getting a few errors.

Error at Destination for Row number 1. Errors encountered so far in this task: 1.
Insert error column 152 ('ViewMentalTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 150 ('VRptTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 147 ('ViewAppTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 144 ('VPreTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 15 ('Time', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
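(One workaround, an assumption rather than a confirmed fix: DBTYPE_DBTIMESTAMP overflows usually mean the Access column holds values outside SQL Server's datetime range. Importing the offending columns into varchar staging columns and converting afterwards lets the rest of the row through; the names below are hypothetical:)

ALTER TABLE dbo.ImportStaging ADD ViewMentalTime_dt datetime NULL
GO

UPDATE dbo.ImportStaging
SET ViewMentalTime_dt = CASE WHEN ISDATE(ViewMentalTime) = 1
                             THEN CONVERT(datetime, ViewMentalTime)
                             ELSE NULL END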

Could you please look into this and guide me? Thanks in advance.
venkatesh
imtesh@gmail.com


Avoid Cross Database Views In Data Warehouse

May 4, 2007

We have a relational database (rd) and a data warehouse (dw). The dw has a table (tw) with all key fields (dimension keys) and metric-related (measure) fields. This table is populated with monthly data each month. The tw is joined to various lookup views present in the dw to obtain name fields from rd. The DBA wants me to remove the lookup views. I can now think of the following 2 options:

1) Further de-normalize the tw and store the name fields as well. However, there are two issues with this option:

a. The size of tw will grow tremendously.
b. We are storing monthly data, and the values in the name fields may change after some time. Then we will have to put in additional views/objects to obtain the latest name.

2) Using ETL, obtain a copy of the rd tables overnight in the dw. We will then join tw with these tables and there will no longer be cross-database joins. However, this will be a burden on maintenance and support.

As of now these are the possible options I can come up with. Which one would you suggest, and why? If you have another option, please let me know.

Thank you all in advance,

sajmera


How Can We Prevent Somebody From Accessing The MDF Data Via A User Instance Connection?

Dec 11, 2006

I created a database that will be distributed to my customers. This database runs on an instance of SQL Server 2005 Express Edition. I removed the admin logins from my SQL Server instance so that, in theory, only my application, connecting via SQL Server authentication, is able to access the data (using "sa" with a password that I set at installation).

For now all this is working fine, and after some tests I haven't been able to access the data in any way except by using "sa" and the password my app alone knows.

But there is a security hole when using User Instances. Indeed, I've been able to write a program that reads the content of my MDF file: if somebody attaches it as a User Instance on his own SQL Server instance, he can reach the data.

How can I prevent this from happening? Is there a property or something that could be set on the database that would prevent the database (mdf file) from being used with a User Instance?

Thanks!


DB Engine :: How To Avoid Special Characters While Migrating Data

Jun 23, 2015

I have a source SQL 2005 server with the database collation SQL_Latin1_General_CP1_CI_AS and a destination SQL 2012 server with the same database collation.

But the SQL Server-level collation is different: SQL 2005 uses Latin1_General_CI_AI and SQL 2012 uses SQL_Latin1_General_CP1_CI_AS.

Now when I load the data from 2005 for one table into SQL 2012, I see special characters in one column that I don't see in the source database. Is there a way to avoid that, or is it something we need to fix manually?
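(A small diagnostic sketch, with hypothetical names: comparing the raw bytes, and filtering with a binary collation, shows exactly which characters changed in transit:)

SELECT TOP 20 SomeColumn,
       CAST(SomeColumn AS varbinary(200)) AS RawBytes
FROM dbo.SomeTable
WHERE SomeColumn LIKE '%[^ -~]%' COLLATE Latin1_General_BIN   -- outside printable ASCII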


How To Avoid Bulk Insert Data Conversion Error?

Aug 7, 2006

Hi, I am having a problem with BULK INSERT regarding the text file to be inserted into the database. During insertion,

if my text file has a NULL value, it gives me a bulk insert data conversion error.

For example, my text file c:\mytest.txt contains the data NULL:

123 studentname NULL



Can we let BULK INSERT detect the NULL value?

I have tried putting "KEEPNULLS", but it doesn't help, because some fields in the table may be of datetime type:

BULK INSERT [mytable] FROM 'c:\mytest.txt'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n', KEEPNULLS)
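(A note on why this happens, plus a workaround that is an assumption rather than anything from the thread: KEEPNULLS only preserves empty fields as NULL; the literal text NULL in the file is still handed to the column as a string, which fails for datetime columns. Loading into an all-varchar staging table and converting afterwards sidesteps the error. Names are hypothetical:)

CREATE TABLE #StagingRaw (id varchar(20), name varchar(100), created varchar(30))

BULK INSERT #StagingRaw FROM 'c:\mytest.txt'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n')

INSERT INTO dbo.mytable (id, name, created)
SELECT CAST(id AS int),
       name,
       CASE WHEN created = 'NULL' THEN NULL ELSE CAST(created AS datetime) END
FROM #StagingRaw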


thank you


IMPORT New Data Since Last IMPORT - DTS/Stored Procs?

Jan 7, 2004

Hello:

I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here because ultimately I will need this backend data for my frontend .aspx pages:

On a weekly basis, I need to IMPORT some data located in a remote Oracle DB into SQL Server 2000. Since there is so much data to transfer, I would like to transfer only the data that is new to the table since the last IMPORT, i.e. a week ago, and leave behind the OLD data.

Is DTS the correct way to go, or do I have more control using DTS with STORED PROCEDURES? Does anyone have any good references for me?

On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement an UPDATE TRIGGER on the first table?

Any advice will be greatly appreciated!
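(A sketch of the incremental pull, assuming a linked server named ORCL is configured against the Oracle DB and that the remote table carries a CREATED_DATE column; every name here is an assumption. DTS can run this as an Execute SQL task each week:)

DECLARE @lastImport datetime
SELECT @lastImport = MAX(CreatedDate) FROM dbo.TargetTable

INSERT INTO dbo.TargetTable (Id, Payload, CreatedDate)
SELECT ID, PAYLOAD, CREATED_DATE
FROM ORCL..SCOTT.REMOTE_TABLE   -- four-part linked-server name
WHERE CREATED_DATE > @lastImport

For the auditing copy, an UPDATE/INSERT trigger on the first table is one way; another is simply a second INSERT ... SELECT over the same @lastImport window, run right after the import.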


Massive Inserts

Oct 15, 2004

Currently I have huge amounts of data going into a table.
I'm sending an XML doc and using OPENXML with a cursor to seed them.

The question I have is whether to let duplicate-keyed rows bounce, check @@ERROR, and then do an update on the non-keyed fields,
or
to do a select on the keyed field and then do an insert or update based on the select's results.

Speed is my goal.
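(A set-based alternative to both, an assumption beyond the post: shred the XML once into a temp table, then one UPDATE and one INSERT touch the target exactly twice in total, with no per-row cursor and no error-trapping on key bounces. Table and column names are hypothetical:)

DECLARE @hdoc int, @xmlDoc varchar(8000)
SET @xmlDoc = '<rows><row KeyCol="1" ValueCol="abc"/></rows>'   -- sample input

EXEC sp_xml_preparedocument @hdoc OUTPUT, @xmlDoc

SELECT KeyCol, ValueCol
INTO #Incoming
FROM OPENXML(@hdoc, '/rows/row', 1)
WITH (KeyCol int, ValueCol varchar(100))

EXEC sp_xml_removedocument @hdoc

UPDATE t SET t.ValueCol = i.ValueCol
FROM dbo.Target t JOIN #Incoming i ON i.KeyCol = t.KeyCol

INSERT INTO dbo.Target (KeyCol, ValueCol)
SELECT i.KeyCol, i.ValueCol
FROM #Incoming i
WHERE NOT EXISTS (SELECT 1 FROM dbo.Target t WHERE t.KeyCol = i.KeyCol)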


Massive .bak File

Feb 6, 2007

Rather than posting twice, I thought I would put both issues I'm having in one. Our server is Windows Server 2003 and we're running SQL Server 2005.

The first issue is this: We have several databases and I have scheduled their backups to run nightly, which works just fine. A couple weeks ago, one database's .bak file grew from about 500MB to 2GB overnight. Then, just a few days ago, it went from 2GB to 3.5GB. There is nothing unusual going on in the live db that would warrant such an increase in the .bak file. All the dbs are in the same backup job schedule, but this is the only one affected. Additionally, I had autogrowth enabled on all the dbs but today disabled it for this particular db. Any ideas?

The second issue is my tempdb.mdf file on my C drive. It will go from just a few hundred KB to 4.5GB overnight, consuming most of what is left on my C drive. I'm afraid I'm in for a system crash if it continues. I have to stop SQL Server and restart it to clear the size. Is there a way to move the tempdb.mdf file to my F drive?
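(Moving tempdb is standard: repoint the files and restart the service, and tempdb is rebuilt at the new location on startup. The F:\ path is hypothetical:)

ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, FILENAME = 'F:\SQLData\tempdb.mdf')

ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, FILENAME = 'F:\SQLData\templog.ldf')

-- Then restart the SQL Server service for the change to take effect.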

I don't know if these two issues are related or not but certainly would like to hear from someone.

Sorry, in advance, for the large post.

Dave


Massive Delete In DB

Sep 6, 2005

Hello,

I have a huge database (2 GB / month) and after a while it is becoming non-operational (time-outs, etc.). So I have written an SQL sentence (delete) that can reduce around 60% of the db size without compromising the application data needs. The problem is that when I execute it, the db does reduce its size 60%, but the transaction log increases at the same rate. Can I execute the sentence in a "commit" or "transaction" mode so as to stop SQL Server writing to the log?

Thanks for the help!
Antonio
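(You cannot stop SQL Server logging a DELETE, but chunking keeps the log small. A sketch for a SQL 2000-era server, with hypothetical names:)

SET ROWCOUNT 50000   -- limits each DELETE to 50,000 rows
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable WHERE EventDate < '20050101'
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG MyDb WITH TRUNCATE_ONLY   -- discard inactive log records between chunks
END
SET ROWCOUNT 0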


SQL 7.0 Massive Row Locking Performance

Mar 3, 2000

When updating large sets a row at a time, performance is lacking in comparison to 6.5. When using PeopleSoft, which uses cursors with a BEGIN TRANSACTION, a loop inside, and a COMMIT after the loop completes, SQL 6.5 with page locking could handle a 300,000-row transaction in 3-4 hours; 7.0 took 17.5 hours. The difference is that 6.5 used 50,000 locks and 7.0 used 300,000 locks.

Does anybody have a solution short of rewriting PeopleSoft?


Massive TRN File, But Small DB

Apr 5, 2006

Hi Everyone,

We have a large and active MSSQL 2000 database. Recently, after a rebuild of the server, we had a problem with the SQL service SQLSERVERAGENT. The service could not start, as the service account had lost local permission to the registry. During this time, all of the data being sent to the database from our application accumulated in the database .ldf file. By the time we were able to get the service restarted, our .ldf file was approx. 28 gigs. When the service restarted, the .ldf file shrank down to its regular size, about 40 megs, and the .trn log file grew to 28 gigs for that specific period (new file every hour).

The problem is, the database file (database.mdf) stayed about the same as it was before the service was restarted. When the .ldf transferred to the .trn, none of the 28 gigs of data got stored in the database. What does this mean? Perhaps with the service stopped, the application using the db saw problems and did not commit the data, making it all useless? Or is it possible that the data in the .trn log just needs to be forced to commit to the .mdf?

Is there any way to verify the data in the 28 gig .trn file and figure out if we should get it stored to the database? If yes, how would we go about verifying it, and after that how would we force it to commit to the .mdf file? Am I on the right track here, or is it not as I see it?
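(A starting point for the verification, with a hypothetical path: RESTORE HEADERONLY reads the backup header, including the log sequence numbers, without touching the database:)

RESTORE HEADERONLY FROM DISK = 'D:\Backups\MyDb_suspect.trn'

-- If the chain is intact, the log could be applied to a restored copy:
-- RESTORE DATABASE MyDbCopy FROM DISK = 'D:\Backups\MyDb_full.bak' WITH NORECOVERY
-- RESTORE LOG MyDbCopy FROM DISK = 'D:\Backups\MyDb_suspect.trn' WITH RECOVERY

One caveat, though: if the writes never committed because the application saw errors while the Agent was down, they will roll back during recovery rather than appear in the .mdf.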

Thanks!
Mike


General Question About Massive SP Use

Jul 2, 2006

Hi

Would you say that it's OK for a web site's code to make ALL of its database access through SPs and views? And I mean everything, including inserting new records and updating others, with no SQL in the code.

The advantage would be very strict control over access, but to achieve this it would take many, many SPs and views to cover all types of actions. Can you think of a disadvantage besides all the work of creating those SPs? What about server resources and performance? How demanding would it be?

Thanks,
Inon.


Execute Massive SQL Statement

Jan 3, 2006

Hello,

I thought this was a neat solution I came up with, but I'm sure it's
been thought of before. Anyway, it's my first post here.

We have a process for importing data which generates a SELECT statement
based on the user's stored configuration. Since the resulting SELECT statement
can be massive, it's created and stored in a text field in a temp table.

So how do I run this huge query after creating it? In my tests, I was
getting a datalength > 20000, requiring 3 varchar(8000) variables in
order to use the execute command. Thing is, I don't know how big it could
possibly get; I want to be able to execute it regardless.

Here's what I came up with, it's very simple:

Table is named #IMPORTQUERY, one field SQLTEXT of type TEXT.


>>
declare @x int, @s varchar(8000)

-- @x = number of 8000-character chunks; @s starts out as an empty execute()
select @x = datalength(sqltext) / 8000 + 1, @s = 'execute('''')' from #importquery

-- Work backwards through the chunks: each pass wraps @s in code that
-- declares one chunk variable, fills it from the text column, and adds it
-- to the execute() concatenation (assignment to varchar(8000) truncates
-- the over-long substring to exactly 8000 characters)
while @x > 0
select @s = 'declare @s' + cast(@x as varchar) + ' varchar(8000) ' +
'select @s' + cast(@x as varchar) +
'=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery ' +
replace(@s,'execute(','execute(@s' + cast(@x as varchar) + '+')
, @x = @x - 1

set @s = 'declare @x int set @x=1 ' + @s

execute(@s)
<<

At the end, I execute the "@s" variable which is SQL that builds and
executes the massive query. Here's what @s looks like at the end:

>>
declare @x int set @x=1
declare @s1 varchar(8000)
select @s1=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s2 varchar(8000)
select @s2=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
declare @s3 varchar(8000)
select @s3=substring(sqltext,@x,@x+8000),@x=@x+8000 from #importquery
execute(@s1+@s2+@s3+'')
<<
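(A side note beyond the original post: on SQL Server 2005 or later this whole dance is unnecessary, since varchar(max) removes the 8000-character ceiling:

declare @sql varchar(max)
select @sql = cast(sqltext as varchar(max)) from #importquery
execute(@sql)
)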


Massive Amounts Of Reading

Jul 23, 2005

Our database server has started acting weird, and at this point I'm either too sleep-deprived or too close to the problem to adequately diagnose the issue. Basically, to put it simply: when I look at the read disk queue length, the disk queues are astronomical.

Normally we're seeing a disk queue length of 0-1 on the disks that contain the DB data and index (i.e. non-clustered indexes are on a disk of their own). Writes are just fine.

Problem is, all our databases are on the same drive, and I can't seem to nail down which DB, let alone which table, is the source of all our reads.

Now, to really make things weirder: during the busier times of the day today (say 1:00 PM to 4:00 PM) things were fine. At 4:20 PM or so it was like someone hit a switch, and read disk queue length jumped from 0-1 up to 100-200+, with spikes up to 1500 for a split second or so.

What's the best way folks know to nail this down?

Thanks.
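(One way to localize it on a SQL 2000-era server, with a hypothetical database name: the per-file virtual I/O counters show which database file is absorbing the reads, which at least narrows it to a database:)

DECLARE @db int
SET @db = DB_ID('MyBusyDb')

SELECT DbId, FileId, NumberReads, BytesRead, IoStallMS
FROM ::fn_virtualfilestats(@db, -1)

From there, sysprocesses (physical_io per spid) or a Profiler trace filtered on Reads points at the offending queries.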


Massive Slowdown With Query

Dec 29, 2007

If I remove the TOP 200, this query returns about 2.5 million rows. It combines a lot of records and turns them into much more programmer-friendly results. The query has slowed down from 2 seconds to about 13 seconds as the data has grown from about 10k rows to the current couple of million.



Code Block

SELECT TOP 200 *
FROM
(
SELECT
[UserProfile].[UserId]
,[aspnet_Users].[UserName]
,[City]
,[State]
,[RoleName]
,[ProfileItemType].[Name] AS pt_name
,[ProfileItem].[Value]
FROM
[UserCriteria]
,[aspnet_Users]
,[aspnet_Roles]
,[aspnet_UsersInRoles]
,[Location]
,[ProfileType]
,[ProfileTypeItem]
,[ProfileItem]
INNER JOIN [UserProfile]
ON [ProfileItem].[ProfileId] = [UserProfile].[ProfileId]
INNER JOIN [ProfileItemType]
ON [ProfileItem].[ProfileItemTypeId] = [ProfileItemType].[ProfileItemTypeId]
WHERE [UserProfile].[UserId] IN (
SELECT [UserCriteria].[UserId]
FROM [UserCriteria]
WHERE
Zipcode IN (
SELECT [Zipcode]
FROM [ZipcodeProximitySQR] ('89108' , 150))
)

AND [UserProfile].[UserId] = [aspnet_Users].[UserId]
AND [UserCriteria].[UserId] = [UserProfile].[UserId]
AND [Location].[Zipcode] = [UserCriteria].[Zipcode]
AND [aspnet_UsersInRoles].[UserId] = [aspnet_Users].[UserId]
AND [aspnet_UsersInRoles].[RoleId] = [aspnet_Roles].[RoleId]
) AS t
PIVOT
(
MIN([Value])
FOR pt_name IN ([field1],[field2],[field3],[field4])
) AS pvt
ORDER BY RoleName DESC, NEWID()





In the line FOR pt_name IN ([field1],[field2],[field3],[field4]) I changed the values from the long names to read field1, field2... because the real names were irrelevant here but confusing.

Here is the showplan text



Code Block
|--Sequence
|--Table-valued function(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
|--Top(TOP EXPRESSION:((200)))
|--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId], [aous].[dbo].[aspnet_Users].[UserName], [aous].[dbo].[Location].[City], [aous].[dbo].[Location].[State], [aous].[dbo].[UserCriteria].[Birthdate], [aous].[dbo].[aspnet_Roles].[RoleName]) DEFINE:([Expr1039]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'height' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1040]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'bodyType' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1041]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'hairColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END), [Expr1042]=MIN(CASE WHEN [aous].[dbo].[ProfileItemType].[Name]=N'eyeColor' THEN [aous].[dbo].[ProfileItem].[Value] ELSE NULL END)))
|--Nested Loops(Inner Join)
|--Nested Loops(Inner Join)
| |--Sort(ORDER BY:([aous].[dbo].[UserCriteria].[UserId] ASC, [aous].[dbo].[Location].[City] ASC, [aous].[dbo].[Location].[State] ASC, [aous].[dbo].[UserCriteria].[Birthdate] ASC, [aous].[dbo].[aspnet_Roles].[RoleName] ASC))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserCriteria].[Zipcode])=([Expr1043]), RESIDUAL:([Expr1043]=[aous].[dbo].[UserCriteria].[Zipcode]))
| | |--Hash Match(Inner Join, HASH:([aous].[dbo].[ProfileItemType].[ProfileItemTypeId])=([aous].[dbo].[ProfileItem].[ProfileItemTypeId]))
| | | |--Index Scan(OBJECT:([aous].[dbo].[ProfileItemType].[ProfileTypes]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[ProfileId]))
| | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserProfile].[UserId]))
| | | | | |--Hash Match(Inner Join, HASH:([aous].[dbo].[UserProfile].[UserId])=([aous].[dbo].[aspnet_UsersInRoles].[UserId]), RESIDUAL:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[aspnet_UsersInRoles].[UserId]))
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | |--Stream Aggregate(GROUP BY:([aous].[dbo].[UserCriteria].[UserId]))
| | | | | | | | |--Nested Loops(Left Semi Join, WHERE:([aous].[dbo].[UserCriteria].[Zipcode]=[Expr1044]))
| | | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId] < {guid'E3D72D56-731A-410E-BCB1-07A87A312137'} OR [aous].[dbo].[UserCriteria].[UserId] > {guid'E3D72D56-731A-410E-BCB1-07A87A312137'}), WHERE:([aous].[dbo].[UserCriteria].[Male]=(1) AND [aous].[dbo].[UserCriteria].[SeekingMale]=(0)) ORDERED FORWARD)
| | | | | | | | |--Compute Scalar(DEFINE:([Expr1044]=CONVERT_IMPLICIT(nvarchar(5),[aous].[dbo].[ZipcodeProximitySQR].[Zipcode],0)))
| | | | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[ZipcodeProximitySQR].[PK__ZipcodeProximity__5E54FF49]))
| | | | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserProfile].[UserProfileIds]), SEEK:([aous].[dbo].[UserProfile].[UserId]=[aous].[dbo].[UserCriteria].[UserId]) ORDERED FORWARD)
| | | | | | |--Nested Loops(Inner Join, OUTER REFERENCES:([aous].[dbo].[aspnet_Roles].[RoleId]))
| | | | | | |--Clustered Index Scan(OBJECT:([aous].[dbo].[aspnet_Roles].[aspnet_Roles_index1]))
| | | | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_UsersInRoles].[aspnet_UsersInRoles_index]), SEEK:([aous].[dbo].[aspnet_UsersInRoles].[RoleId]=[aous].[dbo].[aspnet_Roles].[RoleId]) ORDERED FORWARD)
| | | | | |--Clustered Index Seek(OBJECT:([aous].[dbo].[UserCriteria].[UserCriteria]), SEEK:([aous].[dbo].[UserCriteria].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | | |--Index Seek(OBJECT:([aous].[dbo].[aspnet_Users].[_dta_index_aspnet_Users_5_37575172__K2_K1_K4_3]), SEEK:([aous].[dbo].[aspnet_Users].[UserId]=[aous].[dbo].[UserProfile].[UserId]) ORDERED FORWARD)
| | | |--Index Seek(OBJECT:([aous].[dbo].[ProfileItem].[_dta_index_ProfileItem_5_1714105147__K2_K1_K3_4]), SEEK:([aous].[dbo].[ProfileItem].[ProfileId]=[aous].[dbo].[UserProfile].[ProfileId]) ORDERED FORWARD)
| | |--Compute Scalar(DEFINE:([Expr1043]=CONVERT_IMPLICIT(nchar(5),[aous].[dbo].[Location].[Zipcode],0)))
| | |--Index Scan(OBJECT:([aous].[dbo].[Location].[CityLocation]))
| |--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileType].[PKProfileTypeProfileTypeId]))
|--Clustered Index Scan(OBJECT:([aous].[dbo].[ProfileTypeItem].[ProfileTypeItem]))




Here is a link to the execution plan from Microsoft SQL Server management Studio.
http://epi.cc/BasicUserSearch.zip

There are no table scans, but the Hash Match from the inner join is pretty bad.

Can anyone give me a pointer or two?


SSIS Using MASSIVE Amounts Of Memory

Feb 8, 2008

Hi,

I have a series of SSIS packages, all of which are ultimately executed by a parent package.

I'm consistently getting "OutOfMemory" errors when working with the packages. This is temporarily solved by closing Visual Studio and re-opening the package(s). The fix is short-lived, however, as the OutOfMemory error recurs quite quickly after re-opening, often after doing nothing other than altering a variable's default value and attempting to save the package.

The average size of the packages in question (.dtsx files) is around 7,000KB, with the largest being 12,500KB. The total size of all the solution's packages is ~75,000KB.

The Processes tab in Task Manager shows a Mem Usage counter for devenv.exe *32 of around 20,000KB when Visual Studio is first opened. However, when a single ~6,000KB dtsx file is opened this counter jumps to over 300,000KB, and when the entire solution is opened (when the parent package is executed), the Mem Usage counter for devenv.exe *32 is a massive 800,000KB+!

Is this normal SSIS behaviour or do I have a major problem? Any tips or suggestions as to how to resolve this issue would be gratefully received.

FYI, "SELECT @@VERSION" gives me "Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2) "

My Server is Windows Server 2003 R2 Enterprise x64 SP2 with 8GB of RAM.

Thanks in advance.

Leigh.


Delete Records From A Massive Table (heap)

May 21, 2013

So I've stumbled across an audit table on one of our systems that has reached a hearty 180M rows in size.

The table is a heap (no indexes whatsoever).

Each record has a datetime value indicating when it was created.

I need to delete everything that was created prior to the last 6 months; what is my best plan of attack?
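(A sketch of one common plan of attack, with assumed table and column names: when most of a heap is going away, it is usually cheaper to keep the survivors than to delete the victims:)

SELECT *
INTO dbo.AuditLog_Keep
FROM dbo.AuditLog
WHERE CreatedAt >= DATEADD(month, -6, GETDATE())

TRUNCATE TABLE dbo.AuditLog   -- minimally logged, near-instant

INSERT INTO dbo.AuditLog WITH (TABLOCK)   -- TABLOCK allows minimal logging into a heap
SELECT * FROM dbo.AuditLog_Keep

DROP TABLE dbo.AuditLog_Keep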


Massive UPDATE And SELECT TOP 1 QUERIES, Slowing Down...

Apr 10, 2007

Background

SQL Server 2005 Standard 9.0.1399 64-bit
Windows 2003 64-bit
8GB RAM
RAID-1 70GB HD 15K SCSI (log files, OS)
RAID-10 1.08TB HD 10K SCSI (data files)
Runs approximately _Total 800 transactions/second
We deliver approximately 70-80 million ad views/day

8 clustered Windows 2003 32-bit OS IIS servers running ASP.NET 2.0 websites, all 8 talking to the one SQL server via a private network (server backbone).

In SQL Server Profiler, I see the following SQL statements with durations of 2000-7000:

select top 1 keywordID, keyword, hits, photo, feed from dbo.XXXX where hits > 0 order by hits

and

UPDATE XXXX SET hits=1906342 WHERE keywordID = 7;

where the hits number is incremented by one each time it is selected for that keyword ID.

Sometimes these happen so frequently that the server stops accepting new connections, and I have to restart the SQL server or reboot.

Any ideas on why this is happening?

Regards,

Joe
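(One direction worth checking, an assumption rather than anything from the thread: the SELECT TOP 1 ... ORDER BY hits has no index on hits to support it, so every call scans, and the read-then-write increment pattern piles up blocking on hot keyword rows. A covering index plus an atomic increment would address both; the column names come from the Profiler output above, the index name is hypothetical:)

CREATE INDEX IX_XXXX_hits ON dbo.XXXX (hits) INCLUDE (keywordID, keyword, photo, feed)

-- Increment in place instead of reading the value and writing it back,
-- so two concurrent callers cannot compute the same new total.
UPDATE dbo.XXXX SET hits = hits + 1 WHERE keywordID = 7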








ASAP Help Needed: Need SQL Guru To Help With Massive Script Issue

May 3, 2007

I need some help. I have this massive SQL script; the problem is that I tried to put it into the query string box in my SQL reports and it will not take it. The script will run if I break it up, but I think it is too large. Is there a SQL guru out there who can show me how to reduce the size of this script, maybe by using an output parameter to a stored procedure? I just don't know what to do, and I need to produce the report from this script. Below is the entire script.
SELECT  'Prior Year All ' as 'qtr', COUNT(JOB.JOBID) AS 'transcount',  COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',  SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',  SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',  AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',  SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',  SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',  JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                         PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                         JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE AS Expr1,                         INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
            (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS  'CompletedSuccessfullyItems',
             (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',                                                  (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with Situation') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithSituationItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Not Completed') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'NotCompletedItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled Prior to service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledPriorToServiceItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Cancelled During Service') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CancelledDuringServiceItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'AwaitingforcompletionItems',
                        (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Pending for review') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like'%T ')) AS 'PendingforreviewItems'
FROM JOB                   INNER JOIN INVOICE_AR                                  ON JOB.JOBID = INVOICE_AR.JOBID                   LEFT OUTER JOIN PAYER                                 ON PAYER.PAYERID = JOB.PAYERID                  LEFT OUTER JOIN STATES                                 ON JOB.JURISDICTION = STATES.INITIALS                LEFT OUTER JOIN PATIENT                                ON PATIENT.PATIENTID = JOB.PATIENTID                LEFT OUTER JOIN JOB_OUTCOME                                ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                LEFT OUTER JOIN SERVICE_TYPE                                ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME               LEFT OUTER JOIN PAT_SERVICES_HISTORY                                ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID
WHERE                 (INVOICE_AR.AMOUNT_DUE > 0)AND                 (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (year,0,@startate) and DATEADD(year,0,@endate)) AND                 (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12))AND                (PAYER.PAY_GROUPNAME like '%' + @Company + '%')AND                (INVOICE_AR.INVOICE_NO like '%T')  
GROUP BY                         JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                        PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                        JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE,                        INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE
UNION ALL
SELECT  'Current Year 2007 All ' as 'qtr', COUNT(JOB.JOBID) AS 'transcount',  COUNT(DISTINCT JOB.PATIENTID) AS 'patientcount',  SUM(JOB.TRANSPORTATION_TCOST) AS 'tcost',  SUM(JOB.TRANSPORTATION_DISC_COST) AS 'dtcost',  AVG(JOB.TRANSPORTATION_DISC) AS 'avgTDisc',  SUM(JOB.TRANSPORTATION_TCOST) + SUM(JOB.TRANSPORTATION_DISC_COST) AS 'TGrossAmtBilled',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(DISTINCT JOB.PATIENTID) AS 'PatAvgT',  SUM(JOB.TRANSPORTATION_DISC) AS 'avgPercentDiscT',  SUM(JOB.TRANSPORTATION_TCOST) / COUNT(JOB.JOBID) AS 'RefAvgT',  JOB.JURISDICTION,                        PAYER.PAY_GROUPNAME,                         PAYER.PAY_COMPANY,                         PAYER.PAY_CITY,                         PAYER.PAY_STATE,                         PAYER.PAY_SALES_STAFF_ID,                         JOB.PATIENTID,                         JOB.INVOICE_DATE,                        JOB.JOBOUTCOMEID,                        JOB.SERVICEOUTCOME,                        INVOICE_AR.INVOICE_NO,                         INVOICE_AR.INVOICE_DATE AS Expr1,                         INVOICE_AR.AMOUNT_DUE,                        INVOICE_AR.CLAIMNUMBER,                        PATIENT.LASTNAME,                        PATIENT.FIRSTNAME,                        PATIENT.EMPLOYERNAME,                        JOB_OUTCOME.DESCRIPTION,                        SERVICE_TYPE.DESCRIPTION,                        PAT_SERVICES_HISTORY.TRANSPORT_TYPE,
            (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed Successfully') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startDate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS  'CompletedSuccessfullyItems',
             (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with complaint') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithComplaintItems',                                                  (SELECT COUNT(JOB.JOBOUTCOMEID)                         FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Show') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoShowItems',
                         (SELECT COUNT(JOB.JOBOUTCOMEID)                          FROM JOB                                   INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID                                   LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID                                   LEFT OUTER JOIN STATES ON JOB.JURISDICTION = STATES.INITIALS                                  LEFT OUTER JOIN PATIENT ON PATIENT.PATIENTID = JOB.PATIENTID                                  LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID                                  LEFT OUTER JOIN SERVICE_TYPE ON SERVICE_TYPE.DESCRIPTION = JOB.SERVICEOUTCOME                                  LEFT OUTER JOIN PAT_SERVICES_HISTORY ON PAT_SERVICES_HISTORY.PATIENTID = JOB.PATIENTID                         WHERE (JOB_OUTCOME.DESCRIPTION = 'Completed with No Charge') AND (INVOICE_AR.AMOUNT_DUE > 0) AND                                      (INVOICE_AR.INVOICE_DATE BETWEEN DATEADD (@startdate) and DATEADD(@enddate)) AND                                     (MONTH(INVOICE_AR.INVOICE_DATE) in (1,2,3,4,5,6,7,8,9,10,11,12)) AND                                     (PAYER.PAY_GROUPNAME like '%' + @Company + '%') AND                                     (INVOICE_AR.INVOICE_NO like '%T')) AS 'CompletedWithNoChargeItems',
[code]....
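(One way to shrink a script like this, an assumption rather than anything from the thread: the near-identical subqueries differ only in the JOB_OUTCOME.DESCRIPTION literal, so conditional aggregation can compute every outcome count in a single pass, and wrapping the whole report in a stored procedure leaves only an EXEC for the report's query box. A condensed sketch, reusing the script's table and parameter names; the procedure itself is hypothetical:)

CREATE PROCEDURE dbo.usp_OutcomeCounts
    @startate datetime, @endate datetime, @Company varchar(100)
AS
SELECT
    COUNT(JOB.JOBID) AS transcount,
    SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Completed Successfully'
             THEN 1 ELSE 0 END) AS CompletedSuccessfullyItems,
    SUM(CASE WHEN JOB_OUTCOME.DESCRIPTION = 'Completed with complaint'
             THEN 1 ELSE 0 END) AS CompletedWithComplaintItems
    -- ...one SUM(CASE ...) per outcome, replacing each correlated subquery
FROM JOB
     INNER JOIN INVOICE_AR ON JOB.JOBID = INVOICE_AR.JOBID
     LEFT OUTER JOIN PAYER ON PAYER.PAYERID = JOB.PAYERID
     LEFT OUTER JOIN JOB_OUTCOME ON JOB_OUTCOME.JOB_OUTCOME_ID = JOB.JOBOUTCOMEID
WHERE INVOICE_AR.AMOUNT_DUE > 0
  AND INVOICE_AR.INVOICE_DATE BETWEEN @startate AND @endate
  AND PAYER.PAY_GROUPNAME LIKE '%' + @Company + '%'
  AND INVOICE_AR.INVOICE_NO LIKE '%T'

The report's query box then only needs: EXEC dbo.usp_OutcomeCounts @startate, @endate, @Company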


Import Data From Excel-Sheet Via OleDb In VB.Net - How To Get A Columns Data As String?

Oct 25, 2007

Hello,

I want to import data from an Excel sheet into a database. While reading from the Excel sheet, OleDb automatically guesses the datatype of each column. My problem is the first column (A), which contains ~240 lines: 210 lines are numbers, while the last 30 contain strings. When I use this code:







Code Block

Dim sConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & conf_path_current & file_to_import & ";Extended Properties=""Excel 8.0;HDR=NO"""
Dim oConn As New OleDb.OleDbConnection(sConn)
Dim cmd1 As New System.Data.OleDb.OleDbCommand("Select * From [Table$]", oConn)
oConn.Open() ' the connection must be opened before ExecuteReader
Dim rdr As OleDb.OleDbDataReader = cmd1.ExecuteReader()
Do While rdr.Read()
    Console.WriteLine(rdr.Item(0)) 'or rdr(0).ToString()
Loop




it reads fine until the string lines arrive. When using Item(0) it crashes trying to convert a DBNull to a String; when using rdr(0).ToString() it just gives me no value.

So my question is: how do I tell OleDb that I want that column to be read entirely as String/Varchar?
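(The usual answer, offered as an assumption rather than tested against this workbook: add IMEX=1 to the extended properties, which makes the Jet driver return mixed-type columns as text; how many rows it samples to decide is controlled by the TypeGuessRows registry value:)

Dim sConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
    "Data Source=" & conf_path_current & file_to_import & ";" & _
    "Extended Properties=""Excel 8.0;HDR=NO;IMEX=1"""

Everything else in the reading loop stays the same.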

Thanks for Reading

- Pierre from Berlin


[seems i got redirected into the wrong forum, please move into the correct one]


Exported Flat File Data Will Not Import To Same Table Without Extensive Data-type Manipulation

Jul 13, 2007

I'm moving data between identical tables and have to use a flat file as an intermediary. I thought: "No problem, SSIS can do a quick export to a file, then move the file to another server, then use SSIS to import the data to the new server."



Seems simple, right?



I'm hitting all sorts of surprising data conversion errors. I used the export wizard to create the export package. This works fine. However, using the same flat file definition, the import package fails -- even when I have no destination. That is, I have just one data flow task that contains only one component: the Flat File source. When I run the package, the flat file source fails with data type conversion and truncation errors. One of the obvious errors concerns boolean types: the SQL field is a bit, SSIS defined the column as DT_BOOL, and the output data are the literal text values "TRUE" and "FALSE". So SSIS converts a SQL datatype of bit to "TRUE" and "FALSE" on export, but can't make the reverse conversion on import?



Does anyone else find this surprising? I would expect that what SSIS exports, it can import given all the same table and flat file definitions. Is SSIS the wrong tool to do such simple bulk copies? I'd like to avoid using BCP because this process will need to run automatically within SQL Agent so we can leverage all the error tracking and system monitoring.
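(One workaround, an assumption: edit the flat file connection manager so the column is DT_STR rather than DT_BOOL, then convert it back with a Derived Column transform using an expression along these lines:)

[BitColumn] == "TRUE" ? (DT_BOOL)1 : (DT_BOOL)0

That accepts the literal TRUE/FALSE text the export wrote and hands the destination a proper boolean.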




How To Optimize Data Import With Huge Volumes And Joins Across Data Sources Not All SQL Server Based?

Jun 7, 2006

I need to periodically import a (HUGE) table of data from an external data source (not SQL Server) into SQL Server, with the following scenarios:

1. Some of the records in the external data source may not exist in SQL Server.
2. Some of the records may have a different value at each import, but these records are identified uniquely by the same primary key in the external data source and in SQL Server.
3. Some of the records may be unchanged in SQL Server.

Due to the massive volume of the import, I would like to import only the records which are different from what I have in SQL Server (cases 1 and 2 above). In fact, case 2 is the most critical.

I thought of making a query with a left outer join between the data in the external data source table (SOURCE) and the data in the SQL Server table (DESTIN). The join is done on the respective primary keys (composite keys of up to 10 columns) and one of the WHERE conditions will be that the value in SOURCE is different from the value in DESTIN.

The result of this query would be exactly what I need to import.
How do I do this in SSIS? I couldn't figure out how to join tables in different data sources yet.

In fact I cannot write a stored procedure to do it, since one of the sources is a data source that is not SQL Server.
I have seen the Lookup transformation in this article http://www.sqlis.com/default.aspx?311 but this is not exactly what I want to do.
Another possibility is to use the Merge Join, but due to the sorting I believe its performance would be terrible!
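(For completeness, a sketch of the comparison query once the external rows have been staged on the SQL Server side, which an extra ETL step would do first; the SOURCE/DESTIN names follow the placeholders above and the key and value columns are hypothetical:)

SELECT s.*
FROM dbo.SOURCE s
LEFT OUTER JOIN dbo.DESTIN d
       ON d.Key1 = s.Key1 AND d.Key2 = s.Key2   -- ...up to the 10 key columns
WHERE d.Key1 IS NULL          -- case 1: record not in SQL Server yet
   OR d.Value <> s.Value      -- case 2: present but changed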

Thanks in advance for your suggestions!







