Using Multiple Smaller SPs And Functions Better Than Using 1 Big SP?

Dec 9, 2005

Is it generally or almost always better to have multiple
small SPs and functions to return a result set instead of
using a single big 1000+ line SP?

For example, I have one SP that is 1000+ lines. From an early
analysis of the SP I see it first has 3 big blocks of code
separated by IF statements. Then within each IF block
of code I see 3-4 UNIONs. Since they are UNIONs, they
are all returning the same columns, so I am
guessing these are prime candidates for becoming
individual functions or SPs, maybe even dynamic SPs.

Obviously I am not showing you the code, but am I
right to think this way? This same SP has about 15 JOINs,
including some LEFT JOINs and one LEFT JOIN to a derived
table (a SELECT statement), and almost all the tables
referenced by these JOINs have thousands of records,
very possibly hundreds of thousands.

The SELECT statement is returning 30-40 columns from
a lot of these tables, plus I also see a lot of CASE...ELSE
expressions within the main SELECT statement. Each CASE
branch calls a function. As an example,
if the CASE is for EmployeeID then a function is called
to get the EmployeeID's FirstName and LastName. If the CASE
is for CustomerID then another function is called to get
the Customer Name.

I am thinking of cutting this big SP into many smaller SPs and/or functions,
and I also plan on using table variable(s) to hold intermediate results
while I continue processing the records from the table variable
with other code logic.

Also, I want to leave the conversion of the "machine result",
i.e. EmployeeID or CustomerID, into the "human readable result",
i.e. Employee FirstName and LastName or Customer Name,
as the very last step.
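
In other words, something roughly like this (a sketch only, using Northwind's Orders, Employees and Customers tables to illustrate the pattern; the date filter is just a placeholder):

DECLARE @Result TABLE (OrderID int, EmployeeID int, CustomerID nchar(5), OrderDate datetime)

-- Step 1: a smaller, focused query fills the table variable with the "machine" values only
INSERT INTO @Result (OrderID, EmployeeID, CustomerID, OrderDate)
SELECT OrderID, EmployeeID, CustomerID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '19970101'

-- Step 2: continue processing the records in @Result with the other code logic here

-- Step 3: only at the very end, translate the IDs into the human readable names
SELECT r.OrderID,
       e.FirstName + ' ' + e.LastName AS EmployeeName,
       c.CompanyName AS CustomerName,
       r.OrderDate
FROM @Result r
LEFT JOIN dbo.Employees e ON e.EmployeeID = r.EmployeeID
LEFT JOIN dbo.Customers c ON c.CustomerID = r.CustomerID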

I am trying to test this on Northwind's Employees table,
but Statistics IO, Statistics Time and the Execution Plan are
tools I have only started to use, so I am unable to draw a
conclusion about which method is better. I'll work on another
post specifically with the details of the test I am currently doing.
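
For reference, the measurements I mean are the ones turned on like this before running each version:

SET STATISTICS IO ON
SET STATISTICS TIME ON
-- run the query or EXEC the SP being compared here
SET STATISTICS TIME OFF
SET STATISTICS IO OFF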

My opinion is that having 1 single SP with 15+ JOINs causes
a lot more locking than if I ran smaller SPs, stored the
results in table variables, and continued processing the
remaining code logic from there.

I would like to know what you think, and whether I am right or wrong
about how I want to optimize this SP.


Thank you

View 5 Replies



Mdf File Becomes Smaller And Smaller After Shrinking Db

Jul 20, 2005

Hi all: I restored one backup database (7.9 GB mdf) on two different servers. I shrunk them by clicking "Move pages to beginning of file before shrinking". After shrinking, one mdf file is 6.7 GB, and the other is 4.2 GB. I shrunk again and again:
1. the 6.7 GB became 5.9 GB, 5.2 GB, 4.7 GB and 4.2 GB (four times)
2. the 4.2 GB became 4.0 GB (just one more time)
It is weird; I am wondering whether the mdf will keep getting smaller and smaller if I continue to shrink them? What is the reason?
Thanks
WJ

View 1 Replies View Related

Tables In Multiple Servers

Jul 20, 2005

What is the best way to use a table located on 1 server from another server? We have an application that needs to use data from 2 separate servers. TIA. TS
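
Is something like a linked server with four-part names the usual approach? A sketch of what I mean (the server, database and table names are made up, and a login mapping via sp_addlinkedsrvlogin may also be needed):

EXEC sp_addlinkedserver @server = N'REMOTESRV', @srvproduct = N'SQL Server'

SELECT t.*
FROM REMOTESRV.RemoteDb.dbo.SomeTable AS t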

View 2 Replies View Related

When Then Multiple Column Update 1 Statement

Apr 8, 2008

I have about 5 statements like the update below; depending on the PID, a different column will be updated ("C2005", "G2005", "E2005", ...).

I would like to use 1 update statement instead of 5 to update all the columns. Below are 2 of the original update statements and my attempt at a WHEN/THEN update. Note that a different column is updated depending on the PID.

If WHEN/THEN isn't possible, any other suggestions are welcome. Thanks
UPDATE #Sec
SET C2005 = Pos.USD / 1000
FROM #Sec INNER JOIN
Pos ON #Sec.ID = Pos.ID
WHERE (Pos.PID = 'B')

UPDATE #Sec
SET G2005 = Pos.USD / 1000
FROM #Sec INNER JOIN
Pos ON #Sec.ID = Pos.ID
WHERE (Pos.PID = 'G')

UPDATE #Sec
WHEN (Pos.PID = 'C') THEN SET C2005 = Pos.USD / 1000 end
WHEN (Pos.PID = 'G') THEN SET G2005 = Pos.USD / 1000 end
WHEN (Pos.PID = 'E') THEN SET E2005 = Pos.USD / 1000 end
FROM #Sec INNER JOIN
Pos ON #Sec.ID = Pos.ID
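
Or would something along these lines with CASE expressions work (just a guess on my part; each column keeps its old value when the row's PID does not match)?

UPDATE #Sec
SET C2005 = CASE WHEN Pos.PID = 'C' THEN Pos.USD / 1000 ELSE C2005 END,
    G2005 = CASE WHEN Pos.PID = 'G' THEN Pos.USD / 1000 ELSE G2005 END,
    E2005 = CASE WHEN Pos.PID = 'E' THEN Pos.USD / 1000 ELSE E2005 END
FROM #Sec INNER JOIN
Pos ON #Sec.ID = Pos.ID
WHERE Pos.PID IN ('C', 'G', 'E')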

View 3 Replies View Related

Exporting To Multiple Files From A Single Table Using DTS

Aug 15, 2003

Hi Friends

I have been trying to solve this problem for the last 2 days but no luck.

Here is the problem that I am facing.

The task on hand is to transfer data from a single table (the source) to multiple files (destination) based on the record type.

I have tried changing the Datasource property of the Text File Connection object dynamically by using an ActiveX Script. But the data is still being written only to one file.

Can anyone please help me.

Thanks in advance.

Srinivas.

View 3 Replies View Related

Restoring Smaller DB Into Larger DB

Mar 27, 2000

I have a situation where I need to migrate data from an older platform to a newer one. The data from the old system(s) will be available on DAT tapes. All database construction on the new system will be identical to the old one in size and schema, except for one table (call it "ARCHIVE").

If the ARCHIVE table on the old system is 210MB, and the ARCHIVE table on the new system has the same attributes but has been expanded to 380MB in size, can I simply restore the dump for the old table into the new ARCHIVE?

Empirically it works (I have done it with apparent success two times) but I seem to recall that backups are done by pages, and I'm concerned that there may be conditions not being met by simply doing the restore the way I'm planning to do it.

Also, are there any tests or checks built into SQL which I can use to check table integrity on the target ARCHIVE table after the restore?
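
For instance, is something like DBCC CHECKTABLE the right tool here (a sketch of what I have in mind, run against the target table after the restore)?

DBCC CHECKTABLE ('ARCHIVE')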

Any help is greatly appreciated.
Best rgds,
Kevin

View 4 Replies View Related

Restore To A Smaller Size

Aug 28, 2000

I have a database with a 2 GB .mdf and a 1 GB .ldf. The backup is much
smaller. I need to copy this database to another server which does
not have that much free space. Can this be restored to a smaller
.mdf and .ldf? How?

Thanks.

Ranjit

View 1 Replies View Related

SQL Restore Into A Smaller File

Sep 4, 2007

Hello there....

I have a scenario where I am trying to set up multiple database instances for multiple test/development environment(s) for each group where the test/dev environment will contain a copy of what was in the production environment.. The test/dev environment can be refreshed on demand based on the prior night's full backup of the production environment.. This is good for our web developers and for training purposes, as the test environment(s) can be played around with, and will retain data for as long as the developers/testers/trainers need it, and then can be refreshed to the most current data when everyone in the group decides they want it refreshed...

Normally, this works out well...

However I am having a file size issue...

The production database was pre-allocated (a long time ago) to a large file size (probably to reduce external fragmentation).... So even though the backup file is only 5GB, the production database file itself is something like 40GB... I believe the production database has a maintenance plan on it already that rebuilds the indices each weekend, etc...

Anyhow, the problem is that when I restore the 5GB database back into a newly created database file, the file expands all the way up to 40GB again, even though the backup file is 5GB...

Normally this would be fine, the problem is that I am trying to create multiple environments, and I do not have the disk space on my test/dev server for 40GB (plus another 15GB or so for the transaction log) multiplied by each of my test/dev environments... It would be much nicer if I could get this down to 5GB (or heck, even 10GB), since I know for sure that the total amount of data in the database doesn't exceed 5GB, and I have plenty of space on my disk for 5 (or 10) GB multiplied by each of the environments I want to create...

I have tried DBCC SHRINKDATABASE and I have tried DBCC SHRINKFILE with the truncate option after the restore, which seems to work but doesn't....

I have also tried to go into the database properties and change the "initial size" but that doesn't do anything either

Is there any way to get this file back down to a manageable size after the restore??

Or better yet, is there a special method to restore the database so it won't 'expand' back out to 40GB in the first place? Perhaps some option to tell the restore process that even though the source database had a 40GB pre-allocation, the database I am restoring into doesn't need to be pre-allocated?
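
For reference, this is the kind of shrink I have been running after the restore (the logical file name and the target size in MB are placeholders for my real values):

USE RestoredDb
GO
DBCC SHRINKFILE (RestoredDb_Data, 6000)
GO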

View 1 Replies View Related

Making A Report Smaller...?

Jul 19, 2007

Alright. I'm stuck. I admit it!

I have a bunch of names, and each name can have one or more 'roles' (operator, reader, key operator, etc. Just random words really.) attached to it.

Using Reporting Services, I've managed to get the information I need with relative ease... the only problem is, with 900-some records to display, its current length of 41 pages, with just one column going down the left side of each page, is not exactly preferred by my superior (can't say I blame him really. Looks kind of odd!)

It looks like this right now:

Name1
Function
Function
Function
Name2
Function
Function
Name3
Function
Function

etc all the way down to page 41

I need it to look something like this:

Name 1      Name 4      Name 7
Function    Function    Function
Name 2      Name 5      Function
Function    Function    Name 8
Function    Function    Function
Name 3      Name 6      Function
Function    Function    Function

etc. Or some variation of...



I've fiddled around, and adding one extra column to the initial table layout with the same =(!UserName etc) merely replicates the data in the second column... not giving me the new stuff.



I'm quite new to Reporting Services, but none of the tutorials I've seen/done seem to accommodate this... Heeelp!

View 3 Replies View Related

Splitting Selects Into Smaller Parts

Oct 24, 2005

I'm having problems handling a very large number of user records - about 100,000 - 150,000 records. Instead of selecting all of them at a time, how do I, for example, select just 1000 of them? (e.g. get no. 1 - no. 1000, then get no. 1001 - no. 2000)
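
Is something like ROW_NUMBER (on SQL Server 2005) the way to go? A sketch of what I mean, with made-up table and column names:

WITH NumberedUsers AS
(
    SELECT UserID, UserName,
           ROW_NUMBER() OVER (ORDER BY UserID) AS RowNum
    FROM dbo.Users
)
SELECT UserID, UserName
FROM NumberedUsers
WHERE RowNum BETWEEN 1001 AND 2000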

View 1 Replies View Related

How To Make Database / Log Size Smaller

Apr 16, 2002

Hi all,

I found a database file and a log file over 2 GB each on a MSSQL 2000 server. Actually, they only need around 200 MB. I tried backing up and truncating the database in order to make the size smaller, but the size does not get smaller. How can I do it?

Simon

View 3 Replies View Related

What Is The Best Way To Restore Production Db In Dev With A Smaller Log File?

Aug 20, 2004

I have a Database A in production with 12GB as data file and 8 GB as log file. How do I restore this db in Development with a smaller log file, say 1GB?
I can't shrink the log file or anything in production. What is the best way to restore in Dev with a smaller log file?

Thanks.

View 4 Replies View Related

Full-Text Indexing For Smaller Columns

Nov 1, 2004

Hello,

I'm looking at using full-text indexing for tables to query. I have some smaller fields (varchar(50) columns that store names) that I was contemplating using full-text indexing for. I was just curious whether it is worth it.

Basically the data that will be there are one-word names, without any spaces or whatnot.

Brian

View 3 Replies View Related

Breaking Up Sql Server Backups Into Smaller Files

Apr 5, 2006

Hey guys,

I'm wondering how most people manage very, very large backups. What is the best approach to breaking up the backup files if you're restricted to a drive size (450 gig in my case)? In unix, you can pipe the backup to gzip and split; I'm not sure how the same thing could be accomplished in Windows.

Thanks,
-Kilka

View 2 Replies View Related

SQL 2012 :: Does Not Load Smaller Splash Screen

May 29, 2014

However, when I start SQL 2012 it loads Management Studio but does not load the smaller splash screen that usually appears asking me to connect to a server. When I try to click any of the menu items at the top of the screen the system just hangs.

I also have 2012 Service Pack 1 installed. My installations of 2005 and 2008 R2 still work fine. I also tried loading SQL 2014 and had the same issues as with 2012.

View 0 Replies View Related

How To Backup A Database Into A Number Of Smaller Files ?

Jul 23, 2005

To all: How do I back up a database into a number of smaller files? For example, can I fully back up a DB of 10 MB into 10 files (each 1 MB)? The problem I've met is that the DB backup file is too large, over 4 GB, and even Winzip can't compress it (after compressing, around 80% compression rate is possible). Thanks! From Jason (Kusanagihk)
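
Is a striped backup what I need? A sketch of what I mean (the paths are made up; every file in the set is needed to restore, and each file ends up at roughly the total size divided by the number of files):

BACKUP DATABASE MyDb
TO DISK = 'D:\Backup\MyDb_1.bak',
   DISK = 'D:\Backup\MyDb_2.bak',
   DISK = 'D:\Backup\MyDb_3.bak',
   DISK = 'D:\Backup\MyDb_4.bak'
WITH INIT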

View 11 Replies View Related

How Do I Make A Backup Device File Smaller ?

Apr 9, 2008

Hi, I have a backup device file ... ".bck" which has grown pretty large. Is there any way I can reduce its size? Thanks, David Greenberg

View 4 Replies View Related

Can MsSQL Data File Be Archived And Made Smaller ?

Dec 6, 2006

Hi... We have an application that connects to a MSSQL 2000 database. The database file is getting bigger and bigger over the years. Recently the performance of the database and application has been getting slower and slower; my senior Oracle DBA told me that I should archive the MSSQL 2000 data file and export the old records to an archive DB, so that the number of records will be less and it will be faster.

I would like to know whether MSSQL 2000 supports archiving of the database file? If yes, what is the way to do it? I could not find it in the Enterprise Manager options at all.

View 2 Replies View Related

Insert Items From One To Table To Multiple Smaller Tables

Nov 15, 2004

I have a table that I filled with data imported from another database.

What I need to do is now take this huge table and break apart the information and put it into 5 smaller tables.

So I have a huge insert statement.

I have one main table called Property with two keys. One key is "Prop_ID" and the other is "Owner", where Prop_ID is an automated unique ID. Once the information is inserted into that table, I then get the unique ID that it was given, and I then use that ID to insert into the other tables.

The problem I am encountering is I keep getting the following error

Violation of PRIMARY KEY constraint 'PK_Prop_Res_Detail'. Cannot insert duplicate key in object 'Prop_Res_Detail'.
The statement has been terminated.

I have an idea what might be going wrong, but I am not sure. What I want to happen is that I want the query to look at the first row of the huge table and then do all 4 of the inserts, and then go to the next row. But I think it is trying to do all the inserts into the Property table first, and then go on to the Prop_Res_Detail table, and that is why I am getting that error (see the rough sketch after the code below).

Any help is greatly appreciated.

here is the code..


Code:

CREATE PROCEDURE [dbo].[Insert_Properties]

AS

DECLARE @Prop_ID Int

SET NOCOUNT ON

INSERT INTO Property(Acres,
Assoc_Phone,
Assoc_Cell,
AppraisalForm,
Area,
Assess_Account,
AttachDetach,
Block,
City,
County,
Directions,
DOM,
ER_EA,
FloodZone,
Import_From,
Import_ID,
Insert_Date,
LandSQFT,
LandSQFTDim,
LegalRemarks,
ListAppraiser_ID,
ListAssoc_ID,
ListBroker_ID,
ListDate,
Listing_Office_Remarks,
ListPrice,
Lot,
Map,
Num_Images,
Office_Phone,
Original_ListPrice,
Owner,
Pending_Date,
PhotoName,
PropSubType,
Prop_Type,
Quad,
Remarks,
State,
Status,
StreetDir,
StreetNum,
StreetName,
Township,
UnitNumber,
ZipCode)

SELECT CONVERT(FLOAT(8), Acres),
CONVERT(Varchar(25), Assoc_Phone),
CONVERT(Varchar(25),Assoc_Cell),
CONVERT(Varchar(50), AppraisalForm),
CONVERT(Varchar(10), Area),
CONVERT(Varchar(50), Assess_Account),
CONVERT(Varchar(20), AttachDetach),
CONVERT(Varchar(20), Block),
CONVERT(Varchar(40), City),
CONVERT(Varchar(50), County),
CONVERT(Varchar(1000), Directions),
CONVERT(int, DOM),
CONVERT(Varchar(10), ER_EA),
CONVERT(Varchar(50), FloodZone),
CONVERT(Varchar(20), Import_From),
CONVERT(Varchar(20), Import_ID),
CONVERT(datetime, Insert_Date, 101),
CONVERT(Varchar(20), LandSQFT),
CONVERT(Varchar(50), LandSQFTDim),
CONVERT(Varchar(2000), LegalRemarks),
CONVERT(Varchar(50), ListAppraiser_ID),
CONVERT(Varchar(50), ListAssoc_ID),
CONVERT(Varchar(50), ListBroker_ID),
CONVERT(varchar(11), ListDate),
CONVERT(Varchar(1000), Listing_Office_Remarks),
CONVERT(Varchar(10), ListPrice),
CONVERT(Varchar(20), Lot),
CONVERT(Varchar(10), Map),
CONVERT(Varchar(10), Num_Images),
CONVERT(Varchar(25), Office_Phone),
CONVERT(Varchar(10), Original_ListPrice),
CONVERT(Varchar(50), Owner),
CONVERT(datetime, Pending_Date, 101),
CONVERT(Varchar(50), PhotoName),
CONVERT(Varchar(25), PropSubType),
CONVERT(Varchar(20), Prop_Type),
CONVERT(Varchar(10), Quad),
CONVERT(Varchar(1000), Remarks),
CONVERT(Varchar(25), State),
CONVERT(Varchar(10), Status),
CONVERT(Varchar(4), StreetDir),
CONVERT(Varchar(15), StreetNum),
CONVERT(Varchar(50), StreetName),
CONVERT(Varchar(20), Township),
CONVERT(Varchar(6), UnitNumber),
CONVERT(Varchar(20), ZipCode )

FROM Imported_Closed_Property_From_MLS


SET @Prop_ID = @@Identity

/*Property Res Table */
INSERT INTO Prop_Res_Detail(Prop_ID,
Addition,
Appliances,
Basement_Area,
BasementDesc,
Builder,
Construction,
Cool,
Dining,
District_School,
Energy,
Exterior_Features,
Fence,
Floors,
Foundation,
FP,
FP_Type,
Garage_Attach_Detach,
Garage_Cap,
Handicap,
Heat,
HOA,
HOA_Fee,
HOA_Inc,
HOA_Period,
Inlaw_Plan,
Interior_Features,
Livestock,
Lot_Desc,
Mechanical,
NumLivingArea,
Num_Baths,
Num_Beds,
Num_Levels,
Other_Info,
OvenDesc,
Owner,
Parking,
Patio,
Patio_Dim,
Perc_Basement_Com,
Pool,
Pool_Type,
Prop_Faces,
Range,
RangeDesc,
Remodeled,
Rental,
RentalAmount,
Roof_Type,
Roof_Year,
RoomOther,
Sect,
SQFT,
SQFTSource,
Style,
Tax_Amount,
Tot_Rooms,
UtilityAvailable,
WindowType,
Year_Built)

SELECT @Prop_ID,
CONVERT(Varchar(50), Addition),
CONVERT(Varchar(100), Appliances),
CONVERT(Varchar(25), Basement_Area),
CONVERT(Varchar(100), BasementDesc),
CONVERT(Varchar(50), Builder),
CONVERT(Varchar(50), Construction),
CONVERT(Varchar(20), Cool),
CONVERT(Varchar(10), Dining),
CONVERT(Varchar(60), District_School),
CONVERT(Varchar(100), Energy),
CONVERT(Varchar(100), Exterior_Features),
CONVERT(Varchar(40), Fence),
CONVERT(Varchar(100), Floors),
CONVERT(Varchar(40), Foundation),
CONVERT(Varchar(50), FP),
CONVERT(Varchar(40), FP_Type),
CONVERT(Varchar(50), Garage_Attach_Detach),
CONVERT(Varchar(25), Garage_Cap),
CONVERT(Varchar(20), Handicap),
CONVERT(Varchar(20), Heat),
CONVERT(Varchar(40), HOA),
CONVERT(Varchar(30), HOA_Fee),
CONVERT(Varchar(100), HOA_Inc),
CONVERT(Varchar(20), HOA_Period),
CONVERT(Varchar(20), Inlaw_Plan),
CONVERT(Varchar(100), Interior_Features),
CONVERT(Varchar(40), Livestock),
CONVERT(Varchar(400), Lot_Desc),
CONVERT(Varchar(100), Mechanical),
CONVERT(Varchar(10), NumLivingArea),
CONVERT(Varchar(5), Num_Baths),
CONVERT(Varchar(5), Num_Beds),
CONVERT(Varchar(30), Num_Levels),
CONVERT(Varchar(100), Other_Info),
CONVERT(Varchar(100), OvenDesc),
CONVERT(Varchar(50), Owner),
CONVERT(Varchar(100), Parking),
CONVERT(Varchar(25), Patio),
CONVERT(Varchar(50), Patio_Dim),
CONVERT(Varchar(25), Perc_Basement_Com),
CONVERT(Varchar(20), Pool),
CONVERT(Varchar(20), Pool_Type),
CONVERT(Varchar(40), Prop_Faces),
CONVERT(Varchar(20), Range),
CONVERT(Varchar(100), RangeDesc),
CONVERT(Varchar(50), Remodeled),
CONVERT(Varchar(10), Rental),
CONVERT(Varchar(10), RentalAmount),
CONVERT(Varchar(20), Roof_Type),
CONVERT(Varchar(5), Roof_year),
CONVERT(Varchar(100), RoomOther),
CONVERT(Varchar(10), Sect),
CONVERT(Varchar(10), SQFT),
CONVERT(Varchar(50), SQFTSource),
CONVERT(Varchar(100), Style),
CONVERT(Varchar(10), Tax_Amount),
CONVERT(Varchar(5), Tot_Rooms),
CONVERT(Varchar(100), UtilityAvailable),
CONVERT(Varchar(50), WindowType),
CONVERT(Varchar(5), Year_Built)
FROM Imported_Closed_Property_From_MLS

/*Sold Info Table */
INSERT INTO Sold_Info(Prop_ID,
Buy_Pts,
Closed_Date,
Closed_Price,
Closed_Price_SQFT,
COOP_Sales,
Days_On_Market,
InterestRate,
Lender,
LoanAmount,
LoanTerms,
Loan_Years,
Origination_Fee,
Owner,
SellerConcessions,
LoanType,
Sold_Remarks)

SELECT @Prop_ID,
CONVERT(Varchar(10), Buy_Pts),
CONVERT(datetime, Closed_Date, 101),
CONVERT(Varchar(10), Closed_Price),
CONVERT(Varchar(50), Closed_Price_SQFT),
CONVERT(Varchar(50), COOP_Sales),
CONVERT(Varchar(5), DOM),
CONVERT(Varchar(10), InterestRate),
CONVERT(Varchar(50), Lender),
CONVERT(Varchar(10), LoanAmount),
CONVERT(Varchar(50), LoanTerms),
CONVERT(Varchar(10), Loan_Years),
CONVERT(Varchar(10), Origination_Fee),
CONVERT(Varchar(50), Owner),
CONVERT(Varchar(100), SellerConcessions),
CONVERT(Varchar(25), LoanType),
CONVERT(Varchar(1000), Sold_Remarks)
FROM Imported_Closed_Property_From_MLS

/*Remarks Table */
INSERT INTO Remarks(Prop_ID,
App_Date,
App_Remark,
Contract_Date,
Inspection_Type,
Owner,
PendingSalesPrice,
PendingSaleComments)

SELECT @Prop_ID,
CONVERT(datetime, App_Date, 101),
CONVERT(Varchar(1000), App_Remark),
CONVERT(datetime, Contract_Date, 101),
CONVERT(Varchar(50), Inspection_Type),
CONVERT(Varchar(50), Owner),
CONVERT(Varchar(10), PendingSalesPrice),
CONVERT(Varchar(1000), PendingSaleComments)
FROM Imported_Closed_Property_From_MLS

GO
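
Is a cursor the way to get that row-at-a-time behavior? A rough sketch of what I have in mind (column lists abbreviated; SCOPE_IDENTITY() used instead of @@IDENTITY):

DECLARE @Owner varchar(50), @Acres float, @Prop_ID int   -- ...remaining variables omitted

DECLARE ImportRows CURSOR LOCAL FAST_FORWARD FOR
    SELECT CONVERT(varchar(50), Owner), CONVERT(float(8), Acres)   -- ...remaining columns
    FROM Imported_Closed_Property_From_MLS

OPEN ImportRows
FETCH NEXT FROM ImportRows INTO @Owner, @Acres
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO Property (Owner, Acres /* , ... */)
    VALUES (@Owner, @Acres /* , ... */)

    SET @Prop_ID = SCOPE_IDENTITY()

    INSERT INTO Prop_Res_Detail (Prop_ID, Owner /* , ... */)
    VALUES (@Prop_ID, @Owner /* , ... */)

    -- ...same pattern for Sold_Info and Remarks...

    FETCH NEXT FROM ImportRows INTO @Owner, @Acres
END
CLOSE ImportRows
DEALLOCATE ImportRows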

View 2 Replies View Related

SQL Server 2005: CLR Functions Vs SQL Functions

May 26, 2006

I was playing around with the new SQL 2005 CLR functionality and remembered this discussion that I had with Erland Sommarskog concerning performance of scalar UDFs some time ago (see "Calling sp_oa* in function" in this newsgroup). In that discussion, Erland made the following comment about UDFs in SQL 2005:

>> The good news is that in SQL 2005, Microsoft has addressed several of these issues, and the cost of a UDF is not as severe there. In fact for a complex expression, a UDF written in a CLR language may be faster than the corresponding expression using built-in T-SQL functions. <<

I thought I would put this to the test using some of the same SQL as before, but adding a simple scalar CLR UDF into the mix. The test involved querying a simple table with about 300,000 rows. The scenarios are as follows:

(A) Use a simple CASE function to calculate a column
(B) Use a simple CASE function to calculate a column and as a criterion in the WHERE clause
(C) Use a scalar UDF to calculate a column
(D) Use a scalar UDF to calculate a column and as a criterion in the WHERE clause
(E) Use a scalar CLR UDF to calculate a column
(F) Use a scalar CLR UDF to calculate a column and as a criterion in the WHERE clause

A sample of the results is as follows (time in milliseconds):

(295310 row(s) affected) A: 1563
(150003 row(s) affected) B: 906
(295310 row(s) affected) C: 2703
(150003 row(s) affected) D: 2533
(295310 row(s) affected) E: 2060
(150003 row(s) affected) F: 2190

The scalar CLR UDF was significantly faster than the classic scalar UDF, even for this very simple function. Perhaps a more complex function would have shown even a greater difference. Based on this, I must conclude that Erland was right. Of course, it's still faster to stick with basic built-in functions like CASE.

In another test, I decided to run some queries to compare built-in aggregates vs. a couple of simple CLR aggregates as follows:

(G) Calculate averages by group using the built-in AVG aggregate
(H) Calculate averages by group using a CLR aggregate that simulates the built-in AVG aggregate
(I) Calculate a "trimmed" average by group (average excluding highest and lowest values) using built-in aggregates
(J) Calculate a "trimmed" average by group using a CLR aggregate specially designed for this purpose

A sample of the results is as follows (time in milliseconds):

(59 row(s) affected) G: 313
(59 row(s) affected) H: 890
(59 row(s) affected) I: 216
(59 row(s) affected) J: 846

It seems that the CLR aggregates came with a significant performance penalty over the built-in aggregates. Perhaps they would pay off if I were attempting a very complex type of aggregation. However, at this point I'm going to shy away from using these unless I can't find a way to do the calculation with standard SQL.

In a way, I'm happy that basic SQL still seems to be the fastest way to get things done. With the addition of the new CLR functionality, I suspect that MS may be giving us developers enough rope to comfortably hang ourselves if we're not careful.

Bill E.
Hollywood, FL

-------------------------------------------------------------------------

-- table TestAssignment, about 300,000 rows
CREATE TABLE [dbo].[TestAssignment](
[TestAssignmentID] [int] NOT NULL,
[ProductID] [int] NULL,
[PercentPassed] [int] NULL,
CONSTRAINT [PK_TestAssignment] PRIMARY KEY CLUSTERED ([TestAssignmentID] ASC))

--Scalar UDF in SQL
CREATE FUNCTION [dbo].[fnIsEven](@intValue int)
RETURNS bit
AS
BEGIN
Declare @bitReturnValue bit
If @intValue % 2 = 0
Set @bitReturnValue=1
Else
Set @bitReturnValue=0
RETURN @bitReturnValue
END

--Scalar CLR UDF
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
[Microsoft.SqlServer.Server.SqlFunction(IsDeterministic=true, IsPrecise=true)]
public static SqlBoolean IsEven(SqlInt32 value)
{
if(value % 2 == 0){ return true; } else { return false; }
}
};
*/

--Test #1

--Scenario A - Query with calculated column--
SELECT TestAssignmentID,
CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment

--Scenario B - Query with calculated column as criterion--
SELECT TestAssignmentID,
CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment
WHERE CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END=1

--Scenario C - Query using scalar UDF--
SELECT TestAssignmentID,
dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment

--Scenario D - Query using scalar UDF as criterion--
SELECT TestAssignmentID,
dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID)=1

--Scenario E - Query using CLR scalar UDF--
SELECT TestAssignmentID,
dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment

--Scenario F - Query using CLR scalar UDF as criterion--
SELECT TestAssignmentID,
dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID)=1

--CLR Aggregate functions
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct Avg
{
public void Init()
{
this.numValues = 0;
this.totalValue = 0;
}
public void Accumulate(SqlDouble Value)
{
if (!Value.IsNull)
{
this.numValues++;
this.totalValue += Value;
}
}
public void Merge(Avg Group)
{
if (Group.numValues > 0)
{
this.numValues += Group.numValues;
this.totalValue += Group.totalValue;
}
}
public SqlDouble Terminate()
{
if (numValues == 0){ return SqlDouble.Null; }
else { return (this.totalValue / this.numValues); }
}
// private accumulators
private int numValues;
private SqlDouble totalValue;
}

[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct TrimmedAvg
{
public void Init()
{
this.numValues = 0;
this.totalValue = 0;
this.minValue = SqlDouble.MaxValue;
this.maxValue = SqlDouble.MinValue;
}
public void Accumulate(SqlDouble Value)
{
if (!Value.IsNull)
{
this.numValues++;
this.totalValue += Value;
if (Value < this.minValue) this.minValue = Value;
if (Value > this.maxValue) this.maxValue = Value;
}
}
public void Merge(TrimmedAvg Group)
{
if (Group.numValues > 0)
{
this.numValues += Group.numValues;
this.totalValue += Group.totalValue;
if (Group.minValue < this.minValue) this.minValue = Group.minValue;
if (Group.maxValue > this.maxValue) this.maxValue = Group.maxValue;
}
}
public SqlDouble Terminate()
{
if (this.numValues < 3) return SqlDouble.Null;
else
{
this.numValues -= 2;
this.totalValue -= this.minValue;
this.totalValue -= this.maxValue;
return (this.totalValue / this.numValues);
}
}
// private accumulators
private int numValues;
private SqlDouble totalValue;
private SqlDouble minValue;
private SqlDouble maxValue;
}
*/

--Test #2

--Scenario G - Average Query using built-in aggregate--
SELECT ProductID, Avg(Cast(PercentPassed AS float))
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID

--Scenario H - Average Query using CLR aggregate--
SELECT ProductID, dbo.Avg_CLR(Cast(PercentPassed AS float)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID

--Scenario I - Trimmed Average Query using built-in aggregates/set operations--
SELECT A.ProductID,
Case
When B.CountValues<3 Then Null
Else Cast(A.Total-B.MaxValue-B.MinValue AS float)/Cast(B.CountValues-2 As float)
End AS Average
FROM
(SELECT ProductID, Sum(PercentPassed) AS Total
FROM TestAssignment
GROUP BY ProductID) A
LEFT JOIN
(SELECT ProductID,
Max(PercentPassed) AS MaxValue,
Min(PercentPassed) AS MinValue,
Count(*) AS CountValues
FROM TestAssignment
WHERE PercentPassed Is Not Null
GROUP BY ProductID) B
ON A.ProductID=B.ProductID
ORDER BY A.ProductID

--Scenario J - Trimmed Average Query using CLR aggregate--
SELECT ProductID, dbo.TrimmedAvg_CLR(Cast(PercentPassed AS real)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID

View 9 Replies View Related

.NET Class To Hold Single Disconnected Record? Nothing Smaller Than DataSet?

Jul 9, 2007

What is the most efficient standalone .NET class that can hold a single disconnected record? The class must also retain column names, but other schema is not relevant (the .NET data type is sufficient). If I understand System.Data.Common.DbDataRecord, it provides an interface on a DbDataReader, and has no storage of its own. I'm familiar with DataSet; is that the only .NET-standard class to do this? Thank you, Shannon

View 7 Replies View Related

Transfer SQL Server Objects Task (for A Table): Can It Be Split Into Smaller Batches

May 29, 2008

We are using the Transfer SQL Server Objects Task to transfer a large table. The trans log is filling up for this table. Is there a method to split the Data Transfer Task into smaller batches? (Smaller tables are transferring without issue.)

Thanks.

View 2 Replies View Related

Physical Setup: 1 Data File Vs Multiple Smaller Data Files

Jul 20, 2005

Hello all. Before my arrival at my current employer, our consultants physically set up our MSSQL 7 server as follows:
drive c: contains the mssql engine
drive d: contains the transaction log
drive e: contains the data files
No filegroups were set up and the data files consist of only 1 large physical file. Currently, our data file is >10GB. When I was trained on the physical aspects of sqlserver, I was told to never create physical files > 2048MB each. If I did, I could expect inefficient physical storage of data and slower performance (due to the OS). Our server has 2 RAID-5 arrays. Drives c: and e: are located on the first array and drive d: on the second. We're running Windows NT Server 4.0 SP6 with NTFS. Can someone comment on the use of 1 single large data file vs. more smaller data files?

View 2 Replies View Related

Differential Backup Not Smaller Than The Full Backup

Jun 6, 2007

Hi,

Using SQL Server 2005, we have a 2.8 GB database under the Simple recovery model. The database contains ~50M rows and each night ~60k rows are loaded (appended) to the database by an SSIS task.

We configured a Maintenance Plan which is executed once a week to perform a full backup of the database. The resulting backup file is ~2.8 GB, as expected.

We also configured another Maintenance Plan which is executed every day, a few hours after the SSIS task runs, to perform a differential backup. To our surprise, the resulting backup file is about the same size as the full backup, ~2.8 GB, when it should only be a few MB (only 60k rows are added to the database).

When we launch the "Restore Database" wizard we clearly see the different backup sets, Full and Differential, but they all have about the same size (same for the physical backup files on disk).

Is there anything we are missing? Why is the differential backup that big?

Thanks for any advice.

View 4 Replies View Related

SPs, Functions And

Feb 22, 2007

Guys, I need help with SQL Server 2005 syntax.
My goal is to update a table called UserStats.
I have numerous functions in SQL that return scalars (one for each statistic I want to use).
How do I then use a stored procedure to return a table of the values for use on my site?
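
Something like this is what I am after (dbo.fnWins and dbo.fnLosses are made-up stand-ins for my scalar functions):

UPDATE dbo.UserStats
SET Wins   = dbo.fnWins(UserID),
    Losses = dbo.fnLosses(UserID)
GO

CREATE PROCEDURE dbo.GetUserStats
AS
BEGIN
    SET NOCOUNT ON
    SELECT UserID,
           dbo.fnWins(UserID)   AS Wins,
           dbo.fnLosses(UserID) AS Losses
    FROM dbo.UserStats
END
GO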

View 1 Replies View Related

Functions?

Mar 25, 2006

Within SQL Server 2005, there are functions.  This feature is new to me and I haven't found anyone who has written their own functions. I'm wondering if functions are written the same way as stored procedures, and can a function be called from a stored procedure or even from within a query?
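
For example, would something like this work (a made-up function just to illustrate what I mean)?

CREATE FUNCTION dbo.fnAddTax (@Amount money)
RETURNS money
AS
BEGIN
    RETURN @Amount * 1.07   -- made-up 7% rate
END
GO

-- called from within a query:
SELECT OrderID, dbo.fnAddTax(Freight) AS FreightWithTax
FROM dbo.Orders

-- or called from a stored procedure / batch:
DECLARE @Total money
SET @Total = dbo.fnAddTax(100)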
 

View 2 Replies View Related

First/Last Functions In SQL?

Mar 12, 2007

Hello, I'm having a difficult time translating this query from Access to SQL because it uses the First/Last functions.

I have a 'Projects' table and a 'Project_Comments' table; each has a 'Project_ID' field which links the 2 together. What I need to do is retrieve a project list from the Projects table and also the first comment of each project based on the Comment_Date field in the Project_Comments table. This is the MS ACCESS query:


SELECT Projects.Project_Number, Projects.Project_Name, First(Project_Comments.Comment_Date), First(Project_Comments.Notes)
FROM Projects LEFT JOIN Project_Comments ON
Projects.Project_Number = Project_Comments.Project_Number
GROUP BY Projects.Project_Number, Projects.Project_Name


Now I can use Min() for the date instead of First, however I don't know what to do with the Notes field. Any help on how to get this over to SQL would be greatly appreciated!
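
Would something along these lines be the right direction (assuming SQL Server 2005's ROW_NUMBER; table and column names taken from the query above)?

SELECT p.Project_Number, p.Project_Name, pc.Comment_Date, pc.Notes
FROM Projects p
LEFT JOIN (
    SELECT Project_Number, Comment_Date, Notes,
           ROW_NUMBER() OVER (PARTITION BY Project_Number ORDER BY Comment_Date) AS rn
    FROM Project_Comments
) pc ON pc.Project_Number = p.Project_Number AND pc.rn = 1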

View 2 Replies View Related

Functions Help

Jun 3, 2008

Hi,

I have created a function that returns a comma-separated list of product IDs from a table. I need to call this function from a stored procedure to help filter my product results, something like the following:

SET @SQL = 'SELECT dbo.Products.ProductID FROM dbo.Products WHERE dbo.Products.ProductID IN (' + dbo.GetModels('dbo.Products.ProductID', '') + '))'

The problem I am having when executing the above is:

"Conversion failed when converting the varchar value 'dbo.Products.ProductID' to data type int."

Can anyone shed some light on how I can call the function, feeding through the product ID from the row of the select statement I am trying to execute (if this makes sense).

Any help would be great.

Matt

View 4 Replies View Related

SQL Functions

Aug 25, 2005

I am trying to convert a date string to a date format. In Access I could just use CDate, but SQL apparently does not allow this.
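Is CAST/CONVERT the equivalent I should be using? Something like (sample values only):

SELECT CAST('2005-08-25' AS datetime)
SELECT CONVERT(datetime, '08/25/2005', 101)   -- style 101 = mm/dd/yyyy
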
Any help appreciated
Thanks

View 4 Replies View Related

Using Own Functions

Aug 12, 2005

Hello, how can I use a function of my own within a select statement? I tried

create function funny() returns int
as begin
return( 2 )
end
go

and then "select funny()", but 'funny' is not a recognized function name. How can I solve this?
Thanks and regards
Mark
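
Do I need to call it with the schema prefix, i.e. something like this?

SELECT dbo.funny()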

View 1 Replies View Related

Functions

Jul 20, 2005

Hi, I'm having a problem with calling a function from an ActiveX script within a data transformation. The function takes 6 inputs and returns a single output. My problem is that after trying all of the stuff in BOL I still can't get it to work. It's on the same database and I'm running SQL 2000. When I try to call it I get an error message saying "object required functionname". If I put dbo in front of it I get "object required dbo". Can anyone shed any light on how I call this function and assign the output value returned to a variable name? Thanks.

View 7 Replies View Related

SQL Functions

Aug 10, 2007

Hello,

I've created a function that performs modulo. I understand that SQL server 2000 / 2005 uses % for modulo, but we have an application that was written for Oracle. Oracle has a mod(dividend, divisor) function.

So as not to rewrite the queries, I would like to implement the function below:

the function executes properly but I must prefix it with the dbo schema.

Net: I can execute --- select dbo.mod(9,2) and it returns a 1 just like it should.

but I cannot execute --- select mod(9,2). I receive the error "'mod' is not a recognized function name." on SQL 2000 and 2005.


If I can execute select mod(9,2) then I won't need to re-write any queries.

Also, on SQL 2005, I have tried to adjust the default schema, and the EXECUTE AS clause, but neither helped my cause.

I'm going to try building the function in the CLR, but I think I will be faced with the same problem.

Can someone point me in the right direction?

Thanks

Tom



create function mod
(
@dividend int,
@divisor int
)
RETURNS int
as
begin
declare @mod int
select @mod = @dividend % @divisor
return @mod
end


View 3 Replies View Related

Functions In Functions

Sep 24, 2007

Hi,

I have to calculate data in a function with "EXEC". During runtime I get the error:

"Only functions and extended stored procedures can be executed from within a function."

I would use a stored procedure, but the function is to be called from a view. I don't understand why that should not be possible. Is there any way to shut that message down or to work around it?
btw: Storing all the data in a table would mean a lot of work I would rather not do. ;-)

Thx for any help
Blubb10

View 8 Replies View Related






