Fastest Way To Execute Aggregate Functions On A Table?
Jul 23, 2005
I've probably not given the best title to this topic, but that
reflects my relative "newbie" status.
I have a table that goes essentially
TS         DATETIME
jobnumber  VARCHAR
jobentry   VARCHAR
...
the TS is a time stamp, and the other two fields are job number
and entries. There are (many) more fields, but this is the core of
it. The relationship is that there will be several entries per job, with
one row in the table per entry (i.e. several rows per job).
In constructing a web interface I want to create a list of recent
job numbers, and summarize it broadly as follows
max(ts)
jobnumber
count(jobentry)
...
I can do this by a select command as follows
select top 30 max(ts) as time, jobnumber, count(jobentry)
from Jobs
group by jobnumber
order by time desc
However I'm now finding that this is quite slow now that my test table
has around 400,000 entries (even though I've added indexes to most
relevant fields). In particular it's much slower than
select top 30 jobnumber
from Jobs
group by jobnumber
order by jobnumber desc
leading me to suspect that the aggregate functions are slowing this
down. I would guesstimate the difference in speed is around a factor
of 10-20.
As I type this I think I've just realised it's the
order by time desc
that is probably causing the problem (in that it will need to
calculate max(ts) for all jobnumbers in the table in order to
execute the "order by" clause).
I think I can see the solution now (amazing what typing out a long
question can do for you :-)
My question was going to be whether there are any approved methods/tricks
for avoiding this sort of performance hit.
It seemed to me that if I could first grab a decent number of recent
records (quickly) I could then execute the aggregate functions against
that smaller set of records. I know in broad terms how many entries
there can be per job, so I could select enough to make at least 30
jobs, and then execute the real query against that. In my case there
will be probably less than 10 entries per job, so I could grab 300
records and execute against that, instead of against the whole
400,000.
That being the case, is this best done
a) as a subquery, and if so, how
b) by creating a temporary intermediate table (which is
in essence a more explicit version of (a) isn't it?)
Another solution that occurred was
c) to create and maintain a summary table, with just
one record per jobnumber with a view to having it
serve this particular (common) enquiry.
For (c) to work I would probably need triggers on the Jobs table to
maintain the summary table. This would probably represent more
overhead overall, but should certainly give a better response
when the query outlined above is executed on the web site.
The response time to this query is becoming critical on the web
interface to avoid timeouts.
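For concreteness, a rough sketch of approach (a) as I imagine it (column names as above; the 300 is just my guesstimate of "enough rows for 30 jobs", not a tuned figure):
select top 30 max(r.ts) as time, r.jobnumber, count(r.jobentry) as entries
from (select top 300 ts, jobnumber, jobentry
      from Jobs
      order by ts desc) as r
group by r.jobnumber
order by time desc
The inner derived table should be cheap if TS is indexed, and the aggregates then only touch that slice.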
Any suggestions or comments are welcome. If it's RTFM, then I'd
appreciate page numbers :-)
Thanks for reading this far :-)
--
HTML-to-text and markup removal with Detagger
http://www.jafsoft.com/detagger/
View 4 Replies
Mar 25, 2008
I hope this is a right form for ADO .net type of question.
My question is, can you call a SQL function the way you call a stored procedure from ADO.NET? I coded it this way and do not seem to be getting a result set back. The DataReader seems to be coming back with nothing. Can someone post an example? I know you can write "SELECT udf_function()" but I really mean calling it the way a stored procedure is called. Thanks.
View 2 Replies
View Related
Nov 22, 1999
We have a table that we BCP into, the data is then processed and inserted into its appropriate table.
Then the table or its data needs to be removed. This seems to be a very slow operation to remove
the table or table data. I have tried drop table, and truncate table and it takes nearly as long as
the bcp operation. The table has 12 million rows. I didn't think either operation wrote to the
transaction log except for page extent management. Why are the drop and truncate so slow? Suggestions?
View 1 Replies
View Related
Mar 23, 2008
Hi Guys,
What is the fastest way to move a huge table (77 million records with 25 columns) across servers? The servers are not linked, though.
Thanks for the help.
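One common route, sketched here with hypothetical server, database and path names, is a native-format bcp out on the source and a bcp in on the target (-n is native format, -T a trusted connection, -b the load batch size):
bcp SourceDb.dbo.BigTable out D:\BigTable.dat -n -S SourceServer -T
bcp TargetDb.dbo.BigTable in D:\BigTable.dat -n -S TargetServer -T -b 10000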
View 3 Replies
View Related
Jan 8, 2008
Hi,
I have a number of large tables (50,000,000+ rows and 30+ columns) that are refreshed once per month with data from another system. Data is first loaded into a staging table that has an identical structure to the final table, cleansed, and then copied to the final table.
The current copy process is simple but inefficient:
Truncate Table <final-table>
Insert Into <final-table>
Select * from <staging-table>
If possible I don't want to use DTS for this and I'd like to use a batch size of around 10,000 rows to help reduce the size of the log file because I have 3 of these types of Inserts running concurrently.
What is the fastest way to copy data from one table to another, within a stored procedure using SQL 2000 on a large, clustered server that uses logging?
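The sort of batching I have in mind (a sketch only, assuming the staging table has a unique key column; final_table, staging_table and key_col are placeholders):
set rowcount 10000
while 1 = 1
begin
insert into final_table
select s.*
from staging_table s
where not exists (select 1 from final_table f where f.key_col = s.key_col)
if @@rowcount = 0 break
-- back up or truncate the log here if growth between batches is a concern
end
set rowcount 0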
Thanks,
Jason
View 5 Replies
View Related
Jul 20, 2005
How can I delete all user stored procedures and all table triggers very fast in a single database? Thank you
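One rough approach in SQL 2000 (a sketch only) is to generate the DROP statements from sysobjects and run the output after reviewing it:
select 'DROP PROCEDURE [' + name + ']'
from sysobjects
where type = 'P' and objectproperty(id, 'IsMSShipped') = 0
union all
select 'DROP TRIGGER [' + name + ']'
from sysobjects
where type = 'TR' and objectproperty(id, 'IsMSShipped') = 0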
View 17 Replies
View Related
May 25, 2008
Hello,
It is possible to write stored procedures which take table names as parameters; is it also possible to do this with table valued functions?
For example, a simple stored procedure is this:
CREATE PROCEDURE SelectTop(@tableName sysname)
AS
BEGIN
Execute('Select top 10 * from ' + @tableName + ';')
END
I want to be able to do the analogous thing with a table valued function (so that I can query the result set, without having to create a temp table). How should I do this (i.e., pass a tablename as an argument to a table valued function)?
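A table-valued function cannot execute dynamic SQL, so one common workaround (a sketch; the column list is made up and would have to match the table being queried) is to keep the procedure and capture its output in a temp table that can then be queried:
create table #top10 (col1 int, col2 varchar(50)) -- shape must match the result of SelectTop for that table
insert into #top10 exec SelectTop 'dbo.SomeTable'
select * from #top10 where col1 > 0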
View 11 Replies
View Related
Jan 1, 2004
Been digging around, I want to query the size of a table (disc size)
Anything?
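sp_spaceused reports this for a single table, for example (the table name is just an example):
exec sp_spaceused 'Orders' -- returns rows, reserved, data, index_size and unused for that table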
View 1 Replies
View Related
May 12, 2004
I'm studying for the MCDBA test & understand table valued functions but am struggling to find a good use for them... can anyone explain to me why you'd want to use one over a view?
View 2 Replies
View Related
Aug 11, 2004
Hi, I am writing stored procedures that have to be built on the basis of other tables specially created for this purpose. Is it better to create these intermediate tables as views or as functions returning tables? I guess views would be lighter on performance as they would not be created on the fly?
View 2 Replies
View Related
Apr 17, 2008
Hi!
Here's my function. The trouble is that I cannot ORDER BY the "visits_count", "properties_count", "enquiries_count" fields.
Maybe someone could help me with this?
CREATE FUNCTION [dbo].[GetPagedStatistics]
(
@start_index int,
@count int,
@condition nvarchar(255),
@order_field nvarchar(255),
@date_from datetime,
@date_to datetime )
RETURNS @total_stat TABLE (
username nvarchar(255),
first_name nvarchar(255),
last_name nvarchar(255),
properties_count int,
enquiries_count int,
visits_count int,
id_user int)
BEGIN
INSERT @total_stat
SELECT
top (@count)
dbo.users.username,
dbo.users.first_name,
dbo.users.last_name,
ISNULL(COUNT(DISTINCT dbo.advertisement.id_advertisement), 0) AS properties_count,
ISNULL(COUNT(DISTINCT dbo.enquiry_emails.id_enquiry_email), 0) AS enquiries_count,
ISNULL(COUNT(DISTINCT dbo.property_statistics.id_statistics), 0) AS visits_count,
dbo.users.id_user
FROM
dbo.property_statistics RIGHT OUTER JOIN
dbo.advertisement RIGHT OUTER JOIN
dbo.users ON dbo.advertisement.id_user = dbo.users.id_user LEFT JOIN
dbo.enquiry_emails ON dbo.enquiry_emails.id_advertisement = dbo.advertisement.id_advertisement ON
dbo.property_statistics.id_advertisement = dbo.advertisement.id_advertisement
WHERE
1=@condition and
(dbo.advertisement.creation_date <= @date_to and dbo.advertisement.creation_date >= @date_from ) and
(
(dbo.enquiry_emails.creation_date <= @date_to
and dbo.enquiry_emails.creation_date >= @date_from
and dbo.property_statistics.view_date <= @date_to
and dbo.property_statistics.view_date >= @date_from ) or
(dbo.property_statistics.view_date is null) or
(dbo.enquiry_emails.creation_date is null)
) and
(ISNULL(dbo.advertisement.id_parent, 0) = 0)
GROUP BY
dbo.users.username,
dbo.users.first_name,
dbo.users.last_name,
dbo.users.id_user
order by
case when @order_field='username' then dbo.users.username end,
case when @order_field='first_name' then dbo.users.first_name end,
case when @order_field='last_name' then dbo.users.last_name end,
case when @order_field='properties_count' then 1 end,
case when @order_field='enquiries_count' then 1 end,
case when @order_field='visits_count' then 1 end
RETURN
END
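The count aliases cannot be referenced inside a CASE in the ORDER BY, but the aggregate expressions themselves can be repeated there; a sketch of the last three branches (descending order assumed):
case when @order_field='properties_count' then ISNULL(COUNT(DISTINCT dbo.advertisement.id_advertisement), 0) end desc,
case when @order_field='enquiries_count' then ISNULL(COUNT(DISTINCT dbo.enquiry_emails.id_enquiry_email), 0) end desc,
case when @order_field='visits_count' then ISNULL(COUNT(DISTINCT dbo.property_statistics.id_statistics), 0) end desc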
View 1 Replies
View Related
Apr 24, 2008
Are there any disadvantages with respect to performance in using table-valued functions instead of a view?
Thanks...
View 3 Replies
View Related
May 1, 2007
Help! I've been doing the box step with BOL for several hours, using tables in AdventureWorks to create an inline table-valued function that provides a parameterized view of three JOINs. I have successfully created the function but can't figure out where to 'declare' my variable "@SalesAgentID"; I need to be able to invoke the function with a particular ID. If you can help me cut this dance short I would REALLY appreciate it.
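For an inline table-valued function the parameter is declared in the function header and supplied when the function is invoked; a minimal sketch (the function name and single-table body are illustrative, not the poster's actual three-table query):
CREATE FUNCTION dbo.fn_SalesByAgent (@SalesAgentID int)
RETURNS TABLE
AS
RETURN
(
SELECT soh.SalesOrderID, soh.OrderDate, soh.CustomerID
FROM Sales.SalesOrderHeader soh
WHERE soh.SalesPersonID = @SalesAgentID
)
GO
SELECT * FROM dbo.fn_SalesByAgent(280) -- invoke with a particular ID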
View 7 Replies
View Related
Sep 13, 2006
Hi,
I'm trying to create a CLR function.
This is the SqlFunction attribute and the FillRow method signature:
[Microsoft.SqlServer.Server.SqlFunction(DataAccess = DataAccessKind.Read, SystemDataAccess = SystemDataAccessKind.Read,
FillRowMethodName = "FillRows",IsPrecise=true,
TableDefinition = "SOCIETA nvarchar(55),CLIENTE nvarchar(150),NUMEROCONTRATTO nvarchar(255),FIRMA datetime,CHIUSURA datetime,AUTORIZZATO float"
)]
public static IEnumerable dbf_Create_RiepilogoAccordi(SqlInt32 commessa, SqlInt32 tipo_commessa, SqlInt32 progetto, SqlInt32 DAC, SqlInt32 figura, SqlDateTime dataFatturazioneDa, SqlDateTime dataFatturazioneA)
public static void FillRows(Object obj, out SqlString SOCIETA, out SqlString CLIENTE, out SqlString NUMEROCONTRATTO, out SqlDateTime FIRMA, out SqlDateTime CHIUSURA, SqlDouble AUTORIZZATO)
When I try to deploy my function, I get the following error:
Error 1 Function signature of "FillRow" method (as designated by SqlFunctionAttribute.FillRowMethodName) does not match SQL declaration for table valued CLR function 'dbf_Create_RiepilogoAccordi' due to column 6. CM.Reports.SIA.RiepilogoAccordi
I get this error whichever combination of name/value I use for column 6
Can someone help me?
Thanks
Marco
View 1 Replies
View Related
Jan 15, 2008
Hi,
I have just started working with CLR user-defined functions (SQL 2005); the code below shows how I insert a row into a
table through a function.
I don't know where I am going wrong, but nothing is happening.
I checked the connection as well; in Server Explorer it connects to the database fine.
Please help me with this.
public partial class UserDefinedFunctions
{
[Microsoft.SqlServer.Server.SqlFunction]
public static int EmpName()
{
using (SqlConnection conn = new SqlConnection("Context Connection=true"))
{
conn.Open();
SqlCommand cmd = conn.CreateCommand();
cmd.CommandText = "INSERT INTO dbo.Employee VALUES('MOHAN',66,22,'GDGDG',55)";
//SqlDataReader rec = new SqlDataReader();
//rec = cmd.ExecuteReader();
int rows = cmd.ExecuteNonQuery();
//string name = rec.GetString(0);
conn.Close();
return rows;
}
}
}
View 3 Replies
View Related
Apr 11, 2006
Hi everyone. I'd like to know how stored procedures and table-valued functions compare when it comes to returning a resultant set of data. I know there is a link somewhere but I can't immediately find it. Thanks.
View 2 Replies
View Related
May 25, 2004
Hello
I am trying to do the following:
1. Create a Multi-statement Table-valued Functions, say mstvF1, with 1 parameter.
2. Then, use it like this: "Select * from table T, mstvF1( T.id )"
It gives me Line 100: Incorrect syntax near 'T', referring to the T of T.id.
If I do
Select * from table T, mstvF1( 5 ), then it works.
Is there any way to do a select from a table T combined with an MSTV function and passing in as a parameter a field from T?
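On SQL Server 2000 there is no direct way to pass a column from T into the function in a comma join, but on SQL Server 2005 and later CROSS APPLY does exactly this; a sketch (SomeTable stands in for the real table name):
select *
from SomeTable T
cross apply dbo.mstvF1(T.id)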
Thanks for any help.
View 3 Replies
View Related
Apr 1, 2008
Hello Gurus,
I have a stored procedure that gathers data from three tables and joins them; two of the tables need to have different rowcounts set, i.e. pull only a certain number of rows from one table and only a certain number of rows from another table. The number of rows each should pull is stored within a table. Let me explain: these tables hold Exchange storage group and mailstore data for a number of servers. Each server has a table entry with the number of child storage groups, and each storage group has a table entry with the number of child mailstores. The tables get updated every two minutes via a program. I need to be able to get the most recent data with the correct child counts for each server and storage group.
I believe that I've found a way to do this with a stored procedure that calls a table-valued function. The table-valued function simply filters the storage group table down to its number of storage groups, ordered by timestamp. I may be way off here, but I can't tell, because both the stored procedure and the function check out fine, yet when I execute the stored procedure it gives me the following error:
Cannot find either column "dbo" or the user-defined function or aggregate "dbo.GetExchSGInfo", or the name is ambiguous.
My code is below:
Stored Procedure:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[GetExchangeData2]
@top INT,
@SID INT,
@SGCount INT,
@ServerName VARCHAR(50)
AS
Set @SID = (SELECT ServerID FROM dbo.Servers WHERE ServerName = @ServerName)
Set @top = (SELECT sum(Children) FROM dbo.ExchangeSG WHERE ServerID = @SID)
Set @SGCount = (SELECT SGCount FROM dbo.Servers WHERE ServerID = @SID)
SET ROWCOUNT @top
SELECT dbo.ExchangeMSData.*, dbo.ExchangeMailStore.*, dbo.GetExchSGInfo(@SID,@SGCount) As ExchangeSG, dbo.Servers.*
FROM dbo.Servers INNER JOIN
ExchangeSG ON dbo.Servers.ServerID = ExchangeSG.ServerID INNER JOIN
dbo.ExchangeMailStore ON ExchangeSG.StorageGroupID = dbo.ExchangeMailStore.StorageGroupID INNER JOIN
dbo.ExchangeMSData ON dbo.ExchangeMailStore.MailstoreID = dbo.ExchangeMSData.MailstoreID
WHERE (dbo.Servers.ServerName = @ServerName)
ORDER BY dbo.ExchangeMSData.[TimeStamp] DESC, dbo.ExchangeSG.[TimeStamp] DESC
SET ROWCOUNT 0
And the Function:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [dbo].[GetExchSGInfo]
(
@SID INT,
@SGCount INT
)
RETURNS TABLE
AS
RETURN
(
SELECT TOP (@SGCount) *
FROM dbo.ExchangeSG
WHERE ServerID = @SID
ORDER BY [TimeStamp]
)
Can anyone help me?
Thanks.
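The error arises because a table-valued function returns a rowset, so it belongs in the FROM clause rather than in the select list; a rough sketch of the reworked FROM (column list abbreviated, untested against the real schema):
SELECT msd.*, ms.*, sg.*, s.*
FROM dbo.Servers s
INNER JOIN dbo.GetExchSGInfo(@SID, @SGCount) sg ON sg.ServerID = s.ServerID
INNER JOIN dbo.ExchangeMailStore ms ON ms.StorageGroupID = sg.StorageGroupID
INNER JOIN dbo.ExchangeMSData msd ON msd.MailstoreID = ms.MailstoreID
WHERE s.ServerName = @ServerName
ORDER BY msd.[TimeStamp] DESC, sg.[TimeStamp] DESC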
View 7 Replies
View Related
Feb 26, 2014
I have a query that doesn't even register a time when I run it. But when I add the lines that are commented out in the code below, it takes between 20 and 30 seconds! The code from the functions runs quickly when I run it directly; I assume that when I include it like this, the query loses its indexing capabilities?
SELECT
----, ISNULL(CAST(NULLIF(dbo.ufnGetRetail(I.ISBN13),0.00) AS VARCHAR(20)), 'N/A') RetailPrice
----, ISNULL(CAST(NULLIF(SP.LocalPrice,0.00) AS VARCHAR(20)),'on request') LocalPrice
[code]...
How can I keep the functions included but bring the query response time down?
View 4 Replies
View Related
Sep 6, 2006
I am a bit confused by the difference between a stored procedure and a table-valued function. Can somebody please either give me a simple explanation, or point me at something I can read.
I thought I had it worked out, and had coded some action queries as stored procedures, and I wrote a table-valued function that was effectively an encapsulated SELECT so that SELECT * FROM Spouse(@ID) worked fine. Then I wanted to use a function SpousePair, that was similar to Spouse, to power a Gridview. I discovered that I couldn't. It seems that a SQLDataSource requires either a SELECT statement or a stored procedure. So I wrote a stored procedure SpousePair(@ID1, @ID2).
I find that whereas I tested Spouse with
SELECT * FROM SPOUSE(@ID)
I tested SpousePair with
EXEC SpousePair @ID1, @ID2
Now I want to combine these: if I could I would write
SELECT * FROM SPOUSE(@ID) WHERE SPOUSEID NOT IN
(SELECT SPOUSEID FROM SpousePair(@ID1, @ID2))
However this is invalid because you can't put a stored procedure in a Select statement, and SELECT .... NOT IN (EXEC SpousePair @ID1 @ID2) is also invalid.
Is there any alternative to creating a table-valued function, SpousePairA, that is identical to SpousePair but coded as a function. I'm reluctant to do this because then I'll have two bits of quite complicated SQL logic to maintain.
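One alternative to maintaining two copies of the logic (a sketch, assuming the procedure's result set is just the SpouseID column; otherwise the temp table must match its full result set) is to capture the procedure's output in a temp table and filter against that:
CREATE TABLE #pair (SpouseID int)
INSERT INTO #pair EXEC SpousePair @ID1, @ID2
SELECT * FROM dbo.Spouse(@ID) WHERE SpouseID NOT IN (SELECT SpouseID FROM #pair)
DROP TABLE #pair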
View 4 Replies
View Related
Aug 4, 2006
So I was creating a new table-valued function today which queries some data from a preexisting table. Since this is my first table-valued function, I decided to check out some of the examples and see what I can figure out.
One particular example helped me out a bit until I ran into some data access issues...
http://msdn2.microsoft.com/en-us/library/ms165054.aspx
So I create my function:
[SqlFunction(DataAccess = DataAccessKind.Read, SystemDataAccess = SystemDataAccessKind.Read, FillRowMethodName = "FillMyRow", TableDefinition = "p1 int, p2 int")]
public static IEnumerable getMyTable()
{
using (SqlConnection conn = ....)
{
using (SqlCommand command = conn.CreateCommand())
{
///.... populate command text, open connection
using (SqlDataReader rdr = command.ExecuteReader())
{
while (rdr.Read())
{
customObject1 o = new customObject1();
///... populate o's parameters from reader ...
yield return o;
}
}
}
}
}
public static void FillMyRow(
object source,
out int p1,
out int p2)
{
customObject1 f = (customObject1)source;
p1 = f.p1;
p2 = f.p2;
}
Notice, this example yield returns the value o upon each iteration of the reader.
Despite the fact that the DataAccess is set to Read I still get the error...
An error occurred while getting new row from user defined Table Valued Function :
System.InvalidOperationException: Data access is not allowed in this context. Either the context is a function or method not marked with DataAccessKind.Read or SystemDataAccessKind.Read, is a callback to obtain data from FillRow method of a Table Valued Function, or is a UDT validation method.
I did however get past this error, by creating a collection of customObject1, populated it within the while(rdr.Read()) loop, then return the collection after closing the connection, command and reader.
I assume this error has something to do with the fact that you can't yield return results from within an open reader. Is this error right though in this case? What's causing it to throw an InvalidOperationException? Or is this a bug?
Thanks for the attention.
View 4 Replies
View Related
Dec 29, 2005
Hi,
Following is the user-defined function I want to use to get a maximum value. It gives me an error "@strTableName must declare".
CREATE FUNCTION dbo.GetMaximum
(
@strFieldName nvarchar(255),
@strTableName nvarchar(255)
)
RETURNS nvarchar(255)
AS
BEGIN
DECLARE @maxID int
SELECT @maxID=(SELECT IsNull(MAX(@strFieldName),0)+1 FROM @strTableName )
RETURN (@maxID)
END
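A function cannot run dynamic SQL, so the table and field names cannot be used as variables there; a sketch of the same idea as a stored procedure using sp_executesql (the procedure name is hypothetical, and QUOTENAME guards the identifiers):
CREATE PROCEDURE dbo.GetMaximumValue
@strFieldName sysname,
@strTableName sysname,
@maxID int OUTPUT
AS
BEGIN
DECLARE @sql nvarchar(500)
SET @sql = N'SELECT @maxID = ISNULL(MAX(' + QUOTENAME(@strFieldName) + N'), 0) + 1 FROM ' + QUOTENAME(@strTableName)
EXEC sp_executesql @sql, N'@maxID int OUTPUT', @maxID = @maxID OUTPUT
END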
View 5 Replies
View Related
May 13, 2008
Hi!
I need to expand the recursion level for a recursive CTE expression within a CREATE FUNCTION statement for an inline table function to a value greater than the default. It turns out that the OPTION clause for the MAXRECURSION hint works perfectly if I use it outside CREATE FUNCTION (as well as outside CREATE VIEW for non-parametrized queries), but it does not within the CREATE FUNCTION statement - I'm getting the error:
Msg 156, Level 15, State 1, Procedure ExpandedCTE, Line 34
Incorrect syntax near the keyword 'option'.
Here is the function:
create FUNCTION [dbo].[ExpandedCTE]
(
@p_id int
)
RETURNS TABLE
AS
RETURN
(
with tbl_cte (id, tbl_id, lvl)
as
(
select
id, tbl_id, 0 lvl
from
tbl
where
id = @p_id
union all
select
t.id, t.tbl_id, lvl + 1
from
tbl_cte
inner join tbl t
on t.tbl_id = tbl_cte.id
)
select
id, tbl_id, lvl
from
tbl_cte
option (maxrecursion 0)
)
Please help!
Alexander.
P.S.
I'm really sorry if it is about syntax, but I could not find it in the documentation.
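The MAXRECURSION hint is not accepted inside a view or an inline function, but it can be supplied by the statement that selects from the function; a sketch (42 is just an example id):
-- remove the OPTION clause from the function body, then hint the outer query instead
SELECT id, tbl_id, lvl
FROM dbo.ExpandedCTE(42)
OPTION (MAXRECURSION 0)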
View 12 Replies
View Related
May 26, 2006
I was playing around with the new SQL 2005 CLR functionality and remembered this discussion that I had with Erland Sommarskog concerning performance of scalar UDFs some time ago (see "Calling sp_oa* in function" in this newsgroup). In that discussion, Erland made the following comment about UDFs in SQL 2005:
>>The good news is that in SQL 2005, Microsoft has addressed several of these issues, and the cost of a UDF is not as severe there. In fact for a complex expression, a UDF written in a CLR language may be faster than the corresponding expression using built-in T-SQL functions.<<
I thought I would put this to the test using some of the same SQL as before, but adding a simple scalar CLR UDF into the mix. The test involved querying a simple table with about 300,000 rows. The scenarios are as follows:
(A) Use a simple CASE function to calculate a column
(B) Use a simple CASE function to calculate a column and as a criterion in the WHERE clause
(C) Use a scalar UDF to calculate a column
(D) Use a scalar UDF to calculate a column and as a criterion in the WHERE clause
(E) Use a scalar CLR UDF to calculate a column
(F) Use a scalar CLR UDF to calculate a column and as a criterion in the WHERE clause
A sample of the results is as follows (time in milliseconds):
(295310 row(s) affected) A: 1563
(150003 row(s) affected) B: 906
(295310 row(s) affected) C: 2703
(150003 row(s) affected) D: 2533
(295310 row(s) affected) E: 2060
(150003 row(s) affected) F: 2190
The scalar CLR UDF was significantly faster than the classic scalar UDF, even for this very simple function. Perhaps a more complex function would have shown an even greater difference. Based on this, I must conclude that Erland was right. Of course, it's still faster to stick with basic built-in functions like CASE.
In another test, I decided to run some queries to compare built-in aggregates vs. a couple of simple CLR aggregates as follows:
(G) Calculate averages by group using the built-in AVG aggregate
(H) Calculate averages by group using a CLR aggregate that simulates the built-in AVG aggregate
(I) Calculate a "trimmed" average by group (average excluding highest and lowest values) using built-in aggregates
(J) Calculate a "trimmed" average by group using a CLR aggregate specially designed for this purpose
A sample of the results is as follows (time in milliseconds):
(59 row(s) affected) G: 313
(59 row(s) affected) H: 890
(59 row(s) affected) I: 216
(59 row(s) affected) J: 846
It seems that the CLR aggregates came with a significant performance penalty over the built-in aggregates. Perhaps they would pay off if I were attempting a very complex type of aggregation. However, at this point I'm going to shy away from using these unless I can't find a way to do the calculation with standard SQL.
In a way, I'm happy that basic SQL still seems to be the fastest way to get things done. With the addition of the new CLR functionality, I suspect that MS may be giving us developers enough rope to comfortably hang ourselves if we're not careful.
Bill E.
Hollywood, FL
------------------------------------------------------------------------
-- table TestAssignment, about 300,000 rows
CREATE TABLE [dbo].[TestAssignment](
[TestAssignmentID] [int] NOT NULL,
[ProductID] [int] NULL,
[PercentPassed] [int] NULL,
CONSTRAINT [PK_TestAssignment] PRIMARY KEY CLUSTERED ([TestAssignmentID] ASC))
--Scalar UDF in SQL
CREATE FUNCTION [dbo].[fnIsEven](@intValue int)
RETURNS bit
AS
BEGIN
Declare @bitReturnValue bit
If @intValue % 2 = 0
Set @bitReturnValue=1
Else
Set @bitReturnValue=0
RETURN @bitReturnValue
END
--Scalar CLR UDF
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
public partial class UserDefinedFunctions
{
[Microsoft.SqlServer.Server.SqlFunction(IsDeterministic=true, IsPrecise=true)]
public static SqlBoolean IsEven(SqlInt32 value)
{
if (value % 2 == 0) { return true; } else { return false; }
}
};
*/
--Test #1
--Scenario A - Query with calculated column--
SELECT TestAssignmentID, CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment
--Scenario B - Query with calculated column as criterion--
SELECT TestAssignmentID, CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment
WHERE CASE WHEN TestAssignmentID % 2=0 THEN 1 ELSE 0 END=1
--Scenario C - Query using scalar UDF--
SELECT TestAssignmentID, dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment
--Scenario D - Query using scalar UDF as criterion--
SELECT TestAssignmentID, dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID)=1
--Scenario E - Query using CLR scalar UDF--
SELECT TestAssignmentID, dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment
--Scenario F - Query using CLR scalar UDF as criterion--
SELECT TestAssignmentID, dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID)=1
--CLR Aggregate functions
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct Avg
{
public void Init()
{
this.numValues = 0;
this.totalValue = 0;
}
public void Accumulate(SqlDouble Value)
{
if (!Value.IsNull)
{
this.numValues++;
this.totalValue += Value;
}
}
public void Merge(Avg Group)
{
if (Group.numValues > 0)
{
this.numValues += Group.numValues;
this.totalValue += Group.totalValue;
}
}
public SqlDouble Terminate()
{
if (numValues == 0)
{
return SqlDouble.Null;
}
else
{
return (this.totalValue / this.numValues);
}
}
// private accumulators
private int numValues;
private SqlDouble totalValue;
}
[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct TrimmedAvg
{
public void Init()
{
this.numValues = 0;
this.totalValue = 0;
this.minValue = SqlDouble.MaxValue;
this.maxValue = SqlDouble.MinValue;
}
public void Accumulate(SqlDouble Value)
{
if (!Value.IsNull)
{
this.numValues++;
this.totalValue += Value;
if (Value < this.minValue) this.minValue = Value;
if (Value > this.maxValue) this.maxValue = Value;
}
}
public void Merge(TrimmedAvg Group)
{
if (Group.numValues > 0)
{
this.numValues += Group.numValues;
this.totalValue += Group.totalValue;
if (Group.minValue < this.minValue) this.minValue = Group.minValue;
if (Group.maxValue > this.maxValue) this.maxValue = Group.maxValue;
}
}
public SqlDouble Terminate()
{
if (this.numValues < 3)
return SqlDouble.Null;
else
{
this.numValues -= 2;
this.totalValue -= this.minValue;
this.totalValue -= this.maxValue;
return (this.totalValue / this.numValues);
}
}
// private accumulators
private int numValues;
private SqlDouble totalValue;
private SqlDouble minValue;
private SqlDouble maxValue;
}
*/
--Test #2
--Scenario G - Average Query using built-in aggregate--
SELECT ProductID, Avg(Cast(PercentPassed AS float))
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID
--Scenario H - Average Query using CLR aggregate--
SELECT ProductID, dbo.Avg_CLR(Cast(PercentPassed AS float)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID
--Scenario I - Trimmed Average Query using built-in aggregates/set operations--
SELECT A.ProductID,
Case When B.CountValues<3 Then Null
Else Cast(A.Total-B.MaxValue-B.MinValue AS float)/Cast(B.CountValues-2 As float)
End AS Average
FROM
(SELECT ProductID, Sum(PercentPassed) AS Total
FROM TestAssignment
GROUP BY ProductID) A
LEFT JOIN
(SELECT ProductID, Max(PercentPassed) AS MaxValue, Min(PercentPassed) AS MinValue, Count(*) AS CountValues
FROM TestAssignment
WHERE PercentPassed Is Not Null
GROUP BY ProductID) B
ON A.ProductID=B.ProductID
ORDER BY A.ProductID
--Scenario J - Trimmed Average Query using CLR aggregate--
SELECT ProductID, dbo.TrimmedAvg_CLR(Cast(PercentPassed AS real)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID
View 9 Replies
View Related
May 14, 2008
Right now I have a stored procedure that goes through each of the Line and Body fields using a cursor. The problem is that this method is very slow. How would you experts solve this problem? Any hints or suggestions?
BEFORE
Example  Part    Line         Body    Series  Engine  Year
1        1234    A,B                  WE      TC      1998
2        56789   91,93,94,95          WE      T0      1997
3        345656  S,R          5,6,12  WE      NC      1995
AFTER
Example  Part    Line  Body  Series  Engine  Year
1        1234    A           WE      TC      1998
1        1234    B           WE      TC      1998
2        56789   91          WE      T0      1997
2        56789   93          WE      T0      1997
2        56789   94          WE      T0      1997
2        56789   95          WE      T0      1997
3        345656  S     5     WE      NC      1995
3        345656  S     6     WE      NC      1995
3        345656  S     12    WE      NC      1995
3        345656  R     5     WE      NC      1995
3        345656  R     6     WE      NC      1995
3        345656  R     12    WE      NC      1995
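A set-based alternative to the cursor (a sketch, assuming a helper table Numbers holding integers 1..n and a source table Parts with the columns shown above; the same join pattern repeated for Body gives the full cross product):
SELECT p.Part,
SUBSTRING(p.Line, n.n, CHARINDEX(',', p.Line + ',', n.n) - n.n) AS Line,
p.Body, p.Series, p.Engine, p.[Year]
FROM Parts p
JOIN Numbers n
ON n.n <= LEN(p.Line)
AND SUBSTRING(',' + p.Line, n.n, 1) = ','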
View 4 Replies
View Related
Sep 29, 2000
Hi,
In my SQL Server 7.0 instance, I have got 250 stored procedures in each database.
Before using them for my application, I want to encrypt them all.
I must add the "WITH ENCRYPTION" clause to each SP in every database and it'll take me a long time. Is there a faster way to encrypt all SPs in all DBs? Has anyone got a utility SP (or any other way) to do this?
Thanks in advance.
View 1 Replies
View Related
Mar 10, 2008
What is the fastest way for a stored procedure to copy a table from a linked server?
I would like to tune this statement, possibly with hints or other logging options. Assume that table_A and table_B have the exact same table structure and that I want to preserve table_A and all its indexes and constraints. The table will be truncated before this load, if that helps in any way.
insert into table_A select * from OpenQuery(Server,'select * from Table_B')
TIA, Mike
View 4 Replies
View Related
Jul 20, 2005
In relation to my last post, I have a question for the SQL gurus. I need to update 70k records, and mark all those updated in a special column for further processing by another system.
So, if the record was
Key1, foo, foo, ""
it needs to become
Key1, fap, fap, "U"
if and only if the data values are actually different (as above, foo becomes fap), otherwise it must remain
Key1, foo, foo, ""
Is it quicker to:
1) get the row of the destination table, inspect all values programmatically, and determine if an update query is needed, OR
2) just do an update on all rows, but adding "and (field1 <> value1 or field2 <> value2)" to the update query, that is
update myTable
set field1 = "foo", markField = "u"
where key = "mykey" and (field1 <> foo)
The first one will not generate new update queries if the record has not changed, on account of doing a select, whereas the second version always runs an update, but some of them will not affect any rows.
Will I need a full index on the second version?
Thanks in advance,
Asger Henriksen
View 2 Replies
View Related
Dec 26, 2000
Hi,
1) I need to transfer 500 GB of data from one server to another; which is faster: DTS, BCP, or restore?
2) What are the best methods for checking blocking, deadlocks & indexes?
Thank you all in advance
Richard..
View 5 Replies
View Related
Nov 5, 2004
I have a master table which has demographic data such as name, dob, location, along with a primary key id. It will have about 10-12,000 records. We get a refresh file every hour, with about 3,000 records, which may or may not have corrections for these records. I put this data into a table, and this data should always be considered to be correct. To handle the update to the master table I need to create an update process. I can take one of two approaches: just update all the records in the master table regardless of whether they are correct or not, or do some type of left join on those that do not match (in other words, only update the ones where the names or dob don't match). There is an underlying update trigger on the patient master which will also fire if these values are changed. Any opinions on the best approach?
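A set-based sketch of the "update only what differs" option, assuming the hourly refresh sits in a staging table keyed like the master (table and column names here are placeholders; NULLs would need ISNULL handling):
UPDATE m
SET m.name = s.name, m.dob = s.dob, m.location = s.location
FROM PatientMaster m
JOIN RefreshStaging s ON s.PatientID = m.PatientID
WHERE m.name <> s.name OR m.dob <> s.dob OR m.location <> s.location
-- only genuinely changed rows are touched, so the update trigger fires only for those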
View 1 Replies
View Related
Feb 2, 2006
Hi,
I have a production server that has an 8Gb db. It is dual Xeon with 5x HDD - 2 mirrored and 3 striped. db on stripe, log and OS on mirror. 2x Gb network cards.
The application goes slow (i.e. users notice) when a backup is running, so I have placed a crossover cable from one NIC to a test server so that it can back up to a HDD on that server, and then to tape. The test server has 2x Gb NICs and the link between the two servers is on a separate subnet. However, in the first trial of this the backup and verify take 3 minutes longer.
Is this because the target server doesn't have a disk stripe?
What is the best config for the production server (i.e. will a slower backup, but to another server, be less load to contend with the application)?
thanks
Fatherjack
View 2 Replies
View Related
May 22, 2007
I've got a view that is driven from an 80 million record table in a data warehouse. I am trying to populate an aggregate table in a datamart, but am running into performance problems. The datamart table needs to be updated daily. I understand there are many factors that affect performance, but in general would the fastest approach be:
1) Truncate the datamart table
2) Perform a bcp of the view to a text file
3) Bulk Insert to the datamart table
If you need more information to answer this please let me know.
Thanks,
Matt
View 7 Replies
View Related
Jul 23, 2005
Hi! We have SQL Server 2000 on our server (NT 4). Our database now has about +350,000 rows with information about images. The table has lots of columns including information about image name, keywords, location, price, color mode etc. So our database doesn't include the images themselves, just a path to the location of every image. The Keywords field has data like this, for example: cat,animal,pet,home,child with pet,child. Now our search uses Full-Text Search, which sounded like a good idea in the beginning, but it has had problems that really reduce our search engine's performance. Also, search results are not exact enough. Some of our images also have the photographer's name in the keywords column, and if the photographer's name is, for example, Peter Moss, his pictures appear in the web page when a customer wants to search for "moss" (nature-like) pictures.
Another problem is that Full-Text Search started to be very slow when the query result contains thousands of rows. When a search term gives at most 3000 rows, the search is fast, but larger searches take from 6 to 20 seconds to finish, which is not good. I have noticed also that the first search is always very slow, but the next ones are faster. It seems that the engine is just "starting" when the first query starts to run.
Is there a better and faster way to handle the queries? Is it better to rebuild the database somehow and use another search method than Full-Text Search? I don't know how else to handle the database when every image has about 10 to even 50 different keywords to search.
We have made the web interface and search code with ColdFusion. ColdFusion Server then takes care of sending all queries to SQL Server. I hope that somebody has some idea how to speed up our picture search.
--
Message posted via http://www.sqlmonster.com
View 2 Replies
View Related