Hi, I am writing a stored proc that will do a very simple search for users in the database. In my database I am storing a birthday as a datetime column called 'bday'. Users can search for people between a certain age range, say 23 to 30.
Here is my search query:
alter procedure sp_wm_quickSearch
@lookingFor int,
@mySex int,
@country varchar(500),
@ageTo int,
@ageFrom int,
@state varchar(10),
@userid int
as
select * from
wm_user a
inner join
wm_user_details b
on
a.userid=b.userid
where
a.lookingfor = 0 and
a.sex = 1 and
a.country = 'United States of America' and
a.state = 'Washington'
I found a simple function that looks like it can do what I need it to do:
CREATE FUNCTION dbo.fnYearsApart
(
@FromDate DATETIME,
@ToDate DATETIME
)
RETURNS INT
AS
BEGIN
RETURN CASE
WHEN @FromDate > @ToDate THEN NULL
WHEN DATEPART(day, @FromDate) > DATEPART(day, @ToDate) THEN DATEDIFF(month, @FromDate, @ToDate) - 1
ELSE DATEDIFF(month, @FromDate, @ToDate)
END / 12
END
MY QUESTION IS THIS:
How do I call this function in my query, pass it the age, and use the age as a search param? I imagine it would look something like this:
select * from
wm_user a
inner join
wm_user_details b
on
a.userid=b.userid
where
a.lookingfor = 0 and
a.sex = 1 and
a.country = 'United States of America' and
a.state = 'Washington' and
fnYearsApart(a.bday, Some DateTime.Now function) < 24 and > 30
but this line doesn't work (obviously I need to replace DateTime.Now):
fnYearsApart(a.bday, Some DateTime.Now function) < 24 and > 30
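For what it's worth, a minimal sketch of that last condition (assuming the function lives in the dbo schema, using GETDATE() in place of DateTime.Now, and using the procedure's parameters rather than the hard-coded values) might look like this:

select * from
wm_user a
inner join wm_user_details b on a.userid = b.userid
where
    a.lookingfor = @lookingFor and
    a.sex = @mySex and
    a.country = @country and
    a.state = @state and
    -- scalar UDFs are called with their schema prefix; GETDATE() is the T-SQL "now"
    dbo.fnYearsApart(a.bday, GETDATE()) between @ageFrom and @ageTo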
Supposing I have an SQL Server Function named Fxn1 which returns a VARCHAR(6) value. I want to use this returned value in another function named Fxn2 but Fxn2 accepts a parameter of type INT. Fxn2 itself returns an INT value. This has to be achieved in a single line.
The following line of code will not work in SQL Server. How should I achieve this?
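The usual pattern, sketched here with a hypothetical argument since the original line isn't shown, is to nest the calls and CAST the inner result to the type the outer function expects:

-- @someInput is a placeholder; Fxn1 returns VARCHAR(6), which is cast to INT for Fxn2
SELECT dbo.Fxn2(CAST(dbo.Fxn1(@someInput) AS INT)) AS Result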
I have a rather complex function (part of a production planning engine) that is written in VBA, and is part of my front end app. I'd like to be able to somehow call this function from either a sproc or DTS package in SQL. Is this possible, or am I going to have to convert the function to a SQL sproc (ugh)?
Hey guys, I recently found out you can actually JOIN tables that are on different databases, as long as they are on the same SQL Server, by specifying [DATABASE].[DATABASEOWNER].[TABLE]. I now have a question though. Can you access functions the same way? I have tried using the same syntax, but I get an error saying 'Invalid object name'.
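Functions can generally be referenced across databases in the same way, but unlike tables the owner/schema part cannot be skipped for user-defined functions, which is one common cause of that error. A sketch with made-up names:

-- OtherDb, dbo and fn_GetSomething are placeholders for your database, owner and function
SELECT OtherDb.dbo.fn_GetSomething(a.SomeColumn)
FROM dbo.LocalTable a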
Hi, I have a question about calling functions in SQL Server 2005.
Let's say that I have created as a dbo a function called Calculations.
If I want to call it from T-SQL I will write Select dbo.Calculations (arguments if any) etc.
My question is whether I can skip the "dbo" part and call the function without using it. Can I do that? Should I create the function as supervisor? Does SQL Server have a property or something which will allow me to call the function without using the "dbo." prefix?
I have a number of databases that require a set of common functions. I'd like to place all those functions in a central DB. I'm having trouble calling them using the syntax FunctionDB.GetParamLength() for example, which works within the FunctionDB database.
Any ideas/suggestions? I really don't want to maintain separate copies of the functions across 5+ databases.
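Two hedged options, using a made-up function signature: qualify the call with both the database and the schema (the schema part is required for user-defined functions), or, if you are on SQL Server 2005 or later, a synonym in each database may let the code stay unqualified locally.

-- direct call: database + schema + function
SELECT FunctionDB.dbo.GetParamLength(@SomeValue)

-- synonym (SQL Server 2005+): create once per database, then call as if it were local
CREATE SYNONYM dbo.GetParamLength FOR FunctionDB.dbo.GetParamLength
GO
SELECT dbo.GetParamLength(@SomeValue)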
Hi, I already tried to search recent posts for this problem, but I couldn't find the answer. I am trying to call a method in a DLL via an extended stored procedure. I registered the DLL as an extended SP with the name of the method, but after calling it in T-SQL I get this error message:

[Microsoft][ODBC SQL Server Driver][SQL Server]Cannot find the function SendGeneralNotify_FromA in the library [LibraryName.dll]. Reason: 127 (error not found).

The DLL contains only one class, and it has events, properties and methods; I want to call one of these methods. One important note: please don't say that this DLL should be written in C++, because it already is (no VB). Does anybody have an example of how I should call it so that I can get access to this DLL? Sorry for my poor English. With best regards, looking forward to a reply - Matik
We have a user-defined function that can be called directly via SQL (in SQL Server Management Studio) without error. We would like to use this function to populate a column whilst data is being processed within Integration Services. Using an OLE DB Command transformation to achieve this would seem the most appropriate.
The following was inserted for the SQLCommand property:
However, when the Refresh button is pressed we are presented with the error below:
Error at Load Orderline [OLE DB Command [15171]]: An OLE DB error has occurred. Error code: 0x8004E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x8004E14 Description: "Invalid parameter number".
If we use SET instead of EXEC (e.g. SET ? = dbo.GetOrderlineStatus(@dt_required = ?, @dt_invoice = ?, @dt_despatch = ?, @ch_status = ?, @si_suffix = ?, @re_quantity = ?, @vc_invoice_id = ?, @vc_order_id = ?)) the following error is produced:
Error at Load Orderline [OLE DB Command [15171]]: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Syntax error, permission violation, or other nonspecific error".
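One hedged alternative, rather than a fix for the OLE DB Command itself: since the function works fine from plain SQL, the column can often be computed in the data-flow source query instead. The table and column names below are assumptions based on the parameter names shown above:

SELECT o.*,
       dbo.GetOrderlineStatus(o.dt_required, o.dt_invoice, o.dt_despatch,
                              o.ch_status, o.si_suffix, o.re_quantity,
                              o.vc_invoice_id, o.vc_order_id) AS OrderlineStatus
FROM dbo.Orderline o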
Can someone tell me if this is a SQL Server bug? I tried this in both version 7 and 2000, the results are the same.
DECLARE @timeA DATETIME
DECLARE @timeB DATETIME
DECLARE @msDiff INT

SET @timeA = GETDATE()
SET @msDiff = 0

WHILE @msDiff <= 10
BEGIN
    SET @timeB = DATEADD(ms, @msDiff, @timeA)
    PRINT 'If adding ' + CONVERT(VARCHAR, @msDiff) + ' milliseconds to Time B, then Time B is '
        + CONVERT(VARCHAR, DATEDIFF(ms, @timeA, @timeB)) + ' millisecond greater than Time A'
    SET @msDiff = @msDiff + 1
END
This seems like a serious bug if an application depends heavily on milliseconds comparison.
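A small illustration (not part of the original post) of what is most likely happening: the datetime type stores time in 1/300-second ticks, so millisecond values snap to .000/.003/.007 increments, and DATEDIFF(ms, ...) then reports the rounded values. Under that reading this is documented precision behaviour rather than a bug.

SELECT CONVERT(datetime, '2007-01-01 00:00:00.001') AS ms001,  -- stored as .000
       CONVERT(datetime, '2007-01-01 00:00:00.002') AS ms002,  -- stored as .003
       CONVERT(datetime, '2007-01-01 00:00:00.005') AS ms005   -- stored as .007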
Edit: There seems to be some interaction between GROUP BY/NEWID() that can cause problems if these functions are used directly in an aggregate query. Please read the posts on this thread about this, and use caution.
This script creates three functions, F_RANDOM_INTEGER, F_RANDOM_SAMPLE, and F_RANDOM_DATETIME. The last parameter of each function must be the NEWID() function, which is used to generate the random number.
These functions are designed for applications where it is necessary to generate random integers, random datetimes, or take random samples from sets of data. Typical applications would be software testing, inventory control, auditing, and product quality testing.
Function F_RANDOM_INTEGER returns a random integer in the range of the input parameters so that the return value is >= @START_INT and <= @END_INT. It is valid for any range of two integer values.
Function F_RANDOM_SAMPLE returns a 1 or a 0 to determine if a sample should be selected, based on the input sample rate. Input parameter @SAMPLE_RATE should be between 1 and 999,999. The sample rate determines how many samples should be selected out of each 1,000,000 samples. A sample rate below 1 will never select a sample, and a sample rate above 999,999 will always select a sample. A sample rate of 1,000 = 0.1%, 50,000 = 5%, 63,775 = 6.3775%, 100,000 = 10%, and 500,000 = 50%.
F_RANDOM_DATETIME returns a random datetime value >= @START_TIME and < @END_TIME. It is valid for any datetime range. Input parameters default, if null, to @START_TIME = '19000101' and @END_TIME = '19000102'. The datetime is random to the level of clock ticks (1/300 of a second). Note that the latest time is not included in the range of datetime values that can be returned. This is to allow selection of times within adjacent time periods, without having to specify times to the level of milliseconds. This means a range of 1990-12-01 01:00:00.000 through 1990-12-01 02:00:00.000 will never return a value of 1990-12-01 02:00:00.000.
The NEWID() function is the basis of the random numbers. These functions should not be considered random for purposes of data encryption or other high-security applications. However, they should be adequate for business applications of the types mentioned above. I conducted extensive testing with the functions where I generated millions of results, analyzed the results in various ways to look for non-random patterns, and saw no evidence of non-random results.
The script also includes a demo of each function, and sample output from the demos is also included.
The demo script uses the number table function, F_TABLE_NUMBER_RANGE, available on this link: http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=47685
if objectproperty(object_id('dbo.F_RANDOM_INTEGER'),'IsScalarFunction') = 1
begin
	drop function dbo.F_RANDOM_INTEGER
end
go
create function dbo.F_RANDOM_INTEGER
(
	@START_INT	int,
	@END_INT	int,
	@NEWID		uniqueidentifier
)
returns int
as
/*
Function: F_RANDOM_INTEGER

This function returns a random integer value >= @START_INT and <= @END_INT.

Valid for any integer range.

Requires newid() to be passed as input parameter @NEWID to generate the random number.

-- Return random integer between -100 and 200000
select [Random Integer] = [dbo].[F_RANDOM_INTEGER](-100, 200000, newid())
*/
begin
	declare @sn bigint
	declare @en bigint
	declare @mod bigint
	declare @rand_bigint bigint
	declare @rand_result bigint

	-- Set default values for input integers if they are null
	if @START_INT is null begin set @START_INT = 0 end
	if @END_INT is null begin set @END_INT = 1000000 end

	-- Set order of input parameters so that the return value is always
	-- the same no matter what order the input values are passed.
	if @START_INT > @END_INT
		select @sn = @END_INT, @en = @START_INT
	else
		select @sn = @START_INT, @en = @END_INT

	-- Return start int if start int = end int
	if @sn = @en return @sn

	-- Get modulus
	select @mod = @en - @sn + 1

	-- Get random bigint from input parameter @NEWID
	select @rand_bigint = abs(convert(bigint, convert(varbinary(20), @NEWID)))

	-- Get random integer
	select @rand_result = @sn + (@rand_bigint % @mod)

	return @rand_result
end
go
grant execute on dbo.F_RANDOM_INTEGER to public
go
if objectproperty(object_id('dbo.F_RANDOM_SAMPLE'),'IsScalarFunction') = 1
begin
	drop function dbo.F_RANDOM_SAMPLE
end
go
create function dbo.F_RANDOM_SAMPLE
(
	@SAMPLE_RATE	int,
	@NEWID		uniqueidentifier
)
returns int
as
/*
Function: F_RANDOM_SAMPLE

This function returns a 1 or a 0 to determine if a sample should be selected, based on the sample rate. It is designed to select random samples at a specific rate.

Input parameter @SAMPLE_RATE should be between 1 and 999,999. The sample rate determines how many samples should be selected out of each 1,000,000 samples. A sample rate below 1 will never select a sample, and a sample rate above 999,999 will always select a sample. 1,000 = 0.1%, 50,000 = 5%, 63,775 = 6.3775%, 100,000 = 10%, and 500,000 = 50%.

Requires newid() to be passed as input parameter @NEWID to generate the random number.

-- Select sample 200,000 times in 1,000,000 samples (20%)
select [Random Sample] = [dbo].[F_RANDOM_SAMPLE](200000, newid())
*/
begin
	declare @rand_bigint bigint

	-- Get random bigint from @NEWID
	select @rand_bigint = abs(convert(bigint, convert(varbinary(20), @NEWID)))

	-- Select sample if the modulus of @rand_bigint is less than the sample rate
	return case when @rand_bigint % 1000000 < @SAMPLE_RATE then 1 else 0 end
end
go
grant execute on dbo.F_RANDOM_SAMPLE to public
go
if objectproperty(object_id('dbo.F_RANDOM_DATETIME'),'IsScalarFunction') = 1
begin
	drop function dbo.F_RANDOM_DATETIME
end
go
create function dbo.F_RANDOM_DATETIME
(
	@START_TIME	datetime,
	@END_TIME	datetime,
	@NEWID		uniqueidentifier
)
returns datetime
as
/*
Function: F_RANDOM_DATETIME

This function returns a random datetime value >= @START_TIME and < @END_TIME.

Valid for any datetime range.

Input parameters default, if null, to:
	@START_TIME	'19000101'
	@END_TIME	'19000102'

Requires newid() to be passed as input parameter @NEWID to generate the random number.

-- Return random time between 08:30 and 12:00
select [Random Time] = [dbo].[F_RANDOM_DATETIME]('08:30:00.000', '12:00:00.000', newid())
*/
begin
	declare @st datetime
	declare @et datetime

	declare @hours int
	declare @ms int
	declare @ticks bigint
	declare @rand_ticks bigint
	declare @rand_bigint bigint

	declare @remaining_ticks int
	declare @return_hours int
	declare @return_ms int

	-- Set default values for input dates if they are null
	if @START_TIME is null begin set @START_TIME = '19000101' end
	if @END_TIME is null begin set @END_TIME = '19000102' end

	-- Set order of input parameters so that the return value is always
	-- the same no matter what order the input values are passed.
	if @START_TIME > @END_TIME
		select @st = @END_TIME, @et = @START_TIME
	else
		select @st = @START_TIME, @et = @END_TIME

	-- Return start time if start time = end time
	if @st = @et return @st

	-- Get hours boundary difference.
	-- Subtract 1 from diff, before dividing by 2 and multiplying by 2,
	-- so the milliseconds remaining is always positive and
	-- hours is always >= zero.
	set @hours = ((datediff(hh, @st, @et) - 1) / 2) * 2

	-- Get remainder milliseconds
	set @ms = datediff(ms, 0, @et - dateadd(hh, @hours, @st))

	-- Convert remainder milliseconds to
	-- SQL Server 'clock ticks' of 1/300 of a second
	set @ticks = ((@ms / 10) * 3) + ((@ms % 10) / 3)

	-- Add hours * ticks per hour (3600 * 300) to give total
	-- ticks between @START_TIME and @END_TIME
	set @ticks = @ticks + (@hours * 1080000)

	-- Get random bigint from input parameter @NEWID
	select @rand_bigint = abs(convert(bigint, convert(varbinary(20), @NEWID)))

	-- Get random number of ticks
	select @rand_ticks = @rand_bigint % @ticks

	-- Get hours component of random ticks
	select @return_hours = @rand_ticks / 1080000

	-- Get left over ticks after removing hours
	select @remaining_ticks = @rand_ticks % 1080000

	-- Convert remaining clock ticks back to milliseconds
	select @return_ms = ((@remaining_ticks / 3) * 10) + floor(((@remaining_ticks % 3) * 3.5))

	-- Return the random time between the start and end time
	return dateadd(ms, @return_ms, dateadd(hh, @return_hours, @st))
end
go
grant execute on dbo.F_RANDOM_DATETIME to public
go
insert into @t
select
	-- Get integer in range of 1 to 10,000,000
	[Random Integer] = [dbo].[F_RANDOM_INTEGER](1, 10000000, newid())
from
	-- Function F_TABLE_NUMBER_RANGE
	-- available in Script Library forum
	F_TABLE_NUMBER_RANGE(1, 100000)
select
	[Right Int] = [Random Integer] % 10,
	[Count] = count(*)
from @t a
group by [Random Integer] % 10
order by 1, 2
select
	[Million Range] = [Random Integer] / 1000000,
	[Count] = count(*)
from @t a
group by [Random Integer] / 1000000
order by 1, 2
insert into @t
select
	-- Sample rate = 6.3775%
	[Sample Taken] = [dbo].[F_RANDOM_SAMPLE](63775, newid())
from
	-- Function F_TABLE_NUMBER_RANGE
	-- available in Script Library forum
	F_TABLE_NUMBER_RANGE(1, 100000)
select
	[Sample Taken],
	[Result Count] = count(*)
from @t a
group by [Sample Taken]
order by 1, 2
go
print '-----------------------------------------------------------------'
print ' Demo F_RANDOM_DATETIME function'
print '-----------------------------------------------------------------'
print ''

select
	Random_Datetime = convert(varchar(23), [dbo].[F_RANDOM_DATETIME](a.ST, a.ET, newid()), 121),
	[Start] = convert(varchar(23), a.ST, 121),
	[End] = convert(varchar(23), a.ET, 121),
	a.Comment
from (
	select ST = getdate(), ET = getdate() + 2, Comment = 'Now thru 2 days from now' union all
	select '20060101', '20060102', 'One day diff' union all
	select '20030101', '20030101', 'Both times same' union all
	select '20030101', '20030108', 'One week diff' union all
	select '20021228', '20030104', 'One week diff' union all
	select '20010701', '20010713', '12 day diff' union all
	select '20010701', '20010714', '13 day diff' union all
	select '20010630', '20010713', '13 day diff' union all
	select '19901201 01:00:00.000', '19901201 02:00:00.000', '1 hour diff' union all
	select '19901201 01:00:33.003', '19901201 02:00:33.003', '1 hour diff' union all
	select '19901201 01:00:00.000', '19901201 01:30:00.000', '30 min diff' union all
	select '19901201 01:00:33.447', '19901201 01:30:33.447', '30 min diff' union all
	select '19901201 01:00:00.000', '19901201 01:05:00.000', '5 min diff' union all
	select '19901201 01:00:29.123', '19901201 01:05:29.123', '5 min diff' union all
	select '19901201 01:00:00.000', '19901201 01:01:00.000', '1 min diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:01.000', '1 sec diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.100', '100 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.050', '50 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.023', '23 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.020', '20 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.013', '13 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.010', '10 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.007', '7 ms diff' union all
	select '19901201 01:00:00.000', '19901201 01:00:00.003', '3 ms diff' union all
	select '20030101', '20030201', 'One month diff 31 days' union all
	select '20030101', '20040101', 'One year diff' union all
	select '20050101', '20070101', 'Two year diff' union all
	select '20060101', '20060301', '2 month diff' union all
	select null, '20060101', 'Start time null' union all
	select '20060102', null, 'End time null' union all
	select null, null, 'Both null' union all
	select '17530101', '99991231 23:59:59.997', 'Max datetime diff' union all
	select '99991231 23:59:59.997', '17530101', 'Max datetime diff reversed'
	) a
Demo Results:
-----------------------------------------------------------------
 Demo F_RANDOM_INTEGER function
-----------------------------------------------------------------

-----------------------------------------------------------------
 Demo F_RANDOM_SAMPLE function
-----------------------------------------------------------------
(100000 row(s) affected)
Sample Taken Result Count
------------ ------------
0            93669
1            6331
(2 row(s) affected)
-----------------------------------------------------------------
 Demo F_RANDOM_DATETIME function
-----------------------------------------------------------------
Random_Datetime         Start                   End                     Comment
----------------------- ----------------------- ----------------------- --------------------------
2006-07-24 08:21:22.593 2006-07-23 17:54:47.283 2006-07-25 17:54:47.283 Now thru 2 days from now
2006-01-01 06:44:36.897 2006-01-01 00:00:00.000 2006-01-02 00:00:00.000 One day diff
2003-01-01 00:00:00.000 2003-01-01 00:00:00.000 2003-01-01 00:00:00.000 Both times same
2003-01-05 01:57:02.183 2003-01-01 00:00:00.000 2003-01-08 00:00:00.000 One week diff
2003-01-01 20:02:05.550 2002-12-28 00:00:00.000 2003-01-04 00:00:00.000 One week diff
2001-07-02 11:35:11.147 2001-07-01 00:00:00.000 2001-07-13 00:00:00.000 12 day diff
2001-07-02 16:39:57.433 2001-07-01 00:00:00.000 2001-07-14 00:00:00.000 13 day diff
2001-07-06 12:33:53.087 2001-06-30 00:00:00.000 2001-07-13 00:00:00.000 13 day diff
1990-12-01 01:15:42.530 1990-12-01 01:00:00.000 1990-12-01 02:00:00.000 1 hour diff
1990-12-01 01:02:21.647 1990-12-01 01:00:33.003 1990-12-01 02:00:33.003 1 hour diff
1990-12-01 01:21:06.267 1990-12-01 01:00:00.000 1990-12-01 01:30:00.000 30 min diff
1990-12-01 01:26:17.983 1990-12-01 01:00:33.447 1990-12-01 01:30:33.447 30 min diff
1990-12-01 01:00:56.327 1990-12-01 01:00:00.000 1990-12-01 01:05:00.000 5 min diff
1990-12-01 01:03:20.423 1990-12-01 01:00:29.123 1990-12-01 01:05:29.123 5 min diff
1990-12-01 01:00:21.617 1990-12-01 01:00:00.000 1990-12-01 01:01:00.000 1 min diff
1990-12-01 01:00:00.443 1990-12-01 01:00:00.000 1990-12-01 01:00:01.000 1 sec diff
1990-12-01 01:00:00.050 1990-12-01 01:00:00.000 1990-12-01 01:00:00.100 100 ms diff
1990-12-01 01:00:00.000 1990-12-01 01:00:00.000 1990-12-01 01:00:00.050 50 ms diff
1990-12-01 01:00:00.010 1990-12-01 01:00:00.000 1990-12-01 01:00:00.023 23 ms diff
1990-12-01 01:00:00.017 1990-12-01 01:00:00.000 1990-12-01 01:00:00.020 20 ms diff
1990-12-01 01:00:00.007 1990-12-01 01:00:00.000 1990-12-01 01:00:00.013 13 ms diff
1990-12-01 01:00:00.000 1990-12-01 01:00:00.000 1990-12-01 01:00:00.010 10 ms diff
1990-12-01 01:00:00.003 1990-12-01 01:00:00.000 1990-12-01 01:00:00.007 7 ms diff
1990-12-01 01:00:00.000 1990-12-01 01:00:00.000 1990-12-01 01:00:00.003 3 ms diff
2003-01-14 09:00:09.520 2003-01-01 00:00:00.000 2003-02-01 00:00:00.000 One month diff 31 days
2003-08-27 11:47:04.100 2003-01-01 00:00:00.000 2004-01-01 00:00:00.000 One year diff
2006-11-23 03:57:21.737 2005-01-01 00:00:00.000 2007-01-01 00:00:00.000 Two year diff
2006-01-12 08:50:40.717 2006-01-01 00:00:00.000 2006-03-01 00:00:00.000 2 month diff
1933-11-15 13:39:10.050 NULL                    2006-01-01 00:00:00.000 Start time null
1997-05-28 06:42:32.407 2006-01-02 00:00:00.000 NULL                    End time null
1900-01-01 01:50:42.743 NULL                    NULL                    Both null
2758-10-18 13:50:47.987 1753-01-01 00:00:00.000 9999-12-31 23:59:59.997 Max datetime diff
8426-03-24 13:51:08.407 9999-12-31 23:59:59.997 1753-01-01 00:00:00.000 Max datetime diff reversed
I was playing around with the new SQL 2005 CLR functionality and remembered this discussion that I had with Erland Sommarskog concerning performance of scalar UDFs some time ago (see "Calling sp_oa* in function" in this newsgroup). In that discussion, Erland made the following comment about UDFs in SQL 2005:

>> The good news is that in SQL 2005, Microsoft has addressed several of these issues, and the cost of a UDF is not as severe there. In fact, for a complex expression, a UDF written in a CLR language may be faster than the corresponding expression using built-in T-SQL functions. <<

I thought I would put this to the test using some of the same SQL as before, but adding a simple scalar CLR UDF into the mix. The test involved querying a simple table with about 300,000 rows. The scenarios are as follows:

(A) Use a simple CASE function to calculate a column
(B) Use a simple CASE function to calculate a column and as a criterion in the WHERE clause
(C) Use a scalar UDF to calculate a column
(D) Use a scalar UDF to calculate a column and as a criterion in the WHERE clause
(E) Use a scalar CLR UDF to calculate a column
(F) Use a scalar CLR UDF to calculate a column and as a criterion in the WHERE clause

A sample of the results is as follows (time in milliseconds):

(295310 row(s) affected) A: 1563
(150003 row(s) affected) B: 906
(295310 row(s) affected) C: 2703
(150003 row(s) affected) D: 2533
(295310 row(s) affected) E: 2060
(150003 row(s) affected) F: 2190

The scalar CLR UDF was significantly faster than the classic scalar UDF, even for this very simple function. Perhaps a more complex function would have shown even a greater difference. Based on this, I must conclude that Erland was right. Of course, it's still faster to stick with basic built-in functions like CASE.

In another test, I decided to run some queries to compare built-in aggregates vs. a couple of simple CLR aggregates as follows:

(G) Calculate averages by group using the built-in AVG aggregate
(H) Calculate averages by group using a CLR aggregate that simulates the built-in AVG aggregate
(I) Calculate a "trimmed" average by group (average excluding highest and lowest values) using built-in aggregates
(J) Calculate a "trimmed" average by group using a CLR aggregate specially designed for this purpose

A sample of the results is as follows (time in milliseconds):

(59 row(s) affected) G: 313
(59 row(s) affected) H: 890
(59 row(s) affected) I: 216
(59 row(s) affected) J: 846

It seems that the CLR aggregates came with a significant performance penalty over the built-in aggregates. Perhaps they would pay off if I were attempting a very complex type of aggregation. However, at this point I'm going to shy away from using these unless I can't find a way to do the calculation with standard SQL.

In a way, I'm happy that basic SQL still seems to be the fastest way to get things done.
With the addition of the new CLR functionality, I suspect that MS may be giving us developers enough rope to comfortably hang ourselves if we're not careful.

Bill E.
Hollywood, FL

-------------------------------------------------------------------------

-- table TestAssignment, about 300,000 rows
CREATE TABLE [dbo].[TestAssignment](
	[TestAssignmentID] [int] NOT NULL,
	[ProductID] [int] NULL,
	[PercentPassed] [int] NULL,
	CONSTRAINT [PK_TestAssignment] PRIMARY KEY CLUSTERED ([TestAssignmentID] ASC)
)

--Scalar UDF in SQL
CREATE FUNCTION [dbo].[fnIsEven](@intValue int)
RETURNS bit
AS
BEGIN
	Declare @bitReturnValue bit
	If @intValue % 2 = 0
		Set @bitReturnValue = 1
	Else
		Set @bitReturnValue = 0
	RETURN @bitReturnValue
END

--Scalar CLR UDF
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    [Microsoft.SqlServer.Server.SqlFunction(IsDeterministic=true, IsPrecise=true)]
    public static SqlBoolean IsEven(SqlInt32 value)
    {
        if (value % 2 == 0)
        {
            return true;
        }
        else
        {
            return false;
        }
    }
};
*/

--Test #1

--Scenario A - Query with calculated column
SELECT TestAssignmentID,
	CASE WHEN TestAssignmentID % 2 = 0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment

--Scenario B - Query with calculated column as criterion
SELECT TestAssignmentID,
	CASE WHEN TestAssignmentID % 2 = 0 THEN 1 ELSE 0 END AS CalcColumn
FROM TestAssignment
WHERE CASE WHEN TestAssignmentID % 2 = 0 THEN 1 ELSE 0 END = 1

--Scenario C - Query using scalar UDF
SELECT TestAssignmentID,
	dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment

--Scenario D - Query using scalar UDF as criterion
SELECT TestAssignmentID,
	dbo.fnIsEven(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID) = 1

--Scenario E - Query using CLR scalar UDF
SELECT TestAssignmentID,
	dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment

--Scenario F - Query using CLR scalar UDF as criterion
SELECT TestAssignmentID,
	dbo.fnIsEven_CLR(TestAssignmentID) AS CalcColumn
FROM TestAssignment
WHERE dbo.fnIsEven(TestAssignmentID) = 1

--CLR Aggregate functions
/*
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct Avg
{
    public void Init()
    {
        this.numValues = 0;
        this.totalValue = 0;
    }

    public void Accumulate(SqlDouble Value)
    {
        if (!Value.IsNull)
        {
            this.numValues++;
            this.totalValue += Value;
        }
    }

    public void Merge(Avg Group)
    {
        if (Group.numValues > 0)
        {
            this.numValues += Group.numValues;
            this.totalValue += Group.totalValue;
        }
    }

    public SqlDouble Terminate()
    {
        if (numValues == 0)
        {
            return SqlDouble.Null;
        }
        else
        {
            return (this.totalValue / this.numValues);
        }
    }

    // private accumulators
    private int numValues;
    private SqlDouble totalValue;
}

[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.Native)]
public struct TrimmedAvg
{
    public void Init()
    {
        this.numValues = 0;
        this.totalValue = 0;
        this.minValue = SqlDouble.MaxValue;
        this.maxValue = SqlDouble.MinValue;
    }

    public void Accumulate(SqlDouble Value)
    {
        if (!Value.IsNull)
        {
            this.numValues++;
            this.totalValue += Value;
            if (Value < this.minValue) this.minValue = Value;
            if (Value > this.maxValue) this.maxValue = Value;
        }
    }

    public void Merge(TrimmedAvg Group)
    {
        if (Group.numValues > 0)
        {
            this.numValues += Group.numValues;
            this.totalValue += Group.totalValue;
            if (Group.minValue < this.minValue) this.minValue = Group.minValue;
            if (Group.maxValue > this.maxValue) this.maxValue = Group.maxValue;
        }
    }

    public SqlDouble Terminate()
    {
        if (this.numValues < 3)
            return SqlDouble.Null;
        else
        {
            this.numValues -= 2;
            this.totalValue -= this.minValue;
            this.totalValue -= this.maxValue;
            return (this.totalValue / this.numValues);
        }
    }

    // private accumulators
    private int numValues;
    private SqlDouble totalValue;
    private SqlDouble minValue;
    private SqlDouble maxValue;
}
*/

--Test #2

--Scenario G - Average Query using built-in aggregate
SELECT ProductID, Avg(Cast(PercentPassed AS float))
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID

--Scenario H - Average Query using CLR aggregate
SELECT ProductID, dbo.Avg_CLR(Cast(PercentPassed AS float)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID

--Scenario I - Trimmed Average Query using built-in aggregates/set operations
SELECT A.ProductID,
	Case
		When B.CountValues < 3 Then Null
		Else Cast(A.Total - B.MaxValue - B.MinValue AS float) / Cast(B.CountValues - 2 As float)
	End AS Average
FROM
	(SELECT ProductID, Sum(PercentPassed) AS Total
	 FROM TestAssignment
	 GROUP BY ProductID) A
LEFT JOIN
	(SELECT ProductID,
		Max(PercentPassed) AS MaxValue,
		Min(PercentPassed) AS MinValue,
		Count(*) AS CountValues
	 FROM TestAssignment
	 WHERE PercentPassed Is Not Null
	 GROUP BY ProductID) B
ON A.ProductID = B.ProductID
ORDER BY A.ProductID

--Scenario J - Trimmed Average Query using CLR aggregate
SELECT ProductID, dbo.TrimmedAvg_CLR(Cast(PercentPassed AS real)) AS Average
FROM TestAssignment
GROUP BY ProductID
ORDER BY ProductID
Hi, I'm inserting datetime values into SQL Server 2000 from C#.
SQL Server table details: table name date_test; columns: No (int), date_t (DateTime).
C# code:

SqlConnection connectionToDatabase = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=testdb;Integrated Security=SSPI");
connectionToDatabase.Open();
DataTable dt1 = new DataTable();
dt1.Columns.Add("no", typeof(System.Int16));
dt1.Columns.Add("date_t", typeof(System.DateTime));
DataRow dr = dt1.NewRow();
dr["no"] = 1;
dr["date_t"] = DateTime.Now;
dt1.Rows.Add(dr);
for (int i = 0; i < dt1.Rows.Count; i++)
{
    string str = dt1.Rows[i]["no"].ToString();
    DateTime dt = (DateTime)dt1.Rows[i]["date_t"];
    string insertQuery = "insert into date_test values(" + str + ",'" + dt + "')";
    SqlCommand cmd = new SqlCommand(insertQuery, connectionToDatabase);
    cmd.ExecuteNonQuery();
    MessageBox.Show("saved");
}

When I run the above code, data is inserted into the table. The value in the date_t column is 2007-07-09 22:10:11.000; the milliseconds value is always 000. I need the millisecond values in the date_t column as well. Is there any conversion needed for the millisecond values?
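The column itself keeps milliseconds; the likely culprit is that the code concatenates DateTime.ToString() into the SQL string, and the default ToString() format drops the millisecond part. Passing the value as a SqlParameter instead of building the string preserves it. A quick T-SQL check against the table from the post (the .123 value is only an example):

insert into date_test (no, date_t) values (2, '2007-07-09 22:10:11.123')
select no, convert(varchar(23), date_t, 121) as date_t from date_test
-- the second row shows the milliseconds were stored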
Hi all, I'm having a little problem with saving dates to a SQL database. I've got the CreatedOn field in the table set to the datetime type, but every time I try to run it I get an error: "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value. The statement has been terminated." I've tried researching it but haven't been able to find anything similar. Here's the code:

DateTime createOn = DateTime.Now;
string sSQLStatement = "INSERT INTO Index (Name, Description, Creator, CreatedOn) values ('" + name + "','" + description + "','" + userName + "','" + createOn + "')";

Any help would be much appreciated.
Hey :) I'm facing a lot of trouble trying to create a new pause/break system. Right now I'm building the query that counts how many records fall inside two datetime fields. Let me first show you my table:

ID (int) | stamp_start (DateTime) | stamp_end (DateTime) | Username (varchar)
0 | 17-03-07 12:00:00 | 17-03-07 12:30:00 | Hovgaard

The client will enter a start time and an end time, and this query should then count how many records are inside this period of time. Example: the client enters start time 12:05 and end time 12:35; the query should then return 1 record found. The same thing if the user enters 12:20 and 12:50. My current query looks like this:

SELECT COUNT(ID) AS Expr1 FROM table WHERE (start_stamp <= @pausetime_start) AND (end_stamp >= @pausetime_end)

But this will only count if I enter the exact same times as the ones inside the table. Any ideas how I can figure this out? Thanks for your time so far :)
/Jonas Hovgaard - Denmark
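If the goal is to count every record whose interval overlaps the client's interval (rather than fully contains it), the standard overlap test compares each start against the opposite end. A sketch using the column names from the table above (the table name is a placeholder):

SELECT COUNT(ID) AS Expr1
FROM myTable
WHERE stamp_start <= @pausetime_end
  AND stamp_end   >= @pausetime_start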
Hi, I have a column of type datetime in SQL Server 2000. Whenever I try to insert the date '31/08/2006 23:28:59' I get the error "...datetime data type resulted in an out-of-range datetime value". I've looked everywhere and I can't solve the problem. Please note, I first got this error from an ASP.NET page, and in order to ensure that it wasn't some problem with culture settings I decided to run the query straight in SQL Query Analyzer. The results were the same. What else could it be? Cheers, Ernest
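'31/08/2006' is day-first, so under the default U.S. date order SQL Server tries to read 31 as a month and fails. Two common workarounds, sketched with placeholder table/column names:

-- tell the session the literal is dd/mm/yyyy
SET DATEFORMAT dmy
INSERT INTO MyTable (MyDate) VALUES ('31/08/2006 23:28:59')

-- or convert explicitly with style 103 (dd/mm/yyyy) so the session setting doesn't matter
SELECT CONVERT(datetime, '31/08/2006 23:28:59', 103)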
I am inserting date and time data into a SQL Server 2012 Express table from an application. The application is providing the date and time as a string data type. Is there a T-SQL way to convert the date and time string to a SQL datetime data type? I want to do the conversion because SQL displays an error due to the format of the string.
My date and time string from the application looks like : 3/11/2014 12:57:57 PM
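Since this is SQL Server 2012, TRY_CONVERT with the U.S. style 101 (mm/dd/yyyy) is one way to sketch it; plain CONVERT behaves the same when the string is always valid:

SELECT TRY_CONVERT(datetime, '3/11/2014 12:57:57 PM', 101) AS ConvertedValue
-- TRY_CONVERT returns NULL instead of raising an error when the string cannot be parsed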
Nothing difficult, I just need a way to generate a new datetime column based on the column [PostedDate], datetime. So basically I want to truncate the time. Thanks a lot.
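Two common ways to strip the time portion, sketched against [PostedDate] (the table name is a placeholder):

-- works on any version: count whole days from the base date, then add them back
SELECT DATEADD(dd, DATEDIFF(dd, 0, PostedDate), 0) AS PostedDay FROM dbo.MyTable

-- SQL Server 2008 and later: cast to date, then back to datetime if a datetime column is needed
SELECT CAST(CAST(PostedDate AS date) AS datetime) AS PostedDay FROM dbo.MyTable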
select convert(datetime,'04-20-' + right(term,4)) as dt, 'Deposit' as type, a.* from dbo.status_view a
where right(term,4) always returns a string which constitutes a 4 digit year eg '1999','2004',etc.
The SQL above returns
2004-04-20 00:00:00.000 Deposit ...
Which makes me think that it is able to successfully construct the datetime object inline. But then when I try and do:
select * from ( select convert(datetime,'04-20-' + right(term,4)) as dt, 'Deposit' as type, a.* from dbo.status_view a ) where dt >= a.submit_date
I get the following error:
Syntax error converting datetime from character string.
Given that it executes the innermost SQL just fine and seems to convert the string to a datetime object, I don't see why subsequently trying to USE that datetime object for something (in this case comparison with submit_date which is a datetime in the table a) should screw it up. Help!!! Thanks...
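One likely cause, hedged: SQL Server is free to evaluate the CONVERT against any row of status_view while applying the outer WHERE, so a row whose last four characters of term do not form a valid year can blow up even though the inner SELECT alone happens to work. Guarding the conversion (and giving the derived table an alias) is one way around it:

select *
from (
	select case when right(term, 4) like '[12][0-9][0-9][0-9]'
	            then convert(datetime, '04-20-' + right(term, 4)) end as dt,
	       'Deposit' as type,
	       a.*
	from dbo.status_view a
) x
where x.dt >= x.submit_date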
Hi, I have a text file that contains a date column. The text file will be imported to a database on SQL Server 2000. After it is imported, I want to convert the date column to a date type. For example, the text file looks like:

Name     date
Smith    20003112
Jennifer 19991506

It would be converted to a ydm date column in SQL Server 2000. In the table it should look like this:

Name     Date
Smith    2000.31.12
Jennifer 1999.15.06

Thanks in advance
- Loi -
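A minimal sketch, assuming the imported column is a varchar named RawDate in a table named ImportedNames (both placeholders): rearrange the yyyyddmm string into yyyymmdd, which SQL Server parses unambiguously, then convert.

select Name,
       convert(datetime,
               substring(RawDate, 1, 4)   -- year
             + substring(RawDate, 7, 2)   -- month
             + substring(RawDate, 5, 2))  -- day
       as DateValue
from ImportedNames
-- '20003112' (yyyyddmm) becomes '20001231' (yyyymmdd) before the convert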
Hi, how do I convert a varchar field into the datetime data type? The reason I need this is that in the earlier database the column holding the date value has the varchar data type, while in the new design the column data type is datetime. I am using SQL scripts for the data migration from the older design to the newer one and got stuck on this datetime conversion issue. Please let me know the best possible solution.
The following is sample data stored in the older table for the date column:
12/12/2003 1/13/2007 01132004 1-1-2004 1.2.2001
There is no uniformity in the data that is currently stored.
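A rough sketch, assuming the old column is old_date varchar(20) in old_table (both placeholders): let CONVERT handle the strings ISDATE accepts, treat the 8-digit mmddyyyy case separately, and flag everything else for manual review. How ISDATE and CONVERT interpret the ambiguous formats depends on the session DATEFORMAT/language, so the results should be spot-checked.

select old_date,
       case
           when isdate(old_date) = 1 then convert(datetime, old_date)
           when old_date like '[0-1][0-9][0-3][0-9][1-2][0-9][0-9][0-9]'
               -- e.g. '01132004' (mmddyyyy) rearranged to yyyymmdd
               then convert(datetime, substring(old_date, 5, 4) + substring(old_date, 1, 2) + substring(old_date, 3, 2))
           else null   -- leave unrecognized values for manual cleanup
       end as new_date
from old_table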
I have a table that has a unique ID and a datetime of when something changed.
See example:
ID      TimeStamp
16094   2013-11-25 11:46:38.357
16095   2013-11-25 11:46:38.430
16096   2013-11-25 11:46:38.713
16097   2013-11-25 11:46:38.717
16098   2013-11-25 11:46:38.780
[Code] ....
Is there a way I can calculate the difference between rows 16106 and 16105 and enter it in line 10601?
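A hedged sketch using ROW_NUMBER (SQL Server 2005 and later), with dbo.ChangeLog standing in for the real table name: number the rows by ID, then join each row to the previous one and take the DATEDIFF in milliseconds.

;WITH numbered AS (
	SELECT ID, [TimeStamp],
	       ROW_NUMBER() OVER (ORDER BY ID) AS rn
	FROM dbo.ChangeLog
)
SELECT cur.ID,
       cur.[TimeStamp],
       DATEDIFF(ms, prev.[TimeStamp], cur.[TimeStamp]) AS DiffMs
FROM numbered cur
LEFT JOIN numbered prev ON prev.rn = cur.rn - 1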
update tblPact_2008_0307 set student_dob = '30/01/1996' where student_rcnumber = 1830

When entering an update date in a format such as ddmmyyyy, I know the SQL query date format entered should be in mmddyyyy format. Is there any way to change the date format entered to ddmmyyyy in the SQL query?
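Two hedged options for entering the value day-first:

-- set the session date order before running the update
SET DATEFORMAT dmy
update tblPact_2008_0307 set student_dob = '30/01/1996' where student_rcnumber = 1830

-- or convert explicitly with style 103 (dd/mm/yyyy) so the session setting doesn't matter
update tblPact_2008_0307 set student_dob = convert(datetime, '30/01/1996', 103) where student_rcnumber = 1830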
Hi All, this is only my second day using C#/ASP.NET and English isn't my native language, so I'll apologise in advance if this question doesn't make sense and/or is stupid.

I'm trying to make a little event planner, which was going well until I hit a small problem. The user needs to enter the event name and choose a date/time from dropdowns; this bit works okay. However, I'm having difficulties with storing the date/time. At the moment I'm making a DateTime object, filling in the year/month/day/hour/minute selected on the form, and adding that to the database field (which is a datetime field type).

Problem is, I'm using the date format DD/MM/YYYY HH:MM:SS but the database stores it as MM/DD/YYYY HH:MM:SS, so if I wanted to do things like: select * from tbl where date > DateTime.Now - it isn't going to work, because the month is switched with the day in the DB versions. Just to give another example, in case I'm not being very clear: if I print DateTime.Now.ToString() it will display 10/06/2008 02:50:48, but if I put it into the database, pull it back out cast to a DateTime object, and then display that object via ToString(), it'll be 06/10/2008 02:50:48.

Any suggestions on what to do would be appreciated. Usually in cases like this I'd just use the unix timestamp, but since C#/ASP.NET doesn't support that natively I was hoping there was an easier/native alternative. Also, I don't have access to the server configuration, so changing the localization of MSSQL isn't an option unfortunately. Thanks in advance!
Guys, I need help with SQL Server 2005 syntax. My goal is to update a table called UserStats. I have numerous functions in SQL that return scalars (one for each statistic I want to use). How do I then use a stored procedure to return a table of the values for use on my site?
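A rough sketch, with made-up function names, of a procedure that returns the scalar results as a one-row result set; the same SELECT could also feed an UPDATE of UserStats if the goal is to persist the values:

CREATE PROCEDURE dbo.usp_GetUserStats
	@userid int
AS
BEGIN
	SET NOCOUNT ON
	-- each column calls one of the existing scalar functions (names here are placeholders)
	SELECT dbo.fn_TotalPosts(@userid)    AS TotalPosts,
	       dbo.fn_TotalLogins(@userid)   AS TotalLogins,
	       dbo.fn_LastLoginDate(@userid) AS LastLoginDate
END
GO

EXEC dbo.usp_GetUserStats @userid = 42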
Within SQL Server 2005, there are functions. This feature is new to me, and I haven't found anyone who has written their own functions. I'm wondering if functions are written the same way as stored procedures, and whether a function can be called from a stored procedure or even from within a query.
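Yes on both counts; a minimal sketch with a made-up function. Functions are created with CREATE FUNCTION rather than CREATE PROCEDURE, must declare a return type, and are invoked inline rather than with EXEC:

CREATE FUNCTION dbo.fn_AddOne (@n int)
RETURNS int
AS
BEGIN
	RETURN @n + 1
END
GO

-- called from a query (note the schema prefix)
SELECT dbo.fn_AddOne(41) AS Result
GO

-- called from inside a stored procedure
CREATE PROCEDURE dbo.usp_Demo
AS
BEGIN
	DECLARE @x int
	SET @x = dbo.fn_AddOne(1)
	SELECT @x AS Result
END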