Transact SQL :: Counting Blocks Of Time In 15 Minute Chunks
Feb 16, 2012
We have data that consists of an employee number, a start time and a finish time, similar to the example below
EMP STARTTIME ENDTIME
00001 10-Feb-2012 06:00:00 10-Feb-2012 10:00:00
00002 10-Feb-2012 07:15:00 10-Feb-2012 10:00:00
00003 10-Feb-2012 08:00:00 10-Feb-2012 10:00:00
I am trying to come up with a procedure in SQL that will give me each 15-minute block throughout the day and a count of how many employees are expected to be at work at the start of that block. So, given the example above, I would like to return a row for every 15-minute block with the number of employees on shift at its start.
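A minimal sketch of one approach, assuming the rows above live in a hypothetical table dbo.EmployeeShifts (EMP, STARTTIME, ENDTIME): generate the day's 96 quarter-hour slots with a recursive CTE, then count the employees whose shift covers each slot's start.

-- Hypothetical table and column names; adjust to your schema.
DECLARE @day datetime = '2012-02-10';

WITH slots AS
(
    SELECT @day AS SlotStart
    UNION ALL
    SELECT DATEADD(minute, 15, SlotStart)
    FROM slots
    WHERE SlotStart < DATEADD(minute, 15 * 95, @day)   -- 96 slots: 00:00 .. 23:45
)
SELECT s.SlotStart,
       COUNT(e.EMP) AS EmployeesAtWork                 -- counts only matched shifts, 0 when none
FROM slots AS s
LEFT JOIN dbo.EmployeeShifts AS e
       ON e.STARTTIME <= s.SlotStart
      AND e.ENDTIME   >  s.SlotStart
GROUP BY s.SlotStart
ORDER BY s.SlotStart;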
I have the following SQL with which I want to log the number of visits from clients. A session is a block of time; for example, any time between 9 and 12 is a morning session, 12 to 5 is an afternoon session, etc. I need to count the number of sessions in a month, not the appointments within each session. Make sense?
select sr.resourceid,
       st.TimeStart,
       datename(m, sr.scheduledate) as sesmonth,
       datename(yy, sr.scheduledate) as [year],
       (CASE WHEN DATEPART(hour, st.TimeStart) BETWEEN 0 AND 11 THEN 'MORNING SESSION'
             WHEN DATEPART(hour, st.TimeStart) BETWEEN 12 AND 16 THEN 'AFTERNOON SESSION'
             WHEN DATEPART(hour, st.TimeStart) BETWEEN 17 AND 23 THEN 'EVENING SESSION'
        END) as sessiontype
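Not the poster's full query, but a hedged sketch of the session count itself; the table names schedresource/schedtime and the join key are assumptions inferred from the aliases above. The DISTINCT collapses multiple appointments in the same session to one row, so the final COUNT is a count of sessions, not appointments.

WITH sessions AS
(
    SELECT DISTINCT
           sr.resourceid,
           CAST(sr.scheduledate AS date) AS sessiondate,
           CASE WHEN DATEPART(hour, st.TimeStart) < 12 THEN 'MORNING SESSION'
                WHEN DATEPART(hour, st.TimeStart) < 17 THEN 'AFTERNOON SESSION'
                ELSE 'EVENING SESSION'
           END AS sessiontype
    FROM schedresource AS sr                                  -- assumed table name
    JOIN schedtime     AS st ON st.resourceid = sr.resourceid -- assumed join
)
SELECT resourceid,
       DATENAME(month, sessiondate) AS sesmonth,
       DATENAME(year, sessiondate)  AS [year],
       COUNT(*) AS sessioncount
FROM sessions
GROUP BY resourceid, DATENAME(month, sessiondate), DATENAME(year, sessiondate);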
Just wondering what the best way is to ensure that we only return data when two datetime fields fall within the same calendar minute, ignoring seconds and milliseconds.
As in the following should return the data:
'2015-04-09 09:00:20' compared to '2015-04-09 09:00:50'
And the following should not return the data:
'2015-04-09 09:01:20' compared to '2015-04-09 09:00:50'
The problem is that I'm merging data from three different result sets, which all have data for every minute; however, the timestamps can differ by seconds or milliseconds.
So I'm only interested in returning the data when the two fields I'm comparing are equal down to the minute. I need to ignore seconds and milliseconds.
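A minimal sketch of the comparison, using the sample values above: truncate both datetimes to the minute before comparing.

DECLARE @dt1 datetime = '2015-04-09 09:00:20';
DECLARE @dt2 datetime = '2015-04-09 09:00:50';

-- DATEADD/DATEDIFF against a fixed origin (0 = 1900-01-01) floors a datetime to the minute.
IF DATEADD(minute, DATEDIFF(minute, 0, @dt1), 0)
 = DATEADD(minute, DATEDIFF(minute, 0, @dt2), 0)
    PRINT 'Same minute - return the row';
ELSE
    PRINT 'Different minute - skip the row';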
I have configured an alert like the one below to track all blocking events in SQL Server across all databases, and to kick off a SQL job that inserts data into a table whenever blocking happens. When there is blocking in SQL Server I get an email, which is working fine, and I am able to track all queries.
But HOW do I get notifications ONLY if blocking has been happening for more than 30 seconds or 1 minute, without using sp_configure?
---ALERT
USE [msdb]
GO
EXEC msdb.dbo.sp_update_alert
    @name = N'Blocking Process',
    @message_id = 0,
    @severity = 0,
    @enabled = 1,
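One hedged alternative, not from the original post: have the job (or a separate schedule) poll sys.dm_exec_requests and only notify when a session has been blocked longer than a threshold.

-- A sketch: surface only blocking older than 30 seconds.
-- Wire this into the job step that currently sends the email.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_time / 1000 AS seconds_blocked,
       r.wait_type
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0
  AND r.wait_time > 30000;   -- wait_time is in milliseconds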
I just need the date, hour, and minute... not the seconds or milliseconds. Do I have to DATEPART and concatenate each, or is there any way to simply truncate the seconds and milliseconds? Or is there a date format to extract and report on it as...
Is there a way that I can do this at the table level to automatically handle the rounding of seconds, etc. down to the minute automatically without having to use a trigger?
Here is a very basic example of what I am trying to do:
-- example: '09-22-2007 15:07:18.850' -- this is the value inserted into the table by the code
select getdate()

-- example: '2007-09-22 15:07:00.000' -- this is the value I want to store in the table
select dateadd(mi, datediff(mi, 0, getdate()), 0)
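One table-level option, sketched under the assumption that the application can simply omit the column on INSERT: a DEFAULT constraint does the rounding with no trigger. Note a DEFAULT cannot rewrite a value the code supplies explicitly, so if the INSERT passes getdate() itself, a trigger (or fixing the INSERT) is still needed.

-- Hypothetical table; the DEFAULT fires only when the column is omitted on INSERT.
CREATE TABLE dbo.EventLog
(
    EventId   int IDENTITY(1,1) PRIMARY KEY,
    EventTime datetime NOT NULL
        CONSTRAINT DF_EventLog_EventTime
        DEFAULT (DATEADD(minute, DATEDIFF(minute, 0, GETDATE()), 0))
);

-- Inserts that leave EventTime out get the minute-truncated timestamp automatically.
INSERT INTO dbo.EventLog DEFAULT VALUES;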
I hope to update a DateTime column value with a Time input parameter. My (poor) attempt is below, but the @ApptTime param is coming in as 10:45:00.0000000 and I might have an existing @SendOnDate of 2015-10-05 07:00:00.000... I hope to end up with 2015-10-05 10:45:00.000.
ALTER PROCEDURE [dbo].[SendEditUPDATE]
    @QuePoolID int = null,
    @ApptTime time(7),
    @SendOnDate datetime
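A minimal sketch of the datetime assembly, assuming SQL Server 2008 or later: keep the date part of @SendOnDate and splice in @ApptTime.

DECLARE @ApptTime   time(7)  = '10:45:00';
DECLARE @SendOnDate datetime = '2015-10-05 07:00:00';

-- Casting the date part and the time part to datetime lets them be added together.
SET @SendOnDate = CAST(CAST(@SendOnDate AS date) AS datetime)
                + CAST(@ApptTime AS datetime);

SELECT @SendOnDate;   -- 2015-10-05 10:45:00.000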
I have a big table (120 million records) and I want to take all of it and insert it into another table. Since this bulk insert operation can cause all kinds of performance problems, I would like to do the bulk insert in small chunks. The table does not have any identity.
Can someone give me an example with ROWCOUNT or with a loop that runs an INSERT INTO ... SELECT statement each time, inserting for example 5000 rows per pass?
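A hedged sketch of one such loop. It assumes the source has some unique, indexed column to walk; the SomeKey column and table names here are hypothetical (without any unique column, adding one first is the usual workaround).

DECLARE @LastKey int = NULL;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    -- Copy the next 5000 rows in key order.
    INSERT INTO dbo.Target (SomeKey, Col1, Col2)
    SELECT TOP (5000) s.SomeKey, s.Col1, s.Col2
    FROM dbo.Source AS s
    WHERE @LastKey IS NULL OR s.SomeKey > @LastKey
    ORDER BY s.SomeKey;

    SET @Rows = @@ROWCOUNT;

    -- Advance the watermark; an index on Target(SomeKey) keeps this cheap.
    SELECT @LastKey = MAX(SomeKey) FROM dbo.Target;
END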
I have a table that's 25,000,000 records... about 10 fields. I need to export this data to a flat file in no more than 500,000 record chunks. I've tried the following algorithm, adding a flag field called "exported" with default value 0.
do:
- mark a random 500,000 records, setting exported = -1
- export everything in that table where exported = -1
- set exported = 1 where exported = -1
loop
This was pretty slow, taking about 10 hours last night to run.
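One hedged alternative (table and column names hypothetical): number every row once with ROW_NUMBER(), derive a chunk number from it, and export by chunk, so nothing is marked and rescanned on every pass.

-- One pass to assign a chunk number (0, 1, 2, ...) to every row.
SELECT b.*,
       (ROW_NUMBER() OVER (ORDER BY b.SomeKey) - 1) / 500000 AS chunk_no
INTO dbo.ExportStaging
FROM dbo.BigTable AS b;

-- Then export one chunk at a time, e.g. chunk 0:
SELECT *
FROM dbo.ExportStaging
WHERE chunk_no = 0;

Each WHERE chunk_no = N query can feed bcp or a data flow; with an index on chunk_no, every per-file read is a simple range scan.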
I find myself wanting a sort of split-dataset task in SSIS: being able to split a chunk of records out of a dataset and handle them separately. Anyone have ideas for me?
I need to export records to a flat file using a dataflow task, but want no more than 50,000 records in each file. What's the best way to automate this?
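A hedged sketch of one way to drive this from the source side, assuming SQL Server 2012+ and an SSIS For Loop container that increments a file-number variable mapped into @Offset (both names hypothetical): page the source query so each data flow pass pulls at most 50,000 rows.

-- Source query for the data flow; in SSIS, @Offset would be supplied per
-- iteration as FileNumber * 50000.
DECLARE @Offset int = 0;   -- 0, 50000, 100000, ...

SELECT *
FROM dbo.BigTable          -- hypothetical table
ORDER BY SomeKey           -- a stable ordering column is assumed
OFFSET @Offset ROWS FETCH NEXT 50000 ROWS ONLY;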
I have a table based around requisitions, and each requisition has a number of positions. That number can change over time through updates to pertinent rows rather than through transaction-like records that record an entire history, and I'm only able to get a monthly snapshot of the table. What I decided to do is still use one table for OLAP (fact_requisitions) but add a column called period_key that refers to the month the data comes from. So if I have two months of data then the table has each requisition twice, possibly with differing position counts, and new requisitions from the second month are only present once. Then I tried to filter the MDX query like so:
SELECT
  { ([Dim TimeRequestClosed].[Year - MonthNumber].[Year_Text].&[2008].&[1],
     [Dim Requisitions].[Period].[Period Key].&[200801]) } ON COLUMNS,
  NON EMPTY
  { ([Dim Location].[Region Name].MEMBERS,
     [Dim Location].[Period Key].&[200801]) } ON ROWS
FROM [Requisitions]
WHERE [Measures].[Request Closed Date Count]
This query doesn't work even though the data is there; it just returns nulls. Am I going about this all wrong? If not, what might I be doing wrong, and how would I get the query to return more than one period (e.g. tell Dim Requisitions to match up with Dim Location on the period key)?
From what I can see, the 'varbinary(max)' data type is not supported, and the 'image' data type is supposed to go away. Is there some other way to store large chunks (10MB to 100MB) of data in an SSEv DB?
If I have to use the 'image' data type to do this, does anyone have a code sample that would let me push an array() of numbers into an 'image' field, and unload an 'image' field into an array()?
Here's a little SP to break up those long-running, massively-locking, bring-app-to-a-halt queries. By default it does 500 rows at a time and allows for a maximum SQL query size of 4000 characters; it should be trivial to adjust those.
Cheers -b
CREATE PROCEDURE p_BatchExecute (@vcSQL varchar(4000))
AS
SET NOCOUNT ON
DECLARE @iRows int
SELECT @iRows = 1
SET ROWCOUNT 500              -- limit each execution to 500 rows
WHILE @iRows > 0
BEGIN
    PRINT 'Executing batch of 500...'
    EXEC (@vcSQL)             -- the ROWCOUNT setting carries into the dynamic SQL
    SET @iRows = @@ROWCOUNT   -- loop ends when the statement stops affecting rows
END
GO
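A hypothetical call, assuming a dbo.BigTable with stale rows to purge; the statement has to keep reducing the rows it matches, which is what lets the loop terminate:

EXEC p_BatchExecute 'DELETE FROM dbo.BigTable WHERE Status = ''stale''';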
I'm using the code below to send files that are in a blob file in my database to the browser client. The code sends the file in chunks in order to increase performance. The file I'm using to test with is 7MB. It works great on Windows XP with any browser. It takes virtually the same amount of time compared to downloading the file directly from the webserver. However, Windows 2000 and Mac OS X both take about 4x the amount of time it takes to download the file on XP machines. Why the performance difference? Is there anything I can do to fix this? I tried downloading the file directly from the webserver instead of getting it out of the database and it takes the same amount of time on all 3 OS. I had the same problem on Windows XP when I wasn't sending the file in chunks, but after using the code below, it started working for XP only.
Dim bufferSize As Integer = 24000
Dim outbyte(bufferSize - 1) As Byte
Dim retval As Long
Dim startIndex As Long = 0

Dim sql As String = "SELECT ..."
Dim cmd As New SqlCommand(sql, conn)
conn.Open()
Dim dr As SqlDataReader = cmd.ExecuteReader(CommandBehavior.SequentialAccess)

If dr.Read() Then
    ' Reset the starting byte for a new BLOB.
    startIndex = 0

    ' Read bytes into outbyte() and retain the number of bytes returned.
    retval = dr.GetBytes(DocCol, startIndex, outbyte, 0, bufferSize)

    Current.Response.Clear()
    Current.Response.Buffer = True
    Current.Response.ContentType = "application/octet-stream"
    Current.Response.AddHeader("Content-Disposition", "attachment; filename=" & myfile & "." & myextension)

    Do While retval = bufferSize
        Current.Response.BinaryWrite(outbyte)
        Current.Response.Flush()

        ' Reposition the start index to the end of the last buffer and refill the buffer.
        startIndex += bufferSize
        retval = dr.GetBytes(DocCol, startIndex, outbyte, 0, bufferSize)
    Loop

    ' Write the remainder of the last chunk (exactly retval bytes; VB arrays are sized by upper bound).
    Dim remaining(CInt(retval) - 1) As Byte
    Array.Copy(outbyte, 0, remaining, 0, retval)
    Current.Response.BinaryWrite(remaining)
    Current.Response.Flush()
    Current.Response.Close()
End If

dr.Close()
conn.Close()
Using the SqlClient provider, I'm trying to write big data chunks of maybe 20 MB each to SQL Server to store in BLOBs, using blobColumn.Write(...) with a .NET 2.0 DbCommand object calling a stored procedure:
CREATE PROCEDURE [dbo].[putBlobByPK]
(
@id dKey
, @value VARBINARY(MAX)
, @offset bigint
, @length bigint
, @ModDttm dModDttm OUT
, @ModUser dModUser OUT
, @ModClient dModClient OUT
, @ModAppl dModAppl OUT
)
....
When doing this I can do it exactly three times, then the application hangs (forever).
When looking in the SQL Server log, I find the following two errors:
Error: 4014, Severity: 20, Status: 2.
A fatal error occurred while reading the input stream from the network. The session will be terminated.
I don't get this error on the client! OK, the session died.
What may be the problem?
I write big chunks like this to avoid many writes, because the data will be replicated later using peer-to-peer replication, and the more writes it takes to store the whole BLOB, the larger the transaction log of the subscriber database becomes.
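For reference, a minimal sketch of what the body of such a procedure typically does with the .WRITE clause of varbinary(max); dbo.BlobStore and its columns are hypothetical, while the parameter names match the signature above.

-- Overwrites @length bytes of the stored BLOB at @offset with @value
-- (appends when @offset equals the current data length).
UPDATE dbo.BlobStore
SET BlobData.WRITE(@value, @offset, @length)
WHERE Id = @id;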
select TOP 10 rec_id, trans_id, number_id, card_no, message_id, trans_datetime
from [dbo].[trans_log]
order by trans_datetime desc

101, 1, 34343, 99999, 200, 2015-11-23 12:27:25.710
101, 2, 34343, 99999, 210, 2015-11-23 12:27:26.710
102, 3, 43434, 88888, 200, 2015-11-23 12:28:26.714
102, 4, 43434, 88888, 233, 2015-11-23 12:28:27.710

Expected result:
34343, 99999, datediff(ss, '2015-11-23 12:27:26.710', '2015-11-23 12:27:25.710')  -- difference between row 2 and row 1
43434, 88888, datediff(ss, '2015-11-23 12:28:26.714', '2015-11-23 12:28:27.710')  -- difference between row 4 and row 3
-- difference between row 6 and row 5, and so on...

In the above query, I always want to find the difference in transaction datetime between the second and first row in a moving window. I have the unique id as rec_id to compare the next row with the previous row.
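A hedged sketch using LAG (available from SQL Server 2012), pairing each row with its predecessor within the same rec_id:

SELECT number_id,
       card_no,
       DATEDIFF(second,
                LAG(trans_datetime) OVER (PARTITION BY rec_id ORDER BY trans_id),
                trans_datetime) AS seconds_between
FROM [dbo].[trans_log];

The first row of each rec_id has no predecessor, so LAG returns NULL there; wrap this in an outer query filtering seconds_between IS NOT NULL if only the paired rows are wanted.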
I need to build a reference table that will contain the time zone, offset value and DST changes for all time zones. Is there a way I can pull this from the Windows server hosting SQL Server?
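On SQL Server 2016 and later, sys.time_zone_info exposes exactly this from the Windows registry of the host; a minimal sketch:

-- Name, current UTC offset, and whether DST is currently in effect,
-- for every time zone installed on the Windows host.
SELECT name,
       current_utc_offset,
       is_currently_dst
INTO dbo.TimeZoneReference
FROM sys.time_zone_info;

Note that is_currently_dst only reflects the moment the query runs; the transition rules themselves are not exposed, so the reference table needs periodic refreshes.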
I am using the Aloha POS, and it stores the date for every check in separate fields. I now want to calculate the total time for the checks but can't work out how to do it.
The date is DOB and it's a datetime, but I just need to extract the date from it. The open time is OPENHOUR and OPENMIN; the close time is CLOSEHOUR and CLOSEMIN.
so basically the open time for the check will be the DATE FROM DOB + OPENHOUR + OPENMIN
And the close time will be DATE FROM DOB + CLOSEHOUR + CLOSEMIN
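A minimal sketch of that assembly, assuming the hour/minute columns are integers and a hypothetical table name dbo.Checks:

SELECT c.DOB,
       x.OpenTime,
       x.CloseTime,
       DATEDIFF(minute, x.OpenTime, x.CloseTime) AS MinutesOpen
FROM dbo.Checks AS c
CROSS APPLY
(
    -- DATE FROM DOB + hours + minutes, built up with DATEADD.
    SELECT DATEADD(minute, c.OPENMIN,
           DATEADD(hour, c.OPENHOUR,
           CAST(CAST(c.DOB AS date) AS datetime))) AS OpenTime,
           DATEADD(minute, c.CLOSEMIN,
           DATEADD(hour, c.CLOSEHOUR,
           CAST(CAST(c.DOB AS date) AS datetime))) AS CloseTime
) AS x;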
We are maintaining 3 shifts in our database, and the problem is maintaining the date and time for the 3rd shift. For example, today's date is 13th July and the third shift runs 11 PM - 7 AM, so I have to display the beginning date as 13/07/2015 11 PM and the end date as 14/07/2015 7 AM. Below is the data (in seconds) available in the database which I need to use for my calculation.
Date (Fcreacion) | Start time in seconds (Hcreacion) | End time in seconds (Hcerrar) | Turno (Shift)
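A hedged sketch of the midnight-crossing calculation, using the column names above (the table name is hypothetical): both times are seconds since midnight, so when the end value is smaller than the start value, the shift ends the next day.

SELECT Fcreacion,
       Turno,
       DATEADD(second, Hcreacion, CAST(CAST(Fcreacion AS date) AS datetime)) AS ShiftStart,
       DATEADD(second,
               Hcerrar + CASE WHEN Hcerrar < Hcreacion THEN 86400 ELSE 0 END,
               CAST(CAST(Fcreacion AS date) AS datetime))                    AS ShiftEnd
FROM dbo.ShiftLog;   -- hypothetical table name

For the example above, a start of 82800 seconds (11 PM) on 13/07/2015 with an end of 25200 seconds (7 AM) triggers the +86400 adjustment, yielding 14/07/2015 7 AM.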