Some advice appreciated on the following, thanks!
I have this query:
select field1, field2, astartdate, anenddate
from atable
where astartdate BETWEEN '9/11/2007 12:00:00 AM' AND '9/18/2007 11:59:59 PM'
I need the results to be like the following:
field1,field2,anenddate-astartdate <<that's minus
I need a formula to calculate the time (let's say in minutes) between two dates/times. The problem is that I have to exclude the time between 6 PM and 6 AM and also exclude weekend time (Saturday and Sunday). I will use this in a couple of reports built in Reporting Services. If anyone has an algorithm that could be adapted for this and is willing to share it, I would be very grateful. Many thanks! /Per Lissel
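One set-based approach is to expand the date range into one row per minute with a tally and count only the minutes that fall on weekdays between 06:00 and 18:00. A minimal sketch, assuming SQL Server 2005 or later and an English session language for DATENAME (the sample dates are placeholders):

DECLARE @start datetime, @end datetime
SELECT @start = '2007-09-11 08:30', @end = '2007-09-18 17:00'

;WITH Tally AS (
    SELECT TOP (DATEDIFF(MINUTE, @start, @end))
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS n
    FROM sys.all_objects a CROSS JOIN sys.all_objects b
)
SELECT COUNT(*) AS BusinessMinutes
FROM Tally
CROSS APPLY (SELECT DATEADD(MINUTE, n, @start) AS m) AS x
WHERE DATEPART(HOUR, x.m) BETWEEN 6 AND 17                   -- keep 06:00-17:59, i.e. exclude 6 PM to 6 AM
  AND DATENAME(WEEKDAY, x.m) NOT IN ('Saturday', 'Sunday')   -- exclude weekends

For an SSRS report this can be wrapped in a scalar function or computed in the dataset query; a permanent numbers or calendar table scales better than the cross join above.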
I have a database where user events are recorded quite frequently. I'd like to be able to get a count of the 'good' events that happen in each 5 second period. Unfortunately I don't know how to display and group by a time range.
Here is the query I would like to change:

SELECT COUNT(*), clientTime
FROM dbo.V_COMBINED
WHERE (sessionId = '122b') AND (type = N'sys_goodaction') AND (paraName = 'value')
GROUP BY clientTime

It returns records like:
1 | 2006-02-16 23:21:05.250
1 | 2006-02-16 23:21:05.267
1 | 2006-02-16 23:21:06.470

I'd like it to return records like:
5 | 2006-02-16 23:21:06 - 23:21:10
3 | 2006-02-16 23:21:11 - 23:21:15
4 | 2006-02-16 23:21:16 - 23:21:20
Anyone know how I could do this? Is it even possible?
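One approach is to round each clientTime down to the start of its 5-second bucket and group on that computed value. A minimal sketch against the posted view, using a nearby anchor date ('20060216') so the second-level DATEDIFF does not overflow an int:

SELECT COUNT(*) AS GoodEvents,
       DATEADD(SECOND, (DATEDIFF(SECOND, '20060216', clientTime) / 5) * 5, '20060216') AS BucketStart
FROM dbo.V_COMBINED
WHERE sessionId = '122b'
  AND type = N'sys_goodaction'
  AND paraName = 'value'
GROUP BY DATEADD(SECOND, (DATEDIFF(SECOND, '20060216', clientTime) / 5) * 5, '20060216')
ORDER BY BucketStart

A label such as '23:21:06 - 23:21:10' can then be built from BucketStart and DATEADD(SECOND, 4, BucketStart) with CONVERT, either in the query or in the presentation layer.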
I have to compare two time durations for a resource booking system. For example, if a resource is booked from 9:00 am to 2:00 am, it must not be booked again during that duration or any coinciding times: e.g. 1:00 pm to 2:00 pm should not be bookable, and neither should 9:00 am to 5:00 pm.
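The usual overlap test is: a new booking conflicts with an existing one when the new start is before the existing end and the new end is after the existing start. A minimal sketch, assuming a hypothetical Bookings table with ResourceId, StartTime and EndTime columns:

IF EXISTS (SELECT 1
           FROM dbo.Bookings AS b
           WHERE b.ResourceId = @ResourceId
             AND @NewStart < b.EndTime     -- new booking starts before the existing one ends
             AND @NewEnd   > b.StartTime)  -- and ends after the existing one starts
    RAISERROR('The resource is already booked for an overlapping period.', 16, 1)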
I am not sure how to go about doing this. I have a record that has a start time of 1 am and a stop time of 9 pm (same day for simplicity), and I want to know how many of those hours fell during a peak time and how many did not.

For example, the application starts and inserts into the database a start time of 1 am, and then the user stops the app at 9 pm. Let's say the peak hours are 1 pm to 7 pm. I know I can use a datediff function to get how long the app ran, but how can I get the number of hours it ran during the peak time? I know there has to be some mathematical solution to this, but it is escaping me at the moment. I want to do this over many records, so a CTE or pivot table is the end solution for performance.
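The peak portion of an interval can be computed as the later of the two starts up to the earlier of the two ends, floored at zero. A minimal sketch, assuming SQL Server 2008 or later, a hypothetical AppRuns table with StartTime/StopTime, fixed peak hours of 1 pm to 7 pm, and runs that do not span more than one day (as in the example):

SELECT r.StartTime,
       r.StopTime,
       DATEDIFF(MINUTE, r.StartTime, r.StopTime) / 60.0 AS TotalHours,
       CASE WHEN ov.Mins > 0 THEN ov.Mins ELSE 0 END / 60.0 AS PeakHours,
       (DATEDIFF(MINUTE, r.StartTime, r.StopTime)
          - CASE WHEN ov.Mins > 0 THEN ov.Mins ELSE 0 END) / 60.0 AS OffPeakHours
FROM dbo.AppRuns AS r
CROSS APPLY (SELECT DATEADD(HOUR, 13, CAST(CAST(r.StartTime AS date) AS datetime)) AS PeakStart,    -- 1 pm that day
                    DATEADD(HOUR, 19, CAST(CAST(r.StartTime AS date) AS datetime)) AS PeakEnd) AS p -- 7 pm that day
CROSS APPLY (SELECT DATEDIFF(MINUTE,
                             CASE WHEN r.StartTime > p.PeakStart THEN r.StartTime ELSE p.PeakStart END,
                             CASE WHEN r.StopTime  < p.PeakEnd   THEN r.StopTime  ELSE p.PeakEnd   END) AS Mins) AS ov

Because it is a single set-based SELECT, it can sit inside a CTE or view and run over many records at once.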
I need to find the total StatusDateTime span for each TicketId, and I need to find the average across all TicketIds. For example, TicketId "T10001" has 4 records based on the Seq column.
By using this, I should be able to find the amount of time between the first Seq and the last Seq to get a total time span for Ticket.
Expanding on this, I should be able to add up all of the Ticket's calculated time spans and divide by the number of tickets to get the average time span.
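A minimal sketch of both calculations, assuming the rows live in a table called dbo.TicketStatus (name assumed) and that MIN/MAX of StatusDateTime per ticket bound the time span:

-- total span per ticket (first Seq to last Seq)
SELECT TicketId,
       DATEDIFF(MINUTE, MIN(StatusDateTime), MAX(StatusDateTime)) AS TotalMinutes
FROM dbo.TicketStatus
GROUP BY TicketId

-- average span across all tickets
SELECT AVG(CAST(TotalMinutes AS decimal(10, 2))) AS AvgMinutesPerTicket
FROM (SELECT TicketId,
             DATEDIFF(MINUTE, MIN(StatusDateTime), MAX(StatusDateTime)) AS TotalMinutes
      FROM dbo.TicketStatus
      GROUP BY TicketId) AS spans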
Looking to return multiple entries from a time span. I have a date, a start time, an end time and a duration. I need the start times separated out into a list. It's fine if temp tables are needed - I have that clearance.
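One way to expand the span into a list of start times is a recursive CTE that keeps adding the duration until the end time is reached. A minimal sketch, assuming SQL Server 2008 or later, a duration in minutes, and placeholder variable values:

DECLARE @SlotDate date = '2014-04-25',
        @StartTime time = '08:00',
        @EndTime time = '12:00',
        @DurationMinutes int = 30;

WITH Slots AS (
    SELECT CAST(@SlotDate AS datetime) + CAST(@StartTime AS datetime) AS SlotStart
    UNION ALL
    SELECT DATEADD(MINUTE, @DurationMinutes, SlotStart)
    FROM Slots
    WHERE DATEADD(MINUTE, @DurationMinutes, SlotStart)
          < CAST(@SlotDate AS datetime) + CAST(@EndTime AS datetime)
)
SELECT SlotStart
FROM Slots
OPTION (MAXRECURSION 1000);

The result can be dumped into a temp table (SELECT SlotStart INTO #Slots FROM Slots ... ) if later steps need it.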
ID - INT
Machine - TINYINT
StartTime - DATETIME
EndTime - DATETIME
What I am trying to do is figure out how much time is used for production per day. The problem is that there are production runs that run over midnight, and possibly over multiple days without ending. For example, if I have the following data:
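A common pattern is to join each run to the calendar days it touches and sum only the portion of the run that falls inside each day. A minimal sketch against the columns listed above, assuming SQL Server 2008 or later, a table name of dbo.ProductionRuns, master..spt_values as a small numbers source, and a NULL EndTime meaning "still running":

WITH Runs AS (
    SELECT ID, Machine, StartTime,
           ISNULL(EndTime, GETDATE()) AS EndTime            -- open-ended runs counted up to now
    FROM dbo.ProductionRuns
),
RunDays AS (
    SELECT r.ID, r.Machine, r.StartTime, r.EndTime,
           CAST(DATEADD(DAY, v.number, CAST(r.StartTime AS date)) AS datetime) AS DayStart
    FROM Runs AS r
    JOIN master..spt_values AS v
      ON v.type = 'P'
     AND v.number <= DATEDIFF(DAY, r.StartTime, r.EndTime)  -- one row per day the run touches
)
SELECT Machine,
       CAST(DayStart AS date) AS ProductionDay,
       SUM(DATEDIFF(MINUTE,
           CASE WHEN StartTime > DayStart THEN StartTime ELSE DayStart END,
           CASE WHEN EndTime < DATEADD(DAY, 1, DayStart) THEN EndTime ELSE DATEADD(DAY, 1, DayStart) END)) AS ProductionMinutes
FROM RunDays
GROUP BY Machine, CAST(DayStart AS date)
ORDER BY Machine, ProductionDay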
Hi, I am going to be difficult here... How do I retrieve one row at a time from a table without using a cursor? For example, I have a table with 100 rows. I want to retrieve the data from row 1, do some stuff to it and send it on to another table, then I want to grab row 2, do some stuff to it and send it to another table. Here is how I am envisioning it:

WHILE arg1 < arg2  (arg1 is my initial row, arg2 would be my total row count)
BEGIN
    SELECT * FROM [TABLE] BUT ONLY ONE ROW....
    MANIPULATE THE DATA
    INSERT into another table
END

Other notes: I am using SQL Server 2000.... Thanks in advance, and as always the help is greatly appreciated. Regards, CLR
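On SQL Server 2000, a common cursor-free pattern is a WHILE loop that walks the primary key and pulls exactly one row per iteration. A minimal sketch, assuming the source table has an integer key column called ID (the table, key and column names are placeholders):

DECLARE @ID int
SELECT @ID = MIN(ID) FROM [TABLE]

WHILE @ID IS NOT NULL
BEGIN
    -- work with exactly one row
    INSERT INTO AnotherTable (Col1, Col2)
    SELECT Col1, Col2
    FROM [TABLE]
    WHERE ID = @ID

    -- move to the next row
    SELECT @ID = MIN(ID) FROM [TABLE] WHERE ID > @ID
END

If the per-row manipulation can be expressed in SQL, a single set-based INSERT ... SELECT over all 100 rows will normally be much faster than any row-at-a-time loop (or cursor).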
I'm having a problem retrieving information from two different tables. Every time I run a query where I type a source id into a textbox, my page keeps timing out. What could be the problem? The tables I'm pulling from are profiles and phone. Here is the code. If someone could tell me what's going on, I'd appreciate it. Thanks!
Dim queryString As String = "SELECT [profiles].[date_added], [profiles].[source_id], [RP_profiles].[fname], " & _
    "[profiles].[lname], [profiles].[title], [phone].[number], [RP_phone].[source_id] " & _
    "FROM [profiles], [phone] " & _
    "WHERE ([profiles].[source_id] = @source_id)"
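As written, the FROM clause lists [profiles] and [phone] with no join condition between them, so the query returns a Cartesian product (every profile row paired with every phone row), which would explain the timeout; it also references [RP_profiles] and [RP_phone], which are not in the FROM list. A minimal sketch of the joined query, assuming source_id is the column shared by both tables and fname lives on profiles:

SELECT p.[date_added], p.[source_id], p.[fname], p.[lname], p.[title], ph.[number]
FROM [profiles] AS p
INNER JOIN [phone] AS ph
        ON ph.[source_id] = p.[source_id]
WHERE p.[source_id] = @source_id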
I am updating some fields with the local date and time using the API below:

Private Declare Sub GetLocalTime Lib "kernel32" (lpSystemTime As SYSTEMTIME)
Public Function LocalDateTime()
Dim MyTime As SYSTEMTIME
GetLocalTime MyTime
Debug.Print "The Local Date is:" & MyTime.wMonth & "-" & MyTime.wDay & "-" & MyTime.wYear Debug.Print "The Local Time is:" & MyTime.wHour & ":" & MyTime.wMinute & ":" & MyTime.wSecond
End Function

This works fine. However, since it's not really easy for an administrator to keep checking every time whether the local date and time are correct on each front-end user's computer, I would like to reconsider my function and make it read the local date and time from the remote SQL Server instead. My concern is to help the administrator save time, as he should only need to ensure that the server is running with the correct time...
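A minimal sketch of reading the server's clock instead, assuming an open ADODB.Connection is available (GETDATE() returns the date and time of the SQL Server machine, so only the server clock has to be kept correct):

Public Function ServerDateTime(cn As ADODB.Connection) As Date
    Dim rs As ADODB.Recordset
    Set rs = cn.Execute("SELECT GETDATE() AS ServerNow")   ' ask the server for its local date/time
    ServerDateTime = rs.Fields("ServerNow").Value
    rs.Close
End Function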
I have no idea where to post this kind of question, so here it is!
I have a requirement to retrieve oracle 10 data into SS2000 in as near real-time as possible (stupid users!) and join with resident SS data for on-demand reporting. (We use SS replication to populate some reporting tables from other SS2000 instances and this has spoiled the users as well as the developers! )
I would like to know if there are any clever ways of doing this, or if a plain-old DTS package running in some kind of loop is the practical answer. 1 minute delay is probably too long . . . I don't know if data can be pushed from the oracle side. Or if we need to write a Service and use it to suck and push.
Hi guys, just when I thought everything was okay with this script, I got a new problem... I have a VBA script in Excel 2003 that builds a SQL script and retrieves data from SQL Server 2000. In order to make the SQL run, I need to use multi-batch processing, passing and executing one command line at a time.

Up to here, I am using a test case with account number = '123456' and getting the desired results. The code below runs okay with the test case, but when I change the account number (marked in yellow in the code) to include all the accounts (or just one other account), I get the following ERROR: run-time error '-2147217871 (80040e31)' - [Microsoft][ODBC SQL Server Driver] Timeout expired.

Now, if I take the same code, with the condition that generates the ERROR, and run it directly in SQL Server, I get the results without errors. Thanks in advance, Aldo.

Below is the code:
Function QuerySalesAging()
'--------------------------------------------------------------
'MUST !!! References: Microsoft ActiveX Data Object 2.1 Library
'--------------------------------------------------------------

Dim ConnString As New ADODB.Connection
Dim RecordSet As New ADODB.RecordSet

'Report Criterias
Criteria05 = " AND " & "Accounts.ACCOUNTKEY Between " & AccountKeyAsRange
' ==> With AccountKeyAsRange = '123456' AND '123456' it works okay.
' ==> With any other value, for example AccountKeyAsRange = '123456' AND '9999999999', it gets the ERROR.

CmdLine01 = " USE " & CompanyName

' Check and drop temporary table
TemporaryTableName = "CTE" ' The table is a regular one
CmdLine02 = " if object_id('" & TemporaryTableName & "') is not null exec('DROP TABLE " & TemporaryTableName & "') "

CmdLine03 = " SELECT ..."
CmdLine03 = CmdLine03 & " INTO " & TemporaryTableName
CmdLine03 = CmdLine03 & " FROM ..."
CmdLine03 = CmdLine03 & " WHERE " & "(" & Replace(Criteria05, "AND", "") & ")"
CmdLine03 = CmdLine03 & " ORDER BY ..."

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
[code]....
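The error is the classic ADO command timeout: the ADO default CommandTimeout is 30 seconds, and the wider account range simply takes longer than that, which is why the same SQL finishes when run directly in SQL Server. A minimal sketch of raising the limit on the connection declared above (0 means wait indefinitely; if the batches are executed through a separate ADODB.Command or Recordset, set CommandTimeout on that object too, because it does not inherit the connection's value):

ConnString.CommandTimeout = 300   ' seconds; the ADO default is 30

Indexing the columns behind Accounts.ACCOUNTKEY and the other criteria is the longer-term fix.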
I am trying to display all records older than a span of 5 Mondays. So far I have this, and it is working fine, but it is excluding the records with Id 2 and 4 (Sundays); I would like to include these records as well.
I'm trying to add a heading to my report using a Table and TableGroups, but I don't want my group headings to line up with the detail rows. This seems like a common requirement to me, but I can't figure out how to do it.
I need to calculate the amount of time between each visit. I am pulling the row number for my visits, and now I need the date span between each visit. I also need a new column that returns Yes or No depending on whether the date span exceeds 3 years.
SELECT ROW_NUMBER() OVER (PARTITION BY pv.PatientProfileId ORDER BY pv.Visit ASC) AS RN,
       CONVERT(VARCHAR(20), pv.Visit, 101) AS Visit,
       pv.TicketNumber,
       vstatus.Description AS VisitStatus,
       doc.ListName AS Doctor
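One way to get the gap to the previous visit is LAG() (SQL Server 2012 or later), comparing each visit date with the previous one for the same patient. A minimal sketch built on the columns above; the table name and joins are assumptions since the FROM clause is not shown:

SELECT pv.PatientProfileId,
       pv.Visit,
       DATEDIFF(DAY,
                LAG(pv.Visit) OVER (PARTITION BY pv.PatientProfileId ORDER BY pv.Visit),
                pv.Visit) AS DaysSincePrevVisit,
       CASE WHEN LAG(pv.Visit) OVER (PARTITION BY pv.PatientProfileId ORDER BY pv.Visit)
                 < DATEADD(YEAR, -3, pv.Visit)
            THEN 'Yes' ELSE 'No' END AS GapExceedsThreeYears
FROM dbo.PatientVisit AS pv   -- use the same FROM/JOINs as the query above

On an older version, the same result can be had by joining the ROW_NUMBER() query to itself on RN = RN - 1 within each PatientProfileId.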
USE [Testing]
GO
/****** Object: Table [dbo].[Testing] Script Date: 4/25/2014 11:08:18 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ....
It seems to work fine with one million records.
Each primary key is unique, but the begindate is non-unique, and I guess even if I use datetime2 and add nanoseconds, from what I have read there is a chance that I could have a duplicate datetime, since the date is imported via XML from multiple sources.
Is there a way to keep track in real time of how long a stored procedure has been running? What I want to do is fire off a trace in a stored procedure if that stored procedure has been running for over 5 minutes.
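One approach on SQL Server 2005 and later is to poll sys.dm_exec_requests, which exposes the start time of every currently executing request; a monitoring job (or a watchdog proc) can then react to anything running longer than 5 minutes. A minimal sketch:

SELECT r.session_id,
       r.start_time,
       DATEDIFF(MINUTE, r.start_time, GETDATE()) AS running_minutes,
       OBJECT_NAME(t.objectid, t.dbid) AS proc_name       -- name of the proc, when the request is a proc
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID
  AND DATEDIFF(MINUTE, r.start_time, GETDATE()) >= 5

Having the long-running procedure start a trace on itself is harder, because it is busy executing its own statements; polling from the outside (for example an Agent job every minute) is usually the practical route.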
I am trying to load the previous day's data at 3 am via an SSIS job.

The Date variable is initialized as DATEADD("dd",-1, GETDATE()) in the for loop.

Now, as this job runs at 3 am and I set the variable as GETDATE() - 1, it excludes the data from 12 am to 3 am from the result set, because the Date is set as YYYY-MM-DD 03:00:00:000. I need this to be set as YYYY-MM-DD 00:00:00:000.
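One way to handle this in the SSIS expression itself is to cast through DT_DBDATE, which drops the time portion, and then back to DT_DATE, which yields midnight of the previous day. A minimal sketch of the variable expression:

(DT_DATE)(DT_DBDATE)DATEADD("dd", -1, GETDATE())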
I hope to update a DateTime column value with a Time input parameter. A poor attempt is below, but it looks like the @ApptTime param is coming in as 10:45:00.0000000 and I might have an existing @SendOnDate such as 2015-10-05 07:00:00.000... I hope to end up with 2015-10-05 10:45:00.000.
ALTER PROCEDURE [dbo].[SendEditUPDATE]
    @QuePoolID int = null,
    @ApptTime time(7),
    @SendOnDate datetime
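A minimal sketch of combining the date part of @SendOnDate with @ApptTime (SQL Server 2008 or later; casting a date and a time to datetime and adding them works because datetime supports the + operator). The table and column names in the UPDATE are placeholders:

SELECT CAST(CAST(@SendOnDate AS date) AS datetime)
     + CAST(@ApptTime AS datetime) AS CombinedDateTime   -- e.g. 2015-10-05 10:45:00.000

UPDATE dbo.SendQueue
SET SendOnDate = CAST(CAST(@SendOnDate AS date) AS datetime) + CAST(@ApptTime AS datetime)
WHERE QuePoolID = @QuePoolID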
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQLCE 3.0. My PPC hardware is 400 MHz.

The question is: when the program tries to insert the first record into the sdf database after each time the program is started, it takes a long time. Does anyone know why, and how can I fix it?

I load the whole database into a dataset when the program starts, do all the inserts, updates and deletes in this dataset, and fill it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn)   'SQL = Select * From Table
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08 s to write one record back to the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I go through these four steps each time, the write takes almost 1 s or even more!

Actually, 0.08 s is just the normal case. Sometimes it still takes over 1 s to write back a dataset with only one inserted record while the program is running (even when all inserted records are exactly the same data, just with a different integer key).
However, when I give up the dataset and use the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn)   ' I have built the insert SQL before (Insert Into Table values(XXXXXXXXXXXXXXX All field)
I found that the first inserted record still takes more time, but only about 0.2 s, and the normal insert time is around 0.02 s. It is 4 times faster!
We need to select rows from the database that have been recently inserted/updated. We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred). Triggers on the primary table are used to populate the update table.
If we insert or update the primary table in a transaction, we would expect that the datetime of the insert/update would be that of the commit; however, it seems that the insert/update statement is cached and getdate() is executed at the time of the cache instead of at the commit. This causes problems because we select rows based on LAST_UPDATE: a commit may occur later while the earlier insert timestamp is saved to the database, and we miss that update.
We would like to know if there is anyway to tell the SQL Server to not execute the function getdate() until the commit, or any other way to get the commit to create the correct timestamp.
We are using default isolation level. We have tried using getdate(), current_timestamp and even {fn Now()} with the same results. SQL Queries that reproduce the problem are provided below:
/* Different functions to get current timestamp - all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
    declare @time datetime
    select @time = getdate()

    insert into COMMIT_TEST_UPDATE (PKEY, LAST_UPDATE)
    select PKEY, getdate() from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
    declare @time datetime
    select @time = getdate()

    update COMMIT_TEST_UPDATE
    set LAST_UPDATE = getdate()
    from COMMIT_TEST_UPDATE, deleted, inserted
    where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur so we don't log when they get modified we just delete them from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
    if (select count(*) from deleted) > 0
    begin
        delete COMMIT_TEST_UPDATE
        from COMMIT_TEST_UPDATE, deleted
        where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
    end
end
GO
/* Delete any previous inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table - we see that the timestamp is 10 seconds older than the commit, in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT as it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE
Any help would be appreciated. We understand we could make changes to the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missing deals (assuming the commit time is larger than some artificial time window).
I need to take a temporary table that has various times stored in a text field (4:30 pm, 11:00 am, 5:30 pm, etc.), convert them to military time and then cast them as integers with an update statement, kind of like:
Update myTable set MovieTime = REPLACE(CONVERT(CHAR(5),GETDATE(),108), ':', '')
How can this be done while my temp table is in session?
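A minimal sketch against the temp table, assuming the column with the text times is MovieTime and the table is #MovieTimes (CAST('4:30 pm' AS datetime) parses the AM/PM text, and hour * 100 + minute gives the military-time integer):

UPDATE #MovieTimes
SET MovieTime = DATEPART(HOUR, CAST(MovieTime AS datetime)) * 100
              + DATEPART(MINUTE, CAST(MovieTime AS datetime))   -- '4:30 pm' becomes 1630

Because MovieTime is a text field, the integer is stored back as text ('1630'); write it to a separate int column if a true integer is needed. This works as long as the temp table is still in scope for the session that created it.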
We are using SQL Server 2008 as our database and use Access as a GUI. I am looking to create a form in Access where employees can access their time card and request changes from management. I want to use the format from the attached screenshot for the form. I pretty much know how to do it all; the only point of complication is figuring out the easiest way to get the transaction punch record data in employee_punch_record into a format where I can easily populate the form in the horizontal layout you see in the screenshot.

I am not super strong in SQL, but figure I can do it using a formatting table of some sort. Is there a quick and easy way to move transaction records into a more horizontally oriented record?
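One formatting approach is to collapse the punch transactions into one row per employee per day with conditional aggregation, which Access can then bind to the horizontal form. A minimal sketch with assumed column names (employee_id, punch_date, punch_type of 'IN'/'OUT', punch_time) on employee_punch_record; the real names will differ:

SELECT employee_id,
       punch_date,
       MIN(CASE WHEN punch_type = 'IN'  THEN punch_time END) AS time_in,
       MAX(CASE WHEN punch_type = 'OUT' THEN punch_time END) AS time_out
FROM dbo.employee_punch_record
GROUP BY employee_id, punch_date

This can live in a view or a pass-through query so the Access form only ever sees the already-pivoted rows.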
I have a very simple time series model whose processing works fine without any problem. However, when I run the following query
SELECT
[TimeSeries].[PriceChange],
[TimeSeries].[Symbol],
PredictTimeSeries(PriceChange, -3, 2)
From
[TimeSeries]
WHERE
[TimeSeries].[Symbol] = 'x'
I get the following error:
TITLE: Microsoft SQL Server 2005 Analysis Services
Error (Data mining): A time series prediction was requested with a start time further in the past than the internal models of the mining model, TimeSeries, specified in the HISTORIC_MODEL_GAP and HISTORIC_MODEL_COUNT parameters can process.
The following is the excerpt of the mining model script related to the two parameters:
<AlgorithmParameters>
<AlgorithmParameter>
<Name>MISSING_VALUE_SUBSTITUTION</Name>
<Value xsi:type="xsdtring">Previous</Value>
</AlgorithmParameter>
<AlgorithmParameter>
<Name>HISTORIC_MODEL_GAP</Name>
<Value xsi:type="xsd:int">1</Value>
</AlgorithmParameter>
<AlgorithmParameter>
<Name>HISTORIC_MODEL_COUNT</Name>
<Value xsi:type="xsd:int">10</Value>
</AlgorithmParameter>
</AlgorithmParameters>
These HISTORIC_MODEL_GAP (1) and HISTORIC_MODEL_COUNT (10) should accommodate PredictTimeSeries(PriceChange, -3, 2). Could anyone shed some light on this?
We have problems with our SQL Server Reporting Services 2012 (SSRS) server. We have set up Kerberos delegation between SSRS and the database server (a SQL Server AlwaysOn cluster) so that users are authenticated all the way down to the database. From time to time, SSRS loses the ability to delegate the user credentials to the database. At that point the Report Server logs contain rejected database connections because of ANONYMOUS LOGON. After restarting SSRS the problem is gone.