I have two columns in an Informix database: one has a data type of date, and the other is a string column in which the time is stored. I have to validate that both are correct, not null, and greater than 1753, and then concatenate them into one datetime field to transfer to SQL Server. Right now I am doing it in a script component, as I need to log an error if anything is wrong. Is there a better way to do it (derived column or any other component) so that I can also log the error?
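If some of the validation can be pushed into the source or staging query instead, a minimal T-SQL sketch of the same checks might look like this (SourceTable, OrderDate and OrderTime are all hypothetical names standing in for the staged Informix data):

-- Hedged sketch: validate a date column and a time string, then combine them into one datetime
SELECT
    CASE
        WHEN OrderDate IS NULL OR OrderTime IS NULL THEN NULL    -- fail: missing value
        WHEN YEAR(OrderDate) < 1753 THEN NULL                    -- fail: below the datetime minimum year
        WHEN ISDATE(OrderTime) = 0 THEN NULL                     -- fail: time string does not parse
        ELSE CAST(CONVERT(varchar(10), OrderDate, 120) + ' ' + OrderTime AS datetime)
    END AS CombinedDateTime
FROM SourceTable;

Rows that come back NULL are the ones to route to the error log; the same checks can also be expressed as a Conditional Split if the work has to stay in the SSIS data flow.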
During this latest time change, the value of getutcdate() was offset by one hour when called from SQL Server. True UTC time should never be impacted (or changed) by any Daylight Saving Time activity. This is likely not an OS issue, since the IIS logs did show the correct (unchanging) UTC time during the DST time change.
Here is a record of how the time change was handled by a running SQL 2000 sp4 Server as these functions were called:
getutcdate() = Oct 29 2006 5:50AM //ran at the same time as// getdate() = Oct 29 2006 1:50AM
getutcdate() = Oct 29 2006 7:05AM //ran at the same time as// getdate() = Oct 29 2006 2:05AM
@@Version=Microsoft SQL Server 2000 - 8.00.2040 (Intel X86) May 13 2005 18:33:17 Copyright (c) 1988-2003 Microsoft Corporation Standard Edition on Windows NT 5.2 (Build 3790: Service Pack 1)
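As a diagnostic, one hedged way to watch for this is to log both clocks side by side on a schedule and compare the difference against the expected time zone offset (DstCheckLog is a made-up table name):

-- Hedged sketch: record both clocks so a DST-related jump in the UTC value shows up
CREATE TABLE DstCheckLog (LoggedLocal datetime, LoggedUtc datetime, OffsetMinutes int)
GO
INSERT DstCheckLog (LoggedLocal, LoggedUtc, OffsetMinutes)
SELECT GETDATE(), GETUTCDATE(), DATEDIFF(minute, GETUTCDATE(), GETDATE())
GO
-- On a correctly behaving server only OffsetMinutes should change across a DST transition;
-- GETUTCDATE() itself should keep advancing smoothly.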
Hi everybody, this is probably a very silly question... I've used SQL Server for some time now, but always through tools supplied by my hosts (my Little Admin, SQL Server Web Admin). I always wanted to try running the real thing, so I've installed SQL Server 2K Developer Edition on my machine. I also have IIS and the .NET Framework running. What would be the next step to see those 3 components in action? I have noticed the Northwind database in Enterprise Manager. How do I make the connection? I have the SQL Server Service Manager running fine, but when I open Query Analyzer it comes up with the "Connect to SQL Server" window, and the little dropdown where you choose the server is actually empty. What do I do? Thank you all for your pointers.
Say you have users who would be accessing a SQL Server database over the web from around the world (mainly the US, Japan, and UK), and you wanted to make the access as fast as possible.
Say also you have a couple of million dollars for hardware and software.
Hi, I have a table ABC. It has 2 columns, A and B. In column A I have 3 rows: Week, Goal, Used. Let's say the value in column B for the Week row is WW15 now; after 7 days I want that row to be updated to WW16. How do I do that? Advice please.
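One hedged option, assuming the week value should always track the current calendar week, is to recompute column B from the date in a weekly scheduled job rather than storing it by hand (the week-number arithmetic below is an assumption about how WW15 is derived):

-- Hedged sketch: refresh the Week value from the current date (run weekly from a SQL Agent job)
UPDATE ABC
SET B = 'WW' + CAST(DATEPART(week, GETDATE()) AS varchar(2))
WHERE A = 'Week'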
Hi! I want to get through to Analysis Services 2005 from MS SQL Server 2005 with openquery, so I created a linked server (MSOLAP). How can I test that I can access a database on the Analysis Services server?
Something like select * from openquery(linked_olap, getdate()) or openquery(linked_olap, 'print hello world') or openquery(linked_olap, select results from dataminingmodelCreatedWithVstudio) ? Thanks Bjørn
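OPENQUERY just passes the inner text through to the linked server, so against an Analysis Services linked server that text would normally be MDX (or DMX for a mining model) rather than T-SQL. A hedged smoke-test sketch, where the cube and measure names are placeholders for whatever exists on the server:

-- Hedged sketch: simple MDX pass-through to verify the linked OLAP server answers
SELECT *
FROM OPENQUERY(linked_olap,
    'SELECT [Measures].[Internet Sales Amount] ON COLUMNS FROM [Adventure Works]')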
I was told I can use them with SqlDataSources, but I have no clue how to do it. I believe I have managed to set up the query in the data source correctly, but what do I need to do to actually use it in my VB code?
Any help is appreciated, and any tutorials you've found to be useful on the subject are sure to help. Thanks
Hi all, I'm new to replication and need suggestions on how to go about keeping 2 SQL Servers in different parts of the world in sync at all times; some latency may be acceptable.
Here is the scenario: we have a website hosted in NA, and customers from all parts of the world log in to our system and do whatever they have to do, which updates the SQL Server databases. As our company is expanding, we want to have the same website hosted in the EU, so whenever a person logs in from the EU he will hit the web servers in the EU, and anything he does will go to the SQL database servers in the EU. Now the problem I have is that when one site goes down, say the NA site, we need to route all the traffic to the site in the EU, which means the SQL Servers in the EU. This brings the challenge that both servers need to be in sync at all times.
I did some research on merge replication; this might not work because in this concept we have to make 1 server the master and the others slaves. Also I need to think ahead: if they want to expand further, that means to Asia, I will be adding another server, and these all need to be in sync.
We are using SQL Server 2005 Standard Edition. I was reading that this can be achieved easily with peer-to-peer replication, but it is only available in Enterprise Edition, which is 4 times the cost of Standard Edition. My company is not going to agree to this, so I have to do the research and propose a solution.
Real World: Backup Strategy and implementation, how? A quote:
"Real World:
Whether you back up to tape or disk drive, you should use the tape rotation technique. Create multiple sets, and then write to these sets on a rotating basis. With a disk drive, for example, you could create these back files on different network drives and use them as follows:
\\servername\data1drive\backups\AWorks_Set1.bak. Used in week 1, 3, 5 and so on for full and differential backups.
\\servername\data2drive\backups\AWorks_Set2.bak. Used in week 2, 4, 6 and so on for full and differential backups.
\\servername\data3drive\backups\AWorks_Set3.bak. Used in the first week of the month for full and differential backups.
\\servername\data4drive\backups\AWorks_Set4.bak. Used in the first week of the quarter for full and differential backups.
Do not forget that each time you start a new rotation on a tape set, you should overwrite the existing media. For example, you would append all backups in week 1. Then, when starting the next rotation in week 3, you would overwrite the existing media for the first backup and then append the remaining backups for the week."
I understand these concepts, however in 'the real world' how do you go about implementing these jobs in SQL2K, and how on earth do you schedule the tasks to overwrite, for example, week 1, when on week 3's rotation.
Could I have real world examples or scripts for the jobs that would carry out this task? It appears that whatever course you do, it does not fully cover the above, and I have only worked on my own and never with a DBA, so I have never seen this implemented in any environment.
I would like full details on this please, as I need to get my head around it.
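In practice the overwrite-versus-append part is usually handled by the INIT/NOINIT options of BACKUP DATABASE, with one SQL Agent job per backup set and the job schedules deciding which set a given week writes to. A hedged sketch using the file names from the quote (AdventureWorks stands in for whatever database is being backed up):

-- Hedged sketch: the first full backup of a rotation overwrites the set ...
BACKUP DATABASE AdventureWorks
    TO DISK = '\\servername\data1drive\backups\AWorks_Set1.bak'
    WITH INIT, NAME = 'AWorks full - week 1/3/5 rotation'
GO
-- ... and the differentials later in the same week append to the same file
BACKUP DATABASE AdventureWorks
    TO DISK = '\\servername\data1drive\backups\AWorks_Set1.bak'
    WITH DIFFERENTIAL, NOINIT, NAME = 'AWorks differential'
GO

With two Agent jobs per set (one full with INIT, one differential with NOINIT), the week 1 / week 3 alternation falls out of the job schedules rather than anything inside the script itself.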
We are trying to read a table named Order through the command line with sqlcmd, but it is giving us an error. Order is a SQL 2005 reserved word, and the app programmer who created that table has left. Now we have a problem reading that table through sqlcmd. Any idea how we can do that?
C:\>sqlcmd
1> use ACCMedford
2> select * from 'Order'
3> go
Msg 102, Level 15, State 1, Server ACCMedford, Line 2
Incorrect syntax near 'Order'.
1> select * from "order"
2> go
Msg 102, Level 15, State 1, Server ACCMedford, Line 2
Incorrect syntax near 'order'.
1> select * from order
2> go
Msg 156, Level 15, State 1, Server ACCMedford, Line 1
Incorrect syntax near the keyword 'order'.
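For what it's worth, the usual way around a reserved-word table name is to delimit it: square brackets always work in T-SQL, and double quotes work once QUOTED_IDENTIFIER is on (sqlcmd turns it on for the session when started with the -I switch). A hedged sketch:

-- Square brackets work regardless of the QUOTED_IDENTIFIER setting
select * from [Order]
go
-- Double quotes only work when QUOTED_IDENTIFIER is ON (sqlcmd -I sets it for the session)
set quoted_identifier on
go
select * from "Order"
go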
I have gone through the table partitioning material in MSDN, like Designing Partitions to Manage Subsets of Data. But how do you do this in the real world, since some databases need to partition by 7 days and then archive those days' records on the 8th day, while others prefer 14 days or monthly, and run this repeatedly for many years? If this runs weekly, how can I generate the scheme and function dynamically? What if an ID for the row count is not viable?
Sure this will not work:
Code Snippet
CREATE PARTITION FUNCTION [TransactionRangePF1] (datetime)
AS RANGE RIGHT FOR VALUES ('10/01/2003', '10/02/2003', '10/03/2003',
    '10/04/2004', '10/05/2004', '10/06/2004', '10/07/2004');
GO
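To avoid hard-coding the boundary dates, one hedged approach is to build the VALUES list from the current date and run the statement with dynamic SQL; the weekly job would then maintain the sliding window with ALTER PARTITION FUNCTION ... SPLIT/MERGE RANGE. Sketch only; the 7-day window and the function name are carried over from the snippet above:

-- Hedged sketch: build a 7-day RANGE RIGHT partition function starting from today
DECLARE @sql nvarchar(max), @values nvarchar(max), @i int
SET @values = ''
SET @i = 0
WHILE @i < 7
BEGIN
    SET @values = @values + CASE WHEN @i = 0 THEN '' ELSE ', ' END
                + '''' + CONVERT(char(8), DATEADD(day, @i, GETDATE()), 112) + ''''
    SET @i = @i + 1
END
SET @sql = 'CREATE PARTITION FUNCTION [TransactionRangePF1] (datetime) '
         + 'AS RANGE RIGHT FOR VALUES (' + @values + ')'
EXEC sp_executesql @sql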
I have traditionally done web app and client/server programming and have been playing around with a new tool (a WinForms app) that I eventually want to distribute to other developers and a couple of business people in our organization. I have done apps in the past that all connect to a central server for data access. The app I'm working on now will have an individual DB per user and should be available locally for a desktop version of the software.
I am looking for more resources on things to consider when deploying an app with either MSDE or SQL 2005 Express. More specifically, items like long term maintenance of the db once it's installed with the user, etc. (DB bloat, transaction files, auto maintenance routines I may want to build in, etc)
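On the "auto maintenance routines I may want to build in" point, a hedged sketch of the kind of housekeeping that is often scheduled from the application itself, since MSDE/Express has no SQL Agent (MyAppDb and the backup path are placeholders):

-- Hedged sketch: periodic housekeeping for a per-user MSDE / SQL Express database
BACKUP DATABASE MyAppDb TO DISK = 'C:\Backups\MyAppDb.bak' WITH INIT
GO
DBCC CHECKDB (MyAppDb)                 -- integrity check
GO
BACKUP LOG MyAppDb WITH TRUNCATE_ONLY  -- keep the log in check if full recovery is in use (or just run the DB in simple recovery)
GO
DBCC SHRINKDATABASE (MyAppDb)          -- reclaim space after large deletes, i.e. the "DB bloat" concern
GO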
I have seen all of the MSDN docs on what you need to deploy (and how to do it), but I'm looking for input from people who have deployed it and any significant pitfalls they have run into with that sort of deployment.
Any links or book references would be appreciated, thank you,
I am executing a script like this. How do I check for errors if "master..xp_cmdshell @bcpCommand" fails? Is there any way to verify that BCP completed successfully?
SET @FileName = 'E:TestBCPOut.txt'
SET @bcpCommand = 'bcp "SELECT * FROM pubs1..authors ORDER BY au_lname" queryout "'
SET @bcpCommand = @bcpCommand + @FileName + '" -c -U -P'
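One hedged way to verify it is to capture the xp_cmdshell output into a table and also check its return code; bcp errors generally show up in that output (#bcpOutput and @rc are made-up names):

-- Hedged sketch: capture bcp output and check whether it failed
DECLARE @rc int
CREATE TABLE #bcpOutput (line varchar(1000) NULL)

INSERT #bcpOutput
EXEC @rc = master..xp_cmdshell @bcpCommand

IF @rc <> 0
   OR EXISTS (SELECT * FROM #bcpOutput WHERE line LIKE '%Error%' OR line LIKE '%SQLState%')
    PRINT 'bcp failed - inspect #bcpOutput for details'
ELSE
    PRINT 'bcp completed successfully'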
USE [Testing]
GO
/****** Object:  Table [dbo].[Testing]    Script Date: 4/25/2014 11:08:18 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ....
It seems to work fine with one million records.
Each primary key is unique, but the begindate is non-unique, and I guess even if I use datetime2 and add nanoseconds, from what I have read there is a chance that I could have a duplicate datetime, since the date is imported via XML from multiple sources.
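If the requirement is only that rows never collide (rather than that begindate itself be unique), one hedged option is to leave begindate non-unique and let a surrogate identity column break ties; every name below is an invented stand-in, since the real table isn't shown:

-- Hedged sketch: identity tiebreaker so duplicate begindate values from multiple XML sources cannot collide
CREATE TABLE dbo.ImportedEvents (
    EventID    bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- surrogate key, always unique
    BeginDate  datetime2(7)         NOT NULL,              -- may legitimately repeat across sources
    SourceName varchar(50)          NOT NULL
)
CREATE INDEX IX_ImportedEvents_BeginDate ON dbo.ImportedEvents (BeginDate)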
Is there a way to keep track in real time of how long a stored procedure has been running? What I want to do is fire off a trace from a stored procedure if that stored procedure has been running for over 5 minutes.
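One hedged way to watch for this from the outside is to poll sys.dm_exec_requests (SQL 2005 and later) for requests whose elapsed time is past the threshold, and trigger the trace from there:

-- Hedged sketch: find requests that have been executing for more than 5 minutes
SELECT r.session_id,
       r.start_time,
       DATEDIFF(minute, r.start_time, GETDATE()) AS running_minutes,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE DATEDIFF(minute, r.start_time, GETDATE()) > 5
  AND r.session_id <> @@SPID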
I am trying to load the previous day's data at 3 am via an SSIS job.
The Date variable is initiated as DATEADD("dd",-1, GETDATE()) in the for loop.
Now, as this job runs at 3 am and I set the variable to GETDATE() - 1, it excludes the data from 12 am to 3 am from the result set, because Date is set to YYYY-MM-DD 03:00:00:000. I need it to be set to YYYY-MM-DD 00:00:00:000.
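The usual fix is to strip the time portion before (or after) subtracting the day. In T-SQL the equivalent expression would be something like the sketch below; the same idea works in the SSIS expression by casting through DT_DBDATE to drop the time before casting back to a date/time type.

-- Hedged sketch: previous day at midnight, no matter what time the job actually runs
SELECT DATEADD(day, DATEDIFF(day, 0, GETDATE()) - 1, 0) AS PreviousDayMidnight
-- DATEDIFF(day, 0, GETDATE()) counts whole days since 1900-01-01, so adding it back to 0 lands on today's midnight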
I hope to update a DateTime column value with a Time input parameter.  Poor attempt below but it looks like the @ApptTime param is coming in as 10:45:00.0000000 and I might have an existing @SendOnDate as: 2015-10-05 07:00:00.000...I hope to end up with 2015-10-05 10:45:00.000
ALTER PROCEDURE [dbo].[SendEditUPDATE]
    @QuePoolID int = null,
    @ApptTime time(7),
    @SendOnDate datetime
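One hedged way to splice the two values together inside the procedure (relying only on the date, time and datetime types already in the signature):

-- Hedged sketch: keep the date part of @SendOnDate and replace its time with @ApptTime
SET @SendOnDate = CAST(CAST(@SendOnDate AS date) AS datetime)
                + CAST(@ApptTime AS datetime)
-- e.g. 2015-10-05 07:00:00.000 combined with 10:45:00 becomes 2015-10-05 10:45:00.000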
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQL CE 3.0. My PPC hardware runs at 400MHz.
The question is: when the program tries to insert the first record into the .sdf database after each time the program is started, it takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts, do all the inserts, updates and deletes in this dataset, and fill it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn)   ' SQL = "Select * From Table"
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it takes about 0.08s to fill one record into the database normally. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I take these four steps each time, the filling time is almost 1s or even more!
Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to fill back a dataset that only had one record inserted while the program is running. (Even when all inserted records contain exactly the same data, just with a different integer key.)
However, when I give up the dataset and using the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn)   ' SQL is the insert statement built beforehand: Insert Into Table values(... all fields)
cmd.ExecuteNonQuery()                  ' execute the insert directly instead of going through the dataset
cn.Close()
I found the same pattern: the first inserted record still takes more time, but only about 0.2s, and the normal insert time is around 0.02s. It is 4 times faster!!!
We need to select rows from the database that have been recently inserted/updated. We have a main primary table (COMMIT_TEST) and a second update table (COMMIT_TEST_UPDATE). The update table contains the primary key and a LAST_UPDATE field which is a datetime (to tell us when an update occurred). Triggers on the primary table are used to populate the update table.
If we insert or update the primary table in a transaction, we would expect that the datetime of the insert/update would be at the commit, however it seems that the insert/update statement is cached and getdate() is executed at the time of the cache instead of the commit. This causes problems as we select rows based on LAST_UPDATE and a commit may occur later but the earlier insert timestamp is saved to the database and we miss that update.
We would like to know if there is any way to tell SQL Server not to execute the getdate() function until the commit, or any other way to get the commit to create the correct timestamp.
We are using default isolation level. We have tried using getdate(), current_timestamp and even {fn Now()} with the same results. SQL Queries that reproduce the problem are provided below:
/* Different functions to get current timestamp - all have been tested to produce the same results */
/*
SELECT GETDATE()
GO
SELECT CURRENT_TIMESTAMP
GO
SELECT {fn Now()}
GO
*/
/* Use these statements to delete the tables to allow recreate of the tables */
/*
DROP TABLE COMMIT_TEST
DROP TABLE COMMIT_TEST_UPDATE
*/
/* Create a primary table and an UPDATE table to store the date/time when the primary table is modified */
CREATE TABLE dbo.COMMIT_TEST (PKEY int PRIMARY KEY, timestamp) /* ROW_VERSION rowversion */
GO
CREATE TABLE dbo.COMMIT_TEST_UPDATE (PKEY int PRIMARY KEY, LAST_UPDATE datetime, timestamp) /* ROW_VERSION rowversion */
GO
/* Use these statements to delete the triggers to allow reinsert */
/*
drop trigger LOG_COMMIT_TEST_INSERT
drop trigger LOG_COMMIT_TEST_UPDATE
drop trigger LOG_COMMIT_TEST_DELETE
*/
/* Create insert, update and delete triggers */
create trigger LOG_COMMIT_TEST_INSERT on COMMIT_TEST for INSERT as
begin
    declare @time datetime
    select @time = getdate()

    insert into COMMIT_TEST_UPDATE (PKEY, LAST_UPDATE)
    select PKEY, getdate() from inserted
end
GO
create trigger LOG_COMMIT_TEST_UPDATE on COMMIT_TEST for UPDATE as
begin
    declare @time datetime
    select @time = getdate()

    update COMMIT_TEST_UPDATE
    set LAST_UPDATE = getdate()
    from COMMIT_TEST_UPDATE, deleted, inserted
    where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
end
GO
/* In our application deletes should never occur, so we don't log when they get modified; we just delete them from the UPDATE table */
create trigger LOG_COMMIT_TEST_DELETE on COMMIT_TEST for DELETE as
begin
    if (select count(*) from deleted) > 0
    begin
        delete COMMIT_TEST_UPDATE
        from COMMIT_TEST_UPDATE, deleted
        where COMMIT_TEST_UPDATE.PKEY = deleted.PKEY
    end
end
GO
/* Delete any previously inserted record to avoid errors when inserting */
DELETE COMMIT_TEST WHERE PKEY = 1
GO
/* What is the current date/time */
SELECT GETDATE()
GO
BEGIN TRANSACTION
GO
/* Insert a record into the primary table */
INSERT COMMIT_TEST (PKEY) VALUES (1)
GO
/* Simulate additional processing within this transaction */
WAITFOR DELAY '00:00:10'
GO
/* We expect at this point that the date is written to the database (or at least we need some way for this to happen) */
COMMIT TRANSACTION
GO
/* Get the current date to show us what date/time should have been committed to the database */
SELECT GETDATE()
GO
/* Select results from the table - we see that the timestamp is 10 seconds older than the commit, in other words it was evaluated at */
/* the insert statement, even though the row could not be read with a SELECT as it was uncommitted */
SELECT * FROM COMMIT_TEST
GO
SELECT * FROM COMMIT_TEST_UPDATE
Any help would be appreciated. We understand we could make changes to the application/database to approximate what we need, but all the solutions we have identified suffer from possible performance issues, or could still lead to missing deals (assuming the commit time is larger than some artificial time window).
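Not a direct fix for getdate(), but the commented-out ROW_VERSION columns in the script above hint at the workaround we have seen suggested elsewhere: drive the "changed since last poll" query from the rowversion/timestamp column instead of a datetime, capping each poll at MIN_ACTIVE_ROWVERSION() (available in later SQL Server builds) so rows belonging to still-open transactions are picked up on a later pass instead of being skipped. A hedged sketch against the tables above:

-- Hedged sketch: change detection via the existing rowversion/timestamp column instead of an insert-time datetime
DECLARE @LastSync binary(8), @MaxSafe binary(8)
SET @LastSync = 0x0000000000000000      -- high-water mark persisted after the previous poll
SET @MaxSafe  = MIN_ACTIVE_ROWVERSION() -- rows at or beyond this may belong to still-open transactions

SELECT PKEY, LAST_UPDATE
FROM dbo.COMMIT_TEST_UPDATE
WHERE timestamp > @LastSync
  AND timestamp < @MaxSafe
-- persist @MaxSafe as the new @LastSync for the next poll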
I need to take a temporary table that has various times stored in a text field (4:30 pm, 11:00 am, 5:30 pm, etc.), convert them to military time, and then cast them as integers with an update statement, kind of like:
Update myTable set MovieTime = REPLACE(CONVERT(CHAR(5),GETDATE(),108), ':', '')
How can this be done while my temp table is in session?
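A hedged variant of that UPDATE which converts the stored text times rather than GETDATE() (it assumes myTable/MovieTime from the example above are the temp table and the text column in question):

-- Hedged sketch: '4:30 pm' -> 1630, '11:00 am' -> 1100
UPDATE myTable
SET MovieTime = CAST(REPLACE(CONVERT(char(5), CAST(MovieTime AS datetime), 108), ':', '') AS int)
WHERE ISDATE(MovieTime) = 1   -- skip any value that will not parse as a time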
We are using SQL Server 2008 as our database and use Access as a GUI. I am looking to create a form in Access where employees can access their time card and request changes from management. I want to use the format from the attached screen shot for the form. I pretty much know how to do it all, the only point of complication is trying to figure out the easiest way to get the transaction punch record data on employee_punch_record into a format where I can easily populate the form in the horizontal format you see in the screen shot.
I am not super strong in SQL, but I figure I can do it using a formatting table of some sort. Is there a quick and easy way to move transaction records into a more horizontally oriented record?
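Without seeing the punch table, a hedged PIVOT sketch of the general idea; every column name here is an invented stand-in for whatever employee_punch_record actually contains:

-- Hedged sketch: turn one row per punch into one row per employee per day with a column per punch type
SELECT employee_id, punch_date, [In], [Lunch Out], [Lunch In], [Out]
FROM (
    SELECT employee_id,
           CAST(punch_datetime AS date) AS punch_date,
           punch_type,                              -- e.g. 'In', 'Lunch Out', 'Lunch In', 'Out'
           CAST(punch_datetime AS time) AS punch_time
    FROM dbo.employee_punch_record
) AS src
PIVOT (
    MIN(punch_time) FOR punch_type IN ([In], [Lunch Out], [Lunch In], [Out])
) AS p
ORDER BY employee_id, punch_date

From Access the pivoted result could then be exposed through a view or pass-through query and bound to the form.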