We are using SQL 7 at 2 different locations, on 2 different Domains that are connected via a T1 line.
We need to have our 2 SQL servers be identical at all times, meaning when a change is made on one server, it is immediately replicated to the other. The problem is that both servers are always being used, so there is no primary/backup server scenario.
Does anyone know if there is third-party software available, or if this is even possible to do, given that the databases are always open?
I have two databases, one for production and another for reporting. In this scenario we need to transfer data to the reporting database in real time. Which replication should we use, and what are the bottlenecks? Can anyone give some suggestions or some links?
I want to create real-time replication for my database server. I have two database servers: one is the master, which holds the real-time data, and the other is a slave, where I want to replicate the data from the master.
Is there a way to keep track in real time of how long a stored procedure has been running? What I want to do is fire off a trace if a stored procedure has been running for over five minutes.
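A minimal sketch of one way to spot such requests (assuming SQL Server 2000's master..sysprocesses view; the five-minute threshold and column choice are just examples):

-- Flag active spids whose current batch started more than five minutes ago
SELECT spid, last_batch, cmd, status
FROM master.dbo.sysprocesses
WHERE status <> 'sleeping'
  AND DATEDIFF(minute, last_batch, GETDATE()) >= 5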
Hi, when I create a new user in a .MDF file, the hour of creation in the "CreateDate" field of the membership table is exactly 2 hours earlier than the time on my computer. Probably something to do with different hours between Europe and the UK, or ... How can I set the creation hour to exactly the same hour as my computer? Thanks, Tartuffe
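One possibility, purely as a hedged guess: the ASP.NET membership provider stores CreateDate in UTC, so the stored value can be shifted by the server's current UTC offset when you read it. A minimal sketch, assuming the standard aspnet_Membership table and column names (adjust if yours differ):

SELECT UserId,
       DATEADD(minute, DATEDIFF(minute, GETUTCDATE(), GETDATE()), CreateDate) AS CreateDateLocal
FROM dbo.aspnet_Membership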
I have posted this question in another forum but no one has so far been able to provide a solution to the problem. Since I know Database Journal always has very informative and enlightening posts, I figured I'd post the question here in hopes that some guru can provide an answer.
If you're running SQL 2000 and have any jobs that have been executed, you could perform a query as such:
select last_run_time from msdb.dbo.sysjobsteps
and receive returned values that contain the last time a job was executed, stored in integer-datatype columns. See -> sp_help sysjobsteps.
In SQL 2005 I believe the concept is the same. I think Microsoft's intent in doing it this way was to store the date separately from the time values, which wouldn't work using the datetime datatype; I have read this in documentation in the past.
The challenge is to convert that data into a human-readable 12- or 24-hour time format like 11:00 AM or 02:45:39.
Does anyone have any suggestions or clues to assist in resolving this problem??? :(
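For what it's worth, one possible approach is plain string manipulation over the packed integer. A minimal sketch, assuming last_run_time follows the usual HHMMSS encoding in sysjobsteps:

-- Pad the integer to six digits, then insert colons to get HH:MM:SS
SELECT step_name,
       STUFF(STUFF(RIGHT('000000' + CONVERT(varchar(6), last_run_time), 6), 5, 0, ':'), 3, 0, ':') AS last_run_time_24h
FROM msdb.dbo.sysjobsteps

I believe SQL 2005 also ships an msdb.dbo.agent_datetime() helper that combines the run date and time integers into a datetime, but the STUFF approach should work on both versions.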
I currently have an application which retrieves stats from a SQL database that are updated throughout the day. My application refreshes every 3 seconds to retrieve these stats, but I found that it's quite expensive for the server's CPU when there are several clients running my application. Are there any other methods for extracting the data that won't require so much CPU? The fact that the database is being hit so much isn't good either. I have already written a web service hoping that this would improve things, but I'm not sure it has.
Can anyone guide me: which is the best method for real-time synchronisation of my production server, transactional replication or a standby server?
db1 gets 150k records every hour. db2 is essentially the backup of db1, and is used to retrieve records for reports built with Reporting Services. What is the best way to synchronize db2 with db1? Is it to create a DTS package and set up a job that runs every hour?
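For reference, the hourly-job idea usually comes down to copying only the new rows rather than the whole table. A minimal sketch, with hypothetical table and column names:

-- Copy only rows added since the last run (Orders/OrderId/OrderDate are placeholders)
INSERT INTO db2.dbo.Orders (OrderId, OrderDate, Amount)
SELECT s.OrderId, s.OrderDate, s.Amount
FROM db1.dbo.Orders AS s
WHERE s.OrderDate > (SELECT ISNULL(MAX(d.OrderDate), '19000101') FROM db2.dbo.Orders AS d)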
Off-topic question: what kind of language does Reporting Services use to create the reports?
Hi guys, I'm new to SQL Server, so is there anyone who could guide me on SQL Server 2005? I need some info about SSIS and SSAS. I have a project and have some questions regarding it, so please let me know your number or mail ID so that I can contact you. Please, it's urgent.
Does anybody know a good way to get a (near) live view of a sqlce database that doesn't involve ResultSets? I tried to use a resultset with the Sensitive option set, but it doesn't like the joins that I have to do in order to have the data make sense to the user.
The ETL processing for my project will be deployed as part of a larger application which is operated by non-technical users. Therefore I want to provide real-time feedback to the user on progress, errors, etc., using standard Windows GUI objects and style - for example, a progress bar. The designer of course does some of this, but there won't be any designer in our deployed application (and the SSIS designer is neither intended to be used by, nor appropriate for, non-technical users anyway).
Now that I have the ETL logic working, it's time to tackle this UI requirement. I am thinking one way to do this is to start a script task to run in parallel with the main ETL processing, open a winform in that task that has various GUI objects whose "state" is based on package scoped variables, update the package level variables via the "on progress" event (or other events as needed) of various SSIS tasks and components, and refresh the winform's display regularly via a timer event.
Does this approach seem like it would be effective? Has anyone tried maintaining an open winform via a script task and updating its objects via package variables in parallel while the package is running other tasks?
We currently have a proprietary in-memory DB that is used to store the latest transactions in the system, and we have a service that copies the data to a SQL Server every couple of seconds, for historical reporting purposes.
We would like to move to a more standard DB as our real-time DB, since we have scalability and availability issues. We thought about using SQL Server since this is the DB we know, but I'm not sure it's built to handle real-time data.
Does anyone have any experience using SQL Server for "real time" applications?
Does anyone have any experience storing the data files in RAM?
Does MS have a solution similar to Oracle's TimesTen, which is their real-time DB?
We have a table which needs to be updated 2 million times per day. It hosts all real-time transactions. There are 200K records in this table. Would you please share your experience with me about how to protect such a table in SQL 2000 from any possible damage?
We plan to use point-in-time backups (every 5 minutes). It still takes at least half an hour to recover the whole database. Is there any new technology from Microsoft or SQL 2000 you can recommend?
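For context, a 5-minute point-in-time scheme usually comes down to frequent transaction log backups on top of full/differential backups. A minimal sketch, assuming the FULL recovery model (database name and path are hypothetical):

BACKUP LOG TxnDB TO DISK = N'E:\Backups\TxnDB_log.trn' WITH NOINIT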
Hi, I'm a SysAdmin and I'm implementing ASSP (anti-spam) to use with an email server. But ASSP requires a txt file with all the domains hosted on the server, and those domains are in a SQL table on the server. My question is: is there any way to produce a real-time TXT file with all the domains? Maybe using a trigger, I don't know.
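A trigger can work for this. A minimal sketch, assuming a hypothetical dbo.Domains table, that xp_cmdshell is enabled, and an example output path; it regenerates the file whenever the table changes:

CREATE TRIGGER trg_Domains_ExportTxt ON dbo.Domains
FOR INSERT, UPDATE, DELETE
AS
BEGIN
    -- bcp rewrites the whole file, one domain per line (path and DB name are examples)
    EXEC master.dbo.xp_cmdshell
        'bcp "SELECT DomainName FROM MailDb.dbo.Domains" queryout "C:\assp\domains.txt" -c -T'
END

One caveat: calling xp_cmdshell from a trigger adds latency to every write on the table, so a short SQL Agent job on a one-minute schedule may be a safer way to get "near real time".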
I'm sure just about everyone uses the PRINT command to give feedback as to what their lengthy and involved scripts are doing, as sort of a record.
I cannot figure out how to make the messages I emit with PRINT come out in real time the way SELECTs seem to. Does anyone have an answer to this? These are long-running scripts, and if there's a problem I can capture, I'd rather nip it in the bud before the entire script completes.
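One commonly used workaround: PRINT output is buffered and often doesn't appear until the batch finishes, but RAISERROR with a low severity and the NOWAIT option flushes the message to the client immediately. A minimal sketch:

RAISERROR ('Step 1 of 5 complete...', 0, 1) WITH NOWAIT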
Hi, I would like to know whether Analysis Services data mining makes it possible to (1) detect anomalies from "normal" behavior/patterns in the data, (2) alert about such anomalies when they are detected, and (3) do both on top of a SQL Server relational DB.
Hi, I would like to find out about SSIS compilation. Can you mention anything regarding this issue, or can you point me to a website on this topic, please?
Hi. There are 2 databases, db1 and db2, on SQL Server 7, with tables Table1 and Table2 respectively. Both tables have a field called Quantity. If the quantity gets updated in Table1, that change should immediately be reflected in Table2. I thought an UPDATE trigger on Table1 would serve the purpose, but this trigger doesn't seem to be working, although there are no errors during the syntax check. The code for the trigger that I have used on Table1 is as below:

CREATE TRIGGER [Qty_UPDATE] ON [Table1]
FOR UPDATE
AS
IF UPDATE (Qty)
BEGIN
    UPDATE db2.dbo.table2
    SET db2.dbo.table2.qty = qty
END

Please help. Thanks in advance.
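As a hedged guess at why nothing seems to happen: the UPDATE inside the trigger never references the inserted pseudo-table, so it sets qty to its existing value for every row in table2. A sketch of a corrected version, assuming the two tables share a key column (called ID here purely for illustration):

CREATE TRIGGER [Qty_UPDATE] ON [Table1]
FOR UPDATE
AS
IF UPDATE (Qty)
BEGIN
    -- Propagate only the rows that changed, matched on the assumed key column ID
    UPDATE t2
    SET t2.qty = i.qty
    FROM db2.dbo.table2 AS t2
    INNER JOIN inserted AS i ON i.ID = t2.ID
END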
I was wondering about the stability of SSIS when it comes to importing data on a real-time basis. Let's say you have a scenario where flat files, for instance, will be dropped at random intervals ranging from 1 second to 10 seconds apart and the importer has to import these files immediately.
I would imagine that this is done with a package which runs a loop sniffing the directory forever, but I stand to be corrected on the best way of doing it.
I'm not too sure whether SSIS is a good idea for this, as lots of people in my company have had bad things to say about SSIS in real time, but they cannot elaborate enough to convince me. I have done some pretty cool stuff and must admit that I am a fan of the technology, but I don't want to defend it into a corner.
I am about to prepare a paper concerning the field of real-time data mining. Real-time here means the process of incremental training of an existing model as soon as the data arrives.
There are a number of papers introducing algorithms for incremental association analysis, incremental clustering, etc. Stream mining is a field closely related to that. The main reasons for implementing incremental algorithms are (a) the large amount of data to be mined and (b) the high rate at which new data arrives every day.
Using classical batch mining algorithms, models that have become outdated for some reason would have to be re-trained, which could be very time-consuming for billions of records. And once the training is completed, it would have to be restarted yet again because a bulk of new data has arrived in the meantime.
The question that I would like to discuss now is: for what real-world applications would it be meaningful, or even essential, to use real-time training of models?
Two main reasons could determine the answer to that question:
Either you just want to incorporate new data into existing models in order to increase the prediction accuracy of your model, or your underlying data is subject to more or less massive changes (also referred to as concept drift) and you want to adapt your mining model to that reality continuously.
I'm looking for some examples or ideas where one of these cases apply and it would be a good idea to have incremental mining algorithms involved.
I'm looking forward to inspiring some discussion on that issue.
I saw a presentation last week where the speaker created some sort of SQL Server "watch window" (in SQL Server Management Studio, I think) where he could watch all the commands being executed on his SQL Server database in real time. For example, he could navigate to web pages (that hit the database) and as he pressed buttons you could see the SQL commands execute in this "watch window." If other users hit the database at the same time, you could see those SQL queries execute as well. I didn't think at the time to ask how he did it - does anyone know how to set this up? I have a problem with my SQL Server right now and it would be useful to see which SQL queries (etc.) are being executed and when. Thanks in advance, J. Shane Kunkle, jkunkle@vt.edu
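What the speaker showed sounds like SQL Server Profiler (or a server-side trace). For a quick query-based alternative, here is a minimal sketch, assuming SQL Server 2005 or later, that lists the statements currently executing:

SELECT r.session_id, r.start_time, r.status, t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID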
I want to build a Windows application in Visual Studio 2005 that grabs some data from a SQL DB and displays it in a gridview. That is easy and I already have it done, but I also want to know how to show this data in real time. For example: right now we have an application that pulls login information for all the users and displays their phone extension, their name, and their status. They can change their status by clicking on some radio buttons below the datagrid, and on change it updates the data and then posts back. However, I don't know how the guy who did it made it so that these local changes are automatically shown on everyone else's computers around the office. In other words, they are shown the live data as it changes, without having to refresh or anything. How would I go about doing this, or does anyone have any places/resources that could help me accomplish this task? Thanks. Oh, and the guy who wrote this is no longer working here and he deleted the source code, so I can't look at it.
Hi to you all, and especially to whoever will solve my problem!
Here is what I want: I need a database that will serve as a read-only archive DB for day-to-day queries, and a production DB that will be kept small by deleting unneeded records.
A replication setup would be fine, but I haven't found a way to keep the deletes that trim the production database from being replicated to the archive database along with the transactions that should be replicated.
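If I understand the goal, transactional replication can be told not to forward DELETE commands for an article, so purges on the production side leave the archive intact. A minimal sketch with hypothetical publication and table names:

EXEC sp_addarticle
    @publication   = N'ProdToArchive',
    @article       = N'Orders',
    @source_object = N'Orders',
    @del_cmd       = N'NONE'   -- do not replicate DELETE statements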
I have this scenario which is working fine, but would like to know if others have tried it or can recommend a better approach. Below is a brief description, but code should fully explain:
A process updates a SQL Server table via a stored proc, which in turn writes information to a service broker target queue. In my client/server architecture, I would like clients to see this new information as soon as it gets written to the target queue. To do this, I have a WCF service that calls a stored procedure; the stored proc contains an infinite loop which has a RECEIVE statement with an infinite timeout. Once RECEIVE reads data from the SB queue, data is retrieved via a SqlDataReader and results are sent to clients via a pub/sub. I then wait for more rows until RECEIVE unblocks and so on.
Stored Proc
Code Snippet
-- ...
WHILE 1 = 1
BEGIN
    -- Param declarations
    ...

    BEGIN TRANSACTION;

    WAITFOR (
        -- Blocks while TargetQueue is empty
        RECEIVE TOP(1)
            @conversation_handle   = conversation_handle,
            @message_type_name     = message_type_name,
            @conversation_group_id = conversation_group_id,
            @message_body          = CASE WHEN validation = 'X'
                                          THEN CAST(message_body AS XML)
                                          ELSE CAST(N'' AS XML)
                                     END
        FROM [dbo].[TargetQueue]
        -- No time out!
    );

    -- Handle errors ...

    -- Return received information. After this statement is executed,
    -- reader.NextResult unblocks and reader.Read() can read these
    -- new values
    SELECT 'Conversation Group Id' = @conversation_group_id,
           'Conversation Handle'   = @conversation_handle,
           'Message Type Name'     = @message_type_name,
           'Message Body'          = @message_body;

    COMMIT TRANSACTION;
END -- WHILE
C# Code
Code Snippet
// Create a SqlCommand, and initialize it to execute the above stored proc
// (set the command timeout to zero!)
...

// ExecuteReader blocks until RECEIVE reads data from the target queue
using (SqlDataReader reader = cmd.ExecuteReader())
{
    // Begin an infinite loop
    while (true)
    {
        // Process data retrieved by RECEIVE
        while (reader.Read())
        {
            Trace.WriteLine("Conversation Group Id: " + reader.GetGuid(0) +
                            "Conversation Handle: "   + reader.GetGuid(1) +
                            "Message Type Name: "     + reader.GetString(2) +
                            "Message Body: "          + reader.GetString(3));

            // Send data via pub/sub ...
        }

        // Blocks until the stored procedure returns another SELECT statement,
        // i.e., blocks until RECEIVE unblocks and retrieves more data from the queue
        reader.NextResult();
    } // while
} // using
I have no idea where to post this kind of question, so here it is!
I have a requirement to retrieve oracle 10 data into SS2000 in as near real-time as possible (stupid users!) and join with resident SS data for on-demand reporting. (We use SS replication to populate some reporting tables from other SS2000 instances and this has spoiled the users as well as the developers! )
I would like to know if there are any clever ways of doing this, or if a plain-old DTS package running in some kind of loop is the practical answer. 1 minute delay is probably too long . . . I don't know if data can be pushed from the oracle side. Or if we need to write a Service and use it to suck and push.
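One option short of a looping DTS package: a linked server to the Oracle instance lets SS2000 query the remote data on demand and join it to local tables, at the cost of query-time latency. A minimal sketch, assuming a linked server named ORACLE_LINK has already been created (all object and column names are placeholders):

SELECT o.*, l.SomeLocalColumn
FROM OPENQUERY(ORACLE_LINK, 'SELECT TXN_ID, AMOUNT FROM APP.TRANSACTIONS') AS o
INNER JOIN dbo.LocalReference AS l ON l.TxnId = o.TXN_ID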
I am trying to understand an environment and provide a solution for a banking system, so that they can enter user data (transactions) online and at the same time we can provide users with online reporting as well, using the same SQL Server or separate server/hardware on another machine.
There are many branches/customers/ATM machines accessing online data as well as updating their balances. I want to understand how we can provide online reporting: through replication, transaction log backups, log shipping, or whatever other solution is available. I need to understand this and propose a solution that has already been implemented and is running successfully somewhere. I need some proposals and their pros and cons. Cost and maintenance are the constraints, along with real-time reporting on a live system/database.