I have log shipping set up at a client site (it is their network). Log shipping runs fine until midnight and then stops processing files. What I notice on the primary server (where the files are backed up to) is that all the files created after midnight have the A (archive) attribute set; none of the files from before midnight have it. The short of it is that log shipping does not process these files, dies, and has to be rebuilt. Do any network admins have any ideas? I did have them stop backing up these files with their backup software, but it still dies.
Hi, I need some advice on log shipping. The log files on the primary server get cleaned up according to what I have specified in the maintenance plan, but the log files that get shipped to the secondary server stay there forever, wasting my hard disk. Will it cause any problems if I remove them, or can I set it up to remove all files older than, say, 2 hours? Please advise. Thanks
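If it helps: deleting copied .trn files after they have been restored is safe; the secondary only needs files it has not applied yet. A minimal cleanup sketch you could schedule as a job, assuming SQL Server 2005, where M:\LogShipCopy is a hypothetical stand-in for your copy directory and xp_delete_file is the (undocumented) procedure the maintenance plan cleanup task uses:

-- Delete shipped .trn files older than 2 hours from the copy folder.
-- Keep the cutoff comfortably behind the restore job's schedule.
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(hour, -2, GETDATE());

-- Arguments: 0 = backup files, folder path, extension, cutoff date.
EXECUTE master.dbo.xp_delete_file 0, N'M:\LogShipCopy\', N'trn', @cutoff;

If you are on SQL Server 2005's built-in log shipping rather than a maintenance plan, the copy job already has a file retention setting that does this for you.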
Is it a recognised limitation of log shipping over a WAN that if you re-index the primary database, the result is gigabytes of transaction log to send over the WAN, effectively breaking the log shipping sync?
Are there any known work-arounds?
Would anyone recommend using database replication instead to maintain a DR SQL Server?
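One workaround worth naming: instead of blanket re-indexing, rebuild only the indexes that are actually fragmented, which keeps the generated log (and therefore the shipped files) much smaller. A hedged sketch, assuming SQL Server 2005 and run inside a hypothetical database MyDb (the 30% threshold is illustrative, not a rule):

-- Generate ALTER INDEX statements only for indexes over ~30% fragmented.
SELECT 'ALTER INDEX [' + i.name + '] ON [' + OBJECT_NAME(s.object_id) + '] REBUILD;'
FROM sys.dm_db_index_physical_stats(DB_ID('MyDb'), NULL, NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
    ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 30
    AND s.index_id > 0;  -- skip heaps

Note that switching to bulk-logged recovery does not help much here: the log backups still contain every extent the rebuild touched, so the files crossing the WAN stay large.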
I have two databases on a production server that I want to log ship to a test server. According to sys.master_files, the physical file location is on the E drive. Early attempts at log shipping these two databases errored out due to space issues on the E drive (one log shipped and then one errored out). I was subsequently informed by the server group that they would prefer I log ship these two database files over to the M drive, where more space is available. In fact, they modified Server Properties / Database Settings / Database Default Locations (for Data and Log) to point at the larger M drive (I'm not really sure why they don't just increase the E drive space, but there is probably a good reason).
Okay, so now my problems have been solved. Easy enough. Now I deleted the successfully log shipped database and started from scratch. However, as before, one database restored and one failed (due to space issues). Apparently, both databases are still pointing at the E drive. How is that possible?
So here I am with one successful database and the normal system databases pointing to the E drive. What is the best way of approaching this move to the larger and preferred M drive?
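The default-location setting only affects newly created databases; a log shipping restore puts the files wherever the RESTORE statement says, which by default is the path recorded in the backup (the E drive). A hedged sketch of initializing the secondary copy onto the M drive yourself, with hypothetical names MyDb, MyDb_Data, and MyDb_Log (check the real logical names with RESTORE FILELISTONLY first):

-- Find the logical file names recorded in the backup.
RESTORE FILELISTONLY FROM DISK = N'M:\Backups\MyDb_full.bak';

-- Restore the secondary copy onto the M drive explicitly.
RESTORE DATABASE MyDb
FROM DISK = N'M:\Backups\MyDb_full.bak'
WITH MOVE N'MyDb_Data' TO N'M:\Data\MyDb.mdf',
     MOVE N'MyDb_Log'  TO N'M:\Logs\MyDb_log.ldf',
     NORECOVERY;  -- leave it ready to accept shipped logs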
I'm experiencing a weird problem with log shipping in SQL 2005.
I've set up log shipping for a production database between two sites. The standby database is being updated correctly and everything seems to be working as expected except for one detail: the names of the transaction log backups are generated with a UTC timestamp instead of my local time zone.
The data below is extracted from the backup history:
At one of our client sites a weird log shipping issue has come up. While monitoring the two servers, I noticed that although the log shipping report says both servers are in sync and displays information for both backup and restore, it does not show any information related to copy, i.e. when the last file was copied; the last-file-copied column is blank. The same happens when I execute the proc "sp_help_log_shipping_monitor": I get the same result.

When I expand the copy job history to analyse it, I found that although the job executed successfully, an in-depth reading of each step says that no .trn file was copied.

My copy directory is on the secondary server itself, where the .trn files are placed, and files are being restored from this same location.

SQL Server and Agent on both servers run under the same domain account.
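For comparison with the report, the monitor's copy columns can be read straight from msdb on the server doing the monitoring. A hedged sketch, assuming SQL Server 2005's standard log shipping monitor tables:

-- What the monitor has recorded about copy and restore, per secondary database.
SELECT secondary_database,
       last_copied_file,
       last_copied_date,
       last_restored_file,
       last_restored_date
FROM msdb.dbo.log_shipping_monitor_secondary;

If last_copied_file is NULL here while the restores keep advancing, the copy job is likely not recording its progress rather than not copying, which would match a copy job that reports success but says no .trn file was copied.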
I'm using SQL with Access as a front end. When the fourth user tries to access the same data, I get an "ODBC connection failed" error. I'm new to SQL and have no idea what to do about this. Please help! -Seth
When processing my very large cube, after 6 hours the service dies and ASCMD returns an error saying it can't find the server/service. The rest of the server is fine and I just have to restart the service, but ASCMD keeps killing the service. Can anyone offer some ideas on where I should be looking?
Hello all. I have quite a disturbing situation where I am not happy about the way SQL handles a query. The situation is related to using a user function in an INNER JOIN select. Although the problem occurred in a more complex situation, the query can be simplified to the following example with the same results.

There is a user function, which can be as simple as:

CREATE FUNCTION IsItSo (@text1 nvarchar(255), @text2 nvarchar(255))
RETURNS integer
BEGIN
    if isnull(@text1, '') = isnull(@text2, '')
        return 1
    else
        return 0
    return 0
END

It can be any function returning an integer; this is just for the example. Then there is a query as simple as this:

SELECT person, formula FROM
(
    SELECT a.person, v.code AS formula
    FROM rows1 AS a
    INNER JOIN rows2 AS v ON a.type = v.type
        AND dbo.IsItSo(a.type, v.type) = 1
) as formulas

Tables rows1 and rows2 can contain as little as two columns for this example - rows1 has [person] and [type], rows2 has [code] and [type]. All columns are of nvarchar type.

So the AND clause after the INNER JOIN is an obvious mistake. That is not the problem; the problem is how SQL reacts. On SQL 2000 SP4 the result of running the query is the following error:

Server: Msg 913, Level 16, State 8, Line 1
Could not find database ID 101. Database may not be activated yet or may be in transition.

On SQL 2000 SP3 the result is even more dramatic: the SQL service consumes 100% of all available processors and becomes unresponsive. The only solution is to restart either the SQL service or the server.

My question is whether this self-destructive behavior of SQL Server can be prevented by some configuration parameter or patch. I am a bit annoyed by the fact that a developer can kill the server with a little bit of poor programming that is syntactically acceptable to the server.

One more curious bit: the section of the query located between the parentheses can be run separately without any ill effects and gets actual, meaningful results:

SELECT a.person, v.code AS formula
FROM rows1 AS a
INNER JOIN rows2 v ON a.type = v.type
    AND dbo.IsItSo(a.type, v.type) = 1
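For what it's worth, the workaround at the query level is to keep scalar UDFs out of join conditions altogether. Since IsItSo is just a null-tolerant equality test, the same predicate can be written inline; a hedged sketch of an equivalent rewrite (verify against your real function logic):

-- Same result without calling a scalar UDF inside the join.
SELECT a.person, v.code AS formula
FROM rows1 AS a
INNER JOIN rows2 AS v
    ON a.type = v.type
    AND ISNULL(a.type, '') = ISNULL(v.type, '');

Since a.type = v.type already excludes NULLs, the ISNULL comparison is redundant here, which is exactly the "obvious mistake" above; either way, dropping the UDF call sidesteps the server's bad reaction.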
I'm trying to install "SQL Server 2005 Express Edition SP2".
I downloaded "SQLEXPR32.EXE" (38,220,656 bytes) from the Microsoft site.
At the beginning of the setup the installation program scans my system -- no problems or warnings. Everything looks fine.
During installation I uncheck "Hide advanced configuration options", simply because I want to change the default instance name. I make no other changes to the default installation.
The installation starts up but when it tries to install the "SQL Native Client" I get the following error dialog message:
"An installation package for the product Microsoft SQL Server Native Client cannot be found. Try the installation again using a valid copy of the installation package 'sqlncli.msi'."
That's an odd message to receive since I am *not* even trying to install the SQL Native Client.
Does anyone have an idea of what might be causing this to happen? I've installed SSE many times in the past with no problems.
I have written a package that archives off old orders overnight. It appears that this package is failing after about 10,000 seconds every time it is run. I don't think it is memory, as I am running it and checking for memory leaks.
A basic rundown of the package:

Execute SQL Task to get the orders to delete

In a For Loop, loop over each order number

Within the For Loop there are 2 data flows

Data flow 1:

finds related records in child tables (OLE DB connection using a query)

using a multi-split first

checks (with a lookup) for records already in the archive database
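Not the cause of your failure, but if the per-order For Loop turns out to be the bottleneck, a set-based batched archive in plain T-SQL is a commonly swapped-in alternative to looping row by row. A minimal sketch assuming SQL Server 2005 and hypothetical tables dbo.Orders and Archive.dbo.Orders with matching schemas (OrderNumber and OrderDate are assumed column names):

-- Archive and delete in batches to keep each transaction small.
DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Copy one batch of old orders into the archive.
    INSERT INTO Archive.dbo.Orders
    SELECT TOP (5000) o.*
    FROM dbo.Orders AS o
    WHERE o.OrderDate < DATEADD(year, -1, GETDATE())
      AND NOT EXISTS (SELECT 1 FROM Archive.dbo.Orders AS a
                      WHERE a.OrderNumber = o.OrderNumber);
    SET @rows = @@ROWCOUNT;

    -- Remove only rows that are safely archived.
    DELETE o
    FROM dbo.Orders AS o
    WHERE o.OrderDate < DATEADD(year, -1, GETDATE())
      AND EXISTS (SELECT 1 FROM Archive.dbo.Orders AS a
                  WHERE a.OrderNumber = o.OrderNumber);

    COMMIT TRANSACTION;
END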
Replication Distribution Agent often dies with the following log entry. At that hour little is ever going on so I am surprised that the error occurs.
Failed Job -> JobName: Instance1-DB1-Instance2-33, StepName: Run agent., Message: Timeout expired. NOTE: The step was retried the requested number of times (10) without succeeding. The step failed..
It's my laptop; I'm trying to install SQL Server 2005. The installation program dies at the first stage: it can't seem to finish the Native Client installation and stops forever. There's no error message. The installation program works well on my desktop. Does anyone know what goes wrong?
I could not find a forum for 'Log Shipping', which is why I am posting this question here. I would appreciate it if someone could provide answers based on their experience.

Can we switch the database recovery model while log shipping is turned on?

We want to switch from Full Recovery to Bulk-Logged Recovery to make sure Bulk Insert operations during the after-hours load process get some performance gain.
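Switching between FULL and BULK_LOGGED does not break the log backup chain, so log shipping keeps working; switching to SIMPLE would break it. A minimal sketch, assuming a hypothetical database MyDb:

-- Before the after-hours load: minimally log bulk operations.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;

-- ... run the BULK INSERT load here ...

-- Afterwards, return to full recovery and take a log backup.
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP LOG MyDb TO DISK = N'M:\LogShip\MyDb_postload.trn';

One caveat: a log backup containing bulk-logged operations can only be restored in full, not to an arbitrary point in time inside it.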
I'm sure I am missing something obvious; hopefully someone can point it out. After a log shipping failover, I want to fail back to my initial primary server database; however, my database is marked as loading. How can I mark it as normal?
I did the failover between the two servers, Sv1 (primary) and Sv2 (secondary), as follows:
1) Stop the primary database by using sp_change_primary_role (Sv1)
2) Change the 2nd server to primary server by running sp_change_secondary_role (Sv2)
3) Change the monitor role by running sp_change_monitor_role (Sv2)

4) Resolve the logins (Sv2)

5) Now I want to fail back. I copy the TRN files to Sv1 and use SQL Enterprise Manager to restore the database to a point in time. The task completes; however, the database is still marked as loading. I could not use sp_dboption.
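A database stays marked as loading until recovery is completed. Once the last log has been applied on Sv1, it can be brought online with a RESTORE that has no backup source; a minimal sketch, assuming the database is named MyDb:

-- Completes recovery and takes the database out of 'loading'.
RESTORE DATABASE MyDb WITH RECOVERY;

After this no further logs can be applied, so run it only when the failback restores are finished.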
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays and the like. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.
In the For Loop, how do I iterate from older flat files to newer flat files based on the files' timestamps? If there are older files in that folder, they should be processed first before continuing with the newer ones.
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this: 1. If no files are found, stop executing and send an email saying no files were found; 2. If files are found, compare them with the existing files in our archive folder. If a file already exists in the archive folder, stop executing and send an email saying the file already exists; if it is not in the archive folder yet, transfer it to the local directory for processing.
I know I have to use a Script Task to do this. I did some research and found examples for each of the above two steps, but not both combined, which is why I need some help getting the logic incorporated correctly.

Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.

Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    Public Sub Main()
        Dim cDataFileName As String
        Dim cFileType As String
        Dim cFileFlgVar As String

        ' Reset all file-type flags before inspecting the current file.
        WriteVariable("SCFileFlg", False)
        WriteVariable("OOFileFlg", False)
        WriteVariable("INFileFlg", False)
        WriteVariable("IAFileFlg", False)
        WriteVariable("RCFileFlg", False)

        ' Derive the two-letter file type from the file name and set its flag.
        cDataFileName = ReadVariable("DataFileName").ToString
        cFileType = Left(Right(cDataFileName, 4), 2)
        cFileFlgVar = cFileType.ToUpper + "FileFlg"
        WriteVariable(cFileFlgVar, True)

        Dts.TaskResult = Dts.Results.Success
    End Sub

    ' Lock, write, and unlock a package variable.
    Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
        Dim vars As Variables
        Dts.VariableDispenser.LockForWrite(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            vars(varName).Value = varValue
        Finally
            vars.Unlock()
        End Try
    End Sub

    ' Lock, read, and unlock a package variable.
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Dim vars As Variables
        Dts.VariableDispenser.LockForRead(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            result = vars(varName).Value
        Finally
            vars.Unlock()
        End Try
        Return result
    End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
It works remotely if I run it via the command prompt. But when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
Brief overview... Running Windows Server 2003 Enterprise 64-bit, all service packs and patches current. SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!
I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach?

I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, dt:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that holds the SQL Server fields, their datatypes, and their values.

If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs. SQL scripts?

Thank you
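For the single-file idea, SQL Server can emit an inline XSD schema together with the data, which covers fields, datatypes, and values in one document, and the XML can be shredded back into rows on import. A hedged sketch, assuming SQL Server 2005 and a hypothetical table dbo.Customers with columns CustomerID and Name:

-- Export: data plus an inline XSD schema describing columns and types.
SELECT *
FROM dbo.Customers
FOR XML RAW('row'), ELEMENTS, XMLSCHEMA, ROOT('Customers');

-- Import: shred an XML document back into rows with the nodes() method.
DECLARE @x xml;
SET @x = N'<Customers><row><CustomerID>1</CustomerID><Name>Acme</Name></row></Customers>';

INSERT INTO dbo.Customers (CustomerID, Name)
SELECT r.value('(CustomerID)[1]', 'int'),
       r.value('(Name)[1]', 'nvarchar(100)')
FROM @x.nodes('/Customers/row') AS t(r);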
I need to create a read-only copy of a production database owned by an outside company. We are connected via a WAN link but cannot use replication. They are proposing an initial load via tape, then sending us a text file nightly with the day's changes to the database, which we would then load using BCP, DTS, or some other method. Does anyone have any ideas on using log shipping instead of the text file? It would only be practical to get a fresh load of the entire database once a quarter, or once a month at most. It is a 40+ GB database and we are expecting 100 to 200 MB of logs per night. For business reasons, we are limited to some type of file transfer mechanism for the data transfer, and cannot really change their backup schedule, which is nightly full backups and transaction logs every 30 minutes.
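Log shipping over a plain file transfer mechanism is feasible: it is just their scheduled BACKUP LOG, a file copy, and a RESTORE LOG on your side, applied in order. A minimal sketch of the receiving side, assuming a hypothetical database MyDb and drop folder M:\Incoming, using STANDBY so the copy stays read-only between restores:

-- One-time: initialize from their full backup.
RESTORE DATABASE MyDb
FROM DISK = N'M:\Incoming\MyDb_full.bak'
WITH STANDBY = N'M:\Incoming\MyDb_undo.dat';

-- Per shipped file: apply each transaction log in sequence.
RESTORE LOG MyDb
FROM DISK = N'M:\Incoming\MyDb_tlog_0030.trn'
WITH STANDBY = N'M:\Incoming\MyDb_undo.dat';

The catch is that every log backup they take must be shipped and applied in order; since they already take logs every 30 minutes, those are the files to transfer, and one missed file breaks the sequence until you reinitialize from a full backup.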
I am using SQL 2000 Enterprise Edition with SP2 on Windows 2000 Advanced Server. For more than a week now I have been trying to establish log shipping between two servers, but it is not working.

I am using the database maintenance plan wizard to set up log shipping. Everything works fine as far as the wizard is concerned: it creates the plan for log shipping. But my log shipping is not working. The plan to back up the log on the source database is working fine; I can see the job history and the log files in the backup folder. But I have found that the job on the standby server to copy the log file from the network folder is failing, and so is the job to restore the log on the standby server. I get the following message:

"sqlmaint.exe failed with error state....."

A little research on the standby server found that SQL Server uses a maintenance plan to copy and restore the log files, but I do not see any database maintenance plans on the standby server, and I have checked that there is no plan ID in the sysjobs table on either server.

I have sa rights. The account used by the SQL service and SQL Agent has admin rights and has rights to access the network folder on both servers. So there is no rights problem.

I have followed all the steps published in the white paper on setting up log shipping on the Microsoft web site.

I have searched the Microsoft KB, but it is of no use for sqlmaint.exe.
This might end up being fairly lengthy... I'm in the midst of implementing log shipping as a "warm standby" solution at my company. All the components appear to be in place: I'm using the command shell to copy the backup device file to a remote server and then execute a RESTORE stored procedure on the remote server. The copy and restore work just fine. The problem I'm having is with the transaction log dumps and restores. We normally dump transaction logs (and then truncate) every hour. With log shipping being implemented, we're going to want to do separate log dumps every ten or fifteen minutes, copy each dump over to the remote server, and then apply that log to the database. Here's the question: for the log ship portion, I don't truncate the log, but after the "normal" log dump occurs, things get tossed out of whack. When I try to apply a log, I get the message "database has not been rolled forward enough....". Has anyone encountered this type of issue, and if so, how did you work around it? I'm assuming it's a simple issue of the options you set on your dumps and the scheduling... I'd appreciate any help... Thanks!
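The underlying issue is that all log dumps of a database form a single chain: each dump contains only the log generated since the previous dump, no matter which job took it. The hourly "normal" dump therefore swallows log that the ship-and-restore sequence needs, hence "has not been rolled forward enough". A hedged way to see the chain, using msdb's standard backup history (MyDb is a placeholder):

-- Each log backup's first_lsn must equal the previous one's last_lsn
-- for restores to proceed; a gap means another job took a dump in between.
SELECT backup_finish_date, first_lsn, last_lsn
FROM msdb.dbo.backupset
WHERE database_name = N'MyDb'
  AND type = 'L'  -- log backups only
ORDER BY backup_finish_date;

The usual fix is a single log dump schedule that feeds both purposes, rather than two independent dump jobs.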
We are considering implementing log shipping. Do the SQL Server logs keep track of the logs that are shipped and applied through log shipping? Or is there some other way to make sure that all logs have been shipped and applied?
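Both sides record history in msdb, so you can reconcile what was backed up against what was applied. A hedged sketch for the secondary, using the standard msdb history tables (MyDb is a placeholder):

-- On the secondary: every restore applied to the database, newest first.
SELECT rh.restore_date,
       rh.restore_type,       -- 'L' = transaction log
       bs.backup_start_date,  -- when that log was originally backed up
       bmf.physical_device_name
FROM msdb.dbo.restorehistory AS rh
JOIN msdb.dbo.backupset AS bs
    ON bs.backup_set_id = rh.backup_set_id
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bmf.media_set_id = bs.media_set_id
WHERE rh.destination_database_name = N'MyDb'
ORDER BY rh.restore_date DESC;

Comparing this against backupset on the primary shows any log that was taken but never applied.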
I have been successful in getting log shipping working but still have some nagging questions that I cannot find answers to.
1. I had a situation where the copy for one transaction log took much longer than the 15-minute interval I have it set up for. It seemed to get stuck on that copy. Is that how it is supposed to behave?

2. Related to the question above: weekly, I have jobs that reorganize indexes, check integrity, and recalculate statistics. Would these jobs create very large log files? If so, how do others deal with this?

3. Are there any documents available that discuss testing converting your secondary server/database to your primary and back again?

4. Is there a way to set up email notification to report out-of-sync conditions?
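On question 4: SQL Server raises error 14420 when the backup falls behind its threshold and 14421 when the restore does, and SQL Agent alerts can be attached to them. A hedged sketch, assuming an operator named 'DBA Team' already exists in msdb:

-- Alert when the secondary falls behind its restore threshold.
EXEC msdb.dbo.sp_add_alert
     @name = N'Log shipping out of sync (restore)',
     @message_id = 14421,
     @severity = 0,
     @enabled = 1;

EXEC msdb.dbo.sp_add_notification
     @alert_name = N'Log shipping out of sync (restore)',
     @operator_name = N'DBA Team',
     @notification_method = 1;  -- 1 = email

A matching alert on message 14420 covers the backup side.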
While configuring log shipping, if I choose "allow database to assume primary role", then the "create and initialise new database" option is selected by default. Does this happen all the time, or am I missing something? What if I have already initialised the destination database?
I am testing my log shipping strategy. I tried it with the Northwind database, and it was successfully created and is operating. In order to test it, I created a new test table in the primary database to see whether it is working. From then on, the database shows that it is loading: I cannot see any tables, it is grayed out, and it says "loading". What would be the problem? When I checked the logs, it has been copying to the secondary database, and the log shipping monitor doesn't show any errors. Everything seems cool except this loading part. If someone can help me, I'd really appreciate it.
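A secondary that is restored WITH NORECOVERY always shows as "loading"; that is expected, because it must keep accepting log restores. If you want to see the tables between restores, the secondary has to be kept in STANDBY mode instead. A minimal sketch of the difference (file paths are placeholders):

-- NORECOVERY: database stays in 'loading' and is not readable.
RESTORE LOG Northwind
FROM DISK = N'M:\LogShip\Northwind_tlog.trn'
WITH NORECOVERY;

-- STANDBY: database is read-only between restores, so tables are visible.
RESTORE LOG Northwind
FROM DISK = N'M:\LogShip\Northwind_tlog.trn'
WITH STANDBY = N'M:\LogShip\Northwind_undo.dat';

In the log shipping setup, this corresponds to choosing standby mode rather than no-recovery mode for the destination database.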