I am testing a procedure to automate the transaction log dump. I am following the steps in Chapter 22 of the Microsoft SQL Server Administrator's Companion ("Automatic Transaction Log Dumps Using Performance Monitor"). The Performance Monitor alert does fire when the log is 75% full, but it never runs the file that contains the DUMP TRANSACTION command. In the 'Run Program on Alert' box this is what I have: isql -Ssvrname -Usa -P -id:\apps\mssql\binn\dump.sql
The dump.sql file contains: 'dump transaction pubs with no_log'
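One quick sanity check (my suggestion, not from the book): run the exact same command from a plain command prompt on the server, redirecting output so isql can tell you what went wrong. Something like:

isql -Ssvrname -Usa -P -id:\apps\mssql\binn\dump.sql -od:\apps\mssql\binn\dump.out

If that runs cleanly but the alert still does nothing, the problem is in how Performance Monitor launches the program (for example, the account it runs under, or the path not being visible to it) rather than in the command itself.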
I have also tried the following 4 steps: 1) created a SQL Alert Message, 2) created an NT Performance Monitor threshold alert to run sqlalrtr to issue a certain error when the pubs log is 75% full, 3) created a TSQL Task, and 4) created a SQL Server Alert to run the Task created in step 3. This does the same thing: the Alert fires, but the Task is never executed. Note: I am able to execute the Task from within the Scheduled Tasks window.
I am using Standard Security with SQL Server 6.5 (sp5a) running on NT4.
Thanks for your help in advance.
I need to set up a transaction log dump of my main database that runs every hour but does not overwrite a given dump until the 8th hour (keeping 7 log dumps). Can anyone tell me how to set up the scheduled tasks for the transaction log dump? Is there anything special I need to do for the main database dump?
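A minimal sketch of one way to schedule this, assuming SQL Server 7.0 or later (where BACKUP LOG accepts a variable file name); the database name and path below are made up. Run hourly, it rotates through 8 file names, so a dump written at hour N is not overwritten until hour N+8:

-- hourly scheduled task body; 'maindb' and the path are assumptions
DECLARE @file varchar(255)
SELECT @file = 'd:\backup\maindb_log_' + CONVERT(varchar(2), DATEPART(hh, GETDATE()) % 8) + '.trn'
BACKUP LOG maindb TO DISK = @file WITH INIT

Nothing special is needed for the full database dump itself beyond scheduling it the same way (and making sure it runs before you start recycling the log dump names).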
Does anybody know how to automate the loading of incremental transaction dumps? The manual way is to use the "LOAD TRAN db WITH FILE = x" statement.
Since this has to be done in the right sequence, and I need to automate the task to keep a second server up to date, I would like to know if there is a stored procedure or any other tool that could do it.
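I don't know of a built-in procedure for this in 6.5, but a small loop over the file numbers is easy to script; 7.0's log shipping (from the resource kit) automates essentially the same idea. A sketch, where the database name, device name, and dump count are assumptions:

-- load dumps 1..@last from the log dump device, in order
DECLARE @n int, @last int, @sql varchar(255)
SELECT @n = 1, @last = 5  -- @last = however many dumps the device holds
WHILE @n <= @last
BEGIN
    SELECT @sql = 'LOAD TRANSACTION mydb FROM logdumpdev WITH FILE = ' + CONVERT(varchar(3), @n)
    EXEC (@sql)
    SELECT @n = @n + 1
END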
Our users believe that we lost some valid data, but no one knows who did it. I thought I could find out from the transaction dumps I take every hour. Can I read a transaction dump (*.TRN) file in SQL Server 7.0, or can I get this information through other means?
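Not directly; SQL Server 7.0 has no supported reader for .TRN files (some third-party log analysis tools claim to do it). What you can do with the dumps you already have is restore a copy of the database to a point just before the suspected delete and compare it to the live one. A sketch; the names, paths, logical file names, and STOPAT time are all assumptions:

-- restore the last full backup into a scratch database, ready to accept logs
RESTORE DATABASE mydb_copy FROM DISK = 'd:\backup\mydb_full.bak'
WITH MOVE 'mydb_data' TO 'd:\data\mydb_copy.mdf',
     MOVE 'mydb_log' TO 'd:\data\mydb_copy.ldf',
     NORECOVERY
-- apply the hourly log dumps in sequence; on the last one, stop just before the data vanished
RESTORE LOG mydb_copy FROM DISK = 'd:\backup\mydb_1300.trn'
WITH STOPAT = '2000-06-01 13:45:00', RECOVERY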
Can anyone tell me where I can find information about why SQL Server might be executing a stack and/or symptom dump? That is, what are some of the conditions that might cause this? Every time I try DBCC DBREINDEX on a certain table, stack and symptom dumps occur and SQL Server remains incommunicado (I can't connect, even locally) until I restart it. I'm going to try to fix the table by renaming it, creating a new table (and dependent objects), transferring the data, and toasting the original. No problem, I've done that before with smashing success. But if this solves my problem, I'll still be no wiser about what caused it in the first place. Any suggestions, questions, comments, or even jeers would be appreciated. Thanks, Chris
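For what it's worth, the rebuild you describe scripts easily; a sketch with placeholder names and columns:

-- move the data into a fresh table, then toast the suspect one
EXEC sp_rename 'badtable', 'badtable_old'
CREATE TABLE badtable (col1 int NOT NULL, col2 varchar(50) NULL)  -- recreate the real schema and indexes here
INSERT INTO badtable (col1, col2) SELECT col1, col2 FROM badtable_old
DROP TABLE badtable_old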
Hi: Actually, we are having problems doing database dumps, so we are copying the .DAT files directly to a backup machine; that is, we copy the .DAT data device files SQL Server uses, not dump files. We cannot execute the dump process at all.
What problems can we expect from doing this? And what could be preventing us from dumping our databases?
I have noticed a question on the Admin exam that involves declaring and using a time variable for scheduling a backup or dump of a database. Is this possible, and has anyone else seen this on the exam or used it in reality?
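It is possible; BACKUP accepts variables for several of its options. A minimal sketch of the pattern the question is presumably after (names and the retention period are assumptions):

-- EXPIREDATE (like RETAINDAYS) can be supplied through a variable
DECLARE @expire datetime
SELECT @expire = DATEADD(day, 7, GETDATE())
BACKUP DATABASE pubs TO DISK = 'd:\backup\pubs.bak' WITH EXPIREDATE = @expire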
Our SQL Server is giving us a headache: after a certain period of time, the SQL service either shuts down by itself or hangs. I've opened the logs and found hex dumps. Can you help me out with these?
I've read that a SQL dump can only be written directly on the local server. Is there a way to write the dump directly to another server over the network (not copying it afterwards, but writing it during the backup process)? Any help much appreciated.
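One thing that does work is dumping straight to a UNC path, as long as the account the MSSQLServer service runs under (not your own login) has write permission on the share. A sketch with made-up names; on 7.0 the equivalent is BACKUP DATABASE ... TO DISK:

DUMP DATABASE pubs TO DISK = '\\backupsrv\dumps\pubs.dmp' WITH INIT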
We are experiencing high CPU utilization across all 4 CPUs at the top of the hour, when our transaction log dump job runs. Has anyone observed this behavior before? Is there anything we can do to mitigate it? Thank you.
I am new to Microsoft SQL Server, as I come from an Oracle background. I am preparing for the MCP certification Exam 70-229, Designing and Implementing Databases with Microsoft SQL Server 2000 Enterprise Edition. The exam covers SQL Server 2000; is there an exam for SQL Server 2005?
Is it the right exam to start with as a fresher on the Microsoft platform? I am in development, not administration.
If anybody has any exam preparation questions, please forward them to krishna.kanigelpula@gmail.com.
How can I check that my dumps are not corrupted? I have been using a utility from Microsoft called DSCAN5, but I have found that it has some limitations.
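If any of the servers are on 7.0 or later, RESTORE VERIFYONLY is worth a look. The caveat: it checks that the backup set is complete and readable, not that the data inside it is consistent, so it complements rather than replaces DBCC checks taken before the dump:

RESTORE VERIFYONLY FROM DISK = 'd:\backup\mydb.bak'  -- path is an assumption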
I have a package which accesses a DB2 database and pulls data from a single table. I can't tie it to a specific event, but the package has been causing a dump on a fairly regular basis. The really odd part: sometimes, when I add a data viewer on the output link of the OLE DB Source, it works... then it starts to dump again a couple of executions later. There are no date/time values in the result set, just character strings. The default code page is set to 1252 and 'use default code page' is set to False. Any ideas appreciated; this is really starting to drive me nuts!
I'm going back and forth on an issue and was looking for some outside observations. I have over a dozen SQL Servers on NT, both 6.5 and 7.0. On some servers I'm dumping the databases and backing up the dumps with Veritas BackupExec; on four (two 6.5, two 7.0) I'm backing up with the Veritas SQL Backup Agent.
Obviously, if you don't have room to dump a database, you must use the backup agent. That is the case on one of my servers and is becoming so on a second. But my personal preference is to do the dump/backup.
(as a side note, one server is not backing up correctly with the backup agent but ultimately that box will require it due to DB growth, so it is something that I have to resolve)
I like the dump system for a few reasons. I find it easier to load from a dump, particularly a single table. Likewise, I find it easier to copy a database by loading from a dump than from a backup, but that's mainly because BackupExec is a little strange about redirected restores.
Here's my clincher: I find restores via the backup agent ridiculously slow. Say I have a 5 GB database with 1.5 GB of data in it; the dump size will probably come in somewhere below 2 GB. A restore via the backup agent carefully writes the entire 5 GB, even though 3.5 GB of that is empty, and that takes a lot of time. Add to that the post-restore DBCC. I had such a restore take something on the order of 13 HOURS, which, needless to say, conflicted with my nightly DBCCs and backups.
OK. End of rant. Any suggestions or thoughts on the subject?
I'm getting this error when executing the code below, going from W2K/SQL2K SP4 to XP/SQL2K SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works but, of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without SET XACT_ABORT ON, and also tried with SET IMPLICIT_TRANSACTIONS OFF.
SET XACT_ABORT ON
BEGIN DISTRIBUTED TRAN

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TRANSACTIONMAIN SET REPFLAG = 0 WHERE REPFLAG = 1 AND DONE = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.WBENTRY SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.FIXED SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.ALTCHARGE SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TSAUDIT SET REPFLAG = 0 WHERE REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thanks.
I have designed an SSIS package for an ETL process. In my package I have to read data from tables and then insert it into another table with the same structure.
To read the data I have written dynamic T-SQL that, based on some conditions, uses 25 different functions to populate 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it gives me a timeout error.
I have increased and decreased the timeout to try to catch the error, but it is still there. I have also tried setting 0 for the CommandTimeout property.
If I use 0 for CommandTimeout, then I get: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Does anybody have an idea?
I have a sequence container, and in my sequence container I have a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these sit inside a For Each Loop container. When I run the package it works fine on the first loop, but it gives me an error on the second execution.
Message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I am getting the error "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I call the stored procedure manually from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
I am trying to create a process to automate the FTP transfer and then an IMPORT into a table. Once the import is complete, I would like to move the file off the FTP server and rename it. I tested the commands via CMD and they work. I then created a stored procedure with the DOS commands to rename/move the source file. The command executes and deletes the file from the source; however, the files copied to a different destination have disappeared. I searched my local hard drive (I am testing using my local folders), but can't find where the copied/renamed file went.
Here is the portion of the commands:
SET @cmd = 'copy "C:\Documents and Settings\egllare\My Documents\BoxtestData.txt" "C:\Documents and Settings\egllare\My Documents\Archive\Copybox.%random%.bak"'
EXEC master..xp_cmdshell @cmd
GO
Can you please tell me what's wrong with this, or direct me to a sample stored procedure I can model?
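Two things worth keeping in mind. First, xp_cmdshell runs on the server under the SQL Server service account, so the copy lands in the server's file system (not your workstation's) and that account needs rights to both paths; that alone may explain the vanishing files if you were looking on your own drive while connected to a remote server. Second, %random% is expanded by cmd.exe, not by SQL Server, so building the target name in T-SQL makes the command predictable and easy to PRINT before you run it. A sketch with assumed paths:

DECLARE @cmd varchar(500), @stamp varchar(20)
-- build a yyyymmdd_hhmmss stamp instead of relying on %random%
SELECT @stamp = CONVERT(varchar(8), GETDATE(), 112) + '_' + REPLACE(CONVERT(varchar(8), GETDATE(), 108), ':', '')
SELECT @cmd = 'copy "C:\ftp\in\testData.txt" "C:\ftp\archive\box.' + @stamp + '.bak"'
EXEC master..xp_cmdshell @cmd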
Could anyone tell me if, and how, I can automate processing in SQL Server? For example, I would like to automate reading from a source file, populating the respective tables, and then running a view to process the desired tables. I have already done all of the above manually; I just want to know if there is a way to automate the whole process.
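Yes. The usual pattern is to express each step in T-SQL and schedule it as a SQL Server Agent job (or as a DTS package on older versions). A sketch of a single job-step body; the file, table, and view names are assumptions:

-- step 1: load the source file into a staging table
BULK INSERT staging_src FROM 'd:\feeds\source.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
-- step 2: populate the real tables from staging
INSERT INTO target_table (col1, col2)
SELECT col1, col2 FROM staging_src
-- step 3: run whatever consumes the view
SELECT * FROM processing_view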
I was wondering if anyone can share the procedure(s) used to set up an automated MS SQL database backup through Tivoli Storage Manager (5.2.7)?
The powers that be have decided that they would like to automatically run a trace on one of our analysis servers when it processes a cube in the early hours of the morning. I have no problem creating a SQL Profiler trace and storing the results in a database table for them, but I have no idea how to automate it so that it runs every day; any help would be greatly appreciated. Apologies for not putting this in the correct forum.
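Assuming the trace is against the relational engine: Profiler can script any trace you design (File > Export > Script Trace Definition), and the generated script is just calls to the sp_trace_* procedures, which you can wrap in a stored procedure and run from an Agent job scheduled ahead of the cube processing. A skeleton of what such a script boils down to (the event/column IDs and the path are assumptions):

DECLARE @TraceID int, @maxsize bigint, @on bit
SELECT @maxsize = 100, @on = 1
-- create a server-side trace that writes to a file, no Profiler GUI involved
EXEC sp_trace_create @TraceID OUTPUT, 0, N'd:\traces\cube_processing', @maxsize, NULL
EXEC sp_trace_setevent @TraceID, 10, 1, @on  -- event 10 (RPC:Completed), column 1 (TextData)
EXEC sp_trace_setstatus @TraceID, 1          -- 1 = start

Afterwards the .trc file can be pulled into a table with ::fn_trace_gettable for them to query.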
INSERT INTO SaltInvWhOpen (StockCode, Warehouse, TrnMonth, TrnYear, OpenBalQty, OpenBalCost)
SELECT StockCode, Warehouse, MONTH(GETDATE()) AS TrnMonth, YEAR(GETDATE()) AS TrnYear,
       OpenBalQty12, OpenBalQty12 * UnitCost AS Expr1
FROM InvWarehouse
In older versions I would have used a DTS package (It's been a long time and I'd have had to use a book to guide me through it), but as that is gone now, I am unsure where to even begin looking.
If someone could assist by giving me the current method and, dare I ask, walk me through it, the help would be much appreciated.
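Assuming a current version: the scheduling side of what DTS used to do is now a SQL Server Agent job with a T-SQL step. The same thing can be clicked together under SQL Server Agent > Jobs in Management Studio, but scripted it looks roughly like this (the job name, target database, and the monthly schedule are my guesses, the latter based on your TrnMonth/TrnYear columns):

USE msdb
EXEC sp_add_job @job_name = N'SaltInvWhOpen load'
EXEC sp_add_jobstep @job_name = N'SaltInvWhOpen load',
    @step_name = N'Insert opening balances', @subsystem = N'TSQL',
    @database_name = N'YourDb',
    @command = N'INSERT INTO SaltInvWhOpen (StockCode, Warehouse, TrnMonth, TrnYear, OpenBalQty, OpenBalCost) SELECT StockCode, Warehouse, MONTH(GETDATE()), YEAR(GETDATE()), OpenBalQty12, OpenBalQty12 * UnitCost FROM InvWarehouse'
EXEC sp_add_jobschedule @job_name = N'SaltInvWhOpen load',
    @name = N'Monthly', @freq_type = 16, @freq_interval = 1,  -- day 1 of each month
    @active_start_time = 10000                                -- 01:00:00
EXEC sp_add_jobserver @job_name = N'SaltInvWhOpen load'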
After SQL Server Express 2012 is installed, is there a way to automate setting a few options? I need to enable both Windows and SQL Server authentication. Can this be done with PowerShell or another tool? This is a situation where the user has absolutely no computer skills and there is no one in their office who can set the options manually, so I thought some sort of script would be useful.
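One script-only route, with the caveat that the registry value is not formally documented but is what the GUI itself flips: write LoginMode for the instance and restart the service. Run through sqlcmd:

-- 1 = Windows authentication only, 2 = mixed mode; takes effect after a service restart
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'LoginMode', REG_DWORD, 2

If you control the installation itself, passing /SECURITYMODE=SQL and /SAPWD to setup gets mixed mode from the start, and PowerShell can make the same change through SMO (the Server object's Settings.LoginMode property).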
We're trying to put a view of data maintained in desktop Access databases online and into SQL Server. The desktop Access system uses separate databases instead of tables within one database. It's a strange design, but it can't be changed. We have been importing all of the separate databases into a single new Access database, then upsizing the new database to SQL Server, then uploading it. This is not going to work long term, because we are stuck upsizing and uploading a 250 MB Access database when we never need to update more than 2 or 3 of the tables or upload more than 2 MB. We'd like to be able to upload only the tables (preferably the Access *.mdb's) that have changed, and then replace the SQL Server tables with the new information. And we'd like to automate it as much as possible, without upsizing wizardry. I don't know where to even begin looking for information about how this might be done; any suggestions would be deeply appreciated. - Tinker
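One place to start (a sketch, not a finished solution): SQL Server can read an .mdb directly through the Jet OLE DB provider, so a small T-SQL job can refresh just the tables that changed, with no upsizing wizard involved. Table and path names here are made up:

-- refresh one table from an uploaded .mdb (Jet provider must exist on the server)
TRUNCATE TABLE dbo.Dept1Data
INSERT INTO dbo.Dept1Data
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'd:\uploads\dept1.mdb'; 'Admin'; '',
    Dept1Data)

Upload only the changed .mdb's, then run one such statement per table, wrapped in a stored procedure or an Agent job.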