No Recovery/no Log File Wanted
Feb 8, 2006
I have a SQL Server (Express) database that I use in a Visual Studio (Visual Basic) 2005 application.
The application is of the Decision Support type: it routinely deletes the contents of some of the database tables and then re-populates them with new data. There is no reason to keep the "old" contents and no reason to ever restore them.
(It does this using "BULK INSERT" statements, as in:
"BULK INSERT SharePrices FROM 'C:\SharePrice.txt'".)
Because of all the data replacing, the log file gets huge quickly. I don't need the log file at all.
(I suspect that the logging is also taking up time unnecessarily.)
Is there any way I can set the database not to have a log file at all, or to have a small log file?
(I've tried deleting the log file from SQL Server Management Studio Express, but get a message that there has to be a log file.)
When I restrict the log file size, my application returns error messages that the log file is too small, and the application can't do what it needs to do.
I have already made sure that the database's recovery model is set to "Simple" and that "AutoShrink" is set to True.
I've tried manual "Shrinks". These work. However, as soon as the application accesses the database, the log file gets huge again.
I realise I may have to do (unnecessary) backups before/after my bulk insert statements. If so, I'd appreciate some help on how to do this from within a Visual Basic 2005 application. (I'm actually using the Data Access block from the Enterprise Library for my data access.)
I'd really appreciate your help.
Kind regards
Reg Bust
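A pattern that often keeps the log small in this situation (a sketch only; the logical log file name and target size are placeholders, not taken from the post): clear the table with TRUNCATE rather than DELETE, run the bulk load with TABLOCK so it qualifies for minimal logging under the Simple recovery model, and shrink the log once afterwards if it has already ballooned. SQL Server will not run without a log file, but handled this way it should stay small.
-- Sketch: minimally logged reload under the SIMPLE recovery model
TRUNCATE TABLE SharePrices           -- TRUNCATE is minimally logged, unlike DELETE
BULK INSERT SharePrices
FROM 'C:\SharePrice.txt'
WITH (TABLOCK)                       -- table lock enables minimal logging on a heap
-- One-off cleanup if the .ldf has already grown
CHECKPOINT
DBCC SHRINKFILE (N'YourDatabase_log', 10)   -- placeholder logical log file name; target size in MB
From VB 2005 with the Enterprise Library Data Access block, these statements can be sent with the Database.ExecuteNonQuery method like any other T-SQL, so no extra backups should be needed.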
View 3 Replies
May 30, 2008
The server was not being backed up and a user deleted an important folder that a process depends on. The folder was too large for the Recycle Bin and was therefore permanently deleted.
Anyone happen to know a good third party tool that performs data recovery and is RAID compatible?
Thanks,
---
http://www.ssisdude.blogspot.com/
View 2 Replies
View Related
Feb 20, 2002
I have successfully restored my old MSSQL 7.0 database from a computer that crashed - but the database is several months old. I do, however, have the most recent log files (*.ldf) - the actual log files, NOT backups of the log. Is there any way I can update my restored database with these most recent .ldf files?
View 1 Replies
View Related
Sep 22, 2005
Hi,
The server's drives became faulty, and after the server was recovered the SQL Server installation was gone from the system drive. The .mdf and .ldf files of one database, which resided on a different drive, were saved. Unfortunately, when trying to attach the files to a new SQL Server installation, the message is 'I/O error (torn page) detected during read at offset 0000000000000000 in file E:\SQLServer2K\MSSQL\Data\db.mdf'.
The available backup files are not restorable either, returning 'An internal consistency error' when restoring them. So the hope rests with the .mdf and .ldf files. Does anyone have any idea what can be done with them? Is there any recovery software to recommend?
Thanks!
AV
View 3 Replies
View Related
Oct 3, 2007
hi
I deleted a record from my database, but I want to recover the deleted record via the log file. How can I do this?
thanks
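There is no supported way to pull a deleted row straight out of the live log file; the usual route is a point-in-time restore of a copy of the database from backups (or a third-party log reader). A minimal sketch, assuming a full backup and a later log backup exist and the database is in the Full recovery model; all names, paths, and the STOPAT time are placeholders:
RESTORE DATABASE MyDB_Copy
FROM DISK = 'C:\Backups\MyDB_Full.bak'
WITH MOVE 'MyDB' TO 'C:\SQLData\MyDB_Copy.mdf',
MOVE 'MyDB_log' TO 'C:\SQLData\MyDB_Copy_log.ldf',
NORECOVERY
RESTORE LOG MyDB_Copy
FROM DISK = 'C:\Backups\MyDB_Log.trn'
WITH STOPAT = '2007-10-03 14:30:00',   -- a moment before the DELETE ran
RECOVERY
The recovered row can then be copied back into the original database with an INSERT ... SELECT.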
View 6 Replies
View Related
Apr 10, 2008
I have a warm standby (secondary server) receiving log shipping files.
The database has 5 files, all in the primary filegroup. Two of the files need to be moved from one hard drive to another. What's the best way / process to accomplish the move and re-establish the log shipping recovery status?
Thanks
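One common approach, sketched here under assumptions rather than as a definitive procedure: re-initialize the secondary by restoring the latest full backup of the primary WITH MOVE so the two files land on the new drive, leave the database in NORECOVERY (or STANDBY), and let the log shipping restore job carry on from there. Logical file names and paths below are placeholders.
-- On the secondary, relocating the two files that need to move:
RESTORE DATABASE MyDB
FROM DISK = 'D:\Backups\MyDB_Full.bak'
WITH REPLACE,
MOVE 'MyDB_Data2' TO 'F:\SQLData\MyDB_Data2.ndf',
MOVE 'MyDB_Data3' TO 'F:\SQLData\MyDB_Data3.ndf',
NORECOVERY   -- or STANDBY = 'D:\Backups\MyDB_undo.dat' for a readable warm standby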
View 3 Replies
View Related
May 6, 2015
We are restoring in a different data center and keep sending log backups there to restore. Recently we wanted to bring a new volume to the SQL Server and move all databases to it, and that's where the problem comes in. In a test I found I cannot simply change the drive letter: with old disk letter X and new disk letter Y, I shut down all SQL instances, moved all databases (in restoring status) to disk Y, got rid of disk X, and reassigned letter X to the new disk, so the new disk now has the old letter. When I then continue the restore against the latest log backup, it returns this error:
Msg 3446, Level 16, State 2, Line 4
Primary log file is not available for database 'TestDB' (7:0). The log cannot be backed up.
Msg 3013, Level 16, State 1, Line 4
RESTORE LOG is terminating abnormally.
Is there a way to do this, or does SQL Server simply not support this change?
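Before anything else, it may be worth confirming that the paths SQL Server has recorded for the restoring database still match what is on the re-lettered disk; a small sketch, using the database name from the error above:
SELECT name, physical_name, state_desc
FROM sys.master_files
WHERE database_id = DB_ID('TestDB')
If the recorded paths and the files on disk no longer line up, the usual fallback is to re-seed the secondary: restore the latest full backup WITH NORECOVERY (adding WITH MOVE if the files must live elsewhere) and then resume applying log backups.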
View 9 Replies
View Related
Apr 1, 2008
Hello,
Can anybody help me? My log file was corrupted and I have the .mdf file only. Is there any tool to recover data from the .mdf file, or to export the data to another database?
thanks
Mansoor .net developer
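If the database was shut down cleanly before the log was lost, SQL Server 2005 and later can often rebuild a fresh log while attaching just the .mdf. A sketch only; the path and database name are placeholders, and this does not work if the old log is needed to recover in-flight transactions:
CREATE DATABASE MyDB
ON (FILENAME = 'C:\SQLData\MyDB.mdf')
FOR ATTACH_REBUILD_LOG
Once attached, the data can be exported to another database with the Import/Export Wizard, bcp, or plain INSERT ... SELECT.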
View 4 Replies
View Related
Aug 7, 2015
Log shipping was configured 6 months back. A Transaction log file got corrupted today. How to resolve this?
View 20 Replies
View Related
Jul 19, 2007
I am having a problem restoring to a new or different database than the backup file being restored was created from. I understand the Move commands, and have figured out how to get this to work by mapping the data and log files from the backup file to the files defined for the restore target database.
The problem is with the full text catalog that is part of the backup.
Here's the T-SQL I'm executing, and the error I receive:
T-SQL:
RESTORE DATABASE New_DB
FROM DISK = 'C:\backup\Old_DB.bak'
WITH RECOVERY, REPLACE,
MOVE 'Old_DB' TO 'C:\SQLData\New_DB.mdf',
MOVE 'Old_DB_log' TO 'C:\SQLData\New_DB_log.ldf'
Error:
Msg 1834, Level 16, State 1, Line 1
The file 'C:\SQLData\FTData\ftKeyWords0007' cannot be overwritten. It is being used by database 'Old_DB'.
Msg 3156, Level 16, State 4, Line 1
File 'sysft_ftKeyWords' cannot be restored to 'C:\SQLData\FTData\ftKeyWords0007'. Use WITH MOVE to identify a valid location for the file.
Msg 3119, Level 16, State 1, Line 1
Problems were identified while planning for the RESTORE statement. Previous messages provide details.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
Not sure what to do about this, any help would be greatly appreciated!
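Since the error says the full-text catalog file still belongs to Old_DB, the catalog needs its own MOVE clause pointing somewhere Old_DB is not using; a sketch built from the logical name in the error message (the target folder is a placeholder):
RESTORE DATABASE New_DB
FROM DISK = 'C:\backup\Old_DB.bak'
WITH RECOVERY, REPLACE,
MOVE 'Old_DB' TO 'C:\SQLData\New_DB.mdf',
MOVE 'Old_DB_log' TO 'C:\SQLData\New_DB_log.ldf',
MOVE 'sysft_ftKeyWords' TO 'C:\SQLData\FTData\New_DB_ftKeyWords'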
View 2 Replies
View Related
Mar 20, 2008
Hello,
I have a question regarding backups for a database in the Simple recovery model.
In this model, I know we can restore only to the last full backup, or use differential backups if they are part of the backup plan.
But my point of confusion is about the '.ldf' file: should it be backed up in the Maintenance Plan, and if so, does that help reduce the size of the log file?
Do we need a backup of the '.ldf' when restoring?
As I mentioned, my database uses the Simple recovery model, but the size of the log file is around 20GB. I don't understand why, since in this model the log file is normally truncated automatically.
Please help me clear up these doubts.
thanks,
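In the Simple recovery model there is nothing useful to back up from the '.ldf' (log backups are not even allowed), so the Maintenance Plan does not need a transaction log backup task, and restores only need the full (and differential) backups. To see why a 20GB log is not being truncated, it can help to check what the log is waiting on; a small sketch, with the database name as a placeholder:
-- Shows why the transaction log cannot currently be truncated
-- (e.g. ACTIVE_TRANSACTION, REPLICATION, NOTHING)
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'YourDatabase'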
View 5 Replies
View Related
Aug 5, 2015
I was looking to change the file growth setting on our AlwaysOn environment databases. We have a single availability group with one primary and one secondary replica. I learned that when changing the file growth setting on a primary database's data file, the change flows through to the database on the secondary replica. However, after doing the same with the log files, the file growth setting changed on the primary but the change did NOT propagate to the secondary.
Is the solution to apply the change directly on the secondary? Here's the T-SQL code I used:
ALTER DATABASE myDB
MODIFY FILE ( NAME = N'myDB_log', FILEGROWTH = 512MB );
GO
SQL Server 2012 (11.0.5532)
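One way to see exactly what each replica has on record is to query sys.master_files on both instances and compare the growth columns; a sketch, using the database name from the post:
-- Run on the primary and on the secondary, then compare the results
SELECT DB_NAME(database_id) AS database_name,
name AS logical_file,
type_desc,
growth,              -- in 8KB pages when is_percent_growth = 0
is_percent_growth
FROM sys.master_files
WHERE database_id = DB_ID('myDB')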
View 9 Replies
View Related
Nov 3, 2015
I have a database in "Simple" recovery mode whose .ldf has grown to 270GB. This database is a data warehouse, so "Full" is not required. I put it in Simple mode a month ago and shrank the log down, and now it has filled up the disk again.
What steps can I take to mitigate this in the future? I've read that this is caused by long-running transactions, which keep the log from being truncated. Should I put the database back into Full mode and back up / truncate the log daily?
The auto-growth is set to 128MB which is very low.
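Two quick checks that often explain a runaway log under Simple recovery, sketched with a placeholder database name: how full the log actually is, and whether an old open transaction is pinning it.
-- Percentage of each database's log currently in use
DBCC SQLPERF (LOGSPACE)
-- Oldest active transaction in the database, if any; a long-running
-- transaction prevents log truncation even in SIMPLE
DBCC OPENTRAN ('MyWarehouseDB')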
View 3 Replies
View Related
Jul 30, 2015
My understanding is that the log file is not supposed to grow if the database is in the Simple recovery model. I am in a situation where the log grows whenever I do inserts that involve millions of rows. How do I make sure that it does not grow?
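Even in Simple recovery the log has to hold everything belonging to the currently open transaction, so one multi-million-row INSERT will grow it regardless. Committing the load in smaller batches lets checkpoints truncate the log between batches; a minimal sketch, with invented table and column names and the assumption that Col1 uniquely identifies a row:
-- Load 50,000 rows at a time so each committed batch can be
-- cleared from the log at the next checkpoint
DECLARE @rows int = 1
WHILE @rows > 0
BEGIN
    INSERT INTO dbo.TargetTable (Col1, Col2)
    SELECT TOP (50000) s.Col1, s.Col2
    FROM dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Col1 = s.Col1)
    SET @rows = @@ROWCOUNT
END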
View 11 Replies
View Related
Sep 15, 2015
One of our databases is in the Simple recovery model, and its log file (.ldf) usually grows by more than 220 GB every week. We shrink the log file repeatedly to release the space.
But as that is not advisable, I am looking for other options. I suggested changing the recovery model to Full and starting transaction log backups, but the client doesn't want to change the recovery model.
Is there any other way to manage the log file of a Simple recovery model database so it doesn't consume the disk?
Will a full backup truncate the log file?
View 9 Replies
View Related
May 31, 2015
Is it alright to move the .bak and .trn backups that are automatically created in a File Share Witness when a DB is added to an Availability Group? I did that, and errors with ID 1069 occurred; the AG could not be brought up whenever I ran a load-intensive batch job ...
View 5 Replies
View Related
Aug 2, 2006
OK guys, I have read everything in my previous post but unfortunately cannot seem to get it to work properly. Would anyone like to do what I'm sure is 10 minutes' work? Obviously I don't expect it to be free, but if I carry on like this I'm going to take a sledgehammer to it.
Thanks in advance
View 20 Replies
View Related
May 10, 2015
I want to truncate the logs of my SharePoint config database and the WSS_Logging database; the SharePoint_Config database is growing at a pace of ~10GB every week. I have scheduled a weekly full backup. The current .ldf file size is 113GB.
I am using SQL Server 2012 with the AlwaysOn high availability feature. I am not able to change the recovery model from Full to Simple, as it tells me that mirroring is running on both servers.
In my case, what do I need to do to reduce the log file?
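Databases in an Availability Group have to stay in the Full recovery model, so the only supported way to let that 113GB log truncate is to take regular transaction log backups and then shrink the file once. A sketch; the backup path and the logical log file name are placeholders:
-- Back up the log so its inactive portion can be truncated...
BACKUP LOG SharePoint_Config
TO DISK = 'E:\Backups\SharePoint_Config_log.trn'
-- ...then shrink the physical .ldf back to a sensible size (in MB)
USE SharePoint_Config
DBCC SHRINKFILE (N'SharePoint_Config_log', 10240)
Going forward, scheduling BACKUP LOG to run every 15-30 minutes keeps the log from growing that large again.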
View 3 Replies
View Related
Apr 23, 2015
Came across this scenario in a two-node AlwaysOn Availability Group: the file share witness times out, RHS terminates, and that causes the cluster node to reboot. The file share witness exists for continuous failover, and if the resource is unavailable my expectation was that it would simply go offline and not impact the server or SQL Server. But instead the cluster node is rebooted to rectify the issue.
Configuration
Windows Server 2012 R2 (VMs)
Sql Server 2012 SP2
Errors
A component on the server did not respond in a timely fashion. This caused the cluster resource 'File Share Witness' (resource type 'File Share Witness', DLL 'clusres2.dll') to exceed its time-out threshold. As part of cluster health detection, recovery actions will be taken. The cluster will try to automatically recover by terminating and restarting the Resource Hosting Subsystem (RHS) process that is running this resource. Verify that the underlying infrastructure (such as storage, networking, or services) that are associated with the resource are functioning correctly.
The cluster Resource Hosting Subsystem (RHS) process was terminated and will be restarted. This is typically associated with cluster health detection and recovery of a resource. Refer to the System event log to determine which resource and resource DLL is causing the issue.
View 3 Replies
View Related
Nov 30, 2006
Hi, does somebody recognize the problem with my code? -Thanks!
Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
Parser Error Message: This is an unexpected token. The expected token is 'NAME'. Line 58, position 52.
Source Error:
<sessionState mode="SQLServer" stateConnectionString="Jensen"
  sqlConnectionString="data source=Database;user id=username;password=pass"
  cookieless="false" timeout="60" />
View 1 Replies
View Related
Sep 23, 1999
Always looking for people who are strong with Microsoft technology for new opportunities in the Seattle area. If you are good, we can help you find your ideal next position.
don't be shy. shoot me an e-mail.
thanks
Pat Copeland
View 2 Replies
View Related
Sep 16, 1999
I am looking for strong SQL developers and/or DBAs for some really sweet companies here in Seattle. Anyone interested who is good and wants to hear about new opportunities? I would love to help you out.
don't be shy just shoot me an e-mail.
View 2 Replies
View Related
May 8, 2002
Please help me find an example of code written in VC++ for an extended stored procedure (DLL) for SQL Server 2000.
Thanks
View 1 Replies
View Related
Mar 19, 2004
OK... this is driving me nuts....
In the first DELETE and bcp, the thread being launched by xp_cmdshell was being blocked by the parent thread...
Put in WAITFOR... sometimes it worked... started with an empty table, it worked... left the 28k rows in, it blocked...
Now, put in SELECT COUNT(*)... works each and every g-d damn time...
HUH?
Now I get to the bcp out...
added the same code, WAITFOR / SELECT *...
blocks each and every g-d damn time....
I'm very reticent to COMMIT and start another transaction block...
Anyone have any ideas?
SET NOCOUNT ON
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[wrk_DataHold]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[wrk_DataHold]
GO
CREATE TABLE wrk_DataHold(Col1 varchar(8000))
GO
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[wrk_OldNew]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[wrk_OldNew]
GO
CREATE TABLE wrk_OldNew(Old varchar(255),New varchar(255))
GO
INSERT INTO wrk_OldNew(Old,New)
SELECT 'SEVERAL EE~S', ''
GO
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[usp_ModifyRows]') and OBJECTPROPERTY(id, N'IsProcedure') = 1)
drop procedure [dbo].[usp_ModifyRows]
GO
CREATE PROC usp_ModifyRows
@Path sysname
, @FName sysname
AS
SET NOCOUNT ON
BEGIN TRAN
DECLARE @cmd varchar(8000), @Servername sysname, @rc int, @error int, @rowcount int
, @Old varchar(255), @New varchar(255), @x int
CREATE TABLE #bcpLog(Col1 varchar(8000))
SET @rc = 0
DELETE FROM wrk_DataHold
SELECT @error = @@error, @rowcount = @@ROWCOUNT
IF @error <> 0
BEGIN
SET @rc = -1
GOTO usp_ModifyRows_Error
END
SELECT @x=COUNT(*) FROM wrk_DataHold
WAITFOR DELAY '000:00:10'
SET @cmd = 'bcp wrk_DataHold in ' + @Path + @FName + ' -S ' + @@SERVERNAME + ' -U -P -c'
INSERT INTO #bcpLog(Col1) EXEC master..xp_cmdShell @cmd
DECLARE OldNew CURSOR FOR SELECT Old, New FROM wrk_OldNew
OPEN OldNew
FETCH NEXT FROM OldNew INTO @Old, @New
WHILE @@FETCH_STATUS = 0
BEGIN
UPDATE wrk_DataHold
SET Col1 = REPLACE(Col1,@Old,@New)
WHERE Col1 LIKE '%'+@Old+'%'
SELECT @error = @@error, @rowcount = @@ROWCOUNT
IF @error <> 0
BEGIN
SET @rc = -1
GOTO usp_ModifyRows_Error
END
INSERT INTO #bcpLog(Col1)
SELECT 'REPLACE "'+ RTRIM(@Old) + '" With "' + RTRIM(@New)+ '"' UNION ALL
SELECT '('+CONVERT(varchar(25),@rowcount)+' row(s) affected)'
FETCH NEXT FROM OldNew INTO @Old, @New
END
CLOSE OldNew
DEALLOCATE OldNew
SELECT @x=COUNT(*) FROM wrk_DataHold
WAITFOR DELAY '000:00:10'
SELECT @FName = SUBSTRING(@FName,1,CHARINDEX('.',@FName)-1)+'.new'
INSERT INTO #bcpLog(Col1)
SELECT 'Preparing to Write out new file '+ @Path + @FName
/*
SET @cmd = 'bcp wrk_DataHold out ' + @Path + @FName + ' -S ' + @@SERVERNAME + ' -U -P -c'
INSERT INTO #bcpLog(Col1) EXEC master..xp_cmdShell @cmd
SET @cmd = 'bcp #bcpLog out D:\bcpLog.txt -S ' + @@SERVERNAME + ' -U -P -c'
INSERT INTO #bcpLog(Col1) EXEC master..xp_cmdShell @cmd
*/
COMMIT TRAN
usp_ModifyRows_Exit:
SELECT * FROM #bcpLog
DROP TABLE #bcpLog
SET NOCOUNT OFF
RETURN @rc
usp_ModifyRows_Error:
CLOSE OldNew
DEALLOCATE OldNew
ROLLBACK TRAN
GOTO usp_ModifyRows_Exit
GO
SET NOCOUNT OFF
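A likely explanation (an observation, not a verified fix): the bcp launched through xp_cmdshell logs in as a separate session, so the 'bcp wrk_DataHold out' is blocked by the locks that the procedure's own open transaction still holds on wrk_DataHold, and no WAITFOR will release them. A hedged sketch of the workaround, committing the UPDATEs before shelling out (the -T trusted-connection switch stands in for the elided credentials):
-- Commit the data changes first so the bcp session is not blocked
-- by this procedure's own open transaction, then export.
COMMIT TRAN
SET @cmd = 'bcp wrk_DataHold out ' + @Path + @FName + ' -S ' + @@SERVERNAME + ' -T -c'
INSERT INTO #bcpLog(Col1) EXEC master..xp_cmdShell @cmd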
View 5 Replies
View Related
Oct 31, 2007
Dear all,
Can anybody provide sample t-sql procedures coding to do the following tasks?
I want to call a SQL stored procedure with parameter passing. I've written the pseudocode as follows:
In the SQL server side, write a stored procedure which performs the following:
With input parameter "ABCDE";
execute SQL statement "select * from table A where tableA.field B = "ABCDE";
For each record from the result of the above SQL statement,
"select * from tableB where tableB.field C=tableA.fieldX"
if the record can be found in tableB,
"update tableB set tableB.fieldD=something";
if the record cannot be found in tableB,
"insert a new record in tableB"
My actual application requirement is that I have an input file of over a hundred thousand records, which acts as a primary file used to update or add records in another table. My client-side application, written in VB.NET, calls a SQL procedure once to perform this task. I don't want the VB.NET program to loop through all the input records and call a SQL procedure a hundred thousand times!
thanks in advance!
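A hedged sketch of the set-based procedure described above; table and column names follow the pseudocode, and 'something' stands in for whatever value fieldD should receive. On SQL Server 2005 the classic UPDATE-then-INSERT pair does the whole job in two statements, with no row-by-row loop on either side:
CREATE PROCEDURE dbo.usp_UpsertTableB
    @FieldB varchar(50)      -- e.g. 'ABCDE'
AS
BEGIN
    SET NOCOUNT ON
    -- Update the tableB rows that already match the filtered tableA set
    UPDATE b
    SET b.fieldD = a.something
    FROM dbo.tableB AS b
    JOIN dbo.tableA AS a ON a.fieldX = b.fieldC
    WHERE a.fieldB = @FieldB
    -- Insert the tableA rows that have no match in tableB
    INSERT INTO dbo.tableB (fieldC, fieldD)
    SELECT a.fieldX, a.something
    FROM dbo.tableA AS a
    WHERE a.fieldB = @FieldB
      AND NOT EXISTS (SELECT 1 FROM dbo.tableB AS b WHERE b.fieldC = a.fieldX)
END
GO
The VB.NET side then makes a single call, e.g. ExecuteNonQuery on a command for usp_UpsertTableB with @FieldB as its one parameter.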
View 8 Replies
View Related
Dec 3, 2014
We have a valid full backup of a database. We know it is valid, we have restored it twice from the network with no problems, but we do not have access to the network location from our sandbox environment.
The .bak file is sizable at about 9GB. The .bak file resides on our internal network, in a SAN drive. No problems there. When we copy the file from there into a sandbox environment to attempt the restore in the sandbox environment it gets corrupted. We've tried three different times and all three different times it gets corrupted. First time we copied the file over to the sandbox the restore went up to 50% and failed. The second time we copied the file again and attempted the restore again it failed at 70%.
The third time it failed at 60%. The error message we get during the restore is "...Read on ... failed: 13(The data is invalid) Msg 3013, Level 16, State 1, Line 1 Restore database is terminating abnormally."
Now some background here. To move the file our network team is doing this: they have this .vmdk file that they mount out in our production environment (which has access to the network location where the .bak file is), copy the file into the drive, then move the .vmdk file into the sandbox environment(which does not have access to the network location), mount the drive in the sandbox environment, and then I have access to the .bak file from within the sandbox environment.
Something in the process of using the .vmdk file to move the .bak file from production into the sandbox is causing the file to get corrupted.
View 13 Replies
View Related
Nov 6, 2015
I need to copy a just-created .bak file to another drive after the backup task has completed. I don't see anything in the job toolbox that works with file system operations like this, yet it must be a common need. There are ways to script this or use third-party tools, but I am looking for something native to the SQL Server 2012 SSMS toolset, if possible.
An alternate approach would be to run the backup job again after the main backup and change the destination to the alternate location. But I suspect another backup job would impose more overhead on the server than a simple file copy operation. If I do end up taking this approach, I could also use the cleanup task to remove older .bak files in the alternate directory.
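One native option, sketched here rather than prescribed: add a second step to the same SQL Server Agent backup job, of type 'Operating system (CmdExec)', that copies any new .bak files to the other drive. Created through T-SQL it might look roughly like this; the job name, folders, and robocopy switches are placeholders:
-- Adds a CmdExec step that copies new .bak files to a second drive
EXEC msdb.dbo.sp_add_jobstep
@job_name = N'Nightly Full Backup',
@step_name = N'Copy backup to alternate drive',
@subsystem = N'CmdExec',
@command = N'robocopy "E:\Backups" "F:\BackupCopies" *.bak /XO',
@on_success_action = 1   -- quit the job reporting success
The same step can just as easily be added on the job's Steps page in SSMS, which keeps everything inside the native toolset.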
View 7 Replies
View Related
Aug 26, 2004
Hi
I have a table called tblsample, where information is stored row-wise: the four quarters' information is stored for many years. I want that information column-wise for a given year.
say
select col1, col2 from tblsample where rqtr=1 and ryear = 2000
select col1, col2 from tblsample where rqtr=2 and ryear = 2000
select col1, col2 from tblsample where rqtr=3 and ryear = 2000
select col1, col2 from tblsample where rqtr=4 and ryear = 2000
I want the information laid out like this:
for the Year 2000
1 qtr 2 qtr 3 qtr 4 qtr
How do I achieve this in MSSQL 2000?
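On SQL Server 2000 (no PIVOT operator) conditional aggregation does this; a sketch assuming col1 identifies the row and col2 is the value to spread across the quarters:
SELECT col1,
MAX(CASE WHEN rqtr = 1 THEN col2 END) AS [1 qtr],
MAX(CASE WHEN rqtr = 2 THEN col2 END) AS [2 qtr],
MAX(CASE WHEN rqtr = 3 THEN col2 END) AS [3 qtr],
MAX(CASE WHEN rqtr = 4 THEN col2 END) AS [4 qtr]
FROM tblsample
WHERE ryear = 2000
GROUP BY col1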
View 3 Replies
View Related
Sep 30, 2005
hi! I'm still fairly wet behind the ears when it comes to databases, especially sql server 2000. But I just got a copy from my University and I really want to spend some time messing with it. The first thing I want to try is making a shopping cart that stores the items in the database rather than in the session. I'm using asp.net (VB) and I had previously made a simple shopping cart for a SMALL site that stored all the information in a dataset stored in the session... but I've been told that isn't a good idea and reccomended I store the data in the sql database. so that's what I want to do!
I'm looking for resources that can help me get this started. These are the first questions I have about this approach, and any advice from you experts would be most appreciated!
1) Do I store the data in a temporary table, or do I use the regular table and make temporary rows? If the former, how do I transfer the rows from the temp table to the order table? If the latter, how do I change them from temporary to permanent? In either case, how do I eliminate rows if the user abandons their cart?
2) how do I first initialize an order and keep track of the orderID for each user and ensure that no one is assigned a duplicate orderID?
those are my biggest confusions about this approach. I've looked for books on this but most are either too much about sql syntax and big-time server management, or they are too general and not specific to sql server. I've tried looking online, but I can't find much information on doing this using asp.net 2.0. thanks again for your help!
-SelArom
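A hedged sketch of the database-backed cart idea (all table and column names are invented): one cart row per visitor keyed by an IDENTITY column, so duplicate IDs cannot happen, plus a periodic cleanup of abandoned carts. At checkout the items are copied into the permanent order tables inside one transaction, where the real OrderID would likewise be an IDENTITY column assigned by the database.
-- Cart header: CartID is generated by the database, so it is always unique
CREATE TABLE dbo.ShoppingCart (
    CartID      int IDENTITY(1,1) PRIMARY KEY,
    SessionID   varchar(100) NOT NULL,          -- the visitor's ASP.NET session id
    LastUpdated datetime NOT NULL DEFAULT GETDATE()
)
CREATE TABLE dbo.ShoppingCartItem (
    CartID    int NOT NULL REFERENCES dbo.ShoppingCart (CartID),
    ProductID int NOT NULL,
    Quantity  int NOT NULL,
    PRIMARY KEY (CartID, ProductID)
)
-- Scheduled cleanup: remove carts untouched for more than a day
DELETE ci
FROM dbo.ShoppingCartItem AS ci
JOIN dbo.ShoppingCart AS c ON c.CartID = ci.CartID
WHERE c.LastUpdated < DATEADD(day, -1, GETDATE())
DELETE FROM dbo.ShoppingCart
WHERE LastUpdated < DATEADD(day, -1, GETDATE())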
View 6 Replies
View Related
Jul 20, 2005
I am looking for SQL Server certification questions, if anyone has any - like dumps or something.
View 1 Replies
View Related
Jun 26, 2007
Hello,
In the ErrorLog of my Sql Server , i found this line :
2007-06-26 05:35:18.37 Serveur The SQL Network Interface library could not register the Service Principal Name (SPN) for the SQL Server service. Error: 0x54b, state: 3. Failure to register an SPN may cause integrated authentication to fall back to NTLM instead of Kerberos. This is an informational message. Further action is only required if Kerberos authentication is required by authentication policies.
Operating System XP Home SP2
SQL Server 2005 Express Edition with Advanced Services SP1
(the same for another workstation with XP Pro SP2 and the same version of SQL Server 2005 Express)
My problem is :
I want to use Windows authentication, but my computers are in a workgroup linked by a router (no Windows server).
I have read that it's possible to connect from a remote computer to a computer running SQL Server 2005 Express through an SPN.
How can I do it? (By activating NTLM? But how?)
I'm writing a C# program which must run on several computers, with SQL Server 2005 Express installed on one particular computer. These computers will eventually belong to a Windows Server 2003 domain.
As I can't connect to that "normal" network, I am trying to simulate it at home, because I want to test the program and especially the possible locking problems.
I don't know whether I'm asking in the correct forum.
Sorry for my poor English.
I would appreciate any help with this problem.
Have a nice day
View 1 Replies
View Related
Apr 27, 2006
Hi!
I'm new to SSIS (and quite new to SQL Server). I have a process which I'd like to automate via SSIS - I just don't know how, and couldn't figure it out yet by playing around with the program. It shouldn't be too difficult though.
First of all, that's the process as I do it now:
1) Load several flatfile sources (dumps of SQL tables) into an SQL database.
2) Add identifier rows (to some tables), set the primary and foreign keys so the database is "recreated" and I can work on it.
3) Do several simple transformations, aggregations and selects across tables and finally write a new table containing information for reporting stuff.
I succeeded in loading flat files within the data flow view, doing some transformations and saving the output to a flat file. What I didn't find out: how can I "recreate" the database so I can perform "SELECT/FROM/WHERE" statements across tables? Will I have to write the imported files to tables within a DB (how?), or can I avoid this step?
A little guide (newbie friendly) would be great help!
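To run SELECT ... JOIN ... WHERE across the imported files, each flat file does need to land in a real table: in the data flow, point an OLE DB Destination at a staging table instead of a Flat File Destination, and then an Execute SQL Task can add the keys and build the reporting table. A hedged sketch of the kind of SQL that task might run; every name here is invented:
-- Recreate keys on the freshly loaded staging tables...
ALTER TABLE dbo.stg_Customers ADD CONSTRAINT PK_stg_Customers PRIMARY KEY (CustomerID)
ALTER TABLE dbo.stg_Orders ADD CONSTRAINT PK_stg_Orders PRIMARY KEY (OrderID)
ALTER TABLE dbo.stg_Orders ADD CONSTRAINT FK_stg_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.stg_Customers (CustomerID)
-- ...then the reporting table is ordinary cross-table SQL
INSERT INTO dbo.rpt_OrdersByCustomer (CustomerID, CustomerName, OrderCount)
SELECT c.CustomerID, c.CustomerName, COUNT(*)
FROM dbo.stg_Customers AS c
JOIN dbo.stg_Orders AS o ON o.CustomerID = c.CustomerID
GROUP BY c.CustomerID, c.CustomerName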
View 11 Replies
View Related