DB Engine :: Is There Any Possibility To Override Database Audit Specification File
Nov 10, 2015: Is there any possibility to override a database audit specification file? Suppose I want to change some logged entries forcefully. Is that possible?
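The .sqlaudit files the audit writes are binary and effectively read-only, so the logged events themselves cannot be edited; what can be changed is the specification that controls what gets captured from now on. A minimal sketch, assuming a database called MyDatabase and a specification called AuditSpec_MyDb (both names are hypothetical):
USE MyDatabase;
GO
ALTER DATABASE AUDIT SPECIFICATION AuditSpec_MyDb WITH (STATE = OFF);   -- must be off before it can be changed
GO
ALTER DATABASE AUDIT SPECIFICATION AuditSpec_MyDb
    ADD (UPDATE ON DATABASE::MyDatabase BY public),
    DROP (SELECT ON DATABASE::MyDatabase BY public);
GO
ALTER DATABASE AUDIT SPECIFICATION AuditSpec_MyDb WITH (STATE = ON);
GO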
How can I take a backup of a database audit specification?
How can I trace when a database audit specification is enabled or disabled? I want to maintain a log of every enable or disable of the database audit specification.
I have created a server audit and a database audit specification that audits SELECT for a certain user on a certain table. I logged in as this user and ran a SELECT statement. When I run this query
"select * from sys.fn_get_audit_file('d:Auditaudit1*',null,null)"
it returns a row showing the time at which the query ran.
After 15 minutes I repeated the same action, ran the audit query again, and the same single result is shown. Isn't it supposed to return one row for each time this user ran the SELECT statement on that table, for example at 5:00 pm, then 6:00 pm, and so on?
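If the audit is capturing each statement, the file should contain one row per audited SELECT, each with its own event_time and sequence_number. A hedged example of pulling just those events (the path mirrors the one above; action_id 'SL' is the SELECT action):
SELECT event_time, server_principal_name, database_principal_name, [statement]
FROM sys.fn_get_audit_file(N'd:\Audit\audit1*', NULL, NULL)
WHERE action_id = 'SL'          -- SELECT events only
ORDER BY event_time DESC;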
Currently I am using SQL Server 2012 and would like to implement a database audit specification for specific users in my database. These are the users in my database, named Payroll:
Payroll\Andy.Bred - db_owner
Payroll\Arpit.Shah - db_owner
Payroll\webapp - db_datareader, db_datawriter, EXECUTE
web_payroll - db_datareader, db_datawriter, EXECUTE
In my database audit specification I would like to capture any SELECT, UPDATE, DELETE and EXECUTE command for users Payroll\Andy.Bred and Payroll\Arpit.Shah only, since they hold db_owner access. However, I am unable to capture a single command from either user. I do not want to set 'Principal' to public, since I only want to capture these two users' activity.
Did I miss anything? Is it because they are Windows login accounts?
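For comparison, a minimal sketch of a specification scoped to just those two principals; the server audit name Payroll_Audit is an assumption, and the audit itself must also be enabled (STATE = ON) or nothing is written:
USE Payroll;
GO
CREATE DATABASE AUDIT SPECIFICATION Audit_Payroll_Owners
FOR SERVER AUDIT Payroll_Audit
ADD (SELECT, UPDATE, DELETE, EXECUTE
     ON DATABASE::Payroll
     BY [Payroll\Andy.Bred], [Payroll\Arpit.Shah])
WITH (STATE = ON);
GO
The BY clause must name database principals exactly as they exist in the Payroll database; if those Windows accounts actually reach the database through a group or a different user mapping, the individual names will never match and nothing gets captured.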
I have a Windows 2012 server and would like to know how to audit DML on a table (DELETE, TRUNCATE, UPDATE). I want to see every T-SQL DML statement carried out on this table, written to a file. How can this be achieved, if possible with something already built into SQL Server/SSMS?
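One built-in route on SQL Server 2012 is SQL Server Audit writing to a file; a hedged sketch where the audit name, database, table and folder are all placeholders:
USE master;
GO
CREATE SERVER AUDIT Dml_Audit TO FILE (FILEPATH = N'D:\Audit\');   -- folder must already exist
GO
ALTER SERVER AUDIT Dml_Audit WITH (STATE = ON);
GO
USE MyDatabase;
GO
CREATE DATABASE AUDIT SPECIFICATION Dml_On_MyTable
FOR SERVER AUDIT Dml_Audit
ADD (INSERT, UPDATE, DELETE ON OBJECT::dbo.MyTable BY public)      -- BY public = every user
WITH (STATE = ON);
GO
SELECT event_time, server_principal_name, [statement]
FROM sys.fn_get_audit_file(N'D:\Audit\*.sqlaudit', DEFAULT, DEFAULT);
GO
One caveat: TRUNCATE TABLE is not an ordinary DELETE, so verify in your environment whether it appears in the audit output or needs to be caught separately (for example with an Extended Events session).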
What is the recommended size and file growth for a database and log file? We will be storing approx. 10,000 records a day. Currently we have the following:
CREATE DATABASE Dummy
ON
PRIMARY
( NAME = Dummy_data,
  FILENAME = 'D:\....\DATA\Dummy.mdf',
  SIZE = 250MB,
  FILEGROWTH = 25MB )
LOG ON
( NAME = Dummy_log,
  FILENAME = 'D:\....\DATA\Dummy_log.ldf',
  SIZE = 50MB,
  FILEGROWTH = 5MB ) ;
GO
In SQL Server 2012 I tried to enable an audit from SSMS and received the following error:
Enable failed for Audit. Error 33222; for more information, see the SQL Server error log.
The error log suggested querying sys.dm_os_ring_buffers.
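Since SQL Server Audit is built on Extended Events, the extra detail for error 33222 often lands in the XE ring buffer; a hedged query (RING_BUFFER_XE_LOG is the usual place to look, but the relevant record may also simply be in the error log itself):
SELECT ring_buffer_type, [timestamp], CAST(record AS xml) AS record_xml
FROM sys.dm_os_ring_buffers
WHERE ring_buffer_type = N'RING_BUFFER_XE_LOG'
ORDER BY [timestamp] DESC;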
I am currently re-writing an overnight index defrag procedure and would like to audit indexes in my database, logging the "before defrag action" avg fragmentation value and the "after defrag action" value in an audit table. This will be for all databases on the server. I have completed the vast majority of it (cycling through all the databases, detecting which indexes need reorganising or rebuilding, and inserting the information into a table) but I cannot get the audit values working properly. For example, a sample row in my audit table would look like this:
ID  Name    DB   Table   frag_before  frag_after
1   Index2  DB1  Table6  70.33456     0.03
2   Index7  DB1  Table9  45.98        1.2567
etc.
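A minimal sketch of capturing one index's before/after values, run in the context of the target database; the index, table and audit-table names are the hypothetical ones from the sample row above:
DECLARE @frag_before float, @frag_after float;
SELECT @frag_before = ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.Table6'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
WHERE i.name = N'Index2';
ALTER INDEX Index2 ON dbo.Table6 REBUILD;   -- or REORGANIZE, per the threshold logic
SELECT @frag_after = ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.Table6'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
WHERE i.name = N'Index2';
INSERT INTO dbo.IndexDefragAudit ([Name], [DB], [Table], frag_before, frag_after)
VALUES (N'Index2', DB_NAME(), N'dbo.Table6', @frag_before, @frag_after);
In the real procedure the literals would of course come from the loop or cursor variables that drive the defrag.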
One of our databases went into Restoring mode. I stopped the SQL Server service, copied only the MDF file, and started the service again; then I unexpectedly dropped the database. Now I cannot attach the database using only my MDF file.
I tried the scripts below; all of them show the same error.
EXEC sp_attach_single_file_db @dbname='PhoenixPolice',
@physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\PhoenixPolice.mdf'
GO
CREATE DATABASE PhoenixPolice ON
  (NAME = N'PhoenixPolice',
  FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\PhoenixPolice.mdf')
FOR ATTACH_REBUILD_LOG
GO
CREATE DATABASE phoenixPolice ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\PhoenixPolice.mdf')
FOR ATTACH
All scripts show the error below:
File activation failure. The physical file name "F:\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\PhoenixPolice_1.ldf" may be incorrect.
The log cannot be rebuilt because there were open transactions/users when the database was shutdown, no checkpoint occurred to the database, or the database was read-only. This error could occur if the transaction log file was manually deleted or lost due to a hardware or environment failure.
Msg 1813, Level 16, State 2, Line 1
Could not open new database 'PhoenixPolice'. CREATE DATABASE is aborted.
I have to perform disk maintenance on the current drive, Drive D, which holds the SQL data (MDF file), and I have added a new drive, Drive E. Drive C holds the program files for SQL Server 2008 R2. What is the correct process to transfer the SQL data (MDF file) from Drive D to Drive E, so that Drive D can later be removed from the server?
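The usual sequence is to repoint the file metadata and copy the file while the database is offline; a hedged sketch with placeholder database, logical-name and path values (check sys.master_files for the real ones):
SELECT name, physical_name FROM sys.master_files WHERE database_id = DB_ID(N'MyDb');
ALTER DATABASE MyDb SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- copy D:\Data\MyDb.mdf to E:\Data\MyDb.mdf at the operating-system level, then:
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_data, FILENAME = N'E:\Data\MyDb.mdf');
ALTER DATABASE MyDb SET ONLINE;
Once the database is back online and verified, the old copy on Drive D can be deleted and the drive removed.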
I have a Firebird database file (30 GB, with a .ydb extension) which needs to be migrated to SQL Server. I have created a linked server and done it that way, but it is taking a very long time to transfer the records.
Can we add multiple log files to an existing database?
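Yes; SQL Server fills log files sequentially, so an extra log file adds space rather than speed, but it can be added with something like this hedged example (names and path are placeholders):
ALTER DATABASE MyDb
ADD LOG FILE (NAME = MyDb_log2, FILENAME = N'E:\Logs\MyDb_log2.ldf', SIZE = 256MB, FILEGROWTH = 64MB);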
Is it possible to connect to the Database Engine (SQL Server 2005 on an XP platform, a *.mdf file, not the mobile *.sdf) from Windows CE 5.0?
I tried to do this :
ConnectionStringSQLServerCE = "Data Source=WORK_STATION\SQLEXPRESS;Initial Catalog=dbMachines;Integrated Security=False;Password=Panel;User ID=Panel";
SqlCeConnectionCE = new SqlConnection(ConnectionStringSQLServerCE);
SqlCeConnectionCE.Open();
but I catch error:
catch (PlatformNotSupportedException ex)
"PlatformNotSupportedException"
I noticed that I can see the server, because when I use:
ConnectionStringSQLServerCE = "Data Source=WORK_STATION\SQLEXPRESS";
it returns an error of type
catch (SqlException ex)
"Login failed for user ''. The user is not associated with a trusted SQL Server connection."
I used
using System.Data.SqlClient;
from Compact Framework 2.0 under Visual Studio 2005 Device Application Windows CE 5.0
can anyone help me?
Hi all--Given a table called "buyers" with the following column definitions in a SQL Server 2005 database:
[BUYER] [nvarchar](40) NULL,
[DIVISION] [nvarchar](3) NULL,
[MOD_DATE] [datetime] NULL
This table is laden with Unicode data and the MOD_DATE contains no data--not even NULL values, and is giving me a headache as a result. I can export this data fine to a text file, but when I create an SSIS package to attempt import to another table defined exactly the same as above in another place, I get the following messages:
SSIS package "buyers_import.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Source - buyers_txt [1]: The processing of file "D:\temp3\buyers.txt" has started.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DE at Data Flow Task, Source - buyers_txt [1]: The total number of data rows processed for file "D:\temp3\buyers.txt" is 232.
Error: 0xC0202009 at Data Flow Task, Destination - buyers_tst [22]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Invalid character value for cast specification".
Error: 0xC020901C at Data Flow Task, Destination - buyers_tst [22]: There was an error with input column "MOD_DATE" (45) on input "Destination Input" (35). The column status returned was: "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Data Flow Task, Destination - buyers_tst [22]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "Destination Input" (35)" failed because error code 0xC0209077 occurred, and the error row disposition on "input "Destination Input" (35)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Destination - buyers_tst" (22) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0209029. There may be error messages posted before this with more information on why the thread has exited.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Source - buyers_txt [1]: The processing of file "D:\temp3\buyers.txt" has ended.
Information: 0x402090DF at Data Flow Task, Destination - buyers_tst [22]: The final commit for the data insertion has started.
Information: 0x402090E0 at Data Flow Task, Destination - buyers_tst [22]: The final commit for the data insertion has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "Destination - buyers_tst" (22)" wrote 0 rows.
Task failed: Data Flow Task
SSIS package "buyers_import.dtsx" finished: Failure.
Among the customizations in this package is the flag "ValidateExternalMetadata" set to False. The data itself is surrounded by " and delimited by semicolons for each field, with the header row set as the name of each column. It looks like this:
"BUYER";"DIVISION";"MOD_DATE"
"108 Joon-Hyn Kim";"TAD";""
"109 Kang-Soo Do";"TAD";""
"FS07 John Smith";"TAD";""
...
Can anyone suggest a course of action on how to handle the error when the MOD_DATE field is completely empty?
Thanks in advance,
Jonathan
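One workaround that avoids fighting the cast inside the data flow is to land MOD_DATE as text and convert it in T-SQL, turning empty strings into NULL; a hedged sketch with a hypothetical staging table (the destination name buyers_tst is taken from the log above; NULLIF is used rather than TRY_CONVERT since SQL Server 2005 lacks the latter):
CREATE TABLE dbo.buyers_stage (
    BUYER    nvarchar(40) NULL,
    DIVISION nvarchar(3)  NULL,
    MOD_DATE nvarchar(30) NULL       -- loaded as text exactly as it appears in the file
);
INSERT INTO dbo.buyers_tst (BUYER, DIVISION, MOD_DATE)
SELECT BUYER, DIVISION,
       CONVERT(datetime, NULLIF(LTRIM(RTRIM(MOD_DATE)), ''))
FROM dbo.buyers_stage;
Alternatively, a Derived Column transformation inside the package that substitutes NULL(DT_DBTIMESTAMP) whenever the incoming MOD_DATE string is empty should have the same effect before the destination.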
I'm trying to connect to a tab-delimited text file using the ODBC data access. I set it up using Administrative Tools - Data Sources (ODBC) from the start menu, and clicked the 'Guess' button under Define Format.
The schema it created looks like this:
[cims_output_data.txt]
ColNameHeader=True
Format=TabDelimited
MaxScanRows=0
CharacterSet=OEM
Col1=SIMNAME
Col2=TIMESTAMP
Col3=SIMTYPE
Col4=REGION
Col5=SECTOR
....etc.
It looks OK, but if I go back to Define in Administrative Tools and try to review it, I get this error message:
Ini File (or Registry) C:\Program Files\CIMS\schema.ini is corrupt.
Section: cims_output_data.txt, Key: Col1.
Has anyone seen this before? There is nothing on MSDN as far as I could find. I read on another forum that if you replace the label 'Col1' with 'Col' the problem is solved but it still reports that the schema is corrupt when I try to open it in Administrative Tools.
Bill.
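One thing that can provoke that "corrupt" complaint from the text driver is a ColN entry without a data type; a hedged sketch of the same section with types added (the types are guesses based on the column names):
[cims_output_data.txt]
ColNameHeader=True
Format=TabDelimited
MaxScanRows=0
CharacterSet=OEM
Col1=SIMNAME Text
Col2=TIMESTAMP DateTime
Col3=SIMTYPE Text
Col4=REGION Text
Col5=SECTOR Text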
Hi,
I've created a few audit trace sessions and each of them is producing files.
The procedure I used is listed below:
IF EXISTS (SELECT 1 FROM sysobjects WHERE [name] = 'sp_Turn_Audit_On' AND type = 'P')
DROP PROC sp_Turn_Audit_On
GO
CREATE proc sp_Turn_Audit_On
as
/************************************************** **/
/* Created by: SQL Profiler */
/* Date: 11/15/2006 05:16:40 PM */
/************************************************** **/
-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
set @maxfilesize = 1024
exec @rc = sp_trace_create @TraceID output, 2, N'E:\Program Files\Microsoft SQL Server\MSSQL\TraceAudit_Info.trc', @maxfilesize, NULL
if (@rc != 0) goto error
-- Client side File and Table cannot be scripted
-- Set the events
declare @on bit
set @on = 1
begin
exec sp_trace_setevent @TraceID, 14, 1, @on
exec sp_trace_setevent @TraceID, 14, 6, @on
exec sp_trace_setevent @TraceID, 14, 9, @on
exec sp_trace_setevent @TraceID, 14, 10, @on
exec sp_trace_setevent @TraceID, 14, 11, @on
exec sp_trace_setevent @TraceID, 14, 12, @on
exec sp_trace_setevent @TraceID, 14, 13, @on
exec sp_trace_setevent @TraceID, 14, 14, @on
exec sp_trace_setevent @TraceID, 14, 16, @on
exec sp_trace_setevent @TraceID, 14, 17, @on
exec sp_trace_setevent @TraceID, 14, 18, @on
exec sp_trace_setevent @TraceID, 15, 1, @on
exec sp_trace_setevent @TraceID, 15, 6, @on
exec sp_trace_setevent @TraceID, 15, 9, @on
exec sp_trace_setevent @TraceID, 15, 10, @on
exec sp_trace_setevent @TraceID, 15, 11, @on
exec sp_trace_setevent @TraceID, 15, 12, @on
exec sp_trace_setevent @TraceID, 15, 13, @on
exec sp_trace_setevent @TraceID, 15, 14, @on
exec sp_trace_setevent @TraceID, 15, 16, @on
exec sp_trace_setevent @TraceID, 15, 17, @on
exec sp_trace_setevent @TraceID, 15, 18, @on
exec sp_trace_setevent @TraceID, 20, 1, @on
exec sp_trace_setevent @TraceID, 20, 6, @on
exec sp_trace_setevent @TraceID, 20, 9, @on
exec sp_trace_setevent @TraceID, 20, 10, @on
exec sp_trace_setevent @TraceID, 20, 11, @on
exec sp_trace_setevent @TraceID, 20, 12, @on
exec sp_trace_setevent @TraceID, 20, 13, @on
exec sp_trace_setevent @TraceID, 20, 14, @on
exec sp_trace_setevent @TraceID, 20, 16, @on
exec sp_trace_setevent @TraceID, 20, 17, @on
exec sp_trace_setevent @TraceID, 20, 18, @on
exec sp_trace_setevent @TraceID, 37, 1, @on
exec sp_trace_setevent @TraceID, 37, 6, @on
exec sp_trace_setevent @TraceID, 37, 9, @on
exec sp_trace_setevent @TraceID, 37, 10, @on
exec sp_trace_setevent @TraceID, 37, 11, @on
exec sp_trace_setevent @TraceID, 37, 12, @on
exec sp_trace_setevent @TraceID, 37, 13, @on
exec sp_trace_setevent @TraceID, 37, 14, @on
exec sp_trace_setevent @TraceID, 37, 16, @on
exec sp_trace_setevent @TraceID, 37, 17, @on
exec sp_trace_setevent @TraceID, 37, 18, @on
exec sp_trace_setevent @TraceID, 46, 1, @on
exec sp_trace_setevent @TraceID, 46, 6, @on
exec sp_trace_setevent @TraceID, 46, 9, @on
exec sp_trace_setevent @TraceID, 46, 10, @on
exec sp_trace_setevent @TraceID, 46, 11, @on
exec sp_trace_setevent @TraceID, 46, 12, @on
exec sp_trace_setevent @TraceID, 46, 13, @on
exec sp_trace_setevent @TraceID, 46, 14, @on
exec sp_trace_setevent @TraceID, 46, 16, @on
exec sp_trace_setevent @TraceID, 46, 17, @on
exec sp_trace_setevent @TraceID, 46, 18, @on
exec sp_trace_setevent @TraceID, 47, 1, @on
exec sp_trace_setevent @TraceID, 47, 6, @on
exec sp_trace_setevent @TraceID, 47, 9, @on
exec sp_trace_setevent @TraceID, 47, 10, @on
exec sp_trace_setevent @TraceID, 47, 11, @on
exec sp_trace_setevent @TraceID, 47, 12, @on
exec sp_trace_setevent @TraceID, 47, 13, @on
exec sp_trace_setevent @TraceID, 47, 14, @on
exec sp_trace_setevent @TraceID, 47, 16, @on
exec sp_trace_setevent @TraceID, 47, 17, @on
exec sp_trace_setevent @TraceID, 47, 18, @on
exec sp_trace_setevent @TraceID, 104, 1, @on
exec sp_trace_setevent @TraceID, 104, 6, @on
exec sp_trace_setevent @TraceID, 104, 9, @on
exec sp_trace_setevent @TraceID, 104, 10, @on
exec sp_trace_setevent @TraceID, 104, 11, @on
exec sp_trace_setevent @TraceID, 104, 12, @on
exec sp_trace_setevent @TraceID, 104, 13, @on
exec sp_trace_setevent @TraceID, 104, 14, @on
exec sp_trace_setevent @TraceID, 104, 16, @on
exec sp_trace_setevent @TraceID, 104, 17, @on
exec sp_trace_setevent @TraceID, 104, 18, @on
exec sp_trace_setevent @TraceID, 107, 1, @on
exec sp_trace_setevent @TraceID, 107, 6, @on
exec sp_trace_setevent @TraceID, 107, 9, @on
exec sp_trace_setevent @TraceID, 107, 10, @on
exec sp_trace_setevent @TraceID, 107, 11, @on
exec sp_trace_setevent @TraceID, 107, 12, @on
exec sp_trace_setevent @TraceID, 107, 13, @on
exec sp_trace_setevent @TraceID, 107, 14, @on
exec sp_trace_setevent @TraceID, 107, 16, @on
exec sp_trace_setevent @TraceID, 107, 17, @on
exec sp_trace_setevent @TraceID, 107, 18, @on
exec sp_trace_setevent @TraceID, 109, 1, @on
exec sp_trace_setevent @TraceID, 109, 6, @on
exec sp_trace_setevent @TraceID, 109, 9, @on
exec sp_trace_setevent @TraceID, 109, 10, @on
exec sp_trace_setevent @TraceID, 109, 11, @on
exec sp_trace_setevent @TraceID, 109, 12, @on
exec sp_trace_setevent @TraceID, 109, 13, @on
exec sp_trace_setevent @TraceID, 109, 14, @on
exec sp_trace_setevent @TraceID, 109, 16, @on
exec sp_trace_setevent @TraceID, 109, 17, @on
exec sp_trace_setevent @TraceID, 109, 18, @on
exec sp_trace_setevent @TraceID, 110, 1, @on
exec sp_trace_setevent @TraceID, 110, 6, @on
exec sp_trace_setevent @TraceID, 110, 9, @on
exec sp_trace_setevent @TraceID, 110, 10, @on
exec sp_trace_setevent @TraceID, 110, 11, @on
exec sp_trace_setevent @TraceID, 110, 12, @on
exec sp_trace_setevent @TraceID, 110, 13, @on
exec sp_trace_setevent @TraceID, 110, 14, @on
exec sp_trace_setevent @TraceID, 110, 16, @on
exec sp_trace_setevent @TraceID, 110, 17, @on
exec sp_trace_setevent @TraceID, 110, 18, @on
exec sp_trace_setevent @TraceID, 115, 1, @on
exec sp_trace_setevent @TraceID, 115, 6, @on
exec sp_trace_setevent @TraceID, 115, 9, @on
exec sp_trace_setevent @TraceID, 115, 10, @on
exec sp_trace_setevent @TraceID, 115, 11, @on
exec sp_trace_setevent @TraceID, 115, 12, @on
exec sp_trace_setevent @TraceID, 115, 13, @on
exec sp_trace_setevent @TraceID, 115, 14, @on
exec sp_trace_setevent @TraceID, 115, 16, @on
exec sp_trace_setevent @TraceID, 115, 17, @on
exec sp_trace_setevent @TraceID, 115, 18, @on
-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint
exec sp_trace_setfilter @TraceID, 10, 0, 7, N'SQL Profiler'
-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1
-- display trace id for future references
select TraceID=@TraceID
goto noCursor
error:
select ErrorCode=@rc
noCursor:
return
end
Unfortunately, I didn't specify a @stoptime for any of the trace sessions, and now I'm unable to delete the trace files. Is there a SELECT statement that would be useful in finding the @traceid for these auditing sessions? Also, how can I stop the sessions without stopping the related services?
Thank you in advance.
Best regards,
Alla
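In case it helps anyone else: the running traces and their IDs can be listed from the trace metadata, and a trace can be stopped and its file released without touching any service; a short hedged sketch (trace id 2 is only an example value):
SELECT * FROM ::fn_trace_getinfo(default);   -- on SQL Server 2005+ sys.traces shows the same information
EXEC sp_trace_setstatus @traceid = 2, @status = 0;   -- stop the trace
EXEC sp_trace_setstatus @traceid = 2, @status = 2;   -- close it and release the file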
I am setting up SQL Audit on the SQL Servers in my environment based on a requirement. I want to create database audit specifications as soon as a database is created. I tried a DDL trigger, but Audit doesn't support being set up from triggers, so I created the audit specification in the model database. The only problem with this is that every specification created in a new database gets the same name. Is there a way to make the database specification name include the newly created database's name, or another method to create database specifications on newly created databases?
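One workaround, sketched here with an assumed server audit name (MyServerAudit) and an assumed action group, is a scheduled Agent job that walks the databases and creates a uniquely named specification wherever one is missing:
DECLARE @db sysname, @sql nvarchar(max);
DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE database_id > 4 AND state_desc = N'ONLINE';
OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
IF NOT EXISTS (SELECT 1 FROM sys.database_audit_specifications)
    CREATE DATABASE AUDIT SPECIFICATION ' + QUOTENAME(N'AuditSpec_' + @db) + N'
    FOR SERVER AUDIT MyServerAudit
    ADD (SCHEMA_OBJECT_CHANGE_GROUP)
    WITH (STATE = ON);';
    EXEC (@sql);
    FETCH NEXT FROM dbs INTO @db;
END
CLOSE dbs;
DEALLOCATE dbs;
Because the specification name is built from the database name, each database ends up with its own distinct specification rather than the single name inherited from model.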
I am writing a T-SQL query for the following scenario. I have a SQL DB audit file that gets populated with data as activity on the DB goes on, and multiple monthly tables set up; the import should go into these monthly tables based on the event_time value in the audit file, so all data with, for example, event_time '2015-08-25 15:59:39.033' should go into the table Audit_tbl_Aug2015. Query for reading the SQL DB audit file:
SELECT * FROM sys.fn_get_audit_file ('C:\backup\Audit\*',default,default)
order by event_time desc
GO
--DML for Audit table
CREATE TABLE [dbo].[Audit_tbl_Aug2015](
    [id] [bigint] IDENTITY(1,1) NOT NULL,
    [event_time] [datetime2](7) NOT NULL,
    [sequence_number] [int] NULL,
    [action_id] [varchar](4) NULL,
    [succeeded] [bit] NOT NULL,
    [permission_bitmask] [bigint] NOT NULL,
    [is_column_permission] [bit] NOT NULL,
    [session_id] [smallint] NOT NULL,
    [server_principal_id] [int] NULL,
    [database_principal_id] [int]
[code]...
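A hedged sketch of routing one month into its table, assuming the column list matches the DDL above (other months just change the date range and target table; the path is the reconstructed one from the query above):
INSERT INTO dbo.Audit_tbl_Aug2015
    (event_time, sequence_number, action_id, succeeded, permission_bitmask,
     is_column_permission, session_id, server_principal_id, database_principal_id)
SELECT event_time, sequence_number, action_id, succeeded, permission_bitmask,
       is_column_permission, session_id, server_principal_id, database_principal_id
FROM sys.fn_get_audit_file(N'C:\backup\Audit\*', DEFAULT, DEFAULT)
WHERE event_time >= '2015-08-01'
  AND event_time <  '2015-09-01';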
Greetings, I have just arrived back into the country (NZ) and back into ASP.NET.
I am having trouble with the following: "An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share."
It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page.
Does anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer.
Cheers ~ J
Hi,
I'm using Enterprise Manager to remotely connect to a MS SQL Server for one of my client's websites. The problem I'm having is that the connection times out when I try to expand the "Databases" folder. The reason it's timing out is that the company that hosts the DB has hundreds of other clients who all share the same IP, each with their own separate login/password/database; even though I don't have access to the rest of the DBs on the server, it still wants to list them all for me. What I'm wondering is: is there a way to set EM to ignore all of the databases on the server that cannot be accessed with my credentials, so that when I expand the Databases folder all that appears is the one I can access?
Thanks,
Dante
I am planning to create an application on SQL Server Express Advanced SP2 and later upgrade to Standard Edition. The Express SP2 readme states, somewhat unclearly, that this might not be possible: that the edition upgrade must be done prior to installing SP2, and SP2 can then be applied. I'm confused about my options.
Right now, I need to use Express Advanced for the reporting capabilities, etc., but the only available download I can find is Express Advanced SP2. If I install and use this, will I later be able to upgrade "seamlessly" to Standard Edition? Or should I install an earlier version of Express Advanced? Also, I haven't seen a lot of advice about the "seamless" upgrade from Express to other editions. If any help can be provided to answer the first question and/or point me toward upgrade documentation, it would be appreciated.
I need help... here is the problem. Last weekend, the servers in our datacenter were moved around. After this move, and maybe coincidentally, one server is performing very poorly. After running a trace with SQL Profiler, I saw the problem, which was later confirmed with another tool for SQL Server performance monitoring. It seems that all connections to the SQL Server (between 200 and 400) are doing a login/logout for each command that they process. For example, the user's connection will log in, perform a SELECT, and then log out. This is not a .NET application. The client software was not changed; it is still the same. The vendor has said that it is not supposed to do that: it is supposed to use one connection that logs on in the morning and logs off at the end of the day or whenever the user exits. One user may have several connections to the database. At times, the server is processing over 250 logins/logouts (averaged over a 30-second period). Has anyone seen this problem? I have the server auditing AUDIT FAILUREs only. The server has become very unresponsive; things that took 3 seconds now take over 15 seconds. Any ideas???
Currently our client has a website where they are using ASPX as the front end and SQL Server as the back end. We are developing the same functionality using PHP and MySQL. Phase 1 of development is finished and they are using both applications to do the job. Now, in the second phase of integration, they are asking for data sync from one database to the other.
We have decided that once the user submits the ASPX form the data will be stored in the client's SQL Server, but we thought of writing a script to write this info into a CSV file so that from our side we can access that file and update MySQL.
Other than FTP, are there any better approaches to access that file on the client's server? Is there any possibility of syncing the databases directly?
My customer had a total hard drive failure. After sending the drive to a recovery specialist we were able to recover the LDF log file (MyDB_0.LDF), but the MDF file was completely destroyed (MyDB.MDF). They have a good full backup from a month ago. Here is what I tried:
1) Installed SQL Server 2012 on a new PC
2) Created a new database of same name (MyDB) - with same MDF and LDF file names as original
3) Took the new database offline
4) deleted the MDF and LDF files of the new database
5) put "MyDB_0.LDF" in the place of the LDF file I just deletedÂ
6) put the database back on-line
7) after hitting F5 to refresh databases - it shows "MyDB (Recovery Pending)"
8) tried to do Tail Log Backup with this command
  BACKUP LOG [MyDB] TO DISK = N'C:\BACKUP\MyDB_TailLog.bak'
WITH NO_TRUNCATE
And I get this error...
Msg 3447, Level 16, State 1, Line 3
Could not activate or scan all of the log files for database 'MyDB'.
The sad thing is I know we can get this data back using ApexSQL Log; I can see all the transactions since the last full backup in that program, so the log file is not damaged. But my client doesn't want to pay the $2000 fee for the software. There has to be a way to restore this data without having to purchase a third-party tool.
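For what it is worth, the sequence usually attempted in this situation is below; it is only a sketch, and it assumes both that the tail-log backup from the swapped-in LDF succeeds and that its log chain lines up with the month-old full backup (the backup file names are placeholders):
BACKUP LOG [MyDB] TO DISK = N'C:\BACKUP\MyDB_TailLog.bak'
WITH NO_TRUNCATE, CONTINUE_AFTER_ERROR;   -- allow the backup despite the damaged database
RESTORE DATABASE [MyDB] FROM DISK = N'C:\BACKUP\MyDB_Full.bak' WITH NORECOVERY, REPLACE;
RESTORE LOG [MyDB] FROM DISK = N'C:\BACKUP\MyDB_TailLog.bak' WITH RECOVERY;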
Does MSSQL 2000 and above have an auditing feature? We have a requirement for tracking activity and history on a particular table. I realize this can be accomplished by a simple trigger but I was wondering if MSSQL had a feature that would accomplish this.
joe
Hi,
We are currently porting our POINT OF SALE system to SQL Server 2005. In our existing software we log database changes (inserts, amendments and deletions) to flat ASCII files so that these files can be FTP'd to remote sites (largest number of sites 100+) and processed to replicate the data.
I have looked at replication in MSSQL 2005, but we need two-way replication plus additional processing, so I don't think we will be able to use that feature.
What I wanted to do instead was set up an automated way of capturing database changes to all tables within the database and log these changes to XML to be shipped out to the remote sites. Unfortunately I can only find table triggers, which would require creating hundreds of triggers since we have hundreds of tables.
Is there any way of setting up SQL Server to do something like this automatically? I was looking to see if there was a database-level trigger which could perform this action, but I cannot see anything.
Can anyone advise or is there a simpler way of doing this ??
Ray
Hi, here is the DDL:
create table #task (taskID int identity(1,1) primary key, taskName varchar(25) unique, taskCompleteDate datetime, taskComplete bit default(0));
Business rules: a) if taskCompleteDate is NULL (as by default), then taskComplete may not be set to 1 (an attempt to update it to 1 should fail); b) otherwise, automatically set taskComplete = 1.
I was thinking of using a CHECK constraint (a mutual constraint if possible), along the following lines:
CHECK (if taskCompleteDate is null set taskComplete=0 else set taskComplete=1)
Hmm, your thoughts? Thanks.
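A CHECK constraint can only reject a bad row, it cannot set a value, so rule (a) maps naturally onto a CHECK while rule (b) is better served by a computed column or a trigger. A sketch of rule (a):
create table #task (
    taskID           int identity(1,1) primary key,
    taskName         varchar(25) unique,
    taskCompleteDate datetime null,
    taskComplete     bit not null default(0),
    check (taskComplete = 0 OR taskCompleteDate IS NOT NULL)   -- rule (a)
);
For rule (b), replacing taskComplete with a computed column such as CASE WHEN taskCompleteDate IS NOT NULL THEN 1 ELSE 0 END makes it impossible for the two columns to disagree at all.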
Hi,
I have a table (RatesTable); every user can insert records into this table, and all users can see these records. The table contains the following columns:
RateID, Service, Rate, DateTime, User
What I want is code (a trigger) in the database that can do the following:
When a user performs an UPDATE, the code will check:
- if this record was inserted by the same user, the update command will execute;
- if this record was inserted by another user, the update command will not execute and a message will be returned;
- if more than 5 minutes have passed since the insert, the update command will not execute and a message will be returned.
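A hedged sketch of such a trigger, assuming the [User] column stores the login name (SUSER_SNAME()) of whoever inserted the row and [DateTime] stores the insert time; the trigger name is made up:
CREATE TRIGGER trg_RatesTable_Update
ON RatesTable
FOR UPDATE
AS
BEGIN
    -- reject the update if the row was inserted by a different user
    IF EXISTS (SELECT 1 FROM deleted d WHERE d.[User] <> SUSER_SNAME())
    BEGIN
        RAISERROR('Only the user who inserted the rate may update it.', 16, 1);
        ROLLBACK TRANSACTION;
        RETURN;
    END
    -- reject the update if more than 5 minutes have passed since the insert
    IF EXISTS (SELECT 1 FROM deleted d WHERE DATEDIFF(MINUTE, d.[DateTime], GETDATE()) > 5)
    BEGIN
        RAISERROR('Rates can only be changed within 5 minutes of being entered.', 16, 1);
        ROLLBACK TRANSACTION;
        RETURN;
    END
END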
I was quickly checking one network share and browsing the folders; I found the MSSQL backup folder, nothing strange so far. Within that folder, however, I found several backups and one very huge file (datawarehouse) with the extension FILE. I am wondering what it can be. I have the datawarehouse MDF, of course, and the datawarehouse log, but what is that huge 1 TB file?
Executed as user: S233683-AD01\S233683NJ3SQL05. ...at file already exists.' [SQLSTATE 42000] (Error 22048) ERROR -- Could not backup database: master - BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 50000) xp_create_subdir() returned error 183, 'Cannot create a file when that file already exists.'
Hi All....
First a bit about me. I'm a developer in Atlanta. My background is mostly in Unix, but I am working in NT at this job. I am gearing my skills towards enabling products on the web for people. I have experience in PHP, HTML, JavaScript, C/C++, MySQL, a little Oracle, and I am now getting familiar with SQL Server.
What I'm trying to do....
Track all changes to data in the DB.
Why I'm doing it....
It's a banking application; any account changes need to be logged along with which employee made the change.
How I want to do it....
Currently the app is a web app that uses VB for the backend. There is a single SQL Server user accessing the DB. I want to use triggers on INSERT, UPDATE, and DELETE to copy the new row to an audit table. The audit table will be identical to the base table but with user, action and time columns added.
Problem....
The single-user connection hides the actual app user id. I want to have the VB app get the "SQL Server session ID" (if one exists) and store the app_userid in a table with the session ID, for cross-reference by the trigger.
Question:
Does SQL Server have a "session id" for multiple connections for a single user? Where is it located? Can VB access this information?
Thanks,
Brian
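Yes: each connection has a session id, available inside T-SQL as @@SPID, and the VB side can read it with SELECT @@SPID right after opening the connection. A hedged sketch of the cross-reference table and how a trigger would use it (all names are hypothetical):
CREATE TABLE dbo.app_sessions (
    spid       smallint NOT NULL PRIMARY KEY,
    app_userid int      NOT NULL,
    logged_at  datetime NOT NULL DEFAULT (GETDATE())
);
-- run by the VB app right after it opens its connection (clear any stale row first,
-- since spids are reused when connections close)
DELETE FROM dbo.app_sessions WHERE spid = @@SPID;
INSERT INTO dbo.app_sessions (spid, app_userid) VALUES (@@SPID, 123);   -- 123 = the app user's id
-- inside an audit trigger, the acting app user is then:
SELECT app_userid FROM dbo.app_sessions WHERE spid = @@SPID;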
Is there an easy way to monitor (audit) who logs onto a database? Thanks for any and all help that is provided. Art
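One low-impact option on that era of SQL Server is a small server-side trace that records the Audit Login event to a file; a hedged sketch (the output path is a placeholder and the folder must exist; the .trc extension is appended automatically):
DECLARE @rc int, @TraceID int, @maxfilesize bigint, @on bit;
SET @maxfilesize = 50;
SET @on = 1;
EXEC @rc = sp_trace_create @TraceID OUTPUT, 2, N'C:\Trace\LoginAudit', @maxfilesize, NULL;
-- event 14 = Audit Login; columns 11 = LoginName, 14 = StartTime, 8 = HostName, 10 = ApplicationName
EXEC sp_trace_setevent @TraceID, 14, 11, @on;
EXEC sp_trace_setevent @TraceID, 14, 14, @on;
EXEC sp_trace_setevent @TraceID, 14, 8, @on;
EXEC sp_trace_setevent @TraceID, 14, 10, @on;
EXEC sp_trace_setstatus @TraceID, 1;   -- start the trace
SELECT TraceID = @TraceID;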