I have a Daily User DB Integrity Checks job that runs every day.
For the past 2 days I have been getting the error below.
[Microsoft][ODBC SQL Server Driver][SQL Server]CHECKDB found 0 allocation errors and 35 consistency errors in table 'Prod_Hist' (object ID 2098106515).
[Microsoft][ODBC SQL Server Driver][SQL Server]CHECKDB found 0 allocation errors and 99 consistency errors in database 'Ucatalog'.
[Microsoft][ODBC SQL Server Driver][SQL Server]repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB (Ucatalog repair_fast).
[Microsoft][ODBC SQL Server Driver][SQL Server]DBCC execution completed. If DBCC printed error messages, contact your system administrator.
I have a few databases on this Windows 2000 Server running SQL 2000 which were detached from SQL 7.0 and attached to SQL 2000. The problem is that the Maintenance Plans (Integrity Checks) keep failing on SQL 2000. I 'DTS'ed a SQL 7.0 database to this SQL 2000 server and ran the Maintenance Plans on that database; they work fine, but only for the DTS'ed database. What am I missing??? :confused:
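Before deciding between a restore and a repair, it may be worth re-running the check by hand to confirm what CHECKDB reported; the table and database names below are taken from the error text, and restoring from a known-good backup is generally preferable to REPAIR_ALLOW_DATA_LOSS.

USE Ucatalog
GO
-- Confirm the consistency errors reported for the single table:
DBCC CHECKTABLE ('Prod_Hist') WITH NO_INFOMSGS, ALL_ERRORMSGS
GO
-- Confirm the database-wide picture, without attempting any repair:
DBCC CHECKDB ('Ucatalog') WITH NO_INFOMSGS, ALL_ERRORMSGS
GO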
I am using SQL Server 2012 with HADR (AlwaysOn with a SQL cluster).
We have database maintenance plans built through the wizard for full backups and DBCC CHECKDB. They had been running successfully, but they failed with the error below.
Execute SQL Task Description: Failed to acquire connection "Local server connection". Connection may not be configured correctly or you may not have the right permissions on this connection.
(A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)).
I am able to take the backup from a query window and it succeeds. The SQL Agent account has full permissions. I don't think there have been any recent changes.
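Since the backup works from a query window, one thing worth checking is which server name the package's "Local server connection" actually resolves to versus the instance or listener it should reach. A small diagnostic sketch, run on the node where the job executes:

-- What the instance thinks it is called, and whether AlwaysOn is enabled:
SELECT @@SERVERNAME AS server_name,
       SERVERPROPERTY('InstanceName') AS instance_name,
       SERVERPROPERTY('IsHadrEnabled') AS hadr_enabled;

-- If the plan connects through an availability group listener, confirm it is registered:
SELECT dns_name, port
FROM sys.availability_group_listeners;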
On weekends I have Integrity Checks scheduled to run. Many of these fail for individual databases because users do not log off and the databases cannot be switched to single user mode.
I have checked Books Online and have not yet stumbled onto a TSQL command that breaks the connections.
Is there a TSQL command to do this? If not, how can these connections be broken?
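For what it's worth, here is a sketch of two ways to break the connections from T-SQL; 'MyDb' is a placeholder database name, and the cursor version is the older sysprocesses-based approach:

-- Option 1: roll back open connections and take the database to single-user mode in one statement.
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
-- ...run the integrity checks here...
ALTER DATABASE MyDb SET MULTI_USER

-- Option 2: kill the connections one by one (run this from master, not from MyDb itself).
DECLARE @spid smallint, @sql varchar(20)
DECLARE kill_cursor CURSOR FOR
    SELECT spid FROM master..sysprocesses
    WHERE dbid = DB_ID('MyDb') AND spid <> @@SPID
OPEN kill_cursor
FETCH NEXT FROM kill_cursor INTO @spid
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'KILL ' + CAST(@spid AS varchar(10))
    EXEC (@sql)
    FETCH NEXT FROM kill_cursor INTO @spid
END
CLOSE kill_cursor
DEALLOCATE kill_cursor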
Hello, I have a DB Maintenance plan scheduled every day, but today I found that the 'Integrity checks' job failed. What does that mean, and how do I check into it? Thanks.
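One way to see why it failed is to pull the failing step's message from the job history in msdb and then run the check by hand; the job and database names below are placeholders:

-- Show the full message text for the failing job steps:
EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Integrity Checks Job', @mode = 'FULL'
GO
-- Re-run the check manually against the suspect database to see the complete output:
DBCC CHECKDB ('MyDb') WITH NO_INFOMSGS
GO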
My SQLMaint integrity checks consistently fail when the "Repair any minor problems" option is checked. The reported reason is that the database must be in single user mode. This doesn't seem practical. Am I missing an option somewhere?
The integrity checks job on the user databases failed over the weekend, and here is the error I got from the report:

Database DB_Stores: Check Data and Index Linkage...
[Microsoft SQL-DMO (ODBC SQLState: 42000)] Error 7919: [Microsoft][ODBC SQL Server Driver][SQL Server]Repair statement not processed. Database needs to be in single user mode.
The following errors were found:
[Microsoft][ODBC SQL Server Driver][SQL Server]Repair statement not processed. Database needs to be in single user mode.
** Execution Time: 0 hrs, 0 mins, 1 secs **
[Microsoft SQL-DMO (ODBC SQLState: 42000)] Error 5070: [Microsoft][ODBC SQL Server Driver][SQL Server] Database state cannot be changed while other users are using the database 'DB1'
[Microsoft][ODBC SQL Server Driver][SQL Server]ALTER DATABASE statement failed.
[Microsoft][ODBC SQL Server Driver][SQL Server]sp_dboption command failed.

I googled the issue and found some articles on it, but wanted to also run it by you. I looked at the maintenance plan properties, and under the Integrity tab "Attempt to repair any minor problems" is checked. Here is what I think: since attempt to repair is checked, the plan was trying to repair the issue, and because users were logged into the system it couldn't, since the database needs to be in single-user mode. I think if we uncheck the attempt-to-repair option, the job would run fine. But is this the best way to do it? What about the errors it was trying to repair? Do we have to fix them later by putting the database into single-user mode? Please let me know your ideas, thanks!!
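If the repair option stays checked, the repair itself has to run with the database in single-user mode anyway, so one possible approach is a manual pass during a quiet window, sketched below; 'MyDb' is a placeholder, and REPAIR_REBUILD only covers repairs that lose no data.

ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
DBCC CHECKDB ('MyDb', REPAIR_REBUILD) WITH NO_INFOMSGS
GO
ALTER DATABASE MyDb SET MULTI_USER
GO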
Suppose you have two (or more) tables with foreign key constraints. My question is thus: is it better to check whether the FK exists before you try to perform the insert, or to let SQL do it for you? On one hand, if you check yourself and the key does not exist, you can handle it gracefully (maybe exit the method with an error). If you let SQL do it, the server will throw an error which cannot be suppressed. On the performance side, doing the check yourself will incur a slight (VERY slight) hit, since SQL will ALSO check anyway.
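A sketch of the two patterns, using hypothetical Customers/Orders tables and an @CustomerID example value (error 547 is the generic constraint-violation error):

DECLARE @CustomerID int
SET @CustomerID = 42   -- example value only

-- 1) Check first and handle it gracefully:
IF EXISTS (SELECT 1 FROM dbo.Customers WHERE CustomerID = @CustomerID)
    INSERT INTO dbo.Orders (CustomerID, OrderDate) VALUES (@CustomerID, GETDATE())
ELSE
    RAISERROR ('Unknown customer.', 16, 1)

-- 2) Let the FK constraint do the work and trap the error (SQL 2005 and later):
BEGIN TRY
    INSERT INTO dbo.Orders (CustomerID, OrderDate) VALUES (@CustomerID, GETDATE())
END TRY
BEGIN CATCH
    IF ERROR_NUMBER() = 547
        RAISERROR ('Unknown customer.', 16, 1)
END CATCH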
I'm starting to collect and develop some scripts that will tell me the health and welfare of my MSSQL 2k server. I have a few for blocks, db size, who is on and what they are currently running.
I was wondering if you guys could share some of the scripts you guys use to watch the health of your servers.
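A few starters I keep around, purely as sketches (these use the SQL 2000-era system tables, so adjust for newer versions):

-- Sessions currently blocked and who is blocking them:
SELECT spid, blocked, waittime, lastwaittype, DB_NAME(dbid) AS dbname
FROM master..sysprocesses
WHERE blocked <> 0

-- Database file sizes in MB (size is stored in 8 KB pages):
SELECT DB_NAME(dbid) AS dbname, name AS logical_name, size / 128 AS size_mb
FROM master..sysaltfiles

-- Jobs whose most recent run failed:
SELECT j.name
FROM msdb..sysjobs j
JOIN msdb..sysjobservers s ON s.job_id = j.job_id
WHERE s.last_run_outcome = 0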
I have a transaction table with a Date (datetime) field, an amount, and an account number. I want to find the count of checks written within a period of 4 days that exceeded certain amounts, i.e. > $400, between $401 and $500, and > $501, for a single month. The table has data for more than a year, and I want the results grouped by month, like this:

OCT: between 300 & 400 #30 (30 customers gave checks totaling $300-$400 within some 4-consecutive-day period in the month of OCT), between 400 & 500 #20, > 501 #10
NOV: between 300 & 400 #30, between 400 & 500 #20, > 501 #10
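A rough sketch of one way to write this, assuming a table dbo.CheckTran (AcctNum, TranDate datetime, Amount money) and SQL 2005 or later for CROSS APPLY; it anchors a 4-day window at each check date, takes each account's best window total per month, and then buckets the accounts. Adjust names and bucket ranges to the real schema.

SELECT mon,
       COUNT(CASE WHEN best_total > 300 AND best_total <= 400 THEN 1 END) AS [300_to_400],
       COUNT(CASE WHEN best_total > 400 AND best_total <= 500 THEN 1 END) AS [401_to_500],
       COUNT(CASE WHEN best_total > 500 THEN 1 END)                       AS [over_501]
FROM (
    -- best 4-consecutive-day total per account per month (window anchored at each check date)
    SELECT CONVERT(char(6), t.TranDate, 112) AS mon,
           t.AcctNum,
           MAX(w.win_total) AS best_total
    FROM dbo.CheckTran AS t
    CROSS APPLY (SELECT SUM(t2.Amount) AS win_total
                 FROM dbo.CheckTran AS t2
                 WHERE t2.AcctNum = t.AcctNum
                   AND t2.TranDate >= t.TranDate
                   AND t2.TranDate <  DATEADD(day, 4, t.TranDate)) AS w
    GROUP BY CONVERT(char(6), t.TranDate, 112), t.AcctNum
) AS x
GROUP BY mon
ORDER BY mon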
What daily/weekly checks do you guys currently perform on your servers and databases?
I recently ran across an article from SQLServerCentral that listed a couple of daily checks that I'm thinking about implementing in my environment, some of them being:
DB Missing Recent Backup - Report
DB Missing Recent Log Backup - Report
Drives Low on Disk Space - Report
Error Log Messages Report - Report
Instance Recently Restarted - Report
Job Failures - Report
Large Databases Log File - Report
I already have in place:
Verify the SQL Agent service is running
Check disk space available
Since I'm going to spend some time on this, I was wondering if there's anything else that you guys have in place or any other 'nice to have' that you guys also might have, so I don't leave anything behind...
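In case it helps, here is a sketch of a "DB Missing Recent Backup" style check against msdb; the 24-hour threshold is arbitrary and easy to change:

-- Databases with no full backup in the last 24 hours:
SELECT d.name
FROM master.sys.databases d
LEFT JOIN (SELECT database_name, MAX(backup_finish_date) AS last_full
           FROM msdb.dbo.backupset
           WHERE type = 'D'          -- full database backups
           GROUP BY database_name) b ON b.database_name = d.name
WHERE d.name <> 'tempdb'
  AND (b.last_full IS NULL OR b.last_full < DATEADD(hour, -24, GETDATE()))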
I have two connections in a package pointing to two different databases on the same server. I have to insert records from 'DB1' table 'Gender1' to 'DB2' table 'Gender2'. Before I do that though, I have to make sure the minimum value (of all the Gender Keys that are going to be inserted) of 'DB1' 'GenderKey' (which is an identity field) is greater than the maximum value of DB2-GenderKey (which is a primary key but not an identity field). How can I do this simple check? I have to do this process for many different tables ....... Gender table is just an example. If someone can give an detailed explanation on which tasks to use and how to use them (as I am relatively new to SSIS) that'd be great.
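One way to do this, sketched under the assumption that both databases live on the same instance and use the dbo schema: put the comparison in an Execute SQL Task that returns a single value, map that value to a package variable, and put an expression on the precedence constraint leading into the Data Flow Task so the insert only runs when the variable is 1. The query could look like:

-- Returns 1 when it is safe to insert, 0 otherwise:
SELECT CASE WHEN (SELECT MIN(GenderKey) FROM DB1.dbo.Gender1)
          > (SELECT MAX(GenderKey) FROM DB2.dbo.Gender2)
            THEN 1 ELSE 0 END AS ok_to_insert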
I'm trying to use a custom check constraint on inserts into a table. I have created a function that checks an item against an authorization date, and I've coupled this into the constraint. So, for example, my function starts:
ALTER FUNCTION {function1} (@ItemID INT, @AuthDate DATETIME)
RETURNS INT
AS
BEGIN
    DECLARE @Ret INT
    SET @Ret = 0

    IF NOT EXISTS (SELECT TOP 1 b1.ItemID
                   FROM {table1} AS b1   -- alias added so the b1.ItemID reference resolves
                   WHERE b1.ItemID = @ItemID
                     AND @AuthDate <= ISNULL(b1.expirydate, @AuthDate))
    BEGIN
        SET @Ret = 1
    END
    ELSE
    BEGIN
        SET @Ret = 0
    END

    RETURN (@Ret)
END
GO
Now I couple that into a check constraint on the same table:
ALTER TABLE {table1}
ADD CONSTRAINT {constraint_name}
CHECK (([function1]([ItemID], [AuthorizedDate]) = 1))
GO
Now, when I insert a record into this table with an Authorized Date greater than the item date, it should set the @Ret value to 1. That is passed back to the constraint, which should evaluate 1=1, which is true and therefore should allow the insert; however, it does not. I think this is because it inserts the record and THEN does the check, which makes it fail. Is there any way to do this check before the INSERT without having to use a trigger? I don't want to use a trigger because performance decreases when we're inserting bulk rows, correct?
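It may also be worth calling the function directly with the values from the failing insert, just to see whether the function or the constraint is the side returning the unexpected result; the literal values below are examples only:

-- Expect 1 when the item/date combination should be allowed:
SELECT dbo.[function1](42, '20080115') AS result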
Using the new referential integrity constraints that will be made available, will it allow us to manually define the relationships between entities even if there is no true foreign key constraints setup in the database?
Let's say we deleted the FK_Orders_Customers constraint in Northwind, between Orders and Customers.
I'm trying to simplify a SQL stored procedure. The query accepts an int, @ItemTypeID int. I have the query:

SELECT ... FROM ItemList WHERE ItemTypeID = @ItemTypeID

or, if @ItemTypeID is 0,

SELECT ... FROM ItemList

Is there a way to do this query without doing:

IF @ItemTypeID = 0
BEGIN
    ...SELECT QUERY...
END
ELSE
BEGIN
    ...SELECT QUERY...
END
?
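One common way to collapse it into a single statement is shown below (a sketch only; it is worth checking the execution plan, since the OR can make index use less predictable):

SELECT ... FROM ItemList
WHERE (@ItemTypeID = 0 OR ItemTypeID = @ItemTypeID)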
I have a script which checks for databases that are offline and mails the DBA when any database is offline.
When I schedule the job to run every hour, it sends a blank mail with only the subject 'status of database on testsql', even when no databases are offline.
How can I change the script so that it mails the DBA only when a database is actually offline, even though it is scheduled to run every hour or half hour?
The script is:
Set NoCount on
DECLARE @dbname VARCHAR(100)
DECLARE @Status varchar(100)
DECLARE @Message VARCHAR(8000)
DECLARE @date varchar(100)

set @date = convert(varchar(100), getdate(), 109)
set @Message = ''

DECLARE dbname_cursor CURSOR FOR
    SELECT [name], CONVERT(varchar(30), DATABASEPROPERTYEX(name, 'Status')) as [Status]
    FROM master..sysdatabases
    WHERE CONVERT(varchar(30), DATABASEPROPERTYEX(name, 'Status')) = 'OFFLINE'
    ORDER BY name

OPEN dbname_cursor
FETCH NEXT FROM dbname_cursor INTO @dbname, @Status
WHILE @@FETCH_STATUS = 0
BEGIN
    select @message = @message + @@Servername + '-' + @dbname + ' - ' + @Status + Char(13) + '- ' + @date
    FETCH NEXT FROM dbname_cursor INTO @dbname, @Status
END
CLOSE dbname_cursor
DEALLOCATE dbname_cursor

print @message

EXEC master.dbo.xp_smtp_sendmail
    @FROM    = N'testsql2000@is.depaul.edu',
    @TO      = N'dvaddi@depaul.edu',
    @server  = N'smtp.depaul.edu',
    @subject = N'Status of the Database on Testsqlserver!',
    @type    = N'text/html',
    @message = @message
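To get the behaviour described, one option is simply to make the send conditional on @message having picked anything up, i.e. replace the unconditional call with something like:

IF LEN(@message) > 0
BEGIN
    EXEC master.dbo.xp_smtp_sendmail
        @FROM    = N'testsql2000@is.depaul.edu',
        @TO      = N'dvaddi@depaul.edu',
        @server  = N'smtp.depaul.edu',
        @subject = N'Status of the Database on Testsqlserver!',
        @type    = N'text/html',
        @message = @message
END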
I want to check whether a database exists in SQL Server 2005 Express using VB.NET (or C#). I cannot use the SqlConnection object against the database directly, because I get a failed login attempt regardless of whether the DB does not exist or the user login is incorrect.
Is there some way to check to see if the Database exists (is attached) to the SQL Server 2005 Express Engine?
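One approach that avoids opening the database itself: have the application connect to master (which the login can usually reach) and ask the engine whether the database is attached; 'MyDatabase' below is a placeholder name.

-- Returns 1 if the database is attached, 0 otherwise:
SELECT CASE WHEN DB_ID(N'MyDatabase') IS NOT NULL THEN 1 ELSE 0 END AS db_exists

-- Equivalent check against the catalog view:
SELECT COUNT(*) AS db_exists FROM sys.databases WHERE [name] = N'MyDatabase'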
I have a VM set up for offloading DBCC checks; specs are below. I've read through this, but I'm not seeing the performance gains from enabling the trace flags and using the physical-only switch.
Is the whole drawback that I'm on SATA storage? Is there a VM configuration for the CPU that I can or should change? I've been playing with MAXDOP trying to see if I can get any benefit, but I'm not seeing much.
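For reference, this is roughly what I understand the physical-only run plus the two commonly cited CHECKDB trace flags to look like; whether they help at all when the SATA drives are already the bottleneck is exactly the question ('MyDb' is a placeholder):

-- 2549: treat each database file as if it were on its own drive (read-ahead tuning).
-- 2562: run the whole check as a single batch (less latch contention, more tempdb use).
DBCC TRACEON (2549, -1)
DBCC TRACEON (2562, -1)

-- Physical-only pass, which skips the logical checks and is mostly I/O bound:
DBCC CHECKDB ('MyDb') WITH PHYSICAL_ONLY, NO_INFOMSGS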
The Dynamic Log File Growths Remaining alarm becomes active when a non-fixed-size log file in any database is in danger of running out of space to grow. It is raised when a log file is almost full and the file cannot automatically grow enough to relieve the problem.
We have our databases with Enable Autogrowth (in Megabytes), and then a Maximum File Size (Limited to a MB value).
Example: If one of the database logs (or possibly filegroup primary) picks up another extent and is about 5 extents (arbitrary value) away from running out of growth room, an alert would be sent to an email address/profile.
P.S. Yes, there are multiple databases on this one instance and the script should loop to run through all of them.
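A rough sketch of what such an alert could look like; it reads sys.master_files, so it already covers every database on the instance without an explicit loop. Database Mail, the 'DBA Mail' profile, the recipient address, and the 512 MB threshold are all assumptions to adjust.

DECLARE @body nvarchar(max)

-- Log files with a fixed maximum size that are close to running out of growth room:
SELECT @body = ISNULL(@body + CHAR(13) + CHAR(10), N'') +
       DB_NAME(database_id) + N': ' +
       CAST((max_size - size) / 128 AS nvarchar(20)) + N' MB of growth room left'
FROM sys.master_files
WHERE type_desc = 'LOG'
  AND max_size > 0                      -- only files limited to a maximum size
  AND (max_size - size) / 128 < 512     -- alert threshold in MB

IF @body IS NOT NULL
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = N'DBA Mail',        -- placeholder profile name
         @recipients   = N'dba@example.com', -- placeholder recipient
         @subject      = N'Log files close to their growth limit',
         @body         = @body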
I have a nifty little stored procedure that takes data from an ASP form post and inserts the data into two tables. However, what I really need it to do before the insert is check whether there is already a record that matches some of the criteria; if so, it should return an error ("This Username already exists"), and only if there isn't a matching record should the record be inserted.
my current stored procedure looks like this -
@siteid int,
@companyname nvarchar(50),
@address nvarchar(500),
@phone nvarchar(50),
@fax nvarchar(50),
@email nvarchar(225),
@url nvarchar(225),
@companytype nvarchar(50),
@billingcontact nvarchar(50),
@name nvarchar(50),
@AccountType nvarchar(50),
@PASSWORD nvarchar(50),
@AccountLive nvarchar(50),
@EmployeeLevel nvarchar(50)
AS
Declare @NewID INT

INSERT INTO dbo.JBClient (JBCLSiteID, JBCLName, JBCLAddress, JBCLPhone, JBCLFax, JBCLEmail, JBCLCompanyType, JBCLURL, JBCLAccountType, JBCLAccountlive, JBCLBillingContact)
VALUES (@siteid, @companyname, @address, @phone, @fax, @email, @companytype, @url, @AccountType, @AccountLive, @billingcontact)

SELECT @NewID = SCOPE_IDENTITY()

INSERT INTO dbo.JBEmployee (JBEClientID, JBESiteID, JBEName, JBELevel, JBEUsername, JBEPassword, JBEAddress, JBEPhone)
VALUES (@NewID, @siteid, @name, @EmployeeLevel, @email, @PASSWORD, @address, @phone)

The values that I need to check against are -
@siteid @email
in the table dbo.JBEmployee against columns JBESiteID & JBEUsername
What I would really like to do, if a record already exists, is return the user to an ASP page which contains all of the variables previously entered.
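A sketch of the existence check placed at the top of the procedure, before either INSERT runs; it uses the @siteid/@email parameters against JBESiteID/JBEUsername as described, and returns a value the ASP page can test in order to redisplay the form:

IF EXISTS (SELECT 1 FROM dbo.JBEmployee
           WHERE JBESiteID = @siteid AND JBEUsername = @email)
BEGIN
    RAISERROR ('This Username already exists', 16, 1)
    RETURN -1   -- the ASP page can test the return value and re-show the form
END

-- ...the existing INSERTs into dbo.JBClient and dbo.JBEmployee follow here unchanged...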
I have to perform several data checks before loading data into the target table. For example, I have one flat file with the columns below:

Id (int), Name (varchar(100)), Age (int)

My requirement is to create a package in which checks are performed on each record and column of the file. Any record that fails the checks is treated as an error record and written to the exception table.
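One way to express those rules, assuming the file is first landed in a varchar-typed staging table; dbo.StagingPerson and dbo.ExceptionPerson are placeholder names, and the same logic can equally be built as a Conditional Split in the data flow instead:

-- Route rows failing the basic type/length checks to the exception table:
INSERT INTO dbo.ExceptionPerson (Id, Name, Age, FailReason)
SELECT Id, Name, Age,
       CASE WHEN ISNUMERIC(Id) = 0               THEN 'Id is not numeric'
            WHEN Name IS NULL OR LEN(Name) > 100 THEN 'Name is missing or too long'
            WHEN ISNUMERIC(Age) = 0              THEN 'Age is not numeric'
       END
FROM dbo.StagingPerson
WHERE ISNUMERIC(Id) = 0
   OR Name IS NULL OR LEN(Name) > 100
   OR ISNUMERIC(Age) = 0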
create table person( personId int identity(1,1) primary key, fName varchar(25) not null, mI char(1) null, lName varchar(25) not null );
create table student( studentId char(4) not null primary key, personId int not null );
alter table student add constraint fk_person_student foreign key (personId) references person (personId) ;
create table instructor( instructorId char(4) not null primary key, instructorQual varchar(100) not null, personId int not null );
alter table instructor add constraint fk_person_instructor foreign key (personId) references person (personId) ;
create table contact( contactId int not null primary key, fName varchar(25) not null, mI char(1) null, lName varchar(25) not null, street varchar(50) not null, city varchar(25) not null, state char(2) not null, zip char(5) not null, relationship varchar(25) not null, phNum char(12) not null, emailAdd varchar(50) null );
create table contract( contractNum int identity(1,1) primary key, contractDate smalldatetime not null, tuition money not null, studentId char(4) not null foreign key references student (studentId), contactId int not null foreign key references contact (contactId) );
create table class( classNum char(4) not null primary key, className varchar(25) not null, classDay char(3) not null, classTime char(8) not null, testNum char(5) not null );
create table test( testNum char(5) primary key, testName varchar(50) not null, testDate smalldatetime not null, testFee money not null );
alter table class add constraint fk_class_testnum foreign key (testNum) references test (testNum) ;
create table discount( discountNum char(3) primary key, discountDesc varchar(100) not null, discountPer decimal(3,2) not null );
create table studentClass( studentId char(4) not null, classNum char(4) not null, pass char(1) not null );
Where should tests for data integrity be done, a few examples:
Inserting data but the underlying data has changed?
Adding data, say an order item row, when since going to that screen the order (and, as a result, all of its order items) has been deleted?
It's just that I'm starting to see even the simplest of stored procedures as quite complicated. What puzzles me is that I see lots of INSERT stored procedures with none of this checking in them, and not returning error codes either.
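For the "underlying data has changed (or has been deleted)" cases, one common pattern is a rowversion column that the screen reads and the procedure checks on the way back in; the table and column names below are hypothetical:

-- Optimistic-concurrency update: only succeeds if the row still exists
-- and has not been modified since the screen originally read it.
UPDATE dbo.OrderItem
SET    Quantity = @Quantity
WHERE  OrderItemId = @OrderItemId
  AND  RowVer = @RowVerFromScreen      -- the rowversion value the screen read earlier

IF @@ROWCOUNT = 0
    RAISERROR ('The row was changed or deleted by another user.', 16, 1)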