SQL 2012 :: Optimal Settings For DBCC Checks?
Aug 5, 2014
I have a VM set up for offloading DBCC checks. Specs are below. I've read through this, but I'm not seeing the performance gains from enabling the trace flags and using the PHYSICAL_ONLY switch.
Is the main drawback that I'm on SATA storage? Is there a VM configuration for the CPU I can or should change? I've been playing with MAXDOP to see if I can get any benefit, but I'm not seeing much.
wait_type        wait_time_s   pct     running_pct
CXPACKET         561191.42     28.71   28.71
OLEDB            387136.76     19.81   48.52
PAGEIOLATCH_SH   340674.58     17.43   65.95
TRACEWRITE       321598.84     16.46   82.41
[code].....
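For reference, a minimal sketch of the offload pattern being described, run on the VM after restoring the latest backup. The flag numbers (2549 and 2562) are the commonly documented CHECKDB performance trace flags and are an assumption about what the referenced article covers; the database name is a placeholder.
-- Trace flag 2549 treats each database file as if it were on its own disk for read-ahead;
-- 2562 runs the check as a single batch, leaning harder on tempdb.
DBCC TRACEON (2549, -1);
DBCC TRACEON (2562, -1);
-- Physical-only check of the restored copy (hypothetical database name).
DBCC CHECKDB (N'MyRestoredDb') WITH PHYSICAL_ONLY, NO_INFOMSGS, ALL_ERRORMSGS;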
View 9 Replies
May 28, 2014
I would like to build a report with nice functionality like filtering, sorting, and drill-down, something like a PowerPivot table, but with some layout/design/format capabilities. I would also want to publish the report, refresh it let's say once a week, notify users when a new version is available, etc.
If I use PowerPivot, then I am not able to customize the report or to mix data from different sources in one table.
If I convert the cells of the PowerPivot table to workbook formulas, I lose the filtering, sorting, etc. functionality.
I still have to try Reporting Services, but I think something is always missing.
View 1 Replies
View Related
Sep 27, 2015
I am using SQL Server 2012 with HADR (AlwaysOn on a SQL cluster).
We have database maintenance plans, created through the wizard, for full backups and DBCC CHECKDB. They were running successfully, but then failed with the error below:
Execute SQL Task Description: Failed to acquire connection "Local server connection". Connection may not be configured correctly or you may not have the right permissions on this connection.
(A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)).
I can take the backup from a query window and it is successful. The SQL Agent account has full permissions. I don't think any recent changes have been made.
View 9 Replies
View Related
Jul 9, 2014
SSMS installed on my client forgets all the settings I entered under Tools / Options. Maybe it has something to do with the installation of system updates.
View 0 Replies
View Related
Sep 24, 2014
I am trying to reproduce a problem that is probably related to how a particular SQL Server is configured. Is there a tool or best practice for comparing all of the configuration settings between two SQL Servers (same version)?
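As a low-tech sketch of one way to do this, pull the sp_configure-level settings from both instances and diff them; this assumes a linked server named REMOTE_SQL pointing at the second instance (the name is hypothetical), and it only covers instance settings, not database-level options.
-- Instance settings that differ between the local server and the linked server.
SELECT  l.name,
        l.value_in_use AS local_value,
        r.value_in_use AS remote_value
FROM    sys.configurations AS l
JOIN    REMOTE_SQL.master.sys.configurations AS r
          ON r.name = l.name
WHERE   l.value_in_use <> r.value_in_use;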
View 2 Replies
View Related
Apr 7, 2015
We changed the IP addresses for several SQL Servers and we're a little confused by what we see in Configuration Manager. The TCP/IP properties "IP Addresses" tab still shows the old IP address, even after a restart. On the "Protocol" tab, Listen All is set to "Yes".
We've found one source that says this setting causes the individual IP address entries to be ignored.
View 0 Replies
View Related
Sep 8, 2015
We have a new VM set up to run an application 24x7 (migrated from SQL Server 2008 R2) with the configuration below:
1. OS: Windows Server 2012 Standard 64-bit, hosted on a virtual machine
2. Memory: 16 GB, 4 cores at 2.4 GHz
3. SQL Server 2012 SP2, 64-bit Standard Edition
4. Total size of databases is currently 15 GB, with the biggest being 5 GB
How should I go about setting the MAX and MIN server memory settings? I have done this for many SQL 2005 and 2008 R2 servers, but I've heard that things have changed slightly for 2012.
How should I start analyzing and setting the right values for MAX and MIN?
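For illustration, a minimal sketch of the usual starting point on a 16 GB box: leave roughly 4 GB for the OS and anything else on the VM, and cap SQL Server at the rest. The 12288/4096 MB figures are assumptions to be adjusted by monitoring available memory afterwards.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Cap the instance at ~12 GB and keep a modest floor; both numbers are placeholders.
EXEC sp_configure 'max server memory (MB)', 12288;
EXEC sp_configure 'min server memory (MB)', 4096;
RECONFIGURE;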
View 5 Replies
View Related
Jun 23, 2014
I am currently designing an SSIS package that will migrate data from Sybase to SQL Server. Should these packages be deployed to the file system (as this is a one-time run for migrating data), to the SSIS catalog, or to SQL Server?
Also, I have not changed the default protection level in the packages during design time, and would like to know if I have to change it while handling the deployment with DTUtil (yes, I need to deploy and execute using command-line utilities).
Please note that I do not have access to the PROD and UAT environments; the deployment team will use my BAT file, which is expected to deploy and execute the packages.
View 3 Replies
View Related
Dec 2, 2014
Every night DBCC CHECKDB runs on all our databases. The trouble is that one of them is very large and the database is inaccessible while this runs: "DBCC CHECKDB (dbname) WITH physical_only executed by user found 0 errors and repaired 0 errors. Elapsed time: 0 hours 24 minutes 46 seconds."
This normally happens fairly shortly after the backup, and normally (but not always) after a series of these entries in the log: "SQL Server has encountered 7976 occurrence(s) of I/O requests taking longer than 15 seconds to complete on file [D:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\tempdb.mdf] in database [tempdb] (2)." Would this cause SQL to automatically run a CHECKDB?
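As a side note, a minimal sketch for checking whether file-level latency lines up with those 15-second warnings, using the standard virtual file stats DMV (run at a quiet moment and again under load to compare):
-- Average read/write latency per database file; high values on the tempdb rows
-- would corroborate the long-I/O messages in the error log.
SELECT  DB_NAME(vfs.database_id) AS database_name,
        mf.physical_name,
        vfs.num_of_reads,
        vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_ms,
        vfs.num_of_writes,
        vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_ms
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN    sys.master_files AS mf
          ON mf.database_id = vfs.database_id
         AND mf.file_id = vfs.file_id
ORDER BY avg_write_ms DESC;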
View 9 Replies
View Related
Oct 6, 2015
DBCC CHECKDB was affecting performance. Screens were hanging, timeouts, etc.
So I made the unfortunate decision of killing the two processes.
I keep checking the percent complete and they are stuck at 33% and 17%.
Will I have to restart the SQL Server service?
Are there any other options?
View 3 Replies
View Related
Oct 22, 2015
I created a SQL job using a maintenance plan to run DBCC CHECKDB for all databases, but I didn't choose the option "Ignore databases where the state is not online". If any database is restoring at the same time the DBCC CHECKDB job runs, will the job fail?
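For illustration, a minimal sketch of a CHECKDB loop that simply skips anything not online, which side-steps the issue regardless of how the maintenance plan option behaves (generic names, no error handling):
DECLARE @db SYSNAME, @sql NVARCHAR(400);
DECLARE db_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE state_desc = 'ONLINE';
OPEN db_cur;
FETCH NEXT FROM db_cur INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Only databases that are ONLINE (not RESTORING, RECOVERING, OFFLINE, ...) get checked.
    SET @sql = N'DBCC CHECKDB (' + QUOTENAME(@db) + N') WITH NO_INFOMSGS;';
    EXEC (@sql);
    FETCH NEXT FROM db_cur INTO @db;
END
CLOSE db_cur;
DEALLOCATE db_cur;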
View 1 Replies
View Related
Apr 23, 2014
I have encountered an anomaly. The dbcc checkident(reseed) command behaves differently on two SQL servers.
In both cases, I am deleting (not truncating) data in two tables, due to foreign key constraints. (I am truncating other tables; the issue is not with those tables, only with the deleted-from tables.) On one server, I need to use DBCC CHECKIDENT (reseed, 0) so that when I insert fresh data, it begins with identity key 1. According to MS documentation, that appears to be the correct behavior when data has been deleted from a table rather than truncated.
However, on the other server, I need to use DBCC CHECKIDENT (reseed, 1); if I use (reseed, 0) on that server, it begins inserting data with identity key 0.
This is consistent, repeatable behavior on both servers.
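For reference, a minimal repro sketch of the documented behavior on a throwaway table (the table name is hypothetical): after rows have been deleted, a RESEED to 0 should make the next insert use identity 1, whereas on a freshly created or truncated table the next insert uses the seed value itself.
CREATE TABLE dbo.ReseedDemo (id INT IDENTITY(1,1) PRIMARY KEY, val CHAR(1));
INSERT dbo.ReseedDemo (val) VALUES ('a');      -- id = 1
DELETE dbo.ReseedDemo;                         -- delete, not truncate
DBCC CHECKIDENT ('dbo.ReseedDemo', RESEED, 0);
INSERT dbo.ReseedDemo (val) VALUES ('b');      -- expected id = 1 (seed 0 + increment)
DBCC CHECKIDENT ('dbo.ReseedDemo', NORESEED);  -- report the current identity value
SELECT id, val FROM dbo.ReseedDemo;
DROP TABLE dbo.ReseedDemo;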
View 4 Replies
View Related
Feb 12, 2015
What is the use of running DBCC FREESESSIONCACHE?
How frequently can we run this on all the databases?
What happens if we schedule it to run every minute?
Are there any advantages/disadvantages to running this on all databases?
View 5 Replies
View Related
May 26, 2015
I can find many examples of loading DBCC results into tables. They all begin with a CREATE TABLE statement defining the results. My question is, other than trial and error, is there a way to determine what data types will be returned? Sure, you can say that the first element looks like an integer, but is it really a BIGINT? And that text string could be VARCHAR(MAX), but will CHAR(2) work?
I'm not looking for an answer for a specific DBCC function, but rather a generic way I can determine the characteristics of any DBCC result set.
I tried
SELECT *
INTO #tmp
FROM OPENROWSET('SQLOLEDB',
'Server=ray;Trusted_Connection=Yes;Database=Ed_sandbox',
'Set FmtOnly OFF; DBCC loginfo WITH tableresults ')
but I got back
Msg 11527, Level 16, State 1, Procedure sp_describe_first_result_set, Line 1
The metadata could not be determined because statement 'DBCC loginfo WITH tableresults' does not support metadata discovery.
View 1 Replies
View Related
Mar 18, 2014
I am a SysAdmin on a test/dev SQL instance. Other non-sysadmin users (developers) need the ability to execute DBCC commands like the following:
DBCC FREESYSTEMCACHE ('ALL') WITH MARK_IN_USE_FOR_REMOVAL
OR
DBCC FREEPROCCACHE
I tried creating a stored procedure in a user database and granting those non-sysadmin users EXECUTE permission on it, like so:
CREATE PROC spFreeSystemCache
WITH EXECUTE AS 'sa'
AS
DBCC FREESYSTEMCACHE ('ALL') WITH MARK_IN_USE_FOR_REMOVAL
GO
When I try to create this proc, I get the following error:
Msg 102, Level 15, State 1, Procedure spFreeSystemCache, Line 2
Incorrect syntax near 'sa'.
Ok, so I can't EXECUTE AS sa...
View 5 Replies
View Related
Jul 8, 2015
I would like to know if it is possible to date stamp the output file for the DBCC integrity check?
The following command works fine; however, I have tried different ways to date-stamp the output file with no luck.
sqlcmd -U backup -P Password -S Server -Q"DBCC CHECKDB('DatabaseName') WITH ALL_ERRORMSGS" -o"G:\LOGS\DBCCResults.txt"
View 5 Replies
View Related
Jul 23, 2005
Please excuse what is probably a no-brainer, but here goes. Is there any difference, in terms of performance or any other pertinent factor, between:
SELECT * FROM tblCustomers INNER JOIN tblCustomerOrders ON tblCustomers.fldCustomerID = tblCustomerOrders.fldCustomerID
and
SELECT * FROM tblCustomers, tblCustomerOrders WHERE tblCustomers.fldCustomerID = tblCustomerOrders.fldCustomerID
I note that if I type the latter into the SQL pane in a Data window, SQL Server replaces it with the former.
TIA
Edward
--
The reading group's reading group: http://www.bookgroup.org.uk
View 3 Replies
View Related
Mar 2, 2005
What is more efficient for a database design: a lot of tables with only a few records, or a few tables with lots of records?
I'm starting a new site and each user will have numerous records, but I'm not sure whether to have a few very large tables (over 100,000 rows) or start a new table for each user, which would result in approximately 1,500 tables, most of which would share the same design with different rows.
I'm using SQL2000.
I guess this is quite a basic question, but I'm a bit unsure.
Any references anyone could point me to would be appreciated as well.
Thx
View 3 Replies
View Related
Feb 3, 2004
Hi
I'm fairly new to this, so bear with me...
I have to make a new installation of MS SQL 2000 EE on Windows 2003 Standard Edition.
HW:
---------------------
Dual Xeon 2.4 GHz + 1 GB ECC
1 x 32 MB Adaptec 2100S RAID Controller
2 x 18 GB 10K HD
4 x 18 GB 15K HD
---------------------
So far I have made following configuration....
---------------------
2 x 18 GB 10K HD / RAID 1
- C: OS
- D: MSSQL program files + system DBs (master, pubs, etc.)
4 x 18 GB 15K HD / RAID 5
- E: TempDB
- F: Data + Logs
---------------------
But I'm not sure that this is the optimal configuration, and I'm willing to start all over :)
So my q's are.......
--------------------
Which RAID configuration would you suggest?
Which partitions on the raids would you suggest?
Which usage would you assign the various partitions?
How do I move the system and temp db's?
--------------------
Thanx!
Regards,
Taras Bredel dk
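For reference on the last question above, a minimal sketch of relocating tempdb, assuming the default logical file names (tempdev, templog) and a placeholder target path; the files are recreated in the new location when the SQL Server service is next restarted.
ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, FILENAME = 'E:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb
    MODIFY FILE (NAME = templog, FILENAME = 'E:\TempDB\templog.ldf');
-- Moving master is done via the service startup parameters (-d, -l) rather than
-- ALTER DATABASE, so it is not shown here.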
View 3 Replies
View Related
Jul 25, 2007
Hi
I'm trying to find the optimal way of getting the timestamp of the last updated entry in an MSSQL database. The database is updated only about 5 times a minute; however, requests for the time of the last entry could come in around once per second. For this reason I was thinking of having a separate table with a single row which is updated every time a new entry is inserted into the main table. I would then only need a simple SELECT statement and would use very little processing power.
Is this the best method, or can you think of any others I could use?
many thanks
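For illustration, a minimal sketch of the single-row approach described above, with a trigger keeping the row current (all object and column names are hypothetical):
CREATE TABLE dbo.LastEntry (LastEntryTime DATETIME NOT NULL);
INSERT dbo.LastEntry (LastEntryTime) VALUES (GETDATE());
GO
CREATE TRIGGER trg_MainTable_LastEntry
ON dbo.MainTable
AFTER INSERT
AS
    -- Keep the single row in step with the newest insert into the main table.
    UPDATE dbo.LastEntry
    SET    LastEntryTime = (SELECT MAX(EntryTime) FROM inserted);
GO
-- The once-per-second request then becomes a trivial single-row read:
SELECT LastEntryTime FROM dbo.LastEntry;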
View 14 Replies
View Related
Jun 19, 2008
I am wondering whether a 100% buffer cache hit ratio is considered a bad sign in general.
Are there instances where it is actually bad and can contribute to server performance degradation?
Any thoughts on the topic most welcome :)
--------------------
keeping it simple...
View 11 Replies
View Related
Jul 23, 2005
I am working with a report generator that is based on SQL Server 2000 and uses ASP as the UI. Basically we have a set of reports that end users can execute through a web browser. In general the model works fine, but we are running into some scaling issues.
What I'm trying to determine is: what is the optimal configuration for this system? It is currently a 2.4 GHz Pentium with a large RAID and 1 GB of RAM. We have been using the "fixed" memory configuration, allocating 864 MB to SQL. This is on a Windows 2003 Server box.
This works fine when a "small" query or two is executed, but the performance suffers terribly when several users try to run reports in parallel. A single query might take 10 minutes to run if nothing else is happening on the box, but if additional users log on and run reports, it's almost impossible to predict when the queries will finish.
I am also looking at the effect of database size on performance, running tests against a database with 1 month, 3 months, and say 12 months of data, running the same query against 2 databases in parallel. With the original configuration, the results were all over the place, with the 12-month database sometimes outperforming the smaller DBs, while other times there was little difference. It seems that once the system starts paging, and paging heavily, it's over; the system never "recovers", and queries that previously ran in a few minutes now take hours.
I added 3 GB more memory to the system, and modified boot.ini to include the /3GB switch. Now when I run the same tests, the results are much more consistent, as the system rarely ever has to swap. Then again, I've never seen it go past 1.7 GB in Task Manager, making me think that any more than say 2.5 GB of memory is a waste?
Things we are trying to determine are:
- In the SQL Server memory configuration, is Fixed better than Dynamic? We have read that Dynamic is not good at returning memory to the OS once it's been allocated.
- What else can we do to optimize the performance for this application? It seems to me that if the indexes are properly designed, the database size shouldn't have that much impact on performance, but this appears to be true only to a point. In comparing the execution plans between say a 12-month and a 3-month database, the plans are sometimes dramatically different. I assume this is due to the optimizer deciding that going directly to the base tables and not using an index will result in better performance, when in reality this doesn't always appear to be true.
- Are there other SQL Server switches I should be tweaking? Is there some number of simultaneous queries that this configuration should be limited to?
- What about other versions of SQL Server (e.g. Enterprise, Data Center, etc.)? Would these buy us anything?
Thanks for any advice,
-Gary
View 2 Replies
View Related
Nov 23, 2005
I have the following table:
CREATE TABLE Readings
(
    ReadingTime DATETIME NOT NULL DEFAULT (GETDATE()) PRIMARY KEY,
    Reading int NOT NULL
)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20050101', 1)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20050201', 12)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20050301', 15)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20050401', 31)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20050801', 51)
INSERT INTO Readings (ReadingTime, Reading) VALUES ('20051101', 106)
GO
-- list the table
SELECT ReadingTime, Reading FROM Readings
GO
It is a table of readings of a free-running counter that is time-stamped. I need to determine the value of the reading that corresponds to the closest date to the supplied date.
Are there more optimal/efficient ways of accomplishing this than the following?
DECLARE @when DATETIME
SET @when = '20050505'
SELECT TOP 1 ReadingTime, Reading FROM Readings
ORDER BY abs(DATEDIFF(minute, ReadingTime, @when))
The above gives me the desired result of ('20050401', 31).
Any suggestions would be appreciated.
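For comparison, one commonly suggested alternative that can seek on the ReadingTime primary key is to take the closest reading on each side of @when and pick the nearer of the two; a sketch against the same table:
DECLARE @when DATETIME
SET @when = '20050505'
SELECT TOP 1 ReadingTime, Reading
FROM (
    -- closest reading at or before @when
    SELECT * FROM (SELECT TOP 1 ReadingTime, Reading
                   FROM Readings
                   WHERE ReadingTime <= @when
                   ORDER BY ReadingTime DESC) AS before_or_at
    UNION ALL
    -- closest reading after @when
    SELECT * FROM (SELECT TOP 1 ReadingTime, Reading
                   FROM Readings
                   WHERE ReadingTime > @when
                   ORDER BY ReadingTime ASC) AS after_when
) AS candidates
ORDER BY ABS(DATEDIFF(minute, ReadingTime, @when))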
View 1 Replies
View Related
May 14, 2007
I would like to know what options are available from a BIOS / OS / SQL Server perspective when configuring or tuning a system with SQL Server 2000 or SQL Server 2005.
For example, I have a system with 4 dual-core Opteron CPUs on Windows 2003 Enterprise Edition. However, the OS sees 8 CPUs. Is this the optimal configuration, or is it better (if even possible) to configure the system to see only 4 CPUs? The reason for this concern is performance problems we have faced deploying systems with Hyper-Threading Technology.
Any documentation or examples in this regard would be very useful. Basically, what are the scenarios that would require a certain type of CPU configuration over another?
Thanks in advance for your help,
Ziggy
View 1 Replies
View Related
Aug 24, 2007
What gives the best performance for this configuration:
Files:
Data
Log
Indexes
tempdb
Disk:
A - RAID 10
B - RAID 10 (or should this be RAID 1?)
What's best?
A - Data and Indexes
B - tempdb and Log
??? Thanks.
View 1 Replies
View Related
Jan 3, 2008
C# .NET application as front end
SQL Server 2000 as back end
I need to merge an external dataset from the .NET app (in XML format) with the information in the database, using one column in a database table as the merge criterion. The situation is similar to a LEFT OUTER JOIN, wherein I need all records from the external dataset and, where matched in the database, the corresponding values from there too; the only difference here is that the join is not between two tables but between a table and an external dataset.
There is no need to store the external dataset in the database in persistent form; it's just a query - merge - response operation.
So, can anyone suggest the best possible solution for this? A table variable / temporary table / some other approach? What and how?
Thanks in advance..
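For illustration, a minimal sketch of the shape this can take on SQL Server 2000 with OPENXML, joining the shredded XML straight to the table without persisting it (the table and column names are hypothetical):
DECLARE @hDoc INT
DECLARE @xml NVARCHAR(4000)
SET @xml = N'<rows><row key="1" name="alpha"/><row key="2" name="beta"/></rows>'
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml
-- Left outer join: every row from the external dataset, matched values where they exist.
SELECT  x.[key], x.[name], t.SomeColumn
FROM    OPENXML(@hDoc, '/rows/row', 1)
        WITH ([key] INT '@key', [name] NVARCHAR(50) '@name') AS x
LEFT OUTER JOIN dbo.MyTable AS t
        ON t.KeyColumn = x.[key]
EXEC sp_xml_removedocument @hDoc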
View 8 Replies
View Related
Mar 26, 2007
Hi All,
I am playing with the DBCC command to check the constraints on a particular table (DBCC CHECKCONSTRAINTS ('myTable') WITH ALL_CONSTRAINTS), and it always gives the following result:
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
nothing more than that; can anyone help please?
Cheers,
Riaz
View 3 Replies
View Related
Mar 5, 2005
To back up the database I used the script
BACKUP DATABASE.
After that, I need to check database integrity, for which
I used DBCC CHECKDB and DBCC CHECKCATALOG,
but I want to store the results in a text file after executing these two commands. How do I write the command line for that? Please help me.
View 1 Replies
View Related
Dec 10, 2006
Hi, I want to know the optimal solution to find out whether all the data was entered. Let's say Table A has a date field, and for a given month I need to check that all the days in that month are present in Table A. Right now I have two different solutions:
1) a stored procedure which loops through all the days in the given month against a select statement on Table A
2) a stored procedure that creates a temp table containing all the dates in the given month, and a single select statement using a where condition (select * from ... where datefield not in (select * from ...))
I want to know which of these two is the best solution, or any other solution.
Thanks
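For illustration, a minimal sketch of the second approach: materialize every day of the month and report the ones with no matching row (the table and column names are placeholders):
DECLARE @monthStart DATETIME, @monthEnd DATETIME, @d DATETIME
SET @monthStart = '20061201'
SET @monthEnd   = DATEADD(month, 1, @monthStart)
-- One row per day of the month.
CREATE TABLE #AllDays (DayDate DATETIME NOT NULL PRIMARY KEY)
SET @d = @monthStart
WHILE @d < @monthEnd
BEGIN
    INSERT #AllDays (DayDate) VALUES (@d)
    SET @d = DATEADD(day, 1, @d)
END
-- Days with no matching row in Table A (the range comparison also copes with time-of-day values).
SELECT d.DayDate
FROM   #AllDays AS d
LEFT JOIN dbo.TableA AS a
       ON a.datefield >= d.DayDate
      AND a.datefield <  DATEADD(day, 1, d.DayDate)
WHERE  a.datefield IS NULL
DROP TABLE #AllDays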
View 9 Replies
View Related
Feb 20, 2004
I was told in one of my systems classes that the real performance bottleneck in accessing information from the database was the opening of a connection from the application to the database.
To combat that problem I was advised to use a Singleton Factory pattern: have that factory instantiate a connection and open it, then pass references to that connection to all of the objects it created. All of those objects passed the connection reference to the objects they created, and so on. Basically that meant I only ever had one connection open at any one time for my entire application. I was able to implement this solution at my previous job, where I was developing against Oracle. I primarily used OracleCommands and OracleDataReaders to get the information into and out of the database. I thought this was a very nice solution. Having this many DataReaders accessing a single connection was not a problem because OracleConnections don't get locked by having more than one DataReader open at once.
At my current job, however, I use SQL Server. I am concerned that the single connection will not work in my new environment, as SqlDataReaders lock up the connection while they are using it. If the information I received about opening connections being the real bottleneck is correct, then I am hesitant to have a connection instantiated and opened for each method, but I am concerned that a whole lot of errors will be generated if I use the single-connection method. Also, how do DataAdapters affect my decision of which approach to use?
Any advice would be most helpful. If you have any questions that would help answer just ask. Thanks.
View 4 Replies
View Related
Jul 20, 2005
Suppose you have two (or more) tables with foreign key constraints. My question is this: is it better to check whether the foreign key exists before you try to perform the insert, or to let SQL do it for you?
On one hand, if you check yourself and the key does not exist, you can handle it gracefully (maybe exit out of the method with an error). If you let SQL do it, the server will throw an error which cannot be suppressed. On the performance side, doing the check yourself incurs a slight (VERY slight) hit, since SQL will ALSO check anyway.
View 3 Replies
View Related
Jul 15, 2004
I'm starting to collect and develop some scripts that will tell me the health and welfare of my MSSQL 2000 server. I have a few for blocking, DB size, and who is logged on and what they are currently running.
I was wondering if you guys could share some of the scripts you guys use to watch the health of your servers.
Thanks,
DMW
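For illustration, one small example of the kind of script described above: a quick blocking snapshot from SQL 2000's sysprocesses.
-- Sessions that are currently blocked, and who is blocking them.
SELECT  p.spid,
        p.blocked        AS blocking_spid,
        p.waittime       AS wait_ms,
        p.lastwaittype,
        DB_NAME(p.dbid)  AS database_name,
        p.loginame,
        p.program_name
FROM    master.dbo.sysprocesses AS p
WHERE   p.blocked <> 0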
View 1 Replies
View Related
Aug 21, 2002
On weekends I have integrity checks scheduled to run. Many of these fail for individual databases because users do not log off and the databases cannot be switched to single-user mode.
I have checked Books Online and have not yet stumbled onto a T-SQL command that breaks the connections.
Is there a T-SQL command to do this? If not, how can these connections be broken?
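For reference, a minimal sketch of the usual workaround: let ALTER DATABASE disconnect the sessions itself as part of switching to single-user mode (the database name is a placeholder).
-- Roll back open transactions immediately and take the database to single-user mode in one step.
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
-- ... run the integrity checks here ...
ALTER DATABASE MyDb SET MULTI_USER
GO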
View 2 Replies
View Related