I have installed a 2-node Windows failover cluster successfully, but the quorum configuration is not appearing in Failover Cluster Manager; instead it shows "Witness: Disk (Disk Cluster 4)". I have also configured the quorum through "Configure Cluster Quorum Settings". I have attached snapshots of the Windows cluster configuration. Is this an issue or not? I did not get any warnings or errors during cluster validation while installing the Windows failover cluster, so I am assuming it is okay and I can move ahead with the SQL Server failover cluster installation.
Products used for the installation in the virtual machine: Windows Server 2012 R2, SQL Server 2012 R2. Note: no Service Pack is installed.
I have a Windows 8 PC that I just got and installed SQL Server Express 2014 on. My buddy has Windows 7 and installed SQL Server Express on his PC. We created a database on his PC, did a backup, and copied the backup to my PC. In SSMS I right-click "Databases" > Restore Database, click Device and then the button to browse for my file. I navigate to the folder where the file shows in File Explorer, but the .bak file does not show in SSMS to restore from. This is probably a Windows thing, but I don't know what to look at.
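One thing to note: the SSMS browse dialog lists files as seen by the SQL Server service account, so a file visible in File Explorer can be hidden from the dialog by permissions or location. As an alternative, a minimal sketch of restoring by path in T-SQL (the paths, database name, and logical file names are placeholders; RESTORE FILELISTONLY shows the logical names to use):

-- Placeholders: adjust the backup path, database name, and file locations
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MyDb.bak';

RESTORE DATABASE MyDb
FROM DISK = N'C:\Backups\MyDb.bak'
WITH MOVE N'MyDb'     TO N'C:\SQLData\MyDb.mdf',
     MOVE N'MyDb_log' TO N'C:\SQLData\MyDb_log.ldf',
     RECOVERY;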
I configured database mirroring on SQL Server 2008 R2 in a test environment. However, it stopped working after some time, so I deleted the mirroring configuration and also deleted the database on both the principal and mirror instances.
A few days later I noticed that, on the principal, continuous error events are being logged with the message "Database mirroring attempt by user 'domain\service account' failed with error: 'Connection handshake failed. 'domain\service account' does not have permission on the endpoint.' State 84."
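If the leftover endpoint, or the former partner still trying to connect to it, is what generates these handshake errors, a sketch of the usual checks (the endpoint name and account are placeholders):

-- Check which mirroring endpoints still exist on the instance
SELECT name, state_desc, role_desc
FROM sys.database_mirroring_endpoints;

-- If mirroring is gone for good, dropping the endpoint stops the handshake attempts
-- (endpoint name is a placeholder; use the name returned above)
DROP ENDPOINT Mirroring;

-- Or, if the endpoint is still needed, grant the remote service account CONNECT on it
GRANT CONNECT ON ENDPOINT::Mirroring TO [DOMAIN\ServiceAccount];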
We have roughly 22 people connected to one database, but after a while their applications begin to drag due to the in-and-out communication with the server. When I check the active connections on the SQL Server, I sometimes see 157 active connections. How do I set a timeout or a connection-close interval to reduce the heavy load being put on the server? Or how can I automatically close connections when the count climbs above 50?
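SQL Server itself has no idle-connection timeout setting; the usual fix is connection pooling or closing connections properly in the application. As a stop-gap, a sketch that finds sessions idle for more than 30 minutes (the threshold is arbitrary), which a scheduled job could then KILL:

-- Sessions that have been sitting idle for more than 30 minutes (threshold is arbitrary)
SELECT s.session_id, s.login_name, s.host_name, s.program_name, s.last_request_end_time
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
  AND s.status = 'sleeping'
  AND s.last_request_end_time < DATEADD(MINUTE, -30, GETDATE());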
I need to test capturing connections coming from outside using Profiler. Basically, I need to do something from other SQL Servers connecting to this SQL Server and test whether it is captured by Profiler.
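As a cross-check against what Profiler captures, a sketch that lists the current connections with their source address; remote TCP clients show their IP, while shared-memory connections report '<local machine>':

-- Current connections and where they come from
SELECT c.session_id, c.client_net_address, c.connect_time,
       s.host_name, s.program_name, s.login_name
FROM sys.dm_exec_connections AS c
JOIN sys.dm_exec_sessions AS s
  ON s.session_id = c.session_id;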
In SQL Server 2008 R2, if we clone an environment including SQL Server, the maintenance plans retain a connection string to the source/original server they were cloned from and are not editable. I was able to work around that by editing them in BIDS and saving them back to the server. But now with 2014 I am facing two issues:
1. I can still edit the package in SSDT to correct the server connection, but the option to save it back to the server is no longer available.
2. In 2008 R2 I used to be able to see all my plans under SSIS, but not in 2014, although they are listed in SSMS.
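For reference, maintenance plans are stored as SSIS packages inside msdb, so the package XML, which is where the cloned server name still lives, can at least be inspected from T-SQL; a sketch assuming the default "Maintenance Plans" folder:

-- Maintenance plan packages and their XML, where the old server name still appears
SELECT p.name,
       CAST(CAST(p.packagedata AS varbinary(max)) AS xml) AS package_xml
FROM msdb.dbo.sysssispackages AS p
JOIN msdb.dbo.sysssispackagefolders AS f
  ON f.folderid = p.folderid
WHERE f.foldername = N'Maintenance Plans';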
I am trying to create a logon trigger. As I was testing this, I discovered that each time I make a connection, I get 19 rows inserted into my audit table. I ran Profiler and I can see it is going through the logon trigger multiple times for a single connection. So, what am I doing wrong? The code is fairly simplistic, and Profiler doesn't give a clue as to what is going on. When I look at the output, I see the SPIDs for the first couple of connections are different, and then a SPID that is different from those two appears in the next 17 rows. But when I run sp_who2, that SPID does not exist.
This issue was first noticed on a 2012 instance I was testing on, then the same thing happened on 2008 R2. I am currently testing on a 2014 instance, which is doing the same thing. Is the logon trigger itself firing and causing this?
I also tried using the AFTER LOGON option and got the same issue.
Here is the code:
CREATE TRIGGER LogonAuditTrigger
ON ALL SERVER
WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    DECLARE @Body NVARCHAR(2000),
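The posted code is cut off there. As a point of comparison, a minimal hedged trigger body (the audit table master.dbo.LogonAudit and its columns are assumptions) that also records the program and host name; capturing APP_NAME() makes it easier to see which client produced which rows, and a single SSMS session alone opens several connections (Object Explorer, IntelliSense, and so on), each of which fires the trigger.

-- Minimal sketch; master.dbo.LogonAudit and its columns are assumed to exist
CREATE TRIGGER LogonAuditTrigger
ON ALL SERVER
WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    INSERT INTO master.dbo.LogonAudit (LoginName, LoginTime, HostName, ProgramName, SPID)
    VALUES (ORIGINAL_LOGIN(), GETDATE(), HOST_NAME(), APP_NAME(), @@SPID);
END;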
I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check for membership in the securityadmin role.
I have tested it and can see that you can grant EXECUTE on xp_readerrorlog, but the SSMS GUI uses sp_readerrorlog.
I thought I could create a user/certificate and add the signature to sp_readerrorlog, but that's not permitted (likely because it's not a normal database object).
So the other solution is to add the users to the securityadmin role but then explicitly deny ALTER ANY LOGIN (best done with a custom server role in 2012+, otherwise just manually in 2008). I tested this out and it works: I'm not able to alter any logins or increase my own permissions. I also checked what's reported by fn_my_permissions(null, null) and it shows minimal permissions, as I'd expect.
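For reference, a minimal sketch of the 2012+ variant described above; the role and login names are placeholders:

-- Names are placeholders; CREATE SERVER ROLE requires SQL Server 2012 or later
CREATE SERVER ROLE ErrorLogReaders;
ALTER SERVER ROLE ErrorLogReaders ADD MEMBER [DOMAIN\SomeUser];
ALTER SERVER ROLE securityadmin ADD MEMBER [DOMAIN\SomeUser];
DENY ALTER ANY LOGIN TO ErrorLogReaders;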
I'm having trouble getting my project to work on other machines using Windows authentication, so as this is urgent I want to change it to SQL authentication. I've enabled SQL authentication and enabled the sa login. Could someone please tell me how to connect to my database? My connection string is currently as follows, using Windows authentication:

<connectionStrings>
  <add name="GuitarShackConnection" connectionString="Server=(local)\SqlExpress;Integrated Security=True;Database=GuitarShack;" providerName="System.Data.SqlClient"/>
  <add name="GuitarShackConnectionString" connectionString="Data Source=(local)\SQLEXPRESS;Initial Catalog=GuitarShack;Integrated Security=True" providerName="System.Data.SqlClient"/>
  <add name="CustomerNameDS" connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\ASPNETDB.MDF;Integrated Security=True;User Instance=True" providerName="System.Data.SqlClient"/>
</connectionStrings>
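The connection string would then swap Integrated Security=True for something like User Id=...;Password=.... On the server side, a minimal sketch, assuming mixed mode is already enabled and the service restarted; the login name, password, and role choices are placeholders, and a dedicated login is used here rather than sa:

-- Placeholder name and password; sp_addrolemember also works on older versions
CREATE LOGIN GuitarShackApp WITH PASSWORD = N'StrongPasswordHere!';
USE GuitarShack;
CREATE USER GuitarShackApp FOR LOGIN GuitarShackApp;
EXEC sp_addrolemember 'db_datareader', 'GuitarShackApp';
EXEC sp_addrolemember 'db_datawriter', 'GuitarShackApp';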
I made my first web application using VBScript. I am trying to use Windows authentication for my web page and give all permissions based on the Windows domain user account. Here is the connection string I was using to connect to SQL Server, and it was working fine.
I made a couple of changes so it runs based on the Windows domain user:
1) My server is set up for Windows and SQL Server authentication.
2) In Web.config I set the authentication mode to "Windows".
3) I also added <authorization> <deny users="?" /> </authorization>.
4) Changed the connection string: added "Integrated Security=True;" and removed the user and password lines.
5) Ran the application, and now I get the error "Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection." I was expecting a Windows popup to log in with a domain user and password, which would then be verified against SQL Server and taken from there. Where did I make a mistake? I need help.
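The '(null)' login often means the site is still connecting anonymously rather than passing the Windows credentials through. Once it does pass them, the account still needs a login and database user on the SQL Server side; a sketch with placeholder names:

-- Names are placeholders
CREATE LOGIN [DOMAIN\WebUser] FROM WINDOWS;
USE MyAppDb;
CREATE USER [DOMAIN\WebUser] FOR LOGIN [DOMAIN\WebUser];
EXEC sp_addrolemember 'db_datareader', 'DOMAIN\WebUser';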
I have a problem when accessing reports in SSRS 2005. I have set up several security roles for Domain Users and BUILTIN\Administrators on the main folder, and the reports in this folder inherit those roles. I created an ASP.NET application with a button that links to these reports using the URL access method, but SSRS 2005 always pops up a window asking for authentication. After I enter a user name and password, the reports run. Does anyone know how to get rid of the authentication prompt? Thanks in advance.
I have the following query which returns the script for SPs, views, and functions:

SELECT DEFINITION FROM sys.sql_modules

Now, the result looks like:

Create proc
create procedure
create proc
create view
create function

I need the result to be:

Alter Procedure
Alter Procedure
Alter Procedure
Alter View
Alter Function

I used the following:

SELECT REPLACE(REPLACE(REPLACE(DEFINITION, 'CREATE PROCEDURE', 'Alter Procedure'), 'create proc', 'Alter Procedure'), 'create view', 'Alter View')
FROM sys.sql_modules

but it only matches a fixed single space, like create<space>proc. How can I check for two or more spaces between "create" and "proc", "view", or "function" so it still replaces them the way I want?
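A hedged sketch of one way to do this: collapse runs of spaces first (using CHAR(7) as a token assumed not to occur in the definitions), then do the CREATE-to-ALTER swap. It assumes a case-insensitive collation and only handles spaces, not tabs or line breaks, between the keywords, and the collapse affects repeated spaces everywhere in the text, string literals included:

-- Collapse runs of spaces, then swap CREATE for ALTER (longer keyword first)
SELECT REPLACE(REPLACE(REPLACE(REPLACE(
           REPLACE(REPLACE(REPLACE(m.definition, ' ', ' ' + CHAR(7)), CHAR(7) + ' ', ''), CHAR(7), ''),
           'CREATE PROCEDURE', 'ALTER PROCEDURE'),
           'CREATE PROC',      'ALTER PROCEDURE'),
           'CREATE VIEW',      'ALTER VIEW'),
           'CREATE FUNCTION',  'ALTER FUNCTION') AS altered_definition
FROM sys.sql_modules AS m;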
If I install an instance with Windows-only authentication and then change it to mixed mode, when I enable the sa login the password has already been set. What is the default? Is the password generated, and if so, how secure is it, and what algorithm is used to generate it?
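Whatever value setup stores for the disabled sa login, it can simply be replaced when switching to mixed mode; a sketch (the password is a placeholder), with sys.sql_logins confirming whether sa is still disabled:

-- Placeholder password; run after switching the instance to mixed mode and restarting
ALTER LOGIN sa WITH PASSWORD = N'NewStrongPasswordHere!';
ALTER LOGIN sa ENABLE;
SELECT name, is_disabled FROM sys.sql_logins WHERE name = N'sa';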
My SQL databases in SQL Server 2014 show the status "suspend", as I see in SQL Server Management Studio. I can't restore the databases to a serviceable condition through the standard procedures. I need to restore the .mdf file.
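If the database really is in the SUSPECT state and no usable backup exists, the usual last-resort sequence is EMERGENCY mode followed by CHECKDB with REPAIR_ALLOW_DATA_LOSS, which, as the name says, can discard data; a sketch with MyDb as a placeholder name:

-- Last resort only; REPAIR_ALLOW_DATA_LOSS can discard data. MyDb is a placeholder.
ALTER DATABASE MyDb SET EMERGENCY;
ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (MyDb, REPAIR_ALLOW_DATA_LOSS);
ALTER DATABASE MyDb SET MULTI_USER;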
I am using a monitoring system where I can monitor a numeric SQL result, assuming the result is one field and one row. I would like to use this to monitor, say, the free available space or percentage on the master database. DBCC SQLPERF gives me several columns and rows for all databases on the server.
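A minimal sketch that returns one row and one numeric column: the percentage of data-file space currently free in the current database (run it against master, or whichever database the monitor targets; the monitoring tool may need the USE dropped and the database set in its connection instead):

-- One row, one numeric column: percent of data-file space currently free
USE master;
SELECT CAST(SUM(size - FILEPROPERTY(name, 'SpaceUsed')) * 100.0
            / SUM(size) AS decimal(5, 2)) AS percent_free
FROM sys.database_files
WHERE type_desc = 'ROWS';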
In our environment, applications use a DNS name which points to the physical server's IP address. Now we are planning to move to 2014, and we plan to have servers in different subnets, so we will have two IP addresses for the listener. How can we point the DNS name to the listener IPs? If a failover happens, can DNS point to the exact listener IP on the current primary node?
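For context, with an Availability Group listener the cluster registers the listener's DNS name itself, by default with all of its IP addresses, and clients on newer drivers add MultiSubnetFailover=True to their connection string so both addresses are tried. A minimal sketch of adding a listener with one static IP per subnet (the AG name, listener name, IPs, and masks are placeholders):

-- Names, IPs, and subnet masks are placeholders
ALTER AVAILABILITY GROUP [MyAG]
ADD LISTENER N'MyListener'
(
    WITH IP ( (N'10.0.1.50', N'255.255.255.0'),
              (N'10.0.2.50', N'255.255.255.0') ),
    PORT = 1433
);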
"Process 0:0:0 (0x1e10) Worker 0x00000006B6D341A0 appears to be non-yielding on Scheduler 13. Thread creation time: 12906028806348. Approx Thread CPU Used: kernel 0 ms, user 0 ms. Process Utilization 13%. System Idle 84%. Interval: 70189 ms."
Is it better to run Profiler or performance counters?
What filters do we have to select in Profiler to monitor SQL Server?
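Before reaching for Profiler, the DMVs can show what is running on the scheduler named in the message (13 in the error above); a minimal sketch:

-- What is currently running on scheduler 13 (the one named in the error message)
SELECT r.session_id, r.status, r.command, r.cpu_time, r.total_elapsed_time,
       r.wait_type, r.scheduler_id
FROM sys.dm_exec_requests AS r
WHERE r.scheduler_id = 13;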
I have a SQL server box running 2014 reporting services. I have another server running IIS v8.
I would like to be able to connect to the IIS site and be given the SSRS report browser.
So externally if I browse to [URL], I am presented with the report server interface, the same as if I browse to http://xxx.xxx.xxx.xxx/reports internally.
What is the best approach for a read-only copy of a database that is roughly 1 TB? The primary database is fed nightly by an ETL process. We are currently trying to duplicate the ETL to the read-only server, but that process is not going well, so we are looking at other options that let SQL Server make the copy.
The primary database is on Windows Server 2012 R2 with SQL Server 2012 or 2014, on a 2-node active/passive failover cluster.
The read-only copy will be on Windows Server 2012 R2 with SQL Server 2012 or 2014. It is not a requirement to fail over to the read-only copy if the primary goes down.
What would be the best approach to accomplish the end result?
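One option along the lines of letting SQL Server make the copy is log shipping, or simply restoring the nightly full backup, WITH STANDBY, which keeps the copy readable between restores. A minimal sketch (paths and names are placeholders):

-- Paths and names are placeholders; WITH STANDBY keeps the copy read-only between restores
RESTORE DATABASE MyDb
FROM DISK = N'\\backupshare\MyDb_full.bak'
WITH STANDBY = N'D:\Standby\MyDb_undo.dat', REPLACE, STATS = 10;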
I have 10 databases configured as principals in mirroring, and I need to fail over all of them as part of a failover. Instead of writing a SET PARTNER FAILOVER query for each database, is there a script that will generate the failover commands for all databases that are currently principal?
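A sketch that generates the failover statement for every database currently in the principal role on the instance; review the output, then run it on the principal:

-- Generates one ALTER DATABASE ... SET PARTNER FAILOVER per principal database
SELECT 'ALTER DATABASE ' + QUOTENAME(d.name) + ' SET PARTNER FAILOVER;' AS failover_cmd
FROM sys.database_mirroring AS m
JOIN sys.databases AS d
  ON d.database_id = m.database_id
WHERE m.mirroring_role_desc = 'PRINCIPAL';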
After installing SSMS on some computers, the only way we can get SSMS to run correctly is to run it as administrator. Is there a way to avoid having to do that? These end users are logging in as themselves and have accounts in SQL Server all set up, but SSMS will only launch for them if we right-click and select "Run as administrator". After doing some digging, it seems that this is a common problem out there.
I have a SQL Server 2014 install and cannot, for the life of me, get the maintenance plan to remove old backups. I've tried everything: rights to the folder where the backups are stored are adequate, the extension set in the cleanup task is as it should be, and so on. The log shows the job ran successfully, and running the command manually shows successful completion, but the backups are still not removed.
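The cleanup task calls the undocumented xp_delete_file under the covers, and running it by hand can expose the usual gotcha of the extension being given with a leading dot ('.bak' instead of 'bak'); a sketch with a placeholder path and cutoff date:

-- Undocumented procedure the cleanup task uses; 0 = backup files, last flag 0 = no subfolders.
-- Note the extension has no leading dot. Path and cutoff date are placeholders.
EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', N'2016-01-01T00:00:00', 0;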
When I execute a RESTORE LOG command, the Messages window shows how many seconds the restore took; at the same time, the status bar also shows how many seconds the command took.
The two values are different, and can be very different. Please see the examples below: one restore takes 1.8 seconds while the command takes 4 seconds in total to complete; the other is 8.1 seconds versus 12 seconds.
What does SQL Server or Windows do after the restore?
pic a:
pic b:
I ran xperf, and I can see that after the restore completes, SQL Server does a garbage collect and a log write, which run very quickly, but the storage is busy reading the log file for nearly 2.2 seconds (4 - 1.8) and 4 seconds (12 - 8.1).
pic 1:
pic 2:
See pic 1 above: from 13 to 17 the restore operation is finished, but the storage jumps to 100% active doing reads, only reads and no writes. Zooming into that period shows pic 2: it reads 4096 (I don't know the unit size) for about 4 seconds. What is this doing?
The data file, log file, and backup file are not on different drives; they are all on the same local drive. The interesting point is that the reads jump right after the restore. I tested it on a different server with the same result.
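To see from inside SQL Server which files take those post-restore reads, the cumulative per-file I/O stats can be snapshotted before and after the restore and compared; a minimal sketch:

-- Cumulative per-file I/O since instance start; snapshot before and after the restore and diff
SELECT DB_NAME(vfs.database_id) AS database_name, mf.physical_name,
       vfs.num_of_reads, vfs.num_of_bytes_read,
       vfs.num_of_writes, vfs.num_of_bytes_written
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id
 AND mf.file_id = vfs.file_id;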