SQL Server Admin 2014 :: Delete Orphan Users From Multiple Databases
Oct 21, 2015
I have a requirement to delete all the orphaned users from our databases. The issue I am running into is that when a database principal owns a schema in the database, the user cannot be dropped.
How do I transfer that schema to dbo while looping through multiple databases? This is what I have so far (a sketch of the schema-transfer step follows the snippet).
DECLARE @is_read_only BIT
SELECT @is_read_only = is_read_only FROM master.sys.databases WHERE name = 'test' /* This should be a parameter value */
IF @is_read_only = 0
BEGIN
DECLARE @SQL VARCHAR(200)
[Code] .....
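One way to handle the schema-ownership blocker, sketched here with illustrative names rather than as a finished script: inside each database, hand any schema owned by an orphaned user to dbo, then drop the user. Wrapped in sp_MSforeachdb or a cursor that prefixes the batch with USE [dbname], the same pattern covers every writable database.

DECLARE @sql NVARCHAR(MAX) = N'';

-- Reassign schemas owned by orphaned users (no matching server login) to dbo
SELECT @sql = @sql + N'ALTER AUTHORIZATION ON SCHEMA::' + QUOTENAME(s.name) + N' TO dbo; '
FROM sys.schemas AS s
JOIN sys.database_principals AS dp ON dp.principal_id = s.principal_id
LEFT JOIN sys.server_principals AS sp ON sp.sid = dp.sid
WHERE dp.type IN ('S', 'U', 'G')      -- SQL users, Windows users/groups
  AND dp.principal_id > 4             -- skip dbo, guest, INFORMATION_SCHEMA, sys
  AND sp.sid IS NULL;

-- Then drop the orphaned users themselves
SELECT @sql = @sql + N'DROP USER ' + QUOTENAME(dp.name) + N'; '
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp ON sp.sid = dp.sid
WHERE dp.type IN ('S', 'U', 'G')
  AND dp.principal_id > 4
  AND sp.sid IS NULL;                 -- note: users created WITHOUT LOGIN also match this test

EXEC sys.sp_executesql @sql;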
View 4 Replies
May 15, 2015
We have multiple databases on a single instance in an OLTP environment. I have my data files on a separate SAN LUN from my transaction log files (and a few NDFs split out onto additional LUNs). I was wondering if there is a performance benefit to putting each LDF file on its own LUN? Or at least my few busiest LDFs?
We are currently on 2012, but I'm having to put together specs for a 2014 installation and need to answer this question without having an environment in which I can benchmark different setups. I just want to hear whether or not others have done this (why or why not?).
View 3 Replies
Sep 24, 2014
Is there a way to back up a SQL Server database without having to stop users from connecting to it?
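For what it's worth, a native full backup is an online operation, so connections can stay open while it runs; a minimal sketch with placeholder names:

BACKUP DATABASE [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase_Full.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;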
View 5 Replies
Jul 22, 2015
I'm trying to find out which tables are being used in a database.
I don't want just the last user; I want the users and the dates.
I have a script that returns the last user, but that is not going to work.
The following script returns the last access but not all users or the login names (a fuller version is sketched after the snippet):
WITH LastActivity (ObjectID, LastAction) AS
(
SELECT object_id AS ObjectID,
last_user_seek AS LastAction
FROM sys.dm_db_index_usage_stats u
WHERE database_id = DB_ID(DB_NAME())
[Code] .....
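A possible complete version of the same idea (a sketch, not the original full script): last read and write per table from the index usage DMV. Keep in mind this DMV is cleared at instance restart and only records when the last action happened, not who performed it; per-user history needs SQL Audit or Extended Events.

;WITH LastActivity AS
(
    SELECT object_id,
           MAX(last_user_seek)   AS last_seek,
           MAX(last_user_scan)   AS last_scan,
           MAX(last_user_lookup) AS last_lookup,
           MAX(last_user_update) AS last_update
    FROM sys.dm_db_index_usage_stats
    WHERE database_id = DB_ID()
    GROUP BY object_id
)
SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName,
       OBJECT_NAME(object_id)        AS TableName,
       last_seek, last_scan, last_lookup, last_update
FROM LastActivity
ORDER BY SchemaName, TableName;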
View 2 Replies
Mar 12, 2015
I am quite new to SSIS but managed to build a package which imports text files in to SQL. The text files are generated after users complete a manufacturing process on a machine.
The SSIS package is stored in the SSIS catalog, and currently a SQL Agent job runs every evening to import new files that have been created during the day. Users have now requested the ability to run the import process as soon as they have finished their manufacturing runs, as they may want to query the data to look up stats, etc.
What is the best way to do this, considering the users are not SQL people and won't have direct logins to the SQL Server or access to SQL Server Management Studio? They will have access to the PC where the files are generated, so ideally I need a batch file which they can just execute to import their new files.
I have seen lots of things on the web about running DTExec, but as the package is stored in the SSIS catalog, how can I execute it remotely?
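One possible route (folder, project and package names below are placeholders): start the catalog execution with T-SQL, and have a one-line batch file run that script via sqlcmd under a Windows account granted execute rights in SSISDB, so users never need Management Studio.

DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'Manufacturing',
     @project_name    = N'FileImports',
     @package_name    = N'ImportTextFiles.dtsx',
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

-- Run synchronously so the caller knows when the import has finished
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 1;

EXEC SSISDB.catalog.start_execution @execution_id;

The batch file itself could then be little more than a sqlcmd call that runs this script against the server hosting the catalog.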
View 6 Replies
Sep 9, 2015
Our development team wanted to create a database user for each application user and use these for granular data access control, which at first sounded like a good idea, but our initial testing ran into some interesting results.
Our target user base was about 15 million users with an estimated 1% concurrency rate. Finding no Microsoft documentation on an upper limit to the number of users a database can have, we began load testing to see how the database performed. In the hundreds-of-thousands-of-users range, our test database had a hard time performing well even under light loads (and without any concurrent connections).
When we purged the users and reverted back to just a handful of service accounts, performance went back to "normal" under the same loads. I began to wonder if this is a situation where throwing more hardware at the problem would overcome the issue or if there is a practical upper limit to the number of users a single database can handle well.
(There were of course other cons to this arrangement and I certainly was never going to expand the users tree in the object explorer for a database like this, but we thought it a solution worth investigating.)
What is the largest number of users any of you have had in a single database?
View 3 Replies
Aug 9, 2015
I've got an old version of SQL Server 2008 R2 Developer Edition on an old PC which is failing. I've got a new PC and have put SQL Server 2014 Developer Edition onto it. Before the old machine completely dies, I've gotten into SSMS on the old machine and backed up the databases I want to save. I've moved the .BAK files to where I can get to them from SSMS on the new machine, then tried to restore the database there. However I'm getting an error that does not make any sense to me.
The database I've backed up is named JobSearch. When I backed it up, that was the only database I had selected. Like I said, I copied the .BAK to the new machine, got into SSMS, told it that I wanted to restore the JobSearch database and where I wanted to put it, and it immediately fails with:
"Restore of database 'JobSearch' failed. System.Data.SqlClient.SqlError: Logical file 'VideoLibrary_Data' is not part of database 'JobSearch'. Use RESTORE FILELISTONLY to list the logical file names."
Well of course VideoLibrary isn't "the logical file". But neither did I select VideoLibrary (which is a database I also want to move, but I'm doing one at a time). So what in heck is going on here? Why is it complaining about a database I haven't even selected to back up? Why, when I check everything on the old machine, it's backing up JobSearch, but on the new machine it sees VideoLibrary?
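That error usually means the .bak file does not contain (only) the backup set you expect, for example because an existing file was overwritten or appended to. A sketch, with placeholder paths and logical names, for checking what is really inside the file and then restoring with explicit moves:

RESTORE FILELISTONLY
FROM DISK = N'D:\Backups\JobSearch.bak';

-- If the file holds more than one backup set, pick the right one with FILE = n;
-- then restore using the logical names the previous statement returned.
RESTORE DATABASE JobSearch
FROM DISK = N'D:\Backups\JobSearch.bak'
WITH FILE = 1,
     MOVE N'JobSearch_Data' TO N'D:\MSSQL\Data\JobSearch.mdf',
     MOVE N'JobSearch_Log'  TO N'D:\MSSQL\Log\JobSearch_log.ldf',
     STATS = 10;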
View 6 Replies
Jun 19, 2015
Have a SQL 2014 install and cannot for the life of me get the maintenance plan to remove old backups. I've tried everything. Rights to the folder where the backups are stored are adequate, extension set in the clean up task is as it should be, etc. Log shows the job ran successfully. Running the command manually shows successful completion, but backups are still not removed.
View 9 Replies
Sep 25, 2015
I need a backup script that backs up all of the databases. We do have a maintenance plan, but some of our database names are 98 characters long; when the maintenance plan stores the backup history it appends the date and time zone information, which pushes the value past the 128-character limit, so it fails to write the history to msdb.
So we want to take the backups using a script, and it has to create a subfolder for each database. Also, if any database backup fails it should continue with the others.
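A rough sketch of that kind of script (the root path is a placeholder): xp_create_subdir builds the per-database folder, and TRY/CATCH lets the loop carry on past a failed backup.

DECLARE @root NVARCHAR(260) = N'D:\Backups\',
        @db   SYSNAME,
        @dir  NVARCHAR(400),
        @file NVARCHAR(400);

DECLARE db_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE state_desc = 'ONLINE' AND name <> 'tempdb';

OPEN db_cur;
FETCH NEXT FROM db_cur INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        SET @dir  = @root + @db;
        SET @file = @dir + N'\' + @db + N'_' + CONVERT(NVARCHAR(8), GETDATE(), 112) + N'.bak';
        EXEC master.dbo.xp_create_subdir @dir;                        -- creates the folder if it is missing
        BACKUP DATABASE @db TO DISK = @file WITH COMPRESSION, CHECKSUM;
    END TRY
    BEGIN CATCH
        PRINT 'Backup failed for ' + @db + ': ' + ERROR_MESSAGE();    -- log and keep going
    END CATCH;
    FETCH NEXT FROM db_cur INTO @db;
END
CLOSE db_cur;
DEALLOCATE db_cur;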
View 6 Replies
Sep 1, 2015
I am planning to delete a login because the person has moved off the project. When I try to delete the login, it throws an error saying "The server principal owns an endpoint and cannot be dropped" (error 15141).
I'm facing the same problem on several servers.
Note : Environment is SQL 2012,SQL 2008 including cluster servers .
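Assuming endpoint ownership really is the blocker, one sketch (login and endpoint names are placeholders) is to find the endpoints the login owns, transfer them to sa or another principal, and then drop the login:

SELECT e.name AS endpoint_name, sp.name AS owner_name
FROM sys.endpoints AS e
JOIN sys.server_principals AS sp ON sp.principal_id = e.principal_id
WHERE sp.name = N'DOMAIN\DepartedUser';

ALTER AUTHORIZATION ON ENDPOINT::[Mirroring] TO [sa];   -- use the endpoint name returned above
DROP LOGIN [DOMAIN\DepartedUser];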
View 2 Replies
Jun 27, 2014
I have just upgraded a test server from SQL Server 2008 SP3 to SQL Server 2014 with an in-place upgrade. The compatibility level of the master database was not upgraded: it still shows 90, while the rest of the system databases were updated to 120. Is it fine to update the compatibility level of the master database, and are there any precautions I need to take?
View 1 Replies
Jan 10, 2015
I'm looking for a query that lists all databases in an instance that were not accessed before a given date (e.g., not accessed before December 31, 2014).
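One possible sketch based on sys.dm_db_index_usage_stats; note the DMV is cleared at every instance restart, so "no recorded access" only covers the period since the last restart, and the cutoff date is a placeholder.

DECLARE @cutoff DATETIME = '2014-12-31';

SELECT d.name AS database_name,
       MAX(u.last_access) AS last_recorded_access
FROM sys.databases AS d
LEFT JOIN (
    SELECT database_id,
           (SELECT MAX(v)
            FROM (VALUES (last_user_seek), (last_user_scan),
                         (last_user_lookup), (last_user_update)) AS x(v)) AS last_access
    FROM sys.dm_db_index_usage_stats
) AS u ON u.database_id = d.database_id
WHERE d.database_id > 4                       -- skip the system databases
GROUP BY d.name
HAVING MAX(u.last_access) IS NULL OR MAX(u.last_access) < @cutoff
ORDER BY d.name;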
View 6 Replies
Mar 25, 2015
How can you install the System Databases to a drive other than the default?
I want the Data Files to be installed on D:\MSSQL\Data and the log files to be stored on D:\MSSQL\Log.
View 9 Replies
Oct 27, 2015
I have a 2-node cluster with 4 cores each, running 3 instances of SQL 2008 R2 Enterprise comprising 60 databases, 20 per instance. I need to set up mirroring for each of the databases to a secondary server that has 4 cores and 3 instances. What I understand is that in this case the mirror server will provide a maximum of 512 worker threads, and the 60 mirror databases would consume 240 threads. What needs to be checked to assess the feasibility of going ahead with an async mirror setup as described above?
View 0 Replies
Oct 18, 2013
I have system database and user database files on the G, H and W drives. The process is going to be: copy data from G to S, H to T and W to U; rename G to X, H to Y and W to Z; rename S to G, T to H and U to W; then reboot the servers. The original G, H and W will then be X, Y and Z; the old S will be the new G, the old T will be H and the old U will be W. My question is whether SQL Server will start after doing this.
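A quick sanity check worth running before and after the swap (just a sketch): every physical path listed here must exist under the new drive letters, or the affected databases, and the instance itself in the case of master, will not come online. The master and error log locations also appear in the service startup parameters, so check those as well.

SELECT DB_NAME(database_id) AS database_name,
       name                 AS logical_name,
       physical_name
FROM sys.master_files
ORDER BY database_name, file_id;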
View 8 Replies
Mar 13, 2014
I can easily query multiple servers using the multi-server query function in Central Management Server and write some of the results to logging tables. I would like to be able to do this via a scheduled job. So far I am finding that even with Master/Target Servers set up this may not work, and the only workarounds are SSIS, SQLCMD (basically hard-coding the server names), or possibly PowerShell.
Can anyone tell me if they have been successful just using standard jobs and querying against multiple servers?
If I can't save the results to a 'central' database/table (I can do this when in SSMS), but can still query against multiple servers I was thinking I could write the results to a CSV file that a SSIS job picks up.
I have attempted using SSIS to iterate through servers and have been plagued with intermittent connection issues when using a For...Loop container.
View 1 Replies
Apr 12, 2015
I'm looking at installing 2008R2 and 2014 side by side, then using Mirroring to provide HA for the 2008R2 instance and AoHA for the 2014 instance. I'd be using the same two physical servers for both the Mirroring pair and the AoHA pair.
View 2 Replies
Mar 30, 2015
We are consolidating some old SQL Server environments from 'OLD' to 'NEW', and one of our vendors is protesting about the collation we use on our 'NEW' SQL Server.
Our old server (SQL 2005) contains databases with collation SQL_Latin1_General_CP1_CI_AS
Our new server (2014) has the standard collation Latin1_General_CI_AS
Both collations have CI and AS
From experience I know databases with different collations can reside next to each other on the same instance.
The only problem could be ('could be'!) the use of tempdb with a high volume of transactions to be executed in tempdb, combined with choosing Snapshot Isolation Level.
The application the databases belong to is very static, hardly updated, and queried only a few times per hour (so no tempdb issue, I guess).
Are there any issues with databases using different collations running on the same instance?
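For illustration, a small sketch (table and column names are made up) of the kind of error mixed collations tend to produce once tempdb gets involved, and the usual workaround:

CREATE TABLE #lookup (code VARCHAR(10));         -- temp table columns inherit tempdb's collation

SELECT o.code
FROM dbo.Orders AS o                             -- column uses the user database's collation
JOIN #lookup AS l
  ON o.code = l.code COLLATE DATABASE_DEFAULT;   -- without COLLATE: "Cannot resolve the collation conflict..."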
View 5 Replies
Jun 23, 2015
I am playing around in a test environment with SQL Server 2014. I have a question about the default location of the report server databases when you have multiple report server instances installed on one server.
I did a very simple install of SQL Server 2014 with the database and Reporting Services in Native Mode (install only) features selected. Accepting the default locations, I ended up with the following locations as you would expect:
C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER
C:\Program Files\Microsoft SQL Server\MSRS12.MSSQLSERVER
Running the Reporting Services Configuration Manager, I created the Report Server database. After creating the Report Server database, the related files will be located below in the SQL folder as I would expect.
C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA
Next I installed another instance of SQL Server 2014, which I called Test, in the same way as above. I now have the following folder structure for the Test instance, as I expect:
C:\Program Files\Microsoft SQL Server\MSSQL12.TEST
C:\Program Files\Microsoft SQL Server\MSRS12.TEST
Running the Reporting Services Configuration Manager, I select the Test instance and create the Report Server Test instance database.
My first thought would be that the Test instance Report Server database would be in the following location:
C:\Program Files\Microsoft SQL Server\MSSQL12.TEST\MSSQL\DATA
Instead this database is located with the default instance Report Server database:
C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\DATA
View 2 Replies
Nov 12, 2014
I read that when a SQL Server database has multiple data files within a single filegroup, SQL Server writes data using a proportional fill algorithm, where the amount of data written to a file is proportional to the amount of free space in that file compared to the other files in the filegroup.
So if no filegroups are explicitly created and multiple secondary files are added to the database, is data stored and written across those files by the same algorithm, or in a different way?
View 2 Replies
Jan 30, 2015
I am trying to create a logon trigger. As I was testing this, I discovered that each time I make a connection I get 19 rows inserted into my audit table. I ran Profiler and can see it going through the logon trigger multiple times for a single connection. So, what am I doing wrong? The code is fairly simplistic, and Profiler doesn't give a clue as to what is going on. When I look at the output, the SPIDs for the first couple of connections are different, then a SPID that is different from those two appears in the next 17 rows. But when I run sp_who2, that SPID does not exist.
This issue was first noticed on a 2012 instance I was testing on, then the same thing happened on a 2008 R2 instance. I am currently testing on a 2014 instance, which is doing the same thing. Is the logon trigger itself firing and causing this?
I also tried using the After Logon option, and got the same issue.
Here is the code (a variant that also records ClientHost and IsPooled is sketched after it):
CREATE TRIGGER LogonAuditTrigger
ON ALL SERVER WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
DECLARE @Body NVARCHAR(2000),
[code]....
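For what it's worth, the trigger fires once per physical connection, and a single "connection" from a tool is often several: SSMS alone opens separate connections for Object Explorer, IntelliSense and the query window, and pooled connections re-fire the trigger as well. A sketch (the audit table name is made up) that also captures ClientHost and IsPooled from EVENTDATA() can make the source of the extra rows visible:

CREATE TRIGGER LogonAuditTrigger
ON ALL SERVER WITH EXECUTE AS 'sa'
FOR LOGON
AS
BEGIN
    DECLARE @e XML = EVENTDATA();
    INSERT INTO master.dbo.LogonAudit (LoginName, ClientHost, IsPooled, EventTime)
    VALUES (@e.value('(/EVENT_INSTANCE/LoginName)[1]', 'NVARCHAR(128)'),
            @e.value('(/EVENT_INSTANCE/ClientHost)[1]', 'NVARCHAR(128)'),
            @e.value('(/EVENT_INSTANCE/IsPooled)[1]',   'INT'),
            @e.value('(/EVENT_INSTANCE/PostTime)[1]',   'DATETIME'));
END;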
View 0 Replies
Mar 6, 2008
Based on our database infrastructure, we need to secure our SQL databases. The security concern is that only a limited number of Domain Admin users should be able to access the SQL databases.
We tried certain approaches based on the documents on the Microsoft web site, but we couldn't reach the point of preventing the Domain Admin users from accessing the SQL databases.
Thanks in advance.
View 5 Replies
May 7, 2014
A little background on what I am trying to achieve first. We are moving to Azure virtual machines and we will have 8 disks on the SQL Server box. I am adding more files to the primary file group and each file will go on its own drive. I am then rebalancing data across these files by rebuilding all of the indexes on the tables which is working fine. No problems so far all is good.
I now have an additional problem. If there is a LOB or BLOB column on the table, rebuilding the clustered index and all the nonclustered indexes doesn't rebalance the LOB data across the disks as it does with in-row data.
I cannot find any articles on rebalancing LOB data, because all the articles say to move it to a new filegroup. I do not want a new filegroup; I just want to use the primary filegroup where the data already resides, and redistribute it evenly in the same way that works fine for in-row data.
One solution I thought about was to BCP data out of the table, truncate the table and then BCP back into the table which I imagine would have the desired effect of distributing the data evenly over the files.
View 2 Replies
Apr 27, 2015
On one server we had file growth, and we then had to add a new hard drive with a new data file on it. Now we have a new server with a huge hard drive, but all of the files remain. Can I consolidate these files into one data file or not?
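One way this is commonly done, sketched with placeholder names: EMPTYFILE migrates the extra file's data into the remaining files of the filegroup, after which the empty file can be removed (the primary data file itself cannot be removed this way).

USE [YourDatabase];

DBCC SHRINKFILE (N'SecondaryDataFile2', EMPTYFILE);      -- pushes its data into the other files
ALTER DATABASE [YourDatabase] REMOVE FILE [SecondaryDataFile2];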
View 3 Replies
Oct 30, 2015
When viewing an estimated query plan for a stored procedure with multiple query statements, two things stand out to me and I wanted to get confirmation if I'm correct.
1. Under <ParameterList><ColumnReference... does the xml attribute "ParameterCompiledValue" represent the value used when the query plan was generated?
<ParameterList>
<ColumnReference Column="@Measure" ParameterCompiledValue="'all'" />
</ParameterList>
</QueryPlan>
</StmtSimple>
2. Does each query statement that makes up the execution plan for the stored procedure have its own execution plan? Meaning, is the stored procedure made up of multiple query plans that could have been generated at different times from other parts of that stored procedure?
View 0 Replies
Aug 27, 2015
I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check on membership in the securityadmin role.
I have tested it and can see you can grant execute on xp_readerrorlog but the SSMS GUI uses sp_readerrorlog.
I thought I could create a user/certificate and add the signature to sp_readerrorlog but it's not permitted (likely because it's not a normal database object).
So the other solution is to add the users to the securityadmin role but then explicitly deny alter any login (best done with a custom server role in 2012+ but otherwise just manually in 2008). I tested this out and it works, I'm not able to alter any logins or increase my own permissions, I also did a check of what's reported from fn_my_permissions(null, null) and it shows minimal permissions like I'd expect.
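A sketch of the arrangement described above, using SQL Server 2012+ syntax and placeholder role/login names:

CREATE SERVER ROLE [errorlog_readers];

ALTER SERVER ROLE [securityadmin]    ADD MEMBER [DOMAIN\HelpdeskUser];
ALTER SERVER ROLE [errorlog_readers] ADD MEMBER [DOMAIN\HelpdeskUser];

-- Take back the dangerous part that securityadmin would otherwise grant
DENY ALTER ANY LOGIN TO [errorlog_readers];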
View 0 Replies
Apr 14, 2015
I inherited a lot of servers to upgrade to 2014, including an SSRS server.
The encryption key was never backed up, and it seems that no one knows what the password is.
Do I have to manually load the reports? There are a lot of Reports.
[URL]
View 4 Replies
Sep 26, 2013
I want to use BCP to load data from a text file.
By default, constraints are turned off in bcp, so I use the CHECK_CONSTRAINTS hint.
bcp aborts if ANY of the rows contains an FK violation. No data gets loaded.
So if I add the -b 1 batch size option, it loads all data UNTIL the first FK violation, but nothing after that.
I want to load EVERYTHING ... except for the violations. But bcp won't let me. Is there a way?
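bcp itself won't skip just the violating rows; a common workaround (sketched with made-up table and column names) is to bulk load into a constraint-free staging table and then insert only the rows that satisfy the foreign key:

BULK INSERT dbo.OrdersStaging
FROM 'C:\load\orders.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

INSERT INTO dbo.Orders (OrderID, CustomerID, Amount)
SELECT s.OrderID, s.CustomerID, s.Amount
FROM dbo.OrdersStaging AS s
WHERE EXISTS (SELECT 1
              FROM dbo.Customers AS c
              WHERE c.CustomerID = s.CustomerID);    -- violating rows simply stay behind in staging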
View 2 Replies
Jan 13, 2014
If I install an instance with Windows-only authentication and then later change it to mixed mode, when I enable the sa login the password has already been set. What is the default? If it's generated, how secure is it? What algorithm is used for that?
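Whatever setup stored originally, it can simply be overwritten once mixed mode is enabled, so there is no need to rely on it; the password below is obviously a placeholder.

ALTER LOGIN sa WITH PASSWORD = N'UseAStrongPasswordHere!1';
ALTER LOGIN sa ENABLE;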
View 9 Replies
Mar 21, 2014
My databases in SQL Server 2014 show the status "suspect" in SQL Server Management Studio. I can't bring the databases back to a serviceable condition through standard procedures; I need to recover them from the .mdf files.
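With no usable backup, the usual last-resort sequence is sketched below (the database name is a placeholder). REPAIR_ALLOW_DATA_LOSS can discard data, so restoring from a backup is always preferable when that option exists.

ALTER DATABASE [YourDatabase] SET EMERGENCY;
ALTER DATABASE [YourDatabase] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (N'YourDatabase', REPAIR_ALLOW_DATA_LOSS);
ALTER DATABASE [YourDatabase] SET MULTI_USER;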
View 9 Replies
Jun 18, 2014
I am using a monitoring system that can monitor a numeric SQL result, assuming the result is one field and one row. I would like to use this to monitor, say, the free available space or percentage on the master database. DBCC SQLPERF gives me several columns and results for all databases on the server.
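One possible sketch that returns a single row and a single column, which a monitoring tool can poll directly: free space in MB across the master data files.

-- Run this connected to the master database: FILEPROPERTY only sees the current database's files.
SELECT CAST(SUM(size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024.0
            AS DECIMAL(10, 2)) AS free_space_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';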
View 2 Replies
Jun 25, 2014
In our environment, applications use a DNS name which points to the physical server IP address. Now we are planning to move to 2014. We are planning to have servers in different subnets, so we will have two IP addresses for the listener. How can we point the DNS name to the listener IPs? If a failover happens, can DNS point to the exact IP address of the listener on the primary node?
View 1 Replies
Jul 31, 2014
Is there a way to schedule a SQL job to run at different intervals? (One possible approach is sketched after the list below.)
For eg:
The job should run at
7:00 AM
8:00 AM
and then at 10:00 AM
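A single SQL Agent job can carry several schedules, so one sketch (job and schedule names are placeholders) is simply to attach a daily schedule for each of the three times:

EXEC msdb.dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 7am',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 070000;
EXEC msdb.dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 8am',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 080000;
EXEC msdb.dbo.sp_add_jobschedule @job_name = N'MyJob', @name = N'Daily 10am',
     @freq_type = 4, @freq_interval = 1, @active_start_time = 100000;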
View 3 Replies