SQL Server Admin 2014 :: Check If UNC Path Exists (It Is Folder - Not File)
Oct 29, 2015
The usual way to check whether a file exists:
DECLARE @File_Exists INT
EXEC master.dbo.xp_fileexist '\\server\BSQL_Backup\file.bak', @File_Exists OUT
PRINT @File_Exists
-- prints 1
To check a folder you can append "nul" to the path, but that does not work for a UNC path:
DECLARE @File_Exists INT
EXEC master.dbo.xp_fileexist '\\server\BSQL_Backup\nul', @File_Exists OUT
PRINT @File_Exists
-- prints 0
If I use xp_subdirs, like:
EXEC master.dbo.xp_subdirs '\\server\BSQL_Backups'
and the folder doesn't exist, I get:
Msg 22006, Level 16, State 1, Line 3
xp_subdirs could not access '\\Server\BSQL_Backups\*.*': FindFirstFile() returned error 67, 'The network name cannot be found.'
How can I check whether a UNC folder exists before doing a backup? In my code I want to verify the UNC folder exists before running the backup; the UNC path is retrieved from another table or from the backup history.
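One workaround is to capture the full result set of xp_fileexist, which returns three columns (file exists, file is a directory, parent directory exists), and test the directory column. A minimal sketch, assuming the SQL Server service account can reach the share and using a hypothetical UNC path (xp_fileexist is undocumented, so behavior may vary by version):

DECLARE @Results TABLE (File_Exists INT, File_Is_Directory INT, Parent_Directory_Exists INT)
INSERT INTO @Results
EXEC master.dbo.xp_fileexist '\\server\BSQL_Backups'
IF EXISTS (SELECT 1 FROM @Results WHERE File_Is_Directory = 1)
    PRINT 'UNC folder exists'
ELSE
    PRINT 'UNC folder not found'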
Due to a SQL 2014 cluster installation failure related to security while setting up the primary node, I had to remove the node and all related installation programs from it and redo the installation. However, I am unable to use the machine name as the SQL network name during the instance configuration; I get the following:
Microsoft.SqlServer.Chainer.Infrastructure.InputSettingValidationException: The SQL Server failover cluster instance name "MachineName" already exists as a server on the network. Specify a different failover cluster instance name.
I am still getting the same error since the node name "MachineName" is listed under the cluster name. I have used a machine name as the SQL network name before without an issue, and there is no existing SQL Server on the network using the machine name I want to use for this installation.
I have a Windows Server 2012 R2 two-node cluster with a SQL Server 2014 FCI installed. Data files are on a separate Windows Server 2012 R2 file server. The data file share grants Full Control to the SQL Server service and SQL Server Agent service accounts, and the NTFS permissions are also Full Control.
When I try to attach a database:
CREATE DATABASE AdventureWorksDW2012
ON (FILENAME = '\\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorksDW2012_Data.mdf')
FOR ATTACH
I get this error:
Msg 5120, Level 16, State 101, Line 4
Unable to open the physical file "\\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorksDW2012_Data.mdf". Operating system error 5: "5(Access is denied.)".
If I log into the file server (called APRICOT) and look at the NTFS permissions they all look good. I have also reapplied the NTFS permissions from the root folder down.
EDIT: If I log on to one of the nodes in the cluster as the SQL Server service account and navigate to \\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA, I can copy and paste the data file without a problem.
EDIT 2: If I log on to the file server and enable inheritance at the root level, then replace all child object permission entries with inheritable permission entries from this object, I still get this error:
User Account Control settings on all nodes and the file server are set to Never notify
I've been running the Ola Hallengren maintenance scripts for the last five months without missing a beat. Today I found an error stating that the user database integrity check job failed last night. This is running on SQL Server 2014 BI edition with 64 GB of RAM.
I ran DBCC CHECKDB on each database manually, and all worked until I tried it on the biggest one, which is about 18 GB. It just kept running, and I eventually stopped it, so I'm guessing it is memory, but that doesn't make sense considering the server has 64 GB. I have max/min server memory set to 64/4. Again, this was never an issue until last night. I've been looking it up all morning but not seeing much on this error: "The operating system returned error 1453".
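For context, Windows error 1453 is "Insufficient quota to complete the requested service", which tends to indicate memory/working-set pressure while CHECKDB runs (it builds an internal database snapshot) rather than corruption in the 18 GB database itself. One stopgap while investigating is a lighter-weight check; a minimal sketch, with a hypothetical database name:

DBCC CHECKDB (N'BigDB') WITH PHYSICAL_ONLY, NO_INFOMSGS

PHYSICAL_ONLY skips the logical consistency checks, so a full CHECKDB should still be run once the memory situation is resolved.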
I wrote the script below to print all folders and files located in the share path. How can I extend it to add another column indicating whether each item is a folder or a file, as a 0 or 1?
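The undocumented xp_dirtree procedure can return that flag directly when its third argument is 1; a minimal sketch, with a hypothetical share path:

DECLARE @Tree TABLE (subdirectory NVARCHAR(260), depth INT, [file] INT)
INSERT INTO @Tree
EXEC master.sys.xp_dirtree '\\server\share', 0, 1   -- depth 0 = all levels, third arg 1 = include files
SELECT subdirectory, depth, [file]                  -- [file] is 1 for a file, 0 for a folder
FROM @Tree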
- An MSSQL 2014 Standard server that houses multiple small databases (in excess of a hundred).
- These databases are frequently dropped and restored by an application that uses this SQL Server.
- There is a business need for this setup at this time, so I can't get away from it. Answers like "don't have so many small databases that are frequently dropped and restored" therefore won't be useful.
This is the problem I have:
- When I connect SSMS 2014 to the server and expand the "Databases" node, it takes forever to display. In comparison, SSMS 2008 connected to SQL 2008R2 server with the same number of databases displays the Databases tree very quickly.
I ran a trace to see what exactly SSMS 2014 is doing. When the "Databases" node is expanded, it runs a query that checks each database for memory-optimized tables (a new and wonderful feature of SQL 2014 for sure, but I'm not using it, at least not yet). Naturally, when you have to loop through over a hundred DBs, it takes time. Worse yet, if one of these DBs is in the process of being restored, the query sits and waits to time out before proceeding to the next DB. Sometimes this causes outright timeouts. Here is the query:
USE [MyDatabase]
SELECT ISNULL((SELECT TOP 1 1
               FROM sys.filegroups FG
               WHERE FG.[type] = 'FX'), 0) AS [HasMemoryOptimizedObjects]
To be sure, this is NOT a SQL Server performance issue. This server processes a rather heavy workload and has been doing so for over a month, and the workload completes within expected time limits or better. Even so I've done some basic performance measuring, and the server itself is quite all right.
Moreover, if I connect SSMS 2008 to it, I get an error message (index out of bounds or some such), but SSMS 2008 does connect and displays the Databases tree much faster than SSMS 2014.
I'd like to turn off the option to check for Memory Optimized Objects altogether, as I'm not using the feature.
Previously the same records existed in both the table holding the primary key and the table holding the foreign key. We have found that 7 records were lost from the primary key table, but the same records still exist in the foreign key table.
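A quick way to quantify the damage is to look for orphaned child rows; a minimal sketch, with hypothetical table and column names:

SELECT c.*
FROM ChildTable c                               -- the table holding the foreign key
LEFT JOIN ParentTable p ON p.Id = c.ParentId
WHERE p.Id IS NULL                              -- the parent (primary key) row is missing

Rows returned here are the foreign key records whose primary key rows were lost.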
For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.
Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
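Pre-allocating is generally the cheaper option, since autogrowth pauses activity while a file expands. A minimal sketch of adding a pre-sized file, with hypothetical database, filegroup, and path names:

ALTER DATABASE MyDatabase
ADD FILE (NAME = N'MyDatabase_Data5',
          FILENAME = N'D:\Data\MyDatabase_Data5.ndf',
          SIZE = 70GB, FILEGROWTH = 4GB)
TO FILEGROUP [DataFG]

Note that SQL Server's proportional fill will favor the new, emptier file for writes until the files even out.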
On one server we had file growth, so we had to add a new hard drive and a new file on it. Now we have a new server with a huge hard drive, but all the files remain. Can I consolidate these files into one data file, or not?
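Yes: DBCC SHRINKFILE with the EMPTYFILE option migrates a file's data into the remaining files in the same filegroup, after which the file can be removed. A minimal sketch, assuming a hypothetical secondary file name (the primary .mdf itself cannot be removed):

USE MyDatabase
DBCC SHRINKFILE (N'MyDatabase_Data2', EMPTYFILE)   -- move all data out of this file
ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_Data2

Expect this to be slow and I/O-heavy on large files, so it is best done in a maintenance window.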
I need to create a script task in SQL Server 2008 R2 to check if a file exists in a directory, for example, to see if output.dat exists under C:\results. If the file exists, send out an email stating the file exists; if not, send out another email stating the file does not exist. I noticed there is a huge difference between the script task in SQL 2005 and SQL 2008 R2.
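If the check can run on the database server rather than in a package, a T-SQL sketch using xp_fileexist and Database Mail is one alternative to the script task (the profile name, recipient, and path are hypothetical, and Database Mail must already be configured):

DECLARE @File_Exists INT
EXEC master.dbo.xp_fileexist 'C:\results\output.dat', @File_Exists OUT
IF @File_Exists = 1
    EXEC msdb.dbo.sp_send_dbmail @profile_name = N'DefaultProfile',
         @recipients = N'ops@example.com',
         @subject = N'output.dat exists'
ELSE
    EXEC msdb.dbo.sp_send_dbmail @profile_name = N'DefaultProfile',
         @recipients = N'ops@example.com',
         @subject = N'output.dat is missing'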
My SQL databases in SQL Server 2014 show the status "suspect" in SQL Server Management Studio. I can't restore the databases to a serviceable condition through the standard procedures. I need to recover from the .mdf file.
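If there is truly no usable backup, the usual last-resort sequence is EMERGENCY mode followed by a repair. This is a sketch only, with a hypothetical database name; REPAIR_ALLOW_DATA_LOSS can lose data, so exhaust backup-based options first:

ALTER DATABASE MyDB SET EMERGENCY
ALTER DATABASE MyDB SET SINGLE_USER
DBCC CHECKDB (N'MyDB', REPAIR_ALLOW_DATA_LOSS)
ALTER DATABASE MyDB SET MULTI_USER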
The log file of a production database has grown from a 4 GB to a 150 GB initial size. Now I want to find out when it grew, by how much, and which transactions were responsible.
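SQL Server's default trace normally captures autogrow events, so you may be able to see when the log grew and by how much; a minimal sketch (EventClass 93 is Log File Auto Grow):

DECLARE @TracePath NVARCHAR(260)
SELECT @TracePath = path FROM sys.traces WHERE is_default = 1
SELECT DatabaseName, FileName, StartTime,
       IntegerData * 8 / 1024 AS GrowthMB   -- IntegerData is the growth in 8 KB pages
FROM sys.fn_trace_gettable(@TracePath, DEFAULT)
WHERE EventClass = 93
ORDER BY StartTime DESC

The default trace rolls over, so it only reaches back a limited time; tying growth to a specific transaction usually requires Extended Events or reading the transaction log itself.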
1. I have SQL 2008 R2 running on my localhost.
2. Created database [Customer].
3. Created linked server [CUSTOMERLINK] using Microsoft Jet 4.0 to link to drive F:\Data, which has DBF files in it.
4. Created table dbo.Customer_Upload.
5. INSERT INTO [Customer].[dbo].[Customer_UpLoad] ([Name],[Email]) SELECT NAME, EMAIL FROM [CUSTOMERLINK]...[CUS]
All this works fine. I can even put it in to an After Insert Trigger on another table and it works.
My problem is that I need this to work in a scheduled job.
F:\Data is just a folder with files in it.
This info is from a Restaurant POS system and I need to update it every night.
I have tried every which way to sort out the security issue: there isn't any login security on the folder, but SQL Server Agent wants security.
I have a database that is 50 GB with hundreds of columns. I would like to choose a certain column and export its data to a .csv or Excel file. How can I do that? I am very new to MSSQL...
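One common approach is the bcp utility run from a command prompt; a minimal sketch, with hypothetical server, database, table, and column names:

bcp "SELECT SomeColumn FROM MyDatabase.dbo.MyTable" queryout "C:\temp\SomeColumn.csv" -c -t, -S MyServer -T

Here -c exports character data, -t, sets a comma delimiter, and -T uses Windows authentication; the resulting .csv opens directly in Excel. The SSMS Import/Export wizard is a point-and-click alternative.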
I had to reinstall my local copy of SQL Server a few weeks ago, which naturally overwrote the
msdb.dbo.sysmanagement_shared_server_groups_internal and msdb.dbo.sysmanagement_shared_registered_servers_internal tables.
However, I still have the local XML file that SSMS reads, so I can still access the groups; I just get odd errors when trying to re-register my install as the new CMS. How can I rebuild those tables from the XML file, or otherwise repopulate them?
We have one table where we store all documents in a column called "Doc" with the varbinary(max) data type.
We want to download those documents from the SQL table to Windows Explorer, so I wrote a BCP command in SQL 2005, and things were fine.
The format file I used there looks like this:
9.0
1
1 SQLBINARY 0 0 "" 1 Doc ""
Now we are on 2014, and when I try the same code with the same format file, it hangs in the middle. So I changed the version in the file to 12.0 instead of 9.0, but it is still not working.
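For reference, a minimal sketch of the export command that goes with such a format file (server, database, table, and paths are hypothetical):

bcp "SELECT Doc FROM MyDB.dbo.Documents WHERE DocID = 1" queryout "C:\temp\Doc1.bin" -S MyServer -T -f "C:\temp\Doc.fmt"

If 12.0 still hangs, testing the same command against one small document can help separate a format-file problem from a size/timeout problem.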
Is there a way to restore all filegroups except one? Example: database A has 10 filegroups, but one of them is defunct, so I can't delete it and there's no backup to restore it from. Can I create a new DB by restoring the 9 good filegroups from a backup of database A?
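A piecemeal (partial) restore can bring back just the filegroups you name; a minimal sketch, with hypothetical filegroup, logical file, and path names:

RESTORE DATABASE A_Copy
    FILEGROUP = 'PRIMARY', FILEGROUP = 'FG1', FILEGROUP = 'FG2'   -- list only the good filegroups
FROM DISK = N'C:\Backups\A.bak'
WITH PARTIAL,
     MOVE 'A_Data' TO N'C:\Data\A_Copy.mdf',
     MOVE 'A_Log'  TO N'C:\Data\A_Copy_log.ldf',
     RECOVERY

PRIMARY must always be included in a partial restore; the filegroup you leave out simply stays unrestored in the new database.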
I have a Windows 8 PC that I just got and installed SQL Express 2014 on. My buddy has Windows 7 and installed SQL Express on his PC. We created a DB on his PC, did a backup, and copied the backup to my PC. In SSMS I right-click "Databases" > Restore Database, click Device and the button to find my file. I navigate to the folder where the file shows in File Explorer, but the .bak file does not show in SSMS to restore from. This is probably a Windows thing, but I don't know what to look at.
I was running an operation to shrink a data file with EMPTYFILE and then remove it.
It blocked and caused a huge mess, I suspect on the removal part. But I want to confirm that the EMPTYFILE completed (and that the engine isn't going to put more data in there before I schedule the removal again a week or more from now).
How does the engine know not to put any more data in there, and how long does that situation last?
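A successful DBCC SHRINKFILE ... EMPTYFILE flags the file so the engine stops allocating new extents in it. One way to confirm the file is actually empty is to compare allocated versus used space; a minimal sketch, run in the affected database (file names are whatever sys.database_files reports):

SELECT name, type_desc,
       size * 8 / 1024 AS SizeMB,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024 AS UsedMB
FROM sys.database_files

If UsedMB for the emptied file is at or near zero, the data movement completed. The no-new-allocations state is not guaranteed to survive an instance restart, so re-running EMPTYFILE immediately before the REMOVE FILE step is a safe habit.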
When I execute DBCC SQLPERF(LOGSPACE), I get the following values:
Database Name   Log Size (MB)   Log Space Used (%)
master          16.17969        13.30275
tempdb          7.429688        61.7245
model           0.7421875       45.78947
msdb            5.554688        25.87904
distribution    2808.93         0.8172179
BANKDB          23438.87        48.20037
WSMIRSDB        109.7422        4.839111
For database BANKDB, Log Space Used (%) is about 48% and the log size is 23,438.87 MB (about 23 GB), whereas the database size of BANKDB is 60 GB. A full database and log backup is done once every night. The database is performing slowly now.
Do we need to take log backups more frequently, say once an hour, so that less log space is used? The same query is taking more time to execute than before in the same database; is it because the log file has grown?
I reorganize and rebuild indexes once a week and update statistics nightly.
Is it correct that once log space used increases by more than 10% we need to take a log backup?
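In the FULL recovery model, only log backups allow log space to be reused, so more frequent log backups do keep Log Space Used (%) down (and shrink the potential data loss window). A minimal sketch of an hourly-style log backup, with a hypothetical destination path:

BACKUP LOG BANKDB
TO DISK = N'\\backupserver\sql\BANKDB_log.trn'   -- hypothetical UNC path
WITH COMPRESSION, CHECKSUM

Scheduling this in an Agent job every 15-60 minutes is a common pattern. Log size by itself generally doesn't slow queries, so the slowdown is more likely plans or statistics than the 23 GB log.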
Database File Placement Layout? We are planning to implement a new SQL Server 2014 OLTP Database with a 1 TB Data file and 1 TB Log File. I am looking at the possible layout of the database files and trying to determine the best possible configuration. My knowledge/research tells me that items which need separate storage due to constant simultaneous access are:
Data files – should go on the fastest reading storage.
Log files – should go on the fastest writing storage.
TempDb – involves a lot of writing at the same time the data files are being read.
Indexes (including full-text indexes) – involve a lot of writing at the same time the data files are being read.
Also, is there any benefit to having multiple OLTP database log files? Because SQL Server writes to the log file sequentially, I do not see any advantage to having multiple log files. In a SQL Server 2012 class I took last summer, under "Determining File Placement and Number of Files", it states: "Use a single log file in most situations as log files are written sequentially."
I recently installed standalone version of SQL 2014 Standard on my work computer. I used Access before but I want to use a SQL server instead.
We have a shared drive that a file gets deposited every day at midnight. I want to be able to get this file and import it to the server (its basically a list of names).
Here what I have done so far:
I created the database
Created the file and successfully imported data into it using the Import Data feature.
I saved the SSIS package
Scheduled an Agent Job for this package to run at certain time,daily
At first the jobs would fail with an "Access is denied" error. I added a user under Credentials with my network account (I have admin rights on the work computer) and also added a proxy for the credential user I made.
Now the jobs fail with a "Cannot open data file" error. I've tried changing things here and there, but I can't get it to work.
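For reference, a minimal sketch of how the credential/proxy wiring usually looks for an SSIS job step (the account, password, and names are hypothetical):

CREATE CREDENTIAL SSISFileCred WITH IDENTITY = N'DOMAIN\svc_account', SECRET = N'password'
EXEC msdb.dbo.sp_add_proxy @proxy_name = N'SSISFileProxy', @credential_name = N'SSISFileCred'
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SSISFileProxy', @subsystem_id = 11   -- 11 = SSIS
-- then set "Run as: SSISFileProxy" on the job step

The account named in the credential (not your own login) is what needs read access to the shared drive, so "Cannot open data file" often means that account lacks rights on the share.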
I have installed SQL 2014 (Evaluation Version) on a testing machine. We want to import some Excel files into a database. I manually created one test database and am now trying to import an Excel file. The import completes successfully, but I am not able to see any table created as a result of the import. I tried it 3-4 times and even restarted the SQL services, but no luck.
In SSIS, I need an easy way to see if a file exists, and if not wait for it until a timeout period expires. Here are the options I've discovered, along with the issues I've had:
a) The File Watcher task from www.sqlis.com
This was my first attempt. The task works great, BUT it only detects a change on the file. If the file already exists, it keeps waiting, which is not the behavior I need.
b) The WMI Event Task
There is very sparse documentation on this event and on how to write a WQL query. There are numerous examples of monitoring a folder and firing an event if any file appears, but I need to detect a specific file. I found maybe one example of this using "PartComponent" but wasn't able to get the syntax right to make it work for me. I also need to access a remote file share using a UNC path (e.g. \\servername\path\file.txt), which I could not get to work.
c) Script Task using the File.Exists() method
I imported the System.IO namespace and used File.Exists("\\servername\path\file.txt") with actual success, but am not sure of the best way to keep waiting if the file is not found immediately. I also want to modularize this approach so I can wait for several files simultaneously, so I was thinking of implementing this script task as a package by itself that accepts variables (file path & timeout period), but I'd like to know if anyone has had success with this approach. I'm open to suggestions or ways to get options a) and b) to work for my needs. Thanks! Kory
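For comparison, a plain T-SQL polling loop (outside SSIS) can implement the same wait-with-timeout; a minimal sketch, with a hypothetical UNC path and a 10-minute timeout:

DECLARE @exists INT = 0,
        @deadline DATETIME = DATEADD(MINUTE, 10, GETDATE())
WHILE @exists = 0 AND GETDATE() < @deadline
BEGIN
    EXEC master.dbo.xp_fileexist '\\servername\path\file.txt', @exists OUT
    IF @exists = 0 WAITFOR DELAY '00:00:10'   -- poll every 10 seconds
END
PRINT @exists   -- 1 = file found, 0 = timed out

The same loop shape (check, sleep, re-check against a deadline) also works inside option c)'s script task with File.Exists().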