I was in the process of migrating a server from one physical box to another. They are identical drive setups, same OS (2003), same SQL install (2005). Our server team did a 'PlateSpin', which copies the drives from one server to another as long as the files are not in use. I did not reinstall SQL on the new box; I let the PlateSpin tool copy everything over for me. I then stopped the SQL services on the old and new servers and copied over all of the system database (.mdf & .ldf) files. As soon as I started up the services on the new server, it looked great with one exception: tempdb was only showing one data file. When I queried sys.master_files, it was showing me 8 tempdb files. I tried restarting the services, but I still saw only the one file. I then tried to re-add the tempdb files with the same names, but that errored saying they already existed. I could, however, add new files with different names and they showed up fine; but after a restart they would not show up in the properties of tempdb.
When I queried sys.master_files again, I now had 16 tempdb files listed in the results. I deleted all but the original single recognized file out of sys.master_files, re-added the additional 7 files with the original names, restarted the service, and then they all appeared.
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays and the like. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.
I have a server which is not running optimally and I checked the default trace. I have around 600 entries in the default trace which are all Missing Column Statistics, and the database is tempdb. is_auto_create_stats_on and is_auto_update_stats_on are both 1 for tempdb.
Hi all, I have a tempdb that consists of 8 data files, tempdb_data_1 to tempdb_data_8, each 8GB. Now how can I drop 7 of them and leave only tempdb_data_1? Can this be done? Thanks a lot.
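From what I've gathered so far, something like the following might be the way to do it: a file can only be removed once it is empty, so each extra file has to be emptied first, and a service restart is sometimes needed before the empty succeeds. Just a rough sketch using the file names from above:

USE tempdb;
-- Push any pages out of the extra file, then drop it.
DBCC SHRINKFILE (tempdb_data_2, EMPTYFILE);
ALTER DATABASE tempdb REMOVE FILE tempdb_data_2;
-- ...repeat the same two statements for tempdb_data_3 through tempdb_data_8.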
I'm having an argument with our infrastructure architect, who has just gone and bought lots of SSD drives to use for our tempdb data and log files; sounds great, doesn't it? There is a catch though: his plan is to add the disks to the two available slots in each blade in a RAID 0+1 configuration, effectively giving you one usable drive, and to put both data and log files on that one disk.
I then pointed out that SQL Server best practice is to host tempdb data and log files on two separate drives to reduce contention. The architect basically said that because this isn't spinning disk, read/write contention on the drive isn't an issue. I don't agree with this and wanted to get some opinions from the community. I'm still advising that two separate disks should be used, but someone just went and spent £80k ($150k) on SSDs and doesn't want to back down...
I'm looking for documentation on the placement of tempdb files in the root of a drive, i.e. T:\ instead of T:\tempdb. I am positive this is not a best practice, but when challenged I could not find any documentation that would support that view.
It's been a long time since I've tried this, but I have a SQL Server that needs to be restored (including master) to a server whose drives and corresponding folders match the source server, with the exception of tempdb. When SQL Server initially starts I believe it will fail since it cannot find tempdb. I just don't recall if it fails to startup or if it starts up reporting errors and recreates tempdb in the same location as master. Does anyone recall the steps needed to point SQL Server to the new location of tempdb?
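From what I recall, the instance will not start if the configured tempdb path doesn't exist, so the approach is to start it with minimal configuration (-f) or trace flag 3608 so tempdb isn't needed, repoint the files, and then restart normally. A rough sketch, assuming the default logical names tempdev and templog and a placeholder path:

-- Run while connected to the minimally-started instance; takes effect at the next restart.
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = N'T:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = N'T:\TempDB\templog.ldf');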
So we have new servers that are going to be installed with SQL 2012 and I'm debating the wisdom of splitting tempdb with multiple files.
I know it's a myth that performance automatically improves if you split it into a number of files based on processors, but I'm debating the wisdom of putting a file on each of my data / log file drives.
For instance, I have a server with a C: drive (OS), D: drive (Data for system DBs and install of programs - 458 GB), an F: drive for user DB data files (767 GB), and a J: drive for log files (255 GB).
Obviously no files are going on C:. I'm debating whether or not we should even leave the system DBs on the D: drive, given that on our current 2K8 servers we end up with Memory.dmp files overflowing the D: drives, as well as .cab files and other install/update files that tend to collect on that drive over the years.
But if we leave the system DBs on D:, I'm wondering if adding a second tempdb file to F: and a third to J: will improve query performance or not.
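If we do go that route, I assume the statements would look something like the sketch below; the file names, folders and sizes are placeholders, and my understanding is the files should be sized the same so proportional fill stays even:

ALTER DATABASE tempdb
ADD FILE (NAME = N'tempdev2', FILENAME = N'F:\TempDB\tempdb2.ndf', SIZE = 1GB, FILEGROWTH = 256MB);
ALTER DATABASE tempdb
ADD FILE (NAME = N'tempdev3', FILENAME = N'J:\TempDB\tempdb3.ndf', SIZE = 1GB, FILEGROWTH = 256MB);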
We had someone create an extra data file and log file for tempdb. So we currently have two data files and two log files. Is it possible to delete the newly created data and log files? If I just delete the physical files, I assume they'll get created as soon as SQL Server gets started back up. Any help would be great, since a single data and log file for tempdb is my goal. Thanks much. Sean
Have a SQL 2008 R2 instance on a VM where the single .mdf for the tempdb database is located on a high-contention disk. I've managed to get another 60GB disk and thought it would be a good time to move the .mdf and also increase its size and number of files.
The server has 12 cores, and after a bit of reading I've decided it would be best to have just four files for this database, as the one-file-per-core (minus one) guidance seems to be disputed.
-- Move the existing file to the new disk and rename it.
ALTER DATABASE tempdb MODIFY FILE (NAME='tempdev', FILENAME='E:\SQLData\tempdb0.mdf');
-- Change the size to 1GB
ALTER DATABASE tempdb MODIFY FILE (NAME='tempdev', SIZE=1048576KB, FILEGROWTH=5%);
-- Add three new files, all with the same size & growth
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev1', FILENAME = N'E:\SQLData\tempdb1.mdf', SIZE = 1048576KB, FILEGROWTH = 5%);
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev2', FILENAME = N'E:\SQLData\tempdb2.mdf', SIZE = 1048576KB, FILEGROWTH = 5%);
ALTER DATABASE [tempdb] ADD FILE (NAME = N'tempdev3', FILENAME = N'E:\SQLData\tempdb3.mdf', SIZE = 1048576KB, FILEGROWTH = 5%);
-- Now restart the instance.

Also, what are people's thoughts on percentage growth for tempdb? I've read that it's not recommended, and yet it seems to be the norm.
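If percentage growth really is discouraged, I assume I'd just switch each file to a fixed increment instead; the 256MB below is only a placeholder value:

ALTER DATABASE tempdb MODIFY FILE (NAME = N'tempdev',  FILEGROWTH = 256MB);
ALTER DATABASE tempdb MODIFY FILE (NAME = N'tempdev1', FILEGROWTH = 256MB);
ALTER DATABASE tempdb MODIFY FILE (NAME = N'tempdev2', FILEGROWTH = 256MB);
ALTER DATABASE tempdb MODIFY FILE (NAME = N'tempdev3', FILEGROWTH = 256MB);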
I have a tempdb split into 4 files (5 if you include the log).
Autogrowth is disabled on the mdf/ndf files so that they can be used round robin (1 file per logical CPU).
Is there a way to be alerted when there is x% of free space left?
I know how to check the free space via T-SQL but want to be alerted. I could run a SQL job that reports the free space and sends a database mail message if it drops under x%, but wondered if there was a built-in (or better) method?
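The fallback I had in mind is an Agent job step along these lines; just a sketch, and the 10% threshold, mail profile and recipient are placeholders:

-- Alert if tempdb free space drops below a threshold.
DECLARE @threshold_pct INT = 10;
DECLARE @free_pct DECIMAL(5,2);

SELECT @free_pct = 100.0 * SUM(unallocated_extent_page_count) / SUM(total_page_count)
FROM tempdb.sys.dm_db_file_space_usage;

IF @free_pct < @threshold_pct
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBA_Mail',          -- placeholder profile
        @recipients   = 'dba@example.com',   -- placeholder address
        @subject      = 'tempdb free space low',
        @body         = 'tempdb free space is below the configured threshold.';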
I was in the process of creating additional TempDB.ndf files, and received an error saying they already exist. I checked the location and it was empty, nothing to see here. So I looked in sys.master_files and there are several tempdb files listed in various locations, all of which come up empty.
So the files are listed as online in sys.master_files, but they do not exist on the server. I restarted SQL services but it did not change anything.
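For reference, this is roughly the check I've been running to see which entries are stale; just a diagnostic sketch, assuming the file_id numbering matches between the two views:

-- Entries master knows about that tempdb itself does not report as active.
SELECT mf.name, mf.physical_name, mf.state_desc
FROM sys.master_files AS mf
LEFT JOIN tempdb.sys.database_files AS df
       ON df.file_id = mf.file_id
WHERE mf.database_id = DB_ID('tempdb')
  AND df.file_id IS NULL;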
Just found that my tempdb is always full whenever I run a query against a large database. Could any experts here please give me some advice on what the tempdb database is used for and how to determine what files can be deleted from it?
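From what I've read so far, tempdb holds temporary tables, worktables for sorts and hashes, and the version store, and it is recreated at every restart, so I don't think there are files to delete by hand. This is the kind of query I've been pointed to for seeing which category is taking the space (a sketch using the standard space-usage DMV):

SELECT
    SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb,
    SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
    SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
    SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_mb
FROM tempdb.sys.dm_db_file_space_usage;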
I am looking forward to hearing from you and thanks a lot in advance.
I'm running a procedure which does INSERT INTO table_name (id, name, ...) SELECT id, name, ... FROM table_name. For some reason the tempdb data file grows up to 200GB. The tempdb is set to expand unrestricted by 10%. How can I prevent that from happening? Thanks.
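One stopgap I'm considering is capping the data file and using a fixed growth increment so it can't swallow the whole drive, though I realize that just turns the growth into an out-of-space error for the offending statement. A sketch, with placeholder sizes and assuming the default logical name tempdev:

ALTER DATABASE tempdb
MODIFY FILE (NAME = 'tempdev', MAXSIZE = 51200MB, FILEGROWTH = 512MB);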
My Snapshot Agent for SQL Server 2005 sometimes does not fully complete. I am expecting around 2073 files in the snapshot folder but only get 492 (this number does change). I have immediate_sync set to true, so I should get a full snapshot every time. It seems to quit after all data has been BCP'ed out and before the schema generation starts. I do get the "The replication agent has not logged a progress message in 10 minutes. This might indicate an unresponsive agent or high system activity. Verify that records are being replicated to the destination and that connections to the Subscriber, Publisher, and Distributor are still active. " message in the MSSnapshot_history table.
I am guessing that there is some sort of blocking that is preventing the snapshot agent from continuing and then it times out and dies. But that is a total guess as I have not been able to observe the behavior, only the outcome.
Any suggestions as to what the cause is or how I can fix it?
I am currently investigating a high average write time issue (145 ms) which seems to be occurring only on the tempdb data files. I have followed the recommended setup for tempdb, in that:
1. Data files = number of physical cores.
2. Data files and log files are on separate partitions, away from the other databases.
3. Tempdb is pre-sized and no incremental file growths appear to be happening with any frequency.
We have SharePoint 2012 set up on other SQL Servers with tempdb configured following the same guidelines, and with far more SharePoint activity on similarly specified hardware, which is why it's confusing. File I/O auditing on the partitions themselves shows that the I/O is very fast on the partitions the tempdb data files sit on, which leads me to believe that SharePoint may be the culprit, perhaps due to excessive use of tempdb with operations taking a long time to resolve.
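For what it's worth, this is roughly how I'm measuring the per-file write latency, using the standard virtual file stats DMV (figures are cumulative since the last restart):

SELECT mf.name,
       mf.physical_name,
       vfs.num_of_writes,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_ms
FROM sys.dm_io_virtual_file_stats(DB_ID('tempdb'), NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id;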
Hi, I can't work with my SQL Server 2000. When I try to open a table I get the message: "The query cannot be executed because some files are missing or not registered. Run setup again to make sure the required files are registered." I uninstalled and reinstalled (a few times), but I get this message every time. Does anyone know how to fix it? Thanks, Efrat
I want to skip running the SSIS data flow task when the source file is missing. We have a scheduler that copies the source file to the staging area. This SSIS package runs as a SQL Server job, so when the SSIS package fails due to a missing file, the remaining steps in the SQL job won't execute. I want to handle the missing source file condition gracefully. Please advise.
I don't know if anyone has faced this issue. We are having a strange problem. Our process worked well when it was implemented on a 32-bit processor; it ran perfectly for 6 months without a problem. But when we moved the packages to a 64-bit machine, this issue, along with some other issues, started to show up.
The issue is we are missing files in the source folder.
Our process is designed such that a source process brings in a file and updates a status for the file in an audit table. The ETL process picks up the file, assigns the status 'running' once the source process is complete, loads it into the target DB, and updates the ETL status to complete. But the current problem is that the ETL is losing files after it assigns the status 'running'. When we looked into the DB to see whether the data was loaded, we could not find any data related to these files.
We have mapping-level parameters for the source path and target path.
We are using a For Each Loop task and processing the files (which are simple flat files) in the source path. The file name is stored in the mapping-level parameter. Once a file is processed we move it into a target path.
Our source and target file paths are on the same drive; we just have a src folder, and inside the src folder we have a processed folder and a failed folder. So files are picked up from the source folder and moved into the processed folder after processing. The missing files are not even moved to the failed folder.
There is a lot of other processing going on on this box, and the trend observed is that when more processes are running at peak hours, the missing files' count is higher.
Right now we are re-fetching those files as a workaround, but does anyone have any suggestions as to why this is happening, or any better implementation suggestions?
I have a package in which I have enabled "Package Configuration".
When I run the package, I am sure that it reads the configuration file and executes the package correctly.
However if I remove the configuration file, the package still executes correctly with the settings which were used at the time of development.
I have event handlers for OnError and OnWarning and both these are NOT invoked.
IMO, this is incorrect behavior, because if a package has been set up for package configuration, then we should at least get a warning that SSIS did not find the configuration and will execute the package with hard-coded values (from the time of development).
Is there any workaround for this? How can I make SSIS warn me if the config file is missing for a package which was configured to use package configurations?
Been practicing DR strategies with a test SQL instance by following the scenarios listed here: [URL] ....
> Took a backup of the model database
> Stopped SQL Server
> Deleted the model database data & log files
> Started SQL Server, and it obviously wouldn't start because tempdb needs a model database present
> Started the SQL instance with trace flags 3608 & 3609
> Connected to the SQL instance using the command prompt
> Issued the restore command but was met with this error:
Shared Memory Provider: The pipe has been ended. Communication link failure
And found this in the SQL log..
2015-08-12 16:21:32.83 spid51 Starting up database 'tempdb'.
2015-08-12 16:21:36.88 spid51 Error: 3456, Severity: 21, State: 1.
2015-08-12 16:21:36.88 spid51 Could not redo log record (59:136:21), for transaction ID (0:0), on page (1:20), allocation unit 458752, database 'tempdb' (database ID 2). Page: LSN = (30:165:3), allocation unit = 458752, type = 1.
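For what it's worth, the sequence I believe is documented for restoring model in this state (hedging here, since I clearly hit a snag) is to start the instance so that only master is recovered and then issue the restore from a sqlcmd session; the backup path below is a placeholder:

-- Assumes the instance was started with: net start MSSQLSERVER /m /T3608
-- (single user, master-only recovery), and that this is run from sqlcmd.
RESTORE DATABASE model
FROM DISK = N'D:\Backups\model.bak'
WITH REPLACE;
-- Then restart the instance normally.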
When I downloaded the files and burned the DVD to set up the 7031A - Upgrading to Microsoft SQL 2005 course, all of the executable files (BCBase.exe, Upgrade.exe, InPlace.exe, SideBySide.exe and SQL7Upgrade.exe) were missing from the Setup/Drives folder; the only item in that folder is a file called "placeholder.txt".
Please, if anyone knows how to get these executables, let me know.
I have to load into SS2012 hundreds of Excel files produced by an application over the last five years; over time a few columns have been added to the initial set. I created a table in SS2012 to match the full set of columns and want to load all the files into the table, leaving the missing cells NULL. I think SSIS can do the job, but every trial has failed so far.
In order to use an SSIS package as a data source in a report, I need to enable the SSIS extension in the RSReportDesigner.config and RSReportServer.config files. That extension is in neither of these files. I have SSIS running on my machine with Reporting Services.
The path to RSReportServer.config: C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer
The path to RSReportDesigner.config: C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies
Why is this extension not in either of these files?
The following is an abbreviated list of what's present in the RSReportServer.config file:
<Data>
  <Extension Name="SQL"
  <Extension Name="SQLAZURE"
  <Extension Name="SQLPDW"
  <Extension Name="OLEDB"
[Code] ....
The following is an abbreviated list of what's present in the RSReportDesigner.config file:
<Data>
  <Extension Name="SQL"
  <Extension Name="SQLAZURE"
  <Extension Name="SQLPDW"
  <Extension Name="OLEDB"
[Code] ....
I'm running SQL Server 2008 R2
ProductVersion  ProductLevel  Edition
10.50.1600.1    RTM           Enterprise Edition (64-bit)
The operating system is Windows 7 Professional 64 bit.
In SQL Studio, I can go to the restore window and click "Verify backup media". This checks the restore plan listed in the window and reports whether some of the files are missing. Is there any way to do this with T-SQL? I know there is a "restore verify" command, but that only verifies one backup/file and not the complete restore path. I'm looking for a T-SQL solution which can check that all necessary backup files are still present on the file system and have not been moved or deleted (e.g. through a cleanup task).
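What I've sketched so far walks msdb's backup history for a database and tests each physical file; it leans on the undocumented xp_fileexist procedure, and the database name and date window are placeholders, so treat it as a starting point only:

-- List the backup files msdb knows about and check whether each still exists on disk.
DECLARE @results TABLE (physical_device_name NVARCHAR(260), file_exists INT);
DECLARE @file NVARCHAR(260), @exists INT;

DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT bmf.physical_device_name
    FROM msdb.dbo.backupset AS bs
    JOIN msdb.dbo.backupmediafamily AS bmf
      ON bmf.media_set_id = bs.media_set_id
    WHERE bs.database_name = N'MyDatabase'                       -- placeholder database
      AND bs.backup_finish_date >= DATEADD(DAY, -7, GETDATE());  -- placeholder window

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC master.dbo.xp_fileexist @file, @exists OUTPUT;
    INSERT INTO @results VALUES (@file, @exists);
    FETCH NEXT FROM file_cursor INTO @file;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;

SELECT * FROM @results;   -- file_exists = 0 means the backup file is gone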
I have configured windows failover clustering 2012 on 4 of my test nodes.
I am trying to add another node into this cluster but it's not happening. I am not even able to start the cluster service in services.msc.
After installing Windows failover clustering, when I go to the C:\Windows\Cluster folder, I am unable to find the CLUSDB, CLUSDB.1.container, CLUSDB.2.container and CLUSDB.blf files in the folder.
These files are very much present on the other nodes where cluster service is running.
I tried copying these files manually to the server where they are missing, but still no luck.
I scheduled an automatic backup process, but it is only backing up one .sql file to the backup folder. The other created .sql files are not backed up. Why is that?
"tempdb is skipped. You cannot run a query that requires tempdb"?
We're running a .Net web application with a SQL Server 2000 backend, and we get the error intermittently. Restarting the SQL Server service seems to fix it, as it causes tempdb to be rebuilt, but this isn't a long term solution. Any direction or hints would be greatly appreciated. Thanks! - Mike
In the For Loop, how can I iterate from older flat files to newer flat files based on the files' timestamps? If there are older files in the folder, they should be processed first before continuing with the newer ones.
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this:
1. If no file(s) are found, stop executing and send an email saying no file(s) were found.
2. If file(s) are found, compare them with the existing files in our archive folder; if the file(s) already exist in the archive folder, stop executing and send an email saying the file(s) already existed; if the file(s) are not in the archive folder yet, transfer them to the local directory for processing.
I know I have to use a script task to do this, and I did some research and found examples for each of the above 2 steps but not for both combined, so that's why I need some help here to get the logic incorporated right.
Thanks for the help in advance, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    ' The execution engine calls this method when the task executes.
    ' To access the object model, use the Dts object. Connections, variables, events,
    ' and logging features are available as static members of the Dts class.
    ' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.
    Public Sub Main()
        Dim cDataFileName As String
        Dim cFileType As String
        Dim cFileFlgVar As String

        ' Reset all the file-type flag variables, then set the one matching the current file.
        WriteVariable("SCFileFlg", False)
        WriteVariable("OOFileFlg", False)
        WriteVariable("INFileFlg", False)
        WriteVariable("IAFileFlg", False)
        WriteVariable("RCFileFlg", False)

        cDataFileName = ReadVariable("DataFileName").ToString
        cFileType = Left(Right(cDataFileName, 4), 2)   ' two-letter type code from the file name
        cFileFlgVar = cFileType.ToUpper + "FileFlg"
        WriteVariable(cFileFlgVar, True)

        Dts.TaskResult = Dts.Results.Success
    End Sub

    ' Helper: write a package variable value with the proper lock.
    Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForWrite(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                vars(varName).Value = varValue
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
    End Sub

    ' Helper: read a package variable value with the proper lock.
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
It works remotely if I run it via the command prompt, but when I add this to a T-SQL job step on my remote SQL instance, it runs without deleting anything. What am I missing?