DB Engine :: Auto Adding New Files?
Jul 2, 2015
I use Management Studio on the SQL Server. Each time I want to run scripts over new data, I have to delete the old tables in a database and import new ones (from CSV files into dbo tables). These are the same files every time, except that the data change. Is it possible to make an automated process for this import?
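One way to automate this is a SQL Server Agent job that truncates the table and re-imports the file with BULK INSERT. A minimal sketch, assuming the CSV layout matches the table; the table name and file path below are hypothetical:

-- Hypothetical example: reload dbo.MyData from a CSV file
TRUNCATE TABLE dbo.MyData;

BULK INSERT dbo.MyData
FROM 'C:\Import\MyData.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2          -- skip the header row
);

Scheduling that script as an Agent job step removes the manual delete/import cycle.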
View 2 Replies
Apr 21, 2015
Is it possible/advisable to change this setting with users connected? There are a number of web based users and an agent job running every 30 seconds.
USE [master]
GO
ALTER DATABASE [Bla] SET AUTO_SHRINK OFF WITH NO_WAIT
View 5 Replies
View Related
Feb 20, 2008
I have to import a CSV file into a database via SSIS and so far everything works fine. But now I have to add a column into which an incremental int should be inserted, so every row gets a unique number. I tried to realize this by using a Derived Column transformation, but without success. Does someone have an idea how to do this?
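If computing the number inside the data flow proves awkward, one alternative (a sketch, assuming you control the destination table; table and column names are hypothetical) is to let SQL Server assign the value through an IDENTITY column, so every loaded row gets a unique incrementing number without any SSIS logic:

-- Hypothetical destination table: RowId is generated automatically on insert
CREATE TABLE dbo.ImportedRows (
    RowId INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Col1  VARCHAR(100),
    Col2  VARCHAR(100)
);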
View 10 Replies
View Related
Jul 23, 2015
Is it possible to manually force/call/start the system AUTOSHRINK process? I have an issue that appears only when the engine shrinking process is running and I need this to reproduce my bug.
I know how to start a "regular" database shrink with DBCC SHRINKDATABASE(xxxx);, but this is not the same as a shrink started by the database engine itself.
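There is no documented command that starts the background autoshrink task directly. One workaround for reproducing the behaviour (a sketch, assuming a scratch database named BugRepro is acceptable; the autoshrink task only acts on databases with the option enabled and a large proportion of free space) is to enable AUTO_SHRINK and then deliberately create free space for the background task to reclaim:

-- Hypothetical scratch database
ALTER DATABASE BugRepro SET AUTO_SHRINK ON;

USE BugRepro;
-- Grow the data file by creating a large table, then drop it so the file is mostly empty
SELECT TOP (1000000) a.* INTO dbo.Filler
FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b;
DROP TABLE dbo.Filler;
-- The background autoshrink task should eventually shrink the file; watch for the bug while it runs.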
View 13 Replies
View Related
Jul 15, 2015
We have a SQL Server 2012 EE SP2 server that is getting hit by reports that, from time to time, use up to 300 GB of tempdb.
It's happening because our clients can use 'bad' parameters for some reports.
Is there a way to auto-kill sessions that overuse tempdb (for example, if a session uses more than 15 GB in tempdb)?
Killing can happen immediately or after a regular check (every 5 minutes, for example).
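There is no built-in per-session tempdb limit, but a scheduled Agent job can approximate one. A rough sketch using sys.dm_db_session_space_usage; the 15 GB threshold and the check interval are illustrative, and allocated page counts are converted at 8 KB per page:

DECLARE @kill NVARCHAR(MAX) = N'';

SELECT @kill = @kill + N'KILL ' + CAST(s.session_id AS NVARCHAR(10)) + N'; '
FROM sys.dm_db_session_space_usage AS s
WHERE s.session_id > 50   -- skip system sessions
  AND (s.user_objects_alloc_page_count
     + s.internal_objects_alloc_page_count) * 8.0 / 1024 / 1024 > 15;   -- more than ~15 GB allocated
     -- (these counters are cumulative allocations; subtract the *_dealloc_page_count columns for a stricter "currently used" check)

EXEC (@kill);   -- schedule this in an Agent job, e.g. every 5 minutes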
View 3 Replies
View Related
Jul 26, 2007
Hi guys, I followed the ASP.NET official tutorial to create a DAL & Business Logic Layer (http://www.asp.net/learn/dataaccess/tutorial02cs.aspx). I have a table with an int ID field. I wish to write a function to add a new entry into this table but have the ID field auto-increment. The ID field is set as the Identity Column and has an Identity Increment & Seed of "1". If I manually go to the table and insert a new record leaving the ID value null, it automatically increments. But if I create a C# function to add a new entry, I get an error saying that the ID field can't be null. Is there any way to use the Update method, as in the Adapter.Update call below, to add a new entry with the ID automatically incrementing? I did create a function called InsertDevice that simply inserts the other fields using a SQL INSERT and it auto-increments fine; I'm just wondering if there is a way to do it using the DataTable and the Update method. Thanks for any help!

public bool AddDevice(string make, string model)
{
    // cannot have the same device entered twice!
    if (Adapter.FillDeviceCountByMakeModel(make, model) == 1)
        return false;

    RepositoryDataSet.DevicesDataTable devices = new RepositoryDataSet.DevicesDataTable();
    RepositoryDataSet.DevicesRow device = devices.NewDevicesRow();

    device.make = make;
    device.model = model;

    devices.AddDevicesRow(device);   // << error thrown here!
    int rows_affected = Adapter.Update(devices);

    return rows_affected == 1;
}
View 3 Replies
View Related
Nov 21, 2014
We have implemented a very small reporting database which has a main table that started off small and has now grown to around half a million rows. Initially, there were no indexes on the table apart from a clustered index, but as the data has grown, performance has dropped and so we have added a number of indexes. This has resolved the performance issues.
Before creating the indexes, SQL Server had auto-created a number of statistics objects (_WA_Sys_000... etc). After creating the indexes, new statistics objects were created for the new indexes. In some cases, there are duplicate statistics (auto and index) for the same columns. Should I go through and drop the duplicate auto statistics? Will having duplicates cause issues at all?
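Duplicate auto-created statistics are generally harmless but add maintenance overhead, so many DBAs drop the _WA_Sys_ statistic once an index covers the same leading column. A query like the following (a sketch) lists auto-created statistics whose leading column matches the leading key column of an index on the same table:

SELECT OBJECT_NAME(s.object_id) AS table_name,
       s.name AS auto_stat,
       i.name AS index_name,
       c.name AS leading_column
FROM sys.stats AS s
JOIN sys.stats_columns AS sc
  ON sc.object_id = s.object_id AND sc.stats_id = s.stats_id AND sc.stats_column_id = 1
JOIN sys.index_columns AS ic
  ON ic.object_id = s.object_id AND ic.key_ordinal = 1 AND ic.column_id = sc.column_id
JOIN sys.indexes AS i
  ON i.object_id = ic.object_id AND i.index_id = ic.index_id
JOIN sys.columns AS c
  ON c.object_id = s.object_id AND c.column_id = sc.column_id
WHERE s.auto_created = 1;
-- Review the list, then DROP STATISTICS tablename.statname for any duplicates you decide to remove.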
View 2 Replies
View Related
Oct 19, 2005
Hello all, I'm using SS2K on W2K. I've got a table, say "Orders", with two fields in the PK: OrderDate and CustomerID. I would like to add an "ID" column which would be auto-increment (and would be the new PK). But I would really like to have the orders with the oldest OrderDate get the smallest ID number and, for a same OrderDate, to have the smallest CustomerID first. So my question is: how could I add an auto-increment column to a table and have it create its values in a particular order (sorted by OrderDate then CustomerID here)? In the real situation, the table I want to modify has around 500k records, the PK has 5 fields and I want to sort on three of them. Thanks for your help. Yannick
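One approach is to rebuild the table rather than alter it in place: INSERT ... SELECT with an ORDER BY into a table that already has the IDENTITY column is generally documented as assigning the identity values in that ORDER BY order. A sketch, with the remaining columns elided and table names as placeholders:

-- Hypothetical rebuild of Orders with an ordered identity column
CREATE TABLE dbo.Orders_New (
    ID         INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    OrderDate  DATETIME NOT NULL,
    CustomerID INT NOT NULL
    -- ... remaining columns ...
);

INSERT INTO dbo.Orders_New (OrderDate, CustomerID /*, ... */)
SELECT OrderDate, CustomerID /*, ... */
FROM dbo.Orders
ORDER BY OrderDate, CustomerID;   -- identity values are assigned in this order

-- Then drop the old table and rename Orders_New.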
View 7 Replies
View Related
Apr 22, 1999
Hi,
We will shortly be receiving msg files via MS Exchange and wish to import the data into a temporary table for processing into other tables in the database.
Is there a way that I can get SQL Mail to read the mail and export the message data into a temporary table?
Here is an example of the mail message content:
NAME = xxxxx xxxxxxxxxxxx
ADDRESS = x xxxxxx xxxxxxxxxxx xxxxxxxxxxxx
CITY = xxxxxxxxxxxx
STATE = xx
ZIP = xxxxxxxx
COUNTRY = xxxxxxxxxx
EMAIL = xxxxxxx@xxxxxxxxxx.xx
HOME_PHONE = xxxxxxxxxxxx
BUS_PHONE = xxxxxxxxxxxx
GENDER = xxxxxx
AGE_RANGE = xx-xx
CURRENT_BIKE_MAKE = xxxx
CURRENT_BIKE_MODEL = xxxx
REPLACE_YEAR = xxxx
REPLACE_MONTH = xxxxxxxxxxxxxxxxx
REQUEST_INFORMATION_ON = xxxxxxx
Davy
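Whichever mechanism delivers the message text (SQL Mail's xp_readmail in that era, or an external process that writes the body to a table), the name/value pairs themselves can be split in plain T-SQL. A minimal parsing sketch, with a hard-coded sample body standing in for the real message:

DECLARE @body VARCHAR(8000), @line VARCHAR(500), @pos INT;
CREATE TABLE #MailFields (FieldName VARCHAR(100), FieldValue VARCHAR(400));

-- sample body; in practice this would come from the mail message
SET @body = 'NAME = John Smith' + CHAR(13) + CHAR(10)
          + 'CITY = Springfield' + CHAR(13) + CHAR(10)
          + 'STATE = IL';
SET @body = @body + CHAR(10);              -- make sure the last line is terminated

WHILE CHARINDEX(CHAR(10), @body) > 0
BEGIN
    SET @pos  = CHARINDEX(CHAR(10), @body);
    SET @line = REPLACE(LEFT(@body, @pos - 1), CHAR(13), '');
    SET @body = SUBSTRING(@body, @pos + 1, 8000);
    IF CHARINDEX('=', @line) > 0
        INSERT INTO #MailFields (FieldName, FieldValue)
        VALUES (RTRIM(LEFT(@line, CHARINDEX('=', @line) - 1)),
                LTRIM(SUBSTRING(@line, CHARINDEX('=', @line) + 1, 500)));
END;

SELECT * FROM #MailFields;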
View 1 Replies
View Related
Jun 30, 2015
Can we add multiple log files to an existing database?
View 8 Replies
View Related
Aug 15, 2014
I have configured AutoRecover (save information every 3 minutes, keep information for 7 days) under the SSMS 2014 menu: Tools -> Options -> Environment -> AutoRecover.
I've rebooted the box and restarted the SQL Server service and nothing seems to create the files.
View 4 Replies
View Related
Jul 6, 2015
I have a data warehouse database and its size is 2 TB, with the simple recovery model. I want to reduce its size because every day, before loading, the tables get truncated. Is it best practice to shrink the data files? The database has 5 data files and one log file. What is the best way to reduce it?
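Shrinking is usually discouraged because it fragments indexes and the files tend to grow again, but if the tables really are truncated before each load, some space may genuinely be reclaimable. A quick check of how much free space each file actually holds (a sketch; run in the database in question) can guide the decision:

SELECT name,
       size / 128                                     AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') / 128          AS used_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sys.database_files;

-- Only if a file is mostly empty and will stay that way:
-- DBCC SHRINKFILE (N'logical_file_name', target_size_in_mb);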
View 7 Replies
View Related
Oct 14, 2007
I am trying to add a transaction log file which has the same size and characteristics as the existing log file in SQL Server.
Would this be correct?
I'm not sure what the MAXSIZE or the FILEGROWTH of the existing log file would be.
ALTER DATABASE db1
ADD LOG FILE
( NAME = db1_log2,                        -- must differ from the existing logical file names
  FILENAME = 'D:\SQLLogs\db1_log2.ldf',   -- path is a placeholder
  SIZE = 1MB,
  MAXSIZE = UNLIMITED,                    -- placeholder: match the existing file's setting
  FILEGROWTH = 10% );                     -- placeholder: match the existing file's setting
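To match the existing log file's settings, they can be read from sys.database_files first (a sketch; size and max_size are stored in 8 KB pages, and growth is either pages or a percentage depending on is_percent_growth):

SELECT name,
       size / 128 AS size_mb,
       CASE max_size WHEN -1 THEN NULL ELSE max_size / 128 END AS max_size_mb,   -- -1 = unlimited
       growth,
       is_percent_growth
FROM sys.database_files
WHERE type_desc = 'LOG';

Those values can then be plugged into the MAXSIZE and FILEGROWTH clauses above.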
View 4 Replies
View Related
May 16, 2015
I have C, D, and E drives on the server. Data files will be on D and log files on E. My question is: what is best practice for the data and log files of the system databases during SQL Server installation? Should they be on the C drive along with the SQL Server installation, or on D & E? If they should not be on C, what is the reason, and what is the benefit of moving them to other drives?
View 9 Replies
View Related
Jul 25, 2001
Problem: I need to build several databases on a quarterly basis. The databases range in size from 30 GB to 250 GB. I want to keep each database file <= 20 GB, so my databases contain from 2 to 13 database files (I put these all in one filegroup; the filegroup is separate from the PRIMARY filegroup, which contains only the system tables in my databases).
I would like to create the files for the databases asynchronously (I have four physical drive letters on which to create the files and would like to be building one file on each drive simultaneously). I can achieve the asynchronous operation by creating a separate job for each of the drive letters and then calling sp_start_job for each of the jobs.
The problem is that the ALTER DATABASE command apparently locks the sysfiles table, and three of the four processes are always blocked, so I end up building the files serially instead of in parallel.
Is there a way to make these processes work in parallel?
View 1 Replies
View Related
Sep 15, 2015
How do I move database backup files (.bak) from one server to another server using xp_cmdshell?
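Assuming xp_cmdshell is enabled and the SQL Server service account has rights on the destination share, a minimal sketch (paths and server names are placeholders):

-- Enable xp_cmdshell if it is not already on
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

-- Copy all .bak files to the other server's share
EXEC master..xp_cmdshell 'robocopy "D:\Backups" "\\OtherServer\Backups" *.bak';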
View 2 Replies
View Related
Jul 15, 2015
We have the following requirement:
1. On a daily basis, automatically loop through each item in the invoice table.
2. Pass the invoice number into a summary SSRS report as a parameter.
3. Automatically save all generated PDF reports into a Windows folder with a specific file name format, i.e. <INVOICE_NO>_<DATE>.pdf.
How can this be achieved via SSRS, a stored procedure, or PowerShell?
View 4 Replies
View Related
Jan 24, 2008
We have a 1 TB database and we recently got more disk space, so:
1) Can I add data files and put them on a different disk during production hours?
2) What are the effects of doing this?
Just want to get expert advice.
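Adding a data file is an online operation; the main cost is the I/O of creating and zero-initialising the new file (much smaller if instant file initialization is enabled), and afterwards proportional fill will direct more new allocations to the emptier file. A hedged sketch, with database, file and path names as placeholders:

ALTER DATABASE MyBigDb
ADD FILE
( NAME = MyBigDb_Data2,
  FILENAME = 'F:\SQLData\MyBigDb_Data2.ndf',
  SIZE = 50GB,
  FILEGROWTH = 5GB
) TO FILEGROUP [PRIMARY];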
View 1 Replies
View Related
Jun 15, 2007
Hi all,
I have a package here which updates a DB from a flat file source. Now the problem is I may get multiple files. I have used a Foreach Loop to handle this; it picks up files based on the file names (the file names have a timestamp in them), but I want to process the files in order of their creation time.
Please help me on this. I have written a Script Task before the Foreach Loop and I have got the minimum creation date from all the files, but I am not able to get any further from here.
Does anybody have an idea?
View 4 Replies
View Related
Dec 12, 2005
I assume this is the best forum for this question; if not, please direct me.
View 3 Replies
View Related
Oct 23, 2015
Have a SQL2008R2 instance on a VM where the single .mdf for the tempdb database is located on a high-contention disk. I've managed to get another 60GB disk and thought it would be a good time to move the .mdf and also increase its size and the number of files.
The server has 12 cores and after a bit of reading I've decided that it would be best just to have four files for this database as the 1 file per core (-1) seems to be disputed.
-- Move the existing file to the new disk and rename it.
ALTER DATABASE tempdb MODIFY FILE (NAME='tempdev', FILENAME='E:\SQLData\tempdb0.mdf');
-- Change the size to 1GB
ALTER DATABASE tempdb MODIFY FILE (NAME='tempdev', SIZE= 1048576KB, FILEGROWTH=5%);
-- Add three new files, all with the same size & growth
ALTER DATABASE [tempdb] ADD FILE ( NAME = N'tempdev1', FILENAME = N'E:\SQLData\tempdb1.mdf' , SIZE = 1048576KB , FILEGROWTH = 5%)
ALTER DATABASE [tempdb] ADD FILE ( NAME = N'tempdev2', FILENAME = N'E:\SQLData\tempdb2.mdf' , SIZE = 1048576KB , FILEGROWTH = 5%)
ALTER DATABASE [tempdb] ADD FILE ( NAME = N'tempdev3', FILENAME = N'E:\SQLData\tempdb3.mdf' , SIZE = 1048576KB , FILEGROWTH = 5%)
-- Now restart the instance.
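After the restart it may be worth confirming the new layout took effect; a quick check against sys.master_files:

-- tempdb is database_id 2
SELECT name, physical_name, size / 128 AS size_mb, growth, is_percent_growth
FROM sys.master_files
WHERE database_id = 2;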
Also, what are people's thoughts on percentage growth for tempdb? I've read that it's not recommended, and yet it seems to be the norm.
View 4 Replies
View Related
Feb 24, 2014
When opening .sql files, I get a "Connect to Database Engine" prompt every single time. How do I stop this prompt and just use my current active connection?
View 4 Replies
View Related
Sep 17, 2014
When the database is configured for mirroring and you want to do partitioning on that database, how can we do it? Is the process similar, or is there any variation when adding filegroups and files? Will the partitioning be reflected in the mirrored database as well?
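Mirroring does not change the T-SQL for partitioning itself, and because everything is logged it is replayed on the mirror automatically. The main caveat is that the ADD FILE is replayed on the mirror using the same path, so that drive/folder must also exist on the mirror server or the mirroring session can be suspended. A minimal sketch of the usual steps, with names and paths as placeholders:

-- 1. Filegroup and file (the path must also exist on the mirror server)
ALTER DATABASE MyDb ADD FILEGROUP FG_2015;
ALTER DATABASE MyDb ADD FILE
( NAME = MyDb_2015, FILENAME = 'D:\SQLData\MyDb_2015.ndf', SIZE = 1GB )
TO FILEGROUP FG_2015;

-- 2. Partition function and scheme as usual
CREATE PARTITION FUNCTION pf_ByYear (DATE)
AS RANGE RIGHT FOR VALUES ('2015-01-01');
CREATE PARTITION SCHEME ps_ByYear
AS PARTITION pf_ByYear TO ([PRIMARY], FG_2015);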
View 1 Replies
View Related
Sep 25, 2015
I want to schedule a job which pulls files from a non SQL server (Sybase) which later needs to have a step 2 kicking the ssis package. Problem is that, on the source a batch file will run every 4 hours and outputs total of 10 text files. (takes 5 minutes complete). Now, on destination i want to pull these files via SQL job but while scheduling;
1. I don't see any option saying like 4 hours 10 minutes or so
2. If its out there, then i believe this might be a problem as this time would be an increment one e.g next run would be 4 hours 20 minutes in that case.
How should i achieve pulling these files up because we have an SSIS package on destination that needs those text files to be used as soon as they arrive on SQL server(destination)
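SQL Agent cannot express "every 4 hours and 10 minutes", but it may not need to: if the source batch always starts on a fixed 4-hour cycle, a plain 4-hour schedule whose start time is offset by 10 minutes runs at 00:10, 04:10, 08:10, and so on, with no drift. A sketch using sp_add_jobschedule; the job name is a placeholder:

USE msdb;
EXEC dbo.sp_add_jobschedule
    @job_name             = N'Pull Sybase Files',              -- placeholder
    @name                 = N'Every 4 hours, offset 10 min',
    @freq_type            = 4,        -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 8,        -- hours
    @freq_subday_interval = 4,
    @active_start_time    = 001000;   -- 00:10:00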
View 2 Replies
View Related
Feb 16, 2014
Since upgrading from SQL Server Management Studio 2008 R2, I've noticed that it no longer autosaves queries that have not been manually saved first. If a file has been manually saved the autorecover files end up in the following directory:
%appdata%\Microsoft\SQL Server Management Studio\11.0\AutoRecover\Dat\Solution1
However, I have ended up in the situation where I have unsaved queries when my computer has crashed and have not been able to recover them.
I have also found references to .sql files stored as temp files in the following directory, but the files here seem to be kept very haphazardly:
%userprofile%\AppData\Local\Temp
View 2 Replies
View Related
May 14, 2007
Hi, all experts here,
Thank you very much for your kind attention.
I am trying to modify the file paths (primary file, log file) of databases, but it looks like I am not able to modify their file paths directly from the database Properties dialog. Would any experts here please give me some ideas on what else I can try? Thanks a lot in advance, and I am looking forward to hearing from you shortly.
With best regards,
Yours sincerely,
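The file paths indeed cannot be edited from the Properties dialog. One usual approach (a sketch, with database, logical file and path names as placeholders) is ALTER DATABASE ... MODIFY FILE, then take the database offline, move the physical files, and bring it back online:

ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILENAME = 'E:\NewPath\MyDb.mdf');
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log,  FILENAME = 'F:\NewPath\MyDb_log.ldf');

ALTER DATABASE MyDb SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- Move the physical .mdf/.ldf files to the new locations in Windows, then:
ALTER DATABASE MyDb SET ONLINE;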
View 5 Replies
View Related
Jan 23, 2004
I have an MS SQL Server table with a Job Number field. I need this field to start at a certain number and then auto-increment from there. Is there a way to do this programmatically or within MSDE?
Thanks, Justin.
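Both approaches work in MSDE since it is the same engine: an IDENTITY property with a custom seed when the table can be created (or rebuilt) with it, or DBCC CHECKIDENT to reseed an existing identity column. A sketch with a hypothetical starting value of 5000 and placeholder names:

-- New table: start JobNumber at 5000 and increment by 1
CREATE TABLE dbo.Jobs (
    JobNumber   INT IDENTITY(5000, 1) NOT NULL PRIMARY KEY,
    Description VARCHAR(200)
);

-- Existing identity column: reseed so the next insert gets 5000
DBCC CHECKIDENT ('dbo.Jobs', RESEED, 4999);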
View 3 Replies
View Related
Sep 3, 2015
Is there any way or option to get all the columns of a dataset added to a table when we add a table to the data region? It takes a lot of time to add them one by one, and there is also a chance of adding a column more than once.
View 7 Replies
View Related
Aug 6, 2013
What is the syntax for adding a column where you are adding a year to a date, keeping the date format? For example, adding a column that displays the date one year after the participation date?
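DATEADD does this, either in a query or as a computed column on the table. A sketch; the table and column names are placeholders:

-- In a query:
SELECT ParticipationDate,
       DATEADD(year, 1, ParticipationDate) AS FollowUpDate
FROM dbo.Participants;

-- Or as a computed column on the table:
ALTER TABLE dbo.Participants
ADD FollowUpDate AS DATEADD(year, 1, ParticipationDate);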
View 1 Replies
View Related
Jan 9, 2015
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid-state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure it for optimal performance.
View 2 Replies
View Related
Mar 13, 2008
In the Foreach Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are older files in that folder, they should be processed first, and then continue with the newer ones.
Any Suggestions?
View 3 Replies
View Related
Apr 24, 2008
In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this:
1. If no file(s) found, stop executing and send email saying no file(s) found;
2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.
I know I have to use a Script Task to do this, and I did some research and found examples for each of the above two steps but not both combined, so that's why I need some help here to get the logic incorporated right.
Thanks for the help in advance, and I apologize for the long lines of code!
example for step 1:
----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class
example for step 2:
-------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
'Set the properties like username & password
cm.Properties("ServerName").SetValue(cm, "ftp.name.com")
cm.Properties("ServerUserName").SetValue(cm, "username")
cm.Properties("ServerPassword").SetValue(cm, "password")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
cm.Properties("Retries").SetValue(cm, "1")
'create the FTP object that sends the files and pass it the connection created above.
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
'Connects to the ftp server
ftp.Connect()
'ftp.SetWorkingDirectory("..")
ftp.SetWorkingDirectory("directoryname")
Dim folderNames() As String
Dim fileNames() As String
ftp.GetListing(folderNames, fileNames)
Dim maxname As String = ""
For Each filename As String In fileNames
' whatever operation you need to do to find the correct file...
Next
Dim files(0) As String
files(0) = maxname
ftp.ReceiveFiles(files, "C:\temp", True, True)
' Close the ftp connection
ftp.Close()
'Set the filename you retreive for use in data flow
Dts.Variables.Item("FILENAME").Value = maxname
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
View 16 Replies
View Related
Feb 24, 2015
I have the need to delete old backup files via TSQL job. Found this solution online:
PushD "\\remoteserver\share\DIFF" && (
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD
It works if I run it via a command prompt, but when I add it to a T-SQL job step on my remote SQL instance, the job runs without deleting anything. What am I missing?
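A common cause is that the T-SQL job step runs the command under the SQL Server Agent service account, which may not have the same drive mappings or permissions on the share as the interactive session. Two things worth trying (a sketch; the share path is the same placeholder as above): run the command as a CmdExec job step instead of T-SQL, or call forfiles with an explicit /P path through xp_cmdshell so no PushD is needed:

EXEC master..xp_cmdshell
    'forfiles /P "\\remoteserver\share\DIFF" /M *DIFF*.sqb /D -1 /C "cmd /c del /q @path"';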
View 6 Replies
View Related