I created a few databases but forgot to change the location from the default, C:\Program Files\d1.mdf. I don't have much space left on the C: drive but have a lot of space left on E:.
How do I move the data and log files to the E: drive?
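A minimal sketch of the usual approach on SQL Server 2005 or later, assuming a hypothetical database named MyDB with logical file names MyDB and MyDB_log (check yours with sp_helpfile); the paths are placeholders:

-- Point the catalog at the new locations (metadata only)
ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB,     FILENAME = N'E:\SQLData\MyDB.mdf');
ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_log, FILENAME = N'E:\SQLLogs\MyDB_log.ldf');
-- Take the database offline, move the .mdf and .ldf to the new folders in Windows, then bring it back
ALTER DATABASE MyDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE MyDB SET ONLINE;

On SQL Server 2000, where ALTER DATABASE generally cannot relocate user database files, the usual alternative is the detach/attach route (sp_detach_db, move the files, sp_attach_db).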
Can someone tell me how to move the log file for a database from one drive to another? I want to keep the data on one drive and the logs on another, so I need to move the log files to another drive. Thanks, Kelly
Without restoring the database, has anyone written a slick little piece of code that moves a file existing on one filegroup to another newly created filegroup, without doing it from a restore?
The purpose of doing this is to get rid of one data file by placing its data into the data file we are retaining. We have some more physical file space and are moving the data into one data file and one transaction log file.
We have already run DBCC SHRINKFILE with EMPTYFILE to move the data, and we have already run the ALTER DATABASE statement, but because the filegroup is identified as PRIMARY in the sysfilegroups table, we cannot REMOVE the data file because of indexes relating to the PRIMARY filegroup.
So if I could ALTER DATABASE to create a bogus filegroup, move the data file I want to get rid of into that bogus filegroup, and then ALTER DATABASE with REMOVE FILEGROUP, the situation would be solved.
Does anyone know of an easier way to do it other than BACKUP DATABASE and RESTORE DATABASE? Please help!
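For reference, the standard empty-and-remove sequence for a secondary data file is sketched below (MyDB and MyDB_Data2 are placeholder names). As far as I know, an existing file cannot be reassigned to a different filegroup after it is created, so the "bogus filegroup" route may not be available, and the primary .mdf itself can never be removed:

USE MyDB;
-- Push the data out of the file we want to drop into the remaining files of its filegroup
DBCC SHRINKFILE (MyDB_Data2, EMPTYFILE);
-- Once the file is empty it can be dropped
ALTER DATABASE MyDB REMOVE FILE MyDB_Data2;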
I am looping through a set of .txt files in a folder using a Foreach Loop.
Inside the Foreach Loop I have a Data Flow task that does the transformation. After that I have added a File System task that moves the file to an archive folder.
However, if the user puts any other type of file in the folder instead of a .txt file, I want to move that file to an Error folder, and this should happen before the Data Flow starts within the Foreach Loop.
How can I do this? I want to use a script to read the file extension and then move the file to the Error folder.
Can someone lend some assistance on this? It sounds like it should be able to be done. We have a large replicated database in SQL 2000. We need to move the mdf and ldf files to a location on another drive. Will doing this affect or break replication in any way? Is there any way to prevent this? Thanks in advance. Glenn Dekhayser, Voyant Strategies
I'm trying to find some way to implement a move-file task in my Data Flow. I parse through a flat file, retrieve a value, then use a lookup table to check if that value exists in a table... if the value doesn't exist, I need to move this file to a new directory and insert the value into a different table. Does anyone have any ideas about how to move the file while using the Data Flow pane?
Or is there some way to pass information from the lookup back to the Control Flow pane and use the File System task's Move File operation there? I'm trying to stay away from a Script Task or a Script Component unless it's unavoidable. I appreciate any ideas!
I could really use some help on this one. I know virtually nothing about SQL Server. We installed it because the database for the application we're running (LexisNexis Time Matters) had gotten too large for the Express 2005 edition. We've also installed a new version of the software, on a new server.
The new server has the drive partitioned into two partitions. When installing the application, I selected drive D (545 GB) as the data drive, but when I installed SQL Server, it was installed on drive C (12 GB). I did not realize that the application data was simply configuration information, and the data for the program is actually being installed on the C drive (Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data).
Is there a simple procedure to move this data, including the SQL instance (TIMEMATTERS), to the D: drive? Or should I uninstall and start over?
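If the goal is just to relocate the database files rather than reinstall the instance, a detach/attach sketch along these lines may be enough (TimeMatters and D:\SQLData are placeholder names; stop the application and take a backup first):

USE master;
EXEC sp_detach_db @dbname = N'TimeMatters';
-- Copy the .mdf and .ldf from the old Data folder to D:\SQLData in Windows, then:
CREATE DATABASE TimeMatters
ON (FILENAME = N'D:\SQLData\TimeMatters.mdf'),
   (FILENAME = N'D:\SQLData\TimeMatters_log.ldf')
FOR ATTACH;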
Hello guys and girls. I have installed SQL Server 2005 Standard Edition and specified that the databases should be created on the K: drive. This is okay, but now I need to move all the transaction log files (.ldf) to the L: drive. I have already changed the default location for the log files to point to the L: drive, and the new databases created after the installation have their transaction log files correctly on the L: drive, but now I need to move the transaction log files for the master, model, tempdb, etc. databases. How can this be done? And are there any gotchas?
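A rough sketch for the non-master system databases is below (placeholder paths; templog, modellog, and MSDBLog are the default logical names). The change takes effect after stopping the service, moving the physical .ldf files, and restarting; tempdb is rebuilt at startup, so its old file only needs deleting. master is the exception: its file locations come from the service startup parameters (-d for the data file, -l for the log), which are changed in SQL Server Configuration Manager rather than with ALTER DATABASE.

ALTER DATABASE tempdb MODIFY FILE (NAME = templog,  FILENAME = N'L:\SQLLogs\templog.ldf');
ALTER DATABASE model  MODIFY FILE (NAME = modellog, FILENAME = N'L:\SQLLogs\modellog.ldf');
ALTER DATABASE msdb   MODIFY FILE (NAME = MSDBLog,  FILENAME = N'L:\SQLLogs\MSDBLog.ldf');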
I set up a maintenance plan to do full backups on a schedule, and I would like to move each completed backup to another computer, to avoid losing them if the server's hard disk is damaged.
How can I move them on a schedule to another computer? Please give me some suggestions.
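One simple option, sketched below, is to back up straight to the other machine instead of copying afterwards (assuming the SQL Server service account can write to the hypothetical share \\BackupServer\SQLBackups):

BACKUP DATABASE MyDB
TO DISK = N'\\BackupServer\SQLBackups\MyDB_Full.bak'
WITH INIT;

Otherwise, an extra SQL Agent job step of type Operating system (CmdExec) running robocopy or xcopy after the backup step can push the local backup files to the remote share on the same schedule.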
I need to move all log files for my SQL 2005 databases to another drive. I don't wish to shrink the files, I need to move the logs to another drive spindle. I did find an article (Article ID: 224071) that describes moving both the database and logs using sp_detach and then sp_attach. What is the best way just to move the logs to another drive on the same server, and that keeps the databases in their original location? Thanks.
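To find the logical log file names and current locations for every user database before moving anything, a catalog query along these lines (SQL Server 2005) can help; each database then gets an ALTER DATABASE ... MODIFY FILE statement pointing its log at the new drive, followed by taking the database offline, moving the .ldf, and bringing it back online:

SELECT d.name AS database_name,
       mf.name AS logical_name,
       mf.physical_name
FROM sys.master_files AS mf
JOIN sys.databases AS d ON d.database_id = mf.database_id
WHERE mf.type_desc = 'LOG'
  AND d.database_id > 4;  -- skip the system databases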
Hello there, I've been told that it is good practice to keep mdf and ldf files in different locations... We have this in place for all our user databases; however, the mdf and ldf files for our system databases are still in the same location. I was wondering what the right way of splitting those would be?
I haven't found a definitive answer on how, or whether, this can be done without removing replication. I'm thinking ALTER DATABASE ... MODIFY FILE is the way to go. Does anybody know if this will work, or is there a better way to go about it?
I have a job which copies .txt files 24 hours a day, 7 days a week to C:\Temp\Source.
What I am planning to do is copy the files from one location to another, say C:\Temp\Target.
So I have written a PowerShell script, and when I put it in the SQL Agent job I get the error below:
A job step received an error at line 5 in a PowerShell script. The corresponding line is '$filesToMove = $files | Where -Property "Name" -NotLike -Value $newestFile.Name'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'A parameter cannot be found that matches parameter name 'Property'. '. Process Exit Code -1. The step failed.
Using the script task below I am checking for the existence of an Excel file, and if it exists, a Data Flow task loads the Excel data into a SQL table. After the data is loaded from one file (or however many Excel files are present), I want to move those files into an archive folder with a datetime stamp appended to the file names. Please let me know how I can move the files with a datetime stamp in the file name; any help is greatly appreciated. Thanks!!
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.IO

Public Class ScriptMain

    Public Sub Main()
        If File.Exists(ReadVariable("FileNameVariable").ToString()) Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub

    'From Daniel Read's Blog - http://www.developerdotstar.com/community/node/512/
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function

End Class
I need to move files for a lot of databases that are all part of an AG. I've used the method at the bottom of this link with success on a small test DB.
I want to poll an ftp site to see when files arrive, then I would like to download them, and move them into a different directory on the ftp site. It seems like I would have to do a lot to work around the limitations of the FTP Task. Is it capable of this sort of work? If not is there a 3rd party task that is better suited for ftp operations within SSIS?
I am currently looping through a set of flat files (CHK0604, CHK0611, CHK0618, and CHK0625) from the source folder C:\SOURCE.
OBJECTIVE: if any records/rows within a flat file cause an error, I have to move the bad data into a separate folder, C:\ERROR.
STEPS TAKEN
1) In the FOREACH LOOP component I specified the variable User::sourceFilePath for my source file (CHK0604, etc.) location, C:\SOURCE. The loop walks through each file in C:\SOURCE and, if there is no error, moves the flat file into another folder, C:\ARCHIVED. This part is working perfectly.
2) Within the data flow I am diverting the bad rows from the conditional component into a "Flat File Destination" component.
3) On the "Flat File Destination" connection manager I set the expression to @[User::sourceFilePath] + "_Error.TXT".
ISSUE
Because of point (3), the error file is created in the SOURCE flat file location, C:\SOURCE.
QUESTION
1) My error files should be named CHK0604_Error, CHK0611_Error, CHK0618_Error, and CHK0625_Error, and created in another folder, C:\ERROR.
2) How do I move the bad data into another directory while looping through a set of flat files?
3) If I have to create another variable like @[User::ErrorFilePath], where do I create it? And how do I use the source file name as the name of the error file?
I have a requirement to move files from a HOLD folder to an input folder. In the HOLD folder I receive multiple files starting with af, ai, and ar, i.e. af*.txt, ai*.txt, and ar*.txt. I need to move one file at a time to the input folder, as each file has to be loaded into the database before the next file is processed. Among all the files, the SSIS package has to look at the ai*.txt files first, followed by af*.txt, and lastly ar*.txt. If there are multiple files in the same group, the file with the oldest date has to be moved first. How do I achieve this?
I am trying to create a job using a PowerShell script to move the backup files from one folder to another. I am using Ola Hallengren's script for backups, which creates a common backup folder with sub-folders for each database and further sub-folders for Full and Log backups. My goal is to move full backups that are older than a month to a different drive, keeping the same folder structure. I was able to move the first set of backups without any problem, but I can't move any more files and keep getting this error, even when I try to overwrite the previous file with the -Force switch:
Move-Item : Cannot create a file when that file already exists.
I have a scenario where I need to move a series of files from within a directory containing many files. The files follow no naming convention and are more or less random. However, the file names never change from week to week. I tried various options in a File System task, with no luck. Any ideas on how to move only a list of files?
or
Can I load only specific files into a Foreach Loop container from a certain directory? I tried delimiting file names in the file source, but that did not work.
I need to move specific files from one server to another on a monthly basis. There are hundreds of files in the source directory and I need to move approximately 40 of them to the destination server, and I would like to be able to easily add or remove files from that list as needed. I have seen an approach where a variable is created for each file name (and one for the path) and the ForEach Loop goes through them. With 40 or more files, I was thinking that I could instead connect to an Excel spreadsheet or text file containing a record for each file name, read a record, make that value the content of a "FileName" variable, and then move to the next record. Then if I wanted to add another file name I could just add (or remove) a record in the spreadsheet/text file and the package would handle it automatically...
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid-state drives. I was told that with the new solid-state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure it for optimal performance.
In the For Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are older files in the folder, they should be processed first before continuing with the newer ones.
In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this: 1. If no file(s) found, stop executing and send email saying no file(s) found; 2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.
I know I have to use a Script Task to do this. I did some research and found examples for each of the two steps above, but not for both combined, which is why I need some help getting the logic put together correctly.
Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
    Dim cDataFileName As String
    Dim cFileType As String
    Dim cFileFlgVar As String

    WriteVariable("SCFileFlg", False)
    WriteVariable("OOFileFlg", False)
    WriteVariable("INFileFlg", False)
    WriteVariable("IAFileFlg", False)
    WriteVariable("RCFileFlg", False)

    cDataFileName = ReadVariable("DataFileName").ToString
    cFileType = Left(Right(cDataFileName, 4), 2)
    cFileFlgVar = cFileType.ToUpper + "FileFlg"
    WriteVariable(cFileFlgVar, True)

    Dts.TaskResult = Dts.Results.Success
End Sub

Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForWrite(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            vars(varName).Value = varValue
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
End Sub

Private Function ReadVariable(ByVal varName As String) As Object
    Dim result As Object
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForRead(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            result = vars(varName).Value
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
    Return result
End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
It works remotely if I run it via the command prompt, but when I add it to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
Brief overview... Running Windows Server 2003 Enterprise 64-bit with all service packs and patches current, and SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2).
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. Help!
I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach?

I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, dt:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that will hold the SQL Server fields, their datatypes, and their values.

If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs. SQL scripts? Thank you
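As a rough sketch of one way to do the round trip in T-SQL (dbo.MyTable with columns Id and Name is a hypothetical table, and C:\Data\MyTable.xml a placeholder path; FOR XML RAW with the XMLSCHEMA option can additionally emit an inline XSD if the data types need to travel with the data):

-- Export: produce an XML document from the table (save the result to a file with bcp or SSMS)
SELECT Id, Name
FROM dbo.MyTable
FOR XML PATH('row'), ROOT('rows');

-- Import: load the XML file and shred it back into rows
DECLARE @x xml;
SELECT @x = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK N'C:\Data\MyTable.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.MyTable (Id, Name)
SELECT r.value('(Id)[1]', 'int'),
       r.value('(Name)[1]', 'nvarchar(100)')
FROM @x.nodes('/rows/row') AS t(r);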