Renaming A Local File
Jan 7, 2004
Suppose I have an Excel file in D:\Test with the name a.xls.
Can I rename that excel file from QA of SQL SERVER in the same location or any other location?
Subhasish
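From Query Analyzer, one way is to shell the rename out to the OS; a minimal sketch, assuming xp_cmdshell is enabled for your login (REN renames in place, MOVE relocates the file):
EXEC master..xp_cmdshell 'REN "D:\Test\a.xls" "b.xls"'
EXEC master..xp_cmdshell 'MOVE "D:\Test\a.xls" "E:\OtherFolder\a.xls"'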
I'm copying files to a folder with the naming convention as follows in the source folder:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc. But am having a hard time nailing down the exact syntax.
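A minimal sketch of the expression, assuming the original file name is held in a (hypothetical) variable named User::SourceFileName:
REPLACE(@[User::SourceFileName], "CM_ABC", "CM_XYZ")
If only the second segment should ever change, SUBSTRING against the fixed offsets would work as well.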
Hi All,
We have a SSIS package which is accessing a remote Windows file share location.
The package first moves the file from folder-1 to folder-2 and also renames the file during this process.
Then the package reads the file (using a flat file connection FF_SRC) from folder-2 and renames it again after processing it successfully.
The permissions given to the user executing the package on folder-2 are: Read+Write+Modify+List folder contents.
We are facing an error:
File or directory "Z:\folder-2\XYZ.txt" represented by connection "FF_SRC" does not exist.
We are getting the above error when the SSIS package is trying to rename the file the second time in folder-2.
However, the file exists in folder-2.
The OS is Windows 2000 Server SP4.
Any ideas why this could be happening and how it could be resolved?
Best Regards,
Avnip
I'm having trouble working this out in SSIS. I am trying to use a File System task to rename a file using an expression so that file.zip will be renamed to filemmyy.zip at the end of every month (for instance this month would be file0506.zip).
I am using the destination expression variable. But I'm not sure what to put for DestinationConnection. It seems to want a file name, but the file name is going to be variable, so I'm not sure what to put.
Any ideas?
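One approach: set IsDestinationPathVariable to True and point DestinationVariable at a string variable built from an expression; DestinationConnection is then not needed at all. A sketch of the expression (the folder is an assumption; note that backslashes are escaped in SSIS string literals):
"C:\\Archive\\file" + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2) + RIGHT((DT_WSTR, 4) YEAR(GETDATE()), 2) + ".zip"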
I have had a go at using a package with a script to rename and move a file and it works well by using a script task in a package with source and destination variables. See script at bottom
But in my scenario the file that comes in every day will have a slightly different name. It will be called "System_UT_INCR_BOOKINGHEADER_20080228000000.TXT"
on the 28th Feb; the date part of the name changes every day.
So I need to adjust my "Source" variable, which is currently just a string with the value "C:\Datafiles\imports\System_UT_INCR_BOOKINGHEADER_20080228000000.TXT" and so will only ever look for that exact file name.
Imports System.IO
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
    Try
        File.Move(Dts.Variables("Source").Value.ToString, Dts.Variables("Destination").Value.ToString)
        Dts.Events.FireInformation(0, "", "File Moved Successfully", "", 0, True)
        Dts.TaskResult = Dts.Results.Success
    Catch ex As Exception
        ' Fail the task when the source file or destination folder does not exist.
        Dts.Events.FireError(1, "", "Source file or destination does not exist: " & ex.Message, "", 0)
        Dts.TaskResult = Dts.Results.Failure
    End Try
End Sub
End Class
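One way to adjust the Source variable at run time is to rebuild it in the same script before the move; a minimal sketch, assuming the folder from the post and that the trailing six zeros are constant:
Dim folder As String = "C:\Datafiles\imports\"
Dim todaysFile As String = "System_UT_INCR_BOOKINGHEADER_" & DateTime.Now.ToString("yyyyMMdd") & "000000.TXT"
Dts.Variables("Source").Value = folder & todaysFile
If the time portion varies too, Directory.GetFiles(folder, "System_UT_INCR_BOOKINGHEADER_*.TXT") can locate the file by wildcard instead.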
I'm moving a database (XYZtest) from the test server to the production server via sp_detach/sp_attach. I want the logical file names to be XYZ_data, rather than XYZtest_data, etc. I can easily rename the disk files, but how do I rename the logical file names?
Thanks,
Al
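One approach: ALTER DATABASE ... MODIFY FILE accepts a NEWNAME clause. A minimal sketch, assuming the default logical names:
ALTER DATABASE XYZtest MODIFY FILE (NAME = 'XYZtest_Data', NEWNAME = 'XYZ_data')
ALTER DATABASE XYZtest MODIFY FILE (NAME = 'XYZtest_Log', NEWNAME = 'XYZ_log')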
Is there a way to rename the logical file names for a database? For example, if I am moving a development database into production I can use backup and restore, but the restore carries the logical file names of the development database into my production server. Now I have a production database with "dev_data1" for a logical file. Can I change that name?
Thanks!
Dean
Hi,
After executing a DTS package to load data from a file into a SQL table, I need to rename the file.
Could anyone give me a code sample so that I can add this as a task after the load step in the same package?
Ramam
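A minimal sketch of such a task, as an ActiveX Script task using the FileSystemObject (both paths are assumptions):
' Visual Basic ActiveX Script
Function Main()
    Dim oFso
    Set oFso = CreateObject("Scripting.FileSystemObject")
    ' rename by moving the file to a new name in the same folder
    oFso.MoveFile "C:\Loads\data.txt", "C:\Loads\data_loaded.txt"
    Set oFso = Nothing
    Main = DTSTaskExecResult_Success
End Function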
Is there a way to rename the logical file names? I'm not talking specifically about the physical files, because those can be changed during a restore, but the values immediately to the left of those in Enterprise Manager, such as DBName_Data and DBName_log. Enterprise Manager lets me change them during a restore, but when I do it gives an error. Any ideas?
Hi,
I am using a DTS package to extract data from a table and export it to an Excel file. This task needs to run on a weekly basis, and the filename should contain the date the file was created. I have successfully used the ActiveX script below to rename .txt files, but when I try to use it for Excel files it always defaults to the default filename I specified in the destination file properties. Can anyone show me how to do this for Excel files?
'**********************************************************************
' Visual Basic ActiveX Script
'************************************************************************
Option Explicit
Function Main()
Dim sFilename, oPkg, oConn
Dim sMonth, sDay
If Month(Now) >= 10 Then sMonth = Month(Now) Else sMonth = "0" & Month(Now)
If Day(Now) >= 10 Then sDay = Day(Now) Else sDay = "0" & Day(Now)
sFilename = "\\servername\D$\Daily Reports\Flagcodes " & sDay & sMonth & Year(Now) & ".xls"
Set oPkg = DTSGlobalVariables.Parent
Set oConn = oPkg.Connections(2)
oConn.DataSource = sFilename
Set oConn = Nothing
Set oPkg = Nothing
Main = DTSTaskExecResult_Success
End Function
Thanks for your help
Hi
I am trying to move a flat file to an archive folder and then rename the file to include the date and time of runtime.
I have set up a File System Task to move the file which works fine and a second File System Task to rename. The IsDestinationPathVariable is set to TRUE and the DestinationVariable is called User::Error_Log_File_Rename. There is an expression as follows:
@[User::Error_Log_File_Rename] = "Y:\SSIS_PUD_Upload\Error_Log_Archive\Update_Error_Messages - " +REPLACE((DT_WSTR, 30)(DT_DBDATE)GETDATE() + "-" + (DT_WSTR, 30)(DT_DBTIME)GETDATE(),":","-") +".txt"
which is the required file name.
If I go into the expression editor and evaluate it, it gives me the current date and time. However, when I run it as part of the package, say, ten minutes later, the file is renamed using the same date and time from ten minutes earlier, when I evaluated the expression. If I wait another ten minutes and run the package again, it still retains the original date and time.
Is there a way to get the expression evaluated with the current date and time during execution of the package?
Thanks in advance
Scott
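For what it's worth, this behaviour is what you see when a variable's expression is evaluated once at design or validation time rather than on every read. A sketch of the likely fix (an assumption, since the variable's settings aren't shown), set in the properties of the variable:
EvaluateAsExpression = True   (on the User::Error_Log_File_Rename variable)
With that set, the expression is re-evaluated each time the variable is read, including during execution.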
Is there any danger with renaming the LOGICAL file names behind the database?
There are a bunch of databases that were restored copies and all of them have the same logical database file name. I'm trying to get some growth data so I want the logical files to be different (prefer them to match the actual database name) so I can more easily identify them.
For instance:
database_id  name    type_desc  name (logical)  physical_name
1            DLMdb1  ROWS       DLMDB1          D:\dlmdb1.mdf
1            DLMdb1  LOG        DLMDB1_log      E:\dlmdb1.ldf
2            DLMdb2  ROWS       DLMDB1          D:\dlmdb2.mdf
2            DLMdb2  LOG        DLMDB1_log      E:\dlmdb2.ldf
3            DLMdb3  ROWS       DLMDB1          D:\dlmdb3.mdf
3            DLMdb3  LOG        DLMDB1_log      E:\dlmdb3.ldf
Am I safe to rename the logical names? I can't think of anything that references those logical file names that I would be breaking [backups, applications].
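Logical names are referenced mainly by file-level maintenance commands such as DBCC SHRINKFILE and ALTER DATABASE ... MODIFY FILE, so scripts using those would need updating; backups address files by their logical names as they stand at backup time. A minimal sketch for one of the databases above:
ALTER DATABASE DLMdb2 MODIFY FILE (NAME = 'DLMDB1', NEWNAME = 'DLMdb2')
ALTER DATABASE DLMdb2 MODIFY FILE (NAME = 'DLMDB1_log', NEWNAME = 'DLMdb2_log')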
In my script task I have the following code. The task I'm trying to accomplish is:
If the filename on FTP can be found in the local archive folder on the e: drive, then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on FTP cannot be found in the local archive folder on the e: drive, then transfer the file to the local package folder on the d: drive.
While the script task was executing I watched it closely, and the problem I saw is this:
If some files on FTP are already in the local archive folder and some are not, then the files which are already in the archive folder are dumped to the package folder; after that, the files which are not in the archive folder are also dumped to the package folder. But I only want the new files on FTP to be transferred to the package folder for further processing.
Then, after this finished, I saw all the files in the package folder being refreshed one after another; after the first round of refreshes a second round started, and after the second round it stopped. I could tell the files were being refreshed because the 'Date Modified' of each file changed. Then the script task turned green.
I don't see how the code below produces this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving this way, and how to change the code to accomplish what I want? Thanks a lot!
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
        cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
        cm.Properties("ServerUserName").SetValue(cm, "username")
        cm.Properties("ServerPassword").SetValue(cm, "password")
        cm.Properties("ServerPort").SetValue(cm, "21")
        cm.Properties("Timeout").SetValue(cm, "0")
        cm.Properties("ChunkSize").SetValue(cm, "1000") ' 1000 KB
        cm.Properties("Retries").SetValue(cm, "1")
        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/directory")
        Dim fileNames() As String
        Dim folderNames() As String
        ftp.GetListing(folderNames, fileNames)
        If fileNames Is Nothing Then
            MsgBox("NoFileOnFTP")
        Else
            Dim fileName As String
            For Each fileName In fileNames
                If File.Exists("c:\temp\" + fileName) Then
                    MsgBox("FileAlreadyThere")
                Else
                    ' Receive only the current file. Passing the whole fileNames array
                    ' here re-transfers every file on each pass of the loop, which is
                    ' what produces the repeated "refreshes" described above.
                    ftp.ReceiveFiles(New String() {fileName}, "c:\temp", True, True)
                End If
            Next
        End If
        ftp.Close()
    End Sub
End Class
I have a fairly large DTS package which processes a large amount of data and outputs it to a series of tables. Currently I am saving the package as a local package, but I want to save it as a file to make migrating to different servers easier for the people who will be moving it. My question is, Can I run the package as a file without saving it as a local package on the server? And if so, are there any good examples of how to do it out there?
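On the second question: yes, a package saved as a structured storage file can be executed straight from the file with dtsrun; a minimal sketch (path and package name are assumptions):
dtsrun /F "C:\packages\MyPackage.dts" /N "MyPackage" /E
Here /F names the file, /N the package stored in it, and /E uses a trusted connection.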
Hi, I wondered if anyone could help me with my stored procedure.
It updates a table with a user's message board content and saves the Title, Line 1, Line 2, and so on into an XML document that is stored in the column named XML_Data.
How can I then output this XML to a file on my local machine?
T-SQL :-
ALTER PROCEDURE [dbo].[prUpdateMessageBoardVetted]
(PARAMS GO HERE)
AS
BEGIN
    UPDATE CustomerData
    SET Title = @Title,
        Line1 = @Line1,
        Line2 = @Line2,
        Line3 = @Line3,
        Line4 = @Line4,
        Line5 = @Line5,
        Line6 = @Line6,
        XmlFileName = @XmlFileName,
        FlashFileName = @FlashFileName,
        HtmlFileName = @HtmlFileName,
        Vetted = @Vetted,
        ClientName = @ClientName
    WHERE (UserRef = @original_UserRef)

    IF @Vetted = 1
    BEGIN
        DECLARE @xmlData AS XML
        SET @xmlData = (SELECT
                            UserRef,
                            Title,
                            Line1,
                            Line2,
                            Line3,
                            Line4,
                            Line5,
                            Line6
                        FROM dbo.CustomerData
                        WHERE (UserRef = @original_UserRef)
                        FOR XML RAW('Message'), ELEMENTS)
        SELECT @xmlData

        UPDATE [dbo].[CustomerData]
        SET XML_Data = @xmlData
        WHERE (UserRef = @original_UserRef)

        -- (THEN HERE DO SOME SORT OF PROCEDURE TO OUTPUT @xmlData TO AN XML FILE ON MY LOCAL MACHINE,
        --  BUT ALSO NAME THE XML FILE VIA @XmlFileName BY DOING SOME SORT OF CONCATENATION)
    END
END
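One way to fill in that last placeholder is bcp queryout run through xp_cmdshell; a minimal sketch (server name, folder, and the cast are assumptions, and note that bcp runs on the server, so writing to a client machine would need a UNC share):
DECLARE @cmd varchar(500)
SET @cmd = 'bcp "SELECT XML_Data FROM MyDb.dbo.CustomerData WHERE UserRef = '''
           + CAST(@original_UserRef AS varchar(20))
           + '''" queryout "C:\XmlOut\' + @XmlFileName + '" -w -T -S MyServer'
EXEC master..xp_cmdshell @cmd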
any ideas?
Thanks in advance !
Neil
I have just upgraded to Microsoft SQL Server 2005 Express Edition with Advanced Services SP2.
Now I can't connect to my local database.mdf file.
When trying to open or doubleclick or expand my .mdf file the following message pops up:
Generating user instances in SQL Server is disabled. Use sp_configure 'user instance enabled' to generate user instances.
What is the solution for this problem?
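The option the message refers to is actually named 'user instances enabled'; enabling it (which requires permission to run sp_configure) looks like this:
EXEC sp_configure 'user instances enabled', 1
RECONFIGURE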
Hey All,
I am attempting to create a report in SQL Reporting Services that uses a local XML file as its data source. Although I did not find a way to do this directly, I found an MSDN article that walks you through using a report parameter to paste in your XML data. When I set the report up this way, I get an "Invalid URI" error when I go to the Preview tab.
http://msdn2.microsoft.com/en-us/library/aa964129.aspx
What would be the best way for me to use an XML file as a datasource using reporting services? Is there a way that I can use a local file and use a parameter to point at that file?
Any info, links, tuts, etc. would help. Thanks!
Nathan
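For reference, the SSRS XML data extension takes a URL (typically http) as its connection string, which is why a plain local file path tends to fail; a sketch of the query syntax, assuming the file can be served over http and with a hypothetical element path:
Connection string:  http://myserver/data/mydata.xml
Query:
<Query>
    <ElementPath IgnoreNamespaces="true">Root/Item</ElementPath>
</Query>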
I have the following command in a .bat file
osql -E -S localhost -Q "sp_grantlogin 'GREG\ASPNET'"
What I would like to do is replace GREG with the name of the local computer this .bat file is being run on.
Is there any string for this... i.e [LOCALMACHINE]ASPNET or something like that?
Greg
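In a .bat file the local machine name is available in the %COMPUTERNAME% environment variable, so the line can be written as:
osql -E -S localhost -Q "sp_grantlogin '%COMPUTERNAME%\ASPNET'"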
I want to use the Office Web Components (v10) PivotTable embedded in a web page. The examples and tutorials I've been able to find on MSDN all use the same architecture: The initial pivottable configuration is created by instantiating the object server-side and programmatically configuring it. The XMLDATA property is then read and sent to the client. At the client, script sets the XMLDATA property. So far so good, all makes sense. Then my problem: The PivotTable is then connected to Analysis Server using an http connection string (through IIS). As my users do not have database accounts, only application accounts, I cannot allow database connection information on the client. Trouble is, the PivotTable generates MDX queries when a user reconfigures the view, and there are no hooks (events or other means) to programmatically obtain the generated MDX query - a fact that is explicitly confirmed in MSDN documentation.
This clearly means that the PivotTable must be connected directly to its data source. So I thought I could just create a local MOLAP cube file and download it to the client (permissible, as it runs as a trusted application). But nowhere can I find any examples or documentation on how to do this.
This is driving me slightly insane and I have little hair left now; if anyone knows how to do it or where to find proper PivotTable documentation (rather than a collection of examples that do something else from what I need to do!) then help is greatly :p appreciated!
Dag
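Local cube files can be created with the CREATE GLOBAL CUBE statement, executed over an OLE DB (MSOLAP) connection to the Analysis Server; a minimal sketch, with cube, measure, and dimension names as assumptions:
CREATE GLOBAL CUBE SalesLocal
STORAGE 'C:\SalesLocal.cub'
FROM [Sales]
( MEASURE [Sales].[Sales Amount],
  DIMENSION [Sales].[Time] )
The PivotTable's connection string can then point at the file (Provider=MSOLAP;Data Source=C:\SalesLocal.cub).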
I am trying to create a sproc that creates several tables, views, and sprocs across several similar DBs (please... no lectures on structure... I didn't create this multi-DB setup).
My sproc loops through the DBs and runs a script on each DB. I am running the sproc from a client computer and cannot easily access the C drive of the SQL Server.
The only way I have figured out how to do this is to run the following
xp_cmdshell 'isql -d %@mydb% -i c:\script.sql'
inside a loop.
My problem is that xp_cmdshell refers to the server c drive instead of my local c drive, where my script file is.
Is there a way to refer to my local C drive? Also, any thoughts on a better way to approach this (other than running the entire script inline in the sproc, since I don't want to have to put quotes around each batch of the script)?
Thanks
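Since xp_cmdshell always runs on the server, one workaround is to share the client folder and reference the script by UNC path; a sketch (the share name is an assumption):
xp_cmdshell 'isql -E -d %@mydb% -i \\myclientpc\scripts\script.sql'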
How do I create a DB on a client's local machine with SQL Server CE and .NET, so that later I can sync the data to the SQL Server DB? Are any sample applications available?
Please help. I am new to SQL Server CE.
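A minimal sketch of creating a local SQL Server CE database from .NET with SqlCeEngine (the file name is an assumption; add a reference to System.Data.SqlServerCe):
// C#
using System.Data.SqlServerCe;

string connStr = "Data Source=MyData.sdf";
using (SqlCeEngine engine = new SqlCeEngine(connStr))
{
    engine.CreateDatabase();   // creates the .sdf file on the local machine
}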
Hi,
Here's my situation. Every day I will be downloading extracts to a folder. The extracts are named:
20070529.Extract1.csv
20070528.Extract1.csv
20070527.Extract1.csv
20070529.Extract2.csv
20070528.Extract2.csv
20070527.Extract2.csv
So, on any given day, I will want to find the most recent versions of Extract1 and Extract2, for example:
20070529.Extract1.csv & 20070529.Extract2.csv
How would I go about doing this?
Thanks much
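Because the date prefix is yyyymmdd, a simple lexical sort puts the newest file last; a minimal sketch for a Script Task (the folder path is an assumption):
Dim files() As String = System.IO.Directory.GetFiles("C:\Extracts", "*.Extract1.csv")
Array.Sort(files)   ' the yyyymmdd prefix makes lexical order chronological
If files.Length > 0 Then
    Dim newestExtract1 As String = files(files.Length - 1)
End If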
We are using an SSAS server to analyse financial data. To create forecast scenarios, we would like to use the "what-if analysis" feature of Excel. With this feature, data that was changed in Excel can be written back into the cube.
There are several data analysts who should be able to create scenarios on their own; that's why each of them gets his own offline cube (a .cub file stored in the file system) whose data is extracted from the SSAS server.
Unfortunately, we have found no way to write data back into these offline cubes. According to the error message, Excel failed to establish a connection to the external data source. Is there a way to write back data to a local cube?
Hi
I need to create a report from an XML file which is available in my current folder.
Can anybody suggest how to create a data source from an XML file and how to write a query for it?
Thanks
Mahesh
I have a data file, call it Data.mdf, that I am accessing through Visual Studio. It does not show up in SQL Server 2000 Enterprise Manager.
How can I get it assigned to, or imported into, the local server so I can manage it?
Thanks
Glen
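The file can be attached from Query Analyzer; a minimal sketch using sp_attach_single_file_db (the path is an assumption; it rebuilds the log when only the .mdf is available):
EXEC sp_attach_single_file_db @dbname = N'Data', @physname = N'C:\Data\Data.mdf'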
I want to add a data source to a C# project that comes from a local SQL Server instance. But when I attempt to create a new connection, it only allows me to specify a file; that is, the only choices I have for a data source type are "Access DB file" and "SQL Server DB file". I know that with the non-Express VS edition there are more choices available. Reviewing the limitations of the Express editions on the web, I find frequent mention that one cannot access remote data, only a locally installed data source. But is not a SQL Server instance on my local machine a local data source?
This is an issue because if I proceed with connecting as a file, VC# Express complains the file is in use. If I first go into SQL Server Express and detach the database there, then I can connect to it in VC# Express, but that is rather cumbersome!
So to put it another way, can I use this connection string...
Data Source=.\SQLEXPRESS;Initial Catalog=testDB;Integrated Security=True
instead of this one?
Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\testDB.mdf;Integrated Security=True;Connect Timeout=30;User Instance=True
Hi,
I'm trying to work on a database when I'm not connected. I can't figure out how to get a local copy of a database .mdf that I created on the server onto my hard drive, using SQL Server Management Studio Express. Does anyone have any suggestions? I'd be forever in your debt. Thanks
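One route, assuming you can run a backup on the server and copy the .bak file down, is to back up there and restore locally; a sketch with assumed names and paths:
-- on the server
BACKUP DATABASE MyDb TO DISK = 'C:\Temp\MyDb.bak'
-- locally, after copying the .bak file across
RESTORE DATABASE MyDb FROM DISK = 'C:\Temp\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'C:\Data\MyDb.mdf',
     MOVE 'MyDb_Log' TO 'C:\Data\MyDb.ldf'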
I have made a small ASP.NET project which uses a local database file as part of the project. The project runs fine on my local machine, but when I upload it to the remote server, the login fails.
I suspect this can be solved by using SQL Server authentication, but I have now spent a lot of time trying to configure the database file to use this authentication mode. As I see it, there are three possible solutions to the problem:
1. Use Management Studio Express to configure the local .mdf file (except that I can't find out how to connect to the .mdf file) and from there change the authentication method to SQL Server authentication.
2. Use Visual Web Developer to change the authentication method (but how?).
3. Make Windows authentication work on the server (this would probably require that Management Studio Express connects to the remote database; same problem as no. 1).
Help will be highly appreciated.
Bjarke
Hi all,
Currently, my (small) intranet site stores its data on a remote SQL server. The danger with this, as has happened several times now, is that the application is twice as vulnerable: if either the web server or the data server malfunctions or is unreachable, the application won't work.
I only recently discovered the possibility to use local database files (MDF files), and this seems like a much better solution for my site. But now I want to transfer the tables that are residing on the dataserver, to the MDF file. The database only contains tables. How do I handle this? I do not have access to the dataserver, only to a few databases that are residing on it. Is this possible using Visual Studio 2008? I have read about a "Bulk Copy Program" (bcp) which is included with SQL Server, but I cannot find a download for just that application.
Or is this totally not the way to go? I've discovered MDF files are a bit more problematic with concurrent connections; having tables open in Visual Studio results in "Site offline" or "Cannot open database" error messages on the website. Problems I've never had to deal with using SQL Server, but they are only minor problems.
Thanks,
Peter
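bcp is not a separate download; it ships with the SQL Server client tools. A sketch of moving one table with it (server, database, and table names are assumptions):
rem export from the remote data server
bcp "RemoteDb.dbo.MyTable" out "C:\MyTable.dat" -n -S dataserver -T
rem import into the locally attached database
bcp "LocalDb.dbo.MyTable" in "C:\MyTable.dat" -n -S .\SQLEXPRESS -T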
Have an SSIS package with a Web Service Task; we downloaded the WSDL to a local file, and all is well until we try to deploy the package to another machine, at which point the package expects to find the WSDL in the same place.
Am I not understanding something? Yes I can make it work, BUT:
1. Couldn't the task just get the WSDL from the web service each time, instead of from a static file? Isn't that the idea? Latest and greatest...
2. Having packages dependent on files is extremely inconvenient in a production environment. Security issues abound, things are in different places on different servers. How are folks handling this? The Web Service Task is certainly not the only place in SSIS where a package can be dependent on a file system location.
I am using the following C# code to establish a SQL connect to a SQL database file:
// connection string
// attach a SQL database file to a local SQL server express instance
string _connectionString = @"Server=.\SQLExpress; AttachDbFilename=C:\BalanceDatabase_1.mdf; Trusted_Connection=Yes; User Instance=True";
// using System.Data.SqlClient;
SqlConnection _sqlConnection = new SqlConnection(_connectionString);
// open the connection
_sqlConnection.Open();
// do something
// close the connection
_sqlConnection.Close();
So far, the connection works fine.
However, next, I want to copy the database file to another folder. So the following codes:
// source database file name
string sourceDatabaseFileName = @"C:\BalanceDatabase_1.mdf";
// target database file name
string targetDatabaseFileName = @"D:\BalanceDatabase_1.mdf";
// copy database file
System.IO.File.Copy(sourceDatabaseFileName, targetDatabaseFileName, true);
Then the program fails with a runtime exception: "IOException was unhandled: The process cannot access the file 'C:\BalanceDatabase_1.mdf' because it is being used by another process."
Is it because the database file was still attached to the local SQL Server Express instance? What can I do to bypass this problem? Detach the database file, or dispose of the local SQL Server Express instance?
Many thanks indeed!
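The open handle is most likely held by connection pooling (Close() returns the connection to the pool rather than releasing the file); one thing to try before the copy, though this is an assumption about the cause:
// release pooled connections so the .mdf handle is freed
SqlConnection.ClearAllPools();
System.IO.File.Copy(sourceDatabaseFileName, targetDatabaseFileName, true);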
I have 15 dealers' files, with file names as follows:
1. ACTEST00001_20141112_0408_INV.TXT
2. ACTEST00002_20141112_0408_INV.TXT
I will get these files through FTP on a daily basis, with only the date changing.
I will have four files for each dealer: INV, SERVICE, SALES, APPOINTMENT.
So I need to fetch a particular dealer for a particular date.
I need to build this scenario in an SSIS package. How do I create it, and which tasks do I need to implement this process?
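One sketch: keep the dealer code, date, and file type in variables and point a Foreach Loop (file enumerator) or FTP Task at a wildcard built by expression, with the variable names as assumptions:
@[User::DealerCode] + "_" + @[User::FileDate] + "_*_" + @[User::FileType] + ".TXT"
For example, DealerCode = "ACTEST00001", FileDate = "20141112", FileType = "INV" yields ACTEST00001_20141112_*_INV.TXT.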
Two questions actually ...
1) Need a simple routine or system function for testing for or verifying the existence of a file or folder on the local server's file system. Returning a simple boolean value or 1 or 0 would be fine.
2) Need a syntax and use description of the new "master.dbo.xp_create_subdir" function ... anyone have some documentation or links? MS technet and MSDN have nothing.
Thanx
EHammer
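On the first question, the undocumented xp_fileexist procedure covers it; on the second, xp_create_subdir simply takes the path to create. A sketch of both (paths are assumptions):
-- returns flags: File Exists, File is a Directory, Parent Directory Exists
EXEC master..xp_fileexist 'C:\SomeFolder\SomeFile.txt'
-- creates the folder, including any missing intermediate folders
EXEC master.dbo.xp_create_subdir 'C:\SomeFolder\NewSub'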