I'm writing a script for a pending migration (SQL 2000 Enterprise to 2005 x64 Enterprise). I've got about 325 or so databases, spread across 3 instances, that I need to migrate. My plan is to do a mass scripted detach of all DBs off the old boxes, copy the files (to the new domain), and then reattach all of the MDF and LDF files.
I've been writing a SQL script that will scan a "staging" directory for each of the MDF and LDF files, then programmatically run sp_attach_db. The problem is that the MDF and LDF files don't always follow a 1-to-1 naming convention; my predecessors stuck all sorts of things in the file names that make a simple scripted search pretty difficult. The disk paths are also going to be different from server to server (simpler ones), so I need to move the files.
Is there a way (using sp_attach_db or otherwise) that I could somehow scan each of the MDF files to get the path of the LDF? I can already do that with a .bak file using the RESTORE FILELISTONLY command, but I'm not sure how to do it here. I'm figuring there has to be a way, though, as SSMS seems to know where the LDF is if you try to attach an MDF.
Usually, Google does well, but I haven't been able to find anything as of yet. Any ideas?
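Edit: one lead I've since found and plan to test is the undocumented DBCC CHECKPRIMARYFILE, which is reportedly what SSMS uses under the covers to read the file list out of an MDF. A sketch (the path is just an example, and since the command is undocumented, verify it on a scratch instance first):

-- option 3 should return the logical name, status, and physical path of
-- every file recorded in the primary data file
DBCC CHECKPRIMARYFILE ('D:\Staging\SomeDatabase.mdf', 3);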
I am finding that, in order for the Web Services Task to work successfully, the WSDL file has to be on a local drive of the machine SSIS is executing on. Is this the intended behavior?
In my SSIS task I use a URL path to store information extracted from the web service. The information is stored on a different server than the one SSIS is running on. This works properly without error.
I have confirmed that SSIS has appropriate permissions to read/write to that directory on that server. When I attempt to reference the WSDL file (located in the same URL directory where I am saving the information), I get a web services error: "The Web Services Name is empty. Verify that a valid web service name is available."
When I update the Web Service Task attribute to point to the WSDL file located on a local drive it works correctly. I have confirmed that both WSDL documents are exactly the same.
The behavior seems a little strange...so I must be missing something subtle.
Greetings, I have just arrived back in the country (NZ) and back into ASP.NET. I am having trouble with the following: "An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share." It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page. Anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer. Cheers ~ J
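For anyone debugging alongside me: one workaround I've seen suggested is to attach the MDF once as a regular database and point the connection string at it by name (Initial Catalog) instead of using AttachDbFilename. A sketch, with a hypothetical name and path rather than the real ones from my error:

-- run once, as an admin, on the SQL Express instance:
CREATE DATABASE MyIntranetDb
ON (FILENAME = 'C:\Inetpub\wwwroot\MySite\App_Data\Database.mdf')
FOR ATTACH;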
Easy question, I hope. We are setting up an active/active SQL cluster. Management have done everything the wrong way around and purchased the hardware already.
We are planning on having 2 databases, 1 in each instance. We will place the transaction log files for each database on their own cluster "physical disk" resource. We only have 4 disks available for the transaction log files. To make things a little more complicated, I have been given no information on the expected transaction use, other than that the logs should be no larger than 50GB.
Here are my options:
1. 2 mirrored RAID arrays, one for each SQL instance
2. 1 RAID 10 array with 2 logical drives (so the transaction logs are sharing the same disk spindles)
What would you do? Separate the disk spindles, or share them and go for fast disk performance?
Is the name of the most recent backup file for each database stored anywhere in SQL 2000? I want to execute a SQL Agent job periodically that takes the .BAK from database A and restores it over database B (using the T-SQL RESTORE DATABASE procedure), but I need to know the exact name of the .BAK file; i.e. I need to know the yyyymmddhhmm value at the end of that file name.
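Edit: it looks like msdb records this. A sketch of the lookup I'm planning to try (the database name is a placeholder):

-- returns the physical file name of the most recent full backup of A
SELECT TOP 1 bmf.physical_device_name
FROM msdb.dbo.backupset bs
JOIN msdb.dbo.backupmediafamily bmf
    ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name = 'DatabaseA'
    AND bs.type = 'D'    -- D = full database backup
ORDER BY bs.backup_finish_date DESC;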
I've got some users who created a database with the log file on a drive that doesn't have a lot of space. I'd like to truncate the log and move it to a different drive. I can truncate it, but is moving it as easy as changing the file's properties through SSMS?
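For anyone else reading: from what I've gathered it's not just a properties change; the move seems to go roughly like this (names and paths are placeholders, corrections welcome):

USE master;
ALTER DATABASE MyDb MODIFY FILE
    (NAME = MyDb_log, FILENAME = 'F:\SQLLogs\MyDb_log.ldf');
ALTER DATABASE MyDb SET OFFLINE;
-- copy the .ldf to F:\SQLLogs at the OS level, then:
ALTER DATABASE MyDb SET ONLINE;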
Version: SQL Server 2000. DB size: 25.6GB. Transaction log file: 32GB. So far: I have read the forum FAQ on "clearing the transaction file" and some DBCC SHRINK material in Books Online.
I am trying to set up a maintenance plan or backup plan. File sizes are getting very large and performance is horrible, since we have had no regular maintenance in the past (no DBA in house).
1. What is the best way to identify the location of the transaction log file associated with a particular database?
When I right-click on the database name in Enterprise Manager and select properties, I can see one location for the transaction log.
However, if I right-click on the database > All Tasks > Shrink Database > click Files and select Temp_Table_Log, I get a different location.
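On question 1, the most authoritative list I've found so far comes from each database's sysfiles table (SQL 2000); a sketch, with the database name as a placeholder:

USE MyDb;
SELECT name, filename, size / 128 AS size_mb   -- size is in 8KB pages
FROM dbo.sysfiles;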
2. We delete and then repopulate about 105,000 records in one particular table each day. In addition, we do the same with about a hundred rows in several other tables daily. Should I be doing full backups nightly? I have the "AutoShrink" option set on the db; will this truncate and shrink the transaction log, as well as shrink the db, when I do a full backup?
I have been asked to list the SQL Server setup file location, but I am a bit confused: I can see two setup files in two different locations.
All Programs > Microsoft SQL Server 2008 R2 > Configuration Tools > SQL Server Installation Center (64-bit) > right-click > Open File Location. This points to LandingPage in C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\SQLServer2008R2\x64.
But below is the second location where a setup file is found:
Is it possible to set the log file location during installation? I don't see an option under Advanced. Any suggestions are greatly appreciated. Thanks.
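In case it helps anyone searching: if the installer doesn't expose the option, the default log location can apparently be changed afterwards; this seems to be the registry value that SSMS's Server Properties > Database Settings page writes. A sketch (the target path is a placeholder):

EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog',
    REG_SZ,
    N'L:\SQLLogs';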
I have an FTP server that will be receiving files. The directory and file structure will be a folder with a client name (it can be called anything) containing files (these files will have the same filenames as those in all the other directories). So I will have folder JimmyDoe with files a.txt, b.txt, c.txt, and I will have JonnyDue with files a.txt, b.txt, and c.txt.
Now I'm trying to figure out a way to get that dynamic file location to a DTS package so I can import all the data from the text files into a SQL Server. The way the SQL Server will be set up is that each folder from the FTP site will be a separate database, and each file will map 1:1 to a table with the same name.
My biggest issue is figuring out a way to tell the DTS package the file location to pull all those files from, and then import them into the proper database.
I'm not limiting the solution to DTS packages, so if .NET can be incorporated to make it easier, then so be it. But keep in mind I can have up to 200 folders with 12-20 text files each, ranging from hundreds of rows of data to many thousands of rows. And the package needs to be run twice a day, so time/performance is an issue.
To recap: I need a DTS package that uses a dynamic file source and transfers data to a dynamic database destination.
(And I'll write slow VB.NET code to handle this before I create/manage 200+ DTS packages as a solution)
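For what it's worth, one alternative I'm weighing instead of 200+ DTS packages is to drive the whole thing from T-SQL: enumerate the client folders with the undocumented xp_dirtree and BULK INSERT each file through dynamic SQL. A rough sketch, assuming folder name = database name and file name = table name (error handling and the real field terminator omitted; the FTP root path is a placeholder):

-- enumerate the client folders (xp_dirtree is undocumented; verify first)
CREATE TABLE #dirs (subdirectory sysname, depth int);
INSERT #dirs (subdirectory, depth)
EXEC master..xp_dirtree 'D:\FTPRoot', 1;

DECLARE @client sysname, @sql nvarchar(4000);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR SELECT subdirectory FROM #dirs;
OPEN c;
FETCH NEXT FROM c INTO @client;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- repeat (or inner-loop) for each of the known file names a, b, c, ...
    SET @sql = N'BULK INSERT ' + QUOTENAME(@client) + N'.dbo.[a] '
             + N'FROM ''D:\FTPRoot\' + @client + N'\a.txt''';
    EXEC (@sql);
    FETCH NEXT FROM c INTO @client;
END
CLOSE c;
DEALLOCATE c;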
I want to change the default backup location of MSSQL. I know we can configure it while installing, but I want to change the default backup location for a particular database running on a database server.
The default location is something like C:\Program Files\Microsoft SQL Server\MSSQL\Backup.
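For a single database, everything I've seen suggests the default is per-instance rather than per-database, so the usual approach seems to be pointing that database's backup command or job at an explicit path. A sketch with placeholder names:

BACKUP DATABASE MyDb
TO DISK = N'D:\SQLBackups\MyDb_full.bak'
WITH INIT;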
I want to move an index file (69 GB) to a different location. Right now the data and index files are on the same drive (E:); I am trying to move the index files from the E: drive to the F: drive, so I will free up 69 GB on the E: drive. Can anyone please advise me what precautions I should take, and advise on best practice?
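For context, here is the approach I'm currently leaning towards; corrections welcome. It assumes the index file sits in its own filegroup (names, sizes, and paths below are placeholders), and while it can run online it is I/O heavy, so off-hours plus a fresh backup first:

-- add a replacement file on F: to the same filegroup
ALTER DATABASE MyDb
ADD FILE (NAME = MyDb_ix2, FILENAME = 'F:\SQLData\MyDb_ix2.ndf', SIZE = 70GB)
TO FILEGROUP [INDEXES];   -- hypothetical filegroup name

-- migrate all extents off the old file, then drop it
USE MyDb;
DBCC SHRINKFILE (MyDb_ix, EMPTYFILE);
ALTER DATABASE MyDb REMOVE FILE MyDb_ix;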
Hi, I am working on a new installation which I did not set up, and I realized it was using the wrong partition of the server to store the data and log files. I have already created several databases, and I want to use another partition for these databases without having to drop them and create them all over again. In BOL I saw this command but want to make sure it's safe; I hope somebody can comment on this, or tell me if I am missing something. Thank you.
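For context, the kind of command I mean is the detach/attach pair, roughly like this (names and paths are placeholders):

EXEC sp_detach_db N'MyDb';
-- move MyDb.mdf / MyDb_log.ldf to the new partition at the OS level, then:
EXEC sp_attach_db N'MyDb',
    N'E:\SQLData\MyDb.mdf',
    N'E:\SQLLogs\MyDb_log.ldf';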
I asked this before and was told that it was an option in setup, but for the life of me I can't find it.
I have a requirement for all the transaction logs to go to a separate drive (L:), and I would like to be able to specify this location as part of the install, but I don't see the option. I have been successful doing it after the install, but it is a bit annoying to have to do it that way when it seems like the install should offer this. I am reaching out once more to see if anyone can tell me whether I am missing something in the install process that will allow me to do this.
I have a requirement to have all the transaction logs for our SQL Server databases go to drive L:. I would like to do this during the install process, but I don't see any way to specify it in the setup. I am able to go in and do it after the install, but it is a bit of a pain.
Anyone know how I can do this in the install itself?
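For anyone who finds this later: on SQL Server 2008 and later the Database Engine Configuration step has a Data Directories tab with a "User database log directory" box, and the unattended install appears to accept the same setting as a parameter. A sketch of the command line, with most required parameters omitted for brevity (worth checking against the setup documentation for the exact version being installed):

setup.exe /Q /ACTION=Install /FEATURES=SQLENGINE /SQLUSERDBLOGDIR="L:\SQLLogs"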
Can you guys please give me the steps for creating a destination file dynamically? What I mean is, for example, I want to get everything from a table, and SSIS should create a file destination named according to today's date and save it in a specific folder.
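For reference, the approach I've seen suggested is to put a property expression on the flat file connection manager's ConnectionString, built from GETDATE(). A sketch in SSIS expression syntax (the folder and file prefix are placeholders):

"C:\\Exports\\TableDump_"
    + (DT_WSTR, 4) YEAR(GETDATE())
    + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
    + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
    + ".csv"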
I have developed an SSIS package which uses an XML configuration file for connection information. It runs fine both from Visual Studio and from an Agent job after I specify the location of the XML configuration file in the Agent job definition.
My question is regarding the target location of the XML configuration file after deployment. I do my SSIS development on a local workstation with SQL Server 2005 DE and Visual Studio 2005 installed on it. I deploy my SSIS packages to a remote SQL Server 2005 SE server. Both machines are running SP1. I double-click the package manifest from my local workstation and specify the remote server in the deployment utility. The package gets deployed to the remote server just fine, but the XML configuration file ends up in my local workstation's C:\Program Files\Microsoft SQL Server\90\DTS\Packages path. I would have expected the XML configuration file to go to the same path on the remote server, where the package itself was deployed. Obviously, when I run the deployment utility from the remote server, it stores the XML configuration file in the same path on the remote server. But this is not a workable production solution.
I've been working on package configurations here, mainly trying to figure out what works, and how. If I use an XML configuration and don't specify the path, i.e. just typing in test.dtsconfig, it creates the config file on my desktop. When I execute the package in BIDS, it seems to be looking in that location. If, however, I execute the package manually on the filesystem, test.dtsconfig must be in the same directory as the package (which is what I would expect). Why does this work differently in BIDS? Also, if the package is deployed to a SQL server, where would the configuration file need to be in that case? Thanks for the help.
We have 1 TB total database size across 3 different DBs, including the MDF and LDF files, and we also have some SQL jobs, like daily and weekly backup jobs. What is the best option to copy these DBs and SQL jobs? Are there any tools available which can copy these files over the network, and how much time would it take?
1) Should we go in for the full backup and restore option?
2) Or take the SQL Server DBs offline and copy and paste the MDF/LDF/log files to the different location?
3) And what would happen to the database login accounts, cross-database ownership chaining, etc.?
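On point 3: logins live in master, so whichever copy method is used, I believe they have to be moved separately and the database users remapped afterwards. A sketch of the remapping step (database and user names are placeholders):

-- after restoring/attaching on the new server:
USE MyDb;
EXEC sp_change_users_login 'Report';                 -- list orphaned users
EXEC sp_change_users_login 'Auto_Fix', 'SomeUser';   -- remap one user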
I have an SSIS package that I created that looks like the following:
Execute SQL Task [Determines the file name]
Data Flow Task
    Source - Query [Grabs the data]
    Destination [Saves data to file on network drive]
Send Mail Task [Tells user to get file from network]

When I run this package inside Microsoft Visual Studio I get the results I want. It sends the file to the network drive, and because there are SSNs in the file, it doesn't actually e-mail the file but puts it in a secure location for the person to get the file.
When I schedule this in SQL Server 2005 I get the file saved to: C:\Windows\System32\dynamicfilename.csv
So somehow it is still able to determine what the file name should be, but instead of sending the file to the right destination as it does when you run it through VS, it flops out and sends it to this wacky directory.
I've spent too much time on this and was wondering if someone can point me in the right direction without going to the extent of writing a script like this thread suggests: thread1555-1265108
During installation of SQL Server 2000 I set the default location for data files as D:\MySQLServer, which resulted in the location D:\MySQLServer\MSSQL.
I then installed SQL Server 2005. I do not remember being given the option of specifying a location for the data files. Then I read that the location for named instances is determined by the first installation of SQL Server. The location for the data files for SQL Server 2005 turned out to be MSSQL.1, but under C:\Program Files\...
I want the default location for SQL Server 2005 to be under D:\MySQLServer, something like D:\MySQLServer\MSSQL.1. How do I change the default location for the data files?
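For anyone searching: the 2005 defaults appear to be the instance registry values that SSMS's Server Properties > Database Settings page edits, so something like this should do it (the path here is mine, not canonical):

EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData',
    REG_SZ,
    N'D:\MySQLServer\Data';

with a matching write to the DefaultLog value for the log directory.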
I am using SQL Server configuration indirectly, with the connection to it configured in an XML configuration file, and with the location of the XML file and the connections all being different between development and production. I deploy to the file system using the SSIS Deployment Manifest / Package Installation Wizard.
It seems that sometimes the path to the XML config file is updated in the deployed packages and sometimes it is not.
When the path is updated, all connection strings (as displayed in the Connection Managers section of the Execute Package Utility) are displaying the new values as well.
When it is not, it looks for the XML file in the development location, and since it cannot find it, all configurations still point to the development connections.
Does anyone know of any conditions that determine when the location of the XML config file does or does not get updated?
I have a flat source file (.csv) which I am importing into SQL Server. Now, if the source file is not available at the specified location, the SSIS package should retry execution n times (say, 3 times) after a certain time interval. The number of retries and the time interval should be configurable.
I have used Flat File connection manager for the source and OLEDB connection manager for the destination.
I am quite new to SSIS. Any help would be highly appreciated.
My parent package calls packages stored in the file system. While developing, I would like to call the packages in the project bin directory. In production, I would like to call packages in a different directory. Is this possible?
I can change the package connection string with an expression that refers to user variables PackageLocation1 or PackageLocation2. I would like to do this automatically. Is this something that should be done at deployment time? Or is there a run time value that I can check and conditionally use PackageLocation1 or PackageLocation2?
Development and deployment are done on the same server, so the same environment variable value would be used in an indirect configuration. The same thing applies to a file configuration.
Another question: is it possible to set up a different installation folder for use during deployment? Every time I deploy I have to navigate the folders; you can't even paste in the folder name.
I added a secondary data file to tempdb yesterday and gave it a wrong location by mistake. If I try to change the location, I now get an error. I think that is because tempdb is in use, and that is why I can't change its secondary file's location. Do I need to take tempdb offline and then change the secondary file's location?
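Partly answering my own question after more reading: tempdb apparently does not need to go offline; a MODIFY FILE against it only takes effect at the next service restart, since tempdb is rebuilt at startup. A sketch (the logical file name and path are placeholders):

ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf');
-- restart the SQL Server service; the old file can then be deleted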
We have an existing old system built on SQL Server 2000 DTS packages. The whole application runs on DTS. There are several packages which are called from a master package. Each child package has its own global variables, most of them file-location variables holding the source location of the input data, mainly Excel files. Now, even though the global variables are there so the file locations can be changed, someone has to go into each child package and change the variables by hand. To resolve this issue, they want a configuration file (INI) or table which would store those variables. My thought is to read from that file/table and update all the packages' global variables through an ActiveX script as the first step of the master package. That would eliminate the need to change anything in the existing system. But the problem is that management (PM / DBAs / team members) have different views on where to store the configuration data:
1. Some want it in a different database, having one table for this application, so in future they can also add tables for other applications.
2. Some want to store it in a table in the same database as this application.
3. Some want to save it as an INI file.
As I'm the one who's going to actually implement it, they have asked me to research the best solution out there.
So I'd like some help deciding which is a good solution, and why.
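For concreteness, this is the sort of table I have in mind for options 1 and 2 (a minimal sketch; all names are mine):

CREATE TABLE dbo.PackageConfig (
    PackageName   sysname        NOT NULL,  -- child package name
    VariableName  sysname        NOT NULL,  -- global variable name
    VariableValue nvarchar(4000) NOT NULL,  -- e.g. an Excel file path
    CONSTRAINT PK_PackageConfig PRIMARY KEY (PackageName, VariableName)
);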
I am working on SQL Server 2014 and I want to know: is it possible to change the location of the SQL log file, and if it is possible, will changing its location affect the database?
Hey everyone, I've got this question that has me stuck for the last few days, but it's an important part of my website. What I am trying to do is basically have a user be able to upload a file, have that uploaded file plus some other info automatically display on other parts of my site, and have a different user eventually be able to download that file. I have thought about allowing the file upload as a BLOB but still cannot find a proper way to execute this using VB, plus I have heard that this way of doing it is not recommended because databases were not designed to store large files like this; lots of articles recommend having the file upload to a folder on your server and then placing a reference to that particular file in the database. Well, this also proves to be a lot harder than said. Here is what I've got so far (written in C#):

protected void UploadBtn_Click(object sender, EventArgs e)
{
    if (FileUpLoad1.HasFile)
    {
        FileUpLoad1.SaveAs(@"C:\Documents and Settings\Adam\My Documents\Visual Studio 2005\Projects\Website\Files\" + FileUpLoad1.FileName);
        Label1.Text = "File Uploaded: " + FileUpLoad1.FileName;
    }
    else
    {
        Label1.Text = "No File Uploaded.";
    }
}

and here is the ASP part of the code that goes with it:

<asp:FileUpload ID="FileUpLoad1" runat="server" />
<asp:Label ID="Label1" runat="server" Text=""></asp:Label>

Now, from what I know, I need to get the binary of the file, which I have read you can do with the Page.Request.Files collection, but again I'm not sure how I would implement this. Does anyone have any suggestions on which way I should take when dealing with this? Should I just try to use the BLOB method, or use the file-reference method? And if so, how would I implement it? Heck, even some good tutorials on the subject would be great. Thanks, Adam
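If it helps frame the question: for the file-reference route, the table I'd imagine is something minimal like this (a sketch; all names are mine):

CREATE TABLE dbo.UploadedFiles (
    FileId       int IDENTITY(1,1) PRIMARY KEY,
    OriginalName nvarchar(260) NOT NULL,   -- name shown to downloaders
    StoredPath   nvarchar(260) NOT NULL,   -- where SaveAs put the file
    UploadedBy   nvarchar(128) NOT NULL,
    UploadedAt   datetime      NOT NULL DEFAULT GETDATE()
);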
I need to know how to create an ASCII 7-bit flat file using Integration Services. I only have basic characters in the flat files; the only other characters required are a pipe (|), which is used as the column delimiter, and additionally it will have a line feed (LF), which is used as the row delimiter.
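For reference, my current guess at the Flat File Connection Manager settings (please correct me if the code page is wrong):

Code page:        20127 (US-ASCII)
Format:           Delimited
Column delimiter: Vertical Bar {|}
Row delimiter:    {LF}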
I created an SSIS package to extract data from a flat file source and load the data into a table in a database. After I created the package, I checked it in to source control (Perforce).
But the problem is that once a month a new flat source file comes and the data should be updated. Once the new flat file comes, is there any way that the SSIS package can identify the path of the flat file and execute automatically? In the flat file source only the data will be changed, not the location or data types or anything. Can I use parameters to do that?