Hi All,
I've been trying to find the answer but have been unable to. My question: is it possible to create a SQL Server 2005 database on a USB 2.0 drive? I have a large USB drive that I would like to store my database on, instead of my local drive, which is not that big.
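For reference, a minimal sketch of what that would look like, assuming the USB drive is presented to Windows as an ordinary local volume (the drive letter, folder, and names below are made up):

CREATE DATABASE UsbTestDb
ON (NAME = UsbTestDb_Data, FILENAME = 'G:\SQLData\UsbTestDb.mdf')        -- G: assumed to be the USB drive
LOG ON (NAME = UsbTestDb_Log, FILENAME = 'G:\SQLData\UsbTestDb_log.ldf');
-- The folder must already exist, the SQL Server service account needs write access to it,
-- and the drive has to be attached every time the service starts.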
I am able to run SSIS packages as SQL Server Agent jobs with a Control Flow "File System Task" if I move a file (test.txt) from a drive (C:) on the server (where the SQL Agent jobs run) to a subdirectory on the same drive. But if I try to move a file on a network drive, the package fails.
I am trying to move a log file from one drive to another.
What I have done is add another file to my file group, so now my log has a file on the E: drive and one on the F: drive. I now want to remove the file on the E: drive, which I have already emptied. When I run the command:
ALTER DATABASE Uniprodruntime REMOVE FILE m_rk_runtime_log
I get the following error message:
Server: Msg 5020, Level 16, State 1, Line 1 The primary data or log file cannot be removed from a database.
I have also gone into Enterprise Manager and tried to delete the file, and it does nothing.
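As a rough sketch (not tested against this database): Msg 5020 means the file being removed is the database's original/primary log file, which can only be moved, not removed; only a secondary log file can be dropped once it holds no active log. Names and paths below are examples only:

-- Removing a *secondary* log file (hypothetical logical name) once it is empty:
ALTER DATABASE Uniprodruntime REMOVE FILE m_rk_runtime_log2
-- Moving the *primary* log file instead (SQL Server 2000-style detach/attach):
EXEC sp_detach_db 'Uniprodruntime'
-- copy the .ldf from E:\ to F:\ at the operating-system level, then:
EXEC sp_attach_db 'Uniprodruntime',
    'e:\data\Uniprodruntime.mdf',
    'f:\log\Uniprodruntime_log.ldf'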
I want to keep Win98 and my files intact on my C: drive, then install another hard drive and install MS BackOffice (NT 4.0, SQL 7.0) on the new drive. When I want to switch operating systems, I want to restart in DOS, go to the new drive, and launch NT. I can't dual-boot because I have the FAT32 file system on my C: drive. Has anybody tried this? Does it work?
I created a DTS local package on the SQL Server. It's basically importing a text file into a table in my database. This file resides on a mapped drive (X:) from another server. When I schedule the DTS as a job, it fails. It doesn't execute any of the workflow in the design.
However, when I copy the text file into a drive local to the SQL server (D:), it runs flawlessly.
What I do right now is I have a Windows scheduled task that runs a batch file that copies the text file from X: to D: at certain time intervals. Then the job scheduler runs to import it.
What am I missing? How come the job scheduler can't read the file directly from the mapped drive?
I have an Access database on a shared network drive, and I have read/write rights on that shared network drive. When I try to access data through the linked server, it gives me a message box saying I do not have permissions to view the data. Also, if I try to use xp_cmdshell to copy the .mdb file to my local drive, it says 'Access denied'.
But when I copy (through the command prompt) the same file to another network drive or my local drive, where I have full control, the linked server can connect successfully.
The problem is that I cannot have 'full control' permissions on the shared drive where my database resides.
We had some SAN issues and we don't have transaction log files for some databases. The drive which was holding these T-log files is missing. How do we bring the databases back?
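A heavily hedged sketch of the usual last-resort sequence when the log drive is gone and no usable backup is available (the database name is a placeholder; this can lose in-flight transactions, so check what backups exist first):

ALTER DATABASE MyDB SET EMERGENCY;
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC CHECKDB (MyDB, REPAIR_ALLOW_DATA_LOSS);   -- rebuilds the missing log as part of the repair
ALTER DATABASE MyDB SET MULTI_USER;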
In my environment, there is a maintenance plan configured on one of the servers, and while running DBCC CHECKDB on a database of around 200 GB, the log file usage of tempdb keeps increasing and causes the maintenance job to fail.
What can I do to make the maintenance job run successfully? The tempdb database is only 50 GB and its recovery model is set to simple; it cannot be increased because the mount point it resides on is 50 GB.
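Two hedged options that are sometimes used to keep CHECKDB's tempdb footprint inside a fixed 50 GB (the database name is an example):

-- Report how much tempdb space the check is expected to need, without running it:
DBCC CHECKDB (MyBigDB) WITH ESTIMATEONLY;
-- Run a lighter-weight check (page/allocation consistency only), which uses far less tempdb:
DBCC CHECKDB (MyBigDB) WITH PHYSICAL_ONLY;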
Hi guys, I am new to SQL. I wanted to know how you execute a script file in Query Analyzer, and also how you get the output to a file. Thanks for the help; I need this info ASAP.
Being a very novice SQL Server administrator, I need to ask the experts a question.
How do I go about moving a database from one drive to another? The source drive (C:) is local to the server, but the target drive (E:) is on a Storage Area Network (SAN), although it is still a local drive for the server. I want to move the database from C: to E:. Can someone provide me with instructions?
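One common route, sketched here with made-up database and path names (detach, copy the files, re-attach); taking a full backup first is the usual precaution:

EXEC sp_detach_db 'MyDB';
-- copy MyDB.mdf and MyDB_log.ldf from C:\ to E:\ at the operating-system level, then:
EXEC sp_attach_db 'MyDB',
    'E:\SQLData\MyDB.mdf',
    'E:\SQLLogs\MyDB_log.ldf';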
I have several indexes on a filegroup that I would like to move to a different physical drive. I am aware of the sp_detach...sp_attach routine, which allows moving the .mdf and .ldf files to a different location. How would I go about moving an .ndf file, though?
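A sketch of the same detach/attach routine, just listing the .ndf in its new location; every file of the database has to be listed when re-attaching (names and paths are examples):

EXEC sp_detach_db 'MyDB';
-- move only the .ndf to the new physical drive, then re-attach listing all files:
EXEC sp_attach_db 'MyDB',
    'D:\SQLData\MyDB.mdf',
    'F:\SQLIndexes\MyDB_ix1.ndf',   -- the index filegroup file, now on its new drive
    'D:\SQLLogs\MyDB_log.ldf';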
Hello, I have created a package that runs without problems. I run the package with the command dtexec /F "package_name.dtsx" > package_name.txt.
Then I run the same package from SQL Server Agent, and everything is OK.
Then I tried to edit the command line to add the output file, but I got an error.
The command line is: dtexec /F "package_name.dtsx" /MAXCONCURRENT "-1" /CHECKPOINTING OFF /REPORTING E > package_name.txt (the /MAXCONCURRENT "-1" /CHECKPOINTING OFF /REPORTING E part is added by default).
I'm trying to move the transaction logs of my databases to a different drive (for fault tolerance). I can create a second transaction log file for each database via Enterprise Manager, but I have two questions:
1) If two transaction log files exist for a database, which one does it use?
2) How do I force SQL Server to use the new transaction log file (so I can delete the old one)?
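As a hedged sketch for question 1: the log is written sequentially across its files, so there is no setting that forces one file to be "the" log; DBCC LOGINFO shows where the active portion currently sits (the database name is an example):

USE MyDB;
DBCC LOGINFO;   -- FileId = physical log file, Status = 2 marks the active virtual log files
-- The log file created with the database cannot be removed, so relocating the log usually
-- means moving that original .ldf (detach, copy, re-attach) rather than adding a second file.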
I know it is not going to work; I'm just wondering if anyone could give any reasons for that. Was it an intentional constraint, or was Compact Edition simply designed internally in a particular way that makes such usage impossible? It is a pity that it doesn't work that way, as it would be a much easier transitional path for many Visual FoxPro and MS Access applications. In my case I just want a READ-ONLY database, either for multi-user access via a shared drive or stand-alone on a local drive.
As indicated by my stupid question, I am very new to SQL. Our version is 2000, and I'm talking about Enterprise Manager: the database that was created is not showing up in the list of databases, although I can see the file in Explorer.
The problem I'm having is that when I try to attach the database "mailarchive3Q2007_data.mdf", it is also looking for the log file "mailarchive3Q2007_log.ldf". The log file was removed by someone else from our system. I have a backup of the file, but it is too large to restore now (160 GB); when the system was first set up, the recovery model was not set to simple, so the log just grew until it filled up our drive. I no longer have the drive space necessary to restore the log file and shrink it. So what do I do now? I need some kind of "mailarchive3Q2007_log.ldf" file to attach the database in Enterprise Manager.
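If the database was shut down cleanly when the .mdf was last detached, a new log can be rebuilt at attach time rather than restoring the 160 GB one. A sketch with an assumed path, in both the SQL Server 2005+ and SQL Server 2000 forms:

-- SQL Server 2005 and later:
CREATE DATABASE mailarchive3Q2007
ON (FILENAME = 'D:\Data\mailarchive3Q2007_data.mdf')   -- path is an example
FOR ATTACH_REBUILD_LOG;
-- SQL Server 2000:
EXEC sp_attach_single_file_db 'mailarchive3Q2007', 'D:\Data\mailarchive3Q2007_data.mdf';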
How do I back up half of the databases from a server to the C: drive and the other half to the D: drive, and vice versa (first half to D: and the other half to C:), using only one job and one stored procedure?
Using scheduling on the job, add two schedules, so the first schedule backs up the first half to C: and the second half to D:, and the second schedule backs up the first half to D: and the second half to C:.
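A rough sketch of one way to read this (all object names and paths are assumptions, and it assumes SQL Server 2008 or later for the inline DECLARE and += syntax). Since an Agent schedule cannot carry a different parameter per run, this version swaps the drives itself on even/odd days, so one job with a single T-SQL step (EXEC dbo.BackupHalves) covers both runs:

CREATE PROCEDURE dbo.BackupHalves
AS
BEGIN
    -- Alternate which half goes to which drive: even days C/D, odd days D/C.
    DECLARE @firstDrive char(1) = CASE WHEN DAY(GETDATE()) % 2 = 0 THEN 'C' ELSE 'D' END;
    DECLARE @otherDrive char(1) = CASE @firstDrive WHEN 'C' THEN 'D' ELSE 'C' END;
    DECLARE @mid int = (SELECT COUNT(*) / 2 FROM sys.databases WHERE database_id > 4);
    DECLARE @i int = 0, @name sysname, @drive char(1), @sql nvarchar(max);

    DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
        SELECT name FROM sys.databases WHERE database_id > 4 ORDER BY name;   -- skip system databases
    OPEN dbs;
    FETCH NEXT FROM dbs INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @i += 1;
        -- first half of the alphabetical list goes to one drive, the rest to the other
        SET @drive = CASE WHEN @i <= @mid THEN @firstDrive ELSE @otherDrive END;
        SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@name)
                 + N' TO DISK = N''' + @drive + N':\Backups\' + @name + N'.bak''';
        EXEC (@sql);
        FETCH NEXT FROM dbs INTO @name;
    END
    CLOSE dbs;
    DEALLOCATE dbs;
END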
I'm trying to call WinSCP in an SSIS Execute Process Task using a .bat file to automate it. SSIS is returning a generic failure error message ("Error: 0xC0029151 at Execute Process Task, Execute Process Task: In Executing "\\MyServer\C$\Program Files (x86)\WinSCP\WinSCP.exe" "-script=\\MyNASShare\WinSCPFTP.bat " at "", The process exit code was "1" while the expected was "0"."). So now I'm trying to run the .bat file by itself. Unfortunately, the command prompt rushes by so fast that I can't see what the server is doing.
I can't find anything on the WinSCP site that indicates how to slow down the .bat file processing or log the remote server responses. I do see how to use the /log switch when using the interactive command line console, but that's not what I want to do.
We are receiving the following alerts frequently at about 1:40 AM. We have backups running at 11:00 PM every day and a rebuild job running at 2:00 AM. I am not sure of the exact cause of this error.
Error:
The file group "PRIMARY" for the database "tempdb" in SQL instance "MSSQLSERVER" on computer "XYZ" is running out of space.
tempdev: initial size 133,100 MB; growth by 10 percent, limited to 140,000 MB
templog: initial size 5,475 MB; growth by 10 percent, unlimited
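A hedged diagnostic sketch that may help narrow this down: capture what is actually occupying tempdb around 1:40 AM (for example, from an Agent job scheduled just before the alert window):

SELECT  SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb,
        SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,   -- sorts, hashes, etc.
        SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
        SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_mb
FROM tempdb.sys.dm_db_file_space_usage;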
We are running SQL 7 on a Windows NT server. If you copy a 25 MB file from this machine to a W2K server, the file copy takes over 5 minutes on a 100 Mbps switched network. Copying the same file to another NT server takes only seconds, and copying the same file to the W2K server from the second NT server (which is not running SQL) takes only seconds as well. Has anyone any ideas as to why file copying between this machine and a W2K one takes so long? It is replicated on 5 further W2K machines.
I'm hoping that someone can help me. I have a SQL 2005 SSIS package that will run Friday mornings to empty/load a table with data from another database. On Friday evenings I'll need to run another package, but I want to make sure the table load completed prior to launch. For this I planned to use a file watcher task; however, I cannot for the life of me figure out how to output a 'done' semaphore from the morning job to a networked drive.
A File System Task will not work because there is no 'create file' option. I do not have an existing file that I can rename, either.
I tried an Execute Process Task running cmd.exe with the following argument:
This fails because UNC paths are not recognized. (The package executes from another server, so I cannot use a local path, nor am I allowed to set up a local share.)
Can someone offer an alternative suggestion? I'm really hoping this is easier than I'm making it.
I have to perform disk maintenance on the current drive, D:, which holds the SQL data (.mdf file), and I have added a new drive, E:. (Drive C: has the program files for SQL Server 2008 R2.) What is the correct process to transfer the SQL data (.mdf file) from drive D: to drive E:, and later remove drive D: from the server?
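Since this is SQL Server 2008 R2, a sketch of the move-without-detach approach (the logical name and paths below are placeholders; sys.database_files shows the real ones):

ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Data, FILENAME = 'E:\SQLData\MyDB.mdf');
ALTER DATABASE MyDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- copy the .mdf from D:\ to E:\ at the operating-system level, then:
ALTER DATABASE MyDB SET ONLINE;
-- once the database is back online and verified, the old copy on D:\ can be deleted
-- and the drive retired.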
We have an encrypted drive (which can be mounted and dismounted; a third-party tool encrypts the drive path). I wanted to store the secondary file on that encrypted drive path. The secondary file stores confidential information. I separated the table from the primary file into the secondary file. Per-column encryption is not advisable for that table, so we decided to separate that table and put it on a secondary filegroup. The physical file is stored in the mounted drive path.
I can read and write in that mounted drive path. I can also read and write if the drive is unmounted (and I believe the reads and writes really are being done). When the drive is unmounted, the physical secondary file (.ndf) is not visible to any user logging in to the server itself (which is actually the goal of this encrypted drive setup). It is kept virtually somewhere on the machine. To mount it back, a password is needed.
I'm a bit confused; can somebody advise or give their insight on this setup? I believe that when the drive is dismounted, SQL Server stores the transactions in cache until it finds that the drive is mounted back. This would mean that those transactions are not committed yet. When the drive is mounted back, I think SQL Server is smart enough to know that the drive is physically present again and will flush all the pending transactions from the cache to the hard drive.
Is my assumption correct? Is there anything I need to know about transactions, commits, and the flushing of data to the hard drive?
I recently installed a standalone version of SQL Server 2014 Standard on my work computer. I used Access before, but I want to use SQL Server instead.
We have a shared drive where a file gets deposited every day at midnight. I want to be able to get this file and import it into the server (it's basically a list of names).
Here's what I have done so far:
I created the database
Created the file and successfully imported data into it using the Import Data feature.
I saved the SSIS package
Scheduled an Agent job for this package to run at a certain time, daily
At first the jobs would fail with an 'Access is Denied' error. I added a user under Credentials with my network account (I have admin rights on the work computer). I also added a proxy for the credential I made.
The jobs now fail with a “Cannot open data file” error. I tried changing things here and there, but I can’t get it to work.
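For what it's worth, a sketch of the credential/proxy wiring in T-SQL (account, password, and names are placeholders). The job step's "Run as" box then has to be switched from the SQL Server Agent service account to the proxy, and the credential's account needs share and NTFS permission on the folder the file sits in:

CREATE CREDENTIAL SsisFileCred
    WITH IDENTITY = N'DOMAIN\MyNetworkAccount', SECRET = N'account-password-here';
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SsisFileProxy', @credential_name = N'SsisFileCred';
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisFileProxy', @subsystem_id = 11;   -- 11 = SSIS package execution subsystem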
I have created a batch file that executes a stored proc to import data.
When I run it from the server (Remote Desktop) it works fine, but if I share the folder and try to run it from my PC, it doesn't do anything. I don't get an error; it just doesn't do anything. My Windows user has admin rights in SQL. Why is it not executing from my PC?
I created an SSIS package which generates an output file and places it on a remote file share location, which looks something like this:
\\RemoteServerName\RemoteFilePath
The package executes fine when I run it through BIDS or through the Execute Package Utility, and it writes the output file to the remote file share location.
I created a SQL Agent job for the package and ran the job. It throws an error saying:
Executed as user: Domain\User. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:33:06 AM Error: 2008-03-10 10:33:22.22 Code: 0xC020200E Source: DFT_Generate Output File Description: Cannot open the datafile "\\RemoteServerName\RemoteFilePath\OutputFileName.txt". End Error Error: 2008-03-10 10:33:22.34 Code: 0xC004701A Source: DFT_Generate Output File DTS.Pipeline Description: component "FF_DST_Output" (160) failed the pre-execute phase and returned error code 0xC020200E. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:33:06 AM Finished: 10:33:22 AM Elapsed: 15.891 seconds. The package execution failed. The step failed.
Domain\User has all the permissions on the remote file share location. SQL Server Agent is running with the log-on account Domain\User (same as above).
I have a database data file at almost 2 TB, maxing out a Windows drive; only 16 GB are left. Should I just add another data file on another Windows drive for growth? Or move the current huge data file to a new GPT drive? Or do both: add another data file and move the existing one to its own new GPT drive?
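If the add-a-file route is taken, a sketch of what that looks like (names, sizes, and the F:\ path are examples); the existing 2 TB file stays where it is and new allocations go to whichever file in the filegroup has free space:

ALTER DATABASE MyBigDB
ADD FILE (
    NAME = MyBigDB_Data2,
    FILENAME = 'F:\SQLData\MyBigDB_Data2.ndf',
    SIZE = 100GB,
    FILEGROWTH = 10GB
) TO FILEGROUP [PRIMARY];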
The MDF and LDF files are placed on an SSD drive and the tempdb files are placed on an HDD drive. Snapshot isolation is enabled on the database. When a script is executed to insert a NULL value into a table which has a NOT NULL column, the transaction fails, and then a log undo happens which fails and takes the database into suspect mode.
But when the MDF and LDF files are placed on the HDD drive, none of this happens; the transaction just fails.
I recently created a program that connects to a Microsoft SQL database that was stored on my computer, and it worked fine. As soon as I tried to connect to the same database via a network drive, I got an error stating that "The file Y:\Filename.mdf is on a network path that is not supported for database files." I can't seem to get it to work; if anybody has any ideas what I'm doing wrong, I would appreciate your help.
Here's a challenge that is killing me: I've got 2 web servers and a SQL Server and about 5,000 'users' who stay connected to the site all day. I have a page that is supposed to be 'real-time', so to do this, I have a 1px frame that refreshes every 15 seconds (so the other frame doesn't have to reload all the time -- the top only reloads when a new record or a changed record hits the db). The real-time data can be filtered in about 8 different ways.
Currently, I have each user querying a table that contains 1 record, including the max ID and the most current insert/update posting date. The browser contains a cookie with that date. When the browser receives notification that there is some new info on the server, it refreshes the top page and reloads the data. This is happening for all users. So, to eliminate the 5,000 users running the same query (or close variations of it) each time a record is inserted/updated, I thought I would generate an XML file with the current day's data.
In a dev environment this works 'ok'. I'm doing this by running an ActiveX job on the SQL Server that calls a stored proc (FOR XML) and writes the content to a file. Then from the web servers, I'm querying this file for the new timestamp and then, if it is newer than the cookie, grabbing the XML (using the httprequest in the ASP XMLDOM) and using XSLT to transform the data instead of parameterizing the queries. Theoretically I love this solution. Problems happen in a LIVE environment where either the file is being written to or the job isn't able to run. When 2 records are trying to be written within the same second, the file isn't being written (or maybe the http request for the XML is keeping the file locked?)... anyway, this is a HUGE problem that I can't seem to solve. Once we roll to .NET I think I'll store the dataset in cache and update the cache (though I still don't know how I'll trigger that without each user checking the db)... Long winded, sorry... help?
I need to copy a just-created .bak file to another drive after the backup task has completed. I don't see anything in the job toolbox which handles file system operations like this, but it must be a common need. There are ways to script this or use third-party tools, but I am looking for something native to the SQL Server 2012 SSMS toolset, if possible.
An alternate approach would be to run the backup job again, after the main backup, and change the destination to the alternate location. But I was thinking that another backup job would probably put more overhead on the server than a simple file copy operation. If I do end up taking this approach, I could also use the cleanup task to toss older .bak files in the alternate directory.
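One hedged sketch of a copy step that stays inside the Agent job itself: a second T-SQL job step that shells out to robocopy. It requires xp_cmdshell to be enabled, which is a security trade-off, and the paths below are examples:

EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;  RECONFIGURE;
-- copy any .bak written in the last day to the secondary drive
EXEC master..xp_cmdshell 'robocopy "D:\SQLBackups" "E:\SQLBackupCopies" *.bak /MAXAGE:1';

A CmdExec job step running the same robocopy line would avoid having to enable xp_cmdshell at all.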