From my workstation (SQL Server 7 Desktop Edition SP3), I seem unable to restore a database on my server (SQL Server Standard Edition SP3). I am logged into both machines, and I am an administrator on both. Using either a UNC path or a mapped drive (see below):
RESTORE DATABASE ogAEC FROM ogAECDump WITH REPLACE, RECOVERY, STATS
, MOVE 'AEC_Data' TO '\\Og-sqlsrvr\C-Drive\MSSQL7\Data\ogAEC_Data.MDF'
, MOVE 'AEC_Log' TO '\\Og-sqlsrvr\C-Drive\MSSQL7\Data\ogAEC_Log.LDF'
RESTORE DATABASE ogAEC FROM ogAECDump WITH REPLACE, RECOVERY, STATS
, MOVE 'AEC_Data' TO 'Q:\MSSQL7\Data\ogAEC_Data.MDF'
, MOVE 'AEC_Log' TO 'Q:\MSSQL7\Data\ogAEC_Log.LDF'
I get
Server: Msg 3156, Level 16, State 2, Line 1
The file '\\Og-sqlsrvr\C-Drive\MSSQL7\Data\ogAEC_Data.MDF' cannot be used by RESTORE. Consider using the WITH MOVE option to identify a valid location for the file.
But I am successful if I run essentially the same command locally from the server:
RESTORE DATABASE ogAEC FROM ogAECDump WITH REPLACE, RECOVERY, STATS
, MOVE 'AEC_Data' TO 'C:\MSSQL7\Data\ogAEC_Data.MDF'
, MOVE 'AEC_Log' TO 'C:\MSSQL7\Data\ogAEC_Log.LDF'
What can I do to be able to restore DBs from my workstation?
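In case the exact form matters, here is a minimal sketch of what I had expected to work from the workstation: a FILELISTONLY check to confirm the logical names, then a restore that keeps the MOVE targets local to the server, since those paths are resolved by the server's own service account rather than my workstation login. The \\Og-sqlsrvr\Backups share and .bak file name are placeholders (my real source is the named backup device above).

-- Confirm the logical file names stored in the backup (hypothetical share and file name)
RESTORE FILELISTONLY FROM DISK = '\\Og-sqlsrvr\Backups\ogAEC.bak'

-- Restore using paths that are local to the target server
RESTORE DATABASE ogAEC
FROM DISK = '\\Og-sqlsrvr\Backups\ogAEC.bak'
WITH REPLACE, RECOVERY, STATS,
     MOVE 'AEC_Data' TO 'C:\MSSQL7\Data\ogAEC_Data.MDF',
     MOVE 'AEC_Log' TO 'C:\MSSQL7\Data\ogAEC_Log.LDF'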
I have been trying to use OPENROWSET with a shared drive. Even though the share has "full control" permissions granted to "everyone", and the account that SQL Server runs under has been granted explicit full control permissions, I am unable to open the file, which itself has no security on it.
Can I not use a \\ (UNC) path, and only use mapped drives?
Thanks
below works...
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=C:\5People.xls', [Sheet1$])
below doesn't work...
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=\\cluster02\FileManager\5People.xls', [Sheet1$])
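One thing I have been using to narrow it down is checking what the SQL Server service account itself can see, rather than what my own login can see (this assumes xp_cmdshell is enabled on the instance):

-- Runs as the service account: can SQL Server itself reach the share?
EXEC master..xp_cmdshell 'dir \\cluster02\FileManager'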
I created a DTS local package on the SQL Server. It basically imports a text file into a table in my database. The file resides on a mapped drive (X:) pointing to another server. When I schedule the DTS as a job, it fails; it doesn't execute any of the workflow in the design.
However, when I copy the text file into a drive local to the SQL server (D:), it runs flawlessly.
What I do right now is have a Windows scheduled task run a batch file that copies the text file from X: to D: at certain intervals; then the scheduled job runs to import it.
What am I missing? How come the job scheduler can't read the file directly from the mapped drive?
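For context, my assumption is that the mapped X: drive only exists in my interactive session, so the job would have to reference the underlying UNC path directly. As a rough sketch of the import done that way (the file server, share, file, and table names below are made up for illustration, and this uses BULK INSERT rather than the DTS package itself):

-- Hypothetical import straight from the UNC path that X: maps to
BULK INSERT dbo.ImportTable
FROM '\\fileserver\exports\datafile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)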
I am having a problem with a DTS package that pulls from a flat file on a mapped drive. When the package is run alone, it runs perfectly, but the stored proc that I took from an example on the net will not execute the DTS package properly, and I am unsure why.
CREATE PROC spExecuteDTS
    @Server      varchar(255),
    @PkgName     varchar(255),        -- Package name (defaults to most recent version)
    @ServerPWD   varchar(255) = NULL, -- Server password if using SQL security to load the package (UID is SUSER_NAME())
    @IntSecurity bit = 0,             -- 0 = SQL Server security, 1 = integrated security
    @PkgPWD      varchar(255) = ''    -- Package password
AS
SET NOCOUNT ON
/* Return values: 0 = successful execution of package, 1 = OLE error, 9 = failure of package */
DECLARE @hr int, @ret int, @oPKG int, @Cmd varchar(1000)

-- Create a package object
EXEC @hr = sp_OACreate 'DTS.Package', @oPKG OUTPUT
IF @hr <> 0
BEGIN
    PRINT '*** Create Package object failed'
    EXEC sp_displayoaerrorinfo @oPKG, @hr
    RETURN 1
END

-- Uninitialize the package
EXEC @hr = sp_OAMethod @oPKG, 'UnInitialize'
IF @hr <> 0
BEGIN
    PRINT '*** UnInitialize failed'
    EXEC sp_displayoaerrorinfo @oPKG, @hr
    RETURN 1
END

-- Clean up
EXEC @hr = sp_OADestroy @oPKG
IF @hr <> 0
BEGIN
    EXEC sp_displayoaerrorinfo @oPKG, @hr
    RETURN 1
END

RETURN @ret
GO
That is the stored proc I am using, along with a couple of error-trapping ones, but this is the one that does the actual execution. Is there anything I can change about it so that it runs the DTS package properly from the mapped drive?
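For completeness, this is roughly how I am calling it (the server and package names here are just examples, not my real ones):

-- Hypothetical call: run the named package on the local server with integrated security
DECLARE @rc int
EXEC @rc = spExecuteDTS
     @Server      = 'MYSERVER',
     @PkgName     = 'ImportFlatFile',
     @IntSecurity = 1
SELECT @rc AS ReturnCode  -- 0 = success according to the proc's comments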
I have a rather odd problem that hopefully you'll be able to shed some light on.
We want to back up the databases to a hard drive held on another server, so I mapped the drive in Explorer, then went into Enterprise Manager and tried to create a backup device, and it won't see the mapped drive.
I've tried mapping to my PC and I can see that via Enterprise Manager's backup screens (in fact any PC in the office works), but it won't see any of the servers, even though we can map to them and access them via Windows Explorer.
I've tried when logged on as sa and as the Windows NT administrator and still no luck. In fact, no matter what I type or do, it fails and keeps telling me "device error" or "device offline", which it isn't.
On our test instance of SQL Server we can back up to other servers, but not on the new live one!
Any thoughts on what might cause this to happen and how to fix it?
How do I restore a database from a mapped network drive? I have taken a backup on a production server and want to restore it on a development server without copying it to the local hard drive, as I do not have enough space on my local drive. I am trying the following command, but I get an error saying it cannot recognize 'g:\mssql7\backup\Production.bak'.
RESTORE DATABASE TestRestore FROM DISK = 'g:\mssql7\backup\Production.bak' WITH REPLACE, MOVE 'TestRestore' TO 'c:\mssql7\data\NewNwind.mdf', MOVE 'TestRestore_log' TO 'c:\mssql7\data\NewNwind.ldf'
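For comparison, this is the shape of statement I assume would be needed if the mapped drive were replaced with the UNC path it points to, since the G: mapping exists in my logon session rather than in the SQL Server service's session (the server and share names below are guesses, not the real ones):

RESTORE DATABASE TestRestore
FROM DISK = '\\ProdServer\mssql7\backup\Production.bak'
WITH REPLACE,
     MOVE 'TestRestore' TO 'c:\mssql7\data\NewNwind.mdf',
     MOVE 'TestRestore_log' TO 'c:\mssql7\data\NewNwind.ldf'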
I have VC++ express and MSSQL SMS express and have an application working nicely locally. The Data explorer and data connections part works really easily.
Now, I want to make the application available to my home network.
I mapped the drive where the database is and called it Z:, so I could put my "release" on my other network PC, assuming it would find Z: if I mapped the shared network drive on that machine and also called it Z:.
But: I can't even add the mapped connection on the local machine, I get:
The file "Z:databasescalorie.mdf" is on a network path not supported for database files. An attempt to attach.....etc"
It works fine on the original F drive.......
Am I approaching this the wrong way. How should I distribute to network PCs?
Our production database is located on one server, and our test database resides on another. Often, we need to restore the production dump to the test database. But the Restore Device from Server window doesn't seem to allow references to dump files on a different server. I've tried using the Add Backup Disk File window to point to the remote location, but it doesn't seem to save the reference when I close the window.
I'm sure there's a way to handle this without copying the dump file from one server to the other.
This would seem to be a common scenario. Thanks for any suggestions that others have found useful.
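In case it helps frame what I am after, this is the kind of T-SQL I assumed would avoid the copy step entirely: dump straight to a share on the test server, then restore there from the now-local file (the server, share, folder, and database names are placeholders):

-- On the production server: write the dump directly to a share on the test server
BACKUP DATABASE Production
TO DISK = '\\TestServer\Dumps\Production.bak'
WITH INIT

-- On the test server: restore from its local copy of that file
RESTORE DATABASE ProductionTest
FROM DISK = 'D:\Dumps\Production.bak'
WITH REPLACE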
I have configured a clustered SQL Server 2005 instance (active/passive) and added 4 LUNs to the cluster; the cluster works without problems. However, I am having issues backing up and restoring on 2 of the drives. I can complete the operation (a restore of a 100 MB database) in less than a minute if I use 2 of the drives, but it takes about an hour if I use the other 2 drives. The wait types shown are ASYNC_IO_COMPLETION and BACKUPTHREAD in dm_exec_requests.
I have detached the drives and reformatted them, but still the same issue. Any help?
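For reference, this is the query I have been running to watch the restore while it is slow (filtering on the backup/restore commands is just my choice):

SELECT session_id, command, status, wait_type, wait_time, percent_complete
FROM sys.dm_exec_requests
WHERE command IN ('RESTORE DATABASE', 'BACKUP DATABASE')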
As indicated by my stupid question, I am very new to SQL. Our version is 2000, and I'm talking about Enterprise Manager: the database that was created is not showing up in the list of databases, although I can see the file in Explorer.
The problem I'm having is that when I try to attach the database "mailarchive3Q2007_data.mdf", it also looks for the log file "mailarchive3Q2007_log.ldf". The log file was removed by someone else from our system. I have a backup of the file, but it is too large to restore now (160 GB); when the system was first set up, the recovery model was not set to simple, so the log just grew until it filled up our drive. I no longer have the drive space necessary to restore the log file and shrink it. So what do I do now? I need some kind of "mailarchive3Q2007_log.ldf" file to attach the database in Enterprise Manager.
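The approach I was considering, assuming the database was shut down cleanly when the data file was last detached, is to attach just the .mdf and let SQL Server build a new, empty log (the path below is simply an example of where the file might sit):

-- Attach the data file only; a new log file is created automatically
EXEC sp_attach_single_file_db
     @dbname   = 'mailarchive3Q2007',
     @physname = 'E:\Data\mailarchive3Q2007_data.mdf'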
We are attempting to restore one of our databases from a backup that went to a local drive on the server. We see the backupset in the list but receive an error that it is not available when we try to use it. When we try to restore from device and select the files the drive letter is not available. When we attempt to enter the path to the file it can't locate it. We don't have space on our SAN to copy the backup there and we can't add the local drive to the cluster resources.
I am trying to move a log file from one drive to another.
What I have done is add another file to the log. So now my log has a file on the E: drive and one on the F: drive. I now want to remove the file on the E: drive, and I have already emptied it. When running the command:
ALTER DATABASE Uniprodruntime REMOVE FILE m_rk_runtime_log
I get the following error message..
Server: Msg 5020, Level 16, State 1, Line 1
The primary data or log file cannot be removed from a database.
I have also gone into enterprise manager and tried to delete the file and it does nothing.
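My fallback plan, since the file on E: appears to be the original log file that the error is complaining about, is to move the whole log with a detach/attach instead (the .mdf and .ldf paths below are made up; only the drive letters reflect my setup):

-- Detach, move the .ldf to F: in Explorer, then re-attach from the new location
EXEC sp_detach_db 'Uniprodruntime'
EXEC sp_attach_db 'Uniprodruntime',
     'D:\Data\Uniprodruntime.mdf',
     'F:\Logs\m_rk_runtime_log.ldf'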
1: TempDB keeps getting filled. Restarting the server has not fixed it. I shrink it, but the space gets filled again, and now I can't even shrink it anymore.
2: TempDB is at the wrong location. Its current location is: C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLPROD6\MSSQL\DATA\tempdb
How do I change its location?
The correct location of TempDB should be the TempDB (T:) drive, but it's not there.
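This is the kind of statement I found for relocating tempdb, followed by a restart of the SQL Server service so the files are recreated on T: (the logical names tempdev/templog are the defaults, so that is an assumption about this instance, as is the T:\TempDB folder):

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\TempDB\templog.ldf');
-- Restart the SQL Server service; the old files under C:\ can then be deleted.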
Being a very novice SQL Server administrator, I need to ask the experts a question.
How do I go about moving a database from one drive to another? The source drive (C:) is local to the server, and the target drive (E:) is on a Storage Area Network (SAN), although it is still presented as a local drive to the server. I want to move the database from C: to E:. Can someone provide me with instructions?
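A sketch of the detach/copy/attach approach I have read about, so you can tell me whether I am on the right track (the database and file names are placeholders):

-- 1. Detach the database
EXEC sp_detach_db 'MyDatabase'
-- 2. Copy the .mdf and .ldf files from C: to E: in Windows
-- 3. Attach from the new location
EXEC sp_attach_db 'MyDatabase',
     'E:\MSSQL\Data\MyDatabase.mdf',
     'E:\MSSQL\Data\MyDatabase_log.ldf'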
I mapped a drive on my SQL Server box. It points to another server in the same domain. When I try to back up or restore a database, I can't see this mapped drive through SQL Server. Even if I type the entire path, SQL Server won't take it. I don't have a clue why it is not working. Can anyone throw some light on this? Your help is greatly appreciated.
Is there a way to set up a mapped database between MS SQL servers? I have 2 MSSQL servers; however, if someone connects to sql1 and tries to use a database that is on sql2, I want it to map to that database without giving an error that the database doesn't exist.
I tried linking them, but it still says "xxx database doesn't exist" since it resides on the other server.
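What I tried was roughly along these lines (the database, table, and view names are examples): register sql2 as a linked server on sql1 and then expose the remote objects under local names.

-- On sql1: register sql2 as a linked server
EXEC sp_addlinkedserver @server = 'SQL2', @srvproduct = 'SQL Server'
GO
-- Give a remote table a local name via a view
CREATE VIEW dbo.RemoteOrders AS
SELECT * FROM SQL2.SalesDB.dbo.Orders
GO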
I am trying to grant users permissions to our database, but when I add the users through a local group, I receive a mapped name of DOMAIN_USERNAME.
We are using standard security, with the SQL Server in a resource domain. I have created a local group with the global group from the accounts domain inside it. This seems like usual MS practice to me.
We do not want these long cumbersome login names, but I am slowly worrying that there may perhaps be no way around it. Can't they just login with their normal username?
Due to a previous (mis)configuration, I need to grant read/write permission on a share to a MSSQL DB user. The SQL user will launch T-SQL queries on demand, and they cannot be scheduled.
I've created a credential object in SSMS, configured it with the correct AD user, and mapped it to the MSSQL DB user. Now, if I execute a simple T-SQL backup:
BACKUP DATABASE [DB] TO DISK = N'\\IP.ADD.RE.SS\share\DB.bak' WITH NOFORMAT, NOINIT, NAME = N'DB-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
I get an access denied error. Monitoring the sqlservr.exe process via Procmon, I can see that the SQL Server process is not impersonating the AD user configured in the credential; it still tries to access the share using the local machine account.
If a user is mapped to "master" (in login properties, user mapping), are they able to access all databases, even though "master" is the only one with the check mark?
How can I back up half of the databases on a server to the C: drive and the other half to the D: drive, and vice versa (first half to D: and the other half to C:), using only one job and one stored procedure?
Using scheduling from the job, add 2 schedules, so the first schedule backs up the first half to C: and the second half to D:, and the second schedule backs up the first half to D: and the second half to C:.
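A rough sketch of the kind of procedure I have in mind, where the job's two schedules would simply pass opposite values for a flip parameter. The alphabetical split, the system-database filter, and the C:\Backups / D:\Backups folders are all assumptions, and it relies on sys.databases and ROW_NUMBER, so SQL Server 2005 or later.

CREATE PROCEDURE dbo.usp_BackupSplit
    @Flip bit = 0   -- 0: first half -> C:, second half -> D:; 1: the reverse
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @name sysname, @rank int, @total int, @drive char(1), @sql nvarchar(500);

    -- Count the user databases so the list can be split in half
    SELECT @total = COUNT(*) FROM sys.databases WHERE database_id > 4;

    DECLARE db_cur CURSOR FOR
        SELECT name, ROW_NUMBER() OVER (ORDER BY name)
        FROM sys.databases
        WHERE database_id > 4;

    OPEN db_cur;
    FETCH NEXT FROM db_cur INTO @name, @rank;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- First half of the alphabetical list goes to one drive, second half to the other
        SET @drive = CASE
                         WHEN @rank <= @total / 2
                             THEN CASE WHEN @Flip = 0 THEN 'C' ELSE 'D' END
                         ELSE CASE WHEN @Flip = 0 THEN 'D' ELSE 'C' END
                     END;

        SET @sql = N'BACKUP DATABASE [' + @name + N'] TO DISK = N'''
                 + @drive + N':\Backups\' + @name + N'.bak'' WITH INIT';
        EXEC (@sql);

        FETCH NEXT FROM db_cur INTO @name, @rank;
    END
    CLOSE db_cur;
    DEALLOCATE db_cur;
END
GO

One schedule's job step would then run EXEC dbo.usp_BackupSplit @Flip = 0 and the other EXEC dbo.usp_BackupSplit @Flip = 1.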
Our company is migrating a Microsoft Access 2010 back-end database to a SQL Server 2008 database. One of the memo fields in the Access back end can store up to 150 KB of Unicode data. To store this data in SQL Server, we found that we can use either the ntext or the nvarchar(max) data type.
Because ntext will be deprecated in future releases of SQL Server, the only good alternative for storing an Access memo field in SQL Server is nvarchar(max), which is what Microsoft recommends for large Unicode text. Storing a large amount of text such as 150 KB in an nvarchar(max) field works as expected when using SQL Server directly. However, if Access is used to store the data in a table linked to SQL Server, the maximum number of characters allowed is only 4,000. We found that this limitation is imposed by the ODBC driver, which limits nvarchar(max) to 4,000 characters.
The connection string we are currently using to link a table to SQL server is this:
ODBC;DRIVER={SQL Server Native Client 10.0};SERVER= SQLEXPRESS;DATABASE=TestDB;Trusted_Connection=No;UID=uid;PWD=pwd;
Is there any solution to this limitation when storing large amounts of data in a Microsoft Access memo field mapped to an nvarchar(max) column in a SQL Server database?
In my SSIS package I have a text file source that I am mapping to a destination table. I have an error component that logs any row-level errors, and I have noticed that it is not logging the correct field. I know this because I have a few different sources that submit the same files, and I have looked at both. The only difference between the one that works and the one that does not is that 2 of the 25+ columns are switched. I would not think this would matter, because field A in the text file is mapped to field A in the database.
Does the order in which the fields come into the SSIS package matter?