SQL CE Database Doesn't Seem To Update Fast Enough.
Jan 4, 2008
I must be doing something wrong.
Code Block
TREE Form
' Build the INSERT (note: the Facility value is concatenated straight into the SQL text)
Dim ssql As New StringBuilder
ssql.Append("INSERT INTO FINDINGS (Facility) ")
ssql.Append("VALUES ('" & Facility & "')")
Try
    Dim NewRow As Integer = dba.ExecuteSQL_Affected(ssql.ToString)
Catch ex As Exception
    MsgBox("There was an error saving records.", MsgBoxStyle.Information, "No Key")
    Exit Sub
End Try
Assessment.dtblFindings_Initialize()
Code Block
Public Function ExecuteSQL_Affected(ByVal sSql As String) As Integer
    '// Execute a query like INSERT, UPDATE or DELETE and return the number of rows affected
    Dim RowsAffected As Integer
    Try
        If Conn.State = ConnectionState.Closed Then
            ' Assumes LocalDBLocation already ends with a path separator
            Conn.ConnectionString = "Data Source=" & oDBConfig.LocalDBLocation & "" & oDBConfig.LocalDBName & ";"
            Conn.Open()
        End If
        Dim cmd As New SqlCeCommand(sSql, Conn)
        cmd.CommandType = CommandType.Text
        RowsAffected = cmd.ExecuteNonQuery()
        cmd.Dispose()
        Conn.Close()
        Return RowsAffected
    Catch err As SqlCeException
        MsgBox(Utility.ComposeSqlErrorMessage(err))
    Catch ComErr As Exception
        MsgBox(ComErr.ToString, MsgBoxStyle.Information)
    Finally
        ' Note: nothing happens here, so the connection stays open if an exception was thrown above
    End Try
End Function
Code Block
Assessment Form
Public Sub dtblFindings_Initialize()
Dim rdr As SqlCeDataReader
Dim dba As New DBAccess
Dim ssql As StringBuilder = New StringBuilder
ssql.Append("SELECT Facility FROM FINDINGS")
rdr = dba.OpenResultSet(ssql.ToString)
Try
    rdr.Read()   ' Note: this Read() consumes the first row, so the While loop below starts at the second row
    While rdr.Read
    ...
So here is the problem. The normal flow is to initiate the insert by pressing a button. That should go through all the steps, then hit the dtblFindings_Initialize call and rebuild the datatable. However, when it happens for the first time (i.e. the first facility going into the database), the SELECT statement always returns nothing.
If I stop the application and pull the database to the desktop, the row has been inserted. So I feel that I am somehow doing something wrong, not closing something, not initializing something....argh! Please help!!
In an ASP, I have a dynamically created SQL statement that amounts to "SELECT * FROM Server1.myDB.dbo.myTable WHERE Col1 = 1" (Col1 is the table's primary key). It returns the data immediately when executed.
However, when the same record is updated with "UPDATE Server1.myDB.dbo.myTable SET Comments = 'blah blah blah' WHERE Col1 = 1", the page times out before the query can complete.
I watched the program in Profiler, and I saw on the update that sp_cursorfetch was being executed as an RPC once per each row in the table. In a table of 78000 records, the timeout occurs well before the last record is fetched, and the update bombs.
I can run the same statements in Query Analyzer from a linked server and have the same results. The execution plan shows that a Remote Query is occurring on the select that returns 1 row, and a Remote Scan is taking place on the update scanning 78000 rows (I guess this is where all the sp_cursorfetch calls are happening...?).
How can I prevent the Remote Scan? How can I prevent the execution of the RPC sp_cursorfetch for each row in the remote table?
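One technique often suggested for this situation is to push the whole UPDATE to the remote server, so the statement is not satisfied by a cursor fetching every remote row locally. A hedged sketch, assuming the linked server is named Server1 as in the statements above and that Col1 = 1 identifies a single row:

-- Option 1: update through OPENQUERY; the WHERE clause executes on the remote server
UPDATE OPENQUERY(Server1,
    'SELECT Comments FROM myDB.dbo.myTable WHERE Col1 = 1')
SET Comments = 'blah blah blah';

-- Option 2 (SQL Server 2005 or later only): run the whole statement at the remote server
EXEC ('UPDATE myDB.dbo.myTable SET Comments = ''blah blah blah'' WHERE Col1 = 1')
    AT Server1;

Either way the filtering happens remotely, which is the behavior the one-row SELECT already shows, rather than a remote scan driven row by row through sp_cursorfetch.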
OK, this is an UPDATE that I have working. But what do I do if the customer does not exist already? As written, it doesn't add the customer. How should I remedy this? If the customer does exist, it works great.
UPDATE AC
SET CustId = Left(CustomerId, 10),
    CustName = Left(CustomerName, 25),
    Addr1 = Left(Address1, 25),
    Addr2 = Left(Address2, 25),
    City = Left(ca.City, 15),
    Region = Left(State, 2),
    PostalCode = Left(Zip, 5)
FROM RIO.dbo.tblArCust AC
INNER JOIN (SELECT CustomerCode, MAX(LastUpdatedDate) MaxDate
            FROM COFFEE.dbo.vueCustomerAddress
            GROUP BY CustomerCode) V ON V.CustomerCode = AC.CustId
INNER JOIN COFFEE.dbo.vueCustomerAddress CA ON CA.CustomerCode = V.CustomerCode AND MaxDate = LastUpdatedDate
WHERE CA.addresstypeid = 1
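If the intent is to also add customers that are not yet in tblArCust, one common pattern is to keep the UPDATE above for existing rows and follow it with an INSERT of the rows that have no match. A rough sketch, abbreviated to two columns; which table CustomerId/CustomerName really live in is an assumption based on the UPDATE, and the latest-address deduplication (MAX(LastUpdatedDate)) is omitted here for brevity:

-- Hedged sketch: insert customers from the address view that do not exist yet.
INSERT INTO RIO.dbo.tblArCust (CustId, CustName)
SELECT LEFT(ca.CustomerId, 10), LEFT(ca.CustomerName, 25)
FROM COFFEE.dbo.vueCustomerAddress ca
WHERE ca.addresstypeid = 1
  AND NOT EXISTS (SELECT 1
                  FROM RIO.dbo.tblArCust ac
                  WHERE ac.CustId = ca.CustomerCode);

Running the UPDATE first and the INSERT second keeps the two sets of rows from overlapping.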
I was BCPing 12 million rows into a staging table. I used the '-b' option every 20K, which I thought would do a commit and clear the log in batches. After the process, EM appeared to show the transaction log as empty. Upon inspecting the BCP output file I discovered the message that the BCP did not complete because syslogs was full. I could not do a truncate transaction log or a dump database. I tried to do a truncate transaction with no_log and it appeared to just hang. I stopped the SQL Server thinking I could dump the transaction log, but could not start the SQL Server again. I then stopped the NT Server because 'if all else fails'. The SQL Server started but the user database is marked as recovering.
What should I have done? Is there anything that can be done other than restoring from backup? How does one know if the database is really recovering or is EM just joking? I can wait 2 hours before starting the restore.
My DB grew from 500 MB to 10 GB between 8/1998 and 12/2004. But now it is 16 GB (from 1/2005 to 5/2005). I don't know why the data size is growing so fast (nearly doubling)?
The primary database I'm responsible for has started to grow super fast. Every couple of days it grows by 10% (which matches the db settings). But the recent growth doesn't match the historical growth. It took a couple of months to grow from 7 to 8 GB, but it has grown to about 24 GB in the last 2 months. Bottom line - trust my assertion that it's growing alarmingly fast.
I need help determining what objects are fueling the growth. If I know the objects, I can probably determine the cause. On the flip side, it might be legitimate data stored very poorly. I'm open to any ideas...but I need to get ahead of this problem in the next week or so...or I'm going to run out of room on the hard drive and it could start to affect my users.
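A quick way to see which objects are taking the space is to ask the engine for per-table sizes and sort by reserved space. A hedged sketch, assuming SQL Server 2005 or later (on SQL Server 2000, running sp_spaceused per table gives similar numbers):

-- List tables by reserved space, largest first, in the current database.
SELECT
    s.name + '.' + t.name AS table_name,
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
    SUM(ps.reserved_page_count) * 8 AS reserved_kb
FROM sys.dm_db_partition_stats ps
JOIN sys.tables  t ON t.object_id = ps.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY reserved_kb DESC;

Capturing this output every day or two and diffing it usually makes the object that is fueling the growth obvious.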
I'm trying to work out a database design to make it quicker for my client program to read and display updates to the data set. Currently it reads in the entire data set again after each change, which was acceptable when the data set was small, but now it's large enough to start causing noticeable delays. I've come up with a possible solution but am looking for others' input on its suitability to the problem.

Here is the DDL for one of the tables:

create table epl_packages
(
    customer varchar(8) not null,     -- \
    package_type char not null,       --  primary key
    package_no int not null,          -- /
    dimensions varchar(50) not null default(0),
    weight_kg int not null,
    despatch_id int,                  -- filled in on despatch
    loaded bit not null default(0),
    item_count int not null default(0)
)

alter table epl_packages
add constraint pk_epl_packages
primary key (customer, package_type, package_no)

My first thought was to add a datetime column to each table to record the time of the last change, but that would only work for inserts and updates. So I figured that a separate table for deletions would make this complete. DDL would be something like:

create table epl_packages
(
    customer varchar(8) not null,
    package_type char not null,
    package_no int not null,
    dimensions varchar(50) not null default(0),
    weight_kg int not null,
    despatch_id int,
    loaded bit not null default(0),
    item_count int not null default(0),
    last_update_time datetime default(getdate())   -- new column
)

alter table epl_packages
add constraint pk_epl_packages
primary key (customer, package_type, package_no)

create table epl_packages_deletions
(
    delete_time datetime,
    customer varchar(8) not null,
    package_type char not null,
    package_no int not null
)

And then these triggers on update and delete (insert is handled automatically by the default constraint on last_update_time):

create trigger tr_upd_epl_packages
on epl_packages
for update
as
-- check for primary key change
if (columns_updated() & 1792) > 0    -- first three columns: 256+512+1024
insert epl_packages_deletions
select
    getdate(),
    customer,
    package_type,
    package_no
from deleted

update A
set last_update_time = getdate()
from epl_packages A
join inserted B
on A.customer = B.customer and
   A.package_type = B.package_type and
   A.package_no = B.package_no
go

create trigger tr_del_epl_packages
on epl_packages
for delete
as
insert epl_packages_deletions
select
    getdate(),
    customer,
    package_type,
    package_no
from deleted
go

The client program would then do the initial read as follows:

select getdate()

select
    customer,
    package_type,
    package_no,
    dimensions,
    weight_kg,
    despatch_id,
    loaded,
    item_count
from epl_packages
where
    customer = {current customer}
order by
    customer,
    package_type,
    package_no

It would store the output of getdate() to be used in subsequent updates, which would be read from the server as follows:

select getdate()

select
    customer,
    package_type,
    package_no,
    dimensions,
    weight_kg,
    despatch_id,
    loaded,
    item_count
from epl_packages
where
    customer = {current customer} and
    last_update_time > {output of getdate() from previous read}
order by
    customer,
    package_type,
    package_no

select
    customer,
    package_type,
    package_no
from epl_packages_deletions
where
    customer = {current customer} and
    delete_time > {output of getdate() from previous read}

The client program will then apply the deletions and the updated/inserted rows, in that order. This would be done for each table displayed in the client.

Any critical comments on this approach and any improvements that could be made would be much appreciated!
Does anyone know how to upload (bulk) data from a client (written in Excel VBA) to a remote SQL 2000 database? Of course I tried "INSERT INTO" and rst.AddNew, but I noticed this is much, much slower than downloading from the same remote database.
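One approach that is sometimes faster than row-by-row inserts from VBA is to save the data where the server can reach it and let SQL Server pull the whole range in one set-based statement, for example through OPENROWSET with the Jet provider. A hedged sketch for the SQL 2000 era; the share path, sheet name, and table/column names are placeholders, and it assumes the SQL Server service account can read the file and ad hoc OPENROWSET queries are permitted:

-- Pull a whole Excel range server-side in one statement.
INSERT INTO dbo.MyStagingTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=\\server\share\upload.xls;HDR=YES',
                'SELECT Col1, Col2, Col3 FROM [Sheet1$]');

The per-row network round trips that make INSERT INTO / AddNew slow disappear, because the transfer happens as one operation on the server side.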
I have a FormView and a SqlDataSource for it. I have a few textboxes in edit mode and bind them to the data columns or fields in the database. If all those fields have content in them, it updates just fine. However, if one of the text fields is null or empty, the FormView can't be updated when I try to update with empty data in one textbox. The data field allows null values, and the type is varchar. I suspect it's throwing an internal exception somewhere. However, since all the operations are handled by ASP.NET, I have no idea what's going on internally. Does anyone have an idea what's causing this error and how to fix it?
In the device emulator when it starts I see two records that were added to the SDF in the VS2008 IDE. I added a third record and after tapping on the Save menu my navigation control shows all three records. I then closed the application. I went to Memory and stopped the form. I went to the File Editor and restarted the application. It only showed the original 2 records. In the File Editor I tapped on the SDF to open it in the Query Analyzer. It also only shows the original two records.
The DataSet appears to be updated, but not the bound table in the SDF. Can anyone help? Code is below. bsTEST is the BindingSource, navTEST is my custom navigation control.
Private Sub HandleMenus(ByVal Sender As Object, ByVal EA As EventArgs) Handles mnuAdd.Click, mnuCancel.Click, mnuSave.Click
Hi, I have the following query:
UPDATE lista SET audio='3847f5e9-4ef7-42d7-9e57-e5cbad9131b1.jpg' WHERE id='13';
If the id already exists it'll modify the row correctly as expected. But if the id doesn't exist in the table I want the row to be inserted anyway, and this is not happening. The same query works well in MySQL + PHP. The table has an identity increment of 1 for the primary key also. Any ideas? Thanks in advance.
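SQL Server has no automatic insert-on-missing behavior for a plain UPDATE (unlike some MySQL idioms), so the usual workaround is to check @@ROWCOUNT after the UPDATE and insert only when nothing was touched. A hedged sketch, assuming id is the key being matched; note that if id is an IDENTITY column you normally cannot insert an explicit value into it, so it is omitted from the INSERT here:

-- Update first, then insert only if the UPDATE hit no rows.
UPDATE lista
SET audio = '3847f5e9-4ef7-42d7-9e57-e5cbad9131b1.jpg'
WHERE id = '13';

IF @@ROWCOUNT = 0
    INSERT INTO lista (audio)
    VALUES ('3847f5e9-4ef7-42d7-9e57-e5cbad9131b1.jpg');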
insert into [dbo].[tbl_FG_Alert_Count_All_Report] ([Date], [Count], [Rule Type])
SELECT TOP 10 [Date], [Count], [Rule Type]
FROM [dbo].[tbl_FG_Alert_Count_All]
where Count <> '0' and DATEDIFF(dy, date, GETDATE()) = 1
order by Date desc
When I run this T-SQL statement in SSMS I don't get any error and, as expected, I can see new data in the [dbo].[tbl_FG_Alert_Count_All_Report] table.
Now I created a job with the same T-SQL statement. The job completes successfully without giving any error message, but unfortunately I don't see any new data in the [dbo].[tbl_FG_Alert_Count_All_Report] table. What would be the reason that I don't see new data when the job completes successfully, but I can see new data after executing the same T-SQL statement in SSMS?
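A frequent cause of "works in SSMS but not in a job" is that the job step runs in a different database (or under a login with a different default database) than the SSMS session, so the statement succeeds against a different copy of the table. One way to rule that out is to fully qualify the objects with the database name in the job step. A hedged sketch; MyAlertDB is a placeholder for the real database name:

-- Fully qualify both tables so the job step cannot silently hit another database.
INSERT INTO MyAlertDB.dbo.tbl_FG_Alert_Count_All_Report ([Date], [Count], [Rule Type])
SELECT TOP 10 [Date], [Count], [Rule Type]
FROM MyAlertDB.dbo.tbl_FG_Alert_Count_All
WHERE [Count] <> '0'
  AND DATEDIFF(dy, [Date], GETDATE()) = 1
ORDER BY [Date] DESC;

Checking the "database" setting of the job step itself against the database you use in SSMS answers the same question without changing the statement.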
C#, Webforms, VS 2005, SQL. Hi all, quick hit question. I'm trying to update a table with an employee name and hire date. A session variable of empID, passed from a previous page (successfully), determines which row to plop the update into. It's not working even though it compiles and makes it all the way through the code to the txtReturned.Text = "I made it" debug line... Any thoughts?

string szInsSql;

string sConnectionString = "Data Source=dfssql;Database=MyDB;uid=myID;pwd=myPWD";
SqlConnection objConn = new SqlConnection(sConnectionString);

objConn.Open();

// Note: the trailing comma after HireDate leaves ", WHERE" in the statement text,
// which is not valid T-SQL.
szInsSql = "UPDATE empEmployee SET " +
           "Name = '" + this.txtName.Text + "', " +
           "HireDate = '" + this.txtHireDate.Text + "', " +
           "WHERE empID = '" + Session[empID] + "'";

SqlCommand objCmd1 = new SqlCommand(szInsSql, objConn);
objCmd1.ExecuteNonQuery();

txtReturned.Text = "I made it";

It's got to be a ' or a , out of place but I've looked at this code for a half hour straight, trying a variety of changes...and it still doesn't update the DB... Any help would be great. Thank you! -Corby-
Hi all: I have a list of items (actually a relation in which a user has selected an item, along with a rating for the item) in an Access database table, connected to my app with a SqlDataSource and bound to a repeater. The repeater displays the items to the user along with a dropdown box to show the rating, and allow the user to update it. The page connects and displays correctly. My problem is that when the user submits the page and I iterate through the repeater items to update each rating, the updates are not being completed in the database. The update works if I hard-code a value for the rating into the query itself, but not when using an update parameter (pTaskRating below). In other words, if I replace pTaskRating with '5', all the correct records will be found and have their ratings updated to 5. That means that the mySurveyId and pTaskId (DefaultValue) parameters have to be working, because the right records are found, but I can't seem to update records based on the DefaultValue of the pTaskRating parameter, even though I can verify that the DefaultValue is correct by placing a watch on it. It seems that my problem must be in my use of that particular parameter in the query, either in properties of the parameter or in the value assigned to it. I am extremely frustrated - any ideas would be greatly, greatly appreciated. Thanks! Bruck

The table I'm pulling from and updating looks like this: SURVEY_ID (Text 50), TASK_ID (Long Int), RATING_ID (Long Int)

Here's my ASPX for the main data source:

<asp:SqlDataSource ID="sqlTaskSelections" runat="server"
    ConnectionString='Provider=Microsoft.Jet.OLEDB.4.0;Data Source="abc.mdb";Persist Security Info=True;Jet OLEDB:Database Password=xyz'
    ProviderName="System.Data.OleDb"
    SelectCommand="SELECT [SURVEY_ID], [TASK_ID], [RATING_ID] FROM [TBL_TASK_SELECTION] WHERE [SURVEY_ID] = mySurveyId"
    UpdateCommand="UPDATE [TBL_TASK_SELECTION] SET [RATING_ID] = pTaskRating WHERE ([SURVEY_ID] = mySurveyId) AND ([TASK_ID] = pTaskId)">
    <UpdateParameters>
        <asp:SessionParameter Name="mySurveyId" SessionField="SurveyId" DefaultValue="" />
        <asp:Parameter Name="pTaskId" DefaultValue="" />
        <asp:Parameter Name="pTaskRating" DefaultValue="" />
    </UpdateParameters>

And here's the repeater (the Task ID and Rating are stored in hidden fields for easy access later):

<asp:Repeater ID="rptTaskSelections" runat="server">
<FooterTemplate></td></tr></table></FooterTemplate>
</asp:Repeater>

And here's the page load and submit VB:

Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
If Not Page.IsPostBack Then
    'BIND / LOAD RATINGS TO DROPDOWN BOXES HERE
    Dim i As Integer
    Dim cbCurrentRating As DropDownList
    Dim hCurrentRating As HiddenField
    rptTaskSelections.DataSource = sqlTaskSelections
    rptTaskSelections.DataBind()
I've got a listbox that displays a list of employees' names. The employee number is the value stored in the listbox. I then have a vaccinations gridview that displays all the vaccinations received by the selected employee in the listbox. For some reason, when I click edit and modify a record and then click update, it doesn't actually update the record. It just appears to do a postback and redisplay the record without any changes. My SqlDataSource control is configured as follows:
SelectCommand="SELECT * FROM [tblVaccinations] WHERE ([EmpNum] = @EmpNum)"
where @EmpNum = mylistbox.SelectedValue. The update command is as follows:
UpdateCommand="UPDATE [tblVaccinations] SET [EmpNum] = @EmpNum, [VacType] = @VacType, [VacIssueDate] = @VacIssueDate, [VacExpDate] = @VacExpDate, [VacInstitution] = @VacInstitution WHERE [VaccinationNum] = @VaccinationNum"
I'm using a performance monitor counter with an alert to catch merge replication conflicts. As you might have seen in my earlier posting, this alert keeps firing even after all conflicts have been resolved. After almost 2 days of searching I found that the table master..sysperfinfo, where all the PM counters are stored, still has the value 4 for conflicts/sec. When I manually create a new conflict this value becomes 5, which is nonsense, since it's then not the number of conflicts per second but the total number of conflicts since I started the server. If I check in PM it shows the correct value, which of course is zero 99% of the time. Has anybody ever experienced the same kind of problem??
Or does anybody know a way of resetting the values in sysperfinfo? Even after allowing direct updates to system tables I still can't modify the table.
Hello everybody, I can't perform an apparently very easy operation: setting a field to a NULL value.
This is the db: Microsoft SQL Server 2000 - 8.00.760 (Intel X86) Dec 17 2002 14:22:05 Copyright (c) 1988-2003 Microsoft Corporation Standard Edition on Windows NT 4.0 (Build 1381: Service Pack 6)
This is the table:
CREATE TABLE [ProgettoTracce] (
    [ID_Progetto] [int] NOT NULL ,
    [MisDifDef] [real] NULL ,
    [MisDifMeas] [real] NULL ,
    [MisDifAna] [real] NULL ,
    [MisDifID] [real] NULL ,
    [MisDifCV] [real] NULL
) ON [PRIMARY]
GO
This is qry: UPDATE ProgettoTracce SET MisDifDef = NULL WHERE ID_Progetto = 3444
The qry completed with no error. Then I execute SELECT * FROM ProgettoTracce WHERE ID_Progetto = 3444 and I still find the value I tried to overwrite with NULL. If I update with 0 (for example) it works. Obviously this happens on the production db, because on the development db the update with NULL works fine. No transaction is involved, and db options are the same on both dbs...
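Since the same statement behaves correctly on the development copy, one thing worth ruling out on the production database is an UPDATE trigger (or INSTEAD OF trigger) that rewrites the column after the statement runs. A small hedged diagnostic, assuming nothing beyond the table shown above:

-- List any triggers defined on the table; a trigger could silently overwrite the NULL.
EXEC sp_helptrigger 'ProgettoTracce';

-- Then confirm what is actually stored immediately after the update:
UPDATE ProgettoTracce SET MisDifDef = NULL WHERE ID_Progetto = 3444;
SELECT ID_Progetto, MisDifDef FROM ProgettoTracce WHERE ID_Progetto = 3444;

If sp_helptrigger returns nothing and the SELECT still shows the old value, the difference between the two databases lies elsewhere (for example a view or synonym pointing at another table).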
I have this script in my database, but it always gives 2054 rows back and if I actually DO change something it doesn't even notice...
UPDATE a
SET a.[omschrijving] = SP.[omschrijving],
    a.[verkoopprijs] = SP.[verkoopprijs],
    a.[gewijzigd] = getDate()
FROM [artikelen] a
LEFT OUTER JOIN [Hofstede].[dbo].[sparepartsupdate] SP ON a.PartNrFabrikant = sp.PartNrFabrikant
WHERE ((A.omschrijving != SP.[omschrijving]) OR (A.[verkoopprijs] != SP.[verkoopprijs]))
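One thing that can make a change go unnoticed with this kind of comparison is NULL: if omschrijving or verkoopprijs is NULL on either side, the != test evaluates to unknown and the row is filtered out. A hedged NULL-safe variant of the same statement (an inner join is used so rows with no match in sparepartsupdate cannot blank out the target columns):

UPDATE a
SET a.[omschrijving] = SP.[omschrijving],
    a.[verkoopprijs] = SP.[verkoopprijs],
    a.[gewijzigd] = GETDATE()
FROM [artikelen] a
JOIN [Hofstede].[dbo].[sparepartsupdate] SP ON a.PartNrFabrikant = SP.PartNrFabrikant
WHERE a.[omschrijving] <> SP.[omschrijving]
   OR a.[verkoopprijs] <> SP.[verkoopprijs]
   OR (a.[omschrijving] IS NULL AND SP.[omschrijving] IS NOT NULL)
   OR (a.[omschrijving] IS NOT NULL AND SP.[omschrijving] IS NULL)
   OR (a.[verkoopprijs] IS NULL AND SP.[verkoopprijs] IS NOT NULL)
   OR (a.[verkoopprijs] IS NOT NULL AND SP.[verkoopprijs] IS NULL);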
We recently migrated from SQL Server 7 to SQL Server 2005. Now there's a curious thing with some legacy applications. I have pasted some code below. Don't judge me, because like I said, it's legacy.
You can see that I have two connection strings. One is commented out and connects via SQL Native Client. The other one does this through the old SQL Server driver. Funnily enough, when I use the new Native Client driver, the exception "Run-time error '-2147467259 (80004005)' [Microsoft][SQL Native Client]Invalid attribute value" is thrown on the rsPorder2.Update line. With the old driver, this works just fine.
Is this a bug? Is there a way to make the code run, because we don't want to search the whole application for other occurrences if not necessary.
Any insights would be greatly appreciated.
Best regards, DD
Dim ilinx As New ADODB.Connection
Dim rsPorder As New ADODB.Recordset
Dim rsPorder2 As New ADODB.Recordset
Dim cmdLinx As New ADODB.Command
Dim strConn As String
I have this sql stored procedure in SQL Server 2012:
ALTER PROCEDURE [dbo].[CreateBatchAndSaveExternalCodes]
    @newBatches as dbo.CreateBatchList READONLY,
    @productId int,
    @cLevelRatio int,
    @nLevelRatio int
AS
set nocount on;
I am writing a stored procedure which updates a table, but when I run the stored procedure using a login that I have granted execute privileges on, I get a message that I cannot run an update on the table. I know this would happen with dynamic SQL... but while my SQL has parameter references, I don't think it is considered dynamic SQL?
sproc:
CREATE PROCEDURE [schemaname].[SetUserCulture]
    @UserID int,
    @Culture nvarchar(10)
AS
UPDATE dbo.SecUser
SET Culture = @Culture
WHERE UserID = @UserID
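This is not dynamic SQL, so EXECUTE permission on the procedure should normally be enough, provided ownership chaining applies; the chain breaks when the procedure's schema ([schemaname]) and the table's schema (dbo) have different owners, and then the caller needs UPDATE permission on the table. A hedged sketch for checking and working around this, assuming SQL Server 2005 or later (which the schema-qualified procedure suggests):

-- Do the two schemas have the same owner? If not, ownership chaining does not apply.
SELECT s.name AS schema_name, dp.name AS owner_name
FROM sys.schemas s
JOIN sys.database_principals dp ON dp.principal_id = s.principal_id
WHERE s.name IN ('schemaname', 'dbo');

-- One possible workaround (an assumption that running as the owner is acceptable here):
ALTER PROCEDURE [schemaname].[SetUserCulture]
    @UserID int,
    @Culture nvarchar(10)
WITH EXECUTE AS OWNER
AS
UPDATE dbo.SecUser SET Culture = @Culture WHERE UserID = @UserID;

Alternatively, granting UPDATE on dbo.SecUser to the login (or moving the procedure into the dbo schema) also resolves it, at the cost of a wider permission surface.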
using System;
using System.Collections.Generic;
using System.Text;
using System.Data;
using System.Data.OleDb;
using System.Collections;

namespace TimeTracking.DB
{
    public class sql
    {
        OleDbConnection conn;

        //
        // the constructor for this class, set the connection string
        //
        public sql()
        {
            DBConnectionstring ConnectToDB = new DBConnectionstring();
            conn = ConnectToDB.MyConnection();
        }

        //
        //
        //
        public void UpdateEntry(int ID, string Week, string Year, string Date, string Project, string Action, string Time, string Comment)
        {
            int m_ID = ID;
            int m_Week = (Convert.ToInt32(Week));
            int m_Year = (Convert.ToInt32(Year));
            string m_Date = Date;
            string m_Project = Project;
            int m_ProjectID = new int();
            string m_Action = Action;
            int m_ActionID = new int();
            Single m_Time = (Convert.ToSingle(Time));
            string m_Comment = Comment;

            //
            // get the project ID from the database and store it in m_ProjectID
            //
            OleDbCommand SelectProjectID = new OleDbCommand("SELECT tblProject.ProjectID FROM tblProject" +
                " WHERE (((tblProject.Project) LIKE @Project))", conn);

            //
            // get the action ID from the database and store it in m_ActionID
            //
            OleDbCommand SelectActionID = new OleDbCommand("SELECT tblAction.ActionID FROM tblAction" +
                " WHERE (((tblAction.Action) LIKE @Action))", conn);

            // ... (the code that builds and executes the UPDATE is not shown in the post)

            finally
            {
                // close the connection
                if (conn != null)
                {
                    conn.Close();
                }
            }
        }
    }
}
Code Snippet
The UPDATE statement is not working in my application; there is no error in C# and no error in MS Access. When I paste the update query into the MS Access query tool and replace the parameter values (@....) with real values, it will update the record.
I'm having a problem with my database, can any of you SQL gurus out there help me? After I use sp_resetstatus to unmark the database as suspect I get this: error: 602, Severity: 21, State: 15 Could not find the row in sysindexes for dbid '6', object '9', index '2'. Run DBCC CHECKTABLE on Sysindexes
then this one: error: 3414, Severity: 21, State: 1 Database 'maillist' (dbid 6): Recovery failed. Please contact Technical Support for further instructions.
I can't do checktable because it's suspect, and I can't reset its suspect status. Any ideas?
In Enterprise Manager I am copying a table from one database to another. I am using the DTS wizard to import the data. After I successfully import the data, I open both tables to compare the records to make sure they are the same. I right click on a field and click "last" for both tables. However, the record is different for both. If I do a query the record is still there, but they do not show up in the same order. Why doesn't the import wizard import the records in the same order? Any help would be greatly appreciated.
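Rows in a SQL Server table have no guaranteed physical order, and "last" in the Enterprise Manager grid just reflects whatever order the rows happen to come back in, so a different order after the import is not in itself a problem. To compare the two tables reliably, order both sides explicitly on the key. A hedged sketch; the database, table, and key names are placeholders:

-- Compare source and destination in a deterministic order.
SELECT * FROM SourceDB.dbo.MyTable      ORDER BY KeyColumn;
SELECT * FROM DestinationDB.dbo.MyTable ORDER BY KeyColumn;

-- Or simply confirm the row counts match:
SELECT COUNT(*) FROM SourceDB.dbo.MyTable;
SELECT COUNT(*) FROM DestinationDB.dbo.MyTable;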
I am having trouble getting database mail to work. I have tried many things, but can't make it work.
Initially, I set up IIS on the same box as a virtual SMTP server. I set it up to relay to 127.0.0.1. In SQL Server, I set up Database Mail to point to localhost, with no authentication. After the setup, when I test the email it works fine.
I created a job that will fail every time. I set myself up as the notify party on the job. When I run the job, it fails. I get no email. The job log has the following error (notice "failed to notify"):
Message The job failed. The Job was invoked by User xxxxAdministrator. The last step to run was step 1 (test 1). NOTE: Failed to notify 'Dan Jones' via email.
In the SQL Server log, I see:
[264] An attempt was made to send an email when no email session has been established
Also:
Message [260] Unable to start mail session (reason: Microsoft.SqlServer.Management.SqlIMail.Server.Common.BaseException: Mail configuration information could not be read from the database. ---> System.Data.SqlClient.SqlException: profile name is not valid at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) a)
This is my preferred configuration, but I couldn't make it work, so I tried another...
I tried to set up mail through my mail service provider. I entered the IP in SQL Server. When I test it, I get the following error:
Date 11/19/2007 8:16:06 AM Log Database Mail (Database Mail Log)
Log ID 24 Process ID 3176 Last Modified 11/19/2007 8:16:06 AM Last Modified By NT AUTHORITY\SYSTEM
Message 1) Exception Information =================== Exception Type: Microsoft.SqlServer.Management.SqlIMail.Server.Common.BaseException Message: Could not retrieve item from the queue. Data: System.Collections.ListDictionaryInternal TargetSite: Microsoft.SqlServer.Management.SqlIMail.Server.Controller.ICommand CreateSendMailCommand(Microsoft.SqlServer.Management.SqlIMail.Server.DataAccess.DBSession) HelpLink: NULL Source: DatabaseMailEngine
StackTrace Information =================== at Microsoft.SqlServer.Management.SqlIMail.Server.Controller.CommandFactory.CreateSendMailCommand(DBSession dbSession) at Microsoft.SqlServer.Management.SqlIMail.Server.Controller.CommandFactory.CreateCommand(DBSession dbSession) at Microsoft.SqlServer.Management.SqlIMail.Server.Controller.CommandRunner.Run(DBSession db) at Microsoft.SqlServer.Management.SqlIMail.IMailProcess.ThreadCallBack.MailOperation(Object o)
On the same box, I used telnet to step through the SMTP mail send process. It worked fine, indicating no firewall, router, or other issues.
I am of the opinion that Database Mail is defective. This post is similar to another I have done, but no useful responses have been made. I am hoping that someone will provide some useful information here.
This is running on Windows 2003 Standard, and on SQL Server Workgroup Edition. I have a second backup server similarly configured that behaves exactly the same.
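Given the "[260] Unable to start mail session ... profile name is not valid" message, one thing worth checking from T-SQL is whether the profile SQL Server Agent is configured to use actually exists in msdb, and whether Database Mail itself can send at all. A few hedged checks; the profile name and recipient below are placeholders, and SQL Agent generally also needs to be pointed at a valid mail profile (Agent properties, Alert System) and restarted before job notifications work:

-- List the profiles and accounts Database Mail knows about:
EXEC msdb.dbo.sysmail_help_profile_sp;
EXEC msdb.dbo.sysmail_help_account_sp;

-- Try a test send through a specific profile (replace the placeholder values):
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'MyMailProfile',
     @recipients   = 'dan.jones@example.com',
     @subject      = 'Database Mail test',
     @body         = 'Test message from msdb.';

-- Recent Database Mail activity and errors:
SELECT TOP 20 * FROM msdb.dbo.sysmail_event_log ORDER BY log_date DESC;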
Since VWD Express doesn't support Publish Website, I used 'Copy Website' to deploy my local site to the hosting server. The static page works fine, but when it gets to accessing the database, I keep getting the following error: An attempt to attach an auto-named database for file I:DataWebdnh.sklingling_15726c71-b2bf-479f-bcc3-b7ae43318f3cwwwApp_DataPersonal.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share. Has anyone had the same experience? How do I resolve this? Thanks,
I'm using 2014 SE. I know a backup of a database doesn't take the source DB offline, but then I need to move this DB to another server (for intensive reporting work). At present we restore the DB, but that means putting the DB in single user mode, kicking everyone off, and completing the restore.
I see from the 2014 EE notes that "online restore" is possible. EE is, of course, mightily expensive. Or perhaps it's possible to configure things to speed up the restore process somehow, so there is less downtime? Also, the resource impact of creating the backup is quite high; perhaps there's a way (apart from playing with backup compression) to reduce the impact on the source server here?
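If the goal is a readable reporting copy with less downtime than a full restore each time, log shipping with the standby option is one pattern that works on Standard Edition: restore the full backup once WITH STANDBY, then keep applying transaction log backups, each of which is far smaller and quicker than the full restore (each log restore still needs exclusive access for a moment, but a much shorter one). A rough sketch; paths and the database name are placeholders, and MOVE clauses are omitted:

-- On the source server. COPY_ONLY keeps this full backup from disturbing the
-- existing backup schedule; the log backups are part of the normal log chain.
BACKUP DATABASE MyDb TO DISK = N'\\share\MyDb_full.bak' WITH COMPRESSION, COPY_ONLY;
BACKUP LOG      MyDb TO DISK = N'\\share\MyDb_log1.trn' WITH COMPRESSION;

-- On the reporting server: one initial restore, left readable between log restores.
RESTORE DATABASE MyDb FROM DISK = N'\\share\MyDb_full.bak'
    WITH STANDBY = N'D:\MyDb_undo.dat', REPLACE;
RESTORE LOG MyDb FROM DISK = N'\\share\MyDb_log1.trn'
    WITH STANDBY = N'D:\MyDb_undo.dat';

Backup compression (available in 2014 Standard) also shrinks the files that have to travel between servers, which tends to reduce the overall impact more than it costs in CPU.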
Hi, all, I found that the SQL 2000 EM does not show database space allocated information, as well as table and index sizes, while SQL 7.0 does. Sometimes this information is fairly handy. Is there any other easy way to find out the same info for SQL 2000 through EM, or elsewhere?? Thanks Anthony
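From Query Analyzer the same numbers are available through sp_spaceused, which works on SQL Server 2000: with no arguments it reports the database totals, and with a table name it reports that table's size. A short sketch; dbo.MyTable is a placeholder:

-- Database-level and per-table space usage.
EXEC sp_spaceused;                   -- database size, unallocated, reserved, data, index, unused
EXEC sp_spaceused 'dbo.MyTable';     -- one table's rows / reserved / data / index sizes

-- Undocumented but commonly used: run it for every table in the database.
EXEC sp_MSforeachtable 'EXEC sp_spaceused ''?''';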
I want to create an SP that creates a new database, so I scripted out the db and pasted the script into the new SP GUI (SQL 2000). I want to pass it a variable for the data/log file location that is not the default location. The original script looks like this:
CREATE DATABASE [PWRR_DDS] ON (NAME = N'PWRR_DDS_Data', FILENAME = N'F:SQL SERVER FILESDatabasesdbName.mdf', SIZE = 3118, FILEGROWTH = 10%) LOG ON (NAME = N'PWRR_DDS_Log', FILENAME = N'F:SQL SERVER FILESDatabasesdbName_Log.LDF', SIZE = 5000, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS
I replaced the path of the FILENAME variable like this:
CREATE DATABASE [PWRR_DDS] ON (NAME = N'PWRR_DDS_Data', FILENAME = @DBPath, SIZE = 5000, FILEGROWTH = 10%) LOG ON (NAME = N'PWRR_DDS_Log', FILENAME = @LogPath, SIZE = 5000, FILEGROWTH = 10%) COLLATE SQL_Latin1_General_CP1_CI_AS
declaring the variables as char(500). The error I get is "Incorrect syntax near '@DBPath'". Any ideas for a workaround? Thanks, EJM
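CREATE DATABASE does not accept variables for FILENAME (or most of its other arguments), which is why the parser stops at @DBPath. The usual workaround is to build the statement as a string and run it with EXEC. A hedged sketch of that approach; the procedure name is a placeholder and the sizes/names follow the modified script above:

-- Build the CREATE DATABASE statement dynamically so the file locations can be parameters.
CREATE PROCEDURE dbo.usp_CreateDdsDatabase
    @DBPath  nvarchar(500),
    @LogPath nvarchar(500)
AS
BEGIN
    DECLARE @sql nvarchar(4000)
    SET @sql =
        N'CREATE DATABASE [PWRR_DDS] ON ' +
        N'(NAME = N''PWRR_DDS_Data'', FILENAME = N''' + @DBPath  + N''', SIZE = 5000, FILEGROWTH = 10%) ' +
        N'LOG ON ' +
        N'(NAME = N''PWRR_DDS_Log'', FILENAME = N''' + @LogPath + N''', SIZE = 5000, FILEGROWTH = 10%) ' +
        N'COLLATE SQL_Latin1_General_CP1_CI_AS'
    EXEC (@sql)
END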