Updating A Production Database With A Backup Of A Development Database
May 11, 2007
Can someone provide a step-by-step tutorial for this? I'd like to safely update a database that is used for a website with little or no downtime.
I have a production database with a backup job that creates files with the naming convention dbname_db_200503291800.bak. I want to schedule a restore job that will restore yesterday's backup. How can I write my restore statement so that it specifies the backup file with yesterday's date?
Thanks
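A minimal T-SQL sketch of one way to do this, assuming the dbname_db_YYYYMMDDHHMM.bak convention above, a fixed 18:00 backup time, and hypothetical database and path names; build yesterday's file name into a variable and pass it to RESTORE with dynamic SQL:

    -- Minimal sketch; assumes the dbname_db_YYYYMMDDHHMM.bak convention above,
    -- a fixed 1800 backup time, and hypothetical database/path names.
    DECLARE @stamp CHAR(12), @file VARCHAR(260), @sql NVARCHAR(1000);

    -- Yesterday's date formatted as YYYYMMDD, plus the fixed 1800 time portion
    SET @stamp = CONVERT(CHAR(8), DATEADD(day, -1, GETDATE()), 112) + '1800';
    SET @file  = 'D:\Backups\dbname_db_' + @stamp + '.bak';

    SET @sql = N'RESTORE DATABASE dbname_restored FROM DISK = ''' + @file + ''' WITH REPLACE';
    EXEC (@sql);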
What's the best way to move a database from a development to a production environment?
Hi,
We're using SQL Server 2000 as the back end in our web project. The problem is that we have three different copies of the same database - one each for Development, Test, and Production - sitting on two different machines.
My question is: is there any tool for comparing the objects (tables, stored procedures, etc.)?
Thanks,
Harish
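In the absence of a dedicated comparison tool, here is a minimal T-SQL sketch of one way to spot objects that exist in one database but not the other, assuming both copies sit on the same SQL Server 2000 instance and using hypothetical database names DevDB and TestDB:

    -- Minimal sketch; assumes both databases are on the same instance and
    -- uses hypothetical names DevDB and TestDB. Lists objects present in
    -- DevDB that are missing from TestDB (reverse the query for the other
    -- direction).
    SELECT d.name, d.xtype
    FROM DevDB.dbo.sysobjects AS d
    LEFT JOIN TestDB.dbo.sysobjects AS t
           ON t.name = d.name AND t.xtype = d.xtype
    WHERE d.xtype IN ('U', 'P', 'V')   -- tables, stored procedures, views
      AND t.name IS NULL
    ORDER BY d.xtype, d.name;

This only flags missing or extra objects; comparing the definitions inside matching objects usually needs a schema-comparison tool.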
I have been looking for some documentation that would support or reject my opinion on Production vs. Development naming conventions. I believe that each environment should be housed on separate servers with identical names, access, users, stored procs, and so on. If you either agree or disagree with this methodology, I would appreciate your input.
TIA,
Bill
Hello,
I am trying to refresh a test database with data from a production database. Both database structures are identical, e.g. constraints, stored procs, PKs, etc. I am trying to create a package in SSIS that accomplishes this task and I am having extensive problems. The Import/Export Wizard is out of the question because the constraints are not carried over; also, when I try to refresh the data using the wizard, it fails on one specific table because of a column in that table named "Error code". I think "Error code" is a Microsoft keyword, so it fails on this column. Does anyone know a workaround to accomplish this simple task, which could be completed in minutes using DTS? I understand that SSIS is not as straightforward as DTS, but this task is something that DBAs do on a regular basis and therefore should not be this difficult.
Any help would be appreciated,
David
I developed an ASP.NET application in Visual Web Developer 2005 Express Edition and SQL Server 2005 Express with Advanced Services. The application has been deployed and I am wondering what tools are available for backing up my data. Are there any tools I can use to back up my database? I am not talking about third-party tools, but tools available in SQL Server 2005 Express with Advanced Services or Visual Web Developer Express.
Or can I write a VB.NET Sub procedure that I run to have my database backed up? If so, where can I start, and what other options could I explore?
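SQL Server Express has no Agent, but the BACKUP DATABASE statement itself is available; a minimal T-SQL sketch, using hypothetical database and file names, that could be run from a scheduled script or called from application code:

    -- Minimal sketch; database name and path are hypothetical.
    -- Express editions lack SQL Server Agent, so this would typically be run
    -- from sqlcmd in a Windows Scheduled Task, or issued from application code.
    BACKUP DATABASE MyAppDb
    TO DISK = 'C:\Backups\MyAppDb.bak'
    WITH INIT,                      -- overwrite the previous backup file
         NAME = 'MyAppDb full backup';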
I have a question regarding SQL transactional replication methodology:
1. Let's say we successfully created SQL transactional replication and it is running, transferring data from publisher to subscriber.
2. Now one day the source production/publisher SQL Server goes down and only the DR SQL Server (the subscriber) remains up.
3. The next day, we fix and bring up the production/publisher server. How can we then sync all existing data records from the subscriber back to the publisher side?
Hi.
I have a question that I have been thinking about for some time. Bear in mind that I'm somewhat of a SQL newbie.
I work on a few web applications that are database-driven. When we decide to change something in the database, we normally make the changes on our replica development database, and after some testing the changes are supposed to be implemented in the live, public database.
Now, what is the best or recommended way to do this? Today we do it manually, but at one time I experimented with the export/import wizard to see if it was possible. What I want to do is change the design of a table without losing the original data, and without getting the data from the development database into the live database.
To export a stored procedure, I usually just create a SQL script for the procedures in question and then run the script against the public DB using Query Analyzer. But if I follow the same procedure for a table, I lose all the data in the database.
Do I make any sense at all?
Any tips/ideas/best practices on this subject?
/Johan Christensson
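For the table case, a minimal T-SQL sketch of changing a table's design in place so the existing data stays put, using a hypothetical table and columns; the same tested script can then be run against the live database:

    -- Minimal sketch; table and column names are hypothetical.
    -- ALTER TABLE changes the design in place, so existing rows are preserved,
    -- unlike scripting the table as DROP/CREATE.
    ALTER TABLE dbo.Customers
        ADD PreferredContact VARCHAR(20) NULL;      -- new column, nullable so old rows stay valid

    ALTER TABLE dbo.Customers
        ALTER COLUMN PhoneNumber VARCHAR(30) NULL;  -- widen an existing column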
Hi
Can we use BCP and/or BULK INSERT to refresh a development server database from the production server? I have to do this by following the rules below:
- no truncation of source table
- update of changed or new records only.
I know we can use replication, DTS, and other methods for this, but I need information about this approach specifically.
Thanks
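A minimal sketch of how that could look, assuming a hypothetical table dbo.Orders with key column OrderID and a LastModified column; bcp exports from production, BULK INSERT loads into a staging table on development, and set-based statements apply only new or changed rows (no truncation of the target):

    -- 1) On the production side, export the table with bcp (command prompt):
    --    bcp ProdDB.dbo.Orders out C:\Transfer\Orders.dat -n -S PRODSERVER -T
    --
    -- 2) On the development side, load into a staging table and apply changes.
    --    Table, column, and file names below are hypothetical.
    BULK INSERT DevDB.dbo.Orders_Staging
    FROM 'C:\Transfer\Orders.dat'
    WITH (DATAFILETYPE = 'native');

    -- Update rows that already exist but have changed
    UPDATE t
    SET    t.CustomerID   = s.CustomerID,
           t.OrderTotal   = s.OrderTotal,
           t.LastModified = s.LastModified
    FROM   DevDB.dbo.Orders AS t
    JOIN   DevDB.dbo.Orders_Staging AS s ON s.OrderID = t.OrderID
    WHERE  s.LastModified > t.LastModified;

    -- Insert rows that are new
    INSERT INTO DevDB.dbo.Orders (OrderID, CustomerID, OrderTotal, LastModified)
    SELECT s.OrderID, s.CustomerID, s.OrderTotal, s.LastModified
    FROM   DevDB.dbo.Orders_Staging AS s
    WHERE  NOT EXISTS (SELECT 1 FROM DevDB.dbo.Orders AS t WHERE t.OrderID = s.OrderID);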
Hi,
I have an application where I'm filling a dataset with values from a table. This table has no primary key. I then iterate through each row of the dataset, compute the value of one of the columns, and update that value in the dataset row. The problem I'm having is that when the database gets updated by the SqlDataAdapter.Update() method, the same value shows up under that column for all rows. I think my update command is not correct since I'm not specifying a WHERE clause, and hence it is using just the value last computed in the dataset to update the entire table. But I do not know how to specify a WHERE clause for an update statement when I'm actually updating every row in the dataset. Basically I do not have an update parameter, since all rows are meant to be updated. Any suggestions?
// Build the update command used by the SqlDataAdapter (note: no WHERE clause)
SqlCommand snUpdate = conn.CreateCommand();
snUpdate.CommandType = CommandType.Text;
snUpdate.CommandText = "Update TestTable set shipdate = @shipdate";
snUpdate.Parameters.Add("@shipdate", SqlDbType.Char, 10, "shipdate");

// Recompute the shipdate column for each row in the dataset
string jdate = "";
for (int i = 0; i < ds.Tables[0].Rows.Count - 1; i++)
{
    jdate = ds.Tables[0].Rows[i]["shipdate"].ToString();
    ds.Tables[0].Rows[i]["shipdate"] = convertToNormalDate(jdate);
}

// Push the changed rows back to the database
da.Update(ds, "Table1");
conn.Close();
-Thanks
hi!
I am able to run the package successfully against the test database, but not against the production database. It throws up an error saying:
Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted.
Description: Failed to open package file "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" due to error 0x80070015 "The device is not ready.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
End Error
Could not load package "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" because of error 0xC0011002.
Description: Failed to open package file "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" due to error 0x80070015 "The device is not ready.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
What is the problem here?
JAs
I am new to database mirroring and I have a question about the principal database server. I have a database mirroring setup configured for high-safety mode with automatic failover, using a witness.
When a failover occurs because of a loss of communication between the principal and mirror, the mirror server takes on the role of principal. When communication to the original principal server is restored, does the database that was previously the principal automatically go back to being the principal at some point?
We have a situation where modifications to the stored procs and views keep happening in the development database environment while there is a live production database also up and running.
Periodically we need to ensure that development and production are in sync, with all changes made to dev transferred to prod. However, the entire prod database obviously cannot be overwritten, as it contains production data. Therefore what we need to do is transfer all SPs and views from development to production. (We do not prefer incremental transfers, as there are many SPs and views and we can never really be sure that all changes have been transferred.)
The obvious way is to script out all SPs and views in dev and run the scripts in prod. But what we encountered is that Enterprise Manager does not script out the objects in hierarchical sequence, i.e. parent objects first and then the objects that refer to the parent objects. This leads to errors when running the scripts, as an object required for the creation of an SP - say another SP called from within that SP - may not have been created yet, because the "contained" SP is lower in the alphabetical order in which SQL Server scripted out the objects.
In the end we had to look at each and every error and manually set the process right.
Moreover, the way we did things, the "sysdepends" table got all screwed up, so we can no longer rely on Enterprise Manager to show dependencies within the database.
Is there any elegant way of going about this? We also tried the Transfer Objects task in DTS, but it gave the same error. Also, can the sysdepends table be rebuilt from scratch?
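On the sysdepends question, a minimal sketch of one approach, assuming SQL Server 2005 or later where sp_refreshsqlmodule exists (on SQL Server 2000 only sp_refreshview is available, and only for views); refreshing each non-schema-bound module re-records its dependency information:

    -- Minimal sketch; assumes SQL Server 2005+ where sys.objects and
    -- sp_refreshsqlmodule exist. Refreshing each module re-resolves its
    -- references and re-records its dependency metadata.
    DECLARE @name NVARCHAR(400);
    DECLARE refresh_cursor CURSOR FOR
        SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
        FROM sys.objects
        WHERE type IN ('P', 'V', 'FN', 'IF', 'TF');   -- procs, views, functions

    OPEN refresh_cursor;
    FETCH NEXT FROM refresh_cursor INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        BEGIN TRY
            EXEC sp_refreshsqlmodule @name;           -- re-resolve references
        END TRY
        BEGIN CATCH
            PRINT 'Could not refresh ' + @name;       -- e.g. schema-bound modules
        END CATCH
        FETCH NEXT FROM refresh_cursor INTO @name;
    END
    CLOSE refresh_cursor;
    DEALLOCATE refresh_cursor;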
I have a student who has a 100GB database that we need a read/write copy of in a development environment.
It would be okay for the development system to have yesterday's data.
What would be the easiest solution for replicating the data?
Replication?
Log Shipping?
Backup/Restore?
Other Ideas?
Thanks in advance for the help.
Cheers,
Casey
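If the nightly full backup is acceptable as the source, a minimal T-SQL backup/restore sketch, with hypothetical database, logical file, and path names; WITH MOVE relocates the files for the development copy and REPLACE overwrites the previous day's copy:

    -- Minimal sketch; database, logical file, and path names are hypothetical.
    -- Run on the development server each morning against last night's backup.
    RESTORE DATABASE StudentDB_Dev
    FROM DISK = '\\prodserver\Backups\StudentDB_full.bak'
    WITH MOVE 'StudentDB_Data' TO 'E:\SQLData\StudentDB_Dev.mdf',
         MOVE 'StudentDB_Log'  TO 'F:\SQLLogs\StudentDB_Dev.ldf',
         REPLACE,      -- overwrite yesterday's development copy
         RECOVERY;     -- leave the database read/write and online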
Hello everyone,
I have a question about how to efficiently develop on a development version of a database and get those changes over to the production database without losing live data.
Here is how we are trying it:
1. Export the production database to a temp database (exact copy)
2. Delete the production database
3. Export the development database (definition only, no data) to a new database with the production database's name
4. Export the production (temp) database (data only) over to the new production shell
We are having truncate errors due to FK constraints.
Is there a better way?
Thanks for all help
Kevin
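One common workaround for the FK truncate errors, sketched below with the caveat that sp_MSforeachtable is an undocumented system procedure: disable all foreign key constraints on the new production shell while the data is copied in, then re-enable and re-validate them afterwards.

    -- Minimal sketch; relies on the undocumented but widely used system
    -- procedure sp_MSforeachtable. Run in the target (new production) database.

    -- 1) Disable every foreign key constraint before loading the data
    EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

    -- 2) Load the data from the temp copy here (step 4 of the list above)

    -- 3) Re-enable the constraints and re-validate the existing rows
    EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';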
I have two servers: one production and one development. There is a third party query that runs on both servers. Yes, the query is poorly tuned - I cannot change it. In production, it runs in 54 minutes. In development, I have tried to let it run for days and it never completes.
Here's what I have tried so far:
Compared the settings of production vs. development. The settings are very similar - the development box is larger, 4 times more memory.
Max degree of parallelism is the same on both boxes.
No compression on both boxes.
The production server is fairly busy, the development server is empty - this is the only process running on it.
Plenty of free disk space.
Updated all statistics on all databases touched by the query on dev.
Indexing is the same on both boxes.
The development box is running MSSQL2012.
The production box is running MSSQL2008R2.
What I've noticed:
The query consumes a massive amount of CPU time.
HUGE number of reads (16 million reads for 10 writes according to sp_WhoIsActive)
Largest wait types are CXPACKET, SOS_SCHEDULER_YIELD and TRACEWRITE respectively.
wait_type            sum_wait_time_ms   pct_wait_time   sum_waiting_tasks   avg_wait_time_ms
CXPACKET                     41427655            80.7             1763074               23.5
SOS_SCHEDULER_YIELD           2414694             4.7            53272553                0.0
TRACEWRITE                    1985634             3.9                1483             1338.9
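For reference, a minimal sketch of how a summary like the one above can be collected from the server-wide wait statistics DMV (the exact column math here is an assumption, not the poster's original query):

    -- Minimal sketch; aggregates server-wide waits from sys.dm_os_wait_stats.
    -- Column names mirror the summary above; the query itself is an assumption.
    SELECT TOP (10)
           wait_type,
           wait_time_ms                                      AS sum_wait_time_ms,
           CAST(100.0 * wait_time_ms
                / SUM(wait_time_ms) OVER () AS DECIMAL(5,1)) AS pct_wait_time,
           waiting_tasks_count                               AS sum_waiting_tasks,
           CAST(wait_time_ms * 1.0
                / NULLIF(waiting_tasks_count, 0) AS DECIMAL(12,1)) AS avg_wait_time_ms
    FROM sys.dm_os_wait_stats
    ORDER BY wait_time_ms DESC;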
Hi everyone,
I really need your help! How do you manage a development server and a production server?
My current situation:
Our programmers develop new stored procedures on the development server. When all testing is completed, we deploy them to the production server.
Challenges:
Programmer Eric modified a stored procedure (SP_1) on the development server. He did not deploy it to the production server because SP_1 is not complete yet.
Programmer John got a request and needed to modify SP_1 on the development server.
So now there is a problem with SP_1: if John deploys SP_1 to the production server, it will overwrite the current version and cause errors.
Can anyone solve this problem? Would it be better if we had another testing server?
many thanks
Hi all, I've been assigned the task of refreshing data from the production environment to the development environment. What I got is a backup file of a DB in the prod environment, and I now need to bring that into the development environment. I can restore it to the dev environment no problem, but the warning I got is that the table owners in the prod and dev environments need to be different; that is, the owner is A in the prod environment and B in the dev environment. So I need to:
1) Restore the DB in the dev environment
2) Change the table owner from A to B
3) Change the related triggers
4) Change the related views
Can anyone suggest the most efficient approach?
Thanks a lot.
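For steps 2-4, a minimal T-SQL sketch, assuming the pre-2005 owner model implied above and the system procedure sp_changeobjectowner; it generates one ownership-change statement per object owned by A, so the output can be reviewed and then executed:

    -- Minimal sketch; owner names A and B are taken from the post, everything
    -- else is the standard SQL Server 2000 catalog. Run in the restored dev DB,
    -- then review and execute the generated statements.
    SELECT 'EXEC sp_changeobjectowner '''
           + USER_NAME(uid) + '.' + name + ''', ''B'';'
    FROM sysobjects
    WHERE USER_NAME(uid) = 'A'
      AND type IN ('U', 'P', 'V', 'TR', 'FN');  -- tables, procs, views, triggers, functions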
View 2 Replies View RelatedHello All,I have been searching for a published document for Best Practicesconcerning access levels based on roles. Should developers have morethan (if at all) select level access to production data? If Iunderstand (from multiple postings) that it is best to have:1. Development (developers have extensive access levels)2. Test (developers have restriced access levels)and3. Production (developers have none or select level access)Our environment and budget only allows for items 1 and 3.If any body could point me to a document from a 'reputable' source, Iwould greatly appreciate it.TIABill
We have a system here where we develop SSIS packages on a development server. I am trying to figure out the cleanest way to promote these changes to a production server where stored procedures/tables that are used in the package are not deployed yet.
When I switch the connection in the package to the production server, there are a lot of objects in the package that are "invalidated" because they are trying to verify the existence of tables/columns. One example is outputting the results of a query to a text file. The text file destination gets a red X on it because it can't grab the columns from the source query (because that stored procedure doesn't exist yet).
Is there a best practice or something on how to deploy packages to a production system? I have tried turning off "ValidateExternalMetaData" with no success.
Thanks!
Hi all
I am working in a production environment. Please help me free up space. The error is:
The log file for database 'Home_alone' is full. Back up the transaction log for the database to free up some log space.
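A minimal T-SQL sketch of the usual first response, assuming the database is in the FULL recovery model and using a hypothetical backup path and logical log file name; back up the transaction log so the space inside it can be reused, and only shrink the physical file if it has genuinely overgrown:

    -- Minimal sketch; assumes FULL recovery model, a hypothetical backup path,
    -- and an assumed logical log file name. Backing up the log marks its
    -- inactive portion as reusable.
    BACKUP LOG Home_alone
    TO DISK = 'G:\Backups\Home_alone_log.trn';

    -- Optional: shrink the physical log file only if it has grown far beyond
    -- its normal working size (repeated shrinking causes regrowth and fragmentation).
    DBCC SHRINKFILE ('Home_alone_log', 1024);   -- target size in MB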
Hello, I am in great trouble. I want to revert to the original state of a database, from before I performed a RESTORE DATABASE on my SQL Server 2000 database. Accidentally, I didn't take a backup of my database before I did the restore, so is there any way to get the original state of my database back?
Any suggestion will be highly appreciated.
Regards,
S. Domadia.
1.) Can an aspnetdb.mdf database be configured and set up on one server and then be moved to a production server, or is there something machine-specific that keeps this from being possible?
2.) Is putting this file in the App_Data folder something that is used only for SQL Server 2005 or SQL Server Express? I had to set up a connection string in my SQL Server 2000 installation to get the connection to work.
Thanks for any input!
Colelaus
This is driving me nuts: On my development machine the code runs fine
but generates an error on the production server. Both are running SQL
Server 2000 and ASP.NET 1.1
The datatype of the field in question is datetime.
The webform has a calendar for a user to select and automatically
insert the date into the textbox. The update command in the webform is:
cmdInsert.Parameters.Add("@citation_date", CDate(txtDate.Text))
This works without a hitch on my development system, but on the production server it generates the following error:
Cast from string "19-12-1997" to type 'Date' is not valid.
WHY?
I have some simple code that uses the FileUpload control to read an Excel sheet and upload the data into a SQL Server 2005 DB.
I'm using Visual Web Developer and SQL Server Express Edition to test it. It works fine when I test. However, when I push it up to the production server and try it from any other PC, it does not. The page loads fine, but when it starts to upload it errors out.
Any reason why? I've never seen this happen.
Here's the code. Thanks in advance.

Protected Sub BtnUpload2_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles BtnUpload2.Click
    UploadTextDocument()
End Sub

Private Sub UploadTextDocument()
    Dim location As String = FileUpload1.PostedFile.FileName.ToString
    ' Connection String to Excel Workbook
    Try
        Dim excelConnectionString As String = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=Excel 8.0", location)
        ' Create Connection to Excel Workbook
        Using connection As Data.OleDb.OleDbConnection = New Data.OleDb.OleDbConnection(excelConnectionString)
            Dim command As Data.OleDb.OleDbCommand = New Data.OleDb.OleDbCommand("Select BuilderID,SeriesID,OptionLevel,CommunityID,PhaseID,PlanID,ElevationID,OptionID,CurrentSalesPrice,LocalComments,Active,DateAdded,DateAvailable,DateInactive,SalesPriceEffective,SalesPriceExpires,PreviousSalesPrice,CutOffNotBefore,CutOffNotAfter FROM [Data$]", connection)
            connection.Open()
            ' Create DbDataReader to Data Worksheet
            Using dr As Data.Common.DbDataReader = command.ExecuteReader()
                ' SQL Server Connection String
                Dim connectionString As String = ConfigurationManager.ConnectionStrings("HbAdminMaintenance").ConnectionString
                ' Bulk Copy to SQL Server
                Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(connectionString)
                    bulkCopy.DestinationTableName = "ExcelData"
                    bulkCopy.WriteToServer(dr)
                End Using
            End Using
            connection.Close()
        End Using
        LBError.Text = "The spreadsheet was successfully uploaded."
    Catch
        LBError.Text = "There was an error. Check the spreadsheet for correct format."
    End Try
End Sub
I'm having a problem with a Maintenance Plan created for SQL Server 2005 using Microsoft SQL Server Management Studio. See version information at the bottom of this post.
I have a Maintenance Plan using the "Back Up Database Task" that is set to perform a "Full" backup of "All Databases" on the local server. However, it appears that the list of "All databases" is hard-coded to be those databases that were available at the time the task was created. It seems any newly created databases don't appear in the list. If I attempt to edit the task and select the "These databases:" option, the newly created databases aren't even in the list.
Is there some way to have the task refresh the list of the databases available for backup? Ideally, I'd like any newly created databases to get backed up automatically without having to modify the task.
Thanks for your help,
Jonathan.
Microsoft SQL Server Management Studio 9.00.3042.00
Microsoft Analysis Services Client Tools 2005.090.3042.00
Microsoft Data Access Components (MDAC) 2000.086.3959.00 (srv03_sp2_rtm.070216-1710)
Microsoft MSXML 2.6 3.0 4.0 6.0
Microsoft Internet Explorer 7.0.5730.11
Microsoft .NET Framework 2.0.50727.42
Operating System 5.2.3790
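As a fallback if the maintenance plan task won't pick up new databases, a minimal T-SQL sketch of a job step that enumerates the current user databases at run time, so anything created since the plan was built is included (the backup path is hypothetical):

    -- Minimal sketch; backup path is hypothetical. Enumerates user databases
    -- at run time, so newly created databases are included automatically.
    DECLARE @db sysname, @sql NVARCHAR(1000);

    DECLARE db_cursor CURSOR FOR
        SELECT name
        FROM sys.databases
        WHERE database_id > 4            -- skip the system databases
          AND state_desc = 'ONLINE';

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@db)
                 + N' TO DISK = ''D:\Backups\' + @db + N'_full.bak'' WITH INIT';
        EXEC (@sql);
        FETCH NEXT FROM db_cursor INTO @db;
    END
    CLOSE db_cursor;
    DEALLOCATE db_cursor;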
I have my first small SQL Server 2005 database developed on my local server, and I also have its equivalent as an online database. I wish to update the local database (using an ASP.NET interface) and then upload the data (at least the amended data, but given the small size, all of the data should be no trouble) to the online database. I think replication is the straight answer, but I have no experience of this and I am wondering what else I might use that might be less complicated. One solution is DTS (using SQL 2000 terms), but I am not sure if I can set this up (1) to overwrite existing tables and (2) not to seemingly remove identity attributes from fields set as identities. I know there are other possibilities, but I would be glad of advice as to the likely best method for a small database updated perhaps once weekly or at less frequent intervals.
Best wishes, John Morgan
Hi,
How do I insert data that I have collected in a local database into a table in my online (i.e. hosted) database, which is on a different server?
At the moment I am just uploading all the data to the hosted DB, but this is wasting bandwidth, as only a small percentage of the data is actually selected and used.
I thought that if I used a local DB and then updated the table in my hosted DB, this would be much more efficient, but I am not sure how to write the SQL code to do this!
Do I do some kind of
INSERT INTO sample_table
SELECT xxx
FROM origanal_table
Or is it more complicated than this?
Thanks
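It is roughly that shape, but the two servers have to be connected first; a minimal T-SQL sketch assuming the hosting provider allows a linked server (many shared hosts do not, in which case exporting and importing the rows is the fallback), with hypothetical server, login, database, and table names:

    -- Minimal sketch; linked server, login, database, and table names are
    -- hypothetical, and this assumes the hosted server accepts linked-server
    -- connections from your local machine.
    EXEC sp_addlinkedserver
         @server = 'HOSTEDSRV',
         @srvproduct = '',
         @provider = 'SQLNCLI',
         @datasrc = 'sql.myhost.example.com';

    EXEC sp_addlinkedsrvlogin
         @rmtsrvname = 'HOSTEDSRV',
         @useself = 'FALSE',
         @rmtuser = 'hosted_login',
         @rmtpassword = '********';

    -- Push only the rows that are actually needed, using a four-part name
    INSERT INTO HOSTEDSRV.HostedDb.dbo.sample_table (col1, col2)
    SELECT col1, col2
    FROM dbo.original_table
    WHERE selected_flag = 1;   -- hypothetical filter for the rows worth sending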
We are setting up a new Reporting Services 2005 enterprise reporting tier that will support multiple developers, applications, and end users. We will have mirrored environments including development, test, and production each with their own database cluster, and reporting server.
We have multiple report developers who share a single Visual Studio solution, which is saved in SourceSafe and is set up to have separate report projects for each business unit in the organization. Each report project is mapped to a specific deployment folder matching the business unit. Using the Visual Studio Configuration Manager, we can simply flip to the environment we want to deploy to and the reports are published to the correct environment and folder structure.
My problem lies with the common data sources. We are using a single master Common Data Sources folder to hold all of the data sources. The trick is that each and every reporting folder seems to have to have its own copy of the data source in Visual Studio. There does not seem to be an easy way to change the data sources for the reports when you publish to the various environments, i.e. development, test, production, etc.
Ideally, we would have a single project for the common data sources that all reporting projects and associated folders would map to, and we would have a way to associate the appropriate data source for each environment when we deploy.
I'm looking for best practices on how to set up data sources for development and deployment in an enterprise environment that uses Visual Studio to develop and publish reports. We have 3 environments, 6 data sources per environment, and about 20 reporting folders/projects in Visual Studio. That's 360 changes that have to be managed when deploying reports. Is there a best-practices way to do this?
There has got to be a better way. Can anyone give me some insight into how to set this up?
Thanks!
The web host I am using has MS SQL Server installed. I do not have it on my local development PC.
How do I develop and test a website locally, using an MS SQL database, without having MS SQL Server? Is there a lightweight MS SQL Server edition I can use with a good user interface, so that I can then just upload the DB to the MS SQL Server on the host?
Please advise, NewBe2
Hi,
Can anyone let me know how to deal with the following things:
Are there any links for database coding, test data generation, editing, data cleansing, migration, and loading with respect to database development?
Thanks in Advance.
Hi,
Is there any tool available to migrate data from a SQL Server test database to a SQL Server production database? The data migration should be based on a condition which can be given as an input for a table by the user. The dependent tables should also be migrated based on the given condition, i.e. data subsetting based on the matching conditions.
Ex: Salary > 2000
Only the rows of the table which match the condition need to be migrated. The rows of its dependent tables should also be migrated based on the given condition. Please help me with a tool which can automate this.
Thanks,
MiraJ
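Tools aside, the underlying operation per table pair is small enough to sketch; a minimal T-SQL example of condition-based subsetting, assuming hypothetical Employee and Dependent tables linked by EmployeeID and a same-instance or linked-server copy of the production database named ProdDB:

    -- Minimal sketch; table, column, and database names are hypothetical.
    -- Copies only the parent rows that match the condition, then only the
    -- child rows that belong to those parents.
    INSERT INTO ProdDB.dbo.Employee (EmployeeID, Name, Salary)
    SELECT e.EmployeeID, e.Name, e.Salary
    FROM dbo.Employee AS e
    WHERE e.Salary > 2000;

    INSERT INTO ProdDB.dbo.Dependent (DependentID, EmployeeID, Name)
    SELECT d.DependentID, d.EmployeeID, d.Name
    FROM dbo.Dependent AS d
    JOIN dbo.Employee AS e ON e.EmployeeID = d.EmployeeID
    WHERE e.Salary > 2000;   -- the same condition drives the dependent rows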