Data Source Deployment Best Practices Supporting Development, Test, and Production Environments
Feb 4, 2008
We are setting up a new Reporting Services 2005 enterprise reporting tier that will support multiple developers, applications, and end users. We will have mirrored environments including development, test, and production, each with its own database cluster and reporting server.
We have multiple report developers who share a single Visual Studio solution, which is saved in SourceSafe and is set up to have separate report projects for each business unit in the organization. Each report project is mapped to a specific deployment folder matching the business unit. Using the Visual Studio Configuration Manager, we can simply flip to the environment we want to deploy to and the reports are published to the correct environment and folder structure.
My problem lies with the common data sources. We are using a single master Common Data Sources folder to hold all of the data sources. The trick is that each and every reporting folder seems to have to have its own copy of the data source in Visual Studio. There does not seem to be an easy way to change the data sources for the reports when you publish to the various environments, i.e. development, test, production, etc.
Ideally, we would have a single project for the common data sources that all reporting projects and associated folders would map to, and we would have a way to associate the appropriate data source for each environment when we deploy.
I'm looking for best practices on how to set up data sources for development and deployment in an enterprise environment that uses Visual Studio to develop and publish reports. We have 3 environments, 6 data sources per environment, and about 20 reporting folders/projects in Visual Studio. That's 360 changes that have to be managed when deploying reports. Is there a best-practices way to do this?
There has got to be a better way. Can anyone give me some insight into how to set this up?
We're using SQL Server 2000 as the back end in our web project. The problem is we have 3 different copies of the same database - one each for Development, Test, and Production - sitting on 2 different machines.
My question is - is there any tool for comparing the objects (tables, stored procedures, etc.)?
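Third-party tools such as Red Gate SQL Compare or ApexSQL Diff are built for exactly this, but a rough first pass can be done with plain T-SQL when both copies are visible from one instance. A minimal sketch, assuming the two copies are named DevDB and TestDB (placeholder names) and sit on the same server; note that on SQL 2000, ROUTINE_DEFINITION only holds the first 4000 characters, so the second query is a quick check rather than proof of equality.

-- Routines present in one database but missing from the other
SELECT d.ROUTINE_TYPE, d.ROUTINE_NAME
FROM DevDB.INFORMATION_SCHEMA.ROUTINES d
LEFT JOIN TestDB.INFORMATION_SCHEMA.ROUTINES t
       ON t.ROUTINE_NAME = d.ROUTINE_NAME AND t.ROUTINE_TYPE = d.ROUTINE_TYPE
WHERE t.ROUTINE_NAME IS NULL

-- Routines whose (first 4000 characters of) definition differs between the two copies
SELECT d.ROUTINE_NAME
FROM DevDB.INFORMATION_SCHEMA.ROUTINES d
JOIN TestDB.INFORMATION_SCHEMA.ROUTINES t
  ON t.ROUTINE_NAME = d.ROUTINE_NAME AND t.ROUTINE_TYPE = d.ROUTINE_TYPE
WHERE d.ROUTINE_DEFINITION <> t.ROUTINE_DEFINITION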
We are setting up a test lab environment with 100 machines. We want one master testing db that gets replicated to each machine to run scripted application tests nightly.
My goal is to minimize the amount of work to move this thing to each of the 100 test machines. I am wondering whether we even need to have SQL local, and should instead invest in a monster db server with 100 copies of the db that we restore, with each test machine pointing to its own db on that server, or whether I should use db mirroring or something to get the master test db to each of those machines instead.
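A minimal sketch of the central-server option: restore the master test database backup into 100 separately named databases on one big server and point each test machine at its own copy. The backup path, database names, logical file names, and drive letters below are placeholders; check the actual logical names with RESTORE FILELISTONLY first.

DECLARE @i int, @db sysname, @sql nvarchar(2000)
SET @i = 1
WHILE @i <= 100
BEGIN
    SET @db = 'TestDb_' + RIGHT('000' + CAST(@i AS varchar(3)), 3)
    SET @sql = 'RESTORE DATABASE [' + @db + '] FROM DISK = ''C:\Backups\MasterTest.bak''
                WITH MOVE ''MasterTest_Data'' TO ''D:\Data\' + @db + '.mdf'',
                     MOVE ''MasterTest_Log''  TO ''E:\Logs\' + @db + '.ldf'',
                     REPLACE'
    EXEC (@sql)
    SET @i = @i + 1
END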
Hi all, I've been assigned the task of refreshing data from the production environment to the development environment. What I got is a backup file of a db in the prod environment, and I now need to load that into the development environment. I can restore it to the dev environment no problem, but the warning I got is that the table owner in the prod and dev environments needs to be different, that is, the owner is A in the prod environment and B in the dev environment. So I need to:
1) restore the db in the dev environment
2) change the table owner from A to B
3) change the related triggers
4) change the related views
Can anyone suggest the most efficient approach? Thanks a lot.
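A minimal sketch of steps 2-4 using sp_changeobjectowner after the restore, assuming the old owner really is named A and the new one B; run it in the restored dev database. Triggers normally follow their parent table's owner, but verify them (and any code that references objects with a hard-coded A. prefix) afterwards.

DECLARE @name sysname, @qualified nvarchar(300)
DECLARE obj_cursor CURSOR FOR
    SELECT o.name
    FROM sysobjects o
    JOIN sysusers u ON u.uid = o.uid
    WHERE u.name = 'A'
      AND o.type IN ('U', 'V', 'P', 'FN', 'IF', 'TF')   -- tables, views, procs, functions
OPEN obj_cursor
FETCH NEXT FROM obj_cursor INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @qualified = 'A.' + @name
    EXEC sp_changeobjectowner @objname = @qualified, @newowner = 'B'
    FETCH NEXT FROM obj_cursor INTO @name
END
CLOSE obj_cursor
DEALLOCATE obj_cursor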
Hello All, I have been searching for a published document on best practices concerning access levels based on roles. Should developers have more than (if at all) select-level access to production data? I understand (from multiple postings) that it is best to have:
1. Development (developers have extensive access levels)
2. Test (developers have restricted access levels)
3. Production (developers have no access or select-level access)
Our environment and budget only allow for items 1 and 3. If anybody could point me to a document from a 'reputable' source, I would greatly appreciate it. TIA, Bill
I am trying to refresh a test database with data from a production database. Both database structures are identical, e.g. constraints, stored procs, PKs, etc. I am trying to create a package in SSIS that accomplishes this task and I am having extensive problems. The Import/Export Wizard is out of the question because the constraints are not carried over; plus, when I try to refresh the data using the Import/Export Wizard, it fails on 1 specific table because of a column in that table named "Error code". I think "Error code" is a Microsoft keyword, so it fails on this column. Does anyone know a workaround so I can accomplish this simple task, which could be completed in minutes using DTS? I understand that SSIS is not as straightforward as DTS, but this task is something that DBAs do on a regular basis and therefore should not be this difficult.
There is a production database which has ever-increasing data. For testing purposes, though, I would like to build a test database with exactly the same schema but only a subset of data copied from the production database. I'll specify the criteria (something like a WHERE clause in a SELECT query) for copying the data from the production database.
Is there a tool that anyone has come across to do this job?
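Tools such as Red Gate SQL Data Compare support filtered copies, but if the schema is already scripted into the test database, the copy itself is just INSERT ... SELECT with your criteria. A minimal sketch, assuming both databases are reachable from one instance and using placeholder table and column names; SET IDENTITY_INSERT is only needed for tables with identity columns.

SET IDENTITY_INSERT TestDB.dbo.Orders ON

INSERT INTO TestDB.dbo.Orders (OrderID, CustomerID, OrderDate, Total)
SELECT OrderID, CustomerID, OrderDate, Total
FROM ProdDB.dbo.Orders
WHERE OrderDate >= '20070101'          -- your subset criteria goes here

SET IDENTITY_INSERT TestDB.dbo.Orders OFF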
We have multiple development environments on one machine for our Visual Studio .NET projects. These are web applications, so we have organized them by IP address and host headers. When I installed Reporting Services, it installed into my default website (localhost) and that is fine. However, we want to be able to create our reports in our different .NET projects. The only projects that will allow me to create reports are the projects using the default website. When I right-click to create a new project, the Business Intelligence Projects (reports) type is not there. Can I do this? If so, how do I configure it?
I am debugging one of our programs and ran the fix in Test. I would like to compare table 1 between Production and Test. I want the query to output column 1 if the Production output <> Test output. What is the best way to achieve this? Jeff -- Message posted via http://www.sqlmonster.com
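A minimal sketch of one way to do it, assuming the two copies are reachable from a single instance (here the Production copy through a linked server named PRODSRV and the Test copy locally), that table1 has a key column, and that col1 is the column being checked; all of these names are placeholders.

SELECT COALESCE(p.key_col, t.key_col) AS key_col,
       p.col1 AS production_col1,
       t.col1 AS test_col1
FROM PRODSRV.ProdDB.dbo.table1 AS p
FULL OUTER JOIN TestDB.dbo.table1 AS t
       ON t.key_col = p.key_col
WHERE p.col1 <> t.col1                                 -- values differ
   OR (p.col1 IS NULL AND t.col1 IS NOT NULL)          -- NULL on one side only
   OR (p.col1 IS NOT NULL AND t.col1 IS NULL)
   OR p.key_col IS NULL OR t.key_col IS NULL           -- row missing on one side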
I was wondering if anyone has a neat (preferably automated) method of creating small testing databases from large production instances. My requirement would be to copy the schema and a subset of configuration data from a production database into a test database. The subset of data would be a full copy of a subset of tables, rather than a subset of data within one or more tables. There is a mixture of SQL 2000 and SQL 2005 servers involved in this requirement. I'm familiar with the scripting mechanisms of Enterprise Manager and Management Studio and with DTS packages, sufficient to perform a process like this manually, but want to productionise and schedule this process to be performed automatically. I'm sure this must be a commonly performed task, so I'm interested to know if anyone has a "best practice" for this requirement.
Hello. What I'm trying to do should be fairly simple, but after an hour of searching and trials, I can't seem to get it working. When I run my website on my test machine, I would like it to use a certain value for my connection string. But when it's run on the live server, it should use a different connection string. This way, when it's on the test server it'll use the test database. In my Web.Config file, I have this:
<connectionStrings>
  <add name="myConnectionString" connectionString="Data Source=TEST_DB;Initial Catalog=tester;Integrated Security=True" providerName="System.Data.SqlClient" />
</connectionStrings>
I would like to be able to change the connection string in something like the Global.asax file, when the application is loaded. I'd check for a certain environment variable, or some other condition, and change the value of the connection string. I tried this in my Global.asax, but it told me the value was read-only:
ConfigurationManager.ConnectionStrings["myConnectionString"].ConnectionString = "Data Source=LIVE_DB;Initial Catalog=live;Integrated Security=True"
Does anyone have a good way of doing this, so I don't need to juggle Web.Config files when I publish my site? Thanks.
Production and development servers are on different domains and they do not trust each other. How do I get the data from table t1 in database db1 in production and load it into table t1 in database db1 in development?
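When the domains do not trust each other, Windows authentication is out, but a linked server using a SQL Server login still works (as does bcp out/in with the -U switch). A minimal sketch run on the development server, assuming SQL authentication is enabled on production and a SQL login named prod_reader exists there with read access; the server name, login, password, and column list are placeholders.

EXEC sp_addlinkedserver
     @server = 'PRODSRV',
     @srvproduct = '',
     @provider = 'SQLOLEDB',
     @datasrc = 'ProductionServerName'

EXEC sp_addlinkedsrvlogin
     @rmtsrvname = 'PRODSRV',
     @useself = 'false',
     @locallogin = NULL,
     @rmtuser = 'prod_reader',
     @rmtpassword = 'StrongPasswordHere'

INSERT INTO db1.dbo.t1 (col1, col2)
SELECT col1, col2
FROM PRODSRV.db1.dbo.t1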
We have a production SQL 7 server, QA, and Development. From time to time, I want to move just the data from the production server to the other 2 servers without modifying objects that may have been changed, such as stored procedures and rights. Is there a way, using the SQL tools provided, that we can move just the data? What also happens is that the rights to the objects change, which means my developers no longer have access to the tables for selects in QA, since the changes were overwritten by production, where they do not have those rights.
We have a situation where modifications to the stored procs and views keep happening in the development database environment while there is a live production database also up and running.
Periodically we need to ensure that development and production are in sync, where all changes made to dev are transferred to prod. However, the entire prod database obviously cannot be overwritten, as it contains production data. Therefore, what we need to do is transfer all SPs and views from development to production. (We do not prefer incremental transfers, as there are many SPs and views and we can never really be sure that all changes have been transferred.)
The obvious way is to script out all SPs and views in dev and run the scripts in production, but what we encountered was that Enterprise Manager does not script out the objects in hierarchical sequence - i.e. parent objects first and then the objects that refer to the parent objects. This leads to errors when running the scripts, because an object required for the creation of an SP - say another SP run from within that SP - may not have been generated by then, as the "contained" SP is lower in the alphabetical order in which SQL Server scripted out the objects.
Finally, we had to look at each and every error and manually set the process right.
Moreover, in the way we did things, the "sysdepends" table got all screwed up, and therefore we can no longer depend on Enterprise Manager to show dependencies within the database.
Is there any elegant way of going about this? We also tried the Transfer Manager in DTS, but it gave the same error. Moreover, can the sysdepends table be rebuilt from scratch?
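On the sysdepends question: the rows are written as objects are (re)created while the objects they reference already exist, so once the dev scripts have all been applied successfully in prod the dependency information largely rebuilds itself; for views there is also sp_refreshview, which refreshes a view's metadata after its underlying objects exist. A minimal sketch that refreshes every view in the current database, assuming SQL Server 2000 and two-part owner.object names; stored procedures have no equivalent refresh on 2000 and simply need to be re-run in a working order.

DECLARE @view nvarchar(400)
DECLARE view_cursor CURSOR FOR
    SELECT u.name + '.' + o.name
    FROM sysobjects o
    JOIN sysusers u ON u.uid = o.uid
    WHERE o.type = 'V'
OPEN view_cursor
FETCH NEXT FROM view_cursor INTO @view
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_refreshview @viewname = @view
    FETCH NEXT FROM view_cursor INTO @view
END
CLOSE view_cursor
DEALLOCATE view_cursor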
I have a question about how to efficiently develop on a development version of a database and get those changes over to the production database without losing live data, etc.
Here is how we are trying it:
1. Export the production database to a temp database (exact copy)
2. Delete the production database
3. Export the development database (definition only, no data) to a new database with the production database's name
4. Export the production (temp) database (data only) over to the new production shell
We are having truncate errors due to FK constraints.
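On the truncate errors: TRUNCATE TABLE is never allowed on a table that is referenced by a foreign key, regardless of whether the constraint is enabled, so the usual workaround is to disable the checks, use DELETE, reload, and then re-enable the constraints. A minimal sketch using the undocumented but widely used sp_MSforeachtable procedure, assuming it runs in the new production shell before the data-only copy:

-- 1. Disable all foreign key (and check) constraints
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

-- 2. Clear the tables with DELETE (TRUNCATE would still be refused on FK-referenced tables)
EXEC sp_MSforeachtable 'DELETE FROM ?'

-- 3. ... load the production data here ...

-- 4. Re-enable and re-validate the constraints
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'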
I have two servers: one production and one development. There is a third party query that runs on both servers. Yes, the query is poorly tuned - I cannot change it. In production, it runs in 54 minutes. In development, I have tried to let it run for days and it never completes.
Here's what I have tried so far:
Compared the settings of production vs. development. The settings are very similar - the development box is larger, with 4 times more memory.
Max degree of parallelism is the same on both boxes.
No compression on both boxes.
The production server is fairly busy, the development server is empty - this is the only process running on it.
Plenty of free disk space.
Updated all statistics on all databases touched by the query on dev.
Indexing is the same on both boxes.
The development box is running MSSQL2012.
The production box is running MSSQL2008R2.
What I've noticed:
The query consumes a massive amount of CPU time.
HUGE number of reads (16 million reads for 10 writes according to sp_WhoIsActive)
Largest wait types are CXPACKET, SOS_SCHEDULER_YIELD, and TRACEWRITE respectively:

wait_type             sum_wait_time_ms   pct_wait_time   sum_waiting_tasks   avg_wait_time_ms
CXPACKET              41427655           80.7            1763074             23.5
SOS_SCHEDULER_YIELD   2414694            4.7             5327255             30.0
TRACEWRITE            1985634            3.9             1483                1338.9
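With CXPACKET and SOS_SCHEDULER_YIELD on top, the parallelism settings are the first thing worth comparing directly, along with the compatibility level and statistics of the restored database (a 2008 R2 database attached to the 2012 box keeps its old compatibility level until it is changed). TRACEWRITE also suggests a trace (Profiler or a server-side trace) is running against the dev instance and adding overhead. A minimal sketch of the checks, run on both servers; the database name is a placeholder, and a MAXDOP 1 query hint would only be a diagnostic test to rule parallelism in or out, not a fix for the third-party query.

-- Instance-level parallelism settings (compare the two servers)
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name IN ('max degree of parallelism', 'cost threshold for parallelism')

-- Compatibility level of the database the query touches
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'YourDatabase'

-- What the sessions are actually waiting on while the query runs
SELECT session_id, status, wait_type, wait_time, cpu_time, logical_reads
FROM sys.dm_exec_requests
WHERE session_id > 50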
I really need your help!! How do you manage a development server and a production server?
My current situation: our programmers develop new stored procedures on the development server. When all testing is completed, we then deploy them to the production server.
Challenges: Programmer Eric modified a stored procedure (SP_1) on the development server. He did not deploy it to the production server because SP_1 is not completed yet.
Programmer John got a request and needed to modify SP_1 on the development server.
So now there is a problem with SP_1: if John deploys SP_1 to the production server, it will overwrite the current version and cause errors.
Can anyone solve this problem? Would it be better if we had another testing server?
I have a production database with a backup job that creates files with the naming convention dbname_db_200503291800.bak. I want to schedule a restore job that will restore yesterday's backup. How can I write my restore statement so that it will specify the backup file with yesterday's date? Thanks
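A minimal sketch of the restore step, assuming the time portion of the file name is always 1800, the backups sit in a fixed folder, and the target database can be replaced; the paths and names are placeholders. CONVERT style 112 produces the yyyymmdd part that matches the naming convention.

DECLARE @file nvarchar(260), @sql nvarchar(1000)

SET @file = 'D:\Backups\dbname_db_'
          + CONVERT(char(8), DATEADD(day, -1, GETDATE()), 112)   -- yesterday as yyyymmdd
          + '1800.bak'

SET @sql = 'RESTORE DATABASE [dbname] FROM DISK = ''' + @file + ''' WITH REPLACE'

EXEC (@sql)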
We have a system here where we develop SSIS packages on a development server. I am trying to figure out the cleanest way to promote these changes to a production server where stored procedures/tables that are used in the package are not deployed yet.
When I switch the connection in the package to the production server, there are a lot of objects in the package that are "invalidated" because they are trying to verify the existence of tables/columns. One example is outputting the results of a query to a text file. The text file destination gets a red X on it because it can't grab the columns from the source query (because that stored procedure doesn't exist yet).
Is there a best practice or something on how to deploy packages to a production system? I have tried turning off "ValidateExternalMetaData" with no success.
1) Can an aspnetdb.mdf database be configured and set up on one server and then be moved to a production server, or is there something machine-specific that keeps this from being possible?
2) Is putting this file in the App_Data folder something that is used only for SQL 2005 or SQL Express? I had to set up a connection string in my SQL 2000 installation to get the connection to work.
Thanks for any input! Colelaus
This is driving me nuts: On my development machine the code runs fine but generates an error on the production server. Both are running SQL Server 2000 and ASP.NET 1.1
The datatype of the field in question is datetime. The webform has a calendar for a user to select and automatically insert the date into the textbox. The update command in the webform is:
This works without a hitch on my development system, but on the production server it generates the following error: Cast from string "19-12-1997" to type 'Date' is not valid.
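That error message looks like the .NET runtime refusing to parse the string (a VB CDate/CType failure) rather than SQL Server: "19-12-1997" is only a valid date under a dd-MM-yyyy regional setting, and the production server is evidently configured differently from the development machine. The robust fix is to pass the value as a typed datetime parameter instead of concatenating a string into the command; if a literal does have to reach SQL Server as text, only unambiguous formats are safe regardless of language or DATEFORMAT settings. A minimal T-SQL illustration (the dates themselves are just examples):

-- Fails or changes meaning depending on the language / DATEFORMAT setting:
SET DATEFORMAT mdy
SELECT CAST('19-12-1997' AS datetime)          -- error: there is no month 19

-- Safe under any language or DATEFORMAT setting:
SELECT CAST('19971219' AS datetime)            -- unseparated yyyymmdd
SELECT CAST('1997-12-19T00:00:00' AS datetime) -- ISO 8601 with the T separator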
I have been looking for some documentation that would support or reject my opinion on Production -vs- Development naming conventions. I believe that each environment should be housed on separate servers with identical names, access, users, stored procs, and so on. If you either agree or disagree with this methodology, I would appreciate your input. TIA, Bill
Hi all, I have an ASP.NET 1.1 application running on the intranet which uses SQL Server 2000. The application is in production, and every time I want to make some changes, I make the changes on my development machine and then copy the application dll to the server. The problem is that I'm using stored procedures for all my Select, Insert, and Delete statements. These stored procedures are live on the server, so I can't make the modifications locally, test them, and then copy them to the server.
How can I make modifications without affecting the production server and the users? Thanks.
Should one have a separate environment for the production and test systems? How do you do it on the same server? Install two instances? How do you separate the test DBs from the production DBs? Please advise... Thank you
I have some simple code that uses the file upload control to read an Excel sheet and upload the data into a SQL 2005 db. I'm using Visual Web Developer and SQL Server Express Edition to test it. It works fine when I test. However, when I push it up to the production server and try it from any other PC, it does not. The page loads fine; however, when it starts to upload, it errors out. Any reason why? I've never seen this happen. Here's the code. Thanks in advance.

Protected Sub BtnUpload2_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles BtnUpload2.Click
    UploadTextDocument()
End Sub

Private Sub UploadTextDocument()
    ' Note: PostedFile.FileName is the path as the client browser sees it. That works while the
    ' browser and the web server are the same box (the development machine), but from any other PC
    ' the file generally does not exist at that path on the server, so the Jet connection below
    ' fails; saving the upload to the server first (FileUpload1.SaveAs) and pointing the connection
    ' at that copy is the usual fix. The empty Catch also hides the real exception message.
    Dim location As String = FileUpload1.PostedFile.FileName.ToString
    Try
        ' Connection string to the Excel workbook
        Dim excelConnectionString As String = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=Excel 8.0", location)
        ' Create connection to the Excel workbook
        Using connection As Data.OleDb.OleDbConnection = New Data.OleDb.OleDbConnection(excelConnectionString)
            Dim command As Data.OleDb.OleDbCommand = New Data.OleDb.OleDbCommand("Select BuilderID,SeriesID,OptionLevel,CommunityID,PhaseID,PlanID,ElevationID,OptionID,CurrentSalesPrice,LocalComments,Active,DateAdded,DateAvailable,DateInactive,SalesPriceEffective,SalesPriceExpires,PreviousSalesPrice,CutOffNotBefore,CutOffNotAfter FROM [Data$]", connection)
            connection.Open()
            ' Create DbDataReader over the Data worksheet
            Using dr As Data.Common.DbDataReader = command.ExecuteReader()
                ' SQL Server connection string
                Dim connectionString As String = ConfigurationManager.ConnectionStrings("HbAdminMaintenance").ConnectionString
                ' Bulk copy to SQL Server
                Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(connectionString)
                    bulkCopy.DestinationTableName = "ExcelData"
                    bulkCopy.WriteToServer(dr)
                End Using
            End Using
            connection.Close()
        End Using
        LBError.Text = "The spreadsheet was successfully uploaded."
    Catch
        LBError.Text = "There was an error. Check the spreadsheet for correct format."
    End Try
End Sub
I have just finished upsizing an Access database to SQL Server 2000. Now the SQL Server needs to be run on a test basis to determine if I need to make more changes to the front end (Access). The problem I am facing is how to keep the two databases in sync while I am testing. Any suggestions?
Also, any suggestions or comments on how to run a test setup like this (in parallel) are welcome, since this is my first time attempting a project like this.
Is there any tool available to migrate data from a SQL Server test database to a SQL Server production database? The data migration should be based on a condition which can be given as an input for a table by the user. The dependent tables should also be migrated based on the condition, i.e. data subsetting based on matching conditions.
Ex: Salary > 2000
Only the rows of the table which match the condition need to be migrated for the corresponding table. Its dependent tables' rows should also be migrated based on the given condition. Please help me with a tool which can automate this.
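I'm not aware of a built-in tool that follows the foreign keys for you (third-party data-subsetting tools exist), but for a known parent/child pair the migration itself is two INSERT ... SELECT statements driven by the same condition. A minimal sketch, assuming an Employee table with a Salary column and a dependent Payroll table referencing it through EmployeeID; all object names are placeholders.

-- Parent rows that match the condition
INSERT INTO ProdDB.dbo.Employee (EmployeeID, Name, Salary)
SELECT EmployeeID, Name, Salary
FROM TestDB.dbo.Employee
WHERE Salary > 2000

-- Dependent rows, restricted by joining back to the same condition on the parent
INSERT INTO ProdDB.dbo.Payroll (PayrollID, EmployeeID, PayDate, Amount)
SELECT p.PayrollID, p.EmployeeID, p.PayDate, p.Amount
FROM TestDB.dbo.Payroll AS p
JOIN TestDB.dbo.Employee AS e ON e.EmployeeID = p.EmployeeID
WHERE e.Salary > 2000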
Hi, I work with a large team developing an ASP.NET application that has a large database with over 50 complex stored procedures. It is proving more and more difficult and time-consuming to centralise the development and update of the database changes, and I was wondering if there were any best practices/tools that could be recommended. I have looked on the web for good articles and haven't found anything definitive (except that Team Foundation Server is the way forward).

A brief background to the current process is that everyone develops on the same database and then updates the stored procedure scripts in SourceSafe (manually). Then, when we do a new release, someone builds a script of all the database updates and runs it. There are issues with developers updating their stored procedures over other people's, and other concurrency problems.

I am looking to move all the developers to local databases so that their work only affects them, but this brings up the problem of keeping all the local databases up to date whenever they get the latest source code. The only way I currently see is to build a database update program that will run and update to the latest version. Surely this must be a common issue? Anyone have any good ideas/concepts? Also, our setup is Visual Studio 2005, SQL Server 2005, and SourceSafe 2005. Cheers, Andrew Thomas
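One common shape for the "database update program" is a schema version table plus a folder of numbered, re-runnable change scripts: each developer (and each environment) simply runs every script whose number is higher than the one recorded locally. A minimal sketch of the idea, with the table name and version number purely illustrative:

-- One-time setup: a table recording which change scripts have been applied
IF OBJECT_ID('dbo.SchemaVersion') IS NULL
    CREATE TABLE dbo.SchemaVersion
    (
        VersionNumber int      NOT NULL PRIMARY KEY,
        AppliedOn     datetime NOT NULL DEFAULT GETDATE()
    )
GO

-- Template for change script number 42 (safe to re-run)
IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE VersionNumber = 42)
BEGIN
    -- apply the schema / stored procedure changes for this version here
    -- (wrap CREATE/ALTER PROCEDURE statements in EXEC(N'...') since they must start their own batch)
    INSERT INTO dbo.SchemaVersion (VersionNumber) VALUES (42)
END
GO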
For the past few months I've been developing a DW and ETL with SQL 2005 / SSIS. My packages are being deployed to a SQL Server. Although in the end game we will have Dev/Staging/Production environments, I would still like to archive production packages when we push staging to production. Essentially, I would like to archive the last X packages that were deployed to production, where X is a reasonable number (3 - 5). I don't necessarily need to have them accessible to run. One of the purposes is to have another safeguard should we miss anything in user testing and need to roll back a deployment.
I am utilizing VSS and we will have backups running on the production server, but I would prefer to have an archive that is a little more accessible.
I'm just wondering if anyone has any thoughts on how to extract/archive production packages when the push is made. I could easily develop an app that queries MSDB and exports the packages to the file system.
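For packages deployed to MSDB on SQL 2005, the package XML lives in the (undocumented) msdb.dbo.sysdtspackages90 table, so the archiving app can be little more than a query that writes each package definition out to the file system when staging is pushed to production; the dtutil command-line utility can also copy packages between MSDB and files. A minimal sketch of the query side, assuming MSDB storage:

SELECT name,
       createdate,
       CAST(CAST(packagedata AS varbinary(max)) AS xml) AS package_xml
FROM msdb.dbo.sysdtspackages90
ORDER BY createdate DESC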