We're using SQL Server 2000 as the back end in our web project. The problem is that we have 3 different copies of the same database - one each for Development, Test and Production - sitting on 2 different machines.
My question is - is there any tool for comparing the objects (tables, stored procedures, etc.)?
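For a quick sanity check without a third-party tool, something along these lines can at least list objects that exist in one copy but are missing from the other. It is only a sketch: Dev_DB and Test_DB are hypothetical names, it assumes both copies are reachable from the same server, and it does not compare object definitions, only existence.

    -- Minimal sketch (SQL Server 2000 syntax): objects present in Dev_DB but missing from Test_DB.
    SELECT d.name, d.xtype
    FROM Dev_DB.dbo.sysobjects AS d
    LEFT JOIN Test_DB.dbo.sysobjects AS t
        ON t.name = d.name AND t.xtype = d.xtype
    WHERE d.xtype IN ('U', 'P', 'V')   -- tables, stored procedures, views
      AND t.name IS NULL
    ORDER BY d.xtype, d.name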
We are setting up a new Reporting Services 2005 enterprise reporting tier that will support multiple developers, applications, and end users. We will have mirrored environments including development, test, and production, each with its own database cluster and reporting server.
We have multiple report developers who share a single Visual Studio solution, which is saved in SourceSafe and is set up to have separate report projects for each business unit in the organization. Each report project is mapped to a specific deployment folder matching the business unit. Using the Visual Studio Configuration Manager, we can simply flip to the environment we want to deploy to and the reports are published to the correct environment and folder structure.
My problem lies with the common data sources. We are using a single master Common Data Sources folder to hold all of the data sources. The trick is that each and every reporting folder seems to have to have its own copy of the data source in Visual Studio. There does not seem to be an easy way to change the data sources for the reports when you publish to the various environments, i.e. development, test, production, etc.
Ideally, we would have a single project for the common data sources that all reporting projects and associated folders would map to, and we would have a way to associate the appropriate data source for each environment when we deploy.
I'm looking for best practices on how to set up data sources for development and deployment in an enterprise environment that uses Visual Studio to develop and publish reports. We have 3 environments, 6 data sources per environment, and about 20 reporting folders/projects in Visual Studio. That's 360 changes that have to be managed when deploying reports. Is there a best-practices way to do this?
There has got to be a better way. Can anyone give me some insight into how to set this up?
I have a production database with a backup job that creates files with the naming convention dbname_db_200503291800.bak. I want to schedule a restore job that will restore yesterday's backup. How can I write my restore statement so that it will specify the backup file with yesterday's date? Thanks
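A minimal sketch of one way to do it, building the file name from yesterday's date and running the restore through dynamic SQL. The backup folder and the target database name below are placeholders, and it assumes the file name always ends with the 1800 timestamp shown in the question.

    -- Minimal sketch: compose yesterday's file name and restore it dynamically.
    -- D:\Backups\ and dbname_restore are placeholder names.
    DECLARE @file varchar(260), @sql varchar(1000)

    SET @file = 'D:\Backups\dbname_db_'
        + CONVERT(char(8), DATEADD(day, -1, GETDATE()), 112)  -- yyyymmdd for yesterday
        + '1800.bak'

    SET @sql = 'RESTORE DATABASE dbname_restore FROM DISK = ''' + @file + ''' WITH REPLACE'

    EXEC (@sql)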
I have been looking for some documentation that would support or reject my opinion on Production vs. Development naming conventions. I believe that each environment should be housed on separate servers with identical names, access, users, stored procs, etc. If you either agree or disagree with this methodology, I would appreciate your input. TIA, Bill
I am trying to refresh a test database with data from a production database. Both database structures are identical, e.g. constraints, stored procs, PKs, etc. I am trying to create a package in SSIS that accomplishes this task and I am having extensive problems. The Import/Export Wizard is out of the question because the constraints are not carried over; plus, when I try to refresh the data using the Import/Export Wizard, it fails on one specific table because of a column in that table named "Error code". I think "Error code" is a Microsoft keyword, so it fails on this column. Does anyone know a workaround so that I can accomplish this simple task, which could be completed in minutes using DTS? I understand that SSIS is not as straightforward as DTS, but this task is something that DBAs do on a regular basis and therefore should not be this difficult.
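If the column name really is what trips it up (it contains a space rather than being a reserved word), delimiting the identifier in a hand-written source query usually gets past the parser; whether that helps the wizard itself is another matter. A minimal sketch, with dbo.MyTable and the other column names as placeholders:

    -- Minimal sketch: use a SELECT with delimited identifiers as the data source
    -- instead of picking the table directly. dbo.MyTable is a placeholder name.
    SELECT [Error code], OtherColumn1, OtherColumn2
    FROM dbo.MyTable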
Is there any tool available to migrate data from a SQL Server test database to a SQL Server production database? The data migration should be based on a condition which can be given as input for a table by the user. The dependent tables should also be migrated based on the condition, i.e. data subsetting based on the matching conditions.
Ex : Salary > 2000
Only the rows of the table that match the condition need to be migrated for the corresponding table. Also, its dependent tables' rows should be migrated based on the given condition. Please help me with a tool which can automate this.
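There are third-party tools for this, but the hand-rolled version of the idea looks roughly like the sketch below. Everything in it is hypothetical: the Employee and EmployeeAddress tables, the Salary column, the EmployeeID key, and the assumption that the two databases sit on the same instance (otherwise a linked server plays the same role).

    -- Minimal sketch: copy parent rows matching the condition, then the child rows
    -- whose parent matched. All table and column names are hypothetical.
    INSERT INTO Production.dbo.Employee (EmployeeID, Name, Salary)
    SELECT EmployeeID, Name, Salary
    FROM Test.dbo.Employee
    WHERE Salary > 2000

    INSERT INTO Production.dbo.EmployeeAddress (EmployeeID, AddressLine1, City)
    SELECT a.EmployeeID, a.AddressLine1, a.City
    FROM Test.dbo.EmployeeAddress AS a
    WHERE EXISTS (SELECT 1
                  FROM Test.dbo.Employee AS e
                  WHERE e.EmployeeID = a.EmployeeID
                    AND e.Salary > 2000)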
I am able to run the package successfully against the test database, but not against the production database. It throws an error saying:
Description: Unable to load the package as XML because of package does not have a valid XML format. A specific XML parser error will be posted. Description: Failed to open package file "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" due to error 0x80070015 "The device is not ready.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format. End Error Could not load package "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" because of error 0xC0011002. Description: Failed to open package file "D:\TAHOE\APPS\SSISPackages\Integration Services Packages\ArchiveMain.dtsx" due to error 0x80070015 "The device is not ready.". This happens when loading a package and the file cannot be opened or loaded correctly into the XML document. This can be the result of either providing an incorrect file name was specified when calling LoadPackage or the XML file was specified and has an incorrect format.
I am attempting to create a Test DB from a full backup of the production DB. With 2012, I cannot do it the way I had done it in previous versions (and now I understand why, because of logical names).
The Test db runs in the same instance as Prod db.
I attempted to run this but came up with errors. This is what I executed:
RESTORE DATABASE TEST FROM DISK = 'E:<path>FULL.BAK' WITH REPLACE, RECOVERY, MOVE 'PROD' TO 'E:<path>TEST.MDF';
The errors all say the restore cannot execute because PROD is in use.
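The usual pattern here is to list the logical file names in the backup first and then MOVE every file (data and log) to paths that belong to TEST, so nothing collides with the files PROD has open. A minimal sketch; the file paths are placeholders and the logical log name is assumed to be PROD_log (the FILELISTONLY output shows the real one):

    -- Show the logical file names contained in the backup.
    RESTORE FILELISTONLY FROM DISK = 'E:\Backups\FULL.BAK'

    -- Minimal sketch: restore as TEST, moving every logical file to its own TEST path.
    -- Paths are placeholders; 'PROD_log' is an assumed logical log name.
    RESTORE DATABASE TEST
    FROM DISK = 'E:\Backups\FULL.BAK'
    WITH REPLACE, RECOVERY,
         MOVE 'PROD'     TO 'E:\SQLData\TEST.mdf',
         MOVE 'PROD_log' TO 'E:\SQLData\TEST_log.ldf'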
We have a situation where modifications to the stored procs and views keep happening in the development database environment while there is a live production database also up and running.
Periodically we need to ensure that development and production are in sync, where all changes made to dev are transferred to prod. However, the entire prod database obviously cannot be overwritten, as it contains production data. Therefore what we need to do is transfer all SPs and views from development to production. (We do not prefer incremental transfers, as there are many SPs and views and we can never really be sure that all changes have been transferred.)
The obvious way is to script out all SPs and views in dev and run the scripts in production. But what we encountered was that Enterprise Manager does not script out the objects in hierarchical sequence - i.e. parent objects first and then the objects that refer to the parent objects. This leads to errors when running the scripts, as the objects required for the creation of an SP (say, another SP run from within that SP) may not have been generated yet, because the "contained" SP is lower in the alphabetical order in which SQL Server scripted out the objects.
Finally we had to look at each and every error and manually set the process right.
Moreover, in the way we did things the "sysdepends" table got all screwed up, and therefore we can no longer depend on Enterprise Manager to show dependencies within the database.
Is there any elegant way of going about this? We also tried the Transfer Manager in DTS, but it gave the same error. Moreover, can the sysdepends table be rebuilt from scratch?
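For the views at least, the dependency information can be repopulated after the fact by refreshing each view's metadata. A minimal sketch, assuming the SQL Server 2000 era implied above and that none of the views are themselves broken; stored procedure dependencies are only recorded reliably when the procedures are re-created in dependency order:

    -- Minimal sketch: re-register dependency information for every user view
    -- by calling sp_refreshview on each one.
    DECLARE @view sysname
    DECLARE view_cursor CURSOR FOR
        SELECT name FROM sysobjects WHERE xtype = 'V' ORDER BY name

    OPEN view_cursor
    FETCH NEXT FROM view_cursor INTO @view
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sp_refreshview @view
        FETCH NEXT FROM view_cursor INTO @view
    END
    CLOSE view_cursor
    DEALLOCATE view_cursor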
I have a question about how to efficiently develop on a development version of a database and get those changes over to the production database without losing live data, etc.
Here is how we are trying it:
1. Export the production database to a temp database (exact copy)
2. Delete the production database
3. Export the development database (definition only, no data) to a new database with the production database's name
4. Export the production (temp) database (data only) over to the new production shell
We are having truncate errors due to FK constraints.
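For the FK errors, one common workaround is to disable the constraint checks while the data is copied back in (the alternative is to drop the FKs and re-create them afterwards). A minimal sketch with hypothetical table and constraint names; note that disabling a foreign key does not make TRUNCATE legal, since the mere existence of the FK blocks it, so the tables are cleared with DELETE instead:

    -- Minimal sketch for one hypothetical parent/child pair.
    ALTER TABLE dbo.OrderDetail NOCHECK CONSTRAINT FK_OrderDetail_Order

    DELETE FROM dbo.OrderDetail
    DELETE FROM dbo.[Order]

    -- ... copy the data from the temp copy into dbo.[Order] and dbo.OrderDetail here ...

    -- Re-enable WITH CHECK so the copied data is validated against the constraint.
    ALTER TABLE dbo.OrderDetail WITH CHECK CHECK CONSTRAINT FK_OrderDetail_Order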
I have two servers: one production and one development. There is a third party query that runs on both servers. Yes, the query is poorly tuned - I cannot change it. In production, it runs in 54 minutes. In development, I have tried to let it run for days and it never completes.
Here's what I have tried so far:
Compared the settings of production vs. development. The settings are very similar - the development box is larger, 4 times more memory.
Max degree of parallelism is the same on both boxes.
No compression on both boxes.
The production server is fairly busy, the development server is empty - this is the only process running on it.
Plenty of free disk space.
Updated all statistics on all databases touched by the query on dev.
Indexing is the same on both boxes.
The development box is running MSSQL2012.
The production box is running MSSQL2008R2.
What I've noticed:
The query consumes a massive amount of CPU time.
HUGE number of reads (16 million reads for 10 writes according to sp_WhoIsActive)
Largest wait types are CXPACKET, SOS_SCHEDULER_YIELD and TRACEWRITE respectively:
wait_type            sum_wait_time_ms   pct_wait_time   sum_waiting_tasks   avg_wait_time_ms
CXPACKET             41427655           80.7            1763074             23.5
SOS_SCHEDULER_YIELD  2414694            4.7             53272553            0.0
TRACEWRITE           1985634            3.9             1483                1338.9
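For anyone wanting to reproduce the comparison on both boxes, numbers in that shape can be pulled straight from the wait-stats DMV. A minimal sketch (the percentage and average are computed the way the column names above suggest; the output covers everything since the last restart or stats clear, not just this query):

    -- Minimal sketch: aggregate wait statistics in roughly the shape quoted above.
    SELECT
        wait_type,
        wait_time_ms                                           AS sum_wait_time_ms,
        CAST(100.0 * wait_time_ms
             / SUM(wait_time_ms) OVER () AS decimal(5, 1))     AS pct_wait_time,
        waiting_tasks_count                                    AS sum_waiting_tasks,
        CAST(wait_time_ms * 1.0
             / NULLIF(waiting_tasks_count, 0) AS decimal(18, 1)) AS avg_wait_time_ms
    FROM sys.dm_os_wait_stats
    ORDER BY wait_time_ms DESC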
I really need your help!! How do you manage a development server and a production server?
My current situation: our programmers develop new stored procedures on the development server. When all testing is completed, we then deploy them to the production server.
Challenges: Programmer (Eric) modified a stored procedure (SP_1) on the development server. He did not deploy it to the production server because SP_1 is not completed yet.
Programmer (John) got a request and needed to modify SP_1 on the development server.
So now there is a problem with SP_1. If John deploys SP_1 to the production server, it will overwrite the current version and cause errors.
Can anyone solve this problem? Would it be better if we had another testing server?
Hi all, I've been assigned the task of refreshing data from the production env to the development env. What I got is a backup file of a DB in the prod env; I now need to bring that into the development env. I can restore it to the dev env no problem, but the warning I got is that the table owners in the prod and dev env need to be different, that is, the owner is A in the prod env and the owner is B in the dev env. So I need to:
1) restore the DB in the dev env
2) change table owner from A to B
3) change related triggers
4) change related views
Can anyone suggest an approach that is most efficient? Thanks a lot.
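For steps 2-4, the ownership change itself can be scripted; a minimal sketch is below, assuming SQL Server 2000-style object owners rather than 2005 schemas. It only changes who owns the objects; any trigger, view or procedure whose code refers to objects as A.objectname still needs that code edited (or re-scripted) to say B.objectname.

    -- Minimal sketch: change the owner of every object currently owned by A to B.
    -- Object names with unusual characters may need bracket-quoting.
    DECLARE @obj sysname
    DECLARE obj_cursor CURSOR FOR
        SELECT 'A.' + o.name
        FROM sysobjects AS o
        JOIN sysusers AS u ON u.uid = o.uid
        WHERE u.name = 'A'
          AND o.xtype IN ('U', 'P', 'V', 'FN', 'IF', 'TF')

    OPEN obj_cursor
    FETCH NEXT FROM obj_cursor INTO @obj
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sp_changeobjectowner @objname = @obj, @newowner = 'B'
        FETCH NEXT FROM obj_cursor INTO @obj
    END
    CLOSE obj_cursor
    DEALLOCATE obj_cursor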
Hello All, I have been searching for a published document on best practices concerning access levels based on roles. Should developers have more than (if at all) select-level access to production data? If I understand (from multiple postings), it is best to have:
1. Development (developers have extensive access levels)
2. Test (developers have restricted access levels)
3. Production (developers have none or select-level access)
Our environment and budget only allow for items 1 and 3. If anybody could point me to a document from a 'reputable' source, I would greatly appreciate it. TIA, Bill
We have a system here where we develop SSIS packages on a development server. I am trying to figure out the cleanest way to promote these changes to a production server where stored procedures/tables that are used in the package are not deployed yet.
When I switch the connection in the package to the production server, there are a lot of objects in the package that are "invalidated" because they are trying to verify the existence of tables/columns. One example is outputting the results of a query to a text file. The text file destination gets a red X on it because it can't grab the columns from the source query (because that stored procedure doesn't exist yet).
Is there a best practice or something on how to deploy packages to a production system? I have tried turning off "ValidateExternalMetaData" with no success.
1.) Can an aspnetdb.mdf database be configured and set up on one server and then be moved to a production server, or is there something machine-specific that keeps this from being possible?
2.) Is putting this file in the app_data folder something that is used only for SQL 2005 or SQL Express? I had to set up a connection string in my SQL 2000 installation to get the connection to work.
Thanks for any input! Colelaus
This is driving me nuts: On my development machine the code runs fine but generates an error on the production server. Both are running SQL Server 2000 and ASP.NET 1.1
The datatype of the field in question is datetime. The webform has a calendar for a user to select and automatically insert the date into the textbox. The update command in the webform is:
This works without a hitch on my development system, but on the production server it generates the following error: Cast from string "19-12-1997" to type 'Date' is not valid.
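That particular message looks like it comes from the .NET side rather than SQL Server: "19-12-1997" parses under a dd-MM-yyyy culture but not under en-US, so the string-to-Date conversion fails when the production server's regional settings differ from the development machine's. The usual fix is to parse the textbox value with an explicit format (or pass it as a typed parameter) before it reaches the UPDATE; and if a date does have to travel as text inside the SQL, the 'yyyymmdd' form is read the same way regardless of language or DATEFORMAT settings. A minimal sketch with hypothetical table and column names:

    -- Minimal sketch with hypothetical names: the ISO-style literal 'yyyymmdd'
    -- is interpreted identically under any server language or DATEFORMAT setting.
    UPDATE dbo.Orders
    SET OrderDate = '19971219'        -- 19 December 1997, culture-independent
    WHERE OrderID = 42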
Hi all, I have an ASP.NET 1.1 application running on the intranet which uses SQL Server 2000. The application is in production, and every time I want to make some changes, I make the changes on my development machine and then copy the application DLL to the server. The problem is that I'm using stored procedures for all my Select, Insert and Delete statements. These stored procedures are live on the server, so I can't make the modifications locally, test them, and then copy them to the server.
How can I make modifications without affecting the production server and the users??? Thanks.
Should one have a separate environment for the production and test systems? How do you do it on the same server? Install two instances? How do you separate test DBs from the production DBs? Please advise... Thank you
I have some simple code that uses the file upload control to read an Excel sheet and upload the data into a SQL 2005 DB. I'm using Visual Web Developer and SQL's Express edition to test it. It works fine when I test. However, when I push it up to the production server and try it from any other PC, it does not. The page loads fine, but when it starts to upload it errors out. Any reason why? I've never seen this happen. Here's the code. Thanks in advance.
Protected Sub BtnUpload2_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles BtnUpload2.Click
    UploadTextDocument()
End Sub
Private Sub UploadTextDocument()
    Dim location As String = FileUpload1.PostedFile.FileName.ToString
    ' Connection String to Excel Workbook
    Try
        Dim excelConnectionString As String = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=Excel 8.0", location)
        ' Create Connection to Excel Workbook
        Using connection As Data.OleDb.OleDbConnection = New Data.OleDb.OleDbConnection(excelConnectionString)
            Dim command As Data.OleDb.OleDbCommand = New Data.OleDb.OleDbCommand("Select BuilderID,SeriesID,OptionLevel,CommunityID,PhaseID,PlanID,ElevationID,OptionID,CurrentSalesPrice,LocalComments,Active,DateAdded,DateAvailable,DateInactive,SalesPriceEffective,SalesPriceExpires,PreviousSalesPrice,CutOffNotBefore,CutOffNotAfter FROM [Data$]", connection)
            connection.Open()
            ' Create DbDataReader to Data Worksheet
            Using dr As Data.Common.DbDataReader = command.ExecuteReader()
                ' SQL Server Connection String
                Dim connectionString As String = ConfigurationManager.ConnectionStrings("HbAdminMaintenance").ConnectionString
                ' Bulk Copy to SQL Server
                Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(connectionString)
                    bulkCopy.DestinationTableName = "ExcelData"
                    bulkCopy.WriteToServer(dr)
                End Using
            End Using
            connection.Close()
        End Using
        LBError.Text = "The spreadsheet was successfully uploaded."
    Catch
        LBError.Text = "There was an error. Check the spreadsheet for correct format."
    End Try
End Sub
I have just finished upsizing an Access database to SQL Server 2000. Now SQL Server needs to be run on a test basis to determine whether I need to make more changes to the front end (Access). The problem I am facing is how to keep the two databases in sync while I am testing. Any suggestions?
Also, any suggestions or comments on how to run a test setup like this (in parallel) are welcome, since this is my first time attempting a project like this.
There is a production database which has ever-increasing data. For testing purposes, though, I would like to build a test database with exactly the same schema but only a subset of the data copied from the production database. I'll specify the criteria (something like a WHERE clause in a SELECT query) for copying the data from the production database.
Is there a tool that anyone has come across to do this job?
I am debugging one of our programs and ran the fix in Test. I would like to compare table 1 between Production and Test. I want the query to output column 1 wherever the Production output <> the Test output. What is the best way to achieve this? jeff
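Assuming both databases sit on the same instance (a linked server works the same way) and using hypothetical names - Table1 with key column ID and compared column Col1 - a full outer join shows every row where the two copies disagree:

    -- Minimal sketch with hypothetical names: list the key and both versions of
    -- column 1 wherever Production and Test differ, including rows present in
    -- only one of the two databases.
    SELECT
        COALESCE(p.ID, t.ID) AS ID,
        p.Col1 AS ProductionCol1,
        t.Col1 AS TestCol1
    FROM Production.dbo.Table1 AS p
    FULL OUTER JOIN Test.dbo.Table1 AS t
        ON t.ID = p.ID
    WHERE p.ID IS NULL
       OR t.ID IS NULL
       OR p.Col1 <> t.Col1
       OR (p.Col1 IS NULL AND t.Col1 IS NOT NULL)
       OR (p.Col1 IS NOT NULL AND t.Col1 IS NULL)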
We will be implementing our first SQL cluster in December. Our current plan calls for a shared development/test database server with one physical server, but two SQL Server instances. Our production environment will be a SQL cluster. Is it necessary to create a clustered test environment for testing patches, hot-fixes, etc...?
How do I change application code to easily switch between the application working against a test database versus working with a production database?
My thought is to change the connection string to work with a test DB, and when ready to Publish, change the connection string back to the production DB. After Publish is successful, change the connection string back to the test DB.
At first glance, it appears this will work. Will it? And whether it will or won't, is there a better way to handle this?
Hello, I have a production database that I need to refresh to our test environment daily. The database size is 700 MB. I do not need to transfer the stored procedures, triggers, users and logins. Would a DTS package that runs every night be the best and easiest solution to implement, or should I look into log shipping or snapshot replication?
Setting up transactional replication in a test environment. I am willing to bet that most of you take a production backup (if so, how, and using what?), restore the database to your test environment, then run a snapshot to your subscriber and away you go.
But perhaps you take a backup of your publisher and subscriber; if so, how do you know there are no inconsistencies because there were transactions sitting on the distributor?
What do you do if you have additional indexes on the subscriber for reporting, that are not on the publisher?
Here at work we are having issues getting consistent databases set up with transactional replication - missing rows, duplicate keys at the subscriber, etc. How do we avoid these issues?
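One thing that helps with the "does the subscriber really match the publisher" question is replication's built-in validation. A minimal sketch, assuming a hypothetical publication name 'MyPublication'; it is run at the publisher in the published database, and the results show up in the Distribution Agent history / Replication Monitor rather than as a result set:

    -- Minimal sketch: request validation of the hypothetical publication 'MyPublication'.
    -- @rowcount_only = 2 asks for row counts plus binary checksums.
    EXEC sp_publication_validation
         @publication   = N'MyPublication',
         @rowcount_only = 2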