SQL 2012 :: Approaches For Creating A Test Environment
Dec 11, 2014
I am trying to put together options for creating a test environment for our Dynamics NAV system. The environment will mainly be used to test new releases and changes ahead of applying them to production. The two options I am considering are:
1. Create a second test instance on our production SQL Server to host a test database.
2. Purchase a set of SQL Server Developer licences and have a totally separate server for our test environment.
My preference would be option 2, but I need to build a convincing case that this is the best way forward. I wondered if I could tap into the thoughts of the SQL Central community and see how others approach this.
Setting up transaction replication in a test environment: I am willing to bet that most of you take a production backup (if so, how, and using what?), restore the database to your test environment, then run a snapshot to your subscriber and away you go.
But perhaps you take a backup of your publisher and subscriber; if so, how do you know there are no inconsistencies because there were transactions sitting on the distributor?
What do you do if you have additional indexes on the subscriber for reporting, that are not on the publisher?
Here at work we are having issues getting consistent databases set up with transactional replication: missing rows, duplicate keys at the subscriber, etc. How can we avoid these issues?
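One way to check publisher/subscriber consistency after initializing a subscriber is publication validation; a minimal sketch, assuming a hypothetical publication named 'MyPub', run at the publisher in the published database:

-- Posts a validation request for every article in the publication;
-- results show up in the distribution agent history.
EXEC sp_publication_validation
    @publication = N'MyPub',
    @rowcount_only = 2;  -- 2 = rowcount plus binary checksum; 1 = rowcount only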
I am attempting to create a Test db from a full backup of the production db. With 2012, I cannot do it the way I had done it in previous versions (and now I understand why, because of logical names).
The Test db runs in the same instance as Prod db.
I attempted to run this but came up with errors. This is what I executed:
RESTORE DATABASE TEST
FROM DISK = 'E:<path>FULL.BAK'
WITH REPLACE, RECOVERY,
    MOVE 'PROD' TO 'E:<path>TEST.MDF';
The errors all say the restore cannot execute because PROD is in use.
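A likely cause is that the MOVE clause only relocates the data file, so the log file still tries to restore onto the production log path, which is in use. A minimal sketch of a fix, where the logical log file name 'PROD_log' is an assumption to be confirmed with RESTORE FILELISTONLY:

-- First, list the logical file names inside the backup.
RESTORE FILELISTONLY FROM DISK = 'E:<path>FULL.BAK';

-- Then move every file, data and log.
RESTORE DATABASE TEST
FROM DISK = 'E:<path>FULL.BAK'
WITH REPLACE, RECOVERY,
    MOVE 'PROD'     TO 'E:<path>TEST.MDF',
    MOVE 'PROD_log' TO 'E:<path>TEST_log.LDF';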
Should one have a separate environment for production and test systems? How do you do it on the same server? Install two instances? How do you separate test DBs from the production DBs? Please advise... Thank you
I must be missing something, and it's starting to frustrate me. Bear with me here. I created a site for a ...client I guess you'd call it, and made this really slick newsletter generator thing. The people from the web enter in their info, and if they want, they sign up for a newsletter -- all tied into a db, 1 table, 4 stored procedures, REALLY simple stuff. They insisted I use a certain webhost which, on paper, looks like it will fit the bill. I'm starting to question that. On top of the newsletter thing, I created an aspnetdb for the "administration" side of it for her to log into and send out the newsletter, so in total there are 2 dbs in the app_data folder. Locally it works GREAT, and on my test box (IIS 6) that is running 2k5 Express. The webhost runs SQL 2000 in (what I consider) a bastardized way. Can't use Management Studio, can't use anything except a really weak web-based interface, which adds to my frustration. Anyway, my questions: 1) is there a way to make the MDF files work with SQL 2000 without having to redo the whole thing, and 2) if I have to redo it, does anyone have an example connection string that might help out?
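For what it's worth, SQL Server database files are not backward compatible, so an MDF created by 2005 Express cannot be attached to SQL 2000; the schema and procedures would need to be recreated on the host. A connection string for a hosted SQL 2000 database typically looks something like this, where every value is a placeholder to be replaced with what the webhost provides:

Server=sql.webhost.example.com;Database=NewsletterDb;User ID=myUser;Password=myPassword;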
I want to build a SQL testing environment in an active/active setup. Any recommendations on what I could use if I want to set up the most bare-bones system. I want to do it as cheaply as possible.
I am debating whether to go to all the trouble of setting up on-demand Profiler traces on some test servers for the developers here. Really just tracing RPC:Completed and SQL:BatchCompleted, so the developers can at least try to catch a performance problem before going to production. The question I have, though, is just how useful this sort of information is to mid- to low-level (in terms of experience) developers. One of the bigger concerns is over Java applications, which like to hide their queries behind a lot of "sp_cursorfetch" calls.
My question to the forum is if you are a developer, have you ever dreamed of having this sort of information available? How useful is it?
I am going to try to post a poll along with this, but I am not sure it will work...
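For reference, the same two events can be captured without the Profiler GUI via a server-side trace; a minimal sketch, assuming a hypothetical target path (C:\Traces) that the SQL Server service account can write to:

DECLARE @TraceID int, @maxfilesize bigint, @on bit;
SET @maxfilesize = 100;  -- MB per trace file
SET @on = 1;

-- Create the trace (file path is an assumption).
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\dev_trace', @maxfilesize;

-- Event 10 = RPC:Completed, event 12 = SQL:BatchCompleted.
-- Columns: 1 = TextData, 12 = SPID, 13 = Duration, 14 = StartTime.
EXEC sp_trace_setevent @TraceID, 10, 1, @on;
EXEC sp_trace_setevent @TraceID, 10, 12, @on;
EXEC sp_trace_setevent @TraceID, 10, 13, @on;
EXEC sp_trace_setevent @TraceID, 10, 14, @on;
EXEC sp_trace_setevent @TraceID, 12, 1, @on;
EXEC sp_trace_setevent @TraceID, 12, 12, @on;
EXEC sp_trace_setevent @TraceID, 12, 13, @on;
EXEC sp_trace_setevent @TraceID, 12, 14, @on;

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete).
EXEC sp_trace_setstatus @TraceID, 1;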
We have a production database that was generated by a vendor. The vendor wants us to test a new version of their software. This testing process will take several months. The users want the testing to be as real time as possible. I have developed a series of scripts that will back up our databases and ship them over to our test environment on a nightly basis. We also of course have nightly backups. As a general rule, we do full backups once a week and differentials on a nightly basis.
We are a phone company that has transactions being applied to the database 7 X 24.
My question is this: Is there a way (an option or something) that when my backup of the production database destined for the test environment runs, I can tell it not to set the flags that indicate a backup has been done? What I want to avoid is the differential backup process being 'confused' about which backup it is doing a differential for.
I would appreciate any help or insight you can give me.
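Assuming SQL Server 2005 or later, a copy-only backup does exactly this: it does not reset the differential base, so the weekly full / nightly differential chain is unaffected. A minimal sketch, where the database name and path are placeholders:

-- Copy-only full backup for the test refresh; does not disturb
-- the differential base used by the regular backup schedule.
BACKUP DATABASE ProdDb
TO DISK = N'E:\Backups\ProdDb_ForTest.bak'
WITH COPY_ONLY, INIT;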
So, we are about 3 weeks away from going into production, and somehow we failed to give much thought to deploying our RS project into production.
We have over 110 report models that need to be deployed into production, and until now, we just deploy into our dev and test environments using Visual Studio. But, in our production environment, our deployers will not have Visual Studio.
Is there any simple backup/restore method that can be used to move our test environment into production? Please don't suggest copying each file one at a time /sigh.
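If the test and production Report Servers are the same version, one hedged possibility is moving the ReportServer catalog database wholesale instead of redeploying 110 models one at a time; a sketch, assuming the default catalog database name and placeholder paths, and noting that the symmetric encryption key must also be backed up and restored (rskeymgmt.exe) for the moved catalog to be usable:

-- On the test Report Server: back up the catalog database.
BACKUP DATABASE ReportServer
TO DISK = N'E:\Backups\ReportServer.bak' WITH INIT;

-- Copy the file to production, then restore over the production catalog.
RESTORE DATABASE ReportServer
FROM DISK = N'E:\Backups\ReportServer.bak' WITH REPLACE;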
We will be implementing our first SQL cluster in December. Our current plan calls for a shared development/test database server with one physical server, but two SQL Server instances. Our production environment will be a SQL cluster. Is it necessary to create a clustered test environment for testing patches, hot-fixes, etc...?
I need to restore a test DB from a production backup, but once it is restored I need all the permissions of the SQL logins and Windows AD accounts in the test DB to remain intact, as they were before.
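Restoring over the test database keeps the database users and their permissions, but it can orphan those users from the server-level logins. A minimal re-mapping sketch, where the user/login names are placeholders:

-- List database users no longer mapped to a server login.
EXEC sp_change_users_login 'Report';

-- Re-map an orphaned database user to its server login.
ALTER USER [AppUser] WITH LOGIN = [AppUser];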
I need to create a test database on the live server for performance testing. From time to time, as structural modifications are made to the live database, I'd like to be able to delete the test database and replace it with a new copy of the live database. Is there a simple way to do this with a script or other? Thanks!
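A minimal refresh sketch, assuming hypothetical names LiveDb/TestDb, placeholder logical names and paths, and a recent full backup of the live database:

-- Drop any existing test copy, kicking out open connections first.
IF DB_ID('TestDb') IS NOT NULL
BEGIN
    ALTER DATABASE TestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE TestDb;
END

-- Recreate it from the latest live backup, moving files to test paths.
RESTORE DATABASE TestDb
FROM DISK = N'E:\Backups\LiveDb.bak'
WITH MOVE 'LiveDb'     TO N'E:\Data\TestDb.mdf',
     MOVE 'LiveDb_log' TO N'E:\Data\TestDb_log.ldf',
     RECOVERY;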
I was wondering if anyone has a neat (preferably automated) method of creating small testing databases from large production instances. My requirement would be to copy the schema and a subset of configuration data from a production database into a test database. The subset of data would be a full copy of a subset of tables, rather than a subset of data within one or more tables. There is a mixture of SQL 2000 and SQL 2005 servers involved in this requirement. I'm familiar with the scripting mechanisms of Enterprise Manager and Management Studio and DTS packages, sufficient to perform a process like this manually, but I want to productionise and schedule this process to be performed automatically. I'm sure this must be a commonly performed task, so I'm interested to know if anyone has a "best practice" for this requirement.
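The schema side can be scripted separately; for the data side, one minimal, Agent-schedulable sketch is to fully refresh each chosen configuration table over a linked server. Here PRODSRV, ProdDb, and the table/column names are all placeholders, and the sketch assumes no foreign keys block the TRUNCATE:

-- Refresh a full copy of one configuration table from production.
-- Repeat (or drive from a control table) for each chosen table.
TRUNCATE TABLE dbo.ConfigTable;

INSERT INTO dbo.ConfigTable (ConfigKey, ConfigValue)
SELECT ConfigKey, ConfigValue
FROM PRODSRV.ProdDb.dbo.ConfigTable;  -- PRODSRV is a hypothetical linked server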
I need to import a CSV file with a few million records and 50 fields into a table. Only 1 column in the file needs to be transformed, and a second column needs to be checked for data validity (e.g., I don't want to let someone pass in 'CA' for an integer field). Two approaches come to mind:
1. Use SSIS to read the file directly into the table, then apply T-SQL to do a mass update to the single field that needs to be transformed (with this approach it is not clear how to check the data validity in each row via T-SQL, though; see the sketch after this list).
2. Use SSIS to import the file, 1 line at a time, transforming the data and checking its validity as it goes. I suspect this approach will be much slower than approach 1, but I haven't tried it yet.
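For the row-level validity check in approach 1, one minimal T-SQL sketch is to land the file in a varchar staging table first and flag rows that don't convert. Table and column names are placeholders, TRY_CONVERT assumes SQL Server 2012 (use ISNUMERIC-style checks on older versions), and UPPER() stands in for the real transform:

-- Rows whose supposed integer column will not convert cleanly.
SELECT *
FROM dbo.StagingImport
WHERE TRY_CONVERT(int, IntColumnRaw) IS NULL
  AND IntColumnRaw IS NOT NULL;

-- Load only the valid rows into the real table, transforming as we go.
INSERT INTO dbo.TargetTable (IntColumn, TransformedColumn)
SELECT CONVERT(int, IntColumnRaw),
       UPPER(TransformColumnRaw)  -- stand-in for the real transform
FROM dbo.StagingImport
WHERE TRY_CONVERT(int, IntColumnRaw) IS NOT NULL;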
I am retrieving data from an AS400. I used the import data utility to pull the data in initially and to create the table for the data in SQL server. There are a couple of text fields that represent dates in the data I imported.
The Import data utility imported these fields as character data. That was fine by me.
Now, when I pull new data from the same AS400 data source I have to use the DataReader source (due to the OLEDB components not being able to work with an SQL command; i.e., it works okay when pulling the entire table but does not work at all with even the simplest of SQL statements). The DataReader source insists on interpreting the text fields containing date data from the AS400 as DB_DATETIMESTAMP data type. The net effect of this is that the data it pulls in, in the DB_DATETIMESTAMP format, is too large to fit into the 10-character varchar field that was created when the entire table was imported.
I know I can get around this in a couple of ways. But my question is this: why will the DataReader source not let me specify how I want the incoming data to be interpreted? There seems to be no way to change the way the external column is interpreted. By this I mean that I can change the external column's representation on the DataReader source to indicate that I want it treated as a 10-character text field, but I cannot change the output column representation. When I try to do so I get an error message telling me the data type of the output columns on the DataReader source component cannot be changed. It seems a bit odd to me that character data coming in cannot be treated as character data in output.
It looks like I will need to use a script component to force the data conversion that I want and it just doesn't make a lot of sense to me as to why I can't do what I need to do in a conversion component.
I would feel better if I could understand what is going on behind the scenes that makes this state of affairs desirable.
I've read lots about why you shouldn't normally shrink databases in posts such as this:
[URL]
But we have a situation where we are required to copy the live db to various non production environments for testing. Part of this process involves truncating a number of tables with masses of blob data. So we're freeing up quite a lot of space. The question is how to reclaim this? The database is peculiar in that it's got no clustered indexes so I can't rebuild indexes on another filegroup with drop to move and rebuild.
I've tried dbcc shrinkfile specifying a size. I've tried to shrink the file in increments. The problem is I'm just not getting much space released. I get maybe 2-3%. I suspect this is because we're dealing with heaps with some tables that have sparsely populated blob / image data.
Is there an alternative to shrinking? Should I recreate all the db objects in a new database? It doesn't matter if the process takes a while or if it has to be done manually.
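Since these are heaps, one alternative worth trying, assuming SQL Server 2008 or later, is to rebuild each affected heap after the truncation so its pages are compacted and the freed blob space actually becomes reclaimable; then shrink to a target size rather than in increments. Table, file, and size values below are placeholders:

-- Rebuild a heap in place; on 2008+ this rewrites the pages and
-- releases the space left behind by the truncated blob data.
ALTER TABLE dbo.BigBlobTable REBUILD;

-- Then shrink to a target size (in MB), leaving some headroom.
DBCC SHRINKFILE (N'MyDataFile', 10240);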
We are working in a merge replication environment where we have SQL Server 2005, 11 publications, and 2 subscribers. We used to get a lot of incidents from the application owner for blocking; recently we faced a situation where the lead blocker was in a sleeping state and the session was used by the merge agent. We checked the query that the session was running; it was sys.sp_MSenumgenerations90;1.
We have 4 servers which have SQL Server 2012, and AlwaysOn has been enabled on all 4 servers:
Server1,Server2,Server3,Server4
Server1 is the primary node and the rest are secondaries. There is a synchronous relationship between Server1 and Server2, and there are asynchronous relationships between Server1 and Server3 & Server4.
Is it possible to setup log shipping from Server2 & Server3(secondaries) to two new servers?
After breaking the mirroring database to allow testing on the DR site, is the only way to re-establish the mirroring to copy the backups from the actual production site? Can something be done on the DR site to put the mirroring back after the test?
I do have the following Extended Event session on my Dev box:
CREATE EVENT SESSION [sp_showplan] ON SERVER
ADD EVENT sqlserver.query_post_execution_showplan (
    SET collect_database_name = (1)
    ACTION (sqlserver.plan_handle)
    WHERE ([package0].[equal_uint64]([object_type], (8272))
       AND [sqlserver].[equal_i_sql_unicode_string]([object_name], N'MyStoreProcedure')))
ADD TARGET package0.event_file (
    SET filename = N'E:\DBA_Audit\SP_Exec.xel',
        metadatafile = N'E:\DBA_Audit\SP_Exec.xem')
WITH (MAX_MEMORY = 4096 KB,
      EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
      MAX_DISPATCH_LATENCY = 30 SECONDS,
      MAX_EVENT_SIZE = 0 KB,
      MEMORY_PARTITION_MODE = NONE,
      TRACK_CAUSALITY = OFF,
      STARTUP_STATE = OFF);
GO
The idea is being able to capture the execution plan when the program invokes the stored procedure, regardless of the database.
This works on my Dev box. When I manually trigger "MyStoreProcedure" from database A , the event is saved. The same thing happens when I do that from database B. Ok ... so far, so good.
So I went to the live production environment and set up my Extended Events session. But it's saving nothing. I was able to check that the stored procedure was executed on several databases, but my Extended Events session never grabbed the plan.
What could be the reason for this? Memory starvation maybe? Is there something I am doing wrong?
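One quick sanity check before suspecting memory pressure: is the session actually running on the production instance? A minimal sketch:

-- Lists only sessions that are currently started; if sp_showplan
-- is missing here, it was created but never started.
SELECT name FROM sys.dm_xe_sessions;

-- Start it if needed (note STARTUP_STATE = OFF in the definition
-- means it will not come back by itself after a restart).
ALTER EVENT SESSION [sp_showplan] ON SERVER STATE = START;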
Which tables have DEFAULT constraints? For those tables the user can't generate the script, but he/she can generate scripts for the remaining tables. If I want to DENY the user so that he/she cannot generate the script for the ones that do not have the DF constraints, what should I do?
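To identify which tables carry DEFAULT constraints in the first place, a minimal sketch:

-- Distinct list of tables that have at least one DEFAULT constraint.
SELECT DISTINCT SCHEMA_NAME(t.schema_id) AS schema_name,
       t.name AS table_name
FROM sys.default_constraints AS dc
JOIN sys.tables AS t
  ON dc.parent_object_id = t.object_id
ORDER BY schema_name, table_name;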
My goal is to be able to include test information in the change request that will make the operations manager confident that the change can proceed. Right now all I can think of is to add the index in dev and staging and compare some before and after stats. But I am concerned that an index change might also affect other operations besides SELECTs - such as updates, transactions, or scheduled jobs.
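To quantify exactly that write-side cost, one hedged sketch is to compare seeks/scans against updates per index after a representative workload has run in staging; a candidate index showing high user_updates and few seeks/scans is taxing UPDATEs and jobs without helping SELECTs:

-- Reads vs. writes per index since the last service restart.
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name AS index_name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.database_id = DB_ID()
ORDER BY s.user_updates DESC;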
I have deployed a project with multiple packages to SSIS 2012 db. I am able to configure the project parameters fine. But, I am not able to replace the package variable values with the 'Environment' variables.
Any fix for the seemingly random sort order of the variables in the dropdown list when configuring parameters and connection managers in the SSISDB catalog?
I imported all of our connection strings into an environment (about 200 of them). They were inserted in alpha order and the ID values within the internal.environment_variables table shows them in order as well, by ID and by name. When I run profiler and capture the command that retrieves them and run it in ssms they are in order but in the dropdown they seem random.
There are no values within any of the tables that accounts for the order they are in.
If a package has 5 connections you need to go through the unsorted list 5 times to find them.
Sometimes you get lucky and they are in the first 20 or so.
I know I can write a script, just wondering if there is a fix for the sorting.
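Short of a product fix, one workaround sketch is to pull the list, sorted, straight from the standard SSISDB catalog views and keep it alongside the dropdown:

-- Environment variables listed alphabetically, per environment.
SELECT e.name AS environment_name,
       v.name AS variable_name,
       v.type, v.value
FROM SSISDB.catalog.environment_variables AS v
JOIN SSISDB.catalog.environments AS e
  ON e.environment_id = v.environment_id
ORDER BY e.name, v.name;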
I am creating my first SSIS package. Simple: I read a file, do a computation, and then write the result to an Excel file.
I am getting the message: "Test connection failed because of an error in initializing provider." The database is NOT remote. My computer is slow, though; about 10 or 15 seconds after clicking the button to test my connection, I get the message and much more. The message asks me if the instance name is correct. I don't know what this would be. This message is on the first task, reading the file (finding the file).
The last sentence of the long message is: "Named pipes provider: Could not open a connection to SQL Server (2)."
My question is: Why am I getting this message and what do I need to do to resolve this problem?
I have SSIS 2012 Enterprise, using catalog deployment and have more that 50 environment variables for connection to databases across my enterprise.
The problem is that when I go to configure the packages after deployment and pick the proper environment variables, they are not sorted, so I have to browse all the entries in order to find the proper entry in the environment variables.
I have finished a change request from our client. I need to update the client's database with the one in development. Here are the changes I made to the database:
1. Added/changed some tables
2. Added/changed some stored procedures
3. Added data to some dictionary tables
The data in the client's current database MUST be kept. So how can I merge the changed information into the client's database?
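The table and procedure changes can be applied with scripted ALTERs; for the dictionary-table data specifically, assuming SQL Server 2008 or later, a hedged MERGE sketch that upserts the new rows without touching the client's other data (table and column names are placeholders, and the dev data is assumed staged in a _New copy):

-- Upsert dictionary rows from the development copy into the client DB.
MERGE INTO dbo.DictionaryTable AS target
USING dbo.DictionaryTable_New AS source  -- staged copy of the dev data
   ON target.DictKey = source.DictKey
WHEN MATCHED THEN
    UPDATE SET target.DictValue = source.DictValue
WHEN NOT MATCHED BY TARGET THEN
    INSERT (DictKey, DictValue)
    VALUES (source.DictKey, source.DictValue);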
I am trying to send out notifications when jobs complete (fail or succeed). I have database mail working fine on my DEV server, but I am having issues with it on my PROD server. I am currently having people look into if McAfee may be blocking it.
I am able to send out a test email from SSMS>Management>Database Mail, but when I set a Notification for a job, the job will complete and in the history, it will say "NOTE: Failed to notify 'User' via email."
I have created an Operator and set up Profiles and Accounts, just as I did on my DEV server.
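Two things worth checking, sketched below: whether Database Mail itself logged an error for those sends, and (a common cause of exactly this "failed to notify" symptom) whether SQL Server Agent has a mail profile enabled under its Alert System properties, which requires an Agent restart after being changed:

-- Recent Database Mail errors and send attempts.
SELECT * FROM msdb.dbo.sysmail_event_log ORDER BY log_date DESC;
SELECT * FROM msdb.dbo.sysmail_allitems ORDER BY send_request_date DESC;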