Way To Efficiently Make Test Data?

Dec 26, 2006

We have a website that accesses our SQL databases. In the past, we relied on our internal employees to improve our SQL databases, but now we want to outsource that work.

There is a lot of information we would like to keep private from the outsourcing company.


Is there a way to efficiently make test data throughout our database without changing our original database?
Is there a way to easily apply the new changes back to our original database afterwards?

I found this product through Google: EMS Data Generator for SQL Server
http://www.sqlmanager.net/en/products/mssql/datagenerator
Would this program help us make test data?
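For what it's worth, a dedicated tool isn't strictly required; below is a rough T-SQL sketch of a home-grown approach that fills a table with fabricated rows, so nothing sensitive ever leaves the original database. The dbo.Customer_Test table and its columns are invented for illustration.

SET NOCOUNT ON;

-- hypothetical test table; swap in whatever structure mirrors the real one
IF OBJECT_ID('dbo.Customer_Test') IS NULL
    CREATE TABLE dbo.Customer_Test
    (
        customer_no int IDENTITY(1,1) PRIMARY KEY,
        first_name  varchar(50),
        surname     varchar(50),
        balance     money
    );

DECLARE @i int;
SET @i = 0;
WHILE @i < 10000
BEGIN
    -- fabricated names and a pseudo-random balance; no production values involved
    INSERT INTO dbo.Customer_Test (first_name, surname, balance)
    VALUES ('First' + CAST(@i AS varchar(10)),
            'Last'  + CAST(@i AS varchar(10)),
            CAST(ABS(CHECKSUM(NEWID()) % 10000000) AS money) / 100);
    SET @i = @i + 1;
END;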

Thanks in advance
-Devin

View 2 Replies



Efficiently Supplying Xml Data For Treeview

May 16, 2008

I have an interesting problem:

I have an ASP.NET web application that uses a Treeview control to display what can potentially be a very large data set. In the past, I would just run a recursive stored procedure in my database that would output the XML, which I would save to a file. The Treeview used the XML file as its data source. I did this because the stored procedure can take so long to run (10 seconds or more) that it isn't practical to have the treeview point directly to the stored procedure. This worked well enough because the data didn't change very often.

Now, it looks as if the application will be used in a production environment, and I really need to find a way to supply up-to-date data to the treeview in a dynamic way. I have tried creating a view that would provide XML and be updated any time the target table is updated, but that has not worked. I have also tried creating a trigger that would output to an XML file any time an edit was made (using the xp_cmdshell functionality), but that has proven difficult as well.

Is there a simpler solution that I am just missing? I just want an up-to-date XML representation of the data that is a result of a recursive function.
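One possibility, sketched below under the assumption of a hypothetical dbo.TreeNode(NodeId, ParentId, NodeName) table and SQL Server 2005 or later: a recursive CTE evaluated at request time returns the whole hierarchy as XML, so no file, view, or trigger is needed. Note it produces a flat list of <node> elements carrying a parentId attribute; the page code (or an XSLT step) would rebuild the nesting from that attribute.

-- flat XML of the whole hierarchy, built fresh on every call
WITH tree AS
(
    SELECT NodeId, ParentId, NodeName
    FROM   dbo.TreeNode
    WHERE  ParentId IS NULL                     -- root nodes
    UNION ALL
    SELECT c.NodeId, c.ParentId, c.NodeName
    FROM   dbo.TreeNode AS c
           INNER JOIN tree AS p ON c.ParentId = p.NodeId
)
SELECT  NodeId   AS '@id',
        ParentId AS '@parentId',
        NodeName AS '@text'
FROM    tree
FOR XML PATH('node'), ROOT('tree');             -- default recursion limit is 100 levels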

Thanks for any help you can provide.

View 7 Replies View Related

Sampling Data Set Via Integration Services Data Flow For Data Mining Models Without Saving Training And Test Data Set?

Nov 24, 2006

Hi, all here,

Thank you very much for your kind attention.

I am wondering whether it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they would occupy too much space. I really need guidance on that.

Thank you very much in advance for any help.

With best regards,

Yours sincerely,

View 5 Replies View Related

How Can I Rewrite This To Run Efficiently?

Apr 18, 2002

How can I rewrite this and trim the code?

CREATE Procedure Disbursements_Cats
(@startdate datetime,
@enddate datetime)

AS
Begin

SELECT Loan.loan_No AS Loan_No,
Loan.customer_No AS Customer_No,
Customer.first_name AS First_name,
Customer.second_name AS Second_name,
Customer.surname AS Surname,
Customer.initials AS Initials,
Bank.Bank_name AS Bank_name,
Branch.Branch_name AS Branch_name,
Branch.branch_code AS Branch_code ,
Bank_detail.bank_acc_type AS Bank_acc_type,
Transaction_Record.transaction_Amount AS Transaction_Amount,
Transaction_Record.transaction_Date AS Transaction_Date,
Loan.product AS Product,
Product.product_Type AS Product_Type,
Product_Type.loan_Type AS Loan_Type

FROM Transaction_Record
INNER JOIN Loan ON Transaction_Record.loan_No = Loan.loan_No
INNER JOIN Product ON Loan.product = Product.product
INNER JOIN Customer ON Loan.customer_No = Customer.customer_no
INNER JOIN Bank_detail ON Customer.customer_no = Bank_detail.customer_no
INNER JOIN Branch ON Bank_detail.Branch = Branch.Branch
INNER JOIN Bank ON Branch.Bank = Bank.Bank
INNER JOIN Product_Type ON Product.product_Type = Product_Type.product_Type

END;
GO
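One thing that stands out: the procedure accepts @startdate and @enddate but never uses them, so every call returns every transaction. Below is a hedged sketch of the same query trimmed with short table aliases, plus a guessed date filter; the filter is purely an assumption about the intent, and speed ultimately depends on indexes rather than on the aliases.

CREATE PROCEDURE Disbursements_Cats_Trimmed    -- renamed so it does not collide with the original
    @startdate datetime,
    @enddate   datetime
AS
BEGIN
    SELECT  l.loan_No, l.customer_No,
            c.first_name, c.second_name, c.surname, c.initials,
            bk.Bank_name, br.Branch_name, br.branch_code,
            bd.bank_acc_type,
            tr.transaction_Amount, tr.transaction_Date,
            l.product, p.product_Type, pt.loan_Type
    FROM    Transaction_Record AS tr
            INNER JOIN Loan         AS l  ON tr.loan_No     = l.loan_No
            INNER JOIN Product      AS p  ON l.product      = p.product
            INNER JOIN Customer     AS c  ON l.customer_No  = c.customer_no
            INNER JOIN Bank_detail  AS bd ON c.customer_no  = bd.customer_no
            INNER JOIN Branch       AS br ON bd.Branch      = br.Branch
            INNER JOIN Bank         AS bk ON br.Bank        = bk.Bank
            INNER JOIN Product_Type AS pt ON p.product_Type = pt.product_Type
    WHERE   tr.transaction_Date BETWEEN @startdate AND @enddate;   -- assumed intent of the parameters
END;
GO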

View 1 Replies View Related

DB Engine :: Replicate A Master Test Database To 100 Test Environments?

Oct 12, 2015

We are setting up a test lab environment with 100 machines. We want one master testing db that gets replicated to each machine to run scripted application tests nightly.

My goal is to minimize the amount of work needed to move this database to each of the 100 test machines. I am wondering whether we even need SQL locally on each machine and should instead invest in a monster db server hosting 100 restored copies of the db, with each test machine pointing to its own db on that server, or whether I should use db mirroring or something similar to get the master test db onto each of those machines instead.

View 6 Replies View Related

Using Wildcards Efficiently With Equals Or LIKE

Jul 6, 2006

Is it possible to use wildcards with an equals statement? Such as:

SELECT * FROM Table WHERE City = '%' AND State = 'Ca'

Basically just stating where city equals anything. I know you can do it with a LIKE statement, such as:

SELECT * FROM Table WHERE City LIKE '%' AND State = 'Ca'

but is that very efficient? The reason I want to do this is because I want to programmatically set the city, so just omitting it won't work.
Also, using City LIKE '%' seems to not include NULL... is there any way to include NULL as well as anything else?
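A common alternative, sketched below, is to treat the city as an optional parameter rather than a wildcard; passing NULL means "any city" and, unlike LIKE '%', it also keeps rows where City is NULL. The table name is taken from the example above.

DECLARE @City  varchar(50);
DECLARE @State char(2);
SET @State = 'Ca';
SET @City  = NULL;                  -- set programmatically when a real filter is wanted

SELECT  *
FROM    dbo.[Table]
WHERE   (@City IS NULL OR City = @City)   -- NULL parameter = no city filter at all
  AND   State = @State;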
Thanks for your help!

View 2 Replies View Related

Rewrite A Query Efficiently

Mar 15, 2007

Is there a more efficient way to write this query?

SELECT CASE
           WHEN Population BETWEEN 0 AND 100 THEN '0-100'
           WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
           ELSE 'Greater than 1000'
       END AS Population_Range,
       COUNT(CASE
                 WHEN Population BETWEEN 0 AND 100 THEN '0-100'
                 WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
                 ELSE 'Greater than 1000'
             END) AS [No. Of Countries]
FROM Country
GROUP BY CASE
             WHEN Population BETWEEN 0 AND 100 THEN '0-100'
             WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
             ELSE 'Greater than 1000'
         END
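One way to avoid repeating the CASE three times, assuming SQL Server 2000 or later: compute the band once in a derived table and group on it. The result is the same; whether it is faster depends mostly on the Country table's size and indexes.

SELECT  x.Population_Range,
        COUNT(*) AS [No. Of Countries]
FROM   (SELECT CASE
                   WHEN Population BETWEEN 0 AND 100    THEN '0-100'
                   WHEN Population BETWEEN 101 AND 1000 THEN '101-1000'
                   ELSE 'Greater than 1000'
               END AS Population_Range          -- the banding expression appears only once
        FROM Country) AS x
GROUP BY x.Population_Range;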

View 1 Replies View Related

Ordering Records - Efficiently

Feb 8, 2008

I hope I explain this correctly...

I'm required to allow users to order items in a field to be displayed on a page in the order they specified.

For example: A user can drag and drop items in a list to specify the order it will be displayed in.

I have my drag and drop code ready to do this.
I have an idea on how to do this but I think it’s too inefficient.
I was going to create an orderby field and populate it with a number that corresponds to the position of the item. However, as one can deduce, if a user drags and drops a record between two others, I would have to change not only its orderby number but also the orderby numbers of all the other items.

For instance, if I dropped an item with an orderby number of 3 between 6 and 7, I would have to change the 3 to a 7 and then adjust the orderby numbers of all the other records between those positions, as well as everything after 7.

Well, I hope I make sense. It’s easier to visualize it on paper.

Does anyone know how to tackle this issue of user dynamic ordering?
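One common answer, sketched below with invented table and column names: store the orderby values with gaps (10, 20, 30, ...) so that most drops only touch the moved row, and renumber the whole list only when two neighbours run out of room. ROW_NUMBER() requires SQL Server 2005 or later.

DECLARE @movedItemId int, @userId int;
SET @movedItemId = 42;
SET @userId = 7;

-- drop the moved item halfway between its new neighbours (e.g. rows at 60 and 70)
UPDATE dbo.UserItems
SET    orderby = 65
WHERE  item_id = @movedItemId;

-- occasional clean-up: renumber back to multiples of 10 in the current display order
UPDATE u
SET    orderby = seq.rn * 10
FROM   dbo.UserItems AS u
       INNER JOIN (SELECT item_id,
                          ROW_NUMBER() OVER (ORDER BY orderby) AS rn
                   FROM   dbo.UserItems
                   WHERE  user_id = @userId) AS seq
               ON seq.item_id = u.item_id;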

View 2 Replies View Related

Efficiently Joining Same Table Twice

Jul 23, 2005

My main table has the following structure:

t1 (id_primary, id_secundary, name) i.e. [(1, 1, "name1"), (2, 1, "name2")]

I want to join this table with the following second table:

t2 (id_primary, id_secundary, value) i.e. [(1, NULL, "value1"), (NULL, 1, "value2")]

The join should first try to find a match on id_primary, and only if that fails should it find a match on id_secundary. Every row in t1 is matched against a single row in t2.

The following query works:

select a.name, isnull(b.value, c.value)
from t1 a
left outer join t2 b on a.id_primary = b.id_primary
left outer join t2 c on a.id_secundary = c.id_secundary

I'm wondering though if it would be possible to write a query that only uses t2 once, since it actually is quite a complex query that is calculated twice now. Any ideas (besides using a temp table)?
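A hedged sketch of one way to reference t2 only once (SQL Server 2005 or later): rank the candidate matches per t1 row and keep the best one, with a primary match winning over a secondary match. Whether it actually beats the double join depends on the indexes on t2.

SELECT  a.name,
        m.value
FROM    t1 AS a
OUTER APPLY
       (SELECT TOP 1 b.value
        FROM   t2 AS b
        WHERE  b.id_primary   = a.id_primary
           OR  b.id_secundary = a.id_secundary
        ORDER BY CASE WHEN b.id_primary = a.id_primary THEN 0 ELSE 1 END  -- primary match first
       ) AS m;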

View 3 Replies View Related

Unit Testing For SSIS - To Test Or Not To Test?

Oct 17, 2006

Now that we have a good programming model in SSIS, the question is whether to write automated unit tests for your packages, and whether that would generally be a good idea for packages.

Also, if the answer is yes, where can I find more information on how to accomplish that?

View 1 Replies View Related

How To Test SSis Package And What Are The Things I Need To Test It ?

Nov 27, 2007



Hi everyone,
I need to test an SSIS package which will import data from different databases, where the record count is around 5 million.
I am planning to test it through C# code as well as manually.
SSIS source: consists of 7 tables
SSIS destination: consists of 7 tables
Using C# code I am trying to run the SSIS package through a batch file.
I am putting the expected row count and column count in an Excel file and comparing them with the destination tables by writing queries using ADO.NET.
Am I going the right way? Can anyone suggest the best and most productive way to test the SSIS package?
What are the other things I need to test?
Can anyone add test cases to the list below?

S.No  Test Case
1     Verify all the tables have been imported.
2     Verify all the rows in each table have been imported.
3     Verify all the columns specified in the source query for each table have been imported.
4     Verify all the data has been received without any truncation for each column.
5     Verify the schema at source and destination.
6     Verify the time taken / speed of the data transfer.
7     Verify fields truncated due to a difference in field length at the destination.

Regards,
Arif Shareef
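For test case 2, a plain T-SQL comparison can replace part of the Excel/ADO.NET plumbing; the sketch below assumes the source and destination databases are reachable from one server (or via a linked server) and uses placeholder names.

SELECT  'dbo.Orders' AS table_name,                                   -- repeat per table, or generate dynamically
        (SELECT COUNT(*) FROM SourceDb.dbo.Orders) AS source_rows,
        (SELECT COUNT(*) FROM DestDb.dbo.Orders)   AS destination_rows,
        CASE WHEN (SELECT COUNT(*) FROM SourceDb.dbo.Orders)
                = (SELECT COUNT(*) FROM DestDb.dbo.Orders)
             THEN 'PASS' ELSE 'FAIL' END           AS row_count_check;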

View 9 Replies View Related

Efficiently Searching Multiple Words In A String

Feb 15, 2008

Hi, I'd be interested in people's thoughts about the following. A user on my site will be searching for a venue name, and that could officially include a sponsor which the user might not search for. Now I am using the AutoCompleteDropdown from the AJAX Control Toolkit, so the user will start typing in a few characters and the results will be returned.

I can generate the results from SQL by doing a simple LIKE '%' + @searchTerm + '%'; however, this fills me with great fear of table scans. At the moment, we'd be querying against a table of 5K records, but our application is very new.

I'm thinking one option is to split the words into another table - a one-to-many relationship to hold each word of the venue. The benefit of this would be that you could do a LIKE @term + '%', but then I have the cost of the join. (And the added complexity, which is not a major issue.)

Any thoughts/tips? Thanks!
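A hedged sketch of the word-table idea (all names invented): store one row per word of each venue name so the lookup can use LIKE 'term%', which can seek an index, instead of LIKE '%term%', which cannot. SQL Server's full-text indexing (CONTAINS) is the other route worth benchmarking.

CREATE TABLE dbo.VenueWord
(
    VenueId int          NOT NULL,
    Word    nvarchar(50) NOT NULL,
    CONSTRAINT PK_VenueWord PRIMARY KEY (Word, VenueId)  -- Word first so the prefix search can seek
);

DECLARE @term nvarchar(50);
SET @term = N'arena';

SELECT DISTINCT v.VenueId, v.VenueName
FROM   dbo.VenueWord AS w
       INNER JOIN dbo.Venue AS v ON v.VenueId = w.VenueId
WHERE  w.Word LIKE @term + N'%';        -- prefix match only, no leading wildcard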

View 1 Replies View Related

SQL 2012 :: How To Efficiently Downsize Some Unicode Fields

Oct 12, 2015

We have a SQL Server 2012 Enterprise live transactional database that is now growing over 1G per month and is becoming a size problem for us. It is currently at 23G. Character type fields are all Unicode and I have calculated a savings of 5G in space converting only 2 such fields averaging 206 characters each to non-Unicode, and almost 10G in space if we convert a few more of them from nchar and nvarchar to char and varchar types. These fields will never have a requirement to hold Unicode characters that cannot be in the SQL_Latin1_General_CP1_CI_AS collation as they come in as plain ASCII originally and will always do so per the protocol standard.

I’m the software architect and chief C# developer though only a DBA hack or I would not have designed our database to have Unicode fields for high volume tables that did not need Unicode for those fields when the database was created 3 years ago. I want to correct this mistake now before we finalize converting to an AlwaysOn environment to support with various performance and backup issues.

After downsizing these two or more fields, we would like to shrink the database one time to take advantage of the space savings for full backups, and for seeding an AlwaysOn environment.

1.What is the safest and most efficient conversion technique for downsizing columns from nchar/nvarchar to char/varchar types? Esp. when there are multiple fields in the same table to be converted. I tested doing an “add new column, set new=old, drop old, rename old to new” for both of the main two fields I want to convert from nvarchar(max) to varchar(max), and it took 81 minutes on our test server (4 virtual core, 8G memory) before running out of disk space even though there was 8G left on the disk, and the db has unlimited size set (Could not allocate space for object 'dbo.abc'.'PK_xyz' in database 'xxx' because the 'PRIMARY' filegroup is full). I did delete an old database before it finished after getting a disk warning so maybe it did not count that new space. Regardless it was too slow. And this was on just the two largest of these columns (12.6M rows) and only ran 2 to 3% CPU busy so seemed not very efficient, and indicated unacceptable downtime if we were to convert even these two fields much less any additional fields. Average field size for these two fields was only 206 characters or 412 bytes each. Another technique I plan to try is to create the new table def in a new schema, select into it from the old table, then move tables amongst schema and delete the old table. I have a FK and indexes to contend with on the table.

2.If I figure out how to do #1 efficiently within an acceptable maintenance window, what is the safest practice for doing a one-time shrink and ending up with reorganized/rebuilt indexes and updated statistics? I understand the logic of not doing regular shrinks and that sometimes a shrink can actually increase the size.

3.Is there any third party tool that could take a backup and restore it into a new database with the modified field definitions or otherwise convert certain field types?
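On question 1, one approach worth testing is ALTER TABLE ... ALTER COLUMN, which changes the type in place instead of add/copy/drop/rename; for nvarchar(max) columns it still has to rewrite the off-row data, so the duration and log growth should be measured on a restored copy first. The sketch below reuses the table and index names from the error message, the column names are placeholders, and each column's original NULL/NOT NULL setting should be kept.

ALTER TABLE dbo.abc ALTER COLUMN BigText1 varchar(max) NULL;   -- BigText1/BigText2 are placeholder column names
ALTER TABLE dbo.abc ALTER COLUMN BigText2 varchar(max) NULL;

-- afterwards, rebuild the clustered index / primary key to compact the pages
ALTER INDEX PK_xyz ON dbo.abc REBUILD;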

View 9 Replies View Related

Need Opinions On Creating A Reporting Database More Efficiently

May 27, 2006

Situation:
SQL Server 2000.
At my new employer they have a production database on one server and a copy of it that is set to read only on another server which is used for reporting.

#1
They have an SQL Server Agent job on the production server that: (2 times a day)

Backs up the production database
Copies the backup file to a directory on the reporting server. (It's pretty big and can take time if there are problems with the LAN.)

#2
They have an SQL Server Agent job on the reporting server that: (scheduled to run 2 hours or so after the job on server 1 has run; they figured it would be a safe bet that the backup and copy process of the first job would be done by then)

Breaks the user connections to the reporting database
Performs a restore on the reporting database using the backup file that was copied to the holding directory by the production job.
Sets some permissions for various users.
Sets the reporting database to READ ONLY.
What I would like to do is find a more efficient way to create this reporting database. I have started researching DTS methods but would like some opinions from more experienced users.

Thank You,
Wade

View 1 Replies View Related

Test Image Data

Feb 28, 2007

Hi, I have the following table and just wondered if there's an easy way of inserting test data with the image field not null.

Table: file
Fields: file_id(int), filename(varchar), data(image).

Any help would be great!
Cheers, Mark
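One simple option, sketched against the table above: a binary literal converts implicitly to image, so a few arbitrary bytes are enough to keep the column non-null for testing.

INSERT INTO dbo.[file] (file_id, filename, data)
VALUES (1, 'dummy1.gif', 0x474946383961);   -- arbitrary bytes (these happen to spell a GIF header)

-- or reuse an image already stored in some existing row:
-- INSERT INTO dbo.[file] (file_id, filename, data)
-- SELECT 2, 'dummy2.gif', data FROM dbo.[file] WHERE file_id = 1;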

View 6 Replies View Related

Efficiently Creating Random Numbers In Very Large Table

Jan 19, 2007

Hello,

I need to sample data in a very large table in SQL Server 2000 (a gazillion rows of Performance Monitor statistics).

I'd like to take the top 5%, for instance, based upon a column containing random numbers.

Can anyone suggest a highly efficient method of populating a column with random numbers.

Thanks in advance.

Rod
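A sketch of one approach for SQL Server 2000 (table and column names are placeholders): RAND() returns the same value for every row of a set-based UPDATE, but CHECKSUM(NEWID()) is evaluated per row, so it can populate a sampling column in one pass. On a truly huge table the UPDATE may be better run in batches.

ALTER TABLE dbo.PerfMonStats ADD SampleKey float NULL;

UPDATE dbo.PerfMonStats
SET    SampleKey = ABS(CHECKSUM(NEWID()) % 1000000) / 1000000.0;   -- pseudo-random value in [0, 1)

-- roughly the top 5% of rows:
SELECT *
FROM   dbo.PerfMonStats
WHERE  SampleKey < 0.05;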

View 10 Replies View Related

Inserting Test Data In To The Database

Jan 3, 2001

How can I insert test data into the database? I want to insert one million records into the table; this is to test database performance.
Can anyone help me in this regard? Do we have any scripts for this purpose?
thanks
Mar
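For what it's worth, a set-based cross join is usually far faster than a million-iteration loop; here is a rough sketch (the target table and its columns are placeholders) that generates exactly 1,000,000 rows from a ten-row digits table.

CREATE TABLE #digits (d int);
INSERT INTO #digits (d) VALUES (0); INSERT INTO #digits (d) VALUES (1);
INSERT INTO #digits (d) VALUES (2); INSERT INTO #digits (d) VALUES (3);
INSERT INTO #digits (d) VALUES (4); INSERT INTO #digits (d) VALUES (5);
INSERT INTO #digits (d) VALUES (6); INSERT INTO #digits (d) VALUES (7);
INSERT INTO #digits (d) VALUES (8); INSERT INTO #digits (d) VALUES (9);

-- 10 x 10 x 10 x 10 x 10 x 10 = 1,000,000 rows in a single insert
INSERT INTO dbo.TestTable (test_id, test_value)
SELECT d1.d + d2.d*10 + d3.d*100 + d4.d*1000 + d5.d*10000 + d6.d*100000,
       'row ' + CAST(d1.d + d2.d*10 + d3.d*100 + d4.d*1000 + d5.d*10000 + d6.d*100000 AS varchar(10))
FROM   #digits d1, #digits d2, #digits d3, #digits d4, #digits d5, #digits d6;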

View 4 Replies View Related

Generating Test Data In SQL Server

Jul 3, 2006

Hi,

I use SQL Server 2005 Dev Edition and am not new to making databases (then again, I've had enough experience and my dad does the same thing).

I am (unfortunately) a university student and for my dissertation I am going to produce a SQL Server database with a strong emphasis on data mining.

Obviously, for the data mining to be useful at all I need to produce loads and loads of test data.

Fair enough, and there are applications which do this, such as EMS Data Gen, but can anyone recommend any other data gen utilities? EMS Data Gen has poor handling of unique attributes, and as I am modelling a car manufacturer this will give me problems when I come to the registration number attribute.

Also, why are utilities for SQL Server (and Oracle at that) so expensive? This makes it out of my reach and makes it difficult to build a truly good database that will net me good marks, and demotivates me. :(

Lastly, please feel free to recommend to me any utilities for SQL Server - such as performance monitors, backup utilities. Anything. But if they are priced utilities, they have to be sensibly priced (<£100), because I cannot yet afford to pay >£1k on such utiltiies.

Thanks

View 1 Replies View Related

Train And Test Data Sets

Feb 8, 2007

I've seen that sometimes it is better to split the table into a test dataset and a training dataset, and I'd appreciate it if anyone could explain why this is...

thanks

Santiago Aceñolaza
Argentina

View 4 Replies View Related

Test Connection Successful, Yet No Data

May 18, 2007

I run XP Professional on a home machine. I recently installed VB 2003 with MSDE. When I go to the Server Explorer window and right-click to add a connection, the Data Link Properties dialog box pops up as it should. I select the name of my home server, indicate integrated security, and select a sample database such as model or master from the list. The test connection is successful, and clicking OK causes the connection to appear in the Server Explorer window. Yet when I go to open the connection, there is no data. The folders for Tables, Stored Procedures and so forth appear but they are empty. I have another 2000 database that a friend sent me. If I try to indicate in the Data Link Properties dialog box that I would like to attach this database, I get the same symptom. Test connection successful, yet when I create and open this connection, the folders are empty. I am a newbie to ADO.NET and I am not sure where to begin. Can someone help?

View 3 Replies View Related

Need To Generate Test (Dummy) Data In SQLServer 7.0

Nov 1, 1999

Hi.

Does anyone know if SQLServer 7.0 will generate dummy data for specific columns in tables?

TIA,

Jeff

View 2 Replies View Related

Move A Subset Of Data From Production To Test

Nov 18, 2005

Here is my requirement.

There is a production database which has ever increasing data. For testing purposes though, I would like to build a test database with exactly the same schema but only a subset of data copied from the production database . I'll specify the criteria (something like a where clause in select query) for copying the data from the production database.

Is there a tool that anyone has come across to do this job ?
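In the absence of a dedicated tool, the usual hand-rolled approach is INSERT ... SELECT with the criteria applied, copying parent tables before child tables so foreign keys are satisfied; identity columns would also need SET IDENTITY_INSERT handling. A hedged sketch with placeholder names:

-- copy a filtered set of parents first
INSERT INTO TestDb.dbo.Customer
SELECT * FROM ProdDb.dbo.Customer WHERE region = 'WEST';

-- then only the children that belong to the copied parents
INSERT INTO TestDb.dbo.[Order]
SELECT o.*
FROM   ProdDb.dbo.[Order] AS o
WHERE  EXISTS (SELECT 1 FROM TestDb.dbo.Customer AS c
               WHERE c.customer_id = o.customer_id);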

View 2 Replies View Related

Restore Data From Live System To Test

Jun 16, 2015

I have two SQL databases on separate servers, live and test. I have been asked to copy the data from the live system and put it into test. Both are managed with SQL Server Management Studio 2008 and run on Windows Server 2008 R2.

Would a simple backup of the database, copying that file to the test system, and restoring the database from it work, or is there more to it?
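A plain backup/restore is usually all that is needed; the sketch below uses example paths and logical file names (check the real ones with RESTORE FILELISTONLY), and afterwards you may need to re-map database users to logins on the test server.

BACKUP DATABASE LiveDb TO DISK = 'D:\Backups\LiveDb.bak' WITH INIT;

-- copy the .bak file to the test server, then:
RESTORE DATABASE TestDb
FROM DISK = 'D:\Backups\LiveDb.bak'
WITH MOVE 'LiveDb_Data' TO 'E:\SQLData\TestDb.mdf',
     MOVE 'LiveDb_Log'  TO 'E:\SQLData\TestDb.ldf',
     REPLACE;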

View 3 Replies View Related

CLR Test Script SELECT Returns No Row Data

Sep 12, 2006

Hi,

The test.sql scripts I write to test CLR stored procedures run successfully, but when I want to display the resulting data in the database with a simple "SELECT * from Employee"

I get the result as:
Name Address
---- -------
No rows affected.
(1 row(s) returned)

But the actual row is not displayed, whereas I would expect to see something like:

Name Address
---- -------
John Doe

No rows affected.
(1 row(s) returned)

I have another database project where doing the same thing displays the row information but there doesn't seem to be a lot different between the two.

Why are there no results in the first case?

Thanks,
Bahadir

View 1 Replies View Related

Test - Populating Tables With Dummy Data

Aug 24, 2006

In 2000, BCP seemed the way to go. DTS packages would also work. My question is, in 2005, what is the best choice? I seem to remember that BCP ignored all referential integrity constraints, and applying them afterwards was a royal pain. I'm not a BCP expert by any means. Running this at the command line means using the DOS prompt, correct?

What is 2005's answer to this?
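Broadly, SSIS took over the DTS role in 2005, while bcp and BULK INSERT are still available; BULK INSERT skips constraints and triggers by default, which is where the referential-integrity pain comes from, so they can be switched on explicitly. A sketch with a placeholder table and path:

BULK INSERT dbo.Orders
FROM 'C:\testdata\orders.csv'
WITH (FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n',
      CHECK_CONSTRAINTS,      -- validate foreign keys / check constraints during the load
      FIRE_TRIGGERS);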

View 4 Replies View Related

Transact SQL :: Create Test Data Using Script For Each Row

Apr 22, 2015

I am looking for a SQL code snippet which reads data from the table below

UserId username contact
 1      Anil    111
 2      Sunil   222

and inserts data into the table below, appending a sequence number 1, 2, 3 to only the City and Email values as test data. The two tables are different and do not have any referential integrity between them. The number of records inserted per user is configurable, for example count = 3:

Username  City  Email
Anil      city1 email1
Anil      city2 email2
Anil      city3 email3
Sunil      city1 email1
Sunil      city2 email2
Sunil      city3 email3
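A hedged sketch of one way to do it (the source and destination table names are assumed; the column names come from the post): cross join the users to a numbers source so each user gets @count rows, then build the City and Email values by appending the sequence number.

DECLARE @count int;
SET @count = 3;

INSERT INTO dbo.UserCityEmail (Username, City, Email)
SELECT  u.username,
        'city'  + CAST(n.number AS varchar(10)),
        'email' + CAST(n.number AS varchar(10))
FROM    dbo.Users AS u
        CROSS JOIN master.dbo.spt_values AS n     -- built-in 0..2047 numbers source
WHERE   n.type = 'P'
  AND   n.number BETWEEN 1 AND @count;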

View 5 Replies View Related

SQL Server 2008 :: How To Generate Test Data Using Only VS 2013

Sep 29, 2015

I've been tasked to generate some test data (a few thousand rows) into a new table in a new database. This database is a whole new idea, so I can't write a query to pull pieces of data from other databases. I cannot consider any third party tools, such as what Redgate or Idera has to offer. I can't consider free tools such as what I've found on GitHub. I've been instructed to restrict myself to Visual Studio 2013 and whatever I can get that works within that.

View 5 Replies View Related

Query To Compare Table Data Between Test And Production?

Jul 23, 2005

I am debugging one of our programs and ran the fix in Test. I would like to compare table 1 between Production and Test. I want the query to output column 1 wherever the Production and Test output differ. What is the best way to achieve this?

jeff
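On SQL Server 2005 or later, EXCEPT makes this a short query; the sketch below assumes both databases sit on the same instance (otherwise use four-part linked-server names) and uses placeholder database, table, and column names. On SQL 2000 the same comparison can be written with NOT EXISTS.

(SELECT col1 FROM Production.dbo.table1
 EXCEPT
 SELECT col1 FROM Test.dbo.table1)          -- values present in Production but not in Test
UNION ALL
(SELECT col1 FROM Test.dbo.table1
 EXCEPT
 SELECT col1 FROM Production.dbo.table1);   -- and the reverse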

View 2 Replies View Related

Is There Any Way To Do Data Mining Model Tests Via SSIS Package?

Dec 1, 2006

Hi, all here,

I am wondering if there is any way for us to test data mining models via an SSIS package? It would be quite helpful if there is such a way.

Looking forward to hearing from your guidance and thanks a lot in advance.

With best regards,

Yours sincerely,

View 3 Replies View Related

SQL Server 2008 :: Create Test Portfolio Data Summing To 100%?

Feb 5, 2015

I'm building a proc to generate fake stock portfolios for testing. I have a list of thousands of symbols, and I want the tester to be able to select how many symbols they want in their fake portfolio, and then give each symbol a random weighting (i.e. percentage held in that security) which, across all the symbols, sums to 100%. The securities here are not the part I care about, it's the weightings summing to 100 that's important.

So test data would look something like this:

/*
--This is the repository of potential symbols I can add to a fake portfolio.
-- So the simple part is basically select top (@symbolCt) from #PossibleSymbols, plus some magic I have yet to determine
if object_id('tempdb.dbo.#PossibleSymbols') is not null drop table #PossibleSymbols
create table #PossibleSymbols
(
SymbolID int
)
insert into #PossibleSymbols (SymbolID)

[code]....
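The "magic" part can be handled by normalising raw random numbers so they always sum to 100; a rough sketch below, where #PossibleSymbols comes from the script above and @symbolCt and the other names are assumptions.

DECLARE @symbolCt int;
SET @symbolCt = 10;

IF OBJECT_ID('tempdb.dbo.#Picked') IS NOT NULL DROP TABLE #Picked;

-- pick @symbolCt symbols at random and give each a raw positive random weight
SELECT TOP (@symbolCt)
       SymbolID,
       CAST(ABS(CHECKSUM(NEWID()) % 10000) + 1 AS float) AS RawWeight
INTO   #Picked
FROM   #PossibleSymbols
ORDER BY NEWID();

-- divide by the total so the weights sum to 100 (up to rounding)
SELECT  SymbolID,
        RawWeight * 100.0 / SUM(RawWeight) OVER () AS WeightPct
FROM    #Picked;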

View 0 Replies View Related

Analysis :: YTD / MTD Functions Return Empty Values Probably Due To Old Test Data

Jun 17, 2015

I have managed to use the BI Wizard for time intelligence and added YTD and MTD successfully. I notice the values returned are empty, and I think this is due to the fact that all the test data I use is many years old. What's the simplest way to resolve this issue so that I can see that these MDX functions return correct values? Changing the system date on this company laptop is not an option.

View 4 Replies View Related

Built Database With SQL Server Management Studio Express, How Can I Quickly Add Test Data?

Apr 13, 2006

I've built my SQL Server Express database with SQL Server Management Studio Express, and now I want to enter some seed data to assist in building the app around it. I cannot find an option to manage the data in SQL SMSX, like I used to with Enterprise Manager.
I don't want to have to write an app just to get test data in. Seems like this should be a common need. Am I missing something obvious here? Can't find any reference to this in a search of the forums.
Please help.
 

View 1 Replies View Related

Data Source Deployment Best Practices Supporting Development, Test, And Production Environments

Feb 4, 2008

We are setting up a new Reporting Services 2005 enterprise reporting tier that will support multiple developers, applications, and end users. We will have mirrored environments including development, test, and production each with their own database cluster, and reporting server.

We have multiple report developers who share a single Visual Studio solution which is saved in SourceSafe and is set up to have separate report projects for each business unit in the organization. Each report project is mapped to a specific deployment folder matching the business unit. Using the Visual Studio Configuration Manager, we can simply flip to the environment we want to deploy to and the reports are published to the correct environment and folder structure.

My problem lies with the common data sources. We are using a single master Common Data Sources folder to hold all of the data sources. The trick is that each and every reporting folder seems to have to have its own copy of the data source in Visual Studio. There does not seem to be an easy way to change the data sources for the reports when you publish to the various environments, i.e. development, test, production etc.

Ideally, we would have a single project for the common data sources that all reporting projects and associated folders would map to, and we would have a way to associate the appropriate data source for each environment when we deploy.

I'm looking for best practices on how to set up data sources for development and deployment in an enterprise environment that uses Visual Studio to develop and publish reports. We have 3 environments, 6 data sources per environment, and about 20 reporting folders / projects in Visual Studio. That's 360 changes that have to be managed when deploying reports. Is there a best-practices way to do this?

There has got to be a better way? Can anyone give me some insite into how to set this up?

Thanks!

View 8 Replies View Related






