Validate Data Content On Production And Secondary Server

Mar 14, 2008

Hi, we are using SQL 2000. What would be a good way to validate the current data of different databases against the secondary server's data after log shipping has been applied? Thank you.
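A minimal sketch of one spot check, assuming a linked server named SECONDARY pointing at the standby database and a table dbo.Orders that exists on both sides (both names are hypothetical); CHECKSUM_AGG and BINARY_CHECKSUM are available on SQL 2000:

-- Compare row counts and an aggregate checksum per table. BINARY_CHECKSUM
-- can collide, so treat a match as a strong hint, not proof.
SELECT
    (SELECT COUNT(*) FROM dbo.Orders)                          AS PrimaryRows,
    (SELECT COUNT(*) FROM SECONDARY.MyDb.dbo.Orders)           AS SecondaryRows,
    (SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM dbo.Orders)  AS PrimaryChecksum,
    (SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*))
     FROM SECONDARY.MyDb.dbo.Orders)                           AS SecondaryChecksum;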

View 3 Replies



Validate Data Type Before Inserting

Feb 26, 2003

I have a temp table which is used to store data before inserting it into the real permanent tables. All columns of the temp table are of type nvarchar.

How do I validate the data in the temp table before doing the actual insert? For example, I need to ensure a field holds a valid datetime (currently stored as nvarchar in the temp table) before inserting it into a datetime column. Can this be done using Transact-SQL?

Thanks.
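This can be done in Transact-SQL with ISDATE, which is available on SQL 2000. A minimal sketch, where #temp, dbo.RealTable, and the OrderDate column are hypothetical names:

-- Find rows whose nvarchar value will not convert to datetime:
SELECT *
FROM #temp
WHERE ISDATE(OrderDate) = 0;

-- Insert only the valid rows:
INSERT INTO dbo.RealTable (OrderDate)
SELECT CONVERT(datetime, OrderDate)
FROM #temp
WHERE ISDATE(OrderDate) = 1;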

View 1 Replies View Related

Write A Query To Validate Data

Sep 10, 2014

I'm trying to write a query to validate the data.

Here is the scenario:
1. The table has three columns: ID, Sqno, and Adj.
2. The valid values for Adj are (0, 1, 2).

Case 1: Sqno should start at '001000' for Adj in (0, 2) and increment by 2; i.e., the next values would be '001002', '001004', and so on.
Case 2: Sqno should start at '001001' for Adj = 1 and increment by 2; i.e., the next values would be '001003', '001005', and so on.

Finally, when you order by Sqno within each ID, the two interleaved series form one running sequence.

ID Sqno Adj
123A 001000 0
123A 001001 1
123A 001002 2
123A 001003 1
123A 001004 2
123A 001005 1
123A 001006 0
123A 001007 1
123A 001008 2
123A 001009 1

How would you write a query that validates this scenario?
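A sketch of one way to validate it, assuming the table is named dbo.SeqTest (hypothetical): each parity group is renumbered with ROW_NUMBER and compared against the Sqno it should carry.

WITH numbered AS (
    SELECT ID, Sqno, Adj,
           ROW_NUMBER() OVER (
               PARTITION BY ID, CASE WHEN Adj = 1 THEN 1 ELSE 0 END
               ORDER BY Sqno) AS rn,
           CASE WHEN Adj = 1 THEN 1001 ELSE 1000 END AS seed
    FROM dbo.SeqTest
)
SELECT ID, Sqno, Adj,
       RIGHT('000000' + CAST(seed + 2 * (rn - 1) AS varchar(6)), 6) AS ExpectedSqno
FROM numbered
WHERE Sqno <> RIGHT('000000' + CAST(seed + 2 * (rn - 1) AS varchar(6)), 6);
-- Rows returned are the ones that break the expected running sequence.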

View 2 Replies View Related

Dumb Question: Best Practices To Validate Data Columns

Aug 15, 2007

[This is one of those cases where I think I know the answer, but I hope I'm wrong!]

I have a data flow which is processing data from the XML Source. There are 16 outputs from the XML Source. I have to perform a variety of validations on these outputs: things like "column 1 is required if column 2 has value 'a' or 'b'", or "column 1 or column 2 may be present, but not both".

For lookups and such, I use the Lookup component and its error output, both to redirect rows that fail the lookup, and to capture the data, column number and error code.

But, how do I do the same for "normal" validations?

If I have to use the Conditional Split transform, then I'll need one output per validation, and a Union All to combine the rows again for output to an error file. This will also cost an extra Derived Column transform per output, in order to attach a column number and possible error code to each failed validation.


Worse, it's a pain to have to maintain all the columns in such a large "Union All"!

If I had the time, I might write a "Conditional Error" transform. It might be fun. But I have to be done by the end of this month, and don't even have time to create the UI for evaluating expressions!

Any tips or tricks or pointers to such would be very welcome.

View 9 Replies View Related

Recovery :: Validate Data In Transaction Logs Shipping

Jul 16, 2015

Setting aside stored procedures, reports, and all that, I want to know a possible way to make sure that the data inside my secondary server's read-only database is the same as the data in my primary server's database.

So what is the simple way to do this check?
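One simple check is a set difference across a linked server; a sketch, assuming a linked server named SECONDARY and a table dbo.Orders (both hypothetical, and the secondary must be restored in STANDBY so it is readable):

-- Rows present on the primary but missing or different on the secondary:
SELECT * FROM dbo.Orders
EXCEPT
SELECT * FROM SECONDARY.MyDb.dbo.Orders;

-- And the reverse direction:
SELECT * FROM SECONDARY.MyDb.dbo.Orders
EXCEPT
SELECT * FROM dbo.Orders;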

View 4 Replies View Related

Transact SQL :: Finding Gaps And Filling With Last Valid Data?

Aug 26, 2015

I am currently facing a complex scenario involving gaps and sequences. I have tried different approaches but have not gotten correct results; I am sure window functions are the right tool. I have a table with information grouped by PublicationId, ProviderId, MetricId, and Amount by date, one row for each month, but in some cases the data does not have sequential values. For example, I have the data for the following sequence:

I need a value for each month; in this case I need to project the months from February to May (carrying the last previous value, here January's), like this:

The data for testing are:

DECLARE @PublicationsByUser AS TABLE
(
  Id            INT,
  PublicationId INT,
  MetricId      INT,
  ProviderId    INT,
  DateCreated   DATE,
  Amount        FLOAT
);

[code]....
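A sketch of one way to project each series forward over the missing months, assuming the gaps are whole months and @PublicationsByUser has been populated: a recursive CTE builds a month calendar, then each month picks up the last known Amount via OUTER APPLY.

WITH Months AS (
    SELECT MIN(DateCreated) AS d, MAX(DateCreated) AS maxd
    FROM @PublicationsByUser
    UNION ALL
    SELECT DATEADD(MONTH, 1, d), maxd
    FROM Months
    WHERE d < maxd
)
SELECT k.PublicationId, k.ProviderId, k.MetricId,
       m.d AS DateCreated, f.Amount
FROM Months AS m
CROSS JOIN (SELECT DISTINCT PublicationId, ProviderId, MetricId
            FROM @PublicationsByUser) AS k
OUTER APPLY (SELECT TOP (1) p.Amount   -- last value at or before this month
             FROM @PublicationsByUser AS p
             WHERE p.PublicationId = k.PublicationId
               AND p.ProviderId    = k.ProviderId
               AND p.MetricId      = k.MetricId
               AND p.DateCreated  <= m.d
             ORDER BY p.DateCreated DESC) AS f
ORDER BY k.PublicationId, k.MetricId, m.d
OPTION (MAXRECURSION 0);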

View 14 Replies View Related

How To Validate Flat File Data Against Not Null Columns Of A Table?

Nov 27, 2007

Hi all,

I am facing a problem validating data from a flat file while inserting it into the destination table of a SQL Server 2005 database. In my package, I have to validate whether values are coming in as null before inserting into the destination database. The flat file may not contain data for all NOT NULL columns. I have to find those rows and reject them; if rows come in with nulls in the NOT NULL columns, the OLE DB Destination throws an OLEDB exception for the constraint.

To resolve this, I have a script component in the data flow to check whether the input data is null. I added an output column of boolean type to the script component; the script sets it to TRUE when a NOT NULL column contains a null.

And in the following conditional split, I check the flag for TRUE to reject the record.

Is there any other way to handle this validation?

Regards
Madhav
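One alternative (a sketch; the table and column names are hypothetical) is to land the file in a fully nullable staging table and do the split in T-SQL instead of a script component:

-- Reject rows that would violate the destination's NOT NULL constraints:
INSERT INTO dbo.RejectedRows (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.Staging
WHERE Col1 IS NULL OR Col2 IS NULL;        -- Col1, Col2 are NOT NULL downstream

-- Load only the clean rows:
INSERT INTO dbo.Destination (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.Staging
WHERE Col1 IS NOT NULL AND Col2 IS NOT NULL;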

View 6 Replies View Related

Power Pivot :: Deleting Data Content From Data Model

Sep 10, 2013

I don't know if this question has been nailed down. Aside from deleting tables, can we delete the *content* of the data within the tables? It doesn't seem crazy that, if you can pull data in from a feed, you should be able to remove the content again (without also destroying the user's meta-data work). Reasons for this include:

- Security (a user may not have rights to see *my* data and should go refresh their own)
- Size (workbook doesn't need to have GB's of irrelevant data saved to disk in a workbook if it was just useful during development phase to a pre-production data feed)
- Bad data (pre-production data feed is not good data)
- User-friendliness (data feed was refreshed 2 years ago and workbook was saved to file server.  Users shouldn't be presented with irrelevant data, but should get empty pivot tables until they go do their refresh)

Obviously Excel internally knows how to clear out PowerPivot data, given the prompt shown here: [URL] ....

But how does a user initiate this on their own (corruption aside)?

Previous time this question was asked, without a real resolution: [URL] ....

View 8 Replies View Related

Data Mining :: Content In Data Mining Structure Not Valid - Deployment Fails

Apr 30, 2015

I'm a beginner with SQL 2012 SSDT & SSMS. I get this error message when I try to deploy my project: 

"Error 6
Error (Data mining): KEY SEQUENCE columns are not supported at the case level. The 'Customer Key' column of the 'TK448 Ch09 Cube Clustering' mining structure contains content that is not valid.
0 0
"
I am finding it hard to locate the content that is not valid. I've been trying to find an answer for this problem but can't seem to find anything. How can I locate the content that is not valid and change or delete it so that I can deploy this solution?

View 2 Replies View Related

Scrambling Production Data

Apr 1, 2008

This is probably an easy solution for some of you seasoned DBA Vets but here is my problem.

I have to take production data and scramble certain sensitive columns such as SSN, DOB, Address, First Name so that our Management team can use it as demo material. Is there a quick solution to this issue?

Thanks,

JC
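A minimal sketch of one quick approach, run only on the copied demo database (the table and column names are hypothetical, and this is demo-grade scrambling, not cryptographic masking):

UPDATE dbo.Customers
SET SSN       = RIGHT('000000000'
                      + CAST(ABS(CHECKSUM(NEWID())) AS varchar(10)), 9),
    FirstName = 'Demo' + CAST(CustomerId AS varchar(10)),
    DOB       = DATEADD(DAY, ABS(CHECKSUM(NEWID())) % 365, '19700101'),
    Address   = CAST(CustomerId AS varchar(10)) + ' Sample St';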

View 11 Replies View Related

How Can I Move My Production Data Only To My Dev Env.

Sep 14, 2007

I'd like to move data from my production environment to my development environment. The data in dev would be replaced by the prod data. I cannot do a detach/attach or a backup and restore because of existing dev objects located in the dev environment.

Any lead on how I can do this?

Thanks Marcel
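One hedged option is a table-by-table pull across a linked server; a sketch, where the linked server PRODSRV and the table names are hypothetical:

-- Repeat per table; load parent tables before children to satisfy FKs,
-- use DELETE instead of TRUNCATE where foreign keys reference the table,
-- and wrap identity columns in SET IDENTITY_INSERT ON/OFF.
TRUNCATE TABLE dbo.Orders;

INSERT INTO dbo.Orders (OrderId, CustomerId, OrderDate)
SELECT OrderId, CustomerId, OrderDate
FROM PRODSRV.ProdDb.dbo.Orders;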

View 5 Replies View Related

Error On Custom Data Mining Plugin: Content Type Mismatch

Jan 11, 2007

Good afternoon,

I'm building a custom clustering plugin for text that pre-processes ("cleans") the texts, calculates weights, estimates the number of clusters (using the PBM index), and finally does the actual clustering.

So... I've built each of these modules in C++ and I'm putting them all together in the plugin.

My database (MDB file) has only one table, with only two fields: a key (auto-increment) and a small text. What I intend to do is get the text in each test case, store them together somewhere, and call my classes to cluster these texts.

I'm trying to log the texts to a file (just a test) in the ProcessCase method of the CaseProcessor class. I've done this with no problems with numerical data.

But when I load the MDB file in the Mining Structures Wizard, it says the content type of the field holding the texts is "Continuous" and the data type is "Text". Actually, when I saw it I didn't really mind.

But when I run the mining model it gives me the following error: "Error 1: Error (Data mining): The data type of the Table1.Texto mining structure column must be numeric, since it has a continuous content type (Content is set to Continuous or Key Time or Key Sequence)."


So... how do I change this content type? (The content type combobox in the Mining Structures Wizard could not be changed.)

Can anyone help me with this, please?

Thanks a lot.

View 6 Replies View Related

Refresh Data From Production To Development

Jul 23, 2005

Hi all, I've been assigned the task of refreshing data from the production environment to the development environment. What I got is a backup file of a database in the prod environment; I now need to load it into the development environment. I can restore it to the dev environment, no problem, but the warning I got is that the table owners in the prod and dev environments need to be different, that is, the owner is A in the prod environment and B in the dev environment.

So I need to:
1) restore the db in the dev env
2) change table owner from A to B
3) change related triggers
4) change related views

Can anyone suggest an approach that is most efficient? Thanks a lot.
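For step 2, a sketch using sp_changeobjectowner (available on SQL 2000), run once per object; the object name here is hypothetical:

-- Move one table from owner A to owner B:
EXEC sp_changeobjectowner 'A.SomeTable', 'B';

-- Generate the statements for all of A's tables instead of typing them:
SELECT 'EXEC sp_changeobjectowner ''A.' + name + ''', ''B'''
FROM sysobjects
WHERE xtype = 'U' AND uid = USER_ID('A');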

View 2 Replies View Related

Development - Production Data Access

Aug 10, 2005

Hello All,

I have been searching for a published document of best practices concerning access levels based on roles. Should developers have more than select-level access (if any at all) to production data? If I understand correctly (from multiple postings), it is best to have:

1. Development (developers have extensive access levels)
2. Test (developers have restricted access levels)
3. Production (developers have none, or select-level access)

Our environment and budget only allow for items 1 and 3. If anybody could point me to a document from a 'reputable' source, I would greatly appreciate it.

TIA
Bill

View 3 Replies View Related

Deleting Data From Production Server

Jan 9, 2008



Hi All,
I have to delete 135 million records from our production server, keeping only the last three months of data, and the table does not have any index.

Can you please suggest the best possible approach to perform this activity.

The approach I am thinking of:


1) I will rename the existing table and create a new table with the same name (this way my application won't be interrupted).
2) Then I plan to create a nonclustered index on the date column, sorted descending, so that I can get the last three months of data.
3) Once the index is created, I can transfer the required records to the new table created in step 1.

Please suggest your feedback on the best way to perform this task.

Thanks
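A minimal sketch of steps 1 and 3 combined (table and column names are hypothetical; note that SELECT ... INTO does not carry over constraints, defaults, or indexes):

EXEC sp_rename 'dbo.BigTable', 'BigTable_old';

SELECT *
INTO dbo.BigTable                  -- new table under the original name
FROM dbo.BigTable_old
WHERE DateCol >= DATEADD(MONTH, -3, GETDATE());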


View 9 Replies View Related

Move A Subset Of Data From Production To Test

Nov 18, 2005

Here is my requirement.

There is a production database with ever-increasing data. For testing purposes, though, I would like to build a test database with exactly the same schema but only a subset of the data copied from the production database. I'll specify the criteria (something like a WHERE clause in a SELECT query) for copying the data from the production database.

Has anyone come across a tool that can do this job?

View 2 Replies View Related

Copy Data From Staging Table To Production?

Apr 12, 2015

I am trying to insert data from a staging table into a production table. In the staging table I only have a period (a date), but no primary key.

Here are my staging and production tables:

CREATE TABLE StagingTable (
[Period] [char](7) NOT NULL,
[CompanyCode] [varchar](100) NOT NULL,
[total] [int] NULL,
[status] [varchar](50) NULL
)

CREATE TABLE Production (
[Period] [char](7) NOT NULL,
[CompanyCode] [varchar](100) NOT NULL,
[total] [int] NULL,
[status] [varchar](50) NULL
)

I get this data every month. What can I do to make sure only unique records are loaded into the production table, with no duplicates from previous months?
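A sketch of one duplicate-safe load, assuming (Period, CompanyCode) uniquely identifies a row (an assumption worth verifying against the data):

INSERT INTO Production (Period, CompanyCode, total, status)
SELECT s.Period, s.CompanyCode, s.total, s.status
FROM StagingTable AS s
WHERE NOT EXISTS (SELECT 1
                  FROM Production AS p
                  WHERE p.Period      = s.Period
                    AND p.CompanyCode = s.CompanyCode);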

View 5 Replies View Related

Refreshing A Development Database With Production Data

Jan 17, 2007

Hello,

I am trying to refresh a test database with data from a production database. Both database structures are identical, e.g. constraints, stored procs, PKs, etc. I am trying to create a package in SSIS that accomplishes this task, and I am having extensive problems. The Import/Export Wizard is out of the question because the constraints are not carried over; plus, when I try to refresh the data using the Import/Export Wizard, it fails on one specific table because of a column in that table named "Error code". I think "Error code" is a Microsoft keyword, so it fails on this column. Does anyone know a workaround to accomplish this simple task, which could be completed in minutes using DTS? I understand that SSIS is not as straightforward as DTS, but this task is something that DBAs do on a regular basis and therefore should not be this difficult.

Any help would be appreciated,

David

View 1 Replies View Related

How To Transfer Back Data To Production Database

Nov 19, 2015

I have a question regarding SQL transactional replication methodology.

1. Let's say we successfully created SQL transactional replication and it is running, transferring data from the publisher to the subscriber.

2. Now one day the source production/publisher SQL Server goes down and the remaining DR SQL Server (the subscriber) stays up.

3. The next day, we fix and bring the production/publisher server back up; how can we then sync all existing data records from the subscriber back to the publisher side?

View 3 Replies View Related

SQL 2012 :: Re-vamping Data Files In A Production Database?

Dec 22, 2014

(SQL 2012 Standard)

I have a database with one 320GB .mdf file and one 200GB .ndf file.

I would like to reconfigure my database to have four 200GB .mdf files. How do I get from here to there?
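For what it's worth, a database has only one primary .mdf; the additional data files would be .ndf files. A hedged sketch of one re-layout path (logical file names and paths are hypothetical): add the new files, drain the old secondary file with EMPTYFILE, then drop it. The primary file itself cannot be removed, only shrunk.

ALTER DATABASE MyDb
ADD FILE (NAME = MyDb_data2,
          FILENAME = 'D:\Data\MyDb_data2.ndf',
          SIZE = 200GB);
-- ...repeat ADD FILE for each additional 200GB file...

DBCC SHRINKFILE (MyDb_old_file, EMPTYFILE);    -- push extents off the old file
ALTER DATABASE MyDb REMOVE FILE MyDb_old_file;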

View 5 Replies View Related

Query To Compare Table Data Between Test And Production?

Jul 23, 2005

I am debugging one of our programs and ran the fix in Test. I would like to compare table1 between Production and Test. I want the query to output column1 wherever the Production and Test values differ. What is the best way to achieve this?

jeff
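A sketch of one way, assuming both servers are reachable from one place via a linked server named TESTSRV and the table has a key column Id (all names hypothetical):

SELECT COALESCE(p.Id, t.Id) AS Id,
       p.Column1 AS ProdValue,
       t.Column1 AS TestValue
FROM ProdDb.dbo.Table1 AS p
FULL OUTER JOIN TESTSRV.TestDb.dbo.Table1 AS t
    ON t.Id = p.Id
WHERE p.Column1 <> t.Column1           -- values differ
   OR p.Id IS NULL OR t.Id IS NULL;    -- row missing on one side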

View 2 Replies View Related

Recovery :: Retrieving Deleted Data In Production Table?

Jul 8, 2015

I have a problem: I have deleted data in my production table.

Now how do I retrieve it?

I have only yesterday's full and transaction log backup files.

I used the following query to delete all the data:

delete from t1;
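A sketch of a point-in-time restore to a side database, then copying the rows back (the names, paths, logical file names, and the STOPAT time are all hypothetical; STOPAT must fall just before the delete ran):

RESTORE DATABASE MyDb_Recover
FROM DISK = 'D:\Backup\MyDb_full.bak'
WITH MOVE 'MyDb'     TO 'D:\Data\MyDb_Recover.mdf',
     MOVE 'MyDb_log' TO 'D:\Data\MyDb_Recover.ldf',
     NORECOVERY;

RESTORE LOG MyDb_Recover
FROM DISK = 'D:\Backup\MyDb_log.trn'
WITH STOPAT = '2015-07-08 09:00:00', RECOVERY;

-- Then copy the recovered rows back into production:
INSERT INTO MyDb.dbo.t1 SELECT * FROM MyDb_Recover.dbo.t1;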

View 7 Replies View Related

Data Collection On Production Server To Capture Growth Rates

Mar 18, 2015

I set up data collection on a production server to capture growth rates.

When I run the disk usage report, it shows a daily growth rate of over 500 MB. This seems excessive to me.

As a troubleshooting step I then ran sp_spaceused and got these results:

database_name    database_size    unallocated space
rgc_prod         273442.63 MB     3648.48 MB

reserved         data             index_size      unused
265345488 KB     164385384 KB     99826072 KB     1134032 KB

What should my next steps be to try and determine why there is so much growth? And isn't the index size rather large?

View 1 Replies View Related

Deploy To Production Server / SSIS Writes Out Garbage Data

Apr 2, 2015

I built an SSIS package (writing out to a flat file) on a 32-bit machine and it works fine. However, when I deploy it to the production server (64-bit), the SSIS package writes out garbage data. After some research I found that the problem is a 32-bit versus 64-bit OS issue. What is my next step? Am I out of luck, so that I will now have to redesign the SSIS package for 64-bit?

View 5 Replies View Related

OpenRowSet And OpenDataSource Fail On Production To Open Excel Data

Jun 26, 2006

The following statement fails when run in Query Analyzer under sa, but works on all of our development and staging servers. All are SQL Server 2005 SP1. We upgraded production over the weekend from SQL Server 2000, creating a new instance machinename\SQL2005.

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=d:\data\test.xls',
    'SELECT * FROM [Sheet1$]')

The error we are getting only in production is:

OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "Unspecified error".

Msg 7303, Level 16, State 1, Line 1

Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".

Any help seriously appreciated,



Tronn

View 44 Replies View Related

Master Data Services :: Safe Installing MDS On Production Server?

Oct 1, 2015

I have to install MDS on a production server without testing on a test server (there is no test/dev server). On the production server, SSRS reports are rendered every day and cannot be interrupted.

What risk is there in installing MDS on the production server? (The SSRS, SSIS, and engine services must not go down; well, they can for a few hours.) This is SQL 2012 Enterprise.

What do I have to do first, and which steps should I take, to install as safely as possible for the currently running BI environment?

View 2 Replies View Related

Data Source Deployment Best Practices Supporting Development, Test, And Production Environments

Feb 4, 2008

We are setting up a new Reporting Services 2005 enterprise reporting tier that will support multiple developers, applications, and end users. We will have mirrored environments, including development, test, and production, each with its own database cluster and reporting server.

We have multiple report developers who share a single Visual Studio solution, which is saved in SourceSafe and is set up to have separate report projects for each business unit in the organization. Each report project is mapped to a specific deployment folder matching the business unit. Using the Visual Studio Configuration Manager, we can simply flip to the environment we want to deploy to, and the reports are published to the correct environment and folder structure.

My problem lies with the common data sources. We are using a single master Common Data Sources folder to hold all of the data sources. The trick is that each and every reporting folder seems to have to have its own copy of the data source in Visual Studio. There does not seem to be an easy way to change the data sources for the reports when you publish to the various environments, i.e. development, test, production, etc.

Ideally, we would have a single project for the common data sources that all reporting projects and associated folders would map to, and we would have a way to associate the appropriate data source for each environment when we deploy.

I'm looking for best practices on how to set up data sources for development and deployment in an enterprise environment that uses Visual Studio to develop and publish reports. We have 3 environments, 6 data sources per environment, and about 20 reporting folders/projects in Visual Studio. That's 360 changes that have to be managed when deploying reports. Is there a best-practices way to do this?

There has got to be a better way. Can anyone give me some insight into how to set this up?

Thanks!

View 8 Replies View Related

How To Validate Tables In DTS

Jan 4, 2000

Hi,

Can anyone suggest how to use DTS to validate and transfer data from a couple of tables in one database to another server's database?
Is there a procedure for passing variables (validations) while using DTS?
Can I use the following code for validation in DTS? If so, can anyone direct me on how to use it?


For example"

DECLARE @sourcecolumn varchar(100)

DECLARE CUR_X CURSOR FOR
SELECT sourcecolumn FROM sourcetable
WHERE sourcetable.sourcecolumn <> [some value]
ORDER BY sourcetable.sourcecolumn

OPEN CUR_X
FETCH NEXT FROM CUR_X INTO @sourcecolumn
WHILE (@@FETCH_STATUS = 0)
BEGIN
    -- copy the matching rows for each qualifying source value
    INSERT INTO destinationserver.destinationtable (destinationcolumns)
    SELECT sourcecolumns
    FROM sourcetable
    WHERE sourcecolumn = @sourcecolumn

    FETCH NEXT FROM CUR_X INTO @sourcecolumn
END
CLOSE CUR_X
DEALLOCATE CUR_X
I appreciate your time.

View 1 Replies View Related

Validate Telephone No.

Aug 6, 2004

I want this column to accept telephone numbers in this format:
000-000-0000

What validation expression do I use, or how can I get the db to accept this telephone number format?
I use varchar for the data type.
thanks
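A sketch using a CHECK constraint (the table and column names are hypothetical); each [0-9] in the LIKE pattern matches exactly one digit, and the fixed-length pattern enforces the overall format:

ALTER TABLE dbo.Contacts
ADD CONSTRAINT CK_Contacts_Phone
CHECK (Phone LIKE '[0-9][0-9][0-9]-[0-9][0-9][0-9]-[0-9][0-9][0-9][0-9]');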

View 2 Replies View Related

How Do You Validate And Scrub Using DTS?

Feb 11, 2004

When I use the DTS GUI and insert a "Bulk Insert Task" the main tab says:

"Import text files into SQL Server. You cannot validate, scrub, or transform data using this task".

So my question is, what should you use to validate and scrub?

In particular, I have a fixed-format text file with some occasional bad records (e.g. wrong length, empty record). What should I be using? If you suggest VBScript, could you show me some examples? I'm new to VBScript.

Thanks!
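One Transact-SQL alternative to VBScript (a sketch; the path, table name, and the expected record length of 80 are all assumptions): bulk-load each raw line into a permissive staging table, then filter the bad records there.

CREATE TABLE dbo.RawStage (RawLine varchar(1000) NULL);

BULK INSERT dbo.RawStage
FROM 'C:\data\input.txt'
WITH (ROWTERMINATOR = '\n');   -- assumes no tab characters in the data,
                               -- since tab is the default field terminator

-- Bad records: empty, or the wrong length for the fixed format.
SELECT RawLine
FROM dbo.RawStage
WHERE RawLine IS NULL
   OR DATALENGTH(RawLine) <> 80;   -- 80 is an assumed fixed record length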

View 2 Replies View Related

How To Validate Timestamp

Aug 31, 2013

How do I validate a timestamp? I have tried many times but am not getting the correct result. How do I check whether the source data is a timestamp or not? I would like the result to be 1 if the data is a valid timestamp and 0 if it is not. I am using code like this:

CASE
WHEN TRY_CONVERT(datetime2(7), OE_TECH_VLD_FROM_DTTM) IS NOT NULL THEN 1  -- valid timestamp (TRY_CONVERT requires SQL 2012+)
ELSE 0
END

View 7 Replies View Related

Validate By Column Name

Jan 17, 2006

Hi,
I would like to validate a large file using an IS package before importing it into a table using IS. The validation rules are stored in the database against each column name.

Questions:
1. Is there a way I can get the column names of the input (coming from the file)? (So I could check them against the validation table.)
2. Can I store these rules in memory, maybe using an array? If so, how do I create an array?
3. What would be the best way to go about doing this? (I really appreciate any ideas.)

Appreciate your help...

View 1 Replies View Related

DTExec /validate

May 23, 2008

Hi, with a data flow that has the DelayValidation property set to true, DTExec will not try to validate it when I call it with the /Validate option. Is there a way to see whether the data flow validates even though the DelayValidation property is set to true?

Thanks,

View 3 Replies View Related






