I am working on a coupon redemption project that appears to require dynamic data validation, and I need some help. For example, a coupon could require a person to buy XXX number of XXX items from XXX manufacturer before the transaction is approved. The logical operator between each validation could also be "AND" or "OR". My first attempt at this has been something along the lines of creating a coupon table with a child table of validation logic. However, being able to apply this logic (and deciding at what layer) has turned out to be a challenge. I am very concerned about speed, as all requirements have to be met before a transaction is approved. Any help with trying to figure out the best solution for this would be great!
Coupons
C_ID
C_DESC
MFG_ID
MFG_OFFERNUM
Coupon_Validation
CV_ID
C_ID
MFG_ID (Manufacturer)
MFG_PARTNUM (Manufacturer Part Number)
CV_QTYREQUIRED (Number to meet requirements)
CV_ITEMLOGIC (AND, OR)
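For what it's worth, here is a minimal T-SQL sketch of one way to apply the child-table logic at the database layer. It assumes a hypothetical TransactionItems table (TransactionID, MFG_ID, MFG_PARTNUM, Qty) holding the items in the transaction being checked, and one simple reading of the logic: every 'AND' rule must be met, and at least one 'OR' rule (if any exist) must be met; your exact AND/OR grouping may differ.

DECLARE @C_ID INT, @TransactionID INT
SET @C_ID = 1
SET @TransactionID = 100

IF NOT EXISTS (               -- no unmet 'AND' rule ...
       SELECT 1
       FROM Coupon_Validation cv
       WHERE cv.C_ID = @C_ID
         AND cv.CV_ITEMLOGIC = 'AND'
         AND cv.CV_QTYREQUIRED > (SELECT COALESCE(SUM(ti.Qty), 0)
                                  FROM TransactionItems ti
                                  WHERE ti.TransactionID = @TransactionID
                                    AND ti.MFG_ID = cv.MFG_ID
                                    AND ti.MFG_PARTNUM = cv.MFG_PARTNUM))
   AND (NOT EXISTS (SELECT 1 FROM Coupon_Validation  -- ... and either no 'OR' rules exist ...
                    WHERE C_ID = @C_ID AND CV_ITEMLOGIC = 'OR')
        OR EXISTS (                                  -- ... or at least one 'OR' rule is met
       SELECT 1
       FROM Coupon_Validation cv
       WHERE cv.C_ID = @C_ID
         AND cv.CV_ITEMLOGIC = 'OR'
         AND cv.CV_QTYREQUIRED <= (SELECT COALESCE(SUM(ti.Qty), 0)
                                   FROM TransactionItems ti
                                   WHERE ti.TransactionID = @TransactionID
                                     AND ti.MFG_ID = cv.MFG_ID
                                     AND ti.MFG_PARTNUM = cv.MFG_PARTNUM)))
    PRINT 'Approved'
ELSE
    PRINT 'Rejected'

Because this stays in one EXISTS-based statement over the keys (C_ID, MFG_ID, MFG_PARTNUM), it tends to stay fast with proper indexes even as the number of rules grows.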
I am currently in the process of migrating data from Sybase to SQL Server and would like to know how to test the migrated data.
As of now, we took one table's data from both source and destination and compared it in Excel to check that the migrated data looks good (note, we used SSIS to migrate the data). However, I would like to know if there are any other best and easy ways to approach data validation post-migration.
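One common approach, sketched below under the assumption that the source is reachable from the SQL Server side, e.g. through a linked server (SYBASE_LINK and the table names here are hypothetical): compare row counts, then use EXCEPT in both directions so any differing row shows up:

-- row counts should match
SELECT (SELECT COUNT(*) FROM SYBASE_LINK.mydb.dbo.Customers) AS source_rows,
       (SELECT COUNT(*) FROM dbo.Customers)                  AS target_rows;

-- rows present on one side but not the other; both queries empty = identical data
SELECT * FROM SYBASE_LINK.mydb.dbo.Customers
EXCEPT
SELECT * FROM dbo.Customers;

SELECT * FROM dbo.Customers
EXCEPT
SELECT * FROM SYBASE_LINK.mydb.dbo.Customers;

This scales to any table without Excel, and wrapping it in a loop over a table list makes it repeatable after each migration run.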
At run time, suppose the values are @SSN = '999-000-000' and @State = 'ABC'.
Now the result is displayed with the state data as 'AB' only.
Output: 1 999-000-000 AB
Instead, it should give a system-generated error.
Here I have 2 questions: 1. Why is it taking the first 2 characters? 2. Why is there no system-generated error for the length?
I can do validation with the LEN function for these 2 variables, but if I have 100 variables that is not a feasible approach. So, what is the reason behind this?
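For what it's worth, this is the documented behavior for variables and parameters: assuming @State was declared as CHAR(2) or VARCHAR(2), assignment silently right-truncates the value instead of raising an error (truncation errors apply to inserts/updates into table columns, not to variable assignment). A minimal repro:

DECLARE @State CHAR(2)
SET @State = 'ABC'   -- no error: assignment to a variable silently truncates
SELECT @State        -- returns 'AB'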
I have something like this:

SportInfoTable
SportID  UserID  CurrentlyRegisteredOnSport
1        1       False
2        1       True
2        2       True
3        3       True
4        1       False

SportTable
ID  NameOfSport
1   Football
2   Baseball
3   Tennis
4   Squash
5   Hockey
6   Basketball

I'm trying to write an SP that will get all the sports where CurrentlyRegisteredOnSport is false, plus all the sports the user has not registered on at all, i.e. the user ID for a particular sport is not mentioned in SportInfoTable (I will be passing the user ID as a parameter). So, for example, for user ID 1 I want it to return from SportTable:

ID  NameOfSport
1   Football
4   Squash
5   Hockey
6   Basketball

How can I do this? I hope it makes sense. Thanks.
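A minimal sketch of such a procedure, assuming CurrentlyRegisteredOnSport is a BIT column (True/False above): it returns every sport for which the given user has no row flagged as currently registered:

CREATE PROCEDURE dbo.GetUnregisteredSports
    @UserID INT
AS
BEGIN
    SELECT s.ID, s.NameOfSport
    FROM SportTable s
    WHERE NOT EXISTS (SELECT 1
                      FROM SportInfoTable si
                      WHERE si.SportID = s.ID
                        AND si.UserID  = @UserID
                        AND si.CurrentlyRegisteredOnSport = 1)
END

Note that under the rule as stated this also returns Tennis for user 1 (that user has no row for it); if sports registered by any user should be excluded instead, which is what the sample output suggests, drop the si.UserID predicate.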
Hi guys, I am fairly new to T-SQL. I am sure there are SPs or scripts I can use to create a procedure that will do the data validation in the staging table...
Clients send us data, and often some of the records have bad values... What I have to create is a process that will check for those values and update a flag in the staging table for each column where the data is not valid...
Please help me out if you have something that can be used for this.
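A minimal sketch of the flag-update pattern, assuming a hypothetical staging table with one flag column per validated column, one UPDATE per rule:

-- flag non-numeric amounts
UPDATE dbo.Staging SET AmountFlag = 1
WHERE ISNUMERIC(Amount) = 0;

-- flag unparseable dates
UPDATE dbo.Staging SET OrderDateFlag = 1
WHERE ISDATE(OrderDate) = 0;

-- flag missing mandatory fields
UPDATE dbo.Staging SET CustomerNameFlag = 1
WHERE CustomerName IS NULL OR LTRIM(RTRIM(CustomerName)) = '';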
Being relatively new to SSIS, I'm looking for advice, or a best practice, regarding data validation before extracting the data for a transformation.
One of my projects requires that certain data be validated in staging tables before it is loaded. The validations include checking for null values, verifying that a field is populated with appropriate values, etc. The entire batch of data (good records and bad records) may be rejected depending on the validations.
I have a couple of different thoughts on how this could be handled...
1. Run a series of validation queries on the data before executing an SSIS package
2. Run some kind of validation transformation (does one exist, or should I write a custom transformation?)
3. Place constraints on the target tables so that bad records error out on the load
4. Something else... I could be missing the completely obvious
#3 doesn't seem viable, as the entire load may be rejected if some of the data is bad...
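For option 1, a minimal sketch of the kind of query an Execute SQL Task could run first (table and rules hypothetical), storing the count in a package variable so a precedence constraint only lets the load run when it is zero:

-- returns the number of rule violations in the batch; 0 means the batch may load
SELECT COUNT(*) AS BadRows
FROM dbo.Staging
WHERE CustomerID IS NULL
   OR Status NOT IN ('A', 'I', 'P')
   OR ISDATE(OrderDate) = 0;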
Hello all... I am trying to validate that the new work date being imported from the text file does not exist in the table. In other words, I do not want duplicate data. This is what I have:

SqlCommand cmd = new SqlCommand("SELECT 1 FROM WorkTable WHERE WorkDate = @WorkDate", conn); // WorkTable stands in for my table name
cmd.Parameters.Add("@WorkDate", SqlDbType.DateTime).Value = newWorkDate;
using (SqlDataReader dr = cmd.ExecuteReader())
{
    if (dr.HasRows)
    {
        // the work date already exists: skip this import row
    }
}

How does that look? Am I in the ballpark?
I have one question regarding SQL Server replication.
I have set up SQL Server merge replication at my work. One of our subscribers has an unreliable connection to the publisher. After setting up replication to this subscriber, it happened once that the connection was really bad, so the merge agent for this subscription was failing before synchronizing data changes; after that, the connection improved and the merge agent completed the synchronization successfully.
After a while I decided to validate all subscriptions, and I discovered that this subscription has a different row count for many articles, although the merge agent was finishing all synchronizations successfully!
So my question is:
How could this happen? I mean, finishing the sync with success and still having different records!
I thought that SQL Server would keep making the sync fail simply because it can't do it!
Is that right, or is there something I misunderstood here?
How can I prevent this from happening again? At the very least, I want SQL Server to inform me by failing the merge agent synchronizations.
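On the prevention side: validation can be requested from the publisher so that the next merge run compares row counts (and optionally checksums) per article and reports a failure when they diverge. A sketch, run at the publisher on the publication database; the publication name is hypothetical, and @level = 3 requests row count plus binary checksum:

EXEC sp_validatemergepublication
     @publication = N'MyPublication',
     @level = 3;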
I'm new here, so hopefully I'm asking this question in the correct forum. I have a flat file that contains numbers, and I need to verify that they begin with certain prefixes so they load to the correct client. For example, in the flat file, if I'm loading data for client A and their account numbers begin with 045XXXXXXX, then the data loads. But if there is a record that begins with 037XXXXXXX, it should be loaded to client B instead, and that record gets written to an error file.
So, to summarize what I need: I'm looking for a check that kicks out records depending on whether I'm loading client A's data or client B's data.
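In SSIS this is usually a Conditional Split transformation with one output per client, using a condition such as SUBSTRING(AccountNumber, 1, 3) == "045" (the column name is assumed), sending the unmatched default output to a flat-file error destination. If the file is staged into a table first, a T-SQL sketch of the equivalent routing (all names hypothetical):

-- route staged rows by account-number prefix
INSERT INTO dbo.ClientA_Load (AccountNumber, Amount)
SELECT AccountNumber, Amount FROM dbo.Staging WHERE AccountNumber LIKE '045%';

-- anything else goes to the error table for review
INSERT INTO dbo.LoadErrors (AccountNumber, Amount)
SELECT AccountNumber, Amount FROM dbo.Staging WHERE AccountNumber NOT LIKE '045%';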
Hello everyone, I have spent hours trying to figure out a validation issue with SQL Server Reporting Services, with no luck. I am hoping someone can help me with it. Basically, I am looking for a way to validate the data entered by the user in parameterized reports. Say a field is defined as the int data type; if the user enters a string, the report should display a message saying a string cannot be entered. Currently, my report does not check the data type the user entered; it runs the underlying query and displays the error thrown by the DB (Oracle in my case). I get: An error has occurred during report processing. Cannot read the next data row.... ORA-01722: invalid number
Instead of this, validation even before the query is executed would have been perfect. Is there a way to do that?
I want to make a conditional split based on the data type provided by the input. For example: if the incoming column (Column x) is of a numeric data type, then pass; else, do not pass.
I need some suggestions on validating a string of text based on some business rules using T-SQL. I have a string similar to the following:
This is some text (1.00); this is some more (there's more text here) text (2.00); this is yet more text (1.10)
The above example illustrates a valid string. You'll notice the multiple sets of parens. The parens that contain numbers must be followed by a semicolon, except for the last set of parens. Furthermore, any parens followed by a semicolon must contain only numbers and not text. I can easily identify the positions in the string by using PATINDEX.
Get the closing paren not followed by a semicolon, ignoring the last closing paren:
PATINDEX('%)[^;]', @string)

Get the closing paren followed by a semicolon, ignoring the last closing paren:
PATINDEX('%);%', @string)
My question is: is there a way to quickly validate the data between the parens without visiting each character? This is SQL 2000, so using CLR regular expressions is out.
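One idea that avoids a character-by-character walk: pull the span between a '(' and the next ');' out with SUBSTRING, then test the whole span at once with a single negated LIKE character class (all SQL 2000-safe). A sketch for one pair; you would loop over pairs using the PATINDEX positions you already have, and the embedded text-only parens still need the 'not followed by a semicolon' handling you described:

DECLARE @string VARCHAR(8000), @open INT, @close INT, @inner VARCHAR(8000)
SET @string = 'This is some text (1.00); this is some more (2.00); last (1.10)'

SET @open  = CHARINDEX('(', @string)
SET @close = CHARINDEX(');', @string)
SET @inner = SUBSTRING(@string, @open + 1, @close - @open - 1)

-- one pattern test validates the whole span: any character that is not a digit or '.' fails it
IF @inner LIKE '%[^0-9.]%' PRINT 'invalid' ELSE PRINT 'valid'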
I am working in SQL Server 2008 R2, and below is my sample research query. I am trying to get the previous 6 months' data.
WITH CutomMonths AS (
    SELECT UPPER(CONVERT(VARCHAR(3), DATENAME(MONTH, DATEADD(MM, DATEDIFF(MM, 0, GETDATE()) - N, 0)))) AS Month,
           DATEADD(MM, DATEDIFF(MM, 0, GETDATE()) - N, 0) AS startdate,
           DATEADD(MM, DATEDIFF(MM, 0, GETDATE()) - N + 1, 0) AS enddate
    FROM ( VALUES (1)
[Code] ...
The current output I am getting:
Expected Output:
I found out why for the April month I didn't get the $20: the startdate of my perks CTE is '2014-04-03'. If it were '2014-04-01', then I would get the expected output.
But I should not change the date on perks. How can I work around this date issue and consider the month instead, so as to get the expected output?
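One way, sketched under the assumption that the perks CTE exposes a date column and an amount (PerkDate and Amount here, both hypothetical names): join on the month range rather than on the exact start date, so '2014-04-03' still lands in April:

-- range join: any day inside the month matches its CutomMonths row
SELECT c.Month, SUM(p.Amount) AS Total
FROM CutomMonths c
LEFT JOIN perks p
    ON p.PerkDate >= c.startdate
   AND p.PerkDate <  c.enddate
GROUP BY c.Month;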
We have the following scenario: We receive CSV files every month for which SSIS packages were built to process the data. The following problems occur from time to time:
1. The structure of the CSV file changed (e.g. a column was added or removed)
2. There were no footers in the data, but now footers have started to appear
3. The date format changed (e.g. it used to be mm/dd/yyyy, but became mm.dd.yyyy)
4. The number format changed (e.g. from 2000 to 2,000)
Currently we have a person who manually opens each file and, using our "validation document", checks to ensure that none of these or similar problems occur. We would like to move away from this manual process if possible. I understand that items 3 and 4 could be caught by loading the data into a staging table with VARCHAR data types and performing validation before moving it any further.
Item 2 is a bit questionable (meaning that depending on the footer size, the SSIS load could fail or not).
Item 1, however, is a sure failure for an SSIS package that loads the data directly into a table.
Thus I feel the two possible options are:
1. Create a custom script that will run through the file row by row, apply all the necessary validations, and report an error or continue if everything checks out
2. Use some 3rd party tool to validate the files (semi-manually) before kicking off the SSIS processing.
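For the staging-table validation of items 3 and 4, a minimal sketch (table and column names hypothetical), with everything landed as VARCHAR first:

-- flag rows whose date or number columns don't parse under the expected formats
SELECT *
FROM dbo.CsvStaging
WHERE ISDATE(REPLACE(DateCol, '.', '/')) = 0        -- tolerates mm.dd.yyyy as well as mm/dd/yyyy
   OR ISNUMERIC(REPLACE(NumberCol, ',', '')) = 0;   -- tolerates thousands separators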
I'm new to SSIS. I want to develop a package for data validation.
FirstName
1. Mandatory field check: if NULL, reject the record
2. If the field length > 50, reject the record
SSN
1. If the field length > 12, reject the record
2. If the SSN is not in a valid format, issue a warning and process the record without the SSN value
3. Valid format: 9 numeric digits should be present after stripping off all non-numeric characters
4. Only send the 9 digits to MDM
Like these, I have 30 rules. And I have to show an error message if a validation fails, like "Mandatory field is missing".
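The SSN rule is the fiddly one in T-SQL; here is a sketch of the 'strip non-numerics, require 9 digits' test for a staging-table validation step (the function name and table are mine):

CREATE FUNCTION dbo.StripNonNumeric (@s VARCHAR(50))
RETURNS VARCHAR(50)
AS
BEGIN
    -- remove offending characters one at a time until only digits remain
    WHILE PATINDEX('%[^0-9]%', @s) > 0
        SET @s = STUFF(@s, PATINDEX('%[^0-9]%', @s), 1, '');
    RETURN @s;
END;
GO

-- keep the cleaned 9-digit SSN, or NULL it out (and warn) when the format is invalid
SELECT CASE WHEN LEN(dbo.StripNonNumeric(SSN)) = 9
            THEN dbo.StripNonNumeric(SSN) END AS CleanSSN
FROM dbo.Staging;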
I wanted to know more about validation of SSRS parameters. I have a simple report that has a parameter called startdate of the DateTime data type. The DateTime parameter in SSRS takes manual input as well, so the user can enter any junk value. I want to ensure that the input parameter is in the correct format, and I want to display an error message when the format is incorrect. My report has the following VB code for validation:
Public Function Validate(ByVal startdate As String) As Boolean
    If IsDate(startdate) = True Then
        Return True
    Else
        Return False
    End If
End Function
And my report has a textbox whose expression property is set to:
=Code.Validate(Parameters!startdate.Value)
The textbox on the report has to display whether the entered date is valid or not.
But when I enter an erroneous date, SSRS doesn't render the report and throws a generic error. This happens even before the code written for validating the parameter executes.
Also, I couldn't find a way to disable manual input for the DateTime parameter. Even that would solve the problem.
Another alternative was to make the startdate parameter a string, but I want the calendar control to be provided for the user.
In my database, some of the stored procedures get their data from XML nodes, so I need to implement validation of the XML data to prevent SQL injection.
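One relevant point, sketched below: if the values are pulled out of the XML with typed .value() calls and then used as parameters (never concatenated into dynamic SQL strings), the shredding itself is injection-safe, because the extracted value is treated as data rather than SQL text:

DECLARE @x XML = N'<rows><row id="1" name="O''Brien; DROP TABLE t--"/></rows>';

-- typed extraction: the hostile-looking name comes back as a plain string value
SELECT r.value('@id',   'INT')          AS id,
       r.value('@name', 'NVARCHAR(50)') AS name
FROM @x.nodes('/rows/row') AS t(r);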
I am trying to execute a package so that it loads data from the Excel file to the SQL Server database. When I execute it, it prompts the following error message and 1 warning. The Excel file has three columns: Week, Item and Value.
Error 4 Validation error. Data Flow Task: OLE DB Source [94]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80040E37 Description: "ORA-00942: table or view does not exist ". Test - GET NW PERF 1.dtsx 0 0
Warning
Warning 1 Validation warning. Data Flow Task: OLE DB Destination [36]: The external metadata column collection is out of synchronization with the data source columns. The column "DAY" needs to be added to the external metadata column collection. The column "TCH_AVAIL" needs to be added to the external metadata column collection. The column "PDROP" needs to be added to the external metadata column collection. The column "P_HR" needs to be added to the external metadata column collection. The column "SFAIL" needs to be added to the external metadata column collection. The "external metadata column "VALUE" (90)" needs to be removed from the external metadata column collection. The "external metadata column "ITEM" (89)" needs to be removed from the external metadata column collection. Not in use - GET NW STATS.dtsx 0 0
I created a merge replication a few days ago. It works correctly, but today when I click synchronize publications, some of them encounter this error:
Data validation failed for one or more articles. When troubleshooting, check the output log files for any errors that may be preventing data from being synchronized properly. Note that when error compensation or delete tracking functionalities are disabled for an article, non-convergence can occur. (Source: MSSQL_REPL, Error number:
I want to do some error checking at my company. For this, we have a selection of different tables, and the data needs to meet various validation rules; otherwise it is classed as an error.
To deal with this I'm currently thinking of this approach:
1. Create a view pulling all of the various data together from the multiple tables.
2. Create an empty 'errors' data table.
3. Create an Excel file with a button to call a Check for Errors script.
Then in the script:
1. Clear the 'errors' data table.
2. Call multiple scripts, each of which uses the new view, applies the checks for that specific error, and writes any erroring data into the 'errors' data table (along with a text string holding the unique error code for filtering/sorting purposes); a sketch of one such check follows this list.
3. After calling all the scripts, the table can be refreshed in Excel, where, used with a pivot table, it can show the various errors and let us drill down into all the data so we can fix them.
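A minimal sketch of one of the step 2 scripts, with the view, table, and rule all hypothetical:

-- one script per rule: write violating rows into the errors table with their code
INSERT INTO dbo.Errors (ErrorCode, RecordID, Detail)
SELECT 'ERR001', v.RecordID, 'Missing customer name'
FROM dbo.vAllData v
WHERE v.CustomerName IS NULL;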
Also, ideally, I'd like some way to write comments in an Excel column for each entry and error code, and be able to write those back into a comments table.
The issue is in the data flow for loading and setting the Fact table dimension keys (the dimensions are all loaded fine). After 16 rather pedestrian Lookup Transformations, I have an escalating problem adding additional Lookup transforms to the Data Flow. The problem is not in execution; the problem is adding more transforms in design mode.
Lookup #   Fields in Data Flow   Time to validate that lookup
< 17       47                    Sub-second
17         48                    2 sec
18         49                    4 sec
19         50                    8 sec
20         51                    16 sec
21         52                    32 sec
22         53                    64 sec
While I'm intrigued by the mathematical progression that is forming here, the issue is that I have at least 6 more Lookups to perform. I hope you can see my dilemma.
I have gotten to where it takes a little over 4 minutes each to validate the Lookup transform and its associated Derived Column transform and Union transform (12 minutes total). Not only does this add many idle minutes to each design step, BUT it breaks the debugger, as it pre-validates the ENTIRE data flow before it ever switches into debugging mode.
Some notes:
1. It doesn't matter what order the Lookup transforms occur in; the timings are exactly the same.
2. I tried many Data Flow execution optimizations, but they don't improve the validation times (or even get a chance to improve the execution times!)
I realize this may be somewhat of a unique problem.
I have a package set up with basically two consecutive data flows. The first flow takes data from an OLE DB source and stores it in a flat file destination. The second flow uses this same flat file as a source, alters the data, and stores the data in the same flat file, overwriting the old file. I set DelayValidation to True on the flat file. Still, here are the error messages I am receiving:
Error: 0xC020200E at DO, Flat File Destination [7676]: Cannot open the datafile "C:\Temp.txt".
Error: 0xC004701A at DO, DTS.Pipeline: component "Flat File Destination" (7676) failed the pre-execute phase and returned error code 0xC020200E.
I am new to SSIS, so I'm sure I have a setting wrong or something. Is the problem that SSIS is trying to write to a file from which it is simultaneously reading data?
In C#/.NET I have the possibility of creating validations of my data with regular expressions. Does SQL Server have the same feature? I would like to do data validation of all my INSERT statements inside SQL Server. Is that possible?
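Not full regular expressions: out of the box, T-SQL only has the LIKE/PATINDEX character-class patterns (full regex needs a SQLCLR function). The closest built-in equivalent for validating inserts is a CHECK constraint with a pattern; a sketch with a hypothetical table and column:

-- enforced on every INSERT/UPDATE; values that don't match the pattern are rejected
ALTER TABLE dbo.Customers
ADD CONSTRAINT CK_Customers_Phone
CHECK (Phone LIKE '[0-9][0-9][0-9]-[0-9][0-9][0-9][0-9]');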
I am currently working with C and SQL Server 2012. My requirement is to bulk-fetch records and insert/update them in another table with some business logic. How do I do this?
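If the business logic can be expressed in set-based form, a single MERGE statement (available since SQL Server 2008) handles the insert-or-update pass in one go; a sketch with hypothetical tables and a hypothetical rule:

MERGE dbo.TargetTable AS t
USING dbo.SourceTable AS s
    ON t.Id = s.Id
WHEN MATCHED AND s.Amount <> t.Amount THEN   -- example rule: only touch changed rows
    UPDATE SET t.Amount = s.Amount
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Amount) VALUES (s.Id, s.Amount);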
I'm developing a database-driven program using SQL Server 2000 and Visual Basic 2005.
Most of the guys say professional programming means doing the validation stuff (such as the constraints and data-integrity stuff like "[0-9][1]" and the use of the LIKE and IN keywords, etc.) in the database itself.
Say I did the data validation constraints in SQL Server itself, and now I connect the database to the interface made in VB 2005. Say a person enters some invalid data through the interface, and the error messages are generated by SQL Server. How am I to display the SQL Server-generated error messages in the VB-made interface?
Please help me... If the question is not clear, please say so and I can explain it further.
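On the T-SQL side, one pattern that helps here is giving each constraint a meaningful name: a violation raises error 547, and the error text contains the constraint name, which the VB code can catch (as a SqlException) and map to a friendly message. A sketch with hypothetical names:

-- 'CK_Customers_Zip' will appear in the error 547 message text,
-- so the client can translate it into a user-friendly string
ALTER TABLE dbo.Customers
ADD CONSTRAINT CK_Customers_Zip
CHECK (Zip LIKE '[0-9][0-9][0-9][0-9][0-9]');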
I have a customer table with postcode and suburb fields, plus customer info which is manually entered by data-entry people...
I am trying to compare the entries against a postcode table holding the correct postcodes, which has postcode and suburb fields. Based on the postcode entered in the customer table, the suburb should be the same as the suburb in the postcode table; if they are not the same, output them to a table for manual checking. How would I go about this?
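A minimal sketch of that mismatch check, with table and column names assumed from the description:

-- capture customers whose suburb disagrees with the reference table for their postcode
INSERT INTO dbo.SuburbMismatches (CustomerID, Postcode, EnteredSuburb, ExpectedSuburb)
SELECT c.CustomerID, c.Postcode, c.Suburb, p.Suburb
FROM dbo.Customers c
INNER JOIN dbo.Postcodes p ON p.Postcode = c.Postcode
WHERE c.Suburb <> p.Suburb;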
Can anyone see where my logic may have gone astray? A user would be attempting to update a gridview row; I'm storing the amount of an item and the qty they purchased. If the order was for $150 and the grid has two items, 1 x $50 and 2 x $50, the order already totals $150. So if the user tries to change the first item to a qty of 2 x $50, the total would be $200, which would be more than the total payment of $150. I want to then tell them they cannot do this.

@AP_ID INT,
@AI_ID INT,
@PurchaseAmount MONEY,
@PurchaseQTY INT,
@LastUpdate DATETIME,
@LastUpdateBy NVARCHAR(50),
@Receipt_ID INT
AS
DECLARE @current_item_total DECIMAL(18,2)
DECLARE @new_item_total DECIMAL(18,2)
DECLARE @total_payments DECIMAL(18,2)
DECLARE @current_purchase TABLE (receipt_id INT, purchase_amt MONEY)

INSERT INTO @current_purchase
SELECT tblPurchase.Receipt_ID,
       (tblPurchase.PurchaseAmount * tblPurchase.PurchaseQty) AS purchase_amt
FROM tblPurchase
INNER JOIN tblReceipts ON tblPurchase.Receipt_ID = tblReceipts.Receipt_ID
WHERE tblReceipts.Receipt_ID = @Receipt_ID
  AND tblPurchase.AP_ID <> @AP_ID  -- exclude the row being edited, otherwise its old value is counted twice below

-- Get total already saved to record (0 when the edited row is the only one)
SELECT @current_item_total = ISNULL(SUM(purchase_amt), 0) FROM @current_purchase

-- Get total amount of payment saved to record
SELECT @total_payments = SUM(tblReceipts.AmountPaid)
FROM tblReceipts
WHERE tblReceipts.receipt_id = @Receipt_ID

-- Get total amount user is trying to save to record
SET @new_item_total = (@PurchaseAmount * @PurchaseQTY)

-- If current total plus new total is greater than total payment, tell user they cannot do this
IF ((@current_item_total + @new_item_total) > @total_payments)
BEGIN
    SELECT 'You are attempting to add more than your total payment.' AS MESSAGE
END
ELSE
BEGIN
    -- UPDATE ROW
    UPDATE tblPurchase
    SET [AI_ID] = @AI_ID,
        [PurchaseAmount] = @PurchaseAmount,
        PurchaseQTY = @PurchaseQTY,
        LastUpdate = @LastUpdate,
        LastUpdateBy = @LastUpdateBy
    WHERE [AP_ID] = @AP_ID

    SELECT 'Item was updated.' AS MESSAGE
END
Trying to bind some data to a datalist for a report.
The user selects an industry from a dropdown list, and then I dump all records for that industry. However, in order to parse some of the record field values into names (i.e. from a 1 to the actual company name), for some records I have to read TABLE_ONE and for other records TABLE_TWO, depending on the value of FIELD_ONE.
If FIELD_ONE = "A" then I get the NAME from TABLE_ONE. If FIELD_ONE = "B" then I get the NAME from TABLE_TWO. If FIELD_ONE = "C" then I get the NAME from TABLE_THREE.
I'm lost as to how to get started on this. I thought about adding IF statements to my query, but these won't work because I'm not passing in the value of FIELD_ONE ahead of time; it's part of the query. So I thought maybe I could do a pre-read and store all the FIELD_ONE values in an ArrayList and pass these in as parameters, but the stored proc is only being called once, so that won't work.
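For what it's worth, this kind of per-row branching can happen inside the single query with a CASE expression over LEFT JOINs, so no pre-read or parameter list is needed; a sketch using the table and field names from the post (the main table, join keys, and parameter are assumptions):

SELECT r.*,
       CASE r.FIELD_ONE
            WHEN 'A' THEN t1.NAME
            WHEN 'B' THEN t2.NAME
            WHEN 'C' THEN t3.NAME
       END AS ResolvedName
FROM dbo.Records r                                  -- placeholder for the main records table
LEFT JOIN TABLE_ONE   t1 ON t1.ID = r.NameKey       -- join keys assumed
LEFT JOIN TABLE_TWO   t2 ON t2.ID = r.NameKey
LEFT JOIN TABLE_THREE t3 ON t3.ID = r.NameKey
WHERE r.Industry = @Industry;

Only the branch selected by FIELD_ONE is used per row, so rows with no match in the other tables are unaffected by the LEFT JOINs.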