I added Validate Subscriptions to my publication using the procedure outlined in BOL (listed below). I no longer want to validate this publication. How can I stop the validation process?
To validate transactional data using SQL Server Enterprise Manager
At the Distributor, expand Replication Monitor, expand Publishers, and then expand a specific Publisher.
Right-click a transactional publication, and then click Validate subscriptions.
Choose whether you want to validate all subscriptions or just specific subscriptions, and if you want to validate specific subscriptions, select those in the text box.
To choose the type of validation, click Validation Options.
Choose whether you want to compute a fast rowcount based on cached table information, compute an actual row count by querying the tables directly, or compute a fast row count and if differences are found, compute an actual row count.
You can also enable Compare checksums to validate data, use a binary checksum (if the Subscriber is running SQL Server 2000), and choose to stop the Distribution Agent after validation has completed.
I doubt this is possible, but can someone think of a way to change the email address used for sending report subscriptions based on the report or subscription?
It's a need that I've heard from a number of different clients. Scenario: a company has one reporting services server with reports running from numerous departments. Report subscriptions are sent to internal and external email addresses and there's a business need to use different "from" addresses based on the report (or audience).
Can anyone suggest how to use DTS to validate and transfer data from a couple of tables in one database to a database on a different server? Is there a procedure for passing variables (validations) while using DTS? Can I use the following code for validation in DTS, and if so, can anyone show me how to use it?
For example:

DECLARE @SourceValue varchar(100)  -- variable to hold the fetched value; pick the real data type
DECLARE CUR_X CURSOR FOR
    SELECT SourceColumn
    FROM SourceTable
    WHERE SourceTable.SourceColumn <> [some value]
    ORDER BY SourceTable.SourceColumn
OPEN CUR_X
FETCH NEXT FROM CUR_X INTO @SourceValue
WHILE (@@FETCH_STATUS = 0)
BEGIN
    INSERT INTO DestinationServer.DestinationDB.dbo.DestinationTable (DestinationColumns)
    SELECT SourceColumns FROM SourceTable WHERE SourceColumn = @SourceValue
    FETCH NEXT FROM CUR_X INTO @SourceValue
END
CLOSE CUR_X
DEALLOCATE CUR_X
When I use the DTS GUI and insert a "Bulk Insert Task" the main tab says:
"Import text files into SQL Server. You cannot validate, scrub, or transform data using this task".
So my question is, what should you use to validate and scrub?
In particular I have fixed-format text file with some occasional bad records (e.g. wrong length, empty record). What should I be using? If you suggest vbscript could you show me some examples? I'm new to vbscript.
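One common approach (a sketch, not the only option, and the table and record length here are hypothetical): bulk-load the raw lines into a single-column staging table, then check record length and emptiness in T-SQL before parsing the fixed-width fields.

```sql
-- Hypothetical staging table: one raw line per row
CREATE TABLE StagingRaw (RawLine varchar(8000));

-- After BULK INSERT into StagingRaw, flag bad records
-- (assumes a fixed record length of 80 characters)
SELECT RawLine,
       CASE
           WHEN RawLine IS NULL OR LTRIM(RTRIM(RawLine)) = '' THEN 'empty record'
           WHEN LEN(RawLine) <> 80 THEN 'wrong length'
           ELSE 'ok'
       END AS ValidationResult
FROM StagingRaw;
```

Good rows can then be split into typed columns with SUBSTRING, and bad rows moved to an error table, all without VBScript.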
How do I validate a timestamp? I have tried many times but am not getting the correct result. How do I validate whether the source data is a timestamp or not? I would like the result to be 1 if the data is a timestamp and 0 if it is not. I am using code like this:
CASE WHEN TO_CHAR(OE_TECH_VLD_FROM_DTTM,'ddmonyyyy:hh:mi:ss.sssssss') then 1 ELSE 0 END
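Note that TO_CHAR is an Oracle/Informatica function, and CASE WHEN needs a boolean test, which is why the expression above does not work. If the data lives in SQL Server, a minimal sketch using the built-in ISDATE function (column name taken from the post, table name is a placeholder):

```sql
SELECT CASE WHEN ISDATE(OE_TECH_VLD_FROM_DTTM) = 1 THEN 1 ELSE 0 END AS IsTimestamp
FROM SourceTable;  -- SourceTable is a placeholder
```

ISDATE returns 1 when the value converts to a valid datetime and 0 otherwise, which matches the 1/0 result you describe.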
Hi, I would like to validate a large file using an IS package before importing it into a table using IS. The Validation rules are stored in the database against each column name
Questions:
1. Is there a way I can get the column names of the input (coming from the file)? (So I could check them against the validation table.)
2. Can I store these rules in memory, maybe using an array? If so, how do I create an array?
3. What would be the best way to go about doing this? (I really appreciate any ideas.)
Hi, with a data flow that has its DelayValidation property set to true, DTExec will not try to validate it when I call it with the /Validate option. Is there a way to see whether the data flow validates even though the DelayValidation property is set to true?
Hi all, I want to ask about validating two users (two user IDs and two passwords) with one button. My database table is called login and it has loginId and password columns. I have tried:
"SELECT * FROM login Where loginId = '" + hoName + "', '" + toName + "' AND password = '" + hoPassword + "', '" + toPassword + "'"
Please help me!
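The comma-separated WHERE clause above is not valid SQL; each loginId/password pair needs its own predicate. A sketch of one way to check both users in a single query (and note that building SQL by string concatenation invites SQL injection, so parameters are safer than inline strings):

```sql
-- Returns 2 only when BOTH credential pairs match
SELECT COUNT(*) AS MatchCount
FROM login
WHERE (loginId = @hoName AND password = @hoPassword)
   OR (loginId = @toName AND password = @toPassword);
```

If MatchCount comes back as 2, both users are validated (this assumes the two login IDs are distinct).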
I need to know how I can validate the SQL queries so that they don't contain any scripts.
I am not using any web application or any other application.
I am only using query analyzer to write statements and execute.
So all my co-workers are doing the same thing: inserting information into the database using Query Analyzer. At the end of each day the database files are shipped to another warehouse.
So where in SQL Server 2000/7.0 can I validate those statements before the data is inserted?
I don't want my co-workers to know that I am validating their queries.
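One way to validate inserts transparently, without your co-workers changing anything, is a table trigger: it fires inside their INSERT statements automatically, and Query Analyzer gives no hint it exists unless a row is rejected. A minimal sketch, with a hypothetical table and rule:

```sql
-- Hypothetical: reject inserted rows whose Amount column is negative
CREATE TRIGGER trg_ValidateOrders ON Orders
FOR INSERT
AS
IF EXISTS (SELECT * FROM inserted WHERE Amount < 0)
BEGIN
    RAISERROR('Validation failed: negative Amount.', 16, 1)
    ROLLBACK TRANSACTION
END
```

The trigger inspects the special inserted pseudo-table and rolls back the whole statement when any row fails the rule; replace the rule with whatever check you need.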
Is there any way to validate the input parameters for a report? For example, I want to restrict the value of one parameter to be less than 100. How can I achieve this?
Trying to install Backup Exec 12 which comes bundled with SQL Server 2005 Express. OS is a clean install of Swedish Windows Server 2003 Std R2, fully patched.
SQL fails to install, and the following is in the SQL summary-log:
Product: Microsoft SQL Server 2005 Express Edition
Product Version: 9.2.3042.00
Install: Failed
Log File: C:\Program\Microsoft SQL Server\90\Setup Bootstrap\LOG\Files\SQLSetup0002_VAXSRV02_SQL.log
Last Action: Validate_ServiceAccounts
Error String: SQL Server Setup could not validate the service accounts. Either the service accounts have not been provided for all of the services being installed, or the specified username or password is incorrect. For each service, specify a valid username, password, and domain, or specify a built-in system account. The logon account cannot be validated for the service SQL Server.
Error Number: 28075
Since the installation of SQL is bundled with the Backup Exec installation, there is no(?) possibility for me to specify usernames for the different services. The Backup Exec installation is initiated under the Domain Admin's login.
I suspect the problem occurs because the OS is not English, but I am not sure. I have installed earlier versions of Backup Exec with SQL Server 2005 Express on Swedish Windows Server 2003 before without issues. There is no help on Veritas/Symantec's homepage.
Hi, I'm pondering building a custom connection manager, and within the Validate() method I want to initiate a connection to a SQL Server instance, check something, and then close the connection at the end of the method. Here's the code I have so far:
public override Microsoft.SqlServer.Dts.Runtime.DTSExecResult Validate(Microsoft.SqlServer.Dts.Runtime.IDTSInfoEvents infoEvents)
{
    DTSExecResult execRes = DTSExecResult.Success;
    if (String.IsNullOrEmpty(_serverName))
    {
        infoEvents.FireError(0, "SqlConnectionManager", "No server name specified", String.Empty, 0);
        execRes = DTSExecResult.Failure;
    }
    else
    {
        // Establish a connection and check that it is pointing to an MDM DB
        SqlConnection sqlConnToValidate = new SqlConnection();
        sqlConnToValidate.ConnectionString = this._connectionString;
        try
        {
            sqlConnToValidate.Open();
            if (!IsMDMDatabase(sqlConnToValidate))
            {
                execRes = DTSExecResult.Failure;
            }
        }
        catch (Exception e)
        {
            infoEvents.FireError(0, "MDMConnectionManager", e.Message, String.Empty, 0);
            execRes = DTSExecResult.Failure;
        }
        finally
        {
            if (sqlConnToValidate.State != ConnectionState.Closed)
                sqlConnToValidate.Close();
        }
    }
    return execRes;
}
I'm worried about the overhead of establishing a connection every time Validate() is called. So the questions are
When does Validate() get called?
Does it get called for every component/task that uses it?
Does anyone suspect that what I'm doing here is necessarily a bad thing, or not?
Hi all, I'm new to SQL 2000. I have an ASP.NET login page that uses a SQL 2000 database. The login page takes a user ID and password. If the user enters the correct user ID and password, IE transfers to the main page; otherwise an error message is shown on the login page. My SP looks like this:

CREATE PROCEDURE dbo.Usp_Accounts_ValidateLogin
    @userid char(4),
    @EncPassword binary
AS
if (select count(*) from hhmxUserData where Userid = @userid and UserPWD = @EncPassword) > 0
    return 1
else
    return 0
GO

My ASP.NET code looks like this:

dim result As Integer
dim rowsAffected As Integer
myConnection.Open()
Dim command As SqlCommand = BuildIntCommand(storedProcName, parameters)
rowsAffected = command.ExecuteNonQuery()
result = CInt(command.Parameters("ReturnValue").Value)
myConnection.Close()
Return result

I tested it in SQL 2000 and it's OK, but when I execute it and retrieve the return value, it still returns 0. How can I create my SP correctly?
I have a temp table which is used to store data before inserting to the real permanent tables. All columns of the temp table have nvarchar type.
How do I validate the data in temp table before doing the actual insert? For example, I need to ensure a field is a valid Datetime (currently in temp of nvarchar) before inserting to a Datetime field. Can this be done using Transact-SQL?
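Yes, this can be done in Transact-SQL: ISDATE (and ISNUMERIC for numeric columns) flags which nvarchar values will convert cleanly before you touch the permanent table. A sketch, with hypothetical table and column names:

```sql
-- Report the rows whose nvarchar value will NOT convert to DATETIME
SELECT *
FROM #TempTable
WHERE ISDATE(SomeDateColumn) = 0;

-- Insert only the rows that pass validation
INSERT INTO dbo.RealTable (SomeDateColumn)
SELECT CONVERT(datetime, SomeDateColumn)
FROM #TempTable
WHERE ISDATE(SomeDateColumn) = 1;
```

One caveat: ISNUMERIC is looser than you might expect (it accepts currency symbols and exponent notation), so for strict integer checks a PATINDEX pattern test like the udfIsValidINT function later in this thread is safer.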
Here is the scenario:
1. The table has three columns: ID, Sqno, Adj.
2. The values for Adj are (0, 1, 2).
Case 1: Sqno should start at '001000' for Adj in (0, 2) and increment by 2, i.e. the next Sqno would be '001002', then '001004', and so on.
Case 2: Sqno should start at '001001' for Adj in (1) and increment by 2, i.e. the next Sqno would be '001003', then '001005', and so on.
Finally, when you order by Sqno and group by ID, it should be a running Sqno.
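If SQL Server 2005 or later is available, ROW_NUMBER can generate the two interleaved sequences described above; an untested sketch under that assumption (the table name and the ordering column are placeholders):

```sql
SELECT ID, Adj,
       RIGHT('000000' + CAST(
           CASE WHEN Adj IN (0, 2) THEN 1000 ELSE 1001 END
           + 2 * (ROW_NUMBER() OVER (
                      PARTITION BY ID, CASE WHEN Adj IN (0, 2) THEN 0 ELSE 1 END
                      ORDER BY ID) - 1)
           AS varchar(6)), 6) AS Sqno
FROM dbo.MyTable;  -- table name is a placeholder
```

Each (ID, even/odd group) partition gets its own row numbers 1, 2, 3, ..., so the Adj in (0, 2) rows produce 001000, 001002, 001004 and the Adj = 1 rows produce 001001, 001003, 001005.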
It seems the following queries are causing my DB connections to time out, but I can't figure out why.
These queries reside within my xml.config file.
The connection to the DB is fine and live. I get timeout errors every few hours or so.
Can anyone take a look at the queries which I pasted below and tell me why I keep getting timeout errors? PLEASE, I need all the help I can get.
<query name="Products" rowElementName="Product">
  <sql>
    <![CDATA[
      SELECT p.*, pv.VariantID, pv.name VariantName, pv.Price,
             pv.Description VariantDescription,
             isnull(pv.SalePrice, 0) SalePrice,
             isnull(SkuSuffix, '') SkuSuffix,
             pv.Dimensions, pv.Weight,
             isnull(pv.Points, 0) Points,
             sp.name SalesPromptName,
             isnull(e.Price, 0) ExtendedPrice
      FROM Product p
      JOIN productvariant pv ON p.ProductID = pv.ProductID
      JOIN SalesPrompt sp ON p.SalesPromptID = sp.SalesPromptID
      LEFT JOIN ExtendedPrice e ON pv.VariantID = e.VariantID AND e.CustomerLevelID = @CustomerLevelID
      WHERE p.ProductID = @ProductID
        AND p.Deleted = 0 AND pv.Deleted = 0
        AND p.Published = 1 AND pv.Published = 1
      ORDER BY p.ProductID, pv.DisplayOrder, pv.Name
    ]]>
  </sql>
  <queryparam paramname="@CustomerLevelID" paramtype="runtime" requestparamname="CustomerLevelID" sqlDataType="int" defvalue="0" validationpattern="" />
  <queryparam paramname="@ProductID" paramtype="request" requestparamname="ProductID" sqlDataType="int" defvalue="0" validationpattern="^d{1,10}$" />
</query>

<query name="Ratings" rowElementName="Rating">
  <sql>
    <![CDATA[
      SELECT ProductID,
             CAST(AVG(CAST(Rating AS decimal)) AS decimal(10,2)) AS Rating
      FROM Rating
      WHERE ProductID = @ProductID
      GROUP BY ProductID
      ORDER BY ProductID
    ]]>
  </sql>
  <queryparam paramname="@ProductID" paramtype="request" requestparamname="ProductID" sqlDataType="int" defvalue="0" validationpattern="^d{1,10}$" />
</query>

<query name="TotalRatings" rowElementName="TotalRating">
  <sql>
    <![CDATA[
      SELECT ProductID, Count(Rating) AS TotalRating
      FROM Rating
      WHERE ProductID = @ProductID
      GROUP BY ProductID
      ORDER BY ProductID
    ]]>
  </sql>
  <queryparam paramname="@ProductID" paramtype="request" requestparamname="ProductID" sqlDataType="int" defvalue="0" validationpattern="^d{1,10}$" />
</query>

<query name="popup" rowElementName="popit">
  <sql>
    <![CDATA[
      SELECT ProductID AS rateID
      FROM Product
      WHERE ProductID = @ProductID
      GROUP BY ProductID
    ]]>
  </sql>
  <queryparam paramname="@ProductID" paramtype="request" requestparamname="ProductID" sqlDataType="int" defvalue="0" validationpattern="^d{1,10}$" />
</query>
I modified it to accept NULL values and conform more closely to INT specification. Here is my modified function:
CREATE FUNCTION [dbo].[udfIsValidINT] (@Number VARCHAR(100))
RETURNS BIT
BEGIN
    DECLARE @Ret BIT, @ShiftByOne INT;
    IF LEFT(@Number, 1) = '-'
        SELECT @Number = SUBSTRING(@Number, 2, LEN(@Number)), @ShiftByOne = 1;
    SELECT @Number = COALESCE(@Number, '0'), @ShiftByOne = COALESCE(@ShiftByOne, 0)
    IF (PATINDEX('%[^0-9-]%', @Number) = 0
        AND CHARINDEX('-', @Number) <= 1
        AND @Number NOT IN ('.', '-', '+', '^')
        AND LEN(@Number) > 0
        AND LEN(@Number) < 11
        AND @Number NOT LIKE '%-%')
        SELECT @Ret = CASE WHEN CONVERT(BIGINT, @Number) - @ShiftByOne <= 2147483647 THEN 1 ELSE 0 END
    ELSE
        SET @Ret = 0
    RETURN @Ret
END
GO
SELECT dbo.udfIsValidINT('2147483648')
SELECT dbo.udfIsValidINT('2147483647')
SELECT dbo.udfIsValidINT('-200')
SELECT dbo.udfIsValidINT('-2147483649')
SELECT dbo.udfIsValidINT('32900')
SELECT dbo.udfIsValidINT('1.79E+308')
GO
I also have a separate function for SMALLINT:
CREATE FUNCTION [dbo].[udfIsValidSMALLINT] (@Number VARCHAR(100))
RETURNS BIT
BEGIN
    DECLARE @Ret BIT, @ShiftByOne INT;
    IF LEFT(@Number, 1) = '-'
        SELECT @Number = SUBSTRING(@Number, 2, LEN(@Number)), @ShiftByOne = 1;
    SELECT @Number = COALESCE(@Number, '0'), @ShiftByOne = COALESCE(@ShiftByOne, 0)
    IF (PATINDEX('%[^0-9-]%', @Number) = 0
        AND CHARINDEX('-', @Number) <= 1
        AND @Number NOT IN ('.', '-', '+', '^')
        AND LEN(@Number) > 0
        AND LEN(@Number) < 6
        AND @Number NOT LIKE '%-%')
        SELECT @Ret = CASE WHEN CONVERT(INT, @Number) - @ShiftByOne <= 32767 THEN 1 ELSE 0 END
    ELSE
        SET @Ret = 0
    RETURN @Ret
END
GO
SELECT dbo.udfIsValidSMALLINT('589')
SELECT dbo.udfIsValidSMALLINT('-200')
SELECT dbo.udfIsValidSMALLINT('-32900')
SELECT dbo.udfIsValidSMALLINT('32900')
SELECT dbo.udfIsValidSMALLINT('1.79E+308')
and one for TINYINT:
CREATE FUNCTION [dbo].[udfIsValidTINYINT] (@Number VARCHAR(100))
RETURNS BIT
BEGIN
    DECLARE @Ret BIT, @L TINYINT;
    SET @Number = COALESCE(@Number, '0');
    SET @L = LEN(@Number);
    IF (PATINDEX('%[^0-9]%', @Number) = 0 AND @L > 0 AND @L < 4)
        SELECT @Ret = CASE WHEN CONVERT(SMALLINT, @Number) < 256 THEN 1 ELSE 0 END
    ELSE
        SET @Ret = 0
    RETURN @Ret
END
GO
SELECT dbo.udfIsValidTINYINT('256')
SELECT dbo.udfIsValidTINYINT('-1')
SELECT dbo.udfIsValidTINYINT('0')
SELECT dbo.udfIsValidTINYINT('255')
SELECT dbo.udfIsValidTINYINT('1.79E+308')
And, finally, a separate function for DECIMAL validation:
CREATE FUNCTION [dbo].[udfIsValidDECIMAL] (@Number VARCHAR(100), @Scale TINYINT, @Precision TINYINT)
RETURNS BIT
BEGIN
    DECLARE @Ret BIT, @L TINYINT, @DSI TINYINT;
    SET @Number = COALESCE(@Number, '0');
    IF LEFT(@Number, 1) = '-'
        SELECT @Number = SUBSTRING(@Number, 2, LEN(@Number));
    SET @L = LEN(@Number);
    SET @DSI = @L - LEN(REPLACE(@Number, '.', ''))
    IF (PATINDEX('%[^0-9.]%', @Number) = 0
        AND CHARINDEX('-', @Number) = 0
        AND @DSI <= 1
        AND @L > 0
        AND @L <= @Scale + @DSI + CASE @DSI WHEN 1 THEN @L - CHARINDEX('.', @Number) ELSE 0 END
        AND @Scale - @Precision >= CASE @DSI WHEN 1 THEN CHARINDEX('.', @Number) - 1 ELSE @L END)
        SELECT @Ret = 1
    ELSE
        SET @Ret = 0
    RETURN @Ret
END
GO
SELECT dbo.udfIsValidDECIMAL('256', 2, 0)
SELECT dbo.udfIsValidDECIMAL('-1', 1, 0)
SELECT dbo.udfIsValidDECIMAL('10.123456789123456789', 18, 17)
SELECT dbo.udfIsValidDECIMAL('10.123456789123456789', 18, 16)
SELECT dbo.udfIsValidDECIMAL('-255.0000000000000001', 3, 0)
SELECT dbo.udfIsValidDECIMAL('1.79E+308', 9, 2)
Note that the DECIMAL validation function specifically tests whether the input number can legally convert to a given decimal scale and precision. Converting a value of 0.234234 to DECIMAL(1,0) will work, but SQL will truncate the decimal digits to fit it in that space. However, it will throw an error if there are too many whole digits.
On the whole, I was rather rushed to get these created, so there may be some errors I didn't notice. I'm interested in any improvements you guys can make to improve performance or make them cleaner.
I have an xml task that I have set up to validate my xml using a XSD. It seems to be working OK. However, I have had to wrap my xml in a SOAP envelope before I send it to the validation task so I need to include an additional schema for the soap message header. That "header" schema has an <xsd:import> of the soap envelope schema. When I try to <xsd:include> the message header schema I get this:
"Warning 313 The targetNamespace 'blah blah' of included/redefined schema should be the same as the targetNamespace ' blah blah blah' of the including schema."
Is it not possible to use the Xml Task to validate the entire document including the SOAP Envelope due to the differing target namespaces? Thanks for any suggestions.
I am building a custom component. In the Validate() method I am checking for various things. Here's a section of my code:

if (ComponentMetaData.InputCollection[0].InputColumnCollection.Count != 1)
{
    return DTSValidationStatus.VS_ISBROKEN;
    throw new Exception("You need to select one and only one column");
}

The problem I'm finding is that my exception never gets thrown, so I never see the error in the UI. If I reverse the two lines like so:

if (ComponentMetaData.InputCollection[0].InputColumnCollection.Count != 1)
{
    throw new Exception("You need to select one and only one column");
    return DTSValidationStatus.VS_ISBROKEN;
}
Then I get my error message in the UI but I never get anything returned from my Validate() method - which doesn't seem quite right to me.
I don't like using FireError() either, because it merely puts the error into the Task List window, and who ever looks at the Task List window?
I'm basically just after some guidance about what people think is the "proper" way of doing this. At the moment I prefer to throw an exception because that way I get some errors in the UI.
I am looking for a script that captures object details like constraints, clustered and non-clustered indexes, primary and foreign keys, etc. before migration and again after migration, and gives me the differences after comparison, if any. Since this is a production environment, we do not have approval to use any comparison tool. Being a DBA, I am very poor at T-SQL programming, so I need a script that can compare the differences before and after migration.
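Without a comparison tool, one approach is to snapshot the metadata views into tables before migration and diff them afterwards in plain T-SQL. A minimal sketch covering constraints (the snapshot table name is made up; repeat the same pattern for indexes via sp_helpindex or the sysindexes table):

```sql
-- Before migration: snapshot constraint metadata
SELECT TABLE_NAME, CONSTRAINT_NAME, CONSTRAINT_TYPE
INTO dbo.Constraints_Before
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS;

-- After migration: list constraints that are now missing
SELECT b.TABLE_NAME, b.CONSTRAINT_NAME, b.CONSTRAINT_TYPE
FROM dbo.Constraints_Before b
LEFT JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS a
       ON a.TABLE_NAME = b.TABLE_NAME
      AND a.CONSTRAINT_NAME = b.CONSTRAINT_NAME
WHERE a.CONSTRAINT_NAME IS NULL;
```

Reversing the join direction (current LEFT JOIN snapshot) lists objects that appeared after migration, which covers both sides of the comparison.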
I have a file named like clm_05-04-2014. I need to validate whether the file name contains today's date, in SSIS or SQL Server.
If the file name is clm_04-04-2014 then this is yesterday's file, so it is not valid. When I run the SSIS package today, the file name should contain today's date; otherwise I need to return the message "invalid file". The CLM prefix will be common.
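A sketch in T-SQL, assuming the file name format is always clm_dd-mm-yyyy; CONVERT style 105 produces exactly the dd-mm-yyyy shape, so building today's expected name and comparing is enough:

```sql
DECLARE @FileName varchar(50)
SET @FileName = 'clm_05-04-2014'  -- in practice this value comes from SSIS / the file system

IF @FileName = 'clm_' + CONVERT(char(10), GETDATE(), 105)
    PRINT 'valid file'
ELSE
    PRINT 'invalid file'
```

The same comparison can be done in SSIS itself with an expression on a variable holding the file name, feeding a precedence constraint.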
Hi, we are using SQL 2000. What would be a good way to validate the current data in different databases against the secondary server's data after log shipping has been applied? Thank you.
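A quick (if coarse) consistency check on SQL 2000 is comparing per-table row counts on both servers; sysindexes keeps a rows column for each table's heap or clustered index. Run this on each side and diff the output:

```sql
-- Approximate row count per user table (SQL Server 2000)
SELECT o.name AS TableName, i.rows AS ApproxRows
FROM sysobjects o
JOIN sysindexes i ON o.id = i.id
WHERE o.type = 'U'
  AND i.indid IN (0, 1)   -- 0 = heap, 1 = clustered index
ORDER BY o.name;
```

The counts in sysindexes can drift slightly; running DBCC UPDATEUSAGE first refreshes them. For stronger validation than row counts, per-table checksums would be needed.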
I have a set of packages which I need to run through SQL Agent to automate them. This package runs OK when I use DTExecUI, but when I run it through SQL Agent with the same options it fails.
The error it fails on is the old 0xC0202009 "acquire connection failed" error. The component it fails on is an Access database, with the location and query fed by variables that pick this up from a table in a SQL Server database. The Access database is on a shared drive on a different machine.
Does anyone have any idea why this should be happening?
Does the validation time of DTSX package depend on the amount of data?
I have a data transfer task which contains an OLE DB source reading from a SQL Server 2005 view. The data then goes through 3 Lookup transformations before going into another view, also on SQL Server 2005 but in a different database. The purpose of this package is to load fact data, so it has to deal with a few million rows. Before setting DelayValidation to true, it took a few minutes just to open the package in BI Studio. Now, with DelayValidation set to true, I can open the package without any problem. But it takes more than 5 minutes during the Validation, Prepare for Execution, and Pre-execution phases. During that time the memory usage and CPU time on SQL Server go up significantly, though the CPU doesn't hit 100%. My client machine doesn't show any significant activity.
I have similar packages (OLE DB view -> 3 or 4 Lookup transformations -> OLE DB table) but dealing with dimension data. With DelayValidation set to false, those packages can be opened in BI Studio within a few seconds. They also take only a few seconds during those 3 phases before starting the actual execution phase.
So I have the impression that validation time depends on the amount of data in the database. Shouldn't it depend only on the metadata?
This may sound silly, but I'm calculating how much someone owes over time versus how much they have paid for that period. If they owe more than they have paid, the result is an underpayment amount like -$100. Then I need to combine the -$100 with the new payment due, let's say $100, so the total due should be $200. But if I add -100 to 100, that equals 0. Any suggestions?
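The arithmetic issue: an underpayment stored as a negative balance should be subtracted from the new payment due, not added to it. A one-line illustration:

```sql
-- Balance of -100 (underpaid) combined with a new 100 due:
SELECT 100 - (-100) AS TotalDue;   -- subtracting the negative balance gives 200
SELECT 100 + (-100) AS WrongTotal; -- adding it gives 0, as observed
```

Equivalently, keep the underpayment as a positive "amount outstanding" and add it; either way, the sign convention has to match the operator.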
Hello, in SQL Server 2005 Enterprise I can change the CPU affinity via Management Studio. Is there a way I can change these settings via T-SQL? I wish to use all my CPUs at night when my data warehouse builds, and then during the day reduce the number of CPUs for SQL so that the application can get more time. TIA, Rob
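Yes: the affinity mask server option is exposed through sp_configure, so it can be scheduled in an Agent job. A sketch, assuming an 8-CPU box, where the bitmask 255 (all eight low bits set) means all eight CPUs and 15 means the first four:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Night: let SQL Server use all 8 CPUs
EXEC sp_configure 'affinity mask', 255;
RECONFIGURE;

-- Day: restrict SQL Server to the first 4 CPUs
EXEC sp_configure 'affinity mask', 15;
RECONFIGURE;
```

On SQL Server 2005 the affinity mask option is dynamic, so RECONFIGURE applies it without a service restart; two scheduled jobs (one per mask) would give the night/day switch described above.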