The current setting of this option can be determined by examining the page_verify_option column in the sys.databases catalog view or the IsTornPageDetectionEnabled property of the DATABASEPROPERTYEX function.
However, there is no column named page_verify_option in the view sys.databases, and DATABASEPROPERTYEX('IsTornPageDetectionEnabled') does not discriminate between the settings CHECKSUM and NONE (it returns 0 for both)!
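For reference, on the SQL Server 2005 and later builds I have seen, sys.databases does expose both page_verify_option and page_verify_option_desc, and the _desc column does distinguish all three settings; a minimal check (the database name 'master' is used only as an example) would be:

SELECT name,
       page_verify_option,        -- 0 = NONE, 1 = TORN_PAGE_DETECTION, 2 = CHECKSUM
       page_verify_option_desc
FROM   sys.databases;

-- The legacy property only reports torn-page detection, hence 0 for both CHECKSUM and NONE:
SELECT DATABASEPROPERTYEX('master', 'IsTornPageDetectionEnabled') AS IsTornPageDetectionEnabled;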
Can anyone provide me with the syntax for comparing rows of two tables using binary checksum? The tables A and B have 8 & 9 columns respectively. The PK in both cases is Col1 & Col2. I want checksum on Columns 1 to 8.
select t.*
from test t
join test2 t2 on t2.id = t.id
where CHECKSUM(t.col2, t.col3) <> CHECKSUM(t2.col2, t2.col3)
--The purpose of the above script is to check for any updates in the two tables. It returns two rows. But as you can see, both these rows were present in the table before. So I modify the script to:

--SCRIPT B
select t.*
from test t
join test2 t2 on t2.col2 = t.col2
where CHECKSUM(t.col3) <> CHECKSUM(t2.col3)
-- In this case no row is returned. This is exactly what I need. The problem: now execute the script below.
TRUNCATE TABLE TEST
TRUNCATE TABLE TEST2
insert test values(4, 4, 'd', '02/06/2004')
insert test values(4, 4, 'd', '02/01/2004')
--Now when I execute script B, two rows are returned, which is not what I want. Since the rows are identical, no row should be returned. So depending on which column changes (col2 or col3), I have to alter the script. I seek advice on the method to calculate the checksum. Again, the PK is ID and Col1 only.
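A minimal sketch of the usual approach, assuming the join is made on the complete primary key (id and col1 here, following the example tables above) and the checksum covers every non-key column of interest; note that BINARY_CHECKSUM can still collide, so compare the columns directly where a missed change is unacceptable:

SELECT t.*
FROM   test t
JOIN   test2 t2
  ON   t2.id   = t.id
 AND   t2.col1 = t.col1
WHERE  BINARY_CHECKSUM(t.col2, t.col3) <> BINARY_CHECKSUM(t2.col2, t2.col3)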
I'm developing a stored procedure to run an update based on values entered into a .NET web form. I want to capture the checksum of the row when it is displayed on the form, then validate that when the update is executed. Simple enough logic, eh? The problem is that when I try to use the CHECKSUM(*) function, SQL Server yells at me and says that it isn't recognized. I'm using SQL Server 7, so wtf? I am not the admin of the server and I'm skirting around SQL Server Enterprise Manager, using any free utils, MS Access, and Visual Studio to maintain this db. Thanks, Alex Jamrozek
Hi, I'd like advice about an idea I had to resolve a problem; thanks to you in advance for your answers. I have a database with tables that I load from flat files. The size of each table is 600 MB. The flat files are an image of an application, and there is no updated date or created date on any table, so my tables are just a copy of the data from the flat files. Now I'd like to create a history table, so I have to determine which rows changed and which ones didn't. As I don't have any date on my rows, the only answer I had until now was to check each column of each row to see if any data changed; if the data changed, I add a new line to my history table. My idea is to add a checksum column over all columns in both tables. To know if any data changed, I just have to check my PK + my checksum column. Do you think that is a good idea? Is checksum a quick function or not? Thanks. --K
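One hedged sketch of the comparison being described, using hypothetical names (stg for today's load, hist for the history table, pk_col for the key, row_cks for a checksum column computed over all non-key columns in each table); CHECKSUM is fast, but it can collide, so BINARY_CHECKSUM or HASHBYTES is a safer choice when a missed change matters:

SELECT s.pk_col
FROM   stg s
LEFT JOIN hist h
       ON h.pk_col = s.pk_col
WHERE  h.pk_col IS NULL            -- brand new row
   OR  h.row_cks <> s.row_cks      -- changed row (subject to checksum collisions)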
I heard that with page checksums enabled, errors that occur will be reported in the log. That's good.
Currently we have DBCC CHECKDB WITH PHYSICAL_ONLY run on its own and DBCC CHECKTABLE on groups of tables on different days. A suggestion from a colleague is to turn off the DBCC CHECKTABLE runs, run them only when a checksum reports an error, and continue running CHECKDB WITH PHYSICAL_ONLY as before.
Is this suggestion a best practice? Please also write a few lines to say why it is wrong or right.
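For reference, the two checks being weighed against each other are roughly the following (database and table names are placeholders):

-- Lightweight physical check: reads every page and validates page checksums.
DBCC CHECKDB (N'YourDatabase') WITH PHYSICAL_ONLY;

-- Full logical check of a single table, run on a schedule or on demand.
DBCC CHECKTABLE (N'dbo.YourTable');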
I recently researched the CHECKSUM and CHECKSUM_AGG functions in T-SQL and found them really useful. However, I was skeptical because there is a chance of these functions returning the same values for non-identical inputs. I just got onto the forums and found more than one unhappy person writing about their experience with these functions.
I am designing a large database (warehouse) and found these functions tempting to implement, for the sake of:

using CHECKSUM for
- indexing long character fields
- multiple columns of the same table that would be involved in a join, using the new checksum field instead

using CHECKSUM_AGG for
- I bulk copy flat-file source data into a character field of a table and, to ensure that I am not loading the same file multiple times, I plan to use CHECKSUM_AGG( CHECKSUM( [FlatFileRecord] ) ) and verify that no two loads have the same output.
Can somebody suggest whether I can trust these methods for my purpose?
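For the first use (indexing long character fields), a rough sketch with made-up table and column names is the pattern below; the checksum column only narrows the search, and the final predicate still compares the real column, precisely because CHECKSUM can collide:

-- Computed checksum column plus an index over it (hypothetical names).
ALTER TABLE dbo.DimCustomer
    ADD LongKeyChecksum AS CHECKSUM(LongCharacterKey)
GO
CREATE INDEX IX_DimCustomer_LongKeyChecksum ON dbo.DimCustomer (LongKeyChecksum)
GO
-- Seek via the checksum, then confirm on the real column (collision-safe).
DECLARE @SearchValue varchar(4000)
SET @SearchValue = 'some long value to look up'

SELECT d.*
FROM   dbo.DimCustomer d
WHERE  d.LongKeyChecksum  = CHECKSUM(@SearchValue)
  AND  d.LongCharacterKey = @SearchValue
GO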
It sounds like a column can be added to each row in a table that is the CHECKSUM or BINARY_CHECKSUM of an expression. How many bytes does each of these occupy? Does the answer depend on the number and/or length of items in the expression?
I am using checksums in my ETL process. For this I have a checksum field that calculates over the values in my table; the column is a computed column, and it has a property for persistence.
What decision should I take: should I make it persisted or not? What is the industry standard?
Can you please explain how this property would affect the behaviour of the column?
Will this property affect anything like indexes? Please let me know what step I should take: should I make the column persisted or not?
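A hedged sketch of the choice, with hypothetical table and column names: marking the computed checksum column PERSISTED stores the value on disk, so it is calculated once per insert/update rather than on every read, at the cost of a little extra storage; either way the column can be indexed, since CHECKSUM over fixed columns is deterministic:

ALTER TABLE dbo.StagingFact
    ADD RowChecksum AS CHECKSUM(Col1, Col2, Col3) PERSISTED
GO
CREATE INDEX IX_StagingFact_RowChecksum ON dbo.StagingFact (RowChecksum)
GO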
I have two tables, src_monthly_terrrier and src_weekly_terrier. Both of these tables consist of 10+ columns. As the table names probably suggest, I import weekly data into one and monthly data into the other.
All the source data comes from an Excel spreadsheet via a straight Import Data procedure. The only guaranteed change on a weekly and monthly basis is that one of the columns in each table, named src_date, will obviously hold the date value for whichever month's or week's data it relates to.
I understand that through SQL Server Business Intelligence Development Studio I can create an Integration Services package that will import the spreadsheet details for me. I might be going the long way around this, but it was my intention to bring in all the data and then run a couple of INSERT INTO stored procedures.
My biggest issue / vulnerability is that there is no error checking of the data on the way in to ensure that it has not already been imported. What I was thinking I could do to resolve this was to create a checksum field comprising a number of different columns (including src_date), and then somehow write something that looks at the values of each intended imported row, works out whether a duplicate checksum is found in the target table, and if so rejects the import routine as "Duplicate Data Found" (or something similar) and moves on to the next stored procedure.
My problem is twofold: one, I have no idea how to create said checksum, and two, no idea where to begin on coding a procedure that looks to see if the value already exists, etc.
I have looked up checksum creation on the net and there appears to be plenty of resource explaining how to create one, so I guess my main question is: where do I start when it comes to writing some code that will do the check of the checksum before the import routine begins (or at least before the INSERT INTO procedures)?
I would truly appreciate anyone's help on this. In the meantime I am off to learn how to create them.
I would like to add, if anyone sees this as a bad idea, then please speak up.
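One possible shape for the guard, purely as a hedged sketch: it assumes the spreadsheet rows land in a staging table first (called import_staging here, a made-up name) and compares an aggregate checksum for the batch against what is already stored for that src_date; note that CHECKSUM_AGG/BINARY_CHECKSUM can collide, so this is a safety net rather than a guarantee:

DECLARE @src_date datetime, @batch_checksum int

-- Checksum of the batch that was just staged from Excel.
SELECT @src_date       = MAX(src_date),
       @batch_checksum = CHECKSUM_AGG(BINARY_CHECKSUM(*))
FROM   dbo.import_staging

-- Reject the load if an identical batch already exists for that src_date.
IF EXISTS (SELECT 1
           FROM  (SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS cks
                  FROM   dbo.src_weekly_terrier
                  WHERE  src_date = @src_date) x
           WHERE x.cks = @batch_checksum)
    RAISERROR ('Duplicate data found for this src_date - import rejected.', 16, 1)
ELSE
    INSERT INTO dbo.src_weekly_terrier
    SELECT * FROM dbo.import_staging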
I need to generate a hash of text values for my app. I can generate hash values for normal fields using the CHECKSUM and BINARY_CHECKSUM functions, but they do not support checksums of text, ntext, image, and cursor, as well as sql_variant.
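If the 32-bit checksum functions will not take the type, HASHBYTES (available from SQL Server 2005) may be worth a look; the caveat is that before SQL Server 2016 its input is limited to 8000 bytes, so longer text has to be hashed in pieces. A minimal sketch with made-up table and column names:

DECLARE @txt nvarchar(max)

-- Cast the ntext/text value to nvarchar(max) before hashing.
SELECT @txt = CAST(SomeTextColumn AS nvarchar(max))
FROM   dbo.SomeTable
WHERE  SomeKey = 1

-- SHA1 returns varbinary(20); pre-2016 this errors if @txt exceeds 8000 bytes.
SELECT HASHBYTES('SHA1', @txt) AS RowHash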
Looking for some clarification on the CHECKSUM option of the BACKUP command.
If the CHECKSUM option is specified in the backup, will the backup fail if CHECKSUM finds bad values (or at least raise an error)? Or is it only reported when doing a RESTORE VERIFYONLY?
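From what I have seen, WITH CHECKSUM makes the backup itself fail and raise an error when a bad page checksum is read, unless CONTINUE_AFTER_ERROR is also specified; the same validation can then be repeated later against the backup file. The database name and path below are placeholders:

BACKUP DATABASE YourDatabase
TO DISK = N'D:\Backups\YourDatabase.bak'
WITH CHECKSUM;

RESTORE VERIFYONLY
FROM DISK = N'D:\Backups\YourDatabase.bak'
WITH CHECKSUM;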
With this discussion here http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=70328 I started to think about how Microsoft really calculates the checksum value.
This code is 100% compatible with the MS original; that is, the result is identical. You can use it as is, or you can use it to see that the MS function does not produce the unique values one might expect.
With text/varchar/image data, call with:
SELECT BINARY_CHECKSUM('abcdefghijklmnop'), dbo.fnPesoBinaryChecksum('abcdefghijklmnop')
With integer data, call with:
SELECT BINARY_CHECKSUM(123), dbo.fnPesoBinaryChecksum(CAST(123 AS VARBINARY))
I haven't figured out how to calculate the checksum for integers greater than 255 yet.

CREATE FUNCTION dbo.fnPesoBinaryChecksum ( @Data IMAGE )
RETURNS INT
AS
BEGIN
    DECLARE @Index INT, @MaxIndex INT, @SUM BIGINT, @Overflow TINYINT

    IF @SUM > 2147483647
        SELECT @SUM = @SUM - 4294967296

    RETURN @SUM
END

Actually this is an improvement on the MS function, since it accepts TEXT and IMAGE data.

CREATE FUNCTION dbo.fnPesoTextChecksum ( @Data TEXT )
RETURNS INT
AS
BEGIN
    DECLARE @Index INT, @MaxIndex INT, @SUM BIGINT, @Overflow TINYINT
We are using BINARY_CHECKSUM in some of our INSTEAD OF UPDATE triggers. The problem came to light when an update failed without raising any error. After some research we found that the checksum returns the same number for two different inputs, and that's why the update failed.
We are using the following kind of logic inside the trigger.
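The trigger code itself did not survive in this post; the sketch below is a hypothetical reconstruction (invented table and column names) of the pattern being described, and it shows why a collision loses the update: the UPDATE is skipped whenever the two checksums happen to match even though the column values differ.

-- Hypothetical reconstruction, not the original trigger.
CREATE TRIGGER trg_Orders_Update ON dbo.Orders
INSTEAD OF UPDATE
AS
BEGIN
    -- Only apply the update when the row content appears to have changed.
    -- If BINARY_CHECKSUM returns the same value for different content,
    -- the row is filtered out here and the change is silently discarded.
    UPDATE o
    SET    o.Col1 = i.Col1,
           o.Col2 = i.Col2
    FROM   dbo.Orders o
    JOIN   inserted i ON i.OrderID = o.OrderID
    WHERE  BINARY_CHECKSUM(o.Col1, o.Col2) <> BINARY_CHECKSUM(i.Col1, i.Col2)
END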
For detecting delta records, I'm a big fan of SQLIS' checksum transform. I'm having difficulty with its installation on my current machine, however. After the installation, the new transform is added to my Data Flow toolbox... but I can't open the UI for the transform to define the checksum. Instead, I get the following error:
===================================
Could not load file or assembly 'Microsoft.ExceptionMessageBox, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. (Microsoft Visual Studio)
------------------------------ Program Location:
at Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransformUI.Edit(IWin32Window parentWindow, Variables variables, Connections connections) at Microsoft.DataTransformationServices.Design.DtsComponentDesigner.StartComponentUI(Boolean startGenericUI)
I am using the Konesans Checksum transformation ( http://www.sqlis.com/21.aspx ) to detect changes in my big (many columns, type 2 SCD) dimensional table.
But I am running into collisions.
The checksum transformation sometimes misses a small change in the record, for instance when a certain flag is set or unset. Is there a more robust checksum generator? Or any other suggestions on how to solve this?
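One hedged alternative is to build the row hash with HASHBYTES over an explicitly delimited concatenation of the columns, either in T-SQL or in a Script Component, which is far less collision-prone than a 32-bit checksum; the column names below are placeholders, and pre-2016 versions limit HASHBYTES input to 8000 bytes:

-- The '|' delimiter and ISNULL guards keep ('ab','c') vs ('a','bc') and NULLs
-- from hashing to the same value.
SELECT CustomerKey,
       HASHBYTES('SHA1',
                 ISNULL(CAST(FirstName    AS nvarchar(100)), N'') + N'|' +
                 ISNULL(CAST(LastName     AS nvarchar(100)), N'') + N'|' +
                 ISNULL(CAST(IsActiveFlag AS nvarchar(10)),  N'')) AS RowHash
FROM   dbo.DimCustomer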
From what I've seen, the CHECKSUM_AGG function appears to return 0 for an even number of repeated values. If so, then what is the practical use of this function for implementing an aggregate checksum across a set of values?
For example, the following works as expected; it returns a non-zero checksum across (1) value or across (2) unequal values.
declare @t table ( ID int );
insert into @t ( ID ) values (-7077);
select checksum_agg( ID ) from @t;
-----------
-7077

declare @t table ( ID int );
insert into @t ( ID ) values (-7077), (-8112);
select checksum_agg( ID ) from @t;
-----------
1035
However, the function appears to return 0 for an even number of repeated values.
declare @t table ( ID int );
insert into @t ( ID ) values (-7077), (-7077);
select checksum_agg( ID ) from @t;
-----------
0
It's not specific to -7077, for example:
declare @t table ( ID int );
insert into @t ( ID ) values (-997777), (-997777);
select checksum_agg( ID ) from @t;
-----------
0
What's curious is that (3) repeated equal values will return a non-zero checksum.
declare @t table ( ID int );
insert into @t ( ID ) values (-997777), (-997777), (-997777);
select checksum_agg( ID ) from @t;
-----------
-997777
But a set of (4) repeated equal values will return 0 again.
declare @t table ( ID int );
insert into @t ( ID ) values (-997777), (-997777), (-997777), (-997777);
select checksum_agg( ID ) from @t;
-----------
0
Finally, a set of (2) unequal values repeated twice will return 0 again.
declare @t table ( ID int );
insert into @t ( ID ) values (-997777), (8112), (-997777), (8112);
select checksum_agg( ID ) from @t;
-----------
0
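A likely explanation, judged from the behaviour above rather than from documentation, is that CHECKSUM_AGG acts like a bitwise XOR over the input checksums: x XOR x = 0, so pairs cancel and only an odd count survives. Plain XOR reproduces every result in this thread:

SELECT -7077 ^ -8112                    AS unequal_pair,   -- 1035, matching the earlier result
       -7077 ^ -7077                    AS two_equal,      -- 0
       -997777 ^ -997777 ^ -997777      AS three_equal,    -- -997777
       -997777 ^ 8112 ^ -997777 ^ 8112  AS two_pairs       -- 0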
I'm trying to load data from an old SQL Server 2000 instance to a new SQL Server 2014 instance. I need to do a checksum to check that all the source data is loaded into the target database (SQL Server 2014). I've created the insert statement for this, which works. I need to use a checksum to make sure all the source rows are loaded into the target table. I haven't done checksums before.
Here is my insert statement:
INSERT INTO [Test].[dbo].[Order_tab] ([rec_id] ,[date_loaded] ,[Name1] ,[Name2] ,[Address1] ,[Address2]
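A hedged sketch of the kind of comparison commonly used for this, run from the SQL Server 2014 side; OLD2000 is a placeholder linked server pointing at the SQL Server 2000 source, and the column list is just an example. If both the row counts and the aggregate checksums match, the load is very likely complete, though CHECKSUM_AGG can in principle collide, so a spot check of a few rows is still worthwhile:

-- Target (SQL Server 2014)
SELECT COUNT(*) AS row_cnt,
       CHECKSUM_AGG(BINARY_CHECKSUM(rec_id, Name1, Name2, Address1, Address2)) AS cks
FROM   [Test].[dbo].[Order_tab]

-- Source (SQL Server 2000, via the hypothetical linked server OLD2000)
SELECT COUNT(*) AS row_cnt,
       CHECKSUM_AGG(BINARY_CHECKSUM(rec_id, Name1, Name2, Address1, Address2)) AS cks
FROM   OLD2000.SourceDb.dbo.Order_tab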
Hello, I'm using MS SQL Server 2000 with a custom application connecting to it. I need a quick and simple way to detect any change (insert, update, delete) to my database (not just to a single table). The purpose is to somehow notify different instances of the application about the data changes, since they should refresh their local query results. Is there any function, stored procedure, system database entry, etc. I can exploit to get the job done? Can you suggest the most suitable mechanism to implement this? T.I.A. AndreA
I am taking over administration of a SQL Server 2005 SP1 instance (Windows 2003) with mdf, ndf and ldf files in C:\Program Files\Microsoft SQL Server\MSSQL\Data. This directory also has two *.cer files.
Apparently no encryption is used.
How can I find out what these *.cer files are for?
Hi, I want to check with VB.NET whether a field in a SQL table is NULL or not. This code does not work:

If xxx = NULL Then
    <statements>
End If

I get the error that NULL is not supported. How do I code the check? Help is appreciated, Gr.
When I run it in Query Analyzer, using "exec SP_TEST", it works and displays results.
But when I run Exec Master..xp_CmdShell 'bcp "exec mydbtest..SP_TEST" queryout C:\TEST.TXT -c -Slocalhost -Usa -Ppassword'
it gives the error "SQLState = S0002, NativeError = 208" and shows Error = [Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name '#TMP_CUSTOMER'.
Hassle-free locating/reporting or deletion of duplicate rows in a SQL table, based on the uniqueness of a column or columns that the user provides, with a few nifty hands-off features. Keywords: delete unique column record records
(This code may also be viewed at http://www.planet-source-code.com/vb/scripts/ShowCode.asp?txtCodeId=576&lngWId=5)
/*Description: UTILITY - Locate in MASTER
Syntax: EXEC sp_RemoveDups TableName, DupQualifierFieldNameList, DeleteDups, UniqueColName, CreateIdentColIfNeeded, StoredProcedureResult
Only the first two arguments are required.
For HELP, enter the stored procedure's name without any arguments or see the PRINT statements below.
NO DELETION WILL OCCUR by default - only the duplicate recordset is returned. To delete records, pass in a 0 for the DeleteDups argument.
Purpose: Allow removal of duplicate rows where 1. We define what fields qualify the duplicate 2. We select the unique rowid or it is detected automatically else no action takes place
Method:Delete by RowID all duplicate rows except the highest RowID (in alpha-sort sequence) of each group of duplicates.
DATE        BY                          CHANGE
09-23-2002  Frank                       Original v1.0
09-23-2002  Frank                       Changed the name from sp_RemoveDupsByRowID to sp_RemoveDups
10-8-2002   Sean P. O. MacCath-Moran    Made @UniqueColName optional
                                        Added logic to auto-detect RowGUID and Identity columns
                                        Added logic to check for a unique value column if no RowGUID or Identity column exists
                                        Added logic to create a temporary ID field as a last resort if no unique column could be located
                                        Added HELP
*/
CREATE PROCEDURE sp_RemoveDups
    @TableName as varchar(50) = NULL,
    @DupQualifierFieldNameList as varchar(200) = NULL,
    @DeleteDups as bit = NULL,
    @UniqueColName as varchar(50) = NULL,
    @CreateIdentColIfNeeded as bit = NULL,
    @StoredProcedureResult int = NULL OUTPUT
AS
DECLARE @SUM int
DECLARE @COUNT int
DECLARE @NextColumn varchar(50)
/*Working variables for the dynamic SQL, messaging and the temporary identity column*/
DECLARE @TempIdentColName varchar(50)
DECLARE @SQL_DetermineUniqueTemplate nvarchar(2000)
DECLARE @SQL nvarchar(4000)
DECLARE @HostName varchar(128)
DECLARE @ActionText varchar(20)
/*==================================================================================*/ /*========================VARIABLE INITIALIZATION AND SETUP========================*/ /*==================================================================================*/ /*If no unique column is located then a temporary Identity column will be created using the name specified in this TempIdentColName string*/ SET @TempIdentColName = 'TempIdentityColXY123'
SET @SQL_DetermineUniqueTemplate = 'SELECT @COUNT = COUNT(Count), @SUM = sum(Count) from ' SET @SQL_DetermineUniqueTemplate = @SQL_DetermineUniqueTemplate + CHAR(13) + '(SELECT TOP 100 PERCENT <COLUMN_NAME>, COUNT(*) as Count FROM ' + @TableName SET @SQL_DetermineUniqueTemplate = @SQL_DetermineUniqueTemplate + CHAR(13) + ' GROUP BY <COLUMN_NAME> ORDER BY <COLUMN_NAME>) a'
/*Retrieve the Host Name. This will be used later in this SP as a test to determine if the user is making this call from within SQL Query Analyzer*/ SELECT @HostName = hostname FROM master..sysprocesses WHERE spid = @@SPID AND program_name = 'SQL Query Analyzer'
/*Set ActionText to be used in message output*/ IF (@DeleteDups IS NULL) OR (@DeleteDups = 1) SET @ActionText = 'Selection' ELSE SET @ActionText = 'Deletion'
/*If a value is specified for use by UniqueColName it cannot exist in the columns from the DupQualifierFieldNameList*/ IF CHARINDEX(@UniqueColName, @DupQualifierFieldNameList) > 0 BEGIN /*The value in UniqueColName was detected in DupQualifierFieldNameList.*/ IF NOT (@HostName IS NULL) PRINT 'The UniqueColName provided (' + @UniqueColName + ') must not exist in DupQualifierFieldNameList (' + @DupQualifierFieldNameList + '). Other solutions will be sought automatically.' SET @UniqueColName = NULL END
/*If UniqueColName is provided then perform a check to ensure that all the values in that column are, in fact, unique.*/ IF NOT (@UniqueColName IS NULL) BEGIN SET @SQL = REPLACE(@SQL_DetermineUniqueTemplate,'<COLUMN_NAME>', @UniqueColName) /*Perform a check of this column to determine if all of its values are unique*/ EXEC sp_executesql @SQL, N'@SUM as int OUTPUT,@COUNT as int OUTPUT',@SUM OUTPUT,@COUNT OUTPUT /*Test to determine if this column contains unique values*/ If @SUM <> @COUNT BEGIN /*The column specified by UniqueColName does not contain unique values.*/ IF NOT (@HostName IS NULL) PRINT 'The UniqueColName provided (' + @UniqueColName + ') does not contain unique values. Other solutions will be sought automatically.' SET @UniqueColName = NULL END END
/*==============================================================*/
/*======================HELP OUTPUT TEXT======================*/
/*==============================================================*/
IF (@TableName IS NULL) OR (@TableName = '/?') OR (@TableName = '?') OR (@DupQualifierFieldNameList IS NULL) OR (@DupQualifierFieldNameList = '/?') OR (@DupQualifierFieldNameList = '?')
BEGIN
    IF NOT (@HostName IS NULL)
    BEGIN
        PRINT 'sp_RemoveDups ''TableName'', ''DupQualifierFieldNameList'', [''DeleteDups''], [''UniqueColName''], [''CreateIdentColIfNeeded''], <''StoredProcedureResult''>'
        PRINT '====================================================================================================================================================================='
        PRINT 'TableName: Required - String - Name of the table to detect duplicate records in.'
        PRINT 'DupQualifierFieldNameList: Required - String - Comma separated list of columns that make up the unique record within TableName.'
        PRINT 'DeleteDups: Optional - Bit - Set to 0 to delete duplicate records. A value of NULL or 1 will return the duplicate records to be deleted.'
        PRINT 'UniqueColName: Optional - String - A table must have a unique column value in it to perform the deletion logic. If no UniqueColName is provided then an attempt will be made to locate the RowGUID column. If that fails then an attempt will be made to locate the Identity column. If that fails then all of the columns of the table will be examined and the first one with all unique values will be selected.'
        PRINT 'CreateIdentColIfNeeded: Optional - Bit - By default this SP will create an identity column if no unique column can be located. Pass in a 1 here to turn this feature off.'
        PRINT 'StoredProcedureResult: Optional - OUTPUT - Int - Returns a 3 if an error occurred, otherwise returns a 0.'
    END
    SET @StoredProcedureResult = 3
    RETURN
END
/*========================================================================*/ /*======================DETECT USABLE UniqueColName======================*/ /*========================================================================*/ IF @UniqueColName IS NULL BEGIN /*Check for a RowGUID or Identity column in this table. If one exists, then utilize it as the unique value for the purposes of this deletion*/ IF EXISTS(SELECT * FROM SysColumns WHERE ID = Object_ID(@TableName) and ColumnProperty(ID,Name,'IsRowGUIDCol') = 1) SET @UniqueColName = 'RowGUIDCol' IF EXISTS(SELECT * FROM SysColumns WHERE ID = Object_ID(@TableName) and ColumnProperty(ID,Name,'IsIdentity') = 1) SET @UniqueColName = 'IdentityCol'
IF @UniqueColName IS NULL /*If no RowGUID or Identity column was found then check all of the columns in this table to see if one of them can be utilized as a unique value column*/ BEGIN /*Select all of the columns from the table in question...*/ DECLARE MyCursor CURSOR LOCAL SCROLL STATIC FOR SELECT name FROM syscolumns WHERE OBJECT_ID(@TableName)=ID
OPEN MyCursor
FETCH NEXT FROM MyCursor INTO @NextColumn
WHILE @@fetch_status = 0
BEGIN
    /*Create SQL string with correct column name in place.*/
    SET @SQL = REPLACE(@SQL_DetermineUniqueTemplate,'<COLUMN_NAME>', @NextColumn)
    /*Perform a check of this column to determine if all of its values are unique*/
    EXEC sp_executesql @SQL, N'@SUM as int OUTPUT,@COUNT as int OUTPUT',@SUM OUTPUT,@COUNT OUTPUT
    /*Test to determine if this column contains unique values*/
    IF @SUM = @COUNT
    BEGIN
        /*A unique values column is detected. Use it and break out of the loop UNLESS the column is specified in DupQualifierFieldNameList*/
        IF CHARINDEX(@NextColumn, @DupQualifierFieldNameList) = 0
        BEGIN
            /*NextColumn was NOT detected in DupQualifierFieldNameList, so this is the column we will use.*/
            SET @UniqueColName = @NextColumn
            BREAK
        END
    END
    /*Fetch the next column unconditionally so the loop cannot stall on a unique column that appears in DupQualifierFieldNameList*/
    FETCH NEXT FROM MyCursor INTO @NextColumn
END
CLOSE MyCursor
DEALLOCATE MyCursor
END END
/*If no UniqueColName has been found then create one UNLESS @CreateIdentColIfNeeded = 1*/ IF (@UniqueColName IS NULL) AND ( (@CreateIdentColIfNeeded IS NULL) OR (@CreateIdentColIfNeeded = 0) ) BEGIN /*Add a sequence column to the table...*/ IF NOT (@HostName IS NULL) PRINT 'Creating temporary identity column in the ' + @TableName + ' table named ' + @TempIdentColName + ' for use in this ' + LOWER(@ActionText) + ' process...' EXEC('ALTER TABLE ' + @TableName + ' ADD ' + @TempIdentColName + ' [int] IDENTITY (1, 1)') SET @UniqueColName = @TempIdentColName END
/*============================================================================*/ /*======================EXECUTE DELETION OR SELECTION======================*/ /*============================================================================*/ IF @UniqueColName IS NULL BEGIN /*No UniqueColName was provided by the user and none were detected by the script. This deletion algorithm cannot run.*/ IF NOT (@HostName IS NULL) PRINT 'Could not perform ' + LOWER(@ActionText) + ' process. No unique columns were located and the CreateIdentColIfNeeded flag is set to 1 (off).' SET @StoredProcedureResult = 3 RETURN END ELSE BEGIN IF NOT (@HostName IS NULL) PRINT 'Performing ' + LOWER(@ActionText) + ' utilizing the unique values in the ' + @UniqueColName + ' column as a reference...' /* Create and execute an SQL statement in the form of:
SELECT * (or DELETE) FROM TableName WHERE UniqueColName IN ( SELECT UniqueColName FROM TableName WHERE UniqueColName NOT IN ( SELECT MAX(Cast(UniqueColName AS varchar(36))) FROM TableName GROUP BY DupQualifierFieldNameList, DupQualifierFieldNameList, etc ) ) */ /*Delete all duplicate records using @UniqueColName as a unique ID column */ IF (@DeleteDups IS NULL) OR (@DeleteDups = 1) SET @SQL = 'SELECT * ' ELSE SET @SQL = 'DELETE '
SET @SQL = @SQL + 'FROM ' + @TableName + ' WHERE ' + @UniqueColName + ' IN ' SET @SQL = @SQL + CHAR(13) + CHAR(9) + '(' + CHAR(13) + CHAR(9) SET @SQL = @SQL + 'SELECT ' + @UniqueColName + ' FROM ' + @TableName + ' WHERE ' + @UniqueColName + ' NOT IN ' SET @SQL = @SQL + CHAR(13) + CHAR(9) + CHAR(9) + '(' + CHAR(13) + CHAR(9)+CHAR(9) SET @SQL = @SQL + 'SELECT MAX(Cast(' + @UniqueColName + ' AS varchar(36))) FROM ' SET @SQL = @SQL + @TableName + ' GROUP BY ' + @DupQualifierFieldNameList SET @SQL = @SQL + CHAR(13) + CHAR(9) + CHAR(9) + ')' + CHAR(13) + CHAR(9) + ')'
EXEC (@SQL) IF @@ERROR <> 0 BEGIN IF NOT (@HostName IS NULL) PRINT @ActionText + ' process failed.' SET @StoredProcedureResult = 3 END ELSE BEGIN IF NOT (@HostName IS NULL) PRINT @ActionText + ' completed successfully with this SQL: ' + CHAR(13) + @SQL SET @StoredProcedureResult = 0 END END
IF (@UniqueColName = @TempIdentColName) AND ( (@CreateIdentColIfNeeded IS NULL) OR (@CreateIdentColIfNeeded = 0) ) BEGIN /*Remove the sequence column from the table...*/ IF NOT (@HostName IS NULL) PRINT 'Removing temporary identity column named ' + @TempIdentColName + ' from the ' + @TableName + ' table...' EXEC('ALTER TABLE ' + @TableName + ' DROP COLUMN ' + @TempIdentColName) END GO
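A hedged usage example based on the help text above (table and column names are made up):

-- Report the duplicates first (default behaviour: nothing is deleted)...
EXEC sp_RemoveDups 'dbo.Customers', 'LastName,FirstName,PostCode'

-- ...then pass 0 for @DeleteDups to actually delete them, keeping one row per duplicate group.
DECLARE @rc int
EXEC sp_RemoveDups 'dbo.Customers', 'LastName,FirstName,PostCode', 0, NULL, NULL, @rc OUTPUT
SELECT @rc AS StoredProcedureResult   -- 0 = success, 3 = error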
Hello, how can I detect via code (C++, C#) which SQL servers (e.g. Oracle, MSSQL, MySQL) are close to me and connectable within one or two seconds (the timeout to wait for answers should be variable)? Thank you. Regards, Mark
I need to detect which version or versions of SQL Server are installed on a machine before I can install my software. I am currently looking for SQL Server 2000, SQL Server 2005, SQL Server Express Edition and SQL Server 2008 CTP (for the time being). How do I find this out? Is SQL-DMO the way forward, or should this be some WMI script? I would appreciate any pointers/scripts etc.
I have read a lot of people mentioning using @@VERSION from a query window, but as I need the software to do this on first start-up, can this work for all versions?
Running SQLEXPR_ADV.EXE against Windows XP SP0 causes the setup to exit gracefully with a message indicating that the right service pack is not installed.
However, running SQLEXPR_ADV.EXE against Windows XP SP1 does not exit and continues to install. Then, when you try to run a script against the installation, you get the message:
[Microsoft][ODBC SQL Server Driver][DBMSLPCN][SQL Server does not exist or access denied].
This will become very problematic when installing SQL Server Express as the database for small installations.
The setup really needs to be updated to look for all the system requirements that it can detect. Evidently it can detect the service packs. The Wise Installer is not able to script that requirement unless there is some way to make the Windows Installer look for it that is not documented.
I need to perform a check in a trigger, but there is a sequence of N INSERTs inside a single transaction and I should only run the check on the last row. I don't know the number of INSERTs; it can be one or several. And there is no indicator in the values that allows me to deduce whether a row is the last one. It would seem that I need a trigger on COMMIT... Is there some way of doing this?