CREATE TRIGGER tr_UDFTest ON UserDefinedFields FOR INSERT, UPDATE
AS
DECLARE @foobar varchar(100)
SELECT @foobar= foobar FROM inserted
IF ( @foobar = 'foobar') INSERT INTO LogTable (LogText) values ('Found foobar')
ELSE INSERT INTO LogTable (LogText) values ('Did not find foobar')
GO
Basically, the trigger monitors the values being put into the foobar field,
and acts accordingly based on what it finds. In practice, my needs are a
bit more complex (the trigger will be programmatically generated, based on
a set of rules) but the principle is much the same.
LogTable is defined as:
create table LogTable (LogText varchar(128))
UserDefinedFields is a table whose definition may change depending on the
user's needs, but for now assume it contains a varchar column called
foobar.
My problem is that if the user's needs change, and they remove the field
foobar, the trigger causes all subsequent inserts/updates to fail with an
error indicating the column foobar doesn't exist. (Which makes sense of
course!)
CREATE TRIGGER tr_UDFTest ON UserDefinedFields FOR INSERT, UPDATE
AS
DECLARE @foobar varchar(100)
if not exists (select * from syscolumns sc inner join sysobjects so on sc.id = so.id
where sc.name = 'foobar'
and so.name = 'UserDefinedFields') BEGIN
INSERT INTO LogTable (LogText) values ('Error : Foobar column does not exist!')
RETURN
END
SELECT @foobar= foobar FROM inserted
IF ( @foobar = 'foobar') INSERT INTO LogTable (LogText) values ('Found foobar')
ELSE INSERT INTO LogTable (LogText) values ('Did not find foobar')
GO
I'd be happy with the above 'flavor' of solution (bailing, or logging an
error and bailing, when we hit unexpected problems) as long as the
inserts/updates don't fail otherwise. Perhaps I can nest a transaction, or
suppress a RAISERROR or something?
The cleanest solution would probably be to change a bunch of client
software such that it won't remove the foobar field if this field is
needed for a trigger (foreign key constraints based on the set of rules
I'm using are a nice and intuitive solution). Unfortunately, that doesn't
work well with my timeframe (done by Thursday), as changing the client
software is impossible by then. Any ideas or suggestions? Platform is
Win2k, SQL 2000 Enterprise (I think enterprise, certainly 2000).
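One avenue might be to defer the column reference to runtime, so the trigger text never names foobar directly and therefore keeps compiling after the column is dropped. A rough, untested sketch: copy inserted into a temp table (SELECT * doesn't name the column), then use dynamic SQL, which can see the temp table but not the inserted pseudo-table itself. Note it checks existence rather than a single scalar value, so multi-row inserts behave slightly differently from the original.

CREATE TRIGGER tr_UDFTest ON UserDefinedFields FOR INSERT, UPDATE
AS
-- Log and bail if the column has gone away
IF NOT EXISTS (SELECT * FROM syscolumns sc
               INNER JOIN sysobjects so ON sc.id = so.id
               WHERE sc.name = 'foobar' AND so.name = 'UserDefinedFields')
BEGIN
    INSERT INTO LogTable (LogText) VALUES ('Error : Foobar column does not exist!')
    RETURN
END
-- Copy inserted into a temp table; SELECT * never names foobar, so this
-- compiles whatever the table looks like. Dynamic SQL can see #ins, but it
-- cannot see the inserted pseudo-table directly, hence the copy.
SELECT * INTO #ins FROM inserted
EXEC ('IF EXISTS (SELECT * FROM #ins WHERE foobar = ''foobar'')
           INSERT INTO LogTable (LogText) VALUES (''Found foobar'')
       ELSE
           INSERT INTO LogTable (LogText) VALUES (''Did not find foobar'')')
GO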
I have a simple query which is taking about 2.5 minutes to execute. What can be done to speed it up? I cannot change the table structure. Even without the group by, and without the float and division, it takes about 2.25 minutes.
Select 'SV00302' as FileSource, [Service_Call_ID], sum(0) as ActualHours, sum(cast([Estimate_Hours] as float)/100) as EstHours, max(MODIFDT+Modified_Time) as ModifiedDateTime
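Since the column list is small, a covering nonclustered index is the usual first thing to try, so the optimizer can scan a narrow index instead of the whole (presumably wide) table. A sketch only: the index name is made up, it assumes the query reads from the SV00302 table (as the FileSource literal suggests) and groups by Service_Call_ID, and the INCLUDE syntax needs SQL 2005 or later (on SQL 2000 the extra columns would have to be key columns instead).

CREATE NONCLUSTERED INDEX IX_SV00302_ServiceCall_Est   -- hypothetical name
ON SV00302 (Service_Call_ID)
INCLUDE (Estimate_Hours, MODIFDT, Modified_Time);      -- covers the SELECT list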
Say you have an existing populated SQL 2005 database, with 700+ tables, and you want to just change the order of the columns inside every table. Short of manually building conversion scripts, does anyone know an automated way to do this? I was thinking through ways to do them all in one shot, and have tools like Erwin and DbGhost that could be used also. Basically moving some standard audit columns from the end of the tables to just after the PK columns.
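Whatever generates the scripts, the engine itself only supports the rebuild-and-rename dance per table, which is at least easy to emit from metadata. A per-table sketch of the pattern (all names are placeholders; constraints, indexes and FKs would also need re-creating, which is where tools like DbGhost earn their keep):

-- Pattern for one table; a generator would emit this for each of the 700+,
-- with the column list written out in the desired new order.
BEGIN TRANSACTION;

CREATE TABLE dbo.MyTable_New          -- same definition, columns in the new order
(
    MyTablePK   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CreatedBy   varchar(50)  NULL,    -- audit columns moved up
    CreatedDate datetime     NULL,
    SomeData    varchar(100) NULL
);

SET IDENTITY_INSERT dbo.MyTable_New ON;   -- only needed if the table has an identity column
INSERT INTO dbo.MyTable_New (MyTablePK, CreatedBy, CreatedDate, SomeData)
SELECT MyTablePK, CreatedBy, CreatedDate, SomeData FROM dbo.MyTable;
SET IDENTITY_INSERT dbo.MyTable_New OFF;

DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';

COMMIT;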
Is it possible to alter the table type below to change the field ([idConteudo] [int] NULL) to ([idConteudo] [BigInt] NULL)?
Or do I really need to drop and create it again?
CREATE TYPE [dbo].[tbProjeto] AS TABLE
(
    [dsConteudo] [varchar](64) NOT NULL,
    [idConteudo] [int] NULL,
    [dtConteudo] [datetime] NULL DEFAULT (getdate())
)
GO
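As far as I know there is no ALTER for a user-defined table type, so the change means dropping whatever references the type (procedures or functions with table-valued parameters of this type), dropping the type, and recreating it. Roughly:

-- Any procedure/function that uses dbo.tbProjeto as a parameter type
-- has to be dropped (or altered away from the type) first.
DROP TYPE [dbo].[tbProjeto];
GO
CREATE TYPE [dbo].[tbProjeto] AS TABLE
(
    [dsConteudo] [varchar](64) NOT NULL,
    [idConteudo] [bigint] NULL,               -- widened from int
    [dtConteudo] [datetime] NULL DEFAULT (getdate())
);
GO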
Is it possible to create a trigger to capture schema changes (ALTER TABLE, DROP TABLE) only for certain tables? I don't want to capture schema changes for the entire database.
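DDL triggers fire at database (or server) scope rather than per table, but the usual trick is to fire on ALTER_TABLE/DROP_TABLE for the whole database and then filter on the object name from EVENTDATA(). A sketch, with a made-up log table and table list:

CREATE TRIGGER trg_CaptureSchemaChanges
ON DATABASE
FOR ALTER_TABLE, DROP_TABLE
AS
BEGIN
    DECLARE @evt xml, @objectName nvarchar(128);
    SET @evt = EVENTDATA();
    SET @objectName = @evt.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)');

    -- Only log the tables we care about; everything else is ignored.
    IF @objectName IN ('Table1', 'Table2')    -- hypothetical table list
        INSERT INTO dbo.SchemaChangeLog (EventTime, EventType, ObjectName, TSQLCommand)   -- hypothetical log table
        VALUES (GETDATE(),
                @evt.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)'),
                @objectName,
                @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'));
END;
GO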
We have a database on a 2005 box, which we need to keep in sync with one on a 2014 box (until we can turn off the one on 2005). The 2005 database is still being updated with changes that must be applied to the 2014 database, given the nature of the data (medical documents) we need to ensure updates are applied to the 2014 database in very near real time (these changes are - for example - statuses, not the documents themselves).
Cunning plan #1, ugly - not at all a fan of triggers - but use an after update trigger to run a sp on the remote box via a linked server in this format, with a SQL Server login for the linked server with permissions to EXEC the remote proc.
CREATE TRIGGER [dbo].[SourceUpdate] ON [dbo].[SourceTable]
AFTER UPDATE
AS
SET XACT_ABORT ON;
SET NOCOUNT ON;
IF UPDATE(ColumnName)
[Code] ....
However, while the sp can be run against the linked server as a standalone query OK, when running it in a trigger it's throwing
OLE DB provider "SQLNCLI" for linked server "WIBBLE" returned message "The transaction manager has disabled its support for remote/network transactions.".
Msg 7391, Level 16, State 2, Procedure TheAfterUpdateTrigger, Line 19
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "WIBBLE" was unable to begin a distributed transaction.
Is it actually possible to call a proc on a remote box via a trigger, and if so, what additional hoops need to be jumped through (like I said, it'll run OK called via SSMS)?
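If enabling MSDTC on both boxes isn't an option, one way round the error is to not touch the linked server inside the trigger at all: have the trigger write the keys into a local queue table (so everything stays in the local transaction), and let a frequently-scheduled Agent job drain the queue and call the remote proc. A rough sketch with made-up table and column names:

-- Local queue table; the trigger only ever writes here.
CREATE TABLE dbo.SourceUpdateQueue
(
    QueueId   int IDENTITY(1,1) PRIMARY KEY,
    SourceId  int      NOT NULL,              -- key of the updated row (assumed column)
    QueuedAt  datetime NOT NULL DEFAULT (GETDATE()),
    Processed bit      NOT NULL DEFAULT (0)
);
GO
CREATE TRIGGER dbo.SourceUpdate ON dbo.SourceTable
AFTER UPDATE
AS
SET NOCOUNT ON;
IF UPDATE(ColumnName)
    INSERT INTO dbo.SourceUpdateQueue (SourceId)
    SELECT i.SourceId FROM inserted i;        -- assumes SourceId identifies the row
GO
-- An Agent job then calls the proc on the linked server outside the original
-- transaction, e.g. EXEC WIBBLE.RemoteDb.dbo.RemoteProc @SourceId = ...,
-- and flags the queue rows as Processed.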
I have a database (MSSQL). To demonstrate the problem, let me show a fictive table structure. I don't want to discuss how to store the data differently, because the structure is fixed and I can't change it. To get this result I would do a SQL query with a lot of joins, like this:
SELECT firstname, lastname, email.value, phone.value FROM Customer INNER JOIN ( SELECT Customer_Properties.id, Customer_Properties.value FROM Customer_Properties
[code]...
I don't think that this is really performant, and the SQL queries get very complicated. Is there another method for doing this? I can't change the data structure.
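For that kind of name/value layout, one join per property can usually be replaced by a single join plus conditional aggregation, which tends to read (and often perform) better than stacking INNER JOINs. A sketch only - I'm guessing the property table has a name/key column, here called propertyName, and that its id column points back to the customer:

SELECT c.firstname,
       c.lastname,
       MAX(CASE WHEN p.propertyName = 'email' THEN p.value END) AS email,  -- propertyName is assumed
       MAX(CASE WHEN p.propertyName = 'phone' THEN p.value END) AS phone
FROM Customer c
LEFT JOIN Customer_Properties p ON p.id = c.id        -- join column assumed
GROUP BY c.firstname, c.lastname;                     -- in practice group by the customer key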
I have a table with a field called remarks as a text field. I have a trigger on it:

Create Trigger trg_inbox_bess506a_mstr_on_del On dbo.inbox_bess506a_mstr
For Delete
As
-- 040226, archive inbox to arc
set nocount on
insert into inbox_bess_mstr_arc
 ( pk_id, batch_id, py, appropriation, issueFrom, issueTo, submitBy, submitDate,
   validID, validDate, approveDate, approveBy, accountCode, transType --remark
 )
select pk_id, batch_id, py, appropriation, issueFrom, issueTo, submitBy, submitDate,
   validID, validDate, approveDate, approveBy, accountCode, transType --remark
from deleted
return
GO
It fails with an error message: "Server: Msg 21, Level 22, State 1, Procedure trg_inbox_bess506a_mstr_on_del, Line 8 WARNING - Fatal Error 7113 occurred at Dec 22 2004 11:25PM. Please note the error and time, and contact your System Administrator."
It's failing on a field with remarks greater than 1885 chars.
When I used a stored procedure to do the same, it worked. Why is the trigger failing now? Is there a limit on size for triggers and not procedures?
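Not sure it is the cause here, but in SQL 2000 text/ntext/image columns are a known pain in the inserted/deleted tables of AFTER triggers; the documented workaround is an INSTEAD OF trigger, which can see them. A hedged sketch of the same archiving logic as an INSTEAD OF DELETE, assuming pk_id identifies the row:

Create Trigger trg_inbox_bess506a_mstr_on_del On dbo.inbox_bess506a_mstr
Instead Of Delete
As
set nocount on
-- archive first (remark could now be included too, if desired)
insert into inbox_bess_mstr_arc
 ( pk_id, batch_id, py, appropriation, issueFrom, issueTo, submitBy, submitDate,
   validID, validDate, approveDate, approveBy, accountCode, transType )
select pk_id, batch_id, py, appropriation, issueFrom, issueTo, submitBy, submitDate,
   validID, validDate, approveDate, approveBy, accountCode, transType
from deleted
-- then carry out the delete the statement originally asked for
delete m
from dbo.inbox_bess506a_mstr m
join deleted d on m.pk_id = d.pk_id     -- assumes pk_id is the key
GO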
Hi, I want to increase a varchar(5000) to varchar(8000) on a table that has approximately a million rows. What is the impact on the server, and are there any good recommendations on how to accomplish this in the best and fastest way? Thanks, davep
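For what it's worth, widening a varchar is normally just a metadata change, so on a million rows it should be quick. The main thing is to restate the column's existing NULL/NOT NULL setting, because ALTER COLUMN falls back to the default nullability if you leave it off. A sketch with placeholder names:

ALTER TABLE dbo.MyBigTable                      -- table name is a placeholder
    ALTER COLUMN MyText varchar(8000) NOT NULL; -- repeat the column's current nullability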
We have an UPDATE trigger that is failing. This seems like a basic task - we want to write a record to a separate tracking table when our main transaction table is updated for any reason. Our assumption is that we have a reference to the data from the "inserted" record that was just updated. The scenario here is that we are running a batch process which READS several thousand records in our transaction table each evening.
We then mark each individual record as processed on the transaction table and expect the UPDATE trigger to fire successfully (it does not). The version of our trigger listed below shows our attempt to deal with the fact that TransactionID does NOT exist in "inserted." We also have a version of this trigger that deals with INSERTS - it works flawlessly.
ON [dbo].[FPS_Transaction]
AFTER UPDATE
AS
declare @trxId uniqueidentifier;
BEGIN TRY
    SET NOCOUNT ON
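One thing that jumps out of the fragment: a single scalar variable (@trxId) can only hold one value, and the nightly batch updates thousands of rows in one statement, so the trigger has to treat inserted as a set. A minimal sketch of the shape, with a made-up tracking table and assuming FPS_Transaction does carry a TransactionID column:

CREATE TRIGGER dbo.trFPS_Transaction_Update
ON dbo.FPS_Transaction
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Set-based: one tracking row per updated transaction, however many
    -- rows the batch UPDATE touched.
    INSERT INTO dbo.FPS_TransactionTracking (TransactionID, ChangedAt)   -- hypothetical tracking table
    SELECT i.TransactionID, GETDATE()
    FROM inserted i;
END;
GO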
I have a stored procedure on Server A which goes to Server B to check and update a table, and then updates on Server A as well. I have a trigger which is supposed to execute the stored procedure (as mentioned above), but it fails with this error:
Trigger code:
CREATE TRIGGER [tr_DBA_create_database_notification]
ON ALL SERVER
AFTER CREATE_DATABASE
AS
--execute dbadmin.dbo.usp_DBA_Refresh_DBAdmin_Tables
Error:
The operation could not be performed because OLE DB provider "SQLNCLI11" for linked server "xxx" was unable to begin a distributed transaction. Process ID 62 attempted to unlock a resource it does not own: DATABASE 21. Retry the transaction, because this error may be caused by a timing condition. If the problem persists, contact the database administrator.
The same stored procedure works just fine if I execute it manually or if I create a SQL job and execute it from there. In the trigger too, if I start the job which runs the stored procedure, it works. My question is: why does it fail when I execute the stored procedure directly in the TRIGGER?
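Since the job route already works, one low-pain option is to have the server trigger just start that job instead of calling the proc across the linked server directly - sp_start_job returns as soon as the job is queued, so the linked-server work happens outside the trigger's transaction. A sketch (the job name is my own placeholder):

CREATE TRIGGER [tr_DBA_create_database_notification]
ON ALL SERVER
AFTER CREATE_DATABASE
AS
BEGIN
    -- Fire-and-forget: the Agent job (name assumed) runs
    -- dbadmin.dbo.usp_DBA_Refresh_DBAdmin_Tables outside this transaction.
    EXEC msdb.dbo.sp_start_job @job_name = N'Refresh DBAdmin Tables';
END;
GO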
I have a trigger set on TABLE1 so that any update to this column should set off the trigger to write to the AUDIT log table. It works fine otherwise, but not the very first time, when TABLE1 has null in the column. If I comment out
"and i.req_fname <> d.req_fname" from the where clause, then it works fine the first time too. It seems like the null value of the column is messing things up.
Any thoughts?
Here is my t-sql
Insert into dbo.AUDIT (audit_req, audit_new_value, audit_field, audit_user)
select i.req_guid, i.req_fname, 'req_fname', IsNull(i.req_last_update_user,@default_user) as username from inserted i, deleted d
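The comparison i.req_fname <> d.req_fname is never true when either side is NULL, which is why the very first update (NULL to a value) slips past the trigger. Spelling out the NULL cases in the WHERE clause is the usual fix - something along these lines (the join on req_guid is my assumption about the key):

Insert into dbo.AUDIT (audit_req, audit_new_value, audit_field, audit_user)
select i.req_guid, i.req_fname, 'req_fname', IsNull(i.req_last_update_user, @default_user) as username
from inserted i
join deleted d on i.req_guid = d.req_guid          -- assumes req_guid is the key
where i.req_fname <> d.req_fname
   or (i.req_fname is null and d.req_fname is not null)
   or (i.req_fname is not null and d.req_fname is null)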
For the last week or so, I have been facing a very strange problem with my SQL Server 2005 database, which is configured and set up on the hosting web server. Right now, for managing my SQL Server 2005 database, I am using a web-based control panel developed by my hosting company.
The problem I am facing is that whenever I try to modify (i.e. add new columns to) tables in the database, it gives me an error saying:
"There is already an object named 'PK_xxx_Temp' in the database. Could not create constraint. See previous errors. Source: .Net SqlClient Data Provider".
where xxx is the table name.
I have done quite a bit of research on the problem and have also searched the net for a solution, but the problem still persists.
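Guessing here, but that PK_xxx_Temp name looks like the temporary-table style of change script, so a previous modification may have died half-way and left the constraint (and possibly a Tmp_ table, if the panel follows the SSMS designer convention) behind. A quick way to see what is lingering before deciding what to drop:

-- List any leftover designer-style artifacts (name patterns are an assumption)
SELECT name, type_desc, create_date, OBJECT_NAME(parent_object_id) AS parent_object
FROM sys.objects
WHERE name LIKE 'PK[_]%[_]Temp' OR name LIKE 'Tmp[_]%';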
Strange one here - I am posting this in both SQL Server and Access forums
Access is telling me it can't append any of the records due to a key violation.
The query:
INSERT INTO dbo_Colors ( NameColorID, Application, Red, Green, Blue ) SELECT Colors_Access.NameColorID, Colors_Access.Application, Colors_Access.Red, Colors_Access.Green, Colors_Access.Blue FROM Colors_Access;
Colors_Access is linked from another MDB and dbo_Colors is linked from SQL Server 2000.
There are no indexes or foreign constraints on the SQL table. I have no relationships on the dbo_ table in my MDB. The query works if I append to another Access table. The datatypes all match between the two tables, though the dbo_ table has two additional fields not referenced in the query.
I can manually append the records using cut and paste with no problems.
We have a table in our ERP database, and we copy data from this table into another "stage" table on a nightly basis. Is there a way to dynamically alter the schema of the stage table when the source table's structure is changed? In other words, if a new column is added to the source table, I would like to add the column to the stage table during the nightly refresh.
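The column metadata needed for that lives in INFORMATION_SCHEMA.COLUMNS, so the nightly job could diff source against stage and emit ALTER TABLE ... ADD statements for anything missing. A sketch only: table names are placeholders, both tables are assumed to be in the current database, and it handles added columns, not dropped or retyped ones.

DECLARE @sql nvarchar(max);

SELECT @sql = COALESCE(@sql + CHAR(10), '')
    + 'ALTER TABLE dbo.StageTable ADD ' + QUOTENAME(s.COLUMN_NAME) + ' '
    + s.DATA_TYPE
    + CASE WHEN s.CHARACTER_MAXIMUM_LENGTH IS NOT NULL
           THEN '(' + CASE WHEN s.CHARACTER_MAXIMUM_LENGTH = -1 THEN 'max'
                           ELSE CAST(s.CHARACTER_MAXIMUM_LENGTH AS varchar(10)) END + ')'
           ELSE '' END
    + ' NULL;'
FROM INFORMATION_SCHEMA.COLUMNS s
WHERE s.TABLE_NAME = 'SourceTable'
  AND NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS t
                  WHERE t.TABLE_NAME = 'StageTable'
                    AND t.COLUMN_NAME = s.COLUMN_NAME);

IF @sql IS NOT NULL
    EXEC sp_executesql @sql;   -- nothing to do when the schemas already match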
Is it possible to create a trigger to monitor when a value changes from 1 to 0 in a particular row? For example, rows have name, pin and status columns. When the status column of a row changes from 1 to 0, I am going to send a mail item to the person whose name has had the status changed.
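An AFTER UPDATE trigger can do this by comparing deleted (the before image) with inserted (the after image) and reacting only to rows whose status went from 1 to 0. A sketch with assumed table/column names, writing to a mail queue table rather than sending the mail inline:

CREATE TRIGGER dbo.trStatusDropped
ON dbo.People                                        -- hypothetical table
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.MailQueue (PersonName, QueuedAt) -- hypothetical queue table
    SELECT i.name, GETDATE()
    FROM inserted i
    JOIN deleted d ON d.pin = i.pin                  -- assumes pin identifies the row
    WHERE d.status = 1
      AND i.status = 0;
END;
GO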
Hi all, I have two tables in SQL Server with exactly the same structure, and I want to copy all records from one of them to the other. I came across the "Insert ... select ..." statement, but I have two problems: 1) I don't know anything about the column names; I just know the tables have the same structure, and as far as I know "Insert ... select ..." needs the column list to operate correctly, am I right? 2) These two tables have one primary key column with the IDENTITY feature. Any help greatly appreciated. Regards.
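One way that copes with both points is to build the column list from INFORMATION_SCHEMA.COLUMNS at runtime and wrap the insert in IDENTITY_INSERT, so the identity values are preserved and no column name is hard-coded. A sketch (source/target names are placeholders, same database assumed):

DECLARE @cols nvarchar(4000), @sql nvarchar(4000);

SELECT @cols = COALESCE(@cols + ', ', '') + QUOTENAME(COLUMN_NAME)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'SourceTable'
ORDER BY ORDINAL_POSITION;

SET @sql = 'SET IDENTITY_INSERT dbo.TargetTable ON; '
         + 'INSERT INTO dbo.TargetTable (' + @cols + ') '
         + 'SELECT ' + @cols + ' FROM dbo.SourceTable; '
         + 'SET IDENTITY_INSERT dbo.TargetTable OFF;';

EXEC sp_executesql @sql;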
I'm new to this whole SQL Server 2005 thing as well as database design and I've read up on various ways I can integrate business constraints into my database. I'm not sure which way applies to me, but I could use a helping hand in the right direction.
A quick explanation of the various tables I'm dealing with:
WBS - the Work Breakdown Structure, for example: A - Widget 1, AA - Widget 1 Subsystem 1, and etc.
Impacts - the Risk or Opportunity impacts for the weights of a part/assembly. (See Assemblies have Impacts below)
Allocations - the review of the product in question, say Widget 1, in terms of various weight totals, including all parts. Example - September allocation, Initial Demo allocation, etc. Mostly used for weight history and trending.
Parts - There are hundreds of Parts which will eventually lead to thousands. Each part has a WBS element. [Seems redundant, but parts are managed in-house, and WBS elements are cross-company and issued by the Government]
Parts have Allocations - For weight history and trending (see Allocations). Example, Nut 17 can have a September 1st allocation, a September 5th allocation, etc.
Assemblies - Parts are assemblies by themselves and can belong to multiple assemblies. Now, there can be multiple parts on a product, say, an unmanned ground vehicle (UGV), and so those parts can belong to a higher "assembly" [For example, there can be 3 Nut 17's (lower assembly) on Widget 1 Subsystem 2 (higher assembly) and 4 more on Widget 1 Subsystem 5, etc.]. What I'm concerned about is ensuring that the weight roll-ups are accurate for all of the assemblies.
Assemblies have Impacts - There is a risk and opportunity impact setup modeled into this design to allow for a risk or opportunity to be marked on a per-assembly level. That's all this table represents.
A part is allocated a weight and then assigned to an assembly. The Assemblies table holds this hierarchical information - the lower assembly and the higher one, both of which are Parts entries in the [Parts have Allocations] table.
Therefore, to ensure proper weight roll-ups in the [Parts have Allocations] table on a per-part basis, I would like to check for any inserts, updates, deletes on both the [Parts have Allocations] table as well as the [Assemblies] table and then re-calculate the weight roll up for every assembly. Now, I'm not sure if this is a huge performance hog, but I do need to keep all the information as up-to-date and as accurate as possible. As such, I'm not sure which method is even correct, although it seems an AFTER DML trigger is in order (from what I've gathered thus far). Keep in mind, this trigger needs to go through and check every WBS or Part and then go through and check all of its associated assemblies and then ensure the weights are correct by re-summing the weights listed.
If you need the design or create script (table layout), please let me know.
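In case it helps to see the shape: the AFTER trigger itself can stay thin and just hand the affected assemblies to a recalculation procedure, which keeps the weight math in one place whether the change came from [Assemblies] or [Parts have Allocations]. A very rough sketch with hypothetical column and procedure names (the real roll-up logic would live in the proc, and a matching trigger would sit on the allocations table):

CREATE TRIGGER dbo.trAssemblies_Rollup
ON dbo.Assemblies
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Collect every higher assembly touched by this statement
    -- (both the new and the old rows matter for updates/deletes).
    DECLARE @touched TABLE (HigherAssemblyID int);   -- column name is assumed
    INSERT INTO @touched (HigherAssemblyID)
    SELECT HigherAssemblyID FROM inserted
    UNION
    SELECT HigherAssemblyID FROM deleted;

    -- Hypothetical proc that re-sums the weights for one assembly
    DECLARE @id int;
    SELECT @id = MIN(HigherAssemblyID) FROM @touched;
    WHILE @id IS NOT NULL
    BEGIN
        EXEC dbo.usp_RecalculateAssemblyWeight @AssemblyID = @id;
        SELECT @id = MIN(HigherAssemblyID) FROM @touched WHERE HigherAssemblyID > @id;
    END
END;
GO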
Are there any limitations or gotchas to updating the same table which fired a trigger from within the trigger? Some example code below. Hmmm.... This example seems to be working fine, so it must be something with my specific schema/code. We're working on running a SQL trace, but if anybody has any input, fire away. Thanks!

create table x
( Id int,
  Account varchar(25),
  Info int
)
GO
insert into x values ( 1, 'Smith', 15);
insert into x values ( 2, 'SmithX', 25);

/* Update trigger tu_x for table x */
create trigger tu_x
on x
for update
as
begin
    declare @TriggerRowCount int
    set @TriggerRowCount = @@ROWCOUNT
    if ( @TriggerRowCount = 0 )
        return
    if ( @TriggerRowCount > 1 )
    begin
        raiserror( 'tu_x: @@ROWCOUNT[%d] Trigger does not handle @@ROWCOUNT > 1 !', 17, 127, @TriggerRowCount) with seterror, nowait
        return
    end
    update x
    set Account = left( i.Account, 24) + 'X',
        Info = i.Info
    from deleted, inserted i
    where x.Account = left( deleted.Account, 24) + 'X'
end

update x set Account = 'Blair', Info = 999 where Account = 'Smith'
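The classic gotcha with updating the triggering table from inside its own UPDATE trigger is recursion: if the database has RECURSIVE_TRIGGERS switched on, the inner UPDATE fires tu_x again. Direct recursion is off by default, which is probably why the example behaves. A quick way to check or force the setting (the database name is a placeholder):

-- 1 = recursive triggers enabled, 0 = disabled (the default)
SELECT DATABASEPROPERTYEX('MyDatabase', 'IsRecursiveTriggersEnabled');

-- Make sure direct recursion stays off
ALTER DATABASE MyDatabase SET RECURSIVE_TRIGGERS OFF;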
This Audit Trigger is Generic (i.e. non-"Table Specific"): attach it to any table and it should work. Be sure and create the 'Audit' table first though.
The following code writes audit entries to a table called 'Audit', with columns:
'ActionType' //varchar
'TableName' //varchar
'PK' //varchar
'FieldName' //varchar
'OldValue' //varchar
'NewValue' //varchar
'ChangeDateTime' //datetime
'ChangedBy' //varchar
using System; using System.Data; using System.Data.SqlClient; using Microsoft.SqlServer.Server;
public partial class Triggers
{
    //A Generic Trigger for Insert, Update and Delete Actions on any Table
    [Microsoft.SqlServer.Server.SqlTrigger(Name = "AuditTrigger", Event = "FOR INSERT, UPDATE, DELETE")]
    public static void AuditTrigger()
    {
        SqlTriggerContext tcontext = SqlContext.TriggerContext; //Trigger Context
        string TName;    //Where we store the Altered Table's Name
        string User;     //Where we will store the Database Username
        DataRow iRow;    //DataRow to hold the inserted values
        DataRow dRow;    //DataRow to hold the deleted/overwritten values
        DataRow aRow;    //Audit DataRow to build our Audit entry with
        string PKString; //Will temporarily store the Primary Key Column Names and Values here
        using (SqlConnection conn = new SqlConnection("context connection=true"))//Our Connection
        {
            conn.Open();//Open the Connection
            //Build the AuditAdapter and Matching Table
            SqlDataAdapter AuditAdapter = new SqlDataAdapter("SELECT * FROM Audit WHERE 1=0", conn);
            DataTable AuditTable = new DataTable();
            AuditAdapter.FillSchema(AuditTable, SchemaType.Source);
            SqlCommandBuilder AuditCommandBuilder = new SqlCommandBuilder(AuditAdapter);//Populates the Insert command for us
            //Get the inserted values
            SqlDataAdapter Loader = new SqlDataAdapter("SELECT * from INSERTED", conn);
            DataTable inserted = new DataTable();
            Loader.Fill(inserted);
            //Get the deleted and/or overwritten values
            Loader.SelectCommand.CommandText = "SELECT * from DELETED";
            DataTable deleted = new DataTable();
            Loader.Fill(deleted);
            //Retrieve the Name of the Table that currently has a lock from the executing command (i.e. the one that caused this trigger to fire)
            SqlCommand cmd = new SqlCommand("SELECT object_name(resource_associated_entity_id) FROM sys.dm_tran_locks WHERE request_session_id = @@spid and resource_type = 'OBJECT'", conn);
            TName = cmd.ExecuteScalar().ToString();
            //Retrieve the UserName of the current Database User
            SqlCommand curUserCommand = new SqlCommand("SELECT system_user", conn);
            User = curUserCommand.ExecuteScalar().ToString();
            //Adapted the following command from a T-SQL audit trigger by Nigel Rivett
            //http://www.nigelrivett.net/AuditTrailTrigger.html
            SqlDataAdapter PKTableAdapter = new SqlDataAdapter(@"SELECT c.COLUMN_NAME from INFORMATION_SCHEMA.TABLE_CONSTRAINTS pk , INFORMATION_SCHEMA.KEY_COLUMN_USAGE c where pk.TABLE_NAME = '" + TName + @"' and CONSTRAINT_TYPE = 'PRIMARY KEY' and c.TABLE_NAME = pk.TABLE_NAME and c.CONSTRAINT_NAME = pk.CONSTRAINT_NAME", conn);
            DataTable PKTable = new DataTable();
            PKTableAdapter.Fill(PKTable);
            switch (tcontext.TriggerAction)//Switch on the Action occurring on the Table
            {
                case TriggerAction.Update:
                    iRow = inserted.Rows[0];//Get the inserted values in row form
                    dRow = deleted.Rows[0]; //Get the overwritten values in row form
                    PKString = PKStringBuilder(PKTable, iRow);//Get the Primary Keys and their values as a string
                    foreach (DataColumn column in inserted.Columns)//Walk through all possible Table Columns
                    {
                        if (!iRow[column.Ordinal].Equals(dRow[column.Ordinal]))//If value changed
                        {
                            //Build an Audit Entry
                            aRow = AuditTable.NewRow();
                            aRow["ActionType"] = "U";//U for Update
                            aRow["TableName"] = TName;
                            aRow["PK"] = PKString;
                            aRow["FieldName"] = column.ColumnName;
                            aRow["OldValue"] = dRow[column.Ordinal].ToString();
                            aRow["NewValue"] = iRow[column.Ordinal].ToString();
                            aRow["ChangeDateTime"] = DateTime.Now.ToString();
                            aRow["ChangedBy"] = User;
                            AuditTable.Rows.InsertAt(aRow, 0);//Insert the entry
                        }
                    }
                    break;
                case TriggerAction.Insert:
                    iRow = inserted.Rows[0];
                    PKString = PKStringBuilder(PKTable, iRow);
                    foreach (DataColumn column in inserted.Columns)
                    {
                        //Build an Audit Entry
                        aRow = AuditTable.NewRow();
                        aRow["ActionType"] = "I";//I for Insert
                        aRow["TableName"] = TName;
                        aRow["PK"] = PKString;
                        aRow["FieldName"] = column.ColumnName;
                        aRow["OldValue"] = DBNull.Value;//DBNull rather than null, which a DataRow rejects
                        aRow["NewValue"] = iRow[column.Ordinal].ToString();
                        aRow["ChangeDateTime"] = DateTime.Now.ToString();
                        aRow["ChangedBy"] = User;
                        AuditTable.Rows.InsertAt(aRow, 0);//Insert the Entry
                    }
                    break;
                case TriggerAction.Delete:
                    dRow = deleted.Rows[0];
                    PKString = PKStringBuilder(PKTable, dRow);
                    foreach (DataColumn column in inserted.Columns)
                    {
                        //Build an Audit Entry
                        aRow = AuditTable.NewRow();
                        aRow["ActionType"] = "D";//D for Delete
                        aRow["TableName"] = TName;
                        aRow["PK"] = PKString;
                        aRow["FieldName"] = column.ColumnName;
                        aRow["OldValue"] = dRow[column.Ordinal].ToString();
                        aRow["NewValue"] = DBNull.Value;//DBNull rather than null, which a DataRow rejects
                        aRow["ChangeDateTime"] = DateTime.Now.ToString();
                        aRow["ChangedBy"] = User;
                        AuditTable.Rows.InsertAt(aRow, 0);//Insert the Entry
                    }
                    break;
                default:
                    //Do Nothing
                    break;
            }
            AuditAdapter.Update(AuditTable);//Write all Audit Entries back to AuditTable
            conn.Close(); //Close the Connection
        }
    }
    //Helper function that takes a Table of the Primary Key Column Names and the modified row's Values
    //and builds a string of the form "<PKColumn1Name=Value1>,<PKColumn2Name=Value2>,......"
    public static string PKStringBuilder(DataTable primaryKeysTable, DataRow valuesDataRow)
    {
        string temp = String.Empty;
        foreach (DataRow kColumn in primaryKeysTable.Rows)//for all Primary Keys of the Table that is being changed
        {
            temp = String.Concat(temp, String.Concat("<", kColumn[0].ToString(), "=", valuesDataRow[kColumn[0].ToString()].ToString(), ">,"));
        }
        return temp;
    }
}
The trick was getting the Table Name and the Primary Key Columns. I hope this code is found useful.
Structure Number 1
Category: CategoryId (P.K.), UniqueName, CreatedDate, ModifiedDate
CategoryNative: CategoryId (F.K.), LanguageId (F.K.), NativeName, Description, Importance, IsVisible, CreatedDate, ModifiedDate

Structure Number 2
Category: CategoryId (P.K.), UniqueName, CreatedDate, ModifiedDate
CategoryNative: CategoryNativeId (P.K.), CategoryId (F.K.), LanguageId (F.K.), NativeName, Description, Importance, IsVisible, CreatedDate, ModifiedDate

Can anyone tell me which of the above two structures is better, and why? I just added CategoryNativeId (P.K.) in structure Number 2.
How can I setup a table structure for the diagram shown in the attached bitmap?
1) I need to create a product using labour and materials.
2) I need to create hardware using labour and materials.
3) I need to be able to include some hardware in some products.
4) Some materials in the product are made of a combination of materials. E.g., concrete is made of sand, stone, and cement.
Any suggestions?
So far I have:
USE NORTHWIND
Create Table tbProducts ( ProductID int NOT NULL, Product varchar(50) )
go
ALTER TABLE tbProducts ADD CONSTRAINT tbProducts_pk PRIMARY KEY (ProductID)
GO
CREATE Table tbMaterials ( MaterialID int NOT NULL, Material varchar (50) )
GO
ALTER TABLE tbMaterials ADD CONSTRAINT tbMaterials_pk PRIMARY KEY (MaterialID) GO
CREATE Table tbLabour ( LabourCode char (2) NOT NULL, Labour varchar(50) )
GO
ALTER TABLE tbLabour ADD CONSTRAINT tbLabour_pk PRIMARY KEY (LabourCode)
GO
CREATE Table tbProductMaterials ( fkProductID int NOT NULL, fkMaterialID int NOT NULL, Quantity Float NOT NULL )
GO
ALTER TABLE tbProductMaterials ADD CONSTRAINT tbProductMaterials_pk PRIMARY KEY (fkProductID, fkMaterialID)
GO
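For requirements 3 and 4 (hardware included in products, and materials made of other materials), the missing pieces look like two more junction tables along the same lines as tbProductMaterials - something like the sketch below, where the names are my own invention and a tbHardware table (analogous to tbProducts) is assumed:

-- 3) which hardware items go into which products
CREATE Table tbProductHardware
( fkProductID int NOT NULL,
  fkHardwareID int NOT NULL,
  Quantity Float NOT NULL )
GO
ALTER TABLE tbProductHardware ADD CONSTRAINT tbProductHardware_pk PRIMARY KEY (fkProductID, fkHardwareID)
GO

-- 4) materials that are themselves made of other materials (e.g. concrete)
CREATE Table tbMaterialComposition
( fkParentMaterialID int NOT NULL,    -- the compound material (concrete)
  fkComponentMaterialID int NOT NULL, -- the ingredient (sand, stone, cement)
  Quantity Float NOT NULL )
GO
ALTER TABLE tbMaterialComposition ADD CONSTRAINT tbMaterialComposition_pk PRIMARY KEY (fkParentMaterialID, fkComponentMaterialID)
GO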
I'm new to creating tables, and I was asked to create table(s) to capture information using SQL Server 2005 (Express Edition).
Please read below and share your thoughts and opinions on what's the most efficient and best way to structure the tables.
I need to capture users' input daily and every hour; the text can be from 20-100 characters, and my company is expecting to have 50,000+ users in the future. Also, the hourly text only needs to be stored for a maximum of 120 days; after that the data would be deleted from the table(s). An example is put all together below (not sure if it is the right and efficient way, whether the database would support that, and whether it will be fast enough to search the data).
(table structure) - (example with 3 users)

userId | timestamp | message
1 | 12/12/2007 10:00 | some message
1 | 12/12/2007 11:00 | some message
1 | 12/12/2007 12:00 | some message
1 | 12/12/2007 13:00 | some message
1 | 12/12/2007 14:00 | some message
2 | 12/12/2007 10:00 | some message
2 | 12/12/2007 11:00 | some message
3 | 12/12/2007 12:00 | some message
3 | 12/12/2007 13:00 | some message
3 | 12/12/2007 14:00 | some message
...etc.
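A sketch of how that structure might be declared, with a clustered key that lines up with the obvious queries (per user, per time range) and the 120-day rule handled by a scheduled purge rather than per row. The column sizes are my guesses from the 20-100 character description:

CREATE TABLE dbo.UserHourlyMessage
(
    userId      int          NOT NULL,
    [timestamp] datetime     NOT NULL,
    message     varchar(100) NOT NULL,
    CONSTRAINT PK_UserHourlyMessage PRIMARY KEY CLUSTERED (userId, [timestamp])
);
GO
-- Nightly purge (run from a SQL Agent job) for the 120-day retention rule
DELETE FROM dbo.UserHourlyMessage
WHERE [timestamp] < DATEADD(day, -120, GETDATE());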