Multi-rec Processing In A Trigger & Sp

Mar 28, 2001

SQL 7 SP3

Hi.

I have a trigger that fires for any updates/deletes/adds and logs information via a sp call.

If I have more than one record, is there a better way to process them other than using a cursor?

Yes, they have to be processed through the sp.
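If the rows really must go through the stored procedure one at a time, a cursor over the inserted (or deleted) table is the standard pattern. A minimal sketch, assuming a hypothetical one-row logging procedure dbo.usp_LogChange(@id int) and a key column named id:

Code Snippet
-- a minimal sketch; dbo.usp_LogChange and the id column are hypothetical
DECLARE @id int
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT id FROM inserted    -- use deleted instead for a DELETE trigger
OPEN c
FETCH NEXT FROM c INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.usp_LogChange @id
    FETCH NEXT FROM c INTO @id
END
CLOSE c
DEALLOCATE c

If the sp's logic can instead be expressed as a single INSERT ... SELECT against inserted/deleted, that set-based form is usually faster than any per-row call.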

Thanks,

Craig




Multi-insert Trigger

Oct 7, 2005

I'm pretty new at T-SQL programming, though I'm pretty familiar with most SQL statements. I'm trying to write an insert trigger that handles a multi-row insert, but I'm avoiding using cursors (which I'll have to resort to if all else fails). I've looked everywhere I can think of for an answer, outside of buying another book, but I can't seem to find a solution. Anyone have a good way I can alter this trigger to accept multi-row inserts without using a cursor?

Note: The purpose of this trigger is to copy the features of a mobile phone account whenever a new mobile phone account is created (as in this is the information for an invoice, and the mobile account is to be copied and updated for the new invoice each month.)

Code Snippet
CREATE TRIGGER [copymobilefeatures] ON [dbo].[mobilesub] FOR INSERT
AS
insert into #features
select i.invoicedate, i.subaccountnumber, i.planid, f.featureid
from mobilesub m
join (inserted i
      join mobilefeatures f on i.invoicedate = f.invoicedate)
  on m.subaccountnumber = f.subaccountnumber

insert into mobilefeatures (invoicedate, subaccountnumber, planid, featureid)
select * from #features

I'd love code examples, but a detailed written explanation of what needs to be done might be even more helpful. I'm looking for understanding, not just something I can copy and paste. Thanks!
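Not a drop-in fix, but a hedged sketch of the set-based shape: a trigger fires once per INSERT statement with all new rows in inserted, so an INSERT ... SELECT joining inserted to the source of the features handles one row or a thousand. The join condition below is an assumption about the schema, flagged as such:

Code Snippet
-- a minimal sketch; the ON clause is an assumed stand-in for however a new
-- subscription finds the prior invoice's features
CREATE TRIGGER copymobilefeatures ON dbo.mobilesub FOR INSERT
AS
INSERT INTO mobilefeatures (invoicedate, subaccountnumber, planid, featureid)
SELECT i.invoicedate, i.subaccountnumber, i.planid, f.featureid
FROM inserted i
JOIN mobilefeatures f
    ON f.subaccountnumber = i.subaccountnumber   -- assumption: copy that subaccount's existing features

No temp table is needed, since inserted can feed the INSERT directly, and that also sidesteps the cursor.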


Multi-Row Update Trigger

Nov 20, 2007

Hi,

I need to update LastReceivedQty and LastReceivedDate fields in the Product table each time a DeliveryNoteDetail entry is created for a PurchaseOrderDetail line.

DeliveryNote -> DeliveryNoteDetail -> PurchaseOrderDetail -> Product

DeliveryNote has the ReceivedDate
DeliveryNoteDetail has the ReceivedQty

I made the following trigger for handling single row updates, which works fine.

UPDATE Purchasing.Product
SET LastReceivedQty = i.ReceivedQty, LastReceivedDate = dn.ReceivedDate
FROM Purchasing.DeliveryNote dn INNER JOIN
Purchasing.DeliveryNoteDetail dnd ON dn.DeliveryNoteID = dnd.DeliveryNoteID INNER JOIN
inserted i ON dnd.DeliveryNoteDetailID = i.DeliveryNoteDetailID INNER JOIN
Purchasing.PurchaseOrderDetail pod ON dnd.PurchaseOrderDetailID = pod.PurchaseOrderDetailID INNER JOIN
Purchasing.Product p ON pod.VendorVendorProductID = p.VendorVendorProductID

Now I don't know how to handle multi-row situations when the same product is updated.
Since I cannot rely on the order that the updates are performed I need to somehow select the MAX(ReceivedDate).
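One hedged way to sketch it, assuming SQL Server 2005+ for ROW_NUMBER(): rank the inserted delivery rows per product by ReceivedDate descending and apply only the newest one to each product.

Code Snippet
-- a minimal sketch, not a tested drop-in: one update per product, newest receipt wins
UPDATE p
SET LastReceivedQty = src.ReceivedQty,
    LastReceivedDate = src.ReceivedDate
FROM Purchasing.Product p
JOIN (
    SELECT pod.VendorVendorProductID,
           i.ReceivedQty,
           dn.ReceivedDate,
           ROW_NUMBER() OVER (PARTITION BY pod.VendorVendorProductID
                              ORDER BY dn.ReceivedDate DESC) AS rn
    FROM inserted i
    JOIN Purchasing.DeliveryNoteDetail dnd ON dnd.DeliveryNoteDetailID = i.DeliveryNoteDetailID
    JOIN Purchasing.DeliveryNote dn ON dn.DeliveryNoteID = dnd.DeliveryNoteID
    JOIN Purchasing.PurchaseOrderDetail pod ON pod.PurchaseOrderDetailID = dnd.PurchaseOrderDetailID
) src ON src.VendorVendorProductID = p.VendorVendorProductID AND src.rn = 1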


Instead Of Trigger Considerations For Multi-Row Insert

Apr 12, 2006

I would like to know how, if at all possible, to reconstruct the following trigger so it can handle multiple rows when a single INSERT statement supplies them - because the trigger will only be called once. I'm not familiar with cursors and don't know anything about them, and I've read that they're not the best way to go.

CREATE TRIGGER tr_childtable ON childtable INSTEAD OF INSERT
AS
BEGIN
DECLARE @customkey char(16);
DECLARE @nextchild int;
DECLARE @parent int;
DECLARE @date datetime;

SET @date = getdate();

SELECT @parent = parenttable FROM inserted;

SELECT @nextchild=count(*)+1 FROM childtable WHERE parenttable = @parent;

IF (@nextchild >= 9998) return;

SET @customkey = 'type' + convert(char(4),year(@date)) + convert(char(2),month(@date)) + convert(char(2),day(@date)) + convert(char(4),@nextchild + 1);

INSERT INTO childtable (customkey,parent) VALUES (@customkey,@parent);
END
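A hedged sketch of a set-based version, assuming SQL Server 2005+ for ROW_NUMBER() and that the foreign-key column is really named parenttable (the snippet above uses both "parent" and "parenttable"); the 9998 cap is kept as a filter:

Code Snippet
-- a minimal sketch, not a tested drop-in replacement
CREATE TRIGGER tr_childtable_setbased ON childtable INSTEAD OF INSERT
AS
BEGIN
    DECLARE @date datetime;
    SET @date = getdate();

    INSERT INTO childtable (customkey, parenttable)  -- assumption: FK column is parenttable
    SELECT 'type'
           + convert(char(4), year(@date))
           + convert(char(2), month(@date))
           + convert(char(2), day(@date))
           + convert(char(4), seq.nextchild),
           seq.parenttable
    FROM (
        -- existing child count per parent, plus a per-parent sequence for the new rows
        SELECT i.parenttable,
               (SELECT count(*) FROM childtable c
                WHERE c.parenttable = i.parenttable)
               + ROW_NUMBER() OVER (PARTITION BY i.parenttable
                                    ORDER BY i.parenttable) AS nextchild
        FROM inserted i
    ) seq
    WHERE seq.nextchild < 9998;  -- the original returns at 9998; here such rows are skipped
END

An INSTEAD OF INSERT trigger is not fired again by its own INSERT into the same table, so this does not recurse.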


Trigger Behaving Differently On Multi-row Update

Aug 1, 2007

Let me set the scene:

I have an update trigger on a table. When a specific column is updated, I get the rowid from 'inserted' and then pass it via service broker to another database that will fire off a maintenance routine at a later time. This whole process seems to work fine if I update a single row at a time through Query Analyzer.

During testing (of the service broker part) I found that if in Query Analyzer I run an update that updates all of the records at once, then the trigger seems to fire only once for the entire process, therefore killing the rest of my process.

I would have thought that regardless of how a record was being updated the trigger would fire atomically for each row.
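For what it's worth, SQL Server triggers fire once per statement, not once per row: a 100-row UPDATE fires the trigger a single time with all 100 rows in the inserted table. A hedged sketch of the set-based shape (the rowid, column, and queue-table names here are hypothetical stand-ins):

Code Snippet
-- a minimal sketch: queue one entry per affected row instead of reading a single rowid
INSERT INTO dbo.MaintenanceQueue (rowid)          -- hypothetical staging/queue table
SELECT i.rowid
FROM inserted i
JOIN deleted d ON d.rowid = i.rowid
WHERE i.WatchedColumn <> d.WatchedColumn          -- hypothetical: only rows where the column changed

Each queued row can then be sent on via Service Broker.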

Any guidance on this would be MOST appreciated!


Converting Multi-column Referential Constraint Into A Trigger

Feb 28, 2008

Hi there,

I am looking for a way to define a trigger that is a replacement for a multi-column foreign key.

I know how to a convert a single-column foreign key constraint into a trigger (i.e., to resolve diamond-structured references).

CREATE TABLE parent_tab
(
col_a INTEGER NOT NULL,
CONSTRAINT pk PRIMARY KEY(col_a)
);

CREATE TABLE child_tab
(
col_x INTEGER NOT NULL,
CONSTRAINT fk FOREIGN KEY (col_x) REFERENCES parent_tab(col_a) ON DELETE CASCADE
);

The conversion would remove the foreign key definition and add this trigger:

CREATE TRIGGER tr_single
ON parent_tab INSTEAD OF DELETE
AS BEGIN
DELETE FROM child_tab WHERE (child_tab.col_x IN (SELECT col_a FROM deleted))
DELETE FROM parent_tab WHERE (parent_tab.col_a IN (SELECT col_a FROM deleted))
END;

Unfortunately, now I need to resolve a situation where there is involved a multi-column foreign key.

CREATE TABLE parent_tab
(
col_a INTEGER NOT NULL,
col_b INTEGER NOT NULL,
CONSTRAINT pk PRIMARY KEY(col_a, col_b)
);

CREATE TABLE child_tab
(
col_x INTEGER NOT NULL,
col_y INTEGER NOT NULL,
CONSTRAINT fk FOREIGN KEY (col_x, col_y) REFERENCES parent_tab(col_a, col_b) ON DELETE CASCADE
);

This does not work, because the temporary table "deleted" might contain more than one row. How do I make sure that the values belong to the same row?

-- incorrect trigger, might delete too many rows
CREATE TRIGGER tr_single
ON parent_tab INSTEAD OF DELETE
AS BEGIN
DELETE FROM child_tab WHERE (child_tab.col_x IN (SELECT col_a FROM deleted) AND child_tab.col_y IN (SELECT col_b FROM deleted))
DELETE FROM parent_tab WHERE (parent_tab.col_a IN (SELECT col_a FROM deleted) AND parent_tab.col_b IN (SELECT col_b FROM deleted))
END;

-- some magic needed :-)
CREATE TRIGGER tr_single
ON parent_tab INSTEAD OF DELETE
AS BEGIN
DELETE FROM child_tab WHERE (child_tab.col_x IN (SELECT col_a FROM deleted AS t1) AND child_tab.col_y IN (SELECT col_b FROM deleted AS t2) AND row_id(t1) = row_id(t2))
DELETE FROM parent_tab WHERE (parent_tab.col_a IN (SELECT col_a FROM deleted AS t1) AND parent_tab.col_b IN (SELECT col_b FROM deleted AS t2) AND row_id(t1) = row_id(t2))
END;

I know the trigger definition above is ***... but I hope that it helps to make clear what I need.
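The usual way out is to compare both columns pair-wise in a single join against deleted, so the values are guaranteed to come from the same row - a minimal sketch against the tables above:

Code Snippet
CREATE TRIGGER tr_multi
ON parent_tab INSTEAD OF DELETE
AS BEGIN
    -- tuple-wise match: col_x/col_y must pair with col_a/col_b of one deleted row
    DELETE c
    FROM child_tab c
    INNER JOIN deleted d ON c.col_x = d.col_a AND c.col_y = d.col_b;

    DELETE p
    FROM parent_tab p
    INNER JOIN deleted d ON p.col_a = d.col_a AND p.col_b = d.col_b;
END;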

Btw., I use SQL Server 2005.

Thanks in advance,

slowjoe


How To Create Trigger For Multi Insert Employees Update Tb Employee Once

Jan 24, 2008

How do I do this? I have a table of employees; every employee has a unique ID, "empid":
empid VAL_OK
--------------------------
111 0
222 0
333 0

Now a multi-row insert puts a whole month of shifts into my work_table for every employee, like this (this is work_table):
empid date val
--------------------------------------------------
111 01/02/2008 1
111 02/02/2008 2
...............
111 29/02/2008 5
--next employee
222 01/02/2008 1
222 02/02/2008 4
...............
222 29/02/2008 6
--next employee
333
--next employee
444
--next employee
555
-------------------------------------------------------------


Now, for every such OK insert (covering the whole month), I need the trigger to go to TB_Employee and update each of those employees once, from VAL_OK=0 to VAL_OK=1, like this:

empid VAL_OK
--------------------------
111 1
222 1
333 1
----------------------
This way I know which employees have shifts for the whole month and which do NOT!

I think it goes something like this:



Code Snippet
Create trigger for_insert on tb_work
For insert
as
begin
    /* one set-based UPDATE covers both the single-row and the
       multi-row insert case: flag each employee that appears in
       the inserted rows exactly once */
    update tb_employee
    set val_ok = 1
    where empid in (select empid from inserted)
end







TNX


TSQL + VBA Excel 2003 - Importing Data From MS Excel 2003 To SQL SERVER 2000 Using Multi - Batch Processing

Sep 11, 2007

Hi,
I need to import an SQL string from MS Excel 2003 to SQL SERVER 2000.
The string I need to import is composed of five separate blocks and looks like:



Code Snippet

CommandLine01 = "USE mydb"
CommandLine02 = "SELECT Block ..."
CommandLine03 = "GO
ALTER TABLE Block...
GO"
CommandLine04 = "UPDATE Block..."
CommandLine05 = "SELECT Block..."

The detail of the SQL string is at:
http://forums.microsoft.com/msdn/showpost.aspx?postid=2093921&siteid=1&sb=0&d=1&at=7&ft=11&tf=0&pageid=1



I am trying to implement OJ's suggestion:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2117223&SiteID=1
to use multi-batch processing to import the string to SQL SERVER, something like:




Code Snippet
Dim SqlCnt, cmd1, cmd2, cmd3
'set the properties and open a connection

cmd1="use my_db"
cmd2="create table mytb"
cmd3="insert into mytb"

SqlCnt.execute cmd1
SqlCnt.Execute cmd2
SqlCnt.Execute cmd3

Below is the code (just partial) I have, and I need help to complete it.
Thanks in advance,
Aldo.




Code Snippet
Function TestConnection()
    Dim ConnectionString As New ADODB.Connection
    Dim RecordSet As New ADODB.RecordSet

    ' assign the string to the .ConnectionString property, then open
    ConnectionString.ConnectionString = "Driver={SQL Server};Server=myServer;Database=myDBName;Uid=UserName;Pwd=Password"
    ConnectionString.Open

    CmdLine01 = " USE " & myDB
    CmdLine02 = " SELECT ACCOUNTS.FULLNAME FROM ACCOUNTS" ...

    ' "GO" is not T-SQL: it is a batch separator recognized by client tools
    ' such as Query Analyzer/osql. ADO rejects it, so each batch has to be
    ' sent as its own Execute call rather than embedded in one string.
    CmdLine03 = "ALTER TABLE Block..."

    CmdLine04 = "UPDATE Block..."
    CmdLine05 = "SELECT Block..."

    ' run the non-row-returning batches, then open the SELECT as a recordset
    ConnectionString.Execute CmdLine01
    ConnectionString.Execute CmdLine03
    ConnectionString.Execute CmdLine04
    RecordSet.Open CmdLine02, ConnectionString

'Retrieve Field titles
For ColNr = 1 To RecordSet.Fields.Count
ActiveSheet.Cells(1, ColNr).Value = RecordSet.Fields(ColNr - 1).Name
Next

ActiveSheet.Cells(2, 1).CopyFromRecordset RecordSet

'Close ADO objects
RecordSet.Close
ConnectionString.Close
Set RecordSet = Nothing
Set ConnectionString = Nothing

End Function







Can't Access Inserted Table From Trigger; Msg 4104 The Multi-part Identifier ... Could Not Be Bound.

Sep 27, 2006

I'm a newbie having trouble using the "inserted" table in a trigger. When I run these SQL statements:

Code Snippet
CREATE DATABASE foobar
GO
USE foobar
GO
CREATE TABLE foo (
    fooID int IDENTITY (1, 1) NOT NULL,
    lastUpdated datetime,
    lastValue int,
    PRIMARY KEY (fooID)
)
GO
CREATE TABLE bar (
    barID int IDENTITY (1, 1) NOT NULL,
    fooID int NOT NULL,
    [value] int NOT NULL,
    updated datetime NOT NULL DEFAULT (getdate()),
    PRIMARY KEY (barID),
    FOREIGN KEY (fooID) REFERENCES foo (fooID)
)
GO
CREATE TRIGGER onInsertBarUpdateFoo ON Bar FOR INSERT
AS
UPDATE Foo
SET lastUpdated = inserted.updated, lastValue = inserted.[Value]
WHERE foo.fooID = inserted.fooID
GO

I get the error message:

Msg 4104, Level 16, State 1, Procedure onInsertBarUpdateFoo, Line 4
The multi-part identifier "inserted.fooID" could not be bound.

I can get the trigger to work fine as long as I don't reference "inserted".

What am I missing?
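For reference, a minimal sketch of the usual fix: inserted cannot be referenced by name in plain SET/WHERE clauses; it has to be brought in through the UPDATE's FROM clause like any other table:

Code Snippet
CREATE TRIGGER onInsertBarUpdateFoo ON Bar FOR INSERT
AS
UPDATE Foo
SET lastUpdated = i.updated, lastValue = i.[Value]
FROM Foo
INNER JOIN inserted i ON Foo.fooID = i.fooID
GO

This also handles multi-row inserts, since every row of inserted joins through.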

I'm using Microsoft SQL Server Management Studio Express 9.00.2047.00 and SQL Express 9.0.1399

Thanks in advance for your help...
Larry


SQL 2012 :: Disaster Recovery Options For Multi-Database Multi-Instance Environment

Sep 23, 2014

Disaster Recovery Options based on the following criteria.

--Currently running SQL 2012 standard edition
--We have 18000 databases (same schema across databases) - the majority are less than 2GB - spread across approximately 64 instances
--Recovery needs to happen within 1 hour (not sure that this is realistic)
--We are building a new data center and building DR from the ground up.

What I have looked into is:

1. Transactional Replication: too much data - not viable
2. AlwaysOn Availability Groups: needs Enterprise edition; again, too many databases, and we would have to upgrade all instances
3. Log Shipping is a viable option and the only one I can come up with that would work right now. It might be a management nightmare, but with this many databases probably all options will be a nightmare.


SQL 2012 :: MSDTC In Multi-node / Multi-instanced Cluster

Aug 17, 2015

More often than not, I don't touch DTC on clusters anymore; however, I'm on a project where the vendor states that it's required. So a couple of things here.

1) Do you really need DTC per instance, or one for all?
2) Should DTC be in its own resource group or within the instance's group?
2a) If in its own resource group, how do you tie an instance to an outside resource group? tmMappingSet, right?


The Multi Delete && Multi Update - Stored Procedure Not Work Ok

Feb 4, 2008

The stored procedure doesn't delete all the records - need help.



Code Snippet
DECLARE @empid varchar(500)
set @empid = '55329429,58830803,309128726,55696314'
-- wrap both sides in commas so each id only matches as a whole token
DELETE FROM [Table_1]
WHERE charindex(',' + CONVERT(varchar, [empid]) + ',', ',' + @empid + ',') > 0
UPDATE [empList]
SET StartDate = CONVERT(DATETIME, '1900-01-01 00:00:00', 102), val_ok = 0
WHERE charindex(',' + CONVERT(varchar, [empid]) + ',', ',' + @empid + ',') > 0
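An alternative hedged sketch: parse the list into a table first, then join. That makes it easy to see exactly which IDs were extracted, and the DELETE/UPDATE become plain IN tests (the table variable and loop here are illustrative, not the only way):

Code Snippet
-- a minimal sketch: split the CSV into a table variable, then set-based DELETE/UPDATE
DECLARE @empid varchar(500), @pos int
DECLARE @ids TABLE (empid int)
SET @empid = '55329429,58830803,309128726,55696314' + ','   -- trailing comma simplifies the loop
WHILE LEN(@empid) > 0
BEGIN
    SET @pos = CHARINDEX(',', @empid)
    INSERT INTO @ids (empid) VALUES (CONVERT(int, LEFT(@empid, @pos - 1)))
    SET @empid = SUBSTRING(@empid, @pos + 1, LEN(@empid))
END

DELETE FROM [Table_1] WHERE empid IN (SELECT empid FROM @ids)
UPDATE [empList]
SET StartDate = CONVERT(DATETIME, '1900-01-01 00:00:00', 102), val_ok = 0
WHERE empid IN (SELECT empid FROM @ids)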




TNX


Help With Multi Join Or Multi Tier Select.

Jul 20, 2005

Hello,

I am trying to construct a query across 5 tables, but primarily 3 tables: Plan, Provider and ProviderLocation. The other tables are lookup tables for values in the first three. PlanID is the primary key in Plan. Roughly:

Plan: PlanID, PlanName, RegionID, ...
Provider: ProviderID, PlanID, LastName, FirstName, ...
ProviderLocation: ProviderID, ProviderStatus, ...
Lookups: LookupType, LookupKey, LookupValue

Given a PlanID I want all the Providers with a ProviderStatus = 0. I can get the query to work just fine if there are records, but what I want is: if there are no records, then I at least want one record with the Plan information. Here is a sample of the query:

SELECT pln.PlanName, pln.PlanID, l3.LookupValue as Region,
p.ProviderID, p.SSNEIN, pl.DisplayLocationOnPCP,
pl.NoDisplayDate, pl.ProviderStatus, pl.InvalidDate,
l1.LookupValue as ReasonMain, l2.LookupValue as ReasonSub,
pl.InvalidData
FROM Plans pln
INNER JOIN Lookups l3 ON l3.LookupType = 'REGN'
AND pln.RegionID = l3.Lookupkey
left outer JOIN Provider p ON pln.PlanID = p.PlanID
left outer JOIN ProviderLocation pl ON p.ProviderID = pl.ProviderID
left outer JOIN Lookups l1 ON l1.LookupType = 'PLRM'
AND pl.ReasonMain = l1.LookupKey
left outer JOIN Lookups l2 ON l2.LookupType = 'PLX1'
AND pl.ReasonSub = l2.Lookupkey
WHERE pln.PlanID = '123456789' AND pl.ProviderStatus = 0
ORDER BY p.PlanID, p.ProviderID, pl.SiteLocationNum

I know the problem: the ProviderStatus test in the WHERE clause is keeping any records from being returned, but I'm not good enough at this to know another select.

Can anybody give me some suggestions?

Thanks
David
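The standard fix, sketched minimally: a filter on a left-joined table belongs in that join's ON clause; left in the WHERE clause, it throws away the NULL rows from unmatched plans and silently turns the outer join into an inner join.

Code Snippet
-- trimmed to the relevant columns; the status test moves into the join
SELECT pln.PlanName, pln.PlanID, p.ProviderID, pl.ProviderStatus
FROM Plans pln
INNER JOIN Lookups l3 ON l3.LookupType = 'REGN' AND pln.RegionID = l3.Lookupkey
LEFT OUTER JOIN Provider p ON pln.PlanID = p.PlanID
LEFT OUTER JOIN ProviderLocation pl
    ON p.ProviderID = pl.ProviderID
    AND pl.ProviderStatus = 0          -- moved out of WHERE into the join
WHERE pln.PlanID = '123456789'         -- only non-outer-table filters remain here
ORDER BY p.ProviderID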


Multi-database Multi-server

Mar 27, 2007

I am new to Reporting Services and hope that what I am looking to do is within capabilities :-)



I have many identical schema databases residing on a number of data servers. These support individual clients accessing them via a web interface. What I need to be able to do is run reports across all of the databases. So the layout is:



Dataserver A

Database A1

Database A2

Database A3



Dataserver B

Database B1

Database B2



Dataserver C

Database C1

Database C2

Database C3



I would like to run a report that pulls table data from A1, A2, A3, B1, B2, C1, C2, C3



Now the actual number of servers is 7 and the number of databases is close to 1000. All servers are running SQL2005.



Is this something that Reporting Services is able to handle or do I need to look at some other solution?



Thanks,



Michael


DTSRUN: To Multi-tier Or Not To Multi-tier. That Is The Question...

Jul 20, 2005

Greetings,

We are trying to set up a set of "Leading Practices" for our developers, as well as ourselves, and hope some gentle reader can recommend some documentation in favor of what appears to be the right position to take.

We do not allow third party applications to run on our SQL Servers. We want to include DTS packages under the definition of third party applications, insisting instead that the developers save their packages as COM formatted files into their source code control systems and run them from their app servers. The developers would like to hear this from someone besides ourselves.

While strong recommendations to remove guest access to MSDB altogether abound, I have been unable to find a straightforward discussion of the advantages of structured file storage and app server offload of DTS packages.

Can anyone suggest any articles, white papers, rants, etc. attempting to formulate a solution combining the benefits of taking msdb away from guest with the advantages of running DTS from an app server or workstation platform, with the packages protected in source code control?

Thank you
John Pollins
jpollins @ eqt . com


CLR-Based Trigger? Recursive Trigger? Common Table Expression?

Nov 14, 2006

Hey,

I'm new to this whole SQL Server 2005 thing as well as database design and I've read up on various ways I can integrate business constraints into my database. I'm not sure which way applies to me, but I could use a helping hand in the right direction.

A quick explanation of the various tables I'm dealing with:
WBS - the Work Breakdown Structure, for example: A - Widget 1, AA - Widget 1 Subsystem 1, and etc.
Impacts - the Risk or Opportunity impacts for the weights of a part/assembly. (See Assemblies have Impacts below)
Allocations - the review of the product in question, say Widget 1, in terms of various weight totals, including all parts. Example - September allocation, Initial Demo allocation, etc. Mostly used for weight history and trending
Parts - There are hundreds of Parts which will eventually lead to thousands. Each part has a WBS element. [Seems redundant, but parts are managed in-house, and WBS elements are cross-company and issued by the Government]
Parts have Allocations - For weight history and trending (see Allocations). Example, Nut 17 can have a September 1st allocation, a September 5th allocation, etc.
Assemblies - Parts are assemblies by themselves and can belong to multiple assemblies. Now, there can be multiple parts on a product, say, an unmanned ground vehicle (UGV), and so those parts can belong to a higher "assembly" [For example, there can be 3 Nut 17's (lower assembly) on Widget 1 Subsystem 2 (higher assembly) and 4 more on Widget 1 Subsystem 5, etc.]. What I'm concerned about is ensuring that the weight roll-ups are accurate for all of the assemblies.
Assemblies have Impacts - There is a risk and opportunity impact setup modeled into this design to allow for a risk or opportunity to be marked on a per-assembly level. That's all this table represents.

A part is allocated a weight and then assigned to an assembly. The Assemblies table holds this hierarchical information - the lower assembly and the higher one, both of which are Parts entries in the [Parts have Allocations] table.

Therefore, to ensure proper weight roll-ups in the [Parts have Allocations] table on a per-part basis, I would like to check for any inserts, updates, or deletes on both the [Parts have Allocations] table and the [Assemblies] table and then re-calculate the weight roll-up for every assembly. Now, I'm not sure if this is a huge performance hog, but I do need to keep all the information as up-to-date and as accurate as possible. As such, I'm not sure which method is even correct, although it seems an AFTER DML trigger is in order (from what I've gathered thus far). Keep in mind, this trigger needs to go through and check every WBS or Part and then go through and check all of its associated assemblies and then ensure the weights are correct by re-summing the weights listed.
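A hedged sketch of what the roll-up could look like with a recursive common table expression (SQL Server 2005); the table and column names are hypothetical stand-ins for the schema described above:

Code Snippet
-- a minimal sketch, assuming hypothetical tables:
--   Assemblies(HigherPartID, LowerPartID, Quantity)
--   PartAllocations(PartID, Weight)
;WITH Tree (RootID, PartID, Qty) AS
(
    -- anchor: the direct children of each assembly
    SELECT a.HigherPartID, a.LowerPartID, a.Quantity
    FROM Assemblies a
    UNION ALL
    -- recurse: walk down through sub-assemblies, multiplying quantities
    SELECT t.RootID, a.LowerPartID, t.Qty * a.Quantity
    FROM Tree t
    JOIN Assemblies a ON a.HigherPartID = t.PartID
)
SELECT t.RootID,
       SUM(t.Qty * pa.Weight) AS RolledUpWeight
FROM Tree t
JOIN PartAllocations pa ON pa.PartID = t.PartID
GROUP BY t.RootID;

Something of this shape could run inside an AFTER INSERT, UPDATE, DELETE trigger on both tables, ideally restricted to the roots reachable from the changed rows rather than every assembly.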

If you need the design or create script (table layout), please let me know.

Thanks.


Trouble With Update Trigger Modifying Table Which Fired Trigger

Jul 20, 2005

Are there any limitations or gotchas to updating the same table which fired a trigger from within the trigger?

Some example code below. Hmmm.... This example seems to be working fine so it must be something with my specific schema/code. We're working on running a SQL trace but if anybody has any input, fire away.

Thanks!

create table x(Id int, Account varchar(25), Info int)
GO
insert into x values ( 1, 'Smith', 15);
insert into x values ( 2, 'SmithX', 25);

/* Update trigger tu_x for table x */
create trigger tu_x
on x
for update
as
begin
    declare @TriggerRowCount int
    set @TriggerRowCount = @@ROWCOUNT
    if ( @TriggerRowCount = 0 )
        return
    if ( @TriggerRowCount > 1 )
    begin
        raiserror( 'tu_x: @@ROWCOUNT[%d] Trigger does not handle @@ROWCOUNT > 1 !', 17, 127, @TriggerRowCount) with seterror, nowait
        return
    end
    update x
    set Account = left( i.Account, 24) + 'X',
        Info = i.Info
    from deleted, inserted i
    where x.Account = left( deleted.Account, 24) + 'X'
end

update x set Account = 'Blair', Info = 999 where Account = 'Smith'


Generic Audit Trigger CLR C#(Works When The Trigger Is Attached To Any Table)

Dec 5, 2006

This Audit Trigger is Generic (i.e. non-"Table Specific"): attach it to any table and it should work. Be sure and create the 'Audit' table first though.

The following code write audit entries to a Table called
'Audit'
with columns
'ActionType' //varchar
'TableName' //varchar
'PK' //varchar
'FieldName' //varchar
'OldValue' //varchar
'NewValue' //varchar
'ChangeDateTime' //datetime
'ChangedBy' //varchar

using System;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public partial class Triggers
{
//A Generic Trigger for Insert, Update and Delete Actions on any Table
[Microsoft.SqlServer.Server.SqlTrigger(Name = "AuditTrigger", Event = "FOR INSERT, UPDATE, DELETE")]

public static void AuditTrigger()
{
SqlTriggerContext tcontext = SqlContext.TriggerContext; //Trigger Context
string TName; //Where we store the Altered Table's Name
string User; //Where we will store the Database Username
DataRow iRow; //DataRow to hold the inserted values
DataRow dRow; //DataRow to how the deleted/overwritten values
DataRow aRow; //Audit DataRow to build our Audit entry with
string PKString; //Will temporarily store the Primary Key Column Names and Values here
using (SqlConnection conn = new SqlConnection("context connection=true"))//Our Connection
{
conn.Open();//Open the Connection
//Build the AuditAdapter and matching Table
SqlDataAdapter AuditAdapter = new SqlDataAdapter("SELECT * FROM Audit WHERE 1=0", conn);
DataTable AuditTable = new DataTable();
AuditAdapter.FillSchema(AuditTable, SchemaType.Source);
SqlCommandBuilder AuditCommandBuilder = new SqlCommandBuilder(AuditAdapter);//Populates the Insert command for us
//Get the inserted values
SqlDataAdapter Loader = new SqlDataAdapter("SELECT * from INSERTED", conn);
DataTable inserted = new DataTable();
Loader.Fill(inserted);
//Get the deleted and/or overwritten values
Loader.SelectCommand.CommandText = "SELECT * from DELETED";
DataTable deleted = new DataTable();
Loader.Fill(deleted);
//Retrieve the Name of the Table that currently has a lock from the executing command(i.e. the one that caused this trigger to fire)
SqlCommand cmd = new SqlCommand("SELECT object_name(resource_associated_entity_id) FROM
ys.dm_tran_locks WHERE request_session_id = @@spid and resource_type = 'OBJECT'", conn);
TName = cmd.ExecuteScalar().ToString();
//Retrieve the UserName of the current Database User
SqlCommand curUserCommand = new SqlCommand("SELECT system_user", conn);
User = curUserCommand.ExecuteScalar().ToString();
//Adapted the following command from a T-SQL audit trigger by Nigel Rivett
//http://www.nigelrivett.net/AuditTrailTrigger.html
SqlDataAdapter PKTableAdapter = new SqlDataAdapter(@"SELECT c.COLUMN_NAME
from INFORMATION_SCHEMA.TABLE_CONSTRAINTS pk ,
INFORMATION_SCHEMA.KEY_COLUMN_USAGE c
where pk.TABLE_NAME = '" + TName + @"'
and CONSTRAINT_TYPE = 'PRIMARY KEY'
and c.TABLE_NAME = pk.TABLE_NAME
and c.CONSTRAINT_NAME = pk.CONSTRAINT_NAME", conn);
DataTable PKTable = new DataTable();
PKTableAdapter.Fill(PKTable);

switch (tcontext.TriggerAction)//Switch on the Action occurring on the Table
{
case TriggerAction.Update:
iRow = inserted.Rows[0];//Get the inserted values in row form
dRow = deleted.Rows[0];//Get the overwritten values in row form
PKString = PKStringBuilder(PKTable, iRow);//the the Primary Keys and There values as a string
foreach (DataColumn column in inserted.Columns)//Walk through all possible Table Columns
{
if (!iRow[column.Ordinal].Equals(dRow[column.Ordinal]))//If value changed
{
//Build an Audit Entry
aRow = AuditTable.NewRow();
aRow["ActionType"] = "U";//U for Update
aRow["TableName"] = TName;
aRow["PK"] = PKString;
aRow["FieldName"] = column.ColumnName;
aRow["OldValue"] = dRow[column.Ordinal].ToString();
aRow["NewValue"] = iRow[column.Ordinal].ToString();
aRow["ChangeDateTime"] = DateTime.Now.ToString();
aRow["ChangedBy"] = User;
AuditTable.Rows.InsertAt(aRow, 0);//Insert the entry
}
}
break;
case TriggerAction.Insert:
iRow = inserted.Rows[0];
PKString = PKStringBuilder(PKTable, iRow);
foreach (DataColumn column in inserted.Columns)
{
//Build an Audit Entry
aRow = AuditTable.NewRow();
aRow["ActionType"] = "I";//I for Insert
aRow["TableName"] = TName;
aRow["PK"] = PKString;
aRow["FieldName"] = column.ColumnName;
aRow["OldValue"] = null;
aRow["NewValue"] = iRow[column.Ordinal].ToString();
aRow["ChangeDateTime"] = DateTime.Now.ToString();
aRow["ChangedBy"] = User;
AuditTable.Rows.InsertAt(aRow, 0);//Insert the Entry
}
break;
case TriggerAction.Delete:
dRow = deleted.Rows[0];
PKString = PKStringBuilder(PKTable, dRow);
foreach (DataColumn column in deleted.Columns)//same schema as inserted, but deleted holds the rows on a delete
{
//Build and Audit Entry
aRow = AuditTable.NewRow();
aRow["ActionType"] = "D";//D for Delete
aRow["TableName"] = TName;
aRow["PK"] = PKString;
aRow["FieldName"] = column.ColumnName;
aRow["OldValue"] = dRow[column.Ordinal].ToString();
aRow["NewValue"] = null;
aRow["ChangeDateTime"] = DateTime.Now.ToString();
aRow["ChangedBy"] = User;
AuditTable.Rows.InsertAt(aRow, 0);//Insert the Entry
}
break;
default:
//Do Nothing
break;
}
AuditAdapter.Update(AuditTable);//Write all Audit Entries back to AuditTable
conn.Close(); //Close the Connection
}
}


//Helper function that takes a Table of the Primary Key Column Names and the modified rows Values
//and builds a string of the form "<PKColumn1Name=Value1>,PKColumn2Name=Value2>,......"
public static string PKStringBuilder(DataTable primaryKeysTable, DataRow valuesDataRow)
{
string temp = String.Empty;
foreach (DataRow kColumn in primaryKeysTable.Rows)//for all Primary Keys of the Table that is being changed
{
temp = String.Concat(temp, String.Concat("<", kColumn[0].ToString(), "=", valuesDataRow[kColumn[0].ToString()].ToString(), ">,"));
}
return temp;
}
}

The trick was getting the Table Name and the Primary Key Columns.
I hope this code is found useful.

Comments and Suggestion will be much appreciated.


Trigger - Require Help For Updating A Trigger Following An INSERT On Another Table

Oct 30, 2007

Table 1

First_Name    Middle_Name    Surname
John          Ian            Lennon
Mike          Buffalo        Tyson
Tom           Finney         Jones

Table 2

ID    F          M          S           DOB
1     Athony     Harold     Wilson      24/4/67
2     Margaret   Betty      Thathcer    1/1/1808
3     John       Ian        Lennon      2/2/1979
4     Mike       Buffalo    Tyson       3/4/04
5     Tom        Finney     Jones       1/1/2000
I want to be able to create a trigger that updates table 2 when a row is inserted into table 1. However I'm not sure how to increment the ID in table 2 or to update only the row that has been inserted.
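A hedged sketch of one way, assuming the ID in Table 2 is (or can be made) an IDENTITY column so the numbering takes care of itself, and that DOB may stay NULL for trigger-created rows; the bracketed table names below are stand-ins:

Code Snippet
-- a minimal sketch; [Table1]/[Table2] stand in for the real table names
CREATE TRIGGER tr_Table1_insert ON [Table1]
FOR INSERT
AS
-- inserted contains only the new rows, so only they are copied
INSERT INTO [Table2] (F, M, S)
SELECT First_Name, Middle_Name, Surname
FROM inserted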


Trigger - Require Help For Updating A Trigger Following An INSERT On Another Table

Feb 5, 2008

A

ID    Name
1     Joe
2     Fred
3     Ian
4     Bill

B

ID
1
4
I want to be able to create a trigger so that when a row is inserted into table A by a specific user then the ID will appear in table B. Is it possible to find out the login id of the user inserting a row?

I believe the trigger should look something like this:

create trigger test_trigger
on a
for insert
as
insert into b(ID)

select i.id
from inserted i
where
--specific USER


DTS PROCESSING

Aug 3, 2001

I have a DTS package that imports data from an Oracle database into SQL Server.
Doesn't the processing mostly occur on the SQL Server, not on the
Oracle database from which the data is being imported?
The Oracle database is vendor provided, and they are saying
our SQL Server DTS package is killing their server.
Any insight is appreciated.
Thanks


XML Processing

Mar 27, 2008

Hey All,

I've got a process that creates records in my database based on XML input that I've gotten. What I am doing is giving this XML to a stored procedure to handle a specific task, then modifying the XML and sending it to the next stored procedure.

For instance, the XML could hold header records with detail records. I would first send the XML to a stored procedure that creates the header records, then update the XML so it now knows the identity values of the header records I have just created, and then send the XML to the next stored procedure to create the details for those headers.

All works great and fine, but I have a problem with writing the identity values back to the XML. It seems I can only change one item in the XML at a time and thus need to loop this. For many records this really takes a long time.

Here is some sample code of what I'm doing (please excuse any typos, this is a simplified version of the code) :

declare @lvSeq numeric(15)
declare @lvRowNo int
declare @lvNumRows int

insert into myHeaderTable (
    recid, recdesc
) select
    ref.value('@recid', 'nvarchar(25)') recid,
    ref.value('@recdesc', 'nvarchar(250)') recdesc
from @pXML.nodes('//headers/header') R(ref)

select @lvRowNo = 1, @lvNumRows = @pXML.value('count(//headers/header)', 'int')
while (@lvRowNo <= @lvNumRows) begin
    -- fetch the identity generated for the header at this position
    select @lvSeq = recseq
    from myHeaderTable
    where recid = @pXML.value('(//headers/header[position()=sql:variable("@lvRowNo")]/@recid)[1]', 'nvarchar(25)')

    -- write it back into the XML, one node per iteration
    set @pXML.modify('replace value of (//headers/header[position()=sql:variable("@lvRowNo")]/@recseq)[1] with sql:variable("@lvSeq")')

    select @lvRowNo = @lvRowNo + 1
end


Obviously I am looking for a better way to update the XML with the sequences. The insert takes a second, the loop takes minutes with large XML sets. I guess MSSQL is searching the whole XML to find the item to update.

It would be nice if I didn't have to loop through the XML. One solution I was thinking of is to store the XML in a temporary table with a single record per header item. Then I could do the modify in one go and recreate the XML by simply selecting the contents of the temporary table. I have no idea if this is possible.

So something like this:

select
ref.value('@recid','nvarchar(25)') recid,
ref.value('.','XML') XMLData -- this gives an error
into #TMP_XML
from @pXML.nodes('//headers/header') R(ref)

insert into myHeaderTable (
recid, recdesc
) select
recid,
ref.value('@recdesc', 'nvarchar(250)') recdesc
from #TMP_XML CROSS APPLY XMLData.nodes('/header') R(ref)

update #TMP_XML
set XMLData.modify('replace ....')
from myheadertable
where #TMP_XML.recid = myheadertable.recid

-- recreate XML here, not sure how....
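A hedged alternative sketch that skips modifying the XML entirely: shred the headers once into a table variable, do the insert, then read the generated identities back by joining on recid, and let the detail insert join through that mapping instead of through the XML (this assumes recid uniquely identifies a header):

Code Snippet
-- a minimal sketch, assuming recid is unique per header
declare @hdr table (rn int identity(1,1), recid nvarchar(25), recdesc nvarchar(250))

insert into @hdr (recid, recdesc)
select ref.value('@recid', 'nvarchar(25)'),
       ref.value('@recdesc', 'nvarchar(250)')
from @pXML.nodes('//headers/header') R(ref)

insert into myHeaderTable (recid, recdesc)
select recid, recdesc from @hdr

-- recid -> recseq mapping for the detail step; no XML modify needed
select h.recid, t.recseq
from @hdr h
join myHeaderTable t on t.recid = h.recid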


SQL Mail Processing

Sep 1, 1998

Is it possible to read a mail message,
store its msgtext in a variable,
decrypt the retrieved string,
and execute the string (which is a SQL statement)?

If it is possible, which directions should I take
to implement the above?


Row Vs Column Processing Help!!

Jun 15, 2005

Hello friends,
I need a suggestion. I am currently working on a reporting website that generates reports, and I need to store all the reports in the database.

I usually go with row-wise processing as it can be easily controlled, but the problem is there will be a lot of reports - an estimated 30,000 rows a month - and I'm not sure if SQL Server can hold more than 2 billion rows per table.

Please Help!!


Regarding Records Processing???

Mar 1, 2006

Hi…all

I will just explain the whole scenario of the tricky problem I am facing.

We have XML files arriving at regular intervals from another source into SQL Server 2000, daily, with roughly 10,000 to 70,000 records each. We have a job scheduled to run at regular intervals, applying some filter criteria. The flow is: staging table into a secondary table, and then finally into the primary table.
We designed the DTS package accordingly - take the records from the staging table, put them into the secondary table, and then into the primary table (about 8 tasks are involved in it).
Suppose an XML file arrives at 8:30 am and our DTS package runs at 9:00 am, then 11:00 am, then 1:00 pm, and so on. What I have observed over many days is that after the job runs successfully at 9:00 am, some good data is still pending in the secondary table, not processed into the primary table. But when the job runs again at 11:00 am, it processes those pending good records into the primary table. Sometimes, when I run this job manually from the DTS designer, the good data pending in the secondary table gets processed!!!

My question is: why does this job not process all the good records in a single shot????

T.I.A

Papillon


XML Data Processing

Dec 30, 2005

Hello everyone. I need help regarding the following. Given the following table:

CREATE TABLE T1 (C1 nvarchar(10), C2 money)
INSERT INTO T1 VALUES ('A',1)
INSERT INTO T1 VALUES ('B',2)
INSERT INTO T1 VALUES ('C',3)

Let's say that I have this table on a local server and I want to upload it to a remote server, and on the remote server upload it to a database that contains the same table. The uploading part can be done by another application on the remote server, but what I need is a way to transfer the data in the fastest possible way. What steps do I need to follow?

tia,
Rey Guerrero


Processing Dataset As A Whole, And Not Per Row

Sep 20, 2007

Which component should be used to process a dataset as a whole, and not on a per-row basis? I need to process a dataset conditionally (the condition depending on the dataset itself), e.g. if a special row is present in the dataset then the dataset should be processed in a special way. Should I maybe use one Script Transformation to determine whether the dataset satisfies the condition (and store that result in a variable), and then, based on that condition (variable value), perform the processing or not (using Conditional Split and Script Transformation)?


Need Help With Row By Row Processing In SSIS

Feb 28, 2007

Hello all,

I am having trouble trying to construct the following process in SSIS/SQL
2005:

1. Grab a set of unprocessed rows (ProcessDT = null) in an 'Action' table
2. For each of these rows, execute multiple stored procedures base on the
action type
If actiontype = 1, exec spAct1a @param1, @parm2
exec spAct1b @param1, @parm2, @param3, @param4
If actiontype = 2, exec spAct2a @param1, @parm2, @param3
exec spAct2b @param1, @parm2, @param3
etc....
3. Update ProcessDT so it's not processed again
4. Repeat until all rows are processed

Note - all sp params are contained in additional columns in the Action
table. Basically the Action table is a store for post-event processing of
sorts
but is order dependent, hence the row by row processing. And some of my
servers are 2000 so Service Broker is not an option (yet).
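If the driver ends up in T-SQL instead of SSIS, a hedged sketch of the row-by-row shape described above (the Action table's key and parameter columns are assumed; the spAct names come from the pseudocode):

Code Snippet
-- a minimal sketch, assuming Action(ActionID, ActionTypeID, Param1, Param2, ..., ProcessDT)
DECLARE @ActionID int, @ActionTypeID int, @Param1 int, @Param2 int

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT ActionID, ActionTypeID, Param1, Param2
    FROM dbo.[Action]
    WHERE ProcessDT IS NULL
    ORDER BY ActionID              -- order dependent, so process in key order

OPEN c
FETCH NEXT FROM c INTO @ActionID, @ActionTypeID, @Param1, @Param2
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @ActionTypeID = 1
    BEGIN
        EXEC spAct1a @Param1, @Param2
        EXEC spAct1b @Param1, @Param2  -- additional params omitted for brevity
    END
    ELSE IF @ActionTypeID = 2
    BEGIN
        EXEC spAct2a @Param1, @Param2
        EXEC spAct2b @Param1, @Param2
    END

    -- mark the row so it is not processed again
    UPDATE dbo.[Action] SET ProcessDT = GETDATE() WHERE ActionID = @ActionID

    FETCH NEXT FROM c INTO @ActionID, @ActionTypeID, @Param1, @Param2
END
CLOSE c
DEALLOCATE c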

I first attempted to do this totally within the control flow - using an ado
recordset/foreach loop control, but I could not figure out how to run
conditional
process paths based on the ActionTypeID. I then tried to do this within the
dataflow using on OLEDB data source, a conditional split, and an oledb
command control
which almost got me there - the problem being for each row I need to execute
multiple sp's and it appears as if the oledb command only gives me one sp.

Any help would be appreciated!

Thanks in advance!


Processing One Row At A Time

Jul 11, 2007

I'm populating a new table based on information in an existing table. The new table is a list of all "items" and contains a primary key. The old table is a database of receipts where items can appear many times in any order.



I have put together the off-the-shelf components to do this, using a lookup transformation to see if the item is already in the new table. Problem is, because there's so much repetition in the old table I need to process the old table one row at a time. Batch processing is generating errors because the lookup doesn't detect duplicates within the buffer.



I tried setting the "DefaultBufferMaxRows" property of the task to 1, but that doesn't seem to have any effect.



To get the data from the old table, I'm using an OLE DB source. To get the data into the new table, I'm using the OLE DB Command transformation with parameters to execute an INSERT statement.



This is a job I have to do exactly once, so I don't care if I have to run it overnight. I'd rather have a simple, easy to understand but inefficient script so I understand what it's doing completely.



Any help on forcing SSIS to process one row at a time?



Lee Silverman
JackRabbit Sports


Processing Time

Sep 25, 2007

Hi,



I am verifying my reports processing time. I get the information from the Reporting Service DB - [ExecutionLogs] table. I have the following information:



[TimeEnd] - time that reports generation ends.

[TimeStart] - time that reports generation starts.

[TimeDataRetrieval] - amount of time spent running the data sources.

[TimeProcessing] - time spent processing the report.

[TimeRendering] - time spent generating the output format.



If this information is correct the following statement should be true:



([TimeEnd] - [TimeStart]) = ([TimeDataRetrieval] + [TimeProcessing] + [TimeRendering])



But it isn't: ([TimeEnd] - [TimeStart]) is always bigger than ([TimeDataRetrieval] + [TimeProcessing] + [TimeRendering]).



Why does this happen?



Regards,


Rodrigo


Query Processing

Apr 6, 2006

Can anyone explain how query processing works with database mirroring?


AS Processing Task

Nov 27, 2005

When using the AS processing task with a connection to "an Analysis Services project in this solution", only some processing options are available for processing dimensions. For instance, it is not possible to select "Process Update". Once I change the connection manager to point to the deployed cube database, I can choose from all the options. Is this by design?


Lot And Fifo Processing In Sql

Feb 12, 2008

I need help from sql experts for the following problem

DOCTYPE         DATE    QTY   PRD   LOT
Purchase        1 jan   20+   AA    2007FW
Purchase        4 jan   50+   AA    2007SS
Purchase        9 jan   10+   AA    2007FW
Sale            3 jan   10-   AA
Sale            4 jan   20-   AA
Returned Good   4 feb   10    AA


As you can see I don't have the LOT code in sales records, so I must update these records with the following logic:

I have to assign LOT code to sales in FIFO order:
from the table above the 3rd of January I find the first sale of 10, take the first LOT (ascending date order), check for qty on hand (20+), consume 10 from LOT (10 remaining), update sale record with 2007FW LOT code.

Then I find next sale of 20, as before I take the first LOT with qty on hand to consume, again it's the first record with only 10 remaining so I set LOT qty on hand to zero but I have 10 more to allocate to a LOT code. So I find next available LOT of 50 and consume 10, with 40 remaining.

It's important to remember that I can sell part of a "lot", have to track the remaining goods per LOT

Is it possible in SQL to do that in batch mode? How? If I have to split a sale record across 2 or more LOTs, how can I do that? Can you show me SQL or good hints?
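A hedged sketch of one batch approach. All table and column names here are hypothetical stand-ins for the data above; a child table holds the allocations so one sale can be split across lots, and a QtyRemaining column on the purchases tracks what is left of each lot:

Code Snippet
-- a minimal sketch, assuming hypothetical tables:
--   Sales(SaleID, SaleDate, PrdCode, Qty)                           -- Qty as a positive number
--   Purchases(PurchaseID, PurchDate, PrdCode, LotCode, QtyRemaining) -- starts at purchased qty
--   SaleLots(SaleID, LotCode, Qty)                                  -- allocation result
DECLARE @SaleID int, @Prd char(2), @Need int
DECLARE @PurchaseID int, @LotCode char(6), @Avail int, @Take int

DECLARE sales CURSOR LOCAL FAST_FORWARD FOR
    SELECT SaleID, PrdCode, Qty
    FROM Sales
    ORDER BY SaleDate, SaleID          -- allocate in date order

OPEN sales
FETCH NEXT FROM sales INTO @SaleID, @Prd, @Need
WHILE @@FETCH_STATUS = 0
BEGIN
    WHILE @Need > 0
    BEGIN
        -- FIFO: oldest purchase of this product with stock remaining
        SELECT TOP 1 @PurchaseID = PurchaseID, @LotCode = LotCode, @Avail = QtyRemaining
        FROM Purchases
        WHERE PrdCode = @Prd AND QtyRemaining > 0
        ORDER BY PurchDate, PurchaseID

        IF @@ROWCOUNT = 0 BREAK        -- no stock left to allocate

        SET @Take = CASE WHEN @Avail < @Need THEN @Avail ELSE @Need END

        INSERT INTO SaleLots (SaleID, LotCode, Qty) VALUES (@SaleID, @LotCode, @Take)

        UPDATE Purchases SET QtyRemaining = QtyRemaining - @Take
        WHERE PurchaseID = @PurchaseID

        SET @Need = @Need - @Take
    END
    FETCH NEXT FROM sales INTO @SaleID, @Prd, @Need
END
CLOSE sales
DEALLOCATE sales

Returned goods would be handled the same way in reverse: add the returned quantity back to the lot recorded in SaleLots for that sale.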

Thanx in advance







