What Are The Scenarios
May 12, 2006
In what scenarios should one use SSIS: different data sources, different destinations, etc.? I have never been in such a situation; I have been a developer. Besides SSIS, are there any other MS tools?
Thanks
Does anyone have any good places where I can get some practice scenarios for DBA activity? Also, any Transact-SQL puzzles to solve for practice purposes? I want to get as much "real world" activity under my belt as possible in a short time frame.
Thanks.
What are the specific types of scenarios where we could use SSB and BizTalk in tandem?
I have come across a GotDotNet sample of an SSB adapter for BizTalk. As I understand it, a BizTalk orchestration could be an endpoint for the SSB conversation.
But what advantages can be obtained by using this, as compared to a typical SQL adapter for BizTalk that does CRUD operations on the database?
Any pointers in this direction would be helpful.
TIA
Paritosh
I have done some performance testing to see whether asynchronous triggers perform any better than synchronous triggers in a simple audit scenario: capturing record snapshots on insert, update, and delete events to a separate database within the same instance of SQL Server.
Synchronous triggers performed 50% better than asynchronous triggers. This was with conversation reuse and receive-queue activation turned off, so the poor performance was purely in the act of forming and sending the message, not receiving and processing it. This was not necessarily surprising to me, and yet I have to wonder under what conditions we would see real performance benefits for audit scenarios.
I am interested in whether anyone has done similar testing, and whether they got similar or different results. If anyone found conditions where asynchronous triggers pulled ahead for audit scenarios, I would really like to hear from them. I invite any comments or suggestions for better performance.
The asynchronous trigger:
ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
FOR INSERT AS
BEGIN
    DECLARE
        @CONVERSATION  UNIQUEIDENTIFIER ,
        @MESSAGE       XML ,
        @LOG_OPERATION CHAR(1) ,
        @LOG_USER      VARCHAR(35) ,
        @LOG_DATE      DATETIME;

    -- Reuse an existing dialog; with no WHERE clause this grabs an
    -- arbitrary endpoint, so it assumes a single pre-established
    -- conversation dedicated to the audit messages.
    SELECT TOP(1)
        @CONVERSATION  = CONVERSATION_HANDLE ,
        @LOG_OPERATION = 'I' ,
        @LOG_USER      = USER_NAME() ,   -- USER() with parentheses is not valid T-SQL
        @LOG_DATE      = GETDATE()
    FROM SYS.CONVERSATION_ENDPOINTS;

    -- Build one XML snapshot of the inserted rows.
    SET @MESSAGE =
    ( SELECT
          CUST_ID       = NEW.CUST_ID ,
          CUST_DESCR    = NEW.CUST_DESCR ,
          CUST_ADDRESS  = NEW.CUST_ADDRESS ,
          LOG_OPERATION = @LOG_OPERATION ,
          LOG_USER      = @LOG_USER ,
          LOG_DATE      = @LOG_DATE
      FROM INSERTED NEW
      FOR XML AUTO );

    -- Hand the snapshot to Service Broker; the audit write happens
    -- asynchronously on the receiving side.
    SEND ON CONVERSATION @CONVERSATION
        MESSAGE TYPE CUSTOMER_LOG_MESSAGE ( @MESSAGE );
END;
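For reference, here is a minimal sketch of the Service Broker plumbing a trigger like this assumes. Apart from CUSTOMER_LOG_MESSAGE, every object name here (the contract, queues, services, and the PROCESS_CUSTOMER_LOG procedure) is illustrative rather than taken from the original test:

-- Message type and contract for the audit messages.
CREATE MESSAGE TYPE CUSTOMER_LOG_MESSAGE
    VALIDATION = WELL_FORMED_XML;

CREATE CONTRACT CUSTOMER_LOG_CONTRACT
    ( CUSTOMER_LOG_MESSAGE SENT BY INITIATOR );

-- Queues and services on both ends of the conversation.
CREATE QUEUE CUSTOMER_LOG_SEND_QUEUE;
CREATE QUEUE CUSTOMER_LOG_RECEIVE_QUEUE
    WITH ACTIVATION ( STATUS = OFF ,   -- activation was turned off during the test
                      PROCEDURE_NAME = DBO.PROCESS_CUSTOMER_LOG ,
                      MAX_QUEUE_READERS = 1 ,
                      EXECUTE AS SELF );

CREATE SERVICE CUSTOMER_LOG_SEND_SERVICE
    ON QUEUE CUSTOMER_LOG_SEND_QUEUE ( CUSTOMER_LOG_CONTRACT );
CREATE SERVICE CUSTOMER_LOG_RECEIVE_SERVICE
    ON QUEUE CUSTOMER_LOG_RECEIVE_QUEUE ( CUSTOMER_LOG_CONTRACT );

-- One long-lived conversation, begun once and then reused by the trigger.
DECLARE @CONVERSATION UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @CONVERSATION
    FROM SERVICE CUSTOMER_LOG_SEND_SERVICE
    TO SERVICE 'CUSTOMER_LOG_RECEIVE_SERVICE'
    ON CONTRACT CUSTOMER_LOG_CONTRACT
    WITH ENCRYPTION = OFF;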
The synchronous trigger:
ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
FOR INSERT AS
BEGIN
    DECLARE
        @LOG_OPERATION CHAR(1) ,
        @LOG_USER      VARCHAR(35) ,   -- was VARCHAR(15); widened to match the async version
        @LOG_DATE      DATETIME;

    SELECT
        @LOG_OPERATION = 'I' ,
        @LOG_USER      = USER_NAME() ,   -- USER() with parentheses is not valid T-SQL
        @LOG_DATE      = GETDATE();

    -- Write the snapshot directly to the audit table, inside the
    -- same transaction as the original insert.
    INSERT INTO SALES_LOG.DBO.CUSTOMER
        ( CUST_ID , CUST_DESCR , CUST_ADDRESS ,
          LOG_OPERATION , LOG_USER , LOG_DATE )
    SELECT
        NEW.CUST_ID ,
        NEW.CUST_DESCR ,
        NEW.CUST_ADDRESS ,
        @LOG_OPERATION ,
        @LOG_USER ,
        @LOG_DATE
    FROM INSERTED NEW;
END;
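For completeness, the receiving side of the asynchronous version, which was deliberately turned off during the test above, might look roughly like the following activated procedure. The procedure name, the XML shredding, and the column types are assumptions based on the trigger's FOR XML AUTO message shape:

-- Hypothetical activated procedure that drains the audit queue and
-- writes the snapshots to the log table.
CREATE PROCEDURE DBO.PROCESS_CUSTOMER_LOG
AS
BEGIN
    DECLARE @MESSAGE XML;
    WHILE ( 1 = 1 )
    BEGIN
        -- Wait up to one second for the next message, then give up.
        -- (System messages such as EndDialog are ignored for brevity.)
        WAITFOR
        ( RECEIVE TOP(1) @MESSAGE = CAST( MESSAGE_BODY AS XML )
          FROM CUSTOMER_LOG_RECEIVE_QUEUE ) , TIMEOUT 1000;
        IF ( @@ROWCOUNT = 0 ) BREAK;

        -- Shred the XML snapshot into the audit table; the types
        -- here are guesses at the real column definitions.
        INSERT INTO SALES_LOG.DBO.CUSTOMER
            ( CUST_ID , CUST_DESCR , CUST_ADDRESS ,
              LOG_OPERATION , LOG_USER , LOG_DATE )
        SELECT
            NEW.value('@CUST_ID', 'INT') ,
            NEW.value('@CUST_DESCR', 'VARCHAR(50)') ,
            NEW.value('@CUST_ADDRESS', 'VARCHAR(100)') ,
            NEW.value('@LOG_OPERATION', 'CHAR(1)') ,
            NEW.value('@LOG_USER', 'VARCHAR(35)') ,
            NEW.value('@LOG_DATE', 'DATETIME')
        FROM @MESSAGE.nodes('/NEW') AS T(NEW);
    END;
END;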
Hi
I am writing a package that I want to have email me on two possible success scenarios.
Essentially, these are the conditions:
IF @result = 0
BEGIN
EXEC master..xp_cmdshell @copyfile, NO_OUTPUT
PRINT 'Operation Successful'
END
ELSE
BEGIN
RAISERROR ('This operation failed. Error Code:01. The source and destination files are identical.', 0, 1)
END
If the first condition is met, I want to fire off an email stating success.
If the second condition is met (the RAISERROR), then I want to fire off a different email.
Now, the problem is, I am not sure how to flow it. Both conditions complete successfully, so it always fires off the success email if I use an "On Success" flow.
How do I capture the RAISERROR before the email task, to tell it which email to send?
Does this make sense?
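One common way to handle this, sketched under the assumption that the snippet runs in an Execute SQL Task: with severity 0, RAISERROR is purely informational, which is why both branches currently count as success. Raising the error with severity 16 or higher makes the task itself fail, so one Send Mail task can hang off an On Success precedence constraint and another off On Failure:

IF @result = 0
BEGIN
    EXEC master..xp_cmdshell @copyfile, NO_OUTPUT;
    PRINT 'Operation Successful';   -- task succeeds; the On Success path fires
END
ELSE
BEGIN
    -- Severity 16 turns this into an actual error, so the Execute SQL
    -- Task fails and the On Failure path fires instead.
    RAISERROR ('This operation failed. Error Code:01. The source and destination files are identical.', 16, 1);
END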
I'm attempting to develop a course whose objective is to present users with several scenarios of broken or hung databases, caused by different things, which they then have to diagnose and fix.
Do you have any ideas or examples of how to break a database (and the reasoning behind it), and how to repair it afterwards?
Richard
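One classic drill, as a rough sketch with an illustrative database name: take a known-good backup, break the database deliberately at the file level, and have the students diagnose the startup failure and recover with RESTORE (corruption variants can be assessed with DBCC CHECKDB first):

-- Take a known-good full backup before breaking anything.
BACKUP DATABASE TrainingDB
    TO DISK = 'C:\Backups\TrainingDB_full.bak'
    WITH INIT;

-- One way to "break" it: take the database offline, delete or rename
-- its .mdf file at the OS level, then try to bring it back online.
ALTER DATABASE TrainingDB SET OFFLINE;
-- (delete or rename the data file here, outside SQL Server)
ALTER DATABASE TrainingDB SET ONLINE;   -- fails; students must work out why

-- The repair the students should arrive at: restore from the backup.
RESTORE DATABASE TrainingDB
    FROM DISK = 'C:\Backups\TrainingDB_full.bak'
    WITH REPLACE, RECOVERY;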
Hello, I had a notification set up using xp_sendmail that worked fine for a while. Recently I updated SQL Server to SP3a, and we moved the mailbox I was using to an Exchange 2003 server. I can still send my notification if I use the domain ID that runs the SQL service or the sa ID, but not the NT IDs that were running it before. My users use NT authentication from a domain different from the one where the SQL Server resides. There is a trust, and nothing has changed with that.
Below are the results when I try to run the notification using an NT ID. This ID has full permissions over the SQL Server.
SQL Mail session started.
ODBC error 8198 (42000) Could not obtain information about Windows NT group/user 'INTERNAL\xxx.xxx'.
Stopped SQL Mail session.
As you can see, SQL Mail starts and stops OK, but I get the error on the xp_sendmail itself. I can run xp_logininfo to return all of the IDs using this NT login. But if I run xp_logininfo on just the problem ID, I get the following results.
EXEC master..xp_logininfo @acctname = 'INTERNAL\xxx.xxx'
Server: Msg 8198, Level 16, State 24, Procedure xp_logininfo, Line 58
Could not obtain information about Windows NT group/user 'INTERNAL\xxx.xxx'.
Here's when it works.
EXEC master..xp_logininfo
BUILTIN\Administrators group admin BUILTIN\Administrators NULL
INTERNAL\xxx.xxx user admin INTERNAL\xxx.xxx NULL
INTERNAL\Sxx Axx group admin INTERNAL\Sxx Axx NULL
SISDOM\sxx.dxx.axx user admin SISDOM\sxx.dxx.axx NULL
SISDOM\Sxx.Dxx.Axx group admin SISDOM\Sxx.Dxx.Axx NULL
INTERNAL\lxxx.rxxx user user INTERNAL\lxxx.rxxx NULL
The ones in bold work for everything. Please advise?
Julie
Hi
I am trying to find disaster recovery scenarios for Analysis Services, documentation for which seems to be lacking. Is there anything documented?
Thanks
I know that for a solid comparison between using DataReaders and DataSets I will have to perform the testing myself. But for now I will be utilizing DataSets.
What I am doing currently is utilizing assemblies to create my DataSets ahead of time. I will eventually compile them as DLLs; I'm just utilizing assemblies during my building/testing phase.
My question is:
Is it faster to completely build the DataSets and all needed connections inside the assemblies/DLLs and fill them? Or to build the DataSets and connections as a sub procedure that can be accessed, and then fill them as each required set of data is needed?
I ask because I will have many different data connections, so I'm not sure whether it's faster to explicitly build and fill almost every one and have them compiled at runtime, ready to be accessed, or to fill them when called from a sub, etc.
As I understand it, the server should track and monitor which are used the most and cache them, so as to operate faster. I wonder if it will still do this if the DataSets aren't pre-filled?