I'm using Service Broker and keep getting errors in the log even though everything is working as expected.
The setup: SQL Server 2005, two databases, two endpoints (one in each database), and two stored procedures. SP1 is activated when a message enters the sending queue; it inserts a new row in a table. SP2 is activated when a response is sent from the receiving queue; it cleans up the sending queue.
I have a table with an update trigger. In that trigger, if the updated row meets a certain condition, a dialog is created and a message is sent to the sending queue. I know that SP1 and SP2 are behaving properly because I get the expected results: SP1 is inserting the expected data in the table and SP2 is cleaning up the sending queue.
In the SQL Server log, however, I'm getting errors on both of the stored procs.

Error #1: The activated proc <SP 1 Name> running on queue Applications.dbo.ffreceiverQueue output the following: 'The conversation handle is missing. Specify a conversation handle.'
Error #2: The activated proc <SP 2 Name> running on queue ADAPT_APP.dbo.ffsenderQueue output the following: 'The conversation handle is missing. Specify a conversation handle.'
I would appreciate anybody's help with why I'm getting this. Have I set up the stored procs incorrectly?
I can provide the code of the stored procs if that helps.
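A common cause of that particular log message is an activation proc that fires, finds the queue already drained, and then calls SEND or END CONVERSATION with a NULL handle. A minimal sketch of guarding against that (the queue name is taken from the post, everything else is hypothetical):

DECLARE @handle uniqueidentifier, @msgType sysname, @body varbinary(max);

RECEIVE TOP (1)
    @handle  = conversation_handle,
    @msgType = message_type_name,
    @body    = message_body
FROM dbo.ffreceiverQueue;

-- Nothing was waiting: exit quietly instead of touching a NULL handle.
IF @handle IS NULL
    RETURN;

-- ... insert the row / clean up here ...

IF @msgType = 'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
    END CONVERSATION @handle;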
Hi! I have a Try...Catch block that inserts some data into the database. During the insert, a duplicate key error could be raised, for example. The question is: how can I distinguish it from other SQL errors? The exception object (Catch ex As Exception) has only a message property, '{"Cannot insert duplicate key row in object 'dbo.Group_Courses' with unique index 'IX_Group_Courses'. The statement has been terminated."}', and type System.Data.SqlClient.SqlException. Knowing the type of the error is not enough, because there are different SqlExceptions. Even the message is not unique for this error, because now I'm dealing with 'dbo.Group_Courses' and later it could be another table. Is there something that uniquely identifies each error, for example an error code? If it exists, where can I get it? Thanks in advance!
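For what it's worth, the number you are after travels with the exception: SqlException.Number (and each entry in SqlException.Errors) carries the server's error number, which for duplicate keys is 2601 (unique index) or 2627 (unique constraint) regardless of which table is involved. A quick way to see these numbers on the server side, purely as a lookup:

-- Duplicate-key violations always map to these two error numbers.
SELECT error, severity, description
FROM master.dbo.sysmessages
WHERE error IN (2601, 2627)
  AND msglangid = 1033   -- English message text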
Hi, I would like to handle a SQL error in T-SQL and return a certain value if an error occurs. For example, when I add a record I want to return a certain identity value, or maybe a transaction status (0 for incomplete, 1 for successful). If an error occurs in SQL, I cannot return any values back to ASP.NET. What I am doing at the moment is catching the error in ASP.NET and then displaying an error message. Is there a way to return only a return value to ASP.NET and handle the error itself in T-SQL? Thanks
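A sketch of one way to do this, assuming SQL Server 2005 or later so TRY...CATCH is available (procedure and table names here are made up): the error is absorbed on the T-SQL side, and only a status and identity value travel back to ASP.NET.

CREATE PROCEDURE dbo.AddRecord        -- hypothetical proc/table names
    @Name   varchar(50),
    @NewID  int OUTPUT,
    @Status bit OUTPUT                -- 0 = incomplete, 1 = successful
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        INSERT INTO dbo.Records (Name) VALUES (@Name);
        SELECT @NewID = SCOPE_IDENTITY(), @Status = 1;
    END TRY
    BEGIN CATCH
        -- Error swallowed here, so nothing is raised back to the client.
        SELECT @NewID = -1, @Status = 0;
    END CATCH
END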
How do I capture error Msg 8114 in SQL Server 2000?
Below is the sample code I have used.
CREATE TABLE Test ( Column1 int IDENTITY, Column2 int NOT NULL )
USE tempdb
go

ALTER PROCEDURE CHK_INSERT @Column2 int
AS
DECLARE @ExecQuery nVARCHAR(4000)
SET @ExecQuery = 'INSERT test VALUES (' + CAST(@Column2 AS varchar) + ')'
EXEC (@ExecQuery)
IF @@ERROR <> 0
BEGIN
    PRINT 'Error Occured'
END
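Because 8114 is a conversion error, one way to sidestep it entirely is to stop concatenating the value into the dynamic string and pass it as a typed parameter via sp_executesql, then check the return code and @@ERROR. This is only a sketch of that alternative, not a guarantee that every 8114 becomes trappable in SQL Server 2000:

DECLARE @err int, @rc int
EXEC @rc = sp_executesql
     N'INSERT INTO Test (Column2) VALUES (@p1)',
     N'@p1 int',
     @p1 = @Column2
SELECT @err = @@ERROR
IF @err <> 0 OR @rc <> 0
    PRINT 'Error ' + CAST(@err AS varchar) + ' occurred during INSERT'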
SET @ID = SUBSTRING(@strDatos, 10, 20)
SET @Monto = CAST(SUBSTRING(@strDatos, 30, 7) AS decimal(9,2)) / 100
SET @TipMov = CAST(SUBSTRING(@strDatos, 7, 3) AS int)
SET @Tienda = CAST(SUBSTRING(@strDatos, 2, 2) AS int)
SET @NoCaja = CAST(SUBSTRING(@strDatos, 5, 2) AS int)
SET @Fecha = CONVERT(char(11), GETDATE(), 101)
SET @Hora = CONVERT(varchar, GETDATE(), 108)
SET @NoAuto = ABS(CHECKSUM(NEWID())) % 100000 + 1

-- Save the current balance and the name in variables
SELECT @SaldoActual = SALDO, @NomEmp = NOM_EMP, @Trans_Num = last_trans_num + 1
FROM TBSALDOS
WHERE ID = @ID
-- select @SaldoActual, @ID, @Monto, @TipMov, @Tienda, @NoCaja, @Fecha, @Hora, @NoAuto

IF (@Monto <= @SaldoActual)
BEGIN
    SET @SaldoNuevo = @SaldoActual - @Monto
Hi, I get this error when I execute the DTS package manually. But when I execute the DTS package from my .aspx page, I can't handle this error in the "... catch(Exception ex) {... }" part:

catch (Exception ex)
{
    return ex.Message;
    //DtsPackage.OnError += new PackageEvents_OnErrorEventHandler(DtsPackage_OnError);
}

Here is the DTS message in a text file:
Step 'DTSStep_DTSDataPumpTask_3' failed
Step Error Source: Microsoft OLE DB Provider for SQL Server
Step Error Description: Transaction (Process ID 124) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Step Error code: 80004005
Step Error Help File:
Step Error Help Context ID: 0

Any idea?
CREATE FUNCTION [dbo].[ToTime]
(
    @intHora int, -- A valid hour
    @intMin int   -- A valid minute
)
RETURNS smalldatetime
AS
BEGIN
    DECLARE @strTime smalldatetime
    DECLARE @errorvar int

    SELECT @strTime = CAST(CONVERT(varchar, CAST((CAST(@intHora AS varchar) + ':' + CAST(@intMin AS varchar)) AS smalldatetime), 108) AS varchar)

    RETURN @strTime;
END
The function works perfectly, but when the parameter for the hour is a negative number (for example -1) or a number > 23, or the parameter for the minute is a negative number (-1) or a number > 59, the function produces an error. I need to handle this error by converting the wrong value to 0, but I don't want to do this using an IF statement, for example:
IF @intHora < 0 OR @intHora > 23
BEGIN
    SET @intHora = 0
END
IF @intMin < 0 OR @intMin > 59
BEGIN
    SET @intMin = 0
END
Please, if someone knows some SQL function (TRY...CATCH doesn't work here) to handle this kind of error, or some good way to do it, please help me.
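One option that avoids IF blocks is to fold the range check into CASE expressions inside the existing assignment. This is just a sketch of that substitution (CASE rather than any built-in validation function), using the same parameter names:

SELECT @strTime = CAST(CONVERT(varchar,
    CAST(
        CAST(CASE WHEN @intHora BETWEEN 0 AND 23 THEN @intHora ELSE 0 END AS varchar)
        + ':' +
        CAST(CASE WHEN @intMin BETWEEN 0 AND 59 THEN @intMin ELSE 0 END AS varchar)
    AS smalldatetime), 108) AS varchar)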
Basically, the error I am getting is in our test automation when running as non-admin on the box (regular user). We use the .NET C# SqlConnection class to connect to SQL Express 2005, impersonating with admin credentials. After getting the connection we try to execute a SELECT command and it sometimes fails with the following error:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The handle is invalid.)
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected)
   at System.Data.SqlClient.TdsParserStateObject.ReadBuffer()
   at System.Data.SqlClient.TdsParserStateObject.ReadByte()
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlDataReader.ConsumeMetaData()
   at System.Data.SqlClient.SqlDataReader.get_MetaData()
   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)
   at System.Data.SqlClient.SqlCommand.ExecuteReader()
This happens only in the non-admin scenario mentioned above. Any idea what is triggering this?
Below is the script. The problem is that when I simulated the Oracle link drop, my SQL 2000 never gets a chance to head to the GOTO section, as it dies with this error message and exits. Any idea on a workaround? Thanks.

Server: Msg 7399, Level 16, State 1, Procedure USP_Link_Check, Line 8
OLE DB provider 'MSDAORA' reported an error.
[OLE/DB provider returned message: ORA-12154: TNS:could not resolve service name]
OLE DB error trace [OLE/DB Provider 'MSDAORA' IDBInitialize::Initialize returned 0x80004005: ].

----------------------------------------------
ALTER PROCEDURE [USP_Link_Check]
AS
DECLARE @myERROR int,     -- Local @@ERROR
        @myRowCount int   -- Local @@ROWCOUNT

--- Verify network connections
select *
from openquery(OraLink, 'select count(*) from Oracle.table')

IF @myERROR != 0 GOTO HANDLE_ERROR

HANDLE_ERROR:
Print ' Error in Oracle Link'
RETURN @myERROR
---------------------------------------------
How would you do logging during a massive row load? I'm having problems because every row error (caused by casting, format, or lookup failures) in a transformation task is redirected to a text file as a log. This is fine when only one error exists per row, but when I have two errors in the same row detected by different transformation tasks, only the first one is reported to the text file. I have to wait for the second information load, after I correct the first error, to find the second one. I need to validate as many errors as exist per row in the same load.
Which component or which strategy can I use in an SSIS package to achieve this?
I need help with this. What I've done so far is a data processing extension that gets its data from a web service. I use it as a data source for a report, and that report is displayed within an ASPX page containing a ReportViewer control. The problem is catching errors that occur within the data processing extension.
When I debugged the extension, I saw that after throwing an exception the code ends up inside the Cancel() method of the IDbCommand implementation...
In all the examples of a data processing extension I could find online, there is always a comment saying that Cancel is not implemented, and a NewUnsupportedExtension() is thrown.
What happens is I get an uninformative, non-user-friendly error from Reporting Services, something like "an error occurred during the report processing (rsSomething)".
How should I go about handling those exceptions that happen inside the data processing extension?
I've tried catching the error inside the ReportViewer control's ReportError event, but it seems that event doesn't fire when these errors are generated.
Any help on how to approach this would be highly welcome.
Hi All, I need to send out an email when an error occurs in the package. Is it good practice to put the Send Mail task in the event handler? MaximumErrorCount is set to 1, but for some reason I sometimes see more than one email sent out. Please advise. Thanks
We have implemented our Service Broker architecture using conversation handle reuse per MS/Remus's recommendations. We have all of a sudden started receiving "conversation handle not found" errors in the SQL log every hour or so (which makes perfect sense considering the dialog timer is set for 1 hour). My question is: is this expected behavior when you have employed conversation recycling? Should you expect to see these messages pop up every hour, with the logic in the queuing proc retrying after deleting from your conversation handle table, so the message is enqueued as expected?
Second question: I think I know why we were not receiving these errors before and wanted to confirm this theory as well. In the queuing proc I was not initializing the variable @Counter to 0, so when it came down to the retry logic it could not add 1 to NULL and never entered that part of the code. I am guessing that, with this setup, it would actually surface the error to the application calling the queuing proc and NOT write it into the SQL error logs. Is this a correct assumption?
I have attached an example of one of the queuing procs below:
DECLARE @conversationHandle UNIQUEIDENTIFIER,
        @err int,
        @counter int,
        @DialogTimeOut int,
        @Message nvarchar(max),
        @SendType int,
        @ConversationID uniqueidentifier

select @Counter = 0 -- THIS PART VERY IMPORTANT LOL :)

select @DialogTimeOut = Value
from dbo.tConfiguration with (nolock)
where keyvalue = 'ConversationEndpoints' and subvalue = 'DeleteAfterSec'

WHILE (1=1)
BEGIN
    -- Lookup the current SPID's handle
    SELECT @conversationHandle = [handle]
    FROM tConversationSPID with (nolock)
    WHERE spid = @@SPID and messagetype = 'TestQueueMsg';

    IF @conversationHandle IS NULL
    BEGIN
        BEGIN DIALOG CONVERSATION @conversationHandle
            FROM SERVICE [InitiatorQueue_SER]
            TO SERVICE 'ReceiveTestQueue_SER'
            ON CONTRACT [TestQueueMsg_CON]
            WITH ENCRYPTION = OFF;

        BEGIN CONVERSATION TIMER ( @conversationHandle ) TIMEOUT = @DialogTimeOut

        -- insert the conversation in the association table
        INSERT INTO tConversationSPID ([spid], MessageType, [handle])
        VALUES (@@SPID, 'TestQueueMsg', @conversationHandle);

        SEND ON CONVERSATION @conversationHandle MESSAGE TYPE [TestQueueMsg] (@Message)
    END
    ELSE IF @conversationHandle IS NOT NULL
    BEGIN
        SEND ON CONVERSATION @conversationHandle MESSAGE TYPE [TestQueueMsg] (@Message)
    END

    SELECT @err = @@ERROR;

    -- if succeeded, exit the loop now
    IF (@err = 0)
        BREAK;

    SELECT @counter = @counter + 1;

    IF @counter > 10
    BEGIN
        -- Refer to http://msdn2.microsoft.com/en-us/library/ms164086.aspx for severity levels
        EXEC spLogMessageQueue 20002, 8, 'Failed to SEND on a conversation for more than 10 times. Error %i.'
        BREAK;
    END

    -- We tried on the said conversation, but failed
    -- remove the record from the association table, then
    -- let the loop try again
    DELETE FROM tConversationSPID WHERE [spid] = @@SPID;
    SELECT @conversationHandle = NULL;
END;
SET @VRUSERVICEMIN = 46
SET @BILLEDFLAT = 0

-- PROCESS ONLY RECORDS WHERE THERE IS NO FLAT FEE
IF @BILLEDFLAT = 0
BEGIN
    IF @VRUSERVICEMIN BETWEEN 0 AND 15
        BEGIN SET @VRUSERVICEMIN = .25 END
    ELSE IF @VRUSERVICEMIN = 15
        BEGIN SET @VRUSERVICEMIN = .25 END
    ELSE IF @VRUSERVICEMIN BETWEEN 15 AND 30
        BEGIN SET @VRUSERVICEMIN = .5 END
    ELSE IF @VRUSERVICEMIN = 30
        BEGIN SET @VRUSERVICEMIN = .5 END
    ELSE IF @VRUSERVICEMIN BETWEEN 30 AND 45
        BEGIN SET @VRUSERVICEMIN = .75 END
    ELSE IF @VRUSERVICEMIN = 45
        BEGIN SET @VRUSERVICEMIN = .75 END
    ELSE IF @VRUSERVICEMIN > 45
        BEGIN SET @VRUSERVICEMIN = 1 END
END
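As an aside (purely a judgment call, not from the original post), the overlapping BETWEEN/equality branches above could collapse into one CASE, assuming the variable is a decimal type and always starts at 1 or more:

-- Round service minutes up to the nearest quarter hour, capping at 1.
SET @VRUSERVICEMIN = CASE
    WHEN @VRUSERVICEMIN > 45 THEN 1
    ELSE CEILING(@VRUSERVICEMIN / 15.0) * 0.25
END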
I tested the Now function and it returns the right date. But when I send that value to my SQL database, it turns into 1/1/1900. Does anyone know why this is happening? I have tried everything. Here is my code:
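(The code itself did not come through with the post.) One very common cause of a stored 1/1/1900 is that the parameter arrives as an empty string (or zero) rather than the date, since both convert to datetime's base date. A quick demonstration of that behavior:

SELECT CAST('' AS datetime) AS EmptyString,  -- 1900-01-01 00:00:00
       CAST(0  AS datetime) AS ZeroValue     -- 1900-01-01 00:00:00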
I don't know what's wrong with my SQL 2000 setup. The problem is: whenever I execute a query with a date/time such as 02/02/200, SQL gives me an error message saying that the date is "Out of Range".
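Assuming 02/02/200 is meant literally, the year 200 is the likely culprit: datetime only covers 1753-9999 and smalldatetime 1900-2079, so the conversion has nowhere to land. A quick illustration:

-- Fails with an out-of-range datetime error because year 200 precedes 1753.
SELECT CAST('02/02/200' AS datetime)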
I'm having trouble with something and I was hoping that you could point me in the right direction.
Here's the scenario: I have a VB application that clients use to add records to a SQL 2K DB. The info they have added that day is shown in a grid. They have the ability to edit items in the grid and then update those changes to the database. The problem is that sometimes they change the values to something they shouldn't. To combat this I've started experimenting with check constraints. In Query Analyzer I test the constraint by trying to update an entry to an 'illegal' value. When I do this, I get an error saying "Server: Msg 547, Level 16, State 1, Line 1" and the change is not made. What I'd like to do is give the user a dialog box notifying him of the error. Is there a way to have a sub-routine or stored procedure be triggered when a message of this type is generated?
My specs are: W2K pro clients, SQL2K on an NT4 sp6a server. Application is written in VB6.
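One pattern that fits SQL 2000 (no TRY...CATCH there): have the VB app call a wrapper stored procedure instead of updating the table directly, check @@ERROR after the UPDATE, and return a status the client can turn into a friendly dialog. A constraint violation (error 547) terminates only the statement, so the procedure keeps running. Table, column, and procedure names below are hypothetical:

CREATE PROCEDURE dbo.UpdateItemValue
    @ItemID   int,
    @NewValue int
AS
    UPDATE dbo.Items
    SET ItemValue = @NewValue
    WHERE ItemID = @ItemID

    IF @@ERROR = 547      -- check constraint / FK violation
        RETURN 1          -- caller shows "that value is not allowed"
    RETURN 0              -- success

The VB code would read the return value (or an output parameter) and pop the message box; alternatively, the error raised back to the client can be inspected for native error number 547.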
I have a SELECT statement that works, but I know there has to be a better way (I apologize for being T-SQL brain dead today). Here is the statement:
SELECT PATIENT_ACPT_STATUS, DISP_NOTES, RECORD_ID, modified_by, last_name, ddate
FROM PATIENT_MEDICATION_DISPERSAL_
WHERE (ddate = convert(char(10), getdate(), 101) AND MODIFIED_BY = CURRENT_USER AND rec_status = 1)
   OR (ddate = convert(char(10), getdate(), 101) AND PATIENT_ACPT_STATUS = 1)
   OR (ddate = convert(char(10), getdate(), 101) AND MODIFIED_BY = 'open')
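Since the only thing that varies between the OR branches is the non-date condition, one tightening (a sketch, not the only way) is to factor the date comparison out so it is written once:

SELECT PATIENT_ACPT_STATUS, DISP_NOTES, RECORD_ID, modified_by, last_name, ddate
FROM PATIENT_MEDICATION_DISPERSAL_
WHERE ddate = CONVERT(char(10), GETDATE(), 101)
  AND (   (MODIFIED_BY = CURRENT_USER AND rec_status = 1)
       OR PATIENT_ACPT_STATUS = 1
       OR MODIFIED_BY = 'open')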
I have a need: when an Update, Insert, or Delete is done to a record in DB "A", it should send the appropriate UID (update/insert/delete) to a different table in a different DB "B".
My first thought was to have a trigger on the table in DB "A" simply call a stored procedure on DB "B" and do the UID.
However, my question is: what is the best approach, and what's the best way to establish the connection to DB "B" for the UID from within DB "A"? We can't use linked servers; a DSN-less connection string would be the preferred way to connect. I'm not sure how to execute that within a trigger.
Is using a trigger to call a stored proc even the best way?
What about transactional replication, which I've never attempted? Is that a better way?
Just looking for some guidance here so I don't needlessly burn time down a path that isn't recommended or simply won't work.
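For what it's worth, if DB "A" and DB "B" happen to live on the same instance, no connection string (DSN-less or otherwise) is needed at all: the trigger can reference the other database with three-part naming. A sketch under that assumption, with hypothetical table names:

CREATE TRIGGER trgSourceTable_Sync ON dbo.SourceTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    -- rows inserted or updated
    INSERT INTO DatabaseB.dbo.ChangeLog (SourceID, ChangeType, ChangedOn)
    SELECT ID, 'I/U', GETDATE() FROM inserted;

    -- rows deleted (present in deleted but not in inserted)
    INSERT INTO DatabaseB.dbo.ChangeLog (SourceID, ChangeType, ChangedOn)
    SELECT d.ID, 'D', GETDATE()
    FROM deleted d
    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.ID = d.ID);
END

If DB "B" is on a different server and linked servers are off the table, the usual fallback is to queue the change locally in a staging table and let an external job or service push it across, since opening ad hoc connections inside a trigger tends to be slow and fragile.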
I build a local cube from a relational database. In the database there are 1:n relations. Is there a way to handle 1:n relations? For example: I have a table LOGGEDFLAW and a table LOGGEDREASON with a 1:n relation between them. We create a SELECT statement over these tables, and as a result we get duplicate records of LOGGEDFLAW each time more than one record of LOGGEDREASON is associated with one record of LOGGEDFLAW - this is the standard result of a relational JOIN operation. Now I want to count the LOGGEDFLAWs without the duplicates generated by the 1:n relationship.
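On the relational side, the duplicate-free count is just a distinct count of the LOGGEDFLAW key. A sketch (the key column name is assumed):

SELECT COUNT(DISTINCT f.FlawID) AS FlawCount
FROM LOGGEDFLAW f
LEFT JOIN LOGGEDREASON r
       ON r.FlawID = f.FlawID

In the cube itself, the equivalent is a Distinct Count measure on that key rather than a plain row count.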
Please help me figure out how to implement locking in the scenario below.
Requirement: there are two tables, Table1 and Table2. If I insert into Table1, then the related data fields are automatically updated in Table2; similarly, based on the data in Table2, Table1 data needs to be updated.
Now the sync of table1 & table2 is working fine.
My problem is that we handle the updates/inserts from UI screens, with two separate screens, one for each table. When multiple users access the screens - say User1 updates Table1 and User2 updates Table2 - we need locking so that at any one time only one screen is allowed to update Table1 (and hence Table2), while the other screen is blocked from updating Table2 (and hence Table1).
This is very common locking functionality, but I'm not finding a way to implement it. Please advise.
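One lightweight way to get this kind of cross-screen mutual exclusion inside SQL Server is an application lock: both screens ask for the same named lock before touching either table, so only one of them can proceed at a time. A sketch using sp_getapplock (the resource name and timeout are arbitrary choices):

BEGIN TRAN

DECLARE @rc int
EXEC @rc = sp_getapplock
        @Resource    = 'Table1Table2Sync',
        @LockMode    = 'Exclusive',
        @LockTimeout = 5000            -- wait up to 5 seconds

IF @rc < 0
BEGIN
    ROLLBACK TRAN
    RAISERROR('Another user is editing this data. Please try again shortly.', 16, 1)
    RETURN
END

-- ... update Table1 and Table2 here ...

COMMIT TRAN    -- the application lock is released with the transaction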
I have a friend who is doing a voting application for one of his customers and they are concerned about the volume that SQL Server can handle. He's looking at a single SQL Server 2005 box with plenty of HD space and 4 GB of memory. The app will check whether you have voted and then insert a record accordingly.
Are there any papers out there, or apps, that can show the load a server can handle?
I already asked this question; however, I am giving all the details now: We get large files (millions of records) and we need to load them into our tables using the Import/Export Wizard. Some of the fields in the file can be NULL, so we are forced to create the table with fields that allow NULLs with a default of ''. However, when we insert data into these tables it puts NULL in those fields even though we have a default of '' (I do not think we have any workaround for that; do we?). Finally we need to go through each field and update it to '' if it is NULL, and that takes A LOT OF TIME.

IF (SELECT COUNT(*) FROM <tablename> WHERE <columnname> IS NULL) > 0
BEGIN
    UPDATE <tablename>
    SET <columnName> = ''
    WHERE <columnName> IS NULL
END

Please let me know if there are any workarounds for this crisis. Thank you very much in advance!
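A small tightening while the real workaround is sorted out: the preliminary SELECT COUNT(*) is an extra scan that buys nothing, because an UPDATE with the IS NULL predicate already affects zero rows when there is nothing to fix. A sketch (placeholder names as above):

UPDATE <tablename>
SET <columnName> = ''
WHERE <columnName> IS NULL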
I need to know if there is a better way to construct this SQL statement (error handling is omitted). MS SQL Server 2000.

Insert into FSSUTmp
Select a.acct_no, a.ac_nm, a.ac_type, 10, -1,
-1 * sum(CHARINDEX(convert(char(4), b.post_yr), @post_yr) * CHARINDEX('-', CONVERT(char(2), b.post_prd - @post_prd - 1)) * b.prd_trn_amt),
-1 * sum(CHARINDEX(convert(char(4), b.post_yr), @post_yr) * CHARINDEX('0', CONVERT(char(1), b.post_prd)) * b.prd_trn_amt)
FROM GLAccounts a, GLBalances b
WHERE b.cmpny_cd = a.cmpny_cd
AND b.acct_no = a.acct_no
AND a.cmpny_cd = @cmpny_cd
AND ac_ctrl_type between '200' and '219'
Group by a.acct_no, a.ac_nm, a.ac_type

The part I'm wondering about is the two SUM expressions. The GLBalances table has the following important fields:

Post_yr - the posting year
Post_prd - the posting period
Prd_trn_amt - the beginning balance if the period is 0, or the net transactions for periods 1 through 12

The first sum gives the current balance as of period @post_prd by adding all of the periods from 0 to @post_prd. The second sum is just the beginning balance. It does a conditional sum by using CHARINDEX to produce 0 if the record should not be added and 1 if it should.

There is a problem as it stands when you are looking for the balances when @post_prd is 9 or greater, because b.post_prd - @post_prd - 1 will be -10 or smaller. Then the CONVERT(char(2), ...) is an error, so CHARINDEX is 0 when it needs to be 1. I can fix that by using SIGN and it will work fine. What I want to know is: is there a better way to populate the table, where one of the values is a conditional sum? This is a stored procedure from a commercial product, so I can't change anything else other than the stored procedure.
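The conventional way to write a conditional sum is SUM(CASE ... END), which drops the CHARINDEX/CONVERT arithmetic (and with it the problem at periods >= 9). A sketch against the same tables, assuming @post_yr holds a single year value, with the logic mirroring the description above:

INSERT INTO FSSUTmp
SELECT a.acct_no, a.ac_nm, a.ac_type, 10, -1,
       -1 * SUM(CASE WHEN b.post_yr = @post_yr AND b.post_prd <= @post_prd
                     THEN b.prd_trn_amt ELSE 0 END),   -- balance through @post_prd
       -1 * SUM(CASE WHEN b.post_yr = @post_yr AND b.post_prd = 0
                     THEN b.prd_trn_amt ELSE 0 END)    -- beginning balance
FROM GLAccounts a
JOIN GLBalances b
  ON b.cmpny_cd = a.cmpny_cd
 AND b.acct_no = a.acct_no
WHERE a.cmpny_cd = @cmpny_cd
  AND ac_ctrl_type BETWEEN '200' AND '219'
GROUP BY a.acct_no, a.ac_nm, a.ac_type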
Hi, what exactly does the above (subject) error mean? I'm getting it from an .adp file when used by a few people at the same time (each user has the file in their own filespace, though). Access is through Windows authentication and it only seemed to occur during an update of a specific table. The problem is it didn't happen to everyone and I can't recreate it at all on my own, so I am wondering if it was something to do with the level of traffic to/from the server at that specific time. Any clues?

Cheers,
Chris
Does anyone know of a tool that will help me manage my queries? I have hundreds of them, all scattered: some on my PC at home, some on my laptop, some at work, some on memory sticks. Then, when I need one, I can't find the one I'm looking for, so I end up writing the query again.
I am developing a smart device application using Visual Studio 2005 and SQL Server Compact Edition. Since we cannot write stored procedures, functions, or views, what is the best way to write to and read from the database tables other than writing inline SQL?
Any ideas and suggestions regarding this would be helpful.
(1) I have 2 packages that run independently of each other. (2) Both are scheduled as separate SQL Server Agent jobs. (3) Job A is scheduled to run every 12 hours and Job B is scheduled to run every 10 minutes. (4) However, I want to prevent Job B from running if Job A happens to be running. (5) It is unknown exactly how long Job A will take to finish, so I can't schedule Job B around it.
The way I wanted to approach this situation is as follows.
Within Job A's package, create a "marker" file when the package starts and delete it when the package finishes. So the existence of this marker file will tell Job B's package if it should run or not.
The concept is simple, but I'm not sure how to implement this.
For example, to create the marker file I would use a File System Task, but I don't see an option in there to "create file" (however, I do see an option for "delete file"). Also, what task would I use in Job B's package to check whether the marker file exists?
Lastly, if you have a better approach, I would like to hear about it.
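A possible alternative that skips the marker file entirely: have Job B's package start with an Execute SQL Task that asks msdb whether Job A is currently executing, put the count into a package variable, and branch with a precedence constraint. A sketch of the query (the job name is a placeholder):

SELECT COUNT(*) AS JobARunning
FROM msdb.dbo.sysjobactivity ja
JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
WHERE j.name = 'Job A'
  AND ja.start_execution_date IS NOT NULL
  AND ja.stop_execution_date IS NULL
  AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)

For the original marker-file idea, the stock File System Task has no "create file" operation, so the marker is typically created with a small Script Task (or an Execute Process Task); checking for the file in Job B is likewise usually a Script Task setting a Boolean package variable.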