Mysterious Backups Happening
Jul 11, 2000
I'm no guru but this one has me stumped!!
Each day a backup of one of our databases occurs, but nothing is scheduled for it, either as a job or under the database backup scheduling option.
Any suggestions greatly appreciated.
Thanks.
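If it helps anyone hitting the same thing: the backup history in msdb records the login that ran each backup and the device it went to, which usually identifies the culprit (a maintenance plan, a third-party tool, or a job on another server). A quick sketch (the 30-day filter is just an example):

SELECT bs.database_name,
       bs.backup_start_date,
       bs.type,                     -- D = full, L = log, I = differential
       bs.user_name,                -- login that performed the backup
       bmf.physical_device_name     -- where the backup was written
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
  ON bs.media_set_id = bmf.media_set_id
WHERE bs.backup_start_date > DATEADD(day, -30, GETDATE())
ORDER BY bs.backup_start_date DESC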
View 1 Replies
Feb 25, 2006
I have two aspx pages: "login.aspx" and "test_connection.aspx". "login.aspx" uses the membership class for my website's security. If you have restarted your computer and you load "login.aspx" first, it works fine and you will see that you can create a user. When you next load (or view) "test_connection.aspx", you will get this error message:
System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'sqluser1'.
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
If I restart my computer and load "test_connection.aspx" first, that page works just fine, but if I then load "login.aspx", I get this error message:
Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
the page "test_connection.aspx" is using an sql username i have created. I am only using one database in this project and thats it ASPNET.mdf
what's happening? i cant understand. I dont know what im doing wrong. i am very new to this dotNET and MS-SQL. im not sure if this has relation to the other problem i have on the other thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=254568&SiteID=1).
PLS HELP.
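For what it's worth, "Cannot open user default database" usually means the default database assigned to the login is missing, offline, or not accessible at that moment (common when two connection strings fight over the same auto-attached .mdf user instance). A sketch of how you might check and repoint the login's default database, using the login name from the first error; adjust names as needed:

-- See what default database the login currently points at
SELECT name, default_database_name
FROM sys.server_principals
WHERE name = 'sqluser1';

-- Point the login at a database that is always available
ALTER LOGIN [sqluser1] WITH DEFAULT_DATABASE = [master];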
View 4 Replies
View Related
Nov 15, 2015
The space allocated to the Log in question is 180 GB. During this time period I was running TLog backups every 5 minutes, yet the log continued to chew through to 80 GB used, even after the process was complete and a final TLog backup had been taken. It continued to stay very large until the Full backup was complete -- or something else that I'm unaware of completed. Like every other DBA I typically take a TLog backup to shrink the log, but what appeared to be the case here was the Full completed and it released the used log space. All said, will Transaction Log backups not free up the log during Full backups?
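Not a direct answer, but one way to see why the log would not clear at the time: sys.databases exposes log_reuse_wait_desc, and while a full (data) backup is running it typically reports ACTIVE_BACKUP_OR_RESTORE, i.e. log backups keep succeeding but the log cannot be truncated until the data backup finishes. A quick check, with a placeholder database name:

SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'YourDatabase';       -- placeholder name

DBCC SQLPERF(LOGSPACE);             -- current log size and percent used, per database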
View 3 Replies
View Related
Dec 24, 2007
I was working on a production server and stopped the SQL Server service along with SQL Server Agent, since I had to copy an MDF file. Now I have started the service again, but I find that there are no transactions happening... what could be the reason?
View 8 Replies
View Related
Jan 12, 2008
Hi,
I'm facing a problem: when more than one user tries to insert a record through the application, only one user's data gets inserted. What might be the problem?
This is the stored proc im using..
create proc [dbo].[GSK_insertregion]
    (@Country varchar(50), @Userid int, @RegionId varchar(50))
as
begin
    insert into UserHierarchy (Country, Userid, RegionId)
    values (@Country, @Userid, @RegionId)
end
Please help..
Sundaresan.R
View 4 Replies
View Related
May 23, 2008
Hi,
I set up my package for logging to SQL Server. I set up a connection manager for the logging, but did not specify a database. Doesn't that mean that the logging should default to msdb.sysdtslog90?
But when I check the table, there's nothing in it, after I run the package.
What gives?
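A couple of things I would check (hedged, since I'm not sure how your connection manager is set up): the SQL Server log provider writes to sysdtslog90 in whichever database the chosen connection manager actually connects to (if no database is specified, that is the login's default database, not necessarily msdb), and rows only appear for the events ticked on the Details tab of the logging dialog. A quick look:

-- Run this in the database the logging connection manager points to
SELECT TOP 50 event, source, starttime, message
FROM dbo.sysdtslog90
ORDER BY starttime DESC;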
View 1 Replies
View Related
Jan 13, 2004
Hi
I have deployed a website on a server running Windows 2000 and IIS 5.0. It uses SQL Server 2000, which is on another remote server. While developing, I used the visual tools in VS.NET to make a connection and used the dynamic properties of the connection object to map the connection string to the entry in the config file.
This works fine on my developement machine which has IIS and SQL Server 2000 on the same machine.
The entry in the web.config for my connection string is:
value= " server=xxx.yyy.com; Trusted_Connection=yes;provider=SQLOLEDB.1;Initial Catalog=events; User id=myuser; Password=password;"
where xxx.yyy.com is the server running SQL Server2000.
I do not get any error, but the connection does not happen and my datagrid does not get filled.
The code for creating the connection is designer generated code.
Any clues?
-svp
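One thing that jumps out (just a guess from the string above): it mixes Trusted_Connection=yes with a SQL user id and password. When a trusted/integrated connection is requested, the user id and password are ignored, so on the web server the connection is attempted under the ASP.NET worker process account, which probably has no login on the remote SQL Server. Hedged examples of the two usual variants, keeping the placeholders from the entry above:

value="server=xxx.yyy.com; provider=SQLOLEDB.1; Initial Catalog=events; User id=myuser; Password=password;"
value="server=xxx.yyy.com; provider=SQLOLEDB.1; Initial Catalog=events; Integrated Security=SSPI;"

The first uses SQL authentication only; the second uses Windows authentication only and requires the web server's process (or impersonated) account to have a login on the SQL Server box.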
View 20 Replies
View Related
Sep 6, 2007
Hello people, I don't expect anyone to know the answer to this but I guess we'll see huh?
I'm using Microsoft SQL Server Management Studio to do all my SQL stuff, and one of the tools that comes with this is a program called Microsoft SQL Profiler 2005. Anyways, so I'm using this profiler to capture SQL processes in the background while I work with the front end.
Here is a problem I'm facing... I am looking at this table, and I see that when I do a certain process on the front end, it inserts 3 rows into the table. So I'm thinking, "I know that in the trace I should be looking for an insert, or a stored procedure with an insert in it."
So I run the trace on the front end process and it shows me all the stored procedures that happen during the process of inserting these 3 rows into this table (btw, this process is doing other things besides inserting stuff into this table, but it's what I'm currently working with at the moment).
So what I do here at my job is take this code, tweak it to work for our front end, and voilà... we're good to go. So with that in mind, I make my own stored procedure and run all the stored procedures that the trace captured.
My result......
I get only 2 row inserts into the table.... the 1st row and the last row..... the middle row isn't inserted for some mysterious reason. I tried checking all the stored procedures for unique information pertaining to that specific row insert, but to no avail.
So my question to you guys is.... is there anything I'm overlooking that could possibly be inserting that row in the table? Thanks guys!
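If it helps, rather than relying only on the trace, you can search the module definitions for anything that inserts into that table, triggers included; triggers can add rows without showing up as separate statements in Profiler. A sketch with a placeholder table name:

-- Procedures, functions and triggers whose definition mentions inserting into the table
SELECT OBJECT_NAME(object_id) AS module_name
FROM sys.sql_modules
WHERE definition LIKE '%INSERT%YourTable%';    -- YourTable is a placeholder

-- Triggers defined directly on the table
SELECT name FROM sys.triggers WHERE parent_id = OBJECT_ID('dbo.YourTable');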
View 2 Replies
View Related
Jul 23, 2005
I have two problems I need some help with.

First, I've just inherited a system and am delving into a few timeout problems that are causing problems for the users. Now, if I do a simple select * from the table (which looks to be the cause of the problem at this stage) in QA, I get the results back in less than a second. If I open the table in EM it takes about 10 seconds. Is there a difference in viewing the data this way? I'm used to EM being virtually the same speed. There is only one row. Minor question really, just something I'd like to understand if there is a difference.

CREATE TABLE [QUERY] (
[QUERY_ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[CAT_ID] [numeric](18, 0) NOT NULL ,
[QUERY_DESCR] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[USER_NAME] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[USER_ID] [int] NOT NULL ,
[IND_EURO] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_EURO] DEFAULT ('N'),
[IND_DGCOLUMNS] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_DGCOLUMNS] DEFAULT ('N'),
[NO_GROUPS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_GROUPS] DEFAULT (0),
[NO_FIELDS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_FIELDS] DEFAULT (0),
[NO_LINES] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_LINES] DEFAULT (0),
CONSTRAINT [PK_QUERY] PRIMARY KEY CLUSTERED ([QUERY_ID]) WITH FILLFACTOR = 90 ON [PRIMARY] ,
CONSTRAINT [FK_QUERY_QUERY_CATEGORY] FOREIGN KEY ([CAT_ID]) REFERENCES [QUERY_CATEGORY] ([CAT_ID]) ON DELETE CASCADE ON UPDATE CASCADE
) ON [PRIMARY]
GO

I don't think any re-indexing has been done on this (or the other tables in the db). I was wondering if constant adding/deleting of rows could cause the index to be massive and in need of a good clear out. Any pointers would be appreciated. From what I can tell, there were some problems trying to get replication to work. I need to dig deeper to see if this is now correct.

-------------------------

Secondly, there is another table in the same database.

CREATE TABLE [FIELD_DATA] (
[ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[DATA_ID] [numeric](18, 0) NOT NULL ,
[FIELD_ID] [numeric](18, 0) NULL ,
[FIELD_CODE] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[FIELD_VALUE] [numeric](15, 5) NULL ,
CONSTRAINT [PK_FIELDDATA] PRIMARY KEY CLUSTERED ([ID]) WITH FILLFACTOR = 90 ON [PRIMARY]
) ON [PRIMARY]
GO

It holds approx 4 million rows. The rest of the tables have minimal data and about the same amount (consider them the same if you will). Now, another 'copy' of this database is held elsewhere (different client data) and this holds 40 million rows. The difference is that the first DB is 4.5GB and the second 6.5GB (approx). Does this prove my theory that re-indexing would be a good idea?

Thanks
Ryan
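On the re-indexing question, it may be worth measuring fragmentation directly before deciding, rather than inferring it from file sizes. A sketch for SQL Server 2000, using the table from the DDL above:

-- Show fragmentation (scan density close to 100% is healthy)
DBCC SHOWCONTIG ('FIELD_DATA')

-- If badly fragmented, rebuild all indexes on the table (fill factor 90, as in the DDL)
DBCC DBREINDEX ('FIELD_DATA', '', 90)

-- Refresh statistics afterwards
UPDATE STATISTICS FIELD_DATA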
View 3 Replies
View Related
Nov 2, 2006
hi there,
Can SSRS work with DB mirroring?
If yes, how do I do it?
I just tried to enable mirroring for ReportServer and ReportServerTempDB, but it doesn't work after failover.
Any idea ?
View 6 Replies
View Related
Jan 17, 2003
18275 :
Unnamed tape (Family ID: 0xcfb8a96c, sequence 1) dismounted from tape drive ‘master tape backup’.
This also happens for the mount event. How do I name a tape drive, or stop these warnings in the event logs?
thanks for your help
View 1 Replies
View Related
Feb 11, 2015
Even when the rowchecksum is different in the target from the source, the update is not happening.
alter Procedure SP_Archive_using_merge
AS
BEGIN
SET NOCOUNT ON
Declare @Source_RowCount int
Declare @New_RowCount int
[Code] ....
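The posted procedure is truncated ([Code] ....), so this is only a generic sketch of the pattern rather than your code: for the update to happen, the WHEN MATCHED clause has to compare the checksum columns explicitly. All table and column names below are placeholders:

MERGE dbo.TargetArchive AS tgt            -- placeholder target table
USING dbo.SourceTable AS src              -- placeholder source table
   ON tgt.BusinessKey = src.BusinessKey
WHEN MATCHED AND tgt.rowchecksum <> src.rowchecksum THEN
    UPDATE SET tgt.Col1 = src.Col1,
               tgt.Col2 = src.Col2,
               tgt.rowchecksum = src.rowchecksum
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Col1, Col2, rowchecksum)
    VALUES (src.BusinessKey, src.Col1, src.Col2, src.rowchecksum);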
View 1 Replies
View Related
Apr 12, 2008
declare @old varchar(30); set @old = 'old';
Command(s) completed successfully.
select @old;
Msg 137, Level 15, State 2, Line 1
Must declare the scalar variable "@old".
Why?!
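Almost certainly because the two statements were executed as separate batches: a local variable only exists for the duration of the batch that declares it, so by the time the standalone SELECT runs, @old is gone. Run both statements together in one batch and it works:

declare @old varchar(30);
set @old = 'old';
select @old;   -- returns 'old' because the SELECT is in the same batch as the DECLARE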
View 4 Replies
View Related
Jun 5, 2008
I tried the following,
1. Created a INSTEAD OF TRIGGER for INSERT
2. In the trigger, I have written a sql which inserts data in the same table
3. When I do the INSERT on the table the data in the trigger gets inserted
4. I was sort of expecting a recursive loop
Is this the normal behavior?
------------------------
I think, therefore I am - Rene Descartes
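Yes, as far as I know that is the documented behavior: when an INSTEAD OF INSERT trigger inserts into its own table, that insert is processed as if the trigger were not there, so the trigger is not fired again and no recursive loop occurs. A minimal sketch (table and column names made up):

CREATE TABLE dbo.Demo (Id int, Val varchar(20));
GO
CREATE TRIGGER trg_Demo_InsteadOfInsert ON dbo.Demo
INSTEAD OF INSERT
AS
BEGIN
    -- Targets the same table, but does not re-fire this trigger
    INSERT INTO dbo.Demo (Id, Val)
    SELECT Id, UPPER(Val) FROM inserted;
END;
GO
INSERT INTO dbo.Demo (Id, Val) VALUES (1, 'hello');   -- results in a single row (1, 'HELLO')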
View 5 Replies
View Related
Apr 27, 2007
I have a simple invoice, inventory, billing program that is doing something strange but I can't track it down.
I can't figure out what triggers it, but occasionally when I add an invoice for a customer, all the other invoices are added to the last customer updated.
My update statement on the invoice table is
UPDATE Invoices
SET Date = @Date, InvoiceTotal = @InvoiceTotal, SubTotal = @SubTotal, Tax = @Tax, CustomerID = @CustomerID
WHERE (CustomerID = @CustomerID)
The only triggers on the table are to update the qty in Inventory and another trigger that fires when a payment is entered.
Any Ideas?
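One thing worth checking (a guess, since the table definition isn't shown): the UPDATE is keyed only on CustomerID, so it rewrites every invoice belonging to that customer, and a wrong @CustomerID value would move whole sets of invoices to the last customer saved. Keying the update on the invoice's own primary key avoids that; a sketch assuming a hypothetical InvoiceID column:

UPDATE Invoices
SET [Date] = @Date, InvoiceTotal = @InvoiceTotal, SubTotal = @SubTotal, Tax = @Tax, CustomerID = @CustomerID
WHERE InvoiceID = @InvoiceID   -- hypothetical primary key of the single invoice being edited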
View 7 Replies
View Related
Feb 9, 2007
"The log file for database is full. Back up the transaction log forthe database to free up some log space."Now I only know this way to deal with that manually,Step1. in option , chance Recovery model from FULL to Simple.Step2: go to task to manually shrink the log fileStep3: Change recovery model back from simple to FULL.But by this way, I could get same problem again, the log file is fill,and need free up.Could you give an idea how to prevent this from happening? what andhow should I do???Thanks a lot in advance for your help.
View 3 Replies
View Related
Apr 16, 2008
Hi,
I have a 500-page report, and the report runs per file key. I have referenced some of the fields in the report header segment so they display on every page; for every file key the report header field should also change. It works for all file keys, but the reference from the body to the report header does not occur when the file key grouping changes to another grouping.
Please let me know if there is any possible way to do this.
View 1 Replies
View Related
Jun 5, 2008
I have encountered a very frustrating situation when trying to use SqlBulkCopy. I have two Excel files that I am trying to import into two tables in an MSSQL Server 2005 Express DB. One Excel file has 5,000 rows, while the other file has 500,000 rows. I was able to import the smaller file successfully using this VB.NET code:

Protected Sub L26ExcelToSQL()
    'Declare variables
    Dim sSQLTable As String = "Local26Members"
    Dim sExcelFileName As String = "Full Local 26 List Formatted.xls"
    Dim sWorkbook As String = "[Sheet1$]"

    'Create connection strings
    Dim sExcelConnectionString As String = "Provider=Microsoft.Jet.OLEDB.4.0;" & _
        "Data Source=D:hostingmemberwolsite1l26voterreg" & sExcelFileName & ";" & _
        "Extended Properties=""Excel 8.0;HDR=YES"""
    Dim sSqlConnectionString As String = ConfigurationManager.ConnectionStrings("SiteSqlServer").ConnectionString.ToString

    'Execute a query to erase any previous data from our destination table
    Dim sClearSQL = "DELETE FROM " & sSQLTable
    Dim SqlConn As SqlConnection = New SqlConnection(sSqlConnectionString)
    Dim SqlCmd As SqlCommand = New SqlCommand(sClearSQL, SqlConn)
    SqlConn.Open()
    SqlCmd.ExecuteNonQuery()
    SqlConn.Close()

    'Series of commands to bulk copy data from the Excel file into our SQL table
    Dim OleDbConn As OleDbConnection = New OleDbConnection(sExcelConnectionString)
    Dim OleDbCmd As OleDbCommand = New OleDbCommand(("SELECT * FROM " & sWorkbook), OleDbConn)
    OleDbConn.Open()
    Dim dr As OleDbDataReader = OleDbCmd.ExecuteReader()
    Dim bulkCopy As SqlBulkCopy = New SqlBulkCopy(sSqlConnectionString)
    bulkCopy.DestinationTableName = sSQLTable
    bulkCopy.WriteToServer(dr)
    OleDbConn.Close()
End Sub
However, when I tried to import the 500,000 row Excel file, I got the following error:

Server Error in '/L26' Application.
A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
Source Error:
Line 438:
Line 439:     bulkCopy.DestinationTableName = sSQLTable
Line 440:     bulkCopy.WriteToServer(dr)
Line 441:
Line 442:     OleDbConn.Close()
Source File: d:hostingmemberwolsite1L26DuesDefault2.aspx.vb Line: 440
Stack Trace:
[SqlException (0x80131904): A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)] System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +925466 System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +800118 System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +186 System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) +556 System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj) +164 System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected) +34 System.Data.SqlClient.TdsParserStateObject.ReadBuffer() +44 System.Data.SqlClient.TdsParserStateObject.ReadByte() +17 System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +79 System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() +1336 System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) +916 System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader) +151 _Default.CSVToSQL() in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:440 _Default.ButtonTest3_Click(Object sender, EventArgs e) in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:905 System.Web.UI.WebControls.Button.OnClick(EventArgs e) +105 System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +107 System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +7 System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +11 System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33 System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1746
Version Information: Microsoft .NET Framework Version: 2.0.50727.1433; ASP.NET Version: 2.0.50727.1433

After I received this error message, I tried viewing my database through the MSSQL control panel provided by my hosting provider (WebHost4Life). However, I was unable to connect to the database and received this error:

Microsoft OLE DB Provider for SQL Server error '80040e14'
Database 1496 cannot be autostarted during server shutdown or startup.
/getDBinfo.asp, line 29

Now here is the most frustrating/mysterious part. I figured that maybe the error was a result of the large size of the second Excel file, so just for testing purposes I created a new table in my MSSQL database. The table has just two fields, both set to varchar(50). I then created a test Excel file that had one row with the word "test" in the first and second columns. When I tried using the code above to import the test Excel data into the test table, I got the exact same error as I did with the 500,000 row file! Please help, I'm really stumped, and I'm not sure why I'm having so much trouble replicating the success I had with the 5,000 row file. Any suggestions are much appreciated. -Bryan
View 4 Replies
View Related
Dec 16, 2003
Is there a way to set SQL Server 2000 or ASP.NET datetime fields to a standard format? The problem is that I am passing correct datetime fields using stored procedures and keep getting "Cannot convert datetime into string". It seems to me that many other developers are having the same problem. I have tried a lot of different methods and still have the same problem. I'm using C#, and I never had a problem with datetime fields when I was using VB.NET. The problem is that SQL Server is returning datetime formats that are not compatible with C#. I have code that works in other projects, but when I try to use that same code I get that conversion error. How do I set the datetime in SQL Server and ASP.NET when I run queries so that the datetime output is in mm/dd/yyyy?
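One option, if all you need is the text in mm/dd/yyyy, is to format the value on the SQL Server side with CONVERT style 101, so the client receives a plain string and no datetime-to-string conversion is left to C#. A small example:

SELECT CONVERT(varchar(10), GETDATE(), 101) AS FormattedDate   -- style 101 = mm/dd/yyyy, e.g. 01/13/2004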
View 19 Replies
View Related
Nov 27, 2000
Hi!
I get this mysterious error while running an update. SQL Server tells me that my updated row is too big to fit in a row. Then I check the size of the row and it is about 2000 bytes, so it should fit without any problem at all. The really strange thing about it is that when I comment out a field it works just fine. That field is always null. Could that be the problem? Has anyone ever had a similar problem?
View 5 Replies
View Related
Jan 29, 2001
Second post for the day. I've just noticed a few indexes in our tables that are named like this: hind_c_207_1. They seem to have been put in automatically and are messing with our index structure. Does anyone know where these come from and how I can stop them? Any help would be great.
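If I remember correctly, index names beginning with hind_ are the hypothetical indexes the Index Tuning Wizard creates while analyzing a workload; it normally drops them when it finishes, but they can be left behind if a run is interrupted. A sketch (SQL Server 2000 system tables) to find the leftovers:

-- List hypothetical indexes left behind by the Index Tuning Wizard
SELECT OBJECT_NAME(id) AS table_name, name AS index_name
FROM sysindexes
WHERE INDEXPROPERTY(id, name, 'IsHypothetical') = 1

-- Each one found can then be dropped with: DROP INDEX table_name.index_name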
View 2 Replies
View Related
Jul 23, 2005
Hey all, strange problem here... query #1 displays 357 records correctly and all is well. However, when placed within query #2 as a subquery, it updates every single record in the lta table. What's going on here? Any thoughts?

1.)
select *
from LTA INNER JOIN new_list
ON lta.voy = new_list.voy AND
lta.poe = new_list.poe

2.)
update lta
set lta.LL_RCVD = 'N'
where exists (select *
from LTA INNER JOIN new_list
ON lta.voy = new_list.voy AND
lta.poe = new_list.poe)
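A guess at the cause: inside the EXISTS, LTA is joined again instead of being correlated with the outer lta row being updated, so the subquery finds rows regardless of which outer row is being considered and every record qualifies. A correlated version would look something like this:

update lta
set LL_RCVD = 'N'
where exists (select 1
              from new_list
              where lta.voy = new_list.voy
                and lta.poe = new_list.poe)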
View 4 Replies
View Related
May 15, 2008
I am facing a pretty strange issue running a query in SQL Server through an ODBC client. Here is what is happening -
1. I am able to connect to SQL server. The connect entry is present in ODBC log as well as SQL Server trace. Running sp_who2 active indicates the connection to be active. No problem here.
2. At this point, running a simple query (something that returns in less than 10 minutes or so) comes back with the result set. No problem here.
3. However, when a complex query (one that does not return anything even after 10 minutes) is run, a run of sp_who2 active or dbcc inputbuffer indicates that the connection created initially is no longer there. The SQL Server trace indicates a logout event for the connection. However, the ODBC log indicates that the connection is still active in SQL Server, and the ODBC client continues to wait for the results of the query for hours until I kill it.
I use ODBC 4.2 client on AIX 5.2 and connect to SQL Server 2005.
One of the theories I have is that the AIX server does not respond to more than a certain number of keep alive messages resulting in SQL Server closing the connection gracefully thinking that the client is no longer available. However, even after changing the keep alive setting for TCP/IP in Network Configuration Menu to 4 hrs, the same behaviour continues.
Can someone throw some light as to what might be happening?
Thanks.
View 4 Replies
View Related
Aug 14, 2015
I have configured an alert like the one below to track all blocked-process events in SQL Server across all databases, and to kick off a SQL job that inserts data into a table whenever blocking happens. When there is blocking in SQL Server I get an email, which is working fine, and I am able to track all the queries.
But HOW do I get notifications ONLY if blocking is happening for MORE THAN 30 SECONDS OR 1 MINUTE, without using sp_configure?
---ALERT
USE [msdb]
GO
EXEC msdb.dbo.sp_update_alert @name=N'Blocking Process',
@message_id=0,
@severity=0,
@enabled=1,
[Code] .....
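Not a complete answer, but one approach is to let the job (or a scheduled check) look at how long sessions have actually been blocked and only send mail past a threshold, instead of reacting to every blocked-process event. A sketch using the DMVs, with a 30-second threshold as an example:

-- Sessions blocked for more than 30 seconds
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time / 1000 AS wait_seconds,
       t.text AS blocked_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0
  AND r.wait_time > 30000;   -- milliseconds; only send the alert e-mail when this returns rows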
View 4 Replies
View Related
Oct 22, 2015
I have a SQL Server 2014 environment. How do I set up a mechanism that will send a mail alert whenever we get a deadlock on the system?
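One possible approach, sketched below: create a SQL Agent alert on error 1205 (deadlock victim) and attach an operator notification. Note that 1205 is not written to the error log by default, so the alert will not fire unless the message is made logged (or you capture deadlocks another way, e.g. from the system_health extended events session). The operator name below is a placeholder and must already exist:

USE msdb;
GO
-- Make deadlock errors visible to alerts by logging them
EXEC sp_altermessage 1205, 'WITH_LOG', 'true';
GO
EXEC dbo.sp_add_alert
     @name = N'Deadlock detected',
     @message_id = 1205,
     @severity = 0,
     @enabled = 1;
GO
EXEC dbo.sp_add_notification
     @alert_name = N'Deadlock detected',
     @operator_name = N'DBA Team',      -- placeholder operator
     @notification_method = 1;          -- 1 = e-mail
GO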
View 2 Replies
View Related
Jul 14, 2006
hello all,
I am writing a query which selects data against a particular date.
I insert values into the table with the current date (today's date), and the record is stored with a full datetime value (2006-07-14 16:12:09). Now when I run the query two or three minutes later to select the records inserted today, my query returns no results.
I think it is because of the time portion: a couple of minutes later the query is looking for records inserted at a different time and therefore does not find them.
Is there a way to ignore the time portion when running the query, so that it fetches all records inserted today regardless of the time they were inserted and regardless of what time I run the query?
Please anyone help me!
Thanks in advance!
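If it helps, the usual pattern is to compare against a date range that covers the whole day, so the time part stored with each row no longer matters. A sketch with placeholder table and column names:

DECLARE @today datetime;
SET @today = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0);   -- today at 00:00:00

SELECT *
FROM dbo.YourTable                                -- placeholder names
WHERE DateInserted >= @today
  AND DateInserted < DATEADD(day, 1, @today);     -- anything inserted today, at any time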
View 4 Replies
View Related
Jul 23, 2005
Hi All,

I'm trying to track down a mysterious problem we're experiencing in which updates and inserts to tables in our mssql2k server appear to be 'disappearing.'

To explain our situation: we have a web page (written in ASP, if that's relevant) on which we accept enrollment information. When that page is submitted, the form data is passed to a stored procedure on our mssql2k server, which performs several operations, all of which are wrapped in a transaction. In particular, the stored procedure performs an update operation on a record in one table (I'll call it TableA) and an insert into another table (TableB).

If the procedure encounters a problem (i.e. after each update/insert operation in the procedure we test for IF @@Error<>0) it performs a rollback, performs a select similar to the one immediately below, and then RETURNs.

SELECT '1' as error, 'Unable to update TableA' as errormsg

If the procedure doesn't fail any of the @@Error tests, the transaction is committed, and a membership number is SELECTed to be returned.

SELECT '0' as error, @memnum as membershipnumber

The @memnum variable is populated within the transaction.

Back in the ASP page we test both for the proc returning an empty recordset, and for it passing an explicit value in the error field, and push the user to an error page if either of these conditions is met. If, on the other hand, neither condition is met, and the membershipnumber field in the recordset is populated with a valid membership number, we push to a confirmation page. This confirmation page receives the membership number in a session variable, performs a SELECT against TableB (the table that received the insert during the proc) using that membership number in the WHERE clause, and the resultant recordset is used to populate the confirmation details on that page. That recordset is also then used to populate the details of a confirmation email, which is automatically sent by the confirmation page.

And now here's our problem: we've become aware of a handful of people who have gone through the enrollment process and have received the confirmation email containing the information they supplied as expected, but the data appears to be entirely missing from our tables. By that I mean that the record in TableA does not appear to have been updated (under normal circumstances that record should have had several flags set, and several other fields updated with information supplied by the person enrolling), and the record in TableB does not appear to have been inserted.

In essence, looking at our tables, it *feels* like the transaction in the stored procedure for that particular enrollment hit a problem and was rolled back. However, the evidence that we have in the form of the confirmation email argues strongly that the data must have existed in our tables (particularly in TableB), if only for an unknown period of time.

We're kind of at our wit's end to work out what is going wrong with these enrollments. From my understanding of transactions (and I could well be wrong), any changes to data (i.e. updates, inserts etc.) contained within are essentially 'invisible' to any other operation (i.e. the SELECT that happens on the confirmation page) until the transaction is committed, implying that the effect of the update and insert should have been 'permanently' successful if no error code was received and a valid membership number was returned.

I ask because someone on our team has suggested that maybe the operations in the transaction 'lasted long enough' in the tables to have been visible for the SELECT on the confirmation page to have worked, but were then subsequently rolled back, explaining why the confirmation email is appropriately populated and why the data then appears to be missing. However, as I said, this doesn't match my understanding of how transactions behave.

Sorry for the length of this post, but I felt it was best to explain this as well as I could. Does anyone have any advice they can give us on this situation? I.e., are there any known problems with operations in transactions 'bleeding over' into tables, but then being rolled back at some later point? Does anyone have any thoughts or suggestions on how we can further diagnose this issue?

Truly, any help will be immensely appreciated...

Thanks in advance,
M Wells
View 2 Replies
View Related
Mar 18, 2006
Hi everyone,

It looks like a mystery, but I hope there is some explanation for the issue I am experiencing. Once in a blue moon a random stored procedure stops working the way it was designed. The stored procedure code looks unchanged. Recompiling or altering the code does not help. It looks like it simply does not execute some part of it, or does not run at all, yet it returns no errors. One time a procedure entered an infinite loop and almost hung the whole server.

When I copy the procedure code and save it under a different name, it works as designed. But nothing helps with the existing procedure. The only way to fix it is to completely drop and recreate it. The problem is that you usually have to do this in the middle of the business day, after you have spent a few hours trying to work out what went wrong, before you realize that you have another mysterious corruption. Of course I have no clue how to detect such things in advance or prevent them from occurring in the future.

I can guarantee that the SQL code in those procedures was absolutely bug free, fully tested, and had been working fine for a long time. At first I thought the internal compiled code might be corrupt, but in that case altering or recompiling should help. I also thought about the execution plan, but that should also be fixed by the steps above. DBCC CHECKDB does not find any errors. The issue never goes away until the stored procedure is manually dropped and recreated with the same SQL code.

So I'm asking: if anyone has experienced something similar and can explain how to prevent it, please share the knowledge. I would appreciate any type of help.

Thank you.
View 5 Replies
View Related
Jul 20, 2005
Yesterday afternoon I lost access to my local MSDE 2000 SP3(a?) instance. Using NT or SQL authentication I get an 'access denied' error when I try to connect. This happened suddenly, with no changes to the server on my part. I've tinkered with the Client Network Utility settings, from Multiprotocol only to every combination of Multiprotocol, TCP/IP, and named pipes. I've tried connecting with my network machine name, using (local), and using my IP address (both 127.0.0.1 and my network IP). I've tried connecting from Query Analyzer on other computers. Of course I've rebooted as well, all to no avail.

I know that there were network changes being made, but my domain login (which I normally use to connect to my SQL Server as a sysadmin) is still in the local administrators group on my machine, and I haven't noticed any problems accessing anything else.

Any ideas?
View 1 Replies
View Related
Jul 25, 2007
Hi!
When executing some "big" update on a Sql Server 2005 standard edition SP1 (not clustered), sometimes it stops in a middle of work with this message:
The specified network name is no longer available. [SQLSTATE 08S01] (Error 64) Communication link failure [SQLSTATE 08S01] (Error 64)
I have read some articles, but they were all about clustered servers. Somewhere I read that it could be a memory problem, so we put in 3 GB of RAM and a separate 20 GB paging partition beside the other data and backup partitions. The problem remains...
The funny thing is that it sometimes happens and sometimes doesn't, under the same conditions!?
The server is Windows Server 2003 Standard Edition with SP1.
Any suggestions... help?!
View 9 Replies
View Related
Aug 8, 2007
Hello, I have a report that looks fine in preview, but when exported to PDF or printed it contains gaps at the right and bottom.
Here is a picture of the problem: http://northeasttigers.webng.com/pdfproblem.jpg
My tables are the same width as the body, 15.5cm, and the report width is 21cm. Also, adding the bottom table's top location attribute to its height gives the same height as the body.
Any help much appreciated.
Greg
View 4 Replies
View Related
Sep 8, 2006
I get the following error messages in the sql server error log
Source Logon
Message
Error: 18456, Severity: 14, State: 11.
and
Source Logon
Message
Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. [CLIENT: 185.23.11.33]
The scenario is: we set up log shipping (LS) between a clustered SQL Server system (source server) and a stand-alone SQL Server box (target server), both running SQL Server 2K5 EE + SP1. Log shipping itself goes very well, but on the target server we found the above-mentioned error messages.
BTW: the two servers are in the same domain.
Did I miss something in configuration?
Thanks in advance for your help..
Jeff
View 4 Replies
View Related
Oct 12, 2006
Explain this:
Package runs successfully from BIDS
It runs successfully in SSIS store
It runs successfully when I manually execute the Job
JOB FAILS EVERY FREAKING NIGHT
View 33 Replies
View Related