Using Transactions Causes Database Connection To Fail
Feb 8, 2006
In one of my packages, I set the package-level property TransactionOption=Required. At run time I get the error "[Execute SQL Task] Error: Failed to acquire connection "SQL_DW". Connection may not be configured correctly or you may not have the right permissions on this connection." When the property is set to anything other than Required, the package works fine (the parent package that calls this package is not involved in a transaction).
The machine running the packages is Windows Server 2003, and so is the machine where the database lives. I verified that the machine containing the database does have Enable Network DTC Access checked in Control Panel -> Add/Remove Windows Components -> Application Server.
Hi there, I have decided to move all my transaction handling from ASP.NET to stored procedures in a SQL Server 2000 database. I know the database is capable of rolling back transactions just like myTransaction.Rollback() in ASP.NET. But what about exceptions? In ASP.NET I am used to doing the following:

    Try
        'execute commands
        myTransaction.Commit()
    Catch ex As Exception
        Response.Write(ex.Message)
        myTransaction.Rollback()
    End Try

Will the database inform me of any exceptions (and their messages)? Do I need to put anything explicit in my stored procedure other than ROLLBACK TRANSACTION? Any help is greatly appreciated.
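On SQL Server 2000 there is no TRY...CATCH, so the usual pattern is to test @@ERROR after each statement, roll back, and use RAISERROR so the client's exception handler sees the message. A minimal sketch, assuming an illustrative procedure and table name that are not in the original post:

    CREATE PROCEDURE dbo.SaveOrder_Sketch
        @CustomerId int
    AS
    BEGIN
        BEGIN TRANSACTION

        INSERT INTO dbo.Orders (CustomerId) VALUES (@CustomerId)
        IF @@ERROR <> 0
        BEGIN
            ROLLBACK TRANSACTION
            -- severity 11-16 surfaces in ADO.NET as a SqlException
            RAISERROR('Could not save the order.', 16, 1)
            RETURN
        END

        COMMIT TRANSACTION
    END

Any error raised with severity 11 or higher reaches the ASP.NET Catch block as a SqlException, so the Try/Catch shown in the question still works; the stored procedure just has to make sure it rolls back before raising.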
I have got the error below, which is similar to what has been posted in this forum.
SQL_ERROR (-1) in OdbcConnection::connect sqlstate=28000, level=-1, state=-1, native_error=18456, msg=[Microsoft][SQL Native Client][SQL Server]Login failed for user 'sa'.
I have found its solution in this forum. What I would like to ask is this: my MS SQL Server Express setup failed due to the error above, but some of the earlier setup steps completed successfully. In this case, should I run the setup .exe again to continue the remaining process, or do I have to uninstall and install it again? Thanks.
We get the error below while performing a distributed transaction on a linked server. We have several linked servers configured on the source server and all of them succeed with the distributed transaction except one.
We did all the basic troubleshooting and moreover the distributed transactions work fine if we use a remote server instead.
Error: OLE DB provider "SQLNCLI10" for linked server "SERVERNAME.REDMOND.CORP.MICROSOFT.COM" returned message "No transaction is active.". Msg 7391, Level 16, State 2, Line 3 The operation could not be performed because OLE DB provider "SQLNCLI10" for linked server "SERVERNAME.REDMOND.CORP.MICROSOFT.COM" was unable to begin a distributed transaction.
Test code:

    BEGIN DISTRIBUTED TRANSACTION
    SELECT TOP 10 * FROM [SERVERNAME.REDMOND.CORP.MICROSOFT.COM].master.sys.objects
    ROLLBACK
Source server : Microsoft SQL Server 2008 (RTM) - 10.0.1779.0 (X64) Nov 12 2008 12:10:04 Copyright (c) 1988-2008 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 6.0 <X64> (Build 6001: Service Pack 1) (VM)
Target server : Microsoft SQL Server 2008 (RTM) - 10.0.1600.22 (Intel X86) Jul 9 2008 14:43:34 Copyright (c) 1988-2008 Microsoft Corporation Enterprise Edition on Windows NT 5.2 <X86> (Build 3790: Service Pack 2)
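Since the failure is specific to one linked server while the others work, a cheap first step is comparing how that server is configured against the working ones. A hedged sketch; the is_remote_proc_transaction_promotion_enabled column is an assumption that the source server is SQL Server 2008 or later (it is, per the version string above):

    -- compare the failing linked server's settings with the ones that work
    SELECT name, provider, data_source,
           is_remote_login_enabled, is_rpc_out_enabled, is_data_access_enabled,
           is_remote_proc_transaction_promotion_enabled
    FROM sys.servers
    WHERE is_linked = 1;

If the options match, the difference is usually on the MSDTC side (network DTC access, firewall, and name resolution between the two machines) rather than in the linked server definition itself.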
I have linked 4 of my workstations to a server running MS SQL. One of my PCs can run a piece of software that updates MS SQL perfectly, but the other three fail. I tried to add a System DSN data source via Control Panel -> ODBC Data Sources (32-bit). The PC that already worked runs the software fine, but the other three still do not. My PCs are running XP and Win98. Regards, thanks. Leslie Lim
Hi, I have a web application using ASP.NET, and I have two connection strings (cadenaCon and cadenaCon2) in web.config:

    <connectionStrings>
        <add connectionString="Server=SUNBOGW-MROMEROSQLEXPRESS;Database=ModulosWebkkk;UID=sa;PWD=martha;" name="cadenaCon" providerName="System.Data.SqlClient" />
        <add connectionString="Server=SUNBOGW-MROMEROSQLEXPRESS;Database=ModulosWeb;UID=sa;PWD=martha;" name="cadenaCon_bak" providerName="System.Data.SqlClient" />
    </connectionStrings>

In the class file I have this sub and it works well:

    Public Sub New()
        conexion = New SqlConnection(ConfigurationManager.ConnectionStrings("cadenaCon").ConnectionString.ToString)
    End Sub

My problem is that I would like to try to connect with cadenaCon and, if that connection fails, use the other one. I used a Try block but it doesn't work:

    Public Sub New()
        Try
            conexion = New SqlConnection(ConfigurationManager.ConnectionStrings("cadenaCon").ConnectionString.ToString)
        Catch ex As Exception
            Throw New Exception(ex.Message)
            conexion = New SqlConnection(ConfigurationManager.ConnectionStrings("cadenaCon_bak").ConnectionString.ToString)
        Finally
        End Try
    End Sub

I would appreciate a suggestion. Thanks, Martha
Hi group, being pretty new to setting up SQL Server, I am stuck. I have installed SQL Server 2000 on Windows 2000 Server with IIS 5.0. I try to connect via an ASP script that looks like this:

    <%
    Set Conn = Server.CreateObject("ADODB.Connection")
    ConStr = "PROVIDER=SQLOLEDB; DATA SOURCE=192.168.1.99"
    ConStr = ConStr & ";UID=someUser;PWD=somePassword"
    ConStr = ConStr & ";initial catalog=myDatabaseName;network library=DBMSSOCN"
    Conn.Open ConStr
    Conn.Close
    Set Conn = Nothing
    %>

The IP number is the web server my router points to on port 80. But I get an HTTP 500 error and "The page cannot be displayed". Any suggestions as to what I am doing wrong here would be very much appreciated. I have run applications that operate Access databases for a long time, but starting a new, big project forces me to upgrade to SQL databases. Best wishes, FCH
When I attempt to connect to Microsoft SQL Server 2005 Express from an ASP.NET 2.0 application, it throws the following error.
I had set up the protocols for Express to allow remote connections over both TCP/IP and named pipes (both are enabled). Login failed for user ''. The user is not associated with a trusted SQL Server connection.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Login failed for user ''. The user is not associated with a trusted SQL Server connection.
Source Error: (the remainder of the error page is the standard ASP.NET boilerplate describing how to enable debug mode, either with a <%@ Page Debug="true" %> directive or in the application's configuration file, along with the usual warning to disable debugging before deploying to production)
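That particular message usually means the instance only accepts Windows authentication while the connection string supplies (or omits) SQL Server credentials. A hedged sketch of the usual checks from a query window; the login name and password are illustrative:

    -- returns 1 when the instance is in Windows-authentication-only mode
    SELECT SERVERPROPERTY('IsIntegratedSecurityOnly');

    -- if SQL logins are needed, switch the server to mixed mode (Server Properties ->
    -- Security in Management Studio, then restart the service) and create a login:
    CREATE LOGIN WebAppLogin WITH PASSWORD = 'Str0ng!Passw0rd';

The alternative is to keep Windows-only mode and use Integrated Security=SSPI in the application's connection string instead of a SQL login.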
A colleague wants to insert many millions of records where the values are computed in a C++ program. He connects to the database with ODBC and does an INSERT for each row. This is slow, apparently because each INSERT is a separate transaction. Is there a way to delay committing the data until several thousand records have been written? Inside SQL Server this is simple, but I don't see an equivalent when using ODBC. Or is there something better than ODBC? Or might it be faster to write values to a file and then use bulk insert? I would appreciate any thoughts on this general problem! Thanks, Jim
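On the flat-file idea in the question above, a minimal T-SQL sketch; the table name, file path, and delimiters are illustrative assumptions, and the C++ program would first write the computed rows to the file. Separately, ODBC does let you turn off autocommit on the connection so that thousands of INSERTs share one transaction before an explicit commit.

    -- load the whole file in one operation instead of row-by-row INSERTs
    BULK INSERT dbo.ComputedValues
    FROM 'C:\data\computed_values.txt'
    WITH (
        FIELDTERMINATOR = '\t',   -- tab between columns
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 10000   -- commit every 10,000 rows
    );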
I am running SQL Server 2012 on a VM and am trying to pull data from a network drive (S: or misvnascfs0003). I can import data using the Import Wizard and save the package as an SSIS package. When I try to run the package, I get "The AcquireConnection method call to the connection manager "SourceConnectionExcel" failed with the error..."
If I save the file to the VM's local drive, the SSIS package runs fine. When I log into SSIS through SSMS, I have to run it as "Administrator".
We find that if we deploy the OLAP database with a different name on the test server, then regardless of how we change the connection string provided to the SSIS package that processes the cube, the package fails to connect to the database. To clarify:
In development the OLAP database is called MyOlapDB and the source database is called MySqlDB. Both are on the same machine. When the application is built and released for test, the test team install the databases on a replica of the production environment (i.e. web app on one machine, OLAP DB on another, and SQL database on yet another). They also, quite rightly, name the new test databases so they incorporate the build version number. So, MyOlapDB123 and MySqlDB123 are both from build 123.
This is when the problems start. Regardless of how the connection string is specified in the job that processes the cube, the SSIS integration package fails with the error:
[Analysis Services Execute DDL Task] Error: Errors in the metadata manager. Either the database with the ID of 'MyOlapDB' does not exist in the server with the ID of 'OurTestServer', or the user does not have permissions to access the object.
We have tried config files and job properties, but neither works. Also, simply attempting to run the package using DTEXECUI does not work either.
Looking inside the XML of the package, we clearly see the ConnectionManager object which has the original connection string, which is
Data Source=localhost;Initial Catalog=MyOlapDB;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;
However, editing the initial catalog here still does not solve the problem. Searching the XML for the string MyOlapDB reveals the OLAP database name in two other places - both within the object data of the two Analysis Services Execute DDL tasks.
Anyone know how to solve this problem without having to hack the XML of the package?
Dear experts, I have a database in SQL 2005 and now I want to build an identical one on another machine. I searched Google and was told I can use backup & restore or a script. I tested backup/restore and it works great; now I want to give the script approach a try. I scripted the database successfully to a SQL script file; however, when I run it against the new database server, the new database is not created as expected. Could anybody explain why? Thanks in advance.
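One common reason is that the generated script contains only the objects (and often a USE statement) but no CREATE DATABASE, so there is nothing for the script to run against. A hedged sketch with an illustrative database name:

    -- create the empty database first (or re-script with the option to script
    -- CREATE DATABASE enabled), then run the generated object script in its context
    CREATE DATABASE MyCopyDb;
    GO
    USE MyCopyDb;
    GO
    -- ... run the generated script here ...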
I am trying to attach a database but get an access error. Here is what I did:
(1) I installed Microsoft SQL Server 2005 Express and SQL Server Management Studio Express on my PC (Vista).
(2) I imported a database called myCompany.mdf, saved in the Microsoft SQL Server/MSSQL.1/MSSQL/Data directory, with all users (including myself, mary-PCmary) allowed full control.
(3) Attach the database using Microsoft SQL Server Management Studio Express -> right-click Databases -> Attach -> the Attach Databases window pops up -> click Add... -> and I get the following error:
TITLE: Microsoft SQL Server Management Studio Express ------------------------------
Failed to retrieve data for this request. (Microsoft.SqlServer.Express.SmoEnum)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.Express.ConnectionInfo)
------------------------------
The server principal "mary-PCmary" is not able to access the database "model" under the current security context. (Microsoft SQL Server, Error: 916)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3042&EvtSrc=MSSQLServer&EvtID=916&LinkId=20476
I am the admin of this PC and can see the database and log files in the /Data directory with full permission, so why can't I attach the database file to the SQL Server? Can someone help? Thanks.
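Error 916 against model usually comes from SSMS Express enumerating databases with a login that lacks rights, not from the attach itself; running SSMS as administrator or connecting with a sysadmin login is the usual first step. If the dialog still fails, attaching from a query window is an alternative; a hedged sketch using the directory mentioned above (the exact path is an assumption):

    -- attach the data file directly; add a LOG ON clause if the .ldf is available
    CREATE DATABASE myCompany
    ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\myCompany.mdf')
    FOR ATTACH;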
I added a connection (ADO.NET) object named testCon in the connection manager. I wanted to supply the connection string programmatically, so I used the "Expressions" property of the connection object and bound its ConnectionString to a DTS variable. The idea is to supply the connection string value through the variable, so that the connection object uses my connection string.
Then I added a "Backup Database Task" to my package with the name BkpTask. Now whenever I try to set the connection property of BkpTask to the testCon connection object, by typing testCon, it automatically gets cleared. I am not able to set the connection value.
Then after spending several hours I found that this is because I have customized the connection string in testCon. If I don't customize the connection string, I am able to enter the "testCon" value in the connection property of the BkpTask.
On my dev server I have working SSIS packages that use connections based on the Microsoft OLE DB Provider for Oracle (MSDAORA.1), the Oracle Provider for OLE DB, and the OracleClient data provider.
I use one or the other according to my needs.
In anticipation of, and to prepare for, the build of a new production server, I have built a test server from scratch and deployed the entire dev environment to it.
Almost everything works except Microsoft OLEDB provider for Oracle.
SSIS packages on the test machine return an error:
Error at Pull Calendar from One [OLE DB Source [1]]: The AcquireConnection method call to the connection manager "one.oledb" failed with error code 0xC0202009.
Error at Pull Calendar from One [DTS.Pipeline]: component "OLE DB Source" (1) failed validation and returned error code 0xC020801C.
[Connection manager "one.oledb"]: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80004005 Description: "Oracle error occurred, but error message could not be retrieved from Oracle.".
I have used the same installers for OS, SQL and Oracle SQL*Net on both dev and test machines. The install and then the restore/deployment on Test went fine.
Could anyone point me in the right direction to solve this issue?
Hi, I am preparing to upgrade my database to 7.0. While running DBCC on my existing database (version 6.5), the process completed with a status of failed. Below is the message given:
Processed 64 entries in the Sysindexes for dbid 7. Found 7 bad entries in the Sysindexes. DBCC execution completed. If DBCC printed error messages, see your System Administrator.
Should I just ignore this message and proceed? If I were to clean up the bad entries, how do I proceed?
I was installing SQL Server 2005 via unattended installation. I have used this method a few times before, and it was always successful. However, when I used it recently, the following message popped up in the middle of the installation (around the time it was trying to install the SQL Server Database Services):
SQL Server Setup was unable add user NTADMINNcheng to local group SQLServer2005MSFTEUser$NTHKGV34$NCHENG
and because of this, the whole setup failed.
Could someone please help me on this issue? Thanks in advance.
Hi all, I would like to know how to write all transactions in the log file to the database without doing a full backup of the database. Please let me know if there is a DBCC statement or some other method to do this.
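Reading this as "harden the committed changes that are currently only in the log into the data files" (that reading is an assumption), that is what a checkpoint does, and a log backup rather than a full backup is what then lets the log space be reused. A minimal sketch with an illustrative database name and path:

    USE MyDatabase;
    GO
    CHECKPOINT;   -- writes all dirty pages for the current database to disk
    GO
    BACKUP LOG MyDatabase TO DISK = N'C:\backups\MyDatabase_log.trn';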
This should be a fairly simple question. It's based on this error message: "Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 1, current count = 0." I get this when executing a stored procedure upon processing a form. This error happens when I intentionally provide input to the stored procedure that I know should cause it to error out. I catch the exception, and it contains the error message, but it also contains the above message added on to it, which I don't want. I won't post the entire stored procedure, but I'll list a digest of it (just those lines that are significant). Assume that what's included is what happens when I provide bad input:

    BEGIN
        BEGIN TRY
            BEGIN TRANSACTION
            RAISERROR('The item selected does not exist in the database.', 16, 1);
            COMMIT -- This won't execute when the RAISERROR breaks out to the CATCH block
        END TRY
        BEGIN CATCH
            ROLLBACK
            DECLARE @ErrorSeverity INT, @ErrorMessage NVARCHAR(4000)
            SET @ErrorSeverity = ERROR_SEVERITY()
            SET @ErrorMessage = ERROR_MESSAGE()
            RAISERROR(@ErrorMessage, @ErrorSeverity, 1)
        END CATCH
    END

Okay, so that works fine. The problem is when I execute this with an SqlCommand object on which I've opened a transaction. I won't include the entire setup of the data (with the parameters, since those seem fine), but I'll give my code that opens the connection and executes the query:

    con.Open();
    SqlTransaction transaction = con.BeginTransaction();
    command.Transaction = transaction;
    try
    {
        command.ExecuteNonQuery();
        transaction.Commit();
    }
    catch (Exception ex)
    {
        transaction.Rollback();
    }
    finally
    {
        con.Close();
    }

I'm calling the stored procedure listed above (which has its own transaction) using a SqlCommand object on which I've opened a transaction. When there is no error it works fine. But when I give the stored procedure bad data, it gives me that message about the transaction count. Is there something I need to do in either my SQL or my C# to handle this? The entire message found in the Exception's Message is a concatenation of the message in my RAISERROR along with the transaction count message I quoted at the beginning. Thanks, -Dan
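The mismatch arises because the procedure's ROLLBACK undoes the outer (client-side) transaction as well, leaving @@TRANCOUNT at 0 when it was 1 on entry. One common pattern is to have the procedure roll back only a transaction it started itself and otherwise just re-raise, leaving the outer rollback to the C# catch block. A hedged sketch (the procedure name is illustrative):

    -- only begin/commit/rollback a transaction when this procedure is the outermost caller
    CREATE PROCEDURE dbo.ProcessItem_Sketch
    AS
    BEGIN
        DECLARE @startedTran bit
        SET @startedTran = 0

        BEGIN TRY
            IF @@TRANCOUNT = 0
            BEGIN
                BEGIN TRANSACTION
                SET @startedTran = 1
            END

            RAISERROR('The item selected does not exist in the database.', 16, 1)

            IF @startedTran = 1
                COMMIT
        END TRY
        BEGIN CATCH
            -- roll back only what this procedure started; otherwise leave the caller's
            -- transaction for SqlTransaction.Rollback() to handle, so the transaction
            -- count on exit matches the count on entry
            IF @startedTran = 1 AND @@TRANCOUNT > 0
                ROLLBACK

            DECLARE @ErrorSeverity INT, @ErrorMessage NVARCHAR(4000)
            SET @ErrorSeverity = ERROR_SEVERITY()
            SET @ErrorMessage = ERROR_MESSAGE()
            RAISERROR(@ErrorMessage, @ErrorSeverity, 1)
        END CATCH
    END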
Help, I had a transaction log grow to its restricted size; the person that created it set the max size such that the log is now almost at that limit, so needless to say I have no space to work with. SQL got bounced and my database went into recovery mode. After recovery completed I tried to put my database into emergency mode; the statement executes but never sets the mode. Next I tried DBCC CHECKDB and I get Msg 7929, Level 16, State 1, Line 1: Check statement aborted. Database contains deferred transactions. There is no backup for this database (it is a dev play area). I cannot detach the database because of the same error. What next? Any help would be great.
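For what it is worth, the usual last-resort sequence when there is no backup is the one below; this is only a hedged sketch (SQL Server 2005+ syntax, illustrative database name), and REPAIR_ALLOW_DATA_LOSS does exactly what its name says:

    ALTER DATABASE DevPlayDb SET EMERGENCY;
    ALTER DATABASE DevPlayDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    -- repair can discard data; tolerable here only because it is a dev play area
    DBCC CHECKDB (N'DevPlayDb', REPAIR_ALLOW_DATA_LOSS);
    ALTER DATABASE DevPlayDb SET MULTI_USER;

Freeing disk space for the log first (or adding a second log file on another drive) is also worth trying, since the emergency-mode and detach failures may simply be follow-on effects of the full log.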
Every once in a while a scheduled restore of a production database backup to a development server will fail with the following error.
RESTORE cannot operate on database 'XXX' because it is configured for database mirroring or has joined an availability group
While it is true the production database is involved in database mirroring, the development server does not have database mirroring enabled. This error tells me something within the backup is telling the development server the database is configured for database mirroring.
However the perplexing part for me is that we only receive this error maybe 5% of the time, if that, and only on a couple of our databases. We have numerous other restores of mirrored production databases to development servers that have never produced this error. So my question is what is causing this error to occur, and why is it not happening all of the time? We get around this error by deleting the DEV database and re-running the restore job.
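For reference, the manual workaround described above (delete the DEV copy, then restore) can be scripted into the scheduled job so it happens every time; a hedged sketch with illustrative paths, using the database name from the error message:

    -- drop the development copy if it exists, then restore fresh from the production backup
    IF DB_ID(N'XXX') IS NOT NULL
    BEGIN
        ALTER DATABASE [XXX] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE [XXX];
    END
    RESTORE DATABASE [XXX]
    FROM DISK = N'\\backupserver\share\XXX.bak'
    WITH REPLACE, RECOVERY;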
I have done a full backup at 3 PM, and a differential backup every day at 5 PM.
I tried to restore it on my testing server and encountered a problem restoring File 3; when I tried File 2 it was OK. Can I know what usually causes this error? Thank you.
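A differential can only be restored on top of the specific full backup it was taken from, and that full backup must be restored WITH NORECOVERY first; if File 3 is a differential belonging to a later full backup, or the full was restored WITH RECOVERY, the restore fails. A hedged sketch with illustrative file names:

    -- restore the 3 PM full first, leaving the database ready for further restores
    RESTORE DATABASE TestDb
    FROM DISK = N'C:\backups\TestDb_full_3pm.bak'
    WITH NORECOVERY, REPLACE;

    -- then the matching 5 PM differential, bringing the database online
    RESTORE DATABASE TestDb
    FROM DISK = N'C:\backups\TestDb_diff_5pm.bak'
    WITH RECOVERY;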
Hey, I hope someone can quickly tell me what I am obviously missing with this weird problem. To give the general picture, I have an ASP.NET web page that allows users to select values from several dropdown menus and click an Add button, which formats and concatenates the items together into a listbox. After the listbox has been populated, the users have the option to save the items via a Save button. The Save button parses each item in the listbox to basically decode the concatenated values and then inserts them into a table residing in a backend MSSQL 2005 database.

PROBLEM: In the process of testing the application, I noticed this strange behavior. If I use the web page to insert the values, then go to the table where the values are stored and delete the rows, upon a refresh of the web page the same actions seem to get replayed and the items are again inserted into the table. Naturally, what I'd really like is for the page to refresh and show that the items are no longer there, not the other way around. If the code that performed the insert were in a component set for postback I'd expect this type of behavior, but it's in the Save button's OnClick event.

I have tried practically everything in an effort to target the problem but am not having much luck with it. Is this behavior expected in ASP.NET, or has anyone ever heard of anything similar? I have never encountered this type of problem before and was hoping someone could provide some clues for resolving it. If more information is required I'd be happy to supply it. Hopefully there's a simple explanation that I am simply unaware of. Anybody got any ideas? Thanks.
Is there somewhere in MSDE (or SQL Server) where you can see how many transactions are made against a certain database, or by a certain user? That way I could figure out which database/user uses the most (or least) resources (CPU) over a period of time.
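MSDE ships without the Profiler UI, but the SQL 2000-era sysprocesses view is available and gives a rough, point-in-time picture of CPU and I/O per database and login (it does not accumulate history over a period). A hedged sketch:

    -- current sessions grouped by database and login, heaviest CPU users first
    SELECT DB_NAME(dbid) AS database_name,
           loginame,
           SUM(cpu)         AS cpu,
           SUM(physical_io) AS physical_io
    FROM master..sysprocesses
    GROUP BY dbid, loginame
    ORDER BY SUM(cpu) DESC;

To track usage over time you would have to sample a query like this on a schedule and store the results, since the counters reset when connections close.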
I've been running the Ola Hallengren maintenance script for the last five months without missing a beat. Today I find an error stating the UserDatabase Integrity check job failed last night. This is running on SQL Server 2014 BI edition with 64 GB of RAM.
I ran DBCC CHECKDB on each database manually and all worked until I tried it on the biggest one, which is about 18 GB. It just keeps running and I eventually stopped it, so I'm guessing it is memory, but that doesn't make sense considering the server has 64 GB. I have max/min server memory set to 64/4. Again, this was never an issue until last night. I've been looking it up all morning but not seeing much on this error: "The operating system returned error 1453".
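Operating system error 1453 generally decodes to "Insufficient quota to complete the requested service", i.e. Windows-level memory/working-set pressure rather than the buffer pool, which is why the 64 GB setting does not help by itself. One thing sometimes tried while the root cause is investigated, offered only as a hedged sketch with an illustrative database name, is a lighter-weight check:

    -- PHYSICAL_ONLY verifies page/allocation consistency and checksums but skips the
    -- logical checks, so it needs far less memory and time on a large database
    DBCC CHECKDB (N'BigDb') WITH PHYSICAL_ONLY, NO_INFOMSGS;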
A server I'm working on has a very unique situation, where user tables and production tables reside in the same database. Users update/create tables or populate them, so it can't be a table-specific trigger. However, they give a new meaning to "kamikaze pilots": it's not uncommon for them to "accidentally" update/insert/delete 500,000,000+ records in a single statement. I've tried educating them to use batching, but to no avail, so now I'm forced to stop these statements BEFORE they execute, based on row count, as they fill up the database log so quickly that it goes into recovery mode (it has a 200 GB log file - insane, I know).
I reckon the most rows allowed should be 1,000,000 in a single statement. I'm looking for a database trigger to stop them from executing statements that affect more records than that; a sketch of one possible guard follows below.
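SQL Server has no database-wide DML trigger, so a guard like this has to exist on each table (for example, attached automatically by a DDL trigger whenever a table is created). Also note that an AFTER trigger fires only after the modification has already been written to the log, so this enforces the limit but does not by itself prevent log growth. A hedged sketch with an illustrative table name:

    CREATE TRIGGER trg_RowLimit ON dbo.SomeUserTable
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        -- block any single statement touching more than 1,000,000 rows
        IF (SELECT COUNT_BIG(*) FROM inserted) > 1000000
           OR (SELECT COUNT_BIG(*) FROM deleted) > 1000000
        BEGIN
            RAISERROR('Statements affecting more than 1,000,000 rows are not allowed; use batching.', 16, 1);
            ROLLBACK TRANSACTION;
        END
    END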
The client is running the X version of an application and the corresponding database is huge. The client's vendor is now releasing the Y version of the same application with many database schema changes (new tables added, new columns added, existing columns renamed, etc.). To upgrade to the Y version, the vendor is suggesting that my client take the system down and upgrade the application/database. We are sure that this process will take days, and my client is not ready to have the system down for that long, so we are trying to find a solution with minimal downtime. The approach we are thinking of is:
1) Create a replicated database on another server (server2) from the production server (server1) using GoldenGate, with the X-version schema.
2) Create the new/updated schema tables from the Y-version database on the same server1. For the updated-schema tables we are planning to use the name <table_name>_Y_version, since tables with the original names already exist in the X version.
3) With the above two steps, GoldenGate replicates the changes from production to server1, and server1 will have the new Y-version table schema (with the different concatenated name '_Y_version'). BTW, there is no effect on production.
4) At this stage we are trying to find the best approach to fill the <table_name>_Y_version tables from the X-version tables. There are two challenges here: a) all data needs to be moved to the Y-version tables, and b) the data has to stay in sync in real time.
We thought of going with:
a) an SSIS package to pump the data to the Y-version tables, but then real-time data will not stay in sync;
b) a trigger-based technique, but previous experience says that creates a lot of load.
I have some code that dynamically creates a database (its name is @FullName) and then creates a table within that database. Is it possible to wrap these things into a transaction such that if any one of the following fails, the database "creation" is rolled back? Otherwise, I would try deleting on error detection, but it could get messy.

    IF @Error = 0
    BEGIN
        SET @ExecString = 'CREATE DATABASE ' + @FullName
        EXEC sp_executesql @ExecString
        SET @Error = @@ERROR
    END

    IF @Error = 0
    BEGIN
        SET @ExecString = 'CREATE TABLE ' + @FullName + '.[dbo].[Image] ( [ID] [int] IDENTITY (1, 1) NOT NULL, [Blob] [image] NULL , [DateAdded] [datetime] NULL ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]'
        EXEC sp_executesql @ExecString
        SET @Error = @@ERROR
    END

    IF @Error = 0
    BEGIN
        SET @ExecString = 'ALTER TABLE ' + @FullName + '.[dbo].[Image] WITH NOCHECK ADD CONSTRAINT [PK_Image] PRIMARY KEY CLUSTERED ( [ID] ) ON [PRIMARY]'
        EXEC sp_executesql @ExecString
        SET @Error = @@ERROR
    END
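CREATE DATABASE cannot be executed inside a user-defined transaction, so wrapping the whole sequence in BEGIN TRAN ... ROLLBACK will not work; the usual fallback is exactly the cleanup-on-error the post hopes to avoid. A hedged sketch of that cleanup, reusing the variables above, that could sit at the end of the code:

    -- if anything failed after the database was created, drop it again
    IF @Error <> 0 AND DB_ID(@FullName) IS NOT NULL
    BEGIN
        SET @ExecString = 'DROP DATABASE ' + QUOTENAME(@FullName)
        EXEC sp_executesql @ExecString
    END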
I need detailed instructions on how to connect from a Microsoft Access 2007 database to a Microsoft Office Accounting 2007 database. The accounting database is a SQL Server 2005 database with an instance name of "MSSMLBIZ".
When I try, I get SQL error 53: do not have permissions, or the database does not exist.