Distribution Job Step Retries All In One Connection And/or Transaction?
Jan 24, 2007
When a distribution job step "Retry Attempts" is > 0 and the step has certain problems, the step is "retried" after "Retry Interval".
But I am uncertain as to the details. Are the retries within one database transaction? Does each "try" get its own transaction? And what about connection? Is the "retry" done with the same connection? I know this may sound funny but I would like to know exactly what is going on here.
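For reference, the "Retry Attempts" and "Retry Interval" values live on the SQL Server Agent job step that runs the distribution agent; a minimal sketch (with a hypothetical job name) of how to inspect and set them, which does not by itself answer the transaction/connection question:

-- Inspect retry settings for all job steps (retry_interval is in minutes).
SELECT j.name, s.step_name, s.retry_attempts, s.retry_interval
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id;

-- Set them on a hypothetical distribution agent job's "Run agent" step.
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'SERVER-PubDB-Publication-Subscriber-1',  -- hypothetical job name
    @step_id = 2,
    @retry_attempts = 10,
    @retry_interval = 1;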
I created a calculated measure in the cube, something like this: ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],[Measures].[Transaction Amount]), to get only spend transactions. Now I want to slice this measure by the same hierarchy to find the amount distribution across the different transaction types under spend transactions. But the query behaves as if the calculated measure has no relationship with the hierarchy.
You can think of it as the query below: WITH MEMBER SPEND AS ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent].&[SPEND],[Measures].[Transaction Amount]) SELECT NON EMPTY {SPEND} ON 0, NON EMPTY ([TransType].[TransTypeHierarchy].[TransTypeCategoryParent]) ON 1 FROM [CUBE]
I currently have a simple transactional replication setup for a database. My publisher and distributor are on the same box. The subscription is setup using a push agent.
My question is related to recovery of the subscriber.
So let's say replication is set up and working fine. Suddenly we have a failure on the subscriber database. Now I could just reconfigure the subscription, and the subscribing database would be back up and good to go, but the problem is that over time we have made some changes to the subscribing database that are not made in the publisher. For example, the tables have different indexes. Just reconfiguring the subscription would not recover these objects.
So I have to actually restore the subscriber database. So I do that, and apply transaction logs up to the most recent transaction log backup. Now, consider that my transaction log backups on the subscriber happen every 4 hours, and the most recent transaction log backup I had was from 3 hours ago. So at this point my subscribing database is 3 hours behind my publisher.
Now, will the distribution agent resend the missing 3 hours of transactions?
In the distribution agent properties, there are two settings for transaction retention, "at least" and "but not more than". Currently they are set to 0 and 72 hours respectively. Now I would assume that if I set the "at least" setting to the subscriber transaction log backup period, in this case 4 hours, I would be covered, and the distribution agent would indeed re-replicate the transactions that happened since the recovery point 3 hours ago.
I just wanted to verify that this is actually what these settings are referring to, and that if I set the "at least" setting to 4 hours, I would be covered.
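For reference, those two retention values correspond to the min_distretention and max_distretention properties of the distribution database; a minimal sketch of setting them at the Distributor, assuming the default database name 'distribution':

-- Run at the Distributor; values are in hours.
EXEC sp_changedistributiondb
    @distribution_db = N'distribution',
    @property = N'min_distretention',   -- the "at least" setting
    @value = N'4';
EXEC sp_changedistributiondb
    @distribution_db = N'distribution',
    @property = N'max_distretention',   -- the "but not more than" setting
    @value = N'72';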
We have replication set up on a SQL Server 2000 instance. We have encountered an issue where the distribution agent goes down (distrib.exe stops running) when the network connection is broken. We would like to know:
Is it the expected result that the distribution agent goes down in the event of a communication failure between the distribution server and the subscriber server? If not, is there a way to programmatically control and restart the agent? Is there any stored procedure in SQL Server which can monitor the replication communication error message? Is there any stored procedure which can be run to restart the agent? As a best practice, what do you think we can do to achieve an event-driven kind of mechanism, so that when communication breaks the agent is restarted by the triggered event (or at least a simple way to restart it automatically)?
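For the restart part, a distribution agent on SQL Server 2000 runs as a SQL Server Agent job, so as a sketch it can be found by its category and restarted with sp_start_job; whether polling or an alert is the right trigger is the open question, and the job name below is hypothetical:

-- Find the distribution agent jobs on the Distributor.
SELECT j.name
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.syscategories AS c ON c.category_id = j.category_id
WHERE c.name = N'REPL-Distribution';

-- Restart one of them (hypothetical name).
EXEC msdb.dbo.sp_start_job @job_name = N'SERVERA-PubDB-Publication-SERVERB-1';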
On SQL Server 2005 SP2 for Publisher and Distributor on the same instance, my old snapshots are not being cleaned up.
The following error is in the agent history:
Executed as user: Domain\MyUser. Could not remove directory '\\vmsql01\ReplData\unc\Publication_TRANSACTIONAL\20070702104416'. Check the security context of xp_cmdshell and close other processes that may be accessing the directory. [SQLSTATE 42000] (Error 20015). The step failed.
xp_cmdshell is enabled and I can run commands like :
exec master.dbo.xp_cmdshell 'md c:\TestFolder'
The permissions on the snapshot share and file system are such that Domain\MyUser has full control.
I have logged into the machine as this user and can remove snapshots so it does not seem to be a permission issue.
On other machines I do not get any errors but the snapshot folder still is not cleaned up.
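One way to check the security-context hint in the error is to see which Windows account xp_cmdshell commands actually run under, and to try removing a folder that same way; a small sketch with a hypothetical test path:

-- Shows the Windows account used by xp_cmdshell (the account the cleanup actually runs as).
EXEC master.dbo.xp_cmdshell 'whoami';
-- Try removing a hypothetical test directory the same way the cleanup task would.
EXEC master.dbo.xp_cmdshell 'rd /s /q "C:\TestFolder"';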
I want to convert .rdl to .rdlc and need the full steps. I created the .rdl report using a stored procedure successfully. Now I want to convert it to .rdlc, but while doing so I get an authentication error and something else. I created the .rdl in 2008 and I want to change it to .rdlc for 2010.
I have a package that has multiple data flow tasks. At the end of a task, key data is written into a raw file (file name stored in a variable) that is used as a data source for the next task. Each task requires a success from the preceding task.
Here's the rub:
If I execute the entire package, the results of the package (the number of records produced by certain tasks) differ significantly from when I execute each step in the package in turn (many more records, e.g. 5 vs. 350).
I get the feeling that the raw file is read into memory before it is flushed by the previous task, or that the next task begins its preparation too early.
Any help is greatly appreciated.
I am running on Server 2003 64 (although the same thing happens when deployed on a Server 2003 32 machine)
I hope the answer is as simple as the question -- but after reading all the documentation I could find (understand?) and a lot of posts here, I'm no closer to achieving the goal.
I have a Visual C# app, DAYTRACKER, developed in VS2005. It uses a database with several tables constructed using SQL Server 2005 Developer Edition.
I want to deploy the app plus the database plus SQL Express to another machine, to be used by a single user (the administrator) with no need for network connectivity of any kind.
What I have so far is:
1. The application is successfully deployed from a CD-ROM, having used the Publish process within VS2005, and opens on the new machine -- without database connectivity, however.
2. SQL Express is successfully deployed (it deployed as a 'prerequisite' when I went through the Publish process in VS2005).
3. I manually copied the database's .mdf and .ldf files, using SQL Server Management Studio's 'Copy Database' function, then transferred the copies to the new machine into the ..\MSSQL.1\MSSQL\Data folder (where they appear along with the master.mdf, mastlog.ldf etc. files).
Now, the DAYTRACKER application's DAYTRACKERConnectionString under 'Settings' in the VS2005 studio reads 'Data Source=DELL3;Initial Catalog=DayTracker;Integrated Security=True' (which are the appropriate parameters for the machine, DELL3, on which I wrote the program.)
The problem, of course, is that SQL Express on the new machine doesn't connect the application to the database. In 'SQL Server Configuration Manager', under 'SQL Server 2005 Services', I double-click on the 'SQL Server (SQLEXPRESS)' icon; the service is running and is logged on using the 'Local System Account'. Under the 'Service' tab the Host Name is 'MUSIC' (which is the name of the new machine I've installed the app onto -- which of course is not the name - DELL3 - that the app's connection string is expecting). Under the 'Advanced' tab, I've tried correcting the Startup Parameters' default .mdf and .ldf entries to ..\DayTracker.mdf and ..\DayTracker_log.ldf, but the server won't start up after I make the changes.
What I'm hoping for: a step-by-step way of doing this type of deployment, preferably getting it all onto one CD-ROM, and installing it on the new machine so that it all works seamlessly from the start, not requiring any 'tweaking' of the SQL Server Express settings by the end user.
But I'll take pretty much anything that fixes the specific db connectivity problem I've described.
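On the database side, copying the .mdf/.ldf files is not enough by itself; they still have to be attached to the local SQLEXPRESS instance, and the connection string has to point at that instance instead of DELL3. A minimal sketch, assuming the files were copied to the default Data folder (the paths are assumptions):

-- Run against .\SQLEXPRESS on the target machine (e.g. with sqlcmd -S .\SQLEXPRESS -E).
CREATE DATABASE DayTracker
ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\DayTracker.mdf'),
   (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\DayTracker_log.ldf')
FOR ATTACH;

The application's connection string would then use something like Data Source=.\SQLEXPRESS (or MUSIC\SQLEXPRESS) rather than DELL3.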
Hi, I have to transport a big database table and can't read it all at once with "select * from table" because the table is bigger than my system memory. Is there a way to read the table step by step? I thought it was possible with ADO and its server-side cursors, but I don't know how. I need a "universal" solution that works on SQL Server 2000/2005, MySQL and Oracle.
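One portable pattern, instead of a server-side cursor, is to read the table in chunks ordered by its key and remember the last key read; a minimal T-SQL sketch assuming a hypothetical table MyBigTable with an integer key Id (the same idea maps to LIMIT on MySQL and ROWNUM on Oracle):

-- Fetch the next chunk of rows after the last key already processed.
DECLARE @LastId INT
SET @LastId = 0                      -- start before the smallest key
SELECT TOP 10000 *
FROM dbo.MyBigTable
WHERE Id > @LastId
ORDER BY Id
-- The client keeps the largest Id from the chunk and repeats with it as the new @LastId.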
Connecting to a networked SQL Server Box from my local machine
I open Query Analyzer from the Start menu, logging in using the sa account.
From the Object Browser I select my stored procedure (WEA_InsertClaim), right-click and select Debug.
I am prompted to enter the parameter values, which I do; the auto rollback checkbox is checked. I click Execute.
T-SQL Debugger opens and runs through the stored procedure.
But the only buttons enabled are "Go", "Toggle Breakpoint" and "Clear All Breakpoints",
so I can set breakpoints etc., but when I select Go it will not stop at the breakpoints; it just runs through the stored procedure from start to finish, giving the correct return code as its output.
Is there something I need to enable in order to make it stop at breakpoints?
Is there a good step by step guide to setting up an indirect configuration? I've followed the steps in Kirk Haselden's 'Keep your packages in the dark' article 3 or 4 times this afternoon and nothing seems to work. I cannot even get the XML configuration file to work by accessing it directly. I'm using the wizard to create the file.
Just wondering if there was something else out there....
I posted this on the SQL Server security forums too. Here is the error that I get when I change the type of the step to SSIS:
Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum) Additional information: An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo) The current transaction cannot be committed and cannot support operations that write to the log file. Roll back the transaction. (The same message is repeated ten times.) (Microsoft SQL Server, Error: 3930)
This error pops up right after I change the type of the step to "SQL Server Integration Services Package".
I have made the following configurations:
The user group (Windows group) that the user belongs to has the following roles in msdb:
I have made a proxy for SQL Server Agent which has the following subsystem:
"SQL Server Integration Services Provider". The proxy is tied to the same login which has those SQL Agent and DTS roles in the msdb database.
I'm using Windows authentication, and the user that logs into the SQL Server is in the same group for which I have set all of those rights.
PS. Clearly I'm missing some role or right somewhere, because as soon as I give the group the sysadmin role, all the users in that group can create SSIS steps in the Agent.
PPS. I have been living under the impression that I don't have to give sysadmin rights to people who create SSIS packages and schedule them with the Agent.
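For comparison, a sketch (with hypothetical group, credential and proxy names) of the non-sysadmin setup as I understand it: give the Windows group the Agent and SSIS roles in msdb, then build a proxy over a credential, grant it the SSIS subsystem, and grant the login the right to use it:

USE msdb;
-- Roles that let non-sysadmins own Agent jobs and work with SSIS packages.
EXEC sp_addrolemember N'SQLAgentUserRole', N'DOMAIN\SsisUsers';   -- hypothetical group
EXEC sp_addrolemember N'db_ssisltduser', N'DOMAIN\SsisUsers';     -- named db_dtsltduser before SP2
-- Proxy over an existing Windows credential (both names hypothetical).
EXEC sp_add_proxy @proxy_name = N'SsisProxy', @credential_name = N'SsisCredential', @enabled = 1;
-- Grant the proxy the SSIS package execution subsystem (ids are listed in msdb.dbo.syssubsystems).
EXEC sp_grant_proxy_to_subsystem @proxy_name = N'SsisProxy', @subsystem_id = 11;
-- Allow the group's login to pick this proxy in job steps.
EXEC sp_grant_login_to_proxy @login_name = N'DOMAIN\SsisUsers', @proxy_name = N'SsisProxy';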
I am trying to learn Reporting Services using the title "MS SQL Server 2005 Reporting Services Step by Step" by Stacia Misner and Hitachi Consulting, published in 2006. I am experiencing problems with some of the exercises. I got as far as Chapter 4 when I followed directions to create a SQL statement to define a query string for a dataset. Pg 80:
select * from vProductProfitability
where year = 2003 and
MonthNumberOfYear = 1
The view vProductProfitability does not exist in the tutorial database that came with the book, rs2005sbsDW. The result of this query is the basis for the entire chapter on developing basic reports and I'm being denied a learning opportunity because the view does not exist. In short, I'm stuck.
I have tried to find somewhere at Microsoft to place this question and get some answers so I can continue thru the tutorial. To no avail. Does anyone have any suggestions?
BTW, the solution that came on the CD is also wrong, because the query noted above is also used in the solution.
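Not a fix, but a quick sanity check is to list the views the restored rs2005sbsDW database actually contains, in case the view exists under a slightly different name; a trivial sketch:

USE rs2005sbsDW;
-- List every view so a differently named product profitability view would show up.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.VIEWS
ORDER BY TABLE_NAME;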
Dear Experts, please guide me on replication. I'm using SQL Server 2005 server tools, Developer Edition. My OS is Windows 2000 Professional, and I also have SQL Server 2000 client tools.
I have two databases on my machine. The publisher is on a different instance, and the transactions might be a maximum of 50 per day. My aim is that whenever a developer enters data into the main database, it should automatically be updated in my two databases as well. That is, the server is sysA and the database is srtp; my machine is sys20 and the databases are srtp1 and srtp2. How should I make sysA the publisher? I have right-clicked on the server's databases, but it only shows the New Subscriptions option.
Vinod Even you learn 1%, Learn it with 100% confidence.
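For reference, enabling sysA as a publisher is done on sysA itself rather than from the subscriber machine; a rough T-SQL sketch using the names from the question and a hypothetical publication name (it leaves out configuring the Distributor, the articles, and the subscriptions for srtp1/srtp2):

-- Run on sysA: enable the srtp database for transactional publishing.
EXEC sp_replicationdboption @dbname = N'srtp', @optname = N'publish', @value = N'true';
-- Create a hypothetical publication in that database.
USE srtp;
EXEC sp_addpublication @publication = N'srtp_pub', @repl_freq = N'continuous', @status = N'active';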
I need some clarification on how context connections and transactions interoperate in the CLR.
The context connection allows ADO.NET objects to be "running in the same transaction space", so the association with the current transaction is implied. As long as I set, for example, my SqlCommand to use the context connection, I am going to be running under the same transaction.
SqlConnection sqlConn = new SqlConnection("context connection=true");
SqlCommand sqlComm = new SqlCommand("EXEC myCommand", sqlConn);
I guess my uncertainty comes from the fact that the transaction is never explicitly specified.
In addition, what happens with a trigger that, for example, watches an insert on a table? If the insert occurs under a transaction, I would assume that I will also be picking up that transaction in the CLR trigger, and thus the whole operation would be atomic.
Hi, I have a problem. When my computer loses its connection to the server during a transaction, the transaction is rolled back automatically by the server only after 2-5 minutes. During that time I can reset my computer, run my program again, and see data from the uncommitted transaction, because I have to set the isolation level to read uncommitted. How can I force the server to roll back the transaction immediately? TIA, clint
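As far as I know there is no server option that makes the rollback instantaneous (the server only reacts once it notices the dead connection), but the orphaned session can be found and killed by hand or from a job; a sketch against SQL Server's sysprocesses, with the spid to be substituted:

-- Find sessions holding an open transaction (candidates for the orphaned connection).
SELECT spid, open_tran, hostname, program_name, last_batch
FROM master.dbo.sysprocesses
WHERE open_tran > 0;
-- KILL 53;   -- substitute the spid found above; KILL forces an immediate rollback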
I have some tables that I need to wipe out and reload once per day. So I have created a T-SQL task that does a truncate table, then a data flow task, then an update statistics task. Running those works fine, but as soon as I put them all into a sequence container and set the container to require transactions, the first step gets a connection failure: "Failed to acquire connection "<name>". Connection may not be configured correctly or you may not have the right permissions on this connection". I am using a .NET SqlClient data provider for the T-SQL task, and in the data flow task I have to use an OLE DB provider so that I can run the task locally in development.
Is there something I am doing wrong, or is there some other way to handle truncate, load, and update stats all in one transaction?
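One alternative, if the load itself can be expressed in T-SQL rather than a data flow, is to keep truncate and load in a single explicit server-side transaction inside one Execute SQL Task and skip DTC altogether; a rough sketch with hypothetical table names:

BEGIN TRY
    BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.TargetTable;                              -- hypothetical target
    INSERT INTO dbo.TargetTable SELECT * FROM dbo.SourceTable;   -- stand-in for the data flow
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
END CATCH;
UPDATE STATISTICS dbo.TargetTable;   -- the stats refresh does not need to be part of the transaction

That obviously gives up the data flow's transformations, so it only fits simple copy-style loads.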
Intel D975XBX, 930, 2GB RAM, 300GB HD - Matrox VD, Windows XP SP2 + updates, Visual Studio 2005 Professional Edition, SQL Server 2005 Standard Edition.
20070203: Installed the practice files at D:\Program Files\Microsoft Press\Visual C Sharp Step by Step.
Configuring SQL Server Express Edition. Hostname - xxxxxxxxx..
Running sqlcmd -S xxxxx\SQLExpress -E I got a Named Pipes error.
So I opened Microsoft SQL Server 2005 - Configuration Tools - SQL Server Configuration Manager, selected Protocols for SQLEXPRESS, selected Named Pipes, enabled it, and closed.
Running sqlcmd -S xxxxx\SQLExpress -E I now get: Sqlcmd: Unknown Option. Enter -? for help.
After changing to C:\Documents and Settings\All Users\Start Menu\Programs\Microsoft SQL Server 2005 and running sqlcmd -S xxxxxx\SqlExpress -E I get: HResult 0x2, Level 16, State 1, Named Pipes Provider: Could not open a connection to SQL Server [2]. Sqlcmd: Error: Microsoft SQL Native Client: An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. Sqlcmd: Error: Microsoft SQL Native Client: Login timeout expired.
How do I fix this?
I need to be able to use the Northwind Traders database.
Does anybody know what it means? The context is: a package with a single Data Flow finishes with this error: [Connection manager "xxx"] Error: The connection manager failed to defect from the transaction. All components inside the data flow report a successful green status; however, the Data Flow Task remains yellow for a long time, until suddenly the package stops with the above message. Another piece of information is that the table remains locked during package execution (a SELECT on that table does not complete...). This is a serious issue, and I couldn't find any relevant information about it. Thanks in advance.
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.
For reading the data I have written dynamic T-SQL based on some conditions, and it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout error.
I have increased and decreased the timeout to catch the error but it is still there; I have also tried setting the CommandTimeout property to 0.
If I use 0 for CommandTimeout then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Does anybody have an idea?!
I have a sequence container; in my sequence container I have a script task to drop the existing tables. This sequence container is connected to another sequence container, and all of these are in a For Each Loop container. When I run the package it works fine for the first loop, but it gives me an error on the second execution.
The message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
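For what it's worth, one thing that sometimes changes the behaviour with linked-server calls is making the distributed transaction explicit instead of relying on promotion; a hedged sketch with hypothetical linked server and procedure names (whether this is appropriate inside a Service Broker activation procedure is a separate question):

SET XACT_ABORT ON;                              -- generally required for linked-server transactional work
BEGIN DISTRIBUTED TRANSACTION;
EXEC LinkedServer1.RemoteDb.dbo.usp_DoWork;     -- hypothetical remote call
COMMIT TRANSACTION;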
I have a simple SSIS package with three "Execute SQL Tasks". I am using an ADO.NET connection to execute SPs on a DB server.
When I execute this package it works fine. So far so good.
Now I need to implement a transaction in this package, and that is where the problem starts. When I try to execute the package after setting TransactionOption = Required for the Sequence Container which contains all the tasks, I get the following error.
[Execute SQL Task] Error: Failed to acquire connection "NYCDB0008.Export". Connection may not be configured correctly or you may not have the right permissions on this connection.
"NYCDB0008.Export" is the name of the ADO.Net connection. I have been hunting for any solution but all in vain. I have tried changing all DTC settings on the dev as well as Database server.