In my shared hosting environment, where I make large volumes of database calls, I am very anxious to preserve my heavily throttled threads. I presume that when I make a database call, it goes to a separate server, and any threads that server uses are in addition to my "hosting plan".
My question is: is it therefore a no-brainer to always make calls like this:

Dim oResult As IAsyncResult = iDBCmd.BeginExecuteReader()
arWaitHandles(0) = oResult.AsyncWaitHandle
iIndex = WaitHandle.WaitAny(arWaitHandles, 60000, False)

i.e. run all calls to the database asynchronously?
Is it really releasing a thread back to my ASP.NET app while it is waiting? (I.e. is it worth doing?)
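From what I've read, the wait-handle form above still blocks my thread on WaitAny, whereas the callback form below is what I understand would actually hand the thread back while the query runs. This is only a C# sketch with made-up names, and it assumes the connection string includes "Asynchronous Processing=true":

// C# sketch (hypothetical connection string and query)
SqlConnection conn = new SqlConnection("...;Asynchronous Processing=true");
conn.Open();
SqlCommand cmd = new SqlCommand("SELECT ...", conn);
cmd.BeginExecuteReader(delegate(IAsyncResult ar)
{
    using (SqlDataReader rdr = cmd.EndExecuteReader(ar))
    {
        // process rows here; this runs on a thread-pool thread when the query completes
    }
    conn.Close();
}, null);
// the calling thread is free here instead of blocking on WaitAny

(I realise I would probably also need ASP.NET 2.0 async pages, e.g. Page.RegisterAsyncTask, for the request itself to complete asynchronously.)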
I'm assuming it's possible to call a web service asynchronously from a SQL CLR function. Is this correct, or is there a fundamental flaw in calling the asynchronous methods?
I've tried coding a sample: I create a new thread, which in turn calls the web service asynchronously, and I use newthread.join, but I can never get the webmethodasync_completed sub to be called.
I am using SQL Server 2005, and I have an endpoint that exposes some stored procedures as web methods.
One particular stored procedure I have exposed takes a long time to execute: about 10-15 minutes. While it is OK that this stored procedure takes this long, it is not desirable for the HTTP request that executes the proc to have to wait that long.
What I want to be able to do is call the stored procedure and have the call return immediately, while the stored proc continues what it's doing. I will call another stored proc at a later time to retrieve the result of the first stored proc. The first proc will store its results in a temp table. I am thinking of using SQL Server Service Broker to achieve this.
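Just to make the shape concrete, the calling pattern I have in mind looks roughly like this from a plain ADO.NET client (a sketch only; the proc names and the @RequestToken parameter are made up, and in reality the caller would be the HTTP endpoint client):

string connString = "...";   // placeholder

// 1) Kick off the long-running work; this proc should only queue a Service Broker
//    message (or insert a work-item row) and return immediately.
Guid requestToken;
using (SqlConnection conn = new SqlConnection(connString))
{
    conn.Open();
    SqlCommand start = new SqlCommand("usp_StartLongCalc", conn);
    start.CommandType = CommandType.StoredProcedure;
    SqlParameter token = start.Parameters.Add("@RequestToken", SqlDbType.UniqueIdentifier);
    token.Direction = ParameterDirection.Output;
    start.ExecuteNonQuery();                      // returns right away
    requestToken = (Guid)token.Value;
}

// 2) Some time later, from a separate request, fetch whatever the background work stored.
using (SqlConnection conn = new SqlConnection(connString))
{
    conn.Open();
    SqlCommand fetch = new SqlCommand("usp_GetLongCalcResult", conn);
    fetch.CommandType = CommandType.StoredProcedure;
    fetch.Parameters.AddWithValue("@RequestToken", requestToken);
    using (SqlDataReader rdr = fetch.ExecuteReader())
    {
        // read the results here
    }
}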
Is there a better way to achieve this? Also, how does SQL Server process Service Broker requests? I don't want the query to be executed when the server is busy. Are there any hints I need to give Service Broker to be able to do this?
I have a lengthy stored procedure that does archiving. I'd like to just start it and let it run, quickly returning control to the ASP.NET web page. My research shows that I should be passing the "adAsyncExecute" option, but all of the examples I've found don't match the way I execute stored procedures. Shown below is the code I'm using. How would I introduce "adAsyncExecute"?

public static int DoStoredProcedure(SqlConnection aConn, string procName, params SqlParameter[] theParams)
{
    int retVal = 0;
    SqlCommand aComm = new SqlCommand(procName, aConn);
    aComm.CommandType = CommandType.StoredProcedure;

    foreach (SqlParameter aParam in theParams)
    {
        aComm.Parameters.Add(aParam);
    }

    SqlParameter returnValue = aComm.Parameters.Add("@ReturnValue", SqlDbType.Int);
    returnValue.Direction = ParameterDirection.ReturnValue;

    try
    {
        aConn.Open();
        aComm.ExecuteNonQuery();
        retVal = (int)returnValue.Value;
    }
    catch (Exception ex)
    {
        Debug.Fail("Error running SQL Query: " + ex.Message, "DBTools.DoStoredProcedure");
        throw;
    }
    finally
    {
        aConn.Close();
    }

    return retVal;
}
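For what it's worth, my understanding is that adAsyncExecute is a classic ADO (COM) option with no direct counterpart on SqlCommand; the closest thing I've found in ADO.NET 2.0 is BeginExecuteNonQuery. Below is a rough sketch of how I imagine a fire-and-forget variant of the helper, assuming the connection string includes "Asynchronous Processing=true" (the method name is mine, not from the code above):

// hypothetical fire-and-forget variant; no return value is read, since we never wait
public static void DoStoredProcedureAsync(string connString, string procName, params SqlParameter[] theParams)
{
    SqlConnection aConn = new SqlConnection(connString);   // must include Asynchronous Processing=true
    SqlCommand aComm = new SqlCommand(procName, aConn);
    aComm.CommandType = CommandType.StoredProcedure;
    aComm.CommandTimeout = 0;                               // the archive proc is long-running
    foreach (SqlParameter aParam in theParams)
        aComm.Parameters.Add(aParam);

    aConn.Open();
    aComm.BeginExecuteNonQuery(delegate(IAsyncResult ar)
    {
        try { aComm.EndExecuteNonQuery(ar); }               // completes when the proc finishes
        finally { aConn.Close(); }
    }, null);
    // control returns to the page immediately; the proc keeps running on the server
}

It takes a connection string rather than an open SqlConnection so that the connection can stay alive for the duration of the procedure.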
I'm working with ADO 2.8 in C++ with Visual Studio 2005. I want to perform a SELECT in asynchronous mode. I don't really understand the logic of the recordset events. For example, I receive more MoveComplete events than there are rows in my recordset. It is really not clear to me...
Does anyone know where I can find a good example of C++ (or VB) code for managing SELECT statements in asynchronous mode?
Within a stored procedure, is it possible to call multiple other stored procedures asynchronously? For example, I'd like to execute both local and remote stored procedures, but I don't want/need to wait for their output while the original stored procedure continues to execute each subsequent command.
I have a checksum calculation as a persisted, indexed computed column on a temporary table that I use to compare against original records to detect changes.
It seems that the update/insert statements in my procs get out of sync with the checksum calculations on larger tables (500,000+ rows). The only thing I can think of is that the column calculations are performed asynchronously in relation to the updates/inserts. This is a problem for me.
Is my assumption correct? If it is, how can I adjust for this, i.e., force the computations to be performed synchronously or wait for the computations to complete before running comparisons?
Is it possible to execute a child package from a parent package asynchronously?
I have a SQL Server table containing a list of packages to execute. I want to create a master package that will query this table, and execute each of the packages asynchronously.
I've tried using the Execute Package task and also executing packages programmatically from a Script task, but these only seem to work synchronously. One variation I'm wondering about is sketched below.
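This is only a sketch of the idea, not something I have working: load each child with the SSIS runtime API and launch each Execute on its own thread. It's written in C# for brevity (the 2005 Script task itself is VB.NET), the .dtsx paths are placeholders, and the real list would come from my control table (or from Application.LoadFromSqlServer if the packages live in msdb). Is this supported/advisable?

using System.Collections.Generic;
using System.Threading;
using Microsoft.SqlServer.Dts.Runtime;

public static void RunChildrenInParallel(IEnumerable<string> packagePaths)
{
    Application app = new Application();
    List<Thread> workers = new List<Thread>();

    foreach (string path in packagePaths)
    {
        Package child = app.LoadPackage(path, null);             // load on the calling thread
        Thread t = new Thread(delegate() { child.Execute(); });  // each child runs concurrently
        t.Start();
        workers.Add(t);
    }

    foreach (Thread t in workers)
        t.Join();                                                 // wait for all children to finish
}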
I have also seen suggestions about executing packages asynchronously from T-SQL by starting a job; however, I'd rather not have to dynamically create jobs for each package I want to execute.
Any ideas would be welcome. Or even an answer of "No this is not possible".
If Service Broker is not an option, how can the trigger, and the SP(s) called by our trigger, act asynchronously with respect to the event that fired the trigger in the first place? We are more concerned with the original event being committed than with the actions that follow via the trigger.
I have about 30 different reports that I want to pull into a dashboard. I need to make sure that they don't execute serially, so that I get good performance.
There are two ways I can approach it:
1) I can create a stored procedure for each report and then make individual calls for each of the reports from the web site, so basically this will be controlled from the web end (one possible shape is sketched after this list).
2) I can do this from the SQL Server database, if there is some way to execute these stored procedures in parallel.
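For option 1, one way I can see to fire all of the report procs in parallel from the web tier is ADO.NET's asynchronous command API. This is only a sketch: the proc names and connection string are placeholders, and each connection needs "Asynchronous Processing=true":

string connStr = "...;Asynchronous Processing=true";
string[] reportProcs = { "rpt_Report1", "rpt_Report2" /* ... about 30 of these ... */ };

List<SqlCommand> commands = new List<SqlCommand>();
List<IAsyncResult> pending = new List<IAsyncResult>();

foreach (string procName in reportProcs)
{
    SqlConnection conn = new SqlConnection(connStr);   // one connection per command
    conn.Open();
    SqlCommand cmd = new SqlCommand(procName, conn);
    cmd.CommandType = CommandType.StoredProcedure;
    pending.Add(cmd.BeginExecuteReader());             // all queries are now running in parallel
    commands.Add(cmd);
}

for (int i = 0; i < commands.Count; i++)
{
    using (SqlDataReader rdr = commands[i].EndExecuteReader(pending[i]))
    {
        // bind rdr to the i-th dashboard section
    }
    commands[i].Connection.Close();
}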
I am calling a stored procedure (say X), and from that stored procedure X I want to call another stored procedure (say Y) asynchronously. Once stored procedure X has completed, I want to return execution to the main program. In the background, stored procedure Y will continue its work. Please let me know how to do this using SQL Server 2000 and ASP.NET 2.0.
Hi all, I have a DTS package that I call from a C# app. I had it working great until I decided to use an ActiveX script to do the data transformations instead of the row copy; I need the ActiveX script to add a standard name to the last column in the destination table. The problem is that the task executes without errors (from C#) but nothing happens: it fails silently. If I switch the data transformation back to a standard column mapping (with separate DTSTransformations for each column) it works fine, but as soon as I use ActiveX to handle the transformations it doesn't work. Can anyone tell me what I may be doing wrong? Here is the calling code from C#:

if (f.Name.Substring(13, 7).ToLower() == "product")
{
    try
    {
        activity.Log("Starting Product DTS Package...");
        DTS.Package2Class package = new DTS.Package2Class();
        object pVarPersistStgOfHost = null;
        package.LoadFromSQLServer(
            "192.168.8.8",
            "username",
            "thepassword",
            Microsoft.SqlServer.DTSPkg80.DTSSQLServerStorageFlags.DTSSQLStgFlag_Default,
            null, null, null,
            (string)ConfigurationSettings.AppSettings["productDTSPackage"],
            ref pVarPersistStgOfHost);
        package.GlobalVariables.Item(1).Value = f.FullName.ToString();
        package.Execute();
        package.UnInitialize();
        // force release of COM object
        System.Runtime.InteropServices.Marshal.ReleaseComObject(package);
        package = null;
    }
    catch (Exception e)
    {
        activity.Log(string.Format("Failure processing {0}", WorkingPath + f.Name) + " - " + e.Message);
    }
}

And the ActiveX script I tried to use for the transformations is:

'**********************************************************************
'  Visual Basic Transformation Script
'**********************************************************************
'  Copy each source column to the destination column
Function Main()
    DTSDestination("Yesmail Id") = DTSSource("product series")
    DTSDestination("Customer CKM CustId") = DTSSource("product family")
    DTSDestination("Product SKU") = DTSSource("transaction date")
    DTSDestination("product model name") = DTSSource("serial number")
    DTSDestination("serial number") = DTSSource("product model name")
    DTSDestination("transaction date") = DTSSource("Product SKU")
    DTSDestination("product family") = DTSSource("Customer CKM CustId")
    DTSDestination("product series") = DTSSource("Yesmail Id")
    DTSDestination("fileName") = DTSGlobalVariables("sourceFile").Value   ' <-- this is why I'm using the ActiveX script for this field: to pick up a global variable
    Main = DTSTransformStat_OK
End Function

The weird thing is that the package runs fine from Enterprise Manager with this ActiveX script; it just doesn't work from my calling app. Perhaps I have missed something I need to change in the package constructor? BTW: I do have my assembly signed in C# for the COM wrapper. Thanks in advance, mcm
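One thing I haven't tried yet is walking the steps after Execute() to see whether one of them is failing quietly, rather than relying on an exception. Something like this, if I've read the DTS interop signature for GetExecutionErrorInfo correctly (sketch only):

// after package.Execute(), check each step for a failure
foreach (DTS.Step step in package.Steps)
{
    if (step.ExecutionResult == DTS.DTSStepExecResult.DTSStepExecResult_Failure)
    {
        int errorCode;
        string source, description;
        step.GetExecutionErrorInfo(out errorCode, out source, out description);
        activity.Log(string.Format("Step {0} failed: {1} - {2}", step.Name, source, description));
    }
}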
Hello, I have done extensive work with Classic ASP for 9 years now, working with stored procedures etc. However, I am now working in C#, and:
1. I have been advised by a friend that it's best to use the "Object data source" to call the SP.
2. Are there any documentation pointers on best practices and how it's done?
I want to use the "on click event" in my class file. Thanks, Ehi
Hello, is there any sample code that shows how to execute an SP in .NET: 1. using a class, and 2. calling the class in the onclick_button function? Thanks, Ehi
I'm developing a new stored proc that will take information in to enter customer and order info. Looking at the current stored proc, it uses a lot of IF statements, and I'm thinking of breaking it up into different stored procs and calling them all from one main stored proc. I know how to do this, but I was wondering how it would affect performance. Should I simply keep all the stuff in one proc, or split it out into multiple ones to make it easier to follow and read?
I'm working in SQL Server 2005. I have an existing SP that does exactly what I need (it's the aspnet_UsersInRoles_IsUserInRole SP). I want to reuse this SP and use its return value as a field in the SP I'm writing. How do I go about doing this? I could take the logic out of the called SP and wrap it in a function, but I would really like to reuse the SP aspnet_UsersInRoles_IsUserInRole. Thanks in advance for any assistance.
Hi, I'm trying to call a DTS package from VB.NET. I have two examples, and neither works.

The first one gives me a [DBNETLIB][ConnectionOpen(Connect()).] SQL Server does not exist or access denied error. Source code:

Dim serverName As String = "SERVERNAME"
Dim oPackage As New DTS.Package()
Dim oStep As DTS.Step
Dim pVarPersistStgOfHost As Object = Nothing

oPackage.LoadFromSQLServer(serverName, "USERID", "PASSWORD", DTSSQLServerStorageFlags.DTSSQLStgFlag_Default, _
    "DTSPASSWORD", Nothing, Nothing, "DTSPACKAGENAME", pVarPersistStgOfHost)

For Each oStep In oPackage.Steps
    oStep.ExecuteInMainThread = True
Next

oPackage.Execute()

Dim err As Long
Dim source, description, message As String
For Each oStep In oPackage.Steps
    If oStep.ExecutionResult = DTSStepExecResult.DTSStepExecResult_Failure Then
        oStep.GetExecutionErrorInfo(err, source, description)
        message = String.Format("ErrorCode: {0}, Source: {1}, Description: {2}", err.ToString(), source, description)
    Else
        message = "Success"
    End If
Next

oPackage.UnInitialize()
oPackage = Nothing

The second example tries to create a DTS package dynamically. This time I get an error that the CustomTask cannot be cast, something about the QueryInterface. Source code:

Dim oPackage As New Package2()
Dim oConnection As Connection2
Dim oStep As Step2
Dim oTask As Task
Dim oCustomTask As BulkInsertTask

oConnection = oPackage.Connections.[New]("SQLOLEDB")
oStep = oPackage.Steps.[New]()
oTask = oPackage.Tasks.[New]("DTSBulkInsertTask")
oCustomTask = CType(oTask.CustomTask, BulkInsertTask) ' <-- error

With oConnection
    .Catalog = "CATALOG"
    .DataSource = "SERVERNAME"
    .ID = 1
    .UseTrustedConnection = True
    .UserID = "USERID"
    .Password = "PASSWORD"
End With

oPackage.Connections.Add(oConnection)
oConnection = Nothing

With oStep
    .Name = "InsertGemal"
    .ExecuteInMainThread = True
End With

With oCustomTask
    .Name = "InsertGemal"
    .DataFile = "D:ImportGemal.dat"
    .ConnectionID = 1
    .DestinationTableName = "Gemal"
    .FieldTerminator = ";"
    .RowTerminator = " "
End With

oStep.TaskName = oCustomTask.Name

With oPackage
    .Steps.Add(oStep)
    .Tasks.Add(oTask)
    .FailOnError = True
End With

oPackage.Execute()
oPackage.UnInitialize()
oPackage = Nothing

Any help highly appreciated! T.i.a., ratjetoes.
Hello, I have the class below, and I am trying to execute it on a button click event. What am I doing wrong? Thanks. Here is the button click event:

protected void Button1_Click(object sender, EventArgs e)
{
    signup_data_entry signup = new signup_data_entry();
    signup.signup_data_entry();
}
Here is my class file. Please advise:

public class signup_data_entry
{
    public signup_data_entry()
    {
        //SqlConnection con = new SqlConnection("cellulant_ConnectionString");
        SqlConnection con = new SqlConnection(ConfigurationManager.AppSettings["cellulant_ConnectionString"]);

        SqlCommand command = new SqlCommand("Cellulant_Users_registration", con);
        command.CommandType = CommandType.StoredProcedure;

        con.Open();

        //string IP = new string();

        command.Parameters.Add(new SqlParameter("@RegionID", SqlDbType.Int, 0, "RegionID"));
        command.Parameters.Add(new SqlParameter("@RegionDescription", SqlDbType.NChar, 50, "RegionDescription"));

        command.Parameters[0].Value = 4;
        command.Parameters[1].Value = "SouthEast";

        int i = command.ExecuteNonQuery();
    }
}
I have a problem calling this SP from an ASP.NET application:

@DefApp nvarchar(255) = ''
,@DefBusFunction nvarchar(255) = ''
,@DefImpact nvarchar(255) = '',
AS
Begin
    declare @sql nvarchar(4000)
    declare @whereClause nvarchar(4000)
    DECLARE @return_value int
    declare @sqlWhere nvarchar(4000)

    select @sql = 'SELECT DefApp, DefBusFunction, DefImpact FROM Def LEFT JOIN ZFunction ON (def.DefApp = ZFunction.App) AND (Def.DefBusFunction = ZFunction.BusFunction)'

    if @DefImpact <> ''
        Set @whereClause = ' where DefImpact '
        SET @whereClause = @whereClause + ' = ''' + @DefImpact + ''''

    set @sqlWhere = @sql + @WhereClause
    EXEC @sqlWhere
End

I am calling this SP from my ASP.NET application to fill an SSRS report. I have written the ASP.NET code like this:

sqlCmd = new SqlCommand("subbusample", conn);
sqlCmd.CommandType = CommandType.StoredProcedure;
sqlCmd.Parameters.Add(new SqlParameter("@DefImpact", SqlDbType.NVarChar, 255, txtValue3.Text.ToString()));
sqlCmd.Parameters.Add(new SqlParameter("@DefBusFunction", SqlDbType.NVarChar, 255, txtValue1.Text.ToString()));
sqlCmd.Parameters.Add(new SqlParameter("@DefApp", SqlDbType.NVarChar, 255, txtValue2.Text.ToString()));

RptViewer.ProcessingMode = Microsoft.Reporting.WebForms.ProcessingMode.Remote;
RptViewer.ServerReport.ReportServerUrl = new System.Uri("http://servername/ReportServer");
RptViewer.ShowParameterPrompts = false;
RptViewer.ServerReport.ReportPath = "/folder name/Subbu_Sample";
When I execute it, I am not able to fill the report with data.
I'm new to using DTS packages and I'm running into a problem; hopefully someone can help me out. I have an ASP.NET page that needs to call a DTS package. Everything seems to be working right up until I call opkg.Execute. The package seems to run, but nothing happens. After further investigation, I have found what looks like a permissions issue. This is the message I'm getting:

Error Number: -2147467259 {Integer}
Error Description: "The Microsoft Jet database engine cannot open the file ''. It is already opened exclusively by another user, or you need permission to view its data." {String}
This is obviously a permissions issue. I'm just not sure who the DTS package is running as and where to set the permissions.
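In case it helps narrow it down, my next step is to log which account the page is actually running under just before calling Execute (a quick sketch; where I write the output is arbitrary):

// which Windows account does the DTS package inherit from the ASP.NET worker process?
string processIdentity = System.Security.Principal.WindowsIdentity.GetCurrent().Name;  // the worker process account
string webUser = HttpContext.Current.User.Identity.Name;                               // the authenticated web user, if any
Response.Write(processIdentity + " / " + webUser);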
If you have a BEGIN TRAN in an SP that calls another SP, and that called SP fails, can you execute a rollback from the called SP, or should it return an error, which will roll back the BEGIN TRAN in the original SP?
Hi all, I want to call a DTS package from Visual Basic. The DTS package will do a backup of the database. I am not sure how this can be achieved. Any help would be great. Thanks.
There is no inherent mechanism available in SQL Server (replication, log shipping, or clustering) that allows you to load balance your database server. Clustering is only useful for a failover situation and does not allow active/active balancing.
Is it possible to use merge replication between two identical OLTP servers and manage transactions via MSMQ? Will this mechanism allow for a load-balanced OLTP server?
Will this work? If not, why not? What will work? Will federated servers work for an entire database?