I tried to use xp_cmdshell to execute a VBScript from a trigger, and it works, but I noticed that the trigger waits until the VBScript has terminated. The script does a fair amount of work, so I can't afford to wait for it to finish. Is there a way to avoid waiting, i.e. to run the VBScript asynchronously?
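For context, one commonly suggested workaround (a sketch, not from the original post; the job name is a placeholder) is to have the trigger start a pre-created SQL Agent job instead of calling xp_cmdshell directly, since sp_start_job returns as soon as the job is queued:

```sql
-- Sketch, assuming a pre-created Agent job named 'RunMyVbs' whose single
-- step runs the VBScript (e.g. via cscript.exe).
-- sp_start_job queues the job and returns immediately, so the trigger
-- (and the transaction that fired it) does not wait for the script.
EXEC msdb.dbo.sp_start_job @job_name = N'RunMyVbs';
```

This also keeps the original transaction from being held open for the lifetime of the external process.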
During install I selected Windows Authentication only, but now it seems we may need to use Mixed Mode with an SA account etc. Is there any way to switch SQL 2005 to Mixed Mode after the fact?
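One commonly cited approach (a sketch, not from the original post; the password is a placeholder) is to flip the instance's LoginMode registry value and restart the service; the same change can be made in Management Studio under Server Properties > Security:

```sql
-- Sketch: LoginMode = 2 means Mixed Mode, 1 means Windows-only.
-- xp_instance_regwrite resolves the instance-specific registry path.
-- A service restart is required for the change to take effect.
EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'LoginMode', REG_DWORD, 2;

-- After the restart, enable and re-password the sa login:
ALTER LOGIN sa ENABLE;
ALTER LOGIN sa WITH PASSWORD = 'StrongPasswordHere';  -- placeholder
```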
We have reports in SharePoint integrated mode which are really slow compared to native mode. I have been asked to research and report on what exactly causes the delays.
Are there any articles that explain what happens when a report is run from the SharePoint server, and where it is logged?
There is a query which, when executed in grid mode (Ctrl+D), takes approximately 0.02 seconds (about 21,000 rows), but when I execute it in text mode, it takes about 0.40 seconds! Why the difference? Also, when the records from this table are read from a VB application, they are equally slow (as in text mode). Why is it so slow in text mode and relatively fast in grid mode? Does anyone have any information on 'firehose'-style cursors (which may speed up access to the data from the VB application)?
Recently I read the following statement: 'When SQL Server is run in "lightweight pooling" mode (fiber mode) and the DTC service is started, unexpected behavior may occur.' Can someone explain fiber mode? I'd appreciate it. :)
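Fiber mode is enabled through the 'lightweight pooling' server option. As a minimal sketch (not from the original post) of where the setting lives:

```sql
-- Sketch: 'lightweight pooling' is an advanced sp_configure option.
-- Setting it to 1 makes the schedulers run workers on Windows fibers
-- instead of threads after the next service restart; 0 (the default)
-- uses threads.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'lightweight pooling', 1;
RECONFIGURE;
```

Fiber mode reduces context-switch overhead on extremely busy systems, but several components (including, per the quoted warning, DTC) can misbehave under it, which is why it is rarely recommended.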
Currently, our Report Builder is running in Local Intranet mode. I'm investigating the security implications of changing it to Internet mode. However, I've not been able to find any documentation on this.
Does anyone know of any documentation that addresses this issue, or have experience they can share with changing the Report Builder security zone from Intranet mode to Internet mode?
I have a SQL Server 2005 CTP version running on Windows 2003 Server.
I would like to find out how the SQL Server authentication option changed from Mixed Mode to Windows Authentication mode over the weekend. From the SQL log, I can't see when it changed. I would like to see the date/time and client IP, and ideally the Windows user ID. Where can I find this information in SQL Server?
I have a web application with reports that take a long time to run, so I am thinking of making the report calls asynchronous (i.e. execute the report and notify the user when it is ready). I found a sample that ships with SQL Server (AsynchronousRenderCS), but as I am a C# novice I don't understand everything in the example.
Hi all, can anybody tell me if they have had any luck creating and processing asynchronous cursors? According to the SQL Server 7.0 Books Online, after you create your async cursor, displaying the variable @@CURSOR_ROWS should show either a negative number like -1245, meaning it is still loading, or a whole number, meaning it has finished loading. But every time I display this variable I get -1, which according to the documentation means I'm creating a synchronous cursor. I have modified the 'cursor threshold' setting and declared my cursor INSENSITIVE, and still can't get a cursor to be async.
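For anyone trying to reproduce this, a minimal sketch (table and column names are placeholders, not from the post):

```sql
-- Sketch: with 'cursor threshold' set to a row count (rather than the
-- default -1, which forces synchronous population), any static/keyset
-- cursor expected to return more rows than the threshold is populated
-- asynchronously, and @@CURSOR_ROWS goes negative (e.g. -1245) while
-- population is still running. A value of exactly -1 indicates a
-- dynamic cursor, for which row counts never apply.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'cursor threshold', 1000;
RECONFIGURE;

DECLARE big_cur INSENSITIVE CURSOR FOR
    SELECT key_col FROM dbo.BigTable;  -- must exceed 1000 rows
OPEN big_cur;
SELECT @@CURSOR_ROWS;  -- negative while still populating
```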
What I am looking to do is have a stored procedure begin a dialog with my request service. With that dialog established, my stored procedure sends 50 request messages, one for each of the 50 United States. I want these to be processed asynchronously by a procedure that is called on activation for the request queue. In that activation procedure the request is processed against the respective state, and a response message is sent to the response service (to the response queue). I want to be able to tie these request messages and response messages together with some type of shared identifier. These requests don't need to be processed in any specific order, and don't need any fancy locking mechanism via conversation groups, since they are meant to be processed asynchronously. What is the best approach? Do I need to create 50 separate queues and open a dialog with each? If that is the route to take, would it be a performance hit?
My goal is to have all 50 states processed at once, each finishing with a response message sent to the response queue. The initiating procedure, after sending these 50 requests, would then spin and wait for all 50 to complete, be it with a response, an error, or a timeout. When all 50 have returned, the procedure would merge the results and return. So as you can see, in my scenario I don't care when a state completes, as they do not affect the outcome, nor do they access any of the same resources while being processed.
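As a hedged sketch of the single-queue approach (all service, contract, and message type names are placeholders, not from the post): one dialog per state, with all 50 dialogs tied to one conversation group, gives the shared identifier for correlating responses without creating 50 separate queues. The activation procedure on the request queue then supplies the parallelism, up to its MAX_QUEUE_READERS setting:

```sql
-- Sketch: open 50 dialogs, all related to a single conversation group,
-- and send one request per state. The initiator can later RECEIVE
-- responses WHERE conversation_group_id = @grp to collect all 50.
DECLARE @h UNIQUEIDENTIFIER;
DECLARE @grp UNIQUEIDENTIFIER;
DECLARE @state INT;
SET @grp = NEWID();
SET @state = 1;
WHILE @state <= 50
BEGIN
    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [//App/ResponseService]
        TO SERVICE '//App/RequestService'
        ON CONTRACT [//App/StateContract]
        WITH RELATED_CONVERSATION_GROUP = @grp, ENCRYPTION = OFF;
    SEND ON CONVERSATION @h
        MESSAGE TYPE [//App/Request]
        (CAST(@state AS VARBINARY(4)));
    SET @state = @state + 1;
END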
We're looking for a solution to an audit trail issue. Our business people want to track each column value that changes (before and after images) for every table in our database, as well as the user ID that changed the data and when it was changed. Are there any methods that other sites have employed to track this level of detail without resorting to triggers on each table, and has anyone worked out a way for this audit trail writing to be handled asynchronously within SQL Server?
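If triggers can't be avoided entirely for before/after images, one pattern (a rough sketch with placeholder table and Service Broker names, not from the post) keeps each trigger thin and pushes the actual audit write into a queue, so it is handled asynchronously by an activation procedure:

```sql
-- Sketch: the trigger only serializes the before/after images as XML
-- and SENDs them; the durable audit write happens later, in the
-- activation procedure attached to the target queue.
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    DECLARE @payload XML, @h UNIQUEIDENTIFIER;
    SET @payload =
        (SELECT SUSER_SNAME() AS [user], GETDATE() AS [at],
                (SELECT * FROM deleted  FOR XML PATH('before'), TYPE),
                (SELECT * FROM inserted FOR XML PATH('after'),  TYPE)
         FOR XML PATH('change'));
    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [//Audit/Writer]
        TO SERVICE '//Audit/Store'
        ON CONTRACT [//Audit/Contract];
    SEND ON CONVERSATION @h MESSAGE TYPE [//Audit/Change] (@payload);
END
```

The trade-off is that the audit row is written outside the original transaction, so the conversation plumbing (and error handling in the activation procedure) has to be trusted to deliver it.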
Hi, I've done some searching, but I'm not finding exactly what I need. I am using an asynchronous Script Component as a lookup, since the table I am looking up against requires an ODBC connection. Here is what my data looks like:
From an Excel connection:
Order Number
123
234
345
The table I want to do a lookup on has multiple rows for each order number, as well as a lot of rows that aren't in my first table:
Order Number Description
123 Upgrade to System
123 Freight
123 Spare Parts
234 Upgrade to System
234 Freight
234 Spare Parts
778 Another thing
889 Yet more stuff
etc. My desired result would be to pull all the items from table two that match on Order Number from table one. My actual result from the script I have is a single (random) row from table two for each item in table one. So my current results look like:
Order Number Description
123 Freight
234 Freight
345 Null
And I want:
Order Number Description
123 Upgrade to System
123 Freight
123 Spare Parts
234 Upgrade to System
234 Freight
234 Spare Parts
345 Null
etc.... Here is my code, courtesy of half a dozen samples found here and elsewhere...
    odbcCmd = New OdbcCommand("SELECT F4211.SDDSC1, F4211.SDDOCO FROM DB.F4211 F4211 WHERE F4211.SDDOCO = ?", odbcConn)
    odbcParam = New OdbcParameter("1", OdbcType.Int)
    odbcCmd.Parameters.Add(odbcParam)
End Sub

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim reader As Odbc.OdbcDataReader
    odbcCmd.Parameters("1").Value = Row.SO
    reader = odbcCmd.ExecuteReader()
    If reader.Read() Then
        With Output0Buffer
            .AddRow()
            .SDDSC1 = reader("SDDSC1").ToString
            .SONumb = Row.SO
            .SOJDE = CDec(reader("SDDOCO"))
        End With
    End If
    reader.Close()
End Sub

Public Overrides Sub ReleaseConnections()
    connMgr.ReleaseConnection(odbcConn)
End Sub
End Class
I just don't know what I need to do to get every row from F4211 where SDDOCO matches Row.SO, instead of a single row. Any ideas or help? The reason I am starting with my Excel connection is that that sheet lists the orders I need detailed data for, and is only a few hundred rows; F4211 is really, really big.
I have also worked out an alternate way to do this using Merge Join tasks, but then my DataReader source goes off and fetches 300,000 rows from F4211 before producing my final result set of about 1,200 rows. That just feels like a bad approach to me... or am I being over-cautious? I'm a newbie (if you couldn't already tell), so guidance is appreciated.
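One way to avoid both the row-by-row lookup and the 300,000-row fetch (a sketch; DB.OrderList is a hypothetical staging table, not from the post) is to land the few hundred Excel order numbers in a work table first, then let the server return only the matching detail rows:

```sql
-- Sketch: stage the Excel order numbers, then join on the server side.
-- Only the matching F4211 rows (about 1,200 here) cross the wire.
SELECT f.SDDOCO, f.SDDSC1
FROM DB.F4211 AS f
JOIN DB.OrderList AS o
  ON o.OrderNumber = f.SDDOCO;
```

The data flow then becomes a plain source-to-destination copy, with no asynchronous script needed for the lookup at all.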
I noticed that the current SQL CE driver does not offer support for the APM (Asynchronous Programming Model). Are there any plans to add this in the future? In light of the lack of APM functionality, does anyone have ideas or thoughts on how async operations could be done, or whether they are even needed in the context of applications that use SQL CE?
If you have a data file, and you only want CERTAIN rows to pass to the destination (e.g. a table), and you are using a Script Component to accomplish this, is this a synchronous or an asynchronous transformation?
Q. And how do you assign the values to the output? Do you have to create output columns, or not?
I am very, very confused right now. I can't seem to find a decent answer to what is a very basic question, either in my SSIS book or in the documentation. Perhaps it is so basic that the question doesn't seem valid? I don't know. But I just don't understand this at all.
I have a tough situation: trying to execute multiple instances of the same package in order to reduce processing times.
Introduction:
We have a source system which gets 7,000 tiny files of 72 rows each, and the SSIS package uses a For Each Loop task to iterate through each file and load the data. We have a Process table that keeps track of the status of the source process and the ETL load process.
When the source process starts, for each row in the process table it assigns the row status 'Assigned', brings in the flat file of 72 rows, and updates the status to 'Complete'. When the ETL starts, for each file in the shared directory it assigns the status 'Running', loads the data, and updates the status to 'Complete'. Then the file is moved to a different processed folder. It acts like a bridge table between the two processes.
Bridge table format: Table_PK (identity column); (Date, City) is the natural key. It is a cross join of date and city, so the process gets one file every day for each city. Both initial statuses are 'Queued'.
Since the bridge table is prepopulated, the source process (which is on Unix) starts multiple threads and gets all the files within 30 minutes. But SSIS is a serial process and takes 2-3 hours to load the files; most of the time is taken by file operations, and SSIS can only start one thread. Future plan:
To bring down the processing times, we want to drive the SSIS packages from the bridge table instead of from the share folder: for each row in the bridge table where the source process is 'Complete' and the ETL process is 'Queued', start the SSIS process for that source file. Since our source files are named "CityName_Date.csv", this will not be difficult. We want to start multiple threads so that the load process will be fast. Implementation:
In a T-SQL loop we want to use xp_cmdshell and call the DTEXEC utility with the source file name as a variable. But DTEXEC is a synchronous process, and I am looking for a way to invoke it asynchronously, just like using "nohup executionscript &" on Unix. Any ideas on how to implement this? I looked on the web, and there is something about Service Broker, but all it talks about is sending messages and queuing. Any light on how to implement this on Windows Server would be a life saver. Thanks a lot, Venkat
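Since the post mentions Service Broker, here is a hedged sketch of how its activation feature can provide "nohup &"-style parallelism: each message carries a file name, and the queue's activation procedure (which would RECEIVE one name and shell out to DTEXEC) is started on up to MAX_QUEUE_READERS threads at once. All object names, paths, and the /Set syntax shown in the comment are placeholders:

```sql
-- Sketch: the T-SQL loop SENDs one message per ready bridge-table row.
-- The activation procedure dbo.usp_RunLoad would RECEIVE a file name
-- and run something like:
--   EXEC xp_cmdshell 'dtexec /F "C:\pkg\Load.dtsx"
--        /Set \Package.Variables[SrcFile].Value;"CityName_Date.csv"'
-- Up to 8 copies of it run concurrently, giving the desired parallelism.
ALTER QUEUE dbo.LoadQueue
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.usp_RunLoad,
    MAX_QUEUE_READERS = 8,
    EXECUTE AS OWNER
);
```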
I am evaluating the possibility of replicating a database one way over a network from the control site to our HQ. The original database is on SQL Server.
The database is likely to grow to many terabytes, so we would be using transactional replication.
The table to be replicated receives about 500 records per second. Each record will probably consist of a record key (8-byte int), site ID (4-byte int), reading (8-byte float), and timestamp (8-byte timestamp). In all, 28 bytes plus whatever overhead exists.
MINOR DETAIL: The HQ's copy should preferably be no more than 1 hour behind the control site's. This would be a long-term setup that would last for many years. Our link is currently about 2 MB/s, and is fairly reliable.
QUESTIONS: I'm guessing that (bytes/record) x (records/second) won't be the whole story; the raw payload here is 28 bytes x 500 records/second = 14,000 bytes/second, roughly 112 kbit/s, before any replication overhead. Does anyone have an estimate of the average data-efficiency factor for transactional replication? How would I go about calculating how much bandwidth is needed? Is there a formula hiding somewhere?
Greetings, I have a requirement for a SQL Server 2005 database to notify 3rd-party applications of a table change. As it stands now, 3rd-party applications access the database via a Data Access Layer (DAL) DLL (C#). I'd like to somehow implement an asynchronous event notification scheme to these clients via the same DAL.
Can someone offer a clever way to implement such events?
Service Broker? I am under the impression that SSBS is typically deployed when databases need to communicate with one another.
The application I'm working on has a job that runs 24/7, checking a table for rows with a specific status; when it finds them, it has to execute a job for each row found. I don't know whether SQL Server jobs are asynchronous or not, since I need the jobs to execute in parallel rather than in a pipeline.
Porting an existing SQL 2k DTS job over to a SQL 2k5 SQL Server running SSIS.
Background: The job loads data into an empty work table and performs some work before clearing out the work table. This job runs every minute.
Question: If the job happens to take longer than a minute, does SSIS create a second instance of the job? Or does it do what DTS did and reschedule the job for the next iteration?
Concern: I need to know, because there would be key constraint violations if another instance of the job started before the working table was cleared out.
Can anyone please tell me what happens if I have asynchronous mirroring set up and my primary server physically dies and is not available?
1. Does automatic failover to the secondary server occur? 2. What does the database state show as: primary, disconnected? 3. What happens to my transactions: are they lost? 4. Does any data loss occur?
If I rebuild a new server, how do I sync my current primary back to the new one? In that case, is it just a failback?
Hi, I'm new to this list, and after many days of trying to figure this out, here we go. Can you please tell me where I'm going wrong in my asynchronous Script Component? I'm almost there, but using the variable iReadingCount to decide when to add the row is not quite correct. There has to be a better way! Thanks in advance, Dave
I have to process data from a staging table which has imported data in a structure like this: each line has a tag, which is a field name such as <MyName>, followed by the value. <Advice Note Number> is the tag that tells me it is the start of the next record. The only gotcha is that there may be up to six <Contractor Billing> tags in one record.
Tag                      Val1        Val2
<Advice Note Number>     1374239
<Customer Names>         My Name
<Customer Address>       My Address
<Completion Date Time>   2005/11/25  16:30:00
<Service Order Number>   123456
<Phone Number>           999535431
<Telephone Exchange>     MNG
<Contractor ID>          Fabrikan
<Service Order Type>     F3
<Contract ID>            41
<Comments>               1           2
<Contractor Billing>     165         1
<Contractor Billing>     167         1
<Customer Signature>     NO
<Advice Note Number>     1374240
<Customer Names>         My Name
<Customer Address>       My Address
<Completion Date Time>   2005/11/25  16:30:00
<Service Order Number>   123456
<Phone Number>           999535431
<Telephone Exchange>     MNG
<Contractor ID>          Fabrikan
<Service Order Type>     F3
<Contract ID>            41
<Comments>               1           2
<Contractor Billing>     165         1
<Customer Signature>     NO
So I need an asynchronous Script Component (setting SynchronousInputID = 0 turns your component into an asynchronous component, thus giving you access to the output buffer), because I need to map this data to a structure like this:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Dim iReadingCount As Integer = 0
    Dim Comments1 As String
    Dim Comments2 As String
    Dim Comments3 As String
    Dim AdviceNoteNumber As Integer
    Dim CustomerNames As String
    Dim CustomerAddress As String
    Dim ArrivalDateTime As Date
    Dim CompletionDateTime As Date
    Dim ServiceOrderNumber As String
    Dim PhoneNumber As String
    Dim TelephoneExchange As String
    Dim ContractorID As String
    Dim ServiceOrderType As String
    Dim ContractID As String
    Dim Comments As String
    Dim ContractorBilling As String
    Dim ContractorBillingQuantity As Integer
    Dim ContractorBilling2 As String
    Dim ContractorBillingQuantity2 As Integer
    Dim ContractorBilling3 As String
    Dim ContractorBillingQuantity3 As Integer
    Dim ContractorBilling4 As String
    Dim ContractorBillingQuantity4 As Integer
    Dim ContractorBilling5 As String
    Dim ContractorBillingQuantity5 As Integer
    Dim ContractorBilling6 As String
    Dim ContractorBillingQuantity6 As Integer
    Dim ApprovalCode As String
    Dim TelecomRejectReason As String
    Dim ContractorRejectResponse As String
    Dim CustomerSignature As String
    Dim ReceivedOnTime As String

    'Public Overrides Sub CreateNewOutputRows()
    'End Sub

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Try
            If StrConv(Row.Tag, VbStrConv.ProperCase) = "<Advice Note Number>" Then
                AdviceNoteNumber = CInt(Trim(Row.Val1))
                'Increase the reading count by 1
                iReadingCount += 1
            ElseIf StrConv(Row.Tag, VbStrConv.ProperCase) = "<Comments>" Then
                Comments1 = Left(Trim(Row.Val1), 160)
                Comments2 = Left(Trim(Row.Val2), 160)
                Comments3 = Left(Trim(Row.Val3), 160)
                'One line
                If Len(Comments1) > 1 And Len(Comments2) = 1 And Len(Comments3) = 1 Then
                    Comments = Comments1
                End If
                'Two lines
                If Len(Comments1) > 1 And Len(Comments2) > 1 And Len(Comments3) = 1 Then
                    Comments = Comments1 & " " & Comments2
                End If
                'Three lines
                If Len(Comments1) > 1 And Len(Comments2) > 1 And Len(Comments3) > 1 Then
                    Comments = Comments1 & " " & Comments2 & " " & Comments3
                End If
I'm creating a Script Component that reads from an OLE DB source and writes to an OLE DB destination. For every input row, I need to output several rows. I tried using the Row.DirectRowToOutput0() method inside a loop in the Input0_ProcessInputRow routine, but that isn't working. Should I be using AddRow() instead? If I use AddRow(), does that mean it needs to be an asynchronous transformation?
I remember seeing a blog entry (Jamie's?) that did almost exactly what I wanted, but I can't find it now.
Well, my one-in/multiple-out asynchronous Script Component was looking fabulous, until I tried to run it. It turns out you can't step through a Script Component with the debugger, so I'm kind of stuck.
I'm getting the error 'There is no current row in the buffer. A row may need to be added using the AddRow method.'
Here's the script I'm running. For each input row, it tries to unstring linefeed-separated input column data into a set of arrays, then create an output row for each populated occurrence and use AddRow() to write the new row. (According to the MSDN documentation, I shouldn't need to use CreateNewOutputRows().)
Can anyone spot where I'm going wrong?
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Public Class ScriptMain
Inherits UserComponent
Dim rcobc(0 To 9) As String
Dim rcobcdesc(0 To 9) As String
Dim rcobcbase(0 To 9) As String
Dim rcobcunits(0 To 9) As String
Dim ratechg(0 To 9) As String
Dim ratelevy(0 To 9) As String
Dim ratered(0 To 9) As String
Dim ratetotal(0 To 9) As String
Dim arrposn As Integer
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
MSDN states the following about Readable Secondary Replicas (AlwaysOn Availability Groups) for SQL Server 2014:
Limitations and Restrictions:
Change tracking and change data capture are not supported on secondary databases that belong to a readable secondary replica:
Change tracking is explicitly disabled on secondary databases. Change data capture can be enabled on a secondary database, but this is not supported.
This confuses me: you cannot track the changes, yet you can enable CDC?
The scenario I am trying to achieve is to use the SSIS CDC components on an asynchronous secondary replica. Is this possible? If not, what other approaches would be viable?
I have configured an active/passive cluster in our production environment, and we also have a DR site configured with asynchronous mirroring and no witness. Currently the active node (node A) is in sync with DR. When a failover happens and the second node (node B) becomes active, the mirror is broken and goes into a disconnected state.
But when we fail back to node A, the mirror connects and is in sync again. In our setup we have an active/passive cluster and a standalone server as DR.
I am working with the Data Flow Task Script Component for the first time. I have created a second output, and in my script I add rows to this output.
I have found that SSIS does not release those rows to the second output until it has processed all of the incoming pipeline records. This will not work for me, as there are going to be a few million records coming down the pipe, so I need the Script Component to release these records downstream as soon as possible for insert into the OLE DB destination component.
If you have an output that is not synchronous with the input, what is the best way of processing the data?
I am currently using a generic Queue and a custom class: I create an instance of the class in ProcessInputRow and add it to the queue.
CreateNewOutputRows then dequeues the class instances and creates buffer rows.
Is there by chance a cunning way to make the input columns automatically populate the output of an asynchronous script transformation?
My transformation writes several rows for each input row read. I'm creating some new columns along the way, but I'd like all of the input columns to be output each time as well. However, I can't see any obvious way to achieve this, short of manually adding each column to the output and populating it in the script.
Problem: I have to call stored procedures residing in two SQL Server databases, each returning table data; I need to merge and sort the results and return the final result set. This is something like a wrapper SP calling the other SPs asynchronously.
Is it possible in a stored procedure or SQL function to make asynchronous calls?
If yes, can you please guide me on how to achieve this?
If no, is there an alternative approach, and what is the reason?
On URL: http://technet.microsoft.com/en-us/library/ms135931.aspx, among other things, there is a VB example for "Creating and Configuring Output Columns" of a custom transformation component with asynchronous outputs:
Public Overrides Sub OnInputPathAttached(ByVal inputID As Integer)
    Dim input As IDTSInput90 = ComponentMetaData.InputCollection.GetObjectByID(inputID)
    Dim output As IDTSOutput90 = ComponentMetaData.OutputCollection(0)
    Dim vInput As IDTSVirtualInput90 = input.GetVirtualInput()

    For Each vCol As IDTSVirtualInputColumn90 In vInput.VirtualInputColumnCollection
        Dim outCol As IDTSOutputColumn90 = output.OutputColumnCollection.New()
When I copy and paste the given code into the ScriptMain of an asynchronous Script Transformation component, I receive the error message "sub 'OnInputPathAttached' cannot be declared 'Overrides' because it does not override a sub in a base class."
I have a For Each Loop containing about 20 data flow tasks (simple data extractions). I notice that when I run the package, it only runs up to four data flow tasks at a time; the others have to wait until one of the first four finishes.
I was wondering if there's a way to change the limit on how many data flow tasks can run at a time. Is there a property somewhere?
I know this will be stressful on the server, but the server is well equipped with CPU power and memory, so performance will not be an issue.