Iterating Data Movement
Mar 28, 2006
Hi all
I want to continuously monitor a source table throughout the day and, as data becomes available, process it and insert it into one of a number of tables.
I have tried achieving this using a FOR LOOP with a halt condition that can never be satisfied. However, this has a couple of problems:
1) It runs in a tight loop and consequently degrades system performance enormously.
2) I can't get transactions to work. I would like each iteration of the loop to spawn a new transaction under which the tasks in the loop can run. Therefore, if one of the tasks fails during such an iteration, only the updates affected by that iteration are lost.
Ideally, I would like to be able to put a wait statement within the loop container so that it runs every couple of seconds, and I would also like to implement transactions as described above.
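For illustration, the loop body could be plain T-SQL run from an Execute SQL Task inside the container; a minimal sketch of the delay-plus-transaction idea (the processing step itself is a placeholder):
WHILE 1 = 1
BEGIN
    WAITFOR DELAY '00:00:02'         -- sleep so the loop is not tight
    BEGIN TRY
        BEGIN TRANSACTION
        -- process whatever has arrived in the source table here,
        -- e.g. move new rows out to the appropriate target tables
        COMMIT TRANSACTION
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION     -- only this iteration's work is lost
    END CATCH
END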
All help is appreciated.
Jays :-)
View 2 Replies
Jan 22, 2014
I'm moving a set of data from one partition to another. What is the best way, and what needs to be considered?
Note: all of the data will move from one partition to another single partition.
My current query:
UPDATE table1
SET    table1.partitioncolumn = @newpartitioncolumn
FROM   table1
INNER JOIN table2
    ON table2.id = table1.id   -- was "table1.id = table1.id", which is always true
   AND table1.partitioncolumn = @oldpartitioncolumn
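If the row set is large, a hedged variation on the same statement (same table and variable names as above) is to move it in batches so that each transaction stays short:
WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) t1
    SET    t1.partitioncolumn = @newpartitioncolumn
    FROM   table1 AS t1
           INNER JOIN table2 AS t2 ON t2.id = t1.id
    WHERE  t1.partitioncolumn = @oldpartitioncolumn

    IF @@ROWCOUNT = 0 BREAK    -- nothing left to move
END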
View 7 Replies
View Related
May 28, 2015
The secondary server for my availability group was recycled. When SQL Server came back online the data movement for a database was suspended. The error log shows:
"AlwaysOn Availability Groups data movement for database 'XXXXXXXXX' has been suspended for the following reason: "system" (Source ID 5; Source string: 'SUSPEND_FROM_RESTART'). To resume data movement on the database, you will need to resume the database manually. For information about how to resume an availability database, see SQL Server Books Online."
I was able to resume data movement with no issue. I would like to understand the technical reason as to WHY the data movement was put in the suspended state and left there upon recycle. I searched for an article that would list possible reasons (BOL, Google, Bing, etc..). I just could not find much information out there on 'SUSPEND_FROM_RESTART'.
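For reference, the manual resume itself is a one-liner (database name as in the error message):
ALTER DATABASE [XXXXXXXXX] SET HADR RESUME;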
View 2 Replies
View Related
Feb 13, 2015
I am aware that TDE protects data at rest, not data in motion (unless you use encrypted communication channels, e.g. SSL certificates). Hence I am thinking of exporting data from a TDE-encrypted database to a database on an instance where TDE is not enabled or supported. I believe it works; I just need to take care of the relationships between tables. The target database is hosted on SQL Server 2012 Standard Edition, on which TDE is not supported.
View 4 Replies
View Related
Aug 24, 2007
Hi all,
I'm getting this error on expanding the Databases node in SQL Server Management Studio Express Object Explorer:
Failed to retrieve data for this request. (Microsoft.SqlServer.Express.SmoEnum)
An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.Express.ConnectionInfo)
Could not continue scan with NOLOCK due to data movement. (Microsoft SQL Server, Error: 601)
A similar error message also appeared when I tried opening an existing data connection within Database Explorer in Visual Web Developer Express. (However, the web application ran fine and it managed to access the database normally.)
These errors only appeared recently. Any ideas how to go about solving this issue?
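One hedged first check, since persistent 601 errors can indicate corruption rather than ordinary concurrent data movement (the database name is a placeholder):
DBCC CHECKDB ('YourDatabase') WITH NO_INFOMSGS;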
Thanks in advance.
View 1 Replies
View Related
Apr 2, 2008
Hi All,
In one of my interfaces, the source is a flat file which has a field called StoreID in the detail record.
There can be multiple StoreIDs, and I have to generate a different file for each StoreID present in the source file.
To achieve this, I first populate the data from the file into a temp table and use a ForEach ADO enumerator to iterate based on StoreID and produce the different files. This gives a satisfactory result.
But now I have to change the flow so that the temp table is not used, i.e. I have to iterate directly over the flat file.
Do we have a built-in enumerator to achieve this, or should we do this in a Script Task only? Any other options?
Thanks in Advance...
cheers
Srikanth
View 4 Replies
View Related
Mar 5, 2008
Peace be on you,
In my organization, we have a database which now exceeds 2 GB. The application we are working on has been in development since the days of ASP / VB 6 and, in short, is the baby of many developers. It is a kind of ERP with no documentation.
Problem 1:
=========
Now, to reduce the size of the database, we have identified some orphan procedures, tables, and columns. Is it possible for me to create a DTS package which backs up all the orphan objects to some other database and deletes them from the actual one? Moreover, we still have doubts: some of the fields we think are orphans might not be. So we need another DTS package which rolls back all the changes. (Any idea for that?)
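For the backup-before-delete part, a minimal T-SQL sketch of the idea that a DTS package could execute per suspected orphan (all object names are hypothetical):
-- copy the suspect object into an archive database, then drop the original
SELECT * INTO ArchiveDB.dbo.SuspectTable_bak FROM dbo.SuspectTable
DROP TABLE dbo.SuspectTable
-- "rollback": copy it back if it turns out not to be an orphan
-- (note: SELECT INTO does not preserve indexes, constraints, or triggers)
SELECT * INTO dbo.SuspectTable FROM ArchiveDB.dbo.SuspectTable_bak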
Problem 2:
=========
We want another DTS package which would move the data from one database to another, plus a matching rollback DTS package. (Any idea for that?)
Problem 3:
=========
How can we use DTS programming to do these tasks? What benefits do I get from DTS programming over the DTS wizard?
Any help would be greatly appreciated.
View 4 Replies
View Related
Mar 20, 2008
We have a stored procedure that failed with: Could not continue scan with NOLOCK due to data movement
We are running with ISOLATION LEVEL READ UNCOMMITTED. There are other jobs running, some of which might be hitting these tables (all using the ROWLOCK hint, though I know that's not guaranteed); however, this stored proc would not be going near the same rows. But even if it were, we'd be happy with either the before-look or the after-look. This needs to be a low-impact job with minimal impact on the other jobs, so we can't take out locks. Is there any hint we can use to do this? E.g. can we tell the query to just wait until the data has stopped moving and then try again?
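For illustration, one hedged fallback is simply to retry when error 601 is raised; a sketch (SQL 2005+ TRY/CATCH; the proc name is a placeholder):
DECLARE @tries int
SET @tries = 0
WHILE @tries < 5
BEGIN
    BEGIN TRY
        EXEC dbo.usp_LowImpactJob   -- the failing procedure (placeholder name)
        BREAK                       -- success: stop retrying
    END TRY
    BEGIN CATCH
        SET @tries = @tries + 1
        IF ERROR_NUMBER() <> 601 OR @tries >= 5
        BEGIN
            DECLARE @msg nvarchar(2048)
            SELECT @msg = ERROR_MESSAGE()
            RAISERROR(@msg, 16, 1)  -- surface anything we won't retry
            BREAK
        END
        WAITFOR DELAY '00:00:01'    -- brief pause before the next attempt
    END CATCH
END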
View 3 Replies
View Related
Mar 6, 2007
HI All,
In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of them. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync. We want to use a standard OLEDB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use it is a daunting task.
I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLEDB connection manager. However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object, but rather a COM object. I can't seem to find any documentation/examples of how to iterate the tasks within a data flow and change the connection manager.
If you have any information, that would be quite helpful. If you reply with a code sample, and would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.
Thank you.
Steve.
View 1 Replies
View Related
Mar 30, 2005
Hi,
I have 6 different textboxes in my web application, and 6 different tables in my database such as tbl1, tbl2, tbl3, etc.
When the user clicks the submit button, I have to check whether the values in the textboxes match values in the database (if the user enters 3 in txt1, I need to go to tbl1 and check whether there is such a value).
What is the most efficient way to perform such a check? Will I need to write 6 SELECT statements, or can I use a loop? If I can use a loop, I would appreciate an example.
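For illustration, one loop-free option is a single round trip that probes all six tables at once (the column names are assumptions):
SELECT
    (SELECT COUNT(*) FROM tbl1 WHERE col1 = @txt1) AS hit1,
    (SELECT COUNT(*) FROM tbl2 WHERE col2 = @txt2) AS hit2,
    (SELECT COUNT(*) FROM tbl3 WHERE col3 = @txt3) AS hit3
    -- ... and likewise for tbl4, tbl5, tbl6
A nonzero hitN means the value typed into textbox N exists in table N.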
Thanks
View 2 Replies
View Related
Nov 30, 1998
I have a table named sales with 20 rows; the total number of rows for a given stor_id is 5.
I assume that when I create a cursor like the one below it will contain those 5 rows, and that if I print a statement I will get it once for each row, i.e. 5 times. Am I correct?
Well, if that is the concept, I am still getting only one printed statement. I do not know why; am I doing something wrong?
thanks for help
CREATE PROCEDURE p_cursor @ord_nbr char(4)
AS
declare rst cursor for
select * from sales
where stor_id = 123
open rst
fetch rst
while (@@fetch_status = 0)  -- WHILE, not IF: the IF version prints only once
begin
print ' I may have 5 rows '
fetch rst                   -- advance to the next row
end
close rst
deallocate rst
View 3 Replies
View Related
Aug 21, 2006
We're very interested in having our application use SQL/e but we can't have the 4 gig limit. It makes sense to me that SQL/e simply should not be able to access a file over the network, and then you wouldn't have any reason to put a 4 gig limit on it.
At that point it becomes a very flexible alternative for remote users that need to have a large amount of data (i.e. documents, images etc.) with them.
We'd love to start building an abstraction layer so that we can support both SQL Server and SQL/e, covering both network and remote users, without the nightmare that is SQL Server Express installation (courtesy of the Windows Installer group's bugs...).
Thanks! Hoping for a favourable answer!
View 3 Replies
View Related
Jul 10, 2007
How do you iterate through each record in a table within a user-defined function?
I'm a bit new to SQL Server, so I need some help ASAP. Thanks.
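It can be done with a LOCAL cursor, since a function may read (but not modify) data; a minimal sketch, with dbo.SomeTable and SomeCol as placeholders:
CREATE FUNCTION dbo.fn_SumCol()
RETURNS int
AS
BEGIN
    DECLARE @total int, @val int
    SET @total = 0
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT SomeCol FROM dbo.SomeTable
    OPEN c
    FETCH NEXT FROM c INTO @val
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @total = @total + @val   -- per-row work goes here
        FETCH NEXT FROM c INTO @val
    END
    CLOSE c
    DEALLOCATE c
    RETURN @total
END
That said, a set-based query (here simply SELECT SUM(SomeCol)) is almost always faster than row-by-row iteration.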
View 9 Replies
View Related
May 7, 2008
Hello,
I have an Execute SQL Task which returns a set of dates, and I would like to iterate over this set, re-assigning the date to a global variable each time (User::CurrentDate), so that I can perform a number of tasks based on that date.
Can someone show me how this is possible in SSIS?
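For what it's worth, the usual pattern for this (variable names are examples) is:
1. Set the Execute SQL Task's ResultSet property to Full result set and store the result in an Object variable, e.g. User::DateSet.
2. Add a Foreach Loop Container using the Foreach ADO Enumerator, with User::DateSet as the ADO object source variable.
3. On the Variable Mappings page, map column index 0 to User::CurrentDate.
4. Put the per-date tasks inside the container; they run once per row, with User::CurrentDate re-assigned on each iteration.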
Thanks,
Simon
View 1 Replies
View Related
Oct 2, 2007
Hello everyone, I have a table in which I need to increment a field, possibly on several rows, when I enter a new record with the same item ID number. An example will make this much clearer.
ItemID CurrentLocation Iter
A01 Inventory 1
A01 Cutting 0
A01 WIP 2
B01 WIP 0
B02 WIP 1
B02 Inventory 0
I don't want to delete any old rows, so that I can keep a history of where each item has been. The Iter column is in reverse order, so that 0 is the newest value (location) and higher numbers are older locations. An item could go through a CurrentLocation several times.
Now, if I insert a row with ItemID = A01 and CurrentLocation = Polishing, I want the Iter field of all previous A01 rows to increment by 1 and the new row to have Iter = 0.
What would be the easiest, best way to do this? Use a stored procedure, do it in my code, or what? I'm pretty new at SQL Server, so if I'm missing a better way to accomplish the same thing, please point me in that direction. Thanks for your help and/or time.
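For illustration, a minimal T-SQL sketch of the insert logic (the table name dbo.ItemLocations is a placeholder), which could live in a stored procedure or an INSERT trigger:
BEGIN TRANSACTION
    UPDATE dbo.ItemLocations
    SET    Iter = Iter + 1
    WHERE  ItemID = @ItemID           -- age every earlier row for this item

    INSERT INTO dbo.ItemLocations (ItemID, CurrentLocation, Iter)
    VALUES (@ItemID, @NewLocation, 0) -- the new row is always the newest
COMMIT TRANSACTION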
Scott
View 3 Replies
View Related
Apr 23, 2015
I have transactional replication set up from server A to server B. I want to move the subscriber from B to C. What would be the best approach?
1. Back up the DB from server B, restore it on server C, and set up replication between A & C.
2. Set up transactional replication between A & C alongside A & B, verify A & C is working fine, and then remove B.
If I go with approach 2, I have to replicate approx. 70 GB of data, so if I run both replications on server A that will be a strain, with 140 GB of data moving out. How do I control this large movement? Can the replication be synchronized manually?
View 1 Replies
View Related
Oct 5, 2006
Hi All,
I have a SQL Server 2005 database with the following replication setup:
1. 6 merge subscribers
2. 5 snapshot subscribers (push subscriptions)
3. 3 transactional publications
Due to a performance issue, we need to move SQL Server from the current server to a higher-end server.
I want to keep all the replication settings after the database is moved. Can anyone tell me how to achieve this?
Is there a way to keep the replication settings? We can even give the new server the same system name.
Awaiting response...
Thanks,
Thams.
View 1 Replies
View Related
Sep 23, 2004
Hello all,
I have recently started working on a project which involves using MSSQL to access a simple database. I have worked with PostgreSQL before, so I have a general idea of what SQL can be used for, but I'm having some difficulties applying that knowledge to MSSQL.
Currently, I would like to do the following (in abstract terms):
declare tmp record
select column1 from tableA into tmp
for each entry from above selection do
insert into tableB values (tmp[column1], 0, 0, 0)
I remember doing something like this fairly easily in postgres. Trying to put that into MSSQL, I have:
CREATE PROCEDURE dbo.newDay (@mDate datetime)
AS
BEGIN
DECLARE @id int
DECLARE item_cursor CURSOR LOCAL FOR
SELECT id FROM tblKitchenCat
OPEN item_cursor
FETCH NEXT FROM item_cursor INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
INSERT INTO tblKitchenList VALUES (@id, 0, 0, 0, 0, 0, @mDate)
FETCH NEXT FROM item_cursor INTO @id
END
CLOSE item_cursor
DEALLOCATE item_cursor
END
GO
I originally declared this as CREATE FUNCTION ... RETURNS int and got a syntax error next to AS; I've since read that SQL Server functions may not modify table data, so the INSERT has to live in a stored procedure, which is what is shown above.
Can somebody please help me out here? Any articles related to moving from Postgres to MSSQL would also be highly appreciated.
In addition, I would like to schedule a particular procedure to run once a day, say at 2 AM. Is there a way to do this using MSSQL?
Thanks in advance.
Cheers,
Michael
View 3 Replies
View Related
Jul 23, 2005
Hi,
I want to log updates to specific fields, storing the new and old values. Is there any way I can iterate the collection of updated fields within a trigger in order to accomplish this?
Thanks in advance,
Julie Vazquez
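A trigger does not get a ready-made collection of changed fields, but UPDATE(col) and COLUMNS_UPDATED() report which columns were touched, and the inserted/deleted pseudo-tables hold the new and old values. A hedged sketch for one monitored column (all object names are placeholders):
CREATE TRIGGER trg_MyTable_LogChanges ON dbo.MyTable
AFTER UPDATE
AS
IF UPDATE(MyCol)   -- true when MyCol appears in the UPDATE's SET list
    INSERT INTO dbo.ChangeLog (PK, OldValue, NewValue)
    SELECT d.PK, d.MyCol, i.MyCol
    FROM   deleted  AS d
    JOIN   inserted AS i ON i.PK = d.PK
    WHERE  ISNULL(d.MyCol, '') <> ISNULL(i.MyCol, '')
-- COLUMNS_UPDATED() returns a bitmask if the columns must be tested dynamically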
View 4 Replies
View Related
Feb 1, 2007
Hi There,
Can someone please let me know the best way to iterate the output rows of a script component and put those IDs in the WHERE clause of a SELECT query (to retrieve additional info from a database)? Is this possible at all? If not, what is the best way to deal with this situation?
Thanks a lot!!
View 20 Replies
View Related
Apr 20, 2007
Hello everyone,
There's an interesting SQL problem I've come across that I'm currently banging my head against. Given the following table that contains item location information populated every minute:
location_id  date_created
===========  ============
5   2000-01-01 01:00  <-- Don't need
5   2000-01-01 01:01  <-- Don't need
5   2000-01-01 01:02  <-- Need
7   2000-01-01 01:03  <-- Don't need
7   2000-01-01 01:04  <-- Need
5   2000-01-01 01:05  <-- Need
2   2000-01-01 01:06  <-- Don't need
2   2000-01-01 01:07  <-- Need
7   2000-01-01 01:08  <-- Need
how would you generate a result set that returns the item's location history *without* duplicating the same location if the item has been sitting in the same room for a while? For example, the result set should look like the following:
location_id  date_created
===========  ============
5   2000-01-01 01:02
7   2000-01-01 01:04
5   2000-01-01 01:05
2   2000-01-01 01:07
7   2000-01-01 01:08
This is turning out to be a finger twister and I'm not sure if it could be done in SQL; I may have to resort to writing a stored proc.
Regards,
Anthony
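For illustration, it can be done in a single statement on SQL 2005+ with ROW_NUMBER (the table name is assumed): keep each row whose successor has a different location, or which has no successor at all:
WITH ordered AS (
    SELECT location_id, date_created,
           ROW_NUMBER() OVER (ORDER BY date_created) AS rn
    FROM   dbo.item_location_history   -- placeholder table name
)
SELECT    cur.location_id, cur.date_created
FROM      ordered AS cur
LEFT JOIN ordered AS nxt ON nxt.rn = cur.rn + 1
WHERE     nxt.rn IS NULL OR nxt.location_id <> cur.location_id
ORDER BY  cur.date_created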
View 9 Replies
View Related
Jul 30, 2015
I have a scenario in which a schedule is recorded like the top table below. Notice the start and end times, the meeting length, and the fact that you could book more than one meeting (book factor) during the time slot. The second table is the result needed. I have it working using the dreaded cursor, but I know there's got to be a more elegant solution.
empID  bookFactor  mtgLen  mtgStart       mtgEnd
1      2           15      7/1/2015 8:00  [code]....
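For illustration, the cursor can often be replaced with a join against a numbers/tally table; a hedged sketch (dbo.Schedule and dbo.Numbers are placeholder names, Numbers holding 0, 1, 2, ...):
SELECT s.empID,
       b.n AS booking,                                   -- 1..bookFactor
       DATEADD(minute, t.n * s.mtgLen, s.mtgStart) AS slotStart
FROM   dbo.Schedule AS s
JOIN   dbo.Numbers  AS t
       ON t.n < DATEDIFF(minute, s.mtgStart, s.mtgEnd) / s.mtgLen
JOIN   dbo.Numbers  AS b
       ON b.n >= 1 AND b.n <= s.bookFactor
ORDER BY s.empID, slotStart, booking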
View 8 Replies
View Related
Nov 26, 2007
I'm trying to write a reusable script component that takes rows rejected from a SQL destination operation and puts them into a common SQL error table.
The script would take the input columns selected in the script, build a delimited string (similar to the 'Flat File Source Error Output' that contains redirected rows when reading a flat file), and insert that string into a SQL table called 'SourceData' to store the errors.
I'm trying to script the component to iterate through all input columns (as selected on the Input Columns screen) and build a simple string.
Code Block
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
'Use the incoming error number as a parameter to GetErrorDescription
Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode)
Try
Row.ErrorColumnName = ComponentMetaData.InputCollection(0).InputColumnCollection(Row.ErrorColumn).Name
Catch ex As Exception
Row.ErrorColumnName = String.Concat("Column Name retrieval failure. Details", ex.Message)
End Try
'
'Build input data
'
Dim inData As String = ""
For Each inputCol As IDTSInputColumn90 In ComponentMetaData.InputCollection(0).InputColumnCollection
inData = String.Concat(inData, "~", inputCol.Name) 'This gives the column name, but I want the value.
Next
Row.SourceData = inData
'
End Sub
I've only got as far as iterating the names of the columns in the input buffer, but how do I get the values?
The result I'm trying to achieve is:
Selected columns in 'Input Column' screen : Name, Address, Phone
OutPut column 'SourceData' value : Harry~Melbourne~None
I don't want to write the code as:
Code Block
inData = Row.Name
indata = String.Concat(inData,"~",Row.Address)
indata = string.concat(inData,"~",Row.Phone)
as this makes my code not very reusable. I've got some tables which are 100+ columns wide, and I don't wish to modify the code too much.
I have also tried overriding the ProcessInput() function of the script component to iterate through the buffer columns :
Code Block
Public Overrides Sub ProcessInput(ByVal InputID As Integer, ByVal Buffer As Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer)
MyBase.ProcessInput(InputID, Buffer)
Dim iCnt As Integer = 0
Dim inData As String
'The buffer has no current row until NextRow() is called; indexing a column
'before that raises "PipelineBuffer has encountered an invalid row index value."
While Buffer.NextRow()
inData = ""
For iCnt = 0 To Buffer.ColumnCount - 1
inData = String.Concat(inData, "~", Convert.ToString(Buffer.Item(iCnt)))
Next
End While
End Sub
but my original version (without the NextRow() loop) threw an error when I ran it.
Please help.
View 18 Replies
View Related
Aug 11, 2015
I've got an issue with a query in SSIS. From a table in SQL Server I'm getting over 25,000 different identifiers, and these identifiers are associated with many values in a table in an Oracle database. This is the approach I have implemented.
The problem is that some days there can be over 45,000 identifiers, and at that point performing a loop for each one is not the best solution (it can take too much time to get the result).
Previously I took another approach: from a SQL statement I created and sent a single row with all the values concatenated, then recovered that string from an object variable and used it to build the query in the ODBC source that reads the Oracle table, something like this:
'Select * from Oracle_table' + @string_values
with @string_values = 'where value in (........)'. That works when the number of values is small enough, like 250. But in this case I cannot use that approach because the number is really big, and obviously the Oracle DBA is going to cancel the query.
So I wonder: how can I iterate over the object taking only a small number of values each time, say 300 or at most 500, to avoid the cancellation of the query while at the same time doing the minimum number of loops?
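For illustration, a hedged way to build such batches on the SQL Server side (dbo.Identifiers is a placeholder): number the ids, bucket them in groups of 300, and concatenate each bucket into one IN-list string for the loop to consume:
WITH numbered AS (
    SELECT id,
           (ROW_NUMBER() OVER (ORDER BY id) - 1) / 300 AS batch_no
    FROM   dbo.Identifiers
)
SELECT batch_no,
       STUFF((SELECT ',' + CAST(n2.id AS varchar(20))
              FROM   numbered AS n2
              WHERE  n2.batch_no = n1.batch_no
              FOR XML PATH('')), 1, 1, '') AS id_list
FROM   numbered AS n1
GROUP BY batch_no
Each resulting row can then drive one iteration of the Foreach loop, so 45,000 identifiers become roughly 150 Oracle queries instead of 45,000.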
View 5 Replies
View Related
Nov 24, 2006
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly into my data mining models, without saving them somewhere and occupying too much space. I really need guidance on this.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
View Related
Dec 14, 2005
After testing the application I wrote on the local PC, I deployed it to the web server to test it out, and I get this error:
System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.
Notes: all pages that have this error have either a repeater or a datagrid which loads data during page load.
At first I thought the problem was with the date, but I can see that some other pages that have a datagrid (with a date field) work just fine.
Has anyone had this problem before? Hopefully you guys can help.
Thanks,
View 4 Replies
View Related
Dec 4, 2007
I have used both data readers and data adapters (with datasets) in the projects I have worked on, and I am trying to get some clarification on when I should use which. I think I am doing this correctly, but I want to be sure I am developing good habits.
As the name suggests, a DataReader seems to be for reading data only. I have read that the DataAdapter and DataSet are for a disconnected architecture, or at least can be used for that type of setup. I have been using the DataAdapter and DataSet when writing to a database, and the DataReader when reading from a database.
Is this how they should be used? Is the DataReader the best choice for reading data? Am I doing this optimally from a performance standpoint?
......................................................thanks in advance
View 1 Replies
View Related
Nov 2, 2015
We have already integrated data from different clients into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is this possible using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
View 4 Replies
View Related
Oct 18, 2006
When I enter over 4,000 characters in any ntext field in my SQL Server 2005 database (directly in the database or through the application), I get an error saying the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold much more data than this...
View 7 Replies
View Related
Aug 12, 2015
I have a requirement to implement CDC for 50+ tables, to feed incremental data changes into the warehouse/reporting side rather than exporting whole tables. The largest table has more than half a billion records.
The warehouse uses a daily copy of the OLTP DB (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes from these tables?
Is there any performance impact if we enable CDC on the OLTP DB?
Can we make use of the CDC tables in the environment where we do the daily DB refresh, so that the queries don't hit the OLTP database?
What is the best way to implement CDC to capture incremental changes for reporting?
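For reference, a minimal sketch of enabling CDC and reading changes with the documented system procedures (database and table names are placeholders):
USE OltpDb
EXEC sys.sp_cdc_enable_db                -- once per database
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'BigTable',
     @role_name     = NULL               -- repeat per table

-- pull the incremental changes since the last extract:
DECLARE @from_lsn binary(10), @to_lsn binary(10)
SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_BigTable')
SET @to_lsn   = sys.fn_cdc_get_max_lsn()
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_BigTable(@from_lsn, @to_lsn, N'all')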
View 0 Replies
View Related
Jul 20, 2005
Hi,
This is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:
DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'
However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data.
How to do this! Thanks for your help.
SD
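For the read-back, a hedged sketch (table and column names as in the snippet above): cast the image bytes to varbinary and then to varchar:
SELECT CONVERT(varchar(8000),
       CONVERT(varbinary(8000), BITS_data)) AS note_text
FROM   mytable_BINARY
WHERE  ID = 'RB215'
Note the 8,000-byte cap on SQL Server 2000; for longer notes, TEXTPTR with READTEXT is the usual route.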
View 1 Replies
View Related
Nov 10, 2015
I'm using a Script Component to load data into an Oracle DB due to a performance issue. Now I've found that it misses some data during the transmission. Please see the screenshots below:
SQL Server:
Oracle:
DDL:
create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);
Result:
I followed this article: [URL] ....
VB Script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
[Code] ..........
View 8 Replies
View Related
Apr 16, 2008
Hi all, I got this error:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to Data Flow -> Excel Connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file?
And also Fiscal Year / Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?
Thanks
View 13 Replies
View Related