Archiving Data Transformation Service Packages

Feb 29, 2000

Is there a way, for example, to script a DTS package so that it can be deleted and recreated at a later date if necessary? I have quite a lot of these, but few are used regularly, and the msdb database is now up to 80 MB. However, I don't want to delete them and have no way to recreate them. If I took a backup of msdb and then deleted the packages, would restoring msdb at a later date restore the packages?
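
One thing worth confirming before relying on that approach: DTS packages saved to SQL Server (rather than to a structured storage file or Meta Data Services) live in the msdb.dbo.sysdtspackages table, so a backup of msdb should capture them. A minimal sketch for checking what is stored there and how much space each version takes; the query itself is just an illustration:

SELECT name, versionid, createdate, owner,
       DATALENGTH(packagedata) AS PackageBytes  -- size of the stored definition
FROM msdb.dbo.sysdtspackages
ORDER BY name, createdate;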

View 1 Replies


Analysis Service: Pb When Archiving/restoring From 1 Machine To Another

Oct 19, 2004

I have to install cubes on another machine (not on the same site/NT domain).
I have archived the OLAP database and generated a .cab file.

I have restored it on the destination server.
It works fine in the Analysis Manager console (I can process and look at the data).
But when I want to access the cube from Excel using an OLAP cube connection, I get an error message.

Is there something I need to do on the destination server?
Language, SP, role, version of SQL Server/Analysis Services...?

Thanks

View 3 Replies View Related

Archiving Data

Aug 2, 2000

Hi everyone!

My problem is that I don't know how I can archive the data, that is, document when, who, etc. changed the data (in a separate table).
I tried to solve it with different triggers.
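
For what it's worth, a trigger-based audit usually looks something like the sketch below; the table and column names (dbo.Customer, dbo.Customer_Audit, CustomerId) are placeholders, since the real schema isn't shown:

-- Audit sketch: record who changed which row and when into a separate table.
CREATE TRIGGER trg_Customer_Audit
ON dbo.Customer
FOR UPDATE, DELETE
AS
BEGIN
    INSERT INTO dbo.Customer_Audit (CustomerId, ChangedBy, ChangedAt)
    SELECT d.CustomerId, SUSER_SNAME(), GETDATE()
    FROM deleted AS d;   -- "deleted" holds the pre-change rows
END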

Thanks in advance,

Maria.

View 1 Replies View Related

Data Archiving

Oct 6, 2004

Hi,

I need to archive my production database to a new server.

Is it possible to move data using INSERT INTO ServerName.DBName.dbo.TableName from the current database server?

Do I need to create a linked server to do this, or should I go for DTS?
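
If both servers can see each other, a linked server plus a four-part name is usually enough for a one-off move. A minimal sketch, with hypothetical server, database, table and column names (ARCHIVESRV, ArchiveDB, dbo.Orders, OrderDate):

-- Register the remote archive server (names here are placeholders).
EXEC sp_addlinkedserver @server = N'ARCHIVESRV', @srvproduct = N'SQL Server';

-- Copy the rows across using a four-part name; verify the counts
-- before deleting anything from the source.
INSERT INTO ARCHIVESRV.ArchiveDB.dbo.Orders
SELECT *
FROM dbo.Orders
WHERE OrderDate < '20040101';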

Thanks
Cheriyan.

View 1 Replies View Related

Data Archiving

Jan 11, 2007

What are the best archiving procedures? :)

SlayerS_`BoxeR` + [ReD]NaDa

View 2 Replies View Related

Archiving Data From Some Tables - Timestamp Data Type?

Aug 21, 2015

I'm working on archiving data from some tables. I've duplicated the data structure, with the exception of not including the IDENTITY specifier on INT columns, so that the archive table will keep the value that was generated in the original table. This all went well until I tried to copy the data over where a column is specified as a timestamp data type. I've looked this up and found a couple of things. First, the documentation for SQL 2000 says:

Timestamp is a data type that exposes automatically generated binary numbers, which are guaranteed to be unique within a database. Timestamp is used typically as a mechanism for version-stamping table rows. The storage size is 8 bytes.

And then the documentation for the soon-to-be-released SQL 2016 says this about the rowversion data type:

The timestamp syntax is deprecated. This feature will be removed in a future version of Microsoft SQL Server. Avoid using this feature in new development work, and plan to modify applications that currently use this feature.

and

Is a data type that exposes automatically generated, unique binary numbers within a database. rowversion is generally used as a mechanism for version-stamping table rows. The storage size is 8 bytes. The rowversion data type is just an incrementing number and does not preserve a date or a time.

OK, I've read the descriptions, but I don't get it. Why have a timestamp/rowversion data type?
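
For what it's worth, a quick way to see what rowversion actually does is to watch the value change on every update. A minimal sketch, with a scratch table name of my own choosing:

-- rowversion is a database-wide counter stamped onto a row whenever the row
-- is inserted or updated; it carries no date or time meaning.
CREATE TABLE dbo.ArchiveDemo
(
    Id   INT IDENTITY PRIMARY KEY,
    Name VARCHAR(50),
    RV   ROWVERSION
);

INSERT INTO dbo.ArchiveDemo (Name) VALUES ('first');
SELECT Id, Name, RV FROM dbo.ArchiveDemo;   -- note the RV value

UPDATE dbo.ArchiveDemo SET Name = 'changed' WHERE Id = 1;
SELECT Id, Name, RV FROM dbo.ArchiveDemo;   -- RV has moved on

When copying rows into an archive table, note that you cannot insert explicit values into a rowversion column; the usual workaround is to declare the archive copy of that column as binary(8).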

View 9 Replies View Related

Archiving Data To Flat File?

May 24, 2002

How do I put data into a text or Excel file before I attempt a deletion from a large table? I know how to select the necessary data, but I'm not sure about the T-SQL required to put it into a file.

Are there any better methods of archiving?
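
One common approach is to let bcp write the result of a query to a flat file, then run the delete once the export has been verified. A minimal sketch driven from T-SQL via xp_cmdshell (which must be available to you); the database, table, column, path and server names are all placeholders:

-- Export the rows that are about to be deleted, then remove them.
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.BigTable WHERE CreatedDate < ''20020101''" queryout C:\archive\BigTable_pre2002.txt -c -T -S MYSERVER';

DELETE FROM MyDb.dbo.BigTable
WHERE CreatedDate < '20020101';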

thanks

View 3 Replies View Related

Archiving And Deleting Records (Data)?

Aug 10, 2015

I wrote a script to archive and delete records from a table back in 2005 and 2009.

I can't seem to get the syntax right. Any sample script to simply archive and delete records?

This is what I have so far.

DECLARE @ArchiveDate Datetime
SET @ArchiveDate = (SELECT TOP 1 DATEPART(yyyy,Call_Date)
FROM tblCall
ORDER BY Call_Date)
--SELECT @ArchiveDate AS ArchiveDate
DECLARE @Active bit

[Code] ....
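
In case a complete skeleton helps, here is a minimal archive-then-delete sketch built around the same tblCall / Call_Date names; the archive table name and the "oldest year" cut-off are my own assumptions:

-- Archive and delete everything from the oldest year present in tblCall.
-- tblCall_Archive is assumed to have the same structure as tblCall.
DECLARE @ArchiveYear int;

SELECT @ArchiveYear = DATEPART(yyyy, MIN(Call_Date))
FROM tblCall;

BEGIN TRANSACTION;

INSERT INTO tblCall_Archive
SELECT *
FROM tblCall
WHERE DATEPART(yyyy, Call_Date) = @ArchiveYear;

DELETE FROM tblCall
WHERE DATEPART(yyyy, Call_Date) = @ArchiveYear;

COMMIT TRANSACTION;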

View 9 Replies View Related

Data-archiving And Purging Strategy

Mar 31, 2008



Regarding SQL Server data, I am looking to implement the best data-archive and purge policy. Normally we do SQL backups and keep the history for some period, for example 8 weeks, so we can go back and restore to any point in time within that history; we also do tape backups, which reach much further back.

The question is: where can I find a good article or documentation on designing such a policy, so that point-in-time recovery of the database is covered by the SQL backups, point-in-time recovery in the far past (say, 3 years ago) is covered by the tape backups, and I don't duplicate the same effort in both places?

Any advice or suggestions on this topic?

Thanks,

View 5 Replies View Related

Extracting Data From A Table For Archiving Purposes

Sep 22, 1999

Hello,

I have been placed in the position of administering our Microsoft SQL Server 6.5 installation. Being new to SQL Server, with some database experience (I used to use FoxPro 2.6 for... DOS!), I am faced with an ever-increasing table of incoming call information from our Ascend MAX RAS equipment. This table grows by 900,000 records a month. The previous administrator (no longer available) was using a Visual FoxPro 5 application to archive and remove the data older than 60 days. Unfortunately he left and took Visual FoxPro and all of his project files with him.

My question is this: is there an easy way to archive and then remove the data older than 60 days from the table? I would like to archive it to a tape drive. We need to maintain this archive so we can search back through customer calls for IP addresses on certain dates and times. We are an ISP, and occasionally need to give this information to law enforcement agencies, so we cannot just delete it.
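
On the database side, the usual pattern is to copy the old rows into an archive table (which can then be dumped to tape) and delete them from the live table in the same batch. A minimal sketch, with hypothetical table and column names since the real schema isn't shown:

-- Move call records older than 60 days into an archive table.
-- tblCalls, tblCalls_Archive and CallDate are placeholder names.
BEGIN TRANSACTION

INSERT INTO tblCalls_Archive
SELECT *
FROM tblCalls
WHERE CallDate < DATEADD(day, -60, GETDATE())

DELETE FROM tblCalls
WHERE CallDate < DATEADD(day, -60, GETDATE())

COMMIT TRANSACTION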

Sorry this is so long...

Thanks,

Brian R
WESNet Systems, NOC

View 1 Replies View Related

Transformation Service Question About De_dup Process.

Jun 8, 2006

A standard SQL statement that is used a lot is a left join.

What I'm trying to do is dedup a table.

Example:

Select * from table1 as t1
left join table2 as t2
on t1.firstname = t2.firstname
and t1.lastname = t2.lastname
and t1.gender = t2.gender
and t1.dob = t2.dob
where t2.lastname is null

What I thought was that within a data flow I would have one OLE DB source whose SQL statement returns the full table, and a second OLE DB source whose SQL statement returns the deduped duplicates. What I want to do next is remove the dups and then rejoin the deduped dups back into the original table, which is now short those dups.

In the data flow I have:

OLE DB Source: the original table
OLE DB Source 1: the deduped dups

I sort both OLE DB connections, and on the dups connection I put a Multicast.

Then I feed both flows into a Merge Join that I have set up as a left outer join, selecting all columns from the original table and linking to the right table on firstname, lastname, dob and gender, but with no columns selected from the right table.

Then I have the Merge Join go into a Conditional Split to separate out the NULL values, choosing the error output and redirecting the data through the red arrow into a Union All together with the other Multicast output of the dups, and from there into the OLE DB destination tables.

I only get the deduped dups through the process. What is the best way to do this with transformations? Maybe I'm not using the Merge Join or the Conditional Split correctly. Please help.



Thanks.

View 6 Replies View Related

Transaction Replication && Data Archiving On SQL Server 2000

Jul 23, 2005

Hi techies,

I have set up transactional replication from my primary server to a secondary server on the Orders table. Thousands of records get inserted into Orders every hour and are replicated to the secondary server. It works fine, and the reporting apps use the secondary server's Orders table for generating reports.

The problem: say I want to remove older records from the Orders table on the primary server without reflecting this change on the secondary server. Is there a way to prevent this operation/transaction from being propagated to the secondary server?

Note: I am moving the records to another table (Orders_Archive) and deleting the rows from the Orders table, and I need all the rows to remain present on the secondary server's table.

Please advise ASAP.

Regards,
Raj
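
For the archive half of this, the usual shape is a single batch that copies the old rows into Orders_Archive and then deletes them from Orders; the sketch below assumes an OrderDate column and a six-month cut-off, neither of which is named in the post. Preventing the delete from replicating is a separate problem: one commonly used approach is to wrap the delete in a stored procedure, publish the execution of that procedure rather than the individual row deletes, and make the subscriber's copy of the procedure a no-op, but check that against your replication setup before relying on it.

-- Archive-then-delete on the publisher (column name and cut-off are assumptions).
DECLARE @Cutoff datetime
SET @Cutoff = DATEADD(month, -6, GETDATE())

BEGIN TRANSACTION

INSERT INTO dbo.Orders_Archive
SELECT *
FROM dbo.Orders
WHERE OrderDate < @Cutoff

DELETE FROM dbo.Orders
WHERE OrderDate < @Cutoff

COMMIT TRANSACTION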

View 4 Replies View Related

Walkthrough To Execute SSIS Packages Via Web Service

Aug 17, 2007

I am seeking a walkthrough for executing packages via a web service. All I have found so far are fragmented bits of information. This article is a good start, but it leaves out some critical security setup information: http://msdn2.microsoft.com/en-us/library/ms403355.aspx#service.

Does anyone know of a good walkthrough that includes the security setup: Impersonation, Proxies, etc.?

View 6 Replies View Related

Remote Execution Of SSIS Packages Via Web Service

May 30, 2006

Is anyone executing SSIS packages using a web service similar to the example in http://msdn2.microsoft.com/en-us/ms403355(SQL.90).aspx

From what I've read, there is a new HTTP server embedded in SQL Server (so we don't have to have IIS) that this could be done from?

View 2 Replies View Related

Calling SSIS Packages From A Service Broker Queue

Feb 26, 2007

I would like to call an SSIS package from a Service Broker Queue.

There is one way that I am aware of -

Using xp_cmdshell from within an activation stored procedure and using DTEXEC.

Is there a more elegant way of executing an SSIS package from within SSB?

Also, I am not interested in writing a .NET external activator to process my messages in the queue. I would like this operation to be strictly database-oriented. Having said that, I am also trying to avoid having triggers process the messages in the queue.
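
For reference, the xp_cmdshell route mentioned above usually looks something like the sketch below inside the activation procedure; the queue name, package path and conversation handling are placeholders, xp_cmdshell has to be enabled on the instance, and a production activation procedure would normally loop over the queue rather than process a single message:

-- Minimal activation-procedure sketch: receive one message, then launch a
-- package with DTEXEC via xp_cmdshell. Names and paths are placeholders.
CREATE PROCEDURE dbo.usp_RunPackageOnMessage
AS
BEGIN
    DECLARE @handle uniqueidentifier,
            @msg    varbinary(max);

    RECEIVE TOP (1)
        @handle = conversation_handle,
        @msg    = message_body
    FROM dbo.TargetQueue;

    IF @handle IS NOT NULL
    BEGIN
        EXEC master..xp_cmdshell
            'dtexec /FILE "C:\Packages\LoadOrders.dtsx"';

        END CONVERSATION @handle;
    END
END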

Thank you!

View 6 Replies View Related

Unable To Execute Multiple SSIS Packages From Windows Service

Aug 28, 2007

Need some help...
When we try to run multiple packages one after the other from a Windows service, the first one succeeds but the later ones throw the error below:
"The script threw an exception: The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there.
A deadlock was detected while trying to lock variable "System::PackageName, User::BusinessDate, User::Environment, User::PortfolioName" for read access. A lock could not be acquired after 16 attempts and timed out."

Later, we tried to create a separate AppDomain for each package and execute them via a console application, but ended up with the error below (the expressions were defined in the OnError event):
"The result of the expression "@[User::ReportErrorFrom]" on property "FromLine" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
The result of the expression ""Error At :" + @[System::SourceName] + "" +
"Error Description : "+ @[System::ErrorDescription] + "" " on property "MessageSource" cannot be written to the property. The expression was evaluated, but cannot be set on the property."

Finally, we tried to spawn a separate process (System.Diagnostics.Process) for each package. This seems to work but takes a very long time: a package that normally takes 2 minutes is taking 60 minutes.

We also tried creating an SSIS package that executes the multiple packages, but only the first package gets executed, and the second one throws the error below (the variable it is trying to lock belongs to the first package):
"Failed to lock variable "UniqueInstrumentsQuery1" for read access with error 0xC0010001 "The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.".

Please help us with a workaround for this. Thanking you in advance,

View 17 Replies View Related

SQL 2012 :: Importing Packages Into Integration Service Catalog - Master Key Error 15581

Jul 28, 2015

I am trying to import packages into the Integration Services catalog and I am getting the master key error:

"Please create a master key in the database or open the master key in the session before performing the operation (error: 15581)"

The version is SQL Server 2012.
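
Error 15581 usually means the SSISDB database master key isn't open (common after the catalog database has been moved or restored). A hedged sketch of the usual remedy, assuming you know the password the master key was created with; the password below is a placeholder:

USE SSISDB;
GO
-- Open the database master key with its original password, then re-encrypt it
-- with the service master key so it opens automatically from now on.
OPEN MASTER KEY DECRYPTION BY PASSWORD = N'<original master key password>';
ALTER MASTER KEY ADD ENCRYPTION BY SERVICE MASTER KEY;
GO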

View 1 Replies View Related

Transfer Data To Excel 2007 By Using SQL Server Data Transformation Services

Jun 11, 2007

My vendor requires data to be sent in Excel format. Some of my tables have more than 65,536 rows, so I need to use Excel 2007 (which has a maximum of 1,048,576 rows). Right now my data sits in SQL Server 2000, and I am using SQL Server Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or option I am missing that would let DTS export from SQL Server to Excel 2007? Thanks in advance.

View 3 Replies View Related

Does A Synchronous Transformation Process All Rows In A Buffer Before Outputting To Next Transformation?

Jun 5, 2006

Hi,

If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting those rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?

Thanks in advance,
Lawrie.

View 5 Replies View Related

Integration Services :: Difference Between Audit Transformation And Row-count Transformation?

Apr 22, 2015

Can you tell me the difference between the Audit transformation and the Row Count transformation? Both the Audit and Row Count transformations expose environment/system values.

The only difference I am finding is that Row Count returns the count of rows it processes.

Apart from that, is there any other difference? And in what scenario would I need to use the Audit transformation?

View 3 Replies View Related

Data Transformation

Sep 14, 2000

We are transferring data between an AS/400 and SQL Server 7.0 using DTS. Some of these transfers may need to be very close to real time, and it doesn't seem like a continuously running job is the best solution for that.

Do you know any tools or utilities that can help us to move the data?

Thank you,
Anastasia.

View 2 Replies View Related

Data Transformation

Feb 19, 2003

I have something like this:

select * from accounts

name type amount
==== ==== ======
mary saving 123.00
mary chequing 246.00
mary investment 135.00
john saving 678.00
john chequing 987.00
john investment 0.00

What should I do to present the data in the following format?

name saving cheq investment
==== ====== ==== ==========
mary 123.00 246.00 135.00
john 678.00 987.00 0.00
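
On SQL Server 2000 this kind of crosstab is usually done with grouped CASE expressions, one per account type; a minimal sketch against the accounts table shown above:

-- Pivot one row per name with a column per account type.
SELECT
    name,
    SUM(CASE WHEN type = 'saving'     THEN amount ELSE 0 END) AS saving,
    SUM(CASE WHEN type = 'chequing'   THEN amount ELSE 0 END) AS chequing,
    SUM(CASE WHEN type = 'investment' THEN amount ELSE 0 END) AS investment
FROM accounts
GROUP BY name;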


Thanks.

View 3 Replies View Related

Data Transformation

Jan 12, 2008

Hi, newbie here with a simple (maybe) question.

I have an Access database that I have imported into SQL Server 2000, and that worked great, but now I have to get it into 2005. My question is: how can I get the tables and all the data in the tables into a SQL script so I can run that script on the 2005 server?

SQL 2000 is on my dev server and I have all the tools (Enterprise Manager, Query Analyzer, etc.), but the 2005 server is GoDaddy's and they only provide a basic web interface. I can run SQL files and create databases and tables, but that's about it.

View 2 Replies View Related

Data Transformation Services

Jul 18, 2007

Hi,

I was told that using DTS would allow me to schedule stored procedures to keep a SQL database up to date. For example, if a user registers but does not activate the registration, his details will be removed by a stored procedure which is scheduled to run every 24 hours. I used to use the global.asax file to fire an update: a file containing the date of the last update would have 24 hours added to it, and then a stored procedure would execute to delete the unwanted data.

I have tried to install DTS with no success. I am running the following:

Visual Web Studio Express
SQL Server 2005 Express (from SQLEXPR.exe), with all the extra components selected
SQLEXPR_Toolkit.exe with all its options
SQLServer2005_DTS.MSI

When I go into the server using MS SQL Server Management Studio Express, I cannot see the Data Transformation Services node. I have also just installed the server reports, which I had no problems installing.

Can somebody please help me?
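
Separately from the install question, the cleanup itself is just a stored procedure plus a schedule; a hedged sketch of the procedure, with table and column names invented for illustration. Note that SQL Server 2005 Express has no SQL Agent, so the schedule would have to come from something like Windows Task Scheduler running sqlcmd:

-- Remove registrations that were never activated within 24 hours.
-- dbo.Registrations, Activated and RegisteredOn are placeholder names.
CREATE PROCEDURE dbo.usp_PurgeInactiveRegistrations
AS
BEGIN
    SET NOCOUNT ON;

    DELETE FROM dbo.Registrations
    WHERE Activated = 0
      AND RegisteredOn < DATEADD(hour, -24, GETDATE());
END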

View 2 Replies View Related

SSIS Data Transformation

Jan 31, 2008

I have begun using SSIS and I am a little taken aback by its complexity, especially since I just want to do a simple data transformation such as in DTS.
Are there any tutorials on data transformation for SSIS on the web or on this forum? And what if I want to do a simple transformation from Access to SQL Server?

View 1 Replies View Related

DTS Error During Data Transformation

Nov 8, 2000

I tried transforming data from one server to another using DTS, and then I got the error below:
-----------------------------------------------------------------------------
Details: Error Source: Pump Data Step
Details: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Facility: 4, Severity 1,Code: 1176, HRESULT:0x80040498

Desc: The Data Transformation Services cannot copy or transform data from a Desktop or server to a standard, Enterprise, or small business server version of SQL server unless your destination server is per user licensing mode.

Source : Microsoft Data Transformation Services(DTS) Package

Source : Microsoft Data Transformation Services(DTS) Package
Code: 0x80040428
Description: Package failed because step 'Pump data step' failed.
Error Message: IDispatch error #552
--------------------------------------------------------------------------

This is the full description of the error dialog...

Please suggest some solutions.

Thanks in advance,
Kamalesh D

View 2 Replies View Related

Data Transformation - Hanging

Jul 20, 2005

I'm running a DTS package on SQL Server. The source is MS Access and the target is Oracle.

On a "Drop Table" command the process just hangs. There are no foreign keys on the table. Several tables have already been processed successfully by this time.

I think I've ruled out corruption by dropping and recreating the target database on Oracle.

Any ideas?

M Man

View 1 Replies View Related

Lookup Transformation (Can It Be Using The Old Data?)

Mar 31, 2008



I have a lookup transformation that retrieves a key for a certain column of values, in this case a name. So, I go into the lookup table with a name and come out with its key. I had it working, and then I added new entries to the lookup table for a bunch of new names. Now, for some reason, I am not getting matches for the new names, but I am still getting matches for the names that existed before I added the new ones.

I'm wondering if the lookup transformation is using the old set of data and somehow not picking up the new names. Do I have to trigger something in the lookup transformation to let it know that the lookup table data has changed?

View 4 Replies View Related

Data Transformation Question

Sep 6, 2006

I have a student table that needs some cleanup. My first task is to remove all the periods (.) from the middle name column. Some people have two-character middle initials with two periods. Can this be done via an SSIS package (I'm sure it can, but not sure how)? I can't think of a simple UPDATE statement that would accomplish the same thing.
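
For the straight T-SQL route, REPLACE handles this in one statement; a minimal sketch assuming a dbo.Student table with a MiddleName column (both names are placeholders):

-- Strip every period from the middle name column.
UPDATE dbo.Student
SET MiddleName = REPLACE(MiddleName, '.', '')
WHERE MiddleName LIKE '%.%';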

Any direction would be appreciated.

View 3 Replies View Related

Data Transformation Services Question.

Oct 23, 2002

I am creating a DTS package for an import of data from Access 97 to SQL Server 2000. I am quite new to DTS, so bear with me please.

Everything was fairly simple until I got stumped by the following problem:

In the source database there is a table with multiple fields (let's say A, B, C), with each record containing their values (let's say 1, 2, 3).

Now, in the destination database the table is built differently: it is a table containing the field definition and the field data.

So basically I need to turn this

ID A B C
0 1 2 3
1 4 5 6


into this

ID FieldName FieldVal
0 A 1
0 B 2
0 C 3
1 A 4
1 B 5
1 C 6


I am not sure how to go about that... any thoughts?
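
One way to do this without any custom DTS scripting is to unpivot in the SELECT that feeds the transformation, using one UNION ALL branch per source column; a minimal sketch with the source table name assumed:

-- Turn one row per ID with columns A, B, C into one row per (ID, FieldName, FieldVal).
SELECT ID, 'A' AS FieldName, A AS FieldVal FROM SourceTable
UNION ALL
SELECT ID, 'B', B FROM SourceTable
UNION ALL
SELECT ID, 'C', C FROM SourceTable
ORDER BY ID, FieldName;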

View 5 Replies View Related

Transformation Component Data Store

Jun 8, 2006

I am developing a custom transformation component, where I am building a custom object and want it to be transferred from the component UI to the component. I explored this issue and came to learn that we can make use of the SaveToXML and LoadXML methods of the IDTSPersist90 interface. The problem is that I could not make use of this interface. If anybody has faced the same issue and found a solution, please let me know.

Thanks in advance

Karun

View 1 Replies View Related

Transformation Components And Moving Data

Apr 10, 2007

Hi,

I am having a simple difficulty regarding a transformation component that I have created.

I followed all of the relevant documentation to create the component, the UI and so on, but when I run a package that transfers data through my component (without it doing anything to the data) to a flat file destination, all I get in the flat file is ,,,,,,,,,,

Meaning that the data is not actually being passed through my component to the destination, though it does recognize the correct number of columns. I get this warning for each column: [DTS.Pipeline] Warning: The output column "PurchaseOrderDetailID" (20) on output "OLE DB Source Output" (11) and component "Source - PurchaseOrderDetail" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.

To fix this issue, all I need to do is "check" the boxes in the Advanced Editor for my component. However, I want these boxes to be "checked" automatically.

My question is: when you "check" a box next to a column name in the Advanced Editor, what exactly does that do that allows the data to be transferred? What do I need to program in order to replicate that? I want it to happen automatically, so that by default all data in all columns is transferred through my component and out the other side, so I just need to know how to do this in code, not how to replicate the UI.

I thought my component did that automatically, as I specified everything that I thought was required based on all of the documentation I read. Obviously, I am missing something. Here are the methods that I believe are involved.

Please let me know what I am missing.

int[] inputColumnBufferIndexes;   // populated in PreExecute
int[] outputColumnBufferIndexes;  // populated in PreExecute
PipelineBuffer outputBuffer;      // assigned in PrimeOutput, used in ProcessInput

public override void OnInputPathAttached(int inputID)
{
IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vCol in vInput.VirtualInputColumnCollection)
{
IDTSOutputColumn90 outCol = output.OutputColumnCollection.New();
outCol.Name = vCol.Name;

outCol.SetDataTypeProperties(vCol.DataType, vCol.Length, vCol.Precision, vCol.Scale, vCol.CodePage);
}
}

public override void PreExecute()
{
//base.PreExecute();

IDTSInput90 input = ComponentMetaData.InputCollection[0];
IDTSOutput90 output = ComponentMetaData.OutputCollection[0];

inputColumnBufferIndexes = new int[input.InputColumnCollection.Count];
outputColumnBufferIndexes = new int[output.OutputColumnCollection.Count];

for (int x = 0; x < input.InputColumnCollection.Count; x++)
{
IDTSInputColumn90 column = input.InputColumnCollection[x];
inputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
}

for (int x = 0; x < output.OutputColumnCollection.Count; x++)
{
IDTSOutputColumn90 column = output.OutputColumnCollection[x];
outputColumnBufferIndexes[x] = BufferManager.FindColumnByLineageID(output.Buffer, column.LineageID);
}
}


public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
//base.PrimeOutput(outputs, outputIDs, buffers);
if (buffers.Length != 0)
{
outputBuffer = buffers[0];
}
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
//base.ProcessInput(inputID, buffer);

if (!buffer.EndOfRowset)
{
IDTSInput90 input = ComponentMetaData.InputCollection.GetObjectByID(inputID);
while (buffer.NextRow())
{
// TODO: Examine the columns in the current row.
// Add a row to the output buffer.
outputBuffer.AddRow();
for (int x = 0; x < inputColumnBufferIndexes.Length; x++)
{
// Copy the data from the input buffer column to the output buffer column.
outputBuffer[outputColumnBufferIndexes[x]] = buffer[inputColumnBufferIndexes[x]];
}
}
}
else
{
// EndOfRowset on the input buffer is true.
// Set EndOfRowset on the output buffer.
outputBuffer.SetEndOfRowset();
}
}

View 6 Replies View Related

CTE In OLE DB Command Data Flow Transformation

Dec 20, 2006

I am trying to use a CTE in an OLE DB Command data flow transformation object. However, when I enter the CTE and corresponding query in the SqlCommand field of the OLE DB Command editor dialog, I get a syntax error. Can CTEs be used in data flow objects? I have been able to use them in an Execute SQL control flow task, but not in any data flow item.

View 7 Replies View Related
