I'm using a Script Component to load data into an Oracle DB because of a poor performance issue. Now I've found that some data goes missing during the transfer. Please see the screenshot below:
I have 2 flat files to load into a data mart via SSIS and need to implement the following: 1. How can I prevent the same file from being loaded again? 2. If wrong data has been loaded by mistake, how can I roll it back? Kindly guide me as soon as possible, as I have to implement these. JigJan
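One possible approach (a rough sketch, not from the original post): keep an audit table of file names that have already been loaded and check it from a Script Task before the data flow runs. The table name dbo.LoadedFiles and the connection string are assumptions.

using System;
using System.Data.SqlClient;

class LoadGuard
{
    // Returns true if the file has not been loaded before, and records it so the
    // next run will skip it. Call this from a Script Task and use the result to
    // drive a precedence constraint into the data flow.
    public static bool TryRegisterFile(string connStr, string fileName)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            var check = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.LoadedFiles WHERE FileName = @f", conn);
            check.Parameters.AddWithValue("@f", fileName);
            if ((int)check.ExecuteScalar() > 0)
                return false;   // already loaded once, skip it

            var insert = new SqlCommand(
                "INSERT INTO dbo.LoadedFiles (FileName, LoadedAt) VALUES (@f, GETDATE())", conn);
            insert.Parameters.AddWithValue("@f", fileName);
            insert.ExecuteNonQuery();
            return true;
        }
    }
}

For the rollback question, one common option is to run the load inside a Sequence Container with TransactionOption set to Required, or to load into a staging table first and only move the data into the final tables once the file validates.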
I am trying to load data from one database to another database within the same SQL Server. Our DBA has created a new DB and I am trying to load data into this new one. But I am getting an error in the SSIS packages when they are scheduled as a job. The failed job history was not showing much information, so I ran the package directly to get more detail on the error message.
I was able to transfer around 5 or 6 tables, but I am unable to transfer the final 2 tables. I am thinking it's a space issue. I'd appreciate it if somebody could explain what the real problem is.
Below is the error information I got in SSIS
[OLE DB Source [1]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Protocol error in TDS stream". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: An existing connection was forcibly closed by the remote host. ". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: An existing connection was forcibly closed by the remote host. ". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available. ".
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC0202009. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.
[DTS.Pipeline] Error: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0047039.
OK, we have built a data mart using SSIS, etc., for transformations and loading.
Our biggest single problem currently is loading data from an Oracle server to our SQL Server. Some tables from Oracle retrieve fine, but there is one particular table that just doesn't load fast enough (9 million records take over 12 hours). It seems that we are idling a lot and the transfer is not always running.
I am considering whether SSIS can be used to deal with a formatted report that consists of a number of pages delimited by a control character, with rows delimited by CRLF. There are a number of header records that will need to be removed on each page, and each line consists of data in fixed-width columns.
The files will be fed to us on a regular basis, so we will require an automated solution.
Can I use one of the import objects available within SSIS directly to deal with this kind of file, or will I have to bite the bullet and start coding a solution using C# or VB.NET?
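One possible middle ground (a rough sketch, not a definitive answer): pre-process the report with a small piece of C# (for example in a Script Task) that strips the page-break control character and the header records on each page, and then let an ordinary fixed-width Flat File Source read what is left. The form-feed character and the count of three header lines per page are assumptions.

using System.IO;

class ReportCleaner
{
    const char PageBreak = '\f';      // assumed page-delimiter control character
    const int HeaderLinesPerPage = 3; // assumed number of header records per page

    public static void Clean(string inputPath, string outputPath)
    {
        using (var writer = new StreamWriter(outputPath))
        {
            foreach (string page in File.ReadAllText(inputPath).Split(PageBreak))
            {
                string[] lines = page.Split(new[] { "\r\n" },
                    System.StringSplitOptions.RemoveEmptyEntries);

                // Skip the header records at the top of each page; keep the data rows.
                for (int i = HeaderLinesPerPage; i < lines.Length; i++)
                    writer.WriteLine(lines[i]);
            }
        }
    }
}

The cleaned file can then be read with a normal fixed-width Flat File Connection Manager.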
· If the record from the Detail (Spectrum) table is null, then insert the record into the Spectrum table and set status_flag to 'A' (active); otherwise update the record (replace all old values with new values) and set status_flag to 'A' (active).
· If the record from the Master (Staging) table is null, then do a soft delete: set status_flag to 'D' (deleted). (A sketch of this logic follows below.)
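A minimal sketch of this upsert / soft-delete logic as two SQL statements run from C# (for example inside a Script Task). The table and column names (dbo.Staging, dbo.Spectrum, spectrum_id, col1) and the connection string are assumptions, not the actual schema.

using System;
using System.Data.SqlClient;

class SpectrumLoad
{
    static void Main()
    {
        string connStr = "Server=.;Database=DataMart;Integrated Security=true;";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                // Update rows that already exist, insert the new ones, flag both as active.
                string upsert = @"
UPDATE d SET d.col1 = s.col1, d.status_flag = 'A'
FROM dbo.Spectrum d JOIN dbo.Staging s ON d.spectrum_id = s.spectrum_id;

INSERT INTO dbo.Spectrum (spectrum_id, col1, status_flag)
SELECT s.spectrum_id, s.col1, 'A'
FROM dbo.Staging s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Spectrum d WHERE d.spectrum_id = s.spectrum_id);";

                // Soft delete: rows no longer present in staging get status_flag 'D'.
                string softDelete = @"
UPDATE d SET d.status_flag = 'D'
FROM dbo.Spectrum d
WHERE NOT EXISTS (SELECT 1 FROM dbo.Staging s WHERE s.spectrum_id = d.spectrum_id);";

                new SqlCommand(upsert, conn, tx).ExecuteNonQuery();
                new SqlCommand(softDelete, conn, tx).ExecuteNonQuery();
                tx.Commit();
            }
        }
    }
}

The same result could also be built inside a data flow with a Lookup against the Spectrum table routing rows to an insert destination or an update command.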
I am performing ETL on an AS/400 DB2 database using MS DTS/SSIS.
I have built the connection between the AS/400 and the source to extract data from the AS/400 to staging in a data flow. I have also built the OLE DB connection for loading data to the destination as an OLE DB Destination. It connects successfully to DB2 as the destination, but when I execute the task it does not load the data and gives a provider error.
I am trying to load a file using SSIS that contains records with two different layouts in one data file, but in the flat file connection I can only specify one layout, and this causes the records with the second layout to be loaded incorrectly.
The different record layouts can be identified by the first character of the record. Example: if the field begins with "A", assign one layout; if "B", assign the second layout.
Has anybody come across this issue? If so, some guidance would be appreciated.
p.s. Does anyone have any needles I can borrow? I think sticking them in my eyes would be nicer than working with SSIS.
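One common approach (sketch only, not from the original post): read each record as a single wide column in the Flat File Source, then parse and route it in a Script Component (or a Conditional Split feeding two parsing branches). In the sketch below the output names (LayoutARows, LayoutBRows), the output columns, and the fixed-width offsets are all assumptions; the outputs are synchronous with an exclusion group, which is what generates the DirectRowTo methods. C# is shown for illustration; the VB.NET template exposes the same members.

public class ScriptMain : UserComponent
{
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        string line = Row.Line;   // the whole record read as one column

        if (line.StartsWith("A"))
        {
            // Layout A: e.g. a 10-character code followed by a 30-character name.
            Row.CodeA = line.Substring(1, 10).Trim();
            Row.NameA = line.Substring(11, 30).Trim();
            Row.DirectRowToLayoutARows();
        }
        else if (line.StartsWith("B"))
        {
            // Layout B: a different fixed-width arrangement.
            Row.CodeB = line.Substring(1, 8).Trim();
            Row.AmountB = decimal.Parse(line.Substring(9, 12).Trim());
            Row.DirectRowToLayoutBRows();
        }
    }
}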
===================================
An error occurred while objects were being copied. SSIS Designer could not serialize the SSIS runtime objects. (Microsoft Visual Studio)
===================================
Could not copy object 'Preparation SQL Task' to the clipboard. (Microsoft.DataTransformationServices.Design)
------------------------------ For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft%u00ae+Visual+Studio%u00ae+2005&ProdVer=8.0.50727.762&EvtSrc=Microsoft.DataTransformationServices.Design.SR&EvtID=SerializeComponentsFailed&LinkId=20476
------------------------------ Program Location:
at Microsoft.DataTransformationServices.Design.DtsClipboardCommandHelper.SerializeRuntimeObjects(ICollection logicalObjects) at Microsoft.DataTransformationServices.Design.ControlFlowClipboardCommandHelper.InternalMenuCopy(MenuCommand sender, CommandHandlingArgs args)
===================================
Invalid access to memory location. (Exception from HRESULT: 0x800703E6) (Microsoft.SqlServer.ManagedDTS)
------------------------------ Program Location:
at Microsoft.SqlServer.Dts.Runtime.PersistImpl.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events) at Microsoft.SqlServer.Dts.Runtime.DtsContainer.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events) at Microsoft.DataTransformationServices.Design.DtsClipboardCommandHelper.SerializeRuntimeObjects(ICollection logicalObjects)
I have an SSIS package which is extracting data from Oracle. I am using AlwaysUseDefaultCodePage = True and a default code page value of 1252.
The package succeeds in transferring values into the target table, but the decimal values are wrong. For example, I have 488.9602 or 56.908 in the Oracle table, but when they come to SQL Server they arrive as 488.0000 or 56.0000.
How can I populate the correct values from source to target?
The target table has data type decimal(26,4).
Also, when I check the preview of the source table I am able to see the correct values. This is happening during the transfer only.
I want to transfer a few Access .mdb files kept in a folder to SQL Server. I am using a Foreach File enumerator. I want to perform the following actions:
1. At the time of loading the file, check whether the file name starts with "ED". If not, move the file to an error folder (a sketch is shown after this list).
2. Count the number of records that were in the Access DB and that were transferred to the SQL table.
3. I also want to run the package from the command line. Even though I have figured that out, I want to display only limited messages on the console, e.g. file processing begins, file currently being processed, total rows transferred, file processing completed, etc.
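A rough sketch of the file-name check in item 1, as a method you could call from a Script Task. The error-folder path is an assumption; in the package the file name would come from the variable mapped by the Foreach loop.

using System;
using System.IO;

public static class FileCheck
{
    // Returns true when the file should be loaded; otherwise moves it to the error folder.
    public static bool ValidateAndRoute(string fullPath, string errorFolder)
    {
        string name = Path.GetFileName(fullPath);

        if (!name.StartsWith("ED", StringComparison.OrdinalIgnoreCase))
        {
            Directory.CreateDirectory(errorFolder);
            File.Move(fullPath, Path.Combine(errorFolder, name));
            Console.WriteLine("File {0} rejected.", name);   // limited console message
            return false;
        }

        Console.WriteLine("Processing file {0}...", name);
        return true;
    }
}

The result can be written to a Boolean package variable and used in a precedence constraint so the data flow only runs for valid files; the row count for item 2 can be captured with a Row Count transformation inside the data flow.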
I need some suggestions regarding which SSIS tasks to use for the job at hand. Basically, what I need is to extract data by joining multiple tables and then load it into a particular output database.
What are people's opinions on using SSIS as a "central repository" and a replacement for all scheduled tasks? For example, we have a bunch of servers on which we have installed services that we have written. Currently we have scheduled tasks on each machine to stop and start the services. One of my colleagues is using SSIS to run a system that runs tasks on multiple machines, by remotely running programs on other machines via scheduled tasks, and then collects the data and puts it into a database.
He's now pitching the idea that we remove the scheduled tasks on each machine and start and stop our services via SSIS so that it's centralized. In addition, we could also check for holidays in our database before starting services. Since it doesn't seem like SSIS was meant for this type of use, I'm wary of using the tool to do something it wasn't intended for.
Any opinions? I'm also worried that the learning curve for everyone is going to be too high.
I have this procedure to remove certain characters from file names.
The SQL Task has this: exec dbo.spCleanseFileName @strFileName = ?, @strFileNameCleansed = ?
The stored procedure: CREATE PROCEDURE [dbo].[spCleanseFileName](@strFileName varchar(40),@strFileNameCleansed varchar(40) output)
I have it in an SSIS package and my problem is that, after that SQL Task completes, the value of the @strFileNameCleansed variable is blank. I HAVE confirmed that the procedure DOES set the correct value inside the SP.
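A likely cause (not confirmed from the post) is that the second parameter in the Execute SQL Task's Parameter Mapping is not set to Direction = Output; with an OLE DB connection the statement also typically needs to be written as exec dbo.spCleanseFileName ?, ? OUTPUT. For comparison, here is a minimal sketch of calling the same procedure from C# (for example in a Script Task) with an explicit output parameter; the connection string is an assumption.

using System.Data;
using System.Data.SqlClient;

public static class CleanseCaller
{
    public static string Cleanse(string connStr, string fileName)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("dbo.spCleanseFileName", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@strFileName", fileName);

            var outParam = cmd.Parameters.Add("@strFileNameCleansed", SqlDbType.VarChar, 40);
            outParam.Direction = ParameterDirection.Output;   // without this the value comes back blank

            conn.Open();
            cmd.ExecuteNonQuery();
            return (string)outParam.Value;
        }
    }
}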
I have developed a big SSIS package to extract data from flat files (200+ data flows).
The situation is the following: inside the SSIS package there are a lot of validations before extracting and loading the flat files, and I'm running these validations in parallel, so that when a file arrives it enters the "validation process" and extraction of the file starts.
When I run the SSIS package from BIDS it works the way I designed it, but when I run it on the server, the tables that are loaded during the process only become "available" when the SSIS package ends. It is imperative that, during the process, when a table receives new data it becomes ready, and not just when the SSIS package finishes.
I have attached a .jpeg.
It is important for the tables to be available so that the stored procedures (outside the SSIS package) that depend on some of those tables can start working before the SSIS package ends.
Hi, I have a strange problem with SSIS packages. (Brief description: the packages select some data from DBs, write it to a CSV file, and then the CSV file is copied and renamed into a folder named after the date.) I have 5 packages scheduled to run. These jobs run perfectly when test-scheduled during the day (so it's not a user permissions problem). However, it seems the 1st package to run at night will fail. The reason I say 1st is the following: I had Package A scheduled at 11:20 PM and Package B at 11:30 PM; Package B always succeeded but Package A always failed. I would test A during the day and it would run fine (the jobs would run successfully as well as just executing the package manually).
Then I changed the time of B to 11:50 PM and it succeeds and A fails, without changing the packages themselves.
This rules out the possibility of a DB backup causing the problem (a package always succeeded at 11:30 and now one fails at the same time).
I was thinking that maybe the folder not having been created when the 1st package ran was causing the failure, but when I test-ran the job this morning it succeeded, and today's folder didn't exist either!
Hi, I have migrated a DTS package that had some ActiveX transformation tasks within data pump flow tasks.
Those parts were migrated as "DTS 2000 tasks"... so ActiveX transformation tasks aren't possible in SSIS? I know ActiveX Script tasks are, but what about transformations?
1. If I leave these encapsulated DTS 2000 tasks in the migrated SSIS package, will it run independently of the original DTS package, or does it need the old DTS package around to "call" that part from? (I hope I'm making sense here.) Is it possible to load this functionality internally into the new SSIS package?
2. How could I (if I can't do ActiveX transformation tasks) achieve this in SSIS? Can I achieve this using the script tasks in SSIS?
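For the second question, the usual replacement for an ActiveX transformation is a Script Component of type "transformation" inside a data flow, with the row-by-row logic in ProcessInputRow. A minimal sketch follows; the column names (FirstName, LastName, FullName) are assumptions, and it is shown in C# for illustration even though the SSIS 2005 script environment generates VB.NET (the members are equivalent).

public class ScriptMain : UserComponent
{
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // The kind of per-row logic an ActiveX transformation script used to do:
        // derive a full name from two input columns and upper-case it.
        if (!Row.FirstName_IsNull && !Row.LastName_IsNull)
        {
            Row.FullName = (Row.FirstName + " " + Row.LastName).ToUpper();
        }
        else
        {
            Row.FullName_IsNull = true;
        }
    }
}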
Hi all, I have zip files to be downloaded from ftp.companyname.com. The zip files get updated every day, so I have to download the newly added files. Each zip file contains 13 text files. The issues are: 1. How do I download only the new zip file? (The zip files are shown below; I am trying to use the FTP Task, but I need more info or another alternative.) 2. (Optional) How do I unzip it, take the text files, and then load them into SQL Server 2005? Each text file has to be loaded into a SQL Server table. 3. How do I automate it? I mean, every time I run the package (as a job) it has to look at the new file only. See the zip files below to understand what I am saying.
As you can see, the zip file names are in blue; they were added at different times. When you browse ftp.companyname.com you will see the blue files, so what I need is to download only the current zip file (I mean I have to download only the newly added, most recent, zip file). Thus the SSIS task has to go to this FTP server and find the newly added zip file. (Optional) If possible, after that I have to unzip it, because there are 13 text files inside, and then load them to SQL Server 2005. (A sketch of one approach is at the end of this post.)
Please help with this; the deadline is close, so if possible reply as soon as possible. Thank you for any help. Thanks,
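One way to sketch this (all names, credentials, and paths below are assumptions, and it relies on the zip file names sorting by date): list the FTP directory, pick the newest .zip, download it, and extract it so a Foreach loop can feed the 13 text files into data flows. ZipFile requires .NET 4.5 or later; on older versions a third-party library would be needed.

using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net;

class FtpLatestZip
{
    static void Main()
    {
        string host = "ftp://ftp.companyname.com/";
        var creds = new NetworkCredential("user", "password");   // assumed credentials

        // List the directory and keep only the .zip entries.
        var listReq = (FtpWebRequest)WebRequest.Create(host);
        listReq.Method = WebRequestMethods.Ftp.ListDirectory;
        listReq.Credentials = creds;
        string[] names;
        using (var resp = listReq.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            names = reader.ReadToEnd()
                          .Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries);

        // Assumption: the newest file sorts last by name (e.g. a yyyymmdd stamp in the name).
        string newest = names.Where(n => n.EndsWith(".zip", StringComparison.OrdinalIgnoreCase))
                             .OrderByDescending(n => n)
                             .First();

        // Download the newest zip.
        Directory.CreateDirectory(@"C:\Inbound");
        string localZip = Path.Combine(@"C:\Inbound", newest);
        var dlReq = (FtpWebRequest)WebRequest.Create(host + newest);
        dlReq.Method = WebRequestMethods.Ftp.DownloadFile;
        dlReq.Credentials = creds;
        using (var resp = dlReq.GetResponse())
        using (var src = resp.GetResponseStream())
        using (var dst = File.Create(localZip))
            src.CopyTo(dst);

        // Extract the 13 text files for the data flows to pick up.
        ZipFile.ExtractToDirectory(localZip, @"C:\Inbound\Extracted");
    }
}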
I'm trying to do some custom SSIS logging using event handlers, similar to the ideas provided by Jamie Thomson in the past. My problem is that when I use System::SourceID as one of the items to be logged, I can't match up the SourceID to any of the GUIDs that are displayed in the property window for the various tasks in my package.
Where is this sourceID coming from and how can I track it down?
I can't find information on how you are supposed to handle the TransactionOption setting inside a custom task.
I have a custom task that I have developed, and it basically calls a COM+ object that writes data to a database. When I have the task inside a container that has TransactionOption set to Required, and my custom task is set to Supported, if one of the X items fails to execute in my custom task I am telling my task to fail the parent, which I thought would roll back everything. But it does not.
Is there someplace that I need to write rollback code in custom SSIS tasks? If there is, I can't find any mention of it anywhere. Are there any examples out there on how to build custom SSIS tasks that support the TransactionOption parameter?
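For what it's worth, here is a hedged sketch (not a definitive answer) of how the transaction usually reaches a custom task: the DTC transaction is handed to Task.Execute as the transaction argument and has to be passed to AcquireConnection so the connection is enlisted. Work the COM+ object does on its own private connection is not enlisted automatically, which would explain why failing the parent doesn't roll it back. The connection manager name below is an assumption.

using Microsoft.SqlServer.Dts.Runtime;

public class MyCustomTask : Task
{
    public override DTSExecResult Execute(Connections connections,
        VariableDispenser variableDispenser, IDTSComponentEvents componentEvents,
        IDTSLogging log, object transaction)
    {
        try
        {
            // "MyOleDbConnection" is an assumed connection manager name.
            ConnectionManager cm = connections["MyOleDbConnection"];

            // Passing the transaction object enlists this connection in the
            // container's DTC transaction; passing null opts out of it.
            object rawConnection = cm.AcquireConnection(transaction);

            // ... perform the writes through rawConnection rather than through
            // the COM+ object's own connection ...

            cm.ReleaseConnection(rawConnection);
            return DTSExecResult.Success;
        }
        catch
        {
            // Failing the task lets the Required container roll back the DTC
            // transaction, but only for work that was enlisted in it.
            return DTSExecResult.Failure;
        }
    }
}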
I have a SQL 2000 script which I use to automatically schedule various backup tasks. This script adds and schedules a full backup once a week, differential backups nightly, and log backups hourly, in addition to a couple other maintenance tasks such as rebuilding indexes.
The idea is that a less technically savvy person can set a few variables at the top of the script (such as DB name and backup file folders) and click 'execute' to run the script and schedule the backups etc all in one go, for different clients.
Since some of the stored procs I use in this script are deprecated in SQL 2005, I am trying to replicate this functionality in SSIS, but I am having trouble figuring out how to encapsulate all of it in the same 'click and go' manner, where the user can simply execute the package and all the jobs will be scheduled without any user interaction.
Is this even possible? Where should I be looking for examples of how to do this?
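One direction to look at (a sketch under assumptions, not a recommendation): a Script Task can create the Agent jobs programmatically through SMO, which keeps the 'set a few variables and execute' experience. The server, database, folder, schedule, and job names below are all assumptions, and only one nightly differential-backup job is shown.

using System;
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Smo.Agent;

class ScheduleBackups
{
    static void Main()
    {
        string dbName = "ClientDB";                    // assumed, set per client
        string backupFolder = @"D:\Backups\ClientDB";  // assumed, set per client

        var server = new Server("(local)");
        var job = new Job(server.JobServer, dbName + " - nightly diff backup");
        job.Create();

        var step = new JobStep(job, "Differential backup")
        {
            SubSystem = AgentSubSystem.TransactSql,
            DatabaseName = "master",
            Command = string.Format(
                "BACKUP DATABASE [{0}] TO DISK = N'{1}\\{0}_diff.bak' WITH DIFFERENTIAL;",
                dbName, backupFolder)
        };
        step.Create();

        var schedule = new JobSchedule(job, "Nightly")
        {
            FrequencyTypes = FrequencyTypes.Daily,
            FrequencyInterval = 1,
            ActiveStartTimeOfDay = new TimeSpan(23, 0, 0)
        };
        schedule.Create();

        job.ApplyToTargetServer(server.Name);   // register the job with the local server
    }
}

The weekly full backup, hourly log backups, and index maintenance would be further jobs or steps built the same way.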
I'm facing a problem with my SSIS package regarding the execution order of tasks. I am using my package for the purpose of loading data from XML to staging tables in the database, and have a loop to process all XML files.
As a precondition to the loading action itself, I am running a stored procedure against the database (using the ExecuteSQL task) to check whether all staging tables are empty. The output parameter of that stored procedure is mapped to a variable I have defined in the SSIS package, so I can use it as a basis for the decision whether to run the loading action or not.
In order to test my package I added a Script Task, right after the execution of the stored procedure with a precedence constraint, that pops up a message box with the stored procedure's return value as stored in the package variable.
The problem is that, for the first iteration of the loop, the variable value presented by the message box is incorrect (it equals the initial value I assigned to the variable), and it looks like the Script Task starts before the stored procedure execution task finishes and stores its output value in the package variable.
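For reference, this is roughly what the diagnostic Script Task body would look like (the variable name User::StagingIsEmpty is an assumption, shown in C# for illustration). The variable has to be listed in the task's ReadOnlyVariables so its current value is fetched on each loop iteration, and the precedence constraint from the Execute SQL Task should be the only path into this task.

public void Main()
{
    // Read the package variable that the Execute SQL Task's output parameter maps to.
    object value = Dts.Variables["User::StagingIsEmpty"].Value;

    System.Windows.Forms.MessageBox.Show("StagingIsEmpty = " + value);

    Dts.TaskResult = (int)ScriptResults.Success;
}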