Processing Last Created/modified Files From A Location Using SSIS Package
Mar 12, 2008
We have a scenario where we need to process only the last created/modified file from a location using an SSIS package, even though the folder contains multiple files with the same name and extension.
Kindly respond if anyone has worked on this.
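A minimal sketch, in C#, of one way to pick the newest file from a folder (for example inside a Script Task): enumerate the files and keep the one with the latest LastWriteTime (use CreationTime instead if "created" is what matters). The folder path and file pattern below are placeholders, not values from the original question.
Code Snippet
using System;
using System.IO;

class LatestFileFinder
{
    // Return the most recently modified file in a folder, or null if the folder is empty.
    static FileInfo GetLatestFile(string folder, string pattern)
    {
        FileInfo latest = null;

        foreach (FileInfo candidate in new DirectoryInfo(folder).GetFiles(pattern))
        {
            if (latest == null || candidate.LastWriteTime > latest.LastWriteTime)
                latest = candidate;
        }

        return latest;
    }

    static void Main()
    {
        FileInfo file = GetLatestFile(@"C:\Incoming", "*.txt");  // placeholder folder and pattern
        Console.WriteLine(file == null ? "No files found." : file.FullName);
    }
}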
Regards,
Sajesh
View 7 Replies
Jun 11, 2015
Is there a script to find the creation date and modified date of all files located in a path?
I want to write a few custom messages before I delete some files from a path.
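A minimal sketch, assuming a placeholder folder path, that prints the creation and modified date of each file plus a custom message before the (commented-out) delete:
Code Snippet
using System;
using System.IO;

class FileDateReport
{
    static void Main()
    {
        string path = @"C:\Archive";  // placeholder: the folder you are about to clean up

        foreach (string file in Directory.GetFiles(path))
        {
            FileInfo fi = new FileInfo(file);
            Console.WriteLine("{0} | created {1} | modified {2}",
                fi.Name, fi.CreationTime, fi.LastWriteTime);

            // Custom message before the delete.
            Console.WriteLine("About to delete " + fi.FullName);
            // File.Delete(file);
        }
    }
}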
View 9 Replies
View Related
Aug 23, 2007
Hi,
Is there a way to find out when an SSIS package was modified?
Thanks,
Siva.
View 6 Replies
View Related
Oct 27, 2015
Is it possible to trace who modified an SSIS package and when it was modified?
View 2 Replies
View Related
May 18, 2015
How do we load files with a similar format from two different locations into the same database with the same SSIS package?
Let's say:
Location 1: C:\LoadFiles\Cust1\APP_123445.txt
Location 2: D:\LoadFiles\cust2\VDD_543121.txt
Currently we have one SSIS package which loads and processes files from C:\LoadFiles\Cust1 only. We have to modify the existing package to load files from location 2 (D:\LoadFiles\cust2) as well. Also, while loading, the package should assign a value to the existing column CustID depending upon the file name. File names always start with APP_ in the first location and VDD_ in the second location; see the sketch after the list below.
Assign CustID as 100 if the file name starts with APP_
Assign CustID as 200 if the file name starts with VDD_
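A minimal sketch of the CustID mapping described above, written as a small C# helper (the class and method names are illustrative, not part of the existing package):
Code Snippet
using System;
using System.IO;

static class CustIdMapper
{
    // Derive CustID from the file name prefix: APP_ -> 100, VDD_ -> 200.
    public static int GetCustId(string filePath)
    {
        string fileName = Path.GetFileName(filePath);

        if (fileName.StartsWith("APP_", StringComparison.OrdinalIgnoreCase))
            return 100;
        if (fileName.StartsWith("VDD_", StringComparison.OrdinalIgnoreCase))
            return 200;

        return -1;  // unknown prefix; handle as an error or a default
    }
}
For example, CustIdMapper.GetCustId(@"C:\LoadFiles\Cust1\APP_123445.txt") returns 100. In the package itself the same rule could live in a Derived Column expression or a Script Component.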
View 1 Replies
View Related
Apr 20, 2007
I have to copy files from a SharePoint or extranet location (basically https://.....) to my local server using SSIS.
Any kind of early help would be really great.
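A minimal sketch of one way to do the copy from a Script Task, assuming plain HTTPS access works for the SharePoint/extranet URL; the URL, credentials, and local path are placeholders:
Code Snippet
using System.Net;

class HttpsFileCopy
{
    static void Main()
    {
        string remoteUrl = "https://extranet.example.com/docs/report.xls";  // placeholder URL
        string localPath = @"D:\Staging\report.xls";                        // placeholder local path

        using (WebClient client = new WebClient())
        {
            // Run as the account executing the package; swap in an explicit
            // NetworkCredential if the site needs different credentials.
            client.Credentials = CredentialCache.DefaultNetworkCredentials;
            client.DownloadFile(remoteUrl, localPath);
        }
    }
}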
View 1 Replies
View Related
Dec 5, 2007
Hi friends,
I have transferred my DTS packages from SQL 2000 to SQL 2005 by directly migrating the rows from the sysdtspackages table on SQL Server 2000 to the sysdtspackages table on SQL Server 2005. Now I am able to see all my SQL 2000 DTS packages in SQL Server 2005 Management Studio under Management ---> Legacy ---> Data Transformation Services, and I have all the corresponding records in the sysdtspackages table of the msdb database on SQL Server 2005.
Now I have to schedule a job to execute these DTS packages. In the job schedule window, when I try to select the SSIS Package Store as the package source and go to SSIS ---> MSDB, I am not able to find my DTS packages. Where have my DTS packages gone? How can I schedule them?
I can find another table named sysdtspackages90 in the msdb database. Do I have to migrate the data from sysdtspackages to sysdtspackages90?
Please help out.
Regards
Arvind
View 6 Replies
View Related
Jun 27, 2007
Hi,
I just created my first SSIS package.
For now, I can only run it inside Visual Studio.
I want this package to run every day as a scheduled task. How do I do that?
When I double-click my .dtsx I get the Execute Package Utility. Where can I set my package to run every day at a certain time?
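The usual answer is a SQL Server Agent job with an SSIS step, but as a minimal sketch of what such a schedule ends up running, here is the same package invoked through dtexec from C# (the package path is a placeholder):
Code Snippet
using System.Diagnostics;

class RunPackage
{
    static void Main()
    {
        // /F runs a package stored as a .dtsx file; an Agent job step does the equivalent.
        ProcessStartInfo psi = new ProcessStartInfo(
            "dtexec", "/F \"C:\\Packages\\MyPackage.dtsx\"");  // placeholder path
        psi.UseShellExecute = false;

        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
            System.Console.WriteLine("dtexec exit code: " + p.ExitCode);  // 0 = success
        }
    }
}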
Thank you.
View 4 Replies
View Related
Jul 2, 2007
Hi All
I am facing a problem while processing my SSIS package.
The error is:
"Invalid delimited data. Text qualifier must be followed by a column delimiter (except for the last column)."
The text qualifier is the double-quote character ("). Text fields are supposed to get a double quote at the beginning of the string and one at the end. The column delimiter is the upright bar, or pipe, character (|).
Some of the descriptions in the table column have double quotes embedded within the text string. When SSIS encounters one of these embedded quotes, it thinks that is the end of the text string and expects to find the column delimiter character next. It doesn't, because the " is embedded between other alphabetic characters, so it raises an error.
For example, my column value might look like Test"String"One.
What the old DTS did:
DTS turned each embedded " into "". The resulting text string then looked like this:
"Test ""strings"" one".
When loading the same data into a database using DTS, DTS automatically recognized the "" in the string and changed it back to ", so in the target the string looked like this: test "string" one. No problem.
SSIS, however, does not allow and does not support this "doubling" of embedded " when you are using " as the text qualifier. So it exports the sample string above like this (I added the column delimiters as well):
|"test "string" one"|.
That then causes the error
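One workaround is to pre-process the text yourself and double the embedded qualifier before the flat file destination writes it; a minimal sketch, assuming the cleanup runs in a Script Component over a description column:
Code Snippet
using System;

class QualifierEscape
{
    // Double each embedded text qualifier, e.g. Test"String"One -> Test""String""One,
    // which matches what the old DTS export produced.
    static string EscapeQualifier(string value)
    {
        return value == null ? null : value.Replace("\"", "\"\"");
    }

    static void Main()
    {
        Console.WriteLine(EscapeQualifier("Test\"String\"One"));
        // Output: Test""String""One
    }
}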
Please help me out
T.I.A
View 2 Replies
View Related
Oct 24, 2007
Hi All,
I created an SSIS package to extract data from a flat file source and load it into a table in a database. After I created the package I checked it into source control (Perforce).
The problem is that once a month a new flat source file comes in and the data should be updated.
Once the new flat file comes in, is there any way the SSIS package can identify the path of the flat file and execute automatically? In the flat file source only the data will change, not the location or data types or anything.
Can I use parameters to do that?
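A minimal sketch of driving the file path from outside the package with the SSIS runtime API; the package path, connection manager name, and file path below are placeholders:
Code Snippet
using Microsoft.SqlServer.Dts.Runtime;

class RunWithNewFile
{
    static void Main()
    {
        Application app = new Application();

        Package pkg = app.LoadPackage(@"C:\Packages\LoadFlatFile.dtsx", null);  // placeholder path

        // Repoint the flat file connection manager at this month's file.
        pkg.Connections["FlatFile"].ConnectionString = @"\\share\drop\data_200710.csv";

        DTSExecResult result = pkg.Execute();
        System.Console.WriteLine(result);
    }
}
A package configuration or a /SET override on dtexec can achieve the same thing without extra code.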
Thanks
View 6 Replies
View Related
Jul 17, 2006
We are trying to import data from a .csv file which sits on a shared location. This package runs fine when we run it from the designer, but we are having a problem when we run it at run time (accessing it through a service). The same package runs fine if the file is on the same server.
Has anyone run into this issue before? I appreciate any help in resolving it.
--------Log----
#Fields: event,computer,operator,source,sourceid,executionid,starttime,endtime,datacode,databytes,message
OnPreValidate,SMSPAD1125M,RFCGKommar1,GIRI_ETL_XREF,{4D456D56-B35F-4FCC-8A89-2D03AC545C76},{5395DAA0-DB96-49CA-BDE7-0DA5C623A2B0},7/17/2006 10:46:42 AM,7/17/2006 10:46:42 AM,0,0x,(null)
OnError,SMSPAD1125M,RFCGKommar1,GIRI_ETL_XREF,{4D456D56-B35F-4FCC-8A89-2D03AC545C76},{5395DAA0-DB96-49CA-BDE7-0DA5C623A2B0},7/17/2006 10:46:42 AM,7/17/2006 10:46:42 AM,-1073659875,0x,Connection "FlatFile" failed validation.
---------------------------------
Thanks,
-G
View 3 Replies
View Related
Oct 22, 2007
Hi,
I created an SSIS package programmatically, based on a few threads here in this forum. This package just has a data flow task; during the data transfer, for every 1000 rows or so, I want to update the status in a table in the database.
How do I achieve this?
As of now I just have a source and a destination, no transformations in between. I'm not sure if the Row Count transformation will help; when I tried it using an OnProgress event handler it always showed up as zero.
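A minimal sketch of the per-1000-rows status update, written as a helper you could call once per row from a Script Component's Input0_ProcessInputRow; the status table, load id, and connection string are placeholders:
Code Snippet
using System.Data.SqlClient;

public class StatusUpdater
{
    private int rowCount;
    private const string StatusConnection =
        "Data Source=.;Initial Catalog=Staging;Integrated Security=SSPI;";  // placeholder

    // Call once per row; writes a status update every 1000 rows.
    public void RowProcessed()
    {
        rowCount++;
        if (rowCount % 1000 != 0)
            return;

        using (SqlConnection conn = new SqlConnection(StatusConnection))
        using (SqlCommand cmd = new SqlCommand(
            "UPDATE dbo.LoadStatus SET RowsLoaded = @rows WHERE LoadId = @id", conn))  // placeholder table
        {
            cmd.Parameters.AddWithValue("@rows", rowCount);
            cmd.Parameters.AddWithValue("@id", 1);  // placeholder load id
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}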
Thanks
View 6 Replies
View Related
May 22, 2008
Hello everyone,
This is a new level of complexity for me...
The boss wants:
1. To control an SSIS package - start it, check its status, and emergency stop + rollback - from a web page. Does anyone know of an example or good articles to start with?
2. To have one of the iterations of the data invoke and use a (third-party) COM object. It will value the items and change a field.
I can always do #2 as a second step, but I need all the help I can get on #1.
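For #1, a minimal sketch of starting a package stored in msdb from web or application code; the server name and package path are placeholders, and status reporting plus stop/rollback need more plumbing (run the package on a background thread, design it to be restartable, and use transactions or compensating steps for the rollback):
Code Snippet
using Microsoft.SqlServer.Dts.Runtime;

class WebPackageLauncher
{
    // Load a package from the SSIS msdb store and run it synchronously.
    public static DTSExecResult RunFromMsdb()
    {
        Application app = new Application();

        // Placeholder server and package path; null user name/password = Windows authentication.
        Package pkg = app.LoadFromSqlServer(
            @"\MyFolder\MyPackage", "MYSSISSERVER", null, null, null);

        return pkg.Execute();
    }
}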
Thank you,
View 1 Replies
View Related
Jul 10, 2015
I have an SSIS package that moves data from a new CSV file in a shared location to a SQL Server database table. However, I need this agent job to be triggered whenever a new CSV file gets added to the shared location.
What is the best strategy to do this, keeping in mind that while the package is running two new CSV files may come in, and the package should copy data from both files?
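A minimal sketch of one trigger strategy: a small watcher process (outside SSIS) that starts the existing Agent job via msdb.dbo.sp_start_job whenever a new .csv lands. The share, job name, and connection string are placeholders; having the package itself loop over every *.csv present (Foreach File enumerator) covers the case where two files arrive while a run is already in progress.
Code Snippet
using System;
using System.Data.SqlClient;
using System.IO;

class CsvDropWatcher
{
    static void Main()
    {
        FileSystemWatcher watcher = new FileSystemWatcher(@"\\server\drop", "*.csv");  // placeholder share
        watcher.Created += OnCsvCreated;
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for new CSV files. Press Enter to exit.");
        Console.ReadLine();
    }

    static void OnCsvCreated(object sender, FileSystemEventArgs e)
    {
        try
        {
            using (SqlConnection conn = new SqlConnection(
                "Data Source=MYSQLSERVER;Initial Catalog=msdb;Integrated Security=SSPI;"))  // placeholder
            using (SqlCommand cmd = new SqlCommand("msdb.dbo.sp_start_job", conn))
            {
                cmd.CommandType = System.Data.CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@job_name", "Load CSV Files");  // placeholder job name
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
        catch (SqlException)
        {
            // sp_start_job raises an error if the job is already running; in that case the
            // running job's file loop will pick up the new file anyway.
        }
    }
}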
View 5 Replies
View Related
Oct 8, 2007
Brief overview... Running Windows Server 2003 Enterprise 64-bit - all service packs and patches current
SQL Server 2005 Enterprise Edition 64 bit Build Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages, nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. Help!
View 20 Replies
View Related
Sep 23, 2007
Hi everyone,
I wanted to thank everyone for posting a ton of valuable information in these forums. I also want to thank all the moderators that have been replying with really insightful help!
I am trying to programmatically create an SSIS package to take .CSV data and put it into SQL Server 2005. I am assuming that this is a pretty common scenario.
I have used many of the examples in this forum as well as heavily borrowing from this example http://www.codeproject.com/csharp/Digging_SSIS_object_model.asp written by Moim Hossain.
I can get my package to create and execute properly, but no data is being written to the SQL Server table. This has puzzled me for the last 2 days!
I know the issue isn't with the server itself, because I tested it by graphically creating a test SSIS package and it transfers the .CSV data to the table perfectly.
Would anyone know why this would happen? The execution results are returning success, but no data is written to the table!
Could anyone please provide insight as to what my issue may be?
Thanks in advance!
Code Snippet
using System;
using System.IO;
using System.Data.SqlClient;
using System.Collections.Generic;
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using PipeLineWrapper = Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using RuntimeWrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper;
namespace SumCodeApp
{
class SumCodeApp
{
// Variables.
private Package package;
private ConnectionManager flatFileConnectionManager;
private ConnectionManager destinationDatabaseConnectionManager;
private Executable dataFlowTask;
private List<String> srcColumns;
int file_count;
SqlConnection connection;
String folder_path;
String username;
String password;
String DB_server;
String catalog;
// Default Constructor.
public SumCodeApp()
{
}
// Constructor taking in user info.
public SumCodeApp(String folder_path, String username, String password,
String DB_server, String catalog)
{
this.folder_path = folder_path;
this.username = username;
this.password = password;
this.DB_server = DB_server;
this.catalog = catalog;
}
private void CreatePackage()
{
package = new Package();
package.CreationDate = DateTime.Now;
package.ProtectionLevel = DTSProtectionLevel.DontSaveSensitive;
package.Name = "SumCode Package";
package.Description = "Upload the SumCode files to the database";
package.DelayValidation = true;
package.PackageType = DTSPackageType.DTSDesigner90;
}
private void CreateFlatFileConnection()
{
String flatFileName = @".\1105.csv";  // path to the source CSV file
String flatFileMoniker = "FLATFILE";
flatFileConnectionManager = package.Connections.Add(flatFileMoniker);
flatFileConnectionManager.Name = "SSIS Connection Manager for Files";
flatFileConnectionManager.Description = String.Concat("SSIS Connection Manager");
flatFileConnectionManager.ConnectionString = flatFileName;
// Set some common properties of the connection manager object.
//flatFileConnectionManager.Properties["ColumnNamesInFirstRow"].SetValue(flatFileConnectionManager, false);
flatFileConnectionManager.Properties["Format"].SetValue(flatFileConnectionManager, "Delimited");
flatFileConnectionManager.Properties["TextQualifier"].SetValue(flatFileConnectionManager, "\"");
flatFileConnectionManager.Properties["RowDelimiter"].SetValue(flatFileConnectionManager, "\r\n");
flatFileConnectionManager.Properties["DataRowsToSkip"].SetValue(flatFileConnectionManager, 0);
// Create the source columns into the connection manager.
CreateSourceColumns();
}
private void CreateSourceColumns()
{
// Get the actual connection manager instance
RuntimeWrapper.IDTSConnectionManagerFlatFile90 flatFileConnection = flatFileConnectionManager.InnerObject as RuntimeWrapper.IDTSConnectionManagerFlatFile90;
RuntimeWrapper.IDTSConnectionManagerFlatFileColumn90 column;
RuntimeWrapper.IDTSName90 name;
// Fill the source column collection.
srcColumns = new List<String>();
srcColumns.Add("CreateDate");
srcColumns.Add("CorpID");
srcColumns.Add("SumCodeID");
srcColumns.Add("Priority");
srcColumns.Add("SumCodeAbv");
srcColumns.Add("SumCodeDesc");
srcColumns.Add("SumCodeGroupID");
foreach (String colName in srcColumns)
{
column = flatFileConnection.Columns.Add();
// The last column ends at the row delimiter; every other column ends at a comma.
// ({CR}{LF} and Comma {,} are the designer's display names for these delimiters.)
if (srcColumns.IndexOf(colName) == (srcColumns.Count - 1))
column.ColumnDelimiter = "\r\n";
else
column.ColumnDelimiter = ",";
name = (RuntimeWrapper.IDTSName90)column;
name.Name = colName;
column.TextQualified = true;
column.ColumnType = "Delimited";
column.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
column.ColumnWidth = 0;
column.MaximumWidth = 255;
column.DataPrecision = 0;
column.DataScale = 0;
}
}
private void CreateDestinationDatabaseConnection()
{
destinationDatabaseConnectionManager = package.Connections.Add("OLEDB");
destinationDatabaseConnectionManager.Name = "Destination Connection - SumCodeCorpGroup";
destinationDatabaseConnectionManager.Description = "Connection to the temporary table SumCodCorpGroup";
destinationDatabaseConnectionManager.ConnectionString = "Data Source=DIVWL-356KCB1;Initial Catalog=SumCode;Provider=SQLOLEDB;Persist Security Info=True;User ID=sum;Password=code";
}
public class Column
{
private String name;
private Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType dataType;
private int length;
private int precision;
private int scale;
private int codePage = 0;
public String Name
{
get { return name; }
set { name = value; }
}
public Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType DataType
{
get { return dataType; }
set { dataType = value; }
}
public int Length
{
get { return length; }
set { length = value; }
}
public int Precision
{
get { return precision; }
set { precision = value; }
}
public int Scale
{
get { return scale; }
set { scale = value; }
}
public int CodePage
{
get { return codePage; }
set { codePage = value; }
}
}
private Column GetTargetColumnInfo(string sourceColumnName)
{
Column cl = new Column();
if (sourceColumnName.Contains("CreateDate"))
{
cl.Name = "CreateDate";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (
sourceColumnName.Contains("CorpID"))
{
cl.Name = "CorpID";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (sourceColumnName.Contains("SumCodeID"))
{
cl.Name = "SumCodeID";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (sourceColumnName.Contains("Priority"))
{
cl.Name = "Priority";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (sourceColumnName.Contains("SumCodeAbv"))
{
cl.Name = "SumCodeAbv";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (sourceColumnName.Contains("SumCodeDesc"))
{
cl.Name = "SumCodeDesc";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
else if (sourceColumnName.Contains("SumCodeGroupID"))
{
cl.Name = "SumCodeGroupID";
cl.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
cl.Precision = 0;
cl.Scale = 0;
cl.Length = 255;
cl.CodePage = 1252;
}
return cl;
}
private void CreateDataFlowTask()
{
String dataFlowTaskMoniker = "DTS.Pipeline";
dataFlowTask = package.Executables.Add(dataFlowTaskMoniker);
}
public void ImportFile(String directory_path)
{
// Create the package.
CreatePackage();
// Create Flat File Source Connection.
CreateFlatFileConnection();
// Create Database Destination Connection.
CreateDestinationDatabaseConnection();
// Create DataFlowTask.
CreateDataFlowTask();
// Create the DataFlowTask
PipeLineWrapper.IDTSComponentMetaData90 sourceComponent = ((dataFlowTask as TaskHost).InnerObject as PipeLineWrapper.MainPipe).ComponentMetaDataCollection.New();
sourceComponent.Name = "Source File Component";
sourceComponent.ComponentClassID = "DTSAdapter.FlatFileSource";
PipeLineWrapper.CManagedComponentWrapper managedFlatFileInstance = sourceComponent.Instantiate();
managedFlatFileInstance.ProvideComponentProperties();
sourceComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatFileConnectionManager.ID;
sourceComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatFileConnectionManager);
managedFlatFileInstance.AcquireConnections(null);
managedFlatFileInstance.ReinitializeMetaData();
Dictionary<String, int> outputColumnLineageIDs = new Dictionary<String, int>();
PipeLineWrapper.IDTSExternalMetadataColumn90 exOutColumn = null;
foreach (PipeLineWrapper.IDTSOutputColumn90 outColumn in sourceComponent.OutputCollection[0].OutputColumnCollection)
{
exOutColumn = sourceComponent.OutputCollection[0].ExternalMetadataColumnCollection[outColumn.Name];
managedFlatFileInstance.MapOutputColumn(sourceComponent.OutputCollection[0].ID, outColumn.ID, exOutColumn.ID, true);
outputColumnLineageIDs.Add(outColumn.Name, outColumn.ID);
}
managedFlatFileInstance.ReleaseConnections();
String a = sourceComponent.RuntimeConnectionCollection[0].Name.ToString();
String b = sourceComponent.OutputCollection[0].Name;
String c = sourceComponent.OutputCollection[0].Description;
String d = sourceComponent.OutputCollection[0].OutputColumnCollection.Count.ToString();
// Create DataFlowTask Destination Component.
PipeLineWrapper.IDTSComponentMetaData90 destinationComponent = ((dataFlowTask as TaskHost).InnerObject as PipeLineWrapper.MainPipe).ComponentMetaDataCollection.New();
destinationComponent.Name = "OLEDB SQL Connection";
destinationComponent.ComponentClassID = "DTSAdapter.OLEDBDestination";
PipeLineWrapper.CManagedComponentWrapper managedOleInstance = destinationComponent.Instantiate();
managedOleInstance.ProvideComponentProperties();
// Create a path and attach the output of the source to the input of the destination.
PipeLineWrapper.IDTSPath90 path = ((dataFlowTask as TaskHost).InnerObject as PipeLineWrapper.MainPipe).PathCollection.New();
path.AttachPathAndPropagateNotifications(sourceComponent.OutputCollection[0], destinationComponent.InputCollection[0]);
destinationComponent.RuntimeConnectionCollection[0].ConnectionManagerID = destinationDatabaseConnectionManager.ID;
destinationComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(destinationDatabaseConnectionManager);
managedOleInstance.SetComponentProperty("AccessMode", 0);
managedOleInstance.SetComponentProperty("OpenRowset", "[SumCode].[dbo].[SumCodeCorpGroup]");
managedOleInstance.SetComponentProperty("AlwaysUseDefaultCodePage", false);
managedOleInstance.SetComponentProperty("DefaultCodePage", 1252);
managedOleInstance.SetComponentProperty("FastLoadKeepIdentity", false); // Fast load
managedOleInstance.SetComponentProperty("FastLoadKeepNulls", false);
managedOleInstance.SetComponentProperty("FastLoadMaxInsertCommitSize", 0);
managedOleInstance.SetComponentProperty("FastLoadOptions","TABLOCK,CHECK_CONSTRAINTS");
managedOleInstance.AcquireConnections(null);
managedOleInstance.ReinitializeMetaData();
PipeLineWrapper.IDTSInput90 input = destinationComponent.InputCollection[0];
PipeLineWrapper.IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (PipeLineWrapper.IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
//if (outputColumnLineageIDs.ContainsKey(vColumn.LineageID.ToString()))
//{
managedOleInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, Microsoft.SqlServer.Dts.Pipeline.Wrapper.DTSUsageType.UT_READONLY);
//}
}
List<String> tmp = new List<String>();
foreach(PipeLineWrapper.IDTSInputColumn90 inc in destinationComponent.InputCollection[0].InputColumnCollection)
{
tmp.Add(inc.Name);
}
PipeLineWrapper.IDTSExternalMetadataColumn90 exColumn;
foreach (PipeLineWrapper.IDTSInputColumn90 inColumn in destinationComponent.InputCollection[0].InputColumnCollection)
{
exColumn = destinationComponent.InputCollection[0].ExternalMetadataColumnCollection[inColumn.Name];
Column mappedColumn = GetTargetColumnInfo(exColumn.Name);
String destName = mappedColumn.Name;
exColumn.Name = destName;
managedOleInstance.MapInputColumn(destinationComponent.InputCollection[0].ID, inColumn.ID, exColumn.ID);
}
managedOleInstance.ReleaseConnections();
DTSExecResult result = package.Execute();
a = "0";
}
}
}
View 3 Replies
View Related
Oct 10, 2007
Hi
Can anyone please tell me if there is any possible way to identify the table modified date?
I have checked the table created date from sysobjects and by right-clicking Properties. My requirement is to identify the exact date of table modification and of column creation/alter operations.
Is there any such provision in SQL Server 2000 or 2005? My application is on SQL Server 2000.
I need to confirm this because some database structure modification has affected my application and is causing data loss; I need to compare the date of the structural change of the table with the date the data was lost.
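On SQL Server 2005 the last DDL change per table can at least be read from the catalog views (SQL Server 2000 has no equivalent column, so there a DDL trace or log reader would be needed). A minimal sketch with a placeholder connection string:
Code Snippet
using System;
using System.Data.SqlClient;

class TableModifyDates
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;"))  // placeholder
        using (SqlCommand cmd = new SqlCommand(
            "SELECT name, create_date, modify_date FROM sys.objects WHERE type = 'U' ORDER BY modify_date DESC",
            conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} created {1}, last altered {2}",
                        reader.GetString(0), reader.GetDateTime(1), reader.GetDateTime(2));
                }
            }
        }
    }
}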
can any one help
View 8 Replies
View Related
Jul 19, 2007
Hello
The default location where databases are created is almost full.
Is there any way to change the default location where the .mdf and .ldf files are created when I create a new database? I know I can do this when I create a new database and select where I want it to be, but can I make it default to a different directory automatically, so no human intervention is needed?
tia
r/P
View 1 Replies
View Related
Oct 10, 2007
Hi,
I have created a job that will execute an SSIS package which will unzip some zip files. For unzipping we are using WinZip. In the package I have used a .NET script task for unzipping; this script uses WZUNZIP. When I execute the package directly it unzips all the zip files, but when I execute the job that runs the SSIS package, it just keeps running and does not unzip the zip files, so finally I stopped the job.
Can you please help me resolve this issue?
View 11 Replies
View Related
Jan 13, 2004
I'm an Oracle DBA and just getting used to MS SQL Server. I noticed that the Windows Explorer "date modified" field for my database files (.MDF files) doesn't change much even though there is activity going on. Sometimes it doesn't change for a week.
Is this the expected behavior? Could it be that no data is changing in my database? ( I find that hard to believe)
Thanks for any insights.
View 1 Replies
View Related
Jan 31, 2013
Is it possible to delete a source file (*.txt) when an SSIS package is done with it?
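Yes; besides the built-in File System Task (Delete File operation), a minimal sketch of what a Script Task at the end of the control flow would do, with a placeholder file path:
Code Snippet
using System.IO;

class CleanupSourceFile
{
    static void Main()
    {
        string sourceFile = @"C:\Incoming\data.txt";  // placeholder: the file the package just processed

        if (File.Exists(sourceFile))
        {
            File.Delete(sourceFile);
        }
    }
}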
View 3 Replies
View Related
Nov 11, 2014
I am working on the FTP Task in an SSIS package. I have to get files from FTP whose names are like 20141110.txt. I want to download the file for a particular date from FTP. How do I set the expression for the remote path?
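A minimal sketch of building the dated remote path in code and handing it to the FTP Task through a variable used in its RemotePath expression; the folder and variable names are placeholders (an SSIS expression on the variable itself, built from GETDATE() and (DT_WSTR, ...) casts, achieves the same result):
Code Snippet
using System;

class RemotePathBuilder
{
    // Build a remote path like /incoming/20141110.txt for a given date.
    static string BuildRemotePath(DateTime fileDate)
    {
        return "/incoming/" + fileDate.ToString("yyyyMMdd") + ".txt";  // placeholder folder
    }

    static void Main()
    {
        // In a Script Task this value would be assigned to a package variable,
        // e.g. a hypothetical User::RemotePath, which the FTP Task's expression then uses.
        Console.WriteLine(BuildRemotePath(new DateTime(2014, 11, 10)));
        // Output: /incoming/20141110.txt
    }
}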
View 3 Replies
View Related
Sep 7, 2007
I have three machine:
S: Running SQL Server Express
V: Running SSIS package in VS.Net
F: Shared folder hosting Excel files
And an openrowset SQL statement: select * from openrowset(..... \Fexcel.xls....). This statement can be run in SQL Server Management Studio connecting to S using my Windows logon (integrated security) without any problem.
However, the same SQL running inside the SSIS package (integrated security using my Windows account) gets the following error:
Error: 0x0 at Check headers: OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" returned message "The Microsoft Jet database engine cannot open the file '\Fexcel.xls'. It is already opened exclusively by another user, or you need permission to view its data.".
Error: 0xC002F210 at Check headers, Execute SQL Task: Executing the query "....openrowset....." failed with the following error: "Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
(My Windows account is an administrator of Windows and sysadmin of SQL Server Express on S.)
View 1 Replies
View Related
May 28, 2007
Hi,
I have a problem with the WMI Event Watcher task.
I've made a query like the one in MSDN (SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA "CIM_DirectoryContainsFile" and TargetInstance.GroupComponent= "Win32_Directory.Name="e:\\temp""). I have 20 similar tasks watching different folders, but when there are too many tasks in parallel, it doesn't work anymore. I changed the number of executables to 128 (in the general properties of the package, to test) but it doesn't seem to work.
I don't understand why it works when there are only 1 or 2 tasks (6 seems to be the maximum) and not when there are more than 6.
Could you help me with this issue?
Configuration : Windows Server 2003, SQL Server 2005, SSIS, Sql Server Agent
Thanks a lot.
Julien.
View 3 Replies
View Related
Jul 3, 2007
I have a SQL2000 DTS package that executes vbscript to loop through a recordset which:
- runs a stored procedure and populated tables
- builds a recordset from the populated tables to write records to an Excel file
- writes status to text files with either the error or success notices
I use FSO to set up the success and error files, but the scheduled job in SQL2005 which calls the SSIS package returns the following error:
"Retrieving the file name for a component failed with error code 0x0015F74C"
I can successfully run this (VBScript) both in the SSIS package via the BI Development Studio and in MS Access (exactly the same code in both), but not as an SSIS package called in a scheduled job in SQL 2005.
I am at an impasse with this... any and all assistance would be greatly appreciated.
TIA,
Bob
View 1 Replies
View Related
Jun 30, 2015
We have several hundred very simple ETL SSIS 2K8 package files (*.dtsx).
I'd like to be able to interrogate them to determine source and destination fields.
There's no great need to map source to dest or to extract data types.
So far, the most promising candidate is to load them using OPENROWSET into an XML column in a SQL Server table. No problem there, but querying using OPENXML has me stumped.
The package files will change a couple of times per year, so the process needs to be repeatable with minimal manual intervention.
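An alternative to OPENXML is to read the .dtsx files directly from code; a minimal sketch that lists connection managers and data flow components, assuming the pre-2012 package format (the element and attribute names used here should be verified against an actual file):
Code Snippet
using System;
using System.Xml;

class DtsxInspector
{
    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(@"C:\Packages\SomePackage.dtsx");  // placeholder path

        XmlNamespaceManager ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("DTS", "www.microsoft.com/SqlServer/Dts");

        foreach (XmlNode conn in doc.SelectNodes(
            "//DTS:ConnectionManager/DTS:Property[@DTS:Name='ObjectName']", ns))
        {
            Console.WriteLine("Connection manager: " + conn.InnerText);
        }

        // Pipeline (data flow) components sit in un-prefixed elements under ObjectData.
        foreach (XmlNode comp in doc.SelectNodes("//component", ns))
        {
            XmlAttribute name = comp.Attributes["name"];
            if (name != null)
                Console.WriteLine("Component: " + name.Value);
        }
    }
}
Drilling from each component into its input and output column collections would give the source and destination field names.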
View 3 Replies
View Related
Oct 19, 2010
I have a SSIS package which reads an excel file and loads data into a table using script component(C#) as a source. The package runs without any errors when I manually run it on my machine and on the server. But the package fails when run as a SQL Server Agent job.
I tried all the possible fixes I found on the web but still can't get it to work.
View 14 Replies
View Related
Mar 27, 2007
Does anyone know of a good article pertaining to where you should locate your data and log files (in SQL Server 2005)?
I read an article several years ago stating that log files should be on a separate RAID 1 and data on a separate RAID 5.
Anyway, any help is appreciated.
View 1 Replies
View Related
Jun 18, 2007
Hello,
Here is the following mind-numbing problem I have (and wished I did not have to experience)
A set of 2 SSIS packages is scheduled to run in a SQL Server Agent job on the same server. Both packages use an environment variable that points to a package configuration file. In this file there are 2 connections: one to a SQL Server with a SQL Server user id and password, another to an AS400 DB2. Both packages are deployed on the same server in the SSIS server under MSDB SQL storage, with the package protection level set to 'rely on server storage and roles for access control'.
Today the connection to the AS400 needed to change; it is now connected to another AS400 server. The packages have been modified to use the new connection. In the configuration file the old connection has been commented out and the new connection string was added; the connection itself was given a new, more meaningful name in the packages.
Running the packages from visual studio 2005 works. After testing I have deployed the packages to SSIS server in MSDB storage.
Now when I start the sql server agent job that runs these packages, the job quits with an error, in the history I see an error message that it failed to connect to the sql server with the given sql server user account.
When I check the step in the SQL Server Agent job properties for both packages, under the 'Data sources' tab I see that it is using the new AS400 connection. I can also see the connection string for the SQL Server with the user id (but no password).
To make it possible for my packages to run (the users are waiting for the data) I have solved it like this:
- under the 'configurations' tab I have added the name of my package configuration file.
- i did this for both packages in both job steps
when I run the job , it works without problems.
Now, my question is: I have hardcoded the path name for the package configuration file in my job, instead of the package using the environment variable to find the package configuration file. I would prefer my packages, when running in a SQL Server Agent job, to also use the environment variable. What can I do to make this happen?
Flabbergasted as always,
View 6 Replies
View Related
Sep 26, 2000
Hi everybody,
At installation time SQL Server asks me where I want to locate the DATA files and the PROGRAM files. It gives me the choice to put the database AND log files on one disk and the program files on a separate one. But what about separating LOG and DATA files? I have RAID 1 created especially on the F: drive for LOG files and RAID 5 on E: for DATABASE files. When do I separate them, if not at installation time? How can I do that?
Thanks,
Miriam
View 3 Replies
View Related
Aug 8, 2006
In SQL2000, there's an option to change the location of the template folder. This allows me to create a customized set of templates on a network folder and have all the developers reference the centralized location. Can the same be done in SQL2005 and how would I go about doing so?
Thanks.
View 1 Replies
View Related
May 15, 2007
I have a situation where I need to loop through different folders and the files in those folders. After processing the files, I need to archive the folders to a different location.
e.g., C:\MainFolder\Mar01 ==> multiple files in the Mar01 folder
C:\MainFolder\Mar02 ==> multiple files in the Mar02 folder
Does anyone know the best way to do this in SSIS?
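A minimal sketch of the folder loop and archive step as it might look in a Script Task (a Foreach Loop container plus a File System Task is the more declarative route); all paths are placeholders:
Code Snippet
using System;
using System.IO;

class FolderProcessor
{
    static void Main()
    {
        string mainFolder = @"C:\MainFolder";  // placeholder source root
        string archiveRoot = @"C:\Archive";    // placeholder archive root on the same drive
                                               // (Directory.Move cannot cross volumes; copy-then-delete otherwise)

        foreach (string folder in Directory.GetDirectories(mainFolder))
        {
            foreach (string file in Directory.GetFiles(folder))
            {
                // Process each file here (load it, or hand it to a child package).
                Console.WriteLine("Processing " + file);
            }

            // Archive the whole folder once its files are processed.
            Directory.Move(folder, Path.Combine(archiveRoot, Path.GetFileName(folder)));
        }
    }
}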
Thanks in advance.
BC
View 1 Replies
View Related
Jan 28, 2008
Hi All,
Can anyone please help me? Actually, as I know, the sysfiles table gives you the actual physical location of the database files, with the name as well, like this:
C:\sqldata\x.mdf
But what I am looking for is: is there any query or script that will give me only the path of the data, log, and index files, like this:
C:\sqldata
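A minimal sketch that reads the full file names from the catalog (dbo.sysfiles in each database, or sys.master_files server-wide on 2005) and trims them down to just the directory; the connection string is a placeholder:
Code Snippet
using System;
using System.Data.SqlClient;
using System.IO;

class DataFilePaths
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=master;Integrated Security=SSPI;"))  // placeholder
        using (SqlCommand cmd = new SqlCommand("SELECT name, filename FROM dbo.sysfiles", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    string fullPath = reader.GetString(1).Trim();
                    // Keep only the folder, e.g. C:\sqldata\x.mdf -> C:\sqldata
                    Console.WriteLine("{0} -> {1}",
                        reader.GetString(0).Trim(), Path.GetDirectoryName(fullPath));
                }
            }
        }
    }
}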
Thanks and looking forward.
-MALIK
View 5 Replies
View Related