SSIS - Data Source Password Lost When Imported Into Integration Services
Apr 17, 2007
I designed an SSIS package in Visual Studio 2005 with two connection managers defined to keep their passwords, then deployed the package to the file system. After making an Integration Services connection in SQL Server Management Studio, I imported the .dtsx file. When I tried to run the package, it failed when it tried to make the connection. If I edit the connection manager's connection string and add the password, the package runs fine, but it does not retain the password! I need this package scheduled to run daily, so I need to know how to make the package keep the password in the connection string. I have seen other posts on this issue but no good solution. Could someone point me to the proper MSDN article that explains how to implement this? Is it a SQL Server configuration issue or a property in the Visual Studio SSIS designer?
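One commonly cited cause, offered here as an assumption: the package's ProtectionLevel defaults to EncryptSensitiveWithUserKey, so the stored password is silently dropped when the package runs under a different account or machine. A minimal sketch of the fix with dtutil (the path and password are hypothetical) is to re-encrypt the deployed file with a package password, which is then supplied at run time (e.g. via dtexec's /De[crypt] option):

Code Snippet

REM 2 = EncryptSensitiveWithPassword
dtutil /FILE C:\Deploy\MyPackage.dtsx /ENCRYPT FILE;C:\Deploy\MyPackage.dtsx;2;MyPkgPwd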
I'm using a shared data source to connect to an Oracle server in my packages. After changing the database user's password in the shared data source, I noticed the packages concerned would fail with the following description.
I have an SSIS package that pulls data from SAP (using an ADO.NET connection) to SQL Server every night, but I have noticed that not all of the data from the source is getting pulled by the package; it is losing some rows.
I am working to archive some old data from a data warehouse using SQL server and SSIS. The data will be read and denormalized, then shipped out to a delimited text file.
The rowcount of the incoming data is significant, call it 10M+ rows per unit of work (one text file).
There are development advantages to using a stored proc for the data source, mainly the ease of changing the denormalization logic as required. I'm wondering if there are performance advantages to using an embedded query for the data source instead.
One developer mentioned that when using a stored procedure, the output stream from the proc, and therefore the subsequent SSIS steps, cannot start until the procedure has finished processing; i.e., the proc churns out its result set in one big chunk.
He hinted that an embedded query does not have this same effect, but I am not sure that is accurate.
I'm using SSIS 2005 Enterprise Edition. I'm creating a package that reads an Excel (xls) file using the "Excel Source" component and dumps the data into an OLE DB destination (a SQL Server database). When I drag the Excel Source component in and create the Excel connection to my file, the component automatically reads the columns and their data types.
The problem is that I have a column of numeric data, and the package uploads every number that starts with a zero as NULL. (Note: in Excel this column is formatted as "text", even though it contains only numbers, because that's the only way Excel keeps the leading zeros.)
So I checked the data types by right-clicking the Excel Source component -> Show Advanced Editor, and to my surprise this column's data type is detected as double-precision float, and it doesn't let me change it. URL... but it only works when the first row of data has a number beginning with zero in this column. How do I get the data imported correctly?
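A sketch of the usual workaround, assuming the Jet provider (the connection-string keywords are real; the file path is hypothetical): add IMEX=1 to the Extended Properties of the Excel connection string so that columns containing mixed data are imported as text rather than guessed as numeric.

Code Snippet

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MyFile.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1"

Note that IMEX=1 only changes the guess when the driver's row sample actually contains mixed types, which matches the symptom above of it working only when the first data row starts with a zero.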
In my project the source is Oracle, and I am using ODBC to connect to Oracle for loading. I have created two project parameters for the connection string, one for the connection and another for the password. When I put an expression on the ODBC connection, it shows an error like the one below: I can't establish a connection, because our legacy driver doesn't support 'Password' as a connection string attribute.
When I pass an expression like @[$Package::V_Constring]+ "PWD=faster1" on the ODBC connection, it works fine.
When I use just the ConnectionString property on the ODBC connection manager with a 'pwd' attribute, all is well. E.g., "uid=<user>;pwd=<password>;Dsn=<dsn name>;". But as soon as I flip the sensitive attribute, I get the classic error:
"The expression will not be evaluated because it contains sensitive parameter variables." The sensitive parameter is desired, of course; I don't want the password in the clear.
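A minimal Script Task sketch of one workaround (an assumption, not a documented fix; V_Constring, V_Password and ODBC_SRC are hypothetical names): sensitive parameters cannot appear in property expressions, but they can be read in script via GetSensitiveValue(), so the finished connection string can be pushed into the connection manager before the data flow runs.

Code Snippet

// Requires $Package::V_Constring and $Package::V_Password in ReadOnlyVariables.
string constring = Dts.Variables["$Package::V_Constring"].Value.ToString();
// GetSensitiveValue() is the supported way to read a sensitive parameter in script.
string pwd = Dts.Variables["$Package::V_Password"].GetSensitiveValue().ToString();
// Hand the complete string, PWD included, to the ODBC connection manager.
Dts.Connections["ODBC_SRC"].ConnectionString = constring + "PWD=" + pwd + ";";
Dts.TaskResult = (int)ScriptResults.Success;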
I can set almost all properties of the FTP task and FTP connection manager using the Expressions option, but I don't see a way there to set the FTP password from a variable. How do I set the password dynamically?
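A sketch of a common workaround, assuming a Script Task placed before the FTP task ("FTP Connection Manager" and User::FtpPwd are hypothetical names): ServerPassword is not exposed to expressions, but it can be set through the connection manager's property collection at run time.

Code Snippet

// ServerPassword accepts no expression, so assign it in code instead.
ConnectionManager ftpCm = Dts.Connections["FTP Connection Manager"];
ftpCm.Properties["ServerPassword"].SetValue(ftpCm, Dts.Variables["User::FtpPwd"].Value.ToString());
Dts.TaskResult = (int)ScriptResults.Success;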
I have created an FTP task in SSIS to upload a file to the server. Due to some policy issues we do not have the actual password; we have been provided with an encrypted password, hence I am not able to connect to the server through the FTP task's connection manager. Is there a way to connect to the server and send the file using the encrypted password, so that we don't need the actual password to run the process successfully?
When I create a connection to an OLE DB source, use SQL Server authentication and save the password, the connection manager seems to "forget" the password. That is, when I tick the 'Save password' check box and do a test connection, it connects fine. But as soon as I close that connection window and reopen it, the password box is empty while the 'Save password' box is still checked.
While using the connection manager name in an SSIS component (say, a Script Task), the connection fails. As a workaround, the whole connection string has been put in a variable and that variable is used in the Script Task.
Is it a bug, or does some other property need to be set in order to use the connection manager name?
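A minimal Script Task sketch, assuming an ADO.NET connection manager named MyAdoNetConn (hypothetical): the likely explanation is that an OLE DB connection manager hands back a native COM object from AcquireConnection(), which cannot be cast to a managed connection in script, whereas an ADO.NET connection manager returns a usable SqlConnection.

Code Snippet

using System.Data.SqlClient;

// Acquire a managed connection through the connection manager by name.
ConnectionManager cm = Dts.Connections["MyAdoNetConn"];
SqlConnection conn = (SqlConnection)cm.AcquireConnection(Dts.Transaction);
// ... use conn ...
cm.ReleaseConnection(conn);
Dts.TaskResult = (int)ScriptResults.Success;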
I'm having a problem saving a password in an SSIS package that accesses an Oracle database as an external data source. I experienced the problem many people have had with the SSIS package not saving the Oracle password as expected. I successfully saved the Oracle data source password (Password #1) using the "Encrypt sensitive with password" security setting and then setting a separate/new password (Password #2) for the SSIS package itself. I've imported this package into my Integration Services installation, and I can get it to run successfully by manually entering Password #2 when I open the package.
Now I want to pull the SSIS package into a SQL Server Agent job, along with some other steps, to run on an automatic schedule. My question: how/where do I embed Password #2 (the one for the SSIS package) in the settings for the job step that runs the SSIS package? The job step settings box prompts me for Password #2 before it lets me edit the settings for the package, but it doesn't save that password, so when I actually run the whole job, the SSIS package always fails.
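A sketch of one way to supply Password #2, assuming a job step of type Operating system (CmdExec) so the command line is explicit (path and password are hypothetical): dtexec's /De[crypt] option provides the package's ProtectionLevel password. The same /DECRYPT switch can also be appended on the Command line tab of an SSIS-type job step.

Code Snippet

dtexec /FILE C:\Packages\MyPackage.dtsx /DECRYPT MyPkgPwd2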
I'm trying to execute an SSIS package via a proxy user, but I keep getting the following error message regardless of what I have tried to fix it. I have done the following so far:
1. Recreated the proxy user
2. Retyped the password under Credentials
Message : Unable to start execution of step 1 (reason: Error authenticating proxy "proxyname", system error: Logon failure: unknown user name or bad password.). The step failed.
I have a simple solution with (right now) 4 packages, 3 configuration files and one "global" data source. Here is the View Code on the data source:

Code Snippet

<DataSource xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:dwd="http://schemas.microsoft.com/DataWarehouse/Designer/1.0" xsi:type="RelationalDataSource" dwd:design-time-name="02d8345e-665e-4bd8-9323-1e846af220e5" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ID>PPDM37 Staging</ID>
  <Name>PPDM37 Staging</Name>
  <CreatedTimestamp>0001-01-01T07:00:00Z</CreatedTimestamp>
  <LastSchemaUpdate>0001-01-01T07:00:00Z</LastSchemaUpdate>
  <ConnectionString>Provider=SQLNCLI.1;Data Source=engsql03.dev.ihs.com;Persist Security Info=True;Password=;User ID=;Initial Catalog=PPDM37_Staging</ConnectionString>
  <ConnectionStringSecurity>PasswordRemoved</ConnectionStringSecurity>
  <Timeout>PT0S</Timeout>
</DataSource>

As you can see, the <ConnectionStringSecurity> node is set to PasswordRemoved. When I edit the data source and check the Keep Password option, it still saves with PasswordRemoved! I've even gone in and hand-edited the XML above; the next time I view it, it is set back to PasswordRemoved. How on earth do I keep the password with the connection string? I am pushing the connection string out to a global config file and it does not save the password there either. This is extremely frustrating, as it forces us to hard-code the password inside our packages, which means they can no longer move from one server to the next without touching every single package. Suggestions? Is there a setting somewhere in SSIS that is doing this?
I want to design an SSIS package that loads data from files into SQL Server, and I want to automate the process. My major issue is that the source file doesn't always come in the same format; sometimes it comes in .csv, .xls, .txt or even .rpt format. Is there a way I can write code that checks through my folder and, based on the formats found there, loads the data in SSIS?
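A minimal Script Task sketch of the dispatch idea, assuming the folder path and routing variables already exist in the package (all names are hypothetical): enumerate the folder and route each file by extension, letting downstream precedence constraints pick the matching data flow.

Code Snippet

using System;
using System.IO;

// List the drop folder and route each file by its extension.
// User::SourceFolder, User::TextFile and User::ExcelFile must be in ReadWriteVariables.
string folder = Dts.Variables["User::SourceFolder"].Value.ToString();
foreach (string path in Directory.GetFiles(folder))
{
    switch (Path.GetExtension(path).ToLowerInvariant())
    {
        case ".csv":
        case ".txt":
        case ".rpt":   // treated here as delimited text, which is an assumption
            Dts.Variables["User::TextFile"].Value = path;
            break;
        case ".xls":
            Dts.Variables["User::ExcelFile"].Value = path;
            break;
    }
}
Dts.TaskResult = (int)ScriptResults.Success;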
We need to download files from an FTP location, so we are using a script component as a source, with the following code to read a file and write it out as columns in rows:
using System.IO;
using System.Net;

// Authenticate against the server with the login stored in package variables.
WebClient myWebClient = new WebClient();
myWebClient.Credentials = new NetworkCredential(Variables.ftpLogin, Variables.ftpPassword);

// Concatenate the domain with the Web resource filename.
string myStringWebResource = Variables.fileURL + Variables.fileName;

// Download the file contents as a single string and wrap it in a reader.
string s = myWebClient.DownloadString(myStringWebResource);
StringReader reader = new StringReader(s);
We are getting the following error: The server returned an error: (404) Not Found. Are we missing anything in the code?
I'm trying to import some XLS files that I receive from some suppliers. The problem is that every time, they send some columns with text values but formatted as numbers. When I read those columns with the SSIS Excel Source, they all come through as null values.
I don't want to change the columns' data types every time, so I would like to know if there's a way to bypass the column types that are already there.
I tried both the Jet driver and the Office 12 driver. I've already used IMEX=1 in the Extended Properties too, with no success. Is there a way to force the columns to be read as text, even if they have data types assigned to them?
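One more knob worth checking, offered as an assumption about the cause: IMEX=1 only takes effect when the driver actually sees mixed types in the rows it samples, and that sample size comes from the TypeGuessRows registry value (default 8; 0 makes it scan up to 16,384 rows). Typical locations, varying by driver:

Code Snippet

Jet 4.0 (xls):   HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel\TypeGuessRows
Office 12 (ACE): HKLM\SOFTWARE\Microsoft\Office\12.0\Access Connectivity Engine\Engines\Excel\TypeGuessRows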
I've got a problem retrieving data from an XML Source. Basically, I call a method from a web service which gives me an XML file.
The problem is that the XML structure is not really good. But we can't touch it.
Here is the XML file:
Code Snippet
<?xml version="1.0" encoding="utf-16"?> <ArrayOfWSTargetVO xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <WSTargetVO> <ProjectId> <Value>131</Value> </ProjectId> <Id> <Value>Toto</Value> </Id> <Name> <Value>bateau</Value> </Name> </WSTargetVO> <WSTargetVO> <ProjectId> <Value>131</Value> </ProjectId> <Id> <Value>Tata</Value> </Id> <Name> <Value>F35</Value> </Name> </WSTargetVO> ... </ArrayOfWSTargetVO> As you can see, for each WSTargetVO, we have a projectid, an id and a name. But the value is not directly put into these nodes but in a new one : <value>
That causes my problem, because here is the XSD file generated by Visual Studio:
Code Snippet
<?xml version="1.0"?> <xsd:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsd="http://www.w3.org/2001/XMLSchema" attributeFormDefault="unqualified" elementFormDefault="qualified"> <xs:element name="ArrayOfWSTargetVO"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" maxOccurs="unbounded" name="WSTargetVO"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" name="ProjectId"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" name="Value" type="xs:unsignedByte" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element minOccurs="0" name="Id"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" name="Value" type="xs:string" /> </xs:sequence> </xs:complexType> </xs:element> <xs:element minOccurs="0" name="Name"> <xs:complexType> <xs:sequence> <xs:element minOccurs="0" name="Value" type="xs:string" /> </xs:sequence> </xs:complexType> </xs:element> </xs:sequence> </xs:complexType> </xs:element> </xs:sequence> </xs:complexType> </xs:element> </xsd:schema> And when I try to use the outpul results from the Xml file, I can't see how I can get a datatable with three columns corresponding to projectid, id and name.
Integration Services only asks me to choose between WSTargetVO, ProjectId, Id or Name, and gives me the <Value> value.
I don't know if it is possible to modify the contents of the XML file, or something else, using XPath.
Of course, if I try to modify the XSD file and delete the Value node to get a simple structure, I see my three columns but I can't get any data.
I'm aware that the XML file is pretty bad but it is impossible for me to change it.
If somebody has an idea, I would be happy to hear it :-)
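A sketch of one pragmatic route, assuming the file can be rewritten before the XML Source reads it, e.g. in a Script Task (paths are hypothetical): flatten each nested <Value> into its parent element with LINQ to XML, after which the generated XSD takes the simple three-column shape.

Code Snippet

using System.Xml.Linq;

// Hoist each nested <Value> text up into its parent (ProjectId, Id, Name).
XDocument doc = XDocument.Load(@"C:\Temp\WSTargetVO_raw.xml");
foreach (XElement field in doc.Descendants("WSTargetVO").Elements())
{
    XElement value = field.Element("Value");
    if (value != null)
        field.Value = (string)value;   // replaces the <Value> wrapper with plain text
}
doc.Save(@"C:\Temp\WSTargetVO_flat.xml");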
I am using SSIS 2014 and installed the adapter for the SharePoint List source and destination, but when I refresh the toolbox I don't see them. Is there a way to manually add them?
I am able to collect data from a Progress DB using ODBC connectivity. The problem I am facing is that I have to iterate through multiple servers. How do I configure the ODBC source dynamically? Using an expression, I tried to set the connection string dynamically, but it fails.
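A sketch of the usual pattern, with hypothetical names: drive the data flow from a Foreach Loop over the server list, map the current entry into a variable such as User::CurrentDsn, and put the expression on the ODBC connection manager's ConnectionString property (not on the source component):

Code Snippet

"Dsn=" + @[User::CurrentDsn] + ";Uid=" + @[User::OdbcUser] + ";Pwd=" + @[User::OdbcPwd] + ";"

Setting DelayValidation to True on the connection manager usually matters here, so the package does not validate against a server before the loop has assigned a value.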
I need to grab data from Teradata (using an ODBC connection). I have no issues if it's just a bunch of joins and WHERE conditions, but now I have a challenge. Simple scenario: I have to create a volatile table, dump data into it, and then grab data from that volatile table. (I don't want to modify the query in such a way that I avoid the volatile table; it's a pretty big query and I have no choice but to create a bunch of volatile tables. The scenario above is just stated in terms of one simple volatile table.)
So I created a proc and am trying to pass this string to Teradata, but I'm not sure it works. What options do I have? (I don't have the luxury of creating a proc in Teradata, executing it whenever I want, and then grabbing the data from the table.)
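A sketch of one option, assuming a Script Component source and hypothetical table, column and DSN names: volatile tables live only for the session that created them, so the CREATE and the SELECT must run on the same open connection. (The no-code variant of the same idea is an Execute SQL Task for the CREATE plus a source query, with RetainSameConnection=True on the shared connection manager.)

Code Snippet

using System.Data.Odbc;

// One session for both statements: the volatile table vanishes when the session ends.
using (OdbcConnection conn = new OdbcConnection("Dsn=MyTeradataDsn;Uid=user;Pwd=pwd;"))
{
    conn.Open();

    // Stage the data into a volatile table (hypothetical query).
    using (OdbcCommand create = new OdbcCommand(
        "CREATE VOLATILE TABLE vt_stage AS (SELECT col_a, col_b FROM big_table) " +
        "WITH DATA ON COMMIT PRESERVE ROWS", conn))
    {
        create.ExecuteNonQuery();
    }

    // Read it back on the same connection and push the rows downstream.
    using (OdbcCommand select = new OdbcCommand("SELECT col_a, col_b FROM vt_stage", conn))
    using (OdbcDataReader reader = select.ExecuteReader())
    {
        while (reader.Read())
        {
            // e.g. OutputBuffer.AddRow(); OutputBuffer.ColA = reader.GetInt32(0); ...
        }
    }
}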
I'm encountering a very peculiar situation when trying to compare source and target data using a Conditional Split. The following describes the data flow and how I'm trying to achieve this.
Source data:

Col_A (PK)   Col_B
1            100
8            500

Target data:

Col_A (PK)   Col_B
1            100
3            700
8            500

Look up the target on Col_A to check for existing records. Now we have four columns in the lookup match output: Col_A, Col_B, Lkp_Col_A (target column), Lkp_Col_B (target column).
Conditional Split: Compare Col_B with Lkp_Col_B
Update the target if there is any change in the existing value of Col_B. When I run the package for every record in the source, the conditional split fails: even when there is no change in Col_B, some of the records (not all, and quite randomly) get updated with the same value. If I run the package for a few records, it works absolutely fine.
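A hedged guess at the usual culprits, with an expression sketch that assumes the compared columns are strings: NULLs and trailing spaces both make an equality test in a Conditional Split misfire intermittently, so normalize both sides before comparing (REPLACENULL needs SSIS 2012 or later; for numeric columns substitute a sentinel such as 0 for ""):

Code Snippet

TRIM(REPLACENULL(Col_B, "")) != TRIM(REPLACENULL(Lkp_Col_B, ""))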
All the examples I found refer to classes under the Microsoft.SharePoint namespace. However, I have the SharePoint CSOM, which only gives me the Microsoft.SharePoint.Client namespace.
I need to read the selected values of a multi-choice field, but I'm not sure how to do it with the classes in the namespace above.
Everything works, except the TSQL_x0020_Reference_x0020_Numbe field.
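A minimal CSOM sketch, assuming the field is a multi-choice column (the site URL, list name and item ID are hypothetical): in Microsoft.SharePoint.Client, a multi-choice field value is materialized as a string[].

Code Snippet

using System;
using Microsoft.SharePoint.Client;

ClientContext ctx = new ClientContext("https://contoso/sites/demo");
List list = ctx.Web.Lists.GetByTitle("MyList");
ListItem item = list.GetItemById(42);
ctx.Load(item);
ctx.ExecuteQuery();

// Multi-choice fields come back as string arrays.
string[] choices = item["TSQL_x0020_Reference_x0020_Numbe"] as string[];
if (choices != null)
    foreach (string choice in choices)
        Console.WriteLine(choice);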
I have a package with an Excel source whose 'Data access mode' is set to SQL command, followed by a SQL SELECT statement. When I hit the 'Preview...' button below the 'SQL command text' window, I get the following error:
"Error at Standard Data Flow Tasks [source tasks name]: No column information was returned by the SQL command"
Ordinarily this would be down to the fact that my SQL is shocking, but when I hit the 'Preview...' button while the workbook the source points at was open, it worked fine??
I can't figure this out, and needless to say the package errors with NEEDSNEWMETADATA when I try to run it.
We have a single generic SSIS package that is used to import several hundred iSeries tables into SQL Server. I am not looking to rewrite the process, but I am looking for ways to improve performance.
I have tried RetainSameConnection, Maximum insert commit size, table lock (TABLOCK), removing some large columns, and playing with the log file location and size, and now I am working to tweak DefaultBufferMaxRows.
To describe the data flow tasks: there are six data flow tasks (DFTs) working at the same time. Each DFT has its own list of iSeries tables and columns and the corresponding generic SQL table names. Each DFT determines its list of tables based on the number of columns to import, so there is DFT30 (iSeries tables with 1-30 columns to import), DFT60 (iSeries tables with 31-60 columns to import), etc. The destination SQL tables are generically called Staging30, Staging60, etc. Each column in the generic staging tables is varchar(100). The DFTs are comprised of an OLE DB Source and an OLE DB Destination.
The OLE DB Source uses a SQL command from a variable to build the SELECT statement, over a connection manager that uses the IBM iAccess IBMDA400 provider. The SQL command ends up looking like this for DFT30. This specific example imports from the iSeries table TDACLR, which has only two columns to import, so it will be copied to the Staging30 table.
select TCREAS AS C1,TCDESC AS C2,0 AS C3,0 AS C4,0 AS C5,0 AS C6,0 AS C7,0 AS C8,0 AS C9,0 AS C10,0 AS C11,0 AS C12,0 AS C13,0 AS C14,0 AS C15,0 AS C16,0 AS C17,0 AS C18,0 AS C19,0 AS C20,0 AS C21,0 AS C22,0 AS C23,0 AS C24,0 AS C25,0 AS C26,0 AS C27,0 AS C28,0 AS C29,0 AS C30,'TDACLR' AS T0 from Store01.TDACLR
The OLE DB Source variable value looks like the following (not showing the full 30 columns):
select cast(0 AS varchar(100)) AS C1,cast(0 AS varchar(100)) AS C2,cast(0 AS varchar(100)) AS C3,cast(0 AS varchar(100)) AS C4,cast(0 AS varchar(100)) AS C5, ... cast(0 AS varchar(100)) AS C30.
The OLE DB Destination uses OpenRowSet Using FastLoad From Variable. The insert into Staging30 ends up looking like this.
Of course we then copy and transform the Staging30 data to the SQL table that equals T0.
But back to DefaultBufferMaxRows. Previously the DFTs had the default values of 10,000 for DefaultBufferMaxRows and 10,485,760 for DefaultBufferSize. I added a SQL task to SUM the iSeries column sizes (TCREAS and TCDESC in this example) and set DefaultBufferMaxRows by dividing the SUM of the columns' max_length into 10,485,760, but I did not see a performance improvement. Do you think that redefining the columns as varchar(100) for the insert is significant? Should I instead SUM the actual number of columns (2) as 2x100, or SUM the full 30x100?
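A rough worked example of the buffer math, under the assumption that the pipeline sizes rows from the data flow's column metadata rather than from the source table: every DFT30 row carries 30 DT_STR(100) columns plus the table-name column, no matter how few are really populated, so

    estimated row width ≈ 31 x 100 bytes ≈ 3,100 bytes
    rows per buffer     ≈ 10,485,760 / 3,100 ≈ 3,382

Dividing DefaultBufferSize by only the two source columns' max_length therefore overshoots DefaultBufferMaxRows badly, and SSIS simply shrinks the row count to fit the buffer anyway, which would explain seeing no improvement; sizing against the 30x100 metadata width is the safer estimate.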
I used a DTS package to import data from one database to a new database (on different servers) and it seemed to work fine, with the exception that it lost all the passwords. I got one error telling me that for security reasons, it left all passwords NULL. Sure enough, they are now all blank.
Ideas - any good articles with quick fixes ???
Thanks a bunch - I am just getting into a lot of unusual problems this week. Nancy
By the way - what happened to the SQL*Pass message boards ??? Nobody visits there now....
In my package, I use a CDC Source transformation to receive the net changes and insert them into the destination. But the varchar values coming from the CDC source need converting from non-Unicode strings to Unicode strings in SSIS, so I used a Data Conversion transformation to achieve this. I need to achieve this without the Data Conversion transformation.
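A hedged alternative, assuming the requirement is only to avoid the Data Conversion transformation (the column name and length are hypothetical): a Derived Column transformation can do the cast inline, replacing the column with its Unicode equivalent.

Code Snippet

(DT_WSTR, 50)VarcharColumn

If the data instead passes through a T-SQL query at any stage, CAST(VarcharColumn AS NVARCHAR(50)) achieves the same thing at the source.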