I have a file I FTP onto the server using an FTP task in my DTS package. The problem I'm having is that the FTP task downloads using binary format, whereas I need it to download using ASCII format. Binary is causing characters in the file to change, and my import is failing.
I cannot set a breakpoint in a Script Task and have it complete successfully. I am running Vista 32-bit with only the workstation tools for SQL Server 2005 SP2. The steps to recreate are:
1) Create a new SSIS project/package.
2) Add a Script Task.
3) Set a breakpoint in the Script Task.
4) Run the package.
When I remove the breakpoint, the script task succeeds. When I put it back, the task fails. The execution results say "Error: The script files failed to load."

I completely uninstalled all SQL Server 2005-related entries using the Programs and Features control panel, rebooted, reinstalled the Workstation Tools, then applied a freshly-downloaded SQL Server 2005 SP2 and rebooted.

If I change "RecompileScriptIntoBinaryCode" to false, the script fails whether there's a breakpoint or not. If it's true (the default), the script only fails when a breakpoint is set.

I would like to be able to set a breakpoint to assist in debugging the package. For the time being, I can move the package to a different (non-Vista) OS, where it works fine, but I would like to be able to develop and debug on Vista.
This should be simple. I have a package which reads a flat file into a data flow. One of the columns (RefNumber) needs to be parsed and split into 2 distinct values. So in the dataflow I add the 2 new columns (ID1 & ID2) in a derived column transformation, and then call a script task.
In the script task, RefNumber is readonly, ID1 & ID2 are readwrite.
Here's a cut-down version of the script task with the boring stuff removed:
Public Overrides Sub Parser_ProcessInputRow(ByVal Row As ParserBuffer)
    Dim narrative As String = Row.RefNumber.Trim()

    If (String.IsNullOrEmpty(narrative)) Then
        Return
    End If

    'lots of stuff happening here not relevant to the question so snipped

    If ((IsNumeric(narrative)) And (narrative.Length = 16)) Then
        Row.ID1 = Int32.Parse(narrative.Substring(0, 8))
        Row.ID2 = Int32.Parse(narrative.Substring(8))
        Return
    End If
End Sub
Looking at a data viewer after the script task, the values aren't being set. I also stuck some MsgBoxes into the script task, and the Row values are being set properly in the script.
I know I'm missing something obvious ... any ideas?
I have an FTP task in a Foreach container and am setting the RemotePath using an expression (works great). I thought I could use this to start learning some of the scripting functionality in SSIS (in a script task), so I found some code in this forum (thanks, original posters!) and tried my hand at some coding... The intent was to create a variable and then dynamically overwrite the expression in the FTP task from the script (I know I don't need to do this; I just wanted to use it for learning purposes)....
I have a variable named varFTPDestPathFileName (string) and want to set it to the value of varFTPDestPath (string) + varFTPFileName (string). Note: all variables are scoped at the package level (could this be the problem?). I did not assign any of the variables to ReadOnly or ReadWrite on the Script Task Editor page (it seems to me that doing this in the code is a whole lot cleaner [and self-documenting] than on the Task Editor page)...
I keep getting the following error: "The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there."
Here is the script:
Public Sub Main()
    Dim vars As Variables

    ' Lock for Read/Write the variables we are going to use
    Dts.VariableDispenser.LockForRead("User::varFTPDestPath")
    Dts.VariableDispenser.LockForRead("User::varFTPFileName")
    Dts.VariableDispenser.LockForWrite("User::varSourcePathFileName")
    Dts.VariableDispenser.GetVariables(vars)

    ' Set Value of varSourcePathFileName   <<--- ERROR OCCURS HERE
    vars("User::varSourcePathFileName").Value = _
        Dts.Variables("User::varFTPDestPath").Value.ToString + _
        Dts.Variables("User::varFTPFileName").Value.ToString

    vars.Unlock()

    Dts.TaskResult = Dts.Results.Success
End Sub
I would also like to be able to loop through the Dts.VariableDispenser to see the contents of the variables and their values. Something like:

For Each ??? In vars
    MsgBox(???.Value)
Next
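For what it's worth, a minimal sketch of that loop, assuming the Variables collection has been populated with GetVariables as in the script above:

' Each entry in a locked Variables collection is a Variable with a name and a Value
Dim v As Microsoft.SqlServer.Dts.Runtime.Variable
For Each v In vars
    MsgBox(v.QualifiedName & " = " & CStr(v.Value))
Next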
One other question... Do we always have to preface the variable with "User::" or "System::"? If so, can you explain why?
I have a situation where I have to read an encrypted password from a table and set the password and userID for the connections. I wrote functions to retrieve the data from a table, decrypt the password and UserID, and set the connection string for the connection. What happens, though, is that the connection string I wrote to the connection gets changed when reading it back so that the password is no longer included. Also, in testing the connection, it fails, telling me login fails. Can anyone shed any light on this? Does anyone have sample code to show the setting of a password for a connection in a script task? All of the examples I find are for Integrated security.
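A minimal sketch of what setting the credentials on a connection manager from a script task can look like (the connection name, server, database, and placeholder credential values are all illustrative; the real values would come from the lookup/decryption functions described above). Whether the password survives being read back also depends on the provider, on keywords such as Persist Security Info, and on the package ProtectionLevel:

Dim decryptedUser As String = "someUser"          ' placeholder for the decrypted user ID
Dim decryptedPassword As String = "somePassword"  ' placeholder for the decrypted password

' "MyOleDbConn" is an illustrative connection manager name
Dim cm As Microsoft.SqlServer.Dts.Runtime.ConnectionManager = Dts.Connections("MyOleDbConn")
cm.ConnectionString = "Provider=SQLNCLI;Data Source=MyServer;Initial Catalog=MyDb;" & _
    "User ID=" & decryptedUser & ";Password=" & decryptedPassword & ";Persist Security Info=True;"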
Hi. Let's say I have a package taking "InvoiceID" as a parameter. I want to execute this package as a child in another package. The parent package gets the list of invoices to produce and calls the child package for each entry in the list.
How do I pass the InvoiceID to the child? I know I can use the parent's variables from the child, but I don't want the child package to be dependent on the parent package. For example, I might want to execute the "child" package as a stand-alone in development (providing it with a predefined InvoiceID). I might also want to call the same child package from another parent package with other variable names.
What I would like to do is "push" the value instead of "pulling" it. I know it's possible using the command line and the /SET option (ex.: /SET Package.Variables[InvoiceID].Value;' 184084)... Is it possible using the Execute Package Task?
I'm creating an SSIS package that will execute legacy dts packages. The package to be executed is decided at runtime using a sql query task.
Executing a dts package statically works fine, but when I try to set the details of the dts to be run via expressions, I get the error below.
To make it dynamic, I created a variable of type string and put the package name in it. I also have a string variable for the PackageID. Then I set up expressions on the Execute DTS 2000 Package Task that set the PackageName and PackageID properties to these variables.
The PackageId is a string variable I've retrieved using:
select top 1 name,
       cast(id AS varchar(50)) id,
       cast(versionid AS varchar(50)) versionid
from sysdtspackages
where name = @PackageName
order by createdate desc
When I use the task to set the package id it works fine (by selecting a DTS package, then changing the name), but when I try to provide the package id I get this message:
but the method signature specifies String PackageGuid, and it is a string...

Any ideas?
I tried casting the variable like so:

(DT_GUID) @[User:TSPkgId]

as the versionid is a uniqueidentifier on sysdtspackages. It didn't like that at all (can't cast from type DT_WSTR to DT_GUID, error code 0xC00470C2).
I have a table that I'm loading as part of a control flow; that table in turn is copied to a target table by using a data flow task. I am doing this because a different set of fields may be used from the source entry to create the target entry, based on a field in the source table. That same field may indicate that multiple entries need to be created in the target table from one source table entry, for which I use a Multicast transformation.

The problem I'm having is that no matter how many entries there are in the source table, the OLE DB source during execution only shows 7,532 entries being taken from the source table. If there are fewer than 7,532 entries in the source table, everything processes fine. With more than 7,532, the data flow task only takes 7,532 and then seems to hang. It also seems as though only one path of the Multicast transformation is taken when the conditional split directs a source entry down that path.

Seems like a strange problem, I know, but any insight anyone could provide is appreciated. Thanks.
I ran into a variety of problems trying to set a script task breakpoint in a package containing multiple script tasks. The debugger apparently treats the breakpoint as if it were set in ALL tasks in the package, not just the one in which it is actually set.
At best this results in hitting breakpoints in scripts where they are not set and at worst the debugger brings up the "Send error report" dialog and quits (while the package continues to run). The latter seems to happen most often when the script with the breakpoint has more lines than an earlier script and the breakpoint is set at a line number that exceeds the number of lines in the earlier script--it bombs when the earlier, shorter script starts.
To get the debugger to work under these circumstances I had to add some nonsense code like
While False
    Dim i As Integer = 0
End While

to every script, at the same line number near the beginning of the script (line 40, for example). I then set a breakpoint on the middle statement in one of the scripts (doesn't matter which) to cause the debugger to open at runtime. It doesn't hit the breakpoint because the line is never executed. If the breakpoint is set on a line that can be executed in any script in the package, bad things tend to happen. I then add a "stop" statement to the script that I want to debug. This only works if the debugger is already open, hence the dummy breakpoint above.

This workaround is usable, but I am debugging a package that has quite a few scripts and having to insert dummy code in all of them at a fixed line number is rather inconvenient. I would really like to see breakpoints work the way one would expect: only in the scripts where they are set. Is there some other, easier way around this problem? Is there at least an easier way to get the debugger to open so that "stop" will work?
All, I have to automatically grab a DBF or ASCII file from my hard drive and then insert it into an already existing database table. Does anyone know how to do this? The only thing I can find is how to do it manually from Enterprise Manager, but I need to automate this. Thanks in advance!
I've got one SQL Server 7.0 table with a "Description" column of length 4000. The values in this column contain an "end of line" ASCII character; the ASCII value of this character is 10. I'm not able to remove this ASCII character. I tried using the REPLACE function, but I could not remove that character.
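For reference, a REPLACE that targets the character by its ASCII value would normally look something like this sketch (the table name is illustrative). If this form still leaves something behind, the column may also contain CHAR(13), the carriage return:

UPDATE MyTable
SET Description = REPLACE(Description, CHAR(10), '')
WHERE CHARINDEX(CHAR(10), Description) > 0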
I'm new to T-SQL and need help understanding the CHAR() and ASCII() functions. I tried to run these commands to better understand them, but I get the same results each time. Here is the query with the CHAR function:
USE Northwind
SELECT FirstName + ' ' + LastName,
       + CHAR(13) + Address,
       + CHAR(13) + City,
       + Region
FROM Employees
WHERE EmployeeID = 1
Nancy Davolio 507 - 20th Ave. E. Apt. 2A Seattle WA
(1 row(s) affected)
Here is the same query without the CHAR function:
USE Northwind
SELECT FirstName + ' ' + LastName,
       + address,
       + city,
       + region
FROM employees
WHERE employeeID = 1
                 address                      city             region
---------------  ---------------------------  ---------------  ---------------
Nancy Davolio    507 - 20th Ave. E. Apt. 2A   Seattle          WA
(1 row(s) affected)
As you can see I get the same results back except for the column names being displayed.
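CHAR(13) is the carriage-return character (ASCII 13); whether it is visible at all depends on how the client renders control characters, so the two result sets can easily look identical. A small sketch that may make the difference easier to see: concatenate everything into a single column and run it with results-to-text rather than the grid, so each CHAR(13) shows up as a line break. The ASCII() function is simply the inverse of CHAR():

USE Northwind
SELECT FirstName + ' ' + LastName + CHAR(13)
     + Address + CHAR(13)
     + City + ', ' + Region
FROM Employees
WHERE EmployeeID = 1

SELECT ASCII('A') AS AsciiValue, CHAR(65) AS Character   -- returns 65 and 'A'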
Hi, I am importing data from a mainframe using DTS. The problem is that I am getting some COMP fields from the mainframe; these are compressed data fields used to reduce storage space on the mainframe. Can someone please help me understand how we can use DTS to solve this problem, that is, decompress the data before transferring it to a table in SQL Server? Any code in a scripting language to help decompress this data type would be really appreciated.
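If these turn out to be COBOL packed-decimal (COMP-3) fields (an assumption on my part; plain COMP fields are big-endian binary integers and would need different handling), the unpacking itself is just nibble arithmetic. Here is a rough .NET sketch of the idea; the byte array and scale are illustrative, and in DTS 2000 the same logic would have to be expressed in an ActiveX (VBScript) transform:

' Unpack a COMP-3 (packed decimal) field: two BCD digits per byte,
' with the low nibble of the last byte holding the sign (0xD = negative).
Function UnpackComp3(ByVal packed() As Byte, ByVal scale As Integer) As Decimal
    Dim digits As New System.Text.StringBuilder()
    For i As Integer = 0 To packed.Length - 1
        Dim hi As Integer = packed(i) >> 4
        Dim lo As Integer = packed(i) And &HF
        If i < packed.Length - 1 Then
            digits.Append(hi).Append(lo)
        Else
            digits.Append(hi)                        ' last low nibble is the sign, not a digit
            If lo = &HD Then digits.Insert(0, "-")
        End If
    Next
    ' scale = number of implied decimal places defined in the copybook
    Return Decimal.Parse(digits.ToString()) / CDec(Math.Pow(10, scale))
End Function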
Dear all, I've a different problem while writing some script.

Actually, I need all the currency symbols and their ASCII values. I downloaded them from one website; they show correctly in MS Word, but I'm not getting the same thing in EditPlus.
item      custclass        totalcustclass
--------  ---------------  --------------
06-5841   INST-CLINPRAC    1
06-5841   INST-MKT/MEDIA   2
06-5841   PROGRAM          1
06-5841   STANDARD         4
06-5845   STANDARD         1
AX-048    INST-MKT/MEDIA   4
KT-048    PROGRAM          2
KT-048    STANDARD         4
I want a condition like this: if item starts with a number, then totalcustclass (which is count(*)) stays the same and gives correct results; but if item starts with an ASCII letter, then totalcustclass comes out doubled, so I have to divide it by 2.

I want these results:
item      custclass        totalcustclass
--------  ---------------  --------------
06-5841   INST-CLINPRAC    1
06-5841   INST-MKT/MEDIA   2
06-5841   PROGRAM          1
06-5841   STANDARD         4
06-5845   STANDARD         1
AX-048    INST-MKT/MEDIA   2
KT-048    PROGRAM          1
KT-048    STANDARD         2
select item, custclass,
       case when item is <ascii> then count(custclass)/2
            else count(custclass)
       end as totalcustclass
from itemcustclass
group by item, custclass
Can anyone tell me what condition should go in the CASE?
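One hedged guess at the condition, assuming "starting with a number" literally means the first character is a digit (table and column names as in the post):

select item, custclass,
       case when item like '[0-9]%' then count(custclass)
            else count(custclass) / 2
       end as totalcustclass
from itemcustclass
group by item, custclass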
I have data that comes from a legacy system. I can obtain the data in an ASCII format. Currently I have created scripts in Access to import the data into tables. What I would like to do is create an automated import function in SQL. I am new to SQL; can anyone point me in the direction I should look to find out how I could perform this task? Using SQL 2005. Thanks, Matt Campbell
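A minimal sketch of one automated route, assuming a comma-delimited ASCII file and an existing destination table (both names here are illustrative); wrapped in a stored procedure or a SQL Agent job step, it runs without Access:

BULK INSERT dbo.LegacyData
FROM 'C:\imports\legacy_extract.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');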
I need to generate an ASCII 7-bit flat file, based on data in the DB, using Integration Services and an FTP task. Currently I am generating the file with ANSI-Latin and then using a script task to convert it to 7-bit ASCII. The file looks to be generated properly, but when the target system reads it, they complain that the file has junk characters, something like this. When I open the file after generating it, it looks fine to me, in DOS also. I don't know what the target system is or what OS is used by them. What could be the issue with these junk characters, and is it possible that a 7-bit ASCII file generated by Windows doesn't work on another OS? If the method I am using to generate the 7-bit ASCII is not correct, then what is the best method for this?
PS: earlier I had generated a flat file using data export from Excel, and that worked well in the target system. Is there any difference in the file encoding generated by Excel and by Integration Services?
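For reference, a minimal sketch of a 7-bit ASCII conversion in a .NET script task (the file paths and source code page are illustrative). Note that System.Text.Encoding.ASCII substitutes '?' for any character outside 7-bit ASCII, and line endings (CR/LF vs. LF) are another common reason a receiving system reports junk characters:

Public Sub ConvertTo7BitAscii()
    ' Read the ANSI/Latin file and rewrite it using the 7-bit ASCII encoder
    Dim text As String = System.IO.File.ReadAllText( _
        "C:\out\latin1.txt", System.Text.Encoding.GetEncoding(1252))
    System.IO.File.WriteAllText( _
        "C:\out\ascii7.txt", text, System.Text.Encoding.ASCII)
End Sub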
I am working on a project that will be mimicking an existing interface that we have with one of our clients. That interface today sends EBCDIC packed fields. We do not want to introduce changes to the external client's interface file when we rebuild it in SQL 2005 Integration Services, so I need to find out how I can take ASCII data and convert it to the host (mainframe) representation, which is what we currently provide to our external client, using Integration Services.
Has anyone had to do this? If so, can I accomplish it natively with SSIS, or do I need to look to a third party vendor for a component?
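For the straight character conversion (as opposed to the packed fields), .NET can target the mainframe code page directly; a minimal sketch, assuming code page 37 (IBM037, EBCDIC US/Canada) is the host representation and the sample value is illustrative:

Dim hostEncoding As System.Text.Encoding = System.Text.Encoding.GetEncoding(37) ' IBM EBCDIC (US/Canada)
Dim asciiText As String = "INVOICE 000123"                                      ' illustrative value
Dim ebcdicBytes As Byte() = hostEncoding.GetBytes(asciiText)

Packed (COMP-3) fields are a different matter: they are a binary layout rather than a code-page issue, so they would need the nibble packing written explicitly in a script component or handled by a third-party component.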
I would be glad if someone could help me. My question is as below:
I have ASCII data in one column, named failcode, and I want to use it to link with another table in which the data type is integer, so I have to convert it into an integer type.

May I know how to write a query to convert ASCII to int in SQL Server (Microsoft SQL Server 7.0)?
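A minimal sketch of the conversion in a join, with illustrative table and column names and assuming failcode holds only digit characters (rows with non-numeric text would make CONVERT fail and would need filtering first, for example with ISNUMERIC(failcode) = 1):

SELECT a.failcode, b.fail_description
FROM failures a
JOIN failcodes b
  ON CONVERT(int, a.failcode) = b.failcode_id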
I have a problem with a lot of my SPs. All compile correctly, but they cause erroneous data due to IF statements being ignored because of carriage-return characters (see below).
Example SP... -- Opened within last 12 Months
IF @NEWACCIND = 1 BEGIN
EXECUTE usp_DFDX03_D0150_A4 @COSTALL OUTPUT END
-- Accounts in Arrears in Current Quarter
Should look like ...
-- Opened within last 12 Months IF @NEWACCIND = 1 BEGIN
EXECUTE usp_DFDX03_D0150_A4 @COSTALL OUTPUT END -- Accounts in Arrears in Current Quarter
I need to find all SPs with a double carriage-return instance and manually replace them. There are hundreds of SPs in total. I have tried:
SELECT COLID, ID
FROM SYSCOMMENTS
WHERE CHARINDEX(CONVERT(VARCHAR(3), CHAR(13) + CHAR(13)), TEXT) > 0
but this also returns SPs containing 2 consecutive blank lines as well (of which there are a lot, due to the formatting of the T-SQL). Really I need to distinguish between a stray carriage return and a normal new line, which both appear to be CHAR(13).
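One possible refinement (a sketch, assuming the legitimate line breaks in the procedures are CHAR(13)+CHAR(10)): look for a CHAR(13) that is not immediately followed by CHAR(10), which a normal blank line never produces:

SELECT COLID, ID
FROM SYSCOMMENTS
WHERE PATINDEX('%' + CHAR(13) + '[^' + CHAR(10) + ']%', TEXT) > 0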
Thought I would share this since it caused me so much grief.
In some mainframe systems, some dates are stored as the string "00000000". In my SSIS package, I was trying to anticipate this string, as well as any other combination of zeros (e.g., "000", "0000", etc.), since I had already seen lots of dirty data in the flat file (like non-printing characters, etc.).
So, what I tried to do was perform an integer conversion on the string and test if it was the equivalent of the numerical value zero:
(DT_STR)[ColumnName] == 0 ? ....

Now, for some reason, that doesn't work, even though a similar operation in SQL does work:
SELECT TOP 1 ISNUMERIC('00000000') FROM tableName
In the end, I had to resort to testing for a match on the literal string "00000000" and hope that no other dates came in as "000" or other variation. Fortunately, this has been true so far.
However, the moral of the story is that converting a series of zeros into a numeric zero, and testing against that, does not seem to work. I don't have a good explanation for why that is, but I would guess it has something to do with a limitation of the conversion function.
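For what it's worth, an integer comparison in the expression language would normally use a numeric cast rather than a string cast; a sketch of just the test, with the caveat that (DT_I4) raises an error if the column contains anything non-numeric (such as the non-printing characters mentioned above), which may be why the literal-string match ended up being the only reliable option:

(DT_I4)[ColumnName] == 0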
I tried to set up a flat file data source that has code page 37 (EBCDIC).

Then I have a flat file destination that is ASCII.

And in between I have tried several different data flow conversion tasks, like Data Conversion and Derived Column. But I keep getting errors about different code pages.
I also tried to load the EBCDIC data into a SQL Server DB, and it complains about a different code page.
Has anyone been able to do this with SSIS out of the box, without any extra components ?
I've an issue with double quotes in a CSV file. One of the columns may contain this kind of value: "STATUS ""H"" "

I've got the quote (text qualifier) set to "
The file source fails on such records.
I found this thread, and Scott says there that the file can't contain " in the data.
Is this 100% correct?
I've got multiple text columns, and the pain is that I don't know which column might have these cases in the future. To create a script means writing my own file parser for all the files I use.
I'm working on a database conversion from Sybase to SQL Server 2005 and have hit a wall with a character conversion problem when reading non-ASCII characters (encrypted password string) via JDBC.
My application runs on Solaris and accesses a SQL Server 2005 database via the Microsoft JDBC driver. The server was unfortunately specified as having a SQL_Latin1_General_CP1_CI_AS collation at installation time, and the database being accessed has taken this default. After creation the data was migrated across via DTS.
The invalid character is a dagger '†'. When read over JDBC it is converted to a question mark '?'.
In my original environment a Sybase database was accessed via a JDBC driver from Solaris and the correct value was returned. The Sybase database used Latin1_General_BIN as its collation. By way of experimentation I have modified the default collation sequence within the SQL Server 2005 database, and created a new table to hold the password. I am then able to correctly return strings containing this character from within SQL Server Management Studio, but the same problem still exists when accessing it via JDBC.
I am not sure where to focus my investigation and would be grateful for any useful pointers/advice. To me it looks like it's a JDBC driver issue as with the change in collation it works from a non-JDBC client.
Adedoyin Akinnurun writes: "I have a database that is running using regular ASCII characters. I am trying to migrate this database to support several other languages (globalization). I need ideas on how to migrate this database to UTF-8. Someone suggested converting all the varchar and char columns to nvarchar and nchar, but I have a lot of data on the system and this might take a lot of time. Any ideas would be appreciated!"
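For reference, the suggested conversion is a column-by-column ALTER; a minimal sketch with illustrative table and column names (nullability should be restated explicitly, or the column's NULL setting can change). Each such ALTER rewrites the column's data, which is why it can take a long time on large tables, and note that SQL Server 2005 stores Unicode in nchar/nvarchar as UCS-2 rather than UTF-8:

ALTER TABLE dbo.Customers
ALTER COLUMN CustomerName nvarchar(100) NOT NULL;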
Our database defines the long_value column as nvarchar(max). I want to find out which rows actually contain non-ASCII characters in that column, but this clause also returns rows with only ASCII characters:

where long_value like (N'%[' + nchar(128) + N'-' + nchar(65535) + N']%')
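One common workaround (a sketch): the [x-y] range in LIKE is evaluated according to the collation's sort order, so under a non-binary collation it can match plain ASCII characters, while a binary collation turns it into a straight code-point range:

where long_value collate Latin1_General_BIN
      like (N'%[' + nchar(128) + N'-' + nchar(65535) + N']%')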
I'm new to the ASP/SQL scene, so please bear with me.
I have to send an email (with two attachments) through SQL Server.

The mail arrives, but the attachments are not attachments... they are ASCII text in the body. I'm using an existing stored proc, which apparently works, so there is no reason this shouldn't work.

The file exists, and the path is correct. It's obviously seeing and reading the file (otherwise there would be no data to print as ASCII).
Thanks, all.

Also, if you know of a better place where I can post this thread, please let me know.