I've created a DTS package that uses a query to export data to a .txt file. My question is: if the query returns no results (nothing comes back within the package), how can I tell the package not to export a zero-byte file? Any thoughts on that? Any help you could give would be greatly appreciated. Thanks!
Hi, I am getting an error with my code. It says 'value of type Byte cannot be converted to 1-dimensional array of Byte'. Do you know why, and how I can correct this error? The following is my code. Can anyone help me correct the error and let me know how to solve it? Thanks for any help given.

Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
    Dim myConnection As New Data.SqlClient.SqlConnection("ConnectionString")
    myConnection.Open()
    Dim sql As String = "Select Image_Content from ImageGallery where Img_Id=@ImageId"
    Dim cmd As New Data.SqlClient.SqlCommand(sql, myConnection)
    cmd.Parameters.Add("@imgID", Data.SqlDbType.Int).Value = context.Request.QueryString("id")
    cmd.Prepare()
    Dim dr As Data.SqlClient.SqlDataReader = cmd.ExecuteReader()
    dr.Read()
    context.Response.ContentType = dr("imgType").ToString()
    context.Response.BinaryWrite(CByte(dr("imgData")))   ' <----- this is the line with the error
End Sub
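A minimal sketch of the likely fix, assuming the column being read really holds the raw image bytes (the column names below are taken from the code above, which selects Image_Content but reads imgType and imgData, so they may need adjusting): CByte converts a value to a single Byte, while Response.BinaryWrite expects a whole Byte array, so cast the column to Byte() instead.

Code Block
' Sketch only: cast the image column to a byte array rather than a single byte.
Dim imageBytes As Byte() = CType(dr("imgData"), Byte())
context.Response.ContentType = dr("imgType").ToString()
context.Response.BinaryWrite(imageBytes)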
Can we regenerate a .rdl file based on the byte[] stream created by the Render method?
The user sends the report name with parameters to the application server, and the application server sends the request to the Reporting Server and gets the report back (via the Render method). But how do we pass the returned byte[] stream back to the user and show him or her the report?
Hello, I'm having users upload documents to my DB and storing them as an Image datatype; I can do that without an issue. I'm also able to find that record and return it as a Byte(). Now, what's the best practice/scheme for returning it to the browser (even Firefox) and having it prompt the user with the Open or Download dialogue we all know and love?
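A minimal sketch of one common approach, assuming an ASP.NET page or handler; docBytes, fileName and contentType are placeholders for your own values. The Content-Disposition header is what makes the browser offer the Open/Save dialogue instead of rendering the content inline.

Code Block
' Sketch only: stream a stored document back with a download prompt.
' Assumes System.Web (ASP.NET); docBytes, fileName and contentType are placeholders.
Private Sub SendDocument(ByVal response As System.Web.HttpResponse, ByVal docBytes As Byte(), _
                         ByVal fileName As String, ByVal contentType As String)
    response.Clear()
    response.ContentType = contentType                  ' e.g. "application/pdf"
    response.AddHeader("Content-Disposition", "attachment; filename=" & fileName)
    response.BinaryWrite(docBytes)                      ' the Byte() read from the Image column
    response.End()
End Sub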
Hi, I'm a bit new to DTS, but the problem I have encountered relates to importing a text file. On occasion the file is zero bytes, which causes an error in DTS. I have added some VBScript to the workflow to check the file size: if size > 0 then I set Main = DTSStepScriptResult_ExecuteTask, otherwise Main = DTSStepScriptResult_DontExecuteTask. This all works fine, except that when the file is 0 bytes the dependent process that is waiting for completion/success does not run, presumably because the task was not run (as required). I have tried setting the status rather than using DontExecuteTask, but then the task runs anyway, creating the error. Does anyone know how to get this to work, or how to get around the problem?
Hello all, I was trying to run a test to write an EBCDIC file out with a COMP-3 number (testing this for other people) and have run into a problem writing the string out to the flat file destination. I have the following script component:
Code Block
' Microsoft SQL Server Integration Services user script component
' This is your new script component in Microsoft Visual Basic .NET
' ScriptMain is the entrypoint class for script components

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        '
        ' Add rows by calling AddRow method on member variable called "Buffer"
        ' E.g., MyOutputBuffer.AddRow() if your output was named "My Output"
        '
        Output0Buffer.AddRow()

        Dim myByteArray() As Byte = {&H12, &H34, &H56, &H7F}
        Output0Buffer.myByteStream = myByteArray
        Output0Buffer.myString = "ABCD"
        Output0Buffer.myString2 = "B123"

        myByteArray = Nothing
    End Sub

End Class
I have added myByteStream as a DT_BYTES length 4, myString as (DT_STR, 4, 37) and myString2 as (DT_STR, 4, 37) to the output 0 buffer.
I then add a flat file destination with code page 37 (EBCDIC US/Canada) with the corresponding columns using fixed width.
When I place a data viewer on the path between the two, the output looks as I expect ("0x12 0x34 0x56 0x7F", "ABCD", "B123"). However, when it gets to the flat file destination it errors out with the following:
Code Block
[Flat File Destination [54]] Error: Data conversion failed. The data conversion for column "myByteStream" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
If I increase the size of the byte stream (say, to 50), the error goes away, but I am left with the string "1234567F" instead of the appropriate hex values. Any clues on how to go about this? I obviously don't care if it gets transferred to "readable" text, as this is supposed to be a binary stream, so the "no match in the target code page" part seems superfluous but is probably what is causing the problems.
NOTE: this relates to the following thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2300539&SiteID=1), in that I am trying to determine why those people are not seeing the "UseBinaryFormat" option when importing an EBCDIC file with COMP-3 values (I see this fine when I use an FTP'd file, but it auto-converts to ASCII). I also see the "UseBinaryFormat" option when I am importing a regular EBCDIC file, which I create, that has no import errors with zoned decimals.
How can I prevent a package, or a section of a package, from being run more than once simultaneously? The package makes use of a staging database table, and if two copies of this package are run at the same time, there will be a race condition and possible corruption of the data in the database table. I have also noticed that the SSIS logging gets messed up and starts overwriting itself if I have more than one of these packages run at the same time.
What I am thinking of should work the same way as a serializable transaction lock on a normal query, where one query has to wait if another query has a lock on the table. I don't want the package to throw an error, I want it to wait until the other one is done.
I have been trying to use transactions in a test SSIS package, but they are not quite doing what I want them to do. I put various SSIS steps in a sequence container and require a serializable transaction. I have also tried putting the transaction at the package level, but that is not preventing simultaneous package runs.
Is there a built-in way to do this that is easy to implement? Otherwise, what I could do is insert a flag into the database indicating that a package is running, then make sure to reset the flag on any error or on completion. It seems like this alternate method could be error-prone though.
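For what it's worth, here is a minimal sketch of that flag idea, assuming a hypothetical PackageLock table with PackageName and IsRunning columns. The guarded UPDATE flips the flag only if nobody else holds it, so only one caller at a time can win; the package would still need to reset the flag on completion and on failure (for example in an OnError event handler or a final Execute SQL task), which is exactly the error-prone part.

Code Block
' Sketch only: PackageLock(PackageName, IsRunning) is a made-up table.
' Returns True if this run acquired the flag, False if another run already holds it.
Private Function TryAcquireLock(ByVal connectionString As String, ByVal packageName As String) As Boolean
    Using conn As New System.Data.SqlClient.SqlConnection(connectionString)
        conn.Open()
        Dim sql As String = _
            "UPDATE PackageLock SET IsRunning = 1 " & _
            "WHERE PackageName = @name AND IsRunning = 0"
        Using cmd As New System.Data.SqlClient.SqlCommand(sql, conn)
            cmd.Parameters.Add("@name", System.Data.SqlDbType.NVarChar, 128).Value = packageName
            ' One row updated means we flipped the flag; zero rows means someone else has it.
            Return cmd.ExecuteNonQuery() = 1
        End Using
    End Using
End Function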
"The statement has been terminated.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have a stored procedure that performs a task in a loop. Sometimes it fails, sometimes it succeeds, and that's just fine; the errors are handled within the stored procedure and I return counts of successes and failures.
The trouble is that the package execution terminates when even one execution of the task fails. I don't want it to; I just want it to continue, as the package should then go on to extract the source of the failures and post them back where they came from.
Any ideas how I can achieve this?
I have tinkered with the MaximumErrorCount setting, but I get the same result regardless of whether this is set to 0, 1, or 99999.
I also have a 'failure' path leading from the SQL task, but the package does not follow it.
I'm using an OLE DB connection and when none of the tasks fail, the process works perfectly.
Hi, I have a problem importing data from SQL Server 2000 'text' columns to SQL Server 2005 nvarchar(max) columns. I get the following error whenever the transfer encounters a column matching the above. The error is copied below.
Any help on this would be greatly appreciated...
ERROR : errorCode=-1071636471 description=An OLE DB error has occurred. Error code: 0x80004005.An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Unicode data is odd byte size for column 3. Should be even byte size.". helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC} (Microsoft.SqlServer.DtsTransferProvider)
I have an SSIS and an SSAS project in the same solution. I need to debug the SSIS package even if there are an error or two in the SSAS project. Is there a way to ignore the SSAS project while I debug the SSIS package?
"The log file for database is full. Back up the transaction log forthe database to free up some log space."Now I only know this way to deal with that manually,Step1. in option , chance Recovery model from FULL to Simple.Step2: go to task to manually shrink the log fileStep3: Change recovery model back from simple to FULL.But by this way, I could get same problem again, the log file is fill,and need free up.Could you give an idea how to prevent this from happening? what andhow should I do???Thanks a lot in advance for your help.
I am planning to develop a single package that will download files from an FTP server, move the files to an internal file server, and load them into the database. But I want to run this package for multiple FTP file providers. For each provider the FTP server might be different, and the transformation to load the files into a database table might be different.
So can I create a single package and then multiple configuration files (XML), which will contain the details of the FTP file providers, and then pass the XML file as a parameter while executing the package? The reason is that the timings for fetching the files are different for each FTP file provider and hence cannot be combined into one.
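That general shape (one package, one XML configuration per provider) can be driven from code along these lines. This is only a sketch: the file paths are placeholders, and it assumes a reference to the SSIS runtime assembly (Microsoft.SqlServer.ManagedDTS, namespace Microsoft.SqlServer.Dts.Runtime).

Code Block
' Sketch only: run the same package once per provider-specific configuration file.
Imports Microsoft.SqlServer.Dts.Runtime

Module RunPerProvider
    Sub Main()
        Dim app As New Application()
        Dim configFiles() As String = {"C:\Configs\ProviderA.dtsConfig", _
                                       "C:\Configs\ProviderB.dtsConfig"}
        For Each configFile As String In configFiles
            ' Reload the package each time so one provider's settings don't leak into the next run.
            Dim pkg As Package = app.LoadPackage("C:\Packages\FtpLoad.dtsx", Nothing)
            pkg.ImportConfigurationFile(configFile)   ' apply this provider's connection details
            Dim result As DTSExecResult = pkg.Execute()
            Console.WriteLine("{0}: {1}", configFile, result.ToString())
        Next
    End Sub
End Module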
Here's the deal. I have a child package (say, pack01.dtsx) that uses a .dtsConfig file for its connection string; it can be called from other packages, but it can also be run by itself.
However, I also have another package (say, pack02.dtsx) that uses the same .dtsConfig file for its connection string. It calls pack01.dtsx.
When I use DTEXECUI and run pack01.dtsx, specifying the proper .dtsConfig file, it goes well. But when I try to run pack02.dtsx, an error occurs saying the pack01.dtsx connection cannot be established.
How do I pass the connection string being used by pack02 to pack01, without having to remove the configuration file setting of pack01? Can a Parent Package configuration and a configuration file both map to the same property?
I used the DTS 2000 Migration Wizard to migrate one of the DTS 2000 packages to SSIS. The migration failed with the following error message:
LogID=17 #Time=6:31 PM #Level=DTSMW_LOGLEVEL_ERR #Source=Microsoft.SqlServer.Dts.MigrationWizard.Framework.Framework #Message=Microsoft.SqlServer.Dts.Runtime.DtsRuntimeException: Failed to save package file "C:\Documents and Settings\fu\My Documents\Visual Studio 2005\Projects\KORTON\KORTON\ProcessCubesMF.dtsx" with error 0x80070002 "The system cannot find the file specified.". ---> System.Runtime.InteropServices.COMException (0xC001100E): Failed to save package file "C:\Documents and Settings\fu\My Documents\Visual Studio 2005\Projects\KORTON\KORTON\ProcessCubesMF.dtsx" with error 0x80070002 "The system cannot find the file specified.".
at Microsoft.SqlServer.Dts.Runtime.Wrapper.ApplicationClass.SaveToXML(String FileName, IDTSPersist90 pPersistObj, IDTSEvents90 pEvents) at Microsoft.SqlServer.Dts.Runtime.Application.SaveToXml(String fileName, Package package, IDTSEvents events) --- End of inner exception stack trace --- at Microsoft.SqlServer.Dts.Runtime.Application.SaveToXml(String fileName, Package package, IDTSEvents events) at Microsoft.SqlServer.Dts.MigrationWizard.DTS9HelperUtility.DTS9Helper.SaveToXML(Package pkg, String sFileLocation) at Microsoft.SqlServer.Dts.MigrationWizard.Framework.Framework.StartMigration(PackageInfo pInfo)
Looking at the call stack, it looks like the COM wrapper fails on SaveToXML. Can someone tell me how I should work around this problem?
I am trying to create and later read a data file from a package deployed to SSISDB, but while it successfully creates the file, it does not read it. The same package, when run from the file system, runs successfully. Also, generating the .ispac and deploying it to SSISDB runs for an infinite time. Is it a permissions issue?
Hi people, I have a little problem with this! There are some variables in C# code:

int personID = 10;
string personName = "Tom";
System.IO.BinaryReader reader = new System.IO.BinaryReader(FileUploadPersonPhoto.PostedFile.InputStream);
byte[] personPhoto = reader.ReadBytes(FileUploadPersonPhoto.PostedFile.ContentLength);

After that, there is a SQL query:

string query = "INSERT INTO PersonTable (PersonID, PersonName, PersonPhoto) VALUES ("
    + personID + ", '" + personName + "', " + personPhoto + ")";

In Debug mode, the value of the query is "INSERT INTO PersonTable (PersonID, PersonName, PersonPhoto) VALUES (10, 'Tom', System.Byte[])" and it does not work! Is there any prefix or something else that I should put in to make it work? Thank you in advance! P.S. I do not want to use SQL parameters at this point!
I have a binary column of length 20 in a SQL Server table. I dynamically store C# byte[] values whose length ranges from 0 to 19. So, for example, if I store a value of length 16 and then try to retrieve it back into a byte[] in C#, it returns the whole length of 20. When I look at it in the debugger it has the values I want in positions 0 to 15, but positions 16 to 19 are zero. How can I get back just the length I stored?
I use a DataRow to get the whole row, and from the row object I extract the byte[] based on the column name.
I need some help working with CLR UDTs. I have created two UDTs called trajectory and point. Each trajectory consists of a list of points. Each point consists of three members: lon (type double), lat (type double), and datetime.
I have written my own IBinarySerialize.Write method for the trajectory type, which is the following:
Dim maxSize As Integer = 4000
Dim value As String = ""
Dim paddedvalue As String
Dim i As Integer
Dim pt As Point

For i = 0 To point_list.Count - 1
    pt = point_list.Item(i)
    If i = 0 Then
        value = value & pt.X & "|" & pt.Y & "|" & pt.D
    Else
        value = value & ">" & pt.X & "|" & pt.Y & "|" & pt.D
    End If
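" &">
The posted code stops mid-loop, so purely for context, here is a hedged sketch of how a Write method along these lines typically finishes, assuming maxSize and paddedvalue are meant to pad the serialized string to a fixed size and w is the System.IO.BinaryWriter parameter of IBinarySerialize.Write. Everything after the loop is my guess, not the original code.

Code Block
Next

' Pad (or truncate) the concatenated point list to the fixed serialization size, then write it.
If value.Length > maxSize Then
    paddedvalue = value.Substring(0, maxSize)
Else
    paddedvalue = value.PadRight(maxSize)
End If
w.Write(paddedvalue)
End Sub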
I have a bigint column called "MillisecondsSince1970" that I need to convert to a date. SSIS is erroring out when I use DATEADD with the 8-byte int (if I use a 4-byte int it works, but the column values are bigger than 4 bytes). The error is really lame:
[Derived Column [79]] Error: The "component "Derived Column" (79)" failed because error code 0xC0049067 occurred, and the error row disposition on "output column "Date" (100)" specifies failure on error. An error occurred on the specified object of the specified component.
Anyone have a way around it... a VB.NET equivalent of DATEADD or something else I can do?
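One hedged option, assuming the column really is milliseconds since the Unix epoch (1970-01-01) and a Script Component is acceptable in place of the Derived Column: do the math with DateTime.AddMilliseconds, which takes a Double and so is not limited to 4-byte values. The input and output column names below are placeholders.

Code Block
' Sketch only: convert a bigint "milliseconds since 1970-01-01" value to a date.
' Row.MillisecondsSince1970 and Row.EventDate are placeholder column names.
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim epoch As New DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)
    Row.EventDate = epoch.AddMilliseconds(CDbl(Row.MillisecondsSince1970))
End Sub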
I have followed many examples found on this site, but I still get an invalid cast exception when I attempt to run the code below. The exception is thrown when I try to convert the SQL image to a byte[] in the download section of the code.
// Upload file
private void Button1_Click(object sender, System.EventArgs e)
{
    // Get the filename of the pdf file to be uploaded.
    string strFilename = File1.PostedFile.FileName.Substring(File1.PostedFile.FileName.LastIndexOf("\\") + 1);

    // Get the file type
    string strFileType = File1.PostedFile.ContentType;

    // Get the file size
    int intImageSize = File1.PostedFile.ContentLength;

    // Reads the Image Stream
    ImageStream = File1.PostedFile.InputStream;

    byte[] ImageContent = new byte[intImageSize + 1];
    int intStatus = 0;
    intStatus = ImageStream.Read(ImageContent, 0, intImageSize);
I have trouble when inserting/updating a field which contains double-byte characters (Traditional Chinese).
There is NO PROBLEM if I am using Enterprise Manager to view/edit the data. The data is retrieved properly in all of the following: (1) Enterprise Manager, (2) Query Analyzer, (3) Visual Basic, (4) command-prompt isql.
EACH of the Chinese characters becomes a question mark '?' if the UPDATE SQL or stored procedure is executed from (2) Query Analyzer or (3) Visual Basic,
WHILE (4) command-prompt isql does not have the problem with the same UPDATE SQL and stored procedure.
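In case it helps narrow things down: the '?' symptom is what typically appears when the text passes through a non-Unicode step somewhere on the way to the column, for example a plain varchar literal or parameter (in T-SQL the usual culprit is a literal without the N prefix, i.e. 'xxx' instead of N'xxx'). Here is a hedged sketch of a Unicode-safe update from .NET; the table and column names are made up, and it also assumes the destination column is an N type (nchar/nvarchar/ntext).

Code Block
' Sketch only: MyTable and Description are placeholder names.
' Passing the value as an NVarChar parameter keeps it Unicode end to end.
Private Sub UpdateDescription(ByVal connectionString As String, ByVal id As Integer, ByVal newText As String)
    Using conn As New System.Data.SqlClient.SqlConnection(connectionString)
        conn.Open()
        Dim sql As String = "UPDATE MyTable SET Description = @text WHERE Id = @id"
        Using cmd As New System.Data.SqlClient.SqlCommand(sql, conn)
            cmd.Parameters.Add("@text", System.Data.SqlDbType.NVarChar, 400).Value = newText
            cmd.Parameters.Add("@id", System.Data.SqlDbType.Int).Value = id
            cmd.ExecuteNonQuery()
        End Using
    End Using
End Sub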
I want to store some double-byte characters in a table. Originally I planned to change the data type of all fields which will store double-byte characters from VARCHAR to NVARCHAR. But I just found out that we have no problem pulling and storing foreign content (double-byte characters) in fields of the VARCHAR type (the content shows correctly after being stored in the database).
My question is: if VARCHAR can handle double-byte characters, what's the point of using NVARCHAR to store Unicode characters? Any advice is appreciated.
Just wondering if anyone knows whether you can create a certificate from a byte[]. For example, you can create an assembly using CREATE ASSEMBLY FROM 0x..., specifying its hex representation; can you do the same with a certificate? That way you don't need to save the file to disk before loading it into the database.
I had just installed SQL 2005 dev on my laptop and got an error message when I tried to create a package using the BI IDE. I received the same error using the VS2005 IDE, but the project was created regardless, without any packages. When I tried to create a new package in the project, I received the same error again, but with an option to view the error details.
Following is the text of the error details:
TITLE: Microsoft Visual Studio ------------------------------
I have a situation where I have Package A, which is called from Package B. Both packages have been designed to use config files for their connections, etc.
The issue arises when executing Package B. I can specify a config for Package B to use, but how do I tell the embedded package to use its correct config file?
By default, Package A has its config file pointed to development, but I would rather not have to save a special version of Package A that points to production.
I don't see any option for this within the Execute Package task.
According to the help for SSIS, one method of deploying an SSIS package to a SQL Server, http://msdn2.microsoft.com/en-us/library/ms137565.aspx, is to use the File...Save a Copy of <package file> as... menu option.
I don't have that menu option at all. And yes, the package is in focus. My save menu options are simply: Save Selected, Save <package file> As..., and Save All.
I am using Version 9.00.1399.00 of the SSIS Designer.
At one time I did have the Management Studio CTP installed. However, it was uninstalled before installing the tools from the Standard Edition (not completely, it would seem).
Your help would be greatly appreciated. Thanx much.
P.S. Almost forgot to mention... I am already aware of using the DTSInstall utility as a workaround. It should be noted, however, that despite enabling the "CreateDeploymentUtility" property, the DTSInstall.exe is not copied to the bin\Deployment directory.
I'm trying to read an image file from a database (MS SQL, .mdf) where it is stored with type image. Does anyone have any ideas on how to do this? I have a table adapter created but could not assign the value to my byte[] variable. Thanks in advance.
Why can a varchar variable apparently only hold 4000 bytes? For example, in a stored procedure:

declare @aa varchar(8000)
......
while
    select @aa = @aa + @otherinfo
end

When the length goes beyond 4000, the data at the end is lost.
Hi, I have a .NET application from which I want to save the Config.EXE contents to my SQL database for remote review/testing. This config file is 3700+ bytes long. I created a field in one of my tables as VARCHAR(4800) and then created a stored procedure that receives a parameter (also VARCHAR(4800)). However, it fails to write anything if the length of the value that I pass is greater than 900. If I pass exactly 900 characters or less, the data is written to the field. If I pass 901 characters I get nothing. I'm suspicious since it is exactly 900. I seriously doubt it's some limitation of MS SQL, so I need a nudge in the right direction. Thanks.
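One thing worth ruling out, offered as a guess rather than a diagnosis: a string parameter that is declared with a smaller size than the value being passed gets silently truncated, both by the client parameter object and by the stored procedure's own parameter declaration, so it is worth double-checking that every hop really says VARCHAR(4800). A VB.NET sketch with made-up procedure and parameter names:

Code Block
' Sketch only: usp_SaveConfig and @ConfigText are placeholder names.
' Declaring the parameter size explicitly avoids relying on a defaulted or smaller length.
Private Sub SaveConfig(ByVal connectionString As String, ByVal configText As String)
    Using conn As New System.Data.SqlClient.SqlConnection(connectionString)
        conn.Open()
        Using cmd As New System.Data.SqlClient.SqlCommand("usp_SaveConfig", conn)
            cmd.CommandType = System.Data.CommandType.StoredProcedure
            cmd.Parameters.Add("@ConfigText", System.Data.SqlDbType.VarChar, 4800).Value = configText
            cmd.ExecuteNonQuery()
        End Using
    End Using
End Sub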