I've spent quite a few hours on this one... I've successfully gotten tables to data-pump between source and destination databases; however, I get the following message when I hit a table with a blob field:
Step 2 failed, error: x80040E21, x80040000 + 3617
General error -2147217887 (80040E21). (Microsoft Data Transformation Services (DTS) Data Pump (80040e21): Cannot instantiate Storage Object for source column 6 ('description') Blob data. Your provider may require that all Blob columns be rightmost in the source result set.) (Microsoft OLE DB Provider for ODBC Drivers (80040e21): Errors occurred)
I've written a VB COM DLL that does a simple data pump... but am stuck on this "Storage Object" instantiation thingy...
Any recommendations as to what I can put in my data pump object to allow just a simple transfer of blob types?
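Not an authoritative fix, but the error text itself points at one workaround: some OLE DB/ODBC providers require blob columns to be the rightmost columns in the source rowset. Instead of SELECT *, the data pump's source query can list the columns explicitly with the blob last. A minimal sketch, using a hypothetical table docs with the offending description column:

-- List the non-blob columns first and the blob column last,
-- so the provider can instantiate its storage object for it.
SELECT id,
       title,
       description   -- blob column moved to the rightmost position
FROM   docs

With SELECT * the column order follows the table definition, which is what trips up providers that insist on trailing blobs.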
OK, I am pretty new to BLOBs on SQL Server 2000, so I need some answers.
1. Can BLOB fields hold .doc and/or .pdf documents?
2. If yes, can the contents of the documents be searched?
3. If documents are updated, do BLOB fields change/grow in length dynamically?
I would appreciate any answers on this matter and, if possible, examples where I can get a better idea of how to configure the BLOB fields on my DB, and/or documentation regarding BLOBs.
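For what it's worth: on SQL Server 2000 the blob types are image (raw binary, up to 2 GB) and text/ntext (character data). An image column can hold the bytes of a .doc or .pdf; the storage is variable length, so it grows with the data; and full-text search can index documents stored in an image column if you keep a companion column with the file extension (.doc works out of the box, .pdf needs a third-party IFilter). A minimal sketch with hypothetical names:

-- Hypothetical document table: the image column holds the raw file bytes.
CREATE TABLE dbo.Documents
(
    doc_id   INT IDENTITY(1,1) PRIMARY KEY,
    doc_name VARCHAR(255) NOT NULL,
    doc_ext  VARCHAR(8)   NOT NULL,  -- e.g. '.doc' or '.pdf'; full-text
                                     -- indexing uses this to pick a filter
    doc_data IMAGE        NULL       -- variable length, up to 2 GB
)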
I get this error when I execute a job calling a DTS package. The package itself is fine and executes without any error; it's transferring around 400,000 records. Scheduling the package gives the following error. Can anybody lend a hand, please?
DTSStep_DTSDataPumpTask_1
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 1000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 1000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 2000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 2000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 3000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 3000
DTSRun OnProgre...
Process Exit Code 1. The step failed.
On my MS SQL Server 2000, I am trying to create a generic way to load tables into my data warehouse.
As input to the process I have a large number of table definitions stored individually as files on my server, and ASCII delimited data files in various locations, mostly accessible via NFS mounts.
I created two DTS packages in MSSQL2K that in theory represent what I want to do:
package1 ... invoke package2 with global variables to load a system of related tables
package2 ... check for a trigger file
... set the "Execute SQL Task" statement to my first file
... run the "Execute SQL Task", which drops and re-creates a table
... set a "Connection" to the data source file that I want to use
... run the transformation (and with that my package starts to fall apart)
... set the "Execute SQL Task" statement to the next file, and
... go back and execute it
I can't figure out how to point the transformation section at the table I want to use. And I assume the transformation links between the source and the new table then need to be relinked.
The source files contain in the first row the column names as found in the tables I just created.
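Offered only as a sketch of an alternative technique, not a fix for the package wiring: since each target table is created first and the data files are plain ASCII delimited, each file could be loaded with a BULK INSERT built as dynamic SQL instead of a re-pointed DTS transformation. The table and file names here are hypothetical; FIRSTROW = 2 skips the header row, and it assumes the file's column order matches the table definition:

-- Build and run a BULK INSERT for one (table, file) pair.
DECLARE @sql VARCHAR(1000)
SET @sql = 'BULK INSERT dbo.my_table '
         + 'FROM ''\\nfsserver\share\my_table.dat'' '
         + 'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)'
EXEC (@sql)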
Whether I'm trying to access a data pump task in an imported package in SSIS, or in SQL 2000 Enterprise Manager, it crashes once the task is created. I cannot alter or redo mappings or anything else. After Enterprise Manager churns for 5-10 minutes, it crashes.
SSIS partially crashes: "Attempted to read or write protected memory. This is often an indication that other memory is corrupt." After this I can no longer click on anything and have to end-task Visual Studio.
I can do a disconnected edit on the package in Enterprise Manager and alter the statement, but as far as I know you can't really change much in the way of mappings in there. This also means that my 2000 packages have been limited to one Teradata pump per package.
This happened on my laptop and two coworkers' desktops, and now that I've migrated to a new high-end developer box, it's happening on it also.
I'm using SSIS under SQL 2005; I transfer data with a Data Flow component:
DataSource = OLEDB Source;
Destination = OLEDB Destination
The source is a view returning about 100,000 records; if I execute it in Query Analyzer it takes a few seconds, but if I use it as the data pump source it takes HOURS to execute.
I've noticed that if I modify the SELECT query by inserting a "TOP 10000000000", it takes a few seconds.
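For anyone hitting the same thing: the workaround above amounts to wrapping the view in a query with an effectively unlimited TOP, which changes the plan the OLE DB Source gets; why the optimizer behaves so differently is a guess, but the rows returned are the same. A sketch, with a hypothetical view name:

-- Original source query:  SELECT * FROM dbo.vMyView   (hours as a pump source)
-- Workaround: same rows, different plan.
SELECT TOP 10000000000 *
FROM   dbo.vMyView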
I am scripting a DTS Package using VB. The problem I am having is that I get the following error when I execute the package from Visual Basic:
Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
Step Error Description: Incomplete file format information - file cannot be opened.
In the DTS Designer, when I open the DataPump Transformation and click on the Destination tab, I am prompted with the Define Columns dialog. I define the columns and click Execute. After doing this, the package works properly when executed from Enterprise Manager. Does anyone know how to script the file definition creation? I have searched high and low through BOL and other resources and am not finding anything. It doesn't seem like it should be this hard to figure out, so maybe I am missing something.
I want to export to a text file (destination) from a SQL query using a DTS DataPump task. My query has 28 columns; somehow, when I try to define the destination columns for my text file, the Microsoft Management Console crashes completely.
I thought maybe my query had too many columns, only to find out it has exactly one too many: if I remove a column from my query (any column), I get no error at all.
Is there a column limit for a Text File destination connection?
We are debating what is industry "best practice" for serving huge numbers of images in an industrial-scale website. More directly, which approach produces the best performance and the best scalability? For example, how do sites like eBay, Amazon, and other large sites handle the millions or billions of images they must deal with?
Store as BLOBs in SQL Server?
Store in an /images folder and store the URL text in SQL Server?
We always assumed that the second approach is what most sites must do. But do they?
One developer on our team maintains that storing a million or more image files in a directory will most certainly result in poor performance, because the server must scan the directory, searching for the correct file, each time a web request is made; the directory is not indexed (?), so performance must eventually suffer.
The other developer counters that storing millions of images as BLOBs in SQL Server will result in poor performance and a HUGE database; the additional layer of access (web server to SQL Server, back to web server, then to client) causes delay and a performance hit. Who is right? What do the gurus at the world-class sites do?
I have just started using SQL Server Reporting Services and am stuck creating subreports.
I have added a subreport to the main report. When I right-click on the subreport, go to Properties -> Parameters, and click on the dropdown for Parameter Value, I see all the Sum and Count fields but not the plain data fields.
For example, in the dropdown list for the parameter value, I see Sum(Fields!TASK_ID.Value, "AppTest") and Count(Fields!TASK_NAME.Value, "CammpTest"), but not Fields!TASK_NAME.Value or Fields!TASK_ID.Value, which are the fields retrieved from the dataset assigned to the subreport.
When I manually change the parameter value to Fields!TASK_ID.Value and try to preview the report, I get "Error: Subreport could not be shown." I have no idea what the underlying issue is, but I'm guessing it's because the field Fields!TASK_ID.Value is not in the dropdown, yet I'm trying to link the main report and subreport with this field.
Am I missing something here? Any help is appreciated.
How can I create a table in which one field will be tableid INT IDENTITY(1,1) and the other fields will be the fields from the table "ashu"? Is this possible in SQL Server without explicitly writing out the "ashu" table's field names?
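If I'm reading the question right, SELECT ... INTO with the IDENTITY() function does exactly this: it copies every column of "ashu" without naming them and adds the new identity column. A minimal sketch (the new table's name is made up):

-- Creates a new table with tableid as IDENTITY(1,1)
-- plus every column of ashu, without listing them.
SELECT IDENTITY(INT, 1, 1) AS tableid,
       a.*
INTO   dbo.ashu_with_id
FROM   dbo.ashu AS a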
I have had an application running successfully in production for two years. In the last three months the app has become a document management system as well. During the addition of images, it was decided that no images would be stored in the database, but on the file system. We now have over 500,000 images averaging 92 KB each in size. I had to upgrade the RAID once and I can only assume that I will again. I am not expecting the image count to exceed 700,000 in the next 6 months; by this time next year, I am expecting 1.2 million. Should I reconsider storing the images in the database? If they are stored in the database, what are the effects on performance? Thanks in advance, Wes
Does SQL Server have a data type similar to BLOB (binary large object), which is available in DB2? BLOB in DB2 can support up to 2 GB of variable-length data. If SQL Server does have one, which offers better functionality? Any advice will be greatly appreciated!
Hi,
One of our third-party software vendors is planning to implement BLOBs in their database for storing certain documents. We are not too thrilled about it, since it can be a drain on our resources, but I would like to get the expert opinion out there on the pros and cons of implementing this.
Also, the database is in Full recovery mode and we back up the transaction log every 15 minutes. We also do a process similar to log shipping: we have two servers to which these transaction logs are restored periodically. What will the impact on the transaction logs be from changes to the BLOB fields?
If you could also point me to any resources that talk in detail about performance, backup, and recovery in relation to BLOBs, that would be great.
Thanks in advance,
KR
Can anyone help me out here with some design considerations regarding importing BLOB data into SQL Server 2000 using T-SQL statements?
I want to import some documents that are stored in an Access database into MS SQL Server 2000. The documents are stored in the Access database as OLE Objects. So far I have thought of using the ToBase64String function to convert the data from the Access field and write it into the T-SQL statements, which will be written to a text batch file. Then I apply the SQL CONVERT function, something like:
INSERT INTO testBin VALUES(CONVERT(image, 'base64string_encoded'))
GO
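One caution on that plan, for whatever it's worth: T-SQL's CONVERT does not decode base64, so CONVERT(image, 'base64string') would store the bytes of the base64 text itself, not the original document. What does work in a plain script file is a hexadecimal binary literal, which an image/varbinary column accepts directly (the conversion to hex would then happen in the export step instead). A sketch with made-up bytes:

-- 0x... is a binary literal; the stored bytes are exactly these.
-- 0x255044462D312E34 is '%PDF-1.4', the first bytes of a PDF header.
INSERT INTO testBin VALUES (0x255044462D312E34)
GO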
Hello! I'm trying to update one table's field with the first-date record searched from another table, and am getting some problems. If anyone has experience with something similar or has any idea about it, please guide me. A sample case is given below. Thanks in advance.
T.S.Negi

-- Sample case
DROP TABLE TEST1
DROP TABLE TEST2

CREATE TABLE TEST1
(
    CUST_CD      VARCHAR(10),
    BOOKING_DATE DATETIME,
    BOOKPHONE_NO VARCHAR(10)
)

CREATE TABLE TEST2
(
    CUST_CD       VARCHAR(10),
    ENTRY_DATE    DATETIME,
    FIRSTPHONE_NO VARCHAR(10)
)

DELETE FROM TEST1
INSERT INTO TEST1 VALUES('C1',GETDATE()+5,'11111111')
INSERT INTO TEST1 VALUES('C1',GETDATE()+10,'22222222')
INSERT INTO TEST1 VALUES('C1',GETDATE()+15,'44444444')
INSERT INTO TEST1 VALUES('C1',GETDATE()+16,'33333333')

DELETE FROM TEST2
INSERT INTO TEST2 VALUES('C1',GETDATE(),'')
INSERT INTO TEST2 VALUES('C1',GETDATE()+2,'')
INSERT INTO TEST2 VALUES('C1',GETDATE()+11,'')
INSERT INTO TEST2 VALUES('C1',GETDATE()+12,'')

--SELECT * FROM TEST1
--SELECT * FROM TEST2

/*
Sample data

TEST1
CUST_CD  BOOKING_DATE             BOOKPHONE_NO
C1       2005-04-08 21:46:47.780  11111111
C1       2005-04-13 21:46:47.780  22222222
C1       2005-04-18 21:46:47.780  44444444
C1       2005-04-19 21:46:47.780  33333333

TEST2
CUST_CD  ENTRY_DATE               FIRSTPHONE_NO
C1       2005-04-03 21:46:47.800
C1       2005-04-05 21:46:47.800
C1       2005-04-14 21:46:47.800
C1       2005-04-15 21:46:47.800

DESIRED RESULT
CUST_CD  ENTRY_DATE               FIRSTPHONE_NO
C1       2005-04-03 21:46:47.800  11111111
C1       2005-04-05 21:46:47.800  11111111
C1       2005-04-14 21:46:47.800  44444444
C1       2005-04-15 21:46:47.800  44444444
*/
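Assuming the rule is "for each TEST2 row, take the BOOKPHONE_NO of the earliest TEST1 booking on or after the entry date," which is what the desired result above shows, a correlated TOP 1 subquery is one way to write it:

-- For each TEST2 row, find the first booking dated on/after ENTRY_DATE
-- for the same customer and copy its phone number.
UPDATE TEST2
SET    FIRSTPHONE_NO = (SELECT TOP 1 T1.BOOKPHONE_NO
                        FROM   TEST1 T1
                        WHERE  T1.CUST_CD      = TEST2.CUST_CD
                        AND    T1.BOOKING_DATE >= TEST2.ENTRY_DATE
                        ORDER BY T1.BOOKING_DATE)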
I am using MS SQL 2012. I have a table that contains all the data I need, but I need to summarize the data and also add up the decimal fields while at it. Then I need a total of those added decimal fields. My data is like this:
I have Providers, a unique ID (each Provider will have multiples of these), and then the decimal fields. Here are my fields:
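Since the field list didn't come through, here is only a sketch with hypothetical names (Provider, ClaimID, Amount1, Amount2, in a table dbo.Claims). On SQL 2012, GROUP BY ROLLUP gives the per-group sums plus the grand total in one pass:

-- One row per Provider/ClaimID with the decimals summed, plus
-- rollup rows and a final grand-total row (grouping columns NULL).
SELECT Provider,
       ClaimID,
       SUM(Amount1) AS TotalAmount1,
       SUM(Amount2) AS TotalAmount2
FROM   dbo.Claims
GROUP BY ROLLUP (Provider, ClaimID)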
Can anyone get me pointed in the right direction, or even better, give specific instructions on how to export BLOBs to .JPGs? They are in SQL 2005 and I have about 1,500 that I need to export. I ran across another site that said to use SSIS, but I haven't had much luck.
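Not the SSIS route, but one low-tech approach for a one-off dump of ~1,500 images: have a query generate one bcp command per row and run the output from a command prompt; bcp queryout writes the raw column bytes to a file. The table, column, and path names below are hypothetical, and photo.fmt is assumed to be a bcp format file describing a single SQLIMAGE field with prefix length 0, which keeps bcp from adding length prefixes to the output:

-- Generate one bcp command line per image row; save the result
-- as a .cmd file and run it.
SELECT 'bcp "SELECT photo FROM mydb.dbo.photos WHERE photo_id = '
       + CAST(photo_id AS VARCHAR(10))
       + '" queryout c:\export\' + CAST(photo_id AS VARCHAR(10))
       + '.jpg -T -S myserver -f c:\export\photo.fmt'
FROM   mydb.dbo.photos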
Can someone please show me an example of how to read and write blob data to a database? For example, if I have the query below (Northwind), how do I actually place the blob item in a picture box on a Windows form?

SELECT Picture FROM Categories WHERE CategoryID = 5
Can someone please give me an example in C# of how to retrieve an image from a table and store it in a picture box on a Windows form? In addition, how to insert a blob record into a table as well.
We are using DTS to transfer a database from 6.5 to 7.0. The data in tables with text datatype columns cannot be transferred to the 7.0 server, though the table structure is transferred. The error message we got is: "Error at Destination for Row Number 1. Errors encountered so far in this task: 1. Query based insertion or updating of BLOB values can not be supported." The source we use is the Microsoft ODBC driver for SQL Server. Thanks in advance. Su Ge
I'm trying to transfer a table from a SQL 7 server to a SQL 6.5 server. When I try to select OLE DB as the destination, it gives me an error saying I can't use OLE DB unless I have SQL 7... so I chose ODBC.
When it tries to transfer the table I get this error:
QUERY BASED INSERTION OR UPDATING OF BLOB VALUES CAN NOT BE SUPPORTED
what the hell? Is this due to the transfer from 7 to 6.5? How do I get around this?
----- What I'm trying to do is change the datatype of a column in this 6.5 database from smallint to integer... (I wasn't the idiot that designed this database with a smallint primary key.) There is not enough room to copy the table into a new table, and when I ran through the above process it dropped the table, and I can't restore it because I'm not the admin. Is there a way of changing the primary key datatype without using up a large amount of database space? Or can I do a transfer from 7 to 6.5 somehow?
This is a nightmare... I wish my client would just upgrade to 7; then this would have taken 5 seconds instead of all day.
I currently have a problem with blobs being cut off.
From PowerBuilder, I am trying to pull in an image that is stored as a blob. This has always worked fine in our software until a more recent version, and it is now presenting me with this problem. Here are the details...
When the software is installed with a Sybase Database, everything is working great. It pulls in the full size, and there are no problems here.
However, when the software is installed with a SQL Server DB, problems arise.
The main problem: when using an ADO.NET DBMS interface to the SQL Server DB, the select statement is only pulling in 32,000 bytes.
Secondary problem: this one may present a problem in the future, if and when I fix the first problem. To narrow down whether it was the ADO.NET interface giving me the problem, I connected to the same table on the same server, but using an ODBC interface as opposed to ADO.NET. This gave me the first 32,768 bytes.
So a) ADO.NET, when interfacing with SQL Server, is only giving me the first 32,000 bytes in my SelectBlob statement. I have narrowed it down to ADO.NET, as the code works fine with Sybase, and ODBC interfacing with SQL Server does not limit it to 32,000 bytes.
And b) if I get a solution to that, something may then be limiting the blob read size to 32,768 bytes. Maybe, maybe not... but it is happening with ODBC against SQL Server.
Does anyone have any ideas? This is driving me wild. I have pounded Google searching for known ADO.NET blob limitations but cannot find anything.
I'm dying here. Anyone who can help me out would be great. Thanks.
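Purely a guess from the numbers, but worth ruling out: 32,768 is the old DB-Library default for the TEXTSIZE session setting, which caps how many bytes of a text/image column the server returns, and some client stacks set their own low defaults. It can be checked and raised from the same connection the application uses:

-- Show the current cap on returned text/image bytes for this session.
SELECT @@TEXTSIZE

-- Raise it to the 2 GB maximum before running the SelectBlob.
SET TEXTSIZE 2147483647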
Is it possible to import BLOB data into SQL Server using SQL statements stored in a file? Can the actual INSERT statement contain binary data? Or how should I convert the binary data in order for it to work with an INSERT statement?
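To the middle question: yes, an INSERT statement can carry binary data inline as a 0x... hexadecimal literal, so a script file full of such INSERTs works, though the statements get very large. On SQL Server 2005 or later there is also OPENROWSET(BULK ...), which pulls the bytes from a file at insert time; the table and path here are hypothetical:

-- SQL Server 2005+: load a file's bytes server-side at insert time.
INSERT INTO dbo.Docs (doc_name, doc_data)
SELECT 'contract.pdf', src.BulkColumn
FROM   OPENROWSET(BULK 'c:\import\contract.pdf', SINGLE_BLOB) AS src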
I am using an OLE DB Source component to read a blob data column from a table. Then I want to pass this column to a Script Component, which in its script should be able to convert it to a string and send it in an email. I have tried several ways, but none seems to work. Please advise.
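One workaround that avoids handling the blob in the script at all, assuming the column actually contains text: cast it to a character type in the OLE DB Source's SQL command, so the Script Component receives an ordinary string column. Names are hypothetical; if the column is image rather than text, cast through varbinary(max) first:

-- Cast the blob to a string in the source query so the data flow
-- carries character data instead of an image/text blob.
SELECT msg_id,
       CAST(msg_body AS VARCHAR(MAX)) AS msg_body_text
FROM   dbo.Messages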
I have a conversion application which converts an Access database to SQL Server (various versions). I'm using stored procedures. The thing is that I export the OLE Object from Access to a SQL varbinary column: I convert the binary data from the OLE Object to a string using ToBase64String. When I execute the SQL statement, I get the following error:
Error:Operand type clash: text is incompatible with varbinary.
Can anyone tell me what I am doing wrong and how I can fix this? Thanks.
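That error usually means a text value is meeting a varbinary one somewhere in the statement; SQL Server will not implicitly convert between the two, so a stored procedure parameter declared as text (or a string literal) being written into a varbinary column would produce exactly this clash. Keeping the whole path binary avoids it; a base64 string would instead have to be decoded back to bytes client-side before being passed. A hypothetical sketch of such a procedure signature:

-- Parameter and column are both binary types; no text/varbinary mix.
CREATE PROCEDURE dbo.usp_InsertDocument
    @doc_name VARCHAR(255),
    @doc_data IMAGE            -- or VARBINARY(MAX) on SQL 2005+
AS
INSERT INTO dbo.Documents (doc_name, doc_data)
VALUES (@doc_name, @doc_data)
GO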
I have a table in a SQL Server database that contains a field (data type image) that contains a text file. I'm trying to retrieve this file and save it onto the hard drive. I'm using the code below, but I get the error 'Arguments are of the wrong type, are out of acceptable range, or are in conflict with one another.' at line 11. This code worked fine when I tested it on the pubs database, exporting the logo field out of pub_info. I'm not quite sure why this doesn't work for my database. Can anyone see where I'm going wrong?

1  Dim cn As ADODB.Connection
2  Dim rs As ADODB.Recordset
3  Dim mstream As ADODB.Stream
4  cn = New ADODB.Connection
5  cn.Open("Provider=SQLOLEDB;Data Source=server;Initial Catalog=database;User Id='userid';Password='password'")
6  rs = New ADODB.Recordset
7  rs.Open("Select * from filesubmissions where bundleId = 'F0000014.bun'", cn, ADODB.CursorTypeEnum.adOpenKeyset, ADODB.LockTypeEnum.adLockOptimistic)
8  mstream = New ADODB.Stream
9  mstream.Type = ADODB.StreamTypeEnum.adTypeBinary
10 mstream.Open()
11 mstream.Write(rs.Fields("BLOB").Value)
12 mstream.SaveToFile("c:\export.txt", ADODB.SaveOptionsEnum.adSaveCreateOverWrite)
13 rs.Close()
14 cn.Close()

Thanks
Dear All, I am trying to run some code which is designed to return an image blob. However, it throws an exception.

Exception Details: System.IndexOutOfRangeException: ToolbarLogo

Source Error:

Line 55: Response.ClearHeaders();
Line 56: Response.ContentType = "image/gif";
Line 57: byte[] arr = (byte[]) sdr["ToolbarLogo"];
Line 58:
Line 59: Response.BinaryWrite(arr);

I can run the SQL SP query fine and it returns binary data OK. Does anyone have any pointers I should look at as to why the error is thrown? Do you think my blobs are corrupt? Here is the code. Thank you for taking the time to look.

public class GetBanner : System.Web.UI.Page
{
    protected System.Web.UI.WebControls.Image Image1;

    protected void Page_Load(object sender, EventArgs e)
    {
        SqlDataReader sdr = null;
        int brand_id = 0;
        try
        {
            brand_id = Int32.Parse(Request.QueryString["brand_id"].ToString());
        }
        catch (Exception)
        {
            // Commented out below, since this is an image handler.
            //Response.Write("<HTML>You must supply a brand_id</HTML>");
            return;
        }

        SqlDataAdapter daGetBanner = new SqlDataAdapter();
        string connectionInfo = ConfigurationSettings.AppSettings["ConnectionInfo"];
        using (SqlConnection dbConnection = new SqlConnection(connectionInfo))
        {
            SqlCommand objCommand = new SqlCommand("usp_get_new_guid_r", dbConnection);
            objCommand.CommandType = CommandType.StoredProcedure;
            dbConnection.Open();
            sdr = objCommand.ExecuteReader();
            while (sdr.Read())
            {
                Response.ClearHeaders();
                Response.ContentType = "image/gif";
                byte[] arr = (byte[]) sdr["ToolbarLogo"];
                Response.BinaryWrite(arr);
                Response.Flush();
            }
            sdr.Close();
            dbConnection.Close();
        }
    }
}