VERY Large Binary Import/Export Headache
Oct 13, 2006
Hi,
I am currently importing (and exporting) binary flat files to and from DB fields using the TEXTPTR and UPDATETEXT (or READTEXT for export) functions. This allows me to fetch/send the data in manageable packet sizes without the need to load complete files into RAM first.
Given that some files can be up to 1GB in size, I am keen to find a new way of doing this since the announcement that TEXTPTR, READTEXT and UPDATETEXT are going to be removed from T-SQL.
I had a quick foray into SSIS but couldn't find anything suitable which brings me back to T-SQL. If anyone knows a nice elegant way of doing this and is prepared to share, that would be grand.
Thanks for your time,
Paul
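For reference, the documented replacement for the TEXTPTR/UPDATETEXT/READTEXT pattern is varbinary(max) together with the .WRITE clause for chunked writes and SUBSTRING for chunked reads, which keeps the same packet-at-a-time behaviour. A minimal sketch, assuming a hypothetical table BlobStore(id int, data varbinary(max)):

-- initialise the row once; .WRITE cannot update a NULL value
UPDATE BlobStore SET data = 0x WHERE id = @id;

-- append one chunk (a NULL offset means "append at the end")
UPDATE BlobStore SET data.WRITE(@chunk, NULL, NULL) WHERE id = @id;

-- read one chunk back (SUBSTRING offsets are 1-based)
SELECT SUBSTRING(data, @offset + 1, @chunkSize) FROM BlobStore WHERE id = @id;

Driving the append/read from a client loop, exactly as with UPDATETEXT/READTEXT, means only one chunk is ever in memory at a time.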
View 9 Replies
Feb 27, 2007
Hi,
I am having some trouble with my ASP page sending a file to a SQL Server 2005 database running on another machine. An exception is generated as follows:
A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
It seems to work OK with smaller files, but when I attempt to upload an 11MB file, it gives me the above exception. I upped the httpRuntime maxRequestLength in the web.config to 2048. Any ideas? Here is my code:

if (FileUploader.HasFile)
{
    // call the stored proc
    SqlConnection conn = new SqlConnection();
    conn.ConnectionString = "Password=password;Persist Security Info=True;User ID=MyUser;Initial Catalog=MyDb;Data Source=MySqlServerMachine;Connect Timeout=300";
    conn.Open();

    SqlCommand cmd = new SqlCommand();
    cmd.Connection = conn;
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandText = "spAddTestBinary";

    SqlParameter param = new SqlParameter();
    param.ParameterName = "@binaryParam";
    param.SqlDbType = SqlDbType.VarBinary;
    param.Direction = ParameterDirection.Input;
    param.Value = FileUploader.FileBytes;
    cmd.Parameters.Add(param);

    cmd.ExecuteNonQuery();

    conn.Close();
}
Here is my table:

CREATE TABLE BinaryTable(
    [id] [int] IDENTITY(1,1) NOT NULL,
    [myBinary] [varbinary](max) NULL,
    CONSTRAINT [PK_BinaryTable] PRIMARY KEY CLUSTERED
    (
        [id] ASC
    ) WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
My stored proc:

ALTER PROCEDURE [dbo].[spAddTestBinary]
    -- Add the parameters for the stored procedure here
    @binaryParam varbinary(MAX)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    INSERT INTO BinaryTable (myBinary)
    VALUES (@binaryParam)
END
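One approach worth trying, sketched here rather than tested: instead of sending all 11MB in a single parameter (one very large round trip, which is where the transport error points), insert the row with an empty 0x value and append the file in smaller chunks using the .WRITE clause. A hypothetical companion procedure:

CREATE PROCEDURE [dbo].[spAppendTestBinary]
    @id int,
    @chunk varbinary(max)
AS
BEGIN
    SET NOCOUNT ON;
    -- a NULL offset appends @chunk to the end of the existing value;
    -- note .WRITE cannot update a NULL column, hence seeding with 0x
    UPDATE BinaryTable
    SET myBinary.WRITE(@chunk, NULL, NULL)
    WHERE id = @id;
END

The client would then loop over FileUploader.FileBytes in, say, 512KB slices and call this procedure once per slice, so no single request carries the whole file.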
View 3 Replies
View Related
Jul 23, 2005
Hello, we are investigating the use of SQL Server as a backend to our scientific imaging application. We have found that when we write a large image (60 megabytes), the performance is quite a bit slower than writing 60 single-megabyte images. The tests were performed running SQL Server 2000 on Windows 2003 Enterprise on a single machine to eliminate the network's contribution. Perhaps there is a configuration option that will allow us to tune SQL Server to better handle large writes?
TIA
View 1 Replies
View Related
Sep 7, 2006
From what I can see, the 'varbinary(max)' data type is not supported, and the 'image' data type is supposed to go away. Is there some other way to store large chunks (10MB to 100MB) of data into an SSEv DB?
If I have to use the 'image' data type to do this, does anyone have a code sample that would let me push an array() of numbers into an 'image' field, and unload an 'image' field into an array()?
TIA
Pat
View 7 Replies
View Related
Jul 14, 2007
Hi, I've followed a tutorial on how to write and read varbinary(max) data to and from a database. But when I try to read the data I get an error that the data would be truncated, but only when the varbinary(max) is greater than 8KB. I've used a system stored procedure (sp_tableoption) to set the table that holds the data to store data outside rows.
To select the data I'm using a stored procedure:

SELECT imageData, MIMEType FROM Pictures WHERE (imageTitle = @imageTitle)

And then using an .aspx page to Response.Write the data:

Using conn As New sql.SqlConnection
    conn.ConnectionString = ConfigurationManager.ConnectionStrings("myConnectionString").ToString
    Dim getLogoCommand As New sql.SqlCommand
    getLogoCommand.CommandType = Data.CommandType.StoredProcedure
    getLogoCommand.CommandText = "GetPicture"
    getLogoCommand.Connection = conn
    Dim imageTitleParameter As New sql.SqlParameter("@imageTitle", Data.SqlDbType.NVarChar, 200)
    imageTitleParameter.Value = Request("imageTitle")
    imageTitleParameter.Direction = Data.ParameterDirection.Input
    getLogoCommand.Parameters.Add(imageTitleParameter)
    conn.Open()
    Using logoReader As sql.SqlDataReader = getLogoCommand.ExecuteReader
        logoReader.Read()
        If logoReader.HasRows = True Then
            Response.Clear()
            Response.ContentType = logoReader("MIMEtype").ToString()
            Response.BinaryWrite(logoReader("imageData"))
        End If
    End Using
    conn.Close()
End Using

Can anyone please help me with this?!
View 2 Replies
View Related
Jun 2, 2015
I have a well-structured but also very large binary data set that is generated by a C++ application every five minutes. The data needs to be accessed by SQL applications. Since data is generated every five minutes, performance is key, both for write and read. The data set is about 500MB. If the data is written to the file system, the write performance doesn't involve SQL Server. For reading it, I have a CLR function that reads the portions of the data that I need based on offset and length. That works and is very fast. The problem is that the data is stored in the file system, so it is not self-contained within the database.
A second option that I haven't explored yet is to write the data into a table as VARBINARY(MAX). I would read the data using SUBSTRING with the appropriate offset and length. I'm wondering about the performance of SQL write/read of binary data of this size, and whether there is a third option I haven't thought of. I'm using SQL Server 2014.
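For what it's worth, the VARBINARY(MAX) read path can mirror the CLR offset/length interface directly. A minimal sketch, assuming a hypothetical table dbo.BlobTable(id int, payload varbinary(max)):

-- SUBSTRING is 1-based while file offsets are usually 0-based, hence the +1
SELECT SUBSTRING(payload, @offset + 1, @length) AS chunk
FROM dbo.BlobTable
WHERE id = @id;

For the five-minute write, the whole 500MB value can be replaced in one UPDATE, or appended in chunks with payload.WRITE(@chunk, NULL, NULL) to keep individual round trips small.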
View 5 Replies
View Related
Feb 25, 2008
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server.
I want to import the data which draws from the view to a table using SQL Server Import and Export Wizard.
However, when I run the wizard on the server, it gives me the following error message and stops on the step Setting Source Connection:
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
(SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Has anyone encountered this problem before, and does anyone know what is happening?
Thanks for your kind reply.
Best regards,
Calvin Lam
View 6 Replies
View Related
Oct 16, 2006
I am attempting to import data from Microsoft Access databases to SQL Server 2000 using the DTS Import/Export Wizard. I have a few errors.
Error at Destination for Row number 1. Errors encountered so far in this task: 1.
Insert error column 152 ('ViewMentalTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 150 ('VRptTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 147 ('ViewAppTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 144 ('VPreTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 15 ('Time', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Could you please look into this and guide me?
Thanks in advance
venkatesh
imtesh@gmail.com
View 4 Replies
View Related
Oct 18, 2007
Hi,
I used to export my tables to an Excel file,
but now I have tables which contain binary (image) data.
What happens to them?
Is there any way to back up this data?
View 1 Replies
View Related
Nov 29, 2006
I am trying to simplify a query given to me by one of my colleagues, written using the query designer of Access. Looking at the query there seem to be some syntax differences, so to see if this was the case I thought I would import the database to my SQL Server Developer Edition.
I tried to start the wizard from within SQL Server Management Studio Express as shown in one of the articles on MSDN, which did not work, but the manual method that was also suggested did work.
Trouble is that it gets most of the way through the import until it spews forth the following error messages:
- Prepare for Execute (Error)
Messages
Error 0xc0202009: {332B4EB1-AF51-4FFF-A3C9-3AEE594FCB11}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not start session. Too many sessions already active.".
(SQL Server Import and Export Wizard)
Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Source 33 - ATable" (2065) failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard).
There does not seem to be any method of specifying a number of sessions, so I don't see how to get round the problem.
Does anyone know how I can get the import to work?
View 2 Replies
View Related
Feb 17, 2006
I am importing data from binary data files into SQL Server. This is the code I am using:
Do While Not EOF(1)
Get #1, , NonStdCurrRecord
adoRS.AddNew
adoRS!Field1 = CDate(NonStdCurrRecord.Field1)
adoRS!Field2 = CSng(NonStdCurrRecord.Field2)
adoRS!Field3 = CSng(NonStdCurrRecord.Field3)
adoRS!Field4 = CSng(NonStdCurrRecord.Field4)
adoRS!Field5 = CSng(NonStdCurrRecord.Field5)
adoRS!Field6 = CSng(NonStdCurrRecord.Field6)
adoRS!Field7 = CSng(NonStdCurrRecord.Field7)
adoRS.Update
Loop
Unfortunately, it takes about 8 mins to import a file with 180k records. Is there a faster way to do this?
View 4 Replies
View Related
Apr 3, 2006
I have a web application that I am rebuilding. I have many picture files that I want to take off the file system and move into SQL as a blob. I will create an index of UIDs against the file names, but need a good way to bulk add the files to the database... any hints on code or tools would be a great help.
Thanks
Bill
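One set-based way to do this from T-SQL, as a sketch (assuming SQL Server 2005 or later, hypothetical table and path names, and that the files are readable from the server itself):

INSERT INTO dbo.Pictures (Uid, FileName, ImageData)
SELECT NEWID(), 'photo001.jpg', src.BulkColumn
FROM OPENROWSET(BULK N'C:\images\photo001.jpg', SINGLE_BLOB) AS src;

OPENROWSET(BULK ...) requires a literal file path, so to load a whole directory you would loop over an index table of file names and build this statement per file with dynamic SQL, or drive the loop from client code.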
View 1 Replies
View Related
Mar 30, 2008
I'm trying to export data from an SQL table using classic ASP. In the past, I have created tab delimited text files using code similar to the following code:
Response.AddHeader "Content-Disposition", "attachment;filename=results.txt"
Response.ContentType = "application/octet-stream"
Response.Write "Field1"&vbTAB
Response.Write "Field2"&vbTAB
Response.Write "Field3"&vbTAB
Response.Write vbCRLF
RSb.Open "SELECT * FROM tblData1 WHERE ID=1 ORDER BY txtField1", con, 3, 3
Do Until RSb.EOF
Response.Write RSb("txtField1")&vbTAB
Response.Write RSb("txtField2")&vbTAB
Response.Write RSb("txtField3")&vbTAB
Response.Write vbCRLF
RSb.MoveNext
Loop
RSb.Close
This always worked fine in the past because the tables were small. Now the tables are exporting 100,000 records and I'm getting errors. I'm setting the page timeout and the SQL connection timeout, but I'm still getting errors. I'm not exactly sure what's happening, but the export stops after exporting about 5MB. It varies where it stops, so I'm thinking it's timing out somewhere.
Is there a better way to do this? Possibly use an export function in MS SQL that would export faster?
TIA
- PR
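On the "export function in MS SQL" idea: bcp in queryout mode writes a query result straight to a delimited text file on the server and is much faster than streaming row by row through ASP. A sketch, assuming xp_cmdshell is enabled, a trusted connection, and hypothetical database/server/path names:

EXEC master..xp_cmdshell 'bcp "SELECT txtField1, txtField2, txtField3 FROM MyDb.dbo.tblData1 WHERE ID=1 ORDER BY txtField1" queryout "C:\exports\results.txt" -c -T -S MyServer'

With -c (character mode) the default field terminator is a tab and the row terminator a newline, matching the vbTAB/vbCRLF layout above. The ASP page could then serve the finished file instead of generating it per request.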
View 4 Replies
View Related
Oct 5, 2007
I'm trying to export data from a SQL Server 2005 database to a DB2 database through SSIS. However, I keep getting an error that says "Could not retrieve table list" with Invalid Conversion, SQLSTATE=07006. Does anybody have any ideas on what the problem could be?
-Kyle
View 7 Replies
View Related
May 27, 2008
Hi All,
I have a table in MS Access with CandidateId and Image columns. The Image column is in OLE Object format. I need to move this to SQL Server 2005, with the CandidateId column as an integer and the candidate Image column as the Image datatype.
It's very urgent. I need a tool to move this to SQL Server 2005, or C# code to move this table from MS Access to SQL Server 2005.
Please do the needful ASAP; waiting for your reply.
with regards
View 1 Replies
View Related
Aug 31, 2006
Hi guys,
Hopefully this is the right place to ask.
Basically we have two large databases, one of which is updated from the other monthly.
For explanation purposes:
DB1 = Source DB
DB2 = Destination DB
The problem that I require a solution to is: how do I insert rows from a table in DB1 to DB2 and recover and store the identity of the new row against the ID of the existing row? This is so that I can then maintain constraints when it comes to inserting rows into the next table and the next and so on.
This process of storing the ID's as lookups will need to be done for almost every table of which there are 20.
The best idea we have at the minute is to create a table with two columns for each table (dropped and recreated after each table has been exported) that contains the two IDs, new and old.
This will require using a cursor for each row in the existing table: inserting it into the new table, then using SCOPE_IDENTITY() to get the new ID, and then inserting the two values into the temp table.
This to me feels like it will be very slow, particularly when I bear in mind how much data we have.
Does anyone have any better ideas? (Sorry if the explanation isn't great; it's difficult to get across.)
Thanks
Ed
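One cursor-free alternative, as a sketch with hypothetical table and column names (and assuming both databases sit on the same server): carry the old ID across in a temporary extra column, so the old/new mapping falls out of a single set-based insert.

-- add a scratch column to the destination table
ALTER TABLE DB2.dbo.Customers ADD OldId int NULL;

-- one set-based insert instead of a row-by-row cursor
INSERT INTO DB2.dbo.Customers (Name, Region, OldId)
SELECT Name, Region, Id
FROM DB1.dbo.Customers;

-- harvest the old/new ID pairs into the lookup table
INSERT INTO DB2.dbo.IdLookup (OldId, NewId)
SELECT OldId, Id
FROM DB2.dbo.Customers
WHERE OldId IS NOT NULL;

-- tidy up
ALTER TABLE DB2.dbo.Customers DROP COLUMN OldId;

The same pattern repeats for each of the 20 tables, with the lookup table joined in when inserting child rows to translate the foreign keys.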
View 1 Replies
View Related
Oct 27, 2006
I can't find the answer to these problems. I'm using SQL Server 2005 Developer Edition. I have a report with 15,036 rows and about 15 columns. It utilizes a table and a report header and footer. One column has data containing hyperlinks to another report.
1) When I export this report to Excel, the result is a 14.2MB file. Why is this file so huge, and how can I make it smaller?
2) Is there anything I can do to prevent the inclusion of the hyperlinks in the exported Excel file?
Thanks,
Rosie
View 4 Replies
View Related
Oct 8, 2004
Importing data from an Access database, I cannot overcome the limit of 1,000 records.
In DTS, I "copy one or more tables", select tables, run, and cannot see my 1,052 entries.
Where can I set a maximum size of ~1,500 in my SQL target database?
View 1 Replies
View Related
Mar 19, 2001
I was wondering if anyone can help me.
I am trying to import data into SQL Server 7. The table will be 700-800 columns, and the data will be about 150,000 records at a time.
The data source is flat file.
First I create the table using a database schema, and secondly I would like to populate the table.
The problem is that most of the data is numeric, and to be used for statistical analysis.
So far I have tried Bulk Insert, bcp, and dts.
DTS is the only method that has worked in any way, shape or form, but that requires importing each column as a Varchar. Importing to my pre-created table doesn't work, because it is interpreting some of the source columns as character data and refusing to insert them into an int field.
Bulk Insert and bcp both give error messages, and I am wondering if that is because of the size of the insert statement that is required to handle so many fields.
For the moment I am just trying to import the data in any way, but eventually, it will have to be run as an automated process, with the table structure probably needing to be altered as well.
Any help/suggestions would be very gratefully received.
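A sketch of the BULK INSERT route that usually gets around the type-guessing problem: load into an all-varchar staging table first, then convert explicitly. Table names, paths, and the format file here are hypothetical; a bcp format file describes the 700-800 columns once and is reused on every run:

BULK INSERT dbo.Stats_Staging
FROM 'C:\data\stats.txt'
WITH (
    FORMATFILE = 'C:\data\stats.fmt',
    MAXERRORS = 10,   -- tolerate a few bad rows instead of aborting
    TABLOCK           -- speeds up large loads considerably
);

-- convert explicitly, so bad values can be located rather than silently refused
INSERT INTO dbo.Stats (Col1, Col2 /* , ... */)
SELECT CAST(Col1 AS int), CAST(Col2 AS float) /* , ... */
FROM dbo.Stats_Staging;

This also automates cleanly, which matters for the eventual scheduled process.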
View 2 Replies
View Related
Jun 15, 2001
When using DTS (in SQL 7) to export a large varchar via OLE DB to a text file, it clips it at 255 chars. No other data access drivers seem to work either. This is lame! I cannot use bcp as a workaround, because I want to use quoted comma-delimited output, which it doesn't support, and I am using a query-based export where the query calls a stored proc, which bcp also doesn't support.
Are there any new versions of MDAC that fix this? Anyone know a workaround? My current hack fix is to split my field into two, but this is a grubby fix that hassles my recipients.
This is a pretty fundamental limitation for a major product!
dn
View 1 Replies
View Related
Jun 21, 2007
Hello, All:
I have many, many Access databases that are roughly 1.5GB-3GBs each and they have millions of records. Each MS Access Database file corresponds to one Database in SQL server. I'm trying to simply transform the data as it is in Access to MS SQL 2005.
I'm using the 64-bit version of Windows Server 2003 and the 64-bit version of SQL 2005. The server is running four dual-core AMD Opteron processors and has 8GB of RAM with a 1TB RAID 5 configuration. I think the hardware should be sufficient, but the SQL Server Import and Export Wizard can't seem to handle the large number of tables/records. If I do one table at a time, it works well; however, it produces the following error message whenever I try to import the entire database:
Pre-execute (Error)
Messages
Error 0xc0202009: {5A5BF7AD-E86B-4316-AD43-1912358C56F4}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Unspecified error".
(SQL Server Import and Export Wizard)
Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Source 64 - District Corporal Punishment Class" (5743) failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard)
Any ideas would be much appreciated!
Thank you,
Cody
View 1 Replies
View Related
Oct 7, 2005
This is a general question on the best way to import a large amount of data to a MS-SQL DB. I can have the data in just about any format I need to, I just don't know how to import the data. I have some experience with SQL but not much. There are about 1500 to 2000 lines of data. I am looking for the best way to get this amount of data in on a monthly basis. Any help is greatly appreciated!!
Mike Charney
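For a monthly load of this size, a comma- or tab-delimited file plus BULK INSERT is usually the simplest route. A sketch with hypothetical table and path names:

BULK INSERT dbo.MonthlyData
FROM 'C:\imports\monthly.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2   -- skip the header row, if the file has one
);

The same statement can be scheduled as a monthly SQL Agent job so the load runs without manual steps.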
View 5 Replies
View Related
Apr 20, 2007
Hi,
I am making an SSIS package that imports data from an application using a custom ODBC driver. The field in the application is set to be a "longvarchar" type field and can be from 2 characters to 2MB of data.
I've created a ODBC data connection in the SSIS package and use a "DataReader Source" to read the data I need. The sql statement is very simple
Select log from tablename
When I try to run the SSIS package with that statement it just goes to yellow on the DataReader Source and stops. It stays like that until I stop it. If I select other fields except for that field it works fine. Also I've been able to get it to succeed getting the log field if I select a log record that's not too big. The largest one I've been able to get is 800 characters, but I got one with 2500 characters that just stops on yellow.
In the Progress log the last line says:
[DTS.Pipeline] Information: Execute phase is beginning.
Does anyone have any ideas on how to resolve this?
View 6 Replies
View Related
Jul 28, 2014
I need to create a script that will import large XML files (500MB - 7GB) on a daily basis and store the data in a relational DB structure.
What is the best and fastest way of importing such files? I have played around with smaller files and found the following.
1. SSIS XML Data Source: It doesn't seem to like the complex elements types and throws out the file.
2. Using bulk file import, storing the file in an XML variable and using XQuery to parse the file: This works, but it can't take a file more than 2GB in size, so I can't use this method.
3. C# + XML serialization: This also works, but seems to be terribly slow. I open the DB connection once, so it doesn't open and close for each DB call, but it still seems to take a long time.
How can I import large XML files quickly into a relational table structure?
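For files that do fit under the 2GB limit of the xml type, option 2 (bulk load plus XQuery shredding) looks roughly like the sketch below; the element, table, and path names are hypothetical. The over-2GB files still need a streaming parser outside T-SQL.

DECLARE @doc xml;

-- pull the whole file into an xml variable in one statement
SELECT @doc = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\feeds\daily.xml', SINGLE_BLOB) AS f;

-- shred the repeated elements into relational rows
INSERT INTO dbo.Items (ItemId, ItemName)
SELECT x.n.value('(id/text())[1]', 'int'),
       x.n.value('(name/text())[1]', 'nvarchar(200)')
FROM @doc.nodes('/root/item') AS x(n);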
View 9 Replies
View Related
Apr 4, 2001
Hi guys,
I need to import a database from a Sybase database. Tell me the best way of solving this issue.
I have already tried using a DTS package, which is not working properly, since I could not get primary keys or foreign keys, though I am able to get rows back. It is a real pain to write a script to create the primary keys and foreign keys.
Is there any way to solve this issue? If so, let me know ASAP, since I have a deadline by this weekend.
Please solve this issue...
Thanks folks..
urs
VJ
View 1 Replies
View Related
Apr 11, 2000
Can somebody tell me, if I want to import data from DB2 on an IBM mainframe, do I need a separate ODBC driver?
View 2 Replies
View Related
Dec 3, 2000
I am trying to design an import/export utility but
keep coming up against the problem of row and column delimiters.
Our data has every possible combination of delimiters in the data itself
so of course this causes the import to fail.
Basically I need a way to export and import a given table to a text file.
Is there any way to solve this without continually worrying about delimiters?
View 1 Replies
View Related
Sep 20, 2000
Q: Whenever I import or export one database to another in SQL 7, it transfers all objects but fails at the users/permissions transfer. I tried different options, but got the same result. The odd thing is that when you look at the error message, it fails at a different user every time.
Thanks
View 1 Replies
View Related
Feb 27, 2001
newbie...
I have a development environment and a test environment...
I would like to create a batch file of sorts in which I can back up (I would export in Oracle) the development environment, objects and data, and force a restore (import in Oracle) over the test environment. The reason for this is that we are a fast-paced development team and it is important for us to update the test environment on demand. This may happen two to three times a day, and the DBA (me) is not always available to perform this task.
thank you,
joe
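A minimal sketch of the script such a batch file would run; the database names, disk paths, and logical file names here are assumptions (RESTORE FILELISTONLY reports the real logical names):

BACKUP DATABASE DevDB
TO DISK = 'D:\backups\DevDB.bak'
WITH INIT;  -- overwrite the previous backup set each run

RESTORE DATABASE TestDB
FROM DISK = 'D:\backups\DevDB.bak'
WITH REPLACE,  -- force the restore over the existing test database
     MOVE 'DevDB_Data' TO 'D:\data\TestDB.mdf',
     MOVE 'DevDB_Log'  TO 'D:\data\TestDB_log.ldf';

Saved as a .sql file, the developers can kick it off themselves from a batch file via osql/isql, so the refresh no longer depends on the DBA being available.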
View 1 Replies
View Related
Jun 9, 2004
Can anyone tell me how I can import/export to/from CSV files programmatically?
Regards
View 5 Replies
View Related
Jul 13, 2004
When I run SELECT @@SERVERNAME
I get NULL.
Actually I am trying to transfer data and tables from the production server to the development server. When I try it, I get the error 'You cannot copy objects when the source and destination databases are the same'.
The database name is the same, but the source and destination servers are different. Is it because @@SERVERNAME is NULL? I don't know how it became NULL. How can I change it?
Thanks in advance
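The usual fix for a NULL @@SERVERNAME is to re-register the local server name and then restart the SQL Server service ('YourServerName' below is a placeholder for the machine's actual name):

-- if an old or wrong name is listed in sysservers, drop it first:
-- EXEC sp_dropserver 'OldName';
EXEC sp_addserver 'YourServerName', 'local';
-- @@SERVERNAME only reflects the change after the service restarts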
View 3 Replies
View Related
Jan 21, 2004
Hello All,
I am trying to export data from a database in MS-SQL Server and import the same to a database in DB2 UDB v7.2 on AIX 5. While trying to do this, I am getting some errors: "table name is undefined" and "drivers are not capable."
I selected the IBM DB2 ODBC DRIVER for the destination server. Am I making a mistake somewhere?
This is being posted in both DB2 and MS-SQL sites.
Thanking you all....
...Ram
View 6 Replies
View Related
Jan 10, 2007
How do I export records from a SQL Server 2005 database to Notepad or to Excel?
And how do I import records from SQL Server 2005 to SQL Server 2000?
Can anybody help me, please?
with regards
shaji
View 4 Replies
View Related