Large SQL Export

Mar 30, 2008

I'm trying to export data from an SQL table using classic ASP. In the past, I have created tab-delimited text files using code similar to the following:

' Send the results as a tab-delimited attachment
Response.AddHeader "Content-Disposition", "attachment;filename=results.txt"
Response.ContentType = "application/octet-stream"

' Header row
Response.Write "Field1" & vbTab
Response.Write "Field2" & vbTab
Response.Write "Field3" & vbTab
Response.Write vbCrLf

' RSb is an ADODB.Recordset, con an already-open ADODB.Connection
Set RSb = Server.CreateObject("ADODB.Recordset")
RSb.Open "SELECT * FROM tblData1 WHERE ID=1 ORDER BY txtField1", con, 3, 3
Do Until RSb.EOF
    Response.Write RSb("txtField1") & vbTab
    Response.Write RSb("txtField2") & vbTab
    Response.Write RSb("txtField3") & vbTab
    Response.Write vbCrLf

    RSb.MoveNext
Loop
RSb.Close

This always worked fine in the past because the tables were small. Now the tables hold around 100,000 records and the export fails. I'm setting the page timeout and the SQL connection timeout, but I'm still getting errors. I'm not exactly sure what's happening, but the export stops after about 5 MB. Where it stops varies, so I'm thinking it's timing out somewhere.

Is there a better way to do this? Possibly an export function in MS SQL that would be faster?
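[Editor's note: the stall at around 5 MB is consistent with IIS buffering the whole response before sending it. A minimal sketch of the usual fix - stream the output in chunks and raise the script timeout. The 500-row flush interval is an arbitrary choice, and a forward-only, read-only cursor is the cheapest for a one-pass export:]

Server.ScriptTimeout = 3600   ' seconds; the default is much lower
' 0, 1 = adOpenForwardOnly, adLockReadOnly
RSb.Open "SELECT * FROM tblData1 WHERE ID=1 ORDER BY txtField1", con, 0, 1
rowCount = 0
Do Until RSb.EOF
    Response.Write RSb("txtField1") & vbTab & RSb("txtField2") & vbTab & RSb("txtField3") & vbCrLf
    rowCount = rowCount + 1
    If rowCount Mod 500 = 0 Then
        Response.Flush   ' push this chunk to the client (buffering is on by default)
        If Not Response.IsClientConnected Then Exit Do   ' stop if the download was cancelled
    End If
    RSb.MoveNext
Loop
RSb.Close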

TIA

- PR




Large Export Of Data From One DB Of One Structure To Another

Aug 31, 2006

Hi guys,

Hopefully this is the right place to ask.

Basically we have two large databases, one of which is updated from the other monthly.

For explanation purposes:

DB1 = Source DB

DB2 = Destination DB

The problem I require a solution to is: how do I insert rows from a table in DB1 into DB2 and recover and store the identity of each new row against the ID of the existing row? This is so that I can then maintain constraints when it comes to inserting rows into the next table, and the next, and so on.

This process of storing the IDs as lookups will need to be done for almost every table, of which there are 20.

The best idea we have at the minute is to create a lookup table with two columns (dropped and recreated after each table has been exported) that contains the two IDs, new and old.

This will require using a cursor over each row in the existing table, inserting the row into the new table, then using SCOPE_IDENTITY() to get the new ID and inserting the two values into the temp table.

This feels to me like it will be very slow, particularly when I bear in mind how much data we have.

Does anyone have any better ideas? (Sorry if the explanation isn't great; it's difficult to get across.)
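[Editor's note: one set-based alternative to the cursor, sketched with hypothetical table and column names, assuming both databases sit on the same server (otherwise a linked server would be needed). The trick is to carry the old ID across in a spare column, then harvest the old/new pairs with one query instead of per-row SCOPE_IDENTITY() calls:]

-- Lookup table for old/new key pairs
CREATE TABLE DB2.dbo.IDMap_Customer (OldID int PRIMARY KEY, NewID int NOT NULL);

-- 1) Set-based copy, carrying the source key in a spare OldID column
INSERT INTO DB2.dbo.Customer (Name, City, OldID)
SELECT Name, City, CustomerID
FROM DB1.dbo.Customer;

-- 2) Harvest the old/new pairs in one pass
INSERT INTO DB2.dbo.IDMap_Customer (OldID, NewID)
SELECT OldID, CustomerID
FROM DB2.dbo.Customer
WHERE OldID IS NOT NULL;

-- 3) Child tables then translate their foreign keys through the map:
-- INSERT INTO DB2.dbo.Orders (CustomerID, OrderDate)
-- SELECT m.NewID, o.OrderDate
-- FROM DB1.dbo.Orders o
-- JOIN DB2.dbo.IDMap_Customer m ON m.OldID = o.CustomerID;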

Thanks

Ed


Export To Excel File Too Large

Oct 27, 2006

I can't find the answer to these problems. I'm using SQL Server 2005 Developer Edition. I have a report with 15,036 rows and about 15 columns. It utilizes a table and a report header and footer. One column has data containing hyperlinks to another report.

1) When I export this report to Excel, the result is a 14.2 MB file. Why is this file so huge, and how can I make it smaller?

2) Is there anything I can do to prevent the inclusion of the hyperlinks in the exported Excel file?

Thanks,

Rosie


Cannot Export Large Varchar To Text File

Jun 15, 2001

When using DTS (in SQL 7) to export a large varchar to a text file via OLE DB, it clips it at 255 chars.
No other data access drivers seem to work, either. This is lame! I cannot use bcp as a workaround,
because I want quoted comma-delimited output, which it doesn't support, and I am using a query-based
export where the query calls a stored proc, which bcp also doesn't support.

Are there any new versions of MDAC that fix this? Anyone know a workaround? My current hack fix
is to split my field in two, but this is a grubby fix that hassles my recipients.

This is a pretty fundamental limitation to a major product!

dn


VERY Large Binary Import/export Headache

Oct 13, 2006

Hi,

I am currently importing (and exporting) binary flat files to and from database fields using the TEXTPTR and UPDATETEXT (or, for export, READTEXT) functions. This allows me to fetch/send the data in manageable packet sizes without the need to load complete files into RAM first.

Given that some files can be up to 1 GB in size, I am keen to find a new way of doing this since the announcement that TEXTPTR, READTEXT and UPDATETEXT are going to be removed from T-SQL.

I had a quick foray into SSIS but couldn't find anything suitable, which brings me back to T-SQL. If anyone knows a nice, elegant way of doing this and is prepared to share, that would be grand.
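[Editor's note: the documented replacement path is varbinary(max), using the UPDATE ... .WRITE clause for chunked appends and SUBSTRING for chunked reads. A minimal sketch, assuming a hypothetical dbo.Files table (FileID int PRIMARY KEY, Data varbinary(max)):]

DECLARE @fileId int, @offset bigint, @size int, @chunk varbinary(8000);
SET @fileId = 1;
SET @size   = 8000;
SET @offset = 1;
SET @chunk  = 0x0102;   -- in practice, one packet supplied by the application

-- Import: .WRITE with a NULL offset appends the chunk. Note the column
-- must start out as 0x rather than NULL for .WRITE to work.
UPDATE dbo.Files
SET Data.WRITE(@chunk, NULL, NULL)
WHERE FileID = @fileId;

-- Export: read one chunk at a time (SUBSTRING offsets are 1-based)
SELECT SUBSTRING(Data, @offset, @size)
FROM dbo.Files
WHERE FileID = @fileId;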

Thanks for your time,
Paul


How Can I Export Foreign Keys And Primary Keys With SQL2005 Management Studio/Database/Tasks/Export Data Wizard?

Jan 4, 2008

How can I export a database with its foreign keys and primary keys?

The operation is:
SQL2005 Management Studio/Database/Tasks/Export Data


In the previous version, SQL 2000, we could select "Copy objects and data between servers", check "Use default options", and then select options such as copy indexes and copy foreign/primary keys.

But these options are not in the SQL2005 Management Studio/Database/Tasks/Export Data wizard, or I can't find them.

How can I export foreign keys and primary keys with the SQL2005 Management Studio/Database/Tasks/Export Data wizard?

Best Regards,

Athena.


Import/Export Export Wizard

Oct 5, 2007

I'm trying to export data from a SQL Server 2005 database to a DB2 database through SSIS. However, I keep getting an error that says "Could not retrieve table list" with Invalid Conversion, SQLSTATE=07006. Does anybody have any ideas about what the problem could be?
-Kyle


Export To Excel - Number Formatted Cells Export To Excel As 'General' ?

Feb 5, 2007

Anyone know why cells within a matrix that are formatted as numeric export to Excel with a cell format property of "General"? Cells within a table, however, export with an appropriate format.

Thanks


BCP Export Has CHR(0) In Export File

Mar 28, 2008



Hello,

I am using BCP to export a table to a TAB-delimited file. This works great, but in some fields a NULL in the table is being exported as a character zero in the tab file. I confirmed this by looking at the tab file with a hex editor.

I am using the BCP options -T -c -S.

Thanks for any ideas on how to eliminate the chr(0).
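[Editor's note: one workaround worth trying, sketched with placeholder database/table/column names: export with queryout and wrap the nullable columns in ISNULL, so a NULL can never reach the file in the first place:]

rem -c writes plain tab-delimited character data; -T trusted connection
bcp "SELECT Col1, ISNULL(Col2, ''), ISNULL(Col3, '') FROM MyDB.dbo.MyTable" queryout out.txt -c -T -S MYSERVER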

Tom


Large DataBase

Jul 23, 2007

What happens if you have a website and the hard drive on your server is, say, 250 GB, and then the database exceeds that and grows to 300 GB in size? How would you spread your database across two different hard drives?

Thanks
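[Editor's note: the usual approach is to add a second data file to the database on the other drive; SQL Server then spreads new allocations proportionally across the files in the filegroup. A sketch with placeholder names and paths:]

ALTER DATABASE MyShopDB
ADD FILE (
    NAME = MyShopDB_Data2,
    FILENAME = 'E:\SQLData\MyShopDB_Data2.ndf',   -- on the second drive
    SIZE = 10240MB,
    FILEGROWTH = 1024MB
);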


Large Tables In SQL 7.0

Jul 12, 2001

We currently have a data warehouse running on SQL 7.0, SP2. One of our primary fact tables is now well over 155 million rows in it. The table is not very wide, as it only contains 17 columns, most of which are defined as integers. The entire database is only 20 GB.

The issue is that the loads from the staging table to this fact table have significantly deteriorated over the last month or so, dropping from over 400 transactions per second to around 85. We drop all the indexes on the fact table before we load the data into it.

Are there issues with manageable table sizes in SQL 7.0 that we need to be concerned about? And should we consider partitioning the table into several smaller tables and joining them with a "union all" view?
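[Editor's note: for concreteness, a local partitioned view of that kind might look like the following sketch, with hypothetical names. The CHECK constraints on the partitioning column are what let the optimizer touch only the relevant member tables:]

-- Each member table carries a CHECK constraint on the partitioning key
CREATE TABLE dbo.Fact_200106 (
    SaleDate datetime NOT NULL CHECK (SaleDate >= '20010601' AND SaleDate < '20010701'),
    StoreID  int NOT NULL,
    Amount   int NOT NULL
);
CREATE TABLE dbo.Fact_200107 (
    SaleDate datetime NOT NULL CHECK (SaleDate >= '20010701' AND SaleDate < '20010801'),
    StoreID  int NOT NULL,
    Amount   int NOT NULL
);
GO
CREATE VIEW dbo.Fact
AS
SELECT SaleDate, StoreID, Amount FROM dbo.Fact_200106
UNION ALL
SELECT SaleDate, StoreID, Amount FROM dbo.Fact_200107;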

I really need to get this performance issue resolved, as our IT support vendor is pushing us to port the data warehouse to UDB because they tell us that SQL server is not scalable enough to handle this volume of data.

Thanks for any help you can provide.

George M. Parker


Maintenance A Very Large Db

Aug 31, 2001

Hi, anyone administering a pretty big database (not less than 30 GB, with 2M or more rows in the average table), please share your experience with maintenance of such a DB. Especially, I'm interested in:

1) Index maintenance (when and how: just regular DBCC, a maintenance plan, or some script to split the job in two, and so on)

2) Removing unused space from the DB (not major)

The server works 24x7, and it's a transactional environment: SQL 7 SP3 on a cluster.

I run a stored procedure to rebuild all the indexes; it takes about 2-3 hours to determine the objects with fragmentation less than 80% and actually rebuild them, and during this process users experience performance problems (especially for updates/inserts). It looks like I need to change the plan or strategy for doing this. Any thoughts appreciated!

Thanks in advance.
Dmitri


Large Tables

Aug 10, 2000

Hi,

How can I partition the large tables so that the inserts and updates which I am doing on them take less time?

I want to know how I can partition large tables, and, if I do that, how performance is going to be increased.

Thanks.


Log Files Too Large.

Oct 20, 2000

Hi,

I have inherited some databases with extremely large log files.
I tried truncating the transaction log, but it did not work.
Can somebody please tell me how to truncate these log files?
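[Editor's note: the SQL 7/2000-era sequence is usually along these lines, with placeholder database and log file names. This breaks the log-backup chain, so take a full backup afterwards:]

BACKUP LOG MyDB WITH TRUNCATE_ONLY;   -- discard the inactive log records
DBCC SHRINKFILE (MyDB_Log, 100);      -- then shrink the file; target size in MB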

Thanks in advance.

Attaullah


Large Database

Apr 17, 2003

Hello,

I would like to know whether MS SQL Server 2000 can handle a large database, around 28 GB in size. Currently we are using Sybase, but we want to migrate to SQL Server 2000. Are there any performance issues?

Any help is appreciated.

Thanks

Sejal


Large Tables

Mar 13, 2001

How can I find the largest 5 or 10 tables in a database?

Thanks in advance
Chan


High Availability && DR For Large No. Of DBs

Feb 11, 2005

Hi,

I want some help for this particular scenario.

I have more than 350 databases, growing on average by 5 databases per month.

I am concerned about the DR strategy to be adopted.

Can anyone help me?


Large BCP Problem

Oct 6, 2005

OS - Window 2003 Standard
SQL - 2000 Standard SP4
2 CPU 4 GB Memory

We have a table that is approximately 155 GB with 450 million rows: 90 days' worth of data, normalized pretty tight. Obviously this beast is too much to handle (reporting, maintenance), so we are partitioning the data off into weekly tables. I will never need to store more than 90 days' worth of data, so at the end of each week we will truncate the oldest week's table and change the constraint on it to handle the next week. Anyway, we have done this and it is working well in 2 test environments with a lot less data.

I need to migrate the data from the 1 table of 450 million rows to the 13 weekly tables that will make up approximately the 90 days' worth of data. I hoped to BCP out the data and Bulk Insert (fast mode). Even 1 week (approximately 35 million rows) takes forever, and I don't even see it start to output. I have changed the batch size to be larger. Is there a better/faster way to do this, or am I just going to have to wait? I'm open to any suggestions.

Note: the hardware is all RAID 1+0, reading from 1 array and writing to a different array. Performance just seems too slow, but I'm not sure what to expect from something this big.
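[Editor's note: a sketch of one fast variant, with placeholder names throughout: bcp out in native format with a range query, then BULK INSERT into the empty weekly table with TABLOCK so the load can be minimally logged:]

rem From the command line: export one week with a range query, native format
bcp "SELECT * FROM MyDB.dbo.BigTable WHERE EventDate >= '20050901' AND EventDate < '20050908'" queryout week1.dat -n -T -S MYSERVER

-- Then in T-SQL: load it into the empty weekly table
BULK INSERT MyDB.dbo.Week1
FROM 'D:\export\week1.dat'
WITH (DATAFILETYPE = 'native', TABLOCK, BATCHSIZE = 100000);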

Ideas.


Database Is Is Too Large Need Help

Aug 17, 2006

We have many installations of our shopping cart database. One specific database is huge, now about 25 GB, compared to the others that range from 20 to 75 MB. The server this one resides on has three other instances of the same database that are normal size.

In a particular table in the large database there are 9,700 rows taking 380 MB; the same table in a normal DB has 162,000 rows and takes 6 MB. The tables are identical and the indexes are the same.

Any ideas out there?
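[Editor's note: two quick checks that might narrow it down, table name as a placeholder: whether the space figures are even accurate, and how fragmented the table is:]

EXEC sp_spaceused 'dbo.MyTable', @updateusage = 'true';  -- corrects stale usage figures
DBCC SHOWCONTIG ('MyTable');                             -- reports scan density / fragmentation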


Large DB Backup

Jun 19, 2008

I have this large DB, around 250 GB, in which 70% of the space is occupied by one single table. This is not a high-transaction DB; it has only a few users.

Backup is taking 2:30 hours!

Is there a way I can reduce this time?

In desperation, I am thinking of the following:

1. detach the DB (off-peak hours)
2. copy the .mdf and .ldf files (the backup)
3. attach the DB

Any suggestions?
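[Editor's note: one option that often helps, sketched with placeholder paths: stripe the backup across several files on separate physical drives, since a single backup device frequently bottlenecks on one disk:]

BACKUP DATABASE MyBigDB
TO DISK = 'E:\Backup\MyBigDB_1.bak',
   DISK = 'F:\Backup\MyBigDB_2.bak',
   DISK = 'G:\Backup\MyBigDB_3.bak'
WITH INIT, STATS = 10;   -- progress report every 10 percent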



------------------------
I think, therefore I am - Rene Descartes


Error.log Too Large

Jan 10, 2007

This file is getting up to 110 GB - why?
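[Editor's note: if this is the SQL Server error log, it only starts a new file when the service restarts, but it can also be cycled manually, after which the old log can be archived or deleted. Something is presumably flooding it, so the tail of the log is worth reading first:]

EXEC sp_cycle_errorlog;   -- close the current error log and start a new one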


Transaction Log Too LARGE

Jan 30, 2007

I often come across the following scenario in my job:

A client rings up and says they have run out of server space because the SQL 2000 transaction log has consumed all of it, or a very large portion of it.

What is the correct procedure for resolving this ASAP, working with full-recovery-mode SQL 2000 databases? I ask because the other guys within my company all do different things, with good results and sometimes bad.

The procedure I use is the following:

METHOD 1

1. Back up both the database and the transaction log.
2. Right-click the database and select Detach which, from my understanding, is a clean detach method that ensures uncommitted transactions are committed to the database.
3. Rename the old transaction log to .ldfOLD.
4. Reattach the database, which creates a new transaction log.

METHOD 2

I don't use this method, but I'm pretty sure it's risky; if possible, can someone provide me with the reasons why?

1. Change the database from full to simple recovery mode, shrink the logs,
and then change back to full mode.
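[Editor's note: Method 2 amounts to the following, with placeholder names. The risk being asked about: leaving FULL recovery breaks the log-backup chain, so point-in-time recovery is lost until a fresh full backup is taken - hence the backup at the end:]

ALTER DATABASE MyDB SET RECOVERY SIMPLE;
DBCC SHRINKFILE (MyDB_Log, 500);               -- target size in MB
ALTER DATABASE MyDB SET RECOVERY FULL;
BACKUP DATABASE MyDB TO DISK = 'E:\Backup\MyDB_full.bak' WITH INIT;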



Basically, what I am asking is: what is the fastest and most effective way to sort out the above issue, with the ability to roll back successfully?

PLEASE don't comment on why the transaction log is so big, as I don't want to look into that now; all I am asking is what is the most effective method to shrink the log down and save space.


PK On A Large Table

Nov 16, 2007

I am developing an application that has a table with lots of records (network traffic), but the data is summarized every so often to create summary records (old records are deleted). The problem is that I have a PK based on an autoincrement ID (int) that will run out of numbers. However, this ID is not referenced anywhere (it is not a foreign key from another table, it is not used for deletion, and there are no updates to this table whatsoever).

So my possibilities are:
1. reseed the ID when it is about to run out
2. make the ID bigint
3. remove the ID and change the PK to 2 other fields
4. remove the ID and go without a PK
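[Editor's note: for reference, options 1 and 2 in T-SQL, with hypothetical table and constraint names:]

-- Option 1: reseed into the unused negative range, roughly doubling
-- the keyspace with no schema change
DBCC CHECKIDENT ('dbo.Traffic', RESEED, -2147483648);

-- Option 2: widen to bigint; the PK constraint must be dropped first,
-- and the table is rewritten under a lock
ALTER TABLE dbo.Traffic DROP CONSTRAINT PK_Traffic;
ALTER TABLE dbo.Traffic ALTER COLUMN ID bigint NOT NULL;
ALTER TABLE dbo.Traffic ADD CONSTRAINT PK_Traffic PRIMARY KEY (ID);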

I am leaning toward option 4, because I do not see the need for a PK, but I understand that it is quite out of the ordinary, so I would like to hear from other people (I do not have much experience with DBs).

I also like option 3. I already have an index on one of the other fields (time).

Any input will be appreciated.

Claudio Robles


Large Datasets

Mar 27, 2008

Hi,

I have a table which has over 450,000 records in it. I have now split this into 4 tables, so each has around 100,000 records, but I'm still having the problem of the data being returned really slowly.

What I need to do with this data is group it by a code and show the total for each code for every month of the year (this is currently based on one column, selecting the data accordingly). I have created views and put some indexes on my table, but the results are still returned slowly. Does anyone have any suggestions for how I can speed this up?
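[Editor's note: a sketch of the kind of query and supporting index meant here, all names hypothetical (the INCLUDE clause requires SQL 2005). A single covering index on the original 450,000-row table usually beats splitting it:]

CREATE INDEX IX_Sales_Code_Date
ON dbo.Sales (Code, SaleDate) INCLUDE (Amount);

SELECT Code,
       DATEPART(year, SaleDate)  AS Yr,
       DATEPART(month, SaleDate) AS Mth,
       SUM(Amount)               AS Total
FROM dbo.Sales
GROUP BY Code, DATEPART(year, SaleDate), DATEPART(month, SaleDate);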

Thanks

Gemma


BCP Large Table.

Jul 23, 2005

If I use BCP to export a very large table, will that table be blocked for writes during the export process? I don't want to prevent users from accessing that table during the bcp process.

Thank You, TFD.


Large Inserts

Jul 5, 2007

Dear Experts,

What is the best way to do a large insert WITHOUT having direct access to the machine SQL Server is running on? For example, imagine I want to insert something like 20,000 records. If I had access to the server, I could BULK INSERT into a temp table and then insert into the destination table. But if I can't create a file on the server to use for BULK INSERT, what is the next best alternative to doing lots of 1-record insert statements?

Thanks,
-Emin
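[Editor's note: one common alternative, sketched with hypothetical names: send all the rows in a single statement as XML and shred them server-side with OPENXML. This works on SQL 2000 and later; on SQL 2000 declare the variable as nvarchar(4000) or pass the text as a parameter, since nvarchar(max) needs 2005. From ADO.NET 2.0 there is also SqlBulkCopy, which streams from the client with no file on the server.]

DECLARE @doc int;
DECLARE @xml nvarchar(max);
SET @xml = N'<rows><row a="1" b="x"/><row a="2" b="y"/></rows>';

EXEC sp_xml_preparedocument @doc OUTPUT, @xml;

INSERT INTO dbo.Target (ColA, ColB)
SELECT a, b
FROM OPENXML(@doc, '/rows/row', 1)   -- 1 = attribute-centric mapping
WITH (a int, b varchar(50));

EXEC sp_xml_removedocument @doc;     -- always free the document handle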


Best Practices For Large DB

Jul 20, 2005

Hi All,

My question is: what are the best practices for administering large DBs? (My coworker is the DB administrator; I'm more of the developer, but slowly being sucked in.) My main concern is that we have some DBs that take approx 3 hrs a night just to rebuild the indexes. I know that with MSSQL 2000 I can use partitioned views to break out the table(s) into smaller databases and tables. But we also have an older server that runs MSSQL 7. Lastly, how do you handle drive space issues? Do you spread out the DB across multiple MDF files on different drives?

Thanks in advance.


Large Text

Mar 18, 2008



Hi,
I have one column which contains some large data, more than 30,000 characters. When I exported the data to an Excel file, some data in this column showed as #######. These # characters cause a problem in the SSIS package; if I delete that data (####), the SSIS package works fine.
How can I solve this problem?

Thanks in advance.


Log File Is Too Large

May 28, 2008

I administer an application that runs on SQL 2000, without being an SQL expert myself. I have observed that if I create a maintenance plan, the log file gets trimmed, but otherwise the log file keeps growing. All was going well at one site, but recently the log file has become huge and the server is running out of disk space. Is that the problem? I'm not sure how to control this. Can I delete a log file and have it re-created? Two smaller databases share the same maintenance plan and their logs are small. Thank you.


Database Has Gotten Relatively Large

Sep 12, 2006

I upgraded a database from SQL 2000 to SQL 2005 and it went pretty smoothly. After the transition was made, I backed up the DB. The size of the database was as expected, about 5 GB. About 5 hours later, a maintenance plan executed a few optimization jobs in the following order: Reorganize Index, Rebuild Index, Shrink Database, Update Statistics. Soon after that, another job backed up the database, and it was then 32 GB. During the time after the first backup, no one was really using the database.

I've been trying to track this down for several days. Does anyone have any ideas for me?

Thanks,
Bobby Crawford


Why Is Database So Large?

Dec 4, 2006

Hello!
I have a database with one table inside. The table has six columns with the following datatypes:
col1 --> smallint (2 bytes)
col2 --> int (4 bytes)
col3 --> smallint (2 bytes)
col4 --> smallint (2 bytes)
col5 --> smallint (2 bytes)
col6 --> int (4 bytes)
I have inserted 1,359,320 rows of data, and the size of the .sdf file is 40,116,224 bytes.
According to my calculation:
1,359,320 rows x (2 + 4 + 2 + 2 + 2 + 4) bytes = 1,359,320 x 16 bytes = 21,749,120 bytes
I hope somebody can explain to me why the database is so large.

Thank you
Sascha


How To... Backup Large DB...

Jul 23, 2007

Hardware configuration:

2-node cluster, active-passive

plus, for backup, a direct-attach tape library (Dell PowerVault 132T with 2 tape drives) on 1 node. The tape type is LTO (100 GB, or 200 GB with compression). The last filegroup backup ended with errors:



Job 'BackupFG_index_ToTape1' : Step 1, 'step' : Began Executing 2007-07-16 01:00:01

10 percent backed up. [SQLSTATE 01000]
20 percent backed up. [SQLSTATE 01000]
30 percent backed up. [SQLSTATE 01000]
40 percent backed up. [SQLSTATE 01000]
50 percent backed up. [SQLSTATE 01000]
60 percent backed up. [SQLSTATE 01000]
70 percent backed up. [SQLSTATE 01000]
Msg 4028, Sev 16: End of tape has been reached. Remove tape '\\.\tape1' and mount next tape for BACKUP DATABASE...FILE=<name> of database 'pakdb'. [SQLSTATE 42000]


I'm using native SQL Server backup; the T-SQL script:



BACKUP DATABASE pakdb
    FILE = 'pakdb_index',
    FILEGROUP = 'db_index'
TO TAPE = '\\.\tape1' WITH INIT, FORMAT, STATS, NOUNLOAD


I have 2 ways of solving this problem:

1) use third-party software (Veritas, for example)

2) break the large filegroup into several small filegroups - which doesn't seem like a good idea



Question:

How do you solve problems like this? Using Veritas (for example) is a problem too: if the server dies completely, and the Veritas repository with it, how can I restore the information from my tapes? (That is my worry.)
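[Editor's note: a third possibility, given that the library has two tape drives: stripe the backup across both devices, so each tape only has to hold half of the filegroup. Device names are placeholders:]

BACKUP DATABASE pakdb
    FILE = 'pakdb_index',
    FILEGROUP = 'db_index'
TO TAPE = '\\.\tape0', TAPE = '\\.\tape1'
WITH INIT, FORMAT, STATS, NOUNLOAD;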


Very Large Recordset

Aug 10, 2007



I have a stored proc which returns a very large recordset (150,000+ rows). Each row has up to 65-95 fields. I need to do row transformations on each row.

I am using an Execute SQL Task to execute the stored procedure and return the results into an Object variable, which I then shred inside a Source Script Component (by casting it to a recordset/DataTable, looping over it, and assigning values to the output columns).

Obviously the recordset variable is enormous - stuffed with 150K large rows. But what choices do I have in this scenario? Can I persist the recordset variable to disk? I cannot stream the result of a stored proc execution. So what can I do about the sluggish performance of this enormous variable?

thanks in advance
