Dealing With Large Amounts Of Data

Jul 20, 2005

We are looking to store a large amount of user data that will be
changed and accessed daily by a large number of people. We expect
around 6-8 million subscribers to our service with each record being
approximately 2000-2500 bytes. The system needs to be running 24/7
and therefore cannot be shut down. What is the best way to implement
this? We were thinking of setting up a cluster of servers to hold the
information and another cluster to backup the information. Is this
practical?
Also, what software is available that can distribute query calls
across different servers and manage large amounts of query requests?

Thank you in advance.

Ben
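
[Editor's sketch] A quick back-of-envelope check on the scale involved: 8 million records x 2,500 bytes is roughly 20 GB of raw row data (the low end, 6 million x 2,000 bytes, is about 12 GB), which is modest by database standards. The demanding requirements here are the 24/7 availability and spreading the query load, not the raw volume, which is what failover clustering and replication/load-balancing middleware address.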

View 10 Replies



Storing Large Amounts Of Data

Mar 31, 2008

Hi,

I was wondering if anyone could help me. I need to store large amounts of data in my database; at present I have the column set to nvarchar(8000). I've looked around and noticed you can use text, which stores up to about 2 billion characters, but it is slow to display.

Any ideas or points in the right directions would be great.

Thanks
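
[Editor's sketch] If the server is SQL Server 2005 or later, nvarchar(max) stores up to 2 GB per value while still behaving like an ordinary string column, which avoids most of the handling restrictions of text/ntext. A minimal sketch (table and column names are hypothetical):

Code:

-- Hypothetical table; nvarchar(max) holds up to 2^31-1 bytes
-- (about 1 billion Unicode characters) but is queried like any
-- ordinary variable-length column.
CREATE TABLE dbo.Articles (
    ArticleID int IDENTITY(1,1) PRIMARY KEY,
    Body      nvarchar(max) NOT NULL
)

-- Reads and writes work like any other string column:
INSERT INTO dbo.Articles (Body) VALUES (N'...large text...')
SELECT ArticleID, DATALENGTH(Body) AS BodyBytes FROM dbo.Articles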

View 6 Replies View Related

Inserting Large Amounts Of Data

Sep 8, 2005

Does anyone have ideas on the best way to move large amounts of data between tables? I am doing several simple INSERT/SELECT statements from a staging table to several holding tables, but because of the volume it is taking an extraordinary amount of time. I considered using cursors but have read that they may not be the best thing for this situation. Any thoughts?
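
[Editor's sketch] One common approach, shown below under assumed table and column names: copy in key-range batches so each INSERT...SELECT commits a modest transaction instead of one huge one (it can also help to drop the holding tables' nonclustered indexes first and recreate them after the load).

Code:

-- Hypothetical tables, assuming an integer key column ID: copy in
-- ranges so each statement commits a small transaction and the log
-- can truncate between batches.
DECLARE @lo int, @hi int, @max int
SELECT @lo = MIN(ID), @max = MAX(ID) FROM dbo.Staging
WHILE @lo <= @max
BEGIN
    SET @hi = @lo + 49999
    INSERT INTO dbo.Holding1 (ID, Col1, Col2)
    SELECT ID, Col1, Col2
    FROM dbo.Staging
    WHERE ID BETWEEN @lo AND @hi
    SET @lo = @hi + 1
END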

View 4 Replies View Related

Problem With Large Data Amounts

Jun 24, 2005

I have a dataset with 300,000 records and I'm getting the following error with MS Reporting Services: "An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.

View 11 Replies View Related

Graphing Large Amounts Of Data

May 8, 2007

Greetings



I need to be able to graph roughly 150 employees per supervisor and their monthly cell phone usage in minutes. I understand that I will need to group this, say one graph for every ten employees, so it doesn't look messy and cluttered. I have read some threads here but they don't seem to work for me.



So again, each supervisor has 100+ subordinates and I need to graph their phone usage by month.



thoughts???

km

View 2 Replies View Related

Sqlserver 2000- Large Amounts Of Data 1.2 TB

Apr 20, 2001

p.s. my email was incorrect in the last mail.
Hi all,
Is there a SQL 2K thread? I am interested in finding out the largest SQL Server database size people have worked with.
We have a 1.2 terabyte db with about 150-200 million new rows being processed every day. I would like to share some thoughts on this with other people who are working with this much data and what they are doing with it.

bhala
----------------------------------------
Please check us out at: http://www.bivision.org/bivision

View 2 Replies View Related

Migrating Large Amounts Of Data From SQL Server 7.0 To 2000

Jul 20, 2005

I'm in the process of migrating a lot of data (millions of rows, 4GB+ of data) from an older SQL Server 7.0 database to a new SQL Server 2000 machine. Time is not of the essence; my main concern during the migration is that when I copy in the new data, the new database isn't paralyzed by the amount of bulk copying being done. For this reason, I'm splitting the data into one-month chunks (the data's all timestamped and goes back about 3 years), exporting as CSV, compressing the files, and then importing them on the target server. The reason I'm using CSV is because we may want to also copy this data to other non-SQL Server systems later, and CSV is pretty universal. I'm also copying in this format because the target server is remotely hosted and is not accessible by any method except FTP and Remote Desktop -- no database-to-database copying allowed for security reasons.

My questions:

1) Given all of this, what would be the least intrusive way to copy over all this data? The target server has to remain running and be relatively uninterrupted. One of the issues that goes hand-in-hand with this is indexes: should I copy over all the data first and then create indexes, or allow SQL Server to rebuild indexes as I go?

2) Another option is to make a SQL Server backup of the database from the old server, upload it, mount it, and then copy over the data. I'm worried that this would slow operations down to a crawl, though, which is why I'm taking the piecemeal approach.

Comments, suggestions, raw fish?
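
[Editor's sketch] For the import half, BULK INSERT (or bcp) with a batch size set is usually the least intrusive option, and for loads of this size the conventional advice is to load the data first and build nonclustered indexes afterwards. A sketch with hypothetical paths and names:

Code:

-- Hypothetical path and table; BATCHSIZE commits every n rows so the
-- log can truncate between batches, and TABLOCK speeds up the load.
BULK INSERT dbo.Measurements
FROM 'C:\import\2004-06.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,
    TABLOCK
)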

View 2 Replies View Related

Best Way To Insert Large Amounts Of Data From A Webform To SQL Server 2005

Oct 21, 2007

Hi. I have a VB.net web page which generates a datatable of values (3 columns and on average about 1000-3000 rows). What is the best way to get this datatable into SQL Server? I can create a table on SQL Server no problem, but I've found that simply looping through the datatable and doing 1000-3000 insert statements is slow (a few seconds). I'd like to make this as streamlined as possible, so I was wondering if there is a native way to insert all records in a batch via ADO.net or something. Any ideas?

Thanks
Ed
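
[Editor's sketch] Two 2005-era options: ADO.NET 2.0's SqlBulkCopy class streams a whole DataTable to the server in one call, or the page can ship the batch as a single XML parameter and shred it server-side in one INSERT...SELECT. A sketch of the latter (the schema and names are assumptions):

Code:

-- Hypothetical procedure: the page sends all rows as one XML
-- parameter and a single INSERT...SELECT shreds it (SQL Server 2005).
CREATE PROCEDURE dbo.InsertBatch
    @rows xml
AS
BEGIN
    INSERT INTO dbo.Results (ColA, ColB, ColC)
    SELECT  r.value('@a', 'int'),
            r.value('@b', 'nvarchar(50)'),
            r.value('@c', 'decimal(18,2)')
    FROM @rows.nodes('/rows/r') AS T(r)
END

-- Example call:
-- EXEC dbo.InsertBatch
--     @rows = N'<rows><r a="1" b="x" c="9.5"/><r a="2" b="y" c="3.25"/></rows>'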

View 1 Replies View Related

SQL Server Storing Large Amounts Of Data In Multiple Tables

Jul 20, 2005

Hello,

Currently we have a database, and it is our desire for it to be able to store millions of records. The data in the table can be divided up by client, and it stores nothing but about 7 integers.

| table |
| id | clientId | int1 | int2 | int3 | ... |

Right now, our benchmarks indicate a drastic increase in performance if we divide the data into different tables. For example, table_clientA, table_clientB, table_clientC, despite the fact that the tables contain the exact same columns. This however does not seem very clean or elegant to me, and rather illogical, since a database exists as a single file on the hard drive.

| table_clientA |
| id | clientId | int1 | int2 | int3 | ... |
| table_clientB |
| id | clientId | int1 | int2 | int3 | ... |
| table_clientC |
| id | clientId | int1 | int2 | int3 | ... |

Is there any way to duplicate this increase in database performance gained by splitting the table, perhaps by using a certain type of index?

Thanks,
Jeff Brubaker
Software Developer
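
[Editor's sketch] The usual first step is a clustered index that leads on clientId, so each client's rows are stored physically together, which is most of what the per-client tables were buying (if id is currently the clustered primary key, it would need to be recreated as nonclustered first). A sketch using the poster's columns:

Code:

-- Clustering on (clientId, id) stores each client's rows together,
-- so "WHERE clientId = @c" scans one contiguous range instead of
-- the whole table.
CREATE CLUSTERED INDEX IX_table_client
    ON dbo.[table] (clientId, id)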

View 4 Replies View Related

Sql 2005 Updating Table Definition With 'large' Amounts Of Data - Timeout

Feb 4, 2006

I'm trying to move my current use of an sql 2000 db to sql 2005.



I need to update a table definition (to change a field to an Identity)



I'm getting a dialog box (in SQL server management studio) on save saying :



'xxxx' table

- Saving Definition Changes to tables with large amounts of data could
take a considerable amount of time. While changes are being
saved, table data will not be accessible.



I press 'Yes' to the dialog box.



After 35 seconds, I get another dialog box saying:



'xxxx' table

- Unable to modify table.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.



Well, the server is responding: I can query that table and others, I
can add/delete rows in other tables, and I can modify other
(smaller) tables.



Any ideas where I can change this timeout?



Daniel
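
[Editor's sketch] The designer change times out because SSMS rebuilds the whole table behind the scenes (and, depending on the SSMS version, there is a designer transaction timeout under Tools > Options > Designers). The same rebuild can be done directly in T-SQL, where no designer timeout applies. A sketch with hypothetical names:

Code:

-- Rebuild the table once, by hand, instead of letting the designer
-- do it inside its own timeout. Table and column names are made up.
CREATE TABLE dbo.MyTable_new (
    ID   int IDENTITY(1,1) NOT NULL,
    Col1 varchar(100) NULL
)

SET IDENTITY_INSERT dbo.MyTable_new ON
INSERT INTO dbo.MyTable_new (ID, Col1)
SELECT ID, Col1 FROM dbo.MyTable
SET IDENTITY_INSERT dbo.MyTable_new OFF

EXEC sp_rename 'dbo.MyTable', 'MyTable_old'
EXEC sp_rename 'dbo.MyTable_new', 'MyTable'
-- then recreate indexes, constraints and permissions on the new table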

View 10 Replies View Related

Large Amounts Of Text

Nov 20, 2003

I was wondering what is the best solution to store large amounts of text in a SQL 2000 field. This text will be entered into a multiline textbox.

e.g. what data type? Should I use BLOBs?

Thanks,
Trey
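
[Editor's sketch] On SQL 2000 the realistic options are text/ntext (up to about 2 GB) or capping the column at varchar(8000)/nvarchar(4000); varchar(max) does not arrive until SQL 2005. A minimal sketch with a hypothetical table:

Code:

-- ntext for Unicode multiline input; the table name is made up.
CREATE TABLE dbo.Comments (
    CommentID int IDENTITY(1,1) PRIMARY KEY,
    Body      ntext NULL
)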

View 4 Replies View Related

Adding Large Amounts Of Text

Mar 20, 2007

This may seem like a simple question, but I have a report/lease agreement I need to put together and wanted to know the simplest way to add large amounts of text. Basically it's all the legal stuff most leases include, amounting to some 14 pages.

Should this be just one long string, or does SSRS have another way to format this?

thanks as always

KM

View 2 Replies View Related

Fastest Way To Do Large Amounts Of Updates

Mar 3, 2006

I was wondering what is the fastest way to UPDATE lots of records. I heard the fastest way to perform lots of inserts is to use SqlCeResultSet. Would this also be the fastest way to update already existing records? If so, is this the fastest way to do that:

1. Create a SqlCeCommand object.
2. Set the CommandText to select the data I want to update
3. Call the command object's ExecuteResultSet method to create a SqlCeResultSet object
4. Call the result set object's Read method to advance to the next record
5. Use the result set object to update the values using the SqlCeResultSet.SetValue method and the Update method.
6. repeat steps 4 and 5

Also, I was wondering: do I call the SqlCeResultSet.Update method once per row, or just once? And would it be possible and faster to wrap all that in a transaction?

Would parameterized updates be faster?
Any help will be appreciated.

View 3 Replies View Related

Mirroring Large Amounts Of Databases

Mar 6, 2008

I have been looking into mirroring a large number of small databases, approx 150 databases.



As I understand this won't be feasible because of the way mirroring threading works, http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=441900&SiteID=1



As I understand it, for every database being mirrored, SQL will ping the mirror every second, causing a network bottleneck?

Also, will the number of threads generated for each mirrored database also cause a bottleneck?



At the moment our database servers are under very little pressure and, as an estimate, use about 10% of the resources allocated to them, such as CPU utilization, memory, disk IO and network. Our server hardware is dual quad-core Xeons with 4-8 GB of memory and a variety of 10k SCSI RAID configurations, from RAID 5 to RAID 10, running SQL 2005 32-bit.



I've done some calculations on the log file generation rate compared to network bandwidth; there is more than enough network bandwidth.



Has anybody had any luck in mirroring many small databases?



My concerns are: how much traffic is caused by the pinging of the mirror for each database?

How many threads will the mirroring cause, and what is the maximum number of threads SQL can handle?

How much memory will be consumed by each one of these mirroring threads?

View 1 Replies View Related

Problem Inserting Large Amounts Of Text

Aug 12, 2005

I am running into a problem inserting large amounts of text into my table. Everything works well when I test with a few simple words, but when I try a test with larger amounts of text (ie 35,000 characters) the appropriate field is left blank. The INSERT still performs (all the other fields receive their data), but the "Description" field is blank. I have tried this with both "text" and "ntext" datatypes. I am using a stored procedure with input parameters. As I mentioned, the query goes off flawlessly with small amounts of data (eg "Hi there!") but not with the larger amount. I checked, and the ntext field claims to be able to accept 1073741823 bytes of data. Is there some other thing I should consider with large amounts of text?
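
[Editor's sketch] One classic cause worth checking: if the stored procedure's parameter (or the ADO parameter binding on the client side) is declared varchar/nvarchar rather than text/ntext, anything past 8,000/4,000 characters is cut off before the INSERT ever runs. A sketch with hypothetical names:

Code:

-- The parameter itself must be ntext (or text); an nvarchar parameter
-- caps the value at 4,000 characters before the INSERT ever happens.
CREATE PROCEDURE dbo.AddItem
    @Name        nvarchar(100),
    @Description ntext
AS
INSERT INTO dbo.Items (Name, Description)
VALUES (@Name, @Description)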

View 6 Replies View Related

Dealing With Large Numbers Of Parameters

Sep 11, 2007

Hi there,

I am getting a headache trying to research what to do when you have a large number of parameters to include in a query. For example, if I have a large number of checkboxes for the user to pick criteria for a report and they select several, I'm assuming it would be bad practice to say:

WHERE Field = "a" OR Field = "b" OR Field = "c" OR Field = "d" OR Field = "e" OR.....etc etc etc

Is there a good solution for this, given that the number of parameters may vary dramatically depending on what the user selects to include in a report?!

I'm running SQL Server 2000 with an ASP front end.

Any help would be greatly appreciated!

Thanks in advance!

Matt
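
[Editor's sketch] On SQL 2000, one common pattern is to pass the user's selections as a single delimited string and split it into a table server-side, then use IN against that table instead of building a long OR chain dynamically. A sketch (the split function below is hypothetical, not built in):

Code:

-- Hypothetical split function: turns 'a,b,c' into a one-column table.
CREATE FUNCTION dbo.SplitList (@list varchar(8000))
RETURNS @items TABLE (item varchar(100))
AS
BEGIN
    DECLARE @pos int
    WHILE LEN(@list) > 0
    BEGIN
        SET @pos = CHARINDEX(',', @list)
        IF @pos = 0
        BEGIN
            INSERT @items VALUES (@list)
            SET @list = ''
        END
        ELSE
        BEGIN
            INSERT @items VALUES (LEFT(@list, @pos - 1))
            SET @list = SUBSTRING(@list, @pos + 1, 8000)
        END
    END
    RETURN
END

-- Usage from the report query:
-- WHERE Field IN (SELECT item FROM dbo.SplitList(@selections))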

View 9 Replies View Related

Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16GB or 32GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32GB of physical memory and make the pagefile only 1.5 x 32GB, I have a 48GB pagefile. 10% of this is 4.8GB, which I would hope I never see consumed.

Any thoughts?

Thanks, Dave

View 11 Replies View Related

MS HotFixes/patches Possibly Stunting DB Activity For Certain Amounts Of Data?

Jun 14, 2004

I'm wondering if anyone can shed light on a problem I've noticed that's really made for a major thorn in my side. I recently had a Microsoft patch installed on my server, and now for some reason, INSERT and UPDATE queries against the SQL 2000 database are severely limited. I constantly get the error:

"Error: A severe error occurred on the current command. The results, if any, should be discarded."

My Event Logs also return the following:

"Invalid buffer received from client."

I think I've isolated the problem: I can't add new or modify existing records that use a field of type TEXT with a value longer than 4,000 characters; otherwise the error fires. This is really weird, as I've used the same ASP.NET script to call a stored procedure to INSERT/UPDATE records thousands of times before with 100% success.

I have a feeling this might have something to do with the patch, but has anyone come across this problem specifically, or know for sure which patch(es) cause it? Why all of a sudden would a TEXT field be so limited in capacity?

View 2 Replies View Related

Dealing With Deleted Records In Source Data

Jan 29, 2008

Hi,
I have an SSIS package that runs each day from a live data source to create a data mart, which is then used for various things including SSAS and SSRS.

The problem is that certain records that will eventually go on to form fact tables are deleted from the live system (not a very robust database in the first place, hence the SSIS!), but these deletions are not reflected in the SSIS transformation, creating inflated figures when compared to the live system.

I currently use type 1 slowly changing dimension processes in each data flow (of which there are about 35) but I realise that this only updates records and does not delete.

The solution I have in place is to truncate the fact tables in the mart before the run starts, using an Execute SQL task. This solves the problem, though it seems a little heavy-handed to me and renders the slowly changing dimension processes redundant (as the package is currently only run once a day).

My question is, is there a better method of dealing with the above scenario? If there isn't, it would be a nice feature to add to future versions (*nudge nudge*).

Thanks in advance :-)
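
[Editor's sketch] A lighter-weight alternative to truncating everything, sketched below with hypothetical table names: after each load, delete only the fact rows whose business keys no longer exist in the current extract, so the type 1 updates stay useful and only genuine deletions are removed.

Code:

-- Hypothetical tables: staging.LiveSales holds the keys present in
-- the current extract of the live system.
DELETE f
FROM dbo.FactSales AS f
WHERE NOT EXISTS (
    SELECT 1
    FROM staging.LiveSales AS s
    WHERE s.SalesID = f.SalesID
)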

View 1 Replies View Related

DB Engine :: Migrate Table Data When Dealing With Constraints

Nov 19, 2015

I need to script out data in several tables (30+) and then reload those tables on a different database.  The "target" database table data will get overwritten each time, not appended to.  Several of these tables have numerous foreign keys and other constraints that must be dealt with.  None of them have more than 100 records or so.  Then I need to keep this script in source control and reference it as part of a post-deployment step with a Visual Studio DB project.

I have Redgate, DBGhost and Visual Studio. Are there any tools that will generate the script for me, complete with dropping all the constraints, truncating the tables, then re-adding the constraints? Or am I looking at simply scripting the table data out, then modifying the script myself to add in the steps to drop/re-add the constraints? Further, since some of these tables might have relationships, not just to other tables but to those in question, there will likely be an order in which the tables need to get loaded.

The approach I am thinking will be needed, since I've done this sort of thing before except with SSIS, is to run something like DBGhost or the "generate scripts" option in SSMS as a starting point. From there, change the order around as needed, etc.
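
[Editor's sketch] Whatever tool generates the script, the reload step itself usually takes this shape: disable the foreign keys, clear and reload the tables in dependency order, then re-enable WITH CHECK so everything is revalidated. A sketch (sp_MSforeachtable is undocumented but widely used; the table names are hypothetical):

Code:

-- Disable FK/CHECK enforcement everywhere, reload, then re-enable
-- and revalidate. TRUNCATE is refused on tables referenced by an FK,
-- so DELETE is used here.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

DELETE FROM dbo.LookupTable
INSERT INTO dbo.LookupTable (ID, Name) VALUES (1, 'Example')
-- ...repeat per table, in dependency order...

EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'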

View 3 Replies View Related

Transact SQL :: Key And Indexes On Two Column Data Table Or Parsed View (Large String Of Data And Filename)

Oct 4, 2015

I am studying indexes and keys. I have a table whose first column is loaded with fixed-width data, which is parsed in a view based on the data types within the fixed-width specification.

Example column A:
(name phone house cost of house zipcode county state country)
- a view will later split this large varchar string based on the fixed-width data types
Column B: the source filename of the data load (varchar(256))
...

a. Would there be a benefit to adding a clustered or nonclustered index (if so, which, and why)?

b. Is there a benefit to making one of these two columns a primary key (millions of records), or to adding a third new column as a PK?

c. View: this parses the data in column A so it ends up looking more like "name phone house cost of house zipcode county state country", each having its own column.

- Any pros/cons of adding indexes (if so, which) to the view instead of the tables, or to both, once the data is parsed?
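
[Editor's sketch] A common shape for a load-then-parse staging table like this, with hypothetical names: a narrow identity column as the clustered primary key (so inserts append to the end and the wide varchar never serves as a key), plus a nonclustered index on the filename if loads are queried per file. An ordinary view over the parsed columns cannot itself be indexed unless it is schema-bound and meets the indexed-view requirements, so the base-table indexes usually do the work.

Code:

-- Hypothetical staging table: the identity PK gives a cheap,
-- ever-increasing clustered key; RawLine is too wide to index usefully.
CREATE TABLE dbo.StagingRaw (
    RowID      int IDENTITY(1,1) NOT NULL,
    RawLine    varchar(8000) NOT NULL,
    SourceFile varchar(256)  NOT NULL,
    CONSTRAINT PK_StagingRaw PRIMARY KEY CLUSTERED (RowID)
)

CREATE NONCLUSTERED INDEX IX_StagingRaw_SourceFile
    ON dbo.StagingRaw (SourceFile)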

View 4 Replies View Related

Compare Amounts Between 2 Tables

Sep 21, 2012

I have two tables. One is Invoice_tbl, with one account per customer.

This table has 3 fields: CustomerID, InvoiceAmount, InvoiceID.

The second table is Payment_tbl, with 2 fields: InvoiceID and PaymentAmount. The Payment table can have multiple payments from each customer.

With Access, I would run a query (call it PaymentTotal) against Payment_tbl, then do a "Group By" on InvoiceID and a SUM on the "PaymentAmount" field.

I then would create a new query against Invoice_tbl and INNER JOIN on PaymentTotal.

How would I do this with SQL Server?
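
[Editor's sketch] The Access pattern translates almost directly: the PaymentTotal query becomes a derived table (or a view), and the join happens in the same statement. A sketch using the poster's tables:

Code:

-- PaymentTotal as a derived table, joined back to the invoices.
SELECT  i.CustomerID,
        i.InvoiceID,
        i.InvoiceAmount,
        pt.TotalPaid
FROM dbo.Invoice_tbl AS i
INNER JOIN (
    SELECT InvoiceID, SUM(PaymentAmount) AS TotalPaid
    FROM dbo.Payment_tbl
    GROUP BY InvoiceID
) AS pt
    ON pt.InvoiceID = i.InvoiceID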

View 6 Replies View Related

Joining Two Tables (Sum Of Amounts)

Dec 30, 2013

I have a view, table A, with 5 columns: dateseq, SalesAmount, customerseq, CostofGoods and customerdescription.

Fact table B has 5 columns: dateseq, SalesAmount, CostofGoods, RebateAmount, and customerseq.

My code is like this

select
ta.customerdescription,
sum(TA.salesamount) as salesamount,
sum(TB.RebateAmount) as rebateamount

from
TableA TA
left join
tableB TB
on TA.dateseq = TB.dateseq

where
ta.dateseq like '%2013'

group by
ta.customerdescription

I would like to get
Customer A | 10,000
Customer B | 3,000
etc

What I get is:
Customer A | 20,000
Customer B | 6,000
etc
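
[Editor's sketch] The doubling comes from the join itself: joining only on dateseq pairs every TableA row with every TableB row for the same date, so both SUMs count rows repeatedly. Aggregating each table first and joining on the customer key avoids the fan-out; a sketch (assuming customerseq is the key shared by both tables):

Code:

-- Pre-aggregate each side before joining so neither SUM is inflated.
SELECT  ta.customerdescription,
        ta.salesamount,
        tb.rebateamount
FROM (
    SELECT customerseq, customerdescription,
           SUM(SalesAmount) AS salesamount
    FROM TableA
    WHERE dateseq LIKE '%2013'
    GROUP BY customerseq, customerdescription
) AS ta
LEFT JOIN (
    SELECT customerseq, SUM(RebateAmount) AS rebateamount
    FROM TableB
    WHERE dateseq LIKE '%2013'
    GROUP BY customerseq
) AS tb
    ON tb.customerseq = ta.customerseq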

View 2 Replies View Related

Updating A Large Set Of Data

Oct 4, 2006

Hello guys, here is my problem: I am developing an ASP.NET web app in .NET 2.0. I have some sensitive data in my database, which is encrypted using DES (with a key which is only known by the top-level authorities). Now there is an option of changing the secret key. On changing the key, the sensitive data has to be decrypted using the old key and then encrypted again using the new key. Now if the number of records increases, I am afraid that it might take a long time and the application might look as if it has hung. Guys, I have no clue on how to do this. If you have any idea on how to implement this, please let me know. Any help would be appreciated.

Vignesh

View 7 Replies View Related

Large Data Sets

Mar 20, 2008

Hi,
 I'm currently trying to retrieve results from a large dataset: there are over 45,000 records and I need to use them all to perform counts, etc. I have set up views, but my page is still being returned slowly. Is there anything I can do to speed this up?
 Thanks
 Gemma

View 2 Replies View Related

V Large Data Import

Mar 19, 2001

I was wondering if anyone can help me.

I am trying to import data into SQL Server 7. The table will have 700-800 columns, and the data will be about 150,000 records at a time.
The data source is flat file.

First I create the table using a database schema, and secondly I would like to populate the table.
The problem is that most of the data is numeric, and to be used for statistical analysis.

So far I have tried Bulk Insert, bcp, and dts.
DTS is the only method that has worked in any way, shape or form, but it requires importing each column as a varchar. Importing to my pre-created table doesn't work, because it interprets some of the source columns as character data and refuses to insert them into an int field.
Bulk Insert and bcp both give error messages, and I am wondering if that is because of the size of the insert statement required to handle so many fields.

For the moment I am just trying to import the data in any way, but eventually, it will have to be run as an automated process, with the table structure probably needing to be altered as well.

Any help/suggestions would be very gratefully received.
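
[Editor's sketch] One route that keeps DTS (or BULK INSERT) happy and still ends with typed columns, sketched with hypothetical names: load everything as varchar into a staging table, then convert into the real table in one INSERT...SELECT, where unconvertible values can be filtered out or reported instead of aborting the whole load.

Code:

-- Load raw text into staging first, then cast into the typed table.
-- ISNUMERIC is only approximate (it accepts '$', 'e' notation, etc.),
-- but it keeps obvious junk from killing the conversion.
INSERT INTO dbo.StatsData (Subject, Score1, Score2)
SELECT  s.Subject,
        CAST(s.Score1 AS int),
        CAST(s.Score2 AS int)
FROM dbo.StatsStaging AS s
WHERE ISNUMERIC(s.Score1) = 1
  AND ISNUMERIC(s.Score2) = 1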

View 2 Replies View Related

Large String Data

Feb 11, 2004

I have a web site that allows users to enter large strings into a database (comments, etc). What is the best way to do that? Right now I have them limited to 25 characters and the data type is varchar. Is there a better way?

Thanks!

View 2 Replies View Related

Which Is Small And Which Is Large Data

Dec 22, 2014

When should you use a table variable versus a temp table? I told the interviewer that when the row count is small, say hundreds or a few thousand, use a table variable, else use a temp table. After that he asked what I meant by "less data": even a thousand rows, with many columns involved, can make a huge data set.

View 3 Replies View Related

Pagination For Large Data

Jul 20, 2005

I want to build a system that will have about 1 million rows in a table in a SQL Server database. I am using this for a web application and accessing it via a JDBC type 4 driver, but displaying only 20 records at a time using pagination (as in Google). What will be the best way to go about this?
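
[Editor's sketch] If the database is SQL Server 2005 or later, ROW_NUMBER() does server-side paging in one statement (on SQL 2000 the usual fallback is nested TOP queries). A sketch with hypothetical names:

Code:

-- Page 3 of 20-row pages, ordered by ID; @page and @size would come
-- in as parameters from the web tier.
DECLARE @page int, @size int
SELECT @page = 3, @size = 20

SELECT ID, Name
FROM (
    SELECT ID, Name,
           ROW_NUMBER() OVER (ORDER BY ID) AS rn
    FROM dbo.BigTable
) AS t
WHERE t.rn BETWEEN (@page - 1) * @size + 1 AND @page * @size
ORDER BY t.ID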

View 1 Replies View Related

Large Data Needs To Be Deleted??? HOW???

Jul 20, 2005

I have a database that is 70GB big. One of the tables has over 350 million rows of data. I need to delete about 1/3 of the data in that one table. I was going to use a simple delete command to delete the unnecessary data, something like:

Delete from tablename where columname > '100'

However, it takes SOOO LONG to run, over 8 hours, if not longer... What is the best way to delete this data?

One idea was to create a temp table, then move the needed data to the temp table, then truncate the table, and move that data back. I'm not very good with SQL query language, so can someone give me an example of how to do this? Or if you have a different idea that would be faster, please let me know.

thanks,
Sam
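
[Editor's sketch] The standard cure is to delete in modest batches so each transaction commits quickly and the log can be backed up between passes; an index on the filter column keeps each pass from rescanning the table. A sketch using the poster's example names (SQL 2000 syntax; the database name is hypothetical):

Code:

-- Delete 100,000 rows per pass; each DELETE is its own transaction,
-- so the log can be backed up (or truncated) between passes.
SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    DELETE FROM tablename WHERE columname > '100'
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG mydb WITH TRUNCATE_ONLY  -- or a real log backup
END
SET ROWCOUNT 0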

View 2 Replies View Related

Large Data Delete

Jul 20, 2005

Hi. I have a SQL 2000 server with 128m rows of data. I want to delete about 65m of that. So far I have bcp'ed the relevant data out and put it into another SQL database.

We have a small amount of space for our transaction log, so I cannot delete all 65m rows in one go. So far I have been doing them in 0.5m chunks, but it is extremely slow.

Would a faster way be to bcp out the data I want to keep, truncate the table, and bulk import it again? What happens to log size when a bulk import is happening, and is there another way of doing this? Thanks for any help.
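
[Editor's sketch] Since roughly half the table is going away, the keep-and-truncate route is often fastest. SELECT INTO a new table is minimally logged under the simple or bulk-logged recovery model, which is what keeps the log small. A sketch with hypothetical names:

Code:

-- Copy the keeper rows into a fresh table, drop the original, rename.
SELECT *
INTO dbo.MyTable_keep
FROM dbo.MyTable
WHERE KeepFlag = 1          -- whatever identifies the rows to keep

DROP TABLE dbo.MyTable      -- or rename it aside as an archive

EXEC sp_rename 'dbo.MyTable_keep', 'MyTable'
-- then recreate indexes, constraints and permissions on the new table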

View 2 Replies View Related

Large Value Data Type(bcp)

Jun 5, 2006

I want to store some binary things (pics and so on), so I created a table which contains a "varbinary" data-type column.

But: 1. I used OPENROWSET to insert the large file into this table. 2. I used master..xp_cmdshell to retrieve the data out as a file. One strange thing happened: the size of the input and output is really different (the output is 1k bigger than the input file).

And it seems that the file is broken, with a different file format...

I really don't know why...

Any help would be appriciated.....

kavin
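
[Editor's sketch] On SQL Server 2005, OPENROWSET(BULK ..., SINGLE_BLOB) reads the file byte-for-byte on the way in; the 1k discrepancy usually creeps in on the way out, because xp_cmdshell-style extraction renders the column as formatted query output (header, padding, trailing newline) rather than raw bytes. bcp with a trimmed format file, or a small client program, is the byte-exact way to export. The import half as a sketch (hypothetical names):

Code:

-- Byte-exact load of a file into a varbinary(max) column (SQL 2005).
INSERT INTO dbo.Blobs (Name, Data)
SELECT 'photo.jpg', b.BulkColumn
FROM OPENROWSET(BULK 'C:\files\photo.jpg', SINGLE_BLOB) AS b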

View 3 Replies View Related

Adding Column Amounts Per CustomerID

May 26, 2006

Hi guys


Code:


SELECT C.customerID, quantity, unitprice, (SELECT quantity * unitprice), OD.productID
FROM customers C
INNER JOIN Orders O
ON C.customerID = O.customerID
INNER JOIN [order details] OD
ON O.orderID = OD.orderID
ORDER BY C.CustomerID



The Output looks a little like this:


Code:


customerID quantity unitprice productID
---------- -------- --------------------- --------------------- -----------
ALFKI 15 45.6000 684.0000 28
ALFKI 21 18.0000 378.0000 39
ANATR 1 28.8000 28.8000 69
ANATR 7 9.2000 64.4000 19
ANATR 10 34.8000 348.0000 72
ANTON 24 16.8000 403.2000 11
ANTON 15 46.0000 690.0000 43
AROUT 25 3.6000 90.0000 24
AROUT 16 19.0000 304.0000 36
BERGS 16 15.5000 248.0000 44
BERGS 15 44.0000 660.0000 59
BLAUS 3 10.0000 30.0000 21
BLAUS 21 34.0000 714.0000 60
BLONP 35 12.5000 437.5000 31
BLONP 15 19.5000 292.5000 57
BOLID 24 17.6000 422.4000 4
BOLID 16 15.6000 249.6000 57
BONAP 40 36.8000 1472.0000 43
BONAP 20 7.7500 155.0000 75
BONAP 8 30.0000 240.0000 7
BONAP 20 6.0000 120.0000 13
BONAP 20 25.0000 500.0000 6
BONAP 20 23.2500 465.0000 14
BONAP 10 9.2000 92.0000 19
BOTTM 16 24.8000 396.8000 10
BOTTM 9 46.0000 414.0000 43
BOTTM 30 4.5000 135.0000 24
BOTTM 21 49.3000 1035.3000 62
BOTTM 15 2.5000 37.5000 33



I would like to add the totals for each customerID and then show the customerID once with the final total amount.

The above SELECT statement was the closest I could come.

The help is always appreciated!

Justin
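
[Editor's sketch] Dropping the per-row columns and grouping on the customer yields exactly one row per customer with the summed total; a sketch against the same Northwind tables:

Code:

-- One row per customer: SUM the extended price over all order lines.
SELECT  C.customerID,
        SUM(OD.quantity * OD.unitprice) AS totalAmount
FROM customers C
INNER JOIN Orders O
    ON C.customerID = O.customerID
INNER JOIN [order details] OD
    ON O.orderID = OD.orderID
GROUP BY C.customerID
ORDER BY C.customerID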

View 5 Replies View Related






