Fastest Way To Do Large Amounts Of Updates

Mar 3, 2006

I was wondering what the fastest way is to UPDATE lots of records. I heard the fastest way to perform lots of inserts is to use SqlCeResultSet. Would this also be the fastest way to update already-existing records? If so, is this the fastest way to do it:

1. Create a SqlCeCommand object.
2. Set the CommandText to select the data I want to update
3. Call the command object's ExecuteResultSet method to create a SqlCeResultSet object
4. Call the result set object's Read method to advance to the next record
5. Use the result set object to update the values using the SqlCeResultSet.SetValue method and the Update method.
6. repeat steps 4 and 5

Also, I was wondering: do I call the SqlCeResultSet.Update method once per row, or just once? And would it be possible, and faster, to wrap all of that in a transaction?

Would parameterized updates be faster?
Any help will be appreciated.

View 3 Replies



Large Amounts Of Text

Nov 20, 2003

I was wondering what the best solution is to store large amounts of text in a SQL 2000 field. This text will be entered into a multiline textbox.

e.g., what data type? Should I use BLOBs?

Thanks,
Trey

View 4 Replies View Related

Storing Large Amounts Of Data

Mar 31, 2008

Hi,

I was wondering if anyone could help me. I need to store large amounts of data in my database. At present I have it set to nvarchar(8000); I've looked around and noticed you can use the text type, which stores up to about 2 billion characters, but it is slow in displaying the information.

Any ideas or points in the right directions would be great.

Thanks

View 6 Replies View Related

Inserting Large Amounts Of Data

Sep 8, 2005

Does anyone have ideas on the best way to move large amounts of data between tables? I am doing several simple insert/select statements from a staging table to several holding tables, but because of the volume it is taking an extraordinary amount of time. I considered using cursors but have read that they may not be the best thing for this situation. Any thoughts?
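
One approach that sometimes helps (a hedged sketch, not a definitive fix): break the insert/select into keyed batches so each transaction stays small. The table and column names here are assumptions, and it presumes the staging table has an indexed key column.

Code:

DECLARE @lastId int
SET @lastId = 0

WHILE 1 = 1
BEGIN
    -- Copy the next 50,000 rows by key; small batches keep the
    -- transaction log and lock footprint manageable.
    INSERT INTO HoldingTable (id, col1, col2)
    SELECT TOP 50000 id, col1, col2
    FROM StagingTable
    WHERE id > @lastId
    ORDER BY id

    IF @@ROWCOUNT = 0 BREAK

    SELECT @lastId = MAX(id) FROM HoldingTable
END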

View 4 Replies View Related

Dealing With Large Amounts Of Data

Jul 20, 2005

We are looking to store a large amount of user data that will be changed and accessed daily by a large number of people. We expect around 6-8 million subscribers to our service, with each record being approximately 2000-2500 bytes. The system needs to be running 24/7 and therefore cannot be shut down. What is the best way to implement this? We were thinking of setting up a cluster of servers to hold the information and another cluster to back up the information. Is this practical?

Also, what software is available out there that can distribute query calls across different servers and manage large amounts of query requests?

Thank you in advance.
Ben

View 10 Replies View Related

Problem With Large Data Amounts

Jun 24, 2005

I have a dataset with 300,000 records and I'm getting the following error with MS Reporting Services: "An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.

View 11 Replies View Related

Graphing Large Amounts Of Data

May 8, 2007

Greetings



I need to be able to graph roughly 150 employees per supervisor and their monthly cell phone usage in minutes. I understand that I will need to group this, say one graph for every ten employees, so it doesn't look messy and cluttered. I have read some threads here but they don't seem to work for me.



So again, each supervisor has 100+ subordinates and I need to graph their phone usage by month.



thoughts???

km

View 2 Replies View Related

Adding Large Amounts Of Text

Mar 20, 2007

this may seem like a simple question, but I have a report/lease agreement I need to put together and wanted to know the simplest way to add large amounts of text. Basically it's all the legal stuff most leases include, in the amount of some 14 pages.

Should this be just one long string, or does SSRS have another way to format this?

thanks as always

KM

View 2 Replies View Related

Mirroring Large Amounts Of Databases

Mar 6, 2008

I have been looking into mirroring a large number of small databases, approximately 150 databases.



As I understand it, this won't be feasible because of the way mirroring threading works: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=441900&SiteID=1



As I understand it, for every database being mirrored, SQL will ping the mirror every second, causing a network bottleneck?

Also, will the number of threads generated for each mirrored database also cause a bottleneck?



At the moment our database servers are under very little pressure, and as an estimate use about 10% of the resources allocated to them, such as CPU utilization, memory, disk IO and network. Our server hardware is dual quad-core Xeons with 4-8 GB of memory and a variety of 10k SCSI RAID configurations (RAID 5 or RAID 1+0), running SQL 2005 32-bit.



I've done some calculations comparing the log file generation rate to network bandwidth, and there is more than enough network bandwidth.



Has anybody had any luck in mirroring many small databases?



My concern is: how much traffic is caused by the pinging of the mirror for each database?

How many threads will the mirroring create, and what is the maximum number of threads SQL can handle?

How much memory will be consumed by each one of these mirroring threads?

View 1 Replies View Related

Problem Inserting Large Amounts Of Text

Aug 12, 2005

I am running into a problem inserting large amounts of text into my table. Everything works well when I test with a few simple words, but when I try a test with larger amounts of text (i.e., 35,000 characters) the appropriate field is left blank. The insert still performs (all the other fields receive their data), but the "Description" field is blank. I have tried this with both "text" and "ntext" datatypes. I am using a stored procedure with input parameters. As I mentioned, the query goes off flawlessly with small amounts of data (e.g., "Hi there!") but not with the larger amount. I checked, and the ntext field claims to be able to accept 1073741823 bytes of data. Is there some other thing I should consider with large amounts of text?
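
One possibility worth ruling out (purely an assumption, since the procedure isn't shown): the procedure's input parameter being declared as varchar(8000) or nvarchar(4000), which cannot hold a 35,000-character value. A minimal sketch with the parameter declared as ntext instead (procedure, table, and column names are hypothetical):

Code:

CREATE PROCEDURE usp_InsertItem
    @Name        nvarchar(100),
    @Description ntext          -- large-object type, not nvarchar(4000)
AS
INSERT INTO Items (Name, Description)
VALUES (@Name, @Description)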

View 6 Replies View Related

Sqlserver 2000- Large Amounts Of Data 1.2 TB

Apr 20, 2001

P.S. My email was incorrect in the last mail.
Hi all,
Is there a SQL 2K thread? I am interested in finding out the largest SQL Server database people have worked with.
We have a 1.2 terabyte DB with about 150-200 million new rows being processed every day. I would like to share some thoughts on this with other people who are working with this much data and hear what they are doing with it.

bhala
----------------------------------------
Please check us out at: http://www.bivision.org/bivision

View 2 Replies View Related

Fastest Way To Remove Large Table?

Nov 22, 1999

We have a table that we BCP into; the data is then processed and inserted into its appropriate table. Then the table or its data needs to be removed. This seems to be a very slow operation. I have tried DROP TABLE and TRUNCATE TABLE, and each takes nearly as long as the BCP operation. The table has 12 million rows. I didn't think either operation wrote to the transaction log except for page extent management. Why are the drop and truncate so slow? Suggestions?

View 1 Replies View Related

Migrating Large Amounts Of Data From SQL Server 7.0 To 2000

Jul 20, 2005

I'm in the process of migrating a lot of data (millions of rows, 4GB+ of data) from an older SQL Server 7.0 database to a new SQL Server 2000 machine.

Time is not of the essence; my main concern during the migration is that when I copy in the new data, the new database isn't paralyzed by the amount of bulk copying being done. For this reason, I'm splitting the data into one-month chunks (the data's all timestamped and goes back about 3 years), exporting as CSV, compressing the files, and then importing them on the target server. The reason I'm using CSV is because we may want to also copy this data to other non-SQL Server systems later, and CSV is pretty universal. I'm also copying in this format because the target server is remotely hosted and is not accessible by any method except FTP and Remote Desktop -- no database-to-database copying allowed for security reasons.

My questions:

1) Given all of this, what would be the least intrusive way to copy over all this data? The target server has to remain running and be relatively uninterrupted. One of the issues that goes hand-in-hand with this is indexes: should I copy over all the data first and then create indexes, or allow SQL Server to rebuild indexes as I go?

2) Another option is to make a SQL Server backup of the database from the old server, upload it, mount it, and then copy over the data. I'm worried that this would slow operations down to a crawl, though, which is why I'm taking the piecemeal approach.

Comments, suggestions, raw fish?
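
For the import step itself, a hedged sketch of loading one monthly CSV chunk with BULK INSERT; the path, table name, and terminators are illustrative assumptions. BATCHSIZE commits in smaller transactions, which helps keep the target server responsive during the load:

Code:

BULK INSERT dbo.TargetTable
FROM 'D:\import\data_2003_01.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 10000,   -- commit every 10,000 rows
    TABLOCK              -- cheaper locking during the load
)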

View 2 Replies View Related

Sizing A Pagefile On Servers With Large Amounts Of Memory

Sep 19, 2007

I know the standard Microsoft recommendation is to make the pagefile at least 1.5 to 3 times larger than the amount of physical memory. However, if you're talking about a server with lots of memory, such as 16GB or 32GB, would following this rule be unnecessary? With SQL 2000 running on Windows 2000 Server or Windows Server 2003, I typically see pagefile usage of no more than 12% for a 2GB pagefile. Anything over 15% means I need to look at other indicators to see if a memory bottleneck has developed. If I have 32GB of physical memory and make the pagefile only 1.5 x 32GB, I have a 48GB pagefile. 10% of this is 4.8GB, which I would hope I never see consumed.

Any thoughts?

Thanks, Dave

View 11 Replies View Related

Best Way To Insert Large Amounts Of Data From A Webform To SQL Server 2005

Oct 21, 2007

Hi,

I have a VB.net web page which generates a datatable of values (3 columns and on average about 1000-3000 rows). What is the best way to get this data table into SQL Server? I can create a table on SQL Server no problem, but I've found simply looping through the datatable and doing 1000-3000 insert statements is slow (a few seconds). I'd like to make this as streamlined as possible, so I was wondering if there is a native way to insert all records in a batch via ADO.net or something.

Any ideas?

Thanks
Ed

View 1 Replies View Related

SQL Server Storing Large Amounts Of Data In Multiple Tables

Jul 20, 2005

Hello,

Currently we have a database, and it is our desire for it to be able to store millions of records. The data in the table can be divided up by client, and it stores nothing but about 7 integers.

| table |
| id | clientId | int1 | int2 | int3 | ... |

Right now, our benchmarks indicate a drastic increase in performance if we divide the data into different tables. For example, table_clientA, table_clientB, table_clientC, despite the fact the tables contain the exact same columns. This however does not seem very clean or elegant to me, and rather illogical since a database exists as a single file on the hard drive.

| table_clientA |
| id | clientId | int1 | int2 | int3 | ... |
| table_clientB |
| id | clientId | int1 | int2 | int3 | ... |
| table_clientC |
| id | clientId | int1 | int2 | int3 | ... |

Is there any way to duplicate this increase in database performance gained by splitting the table, perhaps by using a certain type of index?

Thanks,
Jeff Brubaker
Software Developer
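
One thing to try (a hedged sketch, using the table layout above): a clustered index leading on clientId stores each client's rows contiguously, which is essentially the physical layout the per-client tables give you:

Code:

-- Cluster on clientId so each client's rows are stored together,
-- mimicking the per-client-table split inside a single table.
CREATE CLUSTERED INDEX IX_table_clientId
ON [table] (clientId, id)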

View 4 Replies View Related

Sql 2005 Updating Table Definition With 'large' Amounts Of Data - Timeout

Feb 4, 2006

I'm trying to move my current use of an sql 2000 db to sql 2005.



I need to update a table definition (to change a field to an Identity)



I'm getting a dialog box (in SQL Server Management Studio) on save saying:



'xxxx' table

- Saving Definition Changes to tables with large amounts of data could take a considerable amount of time. While changes are being saved, table data will not be accessible.



I press 'Yes' to the dialog box.



After 35 seconds, I get another dialog box saying:



'xxxx' table

- Unable to modify table.

Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.



Well, the server is responding: I can query that table and others, I can add/delete rows in other tables, and I can modify other (smaller) tables.



Any ideas where I can change this timeout?



Daniel

View 10 Replies View Related

Firing A Trigger When A User Updates Data But Not When A Stored Procedure Updates Data

Dec 19, 2007

I have a project that consists of a SQL db with an Access front end as the user interface. Here is the structure of the table on which this question is based:




Code Block

create table #IncomeAndExpenseData (
recordID nvarchar(5) NOT NULL,
itemID int NOT NULL,
itemvalue decimal(18, 2) NULL,
monthitemvalue decimal(18, 2) NULL
)
The itemvalue field is where the user enters his/her numbers via Access. There is an IncomeAndExpenseCodes table as well which holds item information, including the itemID and entry unit of measure. Some itemIDs have an entry unit of measure of $/mo, while others are entered in terms of $/yr, others in %/yr.

For itemvalues of itemIDs with entry units of measure that are not $/mo, a stored procedure performs calculations which convert them into numbers that have a unit of measure of $/mo, and updates IncomeAndExpenseData, putting these numbers in the monthitemvalue field. This stored procedure is written to only calculate values for monthitemvalue fields which are null, in order to avoid recalculating every single row in the table.

If the user edits the itemvalue field there is a trigger on IncomeAndExpenseData which sets the monthitemvalue to null so the stored procedure recalculates the monthitemvalue for the changed rows. However, it appears this trigger is also setting monthitemvalue to null after the stored procedure updates the IncomeAndExpenseData table with the recalculated monthitemvalues, thus wiping out the answers.

How do I write a trigger that sets the monthitemvalue to null only when the user edits the itemvalue field, not when the stored procedure puts the recalculated monthitemvalue into the IncomeAndExpenseData table?
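
One common technique is to flag the connection with SET CONTEXT_INFO inside the stored procedure and have the trigger return early when the flag is present. A hedged sketch (the trigger name is made up, and reading CONTEXT_INFO() this way assumes SQL 2005):

Code:

-- In the stored procedure, around the recalculation UPDATE:
--     SET CONTEXT_INFO 0x01
--     ... the UPDATE that fills in monthitemvalue ...
--     SET CONTEXT_INFO 0x00

CREATE TRIGGER trg_IncomeAndExpense_itemvalue
ON IncomeAndExpenseData
AFTER UPDATE
AS
BEGIN
    -- Update came from the flagged stored procedure: leave monthitemvalue alone
    IF SUBSTRING(CONTEXT_INFO(), 1, 1) = 0x01
        RETURN

    -- A user edited itemvalue: clear monthitemvalue so it gets recalculated
    IF UPDATE(itemvalue)
        UPDATE d
        SET d.monthitemvalue = NULL
        FROM IncomeAndExpenseData d
        JOIN inserted i
          ON i.recordID = d.recordID
         AND i.itemID = d.itemID
END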

View 4 Replies View Related

How Can I Do A Multiple Insert Or Multiple Updates Or Inserts And Updates To The Same Table..

Oct 30, 2007

Hi...
I have data that I am getting through a DBF file, and I am dumping that data to a SQL Server. Then, after scrubbing it, I take the data from the SQL Server and put it into the production database. Right now my stored procedure handles a single plan only, but now there may be two or more plans together in the same SQL Server database, which I need to scrub and then update if that particular plan already exists, or insert if it doesn't.
 
this is my sproc...
ALTER PROCEDURE [dbo].[usp_Import_Plan]
    @ClientId   int,
    @UserId     int = NULL,
    @HistoryId  int,
    @ShowStatus bit = 0  -- Indicates whether status messages should be returned during the import.

AS

SET NOCOUNT ON

DECLARE
    @Count      int,
    @Sproc      varchar(50),
    @Status     varchar(200),
    @TotalCount int

SET @Sproc = OBJECT_NAME(@@ProcId)

SET @Status = 'Updating plan information in Plan table.'
UPDATE
    Statements..Plan
SET
    PlanName    = PlanName1,
    Description = PlanName2
FROM
    Statements..Plan cp
    JOIN (
        SELECT DISTINCT
            PlanId,
            PlanName1,
            PlanName2
        FROM
            Census
    ) c
        ON cp.CPlanId = c.PlanId
WHERE
    cp.ClientId = @ClientId
    AND (
        IsNull(cp.PlanName, '') <> IsNull(c.PlanName1, '')
        OR IsNull(cp.Description, '') <> IsNull(c.PlanName2, '')
    )

SET @Count = @@ROWCOUNT
IF @Count > 0
BEGIN
    SET @Status = 'Updated ' + Cast(@Count AS varchar(10)) + ' record(s) in ClientPlan.'
END
ELSE
BEGIN
    SET @Status = 'No records were updated in Plan.'
END

SET @Status = 'Adding plan information to Plan table.'
INSERT INTO Statements..Plan (
    ClientId,
    ClientPlanId,
    UserId,
    PlanName,
    Description
)
SELECT DISTINCT
    @ClientId,
    CPlanId,
    @UserId,
    PlanName1,
    PlanName2
FROM
    Census
WHERE
    PlanId NOT IN (
        SELECT DISTINCT
            CPlanId
        FROM
            Statements..Plan
        WHERE
            ClientId = @ClientId
            AND ClientPlanId IS NOT NULL
    )

SET @Count = @@ROWCOUNT
IF @Count > 0
BEGIN
    SET @Status = 'Added ' + Cast(@Count AS varchar(10)) + ' record(s) to Plan.'
END
ELSE
BEGIN
    SET @Status = 'No information was added Plan.'
END

SET NOCOUNT OFF
 
So how do I do multiple inserts and updates using this stored procedure?
 
Regards
Karen

View 5 Replies View Related

Compare Amounts Between 2 Tables

Sep 21, 2012

I have two tables. One is Invoice_tbl, with one account per customer.

This table has 3 fields: CustomerID, InvoiceAmount, InvoiceID.

The second table is Payment_tbl, with 2 fields: InvoiceID and PaymentAmount. The Payment table can have multiple payments from each customer.

With Access, I would run a query (call it PaymentTotal) against Payment_tbl, then do a "Group By" on InvoiceID and SUM on the "Amount" field.

I would then create a new query against Invoice_tbl and INNER JOIN on PaymentTotal.

How would I do this with SQL?
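
A hedged sketch of the T-SQL equivalent, with the saved PaymentTotal query replaced by a derived table (column names taken from the description above):

Code:

SELECT i.CustomerID,
       i.InvoiceID,
       i.InvoiceAmount,
       pt.TotalPaid
FROM Invoice_tbl i
INNER JOIN (
    -- the Access "PaymentTotal" query: sum payments per invoice
    SELECT InvoiceID, SUM(PaymentAmount) AS TotalPaid
    FROM Payment_tbl
    GROUP BY InvoiceID
) pt ON pt.InvoiceID = i.InvoiceID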

View 6 Replies View Related

Joining Two Tables (Sum Of Amounts)

Dec 30, 2013

I have a view, Table A, with five columns: dateseq, SalesAmount, customerseq, CostofGoods, and customerdescription.

Fact table B has five columns: dateseq, SalesAmount, CostofGoods, RebateAmount, and Customerseq.

My code is like this

select
    ta.customerdescription,
    sum(TA.salesamount) as salesamount,
    sum(TB.RebateAmount) as rebateamount

from
    TableA TA
    left join tableB TB
        on TA.dateseq = TB.dateseq

where
    ta.dateseq like '%2013'
group by
    ta.customerdescription

I would like to get
Customer A | 10,000
Customer B | 3,000
etc

What I get is:
Customer A | 20,000
Customer B | 6,000
etc
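
The doubling is what a join fan-out produces: when several TableB rows share a dateseq with several TableA rows, each amount gets counted more than once. A hedged sketch that aggregates each table before joining (it assumes customerseq is the right key to line the two sides up):

Code:

SELECT a.customerdescription,
       a.salesamount,
       b.rebateamount
FROM (
    SELECT customerdescription, customerseq,
           SUM(SalesAmount) AS salesamount
    FROM TableA
    WHERE dateseq LIKE '%2013'
    GROUP BY customerdescription, customerseq
) a
LEFT JOIN (
    SELECT Customerseq, SUM(RebateAmount) AS rebateamount
    FROM TableB
    WHERE dateseq LIKE '%2013'
    GROUP BY Customerseq
) b ON b.Customerseq = a.customerseq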

View 2 Replies View Related

Adding Column Amounts Per CustomerID

May 26, 2006

Hi guys


Code:


SELECT C.customerID, quantity, unitprice, quantity * unitprice AS total, OD.productID
FROM customers C
INNER JOIN Orders O
    ON C.customerID = O.customerID
INNER JOIN [order details] OD
    ON O.orderID = OD.orderID
ORDER BY C.CustomerID



The Output looks a little like this:


Code:


customerID quantity unitprice             total                 productID
---------- -------- --------------------- --------------------- -----------
ALFKI 15 45.6000 684.0000 28
ALFKI 21 18.0000 378.0000 39
ANATR 1 28.8000 28.8000 69
ANATR 7 9.2000 64.4000 19
ANATR 10 34.8000 348.0000 72
ANTON 24 16.8000 403.2000 11
ANTON 15 46.0000 690.0000 43
AROUT 25 3.6000 90.0000 24
AROUT 16 19.0000 304.0000 36
BERGS 16 15.5000 248.0000 44
BERGS 15 44.0000 660.0000 59
BLAUS 3 10.0000 30.0000 21
BLAUS 21 34.0000 714.0000 60
BLONP 35 12.5000 437.5000 31
BLONP 15 19.5000 292.5000 57
BOLID 24 17.6000 422.4000 4
BOLID 16 15.6000 249.6000 57
BONAP 40 36.8000 1472.0000 43
BONAP 20 7.7500 155.0000 75
BONAP 8 30.0000 240.0000 7
BONAP 20 6.0000 120.0000 13
BONAP 20 25.0000 500.0000 6
BONAP 20 23.2500 465.0000 14
BONAP 10 9.2000 92.0000 19
BOTTM 16 24.8000 396.8000 10
BOTTM 9 46.0000 414.0000 43
BOTTM 30 4.5000 135.0000 24
BOTTM 21 49.3000 1035.3000 62
BOTTM 15 2.5000 37.5000 33



I would like to add up the totals for each customerID and then show the customerID once with the final total amount.

The above SELECT statement was the closest I could come.
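
A hedged sketch of the grouped version against the same Northwind tables: drop the per-line columns, sum the extended price, and GROUP BY collapses the result to one row per customer:

Code:

SELECT C.customerID,
       SUM(OD.quantity * OD.unitprice) AS totalAmount
FROM customers C
INNER JOIN Orders O
    ON C.customerID = O.customerID
INNER JOIN [order details] OD
    ON O.orderID = OD.orderID
GROUP BY C.customerID
ORDER BY C.customerID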

The help is always appreciated!

Justin

View 5 Replies View Related

Doing Unions With Uneven Column Amounts

Mar 14, 2006

I'm creating a program that allows users to submit a report on equipment at regular intervals. If a piece of equipment has a problem, it is given a job entry that refers back to the report for various information.

However, there will be times when a problem is noticed, and someone wants to submit it immediately; these are made into extra jobs.

To this end, I have three tables:
Reports
Jobs
ExtraJobs

Because ExtraJobs cannot be associated with a report, they have their own table, which holds information that would otherwise be grabbed from both Reports and Jobs. While there are separate submission forms for regular jobs and extra jobs, I would like them to appear in the same query result when a user looks at submitted jobs or reports.

What I'm currently trying to do is this:
Code:

SELECT *
FROM Reports
LEFT JOIN Jobs
    ON Reports.reportID = Jobs.reportID
UNION ALL
SELECT ExtraJobs.*
FROM ExtraJobs


This won't work because the first half of the union has an extra column (reportID) that the second half does not. Is there any way to add in a value for that non-existent column (say, ExtraJobs.reportID = -1) to make sure that both sides have an equal number of columns?

If worst comes to worst, I could add a reportID column to ExtraJobs and have it set to -1 for everything, but I'd like to keep from adding fat, if at all possible.
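
Selecting a constant in the second half does exactly that, with no schema change. A minimal sketch (jobID and description stand in for whatever the real shared columns are):

Code:

SELECT Reports.reportID, Jobs.jobID, Jobs.description
FROM Reports
LEFT JOIN Jobs
    ON Reports.reportID = Jobs.reportID
UNION ALL
SELECT -1 AS reportID, ExtraJobs.jobID, ExtraJobs.description
FROM ExtraJobs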

View 2 Replies View Related

SSIS Using MASSIVE Amounts Of Memory

Feb 8, 2008

Hi,

I have a series of SSIS packages, all of which are ultimately executed by a parent package.

I'm consistently getting "OutOfMemory" errors when working with the packages, which is temporarily solved by closing Visual Studio and re-opening the package(s)... This solution is short-lived, however, as the OutOfMemory error occurs quite quickly after re-opening, often after doing nothing other than altering a variable's default value and attempting to save the package.

The average size of the packages in question (.dtsx files) is around 7,000kb with the largest being 12,500kb. The total size of all the solution's packages is ~75,000kb.

The Processes tab in Task Manager shows a Mem Usage counter for devenv.exe *32 of around 20,000kb when Visual Studio is first opened however, when a single ~6,000kb dtsx file is opened this counter jumps to +300,000kb and when the entire solution is opened (When the parent package is executed), the Mem Usage counter for devenv.exe *32 is a massive +800,000kb!!!

Is this normal SSIS behaviour or do I have a major problem? Any tips or suggestions as to how to resolve this issue would be gratefully received.

FYI, "SELECT @@VERSION" gives me "Microsoft SQL Server 2005 - 9.00.3042.00 (X64) Feb 10 2007 00:59:02 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2) "

My Server is Windows Server 2003 R2 Enterprise x64 SP2 with 8GB of RAM.

Thanks in advance.

Leigh.

View 7 Replies View Related

Set Amounts To Zero In Addition If No Records Found

May 9, 2006

How can I check for NULL amounts if no records are returned by either SELECT? Basically, it errors out if one or both of the amounts return no records. I need some way (an IF statement, perhaps) to set one or both amounts to zero in those cases so it doesn't error out on me.

   SELECT (Coalesce(pd1_Amount, 0) + Coalesce(PD2_Amount, 0)) as Amount
   FROM    
     (

     SELECT pd.Amount as pd1_Amount
          FROM Master m (NOLOCK)
          LEFT JOIN dbo.pdc pd ON pd.number = m.number
          INNER JOIN dbo.Customer c ON c.Customer = m.Customer

          WHERE     pd.Active = 1
                    AND pd.Entered BETWEEN DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate) + 1, @FirstDayMonthDate) AND DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate), DATEADD(MONTH, 1, @FirstDayMonthDate))
     AND pd.Entered <> '1900-01-01 00:00:00.000'
                    AND pd.Deposit BETWEEN DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate) + 1, @FirstDayMonthDate) AND DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate), DATEADD(MONTH, 1, @FirstDayMonthDate))
                    --AND pd.Deposit IS NOT NULL    
                    --AND pd.OnHold IS NULL
                    AND c.customer <> '9999999'
          
          UNION

          SELECT   pdd.Amount as PD2_Amount
          FROM Master m (NOLOCK)
          LEFT JOIN dbo.pdcdeleted pdd ON pdd.number = m.number
          INNER JOIN dbo.Customer c ON c.Customer = m.Customer

          WHERE     pdd.Entered BETWEEN DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate) + 1, @FirstDayMonthDate) AND DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate), DATEADD(MONTH, 1, @FirstDayMonthDate))
              AND pdd.Entered <> '1900-01-01 00:00:00.000'
                    AND pdd.Deposit BETWEEN DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate) + 1, @FirstDayMonthDate) AND DATEADD(DAY, -DATEPART(DAY, @FirstDayMonthDate), DATEADD(MONTH, 1, @FirstDayMonthDate))
                    --AND pdd.Deposit IS NOT NULL    
                    --AND pdd.OnHold IS NULL
                    AND c.customer <> '9999999'
) as PDC_Main
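
A hedged sketch of one way around the error: compute each side as a scalar SUM subquery (which yields NULL rather than zero rows when nothing matches) and wrap it in ISNULL. The date filters are abbreviated here; they would be the same Entered/Deposit conditions as above.

Code:

SELECT ISNULL((
           SELECT SUM(pd.Amount)
           FROM Master m (NOLOCK)
           LEFT JOIN dbo.pdc pd ON pd.number = m.number
           INNER JOIN dbo.Customer c ON c.Customer = m.Customer
           WHERE pd.Active = 1
             AND c.customer <> '9999999'
             -- same Entered/Deposit date range filters as above
       ), 0)
     + ISNULL((
           SELECT SUM(pdd.Amount)
           FROM Master m (NOLOCK)
           LEFT JOIN dbo.pdcdeleted pdd ON pdd.number = m.number
           INNER JOIN dbo.Customer c ON c.Customer = m.Customer
           WHERE c.customer <> '9999999'
             -- same Entered/Deposit date range filters as above
       ), 0) AS Amount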

View 3 Replies View Related

What Is The Fastest Way To Do This

May 14, 2008

Right now I have a stored procedure that goes through each of the Line and Body fields using a cursor. The problem is that this method is very slow. How would you experts solve this problem? Any hints or suggestions?


BEFORE

Example | Part   | Line        | Body   | Series | Engine | Year
--------+--------+-------------+--------+--------+--------+-----
1       | 1234   | A,B         |        | W      | ETC    | 1998
2       | 56789  | 91,93,94,95 |        | W      | ET0    | 1997
3       | 345656 | S,R         | 5,6,12 | W      | ENC    | 1995

AFTER

Example | Part   | Line | Body | Series | Engine | Year
--------+--------+------+------+--------+--------+-----
1       | 1234   | A    |      | W      | ETC    | 1998
1       | 1234   | B    |      | W      | ETC    | 1998
2       | 56789  | 91   |      | W      | ET0    | 1997
2       | 56789  | 93   |      | W      | ET0    | 1997
2       | 56789  | 94   |      | W      | ET0    | 1997
2       | 56789  | 95   |      | W      | ET0    | 1997
3       | 345656 | S    | 5    | W      | ENC    | 1995
3       | 345656 | S    | 6    | W      | ENC    | 1995
3       | 345656 | S    | 12   | W      | ENC    | 1995
3       | 345656 | R    | 5    | W      | ENC    | 1995
3       | 345656 | R    | 6    | W      | ENC    | 1995
3       | 345656 | R    | 12   | W      | ENC    | 1995
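
A hedged, set-based sketch of the expansion for the Line column. It assumes a source table named PartList matching the layout above and an auxiliary Numbers table of sequential integers (1, 2, 3, ...); the Body column can be split the same way in a second pass:

Code:

SELECT p.Example,
       p.Part,
       -- extract the comma-delimited value that starts at position n
       SUBSTRING(p.Line, n.n,
                 CHARINDEX(',', p.Line + ',', n.n) - n.n) AS Line,
       p.Body,
       p.Series,
       p.Engine,
       p.Year
FROM PartList p
JOIN Numbers n
    ON n.n <= LEN(p.Line)
   AND SUBSTRING(',' + p.Line, n.n, 1) = ','  -- n marks the start of a value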

View 4 Replies View Related

Is There Fastest Way To Encryption All SPs In All DBs?

Sep 29, 2000

Hi,
In my SQL server 7.0, I have got 250 store procedures in each database.
Before using them for my application, I want to ecyption all.
I must add "WITH ENCRYPTION" string in each SP in all database and it'll take me a long time. Is there fastest way to encryption all SPs in all DBs? Have anyone got an utility SP ( or anyway else) to do this?
Thanks in advance.

View 1 Replies View Related

Fastest INSERT

Mar 10, 2008

What is the fastest way for a stored procedure to copy a table from a linked server?

I would like to tune this statement, possibly with hints or other logging options. Assume that table_A and table_B have the exact same table structure, and that I want to preserve table_A and all its indexes and constraints. The table will be truncated before this load, if that helps in any way.

insert into table_A select * from OpenQuery(Server,'select * from Table_B')

TIA, Mike

View 4 Replies View Related

Fastest Way Of Updating A Row

Jul 20, 2005

In relation to my last post, I have a question for the SQL gurus. I need to update 70k records, and mark all those updated in a special column for further processing by another system.

So, if the record was

Key1, foo, foo, ""

it needs to become

Key1, fap, fap, "U"

if and only if the data values are actually different (as above, foo becomes fap); otherwise it must become

Key1, foo, foo, ""

Is it quicker to:

1) get the row of the destination table, inspect all values programmatically, and determine if an update query is needed, OR

2) just do an update on all rows, but adding "and (field1 <> value1 or field2 <> value2)" to the update query, that is:

update myTable
set field1 = 'foo', markField = 'U'
where key = 'mykey' and (field1 <> 'foo')

The first one will not generate new update queries if the record has not changed, on account of doing a select, whereas the second version always runs an update, but some of them will not affect any lines. Will I need a full index on the second version?

Thanks in advance,
Asger Henriksen
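
A hedged sketch of option 2 done as a single set-based statement, assuming the incoming 70k records sit in a staging table (NewValues is a made-up name). Only rows whose values actually differ are touched, so unchanged rows are never written or marked:

Code:

UPDATE t
SET t.field1 = n.field1,
    t.field2 = n.field2,
    t.markField = 'U'
FROM myTable t
JOIN NewValues n
    ON n.[key] = t.[key]
WHERE t.field1 <> n.field1
   OR t.field2 <> n.field2

One caveat: <> skips comparisons involving NULL, so if the fields are nullable, each comparison also needs an (one IS NULL AND the other IS NOT NULL) check.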

View 2 Replies View Related

Producing A Summary Table Of Amounts Per Status Per User

Jan 5, 2012

I want to produce a summary table of amounts per status per user.

I have 2 tables:

Invoices:

Code:
user_id, amount, status
1, £10, S
2, £20, P
3, £30, P
3, £40, E

Users:

Code:
user_id, name
1, user A
2, user B
3, user C

And I want to produce a summary table like this:

Code:
         S     P     E     Total
user A   £10               £10
user B         £20         £20
user C         £30   £40   £70

What I have is:

Code:
SELECT Users.name,
    (SELECT SUM(amount) FROM Invoices AS t1 WHERE t1.user_id = Invoices.user_id AND t1.status = 'S'),
    (SELECT SUM(amount) FROM Invoices AS t1 WHERE t1.user_id = Invoices.user_id AND t1.status = 'P'),
    (SELECT SUM(amount) FROM Invoices AS t1 WHERE t1.user_id = Invoices.user_id AND t1.status = 'E'),
    (SELECT SUM(amount) FROM Invoices AS t1 WHERE t1.user_id = Invoices.user_id AND t1.status IN ('S','P','E'))
FROM Invoices
LEFT JOIN Users ON Users.user_id = Invoices.user_id
GROUP BY Invoices.user_id, Users.name
ORDER BY Users.name

This does give me what I want. However, the real situation has lots of status codes, many more fields in the Invoices table, hundreds of users, and hundreds of thousands of records in the Invoice table, and I have run out of system memory.
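
A hedged sketch of the same cross-tab computed in a single pass over Invoices, replacing the per-status correlated subqueries with conditional SUMs; with many status codes and rows this should be far lighter than the original:

Code:

SELECT u.name,
       SUM(CASE WHEN i.status = 'S' THEN i.amount ELSE 0 END) AS S,
       SUM(CASE WHEN i.status = 'P' THEN i.amount ELSE 0 END) AS P,
       SUM(CASE WHEN i.status = 'E' THEN i.amount ELSE 0 END) AS E,
       SUM(i.amount) AS Total
FROM Invoices i
LEFT JOIN Users u ON u.user_id = i.user_id
GROUP BY i.user_id, u.name
ORDER BY u.name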

View 9 Replies View Related

Select CASE - Add Up Amounts Based On Transaction Type

Dec 18, 2014

Getting Incorrect Syntax near the keyword 'and'

This table returns multiple records for an Invoice.

Based on the transactiontype_desc, the Amount_Paid_DC is a different value. I'm trying to add up the amounts based on the transaction type.

select DebtorNumber, InvoiceNumber, Sum(Amount_Invoiced_DC) AS InvAmt,
case transactiontype_desc when 'Sales Invoice' then sum(Amount_Paid_DC) else 0 end as AmtPaid,
case transactiontype_desc when 'Discount/Surcharge' and Amount_Paid_DC < 0 then sum(Amount_Paid_DC) else 0 end as DiscountAmt
FROM BI50_BankTransactions_AR_InvcDt_H
group by debtornumber, Invoicenumber
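
The error comes from mixing the simple CASE form (CASE expr WHEN value) with an extra AND; that combination needs the searched form (CASE WHEN condition). A hedged sketch with the aggregate moved outside each CASE so the existing GROUP BY still works:

Code:

SELECT DebtorNumber,
       InvoiceNumber,
       SUM(Amount_Invoiced_DC) AS InvAmt,
       SUM(CASE WHEN transactiontype_desc = 'Sales Invoice'
                THEN Amount_Paid_DC ELSE 0 END) AS AmtPaid,
       SUM(CASE WHEN transactiontype_desc = 'Discount/Surcharge'
                 AND Amount_Paid_DC < 0
                THEN Amount_Paid_DC ELSE 0 END) AS DiscountAmt
FROM BI50_BankTransactions_AR_InvcDt_H
GROUP BY DebtorNumber, InvoiceNumber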

View 6 Replies View Related

Fastest Data Transfer

Dec 26, 2000

Hi,
1) I need to transfer 500 GB of data from one server to another; which is faster: DTS, BCP, or restore?
2) What are the best methods for checking blocking, deadlocks, and indexes?

Thanks you all in advance

Richard..

View 5 Replies View Related

Fastest/best Way To Handle Update

Nov 5, 2004

I have a master table which has demographic data such as name, DOB, and location, along with a primary key ID. It will have about 10-12,000 records. We get a refresh file every hour which may or may not have corrections for these records, with about 3,000 records. I put this data into a table. This data should always be considered correct. To handle the update to the master table I need to create an update process. I can take one of two approaches: just update all the records in the master table regardless of whether they are correct or not, or do some type of left join on those that do not match (in other words, only update the ones where the names or DOB don't match). There is an underlying update trigger on the patient master which will also fire if these values are changed. Any opinions on the best approach?
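
A hedged sketch of the second approach (table, key, and column names are assumptions): a single UPDATE joined to the refresh table, with a WHERE clause that skips matching rows so the trigger only fires for genuine corrections:

Code:

UPDATE m
SET m.name = r.name,
    m.dob = r.dob,
    m.location = r.location
FROM PatientMaster m
JOIN RefreshFile r
    ON r.patient_id = m.patient_id
WHERE m.name <> r.name
   OR m.dob <> r.dob
   OR m.location <> r.location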

View 1 Replies View Related






