Help Understanding Batches

Dec 4, 2007

I am using SQL Server Express and Visual Studio 2005. I am new to batches and am trying to understand how they work. I am trying to write a query that creates an assembly and the functions that are contained in it. Here is my query:

USE ProductsDRM
GO

IF NOT EXISTS (SELECT 'True' FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
BEGIN
CREATE ASSEMBLY ComputedColumnFunctions
FROM 'C:\Websites\AssemblyTest\StoredFunctions\StoredFunctions\bin\StoredFunctions.dll'
GO

CREATE FUNCTION fImageFileName
(
@ProductID int,
@ImageSizeCode nvarchar(4000)
)
RETURNS nvarchar(4000)
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].ImageFileName
GO

CREATE FUNCTION fTestInt
(
@ProductID int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt
GO

CREATE FUNCTION fTestInt2
(
@TestInt int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt2
END
ELSE
BEGIN
PRINT 'The assembly named "ComputedColumnFunctions" already exists. No new assembly was created.'
END

GO

A book on SQL Server 2005 recommends testing whether an object (such as an assembly, in this case) exists before trying to create it. If I keep only the CREATE ASSEMBLY statement and the FROM line below it, and delete everything from the next GO down through the last CREATE FUNCTION (just before the END ELSE), it works fine. If I leave it as is, I get a runtime error on the GO line just after the CREATE ASSEMBLY statement. What am I doing wrong?
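For reference, GO is not a T-SQL statement: it is a batch separator recognized by client tools such as Management Studio and sqlcmd. The client splits the script at every GO, so the first batch here ends with a BEGIN that has no matching END, which is why the error points at the GO line. CREATE FUNCTION also has to be the first statement in its batch. A minimal sketch of one workaround, keeping the existence check and running each CREATE FUNCTION as its own dynamic batch (names are taken from the query above; the path is shortened):

IF NOT EXISTS (SELECT 1 FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
BEGIN
    -- CREATE ASSEMBLY has no first-in-batch restriction, so it can sit inside the IF block
    CREATE ASSEMBLY ComputedColumnFunctions
    FROM 'C:\Websites\...\StoredFunctions.dll';

    -- CREATE FUNCTION must start its own batch, so wrap it in EXEC() instead of using GO
    EXEC (N'CREATE FUNCTION fImageFileName (@ProductID int, @ImageSizeCode nvarchar(4000))
            RETURNS nvarchar(4000)
            AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].ImageFileName;');

    -- repeat the EXEC() pattern for fTestInt and fTestInt2
END
ELSE
    PRINT 'The assembly named "ComputedColumnFunctions" already exists. No new assembly was created.';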

View 5 Replies



Sql Batches In Vs.net

Jul 20, 2005

Visual Studio .NET seems to default to single-batch mode when running SQL scripts. Does anyone know how to change this behavior? Typical batches will include object existence checks, drops, and creates prior to processing. I have attempted removing the graphical plan, using the batch separator 'GO', and the VBA trick using ';', to no avail. TIA

View 1 Replies View Related

How Do I Update A Table In Batches ?

Jul 25, 2006

I have an application that processes a large number of input files in a CSV format and then posts the data to a table on SQL Server Express.

The data table can end up very large and I have no requirements to store all the data locally.
I have included the table in my DataSet using visual studio express so I have access to the schema, but will not run Fill() on it.

Ideally I would like to process a CSV file at a time.
I can add records to my local (empty) data table and when I am happy, I can call tableAdapter.Update() or dataSet.DataTable.AcceptChanges() to generate lots of SQL 'INSERT' commands to update the physical database at the server end.

I would then like to empty my local data table (so it doesn't get too big) and repeat the same process over again for each CSV file.

How can I empty my local table without causing it to generate a load of SQL 'DELETE' commands? I want to empty the table and fool ADO.NET into thinking that everything is synchronised, as if it has just done an update but actually hasn't.


Regards

View 3 Replies View Related

T-SQL (SS2K8) :: Updating Rows In Batches?

Nov 6, 2014

I have a production table with 400 million rows.

I have a staging table which has 48 million rows. This data is the same as the production data, except one column has a different value.

Create Table Production
(
Id Int Identity(1,1),
Code Varchar(20),
ReferenceSequence int
)

-- Staging Table

Create Table Staging
(
Code Varchar(20),
NewSequence int
)

I need to update the production table with the newSequence value from staging to replace the ReferenceSequence. I.e:

Update Production
Set ReferenceSequence = Staging.NewSequence
From Staging
Where Production.Code = Staging.Code

However, updating 48 million rows at once will generate a lot of logging!

How can I do 1 million rows at a time, commit the changes then do the next million?

I've tried some of the examples on the following page [URL], but they look to just update the tables with the same values.
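For reference, a minimal sketch of the usual batched approach: update a limited number of rows per statement and loop until nothing is left, so each batch commits (and its log space can be reused) before the next one starts. The WHERE guard that skips rows already carrying the new value is an assumption to keep the loop making progress; adjust the batch size, and take log backups between batches if the database is in FULL recovery.

DECLARE @BatchSize int = 1000000,
        @Rows int = 1;

WHILE @Rows > 0
BEGIN
    UPDATE TOP (@BatchSize) p
    SET    p.ReferenceSequence = s.NewSequence
    FROM   Production p
    JOIN   Staging s ON s.Code = p.Code
    WHERE  p.ReferenceSequence IS NULL
       OR  p.ReferenceSequence <> s.NewSequence;   -- only touch rows not yet updated

    SET @Rows = @@ROWCOUNT;                        -- loop ends when no rows qualify
END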

View 4 Replies View Related

Exec Multiple Batches (GO) With Dynamic Sql

Jun 10, 2008

Hello SQL Team!
I've been stuck on this problem for days and need help.
The problem is with the GO keyword. I know GO causes the batch to be executed and all local variables to be lost, but I can't seem to find a workaround. I would like each stored procedure to be executed in its own batch.

create table sql_cmd(cmd nvarchar(255))
insert into sql_cmd(cmd) values('exec user_sp param1,''param2''')
insert into sql_cmd(cmd) values('exec user_sp2 param1,''param2''')

declare @sql nvarchar(255)
declare c_sql cursor --small table, performance not a problem. Also open to other suggestions
for select cmd from sql_cmd
open c_sql
fetch next from c_sql
into @sql
while @@fetch_status = 0
begin
print @sql
exec sp_executesql @sql
GO
fetch next from c_sql
into @sql
end
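For reference, GO is a client-side batch separator (recognized by Query Analyzer, SSMS and sqlcmd), not a T-SQL statement, so it cannot appear inside a WHILE loop or a stored batch at all. Each call to sp_executesql already runs its command in its own scope, so the GO can simply be dropped; a minimal sketch (cursor and table names taken from the post above, with the cursor closed and deallocated at the end):

DECLARE @sql nvarchar(255);

DECLARE c_sql CURSOR LOCAL FAST_FORWARD FOR
    SELECT cmd FROM sql_cmd;

OPEN c_sql;
FETCH NEXT FROM c_sql INTO @sql;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @sql;
    EXEC sp_executesql @sql;              -- each command executes in its own scope
    FETCH NEXT FROM c_sql INTO @sql;
END

CLOSE c_sql;
DEALLOCATE c_sql;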

View 4 Replies View Related

What Sorts Of Statements Have To Be In Seperate Batches ?

Apr 7, 2008

Recently I was stumped on a problem where I was granting permissions to a user from within a script that was creating a stored procedure. Then I would check the permissions on the procedure, but the permissions were empty. It turned out that I was missing a "go" statement to separate the CREATE PROCEDURE statement from the GRANT statement. That got me thinking about what other kinds of statements must be separated into their own batches. I would think that anything being created must be separated from any statements that grant or alter permissions, because the items need to exist first.
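For reference, statements such as CREATE/ALTER PROCEDURE, FUNCTION, VIEW, TRIGGER, RULE, DEFAULT and SCHEMA must be the first statement in their batch, which is why anything that follows them without a GO gets absorbed into the object definition. A minimal sketch of the failure versus the fix (the procedure and user names are made up):

-- Fails: the GRANT is parsed as part of the procedure body, so it only runs
-- when the procedure is executed, and the permission looks empty after creation.
CREATE PROCEDURE dbo.usp_Demo AS SELECT 1;
GRANT EXECUTE ON dbo.usp_Demo TO SomeUser;

-- Works: GO puts the CREATE and the GRANT in separate batches.
CREATE PROCEDURE dbo.usp_Demo AS SELECT 1;
GO
GRANT EXECUTE ON dbo.usp_Demo TO SomeUser;
GO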

comments ?

View 2 Replies View Related

DB Engine :: Deadlock While Updating In Batches

Oct 26, 2015

So I tweaked a stored procedure that did a one-hour update for a specific countryId.

If that procedure was called at the same time with two different countryIds, then one update took place and afterwards the other.

Since all the rows are distinct, I switched to a batch update, only updating 10,000 rows at a time (and not all of the 2 million).

The general locking looked better afterwards but now i receive strange deadlocks.

My theory:

TX1 updates a row on page 1 (P1) with a row lock. TX2 also does this on P1. Now TX1 decides to escalate to a page lock. TX1 waits for TX2 to be done with its row so it can lock the page. TX2 waits for TX1 to leave the page, since TX2 may also want a page lock.

Everybody waits for each other, so we have a deadlock. Is that feasible? Or is there another common problem when doing batch updates on the same table with distinct rows (which can of course sit on the same data page)?

View 5 Replies View Related

Wrapping Query Batches In A Single Transaction.

Jul 23, 2005

I have a query batch "update" script that upgrades my users' database from, say, version 0 to version 1, or from version 1 to version 2. I would like to know how I can wrap the entire script in a transaction, so that either the whole thing succeeds or none of it does. For example:

BEGIN TRANSACTION
..... Alter some tables .....
GO
..... Alter a stored procedure .....
GO
..... Create a new stored procedure .....
GO
COMMIT TRANSACTION
or
ROLLBACK TRANSACTION
GO

(How do I get to the "ROLLBACK TRANSACTION" if an error occurs in the update script?)
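For reference, one common pattern (a sketch, not the only approach): turn on XACT_ABORT so that most run-time errors roll back the open transaction and abort the current batch, then check @@TRANCOUNT in the final batch before committing. Compile/syntax errors in a batch are not covered by this, which is why such scripts are often also run through osql or sqlcmd with the -b (abort on error) switch.

SET XACT_ABORT ON    -- most run-time errors now roll back the open transaction
BEGIN TRANSACTION
GO

-- ... alter some tables ...
GO

-- ... alter or create stored procedures ...
GO

-- If any earlier batch failed under XACT_ABORT, the transaction was already
-- rolled back and @@TRANCOUNT is 0 here, so there is nothing left to commit.
IF @@TRANCOUNT > 0
    COMMIT TRANSACTION
ELSE
    PRINT 'Upgrade script failed; all changes were rolled back.'
GO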

View 2 Replies View Related

Can Any One Tell Me How To Delete Data In Batches Of 10000 From A VLDB Table

Nov 14, 2001

Hi,
I would like to delete data from a 750 million row table in chunks of 10,000, without blocking the users. As ours is a 24/7 shop, I do not want to block the users for a long time.
An answer to this is highly appreciated.
Thanks
Samna
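For reference, a minimal sketch of the usual chunked-delete loop on SQL Server 7.0/2000, using SET ROWCOUNT to cap each DELETE (the table name and the filter that identifies the rows to purge are placeholders):

SET ROWCOUNT 10000                       -- limit each DELETE to 10,000 rows

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable             -- placeholder table name
    WHERE  ArchiveDate < '20010101'      -- placeholder filter for the rows to remove

    IF @@ROWCOUNT = 0 BREAK              -- nothing left to delete

    WAITFOR DELAY '00:00:01'             -- each chunk commits on its own; a short pause
                                         -- lets other users in and lets log backups run
END

SET ROWCOUNT 0                           -- reset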

View 3 Replies View Related

SQL Server 2012 :: Updating 25 Million Records In Batches

Nov 10, 2014

I have 2 tables with this schema

CREATE TABLE tableValues(
[LASTENCRYPTIONDT] [datetime] NULL,
[ENCRYPTIONID] [int] NULL,
[NAME] [varchar](50) NULL

[Code] ....

I want to update tableToUpdate in batches of 5000 rows per batch, setting LastEncryptionDT to NULL based on the join to tableValues on the ENCRYPTIONID column, and also output the updated rows into another table, in case I need to do a rollback.
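For reference, a minimal sketch of a batched update with an OUTPUT clause; the audit table and the tableToUpdate columns are assumptions, since the full schema is cut off above:

WHILE 1 = 1
BEGIN
    UPDATE TOP (5000) t
    SET    t.LASTENCRYPTIONDT = NULL
    OUTPUT deleted.ENCRYPTIONID,
           deleted.LASTENCRYPTIONDT               -- old value, kept in case of rollback
    INTO   dbo.tableToUpdate_Audit (ENCRYPTIONID, OldLastEncryptionDT)
    FROM   dbo.tableToUpdate t
    JOIN   dbo.tableValues   v ON v.ENCRYPTIONID = t.ENCRYPTIONID
    WHERE  t.LASTENCRYPTIONDT IS NOT NULL;        -- guarantees each batch makes progress

    IF @@ROWCOUNT = 0 BREAK;
END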

View 3 Replies View Related

SQL Server 2014 :: Group Up Records Randomly Into N Number Of Batches

Jul 6, 2015

I need to group the records randomly into 'n' batches. That can be done with NTILE, but I want to keep similar records together in a single group.

Say for example, following is the list of records I have in my table which I want to group into 5 batches

A123
A124
A124
A123
A127

After Ntile I will get the below,

The desired output is: I need output like NTILE gives, but all rows with the same id should reside in a single batch.

Even if n = 5, the maximum possible number of batches here is only 3.
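For reference, a minimal sketch of one way to get NTILE-style batches while keeping every row with the same id in the same batch: apply NTILE to the distinct ids only, then join back to the detail rows (the table and column names are made up):

WITH DistinctIds AS
(
    SELECT DISTINCT Id
    FROM   dbo.SourceTable
),
Batched AS
(
    SELECT Id,
           NTILE(5) OVER (ORDER BY NEWID()) AS BatchNo   -- random batch per distinct Id
    FROM   DistinctIds
)
SELECT s.Id, b.BatchNo
FROM   dbo.SourceTable s
JOIN   Batched b ON b.Id = s.Id
ORDER  BY b.BatchNo, s.Id;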

View 2 Replies View Related

Transact SQL :: GO Statements To Execute Large Stored Procedure In Batches

Jun 19, 2015

I want to include GO statements to execute a large stored procedure in batches. How can I do that?

View 9 Replies View Related

SQL 2012 :: Deleting Large Batches Of Rows - Optimum Batch Size?

Oct 16, 2015

In another forum post, a poster was deleting large numbers of rows from a table in batches of 50,000.

In the bad old days ('80s - '90s), I used to have to delete rows in batches of 500, then 1000, then 5000, due to the size of the transaction rollback segments (yes - Oracle).

I always found that increasing the number of deleted rows in a single statement/transaction improved overall process speed - up to some magic point, at which some overhead in the system began slowing the deletes down, so that deleting a single batch of 10,000 rows took more than twice as much time as deleting two batches of 5,000 rows each.

Are there good rule-of-thumb numbers (or even better, some actual statistics and/or explanations) for how many records should be deleted in a single transaction/statement for optimum speed? 50,000? 100,000? 1,000,000? Unlimited? Are there significant differences between 2008, 2012, and 2014?

View 9 Replies View Related

Transfer SQL Server Objects Task (for A Table): Can It Be Split Into Smaller Batches

May 29, 2008

We are using the Transfer SQL Server Objects Task to transfer a large table. The trans log is filling up for this table. Is there a method to split the Data Transfer Task into smaller batches? (Smaller tables are transferring without issue.)

Thanks.

View 2 Replies View Related

Merge Replication Performance Issues, Enumerating Inserts And Updates For Articles...generations And Batches

Oct 8, 2007


We have a SQLServer 2005 Enterprise merge replication publication with SQL Mobile 3.0 subscribers (Windows Mobile 5.0 and 6.0). We do not use pre-computed partitions due to trigger performance issues with an SSIS/ETL application that supplies data to the merge database. We do use the "Optimize" (=true) option, though we have tried this both ways with no significant differences. We use filters and joins for each worker ID (as HOST_ID) from the subscriptions.

The sync times become increasingly worse after we run the snapshot and bring the publication online. I have tried rerunning the snapshots, this helps little, as it often behaves like the subscription was set to reinitialize and forces a big sync (reload of all data) to the subscriber. We have tried much of the obvious (e.g., flattening filters and joins, adding indexes, etc.).

When users are synchronizing, we watch replication monitor and notice that a lot of time is spent processing "enumerating inserts and updates for article [any article]", especially processing the many generations and batches. This is true for any follow-up syncs after the 1st big sync (initializing the subscription).

I read several posts regarding the batches and generations of changes, and decided to try increasing the "DownloadGenerationsPerBatch" parameter. I tried adding this parameter to the snapshot agent job, and the job fails each time with a vague message, even with the default value of 100. How do you change this parameter for SQLServer 2005 Enterprise?

Any suggestions?

Thanks in advance,
Matt

View 5 Replies View Related

Help In Understanding This SQL...

Jan 19, 2005

Hi,

The following SQL is lifted from one of the Reporting Services / AdventureWorks2000 sample reports. I'm a little slow / baffled on how the inner joins are working, specifically the INNER JOIN Locale and INNER JOIN ProductModel. I'm used to seeing INNER JOIN SomeTable ON Something = Something, but how these joins work is lost on me. Can someone give a quick overview (or point me to a reference) so I can better understand?

Thanks!


SELECT ProductSubCategory.Name AS ProdSubCat, ProductModel.Name AS ProdModel, ProductCategory.Name AS ProdCat, ProductDescription.Description,
ProductPhoto.LargePhoto, Product.Name AS ProdName, Product.ProductNumber, Product.Color, Product.Size, Product.Weight, Product.DealerPrice,
Product.Style, Product.Class, Product.ListPrice
FROM ProductSubCategory
     INNER JOIN Locale
          INNER JOIN ProductDescriptionXLocale ON Locale.LocaleID = ProductDescriptionXLocale.LocaleID
          INNER JOIN ProductDescription ON ProductDescriptionXLocale.ProductDescriptionID = ProductDescription.ProductDescriptionID
          INNER JOIN ProductModel
               INNER JOIN Product ON ProductModel.ProductModelID = Product.ProductModelID
               INNER JOIN ProductModelXProductDescriptionXLocale ON ProductModel.ProductModelID = ProductModelXProductDescriptionXLocale.ProductModelID
          ON ProductDescriptionXLocale.LocaleID = ProductModelXProductDescriptionXLocale.LocaleID AND
             ProductDescriptionXLocale.ProductDescriptionID = ProductModelXProductDescriptionXLocale.ProductDescriptionID
     ON ProductSubCategory.ProductSubCategoryID = Product.ProductSubCategoryID
     INNER JOIN ProductCategory ON ProductSubCategory.ProductCategoryID = ProductCategory.ProductCategoryID
     LEFT OUTER JOIN ProductPhoto ON Product.ProductPhotoID = ProductPhoto.ProductPhotoID
WHERE (Locale.LocaleID = 'EN')




Shawn
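For reference, the nested JOIN syntax above simply defers the ON clauses: each deferred ON binds to the nearest JOIN that does not yet have one, so "A JOIN B JOIN C ON bc ON ab" means A joined to the result of (B JOIN C ON bc) using ab. Because all but the last join are inner joins, the same FROM clause can be rewritten in the more familiar "join then ON" order; a logically equivalent sketch:

FROM Product
INNER JOIN ProductModel
        ON ProductModel.ProductModelID = Product.ProductModelID
INNER JOIN ProductModelXProductDescriptionXLocale
        ON ProductModelXProductDescriptionXLocale.ProductModelID = ProductModel.ProductModelID
INNER JOIN ProductDescriptionXLocale
        ON ProductDescriptionXLocale.LocaleID = ProductModelXProductDescriptionXLocale.LocaleID
       AND ProductDescriptionXLocale.ProductDescriptionID = ProductModelXProductDescriptionXLocale.ProductDescriptionID
INNER JOIN Locale
        ON Locale.LocaleID = ProductDescriptionXLocale.LocaleID
INNER JOIN ProductDescription
        ON ProductDescription.ProductDescriptionID = ProductDescriptionXLocale.ProductDescriptionID
INNER JOIN ProductSubCategory
        ON ProductSubCategory.ProductSubCategoryID = Product.ProductSubCategoryID
INNER JOIN ProductCategory
        ON ProductCategory.ProductCategoryID = ProductSubCategory.ProductCategoryID
LEFT OUTER JOIN ProductPhoto
        ON ProductPhoto.ProductPhotoID = Product.ProductPhotoID
WHERE Locale.LocaleID = 'EN'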

View 3 Replies View Related

Need Help Understanding A Function

May 17, 2007

I'm trying to get the following poll working: http://www.codeproject.com/useritems/Site_Poll_Control.asp

It looks like it's exactly what I was looking for, but it doesn't come with much in the way of instructions. I have the following function:

Public Function CastVote(ByVal PollId As Integer, ByVal Answer As Integer, ByVal MemberId As Integer) As Boolean
    Dim cmd As New SqlCommand("InsertPollResult", New SqlConnection(Connection))
    With cmd.Parameters
        .AddWithValue("@PollId", PollId)
        .AddWithValue("@PollChoice", Answer)
        .AddWithValue("@MemberId", MemberId)
    End With
    Return (SqlExecuteInsertSp(cmd) > 0)
End Function

This calls SqlExecuteInsertSp(cmd), which is:

Public Function SqlExecuteInsertSp(ByVal cmd As SqlCommand) As Integer
    Dim i As Integer
    cmd.CommandType = CommandType.StoredProcedure
    Try
        cmd.Connection.Open()
        i = cmd.ExecuteNonQuery()
    Catch ex As Exception
        ErrorMessage = "ProDBObject.SqlExecuteInsertSp(SqlCommand): " & ex.Message.ToString
    Finally
        cmd.Connection.Close()
    End Try
    Return i
End Function

I can't figure out what this is doing. The best I can figure is it determines if we have a good connection. Is this right? In my code CastVote keeps returning false, and I don't know why. The answer seems to be in the i = cmd.ExecuteNonQuery() line, but I can't figure out what that line is supposed to be doing.

Diane

View 3 Replies View Related

Understanding Sch-M Locks...

Sep 4, 2006

Hi Guys,
I have written quite a big stored procedure which creates a (multi-session) temporary table and updates it. All the statements are encapsulated in a single transaction which is explicitly declared in the code. What happens is that the server puts a lock on that table (of type Sch-M), thus preventing any type of operation on it (including a simple SELECT).

Now, I want to be able read that table from within another transaction. Why is that I cannot use a table hint NOLOCK in the select statement?

Here is some code which reproduces my problem.

Query A:



SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

BEGIN TRAN TR_DEMO;

CREATE TABLE ##TBL1(
Oidx int not null primary key identity(1,1),
Name nvarchar(30) not null,
Type char(1) not null
);


INSERT ##TBL1 (Name,Type) VALUES ('Car','M');

WAITFOR DELAY '00:00:10';

INSERT ##TBL1 (Name,Type) VALUES ('Plane','M');

WAITFOR DELAY '00:00:10';

INSERT ##TBL1 (Name,Type) VALUES('Submarine','M');

WAITFOR DELAY '00:00:10';

DELETE FROM ##TBL1;

DROP TABLE ##TBL1;

COMMIT TRAN TR_DEMO;



Query B:

SELECT TOP 1 * FROM ##TBL1 (NOLOCK) ORDER BY oidx DESC;

Launch query A and then execute query B.

Thanks a lot for your help.

View 2 Replies View Related

Understanding @@ERROR ....

Nov 3, 2004

My question is in what situations @@ERROR will be set...

I would like to run some logic when an error occurs in a particular statement....

The documentation says the @@ERROR value will be set if an error occurs in a statement, that control will move to the next statement without exiting the procedure, and that the @@ERROR value can be used in that next statement.

But when I execute the procedure below, execution is terminated (when the error occurs) without moving to the next statement. Please help me understand SQL Server's @@ERROR and the situations in which it will be set....

-----------------------------------------------------------------------
CREATE PROCEDURE VALUE_ERROR_TEST
AS
BEGIN
DECLARE @adv_error INT
DECLARE @errno INT
DECLARE @var int
SELECT @var = '101 a'
SELECT @errno = @@ERROR
print @errno
END
go
-----------------------------------------------------------------------
procedure get successfully compiled. when executed it says,

Server: Msg 245, Level 16, State 1, Procedure VALUE_ERROR_TEST, Line 10
Syntax error converting the varchar value '101 a' to a column of data type int.


Jake
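For reference, whether control reaches the next statement depends on the kind of error: many errors (a constraint violation, for instance) are statement-terminating and leave @@ERROR set for the following statement, while conversion errors such as Msg 245 abort the whole batch, so the PRINT never runs. On SQL Server 2005 and later, a TRY...CATCH block traps both kinds; a minimal sketch:

CREATE PROCEDURE VALUE_ERROR_TEST2
AS
BEGIN
    DECLARE @var int
    BEGIN TRY
        SELECT @var = '101 a'       -- conversion fails here
    END TRY
    BEGIN CATCH
        PRINT 'Error ' + CAST(ERROR_NUMBER() AS varchar(10)) + ': ' + ERROR_MESSAGE()
    END CATCH
END
GO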

View 1 Replies View Related

Understanding SSIS

Jun 1, 2007

I am a new learner of SQL. Please can someone give me a very brief overview of what SSIS is and what it can do?

thanks

View 2 Replies View Related

Understanding Joins

Jan 10, 2008

What's the best resource (book or web article) that explains the joins between database tables?
Thank you!

View 3 Replies View Related

Understanding Triggers

Aug 7, 2006

Please consider the following example.

CREATE TABLE test (
    an_ndx int NOT NULL primary key identity(1,1),
    a_var varchar(48) NOT NULL,
    last_edit_timestamp datetime NOT NULL default CURRENT_TIMESTAMP
);

CREATE TABLE test_history (
    an_ndx int NOT NULL,
    a_var varchar(48) NOT NULL,
    last_edit_timestamp datetime NOT NULL,
    current_edit_timestamp datetime NOT NULL default CURRENT_TIMESTAMP
);
GO

CREATE TRIGGER update_history ON test FOR UPDATE
AS
BEGIN
    INSERT INTO test_history (an_ndx, a_var, last_edit_timestamp)
    SELECT * FROM deleted;
    UPDATE inserted SET last_edit_timestamp = CURRENT_TIMESTAMP;
END;

The question is, does this do what I think it should do? What I intended: an insert into test results in default values for an_ndx and last_edit_timestamp. An update to test results in the original row(s) being copied to test_history, with a default value for current_edit_timestamp, and the value of last_edit_timestamp being updated to the current timestamp. Each record in test_history should have the valid time interval (last_edit_timestamp to current_edit_timestamp) for each value a_var has had for the "object" or "record" identified by an_ndx.

If not, what change(s) are needed to make it do what I want it to do? Will the trigger I defined above behave properly (i.e. as I intended) if more than one record needs to be updated?

Thanks
Ted

View 3 Replies View Related

Is Replication For Me? Help In Understanding...

Aug 29, 2006

I have a situation where I need to distribute a db to 'subscribers' for use during network, and preferred application downtime. Currently, we do this by employing MS-Access. We update our db on the 'subscriber' by sending a text file with the new data using FTP.

When the 'subscriber' opens their local copy of the Access db application, a macro fires off to check for any new file in the ftproot folder, and if one is detect that is newer than the last update, it truncates the existing table, and imports the text file using a predefined import specification format. The process works well enough as is. However, we were hoping to move beyond our dependency on MS-Access for a variety of reasons, so we are looking at developing windows forms apps using the new Asp.Net v2.0 technology and Sql2005 and SqlExpress.

I need some clarification on how replication works. Does it allow a 'snapshot' db to be created on a subscriber that can be used when the network is down? If not do we have alternatives? For example, I guess we could export/import to the subscriber in some manner.

Hope I've made the case clear enough for some responses. Thanks in advance for your thoughts and help. :)

View 5 Replies View Related

Is My Understanding Correct??

Mar 16, 2007

Please forgive the basic-ness of my question, but I have only been using DTS and SSIS for 2 weeks now.

What I'd like to know is, is my understanding of SSIS package development correct.

In SQL Server 2005, I use BIDS to develop my SSIS packages. During development, I simply store my project on the local C drive.

However, once I am finished writing and testing my projects, I move them to SQL Server using SAVE COPY AS... (for now, I am only using one server for both development and production).

Then, anytime I need to change or modify the package, I change the local file system copy of the package, then do another COPY AS... to SQL Server.

That is, once the package is on SQL Server, if you need to change it, you can only do so through BIDS, correct?

I just want to know if I'm on the right track here.

What do other people do?

Thanks much

View 16 Replies View Related

Understanding FTP Task

Apr 25, 2008

I just learned some basic FTP commands for the first time and was able to transfer over some files from one machine to another. Now I'm trying to to this using the FTP Task.


First of all, I understand that the destination machine has to be set up with something called an FTP Site. And I noticed that when I do a "cd" command in my FTP session, I have visibility only to those directories that are set up as an FTP Site.

Ok, onto the FTP Task. First I created and test connected my FTP Connection Manager. Next, I went into the FTP Task Editor, and I'm in the File Transfer page. Looking at the properties, I guess, this is where I tell what file to move where.

So, in IsRemotePathVariable, I selected "False". When I did this, I was expecting to be able to navigate the various FTP Sites I have set up on my destination computer. However, when I click on RemotePath, I only see "/" (Root). Was I wrong to expect to see the FTP Sites here? What am I doing wrong?

View 3 Replies View Related

Understanding ADO's HRESULTs

Sep 10, 2007

Hi, I'm fairly new to coding with ADO in C++. My previous experience with it was in VB 6.0 and VB.NET. I'm having difficulty deciphering the HRESULTs that come back from it when an error occurs. Up until this point I was using,

DXGetErrorString9(), and DXGetErrorDescription9() to interpret any HRESULT that got sent back to my program, but now that I'm using ADO they always return "Unknown".

Is there a Microsoft provided group of functions that help programmers to better understand ADO's HRESULTs?



Thanks,

Kyle

View 3 Replies View Related

Need Help Understanding A Call To A Sql Function

Feb 25, 2008

Can someone help me to understand a stored procedure I am learning about? At line 12 below, the code calls a function named "ttg_sfGroupsByPartyId". I ran the function manually and it returns several rows/records from the query. So I am wondering: does a call to the function return a temporary table? And if so, is the temporary table named PartyId? If so, the logic seems strange to me, because earlier they are using the name PartyId as a variable name that is passed in.
 
1  ALTER PROCEDURE [dbo].[GetPortalSettings]
2  (
3     @PartyId    uniqueidentifier,
4
5  AS
6  SET NOCOUNT ON
7  CREATE TABLE #Groups
8  (PartyId uniqueidentifier)
9
10 /* Cache list of groups user belongs in */
11 INSERT INTO #Groups (PartyId)
12 SELECT PartyId FROM ttg_sfGroupsByPartyId(@PartyId)
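For reference, the call at line 12 does not create a temporary table: ttg_sfGroupsByPartyId is a table-valued function, so it returns a table result that the INSERT...SELECT copies into #Groups. The PartyId in "SELECT PartyId FROM ..." is a column defined by the function itself, unrelated to the @PartyId parameter being passed in. A hypothetical sketch of what such a function might look like (the GroupMembership table and its columns are made up for illustration):

CREATE FUNCTION dbo.ttg_sfGroupsByPartyId (@PartyId uniqueidentifier)
RETURNS TABLE
AS
RETURN
(
    -- returns one row per group the given party belongs to
    SELECT gm.GroupPartyId AS PartyId
    FROM   dbo.GroupMembership gm
    WHERE  gm.MemberPartyId = @PartyId
);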

View 4 Replies View Related

SQL 2000 Upgrade Understanding

Mar 15, 2002

Hello List,

I've couple of questions about Sql Server 2000 and would greatly appreciate, If somebody out there, please answer to it.

Questions:

I have a SQL 7 server dump file and I was wondering if I can directly load this file into SQL Server 2000? If yes, would this count as an upgrade of the SQL 7.0 database to SQL Server 2000, or is this just backward-compatibility support in 2000?

Well, the main objective of this is: we are planning to upgrade one of our production SQL 7.0 servers, and I was wondering if we can just take the SQL 7.0 dump file and load it into the SQL 2000 server? If yes, what are the downsides of this, and if no, should we upgrade the SQL 7.0 server itself and, along with it, all the databases sitting on it? Please shed some light here.

Many thanks.

View 1 Replies View Related

Help Understanding Stored Proc

Jun 14, 2007

Hi guys! Can anybody please explain what lines 4, 6, 8, 13 and 26 do? Thanks in advance!


1 USE [PawnShoppeDB]
2 GO
3 /****** Object: StoredProcedure [dbo].[sp_Customer_AddCustomer] Script Date: 06/14/2007 16:50:07 ******/
4 SET ANSI_NULLS ON
5 GO
6 SET QUOTED_IDENTIFIER ON
7 GO
8 ALTER Procedure [dbo].[sp_Customer_AddCustomer]
9 @Customer_FirstName varchar(50),
10 @Customer_LastName varchar(50),
11 @Customer_MiddleInitial varchar(5),
12 @Customer_Address varchar(100),
13 @Identity int output
14 AS
15 Begin
16 Declare @DateCreated DateTime
17 Set @DateCreated = getDate()
18
19 Insert into RCPS_Customer
20 Values (@Customer_FirstName,
21 @Customer_MiddleInitial,
22 @Customer_LastName,
23 @Customer_Address,
24 '0',@DateCreated)
25
26 Set @identity = Scope_identity()
27 End

View 1 Replies View Related

Jobs Understanding And Analysis

Sep 24, 2007

Hi

How do I understand and analyze a job that already exists as part of a database?

Muralidaran r

View 4 Replies View Related

Understanding Licensing For 2000

Jan 17, 2008

There seem to be a slew of options.

Finally figuring out that I can't use Express, I need to temporarily have my own SQL Server 2000 db server running. Hopefully just a couple of months, but who knows how long it'll take to convert over to MySQL.

Anyway, I need this db server to be on one machine. One web server will be accessing it (for one db).

What type of license do I need? Is it really the massively-priced per processor type? Or can I use another?

View 11 Replies View Related

Understanding Variable Update

May 16, 2008

Yesterday I was working to resolve an issue with not using a cursor for a unique type of update.

TG was able to resolve my issue using this query


Declare @Stage Table(StartDate datetime,BenefitInterestID INT PRIMARY KEY Clustered, Amount MONEY, InterestAmount MONEY, Interest DECIMAL(10, 4), ai DECIMAL(10, 4))
Insert Into @Stage(StartDate ,BenefitInterestID, Amount, InterestAmount, Interest , ai )
Select
convert(datetime,'2006-12-01 00:00:00.000',101) as StartDate,1 as BenefitInterestID,1701.00 as amount,79.605 as InterestAmount ,0.1000 as Interest,0.0000 as ai
Union all
select '2007-12-01 00:00:00.000',2,172.80,7.92,0.0500,0
Union all
select '2008-12-01 00:00:00.000',4,0.00,0.00,0.0700,0
Union all
select '2009-12-01 00:00:00.000',5,0.00,0.00,0.0900,0
Union all
select '2010-12-01 00:00:00.000',6,0.00,0.00,0.0200,0

declare @ai decimal(10,4)
,@aiTot decimal(10,4)
,@a money
,@ia money

update s set
@ai = ai = s.interest * (isNull(@aiTot,0) + @a + @ia)
,@a = isNull(@a, 0) + s.amount
,@ia = isNull(@ia, 0) + s.interestamount
,@aiTot = isNull(@aiTot, 0) + @ai
from @stage s



TG pointed out that although this query should always work, there is an underlying issue with this type of update method, because it depends on the order in which SQL Server performs the update (currently the clustered index is the only way I know of to keep that order intact).

Although it is unlikely, TG pointed out that due to this inherent flaw in the methodology, if future releases of SQL Server were modified to improve update performance, this method could possibly stop working, and there is no 100% guarantee that the index will be used.

My questions are:

1. Is there any way to ensure that the style of update I am using uses the clustered index (i.e. "from @stage s WITH (INDEX = ...)"), given that I do not know the name of the index since it is on a table variable?

2. Can I force the order of the update to be the order of the index? Just because there is an index doesn't ensure that a future release of SQL Server will reference the index from top to bottom (I know this is unlikely, but I would still like to know whether there is a way to ensure the update always occurs from top to bottom of the index).

View 11 Replies View Related

Understanding Sql, This Insert Query

May 22, 2007

Hi all, I need someone to break this down for me: what it means and what each part does. Thank you in advance; I would be lost without forums. :-)


$query = "INSERT INTO address_dtl
(srn_id,
box_id,
box_privacy,
box_start,
box_who,
box_what,
box_timestamp,
box_attribute)
VALUES
('$srn_id',
'$wwc_id',
'NNNN',
'" . date('Ymd') . "',
'$who',
'$what','" .
date("Ymd") . "',
'$attribute')";

$success = $conn->Execute($query);


site: http://www.deweydesigns.com
blog: http://www.deweydesigns.com/blog
myspace:
http://www.myspace.com/esotericstigma

View 3 Replies View Related






