Where Is SQL2000 Discussion List?

Mar 15, 2002

List-

I have completely lost access to the SQL2000 Discussion List at the following URL:

http://ls.swynk.com/scripts/lyris.pl

This URL takes me to an internet.com test list. I have sent a request to the webmaster but there has been no response.

Does anybody know how to get into this forum? I have not been able to see my group messages since yesterday.

Thanks for redirecting me to any other URL.

Jaganmohan Rao

View 2 Replies



NEW SQL DISCUSSION GROUP

Jul 5, 2001

Dear all,

There is a new SQL discussion group currently being tested:
http://www.baysignia.com/discussions/discussions.xml
Try it out and send comments to faq@baysignia.com

Many thanks for your comments.

SQL DBA
baySignia Systems

View 1 Replies View Related

Index Discussion

Mar 23, 2004

Hi Folks,

Got a topic open for debate.

We currently have an archive table - DDL

CREATE TABLE [dbo].[Audit] (
[id] [int] identity (1,1) NOT NULL ,
[col1] [char] (10) NOT NULL ,
[col2] [char] (15) NOT NULL ,
[col3] [int] NOT NULL ,
[col4] [varchar] (50) NOT NULL ,
[col5] [datetime] NOT NULL ,
[col6] [varchar] (4000) NULL ,
[col7] [char] (3) NULL
)
GO


This table grows to about 40 million rows during the course of the month. The table has a clustered index on the id field and a nonclustered index on col2 and col3. The id column is not used in queries. At the moment we run weekly DBCC reindexes on all the indexes. We are running into a space issue on the reindex of the clustered index (copying the whole table out, ordering, etc.) and are considering dropping the index or changing it to a nonclustered index. (The DBCC utility that we have built will only rebuild all the indexes or none at all.)

I feel this is not a good idea and I know my reasons. I would like some input as to why this might prove a bad idea.

Will it increase page splitting? Will the table performance be impacted even if the queries are not specifically using the clustered index?

What are the reasons for and against?
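For reference, DBCC DBREINDEX can also be pointed at a single named index rather than the whole table, which may sidestep the space problem without dropping the clustered index; a minimal sketch (the nonclustered index name is an assumption, since it is not given above):

-- Rebuild only the named nonclustered index, with a fill factor of 90;
-- passing '' as the index name would rebuild every index on the table
DBCC DBREINDEX ('dbo.Audit', IX_Audit_col2_col3, 90)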

Thanks Folks

View 14 Replies View Related

Discussion Forum - Datastructure

Jul 27, 2005

I'm creating a discussion forum for my website, using SQL Server 2000. I need to display 'Number of threads', 'Number of posts', and 'Last post by' (username/id and date) for each forum, and 'Number of replies' and 'Last post by' (username/id and date) for each thread.

Here are a couple of ideas I have come up with to solve this problem:

1) Poll the database (using a stored procedure that returns the number I'm looking for) for each forum when I loop over all the forums. I suspect this approach isn't optimal, since it creates more traffic to the database.

2) Have fields in my Forum and ForumPost tables for 'Number of threads' and so on, and create triggers that update these fields every time a post is made. I guess this would be much more effective than the first approach, since everything is done on the database server directly.

Are there any other ways that are better? Please advise! Thanks a bunch for any help!
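As a rough illustration of the second idea, here is a minimal sketch of such a trigger; the table and column names (Forum.NumPosts, Forum.LastPostBy, Forum.LastPostDate, ForumPost.ForumID, ForumPost.UserID, ForumPost.PostDate) are assumptions, not the actual schema:

CREATE TRIGGER trg_ForumPost_Insert ON ForumPost
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON
    -- Assumes one post is inserted at a time (typical for a forum "new post" page);
    -- bump the counter and stamp the latest poster and date on the parent forum row
    UPDATE f
    SET NumPosts = f.NumPosts + 1,
        LastPostBy = i.UserID,
        LastPostDate = i.PostDate
    FROM Forum f
        INNER JOIN inserted i ON f.ForumID = i.ForumID
END

A matching DELETE trigger (or an UPDATE trigger if posts can move between forums) would be needed to keep the counters honest.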

View 7 Replies View Related

Fill Factor -discussion

Apr 4, 2001

What are the criteria for deciding the optimal (or close to optimal) fill factor?
Any input is appreciated.
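By way of illustration, the fill factor is specified when an index is created or rebuilt; a lower value leaves free space on each leaf page for future inserts, at the cost of more pages to read. A minimal sketch with hypothetical table and index names:

-- Heavily inserted/updated table: leave 10% free space per leaf page
CREATE INDEX IX_Orders_CustomerID ON Orders (CustomerID) WITH FILLFACTOR = 90

-- Mostly read-only table: pack the pages full when rebuilding
DBCC DBREINDEX ('Orders', IX_Orders_CustomerID, 100)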

View 2 Replies View Related

MS Access Vs. SQL Server 7 Discussion

Mar 29, 2000

I'm sure this has been a topic in the past. I would like to build a fact sheet about each one and do a comparison. Are there any articles/links/resources out there that speak to this issue?

Thank you,
Nathan

View 2 Replies View Related

Discussion On The TEXT Datatype And ASP

Sep 22, 1998

OK, I've been researching the use of the TEXT datatype all day and would
like opinions on what I've found.

First, a little background. I have been tasked with writing an ASP
application to handle the display of FAQs for a company's products. I would
like to store all info in a table much like

faqID int
question TEXT
answer TEXT

Simple enough, right? I then tried to create a stored procedure to add a
new FAQ and all hell broke loose. ASP would not pass anything larger than
255 chars to the stored procedure.

I read in the "ADO and SQL Server Developer`s Guide" from Microsoft about
using varchar datatypes of 255 chars (instead of TEXT) and chunking large
text up to fit in these smaller datatypes. This seems like a lot of work.

I also read in "Inside SQL Server 6.5" that "The text datatype is sometimes
awkward to work with. Many functions don`t operate against text, stored
procedures are limited in what they can do with text, and some tools don`t
deal with it well." (page 632). This statement concerns me greatly. How
are stored procedures limited in dealing with TEXT? Do the standard SQL
UPDATE and INSERT commands work or must READTEXT and UPDATETEXT be used
instead?
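For reference, a minimal sketch of the text-pointer statements in question, against the table layout above (the table name faq and the existing row are assumptions; this only shows the mechanics, not a recommendation):

DECLARE @ptr varbinary(16)
SELECT @ptr = TEXTPTR(answer) FROM faq WHERE faqID = 1

-- Append a chunk to the end of the existing value (NULL offset = end, 0 = delete nothing)
UPDATETEXT faq.answer @ptr NULL 0 WITH LOG 'another chunk of the answer text'

-- Read the first 255 characters of the value
READTEXT faq.answer @ptr 0 255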

I guess my question is, what is the best way to accomplish this? I have a
feeling that others have had to do this before. Is SQL Server not meant to
handle large textual objects? Is chunking the best way to go? Will version
7.0 handle this scenario better?

Any help greatly appreciated!

--Matt Richmond
MenoX Technologies, Inc.

View 1 Replies View Related

Table Linking Discussion

Jan 15, 2004

I would like to hear your thoughts on a philosophy I adhere to.

As a rule of thumb I've always preached that Unique Indexes are for linking tables and Primary Keys are used to ensure that records aren't duplicated. I’ve embraced this philosophy for a couple reasons, the main one being that I don’t have to create numerous foreign key fields in the foreign key table.

However I’ve done most of my programming in Access and am now in need of something more robust (SQL Server v7) and I’m wondering if I need to reconsider.

I also have a how-to question: is it possible to create a table join on a unique index in SQL Server v7, and if so, how? I would like to have an Auto Number / Auto Incremented / Unique Identifier field in the primary key table that links to a numeric field in the foreign key table.
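On the how-to part: in SQL Server a FOREIGN KEY can reference any column set covered by a UNIQUE constraint (or unique index), not only the primary key, so the identity column can carry the link. A minimal sketch with hypothetical table names:

CREATE TABLE Parent (
    NaturalKey char(10) NOT NULL CONSTRAINT PK_Parent PRIMARY KEY,
    ParentID int IDENTITY(1,1) NOT NULL CONSTRAINT UQ_Parent_ParentID UNIQUE
)

CREATE TABLE Child (
    ChildID int IDENTITY(1,1) NOT NULL CONSTRAINT PK_Child PRIMARY KEY,
    ParentID int NOT NULL CONSTRAINT FK_Child_Parent REFERENCES Parent (ParentID)
)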

Thanks in advance
Dog

View 14 Replies View Related

Is Discussion Board Reply Working

Jun 18, 1999

I cannot reply to any messages on the board. Is anyone else having this problem?

I see from the board that there haven't been any replies for a few days...

Jarlath O'Grady
mailto:jogrady@swynk.com
http://www.swynk.com/friends/ogrady

View 2 Replies View Related

Database Discussion To Change Perspective

Aug 2, 2006

Hello

I have been doing relational databases forever (or a long time) and have been introduced to a team that uses a highly normalized database (proprietary) to manage workflow.

We are capturing data in an audit-trail EAV format (500 million rows).

It is my task to build this into a data warehouse for reporting, and I need to have a relational database discussion with my team. The relational database knowledge on this team is DB2-based, IDMS, and other past evolutions.

The common process is to receive a flat file and process it sequentially using C# or VB, doing lookups against other database tables and writing out another flat file to be converted to XML for loading into the proprietary system.

My goal is to introduce new design concepts to my team, and these are some talking points that I have come up with for a lunch-and-learn session.

Can anyone else add to this list? I don't want to get into a deep discussion about 3NF, star schemas vs. snowflakes, etc. I want to keep it informational and light to elicit discussion and relate it back to older technologies.

Some of the topics we can discuss are:
Why the data warehouse
Real-time tables what needs to stay in prod
What is going to happen to reporting database
Interaction between database on the same cluster/server
Interaction between databases on different servers (linked servers not allowed)
Set processing as opposed to cursor processing (see the sketch after this list)
Table types
EAV
Type1
Type2
Fact
Dimensions
Code
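A tiny illustration of the set-versus-cursor talking point, against a hypothetical staging table; the single UPDATE at the end does in one statement what the cursor does row by row:

-- Hypothetical table: Staging(ID int, Amount money, Status char(1))

-- Cursor (row-at-a-time) processing
DECLARE @id int
DECLARE c CURSOR FOR SELECT ID FROM Staging WHERE Status = 'N'
OPEN c
FETCH NEXT FROM c INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE Staging SET Status = 'P' WHERE ID = @id
    FETCH NEXT FROM c INTO @id
END
CLOSE c
DEALLOCATE c

-- Set-based equivalent: one statement, one pass
UPDATE Staging SET Status = 'P' WHERE Status = 'N'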

View 1 Replies View Related

To DB/SQL Teachers: A Wrong Query On Northwind DB Discussion

Jul 26, 2006

Good day to all.

I'm a DB teacher at a university. While planning and developing SQL exercises for my students, I found a "tricky" or strange SQL exercise.

The SQL exercise, using the Northwind DB, reads as follows:
"Which is the total amount of freight that corresponds to all orders containing products of the seafood category?"

The first SQL code that came to my mind was:

select sum(freight)
from orders o, "order details" d, products p, categories c
where o.orderid=d.orderid and d.productid=p.productid and p.categoryid=c.categoryid
and categoryname='seafood'

This is the equivalent code constructed with the query designer in Enterprise Manager:
SELECT SUM(dbo.Orders.Freight) AS Expr1
FROM dbo.Orders INNER JOIN
dbo.[Order Details] ON dbo.Orders.OrderID = dbo.[Order Details].OrderID INNER JOIN
dbo.Products ON dbo.[Order Details].ProductID = dbo.Products.ProductID INNER JOIN
dbo.Categories ON dbo.Products.CategoryID = dbo.Categories.CategoryID
WHERE (dbo.Categories.CategoryName = N'seafood')

These two equivalent queries output 27722.9600 as the result, but the correct result (total freight) is 23791.1400.

Did you also think (as I did) that the code shown is correct?

Now I know the correct SQL code that gives the correct answer, and also the explanation of why the code shown fails, but before sharing it (at your possible request) I want to hear your technical comments about this exercise, in order to know whether this particular exercise is "tricky" (it makes people fail) or whether it is entirely my fault and I need to review my strategies for applying/constructing SQL (and teaching).
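(For readers following along: the discrepancy arises because an order containing more than one seafood line item is joined, and therefore summed, once per matching row. Below is a sketch of one way to count each order's freight only once; it is offered as an illustration and is not necessarily the solution the poster has in mind.)

SELECT SUM(o.Freight)
FROM Orders o
WHERE o.OrderID IN
    (SELECT d.OrderID
     FROM [Order Details] d
          INNER JOIN Products p ON d.ProductID = p.ProductID
          INNER JOIN Categories c ON p.CategoryID = c.CategoryID
     WHERE c.CategoryName = N'Seafood')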


Thanks a lot.

Carlos

View 12 Replies View Related

Real-Time Data Mining Discussion

Sep 27, 2006

I am about to prepare a paper concerning the field of real-time data mining. Real-time here means the process of incremental training of an existing model as soon as the data arrives.

There are a number of papers introducing algorithms for incremental association analysis, incremental clustering, etc. Stream mining is a field which is closely related to that. The main reasons for implementing incremental algorithms are a) the large amount of data to be mined and b) the high rate at which new data arrives every day.

Using classical batch mining algorithms, models that are outdated for some reason would have to be re-trained, which could be very time-consuming for billions of records. And once the training is completed, it would have to be restarted once again because a bulk of new data has arrived.

The question that I would like to discuss now is: for what real-world applications would it be meaningful or even essential to use real-time training of models?

Two main reasons could determine the answer to that question:

You just want to incorporate new data into existing models in order to increase the prediction accuracy of your model, or
your underlying data is subject to more or less massive changes (also referred to as concept drift) and you want to adapt your mining model continuously to that reality.

I'm looking for some examples or ideas where one of these cases apply and it would be a good idea to have incremental mining algorithms involved.

I'm looking forward to inspiring some discussion on that issue.

View 3 Replies View Related

SQL2000 & SQL2005, Want Localhost To Use SQL2000

Sep 17, 2006

I have SQL2000 and SQL2005 on the same machine. I am unable to register my localhost in SQL2000; I get an access denied error. How can I make my localhost use the SQL2000 database?

View 1 Replies View Related

[Performance Discussion] To Schedule A Time For Mssql Command, Which Way Would Be Faster And Get A Better Performance?

Sep 12, 2004

1. Use the SQL Server Agent service to run the schedule
2. Use a .NET Windows service with timers to call SqlClientConnection

Of the above, which would be faster and give better performance?

View 2 Replies View Related

Migrate SQL2000 To SQL2000

May 13, 2008



I am in the process of migrating SQL 2000 to my new SQL2000 server. I want to know the best way to migrate one SQL Server to another SQL Server on the same network, rename the new server to the old server's name, and bring it up for use on our e-commerce website.
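One common route, sketched below with made-up database names, logical file names and paths, is to back each database up on the old server and restore it on the new one; logins, jobs and the server rename would still have to be handled separately:

-- On the old server
BACKUP DATABASE MyShopDB TO DISK = 'D:\Backups\MyShopDB.bak'

-- On the new server (data and log files moved to the new server's paths)
RESTORE DATABASE MyShopDB FROM DISK = 'D:\Backups\MyShopDB.bak'
WITH MOVE 'MyShopDB_Data' TO 'E:\SQLData\MyShopDB.mdf',
     MOVE 'MyShopDB_Log'  TO 'E:\SQLData\MyShopDB.ldf'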

View 10 Replies View Related

SQL 2012 :: List All Different Values That Go With Single Case ID To Appear As Comma Separated List

Jun 15, 2015

At the moment I don't have a function by the name CONCATENATE. What I would like to do is list all the different values that go with a single CASE_ID as a comma-separated list. You might have a better way of doing this without even writing a function.

So the output would look like :

CASE_ID VARIABLE
=====================
1 [ABC],[HDR],[GHHHHH]
2 [ABCSS],[CCHDR],[XXGHHVVVHHH],[KKKJU],[KLK]

SELECT
preop.Case_ID,
dbo.Concatenate( '[' + CAST(preop.value_text AS VARCHAR) + ']' ) as variable
FROM
dbo.TBL_Preop preop
WHERE
preop.Deleted_CD = 0

GROUP BY
preop.Case_ID
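One way that works in SQL Server 2012 without writing a Concatenate function is FOR XML PATH with STUFF; a sketch against the same table (the varchar lengths are guesses):

SELECT
    preop.Case_ID,
    STUFF((SELECT ',[' + CAST(p2.value_text AS VARCHAR(100)) + ']'
           FROM dbo.TBL_Preop p2
           WHERE p2.Case_ID = preop.Case_ID
             AND p2.Deleted_CD = 0
           FOR XML PATH(''), TYPE).value('.', 'VARCHAR(MAX)'), 1, 1, '') AS variable
FROM dbo.TBL_Preop preop
WHERE preop.Deleted_CD = 0
GROUP BY preop.Case_ID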

View 8 Replies View Related

Report Designer: Need To List Fields From Multiple Result Rows As Comma Separated List (like A JOIN On Parameters)

Apr 9, 2008



I know I can do a JOIN(parameter, "some separator") and it will build me a list/string of all the values in the multiselect parameter.

However, I want to do the same thing with all the occurrences of a field in my result set (each row being an occurrence).

For example, say I have a form that is being printed which will pull in all the medications a patient is currently listed as having prescriptions for. I want to return all those values (say 8) and display them on a single line (or wrap onto additional lines as needed).

Something like:
List of current prescriptions: Allegra, Allegra-D, Clariton, Nasalcort, Sudafed, Zantac


How can I accomplish this?

I was playing with the list box, but that only lets me repeat on a new line; I couldn't find any way to get it to repeat side by side (repeat left to right instead of top to bottom). I played with the orientation options, but as best I can tell that really just lets me adjust how multiple columns are displayed.

Could a custom function of some sort be written to take all the values and spit them out one by one into a comma-separated string?

View 21 Replies View Related

Insert Value List Does Not Match Column List

Apr 18, 2007

Hi...

I need to do a simple task, but it's difficult for a newbie to SSIS.

I have two tables. The first one has an identity column, and the second has an FK to the first.

For each dataset row I need to do an insert into the first table, get the @@Identity, and insert it into the second table.

I'm trying to use an OLE DB Command, but it's not working; it's showing the error "Insert Value list does not match column list".

Here is the script:

INSERT INTO Address(
CepID,
Street,
Number,
Location,
Complement,
Reference)Values
(
?,
?,
?,
?,
?,
?
)
INSERT INTO CustomerAddress(
AddressID,
CustomerID,
AddressTypeID,
TypeDescription) VALUES(
@@Identity,
?,
?,
?
)



What's the problem?
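One workaround often suggested for this pattern (a sketch only; the parameter names and types below are guesses based on the statements above) is to wrap both inserts in a stored procedure and use SCOPE_IDENTITY() instead of @@IDENTITY, so the value captured cannot come from an unrelated trigger insert:

CREATE PROCEDURE dbo.InsertAddressWithCustomer
    @CepID int,
    @Street varchar(100),
    @Number varchar(20),
    @Location varchar(100),
    @Complement varchar(100),
    @Reference varchar(100),
    @CustomerID int,
    @AddressTypeID int,
    @TypeDescription varchar(100)
AS
BEGIN
    SET NOCOUNT ON

    INSERT INTO Address (CepID, Street, Number, Location, Complement, Reference)
    VALUES (@CepID, @Street, @Number, @Location, @Complement, @Reference)

    -- Capture the identity generated by the insert above and use it for the child row
    DECLARE @AddressID int
    SET @AddressID = SCOPE_IDENTITY()

    INSERT INTO CustomerAddress (AddressID, CustomerID, AddressTypeID, TypeDescription)
    VALUES (@AddressID, @CustomerID, @AddressTypeID, @TypeDescription)
END

The OLE DB Command text then becomes a single call such as EXEC dbo.InsertAddressWithCustomer ?, ?, ?, ?, ?, ?, ?, ?, ? with one parameter per column.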

View 7 Replies View Related

Items In List A That Don't Appear In List B (was Simple Query...I Think)

Jan 20, 2005

Ok, I want to write a stored procedure / query that says the following:
Code:
If any of the items in list 'A' also appear in list 'B' --return false
If none of the items in list 'A' appear in list 'B' --return true


In pseudo-SQL, I want to write a clause like this

Code:

IF
(SELECT values FROM tableA) IN(SELECT values FROM tableB)
Return False
ELSE
Return True


Unfortunately, it seems I can't do that unless my subquery before the 'IN' statement returns only one value. Needless to say, it returns a number of values.

I may have to achieve this with some kind of logical loop but I don't know how to do that.

Can anyone help?
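For what it's worth, a sketch of one way to express this without looping; the column is assumed to be called ItemValue in both tables (adjust to the real column names):

IF EXISTS (SELECT 1
           FROM tableA a
               INNER JOIN tableB b ON a.ItemValue = b.ItemValue)
    SELECT 0 AS Result   -- at least one item in list A also appears in list B ("false")
ELSE
    SELECT 1 AS Result   -- no item in list A appears in list B ("true")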

View 3 Replies View Related

Select List Contains More Items Than Insert List

Mar 21, 2008

I have a select list of fields that I need in order to get the results I want; however, I would like to insert only a chosen few of these fields into a table. I am getting the error, "The select list for the INSERT statement contains more items than the insert list. The number of SELECT values must match the number of INSERT columns."
How can I do this?

Insert Query:
insert into tsi_payments (PPID, PTICKETNUM, PLINENUM, PAMOUNT, PPATPAY, PDEPOSITDATE, PENTRYDATE, PHCPCCODE)
SELECT DISTINCT
tri_IDENT.IDA AS PPID,
tri_Ldg_Tran.CLM_ID AS PTicketNum,
tri_ClaimChg.Line_No AS PLineNum,
tri_Ldg_Tran.Tran_Amount AS PAmount,

CASE WHEN tln_PaymentTypeMappings.PTMMarsPaymentTypeCode = 'PATPMT'
THEN tri_ldg_tran.tran_amount * tln_PaymentTypeMappings.PTMMultiplier
ELSE 0 END AS PPatPay,

tri_Ldg_Tran.Create_Date AS PDepositDate,
tri_Ldg_Tran.Tran_Date AS PEntryDate,
tri_ClaimChg.Hsp_Code AS PHCPCCode,
tri_Ldg_Tran.Adj_Type,
tri_Ldg_Tran.PRS_ID,
tri_Ldg_Tran.Create_Time,
tri_Ldg_Tran.Adj_Group,
tri_Ldg_Tran.Payer_ID,
tri_Ldg_Tran.TRN_ID,
tri_ClaimChg.Primary_Claim,
tri_IDENT.Version
FROM [AO2AO2].MARS_SYS.DBO.tln_PaymentTypeMappings tln_PaymentTypeMappings RIGHT OUTER JOIN
qs_new_pmt_type ON tln_PaymentTypeMappings.PTMClientPaymentDesc =
qs_new_pmt_type.New_Pmt_Type RIGHT OUTER JOIN
tri_Ldg_Tran RIGHT OUTER JOIN
tri_IDENT LEFT OUTER JOIN
tri_ClaimChg ON tri_IDENT.Pat_Id1 =
tri_ClaimChg.Pat_ID1 ON tri_Ldg_Tran.PRS_ID =
tri_ClaimChg.PRS_ID AND
tri_Ldg_Tran.Chg_TRN_ID =
tri_ClaimChg.Chg_TRN_ID
AND tri_Ldg_Tran.Pat_ID1 = tri_IDENT.Pat_Id1 LEFT OUTER JOIN
tri_Payer ON tri_Ldg_Tran.Payer_ID
= tri_Payer.Payer_ID ON qs_new_pmt_type.Pay_Type
= tri_Ldg_Tran.Pay_Type AND
qs_new_pmt_type.Tran_Type = tri_Ldg_Tran.Tran_Type
WHERE (tln_PaymentTypeMappings.PTMMarsPaymentTypeCode <> N'Chg')
AND (tln_PaymentTypeMappings.PTMClientCode = 'SR')
AND (tri_ClaimChg.Primary_Claim = 1)
AND (tri_IDENT.Version = 0)
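For what it's worth, the message just means the SELECT list has more expressions than the eight columns named in the INSERT list; columns that are only used for joining or filtering (Adj_Type, PRS_ID, Primary_Claim, Version, etc.) do not have to be selected. A tiny self-contained illustration of the rule, using hypothetical temp tables:

CREATE TABLE #src (a int, b int, c int)
CREATE TABLE #dest (a int, b int)

INSERT INTO #src VALUES (1, 2, 0)
INSERT INTO #src VALUES (3, 4, 1)

-- Two columns in the INSERT list, two expressions in the SELECT list;
-- c is used only in the WHERE clause and is not selected
INSERT INTO #dest (a, b)
SELECT a, b
FROM #src
WHERE c = 1

DROP TABLE #src
DROP TABLE #dest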

View 2 Replies View Related

Insert Only New Items From A List But Get ID's For New And Existing In The List.

Feb 12, 2008

Hi,

I have a table that holds a list of words. I am trying to add to the list; however, I only want to add new words, but I wish to return from my proc the list of words with IDs, whether new or old.

Here's a script that creates the table, indexes, function and the stored proc. Call the proc like: procStoreAndUpdateTokenList 'word1,word2,word3'

My table is now 500,000 rows and growing, and I am inserting on average 300 words, some new, some old.

Performance is not that great, so I'm thinking that my code can be improved.

SQL Express 2005 SP2
Windows Server 2003
1GB Ram....(I know, I know)

TIA





Code Snippet
GO
CREATE TABLE [dbo].[Tokens](
[TokenID] [int] IDENTITY(1,1) NOT NULL,
[Token] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
CONSTRAINT [PK_Tokens] PRIMARY KEY CLUSTERED
(
[TokenID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

CREATE UNIQUE NONCLUSTERED INDEX [IX_Tokens] ON [dbo].[Tokens]
(
[Token] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = ON, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
CREATE FUNCTION [dbo].[SplitTokenList]
(
@TokenList varchar(max)
)
RETURNS
@ParsedList table
(
Token varchar(255)
)
AS
BEGIN
DECLARE @Token varchar(50), @Pos int
SET @TokenList = LTRIM(RTRIM(@TokenList ))+ ','
SET @Pos = CHARINDEX(',', @TokenList , 1)
IF REPLACE(@TokenList , ',', '') <> ''
BEGIN
WHILE @Pos > 0
BEGIN
SET @Token = LTRIM(RTRIM(LEFT(@TokenList, @Pos - 1)))
IF @Token <> ''
BEGIN
INSERT INTO @ParsedList (Token)
VALUES (@Token) --Use Appropriate conversion
END
SET @TokenList = RIGHT(@TokenList, LEN(@TokenList) - @Pos)
SET @Pos = CHARINDEX(',', @TokenList, 1)
END
END
RETURN
END
GO

CREATE PROCEDURE [dbo].[procStoreAndUpdateTokenList]
@TokenList varchar(max)
AS
BEGIN
SET NOCOUNT ON;
create table #Tokens (TokenID int default 0, Token varchar(50))
create clustered index Tind on #Tokens (Token)
DECLARE @NewTokens table
(
TokenID int default 0,
Token varchar(50)
)

--Split ID's into a table
INSERT INTO #Tokens(Token)
SELECT Token FROM SplitTokenList(@TokenList)
BEGIN TRY
BEGIN TRANSACTION
--get ID's for any existing tokens
UPDATE tl SET TokenID = ISNULL(t.TokenID, 0)
FROM #Tokens tl INNER JOIN Tokens t ON tl.Token = t.Token

INSERT INTO Tokens(Token)
OUTPUT INSERTED.TokenID, INSERTED.Token INTO @NewTokens
SELECT DISTINCT Token FROM #Tokens WHERE TokenID = 0

--return the list with IDs for new and old
SELECT TokenID, Token FROM #Tokens
WHERE TokenID <> 0
UNION
SELECT TokenID, Token FROM @NewTokens
COMMIT TRANSACTION
END TRY
BEGIN CATCH
DECLARE @er nvarchar(max)
SET @er = ERROR_MESSAGE();
RAISERROR(@er, 14,1);
ROLLBACK TRAN
END CATCH;
END
GO




View 5 Replies View Related

ERROR: A Variable May Only Be Added Once To Either The Read Lock List Or The Write Lock List.

May 22, 2006

Hi,
I have a set of 2 DTS packages, one of which calls the other by forming a command line (dtexec) using an Execute Process task.

From the parent package -> Execute Process Task ->
dtexec /F etc... /<pkg variable> = "servername"

Both the parent and the called package have a variable, "User::DWServerSQLInstance", which is mapped to the SQL Server connection manager's server name property using an expression. The outer package has the above variable, and so does the inner called package (which gets it assigned through the command line in the call from the outer package to the inner).

I "sometimes" get the following error:

OnError,I4,TESTDOMAdministrator,ACDWAggregation,{A1F8E43F-15F1-4685-8C18-6866AB31E62B},{77B2F3C7-6756-46EB-8C01-D880598FB4B3},5/22/2006 5:10:28 PM,5/22/2006 5:10:28 PM,-1073659822,0x,The variable "User::DWServerSQLInstance" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.

Help would be appreciated!

I have seen other posts on this, but I am not able to relate the solution to my scenario.

View 9 Replies View Related

A Variable May Only Be Added Once To Either The Read Lock List Or The Write Lock List

May 10, 2006

Hi All,



I have seen a few other people have this error.

The package works fine when run from BIDS, DTExec, or dtexecui. When I schedule it, I get these random errors (see below).

The main culprit is a variable called "RecordsetFileDIR" which is set using an expression. (@[User::_ROOT] + "RecordSets\")

A number of other variables use this as part of their expression and as they all fail, pretty much everything dies.

I have installed SP1 (not the beta) on the server. The package uses config files to set the value of _ROOT.



The error does not always seem to be with this particular variable, though; it is always a variable that uses an expression, but the errors are random. Also, it will run 3 out of 10 times without a problem. I am the only person on the server at the time.

Any ideas?



Cheers,

Crispin



Error log:

OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073659822,0x,The variable "User::RecordsetFileDIR" is already on the read list. A variable may only be added once to either the read lock list or the write lock list.

OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073639420,0x,The expression for variable "rsHeaderFile" failed evaluation. There was an error in the expression.

OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.

OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.

OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.

OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636247,0x,Accessing variable "User::rsHeaderFile" failed with error code 0xC00470EA.

OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.

OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.

OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.

OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1071636390,0x,The file name is not properly specified. Supply the path and name to the raw file either directly in the FileName property or by specifying a variable in the FileNameVariable property.

OnError,,,DF_Header_Header,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".

OnError,,,Move All Data,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".

OnError,,,Load Open Batches and Process Files,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".

OnError,,,POSBasketImport,,,10/05/2006 12:03:34,10/05/2006 12:03:34,-1073450901,0x,"component "rsHeader" (365)" failed validation and returned validation status "VS_ISBROKEN".

View 1 Replies View Related

ASP.NET 2.0 And SQL2000

Sep 5, 2006

Hi,

I have made a web application using SQL Server 2005 Express with a few admin pages that require login. I used SQL Server Management Studio Express to set up users and groups and it works fine. But the web server has SQL Server 2000, and all I get is a user id and password to access the database. It seems I cannot use the Login control I put on my "login.aspx" page because it uses integrated authentication, which chokes on 2000. I am afraid I won't be able to use some of the 2.0 features such as profiles. Is this the way 2.0 works with SQL Server 2000? Should I ditch the ISP if they only give me a single database login with no provision to create users? (I was told I had to create a "Users" table and manually assign ids and passwords, and that I can only use the user id and password they supplied to set up the connection string in the script.)

I appreciate your help.

View 2 Replies View Related

SP1 On SQL2000

Mar 8, 2001

Does anyone know when SP1 will be available for SQL2000??

Thanks,
Steve Bajada

View 1 Replies View Related

SQL2000 On NT4

Oct 23, 2000

Are there any advantages/disadvantages of running SQL2K on NT4 as opposed to running on WIN2K?

View 1 Replies View Related

Sql2000

Jan 7, 2008

Please tell me about log shipping in SQL2000.

View 1 Replies View Related

MSDE & SQL2000

Jul 13, 2006

I am developing a Crystal Reports app in VS2005 (VB). I was originally working on an XP machine with MSDE and stored procedures, where it worked fine.
I then transferred the development to a Win2003 machine with SQL2000 and am now getting the following error:
Failed to open a rowset. Description: 'Get_Calls_By_MLO_Date' expects parameter '@DT1', which was not supplied. Plus some other error detail.
Here is the SP.
CREATE PROCEDURE Get_Calls_By_MLO_By_Date
    @DT1 datetime,
    @DT2 datetime
AS
SELECT MLO, CallNo, DT, Type
FROM Actions
WHERE dt >= @DT1 AND dt <= @DT2 AND ActionID = 1
ORDER BY MLO, DT
GO
If I transfer everything back to the XP machine it works fine.
Any ideas?
Thanks
Terry.

View 3 Replies View Related

Locks In SQL2000

Jun 1, 2007

How can I lock a row in SQL2000 so that nobody can select that row?
I applied ROWLOCK, but I am not finding the way.
My query is "SELECT * FROM tablename WITH (ROWLOCK)".
Is this the correct way to take locks?
I would be thankful if you could help me.
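For what it's worth, under the default isolation level a ROWLOCK hint on a plain SELECT does not keep the row locked once the statement finishes. A sketch of holding an exclusive lock on one row for the life of a transaction (the key column name is made up; note that readers using NOLOCK will still see the row):

BEGIN TRANSACTION

SELECT *
FROM tablename WITH (XLOCK, ROWLOCK)
WHERE keycolumn = 1

-- ... other work while the row stays exclusively locked ...

COMMIT TRANSACTION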

View 2 Replies View Related

SQL2000 DTS To Excel

Apr 18, 2006

I am exporting data to an Excel file via a scheduled DTS package. I need to be able to either overwrite the existing Excel file or delete it. I haven't found a way to do this yet. Any help, comments, or direction will be appreciated. TIA
Paul

View 1 Replies View Related

I Need Your Help About SQL2000 Data Insert

Apr 25, 2006

I want to insert some data into my SQL Server. For example: insert into xcjl(zch,xcsj) values('027741',getdate())

But the "zch" column may have several values; I want to insert these one by one. How can I do that?
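If the question is about inserting several zch values one after another, a minimal sketch (the extra zch values are made up for illustration):

insert into xcjl(zch,xcsj) values('027741',getdate())
insert into xcjl(zch,xcsj) values('027742',getdate())
insert into xcjl(zch,xcsj) values('027743',getdate())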

View 1 Replies View Related

Using Sp_ For User Sp&#39;s In SQL2000

Feb 20, 2001

Hi all,

We're beginning to migrate our 6.5 DB's to SQL2000. A question came up regarding naming conventions for stored procedures. In BOL it says that for SQL2000:

"Stored procedures with the prefix sp_ are first looked up in master. If a user-defined stored procedure has the same name as a system-supplied stored procedure residing in master, SQL Server always finds the system-supplied stored procedure. "

Behavior in 6.5 was the opposite: SQL Server searched the current database followed by a search in master.

We started several months ago renaming user stored procedures to "usp_XXX", but we still have many non-system sp's that still use "sp_". I'm looking for opinions on whether we should bite the bullet now and rename all our non-system sp's, prior to moving our production environment to SQL2000. Will we see a performance gain since SQL will not have to hit master first before going to the current DB?

Any and all opinions welcome!

Thanks,

Tom Rosebrook
EMJ

View 1 Replies View Related

BCP Sql65/sql2000 Help!

Jan 9, 2002

When bcp'ing data into a SQL 6.5 table whose columns are NOT nullable, records in the text file that contain nulls are omitted, but valid records without nulls are inserted.
However, using exactly the same table structure and files on a newly installed SQL 2000 server (SP2), as soon as the first invalid record in the text file (i.e. with a null value) is encountered, bcp'ing is terminated and NO records are inserted.
I can't find any option in bcp/SQL 2000 to allow me to carry out a load as before.
The help files state that the default number of invalid records allowed is 10 (both SQL versions), and explicitly setting the -m option of bcp to 10 still doesn't work.
Most of the data I receive has at least one duff record.
Is there an option in SQL 2000 that allows me to have x number of duff records rejected before a full rollback occurs?

I can't find anything in TechNet, MSDN or Books Online.
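For reference, the switches under discussion are -m (maximum errors before bcp cancels the load) and -e (write the rows bcp could not load to an error file); a sketch of the kind of command involved, with the database, table, file and server names made up — whether the NULL rows count against -m here is exactly the open question:

bcp MyDb.dbo.MyTable in datafile.txt -c -m 10 -e errors.txt -S myserver -T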

View 1 Replies View Related






