Access To SQL Server 2000 Pauses With Lots Of Inserts, Updates, ...
Nov 21, 2007
Hi,
I have the following problem: within a VBScript, I use a component (written in C++, I think with use of ADO) for sending INSERT and UPDATE statements to a SQL Server 2000 database. If I insert 100 - 120 records in a loop, all works fine. If I insert 1000 records, approximately 150 records are inserted very quickly, then the program pauses for approx. 8 - 15 minutes, then it proceeds with the next 150 records, pauses for another 8 - 15 minutes, and so on.
If I use a SQL Server 2005 database instead, all works fine. The same thing happens with another customer and another program written in Visual Basic 6.0 with ADO: the access to SQL 2000 pauses, and with SQL 2005 all works fine. It seems to me that this is a problem with some buffer, timeout, or similar setting. Does anyone have an idea what screw I can turn?
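For anyone comparing notes, one thing worth ruling out is log or data file autogrow kicking in during the big batch; a rough T-SQL sketch for checking and adjusting the growth settings on SQL Server 2000 (the database and logical file names are placeholders):
USE MyDatabase
GO
-- Shows each file's current size and growth setting; percentage growth on a
-- large file can stall inserts for a long time while the new space is allocated.
EXEC sp_helpfile
GO
-- Switching to a fixed growth increment is one common mitigation.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_log, FILEGROWTH = 100MB)
GO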
Thanks
Hans
View 1 Replies
Mar 19, 2008
Hi,
I am working on an application to analyse down time on a production line system. The system has about 40 rows inserted per minute. The inserts are coming from about 10 different stations.
I need to analyse the downtime between each insert from each station. The plan is to copy the data to another database on a different server so as not to affect the live system that is being updated by the production line.
However, the initial requirement was to do this at night while the production line was down, but now they want the data to be updated every 3 hours, which means performing this large query while the production line is bombarding the DB with inserts.
I am wondering what the best way of doing this is. Is there any way I can limit the amount of processor this procedure will take?
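In case it helps frame answers, the gap analysis itself can be done set-based against the copied data; a sketch, assuming a table ProductionLog(StationId, InsertedAt) stands in for the real table and column names:
-- For each row, find the seconds until the next insert from the same station.
SELECT  p.StationId,
        p.InsertedAt,
        DATEDIFF(second, p.InsertedAt,
                 (SELECT MIN(n.InsertedAt)
                  FROM   ProductionLog n
                  WHERE  n.StationId  = p.StationId
                    AND  n.InsertedAt > p.InsertedAt)) AS SecondsToNextInsert
FROM    ProductionLog p
Running this against the copy rather than the live database, and keeping an index on (StationId, InsertedAt), keeps the load on the production server down to the copy step itself.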
Any advice appreciated,
Thanks,
Sean
View 2 Replies
View Related
Oct 30, 2007
Hi All,
I'm a relative novice on SQL Server and am a complete beginner at SQL, so am looking for a little help.
I currently use a DTS package to perform inserts / updates to a "production" table.
The DTS package transforms a comma separated file into a "temporary" table that is truncated / cleared before the load starts.
The temporary table has a column denoting Insert or Update. The production table is almost identical, however, doesn't contain the Insert / Update column. The DTS package then, depending upon the Insert / Update flag, either inserts data into the production table or updates data in the production table.
When the DTS package has completed, I'd like to be able to run an SQL Query that validates everything in the "temporary" table is identical to that in the "production" table, which it should be.
I have managed to write some queries to verify that everything has loaded / updated, e.g. select primary_key from temporary table where primary_key not in (select primary_key from production table); however, what I haven't been able to do is verify that all the columns in the temporary table match the values in the production table (excluding the Insert / Update flag).
I tried concatenating the columns in each table and comparing the concatenated values, however, this failed due to the different data-types, i.e. decimal, text etc.
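One approach, sketched here with placeholder names (TemporaryTable, ProductionTable, the key PrimaryKey and columns Col1, Col2 stand in for the real ones), is to compare the shared columns directly instead of concatenating them:
SELECT t.PrimaryKey
FROM   TemporaryTable t
       JOIN ProductionTable p ON p.PrimaryKey = t.PrimaryKey
WHERE  ISNULL(t.Col1, '') <> ISNULL(p.Col1, '')     -- character column
   OR  ISNULL(t.Col2, 0)  <> ISNULL(p.Col2, 0)      -- numeric column
-- one line per column, leaving out the Insert / Update flag
On SQL Server 2005 or later the same check can be written as SELECT (columns) FROM TemporaryTable EXCEPT SELECT (columns) FROM ProductionTable, which returns any temporary rows that have no exact match in production.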
Any help will be greatly appreciated.
Many thanks.
Cheers,
David
View 8 Replies
View Related
Aug 30, 2007
I have a database set up with the fields defined as nchar(200).
Now, when I update a row using the code below, it inserts the text but then seems to fill the rest of the field with spaces; i.e. if the text is only 10 characters, MSSQL seems to pad 190 characters on the end of it to make 200.
Is there any way I can stop this?
// Create new command
comm = new SqlCommand(
    "UPDATE Pages SET PageBody=@PageBody, " +
    "PageMetaTitle=@PageMetaTitle, PageMetaDesc=@PageMetaDesc, PageMetaKeywords=@PageMetaKeywords " +
    "WHERE PageID=@PageID", conn);

// Add command parameters
comm.Parameters.Add("@PageID", System.Data.SqlDbType.Int);
comm.Parameters["@PageID"].Value = idTextBox.Text;
comm.Parameters.Add("@PageBody", System.Data.SqlDbType.NVarChar);
comm.Parameters["@PageBody"].Value = contentTextBox.Text;
comm.Parameters.Add("@PageMetaTitle", System.Data.SqlDbType.NVarChar);
comm.Parameters["@PageMetaTitle"].Value = titleTextBox.Text;
comm.Parameters.Add("@PageMetaDesc", System.Data.SqlDbType.NVarChar);
comm.Parameters["@PageMetaDesc"].Value = descriptionTextBox.Text;
comm.Parameters.Add("@PageMetaKeywords", System.Data.SqlDbType.NVarChar);
comm.Parameters["@PageMetaKeywords"].Value = keywordsTextBox.Text;
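For what it's worth, the padding is simply how the fixed-length nchar type behaves: an nchar(200) column always stores 200 characters, so shorter values are space-padded. A sketch of the two usual fixes, assuming the columns really are meant to hold variable-length text (Pages/PageBody as in the code above; existing rows keep their stored padding until trimmed):
-- Option 1: make the column variable length so only what you send is stored.
ALTER TABLE Pages ALTER COLUMN PageBody nvarchar(200) NULL
UPDATE Pages SET PageBody = RTRIM(PageBody)   -- strip padding already saved

-- Option 2: keep nchar and trim on the way out.
SELECT RTRIM(PageBody) AS PageBody FROM Pages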
View 3 Replies
View Related
Jul 20, 2005
Hello,
Can someone point me to a way of getting the total number of inserts and updates on a table over a period of time? I just want to measure the insert and update activity on the tables.
Thanks.
- Vish
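If this is SQL Server 2000, there is no built-in per-table counter, so the usual options are a Profiler trace or a small audit trigger. A sketch of the trigger approach (TableActivity and MyTable are placeholder names):
CREATE TABLE TableActivity (
    TableName  sysname  NOT NULL,
    ActionType char(1)  NOT NULL,          -- 'I' = insert, 'U' = update
    RowCnt     int      NOT NULL,
    LoggedAt   datetime NOT NULL DEFAULT GETDATE()
)
GO
CREATE TRIGGER trMyTable_Activity ON MyTable
AFTER INSERT, UPDATE
AS
BEGIN
    IF NOT EXISTS (SELECT * FROM deleted)   -- no "before" rows, so this was an INSERT
        INSERT TableActivity (TableName, ActionType, RowCnt)
        SELECT 'MyTable', 'I', COUNT(*) FROM inserted
    ELSE
        INSERT TableActivity (TableName, ActionType, RowCnt)
        SELECT 'MyTable', 'U', COUNT(*) FROM inserted
END
GO
Summing RowCnt grouped by ActionType over a date range then gives the totals for any period.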
View 3 Replies
View Related
May 27, 2014
I need a script that inserts the data of an Excel sheet into a table. If a row already exists it should be left alone, unless it has been edited in the Excel sheet, and so on. This process has to go through a stored procedure... but how?
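A sketch of the usual shape of this, assuming the spreadsheet has first been loaded into a staging table (dbo.ExcelStaging here, e.g. via OPENROWSET or the Import wizard) and assuming SQL Server 2008 or later so MERGE is available; all table and column names are placeholders:
CREATE PROCEDURE dbo.usp_UpsertFromExcel
AS
BEGIN
    SET NOCOUNT ON

    MERGE dbo.TargetTable AS t
    USING (SELECT Id, Name, Amount FROM dbo.ExcelStaging) AS s
        ON t.Id = s.Id
    -- row exists but the sheet was edited: update it
    WHEN MATCHED AND (ISNULL(t.Name, '') <> ISNULL(s.Name, '')
                   OR ISNULL(t.Amount, 0) <> ISNULL(s.Amount, 0)) THEN
        UPDATE SET Name = s.Name, Amount = s.Amount
    -- row not there yet: insert it
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Name, Amount) VALUES (s.Id, s.Name, s.Amount);
END
GO
Rows that exist and were not edited fall through untouched, which matches the "leave it unless it's edited" requirement.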
View 6 Replies
View Related
Oct 30, 2007
Hi...
I have data that I am getting from a DBF file, and I am dumping that data into a SQL Server database. After scrubbing it there, I put it into the production database. Right now my stored procedure handles a single plan only, but there may now be two or more plans together in the same SQL Server database, which I need to scrub and then update (if that particular plan already exists) or insert (if it doesn't).
this is my sproc...
ALTER PROCEDURE [dbo].[usp_Import_Plan]
@ClientId int,
@UserId int = NULL,
@HistoryId int,
@ShowStatus bit = 0-- Indicates whether status messages should be returned during the import.
AS
SET NOCOUNT ON
DECLARE
@Count int,
@Sproc varchar(50),
@Status varchar(200),
@TotalCount int
SET @Sproc = OBJECT_NAME(@@ProcId)
SET @Status = 'Updating plan information in Plan table.'
UPDATE
Statements..Plan
SET
PlanName = PlanName1,
Description = PlanName2
FROM
Statements..Plan cp
JOIN (
SELECT DISTINCT
PlanId,
PlanName1,
PlanName2
FROM
Census
) c
ON cp.CPlanId = c.PlanId
WHERE
cp.ClientId = @ClientId
AND
(
IsNull(cp.PlanName,'') <> IsNull(c.PlanName1,'')
OR
IsNull(cp.Description,'') <> IsNull(c.PlanName2,'')
)
SET @Count = @@ROWCOUNT
IF @Count > 0
BEGIN
SET @Status = 'Updated ' + Cast(@Count AS varchar(10)) + ' record(s) in ClientPlan.'
END
ELSE
BEGIN
SET @Status = 'No records were updated in Plan.'
END
SET @Status = 'Adding plan information to Plan table.'
INSERT INTO Statements..Plan (
ClientId,
ClientPlanId,
UserId,
PlanName,
Description
)
SELECT DISTINCT
@ClientId,
CPlanId,
@UserId,
PlanName1,
PlanName2
FROM
Census
WHERE
PlanId NOT IN (
SELECT DISTINCT
CPlanId
FROM
Statements..Plan
WHERE
ClientId = @ClientId
AND
ClientPlanId IS NOT NULL
)
SET @Count = @@ROWCOUNT
IF @Count > 0
BEGIN
SET @Status = 'Added ' + Cast(@Count AS varchar(10)) + ' record(s) to Plan.'
END
ELSE
BEGIN
SET @Status = 'No information was added Plan.'
END
SET NOCOUNT OFF
So how do I do multiple inserts and updates using this stored procedure?
Regards
Karen
View 5 Replies
View Related
Jan 22, 2001
I have a number of columns with a predefined character length, but users can input more from the GUI. I want to truncate the input automatically to the desired length and then insert or update the database. Right now it does not allow me to update or insert the values. Can I do this, and how? This is urgent.
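A sketch of the usual fix, with placeholder names (MyTable.SomeColumn declared as varchar(50), @UserInput carrying whatever the GUI sent): truncate explicitly with LEFT() before the insert/update so the statement no longer fails with the "string or binary data would be truncated" error.
DECLARE @UserInput varchar(8000), @Id int
SELECT  @UserInput = REPLICATE('x', 500), @Id = 1   -- pretend the GUI sent 500 characters

UPDATE MyTable
SET    SomeColumn = LEFT(@UserInput, 50)   -- keep only what fits in varchar(50)
WHERE  Id = @Id

INSERT INTO MyTable (Id, SomeColumn)
VALUES (2, LEFT(@UserInput, 50))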
View 2 Replies
View Related
Nov 28, 2007
Hi
I have 3 tables (tbl1, tbl2, tbl3).
tbl1 columns --- Pid (autonumber), Pcode (unique values), PGuid
tbl2 columns --- Pid, CatId, Fid
tbl3 columns --- Pid, BundleId
I am using a trigger (after insert) on tbl1 to insert tbl1.Pid into tbl2.Pid and tbl3.Pid. tbl1 will be loaded with new rows, and existing rows will be updated, from a CSV file.
The trigger on tbl1 works fine, inserting data into tbl2.Pid and tbl3.Pid.
My questions are:
1. Is it possible to use a trigger to update tbl2 and tbl3 if records exist, and if not, insert a new record into column Pid of tbl2 and tbl3 from Pid of tbl1?
2. Or is it possible to use a SQL procedure to do cascaded updates or inserts by merging datasets from the tables and the CSV file using ASP.NET?
3. Or how can I use a single trigger to insert if the data does not exist and update if it does?
4. Or do I have to use two triggers, one for insert and one for update? If this is the case, how do I check for data existence?
The after insert trigger I am using at the moment is below (it was provided as an answer in this forum in my previous thread):
create trigger t1 on tbl1
after insert
as
begin
declare @id int
select @id=pid from inserted
insert into tbl2(pid) values(@id)
insert into tbl3(pid) values(@id)
end
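On question 3, a single set-based trigger can cover it, because it works from everything in the "inserted" table rather than a single @id; a sketch along the lines of the tables described above, replacing the single-row version (if updates to tbl1 should also flow through, the same trigger can be created AFTER INSERT, UPDATE):
CREATE TRIGGER t1 ON tbl1
AFTER INSERT
AS
BEGIN
    -- add Pids that tbl2 does not have yet; rows that already exist need no
    -- change, since only the Pid value is carried over
    INSERT INTO tbl2 (Pid)
    SELECT i.Pid
    FROM   inserted i
    WHERE  NOT EXISTS (SELECT * FROM tbl2 t WHERE t.Pid = i.Pid)

    -- same pattern for tbl3
    INSERT INTO tbl3 (Pid)
    SELECT i.Pid
    FROM   inserted i
    WHERE  NOT EXISTS (SELECT * FROM tbl3 t WHERE t.Pid = i.Pid)
END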
View 5 Replies
View Related
Sep 15, 2006
Hi, I have the following sp. I'm using VS.NET 2005 and SQL Server 2005.
Now, how do I implement functionality like this: with datasets, an exception is thrown when the row has been modified, and I get to inform the user and reload the modified row.
How do I get the same functionality using an sp? Is that possible? How?
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER Procedure [dbo].[sp_Insert_Or_Update_Bill]
@BillID_All uniqueidentifier,
@BillID uniqueidentifier,
@Pono nvarchar(25),
@Date_ smalldatetime,
@SupplierCode nvarchar(25),
@Reference nvarchar(25),
@AmountType nchar(10),
@BillType nvarchar(25),
@TypeCode nchar(10),
@AmountFC decimal(18,2),
@ROE decimal(9,2),
@Currency nchar(5),
@LinkBill uniqueidentifier,
@multiplicityID uniqueidentifier,
@payment_or_bill smallint
as
If Exists(Select * from bill where billid_all= @BillID_All and amounttype = @amounttype)
begin
update bill
SET [BillID_All] = @BillID_All
,[BillID] = @BillID
,[Pono] = @Pono
,[Date_] = @Date_
,[SupplierCode] = @SupplierCode
,[Reference] = @Reference
,[AmountType] = @AmountType
,[BillType] = @BillType
,[TypeCode] = @TypeCode
,[AmountFC] = @AmountFC
,[ROE] = @ROE
,[Currency] = @Currency
,[LinkBill] = @LinkBill
,[multiplicityID] = @multiplicityID
,[payment_or_bill] = @payment_or_bill
where billid_all= @BillID_All and amounttype = @amounttype
end
else
if not @AmountFC = 0
begin
begin
INSERT INTO [Costing].[dbo].[Bill]
([BillID_All]
,[BillID]
,[Pono]
,[Date_]
,[SupplierCode]
,[Reference]
,[AmountType]
,[BillType]
,[TypeCode]
,[AmountFC]
,[ROE]
,[Currency]
,[LinkBill]
,[multiplicityID]
,[payment_or_bill])
VALUES
(@BillID_All
,@BillID
,@Pono
,@Date_
,@SupplierCode
,@Reference
,@AmountType
,@BillType
,@TypeCode
,@AmountFC
,@ROE
,@Currency
,@LinkBill
,@multiplicityID
,@payment_or_bill)
end
end
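One way to get the same "row has been modified" behaviour from a stored procedure is optimistic concurrency with a rowversion/timestamp column checked in the WHERE clause; a sketch, assuming a hypothetical RowVer column has been added to the bill table and the client keeps the value it originally read:
CREATE PROCEDURE dbo.usp_Update_Bill_Checked
    @BillID_All     uniqueidentifier,
    @AmountType     nchar(10),
    @Reference      nvarchar(25),
    @OriginalRowVer binary(8)                  -- RowVer value the client read earlier
AS
BEGIN
    UPDATE bill
    SET    Reference = @Reference              -- ...plus the other columns...
    WHERE  billid_all = @BillID_All
      AND  amounttype = @AmountType
      AND  RowVer     = @OriginalRowVer        -- only matches if nobody changed the row

    IF @@ROWCOUNT = 0
        -- same situation the dataset reports: reload the row and inform the user
        RAISERROR('The bill was modified or deleted by another user.', 16, 1)
END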
View 1 Replies
View Related
Nov 22, 1999
Is there a way to switch off transaction logging for insert and update statements and not only for select into bulk copy?
View 1 Replies
View Related
May 26, 1999
Can someone point me to some code that properly prepares a string for an INSERT or UPDATE into a char or varchar column? I need to be able to handle single & double quotes and any other possible issues. thanks
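For reference, a sketch of the two usual options in T-SQL terms (MyTable/Name are placeholders): parameterized commands sidestep the quoting problem entirely, and when a literal must be built as a string, only single quotes need escaping, by doubling them.
DECLARE @raw varchar(200), @escaped varchar(400)
SET @raw = 'O''Brien said "hello"'              -- double quotes are harmless in T-SQL
SET @escaped = REPLACE(@raw, '''', '''''')      -- double every single quote

-- Only needed when the statement is assembled as a string:
EXEC ('INSERT INTO MyTable (Name) VALUES (''' + @escaped + ''')')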
View 1 Replies
View Related
Nov 5, 2007
I am looking for pros and cons for the following scenarios:
When a table contains a unique key constraint, is it viable to always do an insert, immediately check the @@ERROR value, and, if @@ERROR indicates a duplicate key exception, perform an update statement instead?
Another possible solution would be to always check if the key exists and then do the insert / update based upon that result. This method will always require two steps.
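For comparison, rough sketches of both patterns (MyTable with a unique key on KeyCol is a placeholder). The @@ERROR route relies on the duplicate-key errors 2601/2627 and still returns the error to the client unless it is handled; the check-first route needs lock hints inside a transaction to avoid a race between the check and the insert.
DECLARE @KeyCol int, @Val varchar(50)
SELECT  @KeyCol = 1, @Val = 'abc'

-- Pattern 1: insert, then fall back to update on a duplicate key
INSERT INTO MyTable (KeyCol, Val) VALUES (@KeyCol, @Val)
IF @@ERROR IN (2601, 2627)
    UPDATE MyTable SET Val = @Val WHERE KeyCol = @KeyCol

-- Pattern 2: check first, then update or insert
BEGIN TRAN
IF EXISTS (SELECT * FROM MyTable WITH (UPDLOCK, HOLDLOCK) WHERE KeyCol = @KeyCol)
    UPDATE MyTable SET Val = @Val WHERE KeyCol = @KeyCol
ELSE
    INSERT INTO MyTable (KeyCol, Val) VALUES (@KeyCol, @Val)
COMMIT TRAN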
View 4 Replies
View Related
Apr 2, 2007
Hi:
We have a weird problem on our test server (SQL 2005 SP1 on Windows 2003 Enterprise SP1, after upgrading from SQL 2000).
While changing data via a JBOSS JDBC connection, all our data modifications get rolled back.
Selects are fine. The same SQL 2005 inserts/updates through JBOSS are fine on other machines with Windows XP (JBOSS is on the same machine there, while in the troubled configuration it is on its own server).
But in the production-imitation environment we can't modify data.
Please advise where to look for the solution.
Thanks.
View 2 Replies
View Related
Dec 14, 2001
We recently upgraded from a HP LH3(Dual PIII 550, 512MB, NT 4 SP6, MDAC 2.5, SQL 7 SP3) to a new Dell 2550: (Dual PIII 1.26, 512 MB, NT 4 SP6, MDAC 2.6, SQL 7 SP3).
Our system is comprised of a service that listens on a port and then sends the requests to a Transaction Server,
inserts stuff into the database and then goes back the other direction to send the response.
We have a stress tester that we can start multiple threads and send different type of transactions. One of these transactions is a simple ping, just a header and footer that gets inserted into the database and then a header and footer is returned to the client.
Our problem is that these are working correctly on the live database server and our live (service) workstation, which will send thousands of pings without stopping. On the new server it will stop after 35 pings, pause for several seconds, then send another 35 pings, pause, and this cycle continues. (No, we don't have any code inserted to do this :P)
We have watched the Performance Monitor and see nothing out of the ordinary. We have run a trace on both the test server and this new server. Near the end of the new server's trace it shows the disconnect twice with a duration of 40094, and then it starts up with a call to sp_sqlagent_get_perf_counters. This trace only had the SQL command and stored procedure events. We also ran a trace that contained all events, and the stress tester actually worked without pauses. But this slowed down the transactions too much, and isn't really an acceptable resolution.
Has anyone ever encountered this problem? Any options that need to be set?
Thanks,
G.
View 1 Replies
View Related
Aug 15, 2006
Hi,
Before stepping into the ADO.NET code that performs an insert or update, the insert/update has already taken place, just on starting the debugger. I use VS 2005 with SQL Server 2000. This did not happen with VS 2003 and SQL Server 2000.
Has anyone else encountered this?
View 2 Replies
View Related
Sep 6, 2005
Hello people!
Here is the situation: we have an old proprietary database that is used for daily tasks. Not much we can do with it, but we have to use it. I can create basically a read-only connection to it through ODBC.
I would like to, on a timed interval, copy certain data for reporting from this slow thing to my SQL Server, so that I can learn to program and create some cool reports etc. without having to wait on this server all day.
So here is what I don't quite understand. I had originally planned on just deleting the contents of the table on my SQL Server just before I populated it each time, but found out that my AutoNumber field will continue to increase, and I'm assuming that eventually I'm going to run into a problem as a result.
Should I be doing some kind of update instead? If so, do I need to first check if the record exists, and if not do an insert, and if so do an update type thing? Or is there a way to basically do it in one command?
I hope this makes sense. I would show you some code, but there really isn't much to show you other than my insert statement :-)
Thanks for any advice!
Josh
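On the AutoNumber worry specifically, a small sketch (ReportCopy is a placeholder for your SQL Server table): TRUNCATE TABLE resets the identity seed back to its starting value, unlike DELETE, so the simple "wipe and reload on a schedule" plan keeps working without the identity value growing forever.
-- empty the reporting copy and reset the identity column, then reload
TRUNCATE TABLE dbo.ReportCopy

INSERT INTO dbo.ReportCopy (SourceKey, SomeColumn)
SELECT SourceKey, SomeColumn
FROM   SourceData   -- however the ODBC data arrives (linked server, DTS, etc.)
The check-if-exists-then-update-or-insert pattern works too, but it is only worth the extra code if reloading everything becomes too slow.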
View 5 Replies
View Related
Mar 11, 2002
I am kinda new with SQL and am trying to get a count of the number of updates and/or inserts to any given table or group of tables, and cannot get the syntax correct... can anyone help with this?
Thank you in advance.
Colin P.
View 1 Replies
View Related
Jul 20, 2005
Hopefully someone can at least point me in the right direction for more research (e.g. correct terminology). My only previous experience was just dumping data into a database using ODBC, and that was some years ago, so it is now mostly forgotten.
I need to write an NT Service/Application (in C/C++) that will be getting data sent to it via SQL Server 2000. The data will arrive in my SQL Server (read-only access) via replication of tables from another remote SQL Server.
My application needs to know when new rows are inserted or updated so it can read this data (it needs to be quick/timely, so hopefully no polling) and then interface with other remote proprietary systems.
T.I.A.
PS: If you can recommend appropriate books on SQL Server 2000, that would also be useful.
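One low-tech pattern that may help frame the research: SQL Server 2000 has no built-in push notification, but triggers on the subscriber still fire for replicated changes by default, so (assuming you can create objects on your own server) a trigger can drop the changed keys into a small queue table that the C++ service drains with a very cheap, frequent poll. A sketch, with placeholder names:
CREATE TABLE dbo.ChangeQueue (
    QueueId   int IDENTITY(1,1) PRIMARY KEY,
    SourceId  int NOT NULL,                    -- key of the changed row
    ChangedAt datetime NOT NULL DEFAULT GETDATE()
)
GO
CREATE TRIGGER trReplicatedTable_Queue ON dbo.ReplicatedTable
AFTER INSERT, UPDATE
AS
    INSERT INTO dbo.ChangeQueue (SourceId)
    SELECT Id FROM inserted
GO
The service then just reads and deletes from ChangeQueue in QueueId order.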
View 2 Replies
View Related
Mar 1, 2006
I have transactional replication set up between two SQL Server 2000 databases. In some cases when I perform an UPDATE on a published table on the publisher, SQL Server attempts to perform a DELETE followed by an INSERT on the subscriber, using the stored procedures created during the initial snapshot.
Why does it do this?
How can I stop it doing this and force an UPDATE on the publisher to call the UPDATE procedure on the subscriber?
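If it helps while waiting for replies: this looks like the documented "deferred update" behaviour - when an UPDATE changes a column that is part of a unique index or the primary key, SQL Server 2000 replicates it as a DELETE/INSERT pair. If memory serves, trace flag 8207 (available from SQL 2000 SP1) makes qualifying singleton updates replicate as true UPDATEs; treat this as something to verify and test rather than a definitive fix.
-- turns on singleton-update behaviour instance-wide until restart (verify and test first)
DBCC TRACEON (8207, -1)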
Thanks
View 3 Replies
View Related
Oct 23, 2007
Hi all,
This managed application was written to run on a Symbol 3090 Win CE 5.0 scanning device. We are using the Symbol-provided classes to access the scanning interface and a SQL Compact database on the device to collect the scanned data, and then using merge replication to synchronize the scanned data when the device is docked. The problem we have experienced seems to be related to the performance when inserting and updating records in the database.
We have tested inserting/updating 1000 randomly generated records into a database. At first the time to commit a record increases when the database is flushing to storage (the flush interval in the connection string is 10 seconds by default), and then as the database size grows, the time to commit every single record increases, which causes the application to perform slowly as items are scanned into the database. However, the device program memory remains consistent as items are scanned. From our tests, I found the time to execute either an update or insert command on a 2 MB SQL Mobile database (up to 10,000 records, depending on the size of the columns) is taking nearly 2 to 2.5 seconds to complete. Below is the only code I am executing:
If Not sqlObj.UpdateItem(1061022, itemNo, 1) Then
sqlObj.InsertResultSet(1061022, itemNo, itemObj.Style, itemObj.Color, itemObj.Size, itemObj.Description, 0, 1)
End If
For the record, I am using a prepared update command and the ResultSet insert method to perform the update and insert commands against the database.
Any help on this issue is highly appreciated.
Thanks
Ravi.
View 1 Replies
View Related
Apr 16, 2007
I want to measure the update/insert rate of my database, in order to measure what percentage of the database activity is taken up by inserts and updates.
Can someone suggest a mechanism or other resource for doing this?
Thanks in advance.
View 4 Replies
View Related
Jun 8, 2007
Ok, I think this may have a simple answer. Basically I have no problems setting up QueryString/Control/etc. parameters when I use SELECT in the Configure Data Source wizard, as it prompts me for the necessary parameters. But when I try to use the Configure Data Source wizard with an UPDATE, INSERT or DELETE it does NOT prompt me for the required parameters. Is this a bug or am I just missing something? Do I have to put them in manually?
Thanks!
View 5 Replies
View Related
Jan 3, 2008
I have a 14GB database whose data content is legacy and is described as static. The log file is significantly large and continues to change size, mostly increasing by 2-5GB a day (~60GB now) as I have observed over the past two days; it shrank once unexpectedly by a few GB. The instance is hosting other databases such as EnterpriseVaultDirectory, EnterpriseVaultMonitoring, EnterpriseVaultStore, and NetPerfMon - might these seemingly unrelated data sources be involved?
I am trying to run a trace to find traffic against the tables, with no luck so far.
Web applications are running queries against it, but there should be no UPDATEs being applied. I can only suspect that other unknown applications are performing operations, but I have yet to find unexplained connections.
Are there any other reasons why this type of log file activity would happen merely due to queries or stored procedure calls?
Let's also state that mirroring, indexing, and replication are not at play. I know the "Full" recovery model is not necessary, as "Simple" should suffice, but I am still hunting down why UPDATEs might be getting through. I realize I might adjust the migrated SQL 2000 security model to deny updates to find what breaks, but would rather not take that initiative yet.
The installation is a fresh SQL 2005 Standard setup with SP2 applied; the databases were upgraded.
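For anyone else chasing the same thing, a couple of stock SQL Server 2005 checks (replace LegacyDb with the real database name) that show how full the log is and what is stopping it from being reused:
DBCC SQLPERF (LOGSPACE)           -- log size and percent used for every database

SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM   sys.databases
WHERE  name = 'LegacyDb'          -- e.g. LOG_BACKUP means full recovery with no log backups being taken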
View 6 Replies
View Related
Oct 8, 2007
We have a SQLServer 2005 Enterprise merge replication publication with SQL Mobile 3.0 subscribers (Windows Mobile 5.0 and 6.0). We do not use pre-computed partitions due to trigger performance issues with an SSIS/ETL application that supplies data to the merge database. We do use the "Optimize" (=true) option, though we have tried this both ways with no significant differences. We use filters and joins for each worker ID (as HOST_ID) from the subscriptions.
The sync times become increasingly worse after we run the snapshot and bring the publication online. I have tried rerunning the snapshots, this helps little, as it often behaves like the subscription was set to reinitialize and forces a big sync (reload of all data) to the subscriber. We have tried much of the obvious (e.g., flattening filters and joins, adding indexes, etc.).
When users are synchronizing, we watch replication monitor and notice that a lot of time is spent processing "enumerating inserts and updates for article [any article]", especially processing the many generations and batches. This is true for any follow-up syncs after the 1st big sync (initializing the subscription).
I read several posts regarding the batches and generations of changes, and decided to try increasing the "DownloadGenerationsPerBatch" setting. I tried adding this parameter to the snapshot agent job, and the job fails each time with a vague message, even with the default value of 100. How do you change this parameter for SQL Server 2005 Enterprise?
Any suggestions?
Thanks in advance,
Matt
View 5 Replies
View Related
Feb 27, 2008
Hi!
We have SQL Server 2000 on a virtual machine (slow). We run a pair of stored procedures from ADO.NET 2.0 (a web service):
- first procedure: insert a data row with a correlationID into table A
- second procedure: search for the inserted record in table A by correlationID and insert a value into table B with a constraint to table A.
The problem is when we try to do this thousands of times in a short time (5000/6000 pairs in 5/6 minutes) across 100 parallel threads: the query in the second procedure cannot find the record inserted by the first procedure.
Everything is dirty read; no transactions.
Have you any ideas?
edi
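A sketch of the kind of change that removes the timing window - doing both steps inside one call and one transaction, so the second insert sees the first by definition (table and column names are placeholders for the real ones):
DECLARE @CorrelationId uniqueidentifier, @Payload varchar(100), @SomeValue varchar(100)
SELECT  @CorrelationId = NEWID(), @Payload = 'payload', @SomeValue = 'value'

BEGIN TRAN
    INSERT INTO TableA (CorrelationId, Payload)
    VALUES (@CorrelationId, @Payload)

    -- the row above is visible here because it is the same transaction
    INSERT INTO TableB (A_Id, SomeValue)
    SELECT a.Id, @SomeValue
    FROM   TableA a
    WHERE  a.CorrelationId = @CorrelationId
COMMIT TRAN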
View 1 Replies
View Related
Jan 6, 2008
When I hit F12 to preview my page and forms etc., it goes to this error:
This error (HTTP 500 Internal Server Error) means that the website you are visiting had a server problem which prevented the webpage from displaying.
For more information about HTTP errors, see Help.
And when I try to hook up to a SQL database it won't let me and says the server doesn't exist. What's wrong with my SQL server and what can I do to resolve this problem?
View 1 Replies
View Related
Aug 3, 2006
Hello guys and dolls.
I have a DTS package in SQL Server 2000 which worked up to week 27 but not from week 28. During that time I was on vacation and no one else was working with the server.
The part that causes the problem looks like this:
if exists (select * from dbo.sysobjects where id = object_id(N'[lenko7].[aviskydd]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [lenko7].[aviskydd]
GO
CREATE TABLE [lenko7].[aviskydd] (
[Col002] [varchar] (255) NOT NULL,
[linecount] [int] NOT NULL
) ON [PRIMARY]
GO
I have not changed the owner.
I don't understand why this DTS package suddenly can't work, because it has been working for several years with no problems. The only thing I know is that we updated Windows 2000 because of critical updates on July 13.
Is there anyone who has had the same problem?
Regards Stig Holmlund
View 3 Replies
View Related
Apr 26, 2007
Hi,
I have a few tables. I want to identify the RECORDS in a table which have been created or modified at a particular date and time. I don't want to write a trigger to capture the add/update events.
Is there any system table that tracks, by date and time, which individual records were last updated or newly created, that I could query from a stored procedure?
Note: the application was already created without a LastModified date on each table... so we don't want to modify the application or the DB.
Database : SQL Server 2000 sp4
thanks in advance.
View 3 Replies
View Related
Oct 18, 2005
Hi Guru's,
I am kind of baffled. I have a table with a varchar(8) column in 2000 and the same in 6.5. When I insert a value longer than 8 characters into the 2000 table via Cold Fusion, it fails. The same Cold Fusion program inserts into the 6.5 table, truncating the data, and does not fail.
Does anyone know why this happens? Thanks, Newbie.
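The difference is most likely the ANSI_WARNINGS setting: 6.5 (and connections with it OFF) silently truncate character data that is too long, while the SQL 2000 ODBC/OLE DB drivers turn it ON by default, which raises the "string or binary data would be truncated" error instead. A sketch to see both behaviours side by side (MyTable/Col8 are placeholders):
CREATE TABLE MyTable (Col8 varchar(8))

SET ANSI_WARNINGS OFF
INSERT INTO MyTable (Col8) VALUES ('123456789012')   -- stored as '12345678'

SET ANSI_WARNINGS ON
INSERT INTO MyTable (Col8) VALUES ('123456789012')   -- fails with error 8152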
View 1 Replies
View Related
Jul 23, 2005
Hi,
Simple question: a customer has an application using an Access 2000 frontend and a SQL Server 2000 backend. The data connection is over ODBC. There are almost 250 concurrent users and the number is growing. Have they squeezed everything out of Access? Should the move to a VB.Net frontend have taken place ages ago?
Cheers
Mike
View 4 Replies
View Related
May 27, 2008
| Parameter | Access 2000/XP | SQL Server 7.0 | SQL Server 2000 | MSDE 2000 |
| --- | --- | --- | --- | --- |
| Number of instances per server | n/a | n/a | 16 | 16 |
| Number of databases per instance / server | n/a | 32,767 | 32,767 | 32,767 |
| Number of objects per database | 32,768 | 2,147,483,647 | 2,147,483,647 | 2,147,483,647 |
| Number of users per database | n/a | 16,379 | 16,379 | 16,379 |
| Number of roles per database | n/a | 16,367 | 16,367 | 16,367 |
| Overall size of database (excluding logs) | 2 GB | 1,048,516 TB | 1,048,516 TB | 2 GB |
| Number of columns per table | 255 | 1024 | 1024 | 1024 |
| Number of rows per table | limited by storage | limited by storage | limited by storage | limited by storage |
| Number of bytes per row (excluding TEXT/MEMO/IMAGE/OLE) | 2 KB | 8 KB | 8 KB | 8 KB |
| Number of columns per query | 255 | 4,096 | 4,096 | 4,096 |
| Number of tables per query | 32 | 256 | 256 | 256 |
| Size of procedure / query | 64 KB | 250 MB | 250 MB | 250 MB |
| Number of input params per procedure / query | 199 | 1,024 | 2,100 | 2,100 |
| Size of SQL statement / batch | 64 KB | 64 KB | 64 KB | 64 KB |
| Depth of subquery nesting | 50 | 32 | 32 | 32 |
| Number of indexes per table | 32 | 250 (1 clustered) | 250 (1 clustered) | 250 (1 clustered) |
| Number of columns per index | 10 | 16 | 16 | 16 |
| Number of characters per object name | 64 | 128 | 128 | 128 |
| Number of concurrent user connections | 255 | 32,767 | 32,767 | 5 |
View 1 Replies
View Related
Jul 23, 2005
I've created a small company database where the tables reside in a SQL Server database. I'm using Access 2000 forms for a front end. I've got a System DSN set up to SQL Server and am using links within Access 2000 to get to the SQL Server tables.
My forms worked fine until I made a few minor changes to the database schema on SQL Server (e.g. added a foreign key, or added a column). After that, all the links break - I click on a table link and get an error msg like "invalid object name." Deleting the links after a schema change and re-adding the links seemed to fix the problem. The forms I'd already created seemed to work fine after re-creating the links.
But then I got more advanced with my forms. I have it set up so that for certain entry fields, the combobox gets populated with values from a table (the description appears in the drop-down and the corresponding primary key value gets populated in the table). I created a number of forms using this technique, entered data, and everything worked fine. Made a small schema change and it broke everything -- not the actual table links, but the functionality for the drop-downs. My values no longer appeared, and this was true for forms that accessed tables whose schemas did not change.
This is driving me nuts. Is there any way to keep my forms from breaking each time I make a small schema change?
Thanks.
- Dana
View 5 Replies
View Related