Let's assume database A is production and B is the copy. SQL Server 2005 SP2, SQL CE 3.5.
Database A has a variety of transactions against it 24x7
Database B (the copy) is for reporting and as a source of merge replication for SQL CE instances
Merge replication and reporting are used 24x7 as well.
I have the following requirements:
Maintain an up-to-date copy of the production database (it need not be up to the minute; hourly or even daily updates would do)
Database B is read-only. The merge replication is NOT bi-directional.
Here is the caveat (which I think prohibits using some solutions to this problem):
The production application accomplishes much of its functionality with in-memory copies of records. I have no control over the production application. When it works against the database, it does a sort of 'withdrawal-deposit' scenario (to the best of my knowledge it's not using SQL Server transactions). So, for every record it works with, a copy is made out of the database, changes are made in memory, the database record is deleted, and then the record is re-inserted.
With this kind of behavior in db A, I'm not sure what it would do to log shipping or transactional replication. I do know that I want to minimize the changes required at the SQL CE instances, to keep the sync operation to a minimal cost.
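For what it's worth, here is a minimal sketch of what I understand that 'withdrawal-deposit' pattern to look like at the database (the Orders table and its columns are hypothetical, purely for illustration): each logical update arrives as a DELETE followed by an INSERT.

-- Hypothetical illustration of the application's access pattern (not its actual code).
-- 1. The app reads the row into memory:
SELECT OrderID, Status, Amount FROM Orders WHERE OrderID = 42;
-- 2. It modifies the values in memory, then deletes the original row:
DELETE FROM Orders WHERE OrderID = 42;
-- 3. ...and re-inserts the (possibly changed) row:
INSERT INTO Orders (OrderID, Status, Amount) VALUES (42, 'Shipped', 100.00);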
Can you describe the best (or your preferred) method for updating data held in a related table using Visual Studio 2005 and SQL Server? For example, if you had a stock control system with the product names and current stock levels in one table and all stock movements in and out held in another table, what is the best, fastest, safest and most reliable method of inserting a stock movement and then updating the current stock level? I have tried a couple of different methods but would really appreciate a wider range of opinions. Thanks.
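One pattern that seems to fit (just a sketch; the procedure, table, and column names below are made up) is to wrap the movement insert and the stock-level update in a single transaction so they succeed or fail together:

-- Sketch only; names are placeholders for your actual schema.
CREATE PROCEDURE AddStockMovement
    @ProductID INT,
    @Quantity  INT            -- negative quantity for stock going out
AS
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO StockMovements (ProductID, Quantity, MovementDate)
    VALUES (@ProductID, @Quantity, GETDATE());

    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRANSACTION;
        RETURN;
    END

    UPDATE Products
    SET    CurrentStock = CurrentStock + @Quantity
    WHERE  ProductID = @ProductID;

    IF @@ERROR <> 0
        ROLLBACK TRANSACTION;
    ELSE
        COMMIT TRANSACTION;
END

The design point is simply that the movement row and the running total can never drift apart, because neither change is committed without the other.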
I have a DataSet which I am holding in .NET, and I would like to know the quickest way to get the DataSet into a table on SQL Server. Any sample code would be great.
Right now I am leaning towards joining a temp table that pulls my aggregates in and then joining on metric id and year-month. But I noticed my boss's boss, who knows this stuff a lot better than I do, seemed to do direct inserts, dropping that whole range.
Same criteria but a different approach. I like updating the new results rather than dropping or deleting the contents. I like using temp tables too; they make it easy to just SELECT INTO them and join off of them onto the destination table that will be updated.
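For the record, the update-in-place approach I have in mind looks roughly like this (a sketch only; #Agg, SourceFacts, MetricTotals, and the column names are made up):

-- Sketch: build the aggregates in a temp table, then update the destination by joining on the keys.
SELECT  MetricID,
        YearMonth,
        SUM(Amount) AS Total
INTO    #Agg
FROM    SourceFacts
GROUP BY MetricID, YearMonth;

UPDATE  mt
SET     mt.Total = a.Total
FROM    MetricTotals AS mt
JOIN    #Agg AS a
        ON  a.MetricID  = mt.MetricID
        AND a.YearMonth = mt.YearMonth;

-- Optionally insert any key combinations that don't exist in the destination yet:
INSERT INTO MetricTotals (MetricID, YearMonth, Total)
SELECT  a.MetricID, a.YearMonth, a.Total
FROM    #Agg AS a
WHERE   NOT EXISTS (SELECT 1 FROM MetricTotals AS mt
                    WHERE mt.MetricID = a.MetricID AND mt.YearMonth = a.YearMonth);

DROP TABLE #Agg;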
I have a problem. I am working on a DTS package. The last step updates a table field. I wrote a stored procedure as below:
CREATE PROCEDURE [Update_product_manufacturer] AS
DECLARE @product_id INT
DECLARE @supplier_name VARCHAR(255)

DECLARE ValueCursor CURSOR FOR
    SELECT [product].[product_id], [P21_SUPPLIER_id_name_ke].[supplier_name]
    FROM [VARIANT], [P21_INV_MAST_uid_itenID_weight_ke], [product], [P21_INVENTORY_SUPPLIER_uid_supplierID_price_cost_ke], [P21_SUPPLIER_id_name_ke]
    WHERE [product].[product_id] = [VARIANT].[product_id]
      AND [P21_INV_MAST_uid_itenID_weight_ke].[item_id] = [VARIANT].[SKU]
      AND [P21_INV_MAST_uid_itenID_weight_ke].[inv_mast_uid] = [P21_INVENTORY_SUPPLIER_uid_supplierID_price_cost_ke].[inv_mast_uid]
      AND [P21_SUPPLIER_id_name_ke].[supplier_id] = [P21_INVENTORY_SUPPLIER_uid_supplierID_price_cost_ke].[supplier_id]
    ORDER BY [product].[product_id]
    FOR READ ONLY

OPEN ValueCursor
WHILE (0 = 0)
BEGIN
    FETCH NEXT FROM ValueCursor INTO @product_id, @supplier_name
    UPDATE product SET manufacturer = @supplier_name WHERE product_id = @product_id
END

CLOSE ValueCursor
DEALLOCATE ValueCursor
Notes: the Product table has 28,000 rows; the other tables have 28,000 - 56,000 rows.
It's been 2 hours and the job is still running.
Has anyone had this kind of experience? How can I make the update run quickly?
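(If it helps anyone answering: as far as I can tell, the WHILE (0 = 0) loop above never checks @@FETCH_STATUS, so it keeps re-running the last fetch forever. What I would like to compare it against is a single set-based UPDATE over the same joins, roughly like the untested sketch below, which reuses the table names from the procedure.)

-- Untested sketch: one set-based UPDATE instead of the cursor, using the same joins.
UPDATE p
SET    p.manufacturer = s.supplier_name
FROM   [product] AS p
       JOIN [VARIANT] AS v
            ON v.[product_id] = p.[product_id]
       JOIN [P21_INV_MAST_uid_itenID_weight_ke] AS im
            ON im.[item_id] = v.[SKU]
       JOIN [P21_INVENTORY_SUPPLIER_uid_supplierID_price_cost_ke] AS invsup
            ON invsup.[inv_mast_uid] = im.[inv_mast_uid]
       JOIN [P21_SUPPLIER_id_name_ke] AS s
            ON s.[supplier_id] = invsup.[supplier_id]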
I am purchasing a new/first server and could use some help with the details.
I am purchasing the server with the intent of managing a large database that will be quite extensive and requires a good amount of processing power. I have decided to go with Windows Server 2003 and SQL Server 2000 as the database. Within the next year I hope to have this database feeding directly into a website that I could possibly be hosting as well, with 2-3 offsite employees logging into the system remotely.
I would say my biggest question is whether to choose the RAID 1 configuration or RAID 5. I want the hard drives to mirror each other. I was thinking of going with three hard drives, but I'm not really sure if I would even need that setup. With that, I will just show my current system:
Dell PowerEdge 1800
3.0 GHz Xeon, 2 GB memory, CERC 6-channel SATA RAID controller (SATA RAID 1), 2 x 160 GB hard drives, onboard NIC network adapter
I'm going price-savvy on this one, so no UPS, redundant power supplies, or tape backup. Although I am open to any suggestions.
Definitely appreciate any help with this, as I have been hard pressed to find quality reseller help. They just want to throw the biggest and baddest thing at me.
I need to choose a database based on the following criteria (using a .NET app): 1) a light but fully functional database, preferably with support for stored procedures and constraints, less than 8,000 transactions a day; 2) portable, or the database can be exported/imported very easily; 3) reliable and stable; 4) least maintenance.
I have two databases in mind, Access and MSDE. Does anyone have some hands-on experience with the above two? Or any other better suggestions?
I am choosing a primary key for the database which I am designing.
I have a few tables which contain 5-15 fields; of these, 3-9 columns combined form the uniqueness of a row.
All are unrelated tables. Three parent tables connect with 20 unrelated child tables.
I believe it would not be a wise choice to use 3 to 9 fields for the primary key. But if I use an auto-increment column as the key, will it be of any use, since it might never be used to fetch rows? Then why should I still go with it?
Or is it OK to create a primary key of up to 5 attributes?
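In case a concrete illustration helps anyone answering: the surrogate-key option I am weighing would look something like this (the table and column names are hypothetical), with the natural 3-9 columns still protected by a unique constraint:

-- Sketch with made-up names: an identity surrogate key plus a unique constraint on the natural key.
CREATE TABLE ChildTable (
    ChildID   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,   -- surrogate key, small and stable
    ColA      VARCHAR(50)  NOT NULL,
    ColB      VARCHAR(50)  NOT NULL,
    ColC      INT          NOT NULL,
    OtherData VARCHAR(255) NULL,
    CONSTRAINT UQ_ChildTable_Natural UNIQUE (ColA, ColB, ColC)  -- the columns that really define uniqueness
);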
Hi, I have a SQL Server database and am writing web apps to access it. I've created databases different ways and ended up with different owners (e.g. dbo, NT AUTHORITY\NETWORK SERVICE...). I also have some connection strings using Windows authentication and some using a user name and password. I have read that using Windows authentication is the best way to go as far as security is concerned, but I have noticed some connectivity issues when I upload the site to the server and test it remotely. What is the safest 'owner' for the database, and what's the safest way to connect? Thanks, Dan
Hi, I am trying to write a method which needs to call a stored procedure and then get the response of the stored procedure back into a variable I declared in the method.

private string GetFromCode(string strWebVersionFromCode, string strWebVersionString)
{
    // call stored procedure
}

strWebVersionFromCode = GetFromCode(strFromCode, "web_version"); // strWebVersionFromCode will hold the response

How should I do this? Please assist.
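On the database side, one shape the stored procedure could take (just a sketch; the procedure, table, and column names are guesses, not from the original post) is an OUTPUT parameter, which the calling code would then read back through a SqlParameter whose Direction is set to Output:

-- Hypothetical sketch of a procedure that hands a single value back through an OUTPUT parameter.
CREATE PROCEDURE GetWebVersion
    @SettingName NVARCHAR(50),
    @Value       NVARCHAR(255) OUTPUT
AS
BEGIN
    SELECT @Value = SettingValue          -- Settings/SettingValue are made-up names
    FROM   Settings
    WHERE  SettingName = @SettingName;
END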
Hi, I just have a DataSet with my tables and that's it. I have a GridView with several pieces of data on it; there is no problem getting the data or inserting, but as soon as I try to delete or update some records the local machine throws the same error: "Unable to find nongeneric method...". I've tried creating an Update query in my table adapters but it's still not working. I also tried removing the original_{0} and got the same error. Please help if anyone has a solution.
Hi, I have a table in which I need to store times. What is the best way of doing this? The only way I can think of so far is to save the times as a datetime such as '1/1/1900 03:00 PM' and then format the date as "hh:mm tt" so that the day isn't displayed. Is this the best method? Thanks, Curt.
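For reference, the workaround I'm describing would look roughly like this (a sketch; Appointments and StartTime are made-up names), with the date part pinned to the base date 1900-01-01 and the time pulled back out with CONVERT:

-- Sketch: store only the time-of-day in a datetime column, anchored to 1900-01-01.
CREATE TABLE Appointments (
    AppointmentID INT IDENTITY(1,1) PRIMARY KEY,
    StartTime     DATETIME NOT NULL        -- e.g. '19000101 15:00'
);

INSERT INTO Appointments (StartTime) VALUES ('19000101 15:00');

-- Return just the hh:mm:ss portion (24-hour clock) when querying:
SELECT CONVERT(VARCHAR(8), StartTime, 108) AS TimeOnly
FROM   Appointments;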
I have a table that is corrupted, and I want to remove it and add a backup version of it. How can I remove this table and add it again, preserving all the foreign key constraints, permissions, dependencies, etc.? Simply exporting and importing does not work. I could painfully remove the table and then painfully reconnect it, recreating all the foreign key constraints, etc. by hand; but there has to be an easier way! What is the how-to?
I have a process that restores a backup from a primary server to a backup server daily. When doing the restore, sometimes it fails (for various reasons).
I have coded a job to set the database offline, set it online, and then do the restore:
RESTORE DATABASE [xxx] FROM DISK = N'D:\Backup Staging\xxx.bak' WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
Sometimes it fails to bring the database back online, and there are other errors as well. Is there a reliable method of doing this?
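One alternative I have seen suggested (a sketch only, not tested here): instead of taking the database offline and online, force it into single-user mode to drop any connections, run the restore, then return it to multi-user:

-- Sketch: kick out existing connections, restore, then reopen the database.
ALTER DATABASE [xxx] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE DATABASE [xxx]
FROM DISK = N'D:\Backup Staging\xxx.bak'
WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10;

ALTER DATABASE [xxx] SET MULTI_USER;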
I wanted to set up a mechanism that would transfer blocks of records (a few dozen to, in rare cases, a few thousand), with slight modification, from one database to another. It's a sort of custom partial archiving process that would be triggered from a web-based admin application in plain old ASP (not .NET, alas). Records in the target db would be identical except:
-- the primary key in the source table, an identity field, would be just an integer in the target table
-- the target table has an extra field, an integer batch ID supplied by the web application that triggers the process
It's a simple, if not efficient, matter to do it within the web application: query the source table, suck the records into memory, and insert them one by one into the target db. This will be an infrequent process which can be done at off-hours, so a bit of inefficiency is not the end of the world. But I wondered if there is a more sensible, orthodox approach:
-- Could this process be done, and done efficiently, as a stored procedure with the batch ID passed as a parameter?
-- Is there any way to do a bulk insert from a recordset or array in memory using plain ASP, ADO and SQL? And if so, is that better than inserting records one by one?
I realize that the ASP.NET TableAdapter and DataSet features might provide a good solution, but in the short run I can't rewrite the whole application. Advice on the best general approach from an ASP-ADO platform would be appreciated, and I will try to figure out the details.
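To make the first option concrete, a stored procedure along these lines is what I had in mind (a sketch only; the database, table, column names, and selection criteria are placeholders, and it assumes both databases live on the same SQL Server instance):

-- Sketch: set-based copy from the source database to the archive database, stamping each row with the batch ID.
CREATE PROCEDURE ArchiveRecords
    @BatchID INT,
    @FromID  INT,          -- start of the identity range to copy (placeholder criteria)
    @ToID    INT
AS
BEGIN
    INSERT INTO ArchiveDb.dbo.TargetTable (SourceID, Col1, Col2, BatchID)
    SELECT RecordID,       -- the source identity value, stored as a plain integer in the target
           Col1,
           Col2,
           @BatchID
    FROM   SourceDb.dbo.SourceTable
    WHERE  RecordID BETWEEN @FromID AND @ToID;
END

The ASP page would then just execute the procedure through ADO, passing the batch ID (and whatever selection criteria you actually use) as parameters, rather than looping over a recordset and inserting row by row.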
I developed a console application that continually checks a message queue to watch for any incoming data that needs to be inserted into a MS SQL database. What would be a low-cost method I could use inside this console application to make sure the MS SQL database is operational before I perform the insert?
I have made a small ASP.NET project which uses a local database file as part of the project. The project is running fine on my local machine, but when I upload it to the remote server, the login fails for the server.
I suspect this can be solved by using SQL Server authentication, but I have now spent a lot of time trying to configure the database file to use this authentication mode. As I see it there are three possible solutions to the problem:
1. Use Management Studio Express to configure the local MDF file (except that I can't find out how to connect to the MDF file) and from there change the authentication method to SQL Server authentication.
2. Use Visual Web Developer to change the authentication method (but how?).
3. Make Windows authentication work on the server (this would probably require Management Studio Express to connect to the remote database, which is the same problem as no. 1).
Hi, it has been a few days since I posted my question here, but I received no answer. Maybe the problem is just my problem, maybe I put my question in some strange way. OK, I'll try to put it again, more simply. I have a few textboxes whose values I need to transport to the database. I set up a SqlDataSource, parameters... and used the SqlDataSource.Insert() method. I got NULL values in the database record. So I tried to find the problem by using Microsoft's sample code from http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.sqldatasource.insert.aspx. After some changes I tried that code and everything went well; data were put into the database. The next step was to separate the code-behind and the structure of the page into two files, followed by a new test. Good again, data were delivered to the database. Next step: to use a MasterPage, very simple, with just one ContentPlaceHolder. After this step the program stopped delivering data to the database and delivers only NULLs to the new record. It is exactly the same problem which I have found in my application. The non-working code is here: http://forums.asp.net/thread/1437716.aspx. I cannot find any answer to this problem on forums worldwide, and I cannot believe it is only my problem. I compared the HTML code of the two generated pages, with MasterPage and without. There are differences in the ids of the input fields generated by the .NET Framework.

Without MasterPage:

<input name="NazevBox" type="text" id="NazevBox" /><span id="RequiredFieldValidator1" style='color:Red;visibility:hidden;'>Please enter a company name.</span><p><input name="CodeBox" type="text" id="CodeBox" /><span id="RequiredFieldValidator2" style='color:Red;visibility:hidden;'>Please enter a phone number.</span><p><input type="submit" name="Button1" value="Insert New Shipper" onclick="javascript:WebForm_DoPostBackWithOptions(new WebForm_PostBackOptions("Button1", "", true, "", "", false, false))" id="Button1" />

With MasterPage:

<input name="ctl00$Obsah$NazevBox" type="text" id="ctl00_Obsah_NazevBox" /><span id="ctl00_Obsah_RequiredFieldValidator1" style='color:Red;visibility:hidden;'>Please enter a company name.</span><p><input name="ctl00$Obsah$CodeBox" type="text" id="ctl00_Obsah_CodeBox" /><span id="ctl00_Obsah_RequiredFieldValidator2" style='color:Red;visibility:hidden;'>Please enter a phone number.</span><p><input type="submit" name="ctl00$Obsah$Button1" value="Insert New Shipper" onclick="javascript:WebForm_DoPostBackWithOptions(new WebForm_PostBackOptions("ctl00$Obsah$Button1", "", true, "", "", false, false))" id="ctl00_Obsah_Button1" />

In the second case the ids of the input fields have different names, but I assume that is the internal business of the .NET Framework. There must be something I haven't noticed, maybe a .NET bug, maybe my own. Thanks for any suggestion.
I am wondering whether we can back up databases to any place outside of the local server system. As far as I can tell, we can only back up a database to the local server system, but we need to share databases on network places. Is there any method to back up the database to a network place directly, rather than having to back up the database on the local server system first and then copy it to the network place? That just sounds really inconvenient.
Thanks a lot in advance for your help and I am looking forward to hearing from you shortly.
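In case it is useful to anyone answering: what I am hoping for is something along the lines of the sketch below, i.e. backing up straight to a UNC path (the server, share, and database names are made up, and I assume the SQL Server service account would need write permission on that share):

-- Sketch: back up directly to a network share instead of a local drive.
BACKUP DATABASE [MyDatabase]
TO DISK = N'\\FileServer01\SqlBackups\MyDatabase.bak'
WITH INIT, STATS = 10;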
I have my first small SQL Server 2005 database developed on my local server, and I also have its equivalent as an online database. I wish to update the local database (using an ASP.NET interface) and then upload the data (at least the amended data, but given the small size, all the data should be no trouble) to the online database. I think replication is the obvious answer, but I have no experience of it and I am wondering what else I might use that might be less complicated. One solution is DTS (using SQL 2000 terms), but I am not sure if I can set this up (1) to overwrite existing tables and (2) not to seemingly remove identity attributes from fields set as identities. I know there are other possibilities, but I would be glad of advice as to the likely best method for a small database updated perhaps once weekly or at less frequent intervals. Best wishes, John Morgan
How do I insert data that I have collected in a local database into a table in my online (i.e. hosted) database, which is on a different server?
At the moment I am just uploading all the data to the hosted DB, but this is wasting bandwidth as only a small percentage of the data is actually selected and used.
I thought that if I used a local DB and then updated the table on my hosted DB this would be much more efficient, but I am not sure how to write the SQL code to do this!
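For what it's worth, the kind of statement I imagine is sketched below, with made-up names; it assumes the hosted server can be reached as a linked server from the local one, which the host may or may not allow:

-- Sketch: push only the rows that haven't been uploaded yet to the hosted server via a linked server.
-- [HostedServer], HostedDb, MyTable, and the Uploaded flag are all hypothetical names.
INSERT INTO [HostedServer].HostedDb.dbo.MyTable (Col1, Col2)
SELECT Col1, Col2
FROM   dbo.MyTable
WHERE  Uploaded = 0;

UPDATE dbo.MyTable
SET    Uploaded = 1
WHERE  Uploaded = 0;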
tblJobDates serves two purposes: to give us the most recently entered due date for a job, and to serve as a "repository" to track changes to the due date.
Report C: The report I want to generate does NOT provide historical information... it only serves to show the CURRENT due date for each job in the tblJobs table: --------------------------------------------
COLUMNS: LocationName Due Date (alias of DateData)
OUTPUT (csv):
Jonesport ME, 6/8/2002
Garden City NY, 6/13/2002
Note that for Jonesport, an initial due date of 6/17/2002 was entered (based on the CRD). Then someone changed it so that the job was due EARLIER.
Note that for Garden City, an initial due date of 6/12/2002 was entered (based again on the CRD). Then someone changed it so that the job was due LATER.
The "most recently entered due date" is what should be reflected in my report -- just as it does above ("C")
Other Notes:
-- There are other columns of information from both tables that I would like to return, but the above is the most basic form of my request. Most notably, we would need to return the JobPK in report (C).
-- A job should only appear ONCE in report (C), with its "current" due date, regardless of the other due dates that may have been entered for that job.
-- If a job has no due date, it should not appear on the report.
-- Although not shown here, each row in (B) DOES have a unique identifier (DatePK) as well... if that helps in your solution.
-- Note that the job that is "due first" appears at the top of report (C). This allows a person looking at the report to quickly determine which job "gets priority" -- the one on top!
Okay gurus -- how should the query look that would generate the desired output in Report C?
THANKS IN ADVANCE if you can even point me in the right direction!!
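In case it helps, here is the shape of query I've been toying with (a sketch only; it guesses that tblJobDates has a JobFK column pointing at tblJobs.JobPK, that LocationName lives on tblJobs, and that a higher DatePK means the due date was entered more recently):

-- Sketch: one row per job, showing only the most recently entered due date, with the job due first at the top.
SELECT  j.JobPK,
        j.LocationName,
        d.DateData AS [Due Date]
FROM    tblJobs AS j
JOIN    tblJobDates AS d
        ON d.JobFK = j.JobPK
WHERE   d.DatePK = (SELECT MAX(d2.DatePK)
                    FROM   tblJobDates AS d2
                    WHERE  d2.JobFK = j.JobPK)
ORDER BY d.DateData ASC;

The inner join also takes care of dropping jobs that have no due date at all.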
I need to decide between Standard and Enterprise Edition (cost is a criterion, but it's secondary to performance; besides, I am not paying for it myself). The server spec under consideration: dual Xeon, 1 GB RAM, 36 GB in RAID 1 (Dell PowerEdge 1850). Application: Windows 2003 Standard Server, ASP.NET, MS SQL Server 2000 based data-driven web application. Approximately 25 simultaneous clients. Peak activity would probably be 50 transactions/activities per second (2 per second per client). I expect the database size to grow up to 4 GB in 1 year. The application would use only basic OLAP features (if at all)... so feature-set wise I believe that Standard Edition is good enough. What I am concerned about is that the MS documentation says Standard Edition is for "organizations that do not require the advanced scalability, availability, performance, or analysis features of the SQL Server 2000 Enterprise Edition". Is there a difference in performance between Std and Ent editions, in terms of the number of transactions per second that can be serviced? What other criteria should I be aware of before deciding to go one way or the other? Any ideas?
There must be a way to do this simply. We're running SQL Server 2000. I'm looking for some generic SQL statement that I can apply.
If I have a table with a person column and a location column and multiple records for the same person/location combination, how do I select each person with the location they most frequently visited? Say George visits Mexico 5 times, the Bahamas twice, and Costa Rica once; I would have 8 records in my table for George. The data looks something like this:
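One generic shape that should work on SQL Server 2000 (a sketch; Visits, Person, and Location are placeholder names for your actual table and columns) keeps, for each person, only the location(s) whose visit count is not beaten by any other location for that person:

-- Sketch: most frequently visited location per person (ties return more than one row per person).
SELECT   v.Person, v.Location, COUNT(*) AS VisitCount
FROM     Visits AS v
GROUP BY v.Person, v.Location
HAVING   COUNT(*) >= ALL (SELECT COUNT(*)
                          FROM   Visits AS v2
                          WHERE  v2.Person = v.Person
                          GROUP BY v2.Location);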
Hi all, I am using C# with ASP.NET 2003. I would like to know if there is any easy method to update a database with about 100 fields in it. At present, I pass all the values of the controls on the web form to the stored procedure as parameters, like: myCommand.Parameters.Add("@CustomerID", SqlDbType.Int).Value = txtCustomerID.Text. Like this, I add all 100 parameters. Is there any easier method to do it using a class or any other approach? Thanking you in advance, Tomy
Let's say you had a User table and one of the fields was called Deceased. It's a simple closed-ended question, so a bit value could be used to satisfy the field: the person is either dead or alive. Let's say another field is called EyeColor. A person can have only one eye color, and thus one answer should be stored in this value, so this is easy as well. Now, let's say I want to store all the languages that a specific user can speak. This isn't as easy as the previous examples since it's not a yes-or-no or a single-value answer. I haven't had much experience working with databases, so I've come up with two possible ways with my crude knowledge, hehe. In terms of inputting the multi-answer values, I suppose I could use a multiple-selection listbox, cascading dropdowns, etc. Now, here are the two solutions that came to mind:

1) Make a field called LanguagesSpoken in the User table. When I process the selections the user makes on the languages he knows, I can then insert into the LanguagesSpoken field a string like "English, Spanish, Czech", or IDs corresponding to the languages like "1, 5, 12" (these IDs would be referenced from a separate table, I guess). I would use commas so that later on, when I need to display a user's profile and show the user's languages, I can retrieve that long string from the LanguagesSpoken field and parse the languages on the commas. Using commas would just be a convention I use so I would know how to parse the data (I could have used "." or "|" or anything else, I guess).

2) Forget about the LanguagesSpoken field in the User table altogether, and just make a LanguagesSpoken table. A simple implementation would have 3 fields (primary key, userId, languageId). A row would associate a user with a language. So I would issue a query like "SELECT * FROM LanguagesSpoken WHERE userId=5" (where userId=5 is some user). Using this method would free me from having to store a string with delimited values in the User table and then parse the data when I need it. However, I'm not sure how efficient this method would be if the LanguagesSpoken table grows really large, since the userIds would NOT be contiguous and the search might take a long time. I guess I would index the userId field in the LanguagesSpoken table for quicker access?

OR, I may be going about this the wrong way and be way out in left field with these two solutions. Is there a better way other than those two methods? I haven't worked extensively with databases and I'm just familiar with the basics. I'm just trying to find out the best-practice implementation for this type of situation. I'm sure situations like this are very common in the real world, and I wonder how the professionals code this. Thanks in advance.
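For anyone answering, here is roughly what I picture option 2 looking like in SQL (just a sketch; the table and column names are my guesses, not settled):

-- Sketch of option 2: a junction table linking users to languages.
CREATE TABLE Users (
    UserId   INT IDENTITY(1,1) PRIMARY KEY,
    Deceased BIT NOT NULL DEFAULT 0,
    EyeColor VARCHAR(20) NULL
);

CREATE TABLE Languages (
    LanguageId INT IDENTITY(1,1) PRIMARY KEY,
    Name       VARCHAR(50) NOT NULL
);

CREATE TABLE LanguagesSpoken (
    UserId     INT NOT NULL REFERENCES Users(UserId),
    LanguageId INT NOT NULL REFERENCES Languages(LanguageId),
    PRIMARY KEY (UserId, LanguageId)   -- also acts as the index on UserId for lookups
);

-- "Which languages does user 5 speak?"
SELECT l.Name
FROM   LanguagesSpoken AS ls
JOIN   Languages AS l ON l.LanguageId = ls.LanguageId
WHERE  ls.UserId = 5;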
Please help me out: I have some records in a SqlDataSource and want to show them column-wise. Right now I do it with a DataList because it's easy, but other options are open. Every item/record should have a radio button (in a group, so that you can only choose one of them all). People advised me to do this with an HTML radio button inside the template. After the user has selected an item and chooses the next button, I need to know which item the user has chosen. Furthermore, when the user steps back, the same radio button should already be selected. Please help, this has been bothering me for a while. Best regards from The Netherlands, Gert
My company has a website that connects to a SQL Server (on a different box). I am trying to convince them to get SQL Server 2005. However, I do not know if SQL Server 2005 Workgroup Edition is OK for our needs. Can someone please tell me if it is? Basically, our setup is the following:
The SQL Server will only have one/two clients - the web server
I have to store some data on a remote server (MS SQL Server 2000). The scenario is like this:

1. The web application runs on a local machine. The user (who inputs data) uses it through the LAN.
2. The input should be stored in the remote server if the remote connection is OK; otherwise it should be saved in the local server's database (MS SQL 2000).
3. In the application's web.config there is a connection string pointing to the remote server and another (alternate) one pointing to the local server's database.

In a scenario like this I first test the remote connection; if it is not OK then I initialize the local server's connection, like this:

private MyConnection()
{
    try
    {
        connectionSql = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnForRemote"].ToString());
        connectionSql.Open();
    }
    catch (Exception ex)
    {
        connectionSql = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnForLocal"].ToString());
    }
    finally
    {
        connectionSql.Close();
    }
    connectionSql2 = new SqlConnection(ConfigurationManager.ConnectionStrings["Temp"].ToString());
}

My problem is that when the remote connection is lost, it takes almost 1 minute to store to the local database. How can I make it more time-efficient? Thanks.
Hello, I have a table with some data in it. What I want to do is to create a query that randomly returns one of the records of the table. Can this be done?
If this is not possible from SQL Server, I have thought of an alternative way. It is this:
I want to return all rows of the table with SELECT *, but I want the select to return in the first column an auto-increment number for each row, without the need to add an auto-increment field to the table, e.g.
Table
------
Banana
Tomatoe
Aple
...
Orange
Result from select
------------------
1  Banana
2  Tomatoe
3  Aple
...
23 Orange
Can this be done? At least this way 1) I can travel to the end of the results (from ASP), 2) read the ID of the last row, 3) create a random integer from 1 to that last ID, and 4) finally select the appropriate random row using that integer.
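If it helps anyone answering: the single-query behaviour I'm hoping for is apparently often written like the sketch below (MyTable is a placeholder for the real table name); I have not tested how it performs on a large table:

-- Sketch: return one random row straight from SQL Server.
SELECT TOP 1 *
FROM   MyTable
ORDER BY NEWID();

As I understand it, ORDER BY NEWID() shuffles every row before picking one, so it may be heavy on very large tables, but it avoids the whole travel-to-the-end workaround.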