Advice On Middleware Products For TRUE Scaling Out Of SQL Server
Apr 14, 2006
We have a 3 month old quad processor/dual core server running SQL
Server 2005 and already it is getting close to hitting the CPU wall.
An 8 way CPU box is prohibitively expensive and out of the question. I
am looking desperately for a way to TRULY scale out SQL server...in the
same way that IIS can be scaled out via App Center.
The "in the box" solution for SQL Server 2005 scaling out is the DMV.
Unfortunately this solution makes the system less available rather than
more (one server outage takes them all out for that table) and requires
serious rearchitecting of the software to use. Contrast this to IIS
and AppCenter where each added server makes the system more available,
and requires no rearchitecting to work.
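For context, a distributed partitioned view is just a UNION ALL view over member tables that live on linked servers — a minimal sketch (server, database, and table names here are hypothetical):

```sql
-- Each member server holds one key range of the Customers table,
-- enforced by a CHECK constraint so the optimizer can prune partitions.
-- Server names (SRV1, SRV2) and the schema are hypothetical.
CREATE VIEW dbo.Customers_All
AS
SELECT CustomerID, Name FROM SRV1.Sales.dbo.Customers_1  -- IDs 1 to 999999
UNION ALL
SELECT CustomerID, Name FROM SRV2.Sales.dbo.Customers_2; -- IDs 1000000 and up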
Before someone says "what you want can't be done in a
database"...Oracle has an application server middleware product that
lets you do both of the above. Just plug in a new server running Oracle,
and you've doubled your capacity. But SQL Server 2005 doesn't yet
have a similar capability.
So I read with great interest the following article that talks about
why this is the case with SQL Server. There are two issues that make
it very difficult to do:
http://www.sql-server-performance.c...ability_availab...
You can create a crude pool using replication, but the performance
times look horrendous.
However, the article also talks about the latest developments in this
field...specifically MIDDLEWARE that can create a scale out solution
that is more available and that requires simply adding new servers to
scale up.
I found two companies which seem to offer this new capability:
http://www.metaverse.cc/newsevents.asp?cid=17999
and
http://www.pcticorp.com/product.aspx
Both companies appear to have patents or a patent pending on the
process. I tried to contact metaverse but got no reply, despite their
recent press release. I just emailed Pcticorp today to see if I could
learn more about their product.
My question for this group is:
Does anyone have experience with either of the two products (or any
others that provide this capability)?
I would like to get some advice on Visual Studio and the Business Intelligence products. I currently work with SQL Server 2005, and I created an export package which I turned into an SSIS package. I would like to be able to change the SSIS package when I need to, and I read that I needed Business Intelligence Development Studio to modify the package. I installed the Business Intelligence components from our SQL Server install disks and still didn't see the Business Intelligence app in my Program Files. We have another server running SQL 2005 where I can modify SSIS packages with BI, but it has Visual Studio loaded on it. Do I need Visual Studio to use BI?
Hi, assuming a data-driven web application is used by a huge number of customers, we can set up load-balancing infrastructure where multiple web servers run to process all requests. In this case, the database might run on its own machine. But what can be done if one machine is not enough for the database anymore? Is it possible to build some kind of database cluster? And if yes, how does the application work with such a cluster? Is there still ONE connection string, or is it necessary to query multiple SQL servers? Long story short: can a SQL Server database cluster be handled as ONE SQL Server instance from the outside, even if it is hosted on multiple machines? And if yes, what edition of SQL Server 2005 is required? Regards, Marco Buerckel
I have a cube that has a dimension set up with several values, some of which are bools. While browsing in Excel or SSMS, two new values, when used as a filter, show (All), (Blank), and (True) for selections instead of (All), (True), and (False).
In our current project, we are attempting to use Service Broker as the middleware for a "queuing" solution that helps in throttling and load-balancing incoming messages. These messages could potentially be either long-running jobs or simply an incoming flood of messages received from our integration web services that are called upon via BizTalk.
For the posting of messages onto the "to do" queue, we are implementing the "fire & forget" pattern that has been discussed here previously and on Remus' blog. This aspect of the design seems to work fine for us. Currently we have this set up where activation is occurring on the target queue, but we don't want to hold open a thread there on the server while waiting for our synchronous calls to various web services on the processing "farm" to complete.
The next evolution of our design is to try to move activation off of the primary SQL cluster itself (i.e. activation is currently happening on the clustered SQL boxes) and onto the individual processing nodes. For this model, we are looking at using SQL Express on each of the nodes, as has been suggested here on the forums for other similar scenarios.
For resilience to any node failures, we do not want to "route" the messages to individual nodes hosting their own queues, but rather have those nodes do a "read" from the primary queue, perform the per-message processing, and issue either a committed EndDialog or EndDialogWithError based on the success of processing each task/message.
To invoke the processing on each of the nodes, we need some mechanism to send a "wake up" and trigger the "reads", since no message is sent to the node itself for any form of activation based on a queue that receives the actual "job". Along the same lines, we are considering having a "wake up" queue on the nodes, where a simple "wake up" message could be sent to all nodes/queues, and activation on those queues would then invoke the processing "loop" for each node.
My question is how best to establish this "wake up" call to each node. I think I've read about a queue that is internal to Broker itself that receives a message when new items arrive in any queue. My initial thought is to put activation on that queue and have a procedure that sends the "wake up" to each of the nodes in our processing farm.
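As a sketch of the wake-up-queue idea described above (all object names here are hypothetical, and the message carries no payload — it is purely a signal to start the node's processing loop):

```sql
-- A minimal "wake up" queue with internal activation on a node.
-- Names (WakeUpQueue, WakeUpService, dbo.ProcessBacklog) are hypothetical;
-- dbo.ProcessBacklog would RECEIVE the signal and then drain work
-- from the primary queue.
CREATE QUEUE dbo.WakeUpQueue
    WITH STATUS = ON,
    ACTIVATION (
        STATUS = ON,
        PROCEDURE_NAME = dbo.ProcessBacklog,
        MAX_QUEUE_READERS = 1,
        EXECUTE AS OWNER);

CREATE SERVICE WakeUpService
    ON QUEUE dbo.WakeUpQueue
    ([DEFAULT]);
```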
I am looking for any input where others have attempted to solve this type of problem with Broker.
Hi, I tried to install SharePoint Server 2007 as a standalone on my computer. When I run the SharePoint Products and Technologies Configuration Wizard, I get the following error:
Failed to create sample data.
An exception of type System.Data.SqlClient.SqlException was thrown. Additional exception information: Cannot open database "SharedServices2_DB_28aa01d8************************" requested by the login. The login failed. Login failed for user "domain\Administrator"
I am trying to pivot my data results so that instead of showing multiple rows for each product a client has, it will show one row per client with all of their products concatenated together.
For example, if I run the following query:
SELECT [Year], [Client_Name], [Product_Name] FROM My.dbo.Table GROUP BY [Year], [Client_Name], [Product_Name]
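Prior to SQL Server 2017's STRING_AGG, the usual trick for this is a correlated subquery with FOR XML PATH — a sketch against the table from the query above:

```sql
-- Concatenate each client's products into one comma-separated column.
-- STUFF(..., 1, 2, '') strips the leading ", " from the built string.
SELECT t.[Year],
       t.[Client_Name],
       STUFF((SELECT ', ' + t2.[Product_Name]
              FROM My.dbo.[Table] t2
              WHERE t2.[Year] = t.[Year]
                AND t2.[Client_Name] = t.[Client_Name]
              FOR XML PATH('')), 1, 2, '') AS Products
FROM My.dbo.[Table] t
GROUP BY t.[Year], t.[Client_Name];
```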
I have a web application for which I need to come up with a scalability plan. The application right now is running on 2 computers: the front end is running Tomcat with JSP (Java Server Pages), and the back end is MS SQL 2000.
The database is running fine now, but the file size and CPU utilization are growing, and I need to figure out a way to distribute this database over several machines.
I looked at MSCS (Microsoft Cluster Service) and setting up a cluster, but I think that solution will NOT work in an intensive enterprise environment. (Am I wrong?) For one thing, there is a hard limit of 16 nodes in a cluster. Also, it requires a shared RAID or Fibre Channel array. These solutions are very expensive, and it's not easy to add capacity once you reach the maximum of the array. For example, if you have a Fibre Channel array with 8 terabytes of capacity and your DB uses up all 8 terabytes, you have to buy a whole new enclosure.
One of the major objectives I have is to minimize cost and support unrestricted database growth.
Here is what I would like to achieve. If any of you know of a solution or something that YOU have USED which might give the functionality I require .. I would greatly appreciate your feedback.
Wish List:
1) Run single SQL Instance on regular PCs, which host 1 or More Tables (Table level partitioning) of a very large database.
2) Distributed JOINs on Tables residing on different physical PCs
3) Multiple Machines hosting same table (Fault Tolerance)
4) Single Virtual IP for database to Application Front End.
-----------
I looked at C-JDBC, which is open source middleware that can do some of the things above. I liked it, but it doesn't seem to support distributed JOINs, which means that our application - which requires quite a few table JOINs - won't be able to achieve table-level partitioning.
Can anyone recommend any clustering middleware that provides these capabilities?
I'm currently developing an ASP.NET site with SQL Server 2005 Standard, and I'd like to ask a question about the future of the database. It needs both continuity and performance. I'm thinking about replication or mirroring for continuity and table partitioning for performance. I admit I've never done any of those before; I'll learn about them, but they're not needed at this time. The question is: while designing the database now, is there anything I should do in consideration of the features I'm thinking of implementing later? For example, I'm using Identity in my tables, but I've heard about identity conflicts when using replication with identity columns, so I'm thinking of using GUIDs. But now I fear the GUID column index itself will be the slowdown factor in the first place.
Any suggestions to consider? I'd appreciate any opinions.
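For reference, the usual mitigation for GUID index slowdown in SQL Server 2005 is NEWSEQUENTIALID() as the column default, which avoids the random page splits that NEWID()-keyed clustered indexes suffer — a sketch with a hypothetical table:

```sql
-- Sequential GUIDs keep inserts appending at the end of the
-- clustered index instead of splitting random pages.
-- Table and constraint names are hypothetical.
CREATE TABLE dbo.Orders (
    OrderID uniqueidentifier NOT NULL
        CONSTRAINT DF_Orders_OrderID DEFAULT NEWSEQUENTIALID()
        CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED,
    OrderDate datetime NOT NULL
);
```

Note that NEWSEQUENTIALID() can only be used in a DEFAULT constraint, not called directly in a query.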
Hi, my application is about to scale up significantly, and it seems that SSB could be very useful to help me scale it right; I especially like the multiple-readers feature.
So, here's the deal: my application is about to get lots of records (peaks could get up to 300k records per hour). Each record must be processed separately, but records can be processed in parallel, as there is no dependency between rows (gut feeling tells me not to overuse parallelism, but I'll do my tests). It takes about 10-15 ms to process a single row; I don't mind sliding work to non-peak hours if needed. Each record consists of about 10 columns.
My questions:
1. Is SSB the right solution for a buffer queue? Is any well-known use case similar to my problem?
2. What is the fastest/easiest way to serialize/deserialize each record?
I would prefer not to use CLR integration, due to performance concerns, and stick with T-SQL for now. I don't necessarily prefer XML serialization; if binary serialization works faster, that's fine with me.
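In plain T-SQL, the simplest way to serialize a row into a Service Broker message body is FOR XML on the way in and xml.value() on the way out — a sketch with hypothetical table and column names:

```sql
-- Serialize one row to xml; the resulting @body can be passed to SEND.
-- dbo.IncomingRecords and its columns are hypothetical.
DECLARE @body xml =
    (SELECT r.Col1, r.Col2
     FROM dbo.IncomingRecords AS r
     WHERE r.RecordID = 42
     FOR XML PATH('record'), TYPE);

-- Deserialize on the receiving side.
SELECT @body.value('(/record/Col1)[1]', 'int')         AS Col1,
       @body.value('(/record/Col2)[1]', 'varchar(50)') AS Col2;
```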
I'm upgrading an application that uses a VBScript/ASP front end and MS Access back end. The application has many places that expect fields to be True or False.
E.g. SELECT * FROM MyTable WHERE Active=True
However, since SQL Server requires 0 or 1 for bit fields, this query keeps failing. SQL Server (2005 Express) thinks True is a column name and is not automatically converting "True" to 1.
It would be very difficult to re-write the application to do this at the VB level. Is there a way to make SQL Server do this conversion automatically?
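One workaround (a hack, but it leaves the legacy SQL untouched) is to add computed columns named True and False to each affected table, so that the bare identifiers in the old queries resolve to constant columns — a sketch:

```sql
-- After this, SELECT * FROM MyTable WHERE Active = True works,
-- because [True] resolves as a column whose value is always 1.
ALTER TABLE dbo.MyTable
    ADD [True]  AS CAST(1 AS bit),
        [False] AS CAST(0 AS bit);
```

Be aware that SELECT * will now also return the two extra columns, which may matter to code that iterates over fields by position.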
I have a pivot chart that displays 3 data series. Series 1 values range from 17 to 106. Series 2 values range from 1 to 18. Series 3 values range from 0 to 1.
When I display all three series on the chart, the value axis labels run from 1 to 120 with major unit lines at intervals of 20. When I remove Series 1, the chart automatically adjusts itself and the value axis labels run from 1 to 20 with lines at intervals of 2, thus nicely accommodating the value ranges of Series 2 and Series 3.
All this is fine. But when I remove Series 2 as well, leaving only Series 3, the labels run from 1 to 1.2 with interval spacing of .2. Since these values are counts, it makes little sense to show decimal values.
Can anyone provide advice on how to keep these axis labels from showing decimal values if a user removes Series 1 and Series 2? Thanks. - Bob
Read-only transactional replication to about 60 clients. 100 or so transactions a day (over 8 hours); a transaction (record) is probably about 8K. Reliable connection over 100Base-T. Latency of 15 minutes would be great.
Is this doable? I couldn't find a timing/sizing model anywhere.
We use timed subscriptions to do almost all of our reporting. Reports are delivered (primarily via e-mail and printer) once they are completed and users don't have to "watch the pot boil" so to speak.
Apparently SSRS has some load balancing capability whereby it lets only a limited number of threads/reports run concurrently. We often reach this max and lock ourselves up on some very long-running reports, causing other important reports to wait a long time.
We've added some operational reports (i.e. document prints) to the mix. These reports run off of OLTP data. They are very fast and very high priority. Waiting on them is not an option. Is there some way we can get SSRS to work on these operational reports in preference to other types of reports (e.g. "just for kicks" reports)? I think we'd almost like to add another SSRS server and dedicate it to the operational reports. Ideally the new SSRS server would use the same Report Server database but would only work on subscriptions for certain documents.
Has anybody else tried to solve this problem? This MS document doesn't really address subscriptions or load balancing by report: http://www.microsoft.com/technet/prodtechnol/sql/2005/pspsqlrs.mspx
Hi all, I use the following connection string to connect to SQL Server: server=(local); database=mydata; Integrated Security=true. If I browse the website in the VS Web Developer 2005 environment, it's OK. After I compiled the site and accessed it through the IIS server, it cannot connect to SQL Server; I need to use a login and password to connect. How can I connect without using a login ID and password? Thanks
Hello, I have a Products table, and I want to add an image data type for one of its columns (Picture1, for example). I want to know what is best: to store the picture in the database, or store only the path to the picture? If I store only the path, should I take it from the output parameters of the upload function in ASP and then add it to the database? Can I put the file in a special folder, for example MySiteUserName + file name?
And another thing: let's say I have Promotion (tinyint) to store the promotion value of a product. If I show products using a DataList, could I order my products first by Promotion and then by date added? Could I use a special CssClass (font weight or other background) for the products which have a Promotion greater than 10, for example? And how can I know the exact date and time (yy/mm/dd hh:mm)? Is it taken from the server date? Thank you
For the last 2 years we have been using ArcServe for Windows NT (version 6.0 and 6.5) with the Backup Agent for SQL Server. We are now re-evaluating our backup software. Our primary issues with ArcServe are that it has been unstable, unreliable and it does not support the full SQL Server recovery functionality. ArcServe does not include the ability to do point-in-time database recovery. Nor does it support single table restores (when a full database backup was performed). These last 2 items are critical and necessary features at my installation.
What products do people recommend for performing Windows NT file and MS SQL Server backups?
Say I want to get the top 3 products that fall under category "J2" and have them ordered by ranking; then I would use the query below:
SELECT DISTINCT TOP 3 mid, pid, pname, prank, mname
FROM Products, Manufacturers, Categories
WHERE Products.mid = Manufacturers.mid
  AND (cat1=2 OR cat2=2 OR cat3=2)
ORDER BY ranking
Now what if I need the top 3 ranked products that fall under category "J2", ordered by ranking, selecting only one product from each manufacturer? So in other words, I want the top-ranked product from each manufacturer.
What would that query be?
I am having a really tough time and have already spent quite some time on it unsuccessfully. I would appreciate any help.
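On SQL Server 2005 and later, "top N per group" is usually written with ROW_NUMBER() — a sketch reusing the tables from the query above (the join and filter conditions are assumptions based on that query, and prank is assumed to be the ranking column):

```sql
-- Rank each manufacturer's matching products, keep only the
-- best-ranked one per manufacturer, then take the top 3 overall.
SELECT TOP 3 mid, pid, pname, prank, mname
FROM (
    SELECT p.mid, p.pid, p.pname, p.prank, m.mname,
           ROW_NUMBER() OVER (PARTITION BY p.mid
                              ORDER BY p.prank) AS rn
    FROM Products p
    JOIN Manufacturers m ON p.mid = m.mid
    WHERE p.cat1 = 2 OR p.cat2 = 2 OR p.cat3 = 2
) ranked
WHERE rn = 1
ORDER BY prank;
```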
I want visitors to my website to get price calculations from my database based on input variables which the visitor enters. I think I need the combination of Access, ASP.NET, and SQL Server. Is this correct?
I am currently creating an application that requires the use of a Pocket PC/PDA. The same application that runs on the desktop is to be run on the mobile device mentioned above. I am confused as to which of the following combinations of servers I should use.
Option 1) Desktop - SQL Server Express Edition (multi-user); Mobile Device - SQL Server Compact Edition
OR
Option 2) Desktop - SQL Server Compact Edition; Mobile Device - SQL Server Compact Edition
To my understanding, if I use SQL Server Compact Edition on both my desktop and mobile device, when I need to synchronize, the entire database from the mobile device is copied over to the desktop and vice versa. However, I would prefer that only modified data be synchronized. Can I achieve this with the second option? Is it also possible with the first option? Please advise. Thank you
I am trying to write a bit of code that I can pass a brand name to. If the brand name exists, I want to return the brandid to the calling middle tier. If the brand does not exist, I want to insert it and then return the new brand id. The code below works unless the brand does not exist. In that case it inserts, but I get an application exception. The next time I run the code, it continues on until the next time it has to do an insert. So the inserts are working, but getting the value back is resulting in an application exception.
Middle Tier Function:

private static int GetBrandForProduct(clsProduct o)
{
    int brandid = -1;
    // If the brand name comes in blank use the first word of the overstock product
    o.BrandName = o.BrandName.Trim();
    // if we do not have a brand for this product
    if (o.BrandName.Length == 0)
        return -1;
    Database db = CommonManager.GetDatabase();
    try
    {
        // Get the brand id for this brand name
        // If it does not exist we will add it and STILL return a brand id
        object obj = db.ExecuteScalar("BrandIDGetOrInsert", o.BrandName);
        string catid = obj.ToString();
        return Convert.ToInt32(obj.ToString()); // *** FAILING LINE ***
    }
    catch (Exception ex)
    {
        throw ex;
    }
}

Stored Procedure:

ALTER PROCEDURE [dbo].[BrandIDGetOrInsert]
    -- Add the parameters for the stored procedure here
    @brandnameparm varchar(50)
AS
BEGIN
    -- SET NOCOUNT ON
    SELECT brandid FROM brands WHERE Lower(brandname) = Lower(@brandnameparm)
    -- If we found a record, exit
    IF @@rowcount > 0 RETURN
    -- We did not find a record, so add a new one.
    BEGIN
        INSERT INTO brands (Brandname) VALUES (@brandnameparm)
    END
    SELECT brandid FROM brands WHERE Lower(brandname) = Lower(@brandnameparm)
END
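For what it's worth, the symptom described is consistent with the procedure returning an empty first result set when the brand is missing: ExecuteScalar reads only the first result set, gets null, and obj.ToString() throws. A corrected sketch that emits exactly one result set (and turns NOCOUNT on so row counts don't interfere):

```sql
ALTER PROCEDURE [dbo].[BrandIDGetOrInsert]
    @brandnameparm varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    -- Insert only if the brand is not already there.
    IF NOT EXISTS (SELECT 1 FROM brands
                   WHERE LOWER(brandname) = LOWER(@brandnameparm))
        INSERT INTO brands (Brandname) VALUES (@brandnameparm);

    -- Single result set: ExecuteScalar always sees the brandid.
    SELECT brandid FROM brands
    WHERE LOWER(brandname) = LOWER(@brandnameparm);
END
```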
I have a shopping cart table and two products tables (legacy reasons...).
I need to create a relationship between the ShoppingCart table and the two products tables such that no products are entered in the Cart which don't exist in either product table 1 or product table 2.
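Since a foreign key can reference only one table, an either-or rule like this is usually enforced with a trigger — a sketch with hypothetical table and column names:

```sql
-- Reject cart rows whose ProductID appears in neither product table.
-- dbo.ShoppingCart, dbo.Products1, dbo.Products2 and their columns
-- are hypothetical stand-ins for the real tables.
CREATE TRIGGER trg_ShoppingCart_ProductExists
ON dbo.ShoppingCart
AFTER INSERT, UPDATE
AS
BEGIN
    IF EXISTS (SELECT 1 FROM inserted i
               WHERE NOT EXISTS (SELECT 1 FROM dbo.Products1 p1
                                 WHERE p1.ProductID = i.ProductID)
                 AND NOT EXISTS (SELECT 1 FROM dbo.Products2 p2
                                 WHERE p2.ProductID = i.ProductID))
    BEGIN
        RAISERROR('Product does not exist in either product table.', 16, 1);
        ROLLBACK TRANSACTION;
    END
END
```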
I'm trying to select the TOP 20 products per category out of my star-schema database, and would be grateful if someone could PLEASE help me out.
Here is my query:
SELECT Distinct (SELECT TOP 20 DP.[Description]), DP.Mims_Sub_Cat, SUM(FD.Cost) AS 'Cost'
FROM DIM_Product DP, FACT_Dispensary FD, DIM_Time DT, DIM_Client DC
WHERE DP.Product_KEY = FD.Product_Key
  AND FD.Time_KEY = DT.Time_KEY
  AND FD.Client_Key = DC.Client_KEY
  AND DT.[Year] = 2007
  AND DT.[Month] IN (2)
  AND Client_name LIKE '%Medicare%'
  AND DP.Manufacture_Name LIKE '%Cipla%'
  --AND DP.[Description] IN (SELECT TOP 20 [Description])
  AND DP.Mims_Sub_Cat IN (SELECT Mims_Sub_Cat)
GROUP BY DP.Mims_Sub_Cat, DP.[Description], FD.Cost
ORDER BY DP.Mims_Sub_Cat, SUM(FD.Cost) DESC
My other problem is that it keeps selecting the same products even though I have a DISTINCT in my query: the query will select a product with one amount and then select the same product again with a different amount.
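Top-N-per-group on SQL Server 2005 is usually written with ROW_NUMBER() — a sketch reusing the tables above. (Note the duplicates in the original likely come from grouping by FD.Cost as well as the product; the version below groups only by category and description.)

```sql
-- Rank products inside each category by total cost, keep the top 20.
WITH totals AS (
    SELECT DP.Mims_Sub_Cat,
           DP.[Description],
           SUM(FD.Cost) AS TotalCost,
           ROW_NUMBER() OVER (PARTITION BY DP.Mims_Sub_Cat
                              ORDER BY SUM(FD.Cost) DESC) AS rn
    FROM DIM_Product DP
    JOIN FACT_Dispensary FD ON DP.Product_KEY = FD.Product_Key
    JOIN DIM_Time DT ON FD.Time_KEY = DT.Time_KEY
    JOIN DIM_Client DC ON FD.Client_Key = DC.Client_KEY
    WHERE DT.[Year] = 2007
      AND DT.[Month] IN (2)
      AND DC.Client_name LIKE '%Medicare%'
      AND DP.Manufacture_Name LIKE '%Cipla%'
    GROUP BY DP.Mims_Sub_Cat, DP.[Description]
)
SELECT Mims_Sub_Cat, [Description], TotalCost
FROM totals
WHERE rn <= 20
ORDER BY Mims_Sub_Cat, TotalCost DESC;
```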
Hi, I have a total of 45 GB of data on the following servers (40 GB in SQL Server 6.5, 5 GB on an FTP server). Both servers are on one machine. I want to make sure that I am doing the right thing for backup. The machine is not connected to a LAN. What is the best way to ensure that my data is backed up safely? What I am thinking of doing is to back up the database using the SQL tools in the menu bar, which will back up the data to the same machine in this directory: c:\mssql\backup\DBname.DAT
I will do this for all databases on the SQL server. Am I safe doing this? I am wondering what happens IF the machine both servers are on crashes - my backup will be lost too, right? So what is the best way to protect my data? Do I need to buy a tape drive so I can do what I am doing now PLUS back up c:\mssql\backup\DBname.DAT to tape? I would really appreciate any suggestions for choosing media (software/hardware) to back up my data.
I'm looking to purchase new server hardware that will host my corporate intranet and a SQL Server instance. In most cases I can't see this server being hit by more than 20 or 30 users, but I want it to be quick. I'm planning on running Server 2003 with SQL Server 2000 and the intranet on IIS.
What I'm not sure about is the single vs dual processor and the RAM. How valuable is the dual proc? And the RAM, I plan on 1.5 GB but I wonder what the benefits are beyond that.
Perhaps with this many users it doesn't make that much difference but I can see adding more load.
I've got a situation where I need to regularly (maybe each month) detach a DB, copy its files from their high-speed SAN location to a slower NAS, then re-attach it and make it available online. We're doing this for our DBs as they age past 3 years.
Just wondering if any of you have scripts you can point me to so I don't have to re-invent the wheel.
Also, after you re-attach, how do you verify the NAS DB is 100% ok before deleting the original from its SAN location?
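A sketch of the cycle, with DBCC CHECKDB as the integrity gate before deleting the SAN copy (the database name and NAS paths are hypothetical):

```sql
-- Detach, copy the files (outside SQL), re-attach from the NAS, verify.
EXEC sp_detach_db @dbname = N'ArchiveDB';

-- ...copy ArchiveDB.mdf / ArchiveDB_log.ldf to the NAS here...

EXEC sp_attach_db @dbname = N'ArchiveDB',
    @filename1 = N'\\nas01\sqldata\ArchiveDB.mdf',
    @filename2 = N'\\nas01\sqldata\ArchiveDB_log.ldf';

-- Full allocation and logical consistency check; only delete the
-- SAN copy after this completes with zero errors.
DBCC CHECKDB (N'ArchiveDB') WITH NO_INFOMSGS, ALL_ERRORMSGS;
```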
We currently have a server: Dual Intel Xeon 3.0 GHz, 4 GB RAM, (C) 2x18 GB SCSI disks in RAID 1 for the OS, (D) 4x72 GB SCSI disks in RAID 5 for data.
The server is running: Windows 2003 + IIS (single web application), Tomcat (single web application, about to be phased out), MS SQL 2005 (in simple recovery mode).
The C drive is currently only used for the OS and applications. The D drive holds all data, including the SQL data files and SQL transaction log files, e.g.: - SQL data files = 7 GB - SQL transaction log files = 20 GB - Image library = 100 GB - Other = 15 GB
IIS uses around 768 MB of RAM for caching and execution of the web application. Tomcat uses around 350 MB of RAM. SQL is set to use a max of 2.2 GB of RAM (and 1024 MB per query), leaving around 512 MB of RAM free.
There is only one database installed on the SQL Server, but it is very large (15+ million records). We are currently doing full-text searches in one table (~3.5 million records), and it is too slow.
So the question is, what is the best way to gain performance? - Increase to 8 GB of RAM? - Purchase a new server as the web server, so the current server can be dedicated to SQL? - Attach a NAS to split the SQL files into several groups and separate the transaction logs from the data? (6 disks is the max in the current server)
I'm hoping somebody can help me here, as I'm struggling to find any information elsewhere on the net. We have recently purchased a new server; the rough specs are:
2 x Quad-Core Xeon E7320 2.13 GHz, 4 MB cache; 32 GB PC2-5300 DDR2 RAM
We are planning to install the 64 Bit version of SQL Server 2005. We want to use the server for a number of purposes.
Building and weekly processing of 2 complex data marts (approx. size is 1 TB each)
Processing and querying of 2 Analysis Services databases that will be built from these data marts. These will be queried by no more than about 15 users (no more than about 5 simultaneously).
Relational querying of the data marts themselves (same users as above).
My problem is that I am not sure of the best way to configure SQL Server. Should I use 2 separate instances? How should the processors/memory be shared between SQL Server and Analysis Services? My main priority is the performance of the OLAP querying. However, I also want the weekly processing and any ad-hoc SQL queries run against the marts to be efficient.
How can I store more than one category per product in a products table? For example: I have a DVD website where the admin can add new DVDs. On the website all the categories are listed with checkboxes. If the DVD is an action comedy, the admin has to check those two checkboxes. But how do I store that in the SQL database and retrieve it? Thanks, David
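The standard relational answer is a many-to-many junction table rather than extra columns on the products table — a sketch with hypothetical table and column names:

```sql
-- One row per (dvd, category) pair; a DVD checked as both
-- Action and Comedy simply gets two rows here.
-- dbo.Dvds and dbo.Categories are assumed to already exist.
CREATE TABLE dbo.DvdCategories (
    DvdID      int NOT NULL REFERENCES dbo.Dvds(DvdID),
    CategoryID int NOT NULL REFERENCES dbo.Categories(CategoryID),
    PRIMARY KEY (DvdID, CategoryID)
);

-- Retrieve each DVD with its categories:
SELECT d.Title, c.CategoryName
FROM dbo.Dvds d
JOIN dbo.DvdCategories dc ON dc.DvdID = d.DvdID
JOIN dbo.Categories c     ON c.CategoryID = dc.CategoryID;
```

On the page, check the boxes whose CategoryIDs come back for that DvdID; on save, delete the DVD's rows and re-insert one per checked box.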