I'm looking to serialize some NVP data into an XML blob. I plan to put a primary XML index on the column, but my question is: would putting an XSD on the column speed up any queries, or would it just enforce the format? I know that with selective XML indexes you can do things like specify the data type associated with an XPath, which can further optimize retrieval.
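For reference, this is the kind of setup in question: a typed XML column bound to a schema collection, plus a primary XML index (a minimal sketch; the schema collection, table, and element names are hypothetical):

CREATE XML SCHEMA COLLECTION dbo.NvpSchema AS N'
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="nvp">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="pair" maxOccurs="unbounded">
          <xs:complexType>
            <xs:attribute name="name" type="xs:string"/>
            <xs:attribute name="value" type="xs:string"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>'

CREATE TABLE dbo.Settings
(
    SettingsID int IDENTITY PRIMARY KEY,
    Nvp xml(dbo.NvpSchema)  -- typed: validated against the XSD on every write
)

CREATE PRIMARY XML INDEX IX_Settings_Nvp ON dbo.Settings (Nvp)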
I dragged two tables into the dataset designer, and I have a query with a resultset over both of the tables (joined). Is there ANY way I can have a strongly typed datatable out of the resultset? The automatically created dataset adapters won't allow me to create a TableAdapter or DataTable for the joined resultset, only for one single table at a time. Any solution to that? Theoretically it should be possible, since I don't "lose" any type information with a join. Why can't I have a strongly typed datatable over the resultset? The join is not very sophisticated; in fact it couldn't be any simpler. (I'm referring to my previously posted question at http://forums.asp.net/t/1220481.aspx)
When I insert the fields using parameters, the params look like this:

paramValues(10) = New SqlParameter("@RegHrs", SqlDbType.Decimal)
paramValues(11) = New SqlParameter("@OTHrs", SqlDbType.Decimal)
paramValues(12) = New SqlParameter("@TotHrs", SqlDbType.Decimal)
The fields in the SQL table are defined as:

RegHrs - decimal(9,2)
OTHrs - decimal(9,2)
TotHrs - decimal(9,2)
Before I insert the row (from within the VB code), I stop the code and verify that the values being placed into the params are: 5.85, 0, 5.85
After the SQL insertion, the values within the SQL table contain: 6, 0, 6
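A likely cause (an assumption based on the code shown): a SqlParameter constructed this way defaults to precision 0 and scale 0, so the decimal is rounded to a whole number on the way to the server. A minimal sketch of the fix, matching the decimal(9,2) columns:

paramValues(10) = New SqlParameter("@RegHrs", SqlDbType.Decimal)
paramValues(10).Precision = 9  ' match decimal(9,2) in the table
paramValues(10).Scale = 2      ' without this, 5.85 is sent as 6
paramValues(11) = New SqlParameter("@OTHrs", SqlDbType.Decimal)
paramValues(11).Precision = 9
paramValues(11).Scale = 2
paramValues(12) = New SqlParameter("@TotHrs", SqlDbType.Decimal)
paramValues(12).Precision = 9
paramValues(12).Scale = 2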
Like the subject says, I'm using strongly typed datasets. I'm using the designer to create the datasets and the methods. I can select and insert, but I can't update or delete. I right-click on the adapter bar and select Add Query. I select 'Use SQL Statements'. I select 'Update' (or 'Delete'). I get a SQL statement pane containing the word 'Update' ('Delete') and asking 'What data should the table load?' I can click on Next, but anything else gives me errors. I'd list them, but I'm clearly doing something wrong and it's probably obvious. Diane
I have a pretty large DB and a fairly complex query. If I drop buffers and clear the cache, the query runs in 20 seconds returning 25K rows. Subsequent runs are 2 seconds. Is this the result of the results being cached, the execution plan being cached, or something else? Are there good ways to close the gap between the initial and later runs? Does the cache stay present until the service restarts, or does SQL recycle the memory, and if so, based on what criteria?
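For comparing cold and warm runs, this is the usual test harness (a sketch; run it on a dev box only, since it empties the server-wide caches):

CHECKPOINT                 -- flush dirty pages so the next command can drop them
DBCC DROPCLEANBUFFERS      -- empty the buffer pool (data page cache)
DBCC FREEPROCCACHE         -- empty the plan cache
SET STATISTICS IO ON       -- physical vs. logical reads show which cache mattered
SET STATISTICS TIME ON
-- ... run the query here ...

If physical reads dominate the first run and drop to near zero afterwards, the 18-second gap is data caching rather than plan compilation.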
I have several databases on a server (SQL Server 2000 only, no web server installed) and lately, as the company keeps growing, my users complain that the server gets slow (these DBs are well designed and receive optimizations and integrity checks, etc.). Because of this, I'm thinking about getting a new server to replace my old ProLiant ML 330, which was bought 4 years ago, but I'm concerned about which server architecture or characteristic can help me best to improve response performance. Is it HD speed? Processor speed? Or more RAM? I want to make a good decision, so I'd really appreciate your help...
I'm using strongly typed datasets in my first ASP.NET 2.0 web application. I come from ASP Classic, not an earlier version of .NET, and feel like I'm in another world. I'm slowly getting my head around datasets, but one thing I can't find any information on is how to read the data in a single record. I'm not talking about For Next loops, or accessing row information as a repeater or other control is filled. I'm talking about reading the data returned by a method that returns a table containing a single record. This is what I have:

Dim BookAdapter As New BooksTableAdapters.BooksBBTableAdapter
Dim organizations As Books.BooksBBDataTable
Dim organization As Books.BooksBBRow
organizations = BookAdapter.GetDataByOneBook(sBookID)
Dim sDropOff As String
Dim sOrganization As String
Dim sContact
For Each organization In organizations
    If organization.DropOff = 1 Then
        sDropOff = "True"
    Else
        sDropOff = "False"
    End If
    sOrganization = organization.Organization
    sContact = organization.ContactName
Next
sBody = "Books Listing" & Chr(10) & Chr(10)
sBody = sBody & "Organization: " & sOrganization & Chr(10)
sBody = sBody & "Contact Name: " & sContact & Chr(10)
sBody = sBody & "Email: " & organization.Email & Chr(10)
sBody = sBody & "Phone Number: " & organization.Phone & Chr(10)

I'm using the For Next, but this is silly since I only have one record. GetDataByOneBook(sBookID) does exactly what it says: it returns a single book with a specific bookID. Not only is this silly, it doesn't work. Using sContact as an example, it's Dim'd as a string. In the For Next loop, organization.ContactName has the right value, and it appears to be assigned to sContact correctly, but when I try to use sContact in sBody, I get an error saying that sContact has been used before it is assigned a value. Maybe the variables lose their scope outside the loop? Maybe I could get around this by building sBody inside the loop, but there has to be a better way! Diane
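A typed DataTable can be indexed directly, so the single row can be read without a loop (a sketch reusing the post's own names; the Count check guards against no rows coming back):

organizations = BookAdapter.GetDataByOneBook(sBookID)
If organizations.Count > 0 Then
    Dim organization As Books.BooksBBRow = organizations(0)  ' the single row, strongly typed
    Dim sDropOff As String = "False"
    If organization.DropOff = 1 Then sDropOff = "True"
    sBody = "Books Listing" & Chr(10) & Chr(10)
    sBody &= "Organization: " & organization.Organization & Chr(10)
    sBody &= "Contact Name: " & organization.ContactName & Chr(10)
    sBody &= "Email: " & organization.Email & Chr(10)
    sBody &= "Phone Number: " & organization.Phone & Chr(10)
End If

This also sidesteps the "used before it has been assigned a value" warning, which comes from the compiler not knowing the loop body will run at least once.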
Is it possible to return typed data from an endpoint through an ASP.NET Web Reference proxy? If so, is there any specific terminology I should be aware of to target my search?
I realize there is a choice between Object (returns a dataset or error) or dataset, but the automatically generated Web Reference proxy in ASP.NET (2.0) is untyped, and we can't change the typing there, as you have to remove the entire Web Reference to pick up new WebMethods (or changes to signatures, I'd assume).
I'm able to create my own typed proxy as a pseudo-DAL assembly which takes the Web Reference and casts rows/objects into typed rows/objects one at a time, but this seems like a lot of work and probably not the best practice.
I have a DAL that uses typed DataSets (not directly; the DAL references and calls the dataset methods) and I am receiving a "There is already an open DataReader associated with this Command which must be closed first." error when I render an ASP.NET page where two images need to retrieve two different versions of an image (a thumbnail and a full-sized image). I am using SQL Server 2005, ASP.NET 2.0, and my connection string includes the MARS attribute set to true. This is the method in the SqlServer DAL; _images is a typed table adapter:

public override Image SelectImage(Guid? id)
{
    // TODO: Find out why MARS feature isn't working!
    DataSet.ImageDataTable dt = _images.SelectImageById(id); // <-- Exception occurs here
    if ((dt != null) && dt.Count == 1)
    {
        DataSet.ImageRow row = dt[0];
        Guid? keyId = null;
        if (!row.IsKeyIdNull())
        {
            keyId = row.KeyId;
        }
        // Success
        return new Image(row.Id, keyId, row.Path, row.Caption);
    }
    return null;
}

// This is the method that is called for both images, which calls the DAL
public class EntitySearch
{
    public static Image ImageById(Guid id)
    {
        return _dal.SelectImage(id);
    }
}

// These are the code-behind lines that the image response strings will come from...
imgThumbnail.ImageUrl = String.Format("../Services/ImageBroker.aspx?img={0}&mode=Thumbnail", image.Id);
imgFullImage.ImageUrl = String.Format("../Services/ImageBroker.aspx?img={0}&mode=Image", image.Id);

// And finally here is the code in the broker class that saves the images to the response stream
Image image = EntitySearch.ImageById(imageId);
using (Picture picture = (mode == ImageMode.Image) ? image.Picture : image.Thumbnail)
{
    picture.Save(response.OutputStream, ImageFormat.Jpeg);
}

So the final snippet of code is being called twice, and it crashes. I have tried closing the connection and reopening it as a workaround, but this only yields strange results, though it intermittently does load the images. Can anyone spot the issue?
Hi, is there any workaround to passing a set of strings to a parameter in a typed dataset? For example, I am passing '4226222172004','4212012182004', which I build in my code; the number of items will vary based on the user selection. But since the typed dataset uses sp_executesql and the parameter is quoted as a single value, it becomes '''4226222172004'',''4212012182004'''. Any ideas how I can format the parameter I am passing, so that it ends up like WHERE IN ('4226222172004','4212012182004') instead of WHERE IN ('''4226222172004'',''4212012182004''')? Again, the number of parameters will vary. Thanks, Julio D
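One common workaround (a sketch, with hypothetical names): pass the whole list as one string without quotes, e.g. 4226222172004,4212012182004, and split it server-side inside the SELECT the TableAdapter wraps:

CREATE FUNCTION dbo.SplitIds (@list varchar(8000))
RETURNS @ids TABLE (Id varchar(50))
AS
BEGIN
    -- walk the comma-separated string, emitting one row per token
    DECLARE @pos int
    SET @list = @list + ','
    SET @pos = CHARINDEX(',', @list)
    WHILE @pos > 0
    BEGIN
        IF @pos > 1
            INSERT @ids (Id) VALUES (LTRIM(RTRIM(SUBSTRING(@list, 1, @pos - 1))))
        SET @list = SUBSTRING(@list, @pos + 1, 8000)
        SET @pos = CHARINDEX(',', @list)
    END
    RETURN
END

The query then becomes:

SELECT ... FROM MyTable WHERE MyColumn IN (SELECT Id FROM dbo.SplitIds(@Ids))

which keeps a single parameter and lets sp_executesql quote it normally.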
I am using typed datasets. When a certain line of C# code is reached, I get the following error message: "Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints." Here is my source code. I altered the table to remove all constraints and primary keys, but I still get the error message. The error occurs when the last line of code is reached (_DeltaTable = myAdaptor.GetData();). The code is listed below. Does anyone know what is going on?

calculateValuesAndAveragesTableAdapters.FundamentalsTableAdapter myAdaptor = new calculateValuesAndAveragesTableAdapters.FundamentalsTableAdapter();
calculateValuesAndAverages.FundamentalsDataTable _DeltaTable;
_DeltaTable = myAdaptor.GetData(); // Error occurs on this line
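Note that the constraints in the message are the DataSet's own (defined in the XSD), not the database's, which is why dropping them in the table didn't help. A diagnostic sketch, assuming the designer generated the usual Fill overload and a Fundamentals table property: fill with checking off, re-enable, and list the rows that fail:

calculateValuesAndAverages ds = new calculateValuesAndAverages();
ds.EnforceConstraints = false;           // let the bad rows in
myAdaptor.Fill(ds.Fundamentals);
try
{
    ds.EnforceConstraints = true;        // re-check; throws if any row violates
}
catch (System.Data.ConstraintException)
{
    foreach (System.Data.DataRow row in ds.Fundamentals.GetErrors())
    {
        Console.WriteLine(row.RowError); // names the offending column/constraint
    }
}

Typically the culprit is a column marked non-null in the XSD that the query returns as NULL, or duplicates in what the designer inferred as a key.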
Hi, I am trying to use a typed dataset created using the Query Builder, which returns the data correctly when I use 'Execute Query' in Query Builder, but as soon as I attempt to return a dataset using the generated GetData method, I get the following error: "Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints." Which is not very helpful. Any ideas where the problem may be? I have tried switching off enforce constraints and setting the NullValue property of all strings to 'Empty', but that has no effect. Thanks in advance
Hello, I'm using a typed dataset to access my database (SQL Express); I have a table that returns data, two fields of which are null. I'm getting this error: {"Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints."} There are no foreign-key or unique-key constraints, so it has to be a non-null issue. I changed the two fields in the table to have a NullValue of Empty instead of Throw Exception, but I don't understand where the problem is coming from... I'm at a loss. Thanks.
I need to use a typed dataset with a CLR UDT, but when I try to create a TableAdapter in the DataSet designer, I get the error message: "User-defined data types are not supported in DataSet designer"
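One possible workaround (an assumption, sketched with hypothetical names): keep the UDT out of the designer's sight by converting it in the query, since CLR UDTs expose ToString() server-side:

SELECT ItemID,
       Location.ToString() AS LocationText  -- UDT column rendered as a plain string
FROM dbo.Items

The designer then sees an ordinary character column, and the value can be re-parsed client-side if needed.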
I would like to extract data from a source system even if it has errors. Then I can transform it and handle the errors in the appropriate manner. Are there any loosely typed Data Flow Destinations?
Hi all, I am using a strongly typed dataset (ASP.NET 2.0) to insert new data into a SQL Server 2000 database; the types of some fields in the DB are nvarchar. Everything works fine except that I cannot insert Unicode data (Vietnamese language) into the DB. I can't find where to put the prefix N. Please help me!!!
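The N prefix only exists for string literals in hand-written SQL; with the parameterized commands a typed dataset generates, there is nowhere to put it, and it isn't needed. What matters is that the parameter's type is NVarChar rather than VarChar, which is worth checking on the InsertCommand in the designer. For comparison, a sketch of the two literal forms (dbo.Posts is a hypothetical table):

-- varchar literal: the Unicode characters are lost before SQL Server stores them
INSERT INTO dbo.Posts (Title) VALUES ('Tiếng Việt')
-- nvarchar literal: the N prefix keeps the Unicode intact
INSERT INTO dbo.Posts (Title) VALUES (N'Tiếng Việt')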
Hi, I have a little question. I searched Google and could not find a good answer for this one. I have a stored procedure that returns two tables. Usually I generate a dataset out of a stored procedure by dragging it onto the dataset. When I drag this one, it creates a DS with only one table, the first one. How can I make it use both tables?
If my variable contains the following XML document as un-typed xml:

<DetailRows>
  <DetailRow>
    <MonthNumber></MonthNumber>
    <Amount></Amount>
  </DetailRow>
</DetailRows>
[Code] ....
However, if I use a typed xml variable that is based on the above schema, I cannot use OPENXML. What is the correct way of achieving the same result with a typed xml doc? I am using SS2K5.
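Two options that work on SS2K5 (sketches; @doc stands for the typed xml variable, and the result types are assumptions about the schema). Either shred with the xml type's own methods, which accept typed xml directly, or cast the typing away and keep the existing OPENXML code:

-- Option 1: nodes()/value() instead of OPENXML
SELECT d.r.value('MonthNumber[1]', 'int')      AS MonthNumber,
       d.r.value('Amount[1]', 'decimal(18,2)') AS Amount
FROM   @doc.nodes('/DetailRows/DetailRow') AS d(r)

-- Option 2: strip the type so OPENXML sees plain xml
DECLARE @untyped xml
SET @untyped = CAST(@doc AS xml)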
Using VS 2008 Beta 2, SQL CE 3.5, on the desktop, and typed datasets: the INSERT command of the dataset table adapter does not return the new identity of the inserted row. Why?
Also, every time I want to modify the insert command to return the new identity of the inserted row, I get the error: "Unable to parse query text."
I'm at a loss how I'm supposed to work with typed datasets and SQL Server Compact 3.5 when inserting records, where I need to update my datatables with the primary key of newly inserted rows.
I've tried adding a RowUpdated handler to all tableadapters that look like this:
I've previously used this type of approach when working with an OleDb database, and it works just fine. But it doesn't work with SQL Server CE 3.5, and since it doesn't support stored procedures I can't fix it that way either. And it doesn't support commands in a batch (i.e. appending the adapter's Insert command with ";SELECT @@IDENTITY"), so that doesn't work either...
So how are we supposed to use SQL Server CE 3.5? Is it impossible together with datasets? Or am I missing something obvious?
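For reference, the RowUpdated approach described above usually takes this shape with the CE provider (a sketch with hypothetical column names; this is the pattern whose behavior on 3.5 is in question, not a confirmed fix). @@IDENTITY has to run as its own command, since CE rejects batches:

// wired up in the table adapter's partial class, e.g. Adapter.RowUpdated += OnRowUpdated;
void OnRowUpdated(object sender, System.Data.SqlServerCe.SqlCeRowUpdatedEventArgs e)
{
    if (e.StatementType == System.Data.StatementType.Insert)
    {
        using (var cmd = new System.Data.SqlServerCe.SqlCeCommand(
                   "SELECT @@IDENTITY",
                   (System.Data.SqlServerCe.SqlCeConnection)e.Command.Connection))
        {
            // CE returns @@IDENTITY as decimal; "Id" is the hypothetical identity column
            e.Row["Id"] = Convert.ToInt32(cmd.ExecuteScalar());
            e.Row.AcceptChanges();
        }
    }
}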
I am trying to upload a file in ASP.NET 2.0 to a SQL database using the FileUpload control. I am doing this by way of a typed dataset. Here is my code.

Dim rta As New ResponseTableAdapter
Dim rdt As ResponseDataTable = rta.GetResponseByID(1)
Dim rr As ResponseRow = rdt(0)
Dim fs As New FileStream(fu.PostedFile.FileName, FileMode.Open, FileAccess.Read)
Dim br As New BinaryReader(fs)
Dim image() As Byte = br.ReadBytes(fs.Length)
br.Close()
fs.Close()
rr.UploadFile = image
rr.NameFile = fs.Name
rta.Update(rdt)

The code runs without an error, but afterwards I find that nothing has been stored in the UploadFile cell in the database, while the file name has been stored in the NameFile cell. Any ideas? Your help is much appreciated. James
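One thing worth checking (an assumption about the cause, not a confirmed diagnosis): PostedFile.FileName is the path on the client machine, so opening a FileStream on it only works when client and server happen to be the same box. The FileUpload control already carries the bytes, so the file system can be skipped entirely. A sketch:

Dim data() As Byte = fu.FileBytes   ' the uploaded content, straight from the request
rr.UploadFile = data
rr.NameFile = System.IO.Path.GetFileName(fu.PostedFile.FileName)
rta.Update(rdt)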
Using VS 2008 I have a Windows Smart Device project targeting WM6 Standard and using .NET CF 3.5. I add a new database file, creating an empty Compact 3.5 database (creates an .sdf file in my project). Then I create an empty dataset in the wizard (creates an .xsd file in my project). Then I add a couple of tables to the database. After that I want to use my earlier created dataset. I select the .xsd file and look in Properties. I get an error when I try to change the Custom Tool value from 'MSDataSetGenerator' to 'MSResultSetGenerator'. Why?
Hello, I need some opinions on how to sum up or group by more than 2K records faster. E.g., how do I optimize this?
SELECT DISTINCT r.ClientID, c.ClientName, r.ItemID, r.StockID, r.StockName,
       r.ExpectedQty, r.QCQty, r.AVAQty, r.PNDQty AS pnd,
       r.VMIQCQty, r.VMIAVAQty, r.VMIPNDQty AS vmipnd,
       (SELECT ISNULL(SUM(d.HoldQty), 0) FROM tblItemdetail d
        WHERE d.itemid = r.itemid AND d.ConsignorID = @ClientID
          AND d.Ownerstatus = 'VMI') AS VMIPNDQty,
       (SELECT ISNULL(SUM(d.HoldQty), 0) FROM tblItemdetail d
        WHERE d.itemid = r.itemid AND d.ConsignorID = @ClientID
          AND d.Ownership = i.Supplier AND d.Ownerstatus = 'VMI') AS PNDQty,
       (SELECT ISNULL(SUM(d.OriginQty - d.PickQty - d.HoldQty - d.qcqty), 0) FROM tblItemDetail d
        WHERE d.ConsignorID = @ClientID AND d.Ownership = i.Supplier
          AND d.Ownerstatus = 'OWN') AS StockAtCustAVAQty,
       (SELECT ISNULL(SUM(d.HoldQty), 0) FROM tblItemDetail d
        WHERE d.ConsignorID = @ClientID AND d.Ownership = i.Supplier
          AND d.Ownerstatus = 'OWN') AS StockAtCustPNDQty,
       (SELECT ISNULL(SUM(d.qcqty), 0) FROM tblItemDetail d
        WHERE d.ConsignorID = @ClientID AND d.Ownership = i.Supplier
          AND d.Ownerstatus = 'OWN') AS StockAtCustQCQty
FROM tblItemCrossRef r
INNER JOIN tblClient c ON c.ClientID = r.ClientID
INNER JOIN tblItemClients i ON i.Supplier = r.ClientID
WHERE r.ClientID = @ClientID
  AND r.StockID LIKE @StockID + '%'
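One direction worth trying (a sketch of the technique only; the subqueries correlate on different columns, so a faithful rewrite needs care): aggregate tblItemDetail once with conditional SUMs and join the derived table in, instead of re-scanning it per output column per row:

SELECT r.ItemID,
       ISNULL(agg.VMIPNDQty, 0)         AS VMIPNDQty,
       ISNULL(agg.StockAtCustPNDQty, 0) AS StockAtCustPNDQty
FROM tblItemCrossRef r
LEFT JOIN (
    SELECT d.itemid,
           SUM(CASE WHEN d.Ownerstatus = 'VMI' THEN d.HoldQty ELSE 0 END) AS VMIPNDQty,
           SUM(CASE WHEN d.Ownerstatus = 'OWN' THEN d.HoldQty ELSE 0 END) AS StockAtCustPNDQty
    FROM tblItemdetail d
    WHERE d.ConsignorID = @ClientID
    GROUP BY d.itemid
) agg ON agg.itemid = r.ItemID
WHERE r.ClientID = @ClientID

With a supporting index (e.g. on ConsignorID, itemid, Ownerstatus), one pass over tblItemDetail replaces five.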
Hello all - given the following UDF, in SQL 2000 can it be sped up, compiled, or anything of the like? A query returning 300,000+ rows times out when run through the UDF; inline CASE statements return the rows in 5 seconds. Thanks! Jeff

CREATE FUNCTION dbo.TimeFormat
(
    @input datetime,
    @groupformat varchar(20) -- DAY, WEEK, MONTH
)
RETURNS datetime
AS
BEGIN
    DECLARE @dtvar AS datetime
    IF @groupformat = 'DAY'
        SET @dtvar = CAST(CONVERT(char(10), @input, 101) AS datetime)
    ELSE IF @groupformat = 'WEEK'
        SET @dtvar = CAST(DATEADD([DAY], 1 - DATEPART(dw, CONVERT(char(10), @input, 101)), CONVERT(char(10), @input, 101)) AS datetime)
    ELSE IF @groupformat = 'MONTH'
        SET @dtvar = CAST(CONVERT(char(6), @input, 112) + '01' AS datetime)
    RETURN @dtvar
END
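For what it's worth, SQL 2000 has no way to inline or compile a scalar UDF; it is invoked row by row, which is why the CASE version is so much faster. The inline equivalent keeps the UDF's exact expressions (a sketch; the table and column names are hypothetical):

SELECT CASE @groupformat
           WHEN 'DAY'   THEN CAST(CONVERT(char(10), EventDate, 101) AS datetime)
           WHEN 'WEEK'  THEN CAST(DATEADD([DAY], 1 - DATEPART(dw, CONVERT(char(10), EventDate, 101)), CONVERT(char(10), EventDate, 101)) AS datetime)
           WHEN 'MONTH' THEN CAST(CONVERT(char(6), EventDate, 112) + '01' AS datetime)
       END AS GroupDate
FROM dbo.Events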
This is x-posted in: alt.php.sql, comp.databases.ms-sqlserver, microsoft.public.sqlserver.programming. I have events that occur during the day. I want to be able to search those via a form with checkboxes (multiple select). Let's say for instance an event is happening from 3-10pm. When someone searches for 4-6 (checkbox option) it needs to show up. I don't need code so much as I just need theory. My theory, which I coded out and which worked apart from one oversight, is as follows. I did a BETWEEN call that pulled any event that began BETWEEN 4 AND 6 or ended BETWEEN 4 AND 6. As you can see, the event spans that time but does not start or stop between 4 and 6, and thus was not pulled. Oops. So if someone can tell me of another function, or perhaps just a better way to use BETWEEN, that would be great. I don't think that code is necessary at this juncture, so save the 'Please post code' post :) Thanks.
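The usual theory: two intervals overlap exactly when each one starts before the other ends. Applied here (a sketch; the column names are hypothetical), no BETWEEN is needed:

SELECT *
FROM dbo.Events
WHERE EventStart < @SearchEnd   -- started before the search window closes
  AND EventEnd   > @SearchStart -- ended after the search window opens

A 3-10pm event against a 4-6 search passes both tests, covering the case the start-or-end BETWEEN version misses.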
I am using sp_executesql to pass parameters to a SQL string, and I am seeing a deadlock between sp_prepexec, which does an UPDATE, and another UPDATE done by another process. When it comes to speed and deadlocks, would you recommend not using sp_executesql?
Hi, can anyone tell me a way to speed up these queries?

// This selects a number of records (sent by the user) from a table and randomizes them
tempSQL.Text = "select top " + amount.Text + " number from [" + src.Text + "] Where pull='N' order by newID()";
SqlConnection conn2 = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2.Open();
SqlCommand cmd3 = new SqlCommand(tempSQL.Text, conn2);
cmd3.CommandTimeout = 1000;
SqlDataReader dr = cmd3.ExecuteReader();

// Then I open a data reader that uses the records
SqlConnection conn2a = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString);
conn2a.Open();
while (dr.Read())
{
    // the records are then placed one by one into a temp table
    string fillresultID = "Insert into [" + src.Text + "_Additional_Temp] (number) Values('" + dr["number"] + "')";
    SqlCommand cmd4 = new SqlCommand(fillresultID, conn2a);
    cmd4.CommandTimeout = 0;
    cmd4.ExecuteNonQuery();

    // then the original table that held the numbers is marked as used (again one by one)
    string update = "Update [" + src.Text + "] set pull='Y' where number='" + dr["number"] + "'";
    SqlCommand cmd5 = new SqlCommand(update, conn2a);
    cmd5.CommandTimeout = 0;
    cmd5.ExecuteNonQuery();
}
dr.Close();
conn2.Close();
conn2a.Close();

Thanks, Doug
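Most of the time here goes to the per-row round trips. The whole job can run server-side in one batch (a sketch that keeps the post's concatenated table names and assumes number is a varchar column; a table variable holds the random picks so both statements see the same set):

string batch =
    "DECLARE @picked TABLE (number varchar(50)); " +
    "INSERT INTO @picked (number) " +
    "  SELECT TOP " + amount.Text + " number FROM [" + src.Text + "] WHERE pull = 'N' ORDER BY NEWID(); " +
    "INSERT INTO [" + src.Text + "_Additional_Temp] (number) SELECT number FROM @picked; " +
    "UPDATE t SET pull = 'Y' FROM [" + src.Text + "] t INNER JOIN @picked p ON p.number = t.number;";

using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["MyDB"].ConnectionString))
using (SqlCommand cmd = new SqlCommand(batch, conn))
{
    conn.Open();
    cmd.CommandTimeout = 1000;
    cmd.ExecuteNonQuery(); // one round trip instead of two per row
}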
Hi, how can I speed up this query? It seems to be taking a very long time to bring back the results:

-- This stored procedure retrieves access rights for users
CREATE PROCEDURE wc_User_Access_Right_List
AS
SELECT dbo.tblRep.Rep_ID,
       RTRIM(dbo.tblRep.Rep_Forename) + ' ' + RTRIM(dbo.tblRep.Rep_Surname) AS User_Full_Name,
       dbo.tblAccessRight.Access_Right,
       dbo.tblAccessRight.Access_Right_ID
FROM dbo.tblRep
LEFT OUTER JOIN dbo.tblAccessRight
    ON dbo.tblRep.Access_Right_ID = dbo.tblAccessRight.Access_Right_ID
ORDER BY User_Full_Name
-- Make sure this has saved; if not, return 10 as this is an unexpected error
IF @@rowcount = 0
    RETURN 10
DECLARE @RETURN_VALUE tinyint
IF @@error <> 0
    RETURN @@error
GO
I have a select statement that has many ANDs where one side always uses the same thing, for example:
where t1.column1 = r3.other
  and t1.column1 = e5.new
  and t1.column1 = k9.old
etc...

Is there any speed gained by putting the value of column1 into a variable and using that variable each time instead of t1.column1?
I have indexed my SQL Server tables to gain some speed on calling up tables and queries (using VB and ADO). It is still very slow... Is there a step I have to take once my tables are indexed, or are there any tricks to improve the speed? I am getting kind of desperate right now :(
I am in the planning stages of a website and want to design my database to allow for optimum performance in case the site becomes popular. In my database I will have around 5 main categories, with each category having around 25 or 30 subcategories, and each subcategory having around 100 to 300 items. I plan to use MS SQL Server 7. Would it be best to have one large table, or 5 tables for the main categories, or around 100 tables, one for each subcategory? Database usage will be simple (no complex queries), but most pages will hit the database, and I need to allow for a potential of 4 or 5 million page views per month. The SQL Server will be a shared one.
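For scale, a single items table keyed by subcategory is the usual shape; the row counts described (at most 5 x 30 x 300 = 45,000 items) are small, and one indexed lookup serves every page. A sketch with hypothetical names:

CREATE TABLE Category    (CategoryID int PRIMARY KEY, Name varchar(100))
CREATE TABLE SubCategory (SubCategoryID int PRIMARY KEY,
                          CategoryID int REFERENCES Category(CategoryID), Name varchar(100))
CREATE TABLE Item        (ItemID int PRIMARY KEY,
                          SubCategoryID int REFERENCES SubCategory(SubCategoryID), Name varchar(200))
CREATE INDEX IX_Item_SubCategory ON Item (SubCategoryID)

-- a typical page query: one seek on the index above
SELECT ItemID, Name FROM Item WHERE SubCategoryID = @SubCategoryID

Splitting items into 5 or 100 tables would only multiply the code paths without making that seek any faster.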