I am not sure which of the forums this question should be posted to (Engine or TSQL) so I have posted to both. Please forgive the cross-post.
The problem we are having is with the performance of a stored procedure. Normally, this specific proc will only take seconds to run, but if we modify the stored proc in any way, it will take at least 20 minutes to run (we don't actually know when it would finish, because we kill it at 20 minutes).
When this happens, we can run the profiler and see that a lock and unlock is happening on each record that the query is returning. We are at a loss as to why this is happening.
The only way to fix it (get the proc run time back down to seconds) is to run the engine tuning advisor. We don't even need to apply any of the recommended changes, we just need to run it and the proc performance goes back down to mere seconds.
If we run the profiler again when the proc performs normally, we don't see any locking going on.
We can consistently repeat these steps.
Just so we aren't jumping the gun, we have re-written and dissected the proc 8 ways to Sunday and it still does the same thing.
Anyone else see this and can offer any suggestions?
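Not a definitive diagnosis, but the symptoms (fast until the proc is altered, fast again after the tuning advisor runs) are consistent with the ALTER invalidating the cached plan and the next compile "sniffing" atypical parameter values, producing a bad plan; the advisor's own analysis queries may simply trigger another recompile that lands on a good plan. A sketch of ways to test that theory (the proc name is a placeholder):

```sql
-- Force the next execution to compile a fresh plan and see if the
-- problem reproduces or clears (hypothetical proc name):
EXEC sp_recompile 'dbo.MySlowProc';

-- Or, on SQL 2005+, compile the problem statement per-execution
-- inside the proc so a sniffed value can't poison the cached plan:
-- SELECT ... FROM dbo.SomeTable WHERE SomeCol = @Param OPTION (RECOMPILE);
```

If sp_recompile alone restores the fast behavior, the tuning advisor was almost certainly incidental.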
I am struggling with creating a simple stored procedure. I want to pull MAX(TimeKey) into one variable, @Val, using a dynamic SQL statement, as in the SP below. The SP creates successfully, but when I run it I get this error:
Select @Val=MAX(TimeKey) FROM ABC
Msg 137, Level 15, State 1, Line 1
Must declare the scalar variable "@Val".
Can someone help me understand this weird behavior, and how to get rid of it? The table and SP scripts are below:
CREATE TABLE ABC (TimeKey int, Data_Val int)
CREATE Procedure MAXVal(@Table_Name Varchar(30))
AS
BEGIN
DECLARE @Val INT,
@SQL Varchar(100)
SET @SQL='Select @Val=MAX(TimeKey) FROM '+''+@Table_Name+''
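The error happens because dynamic SQL executes as its own batch: @Val is declared in the procedure's scope, but not in the scope where the EXEC'd string is compiled, hence Msg 137. One common way to get a value back out is sp_executesql with an OUTPUT parameter (note it requires nvarchar, not varchar). A hedged rewrite - the procedure name and final SELECT are illustrative:

```sql
CREATE PROCEDURE MAXVal_Fixed (@Table_Name sysname)
AS
BEGIN
    DECLARE @Val INT,
            @SQL NVARCHAR(200);

    -- QUOTENAME also guards against injection through the table name
    SET @SQL = N'SELECT @ValOut = MAX(TimeKey) FROM ' + QUOTENAME(@Table_Name);

    -- The OUTPUT parameter bridges the inner batch's scope back to ours
    EXEC sp_executesql @SQL, N'@ValOut INT OUTPUT', @ValOut = @Val OUTPUT;

    SELECT @Val AS MaxTimeKey;
END
```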
I have a stored procedure that does a lookup on a particular field. Sometimes it runs almost instantly and other times it drags. Running the proc through ISQL, it sometimes comes back with fewer than 10 logical reads and other times with over 800,000 logical reads. This only happens on our production box (of course). Anybody seen anything like this? Thanks -Bob-
We have a stored procedure that we've tried with two slightly different designs. It needs to take a 30-day date range and return a result set.

Design 1 takes one date as a parameter. The other date is calculated in a local variable to be 30 days before the one that was passed in. Both data types are datetime and both are in the WHERE clause.

Design 2 takes two dates as parameters, with the 30 days being calculated outside the stored procedure; again, both are in the WHERE clause.

There are some joins, but the main table has maybe 20 million rows. This is SQL Server 2000.

Design 1 takes maybe 30 minutes to run. Design 2 runs 15 times faster. The plan says that Design 1 is doing a table scan on the 20-million-row table. For Design 2, the plan says it's doing a bookmark lookup on the date in question.

Why?

brian
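A hedged explanation: in SQL Server 2000 the optimizer can sniff parameter values at compile time, but it cannot see the runtime value of a local variable, so Design 1's computed start date forces a generic selectivity guess (which favors a scan), while Design 2's two parameters allow an accurate range estimate (which favors the index). The two shapes, with hypothetical table and column names:

```sql
-- Design 1: end date computed into a local variable. At compile time the
-- optimizer has no value for @StartDate -> generic estimate -> table scan.
CREATE PROCEDURE GetRange1 (@EndDate DATETIME) AS
BEGIN
    DECLARE @StartDate DATETIME;
    SET @StartDate = DATEADD(day, -30, @EndDate);
    SELECT * FROM dbo.BigTable WHERE TheDate BETWEEN @StartDate AND @EndDate;
END
GO

-- Design 2: both dates are parameters. Both values are visible at compile
-- time -> accurate row estimate -> index seek plus bookmark lookup.
CREATE PROCEDURE GetRange2 (@StartDate DATETIME, @EndDate DATETIME) AS
    SELECT * FROM dbo.BigTable WHERE TheDate BETWEEN @StartDate AND @EndDate;
GO
```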
I have SQL Server 2005 9.0.3050 running on a machine with a 3.4 GHz CPU and 3.00 GB of memory. I have a C# stored procedure that is marked as SAFE. In the SP I am using two DataTables. Table 1 contains just the schema of the table that will be updated. Table 2 contains the data to be manipulated. The result set can contain 10,000 to 150,000+ rows of data.

I can run the SP and get 50,000 to 80,000 records returned and stored in Table 2 and it works fine, no errors. The problem I keep running into, and can't seem to figure out, is when I have 150,000 records returned and stored in Table 2 and start to loop through the DataRows and manipulate the data. I get the following error message:

Executed as user: NT AUTHORITY\SYSTEM. .NET Framework execution was aborted by escalation policy because of out of memory. System.Threading.ThreadAbortException: Thread was being aborted. System.Threading.ThreadAbortException:
at System.Data.Common.DecimalStorage.SetCapacity(Int32 capacity)
at System.Data.RecordManager.set_RecordCapacity(Int32 value)
at System.Data.RecordManager.GrowRecordCapacity()
at System.Data.RecordManager.NewRecordBase()
at System.Data.DataTable.NewRecord(Int32 sourceRecord)
at System.Data.Data. The step failed.

I'll stop SQL Server then restart it, and sometimes that will do the trick and other times it won't. I have exception handling everywhere, and when I debug it I can't recreate that error. Any help would be greatly appreciated.
Rather than the real code, here's a sample we came up with.
Here's the C# code:

public class sptest : System.Web.UI.Page
{
    protected System.Web.UI.WebControls.Label Label1;
    private DataSet dtsData;

    private void Page_Load(object sender, System.EventArgs e)
    {
        // Put user code to initialize the page here
        string strSP = "sp_testOutput";
        SqlParameter[] Params = new SqlParameter[2];
        Params[0] = new SqlParameter("@Input", "Pudding");
        Params[1] = new SqlParameter("@Error_Text", "");
        Params[1].Direction = ParameterDirection.Output;
        try
        {
            this.dtsData = SqlHelper.ExecuteDataset(ConfigurationSettings.AppSettings["SIM_DSN"], CommandType.StoredProcedure, strSP, Params);
            Label1.Text = Params[0].Value.ToString() + "--Returned Val is" + Params[1].Value.ToString();
        }
        //catch (System.Data.SqlClient.SqlException ex)
        catch (Exception ex)
        {
            Label1.Text = ex.ToString();
        }
    }
}
Here is the stored procedure:
CREATE PROCEDURE [user1122500].[sp_testOutput]
    (@Input nvarchar(76),
     @Error_Text nvarchar(10) OUTPUT)
AS
    SET @Error_Text = 'Test'
GO

When I run this, it prints the input variable, but not the output variable.
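The procedure itself looks fine; calling it directly from T-SQL is a quick way to confirm that, which would point the problem at the client call (a common culprit with string OUTPUT parameters in ADO.NET is the SqlParameter's Size property being left unset, so the value comes back truncated or empty). A hedged sanity check:

```sql
-- If @Err comes back as 'Test', the proc works and the problem is on
-- the ADO.NET side (check the output SqlParameter's Size/SqlDbType).
DECLARE @Err NVARCHAR(10);
EXEC [user1122500].[sp_testOutput] @Input = N'Pudding', @Error_Text = @Err OUTPUT;
SELECT @Err AS Returned_Error_Text;
```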
strSQL = "select '<input type=text name='& id &'size=4 maxlength=4 value=>',product,description,moreinfo,media,License,unitc ost from newdeal where category='" & session("category") & "'"
The idea is for the value of id to be fetched from the DB and substituted in the input box. It works well with MS Access, but in SQL Server it shows an error saying that data of varchar type cannot be converted to *******. Id is of ******* data type in the DB. Can anyone help me, please?
I'm writing a report for a survey application. There is a subscriber table, a question table, and an answer table. The subscriber table holds info like the name, address, etc. of the publication's subscribers. The Question table holds the questions on the survey (e.g., How many widgets do you own? What is your age? How many do you think you will purchase in the coming year?). The Answer table holds answers to each question (18-21, 25-50%, 20+, etc.).

I am forced to work within a very non-standard database application running on MS SQL Server 2000 that also uses a separate Relationships table, so to determine which answers go with which questions, the Relationships table will have a record linking a question to an answer (and vice versa). To determine how each subscriber answered a question, the subscribers have a record in the Relationships table linking them to specific answers (and vice versa). Both types of entries are in the same table. There are several questions that include a note to "check all that apply" (you may get several widgets from the company).

The report runs through each question, shows how many subscribers answered the question, and how many selected each answer as a percentage of the total for that question. It is nearly finished, and I can run it on the test data for any date, but there is one special case I'm having trouble with. For one question, they additionally want to know how many people chose one possible answer, but no others.

I can do it using simple SQL to get the list of the subscribers that chose the answer in question, and then loop through each one in VB and run another query that tells me how many answers related to that question they selected. This works okay for the test data, but in production there are more than 100,000 subscribers, which means running 100,000+ individual queries. I know it's possible to do this in one query (all I want is the final count), but the SQL for this is horrid. Any help appreciated.
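For the "chose this answer and no others" count, a single grouped query can replace the per-subscriber loop. This sketch assumes the Relationships rows for the one question can be viewed as (SubscriberID, AnswerID) pairs; every name here is a hypothetical stand-in for the real non-standard schema:

```sql
-- SubscriberAnswers: hypothetical flattened view of the Relationships
-- table, restricted to the answers belonging to the question in hand.
-- @Target is the ID of the answer of interest.
SELECT COUNT(*) AS ExclusiveChoosers
FROM (
    SELECT SubscriberID
    FROM SubscriberAnswers
    GROUP BY SubscriberID
    HAVING COUNT(*) = 1              -- picked exactly one answer ...
       AND MIN(AnswerID) = @Target   -- ... and it is the target answer
) AS OnlyOne;
```

The HAVING clause does the relational-division work: a subscriber survives only if their entire answer set for the question collapses to the single target answer.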
We have an ASP.NET 2 website pointing at SQL 2005 (not service packed yet - it may be the cause, not sure - would like opinions first!). The website will run without problems, but randomly we get the following scenario:

We have a function that takes an SP name and a SqlCommand with parameter information. It opens a connection and uses a DataAdapter to fill a DataTable, then closes the connection and returns. When the problem occurs, we get a SqlTimeout on the line where it tries to fill the DataTable. However, when the function is called again, the results from the first query seem to get populated into the DataTable. Then, when it gets called again, the previous results are again returned. It's like ADO sees the timeout (but SQL doesn't), but then picks up the next available result set from SQL (which is the wrong one!).

Hope the above makes sense...

Cheers,
James.
Hey people, if there is anyone that can help me here I would appreciate it. The error line (probably):

Comando = New SqlCommand("INSERT INTO Table1(name) VALUES('a')", Conexao)

In Table1 I have these fields: id (int, identity) and name (text). But when I execute this code I get the following error:

Error in XML processing: no element found
Position: http://localhost/teste.aspx
Line number 1, Column 1

(The error wasn't in English, so I translated it, but I think you can get the general idea.)

Anyway, the weird part is that in SQL Server the data is inserted normally... so what is that error? Anyone, please help me! I'm going nuts. (P.S. I'm still a noob at ASP.NET/SQL Server, so please be gentle ^.^) (P.S.2: Sorry for the poor English, but it isn't my native language :) )
The number of columns needs to be dynamic: if I were to add pineapple, there need to be 5 columns, etc. Sorry if this has been covered before, but I wasn't sure of the search criteria. Thanks in advance.
Hi all, I'm having a really weird problem with a SQL 7.0 (SP3) update statement. Here is my table structure:

sku varchar(18)
minimumquantity varchar(7)
maximumquantity varchar(7)
usa varchar(20)
when I issue an update like this...
update sapproductprices set usa= "8.25" from sapproductprices s where (s.sku= "VSE-ASAP-BB-100-S1")
it works! But when I change my update to this, it doesn't work... I get a message which says "the command completed successfully" and no update happens:
update sapproductprices set usa= "8.25" from sapproductprices s where (s.sku= "VSE-ASAP-BB-100-S1" and s.maximumquantity ="10")
Can someone shed some light on this one, please? Thanks, uday
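One hedged guess: since maximumquantity is varchar, the stored value may not compare equal to '10' (leading spaces, non-printing characters, or a value such as '10.0'), so the second predicate silently matches zero rows. A quick diagnostic:

```sql
-- Inspect what is actually stored for that sku; DATALENGTH exposes
-- padding and stray characters that an equality test would hide.
SELECT maximumquantity,
       DATALENGTH(maximumquantity) AS bytes_stored
FROM sapproductprices
WHERE sku = 'VSE-ASAP-BB-100-S1';

-- If stray spaces turn out to be the culprit, trimming makes it match:
-- WHERE sku = 'VSE-ASAP-BB-100-S1'
--   AND LTRIM(RTRIM(maximumquantity)) = '10'
```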
Hi, I cannot find any reference or support for the following issue from Microsoft or on Google:

When I open a table either to return data, or try to use the design tools, or if I try to create a new database diagram, I get a new result pane, diagram pane, etc... but no result and no functionality within that pane. The database is working and I can run queries using MS Query. The problem seems to be with MMC or Enterprise Manager.

I am using SQL Server 7.623, Service Pack level 4 or 6 - I am not sure on that point. This has been a stable, well-functioning system for years with no recent changes, so I am sure it is not a configuration issue.

Thanks in advance for your help.

PS: I don't want to seem rude, but I am not interested in buying a new "upgrade" to fix this, so any MS adbots out there can just not bother with responding to this post :)
and sometimes it fails on this command with this error:
<Message>Error in generating package C:\PROJECTS\POSTAL\Build\SSIS packages\Mappings\Mappings\SBSCR_52_OUT.dtsx</Message>
<InnerException>Type 'Microsoft.SqlServer.Dts.Tasks.ExecuteSQLTask.ExecuteSQLTask' in Assembly 'Microsoft.SqlServer.SQLTask, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' is not marked as serializable.</InnerException>
<Data />
<Source>mscorlib</Source>
<HelpLink />
<StackTrace> at System.Runtime.Serialization.FormatterServices.InternalGetSerializableMembers(RuntimeType type)
at System.Runtime.Serialization.FormatterServices.GetSerializableMembers(Type type, StreamingContext context)
at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitMemberInfo()
at System.Runtime.Serialization.Formatters.Binary.ObjectWriter.Serialize(Object graph, Header[] inHeaders, __BinaryWriter serWriter, Boolean fCheck)
at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Serialize(Stream serializationStream, Object graph, Header[] headers, Boolean fCheck)
at System.Runtime.Remoting.RemotingServices.MarshalToBuffer(Object o)
at Microsoft.SqlServer.Dts.Runtime.ObjectHostImpl.get_InnerObject()
at Microsoft.SqlServer.Dts.Runtime.TaskHost.get_InnerObject()
at InterfaceEngine.OutputInterface.CreateExcel() in C:\PROJECTS\POSTAL\Build\Interface\InterfaceEngine\InterfaceEngine\OutputInterface.cs:line 842
at InterfaceEngine.OutputInterface.GenerateOutputPackage() in C:\PROJECTS\POSTAL\Build\Interface\InterfaceEngine\InterfaceEngine\OutputInterface.cs:line 322</StackTrace>
<TargetSite />
What is this about? Why does it happen, and why not always? The package generation is performed by a web service that is called from an ASP.NET 2.0 page.
I set up the query as a stored procedure and changed the SSIS accordingly. Running the SP from within SSMS was fine, but the task still returned the weird results.
This is completely unexpected and giving us headaches. It doesn't even look like well-formed XML to my eyes and won't display in IE without showing problems! Where are the erroneous /'s and ROOT nodes coming from?
I can't imagine we are the only people to have run into this and I'm sure we aren't doing something quite right - just stuck as to what we're doing wrong.
Hopefully I've provided enough info. If not just ask.
Hi, I'm modifying a report from MS CRM - Sales Pipeline. Not the easiest one, but still. I just got to the drilldown report and need to change some things in it. I tested the SQL code as best I could in SQL Server Management Studio so that I don't have any misspellings (the report I'm modifying is in a remote location). Anyway, now that I copy my code into SQL Server Business Intelligence Development Studio, into the dataset I'm modifying, I see that a chunk of code at the bottom is missing... ??? ... Is there a limit on how much I can type?!
I counted in Word that I have 33,176 characters and 878 lines of code. Is that too much?! The original report has a dataset of 27,529 characters and 724 lines of SQL code.
I used two symbols (”¬€¬ and ”´) in my reporting services page and they appear fine in Internet Explorer. However, when I try to export the page to a PDF file, they appear as question marks.
Can someone tell me why or how to work around this?
When I try to run the asp.net application (C#) it is throwing the following exception.
"An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)"
The problem is, I don't have SQL Server 2005 installed on my system, and I am trying to connect to a remote SQL 2000 server.
Adding to my confusion, the same code is working fine in another system.
Yes, I am pulling my hair out on this one...................
PROBLEM: When importing from my .sql file, I lose many lines of info in my SQL table. I am using SQL Query Analyzer to execute the file. The file contains 655,000 rows and 20 columns of data.
When I run the full file, the system tells me it does not have enough system resources to execute the request. I broke it down just to see if it could handle a smaller file.
When I create a table and have it insert 10,000 rows of info, it drops 88 lines of data.
When I do the same with 120,000 rows, I lose 300 rows of data.
When I run it with 56,000 lines of data in the file, I lose 260 rows of data.
When I do just 1,000 rows or below it is perfect.
I have taken the chunks into Excel and verified the number of lines I was trying to put into the table, and every time it showed I had the number of lines I thought I was trying to import.
When I did a query once the table was created to see how many lines it created, I was always short.
This is what my query consisted of (Let me know if this may be my issue)
SELECT COUNT(*) Col_1 FROM nfo_tablename
GO
I even tried zero column names
Any suggestions??? Am I doing something wrong here? Should I be executing the process in a different way?
Here is the basic idea of what the top part of my "CREATE TABLE" file looks like (Color coded as I see it in my Query Analyzer):
Thank you all for your insight. Any ideas, thoughts or suggestions, would be appreciated. I really need to get this table done so I can get a couple web pages created this weekend that are due Monday.
SELECT DSNew, DTTM, RQDT
FROM dbo.Feb
INNER JOIN DMSEFL ON ACTR = DSNew
WHERE CAST(DSNew AS varchar(20)) = CAST(ACTR AS varchar(20))

If I run the above query I get zero records back.
If I substitute a Value then I get the desired results (ie. where DSNew = '93235500') or if I enter (ACTR = '93235500') or if I put (where DSNew = '93235500' AND ACTR = '93235500')
Can anyone suggest a reason why this is happening? I know the records exist in both tables; I ran the query in Access and got the desired results.
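A hedged diagnostic: when a join on look-alike values returns nothing, the raw bytes usually differ (leading spaces, non-printing characters, or different stored lengths). Comparing DATALENGTH side by side for the known-good value can expose it:

```sql
-- Pull the "matching" value from each table independently and compare
-- raw storage; a leading space or hidden character in either column
-- will defeat the join even though the digits look identical.
SELECT d.DSNew, DATALENGTH(d.DSNew) AS len_dsnew,
       f.ACTR,  DATALENGTH(f.ACTR)  AS len_actr
FROM dbo.Feb AS d
CROSS JOIN DMSEFL AS f
WHERE d.DSNew LIKE '%93235500%'
  AND f.ACTR  LIKE '%93235500%';
```

If the lengths differ, wrapping one side in LTRIM/RTRIM (or fixing the data) should make the original join work.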
VB stores an image (bitmap/JPG) in a SQL 2000 image field. For some reason it doubles the size by adding 00 for each byte. But sometimes it doesn't add 00 (0000.0000) but 01 or 20 or ??, and the byte that should be transferred is also changed... So:

SQLImageData = ImageByte + 00

and sometimes:

SQLImageData = CHANGEDImageByte + xx

Some example data (hex notation):

A1 => A1 00
03 => 03 00
-----------
91 => 18 20
83 => 92 01
8C => 52 01

Could anybody give me an explanation? I need to know what is happening so I can remove the extra bytes added... I already get an image when I remove the extra bytes, but with some wrong data (in the places where things like 8C => 52 01 happen).

Thx,
Geronimo
Right, this has to be a Micro$oft mess-up, surely...?

I'm running SQL 2K Standard with SP3. I have a table which I'm trying to query using a LIKE operator on a varchar field as follows:

...WHERE dbo.tbl_pm_projects.SeniorManagerID LIKE '%'...

In actual fact the % is passed in by the application when the user selects "All managers" from the drop-down list used to select the manager to filter by. If they select a manager's name from the list, it becomes LIKE 'ajames' or whatever.

BUT - the table currently contains 2972 records. If I take out the WHERE clause, the SELECT returns all records - fine - but if I put the WHERE clause in, it returns only 1682!! I thought the % was meant to match, and I quote the SQL Server Books Online files here, "Any string of zero or more characters." Can anyone explain to me what's going on here?

TIA
Niall
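The likely culprit, offered as a hedged guess: LIKE '%' matches any string - including the empty string - but never NULL, so every row where SeniorManagerID is NULL fails the predicate and drops out. If the missing 1290 rows (2972 - 1682) turn out to have NULL in that column, that's the whole story:

```sql
-- If this returns 1290, the LIKE '%' filter is simply excluding NULLs.
SELECT COUNT(*) AS null_managers
FROM dbo.tbl_pm_projects
WHERE SeniorManagerID IS NULL;

-- One way to make "All managers" really mean all rows:
-- WHERE (dbo.tbl_pm_projects.SeniorManagerID LIKE '%'
--        OR dbo.tbl_pm_projects.SeniorManagerID IS NULL)
```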
I can access the report server remotely. However, when I try to access via browser locally, no luck. I get access denied.
Some kind of browser diff maybe?
I noticed that on the browsers/machines that work, I have the option of putting the domain in separately. The browser that doesn't work (on the same machine as the DB server and web site) puts the domain in front of the username for me.
I am having a problem with a procedure. I can run it from QA and it takes 50 minutes. When I have it in a scheduled job, it takes 3 hours!! What could be the cause of this? Why the big time difference? Thanks
I have created a procedure that is about 500 lines long.
Now, this is actually a controller procedure which calls other procedures and functions to generate data for a report. But this procedure takes about 3 minutes to generate the result set. I am not using any temp tables; I am using table variables.
My procedure does not recompile. My procedure also has some INSERT INTO ... EXEC statements.
My question is: will performance increase if I split the stored procedure into 2, 3, or 4 parts?
Given that a stored procedure and T-SQL code in query analyzer are exactly the same, why would the stored procedure run much slower?
When I say much slower, I mean 3 seconds for the code in Query Analyzer as opposed to 2:33 (minutes:seconds) for the stored procedure.
Exact same code!
Profiler also shows more reads and writes for the stored procedure, and a lot of BatchStarted and BatchCompleted events between the 'start' and 'end' of the stored procedure.
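Two usual suspects, offered as guesses rather than a diagnosis: different SET options between the two connections (Query Analyzer defaults SET ARITHABORT ON, while most client libraries leave it OFF, so the proc gets a separate - possibly much worse - cached plan per setting), and parameter sniffing. A way to test the first theory from QA (the proc name is a placeholder):

```sql
-- Reproduce the client's environment inside Query Analyzer; if the
-- proc suddenly takes minutes here too, the two connections have been
-- compiling and caching two different plans.
SET ARITHABORT OFF;
EXEC dbo.MyProc;   -- hypothetical proc under test

-- Flushing the proc's plans forces both environments to recompile:
EXEC sp_recompile 'dbo.MyProc';
```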
I am writing an ASP-based application that creates a dynamic query, then executes it and displays the results. I was thinking about writing a stored procedure to increase performance. How much can the SP help me boost query response time?
I have a stored procedure with strange behavior. As soon as it is created it performs well, but after some (indeterminate) time it takes much longer to execute... (up to 70s!!!)
The thing that I've not understood: if I take the query inside the stored procedure and execute it separately, I get the result immediately... :eek:
Dropping and re-creating the procedure makes it fast again... I've just scheduled a maintenance plan with index optimization and integrity checks, but this doesn't seem to help...
Hi, how should I monitor the performance of stored procedures and SQL statements? I want to know how much CPU time a query or stored procedure is taking.
Are there any system tables which give this information?
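Yes - on SQL 2000-era servers the main built-in hooks are master..sysprocesses (cumulative per-connection CPU and I/O), Profiler traces, and the SET STATISTICS session options; SQL 2005 adds DMVs such as sys.dm_exec_query_stats. A sketch:

```sql
-- Cumulative CPU (ms of scheduler time) and physical I/O per connection:
SELECT spid, cpu, physical_io, status, loginame, program_name
FROM master..sysprocesses
ORDER BY cpu DESC;

-- Per-statement timing and reads for whatever you run next in this
-- session; the numbers are reported in the Messages pane:
SET STATISTICS TIME ON;
SET STATISTICS IO ON;
-- EXEC dbo.MyProc;   -- hypothetical proc/query under test
```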