ProcessInput Method Kept A Reference To The Buffer
Sep 7, 2006
Hi
I have written a custom data flow component that counts the rows coming through the pipeline and adds the counts to an object stored in a package variable.
The component sometimes generates warnings with the following message - "A call to the ProcessInput method for input 358 on component "<Component Name>" (354) unexpectedly kept a reference to the buffer it was passed. The refcount on that buffer was 4 before the call, and 5 after the call returned.".
The component is used several times throughout ~60 packages. It does not always occur on the same instance, or even in the same package, from one occasion to another. When all the packages are executed (from a master package - some in parallel and some in series) the warning is generated several times (around 10), but it is not generated at all when the packages are executed individually from BIDS. The refcount values vary from message to message, but the count after the call is always exactly one higher than before. The server has SP1 and the hotfix package installed, but the warning was being generated before SP1.
I do not understand what could be keeping a reference to the buffer. The ProcessInput method is very basic:
public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    try
    {
        // The end-of-rowset buffer carries no data rows, so only count real buffers.
        if (!buffer.EndOfRowset)
        {
            totalRowCount += buffer.RowCount;
        }
    }
    catch (Exception ex)
    {
        bool cancel = false;
        ComponentMetaData.FireError(0, ComponentMetaData.Name, ex.Message, string.Empty, 0, out cancel);
        throw new Exception("Could not process input.", ex);
    }
}
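For completeness, here is roughly how the rest of the component hangs together: the running total is held in a field and only written to the package variable once, in PostExecute, so nothing from the buffer is retained. This is a minimal sketch rather than the production code; the class name, the variable name User::TotalRowCount and the simplification to a plain int are assumptions.

using System;
using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

[DtsPipelineComponent(DisplayName = "Row Count To Variable", ComponentType = ComponentType.Transform)]
public class RowCountToVariable : PipelineComponent
{
    private int totalRowCount;

    public override void PreExecute()
    {
        base.PreExecute();
        totalRowCount = 0;
    }

    public override void ProcessInput(int inputID, PipelineBuffer buffer)
    {
        // Only read the row count; never store the buffer reference in a field.
        if (!buffer.EndOfRowset)
        {
            totalRowCount += buffer.RowCount;
        }
    }

    public override void PostExecute()
    {
        base.PostExecute();

        // Write the total into the package variable once, after all buffers have been processed.
        IDTSVariables90 variables = null;
        VariableDispenser.LockOneForWrite("User::TotalRowCount", ref variables);
        try
        {
            variables["User::TotalRowCount"].Value = totalRowCount;
        }
        finally
        {
            variables.Unlock();
        }
    }
}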
Does anyone have any idea what could be causing this, or ideas for things I should be investigating?
TIA . . . Ed
View 5 Replies
Jan 3, 2007
Hello,
I have a Data Flow Task within an SSIS package where data is pulled from a Fixed Width flat file, goes through a conditional split, and sends data either to a SQL Server 2000 table, or to another fixed width flat file.
The first condition in the conditional split checks for the following; [Column 44] + [Column 45] != "D1" . If this is true, then those rows go to a SQL Server 2000 table destination. If false, then the rows go to another flat file.
The SQL Server 2000 table has a unique nonclustered index with ignore_dup_key. From what I can tell, it looks like the step fails when a duplicate record is encountered. However, it appears that the unique records still go into the table.
When I check the 'Progress' tab of the package, the following steps seem suspect:
[Populate Staging [786]] Information: The final commit for the data insertion has started.
[Populate Staging [786]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "Duplicate key was ignored.".
[Populate Staging [786]] Information: The final commit for the data insertion has ended.
[DTS.Pipeline] Error: The ProcessInput method on component "Populate Staging" (786) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0202009.
[DTS.Pipeline] Information: Post Execute phase is beginning.
..and..
Task 2 Isolate Suspect Records and Insert Known Good Records failed
In one case, the source flat file had 74,101 rows. Each row tested true in the conditional split, and all but one row made it into the SQL Server table. The step failed, and the 'Progress' section contained the entries mentioned above.
The Connection Manager for the SQL Server table destination uses the Native OLE DB\Microsoft OLE DB Provider for SQL Server.
Is the 'error' that occurs when a duplicate row is encountered causing the Data Flow task to fail, or could it be something else?
Thank you for your help. Please let me know if you need more information.
CSDunn
View 2 Replies
Mar 6, 2008
Good day everyone,
I'm experiencing a completely random warning from any given row count component within any given data flow task. It occurs sporadically. While it is distracting, I don't see any adverse effects on the data after the packages complete. Can someone weigh in on this warning and let me know whether it is indeed benign, or what I may be able to do to fix it?
Here's the warning:
"A call to the ProcessInput method for input 75997 on component "CNT Rows sent for STG table" (75995) unexpectedly kept a reference to the buffer it was passed. The refcount on that buffer was 4 before the call, and 5 after the call returned."
Thanks,
Langston
View 4 Replies
Mar 22, 2007
I'm trying to create a simple data transformation. I have a flat file that came off a Unix server. It's 177 bytes wide (I thought it was 175, but when I created the flat file connection I could see some extra characters on the end).
My output is going to be an Excel spreadsheet, and I only want two columns from the input. I created an OLE DB Jet 4.0 connection and followed the instructions from here:
http://aspalliance.com/889_Extracting_Data_from_a_Flat_File_with_SQL_Server_2005_Integration_Services
to create my data flow.
On my first attempt at the data flow I ran into Unicode errors, so in the flat file source I changed the output column in question to Unicode string [DT_WSTR].
When I run it, here are the errors I get:
[OLE DB Destination [513]] Error: An OLE DB error has occurred. Error code: 0x80040E09.
[DTS.Pipeline] Error: The ProcessInput method on component "OLE DB Destination" (513) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0202009.
[GanchoFileSource [1]] Information: The total number of data rows processed for file "\ammia01dev04D$JCPcpmgancho_venta_20070321.sal" is 19036.
[GanchoFileSource [1]] Error: Setting the end of rowset for the buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: The PrimeOutput method on component "GanchoFileSource" (1) returned error code 0xC0209017. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.
View 4 Replies
May 29, 2008
I've written an asynchronous script component and I have created the output buffer. 500,000 rows go into the input but only 1,500 to 2,000 rows come out before I get an SSIS "Object reference not set to an instance of an object" error. The error occurs at the AddRow method of the output buffer (that's how I know it's gone). Why does this happen? Is there a way to sync up the output buffer with the input buffer?
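For reference, the pattern I'm using looks roughly like this, following the documented asynchronous-output approach. This is a sketch only; the generated Input0Buffer/Output0Buffer names and the column are illustrative, and a C# script assumes SSIS 2008 or later.

// Inside the script component's ScriptMain; the output's SynchronousInputID is set to None.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Add one row to the asynchronous output for every input row.
    Output0Buffer.AddRow();
    Output0Buffer.SomeColumn = Row.SomeColumn;   // hypothetical column copy
}

public override void Input0_ProcessInput(Input0Buffer Buffer)
{
    // The base implementation calls Input0_ProcessInputRow for each row in the buffer.
    base.Input0_ProcessInput(Buffer);

    // Close the output only after the final input buffer has been seen.
    if (Buffer.EndOfRowset())
    {
        Output0Buffer.SetEndOfRowset();
    }
}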
View 25 Replies
Oct 10, 2006
Hi
I am trying to make a custom data flow component. The component has one input, which I map to external metadata columns, and one output.
When I run the task it fails with this error (I am pasting the whole SSIS output):
SSIS package "Package.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Flat File Destination [1855]: The processing of file "C:ole db eft data.txt" has started.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Error: 0xC0047062 at Data Flow Task, Lib [2387]: System.ArgumentException: Value does not fall within the expected range.
at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer90.DirectRow(Int32 hRow, Int32 lOutputID)
at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.DirectRow(Int32 outputID)
at Lib1.LibPM.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper90 wrapper, Int32 inputID, IDTSBuffer90 pDTSBuffer, IntPtr bufferWirePacket)
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: The ProcessInput method on component "Lib" (2387) failed with error code 0x80070057. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x80070057.
Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0047039.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Flat File Destination [1855]: The processing of file "C:ole db eft data.txt" has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "Flat File Destination" (1855)" wrote 0 rows.
Task failed: Data Flow Task
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
----------------------------
This is my piece of code which is trying to put the data in the output buffer (ProcessInput function).
int GoodOutputId = -1;
IDTSInput90 inp = ComponentMetaData.InputCollection.GetObjectByID(inputID);
//GetErrorOutputInfo(ref errorOutputID, ref errorOutputIndex);

// ID of the first (non-error) output.
GoodOutputId = ComponentMetaData.OutputCollection[0].ID;
System.Console.Write("Here i am");
System.Console.Write(GoodOutputId);

if (!buffer.EndOfRowset)
{
    while (buffer.NextRow())
    {
        if (_inputColumnInfos.Length == 0)
        {
            // No input columns selected: just direct the row to the output.
            buffer.DirectRow(GoodOutputId);
        }
        else
        {
            // Placeholder: the per-column logic will go here later.
            buffer.DirectRow(GoodOutputId);
        }
    }
}
I have not put any real logic in the else part, as I am just trying to get the task running for now and will add the functionality later.
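From what I have read, DirectRow only works when the output is synchronous with the input and the outputs share a non-zero ExclusionGroup, so I wonder if my metadata setup is the problem. The setup I believe is required looks roughly like this (a sketch of ProvideComponentProperties with illustrative names, not my actual code):

public override void ProvideComponentProperties()
{
    base.RemoveAllInputsOutputsAndCustomProperties();

    IDTSInput90 input = ComponentMetaData.InputCollection.New();
    input.Name = "Input";

    IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
    output.Name = "Good Output";

    // DirectRow requires a synchronous output that belongs to an exclusion group.
    output.SynchronousInputID = input.ID;
    output.ExclusionGroup = 1;
}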
Please let me know if I have missed something. Thanks in advance.
Vipul
View 1 Replies
Apr 18, 2012
I encountered the following error while attempting to preview an RDL report I was developing in VS2010 using SSDT: "The size necessary to buffer the XML content exceeded the buffer quota".
View 3 Replies
Oct 7, 2015
We have a set of reports with the same header section in all the reports. While developing a new report I copied that header section into the new report with the same dataset names (without any change), but while rendering the report it throws the error "The size necessary to buffer the XML content exceeded the buffer quota".
View 2 Replies
Apr 28, 2006
Hi
I have a master package that executes a series of sub packages run from a SQL Agent job. One of those sub packages has been stable for a week, running at least once per day, but it just failed despite having been run once already today with the same set of input data.
There were a series of errors showing in the event log for the Execute Package Task starting with "Buffer Type 15 had a size of 0 bytes.", then "The buffer manager failed to create a new buffer type.", then "The Data Flow task cannot register a buffer type. The type had 32 columns and was for execution tree 3.", then "The layout failed validation." and finally "Error 0xC0012050 while loading package file "C:[Package].dtsx". Package failed validation from the ExecutePackage task. The package cannot run.".
SQLIS.com reports the constant for the error code as DTS_E_REMOTEPACKAGEVALIDATION ( http://wiki.sqlis.com/default.aspx/SQLISWiki/0xC0012050.html ).
I then ran the package on my dev machine in BIDS and it worked fine, so I re-ran the job on the server and this time that package executed OK, but another one fell over, this time without putting anything in the event log.
Does any one have any idea what happened?
TIA . . . Ed
View 2 Replies
May 10, 2007
Hi, I am trying to write a method which needs to call a stored procedure and then get the stored procedure's response back into a variable I declared in the method:

private string GetFromCode(string strWebVersionFromCode, string strWebVersionString)
{
    // call stored procedure
}

strWebVersionFromCode = GetFromCode(strFromCode, "web_version"); // the variable that will store the response

How should I do this? Please assist.
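I am guessing something along these lines is what I need (a sketch that assumes the stored procedure, here called usp_GetWebVersion, takes the code and version type as parameters and returns the version as a single value; the procedure name, parameter names and connection string are all assumptions):

using System.Data;
using System.Data.SqlClient;

private string GetFromCode(string strFromCode, string strVersionType)
{
    using (SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
    using (SqlCommand cmd = new SqlCommand("usp_GetWebVersion", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@FromCode", strFromCode);
        cmd.Parameters.AddWithValue("@VersionType", strVersionType);

        conn.Open();

        // ExecuteScalar returns the first column of the first row of the result set.
        object result = cmd.ExecuteScalar();
        return result == null ? null : result.ToString();
    }
}

// Usage:
// strWebVersionFromCode = GetFromCode(strFromCode, "web_version");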
View 3 Replies
Sep 10, 2014
I am using VS 2010 to write my dtsx import packages. I use a script component as a source to create a flat file destination file. Everything had been working fine, but then my development machine crashed and we had to install everything again. Now when I use the Execute Package Utility to test my packages I get the following error:
Error: System.NullReferenceException: Object reference not set to an instance of an object.
In the PreExecute section:
textReader = New System.IO.StreamReader("file name")
In CreateNewOutputRows:
Dim nextLine As String
nextLine = textReader.ReadLine()
[code]...
Is there something I did not install, or what else could be causing the error?
View 0 Replies
Jan 29, 2008
Hi,
I just have a DataSet with my tables and that's it.
I have a GridView with several pieces of data on it.
There is no problem getting the data or inserting, but as soon as I try to delete or update some records the local machine throws the same error:
Unable to find nongeneric method...
I've tried to create an Update query in my table adapters but it is still not working.
I also tried removing the original_{0} and got the same error...
Please help if anyone has a solution
Thanks
View 7 Replies
Apr 6, 2007
The scenario is as follows: I have a source with many rows. Each row has a column called max_qty_value. I need to perform a calculation using another column called qty. The calculation is essentially ceiling(qty / max_qty_value). Once I have that number, I need to write that many duplicate rows to the target. For example, ceiling(15 / 4) = 4, so I need to write 4 rows to the same target table, as line information for a purchase order.
The multicast transform appears to only support a fixed, predetermined set of outputs. How do I design this logic in SSIS to write out a dynamic number of rows to a target table?
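The only approach I can think of so far is an asynchronous Script Component that emits ceiling(qty / max_qty_value) copies of each incoming row. A sketch of what I have in mind, shown in C# (the column names Qty and MaxQtyValue, the LineNumber sequence column and the generated buffer names are assumptions):

// Inside the script component's ScriptMain; the output is not synchronous with the input.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Number of output rows for this input row, e.g. ceiling(15 / 4) = 4.
    int copies = (int)Math.Ceiling((double)Row.Qty / Row.MaxQtyValue);

    for (int i = 0; i < copies; i++)
    {
        Output0Buffer.AddRow();
        Output0Buffer.Qty = Row.Qty;
        Output0Buffer.MaxQtyValue = Row.MaxQtyValue;
        Output0Buffer.LineNumber = i + 1;   // sequence number for the duplicated line
    }
}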
Any ideas would be greatly appreciated.
thanks
John
View 18 Replies
Apr 3, 2001
I'm having a problem importing a text file into a SQL db using DTS. I have to transform some of the data that is being imported so I think Bulk import is out of the question.
Everything works fine until a hit a row that contains more than 255 characters in one cell. Once it encounters that row, it fires this error:
"Error at source for row number 9.Errors encountered so far in this task :1
General Error: -2147217887(80040E21)
Data for Source Column 3('Col3') is too large for the specified buffer size."
I found an entry in the MS Knowledge Base that addresses the symptom, but the workaround doesn't fix it:
http://support.microsoft.com/support/kb/articles/Q281/5/17.ASP?LN=EN-US&SD=tech&FR=0&qry=DTS%20buffer&rnk=3&src=DHCS_MSPSS_tech_SRCH&SPR=SQL
Anyone have any ideas?
View 2 Replies
Sep 30, 1999
Help, we have recently upgraded from 6.5 to 7.0 and have come across a performance problem. The problem appears to relate to the buffer cache being flushed: the buffer cache hit ratio drops from 98% to 0% in a matter of a second. It then grows very slowly, is flushed again, and then increases slowly up to 30%.
Does anyone have any ideas as to what would flush the buffer cache?
Any comments would be much appreciated - cheers
View 1 Replies
Jan 27, 2003
Hi,
We upgraded our applications from a 7.0 server to a Windows 2000, SQL Server 2000 SP2 machine.
While running the same batch job that we used to run on the old NT server without any problem, the job failed with the following message:
'Msg 845, Sev 17: Time-out occurred while waiting for buffer latch type 3 for page (1:8200), database ID 2. [SQLSTATE 42000]'.
Can someone tell me what's going on? This new server is supposed to be much more powerful than the old server.
What value should I monitor for the new server to prevent this timeout?
-Shaili
View 1 Replies
Aug 22, 2007
Hi all,
I have a strange problem that I need to solve as soon as possible.
I have created two CLR UDTs called point and point_list. Each record of a point_list consists of a list of points. I created a CLR stored procedure which reads some raw data and updates the point_list records.
When I execute the stored procedure the following error appears:
System.Data.SqlTypes.SqlTypeException: The buffer is insufficient. Read or write operation failed.
System.Data.SqlTypes.SqlTypeException:
at System.Data.SqlTypes.SqlBytes.Write(Int64 offset, Byte[] buffer, Int32 offsetInBuffer, Int32 count)
at System.Data.SqlTypes.StreamOnSqlBytes.Write(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.BinaryWriter.Write(Char ch) etc ...
Does anybody know what I should do?
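One thing I am wondering about (not verified): whether the serialized point_list is simply exceeding the MaxByteSize declared on the UDT. The attribute I mean looks roughly like this (a skeletal sketch, not my actual type; on SQL Server 2005 the upper limit is 8000 bytes, while SQL Server 2008 and later accept MaxByteSize = -1 for large types):

using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

[Serializable]
[SqlUserDefinedType(Format.UserDefined, MaxByteSize = 8000, IsByteOrdered = true)]
public struct PointListSketch : INullable, IBinarySerialize
{
    private bool isNull;

    public bool IsNull { get { return isNull; } }

    public static PointListSketch Null
    {
        get { PointListSketch p = new PointListSketch(); p.isNull = true; return p; }
    }

    public static PointListSketch Parse(SqlString s) { return Null; }

    public override string ToString() { return string.Empty; }

    // The real type would serialize its list of points here; the declared MaxByteSize
    // caps how much these methods are allowed to write.
    public void Write(System.IO.BinaryWriter w) { w.Write(isNull); }
    public void Read(System.IO.BinaryReader r) { isNull = r.ReadBoolean(); }
}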
Thanks!
View 4 Replies
Jul 20, 2005
Hi there. Anybody know how to increase the MS SQL Server buffer size? I get an error when trying to insert some pictures as OLE objects. When transferring them to the server I get an error that the buffer size needs to be increased. Regards, Rudi W.
View 1 Replies
Oct 17, 2006
I am trying to set a decimal value on a pipeline buffer column, but it doesn't get set and the value comes through as NULL.
Here is the portion of the code of what I am trying to do:
if (columnInfos[x].colName.EndsWith("_CRC"))
{
    int a;
    a_cmp tst = new a_cmp();

    // a_crc32 returns an integer CRC value.
    a = tst.a_crc32(inputbufferstream);

    // Note: the column checked above is columnInfos[x], while the buffer index written to
    // comes from colInfo; these need to refer to the same column.
    buffer.SetDecimal(colInfo.bufferColumnIndex, Convert.ToDecimal(a));
}
Please let me know how to set a decimal value in the buffer.
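For reference, the pattern I believe is intended is to resolve the buffer column index from the input column's lineage ID in PreExecute and then call SetDecimal with that index on a decimal-typed (DT_DECIMAL/DT_NUMERIC) column. A sketch that lives inside the PipelineComponent subclass; the _CRC naming and the ComputeCrc32 helper are placeholders:

private int crcBufferIndex = -1;

public override void PreExecute()
{
    base.PreExecute();

    IDTSInput90 input = ComponentMetaData.InputCollection[0];
    foreach (IDTSInputColumn90 column in input.InputColumnCollection)
    {
        if (column.Name.EndsWith("_CRC"))
        {
            // Map the column's lineage ID to its position in the buffer.
            crcBufferIndex = BufferManager.FindColumnByLineageID(input.Buffer, column.LineageID);
        }
    }
}

public override void ProcessInput(int inputID, PipelineBuffer buffer)
{
    if (!buffer.EndOfRowset)
    {
        while (buffer.NextRow())
        {
            int crc = ComputeCrc32(buffer);   // placeholder for the real CRC calculation
            buffer.SetDecimal(crcBufferIndex, Convert.ToDecimal(crc));
        }
    }
}

private int ComputeCrc32(PipelineBuffer buffer)
{
    // Placeholder only; the real implementation computes the CRC from the row's column values.
    return 0;
}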
View 2 Replies
Aug 8, 2007
Hi
A script component receives some input, but I just can't get at the first row.
Basically, if I use the NextRow method in the Do statement, it advances the row collection to the second row before it gets into the code inside the loop. BUT, if I use the EndOfRowset property to define my loop, then I get an error:
[PipelineBuffer has encountered an invalid row index value]
I'm guessing this means I have to call NextRow before I access the data in the collection? But that can't be right, because then I miss the first row. What am I missing?
This is the code which works but I miss the first row:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim strConcept As String
    Do While Row.NextRow()
        strConcept = Row.concept
        updateDb(strConcept)
    Loop
End Sub
This is the code which throws the invalid row index error:
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    Dim strConcept As String
    Do While Not Row.EndOfRowSet()
        strConcept = Row.concept
        updateDb(strConcept)
        Row.NextRow()
    Loop
End Sub
I've put some try/catches in there, and the error happens on the line which calls Row.concept.
Can anyone help? It must be something I'm messing up.
thanks!!
andy
View 17 Replies
Dec 28, 2006
All,
My weekly load failed and here is the error message I got. Could someone kindly point out what the problem is and how to deal with it?
Thanks
Error: 0xC0047012 at Fact_ResidentService, DTS.Pipeline: A buffer failed while allocating 63936 bytes.
Error: 0xC0047011 at Fact_ResidentService, DTS.Pipeline: The system reports 43 percent memory load. There are 4227104768 bytes of physical memory with 2378113024 bytes free. There are 8796092891136 bytes of virtual memory with 8787211939840 bytes free. The paging file has 10300792832 bytes with 14786560 bytes free.
Error: 0xC0047022 at Fact_ResidentService, DTS.Pipeline: The ProcessInput method on component "Union All 1" (3629) failed with error code 0x8007000E. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC02020C4 at Fact_ResidentService, From_Basis [16]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Fact_ResidentService, DTS.Pipeline: The PrimeOutput method on component "From_Basis" (16) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at Fact_ResidentService, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0x8007000E.
Error: 0xC0047021 at Fact_ResidentService, DTS.Pipeline: Thread "SourceThread1" has exited with error code 0xC0047038.
Error: 0xC0047039 at Fact_ResidentService, DTS.Pipeline: Thread "WorkThread2" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
View 7 Replies
May 29, 2006
When running a package created on my local machine I get no errors at all, but when I try to run the same package on the server I get an error specifying Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
I have tried changing the DefaultBufferSize of the data flow task but this makes no difference. I think the buffer size for a particular column is being exceeded, but I cannot find anywhere to set this property.
Has anyone else struck this error?
View 1 Replies
Oct 22, 2007
We had a package fail when trying to get 3 buffers. I also saw another message saying "18430 buffers were considered and 18430 were locked."
1. Why are buffers locked in SSIS?
2. How can I reduce the number of locked buffers?
The package in question had a very large lookup.
Thanks
Glenn
View 1 Replies
Nov 20, 2000
Upon running DTS manually to transfer data from Excel into SQL Server, I get the error:
-----------------------------ERROR OUTPUT------------------------------------
Error at Source for Row number 264. Errors encountered so far in this task: 1. General error -2147217887 (80040E21).
Data for source column 3 ('Value') is too large for the specified buffer size.
---------------------------END ERROR OUTPUT----------------------------------
*** 'Value' is varchar(4000); the largest value has a length of 1000.
*** The network packet size is 4096.
Am I supposed to change the buffer size??
Your kind help is greatly appreciated
Thanks
Ziggy
View 2 Replies
Nov 1, 1999
I have been looking at Books Online and I'm trying to figure out how I can resolve this error.
MSG 845, Level 17, State 1
Time out occurred while waiting for buffer latch type 2 for page.....
Thanks..
View 1 Replies
Jun 15, 2004
Time out occurred while waiting for buffer latch type 2,bp 0x18b7d40, page 1:11558916), stat 0xb, object ID 9:1842105603:2, EC 0x5862D9C8 : 0, waittime 300. Not continuing to wait.
What does this mean? Any reason and fix for it?
Thanks
View 1 Replies
May 24, 2006
Hi
I've been searching this site and the Web for info on an error message I get when importing from Access 2003 into SQL Server 2000.
'Data for Source Column 3('Col3') is too large for the specified buffer size'
A memo field in Access is larger than 255 characters.
I have followed advice about moving the field to the first column. This doesn't work; the error just returns the new column number. In fact, I've tried importing just the first column, with no luck.
I am wary about making Registry changes as comments on the Web say this doesn't work either.
Does anybody have a solution for this?
Paul
View 6 Replies
May 7, 2004
I want to trace user logins by using a stored procedure. This script (sp_login_trace) was created by the SQL Profiler tool. (Once this procedure works well, I will use sp_procoption to run it automatically every time SQL Server starts up.)
After I successfully created sp_login_trace, I ran it (exec sp_login_trace). The trace process started and the TraceID is 1 (I used select * from ::fn_trace_getinfo(default) to verify it). However, the file size of login_trace.trc is always 0, even after I use Query Analyzer or Enterprise Manager to let some users log into the SQL Server instance. (When I use SQL Profiler to start a trace, the trace file size increases as users continually log in.) At that point, if I use SQL Profiler to open the login_trace.trc file, the system gives me an error message: No data since Empty File.
After I stop and delete the trace process, I find that the file size of login_trace.trc becomes 128K and I can see the login records caught by sp_login_trace if I use SQL Profiler to open this file again.
How can I flush the buffer to the .trc file frequently without needing to stop the trace process?
Thanks in advance for your help.
Leon
View 1 Replies
Sep 1, 2015
Last Sunday on our primary server I saw some blocking, and the first spid that blocked everything was waiting on a LogBuffer wait. At the time the server was not running hot and our database is on SSDs. There was no memory pressure (1 TB of memory on the server). It is strange that we would get LogBuffer waits when we are running on the fastest disks possible and there was not much activity on the database.
View 4 Replies
Jun 12, 2008
hello again,
This is another pending issue.
It is another package accessing the same database of the same system ("Sage" for commercial and accounting operations).
When run, it gives an error message as follows (also translated from French):
Simba ODBC Driver[CBase]: Very Small/Insufficient Buffer Zone. Data is truncated.
It seems there is a problem with the database, as when we run the same package against another database it runs successfully.
It may be that the database needs maintenance. If so, could you advise how to do it?
Could you help resolve this issue?
thank you in advance.
Leïla
P.S: How is it possible to attach a file?
View 1 Replies
Feb 15, 2007
Ramesh writes "Hi,
Is there any way to execute a query in SQL Server 2000 without keeping / using the buffer?
Thanks"
View 1 Replies
May 16, 2007
Hi,
My problem is that I cannot completely clean the buffer cache on SQL Server 2005 version 9.00.2047.00 (probably SP1).
Right after I run DBCC DROPCLEANBUFFERS in the context of my database (this is a development server, and so far I am the only one working with this particular database), I run a script that queries the sys.dm_os_buffer_descriptors view, also from the context of my database, to make sure that the buffer cache is really clean. However, it shows a large number of entries totalling 42 MB.
I have run both the DBCC and the script in the past, and the query always returned nothing, which means the buffers really were clean. The reason I am running this is for benchmarking of an existing and a new application.
Does anybody have any ideas or suggestions on how to troubleshoot this issue? I have already closed all connections to this database, but rebooting the server is not an option since other people are also working on it.
Thanks
View 2 Replies
Jun 1, 2006
I am getting the sort of error messages below, at random times, when running a simple select against a table from Query Analyzer 2000 against a SQL Server 2000 instance running SP4.
1)
[Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionRead (InvalidParam()).
Server: Msg 11, Level 16, State 1, Line 0
General network error. Check your network documentation.
Connection Broken
2)
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
[Microsoft][ODBC SQL Server Driver]TDS buffer length too large
[Microsoft][ODBC SQL Server Driver]Protocol error in TDS stream
3)
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
[Microsoft][ODBC SQL Server Driver]Invalid cursor state
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
Has anyone faced this error? Any advice please.
View 5 Replies