I have a SQL Server 2000 Enterprise Edition machine with 8 GB of RAM.
It is configured to use a fixed memory size of 7 GB, and all the appropriate OS switches are set to support that.
Everything had been working fine, but we've been having some production problems, and while looking into them I checked Task Manager: the Performance tab shows the server using 7+ GB, yet the Processes tab shows sqlservr.exe at 107,724 K.
Do you think this is a real issue, and that the server might only be using 100 MB of RAM? Does anyone have a larger server like this that they could compare against?
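For what it's worth, with AWE the Processes tab only shows the working set, which excludes AWE-mapped pages, so sqlservr.exe can look tiny no matter how much memory it really holds. A quick sanity check, as a sketch against the SQL Server 2000 sysperfinfo table, is to ask the instance itself:

    -- How much memory does the instance think it has committed?
    SELECT cntr_value / 1024 AS total_server_memory_mb
    FROM master.dbo.sysperfinfo
    WHERE object_name LIKE '%Memory Manager%'
      AND counter_name LIKE 'Total Server Memory%';

If that reports about 7 GB, the fixed memory setting is working and the 107,724 K figure is just a Task Manager reporting artifact.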
I am a newbie with SQL Server and my problem is this:
My SQL Server handles many client queries, and after a while I can see the memory usage shown in the Windows Task Manager growing (e.g. to 260 MB!).
I checked Books Online and found the "min/max server memory" settings, which I set to min = 4 MB and max = 20 MB using Enterprise Manager.
Then I restarted my SQL Server, checked the memory settings again in Enterprise Manager, and started many client queries. The memory usage in Task Manager nevertheless exceeded the 20 MB.
What am I doing wrong? How can I limit the memory usage of SQL Server?
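For reference, the same settings can be applied in T-SQL; a minimal sketch, with values in MB:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'min server memory', 4;   -- MB
    EXEC sp_configure 'max server memory', 20;  -- MB
    RECONFIGURE;

Note that max server memory caps only the buffer pool; the figure Task Manager shows also includes the executable itself, thread stacks, and other allocations, so the process can legitimately sit above 20 MB even when the setting took effect.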
I have SQL Server 2005 Developer Edition on Windows Server 2003 Enterprise SP2. In Task Manager I see that the server uses 5 GB out of 6 GB, but when I sort the processes to see who takes all the memory, I see sqlservr.exe taking only 150 MB (it is the biggest). When I open Perfmon and look at sqlservr.exe's memory use, I see that it takes the 4.5 GB. I also have a problem with the server using a lot of CPU time to run user queries, and I see that the Pages/sec counter is very big, averaging between 600 and 800.
Is there any problem with my memory? Why is the memory reading from Task Manager wrong? If my Pages/sec counter is so big, do I have a memory leak plus memory pressure?
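One way to cross-check Task Manager on SQL Server 2005 is to read the instance's own counter; a sketch:

    SELECT cntr_value / 1024 AS total_server_memory_mb
    FROM sys.dm_os_performance_counters
    WHERE counter_name LIKE 'Total Server Memory%';

If this agrees with Perfmon's 4.5 GB, Task Manager is simply showing the working set, which excludes AWE-mapped or locked pages; the high Pages/sec figure is then a separate question from any supposed leak inside SQL Server.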
I have an XML Task to validate an XML document on disk (size: 78,464 KB). When I try to validate it I get the error: [XML Task] Error: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
Task Manager shows 904,492 K Physical Memory Available, which drops to about 586,000 K when the task fails. Any ideas?
[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 12 buffers were considered and 12 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
I am trying to use a DTS package to get data from DB2 in an S/390 environment. I am able to use the Import task, run a query on DB2, save the package, and execute the package. But when I try to edit the Transform task I get an mmc.exe application error... it says that the instruction at address "" tried referencing memory at address "". The memory could not be read...
I installed an IBM ODBC driver on my client... obviously the connection seems to work, since the package executes... But then there's the edit issue...
If anyone has faced this problem or knows what I am doing wrong, I appreciate your time and effort... Thanks
I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT stylesheet.
The file is about 2 MB on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 MB. That is where my problem is.
I run this on my 2 GB memory workstation, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory" exception.
I also ran this package on an 8 GB RAM server, and the same applies.
Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped my issue, but nothing.
I'm trying to map the SPID of a certain process to the PID on the Processes tab in Task Manager. I can map the SPID to the KPID through the sysprocesses table, but this KPID does not appear among the PIDs on the Processes tab. How can this be done?
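For context: kpid in sysprocesses is a Windows thread ID, not a process ID. Every SPID runs inside the single sqlservr.exe process, so Task Manager's Processes tab only ever shows one PID for the whole instance; thread IDs are visible in a tool like Process Explorer instead. A sketch of the mapping query:

    -- kpid = Windows thread ID inside sqlservr.exe (0 if no thread is assigned)
    SELECT spid, kpid, loginame, hostname
    FROM master..sysprocesses
    WHERE kpid <> 0;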
When I try to create an SSIS package with an FTP task, it always fails to compile with the messages:
Error at Package: The connection "" is not found. This error is thrown by Connections collection when the specific connection element is not found.
Error at FTP Task [FTP Task]: Connection manager "" can not be found.
Error at FTP Task: There were errors during task validation. (Microsoft.DataTransformationServices.VsIntegration)
The item is marked with a circled-X. The tool-tip also agrees: The connection "" is not found.
Using my recreation steps below, I am setting up a connection manager, but it is never found at compile time.
1. Open BIDS, select a new Integration Services project.
2. Drag an FTP task from the toolbox to the Control Flow window.
3. Double-click the FTP task.
4. On the FTP Task Editor window, General page, pull down the FTPConnection property.
5. Select <New Connection...>
6. On the FTP Connection Manager Editor window, enter the server name, port, username and password.
7. Click "Test Connection" to verify connectivity. (It succeeds.)
8. Click OK on the FTP Connection Manager Editor window.
9. Click OK on the FTP Task Editor window.
I'm not using a configuration file or logging provider for this example. I have also installed SP2 + cumulative update 2.
This is running on SQL Server 2012; we have set up the databases in an Availability Group. The strange behavior I have been seeing the past few weeks is that an index reorganization gets blocked by a session whose command type shows as 'Task manager', but I have not found out what that task manager is doing; it won't show the statement that is running. The index is very big, and I have been stopping the reorganization because of the blocking chain it causes three levels deep.
We had this issue yesterday for about an hour. Suddenly, our website started to get slow and hit timeout errors. After executing sp_who2 and sp_whoisactive, we found a system process blocking many other processes. Of course, it was impossible for us to kill it.
In sp_who2, "Status" said TASK MANAGER and "Login" was sa. In sp_whoisactive, "sql_text" was NULL, "program_name" was empty, "open_tran_count" fluctuated (2, 3, 4, 3, 4, 2, 1) until it finished, and "database_name" changed from our database to master at the end.
Our CPU usage and memory at that moment were almost the same as any other day.
We don't run backups or execute jobs at that particular time. We don't even use the "sa" account.
What could have happened? We've found nothing in the Event Viewer, nor in the SQL Server logs.
Once it finished, everything was back to normal; we want to prevent this from happening again.
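In case it happens again, a snapshot along these lines may capture what the background session is doing; 'TASK MANAGER' is the command label SQL Server gives internal background tasks, and these are standard DMVs, so this is only a monitoring sketch:

    SELECT session_id, command, status, wait_type,
           blocking_session_id, percent_complete
    FROM sys.dm_exec_requests
    WHERE command = N'TASK MANAGER';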
Hi, I'm using Integration Services between Firebird and MS SQL 2005. I am able to move data from Firebird db tables to MS SQL db tables using the DataReader source (Firebird connection manager) and the SQL Server destination (SQL Server connection manager) components.
Now I want to execute a parameterized SQL statement to fetch data from the Firebird DB using the Execute SQL Task component. While trying to set the Connection property on this component, I can see only the MS SQL connection manager; it does not list the Firebird connection manager. I need help getting this Execute SQL Task component connected to the Firebird connection manager. Help me find out where I'm making a mistake.
I am running into issues with custom objects interacting with Visual Studio 2005.
1. Custom connection manager. I am setting the name of the connection manager that the user created (to standardize the naming convention); however, when the CM is created, the name displayed in Visual Studio is still the default name, even though the real name is the one I set. I have to do things like edit it, save the package, close and reopen, create another connection, etc. in order to get it refreshed.
2. Custom task. I am managing some variables in this custom task, so I will be adding and deleting variables in the package. The challenge I am running into is that when I add two variables, for example, even though the variables are successfully added to the package, the Variables window in the Visual Studio designer does not reflect the new variables. I have to save the package, close, and re-open it in order for the variables to show up.
So this brings me to my question: is there any way to tell Visual Studio programmatically to refresh the contents of these two sections, one being the Variables window and the other the panel containing the list of connection managers?
I have been searching around and found some clues about the Visual Studio SDK, but I still cannot find an exact way of doing it. The Visual Studio SDK examples tell you how you can access the Variables window.
We have an SSIS job that runs once a day and sends emails. For security reasons, IT switched the send-mail port on the SMTP server from 25 to a different number (let's say 1234).
So, I changed the SSIS SMTP connection manager's SmtpServer string to smtpserver:1234.
However, this does not seem to be working. I keep getting a 'Send Mail Failure' error.
Any ideas how to set the port number for sending emails using SSIS?
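As far as I know, the stock SMTP connection manager (through at least SSIS 2008) exposes no port property, and a server name of the form smtpserver:1234 is not parsed as server plus port. If Database Mail is an option, its account setup does accept a port; a minimal sketch, with hypothetical account and address names:

    EXEC msdb.dbo.sysmail_add_account_sp
        @account_name    = N'SSIS alerts',         -- hypothetical name
        @email_address   = N'alerts@example.com',  -- hypothetical address
        @mailserver_name = N'smtpserver',
        @port            = 1234;                   -- the relocated SMTP port

You would still need a profile (sysmail_add_profile_sp / sysmail_add_profileaccount_sp) and an Execute SQL Task calling sp_send_dbmail in place of the Send Mail task. The other common workaround is a Script Task using System.Net.Mail, which also accepts a port.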
I am trying to figure out how to override the column delimiter for a flat file connection from a script task.
Before outputting the data into a file, I have a variable which contains the delimiter to be used... but as it seems that this property cannot be changed through an expression, I assume it has to be done through code...
However, the connection manager object doesn't seem to have the delimiter as an available property.
Dts.Connections.Item("myADO.NET connection").AcquireConnection(Nothing)
Dim conn As New SqlClient.SqlConnection( _
    Dts.Connections.Item("myADO.NET connection").ConnectionString)
conn.Open()
This seems silly, in that I'm not really using the same connection, but rather the connection string of a connection that already exists. And, for my purposes, it's not working currently, because I've switched from Windows Authentication to SQL auth... and the password isn't coming over in the ConnectionString property. How do I re-use the exact same ADO.NET connection I have in my connection manager in a script task? That's the recommended way to go, right?
Hello. I have received the following error upon an attempt to browse the cube. All other tabs are functional, including the Calculations tab. We are running Windows Server 2003 SP2 and SQL Server 2005 SP2. Any suggestions would be greatly appreciated!
**EDIT** - Have confirmed SP1 for VS2005 is installed, both locally and on the server.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (Microsoft Visual Studio)
------------------------------ Program Location:
at Microsoft.Office.Interop.Owc11.PivotView.get_FieldSets()
at Microsoft.AnalysisServices.Controls.PivotTableFontAdjustor.TransformFonts(Font font)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdatePivotTable(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.UpdateAll(Boolean translate)
at Microsoft.AnalysisServices.Browse.CubeBrowser.InitialUpdate()
at Microsoft.AnalysisServices.Browse.CubeBrowser.SupportFunctionWhichCanFail(FunctionWhichCanFail function)
I've been researching AWE to determine if we should enable this for our environment.
Currently we have a quad-core box with 4 GB of RAM (VMware). OS: Windows 2003 Std, SQL Server 2005 Std. The /3GB switch is not set, but will be as soon as we can perform maintenance on the server.
I have read mixed feedback on AWE: either it works great or it grinds you to a halt. I would assume that the grinding to a halt is due to not setting the min/max values correctly or not enabling the Lock Pages in Memory setting.
We only have one instance of SQL on the server and this box won't be used for anything else aside from hosting SQL services. We do plan on running SSRS off of this server as well.
1. Will running SSRS and enabling AWE cause me problems? Will I have to reduce the max setting by the SSRS memory usage, or will it share and play nice?
2. How do I go about setting the max value? Should it be less than the physical RAM in the box? Right now it's set to the default of 2147483647; even if I don't enable AWE, should this default value be changed?
3. It seems that even at idle the SQL server holds a lot of memory and the page file grows. If I restart the process in the morning, memory usage in Task Manager is at 600 MB or so. By the end of the day, it's up around 2 GB. How can I track down what's causing this, and should this even concern me?
4. The Lock Pages in Memory setting worries me. Everything I've read on this seems to give a warning about serious degradation of the OS and other programs, in some cases to the point where they have to restore the settings on the server before they can bring it back up. What are your thoughts on this?
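On question 2, a sketch of the usual configuration, assuming you leave roughly 1 GB for the OS plus whatever SSRS needs on a 4 GB box (the 3072 figure is an assumption to illustrate, not a recommendation):

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'awe enabled', 1;           -- takes effect after a service restart
    EXEC sp_configure 'max server memory', 3072;  -- MB; headroom left for OS + SSRS
    RECONFIGURE;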
I have a Windows Server 2012 box with SQL Server 2012 Enterprise. RAM size is 22 GB. Sometimes SQL Server takes 95% of the memory. My question: how can I reduce the memory usage without killing any process, since it's a production server and there are many background processes running? Also, are there any guides for learning why memory rises so high and how to reduce it?
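A starting point for the "why": the buffer pool caching data pages up to its cap is by design, not a leak. A sketch against the SQL Server 2012 memory-clerk DMV shows who holds the memory, and capping the engine requires no restart and kills no sessions (the 16384 figure is an assumption; size it for your workload):

    -- Top memory consumers; MEMORYCLERK_SQLBUFFERPOOL on top usually
    -- just means normal data caching.
    SELECT TOP (5) type, SUM(pages_kb) / 1024 AS mb
    FROM sys.dm_os_memory_clerks
    GROUP BY type
    ORDER BY mb DESC;

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory', 16384;  -- MB
    RECONFIGURE;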
OK. I give up and need help. Hopefully it's something minor ...
I have a dataflow which returns email addresses to a recordset.
I pass this recordset into a ForEachLoop configuring the enumerator as (Foreach ADO Enumerator). I also map the email address as a variable with index 0.
I then have a Execute SQL task which receives this email address as a varchar variable (parameter 0) which I then use in my SQL command to limit the rows returned. I have commented out the where clause and returned all rows regardless of email address to try to troubleshoot this problem. In either event, I then use a resultset to store the query result of type object and result name 0.
I then pass this resultset into a script variable to start parsing the sql rows returned as type object. ( I assume this is the correct way to do this from other prior posts ...).
The script appears to throw an exception at the following line. I assume it's because I'm either not passing in the values properly or the query doesn't return anything. However, I am certain the query works as it executes just fine at the command prompt.
My intent is to email the query results to each email address, with the following type of data, by passing the parsed data from the script to a Send Mail task. Email works fine and sends out messages, but the content is empty. I pass the parsed data as string values to the MessageSource and define the MessageSourceType as a variable in the mail task.
part number    leadtime
x              5
y              9
....
Does anyone have any idea what I might be doing wrong?
Hello, I understand that we should use SSMS -> Server Properties -> Memory to put a cap on SQL Server's memory usage, so that it leaves some memory for the OS; this is based on the fact that if max memory is not specified, SQL Server will use whatever memory is available and eventually crash the system.
My question is: when a server has the SSIS and SSAS services installed along with the SQL Server service, does the max memory setting cover the SSIS and SSAS memory usage, or do SSIS and SSAS have to share the remaining memory with the OS?
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I see that one particular assembly is increasing on every loop: every time it hits the script component, VBAssembly increases by 6, and along with it the memory climbs. What is this VBAssembly being used for? Is there an update to SQL Server Integration Services that I need?
SQL Server 2000 is running on Windows Server 2003 with 4 GB of memory on the server. The OS was allocated 2.3 GB and SQL Server was allocated (and using all of) 1.6 GB, for a total of approximately 4 GB based on the Idera monitoring software: all memory was allocated between the OS and SQL Server. Then 4 more GB of memory were added, for a total now of 8 GB. Now the Idera monitor shows 1.7 GB for the OS and 1.0 GB for SQL Server. The 'System' info shows 8 GB of memory with PAE, so I assume that the full 8 GB can now be addressed. Why are fewer resources being used now, with more total memory, especially by SQL Server? I thought about specifying a minimum memory for SQL Server, but I am not convinced that would even work, since it seems that this 1 GB limit is artificial. If it used 1.6 GB before, why would it not use at least that much now?
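One thing worth checking: PAE only lets Windows see the extra RAM. SQL Server 2000 still needs AWE (Enterprise Edition only) to map anything beyond its own roughly 2 GB virtual address space, which would explain an apparent cap near the old figure. A quick check, assuming you can run sp_configure on the instance:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'awe enabled';  -- run_value 0 means the instance stays below ~2 GB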
I have a database with a memory-optimized filegroup on it. How can I remove it? I have removed the memory-optimized table I had on it, but when I try to remove the filegroup I receive an error.
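For what it's worth, a sketch of the usual attempt, with hypothetical file and filegroup names; as far as I know the memory-optimized filegroup itself cannot be dropped once created (through at least SQL Server 2016), and the documented way out is to recreate the database without it:

    ALTER DATABASE MyDb REMOVE FILE imoltp_container;  -- may succeed once the filegroup is empty (version-dependent)
    ALTER DATABASE MyDb REMOVE FILEGROUP imoltp_fg;    -- this is the step that errors by design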
A user was created with limited privileges under the USERS group. Once this user logged in to Report Manager, he was acting like an Admin and Content Manager, though he had not been given even a Browser role.
Why do you think this guy is acting like a super user, even though he is restricted to a Browser role in Report Manager?
On one of our machines, all of the SQL Server 2000 components except for the main Server component (SQL Server core) itself were installed (management tools, etc.) a while ago, and everything was running fine. Now I go and add/install the Server component and then Service Pack 3a. It seems that Service Manager won't start up (I get an hourglass cursor), and now I find that Enterprise Manager won't run as well. No error messages appeared, and I don't think I saw anything unusual in the log file. However, I can use Enterprise Manager on a different machine and connect to the database (so the database itself seems to be running). Any suggestions as to what the problem might be and how to fix it? I'd like to see if I can repair this without having to do a reinstall. Thanks. PF
If someone can assist me: every time I load up Enterprise Manager, the Service Manager turns off, and Enterprise Manager can't connect to the local database. But every time I turn it back on and try to connect again, it shuts it off, and around and around we go. Help would be appreciated. Thanks.