SQL Server Admin 2014 :: Analysis Services Not Taking Allocated Memory?
Mar 14, 2014
We have run into an issue on a dedicated SSAS 2012 SP1 server where the allocated memory is not being utilized, causing some slowness in use, connections, and queries.
Total memory on the server is 512GB, and after startup the utilized memory climbs to a maximum of about 60GB and stops there. Checking Resource Monitor, msmdsrv.exe is only taking around 39GB overall. With the current properties, it should be able to reach roughly 330GB. Am I missing something in the settings or configuration that should be changed?
Version: SQL Server 2012 SP1 Enterprise (11.0.3000)
OS: Windows Server 2012 Datacenter - Fully patched and up to date
Databases: 2 Tabular models
Server: 512GB RAM
Current memory configuration:
HardMemoryLimit - 0 (Default)
LowMemoryLimit - 65% (Default)
TotalMemoryLimit - 95% (Default is 80)
VertiPaqMemoryLimit - 60% (Default)
VertiPaqPagingPolicy - 1 (Default)
MemoryHeapType - 2 (Default)
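For reference, SSAS treats memory-limit values of 100 or less as percentages of physical RAM, so on this box LowMemoryLimit 65 is roughly 333GB and TotalMemoryLimit 95 is roughly 486GB; these are thresholds the engine grows toward on demand, not allocations it reserves at startup, so usage stays low until the models or queries actually need the memory. A sketch for checking what the instance is actually holding, using a documented SSAS DMV:
-- Run from an MDX/DMX query window in SSMS, connected to the Tabular instance.
-- Shrinkable vs. non-shrinkable memory per object, reported in bytes.
SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE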
View 2 Replies
Jul 24, 2014
Is it possible to have Analysis Services in both modes, or are they mutually exclusive? I have a machine set up with a Multidimensional AS instance and would like to know if it's possible to add another instance in Tabular mode.
View 2 Replies
Mar 5, 2014
My database server's memory utilization has been growing faster over the past week. It held steady at around 55% for a week, and is now at 70% and still increasing.
Total OS memory is 32GB, and I have capped SQL Server memory at 29GB. I'm not sure what to do.
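For reference, the cap is set with sp_configure; a minimal sketch (the 29GB value follows the post, and the DMV assumes SQL Server 2008 or later):
-- Check and set the SQL Server memory cap (values are in MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 29696;   -- ~29GB, as described above
RECONFIGURE;
-- Physical memory the SQL Server process is actually using right now:
SELECT physical_memory_in_use_kb / 1024 AS physical_memory_in_use_mb
FROM sys.dm_os_process_memory;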
View 9 Replies
Nov 11, 2014
Is there a method of forcing existing tables into the in-memory filegroup so the table data can benefit from in-memory processing?
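There is no ALTER TABLE path for this in SQL Server 2014; an existing disk-based table has to be recreated as memory-optimized and the data copied across (this assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup, as the question implies). A sketch with illustrative table and column names:
-- Recreate the table as memory-optimized, then copy the data across.
CREATE TABLE dbo.OrdersInMem
(
    OrderId   int           NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    OrderDate datetime2     NOT NULL,
    Amount    decimal(18,2) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

INSERT INTO dbo.OrdersInMem (OrderId, OrderDate, Amount)
SELECT OrderId, OrderDate, Amount
FROM dbo.Orders;     -- the original disk-based table

-- Afterwards, drop or rename the old table and swap names if desired:
-- EXEC sp_rename 'dbo.Orders', 'Orders_Old'; EXEC sp_rename 'dbo.OrdersInMem', 'Orders';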
View 7 Replies
Oct 4, 2015
I want to create a lot of indexes on my database for performance.
But first I need to find the memory usage by indexes.
How do I find memory usage by index in SQL Server?
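Index pages live in the buffer pool alongside table data, so one way to see how much memory each index currently holds is to join sys.dm_os_buffer_descriptors to the allocation metadata. A sketch for the current database; it is a point-in-time view of the buffer pool, not a fixed memory cost of the index:
-- Buffer-pool pages held per index in the current database.
SELECT
    OBJECT_NAME(p.object_id)  AS table_name,
    i.name                    AS index_name,
    COUNT(*) * 8.0 / 1024     AS buffer_mb     -- 8KB pages currently cached
FROM sys.dm_os_buffer_descriptors AS bd
JOIN sys.allocation_units AS au ON au.allocation_unit_id = bd.allocation_unit_id
JOIN sys.partitions AS p ON p.hobt_id = au.container_id AND au.type IN (1, 3)  -- in-row / row-overflow
JOIN sys.indexes AS i ON i.object_id = p.object_id AND i.index_id = p.index_id
WHERE bd.database_id = DB_ID()
GROUP BY p.object_id, i.name
ORDER BY buffer_mb DESC;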
View 1 Replies
Nov 26, 2014
We are planning a 2014 migration in a few days.
ServerA -----> ServerA1
ServerB -----> ServerB1
On ServerA we have 5 databases, and we are making those 5 databases an availability group. The secondary replica is ServerA1.
On ServerB we have 3 databases, and we are making those 3 databases an availability group. The secondary replica is ServerB1.
What is the best option to configure the quorum drive in this situation?
We also use ServerA1 and ServerB1 for reporting purposes.
We have some sensitive data. Is it possible to delete the data while it is being read?
How does the memory optimization feature work with AlwaysOn?
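On the reporting point: ServerA1 and ServerB1 can be marked as readable secondaries, which allows report queries there while the replicas stay strictly read-only (data cannot be deleted on the reporting copies; changes only happen on the primary and flow through). Durable memory-optimized tables are supported in 2014 availability groups and replicate like other tables. A sketch of the readable-secondary setting; the availability group names AG_ServerA and AG_ServerB are placeholders for your actual names:
-- Allow read-only connections against the reporting replicas.
ALTER AVAILABILITY GROUP [AG_ServerA]
MODIFY REPLICA ON N'ServerA1'
WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));

ALTER AVAILABILITY GROUP [AG_ServerB]
MODIFY REPLICA ON N'ServerB1'
WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));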
View 8 Replies
Jun 11, 2015
How do I find the total allocated space and used space of a memory-optimized filegroup?
use memory_optimized_db
Go
-- Size (MB) per filegroup/type from the database file catalog.
select (SUM(size) * 8.0) / 1024.0 as Space,
       FILEGROUP_NAME(data_space_id), type_desc
from sys.database_files
group by data_space_id, type_desc;
The query above gives the current used size of the memory-optimized filegroup's containers, but it doesn't give the total space.
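For an allocated-versus-used breakdown, the checkpoint-file metadata can be summed per database; a hedged sketch using the column names as they appear in SQL Server 2014 (verify them on your build):
USE memory_optimized_db;
GO
-- Total allocated vs. used space across the memory-optimized checkpoint files.
SELECT
    SUM(file_size_in_bytes)      / 1024.0 / 1024.0 AS allocated_mb,
    SUM(file_size_used_in_bytes) / 1024.0 / 1024.0 AS used_mb
FROM sys.dm_db_xtp_checkpoint_files;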
View 0 Replies
Sep 21, 2015
I'm working on a large scale project that is currently in production. We have a big process that recently changed to use In-Memory Tables with SQL 2014 for performance efficiency.
The Process uses:
51 In-Memory SQL Tables.
50 stored procedures (not native) that load data (INSERT) from about 150 regular tables and IM tables.
300 validations (short, non-native stored procedures) selecting from those 50 In-Memory tables (and inserting into an In-Memory table that saves any validation errors).
At the end of the process we delete each run's data from the tables (DELETE FROM ... WHERE).
By the way:
No UPDATE STATISTICS is used on the In-Memory tables; when we tested it, it slowed the process down and caused some locks.
We are calling this process from ADO.Net, loading the stored procedures first and then the validations; each SP uses a different SQL connection. In normal use everything works fine and takes about 1.5 seconds.
Under a stress test (6 clients x 100 tasks) for 30 minutes, after several minutes we start getting this SQL exception (1 SQL exception for every 20 tasks):
41301. A previous transaction that the current transaction took a dependency on has aborted, and the current transaction can no longer commit.
Transactions in Memory-Optimized Tables
The Exception is not clear. We are not using BEGIN TRANSACTION in the process. The SQL Exception occurs in different stored procedures each time.
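Error 41301 is one of the documented, retryable conflict/dependency failures for memory-optimized tables under concurrency; even without an explicit BEGIN TRANSACTION, each statement runs in its own transaction and can take dependencies on others. The usual mitigation is retry logic, either in ADO.Net around the SqlException or in a T-SQL wrapper. A minimal sketch (the inner procedure name and parameter are hypothetical):
-- Retry wrapper around a validation procedure, for transient In-Memory OLTP conflict errors.
DECLARE @retry int = 5;
WHILE @retry > 0
BEGIN
    BEGIN TRY
        EXEC dbo.usp_RunValidation @BatchId = 42;   -- hypothetical inner procedure
        SET @retry = 0;                             -- success, stop retrying
    END TRY
    BEGIN CATCH
        -- 41301/41302/41305/41325 are retryable conflict errors for memory-optimized tables.
        IF ERROR_NUMBER() IN (41301, 41302, 41305, 41325) AND @retry > 1
        BEGIN
            SET @retry -= 1;
            WAITFOR DELAY '00:00:00.050';           -- brief back-off before retrying
        END
        ELSE
            THROW;                                  -- not retryable, or out of retries
    END CATCH
END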
View 2 Replies
Oct 2, 2014
I have the following setup:
- An MSSQL 2014 Standard server that houses multiple small databases (in excess of a hundred).
- These databases are frequently dropped and restored by an application that uses this SQL Server.
- There is a business need for this setup at this time, so I can't get away from it. Therefore, answers like "don't have so many small databases that are frequently dropped and restored" would not be helpful.
This is the problem I have:
- When I connect SSMS 2014 to the server and expand the "Databases" node, it takes forever to display. In comparison, SSMS 2008 connected to SQL 2008R2 server with the same number of databases displays the Databases tree very quickly.
I ran a trace to see what exactly SSMS 2014 is doing. When the "Databases" node is expanded, it runs a query that checks each database for memory-optimized tables (a new and wonderful feature of SQL 2014 for sure, but I'm not using it, at least yet). Naturally, when you have to loop through over a hundred DBs, it takes time. Worse yet, if one of these DBs is in the process of being restored, the query sits and waits to time out before proceeding to the next DB. Sometimes this causes outright timeouts. Here is the query:
use [MyDatabase]
SELECT
ISNULL((select top 1 1 from sys.filegroups FG where FG.[type] = 'FX'), 0) AS [HasMemoryOptimizedObjects]
To be sure, this is NOT a SQL Server performance issue. This server processes a rather heavy workload and has been doing so for over a month, and the workload completes within expected time limits or better. Even so I've done some basic performance measuring, and the server itself is quite all right.
Moreover, if I connect SSMS 2008 to it, I get an error message (index out of bounds or something similar), but SSMS 2008 does connect and displays the Databases tree much faster than SSMS 2014.
I'd like to turn off the option to check for Memory Optimized Objects altogether, as I'm not using the feature.
View 3 Replies
Aug 26, 2013
I am trying to load data into a memory-optimized table (INSERT INTO ... SELECT ... FROM ...). The source table is about 1GB and 13 million rows. During this load the LDF file grows to 350GB (until the disk runs out of space). The server has about 110GB of memory reserved for SQL Server. Tempdb doesn't grow. The bucket count in the CREATE statement is 262144. The hash key has 4 fields (2 fields have datatype int, 1 smallint, 1 varchar(200)). The disk for the data files still has space (including the Hekaton files).
How can I reduce the size of the LDF file during the data load?
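The log grows because the whole INSERT ... SELECT is one logged transaction, and inserts into durable memory-optimized tables are always fully logged (minimal logging does not apply). Splitting the load into batches lets the log clear between chunks, provided the database is in SIMPLE recovery or log backups run during the load. A hedged sketch; the table names and the integer key Id are illustrative:
-- Load the 13M rows in batches of ~500k so the log can truncate between chunks.
DECLARE @batch int = 500000,
        @minId int, @maxId int, @fromId int;

SELECT @minId = MIN(Id), @maxId = MAX(Id) FROM dbo.SourceTable;
SET @fromId = @minId;

WHILE @fromId <= @maxId
BEGIN
    INSERT INTO dbo.MemOptTable (Id, Col1, Col2)
    SELECT Id, Col1, Col2
    FROM dbo.SourceTable
    WHERE Id >= @fromId AND Id < @fromId + @batch;

    SET @fromId += @batch;

    -- In FULL recovery, back up the log between batches instead:
    -- BACKUP LOG MyDb TO DISK = 'X:\backup\MyDb_load.trn';
END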
View 9 Replies
Apr 24, 2014
SQL Server In Memory OLTP 2014?
View 6 Replies
Oct 9, 2015
I have the following SQL FCI configuration:
NODE1 -256GB
INST1 - 64GB min/64GB max
INST2 - 64GB min/64GB max
NODE2 - 256GB
INST3- 64GB min/64GB max
INST4- 64GB min/64GB max
With this configuration, if all instances are running on the same node, there will be enough memory for them to run. Knowing that normally I'll have only 2 instances on each node, wouldn't the following configuration be better?
NODE1 -256GB
INST1 - 64GB min/128GB max
INST2 - 64GB min/128GB max
NODE2 - 256GB
INST3- 64GB min/128GB max
INST4- 64GB min/128GB max
With this configuration, in case all the instances start running on only one node (due to a failure), will SQL Server adjust all instances to just use the min memory specified?
View 6 Replies
Aug 1, 2007
One of our servers has 20GB of memory and SQL Server has been allocated 18.5GB of it; however, SQL Server only uses about 9GB or so.
OS: Windows 2003 Server
MS SQL Server 2000 SP4
awe enabled
min & max server memory (run values) are: 18432
/PAE switch in boot.ini
Any help on why SQL Server is not using the 18.5GB allocated to it would be greatly appreciated.
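On 32-bit SQL Server 2000, AWE memory is only used if the option is actually active at run time and the service account holds the Lock Pages in Memory privilege; otherwise the instance falls back to the normal 2-3GB address space plus whatever it can allocate dynamically. A quick check, as a sketch valid on 2000 SP4:
-- config_value vs. run_value shows whether each setting is actually in effect.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'awe enabled';
EXEC sp_configure 'max server memory (MB)';
-- With AWE, Task Manager under-reports SQL Server memory; check the
-- 'SQLServer:Memory Manager - Total Server Memory (KB)' perfmon counter instead.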
View 8 Replies
Dec 8, 2014
I have a SQL server box running 2014 reporting services. I have another server running IIS v8.
I would like to be able to connect to the IIS site and be given the SSRS report browser.
So externally if I browse to [URL], I am presented with the report server interface, the same as if I browse to http://xxx.xxx.xxx.xxx/reports internally.
What are my options?
View 4 Replies
Sep 8, 2014
I've got Reporting Services on a different box from the database and I can see all the reports, but when I try to set up a subscription, I get this weird error:
The SQL Agent service is not running. This operation requires the SQL Agent service. (rsSchedulerNotResponding)
The same error happens when I connect to the database server via management studio and try to run a job.
I can confirm that SQL Agent service is running.
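One way to confirm what SQL Server itself sees (as opposed to the services console) is the services DMV; rsSchedulerNotResponding usually means the instance hosting the ReportServer database cannot see a running Agent, so it is worth running this on that exact instance. A sketch, assuming SQL Server 2008 R2 SP1 or later where the DMV exists:
-- Service status as seen from inside the database engine.
SELECT servicename, status_desc, startup_type_desc, service_account, last_startup_time
FROM sys.dm_server_services;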
View 1 Replies
Oct 6, 2014
I've migrated a server with SQL Server 2008 R2 and Reporting Services to a new box with SQL Server 2014, but forgot to change the timezone to the correct one. I changed it later, but it seems the reports are still running on the old default timezone. The schedule says the report should run at 6:30 AM, but the Last Run column shows 8:30 PM.
I need to fix it without manually updating each subscription with some date/time conversion.
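As a diagnostic starting point, the subscription schedules live in the ReportServer catalog, so they can be inspected to see which stored times still reflect the old timezone. These tables are undocumented and can differ between builds, so treat this as a hedged sketch for investigation rather than a supported fix:
USE ReportServer;
GO
-- Compare stored schedule times with each subscription's last run time.
SELECT c.Name   AS report_name,
       sch.Name AS schedule_name,
       sch.NextRunTime,
       sub.LastRunTime
FROM dbo.Subscriptions  AS sub
JOIN dbo.ReportSchedule AS rs  ON rs.SubscriptionID = sub.SubscriptionID
JOIN dbo.Schedule       AS sch ON sch.ScheduleID    = rs.ScheduleID
JOIN dbo.Catalog        AS c   ON c.ItemID          = sub.Report_OID
ORDER BY sch.NextRunTime;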
View 1 Replies
Apr 30, 2015
I am searching for a query to find the total memory allocated to SQL Server, how much of it is being utilized, and the CPU utilization percentage.
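A hedged sketch covering the memory side (SQL Server 2008 and later); SQL Server's own CPU history is commonly read from the scheduler-monitor ring buffer, shown here in simplified form:
-- Memory the process is using, and the buffer pool's committed total.
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

SELECT cntr_value / 1024 AS total_server_memory_mb
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Total Server Memory (KB)';

-- Most recent CPU sample recorded by SQL Server.
SELECT TOP (1)
       record.value('(./Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS sql_cpu_pct,
       record.value('(./Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int')         AS system_idle_pct
FROM (
    SELECT CONVERT(xml, record) AS record
    FROM sys.dm_os_ring_buffers
    WHERE ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
      AND record LIKE N'%<SystemHealth>%'
) AS rb
ORDER BY record.value('(./Record/@time)[1]', 'bigint') DESC;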
View 10 Replies
Jul 19, 2015
When I try to install SQL Server 2014 on my machine, all the features install except Reporting Services - Native, and an error occurs as shown below.
Error details:
Error installing SQL Server Reporting Services
Updating permission setting for folder 'C:\Windows\Temp' failed. The folder permission settings were supposed to be set to 'DA;OICI;0x1200af;;;S-1-5-80-4063824523-3130906261-2263067808-2545249320-213050741)'.
Error code: 0x84CF0003
View 8 Replies
Apr 14, 2015
I need to migrate Reports from a SQL Server 2008 R2 Server to SQL Server 2014 Reporting Services.
Reporting Services for the existing instance is installed on one server, and Reporting Services for the new instance is installed on another server.
On the existing Server the Encryption was never backed up.
What do I do with the key?
The following article provides some good information but it does not identify what to do with the Destination URL.
It mentions to download the files but it does not specify how to identify the path and it does not state where to upload them.
Do you restore the existing Reporting Services Databases to the new Server?
[URL] .....
View 2 Replies
Feb 11, 2015
I am running into a weird issue with a new SQL Reporting Services 2014 server I built. I installed SQL Reporting 2014 on Windows Server 2012 R2 and configured Kerberos, but the site is extremely slow. After some reconfiguration and log captures I have determined the issue has to do with the Kerberos setup, however I am running a similar configuration with SQL Reporting Services 2008 on Windows Server 2008 R2 and do not run into the same errors.
The error I see while using Wireshark is KRB Error: KRB5KDC_ERR_BADOPTION NT Status: STATUS_NO_MATCH. When I drill down into the error I can see the Kerberos string is testprjmnmtreports14.company.com, which is the URL we are using to access the site. I made sure to add that name as an SPN for the service account that is running SQL Reporting Services, however I still receive the error.
Then I tried configuring the site to run without a hostheader, so I accessed the site with the server name, ECTSTSQLRS5, and the site works perfectly fine, no errors are reported either. So it seems I have isolated the issue down to Kerberos but I am not sure how to resolve it. Here is some more information about my environment:
DNS/URL used: testprjmnmtreports14.company.com
Server Name (FQDN): ECTSTSQLRS5.company.int
AD Domain Name: company.int
Server Version: Windows Server 2012 R2
AD Functional Level: 2008 R2
As you can see I am trying to use a .com address but my AD domain is .int which I think is the issue, but I do not have the same problem on my other server that is running Windows Server 2008 R2. What do I need to do to allow my new site on 2012 R2 to work with this DNS Alias?
View 0 Replies
Aug 27, 2015
I want to set up a database role so that users can use sp_readerrorlog through SSMS. It does a check on membership in the securityadmin role.
I have tested it and can see you can grant execute on xp_readerrorlog but the SSMS GUI uses sp_readerrorlog.
I thought I could create a user/certificate and add the signature to sp_readerrorlog but it's not permitted (likely because it's not a normal database object).
So the other solution is to add the users to the securityadmin role but then explicitly deny ALTER ANY LOGIN (best done with a custom server role in 2012+, but otherwise just manually in 2008). I tested this out and it works: I'm not able to alter any logins or increase my own permissions. I also did a check of what's reported from fn_my_permissions(null, null) and it shows minimal permissions, like I'd expect.
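For reference, a sketch of the 2012+ custom-server-role approach described above; the role and login names are illustrative only:
-- Custom server role: members can read the error log via securityadmin, but cannot manage logins.
CREATE SERVER ROLE ErrorLogReaders;
ALTER SERVER ROLE securityadmin ADD MEMBER ErrorLogReaders;
DENY ALTER ANY LOGIN TO ErrorLogReaders;

-- Add a login to the restricted role.
ALTER SERVER ROLE ErrorLogReaders ADD MEMBER [DOMAIN\HelpdeskUser];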
View 0 Replies
May 17, 2007
I recently changed the max. memory option in SQL from 24 GB to 30GB but the perfmon counters still only show 24 GB. Any ideas on why it is not recognizing the change? The server has Win 2003 EE and 32 GB of RAM.
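Worth confirming the new value was actually applied at run time (RECONFIGURE) and not just stored; a quick sketch:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- If config_value shows 30720 but run_value still shows 24576, RECONFIGURE has not been run.
EXEC sp_configure 'max server memory (MB)';
RECONFIGURE;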
View 5 Replies
Feb 3, 2006
I am running SQL Server 2000 on a desktop PC. Just recently upgraded my PC to 2gig of ram from 1 gig, in part, to try and fix the problem below. Didn't work. Have SP3 installed but not SP4 at this point.
When I open up Query Analyzer and edit some code, regardless of whether I actually execute the code, SQL Server eventually sucks up available and cached memory to the point that my system comes to an effective halt -- it takes forever to do anything, either in SQL Server or other applications. In Task Manager, PF Usage climbs to just over the 2GB memory limit.
I conceptually understand the dynamic memory operation of SQL server ... but why is it sucking up most available memory when nothing is executing?
Is there a way I can release/clear the memory? Ideally, code that could be run from a stored procedure would best meet my needs. Right now I am "fixing" it by shutting down and then restarting SQL Server.
Thanks,
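Two things that work on SQL Server 2000 from a query window or a stored procedure: cap how much memory the engine is allowed to take, and flush the caches without a restart. A sketch; the 1200MB cap is only an example value:
-- Cap SQL Server so it leaves memory for the desktop (value in MB, example only).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 1200;
RECONFIGURE;

-- Flush cached plans and data pages without restarting the service.
DBCC FREEPROCCACHE;
DBCC DROPCLEANBUFFERS;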
View 8 Replies
Jul 5, 2006
On a given Analysis Server the machine level OLAP Administrators group controls which users have admin access to AS Databases and Cubes on that machine. From everything I have read, if you are in the OLAP Administrators group you have full access to administer ALL the databases and cubes.
We have a need to create an OLAP database and grant a few users (a role) full admin access to create and maintain data sources and cubes within that database, but NOT allow them admin access to the other existing databases and cubes on the server. This seems like such a common requirement. Has anyone else encountered and resolved this issue?
Thanks,
Tony
View 2 Replies
Oct 12, 2007
Hi,
I have a problem with Microsoft Analysis Services. We have already configured the boot.ini and the registry to use 3GB of RAM, but 2 or 3 times a day we have to restart the services to clear the memory; as you can see, that's not acceptable. The application that uses the OLAP is OutlookSoft. Here is the scenario:
OS: Windows Server 2003 Enterprise Edition with SP1
RAM: 4GB
Analysis Services 2000 with SP4
I hope you can help me with any idea.
Thanks
View 1 Replies
Oct 9, 2013
I'm trying to move specific data from three linked tables on a source database to three tables on a destination database by generating dynamic SQL INSERT scripts but am getting stuck on the last set of INSERT statements. I don't think I can use SSIS because the source autonumber fields may already exist on the destination side but I could be wrong.
For simplicity, Table 1 (T1) has one autonumber key of PK1 as well as other fields (A1, B1, etc.).
Table 2 (T2) has one autonumber key of PK2 and a foreign key (FK1) that links back to PK1 as well as other fields (A2, B2, etc.).
Table 3 (T3) has one autonumber key of PK3 and a foreign key (FK2) that links back to PK2 as well as other fields (A3, B3, etc.).
Like this:
T1: PK1, A1, B1, etc.
T2: PK2, FK1, A2, B2, etc.
T3: PK3, FK2, A3, B3, etc.
So, I'm able to query the source database T1 to generate my dynamic SQL INSERT statements that will be run on the destination side. I'm also able to generate my dynamic SQL statements to insert into T2, but when I get to T3 I am stuck figuring out how to insert, because the destination side is unable to match its FK2 to the just-inserted PK2. It throws the below error.
Subquery returned more than 1 value. This is not permitted when the subquery follows =, !=, <, <= , >, >= or when the subquery is used as an expression.
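One way around this on the destination side is to capture an old-key-to-new-key map as the rows are inserted. MERGE ... OUTPUT can reference source columns (unlike INSERT ... OUTPUT), which makes it a common trick for re-parenting child rows when identity values are regenerated. A hedged sketch using the T1/T2 names from the post; SourceDB is a placeholder for however the source rows arrive at the destination:
-- Map of source PK1 values to the newly generated destination PK1 values.
DECLARE @KeyMapT1 TABLE (OldPK1 int NOT NULL, NewPK1 int NOT NULL);

MERGE INTO dbo.T1 AS tgt
USING SourceDB.dbo.T1 AS src
    ON 1 = 0                                   -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (A1, B1) VALUES (src.A1, src.B1)
OUTPUT src.PK1, inserted.PK1 INTO @KeyMapT1 (OldPK1, NewPK1);

-- Child rows are remapped through the key map instead of relying on the old FK values.
INSERT INTO dbo.T2 (FK1, A2, B2)
SELECT m.NewPK1, s.A2, s.B2
FROM SourceDB.dbo.T2 AS s
JOIN @KeyMapT1 AS m ON m.OldPK1 = s.FK1;

-- Repeat the same pattern (map old PK2 to new PK2) to remap FK2 when loading T3.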
View 3 Replies
Apr 10, 2006
I have a custom .NET application that uses SQL Server 2000. All users are complaining about performance issues and white-outs while they are using the application. I am almost certain that the SQL Server is the culprit. All the other components involved in the application hardly have any CPU or memory usage when I check performance in Task Manager.
On the SQL Server, I see that the sqlservr.exe process is taking about 2.8GB of memory. Is there a way to find out which exact SQL query or process is taking so much memory? I believe there may be a bad SQL process that is stuck and taking all the memory. Is there a way to find out?
Thanks
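As an aside, a large sqlservr.exe working set is often just the buffer pool doing its job ('max server memory' caps it); on SQL Server 2000 the per-session view is master..sysprocesses, and DBCC MEMORYSTATUS breaks the total down by component. A sketch:
-- Breakdown of SQL Server memory by component (buffer pool, procedure cache, etc.).
DBCC MEMORYSTATUS;

-- Sessions ranked by procedure-cache pages held (memusage is in 8KB pages).
SELECT TOP 10
       spid, loginame, hostname, program_name,
       memusage, cpu, physical_io, status, cmd
FROM master..sysprocesses
ORDER BY memusage DESC;

-- Then look at what a suspicious session is running:
DBCC INPUTBUFFER (53);   -- replace 53 with a spid from the query above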
View 3 Replies
Mar 28, 2006
I have created an SSIS package that reads 500 text files, splits them into 4 raw files, then reads those again and writes them to 4 different database tables.
The reason for this is that my raw files have multiple types of records in them, all in a single column. I split these out into the different record types and load whole rows into the database.
i.e. input 1 txt file:
<T6>
1:1000178
3:18148821-00
5:40204043
6:1
17:EX201036259NZ
25:0000304862
</T6>
<T1>
1:18148821-00
</T1>
<T5>
1:1511313
4:18126485-00
8:2006032510230300
17:EX201033399NZ
</T5>
<T6>
1:1511158
3:18084863-00
5:40617044
6:1
17:EX201033969NZ
25:0000302981
</T6>
These end up being rows in the T6 table:
1000178 18148821-00 40204043 1 EX201036259NZ 0000304862
1511158 18084863-00 40617044 1 EX201033969NZ 0000302981
The T5 table gets a new record:
1511313 18126485-00 2006032510230300 EX201033399NZ
and the T1 table gets a record:
18148821-00
Anyway, all this works fine, but I find that the DTExec process runs until it has used up all the memory on the laptop; in general it takes about 400MB to run this SSIS package. I'm wondering if I'm missing something, like not running in a transaction. I know in the old DTS you could commit on each package. Also, how do I turn all logging off, e.g. what you see in the DOS box (can I do this)? I would love some help on this, and if anyone wants a copy of this SSIS package (i.e. you're trying to do the same thing), I'm more than happy to email it.
View 5 Replies
Mar 5, 2015
I have an SSAS 2012 Tabular instance with SP2. There is a database on the instance with a read role with Everyone assigned permissions. When configuring the Power BI Analysis Services connector, at the point where you enter Friendly Name, Description, and Friendly Error Message, clicking Next returns the error "The remote server returned an error (403)." I've tested connecting to the database from Excel on a desktop and it connects fine. I don't use an "onmicrosoft" account, so I don't have that problem to deal with.
We use Power BI Pro with our Office 365. As far as I can tell that part is working OK, as I pass that stage of the configuration with a message saying connected to Power BI. The connector is installed on the same server as Tabular services; it's a Windows 2012 Standard server. The Tabular instance runs under a domain account that is the admin account for the instance (this is a dev environment), and that account is what I've used in the connector configuration. It's also a local admin account. There is no gateway installed on the server.
View 10 Replies
Apr 14, 2015
I inherited a lot of servers to upgrade to 2014, including an SSRS server.
The encryption key was never backed up, and it seems that no one knows what the password is.
Do I have to manually load the reports? There are a lot of reports.
[URL]
View 4 Replies
May 2, 2007
Hi,
Situation: I have a web application that is running on a server. IT has told me that the application I developed is taking up an enormous amount of RAM. It seems that the amount of RAM taken up is slowly increasing, about 10MB per hour (a memory leak). Furthermore, they have informed me that the RAM is being taken up by SQL. My senior developer told me that I probably have SqlConnections open that are not closed. I have checked through the application and it seems they are all closed properly, but I may have missed some, as the application is pretty big.
Questions:
1. Is there an application or piece of code that can monitor how many sql connections I have open during runtime?
2. Could there be any other common causes to such a kind of memory leak.
Any help with this matter would be very much appreciated.
Thank you for your time.
Sincerely,
Jeff
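On the SQL Server side, the open connections can be counted directly, grouped by application and host, which usually makes a connection leak obvious (the count for the web app keeps climbing over time). A sketch that works on SQL Server 2000 and later:
-- Open connections per application/host/login.
SELECT program_name, hostname, loginame, COUNT(*) AS connection_count
FROM master..sysprocesses
WHERE spid > 50                      -- skip system spids
GROUP BY program_name, hostname, loginame
ORDER BY connection_count DESC;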
View 7 Replies
Sep 26, 2013
I want to use BCP to load data from a text file.
By default, constraints are turned off in bcp, so I use the CHECK_CONSTRAINTS hint.
bcp aborts if ANY of the rows contains an FK violation. No data gets loaded.
So if I add the -b 1 batch size option, it loads all data UNTIL the first FK violation, but nothing after that.
I want to load EVERYTHING ... except for the violations. But bcp won't let me. Is there a way?
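bcp itself can't skip just the constraint-violating rows, but the usual workaround is to bulk load into a staging table with no constraints and then move over only the rows whose foreign keys resolve. A hedged sketch; the table and column names are illustrative:
-- 1) bcp into dbo.OrdersStaging (same shape as dbo.Orders, but with no FKs).
-- 2) Move over only the rows that satisfy the FK, and keep the rejects for review.
INSERT INTO dbo.Orders (OrderId, CustomerId, Amount)
SELECT s.OrderId, s.CustomerId, s.Amount
FROM dbo.OrdersStaging AS s
WHERE EXISTS (SELECT 1 FROM dbo.Customers AS c WHERE c.CustomerId = s.CustomerId);

SELECT s.*
INTO dbo.OrdersRejected
FROM dbo.OrdersStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers AS c WHERE c.CustomerId = s.CustomerId);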
View 2 Replies
Jan 13, 2014
If I install an instance with Windows-only authentication and then change it to Mixed Mode, when I enable the sa login the password has already been set. What is the default? Is the password generated, and if so, how secure is it? What algorithm is used for that?
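Whatever setup assigned, the password set during a Windows-only install isn't meant to be used; the practical step after switching to Mixed Mode is to set the sa password yourself and then enable the login. A minimal sketch:
-- Set a known strong password and enable sa (it stays disabled after a Windows-only install).
ALTER LOGIN sa WITH PASSWORD = N'UseAStrongPasswordHere!1';
ALTER LOGIN sa ENABLE;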
View 9 Replies