SSAS: Processing Cube Hangs

Aug 4, 2006

Hi,

I have a problem with processing my cube. My fact table (with telephone data) contains about 400,000 records... which is increasing rapidly (400,000 records is about 8 months of data)...
I have a few dimensions:
Dimension User: about 200 records
Dimension Line: about 200 records
Dimension Direction: 4 records
Dimension Date: 365 records for each year
Dimension TimeInterval: with 24 records

So far so good... when I process the cube with these dimensions I have no problem....
However, when I add a dimension (CalledNumber, with exactly 101 records), the processing hangs as soon as it starts...

The SQL performed when processing the cube looks like this:

SELECT field1, field2,... fieldn
FROM table1, table2,.... tablem
WHERE
(table1.id=table2.table1id)
AND
(table2.id=table3.table2id)
...


When I execute the above SQL in Query Analyzer (from SQL Server Enterprise Manager), it ALSO hangs...

I am not really surprised by that, because this SQL first creates a huge intermediate result of 400,000 x 200 x 200 x 4 x 365 x 24 x 101 rows and only then works through the WHERE clauses to filter out the appropriate records.

For me it would be more logical to use the following code to process the cube, but that cannot be changed in Analysis Manager:

SELECT field1, field2,... fieldn
FROM table1
LEFT JOIN table2 ON (table1.id=table2.table1id)
....
LEFT JOIN tablem ON (tablem-1.id = tablem.tablem-1id)


When I execute the above SQL in Query Analyzer (from SQL Server Enterprise Manager), it does NOT hang, but completes the query in about 35 seconds....
But Analysis Manager does not allow me to change the SQL used for processing the cube...

What can I do to add more dimensions to my cube... (there will be more anyway after adding the CalledNumber dimension)?
Any suggestions?

PS: I forgot to mention that I am using SQL Server 2000.
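
A possible workaround, since Analysis Manager will not let the processing SQL be edited directly, is to base the cube's fact table on a view that already performs the joins with explicit JOIN syntax and exposes the measures and dimension key columns as one flat rowset. A minimal sketch, with all table and column names hypothetical:

CREATE VIEW dbo.vwFactTelephone
AS
SELECT f.Duration,                         -- measure columns
       f.Cost,
       u.UserId,                           -- dimension key columns
       l.LineId,
       d.DirectionId,
       dt.DateId,
       ti.TimeIntervalId,
       cn.CalledNumberId
FROM dbo.FactTelephone AS f
     INNER JOIN dbo.DimUser         AS u  ON u.UserId          = f.UserId
     INNER JOIN dbo.DimLine         AS l  ON l.LineId          = f.LineId
     INNER JOIN dbo.DimDirection    AS d  ON d.DirectionId     = f.DirectionId
     INNER JOIN dbo.DimDate         AS dt ON dt.DateId         = f.DateId
     INNER JOIN dbo.DimTimeInterval AS ti ON ti.TimeIntervalId = f.TimeIntervalId
     INNER JOIN dbo.DimCalledNumber AS cn ON cn.CalledNumberId = f.CalledNumberId
GO

Once the member key columns are available directly on the fact source, the Optimize Schema option in the Cube Editor can usually drop the dimension-table joins from the generated processing query, which avoids the comma-style join plan entirely.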

View 1 Replies



Cube Processing Hangs When Rows > 35

May 21, 2008



Hi! I have a problem with cube processing. When I am processing a cube with 5 million rows it stops responding, and the process on the SQL Server is suspended with the wait type ASYNC_NETWORK_IO. The wait time changes both up and down.

However, if I change the view (the cube gets its data from a view that reads from ONE table) to only return up to 35 rows, it works. With 40 rows it goes into the suspended state again, and I can let it run for several hours without it finishing. Other cubes in the same database with more rows work fine.

If I just run the query that is used during processing in Management Studio, it works fine.

I have SP2 on both the AS and DB servers.

The DB server looks like this:

4 GB RAM
2 x 2.80 GHz dual-core Intel Xeon

The AS server:

16 GB RAM
2 x 3.00 GHz quad-core Intel Xeon X5450


Any ideas?

Edit:
The AS service runs at about 13% CPU overall; in Task Manager one processor is at 100%.
SQL Server Profiler shows no activity on the AS server.
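
ASYNC_NETWORK_IO usually means the client (here the Analysis Services processing connection) is not consuming the rows SQL Server is sending. One way to watch the hanging request from the relational side, as a minimal sketch assuming SQL Server 2005 or later:

SELECT r.session_id,
       r.status,
       r.wait_type,
       r.wait_time,
       r.command,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id > 50;   -- skip system sessions

If the wait stays on ASYNC_NETWORK_IO while one core on the AS box is pegged at 100%, the bottleneck is likely on the Analysis Services side rather than in the relational query.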

View 1 Replies View Related

Error While Processing Cube In SSAS 2008

Nov 4, 2009

I am getting the following error when I am processing the cube in SSAS 2008:

Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated. Errors in the OLAP storage engine: An error occurred while processing the 'Policy Type' attribute of the 'Policy Type' dimension from the 'MyDemo' database.

I verified the data type and column length for the PolicyType column in the dimension as well as in all the fact views.
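
One quick check for this error is to compare the longest value actually returned by the source views with the DataSize set on the attribute's binding. A hedged sketch, with the view name hypothetical and the column name taken from the post:

SELECT MAX(LEN(PolicyType)) AS MaxPolicyTypeLength
FROM dbo.vwFactPolicy;   -- repeat for every view that feeds the Policy Type attribute

If the maximum length is larger than the binding's DataSize, widen the attribute's key/name binding (or refresh the data source view so the new column length is picked up) and reprocess.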

View 9 Replies View Related

Analysis :: SSAS 2012 - Notification While Cube Is Processing

Apr 16, 2015

I am working on SQL 2012. We have an SSAS cube built, and on top of it the client uses Excel to connect to the cube and view the reports. My cube is processed every hour and takes almost 1-2 minutes to process. Whenever an end user/client views or refreshes the Excel report (which uses the cube as its source) WHILE THE CUBE IS PROCESSING, they get an error that the source is not available.

We have tried our best but cannot bring the processing time of the cube down to 3-5 seconds, so end users still face the report refresh issue at the moment of cube processing.

Requirement: in case an end user views the report while the cube is processing in the back end, instead of an error they should see some customized message which we can provide, something like "Data is refreshing, please wait".

View 5 Replies View Related

Cube Processing Hangs, MSSQLServerOLAPService Crashes On 2005 SP2 Build 3054

May 21, 2008

Hi,

*every* processing of a cube with 7 dimensions and 5 measure groups (no table has more than 100,000 rows) hangs while trying to process the measure groups. I have to restart Analysis Services all the time. Even worse, every second processing crashes the whole Analysis Services process.

The only time I could successfully deploy a cube was with a single measure group.

How can anyone do any serious work with Analysis Services?

The error messages from the event log are not helpful either.

Thanks,
Tobias



EventType sql90exception, P1 msmdsrv.exe, P2 9.0.3054.0, P3 46048069, P4 msmdsrv.exe, P5 9.0.3054.0, P6 46048069, P7 0, P8 005e9430, P9 00000000, P10 NIL.

The description for Event ID ( 22 ) in Source ( MSSQLServerOLAPService ) cannot be found. The local computer may not have the necessary registry information or message DLL files to display messages from a remote computer. You may be able to use the /AUXSOURCE= flag to retrieve this description; see Help and Support for details. The following information is part of the event: Internal error: An unexpected exception occured..

View 1 Replies View Related

SQL 2012 :: SSIS Processing Of SSAS Multidimensional Cube Fails?

Oct 31, 2012

All I get back is an error message of "Analysis Services Processing Task Error: A Connection cannot be made. Ensure the Server is running". The server is running; I can process the cube by connecting to the AS instance and processing it via right-click.

I can also process the cube by running the SSIS task inside SSDT. Only when I deploy the SSIS package (in project deployment mode) and then execute it do I get the error message.

SQL Server, SSAS, and SSIS processes are all running under the same account. SSAS is on a separate server from SSIS and SQL, if that matters.

View 2 Replies View Related

SQL Server Admin 2014 :: SSAS Cube Processing Fail With Communication Link Failure Error

Feb 4, 2015

We have our cubes on Server A and the SQL database resides on Server B (we are on SQL 2014). For the last couple of days our cube processing has been failing with the error below:

OLE DB or ODBC error: Protocol error in TDS stream; HY000; Protocol error in TDS stream; HY000; Protocol error in TDS stream; HY000; Communication link failure; 08S01; TCP Provider: An existing connection was forcibly closed by the remote host.
; 08S01

I have been going through some blogs to understand the error, but I haven't found anything specific yet.

View 0 Replies View Related

Analysis :: SSAS Partition Processing

Oct 21, 2015

I am facing an issue with partition processing. I have an SSAS cube with 5 partitions. These partitions are processed through a SQL Server job using SSIS packages; in the packages I use the Analysis Services Processing Task to do this. The problem is that the job runs successfully and shows that the step containing the partition processing is also fine, but the data is not updated in the partition. When checking the partition properties, the last processed date and time are not updated.

When I try to process the partition manually, it succeeds and recent data is reflected with the recent date and time. Package configuration is done in the job itself.

View 4 Replies View Related

Cube Processing

May 22, 2008



Hi All,

I have 3 cubes in a single SSAS database and these cubes should be processed using the following schedule

Cube 1 - Every Day
Cube 2 - Every Week
Cube 3 - Every Month
Cube 4 - Every Day

The issue that I face is that these cubes share dimensions, so I can't do a FULL process of these SHARED dimensions as it will affect the other cubes.

I can expect additions and deletions to my dimension data, but the structure remains the same. It would be great if someone could suggest how to go about processing the dimensions. I am confused by the number of options (Process Incremental, Process Update, etc.) available for processing the dimensions.

I will be creating an SSIS package to automate the processing. One more question: say Cube 2 fails during a day and Cube 1 has successfully processed earlier on the same day, how do I revert to the old state of Cube 2? Does this mean that I need to take a backup of the SSAS database before processing each cube?

Apologies if I had missed anything.

Thanks!

View 1 Replies View Related

Client Able To Access SSAS Cube Only Once

Sep 6, 2007

I have an Excel workbook that has a pivot table in it where the data source is an OLAP cube. My problem occurs on the client machine, logged in as the client. I remote into the PC and create a pivot table using the OLAP cube connection. I create the pivot table and everything works fine; I am able to browse the data with no issues. Once I exit the Excel workbook and come back in, I am no longer able to connect to the data source. I have tried both saving the password in the connection and not saving it. It has made no difference.

View 1 Replies View Related

Analysis :: Using Calculations In SSAS Cube?

May 21, 2015

How can I use this MDX script in the calculation part of a cube? Would I simply dump it into the calculation script, starting with CREATE MEMBER CURRENTCUBE.[Measures].[Test]?

select
  [measures].[abc] on 0,
  [xyz].[xyz].&[0] : [xyz].[xyz].&[60] on 1
from
(
  select
    tail([month].[month].[month].members, 6) on 0
  from [cube]
)

View 3 Replies View Related

Dynamically Add Partitions To A SSAS Cube

May 9, 2007

The environment here is SSIS ETL feeding a Fact Table. The Fact Table is pulled into SSAS as a cube and reporting services are handled there. I am on the ETL side and don't pretend to know all the processing that happens with cubes, etc.



What we are trying to accomplish is to add partitions to a cube via the ETL processing. The partitions should be added by day, e.g. 20070501, 20070502, etc.



This is currently processed manually by the Reporting developer and we are looking for an automated process to reduce errors and hand work.



I have explored the Partition Processing data flow destination and did not find much documentation or examples on its use. If you have any information on this component, please reply with it.



The other option is the Analysis Services Processing task in the control flow. I understand that we can process Analysis Services objects as part of our package. Is there a way to incorporate a partitioning script in this object? If so, how? Again, I did not find detailed examples on the use of this object.



If you have experience with either of these, please feel free to reply. I appreciate any and all comments.

View 5 Replies View Related

Analysis :: More Than One Cube In One SSAS Database

Dec 13, 2010

Is it possible to have more than one cube under one SSAS database? For example, I have a database "Test" and the existing cube in it is "TestCube"; is it possible to deploy another cube "TestCube2" under the Test database?

If yes, then what is the process to do that? The reason I am asking is that there are some common dimensions used in both cubes, and I am not sure of the best way to use the shared dimensions.

View 6 Replies View Related

Processing Cube Error 211

Aug 14, 2000

When I process a new cube I receive an error "Error (211): Unknown dimension member ' 8'; Time: 8/11/00 1:09:45 PM". Any ideas about what this error means?
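
Error 211 usually means the fact table contains a key value (here ' 8') that has no matching member in one of the dimension tables. A hedged sketch for finding such rows, with table and column names hypothetical:

SELECT f.*
FROM dbo.FactTable AS f
LEFT JOIN dbo.DimTable AS d ON d.DimKey = f.DimKey
WHERE d.DimKey IS NULL;   -- fact rows whose key is missing from the dimension

Either add the missing member to the dimension table or clean up the fact rows, then reprocess the dimension before the cube.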

View 2 Replies View Related

Cube Processing Error

Apr 14, 2004

Hi,
I am trying to process an Analysis Server cube and I am getting an error message saying:

Syntax error converting the varchar value 'A.H' to a column of data type int.; 22018.

Can somebody tell me how to resolve this error?

Thanks,
Praveen
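
The message means a varchar value ('A.H') is being converted or joined to an int column somewhere in the cube's source query. A hedged sketch for locating the offending rows, with table and column names hypothetical:

SELECT *
FROM dbo.FactTable
WHERE ISNUMERIC(SomeCodeColumn) = 0;   -- values that cannot be converted to int

Once the non-numeric values are found, either fix the data or change the joined/converted column types so they match.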

View 2 Replies View Related

Cube Processing Hanged!

Jan 18, 2008



I have a cube which can be processed on our development server. However, when it is deployed to the UAT server, the cube processing hangs. What I have done so far:

Checking sys.sysprocesses found that there are 8 threads suspended with waittype=CXPACKET. The threads are all reading the transactional database.


We then found on the web that CXPACKET is related to parallel query processing, so we tried the following:


Select the 'Sequential' option in processing -> result is the same, CXPACKET.

Select 'Parallel' with the parallel thread count set to 1 -> results in one thread with an ASYNC IO wait type.

Select 'Parallel' with the parallel thread count set to 2 -> results in CXPACKET.

Change the number of CPUs SQL Server uses from 8 to 1 -> results in one thread with an ASYNC IO wait type.

Change the number of CPUs SQL Server uses from 8 to 2 -> results in CXPACKET.

Set 'max degree of parallelism' in SQL Server to 4 -> results in CXPACKET (a sketch of this setting change is below).
All the trials resulted in a hung state.
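
For reference, the server-wide 'max degree of parallelism' setting used in the last trial can be changed with sp_configure; a minimal sketch (setting it to 1 disables parallel plans entirely, which is useful as a test):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 1;   -- 0 = use all CPUs, 1 = no parallelism
RECONFIGURE;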

Difference between development server and UAT server:
1. CPU : dev = 2, UAT = 8
2. SQL Server Version : dev = 2005 SP1, UAT = 2005 SP2
3. database size : dev > UAT


Has anyone had this problem and found a solution before? Please share. Thanks.

View 1 Replies View Related

Error Processing A Cube

May 16, 2008



I just started getting a new error message to me:

Errors in the high-level relational engine. The data source view does not contain a definition for the 'WeightRecieved' column in the 'dbo_factPurchases' table or view.





The problem is that WeightReceived IS defined in my DSV, so I don't know what to do about this error.

Any suggestions?

View 4 Replies View Related

Analysis :: Processing SSAS Tabular Model Automatically

Oct 18, 2012

How can I automatically update my tabular model in the future when there is an update in my database?

View 4 Replies View Related

Analysis :: SSAS Tabular Model Processing Error

Nov 4, 2015

I get the following error while processing an SSAS tabular model (2014) on a new server. The SSAS service on this server is running under a login which has access to the SQL Server data sources. I tried changing the provider from SQLNCLI11 to OLEDB in the connection string, but that doesn't work either. The error message isn't useful for debugging further.

The cube processing succeeds on a different server. I scripted out the cube database and ran the script on the new server, and am trying a Process Full, but it fails with the following error.

Error Message:
The operation failed because the source database does not exist, the source table does not exist, or because you do not have access to the data source.

More Details:
OLE DB or ODBC error: A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections.

For more information see SQL Server Books Online.; 08001; SSL Provider: No credentials are available in the security package

; 08001; Client unable to establish connection; 08001; Encryption not supported on the client.; 08001.
A connection could not be made to the data source with the DataSourceID of 'd7a37dae-be87-44e0-a8b2-498069af82c9', Name of 'connection name'.
An error occurred while processing the partition 'XXXX_460f3467-1a99-4dc9-aaf2-bcf3d54a5c4c' in table 'XXXX_460f3467-1a99-4dc9-aaf2-bcf3d54a5c4c'.

The current operation was cancelled because another operation in the transaction failed. The operation failed because the source database does not exist, the source table does not exist, or because you do not have access to the data source.

View 3 Replies View Related

SQL Server 2005 SSAS - Refresh Cube

Nov 12, 2007

Hi guys,
This is my first post. Could anyone help me out? I am trying to refresh a cube that I created by using a new script command, but I get the following error: "the script contains the statement, which is not allowed." On the Microsoft page I've read that for refreshing you need to create a new script. Is there another way to refresh the cube?

Cheers

View 5 Replies View Related

Analysis :: Add Population Data In SSAS Cube

Aug 7, 2015

I want to implement population data in a sales cube.

The fact table has a customer code which is a foreign key to the Customer master dimension, which in turn is linked to the census data dimension. The census data dimension has city-wise population data with foreign keys to zone and state.

We want to add the population data to the fact table.

View 3 Replies View Related

Analysis :: ABF File - Restoring SSAS Cube

Oct 25, 2010

I have a .abf file, which I am attempting to restore. I go to Management Studio and attempt to restore the cube.

However, whenever I attempt to restore the following error message occurs:

"File 'C:/.......' specified in restore command is damaged or is not an AS backup file.

The following error occured:

Access is denied (Microsoft SQL Server 2008 R2 Analysis Services)

View 4 Replies View Related

Analysis :: Deploying SSAS Cube With Different Name On Same Server

Jun 21, 2013

How can I deploy an existing cube with a new name on the same server?

View 4 Replies View Related

Analysis :: SSAS Model Where Never Need Cube To Be Offline

May 21, 2015

I am thinking of a possible design where the cube will never go offline.

Usually when I make some code changes to my cube, the cube goes offline and I need to do a Full Process again to get it back.

However, in cases where the cube is extremely critical for the business users, it would be great if I could deliver a solution where the cube never goes down.

View 4 Replies View Related

Analysis :: How To Connect SSAS Cube To Server

Jun 12, 2015

How can I connect an SSAS cube to a server? I have no SSAS server instance...

View 3 Replies View Related

Integration Services :: Cube Creation In SSAS

Aug 24, 2015

I need to create 3 fact tables in a cube with dimensions. Is that possible? What are the steps to do it?

View 3 Replies View Related

Analysis :: Count Distinct In SSAS Cube?

Sep 8, 2015

Background: I have a huge fact/dimension table (the table has both measures and dimensions) that I am using to build an SSAS cube.

The table didn't have a unique identifier, so the database team added a ROW NUM column to the table, which I am using as the primary key in my cube build. I was able to create the cube successfully without any issues.

Problem: Now customers are asking for a 'Claim Count calculation' which shows the Distinct Claim Count. 

It is defined as below:

Count(Distinct Claim_Number || Claim Year || Claim Month)

All 3 columns are available in the table, but when I try to create this count distinct object in the DSV, the cube processing time increases by 5 times, as I now have to use a GROUP BY in my SQL (there are around 30 columns to group by).

Is there a better way to achieve Count(Distinct Claim_Number || Claim Year || Claim Month) without using GROUP BY in the DSV SQL logic? I can't seem to find any Count(Distinct) function among the cube calculation functions.

Environment: SSAS 2012 Multidimensional Model
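
One common way around the GROUP BY is to add a concatenated key column in the DSV named query and define an SSAS DistinctCount measure on that column instead. A hedged sketch, with the table name hypothetical and the column names assumed from the post (underscored forms):

SELECT f.*,
       CAST(Claim_Number AS varchar(20)) + '|'
     + CAST(Claim_Year   AS varchar(4))  + '|'
     + CAST(Claim_Month  AS varchar(2))  AS ClaimDistinctKey
FROM dbo.FactClaims AS f;   -- base the DistinctCount measure on ClaimDistinctKey

SSAS typically places a DistinctCount measure in its own measure group, so its processing cost stays isolated from the other measures and no GROUP BY is needed in the DSV SQL.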

View 4 Replies View Related

Processing Ms Olap-cube Via Jobs

May 29, 1999

Hello!

I'm looking for a possibility to schedule the processing of an MS OLAP Services cube in a SQL Server Agent job. Does anyone have experience with that? Are there any alternatives for scheduling the processing?

Thanx, Wiebke

View 1 Replies View Related

Cube Processing Takes Time

Jan 31, 2007

Hi,
Cube processing is taking more time on a new server, while the same cubes take less time on another server.
The cubes are processed through a DTS package.
Can anybody help find the possible reasons for this?
Regards
Naseem

View 5 Replies View Related

OLAP Cube Partition Processing ...

Apr 25, 2008

We have an MS OLAP cube that has about 11 partitions, and I have created a prototype package which processes these partitions conditionally, based on expressions that are fed values from a SQL Server control table. It appears that one or more of the partitions fail because all of the data for the various partitions comes from the same huge fact table. Is there a way to control the level of concurrency within the package itself? If not, I am thinking I should make some of the partitions process only after other partitions have completed successfully. I appreciate any help.

View 2 Replies View Related

OLAP Cube Processing Project ...

Apr 28, 2008



I am trying to log the processing time details so that we can identify bottlenecks. My SSIS package has a bunch of OLAP processing tasks. In the event handlers (OnPreExecute and OnPostExecute events), I am trying to capture the start and end time for each OLAP processing task by using an "Execute SQL Task". In the event handler, I have a conditional expression that checks the following:

@SourceName != @[User::Expression1]

where Expression1 is a variable that contains the value "Execute SQL Task". I thought this expression would be true only for the OLAP processing tasks, which by the way never fire the OnPreExecute or OnPostExecute events. What am I doing wrong?

View 1 Replies View Related

Analysis :: SSAS - Dimension Processing Failing Due To BitLocker Encryption

May 8, 2015

I'm facing an issue while processing OLAP. I have enabled BitLocker for drive encryption, and the OLAP service uses this drive for database storage. OLAP processing is executed through an SSIS package, and I'm getting the error below in the package. When debugging the script, it says the drive is encrypted using BitLocker.

My client requires TDE for all databases; for OLAP we decided to use BitLocker: [URL] ....

SQL Server is installed on the C drive and the D drive is the storage location for the OLAP database. When the D drive is locked, OLAP processing fails. When I tried to restart the SQL Server Analysis Services service in Services.msc, it would not start; the service restarted only when the D drive was unlocked. Is there any way we can process OLAP even when the drive is locked?

Error message is given below:

"The following system error occurred:  This drive is locked by BitLocker Drive Encryption. You must unlock this drive from Control Panel. "

View 3 Replies View Related






