I'm trying to return the EOY (12/31/14) value of the Average Balance measure when I pass in a Date-level current member. This current member may also be at the Month, Quarter or Year level of the Post Date dimension. I've tried:
I am new to MDX. Our PeriodsToDate function does not return any value. We have set the Type property of our Date dimension to Time. Actually, CURRENTMEMBER does not return a valid value, so our PeriodsToDate call fails.
WITH MEMBER [Measures].[YTD Actual] AS
    Aggregate(
        PeriodsToDate(
            [DimDate].[CalendarHierarchyDateLevel].[Calendar Year],
            [DimDate].[CalendarHierarchyDateLevel].CURRENTMEMBER
        )
    )
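For reference, a minimal complete query around that member might look like the sketch below; the cube name [MyCube], the base measure [Measures].[Actual], and the [Month] level are assumptions, not taken from the original model:

WITH MEMBER [Measures].[YTD Actual] AS
    Aggregate(
        PeriodsToDate(
            [DimDate].[CalendarHierarchyDateLevel].[Calendar Year],
            [DimDate].[CalendarHierarchyDateLevel].CURRENTMEMBER
        ),
        [Measures].[Actual]   // assumed base measure
    )
SELECT
    { [Measures].[YTD Actual] } ON COLUMNS,
    [DimDate].[CalendarHierarchyDateLevel].[Month].Members ON ROWS   // assumed level name
FROM [MyCube]

If nothing from that hierarchy is on an axis or in the slicer, CURRENTMEMBER resolves to the default (All) member, which sits above the [Calendar Year] level, and PeriodsToDate then typically returns an empty set; that is a common reason such a calculation shows no value.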
I have managed to use the BI Wizard for time intelligence and added YTD and MTD successfully. I notice the values returned are empty, and I think this is due to the fact that all the test data I use is many years old. What's the simplest way to resolve this issue so that I can see that these MDX functions return correct values? Changing the system date on this company laptop is not an option.
I have an SSAS 2012 Tabular instance with SP2, and there is a database on the instance with a read role that has Everyone assigned permissions. When configuring the Power BI Analysis Services connector, at the point where you enter Friendly Name, Description and Friendly Error Message and click Next, I receive the error "The remote server returned an error (403)." I've tested connecting to the database from Excel on a desktop and it connects fine. I don't use an "onmicrosoft" account, so I don't have that problem to deal with.
We use Power BI Pro with our Office 365. As far as I can tell that part is working OK, as I pass that stage of the configuration with a message saying it connected to Power BI. The connector is installed on the same server as the Tabular instance; it's a Windows Server 2012 Standard server. The Tabular instance runs under a domain account that is the admin account for the instance (this is a dev environment), and that account is what I've used in the connector configuration. It's also a local admin account. There is no gateway installed on the server.
I have a cube that we are processing nightly via an Analysis Services Processing Task in SSIS. In order to improve processing time, we elected to use a lot of rigid dimension attributes and do a full process of everything in the SSIS task. The issue I am having is that after that task completes, I need to go into Visual Studio to deploy the cube, because we are unable to browse or use the cube otherwise. This issue seemed to start once we changed the SSIS Analysis Services Processing Task to do a full process on the dimensions rather than an incremental one.
I would expect that once development is done, and the cube is processed and deployed, that is it. My thinking is that the SSIS task should just update the already deployed cube.
How do I choose the right key column in a "Mining Structure" for Microsoft Analysis Services?
I have a table, "Incoming goods":
CREATE TABLE Income
(
    ID int NOT NULL IDENTITY(1, 1),
    [Date] datetime NOT NULL,
    GoodID int NOT NULL,
    PriceDeliver decimal(18, 2) NOT NULL,
    PriceSale decimal(18, 2) NOT NULL,
    CONSTRAINT PK_Income PRIMARY KEY CLUSTERED (ID),
    CONSTRAINT FK_IncomeGood FOREIGN KEY (GoodID) REFERENCES dbo.Goods (ID)
)
I'm trying to build a relationship (regression) between “Price Sale” and “Price Deliver” for a good, but I do not know which column is the better choice for the “key column”: ID or GoodID?
I am connecting to an SSAS cube from Excel and I have a date dimension with 4 fields (I have others but I don't use them for this case). I created the 4 fields in order to test all possible scenarios that I could think of:
DateKey:  Type: System.Integer, Value: yyyyMMdd
Date:     Type: System.DateTime
DateStr0: Type: System.String, Value: dd/MM/yyyy (note: I am not using US culture), Example: 01/11/2015
DateStr1: Type: System.String, Value: %d/%M/yyyy (note: I am not using US culture), Example: 1/11/2015
Filtering on date is working fine:
Initially, in Excel, filtering on date was not working. But after changing the dimension type to Time and setting the DataType to Date, as mentioned in [URL], the filter is working fine, as you can see in the picture. Grouping on date is not working:
I have a hierarchy in my Date dimension and I can group based on the hierarchy, no problem. But the user is used to the pre-built grouping function of Excel, and he wants to use that. The pre-built Excel functions Group and Ungroup seem to be available, as you can see in the following picture:
But when the user clicks 'Group', Excel groups it as if it were a string, and that is the problem. The user wants to group using the pre-built grouping function available in the pivot table. I also found out that a Power Pivot table does not support this Excel grouping functionality. If I understood correctly, this pre-built grouping functionality of Excel needs to do the calculation at run time, which is not a viable solution if you have millions of rows. So a Power Pivot table does not support the pre-built grouping functionality of Excel, and hence we need to use a dimension hierarchy to do the grouping. But I am not using a Power Pivot table, I am using a simple pivot table, so I expect the grouping functionality to work fine.

Then I tried a simple test. I created a simple data source in Excel itself and used it as the source of my pivot table. Then grouping worked fine. The only difference that I can see (when double-clicking the measure value in Excel) is:

1. For the date values of my simple test, Excel considers them as 'Date'.
2. For the date values of my data coming from the cube, Excel considers them as 'General'.
2.1. But the value here is the same as it was in the simple test.
2.2. 'Date Filter' works just fine.
2.3. If I just select this cell and unselect it, then Excel changes the type to 'Date' for that cell, though.
2.4. I have created 4 different types of fields in my date dimension, thinking that the values of my dimension attributes might be the problem, but Excel considers all of them 'General'.
2.5. This value (which can be seen when double-clicking on the measure) comes from the 'NameColumn' of the attribute, and the DataType defined there is WChar. I thought that might be the reason for the issue, so I tried changing it to 'Date'. But SSAS does not allow changing it to 'Date', giving the error: The 'Date' data type is not allowed for the 'NameColumn' property; 'WChar' should be used.
So I don't know what puzzle piece I am missing.
1. The date filter works; grouping does not.
2. Excel considers it a 'General' string.
3. SSAS does not allow changing the 'NameColumn' to Date.
I would like to know the best practice for running Analysis Services in terms of port usage. Is it better to run on a specific port or to have dynamic ports? We have clustered servers that run on the default 2383, but I'm not sure what the best approach is for performance with non-clustered servers.
I have a package in which I have been attempting to return an error code after the stored procedure executes; otherwise the package works great.
I call the stored procedure from an Execute SQL Task (execute Marketing_extract_history_load_test ?, ? OUTPUT). The SQL task's ResultSet is set to None. It is an OLE DB connection.
I have two parameters mapped:
tablename: Input, varchar, 0 (this variable is set earlier in a ForEach loop)
ADO.returnvalue: Output, long, 1
I set a breakpoint and see the values change, and I have an OnFailure condition set for when it returns a failure. But the failure is ignored and the package completes. Not quite what I wanted.
The first part of the SP is below; I set the value of @i and return it.
Why is it not capturing the error and executing my OnFailure code? I have tried setting one of my parameter mappings to ReturnValue with no success.
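For reference, a minimal sketch of the return-value pattern being described; the procedure body here is hypothetical, not the original Marketing_extract_history_load_test logic, and the Execute SQL Task statement shown is one commonly used form for an OLE DB connection:

CREATE PROCEDURE dbo.Marketing_extract_history_load_test
    @tablename varchar(128)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @i int;
    SET @i = 0;
    BEGIN TRY
        -- ... load logic for @tablename would go here (hypothetical) ...
        SET @i = 0;
    END TRY
    BEGIN CATCH
        SET @i = ERROR_NUMBER();   -- nonzero signals failure to the caller
        -- Note: RETURN by itself does not raise an error, so it will not trigger
        -- the task's OnFailure path on its own; re-raising with RAISERROR would.
    END CATCH
    RETURN @i;
END

-- Execute SQL Task SQLStatement (OLE DB):  EXEC ? = dbo.Marketing_extract_history_load_test ?
-- Parameter 0 maps to the package variable receiving the return value (Direction: ReturnValue, Long);
-- parameter 1 maps to the tablename variable (Direction: Input, VarChar).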
This is my function; it returns a SqlDataReader to a DataList control. How do I return the page count along with the SqlDataReader result set? SQL Server 2005, ASP.NET 2.0.
Function get_all_events() As SqlDataReader
    Dim myConnection As New SqlConnection(ConfigurationManager.AppSettings("..........."))
    Dim myCommand As New SqlCommand("EVENTS_LIST_BY_REGION_ALL", myConnection)
    myCommand.CommandType = CommandType.StoredProcedure

    Dim parameterState As New SqlParameter("@State", SqlDbType.VarChar, 2)
    parameterState.Value = Request.Params("State")
    myCommand.Parameters.Add(parameterState)

    Dim parameterPagesize As New SqlParameter("@pagesize", SqlDbType.Int, 4)
    parameterPagesize.Value = 20
    myCommand.Parameters.Add(parameterPagesize)

    Dim parameterPagenum As New SqlParameter("@pageNum", SqlDbType.Int, 4)
    parameterPagenum.Value = pn1.SelectedPage
    myCommand.Parameters.Add(parameterPagenum)

    Dim parameterPageCount As New SqlParameter("@pagecount", SqlDbType.Int, 4)
    parameterPageCount.Direction = ParameterDirection.ReturnValue
    myCommand.Parameters.Add(parameterPageCount)

    myConnection.Open()
    'myCommand.ExecuteReader(CommandBehavior.CloseConnection)
    'pages = CType(myCommand.Parameters("@pagecount").Value, Integer)
    Return myCommand.ExecuteReader(CommandBehavior.CloseConnection)
End Function
The variable Pages is a global Integer.
This is what I am calling:

DataList1.DataSource = get_all_events()
DataList1.DataBind()
How do I return the records and also the return value of @pagecount? I tried many options; nothing worked. Please help! I am stuck.
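A minimal sketch of one workaround, under the assumption that the value of @pagecount only needs to be readable after the command has finished (return and output parameter values are not populated until the reader is closed): fill a DataTable with a SqlDataAdapter, then hand the count back through a ByRef argument. The helper name is hypothetical, not the original function.

' Hypothetical helper, for illustration only.
Function GetAllEventsWithPageCount(ByRef pageCount As Integer) As DataTable
    Dim myConnection As New SqlConnection(ConfigurationManager.AppSettings("..........."))
    Dim myCommand As New SqlCommand("EVENTS_LIST_BY_REGION_ALL", myConnection)
    myCommand.CommandType = CommandType.StoredProcedure

    myCommand.Parameters.Add("@State", SqlDbType.VarChar, 2).Value = Request.Params("State")
    myCommand.Parameters.Add("@pagesize", SqlDbType.Int, 4).Value = 20
    myCommand.Parameters.Add("@pageNum", SqlDbType.Int, 4).Value = pn1.SelectedPage

    Dim parameterPageCount As New SqlParameter("@pagecount", SqlDbType.Int, 4)
    ' Direction kept as in the original; use ParameterDirection.Output instead if the
    ' procedure declares @pagecount as an OUTPUT parameter rather than a RETURN value.
    parameterPageCount.Direction = ParameterDirection.ReturnValue
    myCommand.Parameters.Add(parameterPageCount)

    ' Fill opens and closes the connection itself and reads the rows to the end,
    ' so the return/output parameter is populated afterwards.
    Dim table As New DataTable()
    Dim adapter As New SqlDataAdapter(myCommand)
    adapter.Fill(table)

    pageCount = CType(parameterPageCount.Value, Integer)
    Return table
End Function

' Usage:
'   Dim pages As Integer
'   DataList1.DataSource = GetAllEventsWithPageCount(pages)
'   DataList1.DataBind()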
I'm trying to create a report that's based on a SQL Server 2005 stored procedure.
In Report Designer, I added a report dataset (based on a shared data source).
When I try to build the project in BIDS, I get an error. The error occurs three times, once for each parameter on the stored procedure.
I'll only reproduce one instance of the error for the sake of brevity.
[rsCompilerErrorInExpression] The Value expression for the query parameter 'UserID' contains an error : [BC30654] 'Return' statement in a Function, Get, or Operator must return a value.
I've searched on this error and it looks like it's a Visual Basic error:
I am trying to bring my stored procs into the 21st century with TRY...CATCH and the OUTPUT clause. In the past I have returned info like the new timestamp, the new identity (if an insert sproc), and the username via output parameters, plus a return value as well. I have checked whether an error is a concurrency violation (I check if @@ROWCOUNT is 0, and if so my return value is a special number).
I have used the old GOTO method for trapping errors and committing or rolling back the transaction.
Now I want to use TRY...CATCH with transactions. This is easy enough, but how do I do what I had done before?
I get an error returning the new timestamp in the OUTPUT clause (tstamp is my timestamp field, so I am using inserted.tstamp).
Plus, how do I check for a concurrency error? Is it the same as before, and if so, would the check of @@ROWCOUNT go in the CATCH section?
So: how to return the timestamp and a return value, and how to check for concurrency, all within TRY/CATCH?
By the way, I read that you could not return an identity in the OUTPUT clause, but I had no problem doing so.
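A minimal sketch of one way these pieces might fit together, under assumptions: a hypothetical dbo.Customer table with an identity key and a rowversion column named tstamp, and arbitrary special return values. This is not the original procedure.

CREATE PROCEDURE dbo.Customer_Update
    @CustomerID int,
    @Name       nvarchar(100),
    @tstamp     binary(8) OUTPUT      -- caller's rowversion in, new rowversion out
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @NewVersion TABLE (tstamp binary(8));

    BEGIN TRY
        BEGIN TRANSACTION;

        UPDATE dbo.Customer
        SET    Name = @Name
        OUTPUT inserted.tstamp INTO @NewVersion (tstamp)   -- capture the new rowversion
        WHERE  CustomerID = @CustomerID
          AND  tstamp = @tstamp;                           -- optimistic concurrency check

        IF @@ROWCOUNT = 0
        BEGIN
            -- A zero rowcount is not an error, so it never reaches the CATCH block;
            -- the concurrency check stays here in the TRY block.
            ROLLBACK TRANSACTION;
            RETURN -100;   -- special value: the row was changed or deleted by someone else
        END

        SELECT @tstamp = tstamp FROM @NewVersion;

        COMMIT TRANSACTION;
        RETURN 0;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        RETURN ERROR_NUMBER();   -- or re-raise with RAISERROR for the caller
    END CATCH
END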
I have been using BCP to output an SP's result set to a txt file. However, when it returns nothing, it gives a SQLException like "no rows affected". I have tried to find a solution, including putting a "SET NOCOUNT ON" command before the SELECT statement, but it doesn't help :(
Does anyone know how to handle the problem when the SP returns no data?
I have some questions about SQL Server 2000 and 2005 compatibility. In my configuration I have to use both servers. The cubes are stored on the 2005 server. Can I transfer the cubes from 2005 to 2000 Analysis Services?
If so, what is the procedure? Is the result of the migration the same in the two different versions?
I have a stored procedure that selects the unique Name of an item from one table.
SELECT DISTINCT ChainName from Chains
For each ChainName, there exist 0 or more StoreNames in the Stores table. I want to return the result of the following select as the second field in each row of the result set:
SELECT DISTINCT StoreName FROM Stores WHERE Stores.ChainName = ChainName
Each row of the result set returned by the stored procedure would contain:
ChainName, array of StoreNames (or a comma-separated string, or whatever)
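One common way to produce that shape on SQL Server 2005 and later is a correlated FOR XML PATH subquery; a minimal sketch against the Chains and Stores tables described above (column names as in the question):

SELECT c.ChainName,
       STUFF((SELECT DISTINCT ', ' + s.StoreName
              FROM Stores AS s
              WHERE s.ChainName = c.ChainName
              FOR XML PATH('')), 1, 2, '') AS StoreNames
FROM (SELECT DISTINCT ChainName FROM Chains) AS c;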
The question is, when can it be considered excessive? Or can you just order the results in descending order and see if you can make improvements on the worst ones?
The same goes for this query for page I/O latch wait counts and times for indexes:

SELECT database_id, object_id, index_id, partition_number,
       page_io_latch_wait_count, page_io_latch_wait_in_ms
FROM sys.dm_db_index_operational_stats(NULL, NULL, NULL, NULL)
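As a sketch of the "order by the worst ones" idea mentioned above (the TOP value is arbitrary):

SELECT TOP (20)
       database_id, object_id, index_id, partition_number,
       page_io_latch_wait_count, page_io_latch_wait_in_ms
FROM sys.dm_db_index_operational_stats(NULL, NULL, NULL, NULL)
ORDER BY page_io_latch_wait_in_ms DESC;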
I have a SQL query, and I am trying to write MDX with similar logic. The logic has this WHERE clause:
WHERE (
        (
            (CD_TYP_CAT_PARENT = 'LOAD' OR CD_TYP_CAT_PARENT = 'SPEND')
            AND CD_TYP_CAT NOT IN ('REVERSAL', 'REFUND', 'FEE')
        )
        OR (CD_TYP_CAT_PARENT = 'OTHER' AND CD_TYP_CAT = 'SPEND')
      )
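A rough MDX equivalent might build a set with Filter over the two attribute hierarchies; the dimension and attribute names below ([Card Type].[CD TYP CAT PARENT] and [Card Type].[CD TYP CAT]) are assumptions, not from the original cube:

Filter(
    [Card Type].[CD TYP CAT PARENT].[CD TYP CAT PARENT].Members
    * [Card Type].[CD TYP CAT].[CD TYP CAT].Members,
    (
        (
            ([Card Type].[CD TYP CAT PARENT].CurrentMember.Name = "LOAD"
             OR [Card Type].[CD TYP CAT PARENT].CurrentMember.Name = "SPEND")
            AND [Card Type].[CD TYP CAT].CurrentMember.Name <> "REVERSAL"
            AND [Card Type].[CD TYP CAT].CurrentMember.Name <> "REFUND"
            AND [Card Type].[CD TYP CAT].CurrentMember.Name <> "FEE"
        )
        OR
        (
            [Card Type].[CD TYP CAT PARENT].CurrentMember.Name = "OTHER"
            AND [Card Type].[CD TYP CAT].CurrentMember.Name = "SPEND"
        )
    )
)

The resulting set can go in a subselect or in the WHERE clause of the MDX query; depending on how the attributes are keyed and named, comparing MEMBER_CAPTION or listing explicit members may be more appropriate than Name.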
I just ran a testbed of 4 types of SQL queries:
1. inline SQL with a StringBuilder
2. managed SQL
3. SQL text processing (@SQL as varchar(5000); SET @SQL = 'SELECT ' + @var...)
4. a regular sproc that has the columns and table name hard coded

1, 2, and 4 always end up at about the same time given the averages. 3 is always at least 1.5 times slower, and usually closer to 2 times. 1 and 2 both use StringBuilders; the code is a direct copy, and 3 is a copy as well. My managed SQL is:

[Microsoft.SqlServer.Server.SqlProcedure]
public static void usp_Items_Select_Managed(SqlString table, SqlString col, SqlString value)
{
    // build the ad hoc SELECT text
    StringBuilder stringBuilder = new StringBuilder();
    stringBuilder.AppendFormat(
        "SELECT {0}.* FROM {0} WHERE {0}.{1} = {2}",
        table, col, value
    );

    // run it on the context connection and stream the results back
    SqlConnection sqlConnection = new SqlConnection("context connection=true");
    SqlCommand sqlCommand = new SqlCommand(stringBuilder.ToString(), sqlConnection);
    sqlConnection.Open();
    SqlContext.Pipe.Send(sqlCommand.ExecuteReader());
    sqlConnection.Close();
}

Is there anything wrong with my managed SQL, or is this just the way that it is? Thanks
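For context, approach 3 presumably looks something like the following sketch; the procedure and object names are hypothetical, shown only to make the comparison concrete:

CREATE PROCEDURE dbo.usp_Items_Select_Text
    @table sysname,
    @col   sysname,
    @value varchar(100)
AS
BEGIN
    -- build and execute the same SELECT as dynamic T-SQL text
    -- (quoting/parameterization omitted to mirror the string building in the other approaches)
    DECLARE @SQL varchar(5000);
    SET @SQL = 'SELECT ' + @table + '.* FROM ' + @table
             + ' WHERE ' + @table + '.' + @col
             + ' = ''' + REPLACE(@value, '''', '''''') + '''';
    EXEC (@SQL);
END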
I have a SQL Server 2000 machine with Analysis Server installed on it. It's in a different domain. When I try to register the server remotely, i.e. from another server in another domain, it gives me this error:
'Cannot connect to the repository. Analysis Server: <Server_name>
Error: Cannot open database requested in login '<Database_name>'. Login fails. Do you wish to register this server?'
Both domains are trusted. The OS where the Analysis Server is running is Windows 2000, and I am trying to connect from a SQL Server 2000 machine running Windows NT 4.0 Server.
We have another Analysis Server running on the Windows NT 4.0 OS, and I can easily register that server. I tried providing access with all the rights to the concerned domains, but in vain. I am exhausted trying to figure out the problem.
Any assistance in this regard would be of great help to me.
I am tuning some queries and am using the Display Execution Plan option in Query Analyzer. What I am looking for is something that explains the differences between the costs displayed when you hover the mouse pointer over an object in the execution plan output. BOL gives some general directions, but I need more specifics.
Is there any method by which I can use stored procedures in Analysis Services? I have some procedures which use temporary tables, and I want to use those procedures' columns. Is that possible?
Hi, I'm pretty new to AS and would like to ask a possibly silly question: why can't I browse the data in my cube?
Version: SQL2K Developer Edition on the W2K OS
Problem: I'm doing the tutorial which comes with Analysis Services. I have successfully followed its instructions and created and processed the 'Sales' cube.
However, when I browse the cube, I get the error: Unable to browse the cube 'Sales'. Unspecified error.
Will anyone help me? I am new to AS and I created the cube using the sample tutorial, but when I try to browse the cube it shows the error "unable to browse the cube, unspecified error". Please help me. Thanks in advance.
Hello everybody. I am new to BI projects. My client wants user activities to be captured into charts and a cube. In this context I just started reading about Analysis Services; I read some documents and implemented one simple Analysis Services project using the AdventureWorksDW database. My question is this: I have to work on Analysis Services using my own DB, so can I use the client DB (i.e. the user tables) directly, or do I need to write some script to transform this DB into one that can produce cubes and dimensions?
I installed MS Analysis Services on my Windows XP (SP1) desktop. When I try to expand the "+" sign beside the server, I get the following error message: "Unable to connect to the registry on the server, or you are not a member of the OLAP Administrators group on this server."
It is very weird that occasionally I can connect to the server without the above problem, but most of the time I cannot connect and get the above error message.
A lot of job postings these days for SQL DBAs list Analysis Services as a requirement. Not having any experience with it, I'm wondering what exactly is meant. Of course it's going to vary from company to company, but I really have no clue what they mean in general. Is it cube design? MDX query writing? Simply backing up? To be clear, I'm asking specifically in regard to a DBA, not a developer.
I have been looking at the Microsoft Business Intelligence tools in SQL Server 2005. I see great potential with Analysis Services and creating data cubes for reporting; this will offload much processing from my transactional servers.
I just haven't been able to identify the process for automatically updating my cubes overnight. The documentation leaves me baffled.
I am running XP in the office and already have SQL Server's Enterprise Manager and Analysis Services installed, which I can use to access the SQL databases on our office server.
Question: I have installed the desktop version of SQL Server on my machine and am trying to register the server in Analysis Manager, but no luck.
I was doing the steps on pages 15/16 of the attached sheet.
I was doing the Microsoft example about cubes, setting up a DSN source connection to the Access database (FoodMart). When I do a test it works,
but after I do Design Storage and go to process the cube, it gives me this error:
'Test connection failed because of an error in initializing provider. [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified'
One more error found: [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed: IM006
Hi. I have a problem with the calculation of dimensions in Analysis Services and I need help. In a certain dimension I make some calculations with "custom member formulas". Some cells are modified by these calculations. After this, in another dimension, other calculations are made, again using "custom member formulas". With these operations, some cells that were modified in the previous dimension should be modified again, but they aren't. Is there a way to fix the sequence of the calculation of the dimensions? Do they get calculated at the same time, or do some get done before others? Once one of these cells is modified using those formulas, is it possible to modify it again? In Hyperion's Essbase software there is an option to calculate some dimensions before others. Does Analysis Services have that option or something similar? I hope you can help me. Thank you very much.