Problems With Connections And Analysis Service Processing Task In SSIS
May 30, 2008
Hi everybody,
I'm fairly new to the SSAS/SSIS world (though not new to databases, etc.) and I'm having some problems with the SSIS packages in our cube environment.
Currently in our SSAS/SSIS project we have two major connection managers: one to the database we use for loading the cube, and the other to the cube itself. To load the data from the database into the cube, we wrote some SSIS packages and used Analysis Services Processing Tasks to process all the dimensions and measures. This works pretty well, so no problems here.
The real problem starts when I try to change the connection parameters, e.g. because the server changed or the database has been renamed.
As soon as the connection manager points to another (existing) cube, regardless of whether its structure is exactly the same as that of the old cube, the tasks lose all the assigned objects from their lists. It is really annoying to add all of the exact same objects to the task again. I tried experimenting with the DelayValidation attribute so the Development Studio doesn't destroy my work every time, but when I deploy the package the cube breaks. Obviously some kind of deeper connection is destroyed when I change the connection string.
Is there a way to prevent the package from breaking/losing objects, without me having to sacrifice 15 minutes every time I change the connection parameters?
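One workaround worth considering (a sketch only, not the Processing Task's documented behavior): drive the processing from a Script Task with AMO (a reference to Microsoft.AnalysisServices.dll), so the list of objects lives in code or package variables instead of in task metadata bound to one connection. The server, database, and cube names below are placeholders:

Imports Microsoft.AnalysisServices

Module ProcessByName
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            ' Placeholder database name; in a package this could come
            ' from a variable or from the connection manager's catalog.
            Dim db As Database = srv.Databases.FindByName("MyOlapDB")
            If db Is Nothing Then
                Throw New ApplicationException("Database not found.")
            End If

            ' Everything is addressed by name, so nothing here is lost
            ' when the connection string changes.
            For Each d As Dimension In db.Dimensions
                d.Process(ProcessType.ProcessFull)
            Next
            db.Cubes.GetByName("MyCube").Process(ProcessType.ProcessFull)
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module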
I have a cube that we are processing nightly via an Analysis Services Processing Task in SSIS. In order to improve processing time, we elected to use a lot of rigid dimension attributes and do a full process of everything in the SSIS task. The issue I am having is that after that task completes, I need to go into Visual Studio to deploy the cube because we are unable to browse or use it. This issue seemed to start once we changed the SSIS Analysis Services Processing Task to do a full process on the dimensions rather than an incremental one.
I would expect that once development is done and the cube is processed and deployed, that is it. My thinking is that the SSIS task should just update the already deployed cube.
I have an OLAP database "A" and an SSIS package "P" which processes all the dimensions and cubes in OLAP database "A".
I created OLAP database "A1" as a copy of "A" and made a copy of SSIS package "P" as "P1". I opened the "P1" SSIS package and updated the OLAP connection properties to "Initial Catalog = A1". A1 is my new OLAP database.
When I run package "P1", guess what? It processed the cubes and dimensions of OLAP database "A". Try it, though not in production, because I did it in production.
I am trying to execute an SSIS package, containing an Analysis Services Processing Task, from a client that does not have SSIS and SSAS installed. We are getting the error:
The task "Analysis Services Processing Task" cannot run on this edition of Integration Services. It requires a higher level edition.
The same package runs from a server that has both SSIS and SSAS installed. Let me know if someone has come across the same problem.
Hi. I have an Analysis Services (2005) cube with four dimensions and one fact table (with three partitions - 2006, 2007, 2008) for which I need to create an SSIS package to process. I only want to process one of the three partitions (2008) - the previous two years should remain unchanged.
This is what I have currently in the Analysis Services Processing Task under Processing configuration:
- An object for each of the dimensions with "Process Full."
- An object for the 2008 partition with "Process Full."
(Note - Under Process Options, I see only Process Default, Process Full, Unprocess, and Process Data for dimensions and partitions).
Batch settings are:
- Processing order: Sequential
- Transaction mode: All in one transaction
- Dimension errors: Ignore errors
- Process affected objects: Do not process
When I execute the package, the cube loses the 2006 and 2007 data.
I am assuming that I have an issue with the Process Option or the Batch Settings, and I would appreciate any guidance!
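A hedged guess at the cause: "Process Full" on a dimension rebuilds it and invalidates every partition that depends on it, and with "Process affected objects: Do not process" the 2006/2007 partitions would then be left unprocessed. A sketch of the usual alternative - ProcessUpdate on the dimensions, ProcessFull on only the 2008 partition - written with AMO and placeholder names:

Imports Microsoft.AnalysisServices

Module ProcessOnePartition
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            Dim db As Database = srv.Databases.GetByName("MyOlapDB")   ' placeholder

            ' ProcessUpdate refreshes dimension members in place and,
            ' unlike ProcessFull, does not invalidate existing partitions.
            For Each d As Dimension In db.Dimensions
                d.Process(ProcessType.ProcessUpdate)
            Next

            ' Fully reprocess only the 2008 partition; 2006/2007 keep their data.
            Dim mg As MeasureGroup = db.Cubes.GetByName("MyCube").MeasureGroups(0)
            mg.Partitions.GetByName("2008").Process(ProcessType.ProcessFull)
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module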
The last node of my workflow in SSIS is an Analysis Services Processing Task, which is supposed to fully reprocess a cube defined in a different project.
In the configuration I found the correct cube and its settings, so I thought I wasn't going to have any problems with it, but it started to complain about user and password information. I thought that since the database connections configured themselves when I added them, the same thing would happen with this task.
I do have my own username and password with permissions to reprocess the cube, although I thought Windows authentication would be better than setting up a user and password for the application/task.
I looked in the entire configuration pane and found no information regarding username and password. Where should I set it up, my SSIS solution or the Cube's solution?
This might be a newbie question, I'm not quite sure...
EDIT: Here is the error message: [Analysis Services Execute DDL Task] Error: The following system error occurred: Logon failure: unknown user name or bad password. .
I am trying to process an OLAP 2000 cube inside an SSIS project, using the "Analysis Services Processing Task" object. The Visual Studio project is sitting on the machine where Analysis Services 2000 is running, yet I get an error while establishing a connection to the Analysis server.
Microsoft SQL Server 2005 is also installed on that machine.
The error is: "A connection cannot be made. Ensure that the server is running."
Does anybody have an idea why I get this error?
I have an SSIS package that contains a sequence container with TransactionOption set to "Required". Within this sequence I placed several AS processing tasks and several SQL tasks. The TransactionOption of these tasks is set to "Supported".
My problem: in the case where a SQL task fails on execution, all executed tasks are rolled back except the AS processing tasks. The expected and necessary behavior is that the AS processing tasks get rolled back as well.
Has anyone got a solution or a workaround for this problem?
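No definitive fix, but one workaround is to make the Analysis Services side atomic by itself: capture all the processing commands and replay them as a single server-side transaction, e.g. via AMO's capture log. A sketch with placeholder names:

Imports Microsoft.AnalysisServices

Module TransactionalProcessing
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            ' Record the processing commands instead of executing them one by one.
            srv.CaptureXml = True
            Dim db As Database = srv.Databases.GetByName("MyOlapDB")   ' placeholder
            For Each d As Dimension In db.Dimensions
                d.Process(ProcessType.ProcessFull)
            Next
            db.Cubes.GetByName("MyCube").Process(ProcessType.ProcessFull)
            srv.CaptureXml = False

            ' Replay the captured batch in one server-side transaction:
            ' if any object fails, SSAS rolls back the whole batch.
            srv.ExecuteCaptureLog(True, True)   ' transactional:=True, parallel:=True
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module

Note this only makes the AS processing all-or-nothing on the SSAS side; coordinating a rollback with the SQL tasks would still need compensation logic in the package.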
I'm using an 'Analysis Services Processing Task' as part of an SSIS package to refresh the cube. In the property page, 'LoggingMode' is set to 'Enabled', but there are no records in the sysdtslog90 table, while all other tasks are logged in the table. How do I get it to log into the sysdtslog90 table?
We have an Integration Services package that executes a few T-SQL tasks, then processes an Analysis Services database. This has been in production for about three weeks now, and twice the package has failed with this error from the event log:
Event Type: Error
Event Source: MSSQLServerOLAPService
Event Category: (289)
Event ID: 3
Date: 7/11/2007
Time: 1:48:59 AM
User: N/A
Computer:
Description:
OLE DB error: OLE DB or ODBC error: An error has occurred while establishing a connection to the server.
When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server
does not allow remote connections.; 08001;
Communication link failure; 08S01;
TCP Provider: An existing connection was forcibly closed by the remote host.
; 08S01.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
I don't think that this error is accurate because the package and Analysis Services are on the same server.
Also, this does not happen in our development environment. Any help is appreciated.
We have set up an IS package to process an AS 2005 database (comprising cube & dimensions, etc) daily, via a SQL Server Agent job on both development and production systems. This has been working fine for months.
A new dimension was added to the cube on the development system - automatic processing via the IS package continued without issue. However, when the new dimension was added to the production system the IS package no longer processes the cube correctly. Although all appears ok (and all is present and correct in the logs), no data updates to the cube are made. Only when the cube is manually processed does the cube get updated.
Anyone got any ideas about how to get around this issue? We have created a new IS package, with a single Analysis Services Processing Task, and tried this but get the same outcome.
I have an Analysis Services Processing Task in my SSIS package. I run the SSIS package using a SQL Server Agent job; running the package is a job step.
When I process the Analysis Services objects (in practice, cubes) manually using the dtexec utility, I get a lot of log output. In case the processing fails, I get error messages that describe the error quite well. But when I run the job, the only information I get in the job log is that the job step failed. I know the failure happens in the Analysis Services Processing Task.
Is there any way in SSIS to get a) the log of the Analysis Services processing or b) the error messages of the Analysis Services processing? Or should the processing be done some other way than I've been doing?
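One option (a sketch, not the only way): run the processing from a Script Task via AMO and surface the XMLA messages yourself, so the job history shows more than "step failed". The server and database names, and the XMLA string, are placeholders:

Imports Microsoft.AnalysisServices

Module CaptureProcessingMessages
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            ' Hypothetical XMLA Process command; in a real package this
            ' string could be built from variables or scripted from SSMS.
            Dim xmla As String = _
                "<Process xmlns='http://schemas.microsoft.com/analysisservices/2003/engine'>" & _
                "<Object><DatabaseID>MyOlapDB</DatabaseID></Object>" & _
                "<Type>ProcessFull</Type></Process>"

            Dim results As XmlaResultCollection = srv.Execute(xmla)
            For Each result As XmlaResult In results
                For Each msg As XmlaMessage In result.Messages
                    ' In a Script Task, call Dts.Events.FireError/FireInformation
                    ' here so the message lands in the SSIS log and job history.
                    Console.WriteLine(msg.Description)
                Next
            Next
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module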
We find that if we deploy the OLAP database with a different name on the test server, then regardless of how we change the connection string provided to the SSIS package that processes the cube, the package fails to connect to the database. To clarify:
In development the OLAP database is called MyOlapDB and the source database is called MySqlDB. Both are on the same machine. When the application is built and released for test, the test team install the databases on a replica of the production environment (i.e. web app on one machine, OLAP DB on another, and SQL database on yet another). They also, quite rightly, implement the new test databases so they incorporate the build version number. So MyOlapDB123 and MySqlDB123 are both from build 123.
This is when the problems start. Regardless of how the connection string is specified in the job that processes the cube, the SSIS integration package fails with the error:
[Analysis Services Execute DDL Task] Error: Errors in the metadata manager. Either the database with the ID of 'MyOlapDB' does not exist in the server with the ID of 'OurTestServer', or the user does not have permissions to access the object.
We have tried config files and job properties, but neither works. Also, simply attempting to run the package using DTEXECUI does not work either.
Looking inside the XML of the package, we clearly see the ConnectionManager object, which has the original connection string:
Data Source=localhost;Initial Catalog=MyOlapDB;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;
However, editing the initial catalog here still does not solve the problem. Searching the XML for the string MyOlapDB reveals the OLAP database name in two other places - both within the object data of the two Analysis Services Execute DDL tasks.
Anyone know how to solve this problem without having to hack the XML of the package?
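This is consistent with the (hedged) explanation that the Execute DDL tasks reference the database by ID, and an SSAS database keeps its original ID ('MyOlapDB') even after it is deployed or renamed as MyOlapDB123; the ID is not in the connection string at all. A small AMO sketch to confirm what names and IDs actually exist on the test server:

Imports Microsoft.AnalysisServices

Module ListDatabaseIds
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=OurTestServer")   ' server name from the error
        Try
            ' An SSAS database's ID is fixed when it is created and survives
            ' renames, so Name and ID can differ - which is what the DDL
            ' task appears to trip over.
            For Each db As Database In srv.Databases
                Console.WriteLine("Name={0}  ID={1}", db.Name, db.ID)
            Next
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module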
I have an SSAS 2005 database "A" and an SSIS package "P" which does a full process of OLAP database "A". The SSAS server connection string is based on a variable read from an XML configuration file.
It works well in BIDS, but when I deployed it, the package failed at the step connecting to SSAS with the message "a connection cannot be made, please ensure the server is running".
In the connection string I am using a server name like servera.xx.com. If I change it to the IP address, it works; if I change it to localhost (the package happens to run on the same server), it also works.
But I need the server name solution as IP may be changed.
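A quick, hedged way to narrow this down from the machine running the package: if the name does not resolve, it is a DNS problem; if it resolves but the connection still fails, suspect SPN/Kerberos registration for the service account instead. A small check:

Imports System.Net

Module CheckNameResolution
    Sub Main()
        Dim host As String = "servera.xx.com"   ' host name from the post
        Try
            Dim entry As IPHostEntry = Dns.GetHostEntry(host)
            For Each addr As IPAddress In entry.AddressList
                Console.WriteLine("{0} resolves to {1}", host, addr)
            Next
        Catch ex As Exception
            ' Resolution failed: the connection problem starts at DNS,
            ' not at Analysis Services.
            Console.WriteLine("Could not resolve {0}: {1}", host, ex.Message)
        End Try
    End Sub
End Module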
I have been struggling with a problem with the Web Service Task. I have a package that uses this task and has to authenticate to the web service. This works fine on my development server, but if I export the package to a file and run it from another computer/user, the package fails because it doesn't log in to the web service. I have found the problem to be that the package was set to the default of EncryptSensitiveWithUserKey, and I understand why this is. If I change it to use EncryptSensitiveWithPassword, it prompts for credentials and works fine. My question is how do I use a package configuration file to supply different credentials (there is no password option for the HTTP connection), so I can standardize on the use of config files for this operation. Thanks.
I'm trying to use the Web Service Task, but when I try to run it I get the following error:
[Web Service Task] Error: An error occurred with the following error message: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: Could not execute the Web method. The error is: Object reference not set to an instance of an object.. at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebMethodInvokerProxy.InvokeMethod(DTSWebMethodInfo methodInfo, String serviceName, Object connection) at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTaskUtil.Invoke(DTSWebMethodInfo methodInfo, String serviceName, Object connection, VariableDispenser taskVariableDispenser) at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTask.executeThread()".
This is probably because I have never worked with this task before, but even so, does anyone know what might be wrong?
I have a connection and a WSDL; I can configure the service, method, and variables (I use fixed values), and I configure the output to a variable of type Object (I have also tried String and Int).
I can even download the WSDL file, but the error seems to be in the connection.
There is no way to pass variables to the input of Web Service Tasks. I heard this problem was fixed with SQL2K5 SP1, and even the online doc says that one can choose either "value" or "variable" when specifying input for Web Service Tasks, but after I installed what I think is SP1, there is still no way to do this.
If one can only specify values (hard-coded) as input to web service tasks, then this would be a very severe limitation. I hope I'm wrong, so could someone please give a pointer. Thanks
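For what it's worth, a common workaround (hedged; this bypasses the Web Service Task entirely) is to call the service from a Script Task, where package variables can be passed freely. A sketch, with the proxy class and variable names being hypothetical:

' Sketch of a Script Task body (VB.NET). "MyServiceProxy" and the variable
' names are hypothetical; the proxy class would be generated from the WSDL
' with wsdl.exe and added to the project. The variables must be listed in
' the task's ReadOnlyVariables/ReadWriteVariables.
Dim inputValue As String = CStr(Dts.Variables("User::InputValue").Value)

Dim proxy As New MyServiceProxy()
proxy.Credentials = System.Net.CredentialCache.DefaultCredentials

' Pass the package variable straight into the web method call.
Dts.Variables("User::OutputValue").Value = proxy.MyWebMethod(inputValue)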
I'm pretty stuck on a security issue in SSIS. The web service works by itself, but I can't call it from SSIS; it gives me a 401 Unauthorized error. The web service also uses impersonation of my domain admin account.
I have tried the following things:
- Setting integrated Windows authentication in IIS
- Setting the NTFS permissions of those web site directories to EVERYONE
- Using a credential/proxy in SSIS and running it from SQL Agent
- Changing the logon accounts of the MSSQLSERVER, SQLAGENT, and SQL Integration Services services to my domain admin account
I can't get anything to work. What is wrong with this thing? Microsoft's security model has gotten completely out of hand, imo.
Also in the security event log it shows all authentication as successful.
In the SSIS Web Service Task, when you specify the Service and Method in the Input tab for the WSDL file being used, it seems to prompt only for the parameters declared in the body of the WSDL, and not the header.
I need to be able to provide security information (present in the SOAP header of the WSDL), like username, password, etc., which is necessary to post any response to the web server, and I cannot see where I can supply this in the Input tab of the Web Service Task.
Is it possible to do this under SSIS 2005? How? I see I can add a reference to system.web.services.dll... but then what?
The web service was developed in vb.net/vs.net 2005 and I have no problem adding and consuming it from a web page developed using vs.net 2005 - asp.net/vb.net.
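After referencing system.web.services.dll, the usual next step (a hedged sketch; the class and member names here are hypothetical and depend on the actual WSDL) is to generate a proxy class from the WSDL with wsdl.exe, define or reuse the SOAP header class it expects, and attach the header to the proxy before calling the method:

Imports System.Web.Services.Protocols

' Hypothetical header class; wsdl.exe normally generates one matching the
' header declared in the service's WSDL.
Public Class AuthHeader
    Inherits SoapHeader
    Public Username As String
    Public Password As String
End Class

' In the calling code, set the header on the generated proxy before the call:
'
'   Dim proxy As New MyServiceProxy()       ' generated with wsdl.exe
'   Dim auth As New AuthHeader()
'   auth.Username = "someuser"
'   auth.Password = "somepassword"
'   proxy.AuthHeaderValue = auth            ' wsdl.exe exposes a <Header>Value member
'   proxy.PostResponse(...)                 ' the web method that needs the header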
Does anybody know how to convert this VBScript code to VB.NET (SQL Server 2005 Integration Services)? The code below returns a list of all the services installed on a computer and indicates their current status (typically, running or not running). It's in VBScript format.
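Since the original VBScript isn't reproduced here (it presumably used WMI's Win32_Service), a hedged VB.NET equivalent using System.ServiceProcess (add a reference to System.ServiceProcess.dll) might look like this:

Imports System.ServiceProcess

Module ListServices
    Sub Main()
        ' Enumerate all services installed on the local computer and
        ' report whether each one is running.
        For Each sc As ServiceController In ServiceController.GetServices()
            Console.WriteLine("{0} ({1}): {2}", _
                sc.ServiceName, sc.DisplayName, sc.Status)
        Next
    End Sub
End Module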
I am connecting to an SSAS cube from Excel, and I have a date dimension with 4 fields (I have others, but I don't use them for this case). I created 4 fields in order to test all the possible scenarios I could think of:
- DateKey: Type: System.Integer; Value: yyyyMMdd
- Date: Type: System.DateTime
- DateStr0: Type: System.String; Value: dd/MM/yyyy (note: I am not using US culture); Example: 01/11/2015
- DateStr1: Type: System.String; Value: %d/%M/yyyy (note: I am not using US culture); Example: 1/11/2015
Filtering on date is working fine:
Initially, in Excel, filtering on date was not working. But after changing the dimension type to Time and setting the DataType to Date, as mentioned in [URL], the filter is working fine, as you can see in the picture.
Grouping on date is not working:
I have a hierarchy in my Date dimension and I can group based on the hierarchy, no problem. But the user is used to the pre-built grouping function of Excel, and he wants to use that. The pre-built Group and Ungroup functions of Excel seem to be available, as you can see in the following picture:
But when the user clicks 'Group', Excel groups it as if it were a string, and that is the problem. The user wants to group using the pre-built grouping function available in the pivot table. I also found out that Power Pivot tables do not support this Excel grouping functionality. If I understood correctly, this pre-built grouping functionality of Excel needs to do its calculation at run time, which is not a viable solution if you have millions of rows; so Power Pivot tables do not support it, and there we need to use a dimension hierarchy to do the grouping. But I am not using a Power Pivot table, I am using a simple pivot table, so I expect the grouping functionality to work. Then I tried a simple test: I created a simple data source in Excel itself and used it as the source of my pivot table. There, grouping works fine. The only difference I can see (when double-clicking the measure value in Excel) is:
1. For the date values of my simple test, Excel considers them 'Date'.
2. For the date values of my data coming from the cube, Excel considers them 'General'.
2.1. But the value here is the same as it was in the simple test.
2.2. 'Date Filter' works just fine.
2.3. If I just select this cell and deselect it, Excel changes the type to 'Date', though only for that cell.
2.4. I created 4 different types of fields in my date dimension, thinking that the attribute values of my dimension might be the problem, but Excel considers all of them 'General'.
2.5. This value (the one seen when double-clicking on the measure) comes from the 'Name Column' of the attribute, and its defined DataType is WChar. I thought that might be the reason for the issue, so I changed it to 'Date'. But SSAS does not allow changing it to 'Date', giving the error: The 'Date' data type is not allowed for the 'NameColumn' property; 'WChar' should be used.
So I don't know what puzzle piece I am missing:
1. The date filter works, but grouping does not.
2. Excel considers it a 'General' string.
3. SSAS does not allow changing 'NameColumn' to Date.
I would like to know the best practice for running Analysis Services in terms of port usage. Is it better to run on a specific port or to have dynamic ports? We have clustered servers that run on the default port 2383, but I'm not sure what the best way to get performance is with non-clustered servers.
I am facing an issue with partition processing. I have an SSAS cube with 5 partitions. These partitions are processed through a SQL Server job using SSIS packages; in the packages I use the SSAS processing task to do this. Now the problem is, the job runs successfully and shows that the step containing the partition processing is also fine, but the data is not updated in the partition. When checking the partition properties, they are not updated with a recent date and time.
When I try to manually process the partition, it succeeds and the recent data gets reflected, with a recent date and time. The package configuration is done in the job itself.
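A hedged diagnostic for this situation: read the partitions' LastProcessed stamps over AMO right after the job runs. If none of them move, the job is very likely processing a different database or server than the one being checked (for example, because the task still holds the original object IDs). Placeholder names below:

Imports Microsoft.AnalysisServices

Module CheckLastProcessed
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            Dim db As Database = srv.Databases.GetByName("MyOlapDB")   ' placeholder
            For Each c As Cube In db.Cubes
                For Each mg As MeasureGroup In c.MeasureGroups
                    For Each p As Partition In mg.Partitions
                        Console.WriteLine("{0}.{1}.{2}: last processed {3}", _
                            c.Name, mg.Name, p.Name, p.LastProcessed)
                    Next
                Next
            Next
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module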
I'm getting this error while processing one dimension: OLE DB error: OLE DB or ODBC error: SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', see "Surface Area Configuration" in SQL Server Books Online.; 42000. My dimension contains members from two data source tables.
In another dimension I get the error: Errors in the high-level relational engine. The 'dbo_vicidial_Users' table that is required for a join cannot be reached based on the relationships in the data source view. Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Staging User', Name of 'DimUsers' was being processed.
I have designed a cube. It has two fact tables and some dimensions, and the fact tables are in a many-to-many relationship.
For example
FactMain: DataKey (PK), StartDateKey, PostCodeKey, TotalCost
FactBridge: DataKey (FK), ProductKey (FK), Position - primary key on DataKey + ProductKey + Position
DimProduct: ProductKey (PK), ProductCode
The cube is built successfully and processed successfully. But when I try to process the cube from an Agent job, I am getting the error "Attribute key not found: tablename, value...". I added a job step to run an Analysis Services command, taking the command from the cube processing script (generated by processing the cube manually and scripting it), and I used ProcessAffectedObjects = "true" in the script. When I checked the tables, the key does exist. Why am I getting this error?
We plan to process our SSAS cube nightly, after our data warehouse is loaded (SSIS package), using a SQL Agent job.
1. What is the best option to automate the processing of our cube?
2. Can this be added to our SQL Agent job?
3. As we will only be adding new dimension and fact records, should we use Process Add?
4. Does the initial load require Process Full?
5. How can we configure a processing option before the automated execution?
(One possible pattern addressing these is sketched below.)
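Not an authoritative answer, but a common pattern that maps onto these questions: Process Full once for the initial load, then Process Update on the dimensions and Process Add on the measure-group partitions each night, runnable from an Agent job step (SSIS, XMLA, or a small AMO program). A sketch with placeholder names:

Imports Microsoft.AnalysisServices

Module NightlyIncremental
    Sub Main()
        Dim srv As New Server()
        srv.Connect("Data Source=localhost")   ' placeholder server
        Try
            Dim db As Database = srv.Databases.GetByName("MyWarehouseCube")   ' placeholder

            ' Pick up new and changed dimension members without
            ' invalidating the data already in the partitions.
            For Each d As Dimension In db.Dimensions
                d.Process(ProcessType.ProcessUpdate)
            Next

            ' Append new fact rows. Note: ProcessAdd assumes the partition's
            ' source (or an out-of-line binding) is restricted to NEW rows;
            ' re-reading old rows would double-count them.
            For Each c As Cube In db.Cubes
                For Each mg As MeasureGroup In c.MeasureGroups
                    For Each p As Partition In mg.Partitions
                        p.Process(ProcessType.ProcessAdd)
                    Next
                Next
            Next
        Finally
            srv.Disconnect()
        End Try
    End Sub
End Module

For the initial load, a one-off db.Process(ProcessType.ProcessFull) covers question 4; the same code can run from a SQL Agent job step as a small executable, a Script Task, or the equivalent XMLA.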
I am getting the following errors when I process the cube in production: An error occurred while the dimension, with the ID of 'DIM_PARTICIPANT', Name of 'DIM_PARTICIPANT' was being processed. End Error
An error occurred while the 'PARTICIPANT NAME' attribute of the 'DIM_PARTICIPANT' dimension from the 'XL_GCS_SelfServices' database was being processed. End Error
An error occurred while the dimension, with the ID of 'v d Transaction', Name of 'DIM_TRANSACTION' was being processed. End Error
An error occurred while the 'INVOICE NUMBER' attribute of the 'DIM_TRANSACTION' dimension from the 'XL_GCS_SelfServices' database was being processed. End Error.
But I implemented the same in the Dev and QA servers, and no issues were found there.
In our project, we would like to use the same data source for our analysis service database cubes and for our reporting service reports.
I created the analysis service project first and deployed it successfully. When trying to set up the data source in the report model project, I selected "create a data source based on another object", and then "create a data source based on an analysis service project". However, there is no analysis service project to select, and no browse button to see where the reporting service is looking for analysis service projects either.
I have tried creating a new analysis service project with data source views, cubes, dimensions and all the stuff, but still cannot see the analysis service project in the drop down box to be selected for my reporting model project data source.
As I am fairly new to the reporting service, I'm sure I'm missing something, but couldn't find much information in the help or on the web. Any suggestion would be much appreciated.