How To Load SSAS Cube Partitions With Dynamic Name Directly From Flat Files
Feb 1, 2008
Hi,
As the cube's source data comes from MySQL, at a volume of more than 80M rows per month, I have to build multiple partitions in the cube, each containing only one month of data. Even so, loading directly from MySQL still takes too long, and because the MySQL .NET provider is not mature enough, the connection often breaks while loading, so I want to try loading from flat files exported from MySQL.
The Partition Processing Destination seems to support this, but even though it shows the partition name in the component editor, it actually processes the partition by partition ID, and there is no way to change the destination partition name for this component.
Unless I change the SSIS package every month, it looks impossible to make a smart ETL program that dynamically creates a new partition each month with YYYYMM as both ID and name, and loads data directly from the flat CSV file exported from the big MySQL table.
Does anyone know how to load data from flat text files while also supporting a dynamic destination partition name?
I have also found a bug in the Partition Processing Destination, even in 2005 SP2: if an SSAS database contains more than one cube, and two partitions in different cubes have the same name, the component editor will not let you map to the second one. Even if you point at the second one and click OK, the next time you open the editor it is highlighted on the first one. And although the editor shows names, it actually processes by partition ID, so you cannot force the component to process a given partition by renaming it to a constant name, say currentMonth.
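For reference, one workaround for the name-vs-ID problem is to skip the Partition Processing Destination entirely and create and process the partition from an SSIS Script Task with AMO, after bulk-loading the flat file into a relational staging table. Below is a minimal, hedged sketch; the server, database, cube, measure group, and staging table names are all placeholders, not anything from the original post:

// Hedged AMO sketch (SSAS 2005+): create this month's partition with YYYYMM
// as both ID and name, then fully process it. All object names are placeholders.
using System;
using Microsoft.AnalysisServices;

class MonthlyPartition
{
    static void Main()
    {
        string month = DateTime.Today.ToString("yyyyMM");
        Server server = new Server();
        server.Connect("Data Source=localhost");
        Database db = server.Databases["MyOlapDatabase"];
        MeasureGroup mg = db.Cubes["MyCube"].MeasureGroups["MyMeasureGroup"];
        if (mg.Partitions.FindByName(month) == null)
        {
            Partition p = mg.Partitions.Add(month, month); // name, then ID
            p.StorageMode = StorageMode.Molap;
            p.Source = new QueryBinding(db.DataSourceViews[0].DataSourceID,
                "SELECT * FROM FactStaging WHERE MonthKey = " + month);
            p.Update(UpdateOptions.ExpandFull);
        }
        mg.Partitions.GetByName(month).Process(ProcessType.ProcessFull);
        server.Disconnect();
    }
}

Because the partition's ID and name are both set to YYYYMM here, the ID-based behavior of the Partition Processing Destination stops mattering.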
Thanks.
Jun
View 4 Replies
May 9, 2007
The environment here is an SSIS ETL process feeding a fact table. The fact table is pulled into SSAS as a cube, and reporting services are handled there. I am on the ETL side and don't pretend to know all the processing that happens with cubes, etc.
What we are trying to accomplish is to add partitions to a Cube via the ETL processing. The partitions should be incremented by Day, i.e. 20070501, 20070502, etc.
This is currently done manually by the reporting developer, and we are looking for an automated process to reduce errors and manual work.
I have explored the Partition Processing data flow destination and did not find much documentation or examples on its use. If you have any information on this component, please reply.
The other option is the Analysis Services Processing task in the control flow. I understand that we can process Analysis Services objects as part of our package. Is there a way to incorporate a partitioning script in this object? If so, how? Again, I did not find detailed examples on the use of this object.
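Neither built-in component accepts a partition-creation script directly, but two common patterns are an Analysis Services Execute DDL Task running XMLA built into a variable, or a Script Task using AMO. A hedged AMO sketch for processing one daily partition (all object names are placeholders, not taken from the post):

// Hedged sketch: fully process one daily partition (e.g. 20070502) via AMO.
using System;
using Microsoft.AnalysisServices;

class DailyPartitionProcess
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("Data Source=localhost");
        MeasureGroup mg = server.Databases["MyOlapDatabase"]
            .Cubes["MyCube"].MeasureGroups["MyMeasureGroup"];
        string day = DateTime.Today.ToString("yyyyMMdd"); // 20070501-style names
        Partition p = mg.Partitions.FindByName(day);      // null if it does not exist
        if (p != null)
            p.Process(ProcessType.ProcessFull);
        server.Disconnect();
    }
}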
If you have experience with either of these, please feel free to reply. I appreciate any and all comments.
View 5 Replies
Apr 23, 2008
Hi everyone,
I have multiple flat files in a source folder (they follow a naming convention with today's date, e.g. Flatfile_20082204_1, Flatfile_20082204_2, Flatfile_20082204_3).
I need to extract each file dynamically, transform it, and then load it into the destination folder with a standard prefix, today's date, and a sequence letter, e.g. Flatfile_20082304_A, Flatfile_20082304_B, Flatfile_20082304_C.
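In SSIS this is typically a Foreach Loop with the Foreach File enumerator feeding the data flow, with the output file name built by an expression. As a hedged illustration of just the renaming step, here is a Script Task-style sketch; the folder paths, the .txt extension, and the yyyyddMM date layout are assumptions read off the examples above:

// Hedged sketch: copy each source file to the destination folder with a
// standard prefix, today's date, and an A, B, C... sequence letter.
using System;
using System.IO;

class RenameWithSequence
{
    static void Main()
    {
        string src = @"C:\SourceFolder", dst = @"C:\DestinationFolder";
        string today = DateTime.Today.ToString("yyyyddMM"); // assumed from 20082204 -> 20082304
        char seq = 'A';
        foreach (string file in Directory.GetFiles(src, "Flatfile_*"))
        {
            string target = Path.Combine(dst, "Flatfile_" + today + "_" + seq + ".txt");
            File.Copy(file, target, true); // the transform step would run before this copy
            seq++;
        }
    }
}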
Please help me.
Thanks in advance.
View 20 Replies
Jul 13, 2007
Hi All,
I am totally new to SSIS and in the learning phase. I have a requirement as below.
I have two flat files (mainframe files); the structure is given below.
File1:
070113
12345johnk
23456james
The 1st row is a header record with the date in YYMMDD format; the remaining rows have emp no and emp name.
File2:
070113
070113
070113
070113
It contains 4 records, which are dates.
The requirement is to compare the header date in file1 with the 4 dates in file2. If they are equal, it should load all the records in file1 except the header into a table; if they do not match, it should log an error message. Could someone provide a lead on this?
The files have the same record length and fixed-width fields.
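A hedged sketch of the comparison logic, written as a standalone program (the file names and the 5-character emp-no width are assumptions based on the samples above); inside a package, the same check fits a Script Task that sets a Boolean variable used in a precedence constraint before the data flow:

// Hedged sketch: compare file1's header date with every date in file2,
// then read the detail records only when all dates match.
using System;
using System.IO;
using System.Linq;

class HeaderDateCheck
{
    static void Main()
    {
        string header = File.ReadLines("file1.txt").First().Trim(); // e.g. 070113
        bool allMatch = File.ReadLines("file2.txt").All(l => l.Trim() == header);
        if (!allMatch)
        {
            Console.Error.WriteLine("Header date mismatch - file1 not loaded.");
            return;
        }
        foreach (string rec in File.ReadLines("file1.txt").Skip(1))
        {
            string empNo = rec.Substring(0, 5);   // assumed 5-character emp no
            string empName = rec.Substring(5);
            // insert empNo/empName into the destination table here
        }
    }
}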
Thanks in advance
raj
View 1 Replies
May 12, 2014
We have a few customers dropping files in Amazon S3. How can we load this data into a SQL Server 2008 R2 database using SSIS? We are in a 2008 R2 BIDS environment.
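SSIS 2008 R2 has no native S3 source, so the usual pattern is to download the files first (the AWS SDK for .NET in a Script Task, or the AWS CLI in an Execute Process Task) and then load them with an ordinary Flat File Source. A hedged sketch; the bucket, key, local path, and credentials are placeholders:

// Hedged sketch: pull one object from S3 to a local staging file with the
// AWS SDK for .NET, then let a normal data flow load the staged file.
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

class S3Staging
{
    static void Main()
    {
        var client = new AmazonS3Client("ACCESS_KEY", "SECRET_KEY",
                                        RegionEndpoint.USEast1);
        var transfer = new TransferUtility(client);
        transfer.Download(@"C:\Staging\customer_file.csv",
                          "customer-bucket", "inbox/customer_file.csv");
        // a Flat File Source or BULK INSERT then loads C:\Staging\customer_file.csv
    }
}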
View 5 Replies
May 29, 2015
How do you load multiple flat files into a destination dynamically?
View 9 Replies
Jun 23, 2015
I want to load flat files into a single table, but the flat files can have a variable number of columns, up to a maximum of 10. The table in my database has 10 columns, so if I load a flat file having 6 columns, the rest of the columns in the table will be null. I don't want to use a Script task for this, as I am not good at writing C# code.
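For reference, should a small script become acceptable after all, the usual workaround is to read each line as a single wide column and split it, padding the missing trailing columns with nulls. A hedged sketch of just that splitting logic; the comma delimiter is an assumption:

// Hedged sketch: split a delimited line into exactly 10 fields, padding
// missing trailing fields with null so they load as NULL in the table.
using System;

class VariableColumnSplit
{
    static string[] SplitRow(string line)
    {
        string[] fields = new string[10];   // destination table has 10 columns
        string[] parts = line.Split(',');   // assumed comma-delimited
        for (int i = 0; i < 10; i++)
            fields[i] = i < parts.Length ? parts[i] : null;
        return fields;
    }

    static void Main()
    {
        string[] row = SplitRow("a,b,c,d,e,f"); // 6 columns -> last 4 are null
        Console.WriteLine(string.Join("|", row));
    }
}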
View 5 Replies
Feb 6, 2007
Hi,
My scenario:
I have 4 different flat file types, each having a different number of columns, order of columns, etc. I want to upload all 4 types into the same destination table in the SQL database. Before uploading, I need to apply a transformation to each column in the flat files. The transformations could be like:
1) Multiplying the source column by 100
2) Putting an if-condition on 2 source columns and then selecting one column to be copied into the destination.
I have the flat files schema with me and also all the transformations that are required.
Question:
Can SSIS provide me with a component that can read the flat file schema and the transformations from the database, apply them to the source data, and then upload it to the fixed destination table? Can a Derived Column transformation be given its input column list and the transformation to perform on each column dynamically?
Why do I want it this way?
In future there can be additional flat file formats, and we want to keep changes to the SSIS packages to a minimum. Just entering the additional schema and transformation details in the database should let the package run successfully against the new flat file.
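The built-in Derived Column transformation is fixed at design time, so a fully metadata-driven flow generally means a Script Component that loads the rules at runtime. A hedged sketch of the rule-application idea only; the rule table, rule shapes, and column names are all assumptions:

// Hedged sketch: apply per-column transformation rules that would be loaded
// from a metadata table (e.g. a hypothetical TransformRules table).
using System;
using System.Collections.Generic;

class MetadataDrivenTransform
{
    // column name -> rule taking (value, fallback) and returning the output
    static Dictionary<string, Func<double, double, double>> rules =
        new Dictionary<string, Func<double, double, double>>
        {
            { "Amount",   (v, alt) => v * 100 },        // rule 1: multiply by 100
            { "Quantity", (v, alt) => v > 0 ? v : alt } // rule 2: pick a column by condition
        };

    static double Apply(string column, double value, double fallback)
    {
        return rules.ContainsKey(column) ? rules[column](value, fallback) : value;
    }

    static void Main()
    {
        Console.WriteLine(Apply("Amount", 12.5, 0)); // 1250
        Console.WriteLine(Apply("Quantity", -3, 7)); // 7
    }
}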
Thanks for your time.
Regards,
$wapnil
View 15 Replies
Aug 19, 2015
I have defined a stored procedure with one parameter. With this parameter I am able to control which year of the sales amount data is selected. This works fine.
Now I want to use this stored procedure as the source of the partitions, but when I do, I get an error. The syntax check says everything is fine, but when I process the partition with the command "exec dst.fact_umsatz_year 0" I get the following error (originally in German):
OLE DB error: OLE DB or ODBC error: Incorrect syntax near ')'.; 42000; Incorrect syntax near the keyword 'exec'.; 42000.
Error in the OLAP storage engine: An error occurred while processing the FACT Umsatz Pivot View partition of the Anzahl Kunden measure group for the Vertrieb cube from the OLAP AS database.
View 2 Replies
May 5, 2008
Hi,
I am currently using SSIS, a SQL Server 2000 database, and Analysis Services 2000 for the cube.
I create a new table every day, with names like day_20080504, day_20080505, etc.
Then I go to Analysis Services, process the dimensions (incremental), and
create a new partition using the old partition as a template.
My first question is how to create a new partition every day using the old partition as a template (almost the same except for the source table).
My second question: can I do this on Analysis Services 2000, or should I convert my cube to SSAS 2005?
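On Analysis Services 2000 this kind of automation goes through the old DSO object model; after a move to SSAS 2005, AMO makes the clone-from-template pattern fairly direct. A hedged AMO sketch; every object and table name below is a placeholder:

// Hedged sketch (SSAS 2005 AMO): clone yesterday's partition as a template
// for today's, then point it at the new day_YYYYMMDD table and process it.
using System;
using Microsoft.AnalysisServices;

class PartitionFromTemplate
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("Data Source=localhost");
        Database db = server.Databases["MyOlapDatabase"];
        MeasureGroup mg = db.Cubes["MyCube"].MeasureGroups["MyMeasureGroup"];

        string today = DateTime.Today.ToString("yyyyMMdd");
        string yesterday = DateTime.Today.AddDays(-1).ToString("yyyyMMdd");

        Partition clone = mg.Partitions.GetByName("day_" + yesterday).Clone();
        clone.ID = "day_" + today;
        clone.Name = "day_" + today;
        clone.Source = new QueryBinding(db.DataSourceViews[0].DataSourceID,
            "SELECT * FROM day_" + today);
        mg.Partitions.Add(clone);
        clone.Update(UpdateOptions.ExpandFull);
        clone.Process(ProcessType.ProcessFull);
        server.Disconnect();
    }
}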
thanks,
J
View 11 Replies
Feb 19, 2008
Hi,
Actually we have multiple subreports which we want to open from the main report. Initially, when I used the "Jump to report" property, it worked fine, but in the browser the main report was refreshed every time I went back to it from a subreport.
The second solution I want to try is the "Jump to URL" property.
Some Details:
I have multiple parameters in the report.
My parameters are getting data from a Cube.
Parameters can have multiple values selected at a time.
I want to pass all the values to the URL.
Say
Parameter1 is DimCountryCountry
can have values such as:
[Dim Country].[Country].[All]
[Dim Country].[Country].&[India]
[Dim Country].[Country].&[Germany]
......
Parameter2 is DimCountryState
can have values such as:
[Dim Country].[State].[All]
[Dim Country].[State].&[Delhi]
[Dim Country].[State].[Punjab]
.....
Now I want to create a URL using these values. Since each parameter should be separated by "&" (ampersand), my URL looks like the one below.
The following URL works fine:
javascript:void(window.open('http://MyReportServer/ReportServer/Pages/Report.aspx?/MyReports/Project1/Country+Data&rc:Parameters=false&DimCountryCountry=[Dim Country].[Country].[All]&DimCountryState=[Dim Country].[State].[All]'))
As long as there is no "&" in the parameter value, the URL works fine.
But when I try to insert values other than "[All]" it starts failing, because "[Dim Country].[Country].&[India]" contains "&", which gives an error due to ambiguity (whether this "&" is a parameter separator or part of the parameter value).
So I found a solution for that: replace "&" with the "%26" escape sequence. This works fine as long as we specify the escape sequence in the address bar, i.e. if we paste the following URL directly into the address bar, it will work.
http://MyReportServer/ReportServer/Pages/Report.aspx?/MyReports/Project1/Country+Data&rc:Parameters=false&DimCountryCountry=[Dim Country].[Country].%26[India]&DimCountryState=[Dim Country].[State].%26[Delhi]
But if we put this URL in the JavaScript, it does not work, because the window.open function automatically replaces the "%26" characters with "&", which, as we know, gives the ambiguity error.
Please let me know if you have any suggestion for it, or any better solution for my problem.
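One workaround often suggested for this class of problem is double-encoding: emit "%2526" so that the single round of decoding applied before window.open still leaves "%26" in the final URL. A hedged sketch of building such a value (the Replace-based encoding is an assumption, not something from the original post):

// Hedged sketch: double-encode '&' in MDX member keys so one round of
// decoding still leaves %26 for the report server to interpret.
using System;

class DoubleEncode
{
    static string EncodeMdxValue(string value)
    {
        return value.Replace("&", "%2526"); // %25 is '%', so this decodes to %26
    }

    static void Main()
    {
        Console.WriteLine(EncodeMdxValue("[Dim Country].[Country].&[India]"));
        // -> [Dim Country].[Country].%2526[India]
    }
}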
Thanks
Sumit
www.sumitkhemka.com
View 4 Replies
Oct 3, 2006
I have thousands of records in .txt format to be loaded into the DB, which I then need to mine. Is the only way to write a program that reads them into the DB, or can I load them directly? And I also have to design the tables based on these data myself, right? Thank you.
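Flat text files can usually be loaded without a hand-rolled parser: BULK INSERT, bcp, or the Import/Export Wizard all handle them, and from .NET code SqlBulkCopy is a small step up. A hedged sketch; the file layout, table, and connection string are placeholders:

// Hedged sketch: bulk-load a tab-delimited text file into an existing table.
using System.Data;
using System.Data.SqlClient;
using System.IO;

class LoadTextFiles
{
    static void Main()
    {
        DataTable table = new DataTable();
        table.Columns.Add("Col1");
        table.Columns.Add("Col2");
        foreach (string line in File.ReadLines(@"C:\Data\records.txt"))
            table.Rows.Add(line.Split('\t'));

        using (SqlBulkCopy bulk = new SqlBulkCopy(
            "Server=localhost;Database=Mining;Integrated Security=true"))
        {
            bulk.DestinationTableName = "dbo.Records";
            bulk.WriteToServer(table);
        }
    }
}

And yes, the destination tables have to exist first, so designing them from the file layout is still a manual step.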
View 6 Replies
Aug 4, 2006
Hi,
I have a problem with processing my cube. My fact table (with telephone data) contains about 400,000 records and is increasing rapidly (400,000 records is about 8 months of data).
I have a few dimensions:
Dimension User: about 200 records
Dimension Line: about 200 records
Dimension Direction: 4 records
Dimension Date: 365 records for each year
Dimension TimeInterval: with 24 records
So far so good... when I process these dimensions I have no problem.
However, when I add a dimension (CalledNumber, with exactly 101 records) the processing hangs as soon as it starts...
The SQL performed when processing the cube looks like this:
SELECT field1, field2, ... fieldn
FROM table1, table2, ... tablem
WHERE
(table1.id = table2.table1id)
AND
(table2.id = table3.table2id)
...
When I execute the above SQL in Query Analyzer from SQL Server Enterprise Manager, it ALSO hangs.
I am not really surprised by that, because this SQL first creates a huge table of 400,000 x 200 x 200 x 4 x 365 x 24 x 101 records and only afterwards works through the WHERE clauses to filter out the appropriate records.
For me it would be more logical to use the following code to process the cube, but that cannot be changed in Analysis Manager:
SELECT field1, field2, ... fieldn
FROM table1
LEFT JOIN table2 ON (table1.id = table2.table1id)
...
LEFT JOIN tablem ON (tablem.id = tablem-1.tablemid)
When I execute the above SQL in Query Analyzer from SQL Server Enterprise Manager, it does NOT hang, but performs the query in about 35 seconds.
But Analysis Manager does not allow me to change the SQL used for processing the cube.
What can I do to add more dimensions to my cube? (There will be more anyway after adding the CalledNumber dimension.)
Any suggestions?
PS. Forgot to mention: I use SQL Server 2000.
View 1 Replies
Sep 6, 2007
I have an Excel workbook containing a pivot table whose data source is an OLAP cube. My problem occurs on the client machine, logged in as the client. I remote into the PC and create a pivot table using the OLAP cube connection; the pivot table works fine, and I am able to browse the data with no issues. But once I exit the workbook and come back in, I am no longer able to connect to the data source. I have tried both saving the password in the connection and not saving it; it has made no difference.
View 1 Replies
May 21, 2015
How can I use this MDX script in the calculation part of a cube? Do I simply put it into the script, starting with CREATE MEMBER CURRENTCUBE.[Measures].[test]?
select
[measures].[abc] on 0,
[xyz].[xyz].&[0] : [xyz].[xyz].&[60] on 1
from
(
select
tail([month].[month].[month].members, 6) on 0
from
[cube]
)
View 3 Replies
Dec 13, 2010
Is it possible to have more than one cube under one SSAS database? For example, I have a database "Test" in which the cube "TestCube" exists; is it possible to deploy another cube "TestCube2" under the Test database?
If yes, what is the process to do that? The reason I am asking is that some common dimensions are used in both cubes, and I am not sure of the best way to use the shared dimensions.
View 6 Replies
Nov 4, 2009
I am getting the following error when I process the cube in SSAS 2008: "Errors in the back-end database access module. The size specified for a binding was too small, resulting in one or more column values being truncated. Errors in the OLAP storage engine: An error occurred while the 'Policy Type' attribute of the 'Policy Type' dimension from the 'MyDemo' database was being processed." I verified the data type and column length for the PolicyType column in the dimension as well as in all fact views.
View 9 Replies
Nov 12, 2007
Hi guys,
This is my first post; could anyone help me out? I am trying to refresh a cube that I created by using a new script command, but I get the following error: "The script contains the statement, which is not allowed." On the Microsoft page I've read that for refreshing you need to create a new script. Is there another way to refresh the cube?
Cheers
View 5 Replies
Aug 7, 2015
I want to implement population data in a sales cube.
The fact table has a customer code, which is a foreign key to the Customer master dimension, which in turn is linked to a census data dimension. The census data dimension has city-wise population data with foreign keys to zone and state.
We want to add the population data to the fact table.
View 3 Replies
Oct 25, 2010
I have an .abf file which I am attempting to restore. I go to Management Studio and attempt to restore the cube.
However, whenever I attempt the restore, the following error message occurs:
"File 'C:/.......' specified in restore command is damaged or is not an AS backup file.
The following error occurred:
Access is denied." (Microsoft SQL Server 2008 R2 Analysis Services)
View 4 Replies
Jun 21, 2013
How can I deploy an existing cube with a new name on the same server?
View 4 Replies
May 21, 2015
I am thinking about a possible design where the cube never goes offline.
Usually when I make code changes to my cube, the cube goes offline and I need to Full Process it again to get it back.
However, in cases where the cube is extremely critical for the business users, it would be great if I could deliver a solution where the cube never goes down.
View 4 Replies
Jun 12, 2015
How can I connect an SSAS cube to the server? I have no SSAS server instance...
View 3 Replies
Aug 24, 2015
I need to create 3 fact tables in a cube with dimensions. Is it possible? What are the steps to do it?
View 3 Replies
Sep 8, 2015
Background: I have a huge fact-dimension table (the table has both measures and dimensions) that I am using to build an SSAS cube.
The table didn't have a unique identifier, so the database team added ROW NUM as a column, which I am using as the primary key in my cube build. I was able to create the cube successfully without any issues.
Problem: Now customers are asking for a 'Claim Count' calculation which shows the distinct claim count.
It is defined as:
Count(Distinct Claim_Number || Claim Year || Claim Month)
All 3 columns are available in the table, but when I try to create this count-distinct object in the DSV, the cube processing time increases fivefold, as I now have to use a GROUP BY in my SQL (there are around 30 columns to group by).
Is there a better way to achieve this Count(Distinct Claim_Number || Claim Year || Claim Month) without using GROUP BY in the DSV SQL logic? I can't seem to find any Count(Distinct) function among the cube calculation functions.
Environment: SSAS 2012 Multidimensional Model
View 4 Replies
Apr 16, 2015
I am working on SQL 2012. We have an SSAS cube built; on top of it, clients use Excel to connect to the cube and view reports. My cube processes every hour and takes almost 1-2 minutes to process. Whenever the end user/client views or refreshes the Excel report (which uses the cube as its source) WHILE THE CUBE IS PROCESSING, they get an error that the source is not available.
We have tried our best but cannot bring the processing time of the cube down to 3-5 seconds, so end users still face the report refresh issue at the moment of cube processing.
Requirement: in case an end user views the report while the cube is processing in the back end, instead of an error they should see a customized message we provide, something like "Data is refreshing, please wait".
View 5 Replies
Jan 14, 2008
Hi,
I have built two perspectives in a cube. Let's say: Sales and Projects. When I open my cube in Excel 2007 and I choose to open the perspective 'Sales' I see all sales related dimensions and attributes. When I open the perspective Projects I see all projects related dimensions and attributes.
So far so good...
But now I have generated a report model from the cube. When I start Report Builder I can choose between three perspectives, because the whole cube is added as a perspective as well. Looks pretty good, and now I open the Sales perspective, but...
I see all the entities in the model, even the ones from the 'Projects' perspective! They don't have all the functionality, but they are visible with all their attributes.
I hope somebody knows the solution to this problem. Thanks in advance.
Julian Kooiker
View 10 Replies
Jul 3, 2006
Hey Forum Community
I seem to have a problem creating a model for my AS cube through Report Manager.
I have set up the data source as follows:
Name: Gates Aust Business Intelligence
Connection Type: MS SQL Server Analysis Services
Connection String: Data Source=MANT4003;initial catalog="GAPL Sales Analysis"
Connect Using: Windows Integrated Security
This part works, but when I go to generate the model, I get the following error in the browser:
"Cannot create a connection to data source ''. (rsErrorOpeningConnection)
For more information about this error navigate to the report server on the local server machine, or enable remote errors."
In the Reporting Services log file I have the following errors:
w3wp!library!7!07/03/2006-10:51:40:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.DataSourceOpenException: Cannot create a connection to data source ''., ;
Info: Microsoft.ReportingServices.Diagnostics.Utilities.DataSourceOpenException: Cannot create a connection to data source ''. ---> Microsoft.AnalysisServices.AdomdClient.AdomdConnectionException: The connection either timed out or was lost. ---> System.IO.IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. ---> System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
--- End of inner exception stack trace ---
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.IO.BufferedStream.Read(Byte[] array, Int32 offset, Int32 count)
at Microsoft.AnalysisServices.AdomdClient.DimeRecord.ForceRead(Stream stream, Byte[] buffer, Int32 length)
at Microsoft.AnalysisServices.AdomdClient.DimeRecord.ReadHeader()
at Microsoft.AnalysisServices.AdomdClient.DimeRecord..ctor(Stream stream)
at Microsoft.AnalysisServices.AdomdClient.DimeReader.ReadRecord()
at Microsoft.AnalysisServices.AdomdClient.TcpStream.GetDataType()
--- End of inner exception stack trace ---
at Microsoft.AnalysisServices.AdomdClient.XmlaClient.EndRequest()
at Microsoft.AnalysisServices.AdomdClient.XmlaClient.CreateSession(ListDictionary properties, Boolean sendNamespaceCompatibility)
at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.XmlaClientProvider.Microsoft.AnalysisServices.AdomdClient.AdomdConnection.IXmlaClientProviderEx.CreateSession(Boolean sendNamespaceCompatibility)
at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.ConnectToXMLA(Boolean createSession, Boolean isHTTP)
at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.Open()
at Microsoft.ReportingServices.Library.RSService.OpenDataSourceConnection(DataSourceInfo dataSourceInfo, CreateDataExtensionInstance createDataExtensionInstanceFunction, Boolean isUnattendedExecution, Boolean unwrapConnection, IntPtr clientToken, IDbConnection& unwrappedConnection)
--- End of inner exception stack trace ---
w3wp!library!1!07/03/2006-10:51:59:: i INFO: Call to GetPermissions:/
w3wp!library!1!07/03/2006-10:53:02:: i INFO: Call to GetPermissions:/
I have tried searching the forums and cannot find anything really helpful. I had one cube set up and a model generated before, and all was working, but I had to delete it, and I recreated the SSAS cube from scratch.
My user account has admin rights on the cube and also on the servers.
Server is Windows 2003
Client is Windows XP.
Anyone got any thoughts on this, or on how I can determine what is wrong?
Many Thanks
Scotty
View 2 Replies
May 7, 2015
I have only 1 denormalized table that is being used in an SSAS Tabular model (about 3 GB). I am doing a POC to convert it into an SSAS Multidimensional model and explore it.
Table1
-----------------------
StoreName
StoreDesc
ItemName
ItemDesc
Qty1
Cost1
Price1
Amount1
1st question) I see that there is no primary key (unique key) in the current denormalized table (the Tabular model didn't require a primary key), but I think for Multidimensional the key is mandatory? Should I generate a composite key myself in a named query based on this table (in the DSV)?
2nd question) What is the best way to design my Multidimensional cube/dimensions based on this single table?
Say I come up with a composite primary key called PK_ID. Should I be splitting up my facts/dimensions in my DSV using named queries similar to the ones below (using the same PK for my dimension tables also)?
a) FactTable = Select PK_ID, Qty1,Cost1,Price1,Amount1 from Table1
b) StoreDim = Select PK_ID, StoreName, StoreDesc from Table1
c) ItemDim = Select PK_ID, ItemName,ItemDesc from Table1
Would this work?
View 3 Replies
Oct 31, 2012
All I get back is the error message "Analysis Services Processing Task Error: A connection cannot be made. Ensure the server is running." The server is running: I can process the cube by connecting to the AS instance and right-click processing it.
I can also process the cube by running the SSIS task inside SSDT. Only when I deploy the SSIS package (in Project mode) and then execute it do I get the error message.
SQL Server, SSAS, and SSIS processes are all running under the same account. SSAS is on a separate server from SSIS and SQL, if that matters.
View 2 Replies
Jul 22, 2015
For example: I have one dimension named "Name"; under it there are "FirstName" and "LastName" attributes. When I drag the "Name" dimension, by default "First Name" is dragged, but I want "Last Name" to be dragged instead.
View 6 Replies
Oct 2, 2015
Is there any way to copy my data of 2015 to the planning/forecasting value of 2016?
My question is based on the fact that we currently use INFOR ION BI, where we can just add a button in our reports which physically copies the value from one year to the next, based on some other rules in the cube.
Now I need to make this example work with SSAS and Excel PivotTables, but I can't figure out how.
I have absolutely no clue where and how to accomplish it. Do I use calculations, do I use actions, or do I do it in the data view, in the cube, or directly in Excel?
View 4 Replies