What is it that decides whether a particular OLAP cube is small or large? Is it the number of tuples or the number of dimensions present in the cube?
Thank you.
Small and large are subjective characteristics: what is large for some people is small for others. There are also multiple dimensions along which to judge the size:
1. Number of members in the key attribute of the dimension. I would say that below 1 million is small, from 1 to 10 million is medium, and above 10 million is large.
2. Total number of attributes across all dimensions. Below 1,000 is small; above is large.
3. Number of records in the fact table at measure-group granularity, i.e. the number of records which will remain in the cube, not the number of rows in the fact table, which will be reduced by aggregation/deduplication. Or, from another angle, the number of rows loaded into the cube every day. I would say that below 10 million rows a day is small, between 10 million and 100 million rows a day is medium, and above 100 million rows a day is large.
There are, of course, interesting combinations of the above.
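The three heuristics above can be sketched as a simple classifier. This is an illustrative assumption on my part, not a formula from the post: the function name, the rule of taking the largest rating across the three axes, and the treatment of boundary values are all my own choices.

```python
# Illustrative sketch encoding the rough thresholds from the answer above.
# The "take the worst rating" rule is an assumption, not part of the post.

def classify_cube(key_members, total_attributes, rows_per_day):
    """Return 'small', 'medium', or 'large' using the largest rating
    among the three sizing heuristics."""
    ratings = []

    # 1. Members in the dimension's key attribute
    if key_members < 1_000_000:
        ratings.append("small")
    elif key_members <= 10_000_000:
        ratings.append("medium")
    else:
        ratings.append("large")

    # 2. Total attributes across all dimensions (no medium tier was given)
    ratings.append("small" if total_attributes < 1_000 else "large")

    # 3. Rows loaded into the cube per day
    if rows_per_day < 10_000_000:
        ratings.append("small")
    elif rows_per_day <= 100_000_000:
        ratings.append("medium")
    else:
        ratings.append("large")

    order = {"small": 0, "medium": 1, "large": 2}
    return max(ratings, key=order.get)

print(classify_cube(500_000, 200, 5_000_000))     # small on every axis -> small
print(classify_cube(2_000_000, 1_500, 5_000_000)) # attribute count pushes it to large
```

One design note: a cube that is large on even one axis tends to need large-cube treatment (hardware budgeting, partitioning), which is why the sketch takes the maximum rather than an average.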
Thanks a lot!
Thanks Mosha!
Could you please shed some light on the "interesting combinations of the above" if you get some time, or is that out of scope for this forum?
We are working at our company on a shared-farm service model whereby multiple SSAS databases are hosted on a single server.
If a certain app group fits the large/medium criteria, then they have to budget for their own hardware and storage.
Any additional information will be very much appreciated.
Rgds
Hari