How to Scale Out a SharePoint 2010 Farm From Two-Tier to Three-Tier By Adding A Dedicated Application Server

by Admin 23. September 2014 01:34

Many small to medium-sized organizations start using SharePoint in a “two-tier” server farm topology.  The two tiers consist of:

  1. Tier 1 – SharePoint Server with all web page serving and all Service Applications running on it
  2. Tier 2 – A SQL Server to store the SharePoint databases – the SQL Server could be dedicated to the farm or it might be shared with other non-SharePoint applications.

 

This farm topology can frequently support companies with hundreds of employees.  Much depends on the hardware specifications, but with late-model quad-core Xeons in both servers, 8–16 GB of RAM on each, and RAID arrays built from 15k RPM SAS drives in the SQL Server, this configuration with SharePoint Server 2010 can perform very well in many organizations with fewer than 1,000 users.

At some point, an organization that started with this two-tier topology may want to scale out to the next level which is a three-tier topology.  The three tiers would be:

  1. Tier 1 – SharePoint Server dedicated as a Web Front-End (WFE) with only the web application(s) and the search query service running on it
  2. Tier 2 – SharePoint Server dedicated as an Application Server with all of the other service applications running on it, but no web applications or query service
  3. Tier 3 – SQL Server for the databases

Visually, this topology looks like this:

 

There are many different reasons why a company might want to scale out from two tiers to three.  Some kind of performance improvement is frequently the driver, though not always the obvious one of faster page serving for end users.  For instance, I frequently see companies do this to move the search crawling and index building process to a separate server that is tuned for its unique resource requirements and can crawl and index the company’s content more efficiently.  Perhaps in the two-tier approach the crawl/index component can’t get enough hardware resources to crawl all of the content on a timely basis.

One more point: many organizations will also choose to add a second WFE when they scale out to a three-tier farm.  (I don’t show this in the diagram above.)  The second WFE is configured exactly like the first one, and some type of network load balancing (NLB) mechanism is put in front of the WFEs to route user traffic intelligently between the two servers.  In this scenario, the three-tier farm diagram above would gain a second WFE, and the total number of servers in the SharePoint farm would be four.

Hardware and software requirements for SharePoint 2013

by Admin 28. March 2014 04:18

Hardware requirements—web servers, application servers, and single server installations

| Installation scenario | Deployment type and scale | RAM | Processor | Hard disk space |
|---|---|---|---|---|
| Single server with a built-in database or single server that uses SQL Server | Development or evaluation installation of SharePoint Server 2013 or SharePoint Foundation 2013 with the minimum recommended services for development environments | 8 GB | 64-bit, 4 cores | 80 GB for system drive |
| Single server with a built-in database or single server that uses SQL Server | Development or evaluation installation of SharePoint Server 2013 or SharePoint Foundation 2013 running Visual Studio 2012 and the minimum recommended services for development environments | 10 GB | 64-bit, 4 cores | 80 GB for system drive |
| Single server with a built-in database or single server that uses SQL Server | Development or evaluation installation of SharePoint Server 2013 running all available services | 24 GB | 64-bit, 4 cores | 80 GB for system drive |
| Web server or application server in a three-tier farm | Pilot, user acceptance test, or production deployment of SharePoint Server 2013 or SharePoint Foundation 2013 | 12 GB | 64-bit, 4 cores | 80 GB for system drive |

Minimum recommended services for development environments

The following are the minimum SharePoint 2013 services and service applications that are recommended for development environments:

  • App Management service application
  • Central Administration web site
  • Claims to Windows Token service (C2WTS)
  • Distributed cache service
  • Microsoft SharePoint Foundation 2013 Site and Subscription Settings service
  • Secure Store Service
  • User Profile service application (SharePoint Server 2013 only)

Minimum software requirements

This section provides minimum software requirements for each server in the farm.

Minimum requirements for a database server in a farm:

  • One of the following:
    • The 64-bit edition of Microsoft SQL Server 2012.
    • The 64-bit edition of SQL Server 2008 R2 Service Pack 1
  • The 64-bit edition of Windows Server 2008 R2 Service Pack 1 (SP1) Standard, Enterprise, or Datacenter or the 64-bit edition of Windows Server 2012 Standard or Datacenter
  • Microsoft .NET Framework version 4.5
  • The SharePoint parsing process crashes in Windows Server 2008 R2 (KB 2554876)
  • FIX: IIS 7.5 configurations are not updated when you use the Server Manager class to commit configuration changes (KB 2708075)
  • Hotfix: ASP.NET (SharePoint) race condition in .NET 4.5 RTM:
    • Windows Server 2008 R2 SP1 (KB 2759112)
    • Windows Server 2012 (KB 2765317)

Minimum requirements for a single server with built-in database:

  • The 64-bit edition of Windows Server 2008 R2 Service Pack 1 (SP1) Standard, Enterprise, or Datacenter or the 64-bit edition of Windows Server 2012 Standard or Datacenter
  • The SharePoint parsing process crashes in Windows Server 2008 R2 (KB 2554876)
  • FIX: IIS 7.5 configurations are not updated when you use the Server Manager class to commit configuration changes (KB 2708075)
  • Hotfix: ASP.NET (SharePoint) race condition in .NET 4.5 RTM:
    • Windows Server 2008 R2 SP1 (KB 2759112)
    • Windows Server 2012 (KB 2765317)
  • The Setup program installs the following prerequisite for a single server with built-in database:
    • Microsoft SQL Server 2008 R2 SP1 - Express Edition
  • The Microsoft SharePoint Products Preparation Tool installs the following prerequisites for a single server with built-in database:
    • Web Server (IIS) role
    • Application Server role
    • Microsoft .NET Framework version 4.5
    • SQL Server 2008 R2 SP1 Native Client
    • Microsoft WCF Data Services 5.0
    • Microsoft Information Protection and Control Client (MSIPC)
    • Microsoft Sync Framework Runtime v1.0 SP1 (x64)
    • Windows Management Framework 3.0 which includes Windows PowerShell 3.0
    • Windows Identity Foundation (WIF) 1.0 and Microsoft Identity Extensions (previously named WIF 1.1)
    • Windows Server AppFabric
    • Cumulative Update Package 1 for Microsoft AppFabric 1.1 for Windows Server (KB 2671763)

Minimum requirements for front-end web servers and application servers in a farm:

  • The 64-bit edition of Windows Server 2008 R2 Service Pack 1 (SP1) Standard, Enterprise, or Datacenter or the 64-bit edition of Windows Server 2012 Standard or Datacenter.
  • The SharePoint parsing process crashes in Windows Server 2008 R2 (KB 2554876)
  • FIX: IIS 7.5 configurations are not updated when you use the Server Manager class to commit configuration changes (KB 2708075)
  • Hotfix: ASP.NET (SharePoint) race condition in .NET 4.5 RTM:
    • Windows Server 2008 R2 SP1 (KB 2759112)
    • Windows Server 2012 (KB 2765317)
  • The Microsoft SharePoint Products Preparation Tool installs the following prerequisites for front-end web servers and application servers in a farm:
    • Web Server (IIS) role
    • Application Server role
    • Microsoft .NET Framework version 4.5
    • SQL Server 2008 R2 SP1 Native Client
    • Microsoft WCF Data Services 5.0
    • Microsoft Information Protection and Control Client (MSIPC)
    • Microsoft Sync Framework Runtime v1.0 SP1 (x64)
    • Windows Management Framework 3.0 which includes Windows PowerShell 3.0
    • Windows Identity Foundation (WIF) 1.0 and Microsoft Identity Extensions (previously named WIF 1.1)
    • Windows Server AppFabric
    • Cumulative Update Package 1 for Microsoft AppFabric 1.1 for Windows Server (KB 2671763)

Does your organization get the most value from your enterprise data?

by Admin 26. March 2014 07:20

Does your organization get the most value from your enterprise data?

Under construction!

Business Intelligence Overview

by Admin 1. March 2014 03:32

Business Intelligence Overview

  • Definition
  • Architecture
  • Source systems / OLTP
  • ETL process
  • Data warehouses / OLAP
  • OLTP vs. OLAP
  • ODS and data marts
  • Data warehouse design approaches
  • Dimensional modeling
  • From enterprise models to dimensional models
  • Schema types: star, snowflake, fact constellation
  • Conclusion

Definition
The term business intelligence (BI) refers to technologies, applications and practices for the collection, integration, analysis, and presentation of business information. The purpose of business intelligence is to support better business decision making. BI systems provide historical, current, and predictive views of business operations, most often using data that has been gathered into a data warehouse or a data mart and occasionally working from operational data.

BI enables enterprises to:

  • Measure performance and trends
  • Use analytic information strategically
  • Unlock the value of their information
  • Identify opportunities
  • Improve efficiency
  • Perform competitive analysis
  • Find root causes
  • Mine data

Examples

  • Cause & predictive analysis: credit card annual fees
  • Performance and trends: region total sales vs. our sales
  • Competitive analysis: our sales vs. competitor sales in a particular region or location
  • Right timing: bank customer accounts (pattern changes)
  • Data mining: market basket analysis

Architecture


A typical BI architecture has the following components:

  • A source system, also called an operational system—typically an online transaction processing (OLTP) system, but other systems or files that capture or hold data of interest are also possible.
  • An extraction, transformation, and loading (ETL) process.
  • A data warehouse—typically an online analytical processing (OLAP) system.
  • A business intelligence platform such as MicroStrategy.

Source Systems (OLTP)

In data warehousing, an operational system is the system that processes the day-to-day transactions of an organization. These systems are designed so that day-to-day transactions are processed efficiently and the integrity of the transactional data is preserved.

Operational systems are sometimes referred to as operational databases, transaction processing systems, or online transaction processing (OLTP) systems. OLTP relational database designs use the discipline of data modeling and generally follow Codd’s rules of data normalization in order to ensure absolute data integrity.

Source system examples

  • Account transactions in a bank
  • Sales transactions in a retail outlet
  • Inventory management transactions in a warehouse
  • Workforce management transactions such as attendance, vacations, and overtime tracking
  • Operational expenditure systems
  • External sources, such as industry information (for example, elasticity or demand of a product) from third-party sources in the retail domain

 

ETL – Extraction, Transformation and Loading

The Extraction, Transformation, and Loading (ETL) process represents all the steps necessary to move data from different source systems to an integrated data warehouse.

The ETL process involves the following steps:

  • Data is gathered from various source systems.
  • The data is transformed and prepared to be loaded into the data warehouse. Transformation procedures can include converting data types and names, eliminating unwanted data, correcting typographical errors, aggregating data, filling in incomplete data, and similar processes to standardize the format and structure of data.
  • The data is loaded into the data warehouse.
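As a rough illustration, the three steps above can be sketched in Python, with in-memory SQLite databases standing in for the source system and the warehouse. The table and column names (`orders`, `fact_sales`) are invented for this example, not part of any particular product:

```python
import sqlite3

def extract(source_conn):
    # Extraction: gather raw rows from the operational (OLTP) system.
    return source_conn.execute(
        "SELECT order_id, product, amount FROM orders").fetchall()

def transform(rows):
    # Transformation: standardize text, convert types, drop incomplete data.
    cleaned = []
    for order_id, product, amount in rows:
        if amount is None:  # eliminate incomplete records
            continue
        cleaned.append((order_id, product.strip().upper(), float(amount)))
    return cleaned

def load(dw_conn, rows):
    # Loading: write the prepared rows into the data warehouse.
    dw_conn.executemany(
        "INSERT INTO fact_sales (order_id, product, amount) VALUES (?, ?, ?)",
        rows)
    dw_conn.commit()

# Demo: in-memory databases stand in for the real systems.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (order_id INTEGER, product TEXT, amount TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, " widget ", "9.99"), (2, "gadget", None)])

dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE fact_sales (order_id INTEGER, product TEXT, amount REAL)")

load(dw, transform(extract(source)))
print(dw.execute("SELECT * FROM fact_sales").fetchall())  # [(1, 'WIDGET', 9.99)]
```

Real ETL tools add scheduling, logging, and error handling around the same extract–transform–load skeleton.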

Data Warehouse (OLAP)

A data warehouse, in its simplest form, is no more than a collection of the key pieces of information used to manage and direct the business toward the most profitable outcome.

According to Bill Inmon, “a data warehouse is a

  • subject-oriented,
  • integrated,
  • nonvolatile,
  • time-variant

collection of data in support of management decisions.”

Ralph Kimball states that a data warehouse is “a copy of transaction data specifically structured for query and analysis.”

OLAP

OLAP is a category of software tools that provides analysis of data stored in a database. OLAP tools enable users to analyze different dimensions of multidimensional data, for example through time series and trend analysis views. OLAP is often used in data mining.

OLAP Analysis

Imagine an organization that manufactures and sells goods in several states of the USA.

During OLAP analysis, top executives may seek answers to questions such as:

  • Number of products manufactured
  • Number of products manufactured in a location
  • Number of products manufactured over time within a location
  • Number of products manufactured in the current year compared to the previous year
  • Sales dollar value for a particular product
  • Sales dollar value for a product in a location
  • Sales dollar value for a product in a year within a location
  • Sales dollar value for a product in a year within a location, sold or serviced by an employee
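To make this concrete, here is a minimal sketch of how such questions translate into aggregation queries, using an invented `sales` table in SQLite (product names, locations, and dollar figures are made up for the example):

```python
import sqlite3

# Tiny sales table standing in for warehouse data (all values illustrative).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (product TEXT, location TEXT, year INTEGER, dollars REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", [
    ("Widget", "Texas", 2013, 100.0),
    ("Widget", "Texas", 2014, 150.0),
    ("Widget", "Ohio",  2014,  80.0),
])

# Sales dollar value for a particular product (all locations, all years).
total = conn.execute(
    "SELECT SUM(dollars) FROM sales WHERE product = 'Widget'").fetchone()[0]
print(total)  # 330.0

# Sales dollar value for a product in a year, within each location:
# the same fact rolled up along two dimensions at once.
for row in conn.execute(
        "SELECT location, year, SUM(dollars) FROM sales "
        "WHERE product = 'Widget' "
        "GROUP BY location, year ORDER BY location, year"):
    print(row)
# ('Ohio', 2014, 80.0)
# ('Texas', 2013, 100.0)
# ('Texas', 2014, 150.0)
```

Each executive question above is essentially the same `SUM` over the fact data, grouped by a different combination of dimensions.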

 

DW related: ODS and Data Marts

ODS (Operational Data Store) – has a broad, enterprise-wide scope, but unlike the full enterprise data warehouse, its data is refreshed in near real time and used for routine business activity.

Data Mart – a subset of the data warehouse that supports a particular region, business unit, or business function.

 

ODS and DW use case

In a pharmaceutical company:

The customer ODS is used for:

  • sending new product details,
  • promotional activities,
  • and scheduling appointments.

The DW is used to answer:

  • In a month, what is the total value of medicines prescribed by a doctor?
  • What is our company’s share?
  • Is the doctor missing any information from us?

Data Warehouse design approaches

Kimball – Let everybody build what they want when they want it; we’ll integrate it all when and if we need to. (BOTTOM-UP APPROACH)

Pros: fast to build, quick ROI, nimble

Cons: harder to maintain as an enterprise resource, often redundant, often difficult to integrate data marts

Inmon – Don’t do anything until you’ve designed everything. (TOP-DOWN APPROACH)

Pros: easy to maintain, tightly integrated

Cons: takes way too long to deliver first projects, rigid

Dimensional data modeling

Dimensional data modeling is a logical design technique that seeks to present data in a standard framework that is intuitive and allows high-performance access. It is a data model specifically for designing data warehouses. The method was developed based on observations of practice, in particular the need to provide data in a “user-friendly” form.


 

Step 1. Classify Entities

Transaction entities:

  • represent an event that happened at a point in time
  • contain measurements or quantities

Component entities:

  • are directly related to a transaction entity
  • answer the “who”, “what”, “when”, “where”, “how”, and “why” of a business event

In a sales application, the component entities are:

  • Customer: who made the purchase
  • Product: what was sold
  • Location: where it was sold
  • Period: when it was sold

Classification entities:

  • are related to component entities by a chain of one-to-many relationships
  • represent hierarchies embedded in the data model

Step 2. Identify Hierarchies

A hierarchy in an entity relationship model is any sequence of entities joined together by one-to-many relationships, all aligned in the same direction.

Step 3. Produce Dimensional Models

Operators for producing dimensional models:

  • Operator 1: Collapse hierarchy
  • Operator 2: Aggregation

There is a wide range of options for producing dimensional models from an entity relationship model. These include:

  • Star schema
  • Snowflake schema
  • Constellation / integrated schema

Star Schema

A star schema consists of one large central table, called the fact table, and a number of smaller tables, called dimension tables, which radiate out from the central table.

  • A fact table is formed for each transaction entity. The key of the table is the combination of the keys of its associated component entities.
  • A dimension table is formed for each component entity, by collapsing hierarchically related classification entities into it.

 

Sample Star Schema

Snowflake Schema

A snowflake schema is a star schema with all hierarchies explicitly shown.

 

Star vs. Snowflake

Fact constellation schema

The fact constellation architecture contains multiple fact tables that share many dimension tables.

Constellation /Integrated Schema

A constellation schema consists of a set of star schemas with hierarchically linked fact tables.

 

MicroStrategy Architecture and Product Suite

by Admin 28. February 2014 09:02

MicroStrategy Architecture and Product suite

  • Overview and Architecture
  • MicroStrategy Platform
  • MicroStrategy Intelligent Server
  • MicroStrategy Architect
  • MicroStrategy Desktop
  • MicroStrategy Web
  • MicroStrategy Administrator
    • MicroStrategy Object Manager
    • MicroStrategy Command Manager
    • MicroStrategy Enterprise Manager
  • MicroStrategy Distribution Services
  • MicroStrategy Office
  • MicroStrategy Mobile
  • MicroStrategy OLAP Services
  • MicroStrategy Report Services
  • MicroStrategy Integrity Manager
  • MicroStrategy SDK

Typical MicroStrategy Architecture

 

MicroStrategy Platform

 

Layers of MicroStrategy Platform

  • The data layer consists of the metadata repository, relational data warehouses, and non-relational data stores.

        The metadata repository contains information used by all the MicroStrategy products.

  • The interactive reporting and analysis layer allows users to perform on-demand reporting and data manipulation.
  • Information delivery and alerting allows users to receive proactive information delivery services, delivered on a schedule or only when a certain condition is triggered.
  • Design and administration products speed up project development and enable fast, simple, centralized administration.

Intelligence Server
MicroStrategy Intelligence Server is the architectural foundation and core component of the MicroStrategy system. It provides a powerful, comprehensive set of features necessary for a scalable, fault-tolerant, enterprise-wide business intelligence system, and delivers the core analytical processing and job management for all reporting, analysis, and monitoring applications.
    It consists of:

  • SQL Engine
  • Query Engine
  • Analytical Engine

Architect
MicroStrategy Architect is a rapid development tool that maps the physical structure of your database into a logical business model consisting of schema objects. These mappings are stored in a centralized metadata repository, and the schema objects are used to build reports and other objects.

 

Desktop

MicroStrategy Desktop is a client/server application used to create reports, documents, and other reporting objects, and to manipulate result sets. It provides integrated querying and reporting, powerful analytics, and decision support workflows, and enables users to perform complex types of online analysis of data.
    It is available in two versions:

  • Desktop Designer - full-featured version supporting complete functionality.
  • Desktop Analyst - simplified version supporting basic interactive functionality.

 

Desktop Screenshot

 

Web
MicroStrategy Web is a reporting interface for end users that offers interactive reporting and analysis through a web browser, providing access to extensive report generation, document creation, manipulation, and formatting capabilities with data browsing and drilling.
    It is available in three versions:

  • Web Professional - full-featured version that allows users to create Intelligent Cubes and reports.
  • Web Analyst - simplified version that allows ad hoc analysis from Intelligent Cubes.
  • Web Reporter - enterprise reporting version that allows users to view scheduled reports.

Web Screenshot

 

Administrator
MicroStrategy Administrator provides system administrators with a comprehensive set of tools to monitor, manage, and automate their BI infrastructure, delivering the centralized user and object management and performance monitoring that is critical for rolling out high-performance, secure applications to small internal audiences or large extranets.
    It consists of the following components:

  • Object Manager - enables administrators to manage distributed environments for change management, versioning, and internationalization from a single interface, and provides full object life-cycle management from development and testing to production. Its four components are Manager, Project Merge Wizard, User Merge Wizard, and Repository Translation Wizard.
  • Enterprise Manager - provides prebuilt reports and dashboards for monitoring MicroStrategy usage patterns and activities. It also helps monitor system, user, and group activity to tune performance and resource utilization.
  • Command Manager - automates administrative tasks to reduce workload and increase efficiency. It provides a fast and easy way to perform necessary and repetitive tasks.

Distribution Services

MicroStrategy Distribution Services - an add-on component available for MicroStrategy Intelligence Server that offers business performance monitoring and automated information distribution.

MicroStrategy platform also consists of the following products:

  • MicroStrategy Office - an add-on that enables power analysts and end users to run and analyze reports in Microsoft Excel, Word, or PowerPoint.
  • MicroStrategy Mobile - an interactive interface to the MicroStrategy BI platform that lets mobile business users run reports and dashboards directly from their BlackBerry mobile devices from Research in Motion.
  • MicroStrategy OLAP Services - an add-on component for MicroStrategy Intelligence Server that provides MicroStrategy Desktop, MicroStrategy Office, and MicroStrategy Web users with faster and more extensive analytics as part of in-memory BI.
  • MicroStrategy Report Services - an add-on component for MicroStrategy Intelligence Server that enables business users to easily create and analyze pixel-perfect dynamic enterprise dashboards, scorecards, or production reports from multiple sources of data.
  • MicroStrategy Integrity Manager - an application that automates the detection of inconsistencies and errors in your BI environment.
  • MicroStrategy SDK - a collection of programming tools, utilities, documentation, and libraries of functions and classes designed to let users customize and extend MicroStrategy products and integrate them with other applications.

MicroStrategy Architect and Project Design

by Admin 28. February 2014 08:11

MicroStrategy Architect and Project Design

  • Overview of MicroStrategy Architect and schema objects
  • Roles of MicroStrategy Architect and its importance
  • Building the logical table design and warehouse catalogue
  • The Architect interface
  • Creating project tables, facts, and attributes, and their relationships and hierarchies


Overview of MicroStrategy Architect and Schema Objects

MicroStrategy Architect provides a set of tools used to create new projects and modify the structure of existing projects.
It enables you to perform the following tasks:

  • Initially populate the metadata
  • Create schema objects

The role of the metadata in generating report and document results


Any time you create, modify, or remove any type of project object, those changes are stored in the metadata.

Creating Schema Objects
The most fundamental objects stored in the metadata are schema objects. The basic schema objects are:

 

Roles of MicroStrategy Architect and its Importance
MicroStrategy Architect supports the following project functions:

  • Reporting
  • Drilling
  • Browsing

MicroStrategy Architect and Reporting
Attributes and metrics are two of the most frequently used objects in designing reports. Metrics consist of facts, which are also schema objects.

MicroStrategy Architect and Drilling
The basic drilling options available for a report directly relate to the definition of the following items in MicroStrategy Architect:

  • Relationships between attributes
  • Hierarchies and their drilling configuration

MicroStrategy Architect and Browsing
MicroStrategy Architect is used to create and define hierarchies, which determine the various paths for browsing attributes.

Building Logical table design and Warehouse catalogue

Logical Data Model
The first step in the project design process is to design the logical data model. A logical data model is a diagram that shows what information is to be analyzed in a project and seen in reports, and how that information is related. It depicts the flow of data in an organization but does not show the physical structure of how the information is stored in the data warehouse.

Simple Logical Data Model

Logical Data Model components
A logical data model includes the following three components:

  • Facts
  • Attributes
  • Hierarchies

Facts
Facts are measures used to analyze the business. Fact data is typically numeric, and it is generally aggregated. In MicroStrategy projects, facts map to fact schema objects, which form the basis for all the metrics you use in reports. The other components of a logical data model provide context for the facts.

Attributes

Attributes are descriptive data that provide context for analyzing facts. Attributes provide levels for aggregating and qualifying fact data. They map to attribute schema objects, which describe the metrics on reports.

Attribute Relationships
Attribute relationships make it possible to join data from different attributes and to aggregate fact data to different levels. Attributes are related to each other in two ways:
•    Direct - a parent-child relationship exists between two or more attributes.
•    Indirect - two or more attributes are related only through a fact or set of facts.

Types of Direct Relationships

 

Hierarchies
Hierarchies are groupings of directly related attributes ordered to reflect their relationships. Hierarchies contain only attributes that are directly related to each other. Attributes in one hierarchy are indirectly related to attributes in other hierarchies.

Structure of a Logical Data Model
When all the components - facts, attributes and their relationships, and hierarchies - are put together, we have a logical data model.

Creating a Logical Data Model
Factors that influence the design of the logical data model:

  • User reporting requirements
  • Existing source data
  • Technical and performance considerations

User reporting requirements
The logical data model should take into account the reporting requirements of end users. It should include all information needed for reports and must not include any information captured in source systems that is not needed for reports.

Existing Source Data
The logical data model should take into account what source data is available for use. An initial review of source systems should ensure that sufficient source data is present to support user reporting requirements.

Technical and Performance Considerations
Many technical and performance factors affect the design of the logical data model, mostly with regard to its size and complexity. Technical factors include the robustness of the database server and software, network bandwidth, and the volume of concurrent users. Complex user reporting requirements or source data structures pose greater challenges to delivering optimal performance.

Steps to Create a Logical Data Model
To create a logical data model, follow these steps:

  • List all the information from the source data needed to include in the logical data model
  • Identify which items are facts
  • Identify which items are attributes
  • Determine the direct relationships between attributes
  • Organize directly related attributes into hierarchies


Introduction to Physical Schema
The second step in the project design process is to design the data warehouse schema.
A physical schema is a detailed, graphical representation of the physical structure of a database. The physical schema is designed based on the organization of the logical data model. While the logical data model shows the facts and attributes, the physical schema shows how the underlying data for these objects is stored in the data warehouse.

Physical Schema Components
A physical schema includes the following two primary components:

  • Columns
  • Tables

Column Types
In a data warehouse, the columns in tables store fact or attribute data. The following are the three types of columns:

  • ID Columns
  • Description Columns
  • Fact Columns

Table Keys
Every table has a primary key that consists of a unique value identifying each distinct record (or row) in the table. There are two types of primary keys:

  • Simple Key
  • Compound Key

Lookup Tables
Lookup tables store information about attributes, including their IDs and any descriptions. They enable you to easily browse attribute data. Depending on the design of physical schema, a lookup table can store information for a single attribute or multiple related attributes.

Relationship Tables
Relationship tables store information about the relationship between two or more attributes. They enable you to join data for related attributes. To map the relationship between two or more attributes, their respective ID columns must exist together in a relationship table.

One-to-One Relationship
For a one-to-one relationship, there is no separate relationship table and no separate lookup table for the parent attribute. Instead, the parent is placed directly in the lookup table of the child attribute to map the relationship between the two attributes.

One-to-Many Relationship
For a one-to-many relationship, there is no need to have a separate relationship table. Instead, the parent-child relationship can be defined by including the ID column for the parent attribute in the lookup table of the child attribute.
 

Many-to-Many Relationship
For a many-to-many relationship, a separate relationship table with the IDs of parent and child attributes is created to map the parent-child relationship.
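The one-to-many and many-to-many patterns above can be sketched with invented lookup and relationship tables in SQLite (the `lu_*` and `rel_*` names and sample data are illustrative only):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- One-to-many (region -> store): the parent's ID column is placed
-- directly in the child attribute's lookup table; no relationship table.
CREATE TABLE lu_store (store_id INTEGER PRIMARY KEY, store_desc TEXT,
                       region_id INTEGER);

-- Many-to-many (product <-> supplier): a separate relationship table
-- holds the ID pairs that map the parent-child relationship.
CREATE TABLE lu_product  (product_id  INTEGER PRIMARY KEY, product_desc TEXT);
CREATE TABLE lu_supplier (supplier_id INTEGER PRIMARY KEY, supplier_desc TEXT);
CREATE TABLE rel_product_supplier (product_id INTEGER, supplier_id INTEGER);

INSERT INTO lu_product  VALUES (1, 'Widget');
INSERT INTO lu_supplier VALUES (10, 'Acme');
INSERT INTO lu_supplier VALUES (20, 'Globex');
INSERT INTO rel_product_supplier VALUES (1, 10);
INSERT INTO rel_product_supplier VALUES (1, 20);
""")

# Joining through the relationship table finds every supplier of a product.
suppliers = [r[0] for r in conn.execute("""
    SELECT s.supplier_desc
    FROM rel_product_supplier r
    JOIN lu_supplier s ON r.supplier_id = s.supplier_id
    WHERE r.product_id = 1
    ORDER BY s.supplier_id
""")]
print(suppliers)  # ['Acme', 'Globex']
```

The relationship table is needed only in the many-to-many case; in the one-to-many case the `region_id` column inside `lu_store` already captures the relationship.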

Fact Tables
Fact tables store fact data and the attribute ID columns that describe the level at which the fact values are recorded. They enable you to analyze fact data with regard to the business dimensions or hierarchies those attributes represent.
 

Base and Aggregate Fact Tables
Base fact tables are tables that store a fact or set of facts at the lowest possible level of detail. Aggregate fact tables are tables that store a fact or set of facts at a higher, or summarized, level of detail.
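As an illustrative sketch, an aggregate fact table can be derived from a base fact table with a GROUP BY; the daily/yearly table names and figures here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Base fact table: the fact at its lowest level of detail (daily).
conn.execute(
    "CREATE TABLE fact_sales_daily (day TEXT, product_id INTEGER, dollars REAL)")
conn.executemany("INSERT INTO fact_sales_daily VALUES (?, ?, ?)", [
    ("2014-01-01", 1, 10.0),
    ("2014-01-02", 1, 15.0),
    ("2013-12-31", 1,  5.0),
])

# Aggregate fact table: the same fact summarized to the year level.
conn.execute("""
    CREATE TABLE fact_sales_yearly AS
    SELECT substr(day, 1, 4) AS year, product_id, SUM(dollars) AS dollars
    FROM fact_sales_daily
    GROUP BY substr(day, 1, 4), product_id
""")
print(conn.execute(
    "SELECT * FROM fact_sales_yearly ORDER BY year").fetchall())
# [('2013', 1, 5.0), ('2014', 1, 25.0)]
```

Queries at the year level can then read the small aggregate table instead of scanning the much larger base table, which is the usual motivation for maintaining aggregate fact tables.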

Schema Types
The type of schema design for the data warehouse depends on the nature of the data, how users want to query the data, and other factors unique to the project and database environments. Commonly used schema designs include:

  • Completely normalized schema
  • Moderately denormalized schema
  • Completely denormalized schema


Normalized Versus Denormalized Schemas
Normalization occurs whenever there is a schema design that does not store data redundantly. Denormalization occurs whenever there is a schema design that stores at least some data multiple times, or redundantly.

Completely Normalized Schema

A completely normalized schema does not store any data redundantly.

Moderately Denormalized Schema
A moderately denormalized schema stores some data redundantly.

Completely Denormalized Schema
A completely denormalized schema stores the maximum amount of data redundantly.
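
To make the trade-off concrete, here is a sketch in SQLite: the normalized form stores each region description once, while the denormalized child table repeats it in every row (all names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Normalized: each description is stored exactly once, in its own lookup table.
conn.execute("CREATE TABLE lu_region (region_id INTEGER PRIMARY KEY, region_desc TEXT)")
conn.execute("""
    CREATE TABLE lu_store (
        store_id INTEGER PRIMARY KEY, store_desc TEXT, region_id INTEGER)""")

# Denormalized: the parent's description is repeated in every child row,
# trading extra storage for one less join at query time.
conn.execute("""
    CREATE TABLE lu_store_denorm (
        store_id INTEGER PRIMARY KEY, store_desc TEXT,
        region_id INTEGER, region_desc TEXT)""")

conn.execute("INSERT INTO lu_region VALUES (10, 'East')")
conn.executemany("INSERT INTO lu_store VALUES (?, ?, ?)",
                 [(1, 'Downtown', 10), (2, 'Airport', 10)])
conn.executemany("INSERT INTO lu_store_denorm VALUES (?, ?, ?, ?)",
                 [(1, 'Downtown', 10, 'East'), (2, 'Airport', 10, 'East')])

# The normalized schema needs a join to recover the region description...
joined = conn.execute("""
    SELECT s.store_desc, r.region_desc
    FROM lu_store s JOIN lu_region r USING (region_id)
    ORDER BY s.store_id""").fetchall()

# ...while the denormalized table answers the same question directly.
direct = conn.execute("""
    SELECT store_desc, region_desc
    FROM lu_store_denorm ORDER BY store_id""").fetchall()
```

Both queries return the same rows; the denormalized version simply pays for the convenience with redundant storage and a heavier update burden.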

Creating a Data Warehouse Schema
The following factors influence the design of the schema:

  • User reporting requirements
  • Query performance
  • Data volume
  • Database maintenance


Architect Interface

Introduction to Architect
Overview of Architect Components
The Architect graphical interface consists of several components that combine most of the functions of the Project Creation Assistant with most of the functions of the schema object editors.
The Architect graphical interface has the following components:

  • Warehouse Tables pane
  • Project Tables View tab
  • Hierarchy View tab
  • Properties pane
  • Project objects pane
  • Menu bar
  • Toolbar



Architect Graphical Interface

Warehouse Tables Pane
The Warehouse Tables pane enables you to view database instances and their associated tables and to select the tables to include in a project. The pane displays only database instances that have been associated with the project.

Project Tables View Tab
The Project Tables View tab displays images of the tables used in a project. It uses layers to display the tables. The All Project Tables layer enables you to view all of the tables used in a project; this layer exists by default when a project is created.

Hierarchy View Tab
The Hierarchy View tab displays all of the attributes that have been created in a project. This tab is used to create, modify, and remove relationships between attributes, which builds the system hierarchy. It can also be used to create, modify, and remove user hierarchies.

Properties Pane
The Properties pane enables you to view and modify the properties of attributes, facts, and tables. It has three tabs: Attributes, Facts, and Tables.

Project Objects Pane
The Project Objects pane enables you to view the number of attributes, facts, and tables that have been created in a project. It also shows the project name and the current user.

MicroStrategy Intelligence server setup

by Netpeach Team 27. February 2014 00:47

MicroStrategy Intelligence Server

MicroStrategy Intelligence Server provides the core analytical processing and job management for all reporting, analysis, and monitoring applications.

Topics covered:

  • Setting up MicroStrategy Intelligence Server
  • Intelligent server configuration, start up and connectivity
  • Creating Metadata and Statistics Repositories
  • Project Sources, connectivity and MicroStrategy Projects
  • Defining and setting up the MicroStrategy reporting Environment


Setting up MicroStrategy Intelligence Server

The Configuration Wizard steps through the following screens:

  • Metadata Connection
  • MicroStrategy Authentication
  • Server Definitions
  • Settings
  • Summary

Intelligent server configuration, start up and connectivity

Creating Metadata and Statistics Repositories

The Configuration Wizard creates the repositories through the following screens:

  • Repository Types
  • Metadata Tables
  • History List Tables
  • Statistics Tables

Project Sources, connectivity and MicroStrategy Projects

The Configuration Wizard defines a project source through the following screens:

  • Project Source Name
  • Metadata Location
  • Authentication
  • Summary

Defining and setting up the MicroStrategy reporting Environment

The Connectivity Wizard configures the MicroStrategy Reporting Suite through the following screens:

  • Driver Selection
  • Driver Details
  • DSN Created
