Who should attend
- Business Analysts, BI Developers, Data Scientists, Data Warehousing Programmers and Solution Architects
- Mainframe and Testing Professionals
- Those who want to take up a career in Business Intelligence and Data Analytics
What are the prerequisites for learning Pentaho?
No prior knowledge is required to learn the Pentaho BI suite.
About the course
Pentaho training class from Intellipaat helps you learn the Pentaho BI suite, which covers Pentaho Data Integration, Pentaho Report Designer, Pentaho Mondrian Cubes and Dashboards. This Pentaho online course will help you prepare for the Pentaho Data Integration exam, and you will work on real-life projects.
About Pentaho Training Course
This Intellipaat Pentaho Business Intelligence training course equips you with the knowledge of Business Intelligence and data warehousing concepts, along with in-depth coverage of Pentaho Data Integration (aka Kettle), Pentaho Reporting, Dashboards and Mondrian Cubes. Pentaho is a comprehensive open-source BI suite that integrates with Hadoop distributions for handling large datasets and reporting on top of them. This training will also equip you with the skills to integrate the Pentaho BI suite with Hadoop.
What will you learn in this Pentaho online certification Training?
- Architecture of the Pentaho BI suite
- Pentaho Analytics for creating reports using Pentaho BI Server
- Performing multiple data integration, transformation and analytics
- Pentaho Dashboard and Pentaho Business Analytics
- Using PDI / Kettle and ETL design patterns to populate data warehouse star schema
- Creating complex reports and dashboards for analysis
- Developing Mondrian Cube OLAP schemas with Pentaho workbench
- Integrating Pentaho with Big Data stack components like HDFS and MapReduce
- Performance tuning PDI jobs and transformations
- Using Pentaho Kettle to build and deploy reports in an automated manner
Why should you go for Pentaho online training?
- Poor data quality costs US businesses up to $600 billion annually – TDWI
- Global Big Data analytics market to reach $40.6 billion in 4 years – Research and Markets
- The average US salary for a Pentaho Professional is $104,000 – indeed.com
Big Data today is rapidly entering the mainstream, and there is an urgent need for a flexible tool to address changing requirements. Pentaho is a versatile tool that is simple yet effective in the Business Intelligence space, and hence it is expected to grow at a fast pace. Intellipaat's Pentaho certification training program provides great opportunities for professionals in this domain.
Pentaho Course Content
- Introduction to Pentaho Tool
Pentaho user console, overview of Pentaho Business Intelligence and Analytics tools, database dimensional modeling, using Star Schema for querying large data sets, understanding fact tables and dimension tables, Snowflake Schema, principles of Slowly Changing Dimensions, knowledge of how high availability is supported for the DI server and BA server, managing Pentaho artifacts, knowledge of big data solution architectures
Hands-on Exercise – Schedule a report using user console, Create model using database dimensional modeling techniques, create a Star Schema for querying large data sets, Use fact tables and dimensions tables, manage Pentaho artifacts
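To make the star-schema idea concrete, here is a minimal sketch using SQLite and hypothetical table and column names (`fact_sales`, `dim_product`, `dim_date` are illustrative, not from any Pentaho sample database): facts are aggregated while grouping by dimension attributes.

```python
import sqlite3

# Hypothetical toy star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, quantity INTEGER, revenue REAL);

INSERT INTO dim_product VALUES (1, 'Laptop', 'Electronics'), (2, 'Desk', 'Furniture');
INSERT INTO dim_date    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales  VALUES (1, 20240101, 3, 2400.0), (2, 20240101, 1, 300.0),
                               (1, 20240201, 2, 1600.0);
""")

# A typical star-schema query: aggregate the fact table, grouped by dimension attributes.
rows = cur.execute("""
    SELECT p.category, d.year, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    JOIN dim_date    d ON f.date_key    = d.date_key
    GROUP BY p.category, d.year
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Electronics', 2024, 4000.0), ('Furniture', 2024, 300.0)]
```

A Snowflake Schema differs only in that the dimension tables themselves are further normalized into sub-dimension tables.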
- Data Architecture
Designing data models for reporting, Pentaho support for predictive analytics, Design a Streamlined Data Refinery (SDR) solution for a client
Hands-on Exercise – Design data models for reporting, Perform predictive analytics on a data set, design a Streamlined Data Refinery (SDR) solution for a dummy client
- Clustering in Pentaho
Understanding the basics of clustering in Pentaho Data Integration, creating a database connection, moving a CSV file input to table output and Microsoft Excel output, moving from Excel to data grid and log.
Hands-on Exercise – Create a database connection, move a CSV file input to table output and Microsoft Excel output, move data from Excel to data grid and log
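The "CSV file input → Table output" pattern above can be sketched outside of PDI's graphical designer as plain Python (the file contents, table name and column types here are hypothetical stand-ins, not part of any Pentaho sample):

```python
import csv, io, sqlite3

# Stand-in for a CSV file input step; in PDI this would be a file on disk.
csv_data = io.StringIO("id,name,amount\n1,alice,10.5\n2,bob,7.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE output_table (id INTEGER, name TEXT, amount REAL)")

# "Table output" step: stream each parsed CSV row into the target table.
reader = csv.DictReader(csv_data)
for row in reader:
    conn.execute("INSERT INTO output_table VALUES (?, ?, ?)",
                 (int(row["id"]), row["name"], float(row["amount"])))
conn.commit()

print(conn.execute("SELECT * FROM output_table").fetchall())
# [(1, 'alice', 10.5), (2, 'bob', 7.25)]
```

In PDI itself the same flow is wired visually, with the database connection defined once and reused across steps.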
- Data Transformation
The Pentaho Data Integration Transformation steps, adding sequence, understanding calculator, Pentaho number range, string replace, selecting field value, sorting and splitting rows, string operation, unique rows and value mapper, usage of metadata injection
Hands-on Exercise – Practice various steps to perform data integration transformation, add sequence, use calculator, work on number range, selecting field value, sorting and splitting rows, string operation, unique row and value mapper, use metadata injection
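The steps listed above each process a stream of rows. As a rough illustration (not PDI's actual engine, and with made-up field names), several of them can be mimicked in a few lines of Python:

```python
# Hypothetical input rows, as a PDI transformation would receive them.
rows = [
    {"name": "alice ", "dept": "HR", "salary": 50000},
    {"name": "bob",    "dept": "IT", "salary": 60000},
    {"name": "alice ", "dept": "HR", "salary": 50000},  # duplicate row
]

# "String operations" step: trim and upper-case a field.
for r in rows:
    r["name"] = r["name"].strip().upper()

# "Calculator" step: derive a new field from an existing one.
for r in rows:
    r["bonus"] = round(r["salary"] * 0.10, 2)

# "Value mapper" step: map source values to target values.
dept_map = {"HR": "Human Resources", "IT": "Information Technology"}
for r in rows:
    r["dept"] = dept_map.get(r["dept"], r["dept"])

# "Sort rows" + "Unique rows" steps (unique rows expects sorted input in PDI).
rows.sort(key=lambda r: (r["name"], r["dept"]))
unique, seen = [], set()
for r in rows:
    key = (r["name"], r["dept"], r["salary"])
    if key not in seen:
        seen.add(key)
        unique.append(r)

print([r["name"] for r in unique])  # ['ALICE', 'BOB']
```

In PDI each of these operations is a configurable step on the canvas rather than hand-written code, which is what metadata injection then parameterizes at runtime.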
- Pentaho Flow
Working with secure socket command, Pentaho null value and error handling, Pentaho mail, row filter and prioritizing streams
Hands-on Exercise – Work with secure socket command, Handle null values in the data, perform error handling, send email, get row filtered data, set stream priorities
- Deploying SCD
Understanding Slowly Changing Dimensions, making ETL dynamic, dynamic transformation, creating folders, scripting, bulk loading, file management, working with Pentaho file transfer, repository, XML utility and file encryption.
Hands-on Exercise – Make ETL dynamic transformation, create folders, write scripts, load bulk data, perform file management ops, work with Pentaho file transfer, XML utility and File encryption
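The core of a Type 2 Slowly Changing Dimension is "expire the current row, insert a new versioned row". A minimal sketch of that logic, using an in-memory list and hypothetical column names in place of a real dimension table:

```python
from datetime import date

# Hypothetical dimension table with SCD Type 2 history columns.
dim_customer = [
    {"customer_id": 1, "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, change_date):
    """Type 2 update: close out the current row, then append a new version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # attribute unchanged, nothing to do
            row["valid_to"] = change_date
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": change_date, "valid_to": None, "is_current": True})

apply_scd2(dim_customer, 1, "Mumbai", date(2023, 6, 1))
print([(r["city"], r["is_current"]) for r in dim_customer])
# [('Pune', False), ('Mumbai', True)]
```

PDI provides this behavior declaratively through its "Dimension lookup/update" step, so no hand-written code is needed in the course exercises.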
- Types of Repositories in Pentaho
Creating dynamic ETL, passing variable and value from job to transformation, deploying parameter with transformation, importance of Repository in Pentaho, database connection, environmental variable and repository import.
Hands-on Exercise – Create dynamic ETL, pass variable and value from job to transformation, deploy parameter with transformation, connect to a database, set Pentaho environment variables, import a repository into the Pentaho workspace
- Pentaho Repository & Report Designing
Working with Pentaho dashboards and reports, effect of row banding, designing a report, working with Pentaho Server, creating line, bar and pie charts in Pentaho, how to achieve localization in reports
Hands-on Exercise – Create a Pentaho dashboard and report, check the effect of row banding, design a report, work with Pentaho Server, create line, bar and pie charts in Pentaho, implement localization in a report
- Pentaho Dashboard
Working with Pentaho Dashboard, passing parameters in Report and Dashboard, drill-down of Report, deploying Cubes for report creation, working with Excel sheet, Pentaho data integration for report creation.
Hands-on Exercise – Pass parameters in Report and Dashboard, deploy Cubes for report creation, drill down in a report to understand the entries, import data from an Excel sheet, perform data integration for report creation
- Understanding Cube
What is a Cube? Creation and benefit of Cube, working with Cube, Report and Dashboard creation with Cube.
Hands-on Exercise – Create a Cube, create report and dashboard with Cube
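In Pentaho, a Cube is defined as a Mondrian schema in XML, mapping dimensions and measures onto star-schema tables. A minimal sketch, with hypothetical schema, table and column names (`fact_sales`, `dim_product`, etc. are illustrative):

```xml
<Schema name="SalesSchema">
  <Cube name="Sales">
    <Table name="fact_sales"/>
    <Dimension name="Product" foreignKey="product_key">
      <Hierarchy hasAll="true" primaryKey="product_key">
        <Table name="dim_product"/>
        <Level name="Category" column="category"/>
        <Level name="Product" column="product_name"/>
      </Hierarchy>
    </Dimension>
    <Measure name="Revenue" column="revenue" aggregator="sum" formatString="#,###.00"/>
    <Measure name="Quantity" column="quantity" aggregator="sum"/>
  </Cube>
</Schema>
```

The course builds such schemas graphically with Pentaho Schema Workbench rather than by hand-editing the XML.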
- Multidimensional Expressions (MDX)
Understanding the basics of Multidimensional Expressions (MDX), understanding tuples and their implicit dimensions, MDX sets, levels, members, dimension referencing, hierarchical navigation and metadata.
Hands-on Exercise – Work with MDX, Use MDX sets, level, members, dimensions referencing, hierarchical navigation, and meta data
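For a feel of the syntax, here is a small MDX query against a hypothetical Sales cube (the cube, dimension and member names are illustrative): a set of measures on columns, the members of a dimension level on rows, and a tuple in the WHERE clause acting as a slicer.

```mdx
-- Hypothetical MDX query against a "Sales" cube:
SELECT
  {[Measures].[Revenue], [Measures].[Quantity]} ON COLUMNS,
  [Product].[Category].Members ON ROWS
FROM [Sales]
WHERE [Time].[2024].[Q1]
```

Mondrian evaluates such queries against the cube schema and translates them into SQL on the underlying star schema.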
- Pentaho Analyzer
Pentaho analytics for discovering, blending various data types and sizes, including advanced analytics for visualizing data across multiple dimensions, extending Analyzer functionality, embedding BA server reports, Pentaho REST APIs
Hands-on Exercise – Blend various data types and sizes, Perform advanced analytics for visualizing data across multiple dimensions, Embed BA server report
- Pentaho Data Integration (PDI) Development
Knowledge of the PDI steps used to create an ETL job, describing the PDI / Kettle steps to create an ETL transformation, describing the use of property files
Hands-on Exercise – Create an ETL transformation using PDI / Kettle steps, Use property files
- Hadoop ETL Connectivity
Deploying ETL capabilities for working on the Hadoop ecosystem, integrating with HDFS and moving data from local file to distributed file system, deploying Apache Hive, designing MapReduce jobs, complete Hadoop integration with ETL tool.
Hands-on Exercise – Deploy ETL capabilities for working on the Hadoop ecosystem, Integrate with HDFS and move data from local file to distributed file system, deploy Apache Hive, design MapReduce jobs
- Creating dashboards in Pentaho
Creating interactive dashboards that provide rich graphical representations of data for improving key business performance.
Hands-on Exercise – Create interactive dashboards for visualizing graphical representation of data
- Performance Tuning
Managing BA server logging, tuning Pentaho reports, monitoring the performance of a job or a transformation, Auditing in Pentaho
Hands-on Exercise – Manage logging in BA server, Fine tune Pentaho report, Monitor the performance of an ETL job
- Pentaho Security
Integrating user security with other enterprise systems, extending BA server content security, securing data, Pentaho's support for multi-tenancy, using Kerberos with Pentaho
Hands-on Exercise – Configure security settings to implement high level security
What projects will I be working on in this Pentaho training?
Project 1– Pentaho Interactive Report
Data– Sales, Customer, Product
Objective – In this Pentaho project, you will work exclusively on creating Pentaho interactive reports for sales, customer and product data fields. As part of the project, you will learn to create a data source and build a Mondrian cube, which is represented as an XML file. You will gain advanced experience in managing data sources, building and formatting Pentaho reports, changing report templates and scheduling reports.
Project 2– Pentaho Dashboard
Objective – Build a complex dashboard with drill-down reports and charts for analyzing business trends.
Project 3– ETL Testing
Objective – Automate testing in an ETL environment: check the correctness of data transformations, verify data loading into the data warehouse without any loss or truncation, reject, replace and report invalid data, and create unit tests to target exceptions.
This course is designed for clearing the Pentaho Business Analytics Certification exam. The entire course content is in line with the certification program and helps you clear the certification exam with ease and get the best jobs in top MNCs.
As part of this training, you will work on real-time projects and assignments with direct relevance to real-world industry scenarios, helping you fast-track your career.
At the end of this training program, there will be a quiz that perfectly reflects the type of questions asked in the certification exam and helps you score better marks.
Intellipaat Course Completion Certificate will be awarded upon the completion of the project work (after the expert review) and upon scoring at least 60% marks in the quiz. Intellipaat certification is well recognized in top MNCs like Ericsson, Cisco, Cognizant, Sony, Mu Sigma, Saint-Gobain, Standard Chartered, TCS, Genpact, Hexaware, etc.