Pentaho Training For Hadoop - iPartner

Pentaho Training For Hadoop

  1. Pentaho Training For Hadoop
20 hr / 680*
(* including all taxes.)


Key Features
  • Small batches
  • Mentoring by experts
  • Flexible schedule
  • Learn by doing
  • Goal oriented



Course Agenda


  • What is Pentaho
  • Dimensional modeling
  • Dimensional design
  • Star schema – what and why (model)
  • Star schema – what and why (language)
  • Conformed dimensions
  • Additive vs. semi-additive facts
  • Snowflake schema
  • Star vs. Dimensional schema
  • Slowly changing dimensions
  • Files in Pentaho
  • Spoon.bat
  • Pan.bat
  • Kitchen.bat
  • Carte.bat
  • Encr.bat
  • Spoon transformations and their steps (a Java sketch of running a transformation programmatically follows this list)
  • How to create a database connection
  • How to move data from CSV file input to Table output
  • How to move data from CSV file input to Microsoft Excel output
  • How to move data from Microsoft Excel input to Write to log
  • Data Grid
  • Generate rows
  • How to add a constant
  • How to add a sequence
  • Add value fields changing sequence
  • How the Calculator step works in Pentaho
  • Number range in Pentaho
  • Replace in string
  • Select values – select field values
  • Set field value to a constant
  • Sort rows
  • Split field to rows
  • String operations – string cut
  • Unique rows
  • Unique rows (HashSet)
  • Value mapper
  • Revision of SSH command
  • How to handle null values in Pentaho
  • Mail in Pentaho
  • Error handling in Pentaho
  • Filter rows
  • Prioritize streams
  • Revision of SCD type 2
  • Jobs
  • Differences between jobs and transformations
  • How to make ETL dynamic
  • How to make transformation dynamic
  • File management
  • Create folder
  • Conditions
  • Scripting
  • Bulk loading
  • XML
  • Utility
  • Repository
  • File transfer
  • File encryption
  • How to make ETL dynamic
  • Difference between parameters and variables
  • How to pass a variable from a job to transformation
  • How to use a parameter within a transformation
  • How to set and get the value from a job to a transformation
  • Environment variables
  • Functionality of the repository in Pentaho
  • Database connections
  • Repository import
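
The PDI topics above are normally practised interactively in Spoon, but a transformation can also be launched from the command line (Pan/Kitchen) or from Java code. The following is a minimal sketch using the Kettle (PDI) Java API, assuming the PDI libraries are on the classpath; the file name csv_to_table.ktr is a hypothetical transformation saved from Spoon and is not part of the course material.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        // Initialise the Kettle environment (plugin registry, logging).
        KettleEnvironment.init();

        // Load a transformation designed in Spoon; the file name is a placeholder.
        TransMeta meta = new TransMeta("csv_to_table.ktr");

        // Execute the transformation and wait for all steps to finish.
        Trans trans = new Trans(meta);
        trans.execute(null);
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            System.err.println("Transformation finished with errors.");
        }
    }
}
```

The same transformation can be run without any Java code by pointing Pan at the .ktr file, which is what the Pan.bat entry above refers to.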

  • Basic operational report/dashboard
  • Row banding effect in Pentaho
  • Report designing
  • How to publish the report/dashboard
  • What the Pentaho BI Server looks like
  • How to create bar, pie and line charts in Pentaho
  • Design limitations of the product
  • Sub-reports/dashboards in Pentaho
  • How to pass parameters to a report/dashboard?
  • Drill down report/dashboard
  • How to create report/dashboard using cubes?
  • How to create report/dashboard using excel sheet?
  • How to create report/dashboard using Pentaho Data Integration
  • What are the benefits of cubes?
  • How to create the cube?
  • How to deploy a cube?
  • How to create a report/dashboard using a cube?
  • MDX basics
  • MDX in practice
  • Cells
  • Tuples – implicitly added aggregation members
  • Tuples
  • Tuples – implicit dimensions
  • Sets in MDX
  • Selects
  • Referencing dimensions, level members
  • Member referencing
  • Positional
  • Hierarchical navigation
  • Hierarchical navigation MISC
  • Functions
  • Metadata
  • How ETL tools work in the Big Data industry
  • Connecting to HDFS from the ETL tool and moving data from the local system to HDFS
  • Moving data from a DBMS to HDFS
  • Working with Hive using the ETL tool
  • Creating a MapReduce job in the ETL tool
  • End-to-end ETL PoC showing Hadoop integration with the ETL tool
  • How to create a data source
  • Managing data sources
  • Formatting the Report
  • How to change the template of the report
  • Scheduling etc.
  • Big Data, Factors constituting Big Data
  • Hadoop and Hadoop Ecosystem
  • MapReduce – concepts of map, reduce, ordering, concurrency, shuffle and reducing
  • Hadoop Distributed File System (HDFS) concepts and their importance
  • Deep dive into MapReduce – execution framework, partitioner, combiner, data types, key-value pairs (see the word-count sketch after this list)
  • HDFS deep dive – architecture, data replication, NameNode, DataNode, data flow
  • Parallel copying with distcp, Hadoop archives
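
To make the MapReduce concepts above (mapper, reducer, key-value pairs, combiner, shuffle) concrete, here is a minimal word-count sketch against the standard Hadoop org.apache.hadoop.mapreduce API. The class names and the input/output paths are illustrative assumptions, not part of the course material.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emits (word, 1) for every token in an input line.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reducer: sums the counts for each word; also reused as a map-side combiner.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);   // combiner cuts shuffle volume
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Packaged into a jar, the job is submitted with hadoop jar and the two path arguments; the shuffle phase groups all (word, 1) pairs by key before the reducer sums them.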

  • How to develop a MapReduce application and write unit tests
  • Best practices for developing, writing and debugging MapReduce applications
  • Joining datasets in MapReduce
  • What Is Hive?
  • Hive Schema and Data Storage
  • Comparing Hive to Traditional Databases
  • Hive vs. Pig
  • Hive Use Cases
  • Interacting with Hive (a JDBC sketch follows this list)
  • What Is Pig?
  • Pig’s Features
  • Pig Use Cases
  • Interacting with Pig
  • Pig Latin Syntax
  • Loading Data
  • Simple Data Types
  • Field Definitions
  • Data Output
  • Viewing the Schema
  • Filtering and Sorting Data
  • Commonly-Used Functions
  • Hands-On Exercise: Using Pig for ETL Processing
  • What is Impala?
  • How Impala Differs from Hive and Pig
  • How Impala Differs from Relational Databases
  • Limitations and Future Directions
  • Using the Impala Shell
  • Putting it all together and Connecting Dots
  • Working with large datasets and the steps involved in analyzing large data
  • Major project, Hadoop development, Cloudera certification tips and guidance, mock interview preparation, and practical development tips and techniques
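
As a concrete illustration of the "Interacting with Hive" topic above, here is a minimal JDBC sketch against HiveServer2. The host, port, credentials and the sales table are illustrative assumptions, and the Hive JDBC driver (hive-jdbc) must be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQuery {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC URL; host, port and database are placeholders.
        String url = "jdbc:hive2://localhost:10000/default";

        try (Connection con = DriverManager.getConnection(url, "hive", "");
             Statement stmt = con.createStatement()) {

            // Illustrative Hive table over comma-delimited files.
            stmt.execute("CREATE TABLE IF NOT EXISTS sales (id INT, amount DOUBLE) "
                       + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");

            // HiveQL reads like SQL, but queries typically execute as MapReduce (or Tez) jobs.
            try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM sales")) {
                while (rs.next()) {
                    System.out.println("row count: " + rs.getLong(1));
                }
            }
        }
    }
}
```

Roughly the same pattern is used when an ETL tool such as PDI is configured with a Hive database connection, which is why Hive tables can be read and written from transformations much like relational tables.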

Learn & Get

  • Know what Pentaho is and understand the architecture of the Pentaho Business Intelligence (BI) Suite
  • Perform multiple data transformations, data integration and data analytics
  • Use PDI and ETL design patterns to populate a data warehouse star schema
  • Describe and demonstrate the reporting end-user experience with the Pentaho BI Server
  • Develop basic OLAP schemas using Pentaho Analysis
  • Build and deploy reports and understand the concepts of Pentaho Kettle

Payment Method

PAYMENT METHODS
Payment is made through PayPal. We accept both debit and credit cards.
SCHOLARSHIPS
We subsidize our fees by 10% for military personnel and college students with exceptional records. To apply for a scholarship, email info@ipartner.ca.
FREQUENTLY ASKED QUESTIONS
In our iPartner self-paced training program, you will receive training assessments, recorded sessions, course materials, quizzes, related software and assignments. The courses are designed so that you get real-world exposure and a solid understanding of every concept, which allows you to get the most from the online training experience and apply the information and skills in the workplace. After successfully completing the training program, you can take quizzes to check your level of knowledge; they also help you clear the relevant certification with higher marks, so that you will be able to work on the technologies independently.
In self-paced courses, learners complete hands-on exercises and produce learning deliverables entirely on their own, at any convenient time and without a facilitator, whereas in online training courses a facilitator is available at dedicated times to answer queries. During self-paced learning, you learn more effectively when you interact with the content, and review questions and quizzes that reinforce key concepts are a great way to facilitate this. If you face any unexpected challenges while learning, we will arrange a live class with our trainer.
All courses from iPartner are highly interactive, giving learners good exposure and real-time experience. You can learn at a time when there are no distractions, which leads to effective learning. Self-paced training costs 75% less than online training. We offer lifetime access, so you can refer back to the material at any time during your project work or job.
Yes, you can see sample videos at the top of the course details page.
As soon as you enroll in the course, your LMS (Learning Management System) access will be activated. You will immediately get access to our course content in the form of a complete set of previous class recordings, PPTs, PDFs, assignments and access to our 24*7 support team. You can start learning right away.
24/7 access to video tutorials and email support, along with online interactive sessions with the trainer for resolving issues.
Yes, you can pay the difference between the online training and the self-paced course and be enrolled in the next online training batch.
Please send an email, or join our live chat for an instant solution.
We will provide you with download links for the software that is open source; for proprietary tools, we will provide the trial version if one is available.
You will have to work on a training project towards the end of the course. This will help you understand how the different components of the course relate to each other.
Classes are conducted via live video streaming, where you get a chance to interact with the instructor by speaking, chatting and sharing your screen. You will always have access to the videos and PPTs. This gives you a clear insight into how the classes are conducted, the quality of the instructors and the level of interaction in the class.
Yes, we keep launching offers that best suit your needs. Please email us at info@ipartner.ca and we will get back to you with exciting offers.
We will help you with issues and doubts regarding the course, and you can attempt the quiz again.
Sure! Your feedback is greatly appreciated. Please connect with us via email support: info@ipartner.ca.