Apache Hadoop – Mainframe

While Apache Hadoop originated as a batch-processing framework for big data analysis, today's uses of the technology have evolved and grown dramatically.

From low-cost data storage to OLAP, NoSQL datastores, and now real-time queries, the flexibility and power of Apache Hadoop are evident in the innovation we've seen in its open-source ecosystem. Global Knowledge trainer Rich Morrow explores new use cases and the value that Hadoop can bring to your organization.

Mainframe resources can switch to Hadoop with these skills:

  1. Core Java knowledge
  2. Basic Unix commands
  3. NoSQL
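To illustrate the kind of core Java a Hadoop developer writes day to day, here is a minimal sketch of the MapReduce word-count pattern in plain Java. This is an assumption-laden simplification: it has no Hadoop dependency, and the class and method names (`WordCountSketch`, `wordCount`) are hypothetical, chosen only to show the map-then-reduce idea.

```java
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch of the map/reduce word-count pattern in plain Java
// (no Hadoop dependency): the "map" step emits (word, 1) pairs and the
// "reduce" step sums the counts per word.
public class WordCountSketch {

    static Map<String, Integer> wordCount(String[] lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            // Map phase: split each line into lowercase words.
            for (String word : line.toLowerCase().split("\\s+")) {
                if (!word.isEmpty()) {
                    // Reduce phase: sum the count for each key.
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] lines = { "hello hadoop", "hello mainframe" };
        System.out.println(wordCount(lines)); // {hadoop=1, hello=2, mainframe=1}
    }
}
```

In a real Hadoop job the map and reduce steps would live in separate `Mapper` and `Reducer` classes and run distributed across a cluster, but the core Java involved is at this level of difficulty, which is why existing Java knowledge transfers well.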

Author: Srini

Experienced software developer with skills in development, coding, testing, and debugging. Good data analytics skills (data warehousing and BI), plus mainframe experience.