Big Data Stores and Pipelines
The Big Data Stores and Pipelines skill requires knowledge of data flow orchestration. Because data gathering is usually only the first step, continuous data transformation, ETL, and pipelining are required for further analytics. Common tools associated with this skill are Luigi, Airflow, and Apache NiFi within the Hadoop ecosystem, and Informatica and Pentaho Kettle for Data Warehouses.
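The extract-transform-load flow that these tools orchestrate can be sketched in plain Python. This is a minimal illustration only, with hypothetical data and function names; a real deployment would wrap each stage in an orchestrator task (e.g. an Airflow operator or a Luigi Task) to get scheduling, dependency tracking, and retries.

```python
# Hypothetical three-stage ETL pipeline: each function is one stage that an
# orchestrator such as Airflow or Luigi would schedule as a separate task.

def extract():
    # Simulate pulling raw records from a source system.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.2"}]

def transform(rows):
    # Clean the raw records: cast string amounts to floats.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows, sink):
    # Append the cleaned records to a target store (here, a plain list
    # standing in for a warehouse table); return the count loaded.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)
```

Splitting the flow into independent stages like this is what makes orchestration possible: each stage can be retried or backfilled on its own without rerunning the whole pipeline.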
Skill Levels
No knowledge of even ad-hoc data processing / ETL techniques.
Assessments
The following assessments award this skill: