About the job
- Solid experience handling multiple terabytes of data and working in a large EDW environment.
- Hadoop development and implementation.
- Experience working with unstructured and disparate data sets.
- Pre-processing using Hive and other big data technologies such as Kafka and Spark.
- Designing, building, installing, configuring, and supporting Hadoop.
- Experience translating complex functional and technical requirements into feasible solutions.
- Experience analyzing vast data stores and uncovering insights.
- Experience in big data security and data privacy.
- Creating scalable, high-performance web services for data tracking.
- Experience with near-real-time streaming technologies.
- Experience with real-time replication on Teradata and Hadoop.
- Experience with, or working knowledge of, Presto.
- Experience creating POCs to help build new Hadoop clusters and use cases.
- Testing prototypes and overseeing handover to operational teams.
- Knowledge of best practices and standards.
Posted on Oct 30, 2023.