Big Data Engineer - Singapore - KERRY CONSULTING PTE. LTD.

    Roles & Responsibilities

    Big Data Engineer (Hortonworks Hadoop, Cloudera) - Contract

    Job posting done by Sheralynn Tjioe, Head of Interim and Contracting Solutions (Technology) Recruitment at Kerry Consulting

    My client is a leading, well-established firm in Singapore.

    Job Description

    My client is looking for an accomplished Data Engineer (Hortonworks Hadoop, Cloudera) to spearhead the migration of the current data warehouse and data model from MariaDB to an on-premise Cloudera Hadoop big data platform. The ideal candidate excels in SQL, Hive SQL, and Spark, and has a strong grasp of data modeling. They must also have a thorough understanding of the production deployment process, from design and development through testing, UAT, and deployment. Experience with Autosys for job scheduling is essential for this role.
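    For context, a single step in a migration of this kind might look like the following Hive SQL sketch. All database, table, and column names here are hypothetical illustrations, not taken from the posting:

```sql
-- Hypothetical example: schema, table, and column names are illustrative.
-- Target table on the Cloudera platform, stored as Parquet.
CREATE EXTERNAL TABLE IF NOT EXISTS dw.customer_orders (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_total  DECIMAL(12,2),
    order_ts     TIMESTAMP
)
PARTITIONED BY (order_date STRING)
STORED AS PARQUET
LOCATION '/data/dw/customer_orders';

-- Populate from a staging table landed from MariaDB (e.g. via Sqoop or a
-- Spark JDBC extract), letting Hive derive the partition value dynamically.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE dw.customer_orders PARTITION (order_date)
SELECT order_id, customer_id, order_total, order_ts,
       to_date(order_ts) AS order_date
FROM staging.customer_orders_raw;
```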

    Primary Responsibilities

    • Lead the migration of the data warehouse and data model to the on-premise Cloudera Hadoop platform.
    • Develop and fine-tune SQL, Hive SQL, and Spark scripts for efficient data processing.
    • Design and deploy data models to meet business requirements and optimize performance.
    • Collaborate closely with cross-functional teams to understand data needs and maintain integrity throughout migration.
    • Develop and execute comprehensive test plans to validate data accuracy and system performance.
    • Coordinate with internal stakeholders to plan and execute production deployments.
    • Utilize Autosys for job scheduling and monitoring to ensure timely execution and minimal downtime.
    • Provide technical expertise and support to team members throughout the migration project.
    • Document processes, procedures, technical specifications, and best practices for knowledge sharing and scalability.
    • Create and maintain unit test case documents to uphold code quality and reliability.
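    As an illustration of the scheduling responsibility above, an Autosys job for a migration load might be defined with a JIL sketch along these lines. Job names, machine, owner, and paths are hypothetical:

```
/* Hypothetical Autosys JIL definition; names, machine, and paths are illustrative. */
insert_job: dw_load_orders
job_type: CMD
command: /opt/etl/run_spark_load.sh customer_orders
machine: hadoop-edge01
owner: etluser
start_times: "02:00"
condition: s(dw_extract_orders)   /* run only after the extract job succeeds */
std_out_file: /var/log/autosys/dw_load_orders.out
std_err_file: /var/log/autosys/dw_load_orders.err
alarm_if_fail: 1
```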

    Qualifications

    • Bachelor's (or higher) degree in computer science, engineering, or a related field.
    • At least 5 years' proven experience in big data engineering and migration projects, particularly with Hortonworks/Cloudera.
    • Hands-on experience with Cloudera and its ecosystem components.
    • Strong proficiency in implementing ETL processes.
    • High proficiency in SQL, Hive SQL, Spark, and data modeling.
    • Sound understanding of the production deployment process.
    • Experience with Autosys or similar tools for job scheduling.
    • Familiarity with version control systems such as Bitbucket, Git, etc.
    • Ability to troubleshoot and resolve data-related issues efficiently.
    • Experience with Dataiku is a plus, though not mandatory.

    To Apply

    For a confidential chat regarding your next Technology role, please submit your resume (in MS Word format) to Sheralynn Tjioe, quoting the job title. We regret that only shortlisted candidates will be contacted.

    Registration No.: R1878306

    License No.: 16S8060
