
  • Posted: May 5, 2023
  • Deadline: Not specified

    At Data2Bots, we build secure and scalable data solutions in the cloud, helping businesses make informed decisions from their data. Our solutions are driven towards identifying organizational data challenges and designing strategies to address them, keeping your business’s needs top of mind without disrupting your business activities. Our consultan...

     

    Medior Data Engineer

    RESPONSIBILITIES

    • Implement DevOps practices, CI/CD pipelines, and Infrastructure as Code for efficient and automated data workflows.
      • Preferred: Terraform (1.5 years of experience)
      • GitHub workflows
      • Cloud-native CI/CD pipelines
      • AWS (CodePipeline, CodeBuild, and CodeCommit)
    • Work with or build production-grade end-to-end data platforms.
      • Architect scalable and maintainable cloud infrastructure, with specific emphasis on the data domain.
      • Implement and deploy the architected cloud infrastructure.
    • Follow software development standards and best practices, including Test-Driven Development (TDD), Keep It Simple, Stupid (KISS), You Aren't Gonna Need It (YAGNI), and Don't Repeat Yourself (DRY).
    • Apply data warehousing concepts and data warehouse modelling techniques to design and optimise data solutions.
    • Work with cloud data warehouse solutions such as Redshift, BigQuery, or Snowflake to build scalable and performant data solutions.
    • Utilise data flow orchestration tools such as Apache Airflow to schedule and manage data workflows (see the sketch after this list).
    • Leverage distributed computing and container orchestration technologies like Kubernetes for efficient and scalable data processing.
    • Explore and implement data streaming solutions like Kafka for real-time data processing.
    • Design and develop microservices and event-driven architecture for efficient data integration and processing.
    • Core knowledge of Data Engineering frameworks such as Spark, Kafka, and Airflow is a plus
    • Attention to detail
    • Leadership skills
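
    A minimal illustrative sketch of the kind of Airflow orchestration referred to above, assuming Airflow 2.x; the DAG id, task ids, and the placeholder extract/load callables are hypothetical and only show a scheduled extract-then-load dependency.

        from datetime import datetime

        from airflow import DAG
        from airflow.operators.python import PythonOperator


        def extract_orders(**context):
            """Pull the latest batch from a source system (placeholder logic)."""
            ...


        def load_to_warehouse(**context):
            """Write the extracted batch into a warehouse staging table (placeholder logic)."""
            ...


        with DAG(
            dag_id="daily_orders_pipeline",   # hypothetical pipeline name
            start_date=datetime(2023, 5, 1),
            schedule_interval="@daily",       # run once per day
            catchup=False,
        ) as dag:
            extract = PythonOperator(task_id="extract", python_callable=extract_orders)
            load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

            extract >> load                   # load runs only after extract succeeds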

    REQUIREMENTS AND SKILLS

    • Bachelor's degree in computer science, an engineering discipline, or a related field.
    • Minimum of 4 years of professional IT industry experience.
    • Minimum of 4 years of software engineering experience.
    • Minimum of 2 years of cloud computing experience, preferably with AWS.
    • Proficiency in ETL development using Python and SQL (see the sketch after this list).
    • Minimum of 1.5 years of experience with DevOps, CI/CD, and Infrastructure as Code.
    • Preferred: Terraform.
    • Good understanding of software development standards and best practices.
    • Knowledge of data warehousing concepts and data warehouse modelling.
    • Experience working with at least one Cloud Data Warehouse Solution such as Redshift, BigQuery, or Snowflake.
    • Experience with data flow orchestration tools such as Apache Airflow is a plus.
    • Experience with distributed computing and container orchestration (Kubernetes) is a plus.
    • Experience with data streaming solutions such as Kafka is a plus.
    • Experience with microservices and event-driven architecture is a plus.
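
    A minimal sketch of the Python-and-SQL ETL proficiency referred to above; it uses only the standard library (csv and sqlite3) so it is self-contained, and the file, table, and column names are hypothetical.

        import csv
        import sqlite3


        def run_etl(csv_path="orders.csv", db_path="warehouse.db"):
            # Extract: read raw rows from a CSV export (hypothetical source file).
            with open(csv_path, newline="") as f:
                rows = list(csv.DictReader(f))

            # Transform: keep completed orders and normalise the amount to a float.
            cleaned = [
                (r["order_id"], r["customer_id"], float(r["amount"]))
                for r in rows
                if r.get("status") == "completed"
            ]

            # Load: write the cleaned batch into a SQL table, replacing duplicates.
            conn = sqlite3.connect(db_path)
            with conn:
                conn.execute(
                    "CREATE TABLE IF NOT EXISTS orders ("
                    "order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
                )
                conn.executemany(
                    "INSERT OR REPLACE INTO orders VALUES (?, ?, ?)",
                    cleaned,
                )
            conn.close()


        if __name__ == "__main__":
            run_etl()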

    SOFT SKILLS

    • Excellent understanding of Agile Methodology and experience with Scrum rituals.
    • Ability to work independently, think proactively, and pay attention to details, while providing technical leadership to a team.
    • Demonstrated experience in leading data engineering projects from conception to finished product, and ability to drive technical decisions.
    • Adaptability to a fast-paced technical environment, with a strong sense of urgency and ability to meet tight deadlines.
    • Energetic, motivated, and a team player, with excellent communication skills in English (both written and spoken).
    • Ability to effectively communicate and collaborate with cross-functional teams and business stakeholders.

    Method of Application

    Interested and qualified? Go to Data2Bots on data2bots.com to apply
