The Dangote Group is one of the most diversified business conglomerates in Africa, with a hard-earned reputation for excellent business practices and product quality. Its operational headquarters are in the bustling metropolis of Lagos, Nigeria, in West Africa.
The Group's activities encompass:
Cement - Manufacturing / Importing
Sugar - Manufacturing ...
The ideal candidate will have a strong background in data engineering, with proficiency in ETL/ELT processes, big data technologies, and cloud platforms.
You will play a critical role in ensuring data accessibility, quality, and integrity by maintaining a scalable infrastructure that supports data analytics and other data-related initiatives.
This position offers an exciting opportunity to work collaboratively with cross-functional teams and leverage your expertise to shape our data infrastructure.
Responsibilities
Act as the architect and builder of the data infrastructure, creating the foundation needed for effective analytics engagements.
Design and implement efficient ETL (Extract, Transform, Load) pipelines to support data integration and analytics.
Build and maintain data warehouses and data lakes to support business intelligence and related data applications.
Develop data models and schemas for effective data analysis and reporting.
Optimise data processing, query performance, and platform efficiency.
Ensure data quality, integrity, and security through robust governance practices.
Lead data migration initiatives across multiple platforms.
Modernise applications and databases to leverage Azure's advanced capabilities.
Implement monitoring solutions for database usage, performance, and reliability.
Develop automated data quality checks and testing procedures.
Requirements
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of experience in a similar role within the technology industry.
Proven experience in data engineering, with a strong foundation in ETL/ELT processes and data warehousing.
Proven ability to work with both OLTP (Microsoft SQL Server/MySQL) and OLAP systems (Snowflake).
Hands-on experience in designing, optimising, and managing batch and streaming data pipelines.
Expertise in data migration and modernisation, particularly on Azure cloud platforms.
Microsoft Certification in Azure Data Engineering is highly desirable.
Expertise in SQL, Python, and other programming languages relevant to data engineering.
Experience with big data technologies such as Hadoop, Spark, and Hive.
Knowledge and hands-on experience with cloud platforms like AWS, Azure, or GCP, including services such as Azure Data Factory, Azure Synapse Pipeline, Azure Event Hub, and Azure IoT Hub.
Familiarity with data visualisation tools like Power BI.