Flex Finance - We free business owners and finance teams in Africa from the stress of spend management. We make this aspect of business delightfully simple, saving businesses money and time and helping them grow and succeed. With Flex, you enjoy seamless digital requisitions, bulk payment transfers, smart budgeting, and expense tracking.
We are seeking a highly skilled Senior Data Engineer to join our engineering team and lead the design, development, and maintenance of our data infrastructure. You will be pivotal in building pipelines that power real-time financial analytics, risk management, and AI-driven decision-making across our product suite.
Key Responsibilities
Architect, build, and maintain scalable data pipelines and ETL processes from multiple data sources (transaction data, third-party APIs, internal services).
Optimize and manage our data lake and warehouse (e.g., AWS S3, Amazon Redshift, or similar) to support analytics and product features.
Partner with product, engineering, and finance teams to define data requirements and deliver accessible, clean, and reliable datasets.
Implement best practices for data governance, security, compliance (especially PCI DSS), and access control.
Monitor, debug, and improve the performance of our data infrastructure and tooling.
Mentor junior data engineers and contribute to technical design reviews.
Work closely with our AI/ML team to support experimentation, model training, and production inference with high-quality datasets.
Requirements
5+ years of experience in data engineering or backend engineering roles.
Deep proficiency in Python, SQL, and distributed data processing (e.g., Apache Spark, Airflow).
Hands-on experience with cloud data architecture (preferably AWS – Lambda, S3, Redshift, DynamoDB).
Experience building real-time and batch data pipelines.
Solid understanding of database systems (SQL and NoSQL), performance tuning, and data modeling.
Experience working with financial or transactional data is a plus.
Familiarity with compliance and regulatory requirements for data handling in fintech is a strong advantage.
Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
Nice to Have
Experience with GraphQL or REST API data integrations.
Exposure to business intelligence tools like Metabase, Looker, or Power BI.
Familiarity with AI/ML pipelines or experimentation frameworks.
Previous experience in a fast-paced startup or fintech environment.