  • Posted: Feb 25, 2026
    Deadline: Mar 4, 2026
  • We run a Business Management System, and we have also branched out into, and are well versed in, the Telemedicine, Logistics, and Agriculture industries, as well as Travel and Tourism.

     

    Senior Data Engineer

    Engagement Type: Fixed-Price Contract (3–4 Weeks + 1 Week Post-Launch Support)

    Overview

    We are seeking an exceptionally strong Data Engineering Specialist to build a production-grade operational monitoring and analytics layer on top of our existing Google Cloud data stack.

    This is not a discovery-heavy engagement.
    The project is fully specified and documented. All alert logic, thresholds, dashboard requirements, schemas, and technical expectations are defined.

    We need someone who can:

    • Read detailed specifications
    • Translate them into clean, production-ready data workflows
    • Implement advanced SQL logic in BigQuery
    • Build robust Dagster jobs
    • Create operational dashboards that drive decision-making
    • Deliver within a tight timeline

    This role is execution-focused and suitable only for engineers with deep experience in data systems and production orchestration.

    What You Will Build

    You will deliver two production systems on top of our existing BigQuery warehouse.

    Part A — Automated Operational Alert System

    An automated monitoring system that continuously evaluates the health of our order pipeline and triggers notifications when predefined conditions occur.

    Core Responsibilities

    You will:

    • Implement 11 automated alert rules as scheduled Dagster jobs
    • Write advanced SQL queries against BigQuery (window functions, CTEs, rolling baselines, joins across multiple datasets)
    • Implement anomaly detection logic using rolling comparisons
    • Integrate Slack API for structured, formatted alerts
    • Integrate SMS notifications (Twilio or similar) for critical-level alerts
    • Implement structured logging and audit trails
    • Store configurable alert thresholds in BigQuery (no hard-coded values)
    • Ensure system reliability, idempotency, and clean error handling
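
    As a hedged illustration of the kind of rule described above (not the project's actual implementation), one alert rule — delayed orders with a structured Slack payload — might be sketched in plain Python. All names here (order fields, the threshold parameter) are hypothetical; per the spec, the threshold would be read from a BigQuery config table and the check wrapped in a scheduled Dagster op.

```python
# Hedged sketch: one alert rule (delayed orders) plus a structured
# Slack Block Kit payload. Field names and the threshold source are
# illustrative; in production the threshold comes from a BigQuery
# config table and the check runs inside a scheduled Dagster op.
from datetime import datetime, timedelta, timezone

def evaluate_delayed_orders(orders, threshold_hours):
    """Return orders whose last stage update is older than the threshold."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=threshold_hours)
    return [o for o in orders if o["last_update"] < cutoff]

def format_slack_alert(delayed):
    """Build a Slack Block Kit payload summarising the delayed orders."""
    lines = "\n".join(
        f"- `{o['order_id']}` stuck in *{o['stage']}*" for o in delayed
    )
    return {
        "blocks": [
            {"type": "header",
             "text": {"type": "plain_text",
                      "text": f"Delayed orders: {len(delayed)}"}},
            {"type": "section",
             "text": {"type": "mrkdwn", "text": lines}},
        ]
    }
```

    In production the payload would be POSTed to a Slack incoming webhook, and a critical-severity variant of the same rule would also trigger the SMS path.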

    Alert Categories Include

    • Delayed orders
    • Stuck pipeline stages
    • SLA breaches
    • Partner performance degradation
    • Volume anomalies vs rolling baselines
    • Failure spikes
    • Operational bottlenecks

    The alert system must be production-ready, observable, and maintainable.
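
    As a rough sketch of the rolling-baseline idea behind the volume-anomaly category (the real logic is specified in BigQuery SQL with window functions; the window length and 3-sigma default below are illustrative assumptions only):

```python
# Hedged sketch of rolling-baseline anomaly detection, mirroring what
# a BigQuery query with AVG()/STDDEV() window functions over a
# trailing window would compute. The sigma threshold is a parameter,
# not hard-coded, matching the configurable-thresholds requirement.
from statistics import mean, stdev

def volume_anomaly(baseline_counts, today_count, sigma_threshold=3.0):
    """Flag today's volume if it deviates from the trailing baseline
    by more than `sigma_threshold` standard deviations.

    baseline_counts: daily order volumes for the trailing window
    (e.g. the previous 14 days), oldest first.
    Returns (is_anomalous, z_score).
    """
    baseline = mean(baseline_counts)
    spread = stdev(baseline_counts)
    if spread == 0:
        return today_count != baseline, 0.0
    z = (today_count - baseline) / spread
    return abs(z) > sigma_threshold, z
```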

    Part B — Operational Dashboard (Looker Studio)

    You will build a 4-page operational dashboard connected directly to BigQuery.

    This dashboard will be used daily by the operations team to monitor pipeline health and partner performance.

    Dashboard Pages

    Real-Time Operations Overview

    • Pipeline status
    • Active orders by stage
    • SLA compliance summary
    • Stuck orders
    • Critical alerts

    Partner Performance Tracking

    • Doctors, pharmacies, carriers
    • Trend analysis
    • SLA adherence
    • Throughput metrics

    Historical Trends & Anomaly History

    • Rolling baselines
    • Alert frequency trends
    • Performance over time

    Detailed SLA Compliance Reporting

    • Breach tracking
    • Time-to-resolution metrics
    • Drill-down capability

    Across all pages, the dashboard must include:

    • Filters
    • Drill-down capability
    • Trend visualizations
    • Clear operational storytelling

    Technology Stack (Already in Place)

    You will work within our existing infrastructure.

    • Google BigQuery – Primary data warehouse
    • Dagster – Production orchestration tool
    • Python – For alert logic and orchestration
    • SQL (Advanced) – Complex transformations and aggregations
    • Looker Studio – Dashboard development
    • Slack API – Alert notifications
    • SMS (Twilio or equivalent) – Critical alerts
    • Google Cloud Platform (GCP) – Cloud environment

    No infrastructure setup required. You will build within our existing systems.

    Mandatory Requirements (Non-Negotiable)

    Only apply if you meet ALL of the following:

    Technical

    • Strong Python in a data engineering context (not web development)
    • Advanced SQL proficiency:
        • Window functions
        • CTEs
        • Rolling aggregations
        • Complex joins
        • Performance optimization in BigQuery
    • Proven Google BigQuery experience
    • Experience with Dagster OR strong Airflow/Prefect experience (Dagster preferred)
    • Experience building multi-page Looker Studio dashboards connected to BigQuery
    • Slack API or webhook integration experience
    • Experience working within Google Cloud Platform

    Professional

    • Ability to read detailed specifications and execute independently
    • Strong production mindset (logging, error handling, idempotency)
    • Clear communication in async environments
    • Structured documentation habits
    • Experience delivering under time-bound milestones

    If you do not have hands-on BigQuery and orchestration experience, this role is not suitable.

    Nice-to-Have (Strong Advantage)

    • Healthcare or e-commerce pipeline experience
    • Shopify data familiarity
    • SMS integrations (Twilio)
    • Experience building operational monitoring systems
    • Experience designing anomaly detection logic
    • Experience with SLA tracking frameworks

    Timeline & Milestones

    This is a 3-week build with a 1-week support buffer.

    Week 1 (Days 1–5)

    • Environment familiarization
    • Core alert rule implementation
    • Slack integration
    • Initial production deployment

    Milestone: Core alerts live

    Week 2 (Days 6–9)

    • Anomaly detection logic
    • SMS integration
    • Logging framework
    • Configurable thresholds
    • Full system testing

    Milestone: Alert system complete

    Week 3 (Days 10–13)

    • 4-page Looker Studio dashboard
    • Testing and validation
    • Documentation and handover

    Milestone: Dashboard live, project complete

    Post-Launch (1 Week)

    • Threshold tuning
    • Minor refinements
    • Operational adjustments

    Engagement Structure

    • 1-month fixed-price contract
    • Nigeria-based consultant
    • Fully remote
    • Daily async updates via email
    • Full system access provided on Day 1
    • Detailed specification documents shared with shortlisted candidates
    • 1-week post-launch support required

    What You Must Submit

    To be considered, your application must include:

    • Your CV
    • A concise summary of relevant experience with:
        • BigQuery
        • Dagster/Airflow/Prefect
        • Alerting systems
        • Looker Studio dashboards
    • An example of a production pipeline you built:
        • Dagster or Airflow preferred
        • Brief explanation of the architecture
    • An example of a Looker Studio dashboard:
        • Screenshot or link
        • Brief explanation of the data model
    • Your fixed-price quote
    • Confirmation of availability for the 3-week timeline

    Applications without concrete examples will not be reviewed.

    Ideal Candidate Profile

    You are:

    • A senior-level data engineer
    • Comfortable working directly in BigQuery
    • Experienced with orchestration tools
    • Detail-oriented and systematic
    • Able to build reliable monitoring systems
    • Strong in operational thinking
    • Comfortable working independently with minimal hand-holding

    If you are confident in your ability to execute this project at a high technical standard within the stated timeline, we encourage you to apply.

    This is a high-impact build that will directly power our operational decision-making systems.


    Method of Application

    Interested and qualified candidates should forward their CV to hrng@turbham.com, using the position as the subject of the email.

