Data Engineer - GMU INNO

ABOUT US

We are one of the world's largest research companies and currently the only one primarily managed by researchers. With offices in 90 markets, Ipsos brings together research, implementation, methodological, and subject-matter experts from around the world, combining thematic and technical experts globally with local knowledge to deliver top-quality research.


Our culture is unique - we have the entrepreneurial spirit and quirkiness of a small boutique, but we also have the resources, scale and diversity of a large global agency. We are bright, friendly, hard-working and enthusiastic people, with a variety of interests, skills and experiences to learn from. Our values are based on our diverse backgrounds, helping us to be responsive, client-focused and flexible.


The Audience Measurement team at Ipsos uses a deep understanding of people to make sense of audiences and how they consume media. We use these insights to influence media strategy, helping clients answer crucial questions, such as how to target audiences, maximise attention across platforms, enhance audience experience, and demonstrate or increase audience value.
 

THE IMPACT OF THIS ROLE

Your work will be essential to the Route project's success. By building robust, scalable infrastructure and acting as the linchpin between the data platform and data science teams in an 'advisory-like' role, you will enable data scientists to focus on research and innovation in synthetic data, enhancing data-driven solutions for clients and Ipsos. You will often support the productionisation and scaling of local ML models, guiding the data science teams and providing guardrails as they develop and iterate.

 

WHAT WILL I BE DOING?

As a Data Engineer on the groundbreaking Route project, you'll develop and maintain the data infrastructure for a first-of-its-kind synthetic travel survey, generating audience figures for Out-of-Home media in Great Britain. Reporting to a Principal Data and Platform Engineer, you will collaborate with data scientists to design, build, and optimise data pipelines for high-quality audience measurement data.

Your key responsibilities will include:

  • Develop robust, scalable data pipelines and optimise data storage solutions on GCP, implementing ETL processes and CI/CD pipelines to ensure clean, structured data ready for use.
  • Work with data scientists to integrate synthetic models into production environments, providing guardrails and advice as the models develop.
  • Provide technical support, troubleshoot issues, and research new technologies to enhance capabilities.
  • Document pipelines and the platform, including architectures and user guides, helping to enforce data management standards.
  • Participate in agile ceremonies, with occasional client interaction.
  • Engage in DataOps practices and improve data delivery performance.
     


TECH STACK

  • GCP: GCS, BigQuery, GKE, Artifact Registry, Vertex AI, App Engine, Datastore, Secret Manager, Pub/Sub
  • Open-Source Tools: Argo Workflows, Argo Events, dbt, Elementary, Cerberus, Terraform, Jupyter Hub, Docker, Kubernetes
  • Coding Languages: Python, SQL, JavaScript (minimal)
  • CI/CD & PM Tools: Azure DevOps, Confluence

 

WHAT DO I NEED TO BRING WITH ME?

It is essential that your personal attributes complement your technical skills. To be successful in this role you will need the following skills and experience:
 

  • A minimum of 3 years' relevant experience building scalable data pipelines using Argo on GKE, SQL on BigQuery, and Python libraries such as Pandas.
  • Comfort with APIs and cloud storage systems.
  • Basic web application development knowledge using Flask, HTML, JavaScript.
  • Experience with containerisation (Docker) and orchestration (Kubernetes).
  • Familiarity with Terraform and data systems optimisation.
  • Commitment to data quality, and experience with synthetic data and AI/ML model deployment.
  • Excellent communication skills and a collaborative and positive mindset.
  • Willingness to learn.
  • GCP certifications are a plus.


If you believe you're a fit for this role, we encourage you to apply, even if you don't meet every requirement. We value applications from anyone who feels they could be right for the role, either now or with a little growth and guidance.

Interested applicants, kindly email your detailed resume to [email protected].

Only shortlisted candidates will be notified.