Contract
Feb. 26, 2026
Remote
Lead Data Engineer
JOB ID
LVDE2601
VISA STATUS
Only EU/CH Citizens
REMOTE OPTION
100%
Job details:
- Full-time
- Remote
- Start date: asap
- Duration: 6 months + extensions
- Possible occasional travel to the UK / Latvia
- Large UK media company
Scope:
The Lead Data Engineer will support the Product Data Domain teams. You will build ETL pipelines to ingest and transform data, developing the data products that power key use cases across the company. You will work in an agile, multi-disciplinary team alongside product analytics developers, product data managers, data modellers and data operations managers, ensuring that all work delivers maximum value to the company.
- Lead the architecture and development of robust, scalable data pipelines to ingest, transform, and analyse large volumes of structured and unstructured data from diverse sources. Pipelines must be optimised for performance, reliability, and scalability in line with the client's scale.
- Lead initiatives to enhance data quality, governance and security across the organisation, ensuring compliance with guidelines and industry best practices.
- Prioritise stakeholders’ requirements and identify the best solution for timely delivery.
- Lead the building of automation workflows, including monitoring and alerting.
- Encourage and mentor team members, in partnership with other disciplines, to create value with data across the wider organisation.
- Help set standards for coding, testing and other engineering practices.
- Lead the building and testing of business continuity and disaster recovery procedures per requirements.
- Proactively evaluate and provide feedback on future technologies and new releases/upgrades based on deep understanding of the domain.
Requirements:
- Extensive (5+ years) experience in a data engineering or analytics engineering role, preferably in digital products.
- Extensive experience in building ETL pipelines and ingesting data from a diverse set of sources (including event streams and various forms of batch processing).
- Excellent SQL and Python skills.
- Extensive experience with AWS.
- Good working knowledge of Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake).
- Experience in deploying and scheduling code bases in a data development environment, using technologies such as Airflow.
- Demonstrable experience of working alongside cross-functional teams interacting with Product Managers, Infrastructure Engineers, Data Scientists, and Data Analysts.
- Fluent English.
Teamwork and stakeholder management:
- Ability to listen to others' ideas and build on them.
- Ability to clearly communicate to both technical and non-technical audiences.
- Ability to collaborate effectively, working alongside other team members towards the team’s goals, and enabling others to succeed, where possible.
- Ability to prioritise, with a structured approach and the ability to bring others on the journey.
- Strong attention to detail.
SNI sp. z o.o. will process personal data for the purpose of the recruitment process in accordance with its Data Privacy Policy. The data may also be stored and processed for future recruitment purposes, in accordance with the given consent.