We are looking for a skilled Data Engineer to join a high-impact, data-driven project involving the integration of real-time data feeds from external vendors. The ideal candidate will have hands-on experience building and maintaining scalable data pipelines, working closely with Business Analysts and QA teams to ensure data accuracy, alignment, and auditability.
This position requires on-site presence at the client's London office for 25%-50% of each month.
Key Responsibilities:
- Design, develop, and maintain robust data ingestion pipelines for processing real-time CSV or structured data files.
- Collaborate with Business Analysts to understand business logic and translate it into data models and transformations.
- Ensure high data quality and traceability across ingestion, transformation, and export processes.
- Work with QA engineers to validate data integrity and support downstream testing initiatives.
- Optimize pipeline performance and implement monitoring, error handling, and logging strategies.
- Document data structures, workflows, and dependencies clearly.
- Participate in sprint planning and agile ceremonies to align work with delivery targets.
Requirements:
- 8+ years of experience in Data Engineering or a related role.
- Strong proficiency in Python, SQL, and working with data integration tools or frameworks (e.g., Airflow, dbt, custom pipelines).
- Experience processing structured data formats such as CSV and working with real-time or near-real-time systems.
- Understanding of data architecture, ETL/ELT design patterns, and pipeline optimization techniques.
- Hands-on experience with cloud data platforms, particularly AWS and Azure.
- Comfortable collaborating with cross-functional teams including BAs, PMs, and QA engineers.
- Familiarity with version control (e.g., Git) and CI/CD workflows for data.
Nice to Have:
- Experience in the airline or travel industry.
- Knowledge of data governance, audit trail design, and regulatory compliance.
- Exposure to data warehousing solutions (e.g., Redshift, Snowflake, Synapse).
- Experience with streaming technologies such as Kafka or similar.
- Familiarity with containerization and orchestration (Docker and ECS) for data workflows.
- Exposure to BI tools such as Tableau or Power BI for data visualization.
- Understanding of machine learning pipelines and how they integrate with data engineering processes.
- Certification in cloud data engineering (e.g., AWS Certified Data Analytics).
What We'll Offer You In Return:
- The chance to join an organisation with triple-digit growth that is redefining how digital solutions are built.
- The opportunity to form part of an amazing, multicultural community of tech experts.
- A highly competitive compensation package.
- A flexible and remote working environment.
- Medical insurance.
Come and join our #ParserCommunity.