Filled
This offer is no longer available

Senior Data Engineer in Barcelona or Remote

Parser

Workplace: Remote
Hours: Full-Time
Internship: No

Job Description

This position offers you the opportunity to join a fast-growing technology organization that is redefining productivity paradigms in the software engineering industry. Thanks to our flexible, globally distributed operating model and the high caliber of our experts, we have enjoyed triple-digit growth over the past five years, creating amazing career opportunities for our people.


We are seeking a Data Engineer to join our team and focus on maintaining data streams and ETL pipelines. The ideal candidate will have experience in building and maintaining data pipelines, ensuring data consistency, and monitoring interfaces with upstream teams. This role is crucial for enabling seamless data flow and providing robust support for data-driven decision-making within the company.


If you want to accelerate your career working with like-minded subject matter experts, solving interesting problems, and building the products of tomorrow, this opportunity is for you.


The impact you'll make:


  • Develop new features for a large-scale software platform and applications.
  • Follow best practices and patterns in backend design and development.
  • Collaborate with cross-functional teams to define, design, and ship solutions.
  • Continuously discover, evaluate, and implement new technologies to maximize development efficiency.


Technology stack:


  • Programming language: Python.
  • Data pipeline tools (e.g., Apache Airflow, Prefect, Dagster).
  • Data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Database systems, both SQL and NoSQL.
  • Cloud platforms (AWS, GCP, Azure).
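
For candidates less familiar with the orchestration tools listed above, a minimal Airflow DAG in Python might look like the sketch below. This is purely illustrative: the DAG name, schedule, and task bodies are hypothetical and assume Airflow 2.4 or later; it does not describe this team's actual pipelines.

    # Illustrative sketch only: a tiny extract-transform-load DAG.
    # All names and data here are placeholders, not Parser's pipelines.
    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_daily_etl():
        @task
        def extract() -> list[dict]:
            # Pull raw records from an upstream source (placeholder data).
            return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

        @task
        def transform(records: list[dict]) -> list[dict]:
            # Apply a simple transformation step.
            return [{**r, "value": r["value"] * 2} for r in records]

        @task
        def load(records: list[dict]) -> None:
            # A real pipeline would write to a warehouse such as
            # Snowflake, Redshift, or BigQuery.
            print(f"loaded {len(records)} records")

        load(transform(extract()))

    example_daily_etl()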

Key Responsibilities:


  • Build and maintain data pipelines and ETL processes.
  • Perform data consistency checks and ensure data integrity across all data streams.
  • Monitor and troubleshoot data pipeline issues, ensuring minimal downtime and data loss.
  • Align and monitor interfaces with upstream teams to ensure uninterrupted data flow.
  • Collaborate with data scientists and analysts to support data initiatives.
  • Maintain and support data platform infrastructure.
  • Implement best practices for data management, security, and compliance.
  • Gain a deep understanding of data and its business context to enhance data solutions.
  • Document data processes, workflows, and system architecture to facilitate knowledge sharing and continuity.
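
As a purely illustrative sketch of the consistency checks mentioned above: the function below compares row counts between a source table and its loaded copy. The table names are hypothetical and an in-memory SQLite database stands in for a real warehouse.

    # Illustrative sketch only: a minimal row-count consistency check.
    import sqlite3

    def row_count(conn: sqlite3.Connection, table: str) -> int:
        # Count rows in the given table.
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    def check_consistency(conn: sqlite3.Connection, source: str, target: str) -> bool:
        # Consider a load consistent if source and target row counts match.
        return row_count(conn, source) == row_count(conn, target)

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE raw_events (id INTEGER)")
        conn.execute("CREATE TABLE loaded_events (id INTEGER)")
        conn.executemany("INSERT INTO raw_events VALUES (?)", [(1,), (2,), (3,)])
        conn.executemany("INSERT INTO loaded_events VALUES (?)", [(1,), (2,), (3,)])
        print("consistent:", check_consistency(conn, "raw_events", "loaded_events"))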


What you'll bring to us:


  • Bachelor’s degree in Computer Science, Data Science, or related field.
  • 8–9 years of experience in data engineering or a related role.
  • Proficiency with SQL and NoSQL databases.
  • Experience with data pipeline tools (e.g., Apache Airflow, Prefect, Dagster).
  • Basic programming skills (Python, Java, etc.).
  • Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
  • Strong problem-solving skills and attention to detail.
  • Experience with cloud platforms (AWS, GCP, Azure).
  • Knowledge of data governance and best practices in data security and privacy.
  • Excellent written and verbal English communication skills, and ability to work effectively in a collaborative team environment.


Some of the benefits you’ll enjoy working with us:


  • The chance to join an organization with triple-digit growth that is changing the paradigm of how software products are built.
  • The opportunity to form part of an amazing, multicultural community of tech experts.
  • A highly competitive compensation package.
  • Medical insurance.


 

About Parser

