Senior Data Engineer (Python/AWS) in Barcelona or Remote

Parser

Workplace: Remote
Hours: Full-Time
Internship: No

Job Description

This position offers you the opportunity to join a fast-growing technology organization that is redefining productivity paradigms in the software engineering industry. Thanks to our flexible, globally distributed operating model and the high caliber of our experts, we have enjoyed triple-digit growth over the past five years, creating amazing career opportunities for our people.


Key Responsibilities

  • Audit existing data structures, pipelines, and storage to identify improvement opportunities.
  • Redesign or optimize data architecture using AWS native tools (S3, Glue, Athena, Redshift, EMR, Lambda, Step Functions).
  • Build and maintain automated ingestion and transformation pipelines (Python, PySpark).
  • Implement data validation, retention policies, purging strategies, and backup workflows.
  • Optimize query performance, file formats, partitioning, and cost efficiency.
  • Lead or support compliance initiatives (GDPR, access controls, auditability).
  • Create automated or semi-automated CSV reports for business stakeholders.
  • Improve data transparency and observability using CloudWatch, Glue Data Catalog, and lineage tools.
  • Document processes, standards, and architecture changes.


Requirements

  • 8+ years of experience in data engineering.
  • Strong proficiency in Python, SQL, and distributed data processing.
  • Deep hands-on expertise with AWS big data services: S3, Glue, Athena, Redshift, EMR, Lambda, Step Functions, IAM.
  • Experience with Lakehouse or Medallion-style data architectures.
  • Knowledge of data governance, retention, security, and compliance best practices.
  • Experience with orchestration tools (Airflow, Glue Workflows, Step Functions).
  • Familiarity with CI/CD and infrastructure as code (CloudFormation, Terraform, CDK).
  • Strong communication skills and ability to work remotely.
  • Deep understanding of serverless architecture patterns and best practices.
  • Experience with DynamoDB data modeling, indexing, and query optimization.
  • Knowledge of microservices and event-driven architecture in serverless environments.
  • Experience with CI/CD pipelines for serverless applications.
  • Excellent written and verbal English communication skills.
  • Ability to work effectively and proactively in a collaborative team environment.
  • Proficiency in TypeScript, JavaScript, Node.js, AWS Lambda, and API Gateway (.NET and C# are a plus).


Preferred

  • Experience with Databricks / Delta Lake.
  • Experience with backend integration (FastAPI or similar).
  • Background in ESG, climate analytics, or data governance.


Location

  • Remote; based in Spain or Portugal.


Some of the benefits you’ll enjoy working with us

  • The chance to join an organization with triple-digit growth that is changing the paradigm of how software products are built.
  • The opportunity to form part of an amazing, multicultural community of tech experts.
  • A highly competitive compensation package.
  • Medical insurance.
 

About Parser

