
Newsela

Contractor: Senior Data Platform Engineering Services

🇲🇽 Remote - MX

🕑 Contractor

💰 TBD

💻 Software Engineering

🗓️ November 21st, 2025

Airflow DBT Python

Edtech.com's Summary

Newsela is hiring a Contractor: Senior Data Platform Engineering Services. This role involves working with app developers and data stakeholders to implement data system changes, respond to inquiries, and contribute to the design and planning of data initiatives. The contractor will build and maintain data pipelines and DAG tooling, and establish a data catalog with business-related metadata.

Highlights
  • Collaborate with developers and stakeholders to create and modify data systems.
  • Contribute to problem definition, design, and planning through epics and blueprints.
  • Build and maintain data pipelines and DAG tooling using technologies like Dagster or Airflow.
  • Establish and maintain a data catalog with business-relevant metadata.
  • Requires strong proficiency in SQL, Python, and relational datastores.
  • Experience with event-based pipelines, CDC tooling, and managing data migrations in relational datastores.
  • Advanced knowledge of data testing strategies and DBT orchestration best practices.
  • Skills in monitoring, health checks, and alerting on data systems and infrastructure automation.
  • Minimum of 5 years’ experience in data or software engineering or related fields.
  • Experience with cloud infrastructure (AWS, GCP, Terraform) and schema-less datastores is a plus.

Contractor: Senior Data Platform Engineering Services Full Description

Newsela is seeking to hire a Contractor based in Mexico, Brazil, Costa Rica, Colombia, Chile, or Argentina for Senior-Level Data Platform Engineering Services.

Scope of Services:
  • As a Contractor, you will work alongside app developers and data stakeholders to implement data system changes and respond to data inquiries.
  • You will contribute to initiatives and assist in problem definition, scoping, design, and planning through epics and blueprints.
  • You will utilize your domain knowledge to develop documentation and participate in technical presentations, discussions, and incident reviews.
  • You will build and maintain data pipelines and DAG tooling (a minimal sketch follows this list).
  • You will establish and maintain a data catalog with business-related metadata.
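To give a concrete picture of the pipeline and DAG work in scope, here is a minimal sketch of a daily ingest DAG, assuming Airflow 2.x (one of the orchestrators named under Skills & Experience); the DAG id, task names, and extract/load bodies are hypothetical placeholders, not Newsela's actual pipeline.

from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_ingest",  # hypothetical name, for illustration only
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
)
def example_daily_ingest():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull new rows from a source system (API, CDC feed, etc.).
        return [{"id": 1, "score": 0.9}]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write the rows to the warehouse.
        print(f"loaded {len(rows)} rows")

    load(extract())  # extract feeds load; Airflow infers the dependency


example_daily_ingest()

The same extract-then-load structure would map onto Dagster assets; the posting lists the two orchestrators interchangeably.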

Skills & Experience:
  • At least 5 years of experience in data engineering, software engineering, or a related field.
  • Strong proficiency in SQL, Python, and relational datastores (columnar and row-based databases).
  • Proficiency in building and maintaining data pipelines and DAG tooling (e.g., Dagster, Airflow).
  • Experience with event-based pipelines and CDC tooling.
  • Experience in managing data migrations in relational datastores.
  • Experience in optimizing SQL query performance.
  • Experience with data testing strategies to ensure resulting datastores align with expected business logic (see the sketch after this list).
  • Experience with DBT orchestration and best practices.
  • Experience with enabling monitoring, health checks, and alerting on data systems and pipelines.
  • Experience establishing and maintaining a data catalog with business-related metadata.
  • Experience building tools and automation to run data infrastructure.
  • Experience with writing and maintaining cloud-based infrastructure for data pipelines (AWS, GCP, Terraform) is a plus.
  • Experience with document, graph, or schema-less datastores is a plus.
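To make the data-testing expectation concrete, here is a minimal sketch of one such business-logic check written as a Python test, assuming a Postgres warehouse reachable through psycopg2; the table, column, and connection string are all hypothetical.

import psycopg2  # assumes a Postgres warehouse; the driver choice is illustrative


def test_no_future_enrollments() -> None:
    # Business-logic check: no enrollment should be dated in the future.
    conn = psycopg2.connect("postgresql://localhost/warehouse")  # hypothetical DSN
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM enrollments WHERE enrolled_at > NOW()")
            bad_rows = cur.fetchone()[0]
        assert bad_rows == 0, f"{bad_rows} enrollments are dated in the future"
    finally:
        conn.close()

In practice a check like this would more likely live as a DBT test or a pipeline task than a standalone script, so failures surface through the same monitoring and alerting the role is expected to maintain.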

Please note that, given the nature of the contract, this role is not eligible for company-sponsored benefits.
#LI-Remote