
Newsela

Contractor: Senior Analytics Engineer Services

🇲🇽 Remote - MX

🕑 Contractor

💰 TBD

💻 Data Science

🗓️ November 25th, 2025

Airflow dbt ETL

Edtech.com's Summary

Newsela is hiring a Contractor: Senior Analytics Engineer Services. The role involves applying software engineering best practices to analytics code, building and maintaining data pipelines, optimizing SQL queries, and transforming raw data into actionable business insights while collaborating with stakeholders and creating data visualizations.

Highlights
  • Build and maintain data pipelines, optimize SQL query performance, and transform raw data into business insights.
  • Use SQL and Python for data modeling, transformation, and analysis development.
  • Apply software engineering principles such as version control and continuous integration to analytics code.
  • Implement advanced data testing and validation checks to ensure data quality in ETL/ELT pipelines.
  • Collaborate with stakeholders to define business logic and drive data usage initiatives.
  • Develop data visualizations and support stakeholders in using visualization tools like Tableau.
  • Manage large-scale data migrations and build automation tools for data infrastructure.
  • Experience required: 6+ years working with data in software environments, proficiency in SQL, Python, dbt orchestration, and tools like Dagster or Airflow.
  • Technical stack includes relational datastores, DAG tooling, dbt, Tableau, cloud infrastructure (AWS, GCP, Terraform), and data cataloging and integrity frameworks.
  • Contract position not eligible for company-sponsored benefits.

Contractor: Senior Analytics Engineer Services Full Description

Seeking to hire a Contractor based in Mexico or Argentina for Senior-Level Analytics Engineering Services.

Scope of Services:
  • We are looking for a Contract Analytics Engineer to join our data team.
  • As a contractor, you will be responsible for applying software engineering best practices to analytics code to transform, test, and document data.
  • You will provide clean and organized data sets to end users.
  • You will be responsible for building and maintaining data pipelines, as well as optimizing SQL query performance for the models you build.
  • You will transform raw data into business insights, working closely with stakeholders and developing analyses to answer critical business questions (a minimal illustrative sketch follows this list).
  • You will create data visualizations and help stakeholders explore and understand the data visualization tools available to them.
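
For a concrete flavor of this transformation work, here is a minimal Python sketch of a pipeline step that turns raw event rows into a clean, analysis-ready dataset. The table, column names, and aggregation are invented for illustration; they are not Newsela's actual models.

```python
# Hypothetical illustration only: the schema and metric below are invented,
# not Newsela's actual models.
import pandas as pd

def build_daily_usage(raw_events: pd.DataFrame) -> pd.DataFrame:
    """Reshape raw event rows into one clean row per user per day."""
    cleaned = (
        raw_events
        .dropna(subset=["user_id", "event_ts"])  # drop rows missing key fields
        .assign(event_date=lambda df: pd.to_datetime(df["event_ts"]).dt.date)
    )
    # Aggregate to a tidy, end-user-ready table.
    return (
        cleaned
        .groupby(["user_id", "event_date"], as_index=False)
        .agg(events=("event_id", "count"))
    )

if __name__ == "__main__":
    raw = pd.DataFrame({
        "user_id": [1, 1, None, 2],
        "event_id": ["a", "b", "c", "d"],
        "event_ts": ["2025-01-01T10:00", "2025-01-01T11:00",
                     "2025-01-01T12:00", "2025-01-02T09:00"],
    })
    print(build_daily_usage(raw))
```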

Why you'll love this role:
  • Data Modeling and Transformation
    • Build new analyses and support existing ones using SQL and Python.
    • Apply software engineering principles like version control and continuous integration to the analytics codebase.
    • Expand our data warehouse with clean data ready for analysis.
  • Data Quality and Testing
    • Apply advanced data testing strategies to ensure resulting datastores are aligned with expected business logic.
    • Implement validation checks and automated testing procedures to manage data quality in your ETL/ELT pipelines (see the sketch after this list).
  • Collaboration and Communication
    • Work with stakeholders to define business logic and data expectations.
    • Help drive a change in the usage of data by actively surfacing insights to stakeholders.
    • Lead initiatives through problem definition, scoping, design, and planning.
  • Infrastructure and Automation
    • Build tools and automation to run data infrastructure.
    • Manage large-scale data migrations in relational datastores.
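
As a minimal sketch of the validation checks described above, the snippet below fails a pipeline run when a key column contains NULLs. The connection, table, and rule are hypothetical stand-ins; in a production pipeline, checks like this are commonly expressed as dbt tests or assertions in a data-integrity framework.

```python
# Hypothetical validation check for an ETL/ELT pipeline; the table and
# column names are placeholders, not Newsela's actual business logic.
import sqlite3

def assert_no_null_keys(conn, table: str, key_column: str) -> None:
    """Raise if the key column contains any NULLs, halting the pipeline."""
    query = f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    (null_count,) = conn.execute(query).fetchone()
    if null_count:
        raise ValueError(f"{table}.{key_column} has {null_count} NULL value(s)")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for the warehouse connection
    conn.execute("CREATE TABLE daily_usage (user_id INTEGER, events INTEGER)")
    conn.executemany("INSERT INTO daily_usage VALUES (?, ?)", [(1, 3), (2, 5)])
    assert_no_null_keys(conn, "daily_usage", "user_id")
    print("validation passed")
```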

Skills & Experience:
  • 6+ years of experience working with data in a software environment.
  • Expert proficiency in SQL and Python.
  • Advanced experience managing business semantic layer tooling, data catalog tooling, and data integrity testing frameworks.
  • Experience with dbt orchestration and best practices.
  • You have a track record of working autonomously, with deep domain knowledge of data systems.
  • Tech Stack: SQL, Python, relational datastores, DAG tooling (like Dagster or Airflow), dbt, and Tableau (an example orchestration sketch follows this list).
  • Experience with cloud-based infrastructure (AWS, GCP, Terraform) and document, graph, or schema-less datastores.
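
To illustrate the DAG tooling named in the stack, here is a minimal Apache Airflow (2.4+) sketch that runs `dbt run` before `dbt test`, so models are only tested after they build. The DAG id, schedule, and dbt project path are placeholders, not Newsela's actual setup.

```python
# Minimal dbt orchestration sketch for Apache Airflow 2.4+. The DAG id,
# schedule, and dbt project path are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test",
    )
    # Test only after models build successfully.
    dbt_run >> dbt_test
```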

Please note that given the nature of the contract, this role is not eligible for company-sponsored benefits.
 
#LI-Remote