Remote Data Engineer (Senior, E-Commerce, Google Cloud, LATAM) (Resistencia)

23 Apr | Evolvesquads | Resistencia

Overview

Create world-changing products using God-given talents.

PROJECT DESCRIPTION

Our client began as a family-run women’s apparel business in the late 1930s. Over the decades, the company has evolved into a nationally recognized fashion retailer focused on helping customers feel confident and well dressed for special moments, nights out, and everyday occasions. What started as a small operation has grown into a large retail organization with hundreds of locations, a growing team, and an ongoing expansion strategy.

They are currently looking for a proactive Data Engineer to join the team. If you’re passionate and think you have what it takes to carry this legacy forward, we encourage you to apply.

PROJECT STACK AND TEAM

We are seeking a highly skilled Senior Data Engineer who can own the end-to-end design, development, and operation of robust data pipelines with minimal supervision. This is an individual contributor role: you will be the organization’s sole data engineer, responsible for delivering practical, production-ready solutions, with no direct people management or formal mentoring responsibilities.

The ideal candidate will have extensive hands-on experience with BigQuery, Python-based data pipelines, and cloud-native orchestration. Knowledge of Airbyte and dbt is a plus, and cloud-based machine learning experience is an asset.

- Our client is a GCP shop, and they also leverage AWS and Azure to support their operations.
- Strong experience working on e-commerce platforms.
- Hands-on experience with Shopify.
- Contract: Long-Term
- Location: LATAM
- Start Date: ASAP
- The core team is based in the Los Angeles area, with additional developers located across LATAM in Brazil, Argentina, and Colombia. Working hours will follow the Pacific Time Zone.

MAIN REQUIREMENTS

- 5+ years of hands-on data engineering experience, with a proven track record of owning data pipelines in production.
- Strong expertise in Google Cloud Platform (GCP), including:
  - BigQuery (advanced SQL, partitioning, clustering; BI Engine familiarity a plus)
  - Cloud Composer (Apache Airflow) or equivalent workflow orchestration
  - Pub/Sub (Cloud Pub/Sub) for event-driven data ingestion
- Proficient Python developer with extensive experience building data pipelines, transformations, and automation.
- Deep experience extracting data from:
  - GraphQL endpoints
  - REST APIs
  - Relational and/or NoSQL databases
  - Flat files (CSV, JSON, Parquet, etc.)
- Demonstrated ability to design and implement scalable ETL/ELT pipelines and maintain them in production.
- Strong SQL skills with the ability to optimize BigQuery queries; understanding of data lakehouse concepts.
- Excellent problem-solving, communication, and stakeholder-management skills.
- Ability to work independently, set priorities, meet deadlines, and drive initiatives with minimal guidance.

GOOD TO HAVE

- Experience with Airbyte for data ingestion and connectors.
- Experience with dbt (data build tool) for transformations and data modeling.
- Familiarity with orchestration patterns, CI/CD for data pipelines, and versioning of data assets.
- Experience with data quality frameworks and testing (e.g., dbt tests, Great Expectations).
- Knowledge of multi-cloud or hybrid data architectures.
- Experience with data instrumentation, monitoring, and SRE practices for data pipelines.
- Exposure to cloud-based ML services and ML data workflows (Vertex AI, AutoML, etc.).

CLOUD ML (NICE TO HAVE)

- Experience or familiarity with cloud-based machine learning services (e.g., Vertex AI, Cloud AI Platform) and integrating data pipelines with ML workflows.
- Building or supporting data-prep pipelines for ML model training, feature stores, and model inference data routing.

QUALIFICATIONS (PREFERRED)

- 5+ years of hands-on data engineering experience delivering production-grade pipelines.
- Proven ability to drive end-to-end data initiatives with business impact.
- Excellent communication skills and the ability to work cross-functionally with product, analytics, and data science teams.

JOB RESPONSIBILITIES

- Design, implement, and own scalable ETL/ELT data pipelines from GraphQL endpoints, REST APIs, databases, and flat files into BigQuery.
- Lead the architecture and implementation of data models, schemas, partitioning, and clustering to optimize performance and cost.
- Build reusable, maintainable data pipelines using Python, with a strong emphasis on reliability, observability, and quality.
- Develop and enforce data quality checks and monitoring.
