Data Engineer
- Closed
- US Company | Medium
- LATAM (100% remote)
- 3+ years
- Long-term (40h/week)
- Advertising Services
- Full Remote
Required skills
- Python
- SQL
Requirements
Must-haves
- 2+ years of data engineering experience
- Experience with Python for data engineering work involving ETL/ELT pipelines and related components
- Proficiency with SQL, Python and other data-focused languages
- Ability to design scalable solutions, evaluate emerging data technologies and anticipate new trends to address complex challenges
- Strong communication skills in both spoken and written English
Nice-to-haves
- Startup experience
- Familiarity with Snowflake
- Familiarity with AWS
- Experience with dbt, Dagster, Apache Iceberg, or Infrastructure as Code
- Knowledge of scalable data lake and streaming patterns
- Bachelor's Degree in Computer Engineering, Computer Science, or equivalent
What you will work on
- Contribute to core data engineering initiatives within an experienced data team
- Explore and interpret multiple interconnected datasets to identify relationships and develop high-quality data models
- Collaborate with the data engineering group to improve and optimize transformation workflows
- Design, build, and operationalize large-scale data solutions using AWS services (EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) and third-party technologies such as Spark and Snowflake
- Develop production-grade data pipelines from ingestion to consumption using SQL and Python
- Implement data engineering, ingestion and curation components on AWS using native or custom tooling
- Lead proofs of concept and support the transition of validated solutions into scalable production environments
- Partner with analytics teams that use Looker, QuickSight, and Q to deliver clean, reliable datasets