Role: Senior Data Engineer
Position Type: Full-Time Contract (40 hrs/week)
Contract Duration: Long Term
Work Schedule: 8 hours/day (Mon-Fri)
Work Timezone: US Time
Location: 100% Remote (candidates can work from any LATAM country)
What You'll Be Doing
We're looking for a highly skilled Senior Data Engineer to lead the development of scalable data pipelines and modern data warehouse solutions. This role will be pivotal in architecting, implementing, and optimizing high-performance data workflows using tools such as SQL, Python, DBT, Databricks, and Snowflake.
You'll partner closely with stakeholders across the organization to understand complex business problems and translate them into robust, data-driven solutions. This is a hands-on role for a data engineering expert who thrives in a collaborative, fast-paced environment.
Key Responsibilities
Design, build, and maintain robust data pipelines for ingesting and transforming large datasets from multiple sources.
Develop and optimize SQL and DBT code to model, transform, and load data into a centralized data warehouse.
Lead the design and implementation of modern data architectures using Databricks or Snowflake.
Write efficient, reusable, and scalable Python code to support ETL/ELT workflows.
Collaborate with data scientists, analysts, and business stakeholders to deliver clean, reliable, and well-governed data.
Advocate for and implement automated testing, monitoring, and deployment pipelines.
Champion data quality, governance, and best practices across the team.
Solve complex technical challenges across various layers of the data stack.
Mentor and support junior team members in their technical growth.
Qualifications
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field.
8+ years of hands-on experience in data engineering, with a proven track record of delivering scalable solutions.
Strong expertise in SQL, including performance tuning and complex query development.
Proven experience with Databricks or Snowflake in a production environment.
Hands-on experience developing data pipelines and ETL/ELT workflows using DBT and Python.
Solid understanding of data modeling techniques (e.g., 3NF, dimensional modeling).
Experience with cloud platforms such as Azure, AWS, or GCP.
Familiarity with infrastructure-as-code tools like Terraform.
Passion for data quality, security, privacy, and governance.
Strong communication and problem-solving skills, with the ability to work cross-functionally.