Ready to accelerate your career?
Clara is the fastest-growing company in Latin America. We've built the leading solution for companies to make and manage all their payments. We already help over 20,000 large and growing businesses operate with agility and financial clarity through locally issued corporate cards, bill pay, financing, and a powerful B2B platform built for scale.
Clara is backed by some of the most successful investors in the world, including top regional VCs like monashees, Kaszek, and Canary, and leading global funds like Notable Capital, Coatue, DST Global Partners, ICONIQ Growth, General Catalyst, Citi Ventures, SV Angel, Citius, Endeavor Catalyst, and Goldman Sachs - in addition to dozens of angel investors and local family offices.
We’re building the financial infrastructure that powers high-performing organizations across the region. We invite you to join us if you want to be part of a fast-paced environment that will accelerate your career and support you to do some of the best work of your life alongside a passionate and committed team distributed across the Americas.
Senior Data Engineer (Ingeniero de Datos Senior)
Clara's Data Engineering Team is looking for a highly professional, experienced, and innovative Senior Data Engineer to architect and build robust data infrastructure that powers our analytics, reporting, and emerging AI capabilities. You'll work alongside developers, data analysts, and data scientists to create, deploy, and maintain complex banking and financial data systems that enable data-driven decision making across the organization.
Responsibilities
Your main focus will be to design and build reliable data pipelines that serve analytics teams, business stakeholders, and AI applications. You'll leverage modern tooling (including AI-assisted development) to efficiently develop data infrastructure, improve existing processes, create data models, and mentor junior team members. You will own the maintenance and operation of these systems, collaborating across teams to deliver high-quality data quickly across the company.
Core Responsibilities:
Data Pipeline Engineering & Analytics Infrastructure
- Build, integrate, and maintain batch and streaming data pipelines following principles of reliability, scalability, and maintainability
- Design and implement data models for analytics, reporting, and business intelligence using dimensional modeling and lakehouse architectures
- Ensure data quality, lineage, and monitoring across all pipelines, detecting and resolving issues proactively
- Implement data lake/lakehouse architectures following best practices: partitioning strategies, zone organization (bronze/silver/gold), avoiding data swamps
- Integrate data from diverse sources: databases, APIs, event streams (Kafka, Kinesis), third-party services, and CDC processes
- Optimize query performance and data access patterns for analytics workloads and dashboard performance
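As a flavor of the medallion (bronze/silver/gold) zone organization mentioned above, here is a minimal, illustrative sketch. Plain Python stands in for Spark and Delta Lake, and the records and field names are hypothetical:

```python
# Illustrative medallion-zone sketch: bronze (raw) -> silver (clean) -> gold (aggregate).
# Plain Python stands in for Spark/Delta Lake; the schema is hypothetical.

raw_events = [  # bronze: data as ingested, duplicates and bad rows included
    {"id": 1, "amount": "120.50", "currency": "MXN"},
    {"id": 1, "amount": "120.50", "currency": "MXN"},  # duplicate
    {"id": 2, "amount": None, "currency": "MXN"},      # missing amount
    {"id": 3, "amount": "75.00", "currency": "BRL"},
]

def to_silver(bronze):
    """Silver zone: deduplicated, typed, validated records."""
    seen, silver = set(), []
    for row in bronze:
        if row["id"] in seen or row["amount"] is None:
            continue  # drop duplicates and rows failing validation
        seen.add(row["id"])
        silver.append({**row, "amount": float(row["amount"])})
    return silver

def to_gold(silver):
    """Gold zone: business-level aggregate, e.g. spend per currency."""
    totals = {}
    for row in silver:
        totals[row["currency"]] = totals.get(row["currency"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(raw_events))
print(gold)  # {'MXN': 120.5, 'BRL': 75.0}
```

The point of the pattern is that each zone has a contract: bronze preserves everything as ingested, silver enforces types and quality rules, and gold serves curated business metrics.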
Modern Data Platform & AI-Ready Infrastructure
- Build feature pipelines and data scaffolding that enable ML model training and AI applications for internal teams and external clients
- Design reusable data frameworks and patterns that accelerate both analytics and AI development
- Implement data APIs and microservices that expose curated datasets to downstream consumers
- Create observability systems for data quality, freshness, and pipeline health across all use cases
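To give a concrete sense of the observability work described above, here is a minimal freshness-check sketch: flag a dataset whose latest update exceeds its SLA. The dataset name and SLA value are hypothetical:

```python
# Minimal data-freshness check: flag datasets whose last update exceeds an SLA.
from datetime import datetime, timedelta, timezone

# Hypothetical dataset -> freshness SLA mapping
FRESHNESS_SLA = {"payments_daily": timedelta(hours=24)}

def check_freshness(dataset, last_updated, now=None):
    """Return lag and health status for one dataset."""
    now = now or datetime.now(timezone.utc)
    lag = now - last_updated
    return {
        "dataset": dataset,
        "lag_hours": lag.total_seconds() / 3600,
        "healthy": lag <= FRESHNESS_SLA[dataset],
    }

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
stale = check_freshness(
    "payments_daily",
    last_updated=datetime(2024, 1, 1, 0, 0, tzinfo=timezone.utc),
    now=now,
)
print(stale["healthy"])  # False: 36 hours old against a 24-hour SLA
```

In practice a check like this would run on a schedule and emit metrics or alerts rather than print, but the shape of the signal (lag versus SLA) is the same.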
Development Excellence & AI-Assisted Productivity
- Leverage AI tools (coding assistants, LLMs) daily to accelerate development, improve code quality, and increase productivity
- Implement CI/CD pipelines with automated testing, validation, and deployment for data assets
- Champion DevOps culture and best practices within the team
- Document your work thoroughly to create solid foundations for team members and future reference
Collaboration & Leadership
- Collaborate with data analysts to understand reporting requirements and optimize data models for BI tools
- Partner with data scientists and ML engineers on feature engineering and model data requirements
- Work with product and engineering teams to meet company goals and deliver data products
- Mentor junior and mid-level engineers through code reviews, pair programming, and knowledge sharing
- Lead architectural discussions and contribute to long-term platform strategy
- Enforce data security and data privacy best practices across all systems
Requirements
Core Technical Skills:
- Advanced proficiency in Python and SQL with strong performance optimization skills
- Experience with Databricks (Spark, Delta Lake, Unity Catalog, workflows)
- Deep experience with data warehouse/lakehouse platforms (Databricks, Redshift, BigQuery, Snowflake) and dimensional data modeling
- Strong understanding of distributed systems and Spark architecture (PySpark optimization, partitioning strategies)
- Proven experience building complex data pipelines (ETL/ELT) with orchestration tools (Airflow, AWS Glue, Step Functions, Dagster)
- Experience with big data technologies: Spark, Hive, Kafka, and understanding of column vs row-based storage
- Hands-on experience integrating data from databases, APIs, event streams (Kinesis, Kafka), and CDC processes
- Proficiency with CI/CD tools (GitHub Actions, GitLab CI, Jenkins) and Git workflows
- Experience with modern data stack tools (Databricks tooling is a strong plus)
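Since CDC integration appears in both the responsibilities and the requirements, here is a small sketch of applying a change feed to a target table. The op codes loosely mirror what tools like Debezium emit, but the record schema and table here are hypothetical:

```python
# Sketch of applying a CDC change feed (create/update/delete ops) to a target table.
# Op codes loosely follow Debezium conventions; the schema is hypothetical.

changes = [
    {"op": "c", "id": 102, "status": "active"},   # create
    {"op": "u", "id": 101, "status": "blocked"},  # update
    {"op": "d", "id": 102},                       # delete
]

def apply_cdc(table, feed):
    """Apply each change in order; creates and updates are treated as upserts."""
    for change in feed:
        key = change["id"]
        if change["op"] == "d":
            table.pop(key, None)  # delete is idempotent on missing keys
        else:
            table[key] = {k: v for k, v in change.items() if k != "op"}
    return table

target = {101: {"id": 101, "status": "active"}}
print(apply_cdc(target, changes))
# {101: {'id': 101, 'status': 'blocked'}}
```

In a real pipeline this logic would typically be expressed as a MERGE against a Delta table keyed on the primary key, with ordering guarantees handled by the change feed's sequence metadata.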
AI & Analytics Capabilities:
- Understanding of how to structure data for both analytics and ML use cases
- Familiarity with MLOps concepts: feature stores, model training pipelines, inference data requirements
- Experience leveraging AI tools to improve development velocity and code quality
Nice to Have
- Proficiency in Scala for advanced Spark development
- Experience with feature stores or ML data infrastructure (Databricks Feature Store, Feast, Tecton)
- Experience with Lambda or Kappa architectures for unified batch/streaming processing
- Experience in Data Governance frameworks and data cataloging
- Hands-on experience with CDC processes (Debezium, AWS DMS)
- Experience with vector databases or embedding pipelines for AI applications
- Familiarity with dbt for analytics engineering and data transformation
- Knowledge of LLM application patterns (RAG, embeddings, semantic search)
What Makes You Stand Out
Technical Excellence
- You leverage AI tools daily (GitHub Copilot, Claude, ChatGPT) to create projects and unlock superpowers in your workflow
- You're a problem solver with creativity, able to design elegant solutions to complex data challenges
- You have exceptional attention to detail—critical in a financial institution handling sensitive customer data
- You're knowledgeable across multiple programming languages and eager to learn new technologies
- You stay current with emerging technologies and proactively identify opportunities to implement them
Collaboration & Ownership
- Excellent at working in distributed teams with strong communication skills (async and real-time)
- You take ownership and act like an owner, because you are one
- You're dedicated and deeply responsible in every aspect of your work
- You're innovative, supportive of others, and take initiative
Strategic Impact
- Strategic thinking: you'll be part of long-term architectural decisions that shape our data platform
- Ability to stay focused, set clear targets, and visualize paths to achieve ambitious goals
- You balance shipping quickly with building sustainable, maintainable systems
Why join Clara
At Clara, you’ll have the autonomy, speed, and support to make meaningful impact — not just on your team, but on how organizations are run across Latin America.
Who we are
- We’re the leading B2B fintech for spend management in Latin America.
- Certified as one of the world's fastest-growing companies, a Great Place to Work, and a LinkedIn Top Startup.
- Passionate about making Latin America more prosperous and competitive.
- Constantly innovating to build financial infrastructure that enables each of our customers to thrive.
- Product-led, high-talent-density culture — designed for builders who raise the bar.
- Proud of our open, inclusive, and values-driven environment.
What we believe in
- #Clarity. We say things clearly, directly, and proactively.
- #Simplicity. We reduce noise to focus on what really matters.
- #Ownership. We take responsibility and never wait to be told.
- #Pride. We build products and experiences we’re proud of.
- #Always Be Changing (ABC). We grow through feedback, risk-taking, and action.
- #Inclusivity. Every voice counts. Everyone contributes to our mission.
What we offer
- Competitive salary and stock options (ESOP) from day one
- Multicultural team with daily exposure to Portuguese, Spanish, and English (our corporate language)
- Annual learning budget and internal accelerated development paths
- High-ownership environment: we move fast, learn fast, and raise the bar — together
- Smart, ambitious teammates — low ego, high impact
- Flexible vacation and hybrid work model focused on results
If you’re ready for growth, ownership, and impact — apply now and help us redefine B2B finance in Latin America.
Clara’s Hybrid Policy
Claridians in a hybrid mode split their time between working from the office, visiting customers, and working from home. This strikes a balance between bringing people together for in-person collaboration and learning, while supporting the flexibility to do this in a way that makes sense for each individual and team.
We don't enforce a minimum number of days for most roles, but you're expected to spend time at the office organically, and be at the office most days during your ramp-up or when required by your leader.