Career Profile
With over 14 years of experience spanning BI development, Data Engineering, DataOps, and DevOps, I’ve developed a comprehensive understanding of the data industry’s ecosystem. My diverse roles have enabled me to master the complete data lifecycle—from initial collection and processing to advanced analytics and model deployment. I’ve consistently delivered solutions to complex real-world problems, collaborating effectively across teams and technical domains.
This broad expertise allows me to see beyond individual components to understand how each element contributes to a project’s success. I pride myself on being a proactive team player who combines technical proficiency with a growth mindset, always eager to tackle new challenges and expand my skillset.
Experiences
Founded and led a new DataOps team responsible for building a platform for the company's data teams. Deployed Apache Airflow first on Amazon Elastic Container Service (ECS) and later on AWS EKS, with GitHub Actions and Argo CD automating the deployment process. Created a custom DAG factory that generates DAGs from YAML files. Customized operators, plugins, and connections so jobs running in the Data team's AWS account could execute in other accounts of the organization. Designed a new data lake on Apache Iceberg to simplify data pipelines, support a near-real-time ingestion cadence, and reduce costs.
- Founded the team
- Designed architectures for data solutions
- Collaborated on legacy system maintenance
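The DAG factory mentioned above can be illustrated with a minimal, dependency-free sketch. The config schema, task names, and chaining strategy here are illustrative assumptions; in the real system each task entry maps to an Airflow operator and dependencies are wired with the `>>` operator, while the YAML itself would be parsed with something like `yaml.safe_load`.

```python
# Minimal sketch of a YAML-driven DAG factory (plain dicts stand in for
# Airflow objects; all names below are illustrative assumptions).

def build_dag(config: dict) -> dict:
    """Turn a parsed pipeline config into a DAG description:
    a mapping of task_id -> list of downstream task_ids, chained in order."""
    task_ids = [task["id"] for task in config["tasks"]]
    edges = {}
    # Chain tasks linearly in the order they are declared in the YAML.
    for upstream, downstream in zip(task_ids, task_ids[1:]):
        edges.setdefault(upstream, []).append(downstream)
    edges.setdefault(task_ids[-1], [])  # last task has no downstream
    return {"dag_id": config["dag_id"], "edges": edges}

# Config as it would look after parsing a hypothetical YAML definition:
config = {
    "dag_id": "sales_ingestion",
    "tasks": [
        {"id": "extract"},
        {"id": "transform"},
        {"id": "load"},
    ],
}

dag = build_dag(config)
# dag["edges"] -> {"extract": ["transform"], "transform": ["load"], "load": []}
```

Keeping pipeline definitions in YAML like this lets analysts add pipelines by committing a config file, without touching the factory code.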
During my time at the company as a DataOps engineer, I spearheaded the implementation of tools and processes that streamlined daily tasks for data engineers: optimizing builds and deployments, enhancing observability, and improving incident management. The role also covered DevOps duties, collaborating with the team on troubleshooting and deployments.
- Terraform modules for data streams
- Metabase cluster performance improvements
I led the building of the data platform, evaluating tools and technologies for delivery to diverse user groups, including analytics and operations teams. My responsibilities extended to architecting the data lake and databases and implementing pipelines and automations. Additionally, I played a key role in deploying, administering, and supporting the team's infrastructure.
- Apache Airflow running on k8s
- Redshift Materialized Views improvements
- Apache Airflow integration with EMR Steps and a DAG factory
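The EMR Steps integration can be sketched as a helper that builds the step definition an Airflow `EmrAddStepsOperator` (or boto3's `add_job_flow_steps`) accepts. The function name, script path, and spark-submit arguments are illustrative assumptions, not the actual production code.

```python
# Hedged sketch: constructing an EMR step definition for a PySpark job.
# The helper name and the example S3 path are hypothetical.

def spark_step(name, script, args=None):
    """Return an EMR step dict that runs a PySpark script via command-runner."""
    return {
        "Name": name,
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            # command-runner executes spark-submit on the cluster's master node
            "Args": ["spark-submit", "--deploy-mode", "cluster",
                     script, *(args or [])],
        },
    }

step = spark_step(
    "ingest_orders",
    "s3://my-bucket/jobs/ingest_orders.py",
    ["--date", "2024-01-01"],
)
```

Centralizing step construction like this keeps DAG code declarative: a DAG factory can emit one such dict per YAML task and hand the list to the operator.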
Certifications
Projects
Some projects I collaborated on in parallel with my main role.