We are looking for a talented Data Architect to design and implement scalable data lake/warehouse architectures using Databricks, Parquet, Unity Catalog, Iceberg, and DBT. The ideal candidate will be responsible for developing data pipelines, ensuring seamless connectivity with Azure Storage, collaborating with stakeholders, optimizing data processes, and implementing data governance measures.
Location: 100% Remote. Working hours are based on the US Central Time Zone.
About the Company:
Abstra is a fast-growing, Nearshore Tech Talent services company, providing top Latin American tech talent to U.S. companies and beyond. Founded by U.S.-based engineers with over 15 years of experience, Abstra specializes in sourcing skilled professionals across a wide range of technologies to meet our clients’ needs, driving innovation and efficiency.
Job Description:
Core Responsibilities:
- Design and implement scalable, efficient data lake/warehouse architectures using Databricks, Parquet, Iceberg, Unity Catalog (or similar open catalog governance tools), and DBT (or other ELT tools). Familiarity with Azure Data Lake Storage and data lineage tools is required.
- Develop and maintain data pipelines, ensuring seamless connectivity with Azure Storage and other cloud services.
- Collaborate with data engineers, data scientists, and business stakeholders to understand data requirements and deliver solutions that meet business needs.
- Optimize data storage and retrieval processes to enhance performance and reliability.
- Implement data governance and security measures to ensure data integrity and compliance.
- Provide technical leadership and mentorship to the data engineering team.
- Stay updated with the latest trends and technologies in data architecture and cloud computing.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Proven experience in designing and implementing data lakehouse architectures.
- Proficiency in Databricks, Parquet, Unity Catalog, Iceberg, DBT, and Azure Storage.
- Strong understanding of data modeling, ETL processes, and data governance.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication and collaboration skills.
What We Offer:
- Flexible working hours and remote work options.
- Opportunities for professional growth and development.
- A collaborative and inclusive work environment.
- The chance to work on impactful projects with a talented team.
- Excellent compensation in USD.
- Hardware and software setup (if needed).
Job Features:
- Type: Remote
- Time Zone: US Central Time