Data Engineer
Job Overview
We are seeking a Data Engineer to join our dynamic team and help expand our data infrastructure and analytics capabilities. The ideal candidate will have a deep understanding of the Snowflake platform and the technical expertise to integrate it into various operations. In this role, you will contribute to data initiatives, analyze and store data on Snowflake, and work closely with cross-functional teams to deliver scalable data solutions.
Key Responsibilities:
- Integration with Snowflake Platform: Use your expertise in Snowflake to effectively integrate it within our team's operations, ensuring seamless data flow and storage for various initiatives.
- Data Engineering Operations: Design, build, and maintain scalable data pipelines on the Snowflake platform, ensuring that the architecture meets the business's data processing needs.
- Support for Various Initiatives: Analyze and store data on the Snowflake platform to support different business initiatives. Ensure that the data infrastructure is robust and capable of handling diverse operational needs.
- Collaboration and Teamwork: Provide additional capacity through staff augmentation, working closely with existing team members to contribute to project success, and demonstrate strong collaboration skills in a highly team-oriented environment.
- Data Governance and Best Practices: Ensure data governance practices are followed, including data quality and security, particularly when handling sensitive information.
- Automation and Optimization: Use Python, dbt Cloud, and other tools to automate data workflows, optimize performance, and ensure smooth data operations on the Snowflake platform.
Requirements:
- Proven experience as a Data Engineer with a strong background in Snowflake platform architecture and operations.
- Expertise in data pipeline design and management, including knowledge of data modeling, ETL (Extract, Transform, Load) processes, and database performance optimization.
- Familiarity with Python for scripting and automation of data workflows.
- Experience using dbt Cloud to build governed data pipelines on the Snowflake platform.
- Strong understanding of data governance principles, including data quality, metadata management, and data security.
- Demonstrated ability to collaborate in a team environment and provide staff augmentation to help increase the team’s capacity.
- Experience in supporting various data initiatives, particularly those requiring seamless integration of data into different business functions.
Preferred Qualifications:
- Prior experience with Salesforce integration for data engineering tasks.
- Experience with dbt on Snowflake for data integration and management.
- Cloud platform experience (AWS, Azure, GCP) in managing and optimizing data infrastructure.
Job Types: Full-time, Contract
Pay: $137,036.00 - $146,612.00 per year
Benefits:
- 401(k)
- 401(k) matching
- Dental insurance
- Flexible spending account
- Health insurance
- Health savings account
- Life insurance
- Paid time off
- Referral program
- Vision insurance
Experience:
- Python: 5 years (Required)
- ETL: 4 years (Required)
Work Location: Hybrid remote in Brighton, MA