Expertise and/or relevant experience in the following areas is mandatory:
- 10+ years of experience as a Backend Data Engineer or in a similar role, with a strong understanding of ETL processes and data warehousing concepts.
- Proven experience with Python and related data engineering libraries (e.g., pandas, NumPy, Spark), and hands-on experience with Apache Airflow for managing data pipelines and workflows.
- Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. The resource must be able to implement data automations within existing frameworks as opposed to writing one-off scripts (see the sketch following this list).
- Experience with big data technologies and frameworks like Hadoop, Spark, Kafka, and Flink.
- Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review.
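
For illustration, the following is a minimal sketch of the framework-based automation described above, assuming Apache Airflow 2.4+. The DAG id, task names, and the extract/load bodies are hypothetical placeholders, not requirements of this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical extract step: pull source records (e.g., via pandas or a DB hook).
    return [1, 2, 3]


def load(**context):
    # Hypothetical load step: read the upstream result from XCom and persist it.
    records = context["ti"].xcom_pull(task_ids="extract")
    print(f"Loading {len(records)} records")


with DAG(
    dag_id="example_etl",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # scheduling, retries, and monitoring come from the framework
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # explicit task dependency, versus an ad-hoc script

The point of the sketch is the shape of the work: tasks registered with a scheduler and ordered explicitly, rather than a standalone script run by hand.
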
Expertise and/or relevant experience in the following areas is desirable but not mandatory:
- Experience with cloud computing platforms.
- Familiarity with agile development methodologies, software design patterns, and best practices.
- Strong analytical thinking and problem-solving abilities.
- Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
- Flexibility to adapt to evolving project requirements and priorities.
- Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners.
- Experience working in a virtual environment with remote partners and teams.
- Proficiency in Microsoft Office.
Job Type: Contract
Pay: Up to $90.00 per hour
Expected hours: 40 per week
Experience:
- Big data: 10 years (Required)
- Databases: 10 years (Required)
Work Location: Remote