What are we building?
Hard Rock Digital is a team focused on becoming the best online sportsbook, casino, and social casino company in the world. We’re building a team that shares a passion for learning, operating, and building new products and technologies for millions of consumers. We care about each customer's interaction, experience, behavior, and insight, and strive to ensure we’re always acting authentically.
Rooted in the kindred spirits of Hard Rock and the Seminole Tribe of Florida, the new Hard Rock Digital taps a brand known the world over as the leader in gaming, entertainment, and hospitality. We’re taking that foundation of success and bringing it to the digital space — ready to join us?
What’s the position?
We are seeking an experienced DataOps Engineer to join our dynamic data engineering team. In this critical role, you will be responsible for bridging the gap between data engineering and operations, ensuring the reliability, efficiency, and scalability of our data infrastructure. You'll work on implementing and maintaining robust data pipelines, automating processes, and establishing best practices for data quality and governance, with a focus on supporting business-critical data workloads.
What are we looking for?
Strong experience in implementing and managing observability solutions for data pipelines and infrastructure
Proficiency in setting up and maintaining monitoring systems for data workflows and platforms
Expertise in implementing CI/CD practices for data engineering projects
Experience in enforcing and automating data quality checks and data governance policies
Solid understanding of SQL, Python, and/or Java programming languages
Hands-on experience with Snowflake, Apache Airflow, and dbt
Familiarity with AWS data services (e.g., S3, Lambda, DynamoDB, RDS, ECS/ECR)
Knowledge of GitHub CI/CD integrations and workflows
Experience with Infrastructure as Code (IaC) tools, particularly Terraform
Strong problem-solving skills and ability to optimize data pipelines for performance and reliability
Experience with Snowflake-specific practices, including:
Monitoring query performance, storage usage, and user activity
Implementing data classification, access controls, and cost management
Setting up auditing, compliance, and data sharing governance
Configuring automated alerts and data masking for sensitive information
Basic Qualifications:
Bachelor's degree in Computer Science, Data Science, or a related technical field
3+ years of experience in data engineering or a similar role
Proficiency in SQL, plus Python and/or Java programming
Experience with cloud-based data platforms, preferably AWS
Familiarity with data pipeline orchestration tools (e.g., Apache Airflow/Astro)
Understanding of data quality principles and practices
Experience with version control systems, preferably Git
Preferred Qualifications:
Master's degree in Computer Science, Data Engineering, or a related field
5+ years of experience in DataOps or similar roles
Certifications in AWS, Snowflake, or other relevant technologies
Experience with real-time data processing and streaming technologies
Familiarity with data modeling techniques and tools (e.g., dbt)
Knowledge of machine learning operations (MLOps) practices
Experience in the gaming or sports betting industry
Contributions to open-source projects related to data engineering or DataOps
What’s in it for you?
We offer our employees more than just competitive compensation. Our team benefits include:
Competitive pay and benefits
Flexible vacation allowance
Startup culture backed by a secure, global brand
Roster of Uniques
We care deeply about every interaction our customers have with us, and we trust and empower our staff to own and drive their experience. Our vision for our business and customers is built on fostering a diverse and inclusive work environment where, regardless of background or beliefs, you feel able to be authentic and bring all your talent into play. We want to celebrate you being you (we are an equal opportunities employer).