What You'll Do
- What the role is about: As an experienced Analytics Ops Engineer, you will design, implement, and maintain our data pipelines and analytics infrastructure. You will leverage modern DataOps and automation technologies such as Docker, Kubernetes, and Airflow to ensure seamless data flow and availability. Experience with air-gapped configurations is a significant advantage.
- What impact this role has: Your role will be crucial in optimizing our data operations, ensuring high availability and performance of our data services. By automating workflows and managing infrastructure, you will enable our teams to make data-driven decisions more efficiently.
- What success might look like: Success in this role means delivering robust and scalable data pipelines, maintaining high system uptime, and effectively automating data processes. You will be recognized for your ability to integrate various data sources and optimize performance, contributing to the overall success of our analytics platform.
Who You'll Work With
- An outline of the team: You will be part of a dedicated and innovative data operations team that collaborates closely with data scientists, analysts, and other stakeholders. The team is focused on building and maintaining a resilient data infrastructure.
- The role the team plays: The data operations team plays a vital role in ensuring the reliability and efficiency of our data services. Your contributions will help maintain the integrity and performance of our analytics platform.
- Who the position reports to: This position reports to the Data Integration Team Lead.
What Makes You a Qualified Candidate
Non-negotiable qualifications:
- Minimum of 5 years of experience in data engineering, analytics operations, or a related field.
- Strong SQL skills with experience in writing complex queries and optimizing database performance.
- Hands-on experience with Kubernetes and Docker for container orchestration and management.
- Proficiency in using Apache Airflow for workflow automation and scheduling.
- Experience with air-gapped configurations.
What You'll Bring
- Proficiency in a programming language such as Python, Java, or JavaScript.
- Proficiency in CI/CD tools (GitLab).
- Previous experience in a DevOps role is a plus.
- Proficiency in Linux administration and shell scripting.
- Knowledge of specific Linux distributions, particularly SUSE Linux Enterprise Server (SLES).
- Experience with cloud platforms such as AWS or GCP is a plus.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Experience with automation scripts and tools to streamline data processing and deployment workflows.
- Knowledge of data integration techniques and ensuring data consistency and reliability.
- Ability to set up monitoring tools and troubleshoot issues related to data pipelines and infrastructure.
- Familiarity with big data technologies like Hadoop or Spark is a plus.
- Strong documentation skills to maintain comprehensive records of data workflows, infrastructure, and processes.