Search by job, company or skills


Databricks Data Platform Engineer/Architect


Job Description

Project description:

We are seeking an expert Databricks Platform Engineer with deep data engineering experience. This individual should have a comprehensive understanding of both data platforms and software engineering, enabling them to integrate the platform effectively within an IT ecosystem.

Responsibilities:

  • Manage and optimize the Databricks data platform.
  • Ensure high availability, security, and performance of data systems.
  • Provide insights into data platform usage.
  • Optimize computing and storage for large-scale data processing.
  • Design and maintain system libraries (Python) used in ETL pipelines and platform governance.
  • Optimize ETL Processes: Enhance and tune existing ETL processes for better performance, scalability, and reliability.

Skills:

  1. Minimum 10 years of experience in IT/Data.
  2. Minimum 3 years of experience as a Databricks Data Platform Engineer.
  3. Bachelor's degree in IT or a related field.
  4. Infrastructure & Cloud: Azure, AWS (expertise in storage, networking, compute).
  5. Programming: Proficiency in PySpark for distributed computing.
  6. Proficiency in Python for ETL development.
  7. SQL: Expertise in writing and optimizing SQL queries, preferably with experience in databases such as PostgreSQL, MySQL, Oracle, or Snowflake.
  8. Data Warehousing: Experience working with data warehousing concepts and the Databricks platform.
  9. ETL Tools: Familiarity with ETL tools and processes.
  10. Data Modelling: Experience with dimensional modelling, normalization/denormalization, and schema design.
  11. Version Control: Proficiency with version control tools like Git to manage codebases and collaborate on development.
  12. Data Pipeline Monitoring: Familiarity with monitoring tools (e.g., Prometheus, Grafana, or custom monitoring scripts) to track pipeline performance.
  13. Data Quality Tools: Experience implementing data validation, cleaning, and quality frameworks, ideally Monte Carlo.

Nice to have:

  • Containerization & Orchestration: Docker, Kubernetes.
  • Infrastructure as Code (IaC): Terraform.
  • Understanding of the Investment Data domain.

Job ID: 135380507