
GTS

Senior Data Engineer

  • Posted 9 hours ago

Job Description

Key Responsibilities

  • Design, develop, and optimize big data solutions using Cloudera Data Platform.
  • Design and architect data virtualization solutions using Denodo Platform.
  • Define best practices for data modeling, data governance, metadata management, and data security.
  • Implement data modeling techniques to enhance data structure and performance.
  • Develop and maintain data pipelines to support business intelligence and analytics needs.
  • Collaborate with business stakeholders, data engineers, and BI teams to understand requirements and design scalable, high-performing data solutions.
  • Collaborate with cross-functional teams to improve data processing and analytics workflows.
  • Integrate diverse data sources (cloud, on-prem, relational, APIs, big data platforms, etc.) into a unified data layer.
  • Provide technical leadership on Denodo & Cloudera Data Platform deployment, configuration, upgrades, and troubleshooting.
  • Work closely with enterprise architects to align data virtualization strategy with overall enterprise data architecture.
  • Create and maintain technical documentation for architecture, design, and operational procedures.
  • Implement product demos and pilots to showcase Data Virtualization in enterprise scenarios, cloud deployments, and Big Data projects.
  • Interpret client needs, assess the full requirements, and identify suitable solutions.
  • Build simulated products in test labs to validate the design and solution.
  • Work on detailed projects integrating multiple technologies across multiple disciplines as part of a team.
  • Work on-site or remotely to diagnose and resolve both emergency issues and ongoing problems related to availability, performance, connectivity, and overall functionality.
  • Other work-related duties as assigned.

Qualifications & Experience

  • Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
  • 6+ years of experience in Big Data technologies, public cloud platforms, data architecture, data virtualization, data modeling, data warehousing, and data integration concepts.
  • Experience in Denodo Platform and Cloudera Data Platform (CDP).
  • Proficiency in SQL, data modeling, data virtualization, API-based integration, and scripting languages, especially Python.
  • Good knowledge of software development and architectural patterns.
  • Technical skills in Java development, JDBC, XML, Web Service APIs, and version control systems (e.g., SVN, Git).
  • Experience in Windows & Linux (and UNIX) operating systems in server environments.
  • Experience with cloud platforms (AWS, Azure, GCP, etc.) and hybrid data architectures.
  • Experience in building and optimizing data pipelines, including ETL, ELT, and change data capture (CDC) processes, data lakes, data warehouses, and BI/reporting tools.
  • Expertise in performance tuning, caching strategies, and query optimization.
  • Strong understanding of data governance, metadata management, and security practices.
  • Proficiency in troubleshooting big data processing issues.
  • Strong analytical and problem-solving skills.
  • Excellent written/verbal communication skills are essential for interaction with clients, making presentations, attending meetings, and writing technical documentation.

Job ID: 135980949
