
Data DevOps Engineer

Hewlett Packard Enterprise (https://www.hpe.com/)


Location:
United Kingdom, Bristol

Category:
IT - Software Development


Contract Type:
Employment contract


Salary:
Not provided

Job Description:

The Data DevOps Engineer role at Hewlett Packard Enterprise involves providing clients with consultative services and implementing scalable Big Data solutions. The role requires expertise in Linux, containerisation technologies (e.g., Docker, Kubernetes), Infrastructure as Code paradigms (e.g., Terraform, Ansible), monitoring tools (e.g., Prometheus, Grafana), and Big Data technologies (e.g., Apache Spark). Applicants must hold UK National Security Clearance Level 04 Developed Vetting (DV) and be eligible to work in the UK. This onsite role is based in Bristol and offers extensive benefits and opportunities for professional growth.
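
As an illustration of the scalable Big Data work described above, here is a minimal PySpark ETL sketch; the paths, column names, and logic are hypothetical placeholders rather than details from the posting.

    # Minimal PySpark ETL sketch. All paths and column names are
    # hypothetical, not taken from the job posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw event data.
    raw = spark.read.csv("/data/raw/events.csv", header=True, inferSchema=True)

    # Transform: keep valid rows and aggregate per day.
    daily = (
        raw.filter(F.col("status") == "ok")
           .groupBy(F.to_date("timestamp").alias("day"))
           .count()
    )

    # Load: write Parquet for downstream consumers.
    daily.write.mode("overwrite").parquet("/data/curated/daily_counts")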

Job Responsibilities:

  • Development and implementation of scalable, clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems (see the sketch after this list)
  • Participating in the full lifecycle of data solution development, from requirements engineering through to continuous optimisation engineering
  • Providing technical thought-leadership and advisory on the technologies and processes at the core of the data domain
  • Engaging and collaborating with both internal and external teams
  • Assisting with solution improvement activities driven by either the project or the service
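
A minimal sketch of the automated dynamic scaling idea referenced in the first responsibility, using the official Kubernetes Python client; the deployment name, namespace, load metric, and thresholds are all assumptions made for illustration.

    # Hedged sketch: scale a Deployment from a load signal using the
    # Kubernetes Python client. Names and thresholds are hypothetical.
    from kubernetes import client, config

    def queue_depth() -> int:
        """Stand-in for a real load metric (e.g. a queue length)."""
        return 250  # hypothetical value

    config.load_kube_config()  # or load_incluster_config() inside a pod
    apps = client.AppsV1Api()

    name, namespace = "data-pipeline", "default"  # hypothetical targets
    current = apps.read_namespaced_deployment(name, namespace).spec.replicas or 1

    # One replica per ~100 queued items, clamped to a sane range.
    target = max(1, min(10, queue_depth() // 100))
    if target != current:
        apps.patch_namespaced_deployment_scale(
            name, namespace, {"spec": {"replicas": target}}
        )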

Requirements:

  • An organised and methodical approach
  • Excellent timekeeping and task-prioritisation skills
  • An ability to provide clear and concise updates
  • An ability to convey technical concepts to audiences at all levels
  • Data engineering skills (ETL/ELT)
  • Technical implementation skills: application of industry best practices and design patterns
  • Technical advisory skills: experience researching technological products/services in order to advise on system improvements
  • Experience of working in hybrid environments spanning both classical and DevOps approaches
  • Excellent written and spoken English
  • Excellent knowledge of Linux operating system administration and implementation
  • Broad understanding of the containerisation domain and adjacent technologies/services, such as Docker, OpenShift, and Kubernetes
  • Infrastructure as Code and CI/CD paradigms and systems, such as Ansible, Terraform, Jenkins, Bamboo, and Concourse
  • Monitoring utilising products such as Prometheus, Grafana, the ELK stack, and Filebeat (see the sketch after this list)
  • Observability and SRE practices
  • Big Data solutions (ecosystems) and technologies, such as Apache Spark and the Hadoop ecosystem
  • Edge technologies, e.g. NGINX, HAProxy
  • Excellent knowledge of YAML or similar configuration languages
  • Must hold UK National Security Clearance Level 04 Developed Vetting (DV) - UKIC
  • Must be eligible to work in the UK
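
To make the monitoring requirement concrete, here is a short sketch using the prometheus_client Python library to expose metrics that Prometheus can scrape and Grafana can chart; the metric names and simulated workload are invented for illustration.

    # Hedged sketch: exposing application metrics for Prometheus with
    # prometheus_client. Metric names and workload are hypothetical.
    import random
    import time

    from prometheus_client import Counter, Gauge, start_http_server

    RECORDS = Counter("etl_records_processed_total", "Records processed")
    LAG = Gauge("etl_pipeline_lag_seconds", "Current pipeline lag")

    if __name__ == "__main__":
        start_http_server(8000)  # metrics at http://localhost:8000/metrics
        while True:
            RECORDS.inc(random.randint(1, 100))  # stand-in for real work
            LAG.set(random.uniform(0, 5))
            time.sleep(5)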

Nice to have:

  • JupyterHub awareness
  • MinIO or similar S3-compatible storage technology (see the sketch after this list)
  • Trino/Presto
  • RabbitMQ or another common queueing technology, e.g. ActiveMQ
  • NiFi
  • Rego
  • Familiarity with code development and shell scripting in Python and Bash
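
Since MinIO and S3-compatible storage appear on this list, below is a brief sketch using boto3 pointed at a custom endpoint; the endpoint URL, credentials, and bucket name are placeholder assumptions, not values from the posting.

    # Hedged sketch: using boto3 against MinIO or any S3-compatible store.
    # Endpoint, credentials, and bucket are hypothetical placeholders.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:9000",  # hypothetical MinIO endpoint
        aws_access_key_id="minioadmin",
        aws_secret_access_key="minioadmin",
    )

    s3.upload_file("daily_counts.parquet", "curated", "daily_counts.parquet")
    for obj in s3.list_objects_v2(Bucket="curated").get("Contents", []):
        print(obj["Key"], obj["Size"])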

What we offer:

  • Extensive social benefits
  • Flexible working hours
  • Competitive salary
  • Shared values
  • Equal opportunities
  • Work-life balance
  • Constantly evolving career opportunities

Additional Information:

Job Posted:
March 20, 2025

Employment Type:
Full-time
Work Type:
On-site work