The Data DevOps Engineer role at Hewlett Packard Enterprise involves providing clients with consultative services and implementing scalable Big Data solutions. The role requires expertise in Linux, containerisation technologies (e.g., Docker, Kubernetes), Infrastructure as Code paradigms (e.g., Terraform, Ansible), monitoring tools (e.g., Prometheus, Grafana), and Big Data technologies (e.g., Apache Spark). Applicants must hold UK National Security Clearance Level 04 Developed Vetting (DV) and be eligible to work in the UK. This onsite role is based in Bristol and offers extensive benefits and opportunities for professional growth.
Job Responsibilities:
Development and implementation of scalable, clustered Big Data solutions, with a specific focus on automated dynamic scaling and self-healing systems
Participating in the full lifecycle of data solution development, from requirements engineering through to continuous optimisation
Providing technical thought leadership and advice on the technologies and processes at the core of the data domain
Engaging and collaborating with both internal and external teams
Assisting with solution improvement activities, whether driven by the project or by the service
Requirements:
An organised and methodical approach
Excellent timekeeping and task-prioritisation skills
An ability to provide clear and concise updates
An ability to convey technical concepts to audiences at all levels
Data engineering skills – ETL/ELT (see the first sketch after this list)
Technical implementation skills – application of industry best practices and design patterns
Technical advisory skills – experience researching technology products and services in order to advise on system improvements
Experience of working in hybrid environments combining classical and DevOps approaches
Excellent written and spoken English
Excellent knowledge of Linux operating system administration and implementation
Broad understanding of the containerisation domain and adjacent technologies/services, such as Docker, OpenShift, and Kubernetes
Infrastructure as Code and CI/CD paradigms and systems, such as Ansible, Terraform, Jenkins, Bamboo, and Concourse
Monitoring utilising products such as Prometheus, Grafana, the ELK stack, and Filebeat (see the second sketch after this list)
Observability and SRE practices
Big Data solutions (ecosystems) and technologies, such as Apache Spark and the Hadoop ecosystem
Edge technologies, e.g. NGINX and HAProxy
Excellent knowledge of YAML or similar languages
Must hold UK National Security Clearance Level 04 Developed Vetting (DV) – UKIC
Eligibility to work in the UK
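
To illustrate the ETL/ELT and Apache Spark skills listed above, the following is a minimal PySpark sketch of a batch pipeline; the input path, column names, and output location are hypothetical placeholders rather than details from this posting.

# Minimal PySpark ETL sketch: extract raw CSV, transform, load as Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw events from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/data/landing/events.csv")

# Transform: parse the timestamp, derive a partition column, drop incomplete rows.
clean = (
    raw.withColumn("event_time", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_time"))
       .dropna(subset=["event_time", "user_id"])
)

# Load: write partitioned Parquet for downstream analytics.
clean.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/events")

spark.stop()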
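
Since the monitoring requirement names Prometheus, a second sketch below queries its standard HTTP API (/api/v1/query) with the requests library; the server URL is a hypothetical placeholder.

# Minimal sketch of an instant query against Prometheus's HTTP API.
# The endpoint is a hypothetical placeholder.
import requests

PROM_URL = "http://prometheus.example.internal:9090"

# Ask which scrape targets are currently up.
resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": "up"})
resp.raise_for_status()

for result in resp.json()["data"]["result"]:
    instance = result["metric"].get("instance", "<unknown>")
    value = result["value"][1]  # each result carries a [timestamp, value] pair
    print(f"{instance}: up={value}")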
Nice to have:
JupyterHub awareness
MinIO or similar S3-compatible storage technology (see the sketch after this list)
Trino/Presto
RabbitMQ or other common queue technology, e.g. ActiveMQ
NiFi
Rego
Familiarity with code development and shell scripting in Python and Bash
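
For the MinIO nice-to-have, here is a minimal sketch of talking to an S3-compatible endpoint with boto3; the endpoint URL, credentials, bucket, and object key are all hypothetical placeholders.

# Minimal sketch of reading from MinIO via its S3-compatible API using boto3.
# Endpoint, credentials, bucket, and key are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://minio.example.internal:9000",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# List objects in a hypothetical bucket, then download one locally.
for obj in s3.list_objects_v2(Bucket="raw-data").get("Contents", []):
    print(obj["Key"], obj["Size"])

s3.download_file("raw-data", "events/2024-01-01.csv", "/tmp/events.csv")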