We are seeking a Data Engineering Manager to lead, manage, and deliver complex data engineering projects within Hewlett Packard Enterprise. This includes technical leadership, delivery management, stakeholder collaboration, and fostering innovation across the organization.
Job Responsibilities:
Plan, execute, and manage end-to-end delivery of data engineering projects, ensuring they meet quality, timeline, and budgetary requirements
Implement best practices in Agile, Scrum, or other relevant methodologies for iterative and efficient project delivery
Establish robust mechanisms to monitor project progress, identify risks, and ensure proactive resolution
Provide guidance on the design, development, and deployment of scalable data pipelines, ETL/ELT processes, and data storage solutions
Oversee the integration of cloud-based or on-premises data platforms
Build, mentor, and lead a high-performing team of data engineers
Conduct performance reviews, provide feedback, and develop professional growth plans for team members
Act as a primary point of contact for stakeholders, ensuring alignment of technical solutions with business needs
Translate business requirements into technical deliverables and prioritize tasks effectively
Collaborate with cross-functional teams, including product managers, data scientists, and analysts, to deliver cohesive solutions
Requirements:
10–15 years of overall experience
Bachelor's degree in Computer Science, Information Technology, or a related field
Proficiency in programming languages such as Python or Java
Extensive experience working with big data technologies such as Airflow, Hadoop, Spark, Kafka
Strong proficiency with SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra) and data platforms such as Cloudera
Experience with open-source, on-premises data engineering tools
Strong experience in data modeling, data warehousing, and ETL frameworks
Familiarity with data governance practices, data privacy, and security standards
Experience with containerization and orchestration tools like Docker and Kubernetes
Knowledge of CI/CD pipelines, version control systems, and agile methodologies
Nice to have:
Databricks and Snowflake experience
Experience building AI systems using data