The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.
Job Responsibilities:
Develop and support scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models
Requirements:
First-class degree in Engineering/Technology/MCA
3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
Experience with relational databases and using SQL for data querying, transformation, and manipulation
Experience modelling data for analytical consumers
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management
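Several of the requirements above (SQL for transformation, modelling data for analytical consumers, building pipelines) can be illustrated with a minimal, self-contained sketch. This is not the Bank's stack; it uses Python's built-in sqlite3 purely for illustration, and every table name, column, and value is hypothetical.

```python
import sqlite3

# Minimal ETL sketch: land raw records, transform with SQL, and
# materialize a small analytical model. All names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE raw_transactions (
        txn_id    INTEGER PRIMARY KEY,
        account   TEXT,
        amount    REAL,
        booked_on TEXT
    )
""")

# Extract: in a real pipeline these rows would come from an upstream source system.
rows = [
    (1, "ACC-1", 120.0, "2024-01-05"),
    (2, "ACC-1", -30.0, "2024-01-09"),
    (3, "ACC-2",  75.5, "2024-01-07"),
]
conn.executemany("INSERT INTO raw_transactions VALUES (?, ?, ?, ?)", rows)

# Transform + load: aggregate into an analytical model, one row per account.
conn.execute("""
    CREATE TABLE account_summary AS
    SELECT account,
           COUNT(*)    AS txn_count,
           SUM(amount) AS net_amount
    FROM raw_transactions
    GROUP BY account
""")

summary = {
    acct: (cnt, net)
    for acct, cnt, net in conn.execute(
        "SELECT account, txn_count, net_amount FROM account_summary ORDER BY account"
    )
}
print(summary)  # {'ACC-1': (2, 90.0), 'ACC-2': (1, 75.5)}
```

In practice the same extract–transform–load shape would run on a platform such as Spark or Ab Initio against warehouse tables, with the build, test, and deployment of the pipeline automated through CI/CD.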
Nice to have:
Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, and Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of the underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms such as Docker and Kubernetes
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta
Others: Basics of job schedulers such as Autosys; basics of entitlement management
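The "Data Quality & Controls" item above (validation, cleansing, and data controls) can be sketched as a simple rule-based check applied before loading. This is an illustrative assumption, not a prescribed framework; the record shape, rule thresholds, and currency list are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical incoming record shape for a data-quality control.
@dataclass
class Record:
    account: str
    amount: float
    currency: str

def validate(record: Record) -> list[str]:
    """Return the rule violations for one record; an empty list means clean."""
    errors = []
    if not record.account:
        errors.append("missing account")
    if record.currency not in {"USD", "EUR", "GBP"}:  # illustrative allow-list
        errors.append(f"unknown currency: {record.currency}")
    if abs(record.amount) > 1_000_000:  # illustrative control threshold
        errors.append("amount exceeds control threshold")
    return errors

batch = [
    Record("ACC-1", 250.0, "USD"),
    Record("", 10.0, "XXX"),
]
results = [validate(r) for r in batch]
# results[0] is clean; results[1] fails the account and currency rules.
```

Real data controls would typically also cover cleansing and enrichment steps and report violations to a monitoring system rather than a list.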