The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Job Responsibilities:
Ability to design and build Ab Initio graphs (both continuous and batch) and Conduct>It plans
Build Web-Service and RESTful graphs and create RAML or Swagger documentation
Thorough understanding of the Metadata Hub metamodel and the analytical ability to work with it
Strong hands-on skills in multifile-system-level programming, debugging, and optimization
Hands on experience in developing complex ETL applications
Good knowledge of RDBMS (Oracle), with the ability to write the complex SQL needed to investigate and analyze data issues
Strong UNIX shell and Perl scripting skills
Build graphs interfacing with heterogeneous data sources – Oracle, Snowflake, Hadoop, Hive, AWS S3
Build application configurations for Express>It frameworks – Acquire>It, Spec-To-Graph, Data Quality Assessment
Build automation pipelines for Continuous Integration and Delivery (CI/CD), leveraging Testing Framework and JUnit modules and integrating with Jenkins, JIRA, and/or ServiceNow
Build Query>It data sources for cataloguing data from different sources
Parse XML, JSON, and YAML documents, including hierarchical models
Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, and demonstrate experience in leveraging various Ab Initio components
Build AutoSys or Control Center jobs and schedules for process orchestration
Build BRE rulesets for reformat, rollup, and validation use cases
Write SQL scripts, tune database performance, analyze relational models, and perform data migrations
Ability to identify performance bottlenecks in graphs and optimize them
Ensure the Ab Initio code base is appropriately engineered to maintain current functionality, and that development adheres to performance-optimization and interoperability standards and complies with client IT governance policies
Build regression test cases, functional test cases and write user manuals for various projects
Conduct bug fixing, code reviews, and unit, functional and integration testing
Participate in the agile development process, and document and communicate issues and bugs relative to data standards
Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and In-memory Data Grids
Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment
Perform other duties and/or special projects as assigned
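The hierarchical-document parsing called out above can be sketched in plain Python with standard-library modules (a generic illustration, not Ab Initio-specific; the sample payloads and field names are invented for this example):

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payloads illustrating the same nested model in two formats.
json_doc = '{"customer": {"id": 42, "orders": [{"sku": "A1"}, {"sku": "B2"}]}}'
xml_doc = "<customer id='42'><order sku='A1'/><order sku='B2'/></customer>"

def skus_from_json(text: str) -> list[str]:
    """Walk the nested JSON structure and collect order SKUs."""
    data = json.loads(text)
    return [order["sku"] for order in data["customer"]["orders"]]

def skus_from_xml(text: str) -> list[str]:
    """Extract the same fields from the equivalent XML hierarchy."""
    root = ET.fromstring(text)
    return [order.get("sku") for order in root.findall("order")]

print(skus_from_json(json_doc))  # ['A1', 'B2']
print(skus_from_xml(xml_doc))    # ['A1', 'B2']
```

YAML would follow the same pattern but needs a third-party parser such as PyYAML, since the standard library does not include one.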
Requirements:
Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics)
Minimum of 5 years of experience designing, building, and deploying Ab Initio-based applications
Expertise in handling complex large-scale Data Lake and Warehouse environments
Hands-on experience writing complex SQL queries, exporting and importing large amounts of data using utilities
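As one example of the data-issue investigation this SQL requirement points at, a duplicate-key check can be run against an in-memory SQLite database (the table and rows are invented for illustration; production work would target Oracle):

```python
import sqlite3

# In-memory database with a made-up table of account balances.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (acct_id TEXT, balance REAL);
    INSERT INTO accounts VALUES ('A1', 100.0), ('A2', 50.0), ('A1', 100.0);
""")

# Find keys that violate an expected-unique constraint: a typical
# first step when investigating a data-quality issue.
dupes = conn.execute("""
    SELECT acct_id, COUNT(*) AS n
    FROM accounts
    GROUP BY acct_id
    HAVING COUNT(*) > 1
""").fetchall()

print(dupes)  # [('A1', 2)]
```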