The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Job Responsibilities:
Utilize knowledge of applications development procedures and concepts to identify and define necessary system enhancements
Consult with users, clients, and other technology groups on issues, and recommend programming solutions
Analyze applications to identify vulnerabilities and security issues, as well as conduct testing and debugging
Serve as advisor or coach to new or lower level analysts
Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
Partner with domain experts, product managers, analysts, and data scientists to develop data pipelines on Databricks
Define needs around maintainability, testability, performance, security, quality and usability for data platform
Tune Big data applications on Hadoop and non-Hadoop platforms for optimal performance
Define and publish data analytics standards for others to adopt and operate by
Demonstrate an in-depth understanding of how data analytics integrates within the sub-function, and coordinate and contribute to the objectives of the entire function
Produce detailed analyses of issues where the best course of action is not evident from the available information but actions must still be recommended or taken
Requirements:
4-8 years of hands-on experience in software development
Advanced knowledge of the Hadoop ecosystem and Big Data technologies
Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala)
Data warehousing experience (common data platforms such as Hadoop, Teradata, Snowflake, etc.)
Hands-on experience building code to analyze large data sets with PySpark and Snowflake
Experience developing data solutions on Google Cloud or AWS; certifications are a plus
Experience with ETL streaming technologies to process data in near real time
Data integration / ETL experience (common ETL platforms such as PySpark, DataStage, Ab Initio, Informatica PowerCenter, etc.)
Data modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning)
Data migration and data integration experience
Strong development/automation skills
Comprehensive knowledge of the principles of software engineering and data analytics
System-level understanding: data structures, algorithms, distributed storage and compute
Can-do attitude toward solving complex business problems; good interpersonal and teamwork skills
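As a rough illustration of the kind of data-pipeline work the requirements above describe, the sketch below groups raw records by fund and sums their amounts with a basic data-quality filter. It is written in plain Python for self-containment; a production pipeline of this kind would typically run on PySpark, Snowflake, or another distributed engine, and all field names here are illustrative assumptions, not part of the posting.

```python
from collections import defaultdict

def aggregate_by_fund(records):
    """Sum 'amount' per 'fund' across raw records, skipping bad rows.

    `records` is an iterable of dicts; in a real pipeline this would be
    a distributed DataFrame rather than an in-memory list.
    """
    totals = defaultdict(float)
    for rec in records:
        # Basic data-quality filter: drop rows with missing or
        # non-numeric amounts instead of failing the whole job.
        try:
            totals[rec["fund"]] += float(rec["amount"])
        except (KeyError, TypeError, ValueError):
            continue
    return dict(totals)

sample = [
    {"fund": "A", "amount": "100.0"},
    {"fund": "B", "amount": "50.5"},
    {"fund": "A", "amount": "25.0"},
    {"fund": "A", "amount": None},  # dropped by the filter
]
print(aggregate_by_fund(sample))  # → {'A': 125.0, 'B': 50.5}
```

The same shape in PySpark would be a `groupBy("fund").agg(sum("amount"))` over a DataFrame, with the filter expressed as a column cast plus a null check.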
Nice to have:
Knowledge of agile (scrum) development methodology
Funds domain knowledge
What we offer:
Best-in-class benefits
Equal opportunity and affirmative action employer
Reasonable accommodation for persons with disabilities