
Informatica Data Engineer


Randstad


Location:
Burnaby, Canada

Category:
IT - Software Development


Contract Type:
Not provided


Salary:
Not provided
Job offer has expired

Job Description:

Our Burnaby-based government client is looking for a skilled Informatica Data Engineer to drive the success of an impactful and fast-paced project. If you're eager to make a meaningful contribution within a large enterprise environment, this is your opportunity to help shape the future of a forward-thinking organization!

Job Responsibility:

  • Perform daily operations activities for data pipelines and the Informatica platform to support data and analytics solutions
  • Design, develop, test, and implement high-quality, sustainable business insights solutions
  • Perform detailed source system analysis and gap analysis, working with a data modeler to complete source-to-target mappings, determine the gap between current- and future-state data needs based on business requirements, and create solutions that meet business needs

Requirements:

  • Seven (7) years of Informatica consulting experience on enterprise technology projects in at least three (3) of the following areas: Informatica PowerCenter, Informatica Administration, Informatica Data Quality, Oracle PL/SQL, PostgreSQL, Real Time, and Big Data (Greenplum and Hadoop)
  • Seven (7) years of working experience in Informatica PowerCenter including v9.x and 10.x
  • Three (3) years of working experience in Informatica Data Quality with writing, debugging, and maintaining application code such as SQL, ETL, and Shell Scripts
  • Three (3) years of working experience in SAP BusinessObjects (Crystal Reports, Web Intelligence, and Design Studio)
  • Five (5) years of working experience in production support

What we offer:
  • Opportunity to lead and shape the modernization of platforms
  • Collaborative environment working closely with stakeholders from various business areas and Information Management & Technology team
  • Chance to develop and implement cutting-edge solutions
  • Platform to showcase excellent presentation skills by regularly updating and engaging with project stakeholders and IT leadership
  • Contribution to meaningful work

Additional Information:

Job Posted:
April 10, 2025

Expiration:
May 23, 2025

Employment Type:
Full-time
Work Type:
On-site work



Similar Jobs for Informatica Data Engineer


Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our growing Quality Engineering t...
Location:
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements
  • Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience)
  • 5+ years of experience in data engineering, data warehousing, or data architecture
  • Expert-level experience with Snowflake, including data modeling, performance tuning, security, and migration from legacy platforms
  • Hands-on experience with Azure Data Factory (ADF) for building, orchestrating, and optimizing data pipelines
  • Strong experience with Informatica (PowerCenter and/or IICS) for ETL/ELT development, workflow management, and performance optimization
  • Deep knowledge of data modeling techniques (dimensional, tabular, and modern cloud-native patterns)
  • Proven ability to translate business requirements into scalable, high-performance data solutions
  • Experience designing and supporting end-to-end data pipelines across cloud and hybrid architectures
  • Strong proficiency in SQL and experience optimizing large-scale analytic workloads
  • Experience working within SDLC frameworks, CI/CD practices, and version control
Job Responsibility
  • Ability to collect and understand business requirements and translate those requirements into data models, integration strategies, and implementation plans
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake, ensuring functionality, performance and data integrity
  • Ability to work within the SDLC framework in multiple environments and understand the complexities and dependencies of the data warehouse
  • Optimize and troubleshoot ETL/ELT workflows, applying best practices for scheduling, orchestration, and performance tuning
  • Maintain documentation, architecture diagrams, and migration plans to support knowledge transfer and project tracking
What we offer
  • PTO Policy
  • Eligibility for Health Benefits
  • Retirement Plan
  • Work from Home
  • Full-time

Data Engineer

We are seeking a skilled and innovative Data Engineer to join our team in Nieuwe...
Location:
Nieuwegein, Netherlands
Salary:
3000.00 - 6000.00 EUR / Month
Sopra Steria
Expiration Date
Until further notice
Requirements
  • BSc or MSc degree in IT or a related field
  • Minimum of 2 years of relevant work experience in data engineering
  • Proficiency in building data pipelines using tools such as Azure Data Factory, Informatica Cloud, Synapse Pro, Spark, Python, R, Kubernetes, Snowflake, Databricks, or AWS
  • Advanced SQL knowledge and experience with relational databases
  • Hands-on experience in data modelling and data integration (both on-premise and cloud-based)
  • Strong problem-solving skills and analytical mindset
  • Knowledge of data warehousing concepts and big data technologies
  • Experience with version control systems, preferably Git
  • Excellent communication skills and ability to work collaboratively in a team environment
  • Fluency in Dutch language (required)
Job Responsibility
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes
  • Collaborate with Information Analysts to provide technical frameworks for business requirements of medium complexity
  • Contribute to architecture discussions and identify potential technical and process bottlenecks
  • Implement data quality checks and ensure data integrity throughout the data lifecycle
  • Optimise data storage and retrieval systems for improved performance
  • Work closely with cross-functional teams to understand data needs and deliver efficient solutions
  • Stay up-to-date with emerging technologies and best practices in data engineering
  • Troubleshoot and resolve data-related issues in a timely manner
  • Document data processes, architectures, and workflows for knowledge sharing and future reference
What we offer
  • A permanent contract and a gross monthly salary between €3,000 and €6,000 (based on 40 hours per week)
  • 8% holiday allowance
  • A generous mobility budget, including options such as an electric lease car with an NS Business Card, a lease bike, or alternative transportation that best suits your travel needs
  • 8% profit sharing on target (or a fixed OTB amount, depending on the role)
  • 27 paid vacation days
  • A flex benefits budget of €1,800 per year, plus an additional percentage of your salary. This can be used for things like purchasing extra vacation days or contributing more to your pension
  • A home office setup with a laptop, phone, and a monthly internet allowance
  • Hybrid working: from home or at the office, depending on what works best for you
  • Development opportunities through training, knowledge-sharing sessions, and inspiring (networking) events
  • Social activities with colleagues — from casual drinks to sports and content-driven outings
  • Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
Pune, India
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • First Class Degree in Engineering/Technology/MCA
  • 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica
  • Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
  • Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
Pune, India
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • First Class Degree in Engineering/Technology/MCA
  • 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend, or Informatica
  • Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
  • Full-time

Data Engineer

As a Data Engineer, you will partner with KFC, Pizza Hut, Taco Bell & Habit Burg...
Location:
Salary:
Not provided
KFC
Expiration Date
Until further notice
Requirements
  • 2+ years of data engineering experience
  • 2+ years of experience building cloud data solutions (e.g. on Azure, AWS, GCP) and using services such as storage, virtual machines, serverless technologies and parallel processing technologies
  • Experience processing structured & semi-structured data
  • Experience building serverless APIs (AWS preferred, but any cloud environment works)
  • Working knowledge of agile development, including DevOps concepts (IAC, CI/CD, etc.)
  • Experience with cloud SDKs and programmatic access services
Job Responsibility
  • Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to identify opportunities to leverage company data to drive business outcomes
  • Play a key role in our advanced analytics team - developing data-driven solutions, and responsible for driving Yum Growth
  • Design and develop scalable streaming data integration frameworks to move and transform a variety of data sets
  • Develop on the Yum! data platform by building applications using a mix of open-source frameworks (Py-Spark, Kubernetes, Airflow, etc.) and best in breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.)
  • Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points
What we offer
  • 4 weeks’ vacation PLUS holidays, sick leave, 2 paid days to volunteer for the cause of your choice, and a dollar-for-dollar matching gift program
  • Generous parental leave
  • Competitive benefits including medical, dental, vision, and life insurance, as well as a 6% 401(k) match

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Salary:
Not provided
Data Ideology
Expiration Date
Until further notice
Requirements
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire); advanced Snowflake certifications preferred
Job Responsibility
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Full-time

Data Engineer

Hands-on development and support of new or existing data applications. Work clos...
Location:
Branchville, NJ, United States
Salary:
Not provided
Robert Half
Expiration Date
Until further notice
Requirements
  • Five to seven years of experience in Data Warehousing, Data integration or Data Engineering projects
  • Ability to effectively work well with people in other departments and/or outside of the enterprise
  • Proficient in SQL
  • Experience working within Azure ecosystem
  • Experience in Informatica PowerCenter, IICS, Cognos, and Netezza Performance Server
  • Experienced in any of these analytical platforms - PowerBI, AzureML, Databricks or Synapse
  • Experience using Python or Scala
  • Experience in Azure DevOps and Github is preferred
  • P&C Insurance experience is preferred
  • Possesses excellent communication skills
Job Responsibility
  • Hands-on development and support of new or existing data applications
  • Work closely with business stakeholders and analysts to understand data and business processes, and make recommendations to clients on best practices and long-term solutions for resolving current issues and for future system design
  • Work closely with Application and Enterprise Architects to create and review low-level implementation designs and to understand high-level data flow designs developed by data architects
  • Provide technical guidance to the team for implementing complex data solutions
  • Provide support in the design, development, code reviews, test deploy and documentation of data engineering and data integration Applications
  • Maintain detailed documentation to support downstream integrations
  • Provide support for production issues
  • Perform the activities of a scrum master
  • Identify technology trends and explore opportunities for use within the organization
What we offer
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in company 401(k) plan
  • Full-time

Data Test Engineer

We are looking for a skilled Data Test Engineer who can design, build, and valid...
Location:
Chennai, India
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements
  • 4+ years of experience in Data Engineering and Data/ETL Testing
  • Strong expertise in writing and optimizing SQL queries (joins, subqueries, window functions, performance tuning)
  • Proficiency in Python or PySpark for data transformation and automation
  • Hands-on experience with ETL tools such as Azure Data Factory, Talend, SSIS, or Informatica
  • Familiarity with cloud platforms, preferably Azure; AWS or GCP is a plus
  • Experience working with data lakes, data warehouses (Snowflake, BigQuery, Redshift), and modern data platforms
  • Knowledge of version control systems (Git), issue tracking tools (JIRA), and Agile methodologies
  • Exposure to data testing frameworks like Great Expectations, DBT tests, or custom validation tools
  • Experience integrating data testing into CI/CD pipelines
Job Responsibility
  • Design, develop, and maintain robust ETL/ELT pipelines to process large volumes of structured and unstructured data using Azure Data Factory, PySpark, and SQL-based tools
  • Collaborate with data architects and analysts to understand transformation requirements and implement business rules correctly
  • Develop and execute complex SQL queries to validate, transform, and performance-tune data workflows
  • Perform rigorous data validation including source-to-target mapping (S2T), data profiling, reconciliation, and transformation rule testing
  • Conduct unit, integration, regression, and performance testing for data pipelines and storage layers
  • Automate data quality checks using Python and frameworks like Great Expectations, DBT, or custom-built tools
  • Monitor data pipeline health and implement observability through logging, alerting, and dashboards
  • Integrate testing into CI/CD workflows using tools like Azure DevOps, Jenkins, or GitHub Actions
  • Troubleshoot and resolve data quality issues, schema changes, and pipeline failures
  • Ensure compliance with data privacy, security, and governance policies
What we offer
  • Competitive salary aligned with industry standards
  • Hands-on experience with enterprise-scale data platforms and cloud-native tools
  • Opportunities to work on data-centric initiatives across AI, analytics, and enterprise transformation
  • Access to internal learning accelerators, mentorship, and career growth programs
  • Flexible work culture, wellness initiatives, and comprehensive health benefits
  • Full-time