Senior Data Engineer - GCP

CVS Health (https://www.cvshealth.com/)

Location:
Woonsocket, United States

Category:
IT - Software Development

Contract Type:
Employment contract

Salary:
83,430.00 - 203,940.00 USD / Year

Job Description:

CVS Health is building a world-class health solutions company. The Senior Data Engineer - GCP role focuses on creating scalable data pipelines, developing microservices, and collaborating across teams to deliver impactful digital projects in healthcare technology.

Job Responsibility:

  • Design, develop, and optimize data pipelines on GCP using BigQuery, Dataflow, and Pub/Sub (see the sketch after this list)
  • Build and maintain microservices to support data processing and analytics
  • Develop real-time and batch data ingestion frameworks using Kafka
  • Work with NoSQL databases (e.g., Firestore, Cassandra, MongoDB) to support high-throughput data applications
  • Ensure data integrity, governance, and security best practices in the data pipeline
  • Optimize performance, scalability, and cost efficiency of data solutions
  • Collaborate with data scientists, analysts, and software engineers to deliver end-to-end data products
  • Implement monitoring, logging, and alerting for data pipelines and services
  • Work within agile development methodologies, attend meetings, and contribute to project planning
  • Explore new tools, frameworks, and techniques to improve digital solutions and drive innovation
  • Create and maintain technical documentation while sharing knowledge with team members
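
As a rough illustration of the pipeline work described in the first responsibility above, the following is a minimal sketch (not part of the original posting) of a streaming Apache Beam pipeline, runnable on Dataflow, that reads JSON events from Pub/Sub and appends them to a BigQuery table. The project, topic, table, and schema names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode so the pipeline keeps consuming from Pub/Sub.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Hypothetical topic; messages are assumed to be UTF-8 encoded JSON.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical dataset, table, and schema; each parsed row must match the schema fields.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.example_events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()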

Requirements:

  • 5+ years of overall experience in large-scale software development, including 2+ years in a technical lead role
  • 5+ years of experience in data engineering with a focus on microservices-based data solutions
  • 5+ years of designing and developing highly scalable microservices, web services, and REST APIs
  • 3+ years of hands-on experience with GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage
  • 3+ years of experience with containerization (Docker, Kubernetes) and CI/CD for data pipelines
  • 3+ years of experience developing within an agile development environment

Nice to have:

  • Experience designing and developing in a Test-Driven Development environment
  • Proficiency in Python or Java for data processing
  • Experience with monitoring and logging tools like Prometheus, Stackdriver, or ELK
  • Understanding of distributed computing and data partitioning strategies
  • Strong problem-solving skills and ability to work in an agile environment
  • Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes, etc.)
  • Experience with Terraform or Infrastructure as Code (IaC)
  • Familiarity with Apache Beam or Flink for large-scale data processing
  • Knowledge of machine learning pipelines and integration with data engineering workflows
  • Experience in performance tuning and cost optimization in cloud environments
  • Experience with NewSQL databases

What we offer:

  • Affordable medical plan options, 401(k) plan (including matching company contributions), and employee stock purchase plan
  • No-cost programs for all colleagues including wellness screenings, tobacco cessation and weight management programs, confidential counseling, and financial coaching
  • Paid time off, flexible work schedules, family leave, dependent care resources, colleague assistance programs, tuition assistance, retiree medical access

Additional Information:

Job Posted:
March 19, 2025

Expiration:
May 02, 2025

Employment Type:
Full-time

Work Type:
Remote work