
Data Engineer

Company:
Atlassian (https://www.atlassian.com)

Location:
Bengaluru, India

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Atlassian is looking for a Data Engineer to join our Go-To Market Data Engineering (GTM-DE) team, which is responsible for building our data lake, maintaining our big data pipelines/services, and facilitating the movement of billions of messages each day. As a Data Engineer, you'll work on an AWS-based data lake backed by open-source projects like Presto, Spark, Airflow, and Hive.

Job Responsibility:

  • Help our stakeholder teams ingest data faster into our data lake
  • Make our data pipelines more efficient
  • Come up with ideas that help drive self-serve data engineering within the company
  • Build microservices
  • Architect, design, and enable self-serve capabilities at scale to help Atlassian grow

Requirements:

  • At least 2 years' professional experience as a software engineer or data engineer
  • A BS in Computer Science or equivalent experience
  • Strong programming skills (some combination of Python, Java, and Scala preferred)
  • Experience with data modeling
  • Experience with modern software development practices (Agile, TDD, CI/CD)
  • Knowledge of data warehousing concepts
  • Experience writing SQL, structuring data, and data storage practices
  • Experience building data pipelines, platforms, microservices, and REST APIs
  • Experience with Spark, Hive, Airflow, and related technologies to process large volumes of streaming data
  • A willingness to accept failure, learn and try again
  • An open mind to try solutions that may seem crazy at first
  • Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS and the like)

Nice to have:

  • Experience building self-service tooling and platforms
  • Built and designed Kappa architecture platforms
  • A passion for building and running continuous integration pipelines
  • Built pipelines using Databricks and are well versed in their APIs
  • Contributed to open source projects (Ex: Operators in Airflow)

What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources

Additional Information:

Job Posted:
March 19, 2025

Employment Type:
Full-time

Work Type:
Remote work