Data Engineer
GoTo Meeting
Software Engineering, Data Science
Bengaluru, Karnataka, India
Posted on Nov 25, 2025
Job Description
Where you’ll work: Remote / Bangalore
Engineering at GoTo
We’re trailblazers in remote work technology—building powerful, flexible solutions that empower everyone to live their best life, both at work and beyond. With us, you’ll have the opportunity to chart new paths and help redefine how the world works. For us, AI isn’t just a buzzword; it’s a tool we use to deliver real, practical value to our customers and teams. We focus on solving meaningful problems, not just adding features for the sake of using AI. Here, growth takes many forms: you can expand your skills, take on new challenges, lead initiatives, and explore creative ideas. Join a GoTo product team and play a key role in transforming the workplace for millions of users worldwide—your work will truly make a difference.
Your Day to Day
As a Data Engineer, you will:
Design, develop, and optimize ETL pipelines using PySpark, Hive, and Airflow for large-scale data processing (see the illustrative sketch after this list).
Build and maintain data lakes using Delta Lake, ensuring data reliability, quality, and integrity.
Manage data infrastructure deployed on Amazon EMR on EKS and Databricks, leveraging orchestration tools for automation.
Collaborate with stakeholders to understand business requirements and translate them into scalable data models.
Implement data modelling best practices for structured and semi-structured data to support reporting and machine learning.
Contribute to the development and maintenance of data schemas, tables, views, and metadata management strategies.
Monitor, troubleshoot, and improve pipeline performance, scalability, and reliability.
Work with version control systems such as GitHub/Bitbucket and CI/CD tools to automate infrastructure and data workflow deployments.
Ensure compliance with data governance, privacy, and security standards.
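To give a concrete flavor of the pipeline work described above, here is a minimal, illustrative PySpark sketch of a batch job that reads raw events and writes an aggregated Delta table. The bucket paths, field names, and aggregation are hypothetical placeholders, not GoTo's actual platform or code.

# Illustrative sketch only: paths, schema, and table layout are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Delta Lake needs the delta-spark package plus these two Spark settings.
spark = (
    SparkSession.builder
    .appName("daily-event-counts")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw JSON events (hypothetical S3 location), aggregate per day and type,
# and overwrite a date-partitioned Delta table with the result.
raw = spark.read.json("s3://example-bucket/raw/events/")
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)
(
    daily.write.format("delta")
         .mode("overwrite")
         .partitionBy("event_date")
         .save("s3://example-bucket/curated/daily_event_counts/")
)

In practice, a job like this would be packaged, scheduled with Airflow, and run on EMR on EKS or Databricks, as outlined above.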
What We’re Looking For
As a Data Engineer, your background will look like:
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
A minimum of 2-4 years of experience as a Data Engineer or in a related role in big data environments.
Strong proficiency in Python, PySpark, and SQL.
Hands-on experience with Airflow for workflow orchestration (a minimal DAG sketch follows this list).
Working knowledge of Hive and Delta Lake.
Experience with EMR on EKS and/or Databricks environments.
Familiarity with containerization (Docker, Kubernetes).
Proficiency with Git/GitHub for source control.
Understanding of best practices for CI/CD pipelines.
Familiarity with cloud infrastructure (AWS preferred).
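For orientation, below is a minimal Airflow DAG sketch (assuming Airflow 2.4+) that could schedule a PySpark job like the one sketched earlier; the DAG name, schedule, and spark-submit command are hypothetical placeholders rather than an actual GoTo workflow.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily schedule for the batch job sketched earlier.
with DAG(
    dag_id="daily_event_counts",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # A plain spark-submit for illustration; on EMR on EKS or Databricks this
    # would typically be replaced by the corresponding provider operator.
    run_etl = BashOperator(
        task_id="run_daily_event_counts",
        bash_command="spark-submit /opt/jobs/daily_event_counts.py",
    )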
Nice to Have
Experience with cloud-native tools (AWS, Kubernetes, EMR on EKS).
Familiarity with Databricks and its ML capabilities.
Exposure to streaming data processing (Kafka, Spark Streaming, etc.); a short streaming sketch follows this list.
Basic understanding of machine learning concepts and workflows.
Experience in building REST APIs for data services.
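As an illustration of the streaming nice-to-have, here is a small Spark Structured Streaming sketch that appends Kafka events to a Delta table. The broker address, topic, and paths are hypothetical, and it assumes the spark-sql-kafka and delta-spark packages are available to the Spark session.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("events-stream-to-delta")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read from a hypothetical Kafka topic and keep the payload plus event time.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_ts"),
    )
)

# Append the stream to a Delta table, checkpointing the progress.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start("s3://example-bucket/streaming/events/")
)
query.awaitTermination()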
What We Offer
At GoTo, we believe in supporting our employees with a comprehensive range of benefits designed to fit your life—at work and beyond. Here are just some of the benefits and perks you can expect when you join our team:
Comprehensive health benefits, life and disability insurance, and fertility and family-forming support program
Generous paid time off, paid holidays, volunteer time off, quarterly self-care days, and no-meeting days
Tuition and reading reimbursement programs to support your continuous learning and professional growth
Thrive Global Wellness Program, confidential Employee Assistance Program (EAP), as well as One to One Wellness Coaching
Employee programs—including Employee Resource Groups (ERGs), GoTo Gives, and our charitable matching program—to amplify your connection and impact
GoTo performance bonus program to celebrate your impact and contributions
Monthly remote work stipend to support your home office expenses