Careem
Careem is building the Everything App for the greater Middle East, making it easier than ever to move around, order food and groceries, manage payments, and more. Careem is led by a powerful purpose to simplify and improve the lives of people and build an awesome organisation that inspires. Since 2012, Careem has created earnings for over 2.5 million Captains, simplified the lives of over 50 million customers, and built a platform for the region’s best talent to thrive and for entrepreneurs to scale their businesses. Careem operates in over 70 cities across 10 countries, from Morocco to Pakistan.
About the Team
The Careem Data Platform team’s mission is to abstract away big data complexities and enable fast, reliable, and secure access to data. As a member of this team, you will be at the forefront of fulfilling this mission. You will work with the region’s top talent, leveraging modern big data tools and techniques to solve the region’s day-to-day problems on top of our own in-house data platform, serving users in real time.
This role is part of the Data Processing and Computation Platform team. We are heavily invested in open source technologies such as Apache Spark, Apache Kafka, and Apache Trino. You will also have the opportunity to contribute to and collaborate with the open source community.
What you'll do
- Bring an innovative and creative mindset to data engineering challenges to develop a modern data platform with efficient, reusable components
- Design, architect, implement, and test rapid prototypes that demonstrate the value of data, and present them to diverse audiences
- Make code more efficient to run, optimize resource usage across the cluster, and speed up our compute workloads
- Continuously improve our engineering processes, tests, and systems to scale the code base and the productivity of the team
- Collaborate with teams globally and operate in a fast-paced environment
- Create reusable and scalable data pipelines
What you’ll need
- 6+ years of professional software development experience.
- Bachelor's Degree in Computer Science or other related technical field.
- Extensive software development experience with Scala, Java, or similar programming languages.
- Prior experience developing distributed systems or working on similar projects.
- Proficiency with cloud-native big data technologies.
- Demonstrated experience with software engineering and design best practices.
- Appreciation for creating maintainable, performant, and high quality software as part of a fun, high-performing global team.
Nice to have
- Prior experience with cloud control planes (AWS, GCP, etc.) or database internals such as query optimization
- Experience contributing to open source software
- Experience with Docker and Kubernetes
What we’ll provide you
We offer colleagues the opportunity to drive impact in the region while they learn and grow. As a full time Careem colleague, you will be able to:
- Work and learn from great minds by joining a community of inspiring colleagues.
- Put your passion to work in a purposeful organisation dedicated to creating impact in a region with a lot of untapped potential.
- Explore new opportunities to learn and grow every day.
- Work 4 days a week in the office and 1 day from home, plus work remotely from any country in the world for 30 days a year, with unlimited vacation days. (If you are in an individual contributor role in tech, you will have 2 office days a week and 3 work-from-home days.)
- Access healthcare benefits and fitness reimbursements for health activities, including gym, health club, and training classes.