THOUGHTWORKS DATA HIRING CHALLENGE



About ThoughtWorks:

At ThoughtWorks, you will see passionate technologists who believe in the power of software and technology as tools for social change. The 1000+ people in ThoughtWorks India are as diverse in personality as they are in their backgrounds, culture, and expertise.

If you’re someone who’s inspired by technology, joining ThoughtWorks makes you part of a community. People join because they get to talk to the people who wrote the books that influenced them, work with the people who built the tools they’d like to use, and collaborate on projects that propel change in the real world.

Job Description


Our developers have been contributing code to major organizations and open-source projects for over 25 years now. They’ve also been writing books, speaking at conferences, and helping push software development forward, changing companies and even industries along the way. As consultants, we work with our clients to ensure we’re delivering the best possible solution. Data engineers play an important role in leading these projects to success.

Role

Data Engineering

Roles and Responsibilities
  • Creating complex data processing pipelines as part of diverse, high-energy teams
  • Designing scalable implementations of the models developed by our Data Scientists
  • Hands-on programming based on TDD, usually in a pair programming environment (a small illustrative sketch follows this list)
  • Deploying data pipelines in production based on Continuous Delivery practices
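For context on the TDD practice mentioned above, here is a minimal sketch, assuming Python and pytest (both illustrative choices, not a stack mandated by the role), of a test written first and the simplest data-cleaning function that makes it pass:

```python
# tdd_sketch.py - in TDD, the test below is written before the implementation.
# Run with: pytest tdd_sketch.py  (pytest is an assumed, not prescribed, tool)

def drop_duplicate_events(events):
    """Keep only the latest record (highest ts) per id, returned in id order."""
    latest = {}
    for event in events:
        current = latest.get(event["id"])
        if current is None or event["ts"] > current["ts"]:
            latest[event["id"]] = event
    return [latest[key] for key in sorted(latest)]


def test_drop_duplicate_events_keeps_latest_record():
    events = [
        {"id": 1, "value": "old", "ts": 1},
        {"id": 1, "value": "new", "ts": 2},
        {"id": 2, "value": "only", "ts": 1},
    ]
    assert drop_duplicate_events(events) == [
        {"id": 1, "value": "new", "ts": 2},
        {"id": 2, "value": "only", "ts": 1},
    ]
```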
Ideally
  • You’re familiar with and interested in any one of these languages - Python/Scala/Java.
  • You enjoy working with SQL, both querying for reports and creating projections
  • You understand how to build data pipelines and data-centric applications using distributed storage platforms such as HDFS, S3, and NoSQL databases (HBase, Cassandra, etc.) and distributed processing platforms such as Hadoop, Spark, Hive, Oozie, Airflow, and Kafka, whether through coursework or projects
  • You understand how data pipelines (ETL) are built and, conceptually, how distributed storage and distributed processing work. It helps if you have played around with some of the tools in this space, e.g. Hadoop MapReduce, Spark, or HDFS (a short pipeline sketch follows this list)
  • It's great if you have hands-on experience with one or more cloud offerings such as AWS EMR or Azure HDInsight, or with data-specific offerings like Databricks or Cloudera
  • Knowledge of software best practices like Test-Driven Development (TDD), Continuous Integration (CI), and Agile development
  • Strong communication skills with the ability to work in a consulting environment are essential
  • You work well with others and have a collaborative attitude
  • You’re keen on making things as efficient as possible and solving challenges.
  • You have the passion and analytical skills to play around with data to gain insights
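As a rough illustration of the pipeline work described above, here is a minimal batch ETL sketch in PySpark; the paths, column names, and storage locations are hypothetical stand-ins for a real source and sink:

```python
# etl_job.py - minimal batch ETL sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def run(spark, input_path, output_path):
    # Extract: read raw CSV events from distributed storage (e.g. HDFS or S3).
    raw = spark.read.option("header", "true").csv(input_path)

    # Transform: keep valid rows and aggregate revenue per customer per day.
    daily_revenue = (
        raw.filter(F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
           .groupBy("customer_id", "order_date")
           .agg(F.sum("amount").alias("daily_revenue"))
    )

    # Load: write the projection back as partitioned Parquet for reporting.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(output_path)


if __name__ == "__main__":
    session = SparkSession.builder.appName("daily-revenue-etl").getOrCreate()
    run(session, "hdfs:///raw/orders/", "hdfs:///curated/daily_revenue/")
    session.stop()
```

In a Continuous Delivery setup, a job like this would typically ship with automated tests and be promoted to production by a pipeline rather than deployed by hand.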
Qualification

B.E. / B. Tech / M.E / M.Tech / MCA

Batch

2020/2021

Skills

Java/Ruby/Python/C#/Golang

Location

Bangalore, Chennai, Pune, Coimbatore, Mumbai, Hyderabad, Gurgaon

Challenge Format
  • 1 Basic Programming Question (Language restriction - Java, Ruby, Python, C# & Golang)
  • 1 SQL Question (a hypothetical warm-up for both question types follows this list)
  • 10 MCQs
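The actual challenge questions are not published here. Purely as a hypothetical warm-up, written in Python (one of the permitted languages) with SQLite standing in for whatever database the SQL question uses, problems of comparable scope might look like this:

```python
# warmup.py - hypothetical practice tasks, not the actual challenge questions.
import sqlite3


def word_counts(text):
    """A 'basic programming' style task: count word frequencies in a sentence."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts


def top_customers(orders):
    """A 'SQL' style task: top 3 customers by total order value, via in-memory SQLite."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", orders)
    rows = conn.execute(
        "SELECT customer, SUM(amount) AS total "
        "FROM orders GROUP BY customer ORDER BY total DESC LIMIT 3"
    ).fetchall()
    conn.close()
    return rows


if __name__ == "__main__":
    print(word_counts("to be or not to be"))
    print(top_customers([("A", 120.0), ("B", 80.0), ("A", 40.0), ("C", 10.0)]))
```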
Note

You will need a webcam to take this test. Please ensure a functional webcam is enabled and grant camera access when the browser prompt appears.

Apply now