Big Data Engineer - DevOps

Operations • Mountain View, CA


Tango is a leading mobile messaging service with more than 200 million registered members around the world, transforming the way people communicate, discover, and share. Having evolved from its beginnings in 2010 as a cross-platform texting and calling app, Tango today combines free communication, social networking, and content in a single platform. Best of all, Tango gives members fun and engaging ways to connect with the people they care about through social networking, games, music sharing, news feed channels, and much more.

Join the Tango team, where we work together in a thriving, fast-paced start-up environment built on passion, trust, and drive. In between programming, designing, and serving our members, our team enjoys daily family-style meals and many other great perks that keep our employees happy.


Tango is looking for a talented and motivated Big Data Engineer to join our unique DevOps team. The team's mission is to architect, build, scale, and maintain automated tools and frameworks that support multiple cloud-based environments.

You will help create and maintain the most advanced mobile voice/video/text and social infrastructure in the industry, with a major focus on Big Data technologies and analytics. You will work with state-of-the-art and emerging technologies alongside amazingly smart people to improve and scale our global product offering.

You are curious about Big Data analytics and pipelines, cloud computing (private and public), distributed systems, and networking. You don't just learn how things work; you learn why. Understanding systems at a fundamental level is a passion of yours. You are motivated to grow, and you are a fast learner.


Requirements

Strong desire to work in a fast-paced, dynamic environment where you have significant responsibility and see the immediate impact of your efforts.

Experience working with large datasets, preferably using tools such as Hadoop, MapReduce, Pig, Hive, Elasticsearch, Flume, or Morphlines.

Familiarity with data pipeline technologies and large-scale distributed systems.

Significant programming experience in Python and shell scripting.

Expert knowledge of Linux/Unix systems.

Exceptional ability and motivation to solve problems and learn fast.


Nice to Have

Software development background is a big plus.

Knowledge of hybrid cloud solutions (VMware and Amazon Cloud), with hands-on experience in direct API integration.

Experience programming in Java or Ruby.

A GitHub account with some cool projects in it.

Experience developing custom solutions for monitoring and analyzing Big Data in production systems, including log analysis.

Solid understanding of systems monitoring, alerting, and analytics tools (New Relic, Cacti, Graphite, Logstash, Nagios, Ganglia, etc.).
