Megagon Team Profile: Nikita Bhutani, Research Scientist

Welcome to our second team profile blog post! In this article, we highlight Nikita Bhutani, a Research Scientist at Megagon Labs. We’ll discuss the exciting developments in Nikita’s work here at Megagon, the advice she’d give to aspiring researchers, and what she loves to do for fun in the San Francisco Bay Area.

How did you arrive at your current role at Megagon?

That’s a tough question! Where do I even begin? It feels like every choice I’ve ever made in my life has led me to this point in time and space.

But if I had to pick one crucial episode, it would be my decision to go back to school after a short detour into software engineering. That choice led me to a doctorate program at the University of Michigan, Ann Arbor, and eventually here.

During this time, my advisor, Professor H.V. Jagadish, suggested that I undertake a natural language processing (NLP) research project. This work was eventually published at Empirical Methods in Natural Language Processing (EMNLP), one of the leading conferences in the field. At that event, I learned about Megagon Labs and met Wang-Chiew Tan, our Head of Research.

After this chance encounter, I was fortunate enough to complete an internship here at Megagon’s Mountain View office in 2018. Soon after, I joined the team as a Research Scientist.

What's the most surprising or interesting aspect of your recent work here at Megagon?

Generally speaking, there’s been tremendous progress in NLP over the last couple of years, especially for tasks involving question-answering, reading comprehension, and conversational search. Honestly, I’m astounded by how well numerous large-scale neural networks perform these days.

But do you want to know what I find even more interesting? Creating benchmarks that test a model’s “understanding” of data. It’s challenging but rewarding.

Our new benchmark datasets challenge modern neural networks to understand subjective and implicit information. This kind of information is abundant in review and feedback data, and getting a machine to extract it is a fascinating problem.

Are there any exciting work events or developments coming up?

Yes! Soon, we’ll release many of our review search and comprehension benchmark datasets to the rest of the research community. These datasets present unique challenges for current state-of-the-art techniques and systems. I am eager to work on methods that can tackle those challenges, and I’m curious to see how other researchers use our benchmarks and how that shapes NLP research going forward.

What one piece of advice would you offer to aspiring research scientists?

You can do anything — but not everything. For young researchers and students, it’s easy to get excited about several ideas and concepts. I know the feeling all too well! But time is finite; it’s impossible to work on every single project that you find interesting.

So take a moment to step back and prioritize what you really want to commit your time and effort towards. For me, this comes down to “What initiatives are most impactful and interesting to my team members and myself?”

Stay focused. See things objectively. And persevere in pushing the boundaries of whatever field you dedicate yourself to.

What’s the most underrated activity or place in the San Francisco Bay Area?

Hiking! Going from Michigan to San Francisco is like moving to a different planet. The sheer number of trails here feels like freedom itself. And they’re all breathtakingly beautiful; each one brings countless opportunities for photography.

Also, I want to give a special shoutout to Sheba’s Live Jazz Bar! Want amazing live music and delicious Ethiopian eats? This place has it all!

We hope you’ve enjoyed this special interview with Nikita! Check out our blog to see other team member profiles and learn more about Megagon’s recent work!

Written by Nikita Bhutani and Megagon Labs
