At Deepgram, we spend every day tackling big, real-world challenges in speech. Our customers hire us to solve their hardest problems in speech, taking real, complex audio and transforming it into novel insights. And to raise the bar, everything we build needs scale in its DNA. We aren’t content with simple horizontal scaling; we intend to replace entire data centers dedicated to speech analytics with a single rack of servers. These challenges provide opportunities for creativity and innovative problem-solving every day. Deepgram’s Research Scientists tackle some of the most exciting and difficult problems at the forefront of ASR and NLU technologies.
You’ll have the freedom to innovate and uncover breakthroughs — and influence our product roadmap in turn. We look forward to you bringing your whole self to work, sharing learnings from your latest experiments, and collaborating with us to advance the state of speech technology.
The Role
Deepgram is currently looking for an experienced Research Scientist who has worked extensively with Large Language Models (LLMs) and has a deep understanding of transformer architecture. This individual should have extensive experience working on the hard technical aspects of LLMs, such as data curation, distributed large-scale training, optimization of transformer architecture, and Reinforcement Learning (RL) training.
What You’ll Do
Brainstorming and collaborating with other members of the research team to define new LLM research initiatives
Broad surveying of literature, evaluating, classifying, and distilling current methods
Designing and carrying out experimental programs for LLMs
Driving transformer (LLM) training jobs successfully on distributed compute infrastructure and deploying new models into production
Documenting and presenting results and complex technical concepts clearly for a target audience
Staying up to date with the latest advances in deep learning and LLMs, with a particular eye towards their implications and applications within our products
You’ll Love This Role If You
Are passionate about AI and excited about working on state-of-the-art LLM research
Have an interest in producing and applying new science to help us develop and deploy large language models
Enjoy building from the ground up and love to create new systems
Have strong communication skills and are able to translate complex concepts clearly
Are highly analytical and enjoy delving into detailed analyses when necessary
It’s Important To Us That You Have
3+ years of experience in applied deep learning research, with a solid understanding of the applications and implications of different neural network types, architectures, and loss mechanisms
Proven experience working with large language models (LLMs), including experience with data curation, distributed large-scale training, optimization of transformer architecture, and RL training
Strong experience coding in Python and working with PyTorch
Experience with various transformer architectures (auto-regressive, sequence-to-sequence, etc.)
Experience with distributed computing and large-scale data processing
Prior experience in conducting experimental programs and using results to optimize models
It Would Be Great if You Had
Deep understanding of transformers, causal LMs, and their underlying architecture
Understanding of distributed training and distributed inference schemes for LLMs
Familiarity with RLHF labeling and training pipelines
Up-to-date knowledge of recent LLM techniques and developments
Published papers in deep learning research, particularly related to LLMs and deep neural networks
Salary range of USD $150,000 to $250,000 base plus variable and equity regardless of location
Deepgram is an equal opportunity employer. We want all voices and perspectives represented in our workforce. We are a curious bunch focused on collaboration and doing the right thing. We put our customers first, grow together and move quickly. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, gender identity or expression, age, marital status, veteran status, disability status, pregnancy, parental status, genetic information, political affiliation, or any other status protected by the laws or regulations in the locations where we operate.
We are happy to provide accommodations for applicants who need them.
Deepgram is a foundational AI company building state-of-the-art, production-ready AI models that streamline human-computer interaction and amplify productivity. By enabling seamless communication between humans and machines, we believe we can harness the untapped potential of AI and help pave the way for a more productive future. We passionately believe in the potential of audio data to transform lives, businesses, and interactions across the globe - which is why Deepgram is trusted by well-respected companies like NASA, Twilio, Auth0, and Spotify to push the boundaries of what is possible in voice technology!
Backed by prominent investors including Y Combinator, Madrona, Tiger Global, Wing VC and NVIDIA, Deepgram has raised over $85 million in total funding after closing our Series B funding round last year. If you're looking to work on cutting-edge technology and make a significant impact in the AI industry, we'd love to hear from you!