
Event

Mastering LLM Serving and Management at Grammarly

2024-05-15 · Meetup


Join us on May 15 to hear Grammarly software engineer Christoph Stuber explore the tools and processes of our ML infrastructure and explain how we manage and serve LLMs at scale.

Registration: to attend the meetup, please register ➡️ here: https://gram.ly/3JGmbrq ⬅️

🔈 Christoph Stuber, Software Engineer at Grammarly

🚀 At Grammarly, we use a combination of third-party LLM APIs and in-house LLMs. During this talk, we'll:
- Talk about how LLMs play an essential role in our product offerings
- Give an overview of the different tools and processes we use in our ML infrastructure
- Discuss how we approach challenges like access, cost control, and load testing of LLMs
- Share our experience in optimizing and serving LLMs

✨ Who Will Be Interested: ML engineers, ML Infrastructure engineers, and anyone with knowledge of, or interest in, ML architecture and infrastructure.

This session will present a general overview of the topic, which will be of interest to enthusiasts and specialists at all levels. For the more senior members of our audience, we will briefly examine the practical aspects and associated challenges.

Agenda:
18:30 Doors open: time for mingling and networking with fellow attendees; snacks and drinks will be served
19:00 Talk
20:00 More snacks, drinks, mingling, and networking
21:00 Meetup ends

✅ Where: In person, Grammarly Berlin hub
✅ When: Wednesday, May 15
✅ Language: English
✅ Use this link to register: https://gram.ly/3JGmbrq

The event is free, and registration is mandatory. Due to the limited number of seats, invites will be sent to registrants on a first-registered, first-invited basis. Please check your inbox for a confirmation email about your attendance.

Sessions & talks



ML infrastructure and serving LLMs at scale

2024-05-15
talk
Christoph Stuber (Grammarly)

This talk covers Grammarly's approach to using a combination of third-party LLM APIs and in-house LLMs: the role LLMs play in our product offerings, an overview of the tools and processes in our ML infrastructure, how we address challenges such as access, cost control, and load testing of LLMs, and our experience in optimizing and serving LLMs.