Companies need robust data management capabilities to build and deploy AI. Data needs to be easy to find, understandable, and trustworthy. It is even more important to secure data properly from the beginning of its lifecycle; otherwise, it risks exposure during training or inference. Tokenization is a highly efficient method for securing data without compromising performance. In this session, we'll share tips for managing high-quality, well-protected data at scale that are key to accelerating AI. In addition, we'll discuss how to integrate visibility and optimization into your compute environment to manage the hidden cost of AI: your data.
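To make the tokenization idea concrete, below is a minimal, hypothetical Python sketch of vault-based tokenization: sensitive values are swapped for random tokens before data reaches training or inference pipelines, and the real values stay in a protected mapping. The `TokenVault` class and its methods are illustrative assumptions, not any particular product's API or the approach described in the session.

```python
import secrets

class TokenVault:
    """Hypothetical vault mapping tokens to sensitive values (sketch only)."""

    def __init__(self):
        self._token_to_value = {}  # protected store: token -> original value
        self._value_to_token = {}  # reverse index so repeated values reuse one token

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a non-sensitive random token."""
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only authorized callers should reach this."""
        return self._token_to_value[token]

vault = TokenVault()
record = {"name": "Jane Doe", "ssn": "123-45-6789"}
# Downstream training/inference pipelines see only tokens, never the raw values.
safe_record = {field: vault.tokenize(value) for field, value in record.items()}
print(safe_record)
```

Because the tokens carry no mathematical relationship to the originals, the tokenized dataset can flow through analytics and model pipelines with ordinary performance, while the vault itself remains the single, tightly controlled point of exposure.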
Speaker: Yudhish Batra, Distinguished Engineer, Capital One