talk-data.com

Topic: GenAI (Generative AI)

Tags: ai, machine_learning, llm

Activity Trend: peak of 192 activities per quarter, 2020-Q1 to 2026-Q1

Activities

1517 activities · Newest first

With AI tools constantly evolving, the potential for innovation seems limitless. But with great potential comes significant cost, and the question of efficiency and scalability becomes crucial. How can you ensure that your AI models are not only pushing boundaries but also delivering results cost-effectively? What strategies can help reduce the financial burden of training and deploying models while still driving meaningful business outcomes?

Natalia Vassilieva is the VP & Field CTO of ML at Cerebras Systems. Natalia has a wealth of experience in research and development in natural language processing, computer vision, machine learning, and information retrieval. As Field CTO, she helps drive product adoption and customer engagement for Cerebras Systems' wafer-scale AI chips. Previously, Natalia was a Senior Research Manager at Hewlett Packard Labs, leading the Software and AI group. She also served as the head of HP Labs Russia, leading research teams focused on developing algorithms and applications for text, image, and time-series analysis and modeling. Natalia has an academic background, having been a part-time Associate Professor at St. Petersburg State University and a lecturer at the Computer Science Center in St. Petersburg, Russia. She holds a PhD in Computer Science from St. Petersburg State University.

Andy Hock is the Senior VP, Product & Strategy at Cerebras Systems. Andy runs the product strategy and roadmap for Cerebras Systems, focusing on integrating AI research, hardware, and software to accelerate the development and deployment of AI models. He has 15 years of experience in product management, technical program management, and enterprise business development; over 20 years of experience in research, algorithm development, and data analysis for image processing; and 9 years of experience in applied machine learning and AI. Previously, he was the Product Management lead for Data and Analytics for Terra Bella at Google, where he led the development of machine learning-powered data products from satellite imagery. Earlier, he was Senior Director for Advanced Technology Programs at Skybox Imaging (which became Terra Bella following its acquisition by Google in 2014), and before that was a Senior Program Manager and Senior Scientist at Arete Associates. He has a PhD in Geophysics and Space Physics from the University of California, Los Angeles.

In the episode, Richie, Natalia, and Andy explore the dramatic recent progress in generative AI, cost and infrastructure challenges in AI, Cerebras' custom AI chips and other hardware innovations, quantization in AI models, mixture of experts, RLHF, relevant AI use cases, centralized vs decentralized AI compute, the future of AI, and much more.

Links Mentioned in the Show:

• Cerebras
• Cerebras Launches the World's Fastest AI Inference
• Connect with Natalia and Andy
• Course: Implementing AI Solutions in Business
• Rewatch sessions from RADAR: AI Edition

New to DataCamp?

• Learn on the go using the DataCamp mobile app
• Empower your business with world-class data and AI skills with a...

This past year was one of technology's most exciting, with the emergence of generative AI prompting leaders everywhere to consider the possibilities it represents for their organisations.

While many have already recognised its value and are eager to continue innovating, others are inspired by its potential and are seeking ways to adopt it.

To implement a successful AI analytics strategy, three key ingredients are essential: powerful AI models, clean data, and a data culture ready to leverage these solutions.

Join us as we examine the challenges and opportunities data leaders face in preparing their organisations for the AI era.

Establishing a solid data foundation within your organisation is a crucial element of any AI initiative. This foundation enables your organisation to align its data strategies with its overall business strategy, helping to maximise the return on your AI and data investments.

Join our session to explore how "data is the fuel for AI" and how you can unlock the potential of your AI efforts, drive innovation and achieve a sustainable competitive advantage.

The data engineer role has expanded far beyond data pipeline management. Data engineers are now tasked with managing scalable infrastructure, optimizing cloud resources, and ensuring real-time data processing, all while keeping costs in check, which remains a significant challenge.

In this session, Revefi will demonstrate Raden, the world’s first AI data engineer. Raden augments data teams with “distinguished engineer level” expertise in data architecture, system performance, optimization, and cost management.

Raden uses GenAI to address these challenges, working with your team as an 👩‍✈️ AutoPilot and/or 👨‍✈️ CoPilot and automating critical functions such as Data Quality, Data Observability, Spend Management, Performance Management, and Usage Management, allowing your data team to tackle complex use cases with ease.

Join us to discover how you can revamp your data engineering practices and dramatically improve the ROI from your data investments.

In this flagship Big Data LDN keynote debate, conference chair and leading industry analyst Mike Ferguson welcomes executives from leading software vendors to discuss key topics in data management and analytics. Panellists will debate the impact of Generative AI, the implications of key industry trends, how best to deal with real-world customer challenges, how to build a modern data and analytics (D&A) architecture, how to manage, produce, share and govern data and AI, and the on-the-horizon issues that companies should be planning for today.

Attendees will learn best practices for data and analytics implementation in a modern data-driven enterprise from seasoned executives and an experienced industry analyst in a packed, unscripted, candid discussion.

Artificial Intelligence has transitioned from a niche concept to a widespread force shaping the business world's landscape. Streaming and AI integration have emerged as crucial drivers in this digital transformation era, focusing on the dynamic and real-time facets of data flow to generate contextually relevant predictions.

Businesses across diverse sectors increasingly adopt AI technology to optimise operations, stay competitive, and augment user experiences. However, AI's true potential only unfolds when applied to the right data sets, at the right moment, and within the appropriate context. In this session, Italo will discuss how AI and Streaming can work together to provide the latest and freshest data, be it about your customers, your business, or your market.
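To make the pattern concrete, here is a minimal sketch of streaming inference: each event is scored as it arrives, against a rolling window of the freshest data rather than a stale batch snapshot. The event source and scoring rule below are hypothetical stand-ins, not anything presented in the session.

```python
# A minimal sketch of streaming inference: score each event as it arrives,
# against a rolling context window rather than a stale batch snapshot.
# The event source and scoring rule are hypothetical stand-ins.
import random
import time
from collections import deque

def event_stream():
    """Simulate a stream of payment events (stand-in for Kafka/Kinesis)."""
    while True:
        yield {
            "customer_id": random.randint(1, 5),
            "amount": round(random.uniform(1, 500), 2),
            "ts": time.time(),
        }

def anomaly_score(event, recent_amounts):
    """Toy score: how far this amount sits from the recent rolling mean."""
    if not recent_amounts:
        return 0.0
    mean = sum(recent_amounts) / len(recent_amounts)
    return abs(event["amount"] - mean) / mean

recent = deque(maxlen=100)  # the "freshest data": a rolling context window
for i, event in enumerate(event_stream()):
    score = anomaly_score(event, recent)
    recent.append(event["amount"])
    if score > 1.5:  # a contextually relevant, real-time decision
        print(f"flagged {event} (score={score:.2f})")
    if i >= 999:  # bound the demo loop
        break
```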

In this presentation, we will explore the transformative potential of Generative AI, a rapidly evolving field poised to redefine industries across the globe. We will begin by examining the market potential and addressing key challenges surrounding GenAI adoption, providing a comprehensive overview of its current landscape. Following this, we will delve into SAS's innovative approach to GenAI, highlighting our cutting-edge capabilities that empower organisations to harness its full potential. Finally, we will share real-world applications where SAS has successfully enabled organisations to implement GenAI solutions, driving tangible business value and innovation. Join us to gain valuable insights into the future of GenAI and learn how SAS is at the forefront of this technological revolution.

Everything has changed in the last year, with Generative AI entering the scene. This means a reshuffling of priorities and budgets, putting AI-enabled Data & Analytics right back at the top of the agenda. In this session we will discuss:

• Why there is no Generative AI without data, and why it has to be the right data

• The importance of being able to bring together organised and trusted data 

• Why your data integration strategy is the foundation for using AI successfully

As data continues to grow in complexity, the need for a unified data layer with rich semantic business meaning has become more critical than ever. This session examines the transformative impact of integrating generative AI with a well-structured, unified data layer, emphasizing how this combination unlocks new levels of intelligence and efficiency. By standardizing and contextualizing data across the organization, companies can fully leverage the power of generative AI to drive insights, automation, and decision-making. Explore practical strategies and case studies that highlight how a unified data layer is the key to harnessing generative AI, marking the moment when data management truly evolved. Don’t miss this opportunity to learn how to prepare your data infrastructure for the future.
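As one concrete illustration of what such a layer might look like, the sketch below has business terms resolving to single governed definitions that any consumer, including a GenAI agent, can query. The metric names and SQL are hypothetical, not taken from the session.

```python
# A minimal sketch of a unified semantic layer: business terms resolve to one
# governed definition, so every consumer (a BI dashboard, a GenAI agent) gets
# the same answer. All names and definitions here are hypothetical.
SEMANTIC_LAYER = {
    "active_customers": {
        "description": "Customers with at least one order in the last 90 days",
        "sql": (
            "SELECT COUNT(DISTINCT customer_id) FROM orders "
            "WHERE order_date >= DATE('now', '-90 day')"
        ),
    },
    "net_revenue": {
        "description": "Gross revenue minus refunds, in the reporting currency",
        "sql": "SELECT SUM(amount) - SUM(refunded) FROM payments",
    },
}

def resolve_metric(term: str) -> str:
    """Return the governed SQL for a business term.

    A GenAI agent calls this instead of guessing table and column names,
    which is what keeps its answers consistent across the organization.
    """
    entry = SEMANTIC_LAYER.get(term)
    if entry is None:
        raise KeyError(f"'{term}' is not a governed metric")
    return entry["sql"]

print(resolve_metric("active_customers"))
```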

Overcome the limitations of your legacy data warehouse or BI systems and reap the benefits of a cloud-native stack with LeapLogic, Impetus’ automated cloud migration accelerator. Join our session to explore how LeapLogic’s end-to-end automated capabilities can fast-track and streamline the transformation of legacy data warehouse, ETL, Hadoop, analytics, and reporting workloads to the cloud. Gain actionable insights from real-world success stories of Fortune 500 enterprises that have successfully modernised their legacy workloads, positioning them at the forefront of the GenAI revolution. 

Join us at the Gen AI theater for an exclusive fireside chat with Matthew Thomson, Nokia’s head of data and digital, and Marianne Taudiere, Quid’s vice president of EMEA. Delve into how predictive analytics is revolutionizing strategic functions, explore the pivotal role of data in Nokia’s operations, and discover compelling case studies showcasing the transformative impact of Generative AI. Don’t miss this engaging session to learn how Gen AI is shaping the future of business!

In today’s data-driven world, whether you’re building your own data pipelines or relying on third-party vendors, understanding the fundamentals of great data movement systems is invaluable. It’s not just about making things work—it’s about ensuring your data operations are reliable, scalable, and cost-effective.

As an early employee and Airbyte’s Platform Architect, I’ve spent the last 3.5 years working through the challenges and intricacies of building a data movement platform. Along the way, I’ve learned some important lessons, often the hard way, that I believe could be helpful to others who are on a similar journey.

In this session, I’ll share these lessons in the hope that my experiences can offer some guidance, whether you’re just starting out or looking to refine what you’ve already built. I’ll also touch on how the rapid rise of generative AI is changing the landscape, and how we’re trying to adapt to these new challenges. My goal is to provide insights anyone can take back to their own projects, helping them avoid some of the pitfalls and navigate the complexities of modern data movement.

2-3 Main Actionable Takeaways:

• A general framework for designing a data movement system.

• Crucial fine print, such as managing various destination memory types, the surprising need to re-import data, and the shortcuts & pitfalls of artificial cursors (see the sketch after this list).

• Adjusting data movement systems for an AI-first world.
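For readers unfamiliar with the artificial cursors mentioned in the second takeaway: incremental sync tracks a cursor (often an updated_at column) so each run moves only new rows, and connectors must synthesize one when the source lacks a reliable column. Below is a minimal sketch of the cursor mechanics and the re-import pitfall, with illustrative names and a SQLite stand-in for a real source; it is not Airbyte's implementation.

```python
# A minimal sketch of cursor-based incremental sync, assuming a source table
# with an `updated_at` column. All names are illustrative; this is not
# Airbyte's implementation.
import json
import sqlite3
from pathlib import Path

STATE_FILE = Path("sync_state.json")  # cursor persisted between runs

def load_cursor() -> str:
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["cursor"]
    return "1970-01-01T00:00:00"  # epoch, so the first run is a full import

def save_cursor(cursor: str) -> None:
    STATE_FILE.write_text(json.dumps({"cursor": cursor}))

def incremental_sync(conn: sqlite3.Connection) -> None:
    """Emit only rows updated since the last saved cursor value."""
    cursor_value = load_cursor()
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM source_table "
        "WHERE updated_at > ? ORDER BY updated_at",
        (cursor_value,),
    ).fetchall()
    for row in rows:
        print("emit", row)  # stand-in for writing to a destination
        cursor_value = max(cursor_value, row[2])
    save_cursor(cursor_value)

# Pitfall the cursor glosses over: rows back-filled with old `updated_at`
# values are silently skipped, hence the "surprising need to re-import data"
# with a periodic full refresh.
if __name__ == "__main__":
    STATE_FILE.unlink(missing_ok=True)  # reset state for the demo
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE source_table (id INTEGER, payload TEXT, updated_at TEXT)")
    conn.execute("INSERT INTO source_table VALUES (1, 'a', '2024-01-01T00:00:00')")
    incremental_sync(conn)  # first run: emits the row
    incremental_sync(conn)  # second run: nothing new to emit
```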