talk-data.com

Don Scott

Speaker · GM, Azure AI Products, Microsoft

3 talks

Don Scott is the GM of Azure AI Products at Microsoft, where he leads portfolio strategy and marketing for Azure AI Foundry and Microsoft’s agentic platform, bringing frontier models and tools to developers and businesses worldwide. He collaborates with engineering, research, partner, and field teams to translate breakthroughs into measurable customer outcomes while upholding Responsible AI principles. A builder and operator at heart, he has led high-growth cloud and data platform businesses, scaled product-led motions across the AI and data cloud, and speaks with customers about modern AI architectures, safety, and ROI.

Bio from: Databricks DATA + AI Summit 2023

Talks & appearances

3 activities

Build Partner Advantage: Drive Key AI Use-Cases with Azure Tech Stack

This session highlights the most popular market trends and use cases in building apps and agents for customers, and shows how partners can use the full Azure tech stack (app, data, and AI platforms) together with the world's most popular developer tools to deliver the solutions their customers need. It also covers the resources available to help partners succeed.

Driving agentic innovation with MCP as the backbone of tool-aware AI

In this technical deep-dive, Maria and Don unveil how Microsoft is shaping the future of agent-tool interactions through the Model Context Protocol (MCP). As AI agents evolve from simple task runners to autonomous collaborators, MCP emerges as the preferred communication mechanism—enabling secure, scalable, and OS-agnostic tool orchestration across platforms like Microsoft Foundry, GitHub, and VS Code.
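As a loose illustration of the agent-tool pattern the session covers (not code from the talk itself), the sketch below registers a single tool with an MCP server using the open-source MCP Python SDK's FastMCP helper; the server name and the get_weather tool are made up for the example, and the exact SDK surface may differ between versions.

```python
# Minimal MCP tool-server sketch (assumes the `mcp` Python SDK is installed,
# e.g. `pip install mcp`). The "weather" tool is a made-up example, not a
# Microsoft-provided tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")  # hypothetical server name

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    # A real tool would call an external service here.
    return f"It is sunny in {city} today."

if __name__ == "__main__":
    # Serve the tool over stdio so an MCP-aware agent host can connect.
    mcp.run()
```

An MCP-aware host such as VS Code or GitHub Copilot can launch a script like this, discover the tool from its declared schema, and let an agent invoke it without bespoke integration code, which is the kind of tool-aware orchestration the abstract describes.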

Comparing Databricks and Snowflake for Machine Learning

Snowflake and Databricks both aim to provide data science toolkits for machine learning workflows, albeit with different approaches and resources. While developing ML models is technically possible on either platform, the Hitachi Solutions Empower team tested which solution is easier, faster, and cheaper to work with in terms of both user experience and business outcomes for our customers. To do this, we designed and conducted a series of experiments with use cases from the TPCx-AI benchmark standard. We developed both single-node and multi-node versions of these experiments, which in Snowflake's case sometimes required us to set up separate compute infrastructure outside the platform. We also built datasets of various sizes (1GB, 10GB, and 100GB) to assess how each platform/node setup handles scale.

Based on our findings, on average, Databricks is faster, cheaper, and easier to use for developing machine learning models, and we use it exclusively for data science on the Empower platform. Snowflake’s reliance on third-party resources for distributed training is a major drawback, and the need to use multiple compute environments to scale up training adds complexity that is, in our view, an unnecessary complication to achieve the best results.
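For background on the single-node versus multi-node setups the experiments compare, the sketch below contrasts a single-node scikit-learn fit with a distributed fit using Spark MLlib on a Databricks cluster. It is purely illustrative: the file paths, columns, and model choice are placeholders, not the TPCx-AI workloads from the talk.

```python
# Single-node training: runs entirely on one machine (the driver) and is
# bounded by that machine's memory and CPU.
import pandas as pd
from sklearn.linear_model import LogisticRegression

pdf = pd.read_parquet("/dbfs/tmp/train_1gb.parquet")  # hypothetical path
single_node_model = LogisticRegression(max_iter=200).fit(
    pdf[["feature_a", "feature_b"]], pdf["label"]
)

# Distributed training: Spark MLlib spreads the same fit across the cluster's
# workers, so larger datasets scale without leaving the platform.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression as SparkLogReg

spark = SparkSession.builder.getOrCreate()  # created automatically in Databricks notebooks
sdf = spark.read.parquet("/tmp/train_100gb.parquet")  # hypothetical path
assembled = VectorAssembler(
    inputCols=["feature_a", "feature_b"], outputCol="features"
).transform(sdf)
distributed_model = SparkLogReg(labelCol="label", featuresCol="features").fit(assembled)
```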

Talk by: Michael Green and Don Scott
