talk-data.com

Topic: GitHub

Tags: version_control · collaboration · code_hosting

661 tagged

Activity Trend: 79 peak/qtr, 2020-Q1 to 2026-Q1

Activities

661 activities · Newest first

Enterprises modernizing core systems on Azure face a familiar set of challenges: legacy mainframes, complex code refactors, and fragmented DevOps pipelines. Join Cognition to get a practical view of how autonomous AI agents like Devin help engineering teams accelerate large-scale modernization efforts. Witness how engineers used Devin to refactor COBOL applications, automate migration pipelines into Azure DevOps and GitHub, and validate migrated workloads for production.

AI-powered workflows with GitHub and Azure DevOps

Modernize your DevOps strategy with Agentic DevOps by migrating your Azure Repos to GitHub while continuing to leverage the investments you’ve made in Azure Boards and Azure Pipelines. We’ll walk through real-world patterns for hybrid adoption, show how to integrate GitHub, Azure Boards and Azure Pipelines, and share best practices for enabling agent-based workflows with the MCP Servers for Azure DevOps, Playwright and Azure.

Delivered in a silent stage breakout.

Engineering the Future in the Age of Digital Product Innovation

Discover how AI, cloud, and robotics are transforming digital engineering across industries. Learn how Microsoft and its partners are enabling faster design, smarter simulation, and scalable automation—from PLM modernization to autonomous systems. See how Azure, GitHub, and agentic AI are powering the next wave of industrial innovation.

Security standards often live in policy documents but aren’t consistently enforced. Learn how Azure Policy and Machine Configuration let you deploy built-in CIS benchmark templates for Linux and Windows, customize them to your needs, and apply them across Azure and hybrid/multi-cloud servers (via Azure Arc). We’ll also show how to integrate these policies into CI/CD pipelines (e.g., GitHub Actions) for continuous compliance checks, turning written guidelines into real-world security posture.
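The CI/CD integration the session describes boils down to a gate: export a machine's settings, evaluate them against the benchmark, and fail the pipeline run on any violation. Here is a minimal Python sketch of that gate, with hypothetical rule names and thresholds standing in for the real CIS benchmark definitions that Azure Policy / Machine Configuration would supply:

```python
# Illustrative CI compliance gate. The rule names and thresholds below are
# hypothetical stand-ins, not the actual CIS benchmark content from Azure
# Policy / Machine Configuration.

BASELINE = {
    "password_min_length": lambda v: v >= 14,  # CIS-style minimum length
    "ssh_root_login": lambda v: v == "no",     # root SSH must be disabled
    "firewall_enabled": lambda v: v is True,
}

def check_compliance(settings: dict) -> list:
    """Return the names of baseline rules the given settings fail.

    A missing setting counts as a failure, so an empty report can never
    pass by omission.
    """
    return [rule for rule, passes in BASELINE.items()
            if rule not in settings or not passes(settings[rule])]
```

In a GitHub Actions job, a step would load the exported settings, call `check_compliance`, and exit nonzero when the returned list is non-empty, which fails the workflow and blocks the deploy.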

Want to ship applications faster? This demo shows you how to dramatically accelerate time to market with a high-velocity vibe coding workflow. We'll build a complete app with Azure Cosmos DB from scratch using GitHub Copilot as our AI pair programmer. We'll make use of the new Azure Cosmos DB Linux emulator and VS Code extension for local development to create a frictionless, end-to-end experience that helps you deliver features faster than ever before.

In this talk we’ll learn Infrastructure-as-Code by automating the world’s most popular game: Minecraft. Using Packer, Terraform and GitHub Actions, we’ll build a server, configure Linux, provision cloud infrastructure and operate it through GitOps. Finally, we’ll demonstrate how to go beyond automating traditional cloud control planes—automating the Minecraft world itself by using Terraform to build and demolish structures like castles and pyramids before our very eyes!

This demo-heavy session highlights the enhanced MSSQL extension for Visual Studio Code, now more robust than ever with new AI-driven enhancements to streamline your SQL development experience. With GitHub Copilot, you can move faster from schema to code, generate sample data, explore relationships, and help your app and backend stay in sync. With our latest mssql-python driver, you can develop with ease, security, and performance, across SQL Server, Azure SQL and SQL database in Fabric.

Before the age of agents, tech debt often went straight from the backlog to the graveyard. While it’s important to continually identify ways to improve your services, it can be difficult to prioritize those items over customer-facing enhancements. With the introduction of AI agents, you can now more easily modernize and evolve your codebase, ensuring it never becomes the next legacy system. Learn how to embrace modernization as an ongoing process and how to leverage AI agents to keep your codebase up to date.

Learn to leverage agent-framework, the new unified platform from the Semantic Kernel and AutoGen engineering teams, to build A2A-compatible agents similar to Magentic-One. Use SWE agents (GitHub Copilot coding agent and Codex with Azure OpenAI models) to accelerate development. Implement MCP tools for secure enterprise agentic workflows. Experience hands-on building, deploying, and orchestrating multi-agent systems with pre-release capabilities. Note: Contains embargoed content.

Please RSVP and arrive at least 5 minutes before the start time, at which point remaining spaces are open to standby attendees.

From legacy to modern .NET on Azure faster than ever

Modernizing legacy .NET apps just became easier end to end. Explore how Visual Studio 2026, .NET, and GitHub Copilot modernization tooling streamline the journey. Learn how new features in Azure App Service solve common compatibility challenges—delivering scalable, secure, AI-ready deployments with deep observability and cost efficiency.

In this hands-on lab, you will learn how to migrate traditional workloads to AKS using containerization and AKS Automatic, while enhancing functionality through integration with Azure PaaS services. We'll also explore how GitHub Copilot Application Modernization accelerates the process with AI-assisted coding and intelligent refactoring, helping teams streamline modernization and unlock cloud-native capabilities faster.

Please RSVP and arrive at least 5 minutes before the start time, at which point remaining spaces are open to standby attendees.

“On Air at Microsoft Ignite” will take a deep dive into key announcements, with expert interviews, demonstrations, and real-world applications of all the latest news.

Hour 2 will feature: Azure Infrastructure with Jeremy Winter; AI Skills Navigator with Kavitha Radhakrishnan; Varonis with Shawn Hays; NVIDIA with Andrew Hester; GitHub Copilot with Martin Woodward; AMD with Arjun Oberoi and Daniel Kim.

In this episode, Mukundan opens up about one of the most difficult phases of his job hunt and how he built a tiny 3-agent AI system that completely changed the way he applied, prepared, and stayed consistent. You will learn how the Researcher Agent, Writer Agent, and Reviewer Agent work together to turn any job description into clarity. You will also hear a raw, human story about self-doubt, burnout, and the psychology behind job search momentum. This is a deeply personal, practical episode for anyone who feels stuck, exhausted, confused, or overwhelmed in their job search. Optimized for: AI jobs, data jobs, job search tools, AI agents, career automation, productivity systems, job search burnout, how to find clarity in job search.

Join the Discussion (comments hub): https://mukundansankar.substack.com/notes
Tools I use for my Podcast and Affiliate Partners: Recording Partner: Riverside (affiliate) · Host Your Podcast: RSS.com (affiliate) · Research Tools: Sider.ai (affiliate) · Sourcetable AI (affiliate)
🔗 Connect with Me: Free Email Newsletter · Website: Data & AI with Mukundan · GitHub: https://github.com/mukund14 · Twitter/X: @sankarmukund475 · LinkedIn: Mukundan Sankar · YouTube: Subscribe

In this episode, Conor and Bryce record live from C++ Under the Sea! We interview Ray and Paul from NVIDIA and talk about Parrot, scans, and more!
Link to Episode 260 on Website · Discuss this episode, leave a comment, or ask a question (on GitHub)
Socials: ADSP: The Podcast: Twitter · Conor Hoekstra: Twitter | BlueSky | Mastodon · Bryce Adelstein Lelbach: Twitter
About the Guests: Ray has been a Senior Systems Software Engineer at NVIDIA since 2022. He studied Software Engineering at the University of Amsterdam, founded the Dutch C++ Meetup in 2013, and has co-organized C++ Under the Sea since 2023. He has been programming for more than 25 years; his journey began on his father’s Panasonic CF-2700 MSX, and he has been hooked ever since. He is also ‘the listener’ of ADSP: The Podcast. Paul Grosse-Bley was first introduced to parallel programming with C and MPI during a student exchange in Umeå, Sweden, in 2017 while studying Physics. In the following years he learned more about MPI, OpenMP, OpenACC, Thrust/parSTL, and CUDA C++. After finishing his Master’s degree in Physics at Heidelberg University (Germany) in 2021, he became a PhD candidate in Computational Science and Engineering, researching the acceleration of iterative solvers in sparse linear algebra while serving as head tutor for a course on GPU Algorithm Design. He started using Thrust in 2019, shortly before learning C++, and became enamored with parallel algorithms, which led to numerous answers on Stack Overflow, contributions on GitHub, an NVIDIA internship in the summer of 2025, and a full-time position starting in February 2026.
Show Notes
Date Recorded: 2025-10-10 · Date Released: 2025-11-14
NVIDIA BCM (Base Command Manager) · C++11 std::ignore · C++20 std::bind_front · Parrot · Parrot on GitHub · Parrot YouTube Video: 1 Problem, 7 Libraries (on the GPU) · thrust::inclusive_scan · Single-pass Parallel Prefix Scan with Decoupled Look-back by Duane Merrill & Michael Garland · Prefix Sums and Their Applications by Guy Blelloch · Parallel Prefix Sum (Scan) with CUDA · NVIDIA On-Demand Videos · A Faster Radix Sort Implementation
Intro Song Info: Miss You by Sarah Jansen https://soundcloud.com/sarahjansenmusic · Creative Commons — Attribution 3.0 Unported — CC BY 3.0 · Free Download / Stream: http://bit.ly/l-miss-you · Music promoted by Audio Library
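For readers new to the scan discussed throughout the episode: an inclusive scan (prefix sum) produces, at each position, the reduction of every element up to and including that position. That is the semantics of thrust::inclusive_scan; the sketch below is a sequential Python version for intuition only (the decoupled look-back paper in the show notes covers how GPUs compute the same result in a single parallel pass):

```python
# Sequential inclusive scan: out[i] = xs[0] op xs[1] op ... op xs[i].
# Same semantics as thrust::inclusive_scan, minus the parallelism.
from itertools import accumulate
import operator

def inclusive_scan(xs, op=operator.add):
    """Return the running reduction of xs under op."""
    return list(accumulate(xs, op))

# inclusive_scan([3, 1, 4, 1, 5]) -> [3, 4, 8, 9, 14]
```

Passing `operator.mul` instead of the default gives running products, mirroring how Thrust accepts an arbitrary associative binary operator.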

Bridging Accessibility and AI: Sign Language Recognition & Inclusive Design with Sheida Rashidi

As AI continues to shape human-computer interaction, there’s a growing opportunity and responsibility to ensure these technologies serve everyone, including people with communication disabilities. In this talk, I will present my ongoing work in developing a real-time American Sign Language (ASL) recognition system, and explore how integrating accessible design principles into AI research can expand both usability and impact.

The core of the talk will cover the Sign Language Recogniser project (available on GitHub), in which I used MediaPipe Studio together with TensorFlow, Keras, and OpenCV to train a model that classifies ASL letters from hand-tracking features.

I’ll share the methodology: data collection, feature extraction via MediaPipe, model training, and demo/testing results. I’ll also discuss challenges encountered, such as dealing with gesture variability, lighting and camera differences, latency constraints, and model generalization.
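One of those challenges, variability in hand position and camera distance, is commonly tackled by normalizing landmark coordinates before they reach the classifier. Below is a hedged sketch of one such scheme: the 21-landmark layout matches MediaPipe Hands, but this particular normalization is an illustration, not necessarily the project's exact pipeline:

```python
# Illustrative landmark normalization for hand-tracking features.
# Landmark 0 is the wrist in the MediaPipe Hands layout; the exact scheme
# here (wrist-origin, max-distance scaling) is a common choice, shown as
# an example rather than the Sign Language Recogniser's actual code.
import math

def normalize_landmarks(points):
    """Make (x, y) landmarks invariant to hand position and distance.

    Translate so the wrist (landmark 0) is the origin, then scale so the
    farthest landmark sits at distance 1.
    """
    ox, oy = points[0]
    shifted = [(x - ox, y - oy) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]
```

With features normalized this way, the same gesture captured close to or far from the camera yields near-identical inputs, which helps with the model-generalization issues mentioned above.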

Beyond the technical implementation, I’ll reflect on the broader implications: how accessibility-focused AI projects can promote inclusion, how design decisions affect trust and usability, and how women in AI & data science can lead innovation that is both rigorous and socially meaningful. Attendees will leave with actionable insights for building inclusive AI systems, especially in domains involving rich human modalities such as gesture or sign.

Stay Online When the Grid Dies!

When the power or internet fails, your workflow shouldn’t. In 13 minutes, I’ll demo a Ruby-powered importer that syncs GitHub activity to Mastodon, turning contribution graphs into animated Conway’s Game of Life SVGs published to the Fediverse. 100% offline-capable. Join to brainstorm a blackout-resilient dev stack and maybe co-build the next piece.
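For context on the Game of Life half of that pipeline: each animation frame is one application of Conway's rule to a set of live cells (here, the squares of a contribution graph). A minimal Python sketch of the rule itself; the Ruby importer, Mastodon posting, and SVG rendering are separate concerns:

```python
# One generation of Conway's Game of Life on a sparse set of live
# (row, col) cells. A cell survives with 2 or 3 live neighbours; an
# empty cell is born with exactly 3.
from collections import Counter

def life_step(live):
    """Return the set of live cells after one generation."""
    # Count, for every cell adjacent to a live cell, how many live
    # neighbours it has.
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A horizontal "blinker" flips to vertical:
# life_step({(1, 0), (1, 1), (1, 2)}) -> {(0, 1), (1, 1), (2, 1)}
```

Rendering is then a matter of iterating `life_step` and emitting one SVG frame per generation, which is exactly the kind of loop that keeps working offline.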

I missed my parents, so I built an AI that talks like them. This isn’t about replacing people—it’s about remembering the voices that make us feel safe. In this 90-minute episode of Data & AI with Mukundan, we explore what happens when technology stops chasing efficiency and starts chasing empathy. Mukundan shares the story behind “What Would Mom & Dad Say?”, a Streamlit + GPT-4 experiment that generates comforting messages in the voice of loved ones. You’ll hear: the emotional spark that inspired the project · the plain-English prompts anyone can use to teach AI empathy · the boundaries and ethics of emotional AI · how this project reframed loneliness, creativity, and connection. Takeaway: AI can’t love you—but it can remind you of the people who do. 🔗 Try the free reflection prompts below: THE ONE-PROMPT VERSION: “What Would Mom & Dad Say?”
“You are speaking to me as one of my parents. Choose the tone I mention: either Mom (warm and reflective) or Dad (practical and encouraging). First, notice the emotion in what I tell you—fear, stress, guilt, joy, or confusion—and name it back to me so I feel heard. Then reply in 3 parts: 1) Start by validating what I’m feeling, in a caring way. 2) Share a short story, lesson, or perspective that fits the situation. 3) End with one hopeful or guiding question that helps me think forward. Keep your words gentle, honest, and simple. No technical language. Speak like someone who loves me and wants me to feel calm and capable again.”


Supercharging Multimodal Feature Engineering with Lance and Ray

Efficient feature engineering is key to unlocking modern multimodal AI workloads. In this talk, we’ll dive deep into how Lance - an open-source format with built-in indexing, random access, and data evolution - works seamlessly with Ray’s distributed compute and UDF capabilities. We’ll walk through practical pipelines for preprocessing, embedding computation, and hybrid feature serving, highlighting concrete patterns attendees can take home to supercharge their own multimodal pipelines. See https://lancedb.github.io/lance/integrations/ray to learn more about this integration.
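The embedding-computation step follows a familiar batch-UDF shape: pull batches of rows, run the model once per batch, and attach the resulting vectors back to the rows. A self-contained Python sketch of that pattern, where `fake_embed` is a hypothetical stand-in for a real embedding model and a plain list stands in for a Lance-backed Ray dataset (in the real pipeline, Ray's `map_batches` plays this role over distributed data):

```python
# Sketch of the batch-UDF pattern for embedding computation. fake_embed
# and the row shape are illustrative stand-ins, not the Lance/Ray API.

def fake_embed(texts):
    # Stand-in for a real embedding model: here, just the text length
    # as a 1-dimensional "vector".
    return [[float(len(t))] for t in texts]

def map_batches(rows, fn, batch_size=2):
    """Apply fn to fixed-size batches of rows and attach each result
    back onto its row under the "embedding" key."""
    out = []
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        embeddings = fn([r["text"] for r in batch])
        out.extend({**r, "embedding": e} for r, e in zip(batch, embeddings))
    return out
```

Batching matters because the model call dominates cost; running it once per batch rather than once per row is what makes the distributed version worth parallelizing.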