talk-data.com

Topic: Python
Tags: programming_language, data_science, web_development
Tagged activities: 1446

Activity Trend: 185 peak/qtr (2020-Q1 to 2026-Q1)

Activities: 1446 activities · Newest first

Summary: In this episode of the Data Engineering Podcast, host Tobias Macey speaks with Jeremy Edberg, CEO of DBOS, about durable execution and its impact on designing and implementing business logic for data systems. Jeremy explains how DBOS's serverless platform and orchestrator provide local resilience and reduce operational overhead, ensuring exactly-once execution in distributed systems through the use of the Transact library. He discusses the importance of version management in long-running workflows and how DBOS simplifies system design by reducing infrastructure needs like queues and CI pipelines, making it beneficial for data pipelines, AI workloads, and agentic AI.

Announcements
Hello and welcome to the Data Engineering Podcast, the show about modern data management.
Data migrations are brutal. They drag on for months, sometimes years, burning through resources and crushing team morale. Datafold's AI-powered Migration Agent changes all that. Their unique combination of AI code translation and automated data validation has helped companies complete migrations up to 10 times faster than manual approaches. And they're so confident in their solution, they'll actually guarantee your timeline in writing. Ready to turn your year-long migration into weeks? Visit dataengineeringpodcast.com/datafold today for the details.
Your host is Tobias Macey and today I'm interviewing Jeremy Edberg about durable execution and how it influences the design and implementation of business logic.

Interview
- Introduction
- How did you get involved in the area of data management?
- Can you describe what DBOS is and the story behind it?
- What is durable execution?
- What are some of the notable ways that inclusion of durable execution in an application architecture changes the ways that the rest of the application is implemented? (e.g. error handling, logic flow, etc.)
- Many data pipelines involve complex, multi-step workflows. How does DBOS simplify the creation and management of resilient data pipelines?
- How does durable execution impact the operational complexity of data management systems?
- One of the complexities in durable execution is managing code/data changes to workflows while existing executions are still processing. What are some of the useful patterns for addressing that challenge and how does DBOS help?
- Can you describe how DBOS is architected?
- How have the design and goals of the system changed since you first started working on it?
- What are the characteristics of Postgres that make it suitable for the persistence mechanism of DBOS?
- What are the guiding principles that you rely on to determine the boundaries between the open source and commercial elements of DBOS?
- What are the most interesting, innovative, or unexpected ways that you have seen DBOS used?
- What are the most interesting, unexpected, or challenging lessons that you have learned while working on DBOS?
- When is DBOS the wrong choice?
- What do you have planned for the future of DBOS?

Contact Info
- LinkedIn

Parting Question
- From your perspective, what is the biggest gap in the tooling or technology for data management today?

Closing Announcements
- Thank you for listening! Don't forget to check out our other shows. Podcast.__init__ covers the Python language, its community, and the innovative ways it is being used. The AI Engineering Podcast is your guide to the fast-moving world of building AI systems.
- Visit the site to subscribe to the show, sign up for the mailing list, and read the show notes.
- If you've learned something or tried out a project from the show then tell us about it! Email [email protected] with your story.

Links
- DBOS
- Exactly Once Semantics
- Temporal
- Semaphore
- Postgres
- DBOS Transact (Python, TypeScript)
- Idempotency Keys
- Agentic AI
- State Machine
- YugabyteDB (Podcast Episode)
- CockroachDB
- Supabase
- Neon (Podcast Episode)
- Airflow

The intro and outro music is from The Hug by The Freak Fandango Orchestra / CC BY-SA.
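To make the episode's core idea concrete, here is a toy sketch of the durable-execution pattern discussed above: each step's output is checkpointed, so a re-run after a crash returns recorded results instead of repeating side effects. This is an illustration only, using sqlite3 for brevity; DBOS Transact checkpoints to Postgres and adds versioning, queues, and recovery that this sketch omits. All names here (durable_step, payment_workflow, wf_id) are hypothetical.

```python
import sqlite3
import json

# Checkpoint store: one row per completed step, keyed by workflow id + step name.
db = sqlite3.connect("checkpoints.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS steps "
    "(wf_id TEXT, step TEXT, output TEXT, PRIMARY KEY (wf_id, step))"
)

def durable_step(wf_id, name, fn, *args):
    """Run fn once per (workflow, step); replay its recorded output thereafter."""
    row = db.execute(
        "SELECT output FROM steps WHERE wf_id = ? AND step = ?", (wf_id, name)
    ).fetchone()
    if row:  # already ran: skip re-execution, return the checkpointed result
        return json.loads(row[0])
    result = fn(*args)
    db.execute("INSERT INTO steps VALUES (?, ?, ?)", (wf_id, name, json.dumps(result)))
    db.commit()
    return result

def payment_workflow(wf_id, amount):
    charge = durable_step(wf_id, "charge", lambda a: {"charged": a}, amount)
    receipt = durable_step(wf_id, "receipt", lambda c: {"receipt_for": c["charged"]}, charge)
    return receipt

# Safe to re-run with the same wf_id after a crash: completed steps are replayed,
# not repeated, which is the exactly-once behavior the episode describes.
print(payment_workflow("order-42", 100))
```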

Unlock the power of code execution with Gemini 2.0 Flash! This hands-on lab demonstrates how to generate and run Python code directly within the Gemini API. Learn to use this capability for tasks like solving equations, processing text, and building code-driven applications.
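As a preview of what the lab covers, here is a minimal sketch using the google-genai Python SDK's code-execution tool. The API key is a placeholder, and the exact prompt is illustrative; treat any detail not in the lab description as an assumption.

```python
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")  # hypothetical placeholder key

# Ask the model to write AND run Python code server-side via the code-execution tool.
response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Write and run Python code to solve x**2 - 7*x + 10 = 0.",
    config=types.GenerateContentConfig(
        tools=[types.Tool(code_execution=types.ToolCodeExecution())],
    ),
)

# The response interleaves text, the generated code, and its execution output.
for part in response.candidates[0].content.parts:
    if part.text:
        print(part.text)
    if part.executable_code:
        print(part.executable_code.code)
    if part.code_execution_result:
        print(part.code_execution_result.output)
```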

If you register for a Learning Center lab, please ensure that you sign up for a Google Cloud Skills Boost account for both your work domain and personal email address. You will need to authenticate your account as well (be sure to check your spam folder!). This will ensure you can arrive and access your labs quickly onsite. You can follow this link to sign up!

Tensor Processing Units (TPUs) are a hardware accelerator designed by Google specifically for large-scale AI/ML computations. Google's new Trillium TPUs are our most performant and energy-efficient TPUs to date, and offer unprecedented levels of scalability. Ray is a unified framework for orchestrating AI/ML workloads on large compute clusters, with Python-native APIs for training, inference, tuning, reinforcement learning, and more. In this lightning talk, we will demonstrate how you can use Ray to manage workloads on TPUs with an easy-to-use API. We will cover: 1) training your models with MaxText, 2) tuning models with Hugging Face, and 3) serving models with vLLM. Attendees will gain an understanding of how to build a complete, end-to-end AI/ML infrastructure with Ray and TPUs.
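A minimal sketch of the scheduling idea behind the talk: Ray tasks can request TPU capacity through resource annotations, and the cluster places them on TPU hosts. The "TPU" resource label follows Ray's TPU integration; cluster setup, MaxText, and vLLM wiring are omitted here and assumed configured, and run_inference is a hypothetical stand-in.

```python
import ray

# In practice you would connect to an existing Ray cluster whose TPU nodes
# advertise a "TPU" resource; locally this task would wait for resources.
ray.init()

@ray.remote(resources={"TPU": 4})  # request four TPU chips on one host
def run_inference(prompt: str) -> str:
    # A real deployment would call a TPU-compiled vLLM engine here.
    return f"echo: {prompt}"

print(ray.get(run_inference.remote("hello from a TPU worker")))
```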

 

Think Stats, 3rd Edition

If you know how to program, you have the skills to turn data into knowledge. This thoroughly revised edition presents statistical concepts computationally, rather than mathematically, using programs written in Python. Through practical examples and exercises based on real-world datasets, you'll learn the entire process of exploratory data analysis, from wrangling data and generating statistics to identifying patterns and testing hypotheses. Whether you're a data scientist, software engineer, or data enthusiast, you'll get up to speed on commonly used tools including NumPy, SciPy, and Pandas. You'll explore distributions, relationships between variables, visualization, and many other concepts. And all chapters are available as Jupyter notebooks, so you can read the text, run the code, and work on exercises all in one place. This book shows you how to:
- Analyze data distributions and visualize patterns using Python libraries
- Improve predictions and insights with regression models
- Dive into specialized topics like time series analysis and survival analysis
- Integrate statistical techniques and tools for validation, inference, and more
- Communicate findings with effective data visualization
- Troubleshoot common data analysis challenges
- Boost reproducibility and collaboration in data analysis projects with interactive notebooks
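A small taste of the book's computational approach: summarize a distribution per group, then test a hypothesis with NumPy, SciPy, and pandas. The dataset below is synthetic for self-containment; the book itself works from real-world datasets in Jupyter notebooks.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Two synthetic groups whose means differ slightly.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "group": np.repeat(["a", "b"], 500),
    "value": np.concatenate([rng.normal(0.0, 1, 500), rng.normal(0.3, 1, 500)]),
})

# Explore: summary statistics per group.
print(df.groupby("group")["value"].describe())

# Test: is the difference in means statistically significant?
t, p = stats.ttest_ind(df.loc[df.group == "a", "value"],
                       df.loc[df.group == "b", "value"])
print(f"t = {t:.2f}, p = {p:.4f}")
```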


This session demonstrates how BigQuery ML connects all your data to cutting-edge AI using familiar SQL. Learn practical steps to build, train, and deploy machine learning (ML) models for predictive analytics directly in BigQuery while minimizing complexity and data movement. Discover ways to perform tasks such as sentiment analysis, audio transcription, and document classification with the latest models from Gemini, Claude, Llama, and others directly in BigQuery without the need for advanced Python or specialized ML skills.
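To illustrate the session's theme, here is a hedged sketch of creating and using a BigQuery ML model with plain SQL, submitted through the google-cloud-bigquery Python client. The dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses your default GCP project and credentials

# Train a logistic regression model directly where the data lives: no data movement.
client.query("""
    CREATE OR REPLACE MODEL `my_dataset.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, churned
    FROM `my_dataset.customers`
""").result()

# Score new rows in SQL with ML.PREDICT.
rows = client.query("""
    SELECT * FROM ML.PREDICT(
        MODEL `my_dataset.churn_model`,
        (SELECT tenure_months, monthly_spend FROM `my_dataset.new_customers`))
""").result()
for row in rows:
    print(dict(row))
```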

Are you a data scientist or developer using Python to build AI models and generative AI applications? Learn how BigQuery can supercharge Python data science workflows with capabilities that give you the productivity of Python and allow BigQuery to handle core processing. Offloading Python processing enables large-scale data analysis and seamless production deployments along the data-to-AI journey. Find out how Deutsche Telekom modernized their machine learning platform with a radically simplified infrastructure and increased developer productivity.
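A minimal sketch of the pandas-style workflow the session describes, using the BigQuery DataFrames library (bigframes). The project and table names are hypothetical; the point is that the frame is lazy and the heavy computation is pushed down to BigQuery.

```python
import bigframes.pandas as bpd

bpd.options.bigquery.project = "my-gcp-project"  # hypothetical project id

df = bpd.read_gbq("my_dataset.events")  # lazy, BigQuery-backed DataFrame
daily = (
    df[df["event_type"] == "purchase"]
    .groupby("event_date")["revenue"]
    .sum()
)
# Only now does BigQuery execute the filter and aggregation at scale.
print(daily.head())
```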

Transform your Google Workspace experience with the power of Gemini. This fast-paced session dives into practical integrations using Apps Script, Vertex AI, Python, and Node.js to automate workflows and unlock new levels of efficiency. Discover how to leverage Gemini for intelligent task management, data-driven insights, and building custom AI solutions. Leave with actionable strategies and code snippets to immediately boost your productivity.

3D Data Science with Python

Our physical world is grounded in three dimensions. To create technology that can reason about and interact with it, our data must be 3D too. This practical guide offers data scientists, engineers, and researchers a hands-on approach to working with 3D data using Python. From 3D reconstruction to 3D deep learning techniques, you'll learn how to extract valuable insights from massive datasets, including point clouds, voxels, 3D CAD models, meshes, images, and more. Dr. Florent Poux helps you leverage the potential of cutting-edge algorithms and spatial AI models to develop production-ready systems with a focus on automation. You'll get the 3D data science knowledge and code to:
- Understand core concepts and representations of 3D data
- Load, manipulate, analyze, and visualize 3D data using powerful Python libraries
- Apply advanced AI algorithms for 3D pattern recognition (supervised and unsupervised)
- Use 3D reconstruction techniques to generate 3D datasets
- Implement automated 3D modeling and generative AI workflows
- Explore practical applications in areas like computer vision/graphics, geospatial intelligence, scientific computing, robotics, and autonomous driving
- Build accurate digital environments that spatial AI solutions can leverage

Florent Poux is an esteemed authority in the field of 3D data science who teaches and conducts research for top European universities. He's also head professor at the 3D Geodata Academy and innovation director for French Tech 120 companies.
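As a first step in the spirit of the book, here is a minimal sketch that loads a point cloud, inspects it as a NumPy array, and visualizes a downsampled version. Open3D is one common library for this kind of work (the book surveys several), and the file path is a hypothetical placeholder.

```python
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")  # hypothetical point cloud file
points = np.asarray(pcd.points)            # (N, 3) array of XYZ coordinates
print(f"{len(points)} points, bounding box: "
      f"{points.min(axis=0)} to {points.max(axis=0)}")

# Downsample to a voxel grid before heavier processing or visualization.
down = pcd.voxel_down_sample(voxel_size=0.05)
o3d.visualization.draw_geometries([down])
```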

Build robust ETL pipelines on Google Cloud! This hands-on lab teaches you to use Dataflow (Python) and BigQuery to ingest and transform public datasets. Learn design considerations and implementation details to create effective data pipelines for your needs.
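A hedged sketch of the pipeline shape this lab builds: read a CSV from Cloud Storage, transform rows, and load them into BigQuery with the Apache Beam Python SDK. Bucket, table, and schema names are hypothetical; on Dataflow you would also pass --runner=DataflowRunner plus project and region options.

```python
import apache_beam as beam

def parse_row(line: str) -> dict:
    """Turn one CSV line into a BigQuery-ready row dict."""
    name, population = line.split(",")
    return {"name": name, "population": int(population)}

with beam.Pipeline() as p:  # DirectRunner by default; DataflowRunner in the lab
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/cities.csv")
        | "Parse" >> beam.Map(parse_row)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.cities",
            schema="name:STRING,population:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
        )
    )
```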


Calling all Python developers using Google Cloud! Share your experiences, projects, and insights leveraging Python on Google Cloud. Whether you're exploring official libraries, automating tasks, or building innovative solutions, this meetup is for you. Let's discuss what's working well, challenges you've faced, and how Python + Google Cloud could be even better. Connect with fellow developers, get inspired, and help shape the future of Python on Google Cloud!

Learn how to speed up popular data science libraries such as pandas and scikit-learn by up to 50x in Google Colab using pre-installed NVIDIA RAPIDS Python libraries. Boost both speed and scale for your workflows by simply selecting a GPU runtime in Colab – no code changes required. In addition, Gemini helps Colab users incorporate GPUs and generate pandas code from simple natural language prompts.
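A minimal sketch of the zero-code-change speedup described above, as it would appear in a Colab notebook cell with a GPU runtime: load the cudf.pandas extension, then use pandas as usual while RAPIDS runs supported operations on the GPU. The parquet file name is a hypothetical placeholder.

```python
# Notebook cell (Colab GPU runtime): enable GPU-backed pandas via RAPIDS.
%load_ext cudf.pandas

import pandas as pd  # now transparently accelerated where cuDF supports it

df = pd.read_parquet("transactions.parquet")
top = df.groupby("merchant")["amount"].sum().nlargest(10)
print(top)
```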


Discover how machine learning (ML) engineers and app developers collaborate to build a viral AI app. This session takes you on a journey from a screenshot-to-calendar Python app running on a laptop to a globally scaled app. You’ll learn best practices for collaboration between ML engineers and app developers; how to build, refactor, and scale an AI app to millions of users with a modern architecture; and how to leverage the power of Gemini, Vertex AI, app hosting, and full-stack development. Find out how to build something greater than the sum of its parts!

Level up your Java apps with the power of Gemini. This session explores how Java developers can easily integrate generative AI features into their applications using LangChain4j. We’ll cover everything from building chatbots and implementing in-context learning to creating agentic workflows and using advanced techniques like unstructured data extraction. Learn best practices and build smart, AI-powered Java applications with ease. No Python experience required.