talk-data.com

Topic: BigQuery (Google BigQuery)

Tags: data_warehouse analytics google_cloud olap

315 activities tagged

Activity Trend: peak of 17 activities/quarter, 2020-Q1 to 2026-Q1

Activities

315 activities · Newest first

Try Keboola 👉 https://www.keboola.com/mcp?utm_campaign=FY25_Q2_RoW_Marketing_Events_Webinar_Keboola_MCP_Server_Launch_June&utm_source=Youtube&utm_medium=Avery

Today, we'll create an entire data pipeline from scratch without writing a single line of code! Using the Keboola MCP server and Claude AI, we'll extract data from my FindADataJob.com RSS feed, transform it, load it into Google BigQuery, and visualize it with Streamlit. This is the future of data engineering!

Keboola MCP Integration: https://mcp.connection.us-east4.gcp.keboola.com/sse

I Analyzed Data Analyst Jobs to Find Out What Skills You ACTUALLY Need: https://www.youtube.com/watch?v=lo3VU1srV1E&t=212s

💌 Join 10k+ aspiring data analysts & get my tips in your inbox weekly 👉 https://www.datacareerjumpstart.com/newsletter
🆘 Feeling stuck in your data journey? Come to my next free "How to Land Your First Data Job" training 👉 https://www.datacareerjumpstart.com/training
👩‍💻 Want to land a data job in less than 90 days? 👉 https://www.datacareerjumpstart.com/daa
👔 Ace The Interview with Confidence 👉 https://www.datacareerjumpstart.com/interviewsimulator

⌚ TIMESTAMPS
00:00 - Introduction
00:54 - Definition of Basic Data Engineering Terms
02:26 - Keboola MCP and Its Capabilities
07:48 - Extracting Data from RSS Feed
12:43 - Transforming and Cleaning the Data
19:19 - Aggregating and Analyzing Data
23:19 - Scheduling and Automating the Pipeline
25:04 - Visualizing Data with Streamlit
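The video leans on Keboola's MCP server precisely so that no code is needed; for readers curious what the extract-and-load step amounts to underneath, here is a minimal hand-rolled sketch in Python. The feed URL, project, dataset, and table names are illustrative guesses, and it assumes the `feedparser` and `google-cloud-bigquery` packages plus a pre-created destination table.

```python
# Hypothetical stand-in for the pipeline's extract-and-load step:
# pull items from an RSS feed and stream them into a BigQuery table.
import feedparser
from google.cloud import bigquery

# The exact feed URL isn't given in the description; this is a guess.
feed = feedparser.parse("https://findadatajob.com/feed")
rows = [
    {"title": e.get("title"), "link": e.get("link"), "published": e.get("published")}
    for e in feed.entries
]

client = bigquery.Client()
# Assumes a table `jobs.rss_items` with title/link/published STRING columns.
errors = client.insert_rows_json("my_project.jobs.rss_items", rows)
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
print(f"Loaded {len(rows)} feed items")
```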

🔗 CONNECT WITH AVERY
🎥 YouTube Channel: https://www.youtube.com/@averysmith
🤝 LinkedIn: https://www.linkedin.com/in/averyjsmith/
📸 Instagram: https://instagram.com/datacareerjumpstart
🎵 TikTok: https://www.tiktok.com/@verydata
💻 Website: https://www.datacareerjumpstart.com/

Mentioned in this episode: Join the last cohort of 2025! The LAST cohort of The Data Analytics Accelerator for 2025 kicks off on Monday, December 8th, and enrollment is officially open!

To celebrate the end of the year, we’re running a special End-of-Year Sale, where you’ll get: ✅ A discount on your enrollment 🎁 6 bonus gifts, including job listings, interview prep, AI tools + more

If your goal is to land a data job in 2026, this is your chance to get ahead of the competition and start strong.

👉 Join the December Cohort & Claim Your Bonuses: https://www.datacareerjumpstart.com/daa

How to Build an Open Lakehouse: Best Practices for Interoperability

Building an open data lakehouse? Start with the right blueprint. This session walks through common reference architectures for interoperable lakehouse deployments across AWS, Google Cloud, Azure and tools like Snowflake, BigQuery and Microsoft Fabric. Learn how to design for cross-platform data access, unify governance with Unity Catalog and ensure your stack is future-ready — no matter where your data lives.

Sponsored by: Onehouse | Open By Default, Fast By Design: One Lakehouse That Scales From BI to AI

You already see the value of the lakehouse. But are you truly maximizing its potential across all workloads, from BI to AI? In this session, Onehouse unveils how our open lakehouse architecture unifies your entire stack, enabling true interoperability across formats, catalogs, and engines. From lightning-fast ingestion at scale to cost-efficient processing and multi-catalog sync, Onehouse helps you go beyond trade-offs. Discover how Apache XTable (Incubating) enables cross-table-format compatibility, how OpenEngines puts your data in front of the best engine for the job, and how OneSync keeps data consistent across Snowflake, Athena, Redshift, BigQuery, and more. Meanwhile, our purpose-built lakehouse runtime slashes ingest and ETL costs. Whether you’re delivering BI, scaling AI, or building the next big thing, you need a lakehouse that’s open and powerful. Onehouse opens everything—so your data can power anything.

Sigma Data Apps Product Releases & Roadmap | The Data Apps Conference

Organizations today require more than dashboards—they need applications that combine insights with data collection and action capabilities to drive meaningful change. In this session, Stipo Josipovic (Director of Product) will showcase the key innovations enabling this shift, from expanded write-back capabilities to workflow automation features.

You'll learn about Sigma's growing data app capabilities, including:

Enhanced write-back features: Redshift and upcoming BigQuery support, bulk data entry, and form-based collection for structured workflows

Advanced security controls: Conditional editing and row-level security for precise data governance

Intuitive interface components: Containers, modals, and tabbed navigation for app-like experiences

Powerful Actions framework: API integrations, notifications, and automated triggers to drive business processes

This session covers both recently released features and Sigma's upcoming roadmap, including detail views, simplified form-building, and new API actions to integrate with your tech stack. Discover how Sigma helps organizations move beyond analysis to meaningful action.

➡️ Learn more about Data Apps: https://www.sigmacomputing.com/product/data-applications?utm_source=youtube&utm_medium=organic&utm_campaign=data_apps_conference&utm_content=pp_data_apps


➡️ Sign up for your free trial: https://www.sigmacomputing.com/go/free-trial?utm_source=youtube&utm_medium=video&utm_campaign=free_trial&utm_content=free_trial

#sigma #sigmacomputing #dataanalytics #dataanalysis #businessintelligence #cloudcomputing #clouddata #datacloud #datastructures #datadriven #datadrivendecisionmaking #datadriveninsights #businessdecisions #datadrivendecisions #embeddedanalytics #SigmaAI #AI #AIdataanalytics #AIdataanalysis #GPT #dataprivacy #python #dataintelligence #moderndataarchitecture

Session by Murat Özcan (Trendyol), Ozgur Uyar (Just Eat Takeaway), Vinay Yerramilli (Google Cloud), and Ahmed Ayad (Google Cloud)

Join this session to learn best practices for optimizing your data and analytics costs. Discover new BigQuery capabilities that simplify workload management, provide greater cost control, and encourage adherence to best practices. BigQuery customers Just Eat Takeaway and Trendyol will share their BigQuery migration journeys, scaling strategies, and how they used workload management and optimization tools to improve their return on investment (ROI).

Simplify real-time data analytics and build event-driven, AI-powered applications using BigQuery and Pub/Sub. Learn to ingest and process massive streaming data from users, devices, and microservices for immediate insights and rapid action. Explore BigQuery's continuous queries for real-time analytics and ML model training. Discover how Flipkart, India's leading e-commerce platform, leverages Google Cloud to build scalable, efficient real-time data pipelines and AI/ML solutions, and gain insights on driving business value through real-time data.
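As a rough illustration of the ingest path this session describes (not Flipkart's actual pipeline), the sketch below consumes JSON events from a Pub/Sub subscription and streams them into BigQuery for immediate querying. Project, subscription, and table names are placeholders, and a production setup would more likely use a BigQuery subscription or Dataflow.

```python
# Hypothetical Pub/Sub -> BigQuery streaming sketch; all resource names are
# placeholders. Requires google-cloud-pubsub and google-cloud-bigquery.
import json
from concurrent.futures import TimeoutError
from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client()
subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "clickstream-sub")

def callback(message):
    row = json.loads(message.data)  # one JSON event per Pub/Sub message
    errors = bq.insert_rows_json("my-project.events.clickstream", [row])
    if not errors:
        message.ack()  # only ack events that actually landed in BigQuery

future = subscriber.subscribe(subscription, callback=callback)
try:
    future.result(timeout=60)  # stream for a minute in this sketch
except TimeoutError:
    future.cancel()
```

Once rows land, a continuous query over the table (the BigQuery feature highlighted in the session) can keep downstream results fresh without re-running batch jobs.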

Personalized predictions can be created by analyzing user clickstream data and using vector embeddings to capture the essence of an entity across multiple dimensions. This establishes relationships between users and items, revealing preferences and interests. BigQuery facilitates batch processing of vector embeddings, which are then fed into Spanner for efficient retrieval of these relationships via vector search. This enables real-time personalized recommendations with sub-ms response times. This solution offers accuracy, scalability, and real-time responsiveness.
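A minimal sketch of the hand-off this paragraph describes, assuming the embeddings were already batch-generated in BigQuery (for example with ML.GENERATE_EMBEDDING) and a Spanner table `ItemEmbeddings` exists; every resource name here is hypothetical.

```python
# Copy batch-computed embeddings from BigQuery into Spanner, then run a
# nearest-neighbor lookup with Spanner's COSINE_DISTANCE function.
from google.cloud import bigquery, spanner

bq = bigquery.Client()
rows = list(bq.query(
    "SELECT item_id, embedding FROM `my_project.recs.item_embeddings`"
).result())

db = spanner.Client().instance("recs-instance").database("recs-db")
with db.batch() as batch:
    batch.insert_or_update(
        table="ItemEmbeddings",
        columns=("ItemId", "Embedding"),
        values=[(r.item_id, list(r.embedding)) for r in rows],
    )

query_vec = list(rows[0].embedding)  # stand-in for a live user's vector
with db.snapshot() as snap:
    nearest = snap.execute_sql(
        "SELECT ItemId FROM ItemEmbeddings "
        "ORDER BY COSINE_DISTANCE(Embedding, @v) LIMIT 10",
        params={"v": query_vec},
        param_types={"v": spanner.param_types.Array(spanner.param_types.FLOAT64)},
    )
    print([row[0] for row in nearest])
```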

In this hands-on lab, you'll explore data with BigQuery's intuitive table explorer and data insight features, enabling you to gain valuable insights without writing SQL queries from scratch. Learn how to generate key insights from order item data, query location tables, and interact with your data seamlessly. By the end, you’ll be equipped to navigate complex datasets and uncover actionable insights quickly and efficiently.

If you register for a Learning Center lab, please ensure that you sign up for a Google Cloud Skills Boost account with both your work-domain and personal email addresses. You will also need to authenticate your account (be sure to check your spam folder!). This ensures you can arrive and access your labs quickly onsite. You can follow this link to sign up!

Dive deep into how governance, security, and sharing are natively integrated in BigQuery to power data and AI use cases across your organization. Learn about new innovations that further enhance data governance, security, and collaboration across data and AI assets, without you having to leave BigQuery. Find out how data governance leaders at Walmart and Box are using BigQuery to securely scale data and AI across their organizations.
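One concrete governance primitive in this space is row-level security, which BigQuery exposes as SQL DDL; the table, group, and filter below are hypothetical.

```python
# Hypothetical example: restrict a sales table so EMEA analysts only see
# EMEA rows. CREATE ROW ACCESS POLICY is standard BigQuery DDL.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
CREATE ROW ACCESS POLICY IF NOT EXISTS emea_only
ON `my_project.sales.orders`
GRANT TO ('group:emea-analysts@example.com')
FILTER USING (region = 'EMEA')
""").result()
```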

Join Aritzia, Anomalo, and Google Cloud to learn how Aritzia automates data quality across 500+ sources in BigQuery. Discover how integrating Anomalo with Google Cloud helps proactively detect anomalies, maintain data integrity, and build trust in analytics. Explore how automation reduces time spent troubleshooting and increases time spent creating business value through reliable, AI-enhanced analytics.

Analyze BigQuery logs with SQL using Log Analytics. This hands-on lab covers enabling Log Analytics, querying BigQuery logs within Cloud Logging, and visualizing results for in-depth usage analysis and troubleshooting.
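As a taste of what the lab covers, a Log Analytics query over BigQuery audit logs might look like the sketch below. It assumes the `_Default` log bucket has been upgraded to Log Analytics; the exact view path varies by setup, and queries can also be run directly in the Log Analytics page rather than through a linked dataset.

```python
# Hypothetical Log Analytics query: who ran which BigQuery methods in the
# last day, per the Cloud Audit Logs data_access log.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email AS principal,
  proto_payload.audit_log.method_name AS method
FROM `my-project.global._Default._AllLogs`  -- view path varies by setup
WHERE log_id = 'cloudaudit.googleapis.com/data_access'
  AND timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY timestamp DESC
LIMIT 100
"""
for row in client.query(sql).result():
    print(row.timestamp, row.principal, row.method)
```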

Routine tasks such as data wrangling and pipeline maintenance often keep data teams from higher-value analysis and insights-led decision-making. This session showcases how intelligent data agents in BigQuery can help automate complex data engineering tasks. You'll learn how to use natural language prompts to streamline work across ingestion and transformation, such as data cleaning, formatting, and loading results into BigQuery tables, accelerating the time to build and validate data pipelines.
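The agents themselves are driven from natural-language prompts in the BigQuery UI, so there is no public code API to reproduce here. As a stand-in, this is the kind of cleaning-and-loading SQL a prompt like "clean the raw orders table and load it to staging" might produce; all table and column names are hypothetical.

```python
# Illustrative output of a data-preparation prompt, not an agent API call.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
CREATE OR REPLACE TABLE `my_project.staging.orders_clean` AS
SELECT
  CAST(order_id AS INT64) AS order_id,
  LOWER(TRIM(email)) AS email,
  SAFE_CAST(order_ts AS TIMESTAMP) AS order_ts,
  ROUND(amount, 2) AS amount
FROM `my_project.raw.orders`
WHERE order_id IS NOT NULL
""").result()
```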

Dive deep into the world of multimodal analytics with BigQuery. This session explores how to unlock insights from all data types in BigQuery using embeddings generation and vector search. We'll demonstrate how BigQuery object tables combine text, documents, and images to unlock popular use cases like recommendation engines and retrieval-augmented generation (RAG). Learn how to leverage BigQuery as a knowledge base to ground your cutting-edge AI application with your own enterprise data.
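A sketch of the multimodal flow the session describes, assuming a Cloud resource connection `us.my_conn`, a Cloud Storage bucket of product images, and a remote multimodal embedding model `media.embedding_model` already exist (all hypothetical names): an object table over the images, batch embeddings, then a vector search against a text query's embedding.

```python
# Hypothetical multimodal flow: object table -> batch embeddings -> search.
from google.cloud import bigquery

client = bigquery.Client()

client.query("""
CREATE OR REPLACE EXTERNAL TABLE `my_project.media.product_images`
WITH CONNECTION `us.my_conn`
OPTIONS (object_metadata = 'SIMPLE', uris = ['gs://my-bucket/images/*'])
""").result()

client.query("""
CREATE OR REPLACE TABLE `my_project.media.image_embeddings` AS
SELECT * FROM ML.GENERATE_EMBEDDING(
  MODEL `my_project.media.embedding_model`,
  TABLE `my_project.media.product_images`)
""").result()

results = client.query("""
SELECT base.uri, distance
FROM VECTOR_SEARCH(
  TABLE `my_project.media.image_embeddings`, 'ml_generate_embedding_result',
  (SELECT ml_generate_embedding_result
   FROM ML.GENERATE_EMBEDDING(MODEL `my_project.media.embedding_model`,
        (SELECT 'red running shoes' AS content))),
  top_k => 5)
""").result()
for row in results:
    print(row.uri, row.distance)
```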

Simplify blockchain development with generative AI on Google Cloud. In this interactive session, you’ll learn how Gemini AI helps generate queries for BigQuery blockchain datasets and analyzes real-time blockchain data. See how Blockscope is using Gemini to conduct forensic analysis of blockchain data. Live demos will show you how to supercharge your Web3 projects, whether you're a blockchain veteran or just starting out.
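The blockchain datasets referenced here are ordinary public BigQuery datasets, so the queries Gemini generates are plain SQL. For instance, a prompt like "daily Ethereum transaction counts this week" might yield something like this sketch; the public dataset name is real, the rest is illustrative.

```python
# Query the public Ethereum dataset for daily transaction counts.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT DATE(block_timestamp) AS day, COUNT(*) AS tx_count
FROM `bigquery-public-data.crypto_ethereum.transactions`
WHERE block_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY day
ORDER BY day
"""
for row in client.query(sql).result():
    print(row.day, row.tx_count)
```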

In this hands-on lab, you'll explore the power of BigQuery Machine Learning with remote models like Gemini Pro to analyze customer reviews. Learn to extract keywords, assess sentiment, and generate insightful reports using SQL queries. Discover how to integrate Gemini Pro Vision to summarize and extract keywords from review images. By the end, you’ll gain skills in setting up Cloud resources, creating datasets, and prompting Gemini models to drive actionable insights and automated responses to customer feedback.
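Roughly, the lab's core step is to wrap a Gemini endpoint as a BigQuery ML remote model and then call ML.GENERATE_TEXT over the review text. The connection, dataset, and endpoint names below are assumptions; supported endpoint names change over time, so check the current BigQuery ML docs.

```python
# Hypothetical remote-model setup and sentiment prompt over reviews.
from google.cloud import bigquery

client = bigquery.Client()

client.query("""
CREATE OR REPLACE MODEL `my_project.reviews.gemini`
REMOTE WITH CONNECTION `us.my_conn`
OPTIONS (endpoint = 'gemini-pro')  -- endpoint name is an assumption
""").result()

rows = client.query("""
SELECT ml_generate_text_llm_result AS sentiment
FROM ML.GENERATE_TEXT(
  MODEL `my_project.reviews.gemini`,
  (SELECT CONCAT('Classify the sentiment of this review as positive, ',
                 'negative, or neutral: ', review_text) AS prompt
   FROM `my_project.reviews.customer_reviews`
   LIMIT 10),
  STRUCT(TRUE AS flatten_json_output))
""").result()
for r in rows:
    print(r.sentiment)
```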

Explore the future of data management with BigQuery multimodal tables. Discover how to integrate structured and unstructured data (such as text, images, and video) into a single table with full data manipulation language (DML) support. This session demonstrates how unified tables unlock the potential of unstructured data through easy extraction and merging, simplify Vertex AI integration for downstream workflows, and enable unified data discovery with search across all data.

Modernize your Oracle workloads on Google Cloud. Experience seamless migration, robust infrastructure, and familiar tools for mission-critical workloads. Unlock your data's potential with BigQuery and Vertex AI, driving business differentiation and cost reduction. Learn how the Google Cloud & Oracle partnership, combined with your expertise, can accelerate digital transformation, reduce costs and grow your customers' potential. 

This talk offers a solution to accelerate healthcare innovation by streamlining the conversion and integration of various data formats (HL7 v2, CSV, RDBMS, etc.) into the FHIR standard.

This solution reduces the need for manual mapping, enabling quick conversion of various healthcare data formats into FHIR and significantly reducing the workload of healthcare IT teams. FHIR data is then loaded into Google BigQuery, providing a scalable and secure platform for data storage and analysis.
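The talk doesn't publish its implementation; as a generic sketch of the final load step, converted FHIR resources exported as newline-delimited JSON can be bulk-loaded into BigQuery like this (bucket, dataset, and the autodetected schema are illustrative assumptions).

```python
# Load FHIR Patient resources (NDJSON in Cloud Storage) into BigQuery.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # real pipelines would pin an explicit FHIR schema
)
load_job = client.load_table_from_uri(
    "gs://my-fhir-bucket/Patient/*.ndjson",
    "my_project.fhir.patient",
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
print(client.get_table("my_project.fhir.patient").num_rows, "rows loaded")
```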

Concerned about AI hallucinations? While AI can be a valuable resource, it sometimes generates inaccurate, outdated, or overly general responses, a phenomenon known as "hallucination." This hands-on lab teaches you how to implement a Retrieval Augmented Generation (RAG) pipeline to address this issue. RAG improves large language models (LLMs) like Gemini by grounding their output in contextually relevant information from a specific dataset. Learn to generate embeddings, search vector space, and augment answers for more reliable results.
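A compressed sketch of the lab's retrieval step, assuming a knowledge-base table with precomputed embeddings and a remote embedding model (all names hypothetical): embed the question, retrieve the nearest chunks with VECTOR_SEARCH, and assemble a grounded prompt.

```python
# Hypothetical RAG retrieval in BigQuery: embed the question, find the
# top-3 matching chunks, and build a grounded prompt for the LLM.
from google.cloud import bigquery

client = bigquery.Client()
question = "What is our refund policy?"

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("q", "STRING", question)]
)
context_rows = client.query("""
SELECT base.chunk_text
FROM VECTOR_SEARCH(
  TABLE `my_project.kb.doc_embeddings`, 'embedding',
  (SELECT ml_generate_embedding_result AS embedding
   FROM ML.GENERATE_EMBEDDING(MODEL `my_project.kb.embedding_model`,
        (SELECT @q AS content))),
  top_k => 3)
""", job_config=job_config).result()

context = "\n".join(r.chunk_text for r in context_rows)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# The grounded prompt is then passed to a Gemini model (for example via
# ML.GENERATE_TEXT) so its answer is anchored in the retrieved text.
```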
