talk-data.com

Topic: interest-data-analytics (156 tagged)

Activity Trend: 121 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results

Filtering by: Google Cloud Next '25

Modern analytics and AI workloads demand a unified storage layer for structured and unstructured data. Learn how Cloud Storage simplifies building data lakes based on Apache Iceberg. We’ll discuss storage best practices and new capabilities that enable high performance and cost efficiency. We’ll also guide you through real-world examples, including Iceberg data lakes with BigQuery or third-party solutions, data preparation for AI pipelines with Dataproc and Apache Spark, and how customers have built unified analytics and AI solutions on Cloud Storage.
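The Iceberg data lake described above is typically defined with BigQuery DDL pointing at a Cloud Storage location. A minimal sketch of building that DDL, assuming BigQuery's Iceberg table options; the dataset, bucket, and connection names are hypothetical:

```python
# Sketch: build the DDL for a BigQuery-managed Apache Iceberg table on
# Cloud Storage. Syntax follows BigQuery's Iceberg table options; the
# dataset, table, bucket, and connection names are hypothetical.
def iceberg_table_ddl(dataset: str, table: str, bucket: str, connection: str) -> str:
    return (
        f"CREATE TABLE `{dataset}.{table}`\n"
        f"WITH CONNECTION `{connection}`\n"
        "OPTIONS (\n"
        "  file_format = 'PARQUET',\n"
        "  table_format = 'ICEBERG',\n"
        f"  storage_uri = 'gs://{bucket}/{table}'\n"
        ")"
    )

ddl = iceberg_table_ddl("lakehouse", "events", "my-iceberg-bucket", "us.my-connection")
print(ddl)
```

The resulting statement would be submitted through a BigQuery client; engines such as Spark on Dataproc can then read the same Iceberg metadata from Cloud Storage.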

A clean energy platform, which connects residents, businesses, and other consumers with clean energy facilities that lower their electricity costs, centralized data from its CRM and billing systems in BigQuery for churn analysis. Historically, account managers manually reviewed cancellation cases, parsing emails and call transcripts to categorize reasons using a complex 65-category system, later condensed into 16 actionable categories. By leveraging Google Gemini, we automated this process, training the LLM to analyze customer interactions and assign accurate categories, streamlining operations and enhancing retention strategies.
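The 65-to-16 condensation step described above can be sketched as a simple lookup once the fine-grained reason is known. The categories and mapping below are hypothetical illustrations; in the described system, Gemini assigns the fine-grained label from emails and call transcripts first:

```python
# Hypothetical condensation of fine-grained cancellation reasons
# (~65 in the session) into a smaller set of actionable categories (16).
CONDENSED = {
    "moved_out_of_service_area": "relocation",
    "moved_within_service_area": "relocation",
    "bill_higher_than_expected": "pricing",
    "found_cheaper_supplier": "pricing",
    "confusing_statement": "billing_experience",
}

def condense(reason: str) -> str:
    # Fall back to a catch-all bucket for unmapped reasons.
    return CONDENSED.get(reason, "other")

print(condense("found_cheaper_supplier"))  # pricing
```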

Unlock the power of AI-assisted coding in BigQuery with this hands-on lab. Learn how to generate SQL queries using natural language prompts, utilize BigQuery's code explanation and transformation features, and collaborate with Gemini to review, debug, and optimize your SQL code. Whether you're looking to streamline query development or troubleshoot issues, this session will enhance your ability to write and refine code efficiently using Gemini's intelligent capabilities in BigQuery.
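Under the hood, natural-language query generation pairs the user's question with schema context. A toy sketch of that prompt-assembly idea (not Gemini's actual prompting; table and column names are hypothetical):

```python
# Toy illustration of the natural-language-to-SQL workflow: an assistant
# needs the question plus the table schema as context to draft SQL.
def build_prompt(question: str, table: str, columns: list) -> str:
    return (
        f"Table `{table}` has columns: {', '.join(columns)}.\n"
        f"Write a BigQuery SQL query to answer: {question}"
    )

prompt = build_prompt(
    "total orders per day",
    "shop.orders",
    ["order_id", "created_at", "amount"],
)
print(prompt)
```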

If you register for a Learning Center lab, please ensure that you sign up for a Google Cloud Skills Boost account for both your work domain and personal email address. You will need to authenticate your account as well (be sure to check your spam folder!). This will ensure you can arrive and access your labs quickly onsite. You can follow this link to sign up!

Are you ready to get hands-on with Google Cloud’s AI tools? In this two-hour gHack, you’ll work in teams of four to build a Formula E Race Analysis System from scratch using a variety of our AI and data tools. Teams will collaborate, searching and learning together to find the answers they need. 3-2-1 lights out and away we go!

Learn how the legendary retail brand accelerated AI adoption by building a GCP Data Platform and conducting an enterprise-wide data transformation program. We’ll demonstrate how BigQuery and other GCP services liberated data from legacy environments and became the foundation for an AI Factory initiative. We will highlight the challenges and solutions for data quality control, enterprise-wide stakeholder alignment, and business user engagement on the road to data value realization.

This Session is hosted by a Google Cloud Next Sponsor.
Visit your registration profile at g.co/cloudnext to opt out of sharing your contact information with the sponsor hosting this session.

session
by Ron Bushar (Google), Reid Novotny (National Guard Bureau), Derek Law (United States Space Force (USSF)), Maria Lipana (California Cybersecurity Integration Center (Cal-CSIC))

Explore how infrastructure, data analytics, cybersecurity, and intelligence at the edge converge to support mission-critical operations on the frontlines. Hear from public sector agencies about how they’re applying technology to their mission to protect and defend our nation.

Routine tasks such as data wrangling and pipeline maintenance often keep data teams from higher-value analysis and insights-led decision-making. This session showcases how intelligent data agents in BigQuery can help automate complex data engineering tasks. You’ll learn how to use natural language prompts to streamline data engineering work across ingestion and transformation, such as cleaning, formatting, and loading results into BigQuery tables, accelerating the time to build and validate data pipelines.
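The cleaning and formatting work described above can be sketched as a small row-normalization step run before rows are loaded into a warehouse table. This is a minimal stdlib sketch; the field names and accepted date formats are hypothetical:

```python
# Minimal sketch of a cleaning/formatting step of the kind the session
# describes agents automating: normalize whitespace, standardize dates,
# and coerce numeric fields before loading. Field names are hypothetical.
from datetime import datetime

def _parse_date(raw: str) -> str:
    # Accept either ISO or US-style dates; emit ISO for loading.
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {raw!r}")

def clean_row(row: dict) -> dict:
    return {
        "customer": row["customer"].strip().title(),
        "signup_date": _parse_date(row["signup_date"]),
        "amount": round(float(row["amount"]), 2),
    }

cleaned = clean_row(
    {"customer": "  ada lovelace ", "signup_date": "03/15/2024", "amount": "19.999"}
)
print(cleaned)
```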

Learn from industry leaders how they migrated their platforms to BigQuery for business transformation. They’ll share how they simplified the migration process, unified their data for better insights from analytics and AI, and accelerated AI adoption with BigQuery and Vertex AI. Discover how they achieved real-world results, such as driving business growth, reducing costs, and improving customer experiences.

NVIDIA GPUs accelerate batch ETL workloads with significant cost savings and performance gains. In this session, we will delve into optimizing Apache Spark on GCP Dataproc using the G2 accelerator-optimized series with L4 GPUs via the RAPIDS Accelerator for Apache Spark, showcasing up to 14x speedups and 80% cost reductions for Spark applications. We will demonstrate this acceleration through a reference AI architecture for financial transaction fraud detection and walk through performance measurements.
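Enabling the RAPIDS Accelerator generally comes down to a handful of Spark properties on GPU-equipped workers. A hedged sketch of a typical starting configuration, expressed as a properties dict; exact values are deployment-specific:

```python
# Sketch of Spark properties that enable the RAPIDS Accelerator on
# GPU-equipped workers (e.g., Dataproc G2 nodes with L4 GPUs). Treat
# these as a starting point, not a tuned configuration.
rapids_spark_conf = {
    # Load the RAPIDS SQL plugin so eligible operators run on the GPU.
    "spark.plugins": "com.nvidia.spark.SQLPlugin",
    "spark.rapids.sql.enabled": "true",
    # One GPU per executor; per-task share controls concurrency.
    "spark.executor.resource.gpu.amount": "1",
    "spark.task.resource.gpu.amount": "1",
}

for key, value in sorted(rapids_spark_conf.items()):
    print(f"{key}={value}")
```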

Unstructured data makes up the majority of all new data, a trend that's been growing exponentially since 2018. At these volumes, vector embeddings require indexes to be trained so that nearest neighbors can be efficiently approximated, avoiding the need for exhaustive lookups. However, training these indexes puts intense demand on vector databases to maintain high ingest throughput. In this session, we will explain how the NVIDIA cuVS library is turbocharging vector database ingest with GPUs, providing speedups of 5-20x and improving data readiness.
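The "index training" mentioned above can be illustrated with an IVF-style toy: cluster the vectors first (the training step), then answer a query by scanning only the closest cluster instead of every vector. A pure-Python sketch of the idea; real systems such as cuVS run this on GPUs at far larger scale:

```python
# Toy IVF-style index: k-means-like training partitions the vectors,
# then a query probes only its nearest cluster (approximate but cheap).
import random

def dist(a, b):
    # Squared Euclidean distance.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def assign(vectors, centroids):
    buckets = [[] for _ in centroids]
    for v in vectors:
        i = min(range(len(centroids)), key=lambda j: dist(v, centroids[j]))
        buckets[i].append(v)
    return buckets

def train(vectors, k, iters=5):
    centroids = random.sample(vectors, k)
    for _ in range(iters):
        buckets = assign(vectors, centroids)
        centroids = [
            tuple(sum(dim) / len(b) for dim in zip(*b)) if b else centroids[i]
            for i, b in enumerate(buckets)
        ]
    return centroids, assign(vectors, centroids)

def search(q, centroids, buckets):
    i = min(range(len(centroids)), key=lambda j: dist(q, centroids[j]))
    if not buckets[i]:  # rare: fall back to an exhaustive scan
        return min((v for b in buckets for v in b), key=lambda v: dist(q, v))
    return min(buckets[i], key=lambda v: dist(q, v))

random.seed(0)
vecs = [(random.random(), random.random()) for _ in range(1000)]
centroids, buckets = train(vecs, k=8)
nearest = search((0.5, 0.5), centroids, buckets)
```

The training loop is exactly the work that competes with ingest throughput in a real vector database, which is why offloading it to GPUs matters.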


Discover how Google’s interconnected ecosystem of Google Cloud platform and specialty solutions can address the needs and challenges of resource-constrained IT teams. We’ll delve into practical use cases and demonstrate how Google Cloud’s specialized business intelligence platform (Looker) and security solutions (Google Security Operations, Mandiant) can help your business improve efficiency and reduce costs while improving your security posture.

Modernize business processes and ignite innovation. Google Cloud infrastructure uniquely offers unmatched performance, scalability, and reliability for your mission-critical SAP workloads, especially for RISE with SAP. Learn how enterprises are future-proofing their SAP landscapes with Google Cloud, unlocking data and accelerating business outcomes. Discover best practices to transform your SAP deployments into a foundation for innovation. Don’t miss these real-world customer success stories.

This presentation will illustrate how SAP users can leverage Looker and the Explore Assistant Chatbot to gain insights into their SAP ERP data residing in BigQuery on Google Cloud, using natural language prompts. We will address common challenges in accessing and analyzing SAP data, such as ETL processes and complex data models. Additionally, we will provide an introduction to Generative AI and Large Language Models (LLMs), as well as an overview of the capabilities of the Looker Explore Assistant and Chatbot.


In this hands-on lab, you'll learn how to build a powerful business intelligence (BI) dashboard using Looker Studio and BigQuery. Discover how to upload and query data, create reports and datasets, and run scheduled queries to uncover valuable insights from large service usage logs. With your dashboard, you'll gain the ability to identify trends, optimize operations, and make data-driven decisions to improve efficiency and service quality.
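The aggregation at the heart of the lab, rolling service-usage log rows up into per-service daily counts of the kind a scheduled query would materialize for the dashboard, can be sketched with the stdlib. The log fields here are hypothetical:

```python
# Back-of-the-envelope version of the lab's aggregation step: count
# service usage per (service, day), as a scheduled query might.
from collections import Counter

logs = [
    {"service": "compute", "date": "2025-04-01"},
    {"service": "compute", "date": "2025-04-01"},
    {"service": "storage", "date": "2025-04-01"},
]

daily_usage = Counter((row["service"], row["date"]) for row in logs)
print(daily_usage[("compute", "2025-04-01")])  # 2
```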


Join us to learn how you can build on Google’s intelligent, open, and unified Data Cloud to accelerate your AI transformation. This session covers deep integrations between BigQuery and Google’s operational databases, such as Spanner, AlloyDB, Bigtable, and Cloud SQL. Mercado Libre will share how Spanner and Bigtable Data Boost enable near-zero-impact analytics on their operational data. Plus, discover how Datastream and change streams simplify data movement to BigQuery, and how reverse ETL (extract, transform, and load) from BigQuery powers operational analytics.

Learn about the latest developments in our integrated geospatial analytics capabilities from Google Maps Platform, Google Earth Engine, and BigQuery. Discover how our rich geospatial data, powerful cloud computing, and built-in AI tools make it easier for any professional to unlock insights, leading to faster, better decisions that drive business growth and sustainability. Find out how our customers are currently using geospatial analytics and Google Cloud infrastructure to improve their business and sustainability outcomes.

In today’s data-driven world, enterprises are seeking scalable, cost-effective and high-performance cloud data warehouses to fuel their analytics and AI initiatives. BigQuery has emerged as a leading solution, attracting many organizations with its serverless architecture, powerful AI and machine learning (ML) capabilities, and a unified data and AI experience. Discover how Quest Diagnostics and Ford successfully migrated to BigQuery to improve efficiency and drive innovation, and learn about new tools to help streamline migrations to BigQuery.

Google Cloud’s Sensitive Data Protection service can discover and classify sensitive data in your environment, helping to prevent data leakage. It also offers features that help developers minimize the exposure of confidential customer information when handling large volumes of sensitive data. Using Sensitive Data Protection transformation techniques, you can de-identify sensitive information in a dataset through redaction, replacement, masking, tokenization, bucketing, date shifting, and time extraction. Developers retain the ability to test applications with functional data while still meeting the security requirements put in place to protect customer information. Because pseudonymization is reversible, it provides an easier path for troubleshooting and yields a more useful dataset for functional testing than data anonymization would. In this talk, you’ll learn how to use the Cloud Data Loss Prevention API (DLP API) of Sensitive Data Protection to inspect data for sensitive information and build an automated data transformation pipeline that creates de-identified copies of your dataset.
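The reversibility that distinguishes pseudonymization from masking can be sketched as a token vault: each sensitive value maps to a stable token, and the mapping is retained so values can be re-identified for troubleshooting. This is a minimal stdlib sketch of the concept, not the DLP API itself; a real deployment would keep the vault in secured storage:

```python
# Sketch of reversible pseudonymization: stable tokens with a retained
# mapping (a "token vault") so de-identification can be undone when
# troubleshooting requires it. Masking, by contrast, is one-way.
import secrets

class Pseudonymizer:
    def __init__(self):
        self._forward = {}  # value -> token
        self._reverse = {}  # token -> value

    def tokenize(self, value: str) -> str:
        if value not in self._forward:
            token = "TOK_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str) -> str:
        return self._reverse[token]

vault = Pseudonymizer()
token = vault.tokenize("jane.doe@example.com")
```

The same input always yields the same token, so joins and functional tests still work on the de-identified copy, while `detokenize` restores the original when authorized.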

Tired of outdated Firebase Crashlytics alert settings? So are we! We’re giving Crashlytics in-console alerts a complete overhaul, addressing what’s been your top feedback since 2012. But we’re not stopping there. To unlock even more powerful alerting capabilities, we’re integrating Crashlytics with Google Cloud. Harness the power of BigQuery and the Google Cloud Observability suite to create advanced, enterprise-grade alerts tailored to your specific needs. Join us to learn how this revamped alerting system can streamline your debugging workflow and improve your app’s stability.