This talk dives deep into the top AI skills that are shaping the future of the field, drawing insights from cutting-edge research, open-source contributions, industry trends, and job market analysis. From foundational proficiencies in programming, machine learning, and data wrangling to emerging specialties like AI Agents, prompt engineering, and generative AI expertise, we’ll explore what it takes to excel in 2025. Designed for AI practitioners, developers, tech leaders, students, and business professionals, this session offers actionable insights tailored to every audience. This session is your roadmap to the skills and trends that will define the next wave of AI innovation, helping you stay ahead in an ever-evolving landscape.
talk-data.com
Topic
prompt engineering
42
tagged
Activity Trend
Top Events
Day 6 self-paced session on Prompt Engineering and practical applications.
Techniques for crafting effective prompts and applying AI.
During the session, we’ll discuss the challenges that prompt engineering has presented, both when it first gained popularity and as it continued to evolve. We’ll share how these challenges informed the development of our prompt engineering tooling and workflows. We’ll cover: Standardizing communication with LLMs; Using templating to customize prompts; Building prompt-centric production workflows; Working with structured LLM output; Ensuring the quality of LLM output; Creating tooling that supports our prompt engineering workflows.
Step into the world of promptiverse guardianship as we explore strategies to ensure the safety and security of your Large Language Model solutions. This talk is a fun and informative journey into defending against prompt hacking, covering techniques like prompt engineering and vigilant monitoring. Join us as we empower you to become a guardian of your LLM, safeguarding it against potential threats and ensuring its integrity in the ever-evolving digital landscape.
In this session, we will focus on fine-tuning, continuous pretraining, and retrieval-augmented generation (RAG) to customize foundation models using Amazon Bedrock. Attendees will explore and compare strategies such as prompt engineering, which reformulates tasks into natural language prompts, and fine-tuning, which involves updating the model's parameters based on new tasks and use cases. The session will also highlight the trade-offs between usability and resource requirements for each approach. Participants will gain insights into leveraging the full potential of large models and learn about future advancements aimed at enhancing their adaptability.
Explore how AI is revolutionizing marketing and creative professions. Learn essential skills, innovative practices, and the latest AI-driven tools to enhance creativity and productivity. Topics include: influence of AI on marketing strategies, prompting AI tools for desired results, implementing AI in marketing campaigns, content creation with AI, incorporating AI into project management, utilizing AI to innovate product development, and crafting user-centric designs with AI.
Date: 2024-08-22; Time: 10:00-12:00 EST. AWS Discovery Day - Introduction to Prompt Engineering.
Date: 2024-08-22; Time: 10:00-12:00 EST. AWS Discovery Day - Introduction to Prompt Engineering.
Date: 2024-08-22; Time: 10:00-12:00 EST. AWS Discovery Day - Introduction to Prompt Engineering.
Date: 2024-08-22; Time: 10:00-12:00 EST. AWS Discovery Day - Introduction to Prompt Engineering.
Date: 2024-08-22; Time: 10:00-12:00 EST. AWS Discovery Day - Introduction to Prompt Engineering.
Date: Tue, Aug 20, 2024; Time: 10:00 AM–12:00 PM EST. Masterclass discussing how AI is transforming L&D, HR, and Legal roles, with emphasis on prompting AI tools and applying AI across training, talent management, and legal research. What you will learn: Introduction to AI in L&D, HR, and Legal Roles; How to prompt AI tools for desired results; Improved training and development with AI; Utilizing AI for effective talent management; AI for legal research and case management.
In this talk, I will explore prompt engineering, a key part of gen AI systems. We will discuss how to create effective prompts, how these methods have evolved, and what are the new finding from the recent research. The main focus will be on text-based prompting techniques such as Few-shot Prompting, Chain of Thought (CoT), and various reasoning and decomposition methods.
ChatGPT is awesome, but developing with its API comes at a cost. Fortunately, there are open-source alternatives like Google Gemini, Streamlit, and Python APIs that can fetch prompt results using an API key. In this presentation, I'll explore how to create a lightweight, self-service end-to-end LLMs application using prompt engineering and fine-tuning based on user requests. Additionally, I'll demonstrate how to build a food suggestion application based on ingredients or food names.
In this talk we’ll introduce the core concepts for building a “copilot” application on Azure AI from prompt engineering to LLM Ops – using the Contoso Chat application sample as a reference. And we’ll explore the Azure AI Studio (preview) platform from a code-first perspective to understand how you can streamline your development from model exploration to endpoint deployment, with a unified platform and workflow.
Abstract: In the rapidly evolving landscape of AI, Large Language Models (LLMs) have outgrown their initial niche of powering chat bots. Today, we can use generative AI not only in enriching human-machine interactions but also in integrating sophisticated capabilities into business applications. This presentation will explore the nuanced process of customising LLMs for enterprise use, highlighting the importance of prompt engineering, in-context learning (ICL), and equipping the models with a diverse toolkit to ensure their responses are tuned to the application needs and to enable building reliable and responsible generative AI applications.
A continuation session on 2023-10-25 with Giuseppe Scalamogna exploring a universal framework for prompt crafting in generative text models, building on his earlier talk 'Prompt Engineering Evolution: Defining the New Program Simulation Prompt Framework'. The talk covers integrating instruction-based and role-based prompting into a program-like sequence of tasks, and discusses techniques such as Chain-of-Thought (CoT), N-shot, Few-shot, and Flattery/Role Assignment to create a library of prompts and enable ChatGPT to function as a programmable system where outputs influence subsequent steps.
Dive into the basics of Prompt Engineering, exploring different techniques, with examples and incorporating safety and moderation features.
Dive into the basics of Prompt Engineering, exploring different techniques, with examples and incorporating safety and moderation features.