talk-data.com

Showing 19 results

Activities & events

Title & Speakers | Event

We are discussing the recently released Hands-On Large Language Models by Jay Alammar and Maarten Grootendorst. The book combines the essential theory of LLMs with a practical focus and is written by two highly regarded experts in the field.

Pages being discussed: Please see the latest message (also pinned) in the #current-reading channel in our Discord chat space to see which pages we'll be reviewing in this session.

Please note that the session is not recorded and participants are responsible for obtaining their own copy of the text.

> Instructions for joining Discord: https://bit.ly/llm-discord

> Buy book on Amazon

Book overview: Through its highly visual approach, the book gives readers the practical tools and concepts they need to start using LLMs today.

You'll understand how to use pretrained large language models for use cases like copywriting and summarization; create semantic search systems that go beyond keyword matching; and use existing libraries and pretrained models for text classification, search, and clustering.

The book aims to help you:

  • Understand the architecture of Transformer language models that excel at text generation and representation
  • Build advanced LLM pipelines to cluster text documents and explore the topics they cover
  • Build semantic search engines that go beyond keyword search, using methods like dense retrieval and rerankers
  • Explore how generative models can be used, from prompt engineering all the way to retrieval-augmented generation
  • Gain a deeper understanding of how to train LLMs and optimize them for specific applications using generative model fine-tuning, contrastive fine-tuning, and in-context learning
Hands-On Large Language Models
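
For a quick feel for the "dense retrieval" approach mentioned in the listing above, here is a minimal sketch of semantic search over a few documents. It assumes the sentence-transformers library and the all-MiniLM-L6-v2 model purely for illustration; the book's own examples may use different models and tooling.

```python
# Minimal dense-retrieval sketch: embed documents and a query, then rank by cosine similarity.
# Assumes the sentence-transformers library and the "all-MiniLM-L6-v2" model (illustrative
# choices only; not necessarily what the book uses).
from sentence_transformers import SentenceTransformer, util

documents = [
    "Large language models can summarize long reports.",
    "Keyword search matches exact terms only.",
    "Dense retrieval compares meaning via embedding vectors.",
]
query = "search that understands meaning, not just keywords"

model = SentenceTransformer("all-MiniLM-L6-v2")
doc_embeddings = model.encode(documents, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity between the query and every document, highest score first.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
for score, doc in sorted(zip(scores.tolist(), documents), reverse=True):
    print(f"{score:.3f}  {doc}")
```

A reranker, also mentioned above, would typically rescore the top hits with a cross-encoder, but the core idea is the same: ranking by embedding similarity rather than keyword overlap.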

We are discussing the recently released Hands-On Large Language Models by Jay Alammar and Maarten Grootendorst. The book combines the essential theory of LLMs with a practical focus and is written by two highly regarded experts in the field.

This meetup will review and discuss the first two chapters:

  1. An Introduction to Large Language Models
  2. Tokens and Embeddings

Book overview: Through its highly visual approach, the book gives readers the practical tools and concepts they need to start using LLMs today.

You'll understand how to use pretrained large language models for use cases like copywriting and summarization; create semantic search systems that go beyond keyword matching; and use existing libraries and pretrained models for text classification, search, and clustering.

This book purports to help you:

  • Understand the architecture of Transformer language models that excel at text generation and representation
  • Build advanced LLM pipelines to cluster text documents and explore the topics they cover
  • Build semantic search engines that go beyond keyword search, using methods like dense retrieval and rerankers
  • Explore how generative models can be used, from prompt engineering all the way to retrieval-augmented generation
  • Gain a deeper understanding of how to train LLMs and optimize them for specific applications using generative model fine-tuning, contrastive fine-tuning, and in-context learning
Hands-On Large Language Models (Ch 1 & 2)
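
Ahead of the session, here is a minimal sketch of the chapter 2 topics: turning text into tokens and tokens into embedding vectors. It assumes the Hugging Face transformers library and the bert-base-uncased model purely for illustration; the book's own code may use different models and tooling.

```python
# Minimal sketch of tokens and embeddings (chapter 2 topics), assuming the Hugging Face
# transformers library and bert-base-uncased; these are illustrative choices, not the book's.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

text = "Hands-on large language models"

# Tokenization: the text is split into subword tokens, then mapped to integer IDs.
tokens = tokenizer.tokenize(text)
print(tokens)  # e.g. ['hands', '-', 'on', 'large', 'language', 'models']

# Embeddings: the model turns each token ID into a contextual vector.
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```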