An overview of the advantages of local deployment for data governance and security, with an approach centred on privacy, autonomy, and performance.
talk-data.com
Topic: local deployment (4 tagged)
A discussion of the benefits of local deployment for data governance and security.
A hands-on demonstration of how to install and configure a conversational model locally.
We've all got used to using LLMs in our developer workflow - from asking ChatGPT what tools and libraries to use, to getting GitHub Copilot to generate code for us. That's great when you are online, but not so useful when you are offline, like on the tube or on a plane with no Wi-Fi. But what if there was another way? In this session, Jim will introduce offline LLMs. We'll look at how you can run LLMs locally, such as Phi-2 from Microsoft, and add these to your developer workflow. We'll compare the performance of offline vs online, both in speed and quality, and also touch on privacy and other considerations. We'll also look at hardware requirements, as we don't all have the latest GPUs to hand.
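As a rough illustration of what "running an LLM locally" can look like, here is a minimal sketch using the Hugging Face transformers library to load Microsoft's Phi-2 and generate text on CPU. The toolchain is an assumption (the session may use a different runtime such as Ollama or llama.cpp); the checkpoint name microsoft/phi-2 is the public Hugging Face identifier.

# Minimal sketch: run Microsoft's Phi-2 locally with Hugging Face transformers.
# Assumes `pip install transformers torch` and a one-time download of the weights;
# after that first download, generation works fully offline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # float32 keeps it CPU-friendly; use float16 on a GPU
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the example deterministic and reasonably quick on modest hardware.
outputs = model.generate(**inputs, max_new_tokens=120, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The trade-offs hinted at in the abstract show up directly here: the weights take a few gigabytes on disk, CPU-only generation is noticeably slower than a hosted service, and a 2.7B-parameter model will generally answer less well than a large online one, in exchange for full privacy and offline availability.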