Personal Bio

Holger Keibel is a senior product manager and consultant at Karakun AG in Basel, Switzerland, with more than 17 years of professional experience in the fields of Language Technology (LT) and Artificial Intelligence. He has managed LT projects in a range of industries and derives great satisfaction when LT software solutions generate real value and make life easier. Previously, Holger worked for nine years in a similar role at Canoo Engineering AG, and for approximately five years as a research associate at the Institute for the German Language (IDS) in Mannheim, Germany. Holger obtained both his PhD in Linguistics and his Master’s degree in Mathematics from the University of Freiburg, Germany. He completed much of his PhD project as a research fellow at UC San Diego, California.

Presentation title: 
Generative AI: What Comes After the Hype?
Presentation description: 
Over the past two years, generative AI chatbots such as ChatGPT have impressed the world with their powerful capabilities to understand and generate human language text. Additionally, their embedded world knowledge is extremely broad and often surprisingly deep, albeit far from error-free. Without a doubt, this innovation is a significant technological breakthrough that triggered a massive hype around these chatbots, their underlying AI models (enormous LLMs such as GPT, Bard/Gemini, or LLaMa) and the field of AI as a whole. But as this hype is still in its early stages, we do not yet fully understand how best to leverage LLMs productively, and the initial excitement is likely to produce unrealistic expectations. The way in which, and the extent to which, we will be using LLMs in five or ten years will probably look different from what we imagine today. In this talk, I will argue that, compared to these huge general-purpose models, a combination of smaller and more specialized AI models can execute many tasks more efficiently – with better results and at lower cost. For instance, someone working in the automotive industry benefits greatly from AI tools that excel at automotive-specific tasks – and much less from AI tools that know a bit about automotive, life sciences, and banking. Along these lines, certain trends and best practices are emerging. In a nutshell: when building an AI-driven application, do not focus on any given LLM, but instead on the tasks you want to accomplish and on the data that will help you accomplish them. I will outline what this means in practice.
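
To make the task-and-data-first principle concrete, here is a minimal sketch (not taken from the presentation itself) of how such an application might be structured in Python: the task definition and the domain data carry the value, while the LLM sits behind a small swappable interface. All names used here (TextGenerator, AutomotiveFaqTask, EchoModel) and the toy keyword retrieval are hypothetical illustrations, not a real API.

from dataclasses import dataclass
from typing import Protocol


class TextGenerator(Protocol):
    """Anything that can complete a prompt: a huge general-purpose LLM,
    a small domain-specific model, or a local stub for testing."""

    def generate(self, prompt: str) -> str: ...


@dataclass
class AutomotiveFaqTask:
    """A concrete task: answer questions from a curated automotive
    knowledge base instead of relying on a model's world knowledge."""

    documents: list[str]  # the domain data that actually drives quality

    def retrieve(self, question: str, top_k: int = 2) -> list[str]:
        # Toy keyword-overlap retrieval; a real system would use a
        # search index or an embedding store.
        terms = set(question.lower().split())
        scored = sorted(
            self.documents,
            key=lambda d: len(terms & set(d.lower().split())),
            reverse=True,
        )
        return scored[:top_k]

    def run(self, question: str, model: TextGenerator) -> str:
        # The task assembles the prompt from its own data; the model
        # behind TextGenerator is interchangeable.
        context = "\n".join(self.retrieve(question))
        prompt = (
            "Answer using only the context below.\n"
            f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
        )
        return model.generate(prompt)


class EchoModel:
    """Stand-in backend; swap in any hosted or local LLM client here."""

    def generate(self, prompt: str) -> str:
        return f"[model output for a prompt of {len(prompt)} characters]"


if __name__ == "__main__":
    task = AutomotiveFaqTask(
        documents=[
            "Brake pads should be inspected every 20,000 km.",
            "Engine oil grade 5W-30 is recommended for most petrol engines.",
        ]
    )
    print(task.run("When should brake pads be inspected?", EchoModel()))

Because the application is written against the task and its data, the backend behind TextGenerator can be a large general-purpose LLM, a smaller specialized model, or a test stub, without changing the rest of the code.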