Interested in learning practical approaches to implementing Large Language Models (LLMs)? Then this workshop is for you.
Join us to gain a practical understanding of building applications that interact with LLMs.
By attending, you will:
- Learn how to architect applications that use LLMs
- Explore techniques for reliably generating high-quality output
- Understand approaches to interacting with LLMs that enable the design and development of complex integrations
The workshop includes hands-on labs that provide concrete implementation examples, alongside discussions of the considerations involved in building end-to-end solutions.
Intended Audience: Application Developers, Application Architects, Chief AI Officers / CDOs, Data Science / ML Experts, Software Development Engineers, DevOps Engineers, AI / GenAI Practitioners
Register today to secure your spot.
Introduction to Large Language Models and Gen AI
Hands-On Lab 1 – Gen AI Playground
LLM Application Architecture
Hands-On Lab 2 – Building RAG Models
Deep Dive on RAG Processes and Document Ingestion
Callan Howell-Pavia
Oracle
APAC Solutions Specialist
Callan Howell-Pavia is an accomplished Information Security Specialist with a passion for Large Language Models and application development. For the past several years he has focused on artificial intelligence, machine learning, and natural language processing, solidifying his expertise in designing applications that leverage these capabilities. His comprehensive understanding of the cybersecurity landscape and software development gives him a unique perspective on the responsible integration of LLMs into secure applications. He endeavors to make complex technical concepts accessible to a broad audience, and this workshop is designed to educate others on the practical applications and implications of LLMs in the real world.