Structured Agent Distillation for Large Language Models
via YouTube
Overview
Discover how Structured Agent Distillation (SAD), presented by researchers from MIT, Harvard, CMU, and other institutions, enhances large language model performance before quantization.