Structured Agent Distillation for Large Language Models

via YouTube

Overview

Discover how Structured Agent Distillation (SAD) improves large language model performance before quantization, in a presentation by researchers from MIT, Harvard, CMU, and other institutions.
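The overview only names the technique, so as a rough illustration the sketch below shows a generic token-level distillation loss with a per-span weighting mask, one way a smaller student model could be trained to imitate a larger teacher's agent trajectories. The function name, the span weights, and the use of PyTorch are assumptions for illustration, not the exact formulation presented in the video.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, span_mask, temperature=2.0):
    """Token-level KL distillation weighted by a per-token span mask.

    student_logits, teacher_logits: (seq_len, vocab_size) tensors.
    span_mask: (seq_len,) tensor of weights, e.g. up-weighting "reasoning"
    tokens relative to "action" tokens (a hypothetical weighting scheme).
    """
    t = temperature
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    # Per-token KL(teacher || student), scaled by T^2 as in standard distillation.
    per_token_kl = (p_teacher * (p_teacher.log() - log_p_student)).sum(dim=-1) * (t * t)
    return (per_token_kl * span_mask).sum() / span_mask.sum()

# Toy usage with random logits and a mask that up-weights the first half of the sequence.
seq_len, vocab = 8, 100
student = torch.randn(seq_len, vocab)
teacher = torch.randn(seq_len, vocab)
mask = torch.tensor([2.0, 2.0, 2.0, 2.0, 1.0, 1.0, 1.0, 1.0])  # hypothetical span weights
print(distillation_loss(student, teacher, mask).item())
```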
