What You Need to Know Before You Start
18 minutes
Optional upgrade available
Progress at your own speed
Free Video
Overview
Discover how Structured Agent Distillation (SAD) enhances large language model performance before quantization, presented by researchers from MIT, Harvard, CMU, and other institutions.
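The sketch below illustrates, in generic terms, the "distill first, then quantize" workflow the video discusses: a student model is trained against a teacher's soft outputs, and post-training quantization is applied afterward. This is only a minimal illustration under assumed choices (toy linear models, a plain KL-divergence loss, a temperature of 2.0, PyTorch dynamic quantization); it does not reproduce SAD's own structured handling of agent trajectories.

```python
# Minimal, generic sketch: knowledge distillation followed by quantization.
# NOT the SAD method itself; model sizes, loss, and temperature are assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label KL loss that nudges the student toward the teacher's distribution."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2

# Toy stand-ins for a teacher/student LLM output head.
teacher = torch.nn.Linear(16, 100)
student = torch.nn.Linear(16, 100)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(8, 16)  # dummy batch of hidden states
with torch.no_grad():
    teacher_logits = teacher(x)

# One distillation step: match the student's distribution to the teacher's.
loss = distillation_loss(student(x), teacher_logits)
loss.backward()
optimizer.step()

# After distillation, apply post-training (dynamic int8) quantization to the student.
quantized_student = torch.quantization.quantize_dynamic(
    student, {torch.nn.Linear}, dtype=torch.qint8
)
print(quantized_student)
```

In a real setting the distillation loop would run over an agent-trajectory dataset for many steps before quantization; the point of the sketch is only the ordering of the two stages.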
Subjects
Computer Science