What You Need to Know Before You Start

Starts 4 June 2025 00:20

Ends 4 June 2025


Harnessing Ollama – Create Local LLMs with Python

Master Ollama and Python to build local LLM applications with practical projects like RAG systems, sentiment analysis tools, and voice-enabled interfaces for real-world use cases.
Packt via Coursera

Packt

2014 Courses


6 hours 23 minutes

Optional upgrade available


Progress at your own speed

Free Online Course (Audit)


Overview

In this course, you will learn how to create local language models using Ollama and Python. By the end, you will be equipped with the tools to build LLM-based applications for real-world use cases.

The course introduces Ollama's powerful features, installation, and setup, followed by a hands-on guide to exploring and utilizing Ollama models through Python. You'll dive into topics such as REST APIs, the Python library for Ollama, and how to customize and interact with models effectively.

You'll begin by setting up your development environment, followed by an introduction to Ollama, its key features, and system requirements. After grasping the fundamentals, you'll start working with Ollama CLI commands and explore the REST API for interacting with models.
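As a taste of the REST API work covered here, the sketch below builds the JSON body that Ollama's local `/api/generate` endpoint expects. It is a minimal illustration, not course material: the helper function name is our own, and actually sending the request assumes a running Ollama server on its default port.

```python
import json

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

payload = build_generate_request("llama3.2", "Why is the sky blue?")
print(json.dumps(payload))

# Sending it requires a running Ollama server, e.g. with `requests`:
#   r = requests.post(f"{OLLAMA_URL}/api/generate", json=payload)
#   print(r.json()["response"])
```

Keeping payload construction separate from the network call makes the request shape easy to inspect before any model is involved.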

The course provides practical exercises such as pulling and testing models, customizing them, and using various endpoints for tasks like sentiment analysis and summarization. The journey continues as you dive into Python integration, using the Ollama Python library to build LLM-based applications.
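A sentiment-analysis task like the one mentioned above typically comes down to prompt design. The sketch below (our own illustration, with an assumed helper name and wording) builds a chat message list that asks a model for a one-word sentiment label; the commented call uses the Ollama Python library and assumes a local server is running.

```python
def sentiment_messages(text: str) -> list:
    """Build a chat message list asking the model for a one-word sentiment label."""
    system = ("Classify the sentiment of the user's text as "
              "positive, negative, or neutral. Reply with one word.")
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": text},
    ]

# With the Ollama Python library (pip install ollama, local server running):
#   import ollama
#   reply = ollama.chat(model="llama3.2",
#                       messages=sentiment_messages("I loved this course!"))
#   print(reply["message"]["content"])
```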

You'll explore advanced features like working with multimodal models, creating custom models, and using the show function to stream chat interactions. Then, you'll develop full-fledged applications, such as a grocery list categorizer and a RAG system, exploring vector stores, embeddings, and more.
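Streamed chat responses arrive as a sequence of chunks rather than one message. The sketch below is a rough illustration of that shape: the `collect_stream` helper is our own, and the commented loop shows what streaming with the Ollama Python library usually looks like (it assumes a running server).

```python
def collect_stream(chunks) -> str:
    """Join the text pieces from a stream of chat-response chunks."""
    return "".join(chunk["message"]["content"] for chunk in chunks)

# With a live model, streaming usually looks like:
#   for chunk in ollama.chat(model="llama3.2", messages=msgs, stream=True):
#       print(chunk["message"]["content"], end="", flush=True)

# Simulated chunks stand in for what the library would yield:
fake_chunks = [{"message": {"content": "Hel"}}, {"message": {"content": "lo"}}]
print(collect_stream(fake_chunks))  # -> Hello
```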

This course is ideal for those looking to build advanced LLM applications using Ollama and Python. If you have a background in Python programming and want to create sophisticated language-based applications, this course will help you achieve that goal.

Expect a hands-on learning experience with the opportunity to work on several projects using the Ollama framework.

Syllabus

  • Introduction
  • In this module, we will introduce the course's objectives, outline the prerequisites needed to succeed, and provide an engaging demo to showcase the tools and concepts that will be used throughout the course.
  • Ollama Deep Dive - Introduction to Ollama and Setup
  • In this module, we will take a deep dive into Ollama, explore its features and advantages, and walk you through the setup process for using Ollama and the Llama3.2 model, ensuring you're ready for hands-on experimentation.
  • Ollama CLI Commands and the REST API - Hands-on
  • In this module, we will demonstrate how to use Ollama's CLI commands and REST API to pull models, generate responses, and customize them for different use cases, providing practical hands-on experience.
  • Ollama - User Interfaces for Ollama Models
  • In this module, we will introduce you to various user interfaces for interacting with Ollama models, including a detailed look at the Msty app and its use as a frontend tool for RAG systems.
  • Ollama Python Library - Using Python to Interact with Ollama Models
  • In this module, we will guide you through using the Ollama Python library to build local LLM applications, interact with models through Python, and explore advanced techniques like custom model creation and streaming.
  • Building LLM Applications with Ollama Models
  • In this module, we will help you build your first LLM application using Ollama, walk you through the process of building and enhancing RAG systems, and dive deep into vector stores and embeddings for optimizing LLM applications.
  • Ollama Tool Function Calling - Hands-on
  • In this module, we will teach you how to use function calling within Ollama, set up an application that categorizes items using function calls, and guide you in creating a final product that utilizes these functions.
  • Final RAG System with Ollama and Voice Response
  • In this module, we will walk you through the creation of a final voice-enabled RAG system, integrating ElevenLabs API to summarize documents and provide voice-based interactions, culminating in a fully functional voice response application.
  • Wrap Up
  • In this module, we will review the key takeaways from the course and offer advice on how to continue learning and applying the concepts you've gained throughout the course.
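The vector-store retrieval at the heart of the RAG modules above can be sketched in a few lines: embed the documents, embed the query, and return the document whose vector is most similar. The code below is a toy illustration with hand-made two-dimensional vectors; in a real pipeline the vectors would come from an embedding model (the `nomic-embed-text` call in the comment is one common choice, not something the course specifies).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, store):
    """store: list of (text, vector) pairs. Return the best-matching text."""
    return max(store, key=lambda item: cosine(query_vec, item[1]))[0]

# In a real pipeline the vectors come from an embedding model, e.g.
#   ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]
store = [("cats are mammals", [1.0, 0.0]), ("python is a language", [0.0, 1.0])]
print(retrieve([0.9, 0.1], store))  # -> cats are mammals
```

Dedicated vector stores add indexing and persistence on top of exactly this similarity search.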

Taught by

Packt - Course Instructors


Subjects

Computer Science