RAG Notes (Local LLM Study Assistant)
Fully local RAG pipeline for semantic search over class notes via a conversational CLI.

The pipeline ingests class notes, chunks and embeds them locally, and serves conversational semantic search from a Bun-based TypeScript terminal app. Retrieval and generation run through LangChain with Ollama-hosted local LLMs, so question answering works fully offline and no data leaves the machine. The design emphasizes privacy, low-latency inference, and reproducible local deployment.