Session 3 - Running LLMs Locally Using Ollama | Downloading Models from Ollama, Alibaba, Hugging Face | Running LLMs via Python & Jupyter | Using OpenAI-Compatible API with Ollama | Creating Prompts, Roles, and Functions | Local Model Inference & Memory Usage
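Among the topics listed is creating prompts with roles for Ollama's OpenAI-compatible API. As a minimal sketch of what such a request looks like (the model name `llama3.2` is an assumption, as is the exact prompt text; Ollama exposes an OpenAI-compatible endpoint locally, conventionally at `http://localhost:11434/v1`):

```python
def build_chat_payload(system_prompt: str, user_prompt: str,
                       model: str = "llama3.2") -> dict:
    """Assemble an OpenAI-style chat-completion payload.

    The "system" role sets the assistant's behaviour; the "user"
    role carries the actual question. Model name is illustrative.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
    }

payload = build_chat_payload("You are a concise assistant.",
                             "Why run LLMs locally?")
# This payload could then be POSTed to the local OpenAI-compatible
# chat-completions endpoint, or passed to an OpenAI client that has
# its base URL pointed at the Ollama server.
print(payload["messages"][0]["role"])  # system
```

The same message structure works whether the request is sent with `requests`, the `openai` client library, or from a Jupyter notebook cell.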
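The session also touches on local model inference and memory usage. A common rule of thumb (not specific to this session) is that the weights alone need roughly parameters × bits-per-weight ÷ 8 bytes, which is why quantized models are so much easier to run locally:

```python
def estimated_weight_memory_gb(n_params_billions: float,
                               bits_per_weight: int) -> float:
    """Rough memory needed just to hold the model weights:
    parameters * (bits / 8) bytes each, expressed in GB (1e9 bytes).
    Excludes KV cache, activations, and runtime overhead."""
    bytes_total = n_params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B-parameter model as a worked example:
print(estimated_weight_memory_gb(7, 16))  # 14.0  (fp16)
print(estimated_weight_memory_gb(7, 4))   # 3.5   (4-bit quantized)
```

This is a lower bound: actual resident memory during inference is higher because of the KV cache and runtime buffers.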