👋 I'm Gareth Brown, a UK based software engineer specialising in full stack web application development, available for freelance and consulting work. Contact me to hire me for your software project (CV, Portfolio).

Local LLM Setup on Windows with Ollama and LM Studio (Lenovo ThinkPad P1 Gen 4 with an RTX A3000)

22 Feb 2026

Introduction

This is a walkthrough of my setup of local LLM capability on a Lenovo ThinkPad P1 Gen 4 (with an RTX A3000 graphics card, 6 GB VRAM), using Ollama for CLI and VS Code Copilot chat access, and LM Studio for a GUI option.

My Lenovo ThinkPad P1 Gen 4 is coming up for 4 years old. It is a powerful workstation, and the RTX A3000 is a good, but by no means state-of-the-art, GPU. My expectation is that many developers will have a PC capable of running local LLMs as I have set them up here.
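To give a flavour of the CLI side, Ollama can download and run a model in a couple of commands. The model name below (`llama3.2`, a roughly 3B parameter model) is an illustrative assumption on my part, not necessarily the model used in the full walkthrough; any small quantized model that fits in 6 GB of VRAM should behave similarly.

```shell
# Pull a small model that fits comfortably in 6 GB of VRAM.
# "llama3.2" is an example choice; substitute any model from the Ollama library.
ollama pull llama3.2

# Run a one-shot prompt from the command line.
ollama run llama3.2 "Explain what VRAM is in one sentence."

# List downloaded models, and show which models are currently loaded into memory.
ollama list
ollama ps
```

Once the Ollama service is running, the same models are also available to tools such as VS Code Copilot chat via Ollama's local API.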

See the GitHub repository for the full walkthrough:

https://github.com/gbro3n/local-ai/blob/main/docs/local-llm-setup-windows-ollama-lm-studio.md
