
Local LLM Setup on Windows with Ollama and LM Studio (Lenovo ThinkPad P1 Gen 4 with a RTX A3000)

22 Feb 2026

Introduction

This is a walkthrough of my setup of local LLM capability on a Lenovo ThinkPad P1 Gen 4 (with an RTX A3000 6 GB VRAM graphics card), using Ollama for CLI and VS Code Copilot chat access, and LM Studio as a GUI option.
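Once Ollama is installed and running, it exposes a local HTTP API (by default on port 11434) that tools like VS Code Copilot chat can talk to, and that you can also query directly. A minimal sketch of a non-streaming request to Ollama's `/api/generate` endpoint, using only the Python standard library; the model tag `llama3.2:3b` is just one example of a small model that fits comfortably in 6 GB of VRAM:

```python
import json
import urllib.error
import urllib.request


def ollama_generate(prompt, model="llama3.2:3b", host="http://localhost:11434"):
    """Send a single non-streaming prompt to a local Ollama server.

    Returns the generated text, or None if the server is not reachable
    (e.g. Ollama is not running on this machine).
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            # Non-streaming responses carry the full output in "response"
            return json.loads(resp.read())["response"]
    except (urllib.error.URLError, OSError):
        return None
```

With Ollama serving locally, `ollama_generate("Why is the sky blue?")` returns the model's answer as a string; without a server it degrades gracefully to `None` rather than raising.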

My Lenovo ThinkPad P1 Gen 4 is coming up on 4 years old. It is a powerful workstation with a good, though by no means state-of-the-art, GPU in the RTX A3000. My expectation is that many developers will have a PC capable of running local LLMs as I have set them up here.

See the GitHub repository for the full walkthrough:

https://github.com/gbro3n/local-ai/blob/main/docs/local-llm-setup-windows-ollama-lm-studio.md
