
I self-host my own private ChatGPT with this tool

LM Studio open on a Windows 11 PC asking about an opossum
By Nick Lewis

ChatGPT has become the poster child for artificial intelligence and large language models everywhere, but if you want something more specialized, or you want something you can guarantee is private, it isn't your only option.

I've been running a handful of AIs on my own PC for a year now instead of paying for ChatGPT—here's how.

Why run an AI locally on your own PC?

ChatGPT is responsive, relatively smart, and continuously receives updates, so why mess with hosting your own large language model at all?

There are three big reasons: cost, privacy, and specialization.

ChatGPT costs money to use

If you're self-hosting a smart home and want to integrate ChatGPT into your system, you're going to have to pay for access. Depending on how much you use it, that could range from a few cents per month to hundreds of dollars.

Hosting your own AI doesn't completely solve that problem, since you still have to pay for electricity, but it does mean you won't unexpectedly see a jump in the cost of access or accidentally incur a huge fee because you're overusing it. Even the most powerful home PC would struggle to use more than a few dollars of electricity per day, and that assumes the system is running completely maxed out 24 hours a day.
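To put that claim in concrete terms, here's a back-of-envelope calculation. The 500W draw and $0.15/kWh rate below are illustrative assumptions; plug in your own hardware's power draw and local electricity rate.

```python
# Back-of-envelope daily electricity cost for a PC running an LLM flat out.
# The wattage and price below are assumptions -- substitute your own numbers.
def daily_electricity_cost(watts: float, price_per_kwh: float, hours: float = 24) -> float:
    """Cost in dollars of running a load at `watts` for `hours`."""
    kwh = watts * hours / 1000
    return kwh * price_per_kwh

# A high-end GPU rig drawing ~500 W at $0.15/kWh, maxed out all day:
cost = daily_electricity_cost(500, 0.15)
print(f"${cost:.2f}/day")  # -> $1.80/day
```

In practice the machine will idle most of the time, so the real figure lands well below this worst case.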

Self-hosted AIs are private

ChatGPT is a fantastic tool, but it isn't private. If you're concerned about how your data might be used in the future, or if you're handling confidential information that cannot be shared outside your organization, a local AI is a fantastic option.

You can ensure that nothing leaves your PC, and so long as your PC is secure, you can be sure that data you provide won't be used for training some time in the future or leak because of a security bug.

Local AI can be fine-tuned to your needs

Not every AI or LLM is the same. If you ask Gemini or ChatGPT the same questions, you'll get slightly different answers. That sort of difference shows through in the AI you can host locally, too.

OpenAI's gpt-oss will provide different responses from Qwen3, and Gemma will provide different answers from Kimi. Additionally, these open models are subject to the same AI arms race that the commercial models are. Some of them are just better at certain tasks than others, and which AI is best at which job changes with the technology and new releases.

The ability to quickly switch between models for a specific job is incredibly handy, and one I leverage a lot. If I need complex feedback on an idea, a larger model like Qwen3 32B is the right tool. If I just need something to parse basic text, Gemma 3 4B is perfectly fine for the job.

Loading a model to chat in LM Studio.

If you're self-hosting AI to handle tasks in your homelab, delegating simple jobs to lighter LLMs is a great way to save on resources. Additionally, you can attach other AI, like those specialized in machine vision or natural language processing, to perform more specialized jobs.
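That delegation can be as simple as a lookup that maps task types to model names. The sketch below uses the two models mentioned above as examples; the exact identifiers in your LM Studio library may differ.

```python
# Sketch: route simple text chores to a light model and harder jobs to a
# larger one. Model identifiers are examples -- match them to what you
# actually have downloaded in LM Studio.
LIGHT_MODEL = "gemma-3-4b"  # quick parsing, summaries, classification
HEAVY_MODEL = "qwen3-32b"   # complex feedback and reasoning

def pick_model(task: str) -> str:
    """Return the cheapest model that should handle the given task type."""
    simple_tasks = {"parse", "summarize", "classify"}
    return LIGHT_MODEL if task in simple_tasks else HEAVY_MODEL

print(pick_model("summarize"))   # -> gemma-3-4b
print(pick_model("brainstorm"))  # -> qwen3-32b
```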

What do you need to host your own ChatGPT?

The first thing you need to run your own LLM is LM Studio, which provides a convenient interface to chat with an LLM much like you would talk with ChatGPT. It also makes trialing new LLMs extremely simple.

LM Studio with OpenAI's gpt-oss in the chat.

Pretty much any modern gaming PC can run at least some local AI models, though the main limiting factor is the amount of VRAM available on your GPU. If you're buying new, 16GB of VRAM is probably a reasonable middle ground that will make a large range of very capable models accessible to you. 12GB is probably the minimum.

Other than that, it helps to have a zippy SSD to make loading and unloading models faster, and a healthy amount of system RAM (32GB or more) is ideal if you're going to try to offload some of the AI work from your GPU to your CPU.
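A rough rule of thumb for whether a model will fit in VRAM: multiply the parameter count by the bits per weight, divide by eight, and add some headroom for context and runtime overhead. The 20% overhead factor below is an assumption; real usage varies by runtime and context length.

```python
# Rough VRAM estimate: parameters x bits-per-weight / 8, plus ~20% headroom.
# The overhead factor is an assumption -- actual usage depends on the
# runtime and how long a context window you configure.
def estimated_vram_gb(params_billion: float, bits_per_weight: int = 4,
                      overhead: float = 1.2) -> float:
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * overhead

# A 13B model quantized to 4 bits per weight:
print(f"{estimated_vram_gb(13):.1f} GB")  # ~7.8 GB -- fits in 12GB of VRAM
```

By this estimate, a 32B model at 4 bits wants roughly 19GB, which is why larger models usually mean partial CPU offloading on a 16GB card.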

If you're not sure what models your system can run with your given specs, there is a handy project on GitHub that can make recommendations for you based on what you want to do and what your system specs are.

The recommendation service on GitHub.

Running your own ChatGPT

Once you download and install LM Studio, all you need to do is click the magnifying glass icon, browse for the model you want, and click "Download" towards the bottom.

Selecting and downloading a model in LM Studio.

If you've found a model elsewhere that you'd like to use, you need to drop it into the correct folder on your PC. By default, that will be:

C:\Users\(YOURUSERNAME)\.lmstudio\models

Where (YOURUSERNAME) is your user account name. So, in my case it was "C:\Users\Equinox\.lmstudio\models."

Once you do that, it'll appear in your list of models just like any other.
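LM Studio expects models nested one level deeper than the `models` folder itself, grouped by publisher and model name. The folder and file names below are placeholders, not real downloads; substitute the names from wherever you got the model.

```shell
# Sketch: place a downloaded GGUF file where LM Studio will find it.
# Layout assumed: models/<publisher>/<model>/<file>.gguf
# "example-publisher" and "example-model" are placeholder names.
MODELS_DIR="${MODELS_DIR:-$HOME/.lmstudio/models}"
mkdir -p "$MODELS_DIR/example-publisher/example-model-GGUF"
# mv ~/Downloads/example-model.Q4_K_M.gguf \
#    "$MODELS_DIR/example-publisher/example-model-GGUF/"
```

On Windows, the same layout lives under the `C:\Users\(YOURUSERNAME)\.lmstudio\models` path mentioned above.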

What can your own ChatGPT do?

What your own, privately hosted LLM can do depends on what model you're using, what hardware you have, and how good you are at writing a prompt.

There are dozens of models (or more) out there with specialized functions, but at a minimum, you can get them to read sources, generate summaries, discuss the content of resources you provide, and parse the contents of images or videos. Many are optimized for tool use, which means if you want them to, they can even interact with external applications to perform extra jobs, or get information automatically.
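External applications talk to your model through LM Studio's local server, which exposes an OpenAI-compatible HTTP API (the `localhost:1234` address below is the default I'd expect, but check your LM Studio server settings). This sketch builds a chat request with only the standard library; the model name is an example.

```python
import json
from urllib import request

# Sketch of a request to LM Studio's local OpenAI-compatible server.
# The address and model identifier are assumptions -- match them to your
# LM Studio server settings and a model you actually have loaded.
payload = {
    "model": "qwen3-32b",
    "messages": [
        {"role": "system", "content": "You summarize text in one sentence."},
        {"role": "user", "content": "Summarize: local LLMs keep data on your PC."},
    ],
}
req = request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# Uncomment once the local server is running in LM Studio:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

Because the API mimics OpenAI's, most tools that already speak to ChatGPT can be pointed at this local endpoint instead.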

If you're willing to experiment, you can even fully integrate them with Home Assistant to create your own talking, thinking (sorta) smart home.


Above and beyond the cost savings, though, there is something else about hosting your own LLM: It is just fun. It isn't every day that a brand-new technology becomes widely accessible to home users, especially one that is slated to be as disruptive as AI.

