Getting started

Everything you need to run AI at home.

I.

I started by watching a few videos like this one...

I used LM Studio and chose a model that I'd read was closest to how I like to use Claude. When in doubt, ask your preferred model which LM Studio model it would pick for you based on how you use it most - that's exactly what I did, and it's working out fine so far.

I had to make sure that a full GPU offload was compatible with my laptop's memory (covered in this video), and then I just downloaded the correct version of the model I wanted.

Yeah. It was that easy.
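Once a model is downloaded, you don't have to stay in the chat window. LM Studio can also run a local server that speaks an OpenAI-style chat API (by default at http://localhost:1234). Here's a minimal sketch using only Python's standard library - note that the model name below is a placeholder; use whatever identifier LM Studio shows for the model you actually downloaded.

```python
import json
import urllib.request

# LM Studio's default local server address (configurable in the app)
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the locally running model and return its reply."""
    payload = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        LMSTUDIO_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (with LM Studio's server running and a model loaded):
#   print(ask("qwen2.5-vl-32b-instruct", "Say hello in five words."))
```

Everything stays on your machine - the request never leaves localhost.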

FAQs

Everything you need to know

Where can I find the memory on my laptop?

For Mac users:

1. Using "About This Mac":
- Click on the Apple logo in the top-left corner of your screen.
- Select About This Mac from the dropdown menu.
- Under Overview, you'll see details about your computer's memory (RAM).

2. For More Detailed Memory Info:
- Open System Information:
- Click on the Apple logo > About This Mac > System Report
- Navigate to Memory in the sidebar to view detailed RAM stats.

For Windows PCs:

1. Using "About Your PC":
- Press Win + I to open Settings.
- Click on System > About.
- Scroll down to see your computer's memory (RAM) details.

2. Alternative Method:
- Right-click the Start button and select System.
- Look under Device Specifications for RAM information.
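If you'd rather check from a script than click through menus, here's a small Python sketch that reports total RAM using only the standard library. It covers macOS and Linux via sysconf, and Windows via the Win32 GlobalMemoryStatusEx call.

```python
import ctypes
import os
import platform

def total_ram_bytes() -> int:
    """Return total physical RAM in bytes, cross-platform."""
    if platform.system() == "Windows":
        # GlobalMemoryStatusEx from the Win32 API via ctypes
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [
                ("dwLength", ctypes.c_ulong),
                ("dwMemoryLoad", ctypes.c_ulong),
                ("ullTotalPhys", ctypes.c_ulonglong),
                ("ullAvailPhys", ctypes.c_ulonglong),
                ("ullTotalPageFile", ctypes.c_ulonglong),
                ("ullAvailPageFile", ctypes.c_ulonglong),
                ("ullTotalVirtual", ctypes.c_ulonglong),
                ("ullAvailVirtual", ctypes.c_ulonglong),
                ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
            ]
        status = MEMORYSTATUSEX()
        status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
        return status.ullTotalPhys
    # macOS and Linux both expose these sysconf values
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")

print(f"{total_ram_bytes() / 2**30:.1f} GiB of RAM")
```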

How do I know LM Studio is safe?

LM Studio vets its sources. Models in its catalog come from Hugging Face, and the ones surfaced in LM Studio's own search are generally from trusted, established publishers. You're not downloading from some random, fly-by-night website here.

Same goes for Ollama.

What types of AI models can I download and run locally?

SO many open-source language models are available for local deployment. Some popular options include:

  • Llama (and Llama-2): Developed by Meta, these models come in various sizes suitable for different hardware.

  • BLOOM: A large-scale model developed as part of the BigScience project, which is open-source and can be run locally.

  • Mistral Models: High-quality, open-source models that are optimized for local use.

  • Qwen: I'm using this one (Qwen2.5-VL-32B-Instruct, to be exact).

  • Hugging Face Transformers: This library provides access to a wide range of pre-trained models (e.g., distilgpt2, bert-base-uncased) that can be downloaded and run locally.
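A quick back-of-the-envelope check before you download: a quantized model needs very roughly (parameters × bits per weight ÷ 8) bytes of memory, plus some headroom for the context cache and runtime buffers. The sketch below uses a 20% overhead factor - that's a guessed buffer, not a spec; real usage depends on context length and which runtime you use.

```python
def approx_ram_gb(params_billion: float, bits_per_weight: int = 4,
                  overhead: float = 1.2) -> float:
    """Rough memory estimate (in GB) for running a quantized model.

    bytes ~= parameters * (bits per weight / 8), scaled by an
    assumed ~20% overhead for the KV cache and runtime buffers.
    """
    return params_billion * (bits_per_weight / 8) * overhead

# A 32B model at 4-bit quantization needs roughly 19 GB:
print(f"{approx_ram_gb(32):.1f} GB")
```

Compare that number against the RAM figure from the FAQ above before committing to a download.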

Got more questions? Ask our subreddit.