
AstonJ
How to: Run DeepSeek on Mac, Windows, and Linux!
This is a very quick guide; you just need to:
- Download LM Studio: https://lmstudio.ai/
- Click on search
- Type DeepSeek, then select the one you want* and click download
- It should then show in your models
*When deciding which one to choose, look at the params: the bigger the model, the beefier the machine you’ll need. I downloaded the 7B model and it runs fine on my Mac.
- You can then start a new chat and at the top select DeepSeek.
That’s it
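Once a model is downloaded, LM Studio can also expose it through a local OpenAI-compatible HTTP server (started from the app's Developer/Server panel), so you can query it from code instead of the chat UI. A minimal sketch, assuming the default `localhost:1234` endpoint and a hypothetical model identifier — check what LM Studio actually reports on your machine:

```python
import json
import urllib.request

# Assumed default for LM Studio's local server; verify the port in the app.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,  # the identifier LM Studio shows for your download
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(model: str, prompt: str) -> str:
    """POST the payload to the local server and return the reply text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        BASE_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Model name is hypothetical; substitute whatever your models list shows.
    print(ask("deepseek-r1-distill-qwen-7b", "Say hello in one sentence."))
```

Nothing leaves your machine — the request goes to the model LM Studio is serving locally.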
Most Liked

iPaul
I was able to run the 32B model on a 4090. It was pretty responsive. I would love to get my hands on a bunch of Mac Mini M4 Pro to compare the 4090 with a cluster of these.

iPaul
I haven’t tried the smaller models, so I can’t really compare them with the 32B. I did try the 70B model as well; it was slightly better than the 32B, but too slow for me to run for anything serious. (I suppose a 64GB Mac Mini M4 Max should be able to run the 70B model.)
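As a rough sanity check on which model fits a given machine, the memory needed just for the weights scales with parameter count times bytes per weight at your quantization level. A back-of-the-envelope sketch (my own rule of thumb, not an official figure — it ignores KV cache and runtime overhead, so budget extra headroom):

```python
def approx_weight_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate GiB needed for model weights alone.

    Ignores KV cache, activations, and runtime overhead,
    so treat the result as a lower bound.
    """
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# A 7B model at 4-bit quantization: ~3.3 GiB of weights,
# a 32B model at 4-bit: ~14.9 GiB, a 70B at 4-bit: ~32.6 GiB —
# consistent with 70B being a stretch below 64GB of memory.
print(round(approx_weight_gb(7, 4), 1))
print(round(approx_weight_gb(70, 4), 1))
```

By this estimate, a 70B model at 4-bit barely leaves room on a 36GB machine, which matches the experience above of it being usable but slow.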

DevotionGeo
I’ve never run an open-source AI model on my local machine. Which model will run smoothly on my M1 MacBook Air without any issues?