How to run DeepSeek and other AIs locally

You can skip the first half if you want to. It only pertains to DeepSeek.


Interesting. Thanks for sharing.

Haven’t installed any ChatGPT-type models. I’ve only played with image and height-map generators.


Can you really be worried about sharing YOUR data with others while running on Microsoft Windows? Maybe he has a video on shutting down all the telemetry and runs nothing from MS other than the OS. It still seems odd given the telemetry built into Windows and their EULA, but as a YouTuber he has to reach the largest audience; I get that it’s about following the money.
Big fan of running stuff locally, so thanks for sharing.


Got to looking around at running DeepSeek R1 locally and found lots of discussions on performance, with Ollama reportedly running far faster than LM Studio.
BTW, Ollama is open source.

FWIW, here’s one doc on installing DeepSeek R1 locally on Ollama:
https://www.datacamp.com/tutorial/deepseek-r1-ollama
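
If you’d rather script it than use the CLI, here’s a minimal sketch using the official `ollama` Python client (`pip install ollama`). The model tag `deepseek-r1:8b` is an assumption; run `ollama list` to see which tags you’ve actually pulled.

```python
# Minimal DeepSeek R1 chat via the ollama Python client.
# Assumes the Ollama server is already running locally
# (the desktop app starts it, or run `ollama serve`).
import ollama

MODEL = "deepseek-r1:8b"  # assumed tag -- substitute one you've pulled

response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Summarize what a GGUF file is."}],
)
print(response["message"]["content"])
```

The CLI equivalent is just `ollama run deepseek-r1:8b`.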


Note that the models are not open source; they’re under a different license from the source code.

DeepSeek’s source code is MIT licensed, but the model has a license with field-of-use restrictions, and the code to train the model isn’t open at all. I have not analyzed it for other failures to comply with the OSD (Open Source Definition). Hugging Face is trying to fill in the missing pieces for a truly end-to-end open source system: Open-R1: a fully open reproduction of DeepSeek-R1

The model size is going to be a huge factor in performance; all other things being equal, a larger model will give higher quality but slower inference. I don’t know that I’ve seen apples-to-apples comparisons here, and I don’t even know how to define what apples-to-apples would be. :grin:
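
To put rough numbers on that trade-off: weights memory scales as parameter count × bits per parameter, which is why quantization matters so much for what fits on consumer hardware. A back-of-envelope sketch (weights only; real usage adds KV cache and runtime overhead):

```python
# Weights-only memory estimate: params * (bits / 8) bytes.
# Ignores KV cache, activations, and runtime overhead,
# so treat these as lower bounds.
def approx_weights_gb(params_b: float, bits: int) -> float:
    return params_b * 1e9 * bits / 8 / 1e9

for params_b in (7, 14, 32, 70):
    row = ", ".join(
        f"{bits}-bit: ~{approx_weights_gb(params_b, bits):.0f} GB"
        for bits in (16, 8, 4)
    )
    print(f"{params_b}B -> {row}")
```

So a 32B model at 4-bit quantization needs roughly 16 GB for the weights alone, which already rules out fully offloading it to most consumer GPUs.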


The project to stock the US Congress with confirmed idiots appears to have achieved another success…


Hopefully it won’t go anywhere. We know that longer prison terms don’t resolve anything.


If Congress would do right, one piece of legislation could have covered TikTok, DeepSeek, and any other piece of software by limiting where the data can be stored and who can access it.

We do it all the time for nuclear generating stations.

Most of Europe has better protections than the US when it comes to data security.

Agree with “confirmed idiots,” but have to add: elected confirmed idiots.

:smile_cat:


A quick test, and I was able to run a 32B model on Ollama locally. It was a tad slow but usable.
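
For anyone curious how slow “a tad slow” is, Ollama reports generation stats with each response (eval_count tokens over eval_duration nanoseconds), so you can compute tokens per second. A sketch with the Python client; the deepseek-r1:32b tag is an assumption:

```python
import ollama

MODEL = "deepseek-r1:32b"  # assumed tag -- use whichever 32B model you pulled

resp = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Write a haiku about local inference."}],
)

# eval_count = tokens generated; eval_duration = time spent, in nanoseconds.
tok_per_s = resp["eval_count"] / resp["eval_duration"] * 1e9
print(f"{resp['eval_count']} tokens @ {tok_per_s:.1f} tok/s")
```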

What hardware specs and OS are you using?

13th-gen i7, 48 GB RAM, RTX 4060 GPU, Win 11 Home.

Nice - thanks for the reply - that’s not particularly beefy - good to know :+1:
