Has anyone tried DeepSeek with TiddlyWiki? There are many discussions online suggesting that it surpasses ChatGPT.
The code is available on GitHub: DeepSeek.
The latest version of AnythingLLM supports DeepSeek through Ollama.
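If you want to poke at it outside AnythingLLM, here is a minimal sketch (my own illustration, not AnythingLLM's code) that asks a DeepSeek model running under Ollama to generate some wikitext via Ollama's local HTTP API. The model tag `deepseek-r1:8b` and the prompt are assumptions; swap in whatever model you actually pulled.

```typescript
// Minimal sketch: query a locally running Ollama server (default port 11434)
// using its /api/generate endpoint. Assumes you have already pulled a
// DeepSeek model, e.g. the "deepseek-r1:8b" tag used below.
async function askDeepSeek(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "deepseek-r1:8b", // assumed model tag; change to your local one
      prompt,
      stream: false,           // return a single JSON object, not a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text
}

// Example: ask for a bit of TiddlyWiki wikitext
askDeepSeek("Write a TiddlyWiki wikitext snippet that lists all tiddlers tagged 'Task'.")
  .then(console.log)
  .catch(console.error);
```

Running it with Node 18+ (which has `fetch` built in) should work as long as Ollama is serving on the default port.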
I tried DeepSeek 8B and 11B on my laptop. Both models have reasonable response speed, but the 11B seems to give better answers to domain-specific questions.
Comparing the online versions of DeepSeek and ChatGPT, ChatGPT gives more correct answers to TiddlyWiki-related questions, e.g. generating wikitext and JS macros/widgets.
Yes, DeepSeek can be run locally on Ollama. I will probably be running a model based off of DeepSeek sometime this year; the training strategy is far more efficient, and it's likely the best open-source model one could pick at this time.
Though the training strategy is far more efficient, I have tried both, and neither their reasoning model nor their advanced LLM beats the equivalent ChatGPT models. They also censor politically inconvenient content.