See TidGi Feature Handbook/Localized AI Copilot on tidgi.fun for how to set up the Vicuna AI model.
I think no other desktop note app has achieved this yet. Notion and Obsidian both rely on AI APIs that require a network connection and cost money, while the AI in TidGi is free and runs the LLM (large language model) locally on your own CPU.
But this is still in beta: the UI is defined in GitHub - tiddly-gittly/tidgi-language-model: Chat with TidGi's built-in language model service in TiddlyWiki. A private, local and rooted ChatGPT AI. and still needs improvement. And the Vicuna AI model is not good enough; if you find a better AI model, please comment here to recommend it.
More supported models (in .bin format) can be found in GitHub - Atome-FE/llama-node: Believe in AI democratization. llama for nodejs backed by llama-rs, llama.cpp and rwkv.cpp, work locally on your laptop CPU. support llama/alpaca/gpt4all/vicuna/rwkv model. — you can test them out.
For more precise AI generation of wikitext and macros, I still need support on Can I get talk forum and gg dataset to train AI?