With the help of my friends Claude and Gemini, I made an AI-Adapter for TiddlyWiki. It is published under the MIT License “WITHOUT WARRANTY OF ANY KIND” (a warranty would really be impossible for a tool of this kind). For the exact terms, see here: MIT License - Wikipedia
If you want to store your API keys encrypted with a password, it needs the encryption/decryption filter made by the great @EricShulman here at his TiddlyTools.com. Otherwise, the keys are kept in temporary tiddlers and are lost after saving.
If one or more API keys are provided in the settings, the plugin creates an AI chat in the sidebar, with the option to import the answers as tiddlers at the click of a button.
As we are a community of cat-lovers, I was keen to integrate “Le Chat” by Mistral. I also integrated Gemini, which is likewise free, and Claude. Old Version: AI-Adapter.json (42.0 KB)
The answers in the thread are now collapsible and have more buttons: to copy the text, to delete, to repair the formatting, and to unwrap JSON answers that contain multiple tiddlers. Old Version: $__plugins_JJ_ai-adapter.json (46.6 KB)
I am not really happy with the last two functions, because the AI’s answers vary so much that it is hard to cover every case. If you have suggestions for improvements, please post them.
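To illustrate why the unwrap function is fragile, here is a rough sketch of that kind of unwrapping, assuming the model returns either a bare JSON array of tiddlers or an object with a `tiddlers` field, often wrapped in a markdown code fence. The function name and accepted shapes are illustrative, not the plugin’s actual internals:

```javascript
// Sketch: unwrap an AI answer that may contain a JSON list of tiddlers.
// Assumed shapes: [{title, text, ...}, ...] or {tiddlers: [...]}.
function unwrapTiddlers(answerText) {
    // Models often wrap JSON in a markdown code fence; strip it first.
    var cleaned = answerText
        .replace(/^```(?:json)?\s*/m, "")
        .replace(/```\s*$/m, "")
        .trim();
    var parsed;
    try {
        parsed = JSON.parse(cleaned);
    } catch (e) {
        return []; // not valid JSON: nothing to unwrap
    }
    // Accept either a bare array or an object with a "tiddlers" field.
    var list = Array.isArray(parsed) ? parsed : (parsed.tiddlers || []);
    // Keep only entries that look like tiddlers (must have a title).
    return list.filter(function (t) {
        return t && typeof t.title === "string";
    });
}
```

Every model phrases its output a little differently (extra prose before the fence, single objects instead of arrays, trailing commas), which is exactly the variation that is hard to counter in general.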
$__plugins_JJ_ai-adapterWithSelect.json (52.0 KB)
Now with the ability to store and apply custom prompts and settings via tiddlers like this one, which appear in a select dropdown below the prompt field.
$__plugins_JJ_ai-adapter.json (59.8 KB)
Hi @Eskha, I asked Sonnet to build an Ollama integration. Since I do not have Ollama running myself, would you test this new version for me?
A new version of this plugin:
I tried to make prompting more understandable and modular.
The plugin now allows you to set LLM parameters such as top-p,
and to modularly build the system prompt with checkboxes and configurable modules. $__plugins_JJ_ai-adapter.json (78.4 KB)
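The modular system prompt can be sketched roughly as follows, assuming each module is a record with a text and an enabled flag set by its checkbox; the field names, defaults, and function names here are assumptions for illustration, not the plugin’s actual schema:

```javascript
// Sketch: assemble the system prompt from checkbox-selected modules.
// Each module is assumed to look like {text: "...", enabled: true/false}.
function buildSystemPrompt(modules) {
    return modules
        .filter(function (m) { return m.enabled; })
        .map(function (m) { return m.text.trim(); })
        .join("\n\n"); // blank line between modules keeps them readable
}

// LLM parameters such as top-p travel alongside the prompt in the
// request body; defaults here are illustrative.
function buildRequestParams(settings) {
    return {
        temperature: Number(settings.temperature || 0.7),
        top_p: Number(settings.top_p || 1.0)
    };
}
```

Keeping each module as its own tiddler means prompts can be mixed and matched per conversation instead of editing one monolithic system prompt.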
This is not working: I keep getting the “Locked. Please enter your password to unlock the AI keys.” error message.
Access to the local API does not require an API key.
BR,
Well, a quick fix to allow testing: open the API Keys settings with the button in the sidebar and enter a dummy string for any of the AI web models. That won’t hurt anything and unlocks the sidebar. I will fix this in the next version if the rest of the Ollama integration is working.
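For reference, a local Ollama call really does need no API key: the request just goes to the local server’s documented `/api/chat` endpoint. A minimal sketch of building such a request (the model name is only an example, and the builder function is illustrative, not the plugin’s code):

```javascript
// Sketch: build a request for a local Ollama server. Unlike the hosted
// APIs, no API key is sent; endpoint and body shape follow Ollama's
// documented /api/chat interface.
function buildOllamaRequest(model, userPrompt, systemPrompt) {
    var messages = [];
    if (systemPrompt) {
        messages.push({ role: "system", content: systemPrompt });
    }
    messages.push({ role: "user", content: userPrompt });
    return {
        url: "http://localhost:11434/api/chat",
        body: { model: model, messages: messages, stream: false }
    };
}

// Usage (in the browser):
//   var req = buildOllamaRequest("llama3", "Hello!", null);
//   fetch(req.url, { method: "POST", body: JSON.stringify(req.body) });
```

So the lock screen is purely a plugin-side check, which is why the dummy-key workaround is enough to get past it.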