Hey, well noted, same here: I tried it yesterday… FYI
Hi @2right4, I'm glad to hear you tried it!
Are you saying that you received the same message as @JanJo when trying to use the plugin with only an Anthropic key? Did following my suggestion to add some placeholder content to the OpenAI key tiddler work for you?
I appreciate the user feedback; the lack of it has been one of the major reasons I've slowed my roll on releases: so far, everything I've been working on has been pretty insular to my own experience using the plugin. Receiving more feedback will improve the quality and consistency of releases going forward.
Just spun up an API key and now I have WikiSage running. Am I missing an option somewhere for it to automatically save my chat history as a tiddler?
No, but that should be pretty simple to implement. It currently saves only the last 5 messages for context, because I didn't want it taking up too much space in people's tiddlers by default and because copy/paste works, but I think a toggle to achieve that could be added.
Tell me more about what you're thinking as your ideal use case. Would it be always on, something you'd set in config? Or would it be a toggle on the interface that you could select as needed? Or some other third thing?
Maybe a parameter you could set when calling the widget? That would let you choose a tiddler name, and multiple widget instances given the same name would all append to that one tiddler, if that were of any interest.
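For example, something roughly like this, where both the widget name and the attribute are placeholders for whatever it actually ends up being called (none of this exists yet):

```
<!-- Hypothetical: a transcript attribute naming a tiddler that every
     exchange would be appended to. Widget and attribute names are
     placeholders, not the plugin's current API. -->
<$wikisage transcript="ChatLog/ProjectNotes"/>
```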
I think I would always have it on; then I could periodically review it and save anything that is relevant. Or, if there were a button to click that would export the conversation to a new tiddler, I could choose to save the conversation that way.
I think that should be very doable, and something that belongs in V1. I agree.
I would like the possibility of storing a conversation in one tiddler, with a mechanism to strip out the useless parts afterwards.
Perhaps this could use a mechanism like the section-editor.
An interesting question is what should happen to files the AI creates for you.
I'm listening, say more. Right now the agent can create tiddlers, and those are treated the same as normal tiddlers.
I'm in so deep with the streams outliner, lol (you should be able to expect a working version soon).
Even better if you do this with Streams rather than with sections.
Hi, just wanted to say I'm really excited to see the community making progress towards adding support for interacting with AI services. Is it possible to use the Perplexity API with this plugin? It looks like, at this point, OpenAI does not support web search via its API.
I can look into it. I've never used Perplexity, but web search would be quite excellent; I'm interested.
Just updated my model list with the new 3.7 Sonnet.
To use it, just add claude-3-7-sonnet-20250219 to the model list file.
I recently found that, instead of chatting directly with the AI, having the AI comment on your notes is more interesting. For example, we can use an ai-comments field to store a JSON-formatted list of comments on your note from various agents with different personalities and backgrounds.
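For example, the ai-comments field might hold something shaped like this (the agent names and keys are just a sketch of the idea, not an existing format):

```
[
  {"agent": "skeptical-historian", "comment": "This claim needs a primary source."},
  {"agent": "cheerful-editor", "comment": "The second paragraph could be tightened."}
]
```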
The question I am currently thinking about is how best to use core functions to let the AI call the API at random moments while the computer is idle, comment on a random note, and keep the comments brief rather than lengthy so they don't take up too much storage space. I know there is a plugin called It's About Time that provides a degree of random timing functionality, which might be useful.
I am not building these things right now because the AI features and time functions have not yet landed in the TiddlyWiki core. I am afraid that writing something now would make the plugin ecosystem a bit chaotic: each plugin would maintain its own management of agents and prompts, as well as its own management of APIs and keys (for example, if I want to use DeepSeek R1, how do I provide it to all AI plugins at once?), which would also lead to a lot of reinvented wheels in the ecosystem.
Yes, I agree that chatting directly is often not the most useful way of engaging. With WikiSage, I currently have a prompt such as "Process this article into Source and Idea tiddlers and open them for review by the user", which I just transclude along with the text contents; elsewhere, I have described how I use AI to convert my reading notes into tiddlers.
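Concretely, it is nothing fancier than transcluding a reusable prompt tiddler next to the content being processed; the tiddler titles here are just examples, not names WikiSage requires:

```
{{Prompts/ProcessArticle}}

{{Imported/SomeArticle}}
```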
The “next release” of WikiSage, when that occurs, will include more of these sorts of features.
In terms of what you're discussing, I have been mapping out how an AI Daemon would work: it would run checks on what the user is doing with a small (and/or local) model and report, when appropriate, to a larger model for taking action (such as commenting, as you mention).
I tried to use bpmnServer/bpmn-server on GitHub (a BPMN 2.0 server for Node.js providing modeling, execution, persistence, and monitoring for workflows, with a sample UI, intended as a developer's workbench for BPMN 2.0) to create something like https://www.coze.com/, but I later found that this kind of workflow engine is too big (1 MB+) and most of its features are useless.
A combination of the set-timeout action widget (from the It's About Time plugin) and an action tiddler (tagged with $:/tags/Actions) may be enough for the auto-commenting task.
This kind of workflow requires knowledge of wikitext widget usage, so it is not as easy as a visual workflow, but it is for developers, so that is fine. End users would only use an "AI comment pal" plugin without touching the workflow under the hood.
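A rough sketch of the action-tiddler half, using only core widgets; the selection filter and the ai-comment-requested flag are placeholders for whatever the AI plugin would actually watch, and the timer widget that fires these actions is not shown:

```
<!-- Action tiddler, tagged $:/tags/Actions: pick one note that has no
     ai-comments field yet and flag it for the AI plugin to comment on.
     A timer widget such as $action-timeout would invoke these actions
     periodically; its own attributes are not shown here. -->
<$list filter="[!is[system]!has[ai-comments]limit[1]]" variable="target">
  <$action-setfield $tiddler=<<target>> ai-comment-requested="yes"/>
</$list>
```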
Please note that It's About Time! — TiddlyTools: "Small Tools for Big Ideas!" (tm) is OBSOLETE and has not been updated since September of 2022. Instead, ALL TiddlyTools add-ons (including the $action-timeout custom widget) are hosted directly on https://tiddlytools.com, which should be referenced as the definitive source for the most up-to-date versions.
-e
Interesting, I'll check it out. Size efficiency is a big concern for me, but my thinking (and this is true of my thoughts about how innovation in AI ultimately takes place generally) revolves around coming up with clever and efficient applications of the tool, rather than piling on more and more features.
FYI: 10 个月，5 万 DAU，我们可能找到了 AI 陪伴的另一种可能 ("10 Months, 50,000 DAU: We May Have Found Another Possibility for AI Companionship") - 王登科-DK博客. It is in Chinese; you can use https://immersivetranslate.com/ to read it.
Sorry, I was wrong: the core AI plugin draft already defines this standard, so just following it will be fine. It uses the tag $:/tags/AI/CompletionServer together with \procedure json-prompt() and \procedure completion-callback().
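Going only by those identifiers, adding a new backend (DeepSeek, say) would presumably mean creating a tiddler tagged $:/tags/AI/CompletionServer whose text defines those two procedures, roughly like this; any other fields, and the parameters the procedures actually receive, need to be copied from the server tiddlers bundled with the draft plugin rather than from this sketch:

```
\procedure json-prompt()
<!-- build the JSON request body for your service's endpoint here -->
\end

\procedure completion-callback()
<!-- parse the service's response and hand the completion text back -->
\end
```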
And there is already a core comment plugin (see the topic "Why is comment plugin not mentioned in the official documentation site"), so I think making an AI comment plugin is pretty easy now, based on core plugins.