Intelligent TiddlyWiki: Machine Learning ("AI") Enhancements

I haven’t done any training myself, I’ve only run pretrained models, but yes, I agree a dataset is the most difficult part. I’ve had quite a lot of success just by providing context to the model in the system message about what TiddlyWiki is (models tend to have a general idea because they’ve been trained on Reddit et al.) along with specific wikitext examples. I think if I slowly add more complex subtleties to that system prompt, it will become fairly competent at generating solutions for both newbie and intermediate-level questions.
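To make the approach concrete, here is a minimal sketch of the kind of system message I mean; the prompt wording and the shape of the messages array are illustrative assumptions, not the plugin's actual prompt:

```javascript
// Hypothetical sketch of the system-prompt approach: give the model
// TiddlyWiki context up front instead of fine-tuning it.
function buildMessages(userQuestion) {
  const systemPrompt = [
    "You are an assistant embedded in a TiddlyWiki.",
    "TiddlyWiki stores everything as tiddlers: titled chunks of text with fields and tags.",
    "Answer using TiddlyWiki wikitext, e.g. [[links]], {{transclusions}}, and <$list> widgets.",
  ].join("\n");
  return [
    { role: "system", content: systemPrompt },
    { role: "user", content: userQuestion },
  ];
}

// The resulting array is what you would pass as `messages`
// to a chat-completion-style API.
const messages = buildMessages("How do I list all tiddlers tagged Journal?");
```

Adding more wikitext subtleties then just means appending lines to the system prompt.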

@linonetwo when you refer to Claude Artifacts, I am a bit confused in the context of your stated interest. According to What are Artifacts and how do I use them? | Anthropic Help Center, Artifacts are pieces of reference material that the model has identified as relevant to save and continue working with in the future (unless I’m misunderstanding). This is something that the Expanded Chat GPT Plugin is capable of doing: by default, it is instructed to store important information it wants to “remember” in a shadow tiddler, where it might note, for example, “Actively working on XYZ code.” Then, if one were to say “Let’s pick up where we left off on the code,” it would check the reference file and pull up the XYZ code tiddler to refer to. A more complex UI could be generated for this – in fact, I’ve had quite a lot of success saying “create a tiddler titled this which contains a UI that communicates the most recent/relevant data from these tiddlers.”
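A minimal sketch of that shadow-tiddler "memory" idea, using a plain object in place of the real tiddler store; the tiddler title and field shape here are hypothetical, not the plugin's actual implementation:

```javascript
// Stand-in for the wiki's tiddler store (keyed by title).
const wiki = {};

// Append a note to a hypothetical "memory" shadow tiddler.
function remember(note) {
  const title = "$:/plugins/example/chat-memory"; // illustrative title
  const existing = wiki[title] ? wiki[title].text + "\n" : "";
  wiki[title] = { title, text: existing + "* " + note };
}

// Read the memory back so the agent can "pick up where we left off".
function recall() {
  const t = wiki["$:/plugins/example/chat-memory"];
  return t ? t.text : "";
}

remember("Actively working on XYZ code");
```

In the real plugin, the writes would go through TiddlyWiki's wiki store rather than a plain object.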

This sounds like it has the makings of a really useful plugin, particularly for Streams users, if you ever get the chance to share it (which I realize can be rather time consuming).

I don’t force myself to keep tasting notes, but sometimes they come to me on a whim and I like to have a place to store them… they can be especially helpful when buying new teas, or if I’m trying to remember a particular variety that was very unique.

Now that the Stacks UI is working, wine is my next application of it :sunglasses:🤌

I’m fairly new to creating plugins – in fact, many of the customizations I’ve made to Streams have been made by directly modifying the plugin files over the years (a very shortsighted and bad habit I fell into). But the Stacks files I had the sense to keep separate, and with your encouragement I may do so :slight_smile:

Back on topic, this is actually an application of AI that I see as especially helpful: I have made minor alterations to several plugins over the years, some of which might be interesting to share, but which I haven’t always been the best at documenting. I plan on running some of my versions against the originals for comparison, and asking a model to pull out the differences and repackage them.

Please do, and do feel free to ask if you need any guidance.

Yes, I can definitely see the appeal, for example for converting my shorthand notes into something more legible for others. I look forward to playing with this once I get the opportunity.

It can be a full HTML + JS shareable webpage, like a Claude Artifact. The Claude docs don’t seem clear about this, but it’s touted in the news coverage I’ve read. ChatGPT Canvas seems to have this feature too.

I mean using an LLM to create a dedicated UI for a certain task on the fly, like when I created the Principles plugin because I wanted to record principles. Each task can have a dedicated UI.

All my customizations are plugins, so they’re easier to manage and don’t get my wiki dirty.
And I deserve a Notion-grade note app. Making plugins forces me to make them better, instead of using geek-style makeshift UIs all my life, if I’m going to use TW for a long time.

I think I see what you mean… to clarify, you are saying that an LLM would create a UI based on a very specific use-case scenario and then store the information in a way that could be represented in universal ways? I think this is a very spot-on vision for the capabilities of LLMs as a technology… imagine if all user interfaces customized themselves by default to the way the user’s brain worked, or to the highly specific needs of any given situation.

I think this would be possible, within the constraints of tiddlywiki… and the user would obviously need to self-host if it were to be sharable. I can definitely imagine how that would have all sorts of extremely useful applications for communicating ideas with others.

Edit: Lol, thank you @linonetwo, I had never actually thought about creating a JavaScript game within TiddlyWiki and immediately had to try it:

Yes, the universal way to store data in TW is the tiddler, and every UI will store its results as tiddlers, like the Principles layout and the calendar layout do. They can read each other’s data. What a web 3.0!

This conversation has sparked some really fun ideas for me :slight_smile: I could imagine someone running a DnD game could add game details to the system prompt and generate some really interesting story elements on the fly.


Hi, perhaps this is interesting:
I just stumbled upon LangChain, which could be useful for integrating LLMs via Node.js.
See How to Integrate AI into Your Node.js Application: A Step-by-Step Guide
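For comparison, here is a minimal native-Node sketch of the same kind of integration without LangChain, using the built-in fetch (Node 18+); the endpoint, model name, and environment variable are assumptions for illustration:

```javascript
// Build an OpenAI-compatible chat-completion request body.
function buildChatRequest(systemPrompt, userMessage) {
  return {
    model: "gpt-4o-mini", // illustrative model name
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userMessage },
    ],
  };
}

// Send the request and return the model's reply text.
async function chat(userMessage) {
  const body = buildChatRequest("You are a TiddlyWiki assistant.", userMessage);
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Whether a framework on top of this is worth the abstraction is debated later in this thread.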

Good find :slight_smile: Looking forward to checking it out.

With the latest release of the Expanded Chat GPT widget, I’ve been able to demonstrate some of the possibilities I see for Tiddlywiki using multimodal ML enhancements:


A nonsense chart, created in paint, uploaded through the interface and showing a basic relationship between 3 titles


As you can see, the nonsense titles were successfully extracted from the image, with the relationships reflected through tagging

Tagging is not the only kind of relationship that could be depicted, of course – any visual relationship that you can summarize broadly could be used in this manner to create relationships with backlinks, fields, transclusions, whatever you can imagine.
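As a sketch of that idea, here is how extracted parent/child relationships could be mapped to tag fields; plain objects stand in for real tiddlers, and the field shape is an assumption:

```javascript
// Turn extracted "child is tagged with parent" relationships
// (e.g. pulled from a diagram by a multimodal model) into tiddler objects.
function relationshipsToTiddlers(edges) {
  // edges: [{ child: "Title A", parent: "Title B" }, ...]
  const tiddlers = {};
  for (const { child, parent } of edges) {
    tiddlers[child] = tiddlers[child] || { title: child, tags: [] };
    tiddlers[parent] = tiddlers[parent] || { title: parent, tags: [] };
    if (!tiddlers[child].tags.includes(parent)) {
      tiddlers[child].tags.push(parent);
    }
  }
  return Object.values(tiddlers);
}

// Hypothetical relationships extracted from a sketched chart.
const out = relationshipsToTiddlers([
  { child: "Leaf Grade", parent: "Tea Notes" },
  { child: "Oxidation", parent: "Tea Notes" },
]);
```

The same mapping step could just as easily emit links, fields, or transclusions instead of tags.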

This might be very helpful if you, like me, value having information represented in different forms, especially for archival purposes.


Performs generally pretty well even with a shoddy quick sketch and limited context – additional context and consideration would greatly refine the results

This, on top of the agent’s new ability to generate images on request –

– can be a real step up in how users can rapidly process, store, and retrieve information that goes well beyond text.

Wanted to repost @JanJo’s example here, away from the Expanded Chat-GPT topic, since it might have wider application than that plugin itself (though I’m certainly interested in exploring the possibilities of using the plugin in this way!)

Will have a look over coffee, @JanJo and get back to you here!

There seems to be something wrong with this translation…

Using the TiddlyWiki interface for something like this is definitely interesting! For a project years back I had a word count in the editor – something like that could be used to provide a real-time score while a student was actively working on something, and a button could trigger a popup with contextual recommendations, rather than allowing the student to actually edit their content with it.
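The word-count part is simple to sketch in plain JavaScript; how it would be wired into the editor (for example, as a macro reading the draft tiddler's text) is left as an assumption:

```javascript
// Count words in a block of text, ignoring leading/trailing
// whitespace and runs of whitespace between words.
function wordCount(text) {
  const words = text.trim().split(/\s+/).filter(Boolean);
  return words.length;
}
```

A real-time score would just re-run this on the draft text whenever it changes.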

@JanJo, the use case you had in mind was rapidly processing and providing feedback to students on their handwriting quality – would that be an interface that the students would use to get feedback on their own work, or an interface that you would use to rapidly process and sort their work?

LangChain is not a good option; things can be written in native JS, or you can use visual tools like n8n / Coze / Wordware to achieve the same features in less time. I wrote langchain-alpaca when they came out, but I soon discovered that LangChain is badly designed and over-abstracted.

BTW, see the proposal to implement a Wordware-style feature in TW: Add "AI Tools" Plugin by Jermolene · Pull Request #8365 · TiddlyWiki/TiddlyWiki5 · GitHub

I am a bit behind in my understanding of the current state of TiddlyWiki and LLM training, but I would like to get it to the point where, when you talk to the chatbot, it understands TiddlyWiki syntax with better accuracy.

As far as my personal use case for hooking up my TiddlyWiki data to an LLM, I am mostly interested in using it to synthesize and make connections between my existing notes. Curious how others have prepared their individual tiddlers to use as the knowledge base for their LLM.

Hey @markkerrigan, please check out the Expanded Chat-GPT plugin to see where I’ve gotten with it, and feel free to post here and/or there detailing thoughts on your use cases, as I am formulating next steps :slight_smile: also feel free to ask specific questions about my own process, as I have been developing new habits since implementing AI into my knowledge base.

I have found there is no need for additional training of models; rather, an extensive system prompt going over important TiddlyWiki syntax and explaining the model’s context as an agent within TiddlyWiki has been sufficient to yield consistent results.

Fantastic experiments. Thanks for the inspiration.

Hey @joshuafontany, good to see you online :relaxed: if you decide to do some experiments yourself, would love to hear about them! And if you decide to try out the Expanded Chat-GPT plugin I’d love to hear how it works for you and how you feel it might be improved :grin:

One way I’m experimenting with the interface is to use it for identifying and creating backlinks for tiddlers, especially when I have old notes that may pertain to a newer tiddler.

image

Here you can see the agent performing a search for instances of “interdependency” across the entire wiki and wrapping those instances as backlinks. Not shown: the agent also correctly identifies and wraps other forms (aliases) of the term (e.g. interdependence).
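A rough sketch of the "wrap matching terms as links" step as a pure string transform (the plugin's actual mechanism may differ); it skips occurrences that are already wrapped as [[links]]:

```javascript
// Wrap standalone occurrences of `term` in wikitext link brackets,
// case-insensitively, leaving existing [[...]] links untouched.
function wrapTermAsLink(text, term) {
  // Escape regex metacharacters in the search term.
  const escaped = term.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  // Match the bare word, but not when it is already bracketed.
  const re = new RegExp(`(?<!\\[\\[)\\b(${escaped})\\b(?!\\]\\])`, "gi");
  return text.replace(re, "[[$1]]");
}

const result = wrapTermAsLink(
  "Interdependency matters; see [[interdependency]] for more.",
  "interdependency"
);
// result: "[[Interdependency]] matters; see [[interdependency]] for more."
```

Handling aliases like "interdependence" would just mean running the same transform once per alias supplied by the model.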

Ultimately I’d like to use this kind of functionality to also identify opportunities for linking between tiddlers – ideally not just when prompted, but as a background operation.