Intelligent TiddlyWiki: Machine Learning ("AI") Enhancements

With the latest release of the Expanded ChatGPT Interface plugin (Expanded ChatGPT Interface - #11 by well-noted), I wanted to open a discussion about the possibilities for incorporating machine learning (in short, AI) into TiddlyWiki. I’ll give some context here for what that plugin is capable of, but if you want a full overview of it, or you have thoughts, questions, or recommendations about it specifically, I recommend checking out and posting on that topic.

That said, I haven’t heard much about other community members’ needs, interests, and activities with ML, so please share those here :slight_smile:

Several years back, when I first came across ML, I became interested in using NLP to process all of my reading notes into my wiki. With the recent spread of widely available GPT models, that vision has come true, allowing me to spend more time reading and interacting with my notes and less time processing them:


Notes as they are exported from my Boox tablet


Notes after they are passed through a Python script which uses an OpenAI API call to process them into my specific format, ready to be imported into TiddlyWiki
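For the curious, here is a minimal sketch of what such a script can look like, assuming the official OpenAI Python client (the model name, file name, and exact target format are placeholders rather than my actual setup):

```python
# Minimal sketch: pass raw exported notes through the OpenAI API and get back
# tiddler-formatted text ready to import into TiddlyWiki.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You convert raw reading notes into TiddlyWiki tiddlers. "
    "For each note, return a block with a 'title:' line, a 'tags:' line, "
    "a blank line, and then the note text rewritten as clean wikitext."
)

def notes_to_tiddlers(raw_notes: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": raw_notes},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    with open("boox-export.txt", encoding="utf-8") as f:  # placeholder file name
        print(notes_to_tiddlers(f.read()))
```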

More recently, as you can see in the above post, I was inspired to look into more direct implementations of ML within the TiddlyWiki environment.

An example of the text-editing capabilities of ML when incorporated into TW

An example of the text-completion capabilities of ML, within the Streams plugin

To my eye, these are neat tricks: they might let someone use shorthand for rapid note entry, to be reformatted later, or speed up their drafting process.

Moreover, ML can be incredibly useful in completing otherwise cumbersome tasks:


Demonstration of complex, multi-stage contextual operations performed rapidly with a natural language query




Demonstration of rapid transformation of raw data into a formatted table, on a mobile interface


For me, the possibilities of incorporating ML into the TW environment really shine as an extension of what TW already does spectacularly well – categorizing and filtering bits of information to create relationships:


Demonstration of the ability to perform a complex search operation and provide the user with functional buttons to enhance workflow



Create original connections between tiddlers to better structure information

Additionally, I will often find myself in the middle of performing one task (maybe reading an article while waiting in line) and come across some information I’d like noted but don’t have time to process completely:


Having come across some information, creating spontaneous stubs by copying/pasting the context into the interface, to be revisited later

This is also helpful when trying to capture stray thoughts that I have, which I may not have time to fully articulate in the moment:



I want to hear more of the community’s thoughts, experiences, and visions for bringing these two tools together.


Commendations for a well-written piece, @well-noted :wink:

For me, if the whole review-edit-proof cycle could be optimized with some kind of tool… that would be a fabulous thing to have.

It would need to be trained to recognize my style though. I wouldn’t want to lose that in the process.

Just my initial thoughts…


@CodaCoder I am :100: in agreement with you on that point; it is one of my top priorities. The current version of the Expanded ChatGPT plugin is capable of that: it can scan the content of all the tiddlers in your wiki and come up with a set of conclusions and instructions, for its own reference, about how you write, and it can use reference files to influence how it responds. I just need to do some strategic prompting and restructuring of the existing widgets to make that the default. :slightly_smiling_face:

I appreciate your input; I’ll likely make that my next priority.

I am also considering adding a “User Prompt” which is inserted immediately after the default system prompt, in which one could add their own instructions and definitions: for example, if one had their own style of shorthand, they could install the plugin and simply enter their shorthand definitions into an edit box, and the agent would reference them each time a query is sent.
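Mechanically, the idea is just to slot that user-defined block in right after the default system prompt when assembling each query; a minimal sketch (the prompt text and function names here are hypothetical):

```python
# Sketch of the "User Prompt" idea: user-supplied instructions (e.g. shorthand
# definitions kept in an edit box or config tiddler) are appended as a second
# system message on every query.
DEFAULT_SYSTEM_PROMPT = "You are an assistant embedded in a TiddlyWiki. Respond in wikitext."

def build_messages(user_defined_prompt: str, query: str) -> list[dict]:
    messages = [{"role": "system", "content": DEFAULT_SYSTEM_PROMPT}]
    if user_defined_prompt.strip():
        # e.g. "'gfc' means gong fu cha; '->' means 'leads to'; expand these on sight"
        messages.append({"role": "system", "content": user_defined_prompt})
    messages.append({"role": "user", "content": query})
    return messages
```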

Waiting for data for AI training: can I get the Talk forum and GG datasets to train an AI?

Training is easy nowadays (even ChatGPT knows how to train a GPT), but the dataset is difficult to prepare.

Instead of using an LLM to generate notes, I’m more interested in generating UI or layouts at runtime, like Claude Artifacts does (I haven’t used it yet, I’ve just seen the news).


I am yet to find the time to explore your work properly, as well as the Streams integration, but this example caught my eye. Just a random example, or something you are interested in? As a gong fu cha aficionado, that sounds like an interesting read.


As I sip my Da Zhang Mountain Wild Arbor from Jiangxi



@saqimtiaz, my tea journals actually inspired me to work out a card-system version of Streams (which I call Stacks):


A user can switch between a Streams-view and a Stacks-view, depending on which representation is more useful

Cards can be dragged into one another to create tiers (hence “stack”) and clicking on a card with items underneath it reveals the next level down


Lovely, thank you for sharing! Funnily enough, what drew me to explore customizing TiddlyWiki 5 some years ago was creating a wiki to keep track of my tea collection (mostly sheng puerh), which I later extended for tasting notes.

I have to admit that I gave up keeping tasting notes some time ago now, but that part of the customizations became the basis for another wiki for wine tasting notes that is now regularly used by a small circle of friends.


I haven’t done any training myself; I’ve only run pretrained models. But yes, I agree that the dataset is the most difficult part. I’ve had quite a lot of success just by providing context to the model in the system message about what TiddlyWiki is (models tend to have a general idea because they’ve been trained on Reddit et al.) along with specific wikitext. I think if I slowly add more complex subtleties to that system prompt, it will become fairly competent at generating solutions for both newbie and intermediate-level questions.
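To give a rough idea of the kind of context I mean, here is an illustrative sketch of such a system message (not the plugin’s actual prompt; the wording and examples are just stand-ins):

```python
# Illustrative only: a system message that explains what TiddlyWiki is and
# shows concrete wikitext, so answers come back in wikitext rather than
# generic Markdown. Not the plugin's actual prompt.
TIDDLYWIKI_SYSTEM_PROMPT = """\
You answer questions about TiddlyWiki 5, a personal wiki whose content is
stored as small named units called tiddlers.
Respond in wikitext, not Markdown. Examples of wikitext:
* A link: [[Some Tiddler Title]]
* A transclusion: {{Some Tiddler Title}}
* A filtered list: <<list-links filter:"[tag[Project]]">>
Prefer core widgets and filters, and flag anything that would need JavaScript.
"""
```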

@linonetwo when you refer to Claude Artifacts, I am a bit confused in the context of your stated interest. According to What are Artifacts and how do I use them? | Anthropic Help Center, Artifacts are pieces of reference material that the model has identified as being relevant to save and continue working with in the future (unless I’m misunderstanding). This is something that the Expanded ChatGPT plugin is capable of doing: by default, it is instructed to store important information it wants to “remember” in a shadow tiddler, where it might store, for example, “Actively working on XYZ code.” Then if one were to say “Let’s pick up where we left off on the code,” it would check the reference file and pull up the XYZ code tiddler to refer to. A more complex UI could be generated for this; in fact, I’ve had quite a lot of success saying “create a tiddler titled this which contains a UI that communicates the most recent/relevant data from these tiddlers.”
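For those curious, the general shape of that “remember it in a tiddler” pattern looks roughly like the sketch below; the plugin’s actual mechanism may differ, and the tiddler title and the get/set helpers are hypothetical stand-ins for whatever the host wiki provides:

```python
# General shape of the memory-tiddler pattern: the model's notes-to-self live
# in a dedicated tiddler and are folded back into the context on later queries.
MEMORY_TIDDLER = "$:/plugins/example/chat-memory"  # hypothetical title

def build_context(get_tiddler_text, query: str) -> list[dict]:
    memory = get_tiddler_text(MEMORY_TIDDLER) or ""
    return [
        {"role": "system", "content": "Working notes from earlier sessions:\n" + memory},
        {"role": "user", "content": query},
    ]

def remember(get_tiddler_text, set_tiddler_text, note: str) -> None:
    existing = get_tiddler_text(MEMORY_TIDDLER) or ""
    set_tiddler_text(MEMORY_TIDDLER, existing + "\n* " + note)
```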

This sounds like it has the makings of a really useful plugin, particularly for Streams users, if you ever get the chance to share it (which I realize can be rather time consuming).

I don’t force myself to keep tasting notes, but sometimes they come to me on a whim and I like to have a place to store them… they can be especially helpful when buying new teas, or if I’m trying to remember a particular variety that was very unique.

Now that the Stacks UI is working, wine is my next application of it :sunglasses:🤌


I’m fairly new to creating plugins; in fact, many of the customizations I’ve made to Streams over the years have been made by directly modifying the plugin files (a very shortsighted and bad habit I fell into). But I had the sense to keep the Stacks files separate, and with your encouragement I may share them :slight_smile:

Back on topic, this is actually an application of AI that I see as being especially helpful: I have made minor alterations to several plugins over the years, some of which might be interesting to share, but which I haven’t always been the best at documenting. I plan on running my versions against the originals for comparison and asking a model to pull out the differences and repackage them.
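A rough sketch of that workflow, assuming the OpenAI Python client (file paths and the model name are placeholders):

```python
# Diff a locally modified plugin file against the upstream original, then
# hand the unified diff to a model to summarize and document.
import difflib

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def diff_files(original_path: str, modified_path: str) -> str:
    with open(original_path, encoding="utf-8") as a, open(modified_path, encoding="utf-8") as b:
        return "".join(difflib.unified_diff(
            a.readlines(), b.readlines(),
            fromfile=original_path, tofile=modified_path,
        ))

def describe_changes(diff_text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Summarize this diff of a TiddlyWiki plugin as short release notes."},
            {"role": "user", "content": diff_text},
        ],
    )
    return response.choices[0].message.content
```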


Please do, and do feel free to ask if you need any guidance.

Yes, I can definitely see the appeal, for example for converting my shorthand notes into something more legible for others. I look forward to playing with this once I get the opportunity.


It can be a full HTML + JS shareable webpage, like a Claude Artifact. It looks like the Claude docs aren’t clear about this, but this is what the news I read boasted about. ChatGPT Canvas seems to have this feature too.

I mean using an LLM to create a dedicated UI for a certain task on the fly, like how I created the Principles plugin when I wanted to record principles. Each task can have a dedicated UI.

All my customizations are plugins, so they are easier to manage and won’t get my wiki dirty.
And I deserve a Notion-grade note app; if I’m going to use TW for a long time, making plugins forces me to make them better, instead of using a geek-style makeshift UI all my life.

I think I see what you mean… to clarify, you are saying that an LLM would create a UI based on a very specific use-case scenario and then store the information in a way that could be represented universally? I think this is a very spot-on vision for the capabilities of LLMs as a technology… imagine if all user interfaces customized themselves by default to the way the user’s brain works, or to the highly specific needs of any given situation.

I think this would be possible, within the constraints of TiddlyWiki… and the user would obviously need to self-host if it were to be shareable. I can definitely imagine how that would have all sorts of extremely useful applications for communicating ideas with others.

Edit: Lol, thank you @linonetwo, I had never actually thought about creating a JavaScript game within TiddlyWiki and immediately had to try it:

Yes, the universal way to store data in TW is the tiddler, and every UI will store its results as tiddlers, like the Principles layout and the calendar layout do. They can read each other’s data. What a Web 3.0!


This conversation has sparked some really fun ideas for me :slight_smile: I could imagine someone running a DnD game adding game details to the system prompt and generating some really interesting story elements on the fly.


Hi, perhaps this is interesting: I just stumbled upon langchain, which could be useful for integrating LLMs via node.js.
See How to Integrate AI into Your Node.js Application: A Step-by-Step Guide
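The linked guide targets Node.js; LangChain also ships a Python package, and the minimal integration looks roughly like this sketch (the model name is a placeholder):

```python
# Minimal LangChain example: wrap an OpenAI chat model and send it one prompt.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name
reply = llm.invoke("Explain what a tiddler is in one sentence.")
print(reply.content)
```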