WikiSage -- Your TiddlyWiki AI Companion

Addendum:

When using the TiddlyTools decrypt[...] filter, if the “password” you provide is incorrect, the filter returns the text:

CORRUPT: ccm: tag doesn't match

(this is the value returned from the underlying JavaScript sjcl.decrypt() library function)

So, when fetching the saved key from the encrypted JSON, if that text is returned, you will want to set the $:/temp/APIKey!!key value to blank, like this:

<$button> get key
   <$action-setfield $tiddler="$:/temp/APIKey"
      key={{{ [{MyAPIKey}decrypt{$:/temp/APIKey!!password}!match[CORRUPT: ccm: tag doesn't match]] }}}/>
</$button>
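Side note: the $:/temp/APIKey!!password value used above can be captured with a standard edit-text widget; a minimal sketch (the placeholder text is just an example):

<$edit-text tiddler="$:/temp/APIKey" field="password" type="password" placeholder="decryption password"/>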

-e

I just installed it… but don’t I need the config tab to enter the API key for Gemini?

You don’t need the config tab; you can just edit the file directly :slight_smile:
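For example, assuming the key ends up in $:/plugins/NoteStreams/WikiSage/gemini-api-key (the tiddler mentioned further down in this thread), a rough sketch of a button that copies a key held in $:/temp/APIKey!!key into it:

<$button> save Gemini key
   <$action-setfield $tiddler="$:/plugins/NoteStreams/WikiSage/gemini-api-key" text={{$:/temp/APIKey!!key}}/>
</$button>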

But this should be far more helpful for you, and it’s packaged all pretty, so you should be able to use the config tab to enter your API key:

$__plugins_NoteStreams_WikiSage.tid (1.4 MB)

I’ve also included a “Toggles” tab that allows you to toggle on and off certain features.

Let me know what you think :slight_smile:

This is a really great plugin! Thank you a lot!
I have a question about the instruction in the config tab.
It seems that it has no effect on my prompts.
My instruction is something like “read my tiddlers before answering”,
but if I ask a question about a topic of my wiki, the answer has absolutely nothing to do with my wiki.
How can I force GPT to know my content by default?
Is the instruction added to the messages array as a system prompt?

Ok, it seems to work with this instruction:
“Always Search for relevant tiddlers and Extract their content.”

A preview of one of the many new features that will be in the new release:

(animated screen capture: vivaldi_ZxXtgRPdmC)

@Michael_Kohlhaas, you may appreciate one of the upcoming release features:

It allows you to add multiple tiddlers that you would specifically like to reference, along with instructions, in addition to the standard conversation.

Eagle-eyed viewers will notice that it also includes an export-conversation feature, which is going to be standard in all future widgets.

Hi @well-noted, thank you for the update. I finally had the time to test it, and it is great to have the free option of Gemini in it.
Alas, I found that using it in my production wikis is still a problem, because at the moment the API keys are saved whenever I save on the go.
The transclusion workaround does not work either. So could you please make the plugin use a temporary tiddler to store the key… and perhaps the mechanism suggested by @EricShulman to insert it from an encrypted value?
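For reference, a rough sketch of that combination, reusing the decrypt filter from the example above and assuming the plugin reads its key from $:/plugins/NoteStreams/WikiSage/gemini-api-key: decrypt the stored key into that tiddler for the current session, and blank it out again before saving:

<$button> load key for this session
   <$action-setfield $tiddler="$:/plugins/NoteStreams/WikiSage/gemini-api-key"
      text={{{ [{MyAPIKey}decrypt{$:/temp/APIKey!!password}!match[CORRUPT: ccm: tag doesn't match]] }}}/>
</$button>

<$button> clear key before saving
   <$action-setfield $tiddler="$:/plugins/NoteStreams/WikiSage/gemini-api-key" text=""/>
</$button>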

In the version of the plugin I installed, I can use the new Gemini API only in the coding wizard. Is there a version where I can use it with the normal interface?
It would be cool to be able to use the excise function on the results.

Yes, you should be able to use the normal interface now, but the next release should work out of the box.

Can you explain what isn’t working about the Gemini models in the normal interface? The model appears in the model list, but it does not work?

If I can think of a way to make this a mode-switching toggle, I will do so. In standard production mode I think it makes the most sense to have the key stored in a non-temporary tiddler; if there’s some way to offer a “use temporary tiddler instead” option in the settings, I will do so.
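One possible shape for that setting, sketched with hypothetical names (only the gemini-api-key tiddler below appears elsewhere in this thread): a config flag that decides which tiddler the key is read from:

<!-- hypothetical flag: set to "yes" to keep the key in a temporary tiddler -->
<$let keySource={{{ [[$:/config/WikiSage/UseTempKey]get[text]match[yes]then[$:/temp/WikiSage/api-key]else[$:/plugins/NoteStreams/WikiSage/gemini-api-key]] }}}>
   <!-- the chat widget would then read the key via {{{ [<keySource>get[text]] }}} -->
</$let>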

In the dropdown of $:/plugins/NoteStreams/WikiSage/WikiSage-ui I just have the option of Gemini 2.5 Flash and three OpenAI LLMs (not like in the AI coder). If I choose Gemini, it tells me to enter an OpenAI key, even though the Gemini key is present and working in the AI coder.

I was not able to find the beautiful interface shown here in my version.

Try to find your $:/plugins/NoteStreams/WikiSage/model-list file and add models manually. My list is:

gemini-2.5-pro-exp-03-25
gemini-2.5-flash-preview-04-17
gpt-4.1-nano
gpt-4.1-mini
gpt-4.1
gpt-4o-mini
o4-mini-2025-04-16
gpt-4o-mini-tts
gpt-4.5-preview-2025-02-27
gpt-4.5-preview
chatgpt-4o-latest
gpt-4o
gpt-4
gpt-3.5-turbo
gpt-4o-2024-08-06
gpt-4.5-preview-2025-02-27
dall-e-3
o1-preview
claude-3-opus-20240229
claude-3-sonnet-20240229
claude-3-haiku-20240307
claude-3-5-haiku-latest
claude-3-5-sonnet-latest

That should fix the availability of the models in the list.

I don’t know why the AI coder would work while the chat would not; I would start by looking at the tiddler $:/plugins/NoteStreams/WikiSage/gemini-api-key and seeing if it’s actually there.
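A quick way to check, pasted into any scratch tiddler:

<$list filter="[[$:/plugins/NoteStreams/WikiSage/gemini-api-key]has[text]]" emptyMessage="key tiddler is missing or empty">
key tiddler exists and has content
</$list>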

These are not problems I’ve encountered, so I’ll make sure to do some more extensive testing before the next release :slight_smile:

That will be in the next release, as well as the journal widgets :slight_smile:

The complete model list is visible in the coder sidebar, but not in the tiddler that opens when I click the brain button.

Interesting. It must be getting that list from somewhere; do a shadow and/or system tiddler search for model-list and see if there are any duplicates.

It’s possible that there is some kind of formatting error in the old widget file that I’ve subsequently fixed, such as Notestreams vs. NoteStreams.

If there is a second file, just update that model-list as well.
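For the search, pasting this filter into the Filter tab of Advanced Search should list every ordinary or shadow tiddler whose title contains model-list:

[all[tiddlers+shadows]search:title[model-list]]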

I would rather wait for the new beautiful interface :wink:

psst $__plugins_NoteStreams_WikiSage 0.9.11.tid (1.5 MB) cough untested outside my wiki cough

https://talk.tiddlywiki.org/uploads/default/original/2X/7/74f5fdd768f55d0bb42e89cb781ae54ebc895c38.png

But I still cannot find this agent… and the model list in $:/plugins/NoteStreams/WikiSage/WikiSage-ui still has only four items.

Yes, you’ll need to update that manually.

Is it possible to use a locally running LLM with this? LM Studio doesn’t give me an API key, but it can run a local server, and as far as I know it offers the same way of interacting that OpenAI does.
If so, how do I configure it?