Hey,
I get the same error via operation; when you click on query it works well! Perhaps you can fix it with this info.
Best
Hi @well-noted,
I'm trying to make this work with LM Studio and getting the following error:
WikiSage says "Error: Failed to fetch"
and LM Studio says "'messages' field is required".
I got them to connect by simply finding and replacing all the links to OpenAI with the localhost address where LM Studio runs, then adding the correct models in WikiSage and entering "lm-studio" as an OpenAI API key, but I can't figure out this "messages" problem.
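For reference, I believe the models the server exposes can be listed with something like this (assuming LM Studio's default 127.0.0.1:1234 address), which at least shows whether the server is reachable outside the plugin:

curl http://127.0.0.1:1234/v1/models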
In the LM Studio docs I found the following:
curl http://127.0.0.1:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "messages": [
      { "role": "system", "content": "Always answer in rhymes." },
      { "role": "user", "content": "Introduce yourself." }
    ],
    "temperature": 0.7,
    "max_tokens": -1,
    "stream": true
  }'
But as far as I can tell, what WikiSage sends looks very similar to this. When I tried to add double quotes where there weren't any in WikiSage, the whole plugin stopped working, so I'm guessing that's not the problem.
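In case it's useful for comparison, this is roughly the request I'd expect to work, stripped down from the docs snippet above (the model name is just the one from the example, the "lm-studio" key is only a placeholder since as far as I know the local server ignores it, and I've turned streaming off):

curl http://127.0.0.1:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer lm-studio" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "messages": [
      { "role": "user", "content": "Say hello." }
    ],
    "stream": false
  }'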
Can you help?
Edit: I see earlier you said that if you were to add local LLMs you'd start with Ollama. With that one I couldn't make a connection between them at all, but I'm no programmer and just poking around. I can't add an API key because Ollama doesn't provide one, so WikiSage just says missing API key.
Hi, I just upgraded to the latest version. I've noticed that when using the widget it still tries to format the output in TiddlyWiki wikitext instead of just returning the raw markdown. Are there plans to fix that in future updates?