Hi folks,
I want to use the OpenAI API to submit a query to ChatGPT and receive the response in a new tiddler.
I’ve been working on integrating ChatGPT (and others, like Perplexity and Gemini Advanced) with TiddlyWiki, mostly by processing the exported JSON files that OpenAI provides.
I’ve been newly inspired by this video, which shows a user chatting with ChatGPT and receiving its responses inside an Emacs buffer. https://www.youtube.com/watch?v=bsRnh_brggM
I want to do the same thing within TiddlyWiki.
Thoughts?
Here are four topics where the community has discussed ChatGPT and TiddlyWiki:
As the title says, I made a simple ChatGPT plugin where you can customize the parameters and behavior of ChatGPT. Right now it only supports a single round of conversation: it will only answer your last question and will not analyze your chat history. I will refine this feature later.
Another reason why it’s called “simple” is that it does nothing beyond working inside TiddlyWiki. I think it could integrate more deeply with TiddlyWiki, like analyzing your tiddlers. But I don’t have a specifi…
Steve, I am very interested in your work here because I too make use of ChatGPT and save the results in a TiddlyWiki. I would be happy to help refine the import process if you can give me:
Some sample data with new and duplicate conversations I can import to a copy of your demo site.
I may be able to build on this to achieve the duplicate removal. I think it’s fair to reject tiddlers whose title matches an existing conversation code.
However, I am already quite happy with a simpler use of TiddlyWiki for Chat…
I have done some experiments with calling the OpenAI GPT-3 API (which predates ChatGPT). It’s a pretty straightforward JSON REST API, and crucially it is CORS-enabled, so it can be used from single-file wikis.
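Since the API is CORS-enabled, a single-file wiki can call it directly from wikitext. Here is a minimal, untested sketch using the tm-http-request message introduced in TiddlyWiki 5.3.0; the model name, tiddler title, and hard-coded prompt are my own placeholder choices, and you would substitute your own API key:

```
\procedure openai-completion()
<!-- status and data are variables set for us by the tm-http-request handler -->
<$list filter="[<status>compare:number:gteq[200]compare:number:lteq[299]]" variable="ignore">
  <!-- Save the assistant's reply into a new tiddler -->
  <$action-createtiddler $basetitle="ChatGPT Response" text={{{ [<data>jsonget[choices],[0],[message],[content]] }}}/>
</$list>
\end

<$button>
<$action-sendmessage
  $message="tm-http-request"
  method="POST"
  url="https://api.openai.com/v1/chat/completions"
  header-content-type="application/json"
  header-authorization="Bearer YOUR-API-KEY"
  body='{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello from TiddlyWiki"}]}'
  oncompletion=<<openai-completion>>
/>
Ask ChatGPT
</$button>
```

The jsonget path follows the shape of the chat completions response (`choices[0].message.content`); if the request fails, the status check simply skips the tiddler creation.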
I have been exploring the use of ChatGPT in general, learning how best it can be used and what its strengths and limitations are. I am using TiddlyWiki to store knowledge obtained through ChatGPT and exploring what it knows about TiddlyWiki.
I thought I would start this discussion to share our experiences and observations, and to discover what supporting tools, if any, and other issues arise.
If you want to share results it could get quite verbose, so consider sharing the question you put rather than the answers.
A few…
I don’t have any idea about your question, as I’ve entirely avoided Generative AI, but I want to offer a little celebration on your having opened the ten thousandth topic in the forum!
You can use an action widget to send an HTTP request; see the example in the hyper-tabld plugin doc - LoadPivotTableExampleJSON
I used this technique to write GitHub - tiddly-gittly/ai-actionstring (AI actions for TiddlyWiki CommandPalette), but it is not finished yet. You can get some idea from it:
caption: <<lingo ShowControlPanel $:/plugins/linonetwo/ai-actionstring/language/>>
tags: $:/tags/Actions
title: $:/plugins/linonetwo/ai-actionstring/Qwen
\procedure qwenTextGenerate(promptValue)
\procedure completion()
\import [subfilter{$:/core/config/GlobalImportFilter}]
<$action-log msg="In completion"/>
<!-- Success -->
<$list filter="[<status>compare:number:gteq[200]compare:number:lteq[299]]" variable="ignore">
<$action-log msg="Generated text" text={{{ [<data>jsonget[output],[text]trim[]] }}}/>
<!-- Save the generated text -->
<$action-createtiddler $basetitle="$:/temp/TestQwenAPI/result" $overwrite="yes" text={{{ [<data>jsonget[output],[text]trim[]] }}}/>
</$list>
<!-- Failure -->
<$list filter="[<status>compare:number:lt[200]] [<status>compare:number:gt[299]]" variable="ignore">
<$action-log msg="API request failed" status=<<status>> statusText=<<statusText>> error=<<error>>/>
</$list>
\end completion
(This file has been truncated; see the linked repository for the full code.)
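The truncated part is presumably the body of qwenTextGenerate itself, i.e. the action that actually sends the request and wires up the completion procedure shown above. With the same tm-http-request message it would look roughly like this; the endpoint URL and request-body fields here are placeholders I inferred from the `jsonget[output],[text]` path, not copied from the plugin:

```
<!-- Sketch only: this goes inside qwenTextGenerate, after \end completion. -->
<!-- Substitute the real Qwen endpoint and your own API key. -->
<$action-sendmessage
  $message="tm-http-request"
  method="POST"
  url="https://example.invalid/qwen-text-generation"
  header-content-type="application/json"
  header-authorization="Bearer YOUR-API-KEY"
  body=`{"input":{"prompt":"$(promptValue)$"}}`
  oncompletion=<<completion>>
/>
\end qwenTextGenerate
```

The backtick-quoted body uses TiddlyWiki 5.3's substituted attribute values to splice the promptValue parameter into the JSON payload.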
If only I had learned Emacs!