Use the HTTP request message action widget for LLM AI, has anyone tried this before?

While creating Command Palette plugin v1.0, I found that the referenced “Souk21/TW-commandpalette” plugin has a feature to call an actionString (a tiddler of action widgets).

I think actionString + command palette may be a way to trigger AI generation, with the currentTiddler as context.

I let GPT-4o generate an example API call to the Qwen API:

\procedure apikey() sk-xxxxxxxxxxx

\procedure qwenTextGenerate(promptValue)

\procedure completion()
\import [subfilter{$:/core/config/GlobalImportFilter}]
<$action-log msg="In completion"/>
<!-- Success -->
<$list filter="[<status>compare:number:gteq[200]compare:number:lteq[299]]" variable="ignore">
	<$action-log msg="Generated text" text={{{ [<data>jsonget[output],[text]trim[]] }}}/>
	<!-- Save the generated text -->
	<$action-createtiddler $basetitle="$:/temp/TestQwenAPI/result" $overwrite="yes" text={{{ [<data>jsonget[output],[text]trim[]] }}}/>
</$list>
<!-- Failure -->
<$list filter="[<status>compare:number:lt[200]] [<status>compare:number:gt[299]]" variable="ignore">
	<$action-log msg="API request failed" status=<<status>> statusText=<<statusText>> error=<<error>>/>
</$list>
\end completion

<!-- DashScope text-generation endpoint; check the Alibaba Cloud docs for your account/region -->
\procedure request-url() https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation

\procedure request-body()
{
	"model": "qwen-max",
	"input": {
		"messages": [
			{
				"role": "system",
				"content": "You are a professional translator, specializing in using AI tools to translate the content I input. Target language: Chinese. Optimization points: grammar correction, conforming to normal Chinese expression, adapting to Chinese culture. Requirement: Try to use the professional terms in the file I uploaded, but if there is a serious conflict in meaning, do not adhere to the translation in the file. Special attention: Maintain the original meaning, optimize language fluency and accuracy. This is content from the game CDDA Cataclysm, ensure it fits a world after a zombie virus outbreak. Only output the translated content, do not provide any explanation."
			},
			{
				"role": "user",
				"content": "<<promptValue>>\n\nTranslation:"
			}
		]
	},
	"parameters": {
		"max_tokens": 100,
		"temperature": 0.7
	}
}
\end request-body

<!-- Wikify the URL and body, then send the HTTP request -->
<$wikify name="url" text=<<request-url>>>
<$wikify name="body" text=<<request-body>>>
<$action-sendmessage
	$message="tm-http-request"
	method="POST"
	url=<<url>>
	body=<<body>>
	header-content-type="application/json"
	header-Authorization=`Bearer $(apikey)$`
	oncompletion=<<completion>>
/>
</$wikify>
</$wikify>

\end qwenTextGenerate

\procedure display-input()
<$edit-text tiddler="$:/temp/TestQwenAPI/prompt" tag="input" placeholder="Please enter the text to translate"/>
<$button>
	Translate
	<$transclude $variable="qwenTextGenerate" promptValue={{$:/temp/TestQwenAPI/prompt}}/>
</$button>
\end display-input

!! Qwen Translation API Call Example

<<display-input>>

!! Result

{{$:/temp/TestQwenAPI/result}}
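Before wiring up the whole pipeline, a minimal smoke test of the `tm-http-request` message can help isolate problems. This is just a sketch (the URL and log messages are placeholders); it fires a GET on click and logs the outcome:

```
\procedure smoke-completion()
\import [subfilter{$:/core/config/GlobalImportFilter}]
<$action-log msg="HTTP completion" status=<<status>> statusText=<<statusText>> data=<<data>>/>
\end smoke-completion

<$button>
	Test HTTP request
	<$action-sendmessage
		$message="tm-http-request"
		method="GET"
		url="https://example.com"
		oncompletion=<<smoke-completion>>
	/>
</$button>
```

The `oncompletion` actions run with the variables `status`, `statusText`, `error`, `data` and `headers` set, which is what the completion procedure in the full example relies on.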
It is quite long; after several rounds of editing and arguing with the LLM, it finally works.

Just wondering, has anyone tried using this message for AI before?

It cannot be used in a website-hosted HTML wiki due to CORS: the browser blocks the request unless the LLM server sends the appropriate Access-Control-Allow-Origin headers, and most do not.

But HTTP requests to an LLM server work fine in a dedicated launcher app like TidGi-Desktop/Mobile or TiddlyDesktop, which are not subject to the browser's CORS policy.

Don't trust AI-generated wikitext: it may create many tiddlers and mess up your wiki.


Is authentication possible? I'm struggling with that portion.

See this part (the `header-Authorization` attribute above). But this is API-specific; at least the Qwen API asks the caller to write it this way.

But if I have authentication instead of an API token… then how can I proceed?
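If the endpoint uses HTTP Basic auth rather than a bearer token, the same `header-Authorization` attribute works: the value becomes `Basic <base64 of user:pass>` instead of `Bearer <token>`. Separately, `tm-http-request` accepts `password-header-*` attributes that read the header value from TiddlyWiki's password store instead of embedding it in wikitext. A hedged sketch (the store entry name `qwen-api-key` is just an example; the entry should contain the full header value):

```
<$action-sendmessage
	$message="tm-http-request"
	method="POST"
	url=<<url>>
	body=<<body>>
	header-content-type="application/json"
	password-header-Authorization="qwen-api-key"
/>
```

Cookie/session-based logins are harder: the browser manages the session cookie itself, and cross-origin requests will not include it unless the server opts in, so an API token or a local proxy is usually the practical route.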

API endpoint authentication - #2 by yedhukrishna,

Could you please see this? The method you suggested won't work here, right?
Or is there any other way?