ChatGPT for TiddlyWiki

As the title says, I made a simple ChatGPT plugin that lets you customize the parameters and behavior of ChatGPT. Right now it only supports a single round of conversation: it answers only your last question and does not analyze your chat history. I will refine this later.

Another reason it’s called “simple” is that it does nothing beyond working inside TiddlyWiki. I think it could integrate more deeply with TiddlyWiki, such as analyzing your tiddlers, but I don’t have a specific application in mind at the moment.

In addition, you need to register for an OpenAI account and create an API key, which is required for the plugin to work.

You can download version v0.0.1 from here:

$__plugins_Gk0Wk_chat-gpt.json (16.3 KB)

However, I strongly recommend that you use the CPL for plugin downloads and updates, so that you always have the latest version of the plugin:


@Sttot I tried it out and it is working. I used the widget in different tiddlers with separate state tiddlers to keep the history, and that worked. I even tried it with Streams and it worked there as well, although I had to make the height a bit smaller.

It would be nice to have a clear button next to the chat box or somewhere else to clear the history.

It is now possible to re-edit questions and delete them:

I also added some system events to the widget to make programmatic extensions easier:

For example, reading the answer aloud with TTS. @linonetwo might work on that.


With Text to Speech widget, use Web Speech synthesis (TTS), free and offline

This does not seem to work in v0.0.4; the action widget is not invoked.

<$chat-gpt system_message="Say Hello World t">
	<$speech-synthesis text=<<output-text>> />
</$chat-gpt>

@Sttot I have noticed the responses are shorter and more to the point when using the API vs. the site. Why is this, and are there parameters that can help with this?

Separately, one query produced a bullet list whose items all appeared within a single set of paragraph tags, so new bullets did not start on new lines. I added the CSS style white-space: pre-wrap to work around it. A few minutes later, a different response returned the bullets in a proper <ul>, making my white-space rule unnecessary. Any ideas why?

Thanks for this.

Directly calling the API will be faster than using the web version of ChatGPT. Additionally, if you want to change the AI’s speaking style and manner, you can define the system_message.

What is a System Message?

A system message is a feature of ChatGPT that allows developers to customize the language style and behavior of the AI’s responses. It is an optional parameter that can be supplied when calling the API, and it modifies the default behavior of the ChatGPT model when it generates a conversational response.

Developers can define specific parameters such as tone, emotions, or specific words to include or exclude from the AI response. This customization provides more control over the conversational flow and allows the developer to tailor the AI response to meet specific needs or use cases.

(Answered by ChatGPT)
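For reference, here is a minimal sketch of a Chat Completions request that includes a system message (the widget code posted later in this thread builds the same kind of payload). The API key, model, and question are placeholders:

```js
// Minimal sketch of a Chat Completions request with a system message.
// The API key, model, and question below are placeholders.
const apiKey = "sk-...";  // your own OpenAI API key
const body = {
  model: "gpt-3.5-turbo",
  messages: [
    // The system message steers tone and format for the whole conversation.
    { role: "system", content: "You know much about TiddlyWiki. Answer in Markdown." },
    // The user message is the actual question.
    { role: "user", content: "How do I tag a tiddler?" }
  ]
};
fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: { "Authorization": "Bearer " + apiKey, "Content-Type": "application/json" },
  body: JSON.stringify(body)
}).then(response => response.json())
  .then(data => console.log(data.choices[0].message.content));
```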

The systemMessage used in the sidebar example is as follows:

You known much on TiddlyWiki. You should answer in Markdown format.

This helps to provide a more consistent experience for the user while also freeing up resources for other tasks.

Similarly, by setting the max_tokens parameter, it is possible to limit the length of ChatGPT’s responses. This can help prevent the AI from going off on tangents or producing excessively long messages that may be difficult for the user to process.
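For illustration, the widget code posted later in this thread only forwards max_tokens to the API when it parses as a positive integer; a condensed sketch of that check:

```js
// Condensed sketch of the max_tokens handling used in the widget code later
// in this thread: anything that is not a positive integer is dropped, so the
// API falls back to its default (no explicit length limit).
function parseMaxTokens(attrValue) {
  const n = parseInt(attrValue, 10);
  return Number.isSafeInteger(n) && n > 0 ? n : undefined;
}

parseMaxTokens("150");  // 150 -> sent to the API
parseMaxTokens("0");    // undefined -> omitted from the request
parseMaxTokens("abc");  // undefined -> omitted from the request
```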


Any practical examples?

@Sttot Both of these parameters (max_tokens, system message) seemed to address my concerns. Thanks.

I failed to save my examples. I have not seen the format issue since adding the parameters.

One more question: I added a button to clear all questions in the history, but it does not work. What actions can be put on a button to clear or delete all questions? Thanks again.


@clsturgeon

If you defined the history attribute, just delete the tiddler it points to.
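For example, assuming a hypothetical history="MyChatHistory", the history tiddler can be deleted from JavaScript as below; a button using the core $action-deletetiddler widget does the same from wikitext:

```js
// Sketch: clear the chat history by deleting the tiddler that the widget's
// history attribute points to. "MyChatHistory" is a hypothetical title.
$tw.wiki.deleteTiddler("MyChatHistory");
```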

That was one of the options I tried. It produced a JS error: Uncaught TypeError: Cannot read properties of undefined (reading 'insertBefore')

at ChatGPTWidget.render (widget.js:2:7068)

This line reads:

t.insertBefore(i, e);

Hi, late to the ChatGPT party. When I use the plugin, I only get a “Thinking…” response, and no response from ChatGPT.

@stevesuny I get this too. For me it suggests my API key has expired.

Hi, returning to this thread one year later.

Here is a demo of the plugin: https://openai-api-tiddlywiki.tiddlyhost.com/

You’ll need to paste in your own OpenAI key to see that, well, it doesn’t work. With a valid key (I’ve used it in a shell script) I get “Thinking…”. Without a key, nothing is returned. So something is happening!

Great work, and hoping this can get fixed!

Thanks, //steve.

Longtime lurker, first-time poster (so I cannot post attachments yet). If you are still looking to make this work, I have come up with a solution that works in my test environment.

Update $:/plugins/Gk0Wk/chat-gpt/widget.js

"use strict";

var import_widget = require("$:/core/modules/widgets/widget.js");

const CHAT_COMPLETION_URL = "https://api.openai.com/v1/chat/completions";

class ChatGPTWidget extends import_widget.widget {
  constructor() {
    super(...arguments);
    this.containerNodeTag = "div";
    this.containerNodeClass = "";
    this.tmpHistoryTiddler = "$:/temp/Gk0Wk/ChatGPT/history-" + Date.now();
    this.historyTiddler = this.tmpHistoryTiddler;
    this.chatButtonText = $tw.wiki.getTiddlerText("$:/core/images/add-comment");
    this.scroll = false;
    this.readonly = false;
    this.chatGPTOptions = {};
    this.systemMessage = "";
  }

  initialise(parseTreeNode, options) {
    super.initialise(parseTreeNode, options);
    this.computeAttributes();
  }

  execute() {
    this.containerNodeTag = this.getAttribute("component", "div");
    this.containerNodeClass = this.getAttribute("className", "");
    this.historyTiddler = this.getAttribute("history", "") || this.tmpHistoryTiddler;
    this.scroll = this.getAttribute("scroll", "").toLowerCase() === "yes";
    this.readonly = this.getAttribute("readonly", "").toLowerCase() === "yes";

    const temperature = Number(this.getAttribute("temperature"));
    const top_p = Number(this.getAttribute("top_p"));
    const max_tokens = parseInt(this.getAttribute("max_tokens"), 10);
    const presence_penalty = Number(this.getAttribute("presence_penalty"));
    const frequency_penalty = Number(this.getAttribute("frequency_penalty"));

    this.chatGPTOptions = {
      model: this.getAttribute("model", "gpt-4o-mini"),
      temperature: temperature >= 0 && temperature <= 2 ? temperature : undefined,
      top_p: top_p >= 0 && top_p <= 1 ? top_p : undefined,
      max_tokens: Number.isSafeInteger(max_tokens) && max_tokens > 0 ? max_tokens : undefined,
      presence_penalty: presence_penalty >= -2 && presence_penalty <= 2 ? presence_penalty : undefined,
      frequency_penalty: frequency_penalty >= -2 && frequency_penalty <= 2 ? frequency_penalty : undefined,
      user: this.getAttribute("user")
    };

    this.systemMessage = this.getAttribute("system_message", "");
  }

  render(parent, nextSibling) {
  if (!$tw.browser) return;

  this.parentDomNode = parent;
  this.computeAttributes();
  this.execute();

  const container = $tw.utils.domMaker(this.containerNodeTag, {
    class: "gk0wk-chatgpt-container " + this.containerNodeClass
  });

  try {
    const conversationsContainer = $tw.utils.domMaker("div", {
      class: this.scroll ? "conversations-scroll" : "conversations"
    });
    container.appendChild(conversationsContainer);

    if (!this.readonly) {
      const chatBox = this.createChatBox(conversationsContainer);
      container.appendChild(chatBox);
    }

    this.renderHistory(conversationsContainer);
  } catch (error) {
    console.error(error);
    container.textContent = String(error);
  }

  if (this.domNodes.length === 0) {
    parent.insertBefore(container, nextSibling);
    this.domNodes.push(container);
  } else {
    this.refreshSelf();
  }
}


  createChatBox(conversationsContainer) {
    const chatBox = $tw.utils.domMaker("div", { class: "chat-box" });
    const input = $tw.utils.domMaker("input", {
      class: "chat-input",
      attributes: {
        type: "text",
        placeholder: "Ask a question..."
      }
    });
    const button = $tw.utils.domMaker("button", {
      class: "chat-button",
      text: "Send",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        padding: "0 15px"
      }
    });

    chatBox.appendChild(input);
    chatBox.appendChild(button);

    const clearButton = $tw.utils.domMaker("button", {
      class: "chat-button clear-history",
      text: "Clear",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        marginLeft: "10px",
        padding: "0 15px",
        backgroundColor: "#f44336",
        color: "white",
        border: "none",
        cursor: "pointer"
      }
    });

    clearButton.onclick = () => {
      this.clearChatHistory(conversationsContainer);
    };

    chatBox.appendChild(clearButton);

    let isProcessing = false;

    const sendMessage = async () => {
      if (isProcessing) return;

      const apiKey = $tw.wiki.getTiddlerText("$:/plugins/Gk0Wk/chat-gpt/openai-api-key", "").trim();
      if (!apiKey) {
        alert("Please set your OpenAI API key in the plugin settings.");
        return;
      }

      const message = input.value.trim();
      if (!message) return;

      input.value = "";
      isProcessing = true;
      button.disabled = true;

      const conversation = this.createConversationElement(message);
      conversationsContainer.appendChild(conversation);

      try {
        await this.fetchChatGPTResponse(apiKey, message, conversation);
        this.saveConversationHistory(message, conversation.querySelector(".chatgpt-conversation-assistant").innerHTML);
      } catch (error) {
        console.error(error);
        this.showError(conversation, error.message);
      } finally {
        isProcessing = false;
        button.disabled = false;
      }
    };

    button.onclick = sendMessage;
    input.addEventListener("keydown", (event) => {
      if (event.key === "Enter" && !event.shiftKey) {
        event.preventDefault();
        sendMessage();
      }
    });

    return chatBox;
  }

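  // Send the request to the OpenAI API with stream: true and read the
  // Server-Sent Events response, rendering the partial Markdown as it arrives.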
  async fetchChatGPTResponse(apiKey, message, conversationElement) {
  const assistantMessageElement = conversationElement.querySelector(".chatgpt-conversation-assistant");
  const messages = [];
  
  if (this.systemMessage) {
    messages.push({ role: "system", content: this.systemMessage });
  }

  messages.push({ role: "user", content: message });

  const response = await fetch(CHAT_COMPLETION_URL, {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      ...this.chatGPTOptions,
      messages: messages,
      stream: true
    })
  });

  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder("utf-8");
  let buffer = "";
  let fullResponse = "";

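  // The stream arrives as SSE lines of the form "data: {...}". A network chunk
  // may end mid-line, so keep the trailing partial line in `buffer` until the
  // next chunk completes it, and stop when the "[DONE]" sentinel is received.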
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop();

    for (const line of lines) {
      if (line.startsWith("data: ")) {
        const data = line.slice(6);
        if (data === "[DONE]") {
          return;
        }
        try {
          const parsed = JSON.parse(data);
          const content = parsed.choices[0].delta.content || "";
          fullResponse += content;
          assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
        } catch (error) {
          console.error("Error parsing SSE message:", error);
        }
      }
    }
  }

  // Handle any remaining buffer
  if (buffer.length > 0) {
    try {
      const parsed = JSON.parse(buffer);
      const content = parsed.choices[0].delta.content || "";
      fullResponse += content;
      assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
    } catch (error) {
      console.error("Error parsing final buffer:", error);
    }
  }

  this.saveConversationHistory(message, fullResponse);
}

 

  renderMarkdownWithLineBreaks(text) {
    // Replace newlines with <br> tags, but preserve code blocks
    const codeBlockRegex = /```[\s\S]*?```/g;
    const codeBlocks = text.match(codeBlockRegex) || [];
    let index = 0;
    let result = text.replace(codeBlockRegex, () => `__CODE_BLOCK_${index++}__`);
    
    // Replace newlines with <br> tags outside of code blocks
    result = result.replace(/\n/g, '<br>');
    
    // Restore code blocks
    codeBlocks.forEach((block, i) => {
      result = result.replace(`__CODE_BLOCK_${i}__`, block);
    });
    
    // Use TiddlyWiki's built-in markdown parser
    return $tw.wiki.renderText("text/html", "text/x-markdown", result);
  }

  createConversationElement(message) {
    const conversation = $tw.utils.domMaker("div", { class: "chatgpt-conversation" });
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-user",
      children: [$tw.utils.domMaker("p", { text: message })]
    }));
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-assistant"
    }));
    return conversation;
  }

  showError(conversationElement, errorMessage) {
    const errorElement = $tw.utils.domMaker("div", {
      class: "chatgpt-conversation-error",
      children: [
        $tw.utils.domMaker("p", { text: "Error: " + errorMessage })
      ]
    });
    conversationElement.querySelector(".chatgpt-conversation-assistant").appendChild(errorElement);
  }

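  // Append the latest exchange to the history tiddler (stored as a JSON array)
  // and re-render the conversation list from that tiddler.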
  saveConversationHistory(userMessage, assistantMessage) {
  let history = [];
  try {
    history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
  } catch (error) {
    console.error("Error parsing conversation history:", error);
  }

  history.push({
    id: Date.now().toString(),
    created: Date.now(),
    user: userMessage,
    assistant: assistantMessage
  });

  $tw.wiki.addTiddler(new $tw.Tiddler({
    title: this.historyTiddler,
    text: JSON.stringify(history)
  }));

  // Re-render the conversation
  const conversationsContainer = this.domNodes[0].querySelector(".conversations, .conversations-scroll");
  conversationsContainer.innerHTML = '';
  this.renderHistory(conversationsContainer);
}


  renderHistory(container) {
    let history = [];
    try {
      history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
    } catch (error) {
      console.error("Error parsing conversation history:", error);
    }

    for (const entry of history) {
      const conversation = this.createConversationElement(entry.user);
      conversation.querySelector(".chatgpt-conversation-assistant").innerHTML = entry.assistant;
      container.appendChild(conversation);
    }
  }

  clearChatHistory(conversationsContainer) {
    $tw.wiki.deleteTiddler(this.historyTiddler);
    conversationsContainer.innerHTML = '';
  }

 refresh(changedTiddlers) {
    const changedAttributes = this.computeAttributes();
    if (Object.keys(changedAttributes).length > 0 || this.historyTiddler in changedTiddlers) {
      this.refreshSelf();
      return true;
    }
    return false;
  }
}


exports["chat-gpt"] = ChatGPTWidget;

Update $:/plugins/Gk0Wk/chat-gpt/chatgpt-widget.css

.gk0wk-chatgpt-container {
    height: 100%;
    width: 100%;
    display: flex;
    padding: 10px 0;
    flex-direction: column;
}

.gk0wk-chatgpt-container .conversations {
    width: 100%;
    flex-grow: 1;
}

.gk0wk-chatgpt-container .conversations-scroll {
    height: 0;
    width: 100%;
    flex-grow: 1;
    overflow-y: auto;
}

.gk0wk-chatgpt-container .chat-box {
    width: 100%;
    display: flex;
    border: 1.5px solid #888a;
    border-radius: 5px;
    background: #8881;
}

.gk0wk-chatgpt-container .chat-input {
    flex-grow: 1;
    padding-left: 10px;
    font-size: 16px;
}

.gk0wk-chatgpt-container .chat-button {
    height: 45px;
    width: 45px;
    font-size: 20px;
}

.gk0wk-chatgpt-container .chatgpt-conversation {
    display: flex;
    flex-direction: column;
}

.gk0wk-chatgpt-container .chatgpt-conversation-assistant {
    background-image: linear-gradient(0deg, #8883, #8883);
}

.gk0wk-chatgpt-container .chatgpt-conversation-error .chatgpt-conversation-assistant {
    color: red;
}

.gk0wk-chatgpt-container .chatgpt-conversation-user {
    font-weight: 750;
}

.gk0wk-chatgpt-container .chatgpt-conversation-message {
    padding: 10px 20px;
}

.gk0wk-chatgpt-container .chat-button {
    height: 45px;
    min-width: 45px;
    font-size: 14px;
    padding: 0 10px;
}

.gk0wk-chatgpt-container .clear-history {
    background-color: #f44336;
    color: white;
    border: none;
    cursor: pointer;
}

.gk0wk-chatgpt-container .clear-history:hover {
    background-color: #d32f2f;
}

Your mileage may vary.


That version includes conversation memory, by the way, so you can have an ongoing conversation (though the memory currently only persists as long as the tiddler is open).
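The history lives in whatever tiddler the history attribute points to; when the attribute is omitted, the widget falls back to a $:/temp/… tiddler stamped with Date.now(), which matches the note above that the memory only persists while the tiddler is open. As a rough sketch (assuming a hypothetical history="MyChatHistory"), the saved history can be read back like this:

```js
// Sketch: read back the saved conversation history.
// Assumes the widget was given history="MyChatHistory" (hypothetical title);
// the widget stores a JSON array of {id, created, user, assistant} entries.
const raw = $tw.wiki.getTiddlerText("MyChatHistory") || "[]";
const history = JSON.parse(raw);
history.forEach(entry => {
  console.log(new Date(entry.created).toISOString(), entry.user, "=>", entry.assistant);
});
```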

Here is a version of the code which additionally adds the ability to pass a tiddler title through for review:

"use strict";

var import_widget = require("$:/core/modules/widgets/widget.js");

const CHAT_COMPLETION_URL = "https://api.openai.com/v1/chat/completions";

class ChatGPTWidget extends import_widget.widget {
  constructor() {
    super(...arguments);
    this.containerNodeTag = "div";
    this.containerNodeClass = "";
    this.tmpHistoryTiddler = "$:/temp/Gk0Wk/ChatGPT/history-" + Date.now();
    this.historyTiddler = this.tmpHistoryTiddler;
    this.chatButtonText = $tw.wiki.getTiddlerText("$:/core/images/add-comment");
    this.scroll = false;
    this.readonly = false;
    this.chatGPTOptions = {};
    this.systemMessage = "";
  }

  initialise(parseTreeNode, options) {
    super.initialise(parseTreeNode, options);
    this.computeAttributes();
  }

  execute() {
  this.containerNodeTag = this.getAttribute("component", "div");
  this.containerNodeClass = this.getAttribute("className", "");
  this.historyTiddler = this.getAttribute("history", "") || this.tmpHistoryTiddler;
  this.scroll = this.getAttribute("scroll", "").toLowerCase() === "yes";
  this.readonly = this.getAttribute("readonly", "").toLowerCase() === "yes";
  this.tiddlerTitle = this.getAttribute("tiddlerTitle", ""); // New attribute

  const temperature = Number(this.getAttribute("temperature"));
  const top_p = Number(this.getAttribute("top_p"));
  const max_tokens = parseInt(this.getAttribute("max_tokens"), 10);
  const presence_penalty = Number(this.getAttribute("presence_penalty"));
  const frequency_penalty = Number(this.getAttribute("frequency_penalty"));

  this.chatGPTOptions = {
    model: this.getAttribute("model", "gpt-4o-mini"),
    temperature: temperature >= 0 && temperature <= 2 ? temperature : undefined,
    top_p: top_p >= 0 && top_p <= 1 ? top_p : undefined,
    max_tokens: Number.isSafeInteger(max_tokens) && max_tokens > 0 ? max_tokens : undefined,
    presence_penalty: presence_penalty >= -2 && presence_penalty <= 2 ? presence_penalty : undefined,
    frequency_penalty: frequency_penalty >= -2 && frequency_penalty <= 2 ? frequency_penalty : undefined,
    user: this.getAttribute("user")
  };

  this.systemMessage = this.getAttribute("system_message", "");
}


  render(parent, nextSibling) {
  if (!$tw.browser) return;

  this.parentDomNode = parent;
  this.computeAttributes();
  this.execute();

  const container = $tw.utils.domMaker(this.containerNodeTag, {
    class: "gk0wk-chatgpt-container " + this.containerNodeClass
  });

  try {
    const conversationsContainer = $tw.utils.domMaker("div", {
      class: this.scroll ? "conversations-scroll" : "conversations"
    });
    container.appendChild(conversationsContainer);

    if (!this.readonly) {
      const chatBox = this.createChatBox(conversationsContainer);
      container.appendChild(chatBox);
    }

    this.renderHistory(conversationsContainer);
  } catch (error) {
    console.error(error);
    container.textContent = String(error);
  }

  if (this.domNodes.length === 0) {
    parent.insertBefore(container, nextSibling);
    this.domNodes.push(container);
  } else {
    this.refreshSelf();
  }
}


  createChatBox(conversationsContainer) {
    const chatBox = $tw.utils.domMaker("div", { class: "chat-box" });
    const input = $tw.utils.domMaker("input", {
      class: "chat-input",
      attributes: {
        type: "text",
        placeholder: "Ask a question..."
      }
    });
    const button = $tw.utils.domMaker("button", {
      class: "chat-button",
      text: "Send",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        padding: "0 15px"
      }
    });

    chatBox.appendChild(input);
    chatBox.appendChild(button);

    const clearButton = $tw.utils.domMaker("button", {
      class: "chat-button clear-history",
      text: "Clear",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        marginLeft: "10px",
        padding: "0 15px",
        backgroundColor: "#f44336",
        color: "white",
        border: "none",
        cursor: "pointer"
      }
    });

    clearButton.onclick = () => {
      this.clearChatHistory(conversationsContainer);
    };

    chatBox.appendChild(clearButton);

    let isProcessing = false;

  const sendMessage = async () => {
  if (isProcessing) return;

  const apiKey = $tw.wiki.getTiddlerText("$:/plugins/Gk0Wk/chat-gpt/openai-api-key", "").trim();
  if (!apiKey) {
    alert("Please set your OpenAI API key in the plugin settings.");
    return;
  }

  let prompt = input.value.trim();
  if (!prompt) return;

  let fullMessage = prompt;

  // Fetch the tiddler content if a tiddler title is provided
  if (this.tiddlerTitle) {
    const tiddlerContent = $tw.wiki.getTiddlerText(this.tiddlerTitle, "");
    if (tiddlerContent) {
      fullMessage = `${prompt}\n\n${tiddlerContent}`;
    }
  }

  input.value = "";
  isProcessing = true;
  button.disabled = true;

  const conversation = this.createConversationElement(prompt);
  conversationsContainer.appendChild(conversation);

  try {
    await this.fetchChatGPTResponse(apiKey, fullMessage, conversation);
    this.saveConversationHistory(prompt, conversation.querySelector(".chatgpt-conversation-assistant").innerHTML);
  } catch (error) {
    console.error(error);
    this.showError(conversation, error.message);
  } finally {
    isProcessing = false;
    button.disabled = false;
  }
};


    button.onclick = sendMessage;
    input.addEventListener("keydown", (event) => {
      if (event.key === "Enter" && !event.shiftKey) {
        event.preventDefault();
        sendMessage();
      }
    });

    return chatBox;
  }

  async fetchChatGPTResponse(apiKey, message, conversationElement) {
  const assistantMessageElement = conversationElement.querySelector(".chatgpt-conversation-assistant");
  const messages = [];
  
  if (this.systemMessage) {
    messages.push({ role: "system", content: this.systemMessage });
  }

  messages.push({ role: "user", content: message });

  const response = await fetch(CHAT_COMPLETION_URL, {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      ...this.chatGPTOptions,
      messages: messages,
      stream: true
    })
  });

  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder("utf-8");
  let buffer = "";
  let fullResponse = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop();

    for (const line of lines) {
      if (line.startsWith("data: ")) {
        const data = line.slice(6);
        if (data === "[DONE]") {
          return;
        }
        try {
          const parsed = JSON.parse(data);
          const content = parsed.choices[0].delta.content || "";
          fullResponse += content;
          assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
        } catch (error) {
          console.error("Error parsing SSE message:", error);
        }
      }
    }
  }

  // Handle any remaining buffer
  if (buffer.length > 0) {
    try {
      const parsed = JSON.parse(buffer);
      const content = parsed.choices[0].delta.content || "";
      fullResponse += content;
      assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
    } catch (error) {
      console.error("Error parsing final buffer:", error);
    }
  }

  this.saveConversationHistory(message, fullResponse);
}

 

  renderMarkdownWithLineBreaks(text) {
    // Replace newlines with <br> tags, but preserve code blocks
    const codeBlockRegex = /```[\s\S]*?```/g;
    const codeBlocks = text.match(codeBlockRegex) || [];
    let index = 0;
    let result = text.replace(codeBlockRegex, () => `__CODE_BLOCK_${index++}__`);
    
    // Replace newlines with <br> tags outside of code blocks
    result = result.replace(/\n/g, '<br>');
    
    // Restore code blocks
    codeBlocks.forEach((block, i) => {
      result = result.replace(`__CODE_BLOCK_${i}__`, block);
    });
    
    // Use TiddlyWiki's built-in markdown parser
    return $tw.wiki.renderText("text/html", "text/x-markdown", result);
  }

  createConversationElement(message) {
    const conversation = $tw.utils.domMaker("div", { class: "chatgpt-conversation" });
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-user",
      children: [$tw.utils.domMaker("p", { text: message })]
    }));
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-assistant"
    }));
    return conversation;
  }

  showError(conversationElement, errorMessage) {
    const errorElement = $tw.utils.domMaker("div", {
      class: "chatgpt-conversation-error",
      children: [
        $tw.utils.domMaker("p", { text: "Error: " + errorMessage })
      ]
    });
    conversationElement.querySelector(".chatgpt-conversation-assistant").appendChild(errorElement);
  }

  saveConversationHistory(userMessage, assistantMessage) {
  let history = [];
  try {
    history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
  } catch (error) {
    console.error("Error parsing conversation history:", error);
  }

  history.push({
    id: Date.now().toString(),
    created: Date.now(),
    user: userMessage,
    assistant: assistantMessage
  });

  $tw.wiki.addTiddler(new $tw.Tiddler({
    title: this.historyTiddler,
    text: JSON.stringify(history)
  }));

  // Re-render the conversation
  const conversationsContainer = this.domNodes[0].querySelector(".conversations, .conversations-scroll");
  conversationsContainer.innerHTML = '';
  this.renderHistory(conversationsContainer);
}


  renderHistory(container) {
    let history = [];
    try {
      history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
    } catch (error) {
      console.error("Error parsing conversation history:", error);
    }

    for (const entry of history) {
      const conversation = this.createConversationElement(entry.user);
      conversation.querySelector(".chatgpt-conversation-assistant").innerHTML = entry.assistant;
      container.appendChild(conversation);
    }
  }

  clearChatHistory(conversationsContainer) {
    $tw.wiki.deleteTiddler(this.historyTiddler);
    conversationsContainer.innerHTML = '';
  }

 refresh(changedTiddlers) {
    const changedAttributes = this.computeAttributes();
    if (Object.keys(changedAttributes).length > 0 || this.historyTiddler in changedTiddlers) {
      this.refreshSelf();
      return true;
    }
    return false;
  }
}


exports["chat-gpt"] = ChatGPTWidget;

Call with <$chat-gpt tiddlerTitle="test-operation" />

example:

Please note, in both of these versions I have selected gpt-4o-mini as the model, since it is the most affordable (and quite capable, in my experience).

I believe a dropdown menu to select from models would be an excellent future feature, though you can change it manually for the time being.
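As a rough sketch of how such a picker could work (not part of the code above): store the chosen model in a hypothetical config tiddler, fall back to gpt-4o-mini when it is empty, and feed the result into the widget’s existing model attribute:

```js
// Sketch (not in the plugin): resolve the model from a hypothetical config
// tiddler so a dropdown bound to that tiddler could drive the choice.
// Falls back to gpt-4o-mini, the default used in the widget code above.
function currentModel() {
  const configured = $tw.wiki.getTiddlerText("$:/config/chat-gpt/model", "").trim();
  return configured || "gpt-4o-mini";
}
```

A core `<$select tiddler="$:/config/chat-gpt/model">` dropdown bound to that tiddler, together with model={{$:/config/chat-gpt/model}} on the widget call, would complete the picture.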

If you would like to reference a tiddler that includes transclusions, for example, you would want to make a small adjustment to the code:

const sendMessage = async () => {
  if (isProcessing) return;

  const apiKey = $tw.wiki.getTiddlerText("$:/plugins/Gk0Wk/chat-gpt/openai-api-key", "").trim();
  if (!apiKey) {
    alert("Please set your OpenAI API key in the plugin settings.");
    return;
  }

  let prompt = input.value.trim();
  if (!prompt) return;

  let fullMessage = prompt;

  // Fetch and wikify the tiddler content if a tiddler title is provided
  if (this.tiddlerTitle) {
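    // renderTiddler wikifies the tiddler to plain text, so transclusions and
    // macros are resolved before the content is sent to the API (the earlier
    // version used getTiddlerText, which returns only the raw tiddler text).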
    const tiddlerContent = $tw.wiki.renderTiddler("text/plain", this.tiddlerTitle);
    if (tiddlerContent) {
      fullMessage = `${prompt}\n\n${tiddlerContent}`;
    }
  }

  input.value = "";
  isProcessing = true;
  button.disabled = true;

  const conversation = this.createConversationElement(prompt);
  conversationsContainer.appendChild(conversation);

  try {
    await this.fetchChatGPTResponse(apiKey, fullMessage, conversation);
    this.saveConversationHistory(prompt, conversation.querySelector(".chatgpt-conversation-assistant").innerHTML);
  } catch (error) {
    console.error(error);
    this.showError(conversation, error.message);
  } finally {
    isProcessing = false;
    button.disabled = false;
  }
};

Apologies for the triple post (forgive me if I am breaking any standards or practices; I will correct as needed).

If you happen to be using the (fantastic) Streams plugin, you may want to use this version of the code, which I have modified to also check the tiddler’s stream-list field and include that content in the referenced material.

"use strict";

var import_widget = require("$:/core/modules/widgets/widget.js");

const CHAT_COMPLETION_URL = "https://api.openai.com/v1/chat/completions";

class ChatGPTWidget extends import_widget.widget {
  constructor() {
    super(...arguments);
    this.containerNodeTag = "div";
    this.containerNodeClass = "";
    this.tmpHistoryTiddler = "$:/temp/Gk0Wk/ChatGPT/history-" + Date.now();
    this.historyTiddler = this.tmpHistoryTiddler;
    this.chatButtonText = $tw.wiki.getTiddlerText("$:/core/images/add-comment");
    this.scroll = false;
    this.readonly = false;
    this.chatGPTOptions = {};
    this.systemMessage = "";
  }

  initialise(parseTreeNode, options) {
    super.initialise(parseTreeNode, options);
    this.computeAttributes();
  }

  execute() {
  this.containerNodeTag = this.getAttribute("component", "div");
  this.containerNodeClass = this.getAttribute("className", "");
  this.historyTiddler = this.getAttribute("history", "") || this.tmpHistoryTiddler;
  this.scroll = this.getAttribute("scroll", "").toLowerCase() === "yes";
  this.readonly = this.getAttribute("readonly", "").toLowerCase() === "yes";
  this.tiddlerTitle = this.getAttribute("tiddlerTitle", ""); // New attribute

  const temperature = Number(this.getAttribute("temperature"));
  const top_p = Number(this.getAttribute("top_p"));
  const max_tokens = parseInt(this.getAttribute("max_tokens"), 10);
  const presence_penalty = Number(this.getAttribute("presence_penalty"));
  const frequency_penalty = Number(this.getAttribute("frequency_penalty"));

  this.chatGPTOptions = {
    model: this.getAttribute("model", "gpt-4o-mini"),
    temperature: temperature >= 0 && temperature <= 2 ? temperature : undefined,
    top_p: top_p >= 0 && top_p <= 1 ? top_p : undefined,
    max_tokens: Number.isSafeInteger(max_tokens) && max_tokens > 0 ? max_tokens : undefined,
    presence_penalty: presence_penalty >= -2 && presence_penalty <= 2 ? presence_penalty : undefined,
    frequency_penalty: frequency_penalty >= -2 && frequency_penalty <= 2 ? frequency_penalty : undefined,
    user: this.getAttribute("user")
  };

  this.systemMessage = this.getAttribute("system_message", "");
}


  render(parent, nextSibling) {
  if (!$tw.browser) return;

  this.parentDomNode = parent;
  this.computeAttributes();
  this.execute();

  const container = $tw.utils.domMaker(this.containerNodeTag, {
    class: "gk0wk-chatgpt-container " + this.containerNodeClass
  });

  try {
    const conversationsContainer = $tw.utils.domMaker("div", {
      class: this.scroll ? "conversations-scroll" : "conversations"
    });
    container.appendChild(conversationsContainer);

    if (!this.readonly) {
      const chatBox = this.createChatBox(conversationsContainer);
      container.appendChild(chatBox);
    }

    this.renderHistory(conversationsContainer);
  } catch (error) {
    console.error(error);
    container.textContent = String(error);
  }

  if (this.domNodes.length === 0) {
    parent.insertBefore(container, nextSibling);
    this.domNodes.push(container);
  } else {
    this.refreshSelf();
  }
}


  createChatBox(conversationsContainer) {
    const chatBox = $tw.utils.domMaker("div", { class: "chat-box" });
    const input = $tw.utils.domMaker("input", {
      class: "chat-input",
      attributes: {
        type: "text",
        placeholder: "Ask a question..."
      }
    });
    const button = $tw.utils.domMaker("button", {
      class: "chat-button",
      text: "Send",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        padding: "0 15px"
      }
    });

    chatBox.appendChild(input);
    chatBox.appendChild(button);

    const clearButton = $tw.utils.domMaker("button", {
      class: "chat-button clear-history",
      text: "Clear",
      style: {
        height: "50px",
        minWidth: "80px",
        fontSize: "16px",
        marginLeft: "10px",
        padding: "0 15px",
        backgroundColor: "#f44336",
        color: "white",
        border: "none",
        cursor: "pointer"
      }
    });

    clearButton.onclick = () => {
      this.clearChatHistory(conversationsContainer);
    };

    chatBox.appendChild(clearButton);

    let isProcessing = false;

  const sendMessage = async () => {
  if (isProcessing) return;

  const apiKey = $tw.wiki.getTiddlerText("$:/plugins/Gk0Wk/chat-gpt/openai-api-key", "").trim();
  if (!apiKey) {
    alert("Please set your OpenAI API key in the plugin settings.");
    return;
  }

  let message = input.value.trim();
  if (!message) return;

  let fullMessage = message;

  // Fetch and wikify the tiddler content if a tiddler title is provided
  if (this.tiddlerTitle) {
    console.log(`Fetching content for tiddler: ${this.tiddlerTitle}`);
    const tiddlerContent = $tw.wiki.renderTiddler("text/plain", this.tiddlerTitle);
    if (tiddlerContent) {
      fullMessage += `\n\nMain Tiddler Content:\n${tiddlerContent}`;
    } else {
      console.warn(`No content found for tiddler: ${this.tiddlerTitle}`);
    }

    // Get the stream-list field
    const tiddler = $tw.wiki.getTiddler(this.tiddlerTitle);
    if (tiddler && tiddler.fields["stream-list"]) {
      console.log(`Stream-list found for tiddler: ${this.tiddlerTitle}`);
      const streamList = tiddler.fields["stream-list"];
      
      // Use a regular expression to match tiddler titles, including those with spaces
      const streamTiddlers = streamList.match(/\[\[([^\]]+)\]\]|(\S+)/g).map(title => title.replace(/^\[\[|\]\]$/g, ''));

      for (const streamTiddler of streamTiddlers) {
        if ($tw.wiki.tiddlerExists(streamTiddler)) {
          console.log(`Fetching content for stream tiddler: ${streamTiddler}`);
          const streamContent = $tw.wiki.renderTiddler("text/plain", streamTiddler);
          fullMessage += `\n\nStream Tiddler (${streamTiddler}):\n${streamContent}`;
        } else {
          console.warn(`Stream tiddler does not exist: ${streamTiddler}`);
        }
      }
    } else {
      console.warn(`No stream-list field found for tiddler: ${this.tiddlerTitle}`);
    }
  }

  input.value = "";
  isProcessing = true;
  button.disabled = true;

  const conversation = this.createConversationElement(message);
  conversationsContainer.appendChild(conversation);

  try {
    await this.fetchChatGPTResponse(apiKey, fullMessage, conversation);
    this.saveConversationHistory(message, conversation.querySelector(".chatgpt-conversation-assistant").innerHTML);
  } catch (error) {
    console.error(error);
    this.showError(conversation, error.message);
  } finally {
    isProcessing = false;
    button.disabled = false;
  }
};



    button.onclick = sendMessage;
    input.addEventListener("keydown", (event) => {
      if (event.key === "Enter" && !event.shiftKey) {
        event.preventDefault();
        sendMessage();
      }
    });

    return chatBox;
  }

  async fetchChatGPTResponse(apiKey, message, conversationElement) {
  const assistantMessageElement = conversationElement.querySelector(".chatgpt-conversation-assistant");
  const messages = [];
  
  if (this.systemMessage) {
    messages.push({ role: "system", content: this.systemMessage });
  }

  messages.push({ role: "user", content: message });

  const response = await fetch(CHAT_COMPLETION_URL, {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      ...this.chatGPTOptions,
      messages: messages,
      stream: true
    })
  });

  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder("utf-8");
  let buffer = "";
  let fullResponse = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop();

    for (const line of lines) {
      if (line.startsWith("data: ")) {
        const data = line.slice(6);
        if (data === "[DONE]") {
          return;
        }
        try {
          const parsed = JSON.parse(data);
          const content = parsed.choices[0].delta.content || "";
          fullResponse += content;
          assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
        } catch (error) {
          console.error("Error parsing SSE message:", error);
        }
      }
    }
  }

  // Handle any remaining buffer
  if (buffer.length > 0) {
    try {
      const parsed = JSON.parse(buffer);
      const content = parsed.choices[0].delta.content || "";
      fullResponse += content;
      assistantMessageElement.innerHTML = this.renderMarkdownWithLineBreaks(fullResponse);
    } catch (error) {
      console.error("Error parsing final buffer:", error);
    }
  }

  this.saveConversationHistory(message, fullResponse);
}

 

  renderMarkdownWithLineBreaks(text) {
    // Replace newlines with <br> tags, but preserve code blocks
    const codeBlockRegex = /```[\s\S]*?```/g;
    const codeBlocks = text.match(codeBlockRegex) || [];
    let index = 0;
    let result = text.replace(codeBlockRegex, () => `__CODE_BLOCK_${index++}__`);
    
    // Replace newlines with <br> tags outside of code blocks
    result = result.replace(/\n/g, '<br>');
    
    // Restore code blocks
    codeBlocks.forEach((block, i) => {
      result = result.replace(`__CODE_BLOCK_${i}__`, block);
    });
    
    // Use TiddlyWiki's built-in markdown parser
    return $tw.wiki.renderText("text/html", "text/x-markdown", result);
  }

  createConversationElement(message) {
    const conversation = $tw.utils.domMaker("div", { class: "chatgpt-conversation" });
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-user",
      children: [$tw.utils.domMaker("p", { text: message })]
    }));
    conversation.appendChild($tw.utils.domMaker("div", {
      class: "chatgpt-conversation-message chatgpt-conversation-assistant"
    }));
    return conversation;
  }

  showError(conversationElement, errorMessage) {
    const errorElement = $tw.utils.domMaker("div", {
      class: "chatgpt-conversation-error",
      children: [
        $tw.utils.domMaker("p", { text: "Error: " + errorMessage })
      ]
    });
    conversationElement.querySelector(".chatgpt-conversation-assistant").appendChild(errorElement);
  }

  saveConversationHistory(userMessage, assistantMessage) {
  let history = [];
  try {
    history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
  } catch (error) {
    console.error("Error parsing conversation history:", error);
  }

  history.push({
    id: Date.now().toString(),
    created: Date.now(),
    user: userMessage,
    assistant: assistantMessage
  });

  $tw.wiki.addTiddler(new $tw.Tiddler({
    title: this.historyTiddler,
    text: JSON.stringify(history)
  }));

  // Re-render the conversation
  const conversationsContainer = this.domNodes[0].querySelector(".conversations, .conversations-scroll");
  conversationsContainer.innerHTML = '';
  this.renderHistory(conversationsContainer);
}


  renderHistory(container) {
    let history = [];
    try {
      history = JSON.parse($tw.wiki.getTiddlerText(this.historyTiddler) || "[]");
    } catch (error) {
      console.error("Error parsing conversation history:", error);
    }

    for (const entry of history) {
      const conversation = this.createConversationElement(entry.user);
      conversation.querySelector(".chatgpt-conversation-assistant").innerHTML = entry.assistant;
      container.appendChild(conversation);
    }
  }

  clearChatHistory(conversationsContainer) {
    $tw.wiki.deleteTiddler(this.historyTiddler);
    conversationsContainer.innerHTML = '';
  }

 refresh(changedTiddlers) {
    const changedAttributes = this.computeAttributes();
    if (Object.keys(changedAttributes).length > 0 || this.historyTiddler in changedTiddlers) {
      this.refreshSelf();
      return true;
    }
    return false;
  }
}


exports["chat-gpt"] = ChatGPTWidget;

example:

Take a look at Add “AI Tools” Plugin by Jermolene · Pull Request #8365 · TiddlyWiki/TiddlyWiki5 · GitHub. I think this core plugin needs some early adopters. If it becomes useful, TiddlyWiki will be AI native.
