[Techie] Scheduling/requesting git commits for shared Node.js wikis

Does anyone have working techniques to automate git commits/pushes for Node.js wikis?

I have several wikis that I want to open up to additional editors. On my own, I simply run Node versions locally, and when I want to publish, I manually commit and push, leaving it to GitHub pages (or sometimes GitLab pages at work) to publish from there.

But I’m a techie who lives in git. Most other editors won’t be. There won’t be a large number of editors, and I’m not particularly concerned about contention. But I want their changes and mine to get saved back to git on some cadence. I would prefer if there is some scheduled check (every 30 minutes?) to see if there are unsaved changes, and then, if there have been no changes for some shorter period (5 minutes?) to commit everything to git, and push to GitHub/GitLab/Bitbucket/Codeberg/whatever.
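For concreteness, the kind of check I have in mind could be sketched in shell roughly like this. This is a hypothetical sketch, not an existing tool: the function name, paths, and timings are my own, and it assumes GNU find/date and that the wiki folder is already a git repo.

```shell
#!/bin/sh
# Hypothetical idle-commit check, meant to be run from cron every few minutes.
# Commits only when the working tree is dirty AND no file has been modified
# for at least idle_seconds (i.e. editing appears to have paused).
idle_commit() {
  wiki_dir=$1
  idle_seconds=$2
  cd "$wiki_dir" || return 1

  # Nothing to do if the working tree is clean
  [ -n "$(git status --porcelain)" ] || return 0

  # Newest modification time (epoch seconds) among all non-.git files
  newest=$(find . -path ./.git -prune -o -type f -printf '%T@\n' \
             | sort -n | tail -1 | cut -d. -f1)
  now=$(date +%s)

  # Commit only if the most recent edit is at least idle_seconds old
  if [ $((now - newest)) -ge "$idle_seconds" ]; then
    git add -A
    git commit -m "Auto-commit: $(date -u +%FT%TZ)"
  fi
}
```

A crontab entry along the lines of `*/5 * * * * /path/to/idle-commit.sh /srv/wiki 300` would then give the "check every few minutes, commit after a quiet spell" behaviour (all paths and numbers are placeholders).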

Does anyone have examples of a similar workflow?


I would also love to see any examples people have of creating git tags/versions from a client-side event initiation. Have you done something like that?


I don't have an example of that… yet, but it's definitely in line with what I've been planning to write, though my use-case is "just me" and a general preference to save history into git (sometimes going so far as to reconstruct git histories from files found in normal backups!). My thinking was similar: run from cron regularly, and commit if there have been edits but there's been a sufficient delay since the last edit to suggest a "session" of editing is likely done.

The part of my git workflow I HAVE been very happy with is my git-automsg.sh script, which generates a commit message with a summary line counting the changes and a body listing each file and its new/deleted/modified/renamed state. It makes a nice automated git log message, and my TW/node helper script effectively just runs git add * && git commit -a -m "$(git-automsg.sh)"
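That's not the actual script (see the link below for the real thing), but the shape of it can be sketched roughly like this, as a hypothetical stand-in built on `git status --porcelain`:

```shell
#!/bin/sh
# Hypothetical stand-in for git-automsg.sh (not the real script): prints a
# summary line counting pending changes, then a body listing each file with
# its new/modified/deleted/renamed state.
automsg() {
  status=$(git status --porcelain)
  if [ -z "$status" ]; then
    echo "No changes"
    return 0
  fi
  count=$(printf '%s\n' "$status" | wc -l)
  echo "Auto-commit: $count change(s)"
  echo
  # Each porcelain line is "XY path"; read strips the leading space for us
  printf '%s\n' "$status" | while read -r code file; do
    case $code in
      "??"|A*) echo "new:      $file" ;;
      M*)      echo "modified: $file" ;;
      D*)      echo "deleted:  $file" ;;
      R*)      echo "renamed:  $file" ;;
      *)       echo "changed:  $file" ;;
    esac
  done
}
```

A helper can then do roughly what I described above: `git add -A && git commit -m "$(automsg)"`.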

git-automsg from: GitHub - nemothorx/neon-git: nemo's various git helper scripts

In terms of triggering git from client-side - I recall David Bovill talking about that during the Hitchhiker's Guide to Wiki zoom meeting back in March (Invitation to a Hitchhikers TiddlyWiki All-nighter), so I know it's doable, but I don't know if it's public or usable.

So you didn't know TidGi Desktop could do that? That is why its name is Tid"Gi": the "Gi" is short for Git.

It auto-commits to git every 30 minutes (configurable in the TidGi preferences), which has saved a lot of data for users when they mess up their wikis.

It is an Electron-based TiddlyWiki app that wraps the Node.js wiki. You can import an HTML wiki into it with a few mouse clicks.

Thanks. I’ll take a look.

I saw a lot of that discussion, but far from everything. I don't remember that specifically. It's something I've long known was possible, but I've never tried it myself or seen any actual examples.

No, I didn't know that. I'm glad to know this and to know where the name comes from. But a desktop tool, even if it could be configured to run against an online wiki, is not the right tool for this scenario, I'm quite certain. I would not want to ask my users to install anything.

I started with a git auto-commit script (GitHub - DiamondYuan/wiki; it seems this person isn't using TiddlyWiki anymore) back in 2019.

And I made a JS version later, integrated into TidGi in 2020. I got feedback from my blog post comments that those git sync scripts were not easy to understand, so I made the TidGi executable version to simplify things.

Why isn't it a Node.js wiki plugin? Because git-sync-js requires a git binary, which can't be shipped within a JSON plugin file. But you could try GitHub - simonthum/git-sync: Safe and simple one-script git synchronization and wrap it as a Node.js wiki plugin.

That’s interesting, as is your git-sync project.

I had always assumed that TW server-client is too chatty to do individual commits on every save, but in my day job, I’ve just been porting an (Enterprise) GitLab API tool into an (Enterprise) GitHub one. Its saves are atomic single-file commits. While they are less frequent during a working session than my typical TW usage, I’m wondering whether this could work for TW. I’m going to have to think about it more.
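For what it's worth, the "atomic single-file commit" idea translates to plain git quite directly. A hypothetical per-save hook (my own sketch, not the API tool's code) might do something like:

```shell
#!/bin/sh
# Hypothetical per-save hook: stage and commit exactly one file, leaving any
# other pending changes in the working tree untouched.
commit_one() {
  file=$1
  git add -- "$file"
  # Restricting the commit to a pathspec keeps it an atomic single-file commit
  git commit -m "Save: $file" -- "$file"
}
```

Whether this stays fast enough under TW's save chatter is exactly the open question.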

Thank you very much for something interesting to consider!

I have a bash script that builds my node.js TW to a static site and then git commits it and pushes to GitHub every 15 minutes - via cron.

I edit on the node.js version (also includes content that isn’t published) and then others can read my static version.

I suppose I could set that up. But I usually publish dynamic wikis that are only editable when served over Node. So I have a less frequent, on-demand build, but I would like the commits to happen relatively quickly on change. Although if I can get them to build when there's a sufficiently long lull in saves, that would be ideal.

Thank you, though. If your script is public, I'd love to have a look.

You don't need to git commit on every tiddler change; that will result in a very large .git folder. Or perhaps GitHub - jj-vcs/jj: A Git-compatible VCS that is both simple and powerful will make the history smaller?

That is what I did in 2020; my scripts are in the links above. But I later switched to building the static site only in a GitHub Action, and after a year the git repo contains only the Node.js wiki, where each tiddler is a file, so .git is much smaller. Committing the HTML wiki to the git repo was a big mistake; my repo got very large after a year.

#!/bin/bash

# Run the TiddlyWiki builder
source /home/alex/dev/wiki/build.sh

# Commit and push
cd /home/alex/dev/public_wiki || exit 1
git pull
git add .
git commit -am "Autobuild: $(date)"
git push

Where the build script is

#!/bin/bash
PUB_FOLDER='/home/alex/dev/public_wiki'
pub_wiki="${PUB_FOLDER}/wiki"

# Export a filtered version (excludes Private and draft tiddlers)
rm -rf "$pub_wiki"
/usr/local/bin/tiddlywiki /home/alex/dev/wiki --savewikifolder "$pub_wiki" "[all[tiddlers]] -[tag[Private]] -[has[draft.of]]"

# Externalise images
/usr/local/bin/tiddlywiki "$pub_wiki" --save "[is[image]]" "[encodeuricomponent[]addprefix[images/]]"
/usr/local/bin/tiddlywiki "$pub_wiki" --setfield "[is[image]]" _canonical_uri '$:/core/templates/canonical-uri-external-image' text/plain --setfield "[is[image]]" text "" text/plain --render '$:/plugins/tiddlywiki/tiddlyweb/save/offline' index.html text/plain

# Copy to the GitHub Pages folder
cp -r "${pub_wiki}/output/"* "${PUB_FOLDER}/docs/"

I'm sure @linonetwo is correct and this isn't the best way. But my point is: don't forget about bash scripts, cron jobs, etc. At first I probably wanted it immediately, but then decided that within 15 minutes was good enough. I had thought about maybe running a Flask API server that you could trigger from TW so it calls the script on the server. There are lots of ways… try asking your favourite LLM to see if it has any other ideas.

Thank you for sharing. I will probably end up doing a frequent granular commit, perhaps on every save, but I haven’t figured out how I want to schedule builds. It might end up being in cron, but my preference would be something that waits for a pause in save/commit activity. That’s why I was thinking of running it out of the Node job, which already has this information. I don’t know, though, if it keeps any such state around. (It also makes sense because I want the facility in my Node wiki anyway to run on-demand builds from the client side: essentially an add tag/version feature.)


When I get my equivalent set up, my aim is a similar "pause in activity" check, but I was thinking of detecting the pause via bash scripting. Something like "if git status says there are changes AND the newest file is more than 30 minutes old, then git commit", and then have cron run the check every 5 minutes or so.

(though my current manual commit rate is every few days unless I’ve made a change I want to make special note of, so I may tune my timing to suit that sort of pace)

Yes, that would work quite well. I’ve never checked git commit history from bash, so that’s something new to learn.

Cheers!

Sounds like we have opposite experiences with git - my usage is (almost) 100% in bash!

Just thinking on it a little more, I think rather than ask git if there are any unstaged or untracked files, it may be better to just get the last commit time from git’s log, and then see if any files are newer. All up, I think there are three times to consider for your heuristics:

  • last git commit
  • last file update
  • time the script runs

git --no-pager log -1 --date=raw --format="%ct" will get you the epoch time of the last commit (%cd and %ci are both good human-readable formats, though drop the --date=raw option for those).
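Putting those three times together, a hypothetical version of that heuristic in shell (my own sketch, assuming GNU find/date) could look like:

```shell
#!/bin/sh
# Hypothetical heuristic using the three times above: commit when the newest
# file is more recent than the last commit (there is unrecorded work) but is
# itself at least idle seconds old (the editing session appears to be over).
maybe_commit() {
  repo=$1
  idle=$2
  cd "$repo" || return 1

  # Epoch time of the last commit; 0 if the repo has no commits yet
  last_commit=$(git --no-pager log -1 --format=%ct 2>/dev/null || echo 0)

  # Epoch time of the most recently modified non-.git file
  newest=$(find . -path ./.git -prune -o -type f -printf '%T@\n' \
             | sort -n | tail -1 | cut -d. -f1)
  [ -n "$newest" ] || return 0   # no files at all, nothing to do

  now=$(date +%s)
  if [ "$newest" -gt "$last_commit" ] && [ $((now - newest)) -ge "$idle" ]; then
    git add -A
    git commit -m "Auto-commit: $(date -u +%FT%TZ)"
  fi
}
```

Run from cron every 5 minutes with something like `maybe_commit /srv/wiki 1800`, that gives "commit after a 30-minute lull" without tracking any extra state.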