Check website online or offline as a badge widget

Not a very valuable plugin because AI wrote it in a few minutes.

I use it to check the availability of my digital-garden blog at https://wiki.onetwo.website/

Demo and drag to install: https://tiddly-gittly.github.io/tw-server-sitemap/#%24%3A%2Fplugins%2Flinonetwo%2Fcheck-website
CPL: https://tw-cpl.netlify.app/#Plugin_202310261631806%201
Source: tw-server-sitemap/src/check-website at master · tiddly-gittly/tw-server-sitemap · GitHub

The prompt used to write this “high quality” TypeScript TiddlyWiki plugin is:

Create a TiddlyWiki plugin to check the availability of a given URL. I want it to be used as a widget.

The "interval" is optional, with a default value of 1h. It supports simple formats like 1h30m, but does not support "s" (seconds), as hours and minutes are generally sufficient.
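A parser for that interval grammar could look like the sketch below. This is an illustration of the stated rules (hours and minutes only, default 1h, no seconds); `parseInterval` is a hypothetical name, not the plugin's actual function:

```typescript
// Hypothetical sketch of an interval parser for values like "1h", "45m",
// or "1h30m". Seconds are deliberately rejected; anything unparseable
// falls back to the default of one hour, per the plugin description.
function parseInterval(text?: string): number {
  const ONE_HOUR_MS = 60 * 60 * 1000; // default interval: 1h
  if (!text) return ONE_HOUR_MS;
  const match = /^(?:(\d+)h)?(?:(\d+)m)?$/.exec(text.trim());
  if (!match || (!match[1] && !match[2])) return ONE_HOUR_MS;
  const hours = Number(match[1] ?? "0");
  const minutes = Number(match[2] ?? "0");
  return (hours * 60 + minutes) * 60 * 1000;
}
```

So `parseInterval("1h30m")` yields 90 minutes in milliseconds, while an unsupported value like `"30s"` silently falls back to the default.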

Internally, it will periodically fetch the website to check basic accessibility.

The result will be displayed as a badge, similar to the various badges commonly seen on GitHub project homepages. The content will be a "label". The "label" is also optional; if not provided, the host part of the URL will be used.
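Deriving that default label from the host part is straightforward with the standard `URL` constructor (available in browsers and modern Node); `defaultLabel` here is an illustrative name, not necessarily the plugin's API:

```typescript
// Sketch: when no "label" attribute is given, fall back to the URL's host.
function defaultLabel(url: string, label?: string): string {
  if (label) return label;
  try {
    return new URL(url).host; // e.g. "wiki.onetwo.website"
  } catch {
    return url; // unparseable URL: show the raw string rather than nothing
  }
}
```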

Provide usage examples in the readme. All content should be in English. You should refer to https://github.com/tiddly-gittly/Modern.TiddlyDev/tree/master/src/plugin-name to write the TypeScript version. Additionally, CSS can be used to implement the badge style.

@linonetwo I love this and it seems not to have a performance impact.

I presume that as soon as it is displayed it does the first test, then one every set period thereafter, as long as it is visible somewhere?

I am really keen to use this to monitor a number of sites, but I also see it can test local addresses too, such as <$check-website url="http://192.168.0.1" /> which is my internet gateway (a cellular hot spot), or <$check-website url="https://google.com" /> for internet connectivity. This is because my hotspot has misbehaved recently.

If you do ever think of taking this out of experimental, I would love it if it triggered named actions when the state changes: either with a variable indicating connected or disconnected, or with connect-actions/disconnect-actions parameters, and only when checking at the set interval.

  • Perhaps a way to have them either working in the background or active but invisible though still triggering.
  • Perhaps even a way to trigger the test, such that if one fails it could trigger another test to diagnose where the connection is failing.
  • Perhaps even an action on each timed test, to trigger a record of the result

This would allow a wiki to be opened, monitor key sites in the background, and trigger any valid action when a state change occurs, e.g.:

  • Send a message
  • Attempt to Open the site
  • Trigger other tests
  • Simply log when it happens
  • Open in iframe if allowed

Regards
Tones

I like this as an ad hoc monitoring system. Things I’d like to see added: check the full URL given, not just the domain.

i.e., the following should fail because it’s a 404, rather than succeed because it made a network connection: <$check-website url="https://douglasadams.com/making-stuff-up" />, whilst this one should succeed: <$check-website url="https://douglasadams.com/creations" />

Full dedicated monitoring systems generally have a wealth of other options (follow redirects, check content returned against an expected string, push authentication or other form data, etc.), but tbh I think those would be overkill for a TW widget check like this.
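The 404 behaviour could be tightened by treating only 2xx status codes as success, so a reachable server that answers 404 still shows the badge as offline. A minimal sketch (`isOnline` is an illustrative name, not the plugin's code):

```typescript
// Sketch: classify an HTTP status code strictly, so a 404 counts as a
// failure even though the network connection itself succeeded.
function isOnline(status: number): boolean {
  return status >= 200 && status < 300; // same test as Response.ok in Fetch
}
```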

Nemo, I support your suggestion, but there may need to be a separation between the two functions if one is achieved by ping and the other by HTTP GET.

This makes me wonder if the existing HTTP GET solutions can do this for you already?

I think when doing a file or folder get that the additional features you mention become more important?

afaict from eyeballing the code (noting I don’t program JavaScript), the current check is done by an HTTP HEAD (and not GET or ping), though I couldn’t say why it returns success on a 404.

I note it reports “offline” on sites with a failing SSL certificate (as tested on one of my own domains, where the cert failure is due to the domain not appearing in the SAN entry).

Checking content would need a full GET, but simple checking of a deep URL should be possible with the existing HEAD method.
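Assuming the check really is a fetch with the HEAD method, a deep-URL check along these lines should be possible. This is a sketch, not the plugin's actual code, and the status logic is split out so it can be exercised without a network:

```typescript
// A Response-like shape so the status test can run against plain objects.
interface StatusLike {
  status: number;
}

// Sketch: Response.ok in the Fetch API is exactly this 2xx test, so a HEAD
// request to a missing deep URL (404) would be reported as a failure.
function headLooksOk(response: StatusLike): boolean {
  return response.status >= 200 && response.status < 300;
}

// Sketch of the check itself, assuming a global fetch (browsers, Node 18+).
async function checkDeepUrl(url: string): Promise<boolean> {
  try {
    const response = await fetch(url, { method: "HEAD" });
    return headLooksOk(response);
  } catch {
    return false; // DNS failure, refused connection, or a bad certificate
  }
}
```

The `catch` branch would also explain the SAN case above: a rejected certificate makes `fetch` throw, so the badge reads "offline" even though a server is answering.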


Did you try http:// on the https failures? I don’t know what a SAN entry is.

SAN is a https specific thing - “Subject Alternative Name” - it’s a list of domains a certificate is valid for. eg: tiddlywiki.org and www.tiddlywiki.org are the two SAN entries on the tiddlywiki.org certificate.

Curiously, I hadn’t checked my test domain with just http, and that’s returning an “offline” via this tool too, even though it’s definitely online.

My current test set is

<$check-website url="https://house.cx/" label="https://house.cx" interval="6s" />
<$check-website url="http://house.cx/" label="http://house.cx" interval="6s" />

<$check-website url="https://uuuuuu.au/" label="https://uuuuuu.au" interval="6s" />
<$check-website url="http://uuuuuu.au/" label="http://uuuuuu.au" interval="6s" />

<$check-website url="https://curlpipebash.org/" label="https://curlpipebash.org" interval="6s" />
<$check-website url="http://curlpipebash.org/" label="http://curlpipebash.org" interval="6s" />

resulting in:

(screenshot of the six resulting badges)

Now, https for house.cx makes sense to fail due to the SAN issue above (a known misconfiguration on my server).

http://house.cx, though, returns a normal webpage (a 200 response), so it should be green, unless this code simply cannot handle unencrypted http?

https://uuuuuu.au is set up and working, and this test correctly picks it up as Online.
The unencrypted http://uuuuuu.au is a 301 redirect to https://uuuuuu.au, so it’s understandable that it doesn’t return green; but depending on your taste in monitoring, calling it “offline” isn’t necessarily correct either.
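A middle ground for that taste question would be a three-state badge that reports redirects distinctly rather than as "offline". Note that the Fetch API follows redirects by default, so observing the 301 itself would require `redirect: "manual"` (which in browsers yields an opaque response with status 0). The classification, as a sketch with illustrative names:

```typescript
// Sketch: a three-state classification so redirected sites are not
// lumped in with genuinely unreachable ones. Names are illustrative.
type SiteState = "online" | "redirect" | "offline";

function classifyStatus(status: number): SiteState {
  if (status >= 200 && status < 300) return "online";
  if (status >= 300 && status < 400) return "redirect";
  return "offline"; // 4xx, 5xx, or status 0 from an opaque response
}
```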

curlpipebash.org is a quirky setup, where the https site is a 301 redirect to the unencrypted http site, which returns valid working content. It’s marked as offline here for both http and https, combining both issues above in one domain. To a normal user in a normal browser, the site works and is online.

tl;dr: as it stands, I get 1 Online and 5 Offline in my test set of six. But if you loaded those six in a browser, you’d get 5 loaded pages without error and only 1 with an issue.