Suggestion: Adding gzip support to TiddlyWiki import/export

EDIT: found a relevant GitHub issue: Please GZip $:/core, maybe? · Issue #4262 · Jermolene/TiddlyWiki5 · GitHub

The Compression Streams API has been supported in all major browsers since May 9, 2023.

We can use it to compress and decompress data with the gzip algorithm in a few lines of JavaScript:

async function compress(blob) {
    // Pipe the blob's bytes through a gzip CompressionStream
    const cs = blob.stream().pipeThrough(new CompressionStream("gzip"));
    // Collect the compressed stream back into a Blob
    return await new Response(cs).blob();
}

async function decompress(blob) {
    // Pipe the gzipped bytes through a gzip DecompressionStream
    const ds = blob.stream().pipeThrough(new DecompressionStream("gzip"));
    // Collect the decompressed stream back into a Blob
    return await new Response(ds).blob();
}
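
For illustration, here is a minimal sketch (not TiddlyWiki core code) of how an export action could use compress() to offer the current page as a .gz download; the helper name exportGzip, the default filename, and the use of document.documentElement are just assumptions:

async function exportGzip(filename) {
    // Serialise the current page and gzip it (exportGzip is a hypothetical helper)
    const html = "<!DOCTYPE html>\n" + document.documentElement.outerHTML;
    const gz = await compress(new Blob([html], { type: "text/html" }));
    // Offer the compressed blob as a file download
    const url = URL.createObjectURL(gz);
    const a = document.createElement("a");
    a.href = url;
    a.download = filename || "wiki.html.gz";
    a.click();
    URL.revokeObjectURL(url);
}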

I did a quick test: empty.html goes from 2.4 MB to 0.43 MB (about 80% smaller).

You can test this here: Download compressed data
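
If you want to reproduce the test yourself, something along these lines works in the browser console, assuming the page is served from the same origin as empty.html (otherwise CORS applies) and that compress() from above is defined:

// Fetch empty.html, gzip it, and compare the sizes
const original = await (await fetch("empty.html")).blob();
const gzipped = await compress(original);
console.log("original:", original.size, "bytes, gzipped:", gzipped.size, "bytes");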

I think adding gzip support would be great for backups and the plugin library.

Does that mean the file takes 0.43 MB on disk and 2.4 MB in the browser?

The gzipped file is essentially an archive file format.


It can’t be opened and read in the browser as an HTML file unless you use JavaScript to decompress the data in memory, which is the feature I suggest adding to TiddlyWiki :slight_smile:
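
As a sketch of what that in-memory decompression could look like (the loader function and the document.write() approach are my own assumptions, not an existing TiddlyWiki feature), using decompress() from above:

async function openGzippedWiki(url) {
    // Fetch the .gz file and decompress it in memory
    const gzBlob = await (await fetch(url)).blob();
    const html = await (await decompress(gzBlob)).text();
    // Replace the current document with the decompressed wiki HTML
    document.open();
    document.write(html);
    document.close();
}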

It might not be the most powerful compression algorithm (it works really well with text, not so much with images and other types of media), but it is very easy to implement now that it is supported natively by every major web browser. HTML exports saved as .gz take about 80% less space, so for someone who makes a lot of backups of sizeable wikis it can save a lot of disk space! There is already gzip support in the AWS plugin, so maybe this could be added to the core now that it no longer needs an external dependency.

Your clarification is much appreciated. Most of our wikis are mostly text, with few images.