You can edit the tiddlers $:/config/ExternalJS/Prefix/Plugins and $:/config/ExternalJS/Prefix/Core, giving the prefix without the version and extension.
You need to do that before exporting the core and plugins.
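A minimal sketch of what those two config tiddlers might hold (the URLs here are hypothetical — as I understand it, the plugin appends the version number and `.js` extension to whatever prefix you supply):

```
$:/config/ExternalJS/Prefix/Core:     https://example.com/lib/tiddlywikicore
$:/config/ExternalJS/Prefix/Plugins:  https://example.com/lib/plugins/
```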
Thanks for the fix. I’ve released a new version of motion that changes its startup order.
I’ve just set up a simple external-core site via tiddlyhost. I did successfully save once – after modifying just the site title. But every attempt to save, after that first one, is prompting a “file changed on server” rejection. (At first I thought it was something about the mass of imported tiddlers I brought over. But I’ve reloaded the simple site and tried tiny changes, always the same result.)
Hmm, I can’t reproduce it. Generally that message should go away if you reload the site first. Maybe try a shift-reload to ensure it really does reload…?
About the tw-external-js plugin:
That’s correct, but I needed to make some tweaks to the plugin so filenames can be extracted properly from custom core/plugin URLs. So, if you’re interested in configuring the library URL, you’ll want to grab the updated version (>= 0.1.3).
The problem was persistent (across computers and browsers) until I tried, just now, downloading the file from the Tiddlyhost dashboard (a file which, as expected, could not be opened on its own) and then re-uploading it. Now saving works fine. But it’s worth keeping an eye on whether the glitch happens again.
I’m surprised. It’s loading the file via a relative path. So it won’t reduce TH’s overall storage load unless behind the scenes the library is a symbolic link to a central library.
You can fix this in your own TW by (while still on TH) editing tiddler
$:/core/save/offline-external-js
Change the macro definition coreURL to:
\define coreURL() https://<your tiddlywiki name>.tiddlyhost.com/tiddlywikicore-5.2.3.js
Where <your tiddlywiki name> is the name of the file that you have externalized (is that a word?). Save. Then download from your TH dashboard. Now you should be able to use that downloaded file as you would a stand-alone file, as long as you have an internet connection.
@Mark_S Seems like the js file is also accessible at this url: https://tiddlyhost.com/tiddlywikicore-5.2.3.js
So no need to specify the wiki name
The question is whether that is an accident or on purpose.
I’ve just pushed an update that is supposed to automatically give everyone the shared https://tiddlyhost.com/tiddlywikicore-5.2.3.js .
(I might switch to https://tiddlyhost.com/tiddlywikicore-5.2.3.min.js soon, which is a little smaller.)
@simon With uglify it’s possible to reduce the size of the core to 1.197 MB (vs 2 MB for tiddlywikicore-5.2.3.min.js)
https://github.com/Telumire/TW/raw/main/tiddlywikicore-5.2.3-uglified.js
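For reference, a typical invocation would look something like this (assuming the `uglify-js` npm package; `-c` enables compression, `-m` mangles names — the output filename is just my choice):

```
npm install -g uglify-js
uglifyjs tiddlywikicore-5.2.3.js -c -m -o tiddlywikicore-5.2.3-uglified.js
```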
Maybe an iframe could be a solution, see Easy way to fix mastodon verification issue (for wiki over 1MB)
Just a follow-up to confirm that it does indeed work on Tiddloid with a local library. You need to write the full path to the library, e.g.
/storage/emulated/0/share/tiddlywikicore-5.2.3.js
I can see where it would be handy to have a switch mechanism so you could reset the library location depending where you were using it.
Interesting idea @Mark_S
An Offline switch?
I find this whole thread fascinating!
As an idiot I could benefit from a simpler run-through of …
The advantages of splitting-out the core / + plugins to … (where?)
How easy would it be, in practice, for an average bear?
IF it improves performance on start-up I’d be likely very happy.
BUT, I have some slight qualms that the core on its lonesome could be a problem, sometimes?
Just queries from a naive end-user
Best, TT
Separating the core and plugins from a wiki is useful in several scenarios:
The browser is able to cache external javascript, so the loading time is improved, and a smaller file is also faster to save.
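Concretely, instead of embedding the entire core inline, the externalised single-file wiki pulls it in with a script tag, roughly like this (the exact markup is generated by the export template, so treat this as an illustration):

```html
<script src="tiddlywikicore-5.2.3.js"></script>
```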
The external JavaScript configuration is useful in certain circumstances, hence its inclusion in the core. But I strongly believe that it is far from a universal solution, and that TiddlyWiki should not present it as a mainstream option.
The problem is that the concept of dependencies between files is difficult for many end users, despite it being a commonplace idea for developers. In fact, it’s not uncommon for web users to only have a hazy notion of files and directories.
For most users, moving from a single file configuration to a pair of dependent files opens the door to a range of fairly horrific bad outcomes when things go wrong with version mismatches and dangling dependencies. That’s bearable for sophisticated users, but it’s not something that we should expect of TiddlyWiki’s mainstream users.
The usability will always be problematic – for example, see the instructions for upgrading a single file wiki with external JS. Even if we smooth out those processes, there’s still a fundamental complexity introduced by shifting the burden onto users of keeping track of the right tiddlywiki.js files.
As it happens, if we were going to promote a dual file solution for TiddlyWiki I’d be more interested in structuring it differently: a generic “tiddlywiki.html” viewer application that loads/saves from external “tiddlers.json” files. I think that’s closer to a conventional mental model of how applications work, and is more conducive to being packaged as an installable Chrome app etc.
Just to be clear, I am not saying that people shouldn’t use the externalised JavaScript configuration. Far from it, I think we learn a lot from these experiments. And of course it is entirely appropriate for online services like Tiddlyhost or Xememex to use it, where it should be entirely invisible to end users.
It would make sharing / doing backups of specific tiddlers much easier, but is that possible to do without additional software? If so, won’t TiddlyDesktop become obsolete?
With the File System Access API you wouldn’t need additional software, though browser support isn’t universal. I have been playing around with something similar using WebDAV.
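As a rough sketch of the idea — in the spirit of the “tiddlywiki.html + tiddlers.json” split mentioned above. The function name is mine, not from any plugin, and `showSaveFilePicker` is currently only available in Chromium-based browsers in secure contexts:

```javascript
// Hypothetical helper: save the wiki's tiddler store as a separate JSON file
// using the File System Access API (browser-only, Chromium, secure context).
async function saveTiddlersJson(jsonText) {
  // Prompt the user for a save location; suggest a conventional filename
  const handle = await window.showSaveFilePicker({
    suggestedName: "tiddlers.json",
    types: [{ description: "JSON tiddlers", accept: { "application/json": [".json"] } }],
  });
  const writable = await handle.createWritable();
  await writable.write(jsonText); // write the serialized tiddler store
  await writable.close();         // flush to disk
}
```

The same API’s `showOpenFilePicker` could then let a generic viewer HTML load the JSON back in, which is what makes the two-file model feel like a conventional application.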
I work a fair bit with users in bandwidth-scarce regions where the savings in size would be very beneficial. However, in practice I have found that the usability drop is considerable when moving from a single file to two files that both need to be present, saved, shared etc., and it easily outweighs the bandwidth savings. Granted, these are also users with limited digital literacy.
There are no average bears. But, if you don’t understand the instructions or implications at the top of the wiki here, then Externalising your core probably isn’t for you. Which is alright. Or bearable.
Splitting out the core means rapid fire saves. Especially on hosted services, I always wait for a save to complete before continuing to work. It should also mean faster loads after the first load. At least on desktop chromium, the library file is cached in memory. Which means that reloads are faster. But the cache will be emptied when the browser is shut down. How long it is kept in the cache is entirely up to the browser. If the cache works the same way on a phone, then you should be pulling far less data.
And if you have it set up on your phone to use a local library file, then that’s 2 MB of data that doesn’t have to be transferred. Local data is going to be much faster than pulling data over a phone/internet connection.
A service like TiddlyHost could save oodles of space using an external core. And there would be somewhat less bandwidth used, since presumably most people are using browsers that will cache the core at least for a while.
The main downside is that it is probably too complicated for people that don’t understand file dependencies and/or file paths. So, for instance, if you download your externalised TW from TiddlyHost and then turn off your internet connection, your TW file won’t work (unless of course you have a properly downloaded and pathed local library file.)