I’m sorry that my response here will not address your actual request.
Though what you’re envisioning is not hard to do, it makes more sense to problem-solve around why you feel the need to juggle capitalized and lowercase versions of titles, rather than to help you invest more effort in that rather kludgey direction.
Your experience (and mine, prompted by testing your kind of workflow) raises an important implementation question for me, which I think we need to flag for @saqimtiaz (even knowing Saq probably can’t put significant attention toward this anytime soon):
OBSERVATION: SAQ’s CURRENT DEFAULT NEATLY HYBRIDIZES EXTERNAL & LOCAL FIELDS
If a tiddler is “adopted” (by toggling away from “yes” in the `is_volatile` field), then a copy of it gets saved (as expected). But the next time external tiddlers are loaded, the solution as Saq has set it up so far constructs a HYBRID result for any already-“adopted” tiddler:
- the incoming tiddler has a new `includeTimestamp`
- “yes” is (re)written into the `is_volatile` field
- external content overwrites any local field values.
- HOWEVER… any field-value pairs that are unique to the local (pre-import) tiddler remain!
  - At least, they remain for the current load session…
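To make the observation concrete, here is roughly what I’m seeing, sketched as plain JavaScript objects. The field names other than `is_volatile` and `includeTimestamp` are just placeholders I invented for illustration:

```js
// Local ("adopted") copy before re-import (illustrative field names only):
var local = {
  title: "Some Book",
  is_volatile: "no",           // adopted, so it currently saves
  notes: "my local comments",  // field that exists only locally
  author: "Old Spelling"
};

// Incoming tiddler from the source wiki:
var incoming = {
  title: "Some Book",
  author: "Corrected Spelling",
  includeTimestamp: "20240101120000000"  // fresh on each load
};

// What I observe after the re-import (the hybrid):
var hybrid = {
  title: "Some Book",
  is_volatile: "yes",                     // reset, so it will no longer save
  includeTimestamp: "20240101120000000",  // new timestamp
  author: "Corrected Spelling",           // remote value wins where both sides have the field
  notes: "my local comments"              // local-only field survives, but only in memory
};
```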
Is this intentional? I think @saqimtiaz could have very good reasons for doing this: you could want to add local tags and comments / notes about a book (say), and still benefit from importing new information and/or corrections within the authoritative central record.
Then again, perhaps this hybrid behavior is just an artifact of some JSON data remaining in un-erased limbo. I wonder this because, if Saq had wanted this behavior, it would have made sense to protect all local data not just for the duration of this browser session, but beyond (through subsequent saves)…
RISK OF DATA LOSS
ALAS, the benefit of this hybrid load behavior, as currently implemented, is easily outweighed if one must manually re-toggle (for “adoption” / saving) every tiddler that started the session with local field contents one doesn’t want to lose. And that toggling is especially difficult because, after the re-import, we don’t seem to have any clear way within the local wiki to track whether an imported tiddler has fields whose data exists only on the local side, and which fields those are!
(A naive user will see, perhaps with satisfaction, that their locally generated field data from a prior session is “still there” after (re)importing a “conflicting” tiddler. They may not realize that this hybrid tiddler, with its locally generated content intermixed with remote content, has been reset to volatile and is due to disappear, taking the distinctive local content — which cannot be restored from the source wiki — with it.)
EASY SOLUTION?
If the load process is already doing some savvy “dovetail” work with local and incoming field values, then surely it’s possible to tweak the process so that whatever data a wiki had when its browser session started survives the import (“adding”) of external content, with the possible exception of incoming field values overwriting local ones. That seems like an important norm — something users are entitled to expect once this solution moves from experimental to mature. (To be clear, @saqimtiaz is doing some great pioneering work here, and of course it’s always true that details can fall into place only after the proof of concept!)
A simple solution would adjust the load-external-content process to check whether the local copy has the `is_volatile` field, and then to preserve that field value – even if other locally existing field-value pairs are overwritten. (So the `includeTimestamp` is updated, confirming the most recent integration of content from the source/authority wiki, and all other fields behave as they now do.)
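For what it’s worth, here is a minimal sketch of the kind of merge step I have in mind, written against TiddlyWiki’s standard `$tw.wiki` JavaScript API. The function name, the shape of `incomingFields`, and the exact handling of `includeTimestamp` are my assumptions, not Saq’s actual code:

```js
// Hypothetical merge step for one imported tiddler: accept all incoming
// fields, but preserve the local is_volatile value if the tiddler has
// already been "adopted". Assumes incomingFields is a plain object of
// field name/value pairs for a single external tiddler.
function mergeIncomingTiddler(incomingFields) {
  var title = incomingFields.title,
      local = $tw.wiki.getTiddler(title),
      overrides = {
        includeTimestamp: $tw.utils.stringifyDate(new Date())
      };
  // If a local copy exists and has been adopted, keep that decision
  if(local && local.fields.is_volatile && local.fields.is_volatile !== "yes") {
    overrides.is_volatile = local.fields.is_volatile;
  }
  // Precedence: local fields first (so local-only fields survive, too),
  // incoming values overwrite them, and the protected overrides win last
  $tw.wiki.addTiddler(new $tw.Tiddler(
    local ? local.fields : {},
    incomingFields,
    overrides
  ));
}
```

The only real point here is the precedence order: local fields, then incoming fields, then a small set of protected values.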
Optionally, too, perhaps @papiche (and others) would like a way to configure the import/load process so that it simply bypasses any existing local tiddler with the same name.
(That is not my preference — I love the dovetail solution — perhaps enhanced by letting the local copy protect some specified set of field-value data, including the `is_volatile` field. Also, I think there will be great uses for having local shadows, and even other system tiddlers, overridden by incoming tiddlers with the same name. This sounds like a great way to sandbox complex and powerful stuff, knowing that those external tiddlers, by default, won’t save.)
MORE COMPLEX VISION:
Ideally, some additional fields could also be user-configured to be SAFE from overwriting; certain local field values (say, tags, or “lending status” for a biblio record) could be important to retain, even when the “master” record has a conflicting version of that field.
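That per-field protection could even be driven by configuration rather than hard-coded. A rough sketch, assuming a hypothetical config tiddler whose text lists the protected field names (one per line):

```js
// Hypothetical config tiddler, e.g. with text "is_volatile\ntags\nlending-status"
var PROTECTED_CONFIG = "$:/config/external-content/ProtectedFields";

function getProtectedFields() {
  var text = $tw.wiki.getTiddlerText(PROTECTED_CONFIG, "is_volatile");
  return text.split("\n").map(function(name) {
    return name.trim();
  }).filter(Boolean);
}

// Inside the merge step, copy each protected field from the local tiddler
// (if it exists there) into the overrides object so incoming values cannot clobber it
function protectLocalFields(local, overrides) {
  getProtectedFields().forEach(function(fieldName) {
    if(local && local.fields[fieldName] !== undefined) {
      overrides[fieldName] = local.fields[fieldName];
    }
  });
  return overrides;
}
```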
Most ambitious (and probably not easy): figure out how to employ a “TWO-SIDED FILTER”! Right now the filter for external content is entirely directed at the remote source, and is evaluated only within that remote wiki: “Go look at your tiddlers and send me all your tiddlers that meet this filter condition.”
But we could also imagine a filter defined/parsed on the LOCAL side, so that the INTERSECTION of the two filters determines the import: “Look at these potential incoming tiddlers (selected by the filter we’ve already been using), and ignore all EXCEPT those that also meet this condition.” Say, we want to load only those that fit the `[is[missing]]` filter condition here, or all those remote tiddlers whose titles are listed in the vocabulary field of a local tiddler, etc.
(A rather cumbersome version of that intersection could already be achieved by starting with a bluntly enumerative local filter: I use a local filter like `[is[missing]!is[system]format:titlelist[]]` to produce a title list, and use that list of titles to start my import filter, going on to add further general constraints that can be parsed on the remote end. But this is clearly a very brittle way of getting at the intersection of local and remote filter criteria!)
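That said, the enumeration itself could be generated rather than pasted by hand. Here is a rough sketch of evaluating a filter locally and turning the result into an import filter for the remote wiki; `buildImportFilter` is my own hypothetical helper (and the `Glossary` tag is just an example constraint), not part of Saq’s solution:

```js
// Evaluate a filter locally, then turn the resulting titles into an explicit
// enumeration that the remote wiki can evaluate with no local knowledge.
function buildImportFilter(localFilter, remoteConstraints) {
  var titles = $tw.wiki.filterTiddlers(localFilter);
  // stringifyList wraps titles containing spaces in [[double square brackets]]
  return $tw.utils.stringifyList(titles) + " " + (remoteConstraints || "");
}

// Example: ask the remote wiki only for tiddlers that are missing here,
// further restricted by whatever can be evaluated on the remote end
var importFilter = buildImportFilter(
  "[is[missing]!is[system]]",
  "+[tag[Glossary]]"
);
```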
LAST THOUGHT:
Another configuration detail that could be helpful … and I can probably figure this one out by myself soon… is to make the `publishFilter` check whether the external tiddler has been modified SINCE its `importTimestamp`. (Or perhaps any edit to the tiddler should actually toggle the `is_volatile` field to “no”, so that our CSS will reflect its new status…) That way, naive users get the benefit of keeping the local wiki “lean” (without too much redundancy), while avoiding the risk of losing whatever fresh content they are adding to these remote-origin tiddlers. Maybe this reduces the demand for something like a toolbar button to “adopt” tiddlers — which would probably be used most frequently on tiddlers that need to have a future life with local edits.
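On that last point, here is roughly how I imagine the “modified since import” check could look in JavaScript. Treat the field names and the date handling as assumptions to be verified against how the plugin actually stores `importTimestamp`:

```js
// Returns true if a remote-origin tiddler has local edits newer than the last
// import, i.e. it should probably be treated as "adopted" / kept by publishFilter.
function modifiedSinceImport(title) {
  var tiddler = $tw.wiki.getTiddler(title);
  if(!tiddler || !tiddler.fields.importTimestamp || !tiddler.fields.modified) {
    return false;
  }
  // "modified" is already a Date object; importTimestamp is assumed to be stored
  // as a TiddlyWiki date string (e.g. 20240101120000000), so parse it for comparison
  return tiddler.fields.modified > $tw.utils.parseDate(tiddler.fields.importTimestamp);
}
```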