Following up on a few items here:
- I think (touch wood, fingers crossed, etc.) the performance problems are largely resolved. The thing that helped most was adding a robots.txt file to tell web crawlers not to relentlessly follow every filter and sort link on the “Explore” page. Since doing that, the CPU load chart has looked much healthier, and I haven’t seen the frequent slowness where it would take 10 seconds or so to display any page. As a bonus, I found and fixed a few other bugs while troubleshooting the performance.
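For anyone curious, the crawler-blocking rule looks roughly like the sketch below. The paths and query parameters here are illustrative assumptions, not a copy of Tiddlyhost’s actual robots.txt; the idea is just to stop well-behaved bots from crawling every combination of sort/filter URL while still letting them index the plain pages:

```
# Illustrative robots.txt sketch (paths/params are assumptions)
User-agent: *
# Block crawling of sorted/filtered variants of the Explore listing
Disallow: /*?*sort=
Disallow: /*?*filter=
```

Well-behaved crawlers (Googlebot, Bingbot, etc.) support the `*` wildcard in `Disallow` patterns, so each listing page gets crawled once instead of once per sort/filter combination.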
- Thanks for the feedback @Bearking. Perhaps a cheaper option for light users is something to consider. I was building the subscription code at roughly the time “Twitter Blue” launched at $8 a month, so that was an inspiration for the price point. Interestingly, the per-MB storage costs are very low, which is why I haven’t worried too much about data caps. What does contribute to costs is the network traffic to read and write data, and of course the cost of the server itself.
- @Mark_S would you mind if I added your .ps1 file to tiddlyhost/examples at main · simonbaird/tiddlyhost · GitHub? Or would you prefer to make a PR to do that yourself?
- @Mark_S There are a few things I check, but yeah, the potential for bad actors to cause trouble is something I worry about.
- @Springer (and others), thanks so much for sharing your enthusiasm for Tiddlyhost (here and elsewhere), and for the kindness and encouragement. It’s appreciated!
- @Springer again, I can run some queries and produce reports on those kinds of statistics (size, kind, etc.). Perhaps I’ll share something like that in the future; it might be interesting.
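As a rough sketch of what such a report query might look like: this assumes a hypothetical `sites` table with `kind` and `raw_byte_size` columns, which is not necessarily Tiddlyhost’s real schema:

```sql
-- Hypothetical schema: sites(kind, raw_byte_size)
-- Summarize site counts and sizes per kind of TiddlyWiki
SELECT kind,
       COUNT(*)           AS site_count,
       AVG(raw_byte_size) AS avg_size_bytes,
       MAX(raw_byte_size) AS max_size_bytes
FROM sites
GROUP BY kind
ORDER BY site_count DESC;
```

A grouped aggregate like this would give a per-kind breakdown without exposing anything about individual sites.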