About Browser Security Policies

Given that “security reasons” come up quite a lot, I do wonder IF we could open up that sealed package a bit.

  • What exactly are those blocks?
  • Do they restrict users’ free access?

FWIW I do find standard browsers somewhat frustrating now.
It is an oddity that via in-wiki addressing I can locate a file directly, but not otherwise.

Just a comment
TT

pmario: Modern browsers all enforce a shared set of core security rules designed to block malicious sites, protect user data, and prevent unauthorised code execution.

I let MS Copilot create a short summary, with links to references, which is shown below. The items are sorted by importance.

The most important rule for every browser is the Same-Origin Policy.

That’s also the one that dictates that file:// and https:// URLs are strictly separated.


copilot:

1. Same‑Origin Policy (SOP)

SOP prevents one website from reading or modifying data from another unless they share the same origin (scheme, host, port). This blocks most cross‑site attacks by isolating content from different sites and ensuring scripts cannot freely access each other’s data [1].
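The origin comparison itself is mechanical: two URLs share an origin only if scheme, host, and port all match. A minimal sketch in Python (illustrative only; real browsers implement this natively, and the hostnames below are made up):

```python
from urllib.parse import urlsplit

# Default ports per scheme, used when a URL omits the port.
DEFAULT_PORTS = {"http": 80, "https": 443}

def origin(url):
    """Return the (scheme, host, port) triple that defines a web origin."""
    parts = urlsplit(url)
    port = parts.port or DEFAULT_PORTS.get(parts.scheme)
    return (parts.scheme, parts.hostname, port)

def same_origin(a, b):
    """Two URLs are same-origin only if scheme, host, and port all match."""
    return origin(a) == origin(b)

print(same_origin("https://example.com/page", "https://example.com:443/api"))  # True: 443 is the https default
print(same_origin("https://example.com/", "http://example.com/"))              # False: scheme differs
print(same_origin("https://example.com/", "https://api.example.com/"))         # False: host differs
```

Note that the path plays no role: two pages on the same site are same-origin even if their URLs differ completely after the host and port.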


2. Sandboxing of Web Content

Browsers isolate tabs and processes so that malicious code in one site cannot access system resources or other tabs. Sandboxing limits the impact of exploits and helps contain malware or compromised pages [2].

3. HTTPS Enforcement & Certificate Validation

Browsers validate TLS certificates and warn/block when connections are insecure or tampered with. This protects against eavesdropping, man‑in‑the‑middle attacks, and spoofed websites by ensuring encrypted, authenticated communication [3].
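For illustration, Python’s ssl module applies the same two checks a browser performs on every HTTPS connection; this sketch only inspects the secure defaults rather than opening a real connection:

```python
import ssl

# A default-configured TLS context does what a browser does: the peer must
# present a certificate that chains to a trusted root (verify_mode), and the
# certificate's name must match the host we asked for (check_hostname).
context = ssl.create_default_context()

print(context.check_hostname)                      # True
print(context.verify_mode == ssl.CERT_REQUIRED)    # True
```

Disabling either check (as some scripts do with “verify=False”-style options) reopens exactly the man-in-the-middle window that browsers close by default.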

4. Content Security Policy (CSP)

CSP allows sites to declare which sources of scripts, images, and frames are trusted. Browsers enforce these rules to block unauthorized scripts, reduce XSS risk, and prevent loading of malicious resources [4].
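As a purely illustrative example (the CDN hostname is hypothetical, and a real header is sent on a single line; it is wrapped here for readability), a site might declare its trusted sources like this:

```
Content-Security-Policy: default-src 'self';
                         script-src 'self' https://cdn.example.com;
                         img-src 'self' data:;
                         frame-ancestors 'none'
```

With such a policy in place, a script injected from any other host is refused by the browser, which is how CSP mitigates XSS even when an attacker manages to get markup into the page.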

5. Safe Browsing / Malware & Phishing Protection

Browsers check visited URLs against constantly updated threat lists. If a site is known for malware, phishing, or harmful downloads, the browser warns or blocks access to protect users from common web threats [5].

6. Automatic Security Updates

Browsers automatically update to patch vulnerabilities quickly. This minimizes the window attackers have to exploit known issues and is considered one of the most effective protections for users [6].

If you want, I can also create a longer version, a visual diagram, or a developer‑focused explanation of how these rules interact.

References (6)

(1) C8: Leverage Browser Security Features – OWASP Top 10 Proactive Controls

(2) Top 10 Browser Security Best Practices for 2026

(3) Security on the web – MDN

(4) Use RPA to apply browser rules for secure content

(5) 6 Tips for Browser Safety – Microsoft Edge

(6) Steps to Secure Web Browsing – National Security Agency/Central …. https://www.nsa.gov/portals/75/documents/what-we-do/cybersecurity/professional-resources/csi-steps-to-secure-web-browsing.pdf


More details about SOP

pmario: An even stricter separation, introduced with FF 57, caused the TW saver problem, which led to the inconvenience we have all suffered since then.

But the restrictions are understandable from a developer’s point of view. It is f*&%$ dangerous.


copilot:

Browsers strictly separate https:// and file:// because mixing them would let any website read arbitrary local files, which is one of the most dangerous possible security failures. This separation is a direct enforcement of the Same‑Origin Policy and is intentionally rigid.

🔒 Why the separation is so strict

  • Local files contain highly sensitive data (passwords, SSH keys, browser profiles, documents). If a normal website could load file:// URLs, it could steal these instantly.
  • Browsers explicitly block navigation from web origins to file:// for this reason. Chrome and Edge forbid non‑file pages from opening or reading file URLs to prevent remote sites from probing the user’s filesystem [1].
  • file:// has no meaningful “origin”. All local files would share the same origin, so one malicious local file could read all others. Browsers treat each file as isolated to avoid this.
  • Historical attacks exploited this gap, so modern browsers hardened the rule: remote content cannot access local content, and local content cannot freely access remote content without explicit user action.
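The “no meaningful origin” point above can be seen directly: a file:// URL simply has no host or port component to compare. A quick illustrative check in Python (the path is made up):

```python
from urllib.parse import urlsplit

parts = urlsplit("file:///home/user/notes/wiki.html")
print(parts.scheme)            # 'file'
print(parts.hostname is None)  # True: there is no host component
print(parts.port is None)      # True: there is no port component

# With host and port both absent, the (scheme, host, port) triple would
# degenerate to ('file', None, None) for every local file on the machine,
# so browsers instead treat each file:// document as its own opaque origin.
```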

🧩 Bottom line

The strict separation prevents catastrophic data leakage. Without it, simply visiting a malicious webpage could expose your entire filesystem.

References (1)

(1) Restrictions on File URLs – text/plain


I understand, but the alternative is pretty terrifying.

You’re a regular visitor to happy.site. You’ve done your due diligence, and know that happy.site takes security very seriously. They protect their users’ privacy and have vetted their dependencies to ensure that those tools are also security-conscious. One of those tools is reliable.helper, known by happy.site for its excellent security. reliable.helper depends on dependable.utensil, which itself depends upon stable.gizmo, which depends on predictable.widget. All of these tools have excellent security reputation.

The main developer of predictable.widget has been getting tired of maintaining this open source tool, and is very happy to have help from a new contributor, with the odd handle, “Sir Lucia Mouse”. Over several months Lucia has made a number of improvements to predictable.widget and has been invited onboard the team, taking on more and more of the work. When the main developer decides to take a three-month holiday to Greenland, she appoints Lucia to be the chief maintainer of the project.

Not long after, happy.site is making changes and they update to the most recent release of reliable.helper. This eventually involves using the latest version of predictable.widget.

Unfortunately for all, “Sir Lucia Mouse” is an anagram of “Malicious User”, a nasty sort who’s been biding his time but has finally put some awful code into the tool, which now finds its way back up that dependency chain into your visit to happy.site.

At this point, I want to ask if you’re glad that the browser is preventing him from having any connection to your private files?

While the names have been changed to protect the innocent (guilty?), this is a realistic scenario. Browsers are often the last line of defense in such chains of events.

Thanks! I’m slowly beginning to understand (including the somewhat strange language of the directives :).

I appreciated the colourful story. Fun & instructive.


@pmario and @Scott_Sauyet I think it could be useful to try to open this up more. Why? Well, some of us use TW not just as a publisher but as a serious supplemental tool that needs easy access to files.

I’m constantly needing to open files (e.g. o/s scripts, plain-text editing) and execute programs that prepare texts for use in TWs.

My recent adoption of iframed TiddlyTools miniBrowsers from @EricShulman has helped me solve about half of the issues in a TW—that in the parent browser would get too complicated.

Q: What am I asking for? How to interact safely with external files from within TW.
(P.S. Is this why some folks Go Node?)

Best, TT

You can try running Chrome with some security features disabled, but this should only be a last resort and NOT used as your general-purpose browser.

Example for Windows, run in Command Prompt or via a shortcut:

"C:\Program Files\Google\Chrome\Application\chrome.exe" --allow-file-access-from-files --disable-web-security --user-data-dir="C:\chrome\not-safe-profile"

Thanks! Good to try!

For the main purpose of local file access could I have ONE browser instance that is completely “offline”? Would that help? Could the command string you suggest also set that?

Best, TT

I think @saq’s Chrome overrides are going to be the way to go; there may be similar flags for other browsers. But when you say it would be useful to “try and open this up more”, you’re talking about something TW cannot do. You can choose to override parts of the browser security model running on your machine, but TW, for extremely important security reasons, cannot request that on your behalf.

It actually is based on a true story.

Or a combination of several such incidents over recent years.