Humans can't read URLs. How can we fix it?

Jake leads this HTTP 203[1] episode with his reflections on how browsers currently display URLs, and how that could be improved, at least for security.

For advanced Web users like me, the request part of the URL helps locate the current page within the site, provided the URL follows a clear logic[2]:

Two URL examples, where the first one shows a clear content hierarchy

Safari unfortunately hides this part of the URL, even on desktop:

How browsers show URLs

For users with no technical knowledge of URL structure, being able to detect phishing attempts immediately would be a huge security improvement:

Phishing attempts are more obvious in Firefox

That's why I really like what Jake suggests: it makes the eTLD+1 obvious for security, but keeps the full URL alongside it when there's enough space:

Jake's suggestion to improve URL security
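To see why the eTLD+1 needs the Public Suffix List at all, here's a minimal sketch of how it can be computed: find the longest registered public suffix of the hostname, then keep one more label. The tiny suffix set below is a hypothetical excerpt for illustration; the real list at publicsuffix.org has thousands of entries.

```python
# Hypothetical excerpt of the Public Suffix List, for illustration only.
PUBLIC_SUFFIXES = {"com", "org", "co.uk", "github.io"}

def etld_plus_one(host: str) -> str:
    """Return the eTLD+1 of a hostname: the longest public suffix
    plus the one label immediately before it."""
    labels = host.lower().split(".")
    # Scanning from the full name down naturally finds the
    # longest matching suffix first ("co.uk" before "uk").
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in PUBLIC_SUFFIXES:
            if i == 0:
                raise ValueError(f"{host} is itself a public suffix")
            return ".".join(labels[i - 1:])
    raise ValueError(f"no known public suffix for {host}")

print(etld_plus_one("www.example.co.uk"))  # example.co.uk
print(etld_plus_one("jake.github.io"))     # jake.github.io
```

Note how `jake.github.io` is its own eTLD+1, because `github.io` is on the list: two GitHub Pages sites are treated as different origins for this purpose, which is exactly the security boundary Jake's proposed display would highlight.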

One thing Jake and Surma didn't talk about, though, is how the Public Suffix List that Mozilla maintains can keep growing without hurting browsers' performance, much like the HSTS Preload list.

  1. HTTP 203 is a great show where « Google Developers Jake Archibald and Surma discuss their philosophies about web development and the various aspects of it, meanwhile dropping in lifehacks, lessons and some honest truths ». ↩︎

  2. Like I try on this site… 😉 ↩︎