recursivedoubts 8 hours ago

as I said on twitter:

> the real red pill of the mcmaster website is what a mess it is: inline scripts, jamming stuff in window, a mix of fetch() & XHR, a back-end most web developers would be embarrassed to put on their resume (aspx), horrible RPC-style URLs, etc etc

and, despite all that, they have a site other developers marvel at, which should make anyone saying "you can't build a good website in X" pause and reflect

  • tshaddox 8 hours ago

    It's important to realize that we're not really marveling at the technical merits of what they've built. It's quite functional, but nothing particularly excellent. We're really just marveling at what isn't there, namely megabytes of tracking libraries, UI libraries, etc.

    • recursivedoubts 8 hours ago

      i see a lot of people marveling mainly at the speed, which is achieved through brotli compression on everything and optimized images, and secondarily at the usability, which is done mainly with YUI

      the backend is a bunch of opaque aspx routes, the last thing most web developers would reach for

      i think a takeaway should be that good web applications can be built on damned near anything if you focus on the things that matter

      • tshaddox 7 hours ago

        The most noticeable "fast" thing to me is the JavaScript page transitions. They are near-instant if you have hovered over the link long enough for them to prefetch everything.

        Initial page loads are not particularly fast (one reason for that might be the lack of server-side rendering).
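The hover-prefetch technique described above can be sketched as a small cache keyed by URL: start the fetch on `mouseover`, and on click reuse the in-flight or completed result so the transition feels instant. `createHoverPrefetcher` is a hypothetical helper name, not McMaster-Carr's actual code:

```javascript
// Returns an object whose prefetch() you wire to mouseover and whose
// navigate() you call on click. Each URL is fetched at most once; a click
// that follows a hover reuses the already-started (or finished) request.
function createHoverPrefetcher(fetchFn) {
  const cache = new Map(); // url -> result of fetchFn(url)
  return {
    prefetch(url) {
      if (!cache.has(url)) {
        cache.set(url, fetchFn(url)); // kick off the request, don't block
      }
      return cache.get(url);
    },
    navigate(url) {
      return this.prefetch(url); // instant if the hover already fetched it
    },
  };
}

// In a browser you might wire it up like this (assumed usage):
// const p = createHoverPrefetcher((url) => fetch(url).then((r) => r.text()));
// document.querySelectorAll("a").forEach((a) =>
//   a.addEventListener("mouseover", () => p.prefetch(a.href)));
```

The same idea now has a declarative browser form (`<link rel="prefetch">`), but a hand-rolled cache like this also lets the click handler swap the page content in without a full navigation.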

ramesh31 9 hours ago

None of this matters once your analysts get ahold of it and dump 10MB of JS snippets on the page. Frameworks don't make things fast. Any framework in existence can be cached out to CloudFront and made equally "fast" from a TTFB/FCP perspective. But it's meaningless if you're executing half a million LOC in the client on every page load, as just about every e-commerce site eventually does.