• Oliver Lowe@apubtest2.srcbeat.com · 8 months ago

    As someone who never did much web development, I was… surprised… at the amount of tooling that existed to paper over this issue. The headaches that stood out for me were JavaScript bundling (first you need to choose which tool to use: webpack, say, but that’s slow, so then you switch to esbuild) and minified code (which is hard to debug, so now you need source maps to reverse the situation).
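
    For a sense of what that dance boils down to, here is a minimal sketch using esbuild’s build API (the file names are hypothetical):

        // build.js: bundle, minify, and emit a source map in one go (run with `node build.js`)
        require('esbuild').build({
          entryPoints: ['app.js'],   // hypothetical entry point
          bundle: true,              // follow imports and produce a single file
          minify: true,              // the step that makes the output unreadable…
          sourcemap: true,           // …and the step that makes it debuggable again
          outfile: 'dist/out.js',
        }).catch(() => process.exit(1));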

    Of course the same kind of work needs to be done when developing programs in other languages. But something about developing in JS felt so noisy. Imagine if, to compile Java or Rust, you first needed to choose and configure your own compiler, each with its own website, fancy logo, and persuasive marketing copy.

    • jjjalljs@ttrpg.network · 8 months ago

      My hypothesis is that the JavaScript ecosystem has something like a 50:5:1 ratio of junior : intermediate : expert contributors.

      That’s why there’s so much “fuck it, I’ll make a new project from scratch!” and “screw conventions I’m making a breaking change”. Those are frankly very junior attitudes, and they feel more common in JavaScript land.

      I don’t have any data to back this, but it’s my guess.

      That, or the ecosystem is just so polluted with bad ideas that people aren’t learning good practices.

    • Sakychu@lemmy.world · 8 months ago

      Yeah, I hated web development whenever I had to do it. Web development as a whole feels half undercooked and half overcooked.

      • Oliver Lowe@apubtest2.srcbeat.com · 8 months ago

        So-called “backend” I was OK with. HTTP is well specified. But it’s too general a protocol for what it’s being used for, so you’re stuck implementing the same stuff over and over again. When using SMTP or NNTP you realise how much work the protocol does for you when building systems on top of it.

        But “frontend”… Jesus, talk about abusing something that was never designed to be used the way it is. A total nightmare in my opinion! UIs that are totally inconsistent in appearance and behaviour have somehow become the norm!

        • frezik@midwest.social · 8 months ago

          I don’t know about that. Frameworks like React or CSS toolkits have made things more consistent across browsers. Being rid of Internet Explorer helped, too. Things were way worse 15 years ago.

          Now, making the browser into a quasi operating system might not have been a good idea, but that’s a different argument.

    • lolcatnip@reddthat.com · 8 months ago

      I switched jobs from one using a mostly C++ stack to one using a TypeScript/JavaScript stack for a large application. I was absolutely shocked at how slow and generally shitty the tooling for JavaScript is, and coming from C++ land the bar was already very low.

  • SuperSpruce@lemmy.zip · 8 months ago

    I read this article a few weeks ago and it sent me down a rabbit hole of web performance articles.

    I think a good budget for basic websites (articles, landing pages, and small- to medium-functionality web apps) is what I call the “GZ250”: 250 KB of gzipped JavaScript, which is more than plenty. I picked this amount so that yesterday’s budget phones can load the website in a few seconds at 1 Mbps (and the name references my motorcycle).
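
    The arithmetic behind that budget, as a rough sketch (transfer time only; decompression, parsing, and execution come on top):

        // Back-of-the-envelope check of the "GZ250" budget (a sketch, not a benchmark)
        const budgetKB = 250;                            // gzipped JavaScript budget
        const linkMbps = 1;                              // yesterday's budget phone on a slow link
        const seconds = (budgetKB * 1024 * 8) / (linkMbps * 1_000_000);
        console.log(seconds.toFixed(1) + ' s');          // ≈ 2.0 s just to download the script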

    For comparison, my full-on games take way less than that. The Unscaled Incremental and Elemental Incremental are 52 KB and 19 KB of compressed JS respectively, and v1.0 of my new deckbuilding game is about 27 KB. The unreleased v1.1 is massive but will still be around 50-60 KB of compressed JS.

    I don’t understand how an article page needs 60x as much script as my games, but even cutting back to 6x would be a win for accessibility and efficiency.

    • Oliver Lowe@apubtest2.srcbeat.com · 8 months ago

      Did you see this article by Dan Luu? https://danluu.com/slow-device/

      Super interesting. It’s a discussion from a point of view I hadn’t considered before: how bandwidth has increased much faster than the CPU performance available to run web apps. I felt this in a way, as my main computer until recently was a mini PC with an Intel i5-5250U processor. Despite my Internet connection going from a 10 Mbps link to a 300 Mbps link, and pings dropping from 25 ms to <5 ms, browsing the web on the device became unbearable.

      • SuperSpruce@lemmy.zip · 8 months ago

        Interesting, it kinda feels like the opposite is true for me, at least on mobile. In 4 years I’ve gone from a 1.4 GHz A53 SD425 to a 2.2 GHz A78 SD695 SoC, roughly a 6x increase in single-thread performance. During that time I also got a powerful laptop with a Ryzen 9 5900HX CPU.

        Meanwhile, it’s still not unusual to see my Internet speeds drop below 1 Mbps, often hovering around 100-300 Kbps, on mobile data or crappy university WiFi, which sometimes has a ping of, no joke, 20,000+ ms on my laptop when running Ubuntu. I can sometimes reach throughput of up to 100 Mbps, but when I don’t, my Internet speeds often chug.

  • Quetzalcutlass@lemmy.world · 8 months ago

    > How big is 10 MB anyway?
    >
    > To be honest, after typing all these numbers, 10 MB doesn’t even feel that big or special. Seems like shipping 10 MB of code is normal now.
    >
    > If we assume that the average code line is about 65 characters, that would mean we are shipping ~150,000 lines of code. With every website! Sometimes just to show static content!
    >
    > And that code is minified already. So it’s more like 300K+ LoC just for one website.

    An important takeaway, as I feel byte size can be hard for people to visualize intuitively. And for those who didn’t read the article: many of the sites tested sent significantly more than 10 MB of JS, even sites containing nothing more than simple input boxes that should be doing all their processing server-side.
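
    A quick check of the arithmetic in the quoted passage (a sketch; the 65-characters-per-line average is the article’s assumption):

        const payloadBytes = 10 * 1000 * 1000;                 // 10 MB of shipped, minified JavaScript
        const avgLineChars = 65;                               // the article's assumed average line length
        console.log(Math.round(payloadBytes / avgLineChars));  // ≈ 153,846, i.e. the ~150,000 lines quoted
        // Minification strips whitespace and shortens identifiers, so the original
        // source is plausibly at least double that: the "300K+ LoC" figure above.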

    I want to see the difference with ad-block enabled. Analytics and tracking are certainly complex enough to account for a lot of that payload. Same with an addon like Decentraleyes to see how much is bloated frameworks that could easily be cached locally.

  • Kimusan@feddit.dk · 8 months ago

    It’s just JavaScript: the bloat part is implicitly there when talking about JS/TS.

  • onlinepersona · 8 months ago

    WASM still hasn’t convinced enough people to drop JS and write their website in something other than JS/TS. Maybe someday…

    CC BY-NC-SA 4.0

    • Bourff@lemmy.world · 8 months ago

      Isn’t DOM manipulation notoriously tedious with WASM? That seems like quite a showstopper for most client-side JS, I’d say.

      • onlinepersona · 8 months ago

        Why use DOM manipulation when you can use WebGL? (half-joking, it’s what Qt does)

        On a serious note, there are Rust frameworks (Yew and Leptos, for example) that generate all the DOM manipulation for you. No need to touch JS, or the DOM from JS.

        CC BY-NC-SA 4.0

        • Oliver Lowe@apubtest2.srcbeat.com · 8 months ago

          I imagine part of the challenge going forward would be the hordes of programmers brought up on designing UIs using a DOM, and all the associated tooling.

          My prediction is the situation could be similar to how, today, many text-only programs assume a terminal-like device. Terminals have been obsolete for years, but I personally feel the model is a ball and chain on text UI development. The web document model could persist in the same way, long after web browsers have become just a kind of “terminal” for loading and rendering web documents.

        • Bourff@lemmy.world · 8 months ago

          Got it, but if you expect people to switch from JS to Rust, you’re going to be disappointed. That’s like asking people who just got their driving license to hop into a fighter jet just because it’s faster. JS is a simple language. Its widespread adoption is due not only to its ubiquity but also to it being pretty easy to learn. Rust, on the contrary, not so much.

    • SuperSpruce@lemmy.zip · 8 months ago

      I still have no idea what WASM really is. I’ve tried looking at articles but it still confuses me. I know how to use HTML, CSS, JS, and actual ARM assembly language at a basic level, but I don’t see how any of this could be used with WASM.

      • onlinepersona · 8 months ago

        WASM is just like assembly. It has instructions similar to MOV, JMP, STA, etc. It can be distributed as the textual instructions or as the compiled binary format.

        When it started it was interpreted by JS or could be compiled to JS directly. It proved to be faster than hand-written JS, but it still had to go through a JS interpreter. Now there’s a WASM interpreter / virtual machine built into browsers. It’s very much the new Java bytecode, but without running an unsandboxed, external (outside of the browser) Java virtual machine.
        Given it’s an interpreter / virtual machine, it of course has limited APIs in the browser. For a while it was not possible to access the DOM from WASM, so JS would do the DOM stuff and WASM was called upon (just like calling an external function in a library from C/C++/Rust/…) to do computationally complex work, since it was faster than running it in JS through the JS interpreter. If I’m not mistaken, WASM now does have access to the DOM.
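
        To make that concrete, a minimal sketch of the JS side of this arrangement (the module name demo.wasm and its exported add function are hypothetical):

            // Fetch, compile, and instantiate a WASM module, then call one of its exports.
            // Runs in a browser as an ES module (top-level await).
            const { instance } = await WebAssembly.instantiateStreaming(
              fetch('demo.wasm'),
              {} // import object: anything the module expects from the JS side
            );
            console.log(instance.exports.add(2, 3)); // executes inside the browser's WASM VM
            // In this sketch any DOM work would still happen in JS glue like this file,
            // with values passed across the imports/exports boundary.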

        Of course there are WASM interpreters outside of browsers that can be included as libraries in other languages. Rust devs are using it for example for plugin systems in their software.

        CC BY-NC-SA 4.0

    • ReversalHatchery@beehaw.org · 8 months ago

      Honestly I’m better off this way, personally. At least JavaScript is text, very often readable after pretty-printing, and debuggable as a user. I’m not comfortable with loading basically opaque binaries for websites.

      • samc@feddit.uk · 8 months ago

        Isn’t production JavaScript usually minified/obfuscated to make it hard to read?

        Also, WASM is actually bytecode, which I believe has a 1:1 conversion to a text-based format called WAT.

        I agree with your main point, though: it’s kinda creepy when you realise just how much of other people’s code we’re expected to allow to run on our machines.
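
        To illustrate the minification point above, a small hand-made sketch (hypothetical code, not the output of any particular tool):

            // Readable source:
            function totalPrice(items) {
              return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
            }

            // Roughly what gets shipped: same behaviour, names and whitespace gone.
            function t(n){return n.reduce((r,e)=>r+e.price*e.quantity,0)}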

        • ReversalHatchery@beehaw.org · 8 months ago

          > Isn’t production JavaScript usually minified/obfuscated to make it hard to read?

          Somewhat, but often it’s still readable. Or maybe I just don’t look at it often enough to notice the worst cases…

          > Also, WASM is actually bytecode, which I believe has a 1:1 conversion to a text-based format called WAT.

          That’s good to be aware of, thanks!