Not my blog, but the author’s experience reminded me of my own frustrations with Microsoft GitHub.

  • marcos@lemmy.world · 5 months ago

    There’s a reason they do that: Files can get big

    Oh, boy. Wouldn’t it be great if servers had a way to discover the size of the files on their storage without having to read them?

    adding various code highlighting and interactivity costs performance

    Somebody, quick, there’s work to be done on language theory so that we learn how to do those things with a cost just proportional to the file size!

    (No way! Who is that Chomsky guy you keep telling me about?)
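    For context on the first quip: a server can read a file's size from filesystem metadata without reading the file's contents at all. A minimal sketch, assuming a Node.js environment with a hypothetical path and an arbitrary size cutoff:

    ```typescript
    import { statSync } from "node:fs";

    // Hypothetical path; statSync reads filesystem metadata, not the file body.
    const path = "/repos/example/large-file.csv";
    const { size } = statSync(path);

    // Arbitrary cutoff for illustration: decide up front how to treat the file.
    const RENDER_LIMIT = 5 * 1024 * 1024; // 5 MiB
    if (size > RENDER_LIMIT) {
      console.log(`${size} bytes: skip rich rendering and offer the raw file instead.`);
    } else {
      console.log(`${size} bytes: small enough to highlight and render fully.`);
    }
    ```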

    • bitfucker · 5 months ago

      Dude, his point is that if you don't implement partial rendering for a big file, the browser has to work extra hard to render that shit. Not to mention that any client-side interactivity, like variable highlighting that needs to be context-aware for each language, basically turns your browser into VSCode; at that point, just launch the browser-based VSCode with the "." shortcut.

      It's not a server-side problem; it's a client-side one.
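      To make the partial-rendering point concrete, here is a rough line-virtualization sketch: only the lines inside the scroll viewport get DOM nodes, so per-frame work tracks the window height rather than the file size. This is an illustrative sketch, not GitHub's actual renderer; the function names and fixed line height are assumptions.

      ```typescript
      // Render only the visible slice of a large file's lines.
      const LINE_HEIGHT = 20; // px, assumes fixed-height lines

      function renderVisibleLines(container: HTMLElement, lines: string[]): void {
        const first = Math.floor(container.scrollTop / LINE_HEIGHT);
        const count = Math.ceil(container.clientHeight / LINE_HEIGHT) + 1;
        const visible = lines.slice(first, first + count);

        // The outer div keeps the scrollbar sized for the whole file,
        // even though only ~count lines exist in the DOM at any moment.
        container.innerHTML =
          `<div style="height:${lines.length * LINE_HEIGHT}px;position:relative">` +
          visible
            .map((text, i) =>
              `<div style="position:absolute;top:${(first + i) * LINE_HEIGHT}px">` +
              `${text.replace(/</g, "&lt;")}</div>`)
            .join("") +
          "</div>";
      }

      // Re-render on scroll; work per frame depends on viewport height, not file length.
      // container.addEventListener("scroll", () => renderVisibleLines(container, lines));
      ```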

    • lmaydev@lemmy.world · 5 months ago

      Don’t really get your point here.

      They virtualize the file because it’s big. They know the size.

      The cost does indeed scale with the size of the file. That's exactly the problem.