Note: I refer to the scientists in this post by last name, not because I think I am their ‘peer’ but because that’s how the English language works; if I put ‘Mr’ before every last name, I’d sound like Consuela asking for Lemon Pledge! I am 30 and will turn 31 in less than 20 days, which makes me the same age as the Eternal September. I was born with the web, but I hope the Web dies before me!

Also, if you don’t know who Alan Kay is, don’t be distraught or feel like an ‘outsider’ (especially if you are not much into the ‘science’ side of programming). Just know that he’s a very important figure in CS (he is; you could look him up, perhaps?).

Now, let me explain what Kay told me.

Basically, Kay thinks the WWW people ignored the work of people like Engelbart and the NLS system, and that this was a folly.

Doug Engelbart, before UDP was even thought of and TCP was a twinkle in the eyes of its creators, before Ethernet was created, back when packet switches were LARGE minicomputers and ARPA was not yet DARPA, tried his hand at sending media across a network (the aforementioned ARPA’s network; you may know the name from /usr/include/arpa). He even managed to do video conferencing. That was in the 1960s! He came up with a ‘protocol’, and this ‘protocol’ was not the TCP/IP stack we know today; it was completely different. I don’t think he even called it a protocol. The name of this system was the ‘oN-Line System’, or NLS.

Engelbart’s NLS was different from the 4-layer abstraction we know and love today. It was different from the web. It was more like sending ‘computations’ across. Like a Flash clip! Kay believes the WWW people should not have sent a ‘page’; they should have sent a ‘process’. He basically says: “You’re on a computer, goddammit, send something that can be computed, not plain text!”

Full disclosure: yes, Kay is too brutal to Berners-Lee in that answer, I don’t deny that, and his ‘doghouse’ analogy is a bit reductive. But I digress.

Kay believes the TCP/IP stack is sound. I think anyone who has read a networking textbook (like Computer Networking: A Top-Down Approach, which I have recently read through) wouldn’t dispute this. But he believes people are misusing it, abusing it, or not using it right.

In the speech I am referring to in the question title, Kay said:

[Paraphrasing] “This is what happens when we let physicists play computer […] If a computer scientist were to create the ‘web’, he would build a pipeline, ending at X.”

X refers to the X Window System used on UNIX systems; it’s a standard, like the Web is. The implementation of X11 on my system is Xorg, and it’s slowly being replaced by Wayland.

So what does this mean? Well, Kay says: ‘send a process that can be piped’! Does that sound dangerous and insecure? WELL, DON’T ELEVATE ITS ACCESS!

Imagine if this process-based protocol were also called ‘web’, and the utility to interface with it were called ‘wcomm’, just as wget is the utility for HTTP.

Imagine, too, that PostScript were strong enough to describe video. Then we could fetch a video from YouTube, render it, and watch it:

$ wcomm youtube.com/fSWmufgTp6EQ.ps | mkmpg | xwatch

So what is different here? How is this different from using a utility like ytdl and piping its output into VLC?

Remember, when we do that, we are getting a binary file. But in my imaginary example, we are getting a ‘video description’ in the form of PostScript.
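To make the contrast concrete, here is a hedged sketch. The first line uses real tools (yt-dlp can write the stream to stdout and mpv can play from stdin; the URL is only a placeholder), while the second repeats the imaginary wcomm pipeline, where what travels over the wire is a small program describing the video rather than an encoded byte stream:

$ yt-dlp -o - 'https://youtube.com/watch?v=...' | mpv -     # today: a binary media stream piped into a player
$ wcomm youtube.com/fSWmufgTp6EQ.ps | mkmpg | xwatch        # imagined: a PostScript-like description, rendered locally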

====

So anyway, as I said, I am no expert myself, but I think this is what Kay means. As Kay says himself, PostScript is too weak for today’s use! But I guess, if the Web were not this ‘hacky’, there would be a ‘WebScript’ that could do all this.

Thanks.

  • @[email protected]
    link
    fedilink
    6
    edit-2
    4 months ago

    Let’s rewind the time machine… the original web concept was addressing a much smaller problem: that a document could reference, or ‘link’ to, another document. It came right after ‘gopher’, which was used as an index of indexes and had a text-based app that let you navigate the line items back and forth.

    Add to that Ted Nelson’s (much older) idea of ‘hyperlinks.’ The original web mashed those two ideas together and threw in a sprinkling of SGML. There was no notion of styling the presentation, a GUI, use of the mouse, multimedia, animation, or ‘scripting.’ It was just gopher with inline links, expressed in embedded markup.

    Multiple other players (Netscape, Microsoft, IBM, et al) morphed and bolted on extensions without really considering the consequences. The thing the web had going for it was precisely this decentralized process. It made for rapid evolution, but it also meant there was (and continues to be) a lot of fragmentation. Anyone wanting to go back and revisit something hacky had a lot of legacy inertia to overcome.

    So here we are today. It’s a messy, junkyard jalopy, but it does just enough that nobody has the time or energy to go back and clean up the technical debt. And if you want to start from scratch, it has to do much better than what is there today, while offering a reason for millions of people to unlearn what they know.

    As for sending ‘processes’, that’s essentially what a VM is. You’re sending a compact process as code (JavaScript, Python, WASM, a native binary) that a local runtime executes. We have app stores that manage the lifecycle, and script libraries to create abstractions and hide details. The embedded JavaScript VM is as close as we have to a universal code-execution environment.
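    As a hedged sketch of that ‘ship a process, run it locally’ idea with today’s tooling (the URL and module name are made up; wasmtime is a real standalone WebAssembly runtime, and this assumes the module targets WASI):

    $ curl -sL https://example.com/hello.wasm -o hello.wasm   # the 'process', delivered as portable code
    $ wasmtime hello.wasm                                     # executed by a local runtime rather than on the server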

    Sending ‘processes’ around also has to account for malicious actors trying to do bad things. We’ve all seen how that ended up.

    That’s not to say people shouldn’t try to innovate, but at this point, it’s like trying to reinvent driving or the telephone.

    • ChubakPDP11+TakeWithGrainOfSaltOP
      3
      4 months ago

      Wow, are you from the future? Because I just had this exact same thought, that JS is precisely that ‘process’, so I read the ECMA-262 standard and posted a new thread about something funny I found in it. In fact, I said something that closely resembles what you said. It’s just freaky!

    • @[email protected]
      link
      fedilink
      13 months ago

      The issue is the blurry interface between client and server in today’s “web”. I can create a local HTML file with JS running applications, but the second it wants to do anything like run a server, the big bad protocol blocks it. It’s almost like these big web companies use security as a guise for ensuring they hold your data.

  • @[email protected]
    link
    fedilink
    34 months ago

    I was shocked that the web people and the browser people had apparently taken no heed of much better visionary work in the past, which could have made a big difference in how things went and now are.

    Three big examples were (a) Doug Engelbart’s NLS system (and even more important: Engelbart’s visions about collaborations and communications), (b) Apple’s HyperCard system, which was both really good as it existed in the late 80s, and more importantly: showed a path for how the web could be matured to the benefit of all users, and (c) how PostScript solved important systems problems.

    Instead the WWW went for simple text based markup docs, and the web browsers were generally even worse because they concentrated on consumption rather than authoring.

    Maybe. But it works good enough for scraping, no?

    Authoring with HTML is indeed horrible, barring 1990s-style basic webpages.

    • ChubakPDP11+TakeWithGrainOfSaltOP
      3
      4 months ago

      Good point, me lad. The plain-text-based approach indeed makes scraping much easier. Plus, if we send a ‘process’, that process can easily be malicious, even if we don’t elevate its access.

      Just imagine it today: I tell you to wget a shell script and pipe it to your shell to install my software from my remote (FTP, Git, etc.). This almost always needs sudo. I can’t imagine how many 12-year-olds would be fooled into a sudo rm -rf *.
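      Spelled out, that pattern looks something like this (the URL is only a placeholder; the second line is the marginally safer habit of downloading and reading the script before running it):

      $ wget -qO- https://example.com/install.sh | sudo sh    # remote code piped straight into a root shell
      $ wget -q https://example.com/install.sh && less install.sh && sudo sh install.sh    # at least read it first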

      And that is assuming a 12-year-old would even know how to run such a pipeline in the first place. I know several people who began their UNIX journey as young as 7 or 8, but there’s a reason those people earn 500k a year at 30! I can’t imagine your normie aunt really feeling like using a UNIX pipeline to check her email.

      HTTP ‘just werks’. Derpcat told me this back in 2010, when I said I hated HTTP. IT JUST WERKS. Kay’s solution, although extremely unbaked, would not let my mom read her Instagram feed.

      Besides money, the computation cost is also high. Kay used minicomputers; we poor people used micros (that is, had I been around when the mini/micro distinction still existed; today it’s just clusters vs. Jimmy’s gaming rig. Oh, where art thou, DEC?).

      But again, nobody has given it a thought. THAT IS THE ISSUE. Academic writing on alternatives to the web is, AFAIK, rare. Part of it is the ‘just werks’ thing, but also, academia just does not care about the web.

      I think if people who are smarter than me gave this a ‘thorough’ thought, they would come up with a good solution. The Web won because it was ‘open’ and easy to navigate, as opposed to pesky newsgroups and the like. You can still visit the first website to see this: http://info.cern.ch/ (browse it with Lynx or W3M; it’s the best way to do it! Don’t use FF or Chrome).
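      For instance, with real tools (this also shows why the plain-text markup is so scrape-friendly):

      $ w3m -dump http://info.cern.ch/                           # render the first website as plain text in the terminal
      $ curl -s http://info.cern.ch/ | grep -oE 'href="[^"]*"'   # pull the links out with nothing but grep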

      I dunno!

      • @[email protected]
        link
        fedilink
        24 months ago

        What I think is that HTML was indeed easy to write back in the days when everyone used plain-text editors. If you needed a dedicated editor, you could only hope your computer vendor would provide one. There were, IIRC, a few OSes other than Windows.

        One good thing about HTML is that the browser can ignore tags it doesn’t know and still present the correct body text to the user. That was crucial, because open standardization was rare at the time; instead, there was the browser war. IE invented horrific tags no other browser could understand. Netscape even invented JavaScript while IE was stuck with static text.
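        A tiny hedged illustration (w3m is a real text browser; the tag name is made up): the unknown element is simply skipped and its text still renders.

        $ printf '<p>Hello <madeuptag>world</madeuptag>, the text survives.</p>' > demo.html
        $ w3m -dump demo.html   # prints the sentence; the unknown <madeuptag> is ignored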

        CSS also came along, with the idea that HTML should focus on the textual information while CSS handled the visual design. And they did a really good job of making it hard to… center the text.

        • @jadero
          2
          4 months ago

          CSS also came along, with the idea that HTML should focus on the textual information while CSS handled the visual design.

          My biggest beef with CSS is that it’s on the wrong end of the wire. Whatever happened to the idea that the client is in charge of rendering?

          Or maybe it’s that the clients have abdicated their responsibility: the browser included with OS/2 Warp had a settings page that let me set the display characteristics of every tag in the spec. Thus, every site looked approximately the same: my font, my sizes, my indents, my spacing, whether images displayed (or even downloaded, I think) and whether text split at an image or wrapped around it. And it’s not like I had to customize everything for each site: if you used a tag my browser recognized, my browser took over.

  • @[email protected]
    link
    fedilink
    English
    14 months ago

    Remember, when we do that, we are getting a binary file. But in my imaginary example, we are getting a ‘video description’ in the form of PostScript.

    The example is a reference/link with handling instructions.

    As for video metadata, we already have and use formats for that, and they are more general and versatile.

    Maybe I’m missing the point, but sending instructions is a very different and restricted approach.

    I like that we describe data and how it shall render. That way, the data is accessible for various interpretations and uses.