I just searched online and was taken aback by the lack of content I could find. There are millions of videos on all sorts of small niche things by hundreds of people who are right and wrong about stuff, but the most I could even find about how chips are made today is videos explaining how silicon works etc. LTT is the only one that even has a factory video, and it too is very censored, uninformative, and useless for my questions.

1 - I get that light is flashed in binary to code chips but how does it actually fookin work? What is the machine emitting this light made of? How does this flashing light hold as data forever on the chip?

2 - How were programs, OSs, kernels etc. loaded onto CPUs in the early days, when there were no additional computers to feed them in like today?

3 - I get the internet is light storing information but how? Fookin HOW?

4 - How did it all come to be like it is today, and is it possible for one human to even learn how it all works, or are we just limited to one or two things? Like, can we only know how to program or how to make hardware, but not both or all?

5 - Do we have to join Intel first or something to learn how most of the things work lol? Cause the info available online about the software, hardware, skills etc. is shit. Not even RISC-V documentation is available.

Context - Just started learning Python and got philosophical about how all these things came to be. Is just making apps or websites even a thing worth learning in the grand scheme of things? I get that some people are just okay with that, but come on, have you never thought about how deep you can go?

Anyway, feel free to tell me to stfu, and I'm sorry if this is the wrong sub; I'll move it on request. And as the username suggests, I'll be posting questions as I have them. Thanks.

ALSO ELI5 everything please

  • Justin@lemmy.jlh.name · 7 months ago

    1. These days the machines used to pattern chips (flash light onto silicon wafers to define their circuits) are mostly made by ASML. The most modern machines are the ASML Twinscan NXE and Twinscan EXE. The raw silicon is coated with chemicals that react to light, and when the light patterns are flashed onto the wafer, the exposed areas can then be chemically etched away or built up, leaving physical arrangements of atoms on the silicon that form complex electrical circuits.

    2. CPUs were literally drawn by hand, and then the drawing was shrunk down with a magnifying glass back in the day. Programs could be written into electrical memory with physical switches (think 100 light switches), punch cards, or electric typewriters. You could pause the computer so that it would wait for you to type in the next program for it to run. By the time we had kernels, we already had large memory banks in the kilobytes that could store the OS between program runs. So you’d type in the OS once when you turned on the computer, and it would keep it in memory until you turned the computer off again.

    3. The internet is different computers connected together. This website is just data sitting on a server somewhere, and your computer connects to the server over the internet and asks for the data (see the sketch after this list).

    4. Everything is built on the shoulders of giants. There is plenty to learn, but there will always be something you don’t know.

    5. There’s tons of information online if you know where to look. There are also some good courses out there to understand more specific things like CPU design, networking, programming, etc. In university these sorts of questions fall into the field of Computer Engineering, if you’re looking for a university program to get into.
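
    Regarding point 3: here is a minimal Python sketch (standard library only) of what "asking the server for the data" looks like at the HTTP level. A real browser sends more headers and usually speaks encrypted HTTPS, so treat this as an illustration of the idea rather than exactly what your browser does.

        import socket

        # Open a TCP connection to a web server (example.com is a real test domain).
        host = "example.com"
        with socket.create_connection((host, 80)) as conn:
            # HTTP is just text: ask for the page at "/" and say which site we want.
            request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
            conn.sendall(request.encode("ascii"))

            # Read whatever the server sends back until it closes the connection.
            response = b""
            while chunk := conn.recv(4096):
                response += chunk

        # The reply is also just text: a status line, headers, then the HTML of the page.
        print(response.decode("utf-8", errors="replace")[:200])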

    With regards to the limits of programming: Making websites is already challenging enough, but the cutting edge can be rewarding too :) Software Engineering is a massive field with infinite opportunities. Start small and work your way towards more complex projects with larger teams.

    Here’s a good 20 minute video about the history of making microchips: https://youtu.be/Pt9NEnWmyMo

  • litchralee@sh.itjust.works · 7 months ago

    1 - I get that light is flashed in binary to code chips but how does it actually fookin work? What is the machine emitting this light made of?

    This video by Branch Education (on YouTube or Nebula) is a high-level explanation of every step in a semiconductor fab. It doesn’t go over the details of how semiconductor junctions work, though. That sort of device physics is discussed in this YouTube video by Ben Eater, "how semiconductors work".

    2 - How were programs, OSs, kernels etc. loaded onto CPUs in the early days, when there were no additional computers to feed them in like today?

    When the CPU powers up, typically the very first thing it starts to execute is the bootloader. Bootloaders vary depending on the system, and today’s modern Intel or AMD desktop machines boot very differently from their 1980s predecessors. However, since the IBM PC laid the foundation for how most computers booted up for nearly four decades, it may be instructive to see how it worked in the 80s. This WikiBook on x86 bootloading should be valid for all 32-bit x86 targets, from the original 8086 to the i686. It may even be valid beyond that, but then UEFI started to take off, which changed everything into a more modern form.
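
    As a toy illustration of that legacy (pre-UEFI) process: the firmware reads the first 512-byte sector from the boot disk, checks for the 0x55 0xAA signature in its last two bytes, copies it to a fixed address, and jumps to it. A real BIOS does this in machine code; the Python sketch below (with a made-up "disk image") only mimics the logic.

        SECTOR_SIZE = 512
        LOAD_ADDRESS = 0x7C00  # where a legacy BIOS places the boot sector

        def load_boot_sector(disk_image: bytes) -> tuple[int, bytes]:
            """Mimic what a legacy BIOS does at power-on."""
            sector = disk_image[:SECTOR_SIZE]
            if len(sector) < SECTOR_SIZE or sector[-2:] != b"\x55\xaa":
                raise RuntimeError("no bootable device")  # a BIOS would try the next disk
            # The BIOS copies the sector to 0x7C00 and jumps there; from then on the
            # bootloader's own code is in charge of loading the rest of the OS.
            return LOAD_ADDRESS, sector

        # A do-nothing "bootloader": 510 zero bytes plus the magic signature.
        fake_disk = bytes(510) + b"\x55\xaa"
        addr, code = load_boot_sector(fake_disk)
        print(f"would jump to {hex(addr)} and run {len(code)} bytes of boot code")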

    But even before the 80s, computers could have a program/kernel/whatever loaded using magnetic tape, punch cards, or even by hand with physical switches, each representing one bit.

    But how does the computer decode this binary “machine code” into instructions to perform? See this video by Ben Eater explaining machine instructions for the MOS 6502 CPU (circa 1975). The age of the CPU is not important; rather, by the 70s the basics of CPU operation had already been laid down, and that CPU is easy to explain yet non-trivial.
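
    The core idea that video builds up to is the fetch-decode-execute loop. Here is a heavily simplified Python sketch of that loop for a made-up four-instruction machine; the opcodes are invented for illustration and are not real 6502 encodings.

        # A toy machine: 256 bytes of memory and one accumulator register.
        LOAD_IMM, ADD_IMM, STORE, HALT = 0x01, 0x02, 0x03, 0xFF  # invented opcodes

        def run(memory: bytearray) -> None:
            acc = 0   # accumulator
            pc = 0    # program counter: address of the next instruction
            while True:
                opcode = memory[pc]                      # fetch
                if opcode == LOAD_IMM:                   # decode + execute
                    acc = memory[pc + 1]; pc += 2
                elif opcode == ADD_IMM:
                    acc = (acc + memory[pc + 1]) & 0xFF; pc += 2
                elif opcode == STORE:
                    memory[memory[pc + 1]] = acc; pc += 2
                elif opcode == HALT:
                    return
                else:
                    raise RuntimeError(f"unknown opcode {opcode:#x} at {pc:#x}")

        # Program: load 2, add 3, store the result at address 0x20, halt.
        mem = bytearray(256)
        mem[:7] = bytes([LOAD_IMM, 2, ADD_IMM, 3, STORE, 0x20, HALT])
        run(mem)
        print(mem[0x20])  # prints 5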

    3 - I get the internet is light storing information but how? Fookin HOW?

    The mechanics of light bouncing inside a fibre optic cable are well explained in this YouTube video by engineerguy. But an explanation of how ones-and-zeros get converted into light to be transmitted is a bit more involved; I might just point you to the Wikipedia page for fibre optic communications.

    How the data is encoded is important, as it has a significant impact on bandwidth and data integrity, not just for light but also for wireless RF and wireline transmission. For wireless, this Branch Education video on Starlink (YouTube or Nebula) is instructive. And for wired, this Computerphile YouTube video on ADSL covers the challenges faced.
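
    To give a flavour of what "encoding" means at the lowest level, here is a small Python sketch of two classic line codes: NRZ, where the signal level simply follows the bit value, and Manchester coding, where every bit gets a transition in the middle so the receiver can recover the clock. Real fibre and DSL systems use far more sophisticated schemes; this is purely an illustration.

        def nrz(bits: str) -> list[int]:
            """Non-return-to-zero: the signal level follows the bit value."""
            return [int(b) for b in bits]

        def manchester(bits: str) -> list[int]:
            """Manchester code (using the convention 1 -> low,high and 0 -> high,low).
            The guaranteed mid-bit transition keeps the receiver in sync."""
            out = []
            for b in bits:
                out += [0, 1] if b == "1" else [1, 0]
            return out

        data = "101100"
        print("NRZ:       ", nrz(data))          # one signal level per bit
        print("Manchester:", manchester(data))   # two signal levels per bit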

    Quite frankly, I might just recommend the entirety of the Computerphile channel, particularly their back catalogue when they laid down computer fundamentals.

    4 - How did it all come to be like it is today, and is it possible for one human to even learn how it all works, or are we just limited to one or two things? Like, can we only know how to program or how to make hardware, but not both or all?

    As of 2024, the field is enormous, to the point that a CompSci degree necessarily has to focus on a specific concentration. But that doesn’t necessarily mean the hard stuff like device physics is off-limits, leaving just stuff like software and AI. Sam Zeloof has been making homemade microchips, devising his own semiconductor process and posting it on YouTube.

    Specifically to your question about either software or hardware, the specialty of embedded software engineering requires skills with low-level software or firmware, as well as dealing with substantial hardware-specific details. People that write drivers or libraries for new hardware require skills from both regimes, being the bridge between Electrical Engineers that design the hardware, and software developers that utilize the hardware.

    Likewise, developers for high performance computers need to know the hardware inside-out, to have any chance of extracting every last bit (pun intended) of speed. However, these developers tend to rely upon documentation such as data sheets, rather than having to be keenly aware of how the hardware was manufactured. Some level of logical abstraction is necessary to tractably understand today’s necessarily large and complex systems.

    5 - Do we have to join Intel first or something to learn how most of the things work lol?

    Nope! Often, you can look to existing references, such as Linux source code, to provide a peek at what complexities exist in today’s machines. I say that, but the Linux kernel is truly a monster, not because it’s badly written, but because they willingly take code to support every single bleeding platform that people are willing to author code for. And that means lots and lots of edge cases; there’s no such thing as a “standard” computer. X86 might be the closest to a “standard” but Intel has never quite been consistent across that architecture’s existence. And ARM and RISC-V are on the rise, in any case.

    Perhaps what’s most important is to develop strong foundations to build on. Have a cursory understanding of computing, networking, storage, wireless, software licenses, encryption, video encoding/decoding, UI/UX, graphics, services, containers, data and statistical analysis, and data exchange formats. But then pick one and focus on it, seeing how it interacts with other parts of the computing world.

    Growing up, I had an interest in IT and computer maintenance. Then it evolved into writing websites. Then into writing C++ software. Right before university, I started playing around with the Arduino’s Atmel ATmega328P directly, and so I entered uni as a Computer Engineer, hoping to do both software and hardware.

    The space is huge, so start somewhere that interests you. From the examples above, I think online videos are a fantastic resource, but so are blog posts written by engineers at major companies, talks at conferences, and sitting in on university courses.

    Good luck and good studies!

  • Redkey · 7 months ago

    Some other people have given fine answers to your specific questions, so I won’t go over them again. But I want to make a more general comment on your post as a whole. Please take this in a spirit of care and kindness, because that’s how it’s intended.

    I don’t think you appreciate just how much you’ve asked, here. When we’re in the very early stages of learning about something, we’re often handicapped by the fact that we don’t even know how much we don’t know. The real answers to the questions that you asked here, if given in the useful sort of detail that you seem to want, represent a year or more of a Computer Science degree.

    There’s actually plenty of good information available for free on the 'net, but as you mentioned, it’s mostly in little bits and pieces, not big chunks. That’s just the nature of things made by hobbyists in their spare time, I’m afraid. You can assemble a detailed working knowledge of a variety of topics in computing, but you’ll need to spend some of your effort as a kind of detective, looking out for words, phrases, and concepts that will help you find the next “nibble” of information.

    I may be wrong, but I also get the feeling that you’re relying fairly heavily on YouTube for your searching. However, you’re going to find a lot of the more intermediate and advanced technical discussion happening in text on mailing list archives, forums, and personal websites.

    The more structured approach of an introductory textbook may be what you’re looking for. Even though it’s the 21st century, your local library is still a great information resource that you shouldn’t forget about.

    When I was a kid in the 80s and 90s, I learned a lot from the Usborne books. They’re still in business and still printing (new and updated) books like the ones I read. Don’t be put off by the “kiddy” theming; if they’re still the same as they were back then, they go into a surprising amount of depth.

    https://usborne.com/row/books/browse-by-category/science-and-technology/computers-and-coding

    The actual books that I read as a kid are now available for free from their website:

    https://usborne.com/row/books/computer-and-coding-books

    A lot of the implementation-specific information won’t be too useful any more, but the history and broad concepts haven’t changed.

    Happy trails!

  • thouartfrugal@lemmy.world · 7 months ago

    Just an old hobbyist here. Often I count myself lucky having grown up when a state-of-the-art home computer was a Commodore 64. Rightly or wrongly, I believe it’s quite possible for one human being to completely grasp what that machine is doing from the moment the power switch is turned on through to the end of running a complex self-written program. Not that it’s at the heart of your question(s) but that’s where my curiosity started. In those days any user had to know just a bit of the BASIC programming language, even if just to list the contents of a floppy disk or to load a pre-written program. I am always astounded at what people with much more dedication are able to do with a C64 to this day in the demoscene. The more generous among them make their discoveries digestible to mere mortals at sites such as codebase64.org. That’s a kind of comfort zone for me. Getting into something like a 386 PC and I start to feel overwhelmed. Maybe consider dipping back a bit into history if it sounds appealing?

    As to semiconductor fabrication, I found this unconventional book by Clive Maxfield to be very helpful in clarifying some things I was curious about.

    There are some excellent stories from the heyday of MOS Technology in the first book of this series by Brian Bagnall. That’s the company that produced the popular 6502 family of 8-bit CPUs that powered machines from Apple to Nintendo and many in between. It’s also where the custom chips that formed the heart of the C64 were brought to life. One excerpt I often think back on describes engineers lying flat on raised creepers, cutting the layout of their CPU-to-be out of huge sheets of vellum.

    5 - Do we have to join Intel first or something to learn how most of the things work lol?

    May not be as far-fetched as you think. I’ve worked in Intel’s semiconductor factories, and Micron’s, and some others whose names aren’t widely known but whose products made things like the iPhone possible. Not because I’m well-educated or have any particular talent; in a volatile marketplace such as this one there are ebbs and flows in demand for headcount in entry-level positions. Draft up a resume highlighting your critical thinking skills and willingness to learn, and watch the recruiters from the staffing agencies fill your email inbox. I’ve had the good fortune to learn such processes as photolithography, thin-films, dry/wet etch, metrology, planarization, die sort (test), and on and on. Whether you’d like to operate the semiconductor tools, push the production metrics, or maintain the equipment, there just may be a need for you somewhere today.

  • BoscoBear@lemmy.sdf.org · 7 months ago

    1. Flashing code onto a chip doesn’t really involve light; that’s done electrically. Light (photolithography) is used to manufacture the chip itself, not to program it.

    2. You used switches on the front panel to load code into the computer by setting individual bits high or low. Typically you toggled in the bootstrap loader, which was a program that read a sequence of numbers directly into a spot in memory. The first program loaded by the bootstrap loader was usually the absolute loader. This was another program that loaded data from some peripheral, similar to the bootstrap loader, but it could do error checking and also load to non-sequential locations (see the sketch after this list).

    3. The Internet isn’t light. It’s electricity. On fiber the bits may be temporarily encoded as light, but overall it is electric.

    4. You can understand it all if you want. It depends on the depth to which you want to understand it. You can understand that a mouse has a plastic shell; you need some organic chemistry and chemical engineering to understand how to design that plastic.

    5. I recommend Ben Eater’s YouTube channel to get a good overview of the basics.
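
    Regarding point 2: here is a small Python sketch of the bootstrap/absolute loader arrangement. The record format is invented for illustration (it is not any real machine's loader format): each record on the "tape" says where in memory to put the following words, and carries a checksum so bad reads can be caught.

        def absolute_loader(memory: list[int], tape: list[int]) -> None:
            """Toy absolute loader: the tape is a series of records, each laid out as
            [load_address, word_count, words..., checksum]."""
            pos = 0
            while pos < len(tape):
                addr, count = tape[pos], tape[pos + 1]
                words = tape[pos + 2 : pos + 2 + count]
                checksum = tape[pos + 2 + count]
                if sum(words) % 256 != checksum:
                    raise RuntimeError("checksum error; re-read the tape")
                # Unlike the tiny hand-toggled bootstrap, records can target any address.
                memory[addr : addr + count] = words
                pos += count + 3

        memory = [0] * 64
        tape = [10, 3, 7, 8, 9, (7 + 8 + 9) % 256,   # put 7, 8, 9 at address 10
                40, 1, 99, 99 % 256]                  # put 99 at address 40
        absolute_loader(memory, tape)
        print(memory[10:13], memory[40])   # [7, 8, 9] 99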

    • full_of_questionsOP · 7 months ago

      Not to be rude, but aside from saying I’m wrong, which I admit I mostly am, you didn’t even bother to correct me or answer/explain.

      • BoscoBear@lemmy.sdf.org · 7 months ago

        I think a good answer that you will understand is too long for this format. I gave a brief answer, but then I went off looking for better information. Sorry for offending you.

        I think this course from Ben Eater on how he built his own CPU from logic gates might explain a lot.

        I think it also covers how transistors work, which is fundamental to how gates work.

        https://eater.net/8bit

      • BoscoBear@lemmy.sdf.org · 7 months ago

        I think you’re asking all the right questions and I think knowledge of these things makes you much more capable.

  • dwraf_of_ignorance · 7 months ago

    If you really have time, read the book Nand to Tetris or take the course on Coursera/YouTube. It will demystify almost everything about comp sci.
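
    To give a taste of the Nand to Tetris approach (everything built up from one primitive gate), here is a tiny Python sketch: starting from NAND alone you can define NOT, AND, OR and XOR, and from those a half adder that adds two bits, which is the first small step on the road to a whole CPU.

        def nand(a: int, b: int) -> int:
            """The one primitive gate everything else is built from."""
            return 0 if (a and b) else 1

        def not_(a: int) -> int:
            return nand(a, a)

        def and_(a: int, b: int) -> int:
            return not_(nand(a, b))

        def or_(a: int, b: int) -> int:
            return nand(not_(a), not_(b))

        def xor(a: int, b: int) -> int:
            return and_(or_(a, b), nand(a, b))

        def half_adder(a: int, b: int) -> tuple[int, int]:
            """Add two bits; returns (sum, carry)."""
            return xor(a, b), and_(a, b)

        for a in (0, 1):
            for b in (0, 1):
                print(a, "+", b, "=", half_adder(a, b))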