• Kogasa
    7 points · 4 months ago

    a/b is the unique solution x to a = bx, if a solution exists. This definition is used for the integers, rationals, reals, and complex numbers.

    Defining a/b as a * (1/b) makes sense if you’re learning arithmetic, but logically it’s more contrived, since you then need to define 1/b as the unique solution x to bx = 1, if one exists, which is essentially the first definition again.
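
    A quick sketch of the first definition, checked over the integers (a toy illustration in Python, not how any real language implements division; the function name is made up):

    ```python
    # a/b := the unique x with a == b*x, if one exists.
    # Brute-force check over a finite range of integers.
    def div_as_unique_solution(a: int, b: int):
        candidates = [x for x in range(-abs(a) - 1, abs(a) + 2) if b * x == a]
        return candidates[0] if len(candidates) == 1 else None  # None: no unique solution

    print(div_as_unique_solution(6, 3))  # 2, since 3 * 2 == 6
    print(div_as_unique_solution(1, 2))  # None: no integer x with 2*x == 1
    print(div_as_unique_solution(0, 0))  # None: every x solves 0 == 0*x, so not unique
    ```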

    • @[email protected]
      4 points · 4 months ago

      That’s me, a degree-holding, full-time computer scientist, just learning arithmetic, I guess.

      Bonus question: what even is subtraction? I’m 99% sure it doesn’t exist, since I’ve never used it; I only ever use addition.

      • Kogasa
        3 points · 4 months ago

        Addition of the additive inverse.
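
        In code, that’s literally all there is to it (a one-line Python sketch; the name sub is just for illustration):

        ```python
        # Subtraction defined as addition of the additive inverse: a - b := a + (-b)
        def sub(a, b):
            return a + (-b)

        assert sub(7, 3) == 4  # 7 + (-3)
        ```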

            • @[email protected]
              2 points · 4 months ago

              Computers don’t subtract, and you can’t just add a negative: a computer can’t interpret a negative number directly; it can only store a flag that the number is negative. You need a couple of addition tricks to subtract two numbers so that the computer only ever has to add. It’s addition all the way down.
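
              One of those tricks is two’s complement; here’s a minimal Python sketch emulating an 8-bit word (the function name and word size are just for illustration):

              ```python
              MASK = 0xFF  # emulate an 8-bit word

              def sub_via_add(a: int, b: int) -> int:
                  # ~b + 1 is the two's complement of b, its additive inverse mod 2**8,
                  # so subtraction turns into a bitwise NOT plus two additions.
                  return (a + ((~b & MASK) + 1)) & MASK

              assert sub_via_add(7, 3) == 4
              assert sub_via_add(3, 7) == (3 - 7) & MASK  # 252, the 8-bit encoding of -4
              ```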

              • Kogasa
                1 point · 4 months ago

                What does this have to do with computers?

      • @[email protected]
        1 point · edited · 4 months ago

        > what even is subtraction?

        It’s just addition wearing a trench coat, a fake beard, and glasses.

    • @[email protected]
      0 points · edited · 4 months ago

      > Defining a/b as a * (1/b) makes sense if you’re learning arithmetic

      The example was just to illustrate the idea, not to define division exactly like that.