• jdeath@lemm.ee · 1 year ago

    I like it when my strongly typed language can type itself; why should I have to type extra words because the compiler is stupid?

    • Wats0ns@sh.itjust.works (OP) · 1 year ago

      So that next time your coworker uses the wrong type, the compiler can scream at him: “NO, I WON’T COMPILE THIS, YOU DUMBASS. LOOK, JOHN SAID ON LINE 863 THAT IT SHOULD BE A DOUBLE, NOT A FLOAT, FOR FUCK’S SAKE.”

        • mark · 1 year ago

          As a JS dev, I can only wish we had those types 🥲

      • jdeath@lemm.ee · 1 year ago

        You can still have that without having to declare the type manually; check out Swift or OCaml, for example.
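
        A minimal sketch of the same idea in TypeScript, the language the thread is about (Swift and OCaml infer even more, e.g. without parameter annotations; the variable and function names here are made up for illustration):

        ```typescript
        // No type annotations on the variables, yet the checker still knows their types.
        let count = 1;        // inferred as number
        let label = "items";  // inferred as string

        count = 42;           // OK: number into number

        // @ts-expect-error -- Type 'string' is not assignable to type 'number'
        count = label;

        // Return types are inferred from the body; only the parameter needs an annotation in TS.
        function twice(n: number) {
          return n * 2;       // return type inferred as number
        }
        const result = twice(count); // result: number
        ```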

    • tgv · 1 year ago

      deleted by creator

      • Gecko@lemmy.world · 1 year ago

        Type error, unless there’s an implementation of + that specifies adding together an integer and a string.

        • jdeath@lemm.ee · 1 year ago

          💯% accurate. Funny how the TypeScript developer thinks this is some kind of “gotcha!”… like maybe just try a language besides TypeScript and find out for yourself 😆

        • nyan@lemmy.cafe · 1 year ago

          Exactly. Most languages I know of that allow this at all will coerce the “1” to an integer and give x = 2. They get away with this because they define the “+” operator as taking numbers only as arguments, so if you hand them x = x + "cheese" they’ll error out.
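
          For the TypeScript case the thread is about, a hedged sketch (the deleted comment isn’t visible, so the exact expression is assumed): TS does define + for a number and a string, producing a string, so the complaint lands on the assignment rather than on the + itself.

          ```typescript
          let x = 1;        // inferred as number

          // @ts-expect-error -- Type 'string' is not assignable to type 'number'
          x = x + "1";      // the + type-checks (string concatenation, "11"); the write back to x does not

          // @ts-expect-error -- same failure for the "cheese" example above
          x = x + "cheese";
          ```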

      • sweeny@sh.itjust.works · 1 year ago

        I’m not sure if you’re being rhetorical or not, but “string|number” is definitely correct here. A computer could figure this out on its own, but typing is for the benefit of the coders more than the code itself; it’s basically functional documentation (see the sketch at the end of the thread).

          • sweeny@sh.itjust.works · 1 year ago

            Yeah, that’s what I’m saying: I hate it when coworkers assign everything as “any” just to avoid the scary red squigglies. Oh well, I guess that’s what code reviews are for 🙃
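
            A short sketch tying sweeny’s two comments above together (the function and names are hypothetical): the explicit union annotation documents the accepted shapes and is enforced, while any silently switches the checker off.

            ```typescript
            // The string | number annotation doubles as documentation: callers see
            // both accepted forms, and the checker holds them to it.
            function formatId(id: string | number): string {
              // Narrowing is required before using type-specific methods.
              return typeof id === "number" ? id.toFixed(0) : id.trim();
            }

            formatId(42);     // OK
            formatId("0042"); // OK

            // @ts-expect-error -- Argument of type 'boolean' is not assignable to parameter of type 'string | number'
            formatId(true);

            // any, by contrast, just turns the checker off for that value:
            const sneaky: any = true;
            formatId(sneaky); // compiles without complaint; the red squigglies are gone, not the bug
            ```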