• Scrubbles
    link
    fedilink
    18
    9 months ago

    wow that lasted for what, just under a year? Impressive, it took less than a year for them to lose their morals.

    • @[email protected]
      link
      fedilink
      English
      4
      9 months ago

      I remember a few years ago when Google or AWS staff rebelled because the Pentagon was going to use their (hosting?) services. I guess those types of people at OpenAI have either been let go or beaten into submission. I understand the feeling of futility when, no matter how much effort you put into something, someone above you who has little to no understanding of what they’re doing won’t listen to your advice/recommendations.

      • Scrubbles
        link
        fedilink
        English
        8
        edit-2
        9 months ago

        Hit the nail on the head. It doesn’t matter what the software/tech people say, business will do whatever the fuck it wants. Have a moral objection? In the lucky case you’re moved to an irrelevant project and your career trajectory stops dead. Otherwise it’s insubordination, failure to perform your job duties, there’s the door.

        Like a lot of us software people (myself included), I’ve made decisions in the past of “Well, they’re going to build it anyway, I might as well try to enforce what I can from my level here”. I know 100% I’ve said things aren’t technically possible at key junctures when they started crossing moral lines. “Sorry, I just don’t know of a way technically to make that happen”. They can think I’m stupid, I don’t care.

  • voight [he/him, any]
    link
    fedilink
    English
    6
    9 months ago

    Computer, draw me up a battle plan, you did great at picking random apartment complexes to bomb last time.

  • Lvxferre
    link
    fedilink
    5
    9 months ago

    When I see this sort of thing, I immediately remember something that I learned from discourse analysis: look at what is said and what is not said.

    OpenAI knows that military and warfare are profitable but unpopular. So how do you profit from them without the associated bad rep (“OpenAI has blood on its hands!”)? Do it as silently as possible, and cover it with an explanation that it’s “clearer” for you.

  • happybadger [he/him]
    link
    fedilink
    English
    3
    9 months ago

    Hell yeah. I don’t see this as nefarious so much as the same way corporate spaces implement ChatGPT. It’s going to be part of the enshittification of the military. Got an admin question? We just fired the admin specialists, so ask the wonky robot. Got a medical question? Military healthcare has been dismantled, so ask the wonky robot doctor. Aircraft mechanics are going to cause a crash because they made the robot mechanic hallucinate how an obscure part on a classified component must be installed. Everything will get worse the more they try to plug the holes in their manpower with a search engine that can pretend to be a horny minotaur.

  • Optional
    link
    fedilink
    1
    9 months ago

    I viewed OpenAI as a disturbance and an annoyance until now.

    You realize, of course, this means war.