Researchers have unearthed hundreds of thousands of cuneiform tablets, but many remain untranslated. Translating an ancient language is a time-intensive process, and only a few hundred experts are qualified to perform it. A recent study describes a new AI that produces high-quality translations of ancient texts.

  • ribboo@lemmy.world · 1 year ago

    It’s pretty freaking great at stuff like that, though. We use a custom programming language at work; it has similarities with Haskell and other languages, but also many differences.

    We had a little game where a colleague had put together some team exercises. He had encoded a message in base64, and the decoded message contained instructions for a program in our custom language that, when run, produced an output.

    ChatGPT printed out the correct output without trouble: 100% non-random, and 100% stuff that’s never been anywhere on the internet.
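
    A minimal sketch of the kind of exercise described, using Python and base64 as stand-ins (the custom language isn’t public, so the snippet and its encoded payload here are hypothetical):

    ```python
    import base64

    # Hypothetical stand-in for the colleague's puzzle: the message is
    # base64-encoded, and decoding it reveals code/instructions to run.
    encoded = "cHJpbnQoc3VtKHJhbmdlKDEsIDExKSkp"  # base64 of: print(sum(range(1, 11)))

    decoded = base64.b64decode(encoded).decode("utf-8")
    print("Decoded instructions:", decoded)

    # Running the decoded snippet gives a deterministic ("100% non-random")
    # output -- 55 here -- which is the kind of result the model reproduced.
    exec(decoded)
    ```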

    • dandelo@lemmy.world · 1 year ago

      Google’s DeepMind was able to teach itself Indonesian without being directly trained on how to do so. Ancient Sumerian doesn’t seem too far-fetched, all things considered!

      • grinde · 1 year ago

        There was a funny bit on the WAN Show a few months back where they demonstrated tricking ChatGPT into speaking Dutch (I think; it might have been another language). It vehemently insisted that it didn’t know Dutch and could only talk to them in English. The messages saying this were written in Dutch.

        • RubberDucky · 1 year ago

          As a Dutch speaker, I can say ChatGPT was always able to speak Dutch, though; I tested it very early on.

    • lightsecond · 1 year ago

      I can’t even wrap my head around how a large language model can do this.

    • Vicious Me@feddit.nl · 1 year ago

      That’s the problem, you see… it is great at simple things. Then you start believing in it and give it more complicated tasks. It will fail, and you will never know until it is too late. We are doomed…

      • 𝕊𝕚𝕤𝕪𝕡𝕙𝕖𝕒𝕟M · 1 year ago

        I’ve found that after using it for a while, I developed a feel for the complexity of the tasks it can handle. If I aim below this level, its output is very good most of the time. But I have to decompose the problem and make it solve the subproblems one by one.

        (The complexity ceiling is much higher for GPT-4, so I use it almost exclusively.)
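
        A rough sketch of that decompose-and-solve workflow, with a hypothetical ask_llm() helper standing in for whatever chat interface is being used (not a real API):

        ```python
        # Hypothetical workflow: break a task into subproblems that sit below
        # the model's complexity ceiling, solve them one by one, then combine.
        def ask_llm(prompt: str) -> str:
            # Stand-in for a real chat model call; plug in your own client here.
            raise NotImplementedError

        def solve(task: str) -> str:
            # 1. Ask for a plan made of small, self-contained steps.
            plan = ask_llm(f"Break this task into small, independent steps:\n{task}")
            partials: list[str] = []
            for step in plan.splitlines():
                if not step.strip():
                    continue
                # 2. Solve one subproblem at a time, feeding earlier results back in.
                context = "\n".join(partials)
                partials.append(ask_llm(f"Previous results:\n{context}\n\nNow do:\n{step}"))
            # 3. Stitch the partial answers together at the end.
            return ask_llm("Combine these partial results into one answer:\n" + "\n".join(partials))
        ```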