James Cameron has reportedly revealed that an anti-AI title card will open Avatar 3, officially titled Avatar: Fire and Ash. The Oscar-winning director shared the news during a Q&A session in New Zealand attended by Twitter user Josh Harding.

Sharing a picture of Cameron at the event, they wrote: “Such an incredible talk. Also, James Cameron revealed that Avatar: Fire and Ash will begin with a title card after the 20th Century and Lightstorm logos that ‘no generative A.I. was used in the making of this movie’.”

Cameron has been vocal in the past about his feelings on artificial intelligence, speaking to CTV News in 2023 about AI-written scripts. “I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said – about the life that they’ve had, about love, about lying, about fear, about mortality – and just put it all together into a word salad and then regurgitate it,” he told the publication. “I don’t believe that’s ever going to have something that’s going to move an audience. You have to be human to write that. I don’t know anyone that’s even thinking about having AI write a screenplay.”

  • oce 🐆@jlai.lu · 1 day ago

    There are various independent, reproducible measurements that give weight to the hot big bang theory as opposed to other cosmological theories. Are there any for the deterministic nature of humans?
    Quantum physics is not deterministic, for example. While quantum decoherence explains why macroscopic physical systems are deterministic, can we really say it couldn’t play a role in our neurons?
    On a slightly different point, quantum bits are not binary; they can represent a continuous superposition of multiple states. Why would our mind be closer to binary computing than to quantum computing?
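
As a point of reference for the binary-versus-quantum contrast above, here is the standard textbook way to write a single-qubit state (an editor's addition, not part of the comment):

```latex
% A classical bit takes exactly one of two values: b \in \{0, 1\}.
% A qubit is a continuous superposition of the two basis states:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \alpha, \beta \in \mathbb{C}, \qquad
  \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
% Measuring in that basis yields 0 with probability |\alpha|^2 and 1 with
% probability |\beta|^2 (the Born rule); the amplitudes vary continuously.
```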

    • chemical_cutthroat@lemmy.world · 23 hours ago (edited)

      The comparison between human cognition and binary isn’t meant to be taken literally as “humans think in 1s and 0s” but rather as an analogy for how deterministic processes work. Even quantum computing, which operates on superposition, ultimately collapses to definite states when observed—the underlying physics differs, but the principle remains: given identical initial conditions, identical outcomes follow.

      Regarding empirical evidence for human determinism, we can look to neuroscience. Studies consistently show that neural activity precedes conscious awareness of decisions (Libet’s experiments and their modern successors), suggesting our sense of “choosing” comes after the brain has already initiated action. While quantum effects theoretically could influence neural firing, there’s no evidence these effects propagate meaningfully to macro-scale cognition—our neural architecture actively dampens random fluctuations through redundancy.

      The question isn’t whether humans operate on binary code but whether the system as a whole follows deterministic principles. Even if quantum indeterminacy exists at the micro level, emergence creates effectively deterministic systems at the macro level. This is why weather patterns, while chaotic, remain theoretically deterministic—we just lack perfect information about initial conditions.
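
To make the “chaotic but deterministic” analogy concrete, here is a minimal sketch (an editor's illustration, not part of the comment) using the logistic map, a textbook example of deterministic chaos: identical starting points always reproduce identical trajectories, while a tiny difference in the initial condition grows until the two runs look unrelated.

```python
# Deterministic chaos in the logistic map: x_{n+1} = r * x_n * (1 - x_n).
# The rule is fully deterministic, yet prediction fails without perfect
# knowledge of the initial condition.

def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)  # reference initial condition
b = logistic_trajectory(0.200000001)  # differs in the ninth decimal place

for n in (0, 10, 25, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (diff {abs(a[n] - b[n]):.2e})")
```

Running it twice with the same starting value prints identical numbers; the unpredictability comes entirely from imperfect knowledge of the initial condition, not from any randomness in the rule.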

      My position isn’t merely philosophical—it’s the most parsimonious explanation given current scientific understanding of causality, neuroscience, and complex systems. The alternative requires proposing special exemptions for human cognition that aren’t supported by evidence.

      • oce 🐆@jlai.lu · 12 hours ago

        “Even quantum computing, which operates on superposition, ultimately collapses to definite states when observed—the underlying physics differs, but the principle remains: given identical initial conditions, identical outcomes follow.”

        I think this is incorrect: it does collapse to a definite state when observed, but the value of that state is probabilistic. We make it deterministic by producing a large number of measurements and running a test on the statistical distribution of all the measurements to get a final value. Maybe our brain also runs a test on statistics of probabilistic measurements, or maybe it doesn’t and depends directly on probabilistic measurements, or a combination of both.
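
A rough sketch of that “statistics over many measurements” idea (an editor's toy example, not from the comment): simulate shots of a qubit whose outcome follows the Born rule; any single shot is random, but the frequency over many shots settles on a stable value.

```python
# Each simulated measurement of a qubit prepared in a superposition is
# probabilistic; an estimate built from many shots converges on |beta|^2.
import random

random.seed(42)
prob_zero = 0.25   # assumed |alpha|^2, the chance a single shot reads 0
shots = [0 if random.random() < prob_zero else 1 for _ in range(100_000)]

print("first three shots:", shots[:3])                        # individually random
print("fraction of 1s:", round(sum(shots) / len(shots), 3))   # close to 0.75
```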

        “we just lack perfect information about initial conditions.”

        We also lack fully proven equations, or complete solutions to the equations we do have, in fluid dynamics.

        I think parsimony is very much a matter of personal opinion at this stage of our knowledge.

        • chemical_cutthroat@lemmy.world · 10 hours ago

          You’re right about quantum measurement—I oversimplified. Individual quantum measurements yield probabilistic outcomes, not deterministic ones. My argument isn’t that quantum systems are deterministic (they’re clearly not at the individual measurement level), but rather that these indeterminacies likely don’t propagate meaningfully to macro-scale neural processing.

          The brain operates primarily at scales where quantum effects tend to decohere rapidly. Neural firing involves millions of ions and molecules, creating redundancies that typically wash out quantum uncertainties through a process similar to environmental decoherence. This is why most neuroscientists believe classical physics adequately describes neural computation, despite the underlying quantum nature of reality.
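
A toy illustration of that “washing out” claim (an editor's sketch with made-up numbers, not real neuroscience): model N independent stochastic ion channels as coin flips and the relative fluctuation of their summed signal shrinks roughly as 1/sqrt(N).

```python
# Law-of-large-numbers intuition: summing many independent random events
# yields a nearly deterministic total, with relative spread ~ 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)
p_open = 0.3      # hypothetical probability that one channel is open
trials = 5000     # repetitions of the "same" macro-level event

for n_channels in (10, 1_000, 100_000):
    totals = rng.binomial(n_channels, p_open, size=trials)
    rel_fluct = totals.std() / totals.mean()
    print(f"{n_channels:>7} channels -> relative fluctuation {rel_fluct:.4f}")
```

The individual "channels" here stay random; only the aggregate becomes effectively predictable, which is the sense of "washing out" used above.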

          Regarding fluid dynamics and weather systems, you’re correct that our incomplete mathematical models add another layer of uncertainty beyond just initial conditions. Similarly with brain function, we lack complete models of neural dynamics.

          I concede that parsimony is somewhat subjective. Different people might find different explanations more “simple” based on their background assumptions. My deterministic view stems from seeing no compelling evidence that neural processes harness quantum randomness in functionally significant ways, unlike systems specifically evolved to do so (like certain photosynthetic proteins or possibly magnetoreception in birds).

          The question remains open, and I appreciate the thoughtful pushback. While I lean toward neural determinism based on current evidence, I acknowledge it’s not definitively proven.