• FermiEstimate@lemmy.dbzer0.com
    2 years ago

    OpenAI: “Our AI is so powerful it’s an existential threat to humanity if we don’t solve the alignment issue!”

    Also OpenAI: “We can devote maybe 20% of our resources to solving this, tops. We need the rest for parlor tricks and cluttering search results.”

  • Soyweiser@awful.systems
    2 years ago

    Cstross was right! You tell everybody. Listen to me. You’ve gotta tell them! AGI is corporations! We’ve gotta stop them somehow!

      • Soyweiser@awful.systems
        2 years ago

        It was more Soylent Green, but yes, it's also partly based on the writings of (friend of the club) C. Stross. I think he also has a written lecture somewhere on corporations being slow paper-clipping AGIs.

        • smiletolerantly@awful.systems
          2 years ago

          That is a beautiful comparison. Terrifying, but beautifully fitting.

          I read Stross right after Banks. I think if I hadn’t, I’d be an AI-hype-bro. Banks is the potential that could be; Stross is what we’ll inevitably turn AI into.

          • gerikson@awful.systems
            2 years ago

            Banks neatly sidesteps the “AI will inevitably kill us” scenario by having the Minds keep humans around for amusement/freshness. Part of the reason for the Culture-Idiran war in Consider Phlebas and Look to Windward was that the Idirans did not want Minds in charge of their society.

            • smiletolerantly@awful.systems
              2 years ago

              No one was trying to force that on them, though. IIRC, the actual reason was that the Idirans had a religious imperative for expansion, and the Culture had a moral imperative to prevent other sentients’ suffering at the Idirans’ hands.

              IMO he mostly sidestepped the issue by clarifying that this is NOT a future version of “us”.