the-podcast guy recently linked this essay. it's old, but i don't think it's significantly wrong (despite gpt evangelists). also read weizenbaum, libs, for the other side of the coin

  • SerLava [he/him]@hexbear.net · 2 months ago

    Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that’s just a static snapshot

    I read long ago that replicating all the functions of a human brain is probably possible with a computer around one order of magnitude less powerful than the brain itself, because the brain is fairly inefficient
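
    To put that snapshot figure in whole-brain terms, a quick back-of-envelope sketch; the inputs are my own assumptions, not from the essay (~1.4 PB/mm³ as reported for the H01 cortex dataset, and a brain volume of roughly 1.2e6 mm³):

    ```python
    # Scale the quoted per-mm^3 snapshot cost to a whole brain.
    # Both inputs are assumptions: ~1.4 PB/mm^3 is the figure reported for
    # the H01 human-cortex dataset; brain volume is taken as ~1.2e6 mm^3.
    PB_PER_MM3 = 1.4          # petabytes of storage per cubic millimeter
    BRAIN_VOLUME_MM3 = 1.2e6  # approximate adult human brain volume

    total_pb = PB_PER_MM3 * BRAIN_VOLUME_MM3
    print(f"Whole-brain static snapshot: ~{total_pb:.2e} PB "
          f"(~{total_pb / 1e6:.1f} ZB)")
    # -> ~1.68e+06 PB, i.e. on the order of a couple of zettabytes
    ```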

    • bumpusoot [none/use name]@hexbear.net · 2 months ago (edited)

      There’s no way we can know that, currently. The brain works in all sorts of ways we really don’t understand. Much like the history of understanding DNA, where supposed “junk” turned out to matter, what gets written off as “random inefficiency” is almost certainly a fundamental part of how it works.

    • FunkyStuff [he/him]@hexbear.net · 2 months ago

      Resident dumb guy chipping in, but are these two facts mutually exclusive? Assuming both are true, it just means you’d need a computer 1e12x as powerful as our supercomputers to simulate the brain, since the brain itself is 1e13x as powerful as a supercomputer. So we’re still not getting there anytime soon (arithmetic spelled out below).

      *With a very loose sense of “powerful”, seeing as the way the brain works is completely different from a computer that calculates in discrete steps.
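
      Taking both of the thread’s numbers at face value (they’re claims, not measurements), the arithmetic looks like this:

      ```python
      # Spell out the comment's arithmetic. Both inputs are the thread's
      # own figures taken at face value, not measured values.
      BRAIN_VS_SUPERCOMPUTER = 1e13  # claim: brain ~1e13x a supercomputer
      EFFICIENCY_DISCOUNT = 10       # claim: a simulator could be ~10x weaker

      needed = BRAIN_VS_SUPERCOMPUTER / EFFICIENCY_DISCOUNT
      print(f"Required machine: ~{needed:.0e}x a current supercomputer")
      # -> ~1e+12x: the 10x discount barely dents twelve orders of magnitude
      ```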