• Dodecahedron December@sh.itjust.works
      1 year ago

      This isn’t a perfect example, but imagine you feed a program every book ever written. The program parses these books and keeps track of how often each word appears alongside other words, building up correlation statistics between words.
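As a rough sketch of that counting step (a toy bigram counter, nothing like a real LLM's training process, and the corpus here is obviously made up):

```python
from collections import defaultdict

def count_bigrams(text):
    """Count how often each word is immediately followed by each other word."""
    counts = defaultdict(lambda: defaultdict(int))
    words = text.lower().split()
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1
    return counts

# Tiny stand-in "corpus" for all of the books ever written.
corpus = "the cat sat on the mat the cat ran"
table = count_bigrams(corpus)
print(dict(table["the"]))  # {'cat': 2, 'mat': 1}
```

A real model tracks far richer context than adjacent word pairs, but the idea of turning raw text into a big table of statistics is the same.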

      Now store all that data in a huge (say, 35 GB) file. This file isn’t human-readable; it’s essentially a large table of all of these word correlations. Install the program, along with its large language model (the 35 GB file generated from parsing all the books), on a system capable of doing lots of math fast, such as a high-end GPU.

      Now, as a user, send a series of words to the program. The program looks at the words you have written and comes up with words that correlate with what you wrote and with what the bot has already written.
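That prediction loop can be sketched with the same toy bigram table (again, a hypothetical miniature, not how a real LLM works internally):

```python
import random

# Tiny stand-in for the huge statistics file: for each word,
# how often each candidate next word followed it in the "training" text.
table = {
    "the": {"cat": 2, "mat": 1},
    "cat": {"sat": 1, "ran": 1},
    "sat": {"on": 1},
    "on":  {"the": 1},
}

def next_word(word, table):
    """Pick a next word, weighted by how often it followed `word`."""
    candidates = table[word]
    return random.choices(list(candidates), weights=list(candidates.values()))[0]

# Start from a prompt word and keep extending the reply.
words = ["the"]
for _ in range(4):
    w = next_word(words[-1], table)
    words.append(w)
    if w not in table:  # dead end: no statistics for this word
        break
print(" ".join(words))  # e.g. "the cat sat on the"
```

Each new word depends on what came before it, which is why the output reads like continuous text even though the program is only doing lookups and arithmetic.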

      “Correlate” isn’t really the best term here, but statistics are computed from the surrounding words. The program still acts like a program, just predicting the next word using the statistics stored in the LLM. It doesn’t know how to do math or write code, but it can hold very convincing discussions about both, or about anything, really.