• 0 Posts
  • 43 Comments
Joined 22 days ago
Cake day: April 12th, 2025


  • It would have happened regardless. The cost of production has increased while the price of the product itself has effectively gone down. Like I said, adjusted for inflation, Chrono Trigger would have cost roughly $170 USD today. Yes, cartridge production was relatively expensive, but that’s only a portion of the overall production cost of the game. At its peak, that dev team had maybe 200 developers, and only during part of development. Compare that to something like an Assassin’s Creed title, which has a team 2-3x that size for most of the development cycle.
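A quick sketch of where the ~$170 figure comes from: scale the launch price by the ratio of consumer price indices. The CPI values below are approximate (the 2025 figure especially is an assumption), so treat the output as a ballpark, not an exact number.

```python
# Rough inflation adjustment for the Chrono Trigger price mentioned above.
# CPI figures are approximate US CPI-U annual averages; 2025 is an assumption.
LAUNCH_PRICE_1995 = 79.99   # typical US shelf price for the SNES cartridge
CPI_1995 = 152.4            # approximate CPI-U, 1995 annual average
CPI_2025 = 320.0            # assumed approximate CPI-U for 2025

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of price indices."""
    return price * (cpi_now / cpi_then)

adjusted = adjust_for_inflation(LAUNCH_PRICE_1995, CPI_1995, CPI_2025)
print(f"${adjusted:.2f}")  # → $167.96, roughly the ~$170 cited
```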

    With the cost of development increasing and the price of the game itself remaining stagnant, it was only a matter of time. People wonder what happened to all the mid-tier games that existed in the ’90s and early ’00s. This is why they died out. Companies can’t afford to take risks on new titles because of ballooning production costs, so they stick to churning out recognizable IPs. tbh, they should have raised prices a long time ago.

  • When I say “how can you be sure you’re not fancy auto-complete”, I’m not talking about being an LLM, or even the simulation hypothesis. I’m saying that the neural networks LLMs are built on are functionally similar to our own nervous system (with some changes specific to transformer models, which I’d argue make them less susceptible to prompt injection attacks). What I mean is: how do you know that the weights in your own nervous system aren’t causing any given stimulus to always produce a specific response along the most heavily weighted pathways? That’s how auto-complete works. It just predicts the most statistically probable response to the input after it’s been filtered through the neural network. In our case the input is sensory data instead of a text prompt, but the mechanics remain the same.
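The “fancy auto-complete” mechanic described above can be sketched in a few lines: an input is pushed through a set of fixed weights, the scores are turned into probabilities, and the single most probable next token wins. The vocabulary, weight values, and context tokens here are all invented for illustration; a real LLM does this over tens of thousands of tokens with billions of learned weights.

```python
import math

# Toy "neural network": fixed weights mapping context tokens to scores for
# each candidate next token. All values are invented for illustration.
VOCAB = ["walk", "run", "sleep", "eat"]

WEIGHTS = {
    "tired":   [0.1, -1.0,  2.5, 0.3],
    "hungry":  [0.0,  0.2, -0.5, 2.0],
    "morning": [1.2,  0.8, -2.0, 0.5],
}

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict_next(context):
    """Sum the weighted contributions of the context, then pick the single
    most probable next token -- the 'auto-complete' step."""
    scores = [0.0] * len(VOCAB)
    for token in context:
        for i, w in enumerate(WEIGHTS.get(token, [0.0] * len(VOCAB))):
            scores[i] += w
    probs = softmax(scores)
    return VOCAB[max(range(len(VOCAB)), key=lambda i: probs[i])]

print(predict_next(["tired"]))             # → sleep
print(predict_next(["hungry", "morning"])) # → eat
```

The point of the analogy: swap the text tokens for sensory inputs and the fixed table for synaptic weights, and the structure of the computation is the same, even if the scale and substrate differ.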

    And how do we know whether or not the LLM is having an experience? Again, this is the “hard problem of consciousness”. There’s no way to quantify consciousness, and it’s only ever experienced subjectively. We don’t know how consciousness fundamentally works (or at least, if we do, it’s likely still classified). Basically, this is a new field and it’s still the wild west. Most of these LLMs are still black boxes whose workings we’re only barely starting to understand, just like we’re only barely starting to understand our own neurology and consciousness.