• Janus argues that language models like GPT are simulators: rather than being agents with goals of their own, they predict text by playing whatever characters the prompt calls for.
• GPT can simulate different characters, such as the Helpful, Harmless, and Honest Assistant, or Darth Vader.
• Bostrom’s Superintelligence classified possible AIs as agents, genies, or oracles, and argued that even an oracle could be dangerous if it were secretly a goal-directed agent.
• GPT itself is not an agent, and is unlikely to become one no matter how advanced it gets, although it can simulate agentic characters.
• Psychologists and spiritual traditions have suggested that humans, too, are simulating a character, such as the ego or self.
• People may become enlightened when they realize that most of their brain is a giant predictive model of the universe.
Published January 26, 2023
Visit Astral Codex Ten to read Scott Alexander’s original post, “Janus’ Simulators.”