Analogies can far too easily poison the well by introducing bias when they are only superficially analogous. Saying that AI is like nuclear weapons immediately evokes danger at a global scale, even though there are no substantive links between the two. Similarly, saying that AI is like the invention of the steam engine evokes productivity and economic revolution, even though the links between 18th-century steam engines and AI are tenuous.
Analogies don't need to be formal, but the less analogous they are, the more bias they introduce.
This was a valuable and interesting read. You took a specific route, but I feel the main point is more universal. For the bulk of our day-to-day knowledge, we "know" new things by metaphor first... by comparisons to other things that we "know," which in turn also spring from earlier metaphor. And for the large majority of our knowledge base it remains there, in this informal, analogy-based form; refined over time but still grounded in comparison/analogy/metaphor. Rare indeed is the knowledge gained by formal proofs of logic or mathematics. (Special and powerful, yes... but rare.)
In this, I think the foundations of our human thinking are reflected in AI, far more than most people realize.
Great post! I fully agree that while analogies are often used instrumentally for persuasion, they also have value in exploration - especially when facing a lot of uncertainty and complexity. I made a similar argument here: https://machinocene.substack.com/p/ai-analogies-an-introduction