What if we’re expecting the wrong thing regarding AGI?
Expecting a single AI that does it all might be the wrong approach, especially when we consider seemingly intelligent behavior in nature. Look at the concerted actions of ants, termites, or bees. The individual insects often seem rather dull and sub-intelligent, guided by a handful of simple rules and the sensors that feed them.
Emergent, intelligent, and dynamic behavior starts when all these "dumb" parts begin interfacing and interacting.
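As a toy illustration of this idea, here is a minimal sketch of a boids-style alignment rule: each agent can only sense a few random neighbors and nudges its heading toward their average, yet the whole swarm ends up aligned. All names and parameters here are illustrative assumptions, not drawn from any real system.

```python
import random

random.seed(42)

N, STEPS = 50, 100

# Each "dumb" agent starts with a random heading and knows nothing
# about the global state of the swarm.
headings = [random.uniform(0, 360) for _ in range(N)]

def spread(hs):
    """Variance of headings: a rough measure of swarm disorder."""
    mean = sum(hs) / len(hs)
    return sum((h - mean) ** 2 for h in hs) / len(hs)

initial = spread(headings)

for _ in range(STEPS):
    new = []
    for h in headings:
        # Local sensing only: sample three random neighbors and
        # nudge the heading halfway toward their average.
        nbrs = random.sample(headings, 3)
        target = sum(nbrs) / len(nbrs)
        new.append(h + 0.5 * (target - h))
    headings = new

final = spread(headings)
print(final < initial)  # the swarm aligns with no central controller
```

No agent implements "align the swarm"; alignment is an emergent property of many local interactions, which is exactly the dynamic the insect analogy points at. (Treating headings as plain numbers ignores the 0°/360° wraparound; a real flocking model would average angles on the circle.)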
Looking at the plethora of AI systems we've already built, perhaps it's time to empower them to interface and interact with each other.
Or maybe they already have?