
Shreya Kudumala • 16 October 2025
All This Talk About (Artificial) Intelligence
What is intelligence?

No, not the dictionary definition.
Human intelligence is an aggregate of multiple systems: reasoning, perception, memory, intuition, empathy, and cultural knowledge. Psychologists from Howard Gardner onward have argued that what we casually call “intelligence” is actually a network of different modes of thought.
For example, humans are uniquely good at inferring meaning from ambiguous or incomplete signals (contextual reasoning). We detect sarcasm, read body language, understand a pause as much as a word.
AI doesn’t do that. Not really. It doesn’t “know” anything in the human sense; it is, however, very good at generating plausible continuations by predicting the next word (statistical inference).
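A toy sketch makes the “predicting the next word” idea concrete. The bigram model below simply counts which word follows which in a tiny corpus and proposes the most frequent continuation; real language models do this at vastly larger scale with learned representations rather than raw counts, so treat this strictly as an illustration.

```python
from collections import defaultdict, Counter

# Toy bigram model: count which word follows which in a small corpus,
# then propose the statistically most likely continuation.
corpus = "the cat sat on the mat the cat ate the fish".split()

follow = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent continuation seen after `word`."""
    return follow[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" more often than "mat" or "fish"
```

Note what the model is doing: it has no concept of cats or fish, only of which strings tend to follow which. That is statistical inference, not understanding, which is exactly the distinction drawn above.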
That’s not a limitation in itself. It’s a different kind of intelligence.
The most transformative technologies in history didn’t imitate nature; they extended it. The solar panel doesn’t imitate photosynthesis, and the airplane doesn’t flap its wings.
Likewise, the most useful role for AI is as a complementary system, like yin and yang.
Humans:
excel at context, judgment, creativity, and improvisation
struggle with scale, memory, and consistency
AI:
excels at scale, precision, and speed
struggles with ambiguity, meaning, and intuition
This asymmetry points to a new design principle: the future of AI tools lies in building a productive division of labor between human cognition and machine capability.
Today, a product planning team might start with a vague idea, manually dig through dashboards, and search for context in archived Slack channels. All the while, the AI sits idle, waiting to be prompted.
In a better world, the AI surfaces relevant user insights the moment a new feature is logged, flags overlapping sprint items, and highlights architectural dependencies, all before anyone has asked, leaving the team free to focus on vision, judgment calls, and tradeoffs.
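One way to picture this proactive workflow is as an event-driven system: the trigger is the act of logging a feature, not a human prompt. The sketch below is purely hypothetical; the `Feature` class, `surface_context` function, and keyword matching are illustrative assumptions standing in for whatever retrieval a real product would use.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    title: str
    keywords: set  # hypothetical: tags describing the feature

# Stand-in archives: past user feedback and current sprint items.
user_feedback = {
    "export": ["Users ask for CSV export weekly"],
    "search": ["Search is the #1 support complaint"],
}
sprint_items = [Feature("Rework search ranking", {"search"})]

def surface_context(new_feature: Feature) -> dict:
    """Runs when a feature is logged, not when someone asks.

    Returns related user insights and overlapping sprint items,
    so the context arrives before anyone has to go digging for it.
    """
    insights = [msg for kw in new_feature.keywords
                for msg in user_feedback.get(kw, [])]
    overlaps = [item.title for item in sprint_items
                if item.keywords & new_feature.keywords]
    return {"insights": insights, "overlapping_sprint_items": overlaps}

result = surface_context(Feature("Add search filters", {"search"}))
```

The design choice is the point, not the matching logic: the machine handles the scale problem (remembering every piece of feedback and every sprint item), and the humans spend their attention on what to do about it.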
This is what complementarity looks like. Not AI pretending to be us, but systems that do the things we can’t easily do—because they aren’t us.



