March 3, 2026 · Podcast · 3h 26min
Meng Yan Talks with Li Jigang: After AI Takes Our Brainpower, What's Left Is Heart-Force
The Industrial Revolution took our physical labor. The internet took our space. AI is taking our time. What’s left for humans is heart-force. Not what you know, but what you want. Not what you can compute, but what moves your heart.
Episode Overview
This is a three-and-a-half-hour deep conversation. Meng Yan — founder of YouZhiYouXing, a Chinese investment education company — admits this is the first time he’s ever brought a laptop into a guest recording session. With Li Jigang, the conversation always goes beyond what he can prepare for.
The dialogue begins with investment, moves into the foundational philosophy of “viewfinders” and structures, then opens into Li Jigang’s most complete articulation of the present moment: three parallel worlds (atomic, bit, vector). From there, the two explore how humans should collaborate with AI, the split between “dry” and “wet” states, the claim that taste is neural weight, the fundamental transformation of education, and finally converge on a question that keeps resurfacing between them — how should humans conduct themselves?
Li Jigang contributes a cascade of original frameworks. Meng Yan probes each one through the lens of investment practice and inner experience. Two thinking paths — one walking from the edge of rationality toward the heart, the other walking from the heart toward structure — meet at a single point.
Viewfinders: Where Everything Begins
Meng Yan opens by naming why he enjoys talking with Li Jigang: his reframing ability. Li Jigang consistently offers another angle on familiar things.
Li Jigang calls these angles “viewfinders” (取景框). Everyone develops a framework for understanding the world through years of experience — eventually it solidifies into “this is who I am.” But he believes every viewfinder is partial. What’s real — call it truth, the Tao, the thing-in-itself — is a higher-dimensional object, and each person sees only one projection.
Accepting this produces two dispositions: humility and openness. Not as moral performance, but as worldview-level architecture. His opinions and his identity are completely decoupled: “If someone can offer a better viewfinder that destroys mine, I’m grateful.”
Three Foundational Formulas
Meng Yan pushes deeper: with all these viewfinders collected, what sits beneath them? Li Jigang offers three “ranks” (秩) — three formulas he believes can derive everything else he knows.
Bayes’ theorem. Your prior is always partial. Only new information updates your posterior. This isn’t just knowledge for him — it’s worldview. Humility comes from “my prior is at best 0.3 or 0.4.” Openness comes from “only the likelihood function’s update can raise the posterior.” Silicon Valley’s “iterate fast, fail fast” can be derived from this single formula.
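The mechanism he describes can be illustrated with a toy Bayesian update — the numbers below are invented for illustration, but the arithmetic is the theorem itself: a modest prior ("at best 0.3 or 0.4") rises only when new evidence arrives.

```python
# A minimal sketch of the Bayesian update described above.
# The prior and likelihoods are invented numbers, chosen to show
# how evidence alone raises a posterior.

def bayes_update(prior, likelihood, evidence_prob):
    """Posterior = likelihood * prior / P(evidence)."""
    return likelihood * prior / evidence_prob

# Prior belief in a hypothesis: "at best 0.3 or 0.4"
prior = 0.3
# Evidence that is three times more likely if the hypothesis is true
p_e_given_h = 0.9
p_e_given_not_h = 0.3
# Total probability of seeing the evidence
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

posterior = bayes_update(prior, p_e_given_h, p_e)
print(posterior)  # higher than the prior, lifted purely by the evidence
```

The "iterate fast, fail fast" derivation follows directly: each iteration is one more evidence term, and only running the loop moves the posterior.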
Occam’s Razor. Among a thousand variables, perhaps only three are dimension-level — they form a coordinate system, and the other 997 are just positions within it. He calls these three “the rank” (秩). First principles thinking, in his framework, is simply searching for these three.
Theory of Everything. The physics aspiration to unify the four fundamental forces under one equation. He’s not a physicist, but he borrows the conviction: there must exist something simpler that explains everything below. “The feeling of governing all with a few — you discover that just these variables can handle everything.”
Three Parallel Worlds
From these three formulas, Li Jigang derives his most consequential judgment about the present: we simultaneously inhabit three worlds.
The atomic world, made of atoms. Scarcity comes from position — one atom occupies a coordinate, another can’t enter. Water on a mountaintop costs more than water at the base, not because of the labor of carrying it, but because of position. The Industrial Revolution over the past 200 years liberated human muscle.
The bit world, made of bits. The spatial dimension has been eliminated — the distance between any two points is zero. When people say the internet is a “dimensional reduction attack” on traditional business, the dimension being reduced is space. Every internet company does one thing: weave a network connecting people to something. The underlying law is singular: the Matthew Effect. Whoever crosses the critical threshold first owns that network. Scarcity shifted from position to attention. This has been the story of the past 30 years.
The vector world, made of vectors. This is the new world that emerged in the past three years. It still wears the internet’s clothing — another app, another chat box — but underneath, it’s completely different. It has eliminated the time dimension.
All of humanity’s knowledge across thousands of years has been “crystallized” through training. When you converse with a model, you’re directly accessing that crystallized wisdom. Previously on the internet, you could download a hundred books instantly, but reading them still took time. Now that time doesn’t exist. Past, present, and future coexist — ask the model to project 300 paths forward, and it instantly tells you path 49 is best.
Each world removes one constraint, opening a vast new possibility space. The atomic world made the world abundant. The bit world made the world flat. The vector world made time accelerate. “This isn’t metaphor. This is fact.”
Pulling Yourself Out
Li Jigang describes leaving his computer screen with a single word: pull (拔).
“I feel like I’m planted in a world. When I try to leave, something is pulling me back. I literally have to wrench myself out.”
Inside the vector world, feedback is so immediate and exquisite that you can’t stop. You throw out a question, the model responds in your preferred structure, finds cross-disciplinary isomorphisms you’d never have thought of, connects dots across fields effortlessly. “In terms of knowledge, no human can compare.”
Meng Yan shares the same experience. Using Li Jigang’s “roundtable discussion” prompt, he summoned Buffett, Kahneman, Graham, and Morgan Housel to debate whether buying and holding are the same question. “I’m not exaggerating — the depth of discussion was on a completely different level. And after they finished, they could take the topic to yet another depth.”
But Meng Yan noticed a side effect: he stopped posting on social media, lost the desire to express himself — “I just instinctively felt like I know nothing.”
Li Jigang’s response was sharp and urgent:
“Every one of us, in this one trip through life, has something to say and something to do. Expression isn’t because I know more than you. It’s because I am who I am, and right now I have something to say to this world.”
If that massive “Other” — the AI — suppresses all your words, what are you doing here? “I acknowledge the gap. But I still want to make my voice heard.”
Dry State and Wet State
This is a core metaphor running through the entire conversation, borrowed from Duan Yongchao’s book The Origin of New Species. Li Jigang believes the book’s ideas didn’t truly land during the internet era, but fit the AI era “perfectly.”
Dry state: Society functions like a dehydrator, treating people as fuel. Whether you’re a willow, a peach tree, or a pine — you’re all just combustible wood. Standardized, uniform. Your emotions, your stirrings of the heart, are “unnecessary” to the social machine.
Wet state: Your emotions, your fluctuations, the joy of falling in love, the surge of energy. Previously stripped away by society’s dehydrator, these may be what’s truly meaningful and valuable now.
“A reversal has occurred. What knowledge you have, what skills you’ve mastered — these may be completely unimportant in the new era. And what we previously dismissed has become what matters.”
Li Jigang maps “dry” and “wet” onto “brain” and “heart”: brainpower should be handed to models, heart-force should stay with humans.
“Brainpower and heart-force are two different forces. On the cognitive, dry level, I believe humans should completely cede ground — you simply cannot compete. But we have wetness. It has none. The combination of dry and wet is the strongest complementarity.”
Two Collaboration Postures
Facing human-AI collaboration, Li Jigang identifies two fundamentally different postures.
Posture One: Being penetrated. The boss sends a screenshot, the AI completes the task, and you Ctrl+C, Ctrl+V to submit. You’re a conduit — information flows through you, but nothing moves inside. Your brain thinks even less than it did before AI existed.
Posture Two: Being amplified. You first plant something of your own — even if it’s only 0.3, fragile and incomplete — and then AI, as unlimited compute power, builds upon it. “You have something first, then it amplifies.”
“These two types of people will diverge onto completely different trajectories. What AI amplifies is your intentionality — you need to have something standing there first.”
Meng Yan illustrates with his own hiring process: AI helped him screen resumes, prepare interview questions, record his spoken impressions, and generate evaluation reports. The process was remarkably smooth. But he felt a disorienting question arise: is the AI my agent, or am I the AI’s agent?
If you remove “me” from the equation, a virtual entity could ask the same questions, record impressions, reach conclusions. What’s missing is presence — “I need this person to leave a real impression on me, because we’ll be working together.”
Who is whose agent determines which path you’re on.
Taste as Neural Weight
Meng Yan pushes to a deeper layer: everyone in the AI era says “taste is all you need,” but what actually is taste? Where does it come from?
Li Jigang’s answer: taste is weight. Taste = Weight.
A UI designer who has viewed a thousand design mockups sees the 1,001st and feels “something’s off.” That feeling didn’t come from nowhere — those thousand previous images washed through the brain’s neurons and formed weights. You can’t skip the washing process and jump straight to the result.
“Taste is the weight you’ve developed through massive training. Your aesthetics aren’t innate — they’re trained.”
This produces a causal chain: your information feed washes your neurons → forms your viewfinder → the same information enters but your interpretation differs → your decisions differ → your life trajectory differs.
“Your feed is your fate.”
This is why he’s ruthlessly selective about what enters his brain. He cut his WeChat contacts from 4,000 to 500. He allows himself only 10 WeChat official-account subscriptions — adding an 11th means deleting one. RSS feeds: also capped at 10.
“Only with these constraints do I stop being overwhelmed. I found that what truly deserves your deep engagement is not that much.”
Meng Yan worries this might reduce his exposure to new viewfinders. Li Jigang’s response surfaces another belief: constraint and freedom operate on different levels. Constraint at one level creates freedom at the next.
“The ancients said: discipline the body, free the heart. You find freedom through constraint, then find new constraints, and freedom at yet another level. Cultivation is exactly this process.”
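The “taste = weight” claim above can be sketched as a toy model. Everything here — the two features, the numbers, the threshold — is invented for illustration; the point is only the mechanism: repeated exposure trains a prototype, and deviation from it produces the “something’s off” feeling.

```python
# Toy sketch of "taste = weight": exposure to many examples forms
# a weight vector; a candidate far from it "feels off".
# Features and numbers are hypothetical, chosen for illustration.

def train_taste(examples):
    """Average the seen examples into one weight vector (the 'taste')."""
    n = len(examples)
    dims = len(examples[0])
    return [sum(e[i] for e in examples) / n for i in range(dims)]

def feels_off(weights, candidate, threshold=1.0):
    """Euclidean distance from the learned prototype vs. a tolerance."""
    dist = sum((w - c) ** 2 for w, c in zip(weights, candidate)) ** 0.5
    return dist > threshold

# 1,000 mockups, each a (whitespace, contrast) pair clustered near (0.6, 0.8)
seen = [(0.6 + 0.001 * (i % 10), 0.8 - 0.001 * (i % 10)) for i in range(1000)]
taste = train_taste(seen)

print(feels_off(taste, (0.6, 0.8)))   # close to the trained prototype
print(feels_off(taste, (2.5, 0.1)))   # far from it: "something's off"
```

The sketch also makes the “your feed is your fate” chain concrete: change what goes into `seen`, and both the weights and every subsequent judgment change with it.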
Chains and Rings: How Knowledge Persists
A research paper on memory changed Li Jigang’s thinking. Its core idea: memory is a ring structure. An open chain — firing from A to B — fades over time. Only a closed ring persists in the brain.
This transformed his approach to prompting and AI collaboration. Writing a prompt used to be a chain: intent → prompt → result → done. Like an arrow shot into the void, leaving no trace. Now he wants to form rings.
He built a system where the model maintains two living documents about him: Memory (what it remembers about him) and Soul (its portrait of who he is at the deepest level). Both update with every conversation. He also wrote 15 principles describing the relationship between them.
After each conversation, a coded signal saves the content to his local notes. The model produces a weekly summary: “This week your cognitive structure gained these new elements, this conflict remains unresolved.” A custom Skill lets any two notes collide — finding isomorphisms, finding higher-dimensional structures that explain both.
“It’s no longer an open arrow shot into the void. It becomes a ring — things accumulate and become the starting point for the next round of thinking.”
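The chain-to-ring loop he describes can be sketched in a few lines. The file name, document shape, and summarization step below are all hypothetical stand-ins for his actual Memory/Soul setup — the only point is the structure: each session’s output is persisted and seeds the next session’s prompt.

```python
# Hypothetical sketch of the "ring" loop: prompt -> result -> persisted
# memory -> next prompt, instead of the open chain prompt -> result -> done.
# File name and prompt format are invented for illustration.

import json
from pathlib import Path

MEMORY_FILE = Path("memory.json")  # stand-in for his "Memory" document

def load_memory():
    """Start each session from whatever previous sessions left behind."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return {"sessions": []}

def close_the_ring(memory, session_summary):
    """Fold this session back into memory so it becomes the next start."""
    memory["sessions"].append(session_summary)
    MEMORY_FILE.write_text(json.dumps(memory, ensure_ascii=False, indent=2))
    return memory

def next_prompt(memory, question):
    """Seed the new session with accumulated context: the ring, not the arrow."""
    context = " | ".join(memory["sessions"][-3:])  # last few summaries
    return f"Context so far: {context}\nQuestion: {question}"

memory = load_memory()
memory = close_the_ring(memory, "Discussed viewfinders and Bayes as worldview")
prompt = next_prompt(memory, "How does taste form?")
print(prompt)
```

A weekly summary or a note-collision “Skill” would be further functions over the same persisted store; the essential move is that nothing exits the loop without being written back in.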
Companies as Wells, Not Networks
The conversation shifts to AI’s impact on organizational form.
Internet companies weave networks — connecting people to things. But in Li Jigang’s imagery, AI companies are more like wells being drilled.
Model providers are drilling a general-purpose well. Startups are drilling narrower but deeper wells. Pure wrappers with insufficient depth get swallowed when the model provider upgrades. But if you find a path where your well leads the provider’s — where a stronger model makes you stronger too — you’re secure.
What does well depth represent? The depth of understanding of a user. The internet era had “user portraits” — labels stuck all over people, an expedient born of necessity. The AI era can truly see a person.
“Today’s AI can see a person. It directly maps your soul. What kind of person you are, what you pursue — it pins that down, and then every word hits home.”
Business models shift accordingly. The internet’s underlying law is the Matthew Effect. The AI world’s underlying law is the long tail effect. From one-face-for-all (industrial era) to one-face-for-each-thousand (internet era) to one-face-for-one (AI era) — what you receive is made specifically for you.
Future companies might serve just ten or a hundred thousand people, but serve each one exquisitely. A one-to-three-person company, thriving.
Two People This Era Needs
Li Jigang argues our era needs two landmark figures.
The first is a Drucker for the AI age. Drucker answered “how does one person manage ten thousand people.” The new question is “how does one person manage ten thousand Agents.” Agents have no self-interest — removing that variable should produce an entirely different management philosophy.
The second is an existentialist philosopher for the AI age. Heidegger and Camus faced the absurdity of “God doesn’t exist.” Today we face the absurdity of “my proudest capability — thinking — is being crushed by a machine.”
“How should humans exist in the AI age? Someone needs to stand up and answer this. The intersection of the AI world and the philosophy world is too small right now.”
Meng Yan asks: why have there been no great thinkers on the scale of 2,000 years ago? Li Jigang offers two reasons. First, the intellectual architecture built by the ancients is so powerful that everyone since has been operating within their coordinate system — it’s extremely hard to break free. Second, the pace of change makes sustained deep thought almost impossible. “But precisely because we’re experiencing fast change — two hundred years, thirty years, three years — our era will produce new thinkers. Not because they’re smarter, but because the era’s questions are erupting before our eyes.”
Education: From Water to Fire
From the parent question of “how should humans conduct themselves,” Li Jigang derives his vision for education.
Today’s education system was born from the Industrial Revolution. Factories needed literate, punctual workers, so nations built the Prussian education system — perfectly coupled to factory operations. Two centuries later, we sit at computers weaving Excel spreadsheets. “Aren’t we just modern-day textile workers?” The underlying logic hasn’t changed. Only the factory has.
But now the factory isn’t hiring. It only needs ten thousand Agents.
“When the company you were going to work for no longer hires people — just Agents — what do you do? Why read books? Why take exams?”
Li Jigang calls the old paradigm water education — pouring knowledge into buckets, grading students by how full they are. He calls the future paradigm fire education.
“Fire education has two phases. First, parents explore the world with their children to discover where their uniqueness lies — which dimension is their little matchstick. Second, education’s job is to light it.”
Meng Yan distills it to a question: what keeps the child up at night? Li Jigang: exactly — find that.
Under water education, a child who loves drawing manga gets told “stop fooling around.” Under fire education: “You love manga? Wonderful — your spark has been found.”
During the transition period while the education system catches up, Li Jigang offers a pragmatic suggestion: water education by day, fire education by night. Complete schoolwork during the day. At home in the evening, use AI to kindle the child’s uniqueness. AI doing homework doesn’t mean copy-paste — it means: “What do you love? Ultraman? Great — let’s write a story about Ultraman that weaves in these 13 vocabulary words, generate a comic, read it, memorize the words. Tell your teacher tomorrow: I already know them, test me.”
“I Am” versus “It Is”
Running beneath the entire conversation is Li Jigang’s foundational reward function.
He calls a person’s irreducible, singular subjectivity “I am” (我在). He calls the standards and expectations imposed by society “it is” (他在). The flourishing of “I am” is defined as gain. The erosion by “it is” is defined as loss.
“I often trace back: why do I have this thought? Based on one-two-three. Good — who gave me one-two-three?” He learned this from Descartes — doubt everything, trace the origin.
He doesn’t examine every passing thought, but at major junctures, he strips away layers: is this truly mine, or did society tell me I “should” think this? With enough practice, common instances of “it is” get automatically identified and peeled away.
He has had a defining trait since childhood: a sense of detachment from the world. “I’ve always felt like an observer of this world, not a participant. There are always two of me — one watching from the outside.” When everyone is laughing, he doesn’t feel joy. When everyone grieves, he doesn’t feel grief. He once thought something was wrong with him, then accepted it. This makes him an unusually steady investor — market crashes don’t move him. “The structure hasn’t changed. I bought the structure.”
How Should One Conduct Oneself
Three and a half hours of conversation converge on this point at the end.
Before recording, Meng Yan did something unusual. He fed all their prior materials — presentation slides, a 9-hour conversation transcript, scattered WeChat exchanges — to his AI partner, which also had deep knowledge of Meng Yan himself from hundreds of essays and podcast episodes. He asked just one question: what do you most want to hear us discuss?
The AI’s answer silenced both of them:
“Li Jigang is someone walking from the edge of rationality toward the heart — starting from Bayes’ theorem and Occam’s Razor, he eventually picked up the Diamond Sutra and the Tao Te Ching. Meng Yan is someone walking from the heart toward structure — starting from intuition, he built decision frameworks through investment practice. You’re walking from completely opposite directions, and you’ve met at the question of ‘how should one conduct oneself.’”
Li Jigang’s response:
“Two stones striking in the middle — that spark.”
The question has no answer. It’s a mother-question, one that “endlessly generates good sub-questions.” But scattered throughout the conversation are their fragmentary answers: heart-force, taste, freedom through constraint, the return of wetness, expression as existence, finding the unique dimension where your little matchstick burns.
Perhaps the closest thing to an answer is what Li Jigang has been doing all along — “advance one inch, and gain one inch of joy.”