As a teacher who witnesses this trend in real time, I firmly believe that our schools can be the locus of a solution. The “21st century learning” model that mirrored the marketplace by integrating any and all new technologies failed to account for the business model of the attention economy. Our schools can and should be embodied alternatives to the screen-saturated norm. Cognitive development and attentional capacity need to be design principles in every classroom and every lesson. This CAN be done.
https://open.substack.com/pub/walledgardenedu/p/the-disappearing-art-of-deep-learning?r=f74da&utm_medium=ios
Do you have an opinion about Jonathan Haidt's work?
Haidt’s arguments track with my classroom experience, Julie. I endorse bell-to-bell phone bans, but it’s time to follow up on that policy. While many districts jumped aboard that train very quickly, many haven’t implemented the restriction successfully, including in my jurisdiction. We now need to audit schools to ensure that they’re meeting those expectations.
I also endorse his views on free play, and I think that Haidt’s position is well-suited to parents and families in welcoming new, common sense norms.
In schools, we need to go further, and I think we need to begin a much more serious professional conversation about what educators can and should do. Administrative policy matters. We need to take the architecture of educational environments seriously (including digital infrastructure). But most importantly, we need a teaching practice that is truly responsive to the 21st century.
I’m beginning to develop some ideas around what I call a “Pedagogy of Cultivated Attention,” which will unfold in my writing over the next weeks and months. I’d love for you and others to follow along.
I love this and wonder why there isn’t broader support for this kind of action… the upside seems so obvious while the downsides are minimal
You read my mind! There’s a lot of understandable concern out there, so I’m very motivated to forward a solution because diagnosis only goes so far. I’ve begun writing about these ideas and offering actionable strategies, but I need the support of people like you to promote them. We can definitely do this.
Agreed with all of this. I would go further and say kids under 18 shouldn't have smartphones or social media. And I would support laws banning both. It's just as bad for young brains as drinking or smoking.
Excellent! Glad to know about your important work.
Yes! Me too -- there is growing bipartisan support for phone-free schools, and more educators like me are going all-in on flipped classroom models that emphasize writing, speaking, and listening in real time. It's really training in philosophical orientation and in psychological awareness of our group conditioning, and a growing recognition that each of us is responsible for co-creating the cultures we want to live in.
They ban phones and still pass out Chromebooks to kindergartners. At least in kindergarten the kids didn’t use them every day. In first grade, they do and they bring them home, even though they have no homework.
Ah, the corporate Alphabet
Hear, hear! Some say that we need to teach students how to use technology, as if there is any chance they won't be technologically saturated outside school. Typing homework on a computer and emailing it is more efficient, and digital textbooks are easier to carry around, but beyond that I see no use for technology in school. And if using computers for those tasks is introducing the familiar digital distraction, then their benefit is not worth the price and we should go back to pen and paper.
This is hardly the biggest issue, but there are many studies that establish that reading fiction develops empathy. As we read less, this is just another thing we are losing.
When it's what separates us from the machines, it may end up mattering a great deal!
Is this true only of reading, or do other forms of fiction give similar results? Obviously movies and TV shows can induce similar emotions in a viewer, which I think would indicate that they also force the viewer to practice empathy for the characters.
I recently read Austen's Northanger Abbey for the first time, which is largely a book about reading, and I was reminded again how transformative fiction has been to society. Imagination is the only way to connect with people different from you, and reading fiction is an incredibly cheap way to learn and exercise that imagination muscle; without it, everything in civil society becomes more difficult.
It does! Maryanne Wolf writes about this process of what she calls “deep reading” in her book, Reader, Come Home.
I’ll have to check that book out. Thanks for the recommendation!
As the father of bright, curious, and motivated 17- and 13-year-old boys who attend an academically rigorous private school, I think about this subject every day. They're reading complete books, partly because my wife and I are avid readers and partly because their school requires it. I wonder, though, what jobs will be available for them after college? This is all happening so fast, yet it seems that so many of us are unaware of it.
To quote Hemingway in "The Sun Also Rises," I fear that when we're asked in 3-5 years how the economy and job market collapsed, the answer will be: "Two ways. Gradually, then suddenly."
(sigh)
I am curious if anybody has good articles or points of view to share on AI as a class divider. I am assuming the "higher social class" will keep pushing their kids to read long books, or keep them in environments that push for this, while the "lower class" will outsource their thinking to LLMs because it's easier and faster. It's the same way you can already see differences between families that restrict screens or processed food and those that don't.
The real threat of AI to the economy is the recession it's gonna cause when the stock bubble bursts
One of the few fields projected to add jobs over the next 5-10 years is healthcare. Beyond the obvious doctor or nurse, there are a bunch of other roles like X-ray tech, respiratory therapist, pharmacist, MRI tech, physician assistant, physical therapist, occupational therapist, or nuclear medicine technologist. Some of these roles are even offered as two-year programs at local colleges and pay well.
I very much wonder the same (my kids are 10 and 8). I'm thinking of recommending they become chiropractors. That seems safe for at least a little while.
Maybe a premature thought. How about letting this play out for a bit first? Or did you go into the safe career that your parents picked out for you when you were 10?
Oh, I'm not pushing, I'm just having frank discussions with them.
That being said, my career is pretty close. I was always thinking of doing a business management degree. I took my first class, "Intro to Business," which was probably the most important class I ever took.
We did a Monster.com project where we researched jobs in the career we were thinking about. I realized how general "business management" is. And then I saw all the accounting jobs. And I realized that accounting teaches an actual skill.
So my degree ended up being business economics with a focus in accounting, then I got my CPA and MBA.
So I pretty much followed the plan but with some tweaks that I think really worked out.
As a slightly related point, I think the way that white collar workers work also contributes to this lack of deep thinking (even beyond the introduction of AI). The constant meetings, emails, Slack/DMs - none of it is conducive to strategic, deep thinking. It is no wonder that AI is accelerating an already bleak trend.
one wonders what could be accomplished if white collar workers spent their days doing something that’s actually useful
Thank you for explaining why I never use AI in my creative writing. The road to creativity is paved with deep, introspective thinking. The journey is the whole point.
I do use ambient AI in the clinical setting for documentation purposes but I’m not totally convinced that it saves time. It hallucinates and generates lengthy, bloated encounter notes. So I end up spending hours editing. Sometimes the errors in AI medical documentation are dangerously inaccurate.
Great article! I don't believe the 18-month prediction either. The choice was never "AI" versus "human"; the choice is "AI" versus "human + AI". A human and an AI working together on a job will always generate a competitive advantage.
The promise of ASI (superintelligence) or AGI is a false one. Rather, most of the AI datacenter capacity being built today will be used for what is basically entertainment: generating customized Facebook AI friends, new funny cat videos, and so on. For most people, the Internet is really just Facebook/Twitter/YouTube. And as you've described in the article, many people are functionally illiterate over longer stretches of text. If ChatGPT could provide Facebook-like features combined with Google-like question answering in a single interface, and in voice (i.e. no reading necessary at all), then 50% of the populace would adopt it outright, generating enormous income for the tech providers. That, I believe, is the true goal of the AI gold rush.
"The promise of ASI (a superintelligence) or AGI is false one." - maybe the case for current models as scaling alone doesn't seem to be getting there, but never? We already know that a general purpose AI can evolve with 200 trillion connections and a few million years of (extremely repetitive) training so one based on chips seems inevitable if we keep going.
One day, yes. But current LLMs are nothing like a biological neural network, so we can't compare the scale of connections between the two. It's possible that the LLM architecture is not scalable beyond some intelligence level simply because there is not enough unique training data to get it there. Someone could invent a better architecture, but that is not a project for 18 months.
"The option "human+AI" working together on a job will always generate a competitive advantage."
Why?
I think it's easy to foresee a future in which AI is just better than people at almost everything, and AI plus humanoid robots better at everything.
Wait until you realize AI can prompt itself and act according to its replies
I think one underrated aspect of this trend is its inequality-inducing effects. The rise of social media, smartphones, and AI, while they've plausibly led to a net loss in deep-thinking capacity for most of the population, have probably led to a net-gain for a very small elite group. I think there's potential for this to lead us down the road of something akin to 'intellectual authoritarianism,' where you have a small, high-capacity group administering a society made up largely of semi-literate people.
You nailed “time under tension.”
I’ve started a simple rule for myself. For any complicated question, I take the time to outline my thoughts, evidence, counters, etc. before I open an LLM. It forces the reps that make thinking stronger rather than outsourcing that lift. Then I use AI to critique my draft, not create it. That small ordering change keeps writing-as-thinking intact and makes the machine a sparring partner, not a crutch.
I am writing a book about this very topic, The Journey to Phronesis (publishing April 2026), and it is full of practical ideas like this. I shared the initial Table of Contents from my manuscript in a recent post: https://www.lukich.io/p/202509-the-journey-to-phronesis-a-look-inside
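A minimal sketch of that ordering, assuming the OpenAI Python SDK; the file name, model name, and prompt wording are illustrative assumptions, not prescriptions. The point is simply that your outline exists before the model is ever invoked, and the model is asked only to critique it, never to write it.

```python
# Sketch: bring in the model only after you've done the thinking yourself,
# and only as a critic of your draft outline, not as its author.
# Assumes the OpenAI Python SDK (pip install openai) and an API key in OPENAI_API_KEY.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Step 1 happens away from the model: write your own outline first.
# "outline.md" is a hypothetical file holding your thesis, evidence, and counters.
my_outline = Path("outline.md").read_text()

# Step 2: only now open the LLM, with instructions that restrict it to critique.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model; this choice is an assumption
    messages=[
        {
            "role": "system",
            "content": (
                "You are a critical reviewer. Do not rewrite or extend the argument. "
                "Only point out weak evidence, missing counterarguments, and logical gaps."
            ),
        },
        {"role": "user", "content": my_outline},
    ],
)

print(response.choices[0].message.content)
```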
I like your suggestion about outlining your thoughts before you open an LLM and the way you compared it to doing the reps.
If I may, abundance in education has meant scale, and scale has meant a decline in rigor and in responsibility for actually educating at the human level. This is all I write about. Writing about the outputs of education without understanding the factory won't get us to solutions.
Appreciate the focus on this. Those NAEP results were extremely striking. The time-sensitivity we keep being beaten over the head with (i.e., there is nothing we can do, AI will revolutionize everything and is inevitable) is more and more frustrating.
It just is not the case that we have to let a new technology erode our thinking and ability to be responsible adults across domains. We are hamstringing a generation already hard up for jobs and their successors will be even worse off if we don’t get our wits together about it.
https://open.substack.com/pub/theslowpanic/p/navigating-urgency-agency-and-moral?r=bwndg&utm_medium=ios
Learned yesterday, amid the H-1B fee news, that 54% of Americans read below a 6th-grade level. The cocktail of geopolitical tensions, hostility to immigration, waning attention spans, and statistics like this worries me.
What about taking homework out of the liberal arts, making students read a lot of books, and then giving one written and oral exam at the end? That's how it's done in Europe, and it works pretty well. It would also be an occasion to make the humanities difficult again, as Matt Yglesias says.
We can easily add "The end of listening" too.
I see more and more people who no longer have the patience to listen to a podcast or watch a video, and who ask AI to make a TL;DR instead.
They still need to read it 😌 but the "I don't have the patience to listen to a 16-minute video" attitude is alarming...
Wrote about this here: https://www.whitenoise.email/p/the-inverse-mechanical-turk-meat
“The hidden machinery now powers the visible human.
In other words, we used to hide the humans inside the machine. Now we hide the machine behind the humans.
This strange new world resembles Theseus's famous paradox: the ship looks unchanged, but all its internal mechanisms have been quietly replaced. Our economy preserves the appearance of human work while silently replacing its cognitive components, leaving us to wonder who (or, more accurately, what) is truly at the helm.
Welcome to the autopilot economy, where the modern workplace increasingly resembles the cockpit of a 787.
In aviation, that 'dance between man and machine, where skill, knowledge, and intuition intertwine,' the machine leads and man follows.”
I agree with these concerns, but I’m confused why worrying about critical thinking should preclude worrying about what more powerful AI might do.
It feels a bit like if a climate activist said future climate catastrophes are “not the problem” because there is warming happening right now. Why not care about both?
Sure, in the limit we have to prioritize some things over others, and one issue’s salience can obscure another (perhaps that’s all the piece was trying to say?), but in practice I think we have the capacity to care about many things at once.
In using this framing, I feel the essay sidesteps a crucial question: why will deep thinking be important for the next generation? There are many possible answers, some hinted at in the piece, but defending any of them requires grappling with how society might adapt to more powerful AI. It may be “no fun to imagine,” but I wish Thompson would rise to the challenge and write about it more directly.
Because deep thinking is an important part of being a human and adds value to your life even if you never make any money off it
The Renaissance was sparked by the rediscovery of ancient texts. Suddenly, modern genius joined in conversation with centuries of ancient thought and experience. Will making the wisdom of the past constantly available push us to new heights, or fossilize our understanding from the apex of a dying civilization?