What is AI Brain + Recommendations [Friday Wrap-Up]
AI brain is when you’ve come to rely on AI for even the most basic things — and after a week deep in Claude Max, I felt it creep back in. I’m sharing the three warning signs I’ve identified so you can catch it early, plus how bad sleep and brain fog made it worse.
I’ve also got recommended reading from Mike Schmitz on using Claude to script YouTube videos (not my approach, but a thoughtful one), and a More Perfect Union video that exposes Polymarket’s prediction markets for what they really are.
On my mind
- AI Brain
- Good Sleep
- How much are we really on our phones?
Recommended Reading
Recommended Media
- (00:00) – What’s on my mind
- (08:03) – Recommended reading
- (12:14) – Recommended media
————
Streamlined Solopreneur is the podcast for solopreneurs who want to automate their business and take time off worry-free. Each week, Joe Casabona shares practical systems, tools, and strategies to help you reclaim your time and run your business without sacrificing the rest of your life, or your health.
Start with the free Solopreneur Sweep — a step-by-step method for finding where your business is losing time: https://streamlined.fm/sweep
If this episode helped you, leaving a review on Apple Podcasts helps other solopreneurs find the show — it only takes a minute and means a lot.
Connect with Joe on LinkedIn: https://www.linkedin.com/in/jcasabona/
00:00:00.220 –> 00:00:09.240
Hey everybody, and welcome to the Friday wrap-up for April 17th, 2026, on Streamlined Solopreneur.
00:00:09.460 –> 00:00:11.700
This is a short episode where I talk about three things.
00:00:12.200 –> 00:00:16.780
What’s on my mind this week, recommended reading, and recommended media.
00:00:17.500 –> 00:00:22.340
This show is here to help you automate your business so you can take time off worry-free.
00:00:22.960 –> 00:00:26.060
And hopefully this curation will help you think more about your systems.
00:00:26.800 –> 00:00:29.980
I’m your host, Joe Casabona, and here’s what’s on my business
00:00:30.000 –> 00:00:37.120
mind this week. The first thing is something that my friend Bianca and I lovingly call
00:00:37.420 –> 00:00:43.740
AI brain. I’m writing a much longer piece on this, but I’ve been trying out Claude Code,
00:00:43.960 –> 00:00:51.140
or Claude Max, Claude co-work, just a bigger plan for Claude. And I’ve been thinking about this
00:00:51.320 –> 00:00:57.600
because in the middle of the week, I got hit with this weird brain fog that felt familiar
00:00:57.600 –> 00:01:06.580
like I had experienced it before, and in fact I had experienced it before, in June and July of
00:01:07.250 –> 00:01:17.460
2025, where I realized that I was relying on AI for too much. Now, I recognized the warning
00:01:17.640 –> 00:01:25.180
signs very early, but it caught me differently this time because I wasn’t using it to create
00:01:25.200 –> 00:01:32.160
content or even come up with episode ideas. I was using it to crunch a bunch of data and then help
00:01:32.240 –> 00:01:40.380
me draw conclusions, but I still felt like I had spent a lot of the last week talking to a robot
00:01:41.440 –> 00:01:49.380
and treating it like a person, which is something I try very hard not to do. So this is something
00:01:49.480 –> 00:01:54.940
that’s on my mind. Again, I’m working on a bigger piece on this,
00:01:55.180 –> 00:02:02.400
where I mention the three signs that you might be getting AI brain and how to prevent it.
00:02:02.520 –> 00:02:05.780
And I’ll just share the three signs with you because those are the things that I’ve worked out.
00:02:06.660 –> 00:02:11.920
The first is that you can’t think about something for more than five minutes before running to an LLM for advice.
00:02:12.120 –> 00:02:18.460
And this is the thing that made me realize it: I was sitting there, kind of stuck between multiple tasks.
00:02:19.500 –> 00:02:24.960
I was up in the dining room, which is just a terrible place for me to work because it’s right next to the playroom.
00:02:25.599 –> 00:02:30.160
and, you know, it’s between the playroom and the living room. And so there’s just a lot going on.
00:02:30.200 –> 00:02:35.760
It’s very hard to focus. And instead of deciding I was going to move, I was like, oh, I’ll just ask AI.
00:02:35.940 –> 00:02:41.860
And then I thought, wait, I spent no time thinking about this. So if you can’t think about something for
00:02:41.960 –> 00:02:46.240
more than five minutes before running to an LLM, that’s the first and most significant sign, I think.
00:02:46.880 –> 00:02:53.280
The second is that you’re not talking to the people who matter, people being the emphasis here.
00:02:53.380 –> 00:02:59.280
If you are deciding not to talk to potential customers, clients, or people in your audience,
00:02:59.390 –> 00:03:05.940
and instead just asking AI what it thinks about the avatar it has for your customers, clients,
00:03:06.190 –> 00:03:09.140
and audience, you probably have AI brain.
00:03:09.680 –> 00:03:14.920
The third is that you have no skin in the game. And this was the second thing that I noticed from this week
00:03:15.040 –> 00:03:20.900
is, you know, I asked it to crunch a bunch of data and then look at a bunch of client calls
00:03:21.400 –> 00:03:23.100
and come up with common objections.
00:03:23.980 –> 00:03:25.180
And I, of course, compared that
00:03:25.260 –> 00:03:26.940
against my own anecdotal evidence.
00:03:27.700 –> 00:03:30.220
And I thought, all right, well, help me come up
00:03:30.260 –> 00:03:31.280
with a content strategy.
00:03:31.500 –> 00:03:34.580
Don’t think of the actual content titles,
00:03:34.700 –> 00:03:36.720
just kind of like the pillars for the content.
00:03:37.660 –> 00:03:39.680
And then as I’m sitting down to plan my podcast,
00:03:40.600 –> 00:03:43.980
I thought, I don’t care about this plan.
00:03:44.320 –> 00:03:45.740
I had no skin in the game
00:03:45.860 –> 00:03:48.840
because I didn’t spend appreciable time thinking about it.
00:03:48.920 –> 00:03:56.400
And so I was able to course correct a lot more quickly than I was able to in summer 2025.
00:03:57.540 –> 00:04:04.340
But it is something that requires, to quote Mad-Eye Moody, constant vigilance, I think,
00:04:04.520 –> 00:04:12.500
something that you need to think about. Because what Claude Max especially, like what AI can do,
00:04:13.560 –> 00:04:18.820
what large language models can do with bigger context
00:04:18.900 –> 00:04:26.800
windows truly seems like magic sometimes until it doesn’t. And it might be a while before it feels
00:04:26.980 –> 00:04:31.280
like it doesn’t, and that thing snaps you out of it.
00:04:32.500 –> 00:04:38.880
I’m giving you a problem without a solution right now, because the solution is
00:04:40.080 –> 00:04:47.360
basically just an outline. But I’ll talk about it more on this show: how to prevent
00:04:47.580 –> 00:04:52.840
AI brain, or at least the things that I’m doing to prevent AI brain.
00:04:54.440 –> 00:04:58.320
The second thing that is on my mind is getting good sleep. I’ve been getting bad sleep this
00:04:58.440 –> 00:05:02.860
week. I have been inexplicably, I shouldn’t say inexplicably. I’ve been eating kind of like
00:05:02.940 –> 00:05:11.080
garbage too close to bedtime. And I’ve been waking up somewhere between 2:30 and 4 every
00:05:11.960 –> 00:05:17.340
morning. And it hasn’t been great. Last night, I ate reasonably
00:05:20.020 –> 00:05:21.620
a reasonable amount of time before bed.
00:05:22.760 –> 00:05:27.680
And I slept all the way until 4:50, which, I mean, I usually wake up at 5 anyway.
00:05:29.040 –> 00:05:35.240
And so I’ve just been thinking that maybe part of the reason I felt
00:05:36.080 –> 00:05:43.800
AI brain or where I felt the need to lean on Claude more than I would have otherwise
00:05:43.820 –> 00:05:49.560
is because of the brain fog from not getting very good sleep.
00:05:49.780 –> 00:05:57.220
And that’s also something to think about, like check in with yourself and try to get good sleep.
00:05:57.400 –> 00:06:00.600
You know, I’m not a sleep expert or sleep doctor.
00:06:01.680 –> 00:06:02.840
I’m a 40-year-old man.
00:06:03.340 –> 00:06:13.780
And as somebody who’s now in his midlife, and someone who experienced a lot of sleep deprivation with three small kids,
00:06:15.320 –> 00:06:21.440
I know the importance of sleep. My brother said that one thing he was kind of jealous of me about
00:06:21.700 –> 00:06:28.800
is the fact that for a long time I did not need a lot of sleep to function well.
00:06:30.540 –> 00:06:35.200
And while that is true, right, I can usually get away with a day of not great sleep
00:06:35.680 –> 00:06:43.760
or I can function the day after not having good sleep. But I do see the difference between getting good sleep
00:06:43.780 –> 00:06:44.740
and getting bad sleep.
00:06:45.780 –> 00:06:48.260
And you’re definitely at your best when you get good sleep.
00:06:48.380 –> 00:06:50.660
So just something that’s on my mind.
00:06:51.620 –> 00:06:55.820
A super secret bonus third thing that I don’t have in my outline here,
00:06:55.920 –> 00:06:59.080
but that I’ve been thinking about a lot
00:06:59.800 –> 00:07:01.600
has to do with my latest LinkedIn post,
00:07:01.680 –> 00:07:06.160
which I will link in the show notes and the description.
00:07:07.280 –> 00:07:10.700
And that is just like being on our phones too much.
00:07:11.780 –> 00:07:13.200
I went to a coffee shop this morning.
00:07:13.400 –> 00:07:15.560
and literally everybody in front of me
00:07:16.000 –> 00:07:18.040
was staring down at their phone with the same face.
00:07:19.180 –> 00:07:20.560
And I was just looking around
00:07:20.860 –> 00:07:23.780
to make eye contact with another human being
00:07:24.740 –> 00:07:26.360
and I didn’t.
00:07:27.200 –> 00:07:29.540
And this is coming on the heels of me writing
00:07:31.220 –> 00:07:34.080
a pretty, I feel, important piece
00:07:35.040 –> 00:07:36.840
for the Home and School Association,
00:07:38.020 –> 00:07:41.140
which is, I’m president of my kids’ Home and School Association,
00:07:41.380 –> 00:07:43.120
and I talk about the importance of technology
00:07:43.200 –> 00:07:45.500
and waiting until our kids are 14 to give them smartphones.
00:07:47.860 –> 00:07:50.600
And this is something I feel increasingly strongly about.
00:07:50.900 –> 00:07:53.760
My year of digital detox is going well, maybe too well.
00:07:54.680 –> 00:07:58.040
Definitely not too well, but it’s just something I’m constantly thinking about.
00:07:58.140 –> 00:07:59.560
And it’s a little bit of a half-baked idea,
00:07:59.560 –> 00:08:00.920
but it’s something I wanted to bring up here.
00:08:02.000 –> 00:08:02.200
Okay.
00:08:03.480 –> 00:08:07.880
So, recommended reading and media.
00:08:08.840 –> 00:08:14.100
Recommended reading is an article from my friend Mike Schmitz called How Claude
00:08:14.260 –> 00:08:21.060
helped me make the videos I want to make. And he goes through this process about how he has
00:08:21.220 –> 00:08:27.320
started a new YouTube channel about basically book reviews. He has the Bookworm podcast,
00:08:27.500 –> 00:08:37.460
which is really good. And he was stuck because he had a certain quality in his mind for
00:08:37.460 –> 00:08:45.860
these videos, and scripting them seemed like something he wouldn’t
00:08:45.860 –> 00:08:55.200
be able to do consistently. And so he asked himself, I’m quoting the article here, “Could I expedite
00:08:55.200 –> 00:09:01.200
the process by hiring Claude as a research assistant and scriptwriter?” And he says the answer is a
00:09:01.480 –> 00:09:07.420
resounding yes. And now he’s personally walked me through this system because
00:09:08.500 –> 00:09:09.440
we’re in a mastermind together
00:09:10.420 –> 00:09:10.820
and
00:09:12.140 –> 00:09:12.920
I am
00:09:13.660 –> 00:09:15.280
definitely the town crier
00:09:15.760 –> 00:09:17.300
for AI and overuse of AI
00:09:17.540 –> 00:09:18.440
in this mastermind group
00:09:19.620 –> 00:09:20.760
I am easily
00:09:22.100 –> 00:09:23.780
I would say I’m easily the most skeptical
00:09:23.900 –> 00:09:25.320
person. I don’t know if that’s true,
00:09:26.840 –> 00:09:28.620
but I feel like I am
00:09:28.620 –> 00:09:29.960
or I feel like I’m at least the most
00:09:30.300 –> 00:09:32.060
vocally skeptical about it
00:09:34.240 –> 00:09:35.480
but he walked me through this whole process
00:09:35.490 –> 00:09:36.220
and it was really interesting
00:09:36.400 –> 00:09:43.120
he takes really, really in-depth notes with like an emoji system and things like that.
00:09:43.120 –> 00:09:46.000
And he quote unquote trained Claude on this.
00:09:46.000 –> 00:09:48.120
He has a video script template.
00:09:50.100 –> 00:09:58.600
And so he feeds all of this into a Claude skill that pulls in all of his personal notes,
00:09:59.140 –> 00:10:02.060
pulls in the mind map, grabs his rating for the book,
00:10:03.460 –> 00:10:09.620
pulls in the bookworm episode, which is a full transcript to get his thoughts on the book,
00:10:09.800 –> 00:10:15.600
interprets his emoji coding system, and puts it together. And so this is really interesting
00:10:15.720 –> 00:10:24.120
because this is a riff on the question I get a lot, which is: if I train AI on everything
00:10:24.280 –> 00:10:31.480
I’ve ever written, doesn’t it make something that sounds like me? And I think what’s really
00:10:31.500 –> 00:10:33.300
important here is that
00:10:34.280 –> 00:10:34.920
Claude is
00:10:36.100 –> 00:10:37.240
writing the video script,
00:10:37.440 –> 00:10:39.600
quote unquote, which I question
00:10:40.520 –> 00:10:41.580
but I also
00:10:41.700 –> 00:10:43.440
watched Mike’s video and it felt
00:10:43.540 –> 00:10:44.680
very natural, so I don’t know
00:10:45.760 –> 00:10:47.620
I don’t know what the delta is between
00:10:48.280 –> 00:10:49.720
the thing that Claude outputs
00:10:49.860 –> 00:10:51.460
and then the thing that he modifies.
00:10:52.780 –> 00:10:53.020
But
00:10:54.540 –> 00:10:55.620
the AI is also
00:10:55.760 –> 00:10:57.140
not the delivery mechanism.
00:10:57.600 –> 00:10:59.140
So he takes this script,
00:10:59.320 –> 00:11:01.460
probably fixes it up. The script
00:11:01.560 –> 00:11:08.300
is heavily focused on a very narrow topic, I should say, right? He’s not saying come up with a book
00:11:08.500 –> 00:11:17.520
review for me. He’s saying, here are a lot of assets about a book I’ve read; now put together a script
00:11:17.520 –> 00:11:26.380
in this template for me. And then he delivers it. And so I think that this is,
00:11:26.480 –> 00:11:31.160
I’m going to say an interesting approach. It’s not for me.
00:11:32.400 –> 00:11:33.200
I don’t think
00:11:34.580 –> 00:11:34.700
but
00:11:35.640 –> 00:11:35.860
you know
00:11:36.040 –> 00:11:37.500
he records the video
00:11:37.920 –> 00:11:39.560
and then he ships it off to an editor
00:11:39.820 –> 00:11:40.420
who edits it
00:11:41.940 –> 00:11:43.140
and so his bottom line
00:11:43.260 –> 00:11:44.680
is Claude helps him make videos
00:11:44.730 –> 00:11:46.420
he wouldn’t otherwise be able to make
00:11:47.900 –> 00:11:48.340
so
00:11:50.220 –> 00:11:50.880
I’m you know I
00:11:51.920 –> 00:11:53.160
again I’m going to say this is
00:11:53.380 –> 00:11:55.140
not a process for me but I think
00:11:55.260 –> 00:11:57.180
it’s an interesting process and I don’t want to have
00:11:57.240 –> 00:11:59.060
like some monoculture
00:11:59.240 –> 00:12:01.160
on this podcast I want to
00:12:01.640 –> 00:12:07.960
present other points of view, even if I do present mine the most because it’s my show.
00:12:09.020 –> 00:12:12.860
So I think this is really interesting. I’ll have a link to this and everything in the show notes.
00:12:14.000 –> 00:12:20.040
And then recommended media. I was going to recommend the Scrubs revival, which we’ll call season
00:12:20.200 –> 00:12:26.840
10 of Scrubs. One of my favorite shows, maybe my favorite show of all time, usurping Friends for
00:12:26.960 –> 00:12:31.120
that spot. And I thought that the nine-episode revival
00:12:32.579 –> 00:12:33.380
was fantastic.
00:12:34.740 –> 00:12:35.300
And so I love that.
00:12:35.380 –> 00:12:36.100
I love that show.
00:12:36.400 –> 00:12:37.740
I hope it comes back for season two.
00:12:38.400 –> 00:12:39.840
They opened up a lot of threads.
00:12:40.860 –> 00:12:42.660
And so I’d imagine they’re pretty confident
00:12:42.820 –> 00:12:44.160
it’s going to come back for a season two,
00:12:44.280 –> 00:12:46.020
but I don’t think it’s been officially renewed
00:12:46.580 –> 00:12:47.900
as of the time of this recording.
00:12:48.820 –> 00:12:51.000
So that was my original media recommendation
00:12:51.360 –> 00:12:53.720
until I watched a video from More Perfect Union
00:12:54.980 –> 00:12:57.520
called Polymarket Asked to Work with Us.
00:12:58.120 –> 00:13:00.180
We exposed their scam instead.
00:13:00.220 –> 00:13:08.200
It’s a 20-minute video. I have no opinions on the outlet More Perfect Union.
00:13:09.260 –> 00:13:17.760
I have not consumed a lot of their content, but I certainly liked this.
00:13:18.760 –> 00:13:29.540
And so I have been suspicious of things like this, Polymarket, and the other one they mentioned,
00:13:29.900 –> 00:13:32.880
I don’t think it’s called Kashi.
00:13:34.740 –> 00:13:36.400
But whatever it’s, uh,
00:13:37.140 –> 00:13:39.200
Kalshi. Close. Really close.
00:13:39.320 –> 00:13:39.840
Call she.
00:13:40.270 –> 00:13:44.580
Uh, about, you know, democratizing finance,
00:13:44.760 –> 00:13:46.480
and really just a bunch of scams, right?
00:13:46.620 –> 00:13:51.540
Anybody who fights this hard to say they’re not a scam almost certainly is a scam.
00:13:52.580 –> 00:13:56.200
This is, at its base, gambling, and they go out of their way
00:13:56.280 –> 00:13:57.880
to say it’s not, but it’s really hard to say it’s not gambling.
00:13:59.500 –> 00:14:09.440
And I just think it’s a very well-done video. If you’ve never heard of Polymarket: in sports, this is called prop betting.
00:14:09.600 –> 00:14:15.440
And prop betting is bad, really bad, really bad.
00:14:16.520 –> 00:14:19.800
Death threats to players bad, right?
00:14:20.220 –> 00:14:29.220
Players throwing games, doing the thing that Pete Rose has long been accused of, or like the Black Sox scandal
00:14:29.640 –> 00:14:34.920
of 1919, on a much more granular level.
00:14:37.760 –> 00:14:49.080
So it’s just like a combination of a really low barrier way to gamble and a really easy way
00:14:49.080 –> 00:14:52.060
to do insider trading if you’re the insider.
00:14:53.880 –> 00:14:55.480
So I thought that was a really good video.
00:14:55.490 –> 00:14:57.360
I think that if you don’t know anything about
00:14:59.160 –> 00:15:27.960
Polymarket: there are these prediction markets where you can bet on, like, how many days a ceasefire will last, or how many pitches Max Fried will throw in his next outing, or will Trevor Noah say potato? That’s right from the video. Will Trevor Noah say the word potato when he hosts whatever he’s hosting?
00:15:29.180 –> 00:15:32.560
And so like a lot of knowable things for the right people.
00:15:33.180 –> 00:15:34.040
So I think it’s really good.
00:15:34.040 –> 00:15:34.780
I strongly recommend it.
00:15:35.600 –> 00:15:37.060
So that’s my media recommendation.
00:15:37.420 –> 00:15:44.200
So that’s what’s on my mind, some recommended reading, and some recommended media.
00:15:44.920 –> 00:15:51.300
That’s it for this episode of the Friday wrap-up of Streamlined Solopreneur.
00:15:52.780 –> 00:15:58.760
If you have thoughts or a story, you can send it over at Streamline
00:15:59.840 –> 00:16:00.800
Feedback.com.
00:16:01.840 –> 00:16:08.180
If you want to think more about your systems as a solopreneur, maybe you’re interested in
00:16:08.280 –> 00:16:15.120
trying something like Mike mentioned here, then you can take a closer look at everything you
00:16:15.220 –> 00:16:15.420
do.
00:16:15.820 –> 00:16:21.200
And I have a process for how to do that over at streamlined.fm slash sweep.
00:16:22.200 –> 00:16:23.280
That’s my solopreneur sweep.
00:16:23.500 –> 00:16:24.420
It’s a free process.
00:16:25.080 –> 00:16:27.500
It’s like a 30-minute audit of your business.
00:16:28.560 –> 00:16:32.240
So I would check it out, maybe ask yourself those questions.
00:16:32.540 –> 00:16:35.840
Again, that’s over at streamlined.fm slash sweep.
00:16:36.540 –> 00:16:37.520
Thanks so much for listening.
00:16:38.300 –> 00:16:42.660
And until next time, I hope you find some space in your weekend.
