What Is the Forgetting Curve, and How Do You Beat It?

The Ebbinghaus forgetting curve explains why you forget 70% of what you read within a day. Here's the 1885 study, the data, and what actually flattens the curve.

Elliott Tong

April 29, 2026

14 min read


The forgetting curve is the graph of how fast you lose new information after learning it. Hermann Ebbinghaus discovered it in 1885: without review, you forget around 50% within an hour and around 70% within a day. The curve flattens with active recall and spaced review. It does not flatten with highlighting, re-reading, or good intentions.

I have a stack of books in my flat I can describe in two sentences each. I read them. I underlined things. I felt smart. Then I closed them, and most of the contents drained out of me within a week.

That's not a personality flaw. It's a curve.

The forgetting curve is one of the oldest, most-replicated findings in psychology, and it explains almost every frustration the modern reader has. The article you read on the train and can't summarise by dinner. The book you finished last month that you can recommend but not actually quote. The course you "completed" that left almost no trace. None of that is unusual. It's the default behaviour of human memory when you don't fight back.

This piece is the fight. What the curve is. Where it came from. What the data actually says. Why most of the things that feel like learning don't change the slope. And what does.


What Is the Forgetting Curve?

The forgetting curve is a mathematical description of how memory decays over time when you don't actively reinforce it. Plot retention on the y-axis, time on the x-axis, and you get a steep drop in the first 24 hours followed by a long, shallow tail.

The shape is the important part. Forgetting isn't linear. You don't lose 10% a day for ten days. You lose most of it almost immediately; what survives past about 48 hours is relatively stable, but most of what was in your head an hour after reading is already gone.

Ebbinghaus's original 1885 numbers, summarised:

| Time since learning | Material retained | Material lost |
| --- | --- | --- |
| 20 minutes | ~58% | ~42% |
| 1 hour | ~44% | ~56% |
| 9 hours | ~36% | ~64% |
| 1 day | ~33% | ~67% |
| 2 days | ~28% | ~72% |
| 6 days | ~25% | ~75% |
| 31 days | ~21% | ~79% |

Source: Ebbinghaus, Über das Gedächtnis (On Memory), 1885. Reconstructed from his retention savings data.
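Ebbinghaus also published a formula fitted to this savings data: retention b = 100k / ((log10 t)^c + k), with t in minutes and his constants k ≈ 1.84 and c ≈ 1.25. Here's a minimal sketch of that fit; treat the output as an approximation, since it deviates from the table's individual points by a few percentage points:

```python
import math

def retention(minutes, k=1.84, c=1.25):
    """Ebbinghaus's fitted savings curve: percent retained after `minutes`.

    b = 100k / ((log10 t)^c + k), using his published constants
    k ~ 1.84 and c ~ 1.25. Only meaningful for t > 1 minute.
    """
    return 100 * k / (math.log10(minutes) ** c + k)

# Roughly reproduces the table: ~57% at 20 minutes, ~30% at 1 day, ~21% at 31 days
for label, t in [("20 min", 20), ("1 hour", 60), ("1 day", 1440), ("31 days", 44640)]:
    print(f"{label:>8}: {retention(t):.0f}% retained")
```

Note how slowly the tail falls: going from day 1 to day 31 loses only about nine more points, while the first hour alone loses over half.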

Two things to notice. First, the steepest drop happens in the first hour. By the time you finish a podcast on the way home, you've already lost almost half of what you started with. Second, the curve doesn't go to zero. Some residue stays. But that residue is small, and it's mostly the parts your brain happened to find meaningful, not the parts you decided were important.

This is why "I read a lot" and "I know a lot" are not the same sentence. They're not even on the same axis.


Who Was Ebbinghaus, and Why Does His 1885 Study Still Matter?

Hermann Ebbinghaus was a German psychologist working in Berlin in the 1870s and 1880s. Before him, memory was something philosophers wrote essays about. He turned it into something you could measure with a stopwatch.

His method was a bit mad. He invented a stimulus deliberately designed to have no meaning: the nonsense syllable. Three letters, consonant-vowel-consonant, with no resemblance to any real word. ZOF. BIK. DAX. He generated thousands of them, grouped them into lists of about 13 syllables, and then memorised the lists himself. He was the experimenter and the only subject.

Once he could recite a list perfectly, he'd note the time. Then he'd come back at fixed intervals (20 minutes, an hour, a day, a week) and try to relearn the same list. The clever part was the metric: he didn't measure how much he remembered, he measured how much less time it took to relearn the list compared to the original session. He called this the savings score. If it took half as long to relearn, half the memory was still there in some form. That's the data the forgetting curve is built on.

Three things make this study still load-bearing in 2026:

  1. It was quantitative. First time anyone had put real numbers on memory.
  2. It controlled for meaning. Nonsense syllables strip out the confound of "this thing reminded me of something I already knew." The decay he measured is closer to a memory baseline.
  3. It has been replicated. Murre and Dros at the University of Amsterdam ran a careful replication in 2015. Their curve closely matched Ebbinghaus's 1885 numbers. The shape held.

Modern research has refined the picture. Meaningful prose decays slower than nonsense syllables, because meaning gives the brain extra retrieval hooks. Sleep between learning and recall slows decay. Emotional content sticks longer. But the basic shape, fast loss followed by a shallow tail, is one of the most replicated findings in cognitive psychology.

When people quote "the forgetting curve," they usually mean the Ebbinghaus result. They're right to. It's nearly a century and a half old and it's still good.


How Fast Do We Actually Forget?

Faster than feels reasonable.

Take the most charitable possible reading. Your brain holds on to things you find meaningful. You're reading articles you chose, written by people you respect, on topics you care about. Surely it can't be that bad?

It is.

Here's what the research suggests for normal, prose-based learning, drawn from a synthesis of Ebbinghaus, modern replications, and the meaningful-prose adjustments documented in Dunlosky et al. (2013):

| Time since reading | Retained (passive) | Retained (one active recall) | Retained (spaced review) |
| --- | --- | --- | --- |
| 1 hour | 50% | 75% | 80% |
| 1 day | 33% | 65% | 80% |
| 1 week | 25% | 55% | 78% |
| 1 month | 15% | 40% | 75% |
| 6 months | 10% | 30% | 70% |

These are approximate, drawn from the literature rather than a single study. The point isn't the precise digits. It's the shape. A passive read of an article is a slow leak. A single act of retrieval, done within an hour, plugs most of the leak. Spaced review makes the leak almost stop.

If you read on Sunday and don't think about it again, by next Sunday you can probably articulate one or two ideas, plus a vague feeling that you "got something out of it." That feeling is what the cognitive scientists call fluency. It's the brain mistaking familiarity for understanding. You recognise the topic when someone mentions it. You don't actually have the goods.

I'd call this state Phantom Knowledge. It feels like a thing you know. It behaves like a thing you don't.

I read The Magic of Thinking Big in January 2023. While I was inside it, I felt clear on every chapter. I could have explained the central argument with confidence the day I finished. A month later, ask me what made the book actually work, and I had the title, the vague impression of bigness, and almost nothing else. That's Phantom Knowledge in a single example.


Why Does Highlighting Feel Productive but Doesn't Stop Forgetting?

Because effort and effectiveness are not the same thing.

Highlighting is one of the most popular study habits in the world and one of the worst on a per-minute basis. Dunlosky and colleagues' 2013 meta-analysis, which reviewed ten common study techniques across hundreds of studies, rated highlighting as low utility. It performs no better than plain re-reading. In some studies it performs worse, because it gives you a false sense of having done the work, which means you stop earlier.

Same goes for re-reading. You move your eyes back over a paragraph, the words feel familiar, and your brain reads that familiarity as comprehension. It isn't. It's just the second time you've seen the same sentence. The retrieval system in your head, the part that has to produce the answer when no answer is on the page, never gets exercised.

Compare these four common reading behaviours:

| Behaviour | Forces retrieval? | Effort required | Effect on forgetting curve |
| --- | --- | --- | --- |
| Re-reading | No | Low | Negligible |
| Highlighting | No | Low | Negligible |
| Underlining + summary | Partial | Medium | Small |
| Closing the book and writing what you remember | Yes | High | Large |

The pattern is consistent. Anything that lets your brain coast doesn't change the curve. Anything that makes your brain produce an answer it isn't currently looking at does.

This is the thing that took me the longest to accept, because it cuts against how reading feels. Productive reading is supposed to feel like progress. Active recall feels like failure. You sit there, blank, trying to remember the three points the author made, and you can't. That blank is the work. That blank is what eventually flattens the curve. The smooth, fluent re-read is what doesn't.

The reader's version of this is what I'd call Comprehension Debt. Every time you finish an article and don't pay the small price of a 90-second retrieval, you take on debt. Compounded over a year of reading, you end up with a library you "know" and a head that holds almost none of it.


What Actually Flattens the Curve?

Two interventions, well-evidenced, with effect sizes large enough to actually matter: active recall and spaced repetition. Used together, they are the closest thing learning science has to a closed case.

Active recall

Active recall is the practice of pulling information out of your head, instead of pushing more in. Roediger and Karpicke's 2006 study at Washington University is the canonical demonstration. They had students read a passage and then either re-read it or take a recall test. Five minutes later, the re-readers did slightly better. Two days later, the test group did much better. A week later, the test group recalled around 50% more than the re-readers. The act of retrieval, not the act of restudy, was what stuck.

Dunlosky et al. (2013) gave practice testing one of only two "high utility" ratings in the entire ten-technique review. The other was distributed practice. They were the only two techniques the authors recommended unreservedly across age groups, materials, and learners. Everything else (highlighting, summarising, mnemonic devices) got "low" or "moderate."

Spaced repetition

Spacing is the second half. The principle: review information at intervals long enough that you've started to forget, but not so long that you've actually lost it. The forgetting and the retrieving are the work. Cepeda et al.'s 2006 meta-analysis of 317 experiments confirmed the spacing effect across age, content, and time horizon: distributed practice consistently beats massed practice (cramming) for long-term retention.

The standard spacing schedule looks roughly like this:

| Review number | Time since previous review | Cumulative time from learning |
| --- | --- | --- |
| 1 | 1 day | 1 day |
| 2 | 3 days | 4 days |
| 3 | 1 week | 11 days |
| 4 | 2 weeks | 25 days |
| 5 | 1 month | ~2 months |
| 6 | 3 months | ~5 months |

Each successful retrieval at the right interval re-stabilises the memory and stretches the next interval out. The curve gets shallower every pass. After four or five reviews spread over a couple of months, you're holding most of what you learned at almost no daily cost.

Modern algorithms like FSRS (the Free Spaced Repetition Scheduler, built into Anki since late 2023) automate this scheduling, and its benchmarks suggest it improves on the older SM-2 algorithm's review efficiency by roughly 20-30%. You don't have to track the dates. The software does, based on your own pattern of recalls and lapses.
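For a sense of how that scheduling works, here's a stripped-down sketch of the classic SM-2 interval update (the algorithm FSRS improves on). The function and its defaults are this sketch's own simplification, not Anki's actual code:

```python
def next_interval(interval_days, ease, grade):
    """One SM-2-style scheduling step (simplified sketch, not Anki's exact logic).

    grade: 0-5 self-rating of recall quality; below 3 counts as a lapse.
    Returns the next review interval in days and the updated ease factor.
    """
    if grade < 3:
        return 1, ease                      # lapse: restart the ladder
    # Successful recall: nudge the ease factor, floored at 1.3 per SM-2
    ease = max(1.3, ease + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
    if interval_days <= 1:
        return (1 if interval_days == 0 else 6), ease
    return round(interval_days * ease), ease

# A run of solid recalls (grade 4) stretches the schedule: 1, 6, 15, 38, 95 days
interval, ease = 0, 2.5
for _ in range(5):
    interval, ease = next_interval(interval, ease, grade=4)
    print(interval)
```

The shape matches the schedule table above: each successful recall multiplies the gap, so six reviews cover months while costing minutes.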

The combination beats either alone. Spaced repetition is just retrieval practice on a calendar. Retrieval practice without spacing decays faster. Spacing without retrieval is empty time. Together, they're the thing.


How Do You Apply This to Articles, Not Just Flashcards?

This is where most of the existing advice falls apart for normal readers. Spaced repetition was developed for flashcards. Anki decks are full of language learners memorising vocabulary and medical students memorising biochem. If you read articles and books, the flashcard format is overkill, and the friction kills the habit before it starts.

You don't need flashcards. You need a small, opinionated workflow.

Here's a stripped-down version that maps the science onto how a real reader actually behaves:

Step 1. Pick the keepers. As you read, mark the few ideas worth keeping. Two to five per article. Not paragraphs, ideas. The claim, restated in your own words.

Step 2. Close it and write them down. Within an hour of finishing, close the article and write the keepers from memory in plain prose. Not a summary of the whole thing. Just the claims you decided mattered, plus a sentence each on why. This is the act of retrieval that does most of the work. If you can't reproduce them an hour later, they were never going to make it to next week.

Step 3. Come back the next day. A two-minute revisit. Read your own notes, not the original article. Try to add a line or correct an error. This second retrieval pushes the curve out by days.

Step 4. Come back a week later. Same drill. Skim your notes, test yourself, fix anything you got wrong.

Step 5. Come back a month later. Final reinforcement. After this, the keepers are usually yours for years.

Five touches, total time per article maybe 10 minutes spread over a month, against the alternative of zero touches and 10% retention.
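Steps 2 through 5 reduce to four calendar touches after the read. A minimal sketch of generating the dates; the labels and offsets follow this workflow, not any standard schedule:

```python
from datetime import date, timedelta

# Offsets in days for each touch, following the workflow above
TOUCHES = [("write keepers from memory", 0),
           ("next-day revisit", 1),
           ("one-week review", 7),
           ("one-month review", 30)]

def review_schedule(read_on):
    """Return (label, date) pairs for each review touch after reading on `read_on`."""
    return [(label, read_on + timedelta(days=offset)) for label, offset in TOUCHES]

for label, when in review_schedule(date(2026, 4, 29)):
    print(f"{when}: {label}")
```

Dropping the four dates into a calendar once per article is the entire system; the reminders do the remembering to remember.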

The reason most people don't do this isn't that the workflow is hard. It's that the original article is rarely re-findable a week later. You read it, closed the tab, and the moment to reinforce it has passed. The curve is steep. By the time you remember to come back, there's nothing to come back to.

This is the structural problem Alexandria solves. Alexandria is the comprehension-first reading platform: every article you read or listen to gets saved with the keepers extracted into structured knowledge blocks, the retrieval prompts queued, and the spaced-review schedule running quietly in the background. You don't need to remember to come back. The system reminds you on the day the curve says you'd otherwise forget. It works for articles, podcasts, emails, anything you choose to keep. The science doesn't change. The friction does.

For the head-to-head evidence, active recall vs passive reading covers the Karpicke and Roediger studies in depth. For the four-move method that uses the curve as its core argument, how to learn faster without any of the hacks is the operational version. The science of reading retention piece goes deeper on the mechanics of the techniques themselves, how to remember what you read is the practical step-by-step operating manual, and why you forget everything you read is the diagnosis of what the curve looks like when it runs unchecked.


What Slows the Curve Even Without a System?

A few smaller interventions help, even if you don't run a full review schedule. These are worth knowing because they're cheap.

Sleep. Memory consolidation happens during sleep, particularly slow-wave sleep in the first part of the night. A read followed by a night's sleep retains more than a read followed by an all-nighter. The effect is real and fairly large, around d = 0.5 in modern meta-analyses. Read in the evening, sleep on it.

Spacing within a single session. Even a 10-minute break in the middle of a longer reading session improves retention more than reading the whole thing straight through. Same total time, more breaks, more retention.

Talking about it. The act of explaining something to another person is a high-quality retrieval. You have to reconstruct the idea in your own words, monitor whether they're following, and adjust. You'll discover the gaps in your own understanding within thirty seconds. That's also useful, because gaps you can name are gaps you can fix.

Writing about it. Same mechanism. Free recall, in your own structure, no original to lean on. The blog post or note you write about a book sticks harder than the book itself.

Linking to existing knowledge. New information attached to an existing schema is dramatically harder to forget. When you read something and think "oh, this is just like X," that link is what's keeping it alive. The more you can do that consciously while reading, the more your new material gets stored with multiple retrieval paths.

None of these is a substitute for retrieval and spacing. They're multipliers on top.


Frequently Asked Questions

What is the forgetting curve in simple terms?

The forgetting curve is a graph showing how fast new information leaks out of your head after you learn it. Hermann Ebbinghaus discovered in 1885 that, without any review, you lose roughly 50% of new information within an hour and around 70% within 24 hours. After that the loss slows, but most of what stays is what you actively used.

Who discovered the forgetting curve and when?

Hermann Ebbinghaus, a German psychologist, published the forgetting curve in his 1885 book Über das Gedächtnis (On Memory). He ran the experiments on himself, memorising lists of nonsense syllables and testing his recall at fixed intervals. It was the first quantitative study of human memory and the curve has held up under modern replication.

Is the Ebbinghaus forgetting curve still accurate today?

Broadly yes. Murre and Dros replicated Ebbinghaus's original study in 2015 and produced a curve that closely matched his 1885 numbers. The exact percentages shift with the type of material and how meaningful it is, but the shape (rapid early loss, then a long shallow tail) is one of the most stable findings in memory research.

How fast do you forget what you read in an article?

Without any review or retrieval, you can expect to lose around 50% of a long article within an hour and around 70% by the next day. Meaningful, well-structured prose decays slower than Ebbinghaus's nonsense syllables, but not by much if you never come back to it. The default state of reading is forgetting.

Does highlighting flatten the forgetting curve?

No. Dunlosky and colleagues' 2013 meta-analysis of common study techniques rated highlighting as low utility, performing no better than plain re-reading. Highlighting feels productive because it demands attention, but it doesn't force retrieval, so it doesn't change the slope of the curve.

What actually flattens the forgetting curve?

Two things, mostly: active recall (testing yourself instead of re-reading) and spaced repetition (reviewing just before you'd otherwise forget). Each successful retrieval re-stabilises the memory and pushes the next forgetting further out. Combine the two and the curve gets shallower with every review.

How can I apply the forgetting curve to articles, not just flashcards?

Pick the few ideas worth keeping, write them out from memory within an hour of reading, then review them again the next day, again a week later, and again a month later. You don't need flashcards. You need a small list of claims you've decided are worth defending, plus a habit of returning to them on a schedule.

How long does it take to commit something to long-term memory?

There's no fixed time, but a useful rule from spaced repetition research is roughly four successful reviews spread over four to six weeks. The first review within a day, the next within three days, the next within a week, the last within a month. After that the interval stretches to months and years.


Related reading: Active Recall vs Passive Reading | How to Learn Faster | Why You Forget Everything You Read | How to Remember What You Read | The Science of Reading Retention