Methodology
Most news is noise. We measure which stories actually mattered - not with opinion, but with data.
How We Measure Durability
Every news story generates attention. We track that attention over time using Wikipedia pageview data - one of the most reliable, freely available proxies for public interest.
For each story, we establish a pre-event baseline - the normal level of interest before the story broke. Then we measure how long interest stays elevated above that baseline.
The Core Metric: Elevation
Our primary signal is baseline elevation: the ratio of recent pageviews to the pre-event baseline. An event that permanently elevated interest in a topic has a high elevation ratio. One that spiked and returned to baseline has an elevation near 1x - recent traffic roughly equal to what it was before the story broke.
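The elevation calculation can be sketched in a few lines. This is a minimal illustration, not our production code: the window sizes and function name are assumptions, chosen to make the idea concrete.

```python
from statistics import mean

def elevation_ratio(monthly_views, event_index,
                    baseline_months=6, recent_months=3):
    """Ratio of recent average pageviews to the pre-event baseline.

    `monthly_views` is a chronological list of monthly pageview counts;
    `event_index` is the month the story broke. The window sizes here
    are illustrative defaults, not the site's actual parameters.
    """
    # Baseline: average traffic in the months before the event.
    baseline = mean(monthly_views[max(0, event_index - baseline_months):event_index])
    # Recent: average traffic in the most recent months.
    recent = mean(monthly_views[-recent_months:])
    return recent / baseline

# A topic averaging ~10k views/month before the event, now at ~48k:
views = [10_000, 11_000, 9_000, 10_000, 120_000, 60_000, 45_000, 40_000]
print(round(elevation_ratio(views, event_index=4), 1))  # → 4.8
```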
- Noise: Elevation below 1.5x. The story consumed enormous attention and produced nothing lasting. Wikipedia traffic returned to pre-event levels.
- Still Standing: Elevation above 3x, sustained for 6+ months. The story changed something real - policy, markets, institutions, lives.
Why Wikipedia?
Wikipedia pageviews are a uniquely good proxy for sustained public interest:
- Free and open - no API keys, no paywalls
- Covers virtually every notable news event
- Reflects genuine information-seeking behavior, not algorithmic amplification
- Available as monthly aggregates going back years
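Those properties are easiest to see against the Wikimedia REST Pageviews API itself. The sketch below builds the per-article monthly endpoint URL; the helper name is ours, and the exact timestamp format accepted by the API (YYYYMMDD vs. YYYYMMDDHH) should be checked against the official docs before relying on it.

```python
from urllib.parse import quote

PAGEVIEWS_API = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def monthly_views_url(article, start, end, project="en.wikipedia"):
    """URL for monthly per-article pageviews (no API key required).

    `start`/`end` are date strings like "20240101". Article titles
    use underscores for spaces, per Wikipedia convention, and must
    be URL-encoded.
    """
    title = quote(article.replace(" ", "_"), safe="")
    return (f"{PAGEVIEWS_API}/{project}/all-access/user/"
            f"{title}/monthly/{start}/{end}")

url = monthly_views_url("Hurricane Katrina", "20240101", "20241231")
# GET this URL and read the JSON "items" list: one entry per month,
# each with a "views" field.
```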
Limitations
Wikipedia traffic measures attention, not importance. Some genuinely important stories fly under the radar. Some tragic events get sustained traffic from morbid curiosity rather than lasting significance. We acknowledge these limitations and let the data speak for itself.
We are a curation layer, not a news source. Every story links back to original reporting and its Wikipedia article.
Automation
Our pipeline runs daily: sourcing new stories from Wikipedia's Current Events portal, measuring signals via the Wikimedia Pageviews API, classifying events algorithmically, and generating branded charts. No human edits the data. The curve doesn't have an agenda.
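The daily pass can be summarized in a single loop. Everything here - the `Story` fields, the thresholds' application, the omission of the sustained-duration check - is a simplified sketch, not our actual pipeline code.

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    title: str
    monthly_views: list   # chronological monthly pageview counts
    event_index: int      # month the story broke
    elevation: float = 0.0
    category: str = field(default="")

def run_pipeline(stories):
    """One classification pass over sourced stories.

    Uses the 1.5x / 3x thresholds from the text; the 6-month
    sustained-duration requirement is omitted here for brevity.
    No human edits the result.
    """
    for s in stories:
        baseline = sum(s.monthly_views[:s.event_index]) / s.event_index
        s.elevation = s.monthly_views[-1] / baseline
        if s.elevation >= 3.0:
            s.category = "Still Standing"
        elif s.elevation < 1.5:
            s.category = "Noise"
        else:
            s.category = "Unclassified"
    return stories
```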