an
Unnecessary Demonstration
for
RUMOURS!
by aniruddh pramod, august 2025 • originally by nicky case
let's play!


As an internet baby, I am very familiar with and fond of Nicky Case's little shtuffs. The adorable art style and the fun stories she tells through them have been inspiring, to say the least. So when I took part in a summer school on graph theory, I couldn't resist the opportunity to come back to this one and show off some of the cool shtuff that I had learned as well.

If you aren't familiar, pleease go check out the original at ncase.me/crowds and her other works at ncase.me

so what did I learn?

As a quick primer: the original explainer talked about how the density of a network's connections controls a rumour's ability to spread through it. If groups are connected too sparsely, rumours just don't spread. But if a group's connections are too dense, it becomes resistant to the spread of new ideas, which is called groupthink. The class of ideally balanced networks is called Small World Networks!

In this explainer, I will focus more on how fast a rumour spreads. We will look at the push model for rumour spreading, first passage percolation, the relationship between speed and density, and finally a case of Competing Rumours.
So, how is this... ...going to work?
Let's quickly do a short tutorial! Each connection represents a friendship between two people: draw to connect, scratch to disconnect. When you're done doodling and playing around, let's continue.
Naturally, your friendships influence you. Information spreads through your friend group and that information informs you about the world you live in. For example, people look to their peers to find out what % of their friends (not counting themselves) are, say, binge-drinkers. Draw/erase connections, and see what happens!
cool, got it PUZZLE TIME!
Fool everyone into thinking the majority of their friends (50% threshold) are binge-drinkers (even though binge-drinkers are outnumbered 2-to-1!)
FOOLED: out of 9 people

Congrats! You manipulated a group of students into believing in the prevalence of an incredibly unhealthy social norm! Good going!

...uh. thanks?

What you just created is called The Majority Illusion, which also explains why people think their political views are the consensus, or why extremism seems more common than it actually is. Madness.

Networks are a nice way to model how information might spread through (and affect) a society. You could use them to study how resilient it is to fake news, or maybe how devastating a virus might be, and how quickly it might be able to reach everyone! Network scientists typically call this model a... “Contagion!”
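If you like to poke at these things in code, here is a minimal sketch of the Majority Illusion on a made-up six-person graph (the names and friendships are mine, not the puzzle's): two well-connected binge-drinkers are enough to give every other student a drinking "majority" among their friends.

```python
# A made-up six-person friend graph: two well-connected binge-drinkers
# (Bo and Cal) versus four non-drinkers.
friends = {
    "Ana": ["Bo", "Cal", "Di"],
    "Bo":  ["Ana", "Cal", "Eve"],
    "Cal": ["Ana", "Bo", "Di", "Eve", "Fay"],
    "Di":  ["Ana", "Cal"],
    "Eve": ["Bo", "Cal"],
    "Fay": ["Cal"],
}
drinkers = {"Bo", "Cal"}

def sees_drinking_majority(person):
    """Does this person see >= 50% binge-drinkers among their friends?"""
    fs = friends[person]
    return sum(f in drinkers for f in fs) / len(fs) >= 0.5

fooled = [p for p in friends if p not in drinkers and sees_drinking_majority(p)]
print(f"fooled: {len(fooled)} out of {len(friends) - len(drinkers)} non-drinkers")
# -> fooled: 4 out of 4 non-drinkers, even though drinkers are outnumbered 2-to-1
```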
Let's put aside the "threshold" thing for now. Below: we have a person with a rumour. And every day, everyone with this rumour will spread it, like a virus, to their friends. People who don't have the tea yet have no gossip to share, so we can ignore them.
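Before you hit start, here is roughly what the simulation below does, as a minimal Python sketch (the toy graph is made up for illustration; the real sim runs on whatever you doodle):

```python
# Every "day", everyone who has the rumour tells all of their friends.
friends = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
infected = {0}  # one person starts with the rumour

day = 0
while len(infected) < len(friends):
    # all infected people push the rumour to all of their friends at once
    infected |= {f for person in infected for f in friends[person]}
    day += 1
    print(f"day {day}: {len(infected)} people have heard the rumour")
```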
Start the simulation!
(p.s: you can't draw while the sim's running)
Note: despite the negative name, "rumours" can be good or bad (or neutral or ambiguous). There's strong statistical evidence that smoking, health, happiness, voting patterns, and cooperation levels are all "contagious" -- and even some evidence that suicides and mass shootings are, too. well that's depressing
Indeed it is. Anyway, PUZZLE TIME!
Draw a network & run the simulation, so that everyone gets infected with the "rumour".
(new rule: you can't cut the thick connections)
fan-flipping-tastic
This model is a little unrealistic, yes; after all, most ideas don't spread like viruses. For many beliefs and behaviors, you need to be "exposed" to the contagion more than just once.

To make this model more realistic about the way the contagion spreads, we can use a threshold system, like in the drinking example; this is called a complex contagion. However, I want to focus on the time it takes for the infection to spread, and the math for that is much more elegant if we use what's called the push model.
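For the curious, a threshold rule like that is only a few lines of code. The sketch below assumes a single shared 50% threshold and a made-up graph; it is an illustration of the idea, not the code behind these puzzles.

```python
# Complex contagion: adopt the idea only once at least THRESHOLD of your
# friends already have it. Graph and seeds are made up for illustration.
THRESHOLD = 0.5

friends = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4, 5], 4: [3, 5], 5: [3, 4]}
convinced = {0, 1}   # complex contagions usually need more than one seed

changed = True
while changed:
    changed = False
    for person, fs in friends.items():
        if person not in convinced and sum(f in convinced for f in fs) / len(fs) >= THRESHOLD:
            convinced.add(person)
            changed = True

print(f"{len(convinced)} out of {len(friends)} people were convinced")
```

But as I said, our focus here is the push model.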
So what's that like?
Let's now change the example we saw to a push model with Poisson clocks. Pay attention to the differences in how the infection spreads this time. Not much changes, except that now, instead of all possible edges spreading the infection, one of the potential edges is picked at random and gets to transmit the rumour at each time step.

To be precise, we are actually giving each node a Poisson clock, and when the clock ticks, the node is allowed to pass on the infection (if it can). But that'd look quite awkward in a simulation. So instead, one tick of the simulation will automatically activate the edge with the minimum clock time, and we will add an amount sampled from the appropriate distribution to the counter in the top right.
sounds about right
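In code, the clock trick looks something like this. The sketch below is my own, on a made-up graph with rate-1 clocks; it simulates the underlying clocks directly (wasted pushes and all), rather than the streamlined edge-activation shown on screen, and it leans on the fact that the minimum of k independent Exp(1) clocks is itself Exp(k), so we can jump straight from one ring to the next.

```python
# Push model with Poisson clocks (continuous time), on a made-up graph.
# Each infected node carries a rate-1 clock; the minimum of k independent
# Exp(1) clocks is Exp(k), so we can jump straight from one ring to the next.
import random

friends = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}
infected = {0}
clock = 0.0

while len(infected) < len(friends):
    clock += random.expovariate(len(infected))   # time until the next ring
    pusher = random.choice(sorted(infected))     # the ringing clock's owner
    target = random.choice(friends[pusher])      # push to one random friend
    infected.add(target)                         # (no effect if they already know)

print(f"everyone heard the rumour after time ~ {clock:.2f}")
```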
← Too few connections, and an idea can't spread. Too many connections, and you get groupthink.
That was the main learning of the original. The network below is an example of a Small World Network, where ideas spread easily, but groups are not locked in insular echo chambers either.
Try adding more connections within the centre group
to stop the flow of ideas to it.
But what can we instead say about the time it might take for the entire group to get infected? Provided, of course, that they can be infected at all. And how is that time affected by the connections within a group? ...let's see, shall we?

Establishing a Reference

Here is a fully connected graph with 9 nodes, with exactly one infected node. We're going to evolve the graph as per the push model. We know that in 8 moves, every node will have been infected. But how much time does it take, on average, to infect the entire graph? Turns out, the amount of time needed is on the order of log n.

But what if the graph... ...isn't fully connected? Well, in certain cases where the graph is nice, we can still use clever mathematical tricks, or just straight-up work it out to get such a value. Unfortunately, as you might have guessed, most real-life graphs really aren't like star graphs or ring graphs or anything else that is easy to work with. To simulate a real graph, you'd probably want to randomly add edges in a huge population. And how could we ever hope to make a useful deterministic statement about a random graph? Well, we use a little something called... ...conductance!

As you may have already guessed, the number of connections between groups is called bridging social capital. This is important, because it helps groups break out of their insular echo chambers!
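If you'd rather not take the log n claim on faith, here is a rough numerical check of my own (not part of the original piece) for the complete graph K_n under the push model with rate-1 Poisson clocks. With these modelling choices the constant works out to roughly 2, as sketched in the footnote on the log n claim below.

```python
# Rough numerical check: on the complete graph K_n, the push model with
# rate-1 Poisson clocks finishes in time of order log n (about 2 ln n here).
import math
import random

def spread_time_complete(n):
    """Time for a rumour to reach all n nodes of K_n under the push model."""
    infected, t = 1, 0.0
    while infected < n:
        t += random.expovariate(infected)            # wait for the next ring
        if random.randrange(n - 1) < n - infected:   # push landed on someone new?
            infected += 1
    return t

for n in (10, 100, 1000):
    avg = sum(spread_time_complete(n) for _ in range(200)) / 200
    print(f"n={n:4d}: average spread time {avg:5.2f}   (2 ln n = {2 * math.log(n):5.2f})")
```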
Build a bridge to "infect" everyone with complex wisdom:
Like bonding, there's a sweet spot for bridging, too. (extra challenge: try drawing a bridge so thick that the complex contagion can't pass through it!)

Now that we know how to "design" connections within and between groups, let's... ...do BOTH at the same time!

FINAL PUZZLE!
Draw connections within groups (bonding) and between groups (bridging) to spread wisdom to the whole crowd:
Congrats, you've just drawn a very special kind of network! Networks with the right mix of bonding and bridging are profoundly important, and they're called... “Small World Networks”
"Unity without uniformity". "Diversity without division". "E Pluribus Unum: out of many, one".
No matter how it's phrased, people across times and cultures often arrive at the same piece of wisdom: a healthy society needs a sweet spot of bonds within groups and bridges between groups. That is:
Not this...
(because ideas can't spread)
nor this...
(because you'll get groupthink)
...but THIS:

Network scientists now have a mathematical definition for this ancient wisdom: the small world network. This optimal mix of bonding+bridging describes how our neurons are connected, fosters collective creativity and problem-solving, and even once helped US President John F. Kennedy (barely) avoid nuclear war! So, yeah, small worlds are a big deal.

ok, let's wrap this up...
(pst... wanna know a secret?)
Contagion: simple / complex · The Contagion's Color · Select a tool: Draw Network, Add Person, Add "Infected", Drag Person, Delete Person, CLEAR IT ALL
(...or, use keyboard shortcuts!) [1]: Add Person     [2]: Add "Infected"
[Space]: Drag     [Backspace]: Delete
IN CONCLUSION: it's all about...
Contagions & Connections
Contagions: Like how neurons pass signals in a brain, people pass beliefs & behaviors in a society. Not only do we influence our friends, we also influence our friends' friends, and even our friends' friends' friends! (“be the change you wanna see in the world” etc etc) But, like neurons, it's not just signals that matter, it's also...
Connections: Too few connections and complex ideas can't spread. Too many connections and complex ideas get crushed by groupthink. The trick is to build a small world network, the optimal mix of bonding and bridging: e pluribus unum.
(wanna make your own simulations? check out Sandbox Mode, by clicking the (★) button below!)
So, what about our question from the very beginning? Why do some crowds turn to...
...wisdom and/or madness?
From Newton to NASA to network science, we've covered a lot here today. Long story short, the madness of crowds is not necessarily due to the individual people, but due to how we're trapped in a network's sticky web. That does NOT mean abandoning personal responsibility, for we're also the weavers of that web. So, improve your contagions: be skeptical of ideas that flatter you, spend time understanding complex ideas. And, improve your connections: bond with similar folk, but also build bridges across cultural/political divides. We can weave a wise web. Sure, it's harder than doodling lines on a screen... ...but so, so worth it.
“The great triumphs and tragedies of history are caused, not by people being fundamentally good or fundamentally bad, but by people being fundamentally people.”
~ Neil Gaiman & Terry Pratchett
<3
created by
ANIRUDDH PRAMOD
back to my blog · go see the inspiration




♫ music is "Friends 2018" and "Friends 2068" by Komiku
</> This project is fully open source

Fan-made translations: What the, no fan-made translations exist yet?! (add your own!) (original in English)

A quick response to James Surowiecki's The Wisdom of Crowds

First off, I'm not dissing this book. It's a good book, and Surowiecki was trying to tackle the same question I am: “why do some crowds turn to madness, or wisdom?”

Surowiecki's answer: crowds make good decisions when everybody is as independent as possible. He gives the story of a county fair, where the townsfolk were invited to guess the weight of an ox. Surprisingly, the average of all their guesses was better than any one guess. But, here's the rub: the people have to guess independently of each other. Otherwise, they'd be influenced by earlier incorrect guesses, and the average answer would be highly skewed.

But... I don't think "make everyone as independent as possible" is the full answer. Even geniuses, who we mischaracterize as the most independent thinkers, are deeply influenced by others. As Sir Isaac Newton said, “If I have seen further, it is by standing on the shoulders of Giants.”

So, which idea is correct? Does wisdom come from thinking for yourself, or thinking with others? The answer is: "yes".

So that's what I'll try to explain in this explorable explanation: how to get that sweet spot between independence and interdependence — that is, how to get a wise crowd.

What other kinds of connections are there?

For the sake of simplicity, my simulations pretend that people can only be connected through friendships, and that all friendships are equal. But network scientists do consider other ways we can be connected, such as:

Directional connections. Alice is the boss of Bob, but Bob is not the boss of Alice. Carol is the parent of Dave, but Dave is not the parent of Carol. "Boss" & "parent" are directional relationships: the relationship only goes one way. In contrast, "friends" is a bidirectional relationship: the relationship goes both ways. (well, hopefully)

Weighted connections. Elinor and Frankie are mere acquaintances. George and Harry are Best Friends Forever. Even though there's a "friendship" connection in both cases, the second one is stronger. We say that these two connections have different "weights".
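If you were coding these up, here is one simple (and entirely made-up) way to represent such connections; the little helper at the end shows how weights could feed into a threshold-style "exposure" calculation.

```python
# Two made-up ways to store richer connection types.

# Directional connections: the relationship only goes one way.
boss_of = {"Alice": ["Bob"]}       # Alice -> Bob, but not Bob -> Alice
parent_of = {"Carol": ["Dave"]}

# Weighted connections: attach a strength to each friendship.
friendship_weight = {
    ("Elinor", "Frankie"): 0.2,    # mere acquaintances
    ("George", "Harry"):   1.0,    # Best Friends Forever
}

def weighted_exposure(person, infected, weights):
    """Fraction of a person's total friendship weight held by infected friends."""
    total = hit = 0.0
    for (a, b), w in weights.items():
        if person in (a, b):
            other = b if person == a else a
            total += w
            hit += w if other in infected else 0.0
    return hit / total if total else 0.0

print(weighted_exposure("George", {"Harry"}, friendship_weight))   # -> 1.0
```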

Just remember: all these simulations are wrong. The same way any map is "wrong". You see the map on the left? Buildings aren't gray featureless blocks! Words don't float above the city! However, maps are useful not despite being simplified, but because they're simplified. Same goes for simulations, or any scientific theory. Of course they're "wrong" — that's what makes them useful.

What other kinds of contagions are there?

There are so, so many ways that network scientists can simulate "contagions"! I picked the simplest one, for educational purposes. But here are other ways you could do it:

Contagions with Randomness. Being "exposed" to a contagion doesn't guarantee you'll be infected, it only makes it more likely.
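A minimal sketch of what that could look like, with an assumed 30% chance per exposure per day and a made-up graph:

```python
# Each exposure only "sticks" with probability P_INFECT (made-up value/graph).
import random

P_INFECT = 0.3
friends = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
infected = {0}

for day in range(1, 31):
    newly = {f for person in infected for f in friends[person]
             if f not in infected and random.random() < P_INFECT}
    infected |= newly
    if len(infected) == len(friends):
        print(f"everyone was infected by day {day}")
        break
else:
    print(f"after 30 days, only {len(infected)} of {len(friends)} were infected")
```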

People have different contagion thresholds. My simulations pretend that everyone has the same threshold for binge-drinking (50%) or volunteering (25%) or misinformation (0%). Of course, that's not true in real life, and you could make your sim reflect that.

An ecology of contagions. What if there were multiple contagions, with different thresholds? For example, a simple "madness" contagion and a complex "wisdom" contagion. If someone's infected with madness, can they still be infected with wisdom? Or vice versa? Can someone be infected with both?

Contagions that mutate and evolve. Ideas don't pass perfectly from one person to another the way a virus does. Like a game of Telephone, the message gets mutated with each re-telling — and sometimes the mutant will be more infectious than the original! So, over time, ideas "evolve" to be more catchy, copy-able, contagious.

I wanna learn more! What else can I read and/or play?

This explorable explanation was just a springboard for your curiosity, so you can dive deeper into a vast pool of knowledge! Here's more stuff on networks or social systems:

Book: Connected by Nicholas Christakis and James Fowler (2009). An accessible tour of how our networks affect our lives, for good or ill. Here's an excerpt: Preface & Chapter 1

Interactive: The Evolution of Trust by Nicky Case (me) (2017). A game about the game theory of how cooperation is built... or destroyed.

Interactive: Parable of the Polygons by Vi Hart and Nicky Case (also me) (2014). A story about how harmless choices can create a harmful world.

Or, if you just want to see a whole gallery of interactive edu-things, here's Explorable Explanations, a hub for learning through play!

“virtually all [college] students reported that their friends drank more than they did.”

“Biases in the perception of drinking norms among college students” by Baer et al (1991)

“The Majority Illusion”

“The Majority Illusion in Social Networks” by Lerman et al (2016).
Related: The Friendship Paradox.

“strong statistical evidence that smoking, health, happiness, voting patterns, and cooperation levels are all contagious”

From Nicholas Christakis and James Fowler's wonderfully-written, layperson-accessible book, Connected (2009).

"like a virus, to their friends"

I am oversimplifying a model here to make it easier to imagine (and visualize). In actuality, we use a Poisson model. Every node has, associated with it, a Poisson clock that will tick after Exp(1) time. When this clock ticks, the node picks a neighbour and spreads the rumour to them. If this neighbour already has the rumour, nothing happens; but if they don't, they acquire a Poisson clock of their own and will spread the rumour too. In practice, though, we care little about the times in between. We can pretend that by a 'day' we just mean the time that passes before another Poisson clock rings.
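A quick numerical sanity check of that last point, assuming rate-1 clocks: the first of k independent Exp(1) clocks to ring is itself distributed as Exp(k), so a whole "day" can be sampled in one shot.

```python
# Sanity check: the first of k independent Exp(1) clocks to ring is Exp(k).
import random

k, trials = 5, 100_000
min_of_k = sum(min(random.expovariate(1) for _ in range(k)) for _ in range(trials)) / trials
one_shot = sum(random.expovariate(k) for _ in range(trials)) / trials
print(f"mean of min of {k} Exp(1) clocks: {min_of_k:.3f}   mean of one Exp({k}): {one_shot:.3f}")
# both come out close to 1/k = 0.2
```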

“some evidence that suicides are [contagious], too”

“Suicide Contagion and the Reporting of Suicide: Recommendations from a National Workshop” by O'Carroll et al (1994), endorsed by the frickin' Centers for Disease Control & Prevention (CDC).

“some evidence that mass shootings are [contagious], too”

“Contagion in Mass Killings and School Shootings” by Towers et al (2015).

Also see: the Don't Name Them campaign, which urges that news outlets DO NOT air mass murderers' names, manifestos, and social media feeds. This spreads the contagion. Instead, news outlets should focus on the victims, first responders, civilian heroes, and the grieving, healing community.

“the amount of time needed is on the order of log n”

This is not just an empirical result. You can do the math and prove that this is true for a fully connected graph. However, for nearly every other kind of graph, the result is going to be almost impossible to prove exactly, and most realistic graphs fall into that category.
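For what it's worth, here is one back-of-the-envelope version of that math for the complete graph K_n under the continuous-time push model (my own sketch, not from the original):

```latex
% With k informed nodes in K_n, clocks ring at total rate k, and each ring
% informs a new node with probability (n-k)/(n-1), so new infections arrive
% at rate k(n-k)/(n-1). Summing the expected waiting times between them:
\[
  \mathbb{E}[T_n]
  \;=\; \sum_{k=1}^{n-1} \frac{n-1}{k\,(n-k)}
  \;=\; \frac{n-1}{n} \sum_{k=1}^{n-1} \left( \frac{1}{k} + \frac{1}{n-k} \right)
  \;=\; \frac{2(n-1)}{n}\, H_{n-1}
  \;\approx\; 2 \ln n .
\]
```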

“The world's financial institutions fell for such a cascade in 2008.”

“Lemmings of Wall Street” by Cass Sunstein is a quick, non-technical read. Published in Oct 2008, right in the wake of the crash.

“the complex contagion theory.”

“Threshold Models of Collective Behavior” by Granovetter (1978) was the first time, as far as I know, anyone described a "complex contagion" model. (although he didn't use that specific name)

“Complex Contagions and the Weakness of Long Ties” by Centola & Macy (2007) coined the phrase "complex contagion", and showed the important differences between that and "simple contagion".

“Evidence for complex contagion models of social contagion from observational data” by Sprague & House (2017) empirically showed that complex contagions do, in fact, exist. (at least, in the social media data they looked at)

Finally, “Universal behavior in a generalized model of contagion” by Dodds & Watts (2004) proposes a model that unifies all kinds of contagions: simple and complex, biological and social!

“the possum has 13 nipples”

arranged in a ring of 12 nipples, plus one in the middle

“groupthink”

This Orwell-inspired phrase was coined by Irving L. Janis in 1971. In his original article, Janis investigates cases of groupthink, lists its causes, and — thankfully — some possible remedies.

“bonding and bridging social capital”

These two types of social capital — "bonding" and "bridging" — were named by Robert Putnam in his insightful 2000 book, Bowling Alone. His discovery: across almost all empirical measures of social connectedness, Americans are more alone than ever. Golly.

“bridging social capital has a sweet spot”

“The Strength of Weak Ties” by Granovetter (1973) showed that connections across groups help spread simple contagions (like information), but “Complex Contagions and the Weakness of Long Ties” by Centola & Macy (2007) showed that connections across groups may not help complex contagions, and in fact can hurt their spread!

“the small world network”

The idea of the "small world" was popularized by Travers & Milgram's 1969 experiment, which showed that, on average, any two random people in the United States were just six friendships apart — "six degrees of separation"!

The small-world network got more mathematical meat on its bones with “Collective dynamics of small-world networks” by Watts & Strogatz (1998), which proposed an algorithm for creating networks with both low average path length (low degree of separation) and high clustering (friends have lots of mutual friends) — that is, a network that hits the sweet spot!
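If you'd like to generate such a network yourself, here is a minimal sketch of the Watts & Strogatz construction (parameter names are mine): start from a ring lattice, then rewire each edge to a random endpoint with probability p.

```python
# Watts-Strogatz, roughly: a ring lattice where each edge is rewired to a
# random endpoint with probability p. (Parameter names are made up.)
import random

def watts_strogatz(n, k, p):
    """n nodes on a ring, each linked to its k nearest neighbours per side."""
    ring = {tuple(sorted((i, (i + j) % n))) for i in range(n) for j in range(1, k + 1)}
    edges = set()
    for a, b in ring:
        if random.random() < p:                      # rewire this edge
            c = random.randrange(n)
            while c == a or tuple(sorted((a, c))) in edges:
                c = random.randrange(n)
            edges.add(tuple(sorted((a, c))))
        else:                                        # keep it as-is
            edges.add((a, b))
    return edges

network = watts_strogatz(20, 2, 0.1)
print(len(network), "edges among 20 people")
```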

You can also play with the visual, interactive adaptation of that paper by Bret Victor (2011).

“[small world networks] describe how our neurons are connected”

“Small-world brain networks” by Bassett & Bullmore (2006).

“[small world networks] give rise to collective creativity”

“Collaboration and Creativity: The Small World Problem” by Uzzi & Spiro (2005). This paper analyzed the social network of the Broadway scene over time, and discovered that, yup, the network's most creative when it's a "small world" network!

“[small world networks] give rise to collective problem-solving”

See “Social Physics” by MIT Professor Alex "Sandy" Pentland (2014) for a data-based approach to collective intelligence.

“[small world networks] helped John F. Kennedy (barely) avoid nuclear war!”

Besides the NASA Challenger explosion, the most notorious example of groupthink was the Bay of Pigs fiasco. In 1961, US President John F. Kennedy and his team of advisors thought — for some reason — it would be a good idea to secretly invade Cuba and overthrow Fidel Castro. They failed. Actually, worse than failed: it led to the Cuban Missile Crisis of 1962, the closest the world had ever been to full-scale nuclear war.

Yup, JFK really screwed up on that one.

But, having learnt some hard lessons from the Bay of Pigs fiasco, JFK re-organized his team to avoid groupthink. Among many things, he: 1) actively encouraged people to voice criticism, thus lowering the "contagion threshold" for alternate ideas. And 2) he broke his team up into sub-groups before reconvening, which gave their group a "small world network"-like design! Together, this arrangement allowed for a healthy diversity of opinion, but without being too fractured — a wisdom of crowds.

And so, with the same individuals who decided the Bay of Pigs, but re-arranged collectively to decide on the Cuban Missile Crisis... JFK's team was able to reach a peaceful agreement with Soviet leader Nikita Khrushchev. The Soviets would remove their missiles from Cuba, and in return, the US would promise not to invade Cuba again. (and also agreed, in secret, to remove the US missiles from Turkey)

And that's the story of how all of humanity almost died. But a small world network saved the day! Sort of.

You can read more about this on Harvard Business Review, or from the original article on groupthink.

“we influence [...] our friends' friends' friends!”

Again, from Nicholas Christakis and James Fowler's wonderful book, Connected (2009).

“be skeptical of ideas that flatter you”

yes, including the ideas in this explorable explanation.

★ Sandbox Mode ★

The keyboard shortcuts (1, 2, space, backspace) work in all the puzzles, not just Sandbox Mode! Seriously, you can go back to a different chapter, and edit the simulation right there. In fact, that's how I created all these puzzles. Have fun!