digital wellbeing labs

we tune technology to create harmony in your life

Filters vs. Serendipity


Whatever you’re looking for on the Internet—entertainment, a product to purchase, a connection to a community—in most cases you’re likely to receive an overwhelming number of results to choose from. These relevant search results are valuable to you… or are they?


More and more commentators are wondering if the tools we create to give us more choices—such as search engines—are delivering less variety, ultimately limiting chance discoveries and exposure to new ideas.

On the BBC’s The Culture Show, Aleks Krotoski recently examined the role of serendipity as an online commodity, questioning whether the Internet is as innovative as we think. She points out that computers have the unique ability to make valuable, unseen connections for us. Instead of maximizing that potential, our search filters keep us focused on only the most relevant information.

Aleks explains, “We will never have the opportunity to bump into something truly new, because the machines are predicting our futures based on our past preferences, creating an infinite loop of cultural homogenization.”

The concern over the consequences of homogenized choice is not entirely new. David Byrne noted in his book Bicycle Diaries that in many urban developments gentrification leads to separation, rather than integration, of different social and cultural groups. This separation leads to fewer collisions between ideas and the stifling of creativity.

Byrne writes, “I think online communities tend to group like with like, which is fine for some tasks, but sometimes inspiration comes from accidental meetings and encounters with people outside one’s own demographic, and is less likely if you only communicate with your ‘friends’…”

Other commentators also question whether recommendations based on a combination of one’s preferences, social profile, and history of consumption really offer new opportunities. In an article for Design Week, Steve Price discussed how the role of media retailers is changing in the age of the “Filter Bubble.”

“Google, as amazing as it is, can only answer the questions you ask it,” he states. “It cannot tell you which questions you should be asking. Search results and news feeds are all now influenced by engines that take as a point of entry all that they know about you and spit back the information they think you’ll want. What is on the screen when you open Spotify? Recommendations on new music based on its knowledge of you. What happens if you visit Rough Trade Records? You often leave with albums and music from artists you’ve never heard of, having heard it played in the store, or from talking to one of the employees who clearly live and breathe music.”

Concerns aside, the tech community seems to be moving in the direction of “smarter” recommendation engines—one example is The Filter, founded by Peter Gabriel. These developments suggest we might soon see recommendations for vacuum cleaners based on one’s music tastes. Another example is HyperActive Bob, a robotic system developed to anticipate customer behavior in fast-food restaurants, including correlating a customer’s type of car with what he or she might order—though this particular filter has failed to prove successful so far.

When the self-referential nature of media increases the speed of recycling ideas in film, design, music, fashion and global culture as a whole, what will it take to receive truly original recommendations? What can we design into user experiences that will allow for the unexpected?

Imagine the possibilities of using “dumber” algorithms that allow us to be pleasantly surprised by serendipity wherever we are… and whenever we least expect it.
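One way to picture such a “dumber” algorithm is an ordinary relevance filter with a deliberate exploration step bolted on: most of the time it returns the closest match to the user’s past preferences, but occasionally it ignores the profile entirely and surfaces a random item. A minimal sketch—the catalog, tags, and parameter names here are all invented for illustration:

```python
import random

# Hypothetical catalog: item -> descriptive tags.
CATALOG = {
    "ambient-album": {"ambient", "electronic"},
    "punk-single": {"punk", "rock"},
    "jazz-ep": {"jazz"},
    "folk-album": {"folk", "acoustic"},
}

def recommend(user_tags, serendipity=0.25, rng=random):
    """Return one item: usually the best match for the user's
    past preferences, occasionally a purely random pick."""
    if rng.random() < serendipity:
        # The "dumber" step: disregard the profile entirely.
        return rng.choice(list(CATALOG))
    # Otherwise behave like a normal relevance filter:
    # the item whose tags overlap the profile the most.
    return max(CATALOG, key=lambda item: len(CATALOG[item] & user_tags))
```

With `serendipity=0` this is exactly the loop Krotoski describes—the user only ever sees the nearest echo of their own history; raising it reintroduces chance encounters at the cost of some relevance.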


Alexander Grünsteidl & Nikki Roddy @ Method



  1. Relevant update
    “How to Burst the “Filter Bubble” that Protects Us from Opposing Views”

    Computer scientists have discovered a way to number-crunch an individual’s own preferences to recommend content from others with opposing views. The goal? To burst the “filter bubble” that surrounds us with people we like and content that we agree with.

    The term “filter bubble” entered the public domain back in 2011, when the internet activist Eli Pariser coined it to refer to the way recommendation engines shield people from certain aspects of the real world.

    Pariser used the example of two people who googled the term “BP”. One received links to investment news about BP while the other received links to the Deepwater Horizon oil spill, presumably as a result of some recommendation algorithm.

    This is an insidious problem. Much social research shows that people prefer to receive information that they agree with instead of information that challenges their beliefs. This problem is compounded when social networks recommend content based on what users already like and on what people similar to them also like.

    This is the filter bubble—being surrounded only by people you like and content that you agree with.

    And the danger is that it can polarise populations, creating potentially harmful divisions in society.

    …. read more about the solution on :
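The article above doesn’t spell out the researchers’ algorithm, but the core idea—score other users by how similar their preferences are to yours, then deliberately draw recommendations from the *least* similar—can be sketched roughly as follows. The profiles, topics, and function names are all hypothetical:

```python
from math import sqrt

# Hypothetical preference vectors: one score per topic,
# positive = agrees with / likes, negative = disagrees / dislikes.
PROFILES = {
    "alice":   {"climate": 1.0, "markets": -0.5},
    "bob":     {"climate": 0.9, "markets": -0.4},
    "charlie": {"climate": -1.0, "markets": 0.8},
}

def cosine(a, b):
    """Cosine similarity between two sparse preference vectors."""
    topics = set(a) | set(b)
    dot = sum(a.get(t, 0.0) * b.get(t, 0.0) for t in topics)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def opposing_voices(user, k=1):
    """Rank the other users by similarity to `user` and return the
    k LEAST similar ones -- the bubble-bursting step."""
    me = PROFILES[user]
    scored = [(cosine(me, PROFILES[u]), u) for u in PROFILES if u != user]
    scored.sort()  # most negative similarity first
    return [u for _, u in scored[:k]]
```

An ordinary recommender would sort the same list the other way and keep the top of it; inverting the sort is the whole difference between reinforcing the bubble and bursting it.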

Leave a Response