Computer scientists have discovered a way to number-crunch an individual’s own preferences to recommend content from others with opposing views. The goal? To burst the “filter bubble” that surrounds us with people we like and content that we agree with.
The term “filter bubble” entered the public lexicon back in 2011, when the internet activist Eli Pariser coined it to describe the way recommendation engines shield people from certain aspects of the real world.
Pariser used the example of two people who googled the term “BP”. One received links to investment news about BP while the other received links to the Deepwater Horizon oil spill, presumably as a result of some recommendation algorithm.
This is an insidious problem. Much social research shows that people prefer to receive information that they agree with instead of information that challenges their beliefs. This problem is compounded when social networks recommend content based on what users already like and on what people similar to them also like.
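The recommendation mechanism described above is essentially user-based collaborative filtering: find the users most similar to you, then suggest what they liked. A minimal sketch is below; the data and function names are invented for illustration, and this is not the researchers’ actual method. The `invert` flag hints at the idea in the opening paragraph, drawing recommendations from the *least* similar user instead of the most similar one.

```python
import math

# Hypothetical user -> {item: rating} preference data (invented for illustration).
ratings = {
    "alice": {"a": 5, "b": 4, "c": 1},
    "bob":   {"a": 4, "b": 5, "c": 2, "e": 5},
    "carol": {"a": 1, "b": 2, "c": 5, "d": 4},
}

def cosine_similarity(u, v):
    """Cosine similarity over the items two users both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(u[i] ** 2 for i in shared))
    norm_v = math.sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user, invert=False):
    """Recommend unseen items from the most similar other user --
    or, with invert=True, from the least similar one."""
    others = [(cosine_similarity(ratings[user], ratings[o]), o)
              for o in ratings if o != user]
    _, source = min(others) if invert else max(others)
    seen = set(ratings[user])
    return sorted(i for i in ratings[source] if i not in seen)

print(recommend("alice"))               # from her nearest neighbour -> ['e']
print(recommend("alice", invert=True))  # from her most dissimilar peer -> ['d']
```

Real recommenders are far more elaborate (matrix factorisation, implicit feedback, popularity priors), but even this toy version shows the self-reinforcing loop: similar users keep feeding each other more of the same.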
This is the filter bubble: being surrounded only by people you like and content that you agree with.
And the danger is that it can polarise populations, creating potentially harmful divisions in society.