Tuesday, July 19, 2011

Eli Pariser



As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy. 


Facebook and Google's invisible, automated editing of the web
Google looks at 57 signals (what kind of computer you're on, where you are, etc.); there is no standard Google anymore.
Hard to see: you can't tell how different your search results are from other people's.
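A rough sketch of the idea (not Google's actual system, and every signal name and weight below is invented for illustration): the same query gets re-ranked per user based on signals about them, so two people never see the same page.

def personalized_rank(results, user_signals):
    """Re-rank the same results differently depending on per-user signals."""
    def score(item):
        base = item["relevance"]
        # Hypothetical boosts: location match and previously clicked topics.
        if item.get("region") == user_signals.get("location"):
            base += 0.3
        if item.get("topic") in user_signals.get("clicked_topics", []):
            base += 0.5
        return base
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "In-depth report on the budget debate", "relevance": 0.8, "topic": "politics"},
    {"title": "Celebrity awards-night recap", "relevance": 0.8, "topic": "entertainment"},
]

# Same query, two users, two different "front pages".
print(personalized_rank(results, {"clicked_topics": ["entertainment"]}))
print(personalized_rank(results, {"clicked_topics": ["politics"]}))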

Yahoo News, the Huffington Post, and the Washington Post are also doing this kind of personalization.

They're trying to show us what they think we want to see, but not necessarily what we need to see.
Your filter bubble: you don't decide what gets in, and you don't see what gets edited out.

Netflix queues
The challenge with these algorithmic filters is that they look at what you click on first, so they can throw off that balance (e.g. between the documentaries you queue up aspirationally and the impulse picks you actually watch). See the sketch below.
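A toy illustration of that feedback loop (my own sketch, not any real recommender): a filter that only learns from what you clicked first keeps reinforcing it, so the "information vegetables" rarely surface again.

import collections
import random

# The first thing you happened to click seeds the filter.
click_counts = collections.Counter({"entertainment": 1})

stories = [
    {"topic": "entertainment"},  # the impulse pick
    {"topic": "hard news"},      # the aspirational pick
]

def pick_story(stories):
    # Favor topics you've clicked before.
    weights = [1 + click_counts[s["topic"]] for s in stories]
    return random.choices(stories, weights=weights, k=1)[0]

for _ in range(20):
    chosen = pick_story(stories)
    click_counts[chosen["topic"]] += 1  # each click makes the bias stronger

print(click_counts)  # entertainment dominates after a few rounds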

We're seeing a passing of the torch from human gatekeepers to machine gatekeepers,
but the algorithms don't have the same capacity for editorial judgment that human editors did.

We need to make sure these algorithms are transparent enough that we can see what's being filtered, and that they give us some control over what gets through.
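One way to picture "transparent and controllable" (purely my own illustration, not any real product's API): the feed reports why each item was boosted, and the user can switch personalization off entirely.

def build_feed(stories, user_signals, personalize=True):
    """Rank stories, but expose the reasons and allow opting out of personalization."""
    ranked = []
    for s in stories:
        score = s["relevance"]
        reasons = []
        if personalize and s["topic"] in user_signals.get("clicked_topics", []):
            score += 0.5
            reasons.append("boosted because you clicked this topic before")
        ranked.append({"title": s["title"], "score": score, "why": reasons})
    return sorted(ranked, key=lambda r: r["score"], reverse=True)

# The "why" field makes the invisible editing visible; personalize=False
# gives everyone the same unfiltered ranking.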

We need the internet to be that thing we dreamed it would be: the thing that connects us, draws us together, and introduces us to new ideas, new people, and different perspectives.
But it's not going to do that if it leaves us all isolated in a web of one.

Google Advanced Search