MyOOPS OpenCourseWare

Course source: TED
     

 

Eli Pariser: Beware online "filter bubbles"

 


Speaker: Eli Pariser

Filmed March 2011; posted on TED May 2011

Translation: 洪曉慧

Editing: 朱學恆

Simplified/Traditional Chinese conversion: 洪曉慧

Post-production: 洪曉慧

Subtitled-video post-production: 謝旻均

 


 

About this talk

As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.

 

About Eli Pariser

Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

 

Why you should listen

Shortly after the September 11, 2001 attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, more than half a million people from 192 countries signed on, and Pariser rather unexpectedly found himself an online organizer. The site merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign-policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement," tripling MoveOn's membership and demonstrating that large sums could be raised through many small online donations.

 

In 2004, Pariser became executive director of MoveOn. Under his leadership, the political group MoveOn.org Political Action has grown to 5 million members and raised over 120 million dollars from millions of small donors to support advocacy campaigns and political candidates. In 2004 Pariser also focused on MoveOn's online-to-offline organizing, developing phone-banking tools and precinct programs that in 2006 laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser handed the executive director role to Justin Ruben and became chairman of MoveOn's board; he is now a senior fellow at the Roosevelt Institute.

 

His book "The Filter Bubble" was published on May 12, 2011. In it, he asks how modern search tools -- the filters through which so many of us see the wider world -- keep getting better at sifting that world on our behalf, returning the results they "think" we want to see.

 

"When confronted with a list of search results from Google, the average user (myself included, until I read this article) tends to assume the list is exhaustive. Not knowing that it isn't... is the same as having no choice at all. Judging by the quality of these results, you could say I am being fed information junk food -- because I don't know that I have other options, ones Google has already filtered out."

-- Aubrey Pek, commenting on Kim Zetter's "Junk Information Algorithms," http://www.wired.com/epicenter/2011/03/eli-pariser-at-ted

 

Eli Pariser on the web (in English)

Home: elipariser.com

Fellowship: Roosevelt Institute

Twitter: @elipariser

 

[TED: Technology, Entertainment, Design]

Catalog of TED talks with Chinese subtitles (Traditional) (Simplified). Note that the Traditional and Simplified catalogs are different.

 

Transcript

Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.

 

So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem. So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
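The kind of editing described here can be sketched as a naive click-based relevance filter. This is a hypothetical illustration, not Facebook's actual algorithm; the function name, data shapes, and `min_clicks` threshold are all invented:

```python
from collections import Counter

def filter_feed(posts, click_history, min_clicks=1):
    """Naive click-based relevance filter (hypothetical sketch).
    Sources the user has clicked fewer than `min_clicks` times are
    silently dropped from the feed -- the user is never consulted."""
    clicks = Counter(click_history)  # missing sources count as zero clicks
    return [(src, item) for src, item in posts if clicks[src] >= min_clicks]

feed = [("liberal_friend", "Op-ed A"), ("conservative_friend", "Op-ed B")]
history = ["liberal_friend", "liberal_friend"]  # only liberal links clicked
print(filter_feed(feed, history))
```

Run it and the conservative friend's post has vanished from the output, exactly the invisible edit the talk describes: nothing signals to the user that anything was removed.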

 

So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's.
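Signal-based tailoring can be illustrated with a toy re-ranker. Google's actual 57 signals and scoring are not public; the tag-matching score below is invented purely to show how identical queries can yield different orderings for different users:

```python
def personalize(results, signals):
    """Hypothetical re-ranking sketch: boost results whose tags match
    the user's signals (location, browser, inferred interests...).
    The scoring here is invented for illustration only."""
    def score(r):
        # one point per user signal that matches a result tag
        return sum(1 for s in signals.values() if s in r["tags"])
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "egypt-travel.example", "tags": {"travel"}},
    {"url": "egypt-news.example", "tags": {"news", "politics"}},
]
# Two users issue the same query but carry different signals:
print(personalize(results, {"interest": "travel"})[0]["url"])
print(personalize(results, {"interest": "politics"})[0]["url"])
```

The same result list, re-sorted per user, mirrors the "Egypt" screenshots in the next paragraph: one user's first page is travel, the other's is the protests.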

 

But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side-by-side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.

 

So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

 

So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out. So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. (Editor's note: "Waiting for Superman" is an American documentary about education, not a superhero film.)

 

What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter) So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables, it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
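The "best editing gives us a bit of both" idea could be sketched as a ranker that blends click-based relevance with an importance score the user never clicks toward. The fields, weights, and scores below are hypothetical, chosen only to make the trade-off concrete:

```python
def rank_stories(stories, w_relevance=0.5, w_importance=0.5):
    """Blend predicted click-through ('relevance') with editorial
    judgment ('importance') instead of ranking on clicks alone.
    Both fields are hypothetical scores in [0, 1]."""
    def score(s):
        return w_relevance * s["relevance"] + w_importance * s["importance"]
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Celebrity gossip", "relevance": 0.9, "importance": 0.1},
    {"title": "Election coverage", "relevance": 0.4, "importance": 0.9},
]
# Pure relevance would put the gossip first; the blended score
# surfaces the civically important story instead.
print([s["title"] for s in rank_stories(stories)])
```

Setting `w_importance=0` reduces this to the click-only ranking the talk criticizes: the information dessert always wins and the vegetables never surface.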

 

What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.

 

And the thing is we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important. That, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information. That the newspapers were critical, because they were acting as the filter, and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.

 

I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control, so that we can decide what gets through and what doesn't. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

 

Thank you.

 

(Applause)

 

 


All works on this site are licensed under a Creative Commons license.