MyOOPS Open Courseware
Open courseware from the world's top universities, translated into Chinese by thousands of volunteers around the world. Enjoy it for free!

Course source: TED
     

 

Martin Rees: Can we prevent the end of the world?

 


Speaker: Martin Rees

Filmed March 2014 at TED2014

 

Translation: 洪曉慧

Editing: 朱學恒

Simplified/Traditional Chinese conversion: 洪曉慧

Post-production: 洪曉慧

Subtitled video post-production: 謝旻均

 

Click here to download the video

Click here for the Mac and handheld-device version

Read the plain-text version of the Chinese subtitles

 

About this Talk

A post-apocalyptic Earth, emptied of humans, seems like the stuff of science fiction TV and movies. But in this short, surprising talk, Lord Martin Rees asks us to think about our real existential risks — natural and human-made threats that could wipe out humanity. As a concerned member of the human race, he asks: What's the worst thing that could possibly happen?

 

About Martin Rees

Lord Martin Rees, one of the world's most eminent astronomers, is an emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He is one of our key thinkers on the future of humanity in the cosmos.

 

Why you should listen

Lord Martin Rees issues a clarion call for humanity. His 2004 book, with its ominous title Our Final Hour (published in Traditional Chinese as 《時終》), catalogs the threats facing the human race in a 21st century of unprecedented, rapidly accelerating scientific advances. He calls on scientists and nonscientists alike to take steps that will ensure our survival as a species.

 

Rees, one of the world's leading astronomers, is emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He has written more than 500 research papers on cosmological topics including black holes, quantum physics and the Big Bang. Rees has received countless awards for his contributions to science, but he has contributed just as much to explaining complex science to the general public, in books such as Before the Beginning and Our Cosmic Habitat.

 

Martin Rees online (in English)

ast.cam.ac.uk

Our Final Hour

Just Six Numbers

 

[TED: Technology, Entertainment, Design]

Catalog of TED talks with Chinese subtitles (Traditional) (Simplified). Please note that the Traditional and Simplified catalogs are not the same.

 

Martin Rees: Can we prevent the end of the world?

 

Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)

 

And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us. And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light. We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.

 

And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets. Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed. For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own threatens us all.

 

Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life? When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

 

And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse. As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale.

 

So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential. Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

 

And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."

 

Thank you very much. (Applause)

 


 



Creative Commons License: All works on this site are licensed under a Creative Commons license.