
Are You Familiar With The Concept Of "Longtermism"? What Do You Think Of It?

  

Category:  Religion & Ethics

By:  john-russell  •  2 years ago  •  3 comments



Longtermism is an ethical stance that gives priority to improving the long-term future. It is an important concept in effective altruism and serves as a primary motivation for efforts to reduce existential risks to humanity. [1]

Sigal Samuel from Vox summarizes the key argument for longtermism as follows: "future people matter morally just as much as people alive today; ... there may well be more people alive in the future than there are in the present or have been in the past; and ... we can positively affect future peoples' lives." [2] These three ideas taken together suggest, to those advocating longtermism, that it is the responsibility of those living now to ensure that future generations get to survive and flourish. [3]

https://en.wikipedia.org/wiki/Longtermism



Sounds good, right?  

But is there a dark side? 



In the view of philosopher and former longtermist Émile Torres, longtermism is "an immensely dangerous ideology [that] goes far beyond a simple shift away from myopic, short-term thinking." newrepublic.com/article/168047…




=================================================================================



The New Republic   @newrepublic

Even if humans could establish a utopia on Earth lasting billions of years, longtermism would judge it a catastrophe on par with nuclear war, since any civilization not obsessed with technological progress will fail to build the great digital Valhalla.





============================================





The father of longtermism, Nick Bostrom, a transhumanist and Oxford philosophy professor, has been trying to push into the mainstream the idea that the future's hypothetical digital people matter more than the billions of humans alive today because there will be at least 10^58 of them. That's a 1 followed by 58 zeroes -- the number of human simulations he calculates we could run using the stars' computing power.

The New York Times, the New Yorker and other media have given longtermism fawning coverage this year with little or no mention of its deranged core. The global fad and media frenzy are almost understandable at this moment in history. It truly is hard to watch: climate change, war, migration crises, economic instability, political regression into nativism, fascism and dictatorships. It's not science fiction but current events that inspire the quest for an escape path from planet Earth.

Longtermism is often framed as a way to protect Earth. But its architects care less about ecosystems than about making sure nothing stops humanity from reaching what Bostrom calls "technological maturity." That's a nice way of characterizing that moment when people turn into bits.

Last year, Emile P. Torres, a philosopher who studies existential threats and has extensively investigated longtermism, warned that the traction longtermism is gaining makes it "the most dangerous secular belief system in the world today."


Digital immortality will never be real, but the current fad for 'longtermism' does real harm. It reminds me of my teen yearning to upload my mind.




https://chipublib.idm.oclc.org/login?url=https://www.proquest.com/docview/2722999290?accountid=303

================================================================================




 
JohnRussell
Professor Principal
1  author  JohnRussell    2 years ago

From the little I have read of this, it appears that longtermism is a pet project of tech gurus and entrepreneurs who place more value on the future than the present. For example, the well-being of the future human inhabitants of Mars and other planets is more important than the well-being of people living today. One of these "longtermer" billionaires even suggested that funds being used to try to cure cancer be diverted to explore the needs of future interplanetary generations.

 
 
 
JohnRussell
Professor Principal
2  author  JohnRussell    2 years ago
Leading longtermists have arrived at abhorrent conclusions, such as that philanthropy should focus on saving and improving wealthy people's lives more than poor people's because that's a more direct way to ensure the innovation needed to launch us into space.

Douglas Rushkoff, author of "Survival of the Richest: Escape Fantasies of the Tech Billionaires," argues that the only way to reduce carbon emissions and salvage the Earth is to reduce consumption. "Longtermism is a way for [tech giants] to justify not looking back at the devastation they're leaving in their wake," he told me. "It's a way for them to say it doesn't matter all the damage I'm doing now because it's for a future where humans will be in the galaxies."

Whether it's Musk's plan to colonize Mars or Mark Zuckerberg's promise of a Metaverse, these billionaires' visions of escape via more industrial tools, more mass-produced technologies, can be seductive. At least Icarus' hubris cost only his own life.

As a preteen, I'd never heard of the transhumanists, the longtermists or the Extropians. But their early members were pumping propaganda into the culture, including the possibility of escaping our human forms, which they depicted as "weak, vulnerable, stupid." This perspective infected me at a time when I was frightened of my body -- of its origins and its uncertain future.

The chaos and doom that Extropians and their heirs saw in the Earth and its mortal vessels, I sensed in myself. Years later, when I heard Musk talking on a podcast about human bodies as hideous sacks of meat that we must ditch for robot encasements, I remembered my teen self and the pain I harbored. The tech supremacists promised a clean escape. I wanted one.

I thought I couldn't possibly matter as much as what those men might make out of me.

 
 
 
Jack_TX
Professor Quiet
3  Jack_TX    2 years ago

It just seems like this is yet another thing where a little moderation goes a long way.

Do we have a responsibility to the generations that will come after us?  Yeah, most responsible people would say we do.

Does that include the generations of digital non-humans created in a computer simulation?  What??  Wait...  who?  Hang on.   What just happened?  

 
 
