Singularity != Utopia

1 2020-08-15 13:02

Where does the delusion come from that the singularity will automatically create a utopia and an end of scarcity? Haven't previous technological leaps demonstrated that scarcity remains and that technology doesn't work for free? What if the singularity ends up as some AI-driven totalitarian surveillance state with rations and labor camps? How do you think a non-human AI will treat entities it has no innate empathy towards?

2 2020-08-16 14:48

I think the idea is to merge humans with AI (cyborg) before a true singularity has a chance to become something both too dangerous and without empathy. Maybe also the use of CRISPR tech to gene-edit. Once a few hundred people have done either one, they may have advantages in work/school, which will drive further adoption (assuming it is affordable and not too scary to people). Parents will not want their children to be the only ones in their class failing because they weren't modified, and the children aren't likely to resist. It could be both an end of scarcity and a surveillance state. Privacy might already be an illusion, although I don't think backdoors and dragnet surveillance have complete coverage yet. Humans might kill themselves off intentionally or accidentally, but either way I suspect that in 200 years there will likely not be any unmodified/living humans as we know them today. In some ways I hope I am wrong, but either way it will be interesting.

3 2020-08-16 15:15

>>2
1. What makes you think becoming a cyborg wouldn't be exclusive to some elite class, with millions never able to afford it?
2. If cyborgization is cheap and widely available, what prevents stratification, like millions of "worker drone" humans vs. a super-cyborg elite with the best augmentations?
3. If genetics is the pathway, what happens to the billions who can't afford it? Genetic modifications have to be tailor-made to genotypes.

4 2020-08-16 15:20

The end of scarcity implies that resources will be unlimited and equally distributed. What is the economic incentive to distribute unlimited resources equally, when a monopoly or cartel-type economy will allow the elite to remain in power?

5 2020-08-16 16:41

>>4
The end of scarcity, I think, only implies that resources are so abundant that they can easily supply not only needs but also ever-growing wants. It doesn't imply anything about unlimited supply or anything at all about equal distribution. Distribution is a service problem which will be handled either automatically by robotics or by people in the service industry. People will still want to work and earn credit/money that can be used either for fame or for the limited resources that must exist in any society, such as time at desired locations (resorts, oceanfront properties). If we go full science fiction, we could foresee full-body immersion as a way for people to forgo those limits on experience as well.

Elites might remain in power, but in power over what? If anyone in society can easily produce or obtain goods and services without any difficulty, then there is no means of control. Consider, for example, just a couple of techs: 90%-efficient solar panels combined with advanced 3D printers of near replicator-like (Star Trek) capability. Maybe such tech is impossible or forever beyond limited human understanding, but the combination of those two would leave very few needs left to be fulfilled, such as housing. I think pessimism is only natural given the daily view of the corruption and greed of today's elites, but it is hard to know if that will always be the case.

1. What makes you think becoming a cyborg wouldn't be exclusive to some elite class, with millions never able to afford it?

This is very likely. But for how long? Will the first people to become advanced cyborgs forever wish to keep everyone else as replaceable cattle? Will their lifespans be extended, and if so, will they want to be forever alone as others die? It is possible, but we cannot assume their minds will not also be enhanced and become something more than human. We cannot be sure how that will affect them, or what their motivations will be. There will likely be incentives for this tech to become available and reasonably priced; otherwise the companies that invested money in the research and development of such products would lose money.

2. If cyborgization is cheap and widely available, what prevents stratification, like millions of "worker drone" humans vs. a super-cyborg elite with the best augmentations?

What prevents this? I'm not sure it would be prevented. But don't forget about CRISPR and brain-enhancing tech. And would you want worker drones that are prone to injury, or robots? At some point, and in some fields, those worker drones are likely to be more costly than robots. We can already see a trend towards automation, but it is difficult to predict the future.

3. If genetics is the pathway, what happens to the billions who can't afford it? Genetic modifications have to be tailor-made to genotypes.

I have no idea if "billions can't afford it" is a permanent truth or just a temporary one. There will also be many who (even if they can afford it) will choose not to, for fear or for religious reasons, but it is difficult to know whether they will hold out for generations or simply years. In time I suspect it would take incredibly strong cultural indoctrination, or ignorance imposed upon their children, to keep those children from seeking out potential real-life immortality (assuming that is possible). That would also require those living beyond scarcity to feel near-zero empathy for those living in relative squalor. The unempathetic mind that humans have today might be the result of factors that wouldn't affect 100% of the people living beyond scarcity, so they may wish to bring the others into "modern" times and improve their living conditions. Of course that might mean bringing them under a surveillance state, as seen in popular novels.

The link between the singularity, the end of humanity as we know it, utopias, and dystopias is hard to predict.

Human slave labor has been a highly profitable system to exploit, be it outright slavery or simply wage slavery. AI might take that system from the elites and rule over humanity, or maybe the elites will merge with AI and continue their rule. Maybe humans will persist under such a system for some time, but I don't think it will be hundreds of years. I highly doubt human slave labor is the pinnacle of efficiency, and over time incentives will drive robotics to replace meat things that need an inefficient amount of rest and sustenance.

6 2020-08-16 18:59

Why worry about empathy and thoughtcrime when they can simply be gene-edited out? Would you want everyone to be autistic, for example? Sociopaths keep us around because they're not creative. Maybe the elites could keep the empathy for their own. The wealthy are going to self-perpetuate with life extensions everyone else simply can't afford to keep up with. Remember how expensive Major Kusanagi's parts were in Ghost in the Shell?

This is as good a reason as any to make as much money as you can now to give your genotype a shot. The bloodlines of the world are already their own ethnic clans. No one deserves to be born in shit-encrusted India. A singularity-level event is way more likely than antinatalism being embraced by everyone if you're concerned with mitigating suffering.

The nobles of yore saw themselves as having more in common with their fellow nobles of other lands than the peasants they lorded over.

Who would want to live forever as a slave? It is far better to die with an ounce of freedom than to be on your knees without end. Every man may have his price, but you can't put a price on freedom. Can these newfound boons add to the free will that we don't have? Control is only achieved at the barrel of a gun. The Federation knocks on your door and goes join or die. It is a mistake to think increased intelligence, whether AI or otherwise, can only mean benevolence.

The question is whether you could afford not to be changed when that's the only option that keeps you economically viable. Being fed and housed won't mean much if you're stuck on a prison planet in a vat connected to the internet because you're indebted/every other option is too expensive.

I think most everyone agrees it's a good idea to not forcibly attempt to evolve the uncontacted peoples of this planet.

>That would also require those living beyond scarcity to feel near-zero empathy for those living in relative squalor. The unempathetic mind that humans have today might be the result of factors that wouldn't affect 100% of the people living beyond scarcity, so they may wish to bring the others into "modern" times and improve their living conditions. Of course that might mean bringing them under a surveillance state, as seen in popular novels.

So we're going to have secret wars between factions of the cyborg elite. They're the only ones who can afford to evolve as they see fit anyways.

7 2020-08-16 20:50

>>6

Why worry about empathy and thoughtcrime when they can simply be gene-edited out?

Editing people's general thoughts and emotions probably could be done to some degree with gene editing, but by the time the few amoral scientists motivated to learn how had figured it out, wouldn't other scientists have already done easier things, like removing common diseases? That alone might fundamentally alter society, maybe for the better or the worse, but I don't think scientists are going to be thrilled to work on gene editing that could doom themselves or others to eternal emotionless slavery.

Would you want everyone to be autistic, for example?

A certain degree of variation in minds can be helpful for society and for each individual therein. Making widespread edits to all people would seem very reckless to me. Even if the people behind it were completely evil and completely foolish, it is hard to foresee how the industry behind such a product could go through research, development, production, and roll-out without serious internal pushback, leaks, and controversy. I don't think it could happen easily, but if all the unlikely techs fell into place at the right time, then maybe it could happen.

Maybe the elites could keep the empathy for their own.

Why would they want to intentionally give others less empathy? Wouldn't that make them less secure?

The wealthy are going to self-perpetuate with life extensions everyone else simply can't afford to keep up with.

Quite possible, but is this presuming they will intentionally try to keep others from getting life extension? Certain techs remain expensive, but if there is large enough demand and no single monopoly, then supply tends to increase, production costs improve, and competition can eventually drive prices down.

This is as good a reason as any to make as much money as you can now to give your genotype a shot.

I disagree. Perpetuating your genes is a poor reason to prioritize wealth acquisition. At the very least you should prioritize production over wealth acquisition; otherwise you could be undermining your own future.

...if you're concerned with mitigating suffering.

Even today, suffering is most often caused by people's own ignorance and overreaching desires, not by a lack of resources to satisfy needs. There are those who are in pain and suffer from a genuine inability to meet their needs, but the bulk of suffering can be seen in everyday people whose suffering is the result of poor mental maintenance, which inflicts a lot of suffering on themselves and some pain on others.

The nobles of yore saw themselves as having more in common with their fellow nobles of other lands than the peasants they lorded over.

You are assuming the elite of tomorrow will be like the elite of today. That might be reasonable, but it also might not. In a truly post-scarcity world there is little incentive to be sadistic.

Who would want to live forever as a slave? It is far better to die with an ounce of freedom than to be on your knees without end. Every man may have his price, but you can't put a price on freedom.

Death is eternal, which is a kind of freedom from suffering, but not any kind of experience. Some people would choose to suffer over returning to the void of non-existence that they already "experienced" for eons before they were born.

Can these newfound boons add to the free will that we don't have?

Freedom is a word that can mean many things. Even people on death row fight to stay alive, choosing imprisonment for life over death. The boons are unknown, as is the level of freedom, so this question cannot yet be answered.

Control is only achieved at the barrel of a gun. The Federation knocks on your door and goes join or die.

It is certainly possible for those like the elites we see today to become immortal masters seeking control over this and other planets, enslaving and sadistically working gene-edited cyborgs for all eternity, but I think the likelihood of such an outcome is close to zero.

It is a mistake to think increased intelligence, whether AI or otherwise, can only mean benevolence.

Agreed, but a drastic change in humanity is likely to result if transhumanism can't be stopped. That change could result in people who simply think deeper and faster, but no more morally, or it could cause people to realize they now have the means to put an end to much of the unnecessary pain.

The question is whether you could afford not to be changed when that's the only option that keeps you economically viable. Being fed and housed won't mean much if you're stuck on a prison planet in a vat connected to the internet because you're indebted/every other option is too expensive.

True, a singularity doesn't necessarily mean post-scarcity, and post-scarcity doesn't necessarily mean the freedom to go off-world. There are many different kinds of freedom. One is found in self-sufficiency, and a post-scarcity world would probably look a lot more like being always online and hardly ever going outside. But it could instead mean easy-to-produce parts for replicated spaceships. Given the general state of human minds today, that much freedom could mean very dangerous things, so it is likely that surveillance, crime, and security nightmares would only get worse and more intrusive.

I think most everyone agrees it's a good idea to not forcibly attempt to evolve the uncontacted peoples of this planet.

Look at religious missionaries. When people have a cause they think is just and for the betterment of others, they will attempt to change their targets' cultures. Some of that is for profit, but most missionaries aren't thinking of how to financially trick people; they genuinely believe in their gospels. I think if it were easy, post-scarcity peoples would do the same. The reason people don't try to end the suffering of others now is the feeling that it is too big a task. But if all it took was giving them replicator-like tech with blueprints for cybernetic parts, drugs that cure aging, etc., I think they would do so. Even if 99.99% of people were against it, just a few people producing replicators with their replicators and sending them out on drones would fundamentally challenge these non-modern cultures.

So we're going to have secret wars between factions of the cyborg elite. They're the only ones who can afford to evolve as they see fit anyways.

Assuming we (read: anyone and everyone) get the tech to build world-ending bombs before we evolve the human mind to be wiser, then there won't be any war, just a world-ending event.

8 2020-08-17 08:55

The max post length must be drastically decreased.

9 2020-08-18 02:31

Looks like the cyborg drones are in.
https://www.msn.com/en-gb/news/techandscience/groundbreaking-new-material-could-allow-artificial-intelligence-to-merge-with-the-human-brain/ar-BB1836wm

10 2020-08-22 12:18

>>9
https://www.freethink.com/articles/cyborgs

11 2020-09-19 14:41

From the very first sentence, this discussion refers to some specific "singularity" as a predefined phenomenon. I feel out of the loop. Could someone explain what this specific singularity that everyone refers to actually is?

12 2020-09-19 15:55

>>11
It's the technological singularity, a predicted point in history where technological progress reaches such a velocity that humanity is no longer able to control it.

13 2020-09-20 19:21

>>12
Oh ok, I just learned about the concept. Thank you.

14 2020-09-21 16:18 *

>>12

"Humanity" is no longer in control compared to what? We already live in a world where bureaucracy and human constructs run human lives without direct human management. I don't see that we're so in control now.

15 2020-09-21 18:18

>>14
Sorry I'm not an expert on the topic. I'd be glad if you could give us a better explanation of what it means.

16 2020-10-03 11:26

>>15
Likely he means that the world is run by computer algorithms predicting everything from weather to stock prices, neural networks trained to detect anomalies, and other software not under human 'control':
Bureaucracy increasingly relies on computer databases, protocols, and various software that is getting more and more complex to analyze.
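
To make the "anomaly detection" point concrete, here is a minimal sketch of the kind of check that quietly gates decisions in such pipelines. It is a toy, not any real deployment: a simple z-score test standing in for the neural-network detectors mentioned above, with made-up readings and threshold.

    # toy anomaly flag: a z-score test, a simple stand-in for learned detectors
    from statistics import mean, stdev

    def is_anomaly(history, value, threshold=3.0):
        # flag `value` if it lies more than `threshold` standard deviations
        # from the mean of the historical readings
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return False
        return abs(value - mu) / sigma > threshold

    readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]
    print(is_anomaly(readings, 14.7))  # True -> the sensor/transaction gets flagged, no human in the loop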

17 2020-10-03 23:32 *

>>16
Isn't humanity already there? Humanity controls the weather also.

18 2020-10-09 08:12 *

He who controls the spice ...

19 2020-10-16 21:36 *

>>18
... controls the nice

20 2021-08-06 02:16

>>1
But my Star Trek.... before JJ Abrams ruined it anyways.

21 2021-08-06 02:32 *

automation under capitalism typically makes many people's lives worse without really benefiting anyone, and likewise with most work, which would be superfluous if it weren't harmful. in my opinion AI is completely necessary to have a post-scarcity society, and I would not be surprised if the means of production were sufficient even in the 90s in advanced countries. the issue is that the countries with resources are not oriented towards bringing such a change about, and the handful of states which are oriented in this direction are too poor, fearful, conservative, or ignorant to even try.

22 2021-08-06 13:15 *

>>21
AI being statistics, linear programming, expert systems on a computer, etc. (all of which I believe were available in the 80s, but computation was far more expensive)
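
As a rough illustration of how modest that kind of "AI" is, here is a toy forward-chaining expert system: just if-then rules applied until nothing new can be concluded. The rules and facts are invented for illustration and don't come from any historical system.

    # toy forward-chaining expert system; rules and facts are invented for illustration
    rules = [
        ({"demand_high", "automation_cheap"}, "automate_production"),
        ({"automate_production", "energy_abundant"}, "post_scarcity_candidate"),
    ]

    def infer(facts, rules):
        # keep firing rules whose conditions are all satisfied until no new facts appear
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(infer({"demand_high", "automation_cheap", "energy_abundant"}, rules))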
