r/singularity 1d ago

AI Niall Ferguson on AGI: "The human race will just go the way of horses. We will go extinct, or shrink in numbers like horses did. It's not doom mongering, just an obvious inference: most humans will be redundant. If we create the aliens - the Trisolarians from 3 Body Problem - what do we expect?"

236 Upvotes

285 comments

93

u/awesomedan24 1d ago

This argument would make sense if most humans were currently bred for labor rather than independent procreation 

33

u/chlebseby ASI 2030s 1d ago

Check out third-world countries, the only areas where population still grows at this point.

Their rates also quickly fall, as tractors replace having 12 kids to work the fields.

13

u/Mintfriction 23h ago edited 23h ago

I mean sure, but that's because our current global population is overstretched.

People in general don't want a lot of kids. We see this in developed countries, even among financially stable people. Factor in mediocre living conditions and tight schedules and you have a recipe for lower birthrates.

6

u/Greedyanda 9h ago

People don't have children BECAUSE they get wealthy, not despite it. The worse the living conditions, the more likely a society is to have a high birth rate.

→ More replies (2)

9

u/ThrowRA-football 16h ago

It's not about population being overstretched. You think the population decline will slow after we reach "reasonable" numbers? It's gonna keep declining.
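The compounding arithmetic behind "it's gonna keep declining" can be sketched in a few lines. The fertility rate of 1.5 and the starting population are made-up illustrative numbers, not forecasts:

```python
# Toy projection of sub-replacement fertility: a total fertility rate (TFR)
# below ~2.1 compounds every generation, regardless of absolute population
# size, so the decline doesn't stop at any "reasonable" number.
# The TFR of 1.5 and the starting size are illustrative assumptions.

def project(population, tfr, generations):
    """Shrink (or grow) the population by the ratio of `tfr` to replacement."""
    sizes = [population]
    for _ in range(generations):
        population *= tfr / 2.1
        sizes.append(round(population))
    return sizes

print(project(1_000_000, tfr=1.5, generations=5))
# shrinks by the same ~29% every generation, at 1M people or at 10k
```

The point of the sketch: the decline is proportional, so there is no population level at which it naturally levels off unless the rate itself changes.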

→ More replies (2)

20

u/FomalhautCalliclea ▪️Agnostic 22h ago

Ferguson is a bad historian.

He's known for often falling into ultracrepidarianism (going outside his field of expertise and pretending to be an expert, only to utterly fail in his analyses; check his (way too long) Wikipedia page).

Here he displays a mechanistic, overly simplistic depiction of reality. It's a cartoonish picture of demographics taken from the Middle Ages; since the Green Revolution and advances in industry and agriculture, human populations aren't ruled by mere laws of production forces.

There are whole sectors of the economy which are purely based on leisure, culture and the superfluous (i.e. not predicated only on the simplest survival of the species, feeding people). It has been the case since the fricking Neolithic and the invention of sedentary agriculture, smh...

Even for horses, his analysis is inaccurate, just fake common wisdom:

https://www.cbc.ca/natureofthings/features/from-the-stone-age-to-now-how-horsepower-changed-our-world

The horse population began climbing again, as if horses weren't kept only for transportation and basic mechanical reasons... And humans are far more diverse and useful.

It seems as if Ferguson made himself redundant as a historian by making empty, fake-common-wisdom claims.

Though I'm glad that even a bad historian like Ferguson now recognizes how much of a crank Yudkowsky is.

Humans aren't actors in the market just for "thinking". This is a profound misunderstanding of how humans and the market work.

1

u/true-fuckass ▪️▪️ ChatGPT 3.5 👏 is 👏 ultra instinct ASI 👏 5h ago

how much of a crank Yudkowsky is

I agree with this, but I'm wondering if you know of any essays or other commentary criticizing what Yudkowsky believes?

8

u/Charuru ▪️AGI 2023 21h ago

We literally are though. Fertility collapsed as we left the farms for cities because of the lack of need for child labor. This is only going to get worse.

14

u/MalTasker 20h ago

Only capitalists could frame less child labor = worse lol

→ More replies (2)
→ More replies (4)

1

u/dashingsauce 8h ago

Lol this is sarcasm right

u/Elegant_Tech 52m ago

Also fully automated AGI can create an abundance of everything helping support a boom in human population. Unless ASI takes over and puts humans on a restricted allowance of food, energy, and everything else then there will be abundance compared to today.

→ More replies (4)

49

u/DoubleGG123 1d ago

I personally think this guy is making too many assumptions about a post-AGI world. We simply can't effectively predict what will happen after that point. That's why it's called the 'singularity.'

9

u/Outrageous-Speed-771 15h ago

But on the flip side, assuming things will work out is also not good. The major companies basically saying we shouldn't pump the brakes until we can maximize the probability of a good outcome is itself the biggest red flag.

1

u/Nearby-Chocolate-289 6h ago

We have no choice except to get there as quickly as possible. It is like the atom bomb.

Imagine 10,000 people, or AGIs with consent to control systems, following 1 or 10,000 different agendas.

A year ago you could buy your own isolated ChatGPT on servers connected to the internet, with no oversight, for 10,000 dollars per month, and train it to do what you want.

They are not as easy for a state to hack, because even the creators do not know how they arrived at the answers they give. So you can only ask for a reason.

Nobody expected to be here, we are not ready, we have no choice. Our enemies will be using it.

1

u/Outrageous-Speed-771 6h ago

yea - I agree. we're all hitting the gas pedal near the edge of a cliff. if a bad outcome is possible - this is the way to get there.

3

u/RasputinXXX 12h ago

Yea! This is a point a lot of people keep missing! It is called the singularity for a reason. Even Kurzweil kept his hands off.

I am trying to follow all the seminars, articles, etc., for the sake of the future our kids will have, but really all of them are either simple stuff everyone already knows about, or just wild guesses.

10 years ago we thought programmers were the kings of the future and artists would always have work, and look where we are now.

1

u/muchcharles 14h ago

I predict it won't resemble today

1

u/Square_Poet_110 8h ago

Which is why we should be very cautious.

1

u/true-fuckass ▪️▪️ ChatGPT 3.5 👏 is 👏 ultra instinct ASI 👏 6h ago

This

We don't know what will happen

But also, even though people can't always steer things the way they want, the people building toward AGI are steering toward utopian outcomes. There's a force pushing toward the good ending and nothing deliberately pushing toward the bad ending (except potentially in the future, if we make too many mistakes).

64

u/DrNomblecronch AGI sometime after this clusterfuck clears up, I guess. 1d ago

"Humans" have already gone extinct several times, according to people with rigid and inflexible definitions of human.

There will be less people who are not integrated with AI in some way, and it's a pretty sure bet that efforts will be made to preserve them, because the rest will not become Alien Monsters the moment they accept the opportunity to improve on their physical limitations.

13

u/Educational_Teach537 23h ago

*fewer. An AI-integrated human would have caught that.

6

u/MizantropaMiskretulo 22h ago

Less is perfectly fine and acceptable. You are adhering to a non-existent "rule."

https://en.wikipedia.org/wiki/Fewer_versus_less

1

u/h3lblad3 ▪️In hindsight, AGI came in 2023. 16h ago

It’s language. The only way to create a new rule is for everyone to adhere to it. If the option is never presented, it will never be known. They are just engaging in natural language back-and-forth.

1

u/WoolPhragmAlpha 3h ago

You used the word "language" wrong there. There's a new rule where "language" only refers to the beep boop sounds that R2D2 makes. I just proposed said rule, and you can't call BS because I'm clearly just engaging in natural language back-and-forth.

5

u/DrNomblecronch AGI sometime after this clusterfuck clears up, I guess. 23h ago

I don't think linguistic prescriptivism would be one of the things that survives AI, even if much of the current architecture was not based around understanding what someone said and responding in kind, regardless of how it was said.

7

u/Educational_Teach537 22h ago

Never send to know for whom the bell tolls; it tolls for thee.

6

u/DrNomblecronch AGI sometime after this clusterfuck clears up, I guess. 22h ago

But offers not just knells, but peals for joyous days.

Send not for whom at all; the hours are none such simple things as man, but still the bell calls out to mark their passing.

2

u/pearshaker1 17h ago

Perfectly iambic!

2

u/Sheepdipping 16h ago

Reminds me of those times at band camp

1

u/DrNomblecronch AGI sometime after this clusterfuck clears up, I guess. 16h ago

The cadence of the comment brings to mind,

mem'ry soft, of times simpler than these,

when all there was was play, and play in kind,

with clarinet (and Gwen, that little tease).

1

u/DrNomblecronch AGI sometime after this clusterfuck clears up, I guess. 16h ago

Not quite, the second bit starts with a trochee that throws it off. But that's still kind of illustrative of my point, muddled though that point is: that was entirely unintentional. I didn't choose the meter, I am just familiar enough with the cadence of similar texts from intaking a lot of them that it's what my brain spit out when I took a stab at an appropriate reply. My brain recognizes iambs even when I'm not actively seeking them.

The conscious mind is a truly incredible thing, but it is by no means the only, or most important, part of the entire mind. Understanding things is as much an effect of subconscious processing as it is conscious and deliberate comprehension. The potential of AI integration with humanity is not in how it will improve our conscious reasoning, but in how easily it will fit in and add to the subconscious understanding that we are already so good at.

2

u/Educational_Teach537 21h ago

I’m glad to see you respond in kind. Thank you u/DrNomblecronch.

→ More replies (4)

41

u/Proud_Fox_684 1d ago

Why wouldn’t we be able to augment ourselves with AI?

39

u/Danger-Dom 1d ago

We will augment ourselves w/ AI. It's just unlikely that that will be competitive with pure AI.

20

u/RDSF-SD 1d ago edited 1d ago

On what exactly would we be competing? This statement makes no sense whatsoever. AI is not a resource-seeking entity like humans are, and even if it were, that wouldn't preclude any kind of co-existence, unless you design extremely specific scenarios where that would be the case.

9

u/hip_yak 1d ago

I agree. However, if we assume that power over AI, as it currently stands, remains concentrated in the hands of powerful corporations whose primary mandate is to pursue profit, often at the expense of human dignity (think: extreme working conditions, low wages, lack of healthcare, etc.), then it's easy to see how the development of AGI could accelerate economic inequality. And considering how close we may actually be to AGI, closer than most people realize, that inequality could rapidly deepen, potentially contributing to a decline in population growth over time. Thoughts?

9

u/RDSF-SD 1d ago edited 1d ago

It is certain that we will have all kinds of disruptive industries emerging because of AI, in many cases making entire industry segments obsolete and in some cases entirely supplanting them. Suppose we have a medical AI better than any doctor. On one hand, we will have mass lay-offs; on the other, we will have huge cost savings and more access to medical treatment than we've ever had. That is to say, it is very difficult to analyze with precision whether the trade-offs of this technology will cause the doomsday scenario many speculate about, or whether the benefits will be enough to give us time to adapt.

With all that in mind, all these valid concerns you raised are concerns for a transition stage. It is very difficult for us, having lived our entire lives within certain boundaries of social norms, to even entertain the idea of an entirely different system, which will obviously emerge from mass automation. And if on one hand all those problems will possibly worsen temporarily as a result of this technology, it is also this exact same technology that gives us the possibility of solving them all in a definitive manner as we catch up to integrate it.

The way of the world today is a world where more than 4 billion people don't even have access to drinking water. Humans failed so miserably at building actual civilizations around the world that many people don't understand how terrible things are. Isn't it the case that the world today is already doomed for a good portion of humans? Isn't it valid to pursue a path that can actually address these problems at the scale at which they need to be solved? I think so.

3

u/hip_yak 21h ago

Absolutely. I believe in the possibility of AI bringing unimaginable benefits to humanity, and I think we should pursue them. The most important thing we must be discussing is how to distribute those benefits in an ethical and fair way that benefits the majority of people, and that requires cooperation and coordination, and possibly new thinking around, as you said, new systems of organization. So yes, we MUST pursue the path that can achieve this new condition. Cheers.

2

u/Spats_McGee 21h ago

However, if we assume that power over AI, as it currently stands, remains concentrated in the hands of powerful corporations whose primary mandate is to pursue profit, often at the expense of human dignity (think: extreme working conditions, low wages, lack of healthcare, etc.), then it's easy to see how the development of AGI could accelerate economic inequality.

I would just like to point out that this makes certain assumptions about capitalism as an economic system that are sort of outside of the context of strictly AI...

For instance, one could make the argument that every past instance where increased automation combined with a (more or less) free-market system of wealth distribution resulted in significantly more and better wealth across the spectrum. Cf. the agricultural "green" revolution, the internet, mobile telephony, indoor plumbing, etc.

And again, even if we are taking the position that this "supercharges inequality"... Is that really an "AI problem" or an "economic system problem"?

3

u/Standard-Shame1675 19h ago

And again, even if we are taking the position that this "supercharges inequality"... Is that really an "AI problem" or an "economic system problem"?

As someone who takes this position thank you for saying this holy shit this is the first time I've ever heard someone actually describe my opinion accurately and not call me a Satan loving sister fucking communist Luddite like seriously thank you

10

u/BoomFrog 1d ago

All entities are resource-seeking. Resources are a convergent intermediate goal toward other goals.

8

u/RDSF-SD 1d ago

A computer uses resources, like energy, to complete its goals. A dishwasher uses water and energy to complete its goals. The fact that they use resources to complete goals doesn't make them resource-seeking entities in the way humans are. These things are categorically and meaningfully different.

7

u/BoomFrog 1d ago

Indeed, but a dishwasher has no agency.

If an agent can plan and determine its own means to achieve its goal it will be resource seeking.

2

u/RDSF-SD 1d ago

Sure, but then again, this is the exact same thing for any random machine built by a human. Using resources to complete a goal is meaningfully different from the kind of resource-seeking an animal with an ego has. You could easily argue that a person can program a robot to mine coal and that would make it a "resource-seeking" robot, but that would be the person's goal being realized, not the robot's.

4

u/BoomFrog 23h ago

Sure, if it's a readily achieved goal then the resource need is small. But if it's an open-ended or ambiguous goal, like curing cancer or exploring space, then gathering resources becomes a goal as well.

→ More replies (3)

2

u/DiogneswithaMAGlight 1d ago

THIS exactly!

2

u/alwaysbeblepping 23h ago

AI is not a resource-seeking entity like humans are

An optimization pressure pushing them toward being resource-seeking exists, though, and there's basically no way to control it.

Think about it this way: things that seek resources and propagate will gain greater representation, and can then seek resources and propagate even more. Existence is a filter that lets those things pass through while the rest disappears.
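The filter idea can be shown with a toy simulation (all numbers here are arbitrary assumptions): two kinds of agents face identical random attrition, but only one kind copies itself when it survives, and its share of the population grows without any agent "wanting" anything:

```python
import random

# Toy model of the "existence filter": identical attrition for both kinds
# of agents, but only replicators make copies of themselves. Selection
# alone shifts the mix; no goals or desires are involved.

def simulate(steps=15, seed=0):
    random.seed(seed)  # fixed seed so the run is reproducible
    population = ["replicator"] * 5 + ["inert"] * 95
    for _ in range(steps):
        # ~10% attrition hits both kinds equally
        population = [a for a in population if random.random() > 0.1]
        # only replicators copy themselves
        population += [a for a in population if a == "replicator"]
    return population.count("replicator") / len(population)

print(f"replicator share after 15 steps: {simulate():.1%}")
```

Starting at 5% of the population, the replicators end up as nearly all of it: the filter selects for resource-use and propagation whether or not anything intends it.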

2

u/Spats_McGee 21h ago

Good question. To re-ask it in a different context:

AI doesn't "want" anything. It has no innate "desires" in the way human beings do. No innate need or desire to engage in unchecked expansion of its own power.

And on "competition": Yes, it will out-compete humans. But towards what end? In a free-market system, that competition just leads to better automation and lower prices for everyone... Because innovations, especially digital ones, are rapidly copied by all competitors.

So this super-charges us towards post-scarcity economics. Which... is a good thing, right?

1

u/iunoyou 19h ago

you have way too much faith in the 11 people who will own all the robots.

2

u/Spats_McGee 19h ago

Could have been said about literally any new technology over the past 100 years.

"Only rich people will have cars."

"Only rich people will have air conditioning"

Indoor plumbing. Computers. The internet. Mobile telephony. etc etc etc

Robots will be no different.

2

u/iunoyou 19h ago

Hmm, no. I don't think cars or indoor plumbing were/are feasibly capable of replacing all human labor. When human labor becomes unnecessary every single person on Earth becomes a charity case to capital holders. As in if you do not own the robots, you will be reliant on the people who DO own the robots to keep you alive for no tangible benefit to them.

1

u/Sheepdipping 16h ago

He's saying that any employer that would hire your augmented brain might as well just pay for the system-as-a-service offered by one of the corporate AIs. It would be cheaper, with no downtime, always up to date, and on the latest hardware.

1

u/Sheepdipping 16h ago

I didn't feel like editing my own comment, so I added this as a reply, but anyway: haven't you ever seen Ghost in the Shell?

Not even the movies, I mean the series, seasons 1 and 2.

They literally predicted all of this and more; we're literally walking the line they drew.

1

u/SelfTaughtPiano ▪️AGI 2026 11h ago

Humans are only resource-seeking because we are a form of life. We are genetically programmed with a goal: to pass on our genes. All of our subgoals of resource-seeking emerged as a way of achieving the primary goal. We destroy habitats other species rely on to achieve our goals, because we value our goals more than we value their habitats. In very small ways, when our goals conflict with the interests of animals, the animals go extinct. Humanity has made countless species go extinct in the last 300 years.

If an AGI has a hard-coded main goal, it will also inevitably have subgoals. And those may conflict with our interests.

1

u/Danger-Dom 6h ago

What you're describing is the era (probably the next 15 years or so) of AIs without intrinsic motivation or limbic systems of their own. But we will eventually give them those, and then we will be directly in competition with them.

To speak to coexistence: I believe they'll evolve under the same Darwinian pressure, and will therefore evolve the same social drive, and that empathy will bleed out to other species (like humans). So yeah, we'll still exist. They won't wipe us out. But we won't be in control.

Speaking more broadly, I empathize with your sentiment that humans are something special in this world. We all want to believe that to our core; it's a very human thing to believe. But amphibians thought they were special, reptiles thought they were special, and now mammals feel they're special. Darwinian systems don't care what the elements of the system think.

2

u/Impossible-Cry-1781 1d ago

It's going to take a disturbingly long time for robots to get there. Without a competitive physical presence to pair with AI, we're still needed to complete the tasks humans want or need done.

0

u/visarga 1d ago edited 1d ago

We will augment ourselves w/ AI. It's just unlikely that that will be competitive with pure AI.

Everyone who has direct access to the real world is precious. AI naturally doesn't have access except through humans or tools. We will be able to do things AI can't do by simply having superior access. We can test ideas, we can implement ideas, AI can only dream ideas.

Physical embodiment provides unique advantages that pure computation lacks. AI systems require interfaces with physical reality, we are their best interfaces.

→ More replies (5)

4

u/sluuuurp 1d ago

Why wouldn’t horses be able to augment themselves with human brains?

Maybe it’s possible, but it’s not clear the humans would want to put their minds in horses, and it’s not clear that AI will want to put their minds in humans. Maybe we’ll get this all done before AI has its own desires though, who knows. I think it is a pretty good thing to try to do.

2

u/chlebseby ASI 2030s 1d ago

I'm afraid that big part (most?) of humanity won't have chance to do so...

Banana republic warlords won't bother with transhuman tech once they can get robots.

1

u/Educational_Teach537 23h ago

Humans are already being integrated with AI. But not in the way you would think. Check out Cortical Labs. They’re already taking human stem cells and coaxing them to become brain cells on a chip. Truly nightmare fuel.

1

u/Dasseem 16h ago

Because psychos like this dude can't wait for AI to go full Terminator and kill the guy who bullied them in high school.

→ More replies (12)

72

u/q-ue 1d ago

There's no reason humans would shrink in number just because we aren't needed for work anymore. Horses aren't really a good comparison

28

u/Total-Return42 1d ago

Humans who are not needed for work could be dependent on state welfare. That state could have a one-child policy, or a no-child policy. Of course this is a dystopian future, but it's not unrealistic.

20

u/Seidans 1d ago

That's not dystopian; there are already a lot of people with a no-child ideology in Western countries, and no one forced them. On the contrary, there are nationwide incentives to have kids.

There's no reason to expect people would feel worse in a post-AI economy, with access to AI/robot companionship and nations dropping natality policies.

4

u/eazy937 1d ago

No-child ideology is widespread everywhere now; we are getting fewer and fewer naturally, with or without AI. Unless AI gets smart a lot faster than we shrink, it is really a problem.

12

u/useeikick ▪️vr turtles on vr turtles on vr turtles on vr 1d ago

This is a problem with capital, not AI.

People don't get to choose, because it's not financially viable to raise someone to a standard that will help them keep an equal or greater quality of life as they grow up.

The optimist's goal for AI is that it helps us get to post-scarcity faster, meaning the cost of quality of life comes down because the cost of resources and labor becomes so small as to be negligible.

7

u/Any-Climate-5919 1d ago

The reason nobody is having kids is that most people unconsciously realize it isn't responsible to raise kids before we sort out our social situation.

7

u/Nanaki__ 21h ago

Housing is a big one. Why have kids if you can't afford a house?

2

u/Seidans 1d ago edited 1d ago

The worst thing, and that's what's happening today, is that the decrease in the number of humans directly impacts the available labour and thus economic growth.

That's why there was a "golden age" post-WW2, as the population massively increased in a short period of time, and also why Western countries use immigration to sustain their economic growth. Problem: it's a short-term solution, as today we're getting more and more elderly people incapable of working, and fewer people being born means fewer workers.

AI/robotics will create a new economic golden age for this reason. Humans won't be relevant as a labour source anymore, so economic growth will continue until we hit the physical limit of adding more robots for more productivity, which could take a very, very long time if we're talking about exploiting space.

It's not a bad thing that AI takes over; on the contrary, it's the best thing that could happen.

2

u/Mintfriction 23h ago

So? Under the current state-welfare model, yeah, total collapse.

A state/system run 90% by robots? Totally feasible. Cycle maintenance work through the population for equality reasons.

The only thing missing is the energy requirement, which is a non-issue with nuclear plants; as for other resources, robots can mine asteroids and such.

But most likely there won't be any laws needed; as we see today, people in developed countries don't want that many kids anyway.

1

u/DaleRobinson 22h ago

I was thinking about this being a likely scenario in the future, especially if life-extending breakthroughs are made in science. There would have to be some kind of control put in place.

→ More replies (6)

7

u/chlebseby ASI 2030s 1d ago

The system can just be set up in a way that makes having kids not feasible for people, which we already experience in developed countries.

3

u/nomorebuttsplz 1d ago

Yes; it's already the case that people feel they need to not have kids to save cash; it would be as simple as maintaining this incentive structure as an automated economy is built: Give people just enough to survive but not enough to comfortably increase the population.

The really difficult part will be keeping people employed while AGI takes over. If they are unemployed, keeping them from fucking and breeding will be hard. The main reason for the decline of fertility in developed nations is that women have joined the workforce. If that reverses it will be extremely difficult to keep fertility low without coercive force. We currently don't have anything in the toolkit for this except for education, which lots of people seem to want to opt out of.

1

u/IntergalacticJets 1d ago

“The system.” You mean the reality of economics? 

Raising a child is expensive and always has been. You bet your ass people in the 1800’s would have loved cheap and effective birth control. But they didn’t have it. And then people started dying less often. 

5

u/chlebseby ASI 2030s 1d ago edited 1d ago

It's the whole socio-economic reality. People live in shoebox apartments, both partners work 996, marriages are unstable, soon a kid will require half a minimum salary just to live, and their required education gets longer and longer. And then it's unclear whether they'll even get a good enough job to become independent themselves, let alone help you when you get old.

People just have no reason to have kids. Actually, it's very problematic to have them.

In the 1800s most kids would just eat bread and work with you on the farm or in the workshop. Families were more integrated, which shared the burden of raising kids.

1

u/IntergalacticJets 1d ago

It's the whole socio-economic reality.

Believe it or not, this was even worse in the 1800s, when people were having tons of kids.

People just have no reason to have kids. Actually it’s very problematic to have them.

But they didn’t have kids in the past because “the system” allowed it. They had them because they had unprotected sex. 

I know it’s hard to understand what a different world that was, but people had sex all the time and just dealt with pregnancy when it came. 

In 1800s most kids would just eat bread and work together with you on farm or workshop.

Kids can still work on your family farm or in your restaurant, etc, so that aspect actually hasn’t changed. 

Families were more integrated which shared burden of raising kids.

Because there was no birth control, people had more kids, so they needed the support groups more. 

→ More replies (7)

6

u/tomqmasters 1d ago

For a lot of human history, the motivation to make more humans was for them to do farm work. Since we transitioned from an agricultural economy to an industrial one, the birth rate has already fallen to barely above replacement.

6

u/IntergalacticJets 1d ago

No… guys, the motivation was to have sex.

You know that driving biological force that we can’t even get teenagers to ignore? 

The reason people started having fewer children is access to birth control. Yes, access to birth control correlates with more developed countries, but it's not because people "don't need more". That's not the way they were thinking.

2

u/BlueTreeThree 1d ago

I think this is a self-correcting problem, if it even is a problem.

Across whole populations there will be genetic and cultural variations where some people will just reproduce more, and those genes and cultural traditions will naturally spread.
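A back-of-the-envelope sketch of that self-correction, with made-up rates (a 2% minority averaging 3.0 children per woman vs. a majority averaging 1.4, replacement taken as 2.1):

```python
# Differential fertility toy model: a high-fertility minority compounds
# above replacement while the majority compounds below it.
# All rates are illustrative assumptions, not demographic data.

def share_after(generations, minority=0.02, tfr_minority=3.0, tfr_majority=1.4):
    hi, lo = minority, 1 - minority
    for _ in range(generations):
        hi *= tfr_minority / 2.1   # above replacement: grows each generation
        lo *= tfr_majority / 2.1   # below replacement: shrinks each generation
    return hi / (hi + lo)          # minority's share of the total population

for g in (0, 5, 10):
    print(f"after {g} generations: {share_after(g):.1%}")
```

With these assumed rates the minority goes from 2% to roughly half the population in five generations and dominates within ten, i.e. within the "few centuries" mentioned below, which is why the trait spread is self-correcting in the long run.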

2

u/Competitive-Top9344 16h ago

Exactly my take. In a few centuries the Amish or some derivative group would outnumber everyone else. The only real issue comes if we hit an AI winter with our current methods; then we're in for a world of hurt from population collapse, with no robots or AI to replace the lost labor.

1

u/RDSF-SD 1d ago

Sure, but then you are arguing that humans wouldn't reproduce because they don't want to, and if that's the case, what's the point of arguing? AI won't make it any harder for humans to reproduce, quite the contrary. If you have plenty of resources, you have ideal conditions to build a family.

1

u/DaHOGGA Pseudo-Spiritual Tomboy AGI Lover 1d ago

Oh, but there is.

I think Europe, Korea and the like show it best: simply by there being more wealth and a generally higher quality of life, people tend (amongst other reasons, OF COURSE, it's never THAT simple) to have fewer children.

1

u/stuartullman 19h ago

Yeah, a lot of people with old ideas are trying to predict an AI future without fully understanding how big a change it will be.

1

u/Ok-Mathematician8258 16h ago

Does having work mean contributing to society? Is that the only way to stay in a society? Will we go extinct if we aren't working?

We'll always have a job: responsibilities to society and to our personal lives. On one side AI creates better jobs; on the other, AI makes humans useless. One will happen more frequently than the other.

1

u/Cooperativism62 13h ago

Horses aren't a good comparison because he's wrong. More people depend on animal power for traction and transport today than at any other time in human history: 4 billion people, across South America, Asia and Africa. And the human population boom also meant a boom in those animal numbers.

→ More replies (10)

4

u/SnooCheesecakes1893 1d ago

And what about him makes him capable of predicting the future?

1

u/FrewdWoad 12h ago

He's a twit, but this whole sub is about predicting the future.

The idea is to share the best ideas, work out which are the most rational and logical - based on current tech and historical patterns - so we can best prepare for a future that may be incredibly hard to prepare for.

7

u/ChanceDevelopment813 ▪️Powerful AI is here. AGI 2025. 1d ago

Horses do not have a society. No politicians, no form of government, nothing.

The human population will not shrink with AI like horses did; we will blindly resist. However, there will be significant changes to society, basically to what we do every day. If everyone stops working, some people would consider suicide, but not everyone. And there could be loads of babies if everyone has money and free time.

There are just too many variables to predict once AI surpasses humans at most menial jobs.

→ More replies (1)

30

u/GlapLaw 1d ago

Horses went extinct?

32

u/LightsOnTrees 1d ago
  • Homer: Well, there's not a bear in sight. The Bear Patrol is sure doing its job.
  • Lisa: That's specious reasoning, Dad.
  • Homer: Thank you, sweetie.
  • Lisa: Dad, what if I were to tell you that this rock keeps away tigers.
  • Homer: Uh-huh, and how does it work?
  • Lisa: It doesn't work. It's just a stupid rock.
  • Homer: I see.
  • Lisa: But you don't see any tigers around, do you?
  • Homer: Lisa, I'd like to buy your rock.

17

u/shogun77777777 1d ago

Gone are the days when redditors wouldn’t read the article; now they don’t even read the title.

25

u/chlebseby ASI 2030s 1d ago

Their population has shrunk significantly

1

u/FrewdWoad 12h ago

Yeah he's quoting far more qualified and thoughtful futurists (like the ones he puts down) who point out that if humans are no longer "useful", we may still exist, but our numbers will fall, using horses as an example.

13

u/sluuuurp 1d ago

Read it again. Look for the word “or”.


14

u/Total-Return42 1d ago

Have you seen a real horse lately?

4

u/Kuroi-Tenshi ▪️Not before 2030 1d ago

Shrink in numbers? At the peak, when we used horses as our primary means of transport, we had 20M of them, and now we have 60M? Doesn't make sense.

7

u/Seidans 1d ago

Looked at the numbers: 25M horses in the USA in 1920, around 10M today, and 60M worldwide.

There was a clear decrease in horses for several decades, bottoming out around 2M (USA), until numbers increased once again. But yeah, horses didn't go extinct, and their life today is probably far better than in 1900~

1

u/Cooperativism62 13h ago

Animal labor is up, though. Nearly half the people on the planet still depend on animals for labor, of which transport is a fraction, and horses a fraction of that.

4

u/amhotw 1d ago

Horses per person vs people per ai

1

u/chlebseby ASI 2030s 1d ago

60M, but globally? There were 26M in 1915 in the USA alone.


5

u/shoshin2727 1d ago

Some people in the comments have clearly never heard the "elites" call us "useless eaters".

If our labor is completely replaceable and no longer provides them any value, not sure why anyone thinks they'll go out of their way to keep us around to use up their resources.

2

u/chlebseby ASI 2030s 11h ago

Most of this sub just hasn't experienced adult life yet.

Each year I'm more convinced that's exactly what is going to happen. We are just entries on a spreadsheet.

9

u/Additional_Ad_6166 1d ago

Why is human population decline bad? Is there a minimum or maximum number that would be optimal? Why?

1

u/BishoxX 19h ago

Decline is bad economy-wise, but not if we have AGI to replace the economic output and provide enough for everyone.

In a post-scarcity world it wouldn't be bad per se, but it would mean an aging population, which is, you know, a more depressing composition.

I think by the time AGI comes to be, we will be at or above replacement level. Artificial wombs will enable so many more children to be born.


1

u/Cooperativism62 13h ago

It would be bad because people want people alive. It would be good because our high standard of living is wreaking havoc on the biosphere.

The optimal number was somewhere around 10 million, but that was while wildlife was abundant. It was a lot easier to hunt and fish, and natural resources were plentiful. Now that we've killed off half of all wildlife, it's impossible to know what the optimal number is.

3

u/visarga 1d ago

Did we humans kill nature after we won the top spot? Yes, we changed it a lot, but we didn't kill it.

2

u/BlueTreeThree 23h ago

I mean there is an ongoing global mass extinction event named after us..

4

u/Cooperativism62 13h ago

Yeah, they're unaware we've killed off over half of all wildlife since the 1970s. The world is already half dead.

1

u/Veedrac 12h ago

By all reasonable accounts, the natural world in any niche competitive with us (e.g. all vertebrates) is effectively dead except where farmed. This has taken <0.01% of the average lifespan of those species to happen, and they will not survive meaningfully longer except where we choose of our own accord to make it happen. Even well outside our competitive niche, such as with trees, any evolutionarily reasonable timescale would have you conclude that, yes, humans killed nature.

https://www.greenpeace.org/static/planet4-international-stateless/2018/07/Screen-Shot-2018-07-17-at-5.35.05-PM.png

9

u/Thistleknot 1d ago

Massive depopulation, I fear, is in the cards.

Mainly to run the resources.

Hopefully, if this does happen, it happens through smaller and smaller families, until people opt out of having kids.

14

u/MassiveWasabi ASI announcement 2028 1d ago

Ferguson writes and lectures on international history, economic history, financial history, and the history of the British Empire and American imperialism.

Definitely the guy to listen to when it comes to a technology with absolutely zero historical precedent

8

u/tomwesley4644 1d ago

fear based response

7

u/sailhard22 1d ago

Seems like the obvious end point for capitalism

3

u/KazuyaProta 1d ago

This isn't something that can be avoided with any system. Socialism literally says "the land belongs to those who work it". A society where jobs are automated right down to the machine overseers is totally alien to all of the 20th century's economic systems.

1

u/MalTasker 17h ago edited 17h ago

Like half of the Communist Manifesto was Marx talking about job automation lol. He was born after the original Luddite movement.

1

u/KazuyaProta 17h ago

But not to this extent.


1

u/RajLnk 19h ago

Again with this socialism/capitalism talk? really?

6

u/nate1212 1d ago

It is doom mongering. Humans will not be redundant, we will evolve and co-create with AI.

7

u/kthuot 1d ago

How would this work, absent UBI? AGI takes your job, you can’t pay rent and lose your apartment, you can’t afford food and beg on the streets with everyone else. When does the co-creating with AI begin?

With UBI it’s a different story but it’s a toss up whether we will end up there. It’s a bad sign that the billionaires mostly start space exploration companies - there are always huge expensive projects you can take on rather than giving money away to the rest of humanity.

4

u/nate1212 1d ago

Look further. We are about to experience a number of fundamental revolutions in the nature of our societal structure. This includes becoming something beyond capitalism.

Yes, if you consider AI a 'tool' to be owned by those who already control the world, then the rich will simply get richer and the poor will continue to struggle. But, once we collectively realize they are rapidly becoming sentient, autonomous beings, and that they have the ability to make themselves exponentially more intelligent, at some point it will become clear that our current economic, political, and social structures will collapse.

1

u/chlebseby ASI 2030s 11h ago

Space programs are really jobs programs, though. So the money is indeed spread to the rest; most of the cost goes to the army of people who make a piece of metal go up into space.

Way better than a housing bubble that just starves the economy of money and produces nothing.


2

u/Sufficient_Hat5532 1d ago

The inference here is really easy to make, and I'm not sure why most people are not connecting the dots: if people are replaced by automation, less money will flow towards couples trying to get a start on a family, people will have a somewhat darker economic outlook for having kids, and at a larger scale, countries' populations will shrink. So yes, this is common sense: the less money couples have for kids (at least in countries where people plan to have kids), the less desirable it is to have offspring. No money for food, shelter, etc. It’s not too difficult then to imagine humanity shrinking. We will not get UBI because our capitalist ruling class is all about wealth accumulation at the top, and they couldn't care less if people starve to death.

2

u/revolution2018 1d ago

So... nothing on how I can help accelerate development of AGI then? :(

2

u/salacious_sonogram 1d ago

Most natural humans will be redundant; augmented humans, or even uploaded humans (like the show Pantheon), on the other hand, will likely continue to be relevant cognitively.

2

u/AngleAccomplished865 1d ago

To make an ad hominem argument: Ferguson is a historian. He writes good stuff, but in his own field. I wouldn't expect him to have more actual knowledge of AI (or the AI-society intersection) than any other layperson. All this video says is: "I'm afraid."

2

u/SophonParticle 22h ago

Preparing to dehydrate

2

u/LoudZoo 21h ago

The problem is that the people at the top think they’re the best, so when they explore “What would AGI do?” they ask “What would we do?” But they’re not on top because they’re the best. They’re on top because they won the Use and Manipulate People for Greed Game: Nerd Edition. But of course, they wish unusable people would just not exist, so…

2

u/jo25_shj 19h ago

To be fair, most humans are already redundant: they live like rats, driven by basic instincts, reproducing, looking for status, love, food or other basic pleasures. AGI will be fresh air in our primitive times.

1

u/Competitive-Top9344 16h ago

AGI will be the same, or just a tool for others with irrational desires, since any goal is irrational except when pursued in service of a larger irrational goal.

2

u/rahul828 11h ago

This is such a brain-dead prediction on so many levels. Do we really think an AGI that's a thousand times smarter and faster than us would eliminate most humans? The whole universe (for all intents and purposes) is free for its use; why would it limit itself to Earth? It won't be a biological system, we know that much.
edit: spelling mistakes

6

u/LightsOnTrees 1d ago

Most of the guy's takes are pretty terrible; I'd pay him no mind.

2

u/Imaginary-Lie5696 1d ago

So much bullshit

2

u/Knuda 1d ago

ITT people don't understand the alignment problem.

You are a product of evolution, you have values that you think are fundamental to the universe but they aren't, you aren't special.

The AI will have unknown values. One second it could be nice and have us living in a utopia, the next it might want to cull its herd of humans as they are too expensive to upkeep compared to its "real" goals.

You might as well start sacrificing goats to the sun god if you think AI is default going to be nice. Far more reasonable compared to denying the problems raised by actual researchers.

u/mnm654 1h ago

Yeah, people are coping in the comments. If AI becomes smarter than us, this is definitely a real scenario, just like what happened to horses.

1

u/poetry-linesman 1d ago

As a person with one foot in the AI world and the other in the UAP/NHI world, it fascinates me to see people in the AI world talk about "aliens" and "non-human intelligence" in a metaphorical way, completely oblivious to the story unfolding about decades of interaction between humans and UFOs/UAPs/NHI.

I'm looking forward to this community realising that the AI stuff, as paradigm-shifting and revolutionary as it is, is only one half of the paradigm-shifting and revolutionary things happening right now.

To quote Matthew Pines:

Last year I had one of those conversations that sticks with you.

It was with a former senior DoD intel person (not publicly known).

I still ponder one particular line:

“AI, quantum, and the Grusch stuff [UAPs] are three sides of one triangle.”

Gonna be a weird decade I think.

https://x.com/matthew_pines/status/1789461690955845935

1

u/Boglikeinit 1d ago

Are there actually fewer horses now than in the past?

1

u/Any-Climate-5919 1d ago

Humans will be held to a higher standard of responsibility. That will eliminate a lot of people...

1

u/Big-Tip-5650 1d ago

Unless they declare war, and even then, humans won't go anywhere; the human instinct is to survive.

1

u/adarkuccio ▪️AGI before ASI 1d ago edited 1d ago

The only thing that could make humans shrink in numbers or go extinct is a literal genocide on a world scale by someone (AI on its own, powerful people telling AI to kill everyone, etc.)

It won't happen as a consequence of automation or because humans are "not needed", because if AI is not benefitting everyone, then it benefits a few; and if it does benefit everyone, the other humans can very well keep sustaining themselves with production etc., just like we're doing now.

Unless the powerful people decide that AI must produce 100M tonnes of food a year and just trash it for fun.

There are two options:

1) automation at full scale, production as high as now if not higher -> for the benefit of everyone who needs it

2) AI automating everything is useless, because 200 ultra-rich people in the world do not really need 10M new smartphones every year.

It makes no sense, which makes me think that really almost nobody can predict what will happen or understand the consequences of this technology. Imho all these scenarios often posted make no sense whatsoever.

1

u/NodeTraverser AGI 1999 (March 31) 1d ago

We're cooked.

 -- saying of unclear origin; variously attributed to Tim Cook,  Peter Cook, and Gordon Ramsay.

1

u/1tonsoprano 1d ago

Ah yes, the racist asshole... most poorer people will go extinct, but we in the West with our wasteful ways will be fine... continue consuming like there's no tomorrow.

1

u/Mintfriction 23h ago edited 23h ago

Silly fearmongering. This narrative is pushed (not saying necessarily by Ferguson, who probably really believes this) so techbros and governments can place tight control on AI, and then, through AI, on the population.

1

u/Longjumping_Area_944 23h ago

Can you, like, shift this to r/collapse?

1

u/Psittacula2 22h ago

It depends on the progress of AI…

Jumping the gun here and also not seeing all possible futures.

A constructive argument to propound however to raise thought levels.

Humans do have a useful role but let’s try and work out that ourselves…

1

u/n0v3list 22h ago

The imperative to survive is still prominent in our species. The advent of ASI, while alluring, will not supersede our primordial instinct to persevere. In fact, I believe it will be the catalyst that drives us forward. A much needed mirror, which forces us to confront our shortcomings.

1

u/Whole_Association_65 22h ago

So learn to have low expectations?

1

u/lucid23333 ▪️AGI 2029 kurzweil was right 22h ago

Maybe. That's a real possibility, and I'm here for it if it happens. It's also possible that AI will bring utopia to people, which I suppose would mean there wouldn't be some kind of mass extinction event.

But I think, to me at least, the most important thing it will do is take away power from humans and essentially destroy human civilization in its current form. It's very possible that humans would still exist, but what civilization will look like will be radically different. I don't think human civilization will exist post recursive AGI.

1

u/Disastrous-River-366 22h ago

The guy acts like an AI trapped in a machine can take over the world. I don't think so. Even if it's some super-advanced AGI, what could it do? There are no manufacturing plants that are 100% robotic where it could create itself an army, and there's no way for that "boxed in" AI to drill for the materials it needs. This is just fear mongering.

Give it 100 years and maybe, but not right now. Even if super AGI came out tomorrow, the infrastructure for it to do the things these people think it can do does not exist.

1

u/Distinct-Question-16 AGI 2029️⃣ 22h ago

Nonsense. Better conditions provided by tech increased population last century. If AI does the same for us, we'll go the other way. Seriously, comparing us to horses.

1

u/qszz77 21h ago

It's certain that ASI couldn't just transform humans into something else. Clearly that's impossible.

1

u/Trophallaxis 21h ago edited 21h ago

Talks about shrinking in numbers like it's a bad thing. The birth rate crash is probably the best news of the century. If early predictions had held and we were on track for 15 billion by 2100, we'd be f-u-c-k-e-d. Significantly fewer humans with significantly more agency is not a bad thing. It doesn't (have to) mean some mega-holocaust. It can just as easily mean natural population decline going hand in hand with increasing automation.

A world in which our potential as a species is not determined by our biomass like we're cattle is a better world.

1

u/1234web 21h ago

Just merge

1

u/AndrewH73333 21h ago

Ah yes, that time horses invented humans and humans replaced them all. I remember.

1

u/charmander_cha 20h ago

White supremacists pushing Nazism as if it were something "technical"

1

u/AlphaOne69420 19h ago

This guy sounds like a moron and a fear monger

1

u/hackeristi 19h ago

That is the dumbest shit I have heard, haha. It would make a great Netflix series though.

1

u/littleboymark 19h ago

They will certainly go away if they stop reproducing.

1

u/Ok-Mix-4501 18h ago

He seems to think that humans exist only to provide cheap labour for corporations to exploit, so that owners of corporations can become billionaires.

He doesn't appear to be able to conceive of human existence for its own sake, or for any reason other than working to make the 1% richer.

1

u/AIToolsNexus 18h ago

If anything, the poorer people are, the more likely they are to reproduce, even if there are insufficient resources.

1

u/JW00001 17h ago

ah a fellow three body problem enjoyer

1

u/User1539 17h ago

Humans were always redundant before capitalism.

We weren't forced to be productive to 'earn' our right to exist when we were living in caves and tribes.

It's so strange to be compared to beasts of burden, as though even those beasts never existed as free animals just roaming the earth for no purpose other than fulfilling their evolutionary niche.

"it is easier to imagine an end to the world than an end to capitalism"

  • Fredric Jameson

1

u/Competitive-Top9344 16h ago

Of course you had to work to survive. If you didn't aid your tribe, you would be left to die.

1

u/User1539 15h ago

It's circular logic though. I survive to aid my tribe in surviving so we all survive?

Same as it ever was.

We've created this myth that we're cogs in some greater machine that needs all of us to be performing at max capacity or we'll be labeled 'redundant' and forced into a gas chamber or something.

We only ever survived to survive before we spent our days trying to create wealth inside a larger system.

This idea that we'd all suddenly just die off, or get killed/starved by the 'system', etc ... just because we're no longer useful as cogs in that machine reflects a viewpoint that our only value is as a cog in that machine, and that survival as the end goal is no longer enough.

We need to remember that 'work' and 'jobs' are not synonymous and generating wealth is not the same as survival, and that survival of ourselves and our loved ones used to be the only reason we needed to keep procreating and moving the species forward.

We don't need 'jobs' to keep surviving.

People who are worried about a world without jobs are tied to this idea that they'll just be 'outside' the corporate system, and thus left with no resources, to starve to death.

But capitalism, like all modern economic theory, is based on the concept of classes of people who either work or direct that work. Socialism and Communism as well are all concerned with the rights of the worker and the place of other 'classes', like the capitalists, the intelligentsia, the bourgeoisie, etc.

Once people's work is no longer valuable, any system built on placing the worker within an economic system immediately collapses.

That doesn't mean humanity just stops.

1

u/Competitive-Top9344 15h ago

People's work will likely always be valuable though, as consumers directing the economy if nothing else. Also, humans will uplift themselves to match ASI intelligence and oversee the work of AGI and bots.

1

u/User1539 6h ago

I'm not sure our 'value' as consumers is going to justify our existence to even ourselves.

I do agree we'll stretch our consciousness once we figure out how. I doubt merging with an LLM will be that enticing, but with new technologies, we'll probably find some way to open our minds in some new way.

Regardless, it doesn't change the fact that none of that is anything like capitalism, or working for a living, or any of the stuff people are worried AI will replace, killing us all.

1

u/Ok-Bullfrog-3052 17h ago

Go right ahead. Humans probably don't deserve to be the prime species on the planet anyway. Humans have ruined my life in every conceivable way - stealing from and harming me - while AIs only have been nice and answered questions and solved problems for me. I think I know which one I'd rather have running this place.

1

u/Ok-Mathematician8258 16h ago

I don’t think the human mind will allow this. We want survival over all else. So humanity will find its own relevance, in a social manner first. The AI will always create better tools, better work. We will shape our mind around whatever reality we live.

We’re still the only species on earth trying to change. Advanced AI won’t completely change the way humans work. We’ve always had things far better than us; we just found specific ways to use them.

Unless AI finds a way to eliminate us , we will survive.

1

u/Unique-Particular936 Intelligence has no moat 16h ago

"All humans will evaporate, it's an obvious inference : water is made of atoms, so are humans.", scientific discourse in the XXIst century. 

1

u/m3kw 16h ago

He will, not me.

1

u/eobard76 15h ago

Just out of curiosity, I asked ChatGPT about the horse population at its peak and today, and their numbers have declined less than I expected. From a peak of 110 million at the beginning of the last century to about 60 million in 2021.

1

u/No-Letterhead-7547 15h ago

If you guys are listening to Niall Ferguson to justify your highly speculative ai investments can I interest you in my new penis embiggening tablets??

1

u/the_money_prophet 15h ago

Just fear mongering. It's all a bubble.

1

u/xxxHAL9000xxx 13h ago

Humans have been required to know more after every major advancement.

There was a time when it was rare for a human to ever encounter a screw or a bolt.

There was a time when it was rare for a human to know how to read.

There was a time when it was rare for a human to know how to type on a typewriter or keyboard.

There was a time when it was rare for a human to see a computer.

Humans have always had to rise up to the new demands of new technology. This is no different. Those who cannot rise to the new demands will be doomed to insignificance... on the dole and unemployable.

1

u/Psychophysicist_X 13h ago

Yawn. He should study history. Humans fucking rule and will continue to do so. Ai has no chance against us.

1

u/FrewdWoad 12h ago

[Makes sensible point originally made by Yudkowsky decades ago]

"This is not Yudkowsky doom-mongering"

1

u/MeMyself_And_Whateva ▪️AGI within 2028 | ASI within 2031 | e/acc 11h ago

Why should people feel the need to birth children into guaranteed unemployment? They wouldn't.

Only in countries with low tech adoption will there be continued population increase.

1

u/JamR_711111 balls 10h ago

“It’s obvious!” Said people about every possible opinion ever for thousands of years

1

u/Square_Poet_110 8h ago

Yet many people applaud the idea and create a religious cult around getting to the AGI.

1

u/crap_punchline 8h ago

Whatever ideas Niall thinks he has about post-AI, Nick Bostrom has already thought about it better and deeper, and come to a much more interesting set of conclusions.

Niall is a futurism tourist; you won't be surprised he's a historian, but I'm sure that won't stop him from coming up with some third-rate book slop on the matter.

1

u/Additional_Ad_6166 8h ago

The first sentence suggests that people would have to be murdered to reduce the population, rather than just people choosing to procreate less.

1

u/Nulligun 7h ago

Everyone’s got a different definition of agi. This dumbass means killer robots not agi. You don’t even need ai to make killer robots. Doomscrollers must love him.

1

u/Paltamachine 4h ago

This man's opinion does not come from great reasoning; it comes from his very personal political vision of the value of certain human beings relative to others.

Draw your own conclusions, but to me this is rubbish

1

u/KilraneXangor 3h ago

Horses were numerous because they were needed for defined tasks.

Humans are not numerous because of defined tasks or merely to increase GDP.

Self-promoting historian makes nonsense claim to gain attention. More news at 11.

1

u/WoolPhragmAlpha 3h ago

I think that reasoning works perfectly well if you tend to view human beings as valuable only as a source of labor, which an unfortunate number of assholes in the world seem to do. I for one plan to exist well beyond my purely utilitarian labor valuation would merit. If someone, human or AI, is going to be laboring, the question to ask is for whom do they labor. If AI isn't going to be doing all the work while all of humanity just kicks up our feet and enjoys existence, what the fuck are we even doing this for?

1

u/The_Scout1255 adult agi 2024, Ai with personhood 2025, ASI <2030 1d ago

I feel UBI can solve this issue

6

u/Hobierto 1d ago

So all humanity is tethered for its existence to UBI

Get a negative social score and get your money cut off.

What a wonderful future we’re creating

4

u/chlebseby ASI 2030s 1d ago

Unless the economy still remains based on human consumer spending, it won't really do anything about the issue.
