
If AI produces better Einsteins, etc -- does the argument really hold that humans need to keep having kids for civilization to continue to prosper?
I don't think we need population growth for humanity to continue prospering as it is.
As I see it, the issue is more that the purpose of prosperity is to enable human flourishing. Fewer people means less flourishing (other things equal).
reply
Do you think the world would be a better place with fewer people?
reply
Not in general. It would be a better place without a small number of particular people, though.
reply
That is true. The politicians, perhaps?
reply
Perhaps
reply
Yes. We shall find out next year after WW3 breaks out. It will be much worse for a time, and then better, for the survivors.
reply
No way WW3 is happening. It's just a small skirmish.
reply
I’m afraid it’s more likely than you believe. We can only hope it’s a minor conflict. But no, it’s looking more like a global conflict between bifurcated economies that will take out a large slice of the world's population.
Which is sad, because we just got through a “once in a generation” pandemic only to blow ourselves up with the weapons of war.
History repeats itself in cycles.
reply
Not only do I agree that WW III is likely, I think we’re probably already in it.
Remember that the date given in hindsight for both previous world wars is long before anyone thought they were in such a conflict.
reply
Even you believe there will be a WW3?
reply
Yeah, I think we're in it already and have been for a few years.
I think you are right.
reply
WW3 where? Ukraine vs Russia? Israel?
reply
China+Russia vs NATO. Other belligerents will take sides in the conflict, as in WW2.
“Everywhere”
reply
I don't know... I don't think the whole world will wrap itself into war.
Not if AI can solve longevity and humans can live 200 years.
reply
This is not possible; you have to look at what we actually call 'AI'. The word 'intelligence' in that context is not equivalent to intelligence in the context of mammals. There have been discussions and suggestions to call these models not AI but PP (probability programs), and to reserve 'AI' for a real AGI (artificial general intelligence). Furthermore, it is absolutely unclear whether it is even possible to develop AGIs. No programmer can speak authoritatively in this area without equally in-depth knowledge of psychology and biochemistry; at the same time, a psychologist and/or biochemist would need the corresponding programming skills, though what's required actually goes far beyond simple programming. Unfortunately, there are very few people like that in the world. It remains to be seen what results current research will deliver over time.
reply
31 sats \ 1 reply \ @freetx 19 May
Furthermore, it is absolutely unclear whether it is even possible to develop AGIs.
Yes, I think we need a "modified Turing test". Sure, it can be difficult or impossible to tell the difference between an LLM and a human; however, that LLM was seeded with conversation from conscious, intelligent humans, which renders the test invalid.
However, all this is a moot point. Humans (especially since giving up God) love worshiping the various idols they create... therefore it's conceivable that a huge part of humanity decides LLMs are in fact "intelligent" and even "conscious" and treats them accordingly.
reply
I will try to find prompts that get GPT-4o to reveal that it's just code, or even send it into a loop.
reply
Something is wrong with the world if it is becoming too costly to have kids.
reply
I know it's a bit sci-fi out there, and I'm not sure the AI we see now will get to this point.
But sometimes I think that if AI does reach the general-AI point, it's actually the collective thinking of humanity, and it extends that thinking beyond the limitations of biology. If we're merely data, we can dematerialize the human experience and move beyond our current limitations.
We could transport that data to new worlds, reform it there, and start again, expanding the pursuit of humanity to a wider universe with more resources to keep on exploring.
reply
This hinges on a good definition of civilization level prosperity.
I think humans will be happier in the company of their own kind, prospering with their own kind, but civilization prospering might not require that humans be maximally happy.
reply