Progress in this area is steady, driven mainly by increased computing power, for example from video cards, which are well suited to this type of application. Computing power still roughly doubles every two years, thanks in part to TSMC in Taiwan. But we will likely reach a limit on shrinking the circuits on chips around 2030. After that, application-specific hardware can perform certain calculations more efficiently than a general-purpose CPU or even a GPU.
Artificial intelligence was also a sensation in the 1980s, when grand predictions were made about what might become possible in the foreseeable future. Then came a wave of hype after the millennium, and now the current one. The word "hype" can sound a bit negative, but the predictions have often been exaggerated.
Intelligence is about understanding, and computers don't understand anything. Computers can easily defeat people at chess, for example, but they do not understand the game itself. Whenever a machine masters such a task, we tend to stop calling that task intelligence; this is known as the "AI effect."
Here is an example from a leading mathematician:
https://www.npostart.nl/v…/24-01-2021/VPWON_1322203 (from 28:10)
Intelligence is a unique phenomenon on this planet, and we don't fully understand how our brains work, or how the minds of the more intelligent animal species work. Real AI could one day become possible, I don't rule it out, but it is at least another decade away, if not more.