Is Artificial General Intelligence around the corner? – Cisco Blogs by Utkarsh Srivastava
Intelligent machines have helped humans achieve great things. Artificial Intelligence (AI) combined with human expertise has produced quick wins for stakeholders across multiple industries, with use cases ranging from finance to healthcare to marketing to operations and more. There is no denying that AI has enabled faster product innovation and a richer user experience. These use cases include context-aware marketing, sales forecasting, conversational analytics, fraud detection, credit scoring, drug testing, pregnancy monitoring, and self-driving cars, in a seemingly never-ending list of applications.
Undoubtedly, AI-powered systems are a boon, but what comes next? Today's AI-powered systems are confined to the niche they were built for. Imagine a smart machine that could learn the way humans do and apply lessons from one domain to another. That might sound exciting and scary at the same time. Within its niche, AI already does things humans do, often far better. Present-day AI-powered systems can detect tumors more accurately than human doctors, design better AI algorithms than human developers, and defeat world champions in games like chess. Instances like these may lead us to believe that perhaps there is nothing AI-powered systems can't achieve.
For instance, the deep learning algorithms leveraged by social networking sites are rapidly improving at identifying individuals, objects, and even specific features of those objects and individuals. Modern computer vision systems driven by deep learning can now recognize people in social media photographs, their position in the picture, their faces, and even accessories they might be wearing. This lets AI-powered systems interpret images much as humans do. Such systems can even go a step further, picking up subtle patterns to detect non-obvious characteristics as well.
With such advancements, the gap between human intelligence and artificial intelligence seems to be closing rapidly. Artificial General Intelligence (AGI) is a machine's ability to understand or learn any intellectual task that a human being can. While the applications above demonstrate AI's ability to execute certain tasks with greater efficacy than humans, they are not generally intelligent: they might be very good in one domain but of no use outside their niche. As a result, while an AI-powered system can match a hundred highly qualified humans at one task, it may prove less capable than a ten-year-old child at another. An individual, on the other hand, can perform a far wider variety of tasks than any AI-powered system, though with lower efficacy at each.
Researchers strongly believe that "transfer learning" will play a major role in the successful implementation of Artificial General Intelligence (AGI). Demis Hassabis of DeepMind calls transfer learning the key to general intelligence. Transfer learning is a machine learning technique in which a model trained on one task is repurposed for a second, related task. The idea is that, with the prior knowledge learned from the first task, the system will perform better, train faster, and require less labeled data than a new neural network trained from scratch on the second task.
While an AI system needs to be trained on massive volumes of data, humans can learn from comparatively few experiences. Furthermore, humans generalize their learning, adapting what they have learned in one incident or domain to related ones. Following a similar approach, Artificial General Intelligence agents could not only train on less data but also transfer knowledge from one domain to another. This would let systems learn much the way humans do, minimizing training time while allowing the machine to acquire multiple areas of competency.
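The transfer-learning idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration on synthetic data (the layer sizes, tasks, and hyperparameters are all assumptions, not from the article): a tiny two-layer network learns a feature layer on task A with plenty of labels, and then only a new output head is trained on a related task B with far fewer labels, while the feature layer stays frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def with_bias(X):
    # Append a constant column so the network has bias terms.
    return np.hstack([X, np.ones((len(X), 1))])

# --- Task A: plenty of labeled data, train the whole network ---
X_a = with_bias(rng.normal(size=(400, 3)))
y_a = (X_a[:, 0] + X_a[:, 1] > 0).astype(float)

W_feat = rng.normal(scale=0.1, size=(4, 8))   # shared feature layer
w_head_a = rng.normal(scale=0.1, size=8)      # task-A output head
lr = 0.5

for _ in range(500):
    H = relu(X_a @ W_feat)
    err = sigmoid(H @ w_head_a) - y_a         # cross-entropy gradient
    dH = np.outer(err, w_head_a) * (H > 0)    # backprop through ReLU
    w_head_a -= lr * H.T @ err / len(y_a)
    W_feat -= lr * X_a.T @ dH / len(y_a)

# --- Task B: a related task with far less labeled data ---
X_b = with_bias(rng.normal(size=(40, 3)))
y_b = (X_b[:, 0] + X_b[:, 1] > 1).astype(float)  # shifted threshold

W_frozen = W_feat.copy()                      # reuse features, no retraining
w_head_b = rng.normal(scale=0.1, size=8)      # only the head is trained

for _ in range(500):
    H = relu(X_b @ W_frozen)
    w_head_b -= lr * H.T @ (sigmoid(H @ w_head_b) - y_b) / len(y_b)

acc_b = np.mean((sigmoid(relu(X_b @ W_frozen) @ w_head_b) > 0.5) == y_b)
print(f"task-B training accuracy with transferred features: {acc_b:.2f}")
```

In a real system the frozen feature layer would be a large pretrained network (for example, an image model pretrained on millions of photos), but the mechanism is the same: keep the learned representation, retrain only a small task-specific head on the new, smaller dataset.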
AI-powered systems, and especially Artificial General Intelligence systems, are designed with the human brain as their reference. Theoretically speaking, it is possible to develop systems with computational capabilities similar to the human brain's: by the Church-Turing thesis, given enough time and memory, anything that is computable at all can be computed algorithmically. Deep Learning (DL) and Natural Language Processing (NLP) are two examples in which such complex algorithms already leverage large amounts of memory and compute.
Although Artificial General Intelligence seems theoretically possible, in practice we are still a long way from it. Yet considering the rapid pace of advances in AI, the future is promising. Some experts have predicted that Artificial General Intelligence could be achieved as early as 2030, while a recent survey of AI experts put the expected emergence of AGI, or the singularity, around 2060. Though the journey seems long, the exponential advance of AI research may culminate in Artificial General Intelligence sooner than we imagine. That would mean a machine partner as smart as a human, able to collate, perceive, and respond to external stimuli, and one that could be leveraged to tackle a myriad of the world's problems far more efficiently, with multifold benefits. The promise of AI is that it will augment what humans do and improve our lives. So as scary as it may seem, Artificial General Intelligence is coming. I'm choosing to embrace the change and look forward to the future. How about you?