Artificial intelligence (AI) is intelligence exhibited by machines. The following is a cluster of topics in or related to AI.
Artificial intelligence is classified into types based on the degree to which an AI system can replicate or go beyond human capabilities. One classification system uses four types: reactive machines, limited memory machines, theory of mind, and self-aware AI. Another classification divides AI into two categories: weak AI (also called narrow AI) and strong AI (also called general AI or artificial general intelligence). Branches of AI are often named for the method used to achieve it.
Machine learning is an approach to realizing AI in which machines are given access to data from which they learn for themselves, rather than being explicitly programmed.
Tools, algorithms, libraries, and interfaces for machine learning.
Artificial neural network (ANN) processing devices can be algorithms or actual hardware that are loosely modeled after the neuronal structure of the mammalian cerebral cortex. Neural networks are used in the branch of machine learning called deep learning. The following are types of neural networks used in machine learning as well as topics associated with neural networks.
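The loose modeling on biological neurons can be illustrated with the simplest possible unit. Below is a minimal sketch (pure Python, with illustrative names such as `train_perceptron` that do not come from any library) of a single artificial neuron: inputs are weighted, summed with a bias, passed through an activation function, and the weights are nudged by an error-driven update, here the classic perceptron rule.

```python
# A single artificial neuron (perceptron): weighted inputs pass through
# a step activation, and weights are adjusted by an error-driven update.

def step(x):
    # step activation: fire (1) if the weighted sum is non-negative
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            err = target - pred
            # perceptron rule: move weights toward reducing the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND function, which is linearly separable.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data]
```

Deep learning stacks many such units into layers and replaces the step function and perceptron rule with differentiable activations trained by back-propagation.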
A deep learning framework is an interface, library, or tool that allows users to build deep learning models more easily and quickly, without getting into the details of underlying algorithms. Such libraries are useful for individuals who want to implement deep learning techniques but lack robust fluency in back-propagation, linear algebra, or numerical computing. These libraries provide pre-written code for functions and modules that can be reused for deep learning training for different purposes.
Reinforcement learning is an area of machine learning focusing on developing agents that can learn from their environment over time by taking actions and receiving rewards. The following are algorithms, tools, and research topics related to reinforcement learning.
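The learn-by-acting loop can be made concrete with tabular Q-learning, one of the foundational reinforcement learning algorithms. The sketch below (a toy environment invented for illustration, not from any RL library) has an agent walk a five-cell corridor where only the rightmost cell yields a reward; over many episodes the learned Q-values come to favor moving right.

```python
import random

# Tabular Q-learning on a 5-cell corridor. States 0..4; state 4 is the
# goal and yields reward 1. The agent explores epsilon-greedily and
# updates Q(s, a) toward reward + gamma * max_a' Q(s', a').

N_STATES = 5
ACTIONS = [-1, +1]            # move left, move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.2

random.seed(0)
Q = [[0.0, 0.0] for _ in range(N_STATES)]

for _ in range(200):          # episodes
    state = 0
    while state != N_STATES - 1:
        # epsilon-greedy action selection (random on ties)
        if random.random() < EPSILON or Q[state][0] == Q[state][1]:
            a = random.randrange(2)
        else:
            a = 0 if Q[state][0] > Q[state][1] else 1
        next_state = min(max(state + ACTIONS[a], 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # Q-learning update
        Q[state][a] += ALPHA * (reward + GAMMA * max(Q[next_state]) - Q[state][a])
        state = next_state

# Greedy policy read off the learned table (excluding the goal state).
policy = ["right" if q[1] >= q[0] else "left" for q in Q[:-1]]
```

The reward signal is received only at the goal, yet the discount factor propagates value backward through the table, which is the core idea of learning from delayed rewards.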
Supervised learning is a type of machine learning in which data is fully labeled, and algorithms learn to approximate a mapping function well enough that they can accurately predict output variables for new input data. This section contains supervised learning techniques. For example, a support vector machine (SVM) is a discriminative classifier formally defined by a separating hyperplane and used for classification and regression tasks.
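A minimal sketch of the SVM idea follows (toy hand-made data, pure Python; real implementations use quadratic-programming or specialized solvers rather than the simple sub-gradient descent shown here). A linear SVM is trained by sub-gradient descent on the regularized hinge loss, producing a hyperplane w·x + b = 0 that separates two labeled clusters.

```python
# Linear SVM via sub-gradient descent on the hinge loss.
# Labels are +1 / -1; the learned hyperplane w.x + b = 0 separates them.

def train_linear_svm(points, labels, epochs=200, lr=0.01, reg=0.01):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            # sub-gradient of  reg/2 * ||w||^2 + max(0, 1 - margin)
            if margin < 1:
                w[0] += lr * (y * x1 - reg * w[0])
                w[1] += lr * (y * x2 - reg * w[1])
                b += lr * y
            else:
                w[0] -= lr * reg * w[0]
                w[1] -= lr * reg * w[1]
    return w, b

# Two linearly separable clusters.
points = [(1.0, 1.0), (1.5, 1.2), (1.2, 0.8), (4.0, 4.0), (4.5, 4.2), (3.8, 4.4)]
labels = [-1, -1, -1, +1, +1, +1]
w, b = train_linear_svm(points, labels)

def predict(p):
    # which side of the hyperplane the point falls on
    return 1 if w[0] * p[0] + w[1] * p[1] + b >= 0 else -1

preds = [predict(p) for p in points]
```

The hinge loss penalizes points that fall inside the margin, which is what pushes the hyperplane toward the maximum-margin separator.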
A decision tree is a model for supervised learning that can construct a non-linear decision boundary over the feature space. Decision trees are represented as hierarchical models of "decisions" over the feature space, making them powerful models that are also easily interpretable.
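The interpretable "decisions over the feature space" can be shown with the smallest possible tree, a single node (a decision stump). The sketch below (toy data; real libraries such as tree learners score splits with Gini impurity or information gain rather than raw accuracy) scans candidate thresholds on one feature and keeps the split that best separates the labels.

```python
# Learning a single decision-tree node: pick the threshold that best
# separates the labels, predicting 1 for values >= threshold.

def best_split(values, labels):
    best = None
    for t in sorted(set(values)):
        correct = sum((v >= t) == bool(y) for v, y in zip(values, labels))
        acc = correct / len(values)
        if best is None or acc > best[1]:
            best = (t, acc)
    return best

# Exam scores and pass/fail labels; the best split lands at 61.
scores = [35, 42, 55, 61, 70, 88]
passed = [0, 0, 0, 1, 1, 1]
threshold, accuracy = best_split(scores, passed)
```

A full decision tree applies this search recursively to each resulting subset, building up the hierarchical, easily read model the paragraph above describes.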
Unsupervised learning is a branch of machine learning focused on structuring data that has not been labeled. The following are methods used in unsupervised machine learning.
In unsupervised machine learning, clustering is the process of grouping similar data points together in order to discover structure in unlabeled data.
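The best-known clustering algorithm, k-means, can be sketched in a few lines (toy one-dimensional data with hand-picked initial centroids; the function name `kmeans_1d` is illustrative). It alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points.

```python
# A minimal k-means sketch on 1-D data.

def kmeans_1d(points, centroids, iters=10):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # assignment step: attach the point to its nearest centroid
            idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # update step: each centroid moves to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

points = [1.0, 1.2, 0.8, 8.0, 8.4, 7.6]
centroids, clusters = kmeans_1d(points, [0.0, 10.0])
```

With two well-separated groups the centroids settle near 1.0 and 8.0, grouping the similar points together without any labels being supplied.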
Ensemble methods are meta-algorithms that combine several machine learning techniques into one predictive model. The purpose is to decrease variance (bagging), bias (boosting), or improve predictions (stacking).
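The bagging case can be sketched directly (toy data; the resample index lists are fixed by hand so the example is reproducible, whereas real bagging draws them at random with replacement). Several weak "stump" classifiers are trained on bootstrap-style resamples of the data and combined by majority vote, which is the mechanism that reduces variance.

```python
# Bagging: train threshold stumps on resamples, combine by majority vote.

def train_stump(xs, ys):
    # pick the threshold with the fewest training errors,
    # predicting 1 for x >= threshold
    best_t, best_err = None, None
    for t in sorted(set(xs)):
        err = sum((x >= t) != bool(y) for x, y in zip(xs, ys))
        if best_err is None or err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(stumps, x):
    votes = sum(x >= t for t in stumps)
    return 1 if 2 * votes >= len(stumps) else 0

xs = [1, 2, 3, 6, 7, 8]
ys = [0, 0, 0, 1, 1, 1]

# three bootstrap-style resamples (index lists fixed for reproducibility)
resamples = [[0, 1, 3, 4, 5, 5], [0, 0, 2, 3, 4, 5], [0, 2, 4, 4, 5, 1]]
stumps = [train_stump([xs[i] for i in idx], [ys[i] for i in idx])
          for idx in resamples]

preds = [bagged_predict(stumps, x) for x in xs]
```

Boosting instead trains the weak learners sequentially, reweighting toward previously misclassified points, and stacking feeds the base models' outputs into a second-level model.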
In machine learning classification problems there can be too many factors, or variables, also called features. When many of the features are correlated or redundant, dimensionality reduction algorithms are used to reduce the number of random variables, either by keeping a subset of the original features (feature selection) or by deriving new features from them (feature extraction).
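Feature extraction can be illustrated with principal component analysis (PCA), the most common dimensionality reduction technique. The sketch below (toy 2-D data; `pca_first_component` is an illustrative name, and power iteration stands in for the full eigendecomposition used in practice) centers the data, forms the covariance matrix, finds its dominant eigenvector, and projects the 2-D points down to one dimension along it.

```python
# PCA down to one dimension: find the direction of greatest variance
# (dominant eigenvector of the covariance matrix, via power iteration)
# and project the centred points onto it.

def pca_first_component(points, iters=100):
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centred = [(x - mx, y - my) for x, y in points]
    # entries of the 2x2 covariance matrix
    cxx = sum(x * x for x, _ in centred) / n
    cxy = sum(x * y for x, y in centred) / n
    cyy = sum(y * y for _, y in centred) / n
    # power iteration: repeatedly apply the matrix and renormalise
    vx, vy = 1.0, 0.0
    for _ in range(iters):
        wx = cxx * vx + cxy * vy
        wy = cxy * vx + cyy * vy
        norm = (wx * wx + wy * wy) ** 0.5
        vx, vy = wx / norm, wy / norm
    projections = [x * vx + y * vy for x, y in centred]
    return (vx, vy), projections

# Points lying almost exactly on the line y = x.
data = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2), (4.0, 3.8)]
(vx, vy), proj = pca_first_component(data)
```

Because the points nearly lie on y = x, the first component comes out close to the diagonal direction, and the 1-D projections preserve almost all of the original variance.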
Machine learning models are parameterized to tune their behavior for a given problem. Noise contrastive estimation (NCE) is an estimation principle for parameterized statistical models. Noise contrastive estimation is a way of learning a data distribution by comparing it against a defined noise distribution. The technique is used to cast an unsupervised problem as a supervised logistic regression problem. It is often used to train neural language models in place of Maximum Likelihood Estimation.
Computer vision is the ability of artificially intelligent systems to “see” like humans. In the field of computer vision, machines are developed to automate tasks that require visual cognition. Deep learning and artificial neural networks are used to develop computer vision. The following are topics related to computer vision, as well as tools and libraries. Companies developing or selling computer vision products are listed under the Computer Vision subheading of the AI applications and companies section.
Natural language processing is a branch of AI that helps computers understand, interpret, and manipulate human language. The following are tools and topics related to NLP. Natural language processing companies developing or selling NLP applications are found in the AI applications and companies section, under Natural language processing.
Advances in deep learning are expected to increase understanding of quantum mechanics, and it is thought that quantum computers will accelerate AI. Quantum computers have the potential to surpass conventional ones in machine learning tasks such as data pattern recognition. The following are topics, companies, and technologies that link quantum computing and AI.
Semantic computing deals with the derivation, description, integration, and use of semantics (meaning, context, and intention) for resources including data, documents, tools, devices, processes, and people. Semantic computing includes analytics, semantics description languages, integration of data and services, interfaces, and applications. In AI, semantic computing involves the creation of ontologies that are combined with machine learning to help computers create new knowledge. Semantic technology helps cognitive computing extract useful information from unstructured data in pattern recognition and natural language processing.
The Internet of Things (IoT) refers to physical objects that connect to the internet and share data with other devices. IoT-based smart systems generate a large volume of data, including sensor data valuable to researchers in healthcare, bioinformatics, information sciences, policy and decision making, government, and enterprises. AI and machine learning techniques can be applied to this data for analysis and prediction.
While some lines of AI research aim to simulate the human brain, the artificial life, or animat, approach is concerned with the conception and construction of artificial animals as simulations or actual robots. It aims to explain how certain faculties of the human brain might be inherited from the simplest adaptive abilities of animals. Evolutionary computation is a generic optimization technique that draws inspiration from the theory of evolution by natural selection.
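Evolutionary computation in its simplest form can be sketched as a (1+1) evolution strategy (a toy illustration, with a hand-picked fitness function): a single candidate solution is repeatedly mutated, and the mutant replaces the parent only when it scores at least as well on the fitness function, mirroring variation and selection in natural evolution.

```python
import random

# (1+1) evolution strategy maximising f(x) = -(x - 3)^2, optimum x = 3.

def fitness(x):
    return -(x - 3.0) ** 2

random.seed(1)
parent = 0.0
for _ in range(2000):
    child = parent + random.gauss(0.0, 0.5)   # mutation
    if fitness(child) >= fitness(parent):     # selection
        parent = child
```

Genetic algorithms generalize this loop to whole populations with crossover between candidates, but mutation plus selection is already enough to climb toward the optimum.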
As AI technology has matured, it has seen significant integration and adoption within the field of radiology, the science of using x-rays or other high-energy radiation for the diagnosis and treatment of diseases. Reliance on AI-driven applications in radiology has risen substantially; in 2020, clinical adoption of AI by radiologists was 30 percent, up from zero in 2015. AI applications for radiology are mainly used for two reasons: to improve workflow and to provide clinical decision support. With workflow applications, radiologists use AI to gather patient reports and exams in one place and analyze patient information, making it easier to interpret. Clinical decision applications perform a wide variety of analyses, including data analytics, image reconstruction, disease and anatomy identification, and advanced visualization. Challenges with AI in radiology include concerns about integrating AI applications, especially with an influx of applications gaining regulatory approval in recent years.
The following are companies using AI to develop products or producing AI software for various applications. Artificial intelligence programs designed for a specific application are also listed.
Timeline
Further Resources
2021 Final Report | NSCAI
National Security Commission on Artificial Intelligence
Web
2021
artificial intelligence | Definition, Examples, Types, Applications, Companies, & Facts
Web