Artificial intelligence (AI) is the reproduction of aspects of human intellectual behavior in software. By learning from experience and adapting to new inputs, AI systems can perform tasks flexibly, much as humans do.
From chess-playing computers to self-driving cars, most of the AI applications we hear about today rely heavily on deep learning and natural language processing. With these technologies, computers can be trained to perform a variety of difficult tasks in business and everyday life by recognizing patterns in large amounts of data.
History of artificial intelligence
The term artificial intelligence (AI) was coined in 1956, but it is only in recent years, thanks to trends such as growing data volumes, advanced algorithms, and improvements in computing power and storage technology, that the term has become widely known.
Early AI research in the 1950s explored topics such as problem solving and symbol processing. In the 1960s, the US Department of Defense took an interest in this area and began research into training computers to mimic basic human logical reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed a street mapping project in Aspen, Colorado in the 1970s. DARPA also developed an intelligent personal assistant in 2003, long before Siri, Alexa, and Cortana became household names.
These early studies paved the way for the automation and formal reasoning found in today’s computers, leading to decision support systems and smart search systems designed to complement and augment human capabilities.
Hollywood movies and science fiction tell stories of humanoid robots conquering the world, but AI technology today is nowhere near anything so eerie or so intelligent. That said, AI has evolved to bring many tangible benefits to every industry. Below we introduce cutting-edge examples of AI use across a wide range of industries, including healthcare and retail, so please read on to the end.
AI has become an integral part of SAS software, helping customers in every industry take advantage of AI technologies such as machine learning and deep learning.
What is Artificial Intelligence?
Why artificial intelligence is important
- AI automates repetitive learning and discovery through data. But AI is different from hardware-driven robotic automation. Rather than automating manual tasks, AI performs large-volume, high-frequency computerized tasks reliably and without fatigue. Human involvement remains essential in this kind of automation, both for setting up the system and for asking the right questions.
- AI adds intelligence to existing products. In most cases, AI will not be sold as a standalone application; rather, just as Siri was added as a new feature to a new generation of Apple products, it will improve and enhance the products people already use. Combining automation, conversational platforms, bots, smart machines, and more with large amounts of data can improve many technologies used at home and at work, from security intelligence to investment analysis.
- AI adapts through progressive learning algorithms, in effect “letting the data do the programming.” AI discovers structure and regularities in data, from which algorithms acquire skills: the algorithm becomes a classifier or a predictor. Just as an algorithm can teach itself to play chess, it can teach itself which product to recommend next in an online shop. The models also adapt when given new data. Backpropagation, an AI technique, lets a model adjust itself through training and additional data when its first answer is not quite right.
- AI analyzes more data, and at deeper levels, by using neural networks with many “hidden layers”. A few years ago, building a fraud detection system with five hidden layers was almost impossible. Enormous computing power and big data have changed that. Deep learning models learn directly from data, so training them requires a great deal of it; the more data you feed a model, the more accurate it becomes.
- AI achieves incredible accuracy through deep neural networks, in ways that were previously impossible. For example, our interactions with Alexa, Google Search, and Google Photos are all based on deep learning, and they keep getting more accurate the more we use them. In the medical field, AI techniques such as deep learning, image classification, and object recognition can now detect cancer in MRI images with the same accuracy as highly trained radiologists.
- AI gets the most out of your data. When algorithms are self-learning, the data itself becomes intellectual property. The answers are in the data; all humans need to do is apply AI to find them. Since data matters more in business now than ever before, it can be a source of competitive advantage: in a highly competitive industry where every company applies similar techniques, the one with the best data is likely to win.
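The “progressive learning” point above can be made concrete with a minimal sketch: gradient descent fits a one-parameter model y = w·x to example data by repeatedly nudging the weight in the direction that reduces error. This miniature is the same principle that backpropagation applies to deep networks; the data and learning rate here are invented for illustration.

```python
# Toy "learning from data": fit y = w * x by gradient descent.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # pairs (x, y) where y = 2x

w = 0.0    # initial (wrong) guess for the weight
lr = 0.05  # learning rate: size of each correction step

for _ in range(200):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # step against the gradient to reduce the error

print(round(w, 3))  # converges close to 2.0, the pattern in the data
```

No one told the program that the rule was “multiply by 2”; the repeated corrections recover it from the examples, which is what “the data does the programming” means.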
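Why do “hidden layers” matter, as the list above claims? A single linear layer cannot compute even the simple XOR function, but one hidden layer of two units can. The weights below are hand-chosen for illustration (in a real network they would be learned from data):

```python
def step(x):
    # Threshold activation: fires (1) when its input is positive
    return 1 if x > 0 else 0

def xor_net(a, b):
    # Hidden layer: h1 fires on "a OR b", h2 fires on "a AND b"
    h1 = step(a + b - 0.5)
    h2 = step(a + b - 1.5)
    # Output layer: fires on "OR but not AND", i.e. XOR
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints the XOR truth table
```

Each extra layer lets the network combine the features computed by the layer below, which is why deep (many-layer) networks can represent far more complex patterns than shallow ones.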
Table of Contents
1. Artificial Intelligence Implementation Guide for Enterprises
A vision of the goals you want to achieve is indispensable for making use of AI. Without a clear vision, the focus of your AI applications will be uncertain. To prepare for building an AI system and to understand how to use its elements and functions, you need to formulate a concrete strategy, backed by a carefully considered implementation plan, for achieving your own goals.
With SAS guidance, you can incorporate a variety of advanced analytics (including AI) into your strategy and understand the strengths and weaknesses of different methods based on your goals.
2. Separating the hype from the reality
AI is helping to “build more smartness into machines”, but it is not conquering the world. What should we expect from AI?
AI is currently booming, but excessive expectations and excessive wariness can both obscure the reality, stalling timely value creation when companies try to solve management problems. Oliver Schabenberger, Senior Vice President and CTO of SAS, has written an article on seeing that reality clearly, which we draw on here.
We live in exciting times. The relationship between us humans and machines, objects, and things is changing rapidly. Since the days of living in caves, humans have directed passive tools (ones that do not move on their own) with their hands and voices. Today, mice and keyboards respond to our operations, and smart devices such as the Amazon Echo help us with simple tasks like turning on lights as well as more complex ones, such as answering human questions using analytics.
However, with the development of artificial intelligence (AI), the tide may be turning. Can machines transform from passive objects into active beings that weave themselves into human life? Will machines direct humans, or will humans continue to direct machines? Will objects start reporting to us, “I’ve done it for you,” or will we continue to tell objects what to do? As everything becomes smarter and more intelligent, are we humans at risk of becoming “captives” of living spaces controlled by autonomous intelligence?
3. How close are we to such a situation?
If you lie awake worrying that machines might conquer the world, get a good night’s sleep: with the technology in use today, that will not happen. Nowadays the trend seems to be to call anything AI if it behaves cleverly or unexpectedly, but much of it is not actually AI. My calculator is more computationally powerful than I am, but it is not AI. Decision trees are not AI, and SQL query condition clauses are not AI. It is true, however, that there is a trend toward AI in the sense of “building more smartness into machines, devices, appliances, cars, and software.”
Incredible progress has been made in developing algorithms that perform tasks with greater accuracy than humans. Although Go was thought impossible for computers until recently, machines now defeat humans at it and have pushed on to a level humans cannot match. In the medical field, algorithms for detecting specific types of cancer in medical images have reached the same accuracy as radiologists, a truly life-changing result for patients.
These algorithms demonstrate superhuman ability because they perform a given task with high reliability and accuracy, without sleep and without tiring of repetition. However, we are still far from creating a machine that can think or act like a human being.
Today’s AI systems are trained to perform human tasks in a “computerized, clever way,” but each is trained for only one task. A system that plays Go cannot play solitaire or poker, and it will not acquire those skills on its own. Software that drives a self-driving car cannot control the lighting in a house.
This does not mean that this kind of AI is underpowered. On the contrary, because it can provide deep expertise for any given application, it has the potential to transform many, perhaps all, industries. But when it comes to what AI can achieve, don’t get ahead of yourself. A system that learns top-down via supervised learning on training data cannot grow beyond the content of that data. In other words, such a system cannot create, innovate, or reason (think logically).
4. Main uses of artificial intelligence
Demand for AI capabilities is growing in every industry. Question-and-answer systems that can support legal work, patent search, risk notification, medical research, and more are in particularly high demand. Beyond that, AI has the following uses.
Healthcare: applications that incorporate AI are effective for personalization in areas such as treatment, medication, and X-ray image diagnosis. As a “life coach,” a personal medical assistant can encourage you to take your medicine, exercise, and eat healthily.
Manufacturing: AI is also effective for the “factory IoT.” Data streaming from networked equipment can be analyzed, and load and demand predicted, using recurrent networks (a special type of deep learning network used with sequence data).
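The key idea of the recurrent networks mentioned above can be sketched in a few lines: a hidden state is carried forward and updated at each time step, which is what lets the model use the history of a sensor sequence when predicting the next value. The weights and the sensor readings below are arbitrary, made-up values for illustration, not trained parameters.

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.0):
    # New hidden state mixes the current input with the previous state,
    # squashed by tanh to stay in (-1, 1)
    return math.tanh(w_x * x + w_h * h + b)

readings = [0.1, 0.4, 0.35, 0.6, 0.55]  # a toy sensor time series

h = 0.0  # initial hidden state: no history yet
for x in readings:
    h = rnn_step(x, h)

print(h)  # the final state summarizes the whole sequence
```

A prediction layer would then read the load or demand forecast off this state; the recurrence itself, not the tiny weights here, is what distinguishes this architecture from a plain feed-forward network.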
Retail: virtual shopping features powered by AI not only provide personalized recommendations but also offer consumers advice on purchasing options. Inventory management and in-store layout technologies will also be improved and enhanced by AI.
Sports: AI is being used to analyze match images and video and to provide leaders (managers and coaches) with reports on improving match performance, such as optimizing player positioning and game strategy.
5. How artificial intelligence works
AI works by combining large amounts of data with fast, iterative processing and intelligent algorithms, programming broad behaviors so that the software can automatically learn from patterns and features in the data. AI is a broad field of study comprising many theories, methods, technologies, and major sub-fields, including the following.
- Machine learning: automates the building of analytical models. Machine learning uncovers hidden insights in data by leveraging techniques from neural networks, statistics, operations research, and physics, without humans having to explicitly program where to look or what to conclude.
- Neural network: a type of machine learning made up of interconnected processing units, analogous to the neurons (nerve cells) in the brain. These units process information by responding to external inputs and relaying information to one another. The process requires multiple passes over the data to discover relationships and derive meaning from undefined data.
- Deep learning: a method that uses large-scale neural networks with many layers of processing units, exploiting advances in computing power and improved training techniques to learn complex patterns from large amounts of data. Common applications include image recognition and speech recognition.
- Cognitive computing: a sub-field of AI that aims to achieve natural, human-like interaction between machines and humans. The ultimate goal of AI and cognitive computing is for machines to simulate human processes, with the ability to interpret images and audio and to hold coherent conversations with humans.
- Computer vision: uses pattern recognition and deep learning to recognize what is in a photo or video. When machines can process, analyze, and understand images, they can also capture images and video in real time and interpret their surroundings.
- Natural language processing (NLP): aims to enable computers to analyze, understand, and generate human language, including speech. The next stage of NLP is natural-language interaction, which will let humans communicate with computers and direct them to perform tasks in ordinary, everyday language.
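A toy example can show two of the sub-fields above working together: a tiny machine-learning text classifier that “learns” word counts from labeled examples, a crude form of natural language processing. All of the training data and labels here are invented for the sketch; real systems use far larger corpora and probabilistic models.

```python
from collections import Counter

# Labeled training examples (made up for illustration)
train = [
    ("great product works well", "pos"),
    ("love it excellent quality", "pos"),
    ("terrible broke quickly", "neg"),
    ("bad waste of money", "neg"),
]

# "Training": count how often each word appears under each label
counts = {"pos": Counter(), "neg": Counter()}
for text, label in train:
    counts[label].update(text.split())

def classify(text):
    # Score each label by how many of its training words appear
    scores = {
        label: sum(c[word] for word in text.split())
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

print(classify("excellent product"))  # → pos
print(classify("broke and bad"))      # → neg
```

Note that no rule for “positive” or “negative” was written by hand; the behavior comes entirely from patterns in the labeled data, which is the core idea of machine learning described above.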
6. In addition, a number of technologies support the realization and utilization of AI.
- GPU (Graphics Processing Unit): GPUs are attracting attention as a key enabler of AI because they provide the heavy computing power required for large-scale iterative processing. Along with big data, that computing power is essential for training neural networks.
- Internet of Things (IoT): the IoT produces vast amounts of data from interconnected devices, most of which goes unanalyzed. Automating model creation and application with AI lets you get the most out of that data.
- Advanced algorithms: new algorithms and methods are being developed to analyze more data, faster, and at multiple levels. This kind of intelligent processing is key to identifying and predicting rare events, understanding complex systems, and optimizing unique scenarios.
- API (Application Programming Interface): an API is a mechanism that makes specific program functions easy to use, which in turn makes it easy to add AI capabilities to existing products and software. For example, adding image recognition to a home security or Q-and-A system can automate describing the attributes of image data, creating captions and titles, and surfacing interesting patterns and insights within the images.
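The API point above is really about interface design: a complex capability is hidden behind one simple function so an existing product can adopt it with a single call. The sketch below uses a deliberately fake “classifier” (average brightness decides the label) purely to show the shape of such an API; a real system would put a trained model behind the same function signature.

```python
def classify_image(pixels):
    # Placeholder "model": average brightness decides the label.
    # A real AI service would run a trained network here instead.
    avg = sum(pixels) / len(pixels)
    return "day" if avg > 0.5 else "night"

def caption_photo(pixels):
    # An existing product calls the simple API; it never needs to
    # know anything about the model internals.
    label = classify_image(pixels)
    return f"Photo taken during the {label}"

print(caption_photo([0.9, 0.8, 0.7]))  # → Photo taken during the day
```

Because the product depends only on the function’s signature, the placeholder can later be swapped for a genuine recognition model without changing any calling code, which is exactly why APIs make it easy to add AI functions to existing software.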
Summary: the goal of AI is to create software that can reason over inputs and explain its outputs to humans. AI provides human-like interaction with software and supports decision-making for specific tasks, but it is not a replacement for humans, and is unlikely to become one in the near future.