Welcome to the 21st Century: The Benefits of Artificial Intelligence
Over one hundred thirty million people worldwide use the Netflix streaming service, yet most may not know how its recommendation system works. The brilliant mind behind this program is actually an algorithm produced under the influence of Artificial Intelligence (AI). AI is a developing technology with a “learning” capacity that seemingly imitates human capabilities. The field of AI originated in the 1950s thanks to John McCarthy, a professor of computer science at Stanford, whose goal was to “[mimic] the logic-based reasoning [of] human brains” (Levy). However, artificial intelligence today is not created to mirror the human mind; instead, AI is programmed to use probability-based, data-interpreting algorithms to provide the best option in a given scenario. Through this method, Artificial Intelligence has proven beneficial by increasing business productivity, quality of life, and human capacity.
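The idea of choosing “the best option in a given scenario” can be made concrete with a minimal Python sketch. This is a hypothetical illustration, not Netflix’s or any vendor’s actual code: each option is given a probability estimated from data, and the program simply picks the most likely one.

```python
# Hypothetical sketch: an "AI" that scores each option with a
# probability estimated from data, then picks the most likely one.
def best_option(options, probabilities):
    """Return the option with the highest estimated probability."""
    return max(zip(options, probabilities), key=lambda pair: pair[1])[0]

# Toy data: estimated chance a viewer will enjoy each show.
shows = ["drama", "comedy", "documentary"]
scores = [0.35, 0.85, 0.55]
print(best_option(shows, scores))  # comedy
```

Real systems estimate these probabilities from millions of data points, but the final step, ranking options by likelihood and selecting the top one, is this simple.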
Today’s advancements in technology have allowed significant increases in business productivity. Since the early 1950s, computers have been able to compute math equations thousands of times faster than humans, and with higher accuracy (as long as the machines were properly coded). Now, anyone can Google a question and receive an answer in a fraction of a second. The genetic algorithms coded into an AI allow it to complete a specific task with minimal effort. Furthermore, unlike the human brain, AIs cannot be distracted and are capable of doing only what they are programmed to do. Because of this, Artificial Intelligence can “master discrete tasks” (Levy) thanks to “learning, massive data sets, sophisticated sensors, and clever algorithms.” These discrete tasks are completed efficiently and without error because of the coding. On the organic side, humans have been described as ‘jacks of all trades’ but ‘masters of none.’ Humans have a knack for deviation and are incapable of producing consistently precise results. These complications of human error result in lost productivity, money, and quality of service. In sales alone, human error is not only a waste of resources but also a hassle, given the time and effort needed to handle returns, the cost of shipment, and the time to correct the mistake. In the business world, any mistake can bruise a potential future partnership. The benefit of incorporating an AI system is reduced variation, and thus a reduced likelihood of mistakes. People make mistakes; computers do not.
The Kiva system, used by diapers.com, an infant-products website, is a perfect example of the greater efficiency of machines. The organization algorithm used in the warehouses of diapers.com allows robots to sort, supply, and ship packages automatically and with ease. Ordinary warehouse workers stick to one sorting system because it is easier on the memory. These workers are also often provided with limited machinery and therefore waste time waiting for equipment to become available. With the Kiva system, however, a shared hive mind allows robots to “adjust to constantly changing data like [merchandise] size and popularity,… [warehouse] geography,… and the location of each robot” (Levy), thus allowing optimization for changing customer demands. Integrating the Kiva system into the warehouses of retailers such as “Gap, Staples, and Office Depot” (Levy) allows the delivery process to move at a “rate of one every six seconds” (Levy). Because the Kiva system is programmed for this one warehouse job, it can focus all its energy on that task, unlike a human. Similar to the Kiva robots is Watson, an IBM AI featured on the show Jeopardy!, which adjusts its output to a given response. To provide an answer on Jeopardy!, Watson used a pre-programmed database to search and gather information based on patterns and determine the most likely correct answer. This, along with Watson’s weighted scheme, which allowed it to buzz in within roughly ten milliseconds, allowed it to defeat a recurring Jeopardy! champion. Artificial Intelligence is capable of grand tasks, outdoing even the best humans at their own jobs. This technology can improve not only the business field but also healthcare, the stock market, and agriculture.
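Watson’s weighted scheme can be sketched in a few lines of Python. This is a hypothetical simplification, not IBM’s actual architecture: each candidate answer carries a confidence score, and the system “buzzes in” only when its best candidate clears a confidence threshold.

```python
# Hypothetical sketch of Watson-style answer selection: candidate
# answers are weighted by confidence, and the system buzzes in only
# when the top candidate is confident enough.
def choose_answer(candidates, threshold=0.5):
    """candidates: dict mapping answer -> confidence in [0, 1].
    Returns the top answer, or None if not confident enough to buzz."""
    answer, confidence = max(candidates.items(), key=lambda kv: kv[1])
    return answer if confidence >= threshold else None

print(choose_answer({"Toronto": 0.3, "Chicago": 0.9}))  # Chicago
print(choose_answer({"Toronto": 0.3, "Chicago": 0.4}))  # None (stays silent)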
As the years have gone by, people’s reliance on technology has dramatically increased. Landlines have become obsolete while cellular phones have taken over. Newspapers have shifted to online news articles. The digital age has arrived, and life has become simpler. Tasks that once took humans hours to complete can be done in minutes thanks to machines. Research that once had to be done in libraries can now be done at home with a few clicks. As AI becomes more common in everyday life, the overall quality of life will also increase.
As previously mentioned, Netflix uses a series of algorithms to suggest programming similar to previously viewed shows. As small an impact as this may seem to have on the user’s overall experience, it actually saves the user time and effort when searching for a new show. This seemingly small convenience can be replicated on a much larger scale. “Geeks from Princeton University” (Levy) did just that. By implementing a system known as Plasma in Norfolk Southern’s railway operations, they were able to begin “predicting the impact of changes in [various factors] on real-world operations” (Levy). The system is capable of doing the work of an entire dispatch center, and even of suggesting how to improve efficiency.
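A recommendation like Netflix’s can be illustrated with a small content-based sketch. The catalog, titles, and genre weights below are invented for illustration, and Netflix’s real system is far more elaborate; the core idea shown is just this: describe each show as a vector of genre weights, then recommend the unwatched show most similar (by cosine similarity) to one the viewer liked.

```python
import math

# Hypothetical content-based recommender: each show is a vector of
# genre weights [drama, comedy, sci-fi]; recommend the unwatched show
# closest (by cosine similarity) to a show the viewer liked.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

catalog = {
    "Space Saga":  [0.1, 0.0, 0.9],
    "Courtroom":   [0.9, 0.1, 0.0],
    "Star Comedy": [0.0, 0.6, 0.7],
}
liked = "Space Saga"
recommendation = max(
    (title for title in catalog if title != liked),
    key=lambda title: cosine(catalog[liked], catalog[title]),
)
print(recommendation)  # Star Comedy (shares the sci-fi weight)
```

Here "Star Comedy" wins because its vector points in nearly the same direction as the liked show’s, which is exactly what “suggesting similar programming” means mathematically.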
The technological advances that have defined the twenty-first century have gone on to exceed the imaginable. Enormous gains that once seemed impossible are now being made. The human lifespan has substantially increased with the help of AI in the healthcare industry. The use of AI in healthcare is a relatively new field of study. Research is being done all over the world on how AI can be used during operations to minimize the risk of complications caused by a lack of precision. Although the field is new, there are high hopes that the use of artificial intelligence will progress even further. Surgeries once deemed too risky and experimental can now be considered viable options. Procedures that once seemed unimaginable have come to fruition. Fantasies have come to life.
Fantasies that were once only imaginable can now be viewed in real life. This is the idea of virtual reality. Headsets shaped like ski goggles allow people to enter a whole new world. “After you’ve used one of these for a while and you understand that it has this power to teleport you to a different world, you sort of look at it [in] a different way,” noted Atman Binstock, chief architect at Oculus. By going beyond ordinary human capacity, people can access a whole new reality apart from their own, an escape from what they once knew. With these types of technological advancements, the only way is forward, with no end in sight.
The human race is not perfect, nor will it ever be; where humans fall short, AI is made to compensate. As shown, Artificial Intelligence can increase business productivity, quality of life, and human capacity. Artificial intelligence in today’s era is not created to mirror the human mind; it is, however, programmed to use probability-based, data-interpreting algorithms to provide the best option in a given scenario, as the Plasma system does. The capacity of the human brain seems rather small in the grand scheme of what an AI is capable of. Artificial intelligence is used around the world in almost all fields of work. It has become a basic necessity in the everyday lives of individuals and in the continued growth of mankind.
- Baker, Stephen. “Watson is Far from Elementary.” Wall Street Journal, 14 Mar. 2011, Proquest, login.libweb.lib.utsa.edu/login?url=https://search.proquest.com/docview/856748141?accountid=7122. Accessed 25 Jan. 2018.
- Fish, Stanley. “What did Watson the Computer Do?” New York Times, 21 Jan. 2011. opinionator.blogs.nytimes.com/2011/02/21/what-did-watson-the-computer-do/. Accessed 25 Jan. 2018.
- Jastrow, Robert. “Toward an Intelligence beyond Man’s.” Time, vol. 111, no. 8, 20 Feb. 1978, p. 59. Academic Search Complete, login.libweb.lib.utsa.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=53521917&scope=site. Accessed 25 Jan. 2018.
- Levy, Steven. “The A.I. Revolution.” Wired, vol. 19, no. 1, Jan. 2011, p. 88. Proquest, login.libweb.lib.utsa.edu/login?url=https://search.proquest.com/docview/851871772?accountid=7122. Accessed 25 Jan. 2018.