Why is Artificial Intelligence becoming mainstream now?

Artificial intelligence has enjoyed a massive resurgence in recent years, and for seasoned experts who have worked in the industry since the 1980s, it can feel like déjà vu. But why is it becoming mainstream now? The reasons are fairly simple: computing power is cheaper, cloud computing is here, and big data is available. Each of these elements has played a special role in the progression of AI, and all of them have become more favorable as technology advances. The purpose of this chapter is to discuss in detail how AI continues to become mainstream largely because of these factors.

Computing Power Is Cheaper

Machine learning, and by extension AI, is a computational process. As such, it is inextricably tied to computational power. Computing power and computing architectures have a significant impact on the speed of training and inference in AI, and therefore on how fast the technology progresses (Gaskell, 2018).

There is no doubt that computing power is shaping the future of AI. In early 2018, OpenAI released an analysis documenting the explosion of computing power over the previous six years, which has been at the forefront of helping AI make significant progress. The analysis found that the amount of compute used in the largest AI training runs had doubled roughly every 3.5 months.
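As a rough, purely illustrative piece of arithmetic (not part of the OpenAI analysis itself), the sketch below shows what a 3.5-month doubling time compounds to over an assumed six-year window:

```python
# Illustrative arithmetic only: the compound growth implied by a fixed doubling time.
# The 3.5-month doubling period comes from the OpenAI analysis cited above; the
# 72-month window is an assumed round figure for "the last six years".
DOUBLING_TIME_MONTHS = 3.5
WINDOW_MONTHS = 72

growth_factor = 2 ** (WINDOW_MONTHS / DOUBLING_TIME_MONTHS)
print(f"Implied compute growth over {WINDOW_MONTHS} months: ~{growth_factor:,.0f}x")
```

Changing the window or the doubling time makes it easy to see how sensitive such an exponential trend is to those assumptions.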

The analysis's breakdown of the compute required to create landmark AI systems such as AlphaGo Zero and ResNets demonstrates clearly why AI is mainstream now. As improvements are made in compute, AI progresses further. One of the most significant improvements is not just making compute more powerful but also making it cheaply available.

As more startups emerge to develop AI-specific chips, we expect the trend to continue. Some of these startups claim that their chips will deliver a substantial increase in FLOPS/watt over the next two years. Gains are also likely to come from reconfiguring hardware so that it can handle more operations at a lower cost (Gaskell, 2018).

Hardware is crucial in determining the methods that researchers and engineers use to develop AI models. Features such as a chip's power consumption are instrumental in determining how AI is used in the real world. As technology advances and computing power becomes cheaper, AI takes bigger steps forward.

Cloud Computing Is Here

Cloud computing has proved to be an impactful technology, transforming how systems work, how information is stored, and how decisions are made. Because of the many benefits associated with the cloud, experts who understand the power of AI have long wanted to merge the two. Think of a cloud that is smarter. When AI and cloud computing are coupled, even more benefits can be enjoyed. This kind of amalgamation is referred to as the intelligent cloud. It goes beyond the current usage of the cloud, which is limited to networking, computing, and storage. Artificial intelligence, when infused into the cloud, vastly increases its capabilities: the cloud would be able to learn from the immense amount of data it holds and make predictions, in addition to analyzing situations.

The promise of the benefits that would result from the intelligent cloud has kept major tech companies at work. Cloud computing and machine learning could assist in storing, analyzing, and learning concurrently from the information finding its way to the cloud. From user inputs alone, the intelligent cloud would be able to predict trends and support better decision making.
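As a purely hypothetical sketch of what "predicting a trend from user inputs" can mean at its simplest, the snippet below fits a linear trend to made-up monthly usage counts and extrapolates one step ahead. A real intelligent-cloud service would work on vastly larger telemetry with far richer models, but the underlying idea is the same:

```python
# Purely hypothetical sketch: fit a straight-line trend to made-up monthly usage
# counts and extrapolate one month ahead. Real cloud services would use far more
# data and richer models; the numbers below are invented for illustration.
import numpy as np

monthly_requests = np.array([120, 150, 180, 230, 270, 340])  # hypothetical counts
months = np.arange(len(monthly_requests))

slope, intercept = np.polyfit(months, monthly_requests, deg=1)  # simple linear fit
next_month = slope * len(monthly_requests) + intercept
print(f"Projected requests next month: ~{next_month:.0f}")
```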

The impact that cloud computing has had on artificial intelligence is already evident. For instance, Google announced that it had been working on its AI and cloud capabilities to do more than what we are used to. At the start of 2018, the company announced that it had developed technology that would help enterprises with their cloud computing initiatives. But it is not just Google that is at work. Top competitors such as Microsoft and Amazon have also made significant progress. In fact, the two tech giants have surpassed Google when it comes to artificial intelligence, and they continue to release new projects that give them an edge.

As these companies compete with each other to stamp their position in the AI and cloud computing world, the end user stands to benefit immensely.

Big Data Is Available

The rate at which data is generated across the globe is mind-boggling. These enormous amounts of data are what make up the term "big data." While big data excites experts and researchers from many fields, those interested in machine learning and artificial intelligence have not been left out either. Machine learning is essentially about machines gaining access to data and learning new things from it on their own. This is to say that the more data AI systems have access to, the more they are bound to learn, and big data provides the best opportunity for that to happen. What the machines learn is becoming more meaningful and contextually relevant, so that they come to better understand how they should function just from big data analysis.
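To make the "more data, more learning" point concrete, here is a minimal sketch (not from the chapter) that trains the same scikit-learn classifier on growing slices of a small built-in dataset; the digits dataset merely stands in for big data, but the pattern of accuracy improving with more training examples is the point:

```python
# Minimal sketch of "more data, more learning": train the same classifier on
# growing slices of scikit-learn's small digits dataset (a stand-in for big data)
# and compare test accuracy.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

for fraction in (0.05, 0.25, 1.0):
    n = int(len(X_train) * fraction)
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:4d} samples -> test accuracy {accuracy:.2f}")
```

The exact numbers will vary, but accuracy typically climbs as the training slice grows, which is the behavior described above.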

The one thing that people have always feared about AI is that it may replace human labor in the workplace. However, big data weakens this argument. Analyzing big data for emotions and sentiments still needs human intelligence, because machines lack emotional intelligence and the ability to make sentiment-driven decisions. As a result, the combined efforts of AI and big data keep demand for human data scientists rising in a constantly growing market.

The speed of communication is at the forefront of making sure that more data is generated. Research done by Demo found that in just two years, we generated more data than had been created from the dawn of humankind up to 2015. This data is instrumental in fueling AI applications, but if it cannot be moved to where it is needed, it becomes less useful. That is where 3G and 4G wireless communications, as well as fiber-optic cables, come in handy. They make it possible for data to move back and forth at a faster rate. Combined with cloud computing, these communication technologies have made possible an app economy that continues to demand AI.
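As a back-of-the-envelope illustration of why link speed matters for moving data to where it is needed, the snippet below estimates how long a hypothetical 10 GB upload would take at a few assumed, rough bandwidths (the figures are illustrative, not measurements of real 3G, 4G, or fiber links):

```python
# Back-of-the-envelope arithmetic with assumed, illustrative bandwidths: how long
# moving 10 GB of data takes over links of different speeds.
DATA_GB = 10

for name, mbps in [("3G (~5 Mbps)", 5), ("4G (~50 Mbps)", 50), ("fiber (~1 Gbps)", 1000)]:
    megabits = DATA_GB * 8 * 1000          # GB -> gigabits -> megabits
    seconds = megabits / mbps              # ideal transfer time, no overhead
    print(f"{name}: ~{seconds / 60:.1f} minutes")
```

Even these rough figures show why faster networks, together with cheap compute and the cloud, keep feeding data-hungry AI applications.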
