
Artificial intelligence is not the future, it is the now


This paper is based upon a presentation given at the NFAIS conference on Artificial Intelligence that was held May 15–16, 2019 in Alexandria, VA, USA. In it, the author discusses real-world examples of companies that are deploying machine learning and artificial intelligence technology, with the goal of empowering readers to begin work on AI projects in their respective organizations immediately. Various open-source tools and strategies for funding AI projects are also discussed.


The term “artificial intelligence” has become part of the English-language lexicon, so much so that the words seem to have lost their meaning. The term conjures up images of science fiction films like The Terminator, as well as other nebulous futurism, and no longer accurately represents the work being done in the field of computer science.

When one discusses the topic of artificial intelligence (AI), the conversation inevitably turns toward the future. The discussion may involve self-driving cars, AIs that can author articles or identify military targets in real-time video, or the prospect that AI will eliminate entire industries through the automation of manual labor.

This, however, is a mischaracterization because those tasks are not going to happen in the future. Instead, they are already happening right now. And with the advent of 5G technology and the expected two orders of magnitude increases in smart devices and Internet communication speed, AI is about to become an integral part of our everyday life [1].

Thus, there are very real and important advances in artificial intelligence that are having a profound impact on industry. It is not hyperbole to say that the work being done is so important that this technology will imminently touch the lives of every person, industry, and business on the planet.

Indeed, artificial intelligence is not the future; it is the now.

2. More than a competitive advantage, AI is an existential threat to your business

Over the past two decades, emerging technology has proven to be an important competitive advantage for companies. In the publishing industry, for example, Elsevier was an early adopter of digital publishing technologies, XML databases [2] and advanced content management. Later, through investment into its own intellectual property and acquisitions such as Plum Analytics [3], it delved into the data analytics space. Elsevier now considers itself to be not a publishing company, but rather an information analytics business [4].

These ventures fundamentally changed the nature of the company, resulted in the creation of new products and new revenue, and made Elsevier one of the most successful publishing companies of the 21st century.

This of course is only one example, but whether it was the invention of the transistor, the Internet, mobile devices or cloud computing, emerging technology is often the catalyst that allows companies to accelerate their growth, create more revenue and build a significant competitive advantage in the marketplace.

Artificial intelligence, however, is different in that the impact will be even more profound, perhaps unprecedented.

According to Kai-Fu Lee, a leading futurist and expert in artificial intelligence, as many as 40% of jobs will be eliminated within the next fifteen to twenty-five years. Lee is quoted as saying that “Many jobs that seem a little bit complex, chef, waiter, a lot of things will become automated” [5].

Of course, if forty percent of jobs will be eliminated, implicit in that statement is that a significant number of companies will likewise be eliminated. Enterprises that cannot adapt to such enormous change will be left behind. Therefore, AI is not simply another emerging technology that can provide a competitive advantage, but in fact an existential threat to companies.

3. Why has AI suddenly become so prevalent in the corporate world?

First, especially given the various contexts in which the term “artificial intelligence” is used, it is important to define what the phrase means in this paper. In this particular context, artificial intelligence is the use of a suite of machine learning algorithms and neural networks to model and process nonlinear relationships. These relationships are defined first through algorithmic analysis of training data, then through reinforcement with real-world data, often in real time.
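The two-phase process described above can be sketched in a few lines. The following is an illustrative stand-in using scikit-learn (not a tool named by the author), with synthetic data in place of a real corpus: a model is first fit on an initial training set, then updated incrementally as new, real-world examples arrive.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Synthetic labeled data stands in for a real training corpus.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Phase 1: fit the model on the initial training data.
model = SGDClassifier(random_state=0)
model.partial_fit(X[:500], y[:500], classes=np.unique(y))

# Phase 2: reinforce with "real-world" examples as they stream in,
# batch by batch, without retraining from scratch.
for start in range(500, 1000, 100):
    model.partial_fit(X[start:start + 100], y[start:start + 100])

print(round(model.score(X, y), 3))
```

The same pattern scales from this toy loop to production systems that update models continuously from live data.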

The algorithms that are being deployed are not new. The equations underpinning commonly-used machine learning algorithms such as Random Forest [6], XGBoost [7], and AdaBoost [8] date to the 1990s. The first version of the Random Forest algorithm, for example, was developed by Bell Labs computer scientist Tin Kam Ho in 1995 [9].
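Indeed, these decades-old ensemble methods are now one import away in open-source libraries. A minimal sketch with scikit-learn (substituting its built-in gradient boosting for the separate XGBoost package, and a synthetic dataset for real training data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split

# A synthetic binary-classification dataset stands in for real training data.
X, y = make_classification(n_samples=600, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each classic ensemble algorithm is trained and evaluated with the same API.
for model in (RandomForestClassifier(random_state=42),
              AdaBoostClassifier(random_state=42),
              GradientBoostingClassifier(random_state=42)):
    model.fit(X_train, y_train)
    print(type(model).__name__, round(model.score(X_test, y_test), 3))
```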

If the algorithms are not new, why has machine learning only recently become practical? There are two very important innovations that have spurred the use of AI in business: the explosion of horizontally-scalable computing power, and the availability of the data, content, and metadata necessary to train AI algorithms.

The concept of horizontally-scalable computing power is also not new, but became more practical with the widespread availability of cloud computing services such as Amazon’s AWS, Google Cloud, and Microsoft Azure. These services allow programmers to mount customized instances of operating systems and CPUs [10] and to dynamically create clones of them once CPU or memory limits are reached. This means that machine learning algorithms can now process massive amounts of data by scaling up CPU resources as needed during peak usage.
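As a concrete illustration of that clone-on-demand behavior, this is roughly how an autoscaling worker group might be configured on one such platform. The group and template names are placeholders, and the exact flags should be verified against current gcloud documentation; this is a sketch, not a production recipe.

```shell
# Create a managed group of worker VMs from an existing template
# (names here are hypothetical placeholders).
gcloud compute instance-groups managed create ml-workers \
    --zone us-central1-a \
    --template ml-worker-template \
    --size 2

# Attach an autoscaling policy: clone new instances when average CPU
# utilization crosses 80%, up to a ceiling of ten replicas.
gcloud compute instance-groups managed set-autoscaling ml-workers \
    --zone us-central1-a \
    --min-num-replicas 2 \
    --max-num-replicas 10 \
    --target-cpu-utilization 0.80
```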

The explosion of data, however, is the primary driver for the deployment of artificial intelligence in industry. There is perhaps no better example than the health care industry.

In the late 1990s, most hospitals and clinics still recorded each patient’s clinical history on paper. If a record was digital, it was stored on a personal computer (PC) in the doctor’s office. By 2019, each patient’s detailed clinical history is stored digitally, and often in the cloud, as Amazon AWS has launched HIPAA-compliant cloud servers [11] for storage of sensitive information.

This means patient records including clinical history, x-rays, laboratory work, diagnoses, prescriptions, treatment plans, and other pertinent information are now collectively stored digitally. Leading companies in the health care space such as IQVIA [12] and Purple Lab [13] now have more than fifty terabytes of longitudinal patient data encompassing more than 300 million Americans.

Machine learning data is not limited to structured patient records, of course. Cameras are now ubiquitous in society, and AI algorithms are being trained on both video and photos to do everything from identifying troop movements to recognizing a cat. Companies such as Ziotag [14] are creating biometric audio footprints, voice profiles that identify speakers in video in real time.

With the advent of fifth-generation (5G) technology, this data explosion will not only continue but increase by several orders of magnitude. It is not difficult to see the implications of this for industry, and why it is imperative that companies make a plan to innovate and adopt this technology as soon as feasible.

4. Artificial intelligence is now a commodity: It is no longer expensive and does not require the deep expertise once needed to integrate it into business

For most organizations, the idea of adopting AI is a frightening prospect. It is assumed that this adoption is both cost-prohibitive and requires deep subject area expertise. In fact, until only recently this was true.

However, since late 2018 AI technology has been commoditized and integrated into all of the major cloud computing platforms. Natural language algorithms used to train taxonomies and classify documents, as one example, once could easily cost millions of dollars in licensing and implementation fees. Using Google AI’s API, the same capabilities now cost pennies per minute of computing time.

The major cloud platforms, Google Cloud, Amazon AWS, Microsoft Azure, and IBM Watson, have each launched their own artificial intelligence components and made them available to users of their respective platforms. These components offer API-level access and are billed only for the computing power they consume. They include not only out-of-the-box algorithms but also tutorials, training data sets, use cases, and active user communities that answer questions on the platform forums.

This availability is a true paradigm shift. It means that the combination of AI algorithms, training sets, and horizontally-scalable computing power is now available to any organization. These AI components are well-documented such that organizations do not need veteran programmers or data scientists to implement them.

Even better, some platforms, such as Google, are practically giving away their services. Programs such as Google Cloud for Startups [15] are providing $3,000 to $5,000 to organizations that want to experiment with their services. Additional capital is available for companies that receive venture capital investment.

The urgency for companies to act has never been higher and the barriers to entry have never been lower. This is driving widescale adoption of AI across industries such as healthcare, finance, publishing, retail, and others. The time to act is now, before it is too late.

About the Author

Michael Puscar is a renowned data scientist, technologist and entrepreneur who built his 25-year career assisting companies to develop innovative products and bring them quickly to market. Puscar is the author of two patents, one approved and one pending, for automated content aggregation and semantic classification technology. His companies have been market leaders in the delivery of solutions relating to Big Data technology, scalability, search, advanced security techniques and semantic indexing. As a speaker, Puscar has presented at numerous international conferences including O’Reilly Tools of Change, the Frankfurt Book Fair, London On-Line, Contac and the NFAIS Blockchain for Scholarly Publishers conference (May 2018). He was a contributor to the Vivir Con Sentido television program on Life Design TV Network and has been featured by various on-line publications including NBC News, Vice, the Financial Times, TechCrunch, and the Next Web, among others. Contact Information: Telephone: +1 (610) 601-4937; Email: [email protected].



References

[1] F. Boccardi, R.W. Heath, A. Lozano, T.L. Marzetta and P. Popovski, Five disruptive technology directions for 5G, IEEE Communications Magazine 52(2) (2014), 74–80.

[2] Unlocking the Value of Content at Elsevier, MarkLogic, accessed September 28, 2019.

[3] Elsevier Acquires Leading Altmetrics Provider Plum Analytics, Elsevier, accessed September 28, 2019.

[4] This Is Elsevier, Elsevier, accessed September 28, 2019.

[5] D. Reisinger, A.I. expert says automation could replace 40% of jobs in 15 years, Fortune Magazine, accessed September 28, 2019.

[6] Random forest, Wikipedia, accessed September 28, 2019.

[7] XGBoost, Wikipedia, accessed September 28, 2019.

[8] AdaBoost, Wikipedia, accessed September 28, 2019.

[9] T. Ho, Random decision forests, in: Proceedings of the Third International Conference on Document Analysis and Recognition, Vol. 1, IEEE, 1995, p. 278, ISBN 0-8186-7128-9.

[10] Central Processing Units, Wikipedia, accessed September 28, 2019.

[11] AWS Healthcare and Life Science Customers, Amazon, accessed September 28, 2019.

[12] IQVIA corporate web site, accessed September 28, 2019.

[13] Purple Lab corporate web site, accessed September 28, 2019.

[14] Ziotag corporate web site, accessed September 28, 2019.

[15] Google Cloud for Startups, Google, accessed September 28, 2019.

[16] Google Cloud Artificial Intelligence Platform, Google.

[17] Machine Learning on AWS, Amazon.

[18] Microsoft Azure’s AI Platform, Microsoft.

[19] Build with Watson, IBM.