The following is a guest post by Rita C. Waite, a Growth Strategy & Investments Manager at Juniper Networks.
Artificial Intelligence (AI) is fundamentally changing how businesses operate across all sectors, including manufacturing, healthcare, IT, and transportation.
Advancements in AI over the last decade present opportunities for companies to automate business processes, transform customer experiences, and differentiate product offerings.
AI pioneers like Google and Amazon, which have adopted these new technologies to create a growing competitive advantage, have already seen bottom-line benefits from their AI strategies.
While enterprise AI adoption is still in the early stages, the scale of the opportunity in AI demands more C-level discussions within any business. It is crucial then to have a concrete understanding of AI, its ecosystem, and how industry leaders are taking steps to drive unfair advantage from it.
AI is a branch of computer science that aims to create machines capable of intelligent behavior. Within AI there are multiple technologies and segmentations, with machine learning (ML) being one of the largest and fastest growing.
Machine learning algorithms learn from examples and experience rather than relying on predefined rules or algorithms. Within machine learning there are additional segments, such as deep learning, which focuses on deep neural network structures.
Today, AI is poised to benefit from the convergence of several technological innovations and wider expertise, specifically: affordable cloud computing infrastructure, availability of large datasets, and leaps in algorithm optimization.
These advancements, combined with increased investment in AI research, have created an environment for AI to sustainably thrive and continue to affect businesses and society into the future.
What’s so special about Machine Learning?
The recent resurgence of AI has been largely driven by advances in machine learning. These advances have led to breakthroughs in natural language processing (Apple’s Siri, Google Translate), recommender systems (Amazon’s recommendation engine, Pandora), and image recognition (diagnosis tools, self-driving cars).
Machine learning is broadly divided into two learning methods:
- Supervised learning, which uses a known dataset to make inferences based on labeled input and output data.
- Unsupervised learning, which draws inferences from datasets containing data without labeled outputs.
The most prevalent method at work today is supervised learning, with unsupervised learning showing great promise for broader applications.
Within each method of learning, there are multiple algorithm categories and select algorithms to choose from. Decisions here will vary depending on the type of problem or the desired result.
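As a toy illustration of the difference between the two methods, the sketch below first fits a supervised classifier from labeled 1-D points, then recovers the same two groups without any labels using a simple 1-D k-means. The data, function names, and threshold model are all hypothetical; a real project would typically reach for a library such as scikit-learn.

```python
# Toy sketch of the two learning methods on 1-D data (illustrative only).

def train_supervised(points, labels):
    """Supervised: learn a decision threshold from labeled examples."""
    mean0 = sum(p for p, y in zip(points, labels) if y == 0) / labels.count(0)
    mean1 = sum(p for p, y in zip(points, labels) if y == 1) / labels.count(1)
    threshold = (mean0 + mean1) / 2  # midpoint between the class means
    return lambda x: 0 if x < threshold else 1

def cluster_unsupervised(points, iters=10):
    """Unsupervised: find two cluster centers with no labels (1-D k-means)."""
    c0, c1 = min(points), max(points)  # initialize centers at the extremes
    for _ in range(iters):
        g0 = [p for p in points if abs(p - c0) <= abs(p - c1)]
        g1 = [p for p in points if abs(p - c0) > abs(p - c1)]
        c0, c1 = sum(g0) / len(g0), sum(g1) / len(g1)
    return c0, c1

data   = [1.0, 1.2, 0.9, 5.0, 5.3, 4.8]
labels = [0, 0, 0, 1, 1, 1]

classify = train_supervised(data, labels)    # uses the labeled outputs
print(classify(1.1), classify(5.1))          # prints: 0 1

centers = cluster_unsupervised(data)         # ignores the labels entirely
print(sorted(round(c, 1) for c in centers))  # prints: [1.0, 5.0]
```

Note that the unsupervised pass recovers the same two groups the labels encode, but without ever seeing them, which is why unsupervised learning is promising where labeled data is scarce.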
In a machine learning workflow, each segment of the process requires specific types of expertise and levels of resources. While domain expertise is important for the pre-processing / feature engineering portion of the workflow, the training phase requires distinct AI expertise and less domain knowledge.
From an infrastructure perspective, the most resource-intensive phase is the model training phase, when the data is processed. Again, understanding the tradeoffs of each approach and the type of problem being solved becomes important when building a ML model.
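The three phases of that workflow can be sketched in miniature. This is an illustrative toy, not a real pipeline: the server-overload scenario, the error-rate feature, and the threshold "model" are invented here purely to show where domain expertise (pre-processing / feature engineering) and compute-heavy training sit relative to cheap inference.

```python
# Toy sketch of the three ML workflow phases (scenario is hypothetical).

def preprocess(raw_records):
    """Feature engineering: domain expertise decides which signals matter.
    Here, raw (requests, errors) pairs become a single error-rate feature."""
    return [errors / max(requests, 1) for requests, errors in raw_records]

def train(features, labels):
    """Training: the resource-intensive phase where the data is processed.
    This toy 'model' is just a learned threshold on the feature."""
    positive = [f for f, y in zip(features, labels) if y == 1]
    negative = [f for f, y in zip(features, labels) if y == 0]
    return (min(positive) + max(negative)) / 2

def infer(model, feature):
    """Inference: cheap per-example prediction with the trained model."""
    return 1 if feature > model else 0

raw    = [(100, 1), (200, 2), (100, 30), (50, 20)]  # (requests, errors)
labels = [0, 0, 1, 1]                               # 1 = overloaded

model = train(preprocess(raw), labels)
print(infer(model, preprocess([(120, 40)])[0]))     # prints: 1
```

Even in this toy, the division of labor is visible: `preprocess` encodes domain judgment, `train` is where all the data gets crunched, and `infer` is a cheap lookup, which is why training and inference have such different infrastructure demands.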
The AI Stack
The AI stack is the infrastructure required to run AI models, spanning hardware components, compute, storage, and data-processing and analytics tools.
- Components: CPUs, GPUs, FPGAs, and specialized ASICs are the foundational components of the AI stack. While CPUs are ubiquitous, the GPUs and FPGAs used in the resource-intensive training phase of machine learning have led to great advances in deep learning. For the inference portion, which requires fewer resources, traditional CPUs or ultra-low-power FPGAs or ASICs are the most common options.
- Compute: Public cloud vendors are now offering solutions tailored for AI. The availability of cloud computing options enables any enterprise, SMB, or small team to run AI models at an affordable price point.
- Storage: With the vast amount of data required in machine learning, particularly during the feature engineering phase, data storage is critical. The emergence of Hadoop clusters and cloud object storage has significantly advanced data storage capacity to support AI use cases.
The AI stack relies on services provided by public cloud vendors and open source projects. Investment by cloud giants — such as Google, Amazon, Facebook, Microsoft, and Baidu — into AI services has facilitated a shift away from proprietary vendors owning the stack.
In concert, the embrace of open source as an accepted standard has caused more rapid development across the AI ecosystem. Google’s open-source TensorFlow library exemplifies this mindset by enabling anyone with an interest in machine learning to develop models without having to build libraries and algorithms from scratch.
The last decade brought AI out of research institutions and into the forefront of some of the world’s most progressive technology companies. These companies have embedded AI into their core products and services, accelerating technology advancement, talent development, and investment seen in the AI ecosystem. For example:
- Amazon is using AI to improve personalized recommendations and optimize inventory management. In Amazon’s annual letter to shareholders, CEO Jeff Bezos discussed the importance of adopting AI to deliver goods more quickly, enhance existing products, and create new tools through its cloud-computing division.
- Google uses its own DeepMind technology to manage data center energy and reduce cooling costs by 40%. The company’s AI-first strategy is focused on leveraging AI for search optimization, self-driving cars, and numerous other portfolio solutions.
- Facebook is committed to building the foundational technologies of AI. Its research group, FAIR, is one of the top producers of breakthroughs in neural networks.
- Microsoft has created an AI business unit with over 5,000 computer scientists and engineers focused on driving AI into the company’s products.
- Intel is updating its servers to cope with the increased computation required to process and train AI systems. To do this, the company has aligned its AI efforts under a single organization led by Naveen Rao, former CEO of Nervana (a deep learning startup Intel acquired in 2016).
- Baidu is investing heavily in artificial intelligence, building image-recognition technology, advancing autonomous driving, launching digital assistants, and developing augmented reality tools.
The shortage of AI talent remains an issue. According to McKinsey, 70% of AI investment is internal R&D spending by the largest technology companies. We continue to see the cloud giants hiring key AI talent away from academia to head their AI efforts. It comes as little surprise that 80% – 90% of all AI talent is working at the largest technology companies in the world.
The fierce competition for talent has contributed to a large uptick in acquisitions of AI companies. According to CB Insights, over 55 private companies using AI across different industries have been acquired in 2017 alone. Google, Apple, Facebook, Intel, Microsoft, and Amazon have been the most active acquirers in AI, with the majority of acquisitions falling in core AI technologies, such as image recognition and natural language processing.
Led by these technology vendors, AI has seen some early winners, and in the process created an active ecosystem of technologies and tools. A record $6.5B of capital has been deployed across 650+ deals in 2017, already surpassing the $5.7B deployed across almost 1000 deals in all of 2016.
AI companies range from those focusing on developing core AI technologies to those building AI tools for solving industry-specific problems. In terms of investment, the largest sub-segments of AI have been cybersecurity and horizontal solutions, followed by business intelligence and IoT startups.
How companies can leverage AI
When assessing how to deploy or build AI tools, companies should both analyze the highest value use cases and plan to build a foundation of robust support and talent.
Any AI effort will rely on three main building blocks: data, infrastructure and talent.
- Data drives insights, but only with access to large datasets. The effectiveness of machine learning is often correlated with the amount of data available; at this stage, access to large amounts of data is a requirement for driving value out of ML tools.
- Infrastructure, both software and hardware, must be in place to run machine learning models effectively. Cloud Service Providers are well positioned to extend their offerings into AI infrastructure and offer solutions that can be used in conjunction with open source software. For some companies, moving the training data to the cloud would be either too expensive or impossible due to regulation or other business reasons. For those companies, a large amount of computing power will be necessary and sometimes hardware acceleration with GPU, FPGA, or ASIC will be required.
- AI talent is vital in making effective use of machine learning. While not every company will seek to build an internal AI organization, having access to experienced data scientists is key to driving value from AI. Machine learning is a difficult subject that requires expertise.
Driving AI into core product and services offerings creates competitive differentiation. Companies must bring expertise in-house to build a robust infrastructure able to handle AI development.
In many cases, operationalizing a strategy requires significant capital investment. If building an internal solution is not an option, adopting third-party tools can be a suitable alternative. Companies unable to differentiate their products with AI from the onset can still take steps to improve and automate core operations, since operational efficiency is also a competitive advantage. Common opportunities include:
- Differentiated customer service through advanced bots and virtual assistants
- Smarter forecasting for financial planning, inventory management, and sales pipeline
- Automated HR processes through optimized recruitment, automated talent management, and tailored benefits
- Increased salesforce productivity through automated outbound sales, intelligent customer engagement, and targeted marketing
- Streamlined legal tasks with AI contract due diligence and review, assisted legal research, and automated IP monitoring
Not every company will share the same priorities. While some may find that an automated customer service solution drives the most value for their business, smarter forecasting for inventory management may have a greater impact on another. Encouraging the leadership team to analyze and explore the benefits of adopting AI tools within their own departments will shed light on where AI will have the highest impact.
AI no longer refers to theoretical research at academic institutions or R&D labs; instead, it is a foundational technology that will disrupt society and lead to decades of innovation. From the way we get to work, to how doctors identify and treat diseases, AI is poised to forge a future of endless new possibilities.
Companies implementing an AI strategy today will be best positioned to take advantage of the opportunities to come. AI is already transforming how we do business, and for companies large and small this could mean an unsettling change.
The convergence of accessible technology and an active ecosystem, however, suggests that companies are more prepared than ever to take part in this new wave of innovation.
Rita Waite (@ritacwaite) is a Growth Strategy & Investments Manager at Juniper Networks, where she focuses on emerging networking technologies. She is Juniper’s representative and observer to the Board of Directors of several portfolio companies. Rita is also a VP of West-to-West, an organization that promotes and supports Portuguese entrepreneurship in Silicon Valley. Rita graduated from the University of San Diego with a B.A. in Economics.