AI Powers Smarter Technology



We hear about new smart technology all the time, from smart speakers to smart watches and even smart refrigerators. While smart products promise to make our lives better, how do we make them “smarter” so they truly complement our lives? In short, how do we ensure that “smart” is actually better?

Beyond the personal and consumer devices we use in our day-to-day lives, a burgeoning number of connected IoT devices in our factories and businesses is now producing vast amounts of data. Such devices, including remote and on-site sensors, cameras, and wearables, could be the key to making our factories and businesses “smarter.” But we must carefully decide how we leverage and make sense of the vast amounts of data being generated. Without doing so, collecting, moving, and storing that data simply becomes a cost burden. It can also result in a cacophony of data when what we really want is insight and order, not just more noise. To provide some context, a recent analysis suggests that “more data has been created in the past two years than in the entire previous history of the human race.” Further supporting this view, a recent report from IDC predicts that the sum total of the world’s data will grow from 33 zettabytes (as of December 2018) to 175 zettabytes by 2025. That is a staggering rate of growth, and one that clearly calls for a sophisticated approach to data management.
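To put that projection in perspective, here is a quick back-of-the-envelope calculation, sketched in Python and using only the two IDC figures cited above: growing from 33 to 175 zettabytes over seven years is more than a five-fold increase, or a compound annual growth rate of roughly 27 percent.

# Illustrative arithmetic only; the 33 ZB and 175 ZB figures come from the IDC report cited above.
start_zb, end_zb = 33, 175          # zettabytes, end of 2018 vs. 2025
years = 2025 - 2018                 # seven-year horizon

growth_factor = end_zb / start_zb               # roughly 5.3x overall
cagr = growth_factor ** (1 / years) - 1         # compound annual growth rate

print(f"Overall growth: {growth_factor:.1f}x")  # Overall growth: 5.3x
print(f"Approximate CAGR: {cagr:.1%}")          # Approximate CAGR: 26.9%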

The promises of Industry 4.0 will be realized when we leverage Artificial Intelligence (AI) to derive value from data and redefine what automation means. Just as data is said to be the “new oil,” it must be extracted and refined before it is useful. It is easy to hear the phrase “data is the new oil” and believe that we’re sitting on ready-to-use potential in our organizations. In actuality, many organizations are sitting on “crude oil” without a clear path to monetize the data and make it actionable, that is, to turn the crude into fuel.

The process of deriving intelligence from data is not a trivial task. Every organization will encounter obstacles along the way, from building a digital foundation to empowering users with the tools and resources to put their data to meaningful use. Whether you’re powering a new breed of research with the infrastructure to solve complex problems, building data science divisions within your organization, or providing your citizen data scientists with the tools they need to start incorporating machine learning (ML) and advanced analytics into their functional areas, you will need a data-centric strategy to deliver smarter products and solutions to your customers. Better doesn’t just happen; it unfolds when planning meets opportunity.

Not only have we at Lenovo been on our own digital transformation journey, but we have also had the privilege of accompanying our customers on theirs. One of the first sectors looking to leverage AI and advanced analytics was the academic research community. High Performance Computing (HPC) centers responded quickly to the new paradigm of AI, empowering a new breed of researchers looking to leverage ML and Deep Learning (DL) techniques to advance scientific discovery in ways that hold the promise of propelling the human experience forward. The convergence of HPC and AI created new challenges for centers that must support both traditional and AI workloads. We worked with our community of clients to develop technology for this dynamic environment: CPU and GPU systems that accelerate AI workloads, and software that provides an easier path to managing diverse workloads, resources, and users.

Last year we launched the first version of our LiCO (Lenovo Intelligent Computing Orchestration) platform, running on the open-source OpenHPC software stack, which helps HPC centers manage infrastructure, workloads, and users across their clusters for both traditional HPC and new AI workloads. Since then, we’ve added support for additional frameworks and libraries, and even accelerated “no-code” templates that allow users to start training models without writing a single line of code. Recently, we launched our AI Studio module to help users with data curation and hyperparameter tuning. But we did not want to empower just a few with smarter technology; our goal is to democratize AI and put powerful tools in the hands of more organizations.
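For readers who have not done this by hand, the short sketch below illustrates the kind of repetitive tuning work that features like the no-code templates and AI Studio are meant to take off a user’s plate. It uses the open-source scikit-learn library with an arbitrary example dataset, model, and parameter grid; it is a generic illustration of hyperparameter search, not a reflection of LiCO’s internal implementation.

# Generic hyperparameter search with scikit-learn (illustrative example only).
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_digits(return_X_y=True)

# Candidate hyperparameter values; each combination is scored by 5-fold cross-validation.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 10, 20],
}

search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,
    n_jobs=-1,   # use all available cores
)
search.fit(X, y)

print("Best parameters:", search.best_params_)
print(f"Best cross-validated accuracy: {search.best_score_:.3f}")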

This week at the AI Summit in San Francisco, we’re pleased to announce the launch of the LiCO 5.4 AI platform, which now supports integration with the Jupyter® Notebook and runs in a Kubernetes® environment, giving more organizations across the enterprise the opportunity to leverage AI. By providing a more straightforward path to implementation and management, LiCO on Kubernetes® delivers faster time to value for organizations adopting AI. Democratizing AI in this way drives greater adoption and usage, and helps provide smarter technology for all.
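As a rough illustration of what running an AI workload in a Kubernetes® environment involves, the sketch below uses the open-source Kubernetes Python client to submit a simple containerized training job to a cluster. The container image, job name, and namespace are placeholder assumptions for the example; LiCO users work through the platform’s own interface rather than writing this plumbing themselves.

# Illustrative sketch: submit a containerized training job with the open-source
# Kubernetes Python client. Image, job name, and namespace are placeholders.
from kubernetes import client, config

config.load_kube_config()            # authenticate using the local kubeconfig
batch_v1 = client.BatchV1Api()

container = client.V1Container(
    name="train",
    image="pytorch/pytorch:latest",  # placeholder training image
    command=["python", "-c", "print('training step would run here')"],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "demo-training"}),
    spec=client.V1PodSpec(restart_policy="Never", containers=[container]),
)
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="demo-training-job"),
    spec=client.V1JobSpec(template=template, backoff_limit=0),
)

batch_v1.create_namespaced_job(namespace="default", body=job)
print("Submitted Job 'demo-training-job' to the cluster")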

