Democratizing AI Within Your Supply Chain Leadership
Reading time: 5 mins
By Kumar Singh, Research Director, Automation & Analytics, SAPinsider
Supply chains may never become fully autonomous, but…
Supply chains of the future will be increasingly driven by algorithms. Key planning processes will be largely automated and managed by algorithms. In a nutshell, tomorrow's supply chains will be part of mathematical corporations: corporations that are driven by data, with data feeding ingenious algorithms.
But you already know all that and have heard and read this theme many times. There are, however, certain key aspects to building these "mathematical corporations." To successfully lead supply chains within them, you need a better understanding of important questions, such as how to invest in the infrastructure, software, and machine intelligence expertise that form the foundation for taking organizations to the next level of performance.
So as a supply chain leader, you DO need to develop a certain breadth of understanding of the rapidly evolving technology landscape.
Four key areas where you need to develop a good overview
The machine intelligence of the future will not be like the automation of the past, in which bots do rote, mundane, repetitive work. Machines will perform select elements of knowledge work long held to be the sweet spot for people in some of the most revered professions, such as law, medicine, and engineering. And this applies to the world of supply chains too!
So, coming back to the need for current and future supply chain leaders to understand the AI and ML landscape, the key question is: what kinds of analytics technology should supply chain leaders understand, and buy? I have divided the critical technologies of machine intelligence into four areas. This is not an exhaustive categorization, but if you invest in learning selected topics across all four areas, you will position your organization to profit from galaxies of new information.
- Data collection, storage, and preparation methodologies and technologies
- Applications that help visualize and interpret
- Algorithms applied to data
- Foundation infrastructure technologies
Note that the topics indicated below for each of these four areas are meant as a starting list for your own exploration. Given the broad intent of this article, explanations are short or omitted altogether, but you can find plenty of resources to dig deeper.
Data Collection, Storage, Preparation, and Foundational Infrastructure
The following are some aspects of this area that you should research:
- Types of data (e.g., structured, unstructured)
- What is a data lake?
- Types of databases prominently in use (e.g., relational, distributed)
- How many different types of existing systems currently hold relevant data, and how does the data format differ for each? What type of database supports these systems?
- Basics of cloud technology and cloud computing
- Data security fundamentals
- What technologies are available to automate data extraction, cleaning, and processing? (e.g., Alteryx)
- What computing architecture is best suited to my organization's data size and computing requirements?
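To make the data-preparation topic above concrete, here is a minimal sketch of automated cleaning, assuming pandas is available. The shipment records, column names, and values are all hypothetical, standing in for extracts pulled from two systems with inconsistent formats:

```python
import pandas as pd

# Hypothetical shipment extracts: duplicated rows, mixed date formats,
# numbers stored as text -- the kind of mess extraction tools automate away.
raw = pd.DataFrame({
    "order_id": ["A1", "A2", "A2", "A3"],
    "ship_date": ["2023-01-05", "2023/01/06", "2023/01/06", None],
    "weight_kg": ["12.5", "8.0", "8.0", "15.2"],
})

clean = (
    raw.drop_duplicates()                 # remove duplicate extracts
       .dropna(subset=["ship_date"])      # drop rows missing key fields
       .assign(
           # normalize the date separator, then parse to real datetimes
           ship_date=lambda d: pd.to_datetime(d["ship_date"].str.replace("/", "-")),
           weight_kg=lambda d: d["weight_kg"].astype(float),
       )
)
print(len(clean))  # rows that survive cleaning
```

Commercial tools wrap steps like these in a visual workflow, but the underlying operations (deduplicate, drop incomplete rows, coerce types) are the same.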
Applications That Visualize and Interpret
- Basics of descriptive statistics, to interpret statistical graphs and charts and perform data discovery in a dashboard environment
- Ability to decipher key types of charts, such as (not an exhaustive list):
- Pie charts
- Line charts
- Heat Maps
- Tree Maps
- Different dashboard software options available and their high-level offerings (e.g., Tableau, Qlik)
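The descriptive statistics mentioned in the first bullet can be sketched with Python's standard library alone. The weekly demand figures below are hypothetical; the point is that a summary statistic can flag something a chart would also reveal:

```python
import statistics

# Hypothetical weekly demand for one SKU -- the kind of series a
# dashboard would summarize before you ever look at a chart.
weekly_demand = [120, 135, 128, 400, 131, 126, 129, 133]

mean = statistics.mean(weekly_demand)
median = statistics.median(weekly_demand)
stdev = statistics.stdev(weekly_demand)

# A mean far above the median hints at an outlier worth investigating
# (here, the spike to 400) before trusting any single summary chart.
print(mean, median, round(stdev, 1))
```

This is exactly the kind of data discovery dashboards automate: the software computes the summaries; the leader's job is to know what a gap between mean and median is telling them.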
Algorithms Applied to Data
What are the different types of discovery tools available in the market? These tools generally uncover patterns, associations, and anomalies. Today, data scientists typically run these analyses, but this is changing rapidly. Soon there will be easy click-and-query tools that let non-technical executives apply analytics themselves.
In the mathematical corporation, machine learning is how you move from simply programming computers to carry out tasks to enabling them to learn about the world around you and provide recommendations.
Remember that machine learning is not just a learning tool. It is also a tool for approximation, prediction, and creating original understanding that enhances a supply chain leader's ability to imagine the future. There are many approaches to leveraging machine learning for predictions, and they can be classified into two key categories:
Supervised learning: In supervised learning, you show the model the input and output data for known examples of a pattern. Regression is one example.
- Regression: Finds the priority factors that influence an output. Essentially, it defines the relationship between certain input variables and an output. For example: which of the following variables is a stronger predictor of a product getting damaged during transportation: transit time, product weight, or carrier?
- Classification: Categorizes data by type. You determine the parameters for classification. A good application example is an algorithm that classifies a new SKU as an A, B, or C inventory-management SKU based on certain product characteristics.
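The regression question above (transit time vs. product weight as a damage predictor) can be sketched in a few lines, assuming NumPy is available. The data are hypothetical, constructed so that transit time drives the damage rate:

```python
import numpy as np

# Hypothetical shipments: columns are transit_time_days, product_weight_kg;
# y is the observed damage rate (the labels a supervised model learns from).
X = np.array([[2, 10], [5, 12], [8, 11], [3, 30], [9, 28], [6, 15]], dtype=float)
y = np.array([0.05, 0.20, 0.35, 0.10, 0.42, 0.22])

# Standardize the inputs so the fitted coefficients are comparable,
# then fit ordinary least squares: the larger |coefficient| marks
# the stronger predictor.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(Xs)), Xs], y, rcond=None)
print(coef[1], coef[2])  # transit-time vs. product-weight coefficients
```

On this toy data the transit-time coefficient dominates, answering the question the regression was asked; real models would add validation, more factors, and far more data.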
Unsupervised learning: This often starts with no knowledge of the data or the relationships within it. Instead of a training set, data scientists tell the model to begin dividing the data in many different ways to learn what groupings and summarizations emerge. These algorithms essentially identify groups of data that exhibit similar traits. An example of an unsupervised learning algorithm is:
- Clustering: Puts data into groups where each group contains data with similar characteristics, as determined by the algorithm.
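A minimal clustering sketch in pure Python: a one-dimensional k-means on hypothetical weekly-demand figures, grouping SKUs so similar movers land in the same cluster. Note that no labels are supplied; the groups emerge from the data, which is what makes this unsupervised:

```python
# Hypothetical weekly demand for nine SKUs.
demand = [5, 7, 6, 52, 55, 49, 210, 198, 205]

def kmeans_1d(points, k, iters=20):
    s = sorted(points)
    # Spread the initial centers across the sorted range.
    centers = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest center...
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda c: abs(p - centers[c]))].append(p)
        # ...then move each center to the mean of its group.
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return groups

slow, medium, fast = kmeans_1d(demand, k=3)
print(slow, medium, fast)  # three demand tiers discovered by the algorithm
```

Production clustering works in many dimensions and uses library implementations, but the loop above (assign to nearest center, recompute centers) is the core of the algorithm.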
Simulation and Optimization
- High-level understanding of linear programming and mixed-integer programming (how the problem is structured, etc.) and leading off-the-shelf tools (e.g., Llamasoft, JDA)
- Types of simulation modeling and leading off-the-shelf tools (e.g., AnyLogic, FlexSim)
- Where to leverage simulation vs optimization
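To show how a linear program is structured, here is a hypothetical two-product mix problem. Real tools use proper LP/MIP solvers; this sketch just enumerates integer points to make the objective and constraints visible:

```python
from itertools import product

# Hypothetical product-mix problem, structured as a linear program:
#   maximize  profit = 40*x + 30*y     (x, y = units of two products)
#   subject to  2*x + 1*y <= 100       (machine hours available)
#               1*x + 2*y <= 80        (labor hours available)
#               x, y >= 0
# Brute-force search over integer points stands in for a real solver.
best = max(
    ((x, y, 40 * x + 30 * y)
     for x, y in product(range(101), range(81))
     if 2 * x + y <= 100 and x + 2 * y <= 80),
    key=lambda t: t[2],
)
print(best)  # (units of product 1, units of product 2, profit)
```

The structure (decision variables, a linear objective, linear constraints) is the part worth internalizing; off-the-shelf tools differ mainly in scale, modeling convenience, and solver speed.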
Last but not least… Deep Learning
This is a type of machine learning whose architecture tries to mimic the human brain. That is the inspiration, but it is still driven by pure computation. These algorithms can process a wider range of data, typically require less data pre-processing by humans, and generally deliver more accurate results than traditional machine learning models.
In terms of architecture, there are multiple interconnected layers of software-based calculator nodes known as neurons (taking a cue from the human brain, as mentioned earlier). The network can take in large amounts of data, which is processed through increasingly complex computations as it travels through the layers.
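The layered architecture just described can be sketched as a forward pass, assuming NumPy is available. The weights are random placeholders (training would adjust them), and the input stands in for, say, four features of one SKU:

```python
import numpy as np

# Sketch of the layered architecture: each layer is a weight matrix
# feeding "neuron" activations into the next layer. Weights here are
# random placeholders; training would adjust them from data.
rng = np.random.default_rng(42)

def relu(z):
    # A common activation: neurons fire only on positive signal.
    return np.maximum(0.0, z)

x = rng.normal(size=4)                              # 4 input features
W1, W2, W3 = (rng.normal(size=s) for s in [(8, 4), (8, 8), (1, 8)])

h1 = relu(W1 @ x)     # first hidden layer: 8 neurons
h2 = relu(W2 @ h1)    # deeper layer: increasingly complex combinations
out = W3 @ h2         # single output, e.g. a demand or risk score
print(out.shape)
```

Stacking more layers is what makes the network "deep"; the increasingly complex computations the text mentions are exactly these repeated matrix multiplications and activations.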
What does this mean for SAPinsiders?
By the end of this decade, AI capabilities will have penetrated significant aspects of your supply chain operations. To ensure that your organization emerges from this decade successfully, you need to be cognizant of a few things:
Upskill your workforce: While tools and technologies will automate a significant portion of your supply chain operations, people will still play a significant role. Gradual upskilling of your supply chain leadership needs to start as soon as possible to begin building that foundation.
Invest in self-service tools: While not every tool in your portfolio can be, or needs to be, self-service, it is imperative that the tools employees use day to day are. This holds true for non-AI tools as well. Why? Because you need to start building a data-driven culture now in order to reap the true benefits of the semi-autonomous supply chains of the future.