The 3rd Industrial Revolution was the Digital Revolution, driven by the rise of the Internet, the PC and ICT.
We have now entered the 4th Industrial Revolution (1) and, as IBM CEO Ginni Rometty says, “If it’s Digital, it will become Cognitive” (2).
Gartner and McKinsey, among others, are reporting large increases in investment in AI and Cognitive Systems, already citing benefits in the immediate and medium term.
Every industry will be affected by AI, and the question is no longer when to prepare for it, but what to do and how.
At Fresche we transform and modernise IBM i applications; in other words, we transform one digital situation into another.
If Ginni Rometty’s prediction is correct, then Fresche is certainly well positioned to help companies leverage their existing IT infrastructure to build AI powered systems.
In understanding the foundation of this Fourth Industrial Revolution, our guidelines for approaching AI have been: identify areas where AI can add value, set milestones and leverage competitive advantage.
Last November, I attended IBM’s Driveway to Watson event to present how Fresche is integrating Watson. We currently use one of the Watson APIs for language translation. From our experience, integrating our solution with Watson was relatively simple, and the pedagogical purpose of my presentation was to make the audience aware of the accessibility of AI, especially when it is considered as an element that can be part of a bigger solution.
For example, in Forbes’ article The Growing Role of AI and Machine Learning in Marketing and Customer Engagement, Logan Rosenstein of NVIDIA explains how they use big-data AI to “listen to the constant feed of internet postings” to detect when their organization is mentioned (3). They then take this data and analyze it further with another AI process to determine the general tone of the sentiment being expressed. This is an excellent example of AI being used to capture valuable data that can be used in multiple ways.
In Fresche’s case, there are many opportunities to make use of AI. From database modernization to UI development, from application transformation to service delivery, the possibilities for cognitive applications are endless.
AI analytics, security, bots and Machine Learning are already prime candidates for AI development at Fresche. But how does all this work?
Let’s use the game of Sudoku as an example. The goal of Sudoku is to complete a partially filled 9×9 grid with the correct numbers so that each row, column and 3×3 square contains all of the numbers between 1 and 9, without repeating.
I created a program to solve Sudoku puzzles and wrote two versions of it.
The first version fills the empty squares in a fixed order: it always starts at the top-left square, moves to the next one on the right, and so on, line by line, until the last square at the bottom-right. It tries the numbers 1 to 9 in each square, following the rules, and backtracks on any invalid attempt, or series of attempts, until all the numbers are correct.
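A minimal sketch of this first version in Python (the grid is a list of nine rows, with 0 marking an empty square; the function names `valid` and `solve_reading_order` are mine, for illustration, not from the actual program):

```python
def valid(grid, r, c, n):
    """True if placing n at row r, column c breaks no Sudoku rule."""
    if n in grid[r]:                                  # row check
        return False
    if any(grid[i][c] == n for i in range(9)):        # column check
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)               # top-left of the 3x3 box
    return all(grid[br + i][bc + j] != n
               for i in range(3) for j in range(3))

def solve_reading_order(grid, stats):
    """Fill empty squares strictly top-left to bottom-right, backtracking on dead ends."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for n in range(1, 10):
                    if valid(grid, r, c, n):
                        grid[r][c] = n
                        if solve_reading_order(grid, stats):
                            return True
                        grid[r][c] = 0                # undo and try the next number
                    else:
                        stats["invalid"] += 1         # count every invalid attempt
                return False                          # nothing fits here: backtrack
    return True                                      # no empty square left: solved
```

On an easy grid this finishes almost instantly; on a hard one, the rigid scan order forces an enormous number of backtracks.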
I then gave it a “diabolic”-level Sudoku puzzle, and the program had to run for hours to solve it!
The reason the process takes so long is that working systematically, square by square from the top-left, is not the most “intelligent” strategy for solving a Sudoku puzzle.
A better strategy is to start with the square that has the most pre-filled numbers around it; in other words, start with the “easiest” square and work from there. So I wrote a second version of my program that constantly detects the “easiest” squares and attempts to solve those first, a process that drastically reduces the solve time for my diabolic Sudoku, from hours to mere minutes.
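A sketch of this second version, under the same assumptions (list-of-lists grid, 0 for empty, illustrative function names). In constraint-satisfaction terms this is the “most constrained square first” heuristic: at every step it picks the empty square with the fewest legal candidates, i.e. the one with the most filled-in neighbours:

```python
def valid(grid, r, c, n):
    """True if placing n at row r, column c breaks no Sudoku rule."""
    if n in grid[r] or any(grid[i][c] == n for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)               # top-left of the 3x3 box
    return all(grid[br + i][bc + j] != n
               for i in range(3) for j in range(3))

def solve_easiest_first(grid, stats):
    """Always fill the empty square with the fewest legal candidates first."""
    best = None
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                cands = [n for n in range(1, 10) if valid(grid, r, c, n)]
                if best is None or len(cands) < len(best[2]):
                    best = (r, c, cands)
    if best is None:
        return True                      # no empty square left: solved
    r, c, cands = best
    stats["invalid"] += 9 - len(cands)   # numbers ruled out at this square
    for n in cands:
        grid[r][c] = n
        if solve_easiest_first(grid, stats):
            return True
        grid[r][c] = 0                   # undo and try the next candidate
    return False                         # dead end: backtrack
```

On the same puzzle, `stats["invalid"]` typically ends up far smaller than with the fixed scan order, because the “easiest” square rarely leads to a dead end.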
Now, imagine we add statistical information to both versions of my program about the number of “invalid” attempts. The first version makes millions of invalid attempts; the second, a few hundred. We already know the second version is smarter, but the measurement underlines the principle: what makes it smarter is that it makes fewer “invalid” attempts.
In a Machine Learning (ML) process, an AI program would not attempt to solve a Sudoku puzzle by starting at the top-left square, nor at the square with the most pre-filled numbers around it, because, a priori, it doesn’t know which square is easiest to start with. It starts with a random square and collects statistics. The measurement principle for “learning” is that the fewer the invalid attempts, the better. ML correlates this statistic with any available contextual information in its memory and eventually determines the smartest method for solving the puzzle: starting at the square with the most pre-filled numbers around it.
You can imagine that all of this data crunching requires a lot of computing power and memory; this is where GPUs and neural networks come in. A software application has far more “squares” and dimensions than a Sudoku puzzle, and the horsepower required to crunch so much data and take so many variables into account is what makes cognitive computing so astounding.
We, at Fresche, are excited by the possibilities of digital transformation toward cognitive. I will keep you updated.
Machine Learning: a field of computer science that gives computers the ability to learn without being explicitly programmed.
Deep Learning: part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms.
Big Data: refers to large volumes of structured data (databases) or unstructured data (any document or media) that organizations can potentially mine and analyze for business gains. Big data can be internal or external (from the internet).
AI bot/Chatbot: A chatbot is a computer program which conducts a conversation via auditory or textual methods. One pertinent field of AI research is natural language processing.
CPU: A Central Processing Unit is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logical, control and input/output operations specified by the instructions.
GPU: Graphics Processing Unit. A graphics processing unit is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device.
The primary purpose of a GPU is to render 3D graphics, which are composed of polygons. Technologies like OpenCL and CUDA allow developers to use the GPU to assist the CPU in non-graphics computations, which can improve the overall performance of a computer or other electronic device.
This massively parallel horsepower is exactly what AI processes need.
Neural Network: an interconnected group of nodes, used in Machine Learning and Deep Learning.
Join me at our upcoming IT Leadership Forum in Milan, Italy, on March 1st, 2018, where I will be available to discuss business cases and examples of Cognitive and AI with European CIOs and IT leaders.