Computer Processors: What’s Coming Up?

By hsaniba

GPUs have become our daily workhorse for visualization and compute. We are heading toward adding ASICs, FPGAs, and quantum computing.

Who knew when programmable vertex shaders were first introduced by 3Dlabs in 2002, and the TI TMS34010 16 years earlier in 1986, that programmable graphics processors would find their way into supercomputers, scientific instruments, autonomous vehicles, neural nets and machine learning, and inferencing machines?

The GPU, with its massive parallel processing capability (over 5,000 cores in Nvidia's latest behemoth, built from billions of transistors), has become the darling of the industry, so much so that even Intel has finally gotten into the game, and scarcely a day goes by without some new announcement of its application.

Despite the applicability of, and love for, the GPU, the world has shifted in the last 18 to 24 months. Where software used to be king and we had three tried-and-true processors, the venerable x86, ARM, and the GPU, we now have a plethora of new processors being developed to enable and empower the exploding areas of artificial intelligence, machine learning, robots, and autonomous things.

The big news and excitement about processors this year revolves around four major technology areas, led by applications, as they should be (rather than new processor designs looking for applications). Those application segments (which have dozens of subsegments), in alphabetical order, are:

  • Artificial intelligence
  • Blockchain
  • Cryptocurrency
  • Internet of Things

And unlike the evolution and introduction of applications in the past, which were built on the platforms available at the time, these new applications are demanding and inspiring new architectures and processors.

Interesting Interrelatedness

The other interesting thing about these new applications is how interrelated they are. Artificial intelligence (AI), which also often goes by the names machine learning and deep learning, relies on a large sample base, often referred to as big data. Internet of Things (IoT) devices, often called smart sensors, generate large quantities of data. That data can be effectively, efficiently, and securely captured, stored, and distributed via blockchain mechanisms. And if the data, or the AI training, has to be paid for, that can be done via cryptocurrency exchanges.

Artificial intelligence (AI)

Artificial intelligence is one such application, and it originally ran on x86 servers. Because of the data-parallel nature of AI, it was soon learned that a massive, low-cost parallel processor like the GPU could be applied to these applications. But even the GPU, with its incredible compute density and compute efficiency, was not good enough, and so some organizations developed application-specific solutions using FPGAs and ASICs for convolutional neural network (CNN) workloads. The ASICs go by various names, probably the best known being Google's Tensor Processing Unit, or TPU, and the Tensor cores Nvidia added to its Volta GPU.

Other examples can be found, such as Intel's Nervana. IBM developed the TrueNorth neuromorphic CMOS ASIC in conjunction with the DARPA SyNAPSE program, and other companies such as ST, HiSilicon, Rockchip, and MediaTek have developed AI/CNN processors.
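
To see why these workloads map so well onto parallel hardware, consider the core operation of a CNN: a small filter slides across an image, and every output pixel can be computed independently of all the others. Here is a minimal sketch in Python (illustrative only; the image and kernel are made up, and real frameworks batch and fuse these operations):

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2D convolution: every output pixel is independent,
    which is exactly what makes CNNs a natural fit for GPUs and TPUs."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):          # on a GPU, these two loops become
        for j in range(ow):      # thousands of threads running at once
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

image = np.random.rand(28, 28)           # toy grayscale image
edge_filter = np.array([[1, 0, -1],
                        [2, 0, -2],
                        [1, 0, -1]])     # classic Sobel edge kernel
print(conv2d(image, edge_filter).shape)  # (26, 26)
```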

Training

When you start listing AI processor suppliers, you have to segregate them into training and inferencing applications. The big-iron processors, like AMD's, IBM's, Intel's, and Nvidia's, are used for sucking in massive amounts of big data to train an algorithm on how to find cats, terrorists, or glaucoma.
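
Stripped to its essentials, training means repeatedly showing the machine labeled examples and nudging the model's weights to reduce its error, over and over, across enormous datasets; that is where all the compute goes. A toy gradient-descent sketch, assuming a single linear model and fabricated data rather than a real cat detector:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))                 # 1,000 fabricated training samples
true_w = np.array([0.5, -1.2, 2.0, 0.3])       # the "answer" we hope to recover
y = X @ true_w + rng.normal(scale=0.1, size=1000)  # noisy labels

w = np.zeros(4)                  # model weights, learned from scratch
lr = 0.1                         # learning rate
for epoch in range(100):
    pred = X @ w                           # forward pass over the batch
    grad = X.T @ (pred - y) / len(y)       # gradient of mean squared error
    w -= lr * grad                         # nudge weights toward lower error
print(w)                         # converges toward true_w
```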

Inferencing

Once the algorithms have been trained and tuned, they can be deployed to smaller processors, such as those made by HiSilicon, MediaTek, Nvidia, Qualcomm, Rockchip, ST, and others, to do inferencing. Examples would be facial recognition for a security sign-in, or recognizing Alexa's name and an instruction. The work commissioned by the instruction ("Alexa, what time is it in Moscow?") is done in the cloud on big AI machines.
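
Inferencing is the cheap half of the job: the weights are frozen and the device simply runs a forward pass on each new input. A sketch of the idea, with hypothetical weights and a dot-product score standing in for a real wake-word model:

```python
import numpy as np

# Weights arrive pre-trained from the big training machines;
# the edge device never updates them.
TRAINED_WEIGHTS = np.array([0.8, -0.3, 1.1, 0.5])   # hypothetical values
THRESHOLD = 1.0

def infer(features: np.ndarray) -> bool:
    """Forward pass only: score the input and compare to a threshold."""
    score = float(features @ TRAINED_WEIGHTS)
    return score > THRESHOLD

# e.g. acoustic features extracted from one microphone frame
frame = np.array([0.9, 0.1, 0.7, 0.4])
if infer(frame):
    print("wake word detected -- hand the request off to the cloud")
```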

Blockchain

Blockchain is a virtual application, in that it doesn’t run on just your computer, but on everyone’s computer.

Blockchain networks are simply lots of virtual machines, or "nodes," connected to every other node to create a mesh. Each node runs a copy of the entire blockchain and competes to mine the next block or validate a transaction. Whenever a new block is added, the blockchain updates and propagates to the entire network, so that every node stays in sync.

A blockchain is a distributed ledger. There are free and commercial blockchain programs one can use and customize for individual needs, such as Ethereum, MultiChain, and HyperLedger.

Ethereum and MultiChain are products that claim to be open to some degree. HyperLedger was developed by IBM and given to the Linux Foundation. The licensing is not yet clear on these programs, so one needs to investigate before implementing.
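
Underneath all of these products, the ledger itself is a simple data structure: each block carries the hash of its predecessor, so tampering with any historical block breaks every link after it. A minimal sketch (illustrative only, not how any of the products above are implemented):

```python
import hashlib, json, time

def block_hash(contents: dict) -> str:
    """Hash the block's contents; any change breaks this fingerprint."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(prev_hash: str, transactions: list) -> dict:
    block = {"time": time.time(), "prev": prev_hash, "txs": transactions}
    block["hash"] = block_hash({k: block[k] for k in ("time", "prev", "txs")})
    return block

def valid_chain(chain: list) -> bool:
    """Each node re-checks every link before trusting a copy of the ledger."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != prev["hash"]:
            return False
    return True

genesis = new_block("0" * 64, [])
chain = [genesis]
chain.append(new_block(chain[-1]["hash"], ["alice pays bob 5"]))
print(valid_chain(chain))   # True -- tamper with any block and this fails
```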

To become a node in a network, one's computer has to download and keep updated a copy of the entire blockchain. To achieve this, blockchain applications like HyperLedger or Ethereum provide tools that you can download, connect to the specific blockchain network, and then use to interact with it.

Because of the massively parallel nature of the hashing work blockchains demand, GPUs have proven to be particularly good at it.

Cryptomining

For a blockchain transaction to work, it has to be verified. The verification can be done by anyone, and those doing it collect a fee for it. People set up their computers to search the network for open or waiting transactions. That is known as blockchain mining. And since the token of payment for providing the verification is a cryptocurrency, it has become known as cryptocurrency mining, or simply cryptomining.
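
In proof-of-work schemes like Bitcoin's, the mining itself is a brute-force search: try nonce after nonce until the block's hash falls below a difficulty target. Each guess is independent of every other, which is exactly the kind of work a GPU's thousands of cores excel at. A toy sketch:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Try nonce after nonce until the hash starts with `difficulty` zeros.
    Every nonce is an independent guess, so thousands can be tested
    in parallel -- which is why GPUs (and later ASICs) dominate mining."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("alice pays bob 5")
print(f"found nonce {nonce}: {digest}")
```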

To put a processor to work on a blockchain, you need a suitable driver for it. Typically a GPU is used, and you can get a blockchain-oriented driver from AMD, Intel, or Nvidia for their GPUs. Those drivers are used for cryptomining.

Internet of Things

We already live in a world of sensors, counters, and taggers. Modern factories, hospitals, automobiles and airplanes, most homes, and businesses have dozens of sensors to measure temperature, door openings, the speed of rotating devices, pressure, humidity, color, etc. Data is also collected by point-of-sale (POS) devices. All that data is sent to servers, either continuously or in bursts, depending on events and local intelligence. For example, there's no need to report the steady-state temperature or rotation of a machine more often than maybe once an hour, but there is a potentially critical need if it changes in a fraction of a second.
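
That report-by-exception policy, steady state on a slow heartbeat but anomalies immediately, is simple to express. A sketch of the kind of logic a smart sensor might run (the thresholds and readings are made up):

```python
import time

HEARTBEAT_S = 3600     # routine report: once an hour
DELTA_LIMIT = 5.0      # report immediately if the reading jumps this much

def should_report(reading: float, last_sent: float, last_sent_at: float) -> bool:
    """Send on the hourly heartbeat, or at once when the value moves fast."""
    if abs(reading - last_sent) >= DELTA_LIMIT:
        return True                                     # sudden change: send now
    return time.time() - last_sent_at >= HEARTBEAT_S    # routine heartbeat

# e.g. bearing-temperature samples arriving once a second
last_sent, last_sent_at = 70.0, time.time()
for reading in (70.1, 70.2, 78.5):      # third sample jumps 8.5 degrees
    if should_report(reading, last_sent, last_sent_at):
        print(f"report {reading}")      # fires only for 78.5
        last_sent, last_sent_at = reading, time.time()
```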

Internet of Things devices, despite their tiny size and ubiquitous deployment, are being upgraded with smart sensors capable of wireless communications, in some cases without even needing their own power source.

And these smart sensors and POS terminals spew out data every day, in some cases all day, every day, which leads us back to AI.

Robots

Robots might be considered an application, though they are more likely a system or device when a physical manifestation is envisioned. However, there are hundreds of software robots, such as telephone answering menu systems with voice recognition, and bots that post Twitter comments.

Robots, of course, need AI training to function. And if it's a physical robot, it will have lots of sensors whose data may be used for real-time correction (and/or protection), and potentially fed to a server for further analysis and program refinement. You can think of an autonomous vehicle as a robot.
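
That real-time correction is typically a feedback loop: read the sensors, compare against the target, apply a correction, and log the data for later analysis on a server. A minimal proportional-control sketch with made-up values:

```python
def control_step(target: float, measured: float, gain: float = 0.5) -> float:
    """Proportional correction: push the output toward the target each cycle."""
    return gain * (target - measured)

log = []                      # data collected for later server-side analysis
speed, target = 0.0, 10.0
for _ in range(20):           # the real-time loop, e.g. 100 Hz on a robot
    correction = control_step(target, speed)
    speed += correction       # actuator applies the correction
    log.append(speed)
print(round(speed, 3))        # converges toward the 10.0 target
```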

Summary

The number and types of applications, opportunities, and challenges being presented in our modern world are mind-boggling and difficult to keep up with, let alone be expert in them. As new concepts and vocabularies are introduced, so will confusion and misunderstanding about the terms, devices, functions, and dangers. One thing that is clear is that one size or type of processor does not fit all applications or needs, and we will have dozens of similar and specialized processors, most of which we will not even be aware of, nor should we be if they are to do their job.

Life will get much better, maybe more complicated and challenging, but better overall.

Based on: Dr. Jon Peddie’s presentation