The dawn of accelerated computing is underway, marking a transformative era within the tech world.
As artificial intelligence and machine learning take center stage, modern hardware solutions and novel architectures are outpacing traditional computing methods.
“Traditional general-purpose computing is like a Swiss Army knife,” said Shehram Jamal (pictured), director of product management for AI applications software at Nvidia Corp. “It can do many things, but none of them extremely well. It’s a one-size-fits-all approach where the same processor is used for various tasks, from browsing the web to editing videos. Accelerated computing, on the other hand, is like a specialized tool. It’s designed to do one thing exceptionally well.”
Jamal spoke with theCUBE Research’s John Furrier at the AI Infrastructure Silicon Valley – Executive Series event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the evolution of AI infrastructure, how accelerated computing is reshaping industries and what the future holds for enterprise AI systems.
The shift to accelerated computing in detail
Specialization and efficiency drive the hardware underpinnings of accelerated computing. The architecture is built around specialized hardware, such as GPUs and tensor processing units. These processors excel at parallel processing, making them better suited to AI tasks such as machine learning, data analytics and scientific simulations. The result is faster processing times, higher energy efficiency and lower costs, making accelerated computing essential for modern AI workloads, according to Jamal.
“General-purpose computing can handle a broad range of applications but may struggle with high-performance tasks due to limited parallel processing capabilities, whereas accelerated computing has three primary concepts: heterogeneous architecture, parallel processing and efficiency,” he said. “Combining CPUs with specialized accelerators, such as GPUs and TPUs, to handle specific types of workloads more efficiently is a heterogeneous architecture in accelerated computing.”
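The data-parallel principle behind the accelerators Jamal describes can be sketched in a few lines. This is an illustrative toy, not Nvidia code: it splits a dot product across a small Python standard-library thread pool, where a GPU would apply the same split-and-combine idea across thousands of cores.

```python
# Illustrative sketch of data parallelism using Python's stdlib thread pool.
# A GPU applies the same principle at massive scale: many workers each
# handle a slice of the data, and the partial results are combined.
from concurrent.futures import ThreadPoolExecutor

def dot_chunk(args):
    """Partial dot product over one slice of the input vectors."""
    a, b = args
    return sum(x * y for x, y in zip(a, b))

def parallel_dot(a, b, workers=4):
    """Split the vectors into chunks, map them to workers, reduce the results."""
    step = max(1, len(a) // workers)
    chunks = [(a[i:i + step], b[i:i + step]) for i in range(0, len(a), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(dot_chunk, chunks))

a = list(range(1000))
b = list(range(1000))
assert parallel_dot(a, b) == sum(x * y for x, y in zip(a, b))
```

The design choice mirrors the heterogeneous model in the quote: a general-purpose host orchestrates the work while parallel workers do the arithmetic in bulk.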
The demand for AI-driven applications has exposed the limitations of traditional computing. Modern AI systems are designed differently from previous iterations, requiring specialized hardware and software configurations. For example, applications such as self-driving cars, medical diagnostics and virtual assistants, such as Siri or Alexa, depend on the capabilities of accelerated computing for real-time performance and accuracy, Jamal explained.
“Basically, you can build faster and smarter applications with accelerated computing,” he said. “You can enhance healthcare with AI-powered diagnostics. You can also improve entertainment. And then there are smarter home devices as well.”
In the context of AI systems, the two dominant processes are training and inference. Training is akin to teaching a model to recognize patterns, such as animals in pictures. This process requires vast amounts of data and computational power, making it a resource-intensive task. Inference, on the other hand, involves using the trained model to identify patterns in new data, a much faster and less compute-intensive process.
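The training-versus-inference split can be made concrete with a toy model. This hypothetical sketch fits a one-parameter linear model by gradient descent (training: many passes over the data) and then applies it to a new input (inference: a single cheap evaluation); none of the names here come from the interview.

```python
# Toy contrast of training (iterative, compute-heavy) vs. inference
# (one cheap forward pass) on a one-parameter model y = w*x.
def train(xs, ys, epochs=500, lr=0.01):
    """Fit w by gradient descent -- many passes over the data."""
    w = 0.0
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

def infer(w, x):
    """Apply the trained model to new data -- a single multiplication."""
    return w * x

xs, ys = [1, 2, 3, 4], [2, 4, 6, 8]   # data drawn from y = 2x
w = train(xs, ys)                      # expensive: 500 passes
print(round(infer(w, 5), 2))           # cheap: one operation -> 10.0
```

The asymmetry in cost between the two functions is exactly why training demands accelerated infrastructure while inference can run closer to the user.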
While training is crucial for developing accurate AI models, inference will become the dominant use case over time, according to Jamal. As AI models become more efficient through techniques such as transfer learning, the need for extensive retraining will diminish. However, ongoing model updates and refinements will still require a robust training infrastructure, Jamal pointed out.
“I’d expect the training S-curve to flatten as models become more efficient and specialized, and techniques such as transfer learning and few-shot learning become more prevalent,” he said.
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of the AI Infrastructure Silicon Valley – Executive Series event:
Photo: SiliconANGLE