AI + ML

NVIDIA Completely Re-Imagines The Data Center For AI

It is all about tighter integration of memory, CPUs, and accelerators to support trillion-parameter AI models.

"...the goal is to create a tightly integrated computational foundation to pursue the next wave of AI innovation: a trillion-parameter computer intelligence. Today's largest AI model is OpenAI's GPT-3, totaling 175 billion parameters for language processing and requiring over one thousand NVIDIA GPUs hosted on Microsoft Azure. The human brain has about 100 trillion synapses, roughly analogous to deep neural network parameters. If successful, the NVIDIA system would be only about 100 times smaller in scale than the human brain."
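The scale comparison in the quote can be checked with quick back-of-envelope arithmetic (treating synapse count as a rough analogue of parameter count, as the quote does; the figures below are taken directly from the quoted text, not from measurements):

```python
# Illustrative scale comparison using the figures cited in the article.
gpt3_params = 175e9      # GPT-3: 175 billion parameters
target_params = 1e12     # NVIDIA's trillion-parameter goal
brain_synapses = 100e12  # ~100 trillion synapses in the human brain

# How far today's largest model is from the trillion-parameter goal
print(f"Scale-up over GPT-3: {target_params / gpt3_params:.1f}x")

# Remaining gap between the trillion-parameter goal and brain scale,
# which is where the quote's factor-of-100 figure comes from
print(f"Brain vs. trillion-parameter model: {brain_synapses / target_params:.0f}x")
```

Running this shows a roughly 5.7x jump needed beyond GPT-3, and the factor of 100 separating a trillion-parameter model from the brain's synapse count.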