NVIDIA introduced a supercomputer that could pass for a Mac Mini

These days, one of the biggest topics is artificial intelligence. Actively developing it normally requires incredibly expensive, large and power-hungry computers. Or does it? NVIDIA has introduced a computer that can handle the job and does not take up much space on the desk.

NVIDIA DIGITS

Project DIGITS is equipped with the new NVIDIA GB10 Grace Blackwell superchip, which offers a petaflop of AI computing power for prototyping, fine-tuning and running large AI models.

Performance and efficiency with the GB10 chip

A key part of Project DIGITS is the GB10 superchip. It combines the latest Blackwell GPU with Tensor Cores and the Grace CPU. This system-on-a-chip (SoC) is the result of a collaboration with MediaTek, which contributed to its energy efficiency and performance. Project DIGITS comes with 128 GB of unified memory and up to 4 TB of NVMe storage, and can run language models with up to 200 billion parameters. Two DIGITS units linked together can even handle models with 405 billion parameters.
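How does a 200-billion-parameter model fit into 128 GB of memory? Roughly speaking, only at low precision. A back-of-the-envelope sketch (my own arithmetic, not an official NVIDIA figure):

```python
# Rough check: how large models fit in 128 GB of unified memory
# depending on the number of bits stored per weight.
def model_memory_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight storage in GB (1 GB = 10^9 bytes)."""
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# A 200B-parameter model quantized to 4 bits per weight:
print(model_memory_gb(200, 4))   # 100.0 GB -> fits in 128 GB
# The same model at 16-bit precision:
print(model_memory_gb(200, 16))  # 400.0 GB -> would not fit
```

This is weights only; activations and other runtime overhead come on top, which is why quantization matters so much on a machine like this.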

NVIDIA DIGITS, front view

Software

Project DIGITS runs on NVIDIA DGX OS, a Linux-based platform. It is designed to support the entire AI model development process, from prototyping to deployment on cloud and data-center infrastructure. Through the NVIDIA NGC catalog, users can access a rich library of AI tools and frameworks such as PyTorch and NVIDIA NeMo.

Availability and cost of supercomputer

The supercomputer will be available from May, with prices starting at $3,000. Availability in the Czech Republic is not known at the moment.

NVIDIA DIGITS, rear view

What is a petaflop

FLOPS (floating-point operations per second) is a unit that tells you how many floating-point calculations a computer can do in a second. So when we say one petaflop, we are talking about a quadrillion calculations per second (10¹⁵).

Imagine that you have a calculator and do one calculation per second on it (e.g. 2+2, 5×3, etc.). If you wanted to do a quadrillion (10¹⁵) calculations, it would take you about 31 million years of non-stop work, not counting the time to change the batteries.
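The calculator comparison is easy to verify yourself; one second of 1-petaflop work, done by hand at one calculation per second, looks like this:

```python
# One second at 1 petaflop = 10^15 operations.
ops = 1e15
seconds_per_year = 60 * 60 * 24 * 365.25  # average year incl. leap days
years = ops / seconds_per_year
print(round(years / 1e6, 1))  # 31.7 -> about 31-32 million years
```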


But a supercomputer with a performance of 1 petaflop handles it in a single second. Such enormous power is used, for example, for weather simulations, modeling complex biological processes, designing new medicines and the calculations required for space research. And, of course, for AI.

Will it run ChatGPT?

Now you’re thinking that it must be able to run ChatGPT. Well… almost. According to what I got directly from ChatGPT, GPT-3 has 175B parameters, so DIGITS could handle that. RAM is the bigger problem, because just to store ChatGPT 3 you need 350–700 GB of memory, and several TB more for training.


With the ChatGPT 4o model, we are reportedly at a model on the order of 1T parameters, around 4 TB of RAM for operation and 8–12 TB of RAM for training. Just to give you an idea.
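The RAM ranges above follow directly from bytes per parameter. A rough sketch of where those numbers come from (my arithmetic, not official figures):

```python
# Rule of thumb: memory for model weights = parameters x bytes per parameter.
# Runtime overhead (activations, KV cache, optimizer states) comes on top.
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB."""
    return params_billion * bytes_per_param

print(weights_gb(175, 2))          # 350.0 GB -> 175B model at 16-bit precision
print(weights_gb(175, 4))          # 700.0 GB -> 175B model at 32-bit precision
print(weights_gb(1000, 4) / 1000)  # 4.0 TB  -> a ~1T-parameter model at 32-bit
```

Training then multiplies this several times over for gradients and optimizer states, which is where the 8–12 TB estimate comes from.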

Would you pay $3,000 for NVIDIA DIGITS?

Source: NVIDIA, StorageReview

Source: www.svetandroida.cz