NR1-P

AI Accelerators – Inference (Data Center)

Information

NeuReality's novel AI-centric inference platform, NR1-P, is built around a new type of AI Server-on-Chip (SoC). The platform comprises 16 dual-slot FHFL FPGA cards, each with 25 GbE connectivity, installed in a 4U PCIe carrier server. Each NR1-P delivers true linear scalability running AI-over-Fabric with native Kubernetes (K8s) support, providing up to 3x greater performance per dollar per watt compared to the latest NVIDIA A100- or T4-based systems.
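Because the platform is orchestrated natively through Kubernetes, inference capacity on an NR1-P node can be requested like any other device resource. The following is a minimal sketch, not taken from NeuReality documentation: the device-plugin resource name "neureality.ai/nr1" and the container image are hypothetical placeholders used only to illustrate how a workload would be scheduled onto the accelerator.

# Minimal sketch: scheduling an inference pod onto an NR1-P node.
# Assumptions (not from this page): the device plugin exposes the accelerator
# as "neureality.ai/nr1", and "my-registry/inference-app:latest" is a
# placeholder image for the inference workload.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="nr1-inference-demo"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="my-registry/inference-app:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    # Ask the scheduler for one NR1-P device on the node.
                    limits={"neureality.ai/nr1": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)

With a device plugin in place, the scheduler treats the accelerator as countable node capacity, so standard Kubernetes tooling handles placement without any vendor-specific scheduler.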
