wiki:CommercialLimulus

Version 5 (modified by admin, 12 years ago) (diff)

corrected peak flops calculation

Nextlink Limulus Specifications

The Nextlink Limulus cluster is a commercial version of the Limulus Project Design. The base system has 16 cores (one master node and three worker nodes, each with a quad-core Intel i5 processor), 16GB of RAM, a 256GB SSD, an internal Gigabit Ethernet switch, a single 850W power supply, front-panel video, USB, power LED, and power switch, and automatic/smart power control for the worker nodes.

Recent tests of the Nextlink Limulus delivered a remarkable 200 GFLOPS running the Top500 HPL benchmark (58% of peak; see Raw HPL Results; these are double-precision CPU GFLOPS). The base system cost works out to roughly $25/GFLOP.
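As a quick sanity check, the quoted $25/GFLOP figure implies an approximate base-system price. This is only a sketch from the two numbers above; the spec does not state an exact price.

```python
# Implied base-system price from the quoted figures (approximation only).
hpl_gflops = 200.3        # measured double-precision HPL result
price_per_gflop = 25.0    # approximate quoted cost in $/GFLOP
implied_price = hpl_gflops * price_per_gflop
print(f"Implied base system price: ~${implied_price:,.0f}")
```

This lands at roughly $5,000 for the base system, consistent with the "somewhere around $25/GFLOP" estimate.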

Detailed Specifications

Host Node (1)

  • Intel i5-2500K CPU @ 3.30GHz (four cores)
  • 4GB RAM PC3 12800
  • On-board Intel 82579V Gigabit Ethernet
  • Additional Intel Gigabit LAN NIC
  • Crucial RealSSD C300 256GB
  • 1 x PCI Express 2.0 x16
  • 2 x PCI Express x1
  • 1 x PCI Slot
  • External Ports
    • 1 x DVI
    • 1 x HDMI
    • 6 x USB 2.0
    • 2 x USB 3.0
    • 1 x eSATA 3Gb/s
    • 1 x Optical S/PDIF Out
    • 5 Audio Ports

Compute Nodes (3)

  • Intel i5-2400S CPU @ 2.50GHz (four cores)
  • 4GB RAM PC3 12800
  • On-board Intel 82579V Gigabit Ethernet
  • External Video, USB, PWR LED, PWR Switch

Switching

  • HP 1410-8G Switch
  • All cables included

Power

  • Single 850W 110-120V Power Supply
  • Compute Nodes power internally controlled by Host Node

Software

Performance

  • 200.3 GFLOPS (Raw HPL Results)
  • 58% of peak: (3.3 GHz * 4 cores * 8 FLOPS/cycle) + (2.5 GHz * 12 cores * 8 FLOPS/cycle) = 345.6 GFLOPS peak
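The peak calculation above can be reproduced with a short script. The 8 FLOPS/cycle figure is the rate used in the calculation above (double-precision AVX on these Sandy Bridge cores); the host contributes 4 cores at 3.3 GHz and the three compute nodes contribute 12 cores at 2.5 GHz.

```python
# Reproduce the theoretical-peak and HPL-efficiency figures quoted above.
host_peak = 3.3 * 4 * 8     # 3.3 GHz x 4 cores x 8 FLOPS/cycle = 105.6 GFLOPS
worker_peak = 2.5 * 12 * 8  # 2.5 GHz x 12 cores x 8 FLOPS/cycle = 240.0 GFLOPS
peak = host_peak + worker_peak
efficiency = 200.3 / peak   # measured HPL result vs. theoretical peak
print(f"Peak: {peak:.1f} GFLOPS, HPL efficiency: {efficiency:.0%}")
```

This confirms the 345.6 GFLOPS peak and the roughly 58% HPL efficiency reported for the system.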