
Welcome to The Limulus Project!™

What is the Limulus Project?

Limulus is an acronym for LInux MULti-core Unified Supercomputer. The goal of the Limulus Project is to create and maintain an open specification and software stack for a personal workstation cluster. Ideally, a user should be able to build or purchase a small personal workstation cluster using the Limulus reference design and low-cost hardware. In addition, a freely available turn-key Linux-based software stack will be created and maintained for use on the Limulus design. A Limulus is intended to be a workstation cluster platform where users can develop software, test ideas, run small-scale applications, and teach HPC methods. Consult the draft of the Linux Magazine Limulus Article (pdf) for more information. The Limulus idea actually came from the 2005 AMD Value Cluster Project.

More information (and commercial availability): submit Questions, join the Limulus Announce List, or follow the Twitter feed above.


Project Status


February 12, 2013: Added a Limulus benchmark page. General availability of commercial units -- any day now.

November 05, 2012: The commercial Limulus (with Ivy Bridge processors) will be available for purchase the week of November 12th (contact Nexlink). The final software will be ready as well. If you are attending SC12, you can see a Limulus at Seneca booth 2334. There will be live demos on Tuesday (2:30-3:30 PM), Wednesday (11 AM-noon), and Thursday (11 AM-noon).

June 26, 2012: The software stack is ready to go. All development is based on Scientific Linux 6.X (SL6). Both SRPMs and RPMs will be officially available in July. You can peek at the RPMs here, but there may be some changes before the official release date. Documentation on how to install these on top of SL6 will be made available as well.

December 9, 2011: We broke 200 GFLOPS running HPL on the production-ready Limulus! The latest version uses the Intel i5-2400S processor. Check out the Commercial Limulus Cluster page for the details. (Note: the results are double precision CPU GFLOPS.) Plus, you can win your very own Nexlink Limulus system; see the product page for details.

September 15, 2011: We are working with a vendor to manufacture Limulus! If you are interested, join the Limulus Announce List or follow the Twitter feed. We are going to make a small pre-production run of ten (Sandy Bridge) systems in the October time frame. We will entertain questions on the announce list.

March 24, 2011: New annotated views of the Limulus Case.

March 23, 2011: We are evaluating Sandy Bridge (i5-2400S) systems now that the motherboards are fixed. There is now an announce mailing list (see above).

January 18, 2011: The software is almost done. Expect the repository to go up any day now. Check out the automatic power control test on the Limulus Software page. The new case needs a few small adjustments, which should be done by the end of February.

November 10, 2010: I will be at SC10, booth number 1731, with the improved case. From the outside it looks the same, but I added a new and better RAID drive case (4 drives in 3 bays). The new one (Xclio SS034) has key locks on the drives and blue LEDs to match the rest of the case (pictures coming soon). Using a 4-drive arrangement allows for either RAID drives for the head node or a drive for each node. The final design will include an SSD boot drive plus 4 drives in a RAID 10 configuration, which could easily get you to 4 TB of storage. If you are at the show, stop by for a demo and a peek inside. If you are not attending, keep an eye on this page for more updates. The software distribution is almost done as well!

September 14, 2010: The improved case is almost done (really). Who knew sheet metal could be so much fun? Lots of RPMs are done. I also decided on the next system to build with the new sheet metal. I'm going to use quad-core AMD 910e processors (2.6 GHz @ 65 Watts) for the three nodes and a six-core AMD 1055T (2.8 GHz @ 125 Watts) for the head node. That will be 18 cores, one box, one power supply: cool, quiet, and fast. It should be done in the October time frame.

August 5, 2010: The improved case is almost done. We have been moving on both the hardware and software fronts and expect some announcements this fall. There are also a number of Xeon 3400 series Micro ATX motherboards now available (from Supermicro, Asus, and Gigabyte) that support dual Intel GigE and ECC memory.

November 17, 2009: The Limulus case (with running hardware) is on display at SC09. Take a look at the Limulus Case section for some pictures and new video.

April 30, 2009: I have been busy with the hardware aspects of the project. The goal is to improve packaging. The result is that the entire Norbert Limulus Cluster now fits in a single case measuring 23x20x8.5 inches (58x51x21 cm). It uses a single power supply, is very power efficient, quiet, and looks rather cool. There will be more news real soon. The first version will use quad-core CPUs for a total of 16 cores per system. Also, there has been some concern about using non-ECC memory. Based on current memory technology, actual tests, and some research papers, I do not think this is going to be a concern for most users. More information when the tests are done.

October 22, 2008: I have posted the new results, and the issue with jumbo frames was due to flow control in the NICs. I used ethtool to turn off flow control and was able to see the expected performance boost for larger frames; the variability seemed to get worse, however. I'll be writing this up and posting it on ClusterMonkey real soon. Also, Open-MX 1.0.0 has been released.
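
For reference, here is a minimal sketch of that flow control tweak (assuming a Linux host with the standard ethtool utility, root privileges, and a NIC named eth0; the interface name is an assumption, not something specified on this page):

{{{#!python
import subprocess

# Hypothetical interface name; substitute the GigE NIC under test.
iface = "eth0"

# Turn off pause-frame autonegotiation and rx/tx flow control on the NIC,
# the same ethtool adjustment described in the note above.
subprocess.run(
    ["ethtool", "-A", iface, "autoneg", "off", "rx", "off", "tx", "off"],
    check=True,
)

# Print the resulting pause settings so the change can be verified.
subprocess.run(["ethtool", "-a", iface], check=True)
}}}

Note that this setting does not persist across reboots, so it would need to be reapplied (for example, from a network-up script) before each benchmark run.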

August 13, 2008: I recently posted some interesting benchmarks for Norbert. I have been testing Open-MX over GigE and found lower throughput when jumbo frames are used. I think it could be the low-cost switches. More testing later this week. I just noticed that the latest version of Open-MX (0.9.1) can run over standard frame sizes (1500 bytes). More testing indeed.

July 31, 2008: At long last, an update! I have been busy with the software. Our goal is to have a Fedora 8 spin ready by the end of August 2008. We will also be adding our complete set of spec files plus our build/install scripts to the source tree, and we will host the RPMs and SRPMs at that time. Starting in September we will be working on packaging issues (i.e., a case). In the meantime, you can get more background from a pre-print (a draft, actually) of my Limulus Article (pdf) in the November 2007 issue of Linux Magazine (it has not made it to the Linux Magazine website yet).

January 1, 2008: The Trac page is up and running. The next step is to get the Wiki and Milestones completed. There is now a hardware manifest and pictures for the Norbert Limulus Cluster.

Contacting/Joining The Limulus Project

Once we have the project site completed, we will be posting more information on how to participate. The main contacts are:

The project is hosted by Basement Supercomputing.

Limulus Trademarks and Licensing

The Limulus Logo and the Limulus Project are trademarks of Basement Supercomputing. For commercial use of project trademarks or for sponsorships, contact Basement Supercomputing.

Software used by the project is subject to license terms and copyrights as per the authors. Please consult individual packages for more information.
