Interview with Stefan Eilemann


This is an interview with Stefan Eilemann, Senior Software Engineer and Consultant at Tungsten Graphics, who ran the multi-monitor display at WWDC07. Stefan was responsible for porting Chromium to Mac OS X, and is now working at Eyescale Software developing and deploying Equalizer, a framework for distributed, scalable graphics software. He also kindly provided the images.

What can you tell us about the hardware requirements, basically a “How to set up a multi-monitor display”?

(Stefan) The hardware requirements are pretty straight-forward. What we had
at the show is standard Apple hardware:

  • 9 Mac Pros, dual dual-core processors, ATI X1900 cards
  • 18 30-inch Cinema HD Displays
  • Gigabit Ethernet
  • One Mac Pro as the master, which runs the applications

Depending on the target application, this can vary. You’ll definitely
want a faster interconnect like InfiniBand or 10-GigE for some
applications.
On the other hand, you’ll probably get away with slower graphics cards
if you are not putting two 30-inch displays on each of them.

The main reason for this is the way Chromium works. It intercepts the OpenGL command stream from the application by providing a custom OpenGL library. The command stream is packed and sent to all cluster nodes, which unpack it and render it locally with a modified view frustum. Depending on how much dynamic data is updated during a frame, you might hit a network bottleneck. The Google Earth demo, with its constant texture updates, was definitely network-bound.
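As a rough illustration of the “modified view frustum” idea (this is a sketch, not Chromium’s actual implementation; the function name and the wall layout are made up), each node can derive its tile’s asymmetric frustum from the full wall frustum and the tile’s position in the grid:

```python
def tile_frustum(left, right, bottom, top, cols, rows, col, row):
    """Sub-frustum for one tile of a cols x rows display wall.

    The near and far planes stay the same on every node; only the
    lateral extents are subdivided, so each machine renders the shared
    scene through its own window of the full view volume (e.g. by
    passing the result to glFrustum).
    """
    w = (right - left) / cols   # width of one tile in frustum units
    h = (top - bottom) / rows   # height of one tile

    return (left + col * w,          # tile left
            left + (col + 1) * w,    # tile right
            bottom + row * h,        # tile bottom
            bottom + (row + 1) * h)  # tile top

# Example: bottom-left tile of a hypothetical 6x3 wall
l, r, b, t = tile_frustum(-1.0, 1.0, -0.75, 0.75, 6, 3, 0, 0)
```

Because every node shares the same near plane and eye position, the tiles join seamlessly into one large image.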

How much time did it take to set up the current installation?

(Stefan) We started the project about one month before the show. The initial bring-up using X11 as the windowing layer took about a week, thanks to some previous work on the Mac OS X port. We quickly realized that supporting only X11-based applications would not be enough for a good demo, so I invested some more time to get Carbon-based applications, namely Google Earth and Amira, up and running.

The actual setup on the target hardware was done by Apple. My guess is that it took less than a week of actual work, mostly installing the OS, setting up networking and adapting the Chromium config files.

What were the motivation, problems and solutions in bringing the software to the Mac?

(Stefan) The motivation is pretty straight-forward. Chromium enables unmodified
applications to run on a display wall. The display wall enables a couple
of new uses:

  • Collaboration in larger groups, no crouching behind a single screen
  • Lots of detail while keeping the bigger picture – see detail at any point of the model, without zooming in and out
  • Presentations for larger groups
  • Color quality – no back projection needed (which also saves space)

Any insights on what sort of problems work well in this type of setup and when Equalizer might be a better choice?

(Stefan) To keep it simple: if your application works fine through Chromium,
use Chromium. If not, consider porting to Equalizer. Equalizer enables
you to run 100% correctly in VR environments and to use scalable rendering. It is also possible to write parallel applications with Chromium (and people have done that), but I think that Equalizer now provides a more structured, flexible and faster way to create scalable OpenGL-based applications.

Equalizer applications typically have much lower bandwidth requirements than
Chromium, since all the rendering happens locally and only high-level data, like
the camera position, is sent over the network.
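To make the contrast concrete, here is a minimal sketch (not Equalizer’s actual wire protocol; the function name and message layout are made up) of the kind of per-frame message such an application might send: a camera pose of seven floats, a few dozen bytes, versus the megabytes of geometry and texture data a streamed command stream can carry:

```python
import struct

def pack_camera(position, orientation):
    """Pack a camera pose (xyz position plus a quaternion orientation)
    into a fixed 28-byte little-endian message.

    When every cluster node already holds the model locally, this tiny
    message can be all that needs to cross the network each frame.
    """
    return struct.pack('<3f4f', *position, *orientation)

# 7 floats * 4 bytes = 28 bytes per frame, regardless of model size
msg = pack_camera((0.0, 1.5, -4.0), (0.0, 0.0, 0.0, 1.0))
```

Compare that with re-sending even a single 1024×1024 RGBA texture, which is already 4 MiB.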

If we could also compile a list of applications that are known to work, that would be great.

(Stefan) For the show we prepared Google Earth, Amira and VMD. These three applications addressed three different user groups: geographic information systems, scientific visualisation and molecular modelling.

During the show some people asked us to try other applications, and we also got LigandScout (Java-based protein modelling) and SOAP (GPS satellite analysis) running. We were especially happy to see LigandScout work, first because it is Java-based, and secondly because it performed really well, even with big data sets.

