Alice 4 IRIS GL Implementation
Overview
libgl handles window creation and configuration, event queueing, and interpreting drawing commands for the demos on the Alice 4 device.
Reference Material
We used the IBM AIX Graphics Library pages as a reference. They document GL usage on IBM's IRIS GL-compatible graphics hardware, so they're not 100% applicable, but they helped us a lot. We also found a scanned PDF of the IRIS Programming Guide somewhat helpful.
In the end we also used some screenshots in Google Image searches, our own memories of these demos, and just guessing at the meaning of functions from the code. This project included a fair amount of code and design archaeology.
Project Bounds and Limitations
We implemented only the functions required by the demos we wanted to run. These included opening a window, handling essential events, and only those drawing operations actually used in the demos' source.
Our GL falls far short of a production IRIS GL replacement. We neglected a lot of the object/display-list system, and we didn't implement any special blend modes, alpha test or depth test modes, stencil planes, or overlay planes.
We also use a lot of constant-sized arrays that would probably blow up if anyone tried to write a new IRIS GL program against our library.
Basic operation
The application configures the window system with a series of calls to set parameters like whether the app requires double-buffering (doublebuffer) and whether the framebuffer should be interpreted as RGB colors (RGBmode) or a color table, and then the window is opened on the screen with an optional title bar (winopen).
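A rough sketch of that startup in classic IRIS GL style follows. The traditional idiom opens the window first and applies mode changes with gconfig(); whether this libgl requires gconfig() or cares about the exact call order is an assumption here.

```c
#include <gl/gl.h>          /* classic IRIS GL header name; may differ here */

/* Hedged sketch: open the one fullscreen "window" and request a
   double-buffered RGB framebuffer.  gconfig() applies the mode changes
   in classic IRIS GL; whether this libgl needs it is assumed. */
static void open_demo_window(void)
{
    winopen("demo");        /* the title is ignored on the Alice 4 */
    doublebuffer();         /* ask for front and back buffers */
    RGBmode();              /* interpret the framebuffer as RGB colors */
    gconfig();              /* make the mode changes take effect */
}
```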
The app manages events by indicating that it wants to receive them (qdevice with enumerants like REDRAW, ESC_KEY, or MOUSEX). Continuous “valuators” are exposed through a direct call (getvaluator) or by specifying that the valuator should be read and coupled to certain events (tie). Finally, applications test the event queue (qtest) and read and process any events that exist (qread).
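A sketch of event handling in that style might look like the following. The enumerants are the ones named above; the classic qread signature fills in a short and returns the device number, which this libgl may or may not match exactly.

```c
#include <gl/gl.h>
#include <gl/device.h>

/* Hedged sketch of queueing devices and draining the event queue. */
static void queue_input(void)
{
    qdevice(REDRAW);                 /* ask for redraw events */
    qdevice(LEFTMOUSE);              /* ask for button (tap) events */
    tie(LEFTMOUSE, MOUSEX, MOUSEY);  /* report pointer position with each tap */
}

static void drain_events(void)
{
    short value;

    while (qtest()) {                /* anything waiting in the queue? */
        long device = qread(&value); /* pop one event */
        switch (device) {
        case REDRAW:
            /* ...mark the scene as needing a redraw... */
            break;
        case LEFTMOUSE:
            /* value is 1 on press, 0 on release */
            break;
        }
    }
}
```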
Each app sets graphics state for things like lighting, materials, the lighting model, and the current color. In most of the demos we cared about, this state is set up once, in some initialization code. In production apps, though, it might be specified every frame or more frequently.
The application draws on the screen by beginning a line or one of various kinds of polygon, submitting vertices, and closing the primitive. Apps can also draw points (effectively pixels in our implementation, since we have no smoothing or antialiasing) or a parametric shape like a circle. Each vertex is transformed through one or two matrices (depending on the library's current mode), lit using the light and material parameters, projected, and clipped.
The platform implementation only accepts vertices within the 800-by-480-pixel viewport, so all primitives must be clipped to the viewport before rasterization.
When all drawing is complete, the application swaps the front and back buffers (swapbuffers).
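Putting those pieces together, a frame in one of the demos might reduce to something like the following sketch. bgnpolygon, v3f, endpolygon, and swapbuffers are classic IRIS GL names; which variants this libgl implements depends only on what the demos use.

```c
#include <gl/gl.h>

/* Hedged sketch of one double-buffered frame in the vertex-submission
   style described above. */
static void draw_frame(void)
{
    static float tri[3][3] = {
        { -1.0f, -1.0f, 0.0f },
        {  1.0f, -1.0f, 0.0f },
        {  0.0f,  1.0f, 0.0f },
    };
    int i;

    /* one filled primitive: begin, submit vertices, close */
    bgnpolygon();
    for (i = 0; i < 3; i++)
        v3f(tri[i]);         /* each vertex is transformed, lit, clipped */
    endpolygon();

    swapbuffers();           /* display the back buffer we just drew */
}
```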
Windows and configuration
In our implementation, we only support one fullscreen, 800-pixel-wide by 480-pixel-tall “window”, and the title is ignored. Color index mode (each “color” value looks up an RGB color in a table of colors) is emulated, as described below. Workstation functions like borderless windows with noborder, window stacking (e.g. winpop), and ringing the bell with ringbell are not supported, and those functions are stubbed out.
Events
Libgl manages event requests and an event queue based on the platform event implementation. We wrote an implementation of a subset of events for the FPGA simulator program and another for the handheld device itself. See below for more information on the platform interface.
Objects
IRIS GL supports recording some functions in a list which can be played back later. It calls these “objects”, but they will be more familiar to some as “display lists” in OpenGL.
An application creates an object using genobj, starts recording commands in the object using makeobj, and finishes recording with closeobj. The recorded functions can be played back with callobj.
An object can even contain a callobj, and that secondary object can be tagged with gentag and maketag and later replaced with objreplace, making a kind of hierarchical, editable model.
Many functions (not all) can be stored in objects, but we only stored function calls used in the demos.
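A hedged sketch of how those calls fit together is below. Object and tag handles are shown as plain longs; the real typedefs and the objreplace editing step are not shown.

```c
#include <gl/gl.h>

/* Sketch of the object ("display list") calls named above. */
static long wheel, car, color_tag;

static void build_objects(void)
{
    wheel = genobj();         /* reserve an object id */
    makeobj(wheel);           /* start recording commands into it */
    /* ...color and vertex calls for one wheel... */
    closeobj();               /* stop recording */

    car = genobj();
    color_tag = gentag();     /* reserve a tag for later editing */
    makeobj(car);
    maketag(color_tag);       /* mark this spot so objreplace can find it */
    /* ...commands that could later be replaced... */
    callobj(wheel);           /* objects can call other objects */
    closeobj();
}

static void draw_scene(void)
{
    callobj(car);             /* play the recorded commands back */
}
```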
Lighting and materials
IRIS GL only supports fixed-function lighting (no shaders!), where an empirical, per-vertex equation (“Phong” lighting) computes each vertex's color from the material and light parameters. Our implementation of the lighting equation is contained in the light_vertex function in our source.
We didn't implement any lighting and material parameters not used by the demos. We didn't code spotlights or fog. We even ignored the “local” lighting model, which forces a local viewer, because lighting is basically okay without it. (On those very early graphics workstations, using a local viewer came at a performance cost!)
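For reference, material and light setup in that fixed-function style looks roughly like the sketch below. DEFMATERIAL, DEFLIGHT, LMNULL, and the other tokens are classic IRIS GL; exactly which properties this libgl honors is only what the demos needed, so treat the details as an assumption.

```c
#include <gl/gl.h>

/* Hedged sketch of fixed-function lighting setup.  Property lists are
   float arrays terminated by LMNULL. */
static float shiny_red[] = {
    DIFFUSE,   0.8f, 0.1f, 0.1f,
    SPECULAR,  1.0f, 1.0f, 1.0f,
    SHININESS, 30.0f,
    LMNULL
};

static float white_light[] = {
    LCOLOR,   1.0f, 1.0f, 1.0f,
    POSITION, 0.0f, 1.0f, 1.0f, 0.0f,   /* w = 0: directional light */
    LMNULL
};

static float light_model[] = {
    AMBIENT, 0.2f, 0.2f, 0.2f,
    LMNULL
};

static void setup_lighting(void)
{
    lmdef(DEFMATERIAL, 1, sizeof(shiny_red) / sizeof(float), shiny_red);
    lmdef(DEFLIGHT,    1, sizeof(white_light) / sizeof(float), white_light);
    lmdef(DEFLMODEL,   1, sizeof(light_model) / sizeof(float), light_model);

    lmbind(MATERIAL, 1);    /* make the definitions current */
    lmbind(LIGHT0,   1);
    lmbind(LMODEL,   1);
}
```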
Primitives
The demos use a few different code paths for drawing polygons, triangles, and lines. The IRIS Graphics Library went through a few revisions over its lifetime, mostly trying different programming styles to balance performance and ease of use for applications, and we think these different geometry functions reflect those revisions.
They all funnel through the process_polygon, process_tmesh, and process_line functions in our implementation. Those functions all transform and light vertices using transform_and_light_vertex and then clip and project the resulting primitive to the viewport before passing the primitive to the platform implementation.
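Two of those styles, as they appear in classic IRIS GL, are the older whole-polygon call and the newer per-vertex calls. Which of them this libgl actually implements is only whatever the demos happened to use, so take this as an illustrative sketch.

```c
#include <gl/gl.h>

/* Older polf() takes the whole coordinate array at once; the newer
   bgnpolygon()/v3f()/endpolygon() style submits one vertex at a time.
   Coord is the classic IRIS GL float typedef. */
static Coord quad[4][3] = {
    { 0.0, 0.0, 0.0 },
    { 1.0, 0.0, 0.0 },
    { 1.0, 1.0, 0.0 },
    { 0.0, 1.0, 0.0 },
};

static void draw_both_ways(void)
{
    int i;

    polf(4, quad);        /* older style: one call, filled polygon */

    bgnpolygon();         /* newer style: explicit begin/end */
    for (i = 0; i < 4; i++)
        v3f(quad[i]);
    endpolygon();
}
```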
Miscellaneous libgl Features
Color-index mode
The earliest IRIS workstations supported only “color index” mode, where the “color” value for a pixel actually looked up an RGB color in a table before display. We implement this by looking up the RGB color at the time the index is provided and storing only the RGB color. However, this ended up being a problem for the “arena” demo.
IRIS GL has a mask that controls which bits of the framebuffer are updated by drawing commands. Arena uses that mask to draw the HUD into the upper nybble and the arena geometry into the lower nybble. The colors from 0 to 15 are normal colors, but arena defines every color-table entry from 16 to 255 based only on the upper nybble. For example, entries 16 to 31 are all set to yellow, so it doesn't matter what's in the lower nybble.
Therefore arena can set the drawing mask to 0xF0, clear the color buffer to 0, and draw the HUD at any time. It can then set the drawing mask to 0x0F and draw the normal arena geometry every frame, knowing that the HUD is always drawn “over” the arena geometry because of the color table. This gives a kind of crude overlay functionality.
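In IRIS GL terms the trick looks roughly like this, using the classic writemask, mapcolor, color, and clear calls. This is only a sketch of the technique; as described below, our libgl doesn't honor the write mask this way, so we edited arena instead.

```c
#include <gl/gl.h>

/* Sketch of the nybble-overlay trick: color-table entries 16..255
   depend only on the upper nybble, so HUD pixels win regardless of
   what the maze later writes into the lower nybble. */
static void setup_hud_colors(void)
{
    int i;
    for (i = 16; i < 32; i++)
        mapcolor(i, 255, 255, 0);   /* indices 16..31 all display as yellow */
}

static void draw_hud_once(void)
{
    writemask(0xF0);   /* drawing touches only the upper nybble */
    color(0);
    clear();           /* clear the upper nybble everywhere */
    color(16);         /* any index with upper nybble 0x1 shows as yellow */
    /* ...draw the HUD geometry... */
}

static void draw_maze_each_frame(void)
{
    writemask(0x0F);   /* drawing touches only the lower nybble */
    /* ...draw the arena geometry with indices 0..15... */
}
```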
Arena was one of the last demos we investigated. We had already written libgl assuming RGB colors, and it would have been a real pain to deindex colors in every drawing command. Instead we just edited arena to draw the HUD last, over top of the maze, every frame. Oh, well.
PUPs (Pop-up menus)
IRIS GL had a facility for pop-up menus that an application could invoke, typically on a right mouse click (newpup, addpup, and dopup). Both “jello” and “bounce” originally used menus to change lighting, toggle line edges, and toggle automatic rotation, among other things.
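A hedged sketch of how such a menu might be built and popped follows. newpup and dopup follow the classic IRIS GL shapes, with dopup returning the chosen item's value; the addpup(menu, label, value) signature shown is an assumption and may not match either real IRIS GL or this libgl.

```c
#include <gl/gl.h>
#include <gl/device.h>

/* Hypothetical menu in the style jello and bounce used. */
static long options_menu;

static void build_menu(void)
{
    options_menu = newpup();
    addpup(options_menu, "Toggle lighting", 1);  /* assumed signature */
    addpup(options_menu, "Toggle rotation", 2);
}

static void maybe_pop_menu(long device, short value)
{
    if (device == RIGHTMOUSE && value) {    /* right-button press */
        switch (dopup(options_menu)) {      /* returns the chosen value */
        case 1: /* ...toggle lighting... */ break;
        case 2: /* ...toggle rotation... */ break;
        }
    }
}
```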
We implemented a lot of the menu functionality. We could toggle the states in jello and bounce, but finally agreed that the hand-held, tablet-like quality of the device didn't lend itself to old workstation-style menus, and turned it off.
Tracing
The first time an unimplemented function is called, libgl prints a warning but continues. This allowed us to see early on in development how far we were getting, and to make some guesses about what we absolutely needed to do and what we could safely ignore.
If trace_functions is enabled, every implemented function prints its own name and parameters, so that we can more easily debug a complete stream of commands. Just knowing which command we were in during a crash (e.g. from the output of “bt” in the debugger) was not sufficient.
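The mechanics might look something like the following sketch; the macro names, and the idea of wrapping each stub this way, are hypothetical and only illustrate the warn-once and trace-every-call behaviors described above.

```c
#include <stdio.h>

/* Hypothetical sketch of the two tracing behaviors. */
static int trace_functions = 0;

#define WARN_UNIMPLEMENTED() \
    do { \
        static int warned = 0; \
        if (!warned) { \
            fprintf(stderr, "libgl: %s is not implemented\n", __func__); \
            warned = 1; \
        } \
    } while (0)

#define TRACE(fmt, ...) \
    do { \
        if (trace_functions) \
            printf("%s(" fmt ")\n", __func__, ##__VA_ARGS__); \
    } while (0)

/* example stubs using the macros */
void ringbell(void)     { WARN_UNIMPLEMENTED(); }
void linewidth(short w) { TRACE("%d", w); /* ...record the width... */ }
```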
Platform layer interface
We abstracted the implementation of actual events and rasterization away from the bookkeeping performed by libgl.
The event interface is specified in libgl/event_service.h. An implementation is responsible for handling platform specifics and translating platform-specific events into libgl events. We wrote an event implementation for both the FPGA simulator over the network and the final hardware platform. On the final platform, touchscreen taps are provided as MOUSEX, MOUSEY, and LEFTMOUSE, and the calculated and smoothed tilt from the accelerometer is provided as the DIAL0 and DIAL1 valuators. We altered the demos to use those valuators, so they still have the style of IRIS GL applications.
An implementation of the rasterizer interface, specified in libgl/rasterizer.h, must translate drawing commands (such as setting the pattern, line width, and depth buffer enable, and drawing primitives) into platform specifics. We implemented the rasterizer interface for the FPGA simulator, for the hardware device, and also as a reference rasterizer. The reference rasterizer was designed to be quick to write and easy to maintain, and we used it to debug the other two implementations and perform some quality tests.
The rasterizer interface allows drawing a batch of primitives with a variable count, and those primitives can be lines, triangles, triangle fans, points, or triangle strips. In all three of our rasterizer implementations we converted points to a pixel-sized quad, converted lines to a long, thin quad stretching between the line endpoints, and converted all triangle types to independent triangles. We also never ended up passing anything but single lines and triangles from libgl to the rasterizer implementations.
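The actual declarations live in libgl/rasterizer.h; as a purely hypothetical illustration of the shape described above (state setters plus a batched draw entry point over screen-space lines and triangles), an interface of this kind might look like:

```c
#include <stdint.h>

/* Hypothetical sketch only; the real declarations in
   libgl/rasterizer.h differ. */
typedef struct {
    float x, y, z;          /* screen coordinates, z for the depth buffer */
    uint8_t r, g, b;        /* lit vertex color */
} screen_vertex;

typedef enum { PRIM_LINE, PRIM_TRIANGLE } prim_type;

void rast_set_pattern(const uint16_t pattern[16]);  /* polygon stipple */
void rast_set_linewidth(float width);
void rast_enable_depth_test(int enabled);
void rast_draw(prim_type type, const screen_vertex *verts, int count);
void rast_swap(void);                                /* present the frame */
```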
Notably, for the final implementation on our hardware, we rotated the screen 180 degrees to improve viewing. (In retrospect the LCD seems designed to be viewed somewhat from above, as if it were in a vending machine interface.) We also rotated touch, tilt, and graphics primitives in the platform implementations. We didn't have to change libgl.
In software/platform.mk, you can choose one of several different combinations of event and rasterizer implementations, but the three most useful pairings are
- network events and network rasterizer (full FPGA simulator operation),
- hardware events and reference rasterizer (events from the hardware but rasterize to an offscreen buffer and save to a file for comparisons and debugging), and
- hardware events and hardware rasterizer (use the final Alice 4 device events and FPGA rasterizer).