I joined SVS when it was 34 people strong in 1995. The company grew to over 100 employees before Boeing acquired it. During my tenure at Boeing-SVS I was involved in many different projects, mostly related to directed energy programs. Most of the folks at SVS came from the Air Force and had participated in projects like the Airborne Laser Lab and other 'star wars' endeavors.

What I particularly enjoyed about SVS was the diverse set of projects that I was able to work on. These included building the prototype trackers for the Airborne Laser, the recording system for the Advanced Tactical Laser, a tracking demonstrator for the C-RAM program, a jet fuel reclamation station, a video-based tester for the Lightweight Exoatmospheric Projectile, and an internal R&D project for a solid-state laser.


Senior Embedded Software Engineer/Real-Time Software Engineer


Airborne Laser

I was the software lead for the pointing and tracking system on the Airborne Laser. As part of that role, I was the prime developer of the video trackers. The most demanding tracker had a source CCD that ran at 5 kHz with eight data outputs. We used 32 PPC processors to collect and process each frame. Each PPC reported its results up a tree; the root processor computed the final track error and reported it to the optical controls. There were five other trackers, some for target detection and some for laser guidance. For development and test, I created a series of Tcl/Tk user interfaces for command and control of each tracker.
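The tree-format reduction across the PPC nodes can be sketched roughly as follows. This is a minimal illustration, not the actual flight code: the report structure, the weighted-centroid combine rule, and all names are my own assumptions.

```c
#include <assert.h>

/* Hypothetical per-node track report: a centroid error estimate
   plus a weight (e.g., total intensity seen by that node). */
typedef struct {
    double x_err, y_err;  /* centroid error, pixels */
    double weight;        /* illustrative confidence/intensity weight */
} TrackReport;

/* Combine two child reports into one, as a parent node in the
   reduction tree might (weighted average of the errors). */
static TrackReport combine(TrackReport a, TrackReport b) {
    TrackReport r;
    r.weight = a.weight + b.weight;
    r.x_err = (a.x_err * a.weight + b.x_err * b.weight) / r.weight;
    r.y_err = (a.y_err * a.weight + b.y_err * b.weight) / r.weight;
    return r;
}

/* Reduce an array of leaf reports pairwise, log2(n) levels deep,
   leaving the final track error in reports[0]. n must be a power of 2. */
TrackReport reduce_tree(TrackReport *reports, int n) {
    while (n > 1) {
        int half = n / 2;
        for (int i = 0; i < half; i++)
            reports[i] = combine(reports[2 * i], reports[2 * i + 1]);
        n = half;
    }
    return reports[0];
}
```

The point of the tree shape is that each level halves the number of reporters, so the root sees a single fused error rather than 32 raw streams.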


Phalanx for C-RAM

I really enjoyed this project because I got to see things blow up as a result of my labor (I've also seen PCBs ignite, which is quite a different adventure). Watching a mortar explode under rapid Phalanx fire at night is quite spectacular! A small group from SVS rapid-prototyped a mortar tracking system and deployed it to the C-RAM tests. The system consisted primarily of a gimbal and a single IR camera. The controls accepted a radar cue for the physical position of the mortar; once the target was in the field of view, the system tracked the mortar until it was destroyed or until impact. Our tracking algorithms worked in both day and night scenarios.

The software ran on a PPC under VxWorks. Most of the control algorithms were written in Matlab and code-generated for the real-time system. The PPC controlled the gimbal via TCP/IP and interfaced with a home-grown FPGA-based tracker. Image data was collected on a COTS storage system for post-processing.
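As a rough illustration of the kind of TCP command interface this involves, the sketch below packs a gimbal pointing command into network byte order before it would be written to the socket. The 8-byte layout, the field names, and the millidegree units are assumptions for illustration only, not the actual interface definition.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>

/* Hypothetical gimbal command: azimuth/elevation in millidegrees.
   Real systems define this layout in an interface control document. */
typedef struct {
    int32_t az_mdeg;
    int32_t el_mdeg;
} GimbalCmd;

/* Serialize a command into an 8-byte big-endian buffer, as the PPC
   might do before send()ing it over the gimbal's TCP connection.
   Explicit packing avoids struct-padding and endianness surprises. */
void pack_cmd(const GimbalCmd *cmd, uint8_t buf[8]) {
    uint32_t az = htonl((uint32_t)cmd->az_mdeg);
    uint32_t el = htonl((uint32_t)cmd->el_mdeg);
    memcpy(buf, &az, 4);
    memcpy(buf + 4, &el, 4);
}
```

Packing field by field, rather than sending the struct raw, keeps the wire format identical whether the sender is a big-endian PPC or a little-endian development host.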

Advanced Tactical Laser

My primary role on ATL was software lead. The team consisted mostly of self-starters, so my "administration" job was straightforward. Most of my leadership effort focused on requirements definition, enforcing consistent code development across different subsystems, and verifying the interfaces between subsystems.

My technical role on ATL was developing the recording subsystem, which recorded the various video sources and data streams on the aircraft. There were five cameras in the optical train, although none as stressing as ABL's 5 kHz camera. I designed a generic engine to handle the recording of a single data stream. In some cases, multiple instances of that engine ran on one processor; in others, a processor handled a single camera input. The engine used configuration files to specify the frame rate, image size, and image format of each video sensor.
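A config-driven engine like that might look something like the sketch below. The key names, the struct fields, and the key=value format are invented for illustration; the actual ATL configuration files are not described here.

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical per-sensor configuration; one engine instance
   would be constructed from one of these. */
typedef struct {
    int frame_rate_hz;    /* frames per second */
    int width, height;    /* image size, pixels */
    int bytes_per_pixel;  /* image format, e.g. 2 for 16-bit mono */
} SensorConfig;

/* Parse one "key=value" configuration line into the struct.
   Returns 1 if the key was recognized, 0 otherwise. */
int parse_line(SensorConfig *cfg, const char *line) {
    if (sscanf(line, "frame_rate_hz=%d", &cfg->frame_rate_hz) == 1) return 1;
    if (sscanf(line, "width=%d", &cfg->width) == 1) return 1;
    if (sscanf(line, "height=%d", &cfg->height) == 1) return 1;
    if (sscanf(line, "bytes_per_pixel=%d", &cfg->bytes_per_pixel) == 1) return 1;
    return 0;
}

/* Sustained data rate the recorder must handle for this sensor;
   useful for deciding how many engine instances fit on one processor. */
long bytes_per_second(const SensorConfig *cfg) {
    return (long)cfg->frame_rate_hz * cfg->width * cfg->height
           * cfg->bytes_per_pixel;
}
```

Keeping the engine generic and pushing sensor specifics into configuration is what lets the same code run once per processor for a heavy camera or several times per processor for light ones.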




Notice: images acquired from various government public-access web sites.