I've spent the last month or so engaged in a job hunt. That job hunt has now come to a close, as on September 26, I start work at my new place of employment.
During this job hunt, a few places have asked me to write sample essays or articles as part of the interview process. The following is one of those essays. The company in question was very big on using OODA Loops as part of their overall design philosophy and wanted to know if I could wrap my mind around the concept.
While writing the essay, I realized that OODA Loops are easy to apply to software design, as the concept merely describes what most good programmers already do. The Observe, Orient, Design, Act methodology greatly reduces the time wasted in redesigning and rewriting code. If done properly, the actual programming is a matter of following the pseudocode that's already been worked out.
The following assumes some familiarity with OODA Loops, which can easily be gained in a few minutes by reading the Wikipedia article on OODA Loops.
The main goal of any User Interface should be to allow the user to accomplish their desired goal as quickly as possible. Despite the typical use of OODA Loops to disrupt the opposition, be they business or military, the same principles can be used to enhance the user experience and create the illusion of having fully anticipated the user's desires.
The OODA Loop cycle was designed to describe a military engagement. It's tempting to depict the end user as an Enemy Combatant. In this scenario, the process of getting inside the enemy's OODA Loop would consist of anticipating the User's actions and presenting interface options that lead them in the direction you desire.
This model has a few flaws. First, it describes an adversarial relationship between the user and the developer. A typical development cycle already has enough stresses between these groups without creating more in fundamental design models. Second, the military analogy breaks down: in many respects the User's goals and the developer's are the same. It would be a bit like describing how to help an enemy pilot bomb your own base.
A more useful metaphor would be to view the User as your own soldier. The goal of a UI developer becomes streamlining the user's entire OODA Loop, eliminating bottlenecks and avenues for error.
Example Project, The Report Interface
I unknowingly used many OODA Loop concepts in developing a report interface while at FinancialCampus. The "second generation" interface created by a subsequent developer was rejected by users in part because its development ignored the OODA Loop concept. Let's examine both development processes in terms of OODA Loops.
The UI goal was to allow a Compliance Officer (CO) to access and act upon the exam status of his or her Brokers and determine who had not completed their NASD Mandated Continuing Education.
First, I needed to Observe. This meant gathering data on what the COs did with their data and how they needed it presented. The main tasks could be broken down as follows:
• Get User feedback on existing reports
• Review sample reports created manually by end users
• List NASD legal requirements
• Record data on technical expertise of end users
• Record data on interfaces with which users are already comfortable
• Get samples of any data files used by the end user
• List other data available in the Learning Management System (LMS)
Simply acting upon this data would have resulted in either dozens of reports, or a handful of monolithic reports with a complex array of options. COs would have become trapped in the "Orient" stage of their own OODA loops. My own Orient loop was initially a bottleneck until I reevaluated how I was approaching the data. Instead of creating a single report to make everyone happy, I decided to create multiple reports, each one designed to meet the needs of a discrete group of users.
My Orient phase consisted of determining how the potential data sets interacted and who needed that data. I also evaluated who wouldn't mind a few extra bits of data in their report. The end result was a large chart on my wall, listing each piece of data in the system. Legally required data was highlighted. Color coded lines connected the data fields to the customer reports. The color coding flagged the data as "Legal," "Bureaucratic" or "Optional."
Orient: Data is placed into three categories:
• Legal: Information the Compliance Officer MUST present to comply with law
• Bureaucratic: Information the CO needs to present for internal business or workflow reasons.
• Optional: Data present in the LMS but not needed in any reports. This was addressed later.
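The "wall chart" from the Orient phase can be sketched as data. A minimal Python illustration follows; the field names and category assignments are hypothetical stand-ins, not the actual LMS schema:

```python
# Each LMS data field tagged with its category from the Orient phase.
# Field names here are invented for illustration.
FIELD_CATEGORIES = {
    "exam_pass_date": "Legal",
    "broker_crd_number": "Legal",
    "exam_score": "Bureaucratic",
    "course_title": "Bureaucratic",
    "times_exam_taken": "Optional",
}

def fields_for_report(wanted_categories):
    """Return the fields a report must carry, given the categories it covers."""
    return sorted(
        field for field, category in FIELD_CATEGORIES.items()
        if category in wanted_categories
    )

# A compliance report must include everything "Legal", plus whatever
# "Bureaucratic" fields the customer's workflow requires. "Optional"
# fields are deliberately left out until a CO asks for them.
compliance_fields = fields_for_report({"Legal", "Bureaucratic"})
```

Encoding the chart this way makes the category of every field explicit, so a report can't accidentally omit a "Legal" field or drag in "Optional" clutter.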
Once I knew who needed what, I began the Design phase of merging reports. My target was six to eight reports and I met that goal. In retrospect, I could have automated much of this if I had used Semantic Data models and defined relationships such as "Farmers Insurance Needs the exam pass date" and "NYLIC does not want the exam score." This would have allowed me to condense the needed reports faster.
• Distill format data into the smallest number of possible formats. This process resulted in six basic reports with two to six options each.
• Order the reports by likely number of users
• Define a login process using the IntraLearn user data, as all Compliance Officers were also students. Each report required the appropriate login, and was restricted to the data for a specific company.
By this point, Acting was easy. I simply sat down and began writing the reports I'd specified. I was tempted to engage in another Observe exchange with the COs at this stage, but decided against it, as I did not want to become trapped in an Analysis Paralysis loop, or to allow Feature Creep to delay the deployment of the initial reports.
• Write the login process for the Reporting Interface.
• Write the report that will be used by the most Compliance Officers. Place that report in a menu that allows for the addition of new reports.
• E-mail the COs about the new reports as they are added.
The Other Loops
During this outer OODA Loop, an overlapping loop was taking place to deal with the "Optional" data. The additional data available, such as the number of times a user had taken an exam, was presented to the Compliance Officers to determine if any of them needed or wanted that data. Additional data was incorporated into the reports as appropriate. For example, the "Times Exam Taken" counter was added to the "Full Data" Excel and CSV export because the only customers who wanted that data were already importing the data into PeopleSoft.
A new OODA Loop began as each report was deployed. CO Feedback initiated the Observe phase. That data was then evaluated for potential use in modifying the existing reports in an Orient phase. If the data was deemed appropriate for use in modifying the report, a Design phase occurred, followed by an Act phase to get the updated report to the end users.
The Resulting Reports
As a result of the above OODA Loops, the Compliance Officer's OODA Loop looked like this.
Observe: Read the brief descriptions of the reports and their options
Orient: Select the report that best met their data needs.
Design: Select the options presented for each report. This typically consisted of such options as the level of detail provided, sort order and if rows will be per rep or per rep/course combination.
Act: Print or download the resulting report, fire, suspend or reprimand Brokers as appropriate. One report even allowed the COs to e-mail the recalcitrant reps with a bulk e-mail or individually, warning them of their pending deadlines.
Two more reports were added within three years, as new customers came on board with new reporting needs.
The "Second Generation" Interface
What did the user rejected "Second Generation" interface get wrong?
At the then CTO's Direction, the developer conducted minimal Observation. Only the data available in the database was reviewed. No other end user requirements or input was taken into account. Due in part to the small data set gathered in the Observation phase, minimal Orientation was possible.
The rationale given for this minimal Observation was that the exiting reports were "truncated" and "failed to take advantage of the data IntraLearn offered." It was further stated that FinancialCampus needed to "Show off what we can do to compete with the big boys."
The Design phase consisted largely of the developer recording every data manipulation he could imagine. Designing and Acting were not separate phases, as the reporting system was written while the developer conceived of new options. This required the developer to completely scrap the existing code twice and begin again from scratch.
As a result, the Compliance Officer's OODA Loop was disrupted. The new system had a steep learning curve, and CO's were unable to separate the OODA Loop stages. They became trapped in the Orient phase, unable to process the Observations presented in the reporting system's "Help" documentation. Few were able to Successfully Design their reports.
Many CO's called or e-mailed to complain about the complexity of the system. FinancialCampus' own CTO, VP and Owner were unable to understand or use the new interface, but still ordered a global switch to the new system.
The above examples show that the OODA Loop process can be used to create a User Interface that optimizes the User's own OODA Loop, even if the user is unaware of the concept. OODA Loops also help compartmentalize the development process, reducing churn and wasted effort later in the product's life cycle. If the Observe, Orient and Design phases have been carried out, Acting will be little more than using the Pseudo-Code to write the final application.