(It's drinking time, excuse the incoherency!)
Well, it's that time of the year--time to start writing evaluations on everyone. By amazing coincidence, Josh McLaughlin
seems to be in a similar predicament, and we had an interesting little chat on the subject.
There's an interesting passage in a book about Air Force Colonel John Boyd (Mr. OODA Loop), where the author, Robert Coram, explores some of the ins and outs of personnel evaluation reports (hereafter referred to as "evaluations", though they're called "Officer Evaluation Reports" in the Army, "Fitness Reports" in the Navy, you get the drift).
Coram astutely noted that officers or senior non-commissioned officers caught up in scandals will often present excerpts from their official evaluation reports, and nearly all the time, those excerpts will be glowing. That's because there's a secret code and numerous unwritten rules involved in evaluations. Few of these actually appear in the official regulations, but service-wide, we're all compelled to follow suit, lest we waste all that ink in evaluation reports for nothing.
Across the board, evaluations are grossly inflated, sometimes to laughable extremes. Take the evaluation report for a non-commissioned officer in the US Army. NCOs are rated from 1 to 5 (best to worst) in various attributes, such as leadership ability, physical fitness, and technical skills. Rating a non-commissioned officer a "3"--nominally average--in any area is nearly a kiss of death for their career. In practice, a "2" is what passes for average performance. In many ways, this echoes the grading system in many schools, where a "B" is average and a "C" indicates poor performance.
Officers are victims of inflation as well. Once upon a time, I had a brand new pilot--straight out of flight school--arrive at the unit. As per the norm, he received ten days to find a new house and move in. Once he had moved in, he began in-processing. After turning in his flight records to the standardization pilot, he was informed that it would be a few weeks before he could start his initial mission qualification--some pilots had annual evaluations to complete, while others were already finishing up their mission qualification. Not to mention, there were all sorts of training missions being conducted by the unit; he'd have to wait his turn before hopping in the cockpit.
Meanwhile, it was winter, and Christmas was just around the corner. At many bases, business winds down during the holidays, and this particular pilot took a few weeks of leave. As he was returning, he was informed that he'd be switching units, as the battalion needed to cross-level some people. That meant that he needed an evaluation.
The new pilot was a great officer, don't get me wrong, but over the span of three months, he really hadn't had the opportunity to prove himself. When filling out the evaluation, I saw three blocks, labeled "Best Qualified", "Fully Qualified", and "Do Not Promote". Since the new pilot hadn't even flown yet, I couldn't say he was the "best qualified" officer, so I selected the less superlative "Fully Qualified".
A week later, I had to explain to my battalion commander why I had selected the middle block. I had almost ruined this young pilot's career because I had said that he was merely "fully qualified"! No one taught me the unwritten rules!
As Josh from al-Sahwa pointed out (I think Reach 364 said this also once), the language in evaluations is often full of grossly over-the-top rhetoric, often sounding hilariously like propaganda--a typical evaluation can often make an officer sound as if he or she is single-handedly disrupting, dismantling and destroying al-Qaeda and its extremist allies. (Damn, I gotta use that one).
Nevertheless, there are numerous subtle indicators of an officer's true performance and potential that stick out during promotion boards. You won't find any of these in the official regulations governing evaluations; they're passed on from officer to officer. That, of course, makes me wonder whether any of it is actually true, but such is the nature of evaluations, I guess.
Most commanders know that there is a huge difference between an officer who does "a magnificent job" and one who is "in the top ten percent". In the former case, the superfluous adjectives are something the promotion boards will simply gloss over. In the latter, there is a specific, quantifiable metric--"top ten percent"--which the boards will recognize as the mark of a solid performer. Again, no official source ever says this; it's just another "unwritten rule" one finds out as a lieutenant.
However, the most baffling unwritten rules are found in the Army's non-commissioned officer evaluation reports. Whereas the officer reports are written in the form of paragraphs and narratives, the evaluations for the Army's sergeants are written as "bullet statements". Only, there are a few odd quirks to these bullet statements. Here's a typical example:
o served as the range NCOIC for a small arms range, qualifying 230 Soldiers over the span of three days; recognized by the battalion commander as the best small-arms range this year
Okay, that's a relatively generic one. Note the odd quirks. It's a "bullet" statement, so it begins with a...uh...lowercase "o". The verbiage after the "bullet" begins with a lowercase letter and ends with no punctuation. If I were back in second grade, a nun would have beaten me senseless for writing anything resembling this. (On a related note, I had wondered why the writing in this blog had declined over the last few weeks. I guess it's all these damn NCO evaluations!)
Focus: Evaluations are full of bizarre unwritten rules and subtle nuances. I'm certain I forgot some of the better ones. Please fill me in.
Bonus: I bet the private sector is just as bad with evaluations.
In other news, back to drinking. Thank God my Snuggie allows me to stay warm and drink beer at the same time...