Performant Jitters and Other Programmer Talk

Last week we heard from Mike Pope, a technical writer and editor at Microsoft, about how mathematical terms evolve in common usage. Now Mike introduces us to some unusual jargon in the computer programming community.
The thing with a language community — programmers, say — is that it can be hard to tell whether they're using jargon because it's precise, or just for its own sake. Or for efficiency. Or maybe just to have fun. Let's see what you think.
When programmers describe something as performant, is it just a fancy way to say "fast," or maybe "memory efficient"? Grumpy editors will occasionally think so and strike the term as jargon. But no; performant, while certainly an insider's term, is useful precisely because it's somewhat fuzzy. Obviously, a performant program is one that performs well, but what does that mean? Is it fast? Measured how — fast to load, very responsive, or what? If it means small, does that mean it uses little memory, or little disk space, or is it a small package to deliver over the Web? Of the many times I've seen people ask exactly what performant means, the best answer came from a programmer I used to work with: "It means whatever the user thinks good performance means." A kind of profound non-precision, and very useful.
If you want to get a nice long thread going someday, ask programmers what the difference is between persist (used transitively) and save. (Here's a taste.) Everyone understands that you save a document or a database record. Programmers will also persist settings or persist state. Is that just a fancy way to say save? Broadly speaking, probably yes. However, there is (or can be) a subtle flavor to persist of capturing and saving something that's fleeting. (In the thread I linked to, one programmer says, "It makes a transient instance persistent.") But many programmers would dispute this, and perhaps it just sounds cooler to say that you need to persist something. Even so, I doubt that even programmers use a camera to persist their precious memories.
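For readers who'd rather see it than argue about it, here is a minimal sketch, in Java, of what persisting a setting can look like; the class name, property name, and file name are purely illustrative. The value starts out as something fleeting in memory, and persisting it writes it somewhere that will outlive the program.

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.Properties;

    public class SettingsDemo {
        public static void main(String[] args) throws IOException {
            // So far, this setting exists only in memory; it vanishes when the program exits.
            Properties settings = new Properties();
            settings.setProperty("theme", "dark");

            // "Persisting" it: the transient value is captured and written to disk.
            try (OutputStream out = new FileOutputStream("settings.properties")) {
                settings.store(out, "user settings");
            }
        }
    }

Whether you call that saving or persisting is, of course, exactly the argument in that thread.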
Here's part of a sentence I ran across recently: "Chip architectures with a small number of registers require the jitter to be very judicious." Require the what? This term virtually requires the interlocutors to know some etymology. First: a compiler converts (compiles) a programmer's code into machine code, which is what the computer actually knows how to run. In various scenarios (and skipping a lot of technical detail), sometimes it's useful to delay this process until just before the code actually needs to run. This requires a special kind of compiler, which is referred to as a just-in-time compiler. Do you see what's going to happen here? The just-in-time compiler becomes the JIT compiler, which becomes the JITter, which becomes the jitter, and soon you need a judicious one.
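If you want to watch a jitter at work, the HotSpot Java VM (to pick one example) will narrate its just-in-time compilation when asked. The little program below is only a toy that gives the jitter something worth compiling; run it with the -XX:+PrintCompilation flag and the VM prints a line each time it compiles a method to machine code. The exact output varies from one VM to the next.

    // Run with: java -XX:+PrintCompilation HotLoop
    public class HotLoop {

        // A "hot" method: called often enough that the just-in-time
        // compiler decides it's worth turning into machine code.
        static long addUp(int n) {
            long total = 0;
            for (int i = 0; i < n; i++) {
                total += i;
            }
            return total;
        }

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < 100_000; i++) {
                sum += addUp(1_000);
            }
            System.out.println(sum);
        }
    }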
I love jitter because it illustrates a common agent-noun pattern of appending -er to something. Other examples: what do you call a method that gets a value? A getter. The one that sets a value? A setter. These types of constructions, I think, we have to attribute to efficiency: it's just way faster to type jitter, getter, and setter than their more editorially sanctioned equivalents.
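For completeness, here's the classic shape of the pattern in Java; the class and field names are just placeholders.

    // A plain class with one field, plus its getter and its setter.
    public class Widget {
        private String name;

        public String getName() {           // the getter: returns the value
            return name;
        }

        public void setName(String name) {  // the setter: stores a new value
            this.name = name;
        }
    }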
Suppose your jitter isn't performant and your setter won't persist its value. You, my friend, are horked, which is a programmer term (one of many) for the end state following a series of unfortunate events (to quote Lemony Snicket). Horked seems to be a comparative youngster (allowing for the modest age of all computer jargon). One person postulates that the term derives from the cartoon character Ren Höek of "The Ren & Stimpy Show." Whether that's true or not, I think we can safely speculate that horked is not used for precision, or to sound grand, or for efficiency. Just pure fun.