Ironically overlong exposition on shortforms.

Disclaimer:  This post is all about steno stuff.  If you’re not a steno, or if you are a steno who’s had a long day at the edit station, you could safely ignore this one.

I taught myself steno, about 12 years ago, using Phoenix Theory.  Although I had a couple of mentors, they didn’t write Phoenix or theories even remotely like Phoenix.  So when it came to altering my main dictionary, I basically never did it.  I even had a “second” main dix with any changes I made, so my “main” main dix was as canned as the day I first installed it.  I was, for at least five years, what they call on the forums a pure Phoenix writer.

In my earliest court jobs, I even stroked out counsel names in full, and used /STPHAO and /SKWRAO for speaker colloquies, even though that made no sense to me – because that was just “the right thing to do”.

When I started captioning, I began introducing shortforms into my writing, but I did so very carefully and infrequently.  I loved sports captioning and there’s just no way to caption an Olympic marathon when you’re stroking “Haile Gebrselassie” in full every 30 seconds*.

*Does not apply to Helen Case and Traccee Hunter.

Captioning BBC World News, I realised the odd shortform for a Russian politician or a Kazakhstani geographical feature wouldn’t hurt. And only a fool would CART third-year statistics or applied chemistry lectures without incorporating some helpful briefs.

The thing is, like other Phoenix writers I know, I actually felt a bit guilty, like I was corrupting the dictionary or something.  The canned dictionary is so massive (I think 175,000 words or something?) and so comprehensive, there’s little need to ever add anything from a vocabulary point of view.  Even the most complex scientific or medical words are in there, often in two or three stroking iterations, and there are countless times I’ve phonetically stroked an unknown word, and then opened my eyes to see it has tranned perfectly.  I don’t think it’s ever let me down.  That confidence-boosting property is as valuable to me today as it was when I first started writing.

Non-Phoenix writers tend to hate on it because of how stroke-intensive it is.  And it is.  It’s definitely not the most efficient way to write.  But I believe (having edited for writers using probably 20 other theories) it is the most failsafe.  If you properly know the theory, it will not let you down with word boundaries or homonym issues.  Most of the mistakes I make are because it’s not natural for me to think in an American accent.  I have trained myself fairly well to stroke unknown words “Americanly” so they tran, but obviously for most of the day I think in my regular Australian accent, and sometimes I’m slow on the switch.  The first thing I did in my “second” main dictionary (which at that time functioned as my default first job dictionary) was to change all the American sounds I could, e.g. replacing /TKAUG with /TKOG.  But other than that quite specific issue, I can’t report a complaint with Phoenix.

I guess its realtiming near-perfection was why I was always so reluctant to “mess it up” by creating shortforms.  But since moving to Hong Kong, I’ve started shortforming my arse off (used in the steno sense only, not in terms of my actual arse – that’s definitely wideformed since moving here).  Or trying to anyway.  It still feels unnatural.  But I’m open to it at least.

I’m really glad I waited this long (12 years after “graduating”) to start briefing.  There have been many occasions in that time when I’ve felt I knew enough about my dictionary and writing style, and how software operates and interacts with other external software and hardware, to change my whole writing style – but I would have been wrong.  It’s only really the past year that I’ve felt I’ve done enough varied work in my own career, and had enough solid interaction with writers (mediocre, outstanding and everywhere in between) schooled in a huge assortment of theories, to feel totally confident in tackling this.  (People who have years on my 12 and are shaking your heads at my conviction, hold your peace – it’s too late now!  Or advise me!)

I don’t think it’s anything to do with being in Hong Kong; more to do with being back in court after five years in broadcast TV.  It’s my opinion that a broad shortforming habit is pretty dangerous for captioning – for anyone, even newly graduated writers who learned a realtime-ready theory, but especially our old-school colleagues who began working before CAT.  I trained a few 20-year veterans making the switch from CR to captioning, and it was a challenge.  Not just the dictionaries, but the brains.  The most common refrain heard in the captioning training room wasn’t what you would expect (“Naomi Robson is so shiny in real life!”).  It was, when defining new shortforms, “Oh, that word boundary conflict will never arise, let’s just DO IT”.  Mate, we’re shorthand writers, not dictionaries.  We can’t predict what word boundaries will arise.  We have a better chance in court where things are somewhat more controlled, but even then I don’t feel the threshold is high enough to comfortably make shortforms involving word parts. (I love Eclipse, but I’m gonna say the conflict option feature is crap and has blemished many a LiveNote screen I’ve been watching.)
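For the non-stenos still reading: the word-boundary risk above can be sketched in a few lines of code.  This is a toy illustration, not how Eclipse or any other CAT package actually resolves conflicts, and the outlines in it are invented for the example – the idea is just that a new one-stroke brief is risky whenever its strokes overlap the start of an existing multi-stroke outline, because the software can then split the incoming stroke stream two different ways.

```python
# Toy boundary check: outlines are tuples of strokes mapping to text.
# A candidate brief "overlaps" an existing outline if one is a prefix
# of the other -- the classic setup for a word boundary conflict.

def boundary_conflicts(dictionary, candidate):
    """Return existing outlines that prefix-overlap with `candidate`."""
    conflicts = []
    for outline in dictionary:
        if outline == candidate:
            continue
        shorter, longer = sorted((outline, candidate), key=len)
        if longer[:len(shorter)] == shorter:
            conflicts.append(outline)
    return conflicts

# Invented example dictionary: one two-stroke outline, one unrelated brief.
steno_dict = {
    ("TKEL", "GAIGS"): "delegation",
    ("SREPB",): "seven",
}

# A proposed one-stroke brief that reuses the first stroke of "delegation":
print(boundary_conflicts(steno_dict, ("TKEL",)))  # → [('TKEL', 'GAIGS')]
```

A real CAT system has far more context (timing, punctuation, translation history), but the core problem is the same: you can’t enumerate the overlaps in advance, because speech keeps inventing new stroke sequences.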

In live-broadcast captioning, there really is no chance to predict what will or won’t come up.  The rules of English (fluid as they are) don’t apply.  My biggest irritation with voice-captioning is that the voice-recognition software, when it’s stumped, will simply follow a few contextual rules and insert a similar word it THINKS fits in – with a “brain” that is totally unequipped to respond to nuances in language.  It means you get error-free captions in the sense that every word trans, but which don’t make any sense because there are random words sprinkled throughout.  At least a steno mistake, while undesirable, appears as gibberish and you’re aware it shouldn’t be there.  Shortforms comprising word parts tend to pop up in a similar way to the voice-captioning artificial “intelligence” – either jarringly offensive, or so inconspicuous as to fly past in the editing process.  Neither of which justifies the hundredth of a second saved in executing the shorter stroke, in my opinion.

To that end, though I’ve joined lots of shortform communities on Facebook and other forums, and they’re packed with ideas for words or phrases I’d love to brief, I can’t use most of the suggestions.  They’re either incompatible with Phoenix, or, in my mind, riskily “normal” strokes.  So while I’m now introducing plenty of shortforms and appreciating the break it gives my fingers, I’m doing it slowly and using ridiculous phonemic combinations.  I’m following a pattern where possible (using underused vowel blends, or asterisk-ing the initial consonant, etc etc).  But these combinations don’t come naturally yet, so I’ve currently got an ever-changing bunch of Post-it Notes stuck over the screen on my machine for easy reference.

Imagine your reporter turned up on the job looking like they had instructions for steno taped to their very machine.  BRINGING THE PROFESSIONALISM!

The scene at my dep this week…unsurprisingly I got asked for a read-back in the first five minutes…

Any other Phoenix writers out there with shortform experience?


4 Comments

  1. Karen says:

    Jade, I’m a Phoenix Theory writer too, but as you know, I’m still in my 120s, and not briefing that much… yet. I use mainly the shorts that are already in the dictionary to help me save a stroke or two. Otherwise, I’m still stroking everything out. I’m a bit surprised to hear that you didn’t use many briefs until recently! That is very interesting, as I thought being in captioning, where everything seems to be spoken so quickly, would mean you would have to have lots of briefs in order to keep up.
    P.S. I’m arriving in Hong Kong on the 9th. 🙂

  2. jadeluxe says:

    In sports captioning, I briefed heaps…but news stuff, I always found it too obstructive to a clean feed…

    Hey you’re here tomorrow! 🙂 PM’g you on FB…

  3. Glen Warner says:

    Hi, Jade!

    I add briefs to my dictionary — or more correctly, my “persistent job dictionary,” which goes by the unlikely name of “Glenz” — all the time … but before anything gets added to that dictionary, I search for the steno in my Phoenix Theory dictionary (also (largely) untouched). If I don’t find any entries in the Phoenix Theory dictionary that use those outlines, then the outline gets added to the Glenz dictionary.
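    [Editor’s note: Glen’s search-before-you-add habit is simple enough to sketch in code.  This is a hypothetical illustration, not digitalCAT’s actual behaviour; the dictionary names and the P*PD example are borrowed from his comment below, where he takes StenEd’s P-PD and adds an asterisk to make it safe against Phoenix outlines.]

    ```python
    # Sketch of "search the main dictionary before adding a brief":
    # an outline only goes into the personal job dictionary if neither
    # the theory dictionary nor the job dictionary already uses it.

    def add_brief(main_dict, job_dict, outline, text):
        """Add `outline` -> `text` to job_dict if unused; return True if added."""
        if outline in main_dict or outline in job_dict:
            return False  # outline already taken -- pick another (e.g. asterisk it)
        job_dict[outline] = text
        return True

    # Invented stand-ins for the Phoenix and "Glenz" dictionaries:
    phoenix = {"PRA*UFDZ": "preponderance of the evidence"}
    glenz = {}

    # StenEd's P-PD plus an asterisk, per Glen's example:
    add_brief(phoenix, glenz, "P*PD", "preponderance of the evidence")
    ```

    The point of the check is exactly Glen’s: a brief that shadows an existing theory outline silently breaks words you already write correctly, so the search has to come first.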

    As an “aside,” I have a copy of digitalCAT running under Wine on my Mac (see if you’re curious).

    digitalCAT has an application for manipulating dictionaries, called Dictionary Maintenance. Like Jade, I gather briefs and phrases from the various groups on Facebook. When I find one, I check to see if it conflicts with any existing outlines, then add it … and then (after a week or so) I will export the Glenz dictionary as an .rtf file and import it into my PC’s copy of the Glenz dictionary.

    Sounds complex, but it actually works!

    I also have several dictionaries available to search through for those times when I am stuck for a brief (and Facebook is unreachable, as it is as I write this). Those dictionaries include StenEd, DigiText, CRAH, StenoMaster, StarTran, Stenograph, and StenoTips (you know … from Keith Vincent’s Steno Tips page). I had a terrible time stroking the Phoenix Theory version of “preponderance of the evidence” (PRA*U-FDZ), so I grabbed the StenEd version (P-PD). Throw in an asterisk, and you’ve got a Phoenix Theory-ready brief! :o)
