We have a confession to make. And it’s probably something you need to come clean about too. Over the years, as much as we shouldn’t have, we’ve revelled in every major subtitling mistake. From the time they drastically downgraded the Queen’s mode of transport…
LOL at BBC Subtitles: Apparently the Queen travels in cabbages – haha! pic.twitter.com/N7cATZN6vj
— Amiga #GoBoris #UKIndependence (@AmigaRules) June 22, 2014
…to the morning they offered an alternative view on the Wimbledon queue…
— Dan Walker (@mrdanwalker) July 3, 2017
…and the time they accidentally offered viewers a particularly blunt summary of Cromer.
I see BBC Breakfast is not a fan of Cromer. pic.twitter.com/5ylVFalIZR
— Paul Hughes (@Mr_Hughesy) September 6, 2017
Okay, maybe we can forgive ourselves a smile for that last one. But the rest? It turns out there are actually a few reasons – well, two major ones – why we shouldn’t see subtitlers solely as gaffe-ridden Ed Miliband types.
The first: sometimes subtitlers’ mistakes simply aren’t. Remember when everyone thought the BBC used ‘the wrong subtitles’ during the US presidential inauguration last year?
That was, as Trump himself would say, fake news. It turned out that the mistake actually lay with a single faulty TV – one busted set had carried over subtitles from CBBC’s The Dumping Ground to BBC News.
— sharon Bell (@sharonbell111) January 21, 2017
Reason two: being a subtitler is really hard. They’re not just IT nerds accustomed to enjoying a fourth coffee break as a dust-covered computer spews out captions in the corner. And the Syncopatico system – the one that created subtitles for ‘Tweezer May’ in BBC comedy W1A? It, like any fully automated system, doesn’t actually exist.
The approximately 200 million words subtitled live every year on BBC channels? The captions that are now generated for almost 100% of programmes? They’re crafted by an army of subtitlers, people with one of the weirdest – and most challenging – jobs in TV.
Okay then, so how’s subtitling actually done?
Live TV shows – and a lot of pre-recorded ones too, especially the ones that get edited close to their broadcast time like Have I Got News for You – get their captions through a technique called ‘respeaking’.
It’s basically what you’re thinking: as the likes of newsreaders and presenters talk on TV, one of the designated 200 English-speaking subtitlers from across the globe will sit in front of a microphone repeating whatever’s said on air.
Doing this means a clear voice, free of any background noise, can be processed by specialised audio recognition software that generates captions on the screen. It’s a hybrid system – one that relies on both a computer and a subtitler.
So, if Matt Baker says “Hello and welcome to The One Show” against a backdrop of applause, there’ll be a single subtitler somewhere clearly repeating “Hello and welcome to The One Show” into a microphone. And, without the noise of the clapping, the computer can produce the caption “Hello and welcome to The One Show” on screen. Simple, right?
But here’s the thing: there’s more to subtitling than repeating Matt Baker. You’ve got to repeat what Alex Jones says. And John Sergeant. And, well, everyone on screen. That means you talk a lot: while a normal person speaks roughly 10,000 words a day, at their normal work rate a subtitler will fire that off in a couple of hours.
Luckily, they’re not asked to speak for two hours straight. Or even one. Subtitlers are restricted to respeaking a maximum of 15 minutes at a time. They work 900 seconds on, 900 seconds off.
And no, this isn’t through laziness. From the technical to the bizarre, subtitlers have to pull off a plethora of duties – all at the same time. And this mountain of multitasking forms a job so mentally taxing that any budding subtitler has to work for six months simply to reach trainee standard – after which they could easily fail their recruitment exam, as one in three do.
So, why do people flunk? Well, judge for yourself and imagine this: let’s say you’ve been fast-tracked through the subtitling training academy and been plonked in front of a microphone for your first live shift.
The first programme you’ve been assigned to: this edition of The One Show…
Here’s everything you need to manage – simultaneously and with a few percent margin of error – during the next 15 minutes.
Good luck. You’re going to need it.
Repeat everything like a machine
Remember when we said you have to talk clearly when respeaking? Well, your version of speaking clearly is different to what your voice recognition software considers clear. In fact, to get 100% accuracy, you’re basically going to need to talk like a computer.
“You really have to modify your voice,” explains subtitling trainer Calum Davidson, an expert captioner of eight years. “Every word needs to be clipped and you need to stress every syllable. Plus, you’ve got to space out every word. But then at the same time, you have to say everything quickly too.”
The result? Some very Microsoft Sam-style speech (check out Davidson’s expert respeaking example above) and some painful side effects at the end of the day: “You often go home with a very sore face. You’re speaking so much and in such an unnatural way for what adds up to hours,” says Davidson.
Put in the punctuation
Nope, the voice recognition software isn’t going to add the full stops and question marks automatically: that’s on you. And the fastest way of putting them in? Say them. That means if, for instance, The One Show’s Alex Jones says, “can we at least have our theme tune?” then you will literally have to say “can we at least have our theme tune question mark”.
Just a warning: after a long shift, this vocal tic might hang around longer than you’d like, according to Davidson: “After shifts often some of the re-speakers might go down the pub and literally say ‘Do you want a pint question mark’.”
Probably best to wait for somebody else to get the drinks in, then.
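If you fancy seeing the spoken-punctuation trick in code, here’s a toy sketch in Python. To be clear: the real respeaking software is proprietary, so the command names and the function here are entirely made up for illustration – but the principle (swap spoken commands for symbols) is the one described above.

```python
# Toy example: convert spoken punctuation commands into actual symbols.
# The command names are hypothetical; real respeaking software is proprietary.
SPOKEN_PUNCTUATION = {
    " question mark": "?",
    " exclamation mark": "!",
    " full stop": ".",
    " comma": ",",
}

def apply_spoken_punctuation(respoken_text: str) -> str:
    """Replace each spoken punctuation command with its symbol."""
    caption = respoken_text
    for command, symbol in SPOKEN_PUNCTUATION.items():
        caption = caption.replace(command, symbol)
    return caption

print(apply_spoken_punctuation(
    "can we at least have our theme tune question mark"
))
# can we at least have our theme tune?
```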
Hear the homophones
They’re. Their. There. Not words that voice recognition software can easily tell apart. So, how do you get them captioned?
Although your computer will normally recognise “there”, you’ll have to use “they are” instead of “they’re”. And “their”? You need to program a voice shortcut (more on that below) or use a replacement word that’s as close to the verbatim as possible. And all that gets tricky if several ‘their’s or ‘there’s crop up in quick succession.
In other words, you’d have a nightmare subtitling that last paragraph.
Change the colour
As a fresh subtitling graduate, you’ve probably seen that subtitles change their hue to indicate a change of speaker. And the person in charge of this change is – you guessed it – you.
Mercifully, this bit is easy to pick up: while respeaking you’ll have a small keyboard to hand with a line of four coloured buttons – white, yellow, cyan and green. Press one of these and the text will change to the corresponding colour. Easy.
Just remember: the main speaker or presenter’s captions should appear in white (that’s the easiest colour to read on a black background) with the next three colours assigned to other speakers.
And if there are more than four people in a conversation? You go through the sequence of colours again, remembering which is allocated to each person. Providing you’re not asked to subtitle a Blazin’ Squad reunion interview, you should be fine.
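If it helps, the colour rotation can be modelled as a simple repeating cycle. Here’s a toy Python sketch – the colour order comes from the article, but the function itself is invented for illustration (in reality it’s a keyboard and a subtitler’s memory, not code):

```python
# Toy model of the speaker-colour rotation described above.
# Colour names and their order are from the article; the rest is illustrative.
COLOURS = ["white", "yellow", "cyan", "green"]

def colour_for_speaker(speaker_index: int) -> str:
    """Speaker 0 (the main presenter) gets white; extra speakers cycle round."""
    return COLOURS[speaker_index % len(COLOURS)]

# A five-person conversation: the fifth speaker (index 4) loops back to white.
for i in range(5):
    print(i, colour_for_speaker(i))
```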
Shift the subtitles around the screen
Turn on the TV right now, throw on the subtitles and the chances are they’ll be firing around the bottom of the screen. But this isn’t always the best place for captions.
— Portchy (@HeadlongCabbie) January 31, 2017
“This comes up a surprising amount,” says Calum. “For instance, if somebody is demonstrating how to chop an onion on a cookery show, but there’s a subtitle saying ‘THIS IS HOW YOU CHOP THE ONION’ right over the onion, it’ll block out the view and people could get the whole thing wrong.”
So, how would you prevent a nationwide epidemic of onion-related chopping injuries? Move the subtitles. By using yet another keyboard, you can shift the text to one of 22 lines on screen.
Why do you need so many? Well, in case you’re subtitling an interview for a news channel, for instance.
Think about it: the subtitles will already be raised slightly to avoid spilling onto the bottom ticker – and will have to be raised even higher if there’s a ‘breaking news’ alert on screen. At this point these elevated subtitles might be blocking the view of a reporter’s mouth, which is a big no-no for any subtitler – many deaf viewers rely on lip reading.
Just to add to that challenge, the subtitles you just moved away from the reporter’s mouth might cover the interviewee’s in the next shot. Which means you need to move them again. And again when the shot cuts back.
We never promised it would be easy.
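For the curious, the line-juggling can be sketched as a tiny bit of Python. Only the figure of 22 lines comes from above – the line numbering, the function and the blocked regions are assumptions invented purely for illustration:

```python
# Toy sketch of choosing one of the 22 caption lines while avoiding blocked
# regions. Line numbering (0 = top, 21 = bottom) and the blocked regions are
# assumptions for illustration; the article only says there are 22 lines.

def pick_caption_line(blocked: set[int], total_lines: int = 22) -> int:
    """Return the lowest on-screen line (highest index) that isn't blocked."""
    for line in range(total_lines - 1, -1, -1):
        if line not in blocked:
            return line
    raise ValueError("every line is blocked")

# Bottom two lines hold the news ticker; lines 14-16 cover the reporter's mouth.
blocked_regions = {20, 21} | {14, 15, 16}
print(pick_caption_line(blocked_regions))  # 19 - just above the ticker
```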
Program a few voice shortcuts
You might have noticed from your hours of avid subtitle viewing that it’s the monosyllabic words that throw up the most issues: ‘ants’ could be captioned as ‘ands’, for instance. And if a mistake happens once, it’s likely to come up again…
“One of the hardest things I’ve had to do is subtitle the Badminton Horse Trials a few years ago,” says Calum. “It’s a sport I know nothing about, so I spent ages researching the show, learning all the horses’ names, the jockeys, jumps and different moves. And when I went on air I realised the speech software didn’t recognise the word ‘horse’. It went out as ‘hearse’ the entire show. I had to work around it – I was saying phrases like ‘equine beasts’ and ‘this fine animal’ all the way through.”
To be fair, that could have played out much worse…
So apparently BBC needs to work on their translation for subtitles…? pic.twitter.com/K6cjgs0KZs
— BrahmKornbluth (@brahmkornbluth) February 1, 2014
Fortunately, subtitling software now comes with a hearse-proof solution. While a broadcast is going out, you can program a quick macro to fix any recurring slip-ups.
That means filling out a small form on the computer in front of you to create a voice shortcut. Instead of trying to say the word ‘horse’, you can just say ‘macro one’, for instance, and the software will substitute ‘horse’ for ‘macro one’.
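In spirit, the macro trick is just a text substitution. Here’s a hypothetical Python sketch – the function names are invented and the real form and software are proprietary, but the ‘macro one’ → ‘horse’ example is the one from above:

```python
# Toy sketch of the voice-shortcut ('macro') substitution described above.
# Function names are invented; real respeaking software is proprietary.
macros = {}

def add_macro(spoken_name: str, replacement: str) -> None:
    """Register a shortcut: saying spoken_name outputs replacement instead."""
    macros[spoken_name] = replacement

def expand_macros(respoken_text: str) -> str:
    """Swap every registered spoken shortcut for its intended word."""
    caption = respoken_text
    for spoken_name, replacement in macros.items():
        caption = caption.replace(spoken_name, replacement)
    return caption

add_macro("macro one", "horse")  # fixing the 'horse'/'hearse' problem mid-broadcast
print(expand_macros("and macro one number seven clears the jump"))
# and horse number seven clears the jump
```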
Easy to do by itself. Less so when you’re respeaking, adding punctuation, changing the text colour and position, and juggling everything else on this list.
Research. Do lots and lots of research
And do it before you go on air, if you can. What for? Well, for every one of your broadcasts you need to anticipate what tricky words might come up.
Preparing for that is the best way to prevent a mondegreen (mishearing a phrase and repeating a similar-sounding version – think the Jimi Hendrix lyric “excuse me while I kiss this guy/the sky”).
Adnan Januzaj has been told he can leave Manchester United for £10million.
Hopefully BBC's subtitles take note. pic.twitter.com/vXbBmjd2mY
— Blog of the Net (@BlogOfTheNet) August 4, 2016
So, before a football game he’s subtitling, Davidson learns the names of all the players and programs shortcuts for them all. And if John Craven comes on The One Show to talk about a certain bacteria, he’ll learn everything he can about that bacteria, plus what Craven’s been up to recently, to avoid any slip-ups before going on air.
But there’s a key problem with this: you won’t have time to be completely prepared. Raking through possible words that could crop up over eight hours of TV is an absurdly large task. So often subtitlers compensate live on air. While doing everything else they’re scouring the web on the given topic, calculating if any phrase might cause problems.
And, no pressure, but if you’re really cut out for subtitling you should be able to do this with mental space to spare – “I know people who can sit there and do their shopping list while on air,” says Davidson.
Actually notice what’s happening on screen
As you can hear above, following what’s going on is mind-meltingly difficult because, when you’re respeaking, you’re essentially talking over somebody else. You start half a second after they speak, meaning the two of you are talking simultaneously.
But why do you really need to work out what’s being said through the chatter? It’s a major way to anticipate what tricky words are on the horizon of the conversation, and to work out where the subtitles need to be placed on screen – if you can’t tell a speaker is talking about a certain object that’s on screen, you won’t realise your subtitles are in its way.
Worse still, if you can’t hold your attention you might be doing more work than necessary. Say Final Score falls within your subtitling shift: during recaps of the weekend’s sporting scores, the results being read out are often already displayed on screen – you don’t want to caption something viewers can already see and confuse them. And, let’s face it, you don’t need to make this job any harder.
Don’t forget to breathe
To keep up with what’s happening on screen, the best method is to fill your lungs when the speaker on-screen does the same. “You never choose when you breathe when you’re respeaking,” explains Davidson. “The speaker on air gives you your pause for breath – it’s all up to them.”
That means you might have a big problem if you’re subtitling Huw Edwards. “He pauses in the middle of the sentence and in really awkward places!”
However, sometimes there’s more to worry about than Huw Edwards suffocating you. Sometimes you’ve got to avoid laughing on air. And if you’re really focussing on what somebody is saying, like pro subtitlers do, it’s easier said than respoken.
“John Prescott used to make me laugh because he’s so difficult to subtitle – the man never finishes a sentence!” laughs Davidson. “He’s very difficult to understand and you’re trying to follow it, you’re trying to make it readable, but you can’t edit a politician. If I made John Prescott speak perfectly then the viewer’s going to have the impression that he’s making grammatical sentences.”
What would happen if a normal person became a subtitler?
Subtitlers themselves say their job isn’t easy. But nobody would ever admit their job is actually a breeze – especially if they only do it for 15 minutes at a time, right?
So, what would happen if a layperson tried it? Somebody with no voice coaching and only five minutes of subtitling training? Say, this writer? How would I manage subtitling that tiny segment of The One Show?
Short answer: terribly. Longer answer: I fell at the first hurdle, achieved a ‘failed transmission’ from Davidson and lost the right to poke fun at any subtitling mistake again.
My (senseless) captions in full read:
Fortunately, that wasn’t actually broadcast as I was subtitling a piece of old footage. Unfortunately, about two words in I realised respeaking requires a very specialised mindset. One I didn’t have.
Why? Respeaking means continually talking over the person you’re paying attention to – Matt Baker or Alex Jones, in my case. In the moment it felt like I was committing a major social faux pas, continuously interrupting somebody else. It genuinely took some self-reassuring that I wasn’t being incredibly obnoxious.
But eventually, and over several attempts, I managed to get through two minutes. I even threw in the occasional punctuation mark, changed the text colour once and managed to remember to breathe. Sure, my attention was so stretched I didn’t know what I was saying, but by the end it was starting to feel like I’d got the hang of it.
I hadn’t. In just a couple of minutes I scored a ‘failed transmission’ – i.e. I was as good as somebody who didn’t turn up.
The bleeps represent where I gave up/swore. Please don’t judge me too harshly
But what I lacked in colour changes, punctuation and general sense, I made up for with surrealist imagery, including the phrases “our postman annoyed its jump Grogan” and “Prince Philip was there and he came at me”. Great lyrics for a psychedelic punk album, potentially libellous captions for the BBC.
Yet however bad my Ofcom-baiting and Twitter-storm-inducing captions were, I’d unknowingly made the worst error of all: I’d given up. By stopping and starting my attempts I’d failed to demonstrate the sheer perseverance possessed by all subtitlers.
“The most important thing for any respeaker is to never stop talking,” says Davidson. “You can’t just give up because you’re on air. One of the things that we really put trainees under the microscope for is their resilience to keep on going.”
But not only do you need to keep your resolve on air, you need a fair bit of grit afterwards. “There’s not a lot of glory in subtitling. After a hard shift there might be some ‘LOL he meant to say pianist’ tweets (our software a few years ago was geared towards medical terms for some reason). You’ve got to have a sense of humour, but it’s annoying that our mistakes are the focus.”
So, for this short moment let’s put aside all those accidental Freudian slips in the subtitles. Let’s ignore the misspelt celebrity names. Let’s, just for a second, give subtitlers a well-earned silent pat on the back. Because if they go, we’ll be stuck with Syncopatico. Or worse: me.