Note that there are differences in how they are described in some cases:
“grinning face” is also “beaming face”
“beaming face” is also a “smiling face”
“open mouth” is described by JAWS/Narrator but not by NVDA/VoiceOver
“big eyes” are described by NVDA/VoiceOver but not by JAWS/Narrator
“cold sweat” is “sweat” and also “sweat drop”
…
The differences don’t matter to me (but I am just one and not the intended consumer), as I usually experience just the symbol. Reading the text descriptions is useful though as
quite often I have no idea what the symbols are meant to represent. It is also true that emojis take on different meanings in different contexts and to different people. For
example I thought 🤙 meant “no worries” but its description is “call me hand”, what do I know 🤷
What Steve observes is representative of the two sides of emoji’s biggest problem, which are
that when people use them for their figurative meaning, there’s a chance that they have a different interpretation than others (this is, of course, a risk with any communication,
although the effect is perhaps more pronounced when abbreviating¹),
and
that when people use them for the literal image they show, that image can appear differently on different platforms: consider the inevitable confusion that arose when Twitter earlier this year
changed the “gun” emoji back to looking like a firearm. Everybody had previously changed it to look like a water pistol,
to the extent that the Unicode Consortium changed its official description – which is likely to be what screen readers announce – to “water pistol”. 🤦
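(Incidentally, if you want to see where those text descriptions come from, you can poke at the underlying data yourself. Below is a tiny Python sketch of my own – not something from Steve’s post – that prints the formal Unicode character names for a few of the emoji mentioned above; screen readers generally announce the CLDR “short names”, which are close relatives of these character names but not always identical to them.)

# Print the formal Unicode names behind a handful of emoji. Character names
# are fixed once assigned, which is why U+1F52B is still "PISTOL" even though
# the CLDR description (and most vendors' artwork) moved to a water pistol.
import unicodedata

for emoji in "😀😁🤙🤷🔫":
    print(emoji, unicodedata.name(emoji))

# Example output (exact coverage depends on the Unicode data bundled with your Python):
#   😀 GRINNING FACE
#   😁 GRINNING FACE WITH SMILING EYES
#   🤙 CALL ME HAND
#   🤷 SHRUG
#   🔫 PISTOL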
But the thing Steve’s post really left me thinking about was a moment from Season 13, Episode 1 of Would I Lie To
You? (still available on iPlayer!), during which blind comedian Chris McCausland described how the screen reader on his phone processes emoji:
I don’t know if it’s true that Chris’s phone actually describes the generic smileys as having “normal eyes”, but it certainly makes for a fantastic gag.
Footnotes
¹ I remember an occasion where a generational divide resulted in a hilarious difference of
interpretation of a common acronym, for example. My friend Ash, like most people of their generation, understood “LOL” to mean “laughing out loud”, i.e. an expression of humour. Their
dad still used it in its older sense of “lots of love”. And so there was a moment of shock and confusion when Ash’s dad,
fondly recalling their recently-deceased mother, sent Ash a text message saying something like: “Thought of your mum today. I miss her. LOL.”.
This is the second in a series of four blog posts which ought to have been published during January
2013, but ran late because I didn’t want to publish any of them before the first one.
I spent the weekend of my birthday working in London, alongside the Squiz team, who make the CMS that forms the foundation of most of the public-facing websites of the Bodleian Libraries. We’d originally scheduled this visit for a different
week, but – in that way that projects sometimes do – the project got juggled about a bit and so I found myself spending the week of my birthday away from home.
But on Tuesday – my second day working on-site at Squiz’s office, and coincidentally my birthday – disaster struck! Our first clue was when the lights went out. And then, a minute or so
later, when the fire alarm started going off. No big deal, we all thought, as we gathered our possessions and prepared to leave the office – it’s probably just that the fire alarm
sounds as a precaution if its electricity supply is disrupted… but as we started to go down the stairs and smelled the smoke, we realised that there really was a fire.
The first two fire engines arrived within minutes. Apparently, they don’t mess about when a city centre office block catches light. The smoke was very visible from the street: thick
grey plumes pouring out from the basement windows. Theories about the cause of the fire were whispered around the assembled crowd, and the consensus seemed to be that the substation in
the basement had overheated and set its room alight.
A third fire engine arrived, and – after about a quarter of an hour of assessing the situation and controlling the crowd – we were told that we wouldn’t be able to get back into our building
for “at least an hour, probably more.” So, being British, we decamped to one of the nearby bars for networking and a round of gin & tonic. After I texted some friends to say
that I hadn’t expected to spend the afternoon of my birthday in the pub, but that it wasn’t an entirely unwelcome experience, a few of them had the cheek to ask once again how
the fire had actually started.
By the time we were allowed to return to the building, it was already getting dark, and we quickly discovered a new problem that faced us: with the power still well and truly out, the
electronic door locks that secured the offices had become completely unusable. Not willing to abandon my laptop, keys, and other personal possessions overnight in an unfamiliar office,
I waited around until a locksmith had been summoned and had drilled his way through the cylinder and allowed us into the building.
It being my birthday, I’d arranged that Ruth would come and spend the night down in London, and that we’d go out to
Dans le Noir, a restaurant that I’d heard about from news articles and via friends some years prior, and had always
wanted to try. The restaurant has a distinct and quite remarkable theme that you probably won’t find anywhere else: you eat unidentified food in pitch blackness.
As our (blind!) waiter, Gao, led Ruth and me by touch to our table, we suddenly realised that we’d all but forgotten exactly how dark pitch blackness actually is. When you stumble over
your coffee table in the dark on a morning, that’s not truly black: there’s that sliver of light coming from underneath the curtains, or the faint glow of the LED light on the stereo.
Real, complete darkness is disorienting and confusing, and to sit around in it – not even able to see whether your eyes are open or closed – for hours at a time is quite remarkable.
It took us a little while to learn the new skills required to survive in this environment, but Gao was incredibly helpful. We worked out mechanisms for pouring drinks, for checking
whether our plates were empty, and for communicating our relative movements (being geeks, as we are, Ruth and I quickly developed a three-dimensional coordinate-based system for
navigating relative to an agreed centre-point: the tip of the bottle of our mystery wine). We also learned that there’s something truly humbling about being dependent upon the aid of a
blind person to do something that you’d normally be quite capable of doing alone: simple things, like finding where your glass is.
But the bigger lesson that we learned was about how darkness changes the way that we operate on a social level. Ruth and I were sat alongside another couple, and – deprived of
body language, the judgement of sight, and the scrutiny of eye contact – we quickly entered into a conversation that was far deeper and more real than I would have anticipated having
with total strangers. It was particularly strange to see Ruth, who’s usually so shy around new people, come across as confident and open. I theorise that (in
normally-sighted people) eye contact – that is, being able to see that others can see you – serves as a regulator of our willingness to be transparent. Take it away for long enough that
its absence begins to feel natural, and we become more frank and honest. Strange.
Back at Squiz the following day, there was still no electricity. Credit is due to the team there, though, who quickly put their emergency plans into effect and literally “moved office”
to a handful of conference rooms and meeting spaces around Shoreditch. “Runners” were nominated to help relay messages and equipment between disparate groups of people, and virtualised
networks were established across the city. I laughed when I discovered that Squiz’s previous offices had been in an old fire station.
Before long, the folks I’d been working with and I were settled into a basement meeting room in a nearby café, running a stack of Mac desktops and laptops from a monumental string of
power strips, and juggling an Internet connection between the café’s WiFi and a stack of Mifi-like
devices. We were able to get on with our work, and the day was saved, all thanks to some smart emergency planning. Later in the week, a generator was deployed outside the building and
we were able to return to normal desks, but the management’s quick thinking kept disruption to a minimum in the meantime.
Not one to waste the opportunity to make the most of being in London for a week, I spent another of my evenings out with Bryn. He and I went out to the Free
Fringe Fundraiser, which – despite the notable absence of Peter Buckley Hill, who had caught the norovirus that was doing the rounds at the time – was still a great deal of fun. It was
particularly pleasing to get to see Norman Lovett in the flesh: his particular brand of surrealist anti-humour tickles me mercilessly.
So what could have been “just another business trip” turned into quite the adventure, between fires and birthdays and eating-in-the-dark and comedy. If only it hadn’t taken me two
months to finish writing about it…
I’m not colourblind, and I’m not really a mobile developer, so maybe there’s something I’ve missed, but I’ve got an idea for an app and I thought I’d run it by you guys to
see what you think.
Mobile processing power is getting better and better, and we’re probably getting close to the point where we can do live video image manipulation at acceptable framerates (even 10
frames/sec would be something). So why can’t we make an app that shifts the colours seen by the camera to a different part of the spectrum (depending on the user’s
preferences)?
For example, someone with deuteranomaly (green-weak vision, with difficulty differentiating across the red/orange/yellow/green part of the spectrum) might configure the software to shift yellows and greens to instead
be presented as purples and blues. The picture would be false, of course, but it would help distinguish between colours in order to make, for example, colour-coded maps readable.
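To make that concrete, here’s a rough sketch of the kind of hue-shifting I have in mind – my own guess at an approach, not a real implementation – written in Python with OpenCV purely for brevity. A real app would do the same maths per frame on the phone’s GPU, and the hue band and shift amount below are illustrative numbers rather than values calibrated for any particular kind of colour-blindness.

import cv2
import numpy as np

def shift_confusable_hues(frame_bgr, hue_lo=20, hue_hi=80, shift=60):
    # Rotate hues in [hue_lo, hue_hi] (OpenCV uses a 0-179 hue scale) by `shift`.
    # Roughly, 20-80 covers yellow through green; adding ~60 pushes those pixels
    # towards blue/purple while leaving everything else untouched.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    h = h.astype(np.int16)
    band = (h >= hue_lo) & (h <= hue_hi)
    h[band] = (h[band] + shift) % 180
    hsv = cv2.merge([h.astype(np.uint8), s, v])
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

# Hypothetical usage: apply the shift to a live webcam feed, frame by frame.
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("shifted colours", shift_confusable_hues(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()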
I was thinking about how video cameras can often “see” infrared (try pointing a remote control at a video camera and pressing the button), and present it to the viewer as white or
red, when I saw a documentary with some footage of “how bees see the world”. Bees have vision of a similar breadth of spectrum to humans, but shifted well into the infrared range (and
away from the blue end of the spectrum). In the documentary, they’d filmed some flowers using a highly infrared-sensitive camera, and then they’d “shifted” the colours around the
spectrum in order to make it visible to normal humans: the high infrareds became yellows, the low infrareds became blues, and the reds they left as reds. Obviously this isn’t what
bees actually experience, but it’s an approximation that allows us to appreciate the variety in their spectrum.
Can we make this conversion happen “live” on mobile technology? And why haven’t we done so?