An Unusual Workday

Some days, my day job doesn’t seem like a job that a real person would have at all. It seems like something out of a sitcom. Today, I have:

  • Worn a bear mask in the office (panda in my case; seen below alongside my head of department, in a grizzly mask).
    [Image: Bears in the office]
  • Chatted about popular TV shows that happen to contain libraries, for inclusion in a future podcast series.
  • Experimented with Web-based augmented reality as a possible mechanism for digital exhibition content delivery. (Seen this thing from Google Arts & Culture? If you don’t have an AR-capable device to hand, here’s a video of what it’s like.)
    [Image: Virtual Reality at the Bodleian]
  • Implemented a proof-of-concept XSS payload targeting a CMS (as a teaching tool, to demonstrate how a series of minor security vulnerabilities can cascade into one huge one).
  • Gotten my ‘flu jab.

Not every day is like this. But sometimes, just sometimes, one can be.

Idea: mobile app that uses camera and shifts colour-balances to make colours “visible”

This self-post was originally posted to /r/ColorBlind. See more things from Dan's Reddit account.

I’m not colourblind, and I’m not really a mobile developer, so maybe there’s something I’ve missed, but I’ve got an idea for an app and I thought I’d run it by you guys.

Mobile processing power is getting better and better, and we’re probably getting close to the point where we can do live video image manipulation at acceptable framerates (even 10 frames/sec would be something). So why can’t we make an app that shifts colours, as seen by the camera, to a different part of the spectrum, depending on the user’s preferences?

For example, a deuteranomalous user (green-weak, with difficulty differentiating across the red/orange/yellow/green spectrum) might configure the software to shift yellows and greens to instead be presented as purples and blues. The picture would be false, of course, but it would help distinguish between colours in order to make, for example, colour-coded maps readable.
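The core of that idea is just a per-pixel hue rotation. Here’s a minimal, illustrative sketch in Python using the standard library’s `colorsys` module; the hue thresholds and the offset are arbitrary assumptions for demonstration, not values calibrated to any real colour-vision model, and a real app would do this on the GPU rather than pixel-by-pixel:

```python
import colorsys

def shift_hue(rgb, lo=0.08, hi=0.4, offset=0.5):
    """Remap hues in [lo, hi) — roughly the orange/yellow/green band —
    by `offset` around the hue wheel, pushing them toward blue/purple.
    Pixels outside the band pass through unchanged."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    if lo <= h < hi:
        h = (h + offset) % 1.0  # rotate the problem band around the wheel
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

# Pure green lands well inside the band and comes out magenta;
# pure blue is outside the band and is left alone.
print(shift_hue((0, 255, 0)))   # (255, 0, 255)
print(shift_hue((0, 0, 255)))   # (0, 0, 255)
```

Run over every pixel of every camera frame, this is exactly the “false-colour” presentation described above: wrong colours, but distinguishable ones.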

I was thinking about how video cameras can often “see” infra-red (try pointing a remote control at a video camera and pressing the button), and present it to the viewer as white or red, when I saw a documentary with some footage of “how bees see the world”. Bees have vision of a similar breadth of spectrum to humans, but shifted well into the infra-red range (and away from the blue end of the spectrum). In the documentary, they’d filmed some flowers using a highly infra-red-sensitive camera, and then they’d “shifted” the colours around the spectrum in order to make it visible to normal humans: the high infra-reds became yellows, the low infra-reds became blues, and the reds they left as reds. Obviously this isn’t what bees actually experience, but it’s an approximation that allows us to appreciate the variety in their spectrum.

Can we make this conversion happen “live” on mobile technology? And why haven’t we done so already?