Normally I find Veritasium’s videos to be… less mind-blowing than their titles would have me believe. But I found this one pretty inspiring; the first Feigenbaum constant is a proper headtrip. And I feel like I’ve got new insights into the Mandelbrot set too.
Google’s built-in testing tool Lighthouse judges the accessibility of our websites with a score between 0 and 100. It’s laudable to try to achieve a high score, but a score of 100
doesn’t mean that the site is perfectly accessible. To prove that, I carried out a little experiment.
…
Manuel Matuzovic wrote a web page that’s pretty-much inaccessible to everybody: it doesn’t work with keyboard navigation, touchscreens, or mice. It doesn’t work with screen
readers. Even if you fix the other problems, its contrast is bad enough that almost nobody could read it. It fails ungracefully if CSS or JavaScript is unavailable. Even the source code is illegible. This took a special kind of evil.
But it scores 100% for accessibility on Lighthouse! I earned my firework show for this site last year, but I know better than to let that lull
me into complacency: accessibility isn’t something a machine can test for you, only something that (at best) it can give you guidance on.
I last handed in a dissertation almost 16 years ago; that one marked the culmination of my academic work at Aberystwyth University, then the “University of Wales, Aberystwyth”. Since then I’ve studied programming, pentesting and psychology (the P-subject
Triathlon?)… before returning to university to undertake a masters degree in information security and forensics.
Today, I handed in that dissertation. Thanks to digital hand-ins, I’m able to “hand it in” and then change my mind, make changes, and hand in a replacement version right up
until the deadline on Wednesday (I’m already on my second version!), so I’ve still got a few evenings left for last-minute proofreads and tweaks. That said, I’m mostly
happy with where it is right now.
Writing a dissertation was harder this time around. Things that made it harder included:
Writing a masters-level dissertation rather than a bachelors-level one, naturally.
Opting for a research dissertation rather than an engineering one: I had the choice, and I knew that I’d do better in engineering, but I did research anyway because I
thought that the challenge would be good for me.
Being older! It’s harder to cram information into a late-thirty-something brain than into a young-twenty-something one.
Work: going through the recruitment process for and starting at Automattic ate a lot of my time,
especially as I was used to working part-time at the Bodleian and I’d been turning a little of what would otherwise have been my “freelance work time” into “study time” (last time
around I was working part-time for SmartData, of course).
Life: the kids, our (hopefully) upcoming house move and other commitments are pretty good at getting in the way. Ruth and JTA have been amazing at carving out blocks of time for me to study, especially these last few weekends, which may have made all the
difference.
It feels like less of a bang than last time around, but still sufficient that I’ll breathe a big sigh of relief. I’ve a huge
backlog of things to get on with that I’ve been putting-off until this monster gets finished, but I’m not thinking about them quite yet.
I need a moment to get my bearings again and get used to the fact that once again – and for the first time in several years – I’ll soon be not-a-student. Fun fact, I’ve spent
very-slightly-more than half of my adult life as a registered student: apparently I’m a sucker for it, for all that I complain… in fact, I’m already wondering what I can study
next (suggestions welcome!), although I’ve promised myself that I’ll take a couple of years off before I get into anything serious.
(This is, of course, assuming I pass my masters degree, otherwise I might still be a student for a little longer while I “fix” my dissertation!)
If anybody’s curious (and I shan’t blame you if you’re not), here’s my abstract… assuming I don’t go back and change it yet again in the next couple of days (it’s still a little clunky,
especially in the final sentence):
Multifactor authentication (MFA), such as the use of a mobile phone in addition to a username and password when logging in to a website, is one of the strongest security enhancements
an individual can add to their online accounts. Compared to alternative enhancements, like refraining from the reuse of passwords, it’s been shown to be easy and effective. However: MFA
is optional for most consumer-facing Web services supporting MFA, and elective user adoption is well under 10%.
How can user adoption be increased? Delivering security awareness training to users has been shown to help, but the gold standard would be a mechanism to encourage uptake that can be
delivered at the point at which the user first creates an account on a system. This would provide strong protection to an account for its entire life.
Using realistic account signup scenarios delivered to participants’ own computers, an experiment was performed into the use of language surrounding the invitation to adopt MFA. During
the scenarios, participants were exposed to statements designed either to instil fear of hackers or to praise the participant for setting up an account and considering MFA. The effect on uptake
rates is compared. A follow-up questionnaire asks questions to understand user security behaviours, including password and MFA choices, and to explain their thought processes when
considering each.
No significant difference is found between the use of “fear” and “praise” statements. However, secondary information revealed during the experiment and survey provides recommendations
for service providers to offer MFA after, rather than at, the point of account signup, and for security educators to focus their energies on dispelling user preconceptions about the
convenience, privacy implications, and necessity of MFA.
The “where’s my elephant?” theory takes its name, of course, from The Simpsons episode in which Bart gets an elephant (Season 5, episode 17, to be precise). For those of you
who don’t know the episode: Bart wins a radio contest where you have to answer a phone call with the phrase, “KBBL is going to give me something stupid.” That “something stupid”
turns out to be either $10,000, or “the gag prize”: a full-grown African elephant. Much to the presenters’ surprise, Bart chooses the elephant — which is a problem for the radio
station, since they don’t actually have an elephant to give him. After some attempts at negotiation (the presenters offer Principal Skinner $10,000 to go about with his pants pulled
down for the rest of the school year; the presenters offer to use the $10,000 to turn Skinner into “some sort of lobster-like creature”), Bart finds himself kicked out of the radio
station, screaming “where’s my elephant?”
…
…the “where’s my elephant?” theory holds the following:
If you give someone a joke option, they will take it.
The joke option is (usually) a joke option for a reason, and choosing it will cause everyone a lot of problems.
In time, the joke will stop being funny, and people will just sort of lose interest in it.
No one ever learns anything.
…
For those that were surprised when Trump was elected or Brexit passed a referendum, the “Where’s My Elephant?” theory of history may provide some solace. With reference to Boaty
McBoatface and to the assassination of Qasem Soleimani, Tom Whyman pitches that “joke” options will be selected significantly more-often than you’d expect, or than they should be.
Our society is like Bart Simpson. But can we be a better Bart Simpson?
If that didn’t cheer you up: here’s another article, which more-seriously looks at the
political long-game that Remainers in Britain might consider working towards.
Back in 2016, I made an iMessage app called Overreactions. Actually, the term “app” is probably generous: It’s
a collection of static and animated silly faces you can goof around with in iMessage. Its “development” involved many PNGs but zero lines of code.
Just before the 2019 holidays, I received an email from Apple notifying me that the app “does not follow one or more of the App Store Review Guidelines.” I signed in to Apple’s
Resource Center, where it elaborated that the app had gone too long without an update. There were no greater specifics, no broken rules or deprecated dependencies, they just wanted
some sort of update to prove that it was still being maintained or they’d pull the app from the store in December.
Here’s what it took to keep that project up and running…
…
There’s always a fresh argument about Web vs. native (alongside all the rehashed ones, of course). But here’s one you might not have heard before: nobody ever wrote a Web page that met
all the open standards only to be told that they had to re-compile it a few years later for no reason other than that the browser manufacturers wanted to check that the author was still
alive.
But that’s basically what happened here. The author of an app which had worked (and still did work) fine was required to re-install the development environment and toolchain, recompile,
and re-submit a functionally-identical version of their app (which every user of the app then had to re-download along with their other updates)… just because Apple think that an app
shouldn’t ever go more than 3 years between updates.
I keep my life pretty busy and don’t get as much “outside” as I’d like, but when I do I like to get out on an occasional geohashing expedition (like these
ones). I (somewhat badly) explained geohashing in the vlog attached to my expedition 2018-08-07 51 -1, but the short
version is this: an xkcd comic proposed a formula to use a stock market index to generate a pair of random coordinates, impossible to predict in
advance, for each date. Those coordinates are (broadly) repeated for each degree of latitude and longitude throughout the planet, and your challenge is to get to them and discover
what’s there. So it’s like geocaching, except you don’t get to find anything at the end and there’s no guarantee that the destination is even remotely accessible. I love it.
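If you’re curious, the algorithm is simple enough to sketch in a few lines of Python. What follows is a minimal illustration rather than a reference implementation: it ignores the “30W rule” that later amended the algorithm, and it expects you to look up the day’s Dow Jones opening price yourself:

```python
import hashlib

def geohash(graticule_lat, graticule_lon, date, dow_open):
    """Compute the xkcd geohash for one 1-degree graticule.

    date is an ISO "YYYY-MM-DD" string; dow_open is that day's Dow
    Jones opening price as a string, e.g. "10458.68". (This sketch
    ignores the "30W rule" that applies to later dates.)
    """
    digest = hashlib.md5(f"{date}-{dow_open}".encode()).hexdigest()
    # Each half of the 128-bit digest becomes a fraction in [0, 1)...
    lat_frac = int(digest[:16], 16) / 16 ** 16
    lon_frac = int(digest[16:], 16) / 16 ** 16
    # ...appended to the graticule's coordinates, moving away from zero.
    lat = graticule_lat + (lat_frac if graticule_lat >= 0 else -lat_frac)
    lon = graticule_lon + (lon_frac if graticule_lon >= 0 else -lon_frac)
    return lat, lon

# The worked example from the comic: 2005-05-26 in the 37, -122
# graticule lands at approximately (37.857713, -122.544544).
print(geohash(37, -122, "2005-05-26", "10458.68"))
```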
Most geohashers used to use a MediaWiki-powered website to coordinate their efforts and share their stories, until a different application on the server where it resided got hacked and the wiki got taken down as a precaution.
That was last September, and the community became somewhat “lost” this winter as a result. It didn’t stop us ‘hashing, of course: the algorithm’s open-source and so are many of its
implementations, so I was able to sink into a disgusting hole in November, for example. But we’d lost the digital
“village square” of our community.
So I emailed Davean, who does techy things for xkcd, and said that I’d like to take over the Geohashing wiki but that I’d first like (a) his or Randall’s blessing to do so, and ideally
(b) a backup of the pages of the site as it last-stood. Apparently I thought that my new job plus finishing my dissertation plus trying to move house plus all of the usual
things I fill my time with wasn’t enough and I needed a mini side-project, because when I finally got the go-ahead at the end of last month I (re)launched geohashing.site. Take a look, if you like. If you’ve never been Geohashing before, there’s never been a more-obscure time to start!
Luckily, it’s not been a significant time-sink for me: members of the geohashing community quickly stepped up to help me modernise content, fix bots, update hyperlinks and the like. I
took the opportunity to fix a few things that had always bugged me about the old site, like the mobile-unfriendly interface and the inability to upload GPX files, and laid the groundwork to make bigger changes down the road (like changing the way that inline maps are displayed, a popular community request).
So yeah: Geohashing’s back, not that it ever went away, and I got to be part of the mission to make it so. I feel like I am, as geohashers say… out standing in my field.
“Even at a young age, I was able to grasp the concept that my mum and dad could love more than one person,” he says. “The only thing I’ve found challenging about having three adults
in my family is getting away with things, because it means more people to check up on you, to make sure you did your chores. But I also have more people around to give me lifts here
and there, to help with homework and to come to my lacrosse games. The saying ‘raised by a village’ definitely applies to me. I feel like a completely normal teenager, just with
polyamorous parents.”
…
Yet another article providing evidence to support the fact that – except for the bigotry of other people – there are no downsides to being a child of polyamorous parents.
Nicely-written; I’ve sent a copy to Alan of the Poly In The Media blog.
There are plenty of opportunities for friction in the user experience when logging in, particularly while entering a two factor authentication code. As developers we should be
building applications that support the need for account security but don’t detract from the user experience. Sometimes it can feel as though these requirements are in a battle
against each other.
In this post we will look at the humble <input> element and the HTML attributes that will help speed up our users’ two factor authentication experience.
…
Summary: simple changes like setting inputmode="numeric" on your TOTP-receiving <input> give user-agents solid hints about what kind of data is expected, allowing mobile phones to show a numeric keypad rather than a full keyboard, while setting autocomplete="one-time-code" hints to password managers and autocomplete tools that what’s being collected needn’t be stored for future use as it’ll expire (and can also help indicate to authenticators where they should auto-type).
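Put together, a one-time-code field might look like this. (A minimal sketch: the label text, field name, and six-digit pattern are illustrative assumptions rather than lifted from the article.)

```html
<!-- inputmode="numeric" prompts mobile browsers to show a numeric
     keypad; autocomplete="one-time-code" tells password managers and
     supporting browsers that this is an expiring code: don't save it,
     but do offer to fill in a code that's just arrived. -->
<label for="otp">Enter the six-digit code from your authenticator app:</label>
<input id="otp"
       name="otp"
       type="text"
       inputmode="numeric"
       pattern="[0-9]{6}"
       autocomplete="one-time-code">
```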
As my current research project will show, the user experience of multifactor authentication is a barrier to entry for many users who might otherwise benefit from it. Let’s lower that
barrier.
Cute open source project that produces on-demand SVG and PNG maps,
like the one above, based on the roads in OpenStreetMap data. It takes a somewhat liberal view of what a “road” is: I found it momentarily
challenging to get my bearings in the map above, which includes where I live, because the towpath and cycle paths are included, which I hadn’t expected. Still a beautiful bit of output
and the source code could be adapted for any number of interesting cartographic projects.
A frisbee, propelled by the wind and balanced upright by some kind of black magic, makes an elegant and hypnotic dance across a frozen pond.
Which would be beautiful and weird enough as it is, and is sufficient reason alone to watch this video. But for the full experience you absolutely have to turn on the subtitles.
(Joe reads the text on IE and clicks on “Suggested Sites”)
Me: “Why did you click on that?”
Joe: “I don’t really know what to do, so I thought this would suggest something to me.”
…
Finding adults who’ve got basically no computer experience whatsoever is getting increasingly rare (and already was very uncommon back in 2011 when this was written), and so I can see
why Jennifer Morrow, when presented with the serendipitous opportunity to perform some user testing with one, made the very most of the occasion.
As well as being a heart-warming story, this post’s a good reminder that we shouldn’t make assumptions about the level of expertise of our users.