Do you have permission for those third-party scripts?
Enforcement of the European Union’s General Data Protection Regulation is coming very, very soon. Look busy. This regulation is not
limited to companies based in the EU—it applies to any service anywhere in the world that can be used by citizens of the EU.
…
Jeremy Keith raises some interesting points: when informed consent is required to track an individual, who is responsible for getting your users to “consent” to being
tracked with Google Analytics and similar site-spanning tools? You? Google? Nobody? I’ve spent the weekend talking through only a handful of the woolly edges of the GDPR, especially regarding the liabilities of different companies (potentially not all of which are based in the EU) who are complicit in
the collection of data on the same individuals but who have access to that data in different forms.
It’s complicated, yo. For the time being, I’m making sure that companies for which I have responsibility err on the “safe” side of any fuzzy lines, but I’m sure that others won’t.
I've long been a proponent of Content Security Policies (CSPs). I've used them to fix mixed content warnings on this blog after Disqus made a little mistake, you'll see one adorning
Have I Been Pwned (HIBP), and I even wrote a dedicated Pluralsight course on browser security headers.
But it’s not all roses with CSPs and that’s partly due to what browsers will and will not let you do and partly due to what the platforms running our websites will and will not let
you do. For example, this blog runs on Ghost Pro which is a managed SaaS platform. I can upload whatever theme I like, but I can’t control
many aspects of how the platform actually executes, including how it handles response headers which is how a CSP is normally served by a site. Now I’m enormously supportive of running on managed platforms, but this is one of the
limitations of doing so. I also can't add custom headers via Cloudflare at "the edge"; I'm serving the HSTS header from there because there's first-class support for that in the GUI, but there's no equivalent for CSP, either
in the GUI or via custom response headers. This will be achievable in the future via Cloudflare Workers, but for now, the CSP has to come from the origin site.
However, you can add a CSP via meta tag and indeed that’s what I originally did with the upgrade-insecure-requests implementation I mentioned earlier when I fixed
the Disqus issue. However – and this is where we start getting into browser limitations – you can’t use the report-uri directive in a meta tag. Now that doesn’t matter if all the CSP
is doing is upgrading requests, but it matters a lot if you're actually blocking content. That's where the real value proposition of a CSP lies, too: in its ability
to block things that may have been maliciously inserted into a site. I’ve had enough experience with breaking the CSP on HIBP to know that reporting is absolutely invaluable and
indeed when I’ve not paid attention to reports in the past, it’s
literally cost me money.
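For illustration, the meta-tag form of that upgrade-only policy is just an element in the page's head, which is why it works even on managed platforms that won't let you touch response headers. (The policy values here are illustrative, not this blog's actual policy.)

```html
<!-- A CSP delivered via meta tag: usable on managed platforms (like Ghost Pro)
     where you can't control response headers. -->
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">

<!-- Note: report-uri is ignored in a meta tag, so a blocking policy with
     reporting, e.g.
       Content-Security-Policy: default-src https:; report-uri /csp-report
     has to be served as a response header from the origin. -->
```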
One of the most-popular WordPress plugins is Jetpack, a product of Automattic (best-known for providing the widely-used WordPress hosting service “WordPress.com“). Among Jetpack’s
features (many of which are very good) is Jetpack Protect which adds – among other things – the possibility for a CAPTCHA to appear on your login pages. This feature is slightly worse than pointless as it makes
it harder for humans to log in but has no significant impact upon automated robots; at best, it provides a false sense of security and merely frustrates and slows down legitimate human
editors.
“Proving your humanity”, as you’re asked to do, is a task that’s significantly easier for a robot to perform than a human. Eventually, of course, all tests of this nature seem likely to fail as robots become smarter than humans
(especially as the most-popular system is specifically geared towards training robots), but that’s hardly an excuse for inventing a system
that was a failure from its inception. Jetpack’s approach is fundamentally flawed because it makes absolutely no effort to disguise the challenge in a way that humans are able to read
any differently than robots. I'll demonstrate that in a moment.
A while back, a colleague of mine network-enabled Jetpack Protect across a handful of websites that I occasionally need to log into, and it bugged me that it ‘broke’ my password safe’s
ability to automatically log me in. So to streamline my workflow – as well as to demonstrate quite how broken Jetpack Protect's CAPTCHA is – I've written a userscript that you can install into your web browser that will
completely circumvent it, solving the maths problems on your behalf so that you don’t have to. Here’s how to use it:
Install a userscript manager into your browser if you don’t have one already: I use Tampermonkey, but it ought to work with almost any of
them.
From now on, whenever you go to a page whose web path begins with “/wp-login.php” that contains a Jetpack Protect maths problem, the answer will be automatically calculated and
filled-in on your behalf. The usual userscript rules apply: if you don’t trust me, read the source code (there are really only five lines to check) and disable automatic updates for it
(especially as it operates across all domains), and feel free to adapt/improve however you see fit. Maybe if we can get enough people using it Automattic will fix this
half-hearted CAPTCHA – or at least give us a switch to disable it in the first
place.
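To give a sense of how such a script can work, here's a minimal sketch of the approach (this is not the script linked above; the element IDs and page markup are assumptions for illustration, and the real Jetpack markup may differ):

```javascript
// ==UserScript==
// @name     Jetpack Protect maths-solver (sketch)
// @match    *://*/wp-login.php*
// @grant    none
// ==/UserScript==

// Evaluate a simple "a <op> b" challenge.
function solve(a, op, b) {
  switch (op) {
    case '+': return a + b;
    case '-': return a - b;
    case '*': return a * b;
  }
}

// Pull the operands and operator out of the challenge text,
// e.g. "Prove your humanity: 9 + 2 =", and return the answer.
function parseChallenge(text) {
  const m = text.match(/(\d+)\s*([+\-*])\s*(\d+)/);
  return m ? solve(Number(m[1]), m[2], Number(m[3])) : null;
}

// In the browser, find the (hypothetical) challenge label and answer
// field and fill in the result. Guarded so the parsing logic above can
// also run outside a browser.
if (typeof document !== 'undefined') {
  const label = document.querySelector('label[for="jetpack_protect_answer"]');
  const input = document.querySelector('#jetpack_protect_answer');
  if (label && input) {
    const answer = parseChallenge(label.textContent);
    if (answer !== null) input.value = String(answer);
  }
}
```

The point being: if the challenge is sitting in the page as plain text, a regex and a switch statement "prove their humanity" just fine.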
Update: 15 October 2018 – the latest version of Jetpack makes an insignificant change to this CAPTCHA; version 1.2 of this script (linked above) works around the change.
TL;DR: We are making changes to how AMP works in platforms such as Google Search that will enable linked pages to appear
under publishers’ URLs instead of the google.com/amp URL space while maintaining the performance and privacy benefits of AMP Cache serving.
When we first launched AMP in Google Search we made a big trade-off: to achieve the user experience that users were
telling us that they wanted, instant loading, we needed to start loading the page before the user clicked. As we detailed in a deep-dive blog post last year, privacy reasons make it basically impossible to load the page from the publisher’s server. Publishers shouldn’t
know what people are interested in until they actively go to their pages. Instead, AMP pages are loaded from the Google AMP Cache but with that behavior the URLs changed to include
the google.com/amp/ URL prefix.
We are huge fans of meaningful URLs ourselves and recognize that this isn’t ideal. Many of y’all agree. It is certainly
the #1 piece of feedback we hear about AMP. We sought to ensure that these URLs show up in as few places as possible. Over time our Google Search native apps on Android and iOS
started defaulting to showing the publisher's URLs and we worked with browser vendors to share the publisher's URL of an article where possible. We couldn't, however, fix the state of
URLs where it matters most: on the web and the browser URL bar.
…
Regular readers may recall that I’ve complained about AMP. This latest announcement by the project lead of the AMP team at Google goes some way to solving
the worst of the problems with the AMP project, but it still leaves a lot to be desired: for example, while Google still favours AMP pages in search results, they're building a walled
garden and penalising people who don't choose to be inside it. And it's a walled garden with fewer features than the rest of the web and a lock-in effect once you're there. We've seen this before with "app culture" and with Facebook, but Google have the power to do a huge amount more damage.
Although there’s a lot of heated discussion around diversity, I feel many of us ignore the elephant in the web development diversity room. We tend to forget about users of older or
non-standard devices and browsers, instead focusing on people with modern browsers, which nowadays means the latest versions of Chrome and Safari.
This is nothing new — see “works only in IE” ten years ago, or “works only in Chrome” right now — but as long as we’re addressing other diversity issues in web development we should
address this one as well.
Ignoring users of older browsers springs from the same causes as ignoring women, or non-whites, or any other disadvantaged group. The average web developer does not know any non-whites,
so he ignores them. The average web developer doesn't know any people with older devices, so he ignores them. Not ignoring them would be more work, and we're on a tight deadline with a
tight budget, the boss didn’t say we have to pay attention to them, etc. etc. The usual excuses.
As a slight contribution to the diversity in web development discussion, here are the ratios of female attendees and speakers from the Amsterdam web conferences Krijn and I organised or are close to. I’m not sure what these numbers mean, but someone will surely have a bright idea after staring at them for
long enough.
Krijn gathered the crucial attendee numbers, while I added the speakers, a calculation, and some general remarks.
The gannet heeded conservationists' calls to settle on a small New Zealand island. Unfortunately, no eligible ladies did.
Nigel, a handsome gannet bird who lived on a desolate island off the coast of New Zealand, died suddenly this week. Wherever his soul has landed, the singles scene surely cannot
be worse.
The bird was lured to Mana Island five years ago by wildlife officials who, in hopes of establishing a gannet colony there, had placed concrete gannet decoys on cliffsides and
broadcast the sound of the species’ calls. Nigel accepted the invitation, arriving in 2013 as the island’s first gannet in 40 years. But none of his brethren joined him.
In the absence of a living love interest, Nigel became enamored with one of the 80 faux birds. He built her — it? — a nest. He groomed her “chilly, concrete feathers . . . year
after year after year,” the Guardian
reported. He died next to her in that unrequited love nest, the vibrant orange-yellow plumage of his head contrasting, as ever, with the weathered, lemony paint of hers.
“Whether or not he was lonely, he certainly never got anything back, and that must have been [a] very strange experience,” conservation ranger Chris Bell, who also lives on the
island, told the paper. “I think we all have a lot of empathy for him, because he had this fairly hopeless situation.”
The 110 million-year-old fossil of a nodosaur preserves the animal’s armor, skin, and what may have been its final meal.
On the afternoon of March 21, 2011, a heavy-equipment operator named Shawn Funk was carving his way through the earth, unaware that he would soon meet a dragon.
That Monday had started like any other at the Millennium Mine, a vast pit some 17 miles north of Fort McMurray, Alberta, operated by energy company Suncor. Hour after hour Funk’s
towering excavator gobbled its way down to sands laced with bitumen—the transmogrified remains of marine plants and creatures that lived and died more than 110 million years ago. It
was the only ancient life he regularly saw. In 12 years of digging he had stumbled across fossilized wood and the occasional petrified tree stump, but never the remains of an
animal—and certainly no dinosaurs.
But around 1:30, Funk’s bucket clipped something much harder than the surrounding rock. Oddly colored lumps tumbled out of the till, sliding down onto the bank below. Within minutes
Funk and his supervisor, Mike Gratton, began puzzling over the walnut brown rocks. Were they strips of fossilized wood, or were they ribs? And then they turned over one of the lumps
and revealed a bizarre pattern: row after row of sandy brown disks, each ringed in gunmetal gray stone.
“Right away, Mike was like, ‘We gotta get this checked out,’ ” Funk said in a 2011 interview. “It was definitely nothing we had ever seen before.”
There aren’t many great things to write about Hounslow; other than me being in it, it isn’t the sort of place that brings in visitors. There’s a tired shopping centre, an Asda (whose car
park has just been closed), lots of planes going over and Hounslow Heath, which frankly is just a large bit of scrubland. Whatever their website tells you about it being a “Local
Nature Reserve and Site of Importance for Nature Conservation (of Metropolitan Importance)”, I really wouldn’t make the effort to see it.
What Hounslow does boast is three, yes THREE Poundlands. I have no idea why we need three Poundlands, especially as the high street also boasts a brand new PoundWorld, a 99p shop and
a 97p shop. Seriously, the three Poundlands are literally a five-minute walk from each other. You may have seen the press this week about Poundland’s new sex toy range. Sex toys,
in Poundland, for a quid?! Yes, indeedy!
Actually, they first released their pound bullet vibe a few years back (how did I miss this?!) but now they have extended their range further. It’s called Nooky. Of course it is.
Not the best pizza by any stretch, but perfectly acceptable and spectacular value. A fiver for a weekday large pizza is something you can’t argue with.