[this post has been partially damaged during a server failure on Sunday 11th July 2004, and it has been possible to recover only a part of it]
[further fragments of this post were recovered on 12 October 2018]
Fun in the sun.
Kit and I had an idea for something like this a while back, and we wondered whether it constituted entrapment. After all, under UK law it’s illegal for a human to attempt to trick another human into committing a crime, since it can’t be determined whether that person would have committed the crime of their own volition… but here’s the catch – is it legitimate for a machine, working on behalf of a human, to do the same thing?
That’s likely to be the crucial issue if this scheme to trick ‘net paedophiles into giving information to computerised children [BBC] is to provide evidence in court (and not just leads, as has been the case so far) towards convicting people who are ‘grooming’ children on the internet.
Personally, I’d argue that – in this case – the machine is a tool of the human, just as chat room software is a tool of humans. I don’t see the difference between my using chat room software to pretend to be a kid, lure paedophiles and pass tips to the police, and my writing a program to do the same on my behalf. It’s …