Phew. Survived running the tech. support side of the Student Skills Competition. Winners were mostly what I’d have expected.
The technical side all went pretty much to plan, albeit with a lot of stress, mostly caused by teams bringing in presentations on CDs etc. at the last minute, and expecting me to be able
to make them work before they needed them and run the rest of the backstage bits as well. Couldn’t have done it without Kit and Claire helping.
It was a lot of fun. Plus, we got to raid the judges’ buffet lunch and eat delicious pastry-and-cheese things.
The letting agency are complaining that the rent hasn’t been paid yet. I wish my bank would sort themselves out, allowing me to pay the buggers. Ah well; everything’s sweet so far.
[this post has been partially damaged during a server failure on Sunday 11th July 2004, and it has been possible to recover only a part of it]
[further fragments of this post were recovered on 12 October 2018]
Fun in the sun.
Kit and I had an idea for something like this a while back, and we were wondering if it constituted entrapment: after all, under UK law, it’s illegal for a human to attempt to trick another human into
committing a crime, because it can’t be determined whether that person would have committed the crime of their own volition… but here’s the catch – is it legitimate for a machine, working
on behalf of a human, to do the same thing?
That’s what’s likely to be the crucial issue if this scheme to trick ‘net
paedophiles into giving information to computerised children [BBC] provides evidence in court (not just leads, as is the case so far) towards convicting people who are ‘grooming’
children on the internet.
Personally, I’d argue that – in this case – the machine is a tool of the human, just like chat room software is a tool of humans. I don’t see the difference between me using chat room
software, pretending to be a kid, luring paedophiles, and providing tips to the police, and me writing a program to do the same for me. It’s …