A low-tech solution. But some of the
high-tech solutions are worse.
For some reason, the
news story that’s all the rage at the moment is how to stop children
looking at internet porn. I’m not sure exactly what’s happened to bring this about,
but I can vouch that it’s a tricky one. Not so long ago we were looking into testing
a website for, amongst other things, checking that content was suitable for everyone
to access. It would potentially involve moderating everything posted, including
forums, applications and documents. And even if we could vet all of that,
what’s to say a linked site will be suitable? And what about linked sites
from linked sites? And linked sites from linked sites from linked sites? Not
easy at all.
Now, I’ve always thought that the same rules should apply on
the internet as apply everywhere else. For adults, the basic principle, quite
rightly, is that you should have the choice to view what you want (bar a few accepted
limits such as paedophilia, certain depictions of rape, incitement to violence and
so on). For children, there are a few rules such as 12-, 15- and 18-rated
films, but it’s broadly viewed as the job of a parent to decide what they
should see, and that’s the way it should be. The internet, however, has made
this job harder. Yes, in the old days there was lying about your age when
seeing an X-rated film, or borrowing the mag your mate got off the top shelf,
but it’s now possible to view this stuff without even leaving your room, so it
must be taken very seriously.
The magic bullet frequently touted is family protection software, but its track record hasn’t always been impressive. Some of the earliest family filters were easily disabled using, of all things, CTRL-ALT-DEL, which doesn’t give much confidence about how seriously suppliers take this. There were also suspicions that certain programs were, as well as filtering out unsuitable content for children, also filtering out “incorrect” opinions on subjects such as abortion or homosexuality, or even relatively uncontentious material like information on eating disorders. These are old stories from many years ago, so maybe things have improved, but still the focus seems to be on installing software on the child’s computer. There are filtering services at the ISP end which are harder to circumvent (which some people use, either on their own or on top of filtering software), but they aren’t easy to set up, and I suspect this is being overlooked in favour of more lucrative products in shiny boxes at PC World.
Perhaps filtering software can work, but I’d like to propose
two alternative solutions. The first solution, which will take some time to
explain, is … a whitelist.
This is a solution I’m proposing for younger children – I
doubt this would be workable for teenagers so Mail and Express readers
will have to wait for my second proposal – but my reasoning is simple: even
with the best will in the world, it is very difficult to imagine a filter that
catches everything. Perhaps it would be better to place the onus on web
developers to keep their content suitable if they want it to be viewed by
children. This is how film certification works – anyone who wants a U, PG, 12 or
15 certificate has to apply to the BBFC for the certificate rather than the
BBFC chasing after films that don’t comply – so maybe something similar can
work on the internet.
How would this work? Well, to start with, it’s got to be
opt-in. It’s one thing blocking illegal content for adults and putting an
opt-out filter for outright porn, but extending an opt-out filter for all web
content equivalent to a BBFC 18 rating is open to too much abuse. But if it’s
an opt-in filter, how do you get started? Few websites are going to bother
applying for a whitelist entry if no-one’s subscribed to it, and few people are
going to subscribe to a whitelist if no websites have opted in. Vicious circle.
So … how about we start with all UK primary schools
subscribed to this filter? There’s obviously no need for pupils to access adult
material in primary schools, and this would give a large enough user base to
prompt websites who want to subscribe to the whitelist to do so. This will then
leave parents free to opt into this whitelist or not as they see fit. (It
shouldn’t be too hard to apply different access rights to parents’ and
children’s computers.) Should schools or parents wish to add extra sites they
consider safe, they could add these to their own personal whitelists.
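As a rough sketch of how this lookup might work (all names here are hypothetical – a shared base whitelist distributed to subscribing schools, plus per-household additions on top), the filtering decision could be as simple as a domain check:

```python
from urllib.parse import urlparse

# Hypothetical base whitelist, e.g. the list all subscribing
# primary schools receive. The real scheme would need vetting,
# updates and revocation; this only illustrates the lookup.
BASE_WHITELIST = {"bbc.co.uk", "schools-wikipedia.org"}

def is_allowed(url, personal_whitelist=frozenset()):
    """Allow a URL only if its host, or a parent domain of it,
    appears in the base whitelist or the household's own additions."""
    host = urlparse(url).hostname or ""
    allowed = BASE_WHITELIST | set(personal_whitelist)
    # Match the host exactly, or as a subdomain of a whitelisted domain.
    return any(host == d or host.endswith("." + d) for d in allowed)

print(is_allowed("http://www.bbc.co.uk/cbeebies"))              # True
print(is_allowed("http://example.com/forum"))                    # False
print(is_allowed("http://example.com/forum", {"example.com"}))   # True
```

The key design point is the default-deny rule: anything not on the list is blocked, which is what distinguishes a whitelist from the blacklist-style filters criticised above.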
Next question: who decides what is and isn’t suitable? This
is not a decision to take lightly. Even with an opt-in filter, it would be
unacceptable to use a government-backed scheme as an excuse for political
censorship. Luckily, we can take lessons from the BBFC here. Every decision
they make to award or refuse a certificate is publicly available online
with detailed explanations as to why the decision was taken, which is open to
scrutiny from the public, and I’m confident that if they ever started
selectively censoring content on political grounds, they’d get rumbled quite
quickly. This model could be used for an internet whitelist, with the added
safeguard that the moment any family stops trusting the filter, they can opt
out.
Now for the big complication: websites change. The BBFC have
the advantage that everything they certify is a finished product. However, a
website that has nothing objectionable today could have anything tomorrow. This
is especially a problem for internet forums and sites that rely on
user-uploaded content. So I suggest that an internet whitelist would need to be
based in part on a commitment to self-policing, and acting promptly to remove
unsuitable content. Or, for big sites such as YouTube (who are never going to
make their entire site family-friendly to get on to a whitelist), you could
have the option of selectively whitelisting content flagged as family-friendly
on the site, as long as the website can be trusted to enforce this.
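That selective option amounts to trusting the site’s own flag per item rather than whitelisting the whole domain. A minimal, entirely hypothetical sketch of what the filter would consume:

```python
# Hypothetical sketch: a large site tags each item of user-uploaded
# content with a family-friendly flag, and the whitelist filter
# admits only the flagged items rather than the whole site.
videos = [
    {"id": "a1", "title": "Nature documentary", "family_friendly": True},
    {"id": "b2", "title": "Late-night stand-up", "family_friendly": False},
    {"id": "c3", "title": "How volcanoes work",  "family_friendly": True},
]

def whitelisted_items(items):
    # Note: this trusts the site's own flagging. As the text says,
    # the scheme only works if the site enforces its flags honestly;
    # a site that mislabels content would lose its whitelist entry.
    return [item for item in items if item.get("family_friendly")]

print([v["id"] for v in whitelisted_items(videos)])  # ['a1', 'c3']
```

The weak point, of course, is the flag itself: the whole arrangement rests on the self-policing commitment described above, not on the filter.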
One big tripping point is Wikipedia. For all the criticism it
faces, Wikipedia is an important educational resource, but the Wikimedia
Foundation are adamant that Wikipedia
is not to be censored. Moderators are good at removing objectionable
material where it doesn’t belong, but they won’t skip over what happens in Debbie Does Dallas. In fact, adult
material relevant to the article can appear where you least expect it (such as
an innocent-looking article on classic 1980s cartoon Henry’s Cat).
There is, however, a Wikimedia-backed edition of Wikipedia
for schools, which is an excellent idea in its own right. This edition is
still in its infancy, but with a bit of support this could be everything that
primary schools could wish for.
So that’s my idea for a whitelist. But I’m the first to
agree that plans that look good on paper don’t always work in practice,
especially the complicated ones. And above primary school age, I can’t see this
solution being workable. Which brings me on to my second much simpler proposal,
which is … parents need to take responsibility.
I’m not a parent, so I’m not going to dictate to parents
what’s best for them, but the most convincing solution I’ve read to date is to
sit with your children whilst they’re on the internet, with simple rules such
as “Don’t talk to strangers” extended to “Don’t
talk to strangers online”. But whilst I’m sure some parents are being quite
sensible, we also have parents who assist children in bypassing
the age block on Facebook, or refuse to let children watch 18-rated films
but allow
them to play the most violent of 18-rated computer games. Something is
seriously falling down here.
No matter how good parental controls get, no matter how much
freedom parents have to make their own decisions, it is a mistake to view this
as a substitute for parental responsibility. Just like the pre-internet days, a
child or teenager who is determined to get round parental controls will find a
way somehow. This is a blog on IT so that’s enough of a digression into parenting,
but from an IT perspective, the message is simple: this is both a technology
problem and a social problem. Technological solutions can only help with the
technological problems – how you solve the social problem is up to you.
UPDATE 11/05/2012: Since I posted this last week, the government have announced plans to force ISPs to introduce an opt-out filter for internet pornography (meaning that everyone will be subjected to this filter unless they expressly request otherwise). Some will doubtless argue this is a political plan to win votes on the back of bad local election results. But I'm not really interested in the politics behind this. All I'm interested in is whether this can work. Except that it might not be possible to separate the two. Confused? Let me explain.
Sticking to my principle that the same law should apply on the internet as applies everywhere else, I can certainly see the case for applying this to stuff that would otherwise be certified R18, or a similar level. (I don't want to repeat what makes a video an R18 – if you really want to know you can read it here.) It makes no sense that a video that you would only be legally allowed to buy from a specialised adult store is also legally available to anyone who can switch off Google Safesearch. I have some sympathy with the argument that it should be up to parents to opt into this, but the problem is one of apathy. Many parents don't even consider whether they should use a family filter, let alone make a decision, and I'd much rather the default option for children was that this isn't available. Anyone who objects is welcome to opt out of the filter.
However, we must be realistic as to what this filter can achieve. Forcing ISPs to filter out R18-rated material is one thing, but any lower than that and you start running into all sorts of problems. Could you create workable automated filters for 12-, 15- or 18-rated material? My guess is no. Even if you can, is it possible to do it in a way that doesn't impede debate of adult issues? Sex education? Drug debates? Gun control? Rape law? This is treading on very dangerous ground. If we're not careful, we could see a repeat of the silly 1950s censorship rules which were circumvented using daft loopholes.
My worry is that if the government does not properly manage expectations, this could send us down the wrong path. Suppose the government introduces the filter, and instead of it being welcomed, parents complain that their teenage children move on to other sites that got round the government-approved definition of internet pornography. So the government create new tighter rules. And teenagers move on to other sites. And the public demand tighter rules still. And so it goes on. And all this time, more and more of the internet gets caught up in increasingly indiscriminate censorship.
Perhaps I'm being paranoid, but I can see this happening if this gets used to chase votes. Because whilst compulsory internet filters might be a vote-winner, and tightening up the rules further might win even more votes, telling the public that it's going too far – and consequently implying that middle-class parents should stop treating the issue as someone else's responsibility – could well be a vote-loser. There's no need to come anywhere near this nightmare scenario if it's handled sensibly, but sometimes politics gets in the way of sense. So I'll wait and see how this pans out before giving my verdict.