Chapter 20 Answer guidance to end of chapter questions

Obscenity in the information society
  1. On the morning of Tuesday 1 August 2019, the Metropolitan Police raided the offices of a major corporate client of your firm. The raid revealed a series of potentially unlawful images on the client’s cloud server, based in San Francisco. These images belonged to one of the client’s employees, who had originally stored them on his desktop but had transferred them to the cloud server before deleting them from his PC. The employee in question admitted in an interview that he likes to look at pornographic materials during his lunch hour. He claims the images are perfectly legal and ‘not unlike those you will find in any top-shelf magazine’. The client’s representatives have looked at some of the sites he is visiting and are extremely concerned that some of the models look to be under 18. The employee says that all the models are over 18, but that the images are morphed a little using photo-manipulation software to make them look a little younger, and that the site carries a disclaimer saying all models are over 18. The same employee also has several images on his computer of him engaging in what the client describes as ‘dangerous, and potentially even fatal, sexual acts’. During his interview with the client the employee denied that the acts were dangerous, stating that it was just ‘a bit of fun between consenting adults’.

      You have been asked to determine whether the actions of the employee are potentially illegal in English law and whether the client is doing anything illegal, or whether it could be liable for its employee’s actions.

To answer this question I would expect the student to discuss in order:

  1. The first question is whether there may be a possession offence here. As the items are held on the client’s computers and servers, we need to know whether the employee (and in turn the client) may be committing a possession offence. Generally, possession of a non-extreme pornographic image is not an offence – the Obscene Publications Act 1959 criminalises publication, not possession. However, there are two possible concerns here: first, the apparent age of the models; and second, the description of the content as ‘dangerous, and potentially even fatal, sexual acts’.
  2. Turning first to the age issue. It is an offence under s.160 Criminal Justice Act 1988 for a person to have any indecent photograph of a child in his possession. The offence covers possession of an image of a person under 18, whether or not they look older than they actually are. As noted in the text, “in the UK it is not technically illegal to possess an indecent image of a person 18 or over who looks younger than they are”; however, s.160 also polices pseudo-images, that is, manipulated images. That appears to be the position here: by s.160 it is illegal to possess a computer-manipulated or computer-generated image which is specifically designed to create the impression that a minor is portrayed. The discussion may look at the relevant cases: R v Bowden and Goodland v DPP.
  3. Turning next to the issue of the “dangerous” images: these may be illegal as extreme pornography under s.63 of the Criminal Justice and Immigration Act 2008, which makes it an offence to possess an image portraying any of five categories of act:
      (a) an act which threatens a person’s life;
      (b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals;
      (c) an act which involves sexual interference with a human corpse;
      (d) a person performing an act of intercourse or oral sex with an animal (whether dead or alive); or
      (e) an act which involves the non-consensual penetration of a person’s vagina, anus or mouth by another with the other person’s penis, or an act which involves the non-consensual sexual penetration of a person’s vagina or anus by another with a part of the other person’s body or anything else,
      where a reasonable person looking at the image would think that the persons (or animals) portrayed were real. The images here may breach (a) or (b) (they would have to be reviewed). Recent cases, R v Oliver (Philip) and R v Okoro (Cyprian), give guidance on possession but suggest an offence may be committed by the employee and potentially (with knowledge now in place) by the client.
  1. Can we treat ‘extremely pornographic images’ in the same way as child abuse images? Is s.63 of the Criminal Justice and Immigration Act proportionate to the harm?

To answer this question I would expect the student to discuss in order:

  1. A definition of extremely pornographic images drawn from s.63 of the Criminal Justice and Immigration Act.
  2. Why would we criminalise an activity? Usually either the direct harm or indirect harm principle (see Q.1). In these cases there is no direct harm. Is there indirect harm? With non-photographic pornographic images of children (NPPICs) there is a risk of indirect harm, as with pseudo-images; in particular, there is a risk such images may be used to seduce a child. With extreme images it is not clear there is any indirect harm. Why, then, criminalise possession? The government took the decision to outlaw the possession of such images on public policy grounds rather than on the harm principle. The policy justification was that ‘there is a small category of pornographic material which is so repugnant that, in common with child abuse images, its possession should not be tolerated’.
  3. Discuss whether these provisions are proportionate. What is the risk of indirect harm?
  4. The question of proportionality in extreme images is more vexed. A large proportion of the BDSM community campaigned against the provision. The concessions made at the time meant the Act failed to properly regulate rape imagery, something for which the government is now having to consider new legislation. There have been a number of prosecutions, mostly for possession of bestiality images. These, though, do not suggest a public policy campaign, and it may fairly be asked what the purpose of s.63 truly is – see e.g. R v Ping Chen Cheung.
  1. Will age verification (as set out in the Digital Economy Act 2017) be effective at all? Is the age verification process proportionate to the risk of harm?

To answer this question I would expect the student to discuss in order:

  1. The process for age verification as set out in the Digital Economy Act 2017, focusing on the operative provision of s.14(1), ‘A person contravenes this subsection if the person makes pornographic material available on the internet to persons in the United Kingdom on a commercial basis other than in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18’, and the creation of a regulator by s.16.
  2. The system for regulation as set out in s.25 and the BBFC guidance, including enforcement mechanisms under ss.18, 19 and 23, with a focus on the controversial blocking mechanisms under s.23 as well as the payment block under s.21.
  3. An evaluation of the proportionality of such an approach – relevant ECHR case law may be used here at the student’s discretion.