Keeping Abreast of Pornographic Research in Computer Science
A growing number of Ph.D.s and grad students are choosing to study pornography. Techniques for the analysis of "objectionable images" are attracting increasing attention (and grant money) from governments and research institutions around the world, and from Google. But what, exactly, does computer science have to do with porn? In the name of academic pursuit, let's roll up our sleeves and plunge deeply into this often hidden area that lies between the covers of top-shelf research journals.
Lena
One cannot do research in image processing without encountering Lena (pronounced "Lenna"). The image of the woman in a feathered hat has become the de facto test image for many algorithms, appearing in thousands of articles and conference papers. And it is of pornographic pedigree:
Alexander Sawchuk estimates that it was in June or July of 1973 when he, then an assistant professor of electrical engineering at the USC Signal and Image Processing Institute (SIPI), along with a graduate student and the SIPI lab manager, was hurriedly searching the lab for a good image to scan for a colleague's conference paper. They had tired of their stock of usual test images, dull stuff dating back to television standards work in the early 1960s. They wanted something glossy to ensure good output dynamic range, and they wanted a human face. Just then, somebody happened to walk in with a recent issue of Playboy.
The engineers tore away the top third of the centerfold so they could wrap it around the drum of their Muirhead wirephoto scanner, which they had outfitted with analog-to-digital converters (one each for the red, green, and blue channels) and a Hewlett Packard 2100 minicomputer. The Muirhead had a fixed resolution of 100 lines per inch and the engineers wanted a 512 x 512 image, so they limited the scan to the top 5.12 inches of the picture, effectively cropping it at the subject's shoulders.
The rest of the story (and the rest of Lena) can be found here. Indeed, the '70s marked the beginning of a long relationship between computer science and pornography. However, after the birth of the World Wide Web, things really got hot and heavy.
Finding Naked People
In the 1990s the World Wide Web began to explode, pumping information of all kinds into the homes of the technologically savvy at rates as high as 9600 bits per second. It was the time when search engines such as WebCrawler, AltaVista, and Yahoo began the arduous task of spidering the scattered bits of information on Internet servers everywhere. The problem was that someone might search for a completely innocuous query such as the Trojan Room Coffee Pot, and come up with images that were unexpected, inappropriate, and, depending on one's tastes, objectionable.
It's not likely to be on his business card, but David A. Forsyth is an expert in web pornography, having served on the NRC committee for this topic. It is evident from his web page that he has a sense of humour, which explains the superbly descriptive title for his 1996 paper, Finding Naked People. Forsyth was one of the first researchers to study the problem of identifying objectionable content.
One of Forsyth's research areas is tracking people in images and videos and figuring out their pose. In the general case, the system has to cope with the fact that people can wear clothes. It would be easier if the subjects all wore the same colour, or didn't wear anything at all. Finding Naked People describes a way of first masking out areas of skin. The areas are then grouped together into human figures (visualized by drawing a stick figure on the image). The crux of the paper is the grouping algorithm. The grouper knows rules such as how limbs fit together into a body, and the fact that a person cannot have more than two arms. Using the rules, it figures out how to superimpose a body onto the skin patches. If it can successfully do this, the image probably contains a naked person. If it cannot, then it is probably something else, like a lamp.
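To give a flavour of how such a pipeline starts, here is a toy sketch in Python. It is only a sketch: the thresholds are one well-known RGB skin heuristic from the literature, not the paper's colour-and-texture skin model, and the real figure grouper is far more than the crude gate shown here.

```python
import numpy as np

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    """Mark pixels whose colour falls in a crude 'skin' range.
    (A stand-in for the paper's colour-and-texture skin filter.)"""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (r - g > 15)

def worth_grouping(rgb: np.ndarray, min_skin_fraction: float = 0.3) -> bool:
    """Stage one of the pipeline: images with too little skin are
    dismissed outright; survivors go on to the geometric grouper,
    which tries to assemble the skin regions into a human figure."""
    return skin_mask(rgb).mean() >= min_skin_fraction
```

An image that clears this gate would then be handed to the grouper, which is where all the real work in the paper happens.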
Here is a visualization of the skin probability field from the paper, with the grouper output segments superimposed on top:
It's better with more than one
Finding Naked People piqued a lot of interest in the field of objectionable images, and the skin matching idea is now the first step in many algorithms. However, as James Ze Wang of Stanford notes, "it takes about 6 minutes on a workstation for the figure grouper in their algorithm to process a suspect image passed by the skin filter."
In their System for Screening Objectionable Images, Wang and his colleagues describe the WIPE method for screening content. They use a wavelet edge-detection algorithm to extract the shapes in an image. Edge detection reduces an image to the outlines of the objects it contains; the wavelet version lets them tune the detector to pick up sharp or increasingly blurry edges until well-defined shapes emerge.
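The general idea is easy to sketch with an off-the-shelf wavelet library. This is my own minimal illustration using PyWavelets, not the WIPE implementation; the paper's actual choice of wavelet and thresholds differs.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_edge_map(gray: np.ndarray, wavelet: str = "db1",
                     level: int = 1) -> np.ndarray:
    """Edge strength from the detail coefficients of a 2-D wavelet
    decomposition. Raising `level` shifts sensitivity toward coarser,
    blurrier edges: the 'tuning' described above."""
    coeffs = pywt.wavedec2(gray, wavelet, level=level)
    ch, cv, cd = coeffs[1]  # horizontal, vertical, diagonal details
    return np.sqrt(ch ** 2 + cv ** 2 + cd ** 2)
```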
Image moments allow one to treat any shape as a flat physical object (like a plate). You can compute its centre of gravity, axis of symmetry, and other properties that don't change when you move, rotate, or resize the object. This typically yields a set of 3 to 7 numbers that can be used to measure how similar two shapes are. Moments were used in early OCR (optical character recognition) algorithms circa 1962.
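That 1962 reference is Hu's seven moment invariants, which are still one call away in OpenCV. A minimal sketch (my example, not Wang's code):

```python
import cv2
import numpy as np

def shape_signature(mask: np.ndarray) -> np.ndarray:
    """Hu's seven moment invariants of a binary shape: unchanged by
    translation, rotation, and scaling."""
    moments = cv2.moments(mask.astype(np.uint8), binaryImage=True)
    return cv2.HuMoments(moments).flatten()

def shape_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two signatures on a log scale, since the raw values
    span many orders of magnitude."""
    def log_scale(h):
        return np.sign(h) * np.log10(np.abs(h) + 1e-30)
    return float(np.abs(log_scale(a) - log_scale(b)).sum())
```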
Wang uses both edge detection and image moments in the analysis. His algorithm differs from modern ones in that an image must pass a series of five YES/NO tests. Later algorithms would instead combine the detectors using statistical methods and produce a probability estimate.
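The structural difference is easy to see side by side. The test functions and weights below are made up for illustration; only the shape of the two approaches is the point.

```python
import math

# Hypothetical stand-in tests; the paper's five checks involve image
# dimensions, graph/photo discrimination, skin, and shape analysis.
def size_test(img):  return img["pixels"] >= 96 * 96
def skin_test(img):  return img["skin_fraction"] > 0.25
def shape_test(img): return img["shape_match"] > 0.5

def wipe_style(img) -> bool:
    """Cascade of YES/NO tests: a single NO passes the image as
    acceptable; all YES answers flag it as objectionable."""
    return all(t(img) for t in (size_test, skin_test, shape_test))

def modern_style(img, w=(3.0, 2.0), bias=-2.5) -> float:
    """Statistical combination: detector scores feed a logistic model
    that outputs a probability rather than a hard verdict."""
    z = bias + w[0] * img["skin_fraction"] + w[1] * img["shape_match"]
    return 1.0 / (1.0 + math.exp(-z))
```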
Here are some examples where the algorithm fails. We have blurred them to protect the eyes of the gentle reader. For high-resolution versions, you'll have to refer to the Proceedings of the 4th International Workshop on Interactive Distributed Multimedia Systems and Telecommunication Services, page 20 (the dog-eared one).
Getting a leg up on skin models
Skin detection is an important step in porn detection, but figuring out which colours represent skin is a hard problem. Colour depends on the lighting in the photo, the ethnicity of the subjects, and the image's quality and noise level. Michael J. Jones and James M. Rehg at Compaq studied the problem in detail. They first manually labeled hundreds of images, highlighting all the areas that were skin using a custom drawing application. Once you have billions of pixels that you know are skin, and billions that you know are not, you can easily classify them using introductory math:
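(The notation below is mine rather than a transcription from the paper; the method is standard Bayes' rule over the two labeled pixel sets.)

\[
P(\mathrm{skin} \mid c) \;=\; \frac{P(c \mid \mathrm{skin})\,P(\mathrm{skin})}{P(c \mid \mathrm{skin})\,P(\mathrm{skin}) \;+\; P(c \mid \neg\mathrm{skin})\,P(\neg\mathrm{skin})}
\]

Here \(c\) is a pixel's colour, the likelihoods are read off histograms built from the labeled pixels, and a pixel is declared skin when the probability clears a chosen threshold.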
The paper describes how to find the probability function, P, using a database of images painstakingly highlighted by an army of enthusiastic research interns. However, as a porn detector, the method needs work.
It will be obvious to anyone who has bought a digital camera recently how to improve this system. Before reading on, can you spot the solution?
Taking the ogle out of Google
In recent years, Google has had its hands full with the problem of pornographic imagery. Henry A. Rowley, Yushi Jing, and Shumeet Baluja at the Mountain View campus have developed a system that combines skin detection with a number of other features. After running face detection, they can deduce that the pixels around a detected face represent skin colour, and use them to find other skin pixels in the image. If the face makes up the majority of the image, as in a portrait, the image is safe. They also use a colour histogram to detect artificial images such as screenshots (so dirty cartoons are safe?).
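A hedged sketch of that bootstrapping trick: the face box would come from any off-the-shelf face detector, and the simple Gaussian colour model is my simplification, not necessarily Google's.

```python
import numpy as np

def skin_from_face(image_rgb: np.ndarray, face_box) -> np.ndarray:
    """Sample the colours inside a detected face box, then mark every
    pixel in the image whose colour sits within a few standard
    deviations of the face's mean colour."""
    x, y, w, h = face_box
    face = image_rgb[y:y + h, x:x + w].reshape(-1, 3).astype(float)
    mean, std = face.mean(axis=0), face.std(axis=0) + 1e-6
    z = np.abs((image_rgb.astype(float) - mean) / std)
    return (z < 2.5).all(axis=-1)  # per-pixel skin-likeness mask
```

The appeal of the design is that the skin model adapts to each photo's lighting and subjects instead of relying on one global colour range.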
Doing what only Google could, they must have set a record for the rate of pornographic analysis. They evaluated the speed of the algorithm on a corpus of around 1.5 billion thumbnail images, each less than 150 x 150 pixels. "Processing the entire corpus took less than 8 hours," the team bragged, "using 2,500 computers."
Bags of visual words (Arm, leg, or...?)
In 2008, Thomas Deselaers et al. brought a technique to porn detection from another corner of artificial intelligence. Large news databases can automatically classify articles based on the words they contain. Articles with the names of political figures or sports jargon can be categorized by machines that don't really need to understand what the article is about, and techniques exist for the machines to learn on their own which words or names are important. The same methods can be applied to images, using visual words.
To create the visual vocabulary, they extract image patches around "points of interest," the parts of an image most likely to contain distinctive features. The patches are then scaled to a common size and analyzed using PCA to find commonalities. It is similar to face detection, but for things that aren't faces. Colour is also taken into account in the analysis; because colour is already part of the "vocabulary," a separate skin-detection step is unnecessary.
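Assuming scikit-learn for the PCA and clustering steps (Deselaers' exact pipeline differs in its details), the vocabulary-building step looks roughly like this:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def build_vocabulary(patches: np.ndarray, n_components: int = 40,
                     n_words: int = 200):
    """patches: (N, h*w*3) colour patches cut out around interest
    points and scaled to a common size. PCA compresses them; k-means
    centroids become the 'visual words'."""
    pca = PCA(n_components=n_components).fit(patches)
    vocab = KMeans(n_clusters=n_words, n_init=10).fit(pca.transform(patches))
    return pca, vocab

def bag_of_words(patches: np.ndarray, pca, vocab) -> np.ndarray:
    """Describe one image as a histogram over its nearest visual words,
    exactly as a text classifier counts terms in an article."""
    ids = vocab.predict(pca.transform(patches))
    return np.bincount(ids, minlength=vocab.n_clusters) / max(len(ids), 1)
```

The resulting histograms can then be fed to any standard classifier, just as word counts are in text categorization.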
Using this technique, Deselaers is even able to go beyond simple YES/NO classification and reach a new level of precision. The algorithm rates images into one of five categories of increasing offensiveness: benign, lightly dressed, partly nude, fully nude, and porn. The paper contains examples from each category, and is guaranteed to offend somebody.
Corpus non indutus
At the end of the Google paper, the authors speculate on how to spur further advances:
...because of the ubiquity of the Internet, search engines, and the widespread proliferation of electronic images, adult-content detection is an important problem to address. To improve the rate of progress in this field it would be useful to establish a large fixed test set which can be used by both researchers and commercial ventures.
Yes, bring on the grant-sponsored porn, so that researchers can make the world a better place. But despite the years of study, one question remains unanswered: if such a corpus existed, how would we find it?
For a good time, read this
Anyway, I have a question for you: do you have any idea how to find data or studies about pornography and race?
I want to know:
- the percentage of male and female actors across the different races
- the percentage of male and female consumers across the different races
ciao
s.
You have a very strange definition of pornography. It's hard to imagine a more tasteful or artistic nude.