Another side of photorealism and VR—can AI save the day?
As focused as I have been over the decades on realism and the suspension of disbelief in computer graphics, and as naïve as it sounds, I never considered the potential dark side of accomplishing what we all sought: CG images indistinguishable from real, put to questionable purposes.
The dark side was brought to light by a Dartmouth professor who posed the question about CG child pornography: is it, or would it be, illegal? My first reaction was, who thinks up this kind of crap? On reflection, I realized it's a perplexing, and sadly interesting, question.
The idea of doctoring photos used in a criminal or even capital case is not new; it was actually done before computers were capable enough. And the idea of creating a photorealistic still image isn’t new—have you seen a real photograph of a new car lately, or even a video of one?
And we've had some movies of people, albeit in shadowy, dimly lit conditions, that we could accept as real (e.g., Gravity). But porn? Porn in computer games, or in VR, isn't new, but it was never realistic, and there was no suspension of disbelief: the physics and physiology weren't right, the surface treatment was off, so we never believed the images were authentic.
Now, with better-than-ever CG capabilities, the unreal can be made to look real.
OK, so what? Is CG porn less expensive to make, or easier to stage? Is this a serious issue?
That's the question: what if the explicit scene involves children, or torture? It's illegal to counterfeit money and even postage stamps. Is the issue about counterfeiting an object or counterfeiting an illegal act?
Hany Farid, a professor of computer science at Dartmouth and the author of a study on the topic, says, “We expect that as computer graphics technology continues to advance, observers will find it increasingly difficult to distinguish computer-generated from photographic images. While this can be considered a success for the computer-graphics community, it will no doubt lead to complications for the legal and forensic communities.”
Farid found that observers have considerable difficulty performing this task—more difficulty than observed 5 years ago, when computer-generated imagery was not as photorealistic. (The findings, which have implications for the legality and prosecution of child pornography, appear in the journal ACM Transactions on Applied Perception. A PDF is available on request.) Farid’s team also found observers were now more likely to report that an image was an actual photograph rather than computer generated, and that resolution had surprisingly little effect on performance. And he discovered that a small amount of training greatly improves accuracy.
OK, so far so good. If we can be trained to spot the fraud, then presumably our deep-learning visual training algorithms, which are so good at identifying terrorists and children brushing their teeth, can also be so trained. Which means this is a non-problem forensically: computers made it, computers can detect it.
But if humans consume it, and sell it, is it illegal? If humans sell synthetic narcotics, that’s illegal. Does it follow that synthetic porn is as well?