Google’s Computers Paint Like Van Gogh, and the Art Sells for Thousands
Google is trying to make art freaks out of computer geeks.
Alphabet Inc.’s Google put on an art show and auction in San Francisco on Friday, displaying works created by computers, with some guidance from humans. The images on display included psychedelic seascapes, van Gogh-inspired forests and fantastical landscapes of castles and dogs. A professional auctioneer hawked the six largest works for as much as $8,000.
“I used to think art was some peculiar thing that humans do,” said Blaise Aguera y Arcas, head of Google’s machine intelligence group in Seattle, who helped organize the art event. “But now I think when we meet the aliens, they’ll have something just like it.”
The art and tech worlds have existed with little overlap in the Bay Area. Tech entrepreneurs have not become big art collectors, and galleries and art fairs in Silicon Valley have floundered. One theory goes that the tech industry is populated with engineers who didn’t study the humanities and don’t appreciate art.
Until there’s a math angle to it.
Google engineers applied an algorithm to the artistic process, giving art quantitative and mathematical attributes. Computers, with human helpers, used a technology called neural networks to make 29 works of art for the show.
Google originally developed it to help identify objects in photos. But for art’s sake, engineers turned it on its head, feeding images of random shapes into computer algorithms, which then reported what objects the images looked like, such as dogs, faces and trees.
The algorithms then incrementally changed the images to make them look more like the objects they reported identifying. The artists, some of them Google engineers and others independent, used four techniques: “DeepDream” involved running the process thousands of times to create a unique image; “Fractal DeepDream” ran the same process at different scales, producing fractal images; “Class Visualization” zeroed in on a single object; and “Style Transfer” created new images mimicking a famous aesthetic, say Vincent van Gogh’s “Starry Night.”
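The feedback loop described above, in which an algorithm repeatedly nudges an image toward whatever it reports seeing, can be sketched in miniature. This is a toy illustration, not Google’s actual code: a real DeepDream follows gradients through a trained neural network, while here a stand-in “network” simply scores how closely an image matches a fixed template, and the loop follows that score’s gradient.

```python
import numpy as np

def dream(image, template, steps=500, lr=0.01):
    """Toy stand-in for DeepDream's loop: repeatedly nudge `image`
    toward whatever the scoring function reports seeing.

    The score here is negative squared distance to `template`
    (a placeholder for a network's "this looks like a dog" signal).
    """
    for _ in range(steps):
        grad = -2.0 * (image - template)  # gradient of the score w.r.t. the image
        image = image + lr * grad         # small step that amplifies the resemblance
    return image

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.random((8, 8))   # what the stand-in "network" expects to see
    start = rng.random((8, 8))      # random shapes, as in the article
    result = dream(start.copy(), template)
    # After many small steps, the image resembles the template far more closely.
    print(np.sum((result - template) ** 2) < np.sum((start - template) ** 2))
```

Running the real process thousands of times through a deep network, rather than this toy scorer, is what produces the hallucinated dogs, faces and trees in the exhibited works.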
The show was held in a historic movie theater in the hip Mission neighborhood. Attendees gathered around small tables nibbling purple cauliflower and fancy broccoli and sipping a local IPA beer. At the front of the room, a well-heeled crowd in fashionable, long dresses and boots sent champagne corks flying onto the stage floor. Bouncers controlled a large standing crowd at the back of the room – the fleece pullover, sneaker-wearing art-show newbies.
The group in back whispered, talked and pointed during the auction. Emily Quinn, the professional auctioneer leading the sale, had to remind them that if they didn’t have money to spend, they could still give their time and attention. “Thank you to all of you saying ‘shh!’” she said.
The main works went for $2,200 to $8,000. The proceeds from the event went to the Gray Area Foundation for the Arts, an organization that promotes art and technology.
Like many members of the audience, Poorna Omprakash, a software engineer at Amazon.com Inc. who studied machine learning, had never attended an art event before.
“I’m here because I want to know how they made the art,” said Ms. Omprakash. “But my second reaction is that it is beautiful!”