The machine overlords of the future may now, if it pleases them, eliminate all black and white imagery from the history of their meat-based former masters. All they’ll need is this system from Berkeley computer scientist Richard Zhang, which allows a soulless silicon sentience to “hallucinate” colors into any monochrome image.
It uses what’s called a convolutional neural network (several, actually) — a type of computer vision system that mimics low-level visual systems in our own brains in order to perceive patterns and categorize objects. Google’s DeepDream is probably the best-known example of one. Trained by examining millions of images of, well, just about everything, Zhang’s system of CNNs recognizes things in black and white photos and colors them the way it thinks they ought to be.
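To get a feel for the "low-level pattern" part, here's a minimal sketch (not Zhang's code, just an illustration) of what a single convolutional filter does: it slides a small kernel over an image and responds strongly where a particular pattern — here, a vertical brightness edge — appears.

```python
import numpy as np

# A convolutional layer slides a small filter over the image.
# This toy edge filter "fires" where brightness jumps from left to
# right -- the kind of low-level feature early CNN layers pick up.

def conv2d(img, kernel):
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

img = np.zeros((5, 6))
img[:, 3:] = 1.0                # dark left half, bright right half
edge = np.array([[-1.0, 1.0]])  # simple vertical-edge detector
resp = conv2d(img, edge)        # peaks exactly at the dark-to-bright boundary
```

A real CNN stacks many such learned filters, so later layers respond to textures and whole objects rather than simple edges.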
Grass, for instance, has certain features — textures, common locations in images, certain other things often found on or near it. And grass is usually green, right? So when the network thinks it recognizes grass, it colors that region green. The same goes for certain types of butterflies, building materials, flowers, the nose of a certain dog breed and so on.
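The underlying trick is that a black and white photo already contains the lightness channel of the image; the network only has to fill in the chrominance. Here's a hedged toy sketch of that framing in Lab color space, where a hypothetical `predict_ab` stub stands in for the trained CNN (which would actually look at texture and context before deciding on a color):

```python
import numpy as np

# Colorization framed as chrominance prediction: keep the input
# lightness (L) and predict the two chroma channels (a, b).
# predict_ab is a placeholder for the real trained network.

def predict_ab(L):
    """Stand-in for the CNN: assign (a, b) chroma to each pixel.
    Here we just paint everything a flat grassy green."""
    h, w = L.shape
    ab = np.zeros((h, w, 2))
    ab[..., 0] = -30.0  # a < 0 leans green
    ab[..., 1] = 30.0   # b > 0 leans yellow
    return ab

def colorize(L):
    ab = predict_ab(L)
    # Stack lightness with predicted chroma into a (h, w, 3) Lab image
    return np.concatenate([L[..., None], ab], axis=-1)

gray = np.full((4, 4), 60.0)  # a flat mid-gray patch, L in [0, 100]
lab = colorize(gray)          # same lightness, now with green chroma
```

The real system replaces that flat guess with per-pixel predictions learned from millions of photos, which is why it can give a butterfly's wing and the grass behind it different colors.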