A difficult problem in image processing is the statistical representation of texture. We show that a powerful statistical representation of images can be extracted from a recent development built on Convolutional Neural Networks (CNNs): neural style transfer, in which the artistic style of one image is transferred onto another. We present two of our applications of this CNN-derived representation of texture. The first performs unsupervised characterization of a set of textures by mapping them to a low-dimensional space while preserving the relationships between the textures. This is designed to provide automatic quantification of materials microstructures and thereby enable the use of microstructure in materials informatics and design. The second is the in-painting of texture-like images, in which missing regions of an image are replaced with synthesized data that recreates the texture of the known portions of the image.

Host: Chris Neale
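As background, the texture statistics used in neural style transfer are commonly taken to be Gram matrices of feature maps from a pretrained CNN (following Gatys et al.). The sketch below assumes that formulation; the talk's exact representation may differ. The choice of VGG-19, the layer indices, and the preprocessing are illustrative assumptions, not details from the abstract.

```python
# Minimal sketch: a Gram-matrix texture descriptor from a pretrained CNN.
# Assumptions: VGG-19 features, illustrative layer indices, standard ImageNet-style
# preprocessing. This is one common formulation, not necessarily the speaker's method.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image


def gram_matrix(features):
    """Channel-by-channel correlations of a feature map: the texture statistic."""
    b, c, h, w = features.shape
    f = features.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)


def texture_descriptor(image_path, layer_indices=(3, 8, 17, 26)):
    """Concatenate Gram matrices from several VGG-19 layers into one fixed-length vector."""
    vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
    preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    grams = []
    with torch.no_grad():
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in layer_indices:
                grams.append(gram_matrix(x).flatten())
    return torch.cat(grams)  # statistical representation of the image's texture
```

Descriptors like this could, in principle, support both applications described above: pairwise distances between them can be embedded into a low-dimensional space for unsupervised characterization, and missing regions can be synthesized by optimizing pixels so that their descriptor matches that of the known portions of the image.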