Images scored by NIMA and those by real people in brackets.
Google's research team is working with AI and photography again, but this time it has taught the AI to rate images for their aesthetic and technical qualities, as a person would, rather than simply labeling images as 'high' or 'low' quality.
The new AI builds on Google's previous research into convolutional neural networks (CNNs), which you can learn more about over on Wikipedia. In a nutshell, when presented with an image previously, the AI could classify it based on what objects are in it or on its technical quality (noise, artifacts, etc.), but it was unable to assess the image for beauty and the other ways a person would judge an image's quality.
For the new 'Neural Image Assessment' (NIMA) method, Google has taught a CNN to predict which images a typical person would rate as looking good (technically) or attractive (aesthetically), using state-of-the-art deep object recognition networks to expand the AI's knowledge of object categories.
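According to Google's description of NIMA, the network doesn't output a single number directly; it predicts a distribution over the 1-10 rating buckets human raters use, and the image's score is the mean of that distribution. A minimal NumPy sketch of that final step (the predicted distribution here is made up for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical softmax output of a NIMA-style network: a probability
# distribution over the rating buckets 1..10 for one image.
predicted_dist = np.array([0.01, 0.02, 0.03, 0.05, 0.10,
                           0.15, 0.25, 0.20, 0.12, 0.07])

buckets = np.arange(1, 11)  # possible ratings 1..10

# The image's quality score is the expected value of the distribution.
mean_score = float(np.sum(buckets * predicted_dist))

# The spread of the distribution hints at how divided raters would be.
std_score = float(np.sqrt(np.sum(predicted_dist * (buckets - mean_score) ** 2)))

print(round(mean_score, 2))  # 6.87
```

Predicting the whole distribution rather than one number is what lets the model mimic the fact that real raters rarely agree on a single score.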
So far, the AI has scored images reliably and with high correlation to the mean scores given by human raters. NIMA scores can also be used to compare the quality of images of the same subject that may have been distorted in various ways.
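One way to quantify "high correlation with mean human scores" is a linear correlation coefficient between the model's scores and the raters' means across a set of images. A toy sketch with invented numbers (neither column comes from the paper):

```python
import numpy as np

# Hypothetical mean human ratings for six images (1..10 scale).
human_means = np.array([7.2, 4.1, 8.5, 5.0, 6.3, 3.2])
# Hypothetical NIMA-style predictions for the same six images.
model_scores = np.array([7.0, 4.5, 8.2, 5.4, 6.0, 3.5])

# Pearson correlation: close to 1.0 means the model's scores rise and
# fall together with the human consensus.
r = float(np.corrcoef(human_means, model_scores)[0, 1])
print(r > 0.9)  # True
```

The same scoring can rank several distorted versions of one photo (compressed, blurred, etc.), since each version simply gets its own score.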
Going forward, the Google research team believes the AI could be used to help automate image editing, enable improved picture-taking with real-time feedback to the user, and, the most obvious use of all, allow people to easily find the best image in a collection, which could be useful for stock sites as well as the everyday user searching personal collections.
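That last use case boils down to scoring every image in a collection and sorting. A sketch where fixed numbers stand in for the scores a NIMA-style model might produce (the filenames and values are hypothetical):

```python
# Hypothetical scores for a folder of photos of the same scene; in
# practice each value would come from running the CNN on the file.
scores = {
    "beach_blurry.jpg": 4.2,
    "beach_sharp.jpg": 7.8,
    "beach_overexposed.jpg": 5.1,
}

# Rank the collection best-first and pick the top image.
ranked = sorted(scores, key=scores.get, reverse=True)
best = ranked[0]
print(best)  # beach_sharp.jpg
```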
(Via Photography Talk)