The Google Art Critique Algorithm - Ethics of Machine Learning
Do you have a Google account? Do you take photos on your smartphone? If so, your photos are probably backed up to Google's servers and maybe manipulated. It is all done by the Google Art Critique.
Yes, Google has implemented a feature that automatically applies filters (like panoramas, color effects or time-lapse animations made from several photos). Google applies it to some of your photos, seemingly at random. Well, seemingly... An example: recently, on a family trip, I took a photo of a bonfire by the sea with my smartphone, and a few hours after returning to WiFi, a new photo appeared in my album.
Here is the adjusted photo:
And here is the original:
The Google algorithm had applied a filter that enhanced and changed some of the colors. Fine, I thought; it even looked kind of cool, so I shared it on Instagram that day. At first I did not give it much thought, but after a while questions started to nag me.
Why did Google apply those filters, and not others?
Why did it not choose to make a panorama or a time lapse?
Why exactly that photo?
So, something in Google's cloud decides to apply filters to some of the photos. I did not select this photo for manipulation, so clearly some sort of algorithm is at work here. I am not, nor have I ever been, an employee of Google, but I have some experience with machine learning software, and even though I have no exact way of knowing how this selection works, I would put a strong bet on this algorithm being based on some sort of machine learning technology. Uploaded photos are analyzed and classified as either normal or suitable for manipulation; the suitable ones get manipulated and added to the user's collection. Quite simple really!
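To make my guess concrete, here is a minimal sketch of the kind of pipeline I imagine. None of these function names or details come from Google; they are placeholders I made up to illustrate the idea of "classify, then manipulate the suitable ones".

```python
# Purely hypothetical sketch of the imagined pipeline. Every name and value here
# is a placeholder, not anything known about Google's actual system.

def extract_features(photo_path: str) -> list[float]:
    # Placeholder: a real system would analyze the image (colors, composition,
    # detected objects, ...). Here we just return a fixed dummy feature vector.
    return [0.6, 0.5, 0.2]

def is_suitable_for_manipulation(features: list[float]) -> bool:
    # Placeholder for the decision of a trained classifier.
    return sum(features) > 1.0

def apply_filter(photo_path: str) -> str:
    # Placeholder: would apply a color effect, panorama stitch, time lapse, etc.
    return photo_path.replace(".jpg", "-effects.jpg")

def process_upload(photo_path: str, album: list[str]) -> None:
    """Classify an uploaded photo; if judged suitable, add a filtered copy to the album."""
    features = extract_features(photo_path)
    if is_suitable_for_manipulation(features):
        album.append(apply_filter(photo_path))

album: list[str] = []
process_upload("bonfire-by-the-sea.jpg", album)
print(album)  # ['bonfire-by-the-sea-effects.jpg'] -- the dummy classifier accepts the dummy features
```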
Let's start with the selection. Which photos should be targeted for manipulation? The ones that are photographically good, beautiful, or exciting, of course. But how do you tell a piece of software what is good, beautiful or exciting? Beauty is, as the song says, in the eye of the beholder. In other words, beauty is not absolute, it is not static, and it certainly differs from culture to culture. For machine learning software to work, it needs a set of photographs that have already been classified as good, exciting or beautiful. This is typically referred to as "training data". Once the training data is available, the algorithm is trained to classify the data over and over again, until it succeeds.
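A minimal sketch of what that training step could look like, assuming each photo has already been reduced to a numeric feature vector and labelled by hand as "beautiful" or not. The data below is random and the model is a simple off-the-shelf classifier; the point is the workflow, not the specific model Google might use.

```python
# Sketch of training a binary "beauty" classifier on hand-labelled training data.
# All features and labels here are made up; a real system would use image features
# and human-provided labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 16))       # stand-in for per-photo image features
labels = (features[:, 0] > 0).astype(int)   # stand-in for human labels: 1 = "beautiful"

# Hold out a test set so we can check whether the training "is successful".
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

classifier = LogisticRegression()
classifier.fit(X_train, y_train)            # the repeated classify-and-adjust loop happens inside fit()
print("accuracy on held-out photos:", classifier.score(X_test, y_test))
```

Whatever the humans who labelled the training photos considered beautiful is exactly what a model like this ends up reproducing, which is the ethical point of the next paragraph.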
From a technological point of view, it is a challenge (and probably satisfying) to develop such an algorithm and make it work. But there is an implicit ethical problem here. The beliefs and views of the person or persons who defined the training data are implicitly coded into the algorithm. It is what those persons considered good, beautiful or exciting that the algorithm learns. This view of photography is implemented and applied whenever people upload photos to Google. Google has not only developed a tool for enhancing photographs, but actually a rudimentary Google Art Critique algorithm. Like it or not!
We have only seen the beginning of the use of this kind of technology, and as it becomes more widespread, there must be transparency and awareness of the ethical challenges that lie in defining machine learning training data. The ethical implications and consequences of the Google Art Critique are not very significant when it comes to photography. But imagine what this means for machine learning algorithms used in medical decision systems, insurance ratings or mortgage approval applications.
Traditional software is becoming more and more regulated in critical applications (like controlling an aircraft, a pacemaker or a nuclear power plant). But it is equally important to address the ethical side of machine learning training data.
I've had these same thoughts when my little Google Assistant spiffs up photos for me that I recently took. I love diving into machine learning. Great points you've made!
Thank you, I guess we are only seeing the beginning.