Photo By: http://www.photoshoptutorials.tv/chilling-decay-effect-in-photoshop/
A picture is worth a thousand words. That rings true today more than ever, considering how much content we share about our personal lives every day. Not so long ago, one could go a whole day without seeing or interacting with a photo; text and voice were our main ways to communicate.
Fast forward to today and it’s a whole other experience. We scroll through Facebook and encounter photos and videos first. Instagram (especially popular within my circle) is all about photos, and as one scrolls, the photo largely dictates what one reads. We are visual creatures, and it is glaringly obvious that social networking companies realize this and execute accordingly.
A photo generates a feeling and a response. There are algorithms that dictate what one views. What could possibly go wrong, right?!
Well… about that. It’s no secret that Facebook in particular is receiving a LOT of heat for allowing not only the social network itself but its users as well to be straight up gamed. Played. Fooled. All by understanding us better than we know ourselves. With that comes a new threat, one unique to our social experience, that none of our ancestors ever faced.
We are getting to a point where even our eyes cannot be trusted, let alone our ears and other senses. Think of how the chameleon blends in, fooling the senses into believing it is part of its surroundings. Nature does it well, and now we humans are doing it digitally. Think of those ridiculous pornography videos that insert celebrities’ heads onto other actors’ bodies. Impressive, yet very disturbing. And no, I don’t subscribe to those; get your minds right!
In all seriousness, let’s play this out for a moment. Back to social media. People are distrustful of the media, and yet one of the ways we vet information is typically by going to the sources themselves. Did the POTUS say what I thought he said?! Let me take a look. Cue viewing the material, absorbing what happened, then forming my own opinion. Pretty straightforward, although that may not be the case in the near future.
First, let me address the term. Instead of A.I., we should refer to this as ML: machine learning. We can fairly easily train models for facial recognition and perform displacements and replacements.
The reason for this concern should be obvious, but just in case, let’s cover a few scenarios. Your face could appear in a place it has never been. On a body that isn’t yours. There are many situations where that’s not okay. These days we typically view a photo as proof. To your friends. Inside the courtroom. All of that comes into question if we cannot truly rely on photos as a reliable form of data. Voice is another medium we are currently synthesizing, but we can leave that for another day.
I feel it’s prudent to be mindful of the ways technology is being used. Adobe, known for its photography software, appears to be taking a responsible approach to detecting fake photos. Its ML looks for splicing, where part of another photo is integrated; cloning, where objects have been copied and pasted within the same image; and removal, where portions of the photo have been deleted.
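To make the "cloning" idea concrete, here is a minimal sketch of one classic way copy-pasted regions can be flagged: hash small pixel blocks and report any two blocks with identical content. This is an illustrative toy, not Adobe's actual method; the block size and the tiny grayscale "image" below are my own assumptions.

```python
# Toy clone detection: flag identical pixel blocks within one image.
# NOT Adobe's algorithm -- just the basic block-hashing idea for illustration.

def find_cloned_blocks(image, block=2):
    """Return pairs of top-left coordinates whose blocks have identical pixels."""
    seen = {}      # block contents -> coordinate where first observed
    matches = []
    rows, cols = len(image), len(image[0])
    for r in range(rows - block + 1):
        for c in range(cols - block + 1):
            # Flatten the block into a hashable tuple of pixel values.
            key = tuple(image[r + i][c + j]
                        for i in range(block) for j in range(block))
            if key in seen:
                matches.append((seen[key], (r, c)))
            else:
                seen[key] = (r, c)
    return matches

# Toy 4x4 grayscale image: the 2x2 patch at (0, 0) was "pasted" at (2, 2).
img = [
    [9, 9, 1, 2],
    [9, 9, 3, 4],
    [5, 6, 9, 9],
    [7, 8, 9, 9],
]
print(find_cloned_blocks(img))  # -> [((0, 0), (2, 2))]
```

Real forensic tools work on larger, overlapping blocks with robust features (so the match survives compression and slight edits), but the detect-the-duplicate principle is the same.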
Personally, I am VERY optimistic about the future. However, I am cautious, and I am very aware that many individuals are skeptical (and rightfully so) of the “truth.” What I do not want to see is a blanket refusal to trust anyone or anything, as that can get far out of our hands. It would be chaotic and create a world of disconnected people. It’s too early to tell and difficult to forecast. I am just thankful that we see enough through the mist to know we need to take precautions and be transparent.
Article of Reference:
The Verge: Adobe is using machine learning to…
The Verge: Celebrity…