New Technology Will Literally Put Words Into Someone Else's Mouth
By John Stonestreet / Breakpoint.org, August 31, 2017
Most of us are familiar with Photoshop, a program that lets users edit photographs so seamlessly that the average person cannot tell the difference between what the camera originally captured and the digitally created lie that results from the editing.

Image editing is so common these days, from airbrushing to full-on alteration, that the old adage "a picture is worth a thousand words" is no longer as true as it once was.
And it's not just still images, of course. Through video editing, we can be made to believe in amazing trick shots with basketballs and Frisbees that (spoiler alert) never happened.
Now, as the public radio podcast RadioLab recently told listeners, new technologies promise (or "threaten" may be the better word) to do for the spoken word what has been done to the image.
The technologies vary, but the result is that, with varying degrees of success, a person can be made to appear to say almost anything. Some of these technologies can even use existing sound clips to create entirely new statements and speeches that were never given.
Some of these technologies are so convincing that they're prompting the adoption of digital "watermarks" that will enable listeners to tell whether what they heard was real or fake.
The most ambitious of these technologies don't stop at audio: their goal is to literally put words into someone else's mouth, using video editing to project non-verbal cues, such as facial expressions, onto other people's faces.
So a company in China hoping to use Jennifer Aniston to promote its product could use this technology to make it appear as if she's speaking Mandarin, even though she can't.
In our age of "fake news," one can easily imagine the geopolitical and cultural chaos this technology could quickly create if the wrong words were put in the wrong mouths.
Even in light of this danger, I was struck by the tone of this RadioLab podcast, especially when the reporter asked the creators of this technology if they felt "no responsibility as to how people might use this? Especially in a day of fake news?"
What struck me wasn't his concern; that's justified. It was the failure to put other technological advances, and their creators, under the same sort of scrutiny. When RadioLab reported a few months ago on the gene-editing technology known as CRISPR, they weren't freaked out at all.
Instead, they acknowledged the potential for abuse while at the same time downplaying the problem. The message seemed to be, "It makes us cringe at first, but that ship has already sailed."
They told the audience "things are happening very fast" and talked about potential cures. And by "potential," they meant the results from lab mice with muscular dystrophy that weren't exactly cured but at least seemed to get stronger.
Not once in the entire podcast was a gene-editing scientist asked if they felt responsibility for the potential abuse of what the former Director of National Intelligence once called a possible "weapon of mass destruction."
The selective worry between these two broadcasts was remarkable to me. As it turns out, what we worry about says a lot about our worldview.
A technology that threatens the media's control of our cultural stories warrants hand-wringing. A technology that strikes at the sanctity of humanity merits an "Oh well, you know, progress..."
In Matthew 23, Jesus called his opponents "blind guides" because they "strain out a gnat but swallow a camel." In other words, they were concerned about lesser matters but disregarded the weightier ones.
That pretty much sums it up. People have conniptions about transient or even trivial matters but gloss over and even embrace real threats to human life and flourishing.
All of which goes to show that there's no substitute for discernment, and there's no discernment without a proper worldview. Without it, we'll be swallowing some pretty nasty stuff.
Originally published at Breakpoint.org - reposted with permission.