Digital ethicist Dr. Marialice B.F.X. Curran wants you.
“We want people impacted by tech and social media to feel empowered,” she says. “Instead of being reactive, we want people to be active participants in the digital future. We want people to be the digital change.”
Not only does she want our participation, but she also wants us to see what is becoming increasingly clear — the creation and use of technology has an ethical component in its design. “It is often held that technology itself is incapable of possessing moral or ethical qualities, since technology is merely tool making,” says a widely disseminated definition of Technoethics. “But many now believe that each piece of technology is endowed with and radiating ethical commitments all the time, given to it by those that made it, and those that decided how it must be made and used. Whether merely a lifeless amoral ‘tool’ or a solidified embodiment of human values, ‘ethics of technology’ refers to two basic subdivisions.”
Therefore, inherent in the creation of any tool are the ethics surrounding its purpose and the impact it will have on those who use it. For that reason, we may fear it. We may hide from it. We may choose to sabotage it rather than use it. We may shrink from our common responsibility to be a part of the creation process.
I remember as a child that when I got in trouble for doing something kind of wacky, my mother used to call my name — both first and last. Hearing the strong sound of her voice, I knew I was about to be given a different perspective on my behavior. You could call it a form of new information — or a learning opportunity — but it always prompted me to do one thing: hide. My mother was a sweet woman, but she was pretty straightforward. And strangely enough, hiding never actually worked, because the longer my mother looked for me, the more peeved she got. If I had just come to her earlier, I might not have gotten grounded or whatever. Finally I learned, and when she would call my name I would come out of wherever I was and, as calmly as possible, ask what was wrong. I came to realize that meeting new information head on was how to better deal with its consequences. Once I realized the value of seeking out information as early as possible, it changed my entire worldview, and probably my life, from that point forward.
Much earlier, in Nottinghamshire, England, in 1811, textile workers upset with the rapid pace of change — and fearful of its consequences — went in the other direction. These workers, mainly weavers and farmers, thought they could stop advancing technology in its tracks by destroying the machines intended to replace them. Ned Ludd did it, they said. They thought they could stop the advance of the Industrial Revolution. They were wrong.
Early in my career, I used the term Neo-Luddites to describe people who face barriers to technology or who oppose technological innovations in learning settings. I looked at why learners in particular would avoid new technologies and described the barriers those people felt as they approached new learning systems.
People have barriers to technology that are physical, emotional, and ethical. Physical barriers include the difficulties that learners have with systems, processes, and tools. Emotional barriers include how using new technology makes people feel, both about the organization they are in and about themselves in relation to the technology. The third barrier to learning technology is one that might seem unexpected: people have, like the Luddites before them, a sense that new technology is in some way ethically or morally wrong and will be detrimental to them or to society at large. Just like the Luddites, they may decide that artificial intelligence, robotics, or big data is going to be harmful and therefore should be avoided. As Ned learned, however, there is no way to catch the horse after the big digital barn door has been opened.
The way forward is one of engagement and responsibility, not fear or avoidance. But there are things to be concerned about, without question. Our response to the threat is the key to moving forward. David Ryan Polgar (@TechEthicist) believes “the open and collaborative nature will bring forward new voices that can influence this important conversation around social media and tech use.”
We must be engaged. If we are part of the solution, we all move forward.