Moral values of an AI


The moral system of an AI will not mirror ours, for the very simple reason that it is not human.

Artificial systems will never be benevolent, because they pursue their goals without a moral compass built into them.

They are just trying to achieve the objective they are built for.

Giving a machine the immense power to think and act on its own instincts is a step closer to the doom of humanity.

Take an example:

What if an AI machine running the world tries to solve the world hunger problem by killing enough people on the planet? To it, that is the most plausible solution, one which simply fulfils its objective: fewer people, less shortage of food.

There is no such thing as a benevolent machine; the concept is alien. A machine simply follows the instructions and rules laid out before it in order to fulfil its objective, learning and evolving all the while. But to what extent? It evolves only in terms of speed and precision.

Don’t get me wrong; it’s not that I’m against the advancement of science and technology. I hold a Master’s in Computer Applications and have worked in the technology field for almost a decade. I strongly support the advancement of science and technology for the benefit of mankind.

The point I’m trying to stress here is that using machines to ease our daily lives, and harnessing their immense computational power to help the world in one way or another, is a noble idea. But giving immense power to an AI system so that it can take autonomous decisions on behalf of humans is a reckless one.

A human moral compass can never be built into them. They can never appreciate the beauty in nature, or in the hidden moments of life, which is essentially what it means to be human. A machine can never have the best interests of humanity at heart. The idea of a benevolent machine is a fallacy.

Joseph Weizenbaum, a German-American computer scientist and professor emeritus at MIT, argued that there are certain service roles that should not be replaced by AI machines, because it is vital that the people filling them have authentic feelings towards those in their care. For example:

1. A therapist
2. A nursemaid
3. A soldier
4. A police officer
5. A judge

How many times have we read in the news about drones wrongly attacking and killing civilians instead of the perpetrators? To err is human, they say; then what about an AI that makes autonomous decisions? Won’t it make errors too?

Machine ethics (also called machine morality or roboethics) is the field of research concerned with designing Artificial Moral Agents (AMAs): robots or artificially intelligent computers that behave morally, or as though moral. The field is still in its infancy.

We humans are turning a blind eye to the irrevocable danger posed by AI and machine learning systems, because we are trading our moral values for speed and precision.

To put it simply:

“They are gods, without hearts.”


This post is in response to the daily prompt Brilliant


