Trust Issues Preventing AI from Reaching Its True Potential

While artificial intelligence (AI) is capable of achieving great things, business leaders and the general public alike have expressed serious concerns about the technology. However uncomfortable it may make people, there is no denying the potential of distilling huge amounts of unstructured information into actionable insights. Whether it is a cure for cancer or an answer to climate change, this rapidly expanding body of digital data could hold the answers to some of our era's most pressing concerns.

Unfortunately, AI will not be able to live up to that potential if no one trusts it. And that trust will be earned the way any other trust is built: by behaving the way people expect, consistently, over time.

Ethics and Bias Top Concerns

There are also concerns about ethics. AI systems can include mechanisms that apply ethical values appropriate to the task at hand; it is simply a matter of making the effort to build them. Google, Amazon, Microsoft, IBM and Facebook have joined forces in the "Partnership on AI" to help guide AI development in an ethical manner.

Another big problem is bias, which can be introduced into AI systems through their algorithms. Sometimes the training data is skewed; other times the biases of human curators creep in. Thorough testing of these systems, however, can help find and reduce bias.
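To make that kind of testing concrete, here is a minimal sketch in Python of one common check: comparing a model's favorable-outcome rate across demographic groups and flagging pairs that fall below the "four-fifths" rule of thumb. The predictions, group labels, and 80% threshold here are all illustrative assumptions, not details from the original article.

```python
# A minimal sketch of disaggregated bias testing: compare a model's
# positive-outcome rate across groups and flag large disparities.
# The toy data and the 0.8 threshold (the "four-fifths rule") are
# illustrative assumptions only.

from collections import defaultdict


def selection_rates(predictions, groups):
    """Return the fraction of positive predictions for each group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    return {g: positives[g] / totals[g] for g in totals}


def flag_disparity(rates, threshold=0.8):
    """Flag group pairs whose selection-rate ratio falls below threshold."""
    flags = []
    for g1, r1 in rates.items():
        for g2, r2 in rates.items():
            if r2 > 0 and r1 / r2 < threshold:
                flags.append((g1, g2, r1 / r2))
    return flags


# Hypothetical model outputs (1 = favorable outcome) and group memberships.
preds = [1, 0, 1, 1, 0, 1, 0, 0, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "A", "B", "B", "B", "B", "B", "B"]

rates = selection_rates(preds, groups)
print("Selection rates:", rates)  # e.g. group A ~0.67, group B ~0.17
for g1, g2, ratio in flag_disparity(rates):
    print(f"Possible bias: group {g1} selected at {ratio:.0%} the rate of group {g2}")
```

In practice, teams would run checks like this on real model outputs with far richer group definitions, but even a simple disparity report like this one makes skewed behavior visible before a system ships.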

Companies that employ AI solutions need to use accountability and trust as criteria when selecting a system, and they must be prepared to work with vendors to sort out any undesirable behaviors.

One thing is clear: putting off the implementation of AI benefits no one. After all, the ability to know where a vital resource can be found or what is wrong with an ill patient is simply too valuable to postpone. That is why AI needs to be developed in a way that can inspire confidence and trust.

This blog post was based on an article from UPS. The original article can be viewed here.
