
Algorithmic accountability

The belief that companies should be held responsible for the results of the algorithms they deploy. A related term is algorithmic transparency, which suggests that companies be open about the purpose, structure and underlying actions of the algorithms they use to search for, process and deliver information.

As the product of humans, algorithms can reflect human bias or simple oversight. This has previously led to biased and discriminatory results through the use of, for example, facial recognition technology. Algorithmic accountability is promoted as a way to recognize and correct such issues, for instance by auditing training data and steering machine learning in an ethical direction; a small illustration follows below.
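To make the idea of recognizing bias concrete, here is a minimal sketch of one check an audit might run: comparing a model's rate of positive decisions across demographic groups. The data, group labels and metric choice are hypothetical illustrations, not part of the original text or any specific audit standard.

# Minimal sketch of a bias check: compare the share of positive
# decisions a model makes for each demographic group.
# All data below is hypothetical.

from collections import defaultdict

def selection_rates(predictions, groups):
    """Return the share of positive predictions for each group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical audit sample: 1 = model recommended the candidate, 0 = it did not.
predictions = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
groups      = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rates = selection_rates(predictions, groups)
gap = max(rates.values()) - min(rates.values())
print(rates)                              # {'A': 0.8, 'B': 0.4}
print(f"Selection-rate gap: {gap:.2f}")   # a large gap flags possible bias for review

A large gap between groups does not prove discrimination on its own, but it is the kind of measurable signal that prompts a closer review of the training data and the model's design.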

For code to be audited, it must have at least qualified transparency, meaning it is made available for third-party inspection. If the algorithms are not open source and open to public inspection, such inspection may be performed by a regulatory body.

Ultimately, achieving algorithmic accountability requires that companies accept legal and ethical responsibility for the outcomes their algorithms produce.
