Why might it be a problem that the AI cannot explain how it arrives at decisions, and how could this problem be solved?

Can Do, for example, explains situations that could lead to a resource overload.

In the analysis, these overloads are described more precisely. The classification is based on criteria embedded in the model; however, the model's configuration differs from company to company. A decision could in principle be traced back, but doing so would be extremely laborious, and the result would only be valid for that single moment.

[Figure: example description of a resource overload]
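
To make "criteria embedded in the model" concrete, here is a minimal sketch using scikit-learn. All names, thresholds, and toy data are hypothetical illustrations, not Can Do's actual model: an explicit rule is trivially traceable, while a trained classifier spreads an equivalent criterion across learned weights that carry no human-readable meaning.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Explicit rule: the overload criterion is directly readable in the code.
def rule_based_overload(utilization_pct: float) -> bool:
    return utilization_pct > 100.0

print(rule_based_overload(120.0))  # True, and the "why" is obvious

# Learned model: a similar criterion ends up distributed across weights.
# (Toy data; real features and labels would come from planning data.)
X = np.array([[60.0], [80.0], [95.0], [105.0], [110.0], [130.0]])
y = np.array([0, 0, 0, 1, 1, 1])  # 1 = overload

model = MLPClassifier(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0)
model.fit(X, y)

print(model.predict([[120.0]]))  # e.g. [1], but here the "why" is opaque:
for layer, weights in enumerate(model.coefs_):
    print(f"layer {layer}: weight matrix of shape {weights.shape}")
```

Inspecting the weight matrices reveals their shapes, not the rule they encode; this is exactly what makes tracing a single decision back through the model so laborious.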

Here's a basic explanation of the issue:

The problem that an artificial intelligence (AI) often cannot explain how it arrives at its decisions is known as the "black box" problem. It is particularly pronounced in complex models such as deep neural networks, whose decision-making processes are not intuitively understandable because of the sheer number of processing layers and parameters.
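
The headline also asks how this problem could be solved. One widely used, model-agnostic approach, shown here as a generic scikit-learn sketch on synthetic data (not a Can Do feature), is permutation importance: instead of reading the parameters, one shuffles each input feature in turn and observes how much the prediction quality degrades.

```python
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for planning features (utilization, team size, ...).
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

black_box = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000,
                          random_state=0).fit(X_train, y_train)

# Shuffle one feature at a time and measure the drop in accuracy:
# a large drop means the model relied heavily on that feature.
result = permutation_importance(black_box, X_test, y_test,
                                n_repeats=10, random_state=0)
for i, imp in enumerate(result.importances_mean):
    print(f"feature {i}: mean importance {imp:.3f}")
```

Such probes do not expose the model's internal logic, but they indicate which inputs drove a classification, which is often enough for a plausible explanation of an individual result.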