German AI startup cracks chatbot black box


Aleph Alpha's update makes the generative AI models of its Luminous family explainable. This opens up new applications for generative AI, even in critical tasks.

German AI startup Aleph Alpha is introducing a new feature for its own Luminous family of generative AI models that aims to make their outputs explainable and verifiable.

Current generative AI models fail to deliver “Explainable AI”

Generative AI models such as ChatGPT or GPT-4 are transforming entire industries, but they have one major problem: their outputs are not explainable. Although they could be useful in critical areas such as medicine, their lack of explainability poses many dangers.

Explainable AI (XAI) methods try to solve this problem and make the outputs of those AI models explainable and verifiable.


In early 2023, researchers from German AI startup Aleph Alpha, TU Darmstadt, the Hessian.AI research center, and the German Research Center for Artificial Intelligence (DFKI) presented AtMan, an XAI method that makes the generations of such transformer-based models for text and images explainable.
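The article does not detail how AtMan works internally. As a rough, hypothetical illustration of the general idea behind perturbation-based attention explainability, the toy sketch below suppresses each input token's attention weight in turn and measures how much the output degrades; tokens whose suppression hurts most are deemed most relevant. All names and the single-head "model" here are invented for illustration and do not reflect Aleph Alpha's actual implementation:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_output(scores, values, suppress=None, factor=1e-9):
    """Toy single-head attention: weight `values` by softmax of `scores`.
    If `suppress` is given, that token's pre-softmax score is scaled
    toward zero (multiplied by `factor` in probability space)."""
    s = scores.astype(float).copy()
    if suppress is not None:
        s[suppress] += np.log(factor)  # effectively removes the token
    return softmax(s) @ values

def token_relevance(scores, values, target):
    """Perturbation-based relevance: suppress each token and record
    how much the squared error against `target` increases."""
    base = np.sum((attention_output(scores, values) - target) ** 2)
    rel = []
    for i in range(len(scores)):
        perturbed = np.sum(
            (attention_output(scores, values, suppress=i) - target) ** 2
        )
        rel.append(perturbed - base)
    return np.array(rel)

# Toy example: token 0 dominates the attention and matches the target,
# so suppressing it should cause the largest loss increase.
scores = np.array([3.0, 0.0, 0.0])
values = np.eye(3)
target = np.array([1.0, 0.0, 0.0])
relevance = token_relevance(scores, values, target)
```

In this toy setup, the first token receives the highest relevance score, mirroring the intuition that an explanation method should single out the inputs the model actually relied on.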

Luminous can now withhold output without trusted sources

AtMan is now available for Luminous models and can be applied to text and images, according to Aleph Alpha. This is an important step toward greater explainability and verifiability, a regulatory requirement for generative AI models that is likely to come soon with the EU AI Act.

“This transparency will enable the use of generative AI for critical tasks in law, healthcare, and banking – areas that rely heavily on trusted and accurate information,” said Jonas Andrulis, CEO and founder of Aleph Alpha.

With the new feature, Aleph Alpha says it can not only verify facts in model output, but also directly withhold output when appropriate trusted sources are missing.

“Our customers often have vetted internal knowledge of great value. We can now build on that and ensure that an AI assistant successfully uses only that knowledge and always provides context,” Andrulis said.
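The source-gating behavior described here could, in principle, look like the following minimal sketch: answer only when retrieved evidence from a vetted knowledge base clears a confidence threshold, and otherwise withhold the output. The functions `retrieve` and `generate`, the threshold, and the return shape are all assumptions made for illustration; this is not Aleph Alpha's actual API:

```python
def answer_with_guard(question, retrieve, generate, min_support=0.5):
    """Hypothetical guard around a generative model:
    only answer when trusted sources support the question,
    and always return the supporting context alongside the answer."""
    sources = retrieve(question)  # assumed: list of (text, support_score)
    trusted = [(text, score) for text, score in sources if score >= min_support]
    if not trusted:
        # No sufficiently trusted source: withhold the output entirely.
        return {"answer": None, "sources": [], "reason": "no trusted source"}
    context = " ".join(text for text, _ in trusted)
    return {"answer": generate(question, context), "sources": trusted}

# Usage with stub retrieval/generation functions:
result = answer_with_guard(
    "What is our refund policy?",
    retrieve=lambda q: [("Refunds within 30 days.", 0.9)],
    generate=lambda q, ctx: f"Per policy: {ctx}",
)
```

The design point is that the guard always couples an answer with its supporting context, matching the claim that the assistant "always provides context" from vetted knowledge.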

