Confabulation, Hallucination, Delusion, and Lying. They are all distortions of the Truth or Reality. I think of them as different forms of making shit up (not really a medical term of art).

Confabulation is a medical term for a phenomenon, often seen in mental illness, in which the affected individual unknowingly fills gaps in memory with invented details and becomes highly susceptible to suggestion – Korsakoff’s syndrome, a consequence of chronic alcoholism, is a classic example.

Hallucination refers to sensory phenomena that a person experiences without an external stimulus; the affected individual sees, hears, or smells things that no one else does. The brain is creating its own reality that is inaccessible to anyone else.

I think of Delusion as the brain’s effort to interpret incomplete information when the lack of complete information is a brain defect rather than an absence of information. This is commonly seen in people who struggle with dementia. The affected individual believes that their life-long partner is trying to poison them or is cheating with another – simply as a result of misinterpreting reality. Even a simple trip to the grocery store becomes evidence of a liaison or a tryst. Alas!

Lying is simpler to understand: it is a deliberate distortion of the truth (or outright invention of nonexistent facts) in the service of some nefarious goal. Think of Trump and all of his sycophants and enablers.

Today, I read an email from my fellow astro-buddy, Fred Garcia. He was alerting me and other astrophotography buddies to an image-processing tool called BlurXTerminator. In recent years, there has been a flurry of image-processing tools that utilize neural-network technology (aka Artificial Intelligence) to perform their mathematical magic on images. I myself use one such tool called Topaz DeNoise AI. I went to the BlurXTerminator page to read about the tool.

This particular AI-based image-processing tool performs a task called deconvolution – traditionally done with one of several standard algorithms – which can remove motion-related blurring as well as blurring caused by optical defects in the lens or telescope used to acquire the image. This tool was developed specifically for astrophotography. The AI for this tool was trained on astronomy images rather than landscape, still-life, architectural, or other kinds of non-astronomical images. The product description emphasized that AI image processing not trained on astronomical images was more prone to creating details that were not present in the image – a form of Artificial Intelligence hallucination in which veracity was tainted if not entirely lost.
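To make the idea of deconvolution concrete: one of the standard classical algorithms is Richardson–Lucy iteration, which repeatedly re-blurs a guess at the true image with the telescope’s point-spread function (PSF) and corrects the guess against what was actually recorded. The sketch below is illustrative only – a hand-rolled NumPy/SciPy version with an assumed Gaussian PSF and a single simulated “star,” not the neural-network approach BlurXTerminator actually uses:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, iterations=50):
    """Classical Richardson-Lucy deconvolution (illustrative sketch)."""
    estimate = np.full(blurred.shape, 0.5)   # start from a flat guess
    psf_flipped = psf[::-1, ::-1]            # mirrored PSF for the correction step
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)  # avoid divide-by-zero
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate

# Simulate a point source (a star) blurred by a Gaussian PSF, then sharpen it back.
x, y = np.meshgrid(np.arange(33), np.arange(33))
psf = np.exp(-((x - 16) ** 2 + (y - 16) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()                             # PSF must sum to 1

scene = np.zeros((33, 33))
scene[16, 16] = 1.0                          # the "true" star
blurred = np.maximum(fftconvolve(scene, psf, mode="same"), 0)  # clip FFT round-off
restored = richardson_lucy(blurred, psf)     # noticeably sharper peak at the center
```

The key property is that this arithmetic can only redistribute light that is actually in the data; it cannot invent structure. A neural network trained on the wrong kind of images carries no such constraint, which is exactly where the “hallucinated” detail comes from.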

I recall many years ago, in graduate school, conjecturing that any AI capable of creative output would also be capable of creative error. In this case, as in the case of Large Language Models (ChatGPT, Bard, etc.), it is a matter of making shit up.