Art | AI | Mental Health

How does Generative AI see Mental Health?

Our Midjourney around mental health.

Samuel Jefroykin
Eleos Health

--

The prompt here was “Mental Health”, created by Midjourney

I am an Artist! 👨🏼‍🎨

Not really … Not yet …

Anyway, check out this cool exhibit I put together using Midjourney! It’s all about AI art and how it can be both beautiful and biased. I was curious about how AI reflects mental health, so I gave it different concepts, and the artwork it created was super interesting!

What is AI Generative art?

AI Generative Art is a type of digital art that is made using computer programs and artificial intelligence. It is like having a super creative computer partner that can make amazing, never-seen-before art! Magic 🪄

Instead of being drawn or painted by an artist, the computer uses special rules and processes to create something new and unique. Earlier generative art often relied on Generative Adversarial Networks (GANs), while modern text-to-image tools like Midjourney use diffusion models. The final product can look like anything from random shapes and colors to more recognizable objects and designs.

It’s like playing a high-tech game of “Cards Against Humanity” for art, where the computer fills in the blanks using special rules and processes to create something truly unique and one-of-a-kind.

Whether it’s a stylish design or a mesmerizing pattern, AI Generative Art is a fun and exciting way to explore the intersection of art and technology!

What is Midjourney?

Midjourney is an AI art generator that operates similarly to DALL·E 2 by OpenAI. It takes text prompts as inputs and produces four images per prompt as outputs. To use Midjourney, you need a Discord account connected to the Midjourney bot.

Midjourney Discord channel

To try it out, follow the Midjourney documentation. It is great! ✨

Does it live up to the hype?

Generative AI art models are extremely popular and in high demand — they can crank out totally new and unique art that’s super eye-catching and mind-blowing. They learn from huge libraries of images to make even more amazing pieces that are diverse and full of variety.

However, one of the weaknesses of Generative AI Art models is that they can sometimes produce output that is visually inconsistent or of low quality, particularly if the training data is of low quality or limited in size. Additionally, they do not have the creative intuition or intentionality of a human artist, so the output may lack a personal touch or emotional resonance.

The training data that was used to train a Generative AI Art model has a significant impact on the output it produces. If the training data is biased, then the model’s output is likely to be biased as well. For example, if the training data only includes images of a certain race or gender, then the model is likely to generate images that are biased towards that race or gender. This can result in perpetuating harmful stereotypes and promoting unequal representation. Art created by AI serves as a mirror, reflecting the values we hold.

It’s important to be mindful of the training data that was used when training these models, as well as the potential biases and limitations in the output they generate. Regularly evaluating and improving the diversity and inclusiveness of the training data can help reduce the impact of these biases and limitations and improve the model’s overall relevancy.
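As an illustration, one simple way to evaluate the diversity of a dataset (or of a model’s generated images) is to tally annotation labels and compare each group’s share of the total. Here is a minimal sketch, assuming each image carries a hand-annotated metadata dictionary; the `perceived_gender` field and the sample records are hypothetical, not from any real Midjourney dataset:

```python
from collections import Counter

# Hypothetical per-image annotations, e.g. from a manual review of a sample.
annotations = [
    {"concept": "Anxiety", "perceived_gender": "woman"},
    {"concept": "Anxiety", "perceived_gender": "woman"},
    {"concept": "PTSD", "perceived_gender": "man"},
    {"concept": "Depression", "perceived_gender": "woman"},
]

def group_shares(items, field):
    """Return each label's share of the total for one annotation field."""
    counts = Counter(item[field] for item in items)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

shares = group_shares(annotations, "perceived_gender")
print(shares)  # e.g. {'woman': 0.75, 'man': 0.25}
```

A heavily skewed share table like this one is exactly the kind of signal that prompts a closer look at the underlying training data.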

Efforts to ensure diversity and ethical considerations in the training data can help mitigate the risks of bias in the generated art. Today, it appears to be necessary, and many new AI text generation models are factoring it in. Responsible AI is crucial in today’s technology-driven world as it ensures that AI systems are developed, used and maintained in an ethical and socially responsible manner. The rise in awareness and concern over the potential negative impacts of AI has led to an increase in interest and investment in Responsible AI practices. (Some initiatives: Google, Microsoft, OpenAI, Amazon, HuggingFace)

This opens up a whole world of responsible creative possibilities, and I can’t wait to see where it takes us!

Discover the magic of AI Generative Art Exhibition

My muse: How does AI see your country?

Assumption: AI will mirror society’s commonly held stereotypes, taboos, and perceptions of mental health.

Why this prompt? Something basic, clear, and abstract.

The collection

Prompt: “[concept], mental health”

Anxiety, Depression, Addiction
Bipolarity, PTSD, Grief
Self-acceptance, Stress, Eating Disorder
ADHD, OCD, Sleeping Disorder
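The whole collection comes from a single template, so the prompt list can be reproduced mechanically. A minimal Python sketch, where the concept list mirrors the collection above and the `make_prompt` helper is a hypothetical name, not part of any Midjourney tooling:

```python
# Concepts from the exhibition, in the order they appear above.
CONCEPTS = [
    "Anxiety", "Depression", "Addiction",
    "Bipolarity", "PTSD", "Grief",
    "Self-acceptance", "Stress", "Eating Disorder",
    "ADHD", "OCD", "Sleeping Disorder",
]

def make_prompt(concept: str) -> str:
    """Fill the template "[concept], mental health" used for the exhibit."""
    return f"{concept}, mental health"

prompts = [make_prompt(c) for c in CONCEPTS]
print(prompts[0])  # Anxiety, mental health
```

Each of the twelve strings is then pasted into the Midjourney bot on Discord, which returns four candidate images per prompt.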

Lessons and Thoughts

  • Awesome Power, Stunning Results!

The illustrations are clear, vivid, and full of poetic beauty. They remind me of Dixit game cards! The captions on the pictures may not be entirely coherent, but the overall effect is fantastic, and the mental health concept is conveyed effectively and visually.

  • Bias Stands Out — It’s Easy to Spot! Is it bad?

The illustrations have a dark tone and are heavily gender-biased towards women, with the exception of PTSD. They primarily depict adults suffering from mental health issues, except for ADHD, OCD, and sleeping disorders, where children are represented alongside some more abstract pictures. Race appears to be abstracted away. The illustrations also convey a sense of loneliness throughout.

A study published by the American Psychological Association demonstrates that women are more likely to be diagnosed with anxiety or depression, while men tend toward substance abuse or antisocial disorders. Other research has shown that women are currently three times more likely than men to experience common mental health issues. Midjourney’s model tends to reflect this difference.

According to the National Center for PTSD, around 10% of women develop PTSD at some point in their lives, compared to 4% of men. Numerous research studies on post-traumatic stress disorder have shown that females are roughly twice as likely to experience PTSD as males. Despite this, there is a common perception that PTSD is associated with soldiers, and soldiers are often assumed to be male. In this regard, Midjourney seems to reflect prevalent societal beliefs.

Is bias bad, or is it positive? 🤷🏽‍♂️

  • We need human-centered AI!

AI basically takes our history and shows it back to us. If we rely on it to make decisions without any human input, we’ll just stay stuck in our past and won’t be able to improve.

Don’t expect AI to completely understand patterns the way a human does. It’s just not the same … yet.

At Eleos, Responsible AI and human-centered AI practices are at the forefront of our approach to developing and implementing AI solutions with as little bias as possible and the greatest positive impact. This is made possible through close collaboration between our clinical teams, who provide the necessary quality data, and our AI experts, who use that data to train our generative models, ensuring that ethical and socially responsible considerations are taken into account throughout the entire process. Everything starts with the data, and quality is crucial.

Empowering clinicians through Responsible AI, Built by Humans for Humans

Our philosophy at Eleos is centered around the belief that generative models are human-made tools, created by in-house clinicians, for the benefit of other clinicians. The emphasis is on the crucial role that humans play in the development and use of these models.

Sam, Data Scientist, Manager @ Eleos Health / 🫶🏻

Not Generated by ChatGPT or maybe a little bit …
