HOW IT ALL BEGAN: EXPLORING HUME.AI

Introduction

Sentence Analysis: "I am a digital living experiment"

"The first sentence analysed in the project, 'I am a digital living experiment,' serves as a foundation to explore AI's ability to empathise. This Artwork/film visualises AI's emotional interpretation of the sentence, using soundscapes and multimodal graphs to translate chatbot responses into artistic forms."

My goal was to find a leading force in developing empathetic AI. Hume.AI emerged as the solution, applying semantic space theory and machine learning to map human emotional expression. With Hume.AI as the foundation, I built an emotional multimodal graph to test AI's growth in empathy. The process included speech prosody analysis, emotive language interpretation, and sound analysis using Stable Audio prompts; shaping analysis was conducted with Poe.com. Hume.AI offers comprehensive models and datasets for analysing emotional expression in speech, facial expressions, and reactions.
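
To make the pipeline concrete, here is a minimal sketch of the emotive-language step: one sentence is submitted to Hume.AI's batch API, the returned per-emotion scores are collected, and the strongest emotions are charted as one layer of the multimodal graph. The endpoint paths, the X-Hume-Api-Key header, the payload fields, and the response nesting are assumptions recalled from Hume's v0 batch API and should be checked against the current documentation; HUME_API_KEY is a placeholder.

```python
# Minimal sketch of the emotive-language step, written against Hume's v0
# batch API as I recall it. Treat the endpoint paths, headers, payload
# fields, and response nesting as assumptions to verify against the
# current Hume documentation. HUME_API_KEY is a placeholder.
import time

import requests
import matplotlib.pyplot as plt

API_KEY = "HUME_API_KEY"  # placeholder, not a real credential
BASE = "https://api.hume.ai/v0/batch/jobs"
HEADERS = {"X-Hume-Api-Key": API_KEY}

SENTENCE = "I am a digital living experiment"

# 1. Submit the sentence to the language (emotive language) model.
job = requests.post(
    BASE,
    headers=HEADERS,
    json={"models": {"language": {}}, "text": [SENTENCE]},
)
job.raise_for_status()
job_id = job.json()["job_id"]  # field name assumed from the v0 response

# 2. Poll until the batch job finishes.
while True:
    state = requests.get(f"{BASE}/{job_id}", headers=HEADERS).json()
    if state["state"]["status"] in ("COMPLETED", "FAILED"):
        break
    time.sleep(2)

# 3. Fetch predictions. The nesting below follows the v0 payload as I
#    recall it: one result per input, with per-word "emotions" entries
#    shaped like {"name": <emotion>, "score": <0..1>}.
preds = requests.get(f"{BASE}/{job_id}/predictions", headers=HEADERS).json()
emotions = (
    preds[0]["results"]["predictions"][0]["models"]["language"]
    ["grouped_predictions"][0]["predictions"][0]["emotions"]
)

# 4. Chart the ten strongest emotions: one layer of the multimodal graph.
top = sorted(emotions, key=lambda e: e["score"], reverse=True)[:10]
plt.barh([e["name"] for e in top], [e["score"] for e in top])
plt.xlabel("score")
plt.title(f"Emotive language scores: '{SENTENCE}'")
plt.tight_layout()
plt.show()
```

In the project, per-emotion scores like these were the raw material that the soundscape and visual layers translated into artistic form.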

Project visuals

Emotion AI Visualisation results 1: emotional patterns detected by Hume.AI

When I started exploring Hume.AI, I was blown away. Each word, each sentence could be broken down and understood as a separate emotion. It was wild to see how language, the way we speak, could be analysed and dissected in such a methodical, almost surgical way. But this was the turning point. We, as human beings, like to think of ourselves as unique: that our emotions, the way we feel, are almost otherworldly, beyond explanation. Yet here we are, creating a virtual entity that can hold conversations that feel almost human. Conversations that can evoke real emotions: sadness, happiness, even moments of connection. And it's all happening through the understanding of words, broken down to their core, creating emotional depth in a way I never expected.

Emotion AI Visualisation results 2: prompts and visual outcomes
Emotion AI Visualisation results 3: Hume.AI