Today I wrote a manifesto for my experiment. This is a draft.
design + ai + your mood = empathy-based design
We can feel. Emotions play a big role in our lives, especially in our relationships with people. We can understand what our companion is feeling at any given moment and adapt to their emotions. When they feel sad, we try to offer support and avoid reminding them of the cause of their sadness.
Our companions adapt to our emotions too, so the adaptation and the dialog flow both ways.
Right now, design hardly holds a dialog with a human. Its behavior is based only on user data: your name, what you watched on YouTube, what you bought on Amazon, what you googled, and so on.
These signals merely personalize which content is suggested to you.
But what if we humanized design, gave it the ability to sense our emotions, and let it change content based on them?
vision of the future
Design communicates with our environment, with everything that surrounds us, with our feelings. Design becomes a dialog between two humans. It predicts what we want before we realize it ourselves.
I created this experiment to see what could happen if content and design depended on our emotions.
how does it work?
The content will tell you about this experiment.
how to use it?
There are 7 emotions: happy, neutral, angry, sad, fear, disgust, and surprise. Try faking emotions and see what happens.
This variant appears with the neutral emotion.
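The idea of serving a content variant per detected emotion can be sketched roughly as follows. This is a minimal illustration, not the experiment's actual code: it assumes a browser-side detector (such as a face-expression model) that returns a score per emotion, and the function names and sample texts are made up for the example.

```javascript
// The seven emotions listed above.
const EMOTIONS = ["happy", "neutral", "angry", "sad", "fear", "disgust", "surprise"];

// Pick the emotion with the highest score from a hypothetical detector
// result shaped like { happy: 0.1, neutral: 0.7, ... }.
function dominantEmotion(scores) {
  return EMOTIONS.reduce(
    (best, e) => ((scores[e] ?? 0) > (scores[best] ?? 0) ? e : best),
    "neutral"
  );
}

// Swap in a content variant for the detected emotion; fall back to the
// neutral variant when that emotion has no dedicated content.
function pickVariant(emotion, variants) {
  return variants[emotion] ?? variants.neutral;
}

// Example usage with made-up content:
const variants = {
  neutral: "Here is the story of this experiment.",
  sad: "Take your time. Here is something gentle.",
  happy: "Great mood! Here is the upbeat version.",
};
const scores = { happy: 0.2, neutral: 0.1, sad: 0.6, angry: 0.1 };
console.log(pickVariant(dominantEmotion(scores), variants));
```

In this sketch everything runs in the browser on each detector update, which also fits the privacy point below: no emotion data ever has to leave the page.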
You must allow access to your camera. Your data stays with you, on your computer.
It doesn’t work on phones or in some browsers.