According to experts, the next step in the development of AI is to incorporate emotion into technology. We tried it for ourselves in order to find out how we can combine the two and what we can learn from the experience.
“OK Google, what’s the weather like today?” is a question heard in more and more living rooms every morning. Mainly in the United States to begin with – where 66.4 million people now own a smart speaker – but now on the other side of the pond as well.
Digital butlers such as Google Assistant, Siri and Alexa have rapidly conquered the world in recent years. They make our lives a little easier, and the more they are used, the more they improve by learning from their mistakes.
Computer + human = magic
Thousands of books have been written about the technology behind these assistants – artificial intelligence – in recent decades. AI can perform human tasks and, thanks to reinforcement learning, it can now teach itself to play chess, navigate or trade on the stock market. The initial fear that technology would make human beings obsolete evaporated when we learned to use it to reinforce our own abilities. In chess, for example, a form of the game known as ‘Centaur Chess’ has emerged, in which people and computers team up. Although the computer helps, it is the human who is ultimately in control and makes the decisions. We see the same thing in hospitals, where doctors use image-analysis software to identify harmful skin lesions more quickly. The whole is greater than the sum of its two parts.
What these examples have in common is that they all have an objective definition of what a good or bad decision is. That is why experts point out that the next step in the development of AI is going to be a lot more tricky: incorporating emotion into technology.
Can AI understand our emotions? If you ask us, a significant research breakthrough over the next five years is highly unlikely – but never say never. There are still too many uncertainties in the development of even ‘simple’ AI, let alone in modelling something as subjective as emotion.
Making music with AI
Nonetheless, we were fascinated. We love experiments, so we decided to turn the question on its head. Can AI make something that appeals to our emotions? A piece of music, for example? Our colleague Sebastiaan Van den Branden is the perfect person to take on the challenge, because he is a musician as well as a data scientist.
“I started the experiment by deciding which tool to use,” Sebastiaan explains. “In the end, I chose Magenta by Google. Not simply because we are a Google Partner, but mainly because it is built on neural networks, which makes it particularly well suited to making AI art. Google has trained various algorithms that can help do that. For example, there is one algorithm you can use to generate melodies and another that helps to create a certain pattern out of different drum parts.”
Magenta can be used in various ways. There are high-level tools that are really plug and play, but you can also train a model yourself based on your own data. The output is always the same: MIDI files, a type of score that electronic instruments can ‘read’.
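To make that ‘score’ idea concrete, here is a minimal sketch in plain Python – no Magenta required – that writes a one-note MIDI file by hand. A MIDI file stores timed note events rather than audio, which is exactly why a model can generate and manipulate it like sheet music. The file name and the single middle-C note are illustrative choices, not anything Magenta produced.

```python
import struct

# A MIDI file is a score, not audio: chunks of timed note events.
# Header chunk: format 0, one track, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 480)

# Track events, each preceded by a delta-time in ticks:
#   at tick 0:       note-on, middle C (pitch 60), velocity 100
#   480 ticks later: note-off for the same note
#   then the mandatory end-of-track marker
track_data = bytes([
    0x00, 0x90, 60, 100,      # delta 0, note-on (channel 0)
    0x83, 0x60, 0x80, 60, 0,  # delta 480 (variable-length encoded), note-off
    0x00, 0xFF, 0x2F, 0x00,   # delta 0, end of track
])
track = b"MTrk" + struct.pack(">I", len(track_data)) + track_data

with open("one_note.mid", "wb") as f:
    f.write(header + track)
```

Open the resulting file in any sequencer or DAW and it plays a single quarter-note middle C – the same kind of file, just far simpler, that Magenta’s melody models emit.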
“I did various tests to figure out how you can tackle this kind of task. To start with, I made a number of simple melodies and drum patterns and then let artificial intelligence continue working on them. In another experiment, I got the AI to generate different melodies and then selected the ones I liked so that it could carry on developing them. By doing this a few times in succession and varying certain parameters, a human can curate the music and start filtering the results.”
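Sebastiaan’s generate-select-repeat workflow can be sketched as a simple loop. Everything below is an illustrative stand-in: the random melody generator plays the role of the Magenta model, and the `pick` function stands in for the human listener deciding which melodies to keep and develop further.

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C-major pitches as MIDI note numbers

def generate(rng, length=8):
    """Stand-in for the model: sample a random melody over the scale."""
    return [rng.choice(SCALE) for _ in range(length)]

def mutate(melody, rng):
    """Stand-in for 'developing' a melody: vary a single note."""
    m = list(melody)
    m[rng.randrange(len(m))] = rng.choice(SCALE)
    return m

def curate(candidates, pick):
    """The human step: keep only the melodies the listener selects."""
    return [m for m in candidates if pick(m)]

rng = random.Random(42)

# Round 1: generate candidates, then filter by taste
# (here a toy preference: keep melodies that end on the tonic, C).
pool = [generate(rng) for _ in range(20)]
kept = curate(pool, pick=lambda m: m[-1] == 60)

# Round 2: let the 'model' develop each selected melody further.
next_pool = [mutate(m, rng) for m in kept for _ in range(5)]
```

Repeating these two rounds a few times, with different parameters each pass, is the curation process described above: the machine proposes, the human disposes.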
Spoiler alert: the result was not a finished piece of music that you can expect to hear on the radio any time soon, but Sebastiaan did not expect it to be. “As I see it, the value of the AI lies in the inspiration I can draw from it. Artificial intelligence can surprise you with melodies and harmonic contexts that you wouldn’t come up with yourself because it is less stuck in a rut; it doesn’t gravitate as much to certain patterns. You can’t expect a ‘finished’ piece of music – the real work only begins afterwards.”