The concept of Artificial Intelligence - or simply AI - has long been a fundamental mainstay within the realms and tomes of science fiction. However, whilst the concept spent eons as an unobtainable trope or thought experiment, the last several decades have seen AI creeping its way into real life. Granted, we're yet to be terrorised by Sonny from I, Robot, but with the popularisation of systems such as Alexa and recent advancements in the field of research, it seems the creation of sentient AI is not just feasible, but expected. Pondering the consequences of this is How Do We Want To Live?, the eleventh record from German post-rock influencers Long Distance Calling.
A record full of charm, intrigue and progressive cinematic atmosphere, How Do We Want To Live? questions the very nature of AI and whether such synthetic sentience will be beneficial, hindering or outright spell annihilation for the human race. Aided by filmic commentaries on the subject, the album is a grounded sci-fi thriller on an excitingly dangerous topic that's on the verge of becoming a problematic reality.
With the record out now via Inside Out Music, Janosch Rathmer (Drums) and Jan Hoffmann (Bass) got in touch to spell out five similarly expanding technological advancements that are not only making waves, but are fundamentally problematic in nature.
Social Media
Janosch: "Most of the people on this planet are using social media. But I think it's really important that you see both sides of the coin. On one hand it's perfect for connecting to other people from all over the world. It's perfect for musicians and organisations to connect with your fans and customers and to promote your music and your products. But on the other hand you give away your personal data for free. It sometimes feels very strange when you're talking about a random topic and minutes later you'll get some advertisement on Facebook for foot care haha. Data is the new currency and it's really important that we keep that in mind. It's also really important that you're not getting all your important information about what's going on on this planet via social media, because all those conspiracy guys are using social media to promote their weird worldview."
Sex Robots
Janosch: "I think this will be a big topic in the next few years. It already is in some parts of the world. We recently published the video for our song 'Voices', in which a man has a relationship with a robot. Those robots are getting closer and closer to real human beings. On one hand it maybe helps some people to fulfil their sexual needs, and it might hold some people back from doing questionable things, but on the other hand it is also really strange and it raises some moral questions when it comes to pedophilia and other questionable sexual behaviours. So this topic is very sensitive and it feels like it is still far away from 'normal', but I am pretty sure that it will be a topic in the near future. It also might change the relationships of some people in a dramatic way. It still feels like a trashy sci-fi movie."
Self-Driving Cars
Janosch: "We already have cars on the streets, like Teslas, which drive autonomously in some way. But I think it's the wrong way. It's the wrong way to think about how it is possible that everyone can afford a car which drives without a driver. I think it's way more important to think about alternatives. We need to get rid of the cars. We need cities without cars. We need to think about safe and affordable public transportation. We need to think about a way to transport people without driving our environment straight to hell. We need to get clean transportation. That is the most desirable thing when it comes to transportation. For most of the people in Germany the car industry is very important and something we can be proud of, but there is so much bullshit and ignorance in this industry. We need to think about new ways. And when it comes to self-driving cars it is also important to think about moral questions. A machine should never choose between the life and death of a human being."
Autonomous Weapons
Jan: "What happens if we give up on our responsibility for weapons? Autonomous weapons are already a reality in countries like China, Israel, Russia, South Korea and the United States. They are able to select and engage targets without meaningful human control. In my opinion this is a very dangerous tool. Not only because we give up control and power in the first place, but even more because of the fact that this will lead to a new state of mind. Humans won't feel responsible anymore if we pass the torch over to machines. If those machines kill humans, we won't feel responsibility or any form of emotion.
On the other hand, if we get to the point that those robot weapons only fight against other robot weapons instead of human beings, this would be an 'improvement' in future warfare. If we ever get to the point that humans won't be killed anymore during wars, then it's definitely better to use robots to fight against each other. But of course those weapons could also be used in civilian life. If a private person decides to kill another private person through a killer robot, can he/she be held responsible for the murder? Let's be very careful with this!"
AI In Law
Jan: "This leads to another interesting thought experiment: how would a machine work as a judge in court? China, for example, already uses AI in court by scanning private messages or comments on social media to use them as evidence. Some countries also use facial recognition for different purposes. Of course the algorithm is fed by humans based on former experience data, but what will the algorithm make out of it? Could the algorithm be racist or discriminatory, based on that former experience data? For example: there are also algorithms for the application process in companies. What if the algorithm prefers male applicants because its experience teaches it that women might go on parental leave at some point?
Algorithms are already able to make decisions by giving an evidence-based analysis of the risks, rather than relying on the subjective decision-making of individual judges. We are talking about smart justice, but is it really smart to let a machine decide what's right or wrong? Does it make sense to break it down to a sober analysis instead of a careful consideration based on evidence, experience and a human sense of justice? I prefer the latter."
How Do We Want To Live? is out now via Inside Out Music. Purchase the record here.