To stop a tech apocalypse we need ethics and the arts


The article does a poor job of defining what the “tech apocalypse” actually means. There are some examples and some pop-culture references, but no in-depth examination of what the problem actually is. And if you can’t correctly identify the problem, how can you hope to find solutions? The trouble with articles like this is that they sound good to laypeople, but the authors themselves do not really know what they are talking about.

To determine what the potential problems are, you must first figure out where AI development currently stands, so you can see where it is heading. That requires literacy in machine learning and AI, which most people lack. Discussions about AI are a particularly good example of the Dunning-Kruger effect, and of people putting in their two cents without knowing what they are saying. The reasons are not hard to identify: pop culture is full of references to AI, public perception is coloured by the idea that we are close to creating artificial life (and, of course, by the idea that creating such life is even possible), and advanced AI strikes at many of our fundamental fears. AI might well be dangerous in a lot of ways, but not in the ways the public believes, and that is the key point. A measure of caution is fine, but you should be afraid of the right things.


It worries me that so many threads, especially in personal-finance circles, direct people into STEM as the only path to achievement. We need the arts to live a complete and colorful life, so not everyone can be a programming guru, nor should they be encouraged to be. Without the arts, we live in a dreadfully dull world designed by committees. It’s a shame that people in the arts are often not compensated in a way that makes the arts a socially viable direction in life. Too often it is a sacrifice made to chase a dream, rather than a richly rewarded societal boon that is in line with their dreams.


I’m going to be totally honest and disagree. Why not incorporate both? It’s like saying technology can’t be tasteful. Isn’t digital art made with technology? Movies and shows bring art and ethics together in abundance. We learn non-tech subjects with the aid of technology. Why backtrack when you can progress AND carry all of history with you?


I feel like the people creating the tech have a lot more exposure to the arts and ethics than the average person. We just need more education and a better process for holding unethical actors accountable. The piece cites a “suggestion” by Dr Finkel without any supporting evidence, which is not how science works and makes for a very poor argument. If you want to enact change, or influence how and why science is done, you need data documenting what you’ve observed and how your proposed changes would affect it. Alan Finkel didn’t provide that, and believe me, he easily could have if he wanted to; neither did this article.


It’s more likely that AI will be owned by a few of the ultra-rich and used to control the masses through increasingly complex rules of employment and compensation. AI will build ever more insidious mouse wheels for us to exhaust ourselves on as it squeezes ever more exploitation out of us.

And here is where the problem begins. We already have to compete with immortal entities that have only a tenuous connection to morality and ethics: corporations. Now add immortal entities that will eventually out-think us in every programmable way, and put them in charge of the day-to-day operation of these mega-corporations, like HR and middle management, tasked with a singular purpose: to legally trap us in ever more exploitative employment, to wring every erg of energy they legally can from us, and to stay at least one step ahead of any legal system while doing it.

This is the real future of AI, not the mass-murdering machine uprising everyone worries about. Think Nazi forced labor camps rather than Matrix-style fields or Terminator-style murder machines. And most people will walk gladly into it, because we’ve all got to eat.