Digital disruption means ethics is playing catch-up
There’s a shared responsibility among tech giants to create and apply a critical ethical framework that assigns responsibility and accountability.
While it can’t be denied that technology has driven the biggest disruptions in business over the past decade, it should also be noted that in numerous cases regulation, and indeed ethics, have struggled to keep up.
It’s certainly been quite a week for tech giants being called out for bad behaviour. This week it was announced that Palma has become the first city in Spain to ban the use of flats for Airbnb-style holiday rentals to tourists, citing evidence of a 40 per cent rent increase since 2013 that has forced locals out of the city. And the recent Facebook controversy over questionable use of data is another example of untethered growth without the correct ethical checks and balances in place.
Several commentators suggested that this could be a tipping point in Facebook’s demise. However, results announced this week show that earnings and revenue soared in the first quarter and active users increased, demonstrating that neither advertisers nor customers are staying away.
But they’re not the only ones. Uber’s lack of consideration for passenger safety was one of the reasons given by TfL for revoking its licence last year. Apple, among others, has recently been hauled over the coals by shareholders for its use of dark patterns – tricks used in websites and apps that make you buy or sign up for things you didn’t mean to – driving ‘addictive’ behaviours among children. This wasn’t a couple of concerned parents raising the alarm: it was the company’s two largest investors, with one stating the case for the business to take a more moral approach. And lest we forget former Facebook president Sean Parker, who described the site as built to exploit human vulnerability, saying: “God only knows what it’s doing to our children’s brains.” The list goes on.
Sometimes brands can become part of the informal regulation of the disruptors – a result of the right checks and balances being missing from the system. We’re starting to see this happen with a raft of brands pulling YouTube ads that appeared alongside inappropriate videos. While this may help to effect positive change, arguably the real change needs to be driven by the market disruptors themselves. There is a developing, yet nascent, consensus around the world – including in Silicon Valley – that the long-term consequences of new technologies need to be considered and accounted for from the outset, with no room to eschew responsibility. Businesses that embed ethics within their business and product from the start can avoid shareholder backlashes and retrospective investment in expensive CSR initiatives.
The pace of tech development will continue to increase exponentially, and the deeper we go, the greater the room for error. Products and services are becoming more connected, and machine learning and AI offer both amazing opportunities for customer experience and a bear trap for ethics. Unchecked, customer behavioural pattern recognition, personal data and predictive experiences are ripe for exploitation.
That said, there is already a segment of the AI community actively identifying and addressing these concerns and exploring whether ethical and moral principles can be built into algorithms. For example, women are being encouraged to drive AI developments to ensure that solutions don’t replicate the male bias prevalent in society, and failsafe procedures are being created so that humans can take back control when automated AI applications reach the limits of their competency. Google’s DeepMind, whose stated mission is to “solve intelligence”, has launched its latest initiative – DeepMind Ethics & Society (DMES) – to scrutinise its algorithms and the societal impacts of the technologies it creates.
The businesses that are transforming entire sectors need to plan for success and scale, and to map potential scenarios to understand their moral obligation to protect their customers, employees and the communities in which they operate. By doing so, they stand the best chance of developing a sustainable business for the future. Amazon also announced this week that Alexa’s software upgrade next month will encourage children to say ‘please’ and ‘thank you’ rather than barking orders, while stronger parental controls over purchasing are also being put in place. This is a very small step towards bringing some morality back into the hands of the consumer.
One of the most fundamental challenges of the digital age – that technology is not neutral – is something that no one is owning right now. There’s a shared responsibility among tech giants to create and apply a critical ethical framework that assigns responsibility and accountability based on what promotes the public good. Without it, there will be more hell to pay than shareholders knocking on the door. A sustainable future where innovation meets technology hangs in the balance.