Just because we can, doesn't mean we should



Grady Booch, an IBM Fellow and chief scientist, has opened up the debate on questions of ethics and morality regarding how governments may later use software innovations.

Debating the morality behind software development

No doubt he will be branded a ‘conspiracy theorist’ for daring to question the implications of all that is happening around him, but when you consider the rate of technological change over the last 10 years alone, I welcome such voices raising a siren of concern. As governments rush to install ever more surveillance to make us feel safer under the guise of fighting ‘terrorism’, the reality is that they are cultivating an insidious fear in the minds of most law-abiding citizens. The minority should not dictate to the majority, but sadly that is not the way of our world.

“Here’s an example. London’s installing more video cameras per square mile on the street than anybody else. All right, not a lot of software there. But what happens when they couple that with facial recognition software so I can actually track individuals as they go through the city?”

What happens when, as a technologist, you want to push the envelope for change? How do you balance that against the implications of its use in the wrong hands (and who defines ‘wrong hands’)? Should we hold back on technological change for fear of the unknown? Should we move forward in complete blindness and naivety, ignoring any risks or implications? Or is there a middle ground that comfortably and rightly questions – and seeks to implement – boundaries? Isn’t that the basis of law and of parenting?

Everything may indeed have an equal and opposite reaction, and technology is no different, especially when it comes to advertising.

I may well wear a brand’s tee-shirt with its logo emblazoned across my chest, standing in agreement with its values, but to assume I want to open a dialogue with that brand is something that cannot be taken for granted; it still needs to be requested of me.

Behavioural targeting and user profiling should not be about scraping through the contents of my bin looking for relevance to my habits – something I call snoop-and-serve. Instead, brands may ask me if I would like to converse with them, and then, with my permission, we can explore a journey together. Even then, consent should not be assumed for life, nor extended to every permutation of their business. Relationships do not always last forever, and things have seasons. Give me the default choice to opt in, not opt out – and if I change my mind, so be it!

In the envisioned world of talking billboards in the aptly named film Minority Report, I would suggest that advertising relevance should always be aimed at the minority of people who are prepared to become your brand advocates, not the majority of people who just happen to buy a tee-shirt from you.

The article also highlights the organisation Computer Professionals for Social Responsibility.

About the Author:

I am a Digital Transformation Strategist focussed on global evangelism, helping position clients at the forefront of emerging media and the next generation of consumer engagement. I'm passionate about how storytelling and creative technology can be used to deliver focussed messages – irrespective of the consumer's viewing device – and then drive favourable outcomes for brands, whilst addressing concerns over user profiling.
