Info sec, AI and ethics – some thoughts #codemesh

I’m heading off to speak at the CodeMesh Conference in London shortly and I’ve been thinking about the emerging boundaries between information security, AI and ethics. I will post some thoughts as they evolve.

Developers (and others) and ethical approaches

We need to help everyone, from coders through info sec professionals to senior organisational leaders, understand that information security, AI and ethics are now part of the everyday landscape. This is no longer something that someone else does; it needs to become embedded in our everyday practices.

Nobody has all of the answers, and nobody even has all of the questions. But this intersection between information security, privacy, AI and ethics is becoming increasingly important as we start to think about the kind of future we are building. We need to think about how to create the kind of future we want, and not merely wander blindly into some kind of dystopian future.

In particular, ethics is an area that academic research does fairly well. Universities have well-established ethics processes and there is a high level of consciousness among researchers of their importance. But in business ethics is often not even a secondary consideration. There is general theoretical agreement that everyone ought to take an ethical approach to their work, but it is not always welcome in practice. And yet business folks have a part to play in creating ethical workplaces. We all do.

In software development some of the practices that have been proposed – things like Privacy by Design or Security by Design – are interesting, yet I’ve not seen either in the wild. These are sensible approaches, and Privacy by Design is even part of the GDPR, so it might even work (eventually). Yet neither of these explicitly focuses on ethics.

And all of this is not much help when a developer is approached by a business person and asked to develop something that might be ethically a bit shady. Look at the example of the Volkswagen developer who went to prison for his role in creating software to deceive regulators around the world. There can be real-world consequences for poor ethical decision making in the workplace.

VW engineer sentenced to 40-month prison term in diesel case: [he] was a “pivotal figure” in designing the systems used to make Volkswagen diesels appear to comply with U.S. pollution standards, when instead they could emit up to 40 times the allowed levels of smog-forming compounds in normal driving. – Reuters 26 Aug 2017

It all seems to point to a need to develop ways for business people to run an ethical lens over their ideas well before they approach a developer.

One approach that has merit is something like the Ethics Canvas, which is inspired by notions like the Lean Canvas or the Business Model Canvas. A simple and easy-to-use tool such as this could provide business folks with a way to consider the ethical implications of things that they ask developers to do. I’ve started to use the Ethics Canvas at work on some projects; it will be interesting to see how it goes.

Header image: Martin420, CC BY-SA 4.0, via Wikimedia Commons


Some thoughts on digital and data Ethics

‘We ask ethical questions whenever we think about how we should act. Being ethical is a part of what defines us as human beings.’
The Ethics Centre, Sydney

Humans have been thinking for aeons about the moral principles that govern our behaviour and the way in which we conduct ourselves. Now we are moving at lightspeed towards a new and exciting future built on algorithms, data, and digital technologies. Ethics is an area of increasing importance because data is proliferating through digital channels and the IoT, and there seems to be little opportunity to slow things down.

I’ve been thinking about digital and data ethics since I joined Steve Wilson, David Bray, John Taschek, and R “Ray” Wang on a Digital Ethics for the Future panel in 2016.

5 propositions about data

  1. Data is not neutral – that is, all data is subject to bias
  2. There is no such thing as raw data – by the simple act of selecting data, you have exercised judgement about which data to include or exclude
  3. The signal-to-noise ratio has changed – we now have so much data that there is more noise than signal, and it becomes difficult to ascertain which is the signal
  4. Data is not inherently smart – it is our interpretation of data that adds value
  5. The more data we have, the less anonymity – it becomes increasingly difficult to avoid identification
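Proposition 5 can be made concrete with a classic linkage attack: a dataset with names stripped out is often re-identifiable by joining it to a public dataset on shared quasi-identifiers. The sketch below uses entirely invented data and field names, purely to illustrate the mechanism.

```python
# Illustration of proposition 5: "de-identified" records can be
# re-identified by joining them with a public dataset on shared
# quasi-identifiers (postcode, date of birth, gender).
# All records and field names here are invented.

# Health records with names removed, but "harmless" attributes retained.
deidentified = [
    {"zip": "2034", "dob": "1971-07-31", "gender": "F", "diagnosis": "asthma"},
    {"zip": "2034", "dob": "1985-02-14", "gender": "M", "diagnosis": "diabetes"},
]

# A public register (e.g. an electoral roll) that includes names.
public_register = [
    {"name": "Jane Citizen", "zip": "2034", "dob": "1971-07-31", "gender": "F"},
    {"name": "John Voter", "zip": "2034", "dob": "1985-02-14", "gender": "M"},
]

def reidentify(records, register):
    """Join the two datasets on the quasi-identifiers zip/dob/gender."""
    index = {(p["zip"], p["dob"], p["gender"]): p["name"] for p in register}
    matches = {}
    for r in records:
        key = (r["zip"], r["dob"], r["gender"])
        if key in index:
            matches[index[key]] = r["diagnosis"]
    return matches

print(reidentify(deidentified, public_register))
# With these toy rows, every "anonymous" record maps back to a name.
```

The more attributes each record carries, the more likely a combination of them is unique to one person, which is why simply removing names provides so little anonymity.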

Why this is important

There have been numerous examples of data breaches, for example at the Australian Red Cross and in the nation of Sweden. Every data breach is the result of some defect in the design, development or deployment of the technology. Many of these breaches could be prevented by building ethical frameworks into the design, build and deployment phases.

By the way, the World’s Biggest Data Breaches visualisation tool provides an excellent and mesmerising way to explore data breaches.

It is also interesting to recall the ease with which Microsoft’s Tay Twitter bot was trained to become rather nasty very quickly, demonstrating the need to be sure of the training data one uses and to ponder the potential consequences of design and deployment decisions: Twitter taught Microsoft’s AI chatbot to be a racist asshole in less than a day.


And there is the recent example of bathroom soap dispensers that were designed in such a way that they recognise white hands but not darker-skinned ones. This is obvious bias from the design and development team, and an example of why diversity in teams is critical. The fact that the average developer is a white male means that it is likely that every design has a white male as its default setting.
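A plausible mechanism for the soap-dispenser failure can be sketched in a few lines: an optical sensor fires when reflected light exceeds a threshold, and if that threshold is calibrated only against lighter, more reflective test hands, darker skin never triggers it. The reflectance values and calibration rule below are invented for illustration, not taken from any real device.

```python
# Hypothetical sketch of the soap-dispenser failure: an optical sensor
# fires when reflected light exceeds a fixed threshold. If the threshold
# is calibrated only on lighter (more reflective) test hands, darker skin
# never triggers it. All reflectance values here are invented.

def calibrate_threshold(samples):
    """Set the trigger threshold at 80% of the weakest calibration signal."""
    return 0.8 * min(samples)

def dispenses(reflectance, threshold):
    """The dispenser fires only if reflected light reaches the threshold."""
    return reflectance >= threshold

# Biased calibration: only light-skinned testers (high reflectance).
threshold = calibrate_threshold([0.70, 0.75, 0.80])

print(dispenses(0.72, threshold))  # lighter skin: True
print(dispenses(0.30, threshold))  # darker skin: False - excluded by design

# A more diverse calibration panel lowers the threshold for everyone.
inclusive_threshold = calibrate_threshold([0.30, 0.70, 0.80])
print(dispenses(0.30, inclusive_threshold))  # True
```

The bias here is not in any single line of code but in the unrepresentative calibration data, which is exactly why diverse test panels and teams matter.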

The issues of bias – both unconscious and conscious – are enormous.

Data is increasing at a vast rate, as demonstrated by this chart from the IDC Data Age 2025 study, and this means that we need to develop ethical frameworks to support the acquisition, management and analysis of large datasets.

Some existing approaches

Universities have a long history in managing ethics, but even they are struggling with the implications of the complex data sets and algorithms that they are dealing with.

Over the years the ICT industry has developed a number of codes of ethics and codes of professional practice, yet many developers and data scientists are unaware of these. Some examples of these codes of practice include:

But realistically, if developers have not even heard of these codes then how can they possibly influence the design of solutions that avoid bias and other ethical issues?

Some newer approaches

‘Privacy is an inherent human right, and a requirement for maintaining the human condition with dignity and respect.’

Bruce Schneier

There are the beginnings of some new approaches, such as Accenture’s 12 guidelines for developing data ethics codes. And recent initiatives such as the OWASP Security by Design Principles and Privacy by Design might well provide a good starting point for thinking about how we can embed good practice into the design and building of data sets and algorithms.
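One way to make "privacy as the default setting" concrete in code is to pseudonymise direct identifiers before they are ever stored or logged. The sketch below is a minimal illustration of that idea using keyed hashing; the field names and key-handling are invented assumptions, not a production design.

```python
# A minimal sketch of one Privacy-by-Design idea: privacy as the default
# setting. Direct identifiers are pseudonymised with a keyed hash before
# storage, so raw identifiers never reach the stored record.
# Field names and the key value here are illustrative placeholders.

import hashlib
import hmac

SECRET_KEY = b"placeholder-keep-out-of-source-control"  # assumption: stored in a secrets manager

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def store_event(email: str, action: str) -> dict:
    # The raw email address never enters the stored record.
    return {"user": pseudonymise(email), "action": action}

record = store_event("jane@example.com", "login")
print(record["action"])

# The same user always yields the same token, so analytics can still link
# events together without ever handling the raw identifier.
```

The design choice worth noting is that the privacy protection is structural: a developer using `store_event` cannot accidentally log the raw identifier, because it is pseudonymised before the record exists.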

There is some good discussion of these issues in Floridi and Taddeo’s What is Data Ethics? (2016), and as they note, we need to examine ethics in terms of the following categories:

  • data – including how we generate, record and share data, including issues of consent and unintended uses of the data
  • algorithms – how we interpret data via artificial intelligence, machine learning and robots
  • practices – devising responsible innovation and professional codes to guide this emerging science

There have been developments in the area of community based approaches to improving digital and data ethics, chiefly in the area of machine learning and AI. Here are some examples of groups working in this area:

Some new ways to think about digital and data ethics

‘Complexity is a defining feature of the digital era, and we are not adjusting our governance structures to manage it.’

Kent Aitken, Prime Minister’s Fellow, Public Policy Forum Canada, 2017

We need to be clear that technology has no ethics. It is people who demonstrate ethics, and technology inherits the biases of its makers. We need to develop ethical frameworks and governance practices that enable us to build solutions that are better from an ethical perspective.

I believe that if we start from the principles of Privacy by Design and Security by Design then we have a fairly firm practical basis for the future.

One thing is certain: at an institutional level, information security, privacy and data governance will need more work to form a solid foundation for better data ethics.


Bad management, ethics and philosophy: what can we learn from News of the World?

The demise of the News of the World (NoTW), a 168-year-old (and reportedly profitable) British newspaper, gives us some valuable insights on a number of levels.

Every day over the past few weeks we have been gobsmacked by the revelations about NoTW, assuming nothing could be more shocking.

But then there’s a new revelation about the way NoTW practised its business and we’re even more shocked.

An important insight these events offer us is how management thinking about business and ethics informs management practice in the real world, and how that thinking translates into behaviour in the workplace.

Bruce Guthrie, writing in the Sydney Morning Herald, recounts that in 1988 at a conference of News Corporation editors in Aspen, Colorado:

“I asked about ethics and Rupert called me a wanker”.

This article is interesting because it gives us a view into the behaviour that the top leader in that organisation demonstrated to his senior leaders and managers.

As Guthrie notes:

“I left that conference in Colorado more than 20 years ago concerned that Murdoch saw ethics or, at least, the discussion of them, as an inconvenience that got in the way of the newspaper business.”

When the top leader of an organisation gives that kind of strong message then it is extremely unlikely that any other leaders or managers will explore issues like ethics or managerial accountability. It is also unlikely that exploring those kinds of issues is part of the reward and remuneration structure within the organisation.

Further, it is also unlikely that the business leaders, given that kind of strong message from the top, will ever take the time to consider philosophical issues about management, leadership and the kind of business they want to run for customers, employees or society.

With that kind of leadership message we get a soulless automaton of an organisation that does whatever it takes to deliver shareholder value, no matter the cost to the people involved in the process.

And now, with News of the World, we see the results of that kind of leadership and management.

Where does the buck stop with the kinds of bad behaviour we saw in News of the World? Where did the people at the front line get the message that their appalling practices were okay? What kind of management philosophy was in place there?

Perhaps just a quick check of the News Corporation corporate governance page demonstrates their current thinking on corporate governance?

It seems that this tragic tale of a corporation gone wild raises interesting questions for all of us as leaders and managers to ask ourselves.

Most importantly we must ask ourselves: “would I have gone along with business practices like those in evidence at News of the World?” It is easy to say no from the comfort of an armchair and with full hindsight. More pertinent is the challenge of saying no during the cut and thrust of a busy day in the office, when your job is on the line.

MBAs, ethics, pledges and virginity

Recently I noted the phenomenon of MBA graduates signing pledges promising to behave ethically. This is an admirable sentiment, especially given the sheer number of MBA graduates (from ‘excellent’ business schools) who’ve been responsible for and/or complicit in our current Global Financial Crisis.

But it all has me wondering how such well-educated and smart people can end up doing such stupid, short-sighted and even illegal things?

We humans are social creatures, and what we do is largely dictated by our social connections (a.k.a. our peer group). This is true even in the workplace. We easily adapt to the cultural milieu in which we find ourselves; the ability to meld into groups or tribes is one of our human survival mechanisms.

Very rarely does any business executive actually set out from the beginning to act criminally. Does anyone imagine that Enron executives or Madoff and his compatriots originally set out to undertake fraud?  I suspect that it was a seemingly innocent series of decisions and compromises over time that led to their unhappy end.

We know how strong the forces of conformance to authority can be – Milgram’s experiment and others bear that out. We also know how hard it can be to break a habit of behaviour or attitude:

The chains of habit are generally too small to be felt until they are too strong to be broken. ~Samuel Johnson

The chances that young people, even though smart and well-educated, will be able to resist the implicit or explicit direction of authority towards unethical ends seem quite remote. How will young people:

  1. Identify inappropriate, but small, actions that indicate a potential problem?
  2. Be strong enough to resist the little daily actions that ultimately lead to evil?
  3. Understand the longer term implications of seemingly inconsequential daily actions?

Then, as those young people mature and obtain hostages to fortune, how will they resist the forces of conformity in the workplace?  How will they resist those little daily compromises that can culminate in real evil?

I suspect that a grown-up with a family and a mortgage is much more resistant to the idea of rocking the boat by calling attention to irregularities in the workplace.  Further, I suspect this is even more so during an economic downturn.

The real question for business schools is how to equip graduates with the ability to see and recognise the slippery slope as it gradually appears before them. We also need to consider how to better support whistleblowers, given that they often suffer worse fates than those on whom they blow the whistle.

As for the MBAs pledging ethical behaviour, I do hope that they fare better than those who’ve pledged to maintain their virginity.