The ethics of digital innovation in health and care

Imran Ali
4 min read · May 5, 2016

Earlier this year I was asked by mHabitat’s Victoria Betton to co-host a round table discussion on the role of ethics in digital innovation and to consider its impact on designing health and care.

I wanted to illustrate why a technologist such as myself is uncomfortable with the prevailing technology determinism; how my ethics have been shaped by a career in bleeding edge technologies; and to introduce some provocations to help drive a discussion.

You can also follow how the discussion progressed on Twitter via the hashtag #mHEthics. One theme that kept emerging from various participants was the notion of dignity and humane design.

Here’s the transcript of my introduction…

I’m one of the founders of Carbon Imagineering, a digital innovation and R&D practice. Much of our work explores the implications of emerging technologies and how new opportunities can be crafted from disruptive innovations.

I’ve worked in and around emerging technologies for about fifteen years — in evil multinationals, venture backed startups, R&D labs and most recently arts and culture organisations.

I was actually one of the first-generation employees at Freeserve in Leeds, designing the services that got many people online for the first time.

A New Medium

I was lucky to be present at the birth of a new medium — looking back, many of us that worked on these things were countercultural and were motivated by the promise of a democratising, empowering technology that diffused power and overturned orthodoxies.

Later as a researcher in Orange R&D, I worked alongside many of the people that would create the Web 2.0 era. We were all very cool and inventive, trying to change the world, but looking back, we were inadvertently building a massive surveillance culture and maybe laying the foundations for future tyranny.

Thing is, we weren’t equipped to innovate and be ethically critical. At university, only one of thirty modules in my degree was about the social implications of technology. Indeed, I was ridiculed for even being interested in it as it didn’t fit the gendered profile of most of my classmates.

Technology Determinism

But anyway, we won. The geeks now rule all realities. It’s not a counterculture, but the culture. We’re not underdogs and outlaws anymore, but we haven’t taken responsibility for what we create. We pat ourselves on the back for disrupting, without considering the human impact on the people and institutions we disrupt.

Unicorns & Uberification

Silicon Valley culture has created incredible innovations, but actually concentrated wealth and power in undemocratic and capital-driven organisations. We contribute our videos and photos and status updates but share none of the stupendous wealth as citizens or users.

Companies like Uber present great conveniences, but also a model that, if adopted by other industries, dismantles the notion of employment and accelerates income inequality. Criticising such innovation often invites accusations of Luddism, shutting down important debate.

So this is a good moment to think about ethics and social implications in designing technology, and about what writers like Anil Dash have started to call Humane Technologies.


I recently worked on Yarn, an AHRC-funded research project on community storytelling here at the University Of Leeds. Not the most lucrative or impactful project in my career, but maybe the most meaningful.

I learned that what we did was a form of deep hanging out with community groups around the country — listening, showing care and empathy, designing for dozens, not millions. And for people, not capital.

There’s potential for this work to be a high-growth startup, but we wanted it to be owned by the people who co-designed it with us. Community storytelling tech should be owned by those communities, not by Jeff Bezos or Mark Zuckerberg.

Understanding the emotional and human origins of this work and having the notion of humane infrastructure at the core means that it’s almost inevitable that the project will be owned in a collective and open structure.


I guess one way to get a discussion rolling about how to codify some of these observations is to throw out some questions and provocations.

  • Code clubs are all the rage with programmes that teach kids how to code, nuns how to code and cats how to be better entrepreneurs. I rarely hear a discussion on embedding humanities in digital education.
  • Can we reclassify digital rights management as a mental illness? How about some radical openness for the things we make?
  • What if data is not “the new oil” but more like toxic, radioactive waste? How would this change how we design for, and understand, the long-term implications of data use?
  • How can we ensure crowdsourced, socially constructed technology is owned by everyone who contributes? Are CICs enough, or are new structures needed for a sharing economy? For example, what if YouTube was owned by its uploaders, or Uber by its drivers?
  • Can bad design patterns simply be left to open markets?
  • We hear so much about an Internet Of Things, but no one talks about an internet of beings or an internet of values.


