
cw: WMD/ genocide

@cadadr@mastodon.sdf.org There is of course some work in the field, though it's not given the emphasis it deserves. Several commentators have noted that CompSci, unlike physics, has not yet had its Hiroshima moment (Ex-Googler and G+ architect Yonatan Zunger, trained in physics, among them).

That notion itself may be flawed: the consequences of computer-based moral failure rarely arrive as blinding insights of unignorable magnitude, airdropped with precision, creating tens of thousands of martyrs and witnesses at once.

The same conflict which birthed Little Boy also snuffed the souls of 6 million Jews (and others) with less haste, but punch-card precision, tabulated and enumerated by IBM under contract, operating in and for Nazi Germany, recording data with serial numbers, some of which still remain tattooed on the arms of survivors.

And yet computer science is almost wholly unaware of its Holocaust past.

1/

There's William J. Rapaport's Philosophy of Computer Science, still in development, which includes chapters on ethics generally and the ethics of AI specifically:

cse.buffalo.edu/~rapaport/510.

@cadadr@mastodon.sdf.org

3/

There is the startlingly prescient writing by Internet (or proto-Internet) pioneers such as Paul Baran (co-inventor of packet-based networks), writing at RAND in the 1960s on issues of ethics, morality, and social responsibility.

rand.org/pubs/authors/b/baran_

(Those writings are now published free of charge online at my request.)

@cadadr@mastodon.sdf.org

4/

Paul Baran appears in this 1966 BBC documentary, Panorama - California 2000, at 31 minutes, discussing privacy:

vimeo.com/170324749

@cadadr@mastodon.sdf.org

5/

Another problem is that computer science, or rather, computer practice, is not, and possibly has never been, a specific profession with dedicated training, certification, and a career track.

Computers are more like phones, or cars, or jackets: nearly everybody has one, many people own or use several, they're ubiquitous and part of virtually all work, entertainment, social engagement, and government. Phones and cars are computers these days, jackets may soon be.

If ethical training is required, it needs to be universal. Or simply cultural, akin to religion in pervasiveness if not necessarily in methods or structure.

Even within the tech sector, non-CompSci graduates in advanced roles (Zunger, myself) are the norm, if not the majority.

@cadadr@mastodon.sdf.org

6/

@dredmorbius @cadadr agreed. I'm one of the ones *with* a CS degree and didn't get ethical training either. Such training needs to be universal.

@cadadr@mastodon.sdf.org Peter G. Neumann's RISKS Digest (SRI/ACM) often revolves around ethical issues, though that's not its core focus. I thought PGN had written on the topic directly (and he very likely has), but I'm not finding references readily.

en.wikipedia.org/wiki/Peter_G.

en.wikipedia.org/wiki/RISKS_Di

catless.ncl.ac.uk/Risks/

7/

@cadadr@mastodon.sdf.org And a late add: Ethics of AI / University of Helsinki

The Ethics of AI is a free online course created by the University of Helsinki. The course is for anyone who is interested in the ethical aspects of AI – we want to encourage people to learn what AI ethics means, what can and can’t be done to develop AI in an ethically sustainable way, and how to start thinking about AI from an ethical point of view.

ethics-of-ai.mooc.fi

8/

Toot.Cat