
Bias in computer systems

Published: 01 July 1996

Abstract

From an analysis of actual cases, three categories of bias in computer systems have been developed: preexisting, technical, and emergent. Preexisting bias has its roots in social institutions, practices, and attitudes. Technical bias arises from technical constraints or considerations. Emergent bias arises in a context of use. Although others have pointed to bias in particular computer systems and have noted the general problem, we know of no comparable work that examines this phenomenon comprehensively and which offers a framework for understanding and remedying it. We conclude by suggesting that freedom from bias should be counted among the select set of criteria—including reliability, accuracy, and efficiency—according to which the quality of systems in use in society should be judged.

References

  1. BERLINS, M. AND HODGES, L. 1981. Nationality Bill sets out three new citizenship categories. The London Times (Jan. 15), 1, 15.
  2. CORBATÓ, F. J., MERWIN-DAGGETT, M., AND DALEY, R. C. 1962. An experimental time-sharing system. In Proceedings of the Spring Joint Computer Conference. Spartan Books, 335-344.
  3. FISHLOCK, T. 1981. Delhi press detect racism in Nationality Bill. The London Times (Jan. 20).
  4. FOTOS, C. P. 1988. British Airways assails U.S. decision to void CRS agreement with American. Aviat. Week Space Tech. (Oct. 24), 78.
  5. GAO. 1992. Patriot Missile defense: Software problem led to system failure at Dhahran, Saudi Arabia. GAO/IMTEC-92-26, U.S. General Accounting Office, Washington, D.C.
  6. GRAETTINGER, J. S. AND PERANSON, E. 1981a. The matching program. New Engl. J. Med. 304, 1163-1165.
  7. GRAETTINGER, J. S. AND PERANSON, E. 1981b. National resident matching program. New Engl. J. Med. 305, 526.
  8. HUFF, C. AND COOPER, J. 1987. Sex bias in educational software: The effect of designers' stereotypes on the software they design. J. Appl. Soc. Psychol. 17, 519-532.
  9. JOHNSON, D. G. AND MULVEY, J. M. 1993. Computer decisions: Ethical issues of responsibility and bias. Statistics and Operations Res. Series SOR-93-11, Dept. of Civil Engineering and Operations Research, Princeton Univ., Princeton, N.J.
  10. LEITH, P. 1986. Fundamental errors in legal logic programming. Comput. J. 29, 225-232.
  11. MOOR, J. 1985. What is computer ethics? Metaphilosophy 16, 266-275.
  12. OTT, J. 1988. American Airlines settles CRS dispute with British Airways. Aviat. Week Space Tech. (July 18).
  13. ROTH, A. E. 1984. The evolution of the labor market for medical interns and residents: A case study in game theory. J. Pol. Econ. 92, 6, 991-1016.
  14. ROTH, A. E. 1990. New physicians: A natural experiment in market organization. Science 250 (Dec. 14), 1524-1528.
  15. SERGOT, M. J., SADRI, F., KOWALSKI, R. A., KRIWACZEK, F., HAMMOND, P., AND CORY, H. T. 1986. The British Nationality Act as a logic program. Commun. ACM 29, 370-386.
  16. SHIFRIN, C. A. 1985. Justice will weigh suit challenging airlines' computer reservations. Aviat. Week Space Tech. (Mar. 25), 105-111.
  17. SUDARSHAN, A. AND ZISOOK, S. 1981. National resident matching program. New Engl. J. Med. 305, 525-526.
  18. TAIB, I. M. 1990. Loophole allows bias in displays on computer reservations systems. Aviat. Week Space Tech. (Feb.), 137.
  19. WILLIAMS, K. J., WERTH, V. P., AND WOLFF, J. A. 1981. An analysis of the resident match. New Engl. J. Med. 304, 19, 1165-1166.

        Reviews

        Darin Chardin Savage

        Friedman and Nissenbaum present a fascinating overview of bias within computer systems. The variety of systems surveyed—banking, commerce, computer science, education, medicine, and law—allows for a broad-ranging and poignant discussion of bias, which, if undetected, may have serious and unfair consequences. The rapid dissemination and ready acceptance of computer algorithms mean that biases may easily affect a large number of unsuspecting people. The difficulty, the authors observe, lies in identifying and describing the nature of the bias, an issue that has not been comprehensively addressed in the computing literature. Most experts on social bias and discrimination approach the issue from a legal or philosophical background and may not be equipped to interpret and translate issues from a technological standpoint. This places the onus of responsibility on the technological professions themselves, and this paper provides a very thoughtful and thorough framework for meeting that responsibility.

        From the variety of cases, the authors are able to identify three forms of bias—preexisting, technical, and emergent—and the descriptions of how these biases are integral to the cases seem right on target. My only wish as a reader was for greater quantification of the actual impact of the bias, whether in economic, institutional, or social terms. The authors do give one example that alludes to a quantification of impact, noting that 90 percent of airline reservations are made from the first screen of the display; airlines listed only on subsequent screens are therefore at a great disadvantage. I wanted to know, however, to what extent those businesses are economically disadvantaged by this arrangement, and how many people were affected by the BNAP immigration program or the National Resident Match Program. Perhaps the data are not available or are difficult to determine, but the issues raised in the paper evoke an expectation for this kind of data, which, if presented, would highlight the significance of the bias.

        The authors do show unequivocally that bias exists in the case examples, and where and how it exists within the systems. Their rigorous definition of what constitutes bias, and their call for standards of unbiased programming that meet clearly defined criteria, offer a useful platform from which to address the complex issues of ethics and equity in the information age.
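        The first-screen effect the reviewer cites lends itself to exactly the back-of-the-envelope quantification the review asks for. The following Python sketch is a hypothetical model, not anything taken from the paper: the carrier names, the screen size of six flights, and the even split of bookings within a screen are all illustrative assumptions, with only the 90-percent first-screen figure drawn from the review itself.

            # Hypothetical back-of-the-envelope model of first-screen display bias
            # in a computerized reservation system (CRS). Only the 90-percent
            # first-screen figure comes from the review; the screen size, carrier
            # names, and within-screen booking split are assumptions.

            SCREEN_SIZE = 6            # flights listed per display screen (assumed)
            FIRST_SCREEN_SHARE = 0.90  # share of bookings made from screen one

            def booking_share(flights):
                """Spread expected bookings over flights by display position."""
                first, rest = flights[:SCREEN_SIZE], flights[SCREEN_SIZE:]
                shares = {f: FIRST_SCREEN_SHARE / len(first) for f in first}
                if rest:  # everything after screen one splits the remaining 10%
                    shares.update({f: (1 - FIRST_SCREEN_SHARE) / len(rest) for f in rest})
                return shares

            # A sort rule that quietly favors the host carrier's flights: the kind
            # of seemingly neutral ordering in which technical bias can hide.
            flights = ["HOST-%d" % i for i in range(4)] + ["OTHER-%d" % i for i in range(8)]
            ordered = sorted(flights, key=lambda f: (not f.startswith("HOST"), f))

            for flight, share in booking_share(ordered).items():
                print("%s: %.1f%% of expected bookings" % (flight, share * 100))

        Under these toy assumptions, a flight pushed to the second screen draws roughly one ninth of the expected bookings of a first-screen flight, which is the order of magnitude of economic disadvantage the reviewer wishes the paper had reported.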
