Naïve, conventional wisdom suggests that machines cannot be biased because they are objective, inanimate objects that lack the ability to make conscious decisions (1, 2). Yet, limitations in the design of hardware or software can result in systematic performance differences across populations based on attributes such as race, ethnicity, gender, sex, or socioeconomic status. For example, biased hardware designs within automated soap dispensers result in technology that readily dispenses soap to individuals with light skin tones but fails to dispense soap to individuals with darker skin tones (3). High-profile examples of software technologies that may perpetuate discriminatory practices include algorithms used for facial recognition (4), loan decisions (5), and criminal sentencing (6). This recognition of racial bias within technologies reflects the modern understanding of racial bias as rooted not only in explicit individual prejudices or racism, but also in systems, laws, policies, and practices in the form of structural racism, whether or not they are intentionally biased (7, 8).

Health care is not immune to these critical problems. Bias is well documented in medical practice, affecting behavior, interactions, and decision making, where it may play a role in perpetuating health disparities (9, 10). Analogously, medical devices can also exhibit racial or ethnic bias if design flaws lead to performance differences in patients of racial or ethnic minority groups (2). Pulse oximeter technology serves as an important case study in how bias can be introduced, perpetuated, and left unaddressed in medical devices. While these design flaws may be largely unintentional, it is incumbent upon designers and users to make every effort to identify, mitigate, and remove these biases so that they do not contribute to the stark health disparities experienced by minority groups (11, 12).