Artificial intelligence is increasingly being used to create technological interfaces - whether chatbots, personal assistants or robots whose function is to interact with humans. They offer services, answer questions, and even undertake domestic tasks, such as buying groceries or controlling the temperature in the home.
In a study of personal assistants with female voices, such as Amazon's Alexa and Apple's Siri, the United Nations Educational, Scientific and Cultural Organization (UNESCO) argued that these technologies could have significant negative effects on gender equality. In addition to the fact that these artificial intelligence (AI) systems are trained on gender-specific models, these female-voiced assistants all feature stereotypical female attributes. The problem is compounded by the fact that these systems were probably created primarily by male developers. These gendered assistants can pose a threat through the biased representation of women they project, especially as they become increasingly ubiquitous in our daily lives. It is predicted that by the end of 2021, there will be more voice assistants on the planet than human beings.
Given the increasing use of voice assistants trained on biased language models, the potential impact on gender norms is of concern. As isolation increased significantly during COVID-19, there is a risk that some people's main 'female' interaction will be with these voice assistants. If we are not careful, sexist representations of women, totally out of step with real women, will intrude into the privacy of the home or our smartphones, anywhere, anytime. Moreover, the models are essentially the same, leading to the reproduction of a single 'standard' and a cultural flattening of human-machine interaction that denies the diversity of the users of these products around the world.
While some have argued that learning algorithms may be less biased than humans, who are often influenced by discriminatory cultural norms of which they may not be aware, this argument overlooks the fact that artificial intelligence (AI) is necessarily created by human beings, whose ways of thinking it incorporates. Indeed, it is easy to underestimate the importance of cultural norms in human decision-making. Artificial intelligence mimics the social biases of the data it has been given unless it is explicitly designed with different principles. It is therefore not surprising that artificial intelligence developed without built-in values simply reflects already biased social norms.
The boundary between the digital world and the human body has disintegrated. With the rise of artificial intelligence and the internet of medical things, a patient’s body can resemble a sci-fi cyborg that operates both independently and electronically through sensors. As the physical and cyber worlds blur, scholars and practitioners have debated medical device regulation, liability for device malfunctions, device privacy, and cybersecurity. One area of the discussion that has been left relatively untouched, however, is femtech. Described broadly as female technology, femtech encompasses wearables, artificial intelligence, apps, and other hardware and software that not only seek to heighten awareness of female health, but also aim to enhance women’s agency over their bodies. Reporters have called femtech a win for women’s health, as startups and venture capitalists finally invest in female products that can benefit half of the population. Today, the most common femtech products on the market focus on menstruation, maternity, and fertility, and are advertised as giving women control over their bodies and wellbeing.
But what if they don’t? By using femtech devices without understanding how these products are regulated and how their data is collected, manipulated, or sold, women may unintentionally be losing control and autonomy over their bodies. These devices collect intimate health data that may be used to maintain stereotypes and societal norms about the female body. For instance, some femtech menstruation products do not permit a user to input abortions or irregular cycles. This failure to account for all female body types and decisions perpetuates the flawed assumption that abortions and irregular cycles are deviations from the standard female body and can marginalize women who do not conform to these “norms.” Similarly, femtech can reinforce outdated perceptions about women and their bodies by consistently trying to quantify, analyze, and create a version of “normal” that all women should strive to achieve.
The fundamental assumptions of femtech, therefore, do not necessarily align with female consumers and patients, and may inadvertently diminish women’s agency and control over their own bodies. This misalignment stems, in part, from the lack of female and provider input into device creation, the rush to market new devices without adequate testing and vetting, and the male-dominated startup industry creating these products. This article analyzes the societal implications associated with femtech in its current form and offers recommendations for modifying the femtech model to avoid undesirable consequences as the industry – and devices – grow in size and complexity.