Are neuro-education biometrics morally acceptable when used for the education of children?
CHILDREN, EDUCATION AND NEURO-BIOMETRICS
Authors: Lara Mikocki and Bente van der Laan
Executive summary
This report aims to provide a starting point in understanding the moral concerns of using neuro-biometric devices in the education of children. The report does this from a rights-based approach and investigates the key moral dilemmas relevant to the application of a specific neuro-biometric device, FocusEdu, keeping the interests of the child at the fore of the investigation. The key moral dilemmas discussed include questions around the impact of the device on a child’s autonomy and freedom; the impact of the device on a child’s privacy; and the moral concerns on fairness that follow from using the device in the education of children. Following these considerations on the values of autonomy, privacy and fairness, we propose three related rights that aim to shed light on ethical considerations around brain rights for children that may not yet have been closely investigated. We do this by highlighting three rights explicit to the brain in education, and its contents: a right to cognitive freedom; a right to mental privacy; and a right to cognitive equality. This report does not aim to argue for these rights, but simply aims to highlight their significance, and suggests making them more explicit in the era of neuro-biometric devices.
Table of Contents
Introduction
Method
Chapter 1: Autonomy and agency
1.1 Agency and consent
1.2 Coercion
1.3 Introducing the right to cognitive freedom
1.4 Informed consent
Chapter 2: Privacy in education
2.1 Privacy and surveillance
2.2 Autonomy and the privacy of thoughts
2.3 Introducing the right to mental privacy
Chapter 3: Fairness in education
3.1 Concerns regarding tuition
3.2 The problem with equal educational opportunities
3.3 Introducing the right to cognitive equality
Conclusion
Appendix 1: workplan
Introduction
The Rathenau Institute is a Dutch research institute concerned with research and debate on the impact of science, innovation and technology on our society. Following its work programme for 2019–2020, the Rathenau conducts research into the following four subjects: digital society, malleable lives, knowledge-driven democracy and vital knowledge ecosystems. We are two students of the Master’s in Applied Ethics at Utrecht University and were invited to give an ethical evaluation of an already existing biometric technology. We chose a technology that concerns three interesting subjects: children, education and technology. To introduce FocusEdu we will sketch a scenario in which this technology might be used. With that, we will illuminate three values that we consider ethically in order to answer our research question, which is: in the interests of the child, are neuro-biometrics morally acceptable when used in their education?
Scenario: “A classroom full of children using FocusEdu”
Twenty-five 11–12 year old children are sitting in a classroom, and, like a morning ceremony, all give consent to wearing a headband that will collect and analyse their brain waves while they are in class. They all strap on the devices and calibrate them with their screens. The teacher is working hard to explain to her class how to multiply fractions and sometimes takes a look at the display device she has in front of her. The display is showing her which of the children are ‘engaged’ and paying attention to her lesson.
Some children are showing low engagement, and the teacher attends specifically to these children. She approaches one of the children, whom she helps to put back on the right track after he failed to understand something. Another child is also given attention but becomes frustrated that the teacher has not left her alone at that particular moment, as she would like a small mental break, at her own pace. One of the headbands is malfunctioning, and a child is left without the same attentional aid as the other children. In the last fifteen minutes of class, when the children are working on homework that needs to be handed in the following week, FocusEdu helps most of them focus. One student uses the device excessively, triggering the parents’ concern for the child’s safety.
The example illustrates how the research question, in combination with a rights-based approach — a normative method we will elaborate on in the next section — guides us to consider three overarching values: autonomy, privacy and fairness. These values relate to our story in the following way. Firstly, the topic of autonomy and children brings up concerns around giving consent to use this device, freedom and agency, as well as worries about coercion through the technology’s implementation. However, on the other side of the debate, the neuro-technology could also promote autonomy by improving the quality of the child’s learning. Secondly, we explore how the technology could impact privacy, which could in turn influence the wellbeing of the child, through either surveillance techniques or brain-reading techniques. And thirdly, the topic of fairness comes up, which explores notions of equal access and competition in education. In discussing these values we will look specifically from the viewpoint of the child, and provide a response to the following research question: in the interests of the child, are neuro-biometrics morally acceptable when used in their education? At first glance, we can infer that a student receiving improved education experiences some benefits, but are these benefits worth the burdens neuro-biometric technology could impose?
Method
Before ethically evaluating neuro-biometric technologies, what technology are we specifically looking into? The biometric technology we will ethically evaluate centres around education, namely the monitoring of brain waves via EEG for learning purposes. A tool currently in use for these purposes is FocusEdu, produced by BrainCo in the USA. FocusEdu collects and reads brain waves from students and is able to measure a student’s focus and attention during class. FocusEdu aims to function as an enhancer, claiming to improve cognitive functioning. As an enhancer, such a device therefore taps into the practice of cognitive enhancement. Cognitive enhancement can be explained as interventions on the human being that aim to improve cognitive functioning, and the debate around it questions whether this should be regarded as morally permissible (Outram 2012). Cognitive enhancement will be a key theme throughout the report, because FocusEdu is also considered an enhancer. In light of this, the existing debate around cognitive enhancement will provide much of the ethical consideration in analysing neuro-biometric devices such as FocusEdu. However, because the literature on cognitive enhancement focuses on cognitive-enhancing drugs, it misses a large portion of the ethical concerns relevant to neuro-biometric devices, such as concerns around surveillance technology. These will be explored through another heated debate in ethical technology scholarship: the privacy debate. These two debates provide much of the moral groundwork because they explore similar ethical issues around improving human capabilities (the cognitive enhancement debate) and surveilling technologies (the privacy debate).
In view of the fact that this neuro-biometric technology can enhance the human being, but in doing so could infringe upon human rights, we employ normative theory on rights to ethically evaluate the technology. It should be noted that early in our research we had to set an approach, and we chose the rights-based approach over two other prominent normative theories: consequentialism and virtue ethics. We use this approach because it enables us to reflect on values that relate to rights, which can provide a fruitful way of viewing the use of neuro-biometrics in education. A rights-based approach may focus upon the legitimacy of the development of neuro-enhancers in education. Because we use a normative theory on rights as the approach for our ethical evaluation, we consider three important values and try to make them explicit in existing rights, or even suggest new rights. We aim to let this report function as a step towards further research, or as background for decision-makers.
The specific theory within the rights-based approach we employ is Kant’s theory of rights. Kant emphasises treating every rational being (including oneself) with respect, and always as an end in itself; this principle is the root of many human rights. Kant also emphasises moral obligation as an expression of the human capacity for autonomy or self-government. Kant, therefore, has an exclusive focus on rational autonomy (Johnson and Cureton 2019, ch 7). To ethically evaluate this neuro-biometric technology we will start by considering the value of autonomy. From the value of autonomy naturally followed an exploration of a protection of that autonomy, namely the value of privacy. Following this, the value of fairness is considered, as a child’s autonomy might be compromised by unequal opportunities.
This report consists of three chapters, each discussing an ethical value in relation to the research question. Each chapter starts by providing a definition of the value. This definition is followed by a highlighting of several of the most important concerns relating to the moral permissibility of using this technology. Elaborating on these concerns, every chapter ends with the suggestion of a right, or an explicit recognition of an already existing right. In this way, we hope to equip decision-makers with the relevant ethical background to decide whether it is in the interest of children to use these neuro-biometrics in their education.
Chapter 1: Autonomy and agency
The research question guides us to look into the use of this technology especially in the interests of children. Interesting questions relating to this technology develop around the value of autonomy. Are children autonomous? What are they capable of? Are we able to speak on behalf of the interests of children? These questions can be addressed with the well-established ethical concept of autonomy, which is also perhaps the most prominent ethical value in the cognitive enhancement literature. We therefore assess this value against the neuro-biometric technology FocusEdu by looking at the multifaceted aspects of autonomy — from its relationship to a child’s freedom and agency, to how autonomy can be threatened by forces like surrounding social standards. However, before exploring these many facets, we must first try to define the term autonomy in order to understand its role in the permissibility of neuro-biometric devices.
To start, autonomy has many interpretations, but quite literally means ‘self-rule’ or ‘self-governance’. In addition, autonomy is often defined as necessary to freedom, in that our autonomous ability to use “reason to choose our own actions — presupposes that we understand ourselves as free” (Christman 2018). In our classroom scene, there were many instances where autonomy, and therefore freedom, was influenced, whether positively or negatively. Three of these autonomy-influencing conditions were as follows. Firstly, the child had to give consent: a situation where autonomy becomes a partner to a person’s agency, or, in other words, a person having, or not having, the capacity to give or deny that consent as a fully autonomous agent. Secondly, autonomy was impacted when a student did not want to be helped by the teacher: that child’s autonomy was compromised by the teacher potentially forcing a specific behaviour on the child, also known as coercion. Thirdly, autonomy was influenced when a student was steered towards a certain learning path through help from the teacher: this describes how the promotion of autonomy can occur, because the child is made arguably more free, or more autonomous, by being helped via the technology’s intervention, namely the teacher being alerted to aid the child. These are all conditions in which a child’s autonomy, and therefore their freedom, becomes impacted, each of which we will explore in the coming sections.
1.1 Agency and consent
Having briefly explored the ways in which autonomy could be impacted, we will dive into one instance of how autonomy might become relevant, namely the child’s moral agency. This describes how capable the child is of making decisions regarding participation with such technologies, and therefore the moral responsibility of that child. The concept of agency is in healthy debate among many philosophers; however, in very general terms, it can be described as follows: “an agent is a being with the capacity to act, and ‘agency’ denotes the exercise or manifestation of this capacity” (Schlosser 2019, 1). What further complicates the concept is that autonomy cannot exist without agency. This is because autonomy requires freedom, and freedom denotes a capacity to act (agency). So, because children are not fully matured into adults, their agency, as having the capacity to act, is sometimes further debated. This can reach the point of perpetual oscillation between the denial of agency and the granting of agency (Traina 2009, 19). Denial of agency represents the child as not having the capacity to act or make decisions for themselves; the granting of agency means that they do have the capacity to act, even to some extreme points of being capable enough to work like an adult. This oscillation between denial and granting becomes relevant to the moral permissibility of neuro-biometric devices on children, because it becomes unclear whether a child can be granted or denied the freedom to decide on using one. Do they have the capability to decide, reasonably, for a neuro-biometric device? Is this justified? Do they know what they are giving consent to? This is unclear. What is clear is that we live in a world where children generally interact with society regularly, as entities that make progressively more unique choices every day.
Though we do not have enough space to explore this broad moral territory, we will confine our analysis to the case of our story: the neuro-hooked classroom, where children start their day by giving consent to use the devices. To help understand the moral agency dilemma, philosophers often differentiate between moral agents and moral patients (Pluhar 1988, 33). These are somewhat technical terms, but the general concept is that an individual is a moral agent when that individual can be held morally responsible. Adults are typically thought of as moral agents, in that they are considered capable of acting rightly or wrongly. Most philosophers would agree that children are moral patients, but also moral agents. The clarity suffers in the degree to which we hold children responsible and consider them competent, which can differ from adults. For the purpose of clarity, we will take on a hybrid of the two positions illustrated above. We therefore define children and their agency in the following two ways: firstly, children are more vulnerable than most adults; but, secondly, they are also evolving beings that can progressively make their own decisions.
1.2 Coercion
Having established that a child carries some weight in deciding whether or not to use a biometric device, as stated in our research question, we can start to look at one of the key concerns associated with the use of a device that reads their brain waves: the degree of freedom they have when confronted with such a device, and with the use of it. In other words, is such a technology coercive, in that it might force children, even subliminally, into specific behaviours they would otherwise not choose? We explore this topic from two perspectives: firstly, that it is coercive and can obstruct a child’s freedom; and secondly, that it is indeed coercive, but this promotes the child’s freedom.
Firstly, when freedom is obstructed for a cause, this is established as coercion. Coercion can manifest in two prominent ways in the school setting: through social pressure, and through force by an authority. In the first manifestation, coercion becomes especially perceptible in competitive environments like the workplace, the military and, of course, academia. This means that some individuals may feel pressured to enhance their cognition due to social forces and dominant individuals (Krawczyk 2017, 300). Such competitive environments, in combination with cognitive enhancements, could furthermore lead to addiction, as a person who would like to be at a level equal to or better than his peers may use the enhancement obsessively (Glannon 2015, 10). Though there are as yet no studies of addiction to neuro-biometric devices, this report aims merely to highlight the potential risks. In the second manifestation, coercion can occur when a child is forced into a certain line of behaviour they might otherwise not have wanted; as illustrated in our classroom scene, this can be enacted by the teacher. Armed with more in-depth information about the child’s brain attention, an authority could use that information to punish the child, or to force them into specific behaviour against the child’s will. This constitutes coercion, and can be enabled by a device such as FocusEdu; therefore there must be consideration of the moral permissibility of the device. We will discuss this further in the section on surveillance, later in the report.
On the other hand, a cognitive-enhancing technology could promote autonomy. It could improve a child’s concentration and alertness and lead to improved learning (Garasic 2016; Nyberg et al. 2003; Sahakian and Morein-Zamir 2011), and could therefore boost their autonomy through improved opportunities in life. In our classroom scene, a child was helped by the technology because the teacher was more quickly alerted to his struggle. In this sense, whether or not coercion occurs, cognitive enhancements could arguably contribute to the individual’s flourishing and general well-being. This is achieved by fostering the skills required to engage effectively in society (Shook et al. 2014, 7). In addition, and in contrast to the addiction concerns, a possible positive effect of such a technology relates to widespread burnout and the overworked society we live in (Bianchi et al. 2019, 36). Since this technology can monitor a person’s attentional quality, it can also monitor whether the brain is in a stressed or relaxed state, and a child could therefore be attended to accordingly.
1.3 Introducing the right to cognitive freedom
From the above exploration, autonomy is not only closely linked to agency, or the capacity to act; there are also forces relevant to neuro-biometric technologies that can either threaten or promote a child’s autonomy and therefore their freedom. In light of this, we recognise a close link to a well-established right, the right to freedom. We do not aim to explore this right in great detail, but we do aim to highlight its significance and make it more explicit in this report by extending its relevance to the mind — proposing it explicitly as the right to cognitive freedom, together with informed consent. Though this is not the focus of the report, we will briefly explore this right in the following section.
The significance of the mind has been reflected extensively in human rights law (Bublitz 2013, 1314).
“The freedom of the inner realm has been expanded from religious beliefs and conscience to thoughts in general. Article 18 of the Universal Declaration of Human Rights, adopted in 1948, explicitly guarantees that everyone has the right to freedom of thought, conscience and religion.” (Bublitz 2013, 1315).
According to Bublitz (2013, 1330), a right to cognitive freedom guarantees an individual’s sovereignty over her mind and entails the permission to both use and refuse neuroenhancements. In reference to this, we will formulate the right in two ways — the right to refuse and the right to use the neurotechnology. This right to cognitive freedom is fundamental because it is necessary to all other rights: “the right and freedom to control one’s own consciousness and electrochemical thought processes is the necessary substrate for just about every other freedom” (Lavazza 2018, 4). This is an especially important notion in the space of education and of children, because if one can achieve greater liberty, or one’s liberty is obstructed by a potentially coercive application such as FocusEdu, then serious considerations should be taken into account. In sum, a child should be able to freely engage or disengage with neurotechnologies, and they should be equipped to be the master of their own brainwaves.
Chapter 2: Privacy in education
In revisiting our research question, “in the interests of the child, are neuro-biometrics morally acceptable when used in their education?”, and reflecting on our rights-based approach, autonomy plays a large role, and with autonomy, privacy becomes central. This is because privacy protects our individual autonomy, and it is therefore essential to the realization of other important rights such as those of freedom (Britz 1996; Mokrosinska 2017, 117; van den Hoven 2014). These privacy issues trigger a key concern, as neuro-biometrics are technologies that intentionally collect and read the brain data of a human being for the means of enhancement. So, can it be justified that a value as important as privacy be sacrificed for benefits accrued through neuro-biometric technologies in education? To help answer this, and because FocusEdu displays brain-reading data on an information display to the teacher, we will focus on two aspects of privacy. The first aspect is a prominent theme within the topic of privacy relevant to FocusEdu: privacy from surveillance practices. The second aspect is a less prominent theme within the privacy debate: the privacy of one’s thoughts. We recognise that data privacy is also a large topic within the privacy debate; however, as this is covered extensively in privacy scholarship, we will not give it room here.
2.1 Privacy and surveillance
Beginning with privacy and surveillance practices, we will explore the risks and opportunities that surveillance presents for privacy. That is, is surveillance friend or foe to the previously described value of autonomy, and what does this mean for the moral permissibility of surveillance? Before answering this, we will describe the ubiquity of surveillance practices to demonstrate their ethical significance. Surveillance is becoming the norm in public spaces worldwide, and is tightly linked with privacy concerns. Mobile phones, for example, are a type of surveillance tool that now sits comfortably in private spaces, normalising the idea that children should be perpetually available.
Given the morally contingent nature of surveillance, the surveillance of children and its extension into the education space raise a number of questions, especially when approached in the interests of the child. In our classroom scene, we see that the children are surveilled for their attentional engagement. The questions that then arise centre mostly around social development, a concept closely linked to the interference of a child’s autonomy and independence (Livingstone 2019, 30). However, the existing research shows that, generally, children in the digital environment face greater risks but also more opportunities (Livingstone and Haddon 2009, 1). While working to reduce online risks and maximise opportunities is an ideal approach to developing the best online space for children, it is important to remember that every child will come into contact with risks at some stage in their life, and a risk-free environment is not realistic, either online or offline. Nevertheless, we will explore one of the most important risks and one of the most important opportunities of privacy-impacting neuro-technologies: the threats posed to adult/child relationships, like trust; and the opportunities, such as those that promote autonomy.
In terms of risks, surveillance can have negative effects by reducing one key feature of social character — trust (Livingstone 2019, 30). Starting with trust, the issue of ‘chilling’ free speech is key. Chilling is the suppression of certain behaviours online (Penney 2017, 3). In the era of the Internet, chilling occurs because “students [can] know that their posts were monitored and consequently could choose not to express their thoughts on the Internet” (Shade and Singh 2016). If such self-editing occurs due to online surveillance, surveillance in other areas, like education, may also obstruct the development of the kind of trust that supports children in participating with social rules (Fletcher et al. 2004, 781). Kerr and Stattin describe that “monitoring children does not encourage pro-social behaviour; instead, children are more likely to behave in pro-social ways when they are able to voluntarily disclose information to adults with whom they share a bond of trust” (Marx 2010, 214). The same is true of surveillance in the workplace, schools and society in general. A real-world example of this type of surveillance in the Dutch system is ‘Magister’, an existing learning platform used in high schools that provides insights to teachers, students and parents in order to coach the student more effectively towards better results. There has been much controversy and debate over this tool in the media, which suggests that when such surveillance and sharing of information is involved, there is a concern that the values of trust and character are essentially eroded (Guldemeester 2018). This erosion denies the child the social act and freedom of telling their parents their results, effectively erasing opportunities for social connection.
The relevance of Magister to FocusEdu lies in the question of whether collecting more data improves the education of children, and whether this form of education poses socially destructive changes in the relationship between teachers and children, or between parents and their children. If very basic surveillance practices such as Magister can hamper social interaction, more sophisticated technologies such as neuro-biometric devices could have similar effects.
Having established this concern surrounding adult/child relationships, we can now look at how surveillance could provide some opportunities for the child, even to the end of promoting a child’s autonomy. This concept was covered predominantly in the autonomy section earlier, and echoes similar promises. If we refer back to our classroom scene, one child was struggling with a task, and the teacher was alerted and empowered, via surveillance, to attend to that child specifically. This support given by the technology to the teacher arguably fast-tracked learning that might otherwise not have occurred. This is again in the spirit of promoting autonomy, because, even with the privacy violation of surveillance, the device could contribute to the individual’s flourishing (Shook et al. 2014, 7). However, it must be highlighted that this action was intended to help, not to discipline, the child, and herein lies the difference between a form of surveillance that promotes autonomy and one that arguably erodes it. This end is necessary to clarify when considering the purpose and normative aspects of the neuro-technology in question.
2.2 Autonomy and the privacy of thoughts
Having explored the issue of privacy as impacted by surveillance, we can move onto the concept of privacy of thoughts, an aspect that may also be compromised by a neuro-biometric device. Privacy of thoughts is particularly important as neuro-education will be monitoring attentional behaviour, giving at least some shallow insight into a student’s internal thought processes — namely low attentional feedback. Though we cannot download a transcript of a person’s thought content as might be seen in a Black Mirror episode, we can begin to derive informed data of a person’s internal processes. In this part, we will look at what sensitive information brain-reading technologies can show about a person, and the outrage at the prospect of such a function, as well as how complicated it is to define its justifiability.
Even though the research is in its early stages, neuro-biometric tools — such as FocusEdu — have, for the first time, demonstrated that there may be ways to access human thought — even without the thinker’s consent (Wolpe 2004, 37). Using fMRI (functional magnetic resonance imaging), scientists can derive all kinds of content by monitoring brain activity while people engage in certain tasks (Ienca and Andorno 2017). As the research has improved, scientists can begin to link personal traits and abilities, such as extroversion and introversion, as well as ideology, to corresponding brain patterns (Wolpe 2004, 37). Therefore, a common dispute about brain-reading practices is that they can interfere with one’s character, or the impression one gives to another, because they interfere with the usually private contents of our thoughts. This interference could shape how one interrelates with other people, because we may edit our actual relational behaviours (Ryberg 2017, 157).
In terms of how this relates to FocusEdu and its impact on a child’s freedom of thought, privacy rights professor Adam Moore presents a useful account: the so-called control account of privacy (2008, 414). This account emphasizes the significance of privacy for an individual’s well-being and flourishing (Moore 2008, 417). He describes a right to privacy as one that maintains a certain level of control over the inner spheres of personal information and access to “one’s body, capacities, and powers” (Moore 2003, 215; Moore 1998, 372). In other words, if someone is constantly looking over your shoulder, you could change your behaviour because of that person’s palpable presence; this is called the ‘audience effect’ (Wolf et al. 2015, 5). To illustrate this, Moore describes someone wearing a glove because they are embarrassed about a scar on their hand. He goes on to describe that if someone were to snatch that glove away, one would not only be violating a right to property (the glove is theirs), one would also breach that person’s right to privacy — a right to restrict access to information about the scar on their hand. If not properly regulated, FocusEdu can be substituted for the snatching hand: the snatching of private attentional behaviours, of the contents of our minds. If a child wishes not to pay attention — even after an educator attends to assist them — then their privacy could be violated, and their autonomy compromised, if they do not want to be watched. Therefore, if their autonomy is at stake, they ought to be able to keep the ‘glove on’, so to speak. This suggests we should be extremely strict around brain-reading technologies (Wolpe 2009).
From the above exploration, autonomy is not only closely linked to agency, or the capacity to act, and to coercion, but also to privacy. This is because autonomy, and therefore freedom, is compromised when using technologies that can intrude on privacy. From this we again link to the oft-debated right to privacy — a protector of our earlier right to freedom. In this sense, to make it explicit, we define the brain right to privacy as mental privacy, a broad term covering not only protection from surveillance but also the protection of the mind’s contents, namely its thoughts. We explore this right in the following section.
2.3 Introducing the right to mental privacy
Having established these moral concerns on surveillance and the privacy of thoughts, we propose a right that encapsulates all these considerations of brain privacy, which we have dubbed, as an umbrella term, ‘mental privacy’. This mental privacy consideration involves a three-pronged approach to norms: privacy of data, privacy from surveillance, and privacy of thoughts, each intended to protect the child from, respectively, bias (data privacy concerns), changing interpersonal trust relationships (surveillance concerns), and behaviour changes (thought privacy concerns). The data privacy recommendation is mentioned here as a ‘prong’ only because of its large significance; it will not be covered further within the scope of this document.
Looking at multiple sides of the discussion, normalizing the idea of parents, and/or a state, that are overly intrusive of personal borders is hardly enticing. However, neither is indifference to the flourishing of a child of which surveillance may be a by-product. Where the limits of these intrusions should lie is the normative challenge, one that policy-makers in education technology should consider; we have approached it in the interests of the child. Each concern raised around privacy can be addressed with responsible care, and some first considerations are covered here. For example, in order to maintain trust, features such as transparency, two-fold consent, human-supported interaction with the technology, and prescribed voluntary use could all be conducive to maintaining some trust between the user and the tool. However, it must be recognised that these features are not a silver bullet and may not entirely stop the erosion of trust: social pressures, as mentioned earlier, could undermine truly voluntary use, and children may therefore still feel the effects of eroded trust while being surveilled.
In sum, we believe the right to mental privacy is a condition for the permissibility of brain-reading tools such as FocusEdu for the reasons of flourishing, self-preservation and wellbeing. Privacy is a chief protector of autonomy, and therefore freedom, demonstrating its significance. This is closely related to human rights-based literature, which also prominently defends the significance of freedom, and its protection through privacy (Beitz 2009, 83).
Chapter 3: Fairness in education
In introducing the concept of fairness, we will elaborate on the concept of equality in education. It is the rights-based approach that leads us to discover that there are no rights around cognitive fairness, and even less so within education. On equality in education, we echo the Dutch Inspectorate of Education’s concerns about fairness in education (Onderwijsinspectie 2018; Onderwijsinspectie 2019). The pressing issues around fairness in education become further apparent in several recently published articles in Dutch newspapers, from which two key topics emerged. Firstly, an article titled “Tuition is not for everyone” argues that tuition does raise inequality within education. Secondly, there is the idea of inequality of opportunity in education, with multiple articles suggesting the reintroduction of ‘middle school’. This idea entails that all children, up until they turn fifteen, remain within the same level of education before choosing between an academic and a professional education.
In light of the relations between our research question, cognitive enhancement, and the fairness issues currently dominating the newspapers, we discuss two key academic debates. We first elaborate on the value of fairness and the cognitive enhancement debate. Within this, we dwell upon the academic distinction between therapy and enhancement. Drawing on the cognitive enhancement debate, we then refer to the example of tuition and consider how to ethically evaluate this neuro-biometric technology. Finally, we turn to the debate on distributive justice in education, dwelling upon notions like ‘outcomes’, ‘resources’ and ‘opportunities’, which are important distinctions to make when assessing distributive justice in education. In discussing these two debates, we also note several positive and negative outcomes this technology could produce. In concluding this chapter, we recognize that the right to cognitive equality is particularly compelling when reflecting on neuro-biometrics and the interests of children in education.
3.1 Concerns regarding tuition
Before we can answer our research question, it must be noted that cognitive enhancement also raises philosophical questions around the ideals of education and the understanding of human functioning (Buchanan 2011). As mentioned earlier, this debate is roughly captured by the following distinction. Corrective eyeglasses would be considered therapy rather than enhancement, since they serve to bring your vision back to normal; but strapping on a pair of night-vision binoculars would count as human enhancement, because they give you sight beyond the range of any unassisted human vision (Allhoff et al. 2010, 4). From this example, the following ethical questions concerning neuro-biometrics come forward. Shall we use cognitive enhancements in order to improve our education? Is there a problem with our educational system; for example, do we think the level of education is below a certain threshold? And even if we think so, is this neuro-biometric technology something that is, in the interest of a child, morally permissible to use in their education?
When we think of enhancers in education, we find examples all around us: calculators that are allowed in math classes, and online learning tools that help children to learn a language. Other forms of enhancers are also accepted, such as drinking coffee or the use of personal tutors (Nagel 2019, 205). Currently, schools do not outlaw the use of personal tutors, caffeine, or studying outside of school. However, referring to the annual reports and news articles at the beginning of this chapter, tuition does raise ethical concerns because of the inequality in education that it might enable. Tuition poses a problem because it seems to be available only to children whose parents can afford the costs of these extra-curricular lessons. If we assume that tuition, and possibly also neuro-biometrics, can function as enhancers of education, the following considerations need to be taken into account.
Firstly, recognising the potentiation and optimization of individual achievement shows that neuro-biometrics can also function in a competitive way and produce positive outcomes. When educational achievement is explained in terms of effort, talent and how much an individual is supported, the use of this technology could positively impact fairness through competition within the classroom. Just as tuition helps to improve grades and raise the level of education, neuro-biometrics could also be used to level up the education of children. Beyond individual achievement, cognitive enhancers could also function in the interest of all children in society, and so have positive competitive outcomes for society. Goodman, for example, argues that a ban on cognitive enhancers would deprive society of the potentially valuable results of those enhancers (Goodman 2010, 149). Goodman suggests being open to the ethical debate on the use of cognitive enhancers, illustrating that “these drugs ought to be treated in a way consistent with the treatment of most other powerful technologies — not good or bad in themselves, but subject to both good and bad use” (2010, 158). In light of cognitive enhancement’s potential, Sandberg and Savulescu argue that cognitive enhancements could offer significant social and economic benefits: the reduction of economic losses, and both individual and societal economic gains (2011, 92–113).
However, a related worry is that neuro-biometrics will create injustice if only the rich are able to buy such a technology. According to free-market theories, the rich may buy enhancements which the poor cannot afford, so that those with the highest levels of well-being and privilege will be able to buy even greater opportunities (Allhoff et al. 2010, 3). In other words, differences could arise between schools that are able to use this technology and schools that lack these assets. Another worry is that, even if this technology actually helps to improve a child’s focus, we need to consider other techniques, technological or non-technological, that could also help to improve children’s attention. We suggest researching the effects of such other measures, which might also help to increase equality in education. Those other possibilities could, in the interest of a child, offer greater opportunities to improve children’s learning than neuro-biometrics do.
In sum, several authors argue that the introduction of any enhancement technology will create inequality (Fukuyama 2004; Annas 2005). However, Savulescu suggests enhancers may also reduce inequality (Savulescu 2006, 336; Bostrom and Roache 2008, 18). Savulescu writes that there are some strong fairness-based arguments to establish a moral obligation to employ enhancement, and that fairness and justice may even require enhancement (2006, 329). Whether these benefits are realized, or whether feared dystopias become a reality, will depend on how enhancements are distributed and employed (Sandberg and Savulescu 2011, 93). Hence, the following section elaborates on the debate on distributive justice in education.
3.2 The problem with equal educational opportunities
Since no academic research has been done on the empirical effects of FocusEdu (for example, whether it can actually improve a child’s learning), the only thing we can do for now is to highlight ethical concerns with reference to distributive justice in education. Here, the academic debate on outcomes versus resources helps situate us to answer the question of whether it is in the interest of children to use this technology to improve education. We elaborate on this debate in the following section.
To introduce the debate on improving education, one can think of differences in education related to social background, talent and effort. These differences frame education as a distributive good of justice. All theories of distributive justice in education face the challenge of balancing the actions necessary for achieving certain desired outcomes against opposing, and even conflicting, interests. In a society where education can determine success or failure in the competition for unequally distributed goods (Schmidtz et al. 2014, 118), the educational system is something one should focus on in seeking a more just society. In educational justice, scholars argue either for resources or for outcomes. The perspective on outcomes entails, for example, a focus on certain skills, like a child’s level of reading; the perspective on resources considers what social background and talent a child has. Meyer, however, suggests a third perspective on educational justice: educational opportunities, instead of outcomes or resources (Meyer 2016, 342). The reason is that not every child has the potential to reach the same goals, such as that specific level of reading. He does caution, however, that “we also need to be careful not to forget how strongly the allegedly autonomous motivation depends on social background” (Meyer 2016, 344).
Following Meyer, we endorse a focus on equal opportunities in education for children, instead of a focus on distributive justice in education in terms of outcomes or resources. However, we still need to be aware of the difficulty of how to distribute these opportunities. Regarding the relationship between educational justice and neuro-biometrics, it must be noted that more research is needed into whom this technology might work best for. Empirical research, therefore, would need to reflect on the different effects that this technology could have on children with different talents and social backgrounds.
3.3 Introducing the right to cognitive equality in education
Having discussed two of the most relevant debates on fairness in education, in this section we argue for the introduction of a right to cognitive equality in education. As stated earlier, the actual effects of these neuro-biometrics are still unknown. However, from the perspective of fairness, one should be extra vigilant for children and groups who could be disadvantaged by the use of these neuro-biometrics. So, considering the value of fairness, and thereby also the positive outcomes it might ensure, we suggest that equality could be one of the most pressing topics when using neuro-biometrics in the education of children. Therefore, in the interests of children, we should introduce a right to cognitive equality in education. This right to cognitive equality would be a new right.
Though there is no literature on this idea of cognitive equality, the work of Lynch and Baker could provide some guidance. Lynch and Baker (2005) discuss the concept of equality in education, identify some of the issues pertaining to the promotion of equality, and suggest several ways in which education could be much more egalitarian. They emphasize that the educational system is strongly integrated into the society around it, and highlight that systemic changes need to be made: equality cannot be expected in education without progress towards equality in the economic, cultural and political systems in which it is embedded (Lynch and Baker 2005, 154). Given this need for systemic changes, a right to cognitive equality could perhaps only come about alongside them. If this right were to exist, the right to cognitive equality could be one that takes into account the progressive availability of cognitive enhancers such as neuro-biometrics. We are not able to add meaningful clarification to this right, since there is no literature on the subject; however, we hope to ignite further discussion.
Conclusion
This ethical report on FocusEdu functions as a starting point for decision-makers investigating moral concerns on the permissibility of using neuro-biometrics in the interests of the child. We evaluated this technology from a normative theory of rights, and more specifically from the Kantian perspective. Using this method, the analysis identified some key considerations: it highlighted that neuro-biometric enhancers are significantly intertwined with autonomy; that the privacy of a child could be impacted; and that fairness in the education of children should be morally considered.
With these three values, we were able to define several moral concerns on the use of neuro-biometrics in the education of children. Ethical examination of autonomy demonstrated that we need to reflect on areas like freedom, consent and coercion. From this, we recognized the right to freedom and suggested this right should be extended and made explicit as a right to freedom for the brain. We identified that concerns for privacy follow where autonomy has not been respected. From this, we stipulated the significance of the prominent concept of surveillance, and the less prominent but compelling concept of the privacy of thoughts. Recognizing the current right to privacy, we made this right explicit in terms of the brain, referring to it as mental privacy. This already existing right to privacy could include protection from surveillance and the privacy of thoughts. Evaluating fairness, we dwelled upon two important debates relevant to cognitive enhancement in education. This generated the concept of inequality in education, under the assumption that enhancement can function like tuition and therefore have an impact on equality in education. What is interesting about the fairness evaluation is the introduction of a possible new right, namely that of cognitive equality.
In sum, these ethical considerations suggested the introduction or recognition of certain rights in the interests of a child. However, although we give some suggestions, we are not able to give a definitive answer as to whether it is in the interest of children to actually implement these neuro-biometrics in their education. Our suggestion for further research is to gain more knowledge of the actual empirical effects of neuro-biometrics in education, to see whether they can actually improve children’s learning. With the moral concerns we brought up within this rights-based approach, we aim to ethically equip decision-makers to be better able to decide whether to implement neuro-biometrics in education.
Literature
Allhoff, F., Lin, P., Moor, J., and Weckert, J. (2010). Ethics of human enhancement: 25 questions and answers. Studies in Ethics, Law, and Technology, 4(1) 1–41.
Annas, G. (2005). Bioethics: Crossing human rights and health law boundaries. New York: Oxford University Press.
Beitz, C. R. (2009). The idea of human rights. Oxford University Press.
Bianchi, R., Schonfeld, I. S., and Laurent, E. (2019). Burnout: Moving beyond the status quo. International Journal of Stress Management, 26(1), 36–45.
Bostrom, N., and Roache, R. (2008). Ethical issues in human enhancement. In J. Ryberg, T. Petersen & C. Wolf (eds.) New Waves in Applied Ethics. Palgrave-Macmillan, 120–152.
Britz, J. J. (1996) Technology as a Threat to Privacy: Ethical Challenges and Guidelines for the Information Professionals. Microcomputers for Information Management, 13(3–4), 175–93.
Bublitz, J. C., and Merkel, R. (2009). Autonomy and authenticity of enhanced personality traits. Bioethics, 23(6), 360–374.
Buchanan, A. (2011). Cognitive enhancement and education. Theory and Research in Education, 9(2), 145–162.
Christman, J. (2018). Autonomy in Moral and Political Philosophy. The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/autonomy-moral/ retrieved on January 14, 2020.
Deems, A. (January 21, 2020). ‘Hoog en laag’ door elkaar in een klas: ‘misschien kan ik nu wel een niveau hoger doen’. de Volkskrant. https://www.volkskrant.nl/nieuws-achtergrond/hoog-en-laag-door-elkaar-in-een-klas-misschien-kan-ik-hier-wel-een-hoger-niveau-doen~ba69b38f/ retrieved on January 23, 2020.
Fukuyama, F. (2004). Transhumanism. Foreign Policy, 144, 42–43.
Glannon, W. (2015). Neuroethics: Cognitive Enhancement. Oxford Handbooks Online. https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780199935314.001.0001/oxfordhb-9780199935314-e-43 retrieved on January 29, 2020.
Goodman, R. (2010). Cognitive enhancement, cheating, and accomplishment. Kennedy Institute of Ethics Journal, 20(2), 145–160.
Guldemeester, E. (February 21, 2018). Je kind in de gaten houden met een leerlingvolgsysteem moet je dat willen. Trouw. https://www.trouw.nl/nieuws/je-kind-in-de-gaten-houden-met-een-leerlingvolgsysteem-moet-je-dat-wel-willen~b1827b75/ retrieved on December 21, 2019.
Ienca, M., and Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy, 13(1), 1–27.
Johnson, R, and Cureton, A. (2019) Kant’s Moral Philosophy. The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/kant-moral/ retrieved on January 28, 2020.
Krawczyk, D. (2017). Reasoning: The neuroscience of how we think. Academic Press.
Lavazza, A. (2018). Freedom of thought and mental integrity: The moral requirements for any neural prosthesis. Frontiers in Neuroscience, 12(82), 1–10.
Le Clergq, A. (January 17, 2020). ‘Bijles is niet voor iedereen weggelegd’. de Volkskrant. https://www.volkskrant.nl/nieuws-achtergrond/bijles-het-is-niet-voor-iedereen-weggelegd~b42fe46d/ retrieved on January 23, 2020.
Livingstone, S. (2019). Audiences in an age of datafication: critical questions for media research. Television and New Media, 20(2), 170–183.
Livingstone, S., and Haddon, L. (2009). Introduction. In S. Livingstone and L. Haddon (eds.) Kids online: opportunities and risks for children. The Policy Press, 1–6.
Lynch, K., and Baker, J. (2005). Equality in education: An equality of condition perspective. Theory and Research in Education, 3(2), 131–164.
Meyer, K. (2016). Why should we demand equality of educational opportunity? Theory and Research in Education, 14(3), 333–347.
Moore, A. D. (1998). Intangible property: privacy, power, and information control. American Philosophical Quarterly, 35(4), 365–378.
Moore, A. D. (2003). Privacy: Its meaning and value. American Philosophical Quarterly, 40(3), 215–227.
Moore, A. (2008). Defining privacy. Journal of Social Philosophy, 39(3), 411–428.
Mokrosinska, D. (2018). Privacy and Autonomy: On Some Misconceptions Concerning the Political Dimensions of Privacy. Law and Philosophy, 37(2), 117–143.
Nagel, S. K. (Ed.). (2019). Shaping Children: Ethical and Social Questions that Arise when Enhancing the Young. Springer.
Nyberg, L., Sandblom, J., Jones, S., Neely, A. S., Petersson, K. M., Ingvar, M., et al. (2003). Neural correlates of training-related memory improvement in adulthood and aging. Proceedings of the National Academy of Sciences of the United States of America, 100(23), 13728–13733.
Onderwijsinspectie (2018). De staat van het onderwijs 2018. https://www.onderwijsinspectie.nl/documenten/rapporten/2018/04/11/rapport-de-staat-van-het-onderwijs retrieved on January 17, 2020.
Onderwijsinspectie (2019). De staat van het onderwijs 2019. https://www.onderwijsinspectie.nl/documenten/rapporten/2019/04/10/rapport-de-staat-van-het-onderwijs-2019 retrieved on January 17, 2020.
Outram, S. M. (2012). Ethical considerations in the framing of the cognitive enhancement debate. Neuroethics, 5(2), 173–184.
Penney, J. (2017). Internet surveillance, regulation, and chilling effects online: A comparative case study. Internet Policy Review, 6(2), 1–39.
Ryberg, J. (2017). Neuroethics and Brain Privacy: Setting the Stage. Res Publica, 23(2), 153–158.
Sahakian, B. J., and Morein-Zamir, S. (2011). Neuroethical issues in cognitive enhancement. Journal of Psychopharmacology, 25(2), 197–204.
Sandberg, A., and Savulescu, J. (2011). The social and economic impacts of cognitive enhancement. In J. Savulescu, R. ter Meulen and G. Kahana (eds.) Enhancing human capacities, Blackwell Publishing, 92–113.
Savulescu, J. (2006). Justice, fairness, and enhancement. Annals-New York Academy of Sciences, 1093, 321–338.
Shade, L. R., and Singh, R. (2016). “Honestly, We’re Not Spying on Kids”: School Surveillance of Young People’s Social Media. Social Media + Society, 2(4), 1–12.
Schlosser, M. (2019). Agency. The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/agency/ retrieved on January 22, 2020.
Shook, J. R., Galvagni, L., and Giordano, J. (2014). Cognitive enhancement kept within contexts: neuroethics and informed public policy. Frontiers in systems neuroscience, 8(228), 1–8.
Steeves, V., and Jones, O. (2010). Editorial: Surveillance, children and childhood. Surveillance and Society, 7(3/4), 187–191.
Traina, C. L. (2009). Children and Moral Agency. Journal of the Society of Christian Ethics, 29(2), 19–37.
Van den Hoven, J., Blaaw, M., Pieters, W. and Warnier, M. (2019). Privacy and Information Technology. The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/it-privacy/ retrieved on January 23, 2020.
Wolf, L. K., Bazargani, N., Kilford, E. J., Dumontheil, I., and Blakemore, S. J. (2015). The audience effect in adolescence depends on who’s looking over your shoulder. Journal of adolescence, 43, 5–14.