Norma

«Whenever normality is invoked, a reference to non-normality (abnormality) always lurks in the shadows, so to speak.» (Link, 2013)

 

 

Disclaimer: With the term ‘normal’ we refer to instances where it is applied to individuals or their properties. We do not refer to normality at the societal level, where it is applied to institutions and their routines (e.g. the reopening of kindergartens, curfews and quarantine, social distancing), as frequently invoked in current Corona discourses about a possible return to normality.

 

BAN

 

Adhering to or rejecting the normal?

Making the normal an object of critique and of personal reflection

 

The normal “uses a power as old as Aristotle to bridge the fact/value distinction, whispering in your ear that what is normal is also right.” (Hacking, 1990)

 

In the mid-19th century, when the Belgian mathematician Adolphe Quetelet started measuring human properties, no human property, and no deviation from its mean - however large - was considered abnormal, pathological, deviant, or normatively loaded. All properties lay in the area under the normal distribution curve, so to speak. ‘Normal’ here referred to a range of variation, in particular to a symmetric, bell-shaped distribution of data in which most values fall near the center; consequently, it was sometimes used as a synonym for 'probable', but it did not describe one side of a binary condition (Cryle and Stephens, 2017). Quetelet’s objective was to identify statistical laws, such as correlations (i.e. phenomena occurring together) and means (i.e. averages), at the population level. At the end of the 19th century, Francis Galton’s theorization of the normal distribution marked the point of convergence between the statistical conceptualization of the normal as the average and the medical conceptualization of the normal as the healthy, which, according to Hacking (1990), laid the foundation for the cultural authority of the normal in the 20th century.

Until the mid-20th century, however, ‘normal’ was used exclusively as a scientific term, mainly in medicine, where it referred to a state of health and the proper functioning of organs. Only then did it enter daily life and lay language. Since then, it has become more and more vague and, to put it simply, has acquired two meanings: a descriptive one, referring to the common and the usual, and a normative one, referring to an ideal to be achieved.
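For reference, the bell-shaped curve that Quetelet and Galton worked with is what statisticians now write as the Gaussian (normal) density; the formula below is the standard textbook formulation and is not taken from the project materials:

f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x - \mu)^{2}}{2\sigma^{2}}\right)

Here μ is the mean and σ the standard deviation; about 68% of values lie within one standard deviation of the mean and about 95% within two. The curve itself is continuous and contains no built-in cut-off beyond which a value stops being ‘normal’.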

Today, almost 200 years after Quetelet’s statistical inquiries, one can observe not only the ubiquity of the normal, but also a form of normatively loaded normality - “the concept of normal is itself normative” (Canguilhem, 1989). It almost seems as if one source of normativity is "statistical voluntarism", in which obligation derives from authoritative statistics that articulate an ought, and in which individuals freely choose to adhere to these norms. However, any claim about what ought to be that is based exclusively on descriptive premises (e.g. statistical statements such as means or frequencies) falls victim to the is-ought problem articulated by David Hume. A statistical statement is, in the first place, a descriptive statement and, thus, a descriptive premise about what something is; it cannot by itself vindicate a normative conclusion. It concerns the Is, not the Ought.

It is argued that under today’s dominant Western societal strategy for how normality unfolds within a society, namely Flexible Normalism, norms are calculated ex post on the basis of statistical data - ‘normal’ is what most people are, or are doing - and an individual may or may not adjust to these norms (Link, 2009). The individual has degrees of freedom with respect to his or her self-alignment. The promise of the flexible normalistic society is to include a huge variety of phenomena in the realm of normality. But this promise requires careful examination, and several concerns need to be raised.

Searching for the normal and for orientation, the individual does not face pre-defined, top-down governed norms, but is confronted with statistically crystallized realms of normality. This also means that normality is dynamic and, hence, constantly changing. However, the limits of normality are set arbitrarily and lie at the extremes of a curve that is in principle continuous (e.g. the normal distribution). Regardless of this, people tend to orient themselves strongly towards normality - they normalize themselves. Even if norms have been developed in a flexible normalistic way, once established they still exert suction effects and conformity pressure, which, at the individual level, lead to (uncritical) internalization and fear of denormalization and, at the societal level, to homogenization and normalization. As more and more people orient themselves towards normality in order to be socially accepted, they become more and more similar to one another. Finally, since people considered abnormal often represent a disadvantaged population, their desire to be normal also expresses their demand to be treated equally, that is, to be regarded as having the same rights as individuals located under the bump in the middle of the normal distribution curve.
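To make the arbitrariness of such limits concrete, the following minimal sketch (in Python; our illustration, not part of the project) derives a ‘normal range’ ex post from simulated data. The data, the variable names, and the factor of 2 are assumptions chosen purely for illustration: ‘mean ± 2 standard deviations’ is a common convention that covers roughly 95% of a normally distributed population, but nothing in the data dictates that particular cut-off.

    import random
    import statistics

    # Simulated measurements of some human property (purely hypothetical data).
    random.seed(0)
    values = [random.gauss(100, 15) for _ in range(10_000)]

    mean = statistics.mean(values)
    sd = statistics.stdev(values)

    # Ex-post 'normal range': mean +/- 2 standard deviations. The factor 2 is a
    # convention (roughly 95% coverage), not something the data themselves dictate.
    lower, upper = mean - 2 * sd, mean + 2 * sd

    outside = [v for v in values if v < lower or v > upper]
    print(f"'normal' range: {lower:.1f} to {upper:.1f}")
    print(f"labelled 'abnormal' by this cut-off: {len(outside)} of {len(values)}")

Replacing the multiplier 2 with 3 shrinks the group labelled ‘abnormal’ from roughly 5% to well under 1% of exactly the same population, which is precisely the sense in which such a boundary is set rather than found.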

Hence, Flexible Normalism - in fact, any form of normality - has its dangers. Not being considered normal may lead to stigmatization, social exclusion, and pathologization of an individual and, ultimately, to pathologies on the part of the individual. Normality is quasi-imposed on individuals. Individuals who do not fall within the range of normality and are considered abnormal face pressure to adapt and are frequently marginalized. The “unassuming word [normal] can have a significant effect on the lives of those defined in contrast to it as abnormal, pathological, or deviant” (Cryle and Stephens, 2017). Moreover, the label represents a judgment about which human properties are good and which are not. Society as a whole loses diversity, and spaces for deviation and otherness diminish. Undoubtedly, (the cultural authority of) the apparently neutral label ‘normal’ emphasizes and reinforces a certain worldview and a certain set of behaviors and human properties; it perpetuates systems of privilege and power, and - in any case - it negatively defines the abnormal. Normality becomes a metonym for social exclusion and a powerful tool in the hands of those who aim to define its essence.

The project takes its starting point in exactly this area of tension: on the one hand, a malleable normality that promises social inclusion and an expansion of normality; on the other hand, the suction effects, imposition, and conformity pressure of any form of normality, which abet social exclusion and a contraction of normality. The project therefore aims to (a) promote awareness of the dangers of linking statistical data with normatively loaded normality, and (b) empower individuals to overcome this nexus of statistics and normativity. For this purpose, we want to make people think about the term ‘normal’, about its (mis)use as a label, and about its potential dangers.

Ultimately, the study aims to demonstrate the following choice between Scylla and Charybdis: either the term ‘normal’ refers to all existing human properties and thereby becomes meaningless and dispensable, or it refers exclusively to a subset of human properties and thereby excludes other properties from being normal, abetting social exclusion, stigmatization, and normalization. The proposed project starts from the conviction that coming to realize the dangers of an essentialization of morality as quantified normality would be beneficial for society as a whole.

After all, what would be lost if we simply stopped using the term ‘(ab)normal’ when referring to other persons, and refrained from labelling persons or their properties as (ab)normal? Can’t we, in these instances, always replace the term with a more accurate but less stigmatizing one?

                                     Try it for yourself and avoid the term 'normal' in daily life

                                              (when referring to persons or their properties)!

                                              

Challenge

 

 

 

References

  • Canguilhem G (1989). The Normal and the Pathological. Trans. C.R. Fawcett. New York: Zone Books
  • Cryle P & Stephens E (2017). Normality – A Critical Genealogy. Chicago: The University of Chicago Press
  • Hacking I (1990). The Normal State. In: The Taming of Chance. Cambridge: Cambridge University Press
  • Link J (2009). Versuch über den Normalismus - Wie Normalität produziert wird. Göttingen: Vandenhoeck & Ruprecht
  • Link J (2013). Normale Krisen? Normalismus und die Krise der Gegenwart. Konstanz University Press

Contact

The project is funded through the Swiss National Science Foundation’s Spark funding scheme and carried out at the Institute for Biomedical Ethics at the University of Basel, Switzerland.

 

Original title of the project: Statistical Categories and Normativity: (Against) the Essentialization of Morality as Quantified Normality