Breastfeeding Beliefs: From Invincibility to Universal Creation

Breastfeeding is an infant feeding practice in which a child is fed breast milk directly from breast to mouth. Breastfeeding could be performed by the mother herself or by a wet nurse. Evidence of breastfeeding is found in various past societies and it can be assumed that breastfeeding has been practiced ever since there were babies. Nevertheless, despite breastfeeding being arguably the most natural way of feeding a baby, there was never a time when it was done by everyone, as there is evidence showing that other infant feeding practices were used as well.

Painting of a woman breastfeeding at home. (Rijksmuseum / CC BY-SA 1.0)

Breastmilk – Is This Remarkable Fluid a Source of Invincibility?

Although it is unlikely that ancient societies fully understood the nutritional value of breastmilk (scientists today are still learning about this remarkable bodily fluid), they were aware of its importance. This may be seen in the revered role accorded to breastmilk in mythology. The ancient Greeks, for instance, believed that it was the breastmilk of Hera, the Queen of the Gods, that made the hero Heracles invincible. It was also the breastmilk of this goddess that was said to have created the Milky Way.

Breast milk was also glorified in the myths of ancient Mesopotamia. Astarte, an important Babylonian goddess, was considered to be the ‘Mother of Fertile Breasts, the Queen of Heaven, the Creator of Human Beings, and the Mother of the Gods’.


Sculpture from Nepal of a mother breastfeeding. (Nabaraj Regmi / Adobe Stock)

The reverence accorded to breastmilk was also extended to those who breastfed. This may be seen, first of all, in artistic depictions of breastfeeding. Images have been found from the ancient Egyptian civilization of the goddess Isis breastfeeding her son Horus.

Isis in papyrus swamp suckling Horus. (Public Domain)

Breastfeeding also plays an important role in the foundation myth of Rome. Instead of being nursed at a woman’s breast, however, the twins Romulus and Remus were suckled by a she-wolf until they were discovered by the shepherd Faustulus and his wife, Acca Larentia.

Suckling from Animals

Romulus and Remus weren’t the only ones said to have suckled from animals. In the past, if a mother was unable to produce enough milk to feed her baby and no other woman was available to take her place, a female animal could be used to keep the child alive. As in the mythological story, direct suckling from the animal was preferred over milking the animal and then giving the milk to the baby, as it was considered a cleaner method. The animals chosen for this purpose were donkeys, cows, goats, sheep, or dogs. Some historians believe that cows and goats may have been domesticated especially for this purpose.

Sculpture of a she-wolf feeding Romulus and Remus. (neurobite / Adobe Stock)

People used to believe that a baby feeding on animal (and in some cases human) milk would absorb something of the milk-giver’s character. The Swedish scientist Carl Linnaeus, for example, thought that a lioness’ milk would promote courage. Donkeys were seen as more moral than ‘lusty’ goats, though goats became the preferred animal wet nurses for foundlings in the 18th century.

How Was the Role of a Wet Nurse Viewed?

In general, ancient societies placed much importance on breastfeeding. It may be assumed that mothers would naturally breastfeed their own babies; however, this was not always possible, as some mothers died while giving birth and others were simply unable to lactate. As a result, there was a market for women who would feed another’s child, and these women were known as wet nurses.

The significance of these women in ancient societies is evident in the respect paid to them. In ancient Egypt, despite belonging to the servant class, wet nurses were highly esteemed, especially those who breastfed the pharaoh. As another example, in ancient Mesopotamia, the role of wet nurses in society was so noteworthy that laws formalizing the relationship between a wet nurse and her employer were issued by the Babylonian king Hammurabi.

18th century family life in Prussian upper-class homes: baby with wet nurse. (acrogame / Adobe Stock)

Not everyone, however, had a positive view of wet nurses. During the Roman period rich families were able to afford wet nurses to breastfeed their babies. This practice was harshly criticized by such writers as Cicero and Tacitus, who were of the opinion that mothers who employed wet nurses were neglecting their duty to Rome, were decadent, and endangered the stability of society.

Wet-nursing continued for centuries, but so did the criticism. In the late 1700s and early 1800s, for example, European reformist movements began to push for women to breastfeed their own babies, and the governments of some nations even began to play a part in this very personal matter. In 1793, the French government declared that women who did not breastfeed would not receive welfare, and in 1794 the Germans made it a legal requirement for all healthy women to breastfeed their own infants. These societal and political pressures meant that by the early 1800s many women were proudly declaring their breastfeeding practice.


‘Young woman breastfeeding her child‘ (1777) by Louis-Roland Trinquesse. (Public Domain)

What Were the Other Ancient Forms of Infant Feeding?

Apart from breastfeeding by a mother or a wet nurse, other forms of infant feeding were used by ancient societies. Terracotta pots with long spouts have been unearthed in some infant graves and are thought to have been used to feed the infants.

Such vessels were the precursor of the feeding bottle, which was introduced during the 19th century. As for the contents of these pots, it was not always milk, as one might expect: the ancient Greeks are reported to have fed their babies a mixture of wine and honey from such vessels.


How the Basques became an autonomous community within Spain

The centuries-long struggle for Basque independence may set an example for similar groups in the country.

Travel through Basque country in northern Spain, and you’ll encounter breathtaking coastal vistas, rocky landscapes and tranquil farms. You’ll also meet Basque people, whose history is one of pride, oppression, and struggle.

The Basque ethnic group comes from a region of southwest France and northwest Spain known to outsiders as Basque Country and to Basque people as Euskal Herria. “Euskal” refers to Euskara, the Basque language, which is linguistically distinct from French, Spanish, and indeed any other language. Euskara is spoken by approximately 28 percent of modern Basques, and it is unclear exactly where it came from, how it developed, or why it is so distinctive. There are at least six Basque dialects, but the majority of Basques speak a standardized version developed in the 1960s.

Recent research indicates Basques descended from Neolithic farmers who became genetically isolated from other European populations by their geographic location, where landscapes range from the coastline to the rocky hills of the western Pyrenees. That largely inhospitable terrain fostered the isolation that helped determine the course of Basque history. Basques lived in northern Spain when the Romans invaded the area in 196 B.C., for instance, but managed to retain most of their traditions and laws throughout Roman rule and their time under the various invaders that followed.

From around A.D. 824, the Basque region was part of the Kingdom of Navarre, a medieval state ruled over by a series of monarchs. In 1515, much of Navarre was annexed to the Crown of Castile and became part of what would become modern Spain. After a period of relative independence, Basque self-government was abolished by the Spanish government in Madrid beginning in 1839. Over time, a growing Basque nationalist movement began to insist on political unity and agitate for a separate Basque nation. During the Spanish Civil War in the 1930s, Francisco Franco forbade the Basque language, stripped rights from the Basques, and ordered the destruction of the Basque city of Guernica, a bombing that inspired one of Picasso's best-known works.

Basques suffered under the Francoist regime. In response, a group of Basque separatists formed Euskadi Ta Askatasuna (ETA) in 1959. The separatist group conducted a decades-long terrorist campaign that ultimately killed over 800 people. The ETA disbanded in May 2018.

By then, Spain had granted relative economic and political autonomy to Basque Country and acknowledged a separate Basque identity. (Though the Basque Autonomous Community, which includes three Basque provinces, has its own identity, it lacks an official capital: Vitoria-Gasteiz is its de facto capital, but the largest city within Basque Country is Bilbao.) In the years since Franco’s death in 1975, the Euskara language has been largely revived, and most Basques have stopped calling for a fully independent nation.

The Catalans, another ethnic group within Spain, have not, however. When Catalonia held an independence referendum in 2017, Spain declared it illegal, suspended the region’s autonomy, and jailed the movement’s leaders. The crisis recently came to a head when nine of the leaders were given lengthy prison terms, prompting huge protests and raising new questions about Spain’s political future.

Could Basque successes point to a potential solution? Perhaps, writes Reuters correspondent Sonya Dowsett—but it will cost lots of money. Though lawmakers have pointed to Basque autonomy as a potential model for peace in Catalonia, it’s unclear if the experiences of Basques can provide a path forward for Catalans.


Religious and Cultural Beliefs and Practices

Background
From 2011 to 2014, a program aimed at improving sexual and reproductive health practices among adolescent girls was implemented by the non-governmental organization BRAC, in partnership with the Government of South Sudan and with funding from the World Bank. BRAC had implemented similar programs in Liberia, Sierra Leone, Tanzania, and Uganda.

Objectives
Monitoring and evaluation data gathered by BRAC revealed that implementation of its program in South Sudan faced more challenges than comparable efforts in other African countries. This motivated BRAC to further investigate adolescent sexual and reproductive health beliefs and practices in South Sudan.

Methods
Data were gathered from seven key informant interviews and nine focus groups with adult women, adult men, and adolescent girls and boys in Torit and Magwi Counties in Eastern Equatoria state.

Results
The study found a strong cultural preference for girls to demonstrate their fertility by beginning to have children at an early age (13–16 years) and to have many children (8–12). It also found that education on HIV/AIDS had been effective.

Conclusions
To be effective in South Sudan, adolescent sexual and reproductive health programs must take the current social norms and practices into account and learn from the successes of HIV/AIDS education programs.


Beliefs and Attitudes of Mothers Towards Breastfeeding

Abstract
A proposal for research in progress is presented here, a project that joins lactation studies with intangible heritage studies (critical heritage studies). The aim is therefore twofold: (1) to recover the intangible cultural heritage, associated with the Plaza de las Pasiegas in Granada (Spain), created by the practice of breastfeeding by wet nurses, tracing the social and civic history of that culture of waged breastfeeding (and, by extension, waged child-rearing); and (2) to arrive at the contemporary breastfeeding culture known as lactivism, which is in turn creating what we consider new forms of living heritage.

Key words: wet nurses, lactivists, Plaza de las Pasiegas (Granada, Spain), breastfeeding, living heritage.

Abstract
“Mamar: mythos and lógos on human lactation” is an interdisciplinary special issue (or monograph) on human breastfeeding, convened from the human and social sciences. It comprises seventeen articles produced within the following general fields: ethics and philosophy, prehistory and history, anthropology (in its multiple variants: social and cultural, nursing, medical, historical), psychology, pedagogy, fine arts, and the health sciences (nursing and medicine). This interdisciplinarity follows from the object of study itself: breastfeeding, as a biocultural fact, requires plural epistemologies and perspectives if it is to be grasped in its richness and complexity, without the reductions and assumptions that have generally accompanied its historical understanding.
The objective of this compilation is to show the vitality and plurality of contemporary research on human breastfeeding in the human and social sciences, which it seeks to vindicate within the study of social and women’s history, feminism, and social action, especially through the contemporary movement of lactivism.

Key words: human lactation, breastfeeding, history / historiography of lactation, wet nurses, mercenary / waged lactation, “La Vanguardia”, ethnography of lactation, Ye’kuana people, attachment parenting, milk banks, milk kinship, food sovereignty, gender perspective, maternity(ies), renouncing breastfeeding, breastfeeding promotion policies, guilt, support groups, health promotion, neonatology units, human milk trafficking, “Leche materna” campaign (Unicef-Venezuela 2010), lactivism, mutual aid, resistance movements, Amazonia, Venezuela, Mexico, Chile, Spain, United Kingdom, Europe.

PREVALENCE OF RESPIRATORY INFECTION AND FEEDING CHARACTERISTICS OF CHILDREN UP TO THE SIXTH MONTH OF LIFE IN RIO GRANDE/RS

In developing countries, respiratory infections are responsible for a third of deaths and for half of hospitalizations among children under five, constituting a serious public health problem. The premature introduction of foods other than breast milk, besides exposing babies to greater contact with pathogenic agents, can expose the organism to foods it is not yet able to metabolize. The purpose of this study was to investigate the prevalence of respiratory infection and the main characteristics of the feeding pattern up to the sixth month of life in children under five, analyzed in 1995 and 2004 in Rio Grande/RS, Southern Brazil. Two cross-sectional studies were analyzed, evaluating 395 children in 1995 and 385 children in 2004. The percentage of losses was 2.1% in 1995 and 4.4% in 2004. The prevalence of respiratory infection was 34.9% (1995) and 28.3% (2004). There was a significant increase in the proportion of children with respiratory infection aged 0 to 6 months (1995: 6.5%; 2004: 11.0%), from low-income families (1995: 26.1%; 2004: 47.7%), and with height-for-age deficit (1995: 6.5%; 2004: 26.6%). The prevalence of exclusive breastfeeding until the sixth month of life among children who had respiratory infection was 1.5% in 1995 and 4.6% in 2004. The proportions of children breastfed predominantly (1995: 5.8%; 2004: 14.7%) and partially (1995: 8.9%; 2004: 22.0%) until six months increased significantly (p < 0.05). However, the most prevalent type of milk feeding in the sixth month of life remained predominantly mixed feeding (1995: 63.8%; 2004: 54.1%). Therefore, despite the increase in the proportion of breastfed children, these values fall well short of the targets proposed by the World Health Organization. Moreover, the introduction of other liquids and solid foods before the sixth month of life is still very common among children in this city. Further studies focused on the dietary risk factors related to the occurrence of respiratory diseases in children under five are needed in order to identify the factors that predispose these children to developing these pathologies.

Keywords: respiratory infection, breastfeeding, complementary feeding, children

ALMA MATER STUDIORUM – UNIVERSITÀ DI BOLOGNA
SCHOOL OF MEDICINE AND SURGERY
Degree Course in Midwifery

UNDERSTANDING D-MER:
an experience of reflection groups with midwifery students and midwives
Supervisor: Claudia Mereu
Presented by: Oriana Territo
Academic Year 2013-2014

INTRODUCTION: this study addresses the dysphoric milk ejection reflex (D-MER), which presents in breastfeeding women as an abrupt negative drop in mood, accompanied by unpleasant sensations of various kinds, setting in shortly before the release of milk. In the literature it has been described as a phenomenon lasting a few minutes, attributable to a sudden and/or excessive drop in dopamine at the moment of the milk ejection reflex.

OBJECTIVE: the aim of this work is to present D-MER, a still little-known topic, in order to understand how the diagnosis of this disorder can fit into current clinical practice and change midwifery care. The topic was put to two separate groups, of students and of midwives, through focus groups, with the purpose of presenting the subject and then gathering their reflections on the implications D-MER could have for the family, for training, and for clinical care. During the discussion, an effort was made to understand the participants' opinions on participatory methodologies as a tool for university and professional training.

CONTENTS: the work addresses a form of learning that promotes problem solving and the reflexivity that guides professionals and students in solving problems in professional and clinical placement settings, with particular attention to care situations never previously encountered. Based on current studies, it presents the characteristics of D-MER, its degrees of severity, its causes and therapies, and the seriousness of the negative consequences of a missed diagnosis or an incorrect differential diagnosis between D-MER and postpartum depression. Part of the text is dedicated to the testimonies of women who have lived through an experience of D-MER.

CONCLUSIONS: the midwifery profession is an indispensable resource for families and for the community; pregnancy and the puerperium, in particular, are moments in which midwifery care is delivered through experience and counseling skills that allow the professional to provide competent support to the woman and her family. Alongside this stands the need for other professional figures to know about D-MER, so as to ensure a multidisciplinary approach to care where necessary and to refer women with mild or moderate D-MER to midwives experienced in difficult breastfeeding. The focus group participants considered the opportunity to take part in training activities involving active participation and the practical elaboration of acquired knowledge extremely useful for their own education. The focus groups show that reflective training, open to exchange among professionals, offers greater stimulus toward the search for new care solutions and supports an evolving professionalism oriented toward sharing and continuous updating.



In general English usage, nude and naked are synonyms for a human being unclothed, but take on many meanings in particular contexts. Nude derives from Norman French, while naked is from the Anglo-Saxon. To be naked is more straightforward, not being properly dressed, or if stark naked, entirely without clothes. Nudity has more social connotations, and particularly in the fine arts, positive associations with the beauty of the human body. [3]

Further synonyms and euphemisms for nudity abound, including "birthday suit", "in the altogether" and "in the buff". [4] Partial nudity is often defined as not covering parts of the body that are deemed to be sexual, such as the buttocks and female breasts.

Two evolutionary processes are significant in human appearance: first, the biological evolution of early hominids from being covered in fur to being effectively hairless, followed by the cultural evolution of adornments and clothing.

Evolution of hairlessness

The first member of the genus Homo to be hairless was Homo erectus, originating about 1.6 million years ago. [5] The dissipation of body heat remains the most widely accepted evolutionary explanation for the loss of body hair in early members of the genus, whose sole surviving member is modern humans. [6] [7] [8] Less hair, and an increase in sweat glands, made it easier for their bodies to cool when they moved from shady forest to open savanna. This change in environment also resulted in a change in diet, from largely vegetarian to hunting. Pursuing game on the savanna further increased the need for regulation of body heat. [9] [10] Anthropologist and palaeobiologist Nina Jablonski posits that the ability to dissipate excess body heat through eccrine sweating helped make possible the dramatic enlargement of the brain, the most temperature-sensitive human organ. [11] Thus the loss of fur was also a factor in further adaptations, both physical and behavioral, that differentiated humans from other primates. Some of these changes are thought to be the result of sexual selection: by selecting more hairless mates, humans accelerated changes initiated by natural selection. Sexual selection may also account for the remaining human hair in the pubic area and armpits, which are sites for pheromones, while hair on the head continued to provide protection from the sun. [12]

A divergent explanation of humans' relative hairlessness holds that ectoparasites (such as ticks) residing in fur became problematic as humans became hunters living in larger groups with a "home base". Nakedness would also make the lack of parasites apparent to prospective mates. [13] However, this theory is inconsistent with the abundance of parasites that continue to exist in the remaining patches of human hair. [14]

The last common ancestor of humans and chimpanzees was only partially bipedal, often using its forelimbs for locomotion. Other primate mothers do not need to carry their young because the young can cling to their fur, but the loss of fur encouraged full bipedalism, allowing mothers to carry their babies with one or both hands. The combination of hairlessness and upright posture may also explain the enlargement of the female breasts as a sexual signal. [8]

Another theory is that the loss of fur also promoted mother-child attachment based upon the pleasure of skin-to-skin contact. This may explain the more extensive hairlessness of female humans compared to males. Nakedness affects sexual relationships as well: the duration of human intercourse is many times that of any other primate. [14]

With the loss of fur, darker, high-melanin skin evolved as a protection from ultraviolet radiation damage. [15] As humans migrated outside of the tropics, varying degrees of depigmentation evolved in order to permit UVB-induced synthesis of previtamin D3. [16] [17] The relative lightness of female compared to male skin in a given population may be due to the greater need for women to produce more vitamin D during lactation. [18]

Origin of clothing

Some of the technology for what is now called clothing may have originated to make other types of adornment, including jewelry, body paint, tattoos, and other body modifications, "dressing" the naked body without concealing it. [19] [20] According to Leary and Buttermore, body adornment is one of the changes that occurred in the late Paleolithic (40,000 to 60,000 years ago) in which humans became not only anatomically modern, but also behaviorally modern and capable of self-reflection and symbolic interaction. [21] More recent studies place the use of adornment at 77,000 years ago in South Africa, and 90,000–100,000 years ago in Israel and Algeria. [22]

The origin of complex, fitted clothing required the invention of fine stone knives for cutting skins into pieces, and the eyed needle for sewing. This was done by Cro-Magnons, who migrated to Europe around 35,000 years ago. [23] The Neanderthals occupied the same region but became extinct, in part because they could not sew: limited by their simple stone tools, they draped themselves with crudely cut skins, which did not provide the warmth needed to survive as the climate grew colder in the Last Glacial Period. [24] In addition to being less functional, such simple clothing would not have been habitually worn by Neanderthals, who were more cold-tolerant than Homo sapiens, and so would not have acquired the secondary functions of decoration and promoting modesty. [25]

The earliest archeological evidence of fabric clothing is inferred from representations in figurines in the southern Levant dated between 11,700 and 10,500 years ago. [26] The current empirical evidence for the origin of clothing comes from a 2010 study published in Molecular Biology and Evolution. That study indicates that the habitual wearing of clothing began at some point between 83,000 and 170,000 years ago, based upon a genetic analysis indicating when clothing lice diverged from their head louse ancestors. This suggests that the use of clothing likely originated with anatomically modern humans in Africa prior to their migration to colder climates, which clothing made possible. [27] A 2017 study published in Science estimated that anatomically modern humans evolved 260,000 to 350,000 years ago. [28] Thus, humans were naked in prehistory for at least 90,000 years.
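The 90,000-year figure appears to follow from pairing the most conservative endpoints of the two estimates above; a minimal worked bound:

$$ \underbrace{260{,}000\ \text{yr}}_{\text{most recent origin of modern humans}} - \underbrace{170{,}000\ \text{yr}}_{\text{earliest onset of habitual clothing}} = 90{,}000\ \text{yr, at minimum, without clothing} $$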

Origin of the nude in art

The naked human body was one of the first subjects of prehistoric art, including the numerous female figurines found throughout Europe, the earliest now dating from 40,000 years ago. The meaning of these objects cannot be determined; however, the exaggeration of breasts, bellies, and buttocks indicates symbolic rather than realistic interpretations. Possible readings include symbolism of fertility, abundance, or overt sexuality in the context of beliefs in supernatural forces. [29] [30]

Ancient history

The widespread habitual use of clothing is one of the changes that mark the end of the Neolithic and the beginning of civilization. Clothing and adornment became part of the symbolic communication that marked a person's membership in their society; thus nakedness meant being at the bottom of the social scale, lacking in dignity and status. [31] However, nudity in depictions of deities and heroes indicates other meanings of the unclothed body in ancient civilizations. The association of nakedness with shame and sexuality was unique to Judeo-Christian societies. [32]

In ancient Mesopotamia, most people owned a single item of clothing, usually a linen cloth that was wrapped and tied. Nudity meant being indebted, or if a slave, not being provided with clothes. [33] In the Uruk period there was recognition of the need for functional and practical nudity while performing many tasks, although the nakedness of workers emphasized the social difference between servants and the elite, who were clothed. [34]

For the average person, clothing changed little in ancient Egypt from the Early Dynastic Period until the Middle Kingdom, a span of 1,500 years. Both men and women were bare-chested and barefoot, and wore skirts called schenti, which evolved from loincloths and resembled modern kilts. Servants and slaves were nude or wore loincloths. Laborers might be nude while doing tasks that made clothing impractical, such as fishing or doing laundry in a river. Women entertainers performed naked. Children might go without clothing until puberty, at about age 12. [35] Only women of the upper classes wore the kalasiris, a dress of loose draped or translucent linen which came from just above or below the breasts to the ankles. [36] It was not until the later periods, in particular the New Kingdom (1550–1069 BCE), that functionaries in the households of the wealthy also began wearing more refined dress, and upper-class women wore elaborate dresses and ornamentation which covered their breasts. These later styles are often shown in film and TV as representing ancient Egypt in all periods. [36]

Greece

Male nudity was celebrated in ancient Greece to a greater degree than any culture before or since. [37] [38] The status of freedom, maleness, privilege, and physical virtues were asserted by discarding everyday clothing for athletic nudity. [39] With the association of the naked body with the beauty and power of the gods, nudity became a ritual costume. [40] The female nude emerged as a subject for art in the 5th century BCE, illustrating stories of women bathing both indoors and outdoors. While depictions of nude women were erotic in nature, there was no attribution of impropriety as would be the case for such images in later Western culture. However, the passive images reflected the unequal status of women in society compared to the athletic and heroic images of naked men. [41] In Sparta during the Classical period, women were also trained in athletics, and while scholars do not agree whether they also competed in the nude, the same word (gymnosis, naked or lightly clothed) was used to describe the practice. It is generally agreed that Spartan women were nude only for specific religious and ceremonial purposes. [42]

Late antiquity

The Greek traditions were not maintained in later Etruscan and Roman athletics, because public nudity had become associated with homoeroticism. Roman masculinity involved prudishness and paranoia about effeminacy. [43] The toga was essential to announce the status and rank of male citizens of the Roman Republic (509–27 BCE). [44] The poet Ennius declared, "exposing naked bodies among citizens is the beginning of public disgrace", and Cicero endorsed his words. [45]

In the Roman Empire (27 BCE – 476 CE), the status of the upper classes was such that public nudity was of no concern for men, nor for women if seen only by their social inferiors. [46] An exception was the Roman baths (thermae), which had many social functions. [47] Mixed nude bathing may have been standard in most public baths up to the fourth century CE. [48] The Fall of the Western Roman Empire brought many social changes, including the rise of Christianity. Early Christians generally inherited norms of dress from Jewish traditions. The exception was baptism, which was originally by full immersion and without clothes. Jesus was also originally depicted nude, as would have been the case in Roman crucifixions. [49] The Adamites, an obscure Christian sect in North Africa originating in the second century, worshiped in the nude, professing to have regained the innocence of Adam. [50]

Clothing used in the Middle East, which loosely envelops the entire body, changed little for centuries. In part, this consistency arises from the fact that such clothing is well suited to the climate, protecting the body from dust storms while also allowing cooling by evaporation. [51] In the societies based upon the Abrahamic religions (Judaism, Christianity, and Islam), modesty generally prevailed in public, with clothing covering all parts of the body of a sexual nature. The Torah set forth laws regarding clothing and modesty (tzniut) which also separated Jews from the other peoples among whom they lived. [52]

The late fourth century CE was a period of both Christian conversion and standardization of church teachings, in particular on matters of sex. The dress or nakedness of women not deemed respectable was of lesser importance. [53] A distinction was drawn between adultery, which injured third parties (the woman's husband, father, and male relatives), and fornication with an unattached woman, likely a prostitute, courtesan, or slave, which was a lesser sin since it had no male victims; in a patriarchal society, that might mean no victim at all. [54]

In stories written in China as early as the fourth century BCE, nudity is presented as an affront to human dignity, reflecting the belief that "humanness" in Chinese society is not innate, but is earned by correct behavior. However, nakedness could also be used by an individual to express contempt for others in their presence. In other stories, the nudity of women, emanating the power of yin, could nullify the yang of aggressive forces. [55]

Nudity in mixed-gender public baths was common in Japan before the effects of Western influence, which began in the 19th century and became extensive during the American occupation after World War II. The practice continues at a dwindling number of hot springs (konyoku) outside of urban areas. [56] Another Japanese tradition was the women free-divers (ama) who for 2,000 years until the 1960s collected seaweed and shellfish wearing only loincloths. Their nakedness was not shocking, since women farmers often worked bare-breasted during the summer. [57]

Post-classical history

The period between the ancient and modern world—approximately 500 to 1450 CE—saw an increasingly stratified society in Europe. At the beginning of the period, everyone other than the upper classes lived in close quarters and did not have the modern sensitivity to private nudity, but slept and bathed together naked as necessary. [48] Later in the period, with the emergence of a middle class, clothing in the form of fashion was a significant indicator of class, and thus its lack became a greater source of embarrassment. [58]

Until the beginning of the eighth century, Christians were baptized naked to represent that they emerged from baptism without sin. The disappearance of nude baptism in the Carolingian era marked the beginning of the sexualization of the body by Christians that had previously been associated with paganism. [49] Sects with beliefs similar to the Adamites, who worshiped naked, reemerged in the early 15th century. [59]

Although there is a common misconception that Europeans did not bathe in the Middle Ages, public bath houses—usually segregated by sex—were popular until the 16th century, when concern for the spread of disease closed many of them. [60] The Roman baths in Bath, Somerset, were rebuilt, and used by both sexes without garments until the 15th century. [61]

In Christian Europe, the parts of the body that were required to be covered in public did not always include the female breasts. In depictions of the Madonna from the 14th century, Mary is shown with one bared breast, symbolic of nourishment and loving care. [62] During a transitional period, there continued to be positive religious images of saints, but also depictions of Eve indicating shame. [63] By 1750, artistic representations of the breast were either erotic or medical. This eroticization of the breast coincided with the persecution of women as witches. [64]

The practice known as veiling of women in public predates Islam in Persia, Syria, and Anatolia. Islamic clothing for men covers the area from the waist to the knees. The Qurʾān provides guidance on the dress of women, but not strict rulings; [51] such rulings may be found in the Hadith. In the medieval period, Islamic norms became more patriarchal and very concerned with the chastity of women before marriage and their fidelity afterward. Women were not only veiled but segregated from society, with no contact with men outside of close kinship, whose presence defined the difference between public and private spaces. [65]

Of particular concern for both Islam and early Christians, as they extended their control over countries that had previously been part of the Byzantine or Roman empires, was the local custom of public bathing. While Christians were mainly concerned about mixed-gender bathing, which had been common, Islam also prohibited nudity for women in the company of non-Muslim women. [66] In general, the Roman bathing facilities were adapted to separate the genders, with bathers retaining at least a loincloth, as in the Turkish bath of today.

Modern history

Early modern

The Christian association of nakedness with shame and anxiety became ambivalent during the Renaissance as a result of the rediscovered art and writings of ancient Greece, which offered an alternative tradition of nudity as symbolic of innocence and purity, understood in terms of the state of man "before the fall". [67] The meaning of nudity in Europe was also changed in the 1500s by reports of naked inhabitants of the Americas, and by the African slaves brought to Italy by the Portuguese. Slavery and colonialism together marked the beginning of the modern association of public nakedness with savagery. [68]

Some human activities continued to require states of undress in the presence of others. Opinions regarding the health benefits of bathing varied after the 16th century, when many European public bath houses closed due to concerns about the spread of disease, [60] but were generally favorable by the 19th century. This led to the establishment of public bath houses for those who had no bathing facilities in their homes, though gender segregation was maintained. In a number of European cities where the clientele included the middle class, some bath houses became social establishments. [69]

In the United States, where the middle class more often had private baths in their homes, public bath houses were built for the poor, in particular for urban immigrant populations. With the adoption of showers rather than tubs, bathing facilities were added to schools and factories. [69]

The Tokugawa period in Japan (1603-1868) was defined by the social dominance of hereditary classes, with clothing a regulated marker of status and little nudity among the upper classes. However, working populations in both rural and urban areas often dressed in only loincloths, including women in hot weather and while nursing. Lacking baths in their homes, they also frequented public bathhouses where everyone was unclothed together. [70]

Colonialism

The age of colonialism was marked by frequent encounters between Christian and Muslim cultures and the indigenous peoples of the tropics, leading to the stereotype of the "naked savage". [71] In his diaries, Columbus writes that the natives, both men and women, were entirely naked and gentle. To the colonizers, this also meant that they were less than fully human, and exploitable. [72] Initially Islam exerted little influence beyond large towns, outside of which paganism continued. Traveling in Mali in the 1350s, the Muslim scholar Ibn Battuta was shocked by the casual relationships between men and women even at the court of sultans, and by the public nudity of female slaves and servants. [73]

Non-western cultures during the period were naked only by comparison to Western norms, the genitals and sometimes the entire lower body of adults being covered by garments in most situations. Lacking the western concept of shame regarding the body, such garments might be removed in public for practical or ceremonial purposes. Children until puberty and sometimes women until marriage might be naked as having "nothing to hide". [74]

From the 17th century, European explorers viewed the lack of clothing they encountered in Africa and Oceania as representative of a primitive state of nature, justifying their own superiority, even as they continued to admire the nudity of Greek statues. A distinction was made by colonizers between idealized nudity in art and the nakedness of indigenous people, which was uncivilized and indicative of racial inferiority. [75] [76]

Depictions of naked savages entered European popular culture in the 18th century in popular stories of tropical islands. In particular, Europeans became fascinated by the image of the Pacific island woman with bare breasts. [77] Dressing Africans in European clothes to cover their nakedness was part of converting them to Christianity. [78] In much of Asia, traditional dress covers the entire body, [79] and while much was made of Polynesian nakedness, European cloth was welcomed as part of traditions of wrapping the body. [80] [81]

In the 19th century, photographs of naked indigenous peoples began circulating in Europe without a clear distinction between those created as commercial curiosities (or erotica) and those claiming to be scientific, or ethnographic, images. Given the state of photography, it is unclear which images were posed rather than representative of everyday attire. [82] [83] George Basden, a missionary and ethnographer who lived with the Igbo people of Nigeria, published two volumes of photographs in the 1920s and 1930s. The books presented images of unclothed but elaborately decorated Igbo women as indicating their high status as eligible brides; such women would not have thought of themselves as naked. [84]

In the early 20th century, tropical countries became tourist destinations. A German tourist guide to Bali, issued from the 1920s, added to the promotion of the island as an "Eden" for Western visitors by describing the beauty of Balinese women, who were bare-breasted in everyday life and unclothed while bathing in the ocean. Soon, however, the Dutch colonial administration began issuing conflicting orders regarding proper dress, which had limited effect, with some Balinese supporting tradition and others modernization. [85]

Indigenous woman in German East Africa, early 20th century

Three Igbo women in the early 20th century

Fijian girl (1908). The locks of hair falling on her right shoulder show that she is unmarried. When she weds they will be cut.

Group portrait of a Balinese family (1929)

Late modern and contemporary

With the opening of Japan to European visitors in the Meiji era (1868-1912), the previously normal states of undress and the custom of mixed public bathing became an issue for leaders concerned with Japan's international reputation. A law was established with fines for those who violated the ban on undress. Although often ignored or circumvented, the law had the effect of sexualizing the naked body in situations that had not previously been erotic. [86]

Nudism (in German Freikörperkultur, "free body culture") originated in Europe in the late 19th century as part of working class opposition to industrialization. Nudism spawned a proselytizing literature in the 1920s and 1930s and was brought to America by German immigrants in the 1930s. [87] While Christian moralists tended to condemn nudism, other Christians argued for the moral purity of the nude body compared to the corruption of the scanty clothing of the era. [88] Its proponents believed that nudism could combat social inequality, including sexual inequality. [89]

In the early 20th century, the attitudes of the general public toward the human body reflected rising consumerism, concerns regarding health and fitness, and changes in clothing fashions that sexualized the body. Even so, members of English families report that in the 1920s to 1940s they never saw other family members undressed, including those of the same gender, and modesty continued to prevail between married couples, even during sex. [90] By contrast, bodily modesty is not part of the Finnish identity, owing to the universal use of the sauna, a historical tradition that has been maintained and that teaches from an early age that nakedness need not have anything to do with sex. [91] [92]

In Germany between 1910 and 1935, nudist attitudes toward the body were expressed in sports and in the arts. In the 1910s a number of solo female dancers performed in the nude. [93] [94] Advocates of the health benefits of sun and fresh air instituted programs of exercise in the nude for children in mixed-gender groups. Adolf Koch founded thirteen Freikörperkultur (FKK) schools. [95] With the rise of Nazism in the 1930s, the nudist movement split ideologically, the socialists adopting the views of Koch and seeing his programs as part of improving the lives of the working class. Although many Nazis opposed nudity, others used it to extol the Aryan race as the standard of beauty, as reflected in the Nazi propaganda film Olympia, directed by Leni Riefenstahl. [96]

In the United States and other Western countries for much of the 20th century, male nudity was the norm in gender-segregated activities including summer camps, [97] swimming pools, [98] [99] and communal showers, [100] based on cultural beliefs that females need more privacy than males. [101] For boys, this expectation might extend to public settings, as in 1909, when The New York Times reported that the youngest boys competed in the nude at a public elementary school swimming competition. [102] Hygiene was given as the reason for an official guideline requiring male nudity in indoor pools in 1926, [103] which remained in place until 1962 but continued to be observed into the 1970s by the YMCA and by schools with gender-segregated classes. [104] [105] [106] [107] The era of nude swimming by boys in indoor pools declined as mixed-gender use was allowed, [99] and ended when gender equality in facilities was mandated by Title IX of the Education Amendments of 1972. In the 21st century, the practice of nude swimming is largely forgotten, or even denied ever to have existed. [105]

Hippies and other participants in the counterculture of the 1960s embraced nudity as part of their daily routine and to emphasize their rejection of anything artificial. [108] In 1974, an article in The New York Times noted an increase in American tolerance for nudity, both at home and in public, approaching that of Europe. [109] By 1998, American attitudes toward sexuality had continued to become more liberal than in prior decades, but the reaction to total nudity in public remained generally negative. [110] However, some elements of the counterculture, including nudity, continued with events such as Burning Man. [111]

Norms related to nudity are associated with norms regarding personal freedom, human sexuality, and gender roles, which vary widely among contemporary societies. Situations where private or public nudity is accepted vary. Some people practice nudism within the confines of "nudist camps" or clothing-optional resorts, while naturists seek more open acceptance of nudity in everyday life and in public spaces. [112]

Historically in Western societies, there are two cultural traditions relating to nudity in various contexts. The first tradition comes from the ancient Greeks, who saw the naked body as the natural state and as essentially positive. The second is based upon the Abrahamic religions—Judaism, Christianity, and Islam—which have viewed being naked as shameful and essentially negative. The fundamental teachings of these religions prohibit public and sometimes also private nudity. The interaction between the Greek classical and later Abrahamic traditions has resulted in Western ambivalence, with nudity acquiring both positive and negative meanings in individual psychology, in social life, and in depictions such as art. [113] While public modesty prevails in more recent times, organized groups of nudists or naturists emerged with the stated purpose of regaining a natural connection to the human body and nature, sometimes in private spaces but also in public. Naturism in the United States, meanwhile, remains largely confined to private facilities, with few "clothing optional" public spaces compared to Europe. In spite of the liberalization of attitudes toward sex, Americans remain uncomfortable with complete nudity. [110]

In Africa, there is a sharp contrast between the attitude toward nudity in Islamic countries and the attitude toward nudity in certain sub-Saharan countries that never abandoned, or are reasserting, precolonial norms.

In Asia, the norms regarding public nudity are in keeping with the cultural values of social propriety and human dignity. Rather than being perceived as immoral or shameful, nakedness is perceived as a breach of etiquette and perhaps as an embarrassment. In China, saving face is a powerful social force. In Japan, proper behavior included a tradition of mixed gender public baths before Western contact began in the 19th century, and proper attire for farmers and other workers might be a loincloth for both men and women. In India, the conventions regarding proper dress do not apply to monks in some Hindu and Jain sects who reject clothing as worldly.

Indigenous traditions

The encounter between the indigenous cultures of Africa, the Americas and Oceania with Europeans had a significant effect on both cultures. [114] Western ambivalence could be expressed by responding to the nakedness of natives as either a sign of rampant sexuality or of the innocence that preceded the Fall. [115]

Acharya Vidyasagar, a contemporary Digambara Jain monk

Two women of the Zo'é tribe of Pará State, Brazil

A Swazi woman participating in the Umhlanga ceremony in Eswatini - 2006

Young Hamer woman in southern Ethiopia (near Turmi) - 2006

In India, priests of the Digambara ("skyclad") sect of Jainism and some Hindu Sadhus refrain from wearing clothing to symbolize their rejection of the material world. [116] [117] In Bangladesh, the Mru people have resisted centuries of Muslim and Christian pressure to clothe their nakedness as part of religious conversion. Most retain their own religion, which includes elements of Buddhism and Animism, as well as traditional clothing: a loincloth for men and a skirt for women. [118]

In sub-Saharan Africa, full or partial nudity is observed among some Burkinabese and Nilo-Saharan peoples (e.g. the Nuba and Surma) during particular occasions, for example stick-fighting tournaments in Ethiopia. [119] The revival of post-colonial culture is asserted in the adoption of traditional dress, young women wearing only beaded skirts and jewelry, in the Umkhosi Womhlanga (Reed Dance) of the Zulu and Swazi. [120] However, the authenticity and propriety of the paid performance of "bare chested" Zulu girls for international tourists is sometimes questioned. [121] Other examples of ethnic tourism reflect the visitor's desire to experience what they imagine to be an exotic culture, which includes nudity. [122]

In Brazil, the Yawalapiti, an indigenous Xingu tribe in the Amazon Basin, practice a funeral ritual known as Quarup to celebrate life, death, and rebirth. The ritual involves the presentation of all young girls who have begun menstruating since the last Quarup and whose time has come to choose a partner. [123] The Awá hunters, an indigenous people of Brazil living in the eastern Amazon rainforest, are completely naked apart from a piece of string decorated with bird feathers tied to the end of the penis. This minimalist dress code reflects the spirit of the hunt, and being overdressed may be considered ridiculous or inappropriate. [124]

Gender differences

In Western cultures, shame can result from not living up to the ideals of society with regard to physical appearance. Historically, such shame has affected women more than men. With regard to their naked bodies, the result is a tendency toward self-criticism by women, while men are less concerned by the evaluation of others. [125] In patriarchal societies, which include much of the world, norms regarding proper attire and behavior are more strict for women than for men, and the judgements for violation of these norms are more severe. [126]

In much of the world, the modesty of women is a matter not only of social custom but of the legal definition of indecent exposure. In the United States, the exposure of female nipples is a criminal offense in many states and is not usually allowed in public. [127] The inclusion of female breasts within the definition of public indecency depends upon definitions of what is allowed in public spaces and what constitutes sexual indecency. Individual women who have contested indecency laws by baring their breasts in public assert that their behavior is not sexual. In Canada, the law was changed to include a definition of a sexual context in order for behavior to be indecent. [128]

The "topfreedom" movement in the United States promotes equal rights for women to be naked above the waist in public on the same basis that would apply to men in the same circumstances. [129] The illegality of topfreedom is viewed as institutionalization of negative cultural values that affect women's body image. The law in New York State was challenged in 1986 by nine women who exposed their breasts in a public park, which led to nine years of litigation culminating with an opinion by the Court of Appeals that overturned the convictions on the basis of the women's actions not being lewd, rather than overturning the law on the basis of equal protection, which is what the women sought. While the decision gave women more freedom to be top-free (e.g. while sunbathing), it did not give them equality with men. Other court decisions have given individuals the right to be briefly nude in public as a form of expression protected by the First Amendment, but not on a continuing basis for their own enjoyment. [130]

Breastfeeding

Breastfeeding in public is forbidden in some jurisdictions, not regulated in others, and protected as a legal right in public and the workplace in still others. Where public breastfeeding is unregulated or legal, mothers may be reluctant to do so because other people may object. [131] [132] [133] The issue of breastfeeding is part of the sexualization of the breast in many cultures, and the perception of threat in what others perceive as non-sexual. [128] Pope Francis came out in support of public breastfeeding at church services soon after assuming the Papacy. [134]

Sexual and non-sexual nudity

The social context defines the cultural meaning of nudity that may range from the sacred to the profane. There are activities where freedom of movement is promoted by full or partial nudity. The nudity of the ancient Olympics was part of a religious practice. Athletic activities are also appreciated for the beauty of bodies in motion (as in dance), but in the post-modern media athletic bodies are often taken out of context to become purely sexual, perhaps pornographic. [135]

There is also recognition of mundane situations in which nakedness is entirely non-sexual. These include bathing, changing clothes, medical treatment or examination, and strenuous physical activity. In the United States in the 1960s and 1970s, the number of non-sexual contexts for nudity expanded to include both social practices such as streaking or nude beaches, and greater acceptance of nudity in artistic performances. In the 21st century, many of these situations have become sexualized. [136]

The sexual nature of nudity is defined by the gaze of others. Studies of naturism find that its practitioners adopt behaviors and norms that suppress the sexual responses while practicing social nudity. [137] Such norms include refraining from staring, touching, or otherwise calling attention to the body while naked. [138] However, some naturists do not maintain this non-sexual atmosphere, as when nudist resorts host sexually-oriented events. [139]

Private nudity

Individuals vary in how comfortable they are being nude, even in private situations.

According to a 2004 U.S. survey by ABC News, 31% of men and 14% of women report sleeping in the nude. [140] In a 2018 U.S. survey by USA Today, 58% reported that they slept in the nude; by generation, 65% of Millennials did so, but only 39% of Baby Boomers. [141]

Body image

Body image is a person's perceptions and feelings regarding their own body, which affects self-esteem and life satisfaction. Studies indicate not only that social nudity promotes a positive body image, but that nudity-based interventions are helpful for those with a negative body image. [142]

Concepts of privacy

Societies in continental Europe conceive of privacy as protecting a right to respect and personal dignity. Europeans maintain their dignity, even naked where others may see them, including sunbathing in urban parks. In America, the right to privacy is oriented toward values of liberty, especially in one's home. Americans see public nakedness as a surrender of "any reasonable expectation of privacy". Such cultural differences may make some laws and behaviors of other societies seem incomprehensible, since each culture assumes that their own concepts of privacy are intuitive, and thus human universals. [143]

High and low context cultures

High and low context cultures were defined by Edward T. Hall. The behaviors and norms of a high context culture depend upon shared implicit intuitions that operate within a social situation, while in a low context culture behavior is more dependent upon explicit communications. [144] An example of this distinction was found in research on the behavior of French and German naturists on a nude beach. Germans are extremely low in cultural context: they are characterized by individualism, alienation, estrangement from other people, little body contact, low sensitivity to nonverbal cues, and segmentation of time and space. By contrast, the French, in their personal lives, are relatively high context: they interact within closely knit groups, they are sensitive to nonverbal cues, and they engage in relatively high amounts of body contact. To maintain public propriety on a nude beach, German naturists avoided touching themselves and others and avoided any adornments or behaviors that would call attention to the body. French naturists, on the other hand, were more likely than Germans to wear make-up and jewelry and to touch others as they would while dressed. [145]

Morality

The moral ambiguity of nudity is reflected in its many meanings, often expressed in the metaphors used to describe cultural values, both positive and negative. [146]

One of the first—but now obsolete—meanings of nude in the 16th century was "mere, plain, open, explicit" as reflected in the modern metaphors "the naked truth" and "the bare facts". Naturists often speak of their nakedness in terms of a return to the innocence and simplicity of childhood. The term naturism is based upon the idea that nakedness is connected to nature in a positive way as a form of egalitarianism, that all humans are alike in their nakedness. Nudity also represents freedom: the liberation of the body is associated with sexual liberation, although many naturists tend to downplay this connection. In some forms of group psychotherapy, nudity has been used to promote open interaction and communication. Religious persons who reject the world as it is, including all possessions, may practice nudism or use nakedness as a protest against an unjust world. [147]

Many of the negative associations are the inverse of positive ones. If nudity is truth, nakedness may be an invasion of privacy or the exposure of uncomfortable truths, a source of anxiety. The strong connection of nudity to sex produces shame when naked in contexts where sexuality is deemed inappropriate. Rather than being natural, nakedness is associated with savagery, poverty, criminality, and death. To be deprived of clothes is punishment, humiliating and degrading. [148]

Confronted with this ambiguity, some individuals seek to resolve it by working toward greater acceptance of nudity for themselves and others. The majority of naturists go through stages during which they gradually learn a new set of values regarding the human body. [149] However, Krista Thomason notes that negative emotions including shame exist because they are functional, and that human beings are not perfect. [150]

Moral emotions

Shame is one of the moral emotions often associated with nudity. While guilt is the emotion experienced in response to a particular wrong action, shame is a more general and long-lasting self-assessment. [151] Shame is often thought of as positive in response to a failure to act in accordance with moral values, thus motivating an individual to do better in the future. However, shame is often negative as the response to perceived failures to live up to unrealistic expectations. The shame regarding nudity is one of the classic examples of the emotion, yet rather than being a positive motivator, it is considered unhealthy, standing in the way of developing a positive self-image. [152]

Others argue that the shame felt when naked in public is due to valuing modesty and privacy as socially positive. [153] However, the response to such public exposure of normally private behavior is embarrassment (like guilt, also an emotion focused on a particular event or action), rather than shame. [154] The absence of shame, or any other negative emotions regarding being naked, depends upon becoming unselfconscious while nude, which is the state both of children and of those who practice naturism. This state is more difficult for women in Western culture, given the social presumption that women's bodies are always being observed and judged, not only by men but also by other women. In a naturist environment, because everyone is naked, it becomes possible to dilute the power of social judgements and experience freedom. [125] [155]

The universality of shame is not supported by anthropological studies, which do not find the use of clothing to cover the genital areas in all societies, but instead the use of adornments to call attention to the sexuality of the body. [156]

Religious interpretations

Abrahamic religions

Among ancient cultures, the association of nakedness with sexual sin was peculiar to Abrahamic religions. In Mesopotamia and Egypt, nakedness was embarrassing due to the social connotations of low status and deprivation rather than shame regarding sexuality. [157] Nudity was also not associated with sexuality due to the prevalence of functional nudity, where clothing was removed while engaged in any activity for which it would be impractical. [158]

The meaning of the naked body in Judaism, Christianity, and Islam was defined by a creation narrative in which Adam and Eve, the first man and woman, were naked and unashamed until they ate the forbidden fruit of the Tree of Knowledge of Good and Evil, after which they sought to cover their genitals. The philosophical meaning of this myth is unclear. Was nakedness innocent before, becoming evil only after the forbidden knowledge was gained? The feeling of shame is also problematic, since it is understood as a response to being seen by others, a social context that did not yet exist. [159] According to German philosopher Thorsten Botz-Bornstein, interpretations of Genesis have placed responsibility for the fall of man and original sin on Eve, and, therefore, all women. As a result, the nudity of women is deemed more shameful personally and corrupting to society than the nakedness of men. [160]

Sects of Christianity through history have incorporated nudity into worship practices, but these have been deemed heretical. There have been Christian naturists in the United States since the 1920s, but as a social and recreational practice rather than an organized religion.

Indian religions

Some Hindu and Jain practitioners of asceticism reject worldly goods, including clothing, wearing only a loincloth or going naked.

Child development

The National Child Traumatic Stress Network issued a report in 2009 on child sexual development in the United States. The report asserted that children have a natural curiosity about their own bodies and the bodies of others that ought to be addressed in an age-appropriate manner. According to the report:

  • Children less than four years old will normally touch their own private parts, look at the private parts of others, and remove their clothes wanting to be naked
  • Between ages four and six, children will be more actively curious. They will attempt to see others dressing or undressing, or will perhaps "play doctor"
  • Between ages six and twelve, children will expand their curiosity to images of undressed people available in the media. They will develop a need for privacy regarding their own bodies and begin to be sexually attracted to peers.

The report recommended that parents learn what is normal in regard to nudity and sexuality at each stage of a child's development and refrain from overreacting to their children's nudity-related behaviors unless there are signs of a problem (e.g. anxiety, aggression, or sexual interactions between children not of the same age or stage of development). [161] The general advice for caregivers is to find ways of setting boundaries without giving the child a sense of shame. [162]

In childcare settings outside the home there is difficulty in determining what behavior is normal and what may be indicative of child sexual abuse (CSA). In 2018 an extensive study of Danish childcare institutions (which had, in the prior century, been tolerant of child nudity and playing doctor) found that contemporary policy had become restrictive as the result of childcare workers being charged with CSA. However, while CSA does occur, the response may be due to "moral panic" that is out of proportion with its actual frequency, and over-reaction may have unintended consequences. Strict policies are being implemented not to protect children from a rare threat, but to protect workers from the accusation of CSA. The policies have created a split between childcare workers who continue to believe that behaviors involving nudity are a normal part of child development and those who advocate that children be closely supervised to prohibit such behavior. [163]

The naturist/nudist point of view is that children are "nudists at heart" and that naturism provides the ideal environment for healthy development. It is noted that modern psychology generally agrees that children can benefit from an open environment where the bodies of others their own age, of both sexes, are not a mystery. However, there is less agreement regarding children and adults being nude. While some doctors have taken the view that some exposure of children to adult nudity (particularly parental nudity) may be healthy, others—notably Benjamin Spock—disagreed. Spock's view was later attributed to the lingering effect of Freudianism on the medical profession. [164] Lake Como Family Nudist Resort near Lutz, Florida hosts a summer camp for children and young people. [165]

In their 1986 study on the effects of social nudity on children, Smith and Sparks concluded that "the viewing of the unclothed body, far from being destructive to the psyche, seems to be either benign or to actually provide positive benefits to the individuals involved". [166] As recently as 1996 the YMCA maintained a policy of allowing young children to accompany their parents into the locker room of the opposite gender, which some health care professionals questioned. [167] A contemporary solution has been to provide separate family changing rooms. [168]

Sex education

In a 2001 survey of attitudes toward sex education in Greece, of those who were asked what effect seeing nudity in the home had on children, 32% said positive, 30% negative, 36% said 'no effect' or 'don't know'. However, there was a clear opinion (86%) that seeing sexual display outside the home had a negative effect. [169]

France, Norway, the Netherlands and the United States show a broad range of openness toward nudity and sexuality as indicated by childhood experiences and sex education practices. The health textbooks in Finnish secondary schools emphasize the normalcy of non-sexual nudity in saunas and gyms as well as openness to the appropriate expression of developing sexuality. [170] In general, the United States remains uniquely puritanical in its moral judgements compared to other Western, developed nations. [171]

Tous à Poil! (Everybody Gets Naked!), a French picture book for children, was first published in 2011 with the stated purpose of presenting a view of nudity in opposition to media images of the ideal body but instead depicting ordinary people swimming naked in the sea including a teacher and a policeman. In a 2014 cable news appearance, Jean-François Copé, then leader of the political party Union for a Popular Movement (UMP) decried the book as undermining the dignity of persons in authority. [172] Attempts by UMP to exclude the book from schools prompted French booksellers and librarians to hold a nude protest in support of the book's viewpoint. [173]

As part of a science program on Norwegian public television (NRK), a series on puberty intended for 8–12-year-olds includes explicit information and images of reproduction, anatomy, and the changes that are normal with the approach of puberty. Rather than diagrams or photos, the videos were shot in a locker room with live nude people of all ages. The presenter, a physician, is relaxed about close examination and touching of relevant body parts, including genitals. While the videos note that the age of consent in Norway is 16, abstinence is not emphasized. In a subsequent series for teens and young adults, real people were recruited to have sex on TV as counterbalance to the unrealistic presentations in advertising and porn. [174] A 2020 episode of a Danish TV show for children presented five nude adults to an audience of 11–13-year-olds with the lesson "normal bodies look like this" to counter social media images of perfect bodies. [175]

As of 2015, 37 U.S. states required that sex education curricula include lessons on abstinence and 25 required that a "just say no" approach be stressed. Studies show that early and complete sex education does not increase the likelihood of becoming sexually active, but leads to better health outcomes overall. [176] The Netherlands also has open and comprehensive sex education beginning as early as age 4, with similar health outcomes, in addition to promoting social benefits such as gender equality. Young children often play outdoors or in public wading pools nude. [177]

In a 2018 survey of predominantly white middle-class college students in the United States, only 9.98% of women and 7.04% of men reported seeing real people (either adults or other children) as their first childhood experience of nudity. Many were accidental (walking in on someone) and were more likely to be remembered as negative by women. Only 4.72% of women and 2% of men reported seeing nude images as part of sex education. A majority of both women (83.59%) and men (89.45%) reported that their first image of nudity was in film, video, or other mass media. [178]

A 2009 report issued by the CDC comparing the sexual health of teens in France, Germany, the Netherlands and the United States concluded that if the US implemented comprehensive sex education similar to the three European countries there would be a significant reduction in teen pregnancies, abortions and the rate of sexually transmitted diseases, and save hundreds of millions of dollars. [179]

Nudity in the home

In 1995, Gordon and Schroeder contended that "there is nothing inherently wrong with bathing with children or otherwise appearing naked in front of them", noting that doing so may provide an opportunity for parents to provide important information. They noted that by ages five to six, children begin to develop a sense of modesty, and recommended to parents who desire to be sensitive to their children's wishes that they respect a child's modesty from that age onwards. [180] In a 1995 review of the literature, Paul Okami concluded that there was no reliable evidence linking exposure to parental nudity to any negative effect. [181] Three years later, his team finished an 18-year longitudinal study that showed, if anything, such exposure was associated with slight beneficial effects, particularly for boys. [182] In 1999, psychologist Barbara Bonner recommended against nudity in the home if children exhibit sexual play of a type that is considered problematic. [183] In 2019, psychiatrist Lea Lis recommended that parents allow nudity as a natural part of family life when children are very young, but to respect the modesty that is likely to emerge with puberty. [184]

Semi-public nudity

In a 2009 article for the New York Times "Home" section, Julie Scelfo interviewed parents regarding the nudity of small children at home in situations which might include visitors outside the immediate household. The situations ranged from a three-year-old being naked at a large gathering to the use of a backyard swim pool becoming an issue when the children of disapproving neighbors participated. While the consensus was to allow kids to be kids up to the age of five, there was acknowledgment of the possible discomfort of adults who consider such behavior to be inappropriate. While opponents of child nudity referred to the danger of pedophilia, proponents viewed innocent nudity as beneficial compared to the sexualization of children in toddler beauty pageants with makeup and "sexy" outfits. [185]

In Russia, a survey on proper dress for girls found acceptance of nudity for preschool-age children at the beach, but nowhere else. However, girls being bare-chested was acceptable to some up to puberty. [186]

Gallery: recreational swim in the Greenbrier River, West Virginia (1946); fountain in Israel (between 1947 and 1950); bathing in the center of East Berlin, East Germany (1958); a nude family at Lake Senftenberg in East Germany (1980s).

Worldwide, laws regarding clothing specify what parts of the body must be covered, prohibiting complete nudity in public except for those jurisdictions that allow nude recreation.

Specific laws may either require or prohibit religious attire (veiling) for women. In a survey using data from 2012 to 2013, there were 11 majority Muslim countries where women must cover their entire bodies in public, which may include the face. There were 39 countries, mostly in Europe, that had some prohibition of religious attire, in particular face coverings in certain situations, such as government buildings. Within Russia, laws may either require or prohibit veiling depending upon location. [187]

The brief, sudden exposure of parts of the body normally hidden from public view has a long tradition, taking several forms.

  • Flashing refers to the brief public exposure of the genitals or female breasts. [188] At Mardi Gras in New Orleans flashing—an activity that would be prohibited at any other time and place—has become a ritual of long standing in celebration of Carnival. While many celebrations of Carnival worldwide include minimal costumes, the extent of nudity in the French Quarter is due to its long history as a "red light district". The ritual "disrobing" is done in the context of a performance which earns a payment, even though it is only symbolic (glass beads). Although the majority of those performing continue to be women, men (both homosexual and heterosexual) now also participate. [189]
  • Mooning refers to exposure of the buttocks. Mooning opponents in sports or in battle as an insult may have a history going back to ancient Rome. [190]
  • Streaking refers to running nude through a public area. While the activity may have a long history, the term originated in the 1970s for a fad on college campuses, which was initially widespread but short-lived. [191] Later, a tradition of "nude runs" became institutionalized on certain campuses, such as the Primal Scream at Harvard.

In the United Kingdom, nudity may not be used to "harass, alarm or distress" according to the Public Order Act 1986. [192] According to a police spokesperson in 2013, nudity per se is not unlawful in the United Kingdom; however, the circumstances surrounding particular episodes of nudity may create public order offenses. Most naturists comply with the law by being nude only where others cannot see them. [193] After repeated arrests, prosecutions, and convictions in Great Britain, the activist Stephen Gough sued at the European Court of Human Rights for the right to be nude in public outside of designated areas. His claim was ultimately rejected. [194]

In the United States, public nudity is a matter of local laws with the exception of First Amendment protection of free expression, which is generally recognized with regard to performances in an artistic context. However, in Barnes v. Glen Theatre, Inc. the owner of a bar and an adult bookstore in South Bend, Indiana sought to overturn the state law prohibiting "indecent behavior". The US Supreme Court upheld the Indiana law, although the justices differed in their reasoning. [195]

Since regulation of everyday public behavior is more often a matter of social convention than written law, some jurisdictions may have no specific law against nudity in public. This was the case in 2006, when three young men who had been skinny-dipping outside Brattleboro, Vermont decided to go into town to see what would happen if they disrobed there. They were not arrested, and the following two summers saw a number of incidents of public nakedness until an ordinance banning nudity was passed. [196]

In the 21st century in the United States, the legal definition of "full nudity" is exposure of the genitals. "Partial nudity" includes exposure of the buttocks by either sex or exposure of the female breasts. [197] Legal definitions are further complicated by laws regarding indecent exposure; this term generally refers to engaging in public nudity with an intent to offend common decency. [198] Lewd and indecent behavior is usually defined as causing alarm, discomfort, or annoyance for the average person. Where the law has been challenged by asserting that nudity by itself is not lewd or disorderly, laws have been amended to specify indecent exposure, usually of the genitals but not always of the breast. Public indecency is generally a misdemeanor, but may become a felony upon repeated offense, or in any instance done in the presence of a minor. [199] The law differs between states. In the state of Oregon, public nudity is legal and protected as free speech as long as there is no "intent to arouse". [200] The state of Arkansas not only outlaws private nudism, but bans anyone from advocating the practice. [201]

After incidents in July 2020 of ticketing women for sunbathing topless, the Minneapolis Parks board moved to change the regulation that prohibits the exposure of female breasts on park property, which is legal elsewhere in the city and the state of Minnesota. Some tickets were issued when sunbathers were spotted in isolated areas by drones with cameras. [202] The police defended the use of drones as being in response to citizen complaints regarding illegal alcohol and drug use in addition to nudity. [203]

Imposed nudity

Punishment

In some situations, nudity is forced on a person. Imposed nudity (full or partial) can be part of corporal punishment or a form of humiliation, especially when administered in public. In 2017, according to police, students at a girls' school in the north-east Indian state of Arunachal Pradesh were forced to undress as a form of punishment. Although not as common as corporal punishment, it is not unusual for stripping to be used as a form of punishment in Indian schools. [204]

Torture

The Nazis used forced nudity to humiliate inmates in concentration camps. This practice was depicted in the film Schindler's List (1993). [205]

In 2004, Abu Ghraib prison in Baghdad, Iraq, gained international notoriety for accounts of torture and abuses by members of the United States Army Reserve during the post-invasion period. Photographic images were circulated that showed prisoners posed naked, sometimes bound, being intimidated and otherwise humiliated, resulting in widespread condemnation of the abuse. [206] [207]

Strip search

A strip search is the removal of some or all of a person's clothing to ensure that they do not have weapons or contraband. Such searches are generally done when an individual is imprisoned after an arrest, and are justified by the need to maintain order in the facility, not as punishment for a crime. [208]

Nudity as protest

Nudity is used to draw public attention to a cause, sometimes including the promotion of public nudity itself. [209]

Particular issues represented include animal rights by the group PETA, environmental issues by the World Naked Bike Ride, and women's rights by the organization FEMEN.

Public baths and spas

Bathing for cleanliness and recreation is a human universal, and the communal use of bathing facilities has been maintained in many cultures from varying traditional sources. When there is complete nudity, the facilities are often segregated by sex, but not always.

The sauna is attended nude in its source country of Finland, where many families have one in their home, and is one of the defining characteristics of Finnish identity. [92] [210] Saunas have been adopted worldwide, first in Scandinavian and German-speaking countries of Europe, [211] with the trend in some of these being to allow both genders to bathe together nude. For example, the Friedrichsbad in Baden-Baden has designated times when mixed nude bathing is permitted. The German sauna culture also became popular in neighbouring countries such as Switzerland, Belgium, the Netherlands and Luxembourg. [a] In contrast to Scandinavia, public sauna facilities in these countries—while nude—do not usually segregate genders. [b] [212] A spa in Třeboň, Czech Republic features a peat pulp bath. [214]

The sauna came to the United States in the 19th century when Finns settled in western territories, building family saunas on their farms. When public saunas were built in the 20th century, they might include separate steam rooms for men and women. [215]

In Japan, public baths (Sentō) were once common, but became less so with the addition of bathtubs in homes. Sentō were mixed gender (konyoku) until the arrival of Western influences, [56] but became segregated by gender in cities. [216] Nudity is required at Japanese hot spring resorts (Onsen). [217] Some such resorts continue to be mixed gender, but the number of such resorts is declining as they cease to be supported by local communities. [56]

In Korea, bathhouses are known as Jjimjilbang. Such facilities may include mixed-sex sauna areas where clothing is worn, but bathing areas are gender-segregated; nudity is required in those areas. [218] [217] Korean spas have opened in the United States, also gender-separated except for the bathing areas. In addition to the health benefits, a writer in Psychology Today suggested social benefits for women and girls in having real-life experience of seeing the variety of real female bodies—even more naked than at a beach—as a counterbalance to the unrealistic nudity seen in popular media. [219]

In Russia, public banyas are clothing-optional and are usually gender-segregated. [217]

Nudity in semi-public facilities

Historically, certain facilities associated with activities that require partial or complete nakedness, such as bathing or changing clothes, have limited access to certain members of the public. These normal activities are guided by generally accepted norms, the first of which is that the facilities are most often segregated by gender; however, this may not be the case in all cultures.

Changing rooms may be provided in stores, workplaces, or sports facilities to allow people to change their clothing. Some changing rooms have individual cubicles or stalls affording varying degrees of privacy. Locker rooms and communal showers associated with sports generally lacked any individual space, thus providing minimal physical privacy.

The men's locker room—which historically in Western cultures had been a setting for open male social nudity—is, in the 21st century United States, becoming a space of modesty and distancing between men. For much of the 20th century, the norm in locker rooms had been for men to undress completely without embarrassment. That norm has changed in the 21st century: men typically wear towels or other garments in the locker room most of the time and avoid any interaction with others while naked. This shift is the result of changes in social norms regarding masculinity and how maleness is publicly expressed; in addition, open male nudity has become associated with homosexuality. [220] [221] In facilities such as the YMCA that cater to multiple generations, the young are uncomfortable sharing space with older people who do not cover up. [222] The behavior in women's locker rooms and showers also indicates a generational change: younger women cover more, and full nudity is brief and rare, while older women are more open and casual. [223]

By the 1990s, communal showers in American schools had become "uncomfortable", not only because students were accustomed to more privacy at home, but because young people became more self-conscious based upon the comparison to mass media images of perfect bodies. [224] In the 21st century, some high-end New York City gyms were redesigned to cater to millennials who want to shower without ever being seen naked. [225] The trend for privacy is being extended to public schools, colleges and community facilities replacing "gang showers" and open locker rooms with individual stalls and changing rooms. The change also addresses issues of transgender usage and family use when one parent accompanies children of differing gender. [226]

A 2014 study of schools in England found that 53% of boys and 67.5% of girls did not shower after physical education (PE) classes. Other studies indicate that not showering, while often related to being naked with peers, is also related to lower intensity of physical activity and involvement in sports. [227]

This shift in attitudes has come to societies historically open to nudity. In Denmark, secondary school students are now avoiding showering after gym classes. In interviews, students cited the lack of privacy, fears of being judged by idealized standards, and the possibility of being photographed while naked. [228] Similar results were found in schools in Norway. [229] [230]

Social and public nudity

Attitudes toward public nudity vary depending on culture, time, location, and context. There are particular contexts in which nudity is tolerated, accepted, or even encouraged in public spaces. In Europe, such contexts include nude beaches, within some intentional communities (such as naturist resorts or clubs) and at special events. Such special events can be understood by expanding the historical concept of Carnival, where otherwise transgressive behaviors are allowed on particular occasions, to include other mass nudity public events. Examples include the Solstice Swim in Tasmania (part of the Dark Mofo festival) and World Naked Bike Rides. [231]

Germany is known for being tolerant of public nudity in many situations. [232] In a 2014 survey, 28% of Austrians and Germans had sunbathed nude on a beach, 18% of Norwegians, 17% of Spaniards and Australians, 16% of New Zealanders. Of the nationalities surveyed, the Japanese had the lowest percentage, 2%. [233]

In the United States in 2012, the city council of San Francisco, California, banned public nudity in the inner-city area. This move was met by harsh resistance because the city was known for its liberal culture and had previously tolerated public nudity. [234] [235] Similarly, park rangers began filing tickets against nudists at San Onofre State Beach—also a place with long tradition of public nudity—in 2010. [236]

Naturism

Naturism (or nudism) is a subculture advocating and defending private and public nudity as part of a simple, natural lifestyle. Naturists reject contemporary standards of modesty that discourage personal, family and social nudity. They instead seek to create a social environment where individuals feel comfortable being in the company of nude people and being seen nude, either by other naturists or by the general public. [237] In contradiction of the popular belief that nudists are more sexually permissive, research finds that nudists and non-nudists do not differ in their sexual behavior. [238]

The social sciences, until the middle of the 20th century, often studied public nakedness, including naturism, in the context of deviance or criminality. [239] However, more recent studies find that naturism has positive effects on body image, self-esteem and life satisfaction. [240] The Encyclopedia of Social Deviance continues to have an entry on "Nudism", [241] but also defines "Normal Deviance" as violating social norms in a positive way, leading to social change. [242]

Nude beaches

A nude beach, sometimes called a clothing-optional or free beach, is a beach where users are at liberty to be nude. Such beaches are usually on public lands. Nude beaches may be official (legally sanctioned), unofficial (tolerated by residents and law enforcement), or illegal but so isolated as to escape enforcement.

Clothing-optional recreation

In a picture-making civilization, pictorial conventions continually reaffirm what is natural in human appearance, which is part of socialization. [243]

In Western societies, the contexts for depictions of nudity include information, art and pornography. Any ambiguous image not easily fitting into one of these categories may be misinterpreted, leading to disputes. [244] The nude in photography includes scientific, commercial, fine art, and erotic photography. [245]

The nude human figure has been one of the subjects of art from its Paleolithic beginnings, and a major preoccupation of Western art since the ancient Greeks. In The Nude: a Study in Ideal Form, Lord Kenneth Clark states that to be naked is to be deprived of clothes, and implies embarrassment and shame, while a nude, as a work of art, has no such connotations. [246] This separation of the artistic form from the related social and cultural issues was largely unexamined by classical art historians, but became a focus of social and feminist critiques in the 1970s when classical nudes of women were seen as symbolic of male objectification of female bodies. [247] [248] The debate over objectification has continued, recently energized by the #MeToo movement. [249]

Arts-related activities

Distinct from the nude artworks created, sessions where artists work from live models are a social situation where nudity has a long tradition. The role of the model, both in visual art education and in the creation of finished works, has evolved since antiquity in Western societies and worldwide wherever Western practices in the visual arts have been adopted. At modern universities, art schools, and community groups, "art model" is a job, one requirement of which is to pose "undraped" and motionless for minutes or hours (with breaks), or to resume the same pose over several days, as the artwork requires. [250] Some have investigated the benefits of arts education including nudes as an opportunity to satisfy youthful curiosity regarding the human body in a non-sexual context. [251]

Photography of groups of nude people in public places has been done around the world with or without official cooperation. The gathering itself is proposed as performance art, while the resulting images become statements based upon the identities of the people posing and the location selected: urban, scenic landscapes, or sites of historical significance. The photographers, including Spencer Tunick [252] [253] [254] and Henning von Berg, state a variety of artistic, cultural, and political reasons for their work, while those being photographed may be professional models or unpaid volunteers attracted to the project for personal reasons.

Sexually explicit images

Sexual acts have been depicted in art since its beginnings in the Stone Age.

Indecency and obscenity

Limits on the depiction of nudity are based upon the legal definitions of indecency and obscenity. In 1973, the Supreme Court in Miller v. California established the three-tiered Miller test to determine what was obscene (and thus not protected) versus what was merely erotic and thus protected by the First Amendment. [245]

Depictions of child nudity (or of children with nude adults) appear in works of art in various cultures and historical periods. Attitudes toward such works have changed over time, and they have become increasingly frowned upon, [255] especially in the case of photography. During the years when film was developed by commercial photo labs, snapshots taken by parents of their nude infant or toddler children were reported to the police as possible child pornography. [256] While some individuals were arrested, tried, or convicted, no charges involving mere nudity have ultimately been upheld, because the legal definition of child pornography is that it depicts sexually explicit conduct. [257]

Nudity may be used as a part of live performances, such as dance, theater, performance art and nude body painting.

Dance

Dance, as a sequence of human movement, may be ceremonial, social or one of the performing arts. Partial or complete nudity is a feature of ceremonial dances in some tropical countries. However, some claim that modern practices may be used to promote "ethnic tourism" rather than to revive authentic traditions. [258] In Western traditions, dance costumes have evolved towards providing more freedom of movement and revealing more of the body; complete nakedness is the culmination of this process. [259] Modern choreographers consider nudity one of the possible "costumes" available for dance, some seeing nudity as expressing deeper human qualities, working against the sexual objectification of the body in commercial culture. [260] While nudity in social dance is not common, events such as "Naked Tango" have been held in Germany. [261]

Theater

Nude models posing on stage were a feature of tableaux vivants at London's Windmill Theatre and New York's Ziegfeld Follies in the early 20th century. [262] [263] English and United States law did not allow nude or topless performers to move on stage, but allowed them to stand motionless in imitation of works of art. [264] Reflecting the era, the American theater in the 1960s addressed issues including hypocrisy and freedom. By 1968 nudity was freely employed by playwrights, directors and producers, not only on subjects of sexuality but regarding social injustice and war. [265]

A well-known performance that included nudity was the Broadway musical Hair in 1968. [266] The New York Times printed a number of letters to the editor in 1969 with a range of opinions: actress June Havoc stated that nudity indicates the producer has run out of ideas, while actress Shelley Winters joked that it was disgusting, but that if she were 22 she would do it. [267] The nudity in Hair was benign compared to later productions. Dionysus in 69, a modern version of The Bacchae, included a chorus of nude and partially nude actors who staged a birth ritual and interacted with the audience. [268] Eventually nudity became an issue of personal integrity and privacy, with some actors choosing to perform nude and others not. [269]

Erotic performances

Public performances intended to arouse the erotic interest of an audience have an indeterminate history, generally associated with prostitution. The striptease did not end with the performer entirely nude until the twentieth century, and has since evolved into the live sex show.


2. Knowledge and Skills to Implement Breastfeeding Policies

  • Ensure all staff, health care providers and volunteers have the knowledge and skills necessary to implement the infant feeding policy.

Undergraduate and Postgraduate Education

As awareness of the importance of protecting, promoting, and supporting breastfeeding increases, undergraduate and continuing professional education of HCPs needs to address the biological, social, and emotional components of breastfeeding – and the multiplicity of factors that affect this dynamic relationship. However, numerous studies describe the lack of formal breastfeeding information in HCPs' educational programs. Footnote 57 Footnote 58 Footnote 59 Footnote 60

HCPs are known to affect the breastfeeding relationship. Women who perceive their HCPs as supportive of breastfeeding are more likely to breastfeed than those who perceive them as neutral or favouring formula feeding. Footnote 61 Footnote 62 In fact, the more often breastfeeding is mentioned during pregnancy, the more likely women will breastfeed. Footnote 63

Continuing Education

Everyone who works at a health facility – administrators, managers, volunteers, allied health professionals, auxiliary staff, students, clerks, and all HCPs – needs to be aware of the facility's policies, including the BFI. Specific training in keeping with everyone's role should be provided. For example, phlebotomists should actively support mothers in breastfeeding or holding the baby in skin-to-skin contact to comfort their child through blood tests.

Continuing professional education needs to be developed to address the breastfeeding needs of families and enhance the care of mothers and babies. A holistic and interprofessional approach to professional continuing education should be a responsibility shared by HCPs and health facilities.

Education Strategies

WHO and UNICEF recommend 18–20 hours of breastfeeding education (including 3 hours of clinical experience) for HCPs who provide direct breastfeeding care, for example, lactation consultants, perinatal nurses, midwives, obstetricians, and family physicians. Providers responsible for clinical support of breastfeeding mothers and infants require specific knowledge, skills, and attitudes. Footnote 59 Research suggests that clinical mentorships, didactic learning modules, and Internet learning options are also useful training opportunities. Footnote 64 The specific knowledge and skills required are outlined in The BFI 10 Steps and WHO Code Outcome Indicators for Hospital and Community Health Services. Footnote 37

At a minimum, all HCPs require orientation in the policies and practice guidelines of the facility, including the BFI (i.e., the Ten Steps and International Code of Marketing of Breast-milk Substitutes).

Ongoing Competency Validation

Practice change requires more than education. Footnote 65 Numerous professional organizations have breastfeeding guidelines – yet professional practice often does not reflect the guidelines. Footnote 59 To be effective, implementation of evidence-based policies in hospitals and community facilities requires a combination of various education strategies and clinical support.


Discussion

Breastfeeding was almost universally practised among the mothers who participated in this study. However, the mothers' knowledge about and attitudes towards breastfeeding and related conditions were not always consistent with healthy practices. Understanding mothers' beliefs and attitudes is essential for designing appropriate health education and other interventions on breastfeeding.

The characteristics of the participants of this study are very similar to those of the rural populations in the southeastern regions of Turkey. Most of the women were illiterate and could not speak Turkish. The literacy rate of the women in this study (18.2%) was significantly lower than the overall literacy rate among women in Turkey (76.1%). 5 The literacy rate of the study population is also lower than that in many other developing countries. Multiparity, early marriages, living in an extended family structure, and short birth intervals were the features representing a traditional lifestyle.

In the study population, a large majority of mothers (98.6%) had ever breastfed their infants. In other traditional populations as well, a great majority of mothers choose to breastfeed during the infant's first years, and breastfeeding is regarded highly for the health of the infant. 6 – 9 On the other hand, in some accounts of Western society, breastfeeding is perceived as denigrating women's bodies, leading women to believe that their bodies are inadequately suited for breastfeeding. Consequently, many women may perceive breastfeeding as something which will be difficult (or impossible) to achieve successfully. 10 Mothers consistently ranked breastfeeding as the best nutrition for infant growth and health, but the value of exclusive breastfeeding was not well known by the mothers. An important factor for not practising exclusive breastfeeding was the perception of water as being indispensable for the infant's health. Exclusive breastfeeding is not frequently practised in other communities either: rates for 4 months of exclusive breastfeeding were 0.0% in rural Malawi 9 and 1.3% in Turkey. 3 Mothers supplemented infants' feeding with other fluids until the mature milk began to flow. Mothers also fed water to their babies, which they thought to be a requirement for the infant. Participants of another study among African American women perceived that giving infants water was essential, and they believed that cereal and solid foods should be introduced much earlier. 11 In Nepal, by contrast, breast milk was considered to be pure, and while the infant was drinking only breast milk, he or she, unlike adults, was not yet polluted. 12

Between 15 and 65% of mothers studied in different regions of the world had not given colostrum to their babies. 13 – 15 In Ibadan, mothers claimed to have discarded the colostrum produced in the first 24 h postpartum, and infants were fed on glucose water or herbal preparations. 13 In Guinea-Bissau, mothers also had negative cultural perceptions about colostrum. 16 In Nigeria, mothers said that colostrum should be discarded because it is dirty, ‘like pus’, and therefore potentially harmful to the infant. 17 In those communities breastfeeding was traditionally delayed and glucose water and herbal preparations were given to infants, in a similar way to our study population. Similar to many other cultures, mothers in our study perceived colostrum as a harmful, pus-like, and religiously forbidden milk. It is interesting that some of them related meconium to colostrum. Another perception was that colostrum was milk that had stayed in the breast during the 9 months of pregnancy and had thus become stale. Nigerian women also claimed that water and herbal teas would purge the baby and clean its stomach, similar to the findings in our study. 17 These traditional beliefs led mothers to withhold breast milk during the first days of life. In the survey, 38.6% of the women stated that they did not have white milk, and 24.4% stated that colostrum was too dirty for breastfeeding during the first hours. These and all the other answers demonstrate the strong misperceptions about colostrum.

Literate women and women who had professional birth attendance during their most recent birth were more likely to initiate breastfeeding earlier. These results indicate that traditional beliefs and attitudes can be changed, and acceptance of colostrum can be enhanced, through the training of mothers and the provision of health professionals for deliveries.

Once initiated, breastfeeding was widely accepted and highly valued in our study population, similar to other communities. 17, 18 In our study population the median duration of breastfeeding was found to be 18 months. This is longer than in other parts of Turkey, where the average is 16.2 months. 3

In Ethiopia, the majority of the mothers studied stopped breastfeeding when they became sick or pregnant or their child became sick. 18 More than half of the women decided to discontinue breastfeeding because they had become pregnant again. 19 In our study, getting pregnant or becoming ill were also shown to be important reasons for the cessation of breastfeeding. Women stated that if a pregnant woman breastfed, both the breastfed child and the fetus could be harmed. Pregnant women were not expected to breastfeed according to the Yoruba culture either. 17

One of the other interesting findings of this study was the avoidance of breastfeeding after the mother had worked under the sun.



The flag of the United Nations (left). The flag of the United States of America (right).

Editor's Note:

In the wake of World War II, the United States helped to build a set of new international institutions designed to bring peace, stability, and cooperation to the post-war world. They included the United Nations, the World Bank, and the World Health Organization. Yet, many Americans have been suspicious of these institutions. And many American politicians have treated them with hostility, none more extravagantly than Donald Trump. This month, historian Joe Parrott examines this history and explores what the future of American diplomacy might hold.

The 2020 U.S. elections have resulted in a new president-elect. Joe Biden will occupy the White House, in part because much of the country rejected Donald Trump. While this backlash was perhaps concentrated on domestic matters and perceptions, Trump’s foreign policy also reflected a polarizing approach to American internationalism that remains popular with a large segment of the country.

Over the past four years, Trump has flouted the authority of international institutions, from questioning the utility of the North Atlantic Treaty Organization to rejecting the Paris Climate Agreement. The latest and most dramatic move was his pledge to withdraw from the World Health Organization (WHO) amid a global pandemic, ostensibly because it did not play hardball with China.

While Trump’s loud rejection of multilateralism has been part of his America First agenda, U.S. administrations have been shifting away from cooperation with international institutions for some time.

Yet many of the world’s most contentious issues seem to demand border-crossing responses. These include not only disease and public health, but climate change, immigration, and global economic inequality.

Understanding why Trump pulled back from the world amid demands for greater collaboration—and why many Americans, particularly Republicans, supported this strategy—requires an understanding of supranational institutions and their relationships with their most important benefactor: the United States.

Beginning with the United Nations (UN) in 1945, the United States helped create institutions that sought to deliver peace, economic development, and social services to the world through an American-friendly model. These multilateral bodies allowed Washington officials to portray self-interested policies as altruistic during the ideological competition of the Cold War.

But these institutions and their goals changed as decolonization transformed their memberships. New postcolonial states and economically marginalized nations in the Global South (Asia, Africa, and Latin America) emphasized policies dedicated to reversing trade inequalities, expanding investment in poorer countries, and promoting multilateralism.

These policies often clashed with U.S. priorities. The gradual loss of control frustrated Washington officials, who complained about the UN and WHO even as they generously funded both. Such frustrations encouraged candidate Trump to argue that Americans were getting a bad deal, an argument he often used to justify his actions as president.

Over the past decades, the United States has struggled to adapt to the more complex, decentralized world that has seen U.S. power diluted. These global changes have destabilized its relationship with multilateral organizations it once viewed as essential for effective management of global challenges such as war and disease.

Looking back on the history of U.S. cooperation with organizations such as the UN and WHO helps explain the current polarized view of supranational institutions, suggesting the country might benefit from working with them more in the future.

The United States and the Origins of Supranational Governance

After two world wars and the Great Depression led to mass human suffering, world leaders and their constituents believed things needed to change. The highly technological and destructive wars made great power politics seem dangerous and erased American confidence in the secure isolation provided by two oceans. The collapse of the integrated global economy proved that nationalistic tools such as tariffs could not protect individual states from hard times.

U.S. leaders came to believe the nation could not remain aloof from the rest of the world. They had long avoided close involvement with European states they viewed as autocratic, corrupt, and warlike. But Washington’s reluctance to coordinate responses to the Depression and the rise of European fascism paved the way for World War II. As a result, U.S. officials began cooperating to construct a new international order even before the conflict ended.

This 1941 poster from the United States Office of War Information depicts the flags of the wartime United Nations. The flags of the "big four," the United States, Great Britain, the Soviet Union, and China are in the top row, while the other allies are listed in alphabetical order below (left). This 1943 poster depicts the United Nations as a wartime alliance following the Declaration of the United Nations of 1942 (right).

The most visible example of this new direction in U.S. policy was the creation of the United Nations. From the majestic Beaux-Arts War Memorial Opera House in San Francisco, Americans led a group of world leaders in pursuit of the goal that “all worthy human beings may be permitted to live decently as free people.”

The UN was envisioned as a forum on diplomacy, economics, and global norms. At the heart of the institution was a search for collective security, wherein a small council of countries headed by the five permanent members (the United States, Great Britain, the Soviet Union, China, and France) decided how to respond to violations of international law and custom, sometimes with military and economic force.

In designing this system, the United States willingly forfeited some of its sovereignty in pursuit of a better managed international system. Indeed, it stood willing to sacrifice more by adhering to the majority opinion of the Security Council.

The Soviets, however, demanded each permanent member receive a veto, which the United States approved in order to guarantee Moscow’s participation. While this great-power veto undermined the UN’s ability to act as an effective check on war, the body played a vital role establishing norms of human rights and promoting global economic development.

U.S. support for the UN hinged on the new domestic understanding that national interests across the globe were inextricably entwined. The decades after World War I taught Americans that peace would come not through national defense alone but by removing the largely economic roots of war.

The UN was the centerpiece of an array of institutions seeking to ensure a decent standard of living that would prevent major conflicts waged out of necessity, conflicts that carried astonishing potential for destruction after the invention of the atomic bomb.

Two such economic organizations emerged from the 1944 Bretton Woods Conference: the International Monetary Fund (IMF) and what eventually became the World Bank. The former aimed to stabilize currencies through lending to maintain international financial stability and promote trade, in part to prevent another Great Depression. Founded to finance postwar reconstruction, the World Bank shifted to lending to low-income countries after the American Marshall Plan independently provided over $10 billion to rebuild Europe.

These organizations formed the foundations for what is often called the liberal international order. Multilateral institutions promoted democracy, diplomacy, open markets, and managed capitalism in collaboration with a powerful post-World War II United States.

Modernization and the WHO

Central to this liberal international order was another theory whose popularity peaked in the 1950s and 1960s: modernization. Advocates argued that policy interventions could create conditions for rapid modernization, wherein all countries could achieve something akin to the level of economic, social, and political development enjoyed in Western Europe and North America. Improved living standards would provide the foundations for peace and promote further collaboration.

New supranational organizations arose to support this mission, coordinating international expertise to promote best practices learned from the postwar rebuilding while also extending them to the Global South.

An important example of these new institutions was the WHO, founded in 1948 as a specialized agency within the UN. Wartime destruction of infrastructure, mass displacements, and increasingly rapid transportation bred fears among world leaders that small outbreaks of deadly diseases could spread quickly across the world. They hoped the WHO would help anticipate and prevent such crises.

Based in Geneva, the WHO operated primarily as an advisor to governments and a clearinghouse for information. It had no regulatory power and initially could only respond to requests for assistance from national health institutes. Yet governments participated because the WHO provided information and a level of coordination that they could not achieve on their own.

In the early postwar years, its priorities were controlling communicable diseases and strengthening national programs of health administration, education, and environmental sanitation. This mission was especially vital in the Global South, where European colonial administrators had long ignored national health infrastructures in Africa and Asia. The United States was the largest single contributor, providing almost a third of WHO funding into the 1950s.

The WHO coordinated global health standards and aid projects, but its role in eradicating diseases tends to attract the most attention. Eradication was far from an obvious goal; experts before World War II questioned whether it was feasible or practical. Yet the postwar era appeared to hold unlimited technological opportunity, especially in the United States. With encouragement from American officials, the WHO argued that the high costs of global eradication of certain diseases—malaria, for example—were justified in economic terms as well as in human lives.

This logic proved convincing. The WHO spearheaded global campaigns, identifying goals and strategies for eradication while also arranging for funds – either through bilateral aid from wealthy countries or direct grants. There were real accomplishments, if never on the scale early advocates envisioned. The WHO declared smallpox effectively eradicated in 1980, nearly ended the threat of polio, and greatly reduced the number of areas prone to malaria.

Most WHO programs in these early years originated and operated in the Global North, but many had a major impact on countries of the Global South as part of the postwar project of development and modernization. Health infrastructures improved substantially, if unevenly. Mortality rates for children under 5, for example, dropped by nearly 75 percent from the 1950s to 1980s in parts of Asia and Eastern Europe.

These improvements supported the broader goals of the liberal international order. The combination of aid and investment never reached the 1 percent budget goal envisioned by John F. Kennedy and the UN in the 1960s, but there was progress. Despite lingering and ingrained inequalities, the average 5 percent growth experienced by many Global South countries from 1960 to 1980 greatly reduced rates of poverty. Living standards rose throughout the world thanks in part to American and European assistance.

This level of global coordination and voluntary transfers of wealth was unprecedented. European empires had promoted economic and infrastructural development in their colonies, but only as investments for the profit of investors or colonists. Never before had countries transferred even this small fraction of their wealth for the benefit of other nations, nor had they sought to do so in coordinated fashion.

Organizations like the UN and WHO were vital experiments in creating a new international system, in which the benefits of technology, wealth, and knowledge could be more evenly distributed in pursuit of a more just and peaceful world.

U.S. Aid to Supranational Organizations

As the world’s richest country, the United States was the primary benefactor of these institutions and their programs. Why the United States chose to follow this path is complicated.

The search for peace and stability through multilateralism provided the basic logic of post-WWII liberalism, but changing the way the United States operated internationally required more than just altruism. After all, U.S. leaders from George Washington onward warned against what Thomas Jefferson called “entangling alliances.” Senator Henry Cabot Lodge defeated Woodrow Wilson’s attempt to join the nascent League of Nations in 1919, launching what would become two decades of increasingly vocal calls for an isolationist foreign policy.

World War II quieted all but the most vocal isolationists, and the Cold War militated against a revival. From 1945 until 1989 or so, the United States became deeply involved in a competition with the Soviet Union, in which it sought to contain the spread of totalitarian communism. That threat motivated both U.S. political leaders and the American people to embrace liberal internationalism, for fear that ignoring the perceived Soviet threat would repeat the mistakes of the interwar years.

As part of its competition with the Soviet Union, the United States had to prove the superiority of its capitalist, democratic system. It needed to explain to foreign countries why they should adopt, or at least ally with, the democratic, capitalist model of modernization rather than the centralized, state-based Soviet one. This ideological battle focused on individual, quotidian issues such as healthcare, education, and the economy.

The Soviets understood this reality and attacked the United States for everything from domestic racism to poor working conditions. Moscow diplomats argued that these issues highlighted the flaws of the capitalist system, creating economic and health inequities. In response, U.S. investment in economic, social, and health issues—at home and abroad—softened the edges of capitalism and aimed to prove the system was worth defending.

Foreign aid thus became a vital tool in Cold War competition. It supposedly demonstrated both American wealth and benevolence, but the world it promoted stuck rigidly to U.S. models.

When strategically important countries strayed from the prescribed path, the United States could use the promise of aid or combine it with military force—as was done in Indochina—to keep them in the fold. The liberal international order had a coercive military component, which has remained among the most consistent elements of U.S. policy even after the Cold War.

In this context, support for multilateral institutions like the WHO and specialized UN programs played a valuable role in legitimizing American ideas and protecting the United States from accusations that its foreign policy was wholly self-interested. They implied that U.S. priorities aligned with international desires.

UN Ambassador Henry Cabot Lodge Jr., the grandson of Wilson’s nemesis a generation before, explained that working with supranational institutions allowed the country to “get credit for practicing altruism instead of power politics.”

The United States expected that these institutions would work hand-in-glove with U.S. policies, and they did so for the first two decades of the Cold War. At the UN, Americans or European allies held key offices; they were willing to compromise because they viewed the world similarly and shared goals. Soviet initiatives antithetical to U.S. visions of economics or healthcare were generally defeated in democratic forums thanks to this Euro-American bloc and a few South American allies.

In less politicized bodies such as the WHO, Western countries dominated senior positions for decades. This gave them extensive control of major programs. For instance, the ultimately unsuccessful malaria campaign initially followed U.S. and European models of mosquito reduction, and the U.S. Centers for Disease Control was central to smallpox eradication in the 1960s.

Through funding and control of administrative positions, the United States promoted orderly, managed development of foreign countries according to U.S. preferences. This “self-interested pragmatism”—as Marcos Cueto and his co-authors argued in their recent history of the WHO—helped justify American participation in these organizations. It also convinced an often skeptical, cost-conscious Congress to fund a quarter to a third of their total operating budgets.

Decolonization and Rise of the Rest

The United States embraced multilateralism because supranational institutions aligned with U.S. priorities, but the explosion of new nations after 1945 challenged U.S. control.

The wave of decolonization that swept the globe from 1947 into the 1970s transformed the UN, tripling membership from an initial 51 to 154 states by 1980. Most new states came from former colonial territories in Asia and Africa and soon outnumbered the American and European countries that formed the bulk of the founding membership.

The addition of these new nations to supranational institutions, including specialized ones like the WHO, changed their internal dynamics in ways that refocused their efforts on the needs of the Global South.

As their policies shifted under Southern activism, the United States began to question the value of bodies like the UN. Many of the new countries eschewed the Cold War competition. They courted aid from both superpowers and borrowed elements of the planned Soviet economy in the search for rapid economic development. They also looked to place their problems—economic underdevelopment, trade inequalities, the inheritance of weak infrastructures, and structural disparities—onto the international agenda.

They were able to do so because some of the international institutions set up after World War II had strong democratic elements. The General Assembly of the UN, for example, operated according to a one-country-one-vote system. Newly decolonized states downplayed differences to create a powerful voting bloc that prioritized new issues.

The UN became a forum for challenging structural inequalities in the international system that traditionally privileged European and North American countries.

Among the most important initiatives was the UN Conference on Trade and Development (UNCTAD), which advocated a New International Economic Order that empowered Global South states. It sought to boost aid and allow protectionist policies in the developing world to help inefficient start-up industries find their feet. These demands challenged the open-market regime established after World War II, which favored advanced industrialized economies that imported cheap raw materials from the Global South.

Similar developments occurred within supranational institutions such as the WHO. Despite real gains, institutionalized bias and top-down technocratic approaches created unequal outcomes between Global North and South. In 1955, for example, the WHO deemed malaria eradication efforts in sub-Saharan Africa premature because the limited colonial infrastructures could not undertake programs designed for Europe and North America.

Global South states responded by demanding more bottom-up approaches to eradication and new emphasis on primary health care. This latter policy embraced social medicine that rejected expensive, technology-dependent Western programs in favor of broad improvements to medical infrastructure and individual health education, promoting prevention rather than just intervention.

The popularity of these approaches from the 1960s onward put the United States on the defensive. The Soviets had long championed primary health care, while small-government politicians and a territorial American Medical Association rejected social approaches to healthcare at home.

The priorities of African, Asian, and Latin American nations were pushing supranational organizations in directions that challenged U.S. priorities.

The Rise of Neoliberalism

Domestic criticism grew as the United States lost influence within these supranational bodies. From the 1960s onward, congressional skeptics seized on these challenges to U.S. visions of development as proof of hostility that justified rethinking foreign policy.

When these organizations crossed political tripwires, such as involving Palestine in UN and WHO deliberations, frustrations boiled over and became political fodder. After the UN admitted the People’s Republic of China in 1971 with the support of many Global South states, Republican firebrand Barry Goldwater claimed: “The time has come to recognize the United Nations for the anti-American, anti-freedom organization that it has become.”

Conservative Republicans and Southern Democrats bristled at supranational organizations trying to set regulations and establish norms, especially when they challenged domestic policies or business interests. In one instance, the Reagan administration drew rebukes from career officials and many Democrats when it voted against a nonbinding WHO initiative in 1981 to restrict the marketing of formula and encourage breastfeeding.

This domestic revolt weakened the U.S. commitment to foreign aid. Bilateral agreements between Washington and individual countries became popular, since they provided the United States with leverage and greater control of projects.

But even here, many politicians viewed recipients of bilateral aid who broke with the United States at the UN or WHO as ungrateful. Led by defense-minded politicians and budget hawks, Congress progressively cut foreign assistance, buoyed by popular attitudes that supported aid in principle but exaggerated the cost of such programs.

By the late 1970s, foreign aid had become a political issue, wielded by Republicans and conservative Democrats to criticize opponents for being spendthrift or foolish. The popularity of such attacks owed less to the actual costs of foreign aid and multilateral institutions—which never amounted to even 1 percent of the annual federal budget—than to the increasingly common perception of multilateral institutions and nation-states themselves as bureaucratic, inefficient, and wasteful.

This attitude reflected a shift from liberal internationalism to neoliberalism as the dominant American ideology. Neoliberalism prioritized the rational power of markets. It sought to limit the role of states and international institutions to supporting economic exchanges.

This approach fit well with the Bretton Woods lending institutions established at the end of WWII, in which the United States still wielded major influence. Power in both the IMF and World Bank relied on the size of financial contributions rather than majority votes.

The U.S. government nominated both the president of the World Bank and the first deputy managing director of the IMF, where a Western European held the directorship until 2019. And in contrast to the UN’s location in cosmopolitan New York City or to UNCTAD and the WHO in Geneva, the IMF and World Bank are headquartered in Washington, D.C., barely a half mile from the State Department.

During a period when critics claimed the UN and other institutions were forums for criticizing the United States, Washington officials could still align the IMF and World Bank with national priorities.

Rejecting the demands for change articulated by UNCTAD, the United States positioned the IMF and World Bank to become the primary lenders of monetary and development aid in the 1980s. The conditions attached to loans compelled poor nations to pursue rigid free-market structural reforms that included low tariffs, reduced taxes, fiscal austerity, and the resulting reduction of social services.

These policies fit with the neoliberal economic attitudes of U.S. policymakers and opened small states—concentrated in Africa and Latin America—to trade from advanced industrial countries. They also undermined the diversification of these economies and led to drastic reductions in the social spending that had been vital to improving living standards in prior decades.

The neoliberal shift also damaged multilateral institutions like the WHO. Major Western donors—notably the United States—had been trying to rein in spending since the 1960s. The Reagan administration cut the budget and slowed payments as part of a larger effort to minimize the role of the UN and related institutions. In 1985, the United States even refused to pay its dues to the WHO in protest of an essential-medicines list it believed would interfere with the domestic pharmaceutical industry.

These funding difficulties coincided with the rise of small non-governmental organizations that competed for funds in development and healthcare. Notable among the new funding mechanisms were World Bank grants that stressed efficiency through competition, public-private partnerships, and clearly measurable results.

The bank’s neoliberal approach to healthcare emphasized cost-effective interventions such as immunization that revived narrowed versions of earlier practices rather than the more diffuse and equitable primary health care approach demanded by Global South nations.

The economic and public health gains of the postwar period slowed or even reversed in many countries. The 1980s became known as the “lost decade” among development economists. Per capita income stagnated in Latin America and declined in Africa.

Austerity measures adopted in these regions and elsewhere exacerbated political divides and promoted radical politics as ordinary people grew increasingly disillusioned with sitting governments. Though the weakening of organizations like the UN and WHO contributed to these reversals, U.S. critics cited them as evidence of the organizations’ ineffectiveness.

The result is that since the 1990s, the United States has often acted unilaterally or in concert with a limited number of European allies. This most famously occurred in our prolonged wars in the Middle East, but the same tendency has also influenced aid programs.

The President's Emergency Plan for AIDS Relief (PEPFAR), for example, provided important relief for AIDS-plagued countries, but its emphasis on narrow quantifiable goals and domestic political issues—including abstinence-only education under George W. Bush—drew critiques from locals and health experts.

The Future of Multilateralism

After the Great Depression and World War II, the United States embraced domestic and foreign policies it had historically avoided. Membership in multilateral organizations and international institutions was arguably the most important of these changes.

Though it sought outsized influence, the country sacrificed sovereignty and willingly compromised with Europeans it had long distrusted in order to pursue shared goals of peace and economic recovery now understood as inherently intertwined with American prosperity.

The benefits were uneven but quite real: improved living standards at home and abroad, avoidance of nuclear war, and global connections that have promoted rapid technological innovation.

But the U.S. drifted away from multilateral cooperation when post-colonial states challenged its domination of these institutions. Since the 1980s, the Republican Party has consciously politicized the issue, portraying most international accords and collaboration with supranational bodies as signs of weakness. This was one reason why Donald Trump heard cheers when he withdrew from the Paris Accords and the WHO.

Yet the decision to prioritize unilateralism does not negate the need for coordination on global issues of economics, disease, climate, and much else. The current pandemic provides an example of just how damaging recent trends towards unilateralism have been.

The neoliberal defunding of healthcare systems has made tracking communicable viruses more difficult in countries across the world, and there was minimal coordination in the early stages of the outbreak. Resistance to global guidance and regulation from the WHO further hampered efforts and exacerbated existing problems of leadership and legitimacy within the organization, caused in part by decades of underfunding and denigration.

So too has the U.S. preference for technological interventions rather than social ones led to the country’s current crisis. We suffer some of the highest infection totals in the world—and the economic malaise caused by consumer anxiety—while collectively waiting for a vaccine that will take months to distribute once it does arrive.

One question is whether the country will reach the point that it did 75 years ago, when the general public realized a new approach was needed. The other: can the nation find the humility and political will to make the compromises and sacrifices necessary to take an effective leadership role in our more complex, decentralized international system?

We have a new president, but it is unclear if or when the country will adopt the foreign policies necessary to address the challenges of the 21st century.

Suggested Reading

Marcos Cueto et al., The World Health Organization: A History (Cambridge, 2019)

Jussi M. Hanhimäki, The United Nations: A Very Short Introduction (Oxford, 2015)

Vijay Prashad, The Darker Nations: A People's History of the Third World (New Press, 2008)

Richard Jolly et al., eds., UN Ideas That Changed the World (Indiana, 2009)

Sara Lorenzini, Global Development: A Cold War History (Princeton, 2019)

Stephen Macekura, Of Limits and Growth: The Rise of Global Sustainable Development in the Twentieth Century (Cambridge, 2015)

David L. Bosco, Five to Rule Them All: The UN Security Council and the Making of the Modern World (Oxford, 2009)

Alanna O’Malley, The Diplomacy of Decolonisation: America, Britain and the United Nations during the Congo Crisis, 1960–1964 (Manchester, 2018)

U.S. State Department Office of the Historian, Foreign Relations of the United States, Harry Truman: https://history.state.gov/historicaldocuments/truman

The 1953 review, “Principal Stresses and Strains Facing the United States in the United Nations”: https://history.state.gov/historicaldocuments/frus1952-54v03/subch4

This content is made possible, in part, by Ohio Humanities, a state affiliate of the National Endowment for the Humanities. Any views, findings, conclusions or recommendations expressed in this content do not necessarily represent those of the National Endowment for the Humanities.

Two Universal Ideals from the Declaration of Independence

I have recently been studying the era of the American Revolution: the founding of the United States, the signing of the Declaration of Independence, and the glorious creation of the United States Constitution. I’m sure there are similarly fascinating and eye-opening principles behind the creation of countries all over the world, but this particular event and the ideals that sprang from it have affected others on a global scale unprecedented in recent history. I want to talk about two of those ideals.

First, that “all men are created equal.” The reason for promoting this ideal stemmed from a belief, existing for centuries, millennia even, that kings (and, to an extent, nobles) were created to rule over others. Commoners all over the world were relegated to obeying laws in which they had no say. No man, woman, or child could generally rise above the station into which they were born, nor were they permitted to associate with those of higher classes. The revolutionary idea that people were created equal led to the “American dream,” a dream that brought thousands upon thousands of immigrants to its shores in hopes of making a better life. This dream, this founding principle, said that individuals could improve their station through sheer force of will: not because the American government would provide it for them, but because a land was provided where they could work, earn what they were willing to earn, and thereby improve their own situation.

Today this ideal has shifted in the public image to say that all values are created equal: all lifestyles, all choices. This image of equality is a far cry from what was promoted by the writers of the Declaration of Independence, who linked the idea with “the consent of the governed” and “the separate and equal station to which the Laws of Nature and of Nature’s God entitle them.” Each person now living in this land was declared to have certain rights, given by God, which none could take away: the ability to live; the ability to pursue education, training, and work in any chosen field; and the ability to be free, without a tyrannical government imposing unjust, arbitrary laws designed more to fill its coffers and expand its own power than to serve justice.

Second, the encompassing message of the colonists’ desire for independence sprang from sincere religious beliefs, not of any one denomination but Christian in nature. This desire to live without governmental oppression came in this form: “that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” These rights were declared in rebuke of a tyrannical monarchy that had been plundering their coasts, inciting insurrection, imposing taxes without consent, and dissolving their representation (among many other things; see the Declaration of Independence). The colonists invoked natural freedom from this oppression as universal, coming from a Supreme Being who created all mankind. Indeed, the name of deity is invoked several times in the document: “the laws of Nature and Nature’s God,” “endowed by their Creator,” “Appealing to the Supreme Judge of the world,” and “with a firm reliance on the protection of divine Providence.”

These men prayed fervently, in public meetings, for their God to unite them, to guide them in this Herculean endeavor, to help them create a truly free land…and those prayers were answered. No one reading the history of these people can contest the sheer magnitude of the work they undertook. No one can likewise contest the miraculous documents and institutions they created as a result. American Independence started a course that changed the world, changed the way people around the globe saw their status. And the Founding Fathers ascribed that change to the willpower of determined men and the miraculous involvement of a Divine Hand.

Succinctly stated, from President George Washington’s first inaugural address: “No People can be bound to acknowledge and adore the invisible hand, which conducts the Affairs of men more than the People of the United States. Every step, by which they have advanced to the character of an independent nation, seems to have been distinguished by some token of providential agency.” The American nation, and nations around the world today, would do well to remember these things.


7 Ways French Women Do Childbirth & Breastfeeding Differently

I gave birth to two of my three children in France (the third in Switzerland), and had an all-around extremely positive experience. The importance that the European culture and medical system puts on the mother’s health and well-being in all phases of pregnancy and childbirth made me feel safe and taken care of.

France’s infrastructure — including a socialized medical system, low-cost child care, and numerous resources for pregnant women — allows women the time to prepare and recuperate from childbirth and, just as important, to care for the newborn baby.

I’ve never given birth in America, so I can’t speak to how the two compare, for better or for worse. But after living in France for 15 years and having my own children here, this is what I’ve noticed about how women in France approach childbirth and breastfeeding:

1. Medicalized births are still standard.

France is often regarded as a country that relies wholly on medicalized, medicated births. Epidurals are the norm (“Why go through the pain if you don’t have to?” is the thinking), and inducing labor is also common, likely because it’s easier on both the doctor’s and the patient’s schedule.

However, natural childbirth is no longer entirely rare. French hospitals and clinics are very slowly changing, sometimes allowing natural births to take place in hospitals and letting parents present a birthing plan, something that was unheard of until a few years ago.

Personally, I never considered natural childbirth, and since I fell into what the “norm” was in France, it all went as I had wished. I did ask not to be induced if possible, and my wishes were respected.

2. Women tend to stay longer in the hospital after birth.

Twelve years ago, when I gave birth to my oldest, mothers were allowed to stay at the hospital or clinic for up to six days following a normal childbirth. So I had my own set of child-care “experts” surrounding me for almost a week before I had to go it alone! By the time I got home, I felt I had things a bit more under control, and my body was definitely feeling much better.

Today, women in France usually stay three full days (not including the day of the birth). For a cesarean section, it’s at least four.

During this time, aside from being seen by nurses and doctors, first-time mothers are taught how to bathe, dress, and clean a newborn. Women who plan on breastfeeding are given access to a lactation specialist. New mothers are urged to rest, sleep as often as possible, and eat well in order to promote a healthy recuperation.

3. Mother and baby are kept together in the hospital room.

Rather than separate the mother and baby into different wards, French hospitals typically allow babies born needing medical intervention to stay in a suite with their mother.

For babies that need to stay longer than the standard hospital stay, accommodations are arranged so that families can remain with the baby. This is seen as a crucial component in the baby’s recuperation.

4. Breastfeeding is treated as a personal choice.

According to the World Health Organization, French women are breastfeeding more today than they were a decade ago, but they still fall short of the recommendation of at least six months of breastfeeding.

In France, women often breastfeed just shy of three months. That's because most women return to work at that time, since maternity leave is usually 10 to 13 weeks. Recent stats show that only 19 percent of mothers are still breastfeeding at six months. (For comparison, more than 80 percent of Norwegian mothers are still breastfeeding at six months.)

Why is this? Historically speaking, breastfeeding was all but discounted by the medical community even in the 1970s, and it has taken decades to turn this tide. It wasn't until the 21st century that the French government stepped in to proactively promote breastfeeding.

In my personal experience, the choice to breastfeed was left up to me without any pressure either way. No medical professional ever pushed me to breastfeed or to bottle-feed. Both options were supported.

"If a woman is self-motivated and wants to breastfeed, we support her 100 percent," says Karine Murello Ory, a child-care assistant at the private hospital of Versailles. "But if she doesn’t want to, it makes no sense to force the issue. In this case, both the baby and the mother are better off bottle-feeding.”

5. Midwives play a large role.

After returning from the hospital, all French women have free access to a local midwife, for a minimum of three home visits. Midwives help check on wounds, assist with breastfeeding, answer questions about the baby’s eating and sleeping schedule, and offer a friendly ear in the early, delicate days following childbirth.

For me, the biggest difference my midwife made after childbirth was the moral support she provided. The days following discharge are tough for parents who are still getting used to a newborn. Add on sleep deprivation, body aches, and sometimes a bag of mixed emotions, and having a gentle, experienced professional to talk to can be a huge boost to morale.

Midwives are often mothers themselves and are able to relate to their patients both as health care professionals and as fellow human beings. They also allot more time to house calls than a gynecologist or pediatrician can spare for a rushed medical appointment.

6. Women attend “perineum reeducation” sessions after birth.

After childbirth, most women attend 10 sessions with a midwife to help them "reeducate," or strengthen, the perineum and pelvic floor. This rééducation périnéale is paid for by the state and is often seen as a necessary preventive measure for issues like urinary incontinence.

I've found that Anglophones in France often find this practice somewhat invasive and awkward. Yet after having several children, I can say that going through the relatively painless sessions with the midwife (think strength training for your private parts, using either a type of wand or the midwife's fingers) is worth it in the long run.

Following this practice, most women will also seek a professional to help them re-strengthen the abdominal muscles, also covered by the universal health care system.

7. There’s a growing awareness of postpartum depression.

Twelve years ago, when I visited my gynecologist for the routine post-childbirth checkup, he didn’t really ask much about my moods or how I was coping.

Today, however, things are changing for the better in France. Midwives taking care of women after birth are trained to detect symptoms of postpartum depression. “Midwives are probably best situated to spot cases of postpartum depression because of their close proximity to mothers following childbirth,” says Dr. Susanne Braig, medical director of Gynecology, Obstetrics, and Pediatrics at Annecy Hospital in Haute-Savoie. “We also try to pre-empt the severity with well-rounded support from the necessary professionals before the baby arrives.”

More than anything, it’s important for a woman to feel safe, understood, and taken care of when experiencing something as vulnerable as giving birth and raising a child. Wherever you’re giving birth, make sure you surround yourself with a positive support system.


