Acquisition of knowledge
Where does our knowledge come from?
What we know has changed over the millennia, as have the ways we acquire our knowledge. Even today, we are still benefiting from the insights of our ancestors, many of them obtained by risking life and limb. But time and again in the history of humankind, knowledge has been lost to future generations. We can only shake our heads in disbelief at some purported certainties from the past, while other issues may perplex us until the end of our days. A brief history of knowledge and its acquisition.
Age-old conundrums: Where do the stars lead us?
Humans have always been fascinated by the stars. And very early on, people who once stared in awe at the heavens began to discern patterns. The trajectories of the stars seemed to repeat. Archaeological finds suggest that Stone Age people already hoped to understand this recurring cycle. Most likely, they already knew the difference between north, south, east and west. Then they began charting the phases of the moon, solstices and constellations. Later, humankind’s first farmers began scanning the skies for signs that told them when to sow and harvest their crops. In many cultures, this task was attributed to the star cluster known as the Seven Sisters. It disappears below the horizon at the beginning of spring and re-emerges at harvest time.
Finds such as the Nebra sky disk from the Bronze Age indicate that humans recorded their insights for posterity. Because knowledge was valuable. Astronomy was inextricably linked to astrology, religion and cult worship. People believed that the movements of the stars were a kind of divine script: the ability to read them could help avert calamities on earth. Astrologers exerted considerable influence over political life. And because the powerful have always had a penchant for planning, observations of celestial phenomena were factored into their calculations no later than the Neo-Babylonian Empire. Astronomy, the oldest of the sciences, is a cornerstone of our civilization and the starting point for countless other types of knowledge. Astronomical calendars allowed humans to produce surpluses and trade them. On land, transporting goods was arduous. The sea was faster, but navigation based on trial and error was likely to take a fatal turn. Some peoples, however, used basic tools to develop a system of celestial navigation, also known as astronavigation.
The Phoenicians – who inhabited the area around present-day Lebanon and Syria – were probably the first to use the stars at sea. On the other side of the world, Polynesian children were already studying 178 stars and constellations, and combining them with observations of currents, wind, sea swell and bird flight. This allowed their peoples to explore Pacific islands thousands of kilometers from the mainland. Many discoveries, however, may also have been the result of poor navigation. The legends of the Vikings tell of voyages through storms and fog, in which a knowledge of the sun’s position and the North Star was of little help – but which nonetheless saw them arrive at modern-day Canada’s eastern seaboard.
Over the centuries, inventions such as the Jacob’s staff, the compass and the sextant made astronavigation increasingly accurate. The science gained such precision, in fact, that humans relied on it to land on the moon. Even today, in the era of GPS, it is a fixture in the training of mariners: satellites can be hacked, the Internet and radio transmissions can fail, but a view of the sky is as reliable a source of orientation as ever. Astronomy shows that technological progress does not render ancient knowledge redundant.
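The core principle behind that reliability is simple enough to sketch. In the northern hemisphere, the altitude of Polaris above the horizon equals the observer’s latitude to within about a degree. The following Python sketch is purely illustrative and not from the article: the function name, the default values and the simplified corrections are assumptions, and a real navigator would work from almanac tables rather than these rough approximations.

```python
import math

def latitude_from_polaris(sextant_altitude_deg: float,
                          eye_height_m: float = 2.0,
                          index_error_deg: float = 0.0) -> float:
    """Estimate latitude (degrees N) from a sextant sight of Polaris.

    A hypothetical, simplified example: Polaris sits about 0.7 degrees
    from the celestial pole, so the result carries roughly that much
    uncertainty without an almanac correction.
    """
    # Correct the raw reading for the instrument's index error.
    altitude = sextant_altitude_deg - index_error_deg
    # Dip of the horizon: the higher the observer's eye, the lower the
    # visible horizon. Standard approximation: dip (arcmin) ~ 1.76 * sqrt(m).
    altitude -= 1.76 * math.sqrt(eye_height_m) / 60.0
    # Atmospheric refraction lifts the apparent star; Bennett's formula
    # gives a mean correction in arcminutes for a given altitude.
    refraction_arcmin = 1.0 / math.tan(math.radians(altitude + 7.31 / (altitude + 4.4)))
    altitude -= refraction_arcmin / 60.0
    return altitude

# Example: a 48.9-degree sight from the deck suggests roughly latitude 48.8 N.
print(f"Estimated latitude: {latitude_from_polaris(48.9):.1f} degrees N")
```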
Drawing the wrong conclusions: What makes us sick?
Hippocrates, who lived on the Greek island of Kos from about 460 to 370 BC, was likely the first physician to systematically observe the progression of different diseases. His theory: in a healthy person, the four body fluids – blood, phlegm, yellow bile and black bile – are precisely balanced. If this equilibrium is disrupted by external factors, people get sick. Hippocrates was convinced that polluted air was the main cause of disease. Presumably he had seen how people became prone to illness if they lived in overcrowded accommodation, or near swamps or cesspits. He concluded that such places emit miasma: noxious, toxic fumes. And that epidemics were spread by the wind and the exhaled breath of the afflicted.
Hippocrates’ Miasma Theory demonstrates how accurate observations can lead to false conclusions. Above all in medieval Europe, scholars followed the writings of their Classical sources to the letter, and did so for centuries. Doctors only treated symptoms. The root of all illness was beyond dispute: contaminated air. When the plague bacterium arrived in European cities around 1348, it therefore prompted absurd responses. To purify the air he breathed, Pope Clement VI spent months in the summer seated between two large bonfires – even though the disease was spread by rat fleas. Unlike the millions who died, however, he remained healthy. When out and about, many people carried aroma pouches containing herbs and flower petals – although these likely had little effect. But staying away from the sick and the dead did help, people noticed. The particularly vulnerable port city of Marseilles isolated the crews of arriving ships for 40 days – and, in doing so, invented the quarantine. Ergo: bad science can lead to good results. That said, those benefits tend to perpetuate the fallacies that spawned them.
The Miasma Theory went on to shape medical thinking until the late 19th century – even though scientists were gradually identifying which environmental factors encouraged disease. Even though physicians had long since begun measuring bodily functions, statistically evaluating their findings and classifying abnormalities. And even though the invention of the microscope had exposed the minuscule organisms that teemed in water, food and bodily fluids. The world had to wait for laboratory scientists such as Louis Pasteur and Robert Koch, who conducted systematic experiments proving how tiny parasites caused and spread diseases. These modern scientists welcomed conflicting results because they highlighted new areas of research. In 1876, it was Robert Koch who provided unequivocal proof that anthrax was caused by a bacterium: Bacillus anthracis. Bacteriology was born, and has reigned supreme ever since. However, contesting scientific certainties and axioms can still be a challenge today.
Unsolved mysteries: What happens in our heads?
Today we live in a scientific society. Our body of knowledge is expanding exponentially. But the realm of what we do not know – and perhaps never will know – is not shrinking as our thirst for knowledge grows. There is a huge enigma residing inside our very heads. The physiological parameters of this organ – our brain – are a known quantity. It weighs about 1.5 kilograms, is brownish-grey in appearance and has the consistency of Camembert cheese. Yet from a human perspective, there is no more complex entity in the entire universe. Each of its approximately 100 billion neurons is as powerful as a computer and can communicate with thousands of other neurons via axons and dendrites.
The brain’s output is diverse and enigmatic: perception, cognition, actions, language, memories, emotions, ideas. Understanding their biology will remain the last major challenge of science, according to Eric Kandel, the Austrian-American neuroscientist and Nobel laureate. Some of the questions brain scientists are investigating today were already being asked in antiquity. Where does our consciousness come from – the feature that so clearly distinguishes us from other living creatures? To what extent can we determine our own actions? What makes us mentally ill, and how can we be healed?
However, modern brain research is not driven purely by an unselfish quest for knowledge. If serious conditions such as dementia, strokes, Parkinson’s disease and depression were to become treatable at some point, the economic benefits would be considerable. For this reason, many countries have established multibillion-dollar research programs in which hundreds of scientists from diverse disciplines explore the brain. Supercomputers, Artificial Intelligence and Big Data are being brought to bear, and amateur scientists are making their own contributions through community science projects. Whatever the case, the solitary geniuses of the past have no place in today’s large-scale research operations. The German-American neuroscientist and Nobel laureate Thomas Südhof believes that we understand only five percent of what is happening inside our brains. But even if we don’t know how, our brains can still be trusted to produce the inquisitiveness that has driven us since time immemorial to seek new experiences, gain insights and share knowledge.
Stefanie Hardick, born in 1978, is a freelance journalist specializing in scientific and historical topics.