The seven ages of information

From language to ChatGPT, the revolutions that have shaped the way we process information.

15 min read · Jun 17, 2025

By Francisco Rodrigues, Universidade de São Paulo.

Around 35 million years ago, the African and Arabian tectonic plates began to separate in northeastern Africa, a rifting process that created the Great Rift Valley in present-day Kenya. More than a million years ago, the region’s lakes collected rainfall, providing fresh water and attracting animals. The lakes also formed a natural migration corridor stretching for thousands of kilometres. Early humans lived there by hunting and gathering, alongside many other species. Around 400,000 years ago, however, climate change caused the lakes to dry up and rainfall to become scarce. The large animals that our hominid ancestors had hunted were replaced by smaller species. This intense environmental pressure selected for the best-adapted individuals and ultimately contributed to the emergence of Homo sapiens.

The species of the genus Homo originated in Africa and spread rapidly to all continents, marking the global occupation by the first humans. (Source: Wikipedia)

Homo sapiens was a new and very distinct species with a fundamental characteristic: the ability to communicate through complex language. Based on estimates of phonetic and genetic diversity, it is thought that our species began using language between 150,000 and 300,000 years ago. This enabled us to evolve not only genetically, but also culturally. Therefore, the emergence of language was the first major revolution in human information processing.

Through language, we can express our thoughts and experiences, transmit knowledge and ideas, and strengthen social bonds. Language made it possible to pass down stories, myths, traditions, and knowledge from generation to generation, ensuring the continuity and cultural enrichment of human societies.

For the philosopher and evolutionary thinker Herbert Spencer, language is not merely a means of communication, but also an instrument that reflects and shapes the development of thought, embedded in a broader evolutionary process that spans from the physical universe to human culture. For the philosopher Friedrich Nietzsche, language holds an even more significant role: thought is always mediated by it, and language is unstable and plural. In this sense, according to Nietzsche, language not only helps us describe the world, but also limits our perception of reality, as all thought is shaped through it.

Just as the genus Homo evolved until only one species, Homo sapiens, remained, language also underwent an evolutionary process in which only some languages survived. Against this backdrop, the emergence of the Proto-Indo-European language marks a pivotal moment in human history.

From the Proto-Indo-European language derived multiple linguistic branches, which gave rise to hundreds of languages and dialects still spoken in vast parts of Europe, as well as in Iran, northern India and areas of Central Asia. (Image from the web comic Stand Still, Stay Silent).

This ancient language family has given rise to hundreds of languages and dialects now spoken across vast regions of Europe, Iran, northern India and other parts of Asia. It is considered the largest language family in the world in terms of the number of native speakers and includes languages such as Portuguese, English, Spanish, Hindi, Persian and Russian, to name but a few. It is believed to have originated in the steppes of Eurasia north of the Black Sea — in present-day Ukraine — around six thousand years ago. Between 4000 and 3000 BC, peoples associated with the Yamna (or Yamnaya) culture spread this ancestral language across vast areas of Europe, Central Asia, Iran and the Indian subcontinent. These peoples had mastered the domestication of the horse and the use of wheeled vehicles. For reasons that are still unclear, Indo-European languages prevailed over those previously spoken in these regions, profoundly shaping the linguistic and cultural landscape of the civilisations that developed thereafter. The following map shows how the language spread across Europe and Asia.

The expansion of the Indo-European languages originated in the steppe region north of the Black Sea, in the area of southern Russia and the Caucasus where the Proto-Indo-European peoples lived around 3400 BC. This population dispersed in several migratory waves, taking their languages and cultures with them, and spreading them across much of Europe, Western Asia, and South Asia. (Source: Wikipedia).

The way in which we discovered the spread of the Indo-European language is fascinating. In 1647, Marcus Zuerius van Boxhorn was among the first to propose that Dutch, Greek, Latin, Persian and German all descended from a single ancestral language. In the 18th century, scholars such as Gaston Coeurdoux and, in particular, Sir William Jones in 1786, reinforced this hypothesis by identifying similarities between Sanskrit, Greek, Latin and Persian. They proposed that these languages all shared a common origin. To support this idea, they compared basic and universal words — terms present in all cultures that are less susceptible to rapid change due to migration or invasion. A classic example is the word ‘mother’: in Portuguese, ‘mãe’; in Latin, ‘mater’; in Spanish, ‘madre’; in French, ‘mère’; in English, ‘mother’; in German, ‘mutter’; in Greek, ‘μήτηρ’ (mētēr); in Russian, ‘мать’ (mat’); and in Sanskrit, ‘मातृ’ (mātr). The similarities are obvious. Other words, such as ‘God’ or the names of numbers, also exhibit similar patterns across different languages, further supporting the idea of a common linguistic ancestry.

Just like our DNA, language has evolved over time. However, rather than being stored in nucleic acid molecules, it has been recorded in our brains, meaning that human contact is required for it to be passed on to subsequent generations.

As hunter-gatherers became farmers and founded the first towns and cities, which later evolved into civilisations, it became impossible to store all the information produced using only oral language. Over time, language transcended the limitations of internal thought and began to be recorded on external media. Thus began the second great revolution in the way we process information: the birth of writing.

Cave art from Lascaux, France, dating from the Paleolithic period. (Source: Wikipedia)

The earliest examples of this type of record date back around 40,000 years, in the form of cave paintings such as those found at Lascaux in France and Altamira in Spain. Through these images, humans first externalised language by reproducing the world around them, creating the first method of preserving information that would stand the test of time. From these drawings, basic forms of pictographic writing developed, in which each symbol represented a particular object. The oldest records we have access to use drawings to represent concrete objects: a stylised ox’s head, for example, symbolises the animal itself, as can be seen below.

Evolution of the letter A.

Writing emerged from the rather mundane needs of the bureaucracies of the ancient empires of the Fertile Crescent, which covered the regions of Mesopotamia (modern-day Iraq, Syria and Turkey) and Egypt. Around 3200 BC, cuneiform writing was created in Sumer for administrative and economic purposes, such as recording goods and transactions. This system used a triangular-tipped stylus to engrave symbols on wet clay tablets — hence the name ‘cuneiform’ (from the Latin cuneus, meaning ‘wedge’). The picture below shows a clay tablet with cuneiform writing.

This clay tablet features cuneiform writing and depicts a Sumerian contract detailing the sale of a field and a house, dating back to around 2600 BC. (Source: Wikipedia)

These tablets have withstood the test of time, and today, we have numerous records that provide insights into life during that period. Over the centuries, cuneiform writing was also used to record literary, religious, and legal texts.

One important historical figure from this period was Princess Enheduana, daughter of King Sargon of Akkad. She is widely regarded as the world’s first known female writer. A priestess and poet of the Akkadian Empire, she lived in the city of Ur around 2300 BC and wrote hymns, poems and prayers in cuneiform script. Her most famous works are the Hymns to Inanna and the Exaltation of Inanna (Nin-me-šar-ra), which combine religious devotion with personal and political reflections. These were the first autographed texts not related to bureaucratic records.

The texts of Princess Enheduana can be accessed via the Electronic Text Corpus of Sumerian Literature platform at Oxford University.

Writing has undergone countless transformations over time. One of the most significant advances occurred with the Phoenicians, who inhabited the regions of present-day Lebanon, Syria, and northern Israel. As discussed earlier, writing systems were initially based on pictograms representing concrete objects. Over time, these symbols became more abstract, representing sounds such as syllables. However, the great revolution came with the Phoenician alphabet. It simplified writing by reducing it to a lean set of symbols representing the sounds (phonemes) of speech rather than ideas or whole objects. In other words, each symbol corresponded to a specific sound, making the system much more efficient, flexible, and easier to learn. The following figure shows the Phoenician alphabet and its modern-day counterparts.

The Phoenician alphabet and its modern-day characters.

This innovation was driven primarily by intense commercial and maritime development in the Mediterranean, which necessitated a swift and accessible form of written communication ideal for contracts, records, and negotiations. As trade grew, the Phoenician alphabet spread throughout Europe and Asia, directly influencing the writing systems of the Greeks, Romans, Arabs and Hebrews. Nowadays, almost all of the world’s major languages use variations of this alphabetic model, with a few exceptions such as Chinese logographic characters, Japanese syllabaries, Korean Hangul, and the writing systems developed by ancient African and American peoples.

One curious aspect of writing is our impressive ability to read at high speed. However, the human brain was not biologically shaped for reading. In fact, it is the shapes of letters that have adapted to the way we process visual information. In other words, we did not evolve to read — writing evolved to be read. Throughout history, the shape of letters has evolved to align with human visual and cognitive abilities, making reading and writing easier.

Writing gave birth to history, as it provided us with access to the records and thoughts of ancient peoples. For many centuries, scribes played a fundamental role in various kingdoms, storing information that could be consulted in the future. However, the invention of the movable type printing press by Johannes Gutenberg at the end of the Middle Ages marked the third major revolution in the way we process and store information. Between 1439 and 1440, he developed a printing machine that used letters and symbols cast in metals such as lead, tin, and antimony. This allowed the letters to be rearranged to form different texts, and multiple copies to be produced quickly.

Artist’s view of Johannes Gutenberg in his workshop, showing his first proof sheet.

The system was based on movable type arranged on a board and covered in linseed oil-based ink. This was then pressed against the paper using a manual press similar to those used for squeezing grapes in winemaking. This technique enabled mass printing to be carried out much more quickly and cheaply than manual copying by scribes. Although the concept of printing with movable type had existed in China since the 11th century, Gutenberg created an improved system featuring durable metal type and a quick-drying, sticky ink. His printing press enabled knowledge to be disseminated on a large scale, ushering in a new era of information democratisation.

This marks the end of the third phase in the evolution of information. Through language, human beings developed a means of transmitting knowledge orally from one generation to the next, thereby creating complex social ties and collective memories. Writing externalised information, freeing it from the limitations of time and space so that it could be stored and consulted by individuals separated by great temporal and geographical distances. The Gutenberg press marked the beginning of the era of mass reproduction of information. For the first time, written knowledge could be copied on a large scale with fidelity, accelerating its dissemination, standardisation, and accessibility.

The fourth information revolution began in the 19th century with the invention of Morse code and the development of the electric telegraph in 1837. For the first time, information could travel faster than a locomotive, severing the dependence of communication on physical transportation. This breakthrough dematerialised information, making it possible to transmit it almost instantaneously over long distances in the form of electrical signals. To achieve this, Samuel Morse created a binary system of short and long pulses. He developed the code with his partner, Alfred Lewis Vail, who visited a newspaper in Morristown, New Jersey, and manually counted the types (the pieces of metal bearing letters used in printing) to estimate the frequency of letters in the English language. Based on this data, the shortest codes were assigned to the most frequent letters, such as E and T, while the longest codes were reserved for the least common ones, such as Q and J. This was one of the earliest examples of informational compression: code lengths optimised according to statistical frequency. As the figure below shows, the most common characters use the shortest codes.

Morse code, created around 1837 for use in transmitting messages using the telegraph. For example, the SOS message in Morse is given by ···−−−···.
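Vail’s logic can be checked directly: frequent letters get short codes, rare ones long codes. A minimal Python sketch (the Morse codes below are standard; the grouping of letters into "frequent" and "rare" is an illustrative assumption, not Vail’s actual counts):

```python
# International Morse codes for a handful of letters.
MORSE = {
    "E": ".",    "T": "-",    "A": ".-",   "O": "---",
    "N": "-.",   "I": "..",   "S": "...",
    "Q": "--.-", "J": ".---", "Z": "--..",
}

# Assumed illustrative grouping: common vs. rare letters in English text.
FREQUENT = ["E", "T", "A", "O", "I", "N", "S"]
RARE = ["Z", "Q", "J"]

def avg_len(letters):
    """Average number of dots/dashes used to encode the given letters."""
    return sum(len(MORSE[c]) for c in letters) / len(letters)

print(avg_len(FREQUENT))  # 2.0 signs on average
print(avg_len(RARE))      # 4.0 signs on average
```

The frequent group averages two signs per letter against four for the rare group: exactly the statistical compression described above.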

In the 20th century, the idea of using binary codes, first proposed by Morse, paved the way for the development of computers. Just as Morse code uses sequences of short and long signs to represent letters and numbers, computers began using sequences of bits — zeros and ones — to encode data and instructions. As digital communication advanced, Claude Shannon proposed the concept of information entropy in 1948. This measures the uncertainty or unpredictability associated with a set of possible messages. This was the first mathematical formalisation of the concept of information.
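The parallel with Morse is direct: a computer encodes every character as a fixed pattern of bits. A one-line Python illustration using ASCII:

```python
# In ASCII, the letter 'A' is the number 65, stored as the 8-bit pattern below.
print(format(ord("A"), "08b"))  # 01000001
```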

Shannon’s definition of entropy, which revolutionized information processing by computers.

Shannon defined the amount of information carried by an event as inversely related to its probability; that is to say, the less likely an event is to occur, the more information it conveys. For example, the roll of a fair die carries more information than the toss of a fair coin, because the die has six equally likely outcomes while the coin has only two. The value shown by the die (any number from 1 to 6) is harder to predict than the outcome of the coin toss (heads or tails): numerically, the die roll conveys log₂ 6 ≈ 2.58 bits, whereas the coin toss conveys just 1 bit.
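Shannon’s entropy, H = −Σᵢ pᵢ log₂ pᵢ, can be computed directly. A short Python sketch comparing the coin and the die:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]   # fair coin: two equally likely outcomes
die = [1 / 6] * 6   # fair die: six equally likely outcomes

print(entropy(coin))  # 1.0 bit
print(entropy(die))   # log2(6) ≈ 2.585 bits
```

The die’s higher entropy formalises the intuition above: a more unpredictable source conveys more information per outcome.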

Computers have made it possible to store and transmit large amounts of information quickly and efficiently, ushering in the fifth information revolution. It became possible, for example, to store an entire Bible (around 1,200–1,500 printed pages) on three standard 3.5-inch floppy disks, each with a capacity of 1.44 MB. Processing capacity evolved rapidly from the first personal computers of the 1980s, such as the IBM PC, following Moore’s law, which states that the number of transistors on a chip doubles every 18 to 24 months while the cost per transistor falls. This trend carried through to the 2000s and the arrival of the first smartphones; the iPhone was launched in 2007.
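The floppy-disk claim can be sanity-checked with back-of-the-envelope arithmetic. In this sketch, the page count is taken from the range cited above, while the characters-per-page figure and the one-byte-per-character encoding are illustrative assumptions:

```python
# Assumption: ~3,000 characters per densely printed page, 1 byte each (plain ASCII).
pages = 1_300                         # within the 1,200-1,500 range cited
chars_per_page = 3_000                # assumed figure for illustration
text_bytes = pages * chars_per_page   # ~3.9 million bytes of raw text

floppy_bytes = 1.44 * 1_000_000       # nominal "1.44 MB" capacity, simplified
total_capacity = 3 * floppy_bytes     # three disks: ~4.32 million bytes

print(text_bytes <= total_capacity)   # True: the plain text fits
```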

Advances in computer technology and the internet have led to the sixth information revolution, which began with the rise of social networks in the early 2000s. Morse code and the telegraph enabled signals to be transmitted over long distances, and the internet built on these innovations. With easier data transmission and processing, social networks emerged naturally.

Platforms such as Facebook (2004), Twitter/X (2006), YouTube (2005) and, more recently, TikTok (2016) have transformed the way information is produced and circulated. Unlike previous media, which mostly operated on a one-to-many communication model, social networks have established a many-to-many dynamic in which each user can be a producer, curator, and consumer of content simultaneously.

This technological revolution has significantly increased the ability to share knowledge and participate in public life, but it has also brought unprecedented challenges. The vast amount of content and the lack of institutional filters have made it easier to spread false, misleading and malicious information, which has had a profound impact on public debate and consensus-building. Additionally, the decentralisation of information sources has fragmented epistemic authority, which was previously concentrated in institutions such as universities, newspapers, and publishing houses. As Eli Pariser argues in The Filter Bubble (2011), platforms’ personalisation algorithms create information bubbles in which users are mostly exposed to content that reinforces their pre-existing beliefs. This phenomenon is exacerbated by the psychological mechanism of cognitive dissonance (Leon Festinger, 1957), whereby individuals reject conflicting information and favour information that confirms their values and identities. In other words, although we now have access to unprecedented volumes of information, disinformation is also proliferating, acting as noise amidst knowledge. The evolution of social networks presents challenges that are still being debated and understood.

The evolution of social networks over time.

The seventh informational leap is represented by artificial intelligence (AI) in the 21st century. Systems such as large language models (LLMs), virtual assistants, and machine learning algorithms can store and distribute information, as well as process, generate, and transform it autonomously. AI ushers in a new paradigm in which information is produced not only by humans for humans, but also by machines for humans and other machines. This advance was so significant that John Hopfield and Geoffrey Hinton were awarded the 2024 Nobel Prize in Physics for their fundamental contributions to the development of artificial neural networks, the basis of modern AI models. The 2024 Nobel Prize in Chemistry, in turn, went to David Baker for computational protein design, and to Demis Hassabis and John Jumper of Google DeepMind for developing AlphaFold: an AI model that predicts protein structures with unprecedented accuracy, solving a 50-year-old challenge in molecular biology. In short, AI has ushered in a new era of scientific discovery.

Language models have evolved rapidly since their popularisation following ChatGPT’s release to the general public in late 2022, giving rise to systems such as DeepSeek, Claude, Gemini and Perplexity, among others. Meanwhile, generative image and video models such as DALL-E and Veo have emerged, capable of creating visual content with unprecedented speed and quality. These tools have profoundly transformed the way we produce, access and interact with information, accelerating creative processes, automating tasks, and blurring the lines between human authorship and algorithmic generation.

Although language models operate in a way that is fundamentally different to the human brain, they have demonstrated their ability to imitate various aspects of human behaviour with remarkable accuracy. Operating with artificial neural networks and large volumes of data enables these systems to learn linguistic and conceptual patterns, allowing them to respond, argue and even create fluidly. While they lack consciousness or intentionality, their ability to perform challenging cognitive tasks suggests a new form of artificial intelligence that brings machines closer to human language and reasoning than ever before.

In fact, recent studies indicate that models such as ChatGPT-3.5 and Gemini Pro Vision 1.0 spontaneously develop internal conceptual representation structures that are similar to human mental representations (see ‘Human-like object concept representations emerge naturally in multimodal large language models’, Nature Machine Intelligence). Analysis of neuroimaging data revealed a correlation between the internal activations of the models and patterns of brain activity in regions such as the fusiform gyrus and the parahippocampal cortex, which are associated with visual and conceptual processing. These findings suggest that these systems are evolving rapidly, pointing to a profound but as yet uncertain transformation in the way we process and understand information.

Although we do not yet fully understand the implications of the seventh information revolution, it is clear that it is already having a profound impact on the way we communicate, work and generate knowledge. While companies and institutions are racing to adapt to this new era, society as a whole is still unprepared to deal with its impacts.

Sam Altman, the CEO of OpenAI, has suggested that the revolution brought about by artificial intelligence may be so profound that it requires a new social contract — a new structure of economic and social organisation capable of addressing the challenges and opportunities presented by this technological advancement. In his words, this would require establishing ‘a floor for everyone, in exchange for a ceiling for no one’: ensuring a minimum standard of well-being, dignity, and access to opportunities for all, without imposing artificial barriers to individual success.

This vision is based on the idea that technology can create a virtuous cycle of economic growth and shared prosperity if it is guided properly.

“A change in the social contract will be necessary.” — Sam Altman, CEO of OpenAI

Two striking trends emerge when we look at the history of information processing: the amount of available information is growing rapidly and the intervals between revolutions are becoming progressively shorter. Initially, information exchange occurred through language, involving only a few individuals in interactions limited by time and space. Today, social media allows us to receive information from thousands of people in real time, regardless of geographical or temporal boundaries. With the advancement of artificial intelligence, information now originates not only from human beings, but also from machines that are capable of shaping — and sometimes distorting — our perception of reality.

If we analyse the time intervals between information revolutions, we can see that hundreds of thousands of years passed between the emergence of language and the invention of writing, several millennia between writing and the printing press, a few centuries between the printing press and the telegraph, a few decades between the telegraph and social media, and just a few years between social media and large language models. If this trend continues, the next revolution is very near, perhaps even imminent. While we don’t yet know what will come next, it is almost certain that, just as we are still adapting to current technologies, we will not be fully prepared for what lies ahead. Therefore, the future remains uncertain.

In 1900, it was predicted that by the year 2000, teachers would use machines to inject the contents of books directly into students’ brains. This image illustrates how predictions can often differ greatly from what actually comes to pass. (Source: Public Domain).

In any case, when we look to the past, we see that every improvement in our ability to process information has largely been associated with a better quality of life, whether through scientific progress, technological development or fairer legislation. Although we are still far from building a truly equal society, we live in a substantially more civilised world than in earlier times. Therefore, despite uncertainty about what the future holds, history itself provides grounds for optimism that it may be even better than the present.

If you’re curious about my research, visit this link:

See you next time!

To find out more:


Written by Francisco Rodrigues, PhD

Scientist. Nonfiction author. Professor of Data Science and Complex Systems at the University of São Paulo. https://linktr.ee/francisco.rodrigues
