New studies reveal that whale communication has evolved in ways that mirror human language, challenging established assumptions in linguistics. One study indicates that some whale species match, and may even exceed, human languages in the efficiency with which they convey information, bringing whale communication closer to our own. Another suggests that humpback whale song follows a core statistical structure long considered a hallmark of human language.

Simon Kirby, Professor of Language Evolution at the University of Edinburgh, notes: "These findings challenge old assumptions about the uniqueness of human language and reveal the deep similarities between seemingly distant types of evolution." Understanding these parallels may help researchers interpret animal communication and shed light on how human language itself evolved.

In the first study, computer scientist Mason Youngblood of Stony Brook University and colleagues analyzed communicative efficiency across 51 human languages and 6,511 linguistic features. Youngblood observes that natural selection favors efficient communication, which helps individuals share information quickly and widely. He acknowledges that longer signals can carry more meaning and that repetition aids accurate transmission, but these benefits come at a cost: producing signals takes time and energy, and conspicuous calls can betray an animal to eavesdroppers.

To compare communicative efficiency in humans and whales, Youngblood used two linguistic principles: Menzerath's Law and Zipf's Law. According to Menzerath's Law, the longer a sequence is, the shorter its constituent parts – such as words, syllables, or individual sounds – tend to be.
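Menzerath's Law can be illustrated with a minimal sketch. The data below are invented for the example; real analyses use measured call and song durations rather than toy numbers.

```python
# Toy illustration of Menzerath's law: longer sequences tend to be
# built from shorter parts. Durations are in arbitrary time units
# and are invented for this sketch.

def mean(xs):
    return sum(xs) / len(xs)

# Each "sequence" is a list of constituent durations (e.g. syllable
# lengths). Note that the longer sequences are built from shorter
# constituents, as Menzerath's law predicts.
sequences = [
    [9.0, 8.5],                      # 2 parts, long parts
    [6.0, 6.5, 5.5, 6.0],            # 4 parts, medium parts
    [4.0, 3.5, 4.5, 4.0, 3.5, 4.0],  # 6 parts, short parts
]

# Pair each sequence's length with the mean length of its parts.
pairs = [(len(s), mean(s)) for s in sequences]
means = [m for _, m in pairs]

# Menzerath-consistent: mean part length shrinks as sequences grow.
consistent = all(means[i] > means[i + 1] for i in range(len(means) - 1))
print(pairs)
print("Menzerath-consistent:", consistent)
```

Real studies quantify this with regression or correlation across many sequences per species; the ordering check here is only the simplest possible version of the idea.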
According to Zipf's Law of abbreviation, communication systems become more efficient when frequently used elements – such as words, sounds, and gestures – are shorter.

Youngblood applied both laws to vocal sequences from 16 whale and dolphin species, including baleen whales and several dolphin species, and, for comparison, to 51 human languages. Eleven of the 16 species conformed to Menzerath's Law, matching or even exceeding the efficiency seen in human languages. Among the exceptions were orcas, Hector's dolphins, Commerson's dolphins, and Heaviside's dolphins, as noted by Science Alert. Zipf's Law proved rarer: most species did not exhibit it, and it appeared only in humpback whale song, making humpbacks the only species whose vocalizations followed both human efficiency principles.

In the second study, researchers focused on humpback whale song, applying computational methods used to study how human infants segment words from speech. In human language, word frequencies follow a power law known as the Zipf distribution, a regularity thought to make language easier for children to learn and, as the researchers note, to support the faithful transmission of language across generations.

Humpback song is also hierarchically structured, often building larger units from smaller ones: individual sound units combine into phrases, phrases repeat to form themes, and themes are strung together to build songs. Like human language, whale song is passed on culturally through social learning. The authors reasoned that if the essential statistical properties of human language arise from cultural transmission, similar signatures should appear in whale song.
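Zipf's law of abbreviation admits an equally small sketch. The word counts below are invented for illustration; actual analyses correlate frequency and length statistically across a whole vocabulary or call repertoire.

```python
# Toy illustration of Zipf's law of abbreviation: the more often a
# unit is used, the shorter it tends to be. Counts are invented.

counts = {"a": 120, "the": 95, "whale": 12, "communication": 3}

# Order units from most to least frequent and compare their lengths.
by_freq = sorted(counts, key=counts.get, reverse=True)
lengths = [len(w) for w in by_freq]

# Consistent with the law if length never decreases as frequency drops.
abbreviated = all(lengths[i] <= lengths[i + 1] for i in range(len(lengths) - 1))
print(by_freq, lengths)
print("Zipf-abbreviation-consistent:", abbreviated)
```

With real data one would use a rank correlation (e.g. Spearman's) between frequency and length rather than this strict ordering check, since natural vocabularies are noisy.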
With this idea in mind, they analyzed years of humpback whale song recordings, applying word-segmentation techniques modeled on how human infants break continuous speech into words. The analysis revealed a hidden structure in the songs: they contain recurring subsequences whose frequencies follow a Zipf distribution, a statistical property found in all human languages. Humpback whale song thus shares a core structural feature of human language, offering a glimpse into the origins of complex communication. The researchers add that the lengths of these subsequences also conform to Zipf's Law of abbreviation, with the most frequently used units tending to be shorter.
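The infant-style segmentation the researchers drew on can be sketched with transitional probabilities: a boundary is placed wherever the probability of the next unit given the current one dips. The unit stream, the tiny "lexicon", and the threshold below are all invented for illustration; the actual studies work on coded humpback song units, not letters.

```python
from collections import Counter

# Minimal sketch of infant-style statistical segmentation: compute
# P(next | current) between adjacent units and cut the stream wherever
# that transitional probability dips below a threshold.

# Invented "words" concatenated into an unbroken stream of units.
words = ["abc", "def", "ghi", "abc", "ghi", "def", "abc"]
stream = "".join(words)  # no boundary markers survive concatenation

bigrams = Counter(zip(stream, stream[1:]))
unigrams = Counter(stream[:-1])

def tp(a, b):
    """Transitional probability P(b | a) estimated from the stream."""
    return bigrams[(a, b)] / unigrams[a]

# Segment: start a new chunk when a transition is unpredictable.
chunks, current = [], stream[0]
for a, b in zip(stream, stream[1:]):
    if tp(a, b) < 0.75:   # boundary where predictability dips
        chunks.append(current)
        current = b
    else:
        current += b
chunks.append(current)

# The recovered chunks can then be frequency-ranked, as in the study.
freqs = Counter(chunks)
print(chunks)
print(freqs.most_common())
```

In this toy stream, within-"word" transitions are fully predictable while across-"word" transitions are not, so the original units are recovered exactly; real song data is far noisier, which is why the published work pairs segmentation with statistical tests of the resulting frequency distribution.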
Language Evolution Mirrors Human Adaptation: Study Reveals Deep Linguistic Structures
Edited by: Vera Mo