21 Aug 2019
By Wikipedia

Semantics

Semantics is the study of linguistic meaning. It examines what meaning is, how words get their meaning, and how the meaning of a complex expression depends on its parts. Part of this process involves the distinction between sense and reference. Sense is given by the ideas and concepts associated with an expression while reference is the object to which an expression points. Semantics contrasts with syntax, which studies the rules that dictate how to create grammatically correct sentences, and pragmatics, which investigates how people use language in communication.

Lexical semantics is the branch of semantics that studies word meaning. It examines whether words have one or several meanings and in what lexical relations they stand to one another. Phrasal semantics studies the meaning of sentences by exploring the phenomenon of compositionality or how new meanings can be created by arranging words. Formal semantics relies on logic and mathematics to provide precise frameworks of the relation between language and meaning. Cognitive semantics examines meaning from a psychological perspective and assumes a close relation between language ability and the conceptual structures used to understand the world. Other branches of semantics include conceptual semantics, computational semantics, and cultural semantics.

Theories of meaning are general explanations of the nature of meaning and how expressions are endowed with it. According to referential theories, the meaning of an expression is the part of reality to which it points. Ideational theories identify meaning with mental states like the ideas that an expression evokes in the minds of language users. According to causal theories, meaning is determined by causes and effects, which behaviorist semantics analyzes in terms of stimulus and response. Further theories of meaning include truth-conditional semantics, verificationist theories, the use theory, and inferentialist semantics.

The study of semantic phenomena began during antiquity but was not recognized as an independent field of inquiry until the 19th century. Semantics is relevant to the fields of formal logic, computer science, and psychology.

Semantics is the study of meaning in languages. It is a systematic inquiry that examines what linguistic meaning is and how it arises. It investigates how expressions are built up from different layers of constituents, like morphemes, words, clauses, sentences, and texts, and how the meanings of the constituents affect one another. Semantics can focus on a specific language, like English, but in its widest sense, it investigates meaning structures relevant to all languages. As a descriptive discipline, it aims to determine how meaning works without prescribing what meaning people should associate with particular expressions. Some of its key questions are "How do the meanings of words combine to create the meanings of sentences?", "How do meanings relate to the minds of language users, and to the things words refer to?", and "What is the connection between what a word means, and the contexts in which it is used?". The main disciplines engaged in semantics are linguistics, semiotics, and philosophy. Besides its meaning as a field of inquiry, semantics can also refer to theories within this field, like truth-conditional semantics, and to the meaning of particular expressions, like the semantics of the word fairy.

As a field of inquiry, semantics has both an internal and an external side. The internal side is interested in the connection between words and the mental phenomena they evoke, like ideas and conceptual representations. The external side examines how words refer to objects in the world and under what conditions a sentence is true.

Many related disciplines investigate language and meaning. Semantics contrasts with other subfields of linguistics focused on distinct aspects of language. Phonology studies the different types of sounds used in languages and how sounds are connected to form words while syntax examines the rules that dictate how to arrange words to create sentences. These divisions are reflected in the fact that it is possible to master some aspects of a language while lacking others, like when a person knows how to pronounce a word without knowing its meaning. As a subfield of semiotics, semantics has a narrower focus on meaning in language while semiotics studies both linguistic and non-linguistic signs. Semiotics investigates additional topics like the meaning of non-verbal communication, conventional symbols, and natural signs independent of human interaction. Examples include nodding to signal agreement, stripes on a uniform signifying rank, and the presence of vultures indicating a nearby animal carcass.

Semantics further contrasts with pragmatics, which is interested in how people use language in communication. An expression like "That's what I'm talking about" can mean many things depending on who says it and in what situation. Semantics is interested in the possible meanings of expressions: what they can and cannot mean in general. In this regard, it is sometimes defined as the study of context-independent meaning. Pragmatics examines which of these possible meanings is relevant in a particular case. In contrast to semantics, it is interested in actual performance rather than in the general linguistic competence underlying this performance. This includes the topic of additional meaning that can be inferred even though it is not literally expressed, like what it means if a speaker remains silent on a certain topic. A closely related distinction by the semiotician Charles W. Morris holds that semantics studies the relation between words and the world, pragmatics examines the relation between words and users, and syntax focuses on the relation between different words.

Semantics is related to etymology, which studies how words and their meanings changed in the course of history. Another connected field is hermeneutics, which is the art or science of interpretation and is concerned with the right methodology of interpreting text in general and scripture in particular. Metasemantics examines the metaphysical foundations of meaning and aims to explain where it comes from or how it arises.

The word semantics originated from the Ancient Greek adjective semantikos, meaning 'relating to signs', which is a derivative of sēmeion, the noun for 'sign'. It was initially used for medical symptoms and only later acquired its wider meaning regarding any type of sign, including linguistic signs. The word semantics entered the English language from the French term sémantique, which the linguist Michel Bréal first introduced at the end of the 19th century.

Basic concepts

Meaning

Semantics studies meaning in language, which is limited to the meaning of linguistic expressions. It concerns how signs are interpreted and what information they contain. An example is the meaning of words provided in dictionary definitions by giving synonymous expressions or paraphrases, like defining the meaning of the term ram as adult male sheep. There are many forms of non-linguistic meaning that are not examined by semantics. Actions and policies can have meaning in relation to the goal they serve. Fields like religion and spirituality are interested in the meaning of life, which is about finding a purpose in life or the significance of existence in general.

[Image: a dictionary]
Semantics is not focused on subjective speaker meaning and is instead interested in public meaning, like the meaning found in general dictionary definitions.

Linguistic meaning can be analyzed on different levels. Word meaning is studied by lexical semantics, which investigates the denotation of individual words. It is often related to concepts of entities, like how the word dog is associated with the concept of the four-legged domestic animal. Sentence meaning falls into the field of phrasal semantics and concerns the denotation of full sentences. It usually expresses a concept applying to a type of situation, as in the sentence "the dog has ruined my blue skirt". The meaning of a sentence is often referred to as a proposition. Different sentences can express the same proposition, like the English sentence "the tree is green" and the German sentence "der Baum ist grün". Utterance meaning is studied by pragmatics and is about the meaning of an expression on a particular occasion. Sentence meaning and utterance meaning come apart in cases where expressions are used in a non-literal way, as is often the case with irony.

Semantics is primarily interested in the public meaning that expressions have, like the meaning found in general dictionary definitions. Speaker meaning, by contrast, is the private or subjective meaning that individuals associate with expressions. It can diverge from the literal meaning, like when a person associates the word needle with pain or drugs.

Sense and reference

[Image: bust of Gottlob Frege]
The distinction between sense and reference was first introduced by the philosopher Gottlob Frege.

Meaning is often analyzed in terms of sense and reference, also referred to as intension and extension or connotation and denotation. The referent of an expression is the object to which the expression points. The sense of an expression is the way in which it refers to that object or how the object is interpreted. For example, the expressions morning star and evening star refer to the same planet, just like the expressions 2 + 2 and 3 + 1 refer to the same number. The meanings of these expressions differ not on the level of reference but on the level of sense. Sense is sometimes understood as a mental phenomenon that helps people identify the objects to which an expression refers. Some semanticists focus primarily on sense or primarily on reference in their analysis of meaning. To grasp the full meaning of an expression, it is usually necessary to understand both to what entities in the world it refers and how it describes them.

The distinction between sense and reference can explain identity statements, which can be used to show how two expressions with a different sense have the same referent. For instance, the sentence "the morning star is the evening star" is informative and people can learn something from it. The sentence "the morning star is the morning star", by contrast, is an uninformative tautology since the expressions are identical not only on the level of reference but also on the level of sense.

Compositionality

Compositionality is a key aspect of how languages construct meaning. It is the idea that the meaning of a complex expression is a function of the meanings of its parts. It is possible to understand the meaning of the sentence "Zuzana owns a dog" by understanding what the words Zuzana, owns, a and dog mean and how they are combined. In this regard, the meaning of complex expressions like sentences is different from word meaning since it is normally not possible to deduce what a word means by looking at its letters and one needs to consult a dictionary instead.
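The compositional derivation of sentence meaning can be sketched in code. The following toy model (the entities, denotations, and grammar rule are all invented for illustration) computes the truth value of sentences of the form "X owns a Y" purely from the meanings of the parts:

```python
# A minimal sketch of compositionality (model and denotations are invented).
# Names denote individuals, common nouns denote sets of individuals,
# and transitive verbs denote sets of (subject, object) pairs.
denotation = {
    "Zuzana": "zuzana",
    "dog": {"rex"},
    "cat": {"mittens"},
    "owns": {("zuzana", "rex")},  # in this model, Zuzana owns the dog Rex
}

def sentence_meaning(subject, verb, noun):
    """Truth value of '<subject> <verb> a <noun>', derived from the parts."""
    subj = denotation[subject]
    return any((subj, obj) in denotation[verb] for obj in denotation[noun])

print(sentence_meaning("Zuzana", "owns", "dog"))  # True
print(sentence_meaning("Zuzana", "owns", "cat"))  # False
```

The point of the sketch is that once the word denotations and the combination rule are fixed, the meaning of any sentence of this form follows mechanically.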

Compositionality is often used to explain how people can formulate and understand an almost infinite number of meanings even though the number of words and cognitive resources is finite. Many sentences that people read are ones they have never encountered before, and they are nonetheless able to understand them.

When interpreted in a strong sense, the principle of compositionality states that the meaning of a complex expression is not just affected by its parts and how they are combined but fully determined this way. It is controversial whether this claim is correct or whether additional aspects influence meaning. For example, context may affect the meaning of expressions; idioms like "kick the bucket" carry figurative or non-literal meanings that are not directly reducible to the meanings of their parts.

Truth and truth conditions

Truth is a property of statements that accurately present the world; true statements are in accord with reality. Whether a statement is true usually depends on the relation between the statement and the rest of the world. The truth conditions of a statement are the way the world needs to be for the statement to be true. For example, it belongs to the truth conditions of the sentence "it is raining outside" that raindrops are falling from the sky. The sentence is true if it is used in a situation in which the truth conditions are fulfilled, i.e., if there is actually rain outside.

Truth conditions play a central role in semantics and some theories rely exclusively on truth conditions to analyze meaning. To understand a statement usually implies that one has an idea about the conditions under which it would be true. This can happen even if one does not know whether the conditions are fulfilled.
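The idea that understanding a sentence means knowing its truth conditions can be sketched by modeling truth conditions as a function from world states to truth values (the world representation below is a toy assumption for illustration):

```python
# Truth conditions as a function from world states to truth values
# (the dictionary representation of a "world" is a toy assumption).
def it_is_raining_outside(world):
    """Truth conditions of 'it is raining outside'."""
    return world.get("weather") == "rain"

# One can grasp the function (the truth conditions) without knowing
# which of these worlds is the actual one.
print(it_is_raining_outside({"weather": "rain"}))  # True
print(it_is_raining_outside({"weather": "sun"}))   # False
```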

Semiotic triangle

[Image: diagram of the semiotic triangle]
The semiotic triangle aims to explain how the relation between language (Symbol) and world (Referent) is mediated by the language users (Thought or Reference).

The semiotic triangle, also called the triangle of meaning, is a model used to explain the relation between language, language users, and the world, represented in the model as Symbol, Thought or Reference, and Referent. The symbol is a linguistic signifier, either in its spoken or written form. The central idea of the model is that there is no direct relation between a linguistic expression and what it refers to, as was assumed by earlier dyadic models. This is expressed in the diagram by the dotted line between symbol and referent.

The model holds instead that the relation between the two is mediated through a third component. For example, the term apple stands for a type of fruit but there is no direct connection between this string of letters and the corresponding physical object. The relation is only established indirectly through the mind of the language user. When they see the symbol, it evokes a mental image or a concept, which establishes the connection to the physical object. This process is only possible if the language user has previously learned the meaning of the symbol. The meaning of a specific symbol is governed by the conventions of a particular language. The same symbol may refer to one object in one language, to another object in a different language, and to no object in another language.

Others

Many other concepts are used to describe semantic phenomena. The semantic role of an expression is the function it fulfills in a sentence. In the sentence "the boy kicked the ball", the boy has the role of the agent who performs an action. The ball is the theme or patient of this action as something that does not act itself but is involved in or affected by the action. The same entity can be both agent and patient, like when someone cuts themselves. An entity has the semantic role of an instrument if it is used to perform the action; for instance, when something is cut with a knife, the knife is the instrument. For some sentences, no action is described but an experience takes place, like when a girl sees a bird. In this case, the girl has the role of the experiencer. Other common semantic roles are location, source, goal, beneficiary, and stimulus.

Lexical relations describe how words stand to one another. Two words are synonyms if they share the same or a very similar meaning, like car and automobile or buy and purchase. Antonyms have opposite meanings, such as the contrast between alive and dead or fast and slow. One term is a hyponym of another term if the meaning of the first term is included in the meaning of the second term. For example, ant is a hyponym of insect. A prototype is a hyponym that has characteristic features of the type it belongs to. A robin is a prototype of a bird but a penguin is not. Two words with the same pronunciation are homophones, like flour and flower, while two words with the same spelling are homonyms, like a bank of a river in contrast to a bank as a financial institution. Hyponymy is closely related to meronymy, which describes the relation between part and whole. For instance, wheel is a meronym of car. An expression is ambiguous if it has more than one possible meaning. In some cases, it is possible to disambiguate an expression to discern the intended meaning. The term polysemy is used if the different meanings are closely related to one another, like the meanings of the word head, which can refer to the topmost part of the human body or the top-ranking person in an organization.
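Lexical relations like these are often stored as small graphs over the vocabulary. The sketch below (a toy lexicon with invented, non-exhaustive entries) encodes hypernym links and checks hyponymy by following them, which also shows that hyponymy is transitive:

```python
# Toy lexicon of lexical relations (entries are illustrative, not exhaustive).
# Each word maps to its immediate hypernym (the more general term).
hypernym_of = {"ant": "insect", "insect": "animal", "robin": "bird"}

def is_hyponym(word, candidate):
    """A word is a hyponym of any term reachable via hypernym links."""
    while word in hypernym_of:
        word = hypernym_of[word]
        if word == candidate:
            return True
    return False

print(is_hyponym("ant", "insect"))   # True: direct hypernym
print(is_hyponym("ant", "animal"))   # True: ant -> insect -> animal
print(is_hyponym("robin", "insect")) # False
```

Lexical databases such as WordNet organize large vocabularies along exactly these kinds of relations.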

The meaning of words can often be subdivided into meaning components called semantic features. The word horse has the semantic feature animate but lacks the semantic feature human. It may not always be possible to fully reconstruct the meaning of a word by identifying all its semantic features.
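Semantic features are often treated as binary components, which can be sketched as feature dictionaries (the feature inventory below is illustrative):

```python
# Semantic features as binary meaning components (features are illustrative).
features = {
    "horse": {"animate": True, "human": False},
    "girl":  {"animate": True, "human": True},
    "stone": {"animate": False, "human": False},
}

def shares_feature(word1, word2, feature):
    """Check whether two words agree on a given semantic feature."""
    return features[word1][feature] == features[word2][feature]

print(shares_feature("horse", "girl", "animate"))  # True: both animate
print(shares_feature("horse", "girl", "human"))    # False: they differ
```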

A semantic or lexical field is a group of words that are all related to the same activity or subject. For instance, the semantic field of cooking includes words like bake, boil, spice, and pan.

The context of an expression refers to the situation or circumstances in which it is used and includes time, location, speaker, and audience. It also encompasses other passages in a text that come before and after it. Context affects the meaning of various expressions, like the deictic expression here and the anaphoric expression she.

A syntactic environment is extensional or transparent if it is always possible to exchange expressions with the same reference without affecting the truth value of the sentence. For example, the environment of the sentence "the number 8 is even" is extensional because replacing the expression the number 8 with the number of planets in the solar system does not change its truth value. For intensional or opaque contexts, this type of substitution is not always possible. For instance, the embedded clause in "Paco believes that the number 8 is even" is intensional since Paco may not know that the number of planets in the solar system is 8.
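The substitution test for extensional versus intensional contexts can be made concrete in code. In the sketch below (the encoded facts are purely illustrative), the extensional context depends only on the referent, while the belief context is keyed by the expression itself, so substituting co-referring terms changes the result:

```python
# Sketch of extensional vs. intensional contexts (facts are illustrative).
reference = {"the number 8": 8, "the number of planets": 8}

def is_even_sentence(term):
    """Extensional context: truth depends only on the term's referent."""
    return reference[term] % 2 == 0

# Beliefs are stored under the expression itself (its sense), not its referent.
paco_believes = {"the number 8 is even"}

def paco_believes_even(term):
    """Intensional context: substituting co-referring terms may change truth."""
    return f"{term} is even" in paco_believes

# Substitution preserves truth in the extensional context...
print(is_even_sentence("the number 8"), is_even_sentence("the number of planets"))
# ...but not in the intensional one.
print(paco_believes_even("the number 8"), paco_believes_even("the number of planets"))
```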

Semanticists commonly distinguish the language they study, called object language, from the language they use to express their findings, called metalanguage. When a professor uses Japanese to teach their student how to interpret the language of first-order logic then the language of first-order logic is the object language and Japanese is the metalanguage. The same language may occupy the role of object language and metalanguage at the same time. This is the case in monolingual English dictionaries, in which both the entry term belonging to the object language and the definition text belonging to the metalanguage are taken from the English language.

Branches

Lexical semantics

Lexical semantics is the sub-field of semantics that studies word meaning. It examines semantic aspects of individual words and the vocabulary as a whole. This includes the study of lexical relations between words, such as whether two terms are synonyms or antonyms. Lexical semantics categorizes words based on semantic features they share and groups them into semantic fields unified by a common subject. This information is used to create taxonomies to organize lexical knowledge, for example, by distinguishing between physical and abstract entities and subdividing physical entities into stuff and individuated entities. Further topics of interest are polysemy, ambiguity, and vagueness.

Lexical semantics is sometimes divided into two complementary approaches: semasiology and onomasiology. Semasiology starts from words and examines what their meaning is. It is interested in whether words have one or several meanings and how those meanings are related to one another. Instead of going from word to meaning, onomasiology goes from meaning to word. It starts with a concept and examines what names this concept has or how it can be expressed in a particular language.

Some semanticists also include the study of lexical units other than words in the field of lexical semantics. Compound expressions like being under the weather have a non-literal meaning that acts as a unit and is not a direct function of their parts. Another topic concerns the meaning of morphemes that make up words, for instance, how negative prefixes like in- and dis- affect the meaning of the words they are part of, as in inanimate and dishonest.

Phrasal semantics

Phrasal semantics studies the meaning of sentences. It relies on the principle of compositionality to explore how the meaning of complex expressions arises from the combination of their parts. The different parts can be analyzed as subject, predicate, or argument. The subject of a sentence usually refers to a specific entity while the predicate describes a feature of the subject or an event in which the subject participates. Arguments provide additional information to complete the predicate. For example, in the sentence "Mary hit the ball", Mary is the subject, hit is the predicate, and the ball is an argument. A more fine-grained categorization distinguishes between different semantic roles of words, such as agent, patient, theme, location, source, and goal.

[Image: a constituency-based parse tree]
Parse trees, like this constituency-based parse tree, show how expressions are combined to form sentences.

Verbs usually function as predicates and often help to establish connections between different expressions to form a more complex meaning structure. In the expression "Beethoven likes Schubert", the verb like connects a liker to the object of their liking. Other sentence parts modify meaning rather than form new connections. For instance, the adjective red modifies the color of another entity in the expression red car. A further compositional device is variable binding, which is used to determine the reference of a term. For example, the last part of the expression "the woman who likes Beethoven" specifies which woman is meant. Parse trees can be used to show the underlying hierarchy employed to combine the different parts. Various grammatical devices, like the gerund form, also contribute to meaning and are studied by grammatical semantics.

Formal semantics

Formal semantics uses formal tools from logic and mathematics to analyze meaning in natural languages. It aims to develop precise logical formalisms to clarify the relation between expressions and their denotation. One of its key tasks is to provide frameworks of how language represents the world, for example, using ontological models to show how linguistic expressions map to the entities of that model. A common idea is that words refer to individual objects or groups of objects while sentences relate to events and states. Sentences are mapped to a truth value based on whether their description of the world is in correspondence with its ontological model.

Formal semantics further examines how to use formal mechanisms to represent linguistic phenomena such as quantification, intensionality, noun phrases, plurals, mass terms, tense, and modality. Montague semantics is an early and influential theory in formal semantics that provides a detailed analysis of how the English language can be represented using mathematical logic. It relies on higher-order logic, lambda calculus, and type theory to show how meaning is created through the combination of expressions belonging to different syntactic categories.
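The Montague-style idea that meaning is assembled by function application can be sketched with curried functions. In the toy model below (who likes whom is invented for illustration), a transitive verb denotes a function that takes the object, then the subject, and returns a truth value:

```python
# Montague-style sketch: a transitive verb as a curried function
# (the model of who likes whom is invented for illustration).
likes_facts = {("Beethoven", "Schubert")}

def likes(obj):
    """Denotation of 'likes': takes the object first, then the subject."""
    return lambda subj: (subj, obj) in likes_facts

# Meaning of 'Beethoven likes Schubert' by two function applications:
print(likes("Schubert")("Beethoven"))  # True
print(likes("Beethoven")("Schubert"))  # False: the relation is not symmetric
```

Full Montague semantics assigns such typed functions to every syntactic category, so that the semantic derivation mirrors the syntactic one.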

Dynamic semantics is a subfield of formal semantics that focuses on how information grows over time. According to it, "meaning is context change potential": the meaning of a sentence is not given by the information it contains but by the information change it brings about relative to a context.
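The slogan "meaning is context change potential" can be sketched by modeling a context as the set of worlds still considered possible and a sentence as a function that eliminates worlds (the worlds and facts below are invented for illustration):

```python
# Toy sketch of dynamic semantics: sentence meaning as a context update.
# A context is the set of worlds still considered possible.
worlds = [
    {"raining": True,  "cold": True},
    {"raining": True,  "cold": False},
    {"raining": False, "cold": True},
]

def update(context, proposition):
    """Asserting a sentence eliminates the worlds incompatible with it."""
    return [w for w in context if proposition(w)]

context = update(worlds, lambda w: w["raining"])    # 'It is raining.'
context = update(context, lambda w: not w["cold"])  # 'It is not cold.'
print(len(context))  # 1: only one world remains compatible with the discourse
```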

Cognitive semantics

[Image: diagram of a hypotenuse]
Cognitive semantics is interested in the conceptual structures underlying language, which can be articulated through the contrast between profile and base. For instance, the term hypotenuse profiles a straight line against the background of a right-angled triangle.

Cognitive semantics studies the problem of meaning from a psychological perspective or how the mind of the language user affects meaning. As a subdiscipline of cognitive linguistics, it sees language as a broad cognitive ability that is closely related to the conceptual structures used to understand and represent the world. Cognitive semanticists do not draw a sharp distinction between linguistic knowledge and knowledge of the world and see them instead as interrelated phenomena. They study how the interaction between language and human cognition affects the conceptual organization in very general domains like space, time, causation, and action. The contrast between profile and base is sometimes used to articulate the underlying knowledge structure. The profile of a linguistic expression is the aspect of the knowledge structure that it brings to the foreground while the base is the background that provides the context of this aspect without being at the center of attention. For example, the profile of the word hypotenuse is a straight line while the base is a right-angled triangle of which the hypotenuse forms a part.

Cognitive semantics further compares the conceptual patterns and linguistic typologies across languages and considers to what extent the cognitive conceptual structures of humans are universal or relative to their linguistic background. Another research topic concerns the psychological processes involved in the application of grammar. Other investigated phenomena include categorization, which is understood as a cognitive heuristic to avoid information overload by regarding different entities in the same way, and embodiment, which concerns how the language user's bodily experience affects the meaning of expressions.

Frame semantics is an important subfield of cognitive semantics. Its central idea is that the meanings of terms cannot be understood in isolation from each other but need to be analyzed against the background of the conceptual structures they depend on. These structures are made explicit in terms of semantic frames. For example, words like bride, groom, and honeymoon evoke in the mind the frame of marriage.

Others

Conceptual semantics shares with cognitive semantics the idea of studying linguistic meaning from a psychological perspective by examining how humans conceptualize and experience the world. It holds that meaning is not about the objects to which expressions refer but about the cognitive structure of human concepts that connect thought, perception, and action. Conceptual semantics differs from cognitive semantics by introducing a strict distinction between meaning and syntax and by relying on various formal devices to explore the relation between meaning and cognition.

Computational semantics examines how the meaning of natural language expressions can be represented and processed on computers. It often relies on the insights of formal semantics and applies them to problems that can be computationally solved. Some of its key problems include computing the meaning of complex expressions by analyzing their parts, handling ambiguity, vagueness, and context-dependence, and using the extracted information in automatic reasoning. It forms part of computational linguistics, artificial intelligence, and cognitive science. Its applications include machine learning and machine translation.

Cultural semantics studies the relation between linguistic meaning and culture. It compares conceptual structures in different languages and is interested in how meanings evolve and change because of cultural phenomena associated with politics, religion, and customs. For example, address practices encode cultural values and social hierarchies, as in the politeness distinction between tu and usted in Spanish or du and Sie in German, in contrast to English, which lacks these distinctions and uses the pronoun you in either case. Closely related fields are intercultural semantics, cross-cultural semantics, and comparative semantics.

Pragmatic semantics studies how the meaning of an expression is shaped by the situation in which it is used. It is based on the idea that communicative meaning is usually context-sensitive and depends on who participates in the exchange, what information they share, and what their intentions and background assumptions are. It focuses on communicative actions, of which linguistic expressions only form one part. Some theorists include these topics within the scope of semantics while others consider them part of the distinct discipline of pragmatics.

Theories of meaning

Theories of meaning explain what meaning is, what meaning an expression has, and how the relation between expression and meaning is established.

Referential

[Image: diagram of referential theories]
Referential theories identify meaning with the entities to which expressions point.

Referential theories state that the meaning of an expression is the entity to which it points. The meaning of singular terms like names is the individual to which they refer. For example, the meaning of the name George Washington is the person with this name. General terms refer not to a single entity but to the set of objects to which this term applies. In this regard, the meaning of the term cat is the set of all cats. Similarly, verbs usually refer to classes of actions or events and adjectives refer to properties of individuals and events.

Simple referential theories face problems with meaningful expressions that have no clear referent. Names like Pegasus and Santa Claus have meaning even though they do not point to existing entities. Other difficulties concern cases in which different expressions are about the same entity. For instance, the expressions Roger Bannister and the first man to run a four-minute mile refer to the same person but do not mean exactly the same thing. This is particularly relevant when talking about beliefs since a person may understand both expressions without knowing that they point to the same entity. A further problem is given by expressions whose meaning depends on the context, like the deictic terms here and I.

To avoid these problems, referential theories often introduce additional devices. Some identify meaning not directly with objects but with functions that point to objects. This additional level has the advantage of taking the context of an expression into account since the same expression may point to one object in one context and to another object in a different context. For example, the reference of the word here depends on the location in which it is used. A closely related approach is possible world semantics, which allows expressions to refer not only to entities in the actual world but also to entities in other possible worlds. According to this view, expressions like the first man to run a four-minute mile refer to different persons in different worlds. This view can also be used to analyze sentences that talk about what is possible or what is necessary: possibility is what is true in some possible worlds while necessity is what is true in all possible worlds.
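Possible world semantics treats possibility as truth in some world and necessity as truth in all worlds, which can be sketched as quantification over a set of world models (the worlds and facts below are invented for illustration):

```python
# Possible-world sketch: possibility and necessity as quantification over
# worlds (the worlds and the facts they encode are invented for illustration).
worlds = [
    {"first_four_minute_miler": "Roger Bannister"},  # the actual outcome
    {"first_four_minute_miler": "John Landy"},       # an alternative world
]

def possibly(prop):
    """A proposition is possible if it holds in at least one world."""
    return any(prop(w) for w in worlds)

def necessarily(prop):
    """A proposition is necessary if it holds in every world."""
    return all(prop(w) for w in worlds)

def bannister_first(world):
    return world["first_four_minute_miler"] == "Roger Bannister"

print(possibly(bannister_first))     # True: holds in at least one world
print(necessarily(bannister_first))  # False: fails in another world
```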

Ideational

[Image: diagram of ideational theories]
Ideational theories identify meaning with the mental states of language users.

Ideational theories, also called mentalist theories, are not primarily interested in the reference of expressions and instead explain meaning in terms of the mental states of language users. One historically influential approach articulated by John Locke holds that expressions stand for ideas in the speaker's mind. According to this view, the meaning of the word dog is the idea that people have of dogs. Language is seen as a medium used to transfer ideas from the speaker to the audience. Once speaker and audience have learned the same meanings of signs, the speaker can produce a sign that corresponds to the idea in their mind, and the perception of this sign evokes the same idea in the mind of the audience.

A closely related theory focuses not directly on ideas but on intentions. This view is particularly associated with Paul Grice, who observed that people usually communicate to cause some reaction in their audience. He held that the meaning of an expression is given by the intended reaction. This means that communication is not just about decoding what the speaker literally said but requires an understanding of their intention or why they said it. For example, telling someone looking for petrol that "there is a garage around the corner" has the meaning that petrol can be obtained there because of the speaker's intention to help. This goes beyond the literal meaning, which has no explicit connection to petrol.

Causal

Causal theories hold that the meaning of an expression depends on the causes and effects it has. According to behaviorist semantics, also referred to as stimulus-response theory, the meaning of an expression is given by the situation that prompts the speaker to use it and the response it provokes in the audience. For instance, the meaning of yelling "Fire!" is given by the presence of an uncontrolled fire and attempts to control it or seek safety. Behaviorist semantics relies on the idea that learning a language consists in adopting behavioral patterns in the form of stimulus-response pairs. One of its key motivations is to avoid private mental entities and define meaning instead in terms of publicly observable language behavior.

Another causal theory focuses on the meaning of names and holds that a naming event is required to establish the link between name and named entity. This naming event acts as a form of baptism that establishes the first link of a causal chain in which all subsequent uses of the name participate. According to this view, the name Plato refers to an ancient Greek philosopher because, at some point, he was originally named this way and people kept using this name to refer to him. This view was originally formulated by Saul Kripke to apply to names only but has been extended to cover other types of speech as well.

Others

Truth-conditional semantics analyzes the meaning of sentences in terms of their truth conditions. According to this view, to understand a sentence means to know what the world needs to be like for the sentence to be true. Truth conditions can themselves be expressed through possible worlds. For example, the sentence "Hillary Clinton won the 2016 American presidential election" is false in the actual world but there are some possible worlds in which it is true. The extension of a sentence can be interpreted as its truth value while its intension is the set of all possible worlds in which it is true. Truth-conditional semantics is closely related to verificationist theories, which introduce the additional idea that there should be some kind of verification procedure to assess whether a sentence is true. They state that the meaning of a sentence consists in the method to verify it or in the circumstances that justify it. For instance, scientific claims often make predictions, which can be used to confirm or disconfirm them using observation. According to verificationism, sentences that can neither be verified nor falsified are meaningless.

The use theory states that the meaning of an expression is given by the way it is utilized. This view was first introduced by Ludwig Wittgenstein, who understood language as a collection of language games. The meaning of expressions depends on how they are used inside a game and the same expression may have different meanings in different games. Some versions of this theory identify meaning directly with patterns of regular use. Others focus on social norms and conventions by additionally taking into account whether a certain use is considered appropriate in a given society.

Inferentialist semantics, also called conceptual role semantics, holds that the meaning of an expression is given by the role it plays in the premises and conclusions of good inferences. For example, one can infer from "x is a male sibling" that "x is a brother" and one can infer from "x is a brother" that "x has parents". According to inferentialist semantics, the meaning of the word brother is determined by these and all similar inferences that can be drawn.

History

Semantics was established as an independent field of inquiry in the 19th century but the study of semantic phenomena began as early as the ancient period as part of philosophy and logic. In ancient Greece, Plato (427–347 BCE) explored the relation between names and things in his dialogue Cratylus. It considers the positions of naturalism, which holds that things have their name by nature, and conventionalism, which states that names are related to their referents by customs and conventions among language users. The book On Interpretation by Aristotle (384–322 BCE) introduced various conceptual distinctions that greatly influenced subsequent works in semantics. He developed an early form of the semantic triangle by holding that spoken and written words evoke mental concepts, which refer to external things by resembling them. For him, mental concepts are the same for all humans, unlike the conventional words they associate with those concepts. The Stoics incorporated many of the insights of their predecessors to develop a complex theory of language through the perspective of logic. They discerned different kinds of words by their semantic and syntactic roles, such as the contrast between names, common nouns, and verbs. They also discussed the difference between statements, commands, and prohibitions.

Painting of Bhartṛhari
Bhartṛhari developed and compared various semantic theories of the meaning of words.

In ancient India, the orthodox school of Nyaya held that all names refer to real objects. It explored how words lead to an understanding of the thing meant and what consequence this relation has to the creation of knowledge. Philosophers of the orthodox school of Mīmāṃsā discussed the relation between the meanings of individual words and full sentences while considering which one is more basic. The book Vākyapadīya by Bhartṛhari (4th–5th century CE) distinguished between different types of words and considered how they can carry different meanings depending on how they are used. In ancient China, the Mohists argued that names play a key role in making distinctions to guide moral behavior. They inspired the School of Names, which explored the relation between names and entities while examining how names are required to identify and judge entities.

Statue of Abelard
One of Peter Abelard's innovations was his focus on the meaning of full sentences rather than the meaning of individual words.

In the Middle Ages, Augustine of Hippo (354–430) developed a general conception of signs as entities that stand for other entities and convey them to the intellect. He was the first to introduce the distinction between natural and linguistic signs as different types belonging to a common genus. Boethius (480–528) wrote a translation of and various comments on Aristotle's book On Interpretation, which popularized its main ideas and inspired reflections on semantic phenomena in the scholastic tradition. An innovation in the semantics of Peter Abelard (1079–1142) was his interest in propositions or the meaning of sentences in contrast to the focus on the meaning of individual words by many of his predecessors. He further explored the nature of universals, which he understood as mere semantic phenomena of common names caused by mental abstractions that do not refer to any entities. In the Arabic tradition, Ibn Faris (920–1004) identified meaning with the intention of the speaker while Abu Mansur al-Azhari (895–980) held that meaning resides directly in speech and needs to be extracted through interpretation.

An important topic towards the end of the Middle Ages was the distinction between categorematic and syncategorematic terms. Categorematic terms have an independent meaning and refer to some part of reality, like horse and Socrates. Syncategorematic terms lack independent meaning and fulfill other semantic functions, such as modifying or quantifying the meaning of other expressions, like the words some, not, and necessarily. An early version of the causal theory of meaning was proposed by Roger Bacon (c. 1219/20 – c. 1292), who held that things get names similar to how people get names through some kind of initial baptism. His ideas inspired the tradition of the speculative grammarians, who proposed that there are certain universal structures found in all languages. They arrived at this conclusion by drawing an analogy between the modes of signification on the level of language, the modes of understanding on the level of mind, and the modes of being on the level of reality.

In the early modern period, Thomas Hobbes (1588–1679) distinguished between marks, which people use privately to recall their own thoughts, and signs, which are used publicly to communicate their ideas to others. In their Port-Royal Logic, Antoine Arnauld (1612–1694) and Pierre Nicole (1625–1695) developed an early precursor of the distinction between intension and extension. The Essay Concerning Human Understanding by John Locke (1632–1704) presented an influential version of the ideational theory of meaning, according to which words stand for ideas and help people communicate by transferring ideas from one mind to another. Gottfried Wilhelm Leibniz (1646–1716) understood language as the mirror of thought and tried to conceive the outlines of a universal formal language to express scientific and philosophical truths. This attempt inspired theorists Christian Wolff (1679–1754), Georg Bernhard Bilfinger (1693–1750), and Johann Heinrich Lambert (1728–1777) to develop the idea of a general science of sign systems. Étienne Bonnot de Condillac (1715–1780) accepted and further developed Leibniz's idea of the linguistic nature of thought. Against Locke, he held that language is involved in the creation of ideas and is not merely a medium to communicate them.

Photo of Michel Jules Alfred Bréal
Michel Bréal coined the French term sémantique and conceptualized the scope of this field of inquiry.

In the 19th century, semantics emerged and solidified as an independent field of inquiry. Christian Karl Reisig (1792–1829) is sometimes credited as the father of semantics since he clarified its concept and scope while also making various contributions to its key ideas. Michel Bréal (1832–1915) followed him in providing a broad conception of the field, for which he coined the French term sémantique. John Stuart Mill (1806–1873) gave great importance to the role of names to refer to things. He distinguished between the connotation and denotation of names and held that propositions are formed by combining names. Charles Sanders Peirce (1839–1914) conceived semiotics as a general theory of signs with several subdisciplines, which were later identified by Charles W. Morris (1901–1979) as syntactics, semantics, and pragmatics. In his pragmatist approach to semantics, Peirce held that the meaning of conceptions consists in the entirety of their practical consequences. The philosophy of Gottlob Frege (1848–1925) contributed to semantics on many different levels. Frege first introduced the distinction between sense and reference, and his development of predicate logic and the principle of compositionality formed the foundation of many subsequent developments in formal semantics. Edmund Husserl (1859–1938) explored meaning from a phenomenological perspective by considering the mental acts that endow expressions with meaning. He held that meaning always implies reference to an object and that expressions lacking a referent, like "green is or", are meaningless.

In the 20th century, Alfred Tarski (1901–1983) defined truth in formal languages through his semantic theory of truth, which was influential in the development of truth-conditional semantics by Donald Davidson (1917–2003). Tarski's student Richard Montague (1930–1971) formulated a complex formal framework of the semantics of the English language, which was responsible for establishing formal semantics as a major area of research. According to structural semantics, which was inspired by the structuralist philosophy of Ferdinand de Saussure (1857–1913), language is a complex network of structural relations and the meanings of words are not fixed individually but depend on their position within this network. The theory of general semantics was developed by Alfred Korzybski (1879–1950) as an inquiry into how language represents reality and affects human thought. The contributions of George Lakoff (1941–present) and Ronald Langacker (1942–present) provided the foundation of cognitive semantics. Charles J. Fillmore (1929–2014) developed frame semantics as a major approach in this area. The closely related field of conceptual semantics was inaugurated by Ray Jackendoff (1945–present).

In various disciplines

Logic

Logicians study correct reasoning and often develop formal languages to express arguments and assess their correctness. One part of this process is to provide a semantics for a formal language to precisely define what its terms mean. A semantics of a formal language is a set of rules, usually expressed as a mathematical function, that assigns meanings to formal language expressions. For example, the language of first-order logic uses lowercase letters for individual constants and uppercase letters for predicates. To express the sentence "Bertie is a dog", the formula D(b) can be used, where b is an individual constant for Bertie and D is a predicate for dog. Classical model-theoretic semantics assigns meaning to these terms by defining an interpretation function that maps individual constants to specific objects and predicates to sets of objects or tuples. The function maps b to Bertie and D to the set of all dogs. This way, it is possible to calculate the truth value of the sentence: it is true if Bertie is a member of the set of dogs and false otherwise.
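The interpretation function for the dog example can be sketched directly: constants map to objects, predicates map to sets, and an atomic formula is true when the object named by the constant belongs to the set assigned to the predicate. The domain and the extra individuals are invented for illustration:

```python
# Sketch of a model-theoretic interpretation for the example D(b):
# b names Bertie, D denotes the set of dogs (domain invented for illustration).
domain = {"Bertie", "Felix", "Rex"}  # domain of discourse

interpretation = {
    "b": "Bertie",           # individual constant -> object in the domain
    "D": {"Bertie", "Rex"},  # predicate -> set of objects (the dogs)
}

def evaluate_atomic(predicate, constant, interp):
    """D(b) is true iff the object named by b belongs to the set assigned to D."""
    return interp[constant] in interp[predicate]

print(evaluate_atomic("D", "b", interpretation))  # True: Bertie is a dog
```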

Formal logic aims to determine whether arguments are deductively valid, that is, whether the premises entail the conclusion. Entailment can be defined in terms of syntax or in terms of semantics. Syntactic entailment, expressed with the symbol ⊢, relies on rules of inference, which can be understood as procedures to transform premises and arrive at a conclusion. These procedures only take the logical form of the premises on the level of syntax into account and ignore what meaning they express. Semantic entailment, expressed with the symbol ⊨, looks at the meaning of the premises, in particular, at their truth value. A conclusion follows semantically from a set of premises if the truth of the premises ensures the truth of the conclusion, that is, if any semantic interpretation function that assigns the premises the value true also assigns the conclusion the value true.
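For propositional formulas, semantic entailment can be checked by brute force: enumerate every truth assignment and verify that none makes all premises true while making the conclusion false. A minimal sketch, with formulas represented as functions from assignments to truth values (the representation is an illustrative choice, not a standard API):

```python
from itertools import product

def entails(premises, conclusion, atoms):
    """premises |= conclusion iff every assignment that makes all
    premises true also makes the conclusion true."""
    for values in product([True, False], repeat=len(atoms)):
        assignment = dict(zip(atoms, values))
        if all(p(assignment) for p in premises) and not conclusion(assignment):
            return False  # counterexample found
    return True

# Modus ponens: p, p -> q |= q
p = lambda a: a["p"]
p_implies_q = lambda a: (not a["p"]) or a["q"]
q = lambda a: a["q"]

print(entails([p, p_implies_q], q, ["p", "q"]))  # True
print(entails([q], p, ["p", "q"]))               # False: q can be true while p is false
```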

Computer science

In computer science, the semantics of a program is how it behaves when a computer runs it. Semantics contrasts with syntax, which is the particular form in which instructions are expressed. The same behavior can usually be described with different forms of syntax. In JavaScript, this is the case for the commands i += 1 and i = i + 1, which are syntactically different expressions to increase the value of the variable i by one. This difference is also reflected in different programming languages since they rely on different syntax but can usually be employed to create programs with the same behavior on the semantic level.
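The same point can be made in Python, which likewise offers an augmented-assignment form alongside explicit addition; both pieces of syntax denote the same behavior:

```python
# Two syntactically different statements with identical semantics:
i = 0
i += 1      # augmented assignment
j = 0
j = j + 1   # explicit addition

print(i == j)  # True: different syntax, same resulting state
```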

Static semantics focuses on semantic aspects that affect the compilation of a program. In particular, it is concerned with detecting errors of syntactically correct programs, such as type errors, which arise when an operation receives an incompatible data type. This is the case, for instance, if a function performing a numerical calculation is given a string instead of a number as an argument. Dynamic semantics focuses on the run time behavior of programs, that is, what happens during the execution of instructions. The main approaches to dynamic semantics are denotational, axiomatic, and operational semantics. Denotational semantics relies on mathematical formalisms to describe the effects of each element of the code. Axiomatic semantics uses deductive logic to analyze which conditions must be in place before and after the execution of a program. Operational semantics interprets the execution of a program as a series of steps, each involving the transition from one state to another state.
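The operational view can be illustrated with a toy interpreter in which running a program is literally a series of state transitions. The instruction set and all names below are invented for the example:

```python
# Sketch of small-step operational semantics for a toy language:
# each step consumes one instruction and transitions to a new state.
def step(program, state):
    """Execute one instruction; return the remaining program and the new state."""
    instr, *rest = program
    op, var, value = instr
    new_state = dict(state)
    if op == "set":
        new_state[var] = value
    elif op == "add":
        new_state[var] = state[var] + value
    return rest, new_state

def run(program, state):
    # Execution is the sequence of transitions until no instructions remain.
    while program:
        program, state = step(program, state)
    return state

print(run([("set", "i", 0), ("add", "i", 1), ("add", "i", 1)], {}))  # {'i': 2}
```

A denotational account of the same toy language would instead assign each instruction a mathematical function from states to states and compose those functions.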

Psychology

Psychological semantics examines psychological aspects of meaning. It is concerned with how meaning is represented on a cognitive level and what mental processes are involved in understanding and producing language. It further investigates how meaning interacts with other mental processes, such as the relation between language and perceptual experience. Other issues concern how people learn new words and relate them to familiar things and concepts, how they infer the meaning of compound expressions they have never heard before, how they resolve ambiguous expressions, and how semantic illusions lead them to misinterpret sentences.

One key topic is semantic memory, which is a form of general knowledge of meaning that includes the knowledge of language, concepts, and facts. It contrasts with episodic memory, which records events that a person experienced in their life. The comprehension of language relies on semantic memory and the information it carries about word meanings. According to a common view, word meanings are stored and processed in relation to their semantic features. The feature comparison model states that sentences like "a robin is a bird" are assessed on a psychological level by comparing the semantic features of the word robin with the semantic features of the word bird. The assessment process is fast if their semantic features are similar, which is the case if the example is a prototype of the general category. For atypical examples, as in the sentence "a penguin is a bird", there is less overlap in the semantic features and the psychological process is significantly slower.
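The feature comparison model can be sketched by representing word meanings as sets of semantic features and measuring their overlap; the particular feature sets below are invented for illustration, not drawn from the psychological literature:

```python
# Illustrative feature comparison: word meanings as sets of semantic
# features (the feature sets are invented for the example).
features = {
    "bird":    {"has_feathers", "lays_eggs", "flies", "sings"},
    "robin":   {"has_feathers", "lays_eggs", "flies", "sings"},
    "penguin": {"has_feathers", "lays_eggs", "swims"},
}

def overlap(word, category):
    """Share of the category's features also present in the word."""
    shared = features[word] & features[category]
    return len(shared) / len(features[category])

print(overlap("robin", "bird"))    # 1.0: prototypical, fast verification
print(overlap("penguin", "bird"))  # 0.5: atypical, slower verification
```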

See also

References

Notes

  1. ^ The study of meaning structures found in all languages is sometimes referred to as universal semantics.
  2. ^ Semantics usually focuses on natural languages but it can also include the study of meaning in formal languages, like the language of first-order logic and programming languages.
  3. ^ Antonym is an antonym of synonym.
  4. ^ Some linguists use the term homonym for both phenomena.
  5. ^ Some authors use the term compositional semantics for this type of inquiry.
  6. ^ The term formal semantics is sometimes used in a different sense to refer to compositional semantics or to the study of meaning in the formal languages of systems of logic.
  7. ^ Cognitive semantics does not accept the idea of linguistic relativity associated with the Sapir–Whorf hypothesis and holds instead that the underlying cognitive processes responsible for conceptual structures are independent of the language one speaks.
  8. ^ Other examples are the word island, which profiles a landmass against the background of the surrounding water, and the word uncle, which profiles a human adult male against the background of kinship relations.
  9. ^ A possible world is a complete way of how things could have been.
  10. ^ The history of semantics is different from historical semantics, which studies how the meanings of words change through time.
  11. ^ Some theorists use the term structural semantics in a different sense to refer to phrasal semantics.
  12. ^ Some theorists use the term psychosemantics to refer to this discipline while others understand the term in a different sense.

Sources