Category Archives: Cognitive Science
Morphology Morphology is about what happens to words to change their structure, impacting their meaning and usage. In English, we add -s or -es at the end of words to make them plural (guy -> guys, time -> times). Japanese, on the other hand, uses reduplication (hito -> hitobito, toki -> tokidoki) to make words plural. Adding to words, affixation, has three […]
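The two pluralization strategies mentioned above can be sketched in a few lines. This is a minimal illustration, not a full morphological analyzer: the English rule covers only the common -s/-es alternation, and the Japanese sketch does plain doubling (the attested form hitobito also involves rendaku, the voicing of the second element's initial consonant, which is ignored here).

```python
def pluralize_english(word):
    """Affixation: add -es after sibilant-like endings, else -s."""
    if word.endswith(("s", "x", "z", "ch", "sh")):
        return word + "es"
    return word + "s"

def reduplicate_japanese(word):
    """Reduplication: repeat the word (rendaku voicing ignored)."""
    return word + word

print(pluralize_english("guy"))      # guys
print(pluralize_english("time"))     # times
print(reduplicate_japanese("hito"))  # hitohito (hitobito with rendaku)
```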
Topographical maps of concepts in a text provide useful views of language. Fortuna et al., in Semantic Knowledge Management (pp. 155-169), describe how three-dimensional topic maps can give meaningful insights into clusters of related content, such as news stories or published papers. I have frequently stressed the importance of concept associations in the brain, in cognitive […]
Posted by Joe Roushar in Cognitive Science, Communication, Knowledge, Learning, Linguistics, Memory, Ontology, Perception, Philosophy, and Translation. Comments Off on Three-Dimensional Model of Language
The Paired Model By pairing language strata, we attempt to find or describe symmetrical structures in language, thus helping clarify one of the most abstract phenomena known to man: verbal communication. This pairing of characteristics is also useful in decomposing the problem into smaller chunks to make it easier for computers to deal with. A note […]
Layered Model Just as the brain has areas with three to six distinct layers, a typical artificial neural system (ANS) also has several layers. The example at right shows a network with three layers that illustrate a neural network's distributed architecture. The uniform circles connected by lines are symbolic of the state of an ANS at […]
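A three-layer network of the kind described can be sketched as a simple forward pass: activations flow from the input layer through a hidden layer to the output, with every unit connected to every unit in the next layer. The layer sizes and weight values below are illustrative assumptions, not trained values.

```python
import math

def sigmoid(x):
    """Standard squashing activation mapping any real input to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    """Propagate activations layer by layer through the weight matrices."""
    hidden = [sigmoid(sum(i * w for i, w in zip(inputs, weights)))
              for weights in w_hidden]
    return [sigmoid(sum(h * w for h, w in zip(hidden, weights)))
            for weights in w_output]

# 2 inputs -> 3 hidden units -> 1 output unit (illustrative weights)
w_hidden = [[0.5, -0.5], [1.0, 1.0], [-1.0, 0.5]]
w_output = [[0.3, 0.6, -0.9]]
out = forward([1.0, 0.0], w_hidden, w_output)
```

The "distributed" character of the architecture shows up in the fact that no single weight encodes the answer; the output depends on every connection.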
If at first you don’t succeed, try – try again. Humans are pretty good at learning from our mistakes. In fact, some suggest that whatever doesn’t kill you makes you stronger. Today I’d like to riff on that theme a bit, and talk about ways in which machines can implement learning from errors. Error Minimization […]
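Error minimization can be illustrated with the simplest possible case: repeatedly nudging a single weight to shrink the squared error between a prediction and its target, which is the core move of gradient descent. The values below are illustrative, not taken from the post.

```python
def train(x, target, w=0.0, lr=0.1, steps=100):
    """Learn w so that w * x approximates target, by descending
    the gradient of the squared error (w * x - target) ** 2."""
    for _ in range(steps):
        error = w * x - target   # signed error of the current prediction
        w -= lr * error * x      # step against the gradient of error^2
    return w

w = train(x=2.0, target=6.0)
print(round(w, 3))  # -> 3.0
```

Each mistake directly determines the correction: the larger the error, the larger the adjustment, which is the machine analogue of learning from failure.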
A Different Approach The previous section explained why we need the ground-up formulation of a new grammar for Natural Language (NL) understanding: none of the existing ones can accommodate the high demands of modern technology or approximate the paradigm of artificial intelligence as a model for human linguistic competence and performance across all linguistic phenomena. In […]
Words and Sentences The game of Scrabble is completely about words. Crossword puzzles go from sentence or phrase to word. Our analytical approach begins with the word, but doesn’t stop there. From both the cognitive and computational perspectives, the sentence exhibits far more complexity and changeability than the word. At any given moment, the structure of […]
Linguistic Building Blocks While languages are infinite, each has a finite number of structures, functions and attributes. Functions and attributes are the building blocks of a grammar. Grammars or languages are categorized as regular, context-free (CF), context-sensitive (CS), recursive, and recursively enumerable. A context-sensitive grammar is a powerful formalism that describes the language in terms of […]
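The grammar classes named above differ in the string patterns they can capture, and the classic examples are easy to show directly. The language aⁿbⁿ is context-free but not regular, while aⁿbⁿcⁿ is context-sensitive but not context-free; the recognizers below check membership directly rather than via grammar rules, as a minimal sketch.

```python
def is_anbn(s):
    """Recognize a^n b^n: context-free, but beyond any regular grammar."""
    n = len(s) // 2
    return len(s) == 2 * n and s == "a" * n + "b" * n

def is_anbncn(s):
    """Recognize a^n b^n c^n: context-sensitive, beyond any CF grammar."""
    n = len(s) // 3
    return len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

print(is_anbn("aabb"))      # True
print(is_anbncn("aabbcc"))  # True
print(is_anbncn("aabbc"))   # False
```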
What is Traditional Grammar? Grammars provide the knowledge and rules necessary to understand or disambiguate the often ambiguous strings of words that constitute language. People disambiguate by searching available knowledge (Nirenburg, 1987). Because a traditional grammar specifies a finite set of rules or patterns which attempt to capture the regularities of a language (Grosz, 1986), it […]
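The idea of a grammar as a finite set of rules capturing a language's regularities can be made concrete with a toy phrase-structure grammar and a recursive recognizer. The rules and lexicon below are illustrative assumptions, not drawn from the post.

```python
# Toy phrase-structure rules: each nonterminal expands to a sequence
# of nonterminals or part-of-speech categories.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "saw": "V"}

def parse(symbol, words):
    """Return the words left over after matching `symbol`, or None."""
    if symbol not in RULES:                      # part-of-speech category
        if words and LEXICON.get(words[0]) == symbol:
            return words[1:]
        return None
    for rhs in RULES[symbol]:                    # try each expansion
        rest = words
        for sym in rhs:
            rest = parse(sym, rest)
            if rest is None:
                break
        else:
            return rest
    return None

def is_sentence(words):
    """A sentence is an S expansion that consumes every word."""
    return parse("S", words) == []

print(is_sentence("the dog saw the cat".split()))  # True
```

A finite rule set like this licenses infinitely many sentences, which is precisely the regularity-capturing role of a traditional grammar; it is also where the exceptions discussed next start to cause trouble.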
Exceptions to the Rules In natural language, exceptions to rules of grammar and other characteristics are frequent, and exhibit little consistency or predictability. They may multiply in discourse and in situations where the speaker/writer’s competence is limited. Idioms and irregular verbs are common exceptions which can be quite difficult to categorize or describe using formal descriptive grammar. Further exceptions […]
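A common computational treatment of such exceptions is an explicit lookup table consulted before the regular rule applies: irregular forms are listed, everything else falls through to the productive pattern. The entries below are a few familiar examples, not an exhaustive list.

```python
# Exceptions listed explicitly; the regular -ed rule handles the rest.
IRREGULAR_PAST = {"go": "went", "eat": "ate", "run": "ran", "be": "was"}

def past_tense(verb):
    if verb in IRREGULAR_PAST:   # exception: irregular form wins
        return IRREGULAR_PAST[verb]
    if verb.endswith("e"):       # regular rule, spelling variant
        return verb + "d"
    return verb + "ed"

print(past_tense("go"))    # went
print(past_tense("walk"))  # walked
```

The asymmetry is the point: the regular rule is one line, while the exception list can only grow by enumeration, which is why idioms and irregular verbs resist compact formal description.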