10 Apr Talking About Computational Linguistics
Massive strides have been made in the cognitive definition of thought, perception, culture and language. The interaction or kinship of all these elements of the human experience is also better defined. This combined work has increased our self-awareness and provided the basis for synthesizing and modeling automata to extend our abilities and increase the effectiveness of our work. In the language arena, work in natural language processing (NLP) has provided the basis for computer aids, some of which respond to NL input and execute various applications while others assist in the actual study of language. There are many tools in the figurative toolbox, but none that accurately interpret the intent of the speaker through intelligent dialog. Posts I plan for this section will establish a framework for moving forward.
|Understanding Context Cross-Reference|
|Click on these links to other posts and glossary/bibliography references|

|Prior Post|Next Post|
|Abstract Contexts and Fuzzy Reasoning|Are You My Grammar?|
|cognitive linguistics|Hirst 1987; Chomsky 1968|
|perception, thought|Miller 1976; Allen 1987|
|automata, NLP|Grishman 1986; Winograd 1983|
NLP has many facets and application areas. The development and comparison of grammar formalisms for parsing NL have traditionally received the most attention. The objective has been to develop an NLP paradigm that enables computers to rapidly disambiguate polysemous (having multiple meanings) text. Robust automatic understanding of NL input has been achieved, up to now, only at a great cost in computer resources and time. Even the best NL systems have acknowledged shortcomings.
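To make "polysemous" concrete, here is a minimal sketch of how a program might choose one sense of an ambiguous word from its surrounding context. The sense labels and cue words below are invented for illustration; they are not drawn from any real lexicon or system.

```python
# Toy word-sense disambiguation: "bank" has multiple senses, and the
# program picks the one whose cue words best overlap the context.
# (All senses and cues here are hypothetical, for illustration only.)

SENSES = {
    "bank": {
        "financial institution": {"money", "deposit", "loan"},
        "river edge": {"river", "water", "fishing"},
    }
}

def disambiguate(word, context_words):
    """Pick the sense whose cue words overlap most with the context."""
    best_sense, best_overlap = None, -1
    for sense, cues in SENSES[word].items():
        overlap = len(cues & set(context_words))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("bank", ["deposit", "money"]))  # financial institution
print(disambiguate("bank", ["river", "fishing"]))  # river edge
```

Real systems face far harder cases, of course: the cues themselves are ambiguous, and intent often depends on the whole discourse, not a bag of nearby words.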
If these shortcomings can be overcome, the only way to do so will be to understand how humans produce and comprehend language. We do not yet fully understand this, but we can build models based on all the constraints we do understand today.
Components of Computational Linguistics
- A Language (Finnish, Farsi…)
- Lexicon or Dictionary
- Grammar (set of rules for analyzing and forming text)
- Parser (program code that uses the grammar rules to analyze text)
- Generator (program that creates correct text using the grammar rules)
- A Voice Analyzer (audio input)
- A Speech Synthesizer (audio output)
- Rules and Conditions
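A minimal sketch of how several of these components (lexicon, grammar, parser, generator) might fit together. The words, tags, and the single grammar rule are all invented for illustration; no real NLP system is this simple.

```python
# Toy pipeline: lexicon + grammar + parser + generator.
# Everything here is a deliberately tiny illustration.

LEXICON = {  # Lexicon: maps words to parts of speech
    "the": "Det", "dog": "Noun", "dogs": "Noun",
    "barks": "Verb", "bark": "Verb",
}

# Grammar: one rule for analyzing/forming a simple sentence pattern.
GRAMMAR = [("Det", "Noun", "Verb")]  # e.g. "the dog barks"

def parse(text):
    """Parser: use the lexicon and grammar rules to analyze text.
    Returns the tag sequence if the sentence fits a rule, else None."""
    tags = tuple(LEXICON.get(w.lower(), "Unknown") for w in text.split())
    return tags if tags in GRAMMAR else None

def generate(det, noun, verb):
    """Generator: create text following the Det-Noun-Verb rule."""
    return f"{det} {noun} {verb}"

print(parse("the dog barks"))   # ('Det', 'Noun', 'Verb')
print(parse("dog the barks"))   # None -- violates the grammar
print(generate("the", "dog", "barks"))
```

The voice analyzer and speech synthesizer would sit at either end of such a pipeline, converting audio to text and back; they are omitted here.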
The ranks of full-time computational linguists are gradually increasing, drawing from linguistics, computer science and, occasionally, cognitive psychology. Some are relatively naive participants (your humble author, for one) with their sleeves rolled up, ready to tackle one of the most vexing problems of computing.
The perspective of computational linguists is quite simple: we seek formalisms to structure data and develop algorithms to use in programs to process natural language. The author’s personal objective is to produce high-quality translations of text from many different languages. This requires total accuracy in the interpretation of input text and the generation of correct target-language text that preserves the intended source-text meaning.
Computers use simple (often binary true/false) conditional statements to act on input data. Conditional statements are critical to the usefulness of programs, particularly symbolic and artificial-intelligence programs that rely less on arithmetical operations than on conditional propositions. In computerized interpretation, the grammar formalisms of human languages must be modeled to provide rules for conditional statements. Thou shalt not dangle thy participle! Ergo caveat locutor (therefore, let the speaker beware).
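To show what "grammar rules as conditional statements" might look like, here is a crude sketch of one rule, English subject-verb number agreement, expressed as a true/false condition. The morphology test is deliberately naive (a real system would need a proper lexicon and many exceptions); it is illustrative only.

```python
# Illustrative only: a single grammar rule as a conditional statement.
# A crude check for English subject-verb number agreement, e.g.
# "dog barks" / "dogs bark" are fine, "dog bark" is flagged.

def agrees(subject: str, verb: str) -> bool:
    """Return True if subject and verb match in number (crude test)."""
    subject_plural = subject.endswith("s")   # naive plural detection
    verb_plural = not verb.endswith("s")     # "bark" plural, "barks" singular
    return subject_plural == verb_plural     # the conditional proposition

print(agrees("dog", "barks"))   # True
print(agrees("dogs", "bark"))   # True
print(agrees("dog", "bark"))    # False -- rule violated
```

Even this toy rule shows the pattern: the grammar supplies a proposition, and the program branches on whether the input satisfies it.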
In upcoming posts and in subsequent sections, I will attempt to show how each of these components contributes to a computational model that can accurately extract original intent from complex, ambiguous words, phrases and larger structures.
|Click below to look in each Understanding Context section|
|4 Perception and Cognition|5 Fuzzy Logic|6 Language and Dialog|7 Cybernetic Models|
|8 Apps and Processes|9 The End of Code|Glossary|Bibliography|