Formalism Framework of Language
Formalism is a school of literary
criticism and literary theory having mainly to do with
structural purposes of a particular text. It is the study of a text without
taking into account any outside influence. Formalism rejects (or sometimes
simply "brackets," i.e., ignores for the purpose of
analysis) notions of culture or societal influence, authorship, and content,
and instead focuses on modes, genres, discourse, and forms.
In another definition, formalism, or formal linguistics, is the study of the abstract forms of language and their internal relations. It focuses on the forms of languages as evidence of linguistic universals, without considering how these forms function in communication or in the social life of different communities.
In literary
theory, formalism
refers to critical approaches that analyze, interpret, or evaluate the inherent
features of a text. These features include not only grammar and syntax but also literary
devices such as meter and tropes.
Formalism views the primary function of ordinary language as communicating a message by reference to the world outside of language. The
formalist approach studies the form of the work, as opposed to its content.
This approach examines the formula or methodology in literature, and how it
leads to deeper meaning with closer reading. New Critics believe that
anything that is essential to interpreting the text must be found within the
text itself (point of view, symbols, irony, language, etc.), and not in what the reader brings to the work through his or her own assumptions, experiences, and interpretive strategies. They therefore pay particular attention to the literary devices used in the work and to the patterns these devices
establish. As the narrative flows through its plot complications, it eventually
reaches a climactic point, and all the details of the form fall into place in
the dénouement.  These internal relationships gradually reveal a form—a
principle by which all subordinate patterns can be seen. The formalist
approach reduces the importance of a text’s
historical, biographical, and cultural context.
In addition, formalists favor an approach to the study of language that
emphasizes abstract, quasi-mathematical theories of linguistic structure based
primarily, but not always exclusively, on intuitions of grammaticality. These
theories are usually, but not always, discrete: they do not employ statistical
methods and avoid continuous structures. One strength of these theories, at
least according to proponents, is that they take otherwise vague linguistic
intuitions and make them precise and testable. However, there is no necessary
divide between the two approaches. Functionalists can and sometimes do use
formal techniques, and formalists can and sometimes do take communicative
function into account. Formal linguistics is often confused with generative linguistics, which is a subset of formal linguistics, and with Chomskyan linguistics, which is in turn a subset of generative linguistics.
Formalism of Noam Chomsky
Chomsky has been described as the "father of modern
linguistics" and a major figure of analytic philosophy. His work has
influenced fields such as computer science, mathematics, and psychology. He is
credited as the creator or co-creator of the Chomsky hierarchy, the universal
grammar theory, and the Chomsky–Schützenberger theorem.
In the mid-1950s, Noam Chomsky developed the formalism of context-free grammars and their classification as a special type of formal grammar (which he called
phrase-structure grammars). What Chomsky called a phrase structure grammar is
now also known as a constituency grammar; constituency grammars stand in contrast to dependency grammars. In Chomsky's generative grammar framework,
the syntax of natural language was described by context-free rules combined with
transformation rules.
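As an illustration (not Chomsky's own notation), a tiny fragment of such context-free phrase-structure rules can be written as a Prolog definite clause grammar; the category names and lexicon below are invented for the example.

    % A toy constituency (phrase-structure) grammar as a Prolog DCG.
    % Each rule is context-free: a single nonterminal is rewritten.
    s   --> np, vp.        % S  -> NP VP
    np  --> det, n.        % NP -> Det N
    vp  --> v, np.         % VP -> V NP
    det --> [the].
    n   --> [cat].
    n   --> [dog].
    v   --> [chases].

    % ?- phrase(s, [the,cat,chases,the,dog]).
    % true.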
In formal language
theory, a grammar (when the context isn't given, often called a formal grammar
for clarity) is a set of formation rules for strings in a formal language. The
rules describe how to form strings from the language's alphabet that are valid
according to the language's syntax. A grammar does not describe the meaning of
the strings or what can be done with them in whatever context—only their form.
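For instance (a standard textbook example, not one taken from this text), a grammar for the formal language of n a's followed by n b's can be written as a Prolog DCG; the rules say only which strings are well-formed and nothing about what they mean.

    % A formal grammar for { a^n b^n | n >= 0 }, written as a DCG.
    % The rules define the set of well-formed strings and nothing more.
    ab --> [].             % S -> (empty string)
    ab --> [a], ab, [b].   % S -> a S b

    % ?- phrase(ab, [a,a,b,b]).   % true: the string is in the language
    % ?- phrase(ab, [a,b,b]).     % false: the string is not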
Formal language theory, the discipline that studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical
computer science, theoretical linguistics, formal semantics, mathematical
logic, and other areas.
A formal grammar is
a set of rules for rewriting strings, along with a "start symbol"
from which rewriting must start. Therefore, a grammar is usually thought of as
a language generator. However, it can also sometimes be used as the basis for a
"recognizer"—a function in computing that determines whether a given
string belongs to the language or is grammatically incorrect. To describe such
recognizers, formal language theory uses separate formalisms, known as automata
theory. One of the interesting results of automata theory is that it is not
possible to design a recognizer for certain formal languages.
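As a sketch of what such a recognizer can look like (an invented example, not one from the text), a finite automaton over the alphabet {a, b} can be encoded directly in Prolog as a set of transitions plus a small driver predicate.

    % A finite-state recognizer: accepts(State, String) succeeds iff
    % String is accepted starting from State.  This automaton accepts
    % exactly the strings over {a,b} that end in b.
    accepts(S, [])     :- final(S).
    accepts(S, [C|Cs]) :- delta(S, C, S1), accepts(S1, Cs).

    delta(q0, a, q0).
    delta(q0, b, q1).
    delta(q1, a, q0).
    delta(q1, b, q1).
    final(q1).

    % ?- accepts(q0, [a,b,a,b]).   % true
    % ?- accepts(q0, [b,a]).       % false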
A Powerful Grammar Formalism
The grammar
formalism is closely related to other formalisms currently in use in
computational linguistics. These formalisms are known as 'unification-based', 'constraint-based', 'information-based', and 'feature-logic-based'. Members of
this class are for example Definite Clause Grammars, PATR II, Functional
Unification Grammar and formalisms underlying linguistic theories such as
Generalized Phrase Structure Grammar, Lexical Functional Grammar, Unification Categorial Grammar, Categorial Unification Grammar, and Head-driven Phrase
Structure Grammar. 
It has also been shown how the nice properties of logic programming languages carry over to a whole range of such constraint-based formalisms, by abstracting away from the actual constraint language that is used. I define such a constraint-based formalism in which the underlying constraint language consists of path equations. The most
important characteristics of the formalism are: 
- The formalism consists of definite clauses, as in Prolog, but its data structures are feature structures rather than first-order terms.
- The formalism does not assume that concatenation is the sole string-combining operation (in contrast to FUG, DCG, PATR II, LFG, GPSG and UCG).
- The formalism is defined in an abstract framework, which makes it easier to extend the techniques I develop in later chapters to formalisms based on other (more powerful) constraint languages.
Each of these points
will now be clarified in turn. 
Firstly, the principal data structures of the formalism are feature structures, rather than first-order terms as in Prolog. The
motivation is that such feature structures are closer to the objects usually
manipulated by linguists. Furthermore, in writing grammars the use of
first-order terms becomes rather tiresome because it is necessary to keep track
of the number of arguments functors take, and the position of sub-terms in such
terms. Using path equations to define feature structures achieves some sort of
data abstraction. As an example, consider a program that manipulates terms in which a category value Cat is buried inside a term T, under a syn and a head functor. In Prolog we are then forced to mention all intermediate functors, and for each functor we must mention all of its arguments (possibly using the 'anonymous' variable '_'). Using path equations, on the other hand, such an embedded value can be referred to by its path alone; in this case the value of Cat is obtained by the equation Cat = <T syn head cat>.
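To make the contrast concrete, here is a hedged sketch; the particular term shape (a three-argument functor with syn and head substructures) is invented for illustration and is not the thesis's own encoding.

    % With first-order terms, picking out Cat means spelling out every
    % intermediate functor and every argument position (using '_' for
    % the arguments we do not care about).  The term shape is hypothetical.
    cat_of(x(syn(head(Cat, _Agr), _Subcat), _Sem, _Phon), Cat).

    % ?- cat_of(x(syn(head(np, sg), []), sem, phon), Cat).
    % Cat = np.

    % With path equations, the same value is picked out by its path alone,
    % e.g.  Cat = <T syn head cat>,  without mentioning intermediate
    % functors, argument positions, or any additional features.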
That said, it should be stressed that the difference between the two approaches is not very decisive. In fact, first-order terms may be used in an implementation of such graph-based formalisms, and data abstraction can also be achieved by other means, such as syntactic macros or auxiliary predicates; a sketch of the latter is given below.
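For example, a minimal sketch of such an auxiliary predicate, assuming (purely for illustration, not as the thesis's representation) that feature structures are encoded as lists of Attribute-Value pairs:

    % path_value(+Path, +FeatureStructure, ?Value): walk a list of
    % attributes down a feature structure represented as a list of
    % Attr-Value pairs.
    path_value([], Value, Value).
    path_value([Attr|Path], FS, Value) :-
        member(Attr-Sub, FS),
        path_value(Path, Sub, Value).

    % ?- path_value([syn,head,cat],
    %               [syn-[head-[cat-np, agr-sg], subcat-[]], phon-[the,cat]],
    %               Cat).
    % Cat = np.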
From a linguistic point of view, the second characteristic is the most salient one. The formalism to be proposed does not require that concatenation be the sole operation for combining strings. This choice can be motivated by:
- Increased symmetry of parsing and generation
- Increased expressive power
- Other applications
Dropping the concatenative base can be motivated by the desire to use grammars in a reversible way. From this viewpoint, it is attractive to view a grammar simply as a definition of the relation between strings and logical forms. Giving a special status to the phonology attribute seems to destroy the inherent symmetry somewhat. Thus, the formalism does not prescribe how the value of the
phonology attribute is to be composed, just as it does not prescribe how the
value of the semantics attribute is composed. 
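A minimal sketch of this reversible view, using an ordinary DCG and an invented toy semantics (nothing here is the thesis's own formalism): the same grammar defines a relation between strings and logical forms and can be run in either direction.

    % One relation between strings and logical forms, usable for both
    % parsing and generation.
    s(LF)            --> np(X), vp(X^LF).
    np(john)         --> [john].
    np(mary)         --> [mary].
    vp(X^sleeps(X))  --> [sleeps].

    % Parsing:    ?- phrase(s(LF), [john,sleeps]).
    %             LF = sleeps(john).
    % Generation: ?- phrase(s(sleeps(mary)), String).
    %             String = [mary, sleeps].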
If we do not incorporate a concatenative base, then we allow for the investigation of other types of string combination in natural language grammars. Several researchers have noted that analyses of a whole range of linguistic phenomena (most notably those involving discontinuous constituents) may be simplified by assuming other types of string operations. If no assumptions about the construction of phonological or semantic representations are built in, then the parsing and generation problems of the formalism are, in general, not decidable. An important theme of this thesis is to investigate parsing and generation procedures that can be applied usefully to linguistically motivated grammars.
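As a sketch of what such a non-concatenative string operation might look like (an invented stand-in, not an operation defined in the thesis), here is a simple wrapping relation that inserts the head's string somewhere inside the other daughter's string, the kind of operation that has been proposed for discontinuous constituents.

    % wrap(+HeadPhon, +RestPhon, -MotherPhon): insert the head's string
    % at some position inside the other daughter's string, instead of
    % simply concatenating the two.
    wrap(HeadPhon, RestPhon, MotherPhon) :-
        append(Front, Back, RestPhon),
        append(Front, HeadPhon, Tmp),
        append(Tmp, Back, MotherPhon).

    % ?- wrap([sah], [ich,das,kind], P).
    % P = [sah, ich, das, kind] ;
    % P = [ich, sah, das, kind] ;
    % P = [ich, das, sah, kind] ;
    % P = [ich, das, kind, sah].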
Another reason for developing a formalism that is not based on concatenation is the observation that other (non-linguistic) problems can be encoded in a unification grammar as well, provided we are not forced to manipulate strings. Furthermore, the formalism is also used to define meta-interpreters in --; this usage of the formalism likewise entails that no assumptions about string construction are built in.
The third
characteristic states that the resulting formalism is a member of a class of
constraint-based formalisms. Therefore, results that hold for this class carry
over to the present formalism. In the other direction, it is easy to see how the
current formalism can be extended to allow for other, perhaps more complex
constraints. This is very useful, as in recent years a whole family of different constraints has been proposed to extend formalisms such as PATR II. In a somewhat idealized view, parsing and generation algorithms defined for one member of the class of constraint-based formalisms can be used
for other members of this class, provided the appropriate constraint-solving
techniques are available for the constraints incorporated in these other
formalisms. 
 