\documentclass[11pt,a4paper]{article}
\usepackage[pdftex]{graphicx}
\usepackage[english]{babel}
%\usepackage{multicol}
%\usepackage{url}
\usepackage[pdfborder={0 0 0}]{hyperref}
\usepackage{verbatim}
%\usepackage[cm]{fullpage}
%\usepackage[left=2cm,top=2cm,right=2cm,bottom=2cm]{geometry}
%\usepackage{setspace} \onehalfspacing
%\usepackage{sober}
%\usepackage{times}
%\pagestyle{plain}

\begin{document}
\begin{center}
{\Large {\bf Autonomous Syntax and Compositionality}: \\ 
	the bane of linguistics?} \\

{Andreas van Cranenburgh\footnote{\texttt{andreas@unstable.nl}}, \today} \\
{\em Essay proposal for Language \& Cognition course, University of Amsterdam}
\end{center}

\begin{comment}
\begin{center}\begin{verse}\begin{verbatim}
[...] Try to attach a meaning
To words that you've heard

Stumbling through the dark
Seems I'm stumbling through the dark
Everybody's stumbling through the dark \end{verbatim}
\end{verse} -- from the album {\em Rainy day music}, The Jayhawks (2003)
\end{center}
\end{comment}

%\abstract{ }
\tableofcontents

\section{Research question}
Is the notion of autonomous syntax and its semantic counterpart,
compositionality, philosophically tenable for a plausible syntax-semantics
interface?

\section{Elaboration, projected contents}

\subsection{Autonomous syntax and syntactocentrism}
\addcontentsline{toc}{subsubsection}{\numberline {2.1.1}Processing autonomy: modularity}
\addcontentsline{toc}{subsubsection}{\numberline {2.1.2}Representational autonomy: levels of description}

There exist at least two %kinds of
definitions of autonomous syntax:

\begin{description}
\setlength{\itemsep}{0pt}
\setlength{\parskip}{0pt}
\item[processing autonomy] Syntax is an independent module with input and
output to other %kinds of
modules.
\item[representational autonomy] Syntax is an autonomous level logically
distinct from phonology and semantics.
\end{description}

It does not seem that anyone still subscribes to the first version, namely the
fanciful notion that syntax is a completely independent module handing over
parse trees to some kind of semantic interpretation module. Counterexamples to
such a hypothesis are numerous, such as the archetypical ``eager to please''
versus ``easy to please,'' where the former assigns ``please'' an active
interpretation and the latter a passive one, despite identical surface syntax.
Furthermore, the modularity of mind is at odds with the brain's plasticity and
the fundamental interconnectedness of its structure. %TODO: cite brain plasticity

%FIXME: this example is bad? need example where semantics influences parsing.

The second version, representational autonomy, is more pervasive in modern
linguistics, yet this position also has its detractors, chiefly among
proponents of Construction Grammar (e.g., Tomasello 2003) and Cognitive Grammar
(e.g., Langacker 1998). They can deny this position because, for them, a
grammar is a pairing of utterances with meanings, without a clear separation
between syntax, semantics and pragmatics. They argue that all linguistic forms
have a conceptual basis (though this basis can be very abstract).

This is opposed to the interpretive semantics espoused by many generative
grammarians, where grammar is taken to be a pairing of phonetics and semantics,
i.e., {\em meaning} in a narrow sense, ignoring connotations. Semantics would be
derived from deep structure in parallel with the syntactic derivation. Such a
view amounts to positing an actual homomorphism between syntactic structure and
semantic interpretation.

Generative linguistics, especially in the earlier {\em Syntactic Structures}
and {\em Aspects} period (Chomsky 1957, 1965), depends on a supposedly clear
distinction between syntactic and semantic well-formedness. Without specifying
where this difference lies, Chomsky claims that a native speaker's judgment
suffices to establish in which of these two dimensions any given mistake lies;
that some kind of rule has been broken is assumed {\em a priori}. It should be
noted that Chomsky's famous example sentence, ``colorless green ideas sleep
furiously,'' is more often than not mischaracterized as an argument for the
irrelevance of semantics. In fact he was arguing that ``the notion of [the
probability of a sentence] is an entirely useless one, under any known
interpretation of this term'' (Chomsky 1969, p.\ 57). He devised a sentence
which certainly was not part of any corpus or ever heard by anyone, and argued
that, its novelty notwithstanding, a native speaker effortlessly recognizes it
as grammatical. This was supposed to prove that statistical models of language
(i.e., Markov models) were hopelessly inept at explaining language. Other
researchers have since claimed to have falsified this particular instance of
the argument: with a smoothed Markov model trained on newspaper text, the
sentence ``colorless green ...'' turns out to be about 200,000 times more
likely than its ungrammatical, reversed version (Pereira 2000)%
\footnote{On a more humorous note, there has been a competition to embed the
famous nonsense sentence into meaningful verse, yielding (arguably)
semantically plausible interpretations, albeit in a poetic sense, see
\url{http://www.linguistlist.org/issues/2/2-457.html\#2}}. To summarize, there
appears to be no clear distinction between syntax and semantics, and the
single-minded emphasis on syntax is not warranted.

\subsection{Principle of Compositionality}
\addcontentsline{toc}{subsubsection}
	{\numberline {2.2.1}Formal semantics}
\addcontentsline{toc}{subsubsection}
	{\numberline {2.2.2}Connectionist approaches}
\addcontentsline{toc}{subsubsection}
	{\numberline {2.2.3}Construction and Cognitive Grammar}

A natural counterpart to autonomous syntax is the principle of
compositionality:

\begin{quote}
``The meaning of a complex expression is a function of the meanings of its
immediate syntactic parts and the way in which they are combined.'' -- (Krifka
1999)
\end{quote}

In a trivial sense this principle is necessarily true: if it is granted that
language produces an infinity of meanings from a finite vocabulary and a finite
syntax, meanings must somehow be built from recombinable parts. The strict
version, on which nothing but syntax and lexical semantics determines the
meaning of a sentence, is however demonstrably false. Counterexamples are
idioms with limited applicability, as well as anaphora and indexicals, whose
meanings depend on context.
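The positive content of the principle can be illustrated with a minimal derivation in the style of formal semantics (a textbook-style sketch, not tied to any particular analysis), in which the meaning of an intransitive sentence is obtained by applying the predicate to the subject:
\[
[\![\mbox{Mary sleeps}]\!] \;=\; [\![\mbox{sleeps}]\!]\bigl([\![\mbox{Mary}]\!]\bigr)
\;=\; \bigl(\lambda x.\,\mathit{sleep}(x)\bigr)(\mathit{mary})
\;=\; \mathit{sleep}(\mathit{mary})
\]
Here the mode of combination (functional application) is determined wholly by the syntax; the counterexamples above concern cases where context intrudes on this picture.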

This principle was applied by Montague (1973) to develop a formal semantics
for a fragment of English, by many accounts the most rigorous and elegant
formulation of semantics to date. His approach is to augment a formal syntax
with a semantic interpretation: the result is a Categorial Grammar that
translates a fragment of English into first-order logic using the lambda
calculus. Problems arise with arbitrary numbers of adverbial clauses, because
the so-called ``meaning postulates'' (which map words to predicates,
connectives or quantifiers) need to be specified {\em a priori}, with a fixed
number of (possible) arguments. An alternative is Davidsonian event
semantics, which lets adverbs modify events (reifying them in
the process!). %but: complain about davidson's misguided insistence on FOL!
% non-monotonicity, intensional effects etc.
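The Davidsonian treatment can be illustrated with a variant of Davidson's own example: each adverbial simply contributes another conjunct predicated of the event variable, so no fixed number of arguments has to be stipulated in advance:
\[
\exists e\,\bigl[\mathit{butter}(\mathit{john},\mathit{toast},e)
\wedge \mathit{slow}(e)
\wedge \mathit{in}(e,\mathit{bathroom})\bigr]
\]
Dropping a conjunct yields a valid entailment (``John buttered the toast slowly'' entails ``John buttered the toast''), which meaning postulates of fixed arity cannot capture as directly.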

In general, formal semantics replaces a narrow emphasis on syntactic rules
with a focus on both syntactic and semantic rules, without reconsidering the
insistence on rules itself. However, regularities in language are not
sufficient to prove that actual rules are being followed. %ie., tries to syntactize semantics!

Problems for compositionality arise through polysemy: how can mere structure
determine {\em which} meaning of a polysemous word enters into the meaning of
a sentence? It seems that this is determined by the shifting context in which
a sentence occurs, which is precisely what compositionality is supposed to
abstract away from. Lakoff (1987) rejects compositionality and argues that
polysemy can be explained by radial categories: concepts with more than one
center of gravity.
%explain radial categories!

Another problem is even more basic: what is the meaning of a word? For
concrete nouns and some verbs this seems sufficiently clear: one can point to
real-world examples or specify necessary and sufficient conditions. Yet
abstract nouns and other grammatical classes pose a problem. Jackendoff (1983)
concedes these problems and comes close to a theory of prototypical concepts,
but wholly ignores sentence-level integration\footnote{This mistake,
focusing on words and neglecting sentential meaning, has been made since time
immemorial, cf.\ Locke and Leibniz; often it is even claimed that words must
refer to objects!}. %this footnote is actually a MAJOR point.

A different approach is to acknowledge that only complete sentences have stable
meanings, which implies that the meaning of a word should be inferred from its
effect (syntactic, semantic and pragmatic) on example sentences (Brugman 1988).
Instead of the principle of compositionality, on which words contribute parts
to the meaning of a sentence, one is then left with the notion that words
constrain the possible meanings of a sentence: not a part--whole relation but
an abstract--specific one. %elaborate! they constrain, but what is the initial meaning?? there is a parallel with my bachelor thesis here, where initial exemplar is crucial and constrains the rest of the exemplars that are used to interpret the sentence.

From a connectionist perspective, Chalmers (1990) argues that maintaining
compositionality would mean that a connectionist model merely re-implements a
classical symbolic theory, which is a favorite attack of Fodor \& Pylyshyn
(1988) against connectionism. Instead Chalmers presents a connectionist model
that performs syntactic transformations directly on distributed
representations, which carry their meanings holistically, without the need for
extracting them in a separate step.

\subsection{Methodological considerations: false dichotomies}

\addcontentsline{toc}{subsubsection}
	{\numberline {2.3.1}Competence versus performance}
\addcontentsline{toc}{subsubsection}
	{\numberline {2.3.2}Internalist versus externalist perspectives}

\begin{quote}{\bf Orthodoxy versus Orthopraxy: }
	Theory without practice, practice without theory\end{quote}

By attacking radical interpretation, Chomsky (2000) puts up a straw man
bent on merely studying empirical phenomena. Any theory must, by definition,
attempt to offer a consistent and complete account of its subject matter; but
to argue that it should completely abstract away from situations and social
contexts is the other, equally misguided extreme. It is not that the
dichotomy of competence and performance is necessarily flawed, but a linguistic
paradigm cannot begin to claim that it has explained the former without
clarifying or even incorporating the latter. Yet generative linguistics claims
exactly this separation to be inevitable. By rigging the debate with a false
dichotomy of empirical and theoretical approaches, generative linguistics
needlessly polarizes the field, which can unfortunately lead to both sides
dismissing each other's results.%TODO: example?
%[Digression into philosophy of science:]
%This is reminiscient of mathematicians doing theoretical physics who insist
%on formulating elegant, symmetric models of reality (which they tellingly
%prefer to call a ``theory of everything''); as opposed to more conservative
%physicists who counter that no such bias is warranted and that physics has
%usually been messy due to the need of accounting for empirical evidence.
\footnote{Compare the split between hydraulics and fluid mechanics in the 18th
century: the former observing phenomena which could not be explained, the
latter explaining phenomena which could not be observed, according to a
humorous depiction by Nobel Laureate Sir Cyril Hinshelwood (cited in M.J.
Lighthill (1956), ``Physics of gas flow at very high speeds,'' Nature 178: 343)}

Another strangely narrow focus is the internalist perspective on semantics. An
internalist (e.g., Chomsky 1986) claims that meaning must be derived from
causal connections to other parts of a system (e.g., modules in a mind); this
implies that meaning resides only {\em in the mind}.
%this is rather like age-old claim that all knowledge derives from reason.
An externalist (Putnam 1975) claims that meaning can only be derived from
external reality, because there has to be a {\em division of linguistic labor}
if words are to acquire an established meaning.

%internalist vs. externalist semantics
%internalist: meaning is derived from causal connections to other parts of system
%externalist: causal connections to external reality (compare robot reply)
%Searle: neither of these correct, intentionality instead
%Searle: humans have original intentionality, computers only derived
%Dennet: all intentionality is derived. instead of other minds problem 
%	there now is "do I have a mind myself" problem
%Fodor: intentionality is independent of interpretation, but intrinstic of
%	certain connections to external reality.

\section{Expected conclusion}
Linguistics would benefit from reconciling different approaches to language
and from integrating various results, whether theoretical or empirical, into a
more coherent and less counter-intuitive whole. Autonomous syntax and the
principle of compositionality clearly impede this reconciliation; they are
thus liable to rejection. Semantics should be a first-class citizen of a yet
to be devised {\em intellectual} theory of everything.

\begin{center}$\infty$\end{center}

\section{Bibliography}
%turn off bold for labels (all bow to magic incantations!)
%\renewcommand{\descriptionlabel}[1] {\hspace{\labelsep}#1}

\begin{description}
%Brugman, Claudia. 1981. Story of Over. University of California, Berkeley, M.A. Thesis. Available from Indiana University Linguistics Club, Bloomington, Indiana.

\item[Brugman, Claudia] (1988), ``The Story of Over: Polysemy, Semantics and the Structure of the Lexicon.'' New York: Garland.

\item[Chalmers, David J.] (1990), ``Syntactic Transformations on Distributed Representations,'' Connection Science, Vol. 2.
% --- (1993). "Connectionism and Compositionality: Why Fodor and Pylyshyn Were Wrong" in Philosophical Psychology 6: 305-319.

\item[Chomsky, Noam] (1957), ``Syntactic Structures,'' The Hague: Mouton. \\
% ––– (1959). Review of Skinner's Verbal Behavior, Language, 35: 26-58.
 --- (1965), ``Aspects of the Theory of Syntax,'' Cambridge, MA: MIT Press. \\
  --- (1969), ``Quine's Empirical Assumptions,'' in Words and Objections: Essays on the Work of W. V. Quine, ed.\ Donald Davidson and Jaakko Hintikka, pp.\ 53--68, Dordrecht: D. Reidel. \\
% --- (1975), ``Reflections on Language,'' London : Fontana. \\
% --- (1980), ``Rules and Representations,'' New York: Columbia University Press. \\
%# ––– (1981). Lectures on Government and Binding, Hawthorne, NY: Walter De Gruyter Incorporated.
%# ––– (1982). The Generative Enterprise, Dordrecht: Foris Publications.
 --- (1986), ``Knowledge of Language, Its Nature, Origin and Use,'' NY: Praeger. \\
%# ––– (1988). Language and Problems of Knowledge, The Managua Lectures, Cambridge, MA: MIT Press.
%# ––– (1975a). The Logical Structure of Linguistic Theory, NY: Plenum.
%# ––– (1990). “On the nature, acquisition and use of language”, in Mind and Cognition: A Reader, W.G. Lycan (ed.), Cambridge MA and London UK: Blackwells, pp.627-45.
%# ––– (1995). The Minimalist Program, Cambridge: MIT Press.
 --- (2000), ``New Horizons in the Study of Language and Mind,''
    Cambridge: Cambridge University Press.

\item[Davidson, Donald] (1985), ``Adverbs of Action,'' in Vermazen and Hintikka (eds.), 1985.

\item[Fodor, Jerry \& Pylyshyn, Zenon] (1988), ``Connectionism and cognitive architecture: A critical analysis,'' Cognition vol.\ 28, pp.\ 3--71.

\item[Jackendoff, Ray] (1983), ``Semantics and Cognition,'' Cambridge, MA: MIT Press.

\item[Krifka, Manfred] (1999), entry on ``Compositionality,'' in The MIT Encyclopedia of the Cognitive Sciences, eds.\ Robert A. Wilson and Frank Keil.

\item[Langacker, Ronald] (1998), ``Conceptualization, symbolization and
grammar,'' in Michael Tomasello (ed.) (1998), ``The New Psychology of Language:
Cognitive and Functional Approaches to Language Structure,'' Mahwah, NJ:
Lawrence Erlbaum.

\item[Montague, Richard] (1973), ``The Proper Treatment of Quantification in
Ordinary English,'' reprinted in ``Formal Semantics: The Essential Readings,''
by Paul Portner, Barbara H. Partee, eds. %Blackwell, 2002. ISBN 0631215425

\item[Pereira, Fernando] (2000), ``Formal grammar and information theory:
together again?,'' Philosophical Transactions of the Royal Society 358(1769):
1239--1253.
%\url{http://www.cis.upenn.edu/%7epereira/papers/rsoc.pdf}

\item[Putnam, Hilary] (1975), ``The meaning of `meaning','' in Language, Mind and Knowledge, ed.\ K. Gunderson

\item[Tomasello, Michael] (2003), ``Constructing a Language: A Usage-Based Theory of Language Acquisition,'' Cambridge, MA: Harvard University Press.

\end{description}
\end{document}

%http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.54.7505
%Syntax-semantics interaction in sentence understanding (1995) by Kavi Mahesh,
%Susan Bovair, Jennifer K. Holbrook, Ashwin Ram, J. Spencer Rugaber
%ftp://ftp.cc.gatech.edu/pub/coc/tech_reports/1995/GIT-CC-95-10.ps.Z

%http://www.springerlink.com/content/k26j8675jl448208/

%[on construction grammar]
%The earliest study was "There-Constructions," which appeared as Case Study 3
%in George Lakoff's Women, Fire, and Dangerous Things.[1] It argued that the
%meaning of the whole was not a function of the meanings of the parts, that odd
%grammatical properties of Deictic There-constructions followed from the
%pragmatic meaning of the construction, and that variations on the central
%construction could be seen as simple extensions using form-meaning pairs of
%the central construction.

%relation to truth conditional semantics
%Pietroski

%Searle: syntax by itself is not constitutive of semantics.
%“This point is missed so often, it bears repeating: the syntactically specifiable objects over which computations are defined can and standardly do possess a semantics; it's just that the semantics is not involved in the specification.”
%Rey (2002) attacking Searle
%this means that characterizing programs as formal systems is unfair
%a program needs semantic specification as well.
%Rey, G., 2002, ‘Searle’s Misunderstandings of Functionalism and Strong AI’ in Preston and Bishop (eds.)
%Searle, J., 1980, ‘Minds, Brains and Programs’, Behavioral and Brain Sciences, 3:417-57

%[..]there is, of course, a familiar philosophical ambiguity lurking in the
%wings here -- the confusion of behaving in a fashion describable by a rule
%with following or applying a rule -- and arguably advocates of an algorithmic
%level of description have not always kept this distinction in mind. 
%On Marr's computational theory of vision

%However, Lowenheim-Skolem assures that there is at least one interpretation S*
%that maps all of the referring terms onto only mathematical objects. S* cannot
%be the canonical interpretation, but there is nothing in the syntax of
%Mentalese to explain why S is the correct interpretation and S* is not.
%Therefore syntax underdetermines semantics. (Compare acknowledgement of this
%in Pylyshyn 1984: page 44.)
%(criticism that representational theory of mind does not account for semantics)

%"The on-line anticipatory hypotheses that hearers use make autonomous syntax pretty well redundant." (from http://www.phon.ucl.ac.uk/home/robyn/relevance/relevance_archives/0330.html )
