
CSC 485H/2501H: Computational linguistics, Fall 2023 Assignment 3: Symbolic Machine Translation


Late assignments will not be accepted without a valid medical certificate or other documentation of an emergency.

Overview: Symbolic Machine Translation

In this assignment, you will learn how to write phrase structure grammars for several linguistic phenomena in two different languages: English and Chinese. You can use the two grammars to create an interlingual machine translation system by parsing in one and generating in the other. Don't panic if you don't speak Chinese, and don't celebrate yet if you do; knowing the language won't give you much of an advantage over other students. A facility with languages in general will help you, as will the ability to learn and understand the nuances between the grammars of two different languages. In particular, you will start by working on agreement. Then, you will need to analyse the quantifier scoping difference between the two languages.

1. Agreement: Determiners, Numbers and Classifiers [10 marks]

English expresses subject–verb agreement in person and number. English has two kinds of number: singular and plural. The subject of a clause must agree with its predicate: both must be singular or both plural. The number of a direct object, however, does not need to agree with anything.

(1) A programmer annoys a dolphin.

(2) Two programmers annoy a dolphin.

(3) * Two programmers annoys two dolphins.

(4) * A programmer annoy two dolphins.
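The agreement pattern in (1)-(4) can be sketched outside the grammar formalism as a simple feature check. The following Python toy (an illustration only; the lexicon and function names are mine, not part of the assignment's TRALE starter code) accepts a sentence just when subject and verb share the same NUM value:

```python
# Toy subject-verb number agreement check (illustrative sketch only; the
# actual assignment uses TRALE feature structures, not Python).

LEXICON = {
    "a": ("Det", "sg"), "two": ("Num", "pl"),
    "programmer": ("N", "sg"), "programmers": ("N", "pl"),
    "dolphin": ("N", "sg"), "dolphins": ("N", "pl"),
    "annoys": ("V", "sg"), "annoy": ("V", "pl"),
}

def grammatical(words):
    """S -> NP(num) V(num) NP(any): subject NP and V must share NUM."""
    det, subj, verb, det2, obj = [LEXICON[w] for w in words]
    subj_ok = det[1] == subj[1]   # Det/Num agrees with its noun
    obj_ok = det2[1] == obj[1]
    sv_ok = subj[1] == verb[1]    # subject-verb agreement
    return subj_ok and obj_ok and sv_ok

print(grammatical("a programmer annoys a dolphin".split()))        # True
print(grammatical("two programmers annoys two dolphins".split()))  # False
```

Note that the object NP's number is checked only internally (determiner against noun), never against the verb, mirroring the observation above.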

Chinese, on the other hand, does not exhibit subject–verb agreement. As shown in the examples below, most nouns do not inflect at all for plurality. Chinese does, however, have a classifier (CL) part of speech that English does not. Semantically, classifiers are similar to English collective nouns (a bottle of water, a murder of crows), but English collective nouns are only used when describing collectives. With very few exceptions, classifiers are mandatory in complex Chinese noun phrases. Different CLs agree with different classes of nouns that are sorted by mostly semantic criteria. For example, 程序员 (chengxuyuan)1 programmer is a person and an occupation, so it should be classified by either 个 (ge) or 名 (ming) and cannot be classified by the animal CL 只 (zhi). However, the rules determining a noun's class constitute a formal system that must be followed irrespective of semantic similarity judgements. For example, while geese and fish are both animals and can both be classified by the animal CL 只 (zhi), 鱼 (yu) fish can take another classifier, 条 (tiao), that 鹅 (e) goose cannot.

(5) 一个程序员
    yi ge chengxuyuan
    one ge-CL programmer

(6) 两个程序员
    liang ge chengxuyuan
    two ge-CL programmer

(7) 三个程序员
    san ge chengxuyuan
    three ge-CL programmer

(8) *三程序员
    san chengxuyuan
    three programmer

(9) *三只程序员
    san zhi chengxuyuan
    three zhi-CL programmer

(10) 一只鹅
     yi zhi e
     one zhi-CL goose

(11) 两只鹅
     liang zhi e
     two zhi-CL goose

(12) 三只鹅
     san zhi e
     three zhi-CL goose

(13) *三条鹅
     san tiao e
     three tiao-CL goose

(14) *三名鹅
     san ming e
     three ming-CL goose

1Use either Chinese characters or the Romanized form, but with no spaces or hyphens, e.g., chengxuyuan, for multi-character lexical entries.

You should be familiar by now with the terminology in the English grammar starter code for this question. The Chinese grammar is fairly similar, but there is a new phrasal category called a classifier phrase (CLP), formed by a number and a classifier. The classifier phrase serves the same role as a determiner does in English.

The two grammars below don't appropriately constrain the NPs generated. You need to design your own rules and features to properly enforce agreement.

English Grammar:

Rules:

NP → Det N
NP → Num N
VP → V NP
S → NP VP

Lexicon:

a: Det
one: Num
two: Num
three: Num
programmer: N
programmers: N
goose: N
geese: N
fish: N (singular)
fish: N (plural)
observe: V
observes: V
observed: V
catch: V
catches: V
caught: V

Chinese Grammar:

Rules:

NP → CLP N
CLP → Num CL
VP → V NP
S → NP VP

Lexicon:

一 yi (one/a): Num
两 liang (two): Num
三 san (three): Num
程序员 chengxuyuan (programmer): N
鹅 e (goose): N
鱼 yu (fish): N
观察 guancha (observe): V
抓到 zhuadao (catch): V
个 ge: CL
名 ming: CL
只 zhi: CL
条 tiao: CL

Here is a list of all of the nouns in this question and their acceptable classifiers:

• 鹅 e goose: zhi;
• 鱼 yu fish: zhi, tiao;
• 程序员 chengxuyuan programmer: ge, ming.
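The noun–classifier table above amounts to a compatibility lookup. A minimal Python sketch (illustrative only, not part of the TRALE grammar; the pinyin keys are the assignment's own lexical entries) might check it like this:

```python
# Classifier-noun agreement table from the assignment (illustrative sketch).
ACCEPTABLE_CL = {
    "e": {"zhi"},                  # goose: only the animal CL
    "yu": {"zhi", "tiao"},         # fish: animal CL or tiao
    "chengxuyuan": {"ge", "ming"}  # programmer: person/occupation CLs
}

def np_ok(num, cl, noun):
    """A Chinese NP of shape [Num CL N] is well formed iff the CL suits the noun."""
    return cl in ACCEPTABLE_CL.get(noun, set())

print(np_ok("san", "zhi", "e"))    # True:  san zhi e
print(np_ok("san", "tiao", "e"))   # False: *san tiao e
```

In TRALE, the analogous constraint is enforced with features on the CL and N entries rather than with a table lookup.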

  1. (a)  (6 marks) Implement one grammar for each language pursuant to the specifications above. English: q1_en.pl and Chinese: q1_zh.pl.

    Neither of your grammars needs to handle embedded clauses, e.g., a programmer caught two geese observe a fish. Similarly, your Chinese grammar doesn't need to parse sentences like example (15):

    (15) 一名程序员 抓到 两只鹅 观察 一条鱼
         yi ming chengxuyuan zhuadao liang zhi e guancha yi tiao yu
         A programmer caught two geese observe a fish.

    For the Chinese grammar, the lexical entries can be coded either in pinyin (the Romanized transcriptions of the Chinese characters) or in simplified Chinese characters.

  2. (b)  (4 marks) Use your grammars to parse and translate the following sentences. Save and submit all the translation results in the .grale format. The results of sentence (16) should be named q1b_en.grale and the results of sentence (17) should be named q1b_zh.grale.

(16) Two geese caught one fish

(17) 两个程序员 观察 三条鱼
     liang ge chengxuyuan guancha san tiao yu

Operational Instructions

  • If you decide to use simplified Chinese characters, enter them in Unicode and use the -u flag when you run TRALE.

  • Independently test your grammars in TRALE first, before trying to translate.

  • Use the function translate to generate a semantic representation of your source sentence. If your sentence can be parsed, the function translate should open another Gralej interface with all of the translation results.

            | ?- translate([two,geese,observe,one,fish]).

  • To save the translation results, on the top left of the Gralej window (the window with the INITIAL CATEGORY entry and all of the translated sentences listed), click File >> Save all >> TRALE format.

  • Don't forget to close all of the windows or kill both of the Gralej processes after you finish. Each Gralej process takes up one port on the server, and no one can use the server if we run out of ports.

2. Quantifier Scope [30 marks]

For this assignment, we will consider two quantifiers: the universal quantifier (every, mei) and the existential quantifier (a, yi). In English, both quantifiers behave as singular determiners.

(18) A professor stole every cookie.

(19) * A professor stole every cookies.

(20) * A professors stole every cookie.

In Chinese, both of these quantifiers behave more like numerical determiners. In addition, when a universal quantifier modifies an NP that occurs before the verb (such as a universally quantified subject), the preverbal operator 都 (dou) is required. When a universally quantified NP occurs after the verb, the dou operator must not appear with it.

(21) Every professor stole a cookie.

(22) A professor stole every cookie.

(23) 每个教授 都偷了 一块饼干
     mei ge jiaoshou dou tou-le yi kuai binggan
     every ge-CL professor DOU stole one kuai-CL cookie

(24) *每个教授 偷了 一块饼干
     mei ge jiaoshou tou-le yi kuai binggan
     every ge-CL professor stole one kuai-CL cookie

(25) 一个教授 偷了 每块饼干
     yi ge jiaoshou tou-le mei kuai binggan
     one ge-CL professor stole every kuai-CL cookie

(26) *一个教授 都偷了 每块饼干
     yi ge jiaoshou dou tou-le mei kuai binggan
     one ge-CL professor DOU stole every kuai-CL cookie
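For the basic SVO cases in (23)-(26), the distribution of dou reduces to a single positional condition. A hypothetical Python check (my own sketch, covering only these four examples and not the topicalization cases discussed later) makes this explicit:

```python
# Sketch of the dou-licensing rule for simple SVO clauses: dou is required
# exactly when a universally quantified NP precedes the verb. (Illustrative
# only; the assignment implements this constraint in TRALE, not Python.)

def dou_ok(preverbal_universal, has_dou):
    """True iff dou's presence matches the preverbal-universal requirement."""
    return has_dou == preverbal_universal

print(dou_ok(True, True))    # (23) universal subject + dou: grammatical
print(dou_ok(True, False))   # (24) universal subject, no dou: *
print(dou_ok(False, False))  # (25) postverbal universal, no dou: grammatical
print(dou_ok(False, True))   # (26) no preverbal universal, dou: *
```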

Quantifier Scope Ambiguity  In lecture, we talked about different kinds of ambiguity. In many English sentences, no matter what the order of the quantifiers, there is a quantifier scope ambiguity. For example, the sentence every programmer speaks a language has two readings:

• (∃ > ∀) Every programmer speaks a language. [The language is Inuktitut.]
• (∀ > ∃) Every programmer speaks a language. [Some programmers speak Inuktitut and some programmers speak Aymara.]

The symbol (∃ > ∀) means the existential quantifier outscopes the universal quantifier in a logical form representation of the sentence.

S: LF: ??
├─ NP (every programmer): LF: λP.∀x.(programmer(x) ⇒ P(x))
└─ VP (speaks a language): LF: λz.∃y.(language(y) ∧ speak(z, y))

Figure 1: Beta Reduction. What should be the LF of S?

We can write the semantics of the two sentences in their logical forms (LF) to distinguish the two readings:

∃y.(language(y) ∧ ∀x.(programmer(x) ⇒ speak(x, y)))
∀x.(programmer(x) ⇒ ∃y.(language(y) ∧ speak(x, y)))

English sentences (27, 28) are scopally ambiguous no matter what the linear order of the quantifiers is. But in Chinese, a sentence is scopally ambiguous only when the universally quantified NP precedes the existential NP: (29) is ambiguous, but (30) is unambiguous.2

(27) Every programmer speaks a language
     Ambiguous: ∀ > ∃, ∃ > ∀

(28) A programmer speaks every language
     Ambiguous: ∀ > ∃, ∃ > ∀

(29) 每个程序员 都会说 一种语言
     mei ge chengxuyuan dou huishuo yi zhong yuyan
     every ge-CL programmer DOU speak one zhong-CL language
     Ambiguous: ∀ > ∃, ∃ > ∀

(30) 一个程序员 会说 每种语言
     yi ge chengxuyuan huishuo mei zhong yuyan
     one ge-CL programmer speak every zhong-CL language
     Unambiguous: ∃ > ∀

How can we derive the LF of the two readings? We use a process called beta reduction. Recall the lambda-calculus notation: λx.x² denotes a function that takes a variable x and returns the square of its value (x²). After substituting the argument for the bound variable x, we can reduce the function application in the body of the lambda term to a new expression. For example, applying 2 to λx.x² will get us:

(λx.x²)(2) = 2²

S: ∀x.(programmer(x) ⇒ ∃y.(language(y) ∧ speak(x, y)))
├─ NP: λP.∀x.(programmer(x) ⇒ P(x))
│  ├─ Q (every): λF.λP.∀y.(F(y) ⇒ P(y))
│  └─ N (programmer): λx.programmer(x)
└─ VP: λz.∃y.(language(y) ∧ speak(z, y))
   ├─ V (speaks): λQ.λz.Q(λx.speak(z, x))
   └─ NP: λP.∃y.(language(y) ∧ P(y))
      ├─ Q (a): λF.λP.∃y.(F(y) ∧ P(y))
      └─ N (language): λx.language(x)

Figure 2: Beta reduction analysis of the sentence every programmer speaks a language.

This process is also known as beta reduction (denoted →β). Note that beta reduction itself does not tell us that this equals 4; that is obtained by a subsequent process of arithmetic evaluation. But we can use beta reduction even if we don't evaluate.

We can also perform beta reduction on variables that stand for functions. For example, applying λx.x² as the argument of λF.F(2) will yield:

(λF.F(2))(λx.x²) = (λx.x²)(2) = 2²
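Both reductions can be made concrete with Python's own lambdas (a small illustration of the idea; none of this code is part of the assignment):

```python
# Beta reduction realised as ordinary function application.
square = lambda x: x ** 2

# (lambda x. x^2)(2) -> 2^2; arithmetic evaluation then gives 4.
print(square(2))             # 4

# Higher-order case: (lambda F. F(2))(lambda x. x^2) -> (lambda x. x^2)(2).
apply_to_two = lambda F: F(2)
print(apply_to_two(square))  # 4
```

Python eagerly performs the arithmetic evaluation as well, which is exactly the step the text notes is separate from beta reduction itself.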

Now, let's look at an example that uses beta reduction to compute the LF of a sentence. As shown in figure 1, we know that the LF of the NP every programmer is λP.∀x.(programmer(x) ⇒ P(x)) and the LF of the VP speaks a language is λz.∃y.(language(y) ∧ speak(z, y)). What is the LF of every programmer speaks a language?

λP.∀x.(programmer(x) ⇒ P(x)) (λz.∃y.(language(y) ∧ speak(z, y)))
→β ∀x.(programmer(x) ⇒ (λz.∃y.(language(y) ∧ speak(z, y)))(x))
→β ∀x.(programmer(x) ⇒ ∃y.(language(y) ∧ speak(x, y)))

The process of repeatedly applying beta reduction to every subterm until we reach an irreducible statement is called beta normalisation.

Figure 2 shows the complete analysis of the sentence every programmer speaks a language. Familiarize yourself with every part of the analysis. But this only generates one of the two readings, the surface reading (∀ > ∃). We will use a technique called quantifier storage to capture the scopal ambiguity and make both readings available.
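The compositional derivation in Figure 2 can be mimicked with Python closures that build LF strings in the assignment's own notation (forall/exists, => for implication, ^ for conjunction). The combinator names below are my own illustrative choices, not TRALE's:

```python
# Compositional construction of the surface reading, mimicking Figure 2.
# LFs are built as strings in the assignment's notation (forall, exists, =>, ^).

programmer = lambda x: f"programmer({x})"
language = lambda x: f"language({x})"

# Q (every): lambda F. lambda P. forall x.(F(x) => P(x))
every = lambda F: lambda P: f"forall x.({F('x')} => {P('x')})"
# Q (a): lambda F. lambda P. exists y.(F(y) ^ P(y))
a = lambda F: lambda P: f"exists y.({F('y')} ^ {P('y')})"

# V (speaks): lambda Q. lambda z. Q(lambda x. speak(z, x))
speaks = lambda Q: lambda z: Q(lambda x: f"speak({z},{x})")

np_subj = every(programmer)  # every programmer
np_obj = a(language)         # a language
vp = speaks(np_obj)          # speaks a language
print(np_subj(vp))
# forall x.(programmer(x) => exists y.(language(y) ^ speak(x,y)))
```

As in Figure 2, function application at each branching node performs the beta reduction, and only the surface (forall-wide) reading comes out.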

Quantifier Storage  If quantifier scoping is a semantic effect, how do we represent it in syntax? When there is no ambiguity, keeping track of the quantifier scope is pretty straightforward. To keep track of and resolve scope ambiguities, we will use a list called a quantifier store. The idea behind QSTORE is that, instead of consuming all of the LF components right away, we can choose to keep them in QSTORE and apply them later.

S, at (2):
   LF: ∀x.(programmer(x) ⇒ speak(x, z))
   QSTORE: ⟨z; λG.∃y.(language(y) ∧ G(y))⟩
├─ NP (every programmer):
│    LF: λP.∀x.(programmer(x) ⇒ P(x))
│    QSTORE: ⟨⟩
└─ NP (a language), inside the VP, at (1):
     LF: λF.F(z)
     QSTORE: ⟨z; λG.∃y.(language(y) ∧ G(y))⟩

Figure 3: Quantifier Storage. Storing the quantifier at (1), and retrieving it later at (2).

Let's go back to the example, every programmer speaks a language (figure 3). We first store the LF of the NP a language at (1) and replace the LF of the NP with a placeholder λF.F(z). The variable z in this expression occurs free, and it is the same variable as the z in the store and in the LF of the sentence. We retrieve the logical form from the store at (2). The retrieval process consists of three steps:

  1. First, we construct a function λz.LS, where LS is the current LF, and z is the variable paired with it in the QSTORE entry. In our particular case, this will yield λz.∀x.(programmer(x) ⇒ speak(x, z)).

  2. Then, we apply this function to the LF from the QSTORE entry.

  3. Finally, we beta normalise. Using beta normalisation, we obtain the second reading of the sentence.

     λG.∃y.(language(y) ∧ G(y)) (λz.∀x.(programmer(x) ⇒ speak(x, z)))
     →β ∃y.(language(y) ∧ (λz.∀x.(programmer(x) ⇒ speak(x, z)))(y))
     →β ∃y.(language(y) ∧ ∀x.(programmer(x) ⇒ speak(x, y)))
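The three retrieval steps can be replayed with the same string-building style of closure (an illustrative Python sketch in the assignment's forall/exists notation; the names are mine, not TRALE's):

```python
# Quantifier-storage retrieval replayed in Python. The stored quantifier and
# the gapped sentence LF are represented as closures over the scope variable.

# QSTORE entry for "a language": lambda G. exists y.(language(y) ^ G(y))
stored = lambda G: f"exists y.(language(y) ^ {G('y')})"

# Step 1: the current sentence LF, abstracted over the stored variable z:
# lambda z. forall x.(programmer(x) => speak(x, z))
ls = lambda z: f"forall x.(programmer(x) => speak(x,{z}))"

# Steps 2-3: apply the stored quantifier to lambda z.LS; the function call
# itself performs the beta normalisation.
inverse_reading = stored(ls)
print(inverse_reading)
# exists y.(language(y) ^ forall x.(programmer(x) => speak(x,y)))
```

Delaying the quantifier in the store and applying it at the sentence level is exactly what lets the existential outscope the universal here.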

Topicalization and Movement  Topicalization is a linguistic phenomenon in which an NP appears at the beginning of a sentence in order to establish it as the topic of discussion or to emphasize it in some other way. It plays an important role in the syntax of fixed-word-order languages because grammatical function is mainly determined by word order. Both Chinese and English exhibit topicalization. The entire object NP, for example, can be moved to the beginning of the sentence in either language. But in Chinese, object topicalization is more restricted when the subject is quantified: it can happen when the subject is universally quantified, but not when it is existentially quantified (33-36).

(31) A language, every programmer speaks.
     Ambiguous: ∀ > ∃ and ∃ > ∀

(32) Every language, a programmer speaks.
     Ambiguous: ∀ > ∃ and ∃ > ∀

(33) 一种语言 每个程序员 都会说
     yi zhong yuyan mei ge chengxuyuan dou huishuo
     one zhong-CL language every ge-CL programmer DOU speak
     Unambiguous: ∃ > ∀

(34) 每种语言 每个程序员 都会说
     mei zhong yuyan mei ge chengxuyuan dou huishuo
     every zhong-CL language every ge-CL programmer DOU speak

(35) *一种语言 一个程序员 会说
     yi zhong yuyan yi ge chengxuyuan huishuo
     one zhong-CL language one ge-CL programmer speak

(36) *每种语言 一个程序员 (都)会说
     mei zhong yuyan yi ge chengxuyuan (dou) huishuo
     every zhong-CL language one ge-CL programmer (DOU) speak

[Parse tree omitted: the topicalized object NP a language sits at the left periphery of S; every programmer speaks follows, with an empty trace ε in the object position.]

Figure 4: English topicalization parse tree: example (31).

In English, neither subject–verb agreement nor quantifier scope ambiguity is generally affected by movement. In particular, the number and person of the subject should always agree with the predicate no matter where the subject occurs. Here, you can assume that Chinese also follows subject–verb agreement (regarding the requirement of dou) in the same way that English does. But whereas in English both readings are still available after the sentences are topicalised (31, 32), this is not the case in Chinese. Compared to its untopicalised counterpart (29), the topicalised sentence (33) is no longer ambiguous.

Figures 4 and 5 show the parse trees of sentences (31) and (33). Topicalization is generally analysed with gaps. An empty trace is left in the untopicalized position of the object NP, where the gap is introduced. The gapped NP then percolates up the tree, and is finally unified with the topicalized NP at the left periphery of the sentence.3

3Although Chinese is an SVO (subject–verb–object) language, there is a means of performing "double movement."

[Parse tree omitted: the topicalized object NP 一种语言 yi zhong yuyan sits at the left periphery; the subject NP 每个程序员 mei ge chengxuyuan, 都 dou, and the verb 会说 huishuo speak follow, with a trace in the object position.]

Figure 5: Chinese topicalization parse tree: example (33).

  1. (a)  (2 marks) Manually convert all readings of the sentences (28) and (30) to logical expressions. Put your logical forms in section 2(a) of analysis.txt. Use exists and forall for the quantifiers, and use => and the caret symbol ^ for implication and conjunction.

  2. (b)  (10 marks) Implement grammars for the syntax of quantifier scope ambiguity. You don't need to account for meanings, or for ambiguity in meanings (there should be no syntactic ambiguities). At this point, a correct grammar will produce exactly one parse for every grammatical sentence. Test your implementation before you move on to the next step.

  3. (c)  (10 marks) Augment your grammars to represent meaning and quantifier scope ambiguity. Marks for question 2(b) will be deducted if your work on this part causes errors in the syntactic predictions. Your grammar should generate more than one parse for each ambiguous sentence.

  4. (d)  (4 marks) Translate sentences (28) and (30), as you did in the first question.

    Operational Instructions

    • Use the function translate to generate semantic representations of your source sentences. If your sentences can be parsed, translate should open another Gralej window with all of the translation results.

              | ?- translate([a,programmer,speaks,every,language]).
    • You will be prompted as follows to see the next parse.

                ANOTHER? y
                ANOTHER? y

      Answer y to see the next parse until you reach the end. Each time, TRALE will open a new Gralej window. You need to store all of your translation results by repeating the previous step. A no will be returned when you reach the end of your parses.

    • Save your translations of sentence (28) as q2d_28_1.grale, q2d_28_2.grale, ..., and your translations of sentence (30) as q2d_30_1.grale, q2d_30_2.grale, ...

    • Submit a zip file q2d.zip containing all the translation results. You can use this command: zip -r q2d.zip q2d_*.grale to create the zip file.

    • Again, don't forget to close all the windows and kill your Gralej processes after you finish.

(Footnote 3, continued:)

(1) 一个程序员 每种语言 都会说
    yi ge chengxuyuan mei zhong yuyan dou huishuo
    one ge-CL programmer every zhong-CL language DOU speak
    A programmer every language speak.

We will ignore these.

(e) (4 marks) Large Language Models (LLMs) have gained quite a lot of popularity recently. In this question, you will explore whether LLMs such as ChatGPT "understand" quantifier scope.

There are several approaches to this exploration: you can ask ChatGPT to translate sentences and compare its translations to your grammar's translations; you can directly interrogate ChatGPT about quantifier scope readings and analyse its responses; or you can design some clever linguistic tasks that involve quantifier scope and observe how ChatGPT handles them. Be both creative and precise in your experimentation.

In your write-up, report at least one case where ChatGPT's behaviour differs from that of your grammar. Document the prompts you used and describe your experimental design. Reflect on the differences observed and share your thoughts on why these differences may have occurred.
