Created: Mon 2017-11-27
w48: thu, nov 30
- began learning dependency parsing; tried annotating word dependencies. a lot of the syntactic relations were really confusing.
- did two mock exams for probability. There were still some problems I couldn't get right, especially those involving Bayes' theorem. This is bad because it is widely used in "given this, what is the probability that…" situations. (Some practice problems on Brilliant.org are really helpful.)
- Further note: textbook examples and exams at this level are just toys. Should absolutely spend more time on it. The good news is there's always some new epiphany (for lack of a better word) every now and then if you keep spending time on it.
- Got tripped up by Simpson's paradox (still working on probability).
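For my own reference, a dependency annotation can be stored as (dependent, head, relation) triples and checked for treeness. A minimal sketch; the sentence and relation labels here are my own made-up example, loosely following Universal Dependencies naming:

```python
# Toy dependency annotation for "She eats fish". Each word has exactly one
# head; the root's head index is 0. (Example sentence and labels are my own,
# loosely modeled on Universal Dependencies relation names.)
sentence = ["She", "eats", "fish"]
# (dependent_index, head_index, relation), 1-based word indices, 0 = root
arcs = [(1, 2, "nsubj"), (2, 0, "root"), (3, 2, "obj")]

def is_valid_tree(arcs, n):
    """Every word has exactly one head, and exactly one word is the root."""
    heads = {dep: head for dep, head, _ in arcs}
    roots = [dep for dep, head, _ in arcs if head == 0]
    return len(heads) == n and len(roots) == 1

print(is_valid_tree(arcs, len(sentence)))  # True
```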
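To pin down the "given this, what is the probability that…" pattern, here is a worked Bayes' theorem sketch with hypothetical numbers of my own (a test with 99% sensitivity, 95% specificity, 1% prevalence):

```python
# Bayes' theorem with made-up illustrative numbers:
# P(D | +) = P(+ | D) P(D) / (P(+ | D) P(D) + P(+ | not D) P(not D))
p_d = 0.01                 # prevalence: P(disease)
p_pos_given_d = 0.99       # sensitivity: P(positive | disease)
p_pos_given_not_d = 0.05   # false positive rate: P(positive | no disease)

# total probability of a positive test
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # 0.167 — surprisingly low despite a "99% accurate" test
```

The counterintuitive part is that the low prior (1% prevalence) dominates, which is exactly the kind of problem I keep getting wrong.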
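On Simpson's paradox: the classic kidney-stone numbers (from the often-cited Charig et al. study, quoted here from memory) show a treatment winning in every subgroup but losing in the aggregate:

```python
# Simpson's paradox: treatment A has a higher success rate than B for both
# small and large stones, yet a lower success rate overall, because A was
# given mostly to the harder (large-stone) cases.
a = {"small": (81, 87), "large": (192, 263)}    # (successes, total)
b = {"small": (234, 270), "large": (55, 80)}

def rate(successes, total):
    return successes / total

for size in ("small", "large"):
    assert rate(*a[size]) > rate(*b[size])  # A better within each subgroup

a_total = rate(sum(s for s, _ in a.values()), sum(t for _, t in a.values()))
b_total = rate(sum(s for s, _ in b.values()), sum(t for _, t in b.values()))
print(a_total < b_total)  # True: B looks better in the aggregate
```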
w48: wed, nov 29
- failed to deliver ass2 on time even though I worked on it for the better part of the day.
- produced some hands-on examples to illustrate what happens if you try to tag a piece of badly tokenized text.
- read about hidden Markov models for hours, still don't quite get them.
- took the Basic Swedish exam; it was easy.
- watched the lecture video that was also due today (dependency trees and structure trees). feeling crushed by these deadlines.
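The bad-tokenization effect can be shown with a tiny sketch (my own toy example, not the actual ones from class): a lookup tagger where a token with punctuation glued on misses the lexicon and falls back to the default tag.

```python
# Toy lookup tagger (lexicon and tags are made up for illustration).
# With bad tokenization, "fish." keeps its period attached, misses the
# lexicon, and gets the unknown-word default tag.
lexicon = {"she": "PRON", "eats": "VERB", "fish": "NOUN", ".": "PUNCT"}

def tag(tokens, default="X"):
    return [(t, lexicon.get(t.lower(), default)) for t in tokens]

good = ["She", "eats", "fish", "."]
bad = ["She", "eats", "fish."]        # period not split off
print(tag(good))  # ..., ('fish', 'NOUN'), ('.', 'PUNCT')
print(tag(bad))   # ..., ('fish.', 'X') — unknown word
```

Real statistical taggers fail less bluntly, but the principle is the same: garbage tokens push the tagger onto its unknown-word path.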
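To make the HMM reading concrete for myself, here is a minimal Viterbi decoder for a two-state HMM POS tagger. All probabilities are made-up toy numbers; the algorithm is the standard dynamic program (best path probability per state per position, plus backpointers):

```python
# Minimal Viterbi decoding for an HMM tagger. Probabilities are invented
# toy numbers; only the algorithm itself is the point.
states = ["NOUN", "VERB"]
start = {"NOUN": 0.6, "VERB": 0.4}
trans = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},
         "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit = {"NOUN": {"fish": 0.6, "swim": 0.4},
        "VERB": {"fish": 0.3, "swim": 0.7}}

def viterbi(words):
    # v[t][s] = probability of the best tag sequence ending in state s at time t
    v = [{s: start[s] * emit[s][words[0]] for s in states}]
    back = [{}]
    for t in range(1, len(words)):
        v.append({})
        back.append({})
        for s in states:
            prev, p = max(((r, v[t - 1][r] * trans[r][s]) for r in states),
                          key=lambda x: x[1])
            v[t][s] = p * emit[s][words[t]]
            back[t][s] = prev
    # trace backpointers from the best final state
    best = max(states, key=lambda s: v[-1][s])
    path = [best]
    for t in range(len(words) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

print(viterbi(["fish", "swim"]))  # ['NOUN', 'VERB']
```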
w48: tue, nov 28
- lab7: worked on a toy lemmatizer. the problem is not ideally designed; you can achieve a fairly good score just by lower()-ing all words.
- worked on ass2.
- searched for tagsets for Chinese and Cantonese. CUHK has a Cantonese Child Language Corpus. UCLA has a written Chinese corpus, which uses the PKU tagset.
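Why the lower() baseline scores well on the lemmatizer lab (sketch with made-up word/lemma pairs, not the actual lab data): running text is mostly function words and uninflected forms whose lemma *is* the lowercased form, so per-token accuracy rewards doing nothing.

```python
# Invented (word, gold_lemma) pairs mimicking running English text.
# Most tokens are function words or base forms, so lemma == word.lower().
pairs = [("The", "the"), ("cat", "cat"), ("sat", "sit"), ("on", "on"),
         ("the", "the"), ("mat", "mat"), (",", ","), ("and", "and"),
         ("it", "it"), ("slept", "sleep")]

baseline = [w.lower() for w, _ in pairs]  # "lemmatizer" that only lowercases
accuracy = sum(p == g for p, (_, g) in zip(baseline, pairs)) / len(pairs)
print(accuracy)  # 0.8 — only the inflected verbs are wrong
```

A better evaluation would score only tokens whose lemma differs from the lowercased form.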
w48: mon, nov 27
- tried to read chapter 7 of Schay. understood very little of the content.
- “Tur för mig” means “lucky me”, not “it's my turn”.
- noticed another faux ami with English: “ge sig ut” means “to go out”.
- read 5.1 to 5.3 of JM, which covered part-of-speech tagging, incl. tagsets for English.
- A thought while reading the probability textbook: the more you learn, the more you feel that math is a system. I'm not saying I understand much of it; I'm just saying I can see some beauty in it.
w47: sun, nov 26
- exp 6.1.1 (Misprints on a Page): misunderstood what lambda means in the Poisson distribution
- exp 6.1.2 (Diners at a restaurant)
- p.d.f. of a normal distribution
- course slides: morphological analysis, finite state morphology, and stemming
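Notes to self on the two distributions above. The formulas are standard; the numbers are my own made-up illustration. The thing I misunderstood: in the Poisson distribution, lambda is the *expected number* of events per interval (e.g. average misprints per page), not a probability.

```python
import math

# Poisson pmf: P(X = k) = e^(-lam) * lam^k / k!
# lam = expected event count per interval (e.g. avg misprints per page).
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Averaging 2 misprints per page, probability a page has none:
print(round(poisson_pmf(0, 2.0), 4))  # 0.1353 (= e^-2)

# p.d.f. of a normal distribution with mean mu and std dev sigma:
# f(x) = 1 / (sigma * sqrt(2*pi)) * exp(-(x - mu)^2 / (2 * sigma^2))
def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

print(round(normal_pdf(0.0, 0.0, 1.0), 4))  # 0.3989 (= 1 / sqrt(2*pi))
```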