Selectional Preference Learning


See also: 

CorMet (Metaphor Extraction System)


CorMet: a computational, corpus-based conventional metaphor extraction system [PDF] from upenn.edu ZJ Mason – Computational Linguistics, 2004 – dl.acm.org … CorMet first uses the selectional-preference-learning algorithm described in Resnik (1993), then clustering over the results. … An overall measure of the choosiness of a case slot is selectional-preference strength, SR(p), defined as the relative entropy of the posterior probability … Cited by 84 – Related articles – BL Direct – All 27 versions

[PDF] Flexible, corpus-based modelling of human plausibility judgements [PDF] from upenn.edu S Padó, U Padó… – Proceedings of EMNLP/CoNLL, 2007 – acl.ldc.upenn.edu … We present Resnik’s model in some detail, since we will use it for comparison below. Resnik first computes the overall selectional preference strength for each verb-relation pair, i.e. the degree of “constrainedness” of each relation. … Cited by 16 – Related articles – View as HTML – All 27 versions

Automatically selecting useful phrases for dialogue act tagging [PDF] from arxiv.org K Samuel, S Carberry… – Arxiv preprint cs/9906016, 1999 – arxiv.org … Method (Abbreviation): Previous Literature (LIT), All Phrases (ALL), Cooccurrences (COOC), Conditional Probability (CP), Entropy (ENT), T Test (TTEST), Mutual Information (MI), Selectional Preference Strength (S), Information Gain (IG), Deviation (D), Deviation Conditional Probability (DCP) … Cited by 28 – Related articles – All 6 versions

Computational Metaphor Identification in Communities of Blogs E Baumer… – International Conference on Weblogs and Social …, 2008 – aaai.org … hinges largely on selectional preference learning (Resnik 1993). For example, the English verbs “eat” or “drink” tend to have a human or animal as the subject and food or potable liquid as the direct object, respectively. The selectional preference strength for a given verb … Cited by 2 – Related articles – All 2 versions

[PDF] Computational identification of conceptual metaphors in communities of Blogs [PDF] from ericbaumer.com E Baumer… – 2008 – ericbaumer.com … implementation hinges largely on selectional preference learning [8]. For example, the English verbs “eat” or “drink” tend to have a human or animal as the subject and either food or potable liquid as the direct object, respectively. The selectional preference strength for a … Related articles – View as HTML – All 9 versions

[PDF] A simple, similarity-based model for selectional preferences [PDF] from upenn.edu K Erk – ANNUAL MEETING-ASSOCIATION FOR …, 2007 – acl.ldc.upenn.edu … Resnik uses the WordNet noun hierarchy for generalization. His information-theoretic approach models the selectional preference strength of an argument position rp of a predicate p as S(rp) = Σc P(c|rp) log ( P(c|rp) / P(c) ), where the c are WordNet synsets. … Cited by 55 – Related articles – View as HTML – BL Direct – All 26 versions

Latent variable models of selectional preference [PDF] from cam.ac.uk DO Séaghdha – Proceedings of the 48th Annual Meeting of the …, 2010 – dl.acm.org … This paper takes up tools (“topic models”) that have been proven successful in modelling document-word co-occurrences and adapts them to the task of selectional preference learning. … 2 Related work 2.1 Selectional preference learning … Cited by 14 – Related articles – All 10 versions

A pilot study of English selectional preferences and their cross-lingual compatibility with Basque [PDF] from psu.edu E Agirre, I Aldezabal… – Text, Speech and Dialogue, 2003 – Springer … This article is structured as follows. In the next section, a short review of the different approaches to selectional preference learning is presented, alongside the corpora used in this experiment. … 2 Selectional Preference Learning … Cited by 10 – Related articles – BL Direct – All 8 versions

[PDF] Selectional preference and sense disambiguation [PDF] from upenn.edu P Resnik – Proceedings of the ACL SIGLEX Workshop on …, 1997 – acl.ldc.upenn.edu … Information theory provides an appropriate way to quantify the difference between the prior and posterior distributions, in the form of relative entropy (Kullback and Leibler, 1951). The model defines the selectional preference strength of a predicate as: SR(p) = D( Pr(c|p) || Pr(c) ) … Cited by 226 – Related articles – View as HTML – All 19 versions

Extracting Meaning from Cell Phone Improvement Ideas J Turner, R Lencevicius… – Twenty-Second International FLAIRS …, 2009 – aaai.org … alone). Thus words that strongly prefer certain categories have a high selectional preference strength. By multiplying the counts in the word vectors by the words’ selectional preference strength, we emphasize important words. … Related articles – All 2 versions

[PDF] Implicit object constructions and the (in)transitivity continuum [PDF] from umd.edu MB Olsen… – 33rd Proceedings of the Chicago Linguistic …, 1997 – umiacs.umd.edu … set of experiments confirmed the connection between strong selection and implicit objects, showing via a corpus analysis (replicated using several different corpora) that there is a statistically significant correlation between a verb’s selectional preference strength (in present … Cited by 13 – Related articles – All 11 versions

[PDF] A cognitive model for the representation and acquisition of verb selectional preferences [PDF] from upenn.edu A Alishahi… – ACL 2007, 2007 – acl.ldc.upenn.edu … Resnik (1996) defines the selectional preference strength of a verb as the divergence between two probability distributions: the prior probabilities of the classes, and the posterior probabilities of the classes given that verb. The … Cited by 7 – Related articles – View as HTML – All 38 versions

[PDF] The Relationship between Semantic Similarity and Subcategorization Frames in English: A Stochastic Test Using ICE-GB and WordNet [PDF] from mireene.com S Song, JW Choe – 2008 – corpus.mireene.com … Keywords: semantic similarity, subcategorization frames, ICE-GB, WordNet, statistical method, clustering, dendrogram, selectional preference strength … This study also required a method that can calculate the selectional preference strength of subcategorization frames. … Related articles – View as HTML

Automatic acquisition of attribute host by selectional constraint resolution J Zhao, H Liu… – MICAI 2007: Advances in Artificial Intelligence, 2007 – Springer … The selectional preference strength S(ai) of an attribute ai discussed in Section 4 can be an indicator for the attribute host’s generality in the concept taxonomy. Table 1 lists a subset of attributes ranked in descending order with respect to their selectional preference strength. … Related articles – BL Direct – All 2 versions

[PDF] Collecting and employing selectional restrictions [PDF] from psu.edu A Wagner… – First Swiss-Estonian Student Workshop on …, 1996 – Citeseer … He quantifies the selectional preference strength of a predicate with respect to a certain argument, i.e. how strongly the predicate constrains its arguments semantically, by relative entropy. … For example, “eat” has a greater selectional preference strength for its object than “see”. … Cited by 3 – Related articles – View as HTML – All 10 versions

[PDF] Selectional Preference Based Verb Sense Disambiguation Using WordNet [PDF] from assta.org P Ye – Australasian Language Technology Workshop, 2004 – assta.org … prior distribution Prp(t) as the probability of the noun-type t occurring in a particular selectional preference p. From the prior distribution, Resnik defines the selectional preference strength of a particular verb sense s with respect to a particular selectional preference p over a … Cited by 2 – Related articles – View as HTML – All 14 versions

Improving automatic query classification via semi-supervised learning [PDF] from psu.edu SM Beitzel, EC Jensen, O Frieder… – Data Mining, Fifth …, 2005 – ieeexplore.ieee.org … He defines the selectional preference strength S(x) of a word x, as Equation 1, where u ranges over a set U of semantic classes. … S(x) = D( P(U|x) || P(U) ) = Σu P(u|x) log ( P(u|x) / P(u) ) (Equation 1: Selectional Preference Strength) … Cited by 60 – Related articles – All 12 versions

[PDF] Automatic acquisition of taxonomies from text: FCA meets NLP [PDF] from shef.ac.uk P Cimiano, S Staab… – … of the ECML/PKDD Workshop on …, 2003 – dcs.shef.ac.uk … the significance of a certain verb-object pair, we used three different measures: a standard measure based on the conditional probability, the mutual information measure used in (Hindle, 1990), as well as a measure based on Resnik’s selectional preference strength of a … Cited by 75 – Related articles – View as HTML – All 11 versions

[PDF] Evaluating Approaches to Selectional Preference Acquisition against German Plausibility Judgments [PDF] from pyrx.de C Brockmann – pyrx.de … Selectional preference strength captures the relationship between a verb and the entire argument class hierarchy. … (4) A(v,r,c) = ( P(c|v,r) log ( P(c|v,r) / P(c) ) ) / S(v). This measure quantifies the relative contribution of class c to the overall selectional preference strength. … Related articles – View as HTML – All 6 versions

Integrating selectional preferences in wordnet [PDF] from arxiv.org E Agirre… – Arxiv preprint cs/0204027, 2002 – arxiv.org … Abstract Selectional preference learning methods have usually focused on word-to-class relations, e.g., a verb selects as its subject a given nominal class. … 2 Selectional preference learning … Cited by 56 – Related articles – All 7 versions

Selectional constraints: An information-theoretic model and its computational realization [PDF] from um.edu.mt P Resnik – Cognition, 1996 – Elsevier … So, treating the former as q and the latter as p, the difference between the two distributions is quantified as: S(pi) = D( Pr(c|pi) || Pr(c) ) = Σc Pr(c|pi) log ( Pr(c|pi) / Pr(c) ). I will call this quantity selectional preference strength. Notice that, in this model, the selectional preference strength of a … Cited by 146 – Related articles – BL Direct – All 19 versions
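The relative-entropy definition quoted in this entry recurs throughout the list and is compact enough to sketch in code. A minimal illustration (the distributions and names below are toy choices of my own, not drawn from any cited corpus):

```python
import math

def selectional_preference_strength(prior, posterior):
    """S(p) = D(Pr(c|p) || Pr(c)): KL divergence between the class
    distribution seen in a predicate's argument slot (posterior)
    and the overall class distribution (prior)."""
    return sum(
        posterior[c] * math.log2(posterior[c] / prior[c])
        for c in posterior
        if posterior[c] > 0
    )

# Toy class distributions over two WordNet-style noun classes.
prior     = {"food": 0.1, "artifact": 0.9}   # direct-object classes overall
post_eat  = {"food": 0.9, "artifact": 0.1}   # "eat" strongly prefers food objects
post_have = {"food": 0.1, "artifact": 0.9}   # "have" mirrors the prior

print(selectional_preference_strength(prior, post_eat))   # large: a choosy verb
print(selectional_preference_strength(prior, post_have))  # 0.0: no preference
```

A high value flags choosy predicates such as “eat”; a value near zero flags unselective ones such as “have”, matching the “eat”/“have” and “eat”/“see” contrasts cited in the Wagner entries.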

[PDF] Extraction of Selectional Preferences for French using a Mapping from EuroWordNet to the Suggested Upper Merged Ontology [PDF] from uni-stuttgart.de D Spohr – Proceedings of the 4th Global WordNet Conference, …, 2008 – ims.uni-stuttgart.de … The strength of selectional preference SR(p) of a predicate p with respect to a grammatical relation R is defined as follows. SELECTIONAL PREFERENCE STRENGTH: … The selectional preference strength is on the one hand an indicator as to “how much information [. . . … Cited by 3 – Related articles – View as HTML

[PDF] Estimating frequency counts of concepts in multiple-inheritance hierarchies [PDF] from uni-tuebingen.de A Wagner – LDV Forum, 2004 – sfb441.uni-tuebingen.de … Moreover, he uses the distributions p(ncpt|v) and p(ncpt) to quantify the overall preference strength of v. The selectional preference strength quantifies how strongly the predicate semantically constrains its arguments. For example … Cited by 3 – Related articles – View as HTML – All 17 versions

Statistical models for the induction and use of selectional preferences [PDF] from liu.edu M Light… – Cognitive Science, 2002 – Elsevier … This aggregate difference is considered the selectional preference strength of the verb v. The selectional association of v for a specific class, c, is the contribution of that concept to the total selectional preference strength: It is the difference in the distributions at a particular class … Cited by 29 – Related articles – All 38 versions

[PDF] Document Number: Working Paper 5.2b, Project ref. IST-2001-34460, Project Acronym MEANING, Project URL http://www.lsi.upc.es/~nlp/meaning/meaning. … [PDF] from upc.edu E Agirre, I Aldezabal… – lsi.upc.edu … This report is structured as follows. In the next section, a short review of the different approaches to selectional preference learning is presented, followed by the resources used in the experiment. … Finally some conclusions are drawn. 6 SELECTIONAL PREFERENCE LEARNING … Related articles – View as HTML – All 2 versions

[PDF] Unsupervised Text Mining for Ontology Extraction: An Evaluation of Statistical Measures [PDF] from psu.edu ML Reinberger… – Proceeding of LREC, 2004 – Citeseer … the Resnik (Resnik, 1997) measure, which computes for each verb its selectional preference strength Sr(v); this measure is high when the NSs that combine with the verb as objects are infrequent: Sr(v) = Σc p(c|v) log ( p(c|v) / p(c) ). … Cited by 4 – Related articles – View as HTML – All 18 versions

[PDF] Evaluating and combining approaches to selectional preference acquisition [PDF] from pyrx.de C Brockmann… – Master’s thesis, Universität des Saarlandes, …, 2002 – pyrx.de Page 1. Universität des Saarlandes Philosophische Fakultät II – Sprach-, Literatur- und Kulturwissenschaften Fachrichtung 4.7 – Allgemeine Linguistik Computerlinguistik Postfach 15 11 50 66041 Saarbrücken Diplomarbeit Evaluating and Combining Approaches to … Cited by 2 – Related articles – View as HTML – All 7 versions

[Korean-language entry] S Song, JW Choe – 2010 – papersearch.net … Keywords: semantic similarity, subcategorization frames, ICE-GB, WordNet, statistical method, clustering, dendrogram, selectional preference strength. … Cached

Automatic metaphor interpretation as a paraphrasing task [PDF] from rug.nl E Shutova – Human Language Technologies: The 2010 Annual …, 2010 – dl.acm.org … Table 2: The list of paraphrases reranked using selectional preferences … selectional preference strength as follows: SR(v) = D( P(c|v) || P(c) ) = Σc P(c|v) log ( P(c|v) / P(c) ) (6) … Selectional preference strength measures how strongly the predicate constrains its arguments. … Cited by 3 – Related articles – All 12 versions

WordNet and class-based probabilities P Resnik – WordNet: An electronic lexical database, 1998 – books.google.com … For example, the average mutual information I(X; Y) is equivalent to D( p(x,y) || p(x) p(y) ), that is, a measure of how well the independence distribution p(x)p(y) approximates the actual joint distribution p(x,y). Similarly, the selectional preference strength of a … Cited by 46 – Related articles

[PDF] The Effect of Selectional Preferences on Semantic Role Labeling [PDF] from utexas.edu AC Young – 2009 – comp.ling.utexas.edu … classes are. The second is the selectional association between a predicate and a given word class. The selectional preference strength S(p) of a predicate p is computed by first computing two probability distributions and then finding the difference … Cited by 1 – Related articles – View as HTML – Library Search – All 4 versions

Automatic web query classification using labeled and unlabeled training data [PDF] from iit.edu SM Beitzel, EC Jensen, O Frieder… – Proceedings of the 28th …, 2005 – dl.acm.org … x,u) pairs, discarding y’s for which we have no semantic information 3. Mine the (x,u) pairs to find lexemes that prefer to be followed or preceded by lexemes in certain categories (preferences) 4. Score each preference using Resnik’s Selectional Preference Strength [4] and keep … Cited by 57 – Related articles – All 13 versions

[PDF] Clustering concept hierarchies from text [PDF] from kit.edu P Cimiano, A Hotho… – Proceedings of LREC, 2004 – aifb.kit.edu … where the selectional preference strength of a verb is defined according to (Resnik, 1997) … Cited by 45 – Related articles – View as HTML – All 18 versions

Explaining away ambiguity: Learning verb selectional preference with Bayesian networks [PDF] from arxiv.org M Ciaramita… – Proceedings of the 18th conference on …, 2000 – dl.acm.org … S(p,r) = D( P(c|p,r) || P(c) ) (1) Resnik defines the selectional association of a predicate for a particular class c to be the portion of the selectional preference strength due to that class: A(p,r,c) = ( P(c|p,r) log ( P(c|p,r) / P(c) ) ) / S(p,r) (2) … Cited by 42 – Related articles – All 29 versions
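The selectional-association quantity in this entry (and in the Brockmann and Zhao et al. entries) is the per-class share of the total preference strength, so the shares over all classes sum to one. A minimal sketch (the function names, class labels, and numbers are my own toy choices):

```python
import math

def sps(prior, post):
    """Overall preference strength S(p,r) = D(P(c|p,r) || P(c))."""
    return sum(post[c] * math.log2(post[c] / prior[c]) for c in post if post[c] > 0)

def selectional_association(prior, post, c):
    """A(p,r,c): the portion of S(p,r) contributed by class c."""
    return post[c] * math.log2(post[c] / prior[c]) / sps(prior, post)

prior = {"food": 0.25, "animal": 0.25, "artifact": 0.50}
post  = {"food": 0.60, "animal": 0.30, "artifact": 0.10}  # hypothetical object slot

assocs = {c: selectional_association(prior, post, c) for c in post}
print(assocs)                 # "food" dominates; "artifact" comes out negative
print(sum(assocs.values()))   # the shares partition S(p,r), summing to one
```

Individual shares can be negative for dispreferred classes and are unbounded in magnitude, yet still sum to one for a given predicate and relation, as the Baumer snippet on normalization observes.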

Computational mechanisms for metaphor in languages: a survey [PDF] from ict.ac.cn CL Zhou, Y Yang… – Journal of Computer Science and …, 2007 – Springer … The third step is Selectional Preference Learning. CorMet uses a selectional-preference-learning algorithm to get a verb semantic preference, which is an overall measure of the choosiness of a case slot measured by selectional-preference strength, SR(p). … Cited by 13 – Related articles – BL Direct – All 5 versions

Semi-supervised WSD in selectional preferences with semantic redundancy [PDF] from aclweb.org X Tang, X Chen, W Qu… – … of the 23rd International Conference on …, 2010 – dl.acm.org Page 1. Coling 2010: Poster Volume, pages 1238-1246, Beijing, August 2010 Semi-Supervised WSD in Selectional Preferences with Semantic Redundancy Xuri TANG 1,5 , Xiaohe CHEN 1 , Weiguang QU 2,3 and Shiwen YU 4 … Cited by 2 – Related articles – All 9 versions

Inducing Chinese Selectional Preference Based on HowNet Y Jia, H Zan… – 2011 Seventh International …, 2011 – doi.ieeecomputersociety.org … A. Selectional Association In (1), Selectional preference strength (SPS) is defined as the relative entropy of the posterior probability Pr(c|p) and the prior probability Pr(c), which measures how much information the predicate p provides about the semantic class of its argument. …

Computational metaphor extraction to encourage critical reflection and support epistemological pluralism E Baumer – Proceedings of the 8th international conference on …, 2007 – dl.acm.org … frequency in general English. For each of these characteristic verbs in a domain, the algorithm performs selectional preference learning to determine the type of words for which each verb’s case slots tend to select. For example, in … Related articles

Automatic construction of a lexical attribute knowledge base J Zhao, Y Gao, H Liu… – Knowledge Science, Engineering and …, 2007 – Springer … Assoc(ai,c) = ( P(c|ai) log ( P(c|ai) / P(c) ) ) / S(ai), in which S(ai) is called the selectional preference strength of the attribute ai, modeled as the relative entropy between the probability distributions P(c|ai) and P(c): S(ai) = Σc P(c|ai) log ( P(c|ai) / P(c) ) … Cited by 4 – Related articles – BL Direct – All 3 versions

Michael R. Brent, Computational Approaches to Language Acquisition D Estival – Computers and the Humanities, 1999 – Springer … He argues against the view of selectional restrictions as conventionalized inferences (cf. Johnson-Laird, 1983) without reverting to a definitional view based on necessary and sufficient features. Selectional preference strength is the … Related articles – All 3 versions

Computational Approaches to Language Acquisition, Michael R. Brent, ed. M Kanazawa – Journal of Logic, Language and Information, 2004 – Springer … He calculates the selectional preference strength of verbs from three corpora, and compares the results with plausibility judgments and typicality ratings of adult subjects, and with the possibility and likelihood of each verb undergoing object deletion. … BL Direct – All 4 versions

[PDF] ACQUISITION OF SELECTIONAL PREFERENCES IN NATURAL LANGUAGE PROCESSING [PDF] from shef.ac.uk S KARAMBELKAR – 2001 – dcs.shef.ac.uk … Resnik argues that this information is quantitative and not qualitative, and is a scalar quantity. He calls this difference Selectional Preference Strength of the argument. … As a result the difference in probabilities (Selectional Preference Strength) will be high for such predicates. … Related articles – View as HTML – All 2 versions

A non-negative tensor factorization model for selectional preference induction T Van de Cruys – Natural Language Engineering, 2010 – Cambridge Univ Press … He then calculates the selectional preference strength Sr(v) of a specific verb v in a particular relation r by computing the Kullback-Leibler divergence between the class distribution of the verb p(c|v) and the aggregate class distribution p(c): Sr(v) = Σc p(c|v) log ( p(c|v) / p(c) ) (1) … Cited by 1 – Related articles – All 4 versions

Using selectional restrictions for real word error correction [PDF] from rus.ec R Banu… – Applied Computing, 2004 – Springer … between p(C) and p(C|v), where C is a semantic class, p(C|v) is the conditional probability of C occurring as argument of v at some argument position, and p(C) is the marginal probability of C. From this, two quantities, the Selectional Preference Strength (SPS) and … Cited by 1 – Related articles – BL Direct – All 5 versions

[PDF] Automatic labelling of topic models [PDF] from rug.nl JH Lau, K Grieser, D Newman… – Proceedings of the 49th …, 2011 – acl.eldoc.ub.rug.nl … In topic model-based selectional preference learning (Ritter et al., 2010; Ó Séaghdha, 2010), the learned topics can be translated into semantic class labels (eg DAYS OF THE WEEK), and argument positions for individual predicates can be annotated with those labels for … Related articles – View as HTML – All 8 versions

metaViz: Visualizing Computationally Identified Metaphors in Political Blogs [PDF] from ericbaumer.com EPS Baumer, J Sinclair, D Hubin… – … and Engineering, 2009 …, 2009 – ieeexplore.ieee.org … large source of content on a wide variety of topics. All documents in the source and target corpora are parsed [11]. The crux of CMI is selectional preference learning [12]. For example, words for the concept of food are often the … Cited by 5 – Related articles – All 6 versions

Automatic Metaphor Recognition Based on Semantic Relation Patterns X Tang, W Qu, X Chen… – Asian Language Processing ( …, 2010 – ieeexplore.ieee.org … The information-theoretic model [20] is used to acquire SPs. [20] defines two concepts: selectional preference strength (Formula 1) and selectional association (Formula 2), as is defined below: A(p, c) = Pr(c|p) log ( Pr(c|p) / Pr(c) ) / S(p) (1) … Related articles – All 2 versions

[PDF] Classification-based Retrieval Methods to Enhance Information Discovery on the Web [PDF] from airccse.org YK Jain… – International Journal – airccse.org … y, its argument. The selectional preference strength S(x) of a word x, where u ranges over a set U of semantic classes. P(U) is … from P(U). Selectional Preference Strength: S(x) = D( P(u|x) || P(u) ) = Σu P(u|x) log ( P(u|x) / P(u) ). The ideal approach … Related articles – View as HTML – All 5 versions

[PDF] Learning lexical semantic representations [PDF] from mu.oz.au T Baldwin – Lecture notes from the ACL/HCSNet Advanced …, 2006 – cs.mu.oz.au … Detection of Alternations: alternations can be learnt through subcategorisation frame acquisition and (optionally) selectional preference learning (McCarthy, 2001; Schulte im Walde, 2003; Korhonen, 2002) … Cited by 1 – Related articles – View as HTML – All 2 versions

[PDF] Estimating Frequency Counts of Concepts in Multiple-Inheritance Hierarchies [PDF] from dwds.de A Wagner – media.dwds.de … the predicate semantically constrains its arguments. For example, “eat” has a greater selectional preference strength for its object than “have”, because “eat” strongly prefers objects denoting food, whereas “have” can select almost any noun as its object. … Related articles – All 4 versions

[PDF] Computational Metaphor Identification [PDF] from uci.edu EPS Baumer, D Hubin… – luci.ics.uci.edu … 3.1.4 Selectional Preference Learning. … Individual selectional association numbers are theoretically unbounded, but because selectional association is normalized by selectional preference strength, the sum of all selectional associations for any single grammatical relation of a … Related articles – View as HTML – All 2 versions

A flexible, corpus-driven model of regular and inverse selectional preferences [PDF] from pp.ua K Erk, S Padó… – Computational Linguistics, 2010 – MIT Press … For the generalization step, Resnik’s model maps all headwords onto WordNet synsets (or classes) c. Resnik first computes the overall selectional preference strength for each verb-relation pair (v, r), that is, the degree to which the pair constrains possible fillers. … Cited by 2 – Related articles – All 13 versions

[PDF] Computational Metaphor Identification: A Method for Identifying Conceptual Metaphors in Written Text [PDF] from nbu.bg E Baumer, B Tomlinson… – Proc. Analogy – nbu.bg … MacCartney, & Manning, 2006). The crux of CMI is selectional preference learning (Resnik, 1993), which identifies association between different classes of words through specific grammatical relationships. For example, words … Cited by 1 – Related articles – View as HTML – All 4 versions

Automatic classification of web queries using very large unlabeled query logs [PDF] from psu.edu SM Beitzel, EC Jensen, DD Lewis… – ACM Transactions on …, 2007 – dl.acm.org … Resnik [1993] presented an influential information theoretic approach to selectional preference. He defined the selectional preference strength S(x) of a word x, as in Equation (2), where u ranges over a set U of semantic classes. … Cited by 55 – Related articles – BL Direct – All 10 versions

America is like Metamucil: fostering critical and creative thinking about metaphor in political blogs [PDF] from skku.edu EPS Baumer, J Sinclair… – Proceedings of the 28th …, 2010 – dl.acm.org … previous work [26]. For more details, see [3,4]. The crux of CMI is selectional preference learning [29], which quantifies the degree to which certain classes of nouns tend to be associated with specific verbs. For example, words … Cited by 4 – Related articles – All 8 versions

[PDF] A Random Indexing Approach to Unsupervised Selectional Preference Induction [PDF] from diva-portal.org H Hägglöf… – su.diva-portal.org … Following this procedure, Resnik [28] [29] proposed a corpus-based model of selectional preference induction. In an information-theoretic approach of estimating selectional preference strength, Resnik collected a set of verbs and arguments from a corpus. … Related articles – View as HTML – All 2 versions

[PDF] Using Lexical Constraints to Enhance the Quality of Computer-Generated Multiple-Choice Cloze Items [PDF] from aclweb.org CL Liu, CH Wang… – International Journal of Computational …, 2005 – aclweb.org … Correlations (ALL / HIGH / LOW / DIFF) with rank of word frequency (rank 1 is most frequent): key 0.07 / 0.14 / -0.07 / -0.21, distractors 0.11 / 0.15 / 0.03 / -0.15; with selectional-preference strength with the stem of the items: key -0.17 / -0.15 / -0.07 / 0.13, distractors -0.20 / -0.14 / -0.21 / 0.00 … Cited by 10 – Related articles – View as HTML – All 17 versions

[PDF] Exploiting web-derived selectional preference to improve statistical dependency parsing [PDF] from ia.ac.cn G Zhou, J Zhao, K Liu… – … of the 49th Annual Meeting of the …, 2011 – nlpr-web.ia.ac.cn … Conventional selectional preference learning methods have usually focused on word-to-class relations, eg, a verb selects as its subject a given nominal class. This paper extends previous work to word-to-word selectional preferences by using web-scale data. … Related articles – View as HTML – All 10 versions

Fostering metaphorical creativity using computational metaphor identification [PDF] from ericbaumer.com EPS Baumer, B Tomlinson, LE Richland… – Proceeding of the …, 2009 – dl.acm.org … 19,26]. The crux of CMI is selectional preference learning [32], which identifies the tendency of particular words to appear with certain other classes of words in specific grammatical relationships. For example, words … Cited by 2 – Related articles – All 5 versions

[PDF] [Foundations of Statistical Natural Language Processing] © 1999 Massachusetts Institute of Technology, second printing with corrections, 2000 [PDF] from preterhuman.net CD Manning… – mail.preterhuman.net … Some subcategorization frames learned by Manning’s system. 276 An example where the simple model for resolving PP attachment ambiguity fails. 280 Selectional Preference Strength (SPS). 290 Association strength distinguishes a verb’s plausible and implausible objects. … Related articles – View as HTML – All 11 versions

Using semantic preferences to identify verbal participation in role switching alternations [PDF] from psu.edu D McCarthy – Proceedings of the 1st North American chapter of the …, 2000 – dl.acm.org … Earlier work by Resnik (1993) demonstrated a link between selectional preference strength and participation in alternations where the direct object is omitted. Resnik used syntactic information from the bracketing within the Penn Treebank corpus. … Cited by 69 – Related articles – All 29 versions

Metaphor identification using verb and noun clustering [PDF] from aclweb.org E Shutova, L Sun… – … of the 23rd International Conference on …, 2010 – dl.acm.org … Figure 2: Clustered verbs (source domains) The verb clusters contain coherent lists of source domain vocabulary. 3.3 Selectional Preference Strength Filter Following Wilks (1978), we take metaphor to represent a violation of selectional restrictions. … Cited by 3 – Related articles – All 13 versions

Cue phrase selection methods for textual classification problems [PDF] from utwente.nl JH Stehouwer – 2006 – essay.utwente.nl Page 1. Cue Phrase Selection Methods for Textual Classification Problems JH Stehouwer Master of Science Thesis Human Media Interaction Research Group Human Media Interaction Faculty of Computer Science University of Twente Enschede, The Netherlands … Cited by 2 – Related articles – All 5 versions

[PDF] Frames predict the interpretation of lexical omissions [PDF] from colorado.edu J Ruppenhofer… – 2009 – spot.colorado.edu Page 1. Frames predict the interpretation of lexical omissions Josef Ruppenhofer Department of Computational Linguistics and Phonetics Universität des Saarlandes 66123 Saarbrücken Germany josefr@coli.uni-saarland.de … Cited by 2 – Related articles – View as HTML – All 2 versions

Sample Complexity Bounds – Springer … in the log that have (x,u) as a forward pair and the number of queries in the log that can be decomposed as (x,z). This allows one to write a forward rule of the form “x classified as u with weight p,” where p is the MLE of P(u|x), provided that the selectional preference strength of x … Related articles

[PDF] Exploiting distributional similarity for lexical acquisition [PDF] from dialog-21.ru MC Diana – dialog-21.ru … For example, in the expression I’ll eat my hat, the direct object hat is not prototypical of the types of object we usually see with eat, and our model should indicate this. We use a measure of selectional preference strength as an estimate of compositionality. … Related articles – View as HTML

[PDF] Selection and information: a class-based approach to lexical relationships [PDF] from upenn.edu PS Resnik – IRCS Technical Reports Series, 1993 – repository.upenn.edu … that verbs permitting implicit objects tend as a group to select more strongly for that argument than obligatorily transitive verbs; the second experiment demonstrates that the tendency in practice to drop the object of verbs correlates with selectional preference strength; and a … Cited by 423 – Related articles – Library Search – All 11 versions

[PDF] On understanding and classifying web queries [PDF] from webir.org SM Beitzel – 2006 – webir.org Page 1. ON UNDERSTANDING AND CLASSIFYING WEB QUERIES BY STEVEN M. BEITZEL Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer Science in the Graduate College of the Illinois Institute of Technology … Cited by 14 – Related articles – View as HTML – All 18 versions

[PDF] Contextual Distinctiveness: A new lexical property computed from large corpora [PDF] from ed.ac.uk S McDonald… – Informatics Research Report EDI-INF-RR- …, 2001 – inf.ed.ac.uk … The efficacy of the relative entropy measure to capture distributional differences of this sort has been shown in related work by Resnik (1993), who used relative entropy to estimate the selectional preference strength of a verb for its arguments. … Cited by 4 – Related articles – View as HTML – All 10 versions

[PDF] Computational approaches to figurative language [PDF] from cam.ac.uk EV Shutova – 2011 – www-test.cl.cam.ac.uk … 77 8 Page 9. CONTENTS CONTENTS 4.4.2 Corpus search . . . . . 78 4.4.3 Selectional preference strength filter . . . . . 79 4.5 Evaluation and discussion . . . . . 81 … Related articles – View as HTML – All 5 versions

[PDF] Computational Metaphor Identification to Foster Critical Thinking and Creativity DISSERTATION [PDF] from ericbaumer.com E BAUMER – 2009 – ericbaumer.com … 3.1.3 Finding Characteristic Nouns…..87 3.1.4 Selectional Preference Learning…..90 3.1.5 Synset Clustering…..98 … Related articles – View as HTML – All 2 versions

[PDF] Selection and Information: A Class-Based Approach to Lexical Relationships (Ph. D. Dissertation) [PDF] from upenn.edu PS Resnik – 1993 – ircs.upenn.edu … that verbs permitting implicit objects tend as a group to select more strongly for that argument than obligatorily transitive verbs; the second experiment demonstrates that the tendency in practice to drop the object of verbs correlates with selectional preference strength; and a … Cited by 2 – Related articles – View as HTML – All 8 versions

[PDF] A Graph Approach to Measuring Text Distance [PDF] from toronto.edu VYC Tsang – 2008 – cs.toronto.edu Page 1. A GRAPH APPROACH TO MEASURING TEXT DISTANCE by Vivian Yuen-Chong Tsang A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy Graduate Department of Computer Science University of Toronto … Cited by 1 – Related articles – View as HTML – Library Search – All 9 versions

[PDF] A Non-Dual Approach to Measuring Semantic Distance [PDF] from psu.edu VYC Tsang – 2008 – Citeseer Page 1. A NON-DUAL APPROACH TO MEASURING SEMANTIC DISTANCE BY INTEGRATING ONTOLOGICAL AND DISTRIBUTIONAL INFORMATION WITHIN A NETWORK-FLOW FRAMEWORK by Vivian Yuen-Chong Tsang … Related articles – View as HTML

[DOC] Learning which Verbs Allow Object Omission: Verb Semantic Selectivity and the Implicit Object Construction [DOC] from upenn.edu TN Medina – 2007 – sas.upenn.edu … derived in accordance with competing demands of four factors: faithfulness to the underlying argument structure of the verb, economy of structure dependent on high semantic selectivity (using Resnik’s (1996) measure of verb Selectional Preference Strength), and requirements … Related articles – View as HTML – Library Search – All 2 versions

A graph-theoretic framework for semantic distance [PDF] from aclweb.org V Tsang… – Computational Linguistics, 2010 – MIT Press Page 1. A Graph-Theoretic Framework for Semantic Distance Vivian Tsang * University of Toronto Suzanne Stevenson ** University of Toronto Many NLP applications entail that texts are classified based on their semantic distance (how similar or different the texts are). … Cited by 7 – Related articles – All 19 versions

[PDF] Lexical Functions in Information Retrieval [PDF] from clin.nl K Bangha – clin.nl Page 1. 1 Lexical Functions in Information Retrieval Kornel Bangha kornel.robert.bangha@umontreal.ca University of Montréal – Canada Abstract Ordinary human interaction is often based on shared knowledge. This knowledge … Related articles – View as HTML – All 4 versions

[PDF] Cue-Based Dialogue Act Classification [PDF] from shef.ac.uk N Webb – 2010 – nlp.shef.ac.uk Page 1. Cue-Based Dialogue Act Classification Nick Webb Submitted for the Degree of Ph.D. Department of Computer Science University of Sheffield March, 2010 Page 2. i Typographic Conventions In all writing about language there is the danger of confusion over whether … Related articles – View as HTML

[PDF] Learning thematic role relations for lexical semantic nets [PDF] from psu.edu A Wagner – Neuphilologischen Fakultät der Universität Tübingen, …, 2005 – Citeseer Page 1. LEARNING THEMATIC ROLE RELATIONS FOR LEXICAL SEMANTIC NETS von ANDREAS WAGNER Philosophische Dissertation angenommen von der Neuphilologischen Fakultät der Universität Tübingen am 8. Dezember 2004 Tübingen 2005 Page 2. … Cited by 11 – Related articles – View as HTML – All 11 versions

Knowledge processing on an extended wordnet [PDF] from psu.edu S Harabagiu… – WordNet: An Electronic Lexical …, 1998 – books.google.com Page 401. Chapter 16 Knowledge Processing on an Extended WordNet Sanda M. Harabagiu and Dan I. Moldovan 16.1 VERY LARGE KNOWLEDGE BASES AND WORDNET 16.1.1 Desirable Features and What WordNet Can … Cited by 53 – Related articles – All 14 versions

[BOOK] Word sense selection in texts: an integrated model [PDF] from daviszhou.net OY Kwong – 2000 – daviszhou.net Page 1. Word Sense Selection in Texts: An Integrated Model Oi Yee Kwong Downing College University of Cambridge A Dissertation Submitted for the Degree of Doctor of Philosophy May 2000 Page 2. To my parents Page 3. Preface … Cited by 2 – Related articles – View as HTML – Library Search – BL Direct – All 9 versions

[PDF] Applying semantically enhanced web mining techniques for building a domain ontology [PDF] from unisa.it E D’Avanzo, A Elia… – dsc.unisa.it Page 1. Facoltà di Lettere e Filosofia Corso di Laurea Specialistica in Comunicazione d’Impresa e Pubblica Tesi di Laurea in Informatica per il Commercio Elettronico Applying semantically enhanced web mining techniques for building a domain ontology Supervisors Candidata … Related articles – View as HTML

[PDF] A clustering approach for the unsupervised recognition of nonliteral language [PDF] from psu.edu J Birke – 2005 – Citeseer Page 1. A CLUSTERING APPROACH FOR THE UNSUPERVISED RECOGNITION OF NONLITERAL LANGUAGE by Julia Birke BA, McGill University, 1996 THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF … Cited by 2 – Related articles – View as HTML – Library Search – All 25 versions

Application of generic sense classes in word sense disambiguation [PDF] from nus.edu US KOHOMBAN – 2007 – scholarbank.nus.edu Page 1. APPLICATION OF GENERIC SENSE CLASSES IN WORD SENSE DISAMBIGUATION UPALI SATHYAJITH KOHOMBAN NATIONAL UNIVERSITY OF SINGAPORE 2006 Page 2. APPLICATION OF GENERIC SENSE CLASSES IN WORD SENSE DISAMBIGUATION … Related articles – All 4 versions

[PDF] Word Sense Disambiguation: Facing Current Challenges [PDF] from psu.edu DM Iraolak – Citeseer Page 1. Euskal Herriko Unibertsitatea/ Universidad del Pais Vasco Lengoaia eta Sistema Informatikoak Saila Departamento de Lenguajes y Sistemas Informáticos Supervised Word Sense Disambiguation: Facing Current Challenges David Martinez Iraolak … Related articles – View as HTML – All 8 versions