Autoencoder & Natural Language 2014


See also:

Autoencoder & Natural Language 2013 | Autoencoder & Natural Language 2015 | Best Convolutional Neural Network Videos | CNN (Convolutional Neural Network) & Natural Language 2014


Distributed representations of sentences and documents QV Le, T Mikolov – arXiv preprint arXiv:1405.4053, 2014 – arxiv.org … These properties make word vectors attractive for many natural language processing tasks such as language … In this direction, autoencoder-style models have also been used to model paragraphs … typically require parsing and is shown to work for sentence-level representations. … Cited by 33 Related articles All 10 versions

Deep fragment embeddings for bidirectional image sentence mapping A Karpathy, A Joulin, L Fei-Fei – Advances in Neural Information …, 2014 – papers.nips.cc … [20] described an autoencoder that learns … of the IEEE 98(8) (2010) 1485–1508 [9] Yang, Y., Teo, CL, Daumé III, H., Aloimonos, Y.: Corpus-guided sentence generation of … (2010) [33] Collobert, R., Weston, J.: A unified architecture for natural language processing: Deep neural … Cited by 18 Related articles All 5 versions

A convolutional neural network for modelling sentences N Kalchbrenner, E Grefenstette, P Blunsom – arXiv preprint arXiv: …, 2014 – arxiv.org … The network handles input sentences of varying length and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations. The network does not rely on a parse tree and is easily applicable to any language. … Cited by 17 Related articles All 11 versions

Learning multilingual word representations using a bag-of-words autoencoder S Lauly, A Boulanger, H Larochelle – arXiv preprint arXiv:1401.1803, 2014 – arxiv.org … While the literature on autoencoders usually refers to the post-nonlinearity activation vector as the hidden … preliminary results, our future work will investigate extensions of our bag-of-words multilingual autoencoder to bags … Natural Language Processing (Almost) from Scratch. … Cited by 5 Related articles All 5 versions

Conditional random field autoencoders for unsupervised structured prediction W Ammar, C Dyer, NA Smith – Advances in Neural Information …, 2014 – papers.nips.cc … to model structure in numerous problem domains, including natural language processing (NLP), computational … the average per-sentence inference runtime for the CRF autoencoder compared to … For CRF autoencoders, the average inference runtime grows slightly due to the … Cited by 4 Related articles All 6 versions

Context-aware learning for sentence-level sentiment analysis with posterior regularization B Yang, C Cardie – Proceedings of ACL, 2014 – acl2014.org … For example, the word “although” is often used to connect two polar clauses within a sentence, while the word “however” is often used at the beginning of the sentence to connect two polar sentences. It is important to distinguish these two types of discourse connectives. … Cited by 3 Related articles All 7 versions

Learning grounded meaning representations with autoencoders C Silberer, M Lapata – Proceedings of ACL, 2014 – acl2014.org … Autoencoders An autoencoder is an unsupervised neural network which is trained to reconstruct a given input from its latent representation (Bengio, 2009). It consists of an encoder fθ which maps an input vector x(i) to a latent … Cited by 3 Related articles All 8 versions
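
This entry's snippet gives the cleanest definition of the model family this page indexes, so a minimal sketch may help: an encoder fθ maps an input x to a lower-dimensional code and a decoder maps the code back, with training driven by reconstruction error. The shapes, sigmoid activation, and names below are illustrative assumptions, not the paper's implementation.

```python
# Minimal autoencoder sketch (NumPy): encoder f_theta maps x in R^m to a
# latent code in R^p, a decoder maps the code back to R^m, and training
# would minimize the reconstruction error computed below.
import numpy as np

rng = np.random.default_rng(0)
m, p = 100, 20                                    # input dim m, latent dim p (p < m)
W, b = rng.normal(0, 0.1, (p, m)), np.zeros(p)    # encoder parameters
V, c = rng.normal(0, 0.1, (m, p)), np.zeros(m)    # decoder parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode(x):        # f_theta: R^m -> R^p
    return sigmoid(W @ x + b)

def decode(h):        # g: R^p -> R^m
    return sigmoid(V @ h + c)

x = rng.random(m)                   # one input vector x(i)
x_hat = decode(encode(x))           # reconstruction of x
loss = np.mean((x - x_hat) ** 2)    # reconstruction error to minimize
```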

Bilingually-constrained phrase embeddings for machine translation J Zhang, S Liu, M Li, M Zhou… – Proceedings of the 52nd …, 2014 – nlpr-web.ia.ac.cn … Figure 4: An illustration of the bilingually-constrained recursive auto-encoders. … word embedding matrix L for two languages (Section 3.1.1); θrec: recursive auto-encoder parameter matrices W … The bilingual training data from LDC contains 0.96M sentence pairs and 1.1M entity … Cited by 4 Related articles All 10 versions

The CMU submission for the shared task on language identification in code-switched data CC Lin, W Ammar, L Levin, C Dyer – Proceedings of the First Workshop …, 2014 – aclweb.org … the CRF autoencoder model, we use the supervisedly-trained CRF parameters to initialize the CRF autoencoder models. … Conditional random field autoencoders for unsupervised structured prediction … In Natural Language Processing and Information Systems, pages 412–416 … Cited by 3 Related articles All 2 versions

Grounded compositional semantics for finding and describing images with sentences R Socher, A Karpathy, QV Le… – Transactions of the …, 2014 – tacl2013.cs.columbia.edu … We leave this to future work. With linear kernels, kCCA does well for image search but is worse for sentence self similarity and describing images with sentences close-by in embedding space. All other models are trained by replacing the DT-RNN function in Eq. 5. … Cited by 33 Related articles All 16 versions

Convolutional neural networks for sentence classification Y Kim – arXiv preprint arXiv:1408.5882, 2014 – arxiv.org … Subj: Subjectivity dataset where the task is to classify a sentence as being subjective or … CCAE: Combinatory Categorial Autoencoders with combinatory categorial grammar operators (Hermann and Blunsom, 2013). … 2011. Natural Language Processing (Almost) from Scratch. … Cited by 3 Related articles All 8 versions

A recursive recurrent neural network for statistical machine translation S Liu, N Yang, M Li, M Zhou – Proceedings of ACL, 2014 – ling.uni-potsdam.de … Applying DNN to natural language processing (NLP), representation or embedding of words is usually learnt first. … (2013) use recursive auto-encoders to make … Given the representations of the smaller phrase pairs, the recursive auto-encoder can generate the representation of the … Cited by 5 Related articles All 8 versions

Learning phrase representations using rnn encoder-decoder for statistical machine translation K Cho, B van Merrienboer, C Gulcehre… – arXiv preprint arXiv: …, 2014 – arxiv.org … networks can be successfully used in a number of tasks in natural language processing (NLP). … the translation probabilities of matching phrases in the source and target sentences. These … has been interest in training neural networks to score the translated sentence (or phrase … Cited by 24 Related articles All 10 versions

Improving image-sentence embeddings using large weakly annotated photo collections Y Gong, L Wang, M Hodosh, J Hockenmaier… – Computer Vision–ECCV …, 2014 – Springer … natural language sentences is an ambitious goal at the intersection of computer vision and natural language processing. … It is inspired by stacked denoising autoencoders [40, 41] and the recent work on … on X̂ and the clean sentences Y to obtain the image-sentence embedding. … Cited by 7 Related articles All 3 versions

An autoencoder with bilingual sparse features for improved statistical machine translation B Zhao, YC Tam, J Zheng – Acoustics, Speech and Signal …, 2014 – ieeexplore.ieee.org … These features derived from the autoencoder are like the dense features, and are added to the features for each rule. … 4.3. Related work There has been some related work on using autoencoders or DNNs (deep neural networks) for natural language processing; however, … Cited by 1 Related articles All 6 versions

An Autoencoder Approach to Learning Bilingual Word Representations S Lauly, H Larochelle, M Khapra… – Advances in Neural …, 2014 – papers.nips.cc … The accuracy of Natural Language Processing (NLP) tools for a given language depends heavily on … classification results obtained by using the embeddings produced by our two autoencoders. … Instead of an autoencoder paradigm, they propose a margin-based objective that … Cited by 1 Related articles All 4 versions

Deep convolutional neural networks for sentiment analysis of short texts CN dos Santos, M Gatti – … of the 25th International Conference on …, 2014 – aclweb.org … In this example, we have a negative sentence (left) and its negation (right). … 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161. … Cited by 3 Related articles All 3 versions

Learning sentiment-specific word embedding for twitter sentiment classification D Tang, F Wei, N Yang, M Zhou… – Proceedings of the …, 2014 – anthology.aclweb.org … recursively. Hermann et al. (2013) present Combinatory Categorial Autoencoders to learn the compositionality of sentences, which marries the Combinatory Categorial Grammar with the Recursive Autoencoder. The … Cited by 11 Related articles All 7 versions

Deep learning for answer sentence selection L Yu, KM Hermann, P Blunsom, S Pulman – arXiv preprint arXiv: …, 2014 – arxiv.org … These properties have caused distributed representations to become increasingly popular in natural language processing. … theory [4], using parse trees in conjunction with recursive autoencoders [17, 9 … The answer sentence selection dataset contains a set of factoid questions … Cited by 1 Related articles All 2 versions

Mind the gap: Machine translation by minimizing the semantic gap in embedding space J Zhang, S Liu, M Li, M Zhou… – Association for the …, 2014 – nlpr-web.ia.ac.cn … The same auto-encoder is re-used until the vector of the whole phrase is generated. … Accordingly, we adopt our proposed Bilingually-constrained Recursive Auto-encoders (Zhang et al. … in the training data, we can perform forced decoding for the source sentence to find the … Cited by 1 Related articles All 5 versions

Dependency-based word embeddings O Levy, Y Goldberg – Proceedings of the 52nd Annual Meeting of the …, 2014 – aclweb.org … Bottom: the contexts extracted for each word in the sentence. … 2011. Natural language processing (almost) from scratch. The Journal of Machine Learning Research, 12:2493–2537. … 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. … Cited by 13 Related articles All 7 versions

Learning topic representation for smt with neural networks L Cui, D Zhang, S Liu, Q Chen, M Li, M Zhou… – Proceedings of the 52nd …, 2014 – aclweb.org … The representation learned by auto-encoders tends to be influenced by the function words, and is therefore not robust. … (2008) proposed the Denoising Auto-Encoder (DAE), which … In our task, for each sentence, we treat the retrieved N relevant documents as a single large document … Cited by 1 Related articles All 7 versions

Resolving lexical ambiguity in tensor regression models of meaning D Kartsaklis, N Kalchbrenner, M Sadrzadeh – arXiv preprint arXiv: …, 2014 – arxiv.org … 2013. Prior disambiguation of word tensors for constructing sentence vectors. … 2011. Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. … In Conference on Empirical Methods in Natural Language Processing 2012. Cited by 3 Related articles All 12 versions

Data Selection via Semi-supervised Recursive Autoencoders for SMT Domain Adaptation Y Lu, DF Wong, LS Chao, L Wang – Machine Translation, 2014 – Springer … Natural Language Processing & Portuguese-Chinese Machine Translation Laboratory, Department of Computer and Information Science … Data Selection via Semi-supervised Recursive Autoencoders … In order to measure the domain relevance of a sentence with this model, we … Related articles All 3 versions

[BOOK] Natural Language Processing and Chinese Computing: Third CCF Conference, NLPCC 2014, Shenzhen, China, December 5-9, 2014. Proceedings C Zong, JY Nie, D Zhao, Y Feng – 2014 – books.google.com … 2014, the Third International Conference on Natural Language Processing and Chinese … Translation and Multi-Lingual Information Access Sentence-Length Informed … Dong Li Cross-Lingual Sentiment Classification Based on Denoising Autoencoder … All 2 versions

Generating Sentences from Semantic Vector Space Representations M Iyyer, J Boyd-Graber, H Daumé III – cs.umd.edu … 2 Unfolding Recursive Autoencoders The unfolding recursive autoencoder was first introduced in Socher et al. [20] for a paraphrase detection task. … In Proceedings of Empirical Methods in Natural Language Processing. [13] Le, QV and Mikolov, T. (2014). … Related articles All 2 versions

Deep Learning for Natural Language Processing and Machine Translation K Duh – 2014 – cl.naist.jp Deep Learning for Natural Language Processing and Machine Translation Kevin Duh Nara Institute of Science and Technology, Japan 2014/11/04 What is Deep Learning? A family of methods that uses deep … Related articles All 2 versions

Unsupervised Domain Adaptation with Feature Embeddings Y Yang, J Eisenstein – arXiv preprint arXiv:1412.4385, 2014 – arxiv.org … has an unlabeled training set of 100,000 sentences, along with development and test sets of about 1000 labeled sentences each. … In Proceedings of Empirical Methods for Natural Language Processing (EMNLP), pp. … Marginalized denoising autoencoders for domain adaptation. … Related articles All 3 versions

Bridging the Language Gap: Learning Distributed Semantics for Cross-Lingual Sentiment Classification G Zhou, T He, J Zhao – Natural Language Processing and Chinese …, 2014 – Springer … Sentiment classification has gained wide interest in the natural language processing (NLP) community. … The weights of each autoencoder are tied (see Figure 1). We employ denoising stacked autoencoders (DAEs) for pre-training the sentences in each language. … Related articles All 5 versions

A Study on Recursive Neural Network Based Sentiment Classification of Sina Weibo C Fu, B Xue, Z Shaobin – Trust, Security and Privacy in …, 2014 – ieeexplore.ieee.org … SEMI-SUPERVISED RECURSIVE AUTOENCODERS FOR SENTIMENT CLASSIFICATION … CONCLUSION AND FUTURE WORK We use the recursive autoencoder algorithm and Word2vec to analyze sentiment … [7] Mirowski P, Ranzato M, LeCun Y. Dynamic auto-encoders for semantic … Related articles

DRWS: A Model for Learning Distributed Representations for Words and Sentences C Yan, F Zhang – PRICAI 2014: Trends in Artificial Intelligence, 2014 – Springer … to a complexity of O(n log n). [18] introduced Recursive Autoencoder (RAE), which is … of data set semantically, and the cluster may be a better choice for each sentence. … 1022 (2003) 4. Collobert, R., Weston, J.: A unified architecture for natural language processing: Deep neural … Related articles

Investigating the Role of Prior Disambiguation in Deep-learning Compositional Models of Meaning J Cheng, D Kartsaklis, E Grefenstette – arXiv preprint arXiv:1411.4116, 2014 – arxiv.org … A recursive auto-encoder (RAE) [16, 6] learns to reconstruct the input, encoded via a … Prior disambiguation of word tensors for constructing sentence vectors. In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (EMNLP), Seattle, USA … Related articles All 3 versions

A Sentence Similarity Method Based on Chunking and Information Content D Ștefănescu, R Banjade, V Rus – Computational Linguistics and …, 2014 – Springer … Socher et al. [25] recursive autoencoder .768 .836 … In: Recent Advances in Natural Language Processing V, vol. 309, pp. … Socher, R., Huang, EH, Pennington, J., Manning, CD, Ng, A.: Dynamic pooling and unfolding recursive autoencoders for paraphrase detection. … Related articles All 2 versions

Vietnamese Sentence Similarity Based on Concepts HT Nguyen, PH Duong, VT Vo – Computer Information Systems and …, 2014 – Springer … texts and in [8] the authors use the unfolding recursive auto-encoder method for … Ng, AY, Manning, CD: Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase … the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational … Related articles All 2 versions

Study on Distributed Representation of Words with Sparse Neural Network Language Model H Yanagimoto – … Applied Informatics (IIAIAAI), 2014 IIAI 3rd …, 2014 – ieeexplore.ieee.org … Introducing sparseness of active states, a sparse autoencoder can capture a latent structure regardless of the … on statistics in a corpus, like occurrence frequency and co-occurrence frequency in a sentence, the SNNLM has the ability to improve natural language processing. … Related articles All 2 versions

Cross-Lingual Sentiment Classification Based on Denoising Autoencoder H Zhou, L Chen, D Huang – Natural Language Processing and Chinese …, 2014 – Springer … Collobert et al. [19] applied deep learning to natural language processing (NLP), and proposed a multi-task learning system … 4.3 Effect of Destruction Fraction in Denoising Autoencoders Fig. … Related articles All 4 versions

Deep Recursive Neural Networks for Compositionality in Language O Irsoy, C Cardie – Advances in Neural Information Processing …, 2014 – papers.nips.cc … In Figure 3, we give an example sentence, “Roger Dodger is one of the best variations on … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161 … Related articles All 6 versions

Extraction of Salient Sentences from Labelled Documents M Denil, A Demiraj, N de Freitas – arXiv preprint arXiv:1412.6815, 2014 – arxiv.org … Hinton, GE, Krizhevsky, A, and Wang, SD. Transforming Auto-encoders. … Jeffrey, Huang, Eric H, Ng, Andrew Y, and Manning, Christopher D. Semi-Supervised Recursive Autoencoders for Predicting … In Conference on Empirical Methods in Natural Language Processing, 2012. … Related articles All 3 versions

A Deep Learning Model for Structured Outputs with High-order Interaction H Guo, X Zhu, MR Min – cs.toronto.edu … propose to integrate high-order hidden units, guided discriminative pretraining, and high-order auto-encoders for this … In our study, we use an auto-encoder with tied parameters for convenience. … vector to represent a word or a sentence in natural language processing (NLP … Related articles All 2 versions

TBCNN: A Tree-Based Convolutional Neural Network for Programming Language Processing L Mou, G Li, Z Jin, L Zhang, T Wang – arXiv preprint arXiv:1409.5718, 2014 – arxiv.org … an interesting example, “The dog the stick the fire burned beat bit the cat.” This sentence complies with all … Semi-supervised recursive autoencoders for predicting sentiment distributions. … In Proceedings of Conference on Empirical Methods in Natural Language Processing. Related articles All 3 versions

Sentence Level Paraphrase Recognition Based on Different Characteristics Combination M Zhang, H Zhang, D Wu, X Pan – … and Natural Language Processing …, 2014 – Springer … Sentence Level Paraphrase Recognition … 18–26 (2006) 8. Socher, R., et al.: Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase … In: Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large … Related articles All 3 versions

Toward automatic inference of causal structure in student essays P Hastings, S Hughes, A Britt, D Blaum… – Intelligent Tutoring …, 2014 – Springer … student essay sentences had an LSA cosine greater than 0.75 with some sentence from the … 3, 1137–1155 (2003) 2. Bird, S., Klein, E., Loper, E.: Natural Language processing with Python … J., Huang, E., Ng, A., Manning, C.: Semi-supervised recursive autoencoders for predicting … Related articles All 4 versions

Auto-encoder based bagging architecture for sentiment analysis W Rong, Y Nie, Y Ouyang, B Peng, Z Xiong – Journal of Visual Languages …, 2014 – Elsevier … equation (5): J = (1/N) Σ_(x,t) E(x, t; θ) + (λ/2)‖θ‖², where N is the number of training instances, (x, t) is a training (sentence, label) pair, and θ … In order to perform layer-wise pre-training, the AEBPA method decomposes the stacked auto-encoder into auto-encoders by taking … Related articles All 2 versions
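
Written out, the objective recovered from the garbled span above is the average per-example error plus an L2 penalty; θ (the parameters) and λ (the regularization weight) are inferred from standard notation, since the extraction dropped the Greek letters:

```latex
J(\theta) = \frac{1}{N} \sum_{(x,t)} E(x, t;\, \theta) + \frac{\lambda}{2} \lVert \theta \rVert^{2}
```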

Leveraging Monolingual Data for Crosslingual Compositional Word Representations H Soyer, P Stenetorp, A Aizawa – arXiv preprint arXiv:1412.6334, 2014 – arxiv.org … Another approach has been to use auto-encoders and bag of words representations of sentences that can easily be applied to jointly leverage both bilingual and … (2014) which train their auto-encoder model for … An autoencoder approach to learning bilingual word representations … All 3 versions

Feature Weight Tuning for Recursive Neural Networks J Li – arXiv preprint arXiv:1412.3714, 2014 – arxiv.org … Language Processing, A model of coherence based on distributed sentence representation. … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151 … Related articles All 2 versions

The STAVICTA Group Report for RepLab 2014 Reputation Dimensions Task A Rahimi, M Sahlgren, A Kerren… – CLEF 2014 Evaluation …, 2014 – ceur-ws.org … Socher-recursive-autoencoder 0.83 – … 21. Socher, Richard et al. “Semi-supervised recursive autoencoders for predicting sentiment distributions.” Proceedings of the Conference on Empirical Methods in Natural Language Processing 27 Jul. 2011: 151-161. … Related articles All 5 versions

Semantics, Modelling, and the Problem of Representation of Meaning–a Brief Survey of Recent Literature Y Gal – arXiv preprint arXiv:1402.7265, 2014 – arxiv.org … Concrete sentence spaces for compositional distributional models of meaning. … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161. … Related articles All 3 versions

Keyword extracting using auto-associative neural networks G Aquino, W Hasperué… – XX Congreso Argentino de …, 2014 – sedici.unlp.edu.ar … In autoencoders the output set is the same as the input set, so that these networks … are shown in Table 2, identifying the algorithm introduced in this work as AE, for autoencoder. … Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing, vol. … Related articles

A Deep Architecture for Semantic Parsing E Grefenstette, P Blunsom, N de Freitas… – arXiv preprint arXiv: …, 2014 – arxiv.org … class of a previous sentence, or the vector representation of a source sentence. … In Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (EMNLP), Seattle … Learning multilingual word representations using a bag-of-words autoencoder. … Related articles All 12 versions

Recursive Neural Network Paraphrase Identification for Example-based Dialog Retrieval L Nio, S Sakti, G Neubig, T Toda, S Nakamura – isw3.naist.jp … B. Recursive Autoencoder By using the RAE algorithm, we combine word representations in … weight (We) and bias (b) value are trained using recursive autoencoders [20 … Furthermore, we use natural language processing tools and Wordnet synsets provided by the NLTK toolkit … Related articles All 2 versions

CRF Autoencoder Models for Structured Prediction with Partial Supervision W Ammar – cs.cmu.edu … of CRF models because it makes no further independence assumptions; hence the name CRF autoencoder. … where the observation X is often assumed to be the target sentence), metadata of a … side information is one of the relative strengths of CRF autoencoders compared to … All 2 versions

An Exploration of Embeddings for Generalized Phrases W Yin, H Schütze – ACL 2014, 2014 – ling.uni-potsdam.de … Discriminative improvements to distributional sentence similarity … Semi-supervised recursive autoencoders for predicting sentiment distributions … In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural … Related articles All 8 versions

Ensemble of Generative and Discriminative Techniques for Sentiment Analysis of Movie Reviews G Mesnil, MA Ranzato, T Mikolov, Y Bengio – arXiv preprint arXiv: …, 2014 – arxiv.org … moreau by Sentence Vector version by john frankenheimer . … Richard, Pennington, Jeffrey, Huang, Eric, Ng, Andrew, and Manning, Christopher D. Semi-supervised recursive autoencoders for predicting … Conference on Empirical Methods in Natural Language Processing, 2011 … Related articles All 2 versions

Improving The Robustness Of Example-Based Dialog Retrieval Using Recursive Neural Network Paraphrase Identification L Nio, S Sakti, G Neubig, T Toda, S Nakamura – phontron.com … When calculating the recursive autoencoders, every child and non-terminal node in the binary … Using the recursive autoencoder, we can not only capture the word paraphrase similarity, but … details on the data can be found in [18]. We use natural language processing tools and … Related articles

A generic evaluation of a categorical compositional-distributional model of meaning An MSc thesis proposal J Zhang – 2014 – jiannanweb.com … His recursive autoencoder model has been applied to paraphrase detection, which also … In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 1394–1404 … Dynamic pooling and unfolding recursive autoencoders for paraphrase detection … Related articles

Multilingual models for compositional distributed semantics KM Hermann, P Blunsom – arXiv preprint arXiv:1404.4641, 2014 – arxiv.org … Distributed representations of words provide the basis for many state-of-the-art approaches to various problems in natural language processing today. … Figure 1: Model with parallel input sentences a and b. The model minimises the distance between the sentence level en … Cited by 14 Related articles All 7 versions

A Short Texts Matching Method Using Shallow Features and Deep Features L Kang, B Hu, X Wu, Q Chen, Y He – Natural Language Processing and …, 2014 – Springer … [11], which use the unfolding recursive auto-encoder and parsing … R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural Language Processing (Almost) from … EH, Pennington, J., Ng, AY: Dynamic pooling and unfolding recursive autoencoders for paraphrase … Related articles All 4 versions

Global Belief Recursive Neural Networks R Paulus, R Socher, CD Manning – Advances in Neural Information …, 2014 – papers.nips.cc … [20] unfold the same autoencoder multiple times which gives it more representational power with the same number of parameters. … Discriminative recurrent sparse auto-encoders. … Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection. In NIPS, 2011. … Related articles All 5 versions

RNN-based Derivation Structure Prediction for SMT F Zhai, J Zhang, Y Zhou, C Zong – aclweb.org … [Chinese example] lisuodangran, singapore reached the achievements cannot be taken for granted (a) the example test sentence and its … 2013. Recursive autoencoders for ITG-based translation. In Proceedings of the Conference on Empirical Methods in Natural Language Processing. … Related articles All 7 versions

Explicit Representation of Antonymy in Language Modeling G Zweig – research.microsoft.com … 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. … In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural … The Microsoft Research sentence completion challenge. … Related articles All 7 versions

Cross-modal retrieval with correspondence autoencoder F Feng, X Wang, R Li – Proceedings of the ACM International Conference …, 2014 – dl.acm.org … As illustrated in Figure 5, the autoencoder reconstructs not only the input itself but also the input from different modalities. … The third component can involve any one of the three correspondence autoencoders given above. … Each image is labeled with 5 sentences. … Cited by 2 Related articles All 2 versions

Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network M Denil, A Demiraj, N Kalchbrenner, P Blunsom… – arXiv preprint arXiv: …, 2014 – arxiv.org … Transforming Auto-encoders. In International Conference on Artificial Neural Networks, 2011. … In Conference on Empirical Methods in Natural Language Processing, 2012. … Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions. … Cited by 2 Related articles All 3 versions

Improving relation descriptor extraction with word embeddings and cluster features T Liu, M Li – Systems, Man and Cybernetics (SMC), 2014 IEEE …, 2014 – ieeexplore.ieee.org … of the Neural Network and x is the concatenation of word embeddings in a window of a sentence: … Semi-supervised recursive autoencoders for predicting sentiment distributions. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2011. … Related articles

Joint Opinion Relation Detection Using One-Class Deep Neural Network L Xu, K Liu, J Zhao – aclweb.org … is equivalent to assessing how well a candidate is reconstructed by the autoencoder. … Semi-supervised recursive autoencoders for predicting sentiment distributions … In Proceedings of the Conference on Empirical Methods in Natural Language Processing, EMNLP '11, pages 151 … Related articles All 4 versions

Unsupervised Learning of Word Semantic Embedding using the Deep Structured Semantic Model X Song, X He, J Gao, L Deng – research.microsoft.com … 3]. For example, Salakhutdinov and Hinton extended the semantic modeling using deep auto-encoders [2][3 … 3]. In semantic hashing and other deep neural network (DNN) based natural language processing models, usually … E.g., for the t-th word in a sentence, we form a training … Related articles All 6 versions

Distributed Word Representation Learning for Cross-Lingual Dependency Parsing M Xiao, Y Guo – CoNLL-2014, 2014 – anthology.aclweb.org … From the perspective of applying deep networks in natural language processing systems, there are a number of … Socher et al. (2011) applied recursive autoencoders to address sentence-level sentiment … For example, for a given sentence “I visited New York .”, we can produce … Related articles All 8 versions

Think Positive: Towards Twitter Sentiment Analysis from Scratch CN dos Santos – SemEval 2014, 2014 – anthology.aclweb.org … architecture that analyses text at multiple levels, from character-level to sentence-level … Semi-supervised recursive autoencoders for predicting sentiment distributions … In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 1201–1211 … Related articles All 8 versions

A neural reordering model for phrase-based translation P Li, Y Liu, M Sun, T Izuha… – Proceedings of …, 2014 – nlp.csai.tsinghua.edu.cn … vectors in the binary tree used by the corresponding recursive autoencoder, denoted as … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing, pages 151 … Cited by 2 Related articles All 10 versions

Synalp-Empathic: A Valence Shifting Hybrid System for Sentiment Analysis A Denis, S Cruz-Lara, N Bellalem… – 8th International …, 2014 – hal.archives-ouvertes.fr … labeled Ignoring forms in Table 2. Finally, since our approach is sentence-based we … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing, pages 151–161 … Related articles All 11 versions

Extractive summarization using continuous vector space models M Kågebäck, O Mogren, N Tahmasebi… – Proceedings of the …, 2014 – nlp3.cs.ntou.edu.tw … An unfolding recursive auto-encoder (RAE) is used to derive the phrase embedding on the … The second group of configurations is built upon recursive auto-encoders using CW vectors and is … The dataset is well suited for multi-document summarization (each sentence is con … Cited by 2 Related articles All 6 versions

Improving Citation Polarity Classification with Product Reviews C Jochim – aclweb.org … (2012) plus a linear SVM. mSDA takes the concept of denoising – introducing noise to make the autoencoder more robust – from Vincent et al. … 2012. Marginalized denoising autoencoders for domain adaptation. … 2011. Natural language processing (almost) from scratch. … Related articles All 5 versions

A Joint Segmentation and Classification Framework for Sentiment Analysis D Tang, F Wei, B Qin, L Dong, T Liu, M Zhou – icm.hit.edu.cn Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 477–487, October 25-29, 2014, Doha, Qatar. … (2011) investigate Stacked Denoising Autoencoders to learn … RAE: Recursive Autoencoder (Socher et al., 2011 … Cited by 1 Related articles All 9 versions

Semantic parsing via paraphrasing J Berant, P Liang – Proceedings of ACL, 2014 – nlp.stanford.edu Semantic Parsing via Paraphrasing Jonathan Berant Stanford University joberant@stanford.edu Percy Liang Stanford University pliang@cs.stanford.edu Abstract A central challenge in semantic parsing is handling the … Cited by 19 Related articles All 14 versions

Neural Network Language Models for Low Resource Languages A Gandhe, F Metze, I Lane – Fifteenth Annual Conference …, 2014 – mazsola.iit.uni-miskolc.hu … words, initial work, such as [9], used neural networks to label words within a sentence a part … [9] R. Miikkulainen, and MG Dyer, “Natural Language Processing with Modular … Y. Miao, F. Metze and A. Waibel, “Extracting deep bottleneck features using stacked auto-encoders”, Proc … Related articles All 7 versions

Inside-Outside Semantics: A Framework for Neural Models of Semantic Composition P Le, W Zuidema – dlworkshop.org … Auto-encoders (RAE) [5], which replace the traditional feedforward neural networks by auto-encoders. … We compared our IORNNs against two models: the Recursive Auto-encoder (RAE) replicated by [24], and the Combinatory Categorial Autoencoders (CCAE)-B … Related articles

Learning task-specific bilexical embeddings PS Madhyastha, X Carreras Pérez, A Quattoni – 2014 – upcommons.upc.edu … model assigns vector representations to each of the lexical tokens of the sentence, and then … Semi-supervised recursive autoencoders for predicting sentiment distributions … In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161 … Cited by 1 Related articles All 6 versions

Learning semantic representations of words and their compositionality L Rei, D Mladenic – 2014 – kt.ijs.si … a more common dimensionality reduction technique. 2.5 Natural Language Processing … sentence is determined by (1) the meanings of its words and (2) the rules that combine them. … Autoencoders achieved near state-of-the-art results in paraphrase detection [48]. … Related articles

Multimodal neural language models R Kiros, R Salakhutdinov… – Proceedings of the …, 2014 – machinelearning.wustl.edu … (2011) proposed using deep autoencoders to learn … We define a sentence to be correctly matched if the matching image to the sentence query is ranked in the top k_r images sorted by model perplexity. Retrieving sentences from image queries is performed equivalently. … Cited by 17 Related articles All 11 versions

Learning image embeddings using convolutional neural networks for improved multi-modal semantics D Kiela, L Bottou – … Methods in Natural Language Processing ( …, 2014 – emnlp2014.org Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 36 … Srivastava and Salakhutdinov, 2012; Feng et al., 2013), auto-encoders (Wu et al … They use a stacked auto-encoder to learn combined embeddings of textual and vi … Cited by 4 Related articles All 5 versions

Learning bilingual word representations by marginalizing alignments T Kočiský, KM Hermann, P Blunsom – arXiv preprint arXiv:1405.0947, 2014 – arxiv.org … 2008. A unified architecture for natural language processing: deep neural networks with multitask learning. … 2013. Learning multilingual word representations using a bag-of-words autoencoder. … Semi-supervised recursive autoencoders for predicting sentiment distributions. … Cited by 1 Related articles All 7 versions

Transduction Recursive Auto-Associative Memory: Learning Bilingual Compositional Distributed Vector Representations of Inversion Transduction Grammars K Addanki, D Wu – Syntax, Semantics and Structure in Statistical …, 2014 – aclweb.org … Christian Scheible and Hinrich Schütze. Cutting recursive autoencoder trees. … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 151–161. … Related articles All 4 versions

Evaluating neural word representations in tensor-based compositional settings D Milajevs, D Kartsaklis, M Sadrzadeh… – arXiv preprint arXiv: …, 2014 – arxiv.org … is limited to additive and multiplicative composition, compared against composition via a neural autoencoder. … Section 2 provides a concise introduction to distributional word representations in natural language processing. … and it is very rare to see the same sentence twice, any … Cited by 1 Related articles All 9 versions

Deep Learning X He, J Gao, L Deng – 2014 – research-srv.microsoft.com … Part I (by Li Deng): Background of deep learning, common and Natural Language Processing (NLP)-centric architectures … conditioned not only on the previous words in the sentence but also on images. The model … Multi-Modal Audio-Visual Deep Autoencoder … Related articles All 6 versions

An Information Theoretic Approach to Quantifying Text Interestingness K Rohanimanesh – cs.cmu.edu … SVD), and a deep learning approach using the recursive auto-encoders (RAE) framework … on Human Language Technology and Empirical Methods in Natural Language Processing, HLT ’05, pages … Semi-supervised recursive autoencoders for predicting sentiment distributions. … Related articles

Unsupervised Induction of Semantic Roles within a Reconstruction-Error Minimization Framework I Titov, E Khoddam – arXiv preprint arXiv:1412.2812, 2014 – arxiv.org … Figure 1: (a) An autoencoder from R^m to R^p (typically p < m). (b) Modeling roles within the … Extracting and composing robust features with denoising autoencoders. … In Proceedings of the Conference on Empirical Methods in Natural Language Processing, pages 1456–1466. … Related articles All 3 versions

Predicting Stock Market Trends by Recurrent Deep Neural Networks A Yoshihara, K Fujikawa, K Seki, K Uehara – PRICAI 2014: Trends in …, 2014 – Springer … the sentiment of a sentence by using a model extending the autoencoder, one of … J., Huang, EH, Ng, AY, Manning, CD: Semi-supervised recursive autoencoders for predicting … Proceedings of the Sixteenth Conference on Empirical Methods in Natural Language Processing, pp. … Related articles

Learning semantic representations using convolutional neural networks for web search Y Shen, X He, J Gao, L Deng, G Mesnil – Proceedings of the companion …, 2014 – dl.acm.org … 9] demonstrated that the semantic structures can be extracted via a semantic hashing approach using a deep auto-encoder. … semantic meaning of a sentence is often determined by a few key words in the sentence, thus, simply … Natural language processing (almost) from scratch … Cited by 10 Related articles All 9 versions

Sentiment Analysis and City Branding R Grandi, F Neri – New Trends in Databases and Information Systems, 2014 – Springer … polarity of words, but also on the syntactical tree of the sentence being analyzed. … 86 (July 2002) 4. Socher, R., et al.: Semi-supervised recursive autoencoders for predicting … Proceedings of EMNLP 2011 – the Conference on Empirical Methods in Natural Language Processing, pp … Cited by 1 Related articles All 5 versions

Linguistic regularities in sparse and explicit word representations O Levy, Y Goldberg – CoNLL-2014, 2014 – anthology.aclweb.org … For example, in the sentence a b c d e the contexts of the word c are a-2, b-1, d+1 and e+2 … Overall, we retained a corpus of about 1.5 billion tokens, in 77.5 million sentences. … A unified architecture for natural language processing: Deep neural networks with multitask learning. … Cited by 7 Related articles All 10 versions
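
The context scheme this snippet describes (each neighbor word tagged with its relative position) is concrete enough to sketch; the function name and window parameter below are illustrative, not from the paper's code:

```python
# Positional contexts as described above: for the sentence "a b c d e",
# the contexts of "c" with a window of 2 are a-2, b-1, d+1, e+2.
def positional_contexts(tokens, index, window=2):
    contexts = []
    for offset in range(-window, window + 1):
        if offset == 0:
            continue                      # skip the target word itself
        j = index + offset
        if 0 <= j < len(tokens):          # stay inside the sentence
            contexts.append(f"{tokens[j]}{offset:+d}")
    return contexts

print(positional_contexts("a b c d e".split(), 2))
# ['a-2', 'b-1', 'd+1', 'e+2']
```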

Continuous word embeddings for detecting local text reuses at the semantic level Q Zhang, J Kang, J Qian, X Huang – … of the 37th international ACM SIGIR …, 2014 – dl.acm.org … can observe that although only one word differs between the three sentences, sentence S3 should … considered as the local text reuses of sentence S1 and sentence S2. … word representations in capturing the semantic similarities in various natural language processing tasks, we … Cited by 1 Related articles All 2 versions

Regularized Structured Perceptron: A Case Study on Chinese Word Segmentation, POS Tagging and Parsing K Zhang, J Su, C Zhou – EACL 2014, 2014 – anthology.aclweb.org … during the training of an autoencoder, the model is called a denoising autoencoder (Vincent … E-NR S-DEG B-NN E-NN (24) for the input sentence in Equation … Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational … Related articles All 4 versions

Learning sense-specific word embeddings by exploiting bilingual resources J Guo, W Che, H Wang, T Liu – Proceedings of COLING, 2014 – ir.hit.edu.cn … All corpora are sentence-aligned. After cleaning and filtering the corpus, we obtain 918,681 pairs of sentences (21.7M words). … 2011. Natural language processing (almost) from scratch. … 2011. Dynamic pooling and unfolding recursive autoencoders for paraphrase detection. … Cited by 3 Related articles All 7 versions

Embedding Word Similarity with Neural Machine Translation F Hill, K Cho, S Jean, C Devin, Y Bengio – arXiv preprint arXiv:1412.6448, 2014 – arxiv.org … Rather, the objective of equalizing encoded sentence representations from distinct languages, as … An Autoencoder Approach to Learning Bilingual Word Representations … In Proceedings of the Empirical Methods in Natural Language Processing (EMNLP 2014), October 2014b … Related articles All 2 versions

Investigation of sentence structure in domain adaptation for sentiment classification A Gentile – 2014 – digital.lib.washington.edu … (2011) leverages these techniques, specifically, Stacked Denoising Autoencoders, using the Amazon product reviews dataset, as his predecessors … 3.4 Structure Counts Once the sentence structures for all the sentences in all the reviews have been obtained … Related articles All 2 versions

Optimization of Neural Network Language Models for keyword search A Gandhe, F Metze, A Waibel… – Acoustics, Speech and …, 2014 – ieeexplore.ieee.org … Statistical language modeling is an important component of many natural language processing applications, including spelling … Sentences Tokens Vocabulary FullLP 78.6k 918k 6.2k … and A. Waibel, “Extracting deep bottleneck features using stacked auto-encoders”, Proc. … Related articles All 5 versions

Learning semantic hierarchies via word embeddings R Fu, J Guo, B Qin, W Che, H Wang… – Proceedings of the 52nd …, 2014 – 202.118.253.69 … of “canine.” As key sources of knowledge, semantic thesauri and ontologies can support many natural language processing applications … we learn word embeddings from a Chinese encyclopedia corpus named Baidubaike, which contains about 30 million sentences (about 780 … Cited by 3 Related articles All 14 versions

Active Semi-Supervised Learning Method with Hybrid Deep Belief Networks S Zhou, Q Chen, X Wang – PloS one, 2014 – dx.plos.org … learn text document representations based on semi-supervised auto-encoders that are … a novel machine learning framework based on recursive autoencoders for sentence … Computational Linguistics and 4th International Joint Conference on Natural Language Processing of the … Related articles All 7 versions

Building Large-Scale Twitter-Specific Sentiment Lexicon: A Representation Learning Approach D Tang, F Wei, B Qin, M Zhou, T Liu – anthology.aclweb.org … (2), where S is the occurrence of each sentence in the corpus and Σ_k pol_jk = 1. For binary classification … 2011. Semi-supervised recursive autoencoders for predicting sentiment distributions. In Conference on Empirical Methods in Natural Language Processing, pages 151–161. … Cited by 1 Related articles All 8 versions

Exploiting discourse information to identify paraphrases NX Bach, N Le Minh, A Shimazu – Expert Systems with Applications, 2014 – Elsevier … Considering the two following sentence pairs, the first sentence pair is a paraphrase although the two sentences only share a few words, while the second one is not a paraphrase even though the two sentences contain almost all the same words. • … Cited by 2 Related articles All 3 versions

Word translation prediction for morphologically rich languages with bilingual neural networks K Tran, A Bisazza, C Monz – staff.fnwi.uva.nl Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1676–1688 … for example, the gender of the subject in a source sentence constrains the … pair, accuracies are measured on a held-out set of 10K parallel sentences. … Cited by 1 Related articles All 9 versions

Named Entity Recognition: A Literature Survey R Sharnagat – 2014 – cfilt.iitb.ac.in … Named Entity Recognition (NER) is one of the major tasks in Natural Language Processing (NLP). … tion of capitalized words in ambiguous position (e.g. sentence beginning). … In the next section, we will discuss two major methods, the autoencoder and the denoising autoencoder, and Deep … Related articles

Bringing machine learning and compositional semantics together P Liang, C Potts – Annual Review of Linguistics (submitted), 2014 – stanford.edu … and y the syntactic parse of x; or x could be a Japanese sentence and y its … the paradigm of linear classification, which is the bread and butter of statistical natural language processing (Manning & … Semantic parsing is the task of mapping from sentences in X to logical forms in Y … Cited by 5 Related articles All 2 versions

Towards Syntax-aware Compositional Distributional Semantic Models L Ferrone, F Zanzotto – COLING 2014, 25th International Conference …, 2014 – art.torvergata.it … The STS task consists of determining the degree of similarity (ranging from 0 to 5) between two sentences. … RTE is instead the task of deciding whether a long text T entails a shorter text, typically a single sentence, called hypothesis H. It has been often seen as a classification … Related articles All 4 versions

Rule-based Emotion Detection on Social Media: Putting Tweets on Plutchik's Wheel E Tromp, M Pechenizkiy – arXiv preprint arXiv:1412.4682, 2014 – arxiv.org … Remarkably, the recursive auto-encoder performs worse than SVM and regression … Semi-supervised recursive autoencoders for predicting sentiment distributions. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, EMNLP'11 … Related articles All 3 versions

A Simple and Efficient Method To Generate Word Sense Representations LN Pina, R Johansson – arXiv preprint arXiv:1412.6045, 2014 – arxiv.org … Bengio et al., 2003) and improve the performance of many natural language processing applications such … training data usually consists of a large collection of sentences or documents … D, and Ng, Andrew Y. Dynamic pooling and unfolding recursive autoencoders for paraphrase … Related articles All 3 versions

A neural network for factoid question answering over paragraphs M Iyyer, J Boyd-Graber, L Claudino… – … Language Processing …, 2014 – cs.colorado.edu … Deep neural networks have seen widespread use in natural language processing tasks such as parsing, language modeling, and … entities—answering the question correctly requires an actual understanding of the sentence (Figure 1). Later sentences, however, progressively … Cited by 6 Related articles All 12 versions

Segment-based Fine-grained Emotion Detection for Chinese Text Z Wang – CLP 2014, 2014 – aclweb.org … the emotion label of each dependency subtree of a subjective sentence or short … Sentiment Composition, In Proceedings of Recent Advances in Natural Language Processing. … Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions, EMNLP 2011, … Related articles All 2 versions

Semantic Compositionality in Tree Kernels P Annesi, D Croce, R Basili – Proceedings of the 23rd ACM International …, 2014 – dl.acm.org … The accuracy achieved by the different systems is the percentage of sentences that are correctly assigned to the proper question class, and it is reported in Table 1 … The task is binary and corresponds to recognizing whether a given sentence pair is in a paraphrase relation or not. … Related articles

A latent semantic model with convolutional-pooling structure for information retrieval Y Shen, X He, J Gao, L Deng, G Mesnil – Proceedings of the 23rd ACM …, 2014 – dl.acm.org … and Hinton proposed the Semantic Hashing method based on a deep auto-encoder in [32][16]. … CNN) have been applied successfully in speech, image, and natural language processing [8][41 … which is then mapped to a fixed-length representation using recursive auto-encoders. … Cited by 8 Related articles All 8 versions

Product feature mining: Semantic clues versus syntactic constituents L Xu, K Liu, S Lai, J Zhao – Proceedings of ACL, 2014 – nlpr.ia.ac.cn … a statistic-based method in (Zhu et al., 2009) is used to detect noun phrases in R. Then, an Unfolding Recursive Autoencoder (Socher et … Network The architecture of the Convolutional Neural Network is shown in Figure 2. For a product feature candidate t in sentence s, every … Cited by 1 Related articles All 8 versions

A multiplicative model for learning distributed text-based attribute representations R Kiros, R Zemel, RR Salakhutdinov – Advances in Neural …, 2014 – papers.nips.cc … A unified architecture for natural language processing: Deep neural networks with multitask learning. … Semi-supervised recursive autoencoders for predicting sentiment distributions. … An autoencoder approach to learning bilingual word representations. NIPS, 2014. … Cited by 2 Related articles All 6 versions

Compositional operators in distributional semantics D Kartsaklis – Springer Science Reviews, 2014 – Springer … A sentence vector, then, could be compared with other sentence vectors, providing a way for assessing the semantic similarity between sentences as if they were words. The benefits of such a feature are obvious for many natural language processing tasks, such as paraphrase … Cited by 4 Related articles All 8 versions

Learning a Recurrent Visual Representation for Image Caption Generation X Chen, CL Zitnick – arXiv preprint arXiv:1411.5654, 2014 – arxiv.org … to u, this property would be lost and the network would act as a normal auto-encoder [34 … For any natural language processing task, pre-processing is crucial to the final performance. … Note that we reset the model after an End-of-Sentence (EOS) is encountered, so that prediction … Cited by 2 Related articles All 2 versions

Learning Word Representations from Relational Graphs D Bollegala, T Maehara, Y Yoshida… – arXiv preprint arXiv: …, 2014 – arxiv.org … Better representations of words can improve the performance in numerous natural language processing tasks that require … Next, for each generated word-pair, we retrieve the set of sentences in which the two … us assume that the two words u and v co-occur in a sentence s. We … Related articles All 4 versions

An artificial neural network approach to automatic speech processing SM Siniscalchi, T Svendsen, CH Lee – Neurocomputing, 2014 – Elsevier … The remaining 6877 SI-84 sentences were used as training material. … 1. Detection curves of manner of articulation for the sentence numbered 440c20t (RATES FELL ON SHORT TERM TREASURY BILLS) of the SI-84 data set [38]. … Cited by 3 Related articles All 4 versions

From machine learning to machine reasoning L Bottou – Machine learning, 2014 – Springer … instance of the dissociation module is equivalent to an auto-encoder (Fig … 11.) The parsing mechanism described for the natural language processing system provides an opportunity to … languages based on transformation operators: starting from elementary sentence forms, more … Cited by 23 Related articles All 14 versions

Sentiment expression conditioned by affective transitions and social forces M Sudhof, A Gómez Emilsson, AL Maas… – Proceedings of the 20th …, 2014 – dl.acm.org … In both settings, our models yield improvements of statistical and practical significance over ones that classify each text independently of its emotional or social context. Categories and Subject Descriptors I.2.7 [Natural Language Processing]: Discourse; Text analysis … Related articles All 6 versions

Learning Continuous Phrase Representations for Translation Modeling J Gao, X He, W Yih, L Deng – 131.107.65.14 … The parallel corpus includes 688K sentence pairs of parliamentary proceedings for training. The development set contains 2000 sentences, and the test set contains another 2000 sentences, all from the official WMT 2006 shared task. … Related articles All 9 versions

Structural information aware deep semi-supervised recurrent neural network for sentiment analysis W Rong, B Peng, Y Ouyang, C Li, Z Xiong – Frontiers of Computer Science – Springer … into a continuous vector space has a long history in the domain of natural language processing. … in a training corpus may help the model generalize to the sentence “Book two … is mainly because “Boston” and “HongKong”, “three” and “two” in the sentences have similarities in … Related articles

ReNew: A Semi-Supervised Framework for Generating Domain-Specific Lexicons and Sentiment Analysis Z Zhang, MP Singh – csc.ncsu.edu … MONEY#). After the normalization step, it splits each review r into sentences, and each sentence into sub-clauses (lines 6–10) provided transition cues occur. In effect, the algorithm converts each review into a set of segments. … Related articles All 7 versions

Political ideology detection using recursive neural networks M Iyyer, P Enns, J Boyd-Graber… – Association for …, 2014 – cs.colorado.edu … As phrases themselves merge into complete sentences, the underlying vector representation is trained to retain the sentence's whole meaning. … Each of these models has the same task: to predict sentence-level ideology labels for sentences in a test set. … Cited by 5 Related articles All 11 versions

Learning to Predict Distributions of Words Across Domains D Bollegala, D Weir, J Carroll – aclweb.org … Although the distributional hypothesis has been applied successfully in many natural language processing tasks, systems using distributional information … H, such as a user-review of a product, we split H into sentences, and lemmatize each word in a sentence using the … Related articles All 7 versions

Building Program Vector Representations for Deep Learning L Mou, G Li, Y Liu, H Peng, Z Jin, Y Xu… – arXiv preprint arXiv: …, 2014 – arxiv.org … significant breakthroughs in a variety of fields, such as natural language processing [6], [7 … in NLP, neural networks and pretraining approaches like RBMs and autoencoders work well … art approaches in NLP compositional semantics can only model sentences, paragraphs roughly … Cited by 2 Related articles All 3 versions

Sequential Labeling with online Deep Learning G Chen – arXiv preprint arXiv:1412.3397, 2014 – arxiv.org … is a discriminative probabilistic model for structured prediction [21], which has been widely used in natural language processing [4, 33 … (3) can be thought of as the deep autoencoder [14 … Each sentence is a 24-dimensional binary feature that describes lexical characteristics of the … Related articles All 2 versions

Probabilistic distributional semantics with latent variable models DO Séaghdha, A Korhonen – Computational Linguistics, 2014 – MIT Press … meaning is a fundamental component of the way language works: Sentences (and larger … In Natural Language Processing (NLP), the term distributional semantics encompasses a broad range of methods … as subject and object, with which it combines to form a basic sentence. … Related articles All 5 versions

Simlex-999: Evaluating semantic models with (genuine) similarity estimation F Hill, R Reichart, A Korhonen – arXiv preprint arXiv:1408.3456, 2014 – arxiv.org … Cited by 6 Related articles All 3 versions

Dependency vs. Constituent Based Syntactic N-Grams in Text Similarity Measures for Paraphrase Recognition H Calvo, A Segura-Olivares, A García – Computación y Sistemas, 2014 – scielo.org.mx … Hiram Calvo, Andrea Segura-Olivares, and Alejandro García, Centro de Investigación en Computación (CIC … Cited by 2 Related articles All 9 versions

Foundations and Trends in Signal Processing L Deng, D Yu – Signal Processing, 2014 – research.microsoft.com … over MFCCs in effective coding of bottleneck speech features using autoencoders in an … 235] applied a deep recurrent auto-encoder neural network … to the task of language identification [429], phone recognition [410], sequential labeling in natural language processing [428], and … Related articles All 8 versions

Joint Author Sentiment Topic Model S Mukherjee, G Basu, S Joshi – SIAM … We refer to the association between facets and topics as semantic dependencies. In the above review the first two sentences refer to the same topic ‘story’ with facets like ‘plot’ and ‘narration’. The author makes a topic switch in the next sentence using the discourse particle ‘however’. … Related articles

A Statistical Parsing Framework for Sentiment Classification L Dong, F Wei, S Liu, M Zhou, K Xu – arXiv preprint arXiv:1401.6330, 2014 – arxiv.org … 2011) used recursive autoencoders to learn … Given the sentence s and parameters θ, the log-linear model defines a conditional probability … log-partition function with respect to T(s). The log-linear model is a discriminative model, and it is widely used in natural language processing. … Cited by 1 Related articles All 2 versions
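
The formula the snippet elides is, in the standard log-linear formulation over the candidate parse trees T(s) (the paper's exact notation may differ):

    P(T \mid s; \theta) = \frac{\exp\left(\theta^{\top} f(T, s)\right)}{\sum_{T' \in T(s)} \exp\left(\theta^{\top} f(T', s)\right)}

where f(T, s) is a feature vector extracted from the tree-sentence pair; the log of the denominator is the log-partition function the snippet mentions.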

A tutorial survey of architectures, algorithms, and applications for deep learning L Deng – APSIPA Transactions on Signal and Information …, 2014 – Cambridge Univ Press … Specifically, in denoising autoencoders, the input vectors are first corrupted; e.g., randomizing a percentage of … are used as the inputs to the next level of the stacked denoising autoencoder. … (In the literature, DBN is sometimes used to mean DNN) 7. Deep auto-encoder: A DNN … Cited by 9 Related articles All 3 versions
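
A minimal numpy sketch of the corrupt-then-reconstruct training this survey describes, assuming masking noise, sigmoid units, and squared error; the layer sizes, learning rate, and function names are illustrative, not the survey's:

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def corrupt(x, drop_fraction=0.3):
        """Masking noise: randomly zero a fraction of the input entries."""
        return x * (rng.random(x.shape) > drop_fraction)

    def train_dae_layer(x, n_hidden, epochs=200, lr=0.5):
        """Train one denoising autoencoder layer to reconstruct the clean
        input x from its corrupted version; return encoder parameters."""
        n, n_visible = x.shape
        W1 = rng.normal(0.0, 0.1, (n_visible, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(0.0, 0.1, (n_hidden, n_visible)); b2 = np.zeros(n_visible)
        for _ in range(epochs):
            x_tilde = corrupt(x)                        # corrupt the inputs
            h = sigmoid(x_tilde @ W1 + b1)              # encode
            x_hat = sigmoid(h @ W2 + b2)                # decode
            d_out = (x_hat - x) * x_hat * (1 - x_hat)   # squared-error backprop
            d_hid = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * (h.T @ d_out) / n; b2 -= lr * d_out.mean(axis=0)
            W1 -= lr * (x_tilde.T @ d_hid) / n; b1 -= lr * d_hid.mean(axis=0)
        return W1, b1

    # Stacking, per the quoted sentence: the hidden codes computed on the
    # clean input are used as the inputs to the next level.
    x = (rng.random((100, 20)) > 0.5).astype(float)
    layer_input = x
    for n_hidden in (16, 8):
        W, b = train_dae_layer(layer_input, n_hidden)
        layer_input = sigmoid(layer_input @ W + b)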

A Generic Evaluation of a Categorical Compositional-distributional Model of Meaning J Zhang – 2014 – jiannanweb.com … After that, the sentence similarity problem is reduced to a graph isomorphism search (graph … The model works for sentences with variable sizes, and it works particularly well on … Their recursive auto-encoder model has been applied to paraphrase detection and prediction of … Related articles All 2 versions

Deep Learning via Stacked Sparse Autoencoders for Automated Voxel-Wise Brain Parcellation Based on Functional Connectivity C Gravelines – 2014 – ir.lib.uwo.ca … 12 Sermanet, Scoffier, Muller, & LeCun, 2008), and natural language processing (Collobert … RBMs are stacked to form DBNs, this deep approach can be extended to non-linear autoencoders to form a stacked autoencoder network (Bengio, Lamblin, Popovici, & … Related articles

A Location-Aware Social Media Monitoring System J Liu – 2014 – ruor.uottawa.ca … 4.3.4 Statistical Noises for Denoising Auto-encoders … Figure 3: A typical denoising auto-encoder … approaches or as the baseline models. 2.1 Natural Language Processing: Natural language processing (NLP) seeks automatic understanding and generation of … Related articles All 3 versions

Unsupervised neural controller for Reinforcement Learning action-selection: Learning to represent knowledge A Gkiokas, AI Cristea – Neural Network Applications in …, 2014 – ieeexplore.ieee.org … 8]. A variety of Deep models, Convolutional Neural Networks and auto-encoders have seen a … To put matters into perspective, a small sentence of 6 words could in fact … Parsing”, Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing: Vol. … Related articles

Automatic Speech Recognition H Soltau, G Saon, L Mangu, HK Kuo… – … Language Processing …, 2014 – Springer Imed Zitouni (ed.), Theory and Applications of Natural Language Processing: Natural Language Processing of Semitic Languages, 2014, 10.1007/978-3-642-45358-8_13, © Springer-Verlag Berlin Heidelberg 2014. 13. Automatic Speech Recognition. … Related articles All 3 versions

Minerva: A Scalable and Highly Efficient Training Platform for Deep Learning M Wang, T Xiao, J Li, J Zhang, C Hong, Z Zhang – 2014 – msr-waypoint.com … For ex- ample, many image classification tasks are delivered via variations of the deep convolutional net- work [4], large-vocabulary speech recognition tasks typically employ multi-layer networks [5, 6], and those in natural language processing use recurrent architectures [ … Cited by 1 Related articles All 6 versions

Semantic Composition and Decomposition: From Recognition to Generation PD Turney – arXiv preprint arXiv:1405.7908, 2014 – arxiv.org … Many other ideas have been proposed for extending distributional semantics to phrases and sentences. … Unsupervised recursive autoencoders were used to compose context vectors and then a supervised softmax classifier was used to compose a similarity matrix. … Related articles All 2 versions
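
The composition step summarized here follows the recursive autoencoder rule of Socher et al.; a minimal sketch of just that rule, with illustrative dimensions and initialization, omitting the reconstruction-error training and the supervised softmax stage:

    import numpy as np

    d = 50                                  # word-vector dimensionality (illustrative)
    rng = np.random.default_rng(0)
    W = rng.normal(0.0, 0.1, (d, 2 * d))    # composition matrix
    b = np.zeros(d)

    def compose(c1, c2):
        """RAE composition: parent p = tanh(W [c1; c2] + b), length-normalized."""
        p = np.tanh(W @ np.concatenate([c1, c2]) + b)
        return p / np.linalg.norm(p)

    # Bottom-up composition over the phrase ((very good) movie).
    very, good, movie = (rng.normal(size=d) for _ in range(3))
    phrase_vector = compose(compose(very, good), movie)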

Semantic frame identification with distributed word representations KM Hermann, D Das, J Weston… – Proceedings of …, 2014 – research.google.com … For example, consider the pair of sentences in Figure 1(a). COMMERCE BUY is a frame that can be evoked by morphological variants … disambiguation stage, and a stage that finds the various arguments that fulfill the frame’s semantic roles within the sentence, respectively. … Cited by 9 Related articles All 11 versions

[BOOK] Deep Learning L Deng, D Yu – 2014 – isoft.postech.ac.kr … recent results of applying deep learning to language modeling and natural language processing, where we … by considering each pair of layers as a de-noising autoencoder regularized by … Another alternative is to use contractive autoencoders for the same purpose by favoring … Cited by 17 Related articles All 8 versions

Supervised Speech Separation And Processing K Han – 2014 – cse.ohio-state.edu … 5.13 ASR results. “Original / GMM” and “Processed / GMM” denote the results for the GMM-HMM systems using original sentences and processed sentences, respectively. “Original / Hybrid” and … Related articles All 5 versions

Distributed Representations for Compositional Semantics KM Hermann – arXiv preprint arXiv:1411.3146, 2014 – arxiv.org … 4.4.1 Autoencoders … 5.2 Forward application in CCG and as an autoencoder rule … This thesis investigates the application of distributed representations to semantic models in natural language processing (NLP). NLP is the discipline concerned with the interpretation … Related articles All 3 versions

Domain Adaptation for Sentiment Classification in Light of Multiple Sources F Fang, K Dutta, A Datta – INFORMS Journal on Computing, 2014 – pubsonline.informs.org … In sentence-level classification, sentences are first classified as subjective or objective. Then subjective sentences are further classified into positive or negative (Liu 2010). … They used stacked denoising auto-encoders as the building blocks of the deep network and trained a … Related articles

Double LDA: A Sentiment Analysis Model Based on Topic Model X Chen, W Tang, H Xu, X Hu – Semantics, Knowledge and …, 2014 – ieeexplore.ieee.org … [10] uses some sentences and words as classification terms and puts forward a sentiment analysis model SAS based on seed words. Davidov et al. … Socher et al. [15] put forward a machine learning framework based on recursive autoencoders for sentence-level prediction … Related articles

Distributional Semantics For Robust Automatic Summarization JCK Cheung – 2014 – cs.mcgill.ca … heterogeneous source text sentences into a novel sentence, resulting in more informative and grammatical summary sentences than a previous sentence fusion approach. The success of this … the Speech and Natural Language Processing groups. …

Collaborative Deep Learning for Recommender Systems H Wang, N Wang, DY Yeung – arXiv preprint arXiv:1409.2944, 2014 – arxiv.org … We first present a Bayesian formulation of a deep learning model called stacked denoising autoencoder (SDAE) [26]. … Generalized denoising auto-encoders as generative models. In NIPS, pages 899–907, 2013. … Marginalized denoising autoencoders for domain adaptation. … Related articles All 2 versions

Bilingual and Cross-lingual Learning of Sequence Models with Bitext M Wang – 2014 – www-nlp.stanford.edu … This thesis presents four main contributions to the field of natural language processing. • We present two new methods that address the weight undertraining problem that … Figure 2.1 illustrates two tagged sentences in English and Chinese. … Since most of the words in a sentence … Related articles All 2 versions

A critical examination of deep learning approaches to automated speech recognition NGA LAYOUSS – 2014 – diva-portal.org … approaches to a wide range of classical challenging pattern recognition problems ranging from object identification in man-made or natural images [6, 7], natural language processing to ASR. … To explore different types of building blocks of a DBN such as Auto-Encoders and its … Related articles

Draft: Deep Learning in Neural Networks: An Overview J Schmidhuber – 2014 – people.idsia.ch … 5.7 1987: UL Through Autoencoder (AE) Hierarchies … Sec. 5.7 discusses a first hierarchical stack of coupled UL-based Autoencoders (AEs)—this concept resurfaced in the new millennium (Sec. 5.15). Sec. … Related articles All 2 versions

Data famine in big data era: machine learning algorithms for visual object recognition with limited training data Z Guo – 2014 – circle.ubc.ca … a deep structure of the stacked Hierarchical Dirichlet Process (HDP) auto-encoder, in order to … Also inspired by the Natural Language Processing (NLP) area where topic models are widely used for … Process (HDP), to model the high-level visual phrases and sentences from low … Related articles All 2 versions

An analysis of deep neural networks for texture classification LG Hafemann – 2014 – dspace.c3sl.ufpr.br … Dissertation presented as partial requisite to obtain the Master’s degree, M.Sc. program in Informatics, Universidade Federal do Paraná. Advisor: Prof. … Related articles All 2 versions

Graph-based approaches for semi-supervised and cross-domain sentiment analysis N Ponomareva – 2014 – wlv.openrepository.com … MU: reviews on MUsic; NB: Naive Bayes; NLP: Natural Language Processing; OR: Odds Ratio; PMI: Pointwise Mutual Information; PoS: Parts of Speech; PSP: Positive Sentence Percentage; PWP: Positive Word Percentage; RANK: graph RANKing … Related articles All 3 versions

KIT-Conferences PI Lichtblau – 2014 – isl.anthropomatik.kit.edu … The 6th International Joint Conference on Natural Language Processing (IJCNLP 2013), Nagoya, Japan … MT Error-driven Discriminative Word Lexicon using Sentence Structure Features, … Extracting Deep Bottleneck Features Using Stacked Auto-Encoders, Jonas Gehring, Yajie … All 2 versions

Studies on Machine Learning for Data Analytics in Business Application F FANG – 2014 – scholarbank.nus.sg … In sentence-level classification, sentences are first classified as subjective or objective. Then subjective sentences are further classified into positive or negative (Liu 2010). … They used Stacked Denoising Auto-encoders (SDA) as the building blocks of the deep network and … Related articles All 2 versions

Towards Real-Time Image Understanding with Convolutional Networks Y Bengio – 2014 – tel.archives-ouvertes.fr … Learning features in an unsupervised manner (i.e., without labels) can be achieved simply by using auto-encoders. An auto-encoder is a model that takes a vector input y, maps it into a hidden representation z (code) using an encoder which typically has the form: … Related articles All 9 versions
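
The encoder form truncated above is, in the standard auto-encoder formulation (an assumption here, since the thesis text is elided):

    z = s(W y + b), \qquad \hat{y} = s(W' z + b')

where s is an elementwise nonlinearity such as the sigmoid, W and W' are the encoder and decoder weights, and training minimizes a reconstruction loss between y and \hat{y}, e.g. squared error or cross-entropy.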

Parallel distributed processing at 25: Further explorations in the microstructure of cognition TT Rogers, JL McClelland – Cognitive science, 2014 – Wiley Online Library … It became apparent that, in many cases, the role assignments of nouns are strongly influenced by word meanings and by the plausibility of the resulting event scenarios that the objects to which the sentence referred might take on. For instance, in sentences like: … Cited by 2 Related articles All 11 versions