Skipgrams, Deep Learning & Question Answering 2015


Notes:

The continuous Skip-gram algorithm is an efficient deep learning method for learning high-quality distributed vector representations that capture a large number of precise semantic word relationships.
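
A minimal sketch of the technique, using the gensim library (the toy corpus and all hyperparameter values below are illustrative assumptions, not taken from any of the papers listed here):

    # Train skip-gram word vectors with gensim (4.x API); sg=1 selects the
    # skip-gram objective, sg=0 would select CBOW. Toy values throughout.
    from gensim.models import Word2Vec

    sentences = [
        ["deep", "learning", "for", "question", "answering"],
        ["skip", "gram", "embeddings", "capture", "semantic", "relationships"],
    ]  # in practice: a large tokenized corpus

    model = Word2Vec(
        sentences,
        vector_size=100,  # embedding dimensionality
        window=5,         # context window size
        min_count=1,      # keep every word in this toy corpus
        sg=1,             # skip-gram objective
    )
    print(model.wv.most_similar("semantic", topn=3))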

Between 2014 and 2015, the number of papers using skip-grams for deep learning skyrocketed, from fewer than 100 in 2014 to nearly 400 in 2015.

Resources:

  • bioasq .. organizes challenges on biomedical semantic indexing and question answering
  • computational stylistics group .. suite of stylometric tools
  • ijcai.org .. international joint conference on artificial intelligence (melbourne, august 2017)
  • movieqa .. dataset which aims to evaluate automatic story comprehension

Wikipedia:

See also:

100 Best Deep Learning Videos | 100 Best GitHub: Deep Learning | 100 Best GitHub: N-gram | 100 Best GitHub: Ngram | 100 Best N-gram Videos | ConcGrams | Deep Learning & Dialog Systems | Deep Reasoning Systems | DeepDive | DNLP (Deep Natural Language Processing) | N-gram & Tag Clouds | N-gram Dialog Systems | N-gram Grammars | N-gram Transducers (NGT) | Word2vec Neural Network


Exploring models and data for image question answering M Ren, R Kiros, R Zemel – Advances in Neural Information …, 2015 – papers.nips.cc … bedding, dataset-specific skip-gram embedding and general-purpose skip-gram embedding model … Image question answering is a fairly new research topic, and the approach we … Explain images with multimodal recurrent neural networks,” NIPS Deep Learning Workshop, 2014. … Cited by 78 Related articles All 10 versions

When are tree structures necessary for deep learning of representations? J Li, MT Luong, D Jurafsky, E Hovy – arXiv preprint arXiv:1503.00185, 2015 – arxiv.org … Deep learning based methods learn low-dimensional, real-valued vectors for word tokens, mostly from … Word embeddings are initialized using skip-grams and kept fixed in the learning … We trained skip-gram embeddings on the Wikipedia+Gigaword dataset using the word2vec … Cited by 39 Related articles All 15 versions

Learning continuous word embedding with metadata for question retrieval in community question answering G Zhou, T He, J Zhao, P Hu – Proceedings of ACL, 2015 – aclweb.org … cQA archives valuable resources for various tasks like question-answering (Jeon et al., 2005; Xue et … Recently, a series of works applied deep learning techniques to learn high-quality … models for learning word representations, including the continuous skip-gram model and … Cited by 27 Related articles All 5 versions

Image question answering: A visual semantic embedding model and a new dataset M Ren, R Kiros, R Zemel – CoRR, abs/1505.02074, 2015 – pdfs.semanticscholar.org … Under review by the Deep Learning Workshop at the International Conference on Machine … Image Question Answering: A Visual Semantic Embedding Model and a New Dataset … embedding models: randomly initialized embedding, dataset-specific skip-gram embedding and … Cited by 25 Related articles

Large scale deep learning J Dean – Keynote GPU Technical Conference, 2015 – research.google.com … Page 13. What is Deep Learning? Working hypothesis: ventral stream “untangles” objects “cat” … nearby word Skipgram Text Model meeting with Putin … very interesting properties (especially the skip-gram model). Skip-gram model w/ 640 dimensions trained on 6B words of news text … Cited by 3 Related articles All 3 versions
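
The “very interesting properties” mentioned in Dean's keynote include analogy arithmetic on skip-gram vectors; a hedged sketch ("vectors.bin" is a placeholder for any pretrained word2vec-format vector file):

    # Analogy arithmetic: vec(king) - vec(man) + vec(woman) ≈ vec(queen).
    # "vectors.bin" is a placeholder path for pretrained word2vec vectors.
    from gensim.models import KeyedVectors

    wv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)
    print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
    # with well-trained vectors, ('queen', ...) is the top result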

Question/Answer Matching for CQA System via Combining Lexical and Sequential Information. Y Shen, W Rong, Z Sun, Y Ouyang, Z Xiong – AAAI, 2015 – pdfs.semanticscholar.org … When skip-gram networks are optimised via gradient ascent, the derivatives will modify the word … problems in CQA applications, inspired partially by the long thread of work on deep learning. … Rank learning for factoid question answering with linguistic and semantic constraints … Cited by 9 Related articles All 3 versions

Twitter sentiment analysis with deep convolutional neural networks A Severyn, A Moschitti – Proceedings of the 38th International ACM SIGIR …, 2015 – dl.acm.org … To train the embeddings, we use a skipgram model with window size 5 and filter words with … In the future we plan to apply a deep learning approach to other IR applications, e.g., learning to rank for Microblog retrieval and answer reranking for Question Answering. … Cited by 35 Related articles All 7 versions

Named Entity Recognition and Question Answering Using Word Vectors and Clustering ZAR Veluri – pdfs.semanticscholar.org … The Word2Vec model, proposed by Tomas Mikolov at Google, used Skip Gram and Continuous … Another scalable algorithm using deep learning for NER was proposed by Collobert and Weston … based NER system, which is then utilized to aid in Question answering by reducing … Related articles All 2 versions

Learning to rank short text pairs with convolutional deep neural networks A Severyn, A Moschitti – Proceedings of the 38th International ACM SIGIR …, 2015 – dl.acm.org … which has been widely used as a scoring model in information retrieval and question answering [13]. … of our word embeddings to be 50 to be on the line with the deep learning model of … To train the embeddings we use the skipgram model with window size 5 and filtering words … Cited by 75 Related articles All 7 versions

Deep graph kernels P Yanardag, SVN Vishwanathan – Proceedings of the 21th ACM …, 2015 – dl.acm.org … of sub-structures by using recently introduced language modeling and deep learning techniques. … we simply build the model by using CBOW or Skip-gram algorithms and … IAmA and AskReddit are two question/answer-based subreddits and TrollXChromosomes and atheism are … Cited by 16 Related articles All 3 versions

Image-Based Question Answering with Visual-Semantic Embedding M Ren – 2015 – cs.utoronto.ca … In recent years, we have evidenced major breakthroughs in deep learning, machine learning … natural language processing and question answering using word embeddings and recurrent neural networks. … 2.2.1 Skip-gram embedding model … Related articles All 3 versions

Solving verbal comprehension questions in IQ test by Knowledge-Powered word embedding H Wang, F Tian, B Gao, J Bian, TY Liu – arXiv preprint arXiv:1505.07909, 2015 – arxiv.org … The second component of our framework leverages deep learning technologies to learn distributed representations … Note that in the context of verbal question answering, we have some specific require … we learn a single-sense word embedding by using the skip-gram method in … Cited by 7 Related articles All 6 versions

Radical Embedding: Delving Deeper to Chinese Radicals X Shi, J Zhai, X Yang, Z Xie, C Liu – Volume 2: Short Papers – pdfs.semanticscholar.org … makes the following three-fold contributions: (1) we propose a new deep learning technique, called … many NLP tasks such as machine translation (Sutskever et al., 2014), question answering (Iyyer et … There are two ways of embedding: CBOW and skip-gram (Mikolov et al., 2013 … Cited by 2 All 7 versions

Improving Distributed Representation of Word Sense via WordNet Gloss Composition and Context Clustering T Chen, R Xu, Y He, X Wang – Volume 2: Short Papers – aclweb.org … (2014) proposed the MSSG model which extends the skip-gram model to … word vectors in more relevant NLP tasks such as word sense disambiguation and question answering. … In ICML 2014 Workshop on Knowledge-Powered Deep Learning for Text Mining (KPDLTM2014). … Related articles All 7 versions

Learning semantic hierarchies: A continuous vector space approach R Fu, J Guo, B Qin, W Che, H Wang, T Liu – IEEE Transactions on Audio, …, 2015 – dl.acm.org … More recently, [12] propose two log-linear models, namely the Skip-gram and CBOW model, to efficiently induce word embeddings. … Additionally, their experiment results have shown that the Skip-gram model performs best in identifying semantic relationship among words. … Cited by 2 Related articles All 8 versions

In defense of word embedding for generic text representation G Lev, B Klein, L Wolf – … Conference on Applications of Natural Language …, 2015 – Springer … In our experiments, we employ the Skip-gram architecture, which is considered preferable. … 0.644. Deep learning bigram + count [40]. … http://dl.acm.org/citation.cfm?id=944919.944966 6. Bordes, A., Chopra, S., Weston, J.: Question answering with subgraph … Cited by 2 Related articles All 6 versions

Learning multi-faceted representations of individuals from heterogeneous evidence using neural networks J Li, A Ritter, D Jurafsky – arXiv preprint arXiv:1510.05198, 2015 – arxiv.org … performances in varieties of NLP tasks e.g., sentiment analysis [64], question-answering [25], tagging … Deep learning methods are incorporated into relation learning models [62, 5] to learn deep … which model user embeddings by taking inspiration from skip-gram language models … Cited by 5 Related articles All 3 versions

Distributional Neural Networks for Automatic Resolution of Crossword Puzzles A Severyn, M Nicosia, G Barlacchi… – Volume 2: Short … – anthology.aclweb.org … 2.3 Reranking with Kernels We applied our reranking framework for question answering systems (Moschitti, 2008 … Considering recent deep learning models for matching sentences, our network is most similar to … We opt for a skipgram model with window size 5 and filtering words … Cited by 2 Related articles All 14 versions

Topics, Trends, and Resources in Natural Language Processing (NLP) M Bansal – Citeseer … NLP Examples Question Answering Page 4. … Distributional Semantics: PMI, NNs, CCA Compositional Semantics I: Vector-form, Deep Learning Compositional Semantics II: Logic-form, Semantic Parsing, Q&A … BROWN (Brown et al., 1992): SKIPGRAM (Mikolov et al., 2013): tree. … Related articles All 2 versions

Deep Learning for NLP K Vodrahalli – 2015 – vision.princeton.edu … word sense, semantic similarity, etc. … How does Deep Learning relate? – NLP typically has sequential learning tasks … Word similarity, word disambiguation – Analogy / Question answering Page 3. … Q = N*D + D*log2(V) … Skip-gram (we will go into more detail later) … Related articles All 2 versions
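
The expression in the slide is Mikolov et al.'s per-example training cost for CBOW with a hierarchical softmax, Q = N*D + D*log2(V); a quick worked evaluation (the values of N, D, and V below are illustrative only):

    # Q = N*D + D*log2(V): N context words, D-dimensional vectors,
    # and a V-word vocabulary under a hierarchical softmax.
    import math

    N, D, V = 8, 640, 1_000_000  # illustrative values
    Q = N * D + D * math.log2(V)
    print(f"Q = {Q:,.0f} operations per training example")  # ~17,876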

A Convolutional Architecture for Short Text Expansion and Classification P Wang, J Xu, B Xu, C Liu, H Hao – 2015 IEEE/WIC/ACM …, 2015 – ieeexplore.ieee.org … reviews, and short messages, plays an important role in understanding user intent, question answering and intelligent … Methods based on deep learning mainly obtain the phrase or sentence level representation [3], [4] by … [6] introduced the continuous Skip-gram model that is an … Related articles

Learning term embeddings for hypernymy identification Z Yu, H Wang, X Lin, M Wang – Proceedings of the 24th …, 2015 – pdfs.semanticscholar.org … encing, and as a result, hypernymy identification has many applications, including ontology construction [Suchanek et al., 2008], machine reading [Etzioni et al., 2006], question answering [McNamee et … [2013a] introduced a log-linear model, namely the Skip-gram, to efficiently … Cited by 4 Related articles All 5 versions

An Investigation of Neural Embeddings for Coreference Resolution V Godbole, W Liu, R Togneri – International Conference on Intelligent Text …, 2015 – Springer … that refer to the same entity in the real world, with applications in question answering and document … for tasks the network may not have been explicitly trained for [3]. Work in deep learning has led … Specifically we used the word2vec vectors that were trained using the skip-gram objective … Cited by 1 Related articles

Joint Embedding of Query and Ad by Leveraging Implicit Feedback S Lee, Y Hu – emnlp2015.org … Joint embedding has also been applied to question answering (Wang et al., 2014) and semantic understanding (Yang et al., 2014). … The former approach is known as continuous bag-of-words (CBOW) and the latter Skip-gram. … Cited by 1 Related articles All 10 versions

Integrating word embeddings and traditional NLP features to measure textual entailment and semantic relatedness of sentence pairs J Zhao, M Lan, ZY Niu, Y Lu – 2015 International Joint …, 2015 – ieeexplore.ieee.org … language models [2], [3], [4] over a large raw corpus using deep learning paradigms have … be entailed from other expressions should be omitted in the summary, question answering [13] where a … 4] introduced a continuous bag-of-words (CBOW) model and a skip-gram model for … Related articles

Classifying relations by ranking with convolutional neural networks CN Santos, B Xiang, B Zhou – arXiv preprint arXiv:1504.06580, 2015 – arxiv.org … intermediate step in many complex NLP applications such as question-answering and automatic … We perform pre-training using the skip-gram NN architecture (Mikolov et al., 2013 … Recently, deep learning (Bengio, 2009) has become an attractive area for multiple applications … Cited by 58 Related articles All 8 versions

Neural Self Talk: Image Understanding via Continuous Questioning and Answering Y Yang, Y Li, C Fermuller, Y Aloimonos – arXiv preprint arXiv:1512.03460, 2015 – arxiv.org … 2) we propose an image question generation module based on deep learning method. … 3: The presented architecture of question generation module (part A) and question answering module (part … also uses word embedding model from general-purpose skip-gram embedding [P … Cited by 5 Related articles All 4 versions

Predicting Best Answerers for New Questions: An Approach Leveraging Distributed Representations of Words in Community Question Answering H Dong, J Wang, H Lin, B Xu… – 2015 Ninth International …, 2015 – ieeexplore.ieee.org … A. Community Question Answering In recent years, CQA sites have built a very large questions and … It is difficult to tolerate Deep Learning if features have high dimension. … Google adopt Continuous Bag-Of-Words (CBOW) model and Skip-gram model to implement Word2Vec[22]. … Cited by 1 Related articles All 2 versions

Generic Text Representation for Vision and NLP based on Neural Word Embedding G Lev – 2015 – cs.tau.ac.il … In our experiments, we employ the Skip-gram architecture, which is considered preferable. … In [63], Yu et al. are using distributed representations that are based on deep learning for the task of identifying sentences that contain the answer to a given question. … Related articles

Medical synonym extraction with concept space models C Wang, L Cao, B Zhou – arXiv preprint arXiv:1506.00528, 2015 – arxiv.org … query expansion, text summarization [Barzilay and Elhadad, 1999], question answering [Ferrucci, 2012 … 4 different settings: HS+CBOW, HS+SkipGram, NEG+CBOW, and NEG+SkipGram. … of the resulting synonym knowledgebase, and exploring how deep learning models can … Cited by 7 Related articles All 7 versions

Incorporating Linguistic Knowledge for Learning Distributed Word Representations Y Wang, Z Liu, M Sun – PloS one, 2015 – journals.plos.org … thus are widely adopted in various applications such as information retrieval, text classification and question answering. … representation [6]: Continuous Bag-of-Words Model (CBOW) and Continuous Skip-gram Model. … Model initialization plays an important role in deep learning. … Related articles All 9 versions

A Hierarchical Knowledge Representation for Expert Finding on Social Media Y Li, W Li, S Li – Volume 2: Short Papers – aclweb.org … We apply the word2vec skip-gram model (Mikolov et al., 2013a; Mikolov et al., 2013b) to encode each word in our vocabulary with a … 2008. Identifying authoritative actors in question-answering forums: the case of yahoo … Is deep learning really necessary for word embeddings … Cited by 1 Related articles All 9 versions

A Knowledge Resources Based Neural Network for Learning Word and Relation Representations S Yuan, Y Xiang, M Li – … Safety and Security (CSS), 2015 IEEE …, 2015 – ieeexplore.ieee.org … Recently, with the rapid development of deep learning techniques, distributed representations of words … predicts the current word given its context and the skip-gram model predicts the … resource to implement AI tasks like knowledge reasoning and automatic question answering. … Related articles All 2 versions

Smarter Image Search By Generating a Natural Language Description of Images NI Puri, V Agarwal – 2015 – vipulaggarwal.in … Hence question answering systems such as AquaLog that do not rely simply on keyword matching provide more accurate results. … Deep Learning is an extremely active area of research. … [45] have presented improvements on the existing Skip-Gram model. … Related articles

Contextual Text Understanding in Distributional Semantic Space J Cheng, Z Wang, JR Wen, J Yan, Z Chen – Proceedings of the 24th ACM …, 2015 – dl.acm.org … features computed from our framework have been used as important features to shift a commercial question-answering system … Figure 2: Generative Word-Concept Skip-grams … in Figure 2. 4.3 Speeding up the Training In all variants of the extended neural Skip-gram models, com … Related articles All 3 versions

A Novel Hierarchical Convolutional Neural Network for Question Answering over Paragraphs S Zheng, H Bao, J Zhao, J Zhang… – 2015 IEEE/WIC/ACM …, 2015 – ieeexplore.ieee.org … Quiz bowl question answering Quiz bowl question answering is a popular quiz game played by … Recently, deep learning methods have been successfully applied to many NLP tasks and many … model we use to train word embedding is the hierarchical skipgram model setting … Related articles

Identifying synonymy between relational phrases using word embeddings NTH Nguyen, M Miwa, Y Tsuruoka, S Tojo – Journal of biomedical …, 2015 – Elsevier … we apply the continuous bag-of-words (CBOW) model, a deep-learning technique proposed by … the performance of high-level text-mining applications such as question answering and entailment … the Continuous Bag-Of-Words (CBOW) model and the continuous Skip-gram model. … Related articles All 8 versions

Sparse overcomplete word vector representations M Faruqui, Y Tsvetkov, D Yogatama, C Dyer… – arXiv preprint arXiv: …, 2015 – arxiv.org … Table 3 shows consistent improvements of sparsifying vectors (method A). The exceptions are on the SimLex task, where our sparse vectors are worse than the skip-gram initializer and on par with the multilingual initializer. … Cited by 16 Related articles All 5 versions

Analysis of word embeddings and sequence features for clinical information extraction L De Vine, M Kholghi, G Zuccon, L Sitbon, A Nguyen – 2015 – eprints.qut.edu.au … For example, Zhang and LeCun (2015) demonstrated that deep learning can be applied to text under … a sequence W = {w1,…,wt,…,wn} of training words, the objective of the Skip-gram model is … grams, bigrams, tri-grams and tetra-grams, but we also include skip-grams such as … Cited by 2 Related articles All 12 versions
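
A terminology note on this entry: “skip-grams” here also names token n-grams that may skip words, a lexical feature distinct from the skip-gram embedding model. A small sketch of the former using NLTK's skipgrams helper (the example sentence is invented):

    # k-skip-n-grams as lexical features: token pairs allowed to skip
    # up to k intervening words (not the skip-gram embedding model).
    from nltk.util import skipgrams

    tokens = ["patient", "was", "given", "aspirin", "daily"]
    print(list(skipgrams(tokens, n=2, k=1)))
    # [('patient', 'was'), ('patient', 'given'), ('was', 'given'), ...]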

Detecting semantically equivalent questions in online user forums D Bogdanova, C dos Santos, L Barbosa, B Zadrozny – CoNLL, 2015 – aclweb.org … 1 Introduction Question-answering (Q&A) community sites, such as Yahoo … using Machine Translation evaluation metrics (Madnani et al., 2012) and Deep Learning techniques (Socher … We perform pre-training using the skip-gram NN architecture (Mikolov et al., 2013) available … Cited by 6 Related articles All 8 versions

Computer Vision and Natural Language Processing: Recent Approaches in Multimedia and Robotics P Wiriyathammabhum – cs.umd.edu … Some complex tasks in NLP are machine translation, information extraction, dialog interface, question answering, parsing, summarization etc. … SP was first introduced as a natural language interface for a question answering database system [202]. … Related articles All 3 versions

Machine Learning Sentiment Prediction based on Hybrid Document Representation P Stalidis, M Giatsoglou, K Diamantaras… – arXiv preprint arXiv: …, 2015 – arxiv.org … Related tasks are question answering (recognizing opinion … words; (b) Skip-gram model (SG): predicts a window of words when a single word is known. It operates … As mentioned above, we employed both shallow and deep learning models for the classification stage. … Related articles All 3 versions

Fine-grained opinion mining with recurrent neural networks and word embeddings P Liu, S Joty, H Meng – Conference on Empirical Methods in Natural …, 2015 – cs.cmu.edu … benefit from fine-grained opinion mining including opinion summarization and opinion-oriented question answering. … An alternative approach of deep learning automatically learns latent features as distributed … word based on the context words, and (ii) a skip-gram model that … Cited by 21 Related articles All 12 versions

Segment-phrase table for semantic segmentation, visual entailment and paraphrasing H Izadinia, F Sadeghi, SK Divvala… – Proceedings of the …, 2015 – cv-foundation.org … important enabling factor is the availability of large-scale image databases [8, 29], combined with deep learning methods [24 … Both these tasks are important constituents in a wide range of NLP applications, including question answering, summarization, and machine translation … Cited by 4 Related articles All 14 versions

Automatic ranking of swear words using word embeddings and pseudo-relevance feedback LF D’Haro, RE Banchs – 2015 Asia-Pacific Signal and …, 2015 – ieeexplore.ieee.org … Finally, in both cases the embeddings were obtained using the skip-gram method. … “VectorSLU: A Continuous Word Vector Approach to Answer Selection in Community Question Answering Systems.” Proceedings of the 9th … “Applications of Deep Learning to Sentiment Analysis of … Related articles All 2 versions

A primer on neural network models for natural language processing Y Goldberg – arXiv preprint arXiv:1510.00726, 2015 – arxiv.org … 2013; Hermann & Blunsom, 2013), target-dependent sentiment classification (Dong, Wei, Tan, Tang, Zhou, & Xu, 2014) and question answering (Iyyer, Boyd … in Section 9. Networks with more than one hidden layer are said to be deep networks, hence the name deep learning. … Cited by 38 Related articles All 8 versions

Biomedical semantic indexing using dense word vectors in BioASQ A Kosmopoulos, I Androutsopoulos… – Journal Of Bio-Medical …, 2015 – aueb.gr … MeSH subject headings to biomedical articles is also one of the tasks of the annual Biomedical Semantic Indexing and Question Answering challenge (BioASQ, Task A … We used the ‘skip-gram’ model of word2vec, which can be efficiently applied to very large corpora [26, 30, 35]. … Cited by 6 Related articles All 5 versions

Movieqa: Understanding stories in movies through question-answering M Tapaswi, Y Zhu, R Stiefelhagen, A Torralba… – arXiv preprint arXiv: …, 2015 – arxiv.org … Fast progress in Deep Learning as well as a large amount of available labeled data … We investigate a number of intelligent baselines for question-answering ranging from very simple ones to … The general problem of multi-choice question answering can be formulated by a three … Cited by 19 Related articles All 8 versions

The Mechanism of Additive Composition R Tian, N Okazaki, K Inui – arXiv preprint arXiv:1511.08407, 2015 – arxiv.org … This constraint provides a rigorous and unified explanation for the additive compositionality of several recently proposed word vectors, including the Skip-Gram with Negative Sampling (SGNS) (Mikolov et al., 2013b), the GloVe model (Pennington et al., 2014), the Hellinger … Cited by 2 Related articles All 3 versions
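
Additive compositionality in practice means a phrase vector can be approximated by summing its word vectors; a minimal sketch (again, "vectors.bin" is a placeholder for any pretrained word2vec-format vector file):

    # Additive composition: sum two word vectors, then query the words
    # nearest to the composed vector. "vectors.bin" is a placeholder.
    import numpy as np
    from gensim.models import KeyedVectors

    wv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)
    phrase = np.add(wv["question"], wv["answering"])
    print(wv.similar_by_vector(phrase, topn=3))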

A Vector Space Approach for Aspect-Based Sentiment Analysis A Alghunaim – 2015 – groups.csail.mit.edu … Socher, 2014). To compute the vector representations of words, we use the skip-gram model of Word2Vec (Mikolov, 2014, Mikolov et al., 2013a,b,d). The Skip-gram model aims to find word … When training the skip-gram model we use the GoogleNews dataset (Mikolov, 2014) that … Cited by 3 Related articles All 12 versions

Beyond the Distributional Hypothesis M Faruqui – cs.cmu.edu … tasks: SEM-REL, SYN-REL. These benchmarks are described in appendix C. We used three different word vector models: SVD, RNN and Skip-Gram (SG) described in appendix B. Vector Training Data. For English, German and … All 2 versions

Welcome to IJCAI 2015! G Simari, RO Rodriguez, M Wooldridge, Q Yang – ijcai-15.org Page 1. 2 IJCAI 15 TECHNICAL PROGRAM Welcome to IJCAI 2015! We are delighted to welcome you to the Twenty-Fourth International Joint Conference on AI (IJCAI-15). The two years since IJCAI-13 in Beijing have been a tremendously exciting time for AI. … All 2 versions

Structural information aware deep semi-supervised recurrent neural network for sentiment analysis W Rong, B Peng, Y Ouyang, C Li, Z Xiong – Frontiers of Computer Science, 2015 – Springer … Keywords sentiment analysis, recurrent neural network, deep learning, machine learning 1 Introduction … Among them, Skip-gram architecture is a widely used mechanism for this task. Skip-gram tries to predict surrounding words based on the current word. … Cited by 3 Related articles All 4 versions

Analyzing Analytics R Bordawekar, B Blainey, R Puri – Synthesis Lectures on …, 2015 – morganclaypool.com … Using case studies from important application domains such as deep learning, text analytics, and business intelligence (BI), we demonstrate how … and performance of gaming systems as demonstrated by the recent win of IBM’s Watson intelligent question-answer system over … Cited by 3 Related articles All 9 versions

Unsupervised Learning and Modeling of Knowledge and Intent for Spoken Dialogue Systems YN Chen – target, 2015 – cs.cmu.edu … 12 2.5.1 Linear Word Embeddings . . . . . 13 2.5.1.1 Continuous Bag-of-Words (CBOW) Model . . . . . 14 2.5.1.2 Continuous Skip-Gram Model . . . . . 14 2.5.2 Dependency-Based Word Embeddings . . . . . 14 v Page 12. … Cited by 1 Related articles All 10 versions

Learning visually grounded meaning representations CH Silberer – 2015 – era.lib.ed.ac.uk … 79 5 Visually Grounded Semantic Representations with Autoencoders 81 5.1 Deep Learning in Artificial Neural Networks . . . . . 83 5.1.1 (Deep) NeuralNetworks . . . . . 83 5.1.2 Multimodal Deep Learning . . . . . 85 … Related articles All 2 versions