Autoencoder & Natural Language 2013


See also:

100 Best GitHub: Deep Learning | Autoencoder & Natural Language 2014 | Autoencoder & Natural Language 2015 | Neural Network & Dialog Systems


Cutting recursive autoencoder trees C Scheible, H Schütze – arXiv preprint arXiv:1301.2811, 2013 – arxiv.org … Abstract Deep Learning models enjoy considerable success in Natural Language Processing. … We present an analysis of the Semi-Supervised Recursive Autoencoder, a well-known model that produces structural representations of text. … Cited by 3 Related articles All 2 versions Cite Save

Natural Language Processing and Chinese Computing GZJ Li, DZY Feng – Springer … interest is in machine learning, with applications in natural language processing, cognitive science, and social media. http://pages.cs.wisc.edu/jerryzhu/ … Table of Contents Fundamentals on Language Computing Text Window Denoising Autoencoder: Building Deep … Cite Save

Text Window Denoising Autoencoder: Building Deep Architecture for Chinese Word Segmentation K Wu, Z Gao, C Peng, X Wen – Natural Language Processing and …, 2013 – Springer Abstract Deep learning is the new frontier of machine learning research, which has led to many recent breakthroughs in English natural language processing. However, there are inherent differences between Chinese and English, and little work has been done to apply … Related articles All 4 versions Cite Save

Deep Learning for Natural Language Processing T Du, VK Shanker – eecis.udel.edu … language process. This paper reviews the recent research on deep learning, its applications and recent development in natural language processing. 1 Introduction Deep learning … Hinton 2009). 2.1 Stacked auto-encoder One good … Cite Save More

Hebrew Paraphrase Recognition Using Deep Learning Architecture G Stanovsky, ABPM Elhadad – cs.bgu.ac.il … This auto-encoder was trained to minimize the reconstruction error. By recursively applying the autoencoder, a feature vector is obtained also for internal … A unified architecture for natural language processing: deep neural networks with multitask learning. … Related articles All 2 versions Cite Save More

Word classification for sentiment polarity estimation using neural network H Yanagimoto, M Shimada, A Yoshimura – Human Interface and the …, 2013 – Springer … These approaches are called neural network language models and were proposed by Bengio and Arisoy [7,8]. In natural language processing a document is represented as a discrete vector … Finally, Restricted Boltzmann Machine (RBM) [10] and Sparse Autoencoder [11] are described. … Related articles All 2 versions Cite Save

Document similarity estimation for sentiment analysis using neural network H Yanagimoto, M Shimada… – … and Information Science …, 2013 – ieeexplore.ieee.org … Contest. In natural language processing deep architecture neural networks are used to construct a language model. Since … filtering. Finally, Restricted Boltzmann Machine (RBM) [?] and Sparse Autoencoder [?] are described. A … Related articles Cite Save

Deep Learning & Convolutional Networks In Vision (part 2) Y LeCun – di.ens.fr … Predictive Sparse Decomposition (PSD): sparse auto-encoder. Predicting the optimal code with a trained encoder … [Rolfe & LeCun ICLR 2013] Discriminative Recurrent Sparse Auto-Encoder (DrSAE) … Related articles Cite Save More

Multilingual Deep Learning MM Khapra, B Ravindran, V Raykar, A Saha – sarathchandar.in … However, the ever increasing number of languages on the web today has made it important to accurately process natural language data in such less fortunate languages also. … We propose a novel variant of auto-encoder called predictive auto-encoder, that learns the … Related articles All 2 versions Cite Save More

Learning entity representation for entity disambiguation Z He, S Liu, M Li, M Zhou, L Zhang, H Wang – Proc. ACL2013, 2013 – oldsite.aclweb.org … Entity linking or disambiguation has recently received much attention in the natural language processing community (Bunescu and Pasca, 2006; Han et al., 2011 … Assume the input is a vector x; an auto-encoder consists of an encoding process h(x) and a decoding process g(h(x)) … Cited by 3 Related articles All 7 versions Cite Save More
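The encode/decode notation quoted in the He et al. snippet above — an input vector x, an encoding process h(x), a decoding process g(h(x)), and a reconstruction objective — can be sketched in a few lines of NumPy. The layer sizes, sigmoid encoder, learning rate, and training loop below are illustrative assumptions for the sketch, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny autoencoder in the snippet's notation: encoder h(x), decoder g(c).
# Dimensions are arbitrary choices for illustration.
n_in, n_hid = 8, 3
W1 = rng.normal(scale=0.1, size=(n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.1, size=(n_in, n_hid)); b2 = np.zeros(n_in)

def h(x):        # encoding process h(x)
    return sigmoid(W1 @ x + b1)

def g(code):     # decoding process g(h(x))
    return W2 @ code + b2

def reconstruction_error(x):
    return float(np.sum((g(h(x)) - x) ** 2))

# A few steps of plain gradient descent on ||g(h(x)) - x||^2 for one input.
x = rng.normal(size=n_in)
lr = 0.1
for _ in range(200):
    c = h(x)
    r = g(c)
    d_r = 2 * (r - x)                    # dE/dr
    d_c = W2.T @ d_r * c * (1 - c)       # backprop through decoder and sigmoid
    W2 -= lr * np.outer(d_r, c); b2 -= lr * d_r
    W1 -= lr * np.outer(d_c, x); b1 -= lr * d_c

print(reconstruction_error(x))  # small after training on this single input
```

Overfitting a single vector like this is only a sanity check of the objective; the papers in this list train the same structure over corpora of word or phrase vectors.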

Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure O Irsoy, C Cardie – arXiv preprint arXiv:1312.0493, 2013 – arxiv.org … This description fits many natural language processing tasks, when a single sentence is viewed as a sequence of tokens. … Note that this definition is structurally similar to the unfolding recursive autoencoder [10]. However the goals of the two architectures are different. … Related articles All 4 versions Cite Save

Convolution Neural Network for Relation Extraction CY Liu, WB Sun, WH Chao, WX Che – Advanced Data Mining and …, 2013 – Springer … But in terms of the wide variety of natural language forms in which a given relation may be expressed in multiple ways, including syntactic, morpho … As for composite applications, [8] trained a deep autoencoder architecture for a sentiment classifier, and surpassed the state-of-the-art. … Related articles Cite Save

On autoencoder scoring H Kamyshanska, R Memisevic – Proceedings of the …, 2013 – machinelearning.wustl.edu On autoencoder scoring Hanna Kamyshanska kamyshanska@fias.uni-frankfurt.de … Recent work has shown how certain autoencoders can assign an unnormalized “score” to data which measures how well the autoencoder can represent the data. … Cited by 3 Related articles All 8 versions Cite Save

Answer Extraction by Recursive Parse Tree Descent C Malon, B Bai – ACL 2013, 2013 – aclweb.org … If the autoencoder is trained only on reconstruction error and not subtree recognition, the F1 score for token classification drops from. … capable of using learned representations of words and syntax in a parse tree structure to answer free form questions about natural language text … Related articles All 5 versions Cite Save More

Recursive Autoencoders for ITG-based Translation P Li, Y Liu, M Sun – nlp.csai.tsinghua.edu.cn … Fortunately, the rapid development of intersecting deep learning with natural language processing (Bengio et al., 2003; Collobert and Weston, 2008; Collobert et al., 2011; Glorot et al., 2011; Bordes et al., 2011; Socher et … Figure 2: A recursive autoencoder for multi-word strings … Cited by 2 Related articles All 9 versions Cite Save More

Stochastic Ratio Matching of RBMs for Sparse High-Dimensional Inputs Y Dauphin, Y Bengio – Advances in Neural Information Processing …, 2013 – papers.nips.cc … recognition, an important set of application areas involve high-dimensional sparse input vectors, for example in some Natural Language Processing tasks … Sampling algorithm (Dauphin et al., 2011) that was proposed to handle that problem in the case of auto-encoder variants. … Related articles All 4 versions Cite Save

The role of syntax in vector space models of compositional semantics KM Hermann, P Blunsom – … of the 51st Annual Meeting of …, 2013 – newdesign.aclweb.org … the meaning of an utterance arises from the meaning of its parts is a fundamental task of Natural Language Processing. … of recursive models, the Combinatory Categorial Autoencoders (CCAE), which marry a semantic process provided by a recursive autoencoder with the … Cited by 18 Related articles All 9 versions Cite Save More

Incremental Feature Construction for Deep Learning Using Sparse Auto-Encoder M Udommitrak, B Kijsirikul – ijoee.org … [12] G. Rostislav, and YL Cun, “Saturating auto-encoder,” arXiv preprint arXiv:1301.3577, 2013. … of Computer Engineering, Chulalongkorn University. His research interests include artificial intelligence, machine learning and natural language processing. … Related articles Cite Save More

Linear Autoencoder Networks for Structured Data A Sperduti – knoesis.wright.edu … where it is natural to represent data in a structured form; just to name a few, Chemistry, Bioinformatics, and Natural Language Processing … We show that linear autoencoder networks are actually closely linked to PCA not only in the case of vectorial input (see [Bourlard and Kamp … Related articles Cite Save More

A Deep Graphical Model for Spelling Correction S Raaijmakers – 2013 – repository.tudelft.nl … We proposed the use of small sublexicons as a remedy, relying on the reconstruction error of the autoencoder to identify the suitable … Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. … Related articles All 3 versions Cite Save More

Embedding with Autoencoder Regularization W Yu, G Zeng, P Luo, F Zhuang, Q He, Z Shi – Machine Learning and …, 2013 – Springer Embedding with Autoencoder Regularization Wenchao Yu1,2, Guangxiang Zeng3, Ping Luo4, Fuzhen Zhuang1, Qing He1, and Zhongzhi Shi1 … This is guaranteed by the joint minimization of the embedding loss and the autoencoder reconstruction error. … Related articles All 2 versions Cite Save

Paraphrase Identification and Applications in Finding Answers in FAQ Databases A Amaral, RA Redol – fenix.tecnico.ulisboa.pt … For natural language engineers, the task has important applications in different types of real-world problems that … The recursive auto-encoder is a recursive neural network that learns feature representations for each node in the … (2011) Recursive autoencoder with dynamic … Cite Save More

Recursive deep models for semantic compositionality over a sentiment treebank R Socher, A Perelygin, JY Wu, J Chuang… – … in Natural Language …, 2013 – oldsite.aclweb.org Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, pages 1631–1642, Seattle, Washington, USA, 18-21 … This model uses the same compositionality function as the recursive autoencoder (Socher et al., 2011b) and recursive auto … Cited by 40 Related articles All 14 versions Cite Save More

Multilingual Distributed Representations without Word Alignment KM Hermann, P Blunsom – arXiv preprint arXiv:1312.6173, 2013 – ttic.uchicago.edu … [7] Ronan Collobert and Jason Weston. A unified architecture for natural language processing: Deep neural networks with multitask learning. … Learning multilingual word representations using a bag-of-words autoencoder. In Deep Learning Workshop at NIPS, 2013. … Cited by 2 Cite Save More

“Not not bad” is not “bad”: A distributional account of negation KM Hermann, E Grefenstette, P Blunsom – arXiv preprint arXiv:1306.2158, 2013 – arxiv.org … In Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing, pages 1183–1193. Association for Computational Linguistics. … Proceedings of the 2012 Conference on Empirical Methods in Natural Language Processing. J. Bos. 2008. … Cited by 3 Related articles All 8 versions Cite Save

Large-Vocabulary Continuous Speech Recognition with Linguistic Features for Deep Learning P Qi – Education – cs229.stanford.edu … speech recognition (ASR) still remains one of the most challenging tasks in both machine learning and natural language processing … to train a DNN model that also predicts the linguistic features themselves alongside the senone labels, which resembles an autoencoder in some … Related articles Cite Save More

Receptive field resolution analysis in convolutional feature extraction E Phaisangittisagul… – … (ISCIT), 2013 13th …, 2013 – ieeexplore.ieee.org … We can notice that the sparse autoencoder can obviously detect edge components from the natural image with the receptive resolution of 10 … the paucity-of-data problem: exploring the effect of training corpus size on classifier performance for natural language processing”, 1st … Related articles Cite Save

A multimodal framework for unsupervised feature fusion X Li, J Gao, H Li, L Yang, RK Srihari – Proceedings of the 22nd ACM …, 2013 – dl.acm.org … Recently, deep architectures achieved impressive results in application domains such as computer vision and natural language processing [2], [15] etc. … We can see from Figure 4-left that GPF has similar performance with Recursive Autoencoder (RAE) [5] because GPF uses … Cited by 1 Related articles Cite Save

Detecting Mutual Functional Gene Clusters from Multiple Related Diseases N Du, X Li, Y Zhang, A Zhang – Bioinformatics and Biomedicine …, 2013 – ieeexplore.ieee.org … Recently, many efforts have been devoted to develop learning algorithms for deep learning methods such as Deep Belief Networks and Stacked Autoencoder, with impressive results obtained in application domains such as computer vision and natural language processing [6 … Cite Save

Training energy-based models for time-series imputation P Brakel, D Stroobandt, B Schrauwen – The Journal of Machine Learning …, 2013 – dl.acm.org … Nelwamondo et al. (2007) trained autoencoder neural networks to impute missing … The idea of training Markov Random Fields in a discriminative way by using a simple deterministic inference procedure is not new and has been used in image and natural language processing. … Cited by 2 Related articles All 3 versions Cite Save

Personalized news recommendation based on implicit feedback I Ilievski, S Roy – Proceedings of the 2013 International News …, 2013 – dl.acm.org … low. Figure 2(a) depicts an impersonation of our auto-encoder network. … 7. REFERENCES [1] R. Collobert, J. Weston, L. Bottou, M. Karlen, K. Kavukcuoglu, and P. Kuksa. Natural language processing (almost) from scratch. Journal … Cite Save

Improving neural networks with dropout N Srivastava – 2013 – cs.toronto.edu … Neural networks are powerful computational models that are being used extensively for solving problems in vision, speech, natural language processing and many … in the context of Denoising Autoencoders [26, 27] where noise is added to the inputs of an autoencoder and the … Cited by 9 Related articles All 2 versions Cite Save More

Deep learning of representations: Looking forward Y Bengio – Statistical Language and Speech Processing, 2013 – Springer … One of the key ingredients for success in the applications of deep learning to speech, images, and natural language processing [9,35] is the use of … The denoising auto-encoder takes a noisy version N(x) of the original input x and tries to reconstruct x, e.g., it minimizes ||r(N(x)) − x||² … Cited by 15 Related articles All 6 versions Cite Save
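The denoising objective quoted in the Bengio snippet above — minimize ||r(N(x)) − x||², where N(x) is a corrupted copy of the input and r(·) is the autoencoder's full encode-then-decode reconstruction — can be illustrated with a toy masking-noise autoencoder in NumPy. The network sizes, noise level, and training loop below are assumptions made for this sketch, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 6, 4  # illustrative dimensions
W1 = rng.normal(scale=0.1, size=(n_hid, n_in)); b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.1, size=(n_in, n_hid)); b2 = np.zeros(n_in)

def reconstruct(v):
    """r(v): encode then decode."""
    return W2 @ sigmoid(W1 @ v + b1) + b2

def corrupt(x, p=0.3):
    """N(x): masking noise -- each coordinate is zeroed with probability p."""
    return x * (rng.random(x.shape) >= p)

X = rng.normal(size=(20, n_in))  # toy "dataset"
lr = 0.05
for _ in range(500):
    x = X[rng.integers(len(X))]
    nx = corrupt(x)                       # feed the corrupted input ...
    c = sigmoid(W1 @ nx + b1)
    d_r = 2 * (W2 @ c + b2 - x)           # ... but reconstruct the CLEAN x
    d_c = W2.T @ d_r * c * (1 - c)
    W2 -= lr * np.outer(d_r, c); b2 -= lr * d_r
    W1 -= lr * np.outer(d_c, nx); b1 -= lr * d_c

loss = float(np.mean([np.sum((reconstruct(corrupt(x)) - x) ** 2) for x in X]))
print(loss)
```

The key design point, visible in the gradient line, is that the target of the squared error is the uncorrupted x, which is what forces the hidden code to capture structure rather than copy noise.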

Dynamics of learning in deep linear neural networks AM Saxe, JL McClelland, S Ganguli – stanford.edu … Deep learning methods have realized impressive performance in a range of applications, from visual object classification [1, 2] to speech recognition [3] and natural language processing [4, 5]. These successes have been achieved … (18) holds, autoencoder pretraining will have … Related articles Cite Save More

Analysis of Relation between Unlabeled and Labeled Data to Self-Taught Learning Performance E Phaisangittisagul, R Chongprachawat – waset.org … The visualizations of some bases of each unlabeled data learned by the sparse autoencoder are shown in Fig.7-9. It is not … M., Brill, E. ”Mitigating the paucity-of-data problem: exploring the effect of training corpus size on classifier performance for natural language processing”, 1 … Related articles All 2 versions Cite Save More

Exact solutions to the nonlinear dynamics of learning in deep linear neural networks AM Saxe, JL McClelland, S Ganguli – arXiv preprint arXiv:1312.6120, 2013 – arxiv.org … in a range of applications, from visual object classification [1, 2, 3] to speech recognition [4] and natural language processing [5 … Page 9. autoencoder pretraining will have properly set up decoupled initial conditions for W21, with an appreciable initial association strength of ? ?. … Cited by 2 Related articles All 4 versions Cite Save

Sports Video Classification from Multimodal Information Using Deep Neural Networks DS Sachan, U Tekwani, A Sethi – 2013 AAAI Fall Symposium Series, 2013 – aaai.org … All rights reserved. areas of object recognition, action recognition, speech and speaker classification, natural language processing (NLP), scene labeling etc. … ISA layer 2 Video part of input Autoencoder Layer 1 Softmax Classifier Audio features Video features … Related articles All 2 versions Cite Save

Deep Canonical Correlation Analysis G Andrew, R Arora, J Bilmes, K Livescu – Proceedings of The 30th …, 2013 – jmlr.org … Contrastive divergence (Bengio & Delalleau, 2009) has had great success as a pretraining technique, as have many variants of autoencoder networks, including the denoising autoencoder (Vincent et al., 2008) used in the present work. … Cited by 8 Related articles All 8 versions Cite Save

Training neural networks with stochastic hessian-free optimization R Kiros – arXiv preprint arXiv:1301.3641, 2013 – arxiv.org … of Martens’ Hessian-free optimization incorporating dropout for training neural networks on classification and deep autoencoder tasks. … networks have been successfully used for tasks such as sentiment classification and compositional modeling of natural language from word … Cited by 2 Related articles All 4 versions Cite Save

Activities report from August 2012 to April 2013 T Amaral – 2013 – paginas.fe.up.pt … 4. Ruslan Salakhutdinov, Geoff Hinton Training a deep autoencoder or a classifier on MNIST … Better than shallow NNs in vision tasks; and natural language processing (NLP) tasks. … Ruslan Salakhutdinov’s and Geoff Hinton’s Matlab code for training a deep auto-encoder made of … Cite Save More

Deep Learning of Representations Y Bengio, A Courville – Handbook on Neural Information Processing, 2013 – Springer … Zemel, 1994). A one-hidden layer auto-encoder is very similar to an RBM and its reconstruction error gradient can be seen as an approximation of the RBM log-likelihood gradient (Bengio and Delalleau, 2009). Both RBMs … Related articles All 4 versions Cite Save

Three Classes of Deep Learning Architectures and Their Applications: A Tutorial Survey L Deng – research.microsoft.com … processing including audio/speech, image/vision, multimodality, language modeling, natural language processing, and … DBN is sometimes used to mean DNN) 7. Deep auto-encoder: a DNN … The original form of the deep autoencoder (Hinton and Salakhutdinov, 2006; Deng et al … Related articles All 3 versions Cite Save More

Deep Learning for Signal and Information Processing L Deng, D Yu – cs.tju.edu.cn … present recent results of applying deep learning in language modeling and natural language processing, respectively. … Deep auto-encoder: a DNN whose output target is the data input itself. … The original form of the deep autoencoder (Hinton and Salakhutdinov, 2006; Deng et al … Cited by 1 Related articles All 2 versions Cite Save More

Deep Learning of Representations: a AAAI 2013 Tutorial Y Bengio – iro.umontreal.ca … In ICML’2011. Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., and Kuksa, P. (2011). Natural language processing (almost) from scratch. Journal of Machine Learning Research, 12, 2493–2537. … Higher order contractive auto-encoder. In European Conference 9 … Related articles Cite Save More

Transfer Learning for Voice Activity Detection: A Denoising Deep Neural Network Perspective XL Zhang, J Wu – arXiv preprint arXiv:1303.2104, 2013 – arxiv.org … It was motivated from the stacked denoising autoencoder [20, 21]. … In [24], Collobert and Weston proposed a joint training scheme for the multitask learning problem of natural language processing, whose key idea is similar to Scheme 3. The architecture of [24] is also … Related articles All 2 versions Cite Save

Applying Deep Learning to Enhance Momentum Trading Strategies in Stocks L Takeuchi, YYA Lee – cs229.stanford.edu … Lawrence Takeuchi * ltakeuch@stanford.edu Yu-Ying (Albert) Lee yy.albert.lee@gmail.com Abstract We use an autoencoder composed of stacked restricted Boltzmann machines to extract features from the history of individual stock prices. … Related articles Cite Save More

Learning deep structured semantic models for web search using clickthrough data PS Huang, X He, J Gao, L Deng, A Acero… – Proceedings of the 22nd …, 2013 – dl.acm.org … DAE (Row 6) is our implementation of the deep auto-encoder based semantic hashing model proposed by Salakhutdinov and Hinton in [22]. … “Natural language processing (almost) from scratch.” in JMLR, vol. 12. [5] Dahl, G., Yu, D., Deng, L., and Acero, A., 2012. … Cited by 12 Related articles All 13 versions Cite Save

Classification of mysticete sounds using machine learning techniques XC Halkias, S Paris, H Glotin – The Journal of the Acoustical Society …, 2013 – scitation.aip.org … However, we would also like to impose a sparse constraint on the hidden units, since that would force the auto-encoder network to … R. Socher, CC-H. Lin, AY Ng, and CD Manning, “Parsing natural scenes and natural language with recursive neural networks,” in ICML, edited … Cited by 1 Related articles All 3 versions Cite Save

Semantic clustering of scientific articles using explicit semantic analysis M Szczuka, A Janusz – Transactions on Rough Sets XVI, 2013 – Springer … texts (the corpus and the DBpedia abstracts) are converted to the bag-of-words (word-vector) 2 Natural Language Processing (NLP … Training Algorithm and Applications of New Neural Classifier [1] 9.19 ”Neural_Lab” [2] 9.17 ”Echo_state_network” [3] 8.75 ”Auto-encoder” [4] 8.30 … Cited by 2 Related articles All 3 versions Cite Save

Workshop on Neural-Symbolic Learning and Reasoning NeSy13 ASA Garcez, P Hitzler, LC Lamb – knoesis.wright.edu … page 13 – Linear Autoencoder Networks for Structured Data (invited talk) Alessandro Sperduti … Nevertheless, newer approaches towards natural language understanding systems attempt to integrate soft computing elements into the symbolic representations of the integrated … Related articles Cite Save More

Discriminative Improvements to Distributional Sentence Similarity Y Ji, J Eisenstein – aclweb.org … Measuring the semantic similarity of short units of text is fundamental to many natural language processing tasks, from evaluating machine translation … (2011) propose a syntactically-informed approach to combine word representations, using a recursive auto-encoder to propagate … Related articles All 6 versions Cite Save More

Features in Deep Learning Architectures with Unsupervised Kernel k-Means KS Ni, RJ Prenger – 2013 – e-reports-ext.llnl.gov … Tasks in computer vision, audio and multimedia research, and natural language and text processing require features that saliently describe a semantic concept. … Accuracy Sparse Autoencoder 80.2% Sparse RBM 79.0 % Kernel GMM’s 75.6 % Kernel k-Means 80.6% … Related articles Cite Save More

Representation learning: A review and new perspectives Y Bengio, A Courville, P Vincent – 2013 – ieeexplore.ieee.org … Index Terms—Deep learning, representation learning, feature learning, unsupervised learning, Boltzmann machine, autoencoder, neural nets … 2.3 Natural Language Processing (NLP) Besides speech recognition, there are many other NLP applications of representation learning … Cited by 107 Related articles All 23 versions Cite Save

Fast algorithms for sparse principal component analysis based on Rayleigh quotient iteration V Kuleshov – … of the 30th International Conference on …, 2013 – machinelearning.wustl.edu … This technique is referred to in the literature as sparse principal component analysis (sPCA), and has been successfully applied in areas as diverse as bioinformatics (Lee et al., 2010), natural language processing (Richtárik et al., 2012), and signal … (a) sSVD (b) Autoencoder … Cited by 1 Related articles All 2 versions Cite Save

Learning Paired-associate Images with An Unsupervised Deep Learning Architecture T Wang, DL Silver – arXiv preprint arXiv:1312.6171, 2013 – arxiv.org … Binary coding of speech spectrograms using a deep auto-encoder. In Takao Kobayashi, Keikichi Hirose, and Satoshi Nakamura, editors, Interspeech, pages 1692–1695. ISCA, 2010. … [5] Stephan Gouws. Deep unsupervised feature learning for natural language processing. … Related articles All 3 versions Cite Save

Deep belief networks based voice activity detection XL Zhang, J Wu – Audio, Speech, and Language Processing, …, 2013 – ieeexplore.ieee.org … Recently, DBN has received much attention in both the machine learning community [36] and the signal processing community [37] with successful applications to the speech recognition [4]–[9], natural language processing [38], etc. … Cited by 6 Related articles All 2 versions Cite Save

Active Labeling in Deep Learning and Its Application to Emotion Prediction D Wang – 2013 – mospace.umsystem.edu … 9 2.2.1 Restricted Boltzmann Machine ….. 9 2.2.2 Autoencoder ….. 14 … Fig. 5 An autoencoder ….. 15 Fig. … Cite Save

Missing Value Imputation With Unsupervised Backpropagation MS Gashler, MR Smith, R Morris, T Martinez – arXiv preprint arXiv: …, 2013 – arxiv.org … Related approaches have been used to generate labels for images (Coheh and Shawe-Taylor, 1990), and for natural language (Bengio et al., 2006). … UBP is comparable with the latter-half of an autoencoder (Hinton and Salakhutdinov, 2006). … Related articles All 2 versions Cite Save

Deep learning for detecting robotic grasps I Lenz, H Lee, A Saxena – arXiv preprint arXiv:1301.3592, 2013 – arxiv.org … of-the-art performance in a wide variety of tasks, including visual recognition [19, 37], audio recognition [22, 25], and natural language processing [6 … Using a standard feature learning approach such as sparse auto-encoder [10], a deep network can be trained for the problem of … Cited by 8 Related articles All 10 versions Cite Save

Learning the Thematic Roles of Words in Sentences via Connectionist Networks that Satisfy Strong Systematicity P Geranmayeh – 2013 – summit.sfu.ca … system. All the mentioned systems adopt the approaches and pursue the goals that are common in the field of Natural Language Processing. … neural networks, the backpropagation of errors, Hebbian learning, competitive learning, and auto-encoder neural networks. … Cite Save

Exploring Deep and Recurrent Architectures for Optimal Control S Levine – arXiv preprint arXiv:1311.1761, 2013 – arxiv.org … Multilayer neural networks such as autoencoders and convolutional neural networks have achieved state of the art results on a number of perception and natural language tasks [5, 11]1. However, recent attempts to … Deep auto-encoder neural networks in reinforcement learning. … Cited by 3 Related articles All 5 versions Cite Save

Sparse unsupervised feature learning for sentiment classification of short documents S Albertini, I Gallo, A Zamberletti – artelab.dista.uninsubria.it … Different natural language processing techniques are adopted to support the building of dictionaries and lexicons to identify opinion-bearing words such as the polarity of specific part-of-speech influenced by … 2011) employ a semi-supervised recursive auto-encoder to obtain … Related articles All 2 versions Cite Save More

Deep generative stochastic networks trainable by backprop Y Bengio, É Thibodeau-Laufer – arXiv preprint arXiv:1306.1091, 2013 – arxiv.org … Collobert, R. and Weston, J. (2008). A unified architecture for natural language processing: Deep neural networks with multitask learning. In ICML’2008. … Binary coding of speech spectrograms using a deep auto-encoder. In Interspeech 2010, Makuhari, Chiba, Japan. … Cited by 11 Related articles All 4 versions Cite Save

Understanding Deep Architectures using a Recursive Convolutional Network D Eigen, J Rolfe, R Fergus, Y LeCun – arXiv preprint arXiv:1312.1847, 2013 – arxiv.org … This contrasts with our model which has the same input and output dimension. Our model has links with several auto-encoder approaches. Sparse coding [17] uses iterative algorithms, such as ISTA [1], to perform inference. Rozell et al. … Cited by 2 Related articles All 3 versions Cite Save

Computational Phenotype Discovery Using Unsupervised Feature Learning over Noisy, Sparse, and Irregular Clinical Data TA Lasko, JC Denny, MA Levy – PloS one, 2013 – dx.plos.org … operators over structured data such as laboratory values, medication doses, vital signs, billing codes and concepts extracted via natural language processing of … At its simplest, an autoencoder is a network of three layers – an input layer with nodes representing the original data … Cited by 4 Related articles All 9 versions Cite Save More

Extreme Learning Machines E Cambria, GB Huang, LLC Kasun… – Intelligent Systems, …, 2013 – ieeexplore.ieee.org … In “Representational Learning with ELMs for Big Data,” the authors propose using the ELM as an auto-encoder for learning feature representations using singular values. … Stacked auto-encoder (SAE) 98.6 – Stacked denoising auto-encoder (SDAE) 98.72 – … Cite Save

Faust: Flexible Acquisition and Understanding System for Text LL Voss, DE Wilkins, D Israel, C Manning, D Jurafsky… – 2013 – DTIC Document … The vast majority of scientific and technical knowledge is expressed in natural-language (NL) texts. … 4.3.1 Natural Language Processing … Cite Save

Sentiment analysis with limited training data L Qu – 2013 – scidok.sulb.uni-saarland.de … This dissertation focuses on learning based systems for automatic analysis of sentiments and comparisons in natural language text. The proposed approach … This dissertation is concerned with automatic analysis of sentiments and comparisons in natural language text. … Related articles All 4 versions Cite Save

A Multi-scale Approach to Gesture Detection and Recognition N Neverova, C Wolf, G Paci… – … (ICCVW), 2013 IEEE …, 2013 – ieeexplore.ieee.org … [1] M. Baccouche, F. Mamalet, C. Wolf, C. Garcia, and A. Baskurt. Spatio-Temporal Convolutional Sparse Auto-Encoder for Sequence Classification. … Evalita 2011: Forced alignment task. In Evaluation of Natural Language and Speech Tools for Italian, pages 305–311, 2013. … Cite Save

Image search—from thousands to billions in 20 years L Zhang, Y Rui – ACM Transactions on Multimedia Computing, …, 2013 – dl.acm.org Image Search—From Thousands to Billions in 20 Years LEI ZHANG and YONG RUI, Microsoft Research Asia This article presents a comprehensive review and analysis on image search in the past 20 years, emphasizing … Cited by 2 Related articles All 2 versions Cite Save

DLID: Deep Learning for Domain Adaptation by Interpolating between Domains S Chopra, S Balakrishnan… – ICML Workshop on …, 2013 – deeplearning.net … of Computer Vision (Duan et al., 2009; Jain & Learned-Miller, 2011; Wang & Wang, 2011), but also in Natural Language Processing (Blitzer et … with unstructured text, one might want to use a fully connected neural network, trained using the de-noising auto-encoder regime used … Cited by 4 Related articles Cite Save More

Probabilistic Siamese Network for Learning Representations C Liu – 2013 – tspace.library.utoronto.ca Probabilistic Siamese Network for Learning Representations by Chen Liu A thesis submitted in conformity with the requirements for the degree of Master of Applied Science Graduate Department of Electrical and Computer Engineering University of Toronto … Related articles Cite Save

Deep architectures and deep learning in chemoinformatics: the prediction of aqueous solubility for drug-like molecules A Lusci, G Pollastri, P Baldi – Journal of chemical information and …, 2013 – ACS Publications … to natural language understanding to bioinformatics.(29-36) Thus, it is natural to try to apply deep learning methods to the prediction of molecular properties. There are several nonexclusive ways of generating deep architectures for complex tasks, such as autoencoder-based … Cited by 9 Related articles All 4 versions Cite Save

Introduction to Economic Modeling T Marwala – Economic Modeling Using Artificial Intelligence …, 2013 – Springer … This technique is useful in that unlike neural networks, for example where the identified model is a strictly mathematical concept, in rough sets method the identified model that describes the data is in terms of natural language. … Related articles Cite Save

Exploring sharing patterns for video recommendation on YouTube-like social media X Ma, H Wang, H Li, J Liu, H Jiang – Multimedia Systems, 2013 – Springer … so that the dimensionality of the transformed data is reduced [7]. Auto-encoder is a … used in many applications, such as handwritten digits recognition [53] and natural language processing [46]. … Autoencoder neural networks attempt to find features of the input set which can then … Cited by 2 Related articles Cite Save

Deep Learning For Sequential Pattern Recognition P Safari – 2013 – upcommons.upc.edu … 50 4.4 Pretraining a DBM with three hidden layers . . . . . 52 4.5 The denoising autoencoder architecture . . . . . 55 4.6 Deep stacking denoising autoencoder . . . . . 55 4.7 A deep stacking network . . . . . … Related articles Cite Save

Understanding and Quantifying Creativity in Lexical Composition P Kuznetsova, J Chen, Y Choi – rt4.cs.stonybrook.edu … 1 Introduction An essential property of natural language is the generative capacity that makes it possible for people to express indefinitely many thoughts through indefinitely many different ways of composing phrases and sentences (Chomsky, 1965). … Cited by 1 Related articles All 9 versions Cite Save More

Machine Learning and Knowledge Discovery in Databases H Blockeel, K Kersting, S Nijssen, F Železný – Springer Page 1. Hendrik Blockeel Kristian Kersting Siegfried Nijssen Filip Železný (Eds.) LNAI 8188 Machine Learning and Knowledge Discovery in Databases European Conference, ECML PKDD 2013 Prague, Czech Republic, September 2013 Proceedings, Part I 123 Page 2. … Related articles All 2 versions Cite Save

Learning deep physiological models of affect HP Martinez, Y Bengio… – Computational …, 2013 – ieeexplore.ieee.org … An auto-encoder can be viewed as a non-linear generalization of PCA [8]; however, while PCA has been applied in AC to transpose sets of manually extracted features into low dimensional spaces, in this paper auto-encoders are used to train unsupervised CNNs to transpose … Cited by 8 Related articles All 2 versions Cite Save
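The PCA comparison drawn in this entry can be made concrete: a purely linear auto-encoder with a rank-k bottleneck, trained by gradient descent, matches PCA's optimal rank-k reconstruction (the non-linear activations are what generalize beyond PCA). A minimal sketch on assumed toy data, not code from the cited paper:

```python
import numpy as np

# Sketch: a *linear* tied-weight auto-encoder trained by gradient descent
# reaches the same reconstruction error as PCA's best rank-k projection.
# Data and hyper-parameters are illustrative assumptions.
rng = np.random.default_rng(1)
n, d, k = 300, 5, 2
Z = rng.normal(size=(n, k))                 # latent factors
L = rng.normal(size=(k, d))                 # loadings
X = Z @ L + 0.05 * rng.normal(size=(n, d))  # low-rank data + small noise
X -= X.mean(axis=0)                         # PCA assumes centered data

# PCA baseline: best rank-k reconstruction via top-k right singular vectors.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Vk = Vt[:k]
mse_pca = np.mean((X - X @ Vk.T @ Vk) ** 2)

# Linear auto-encoder: codes = X W^T, reconstruction = X W^T W.
W = 0.1 * rng.normal(size=(k, d))
lr = 0.01
for _ in range(5000):
    R = X - X @ W.T @ W                        # reconstruction residual
    grad = -2.0 * W @ (X.T @ R + R.T @ X) / n  # d/dW of mean ||R||^2
    W -= lr * grad
mse_ae = np.mean((X - X @ W.T @ W) ** 2)       # approaches mse_pca
```

By Eckart–Young, `mse_ae` can never drop below `mse_pca`; training closes the gap, which is the sense in which an auto-encoder generalizes PCA.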

Latent feature learning in social media network Z Yuan, J Sang, Y Liu, C Xu – Proceedings of the 21st ACM international …, 2013 – dl.acm.org … The ability of deep learning in high-level abstraction and distributed representation has been validated in many classical machine learning problems, such as computer vision [20], natural language processing [21], information retrieval [22]. … Cited by 1 Related articles Cite Save

Dimensionality Reduction RE Banchs – Text Mining with MATLAB®, 2013 – Springer … This process is actually a very complex task, as well as an interesting Natural Language Processing problem per se; and it falls out of the scope of this book. An alternative procedure to vocabulary merging is the one known as stemming. … Related articles Cite Save
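The stemming alternative this entry mentions reduces vocabulary size by collapsing inflected forms onto a common stem. A deliberately naive suffix-stripping sketch (far cruder than real stemmers such as Porter's algorithm) illustrates the idea:

```python
# Naive suffix-stripping stemmer -- an illustrative sketch only, not the
# Porter algorithm or any production stemmer.
SUFFIXES = ("ing", "ion", "ed", "es", "s")  # checked longest-first

def naive_stem(word: str) -> str:
    for suffix in SUFFIXES:
        # Guard: keep at least 3 characters so short words survive intact.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# Inflected forms collapse to a single vocabulary entry:
words = ["connected", "connection", "connecting", "connects"]
stems = {naive_stem(w) for w in words}  # one stem, "connect"
```

For a term-document matrix this means several columns merge into one, which is exactly the dimensionality reduction the chapter is after.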

[BOOK] Economic Modeling Using Artificial Intelligence Methods T Marwala – 2013 – Springer Page 1. Advanced Information and Knowledge Processing Economic Modeling Using Artificial Intelligence Methods Tshilidzi Marwala Page 2. Economic Modeling Using Artificial Intelligence Methods Page 3. Advanced Information and Knowledge Processing … Cited by 4 Related articles All 4 versions Cite Save

Herding: Driving Deterministic Dynamics to Learn and Sample Probabilistic Models (Dissertation) Y Chen – 2013 – ics.uci.edu … Although MRFs/CRFs have been applied successfully in various areas, among which are image segmentation (Blake et al., 2004), natural language processing (Sha & Pereira, 2003), social networks (Wasserman & Pattison, 1996) and bioinformatics (Li et al., 2007), these … Related articles All 2 versions Cite Save More

Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications J Ruiz-Shulcloper, GS di Baja – Springer … speech recognition, as well as their application in a number of diverse areas such as industry, health, robotics, data mining, entertainment, space exploration, telecommunications, document analysis, and natural language processing and … Auto-encoder Based Data Clustering … Related articles Cite Save

Machine learning paradigms for speech recognition: An overview L Deng, X Li – 2013 – ieeexplore.ieee.org … AUDIO, SPEECH, AND LANGUAGE PROCESSING, VOL. X, NO. X, MONTH, 2013 8 as bioinformatics and natural language processing. For many ML as well as ASR researchers, the success of HMM in ASR is a bit surprising due to the well-known weaknesses of the HMM. … Cited by 18 Related articles All 9 versions Cite Save