Notes:
Text generation is the automatic production of natural language text using artificial intelligence (AI) techniques such as natural language processing (NLP) and machine learning. Text generation algorithms analyze patterns in large datasets of text and use this analysis to produce new text that is similar in style and content to the training data.
Text generation is used in a variety of applications, including natural language processing, information retrieval, and machine translation. In natural language processing, text generation algorithms can produce responses to user input, summaries of documents, or descriptions of images and other data.
In information retrieval, text generation algorithms can be used to generate search results or recommendations based on user input. For example, a search engine might use a text generation algorithm to generate a list of relevant search results based on a user’s query.
In machine translation, text generation algorithms can be used to translate text from one language to another. Text generation algorithms can analyze patterns in large datasets of translated text, and use this analysis to generate accurate and natural-sounding translations of new text.
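The core idea above — analyze patterns in a corpus, then emit new text matching those patterns — can be sketched with a toy word-level Markov chain. This is a minimal illustration, not a production technique; the tiny corpus and function names are invented for the example:

```python
import random
from collections import defaultdict

def build_bigram_model(corpus):
    """Map each word to the list of words observed to follow it."""
    words = corpus.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=8, seed=0):
    """Walk the chain, picking a random observed successor at each step."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat saw the dog .")
model = build_bigram_model(corpus)
print(generate(model, "the"))
```

Because successors are sampled in proportion to how often they appear in the corpus, the output mimics the corpus's local style — the same intuition that neural models scale up with far richer context.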
Neural text generation is an artificial intelligence (AI) technique that uses neural networks to generate natural language text automatically. Neural networks are machine learning models inspired by the structure and function of the human brain, composed of interconnected “neurons” that process and transmit information.
In the context of text generation, neural networks are trained on large datasets of text and learn to recognize patterns and produce text that is similar in style and content to the training data. Neural text generation algorithms can be used to produce responses to user input, summaries of text, or descriptions of images and other data.
There are several types of neural networks that can be used for text generation, including feedforward neural networks, convolutional neural networks, and recurrent neural networks. Each type of neural network has its own strengths and is suited to different types of tasks.
Feedforward neural networks are the simplest type of neural network, and are composed of an input layer, one or more hidden layers, and an output layer. They are used to classify and predict outcomes based on input data, and are well suited to tasks such as image classification and spam detection.
Convolutional neural networks (CNNs) are a type of feedforward neural network that are particularly well suited to tasks involving image or audio data. They are designed to recognize patterns and features in images or audio, and are commonly used in tasks such as image classification and object recognition.
Recurrent neural networks (RNNs) are a type of neural network that are designed to process sequential data, such as text or time series data. They are commonly used in natural language processing (NLP) and text generation tasks to analyze and generate human language, and can be trained to maintain context and memory of past events in a conversation.
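The way an RNN "maintains context" can be made concrete with a single Elman-style recurrent update in plain Python. The weights below are made-up toy values, not trained parameters — this is a sketch of the state update, not a working language model:

```python
import math

def rnn_step(x, h_prev, w_xh, w_hh, b):
    """One recurrent update: the new hidden state mixes the current
    input with the state carried over from the previous time step."""
    return math.tanh(w_xh * x + w_hh * h_prev + b)

# Process a short input sequence; the hidden state h accumulates
# information about everything seen so far.
h = 0.0
for x in [1.0, 0.5, -0.3]:
    h = rnn_step(x, h, w_xh=0.8, w_hh=0.5, b=0.1)
print(round(h, 4))
```

The `w_hh * h_prev` term is what distinguishes an RNN from a feedforward network: each step's output depends on the whole history of inputs, which is why RNNs suit sequential data like text.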
Headlines can be automatically generated using artificial intelligence (AI) techniques such as natural language processing (NLP) and machine learning. Here are some common approaches to generating headlines automatically:
- Rule-based systems: Rule-based systems rely on a set of predefined rules or heuristics to generate headlines. For example, a rule-based system might be programmed to always generate headlines in a particular format (e.g., “5 Ways to…”), or to use specific words or phrases (e.g., “Amazing,” “Incredible,” “Must-See”).
- Machine learning-based systems: Machine learning-based systems use algorithms and data to learn patterns in text, and generate headlines based on these patterns. For example, a machine learning-based system might be trained on a dataset of headlines from a particular website or industry, and use this training data to generate headlines that are similar in style and content to the input data.
- Hybrid systems: Hybrid systems combine rule-based and machine learning-based approaches, and may use a combination of rules and machine learning algorithms to generate headlines.
In general, headline generation algorithms are trained on large datasets of headlines, and use this training data to learn patterns in the language and style of headlines. Once trained, the algorithms can be used to generate headlines for a variety of applications, including news articles, social media posts, and advertising.
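The rule-based approach described above can be sketched as template filling. The template strings and slot names here are invented for illustration, not taken from any real system:

```python
import random

# Predefined headline patterns, in the spirit of rule-based systems.
TEMPLATES = [
    "{n} Ways to {verb} Your {noun}",
    "The {adj} Guide to {noun}",
    "Why {noun} Matters More Than You Think",
]

def make_headline(noun, verb="Improve", adj="Essential", n=5, seed=None):
    """Fill a randomly chosen template with the supplied slot values."""
    rng = random.Random(seed)
    template = rng.choice(TEMPLATES)
    return template.format(n=n, verb=verb, noun=noun, adj=adj)

print(make_headline("Text Generation", seed=0))
```

A hybrid system would keep templates like these for structure while letting a trained model choose or rank the slot fillers — combining the predictability of rules with the fluency learned from data.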
References:
- User Modelling in Text Generation (2015)
- The Linguistic Basis of Text Generation (2009)
- Building Natural Language Generation Systems (2000)
See also:
Data-to-Text Systems | Generative Text & Natural Language Processing
User modelling in text generation
C Paris – 2015 – books.google.com
Linguistics: Bloomsbury Academic Collections. This Collection, composed of 19 reissued titles from The Athlone Press, Cassell and Frances Pinter, offers a distinguished selection of titles that showcase the breadth of linguistic study. The collection is available both in e-book and print
Controllable text generation
Z Hu, Z Yang, X Liang, R Salakhutdinov… – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: Generic generation and manipulation of text is challenging and has limited success compared to recent deep generative modeling in visual domain. This paper aims at generating plausible natural language sentences, whose attributes are dynamically
A hybrid convolutional variational autoencoder for text generation
S Semeniuta, A Severyn, E Barth – arXiv preprint arXiv:1702.02390, 2017 – arxiv.org
Abstract: In this paper we explore the effect of architectural choices on learning a Variational Autoencoder (VAE) for text generation. In contrast to the previously introduced VAE model for text where both the encoder and decoder are RNNs, we propose a novel hybrid
MaskGAN: Better Text Generation via Filling in the _
A Dai, I Goodfellow, L Fedus – 2018 – research.google.com
Abstract Recurrent neural networks (RNNs) are a common method of generating text token by token. These models are typically trained via maximum likelihood (known in this context as teacher forcing). However, this approach frequently suffers from problems when using a
MaskGAN: Better Text Generation via Filling in the _
W Fedus, I Goodfellow, AM Dai – arXiv preprint arXiv:1801.07736, 2018 – arxiv.org
Abstract: Neural text generation models are often autoregressive language models or seq2seq models. These models generate text by sampling words sequentially, with each word conditioned on the previous word, and are state-of-the-art for several machine
Neural text generation from structured data with application to the biography domain
R Lebret, D Grangier, M Auli – arXiv preprint arXiv:1603.07771, 2016 – arxiv.org
Abstract: This paper introduces a neural model for concept-to-text generation that scales to large, rich domains. We experiment with a new dataset of biographies from Wikipedia that is an order of magnitude larger than existing resources with over 700k samples. The dataset is
The role of oral language in underpinning the text generation difficulties in children with specific language impairment
JE Dockrell, V Connelly – Journal of Research in Reading, 2015 – Wiley Online Library
Abstract Children with specific language impairments (SLI) have difficulties in producing written text. It was hypothesised that the constraints on writing in children with SLI were similar to typically developing younger children with the same level of vocabulary
The effect of language specific factors on early written composition: the role of spelling, oral language and text generation skills in a shallow orthography
B Arfé, JE Dockrell, B De Bernardi – Reading and Writing, 2016 – Springer
Abstract Spelling skills have been identified as one of the major barriers to written text production in young English writers. By contrast oral language skills and text generation have been found to be less influential in the texts produced by beginning writers. To date,
Globally coherent text generation with neural checklist models
C Kiddon, L Zettlemoyer, Y Choi – … of the 2016 Conference on Empirical …, 2016 – aclweb.org
Abstract Recurrent neural networks can generate locally coherent text but often have difficulties representing what has already been generated and what still needs to be said–especially when constructing long texts. We present the neural checklist model, a recurrent
Affect-lm: A neural language model for customizable affective text generation
S Ghosh, M Chollet, E Laksana, LP Morency… – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: Human verbal communication includes affective messages which are conveyed through use of emotionally colored words. There has been a lot of research in this direction but the problem of integrating state-of-the-art neural language models with affective
A knowledge graph-based content selection model for data-driven text generation
JP Gong, J Cao, PZ Zhang – International Journal of …, 2017 – inderscienceonline.com
Content selection is a critical task for natural language generation. A novel approach based on knowledge graphs is proposed. Structured data is mapped to the graph and combined with user-defined knowledge. The model analyses the content selection features on the graph,
Discriminative syntax-based word ordering for text generation
Y Zhang, S Clark – Computational Linguistics, 2015 – MIT Press
Word ordering is a fundamental problem in text generation. In this article, we study word ordering using a syntax-based approach and a discriminative model. Two grammar formalisms are considered: Combinatory Categorial Grammar (CCG) and dependency
Adversarial feature matching for text generation
Y Zhang, Z Gan, K Fan, Z Chen, R Henao… – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: The Generative Adversarial Network (GAN) has achieved great success in generating realistic (real-valued) synthetic data. However, convergence issues and difficulties dealing with discrete data hinder the applicability of GAN to text. We propose a
AMR-to-text generation as a Traveling Salesman Problem
L Song, Y Zhang, X Peng, Z Wang, D Gildea – arXiv preprint arXiv …, 2016 – arxiv.org
Abstract: The task of AMR-to-text generation is to generate grammatical text that sustains the semantic meaning for a given AMR graph. We attack the task by first partitioning the AMR graph into smaller fragments, and then generating the translation for each fragment, before
Deep Learning for Image-to-Text Generation: A Technical Overview
X He, L Deng – IEEE Signal Processing Magazine, 2017 – ieeexplore.ieee.org
Generating a natural language description from an image is an emerging interdisciplinary problem at the intersection of computer vision, natural language processing, and artificial intelligence (AI). This task, often referred to as image or visual captioning, forms the
Towards more variation in text generation: Developing and evaluating variation models for choice of referential form
TC Ferreira, E Krahmer, S Wubben – … of the 54th Annual Meeting of the …, 2016 – aclweb.org
Abstract In this study, we introduce a nondeterministic method for referring expression generation. We describe two models that account for individual variation in the choice of referential form in automatically generated text: a Naive Bayes model and a Recurrent
Data-to-text generation improves decision-making under uncertainty
D Gkatzia, O Lemon, V Rieser – IEEE Computational …, 2017 – ieeexplore.ieee.org
Decision-making is often dependent on uncertain data, eg data associated with confidence scores or probabilities. This article presents a comparison of different information presentations for uncertain data and, for the first time, measures their effects on human
Expressionist: An authoring tool for in-game text generation
J Ryan, E Seither, M Mateas… – … Conference on Interactive …, 2016 – Springer
Abstract We present Expressionist, an authoring tool for in-game text generation that combines the raw generative power of context-free grammars (CFGs) with the expressive power of free-text markup. Specifically, authors use the tool to define CFGs whose
Effects of text generation on P300 brain-computer interface performance
JE Huggins, RE Alcaide-Aguirre, K Hill – Brain-Computer Interfaces, 2016 – Taylor & Francis
Abstract Brain-computer interfaces (BCIs) are intended to provide independent communication for those with the most severe physical impairments. However, development and testing of BCIs is typically conducted with copy-spelling of provided text, which models
Automatic Text Generation via Text Extraction Based on Submodular
L Ai, N Li, J Zheng, M Gao – Asia-Pacific Web (APWeb) and Web-Age …, 2017 – Springer
Abstract Automatic text generation is the generation of natural language texts by computer. It has many applications, including automatic report generation, online promotion, etc. However, the problem is still challenging due to the lack of readability and coherence
A novel approach for automatic text analysis and generation for the cultural heritage domain
F Piccialli, F Marulli, A Chianese – Multimedia Tools and Applications, 2017 – Springer
… Keywords: Natural language generation, Cultural heritage, Text generation, Knowledge modeling … A multidimensional representation model for knowledge supporting user profiling and domain driven text generation …
Improving fluency in narrative text generation with grammatical transformations and probabilistic parsing
E Ahn, F Morbini, A Gordon – … of the 9th International Natural Language …, 2016 – aclweb.org
Abstract In research on automatic generation of narrative text, story events are often formally represented as a causal graph. When serializing and realizing this causal graph as natural language text, simple approaches produce cumbersome sentences with repetitive syntactic
When to Finish? Optimal Beam Search for Neural Text Generation (modulo beam size)
L Huang, K Zhao, M Ma – Proceedings of the 2017 Conference on …, 2017 – aclweb.org
Abstract In neural text generation such as neural machine translation, summarization, and image captioning, beam search is widely used to improve the output text quality. However, in the neural generation setting, hypotheses can finish in different steps, which makes it difficult
Rule-based Approach to Text Generation in Natural Language-Automated Text Markup Language (ATML3).
A Bauer, N Hoedoro, A Schneider – Challenge+ DC@ RuleML, 2015 – ceur-ws.org
Abstract. The need for text online is quite large. The majority of websites still make use of text as their main representative form for information. Most people can easily comprehend issues in text form and often prefer this type of information for communication. Still the prices for
Texygen: A Benchmarking Platform for Text Generation Models
Y Zhu, S Lu, L Zheng, J Guo, W Zhang, J Wang… – arXiv preprint arXiv …, 2018 – arxiv.org
Abstract: We introduce Texygen, a benchmarking platform to support research on open-domain text generation models. Texygen has not only implemented a majority of text generation models, but also covered a set of metrics that evaluate the diversity, the quality
AMR-to-text generation with synchronous node replacement grammar
L Song, X Peng, Y Zhang, Z Wang, D Gildea – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: This paper addresses the task of AMR-to-text generation by leveraging synchronous node replacement grammar. During training, graph-to-string rules are learned using a heuristic extraction algorithm. At test time, a graph transducer is applied to collapse
Linguistic realisation as machine translation: Comparing different MT models for AMR-to-text generation
TC Ferreira, I Calixto, S Wubben… – Proceedings of the 10th …, 2017 – aclweb.org
Abstract In this paper, we study AMR-to-text generation, framing it as a translation task and comparing two different MT approaches (Phrasebased and Neural MT). We systematically study the effects of 3 AMR preprocessing steps (Delexicalisation, Compression, and
Procedural Text Generation from an Execution Video
A Ushiku, H Hashimoto, A Hashimoto… – Proceedings of the Eighth …, 2017 – aclweb.org
Abstract In recent years, there has been a surge of interest in automatically describing images or videos in a natural language. These descriptions are useful for image/video search, etc. In this paper, we focus on procedure execution videos, in which a human makes
Order-planning neural text generation from structured data
L Sha, L Mou, T Liu, P Poupart, S Li, B Chang… – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: Generating texts from structured data (eg, a table) is important for various natural language processing tasks such as question answering and dialog systems. In recent studies, researchers use neural language models and encoder-decoder frameworks for
Long Text Generation via Adversarial Training with Leaked Information
J Guo, S Lu, H Cai, W Zhang, Y Yu, J Wang – arXiv preprint arXiv …, 2017 – arxiv.org
Abstract: Automatically generating coherent and semantically meaningful text has many applications in machine translation, dialogue systems, image captioning, etc. Recently, by combining with policy gradient, Generative Adversarial Nets (GAN) that use a discriminative
An Application of AOC-Posets: Indexing Large Corpuses for Text Generation Under Constraints
A Gutierrez, M Chein, M Huchard… – … on Methodologies for …, 2017 – Springer
Abstract In this paper, we describe the different ingredients of the CogiText tool which can be used for building, editing, and using large corpuses for text generation under constraints à la Alamo. In CogiText, AOC-posets are used as indexes that give information about the shape
Sketch-to-Text Generation: Toward Contextual, Creative, and Coherent Composition
Y Choi – Proceedings of the 9th International Natural Language …, 2016 – aclweb.org
Abstract The need for natural language generation (NLG) arises in diverse, multimodal contexts: ranging from describing stories captured in a photograph, to instructing how to prepare a dish using a given set of ingredients, and to composing a sonnet for a given topic
rLDCP: R package for text generation from data
P Conde-Clemente, JM Alonso… – Fuzzy Systems (FUZZ …, 2017 – ieeexplore.ieee.org
The generation of text reports from numerical and symbolic data is getting the attention of many researchers. This paper presents an R package useful to develop computational systems able to generate linguistic descriptions of complex phenomena. It generates text
The Code2Text Challenge: Text Generation in Source Libraries
K Richardson, S Zarrieß, J Kuhn – Proceedings of the 10th International …, 2017 – aclweb.org
Abstract We propose a new shared task for tactical datato-text generation in the domain of source code libraries. Specifically, we focus on text generation of function descriptions from example software projects. Data is drawn from existing resources used for studying the
RevManHAL: towards automatic text generation in systematic reviews
MT Torres, CE Adams – Systematic reviews, 2017 – systematicreviewsjournal …
Systematic reviews are a key part of healthcare evaluation. They involve important painstaking but repetitive work. A major producer of systematic reviews, the Cochrane Collaboration, employs Review Manager (RevMan) programme—a software which assists
The Code2Text Challenge: Text Generation in Source Code Libraries
K Richardson, S Zarrieß, J Kuhn – arXiv preprint arXiv:1708.00098, 2017 – arxiv.org
Abstract: We propose a new shared task for tactical data-to-text generation in the domain of source code libraries. Specifically, we focus on text generation of function descriptions from example software projects. Data is drawn from existing resources used for studying the
Hierarchical Text Generation and Planning for Strategic Dialogue
D Yarats, M Lewis – arXiv preprint arXiv:1712.05846, 2017 – arxiv.org
Abstract: End-to-end models for strategic dialogue are challenging to train, because linguistic and strategic aspects are entangled in latent state vectors. We introduce an approach to generating latent representations of dialogue moves, by inducing sentence
Methods for Automatic Text Generation
D Pawade, M Jain, G Sarode – i-Manager’s Journal on …, 2016 – search.proquest.com
Abstract In the world of automating tasks and reducing human effort, it is essential for a computer to be able to produce text like humans. This will enable us to let the computer work on insignificant tasks, such as create a summary for an advertisement or a product as well as
Neural Text Generation: A Practical Guide
Z Xie – arXiv preprint arXiv:1711.09534, 2017 – arxiv.org
Abstract: Deep learning methods have recently achieved great empirical success on machine translation, dialogue response generation, summarization, and other text generation tasks. At a high level, the technique has been to train end-to-end neural network
Designing an Algorithm-Driven Text Generation System for Personalized and Interactive News Reading
D Kim, J Lee – International Journal of Human–Computer …, 2018 – Taylor & Francis
ABSTRACT Algorithms are playing an increasingly important role in the production of news content as their computation capacity in manipulating large-scale data continues to grow. In this article, we present Personalized and Interactive News Generation System (PINGS), an
Text Generation Based on Generative Adversarial Nets with Latent Variable
H Wang, Z Qin, T Wan – arXiv preprint arXiv:1712.00170, 2017 – arxiv.org
Abstract: In this paper, we propose a model using generative adversarial net (GAN) to generate realistic text. Instead of using standard GAN, we combine variational autoencoder (VAE) with generative adversarial net. The use of high-level latent random variables is
Text Generation Starting from an Ontology.
DA Cojocaru, S Trausan-Matu – RoCHI, 2015 – pdfs.semanticscholar.org
ABSTRACT The subject of this paper is the development of an application which generates natural language text, starting from an OWL ontology. The Natural Language Generation, in the context of Semantic Web, represents a relatively new field of research, but due to the
Table-to-text Generation by Structure-aware Seq2seq Learning
T Liu, K Wang, L Sha, B Chang, Z Sui – arXiv preprint arXiv:1711.09724, 2017 – arxiv.org
Abstract: Table-to-text generation aims to generate a description for a factual table which can be viewed as a set of field-value records. To encode both the content and the structure of a table, we propose a novel structure-aware seq2seq architecture which consists of field-
DANCin SEQ2SEQ: Fooling Text Classifiers with Adversarial Text Example Generation
C Wong – arXiv preprint arXiv:1712.05419, 2017 – arxiv.org
… Adversarial REINFORCE for Text Generation: The idea of generative adversarial networks cannot be directly applied to text generation; unlike in images, text sequences are discrete, which makes the discriminator error hard to backpropagate to the generator …
Evaluative Pattern Extraction for Automated Text Generation
CC Lee, SK Hsieh – Proceedings of the 9th International Natural …, 2016 – aclweb.org
Abstract Getting travel tips from the experienced bloggers and online forums has been one of the important supplements to the travel guidebook in the web society. In this paper we present a novel approach by identifying and extracting evaluative patterns, providing a
Text generation with Language models
AK Behera – 2016 – pdfs.semanticscholar.org
Language generation from any machine represented model for any specific domain is known as Natural language generation. Random text generation is a branch of language generation where random text is generated using model trained on text-data specific
Text Generation Using Different Recurrent Neural Networks
P Taneja, KG Verma – 2017 – dspace.thapar.edu
Today, computers have influenced the life of human beings to a great extent. To provide communication between computers and humans, natural language techniques have proven to be a very efficient way to exchange information with minimal human effort.
Content Filtering and Enrichment Using Triplets for Text Generation
TR Ferreira – 2016 – eprints.ucm.es
Abstract There is an extremely large amount of information on the Internet about almost every topic, and every day this information is constantly expanding. Theoretically, computer programs could benefit from this huge source of information in order to establish new
Deep Text Generation–Using Hierarchical Decomposition to Mitigate the Effect of Rare Data Points
N Dethlefs, A Turner – International Conference on Language, Data and …, 2017 – Springer
Abstract Deep learning has recently been adopted for the task of natural language generation (NLG) and shown remarkable results. However, learning can go awry when the input dataset is too small or not well balanced with regards to the examples it contains for
Efficient key management and cipher text generation using BCD coded parity bits
R Ranjan, D Swain, B Paikaray – Procedia Computer Science, 2015 – Elsevier
Abstract In this paper we have presented a new symmetric encryption technique where a BCD converter, a four bit parity checker along with a sign function is used to generate the key sequence. In the next subsequent steps the input text and the key sequence are
Conceptual text generation based on key phrases
M Charnine, N Somin… – Proceedings on the …, 2014 – search.proquest.com
Abstract The method and system for automatic generating meaningful articles called Conceptual Texts from key-phrases found on the Internet is presented. Conceptual Texts are intended to describe basic concepts of subject domain and their relationships. Key
A Continuous Approach to Controllable Text Generation using Generative Adversarial Networks
DI Helgøy, M Lund – 2017 – brage.bibsys.no
The challenges of training generative adversarial network (GAN) to produce discrete tokens, have seen a considerable amount of work in the past year. However, the amount of successful work on applying deep generative models to text generation is limited, when
Text Steganography Based on Ci-poetry Generation Using Markov Chain Model
YB Luo, YF Huang, FF Li, CC Chang – KSII Transactions on Internet and …, 2016 – kpubs.org
Abstract: Steganography based on text generation has become a hot research topic in recent years. However, current text-generation methods, which generate texts of normal style, have either semantic or syntactic flaws …
Sentence ordering in electronic navigational chart companion text generation
J Sauvage-Vincent, Y Haralambous… – Proceedings of the 15th …, 2015 – aclweb.org
Abstract We present the sentence ordering part of a natural language generation module, used in the framework of a knowledge base of electronic navigation charts and sailing directions. The particularity of the knowledge base is that it is based on a controlled hybrid
A k-anonymized Text Generation Method
Y Suzuki, K Yoshino, S Nakamura – International Conference on Network …, 2017 – Springer
Abstract In this paper, we propose a method for automatically generating k-anonymized texts from texts which include sensitive information. Many texts are posted on social media, but these texts sometimes include sensitive information, such as living places, phone numbers,
Analysing data-to-text generation benchmarks
L Perez-Beltrachini, C Gardent – arXiv preprint arXiv:1705.03802, 2017 – arxiv.org
Abstract: Recently, several data-sets associating data to text have been created to train data-to-text surface realisers. It is unclear however to what extent the surface realisation task exercised by these data-sets is linguistically challenging. Do these data-sets provide
Explicit Handwriting and Spelling Interventions Targeting the Text Generation Skills of Students in Grades 1 to 3 Who are At Risk or Have Disabilities: Quality of the …
PG Jung, S Yeo – Special Education, 2014 – dev02.dbpia.co.kr
Pyung-Gang Jung, Seungsoo Yeo. Explicit Handwriting and Spelling Interventions Targeting the Text Generation Skills of Students in Grades 1 to 3 Who are At Risk or Have Disabilities: Quality of the Evidence Base. Special Education, 2014, Vol. 13, No. 2, 167-190. The purpose of this
FlowGraph2Text: Automatic sentence skeleton compilation for procedural text generation
S Mori, H Maeta, T Sasada, K Yoshino… – Proceedings of the 8th …, 2014 – aclweb.org
Abstract In this paper we describe a method for generating a procedural text given its flow graph representation. Our main idea is to automatically collect sentence skeletons from real texts by replacing the important word sequences with their type labels to form a skeleton
Automatic text generation by learning from literary structures
A Daza, H Calvo, J Figueroa-Nazuno – Proceedings of the Fifth …, 2016 – aclweb.org
Abstract Most of the work dealing with automatic story production is based on a generic architecture for text generation; however, the resulting stories still lack a style that can be called literary. We believe that in order to generate automatically stories that could be
Toward controlled generation of text
Z Hu, Z Yang, X Liang… – International …, 2017 – proceedings.mlr.press
… However, autoencoder frameworks (Sutskever et al., 2014) and recurrent neural network language models (Mikolov et al., 2010) do not apply to generic text generation from arbitrary hidden representations due to the unsmoothness of effective hidden codes (Bowman et al …
Adaptive text generation based on emotional lexical choice
S Bautista, P Gervás, A Díaz – … of the XV International Conference on …, 2014 – dl.acm.org
Abstract Emotional information can influence several stages of the text generation process. In this paper, emotional connotations play a role in the lexical choice stage, which belongs to the discourse planner component of the generator. We present a new module for
Syntactic smt using a discriminative text generation model
Y Zhang, K Song, L Song, J Zhu, Q Liu – Proceedings of the 2014 …, 2014 – aclweb.org
Abstract We study a novel architecture for syntactic SMT. In contrast to the dominant approach in the literature, the system does not rely on translation rules, but treat translation as an unconstrained target sentence generation task, using soft features to capture lexical
Building rdf content for data-to-text generation
L Perez-Beltrachini, R Sayed, C Gardent – The 26th International …, 2016 – hal.inria.fr
In Natural Language Generation (NLG), one important limitation is the lack of common benchmarks on which to train, evaluate and compare data-to-text generators. In this paper, we make one step in that direction and introduce a method for automatically creating an
Predicting sentential semantic compatibility for aggregation in text-to-text generation
V Chenal, JCK Cheung – Proceedings of COLING 2016, the 26th …, 2016 – aclweb.org
Abstract We examine the task of aggregation in the context of text-to-text generation. We introduce a new aggregation task which frames the process as grouping input sentence fragments into clusters that are to be expressed as a single output sentence. We extract
SHTM: A neocortex-inspired algorithm for one-shot text generation
Y Wang, Y Zeng, B Xu – Systems, Man, and Cybernetics (SMC) …, 2016 – ieeexplore.ieee.org
Text generation is a typical natural language processing task, and is the basis of machine translation and question answering. Deep learning techniques can get good performance on this task under the condition that a huge number of parameters and masses of data are
Joint models for concept-to-text generation
I Konstas – 2014 – era.lib.ed.ac.uk
Much of the data found on the world wide web is in numeric, tabular, or other nontextual format (eg, weather forecast tables, stock market charts, live sensor feeds), and thus inaccessible to non-experts or laypersons. However, most conventional search engines and
Automated Narratives and Journalistic Text Generation: The Lead Organization Structure Translated into Code.
MC dos Santos – Brazilian Journalism Research, 2016 – bjr.sbpjor.org.br
Abstract This paper describes the experiment of building software capable of generating leads and newspaper titles in an automated fashion from information obtained from the Internet. The theoretical possibility, already anticipated by Lage at the end of the last century, is based on relatively
Synthetic Text Generation for Sentiment Analysis
U Maqsud – Proceedings of the 6th Workshop on Computational …, 2015 – aclweb.org
Abstract Natural language is a common type of input for data processing systems. Therefore, a large testing data set of this type is often required. In this context, the task of automatically generating natural language texts that maintain the properties of real texts is
Challenges of military applications for data-to-text generation
H Hastie – … on Data-to-text Generation (D2T), Edinburgh, UK, 2015 – researchgate.net
Abstract This position paper describes the potential for data-to-text generation for military applications and the associated challenges in terms of user interaction, system development and deployment. We also present a use-case for a prototype system for debrief generation
From Web to Web: A General Approach for Data-to-text Natural Language Generation and One Example
X Han, S Sripada – 1st W. Data-to-text Generation, 2015 – academia.edu
We proposed a general approach of acquiring NLG knowledge from the web, building a data-to-text NLG system accordingly, and evaluating the performance interactively on the web. One example about river information communication was given to explain the
New approach to designing an educational automated test generation system based on text analysis
AV Arzhakov, DS Silnov – ARPN Journal of Engineering and …, 2016 – arpnjournals.org
… Language Technologies Institute, School of Computer Science, Carnegie Mellon University. [4] Fairon, C. et al. 2002. Automatic item text generation in educational assessment. Proceedings of TALN, pp. 395–401. [5] Gierl, M. J. and Haladyna, T. M. 2012 …
Semantic-Map-Based Assistant for Creative Text Generation
AY Shedko – Procedia Computer Science, 2018 – Elsevier
Abstract A weak semantic map of English words is used here as an intuitive interface to a pattern generator, which produces paragraphs of text in the style of a specific writer. For this purpose, a recurrent neural net based on the Long Short-Term Memory (LSTM) model is
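The LSTM-based pattern generator named in the abstract above can be sketched at the level of its sampling loop. This is a toy forward pass with randomly initialised weights and an invented nine-symbol vocabulary (all assumptions, not the paper's model), so the output is character soup with the right mechanics: one LSTM step per symbol, softmax over the vocabulary, and the sampled symbol fed back as the next input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and sizes; a real system would be trained on a writer's
# corpus. Here the weights are random, so the text is structureless.
vocab = list("abcdefgh ")
V, H = len(vocab), 16

# Randomly initialised LSTM parameters (input, forget, output, cell gates
# stacked into one matrix each for inputs and hidden state).
Wx = rng.normal(0, 0.1, (4 * H, V))
Wh = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
Wy = rng.normal(0, 0.1, (V, H))   # hidden state -> vocabulary logits

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(x, h, c):
    """One LSTM step: compute gates, update cell state, emit hidden state."""
    z = Wx @ x + Wh @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def sample(n_chars, temperature=1.0):
    """Autoregressively sample n_chars symbols from the (untrained) LSTM."""
    h, c = np.zeros(H), np.zeros(H)
    x = np.zeros(V); x[0] = 1.0          # start from the first symbol
    out = []
    for _ in range(n_chars):
        h, c = step(x, h, c)
        logits = Wy @ h / temperature
        p = np.exp(logits - logits.max()); p /= p.sum()   # softmax
        idx = rng.choice(V, p=p)
        out.append(vocab[idx])
        x = np.zeros(V); x[idx] = 1.0    # feed the sample back in
    return "".join(out)

print(sample(20))
```

Training (backpropagation through time on the target author's text) is what turns this loop into a style-imitating generator; the sampling mechanics stay exactly as above.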
Automatic generation of air quality index textual forecasts using a data-to-text approach
A Ramos-Soto, A Bugarín, S Barro, N Gallego… – Conference of the …, 2015 – Springer
… Fig. 1: Schema of the air quality state text generation. 2.1 Input Air Quality State Forecast Data Characterization: MeteoGalicia's air quality database covers all 315 Galician municipalities and includes air quality and meteorological forecast data in a three-day temporal window …
A comparative study of the effect of the text-generation on incidental vocabulary learning in Efl and Esl contexts
M Yarahmadi – 2017 – lib.unipune.ac.in
… 2.2.3. Structure Words versus Content Words 2.2.4. Aspects of Vocabulary Learning 2.2.5. Choice of Vocabulary 2.3. Factors Influencing Vocabulary Learning 2.3.1. Learner 2.3.2. Strategy 2.3.2.1. Vocabulary Learning Strategies 2.3.2.1.1. Intentional versus Incidental Vocabulary
Inference generation in text comprehension: Automatic and strategic processes in the construction of a mental representation
P van den Broek, K Beker, M Oudega – Inferences during reading, 2015 – books.google.com
… In this paper, we review central aspects of inferences in text generation and comprehension, in three sections organized around three aspects of inference making: (1) What inferences are made during reading and what are the processes involved (online)? (2) What role do …
Data-driven approaches to content selection for data-to-text generation
D Gkatzia – 2015 – ros.hw.ac.uk
Data-to-text systems are powerful in generating reports from data automatically and thus they simplify the presentation of complex data. Rather than presenting data using visualisation techniques, data-to-text systems use human language, which is the most
Paper 9: Validation through Text Generation
H Burden, R Heldal, P Ljunglöf – A Scholarship Approach to Model …, 2014 – cse.chalmers.se
One way of handling complexity in large-scale software development is to decompose the system into autonomous subsystems that can be independently developed and maintained. In order to successfully integrate the implemented subsystems into a complete and well-
Microstructural (Cohesion and Coherence) Text Generation Problems of Syrian Refugee Students Learning Turkish.
S Demirgünes – Universal Journal of Educational Research, 2017 – ERIC
Abstract In language education, teaching a language as a foreign language is an emerging field compared to teaching it as a mother tongue. However, the experiences obtained in mother tongue education are adapted to teaching a language as a foreign language with
A multilingual multi-domain data-to-text natural language generation approach
C Barros, E Lloret – Procesamiento del Lenguaje Natural, 2017 – journal.sepln.org
… Comput. Linguist., 40(4):763–799. Novais, E. M. and I. Paraboni. 2012. Portuguese text generation using factored language models. Journal of the Brazilian Computer Society, 19(2):135–146. Padró, L. and E. Stanilovsky. 2012. FreeLing 3.0: Towards wider multilinguality …
Syntax and Data-to-Text Generation
C Gardent – International Conference on Statistical Language and …, 2014 – Springer
Abstract With the development of the web of data, recent statistical data-to-text generation approaches have focused on mapping data (e.g., database records or knowledge-base (KB) triples) to natural language. In contrast to previous grammar-based approaches, this more
Constructing Sentences from Text Fragments: Aggregation in Text-to-text Generation
V Chenal – 2017 – digitool.library.mcgill.ca
Abstract Sentence aggregation, the task of determining what input units belong in the same output sentence is an essential process in a natural language generation system. Although recent years have seen the emergence of text-to-text generation systems that are more
Text to 3D Scene Generation
AX Chang – 2015 – pdfs.semanticscholar.org
Text to 3D Scene Generation. A dissertation submitted to the Department of Computer Science and the Committee on Graduate Studies of Stanford University in partial fulfillment of the requirements …
Natural language generation in the context of multimodal interaction in Portuguese: Data-to-text based in automatic translation
JC Pereira – 2017 – ria.ua.pt
Universidade de Aveiro, Departamento de Electrónica, Telecomunicações e Informática, 2017. Doctoral Programme in Informatics of the Universities of Aveiro, Minho and Porto. José Casimiro Pereira: Natural Language Generation in the context of …
Malay Text Features For Automatic News Headline Generation.
MS Hasan, SAM Noah, NM Ali – Journal of Theoretical & …, 2015 – search.ebscohost.com
… involvement of machine translation. Shamfard et al. [19] and Alotaiby et al. [20] demonstrate that a text generation technique dealing with the Arabic language, without the use of machine translation, is able to produce good results …
Procedural text generation with stateful context-free grammars
A Uotila – 2018 – tampub.uta.fi
Context-free grammars have become an increasingly popular solution for procedural text generation. However, standard context-free grammars have inherent limitations in handling state and continuity. This thesis describes the implementation and evaluation of a
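The state-and-continuity limitation this thesis addresses can be illustrated with a minimal sketch: plain non-terminals are re-expanded on every occurrence, while "sticky" symbols (here prefixed with `$`, an invented convention standing in for whatever mechanism the thesis actually uses) keep their first expansion, so a chosen name stays consistent across the generated text.

```python
import random

# A tiny stateful context-free grammar. Plain non-terminals are re-expanded
# every time; "$"-prefixed symbols are expanded once and then remembered,
# giving continuity a standard CFG cannot express. The rule format and the
# "$" convention are illustrative assumptions, not the thesis's notation.
GRAMMAR = {
    "story": ["$hero found a sword. $hero fought the dragon."],
    "$hero": ["Alice", "Bob", "the knight"],
}

def generate(symbol, grammar, state, rng):
    """Expand `symbol`; sticky ($-prefixed) symbols keep their first expansion."""
    if symbol not in grammar:
        return symbol                      # terminal: emit as-is
    if symbol.startswith("$") and symbol in state:
        return state[symbol]               # reuse the remembered expansion
    production = rng.choice(grammar[symbol])
    result = " ".join(generate(tok, grammar, state, rng)
                      for tok in production.split())
    if symbol.startswith("$"):
        state[symbol] = result             # remember the first expansion
    return result

print(generate("story", GRAMMAR, state={}, rng=random.Random(0)))
```

With a standard CFG the two `$hero` occurrences could expand to different names; threading the `state` dict through the recursion is what pins them to the same one.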
Data-Driven Generation of Text Balloons based on Linguistic and Acoustic Features of a Comics-Anime Corpus
S Matsumiya, S Sakti, G Neubig, T Toda… – … Annual Conference of …, 2014 – isca-speech.org
… The experiments were conducted in order to (1) evaluate the performance of automatic generation of text balloons, (2) analyze the usefulness of text balloons for expressive speech-to-text generation, and (3) analyze the relationship with emotion classification …
Generation of code from text description with syntactic parsing and Tree2Tree model
A Stehnii – 2018 – er.ucu.edu.ua
Classification with Multiple Classes using Naïve Bayes and Text Generation with a Small Data Set using a Recurrent Neural Network
TEG Reiten – 2017 – brage.bibsys.no
In this thesis, text classification and text generation are explored using only a small data set and many classes. The thesis experiments with text classification and shows how it is able to find the most similar output to a given input even with thousands of classes.
News text generation with adversarial deep learning
F Månsson, F Månsson – LU-CS-EX 2017-18, 2017 – lup.lub.lu.se
In this work we carry out a thorough analysis of applying generative adversarial networks, a specific subfield of machine learning, to natural language generation; more specifically, we generate news text articles in an automated fashion. To do this, we
Data-Driven Text Generation using Neural Networks & Provenance is Complicated and Boring—Is there a solution?
W ECS – 2016 – edshare.soton.ac.uk
Title: Data-Driven Text Generation using Neural Networks Speaker: Pavlos Vougiouklis, University of Southampton Abstract: Recent work on neural networks shows their great potential at tackling a wide variety of Natural Language Processing (NLP) tasks. This talk will
Applications of Rhetorical Structure Theory in Text Generation
C Nakos – 2014 – dataspace.princeton.edu
Natural Language Generation, also known as text generation, deals with the use of computers to convey information in human language. The problem is a sizable one and touches on many aspects of computer science and linguistics, ranging from Information
The Text Generation Process on the Basis of Intertextuality
HU Yuan-yan – Journal of Northeast Normal University (Philosophy …, 2014 – en.cnki.com.cn
Based on the theory of intertextuality and Systemic-Functional Grammar, this paper aims to integrate a text generation process by elaborating the interactions and four intertextual processes among context-text, reader-text, intertext and individual text. By constructing the