Authors: Iparraguirre-Villanueva, Orlando; Guevara-Ponce, Victor; Ruiz-Alvarado, Daniel; Beltozar-Clemente, Saul; Sierra-Liñan, Fernando; Zapata-Paulini, Joselyn; Cabanillas-Carbonell, Michael
Date accessioned: 2023-03-13
Date available: 2023-03-13
Date issued: 2022-10-29
URI: https://hdl.handle.net/20.500.13053/8063
Abstract: Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and prediction, question answering, and classification tasks because of their ability to learn long-term dependencies. This research integrates an LSTM network with the dropout technique to generate text from an input corpus, and a model is developed to find the best way to extract words from the context. The model is trained on the novel "La ciudad y los perros", which comprises 128,600 words. The text was divided into two data sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested in two variants, word importance and context, and the results were evaluated in terms of the semantic proximity of the generated text to the given context.
Format: application/pdf
Language: eng
Access rights: info:eu-repo/semantics/openAccess
License: https://creativecommons.org/licenses/by/4.0/
Keywords: Dropout; Prediction; Recurrent neural network; Text; Long short-term memory
Title: Text prediction recurrent neural networks using long short-term memory-dropout
Type: info:eu-repo/semantics/article
DOI: 10.11591/ijeecs.v29.i3.pp1758-1768
OCDE field: http://purl.org/pe-repo/ocde/ford#1.02.00
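The LSTM-with-dropout combination described in the abstract can be illustrated with a minimal numpy sketch: one LSTM step (input, forget, and output gates plus a candidate cell state) followed by inverted dropout on the hidden state. All dimensions, weights, and the dropout rate here are hypothetical placeholders; the record does not specify the authors' actual architecture or hyperparameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step; gates [i, f, o, g] are stacked row-wise in W, U, b."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # pre-activations, shape (4H,)
    i = sigmoid(z[0:H])               # input gate
    f = sigmoid(z[H:2 * H])           # forget gate
    o = sigmoid(z[2 * H:3 * H])       # output gate
    g = np.tanh(z[3 * H:4 * H])       # candidate cell state
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c

def dropout(h, rate, rng, training=True):
    """Inverted dropout: zero units with probability `rate`, rescale the rest."""
    if not training or rate == 0.0:
        return h
    mask = (rng.random(h.shape) >= rate) / (1.0 - rate)
    return h * mask

# Toy sizes (hypothetical, not from the paper): 8-dim inputs, 16 hidden units.
rng = np.random.default_rng(0)
D, H = 8, 16
W = rng.normal(0.0, 0.1, (4 * H, D))
U = rng.normal(0.0, 0.1, (4 * H, H))
b = np.zeros(4 * H)

h = np.zeros(H)
c = np.zeros(H)
for t in range(5):                    # dummy 5-token sequence
    x = rng.normal(size=D)            # stand-in for a word embedding
    h, c = lstm_step(x, h, c, W, U, b)
    h = dropout(h, rate=0.2, rng=rng)

print(h.shape)                        # hidden state shape: (16,)
```

In practice a framework layer (e.g. a recurrent layer with a dropout argument) would replace this hand-rolled cell, but the recurrence and the post-state dropout mask are the two ingredients the abstract names.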