Abstract
Automatic question generation from text is of increasing importance due to its many useful applications. While deep neural networks have achieved success in generating questions from text paragraphs, they typically take the whole paragraph as input, assuming all of its sentences are question-worthy. However, a paragraph often contains only a few important sentences that are worth asking questions about. To address this, we present a feature-based sentence selection method for identifying question-worthy sentences. The selected sentences are then fed to a sequence-to-sequence (i.e., seq2seq) model to generate questions. Our experiments show that these features significantly improve the questions generated by seq2seq models.