- Abstract: Translating natural language into SQL queries for table-based question answering has recently attracted increasing research attention. Previous approaches develop models whose architectures are specifically tailored to the structure of the task, for example by separately predicting the arguments of the SELECT clause. In this work, we show that a more general attention-based sequence-to-sequence model, adapted only at the input and output layers, outperforms more specialized state-of-the-art approaches. In particular, we extend it with on-the-fly embedding and output vectors as well as an input copying mechanism, all of which are used throughout the whole decoding process. We also examine the potential order-matters problem that can arise when multiple decoding paths are correct, and investigate the use of a dynamic oracle in this context.
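The input copying mechanism mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic pointer-style mixture, assuming a hypothetical decoder step that produces generation logits over a fixed output vocabulary, attention scores over the input question tokens, and a scalar generation gate `p_gen`. Mixing the two distributions lets the decoder emit tokens (e.g. cell values or column names) that are absent from the fixed vocabulary:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def copy_augmented_distribution(gen_logits, copy_scores, input_tokens, vocab, p_gen):
    """Mix a generation distribution over a fixed vocabulary with a copy
    distribution over the input tokens (pointer-mechanism sketch).

    gen_logits  : scores over `vocab` from the decoder (hypothetical)
    copy_scores : attention scores over `input_tokens` (hypothetical)
    p_gen       : probability of generating rather than copying
    Returns a dict over the extended vocabulary (vocab + input tokens).
    """
    p_vocab = softmax(np.asarray(gen_logits, dtype=float))
    p_copy = softmax(np.asarray(copy_scores, dtype=float))
    # Extended vocabulary: fixed vocab plus any out-of-vocab input tokens.
    probs = {t: 0.0 for t in list(vocab) + list(input_tokens)}
    for t, p in zip(vocab, p_vocab):
        probs[t] += p_gen * p           # mass from the generation path
    for t, p in zip(input_tokens, p_copy):
        probs[t] += (1.0 - p_gen) * p   # mass from the copy path
    return probs

# Example: SQL keyword vocabulary plus question tokens to copy from.
vocab = ["SELECT", "WHERE", "=", "population"]
question = ["population", "of", "France"]
dist = copy_augmented_distribution(
    gen_logits=[2.0, 1.0, 0.5, 0.1],
    copy_scores=[1.5, 0.2, 0.8],
    input_tokens=question,
    vocab=vocab,
    p_gen=0.6,
)
```

Note that a token such as "population" appearing both in the vocabulary and in the question receives probability mass from both paths, while an out-of-vocabulary token such as "France" is only reachable through copying.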