Joshua Maynez
Morphosyntactic tagging with a meta-BiLSTM model over context sensitive token encodings
B Bohnet, R McDonald, G Simoes, D Andor, E Pitler, J Maynez
arXiv preprint arXiv:1805.08237, 2018
Cited by: 69
On faithfulness and factuality in abstractive summarization
J Maynez, S Narayan, B Bohnet, R McDonald
arXiv preprint arXiv:2005.00661, 2020
Cited by: 68
Stepwise extractive summarization and planning with structured transformers
S Narayan, J Maynez, J Adamek, D Pighin, B Bratanič, R McDonald
arXiv preprint arXiv:2010.02744, 2020
Cited by: 3
Planning with Entity Chains for Abstractive Summarization
S Narayan, Y Zhao, J Maynez, G Simoes, R McDonald
arXiv preprint arXiv:2104.07606, 2021
Cited by: 2
Focus Attention: Promoting Faithfulness and Diversity in Summarization
R Aralikatte, S Narayan, J Maynez, S Rothe, R McDonald
arXiv preprint arXiv:2105.11921, 2021