Abstract and Keywords

This chapter espouses an engineering perspective on lying, showing how various departures from the truth arise naturally when computers talk about data. We start by explaining how Natural Language Generation (NLG) systems work when they talk about data. Then we discuss how, perhaps surprisingly, such systems cannot help deviating from the truth in two different ways, namely, by omitting information, and by generating statements whose truth is questionable; we illustrate our account with examples from practical NLG systems. Next, we discuss briefly why NLG systems should arguably deviate from the truth in other, more deliberate ways as well, after which we sketch a simple game-theoretical model that can help to explain our observations. We conclude by summing up what the engineering view tells us about lying, whether it is people or computers that take the floor.

Keywords: Natural Language Generation, data-to-text generation, Game Theory, avoidable deviations from the truth, vagueness, health communication, unavoidable deviations from the truth, benign deceit
