27 November 25

Epistemological Debt

There is a concept in software engineering called technical debt. It accrues when you take shortcuts while building a software system: you solve the problem immediately at hand, but in doing so you neglect to think through edge cases, and those come back to haunt you as use of the system expands and additional pieces get built on top.

I think something analogous, though more sinister, occurs when working with large language models (e.g. ChatGPT and its rivals), which are the core of the AI boom. I’m calling this “epistemological debt”. LLMs are by design extremely good at returning plausible-sounding text: outputs that look correct at first glance but often contain inaccuracies. For example, AI-generated transcripts and summaries of meetings are now commonplace; this is a standard feature in Zoom these days. But what happens when the summaries get saved and become the official record of the meeting without anybody checking the generated text for inaccuracies? Given that people are only getting busier, one suspects these failures to review happen all the time. The inaccuracies start to accumulate, and nobody can figure out what is true and what is not.

Posted at 07:46 PM in Technology