In this case, there was file A, the backend file responsible for intake and sanitization. Depending on the request, control then flows on to file B or file C. He modified file A.
His rationale was that every single backend file should do its own sanitization, because at some future point someone might take file B for a different project and pair it with some other intake code that didn't sanitize.
I know all about client-side checks being useless for meaningful security enforcement.
He had the perspective that crossing between source code files constitutes a security boundary. If you had intake.c and data.c that got linked together, well, data.c needed its own sanitization... Just in case...
I suspect he used a tool that scanned files and flagged the risky pattern, the tool didn't understand the relationship between them, and he was so invested that he tortured the result a bit to have any finding at all. I think he was hired by a client, and in my experience a security consultant always has a finding, no matter how clean the system was in practice.
Another finding, by another security consultant, was that an open source dependency hadn't had any commits in a year. No vulnerabilities, but since no one had changed anything, he was concerned that if a vulnerability were ever found, the lack of activity meant no one would fix it.
It's wild how very good security work tends to share the stage with very shoddy work, and the broader tech industry gives both equal deference.