Well, not exactly useless, but less harmful, sure.
Say you have a user who tries to "re-publish" (post again) the same item over and over, every month or so. Given your site's new-item rate of 100-150/day, such a repost will slip past a check limited to the latest 1,000 items only. So, if you want to catch even those, another function - a low-priority one, mind you - could run via cron, scan the items of particular users only, and block either the old copies or the new ones, per your preference (a sketch follows).
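Here is a minimal sketch of that low-priority per-user scan, assuming a hypothetical `items` table with `id`, `user_id`, `content_hash`, `created_at`, and `blocked` columns (all names are illustrative, adjust to your actual schema):

```python
import sqlite3

def scan_user_for_duplicates(conn: sqlite3.Connection, user_id: int) -> int:
    """Flag repeat posts by one user, keeping the oldest copy of each item."""
    # Hypothetical schema: items(id, user_id, content_hash, created_at, blocked)
    rows = conn.execute(
        """SELECT id, content_hash FROM items
           WHERE user_id = ? AND blocked = 0
           ORDER BY created_at ASC""",
        (user_id,),
    ).fetchall()
    seen = set()
    blocked = 0
    for item_id, content_hash in rows:
        if content_hash in seen:
            # Blocks the newer re-post; invert the ORDER BY to block
            # the old copies instead, per your preference.
            conn.execute("UPDATE items SET blocked = 1 WHERE id = ?", (item_id,))
            blocked += 1
        else:
            seen.add(content_hash)
    conn.commit()
    return blocked
```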
The key point, of course, is to stop new "immediate" duplicates, and that is relatively simple to implement: limit the scan to the latest N items and you're safe. The low-priority function, meanwhile, could be split into batches - check users with IDs from 1 to 1000, then the next thousand, and so on - with the batches spread across a full 24-hour period (see the sketch below).
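A sketch of both pieces, under the same assumed schema as above: a cheap submit-time check against only the latest N items, and a helper that picks which user-ID batch a given cron run should scan, rotating through all users over the day. `LATEST_N`, `BATCH_SIZE`, and `RUNS_PER_DAY` are illustrative knobs, not anything prescribed:

```python
import hashlib
import sqlite3

LATEST_N = 1000      # only the newest items are checked at submit time
BATCH_SIZE = 1000    # user IDs per low-priority cron run
RUNS_PER_DAY = 24    # e.g. one cron run per hour

def is_immediate_duplicate(conn: sqlite3.Connection, content: str) -> bool:
    """Submit-time check: compare the new post against the latest N items only."""
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    row = conn.execute(
        """SELECT 1 FROM (
               SELECT content_hash FROM items
               ORDER BY created_at DESC LIMIT ?
           ) AS recent
           WHERE content_hash = ?""",
        (LATEST_N, digest),
    ).fetchone()
    return row is not None

def batch_for_run(run_index: int, total_users: int) -> range:
    """Pick the user-ID range for this cron run, cycling through all users."""
    batches_needed = max(1, -(-total_users // BATCH_SIZE))  # ceiling division
    start = (run_index % batches_needed) * BATCH_SIZE + 1
    return range(start, min(start + BATCH_SIZE, total_users + 1))
```

With, say, 20,000 users and hourly runs, `batch_for_run` cycles through all twenty batches in under a day, so every user gets scanned at least once per 24 hours without any single run touching more than 1,000 accounts.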
That should be much more acceptable, performance-wise.