Do you keep track of 404 errors?
When people or crawlers request a page that doesn't exist on your site, they get a 404 Not Found error. I'm wondering whether I should keep track of these.
Pros:
- Lets you monitor potential issues with the site that would otherwise go unnoticed
- You can follow up with any blogs or other sites that link to you with an invalid URL (e.g. a stray period at the end of the link)

Cons:
- Lots of noise, especially under heavy crawling
- Many 404s can be simple user mistakes
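For the stray-period case above, one option is to normalize the incoming path and issue a redirect instead of logging a 404. A minimal sketch, assuming a hypothetical `normalize_path` helper (not part of Rails):

```ruby
# Hypothetical helper: strip stray trailing periods that blogs
# sometimes append when linking to your site.
def normalize_path(path)
  path.sub(/\.+\z/, "")
end

normalize_path("/articles/my-post.")  # => "/articles/my-post"
normalize_path("/articles/my-post")   # => "/articles/my-post" (unchanged)
```

If the normalized path differs from the requested one, a 301 to the normalized path fixes the backlink for both users and crawlers.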
This seems like an elegant compromise: https://andycroll.com/ruby/stop-robots-crawlers-triggering-errors-rails/
It tracks 404s, except those triggered by crawlers hitting outdated links.
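The general idea from that article can be sketched as: only report a 404 to your error tracker when the request doesn't look like it came from a crawler. This is an illustrative version, not the article's exact code; `CRAWLER_PATTERN` and `report_404?` are made-up names:

```ruby
# Rough heuristic: common substrings found in crawler User-Agent strings.
CRAWLER_PATTERN = /bot|crawl|spider|slurp/i

# Returns true when the 404 is worth reporting (i.e. a likely human visitor).
def report_404?(user_agent)
  !user_agent.to_s.match?(CRAWLER_PATTERN)
end

report_404?("Googlebot/2.1 (+http://www.google.com/bot.html)")  # => false
report_404?("Mozilla/5.0 (X11; Linux x86_64)")                  # => true
```

User-Agent sniffing is best-effort (bots can lie), but it filters out the bulk of crawler noise.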
Monitoring erroneous backlinks is a great idea.
Wouldn't HTTP redirects on those links help with SEO?
Just use Google Search Console (formerly Webmaster Tools) to see which 404s Googlebot finds, and add redirects for the important ones. You'll only handle the 404s that crawlers know about, but that should be enough.
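That workflow can be as small as a redirect map keyed by the 404 paths Search Console reports. A Rack-style sketch with made-up example paths:

```ruby
# Old paths reported as 404s, mapped to their current locations.
REDIRECTS = {
  "/old-post"   => "/articles/old-post",
  "/about.html" => "/about",
}.freeze

# Returns a Rack-style 301 response for known old paths, or nil to
# let the request fall through to normal routing.
def redirect_for(path)
  target = REDIRECTS[path]
  target && [301, { "Location" => target }, []]
end
```

In Rails you'd express the same thing with `redirect` in `config/routes.rb`; the point is just that a short, curated list covers the 404s that matter for SEO.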