This could potentially generate quite a lot of data, and I'm not sure that's really what we want.
Knowing what is requested but not found, and where the dead links might originate, is a legitimate interest. I assume that's what you had in mind.
We cannot trivially validate target URLs the way we do now, since by definition they are not found.
At least 99% of the 404s I see in real-world logs are nonsense requests probing for various leaks and vulnerabilities. Sometimes it's a handful per day, sometimes thousands per hour if a new exploit is published.
Assuming most such bots won't execute any JS, AJAX tracking would at least limit the data to mostly real browsers. Error pages are often not cached, however, so inline tracking would catch most of the bogus requests.
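For illustration, a minimal sketch of what such client-side tracking on a 404 page might look like; the `/api/track-404` endpoint and the payload shape are hypothetical, not an existing API:

```typescript
// Hypothetical snippet embedded on the 404 error page.
// Bots that don't execute JS never run this, which filters
// out most of the probe noise described above.
(() => {
  const payload = {
    path: window.location.pathname,      // the URL that was not found
    referrer: document.referrer || null, // where the dead link might live
    ts: Date.now(),
  };
  // sendBeacon is fire-and-forget and survives page unloads
  if (navigator.sendBeacon) {
    navigator.sendBeacon("/api/track-404", JSON.stringify(payload));
  } else {
    fetch("/api/track-404", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
      keepalive: true,
    }).catch(() => { /* tracking is best-effort */ });
  }
})();
```

Capturing `document.referrer` is what would make the dead-link analysis possible, since the requested path alone doesn't tell you where the broken link is located.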
My opinion:
We might add such a feature as an opt-in, visualize the error calls in a separate view for analysis, and document the negative impact if enabled. It wouldn't be my first priority, though.
What about tracking broken requests which ended on an error page?