I have a number of URLs showing in Google Search Console's Coverage report that are being crawled but return 404s. The URLs are old pages from when the domain belonged to the previous owner (I assume).
What is the best way to deal with these? As the pages are not in my Shopify store, I can't delete them or add a noindex tag. I have also looked into the temporary removals and outdated content tools in GSC, but it looks like those tools aren't relevant to this problem.
Should I just redirect the old URLs to a relevant page on my site?
This is a great question! Can I ask when you bought this domain? Typically Google's bot re-crawls sites over time and updates its index based on what it finds; have you submitted your sitemap to Google yet?
Once that is done, it can take 2-3 months for the old URLs that no longer resolve to drop off the console, at least in our experience.
If they are old 404s, the benefit of dealing with them now is minimal, as Google has likely already demoted the pages as non-valuable. The main reasons to deal with them are to recover live traffic still hitting them or to capture backlinks pointing to them. In either case, you'd want to set up a 301 redirect from the old page to its new version. If there is no new version of the page, it's legitimate to let it stay a 404.
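Shopify supports this natively: you can add redirects one at a time in the admin (under Online Store > Navigation > URL Redirects, last I checked, with a bulk CSV import option too), or create them programmatically via the Admin API's Redirect resource. Here's a minimal Python sketch, assuming a custom app whose access token lives in a hypothetical SHOPIFY_TOKEN environment variable and a placeholder store handle:

```python
import os
import requests

# Assumptions: a custom app with permission to manage redirects, its Admin API
# access token in the SHOPIFY_TOKEN environment variable, and "your-store"
# swapped for your actual store handle.
SHOP = "https://your-store.myshopify.com"
TOKEN = os.environ["SHOPIFY_TOKEN"]

def create_redirect(old_path: str, new_path: str) -> None:
    """Create a 301 redirect from an old (404ing) path to a live page."""
    resp = requests.post(
        f"{SHOP}/admin/api/2024-01/redirects.json",
        headers={"X-Shopify-Access-Token": TOKEN},
        json={"redirect": {"path": old_path, "target": new_path}},
    )
    resp.raise_for_status()

# Hypothetical example: point a previous owner's URL at a relevant current page.
create_redirect("/old-blog/some-post", "/pages/about")
```

For a handful of URLs the admin UI is quicker; the API route only pays off when you're dealing with dozens of leftover URLs from the previous owner.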
Since the pages were created by a previous owner, if you can't deduce a page's content from the old URL handle, you could use https://archive.org/web/ to view the page's former contents and then redirect it to an accurate equivalent.
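If you have a long list of URLs to check, the Wayback Machine's availability API can automate that lookup. A quick sketch (the URL passed in at the bottom is a hypothetical example):

```python
import requests

def wayback_snapshot(url: str) -> str | None:
    """Return the closest archived snapshot URL for a page, if one exists."""
    resp = requests.get(
        "https://archive.org/wayback/available", params={"url": url}, timeout=10
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# Hypothetical old URL surfaced in the Coverage report:
print(wayback_snapshot("example.com/old-page"))
```

Any URL that returns a snapshot can be opened to see what the page used to be, which makes choosing a sensible redirect target much easier.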