I’m in Google Search Console > Coverage, fixing the six affected pages marked ‘noindex’.
I have redirected most of them to the correct URLs, but for some reason they are still resolving to the original, incorrect links.
Also, one of the pages (https://www.aussieclotheslines.com.au/pages/terms-and-conditions) already goes to the correct page, so I don’t know why it shows as ‘noindex’. And when I request indexing, it is rejected with this message: “During live testing, indexing issues were detected with the URL”.
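For reference, a quick way to check what the live page actually serves for noindex signals (a minimal Python sketch, standard library only; the URL is the one from above):

```python
import re
from urllib.request import Request, urlopen

url = "https://www.aussieclotheslines.com.au/pages/terms-and-conditions"
req = Request(url, headers={"User-Agent": "Mozilla/5.0"})

with urlopen(req) as resp:
    # urlopen follows redirects, so resp.url is the final landing URL
    print("Final URL:", resp.url)
    # a header-level noindex would show up here
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))
    html = resp.read().decode("utf-8", errors="replace")

# crude check for a <meta name="robots"> tag in the HTML
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
print("Meta robots tag:", meta.group(0) if meta else "none found")
```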
There is only one reason you are seeing this error, and it is simple: crawlers are not able to access the page. I am sure you updated your robots.txt, but you are still seeing the error because Google has not refreshed its cached copy of the robots.txt in Search Console yet.
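If you want to confirm this yourself before waiting on Google, here is a minimal sketch (Python standard library) that tests the live robots.txt rules against the affected URL; the domain is taken from the question, everything else is illustrative:

```python
from urllib import robotparser

ROBOTS_URL = "https://www.aussieclotheslines.com.au/robots.txt"
PAGE_URL = "https://www.aussieclotheslines.com.au/pages/terms-and-conditions"

rp = robotparser.RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetches and parses the live robots.txt

# True means Googlebot is allowed to crawl the page under the current rules
print(rp.can_fetch("Googlebot", PAGE_URL))
```

If this prints True while Search Console still reports the URL as blocked, that points to Google working from a stale cached copy of the file rather than the live one.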
Go to the old Webmaster Tools account and check the timestamp in the robots.txt Tester; you will notice that Google is still picking up the old rules.
Now copy your new robots.txt rules, paste them into the Tester, and press Submit.
There will be an option to ‘Ask Google to update’; press Submit, and in a few minutes the Webmaster Tools account will pick up the new robots.txt.
Once this is done, try the live test again; it should work.
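Once the Tester shows the new rules, it is also worth confirming that each redirected page now resolves where you expect, since some were still going to the old link. A rough sketch, assuming the third-party requests library and placeholder URLs (substitute your six affected pages):

```python
import requests

old_urls = [
    "https://www.aussieclotheslines.com.au/old-page-1",  # placeholder
    "https://www.aussieclotheslines.com.au/old-page-2",  # placeholder
]

for url in old_urls:
    r = requests.get(url, allow_redirects=True, timeout=10)
    # r.history holds each intermediate redirect response, in order
    hops = " -> ".join(h.url for h in r.history) or "(no redirect)"
    print(f"{url}\n  chain: {hops}\n  final: {r.url} [{r.status_code}]")
```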
If you feel my answer is helpful, like it or mark it as a solution. Let me know if you have any questions.