After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
A developer who tells you "W3C validation isn't important" is like a house builder telling you "Those small cracks in the walls are nothing to worry about"
George
Personally I wouldn't rely on robots.txt alone, as one accidental public link to any of the pages (easier than you may think!) will result in Google indexing that subdomain page (it just won't be crawled). This means the page can get "stuck" in Google's index, and to resolve it you would need to remove it using WMT (instructions here). If a lot of pages were accidentally indexed, you would need to lift the robots.txt restriction so Google can crawl them, and add a noindex/nofollow tag to each page so Google drops it from its index.
To cut a long story short, I would do both Steps 1 and 2 outlined by Federico if you want to sleep easy at night :).
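To make the two mechanisms concrete, here's a minimal sketch (the `/private/` path is a hypothetical example). A robots.txt Disallow only blocks crawling - an externally linked URL can still end up in the index:

```
# robots.txt - blocks crawling only; a URL that is linked to
# from elsewhere can still appear in Google's index
User-agent: *
Disallow: /private/
```

Whereas the noindex tag actually removes the page from the index - but Google has to be able to crawl the page to see it, which is why the robots.txt block must be lifted first:

```html
<!-- In the page's <head> - only takes effect once Google
     can crawl the page and read this tag -->
<meta name="robots" content="noindex, nofollow">
```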
George
It looks like this error is caused by a plugin you have installed and enabled on your WordPress site that probably isn't compatible with the version of WordPress you're running. If you disable the Backlinker plugin, the error will probably go away.
As for SEO impact - the plugin also appears to have mangled your /robots.txt (which you should fix), and the user experience of seeing this error is poor, so it's worth sorting out.
George
I've read conflicting studies about the use of trust badges. Sometimes they're a good idea, as they instill a feeling of trust in a nervous or cautious customer; other times they can have a negative effect by scaring off a user who hadn't considered security or privacy to be an issue until you mentioned it! It depends on the level of technical and online-shopping experience your customers have.
As Gregory says - test it. Get them to give you a month's free trial and see if it impacts your conversion.
George
@methodicalweb
I would return HTTP 410s for them all if they don't get traffic. A 410 carries a little more weight than a 404, and we're not talking about a small number of pages here. I wouldn't redirect them all to the homepage, as you'll almost certainly get a ton of "soft 404s" in WMT if done all at once.
Matt Cutts on 404 vs 410: https://www.youtube.com/watch?v=xp5Nf8ANfOw
If they are getting traffic, then it'll be a harder job to unpick the pages that have value.
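As a sketch, a 410 can be served from Apache via mod_alias (the paths below are hypothetical examples - substitute your own retired URL patterns):

```
# .htaccess - return 410 Gone instead of 404 for removed pages
Redirect gone /old-section/page.html
RedirectMatch gone ^/old-section/.*$
```

The `gone` status tells crawlers the pages were removed deliberately and permanently, rather than being temporarily missing.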
George
Yes, this is a good idea, as it's a catch-all for URLs that might include tracking parameters, or other parameters that don't affect the page content. Conditionally hiding the canonical when there are no tracking parameters would mean more development and testing work, whereas leaving it in place doesn't cause any issues. It's also quite a brutal but effective catch-all if your page were accidentally accessible via other URLs - e.g. the non-www or https variant.
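For example, a self-referencing canonical is just the page's clean URL repeated on the page itself (example.com is a placeholder):

```html
<!-- Served on both https://www.example.com/page and
     https://www.example.com/page?utm_source=newsletter -->
<link rel="canonical" href="https://www.example.com/page" />
```

Both URL variants then consolidate to the single canonical version.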
George
No need to be concerned. Aside from all the well-documented best practices on canonicals, in your original question you've spotted at least one big site that does this. They pay their SEOs big bucks and rank well.
I've never come across any reason to be concerned about losing Page Authority by having a page canonical to itself.