Hi Marina,
If I understand your question correctly, you just don't want your Tumblr blog to be indexed by Google. If so, these steps will help: http://yourbusiness.azcentral.com/keep-tumblr-off-google-3061.html
Regards,
George
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
I would throw HTTP 410s for them all if they don't get traffic. A 410 carries a little more weight than a 404, and we're not talking about a small number of pages here. I wouldn't redirect them all to the homepage, as you'll almost certainly get a ton of "soft 404s" in WMT if that's done all at once.
Matt Cutts on 404 vs 410: https://www.youtube.com/watch?v=xp5Nf8ANfOw
If they are getting traffic, then it'll be a harder job to unpick the pages that have value.
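If the site runs on Apache, a minimal sketch of serving 410s in bulk might look like the following (the path pattern here is hypothetical - adjust it to the URLs you're actually retiring):

```apache
# .htaccess - serve "410 Gone" for retired pages instead of 404
# mod_alias's "gone" keyword returns a 410 status
RedirectMatch gone ^/discontinued-products/.*$

# Individual pages can be retired one at a time too
Redirect gone /old-landing-page.html
```

On nginx, the equivalent is a `location` block containing `return 410;`.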
George
Hi Monica,
It's almost certainly an issue related to the Backlinker plugin, given that error message, though clearly there's no straightforward solution. I found this post on the WordPress forum (by member pee_dee) - perhaps this is your issue too:
"Look in header.php inside your current theme and find this line:
http://www.4llw4d.freefilesblog.com/jquery-1.6.3.min.js
That server is no longer able to provide the .js file linked to in your theme. I found mine at:
http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.6.3.min.js
Get a hold of the .js file (or google the heck out of the .js file you need) and point to it on your server."
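Assuming your theme only needs that one jQuery file, the fix pee_dee describes amounts to swapping the dead URL in header.php for a working copy, e.g.:

```html
<!-- header.php: replace the dead freefilesblog.com reference -->
<script src="http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.6.3.min.js"></script>
```

Better still, download jquery-1.6.3.min.js and serve it from your own server, so you're not dependent on a third-party host staying up.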
Hope that works
George
It looks like this error is caused by a plugin installed and enabled on your WordPress site that probably isn't compatible with the version of WordPress you're running. If you disable the Backlinker plugin, the error will probably go away.
As for SEO impact: the plugin also appears to have mangled your /robots.txt (which you should fix), and the user experience of seeing this error is poor, so it's worth sorting out.
George
A developer who tells you "W3C validation isn't important" is like a house builder telling you "Those small cracks in the walls are nothing to worry about".
George
I've never come across any reason to be concerned about losing Page Authority from a page canonicalising to itself.
No need to be concerned. Aside from all the well-documented best practice on canonicals, in your original question you've spotted at least one big site that does this. They pay their SEOs big bucks and rank well.
Yes, this is a good idea, as it's a catch-all for URLs that might include tracking parameters, or other parameters that don't affect the page content. When there are no tracking parameters, it's more development and testing work to hide the canonical, and having it there doesn't cause any issues. It's also a brutal but effective catch-all if your page is accidentally accessible via other URLs - e.g. non-www or https.
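For illustration (example.com is a placeholder), a catch-all self-referencing canonical just means every variant of a URL points at the one clean version:

```html
<!-- Served on /widgets/?utm_source=newsletter, /widgets/?sessionid=123,
     and the plain /widgets/ alike - the canonical never changes -->
<link rel="canonical" href="http://www.example.com/widgets/" />
```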
George
I think Devanur gives some good advice regarding the gradual improvement of the content, though you're stuck in a bit of a catch-22 with regard to how Google views websites: you want to be able to sell lots of products, but don't have the resources for your company to present them in a unique or engaging fashion. That's what Google wants webmasters to do, but the reality of your situation paints a completely different picture of what will give your company a decent ROI for updating vast amounts of product content.
If there isn't an obvious Panda problem, I wouldn't just noindex lots of pages without some thought and planning first. Before noindexing, I would look at what SEO traffic those pages are getting. Noindexing alone is a tried and tested method of sidestepping potential Panda penalties, and although PageRank will still be passed, there's a chance you'll remove pages from the index that are driving traffic (even if it's long tail).
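If you do end up noindexing some pages, a "noindex, follow" directive is the usual choice - the page drops out of the index, but link equity can still flow through its links:

```html
<meta name="robots" content="noindex, follow">
```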
In addition to prioritising content production for indexed pages per Devanur's advice, I would also do some keyword analysis and prioritise the production of new content for terms which people are actually searching for before they purchase.
There's a Moz discussion here which might help you: http://moz.com/community/q/noindex-vs-page-removal-panda-recovery.
Regards
George
@methodicalweb