Control indexed content on a WordPress.com-hosted blog...
-
I have a client with a blog set up on a subdomain of their domain (example: blog.clientwebsite.com), and even though it loads at that subdomain, it's actually a WordPress.com-hosted blog. If I attempt to add a plugin like Yoast SEO, I get the attached error message. Their technical team says this is a brick wall for them, and they don't want to change how the blog is hosted.
So my question is... on a subdomain blog like this, if I can't control what is in the sitemap with a plugin, and can't manually add a sitemap because the content is being pulled from a WordPress.com-hosted install, what can I do to control what is in the index?
I can't add an SEO plugin...
I can't add a custom sitemap...
I can't add a robots.txt file...
The blog is set up with domain mapping, so the content isn't actually hosted there. What can I do to keep tags, categories, author pages, archive pages, and other low-value content from ending up in the search engines?
-
That almost looks like your client doesn't have WordPress actually installed on their subdomain at all. It looks like they set up a 'something.wordpress.com' site, which WordPress.com actually hosts, and somehow overlaid their own subdomain over it (using DNS / name-server shenanigans).
If that is true, then, since WordPress.com hosts the blog, there's not much you can do. If it is a local WordPress install that does exist on your client's actual website, rather than being 'framed' in (or something equally shady), then I haven't seen this error before and it seems really odd. It smacks of someone trying to cut corners with their hosting environment, trying to 'be clever' instead of shelling out for a proper WordPress install. Clearly there are limitations...
OK, there's really only one other alternative. This is also technical, though, and I don't know if it would be any easier for your dev team, but...
You can send noindex directives to Google without altering the site or page code at all, as long as you are willing to play around with server-level HTTP headers.
There's something called the X-Robots-Tag HTTP header which might be useful to you. Read Google's documentation on robots meta tags, starting from the section titled "Using the X-Robots-Tag HTTP header" (Ctrl+F for it).
As far as I know, most meta robots indexing directives can also be sent through the HTTP response header using X-Robots-Tag.
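To make that concrete, here's a minimal sketch of what those headers could look like at the Apache level. This is an assumption-heavy illustration, not a drop-in fix: it only works if requests for blog.clientwebsite.com actually pass through a server the client controls (e.g., a reverse proxy in front of the WordPress.com install), and it assumes mod_headers and mod_setenvif are enabled. The URL patterns are guesses at default WordPress tag/category/author/date-archive paths:

```apache
# Hypothetical .htaccess / vhost snippet -- only applies if a server you
# control actually answers requests for blog.clientwebsite.com.
<IfModule mod_headers.c>
    # Flag the low-value archive-style URLs. These patterns are assumptions
    # based on default WordPress permalinks -- adjust to the real site.
    SetEnvIf Request_URI "^/(tag|category|author)/" LOW_VALUE_PAGE
    SetEnvIf Request_URI "^/[0-9]{4}/([0-9]{2}/)?$" LOW_VALUE_PAGE

    # Tell crawlers not to index flagged pages, while still following links.
    Header set X-Robots-Tag "noindex, follow" env=LOW_VALUE_PAGE
</IfModule>
```

Googlebot treats an X-Robots-Tag header the same as the equivalent robots meta tag in the page HTML, so this would keep the archive-style pages out of the index without touching the WordPress templates. The catch, as noted above, is that with pure DNS-level domain mapping there may be no server in the chain where these headers can be set at all.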
It's kinda crazy, but it might be your only option.