Preventing WordPress attachment pages from being indexed and seen as duplicate content.
-
Hi
According to a Moz crawl, it looks like the WordPress attachment pages from all image uploads are being indexed and seen as duplicate content... or is it the Yoast sitemap causing it? I see 2 options in Yoast SEO:
- Redirect attachment URLs to parent post URL.
- Media → Meta Robots: noindex, follow
I set option (1) initially, which didn't resolve the problem. I then switched to option (2) so that the attachment URLs won't be indexed but search engines will still associate those images with their relevant posts and pages. I understand what both options (1) and (2) mean, but because I chose option (2), does that mean none of the images on the website stand a chance of being indexed in search engines, Google Images, etc.?
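For what it's worth, you can verify what option (2) actually emits by fetching an attachment page and checking its HTML for the meta robots tag. A minimal stdlib sketch (the function name and sample markup are illustrative, not taken from your site):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots.append(d.get("content", ""))

def is_noindexed(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in c.lower() for c in parser.robots)

# An attachment page with Yoast's option (2) applied should carry this tag:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

Note that this tag governs the attachment page itself; whether the image file is indexable is a separate question.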
As far as duplicate content goes, search engines can get confused when there are 2 ways to reach the same page content. When Google makes the wrong choice, a portion of traffic is lost, the searcher is left frustrated, and the SEO and ranking of the site worsen over time.
My goal here is this: I would like all of the web images to be indexed by Google, and none of the image attachment pages to be indexed at all (Moz shows the image attachment pages as duplicates, and the referring URL causing this is the sitemap URL that Yoast creates). That sitemap URL has already been submitted to the search engines, and I will resubmit it once the attachment page issues are resolved.
Please can you advise.
Thanks.
-
Hi Kate,
Here is an update as to what is happening so far. Please excuse the length of this message.
-
According to the host, the database is fine (please see below), but WordPress is still calling https:
-
- In the WP database wp-actions, http is definitely being called
- All certificates are ok and SSL is not active
- The WordPress database is returning properly
- The WP database mechanics are ok
- The wp-config file is not doing https returns; it is calling http correctly
-
They said that the only other possibility could be one of the plugins causing the problem. But how can a plugin cause https problems? I can see 50 different https pages indexed in Google. Bing has been checked and no https pages are indexed there. All internal URLs have always been http only, and that is still the case.
-
I have Google-fetched the website pages. Of the 50 https pages, most are images, which I think probably came from the Yoast sitemap that was originally submitted to the search engines. More recently I have taken all media image URLs out of the Yoast sitemap and set noindex, follow on all image attachment files (the pages, and the images on those pages, will still be crawled and indexed by Google and the other search engines; it just means the attachment URLs won't be). What will happen to those unwanted https files, though? If I place rel=canonical links on the pages that matter, will the https pages eventually drop out of the index? I just wish I could find what is causing it (analogy: it's best to fix the hole in the roof rather than keep using a bowl to catch the water each time it rains).
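To confirm whether the Yoast sitemap is still the source of those https URLs, you can scan the sitemap file directly rather than waiting for the next crawl. A small stdlib sketch (the sample XML and domain are illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def https_urls(sitemap_xml: str) -> list:
    """Return every <loc> entry in a sitemap that uses the https scheme."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in locs if u.startswith("https://")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.co.uk/</loc></url>
  <url><loc>https://example.co.uk/some-image-page/</loc></url>
</urlset>"""
print(https_urls(sample))  # ['https://example.co.uk/some-image-page/']
```

An empty result would rule the current sitemap out and point the finger elsewhere (theme, plugin, or external links).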
-
I looked at Analytics today and saw something really interesting (see attached image): there are 5 instances of the trailing-slash home page, and to my knowledge there should only be 1 for a website. The Moz crawl shows just one home domain, http://example.co.uk/, so I am somewhat confused. Google search results showed 256 results for https URL references, and 50 were available to click on. So perhaps there are 50 https pages being referenced for each trailing slash (could there be 4 other trailing-slash duplicate pages indexed, and how would I fix that if so?). This might sound naive, but I don't have the skill set to fix this at the moment, so any help and advice would be appreciated.
-
Would the Search and Replace plugin help at all, or would it be a waste of time since the WordPress database mechanics seem to be ok?
-
I can't place https-to-http 301 redirects for the 50 https URLs indexed in Google, and I can't add https rewrite rules in .htaccess, since that type of redirect only works if an SSL certificate is active. I already tried several redirect rules in .htaccess and, as expected, they didn't work, which again suggests that SSL is not active for the site.
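For reference, the rule in question is the standard mod_rewrite sketch below; as you say, it can only ever fire if the server actually accepts the https connection, so without an active certificate the browser shows its warning before .htaccess is even reached:

```apache
# Sketch only: send any https request back to its http equivalent.
# Assumes mod_rewrite is enabled and, crucially, that the server has a
# working SSL endpoint to receive the https request in the first place.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```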
-
When https is entered instead of http, it should automatically resolve to http without the visitor having to worry about it, but I tried again and the https version appears with a red diagonal line through it. The problem is that once a web visitor lands on that page they stay on https (visually, the main nav bar contents stretch across the page and the images and videos don't appear), so the traffic drops off: a bad experience for the user, dropped traffic, decreasing income, and bad for SEO (split page juice, decreased rankings). There are no crawl errors in Google Search Console, and Analytics shows Fetch as Google completed for all pages, but when I request fetch and render for the home page it shows as partial instead of complete.
-
I don't want to request any https URL removals through Google and the other search engines; it's not recommended, because Google states that the http version could be removed along with the https.
-
I did look at this last week:
http://www.screamingfrog.co.uk/5-easy-steps-to-fix-secure-page-https-duplicate-content/
-
Do you think the https URLs are indexed because links pointing to the site use https? Perhaps most of the backlinks are https, but the preferred setting in Webmaster Tools / Search Console is already set to the non-www version instead of the www version, and there has never been an https version of the site.
-
This was one possibility re duplicate content. Here are two pages and the listed duplicates:
-
The first Moz crawl I ever requested came back with hundreds of duplicate errors, which I have since resolved. Google's crawl had not picked this up previously (so I figured everything was ok); it was only after that Moz crawl that the https links were seen to be indexed. So the goals are to stop the root cause of the problem and to fix the damage, so that the https URLs drop out of the SERPs and the index.
-
I considered that the duplicate links in question might not be true duplicates as such: the duplicate pages (attachment pages created by WordPress for each image uploaded to the site) have no real content, so the template elements outweighed the unique content elements, which is what flagged them as duplicates in the Moz tool. So I thought these were unlikely to hurt, as they were not duplicates as such, but they were indexed thin content. I did a content audit and tidied things up as much as I could (blank pages and weak ones), hence the recent new sitemap submission and fetch to Google.
-
I have already redirected all attachments to the parent page in Yoast, and removed all attachments from the Yoast sitemap and set all media content (in Yoast) to 'noindex, follow'.
-
Naturally it's really important to eliminate the https problem before external backlinks start pointing at any of the unwanted https pages that are currently indexed. Luckily I haven't started any backlinking work yet, and any links I have posted have all been the http version. As I understand it, most server configurations should redirect to http by default when https isn't configured, so I am confused about where to take this, especially as the host has given the WP database the all-clear.
-
It could be taxonomies related to the theme, or a slider plugin, as I have learned these past few weeks. Disallowing and deindexing those unwanted https URLs would be amazing, since I have already spent weeks trying to get to the bottom of the problem.
-
Ideally, I understand from previous weeks that these 2 things would be very important:
(1) 301 redirects from https to http (the host in this case cannot enable this directly through their servers, and I can only add these redirects in the .htaccess file if there is an active SSL certificate in place).
(2) A canonical URL using http, in place for both the http and https variations.
Either of those solutions might work on its own; if the 301 redirect can't work with this host, will the canonical fix it? I saw that I could just set a canonical with a fixed transport protocol of http:// and let Google sort out the rest. Not preferred from a crawl perspective, but would it suffice? (Even so, I don't know how to put that in place.)
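On point (2), the canonical's only job is to collapse both protocol variants onto the single http URL. Yoast can output the tag for you, but as an illustration of the normalization involved, here is a minimal stdlib sketch (example.co.uk is a placeholder domain):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_http(url: str) -> str:
    """Force the http:// scheme so http and https variants share one canonical."""
    parts = urlsplit(url)
    return urlunsplit(("http",) + tuple(parts[1:]))

# Both protocol variants collapse onto the same canonical target:
print(canonical_http("https://example.co.uk/some-page/"))  # http://example.co.uk/some-page/
print(canonical_http("http://example.co.uk/some-page/"))   # http://example.co.uk/some-page/
```

On the page itself this surfaces as a `<link rel="canonical" href="http://example.co.uk/some-page/" />` element in the head, which (as I understand it) Yoast's advanced canonical field can set per page.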
-
There are around 180 W3C validation errors, and the homepage renders with critical errors and a couple of warnings. Would getting these fixed help matters, do you know?
-
The 907 Theme scores well for its concept and functionality but its SEO reviews aren't that great.
-
The duplicate problems are not related to the W3 Total Cache plugin, which is one of the plugins in place.
-
Regarding add-ons (trailing slash): for example, http://domain.co.uk/events redirects to http://domain.co.uk/events/. The add-on should only do this on active URLs; even if it didn't, there were no reports of trailing-slash duplicate errors in the Moz crawl, so I would think that is a separate issue to look at on its own.
-
At the bottom of each duplicate page there is an option for noindex. The home page is made up of page sections and parallax sections, each of which has to be published to become a live part of the home page. I understand this isn't great for SEO, because only the top page section is registered in Yoast as the home page; the other sections are not crawled as part of the home page but as separate page sections. Is it ok to index those page sections, or would noindex, follow on them be good practice here? The theme does not automatically block page sections from appearing in search engines.
-
Can noindex only be put on whole pages, and not on specific page sections? I just want to make sure that the content on all the pages (media and text) and page sections is crawlable.
-
To ultimately fix the https problem with the indexed pages, could this come down to adding SSL to the site just because there is no better way, simply so the https-to-http redirect rule can be added to the .htaccess file? If so, I don't think that would fix the root cause of the problem; could the root cause be one of the plugins? Confused.
-
With canonical URLs, does that mean the https links that don't have canonicals will deindex eventually? Are the https links giving a 404? (I'm worried because 404s normally need 301s, as you know, and I can't put a 301 on an https URL in this situation.) Do I have to set a canonical for every single page on the website, given the extent of the problem?
-
Nearly all of the traffic drops off after visiting the home page, and I can't for the life of me see why. Is it because of all these https pages? Once canonicals are in place, how long will it take for everything to return to how it should be? Is it worthwhile starting a PPC campaign, or should I wait until everything has calmed down on the site?
-
Is this a case of setting the canonical URL and letting the rest sort itself out? (Please see the screenshot attached regarding the 5 home pages that each have a trailing slash.)
-
This is the entire current situation. I understand it might not be so straightforward, but I would really appreciate help, as the site continues to lose traffic and income. Others will be able to learn from this string of questions and responses too. Thank you for reading this far, and have a nice day. Kind regards,
-
-
Hi Paul
I did (1), which did not resolve the problem, so I then set media to noindex, follow.
I have already excluded attachment URLs from the sitemap.
When you say "When adding media, make certain the Link to box does NOT point to the attachment page", are you saying to edit the link settings on existing images, or do you mean for future image uploads? Or both?
Thanks
-
In order to accomplish your goal, set up Yoast SEO to:
- redirect attachment URLs to parent post
- exclude attachment URLs from sitemap (it's a checkbox under the Post Types tab in the XML Sitemaps section of Yoast SEO Settings)
- leave all media indexed and followed.
- When adding media, make certain the Link to box does NOT point to the attachment page.
What this accomplishes is to allow the actual image files to still be indexed and hence show up in image search. It also ensures that the pointless image attachment pages don't waste crawl budget and don't appear to the search crawlers as thin/duplicate content. Win!
Hope that helps?
Paul