GWT and HTML improvements
-
Hi all
I am dealing with duplicate content issues in Webmaster Tools, but I still don't understand what's happening because the number of issues keeps changing. Last week the duplicate meta descriptions were at 232, then went down to 170, and now they are back up to 218.
Same story for duplicate meta titles: 110, then 70, now 114. These ups and downs have been going on for a while, and for the past two weeks I stopped changing things to see what would happen.
Also, the issues reported in GWT are different from the ones shown in the Crawl Diagnostics on Moz.
Furthermore, most URLs were changed (more than a year ago) and 301 redirects have been implemented, but Google doesn't seem to recognize them.
Could anyone help me with this?
Also can you suggest a tool to check redirects?
Cheers
Oscar
-
Thank you guys for your answers, I will look into it, and try to solve the problems.
I think many pages are self-canonicalized, but I see that many URLs haven't been redirected to the new ones, so I will start fixing the redirects.
The top pages report, though, shows just the new URLs.
Anyway, I will keep you updated on this, as I am not too sure how to tackle it.
Thanks a lot.
Cheers
-
Had a few minutes and wanted to help out...
Google doesn't always index/crawl the same # of pages week over week, so this could be the cause of the differences you are seeing in your indexing reports. As well, if you are working on the site and making changes, you should see these numbers improve (depending on site size, of course).
Enterprise sites might take more time to go through and fix up, so if your site is huge these numbers might look like they are staying flat.
To help with your 301 issue, I would definitely look up and download Screaming Frog's SEO Spider. It's a great tool for identifying potential problems on the site. Very easy to download and use. It might take some getting used to, but the learning curve isn't steep. Use it a few times to diagnose problems, or to watch the things you are working on improve across multiple crawls, and it will also surface other issues so you can plan fixes for those too.
As well, make sure to review your .htaccess file and how you have written your 301s. If you are using Apache, this is a great resource to help you along. Read that 301-related article here.
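For reference, 301s in an Apache .htaccess file usually look something like this. This is only a sketch, and all the paths below are hypothetical, not from your site:

```apache
# One-to-one permanent redirect (mod_alias)
Redirect 301 /old-page.html /new-page/

# Pattern-based permanent redirect (mod_rewrite) -
# sends everything under /old-blog/ to /blog/
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

If a rule uses `R=302` (or just `R`, which defaults to 302), Google treats it as temporary, which is a common reason "permanent" moves don't consolidate.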
Make sure to manually check all 301 redirects using the data/URLs from the Screaming Frog crawl. Type them in and visually confirm that you get redirected to the new page/URL. If you do, it's working correctly, and I'm sure it will only be a matter of time before Google updates its index and displays the right URL. You can also use this tool to verify your 301 redirects: enter the old URL and see how it responds (here).
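If you'd rather script the check than eyeball each URL, here's a minimal sketch using only Python's standard library. It requests a URL without following redirects, so you can see each hop's status code and Location target and confirm it's a single 301 hop (the example URL in the comment is hypothetical):

```python
import urllib.request
import urllib.error

def classify_status(code):
    """Label an HTTP status code for redirect auditing."""
    if code == 301:
        return "permanent redirect"
    if code in (302, 303, 307):
        return "temporary redirect"
    if code == 200:
        return "ok (no redirect)"
    return "other"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so each hop can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Return (status_code, Location header or None) for a single request."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None
    except urllib.error.HTTPError as e:
        # With redirects suppressed, 3xx responses surface as HTTPError
        return e.code, e.headers.get("Location")

# Example usage (hypothetical URL):
# code, target = check_redirect("https://example.com/old-page")
# print(code, classify_status(code), "->", target)
```

Run it over the old-URL list exported from your crawl; anything that isn't a single `301 -> new URL` hop (a 302, a chain, or a 200) is worth fixing.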
Hope some of this helps to get you off to working/testing and fixing! Keep me posted if you are having trouble or need someone to run a few tests from another location.
Cheers!
-
We had the same issue on one of our sites. Here is how I understand it after looking into it and talking to some other SEOs.
The duplicate content Title and Meta description seem to lag any 301 redirects or canonicals that you might implement. We went through a massive site update and had 301s in place for over a year with still "duplicates" showing up in GWT for old and new URLs. Just to be clear, we had the old URLs 301ing to the new ones for over a year.
What we found too, was that if you look into GWT under the top landing pages, we would have old URLs listed there too.
The solution was to put self-canonicalizing links on all pages that were not canonicalized to another one. This cleaned things up over the next month or so. I had checked my 301 redirects, removed all links to old content on my site, etc.
What I still find are a few more "duplicates" in GWT. This happens on two types of URLs:
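For anyone following along, a self-referencing canonical is just a link tag in the page's head pointing at its own preferred URL (the URL below is hypothetical):

```html
<!-- On https://example.com/new-page/, the page points to itself, so any
     old paths or parameter variants consolidate to this URL -->
<link rel="canonical" href="https://example.com/new-page/" />
```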
-
We have to change a URL for some reason, so we put in the 301. It takes a while for Google to pick that up and apply it to the duplicate content report, even when we see it update in the index pretty quickly. As I said, the duplicate report seems to lag the other reports.
-
We still have some very old URLs where it has taken Google a while to "circle back", see the 301 and the self-canonical, and fix them.
I am honestly flabbergasted and surprised at how slow Google is about this. I have talked with a bunch of people just to make sure we are not doing anything wrong with our 301s etc. So, while I understand what is happening, and see it improving, I still don't have a good "why" for it when, technically, I have everything straight (as far as I know). The self-canonical was the solution, but it seems like a 301 should be enough. I know there are still old links to old content out there; that is the one thing I cannot update.
It is almost like Google has an old sitemap it keeps crawling, but again, I have that cleared out in Google as well.
If you double check all your stuff and if you find anything new, I would love to know!
Cheers!