GWT and HTML improvements
-
Hi all
I am dealing with duplicate content issues in Google Webmaster Tools, but I still don't understand what's happening, as the number of issues keeps changing. Last week the duplicate meta descriptions were 232, then they went down to 170, and now they are back up to 218.
Same story for duplicate meta titles: 110, then 70, now 114. These ups and downs have been going on for a while, and for the past two weeks I stopped changing things to see what would happen.
Also, the issues reported in GWT are different from the ones shown in the Crawl Diagnostics report on Moz.
Furthermore, most URLs were changed (more than a year ago) and 301 redirects were implemented, but Google doesn't seem to recognize them.
Could anyone help me with this?
Also, can you suggest a tool to check redirects?
Cheers
Oscar
-
Thank you guys for your answers. I will look into it and try to solve the problems.
I think many pages are self-canonicalized, but I see that many URLs haven't been redirected to the new ones, so I will start by fixing the redirects.
The top pages report, though, shows just the new URLs.
Anyway, I will keep you updated on this, as I am not too sure how to tackle it.
Thanks a lot.
Cheers
-
Had a few minutes and wanted to help out...
Google doesn't always crawl and index the same number of pages week over week, so that alone could explain the differences you are seeing in these reports. Also, if you are actively working on the site and making changes, you should see these numbers improve over time (depending on site size, of course; enterprise sites can take more time to work through and fix up, so if your site is huge the numbers might look like they are holding steady for a while).
To help with your 301 issue, I would definitely download the Screaming Frog SEO Spider. It's a great tool for identifying potential problems on a site, and it's very easy to download and use. It might take some getting used to, but the learning curve isn't steep once you've run it a few times to diagnose problems or to watch the things you are working on improve across multiple crawls. It will also show you other things that might not be working so you can start planning fixes for those too.
As well, make sure to review your .htaccess file and how you have written up your 301s. If you are using Apache, there is a great resource to help you along; read the 301-related article linked here.
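For reference, a 301 block in an Apache .htaccess file might look something like the sketch below. The paths and patterns are placeholders rather than your actual URLs, so treat it as an illustration, not a drop-in fix:

```apache
# Hypothetical examples only - swap in your own old and new paths.

# Simple one-to-one redirect (mod_alias)
Redirect 301 /old-page.html /new-page/

# Pattern-based redirect (mod_rewrite), e.g. moving a whole section
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

One common gotcha is redirect chains (old URL to intermediate URL to new URL); pointing each old URL straight at its final destination keeps Google from having to follow multiple hops.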
Make sure to manually check all of the 301 redirects using the data/URLs from the Screaming Frog crawl. Type the old URLs into a browser and visually confirm that you get redirected to the new page/URL. If you do, the redirect is working correctly, and I'm sure it will only be a matter of time before Google updates its index and displays the right URL. You can also use a redirect-checking tool to verify your 301s by entering the old URL and seeing how it resolves (linked here).
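If you would rather check a whole batch of old URLs at once instead of typing them in one by one, a small script can do the same job. Here is a rough sketch using Python and the requests library; the URLs listed are placeholders you would swap for your own export:

```python
import requests

# Placeholder URLs - replace with the old URLs exported from your crawl.
old_urls = [
    "http://www.example.com/old-page",
    "http://www.example.com/another-old-page",
]

for url in old_urls:
    try:
        r = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
        continue
    if r.history:
        # r.history lists each hop in the redirect chain (301s, 302s, etc.)
        hops = " -> ".join(f"{h.status_code} {h.url}" for h in r.history)
        print(f"{url}: {hops} -> final {r.status_code} {r.url}")
    else:
        print(f"{url}: no redirect (status {r.status_code})")
```

Anything that comes back as a 302, or as a chain of several hops rather than a single 301 to the new URL, is worth fixing first.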
Hope some of this helps get you started with testing and fixing! Keep me posted if you are having trouble or need someone to run a few tests from another location.
Cheers!
-
We had the same issue on one of our sites. Here is how I understand it after looking into it and talking to some other SEOs.
The duplicate title and meta description reports seem to lag behind any 301 redirects or canonicals that you implement. We went through a massive site update and had 301s in place for over a year, yet "duplicates" were still showing up in GWT for both the old and new URLs. Just to be clear, we had the old URLs 301ing to the new ones for over a year.
We also found that if you look in GWT under the top landing pages, old URLs were listed there too.
The solution was to put self-referencing canonical tags on all pages that were not canonicalized to another page. This cleaned things up over the next month or so. I had already checked my 301 redirects, removed all links to old content on my site, etc.
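In case it helps, a self-referencing canonical is just a tag in the head of each page pointing at that page's own preferred URL, something like this (the URL shown is a placeholder for each page's own address):

```html
<!-- Each page references its own preferred URL (placeholder shown) -->
<link rel="canonical" href="https://www.example.com/current-page/" />
```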
What I still find are a few more "duplicates" in GWT. This happens with two types of URLs:
-
We have to change a URL for some reason, so we put in the 301. It takes a while for Google to pick that up and apply it to the duplicate content report, even when we see the change reflected in the index pretty quickly. As I said, the duplicate report seems to lag the other reports.
-
We still have some very old URLs where it has taken Google a while to "circle back," recheck them, see the 301 and the self-referencing canonical, and clean up the report.
I am honestly flabbergasted and surprised at how slow Google is about this. I have talked with a bunch of people just to make sure we are not doing anything wrong with our 301s, etc. So, while I understand what is happening and can see it improving, I still don't have a good "why" for it when, technically, I have everything straight (as far as I know). The self-referencing canonical was the solution, but it seems that a 301 should be enough. I know there are still old links to old content out there; that is the one thing I cannot update, but I'm not sure that explains it.
It is almost like Google has an old sitemap it keeps crawling, but again, I have that cleared out in Google as well.
If you double-check all your stuff and find anything new, I would love to know!
Cheers!