Best Way To Go About Fixing "HTML Improvements"
-
So I have a site where I was creating dynamic pages for a while, and some of them accidentally ended up with lots of near-duplicate meta tags and titles. I then restructured the site but left those duplicate tags in place for a while, not knowing what had happened. Recently I began my SEO campaign again and noticed these errors under HTML Improvements, so I did the following:
- Removed the pages.
- Removed the directories that contained these dynamic pages with the Remove URLs tool in Google Webmaster Tools.
- Blocked Google from crawling those pages via robots.txt.
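For reference, the robots.txt block looked roughly like this (a minimal sketch; the directory names are hypothetical stand-ins for the removed sections):

```
# robots.txt: stop crawling of the retired dynamic sections
User-agent: *
Disallow: /old-dynamic-pages/
Disallow: /duplicate-listings/
```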
I have verified that the robots.txt works and the pages are no longer in Google search; however, they still show up in the HTML Improvements section a week later (and the report has updated a few times since). So I have decided to remove the robots.txt block and add 301 redirects instead.
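On an Apache server those redirects would look something like this (a minimal sketch; the paths are hypothetical):

```
# .htaccess: permanently redirect each retired dynamic page to its
# closest live equivalent so Google can recrawl and drop the old URL
Redirect 301 /old-dynamic-page.php /new-page/
# Or redirect an entire retired directory with one rule
RedirectMatch 301 ^/old-directory/(.*)$ /new-section/
```

Worth noting: the robots.txt block works against this step, because if Googlebot cannot crawl the old URLs it never sees the redirects and the stale titles linger in HTML Improvements, so removing the block first is the right order.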
Does anyone have any experience with this, and am I going about this the right way? Any additional info is greatly appreciated, thanks.
-
Great advice here.
Just to add: Google Search Console seems to update its reports more slowly than the search index, so it is possible to see old errors persist after they are fixed, until the pages are re-crawled and the report refreshes.
Kind Regards
Jimmy
-
Hi there
I wouldn't remove pages just because they had issues. Some of that content may hold value; it's just a matter of making sure that your on-site SEO is unique to those pages. Your users may be searching for it, so make sure you research and tailor those pages to your users' intent.
Google also offers advice on handling duplicate content, including URL parameters and dynamic pages, so make sure you read through that before you start discarding pages or content.
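As a concrete example, a cleaned-up page head might look like this (a minimal sketch; the title, description, and URLs are hypothetical):

```html
<head>
  <!-- A title and description unique to this page, not shared templates -->
  <title>Blue Widgets: Sizes, Pricing, and Reviews | ExampleStore</title>
  <meta name="description" content="Compare blue widget sizes and prices, with buyer reviews and a sizing guide.">
  <!-- Point any parameterized or dynamic variants at one canonical URL -->
  <link rel="canonical" href="https://www.example.com/widgets/blue/">
</head>
```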
Hope this helps! Good luck!
Related Questions
-
Any idea why Google Search Console stopped showing "Internal Links" and "Links to your site"?
Our default eCommerce property (https://www.pure-elegance.com) used to show several dozen external links and several thousand internal links in Google Search Console. As of this Friday, both reports show "No Data Available". I checked our other related properties (https://pure-elegance.com, http://pure-elegance.com and http://www.pure-elegance.com) and all of them show the same. Our other statistics (like Search Analytics, etc.) remain unchanged. Any idea what might have caused this and how to resolve it?
Intermediate & Advanced SEO | SudipG
Difference in Number of URLs in "Crawl, Sitemaps" & "Index Status" in Webmaster Tools, NORMAL?
Greetings Moz Community: Webmaster Tools under "Index Status" shows 850 URLs indexed for our website (www.nyc-officespace-leader.com). The number of URLs indexed jumped by around 175 around June 10th, shortly after we launched a new version of our website. No new URLs were added in the site upgrade. Under Webmaster Tools "Crawl, Sitemaps", it shows 637 pages submitted and 599 indexed. Prior to June 6th there was no significant difference between the number of pages shown in "Index Status" and "Crawl, Sitemaps"; now there is a differential of 175. The 850 URLs in "Index Status" equal the number of URLs in the Moz domain crawl report I ran yesterday. Since this differential developed, ranking has declined sharply. Perhaps I was hit by the new version of Panda, but Google indexing junk pages (if that is in fact happening) could have something to do with it. Is this differential between the number of URLs shown in "Index Status" and "Crawl, Sitemaps" normal? I am attaching images of the two screens from Webmaster Tools as well as the Moz crawl to illustrate what has occurred. My developer seems stumped by this. He has submitted a removal request for the 175 URLs to Google, but they remain in the index. Any suggestions? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
Is this the "Google Dance"?
We just did a site redesign, and removed the noindex, etc. about 10 days ago. Over the last 24 hours, I've gotten some of my top keywords on the first page, but now they are gone, a few hours later. I assume this is typical?
Intermediate & Advanced SEO | CsmBill
"Category" word in URLs of blog is it SEO Friendly URL ??
Hello respected community members, I have often seen the word "category" appear in blog URLs. So my question is: is this negative or positive for SEO? And if we don't want "category" to appear in the URL, how can we remove it as part of URL optimization?
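For what it's worth, on a WordPress blog on Apache the usual pattern is a 301 away from the /category/ base (a minimal sketch under that assumption; only safe once the site is already configured, via a permalink setting or a "remove category base" plugin, to serve the archives at the shorter URLs):

```
# .htaccess sketch: send /category/foo/ to /foo/ with a permanent redirect
RewriteEngine On
RewriteRule ^category/(.+)$ /$1 [R=301,L]
```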
Intermediate & Advanced SEO | sourabhrana39
Post your 3 best ways to rank well on Google
Hi, Anyone care to share your 3 best ways to rank well on Google? As for me, I think: 1) Link building & social media 2) On-site optimization 3) Quality content. What about you?
Intermediate & Advanced SEO | chanel27
Best way to re-order page elements based on search engine users
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), and the other is for everybody else. Questions: Is it cloaking? What would be the best way to re-order elements on the page: totally different style sheets for each version, or calling in different divs with the same style sheet? Is there any better way to re-order elements based on the search engine? Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google versus everybody else. Don't ask me the reason behind it (executive orders!!)
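One approach that keeps the markup identical for every visitor (and so steers furthest from cloaking) is to reorder purely with CSS: serve one DOM and flip a single body class per segment, letting flexbox order handle the visual arrangement. A minimal sketch; the class names are hypothetical:

```css
/* One DOM for all visitors; a body class (set however you segment
   traffic) changes only the visual order, not the content */
.page { display: flex; flex-direction: column; }
.page .intro   { order: 1; }
.page .reviews { order: 2; }
/* Variant for the other segment: reviews first */
body.variant-b .page .reviews { order: 1; }
body.variant-b .page .intro   { order: 2; }
```

Whether keying that class off the referrer or user agent is safe is a separate question, but at least the content and DOM stay identical for everyone, including Googlebot.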
Intermediate & Advanced SEO | StickyRiceSEO
What is the best way to scrape SERPs for targeted keyword research?
Wanting to use search operators such as "KEYWORD inurl:blog" to identify potential link targets, then download the target URL, domain and keyword into an Excel file, then use SEOTools to evaluate the URLs from the list. I see the link acquisition assistant in the Moz lab, but the listed operators are limited. Appreciate any suggestions on doing this at scale, thanks!
Intermediate & Advanced SEO | Qualbe-Marketing-Group
Question about "launching to G" a new site with 500000 pages
Hey experts, how are you doing? Hope everything is OK! I'm about to launch a new website; the code is almost done. Totally fresh new domain. The site will have around 500,000 pages, fully optimized internally of course. I have my tactics for getting Google to "travel" over my site and get things indexed. The problem is: release it in "giant mode", or release it "thin" and increase the page count over time? What do you recommend? Release the whole thing to Google at once and let them find the 500k pages (might they think this is spam or something like that)? Or release like 1k-2k per day? Does anybody know a good approach to improve my chances of success here? Any word will be appreciated. Thanks!
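Whichever pace you choose, a sitemap index makes a staged release manageable: split the URLs into multiple sitemap files (Google allows up to 50,000 URLs per file) and add files as batches of pages go live. A minimal sketch; the domain and file names are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap may list up to 50,000 URLs -->
  <sitemap>
    <loc>https://www.example.com/sitemap-pages-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages-2.xml</loc>
  </sitemap>
  <!-- Add more entries here as new batches are released -->
</sitemapindex>
```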
Intermediate & Advanced SEO | azaiats2