Best Way To Go About Fixing "HTML Improvements"
-
So I have a site where I was creating dynamic pages for a while, and some of them accidentally ended up with lots of near-identical meta tags and titles. I then restructured my site but left those duplicate tags in place for a while, not realizing what had happened. Recently I restarted my SEO campaign and noticed these errors were there, so I did the following:
-
Removed the pages.
-
Removed the directories containing these dynamic pages with the URL removal tool in Google Webmaster Tools.
-
Blocked Google from crawling those pages via robots.txt.
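A minimal sketch of the kind of robots.txt rule described in that step (the directory name here is hypothetical):

```
User-agent: *
Disallow: /dynamic-pages/
```

One caveat worth knowing: `Disallow` only blocks crawling, not indexing, and a blocked page can't show Googlebot a 301 or a noindex tag, which is why blocking and redirecting at the same time works against itself.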
I have verified that the robots.txt works and the pages are no longer in Google search... however, they still show up in the HTML Improvements section after a week (it has updated a few times). So I decided to remove the robots.txt block and add 301 redirects instead.
Does anyone have any experience with this, and am I going about this the right way? Any additional info is greatly appreciated, thanks.
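For the 301-redirect step, here is a minimal sketch using only the Python standard library. It is not a production setup (real redirects would normally live in your Apache or nginx config), and the URL mapping is hypothetical; it just demonstrates what a correct permanent-redirect response looks like and how to verify the status code without following the redirect:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of removed dynamic URLs to their replacements --
# in production this would live in your server configuration.
REDIRECTS = {"/old-dynamic-page?id=42": "/new-page"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)               # permanent: consolidates signals
            self.send_header("Location", target)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):                 # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we can inspect the raw response,
# much like `curl -I` against the old URL.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/old-dynamic-page?id=42")
resp = conn.getresponse()
status, location = resp.status, resp.getheader("Location")
conn.close()
server.shutdown()
print(status, location)  # 301 /new-page
```

Checking each removed URL this way (a 301 status plus a sensible Location header) confirms the redirects are in place before waiting on Google to re-crawl.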
-
-
Great advice here.
Just to add: Google Search Console seems to update its reports more slowly than the search index, so it is possible to see old errors linger until the pages are re-crawled.
Kind regards,
Jimmy
-
Hi there
I wouldn't remove pages just because they had issues. Some of that content may hold value; it's just a matter of making sure that your on-site SEO is unique to those pages. Your users may be searching for it, so make sure you research and tailor those pages to your users' intent.
Google also offers advice on duplicate content, including parameters and dynamic pages, so make sure you read through that before you just start discarding pages/content.
Hope this helps! Good luck!
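Since the original problem was duplicate titles and meta tags on dynamic pages, one practical first step is to audit which URLs share a title before discarding anything. A rough standard-library sketch (the sample pages are hypothetical; in practice you would fetch your real URLs):

```python
import re
from collections import defaultdict

# Hypothetical pages -- in practice, fetch the HTML of your own URLs.
pages = {
    "/widgets?id=1": "<html><head><title>Widgets</title></head></html>",
    "/widgets?id=2": "<html><head><title>Widgets</title></head></html>",
    "/about":        "<html><head><title>About Us</title></head></html>",
}

def find_duplicate_titles(pages):
    """Group URLs by <title> text and keep only titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        if m:
            by_title[m.group(1).strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

dupes = find_duplicate_titles(pages)
print(dupes)  # {'Widgets': ['/widgets?id=1', '/widgets?id=2']}
```

Each group of URLs sharing a title is a candidate for either a rewritten unique title or a canonical/redirect decision, rather than outright deletion.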
Related Questions
-
When I search for my domain name, Google asks "Did you mean" - why?
Hi all, I just noticed something quite odd: if I do a search for my domain name (see: http://goo.gl/LBc1lz), Google shows my domain as the first result, but it also asks "Did you mean" and suggests another website with a very similar name. The other site has far lower PA/DA according to Moz. Any ideas why Google is doing this? And more importantly, how could I stop it? Please advise. James
Intermediate & Advanced SEO | isntworkdull
How to structure links on a "Card" for maximum crawler-friendliness
My question is how to best structure the links on a "Card" while maintaining usability for touchscreens. I've attached a simple wireframe, but the "card" is a format you see a lot on the web now: it's about a topic and contains an image for the topic and some text. When you click the card, it links to a page about the topic. My question is how to best structure the card's HTML so Google can most easily read it. I have two options: a) Make the elements of the card two separate links, one for the image and one for the text. Google would read this as follows: // image
Intermediate & Advanced SEO | jcgoodrich
<a href="/target-url"><img src="topic.jpg" alt="Topic"></a> // text
<a href="/target-url">Topic</a> b) Make the entire card a single link, which Google would read as one anchor wrapping a bunch of div elements that include the anchor text and image alt attributes above, along with a fair amount of additional text. Holding UX aside, which of these options is better purely from a Google crawling perspective? Does doing (b) confuse the bot about what the target page is about? If one is clearly better, is it a dramatic difference? Thanks!
What is the best way to get anchor text cloud in line?
So I am working on a website that has been doing SEO with keyword links for a few years. The first branded term comes in at 7%, 10th in the list on Ahrefs. The keyword terms are upwards of 14%. What is the best way to get this back in line? It would take several months of building branded-term links to make any difference, but it is doable. I could try link removal, but less than 10% seem to actually get removed, which won't make a difference. The disavow file doesn't really seem to do anything either. What are your suggestions?
Intermediate & Advanced SEO | netviper
Taking up an "abandoned" domain?
Hi, As far as SEO goes, are there any drawbacks to picking up an approximately one-year-old domain where the only thing that has ever been on it is a static "Hello world" page from a WordPress install done when the domain was created? I'm thinking about picking it up and treating it as a totally fresh domain: add content and do SEO on it. What are your thoughts, friends? Thanks.
Intermediate & Advanced SEO | kaince
Building "keyword" backlinks
Looking for some opinions here, please. I've been involved in SEO for a couple of years, mainly working on my own websites and picking up the odd client here and there through word of mouth. I must admit that up until a few months back I was guilty of using some grey methods of link building - Linkvana, Unique Article Wizard and the such. While no penalties were handed out to my domains and some decent rankings were gained, I got tired of always being on the lookout for what the next Google update would do to my results and which networks were being hit, so I moved a lot more into the "proper" way of doing SEO. These days my primary sources for backlinks are much more respectable: myblogguest, bloggerlinkup, postjoint, and Guest Blog Finder (http://ultramarketer.com/guest-blogger-finder/ - not sure where I came across this resource, but it's very handy). I use these sources alongside industry-only directories and general word of mouth. Ironically, I have found that doing the work by hand not only leads to results I can happily show people (content-wise) but is also much quicker and cheaper: the increased authority of the sites means far fewer links are needed. The one area I am still having a little issue with is building keyword-based backlinks. I now find it fairly easy to get my content on a reasonable-quality site - DA 40 and above - however, the vast majority of these sites will allow the backlink only as the company name or as a generic "read more" type thing. This is fine, and it is improving my website's performance and authority. The trouble is that while I am ranking for the title tag and some keywords on the page, I am struggling to get backlinks for other keywords.
In an ideal world every page on the site would be optimised for a different keyword, and you could then just use the site name as anchor text to build the authority of that page and make it rank for its content. But what about when you (or the client) want to rank the home page for a number of different keywords, some not featured on the page? The keywords are too similar to be worth making unique pages for, and doing so would add no value to the site. My question, then, after a very long-winded way of getting there: are others finding it much more difficult to gain keyword-based backlinks these days? The great thing about the grey SEO tools mentioned above was that it was super easy to get backlinks with whatever anchor text you wanted - even if you needed hundreds of them to compensate for the low value of each! Thanks, Carl
Intermediate & Advanced SEO | GrumpyCarl
On page report card for the keyword "computers"
I was looking at which websites rank in the top 3 for the keyword "computers"... I noticed that first is Wikipedia, and then there are Dell and Apple... I then ran an on-page report card and noticed that Wikipedia has a grade A (which is great). However, Apple has an F (which sucks!!) but they still rank up there. My question is: why is Apple ranking for the keyword "computers" with no title, no URL, no H1, no body, no B/strong, when Wikipedia has all of that and the term "computers" occurs 290 times on its page? Is it due to the fact that Apple has millions of external links, and is that enough to rank even with an "irrelevant" page? By the way, I have noticed the same with other keywords such as "bicycle": Wikipedia ranks 1st, and then sites like www.trekbikes.com are up there, but they shouldn't be based on their homepage "optimization". I know there are other factors, but I am just trying to figure out why such sites (like Apple or Trek Bikes) rank up there. Thank you,
Intermediate & Advanced SEO | seoanalytics
What is the best way to learn SEO?
I was wondering if it's worth taking an SEO training course. If so, is it better to take a live class or an online class? Or is it better to just read all the SEO books out there? Or is there a good video series anyone can recommend? What is the best way to learn SEO? I have a good understanding of SEO, but I'm not a pro (yet). Obviously SEO is always evolving, so even the pros are constantly updating their skill sets, but I want to make sure my foundation is solid and complete. Advice, please. Thank you all.
Intermediate & Advanced SEO | bronxpad
Maximum of 100 links on a page vs rel="nofollow"
All, I read within the SEOmoz blog that search engines consider 100 links on a page to be plenty, and we should try (where possible) to stay within that limit. My question is: when a rel="nofollow" attribute is given to a link, does that link still count toward your maximum of 100? Many thanks, Guy
Intermediate & Advanced SEO | Horizon
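As a footnote to the nofollow question above: whatever the crawler ultimately counts, it is easy to audit how many links on a page carry rel="nofollow" yourself. A small sketch with Python's built-in HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count total <a> tags and those carrying rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.total += 1
            rel = dict(attrs).get("rel") or ""
            # rel is a space-separated token list, e.g. "nofollow noopener"
            if "nofollow" in rel.split():
                self.nofollow += 1

# Hypothetical page fragment; in practice, feed it a fetched page's HTML.
html = """<a href="/a">one</a>
<a href="/b" rel="nofollow">two</a>
<a href="/c">three</a>"""

counter = LinkCounter()
counter.feed(html)
print(counter.total, counter.nofollow)  # 3 1
```

Running this against a page gives you both figures, so you can see how close the followed-link count alone comes to the 100-link guideline.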