Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
What is the best way to get an anchor text cloud back in line?
-
So I am working on a website that has been doing SEO with keyword-rich links for a few years.
The first branded term comes in at 7%, tenth on the list in Ahrefs.
The keyword anchor terms are upwards of 14%.
What is the best way to get this back in line? It would take several months of building branded-term links to make any difference, but it is doable.
I could try link removal, but fewer than 10% of requests seem to actually result in removal, which won't make a difference.
The disavow file doesn't really seem to do anything either.
What are your suggestions?
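For reference, the disavow file that Google Search Console accepts is just a plain-text list, one entry per line, with `#` comments; a `domain:` line disavows every link from that host. A minimal sketch (the domains below are hypothetical examples):

```text
# Links acquired through old keyword-anchor campaigns
domain:spammy-directory-example.com
https://article-farm-example.net/cheap-widgets-post.html
```

Even if the disavow seems to do little on its own, keeping the file current costs nothing and documents the cleanup effort.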
-
I guess you have to go with options 1 and 2: remove the links and, at the same time, build some new links with the branded terms!
Hope this helps!
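Before building new links, it helps to quantify the current distribution precisely. A minimal sketch of tallying anchor share from a backlink export, assuming a CSV with an "Anchor" column (Ahrefs exports use a similar header, but check your file):

```python
import csv
import io
from collections import Counter

def anchor_distribution(csv_text, anchor_col="Anchor"):
    """Tally anchor-text share (%) from a backlink export.

    The column name is an assumption; adjust it to match your export.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row[anchor_col].strip().lower() for row in rows)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1) for anchor, n in counts.most_common()}

# Tiny made-up export to illustrate the calculation
sample = """Anchor,Referring Page
Acme Widgets,https://example.com/a
acme widgets,https://example.com/b
buy cheap widgets,https://example.com/c
buy cheap widgets,https://example.com/d
"""
print(anchor_distribution(sample))
```

Re-running this monthly against fresh exports shows whether the branded share is actually drifting back toward a natural-looking ratio.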
Related Questions
-
Combining images with text as anchor text
Hello everyone, I am working to create sub-category pages on our website virtualsheetmusic.com, and I'd like your thoughts on using a combination of images and text as anchor text in order to maximize keyword relevancy. Here is a simple example. Take our violin sheet music main category page at /violin/, which includes the sub-categories Christmas, Classical, and Traditional. The idea is to list those sub-categories as links on the main violin sheet music page. Simple text links would read "Christmas", "Classical", "Traditional". Now, since what we really want to target are keywords like "christmas violin sheet music", "classical violin sheet music", and "traditional violin sheet music", I would be tempted to use those full phrases as the link text. But I am sure that would be too overwhelming for users, even with the best CSS design applied. So my idea is to combine images with text, putting those long-tail keywords inside the image ALT attribute, so the visible links remain just "Christmas", "Classical", "Traditional". That would allow a much easier UI and at the same time keep relevancy for each link. I have seen some of our competitors doing this, and they have top-notch results in the search engines. My questions are: 1. Do you see any negative effect of this kind of link from the SEO standpoint? 2. Would you suggest any better way to accomplish what I am trying to do? I am eager to know your thoughts. Thank you in advance!
Intermediate & Advanced SEO | fablau
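One way to mark up such a combined link, with the long-tail keyword carried in the image's alt text while the visible label stays short (the paths and filename here are hypothetical):

```html
<a href="/violin/christmas/">
  <img src="/images/christmas-violin.jpg" alt="Christmas violin sheet music">
  Christmas
</a>
```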
Best Practices for Converting PDFs to HTML
We're working with a client who gets about 80% of their organic, inbound search traffic from links to PDF files on their site. Obviously, this isn't ideal, because someone who just downloads a PDF file directly from a Google query is unlikely to interact with the site in any other way. I'm looking to develop a plan to convert those PDF files to HTML content, and try to get at least some of those visitors to convert into subscribers. What's the best way to go about this? My plan so far is: 1. Develop HTML landing pages for each of the popular PDFs, with the content from the PDF, as well as the option to download the PDF with an email signup. 2. Gradually implement 301 redirects for the existing PDFs, and see what that does to our inbound SEO traffic. I don't want to create a dip in traffic, although our current "direct to inbound" traffic is largely useless. Are there things I should watch out for? Will I get penalized by Google for redirecting a PDF to HTML content? Other things I should be aware of?
Intermediate & Advanced SEO | atourgates
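For the redirect step, a sketch of what Apache `.htaccess` rules might look like, mapping each popular PDF to its new landing page (the paths are hypothetical):

```apache
# 301 each legacy PDF to its HTML replacement
Redirect 301 /downloads/annual-report.pdf /reports/annual-report/
Redirect 301 /downloads/buyers-guide.pdf /guides/buyers-guide/
```

A 301 passes most link equity, so redirecting a PDF to an HTML page covering the same content is a normal migration pattern, not something that triggers a penalty.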
What is best practice for "sorting" URLs to prevent indexing and preserve link juice?
We are now introducing 5 links on all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000. Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking link juice optimization into consideration? On a technical level, the sorting is implemented in a way that reloads the whole page, and there may be better options for that as well.
Intermediate & Advanced SEO | lcourse
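One common approach for sorted views is a canonical tag on each sorting variant pointing back at the default listing, so the variants consolidate signals rather than compete (the URL below is a hypothetical example):

```html
<!-- In the <head> of /category/?sort=price_asc -->
<link rel="canonical" href="https://www.example.com/category/">
```

Note that a canonical consolidates indexing but does not stop crawling; if crawl budget is the main worry, it can be combined with a meta robots "noindex, follow" on the sorted variants.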
Do Page Anchors Affect SEO?
Hi everyone, I've been researching for the past hour and I cannot find a definitive answer anywhere! Can someone tell me if page anchors affect SEO at all? I have a client with 9 page anchors on one landing page, which means that if you scroll through the page, it is really, really long! I always thought that using page anchors instead of sending users to a dedicated landing page makes ranking for those keywords harder, because a search spider reads all the content on that one page and doesn't know how to rank it for individual keywords. Am I wrong? The client sells furniture, so on their landing page they have page anchors that jump the user down to "tables", "chairs", or "lighting", for example. You can then click on one of the product images listed in that section and go through to an individual product page. Can anyone shed any light on this? Thanks!
Intermediate & Advanced SEO | Virginia-Girtz
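For clarity, "page anchors" here are ordinary fragment links: a jump link targets an element `id` further down the same page, so no new URL is created and all sections share one page's ranking signals. A minimal sketch:

```html
<nav>
  <a href="#tables">Tables</a>
  <a href="#chairs">Chairs</a>
  <a href="#lighting">Lighting</a>
</nav>

<section id="tables"><!-- tables content --></section>
<section id="chairs"><!-- chairs content --></section>
<section id="lighting"><!-- lighting content --></section>
```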
Low text-HTML ratios
Are low text-to-HTML ratios still a negative SEO ranking factor? Today I ran a SEMrush site audit that showed 344 out of 345 pages on our website (www.nyc-officespace-leader.com) have a text-to-HTML ratio ranging from 8% to 22%. This is characterized as a warning in SEMrush, and the issue did not exist in April, when the last SEMrush audit was conducted. Is it worthwhile to try to externalize code in order to improve this ratio? Or to add text (a major project on a site of this size)? These pages generally have 200-400 words of text. Certain URLs, for example www.nyc-officespace-leader.com/blog/nycofficespaceforlease, have more text, yet still show a text-to-HTML ratio of only 16%. We recently upgraded to WordPress 4.2.1. Could this have bloated the code (CSS, etcetera) to the detriment of the ratio? If Google has become accustomed to more complex code, is this a ratio I can ignore? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
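SEMrush's exact formula isn't published, but the metric is roughly the length of the visible text divided by the length of the full page source. A rough standard-library sketch of that calculation (an approximation, not SEMrush's implementation):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def text_html_ratio(html):
    # Ratio of visible-text characters to total source characters
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html)

page = "<html><head><style>body{color:red}</style></head><body><p>Hello world</p></body></html>"
print(round(text_html_ratio(page) * 100, 1))  # percentage, SEMrush-style
```

Running this over your own templates before and after externalizing CSS would show directly how much the WordPress upgrade moved the ratio.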
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company. They have a main site on the top-level domain (TLD) and 400+ agency subdomains: company.com, agency1.company.com, agency2.company.com... I recently found that the web development team keeps a demo domain per site, on a subdomain of the original domain, mirroring it. The problem is that these have all been found and indexed by Google: demo.company.com, demo.agency1.company.com, demo.agency2.company.com... Obviously this is a problem, as it is duplicate content, so my question is: what is the best way to remove the demo domains/subdomains from Google's index? We are taking action to add a noindex tag into the header of all pages on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each demo domain as a precaution, disallowing all. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain/subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | iam-sold
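An alternative that covers every response on the demo hosts (including PDFs and other non-HTML files) is an `X-Robots-Tag` header set at the server level; an Apache sketch, assuming `mod_headers` is available on those vhosts:

```apache
# Inside each demo.* virtual host: noindex everything served
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

One caution on combining the two steps above: a robots.txt `Disallow: /` stops Googlebot from recrawling the demo pages, which means it never sees the noindex tag, so the URLs can linger in the index. Apply the noindex first, and add the robots.txt block only after the pages have dropped out.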
How To Best Close An eCommerce Site?
We're closing down one of our eCommerce sites. What is the best approach to do this? The site has a modest link profile (a young site). It does have a run of site link to the parent site. It also has a couple hundred email subscribers and established accounts. Is there a gradual way to do this? How do I treat the subscribers and account holders? The impact won't be great, but I want to minimize collateral damage as much as possible. Thanks.
Intermediate & Advanced SEO | AWCthreads
Best way to block a search engine from crawling a link?
If we have one page on our site that is only linked to by one other page, what is the best way to block crawler access to that page? I know we could set the link to "nofollow", which would prevent the crawler from passing any authority, and we can set the page to "noindex" to prevent it from appearing in search results, but what is the best way to prevent the crawler from following that one link?
Intermediate & Advanced SEO | nicole.healthline
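If the goal is specifically to stop crawling (not just indexing), robots.txt is the mechanism aimed at crawler access; a sketch with a hypothetical path:

```text
# robots.txt
User-agent: *
Disallow: /private-page.html
```

The trade-off: a disallowed URL can still appear in results (URL-only) if something links to it, and a crawler that obeys robots.txt will never see an on-page noindex there. A nofollow on the single link alone does not reliably prevent discovery either, since other paths to the URL may exist.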