Question: Directory Listing Anchor Text Best Practices
-
I have been doing some reading on directory link anchor text, and it appears the best practice for 2018 and beyond is to build your brand. For example: Sue's Shoes, whose website is shoesbysue.com.
The directory anchor text would be as follows:
shoesbysue.com
http://www.shoesbysue.com
Sue's Shoes

Now my question: if you had a competitor site sitting in the top position on Google, and at least 90% of the anchor text across all of its directory listings was
Buy Shoes Here
I do not understand how they are following best practices. What do you recommend to be safe?
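The three anchor styles in the question (naked URL, bare domain, branded) versus the competitor's exact-match anchors can be sketched as a simple classifier. This is a rough illustration using the hypothetical Sue's Shoes example from the question, not a real audit tool; real backlink profiles would need fuzzier matching.

```python
# Rough sketch: bucket backlink anchor text into the styles discussed above.
# BRAND and DOMAIN are the hypothetical example from the question.

BRAND = "sue's shoes"
DOMAIN = "shoesbysue.com"

def classify_anchor(anchor: str) -> str:
    """Classify an anchor text as naked-url, branded, or keyword-style."""
    a = anchor.strip().lower()
    # Naked / bare URL anchors: the domain itself, with or without a scheme.
    if DOMAIN in a and (a.startswith("http") or a.replace("www.", "") == DOMAIN):
        return "naked-url"
    # Branded anchors: the business name.
    if BRAND in a:
        return "branded"
    # Everything else is treated as keyword / exact-match style.
    return "keyword"

anchors = ["shoesbysue.com", "http://www.shoesbysue.com",
           "Sue's Shoes", "Buy Shoes Here"]
print([classify_anchor(a) for a in anchors])
# → ['naked-url', 'naked-url', 'branded', 'keyword']
```

The competitor's "Buy Shoes Here" links all fall into the keyword bucket, which is what makes the profile look unnatural.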
-
Just bare links and natural links go to your homepage and give it power. Then links from your homepage to other pages on the site carry the anchor text that tells Google what each page is about. Also, sometimes external sites will link to particular pages inside your website (deep linking), with or without anchor text. It's better with, but it's becoming less and less important as time goes by. I don't think we'll all be talking about anchor text, or maybe even about links at all, in a few years' time.
Some of my best pages are ranking number one nationally (for the whole of the UK) on super-high-competition terms, and they don't have one single backlink. That must tell you something...
Likewise, there are pages I've run 'link-building campaigns' for, spending weeks and months building links, and it hasn't made a blind bit of difference.
Please remember to mark the answer as a good one if you like it; it helps me get recognition for helping out. And feel free to ask more questions. I only started learning SEO in May 2017, when my site had 900 users a month. Since then I've written and optimised about 60 pages, and we now get 32,000 users a month and our business has tripled in size. Linking was the last thing on my mind; writing good content came first. That, and proving to Google that we are who we say we are with links from universities and professional bodies for the dentists. Those are the real power links. You must have suppliers who will link to you in their 'find a distributor' sections?
-
Hi Ed
Greatly appreciate the response and help, thanks! I have checked out both of those links and am working on the 50 list as we speak. You made a great point, but these are all relatively new links my competitor has added in recent weeks to months. I read a great article on how you want to use anchor text for your brand first and foremost, with less than 10% of anchor text being about the specific topic of the page. What you said makes sense, but then how does Google know the content is about that specific topic if the only anchor text is your brand name?
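That article's rough guideline (keeping topical anchors under ~10% of the profile) can be checked with a quick tally. A minimal sketch with made-up anchor data, not a real audit; the 0.9 figure mirrors the hypothetical competitor from the question.

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the total backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical competitor profile: 9 of 10 links use the same exact-match anchor.
profile = ["Buy Shoes Here"] * 9 + ["shoesbysue.com"]
dist = anchor_distribution(profile)
print(dist["buy shoes here"])  # → 0.9, far above the ~10% guideline
```

A share that lopsided is exactly the pattern the answers below warn against copying.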
-
Anchor text is only one of the many ways Google understands what a page is about, and links are only one of the many ways Google verifies a domain's authority to rank it.
Be careful of modelling your 'top competitors', because they may have been doing SEO on the site for 10 years, which means they may have all these dodgy "exact match" anchor texts and even dodgier domain listings.
You only need national, local, and hyperlocal citations for your business. The practice of adding your site to general web directories is hopelessly out of date and worth nothing; you're more likely to get a penalty.
Google have said that you may use exact-match internal anchor text liberally, but if you start using exact-match EXTERNAL anchor text (i.e. links from other sites), you'll get an algorithmic penalty or even a manual action.
Take a look here to get your first 50 links, and here for some link strategies that will help your site and are up to date, and STAY OFF THE DIRECTORIES. I answered a directories question the other day, so it should be available in the search. We've done some tests on using directories and it really, really doesn't work. Be aware this is not the same as the ESSENTIAL practice of 'citation building'.
As for anchor text, you have to make it look natural and mix it up. Just using bare URLs and the name of your site will work for now while you're building your brand. But blindly following the out-of-date tactics of the 'big players' can be suicide. All their dodgy links will still be there; it's just that Google will have updated and discounted them. But if you start acquiring bad links and over-optimising anchor text now that it's known bad practice, you'll get stung.
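One way to picture "mixing it up" is a weighted plan that leans heavily on bare URLs and the brand name. The weights and the topical anchor below are purely illustrative assumptions, not figures from this thread.

```python
import random

# Illustrative anchor plan: mostly naked URLs and branded anchors, with only
# a small share of topical anchors. All weights here are made up.
ANCHOR_PLAN = {
    "shoesbysue.com": 0.45,          # naked URL
    "Sue's Shoes": 0.45,             # branded
    "handmade leather shoes": 0.10,  # occasional topical anchor (hypothetical)
}

def sample_anchors(n: int, seed: int = 0) -> list:
    """Draw n anchor texts according to the weighted plan."""
    rng = random.Random(seed)
    texts = list(ANCHOR_PLAN)
    weights = list(ANCHOR_PLAN.values())
    return rng.choices(texts, weights=weights, k=n)

print(sample_anchors(5))
```

The point is only that the distribution stays dominated by bare-URL and branded anchors, in line with the advice above.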
Hope this helps. Please remember to mark it helpful if it's helped!