Are directory listings still appropriate in 2013? Aren't they old-style SEO and Penguin-worthy?
-
We have been reviewing our off-page SEO strategy for clients, and as part of that process we have been looking at a number of superb infographics on the subject. I see that some of the current ones still list "Directories" as part of their off-page strategy.
Aren't these directories mainly there for link-building purposes, providing users no real benefit? I don't think I've ever seen a directory that I would actually use, except for SEO research.
Surely Google's Penguin algorithm would see directories the same way and give them less value, or even penalise websites that use them to try to boost PageRank?
If I were to list my websites on directories, it wouldn't be to share my lovely content with people who use directories to find great sites; it would be to sneakily build PageRank.
Am I missing the point?
Thanks
Scott -
Thanks, I appreciate your response.
Scott
-
Hi, thanks for your reply. One of my clients builds garden offices, and I have explored a lot of garden-related directories. There are loads. Deciding which ones are worthy isn't always straightforward. Some sites look good at first glance, but when you dig deeper they seem pretty spammy. I got caught out with one website: when I added my listing, it appeared on thousands of pages across a number of sites under an umbrella called Durokon, all horribly similar. I still can't work out whether that site is real. I tried to contact them, but no luck, so I'm now assuming it's a link farm of sorts.
I suppose we need to be more critical when doing work like this.
Thanks
Scott -
I think any directory that is either niche-relevant or local is still valuable, especially in terms of local optimization, because those listings also serve as citations!
-
95% of directories exist purely to pass PageRank, and I wouldn't use those for link building. There are only a few, like Yahoo or DMOZ, that I would consider safe.
-
Scott,
You are right: the majority of directories don't really provide much real value to actual users. I would suggest being extremely critical and conservative if you do decide to pursue 'general' web directory links.
There are niche directories out there, such as directories that feature green businesses, or electrical contractor directories. These hyper-relevant directories add value, and Google responds well to them. The more general directories are the ones that tend to attract Penguin's wrath. Hope that helps!
Related Questions
-
How to find a trustworthy SEO specialist?
How do you find a trustworthy SEO specialist if you don't know much about SEO yourself?
White Hat / Black Hat SEO | DigiVital1 -
Regular links may still
Good day: I understand guest articles are a good way to pass link juice, and some authors have a link to their website in the "Author Bio" section of the article. These links are usually regular (followed) links. However, I noticed that some of these sites (using WordPress) have SEO plugins with the following setting: "Nofollow: Tell search engines not to spider links on this webpage." My question is: if that setting were activated, I assume the author's website link would still look like a regular link, but other code could be present on the page (e.g. in the header) that would prevent this regular link from being followed, so the guest writer would not receive any link juice. Is there any way to see whether this scenario is happening? What code would we look for?
White Hat / Black Hat SEO | Audreythenurse0 -
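For the nofollow question above, the "code to look for" can be checked mechanically. Here is a minimal sketch using only Python's standard-library HTML parser (the page markup below is an invented example, not from any real site); it looks for the two common signals: a robots meta tag containing `nofollow`, and `rel="nofollow"` on the link itself:

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects the signals that would stop link equity flowing from a page."""

    def __init__(self):
        super().__init__()
        self.page_nofollow = False   # <meta name="robots" content="...nofollow...">
        self.nofollow_links = []     # hrefs of <a> tags carrying rel="nofollow"

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Page-wide setting, typically written into <head> by an SEO plugin
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            if "nofollow" in (attrs.get("content") or "").lower():
                self.page_nofollow = True
        # Per-link setting on the anchor itself
        if tag == "a" and "nofollow" in (attrs.get("rel") or "").lower():
            self.nofollow_links.append(attrs.get("href"))

page = """
<html><head><meta name="robots" content="noindex, nofollow"></head>
<body><a rel="nofollow" href="https://example.com/author">Author Bio</a></body></html>
"""

checker = NofollowChecker()
checker.feed(page)
print(checker.page_nofollow)   # True: the plugin setting applies to the whole page
print(checker.nofollow_links)  # ['https://example.com/author']
```

One caveat: a site can also send the same directive in an `X-Robots-Tag` HTTP response header, which never appears in the page source, so checking the HTML alone is not conclusive.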
Still seeing a terrible rank drop after last algo update?!
I'm still stumped as to why the rankings have dropped so badly on a white-hat site (see attached image). As you can see, we had been steadily improving the rankings over the last 6+ months, and then we got hit with a massive change this month. I can't see any issues, and Moz isn't reporting anything negative that would have such a major effect. The drops weren't subtle either; they've all gone into the 50+ range! Any insights into what may have changed in the latest algo update would be appreciated! S0sD7d8.png
White Hat / Black Hat SEO | snowflake740 -
Why did this fabric site disappear for "fabric" and why can't we get it back?
Beverlys.com used to rank on the first page for "fabric." I'm trying to get the date of their demise, but I don't have it yet, so I can't pinpoint which Google update might have killed them, though I can guess. In doing a backlink analysis, there were hundreds of poor-quality, toxic sites pointing to them. We have carefully gone through them all and submitted a disavow request. They are now on page 9, up from nowhere to be found a week ago. But, of course, that's not good enough. They are on page 2 for "fabric online" and "quilt fabric," so Google doesn't completely hate them, but it doesn't love them enough even for those terms. Any suggestions? They are rebuilding the site on a different ecommerce platform with new content and new structure. They will also be incorporating the blog within the site, and I've advised them on many other ways to attract traffic and backlinks. That's coming. But for now, any suggestions and help will be much appreciated. Something has got to be holding them back for that one gem of a keyword. Also, I would like to know what experiences others have had with the disavow request form. Does Google absolutely hold you to making every attempt you can at getting those links removed? And how does it know? No one responds, so it seems to be such a waste of time. And many now actually charge to remove your links. Thoughts? Thanks everyone!
White Hat / Black Hat SEO | katandmouse0 -
Victim of Negative SEO - Can I Redirect the Attacked Page to an External Site?
My site has been a victim of Negative SEO. During the course of 3 weeks, I have received over 3000 new backlinks from 200 referring domains (based on Ahref report). All links are pointing to just 1 page (all other pages within the site are unaffected). I have already disavowed as many links as possible from Ahref report, but is that all I can do? What if I continue to receive bad backlinks? I'm thinking of permanently redirecting the affected page to an external website (a dummy site), and hope that all the juice from the bad backlinks will be transferred to that site. Do you think this would be a good practice? I don't care much about keeping the affected page on my site, but I want to make sure the bad backlinks don't affect the entire site. The bad backlinks started to come in around 3 weeks ago and the rankings haven't been affected yet. The backlinks are targeting one single keyword and are mostly comment backlinks and trackbacks. Would appreciate any suggestions 🙂 Howard
White Hat / Black Hat SEO | howardd0 -
Duplicate content or not? If you're using abstracts from external sources you link to
I was wondering if a page (a blog post, for example) that offers links to external web pages, along with abstracts from those pages, would be considered a duplicate-content page and therefore be penalized by Google. For example, I have a page that has very little original content (just two or three sentences that summarize or frame the topic) followed by five references to different external sources. Each reference contains a title, which is a link, and a short abstract, which is basically the first few sentences copied from the page it links to. So, except for a few sentences at the beginning, everything is copied from other pages. Such a page would be very helpful for people interested in the topic, as the sources it links to have been analyzed beforehand, handpicked, and placed there to enhance the user experience. But will this format be considered duplicate or near-duplicate content?
White Hat / Black Hat SEO | romanbond0 -
Same content, different target area SEO
So ok, I have a gambling site that I want to target separately for Australia, Canada, the USA, and England, and still have .com for worldwide (or not, read further). The website's content basically stays the same for all of them, perhaps with small changes to layout and information order (a different order for the top 10 gambling rooms). My question 1 would be: how should I mark the content for Google and other search engines so that it would not be considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain -> we will need to set up geo-targeting to the specific country in it.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: should I target .com for the USA market, or are there other options? (We're not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno0 -
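The hreflang point above can be illustrated with a short markup sketch. This is only an example pattern (the example.* domains are placeholders, not real sites): each country page carries the full set of alternates, including a self-reference, and `x-default` covers visitors who match none of the listed locales:

```html
<!-- In the <head> of every version, e.g. the UK page: -->
<link rel="alternate" hreflang="en-GB" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-AU" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-CA" href="https://www.example.ca/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that hreflang annotations must be reciprocal: if the .co.uk page lists the .com as an alternate, the .com page must list the .co.uk back, or Google ignores the pair.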
I think I've been hit by Penguin - Strategy Discussion
Hi, I have a network of 50 to 60 domain names which have duplicated content and whose domains are basically a geographical location + the industry I am in. All of these websites link to my main site. Over the weekend I saw my traffic fall. I attribute our drop in rankings to what people are calling Penguin 1.1. I want to keep my other domains, as we are slowly creating unique content for each of those sites. However, in the meantime, I clearly need to deal with the inbound linking and anchor text problem. Would adding a nofollow tag to all links that point to my main site resolve my issue with Google's Penguin update? Thanks for the help.
White Hat / Black Hat SEO | MangoMan160