Are directory listings still appropriate in 2013? Aren't they old-style SEO and Penguin-worthy?
-
We have been reviewing our off-page SEO strategy for clients, and as part of that process we have been looking at a number of superb infographics on the subject. I see that some of the current ones still list "Directories" as part of their off-page strategy.
Aren't these directories mainly there for link-building purposes, providing users no real benefit? I don't think I've ever seen a directory I would actually use, apart from for SEO research.
Surely Google's Penguin algorithm would see directories the same way and give them less value, or even penalise websites that use them to try to boost PageRank?
If I were to list my websites on directories, it wouldn't be to share my lovely content with people who use directories to find great sites; it would be to sneakily build PageRank.
Am I missing the point?
Thanks
Scott -
Thanks, I appreciate your response.
Scott
-
Hi, thanks for your reply. One of my clients builds garden offices, and I have explored a lot of garden-related directories. There are loads. Defining which are worthwhile isn't always straightforward. Some sites look good at first glance, but when you dig deeper they seem pretty spammy. I got caught out with one website where, when I added my listing, it appeared on thousands of pages across a number of sites under an umbrella called Durokon, all horribly similar. I still can't work out if that site is real or not. I tried to contact them, but no luck, so I'm now assuming it's a link farm of sorts.
I suppose we need to be more critical when doing work like this.
Thanks
Scott -
I think any directory that is either niche relevant or local is still valuable! Especially in terms of local optimization because they also serve as citations!
-
95% of directories exist purely to pass PageRank, and I wouldn't use them for link building. There are only a few, like the Yahoo! Directory or DMOZ, that I would consider safe.
-
Scott,
You are right - the majority of directories don't really seem to provide much real value to actual users. I would suggest being extremely critical and conservative if you do decide to pursue 'general' web directory links.
There are niche directories out there, such as directories for green businesses or electrical contractor directories. These types of hyper-relevant directories add value, and Google responds well to them. The more general directories are the ones that tend to attract Penguin's wrath. Hope that helps!
Related Questions
-
Duplicate Content Product Descriptions - Technical List Supplier Gave Us
Hello, Our supplier gives us a small paragraph and a list of technical features for our product descriptions. My concern is duplicate content. Here's my current plan:
1. Write as much unique content (rewriting the paragraph and adding to it) as there are words in the technical description list, so the page is half unique content, half duplicate content.
2. Reword the technical descriptions (though this is not always possible).
3. Use a custom H1, title tag, and meta description.
My question is: is the list of technical specifications going to create a duplicate content issue? In other words, how much unique content has to be on the page so that a list that is the same across the internet does not hurt us? Or do we need to rewrite every technical list? Thanks.
White Hat / Black Hat SEO | | BobGW0 -
Bad keywords sending traffic to my site, but I can't find the source. Advice?
Hi! My site seems to be the target of negative SEO (or some ancient black hat work that's just now coming out of the woodwork). We're getting traffic from keywords like "myanmar girls" and "myanmar celebrities" that just started in late June and only directs to our homepage. I can't seem to find the source of the traffic, though (Analytics just shows it as "Google," "Bing," and "Yahoo" even though I can't find our site showing up for these terms in search results). Is there any way to ferret out the source besides combing through every single link that is directing to us in Webmaster Tools? I'm not even sure that GWT has picked up on it since this is fairly new, and I'd really love to nip this in the bud. Thoughts? Thanks in advance!
White Hat / Black Hat SEO | | 199580 -
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".
Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Their explanation: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, on the backend of our site we would be detecting the user agent of all traffic, and once we found a search bot, we would serve up our web pages server-side instead of client-side so the bots can index our site. The server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | | Bodybuilding.com0 -
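The bot-detection step described in that question can be sketched in a few lines of Node.js. This is only an illustration of the general pattern, not the poster's actual setup or Dust.js's own API; the function name and the bot list are hypothetical:

```javascript
// Hypothetical sketch: decide whether a request should receive the
// server-rendered page. The regex and function name are illustrative
// and not taken from any real implementation.
const BOT_PATTERN = /googlebot|bingbot|yandexbot|baiduspider/i;

// Returns true when the user agent looks like a known search crawler.
function shouldRenderServerSide(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

console.log(shouldRenderServerSide(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(shouldRenderServerSide(
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/58.0'
)); // false
```

The key point in the question above is that both branches would render the same template with the same data, so the markup delivered to crawlers and users is identical; that sameness is what the poster argues distinguishes the approach from cloaking.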
Does the SEOmoz Suggested Directory List Need to be Updated?
So, since Google updated their link schemes page (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66356) to say avoid "low-quality directories", I've been thinking a lot about what makes a directory "low-quality". Obviously this is important, or Google wouldn't have mentioned it. I was wondering if someone could explain to me how some of the directories suggested by SEOmoz at http://www.seomoz.org/directories are NOT low-quality, specifically some of the ones marked "General". The page lists sites like busybits.com, for instance. One that I guess many are aware of, and yes, it has a high home-page PageRank, it's got some history, and it's human-edited. Great. But does it actually add any value to anyone who isn't just looking to get a link? A page like http://busybits.com/Business/Others/2/ has (dofollow) listings like "Phone cards, Calling cards", "Insurance in Canada", etc. It just looks like an SEO backlink hub, with no value at all to a user trying to discover new sites or content. Anyway, back to my main question: how is something like this NOT "low-quality"? Thank you
White Hat / Black Hat SEO | | MadeLoud4 -
Got an SEO package, paid $400+ for it, basically got scammed.
Hi guys, I know this is stupid, but I bought an SEO package for around $400. I received the report, and my... it was a complete load of spam. It was basically a blast to lots of sites with random articles and my anchor texts all over the place. There are thousands of these links and the articles don't make sense. I'm not sure what I'm going to do! This is my main ecommerce website and I'm worried. I've complained and I hope to get a refund; however, I'm worried he's going to just blast my site and get me penalized by Google. It is clearly black hat. Is there anything I can do? I'm very worried. Thanks
White Hat / Black Hat SEO | | Superinks0 -
SEO Experiment with Google Docs
Please check out this doc - https://docs.google.com/document/d/19VS4SnVvq6VJHQAIrB3CX7iL1ivZU4DH6fyfrHLsNFk/edit Any insights will be highly appreciated! Oleksiy
White Hat / Black Hat SEO | | wcrfintl0 -
French (Canadian) Directories? Know of any?
There's never any love for Quebec and the horrible French we speak! However, I obviously still need to rank in French SERPs. Does anyone know of any good directories or possible backlink sources that are Quebec/Montreal/French/Canadian-ish? Thanks, folks!
White Hat / Black Hat SEO | | deuce1s0 -
Why Proven Spammers Are on 1st-Page Google SERPs
This question relates exclusively to a few proven spammers who have gained 1st-page Google search results for specific terms in the Greek market, targeting a Greek audience. Why do they look like spammers and very suspicious? For instance, the sites epipla-sofa.gr, sofa.gr, fasthosting.gr and greekinternetmarketing.com look suspicious regarding their link-building activities:
1. Suspicious spiky link growth
2. Several links from unrelated content (unrelated blog posts from other markets, paid links, hidden links)
3. An excessive amount of suspicious link placements (forum profiles, blog posts, footer and sidebar links)
4. Greek anchor text with the keyword inside articles written in foreign languages (total spam)
5. Unnatural anchor text distribution (too many repetitions)
So the main question is: why is Google unable to recognize or trace some of these (or even all) obvious spamming tactics, while these spammy sites still reside on the 1st page of Google.gr SERPs? Examples of spam sites according to their link-building history: www.greekinternetmarketing.com, www.epipla-sofa.gr, www.fasthosting.gr, www.sofa.gr. All their links look very similar. They probably use software to build links, or even hack authority sites and leave hidden links (I really don't know how they could do that). Could you please explain or share similar issues? Have you ever found similar cases in your industry, and how did you tackle them? We would appreciate your immediate attention to this matter. Regards, George
White Hat / Black Hat SEO | | Clickwisegr0