How many directory submissions per day should we do?
-
Hello Moz Members,
I have read many forums and articles where people discuss how many directory submissions to do per day. Please clarify the questions mentioned below.
-
Is there a per-day limit for directory submissions? If so, how many?
-
Can getting more links from directory submissions hurt my site?
Regards & Thanks,
Chhatarpal Singh
-
-
_This is the eternal dilemma of an SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Instead, take a different approach here. Think like a general user. Would you love to see your website listed in that directory? Do you believe the directory in question would be able to drive some traffic to your website? If the answer is yes, go ahead mate, get your website listed there. Google or no Google, your website will benefit in the end._
-
When I said "good directories" I meant http://www.seomoz.org/directories/ Do you think these directories will trigger a Penguin signal? As for my second piece of advice, do it at the right pace; a few factors determine how much to do per unit of time. Obviously, I will agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th.
Mr. Singh didn't say which "directory submission" he meant. There are very good directories that we should use; does anybody disagree?
Regarding the other part about a "day limit" for directory submissions, the pace is based on two factors:
- How many links do you have now, and what is their quality?
- Can you keep the same pace over time, month in and month out?
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder and harder every day. You need to diversify more and make sure that all the links pointing to you look as natural as possible. It's not a bad thing to do some linking yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all relevant social avenues open to you: Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on, as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link building strategies should I apply to rank my keywords on Google's first page?
-
I think that they can be harmful to your site - especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.