How many directory submissions per day should we do?
-
Hello Moz Members,
I have read many forums and articles where people discuss how many directory submissions to do per day. Please clarify my questions, which are mentioned below.
-
Is there a per-day limit for directory submissions? If so, how many?
-
Can getting more links from directory submissions hurt my site?
Regards & Thanks,
Chhatarpal Singh
-
_The eternal dilemma of an SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Take a different approach here instead. Think like a general user. Would you love to see your website listed in that directory? Do you believe that the directory in question would be able to drive some traffic to your website? If the answer is yes, go ahead, mate. Get your website listed there. Google or no Google, your website will benefit in the end._
-
When I said "good directories" I meant http://www.seomoz.org/directories/. Do you think these directories will cause a Penguin signal to be triggered? As for my second piece of advice, do it at the right pace. A few factors determine how much to do per unit of time. Obviously I will agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th.
Mr. Singh didn't say what kind of "directory submission" he meant. There are very good directories that we have to use; does anybody disagree?
Regarding the other part about a "day limit" for directory submissions, the pace is based on two factors:
- how many links do you have now, and what is their quality?
- can you keep the same pace over time, month in, month out?
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder and harder every day. You need to diversify more and make sure that all those links to you look as natural as possible. It's not a bad thing to do some linking yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all the relevant social avenues open to you... Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on, as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link building strategies should I apply to get my keywords ranked on Google's first page?
-
I think that they can be harmful to your site - especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.
Related Questions
-
How are server-side redirects perceived compared to direct links (on a directory site)?
Hi, I'm creating some listings for a client on a relevant B2B directory (a good-quality directory). I asked if the links are 'followed' or 'nofollowed' and they said they are 'server-side redirects', so no direct links. Does anyone know how these are likely to be perceived by Google? (A quick way to check what such a link actually returns is sketched below.) All best, Dan
Technical SEO | Dan-Lawrence1
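In case it helps to see what a crawler actually receives from such a listing, here is a minimal sketch (assuming Python with the `requests` library; the directory URL is a hypothetical placeholder) that fetches the outbound link without following redirects and prints the status code and redirect target:

```python
import requests

# Hypothetical outbound link on the directory listing (replace with the real one).
directory_link = "https://www.example-directory.com/go?listing=1234"

# Fetch without following redirects so we see exactly what a crawler gets first.
response = requests.get(directory_link, allow_redirects=False, timeout=10)

print("Status code:", response.status_code)                   # e.g. 301, 302 or 200
print("Redirect target:", response.headers.get("Location"))   # None if there is no redirect

if response.status_code in (301, 308):
    print("Permanent server-side redirect")
elif response.status_code in (302, 307):
    print("Temporary server-side redirect")
else:
    print("No redirect - the page links or serves content directly")
```

A permanent (301/308) server-side redirect is generally treated much like a direct link, while a temporary (302/307) one is less predictable; the sketch only shows how to find out which kind the directory is using.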
Will a robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi - I have a client that had thousands of dynamic PHP pages indexed by Google that shouldn't have been. He has since blocked these PHP pages via a robots.txt disallow. Unfortunately, many of those PHP pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the PHP 'disallow'. If we create 301 redirects for some of these PHP URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away so that we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301. Will the robots.txt keep Google from seeing the canonical tags on the PHP pages? (A quick way to check what the disallow rule actually blocks is sketched below.) Thanks very much, V
Technical SEO | Voodak0
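For what it's worth, one quick way to confirm what the disallow rule actually blocks is to run the live robots.txt through a parser. A minimal sketch, assuming Python's standard `urllib.robotparser` and hypothetical URLs: if `can_fetch` returns False for Googlebot, the crawler will not request the old PHP URL at all, so it will never see a 301 or a canonical tag placed on it.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site and blocked dynamic URL (replace with the real ones).
robots_url = "https://www.example.com/robots.txt"
old_php_url = "https://www.example.com/product.php?id=42"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # downloads and parses the live robots.txt

# False means the URL is blocked for Googlebot, so any 301 or canonical
# tag on that URL will never be crawled or seen.
print(parser.can_fetch("Googlebot", old_php_url))
```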
How Long To Recover Rankings After Multi-Day Site Outage?
Hi, A site we look after for a client was down for almost 3 days at the start of this month (11th - 14th of May, to be exact). This was caused by my client's failure to verify their domain name in accordance with the new ICANN procedures. The details are unimportant, but it took a long while for them to get their domain name registration contact details validated, hence the outage. Very soon after this downtime we noticed that the site had slipped back in the Google rankings for most of the target keywords, sometimes quite considerably. I guess this is Google penalizing this client for their failure to keep their site live. (And they really can't have too many complaints about this, in my opinion.) The good news is that the rankings show signs of improving again slightly. However, they have not recovered all the way to where they were before the outage, two weeks ago. My question is this... do you expect that the site will naturally regain its previous excellent rankings without us doing anything? If so, how long do you estimate this could take? On the other hand, if Google typically penalizes this kind of error 'permanently', is there anything we can do to help signal to Google that the site deserves to get back up to where it used to be? I am keen to get your thoughts, and especially to hear from anyone who has faced a similar problem in the past. Thanks
Technical SEO | smaavie0
Too many links?
Hello! I've just started with SEOmoz and am getting an error about too many links on a few of my blog posts - it's pages with high numbers of comments, and the links are coming from each commenter's profile (hopefully that makes sense; they're not just random stuffed links). Is there a way to keep this from causing a problem? Thanks!
Technical SEO | PaulineMagnusson0
It's now been 12 days since Google notified me that my manual link penalty was revoked... how long before I'm reindexed?
Hi, It's been 12 days since Google revoked the manual link penalty they had on me, and I'm still ranking 335 for my domain keyword; there has been no movement yet. Googlebot spiders my site daily, and I have also tried resubmitting the page from within GWT (Fetch Web Page). How long do you think it should be before I see some movement? Or should I file another reconsideration request, just in case they forgot to remove the penalty and only thought they did. Thanks,
Technical SEO | Robdob20130
Same day our sitemap finished processing in G WebTools, our SERP results tanked!
Need a little help troubleshooting an SEO issue. The day our sitemap finished processing in Google Webmaster Tools, almost all of our keyword SERP results tanked. Our top 4 keywords routinely placed at rank 11 - 33 in SERP results, and now they're not even in the top 200. Would the sitemap processing have anything to do with this, or should I look somewhere else? FYI: the site is built in DNN, the sitemap is fine and the robots.txt file is good. Open to all suggestions!!
Technical SEO | Firecracker0
Too many links on my site
Hi there everybody, I am a total SEO newbie and I am burning with questions. I had my site crawled and found out that it contains too many links. The reason is that it is a site where I constantly write news and articles, and each one of them is a new Joomla item, thus a new link. I actually thought lots of content is good for SEO. How am I supposed to reduce the number of links?
Technical SEO | polyniki0
How to safely reduce the number of 301 redirects / should we be adding so many?
Hi All, We lost a lot of good rankings over the weekend with no obvious cause. Our top keyword went from p3 to p12, for example. Site speed is pretty bad (slower than 92% of sites!) but it has always been pretty bad. I'm on to the dev team to try and crunch this (beyond image optimisation), but I know that something I can affect is the number of 301 redirects we have in place. We have hundreds of 301s because we've been, perhaps incorrectly, adding one every time we find a new crawl error in GWT and it isn't because of a broken link on our site, or it's on an external site where we can't track down the webmaster to fix the link. Is this bad practice, and should we just ignore 404s caused by external broken URLs? If we wanted to reduce these numbers, should we think about removing the ones that are only in place due to external broken URLs? Any other tips for safely reducing the number of 301s? (One way to audit the existing redirects is sketched below.) Thanks, all! Chris
Technical SEO | BaseKit0
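A small audit can make the pruning decision easier. Here is a minimal sketch (assuming Python with the `requests` library; the source URLs are hypothetical placeholders for entries in the existing 301 map) that follows each redirect chain and reports where it ends and how many hops it takes, so redirects that now lead nowhere useful, or that stack multiple hops, are easy to spot:

```python
import requests

# Hypothetical sample of redirect source URLs from the existing 301 map (replace with real ones).
redirect_sources = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2.php?id=7",
]

for url in redirect_sources:
    # Follow the full chain so we can count the hops and see the final destination.
    response = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(response.history)  # each entry in history is one redirect hop
    print(f"{url} -> {response.url}  (final status {response.status_code}, {hops} hop(s))")
```

Chains of more than one hop, and redirects whose final destination is itself a 404, are usually the first candidates for cleanup.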