Client Question - How long does it take for a keyword to rank?
Is there any tool or calculation to estimate the time it will take a particular keyword to rank?
-
That is a very open-ended question and one I wouldn't even try to answer.
Tell your client that there are hundreds of factors at work in Google's algorithms and that this is completely out of your control. Even Google couldn't tell you the answer to that one.
If they want to know more, tell them that each of the 200+ primary ranking signals has its own set of criteria. I think it was 'assessing how links appear on a page' that alone had something like 2,000+ individual signals that Google used. Now imagine that each of the 200 primary signals has 2,000 signals of its own - that's hundreds of thousands of possibilities Google will be looking at when ranking a site.
-Andy
-
There's really no way to answer that, as each keyword will have different factors. Each niche will also be treated differently, and each competitor has their own way of doing things versus how you are doing it.
For an easy niche with good competitors who are not active, three months or less would be a reasonable timeframe to get a site to rank. It's better to underestimate and overdeliver.
For mega keywords, some will take a year or so just to crack the front page; some will take six months, and if promotion is good, even less. It will depend on your experience in the niche, so it's hard to say exactly.
Related Questions
-
XML Sitemap Question!
Hi All, I know that the sitemaps.xml URL must be findable, but what about the sitemaps/pageinstructions.xml URL? Can we safely noindex the sitemaps/pageinstructions.xml URL? Thanks! Yael
Intermediate & Advanced SEO | yaelslater
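For context on the noindex part: an XML file can't carry a meta robots tag, so a noindex there would have to be sent as an HTTP header instead. A minimal sketch, assuming an Apache server with mod_headers enabled (the server type is an assumption, not stated in the question):

```apache
# .htaccess sketch: send a noindex header for this one XML file only.
<Files "pageinstructions.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```
-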
Robots.txt question
I noticed something weird in Google's robots.txt tester. I have the line Disallow: display= in my robots.txt, but whatever URL I give it to test, it says blocked and points to that line. For example, the line is meant to block pages like http://www.abc.com/lamps/floorlamps?display=table, but if I test http://www.abc.com/lamps/floorlamps or any other page, it shows as blocked due to Disallow: display=. Am I doing something wrong, or is Google just acting strange? I don't think pages without display= are actually blocked.
Intermediate & Advanced SEO | rbai
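One possible explanation, offered as an assumption rather than a confirmed diagnosis: Google's robots.txt documentation expects Disallow values to begin with a /, so a bare display= may be parsed unpredictably by the tester. A hedged sketch of the rule rewritten with an anchored wildcard:

```
User-agent: *
# Matches any URL containing display= after the root,
# e.g. /lamps/floorlamps?display=table, but not /lamps/floorlamps
Disallow: /*display=
```
-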
Client site is lacking content. Can we still optimize without it?
We just signed a new client whose site is really lacking in terms of content. Our plan is to add content to the site in order to achieve some solid on-page optimization. Unfortunately the site design makes adding content very difficult! Does anyone see where we may be going wrong? Is added content really the only way to go? http://empathicrecovery.com/
Intermediate & Advanced SEO | RickyShockley
-
Question about multiple websites in same field
I know most people say it is best to have only the one website for focus, but if we can put this to the back of our minds: if we create 2 different websites with totally different designs (one upmarket and one targeting the cheaper market) but in the same field (printing) and go after 80% of the same keywords, is this OK (could we be penalized)? Please note we will not be interlinking the websites, the websites will be on different servers, and the names will be registered under different people (2 partners in the business). We will, however, be accessing Webmaster Tools from the same location.
Intermediate & Advanced SEO | BobAnderson
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content and confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we removed from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way… I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
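As an editorial aside on the 410 route discussed above: on Apache, mod_alias can answer removed URLs with an explicit 410 Gone rather than a redirect. A minimal sketch, assuming an Apache server and with the paths invented for the example (the question doesn't name any):

```apache
# .htaccess sketch: return 410 Gone for removed pages so Googlebot
# can recrawl them and confirm the content is really gone.
Redirect gone /old-category/
Redirect gone /removed-article.html
```

Served this way, a recrawl sees an unambiguous removal signal instead of a 301 that points the old URL somewhere else.
-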
Duplicate Content Question
Brief question - SEOmoz is telling me that I have duplicate content on the following two pages: http://www.passportsandvisas.com/visas/ and http://www.passportsandvisas.com/visas/index.asp. The default page for the /visas/ directory is index.asp, so it's effectively the same page - but apparently SEOmoz, and more importantly Google etc., treat these as two different pages. I read about 301 redirects etc., but in this case there aren't two physical HTML pages - so how do I fix this?
Intermediate & Advanced SEO | santiago23
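Since both URLs serve the same physical file, one common fix needs no second page at all: a rel=canonical tag in the shared template. A minimal sketch, assuming the folder URL is the preferred version (that preference, and the exact placement in index.asp, are assumptions):

```html
<!-- In the <head> of index.asp; both URLs then name the folder version as canonical. -->
<link rel="canonical" href="http://www.passportsandvisas.com/visas/" />
```
-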
What is better for SEO: keywords in folder or in filename? Also a dupe filename question
Hey folks, I've got a question regarding URL structure. What is best for SEO, given that there will be millions of lawyer names and 4 pages per lawyer?
www.lawyerz.com/office-locations/dr-al-pacino
www.lawyerz.com/phone-number/dr-al-pacino
www.lawyerz.com/reviews/dr-al-pacino
www.lawyerz.com/ratings/dr-al-pacino
OR
www.lawyerz.com/office-locations-dr-al-pacino
www.lawyerz.com/phone-number-dr-al-pacino
www.lawyerz.com/reviews-dr-al-pacino
www.lawyerz.com/ratings-dr-al-pacino
OR
www.lawyerz.com/dr-al-pacino/office-locations
www.lawyerz.com/dr-al-pacino/phone-number
www.lawyerz.com/dr-al-pacino/reviews
www.lawyerz.com/dr-al-pacino/ratings
Also, concerning duplicate file names:
In the first example there are 4 duplicate file names with the lawyer's name (would this cause Google to not index some?).
In the second example the file names are all unique (would this look spammy to Google or the user?).
In the third example there are millions of duplicate file names - if 1 million lawyers, then 1 million files called "office-locations" etc. (could so many duplicate filenames cause ranking issues?).
Should the lawyer's name (which is the main keyword target) appear in the filename or in the folder - which is better for SEO in your opinion? Thanks for your input!
Intermediate & Advanced SEO | irvingw
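For illustration only, the third structure could be routed with a single rewrite rule; this is a hypothetical Apache mod_rewrite sketch, with profile.php and its parameters invented for the example:

```apache
RewriteEngine On
# Map /dr-al-pacino/office-locations to profile.php?lawyer=dr-al-pacino&page=office-locations
RewriteRule ^([a-z0-9-]+)/(office-locations|phone-number|reviews|ratings)/?$ profile.php?lawyer=$1&page=$2 [L,QSA]
```

One upside of the folder form is exactly this kind of routing: one rule covers all four page types without any filename collision.
-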
To "Rel canon" or not to "Rel canon" that is the question
Looking for some input on an SEO situation that I'm struggling with. I guess you could say it's a usability vs Google situation. The situation is as follows:
On a specific shop (let's say it's selling t-shirts), the products are sorted as follows: each t-shirt has a master and x number of variants (a color). We have a product listing, and in this listing all the different colors (variants) are shown. When you click one of the t-shirts (e.g. blue) you get redirected to the product master, where some code on the page tells the master that it should change the color selectors to the blue color. This information the page gets from a query string in the URL.
Now I could let Google index each URL for each color and sort it out that way, except for the fact that the text doesn't change at all. The only thing that changes is the product image, and that is changed with ajax in such a way that Google, most likely, won't notice - ergo producing "duplicate content" problems.
OK! So I could sort this problem with a "rel canon", but then we are in a situation where the only thing that tells Google we are talking about a blue t-shirt is the link to the master from the product listing. We end up in a situation where the master is the only one getting indexed - not a problem, except for when people come from Google directly to the product: I have no way of telling what color the customer is looking for and hence won't know what image to serve her.
Now I could tell my client that they have to write a unique text for each variant, but with hundreds of thousands of variant combinations this is not realistic or a really good solution. I kinda need a new idea - any input, idea or brain wave would be very welcome. 🙂
Intermediate & Advanced SEO | ReneReinholdt
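A sketch of what the "rel canon" option would look like on the variant URLs, with the shop domain and paths invented for the example:

```html
<!-- Served on every color variant, e.g. /shop/t-shirt?color=blue -->
<!-- Each variant points at the master, which stays the only indexed URL. -->
<link rel="canonical" href="http://www.example.com/shop/t-shirt" />
```

The trade-off the question describes still stands: with only the master indexed, the color a visitor wanted has to be recovered some other way, such as from the query string they arrive with.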