Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content, and many posts will still be viewable, we have locked both new posts and new replies.
Was moving up in SERPS then Got Stuck on Page 2
-
Hi,
I was continuously acquiring quality backlinks, and my site was moving up in the Google SERPs for three main keywords. Within a few weeks I was on pages 2 and 3 for these keywords, but after reaching there I got stuck at those positions despite no change in my link-building strategy or pattern. I have even increased the number and quality of links I acquire per day, but I am still stuck at exactly the same positions.
The website is 10 months old and in a software niche. I update it once a week.
For one keyword I am stuck at position 1 of page two (you can well imagine the frustration..!!).
My question is: what do I need to do to get out of this "SERP lock"?
-
Thanks all for the answers.
Yes, EGOL, it looks like I need jolt (the rate of change of acceleration) rather than just acceleration or speed, as I am up against companies with $10B+ market caps. These guys love to mop the SERP floor with smaller competitors.
Are there any other factors you consider relevant in off-page SEO, in addition to the rate of increase in links per day and the quality/relevance of those links?
-
Hi,
This is a great question, and many people find themselves right where you are. Both Ryan and EGOL have given you great responses, and I agree with them both. A lot goes into whether and when your site will move up or down. Personally, I feel the closer you get to page 1, the more fine-tuning, and possibly more effort, it will take to make the leap to the top of page one.
Sounds like you are doing pretty well for a very young site. Good job and good luck!
-
Did you take physics in high school or college?
Your question is similar to..... "What is the difference between speed and acceleration?"
Just because you are "continuously acquiring quality back-links" doesn't mean that your competitors are sitting on their butts.
If you are gaining ten links per day but the guy above you is gaining twenty you will never ever catch him....
.... and if he is on the third page then the guys near the top of page two might be gaining hundreds per day.
Every time you move up a position in the SERPs the website directly above you will require a greater level of effort to defeat.
That means you gotta press the accelerator down harder and harder as you move up the SERPs. Will you have it floored before you reach the top of page two?
In lightly competitive SERPs you might be able to defeat everyone... but when you get into the heavyweight SERPs the increasing competition will at some point be more than most people can muster.
That's where you hit "SERP lock" as you call it.
Keep in mind that the people behind you are working hard too..... you might press the pedal to the floor and still see people from behind passing you by.
My personal opinion is... of the people holding SERP positions today, more will rank lower by this time next year than will rank higher. Really. They are going to be displaced by existing heavyweights expanding their reach and by new heavyweights starting to accelerate.
Pick ten SERPs in different niches that interest you and record who is in position #5. Then come back in a year and look for them. More will go down than up.
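EGOL's speed/acceleration/jolt analogy can be made concrete with a toy calculation. This is purely illustrative; the daily backlink counts below are invented numbers, not real data:

```python
# Toy illustration of the physics analogy: treat cumulative backlinks
# as "position", links gained per day as "speed", and successive
# differences as "acceleration" and "jolt" (rate of change of acceleration).

def diffs(series):
    """First differences of a numeric series."""
    return [b - a for a, b in zip(series, series[1:])]

# Hypothetical cumulative backlink counts, sampled once per day.
links = [100, 110, 122, 137, 156, 180]

speed = diffs(links)          # links gained per day
acceleration = diffs(speed)   # change in daily link growth
jolt = diffs(acceleration)    # change in acceleration

print(speed)         # [10, 12, 15, 19, 24]
print(acceleration)  # [2, 3, 4, 5]
print(jolt)          # [1, 1, 1]
```

A flat `speed` list means you are gaining links at a constant rate; against a competitor whose `acceleration` is positive, that constant rate is exactly the "SERP lock" scenario described above.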
-
what do I need to do to get out of this "SERP lock"?
Without looking at the site and the keywords involved, we can only offer generic advice.
I would suggest examining all aspects of your on-page factors. Some specifics:
-
page title: focus on a single keyword
-
header: focus on the same keyword
-
content: the first sentence of your content should also focus on the same keyword
-
site internal linking: where appropriate, other pages on your site should link from within their content to relevant pages.
-
URL: clean, friendly, static URLs that make appropriate use of keywords are helpful
Wikipedia is a great example for many of the above steps.
There are other items to check. My point is that link building and site promotion are an ongoing process that plays out over months and years, whereas on-page changes can instantly and dramatically change your ranking. There is a good chance you are stuck due to on-page factors.
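The on-page checklist above can be sketched as a quick self-audit. This is a hypothetical sketch, not a real tool; the page data and keyword below are placeholders:

```python
# Minimal self-audit of the on-page factors listed above:
# page title, header, first sentence, and URL should all focus
# on the same target keyword.

def onpage_audit(page, keyword):
    """Return a dict of pass/fail checks for the listed on-page factors."""
    kw = keyword.lower()
    return {
        "title contains keyword": kw in page["title"].lower(),
        "header contains keyword": kw in page["h1"].lower(),
        "first sentence contains keyword": kw in page["first_sentence"].lower(),
        "url contains keyword": kw.replace(" ", "-") in page["url"].lower(),
    }

# Hypothetical page data for illustration only.
page = {
    "title": "Invoice Software for Small Business",
    "h1": "Invoice Software Made Simple",
    "first_sentence": "Our invoice software helps you bill clients faster.",
    "url": "https://example.com/invoice-software/",
}

for check, ok in onpage_audit(page, "invoice software").items():
    print(f"{'PASS' if ok else 'FAIL'}: {check}")
```

Any FAIL here is a candidate for the kind of on-page fix Ryan describes, and is usually far quicker to change than off-page signals.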
-
Could you let us know the URL and the keywords you're targeting?