What do you think about SEO for big sites?
-
Hi,
I was doing some research on huge new sites, for example carstory.com, which has over a million pages, and I noticed that many new sites show strong growth in the number of keywords they rank for, and then at some point everything starts going down.
(Image of traffic drop attached)
There are no major updates at this time, but you can clearly see, even in the recent keyword changes, that this site is losing keywords every day, so the number of new keywords is much lower than the number of lost keywords.
How would you explain it? Is it that at some point, when a site has more than X number of indexed pages, the power of the domain is not enough to keep all of them at the top, and those keywords start dropping?
Please share your opinion, and any experience you have had yourself with huge sites.
Thank you, much appreciated.
-
It sounds to me like you need to do a Content Audit with the goal of pruning out all pages with zero traffic and zero links from the index. See the following resources:
How To Do a Content Audit - Step by Step (Moz)
Classic Content Audit Articles (LinkedIn)
Content Audit Case Studies (LinkedIn)
Using URL Profiler for Content Audits (URL Profiler)
Here's the slide deck for a presentation I gave last year about them.
Here's a recording of a webinar with the presentation above.
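To make the pruning step concrete, here is a minimal sketch of how you might flag candidates once you've exported your data. The CSV columns (organic_sessions from your analytics, referring_domains from a link tool) and the sample URLs are assumptions for illustration, not the output of any particular tool:

```python
import csv
import io

# Hypothetical merged export: one row per URL, with organic sessions
# (from analytics) and referring domains (from a link index).
SAMPLE = """url,organic_sessions,referring_domains
/cars/honda-civic-2012,340,5
/cars/rare-model-x,0,0
/cars/rare-model-y,0,2
"""

def prune_candidates(csv_text):
    """Return URLs with zero traffic AND zero links -- the pages a
    content audit would consider removing from the index."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row["url"]
        for row in reader
        if int(row["organic_sessions"]) == 0
        and int(row["referring_domains"]) == 0
    ]

print(prune_candidates(SAMPLE))  # ['/cars/rare-model-x']
```

Note that a page with zero traffic but live backlinks (like the third row) stays out of the prune list; the audit is about pages earning neither visits nor equity.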
Common Content Audit Strategies
I think the site you described would be considered Extra Large with a Penalty "Risk", so this is what the tool recommends:
"Focus: Prioritization of Pages for Content Optimization. Are you SURE there is no content-based penalty risk?
Most sites with this many pages have major content issues, meaning this would be the wrong “situation” for them. In the rare case that they don’t, prioritize the pages based on rankings, traffic potential, revenue… and propose how many to improve each month with ongoing copywriting and on-page optimization."
-
Hi Dani,
As John mentioned, there are so many factors involved that it's very tough to accurately pinpoint the reasons why a site might start losing rankings.
To answer your question directly - no, there is no tipping point where your link profile just isn't strong enough to carry a certain volume of pages. There is a similar scenario that could happen but I'll get to that in a moment.
The other thing is that, to my knowledge, the traffic data in Ahrefs isn't 100% accurate the way Analytics is; it's more of an estimate based on extrapolating from a smaller pool of data. I'm not completely sure about this, but that's my understanding. Basically, the drops you're seeing might not actually be as pronounced as the graph indicates.
So, the type of scenario where there are "too many pages" is more related to the user experience. If you're creating large volumes of pages just for "better SEO," you're going to have a bad time. If your site has 20 real pages and 80 pages just targeting a keyword, this is where you're likely to start seeing a drop in your rankings as the overall quality signals on your site begin to dip.
Users don't enjoy keyword-heavy and redundant pages and the metrics will demonstrate this to search engines very quickly.
You should always be looking to improve the strength of your link profile, but if you're worried about creating too many pages, don't be. Create the pages that make sense for the user, provide a great user experience and you'll be fine.
-
There are too many factors in rankings; assuming one factor is the answer is rarely the way forward.
Do a Google site: search on your domain. That will give you a rough idea of page authority and the number of pages indexed.
I recommend you stop trying to find one "shortcut" answer. Step back and do a site audit to properly identify what is going on. It could be that competitors are now all mobile-friendly, so they rank better. It could be anything. Slow down and do it properly. If you think it is domain authority alone, then while doing the site audit, obtain a relevant high-authority backlink and see if that helps.
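As one quick check alongside the site: search, you can count the URLs your sitemap actually submits and compare that against the rough indexed figure Google reports. A minimal sketch, using an inline stand-in for a fetched sitemap file:

```python
import xml.etree.ElementTree as ET

# Stand-in for a downloaded sitemap.xml; a real audit would fetch the
# live file. The example.com URLs are placeholders.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/page-a</loc></url>
  <url><loc>https://example.com/page-b</loc></url>
</urlset>"""

def count_sitemap_urls(xml_text):
    """Count <url> entries, honoring the sitemap namespace."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url", ns))

print(count_sitemap_urls(SITEMAP_XML))  # 3
```

If the site: result is far below the submitted count, the problem is indexation rather than "not enough domain power," which is the kind of thing a proper audit surfaces.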
-
So you don't think it's just that the domain authority (PageRank) isn't enough power to spread across all the pages? Like when the site had fewer pages the power was enough and rankings climbed, but once too many pages were indexed, it wasn't enough to spread across all of them?
-
Then I would go back to the landing page experience. What is your conversion rate? Are visitors averaging over 5 minutes on site, or less? How many page views? Are page views increasing? A factor in SEO is the landing page experience: how long customers stick around and what they do. There was a site called Jabong in India that had a similar issue; they kept improving the landing page experience and all the metrics slowly turned around.
If it's not that, you have to go back to basics and do a site audit, which you should do annually anyway. https://moz.com/blog/technical-site-audit-for-2015
Hope that assists.
-
Hi,
Thanks for the reply. I don't think it's seasonal, because it's clear the site is losing keywords in Google every day. So at the beginning there was a boost and the site was gaining more new keywords each day than it was losing, and now it's losing more than it's gaining.
-
Firstly, there is not enough information shared to say for sure.
That said, we manage several large sites, and there are times where they are static and there are dips, but seasonal factors are huge for most large sites, from education to health insurance. So have you accounted for seasonal factors? Also, the traffic growth is exponential at the start, and from an outside point of view it seems that the landing page experience is not strong enough, or not answering the searchers' queries.
I would do a deep dive into onsite behavioral analytics and see how long visitors are staying and where they are dropping off. In short, look for pogo sticking.
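One way to start that deep dive is to flag landing pages whose visitors bounce quickly, which is the footprint pogo sticking leaves in your analytics. A minimal sketch; the column names mirror a typical analytics export, and the bounce and dwell thresholds are illustrative assumptions, not industry standards:

```python
import csv
import io

# Hypothetical per-landing-page analytics export.
ROWS = """landing_page,sessions,bounce_rate,avg_time_on_page_s
/guides/long-answer,1200,0.42,310
/thin/keyword-page,900,0.91,12
"""

def likely_pogo_sticking(csv_text, max_bounce=0.85, min_dwell_s=20):
    """Flag landing pages with a very high bounce rate AND very low
    time on page -- visitors arriving and immediately going back."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        r["landing_page"]
        for r in reader
        if float(r["bounce_rate"]) > max_bounce
        and float(r["avg_time_on_page_s"]) < min_dwell_s
    ]

print(likely_pogo_sticking(ROWS))  # ['/thin/keyword-page']
```

Pages that surface here are the ones whose landing page experience most needs work, which ties back to the point above about answering the searcher's query.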
Hope that assists.