What do you think about SEO for big sites?
-
Hi,
I was doing some research on big new sites, for example carstory.com, which has over a million pages, and I noticed that many new sites show strong growth in the number of keywords, and then at some point everything starts going down.
(Image of traffic drop attached)
There were no major updates at the time, but you can clearly see, even in the recent keyword data, that this site started losing keywords every day; the number of new keywords is much lower than the number of lost keywords.
How would you explain it? Is it that at some point, once a site has more than X indexed pages, the power of the domain is no longer enough to keep all of them at the top, so those keywords start dropping?
Please share your opinion, and any experience of your own with huge sites.
Thank you, much appreciated.
-
It sounds to me like you need to do a Content Audit with the goal of pruning out all pages with zero traffic and zero links from the index. See the following resources:
How To Do a Content Audit - Step by Step (Moz)
Classic Content Audit Articles (Linked-In)
Content Audit Case Studies (Linked-In)
Using URL Profiler for Content Audits (URL Profiler)
Here's the slide deck for a presentation I gave last year about them.
Here's a recording of a webinar with the presentation above.
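If it helps to make the pruning criterion concrete, here's a minimal sketch in Python. It assumes you've already joined your page-level data (e.g. from Analytics and a link index) into one table; the column names are hypothetical:

```python
def pages_to_prune(rows):
    """Return URLs with zero organic traffic AND zero external links --
    the audit candidates for noindex or removal."""
    return [
        r["url"]
        for r in rows
        if int(r["organic_visits"]) == 0 and int(r["external_links"]) == 0
    ]

# In-memory stand-in for a CSV export joined from your analytics/link tools:
rows = [
    {"url": "/guide", "organic_visits": "1200", "external_links": "14"},
    {"url": "/thin-page-1", "organic_visits": "0", "external_links": "0"},
    {"url": "/thin-page-2", "organic_visits": "0", "external_links": "2"},
]
print(pages_to_prune(rows))  # ['/thin-page-1']
```

Note that a page with zero traffic but incoming links (like `/thin-page-2` above) stays, since pruning it would throw away link equity.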
Common Content Audit Strategies
I think the site you described would be considered "Extra Large" with a penalty "risk," so this is what the tool recommends:
"Focus: Prioritization of Pages for Content Optimization. Are you SURE there is no content-based penalty risk?
Most sites with this many pages have major content issues, meaning this would be the wrong "situation" for them. In the rare case that they don't, prioritize the pages based on rankings, traffic potential, revenue… and propose how many to improve each month with ongoing copywriting and on-page optimization."
-
Hi Dani,
As John mentioned, there are so many factors involved that it's very tough to accurately pinpoint the reasons why a site might start losing rankings.
To answer your question directly - no, there is no tipping point where your link profile just isn't strong enough to carry a certain volume of pages. There is a similar scenario that could happen but I'll get to that in a moment.
The other thing is that, to my knowledge, the traffic data in Ahrefs isn't 100% accurate the way Analytics is; it's more of an estimate extrapolated from a smaller pool of data. I'm not completely sure about this, but that's my understanding. Basically, the drops you're seeing might not actually be as pronounced as the graph indicates.
So, the type of scenario where there are "too many pages" is more about the user experience. If you're creating large volumes of pages just for "better SEO", you're going to have a bad time. If your site has 20 real pages and 80 pages that just target a keyword, that's where you're likely to start seeing a drop in your rankings as the overall quality signals on your site begin to dip.
Users don't enjoy keyword-heavy and redundant pages and the metrics will demonstrate this to search engines very quickly.
You should always be looking to improve the strength of your link profile, but if you're worried about creating too many pages, don't be. Create the pages that make sense for the user, provide a great user experience and you'll be fine.
-
There are so many factors in rankings that pinning everything on one factor is rarely the way forward.
Do a Google site: search on the domain. That will give you a rough idea of page authority and the number of pages indexed.
I recommend you stop trying to find one "shortcut" answer. Step back and do a site audit to properly identify what is going on. It could be that competitors are now all mobile-friendly, so they now rank better. It could be anything. Slow down and do it properly. If you think it is domain authority alone, then while doing the site audit, obtain a relevant high-authority backlink and see if that helps.
-
So you don't think it's just that domain authority (PageRank) isn't enough power to spread across all the pages? Like, when the site had fewer pages the power was enough and rankings climbed, but once too many pages were indexed, it wasn't enough to spread across all of them?
-
Then I would go back to the landing page experience. What is your conversion rate? Are visitors averaging over five minutes on site, or less? How many page views? Are page views increasing? A factor in SEO is the landing page experience: how long customers stick around and what they do. There was a site called Jabong in India that had a similar issue; they kept improving the landing page experience and all the metrics slowly turned around.
If it's not that, you have to go back to basics and do a site audit, which you should do annually anyway: https://moz.com/blog/technical-site-audit-for-2015
Hope that assists.
-
Hi,
Thanks for the reply. I don't think it's seasonal, because it's clear that the site is losing keywords in Google every day. So at some point in the beginning there was a boost and the site was gaining more new keywords each day than it was losing, and now it's losing more than it's gaining.
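For what it's worth, you can make that gained-vs-lost observation concrete by diffing daily keyword snapshots. A rough sketch (assuming you export the set of ranking keywords each day from your rank tracker):

```python
def keyword_churn(yesterday, today):
    """Diff two daily sets of ranking keywords.

    Returns (gained, lost, net). A negative net means the site is
    losing more keywords per day than it gains, as described above.
    """
    gained = today - yesterday
    lost = yesterday - today
    return gained, lost, len(gained) - len(lost)

yesterday = {"used cars", "car history", "vin lookup", "car prices"}
today = {"used cars", "vin lookup", "car reviews"}

gained, lost, net = keyword_churn(yesterday, today)
print(sorted(gained), sorted(lost), net)
# ['car reviews'] ['car history', 'car prices'] -1
```

Tracking that net figure over a few weeks would show whether the trend in the graph is real or an artifact of the tool's sampling.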
-
Firstly, there is not enough information shared here.
That said, we manage several large sites, and while there are times when they are static and there are dips, seasonal factors are huge for most large sites, from education to health insurance. So have you accounted for seasonality? Also, the traffic growth is exponential at the start, and from an outside point of view it seems that the landing page experience is not strong enough, or is not answering the searchers' queries.
I would do a deep dive into on-site behavioral analytics to see how long visitors are staying and where they are dropping off. In short, look for pogo-sticking.
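As a rough sketch of what that deep dive might look like (hypothetical data shape; the 10-second threshold and 50% cutoff are illustrative assumptions, not a standard):

```python
from collections import defaultdict

def likely_pogo_pages(sessions, max_dwell_seconds=10, cutoff=0.5):
    """Flag landing pages where a majority of sessions bounce back
    quickly -- a crude proxy for pogo-sticking.

    `sessions` is an iterable of (landing_page, dwell_seconds, bounced).
    """
    stats = defaultdict(lambda: [0, 0])  # page -> [quick_bounces, total]
    for page, dwell, bounced in sessions:
        stats[page][1] += 1
        if bounced and dwell <= max_dwell_seconds:
            stats[page][0] += 1
    return sorted(p for p, (quick, total) in stats.items()
                  if quick / total > cutoff)

sessions = [
    ("/inventory", 4, True),
    ("/inventory", 6, True),
    ("/inventory", 120, False),
    ("/about", 90, False),
]
print(likely_pogo_pages(sessions))  # ['/inventory']
```

The flagged pages are the ones to look at first when improving the landing page experience.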
Hope that assists.