Can Dramatically Increasing Site Size Have Negative Effects?
-
I have a site with about 1,000 pages. I'm planning to add about 30,000 more. Can suddenly increasing the footprint by that much have negative consequences, either for existing organic rankings or for the benefits I'm hoping the new pages will bring?
Would the site draw any increased scrutiny from Google for doing this?
Any other considerations?
Thanks... Darcy
-
Hi Robert,
Good point. From a time-cost-to-search-benefit point of view, the initial setup is the big cost... getting the template right.
After that, whether you're creating 3,000 or 30,000 pages is really no extra time... just flowing in the data. So, with that in mind, wouldn't the site be better off, search-wise, with 10x the pages (30k vs. 3k) to amortize the setup cost over?
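Rough back-of-envelope, with made-up numbers just to illustrate the amortization (the hour figures are invented placeholders, not real estimates):

```python
# One-time template setup cost amortized over the page count.
# These figures are hypothetical, purely to show the shape of the math.
setup_hours = 40            # one-time: getting the template right
per_page_minutes = 0.1      # near-zero marginal work: just flowing in the data

for pages in (3_000, 30_000):
    total_hours = setup_hours + pages * per_page_minutes / 60
    per_page = total_hours / pages * 60
    print(f"{pages:>6} pages -> {per_page:.2f} min of total effort per page")
```

With the fixed setup dominating, the per-page cost at 30k pages works out to a fraction of the 3k figure... which is the whole argument for the bigger number.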
Also, having already cut the list down based on non-search criteria, I have no reliable way of knowing which of the 30,000 would do better than the next search-wise. They're all kind of medium-niche, with low volume and about the same estimated difficulty.
Also, I can't look at search data for 30,000 pages, or even 3,000, to make that cut. Know what I mean? So the cut, if there were one, would be random.
Thanks... Darcy
-
Here's a question I often ask of our team: what are we spending our time on, in terms of time cost vs. benefit? So, if putting in the 30K product pages takes x amount of time, have you asked yourself why? Just to get clicks you don't have now? Instead, if you were to take, say, the best 3,000, add the reviews and other content, and only then start on the next 3,000 in the same way, would your result be improved? Would gaining page authority on the pages you had truly optimized lift them high enough to bring more traffic to the site? Would you then keep improving at a rate that builds on where you're going?
So, instead of asking "is this a good idea?" (and on the face of it, it is), ask "is this the best idea?"
Hope it helps
-
Hi Dana & BigFish22,
Both good points... thanks!
Here's a little more detail... the site has been about a certain product area for a long time. I'm adding catalog pages for 30,000 different products.
The pages are geared to searches around the word "buy" plus the product name. The pages differ from each other in that the product name, manufacturer, pictures, picture tagging, and title & description tags all use different product names and other information. So the pages are different from each other, but, out of the gate, not dramatically different from the basic info available on other sites that sell these products... all manufacturer data flowing into a template.
My plan is to put up the catalog and then, over time, add reviews and other content to make the pages really different from other sites' catalog pages that draw on substantially the same basic information.
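Roughly, the template pass looks something like this (a minimal sketch with made-up field names and markup, not my actual code):

```python
# Each product record fills the same page template, so titles, meta
# descriptions, and image alt text differ per product even though the
# page structure is identical. All field names here are hypothetical.
products = [
    {"name": "Widget Pro 3000", "manufacturer": "Acme", "image": "widget-pro.jpg"},
    {"name": "Gadget Lite", "manufacturer": "Globex", "image": "gadget-lite.jpg"},
]

PAGE_TEMPLATE = """<title>Buy {name} - {manufacturer}</title>
<meta name="description" content="Buy {name} by {manufacturer}. Specs, pictures and pricing.">
<img src="/img/{image}" alt="{name} by {manufacturer}">"""

for p in products:
    print(PAGE_TEMPLATE.format(**p))
```

Which is exactly why the pages start out similar to every other retailer's: the only inputs are the manufacturer fields everyone else has too.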
What do you think? Thanks! Best... Darcy
-
Darcy,
I think Dana makes the best point possible, and that is... is the content GOOD? Is it something people are looking for?
Just adding content is not helpful, in my opinion. But we have sites that we add hundreds of pages a month to, and it is all unique content around specific verticals. It is amazing to watch what happens as you add better and better content, and more of it, to a site. Hope this helps. A question would be: over what period, and is the content already created?
-
Hi Darcy,
I don't think anyone here can give you a concrete answer to your question. If they do, be suspicious. If you add 30,000 pages of totally unique content... then I think you will skyrocket to the top of organic listings. If your content is substantially similar, or just copies of content from other places, then it's not really going to help you at all. It might hurt you. It might not.
I'm not sure that's the answer you were looking for, but I hope it's helpful!
Dana