Ecommerce SEO help
-
Hi
I'm having difficulty managing our product pages for optimisation - we have over 20,000 products.
We do keyword research & optimise product titles/meta for new products - we've already cleaned up a lot, but there's still plenty of older content to work through.
I find we rank/convert better on product pages, so they would be a great area to focus on - however, when an old product is discontinued its page is removed, and we lose authority by creating new pages for similar products. The removals happen automatically on the dev side in France, so they're out of my hands - does anyone have any ideas for managing this?
I then have the issue of trying to rank category pages - these are highly competitive areas where we're up against big brands.
I'm finding it tough to know where to focus - the site is vast and I'm the only SEO.
I've started looking at low-hanging fruit, but those aren't necessarily the areas that bring in much revenue.
Thanks!
-
Hey Becky
Marcus has pretty well covered things, but I wanted to point you to a video Matt Cutts did a few years back about discontinued products: https://www.youtube.com/watch?v=9tz7Eexwp_A - it's a good watch to get an idea of how Google may look at things, and he breaks down some options depending on the type of site you have.
-
Hey Becky
In an ideal world you would keep the old pages up, or at least redirect them to the closest equivalent. If they simply 404, you're throwing away any equity those pages have built up - if they've acquired inbound links (which may be doubtful), you lose that value by neither keeping them nor redirecting them.
You could try the following:
- Crawl all pages in Screaming Frog
- Export the full list of URLs
- Re-crawl that exported list later (list mode) to catch pages that have since been removed
- Note all internal 404s and 301 them to the most relevant equivalent page
That is a bit of a hack, but if the page removals are outside of your control it at least lets you keep track of what has changed and redirect anything that disappears - there's a rough sketch below of how you could automate that check.
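Something like this (a minimal Python sketch using the requests library) could handle the re-check step: it reads a previously exported URL list, flags anything that now returns a 404/410, and writes Apache-style 301 rules. The file names and the fallback redirect target are placeholders - ideally you'd map each dead product URL to its closest replacement rather than a generic category.

```python
# Sketch: re-check a previously exported URL list and emit 301 rules for dead pages.
# File names and the fallback target are placeholders - swap in your own mappings.
import csv
import requests
from urllib.parse import urlparse

CRAWL_EXPORT = "all_urls.csv"        # one URL per row, exported from Screaming Frog
REDIRECT_FILE = "redirects.txt"      # output: Apache "Redirect 301 ..." lines
FALLBACK_TARGET = "/discontinued/"   # used when no direct replacement is known

def is_gone(url: str) -> bool:
    """True if the URL now responds with a 404 or 410."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        return resp.status_code in (404, 410)
    except requests.RequestException:
        return False  # a network error isn't proof the page has been removed

with open(CRAWL_EXPORT, newline="") as f:
    urls = [row[0].strip() for row in csv.reader(f) if row]

with open(REDIRECT_FILE, "w") as out:
    for url in urls:
        if is_gone(url):
            path = urlparse(url).path or "/"  # .htaccess rules use the path only
            out.write(f"Redirect 301 {path} {FALLBACK_TARGET}\n")
            print(f"Gone: {url}")
```

You can then hand the generated file to whoever controls the server config, or translate it into your platform's redirect manager.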
With regards to general optimisation of products, you will want some way to prioritise - one person and 20,000 products is a tough gig. However, there may only be 2,000 products that are really important / profitable. Factor in what matters to the business first, then look at the current situation (rank) and the opportunity (competition) and create an ordered list to give you a plan of attack.
The category pages are a little different, and a sensible approach would be to do an audit of sorts, as there are likely to be far fewer of them:
- Get a list of all category pages and targeted keywords
- Find where you are currently (situation analysis)
- Review the competition (opportunities)
- Look for categories where you are almost there OR could improve over what is there currently
- Order by opportunity and focus on those first
If you can create an ordered list that factors in opportunity / difficulty and anything else that matters, you can at least tackle this in a structured way - there's a rough scoring sketch below that works for categories and products alike.
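As an example of what that ordered list might look like in practice, here's a minimal Python/pandas sketch. The category names, figures and weights are entirely made up - in reality the revenue would come from your analytics, the rank from a rank tracker and the difficulty from your keyword tool.

```python
# Sketch: score and order category (or product) pages into a plan of attack.
# All names, numbers and weights below are illustrative placeholders.
import pandas as pd

audit = pd.DataFrame({
    "category":   ["garden-furniture", "bbqs", "sheds", "planters"],
    "revenue":    [120000, 45000, 80000, 9000],   # business importance (from analytics)
    "rank":       [14, 6, 35, 9],                 # current position (from a rank tracker)
    "difficulty": [70, 40, 85, 30],               # competition score 0-100 (from a keyword tool)
})

# Favour pages that matter to the business, are "almost there" (page 2),
# and aren't up against impossible competition.
audit["almost_there"] = audit["rank"].between(5, 20).astype(int)
audit["score"] = (
    audit["revenue"].rank(pct=True) * 0.5      # weight business value most heavily
    + audit["almost_there"] * 0.3              # quick wins: close to page 1 already
    + (1 - audit["difficulty"] / 100) * 0.2    # easier competition scores higher
)

print(audit.sort_values("score", ascending=False)[["category", "rank", "score"]])
```

The exact weights matter less than having a consistent, repeatable ordering you can defend to the business.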
Sounds like chaos though - 20,000 products is a tough gig for one SEO.
Hope that helps.
Marcus