Canonical tags pointing to "page=all" pages for an e-commerce website
-
I find it alarming that my client has canonical tags pointing to "page=all" product gallery pages. Some of these gallery pages contain over 100 products, and I think this could affect load time, especially on mobile. I would like to get some insight from the community on this, thanks!
-
Currently my 301s point to the relevant pages. For example:
www.shoes.com/category/redshoes.do <-- current redirect target (301 from www.shoes.com/category/myredshoes.do)
www.shoes.com/category/redshoes.do=sortby=page1
www.shoes.com/category/redshoes.do=sortby=page2
www.shoes.com/category/redshoes.do=sortby=page=all <-- current canonical
www.shoes.com/category/redshoes.do=sortby=page=all <-- should I also redirect www.shoes.com/category/redshoes.do here?
I basically want to consolidate my authority onto one page, and I'm contemplating whether redirecting to the "page=all" URL, in addition to the canonical, will improve the overall performance of that page.
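For reference, a canonical of this kind is a link element in the head of each paginated URL pointing at the view-all version. A minimal sketch, using illustrative URLs rather than the client's actual ones:

```html
<!-- In the <head> of each paginated URL, e.g. page 1, page 2, etc. -->
<!-- (hypothetical URLs, shown with a conventional ?page= query string) -->
<head>
  <link rel="canonical" href="http://www.shoes.com/category/redshoes.do?page=all" />
</head>
```

The same tag appears on every page in the series, always pointing at the single view-all URL.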
-
I asked John Mueller about 301 redirects in a recent hangout, and he stated that if you have multiple 301s from the same domain all pointing to a single destination, e.g. the homepage, then Google may discount many of those 301s and treat them as 404s. In my case, I had done a migration and, being lazy, 301'd all the URLs to the homepage. He said to map them like for like or you could lose out.
So I guess it depends on your 301s.
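As a sketch of what "like for like" mapping looks like in practice, here is a hedged .htaccess example, assuming an Apache server and hypothetical paths; each old URL gets its own 301 to its closest new equivalent rather than everything funnelling into the homepage:

```apache
# Map each old URL to its closest new equivalent (hypothetical paths)
Redirect 301 /category/myredshoes.do  /category/redshoes.do
Redirect 301 /category/myblueshoes.do /category/blueshoes.do

# Avoid a blanket catch-all like the one below, which is the pattern
# John Mueller warned may be discounted and treated as soft 404s:
# RedirectMatch 301 ^/category/.* /
```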
Pete
-
The rel=canonical tag passes the same amount of link juice (ranking power) as a 301 redirect, so should I also point my redirects to the "view=all" page to aggregate Page Authority?
-
We use both rel=next and rel=prev along with a canonical tag pointing to the view-all pages on our eCommerce site. As Greenstone mentions above, this is what Google recommends.
We also use the Cloudflare CDN (Content Delivery Network), which takes care of any speed issues. They offer a free package you can use to trial it, and the paid packages are also very good value, approximately $20-30 per month from memory, but it does make the website lightning quick. It's very easy to set up too.
Pete
-
Implementing a rel=canonical from a paginated series to a "view all" page is certainly recommended practice from a technical standpoint.
With that said, it should only be implemented if it enhances the user experience. If the view-all page takes too long to load and users abandon it altogether, it helps no one. I would certainly do speed tests and check its usability.
- If it takes longer than a few seconds to load, look for ways to speed it up.
- If that proves difficult, there is room to consider a more manageable paginated series with rel=prev and rel=next tags, so search engines are aware these pages are a related series.
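As a sketch, the paginated-series markup described above looks like this in the head of each page in the series (hypothetical URLs for illustration):

```html
<!-- On page 2 of a hypothetical three-page series -->
<head>
  <link rel="prev" href="http://www.shoes.com/category/redshoes.do?page=1" />
  <link rel="next" href="http://www.shoes.com/category/redshoes.do?page=3" />
</head>
```

The first page in the series carries only rel=next, and the last page only rel=prev, so the tags together describe the full chain.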