Canonical tags pointing to "page=all" pages on an ecommerce website
-
I find it alarming that my client has canonical tags pointing to "page=all" product gallery pages. Some of these gallery pages have over 100 products, and I think this could affect load time, especially on mobile. I would like to get some insight from the community on this, thanks!
-
Currently my 301s point to the relevant pages; for example:
www.shoes.com/category/redshoes.do <------ Current Redirect from (www.shoes.com/category/myredshoes.do)
www.shoes.com/category/redshoes.do=sortby=page1
www.shoes.com/category/redshoes.do=sortby=page2
www.shoes.com/category/redshoes.do=sortby=page=all <----- **Current canonical**
www.shoes.com/category/redshoes.do=sortby=page=all <--Should I Redirect from www.shoes.com/category/redshoes.do
I basically want to consolidate my authority onto one page, and I am weighing whether redirecting to the "page=all" version, along with my canonical, will improve the overall performance of that page.
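For reference, the canonical setup being described would look something like this in the head of each paginated page. This is a minimal sketch: the URLs are hypothetical and assume a conventional query-string format rather than the exact URL structure shown above.

```html
<!-- On each paginated page, e.g. /category/redshoes.do?sortby=price&page=1 -->
<!-- the canonical points at the hypothetical "view all" version of the gallery -->
<link rel="canonical" href="https://www.shoes.com/category/redshoes.do?page=all" />
```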
-
I asked John Mueller in a recent hangout about 301 redirects, and he stated that if you have multiple 301s from the same domain going to a single point, i.e. the homepage, then Google may discount many of those 301s and treat them as 404s. In my case, I had done a migration and, being lazy, I 301'd all the URLs to the homepage. He was saying to map them like for like, or you could lose out.
So I guess it depends on your 301s.
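The like-for-like mapping John Mueller describes might look like this as server configuration. This is a hypothetical sketch assuming Apache with mod_alias enabled; the old and new paths are invented for illustration.

```apache
# Like-for-like 301 mapping after a migration: each old URL gets its own
# redirect target, rather than every old URL pointing at the homepage.
Redirect 301 /category/myredshoes.do  /category/redshoes.do
Redirect 301 /category/myblueshoes.do /category/blueshoes.do
Redirect 301 /old-sale-page.do        /category/sale.do
```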
Pete
-
The rel=canonical tag is said to pass roughly the same amount of link equity (ranking power) as a 301 redirect, so should I also point my redirects to the "view=all" pages to aggregate Page Authority?
-
We use both rel=next and rel=prev along with a canonical tag pointing to the view-all pages on our ecommerce site. As Greenstone mentions above, this is what Google recommends.
We also use Cloudflare, a CDN (content delivery network), which takes care of any speed issues. They offer a free package you can use to trial it, and the paid packages are also very good value, approximately $20-30 per month from memory, but it does make the website lightning quick. It's very easy to set up, too.
Pete
-
Implementing a rel=canonical from a paginated series to a "view all" page is certainly recommended practice from a technical standpoint.
With that said, it should only be implemented if it enhances the user experience. If the page takes too long to load and users abandon it altogether, it helps no one. I would certainly run speed tests and check its usability.
- If it takes longer than a few seconds, I would certainly recommend checking to see if there are ways to speed it up.
- If this proves to be difficult, there is certainly room to consider implementing a paginated series that is more manageable and contains rel=prev and rel=next tags to ensure search engines are aware these pages are a related series.
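A paginated series marked up with rel=prev and rel=next might look like this. This is a minimal hypothetical sketch for "page 2" of a series, using invented URLs; whether the canonical points at the view-all page (as here) or at each page itself depends on which of the two approaches above is chosen.

```html
<!-- Hypothetical <head> of page 2 in a three-page paginated series -->
<link rel="prev" href="https://www.shoes.com/category/redshoes.do?page=1" />
<link rel="next" href="https://www.shoes.com/category/redshoes.do?page=3" />
<!-- Canonical to the view-all page, per the "view all" approach discussed -->
<link rel="canonical" href="https://www.shoes.com/category/redshoes.do?page=all" />
```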