Let me re-word this ...
Should an order page be canonical to a product description page?
We have inherited a site that has a Joomla CMS "showroom" front-end and a Magento "store room" for check out etc.
Question - As the site's main pages are in the CMS section should we:
Make all Magento product pages canonical to the main sections/product pages within the CMS (even though there are no duplicate content issues)
Add a "noindex" tag to the product pages
Index but indicate low page value in sitemap
Do something else?
Thanks for any and all input!
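For what it's worth, the canonical and noindex options above each boil down to a single tag in the head of a Magento product page. A rough sketch (the URL is a hypothetical placeholder, not your actual structure):

```html
<!-- Option 1: point the Magento product page's canonical at its
     Joomla counterpart (the href is a hypothetical placeholder) -->
<link rel="canonical" href="http://www.example.com/products/widget" />

<!-- Option 2: keep the Magento page out of the index while still
     letting crawlers follow the links on it -->
<meta name="robots" content="noindex, follow" />
```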
Thanks for the responses.
We're still examining the change in traffic volume, source and search terms.
We have a couple of theories on what's going on and will revert back after the next Mozscape index update.
Hi Tom
Thanks for this. I looked at the suspicious links and they seem to be false positives, mostly revolving around our brand name "military plaques".
Noted on the low-quality/dead links - will have these removed.
Hi
This site www.militaryplaques.com
We have had steady traffic over a number of years, both in terms of impressions, click-through rates, bounce rates, time on site and, most importantly, sales.
The site has remained fairly static over the last 6 months with no significant changes to content or structure.
However, on July 11 our traffic and impressions crashed by over 90% and remain at this low level.
We have never been hit by Panda before, but this looks like it could be such a case.
Any insight or suggestions as to why the sudden "de-listing" may have happened?
R/
John J Morgan
Thanks Peter for your assistance.
Hope to hear from the SEOMOZ team soon with regards to this issue.
This is for our client with a subdomain. We only want to analyze their main website as this is the one we want to SEO. The subdomain is not optimized so we know it's bound to have lots of errors.
We added the disallow code when we started and it was working fine. We only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked up and the errors we saw were for the subdomain.
As far as our web guys are concerned, the disallow code is still there and was not touched:
User-agent: rogerbot
Disallow: /
We would like to know if there's anything we might have unintentionally changed or something we need to do so that the SEOMOZ crawler will stop going through the subdomain.
Any help is greatly appreciated!
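One thing worth double-checking (an assumption on my part, since I can't see the server setup): crawlers read robots.txt separately for each hostname, so a rule in the main domain's file does not apply to the subdomain. If the subdomain should stay out of the crawl, it needs its own file at its own root, along these lines (the hostname is a placeholder):

```text
# Served at http://subdomain.example.com/robots.txt
User-agent: rogerbot
Disallow: /
```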
Thank you so much for the response. At least we know why it has remained the same for so long. Thanks for the info!
We are doing a Competitive Domain Analysis for a client and 2 of their competitors.
For the past few months, the Root Domain Metrics results have been fine, changing every time the websites are re-crawled by SEOMOZ to update its analysis.
However, in the past few weeks, the results we are getting for the Root Domain Metrics have remained the same, even after three weeks. We have attempted a separate Open Site Explorer analysis for the client and their competitors, but we are getting the same results.
From the "Domain Authority" to the "No Followed Linking Root Domains", the results have remained the same for three weeks.
Could anything be affecting the results, or is it just the fact that the links have not changed (not a single one, for three websites) in the past three weeks?
Hope you can help me clarify my issue. Thanks!
Thanks Robert,
So the weekly historical changes for the month will appear after the next crawl?
R/
The updates to the Competitive Domain Analysis were occurring every 7 days up to Jan 16.
When will the next updates be and how often is the new schedule?
R/
John
I've been using SEOMOZ to analyze and crawl a client's website for a while now.
One thing I've noticed is that our client's website is indexing well with Google: a few thousand pages are being indexed.
However, when it comes to Yahoo and Bing, the website only has 100+ pages indexed.
We've submitted updated sitemaps to Google and Bing and have been fixing any broken links, and on-page SEO. Content is also good.
Here's the website: www.imaginet.com.ph
Any suggestions/recommendations are highly appreciated.
Thank you!
I understand what you're saying.
However, there are several hundred pages like that, one-word definition pages like the four I've mentioned above.
Here's one example of a one-word definition page that did not report any error:
http://www.imaginet.com.ph/local-area-network-definition
Among all those pages, only around 14-20 are getting duplicate page content errors (including the four I mentioned above).
That is why I'm curious to find out why those four in particular reported duplicate page errors.
Thanks!
A recent crawl diagnostic for a client's website had several new duplicate page content errors.
The problem is, I'm not sure where the error comes from, since the content on each of these pages is different.
Here are the pages that SEOMOZ reported as having duplicate page content errors:
http://www.imaginet.com.ph/wireless-internet-service-providers-term
http://www.imaginet.com.ph/antivirus-term
http://www.imaginet.com.ph/berkeley-internet-name-domain
http://www.imaginet.com.ph/customer-premises-equipment-term
The only similarity I can see is the headline that says "Glossary Terms Used in this Site" - I hope that single sentence is the reason for the error.
Any input is appreciated as I want to find out the best solution for my client's website errors.
Thanks!
Thank you for the reply! I'll definitely be taking your advice on this one.
Thanks again!
I am using SEOMOZ for a client to track their website's performance and fix any errors and issues.
A few weeks ago, they created a sub-domain (sub.example.com) to create a niche website for some of their specialized content.
However, when SEOMOZ re-crawled the main domain (example.com), it also reported the errors for the subdomain.
Is there any way to stop SEOMOZ from crawling the subdomain and only crawl the main domain? I know that can be done by starting a new campaign, but is there any way to work around an existing campaign?
I'm asking because we would like to avoid setting up the campaign again and losing the historical data as well.
Any input would be greatly appreciated.
Thanks!
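In case it helps in the meantime: since rogerbot respects robots.txt, one workaround (assuming you control the subdomain's web root) is to block it on the subdomain itself, which leaves the main-domain campaign and its history untouched:

```text
# Served at http://sub.example.com/robots.txt
User-agent: rogerbot
Disallow: /
```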
Hi! Thanks for the reply, but we do have Google Analytics on the site. Up until now, only one page is being crawled. Thanks!
I would like to ask for the possible problem plus solution on one of our campaigns.
Only one page has been crawled by SEOmoz for the last two crawls. Before those last two crawls, SEOmoz crawled numerous pages, and we can't think of a possible reason for this error.
For this particular campaign, there is no data: no errors, warnings, or notices.
Thanks!
One of the websites we monitor has been getting a high number of duplicate page titles. As we work through the pages, we see changes and the number of duplicate page titles is decreasing.
However, lately, the number of duplicate page titles has gone up again.
I wanted to ask if there's any way to view the new errors and the old errors separately or sorted in a way that can help me identify why we are getting new page crawl errors.
Any advice would be great.
Thanks!
Our client needs a content disclaimer on their site. This is a simple "If you agree to these rules then click YES, if not click NO" page, and if you click NO you're pushed back to the home page.
I have a gut feeling that this may cause problems with the search robots.
Any advice?
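One general pattern that avoids hiding content from crawlers (a sketch only; the cookie name and markup are hypothetical, not specific to any CMS) is to serve the page content normally and show the disclaimer as a script-driven overlay, so robots that don't run the script still see the full page:

```html
<!-- Page content stays in the HTML for crawlers; the overlay is only
     shown to visitors who haven't agreed yet. -->
<div id="disclaimer" style="display:none">
  <p>If you agree to these rules click YES, if not click NO.</p>
  <button onclick="agree()">YES</button>
  <button onclick="window.location='/'">NO</button>
</div>
<script>
  // "disclaimerAgreed" is a hypothetical cookie name.
  function agree() {
    document.cookie = "disclaimerAgreed=1; path=/";
    document.getElementById("disclaimer").style.display = "none";
  }
  if (document.cookie.indexOf("disclaimerAgreed=1") === -1) {
    document.getElementById("disclaimer").style.display = "block";
  }
</script>
```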
R/
John