I just tested the URL on web-sniffer.net with a mobile user agent - it seems that the tool you use to detect whether the user agent is a mobile one doesn't work. I get a normal 200 status (from the desktop site) - the same result as Patrick.
rgds,
Dirk
The normal way to check is the way you did it - if it fails, it implies something is wrong with your setup. Check the link in Patrick's answer - Google's documentation describes in detail how to set it up properly.
rgds
Dirk
It is quite logical. Compare it with real estate. Your primary location is your homepage, which normally lists your most important & interesting content. As you normally cannot put everything on your homepage, you shift less important content to level 2, then level 3, etc. The deeper the content, the less important you consider it. So it's quite normal that Google follows this logic as well.
The second part is the number of internal links - your most important content will receive a lot of internal links. Normally, the more links a piece of content receives, the closer it will be to the homepage (chances are bigger that it receives links from the home, level 2 or level 3 content).
Index pages can help to move your content closer to the home - but this will only get you so far (it doesn't change much about the number of links these articles receive).
You could try to regroup your content in a cluster per theme - with its own homepage & a lot of links inside the cluster to create more internal (theme-based) links & move content closer to the home. There is an interesting post on this topic from Gianluca Fiorelli: http://moz.com/blog/topical-hubs-whiteboard-friday
Hope this helps,
Dirk
Hi
According to Matt Cutts, a 404 means the page is gone (but not necessarily permanently) - permanent would be a 410 (although he also indicates that there is very little difference between the two from an SEO perspective).
What to do with these 404s depends a bit on the situation:
if these pages have external links pointing to them - I would try to redirect them (even better, ask whoever is linking to update the link, although that may be difficult in practice)
if these are old URLs which haven't been used for a while & don't generate traffic - just leave them - they will disappear
check where these 404s come from - if they are only listed in WMT you can ignore them; if actual people are trying to visit your site on these pages I would redirect them to the appropriate new page (if not for the SEO, then for the user experience)
check that no internal links still exist to these old 404 pages (Screaming Frog is made for this)
You say you 301'd the pages of the old site - did you check your landing page report in Analytics before the migration? You should make sure that the top 5000 URLs are properly redirected - normally Google will figure it out after a while, but it can have a negative impact on your results if a lot of these landing pages generate a 404.
As an additional resource (probably a bit too late now) you could check the different steps in this guide: http://moz.com/blog/web-site-migration-guide-tips-for-seos to be sure that you didn't miss something important.
Hope this helps
Dirk
Hi Dustin
I checked the last example a bit more in detail:
Webpagetest indicates that this page is even heavier than the one I tested the first time - results here: http://www.webpagetest.org/result/150501_XQ_Q1H/ (load time OK - but the page is way too heavy) - the PageSpeed analyser results are not terrific either. Check also the links on the page - a lot of your CSS files seem to be 301'd to a new location - it's better to call the final location directly. Optimising shouldn't be too difficult - compressing images, JS, etc. should really increase your scores.
Also on the technical side, it's probably good to clean up the code a bit. You don't really have to pass the W3C validator test, but it seems your page generates an awful lot of errors.
From a content perspective it looks good. One remark could be that a lot of content is not directly visible when the page loads (read more, tabs). Google announced at the end of 2014 that they don't really like content that is hidden when the page is loaded (https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html). I don't really think this would cause a drop - but it's something you could consider changing if you modify the HTML code.
Hope this helps,
Dirk
Great reply from Tom. The other way to see it (more of a philosophical nature) - by definition, the 'world wide web' is:
'The World Wide Web (www, W3) is an information system of interlinked hypertext documents that are accessed via the Internet and built on top of the Domain Name System' (Wikipedia).
If your site only has incoming links, and no outgoing links, it's like a dead-end street (or an end node) and would be a bit in contradiction with what the www is (was) all about. It would also suggest that you are convinced that your site has all the possible answers/solutions for a certain topic and that no other site could be of interest for your visitors.
No hard facts or figures to support this theory, just my gut feeling. If you have a number of good external references with added-value content for your visitors, I would certainly link to them.
rgds,
Dirk
Hi Dustin,
If the site is the one mentioned in your profile, it's almost certain that it's a speed/performance issue. I checked one product page - the Google grade is 49/100 for desktop and 67/100 for mobile. Webpagetest indicates a page size of 2700K - mainly images, JavaScript & Flash. It should not be too difficult to improve these scores - optimising the images & the JS will help a lot.
rgds,
Dirk
This can have different reasons - without the actual example it's difficult to assess. Based on the info you give, adding all this content, video and images could have made your page a lot heavier to load (check the page on webpagetest.org and with PageSpeed Insights). Maybe you exaggerated a bit with the optimisation and Google is considering the page a bit too spammy.
If possible, can you give the actual url of the page?
rgds,
Dirk
Hi Rajiv,
If you post the same content on both FR & EN version:
if both are written in English (or mainly written in English) - best option would be to have a canonical pointing to the EN version
Example: https://fr.sitegeek.com/category/shared-hosting - most of the content is in English - so in this case I would point a canonical to the EN version
if the FR version is in French - you can use the hreflang tag - you can use this tool to generate them, check here for common mistakes and double-check the final result here.
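As an illustration - assuming the EN equivalent of the example above lives at https://sitegeek.com/category/shared-hosting (adjust to the real URL) - both versions would carry the same pair of hreflang tags:
<link rel="alternate" hreflang="en" href="https://sitegeek.com/category/shared-hosting" />
<link rel="alternate" hreflang="fr" href="https://fr.sitegeek.com/category/shared-hosting" />
For the first scenario (mostly English content on the FR URL), the FR page would instead simply carry a canonical to the EN page:
<link rel="canonical" href="https://sitegeek.com/category/shared-hosting" />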
Just some remarks:
partially translated pages offer little value for users - so it's best to fully translate them or only refer to the EN version
I have a strong impression that the EN version was machine-translated into the FR version (e.g. French sites never use 'Maison' to link to the homepage - they use 'Accueil'). Be aware that Google is perfectly capable of detecting auto-translated pages and considers this bad practice (check this video of Matt Cutts - starts at 1:50). So you might want to invest in proper translation or proofreading by a native French speaker.
rgds
Dirk
Hi Justin,
It depends a bit on where these links are & on how your site is built.
In general - if these links are in the navigation, footer, etc. (elements which are called on each page) - you probably only need to change them once and then it is OK for the full site.
If these links are inside articles, you will probably have to update them manually. It is sometimes possible to do something like 'find/replace' - so you could replace (as an example) all references to zenory.com with zenory.co.nz (for the NZ version) - but again, it depends a bit on the platform.
Check with whoever built your website - changing links manually can be a very annoying & time-consuming job (depending on the number of links that need to be changed). A programmer can sometimes find a solution to automate part of it.
rgds,
Dirk
Hi Justin,
If the same page exists as well on the .com version (which seems to be the case) I would replace the .co.nz version by the .com version.
In Screaming Frog, under Bulk Export, you can export all the outgoing links. This will give you both the pages with outgoing links & the pages they are linking to. Open the export in Excel and filter the destination pages on links that contain .co.nz or .com.au (when you are checking the .com version). This will give you all the source pages which need to be updated.
rgds
Dirk
Jennifer,
Some of your pages still reference http images - example https://www.tsheets.com/proadvisors-we-trust.php calls image http://cdn.tsheets.com/images/pros/denise-loter-koch.png
"Your connection to www.tsheets.com is encrypted with obsolete cryptography. However, this page includes other resources which are not secure. These resources can be viewed by others while in transit, and can be modified by an attacker to change the look of the page."
This is probably the reason for the 804 errors in Moz.
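Assuming the CDN also serves these files over https, the fix is simply to reference the secure URL in the page source, e.g.:
<img src="http://cdn.tsheets.com/images/pros/denise-loter-koch.png">
becomes
<img src="https://cdn.tsheets.com/images/pros/denise-loter-koch.png">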
You should also check your internal links - some of them still point to the http version, which is then redirected again to the https version.
Example https://www.tsheets.com/infographics/time-tracking-infographic-hr-industry links to http://www.tsheets.com/infographics/time-tracking-infographic which is then redirected to https.
Unrelated to the https - but you might want to optimise the image on https://www.tsheets.com/online-invoicing-and-billing/ (https://www.tsheets.com/online-invoicing-and-billing/images/main-image-billing.png)
rgds
Dirk
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have links on the page going to the other locations like /sandiego - so Google can access all these pages & index them. This is not possible in the scenario you mentioned.
Not sure how old the article from Unbounce is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google. It will help your Californian branch, but hurt all the others.
rgds,
Dirk
Hi Justin,
It's quite common to have links between different versions of a site if they are targeting different countries. It can also be considered good for the user experience to offer visitors the choice of the site which corresponds to their country.
You just have to make sure that the links between these different versions are voluntary, not accidental. If a user is browsing your site and suddenly switches from the .com to the .co.nz because the internal links on your site are not properly configured, this could be considered bad practice.
As usual - Screaming Frog is your best friend for this. Crawl the sites, check the pages with outgoing links to the other versions. If the link is accidental - correct it.
Hope this helps,
Dirk
Hi,
You could use this tool - http://flang.dejanseo.com.au/ - to check if the tags are properly implemented.
rgds,
Dirk
The easiest way would be to put a robots.txt in the root of your subdomain & block access for search engines:
User-agent: Googlebot
Disallow: /
If your subdomain & the main domain are sharing the same root, this option is not possible. In that case, rather than working with robots.txt, I would add a canonical on each page pointing to the main domain, or block all pages in the header (if this is technically possible).
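As an illustration (hypothetical URLs), the canonical option would look like this on each subdomain page:
<link rel="canonical" href="http://www.yourmaindomain.com/corresponding-page.html" />
and blocking in the header would be a robots meta tag on every page of the subdomain:
<meta name="robots" content="noindex, nofollow" />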
You could also check these similar questions: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt and http://moz.com/community/q/blocking-subdomain-from-google-crawl-and-index - but the answers given are the same as the options above.
Apart from the technical question: given the fact that only the labels are translated, these pages make little sense for human users. It would probably make more sense to link to the normal (English) version of the blog (and put "(en anglais)" next to the link).
rgds,
Dirk
Hi,
It seems that they are having technical issues - as mentioned by Rand last week in this question: http://moz.com/community/q/when-is-ose-updated
"Our index is taking much longer to run than expected. We think there may be a hardware or software issues in processing that's having problems. We'll be doing maintenance so we can get back to faster indices ASAP."
We'll just have to be patient I guess,
Dirk
Hi,
When you say that referral traffic was unaffected - did you see the burst of traffic in Analytics or not?
In Webmaster Tools, do you see clicks go down to zero during this timeframe? Was there an increased number of crawl errors during this period?
Was your site able to handle the traffic or did it go down? Sometimes, when sites are unstable or frequently offline, Google temporarily removes them from the SERPs until the situation has normalised.
Without additional info it's difficult to judge what exactly could be the cause. I have never heard of a sudden peak of traffic from one source having a negative impact on search traffic - normally more visits from other sources are rather positive for your rankings.
rgds,
Dirk
Hi Jennifer,
Migrating to https carries certain risks (like any other migration of your site). Without the actual URL it's difficult to assess what's wrong with the site.
1. You can check here if the SSL was properly implemented: https://www.ssllabs.com/ssltest/
2. There is an interesting article about the technical migration on the Yoast site (https://yoast.com/move-website-https-ssl/) - and about the potential SEO impact here: http://moz.com/blog/seo-tips-https-ssl - even if you have already migrated, you could go through the different steps and check whether you skipped one.
3. Try crawling the site with Screaming Frog - it has a Protocol tab that shows whether all pages are on https or if some are missing. You can also check if all your internal links are updated to the https version.
4. I guess you have created a WMT profile for the https version of your site - check if specific errors are listed there.
5. Check page speed with Google's PageSpeed analyser & webpagetest.org and compare your scores. It's possible that adding https also made your site slower.
6. Sample pages in different browsers - do you get security warnings when visiting pages? These messages can really frighten your visitors, affect stats like bounce rate & avg. visit duration, and as a result have an impact on your rankings.
7. Check vital stats in Analytics - like bounce rate, pages/visit, avg. visit duration, avg. time on page... - did you see major changes after the migration? Also check if you see an increase in 404 pages.
Hope this helps in solving your problem,
Dirk
Did you see a similar drop in Webmaster Tools?
When I see such a sudden drop & fast recovery, my first guess would be that something went wrong with your Analytics tag. How was your overall traffic during this period? Did you see the Facebook traffic appearing in referrals?
Apart from the Facebook push - did you make any technical changes to your site in the same timeframe?
rgds,
Dirk
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is the country. As the question was about locations "nationwide", I would conclude based on this article that at this point in time Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's a risky business - the article doesn't indicate whether these "local" bots visit sites with the same frequency & depth as the normal ones, and it doesn't clearly indicate which country IPs are used.
It's a different story for languages - because you can indicate in the HTTP header that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - here you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP header that the content is varying based on ip address.
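For illustration: the hint Google documents for dynamic serving is the Vary HTTP response header, sent along with the page, e.g.:
Vary: User-Agent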
rgds,
Dirk
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other location Googlebot's IP resolves to).
As an example - you manage a chain of Chinese restaurants spread around the country and you have the domain mychineserestaurant.com.
If a user accesses the site directly in New York, he will see the address, picture, phone number etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content of your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show the information based on the IP of the bot.
The Groupon example given by Patrick is not exactly the same - they personalise the homepage based on your IP - but if you search for Groupon New York you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on IP - but still have the other pages accessible via normal links. In the example above, accessing the site in New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
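A minimal sketch of that set-up (hypothetical paths): every location page carries plain, crawlable links to the other locations, e.g.:
<ul>
<li><a href="/newyork">New York</a></li>
<li><a href="/losangeles">Los Angeles</a></li>
<li><a href="/chicago">Chicago</a></li>
</ul>
The IP-based redirect then only chooses which of these pages the visitor lands on by default.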
If the personalisation is only minor (for example, only the local address on the homepage) and you already have targeted pages for each location, it should not really be a problem.
To be honest, this is rather my own opinion than something supported by hard facts.
Hope this helps,
Dirk
Hi Alick,
You might find this article useful: http://moz.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data - the two options listed are blocking bot traffic using htaccess, or filtering it out in Analytics. If you check the link I gave you in my first answer, there are also some useful resources listed. Other resources can be found here: http://moz.com/community/q/google-analytics-referral-spam (and of course, just google "referral spam" - it's quite a popular subject).
Hope this helps
Dirk
Hi,
It's quite possible that it's just a matter of time before the label shows, like Patrick mentions. However, if you check your site with PageSpeed Insights, there seems to be something strange with the way you implement the redirect: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.pssl.com - which could be the reason why the label does not appear:
"Your page has 2 redirects. Redirects introduce additional delays before the page can be loaded. Avoid landing page redirects for the following chain of redirected URLs."
If you want to check this "manually", you can inspect the headers yourself using web-sniffer.net with a mobile user agent. I would check with your technical team how to avoid this redirect chain.
Apart from that, if you check the Insights scores for speed, both mobile & desktop are not really great. Testing on webpagetest (http://www.webpagetest.org/result/150429_62_19PT/1/details/) - loading in 3.7 sec is not extremely bad, but not great either. You could probably reduce the load time by combining your 19 JS & 6 CSS files. For mobile, 400K of images & 270K of JavaScript (!) is probably a bit too much to load over a mobile connection (the test also shows the double redirect).
rgds,
Dirk
As far as I know, if your pages pass the "mobile friendly test" they get the "mobile friendly" label. If you run the homepage through Google's mobile-friendly test, it is considered mobile friendly - so it is OK for the mobile update.
If you change your user agent to mobile and do a site:billboard.com search, you can quite easily identify the pages that are not considered mobile friendly (you should see them in WMT as well). If these pages are important in terms of search traffic, you might consider creating a mobile version of them as well.
I don't really understand the point of the redirection for the Google bots - your system should normally send all mobile devices to the mobile version. Googlebot will identify itself as a mobile device, so it should be redirected anyway. There is no need to create specific redirects for the bots.
I would try to focus on the performance on your pages - the score from PageSpeed insights is not great (for mobile/desktop) - https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.billboard.com%2F&tab=mobile
The desktop version is loaded in 8 sec (http://www.webpagetest.org/result/150429_4Q_15JQ/) - mobile fully loaded in 20 sec (http://www.webpagetest.org/result/150429_S8_15QW/) - so there is probably some room for improvement.
Having the mobile-friendly label is the first thing, but if your site is slow to load the user experience will not be great, and this can impact your rankings for mobile (regardless of the label).
Hope this helps,
Dirk
I would prefer the load times in Analytics. It samples the actual load time of the pages on your site, and is a good indication of how fast your users are seeing your content. You build your site for users, not for search engines. Normally, the faster your site loads, the better the user experience will be.
Apart from that, Analytics allows you to analyse which browsers, operating systems, etc. have the best/worst loading times, which helps you prioritise the issues that should be solved.
PageSpeed Insights is a great tool and will give you a lot of useful information on how you can optimise. It is not, however, a measure of how fast a page is loading. If you have 4 x 200KB images on your site that are losslessly compressed, the tool will be quite happy to give you a good score on image optimisation, even though images of that size will take ages to load over a mobile connection. On the other hand, it can give you a low score for some render-blocking JavaScript or CSS file that in reality hardly affects the user experience.
There is a third tool I often use to measure page speed (webpagetest.org) - it also indicates areas of improvement and gives scores on each individual item, and it shows you the load time of each individual element on your page. Maybe its most important feature: it lets you see how fast the visible content is completely rendered on screen (which is in fact the most important measure for your visitors).
Hope this helps,
Dirk
Hi,
I am not saying that schema is bad or that you shouldn't do it - it just seems that some big players only use schema on the detail pages of an individual product & not on the overview pages. I found an example of a site using it on list pages - but in the SERPs it's only the average rating which appears (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke).
You can always test what the impact will be - as mentioned before, I guess even for 50 elements fully tagged with schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. Probably the HTML will only account for 10-20% of the load time - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text compresses quite well).
rgds
Dirk
Hi,
It's not always best practice to use a canonical url for paginated pages - according to Google:
"In cases of paginated content, we recommend either a rel=canonical from component pages to a single-page version of the article, or to use rel=”prev” and rel=”next” pagination markup."
Again from Google:
"When using rel=”alternate” and rel=”canonical” markup, maintain a 1-to-1 ratio between the mobile page and the corresponding desktop page. In particular, avoid annotating many desktop pages referring to a single mobile page (or vice versa)."
If you decide to stick to the canonical for the pagination, your desktop site would get a config like this on http://www.example.com/article.htm&page=2:
<link rel="canonical" href="http://www.example.com/article.htm" />
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/article.htm&page=2" />
On the mobile version m.example.com/article.htm&page=2 this would become:
<link rel="canonical" href="http://www.example.com/article.htm&page=2" />
This way you keep the 1-to-1 relation between mobile & desktop and still indicate to Google that not page=2 but the first page should be indexed. It's quite possible that a canonical from the mobile page directly to http://www.example.com/article.htm could also work, but the other option seems to be more in line with the guidelines.
Hope this helps,
Dirk
Hi,
You might want to check the FAQ on OSE, which discusses the most common reasons why no links show up and why there is a difference between OSE & Google WMT.
rgds,
Dirk
Hi John,
I seem to be unable to delete the image myself - I will ask Moz if they can remove the link. I also made the same request @Imgur.
rgds,
Dirk
Hi,
Nothing is wrong. The non-ASCII URL is translated to ASCII - the nonsense symbols you see are the encoded characters (you can test this yourself here: http://www.w3schools.com/tags/ref_urlencode.asp).
As Moz works with the "translated" version, it will not find the original version - but it will find the ASCII version. Google is perfectly able to handle these "international URLs".
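For example (illustrative URL): a page at example.com/café is actually requested as example.com/caf%C3%A9 - %C3%A9 is simply the encoded form of é, and both refer to the same page.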
That said, Moz is not really best in class for foreign languages with a lot of symbols and accents - these characters render the Page Grade report virtually useless (see also this question: http://moz.com/community/q/foreign-language-character-sets-in-page-grade-reports).
rgds,
Dirk
Hi
It's an attempt to combine Analytics spam with being funny. In any case, you're not the only one - see this discussion posted yesterday: http://moz.com/community/q/analytics-spammer-can-haz-humors-who-else (that one is a bit better - the words are not in a foreign language, but funny faces made out of symbols).
rgds
Dirk
Here is an example of mark-up for multiple currencies (product name and prices are illustrative):
<div itemscope itemtype="http://schema.org/Product">
<span itemprop="name">Product name</span>
<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<meta itemprop="priceCurrency" content="USD" />
$<span itemprop="price">119.00</span>
</div>
<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
<meta itemprop="priceCurrency" content="EUR" />
€<span itemprop="price">109.00</span>
</div>
</div>
The mark-up pattern comes from a reliable source.
With this mark-up, Google should be perfectly able to show the correct price in the SERPs, depending on the country of origin of the user.
Hope this helps,
Dirk
Sean
It's not really necessary to use canonical URLs. If you are sure that every piece of content on your site is available on a unique URL, you don't need to implement them.
It doesn't hurt to have them either. Canonical URLs (if implemented properly) can help to avoid duplicate content issues. Like Patrick mentioned, having canonicals doesn't imply that no duplicate content issues exist (I've seen sites where the canonical is always simply set to the current URL - which renders it completely useless).
Crawl tools like Screaming Frog are the best option to check if you need canonicals, and if you have them, to check if they are properly implemented.
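A minimal illustration (hypothetical URLs) of a properly implemented canonical: a parameter variant such as http://www.example.com/product?sessionid=123 declaring the clean URL as the preferred version in its head:
<link rel="canonical" href="http://www.example.com/product" />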
rgds,
Dirk
You can't track this. You can only track the outgoing links. If visitors simply type google.com (or any other site) into the browser, you are unable to track them.
rgds
Dirk
You can put it on all your product pages. It seems a logical thing to do, both for your company & your users - a good score from a reliable source will reassure potential buyers that you're a serious company they can trust to buy from.
I am not sure whether the aggregate rating will be displayed in the SERPs (most examples you see are ratings & reviews for individual products), but it will certainly do no harm.
rgds,
Dirk
Not sure if you could really call it a best practice - but in Belgium (3 different languages) the normal configuration is not to determine the default language automatically, but rather to present first-time visitors with a "choose your language" page and store the choice in a cookie for future visits. This mainly applies to direct visits.
People coming in via search engines use queries in one of the languages, so normally Google will send them to pages in that language. Again, on the first visit the implicit choice of language is stored in a cookie.
All pages contain a link to switch to the other language(s) - which also changes the choice stored in the cookie.
The disadvantage of this system is that you add an additional layer to the site (the language choice) - the advantage is that the error margin is zero.
Systems based on IP, browser language, etc. are not 100% reliable - which can lead to unwanted results (in Belgium it's quite a sensitive issue if you serve a Dutch page to a French-speaking person, and vice versa).
Hope this helps,
Dirk
It is in line with Google's policies as long as you add the rating to your organisation & not to an individual product.
If you check the different guidelines (sources: https://developers.google.com/structured-data/policies & https://developers.google.com/structured-data/rich-snippets/reviews), I don't see any issue here.
rgds,
Dirk
Hi,
The tagging seems to be at organisation level and looks OK at first sight (you can always check the tag inside WMT or using this tool). From your question I do not really understand where you add the product-specific info. Normally these should be two separate itemtypes: organisation / product (the one for the product should be without the aggregate rating, as this only exists for the organisation & not for the product).
You could check the example on https://schema.org/LocalBusiness (scroll to the bottom) for the tagging of your organisation. For the product - you could check the tagging on https://schema.org/Product (idem).
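A minimal sketch of keeping the two separate (names and values are illustrative):
<div itemscope itemtype="http://schema.org/Organization">
<span itemprop="name">Your Company</span>
<div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
rated <span itemprop="ratingValue">4.6</span>/<span itemprop="bestRating">5</span> based on <span itemprop="reviewCount">128</span> reviews
</div>
</div>
<div itemscope itemtype="http://schema.org/Product">
<span itemprop="name">Product name</span>
<!-- product properties here, but no aggregateRating -->
</div>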
Hope this helps,
Dirk
Hi,
If you read this article (https://www.mattcutts.com/blog/search-results-in-search-results/), the official guideline is: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages **that don't add much value for users coming from search engines**" (bold added).
The question is: what is a search result page? If you're selling LCD TVs, the page which shows only Panasonic TVs could be considered a search result from a query on the site, but it could also be considered a page which offers value for users searching for a Panasonic LCD TV. The same goes for 'jobs in Montreal' - one of the first results is http://ca.indeed.com/jobs-in-Montréal,-QC - which is the same result you would get if you searched for Montreal on http://ca.indeed.com/
If these sites didn't index these "search result pages", they would almost never show up in the SERPs. I think the important part is "adding value for the users".
For dynamic search pages (or faceted navigation) Google has even published best practices (http://googlewebmastercentral.blogspot.nl/2014/02/faceted-navigation-best-and-5-of-worst.html) - even though you could consider all of these pages search results as well.
Hope this clarifies,
Dirk
Hi,
I don't really understand your remark "it's definitely not a duplicate site" - in my view it is exactly that.
The set-up seems quite straightforward: both domain names point to the same server, which is a classic case of duplicate content. Both domains are registered by the same company (I guess your client, who probably forgot about the existence of the old domain) - so all the content is available on both domains.
What you should do is choose a principal site & 301 redirect the other domain to the main domain, which should solve the issue.
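Assuming the site runs on Apache, a minimal .htaccess sketch of such a domain-level redirect would be (hypothetical domain names - adjust to the real ones):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.main-domain.com/$1 [R=301,L]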
rgds,
Dirk
A good checklist on site migration can be found here: http://moz.com/blog/web-site-migration-guide-tips-for-seos
Hope this helps,
Dirk
With a 301 you communicate that the requested resource is no longer available at that address ("The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs" - source: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html).
If you look at the definition of a canonical url - it indicates the preferred URL to use, so that the search results will be more likely to show users that URL structure. (Google attempts to respect this, but cannot guarantee this in all cases.)
So basically, what you are telling Google is:
On site.com/A.htm you ask Google not to index it, but rather to index site.com/B.htm.
On site.com/B.htm you put a 301 to site.com/C.htm - in other words, you force Google to index C.htm rather than B.htm (the 301 indicates that the page has permanently moved to a new location, so it is no longer available on B.htm).
So in fact you ask Google to index not A.htm but C.htm. Rather than doing this in a complicated two-step process using both a canonical & a redirect, it would be simpler & make more sense to put a canonical on A.htm pointing directly to C.htm.
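To make it concrete (the filenames are just placeholders): on site.com/A.htm you would simply put
<link rel="canonical" href="http://site.com/C.htm" />
and leave B.htm out of the equation.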
In your case you could create www.site.com/iphone, but if it's identical to www.site.com/iphone(black,16,000000000010204783).html I don't think you will gain a lot (especially if it requires a lot of development).
rgds,
Dirk
As René mentioned, there is very limited content on the pages. Given that the site is about images, you should try to optimise the images you publish on your site. A good guide can be found here: http://moz.com/blog/image-seo-basics-whiteboard-friday
You might also want to check the configuration of your site - http://www.webpagetest.org/result/150427_01_FW1/ - while you get A scores for every element, your homepage takes 17 seconds to load & pulls in about 865KB of data, mainly JS & images. Some of the elements on your page return a 404 - you might want to work on that.
A similar picture in Google's PageSpeed analyser: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fbrandstenmedia.com.au%2F&tab=mobile
=> mobile 56/100 - desktop 86/100
The slider on your homepage doesn't seem to work properly (I tested in Chrome on Mac)
rgds,
Dirk
Hi Justin,
The easiest way is just to take the pages offline - they will return a 404 status & after a while they will disappear from the index.
If you are regrouping the content into a new article, it's probably better to redirect these pages to the new one using a 301 redirect. Here's a how-to: http://www.howto301redirect.com/wordpress-page-301-redirect/ (using a plugin). You could also manage the redirect directly in .htaccess - see the example here: https://wordpress.org/support/topic/proper-301-redirect-in-htaccess (you can manage the .htaccess via the Yoast plugin).
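A minimal .htaccess sketch (hypothetical paths - adjust to your permalink structure), redirecting an old post to the new, regrouped article:
Redirect 301 /old-post/ http://www.example.com/new-combined-article/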
rgds,
Dirk
Normally the page you use as canonical URL should be physically available - see also: http://googlewebmastercentral.blogspot.be/2013/04/5-common-mistakes-with-relcanonical.html
With a canonical you indicate to the search engines which page you want to have listed in the SERPs. A page which is 301'd to another page will never get listed in the results.
In your case it's probably better to use the URL you are redirecting to as the canonical - or to create a page www.site.com/iphone that is not redirected.
rgds,
Dirk
Hi,
I am not sure that adding schema.org on a result page adds a lot of value. If you send 50 different blocks of structured data, how should search engines understand which piece is relevant to show in the SERPs? I just did a check on 2 different sites (allrecipes.com & monster.com) - they only seem to use the schema markup on the detail pages, not on the result pages.
If you would like to go ahead, you could always try to measure the impact on the page by creating two (static) versions of a search result page - one with & one without markup - and testing both versions with webpagetest.org & Google's PageSpeed analyser. An alternative would be to use "lazy loading" - you first load the first x results (the visible part of the screen); when the user scrolls, you load the next batch, and so on. This way the impact on loading times remains minimal.
In any case, I would not try to show different pages to users & bots.
rgds,
Dirk
Just to be sure that I understand what you're saying:
On pages like https://modli.co/dresses/elegant.html & https://modli.co/dresses.html you want to use rel next/rel previous.
On pages like https://modli.co/dresses.html?category=43&size=25 you plan to use a canonical to https://modli.co/dresses.html
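For illustration, that canonical on the filtered URL would simply be
<link rel="canonical" href="https://modli.co/dresses.html" />
in the head of https://modli.co/dresses.html?category=43&size=25, while the paginated pages carry rel="prev"/rel="next" links to their neighbouring pages.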
Seems like a good strategy - the only minus is that you would not have a corresponding landing page for somebody looking for "red dress" (the page exists on the site but points to the generic dresses page). Not sure whether these types of queries represent real search volume & whether they would be important for your business.
rgds,
Dirk
PS: You should check the H1 you use on your pages - on https://modli.co/dresses/elegant.html the H1 is "Elegant" - it would be much better if it were "Elegant Dresses".