Hi Becky,
You are correct - normally if a tag is fired it won't be counted as bounce (unless you set "noninteraction=true" - check https://support.google.com/analytics/answer/1033068#NonInteractionEvents)
Dirk
We did something similar in the past. Changing the servers won't make any difference as long as the front-end part isn't changing. Google doesn't care if your site is running on one or multiple servers, or if two parts of the sites are running on different technology, as long as the user experience is not affected.
If you are changing your URL structure at the same time, there are some things you will have to check. There was a post on Moz a few years ago on site migrations that could help. Basically, you will have to check that all your current URLs are properly redirected to the new ones and that the performance of the new site is OK.
Dirk
Don't have hard evidence - but from my personal perspective: my browser is set to be-nl (Belgium). When I'm in the Netherlands (nl-nl) I am automatically redirected to google.nl & all the results I get are from the Netherlands (even for international sites where both Dutch and Belgian versions exist). Browser language will have an impact - but in my opinion proximity will be more important.
Dirk
There is a very useful & detailed post on site migration on Moz - I used it on two previous migrations and it worked great (no traffic drop at all) - check https://moz.com/blog/web-site-migration-guide-tips-for-seos
Dirk
Maybe Google is desperately trying to find a page which responds within a reasonable delay... Time to first byte is horrible: http://www.webpagetest.org/result/151104_SQ_WXJ/1/details/
Apart from the speed issue - the 301 as proposed by Justin is the best solution.
Dirk
Nope - Moz is completely unaware of settings you define in Google Search Console.
Why don't you pick a preferred version & 301 redirect the other one to the preferred one?
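For reference - a minimal sketch of such a redirect in .htaccess (the hostnames are placeholders, so adapt them to your own domains and test before deploying):

```apache
RewriteEngine On
# 301-redirect the non-preferred host (here: non-www) to the preferred www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```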
Dirk
Hi,
You might want to try this solution:
# is not an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# but an existing .php file
RewriteCond %{REQUEST_FILENAME}.php -f
# and the request URI ends with .php
RewriteCond %{REQUEST_URI} ^(.+)\.php$
# 301-redirect file.php to file
RewriteRule (.*)\.php$ /$1 [R=301,L]
Note that you also need a matching internal rewrite so that requests for /file still serve file.php.
Source: http://stackoverflow.com/questions/1992183/how-to-hide-the-html-extension-with-apache-mod-rewrite (2nd answer)
Dirk
Moz had a similar caching issue one week ago (/ugc/ content cached instead of /blog/ pages). They posted the question on the Google Webmaster forum - maybe the answers can help you find a solution (there was an answer from John Mueller as well).
Dirk
Normally at the bottom of the graph you see a message stating "Table omitted as it contains no additional information. Select Queries to see the top queries for the current results"
Once you have set the filter on the page you'll have to click on "Queries" again to see the queries that correspond with the page you have put in the filter.
Google does everything to make our lives easier...
If Moz is crawling them it implies that somewhere in the source of your pages you are linking to these URLs. Try crawling your site with a tool like Screaming Frog & check which pages are generating these links. Then look in the source of the page - it should be somewhere in the code.
It would already help if you would put a canonical URL like http://www.example.ch/de/example/marken/brand/make-up/c/Cat_Perso_Brand_3 on the page.
Dirk
Hi
Google is being nice & is doing exactly what you ask it to do.
Example: http://www.customlogocases.com/custom-motorola-phone-cases-printed-logo/ - in the source you put:
<link rel="canonical" href="http://www.customlogocases.com/custom-motorola-phone-cases-printed-logo/?limit=all"/>
So - you ask Google to show http://www.customlogocases.com/custom-motorola-phone-cases-printed-logo/?limit=all in the search results rather than http://www.customlogocases.com/custom-motorola-phone-cases-printed-logo/
Update the canonical & remove the limit=all & the problem will be solved.
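Once the canonical is updated, you can verify it from the command line. A quick sketch - here the corrected tag is written to a local sample file, while in practice you would first fetch the live page (e.g. with curl):

```shell
# Sketch: verify the canonical after the fix. The page source is a local
# sample here; in practice fetch the live page first.
cat > /tmp/canonical_check.html <<'EOF'
<link rel="canonical" href="http://www.customlogocases.com/custom-motorola-phone-cases-printed-logo/"/>
EOF
# extract the canonical tag - it should no longer contain limit=all
grep -o 'rel="canonical" href="[^"]*"' /tmp/canonical_check.html
```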
Dirk
You could launch it on the .com.au extension - but I fear that the geotargeting resulting from the ccTLD is a much stronger signal for Google than the hreflang tag.
Not sure if this link is still valid but it states:
Q: Does “rel alternate hreflang” replace geotargeting?
A: No. This link-element provides a connection between individual URLs, and only allows Google to “swap out” the URLs from your site currently shown in the search results with ones that are more relevant to the user. It does not affect ranking, as geotargeting would.
Check this video from Matt Cutts (2013) on how Google deals with ccTLD's: https://www.youtube.com/watch?v=yJqZIH_0Ars
If your .com domain is not available - check whether one of these generic TLDs is available for your domain - and redirect to the .com as soon as you can. It's not optimal - but I honestly think that the .com.au/us/ solution is not going to work.
Dirk
Hi,
The Moz suggestion on the canonical is to put a self-referencing canonical - not to put a canonical on the subpage pointing to the homepage (or the other way round). Canonicals should only be used to avoid duplicate content - which I guess is not the case with your homepage/subpage.
The fact that Google prefers to show your homepage rather than your subpage is something which isn't really 100% under your control - without seeing the actual site it's nearly impossible to say why this happens. My guess would be that your homepage probably has a higher authority (more incoming links), which Google could prefer over optimised content. If you browse the Q&A you'll notice that you are certainly not the only one in this situation.
Dirk
If it's only about the H1: if you look at the source, the H1 tag is commented out.
Example: http://runforcharity.com/start-to-run/get-off-that-couch/getting-fit-for-a-marathon - the H1 tag in the source sits inside an HTML comment.
Seems more an issue with your developers than with the content.
The main title on that page is GETTING FIT FOR A MARATHON - which is the H2 title. I would change this to the H1.
Dirk
The issue you have is quite similar to the issue you had previously https://moz.com/community/q/sitelinks-issue-different-languages - the same answers apply.
When I check the cached version of the .com version I get the French site
Personally I think your IP detection system is messing things up. Checking your headers (from Belgium) I get this:
HTTP/1.1 302 Found -> http://www.revolveclothing.com/
HTTP/1.1 302 Found -> http://www.revolveclothing.com/?nrv=true
HTTP/1.1 301 Moved Permanently -> http://www.revolveclothing.fr/r/Homepage.jsp?nrv=true
HTTP/1.1 200 OK
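If you want to reproduce this check yourself, you can save the output of `curl -sIL` for the URL and summarise the chain. A sketch using a transcript that mirrors the chain above:

```shell
# Sketch: summarise a redirect chain from saved "curl -sIL" output.
# The transcript below mirrors the chain observed above.
cat > /tmp/redirect_headers.txt <<'EOF'
HTTP/1.1 302 Found
Location: http://www.revolveclothing.com/

HTTP/1.1 302 Found
Location: http://www.revolveclothing.com/?nrv=true

HTTP/1.1 301 Moved Permanently
Location: http://www.revolveclothing.fr/r/Homepage.jsp?nrv=true

HTTP/1.1 200 OK
EOF
# print each status code and where it redirects to
awk '/^HTTP/ {print "status:", $2} /^Location:/ {print "  redirects to:", $2}' /tmp/redirect_headers.txt
```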
Even if you implemented all the tags correctly there is no guarantee that Google will show them in the SERPs - the choice whether or not to show them is entirely up to Google.
I guess you already checked if all the tags are properly implemented (if not - check here or here)
You could also check this page for the usage guidelines (bottom of the page).
Again - even if you are doing everything correctly & you meet the guidelines - it's entirely up to Google whether the snippets will be shown or not. You can't control it.
Dirk
Don't know if you made changes in the meantime - but if I check the URL you seem to be doing it OK:
The second url you mentioned will probably not appear in the index due to the canonical.
Dirk
Was a bit too quick in my first answer - at first sight it looks OK - but you messed up the canonicals.
All variations of http://www.key.co.uk/en/key/plastic-storage-boxes#orderBy:5&pageView:list& have the canonical URL http://www.key.co.uk/en/key/plastic-storage-boxes - which is great, as it strips away the parameters. It also has a rel next pointing to http://www.key.co.uk/en/key/plastic-storage-boxes?page=2 - which is fine as well.
However - the second page (if you start from the ugly URL http://www.key.co.uk/en/key/plastic-storage-boxes#orderBy:5&pageView:list&) has the URL http://www.key.co.uk/en/key/plastic-storage-boxes#productBeginIndex:30&orderBy:5&pageView:list& - and this page has the canonical URL http://www.key.co.uk/en/key/plastic-storage-boxes - in fact that page should have the canonical URL http://www.key.co.uk/en/key/plastic-storage-boxes?page=2
The coding on the "clean" (=canonical) versions of your URLs is fine. Check https://support.google.com/webmasters/answer/1663744?hl=en - there is an example listed which is quite similar to your configuration.
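To illustrate what the fix would look like: these are the head tags the second page should carry (printed via a heredoc; the existence of a ?page=3 is an assumption on my side):

```shell
# Sketch: the <head> tags page 2 should carry, per the pagination setup above.
# A third page (?page=3) is assumed to exist.
cat > /tmp/page2_head.txt <<'EOF'
<link rel="canonical" href="http://www.key.co.uk/en/key/plastic-storage-boxes?page=2"/>
<link rel="prev" href="http://www.key.co.uk/en/key/plastic-storage-boxes"/>
<link rel="next" href="http://www.key.co.uk/en/key/plastic-storage-boxes?page=3"/>
EOF
cat /tmp/page2_head.txt
```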
Dirk
Bonus: You indicate that you don't have problems with 404's - but your site isn't exactly clean concerning the internal links. Can't crawl it using Screaming Frog given the 503 I get - but manually checking a few pages it seems you have malformed links:
Classic Car Insurance (the .com is missing) - it's not a 404 but a DNS lookup error
I would do a full scan with Screaming Frog to check what other technical issues you have on the site.
It's a very thin affiliate site with 0% original content (all content = Amazon). On top of that - it's quite heavy to load, has no optimisation whatsoever (H1/meta/etc.), several elements on the page return a 404 status, pagespeed scores are low and, as it is new, there are no incoming links.
You could check the logs - it's quite possible that Googlebot hasn't discovered the site yet. If it has visited, it probably considered the site too low quality to index. If it hasn't visited, you could register in Search Console and do a "Fetch as Google".
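Checking the logs can be as simple as grepping for the Googlebot user agent. A sketch with a made-up sample log - the path and log format are assumptions, so adapt them to your server:

```shell
# Sketch: has Googlebot visited at all? The log path and format below are
# made-up sample data - point this at your real access log.
cat > /tmp/access_sample.log <<'EOF'
66.249.66.1 - - [04/Nov/2015:10:00:01] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [04/Nov/2015:10:00:02] "GET /page HTTP/1.1" 200 "Mozilla/5.0"
EOF
# count requests made by Googlebot
grep -c "Googlebot" /tmp/access_sample.log
```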
It will probably put some pages in the index - but there is no chance that with the current content this site is going to rank.
Dirk
Sorry Peter - I was first - the reward is mine
While I don't disagree with Donald's remarks - I don't think it's feasible to create really unique descriptions for each type of product - especially when it's only the dimension that changes (like the length you mention).
I think you mainly have two options:
1. Keep the duplicate content (which is not necessarily a cause for punishment: check https://support.google.com/webmasters/answer/66359?hl=en), especially if these pages are currently generating traffic & from a technical perspective it's not feasible to use one page where you can select the changing dimensions
2. Use canonicals to a type of category page - which contains the general description with links to the products with the different dimensions
The 2nd would be my preferred solution - but again, if not feasible I would just live with the semi-duplicated content (we have some sites in this situation which are doing just fine). It's maybe a risk - but I guess it's not really a big one. You're not the only one in this situation - e-commerce sites selling common products like batteries or replacement parts are in the same situation as you. Many of them do just fine keeping the "duplicate" content.
Dirk
The reward is not important - it's solving your issue which is the thing that matters. I don't know Wordfence - but I get suspicious if a tool from Google (PageSpeed Insights) renders a 503 error. This could indicate that Googlebot also might encounter trouble when checking your site. The best way to check is to go over your log files & check the responses you serve to Googlebot.
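A sketch of such a log check - it counts the status codes served to Googlebot in a made-up sample log (the log format and field position are assumptions, so adapt them to your setup):

```shell
# Sketch: status-code breakdown for Googlebot requests in a sample log.
# The format is made up - adjust the awk field index to your real log format.
cat > /tmp/bot_responses.log <<'EOF'
66.249.66.1 "GET / HTTP/1.1" 200 "Googlebot/2.1"
66.249.66.1 "GET /blog/ HTTP/1.1" 503 "Googlebot/2.1"
66.249.66.1 "GET /shop/ HTTP/1.1" 503 "Googlebot/2.1"
EOF
# second-to-last field is the status code; count occurrences of each
grep "Googlebot" /tmp/bot_responses.log | awk '{print $(NF-1)}' | sort | uniq -c
```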
Removing the country blocking now seems to enable PageSpeed Insights to work properly. The score is not great (50/100 - mobile), so it more or less confirms the test by Webpagetest.org. Try to enable caching for static resources like images and enable gzip to compress data transfer. On top of that - compress your images (some of them are really heavy - check http://www.lesliekays.com/wp-content/uploads/2014/09/brake-hand-position.jpg). The other elements are probably more difficult to do - but these things could already get your score to around 70%.
I would try to keep the Wordfence disabled for a week or two - to see if situation improves. Without the log files it's difficult to be sure - but my gut feeling is that this is the main reason why the site isn't ranking. Working on site performance can only help with this - the suggestions above shouldn't be that difficult to implement.
Some other issues - your internal links are not optimal - you have about 600+ links going to 404's (a lot of them images), 7 with wrongly formatted URLs (without the .com) and 600+ URLs which are redirected (in your internal linking you are mixing the www & non-www versions) - so you might want to clean these up as well. It shouldn't be a reason for not getting indexed, but it's something you should have a look at - especially since the site has had some issues before.
I ran a crawl - but it seemed to go on forever - so it's possible you have some kind of infinite loop as well. I stopped at 1781 URIs crawled.
I would recommend investing the 99 GBP in Screaming Frog - it would really help you identify and solve these issues.
Dirk
Similar to Ubersuggest: keywordtool.io
Can't you do it the other way? Put the temp HTML site on www - use the www2 to put the Drupal instance. Once the other is ready - remove the HTML & activate the Drupal version? Don't really understand what you're going to do with the www version while the www2 temp version is online.
If not possible - can only agree with Egol - it's quite a risk you're running. Quite possible your rankings will take a dive. As we all know, it's easy to go down but a lot harder to regain position.
Hi Egol,
Interesting question, but difficult to answer. Could be a topic to ask on one of the Webmaster hangouts.
It all depends on how Google handles canonicals internally.
One possibility would be that Google considers the page from A that is syndicated on B not really as a page from B but as a page from A. In that case, the links from that page would count as internal links (A->A) rather than as external links (B->A).
Another possibility would be that Google considers the fact that B is republishing the content from A as a kind of endorsement for A (in a non SEO world a site would only republish content from another site if the quality was really good). In that case, the links on the syndicated page would have value.
In both cases I would personally keep the links on the page. If you added them, it implies you think these links have some value for the visitor so taking them off wouldn't make much sense (unless your main goal was to add these links in order to optimise your internal link structure)
If you want to be on the safe side: if the links go to "commercial" pages, you could make them nofollow; if they go to other editorial content I would keep them as follow. I wouldn't omit the links - even when "nofollow" they can still generate traffic for your site.
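For clarity, the two treatments side by side (the URLs are placeholders):

```shell
# Sketch: a followed editorial link vs a nofollowed commercial link.
# The URLs are placeholders.
cat > /tmp/link_examples.html <<'EOF'
<a href="http://example.com/editorial-article/">followed editorial link</a>
<a href="http://example.com/product-page/" rel="nofollow">nofollowed commercial link</a>
EOF
cat /tmp/link_examples.html
```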
Didn't find any "hard evidence" to support this, but we seem to have reached the stage where Google has scared us so much about "bad links" that we start to question all types of incoming links.
Sometimes you just have to trust your gut feeling - if the link looks "normal" in the context (and adds some value for the visitor) I would stick to a follow link.
Dirk
+1 for Egol here. A canonical is just a request to Google - a 301 is a directive Google has to respect. I don't really understand why your technical team is making such a fuss about it - enforcing the trailing slash (or not) is just one or two lines in your .htaccess file. Check Stack Overflow.
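A sketch of what those lines could look like - this version enforces the trailing slash on URLs that are not existing files (untested against your setup, so verify before deploying):

```apache
RewriteEngine On
# If the request is not an existing file and the URL has no trailing slash,
# 301-redirect to the same URL with the slash appended
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```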
Dirk
The crawl by Googlebot will be more efficient if it can go directly to the destination page rather than having to go through a redirection.
There is some discussion whether 3xx redirections do have an impact on page rank / page authority - Google official point of view is that it doesn't.
Redirects do, however, have an impact on speed (check https://developers.google.com/speed/docs/insights/mobile: "we strongly encourage webmasters to minimize the number, and ideally eliminate redirects entirely" - the context is mobile but it applies to all redirects) - but again, for most sites this won't make a huge difference on total load time.
I doubt that simply cleaning up your site and removing the 301's will give a boost to your search traffic, but it's just something you need to do from time to time (idem for internal 4xx errors) to improve the general health of your site.
Dirk
PS I am a bit puzzled about the remark "auto redirects" - if you redirect, you must make sure you redirect to a page which is similar to the page that has disappeared. Google considers most other types of redirects as "soft 404's". If the page never existed - like domain.com/kklfjklgjkldfjg - it should return a 404 and not be redirected.
The answer from Kristen is correct. However, changing the 404 to a 410 will just make these pages appear as 410 in the Search Console. The fact that they are appearing is not a problem - it's just that Google wants to notify you that pages return a 4xx status. If this is intended (like in your case) you can just ignore these messages and mark them as fixed.
In your case you could also consider another option - remove the pages from the listings but keep them published (with status 200). Update the page, indicating that the original property is sold, but list some other (similar) properties as an alternative. This way, if there are external pages linking to the property page the link value doesn't get lost, and if people accidentally land on this page they still find content which could be interesting to them (as you remove the navigation links to these pages they become orphans - so there is little chance that they will rank very high in Google).
Dirk
Hi Satish,
Not sure if I fully understand your answer.
Google will consider a redirect as a soft 404 if you redirect pages to non-related pages. Example: if you redirect a page about "shirts" to a page about "pants" or if you redirect this page to your homepage. If the pages are similar (example "green shirts" to "shirts") it's not considered as a soft 404. I understand that you are redirecting to similar pages - so that should be ok.
If you have pages with low-quality incoming links (or a mix of high/low quality links) you can still redirect them - but in the case of low-quality links it's probably a good idea to disavow them (using Search Console) - check https://support.google.com/webmasters/answer/2648487?hl=en
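The disavow file itself is just a plain text file you upload in Search Console - a sketch with placeholder domains and URLs:

```shell
# Sketch: a minimal disavow file (domains and URLs are placeholders).
# Lines starting with # are comments; "domain:" disavows a whole domain.
cat > /tmp/disavow.txt <<'EOF'
# links from a spammy directory
domain:spammy-directory.example
# an individual low-quality page
http://bad-site.example/page-with-link.html
EOF
# count the actual disavow entries (non-comment lines)
grep -c -v '^#' /tmp/disavow.txt
```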
Hope this helps,
Dirk