Thanks Highland!
Posts made by AxialDev
-
From menu to dropdown menu: Is there a risk?
Good day!
We are thinking about replacing a traditional menu on an e-commerce website with a Shop button like on Amazon, with a dropdown and expandable sub-menus.
Current menu: Category 1 | Category 2 | Category 3 | ...
New menu: Shop | Search bar
The Shop menu would expand on mouse hover. When clicked, it would link to a directory like on Amazon: https://www.amazon.ca/gp/site-directory/.
Is there anything we should be worried about (e.g. link juice, engagement) or considerations to think about (a CSS-based vs. JS implementation)?
Thanks for your time!
Ben
-
RE: Http v https Duplicate Issues
Since HTTPS is now a ranking signal, it is better to use the HTTPS version as the canonical. I would personally make every page of the site HTTPS via 301 redirects (or rel=canonical, but that can be trickier to implement).
http://site.com --301--> https://site.com
http://site.com/page1/ --301--> https://site.com/page1/
etc.
This may require a few changes to the site (internal links shouldn't go through unnecessary redirects, adding the HTTPS site to Search Console (Webmaster Tools), etc.), so make sure you look around for resources on migration.
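On Apache, the 301s above can be done with a rewrite rule like this (a minimal sketch assuming mod_rewrite is enabled; adjust for your server and setup):

```apache
# Redirect every HTTP request to its HTTPS equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```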
If you decide to keep HTTP only, do not noindex or disallow HTTPS because you may have valuable links pointing to HTTPS which help your ranking.
-
RE: Numbers in URL
The reference uses the words "Consider" and "when possible", which is not as clear as other suggestions Google makes. Instructions are crystal clear for other on-page techniques, such as hreflang.
As a power user who works with clients in multiple languages, I frequently switch between languages using the URL, like going from https://support.google.com/webmasters/answer/76329?hl=en to https://support.google.com/webmasters/answer/76329?hl=fr. This wouldn't be possible if the URL was https://support.google.com/webmasters/answer/keep-a-simple-url-structure. For this particular use, I would argue the former are more "user-friendly" than the latter!
More and more, the URL is becoming a relic of the past. Site names and breadcrumbs are replacing it in SERPs, browsers on mobile hide it by default, and there is no URL bar in recent in-app browsers (Twitter, Facebook, LinkedIn).
On the other hand, it has been said in the past that keywords in URLs help search engines understand the context of a link when there is no anchor text.
A few things to consider:
- The need to create 301 redirects and the risk of losing traffic
- The impact on on-site SEO (hreflang, canonicals, sitemaps, internal links, etc.)
- The qualitative impact (do your users even expect this feature?)
- Most importantly, the fact that it's probably a low priority optimization!
- If at all possible, consider running an experiment.
Hope this helps! I left out a clear answer on purpose - because I don't have one.
-
RE: Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
Ahh I get it now, redirect every URL from the old site to its homepage. Makes sense!
For point 2), I meant using the URL Removal tool to de-index the whole site, but this is no longer needed if I apply the above suggestion.
Thanks a bunch!
-
RE: Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago! We'll fix them up, thanks!
Can you give an example of "non-google sanctioned duplicate content"?
The robots.txt file seems OK even though it's a bit verbose. I'll ask to shrink it a bit. (By the way, I was curious about PhysVisible's robots.txt, but it looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
RE: Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
Thanks, Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..." ? Should any URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of those?
-
Disallow the old site in robots.txt and map every URL with a 301, to help our users get to the right page while Googlebot won't follow those redirects (to be tested, but seems logical), and/or...
-
Delete the whole old domain in GWT.
Thanks for your time!
-
-
RE: Https slower site Versus Non https faster site??
While I do not know how Google treats HTTPS in regards to site speed, WebPageTest.org uses the following to score a site's Time To First Byte:
"The target time is the time needed for the DNS, socket and SSL negotiations + 100ms. A single letter grade will be deducted for every 100ms beyond the target."
In other words, WebPageTest does not penalize a site for being secure.
Edit: For redirections, 301 everything and change previously added redirects to point to HTTPS so you don't end up with chained redirects.
As far as GWT is concerned, I would add both sites (http://site.com and https://site.com) and use the Change of Address feature on the HTTP one to the HTTPS one.
Hope this helps.
-
RE: Wrapping my head around an e-commerce anchor filter issue, need help
If you are worried that Google follows filter links, you can rel=nofollow those links and include a rel=canonical tag. See this article on faceted navigation: http://googlewebmastercentral.blogspot.ca/2014/02/faceted-navigation-best-and-5-of-worst.html
My understanding is that http://makeupaddict.me/6-skin-care#/price-391-1217 will be seen and interpreted as http://makeupaddict.me/6-skin-care, since the fragment is ignored. Filtered pages should be seen and interpreted as their unfiltered counterparts.
That being said, I would compare how both pages look in Webmaster Tools using the Fetch as Googlebot tool. This will tell you how Google sees the filtered page.
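For example, the markup could look like this (illustrative: the canonical points at the unfiltered category page, and the filter link carries a nofollow):

```html
<!-- In the <head> of the category page, whatever filters are applied -->
<link rel="canonical" href="http://makeupaddict.me/6-skin-care" />

<!-- A filter link in the navigation -->
<a href="#/price-391-1217" rel="nofollow">$391 - $1217</a>
```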
Ben
-
Shoemaker with ugly shoes : Agency site performing badly, what's our best bet?
Hi everyone,
We're a web agency and our site www.axialdev.com is not performing well. We have very little traffic from relevant keywords.
Local competitors with worse On-Page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" in Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links on poor sites and to use a single nofollow link on our best sites.
- Since this is on-going and we know we won't be able to remove everything, our link profile sucks (OSE).
- We have a lot of sites on our C-Block, some of poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link D-Tox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do.
- Keep donating to AdWords?
Any help greatly appreciated!
Thanks!
-
RE: Is it better to have an hreflang go to the home page in a different language if there's no corresponding page
If an English page doesn't have an equivalent Spanish page, it's best not to include hreflangs.
Hreflang annotations need to be bi-directional, meaning if site.com/en/a.html hreflangs to site.com/es/a.html, then site.com/es/a.html also needs to hreflang back to site.com/en/a.html. This can't be done if you point to the homepage instead.
I'd make a note to add them later when pages are translated.
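To illustrate the bi-directional requirement (using the hypothetical URLs from above):

```html
<!-- On site.com/en/a.html -->
<link rel="alternate" hreflang="en" href="http://site.com/en/a.html" />
<link rel="alternate" hreflang="es" href="http://site.com/es/a.html" />

<!-- On site.com/es/a.html: the same pair, pointing back -->
<link rel="alternate" hreflang="en" href="http://site.com/en/a.html" />
<link rel="alternate" hreflang="es" href="http://site.com/es/a.html" />
```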
-
RE: Webmaster Tools Not Indexing New Pages
Well, I do see the page in SERPs using site:http://orangeoctop.us/against-real-madrid-fifa-15/ (2 days old, and it was published 3 days ago), so all must be OK now. In any case:
-
You can try the Fetch as Googlebot to check if there's anything weird happening, but all looks normal from a quick glance.
-
Submitting a sitemap could speed indexation.
-
You blog ~3 times a month, so if this frequency is consistent over time, Googlebot may learn it doesn't have to crawl your site daily. Check the crawl sections in GWT for insight.
-
Screamingfrog can reach your blog post, so other crawlers shouldn't have a problem.
Hope this helps!
-
-
JavaScript IP-based redirection, best approach?
Hi everyone,
What are the best practices for implementing JavaScript redirections like on http://www.nike.com/ to send visitors to the right country section? I see it uses cookies and sessions to store the country and language, but what about search engines? Are they redirected via JS? Is there a risk that Google can't crawl everything?
We had IP-based, server-side redirections on a few country-specific websites (purehazelwood.com, purnoisetier.fr, purnoisetier.com) that we had to remove because Googlebot was always redirected to the US site and couldn't access the other sites. We instead added pop-ups when a visitor lands on the "wrong" site, but we'd like the redirection to be automatic.
Is the JavaScript approach the best? Anything else we need to think about?
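To make the question concrete, here is roughly the pattern we have in mind (the country-to-site map and function name below are illustrative, not our actual setup):

```javascript
// Illustrative country-to-site map; the real mapping is an assumption here.
const COUNTRY_SITES = {
  US: "http://purehazelwood.com",
  CA: "http://purehazelwood.com",
  FR: "http://purnoisetier.fr",
};
const DEFAULT_SITE = "http://purnoisetier.com";

// Decide which country site a visitor should see.
function pickSite(countryCode) {
  return COUNTRY_SITES[countryCode] || DEFAULT_SITE;
}

// In the page itself, the redirect would only fire for first-time visitors:
// store the choice in a cookie so returning visitors (and visitors who
// explicitly switched sites) are not bounced again, e.g.:
//
//   if (!document.cookie.includes("site_choice=")) {
//     document.cookie = "site_choice=" + countryCode + "; path=/";
//     window.location.href = pickSite(countryCode);
//   }
```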
Thanks for your time!
-
RE: AJAX and High Number Of URLS Indexed
You could try using a robots.txt wildcard to stop Google from visiting those URLs:
Disallow: /*Tire?
or
Disallow: /*?0
It would help to see a full URL example (and matching categories).
See "URL matching based on path values" in https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
-
RE: Auto Complete misspells our brand. Can we do anything about it?
That is one bad predictive model, then. Thanks for the answer.
-
Auto Complete misspells our brand. Can we do anything about it?
Howdy!
Let's say our French client's name is "Something Réseau".
When we type in "Something" in Google, we're suggested "Something Réso" which sounds the same but is incorrect.
In the past year, the correct name has sent over 21,000 branded keyword visits, whereas the incorrect one has sent just shy of 800. The brand is not new.
Anything we can do about it? Thanks!
-
RE: I have a mobile version and a standard version of my website. I'd like to show users some pages on the non-mobile site but keep googlebot mobile out. Is that ok?
You're on the right track as far as showing users the PC version of pages without a mobile equivalent instead of 404s: "If your content is not available in a smartphone-friendly format, serve the desktop page instead." (http://googlewebmastercentral.blogspot.ca/2013/06/changes-in-rankings-of-smartphone_11.html)
However, you should let Googlebot (mobile or not) crawl your site the same way your users would, which means you shouldn't block googlebot-mobile. It is totally fine for a desktop page to be ranked in the mobile index. It's not ideal (see the article above), but it's better than not being there at all.
Think about it this way: if you're OK with your mobile users seeing non-mobile-optimised pages, then it doesn't matter whether they come from Google or not. The page is still relevant to them.
As an extra, you could consider making some of the non-optimised pages responsive to improve the user experience a bit. Sometimes a few tweaks here and there can make a difference.
-
RE: How do I geo-target continents & avoid duplicate content?
Moz most definitely needs a "give a beer" feature!! Thanks for the in-depth response. We'll also work on building "local" links as you suggest.
We've since changed the structure of the site to :
USA/Canada: www.site.com
Europe EN: www.site.com/en_gb/
Europe FR: www.site.com/fr_fr/
Canada FR: www.site.com/fr/
That way we can use hreflang and avoid duplicate content. In your experience, will Google serve www.site.com/fr_fr/ instead of www.site.com/fr/ to Belgium and Switzerland? Will the UK and Ireland see www.site.com or www.site.com/en_gb/?
Thanks a lot for the answer!
-
RE: What is the point of XML site maps?
Your sitemap.xml will help googlebot crawl deep pages, but it serves other purposes such as:
-
helping Google identify canonical pages: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139066#3
-
creating sitemaps for video, images, etc.: "you can also use Sitemaps to provide Google with metadata about specific types of content on your site, including video, images, mobile, and News. For example, a video Sitemap entry can specify the running time, category, and family-friendly status of a video; an image Sitemap entry can provide information about an image’s subject matter, type, and license." http://support.google.com/webmasters/bin/answer.py?hl=en&hlrm=fr&answer=156184
-
you can specify alternate content, such as the URL of a translated page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
-
and more.
Sometimes working with a sitemap is less risky and easier to maintain, especially when your CMS is limiting. The 3rd point is a good example. You may also prefer the centralized approach from a personal point of view.
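As an example of the third point, a sitemap entry can carry alternate-language URLs alongside the canonical one (placeholder URLs, following Google's hreflang-in-sitemaps format):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/english/</loc>
    <xhtml:link rel="alternate" hreflang="fr"
                href="http://www.example.com/french/" />
  </url>
</urlset>
```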
There are good resources in Google's webmaster documentation; check them out.
Hope this helps!
-
-
How do I geo-target continents & avoid duplicate content?
Hi everyone,
We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
Link hreflang and the GWT geo-targeting option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices.
What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent.
Thanks!
-
RE: Using H3-4 tags in the footer or sidebars: good or not?
Hi Daniel,
Any reason you would be less inclined to use heading tags in the footer vs sidebars?
Thanks
-
Using H3-4 tags in the footer or sidebars: good or not?
Howdy SEOmoz fans!
Is it considered a good / bad / neutral practice to include H tags in the footer, as a means of grouping a few links?
Take http://www.seomoz.org/ for instance:
- Voted Best SEO Tool 2010! = H2
- Looking for SEO consulting? = H3
- Product and Tools = H3
- Company = H3
- etc.
I often see the same principle applied to sidebars.
I feel like because they don't contribute to the actual content structure, and because they are repeated from page to page, we should avoid them, but I have nothing to back my intuition. Perhaps they are helpful for usability (screen readers) and add thin value (i.e. category names that carry more weight than if they weren't headings).
What do you think?
Thanks for your time.