Google considers this to be spam. Sometimes pages get away with doing this, but generally you're going to eventually get a manual action reported in Search Console.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Posts made by MichaelC-15022
-
RE: Miriam's 7 Local SEO Predictions for 2019
I think most of the community is currently Christmas shopping online...and making decisions based on fake reviews :-p.
-
RE: Miriam's 7 Local SEO Predictions for 2019
Great predictions, Miriam!
I'll add one more...maybe it's more of a wish than a prediction...that Google will make some sort of serious strides towards cracking down on fake reviews (both positive and negative). Hopefully not as over-the-top as Yelp's approach (which throws a lot of babies out with the bathwater!) though.
-
RE: Client wants to rebrand but insists on keeping their old website live as well...
I'll second Miriam's points, above. There's substantial risk here if both sites are going to be visible to Google.
I'd block the old site in robots.txt permanently. I'd never redirect the old site to the new, even if cleanup had been done. From the penalty recovery work I've done, it sure feels like Google keeps some sort of permanent flag on your site, even after you've done the cleanup. New, good links don't seem to have as much effect as you'd expect.
For the new site, spend the $$ and do some PR/outreach and some solid, strong links in addition to the core directory links you get via MozLocal. Do some community service work that gets a press mention; offer a scholarship to dentistry students from a specific school, so that the school will link to your scholarship page. A few really good links from newspaper stories will work wonders for getting the new site to rank, both in the 3-pack and in regular organic.
-
RE: Could a JS script that scrolls automatically through pages make some content "hidden"?
Depending on how you trigger the scroll, Google might render the page unscrolled or scrolled. Usually, if it's done in the onload() function via JavaScript, Google will execute that script and render the page as it is after the script runs. I've seen cases, though, where Google did NOT execute jQuery's document ready function when rendering the page.
Test in Google Search Console, using Fetch and Render as Googlebot.
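To make the distinction concrete, here's a minimal sketch of an onload-triggered scroll (the element IDs and content are my own made-up placeholders) - this is the variety that, in my experience, Googlebot has usually executed:

```html
<!-- Hypothetical sketch: auto-scroll wired into the onload handler -->
<body onload="document.getElementById('section-2').scrollIntoView();">
  <div id="section-1">First screenful of content...</div>
  <div id="section-2">Content the page scrolls to automatically...</div>
</body>
```

Whether the jQuery equivalent, $(document).ready(...), gets executed for rendering is exactly the kind of thing the Fetch and Render test will show you.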
-
RE: Location pages for Two location business
Hi Justin,
Don't sweat having the NAP of both locations on multiple pages if you don't mark those up with schema.org. FYI, multiple schema.org objects on a page is perfectly normal, even of the same type.
Be sure you have dedicated pages for each location, and on THOSE pages, mark the NAP up with schema. Then, in your Google My Business pages, you want to link to the specific location page that corresponds to the GMB page, NOT to your home page.
You can link back to the GMB page from the location-specific page on your website, or from all pages (e.g. in the footer).
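For reference, the NAP markup on ONE dedicated location page might look something like this (the business name, address, and URLs below are made-up placeholders):

```html
<!-- Hypothetical example: schema.org NAP markup for a single location page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co. - Downtown",
  "telephone": "+1-555-010-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "url": "https://www.example.com/locations/downtown/"
}
</script>
```

The matching Google My Business listing would then link to /locations/downtown/, not to the home page.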
-
RE: Is Pagination & thin text issue affecting our traffic?
It looks fine to me. You're using rel next/prev correctly, you've got plenty of text on the page. You're correctly setting rel canonical to the numbered page. All looks good to me.
-
RE: Should I disavow local citation page links?
I wouldn't sweat it. There are a jillion 3rd-tier business listing directories out there that are pulling that sort of data from the major directories. Yes, it's an issue if ALL you have is super weak links, but you'll need to be doing outreach for link-building anyway so that should not be a big deal.
I'd only disavow links that are actual spam. Not weak but legitimate links.
-
RE: Query results being indexed and providing no value to real estate website - best course of action?
Ideally, you'd set the meta robots in that page to noindex,follow. This will allow link juice to flow from all of those pages to the pages in your main navigation as well as removing them from the index.
If you cannot modify the &lt;head&gt; section of those pages, then, at a minimum, you could tell Webmaster Tools to ignore the pre and start parameters (specify that each parameter merely sorts the data on the page). Then you'd end up with just 1 page indexed per city, which is probably a lot better than where you are now.
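The tag itself is a one-liner in the head of each query-result page:

```html
<!-- Keeps the page out of the index, but lets link juice flow onward -->
<meta name="robots" content="noindex,follow">
```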
-
RE: What is your opinion in the use of jquery for a continuous scroll type of page layout?
Google is NOT going to see the content that's rendered by scrolling. In general, more is better in terms of content on a single page (provided it's not crap of course). See this article from Search Engine Land.
For those same reasons, having it on separate pages isn't as good an idea. If you think about how RankBrain is supposed to work, Google is going to be looking for terms on the page that commonly co-occur with the page's primary target search term on other pages on the web about that topic. So, by farming subsections of content out to other pages, you're shooting yourself in the foot, as Google is only going to give you brownie points for covering the subtopics in the very first page.
A better way to do this:
- put all the content on one page
- in the onload() or the jQuery document ready function, hide all but the first page's worth of content
- now, you can react to a scroll by calling JavaScript functions to hide the currently shown content and show the next page's worth...all on the same URL
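The steps above could be sketched in plain JavaScript roughly like this (the class name and the bottom-of-page trigger are my own invention; treat it as a sketch, not a drop-in implementation):

```html
<!-- Hypothetical sketch: all content present in the HTML, shown one "page" at a time -->
<div class="chunk">First page's worth of content...</div>
<div class="chunk">Second page's worth of content...</div>
<div class="chunk">Third page's worth of content...</div>
<script>
  var chunks = document.querySelectorAll('.chunk');
  var current = 0;
  // Hide everything but the first chunk only AFTER load, so the full content is in the HTML Google fetches
  window.onload = function () {
    for (var i = 1; i < chunks.length; i++) { chunks[i].style.display = 'none'; }
  };
  // On reaching the bottom, swap the visible chunk for the next one...same URL throughout
  window.onscroll = function () {
    if (window.innerHeight + window.scrollY >= document.body.offsetHeight
        && current < chunks.length - 1) {
      chunks[current].style.display = 'none';
      chunks[++current].style.display = 'block';
    }
  };
</script>
```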
-
RE: One landing page or many?
It seems that both people and Google like bigger pages better. See this study that found the average number of words on the page for pages in the top 10 results for something like 20,000 keywords was over 2000 words per page!
This article from SEL is also worth a read, and talks more about conversions etc.
And yes, I think the expand/contract approach is fine. Another good option is to divide the page into tabs (but have all the content present in the HTML), and then only show the content for the currently selected tab. Be sure, however, that all of the content is technically visible (i.e., not styled display:none) when the page initially loads. You can then use something like the jQuery document ready function to walk the tabs and hide all but the first one once the page is done loading.
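Roughly like this, assuming jQuery is loaded and using a made-up class name:

```html
<!-- Hypothetical sketch: tabbed content, every panel visible in the raw HTML -->
<div class="tab-panel">Tab 1 content...</div>
<div class="tab-panel">Tab 2 content...</div>
<div class="tab-panel">Tab 3 content...</div>
<script>
  // Runs after the DOM is ready; Google still sees every panel in the page source
  $(document).ready(function () {
    $('.tab-panel').not(':first').hide();
  });
</script>
```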
-
RE: One landing page or many?
Really, there has been a fairly radical change in how Google measures relevance of a page against a given keyword. A year or more ago, you'd have been better off making separate landing pages for each of those terms, putting the target term in the page title, H1 heading, body text, ALT text on an image, etc. etc.
Whether it's the new RankBrain piece of the algo or something else--it seems that Google is no longer as laser-focused on the page title having the EXACT words in it that were in the search term. Google appears to be able to identify the topic that a page is about by looking at the words on the page and how those words co-occur on other pages on the web.
As an example, my travel site has a page on it that I very carefully tuned for the term "best time of year to visit tahiti". So that's the page title, H1 heading, etc. etc....all the usual stuff. That page now ranks #3 for "tahiti weather", which is SUPER competitive, despite not having "weather" in the page title. I think it's only on the page maybe once, in fact. But, the page content talks about storms, precipitation, temperature, seasons, etc. etc. So, even though I'm telling Google that the page is about "the best time of year to visit Tahiti", Google is able to look at all that content and understand that really, it's about weather in Tahiti.
Long-winded story, I know. But I am indeed going somewhere with this...
I'd recommend having a single page targeted at "metal doors", then work all of the other terms into the page content, using subsections and H2's as Attain Design has suggested above.
I'd go a step further, though. Do a search for "metal doors", and look at the top 20 or 30 pages in the results. Look at the subtopics those pages discuss. Are they talking about locking mechanisms? Corrosion resistance? Insulation R-values? You're looking for other aspects of the core topic that you can add to your page to make it a more thorough discussion of the topic.
The theory I've seen as to how Google is doing this relevance is this: they're looking at a set of pages (maybe the top 100?) that they currently rank well for a given topic, and looking at the fairly rare OTHER terms that are showing up on at least some of those 100 pages. As an example, let's say a given term occurs on 90 of those 100 pages--that's a clue that if a page is supposed to be about topic X, and it does NOT have that term on it, it's probably a pretty poor page for that topic. Now, let's say we're looking at a term that occurs on 15 out of those 100 pages--that's probably a subtopic term that only the best pages...the most thorough pages on that topic...will have. If the term occurs on just 1 or 2 of those pages--well, that's probably an anomaly.
-
RE: Key Word in URL - To Include or Exclude?
Be careful that you don't end up with multiple URLs for the same page...if you do want to go that way, then be sure to set a rel=canonical from one to the other.
I don't know about a click-through advantage. You might say that the brand stands out more and is more readable at the end of the URL, actually.
-
RE: Key Word in URL - To Include or Exclude?
I'd agree with Aaron's comments on click through rate. I'd add that I'm still seeing a lot of boost in ranking from having the keywords in the URL itself, so I'd keep "shoes" in the page URLs.
-
RE: Google Rich Snippets in E-commerce Category Pages
I generally recommend putting basic Product markup (name, price, maybe image, URL pointing to the single product page) at that level. The idea here is to let Google understand that that page contains a big list of products that fit the category as seen in the page title.
DO NOT put reviews at this level--I saw something from Google recently that says they consider that to be a spammy attempt to get ratings snippets in the results for that page. Put the reviews only at the single product page level.
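Per product in the category listing, that basic markup could look something like this (name, price, image, and URL are placeholders) - note there's deliberately NO review or rating markup at this level:

```html
<!-- Hypothetical example: minimal Product markup for ONE item in a category listing -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget, Blue",
  "image": "https://www.example.com/images/widget-blue.jpg",
  "url": "https://www.example.com/products/widget-blue/",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```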
-
RE: How can I optimize pages in an index stack
Hello Rod,
Can you explain what you mean by an "index stack"? I haven't seen that term used before.
-
RE: Site went down and traffic hasn't recovered
Is that site still down? Typically when I've seen sites go down, unless it's for a long time, Google doesn't seem to drop it from the index. I had a client site down all day Saturday and it continued to rank well.
And I don't see a reason why that would affect the other sites, unless a huge percentage of their inbound links were from the site that was down--but even then, it would have to be down weeks, at least.
I'm inclined to think that the site outage is a red herring, and that there's something else in common between the sites that's causing an issue. Have you done a fetch-and-render as Googlebot for each of the sites in Search Console? Maybe something is blocked by robots.txt in all the sites that's preventing rendering, and Google is seeing very little content above the fold? <-- bit of a wild guess there...but that's all I've got!
-
RE: Are there any negative side effects of having millions of URLs on your site?
I'll echo Robert's concern about duplicate content. If those facet combinations are creating many pages with very similar content, that could be an issue for you.
If, let's say, there are 100 facet combinations that create essentially the same basic page content, then consider taking facet elements that do NOT substantially change the page content, and use rel=canonical to tell Google that those are all really the same page. For instance, let's say one of the facets is packaging size, and product X comes in boxes of 1, 10, 100, or 500 units. Let's say another facet is color, and it comes in blue, green, or red. Let's say the URLs for these look like this:
www.mysite.com/product.php?pid=12345&color=blue&pkgsize=1
www.mysite.com/product.php?pid=12345&color=green&pkgsize=10
www.mysite.com/product.php?pid=12345&color=red&pkgsize=100
You would want to set the rel=canonical on all of these to:
www.mysite.com/product.php?pid=12345
Be sure that your XML sitemap, your on-page meta robots, and your rel=canonicals are all in agreement. In other words, if a page has meta robots "noindex,follow", it should NOT show up in your XML sitemap. If the pages above have their rel=canonicals set as described, then your sitemap should contain www.mysite.com/product.php?pid=12345 and NONE of the three example URLs with the color and pkgsize parameters above.
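Concretely, each of the three color/pkgsize URLs above would carry the same canonical tag in its head:

```html
<!-- Identical on all faceted variants of product 12345 -->
<link rel="canonical" href="http://www.mysite.com/product.php?pid=12345">
```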
-
RE: Multiple Blogs with Google Blogger
Sure thing. Presuming your main site's blog is in WordPress, there's this handy-dandy importer:
https://wordpress.org/plugins/blogger-importer/
There are instructions in the Installation section on how to export your existing Blogspot posts into an XML format that the importer can then read.
-
RE: SSL providers? Any reviews?
I've been pretty happy with Comodo. Some of their interface is a bit confusing, but their support is good and their prices are fine. Enormous numbers of options (which leads to some of the confusion!) but with tech support help I've been able to navigate it all pretty well. I've bought a number of simple ones from them, as well as multi-domain certs. They've also been good at helping me move existing certs from one hosting company to another--with no extra charges.
-
RE: No Google Ranking..yet
Be sure to check in Webmaster Tools to make sure you don't have a manual penalty. I was doing a test on a temporary site to try and spot when Penguin did a data update, and I just copied text from other places to make "filler" content, and I got a manual penalty in just a couple of days because of that :-). So it doesn't have to be links that get you in trouble.
-
RE: What's with the redirects?
Comment out both. The RewriteCond is the test condition that, if satisfied, causes the RewriteRule on the following line to be executed.
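For illustration, a typical pair in .htaccess looks like this (the domain is a placeholder) - the RewriteRule fires only when the RewriteCond directly above it matches:

```apacheconf
# The condition: request arrived on the non-www hostname...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...the rule executed when that condition matches: 301 to the www version
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```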
-
RE: How long should the Image Alt Text be for SEO?
There isn't really any limit, like there is for page titles, meta descriptions, etc. Typically you'll want the ALT text to explain what's in the image--the original purpose was to show the user what the image was before it was downloaded, and also for vision impaired folks, the screen readers would read to them what was in the image by reading the ALT text.
If you're looking for the image to reinforce the relevance of the page for the page's target topic, then make sure that the topic term is in the ALT text, usually as part of a long phrase or sentence. If you're looking for the image to rank well in Google image search, then I'd keep the ALT text to just what the target term is (and of course make sure the page title reflects that term as well).
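As a hypothetical example (the page, image, and target term here are invented), the two styles would look like:

```html
<!-- Reinforcing page relevance: target term inside a descriptive phrase -->
<img src="storm-season.jpg"
     alt="Dark storm clouds rolling in over the lagoon during Tahiti's wet-season weather">

<!-- Aiming at image search: ALT text is just the target term -->
<img src="storm-season.jpg" alt="Tahiti weather">
```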
-
RE: Responsive image plugins and seo / crawlability
So you've got a big performance issue if you put all 20 images in img src= notation, as the browser is going to try to download those, of course.
What I've done with my travel website with big hotel images (I'll have as many as 75 or more sometimes) is specify the 1st image in img src= notation, then use Javascript to update the src attribute on click or timer.
The downside of this: good luck getting Google to index the other 19 images, even if you put them in an image sitemap. In my experience, Google didn't want to index anything it couldn't verify was really on the page.
You can use @media queries to point at different images for different resolutions, but only if they're background images....which most likely means they won't be indexed, and they won't be seen as content by Panda.
What I've ended up doing is a bit of a hack; I use client-side Javascript to detect the screen resolution, then I can select different sized images based on that. I use Joe Lencioni's SLIR library to take any large image and automatically create and cache the various smaller sizes I need.
-
RE: Google Indexing of Images
I would definitely update that sitemap. If your sitemap is telling Google one thing, and the pages themselves are contradicting the sitemap, AND it's happening thousands of times--that's a negative quality signal to Google, and could affect all sorts of things, from crawl budget to indexation to rankings.
ALT tags are worth fixing as well. That's really the #1 clue Google has to what the images are about. (Other clues: the image filename, and the page title, if it's the main image on the page). Here, I'm presuming that the images are ones you hope to have show up in image search results (otherwise why would you bother creating an image sitemap?)...in which case, you really, REALLY need to put the ALT text on them.
-
RE: Google Indexing of Images
I've not seen instances where Google would index an image that's on a page that's marked noindex.
Be sure that you have consistency between your sitemap and your noindex/index tags on the pages, i.e. don't include a page or image in your sitemap where the page itself (or containing page) indicates noindex.
If you look at how Webmaster Tools OOPS I guess I mean "Search Console" (will Google EVER let a product keep the same name forever???) shows indexation of images in an image sitemap, you'll notice they pair the image indexation count with the web page indexation count. I take that as an indication that they're not interested in indexing images on noindexed pages (which I have to say makes sense to me).
-
RE: Best Captcha Recommendations for Magento Site?
Captcha is one of the most hated web technologies out there. Personally, I'm a fan of the simplistic solutions that just involve a checkbox. See Mike Blumenthal's blog for an example (look for "confirm you are not a spammer").
-
RE: Do mobile and desktop sites that pull content from the same source count as duplicate content?
Be sure you follow the best practices outlined here for separate mobile sites. In short, you want the desktop pages to have a rel alternate tag pointing at the mobile equivalent, and the mobile pages having their rel canonical pointing at the desktop equivalents.
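Assuming example.com with an m. subdomain for mobile, the pair of annotations looks like this:

```html
<!-- On the desktop page (www.example.com/page.html) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page.html">

<!-- On the mobile page (m.example.com/page.html) -->
<link rel="canonical" href="http://www.example.com/page.html">
```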
-
RE: 2 menus Responsive website Seo question
In general, you should expect Google to look at your desktop pages when it's calculating PageRank, link juice flow, etc., NOT your mobile menu. So if there are links to your category pages in ONLY your mobile menu and not your desktop, then your category pages won't rank as well as they should.
-
RE: Trailing Slashes and SEO
So if you're talking about www.mysite.com/blog vs. www.mysite.com/blog/ I think I wouldn't worry about it too much. I'd definitely see if you could set a rel=canonical on that page to point to one or the other though.
But honestly, your blog homepage isn't generally going to be a search target anyway....it's the blog posts themselves (and possibly the category archives or tag archives) that will be search targets. Plus your site's homepage, of course, and other non-blog pages.
-
RE: How valid are these types of links?
Well, it's not going to pass link juice to www.domainlinked.com, that's for sure. And I bet you're right about the tracking.
-
RE: Is this okay with google if i can access my sub categories from two different path?
Hi Dev,
It's really going to depend on how much of the content is duplicated. From what I've seen, Google isn't very good at chunking pages up YET. They're good at spotting entire pages duplicated (e.g. press releases or articles syndicated across multiple sites), and pages on your site that have the majority of the content the same. But I don't think you're going to run into trouble with a page that has a number of sections, each of which is an entire page on its own.
Where you MIGHT run into trouble is with Panda and thin content. If the content you have for each of the manufacturers is very light, i.e. just a few sentences and an image or two, then those pages might be seen as thin content. While I don't think you have to hit the magic 2000 word mark on every page to avoid being seen as thin content, you certainly are going to want more than 100 words. And, if those manufacturer pages are important search targets for competitive terms--well, then, you probably WILL want those pages to contain somewhere near 2000 words each.
In THAT case, you'll probably want to change the content on the all-manufacturers page, and instead just put a short excerpt for each manufacturer there, along with some sort of "learn more" link to the single manufacturer page.
-
RE: How to add my company in google search search result in bangalore (INDIA)
I'm pretty sure that Moz Local only supports US locations at this time.
-
RE: Static looking URL - Best practices?
Really, I think people have gotten themselves all twisted up unnecessarily over dynamic URLs and hiding the fact that they're dynamic.
If you're dealing with a URL that really is dynamic, I'd stick with the ? & = notation that's pretty standard for this sort of thing. In my experience, Google treats ANY of those characters as word separators, and I'm not really seeing any downside in terms of ranking when meaningful English words are passed as traditional parameters.
I'd be careful with using a "+" sign if you go that route, as various conversions from text to URL-safe to HTML-encoded etc. will replace spaces with + signs...and if something is un-encoding that, you might end up with spaces there.
FYI where this all came from was URLs like this:
www.homes.com/showproperty.asp?pid=115235423&region=ABX&type=723
In THAT case, those numeric parameters (which tend to be database record identifiers) are NOT of use to Google in terms of relevance or ranking. But English-word parameters (category names, colors, and the like) ARE useful, as they may match some of the query terms.
-
RE: SEO Concerns From Moving Mobile M Dot site to Responsive Version?
Make the old m dot URLs 301 redirect to the responsive version (the new pages). That'll take care of users landing on the m dot pages until Google removes those from the index, and will transfer over any link juice the m dot pages have gathered up (although that should have already happened from rel=canonicals on your m dot pages pointing at the desktop versions...but, if you missed any...).
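If the m dot site is on Apache and its URL paths mirror the desktop paths, the redirect can be a simple blanket rule in the m dot site's .htaccess (hostnames are placeholders; adjust if your paths don't match one-to-one):

```apacheconf
# 301 every m-dot URL to the same path on the responsive site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^m\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```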
-
RE: To use new domain name or not?
While throwing away a well-aged domain might seem questionable, it sure sounds like the old name is causing you some other issues, including bounce rate issues from people who find you simply because of a partial domain name match against their queries.
I'd be inclined to take the hit on domain age and move to the new site. 301 redirect the old site's pages to corresponding pages on the new site. You can set up your old email address to forward to your new one, so people using the old will still get to you fine. I wouldn't use the old email address on the new site, as you'll get people ignoring your emails because it's not from a site they recognize.
-
RE: Can I Use Multiple rel="alternate" Tags on Multiple Domains With the Same Language?
Typically Google is expecting you to use rel=alternate to tell them about (a) mobile vs. desktop versions, or (b) different language versions. If it's neither a device difference, nor a language difference, then really all you should be doing is using rel=canonical from the less important/less original version to the more important/more original version.
-
RE: SEO Impact of High Volume Vertical and Horizontal Internal Linking
No, keep doing it the way you're doing it. That's perfectly good link juice flowing between those pages.
Breadcrumbs are a nice way to communicate the hierarchy to Google--not because they're breadcrumbs, but simply because of their nature: all pages at each level contribute link juice back up to each of their ancestor pages. A child page has the fewest internal links pointing to it; its parent has more; its grandparent even more; etc.
-
RE: YouTube vs. LimeLight - What are the SEO pros and cons of each platform for on-site video viewing?
Go over Phil's article carefully, especially the comparison bit at the end. There are times when YouTube makes more sense, but mostly it's either big brand awareness stuff, or things people would start by going to YouTube to find, e.g. rock videos, TV show excerpts, some viral content, etc.
-
RE: 'Mini' versions of our website for overseas markets. Does it matter?
Backlinks are going to matter much, much more than number of pages.
Don't use subdomains; they share almost no domain authority with the parent domain. AND, you aren't as likely to get a country-specific boost as you would with a country-specific TLD. Possibly you're just unclear on the difference; if so, here you go: subdomains are qualifiers to the left of the domain (e.g. www., blog., etc.) and TLDs are the right side of the domain (e.g. .com, .org, .co.uk, etc.).
E.g. use www.toaddiaries.ca instead of ca.toaddiaries.com.
If the content is really similar across the various countries, i.e. it's just translated, you should use rel=canonical (pointing to the country-specific page) and hreflang alternate (in ALL pages, pointing to all of the other versions of the page). See Maile's talk on this here.
Pay close attention to the distinctions between LANGUAGE and COUNTRY, e.g. spanish versions might exist for dozens of countries, and those differences matter.
-
RE: Using a Colo Load Balancer to serve content
They're right in that you do NOT want the content to be on a different subdomain--in most cases, Google doesn't share domain authority across subdomains.
You can do a reverse proxy to handle this--see Jeremy's writeup here.
Load-balancing is a fairly generic term. I'm really familiar only with BigIP F5 hardware load balancing and Microsoft's software-based load balancing, but it's possible that some load balancing solutions can handle things like the reverse proxy would.
-
RE: 301 Redirect Attribute-Based Dynamic URL to Renamed Attributes
Update all internal links to point to the new URLs. 301 redirects don't pass 100% of the link juice.
The best option is to create a 301 redirect rule(s) that handle all possible situations. If that's not practical, then just worry about the ones that are linked to by external websites. You can spot these by making the transition to the new URLs, then watching for 404 errors in Webmaster Tools, and looking for pages linked to by external sites.
If you're using IIS, you can actually do some pretty complex logic in your 404 handler and make it return a 301 in certain cases. Check this post I did a while back.
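The post above covers the IIS route; if you're on Apache instead, the same idea is one rewrite rule per renamed attribute. A hedged sketch, assuming a hypothetical attribute renamed from colour to color on a /products/ URL:

```apacheconf
# 301 the old attribute name to the new one, preserving the rest of the query string
RewriteCond %{QUERY_STRING} ^(.*)colour=([^&]+)(.*)$
RewriteRule ^products/$ /products/?%1color=%2%3 [R=301,L]
```

The %1-%3 backreferences come from the RewriteCond's captures, so whatever surrounds the renamed parameter carries over intact.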
-
RE: A/B Split Testing - Rankings Drop? Need an expert opinion...
How big is your site? Is the page being A/B tested a pretty strong page (PA, PR)?
Let's say you've got a relatively small site, with a few dozen pages, and the page in question is one of only a couple linked to from the main nav. So it's one of the stronger pages on the site. Blocking it in robots.txt means Google isn't going to continue to distribute link juice from that page to other pages on your site, so all of those other pages get slightly weaker.
In general, robots.txt isn't the place to block Google, as you throw away the outbound link juice from that page that's blocked. Instead, you'd want to do a meta robots noindex,follow on the page itself.
If you've got a big site, and this particular landing page isn't a major portion of the overall site in terms of PA, then you shouldn't have seen an effect like I've described.
-
RE: YouTube vs. LimeLight - What are the SEO pros and cons of each platform for on-site video viewing?
Hi Jake,
I'm not familiar with LimeLight, but Phil Nottingham did this great writeup on YouTube vs. hosting on other platforms.
One of the things you need to be concerned about is: will Panda recognize the embedded video as rich content on the page? iFramed solutions might not be....traditionally, Google has NOT treated iframed content as existing on the page (although I've seen a couple of examples with clients' sites where iframed-in content has caused the "wrapping" page to rank for content that's only in the iframe).
I'm a big fan of embedding using Wistia, using their video SEO embed type. It automatically creates not only your video sitemap, but also embeds schema.org/VideoObject markup on the page, so that Google absolutely can tell that there's a video embedded there, and what it's about etc. as well.
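The markup that sort of SEO-friendly embed adds looks roughly like this (the title, description, and URLs are made-up placeholders):

```html
<!-- Hypothetical example: VideoObject markup accompanying an embedded video -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How We Make Metal Doors",
  "description": "A short tour of our fabrication process.",
  "thumbnailUrl": "https://www.example.com/video/thumb.jpg",
  "contentUrl": "https://www.example.com/video/metal-doors.mp4",
  "uploadDate": "2016-03-01"
}
</script>
```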
Michael
-
RE: Hreflang and paginated page
I found no examples, sorry...
I don't understand your comment about rel=canonical. There should be ONLY ONE rel=canonical, and it should reference its own page, EXCEPT in the rare case I outlined above where the content on two different country pages is essentially identical.
-
RE: Homepage ranks worse than subpages
I would agree IF the video is something that people would search for, e.g. big branding content, humor, viral content, music or celebrity related.
Otherwise you're just generating traffic for YouTube. Click-throughs from visitors on your company's YouTube page back to your website will run around 1%.
And the YouTube page for your video is likely to outrank your own page that embeds that video.
Phil Nottingham from Distilled is the master of the universe when it comes to video SEO...see his writeup here where he talks about the pros and cons.
-
RE: Cross Domain Rel Canonical tags vs. Rel Canonical Tags for internal webpages
Rel=canonical pointing to a different domain is essentially telling Google "here's the original copy of this article".
That's fine if you choose to reprint just the occasional bit of content from somewhere else.
It's also a fine strategy to use in a white-label system, where you might have the same content published across a number of sites, all branded differently.
But you want to use this sparingly. If you've got a site with 1000 pages, and 750 of those pages are rel=canonicalled back to another domain, essentially you're telling Google that most of your website is just republished stuff that somebody else wrote. That's not going to be a good signal for Google of the likely quality of the site in general.
If you're in a situation where you really do need to publish a lot of pages on multiple sites, and all of the sites do need to be found in search for SOME terms, then for those duplicated pages, I'd noindex them on the "copy" sites, so that in the example above, Google would only see and index 250 pages, all of which would be original content.
-
RE: Homepage ranks worse than subpages
I'd agree with Monica. Panda's above-the-fold algo is absolutely going to slay your home page. You've got only 1 sentence of content above the fold. Your images in the slider are all clickable (except the Lego image), and besides, they don't seem to be foreground images (except the Lego image)...Panda is likely going to see them as decoration.
Your video is probably not seen as video. I see no schema.org/VideoObject markup, and it doesn't seem to be one of the standard embeds (YouTube, Vimeo, Wistia) that Panda can likely recognize in the HTML.
Everything else on the page is clickable, which (this is my theory only) is likely to cause Panda to see it as navigation....not content.
So....I'd recommend:
- chucking your current slider; choose a different plugin (or write it from scratch, it's only a couple dozen lines of JavaScript and a few lines of plain old boring HTML), so that the images are seen as content AND they're not clickable, except the next/prev slide buttons
- redesign your layout to pull some of the text below up above the fold, including moving the Archive & Communicate section and its siblings above the giant buttons
- use Wistia to embed the video, and follow their instructions re: creation of a video sitemap
I'd also recommend going into Google Webmaster Tools, and doing a Fetch & Render on your home page, to make sure that Google is able to see your page laid out the way you expect.
-
RE: Hreflang and paginated page
Separate the language markup issue from the pagination issue, and treat each of the paginated pages just like any other page on the site.
You should have an hreflang statement for EVERY language page you support for each page in the pagination sequence, including the current page. So, for example, if we're looking at Italian page 17 of your Purple Widgets category, it should have an hreflang for the Italian page 17, as well as for the English page 17, French page 17, etc.
Rel=next and rel=previous should refer to the page from the same language as the page you're in, i.e. on Italian page 17, rel=prev should point to Italian page 16, and rel=next should point to Italian page 18.
I'm presuming, of course, that the content in the paginated pages is roughly equivalent, i.e. if it's a set of pages of purple widgets that you sort them the same way on the Italian version as the French, etc. But really, if you didn't....I'd still probably do it the same way.
Don't forget to set the rel=canonicals as well. Unless you're looking at two pages with the same language and content but targeting different countries (e.g. Portugal and Brazil, with no pricing info on the pages...in that case, you might rel=canonical both the Portuguese and Brazilian pages to one of those), each page will rel=canonical to itself.
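Putting all three pieces together for the hypothetical Italian page 17 (URL structure and language set are placeholders), the head of that page would carry:

```html
<!-- Hypothetical: head of the Italian page 17 of a paginated category -->
<link rel="canonical" href="http://www.example.com/it/purple-widgets/page/17/">
<link rel="prev" href="http://www.example.com/it/purple-widgets/page/16/">
<link rel="next" href="http://www.example.com/it/purple-widgets/page/18/">
<link rel="alternate" hreflang="it" href="http://www.example.com/it/purple-widgets/page/17/">
<link rel="alternate" hreflang="en" href="http://www.example.com/en/purple-widgets/page/17/">
<link rel="alternate" hreflang="fr" href="http://www.example.com/fr/purple-widgets/page/17/">
```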