Off Screen Rendering & Other Tactics
-
Hi Folks,
We're currently trying to change our website search results to render as HTML on the first request and then switch over to AJAX when a user clicks on filters. But we came across an issue that diminishes the user experience, so we used the method below:
We have moved the search grid offscreen in the initial rendering because we execute a lot of JavaScript that modifies the DOM within the grid. Also, when a user performs a search from within the page, the URL hash is updated to identify the new search terms. Because the hash is not sent to the server, a user who has done a search and then refreshes would initially see incorrect search results, which would then be replaced by the correct ones.
For example, on an initial search a user reaches a URL akin to search.veer.com/chicken. When they perform a search from that page, the hash gets updated to search.veer.com/chicken#keyword=monkey. If the user refreshes the page, the server only receives the request for chicken and serves up the page with those results rendered on it. The JavaScript then checks the hash, determines that it needs to run a different search, and fires off an AJAX call to get the new results.
If we did not render the results offscreen, the user would (confusingly) see the results for chicken and be able to briefly interact with them until the AJAX call returns and the results are replaced with the correct monkey results. By rendering offscreen, the initial results are not visible, and the JavaScript can either move them onscreen immediately if there is no hash, or wait until the AJAX call returns, rebuild the grid, and then move it onscreen.
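The hash-check flow described above can be sketched roughly like this; `keywordFromHash`, `showGrid`, and `runAjaxSearch` are illustrative names for this sketch, not our actual production code:

```javascript
// Extract the keyword from a hash like "#keyword=monkey"; null if absent.
function keywordFromHash(hash) {
  var match = /^#keyword=(.+)$/.exec(hash);
  return match ? decodeURIComponent(match[1]) : null;
}

// On page load: with no hash, the server-rendered results already match
// the URL, so the offscreen grid can be moved onscreen immediately.
// With a hash, fetch that search's results first, rebuild the grid in
// the AJAX callback, and only then show it.
function initResults(hash, showGrid, runAjaxSearch) {
  var keyword = keywordFromHash(hash);
  if (keyword === null) {
    showGrid(); // server-rendered results are correct
  } else {
    runAjaxSearch(keyword, showGrid); // rebuild grid, then show it
  }
}
```

On the real page this would run on load with `window.location.hash`, and `showGrid` would reposition the grid from its offscreen coordinates.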
Now I know that setting text-indent to -9999px is a black-hat SEO tactic. But would it be the same in this case? We're only doing this to avoid a bad user experience. Please advise.
Also, we came across these two articles that may offer alternative options. The articles state that each tactic is SEO-friendly, but I'd like to run it by the community and see if you agree.
http://joshblog.net/2007/08/03/make-your-rich-internet-application-seo-friendly/
http://www.inqbation.com/tools-to-increase-accessibility-in-the-web/
Thank you for your help!
-
Hi Cyrus,
Thanks for your note. So, if the subsequent links are not indexed by Google (checked via the site: command), would that be a sure way to know that Google is not following those links?
Here's a sample:
http://search.veer.com/food is indexed, and in the cached version you can see the image links and the text links below them. However, in the text-only version the same links are not visible. I ran a site: command on the first image's URL - both http://marketplace.veer.com/stock-photo/Man-shopping-in-vegetable-department-FAN9018482?slot=01&pg=1&skeywords=search&stermids=1115 and http://marketplace.veer.com/stock-photo/Man-shopping-in-vegetable-department-FAN9018482 - to ensure that all possible URL variations were covered. It looks like neither link is indexed, which leads me to conclude that the links are not being followed.
Please let me know if you agree. Thanks!
-
This is a grey area, but if they list the links in the text version, they are probably following those links. And if they follow them, they will follow the links on the next page, as long as that page has sufficient PageRank to justify it. So the answer is... maybe.
-
Thanks for your input. I'm learning a lot!
It's interesting because I see the links in the cached version but not in the text-only version. Which version should I go by to ensure that the links are found and crawled?
Also, if they aren't passing PageRank, anchor-text weighting, and other good stuff, does that mean Google can't crawl through to the subsequent pages?
Thank you!
-
It's sort of a trick question, because we know that Google doesn't always list the non-HTML links it finds in the text-only version of its cache. In fact, it hardly ever does. I think the reason is that Google is still inconsistent about the types of JavaScript and other links it discovers, so it probably records these links for discovery purposes, but they most likely don't pass much value.
So if Lynx sees the links, it's likely Google does too and is simply not reporting them. That said, if the links aren't listed in Google's cache, it's also more likely those links aren't passing the same value as a regular link (metrics such as PageRank, anchor-text weighting, etc.).
So we end up in this grey area - it's likely Google is seeing the links, we just don't know how much they are using those links in their ranking algorithms.
-
Hi Cyrus,
I've downloaded the Lynx browser and compared its output to the Google cache text-only version of this page - http://search.veer.com/food. I got two different results: Lynx can definitely see the links in the search results, but the Google text-only version does not show them. Could it be that what Lynx sees differs from what Google sees? Or is the Google text-only version no longer a valid SEO tool/reference? Please let me know what you think.
Thank you!
-
Hi Folks,
We'd greatly appreciate some development expertise regarding the proposed method in the original post, in addition to the follow-up articles. Does anyone have feedback on these? Thank you for your help!
-Corbis
-
Hi Cyrus,
Thanks for your response; we really appreciate it. All of the issues you mentioned are being addressed in the near future. The reason the search results pages are not getting indexed is that they have "noindex" tags on them. This is being addressed as well.
What Lynx browser do you recommend?
Thank you!
-
First of all, let me start with the disclaimer that I'm not an expert in all these areas of technology (AJAX and offscreen rendering) but let me offer my 2 cents.
Google's quality guidelines state that no text should be hidden from the user, so in general I recommend against it. In reality, it's actually pretty common - a lot of webmasters justify CSS image replacement and other techniques with this line of argument - but I prefer to play it safe.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66353
That said, I'm more concerned about other areas of your site, specifically that search results are rendered on their own subdomain, while product images are served from another separate subdomain. Splitting the domain authority like this is most likely hurting the ability of your products to rank.
I couldn't find a cached version of your search results (probably because of the server's no-cache control), but it appears all the search results depend on JavaScript to render. At least, when I view the page with JavaScript turned off (using the MozBar), I get an empty page.
Also, how are the search results getting crawled? I see a few links to search results on the homepage and /product/images/ page. On the product pages themselves, I see links to search results like this:
http://search.veer.com/?termonly=1&searchTerm=2647 - which I actually think is a brilliant way to get those search results crawled, but I assume that URL is identical to http://search.veer.com/childhood? Also, the related keywords disappear when I turn off JavaScript?!
So, I might be worried that big G isn't crawling your search results pages much at all, which, if true, sorta makes the point moot.
Like I said, I'm no expert in this field, so take everything I say with a grain of salt. If your goal is to get better crawling/indexing of your product pages through Google crawling your search results, I would:
- Do an audit of your site with all JavaScript turned off
- Use a text browser such as Lynx to examine your site
- Check your server logs to see how often Google is visiting your search results
- See if there is another way to get better indexing of your product pages (related image links, etc.)
- Make sure link juice is flowing through your site appropriately via HTML text links
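The server-log check in the list above could be sketched like this in Node; the combined-log-style sample lines and the function name are assumptions for illustration, since real access-log formats vary:

```javascript
// Count requests for a given search path that came from Googlebot,
// given raw access-log lines (the user agent appears somewhere in each line).
function countGooglebotSearchHits(lines, searchPath) {
  return lines.filter(function (line) {
    return line.indexOf('Googlebot') !== -1 &&
           line.indexOf('GET ' + searchPath) !== -1;
  }).length;
}
```

A consistently low count across weeks of logs would support the worry that Google rarely visits the search results pages at all.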
Hope this helps! Best of luck with your SEO.
-
Hi Cesar,
Thanks for the input. Greatly appreciate it.
In this case, we're not trying to serve different content to Google vs. our users. Rather, we're trying to make sure that the links in our internal search results are indexed by Google without hindering the user experience. Google will see the same links as users do; it's just that users will get an AJAX rendering instead of a static HTML rendering. Would this still make it black-hat SEO?
Also, any comment on the two articles?
Thank you for your help!
-
"Now I know that setting text-indent to -9999px is a black-hat SEO tactic. But would it be the same in this case? We're only doing this to avoid a bad user experience. Please advise."
I would definitely stay away from this one, personally. It was, and still is, a huge black-hat practice. If Google sees something different than the user does, stay away from it.