Static homepage content and JavaScript - is this technique obsolete?
-
Hi
Apologies beforehand for any minor forum transgressions - this is my first post.
I'm redesigning my blog and I have a question re static homepage content.
It used to be common practice in the online gambling sector (and possibly others) to have a block of 'SEO copy' at the footer of the homepage.
To 'trick Google' into thinking it sat directly underneath the header, web devs would use JavaScript so that the div containing the SEO copy loaded first in the HTML source, even though it displayed at the bottom of the page.
The logic was that this allowed for the prime real estate of the page to be used for conversion and sales, while still having a block of relevant copy to tell the spiders what the page was about, and to provide deep links into the site.
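As I understand the pattern, a minimal sketch looks something like this (element ids and copy are made up for illustration, not taken from any real site):

```html
<!-- Sketch of the old pattern: the SEO copy sits directly after the
     header in the HTML source, so crawlers reading the raw markup
     encounter it first. -->
<body>
  <header>Logo, navigation, sign-up buttons...</header>
  <div id="seo-copy">
    <p>Keyword-rich copy about the site, with deep links to key pages.</p>
  </div>
  <div id="offers">Conversion and sales content fills the visible page...</div>
  <script>
    // A script then moves the block to the end of the page, so visitors
    // see it at the footer while the source order stays "copy first".
    document.body.appendChild(document.getElementById('seo-copy'));
  </script>
</body>
```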
I attended a seminar just over a year ago at which some notable SEOs said that Google had probably worked this one out but it was impossible to tell. However, I've recently noticed that Everest Poker has what I think is the code commented out, and on PokerStars I can't find it at all (even in the includes).
I would be happy to post the Everest code but, while I've read the etiquette, I'm not 100% sure whether this is allowed.
So my question is... for the blog I'm redesigning, do I still need to follow this practice? I would prefer search engines saw some static intro text describing the site, rather than the blog posts, the excerpts of which will probably be canonicalized to the actual post pages to avoid duplication issues. But I would prefer this static content to appear below the fold.
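For reference, the canonical hint mentioned above is just a single link element in the head of the page carrying the excerpt, pointing search engines at the full post as the preferred version (the URL here is a made-up example):

```html
<!-- In the <head> of the homepage or archive page showing the excerpt. -->
<link rel="canonical" href="https://example.com/blog/my-post/" />
```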
What is current best practice here?
Alex
-
Thanks Edward
-
It would be possible to have the text at the beginning of the HTML document but then display it further down the page using CSS, not JavaScript.
I don't think there is a massive need to do something like this. In the past, Google may not have indexed all of the content from a page, especially if the document size was very large. This position trick would ensure that the important SEO-focused content got indexed. If you build your site properly - take into account the size and page load speed, make sure the code is clean, etc. - then there should be no need to move the content around.
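One way to do what's described here - text first in the source but rendered lower down - is CSS flexbox ordering. This is a sketch with hypothetical ids, not code from the posts:

```html
<style>
  /* Children are laid out by their 'order' value, not source order. */
  body        { display: flex; flex-direction: column; }
  #intro-copy { order: 3; }  /* first in the source, rendered last */
  #posts      { order: 1; }
  #sidebar    { order: 2; }
</style>
<body>
  <div id="intro-copy">Static intro text describing the site...</div>
  <div id="posts">Blog post excerpts...</div>
  <div id="sidebar">Navigation, archives...</div>
</body>
```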
-
Hi Vahe
Thanks for the response, and the article link - I'll take a look at that later.
However, I think you've misunderstood the situation. The content is not hidden - it's clearly visible and crawlable at the bottom of the page. It's simply placed in a div, and that div is loaded immediately after the header through the use of JavaScript.
I'm no JavaScript expert, but Everest Poker appears to have commented the function out, and PokerStars appears to have removed it altogether.
If that is, in fact, what they've done (and I'm not misreading the code, which is possible), then my question is: does this 'trick' of placing text lower in the page, but telling spiders to crawl it first, no longer work?
Hope that clears things up.
Alex
-
Hi Alex,
In my opinion, unless it is served as alternative content, any hidden content is unethical SEO.
Have a go at content stacking - http://www.dummies.com/how-to/content/move-up-your-web-page-content-for-better-search-en.html
Hope this helps,
Vahe
Related Questions
-
Would you recommend content within Javascript links?
We are an ecommerce site, and I have noticed sites like this - workplace-products.co.uk/premises/canteen-furniture.html - with hidden content (click on the details link under the canteen image). My question is: would this content be as good as content that is placed normally within the body of a website? Because the content I place on our pages is more for SE rankings than it is for visitors. Good to get your thoughts. Thank you, Jon
Intermediate & Advanced SEO | imrubbish
-
Linking to own homepage with keywords as link text
I recently discovered, that previous SEO work on a client's website apparently included setting links from subpages to the homepage using keywords as link text that the whole website should rank for. i.e. (fictional example) a subpage about chocolate would link to the homepage via "Visit the best sweet shop in Dallas and get a free sample." I am dubious about the influence this might have - anybody with any tests? I also think that it is quite weird when considering user friendliness - at least I would not expect such a link to take me to the homepage of the very site I was just on, probably browsing in a relevant page. So, what about such links: actually helpful, mostly don't matter or even potentially harmful? Looking forward to your opinions! Nico
Intermediate & Advanced SEO | netzkern_AG
-
Scraped content ranking above the original source content in Google.
I need insights on how “scraped” content (an exact copy-pasted version) ranks above the original content in Google. 4 original, in-depth articles published by my client (an online publisher) were republished by another company (which happens to be briefly mentioned in all four of those articles). We reckon the articles were re-published at least a day or two after the original articles were published (the exact gap is not known). We find that all four of the “copied” articles rank at the top of Google search results, whereas the original content, i.e. my client's website, does not show up even in the top 50 or 60 results. We have looked at numerous factors such as Domain Authority, Page Authority, in-bound links to both the original source as well as the URLs of the copied pages, social metrics, etc. All of the metrics, as shown by tools like Moz, are better for the source website than for the re-publisher. We have also compared results in different geographies to see if any geographical bias was affecting results, the reason being our client’s website is hosted in the UK and the ‘re-publisher’ is from another country, but we found the same results. We are also not aware of any manual actions taken against our client's website (at least based on messages in Search Console). Are there any other factors that can explain this serious anomaly, which seems to be a disincentive for somebody creating highly relevant original content? We recognize that our client has the option to submit a ‘Scraper Content’ form to Google, but we are less keen to go down that route and more keen to understand why this problem could arise in the first place. Please suggest.
Intermediate & Advanced SEO | ontarget-media
-
Removing duplicate content
Due to URL changes and parameters on our ecommerce sites, we have a massive amount of duplicate pages indexed by Google, sometimes up to 5 duplicate pages with different URLs.
1. We've instituted canonical tags site-wide.
2. We are using the parameters function in Webmaster Tools.
3. We are using 301 redirects on all of the obsolete URLs.
4. I have had many of the pages fetched so that Google can see and index the 301s and canonicals.
5. I created HTML sitemaps with the duplicate URLs, and had Google fetch and index the sitemap so that the dupes would get crawled and deindexed.
None of these seems to be terribly effective. Google is indexing pages with parameters in spite of the parameter (clicksource) being called out in GWT. Pages with obsolete URLs are indexed in spite of them having 301 redirects. Google also appears to be ignoring many of our canonical tags as well, despite the pages being identical. Any ideas on how to clean up the mess?
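For what it's worth, the canonical on a parameterized duplicate would look something like this (URLs are hypothetical); the hint only tends to be honored when the variants really are near-identical and the tag is consistent across all of them:

```html
<!-- Served unchanged on every URL variant of the same product page,
     e.g. /widget.html, /widget.html?clicksource=email -->
<link rel="canonical" href="https://www.example.com/widget.html" />
```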
Intermediate & Advanced SEO | AMHC
-
Duplicate content question
Hi there, I work for a Theater news site. We have an issue where our system creates a chunk of duplicate content in Google's eyes, and we're not sure how best to solve it. When an editor produces a video, it simultaneously 1) creates a page with its own static URL (e.g. http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html); and 2) displays said video on a public index page (http://www.theatermania.com/videos/). Since the content is very similar, Google sees them as duplicates. What should we do about this? We were thinking that one solution would be to dynamically canonicalize the index page to the static page whenever a new video is posted, but would Google frown on this? Alternatively, should we simply nofollow the index page? Lastly, are there any solutions we may have missed entirely?
Intermediate & Advanced SEO | TheaterMania
-
Duplicate Content Question
Hey Everyone, I have a question regarding duplicate content. If your site is penalized for duplicate content, is it just the pages with the content on it that are affected or is the whole site affected? Thanks 🙂
Intermediate & Advanced SEO | jhinchcliffe
-
Duplicate content in Webmaster tools, is this bad?
We launched a new site, and we did a 301 redirect to every page. I have over 5k duplicate meta tags and title tags. It shows the old page and the new page as having the same title tag and meta description. This isn't true: we changed the titles and meta descriptions, but it still shows up like that. What would cause that?
Intermediate & Advanced SEO | EcommerceSite
-
Duplicate content, website authority and affiliates
We've got a dilemma at the moment with the content we supply to an affiliate. We currently supply the affiliate with our product database which includes everything about a product including the price, title, description and images. The affiliate then lists the products on their website and provides a Commission Junction link back to our ecommerce store which tracks any purchases with the affiliate getting a commission based on any sales via a cookie. This has been very successful for us in terms of sales but we've noticed a significant dip over the past year in ranking whilst the affiliate has achieved a peak...all eyes are pointing towards the Panda update. Whenever I type one of our 'uniquely written' product descriptions into Google, the affiliate website appears higher than ours suggesting Google has ranked them the authority. My question is, without writing unique content for the affiliate and changing the commission junction link. What would be the best option to be recognised as the authority of the content which we wrote in the first place? It always appears on our website first but Google seems to position the affiliate higher than us in the SERPS after a few weeks. The commission junction link is written like this: http://www.anrdoezrs.net/click-1428744-10475505?sid=shopp&url=http://www.outdoormegastore.co.uk/vango-calisto-600xl-tent.html
Intermediate & Advanced SEO | gavinhoman