Static homepage content and JavaScript - is this technique obsolete?
-
Hi
Apologies beforehand for any minor forum transgressions - this is my first post.
I'm redesigning my blog and I have a question regarding static homepage content.
It used to be common practice in the online gambling sector (and possibly others) to have a block of 'SEO copy' at the footer of the homepage.
To 'trick' Google into thinking it sat directly underneath the header, web devs would use JavaScript so that the div with the SEO copy loaded first in the HTML.
The logic was that this allowed for the prime real estate of the page to be used for conversion and sales, while still having a block of relevant copy to tell the spiders what the page was about, and to provide deep links into the site.
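For anyone who hasn't seen the pattern, here is a minimal sketch of how it typically worked (the IDs and markup are illustrative, not taken from any real site): the SEO copy sits high in the source so crawlers reading document order encounter it first, then a script moves it down for human visitors.

```html
<body>
  <div id="header">Site header</div>

  <!-- SEO copy placed immediately after the header in the raw HTML,
       so crawlers reading source order see it first -->
  <div id="seo-copy">
    <p>Static descriptive copy with deep links into the site...</p>
  </div>

  <div id="main-content">Conversion and sales content</div>
  <div id="footer">Footer</div>

  <script>
    // On load, move the SEO copy down so visitors see it at the
    // bottom of the page instead (illustrative only)
    var seo = document.getElementById('seo-copy');
    seo.parentNode.insertBefore(seo, document.getElementById('footer'));
  </script>
</body>
```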
I attended a seminar just over a year ago at which some notable SEOs said that Google had probably worked this one out, but that it was impossible to tell. However, I've recently noticed that Everest Poker has what I think is the code commented out, and on PokerStars I can't find it at all (even in the includes).
I would be happy to post the Everest code but, while I've read the etiquette, I'm not 100% sure whether this is allowed.
So my question is... for the blog I'm redesigning, do I still need to follow this practice? I would prefer search engines saw some static intro text describing the site, rather than the blog posts, the excerpts of which will probably be canonicalized to the actual post pages to avoid duplication issues. But I would prefer this static content to appear below the fold.
What is current best practice here?
Alex
-
Thanks Edward
-
It would be possible to have the text at the beginning of the HTML document but then display it further down the page using CSS, not JavaScript.
I don't think there is a massive need to do something like this. In the past, Google may not have indexed all of the content on a page, especially if the document was very large, so this positioning trick ensured that the important SEO-focused content would be indexed. If you build your site properly and take page size, load speed, and clean code into account, there should be no need to move the content around.
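A minimal sketch of the CSS-only approach described above, using the flexbox `order` property (a modern way to do it; the IDs are illustrative): the intro copy sits early in the source but is rendered below the posts, with no JavaScript involved.

```html
<!doctype html>
<html>
<head>
  <style>
    /* Visual order is controlled by CSS, not by source order:
       the intro copy comes second in the HTML but renders last. */
    body        { display: flex; flex-direction: column; }
    #header     { order: 1; }
    #intro-copy { order: 3; }
    #posts      { order: 2; }
  </style>
</head>
<body>
  <div id="header">Site header</div>
  <div id="intro-copy">Static intro text describing the site</div>
  <div id="posts">Blog post excerpts</div>
</body>
</html>
```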
-
Hi Vahe
Thanks for the response, and the article link - I'll take a look at that later.
However, I think you've misunderstood the situation. The content is not hidden - it's clearly visible and crawlable at the bottom of the page. It's simply placed in a div, and that div is loaded immediately after the header through the use of JavaScript.
I'm no JavaScript expert, but Everest Poker appears to have commented the function out, and PokerStars appears to have removed it altogether.
If that is, in fact, what they've done (and I'm not misreading the code, which is possible), then my question is: does this 'trick' of placing text lower in the page, while telling spiders to crawl it first, no longer work?
Hope that clears things up.
Alex
-
Hi Alex,
In my view, unless it is served as alternative content, any hidden content is unethical SEO.
Have a go at content stacking - http://www.dummies.com/how-to/content/move-up-your-web-page-content-for-better-search-en.html
Hope this helps,
Vahe