Best way to re-order page elements based on search engine users
-
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), and the other is for everybody else.
Questions:
- Is it cloaking?
- What would be the best way to re-order elements on the page: totally different style sheets for each version, or calling in different divs with the same style sheet?
- Is there any better way to re-order elements based on the search engine a visitor comes from?
Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google than for everybody else. Don't ask me the reason behind it (executive orders!!)
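To illustrate the setup, here is a minimal sketch of the kind of referrer-based switch I mean, assuming a plain PHP page; the stylesheet names and the google-referrer check are placeholders:

```php
<?php
// Hypothetical sketch: choose a stylesheet based on the HTTP referrer.
// Visitors arriving from Google get the re-ordered layout; everyone
// else (Yahoo, direct loads, e-mail links, etc.) gets the default.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host = $referrer !== '' ? parse_url($referrer, PHP_URL_HOST) : null;
$fromGoogle = is_string($host) && preg_match('/(^|\.)google\./i', $host);
$stylesheet = $fromGoogle ? 'layout-google.css' : 'layout-default.css';
?>
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="/css/<?php echo htmlspecialchars($stylesheet); ?>">
</head>
<body>
  <!-- Same markup in the same source order for everyone; only the CSS,
       and therefore the visual order, changes. -->
</body>
</html>
```

Note that Googlebot itself sends no referrer, so a referrer check alone would not cover the bot, which is part of why the answers below turn to user-agent detection.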
-
I think we're confused about how the actual pages differ. Is the visible content Google gets different, or is it just the source code? Without giving away too many details, can you explain how the content/code is different?
-
"Both versions" meaning: (1) the version for users coming from Google, and (2) the version for visitors coming from everywhere else: Yahoo, direct loads, e-mail links, etc.
-
Agreed - if you're talking about source-code order, it can be considered cloaking AND it's not very effective these days. Google seems to have a general ability to parse the visual elements of your site (at least site-wide, if not page-by-page). In most cases, just moving a few code elements around has little or no impact. It used to make a difference, but the general consensus from people I've talked to is that it hasn't for a couple of years. These days, it can definitely look manipulative.
-
If you're set on doing this, I would recommend using a backend programming language like .NET or PHP to detect Googlebot and generate a completely different page. That being said, it's highly black hat, and I wouldn't recommend doing anything close to it. Google doesn't like being fooled and has stated that it penalizes sites that display different content to the bot than to users who browse the site normally.
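For clarity only, the pattern being described looks roughly like this in PHP - a hypothetical sketch of exactly what NOT to do (and note that real Googlebot verification uses a reverse-DNS check, not just the user-agent string):

```php
<?php
// Hypothetical sketch of the black-hat pattern described above:
// sniff the user agent and branch the page for Googlebot.
// This is the behavior Google has said it penalizes.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (stripos($ua, 'Googlebot') !== false) {
    include 'page-for-bots.php';   // hypothetical template
} else {
    include 'page-for-humans.php'; // hypothetical template
}
```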
-
I am guessing you are trying to reorder the sequence of the HTML, the on-page copy, the H1 tags, or something like that to squeeze out the maximum benefit. If that's the case, then it's absolutely not recommended. Anything you do that only helps your site rank better is, unfortunately, a form of cloaking. It's trying to fool the bot.
If, however, you are trying to help the user, it makes sense; but from the way the question sounds, that is unlikely.
Think from a search engine's perspective: would you like your bot to be fooled or manipulated? The bots get smarter day by day, and this form of cloaking is very old and definitely detectable. So I would suggest you not do this.
-
What do you mean exactly by "Both versions of the page"?
And what is the outcome you hope to get from this?
Related Questions
-
VTEX Infinite Scroll Design: What is On-Page SEO Best Practice?
We are migrating to the VTEX e-commerce platform, and it is built on JavaScript, so there are no <a> tags to link product pages together when there is a long list of products. According to the Google Search Console Help document, "Google can follow links only if they are an <a> tag with an href attribute" (http://support.google.com/webmasters/answer/9112205). So, if there are 1,000 products, JavaScript just executes to deliver more content in order to browse through the entire product list. The problem is there is no actual link for crawlers to follow. Has anyone implemented a solution to this or a similar problem?
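One commonly suggested fallback for JavaScript-only product lists is to also server-render plain paginated <a href> links that crawlers can follow. A minimal sketch in PHP, with hypothetical URL patterns and page sizes:

```php
<?php
// Hypothetical sketch: emit real, crawlable pagination links alongside
// the JavaScript infinite scroll, so every product page is reachable
// through a plain <a> tag with an href attribute.
$totalProducts = 1000; // assumed catalog size
$perPage       = 48;   // assumed products per page
$pages         = (int) ceil($totalProducts / $perPage);

for ($p = 1; $p <= $pages; $p++) {
    printf('<a href="/products?page=%d">Page %d</a>' . "\n", $p, $p);
}
```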
Intermediate & Advanced SEO | ggarciabisco
-
How to best add affiliate links in a way that minimizes Panda risk?
We have a site of about 100,000 pages that is getting several million visitors per year via organic search. We plan to add about 50,000 new pages gradually over the next couple of months and would like to add affiliate links to the new pages. All these 50,000 new pages will have unique, quality data that a team has been researching for a while. I would like to add, below the fold or towards the end of the pages, unobtrusive affiliate links to about 5 different affiliate programs, customized to page content and of real value to visitors. Since affiliate links are one of the factors that may trigger Panda, I am a bit nervous about whether we should add the affiliate links, and whether there is any way of implementing them so that they are less likely to trigger Panda. E.g., would you consider hiding affiliate links from Google by linking to an intermediate URL (which I would mark as noindex, nofollow) on our domain which then redirects to the final affiliate landing page (though Google may notice via Chrome or Android data)? Any other ideas?
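The intermediate-URL idea mentioned in the question is usually implemented as a small redirect endpoint that is itself kept out of the index; a minimal sketch, assuming a hypothetical /out.php endpoint and made-up affiliate URLs (whether this actually reduces Panda risk is exactly the open question here):

```php
<?php
// Hypothetical sketch of an intermediate affiliate-redirect endpoint,
// e.g. /out.php?id=acme. The X-Robots-Tag header asks search engines
// not to index the redirect URL itself.
$links = [
    'acme'   => 'https://affiliate.example.com/?ref=12345',
    'globex' => 'https://partner.example.net/landing?aff=67890',
];

$id = isset($_GET['id']) ? $_GET['id'] : '';

header('X-Robots-Tag: noindex, nofollow');
if (isset($links[$id])) {
    header('Location: ' . $links[$id], true, 302);
} else {
    http_response_code(404);
}
exit;
```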
Intermediate & Advanced SEO | lcourse
-
I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2. Is this something I should just ignore, or should I create the author/admin/page/2 and then 301 redirect?
I'm going through the crawl report and it says I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2/. Now, the author/admin/page/2 I can't even find in WordPress, but it is the same thing as blog/page/2 nonetheless. Is this something I should just ignore, or should I create author/admin/page/2 and then 301 redirect it to blog/page/2?
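In WordPress this is usually handled with a server-level 301; the same idea in plain PHP, as a minimal sketch (the paths mirror the ones in the question, everything else is assumed):

```php
<?php
// Hypothetical sketch: 301-redirect the duplicate author-archive
// pagination to the canonical blog pagination.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (preg_match('#^/author/admin/page/(\d+)/?$#', $path, $m)) {
    header('Location: /blog/page/' . $m[1] . '/', true, 301);
    exit;
}
```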
Intermediate & Advanced SEO | shift-inc
-
NEW WEBSITE: WHAT IS THE BEST WAY TO RECOVER THE AUTHORITY OF THE OLD DOMAIN NAME?
How do I recover the authority of an old domain name? I got some advice on this in another post here on Moz; based on that, I need a few answers.
To summarize: my client got some really bad advice when they got their new website. They ended up changing the domain name and just redirecting everything from the old domain and old website to the front page of the new domain and new website. As the new domain is not optimized for SEO, they of course are now not ranking on anything in Google anymore.
QUESTION 1: According to my client, they used to rank well on keywords for the old domain and get a lot of organic traffic. They don't have access to their old Google Analytics account and don't have any reports on their rankings. Can anyone suggest how I can find out what keywords they were ranking on?
QUESTION 2: I will change the domain name back to the old domain name (the client actually prefers the old domain name). But how do I get back as much page authority as possible? For information: titles, descriptions, and content have all been rewritten.
A - Redirect: I will try to match the old URLs with the new ones.
B - Recreate site structure: make the URL structure of the new website look like the old URL structure. E.g., the old structure used to be olddomain.com/our-destinations/cambodia.html (old) vs. newdomain.com/destinations/Cambodia (new), or olddomain.com/private-tours.html (old) vs. newdomain.com/tailor-made (new). Does the .html in the old URLs need any attention when recreating the permalinks on the new website?
Look forward to hearing your thoughts on this, thanks!
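Matching old URLs to new ones (option A above) is often done with an explicit one-to-one map served as 301 redirects; a minimal sketch using the example paths from the question, with everything else assumed:

```php
<?php
// Hypothetical sketch: explicit old-URL -> new-URL map, issued as
// 301 redirects so each old page passes its authority to its closest
// equivalent instead of everything piling onto the homepage.
$map = [
    '/our-destinations/cambodia.html' => '/destinations/Cambodia',
    '/private-tours.html'             => '/tailor-made',
];

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (isset($map[$path])) {
    header('Location: ' . $map[$path], true, 301);
    exit;
}
```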
Intermediate & Advanced SEO | nm1977
-
Best tips for getting a video page to rank?
We have a video for our company, located here: http://www.imageworkscreative.com/imageworks-creative-video It's an overview of our company and the services we offer. We'd like to get this page ranking, but we haven't had much luck so far. Our YouTube account does better, but I'm looking for some things we can do on-site or off-site to get this page to rank. Any tips would be appreciated!
Intermediate & Advanced SEO | ScottImageWorks
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly so as not to lose everything we've worked on up to this point with the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local-pages subdomain setup: stores.websitename.com. How we plan on adding our new local subfolder setup: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
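If the path portion carries over unchanged, the one-to-one subdomain-to-subfolder redirect can be a single host-based rule; a minimal sketch in PHP under that assumption (if the paths differ between the two structures, a per-URL map is needed instead):

```php
<?php
// Hypothetical sketch: 301 every stores.websitename.com URL to the
// matching websitename.com/stores/... URL, assuming identical paths.
$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : '';
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if ($host === 'stores.websitename.com') {
    header('Location: https://websitename.com/stores' . $path, true, 301);
    exit;
}
```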
Intermediate & Advanced SEO | SEO.CIC
-
Pricing Page vs. No Pricing Page
There are many SEO sites out there that have an SEO pricing page; IMO this is BS. An SEO company cannot give every person the same quote for different keywords. However, this is something we are currently debating. I don't want a pricing page, because it's a page full of lies. My coworker thinks it is a good idea, and that users look for a pricing page. Suggestions? If I had to build one (which I am arguing against), is it better to just explain why pricing can be tricky, or to BS them like most sites do?
Intermediate & Advanced SEO | SEODinosaur
-
Best way to stop pages being indexed while keeping PageRank
On a discussion forum, for example, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed without diluting PageRank? If we added them to the Disallow list in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
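A common pattern for this is a robots meta tag with noindex, follow: the page stays crawlable, so the links on it can still be followed, but the page itself is kept out of the index. A robots.txt Disallow, by contrast, stops crawling entirely, so the links on the blocked page are never seen at all. A minimal sketch of a hypothetical posting-page template:

```php
<?php
// Hypothetical sketch: keep the posting page out of the index while
// letting crawlers follow its links. The HTTP-header equivalent of
// the meta tag below is: X-Robots-Tag: noindex, follow
header('X-Robots-Tag: noindex, follow');
?>
<!DOCTYPE html>
<html>
<head>
  <meta name="robots" content="noindex, follow">
  <title>Post a new topic</title>
</head>
<body>
  <!-- posting form goes here -->
</body>
</html>
```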
Intermediate & Advanced SEO | Peter264