Best way to re-order page elements based on the referring search engine
-
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot) and the other is for everybody else.
Questions:
- Is it cloaking?
- What would be the best way to re-order the elements on the page: a completely different style sheet for each version, or calling different divs with the same style sheet?
- Is there a better way to re-order elements based on the referring search engine?
Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google versus everybody else. Don't ask me the reason behind it (executive orders!!)
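To illustrate the idea (this is a simplified, hypothetical sketch, not our actual implementation, and the template names are made up), the logic is roughly:

```php
<?php
// Hypothetical illustration only, not the actual code.
// Pick an element order based on where the visitor came from.
$referrer  = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

$fromGoogle = (stripos($referrer, 'google.') !== false)
           || (stripos($userAgent, 'Googlebot') !== false);

// The same two content blocks for everyone, just rendered in a different order.
$blocks = $fromGoogle
    ? array('main-copy.php', 'related-offers.php')   // Google visitors and Googlebot
    : array('related-offers.php', 'main-copy.php');  // everybody else

foreach ($blocks as $block) {
    include $block; // hypothetical template fragments
}
```

The fragments pulled in are identical for everyone; only the order they are rendered in changes.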
-
I think we're confused about how the actual pages differ. Is the visible content Google gets different, or is it just the source code? Without giving away too many details, can you explain how the content/code differs?
-
By "both versions" I mean (1) the version for users coming from Google and (2) the version for users coming from everywhere else: Yahoo, direct load, e-mail links, etc.
-
Agreed - if you're talking about source-code order, it can be considered cloaking AND it's not very effective these days. Google seems to have a general ability to parse the visual layout of your site (at least site-wide, if not page-by-page). In most cases, just moving a few code elements around has little or no impact. It used to make a difference, but the general consensus among people I've talked to is that it hasn't for a couple of years. These days, it can definitely look manipulative.
-
If you're set on doing this, the way to do it would be a backend language like .NET or PHP to detect Googlebot and generate a completely different page. That being said, it's highly black hat, and I wouldn't recommend doing anything close to it. Google doesn't like being fooled and has stated that it penalizes sites that try to display different content to the bot than to users who browse the site normally.
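For what it's worth, the detection part is trivial, which is part of why it looks so manipulative. A rough PHP sketch of that kind of user-agent check (the template names are hypothetical, and a real verification would also confirm the bot's IP with a reverse DNS lookup) would be something like:

```php
<?php
// Minimal user-agent sniff. This is exactly the kind of thing that counts
// as cloaking; shown only to illustrate the mechanism, not as a recommendation.
function isGooglebot() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    return stripos($ua, 'Googlebot') !== false;
}

if (isGooglebot()) {
    include 'page-for-bots.php';   // hypothetical template
} else {
    include 'page-for-humans.php'; // hypothetical template
}
```

Serving the bot a different page than humans based on a check like this is the textbook definition of cloaking.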
-
I'm guessing you're trying to reorder the sequence of the HTML, the on-page copy, the H1 tags, or something like that to squeeze out the maximum benefit. If that's the case, it's absolutely not recommended. Anything you do solely to help your site rank better is, unfortunately, a form of cloaking: it's trying to fool the bot.
If, however, you're doing it to help the user, it makes sense, but from the way the question sounds, that seems unlikely.
Think of it from a search engine's perspective: would you want your bot to be fooled or manipulated? The bots get smarter every day, and this form of cloaking is old and easy to detect, so I would suggest you not do this.
-
What do you mean exactly by "Both versions of the page"?
And what is the outcome you hope to get from this?