Best way to re-order page elements based on the user's referring search engine
-
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), and the other is for everybody else.
Questions:
- Is it cloaking?
- What is the best way to re-order elements on the page: completely different style sheets for each version, or calling in different divs with the same style sheet?
- Is there a better way to re-order elements based on the referring search engine?
Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google than for everybody else. Don't ask me the reason behind it (executive orders!!)
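To make the scenario concrete, here is a minimal PHP sketch of the kind of setup being described (PHP since that's what the answers below mention). The referrer check and the block names are hypothetical placeholders, not a recommendation:

```php
<?php
// Minimal sketch of the scenario above: the same content blocks for
// everyone, re-ordered depending on whether the visitor arrived from
// Google. Block names and contents are hypothetical placeholders.

$referrer   = $_SERVER['HTTP_REFERER'] ?? '';
$fromGoogle = stripos($referrer, 'google.') !== false;

$intro    = '<div id="intro">Intro copy...</div>';
$features = '<div id="features">Feature list...</div>';
$reviews  = '<div id="reviews">Customer reviews...</div>';

// Same three blocks either way -- only the order changes.
$blocks = $fromGoogle
    ? [$reviews, $features, $intro]
    : [$intro, $features, $reviews];

echo implode("\n", $blocks);
```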
-
I think we're confused about how the actual pages differ. Is the visible content Google gets different, or is it just the source-code order? Without giving away too many details, can you explain how the content/code is different?
-
By "both versions" I mean (1) the version for users coming from Google and (2) the version for users coming from everywhere else: Yahoo, direct load, e-mail links, etc.
-
Agreed - if you're talking about source-code order, it can be considered cloaking, AND it's not very effective these days. Google seems to have a general ability to parse the visual elements of your site (at least site-wide, if not page-by-page). In most cases, just moving a few code elements around has little or no impact. It used to make a difference, but the general consensus among people I've talked to is that it hasn't for a couple of years. These days, it can definitely look manipulative.
-
If you're set on doing this, you could use a backend programming language like .NET or PHP to detect Googlebot and generate a completely different page. That being said, it's highly black hat, and I wouldn't recommend doing anything close to it. Google doesn't like being fooled and has stated that it penalizes sites that try to display different content to the bot than to users who browse the site normally.
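For concreteness, this is roughly what such bot detection looks like in PHP, shown only to illustrate what's being warned against, not something to deploy; the included file names are hypothetical:

```php
<?php
// Sketch of the user-agent sniffing described above -- the pattern
// Google treats as cloaking when the bot gets different content.
// Note: user agents are trivially spoofed, and real Googlebot
// verification uses reverse DNS, so this is also unreliable.

$userAgent   = $_SERVER['HTTP_USER_AGENT'] ?? '';
$isGooglebot = stripos($userAgent, 'Googlebot') !== false;

if ($isGooglebot) {
    include 'page-for-bots.php';   // hypothetical template
} else {
    include 'page-for-users.php';  // hypothetical template
}
```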
-
I am guessing you are trying to reorder the sequence of the HTML, the on-page copy, H1 tags, or something like that to squeeze out maximum benefit. If that's the case, it's absolutely not recommended. Anything you change purely to rank better, with no benefit to the user, is unfortunately a form of cloaking. It's trying to fool the bot.
If, however, you are trying to help the user, it makes sense; but from the way the question sounds, that seems unlikely.
Think from a search engine's perspective: would you want your bot to be fooled or manipulated? The bots get smarter every day, and this form of cloaking is old and definitely detectable. I would suggest you not do this.
-
What do you mean exactly by "Both versions of the page"?
And what is the outcome you hope to get from this?