Best way to re-order page elements based on search engine users
-
Both versions of the page have essentially the same content, but in a different order. One is for users coming from Google (and Googlebot), and the other is for everybody else.
Questions:
- Is it cloaking?
- What would be the best way to re-order the elements on the page: a totally different style sheet for each version, or different divs sharing the same style sheet?
- Is there any better way to re-order elements based on the referring search engine?
Let me make it clear again: the content is the same for everyone, just in a different order for visitors coming from Google versus everybody else. Don't ask me the reason behind it (executive orders!!)
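For context, the kind of referrer-based switch being described would typically be done server-side. Here is a minimal, hypothetical sketch of choosing an element order from the Referer header (the section names and the helper function are illustrative, not from this thread):

```python
from urllib.parse import urlparse

# Hypothetical section identifiers that a page template might render in order.
SECTION_ORDER_GOOGLE = ["products", "reviews", "about"]
SECTION_ORDER_DEFAULT = ["about", "products", "reviews"]

def section_order(referrer: str) -> list:
    """Pick a section order based on the request's Referer header."""
    host = urlparse(referrer or "").netloc.lower()
    if "google." in host:
        return SECTION_ORDER_GOOGLE
    return SECTION_ORDER_DEFAULT
```

Note that the Referer header is unreliable in practice (it is often stripped by browsers and proxies), which is one practical reason this approach is fragile even before the cloaking question comes up.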
-
I think we're confused about how the actual pages differ. Is the visible content Google gets different, or is it just the source code? Without giving away too many details, can you explain how the content/code is different?
-
"Both versions" meaning (1) for users coming from Google and (2) for users coming from everywhere else: Yahoo, direct load, e-mail links, etc.
-
Agreed - if you're talking about source-code order, it can be considered cloaking, and it's not very effective these days. Google seems to have a general ability to parse the visual layout of your site (at least site-wide, if not page-by-page). In most cases, just moving a few code elements around has little or no impact. It used to make a difference, but the general consensus from people I've talked to is that it hasn't for a couple of years. These days, it can definitely look manipulative.
-
If you're stuck on doing this, I would recommend using a backend programming language like .Net or PHP to detect Googlebot and generate a completely different page. That being said, it's highly black hat, and I wouldn't recommend doing anything close to it. Google doesn't like being fooled and has stated that it penalizes sites that display different content to the bot than to users who browse the site normally.
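To make concrete what this answer is warning against, here is a minimal sketch of naive user-agent bot detection (shown in Python rather than PHP or .Net; the function name is my own). Serving different content based on a check like this is precisely the cloaking pattern described above:

```python
def is_googlebot(user_agent: str) -> bool:
    """Naive check for Googlebot based only on the User-Agent string.

    Note: the User-Agent header is trivially spoofed, and Google recommends
    verifying its crawlers via reverse DNS instead. More importantly, serving
    the bot different content based on a check like this is cloaking.
    """
    return "googlebot" in (user_agent or "").lower()
```

Even as a legitimate logging or analytics aid, a check like this should never gate which content gets rendered.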
-
I am guessing you are trying to reorder the sequence of the HTML, the on-page copy, H1 tags, or something like that to essentially get the maximum benefit. If that's the case, then it's absolutely not recommended. Anything you do that only helps your site rank better is, unfortunately, a form of cloaking: it's trying to fool the bot.
If, however, you are trying to help the user, it makes sense; but the way the question sounds, that is unlikely.
Think from a search engine's perspective: would you want your bot to be fooled or manipulated? The bots get smarter day by day, and this form of cloaking is very old and definitely detectable. Therefore, I would suggest you not do this.
-
What do you mean exactly by "Both versions of the page"?
And what is the outcome you hope to get from this?
Related Questions
-
Home page vs inner page?
Do you believe that the advantage of targeting a search term on the home page is now worse than before? As I understand it, CTR is a big factor now, and as far as I can see, if two pages are equal on-page etc., the better CTR will win out. The issue with the home page is that the SERP stars cannot be used, hence the CTR on a product page will be higher? I feel that even if you were able to get a home page ranking quicker (one year instead of two), you would still lose out in the end due to the product page winning on CTR. Do you think this is correct?
Intermediate & Advanced SEO | BobAnderson
-
Blocking Dynamic Search Result Pages From Google
Hi Mozzers, I have a quick question that probably won't have just one solution. Most of the pages that Moz crawled for duplicate content were dynamic search result pages on my site. Could this be a simple fix of just blocking these pages from Google altogether? Or would Moz then report these pages as critical crawl errors instead of content errors? Ultimately, I contemplated whether or not I wanted these pages to rank, but I don't think it's worth it considering I have multiple product pages that rank well. I think in my case the best option is probably to leave out these search pages, since they have more of a negative impact on my site, resulting in more content errors than I would like. So would blocking these pages from the search engines and Moz be a good idea? Maybe a second opinion would help: what do you think I should do? Is there another way to go about this, and would blocking these pages do anything to reduce the number of content errors on my site? I appreciate any feedback! Thanks! Andrew
Intermediate & Advanced SEO | drewstorys
-
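Blocking crawlers from dynamic search result pages like the ones described above is typically done with a Disallow rule in robots.txt; a minimal sketch, assuming the search results live under a /search/ path (the path here is hypothetical):

```
User-agent: *
Disallow: /search/
```

Keep in mind that Disallow prevents crawling but does not guarantee removal from the index; a noindex robots meta tag on the pages themselves is the more reliable way to keep already-discovered pages out of search results.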
Do I need to re-index the page after editing URL?
Hi, I had to edit some of the URLs. But Google is still showing my old URLs in search results for certain keywords, which of course now return 404s. Crawling with Screaming Frog gets me 301 'page not found' and still shows the old URLs. Why is that? And do I need to re-index the pages with the new URLs? Is 'Fetch as Google' enough to do that, or is there any other advice? Thanks a lot; I hope the topic will help someone else too. Dusan
Intermediate & Advanced SEO | Chemometec
-
Does Google still not index hashtag links? No chance to get a search result that leads directly to a section of a page, or to one of numerous hashtag "pages" in a single HTML page?
Does Google still not index hashtag links? Is there no chance to get a search result that leads directly to a section of a page, or to one of numerous hashtag "pages" within a single HTML page? If I have 4 or 5 different hashtag-linked section pages consolidated into one HTML page, is there no chance to get one of those hashtag pages to appear as a search result? For example, if under one single-page travel guide I have two essential sections, #Attractions and #Visa, is there no chance to direct search queries for visas directly to the hashtag-link section #Visa? Thanks for any help
Intermediate & Advanced SEO | Muhammad_Jabali
-
Is there a way to show random blocks of text to users without it affecting SEO? Cloaking for good?
My client has a pretty creative idea for his web copy. In the body of his page there will be a big block of text that contains random industry-related terms, but within it he will bold and colorize certain words that create a coherent sentence. Something to the effect of "cut through the noise with a marketing team that gets results". Get it? So if you were to read the paragraph word for word, it would make no sense at all; it's basically a bunch of random words. He's worried this will affect his SEO and appear to Google as keyword stuffing. My question is: is there a way to block certain text on a webpage from search engines but show it to users? I guess it would be the opposite of cloaking? But it's still cloaking... isn't it? In the end we'll probably just make the block of text an image instead, but I was just wondering if anyone has any creative solutions. Thanks!
Intermediate & Advanced SEO | TheOceanAgency
-
Keyword Research: How best to target keywords without using a region as part of the search query.
When doing keyword research and trying to rank for a keyword, I am wondering if we need to localize the query by adding a city to it; for example, "Phoenix Web Design" vs. just targeting "web design", since Google is localizing search results now. Then, when creating content and optimizing the site, do we just put the keyword in the title and page content, or do we also add the region/city to the keyword phrase? Any insight would be appreciated.
Intermediate & Advanced SEO | hireawizseo
-
Can I delay an AJAX call in order to hide specific on page content?
I am an SEO for a people-search site. To avoid potential duplicate content issues for common people searches such as "John Smith", we display the main "John Smith" result above the fold and add "other John Smith" search results inside an iframe. This way search engines don't see the same "other John Smith" search results on all the other "John Smith" profile pages on our site and conclude that we have lots of duplicate content. We want to get away from using an iframe to solve the potential duplicate content problem. Question: can we display this duplicate "John Smith" content using a delayed AJAX call and robots.txt-block the directory that contains the AJAX endpoint?
Intermediate & Advanced SEO | SEOAccount32
-
What's the best way to phase in a complete site redesign?
Our client is in the planning stages of a site redesign that includes moving platforms. The new site will be rolled out in phases over a period of a year. They are planning to put the redesign on a subdomain (i.e. www2.website.com) during the rollout of the different phases, eventually switching the new site back over to the www subdomain once all the phases are complete. We're afraid that having the new site on the www2 subdomain will hurt SEO. For example, if their first phase is rolling out a new system to customize a product design, and this new design system is hosted at www2.website.com/customize, then when a customer picks a product to customize they'll be linked to www2.website.com/customize instead of the original www.website.com/customize. The old website will be phased out as more and more of the new website is completed, and users will be directed to www2. Once the entire redesign is complete, the old platform can be removed and the new website moved back to the www subdomain. Is there a better way of rolling out a website redesign in phases without hosting it on a different subdomain?
Intermediate & Advanced SEO | BlueAcorn