Is Content Location Determined by Source Code or Visual Location in a Search Engine's Mind?
-
I have a page with two scrolling areas. The first third of the page (from the left) has thumbnail pictures (not original content) with a vertical scrollbar next to it. The remaining two-thirds of the page has a lot of unique content, with its own vertical scrollbar.
Question: Visually, on a computer, the unique content is right next to the thumbnails, but in the source code the original content appears after the thumbnails. Does that mean search engines will see this content as "below the fold"? And would actually placing this content below the thumbnails (requiring a lot of scrolling to reach the original content) be, in a search engine's mind, the exact same location, since the source code would show the content in the same place?
I am trying to understand whether search engines base their analysis on source code alone or also on the visual location of content. Thanks.
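For reference, the layout described could be sketched with CSS along these lines (class names and sizes are hypothetical, not from the actual site); note that the unique content comes after the thumbnails in the source order but renders beside them:

```html
<!-- Hypothetical markup: the unique content follows the thumbnails
     in the source, but floats render the two columns side by side,
     each with its own scrollbar. -->
<div class="thumbs" style="float: left; width: 33%; height: 100vh; overflow-y: scroll;">
  <!-- thumbnail images (not original content) -->
</div>
<div class="unique-content" style="float: left; width: 67%; height: 100vh; overflow-y: scroll;">
  <!-- the unique, original content -->
</div>
```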
-
That sounds like a reasonable approach. If you wanted to be extra careful, you could also add a robots "noindex, follow" tag to the head of the paginated pages, since they have very little unique content to add.
A third option, which I would only use if people are linking to those paginated pages (very rare), is to rel=canonical the paginated pages to the first page.
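For clarity, the two options above would look something like this in the head of a paginated page (the URLs are placeholders):

```html
<!-- Option 1: keep the paginated page out of the index, but let
     crawlers follow its links so link equity still flows -->
<meta name="robots" content="noindex, follow">

<!-- Option 2 (only if external links point at paginated pages):
     canonicalize them to the first page of the series -->
<link rel="canonical" href="http://www.example.com/listings/">
```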
-
Thanks again. That is my big concern: should I put in the effort to move the content higher on the page? It is the year 2014, and Google does not give real estate websites or e-commerce sites any clue as to how they want us to deal with duplicate-content issues (content appearing across a bunch of other websites). I am using "noindex, follow" for the MLS result pages where I have not added unique content; when I do have unique content on page 1, I keep the entire series of paginated pages (sometimes pages 1 through 100) indexed but add rel=next/prev.
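A sketch of that rel=next/prev markup as it might appear on page 2 of a series (the URLs are placeholders, not the actual site's):

```html
<!-- In the <head> of page 2 of a paginated series -->
<link rel="prev" href="http://www.example.com/mls-results/">
<link rel="next" href="http://www.example.com/mls-results/page-3/">
```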
Any thoughts on that?
-
I think Google is looking for more extreme situations than the one you have. The content is well written, useful, and isn't so far down the page that visitors won't see it. However, I don't have to tell you that it's going to take a LOT to compete in that niche.
Good luck.
-
Thanks, Everett. Appreciate the input. Take a look here: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/ . If I move all my "unique content" (currently below the thumbs and the large map) up to where the map is, and get rid of that map, are you saying it will most likely be seen as being more "above the fold"?
-
Hello Khi5,
I can't say with 100% certainty, but I feel confident that Google looks at both. I'm not sure about other search engines. Specifically, the "page layout" algorithm needs to render the HTML/CSS (and increasingly JavaScript) in order to determine whether there are too many ads "above the fold". Google also used to render the page to provide "instant previews" of each website in the SERPs.
In other words, the all-seeing eye of Google knows whether your "unique content" shows up above or below the fold, or even 6,000 pixels off-screen to the left.
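To illustrate why rendering matters: text that appears early in the source can still be pushed out of view with CSS, and a crawler that renders the page sees the computed position rather than the source order (hypothetical snippet):

```html
<!-- First in the source, but rendered 6,000px off-screen to the left;
     a source-only reading would treat this content as prominent,
     while a rendering crawler can detect that it is invisible. -->
<div style="position: absolute; left: -6000px;">
  Unique content hidden off-screen.
</div>
```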
Related Questions
-
What's the average rank update time after site and/or backlink changes?
What's currently the typical time, on average, that it takes to see ranking changes when significant improvements are made to significant ranking signals on a long-established (as opposed to brand-new) website? Does the rank update associated with on-page optimization happen sooner than for the addition of quality backlinks?
Intermediate & Advanced SEO | JCCMoz
-
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it, i.e. a new experience for our users, with some changes in site architecture, which include a change in URL structure for existing URLs and the introduction of some new URLs as well. Now, my question is: what's the best way to A/B test the new version? We can't launch it for only a part of our users (say, making it live for 50% of users while the remaining 50% see the old/existing site) because the URL structure has changed, and bots will get confused if they start landing on different versions. Will this work if I reduce the crawl rate to zero during the A/B test? How will this impact us from an SEO perspective? How will the old-to-new 301 redirects affect our users? Have you ever faced or handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is something new to you, I would love to know your recommendations before taking the final call. Note: We're taking care of all existing URLs, properly 301-redirecting them to their newer versions, but there are some new URLs supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible and can't be redirected to a valid URL on the old version.
Intermediate & Advanced SEO | _nitman
-
Why is /home used in this company's home URL?
I'm working with a company that has chosen a homepage URL with /home tacked on, which is very strange indeed. Has anybody else come across this kind of homepage URL "decision" in the past? I can't see why on earth anybody would do this! Perhaps it is simply a logic-defying decision?
Intermediate & Advanced SEO | McTaggart
-
How do I prevent 404s from hurting my site?
I manage a real estate broker's site on which the individual MLS listing pages continually create 404 pages as properties are sold. So, on a site with 2200 pages indexed, roughly half are 404s at any given time. What can I do to mitigate any potential harm from this?
Intermediate & Advanced SEO | kimmiedawn
-
Don't affiliate programs have an unfair impact on a company's ability to compete with bigger businesses?
So many coupon sites and other websites these days will only link to your website if you have a relationship with Commission Junction or one of the other large affiliate networks. It seems to me that links on these sites are really unfair, as they allow businesses with deep pockets to acquire links inequitably. To me these look like "paid links," since the average website cannot afford the cost of running an affiliate program. Even worse, the only reason these businesses earn a link is that they have an affiliate program; that alone should violate some sort of Google rule about the types and values of links. An affiliate program as the sole reason for earning a link is preposterous, just as bad as paid link directories that have no editorial standards. I realize the affiliate links are wrapped in CJ's code, so that must diminish the value of the link, but there is still tons of good value in having the brand linked to from these high-authority sites.
Intermediate & Advanced SEO | williamelward
-
Help, my site isn't being indexed
Hello... We have a client that had around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content to avoid affecting my production site, since it was going to be an exact replica of the content with a different design. The development team did it wrong and blocked the production site instead (using robots.txt), so the site lost all its organic traffic, which was 85-90% of the total, and now gets only a couple of hundred visits a month. At first I thought we had somehow been penalized; however, when I saw the other site receiving new traffic and being indexed, I realized what had happened, so I fixed the robots.txt and created 301 redirects from the subdomain to the production site. After resubmitting sitemaps, posting links to Google+, and trying many other things, I can't get Google to reindex my site. When I do a site:domain.com search in Google, I only get 3 results. It's been almost two months now and I honestly don't know what to do. Any help would be greatly appreciated. Thanks, Dan
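For context, the fix described (unblocking the production site in robots.txt and 301-redirecting the staging subdomain) might look something like this, assuming an Apache server; the domain names are hypothetical, not from the poster's setup:

```apache
# .htaccess on the staging subdomain (staging.example.com is a placeholder):
# permanently redirect every URL to its production equivalent.
# The production site's robots.txt must also no longer disallow crawling.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^staging\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```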
Intermediate & Advanced SEO | daniel.alvarez
-
How To Handle Duplicate Content Regarding A Corp With Multiple Sites and Locations?
I have a client that has 800 locations; 50 of them are mine. The corporation has a standard website for its locations, and the only thing different on each page is the location info. The majority of the content is the same across each location's website. What can be done to minimize the impact/penalty of having "duplicate or near-duplicate" content on their sites, assuming corporate won't allow the pages to be altered?
Intermediate & Advanced SEO | JChronicle
-
Does adding index.php at the end of the URL affect its rankings?
I have just had my site updated, and we have put index.php at the end of all the URLs. Not long after, the site's rankings dropped. Checking the backlinks, they all go to (for example) http://www.website.com and not http://www.website.com/index.php. So could this change have affected rankings, even though the old URL redirects to the new one?
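One common remedy, sketched here assuming an Apache server (the rewrite rule is illustrative, not from the poster's actual configuration), is to 301 the index.php versions back to the clean URLs so the existing backlinks consolidate on one version:

```apache
# .htaccess: 301 any /path/index.php back to /path/
RewriteEngine On
RewriteRule ^(.*)index\.php$ /$1 [R=301,L]
```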
Intermediate & Advanced SEO | authoritysitebuilder