Does Unique Content Need to be Located Higher on my webpages?
-
I have 1 page that ranks well with unique written content located high up on the page (http://www.honoluluhi5.com/new-condos-in-honolulu/). I struggle to rank 200+ other pages where reaching the unique content requires scrolling (ex: http://www.honoluluhi5.com/oahu/honolulu-homes/). I am thinking of doing the following:
- Change the layout of all my pages so unique content sits higher on the page
- When users are already on my site (not coming from search engines) and use my search filters, they will land on pages where unique content is lower on the page (i.e., keep this layout: http://www.honoluluhi5.com/oahu/honolulu-homes/). I will then add these pages to my robots.txt file so they do not show in Google's index. Reason: unique content lower on the page offers the best user experience.
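A quick way to sanity-check robots.txt rules like this before deploying them is Python's built-in `urllib.robotparser` (the `/search-filter/` path below is hypothetical, not a real path on the site). One caveat worth noting: robots.txt only blocks crawling; a blocked URL can still show up in Google's index if other pages link to it, so a meta robots `noindex` tag on a crawlable page is the more reliable way to keep pages out of the index.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules blocking internal filtered-search pages.
rules = """\
User-agent: *
Disallow: /search-filter/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Filtered-search pages are blocked from crawling; normal pages are not.
print(parser.can_fetch("*", "http://www.honoluluhi5.com/search-filter/condos/"))  # False
print(parser.can_fetch("*", "http://www.honoluluhi5.com/oahu/honolulu-homes/"))   # True
```

This only verifies the crawl rules themselves; it says nothing about whether an already-indexed URL will drop out of the index.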
With unique content higher on the page, I expect bounce rate to increase about 10% (based on the 1 page I have with unique content placed higher), but I think it is worthwhile, as I am confident search engines will start ranking my pages higher.
-
I have followed backlinks. My site architecture and quality of content are way above the competition. I see businesses buying up 100+ keyword-rich domains and ranking well for all of them. That tells me 2 things: 1) search engines are not always that clever, 2) I need to be patient, because of 1).
-
Google has not stated anything saying that it is harder for new websites to rank quickly, and I doubt that they would implement something like that into their algorithm. The reason it is harder for a new website to rank is the lack of backlinks and citation sources. Without a history, it's harder for Google to see whether a website is better or worse than others. This is why they place such a high priority on backlinks: they give Google a broad picture of how trustworthy a site or domain really is. This is one of many factors, but it's an important one to consider.
You stated that you have backlinks; have you checked to see if all of them are followed? If a link is not followed, it will only help to direct traffic to your site, not pass PageRank or weight.
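Whether a backlink is followed comes down to its `rel` attribute. Here is a minimal stdlib sketch for spotting nofollow links in a page's HTML (the markup below is made up for illustration, not taken from any real backlink page):

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects each anchor's href and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower().split()
            self.links.append((attrs.get("href"), "nofollow" in rel))

# Hypothetical backlink-page markup.
html = '''
<a href="http://www.honoluluhi5.com/" rel="nofollow">Honolulu condos</a>
<a href="http://www.honoluluhi5.com/new-condos-in-honolulu/">New condos</a>
'''

checker = NofollowChecker()
checker.feed(html)
for href, nofollow in checker.links:
    print(href, "-> nofollow" if nofollow else "-> followed")
```

In practice you would fetch each backlinking page and feed its HTML through the checker; a link tool like Moz or a crawler will do the same job at scale.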
I know a lot of people say this, but focus on laying out your page in a way that helps the user. Moving all your text higher up on the page will not make a magic improvement in your ranking, and I fear that you will spend a lot of time modifying and not get the results you want. Spend time creating really nice listing pages, and getting other sites to link back to them. Focus on building high-quality relationships with real estate sites that have authority in the eyes of consumers and of search engines. Look at large sites that are already successful in search results, and see what you can learn from them. We wrote an article a while back about analyzing your competitors' SEO strategy; it might be worth a read for you. Focus on the content of your site, improving the conversion messages, improving the keyword density, and your overall message.
That's where I would start.
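On the keyword-density point above: a rough sketch of how you might measure it, assuming a simple word-overlap definition of density (the sample text and target phrase are hypothetical):

```python
import re
from collections import Counter

def keyword_density(text, phrase):
    """Share of words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count non-overlapping-agnostic matches of the phrase as a word sequence.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return (hits * n) / len(words) if words else 0.0

sample = "Honolulu condos for sale. Browse new Honolulu condos and Honolulu homes."
print(round(keyword_density(sample, "honolulu condos"), 3))  # 0.364
```

Treat the number as a diagnostic, not a target; stuffing a page toward some "ideal" density is exactly the kind of modification the advice above warns against.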
-
Thanks for the answer. "...placement of the content (above the fold, below the fold ..) it's important for ranking - it's not what makes your page rank or don't rank that high" - I am not sure whether you are saying it is important or not?
If you look at the URL I sent: http://www.honoluluhi5.com/oahu/honolulu-homes/ - besides the 10 MLS real estate listings on the left side (which all Realtors share), the content lower on the page is all unique - aerial photos, written overview, history of the area and advanced statistical data. My website has only been live for 8 months, has relatively few backlinks (though more than most competitors already, and all natural links - several high quality).
Do we have evidence that Google has tightened its grip and made it tougher for new websites to rank quickly? I am puzzled about why those pages are not ranking well yet, and I think the location of the unique content too low on the page may be a main factor. Some insight would be appreciated.
-
Hi,
Although you are right that "real estate" / placement of the content (above the fold, below the fold ..) is important for ranking, it's not what makes your pages rank, or not rank, that high - at least for the ones you've sent as examples. The quality of the content (duplicate or unique), your competitors, your metrics, the on-page approach you are taking, the keywords targeted, and the format of the SERP for some of those keywords are all far more important than placement.
Thanks.