Does Unique Content Need to Be Located Higher on My Webpages?
-
I have 1 page that ranks well with unique written content located high up on the page (http://www.honoluluhi5.com/new-condos-in-honolulu/). I struggle to rank for 200+ other pages where the unique content requires scrolling (ex: http://www.honoluluhi5.com/oahu/honolulu-homes/). I am thinking of doing the following:
- Change the layout of all my pages to have unique content higher on the page
- When users are on my site (not coming from search engines) and use my search filters, they will land on pages where the unique content is lower on the page (i.e. keep this layout: http://www.honoluluhi5.com/oahu/honolulu-homes/). I will then add these pages to my robots.txt file so they do not show in Google's index. Reason: unique content lower on the page offers the best user experience.
With unique content higher on the page, I expect bounce rate to increase about 10% (based on the 1 page I have with unique content higher), but I think it is worthwhile, as I am confident search engines will start ranking my pages higher.
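For reference, the robots.txt rules for the filter pages could look something like this (the URL patterns below are placeholders - I would need to match them to my actual filter URLs). One caveat I am aware of: robots.txt only blocks crawling, it does not guarantee removal from Google's index, so a noindex robots meta tag on those pages may be the safer option:

```
# robots.txt - hypothetical filter-page patterns, for illustration only
User-agent: *
Disallow: /search-results/
Disallow: /*?beds=
Disallow: /*?price=
```

Note that a URL blocked in robots.txt can still appear in search results if other sites link to it, and a noindex tag only works if the page remains crawlable.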
-
I have followed backlinks. My site architecture and quality of content are way above the competition. I see businesses buying up 100+ keyword-rich domains and ranking well for all of them. It tells me 2 things: 1) search engines are not always that clever, and 2) I need to be patient, because of 1).
-
Google has not stated anything saying that it is harder for new websites to rank quickly, and I doubt that they would implement something like that into their algorithm. The reason it is harder for a new website to rank is the lack of backlinks and citation sources. Without a history, it's harder for Google to see if a website is better or worse than others. This is why they place such a high priority on backlinks, as they paint a broad picture of how trustworthy a site or domain really is. This is one of many factors, but it's an important one to consider.
You stated that you have backlinks - have you checked to see if all of them are followed? If a link is not followed, it will only direct traffic to your site; it will not pass PageRank or weight.
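If you want to spot-check a page yourself, a rough script along these lines reports which links to your domain carry rel="nofollow" (just a sketch using Python's standard library - the sample HTML below is made up for illustration):

```python
# Minimal sketch: given a page's HTML, report whether each link to your
# domain is followed or carries rel="nofollow".
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.links = []  # list of (href, is_followed)

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.domain in href:
            rel = (attrs.get("rel") or "").lower().split()
            self.links.append((href, "nofollow" not in rel))

# Made-up sample HTML standing in for a real backlink page
html = '''
<a href="http://www.honoluluhi5.com/" rel="nofollow">Listing</a>
<a href="http://www.honoluluhi5.com/oahu/honolulu-homes/">Homes</a>
'''
audit = LinkAudit("honoluluhi5.com")
audit.feed(html)
for href, followed in audit.links:
    print(href, "-> followed" if followed else "-> nofollow")
```

In practice you would fetch each backlink page and run its HTML through the same check.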
I know a lot of people say this, but focus on laying out your page in a way that helps the user. Moving all your text higher up on the page will not make a magic improvement in your ranking, and I fear you will spend a lot of time modifying and not get the results you want. Spend time creating really nice listing pages and getting other sites to link back to them. Focus on building high-quality relationships with real estate sites that have authority in the eyes of consumers and of search engines. Look at large sites that are already successful in search results and see what you can learn from them. We wrote an article a while back about analyzing your competitors' SEO strategy; it might be worth a read for you. Focus on the content of your site, improving the conversion messages, improving the keyword density, and your overall message.
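As a quick way to sanity-check keyword density, you could count how often a target phrase appears relative to the total word count. A rough sketch (the sample text is illustrative, and this is not an official formula from Google or anyone else):

```python
# Rough keyword-density check: percentage of words on a page that belong
# to occurrences of a target phrase. Illustrative only.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    if not words or n == 0 or n > len(words):
        return 0.0
    # Count non-overlapping-window matches of the phrase
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

sample = "Honolulu condos for sale. Browse new condos in Honolulu today."
print(round(keyword_density(sample, "condos"), 1))  # → 20.0
```

There is no magic target number; the point is simply to notice pages where the phrase is barely present or obviously stuffed.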
That's where I would start.
-
Thanks for the answer. "...placement of the content (above the fold, below the fold...) is important for ranking - it's not what makes your page rank or not rank that high" - I am not sure if you are saying it is important or not?
If you look at the URL I sent (http://www.honoluluhi5.com/oahu/honolulu-homes/), besides the 10 MLS real estate listings on the left side (which all Realtors share), the content lower on the page is all unique: aerial photos, a written overview, history of the area, and advanced statistical data. My website has only been live for 8 months and has relatively few backlinks (though already more than most competitors, all natural, and several high quality).
Do we have evidence that Google has tightened its grip and it is tougher for new websites to rank quickly? I am puzzled about why those pages are not ranking well yet, and I think the location of the unique content too low on the page may be a main factor. Some insight would be appreciated.
-
Hi,
Although you are right that "real estate" / placement of the content (above the fold, below the fold, etc.) is important for ranking, it's not what makes your pages rank or not rank that high - at least for the ones you've sent as examples. The quality of the content (duplicate or unique), the competitors, the metrics, the on-page approach you are taking, the keywords targeted, and the SERP format for some of the keywords are far more important than placement.
Thanks.