Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content – many posts will still be viewable – we have locked both new posts and new replies.
Homepage appearing instead of subpage
-
Hi,
My homepage has links saying "bike tours" and "bike tours in France" because in the past I only offered bike tours in France. I now run tours all over Europe, and I have a dedicated page about "bike tours in France".
The issue is that my "bike tours in France" page never appears in the search rankings; it is always my homepage that ranks for the keyword "bike tours in France". My guess is that this is due to the links on my homepage?
How can I make sure my France page appears instead of my homepage?
Thank you,
-
Thank you, that confirms my thinking.
-
We run into this a lot at the agency I work for. Many of our clients have outstanding sub-pages for their specific services – such as bike tours – but none of those sub-pages ever rank. Their home page ranks for just about any relevant keyword.
We find that nine times out of ten, this is an authority issue. Any backlinks the site has point to the homepage, and the majority of the sub-pages have zero backlinks, and therefore very little authority – if any at all.
Our solution is to run short-term link-building campaigns focused on the sub-page. Writing content specifically about bike tours, with a link from the post to the service page, could be helpful. You can also submit your site to local directories and, when appropriate, link to that specific page, as opposed to the home page, to gain some authority.
Best of luck, I hope this helps!
-
I think you should improve the content of the France page... It's a pleasure to have been useful.
-
I get the structure, so all I can do now is remove the links that say "bike tours France" and point to the homepage, or improve the content of the France page? Correct?
Thank you for your detailed reply,
-
The structure of a website or blog is of great importance for its chances to rank in search engines.
1. A decent structure makes sure Google 'understands' your site.
The way your website is structured gives Google important clues about where to find the most important content. Your site's structure determines whether a search engine can understand what your site is about, and how easily it will find and index the content relevant to your site's purpose and intent.
2. A decent structure makes sure you do not compete with your own content.
On your site, you will probably write multiple pages about similar topics. If you wrote eight different articles about bike tours, Google would not know which of those articles or pages is the most important one. If you do not solve this by creating a clear site structure, you will end up competing with your own articles for a high ranking in Google. Solving this requires a good internal linking structure and/or taxonomy structure, which results in higher rankings. This means you should pick one article (the flagship article/page/product) for every major keyword you want to be found for, and link to that article from all of your other content about the same keyword.
How does this apply to your website?
You need to create a site structure that reflects that hierarchy. You mentioned that your main product/service is "bike tours Europe". Let's take an example:
1. bike tours Europe
1.1 bike tours France
1.2 bike tours Spain
1.3 bike tours Italy
So probably, in your case, your homepage is optimized for "bike tours in France":
there are many links pointing to your site with that anchor text (run a backlink analysis)
if Google is putting your homepage in first place, you need to figure out why.
Once you do that, define your structure so that it is easy for Google to understand your website.
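As a practical first step toward the audit described above, a small script can list each link's anchor text and target, so you can spot "bike tours in France" anchors that point at the homepage instead of the France sub-page. This is just an illustrative sketch using Python's standard library – the URLs and HTML below are made-up examples, not the poster's actual site:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class AnchorCollector(HTMLParser):
    """Collect (absolute href, anchor text) pairs from a page's HTML."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []            # list of (absolute_href, anchor_text)
        self._current_href = None  # href of the <a> we are inside, if any
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative hrefs against the page URL
                self._current_href = urljoin(self.base_url, href)
                self._text_parts = []

    def handle_data(self, data):
        if self._current_href:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href:
            text = "".join(self._text_parts).strip()
            self.links.append((self._current_href, text))
            self._current_href = None

# Hypothetical homepage HTML showing the problem in this thread:
# a keyword-rich anchor that points at the homepage itself.
html = """
<a href="/">bike tours in France</a>
<a href="/tours/france/">our France itinerary</a>
<a href="/tours/spain/">bike tours in Spain</a>
"""
parser = AnchorCollector("https://www.example.com/")
parser.feed(html)
for href, text in parser.links:
    print(text, "->", href)
```

Anchors whose text targets "bike tours in France" but whose href is the homepage are the candidates to repoint at the France sub-page; the same pass over a backlink export would show which external anchors reinforce the homepage for that keyword.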