Why is Google sending traffic to our homepage, not our optimized pages?
-
Hello Forum,
My team and I just completely redid a yoga eCommerce site, including its SEO. The old version of the site didn't feature page-specific optimization and, as a result, Google's search results for our keywords almost always directed visitors to the homepage.
For example, a Google search for the term "yoga bolster" sent users to the homepage, not the product category page for yoga bolsters.
After redoing the site and optimizing specific pages (e.g., the yoga bolster page is now optimized for the keyword "yoga bolster"), Google's search results still take users to the homepage, not the optimized page. (That is, if you search for "yoga bolster," find our result, and click the link, you're taken to the homepage, not the bolster page.)
It's only been about 36 hours since we launched the new website and submitted it through Google's webmaster tools.
Does anyone know why Google is still sending people to our homepage and not the keyword-optimized pages we created? Is this a timing issue?
-
At only 36 hours in, this is definitely a timing issue, but I can't say for sure that it will sort itself out on its own in the long term.
There are many other things to take into consideration beyond meta tags and on-page optimized content.
You would also want to run a high-quality informational link building campaign that (gracefully) uses the anchor text "yoga bolster" pointing to this newly optimized page. There is also internal linking you can do that would help reinforce this signal to Google.
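One way to sanity-check that internal linking: pull every internal link and its anchor text out of a page's HTML and confirm the "yoga bolster" anchors actually point at the bolster page. Here's a minimal sketch using only Python's standard library — the `example.com` URL and the markup are placeholders, not your actual site:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class AnchorAudit(HTMLParser):
    """Collect (absolute URL, anchor text) pairs for internal links only,
    so you can see which pages receive which anchor text."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []            # list of (absolute_url, anchor_text)
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self._current_href = urljoin(self.base_url, href)
                self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            text = " ".join("".join(self._text_parts).split())
            # keep links on the same host as the page; drop external ones
            if urlparse(self._current_href).netloc == urlparse(self.base_url).netloc:
                self.links.append((self._current_href, text))
            self._current_href = None

# Placeholder markup standing in for a fetched page
html = '<a href="/yoga-bolsters">Yoga Bolsters</a> <a href="https://other.com/x">Elsewhere</a>'
audit = AnchorAudit("https://example.com/")
audit.feed(html)
print(audit.links)  # [('https://example.com/yoga-bolsters', 'Yoga Bolsters')]
```

Run it over the HTML of your main templates and you'll quickly see whether the category pages are getting descriptive anchors or just "click here" / image links.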
w00t!
-
Hi Pano
Timing is one issue, and it will take longer than this. First the pages need to be indexed, which can take much longer than 36 hours for some websites, and then they need to be ranked for their targeted keywords. You say it's a new website, in which case both steps still lie ahead of you. If the URLs are the same as before, then it's just a reindexing delay plus the ranking factors.
The main reason, though, is most probably that your internal product/landing pages don't have any (or any good-quality) inbound links pointing to them; chances are they all point to your homepage. I'd suggest looking at this, as links are a major ranking factor.
So take another look at the on-page SEO to ensure it's as strong as it can be, then analyse your inbound link situation and, if necessary, get some high-quality, relevant links from reputable websites pointing to your internal pages (though don't pay for the links; free, natural links are the way to go).
Also, make sure that visitors and search spiders can easily navigate to the internal pages you are trying to get ranked, check for any crawl errors, and make sure that your robots.txt file is not disallowing any of those pages from being crawled.
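On the robots.txt point, you can sanity-check your rules offline with Python's standard-library robots.txt parser before relying on them. The rules and URLs below are hypothetical examples, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- substitute your site's real file
robots_txt = """
User-agent: *
Disallow: /checkout/
Disallow: /cart/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

pages = [
    "https://example.com/yoga-bolsters/",   # category page you want crawled
    "https://example.com/checkout/step-1",  # fine to block
]
for url in pages:
    print(url, "crawlable:", rp.can_fetch("Googlebot", url))
```

If a page you want ranked comes back as not crawlable, fix the robots.txt rule before worrying about anything else — no amount of on-page work helps a page Googlebot can't reach.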
Regards, Simon