Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
-
One of our web pages will not rank on Google. The website as a whole ranks fine except just one section... We have tested and it looks fine... Google can crawl the page no problem. There are no spurious redirects in place. The content is fine. There is no duplicate page content issue. The page has a dozen product images (photos) but the load time of the page is absolutely fine. We have submitted the page via Webmaster Tools and it's fine. It gets listed but then a few hours later disappears!!! The site has not been penalised as we get good rankings with other pages. Can anyone help? Know about this problem?
-
Don't forget that every keyword is different - how you rank depends on what you're doing compared to other sites targeting that term, not just what you're doing on your own site. So some keywords just take a larger, higher-authority link profile to rank for than others. A good place to start with getting links for that page would be to look at the backlinks that other pages that rank for that term have - you may be able to get some links from the same or similar sites.
-
Thanks again!!!
We have just implemented a site footer with keywords targeting the relevant pages. I thought this alone - along with the top navigation and copy (which is 265 words...) - would give Google enough guidance. But it's not!! External links we have only just started on - I admit, really just started. I was trying, in desperation, to get the pages right first.
Do you think more links would help? We will try adding more copy... The weird thing is, other pages on the site rank fine - including the homepage for the right keywords!!!
-
Ah OK, thanks for the clarification!
That problem, to me, sounds like you need some links! In general when Google is ranking your home page for a term, instead of the page that is actually about that term, it's because they recognize that your site has some topical relevance for that term, but the individual page doesn't seem that important based on how many links are pointing to it. Are there ways you can flow some additional internal link juice to that page? Are there sites that are linking to your home page right now that are very closely related to the topic of the page in question, that you could ask to point to that page instead? Are there topically-related sites that don't link to that page right now that you could possibly get a link from? All of these will beef up your page authority, which should help.
In terms of your copy being too far down on the page - if you don't think it will negatively impact your user experience, you could try moving it up, or integrating it into the section with the images, but I don't know how much that will help. You also may need more copy on the page - if your page is 300 lines of code long, and only 5 of those lines are unique copy, it's hard to send a strong enough signal of relevance. Can you expand what you say on the page to make it a better resource on the topic at hand?
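To put a rough number on that copy-to-code point, you can compare how much visible text a page has against the size of its markup. This is only an illustrative, stdlib-only Python sketch (not any official Google metric or Moz tool), but it makes the "300 lines of code, 5 lines of copy" problem easy to spot:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.in_skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_skip = False

    def handle_data(self, data):
        if not self.in_skip and data.strip():
            self.chunks.append(data.strip())

def text_to_code_ratio(html):
    """Return (visible word count, visible-text bytes / total HTML bytes)."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks)
    words = len(text.split())
    ratio = len(text) / max(len(html), 1)
    return words, round(ratio, 3)
```

Running this on a page that comes back with a couple of hundred words and a very low ratio suggests the page is mostly template, which is exactly the thin-copy situation described above.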
-
Thanks for the response! The right pages are not showing for the targeted keywords - the homepage is being ranked instead. Yet the homepage doesn't have the keywords represented on it. Google is taking the DMOZ description of the homepage and using that to rank it (for the targeted keywords)???
The pages do show in a Google site: search and they do have images on them. Each page has six images. We have ensured the size of the images is OK. The copy on the page is at the bottom, after the images - could this be an issue?
I have checked in Screaming Frog - nothing in particular stands out.
I know how we can stop Google reading the DMOZ description, and that will be implemented... but I don't think this will resolve the issue...?
We have noticed - very strangely - that the right page gets ranked, then Google switches to the homepage????
Any suggestions? It's baffling me...
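(Side note on the DMOZ point above: the usual way to stop Google using the ODP/DMOZ description was the `noodp` robots directive, i.e. `<meta name="robots" content="noodp">` in the page head. A rough, illustrative Python sketch for checking and adding it - the regexes are simplistic and only meant to show the idea, not to be a production HTML rewriter:)

```python
import re

NOODP_TAG = '<meta name="robots" content="noodp">'

def has_noodp(html):
    """True if any robots meta tag on the page includes the noodp directive."""
    for match in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        if "noodp" in match.group(0).lower():
            return True
    return False

def add_noodp(html):
    """Insert the noodp meta tag right after <head> if it's not already there."""
    if has_noodp(html):
        return html
    return re.sub(r"(<head[^>]*>)", r"\1" + NOODP_TAG, html, count=1, flags=re.I)
```

Calling `add_noodp` twice is safe - the second call is a no-op because the tag is already present.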
-
Is the problem that the page isn't appearing in the index, or that it isn't ranking for its target terms?
If the page has a lot of images but doesn't otherwise have much copy, it may be that Google is determining it to be too similar to other pages on your site and so is not displaying it. If it's not being indexed at all (doesn't show up in a site: search or when you search for a block of copy in quotations), double-check that your robots.txt isn't blocking it and that you don't have a meta robots noindex tag on the page. The suggestion of running Screaming Frog on your site to make sure a crawler can find the page is a good one - Screaming Frog will also tell you if the page is returning a weird HTTP status or is blocked by robots.
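The robots.txt and meta-noindex checks above are easy to script as well. A minimal stdlib-only Python sketch (the sample robots rules and URLs in the usage are made-up examples):

```python
from html.parser import HTMLParser
from urllib import robotparser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives.extend(d.strip().lower() for d in content.split(","))

def is_noindexed(html):
    """True if the page carries a meta robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

def is_blocked_by_robots(robots_txt, url, agent="Googlebot"):
    """True if robots_txt (the file contents) forbids `agent` from fetching `url`."""
    rules = robotparser.RobotFileParser()
    rules.parse(robots_txt.splitlines())
    return not rules.can_fetch(agent, url)
```

In practice you would fetch the live page and robots.txt first (and also check the HTTP status and any X-Robots-Tag header, which this sketch doesn't cover); Screaming Frog surfaces all of these in one crawl.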
-
First thing I would do is check the URL with Screaming Frog SEO Spider. In any case, a link to the page would be really helpful.
-
Are you able to share a link to the page in question?