Horrible ranking for hundreds of content pages
-
We have hundreds of pages of restaurant reviews (each is 100% original content) on our site (for example, here is a list of many of the content pages: https://www.rewards21.com/articles). The PA of almost every page is 1 (ouch).
Even the most long-tail Google search doesn't bring up any of those pages, even when I include our company name in the search string.
I'm certain we have plenty of room to improve our SEO, but the pages not appearing for even the most long-tail searches has me concerned...
Our site was on WordPress until many months ago, and many of these pages ranked fine. Ever since switching away from WordPress, we've made no SEO progress. I'm wondering if something major is "broken" with our site's SEO...
-
Thanks Rebecca. Unfortunately, if you remove the brand name, even a long-tail search that includes the restaurant name and city ranks incredibly low, which just seems unusual.
Webmaster Tools reports only a tiny number of indexed pages, but when I do a "site:" search on our domain name, I see thousands of pages in the index...
The query report in GA unfortunately isn't showing me any data. I haven't set up a rank checker yet, only because rankings are so low I figured there was probably some overall major issue going on...
-
The few that I tested including your brand name turned up page 1 results in an incognito search. They're definitely indexed, and performing well enough on the branded searches I tried.
Are you running any kind of rank checker? Looking at query reports in Webmaster Console or GA?
Related Questions
-
Keyword ranking for different page than the page optimized
I have optimized "equipment trailer for rent" on this page: http://www.bigtrailerrentals.com/flatbed-trailer-rentals/equipment-deckover. I'm wondering if anyone can tell me why Google has chosen to rank the keyword phrase for this page instead: http://www.bigtrailerrentals.com/flatbed-trailer-rentals/equipment-24 This is just one example; it has happened on several of my pages/keywords.
On-Page Optimization | BigTrailerRentals
-
Optimise root level page for secondary keyword or try to make second level rank?
My main keyword ranks nowhere in the top 100 for any page on my site. For /, I'm ranked at number 10 for a great secondary keyword. I really want to get this secondary keyword ranking on /secondary-keyword.html. Should I focus my main page only on my primary keyword and remove any references to the secondary keyword? My worry is that I would be sacrificing my rank for the secondary keyword.
On-Page Optimization | Adamzoz
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories on the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages. But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories and sub-categories but only show what is available in that particular wood. And of course, they're optimised much better for that wood. All well and good, until recently: these specialist pages seem to have dropped through the floor in Google. It could be temporary, I don't know, and it's only been a fortnight - but I'm worried. Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own optimised wood-specific sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer still gets to see what they want. Which is better? One page per sub-category, dynamically filtered by search? Or lots of specific sub-category pages? I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better to keep that authority juice on a single page, even if the URL changes (with a query in the URL) to enable whatever filtering we need to do?
On-Page Optimization | pulcinella2uk
-
On brand queries, Google does not show my home page first; instead it shows internal pages
Also, on my brand query it doesn't show sitelinks. What might be the reason?
On-Page Optimization | vivekrathore
-
Links to Paywall from Content Pages
Hi, My site is funded by subscriptions. We offer lengthy excerpts, and then direct people to a single paywall page, something like domain.com/subscribe/ This means that most pages on the site link to /subscribe, including all of the high-value pages that bring people in from Google. This is a page with an understandably high bounce rate, as most users are not interested in paying for content on the web. My question is: are we being penalized in Google for having so many internal links to a page with a very high bounce rate? If anyone has worked with paywall sites before and knows the best practices for this, I'd be really grateful to learn more.
On-Page Optimization | enotes
-
Dealing with thin content/95% duplicate content - canonical vs 301 vs noindex
My client has 14 physical locations around the country but a webpage for each "service area" they operate in. They have a Croydon location, but separate pages for London, Croydon, Essex, Luton, Stevenage and many other places (areas near Croydon) that the Croydon location serves. Each of these pages is a near duplicate of the Croydon page with the word Croydon swapped for the area name. I'm told this was an SEO tactic circa 2001. Obviously this is an issue. So the question: should I 301 redirect each of these pages to the Croydon page, or (what I believe to be the best answer) set a rel=canonical tag on the duplicate pages? Creating "real and meaningful content" on each page isn't quite an option, sorry!
On-Page Optimization | JamesFx
-
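Whichever option is chosen, it's easy to verify programmatically that the duplicate pages carry the intended canonical. Here's a minimal Python sketch using only the standard library; the domain, paths, and titles are hypothetical placeholders, not the client's actual URLs:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel") == "canonical" and self.canonical is None:
                self.canonical = attr_map.get("href")

# Example: a duplicate "service area" page pointing at the main location page.
html = """
<html><head>
<title>Our Services in Luton</title>
<link rel="canonical" href="https://example.com/locations/croydon" />
</head><body>...</body></html>
"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # https://example.com/locations/croydon
```

In practice you'd fetch each area page's HTML and run it through the parser, confirming every near-duplicate resolves to the Croydon URL.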
Why some pages are not indexed?
I have a furniture ecommerce site. When searching for "site:movstore.com.br", Google returns 1080 results, but if I search for "site:movstore.com.br/Product" it returns 1020 results. That is, of the 1080 indexed pages, 1020 are product pages and the other 60 pages are irrelevant. Where are the category pages?
"site:movstore.com.br/Categories" - 0 results
"site:movstore.com.br/Departments" - 0 results
"site:movstore.com.br/Marks" - 0 results
What might be happening?
On-Page Optimization | maisempresas
-
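One way to diagnose a gap like this is to compare what the sitemap declares against what the site: operator reports per section. A small Python sketch that counts sitemap URLs by their first path segment; the sitemap fragment below is a made-up example standing in for a real /sitemap.xml download, not movstore's actual data:

```python
import xml.etree.ElementTree as ET
from collections import Counter
from urllib.parse import urlsplit

# Hypothetical sitemap fragment; in practice, fetch the site's /sitemap.xml.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/product/oak-table</loc></url>
  <url><loc>https://example.com/product/pine-chair</loc></url>
  <url><loc>https://example.com/categories/tables</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_by_section(sitemap_xml: str) -> Counter:
    """Count sitemap URLs by first path segment (e.g. 'product', 'categories')."""
    root = ET.fromstring(sitemap_xml)
    sections = Counter()
    for loc in root.findall(".//sm:url/sm:loc", NS):
        path = urlsplit(loc.text.strip()).path
        segment = path.strip("/").split("/")[0] or "(root)"
        sections[segment] += 1
    return sections

print(count_by_section(SITEMAP_XML))  # Counter({'product': 2, 'categories': 1})
```

If the category pages appear in the sitemap but not in the index, the issue is likely crawl/indexation (noindex, robots.txt, canonicals); if they're missing from the sitemap too, the site simply isn't exposing them.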
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (under 30 min, 30 min to 1 hour, over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30
There can be any combination of these variables, meaning there are hundreds of possible search-result URL variations. This all works well on the site, but it produces multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I'm not sure which of these would be best. As far as I can tell, the canonical tag should be used when the same page is available at two separate URLs, but that isn't quite the case here, as the search results are always different. Adding the URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we'll presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
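One common pattern for faceted search like this (not the only one) is to keep the filter pages crawlable but canonicalize away the parameters that don't change the result set, such as pagination, and normalize the order of the rest so equivalent combinations collapse to one URL. A hedged Python sketch, assuming the parameter names from the URLs above; "start" comes from the question, the other ignored names are assumptions:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that alter presentation but not the result set. "start"
# (pagination) is from the question; "sort" and "sessionid" are examples.
IGNORED_PARAMS = {"start", "sort", "sessionid"}

def canonical_url(url: str) -> str:
    """Return a canonical form of a search URL: drop ignored parameters
    and sort the rest so equivalent filter combinations map to one URL."""
    parts = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(parts.query)
              if k not in IGNORED_PARAMS]
    query = urlencode(sorted(params))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

# Two paginated variants of the same salad search collapse together:
a = canonical_url("https://example.com/find-a-recipe.php?course=salad&start=30")
b = canonical_url("https://example.com/find-a-recipe.php?start=60&course=salad")
print(a == b)  # True
```

The resulting URL would be emitted in each page's `rel=canonical` link element, so page 2 of a salad search canonicalizes to the unpaginated salad search rather than to itself.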