Wrong Page Indexing in SERPs - Suggestions?
-
Hey Moz'ers!
I have a quick question. Our company (Savvy Panda) is working on ranking for the keyword: "Milwaukee SEO".
On our website, we have a page for "Milwaukee SEO" in our services section that's optimized for the keyword, and we've been building links to it. However, when you search for "Milwaukee SEO," a different page is being displayed in the SERPs.
The page that's showing up in the SERPs is a category view of our blog: a listing of articles with the tag "Milwaukee SEO".
**Is there a way to alert Google that the page showing up in the SERPs is not the most relevant, and to request that a different URL be indexed for that spot?**
I saw a webinar a while back that showed something like that using the sitelinks demote tool in Google Webmaster Tools.
I would hate to demote that URL and then lose any kind of indexing for the keyword.
Ideas, suggestions?
-
I'm not sure how many of your /tag/ pages are ranking, but if you can figure that part out, you can try doing 301 redirects for the specific URLs in your .htaccess file, for example:
Redirect 301 /tag/Milwaukee-SEO.html http://savvypanda.com/services/milwaukee-seo.html
If you need further help with .htaccess and Joomla, I'm pretty well-rounded there. We use Joomla for the majority of our clients (followed by WordPress).
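If many /tag/ URLs turn out to be ranking, a single pattern rule can cover them all. This is only a sketch: it assumes every tag URL follows a `/tag/<slug>.html` pattern and that a matching services page exists for each slug, which you would need to verify against your actual Joomla URLs.

```apache
# .htaccess (Apache, mod_alias) — hypothetical pattern rule.
# Redirects any /tag/<slug>.html page to a services page with the
# same slug; adjust the regex and target path to your real URLs.
RedirectMatch 301 ^/tag/([A-Za-z0-9-]+)\.html$ http://savvypanda.com/services/$1
```

If the tag and services slugs don't line up one-to-one, stick with individual `Redirect 301` lines per URL instead.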
-
I'm cool with not having them indexed; I'm just worried that if I demote or block the /tag/ pages from being indexed, we'll lose ranking for those keywords.
Right now the /tag/ URL is ranking fairly well.
-
I personally would not bother indexing the /tag/ pages, since from a quick look all of that content already exists at its own permalink somewhere within your site.
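A common way to drop the /tag/ listings from the index while still letting link equity flow through them is a robots meta tag in the tag-listing template. A sketch, assuming you can edit the relevant template; which file that is depends on your Joomla setup:

```html
<!-- In the <head> of the /tag/ listing template (hypothetical
     location — adjust for your Joomla template overrides).
     "noindex, follow" removes the page from Google's index but
     still lets crawlers follow the links on it. -->
<meta name="robots" content="noindex, follow" />
```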
-
Hey Dan,
You caught on to the big problem we're correcting now. It's the way the tagging system works in our blog... it's causing all kinds of duplicate content errors. We're changing tagging systems to help with this. So I plan on doing that first, but do you have any ideas on how to correct the /tag/ URL that's being indexed instead of our "Milwaukee SEO" services page?
-
I see your /tag/ listing is showing up in the SERPs. I also noticed you have duplicate content issues on your website.
See this for an example:
I'd consider fixing the duplicate content issue first; that is definitely a major problem and is probably affecting a lot of other landing pages. Fixing it might also fix the original problem you posted about.
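One standard fix for duplicate content like this is a canonical tag pointing each duplicate at the preferred URL. A sketch — the services URL is the one mentioned earlier in this thread, but where and how you add the tag depends on your Joomla templates:

```html
<!-- Hypothetical placement: the <head> of each duplicate or /tag/
     listing page. Tells Google which URL should be treated as the
     canonical version and receive the consolidated ranking signals. -->
<link rel="canonical" href="http://savvypanda.com/services/milwaukee-seo.html" />
```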
-
I believe you are referring to robots.txt, which tells Google to skip a page while crawling. I don't think you want to do this. However, I checked the backlinks (anchor text to your site) and it seems like you have not built any incoming links using your keyword "Milwaukee SEO". I would recommend building some good links using "Milwaukee SEO" as the anchor text.
Your link code should look like this: `<a href="http://savvypanda.com/services/milwaukee-seo.html">Milwaukee SEO</a>`
Post this on a few local sites. Since you are a web design company as well, you can include that link in some of your local sites' footers. :) Good luck.