How to deal with pages that are no longer on the site
-
Hi,
We need to remove some destinations from our tour operator's catalog, so basically we have to deal with destination pages and tour pages that will no longer exist on the site.
What do you think is the best approach to handle these pages without losing rankings?
Do you think it's a good idea to 301 redirect these pages to the home page or to the general catalog page, or would you suggest another approach?
Thanks for your help!
-
Thanks for the answer, Tim, it makes sense.
Let me explain my site structure in more detail:
site.com/destinations - hub for all the destinations
site.com/destinations/tanzania - single destination page
site.com/tours/tanzania-tour-1 - single tour page
site.com/travel-category/cultural-tours - a second way tours are organized, by travel category.
So let's say I no longer want to sell the destination Tanzania and all of its related tours. If I want to keep the rankings for the destination and the tours, I would 301 redirect the Tanzania destination page to the more general page site.com/destinations, and the site.com/tours/tanzania-tour-1 page to site.com/travel-category/cultural-tours, since it is a cultural tour.
Does this make sense?
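On the implementation side, something like the rough sketch below is what I have in mind (purely illustrative, assuming a small Python/Flask front controller just to show the mapping; our actual server or CMS will have its own redirect mechanism):

```python
# Rough illustration of the 301 mapping described above (hypothetical
# Flask front controller; a real site would usually configure this in
# the web server or CMS instead).
from flask import Flask, redirect

app = Flask(__name__)

# Each retired URL points to the most relevant page that stays live.
RETIRED_PAGES = {
    "/destinations/tanzania": "/destinations",                    # destination -> destinations hub
    "/tours/tanzania-tour-1": "/travel-category/cultural-tours",  # tour -> its travel category
}

@app.route("/<path:path>")
def handle(path):
    target = RETIRED_PAGES.get("/" + path)
    if target:
        # 301 (permanent) so search engines transfer the old page's signals.
        return redirect(target, code=301)
    return "Not found", 404  # everything else falls through as usual

if __name__ == "__main__":
    app.run()
```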
-
I wouldn't redirect them to the homepage; the content has to be relevant. As Tim says, keep them, or redirect to (or create) a page that does have relevant content, such as advice, comparisons, or alternatives.
-
Completely agree with Tim.
-
Hi there, I think this is a mixed question about meeting the needs of both SEO and your customers. You could naturally allow some pages to 404 if you no longer wish to rank for a specific location, or, as mentioned above, you could 301 certain pages to a new page with a similar or relevant topic/destination.
Managing the user experience and avoiding a 404 is probably best. A specialised landing page that keeps the destination may be useful: you could use the page to keep ranking for that destination while suggesting alternatives in the vicinity. This can work well for hotels at a local level and still lead to conversions. For larger-scale alternatives, say at a country level, it may be more difficult, since the user is probably already set on visiting a specific destination; in that case a 301 to a higher-level category may be more appropriate, unless you want to make clear to the user that the location is no longer available.
If you still wish to rank for these old pages/destinations, it is probably best to keep them in place or redirect them to a similar page.
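To put those options side by side, here is a minimal sketch (a hypothetical Python/Flask example, only to illustrate which HTTP response each choice sends back; any CMS or web server can do the same thing):

```python
# Minimal sketch of the three options: keep a landing page (200),
# 301 to a relevant page, or mark the page as gone (410).
# Hypothetical Flask example for illustration only.
from flask import Flask, redirect

app = Flask(__name__)

# Action per retired URL:
#   ("landing", None)        - keep a 200 page that suggests alternatives (keeps rankings)
#   ("redirect", "/target")  - 301 to the closest relevant live page
#   ("gone", None)           - 410 if you no longer care about ranking for it
RETIRED = {
    "/destinations/tanzania": ("landing", None),
    "/tours/tanzania-tour-1": ("redirect", "/travel-category/cultural-tours"),
}

@app.route("/<path:path>")
def retired_pages(path):
    action, target = RETIRED.get("/" + path, (None, None))
    if action == "landing":
        # Keep the destination page alive, but point visitors to alternatives.
        return "<h1>No longer bookable</h1><p>You might like these alternatives...</p>", 200
    if action == "redirect":
        return redirect(target, code=301)
    if action == "gone":
        return "This page has been removed.", 410
    return "Not found", 404
```

The routes and HTML are placeholders; the point is simply which status code each option returns.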
Hope that helps.