How to deal with pages no longer present on the site
-
Hi,
we need to cut some destinations from our tour operator's catalog, so basically we need to deal with destination pages and tour pages that are no longer present on the site.
What do you think is the best approach to handle these pages without losing rankings?
Do you think it's a good approach to 301 redirect these pages to the home page or to the general catalog page, or do you suggest another approach?
Thanks for your help!
-
Thanks Tim for the answer, it makes sense.
Let me explain my site structure in more detail:
site.com/destinations - hub for all the destinations
site.com/destinations/tanzania - single destination page
site.com/tours/tanzania-tour-1 - single tour page
site.com/travel-category/cultural-tours - a second way tours are organized, by travel category.
So let's say I don't want to sell the destination Tanzania anymore, along with all its related tours. If I want to keep the rankings for the destination and tours, I would need to 301 redirect the Tanzania destination page to the more general site.com/destinations page, and the site.com/tours/tanzania-tour-1 page to site.com/travel-category/cultural-tours, since it is a cultural tour.
Does this make sense?
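For what it's worth, here is a minimal sketch of how those two 301s could look, assuming an Nginx setup (the paths are the example URLs above; on Apache, `Redirect 301` in an .htaccess file does the same job):

```nginx
# Inside the server { } block for site.com

# Retired destination page -> general destinations hub
location = /destinations/tanzania {
    return 301 /destinations;
}

# Retired tour page -> its travel-category page
location = /tours/tanzania-tour-1 {
    return 301 /travel-category/cultural-tours;
}
```

The `location =` form matches those exact URLs only, so the rest of the /destinations and /tours sections are unaffected.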
-
I wouldn't divert them to the homepage; the content has to be relevant. As Tim says, keep them, or redirect to (or create) a page that does have relevant content, like advice, comparisons, or alternatives.
-
Completely agreed with Tim.
-
Hi there, I think this is a mixed question about meeting the needs of SEO and of your customers. You could naturally allow some pages to 404 if you no longer wish to rank for a specific location, or, as mentioned above, you could 301 certain pages to a new page on a similar or relevant topic/destination.
Managing the user experience and avoiding a 404 is probably best. A specialised landing page that keeps the destination may be of use: you could use the page to keep ranking for the destination while suggesting alternatives within the vicinity. This might be useful for hotels at a local level and still lead to conversions. For larger-scale alternatives, say at a country level, this may be more difficult, as the user is probably already set on visiting a specific destination; in that case a 301 to a higher-level category may be more appropriate, unless you want to make clear to the user that the location is no longer available.
If you still wish to rank for these old pages/destinations, it is probably best to keep them in place or redirect each one to a similar page.
Hope that helps.