I want to remove some pages from my site that have PR. What should I do about the traffic?
-
I have a section of my site that I want to remove. It has a main page linked from the nav menu and a half dozen subpages under it. The pages get some traffic and have ranks up to PR3, which is the same as my site's home page. I no longer want to maintain these pages; they require tremendous upkeep and I'm not interested in keeping them going.
So, I know that if I just remove these pages and do nothing else, I'm going to pay for it somewhere with Google. What else should I do? I don't really have similar pages to redirect them to.
-
I'd go for #2
-
Good ideas. Thanks.
-
If you don't have similar pages to redirect to, I would do one of the following:
1. Keep the pages up but add some text that says "This is an archive, so some of this content may be outdated," and then possibly link to other authoritative sites that do have updated content. This way your pages will likely still rank well in the search engines, and users will find what they are looking for, either in your archive or on the external sites.
2. Simply 301 redirect these pages to the homepage to preserve any link juice/link equity (a minimal redirect sketch follows this list). You will likely see a drop in organic traffic for any terms these pages currently rank for, but it's a much better option than simply removing the pages and serving 404 errors, which is bad for search engines and users alike.
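To make option 2 concrete, here is a minimal sketch of the redirect, assuming an Apache server with mod_alias enabled; /old-section is a hypothetical stand-in for the section being retired:

    # Hypothetical example: permanently (301) redirect a retired section,
    # its main page and all subpages, to the homepage so link equity is passed.
    # Goes in the site's .htaccess; adjust /old-section to the real path.
    RedirectMatch 301 ^/old-section(/.*)?$ /

A single pattern like this covers the main page and every subpage in one rule; if only a handful of URLs are involved, one Redirect 301 line per page works just as well.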
Related Questions
-
Embedded Traffic Stats?
Hi all. Wondering if someone could give me a pointer here, please. My client is a non-profit information resource on internet safety; the site just blogs about internet safety and threats. To cut a long story short, the client has relationships with all the police forces and universities in the UK, which regularly republish its content on their sites with approval (and there are many dozens of sites that republish without approval). Although the goal of the website is information distribution rather than raising money, the site does have a number of KPIs it needs to meet to justify its sponsorship by the likes of Facebook, Google, Microsoft, etc.

We are looking to make the content the site publishes embeddable, so that rather than third-party sites simply republishing 'our' content and it looking like their own work, we at least get the credit. The issue we are trying to work out comes down to stats. If site B embeds our article and it gets 1,000 views on their site, do those 1,000 people appear in our stats too? I would guess they do, since the content is loaded from our site on each of those 1,000 visits. In which case, would these 1,000 hits appear as direct traffic or referral traffic in the stats when people read the content on site B? We have run some tests and are not seeing the test site appear as a referrer in the stats, so we're a little puzzled. Many thanks for any advice.
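For context on the mechanics being asked about, here is a hypothetical sketch of an iframe-style embed (the URL and dimensions are made up). Because the browser requests the article from the origin server on every view, an analytics tag inside the embedded page can fire; an embed that merely copies text into site B's own HTML would not.

    <!-- Hypothetical snippet site B would paste in. Each view loads the
         article from the origin server, so a tracking tag inside the
         embedded page runs in the context of the visit on site B. -->
    <iframe src="https://oursite.example/embed/article-123"
            width="600" height="400" title="Embedded article"></iframe>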
-
Ecommerce Traffic increase - moving from 500 to 1000 words/category
Hello, we have found that traffic increases when we add content to category descriptions. Our category pages are stronger than our product pages. Our main categories (the forty that are in the navigation) each have 500 words of carefully written, helpful content. We are trying to decide whether to increase these 40 categories to 1,000 words per category. Do you think this would increase traffic? Is it a good idea? Thanks!
-
Site Content Review Please!
I'm looking for someone who can review my site and let me know about the quality of its content. Can anyone suggest who I can talk to about this? Nick
-
How does one write different pages of their website that are very similar in nature without using too much duplicate content?
We are a service provider and we have different links on our website to each of our services. The problem is that the content we would have for each is very similar. How can I ensure that it is not deemed duplicate content and ranked poorly because of it? Thanks
-
How can I resolve a duplicate page issue?
I have an attached report that shows duplicate content for a blog page, and I'm not sure how to resolve the issue. The blog/website is hosted on WordPress.org; maybe it's something to do with having to add categories or tags. Can anyone help, please? (Attachment: SDtXT.png)
-
Duplicate Text on Blog & Internal News Page
I have two places I post news for our company: our blog (typically more informal posts) at mycompany.wordpress.com, and our news page (typically more newsworthy than the blog) at mycompany.com/news. My question is: is it okay to just copy the exact text from my WordPress blog and paste it into the news area of my site, and vice versa? Does this hurt ranking potential for either page?
-
Mobile Sites / Useragent detection
I've got a question about how search engines declare that they're mobile browsers. Our website is based on WordPress and uses the caching plugin W3TC to send a different site template to mobile user agents, I believe based on the HTTP User-Agent string (the same content is served on every page whether the visitor is on desktop or mobile; just different themes). After having this mobile site online for a few months, we're a little confused as to why Google still shows the instant preview of the desktop version for mobile users, and why it doesn't show the little mobile phone icon in our SERPs for mobile devices. It's as if it doesn't realise the mobile site exists. I was reading today that the "old" method of serving different content based on the browser is to use the HTTP User-Agent string, and that there's a "new" object-checking method which is more robust (although I can't find a lot of information about it). Can anyone explain the "new" method? Would this be the reason that Google is so far ignorant of our mobile site?
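One detail worth noting about the setup described above (an aside on common practice, not something stated in the question): when the same URL returns different HTML depending on the user agent, the usual companion step is to send a Vary: User-Agent response header so crawlers know the response differs by device and re-fetch the page with a mobile user agent. A minimal sketch, assuming Apache with mod_headers enabled:

    # Hypothetical .htaccess addition: declare that responses at this URL
    # differ by user agent, hinting crawlers to fetch a mobile version too.
    Header append Vary User-Agent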
-
How does Google react to duplicate shops on ecommerce sites?
Surely shopping cart sites are going to have a lot of duplicate content? Does Google recognise this? Is there anything I can do to let Google know?