301 a page and then remove the 301
-
I have a real estate website that has a city hub page. All the homes for sale within a city are linked to from this hub page.
Certain small cities may have one home on the market for a month and then nothing for months or years; I call them "Ghost Cities". This happens across many cities at any point in time, and the resulting city hub pages are left with little to no content.
We are throwing around the idea of 301 redirecting these "Ghost City" pages to a page higher up in the hierarchy (think state or county) until we get new homes for sale in the city. At that point we would remove the 301.
Any thoughts on this strategy? Is it bad to turn 301s on and off like that? Thanks!
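For concreteness, the on/off behavior could be computed per request rather than maintained as a hand-edited redirect list. Here is a minimal sketch, assuming a Python/Flask app and a hypothetical `ACTIVE_LISTINGS` lookup (the real version would query the listings database):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical stand-in for a live listings query.
ACTIVE_LISTINGS = {"springfield": 4, "ghost-city": 0}

@app.route("/homes/<state_slug>/<city_slug>")
def city_hub(state_slug: str, city_slug: str):
    if ACTIVE_LISTINGS.get(city_slug, 0) == 0:
        # "Ghost city": send visitors up the hierarchy. Because the check
        # runs on every request, the redirect switches off by itself as
        # soon as the city has listings again.
        return redirect(f"/homes/{state_slug}", code=301)
    return f"{ACTIVE_LISTINGS[city_slug]} homes for sale in {city_slug}"
```

With this pattern there is no redirect to "remove" at all; the 301 simply stops being served once the listing count is nonzero.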
-
We all know that some of the link juice dies with 301s, but I suspect that with a 302 you would keep the link juice.
-
Suppose we have six cities, City A through City F.
Cities A through C (three cities) are ghost cities.
We can easily have City A show additional results for Cities D and E, City B show additional results for Cities E and F, and of course City C show results for Cities F and D.
In reality, one city is usually surrounded by only three or four cities, so no two cities will end up with exactly the same set of neighbors.
If you want, you can even kick the additional listings into overdrive by adding some randomization.
For example
http://www.stand-out.net/Movitel-Cuffed-Jeans-pr-7703.html (not my site)
You can see the "More From Humor" section always shows three randomized items from the Humor brand (try refreshing the page to see the randomization). There is no way you will ever have duplicate content this way.
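A rough sketch of that randomized nearby-city block in Python, with a made-up adjacency map and listings (your real neighbor data would come from a geo lookup):

```python
import random

# Made-up adjacency map: each city and the 3-4 cities bordering it.
NEIGHBORS = {
    "city_a": ["city_d", "city_e"],
    "city_b": ["city_e", "city_f"],
    "city_c": ["city_f", "city_d"],
}

# Made-up listings per city.
LISTINGS = {
    "city_d": ["12 Oak St", "7 Elm Ave"],
    "city_e": ["3 Pine Rd", "21 Lake Blvd"],
    "city_f": ["9 Birch Ln"],
}

def nearby_listings(city: str, k: int = 3) -> list[str]:
    """Pick up to k listings at random from the cities bordering `city`."""
    pool = [home for n in NEIGHBORS.get(city, []) for home in LISTINGS.get(n, [])]
    return random.sample(pool, min(k, len(pool)))

# Each page load shows a different mix, so no two renders (and no two
# ghost cities) end up with identical "nearby homes" content.
print(nearby_listings("city_a"))
```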
Again, don't build a page for SEO; build it for your customers first. You will definitely have a very happy customer if you show them additional results from nearby cities instead of just BLINDLY redirecting them to another city.
Good luck
-
Playing devil's advocate: should we worry about duplicate content? There would be hundreds of pages that all had different titles/H1s but very similar content (just different nearby cities).
In general, I like this idea.
-
Yeah, it's definitely a viable option. Just wondering if there would be an added benefit to passing the link juice through a 301, especially if there isn't a penalty for turning it on and off.
-
In my opinion, there is no problem... but if you want to be on the safe side, why not fill the page with useful information instead?
For example: a visitor lands on the City X hub page when it has absolutely no homes at all. I think the visitor would be much happier to see something like "No one is selling a home in this ghost city right now; perhaps you would be interested in properties from the nearest cities around it," followed by some good homes from City Y and City Z.
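In code terms, that could look something like the sketch below (Python, with hypothetical placeholder data); the point is that the page stays live with a 200 and useful content rather than redirecting:

```python
# Hypothetical data: City X is the ghost city, Y and Z are its neighbors.
NEARBY = {"city_x": ["city_y", "city_z"]}
LISTINGS = {"city_x": [], "city_y": ["14 Maple Dr"], "city_z": ["2 Cedar Ct"]}

def render_city_hub(city: str) -> str:
    homes = LISTINGS.get(city, [])
    if homes:
        return f"Homes for sale in {city}: {', '.join(homes)}"
    # Empty state: keep the page live and helpful instead of redirecting.
    suggestions = [h for n in NEARBY.get(city, []) for h in LISTINGS.get(n, [])]
    return (f"No one is selling a home in {city} right now. "
            f"You might be interested in these nearby properties: "
            f"{', '.join(suggestions)}")

print(render_city_hub("city_x"))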
-
Wouldn't a 302 be a better option? Temporary vs permanent.
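If temporary is the intent, the difference from a 301 is just the status code. In a Flask-style handler (same hypothetical routes as the sketch in the question above) it would be:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/homes/<state_slug>/<city_slug>")
def city_hub(state_slug: str, city_slug: str):
    # 302 tells engines the move is temporary, so the city URL itself
    # should stay indexed while it points at the state page.
    return redirect(f"/homes/{state_slug}", code=302)
```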