301 a page and then remove the 301
-
I have a real estate website that has a city hub page. All the homes for sale within a city are linked to from this hub page.
Certain small cities may have one home on the market for a month and then not have any homes on the market for months or years. I call them "Ghost Cities". This problem happens across many cities at any point in time. The resulting city hub pages are left with little to no content.
We are throwing around the idea of 301 redirecting these "Ghost City" pages to a page higher up in the hierarchy (think state or county) until we get new homes for sale in the city. At that point we would remove the 301.
Any thoughts on this strategy? Is it bad to turn 301s on and off like that? Thanks!
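For what it's worth, the on/off mechanics are simplest when the redirect is a rule in the application rather than a hand-maintained list. Here is a minimal sketch in Python/Flask, assuming a hypothetical listings_for() lookup and a made-up URL scheme, of a hub page that redirects up the hierarchy only while it is empty:

```python
from flask import Flask, redirect, render_template

app = Flask(__name__)

def listings_for(city):
    """Hypothetical lookup: return the active listings for a city."""
    return []  # placeholder

@app.route("/homes/<state>/<city>")
def city_hub(state, city):
    listings = listings_for(city)
    if not listings:
        # Ghost city: send visitors one level up while the page is empty.
        # 301 = permanent, 302 = temporary; since the page is expected to
        # come back, 302 arguably matches the stated intent better.
        return redirect(f"/homes/{state}", code=302)
    return render_template("city_hub.html", city=city, listings=listings)
```

Done this way, the redirect disappears on its own the moment a new listing appears, so there is nothing to remember to turn off.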
-
We all know that some of the link juice dies with 301s, but I suspect that with a 302 you would keep the link juice.
-
Suppose we have six cities, City A through City F.
Cities A through C (three cities) are ghost cities.
We can easily have City A show additional results for Cities D and E, while City B shows additional results for Cities E and F, and of course City C shows results for Cities F and D.
In reality, each city is surrounded by only three or four others, right? So no two cities will end up with exactly the same set of neighbors.
If you want, you can even kick the additional listings into overdrive by adding some randomization.
For example:
http://www.stand-out.net/Movitel-Cuffed-Jeans-pr-7703.html (not my site)
You can see the "More From Humor" section always shows three randomized items from the Humor brand (try refreshing the page to see the randomization). There is no way you will ever end up with duplicate content this way.
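A rough sketch of that approach in Python; the adjacency map and listing data are invented for illustration, but it shows how a fixed set of neighbors plus per-request randomization keeps any two ghost-city pages from rendering the same content:

```python
import random

# Hypothetical adjacency map: each city and its 3-4 geographic neighbors.
NEARBY = {
    "City A": ["City B", "City D", "City E"],
    "City B": ["City A", "City C", "City E", "City F"],
    "City C": ["City B", "City D", "City F"],
}

def extra_listings(city, listings_by_city, n=3):
    """Collect homes from neighboring cities and return a random sample,
    so the nearby-results block differs on every page load."""
    pool = [home
            for neighbor in NEARBY.get(city, [])
            for home in listings_by_city.get(neighbor, [])]
    return random.sample(pool, min(n, len(pool)))
```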
Again, don't try to build pages for SEO; build them for your customers first. You will definitely have much happier customers if you show them additional results from nearby cities instead of just blindly redirecting them to another city.
Good luck
-
Playing devil's advocate, should we worry about duplicate content? There would be hundreds of pages that all had different titles/H1s but very similar content (just different nearby cities).
In general, I like this idea.
-
Yeah, it's definitely a viable option. I'm just wondering whether there would be an added benefit to passing the link juice through a 301, especially if there isn't a penalty for turning it on and off.
-
In my opinion, there is no problem... but if you want to be on the safe side, why not fill the page with useful information instead?
For example: if I land on the City X hub page when it has absolutely no homes at all, I as a visitor would be much happier to see something like "No one is selling a home in this city right now, but perhaps you would be interested in properties in the nearest cities around it," followed by some good homes from City Y and City Z.
-
Wouldn't a 302 be a better option? Temporary vs. permanent.
-
Related Questions
-
Removing indexed internal search pages from Google when they're driving lots of traffic?
Hi, I'm working on an e-commerce site and the internal search results page is our 3rd most popular landing page. I've also seen that Google has often used this page as a "Google-selected canonical" in Search Console on a few pages, and it has thousands of these search pages indexed. Hoping you can help with the below: To remove these results, is it as simple as adding "noindex/follow" to search pages? Should I do it incrementally? There are parameters (brand, colour, size, etc.) in the indexed results, so maybe I should block each one of them over time. Will there be an initial negative impact on results that I should warn others about? Thanks!
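For reference, the "noindex/follow" mentioned above is just a robots meta tag in each search page's head. A generic example of what the question is proposing, not specific to any platform:

```html
<meta name="robots" content="noindex, follow">
```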
-
Valid pages on GSC show 5x more pages than performing site:domain.com?
Hi mozzers, when checking the coverage report on GSC I am seeing over 649,000 valid pages https://cl.ly/ae46ec25f494 but when performing site:domain.com I am only seeing 130,000 pages. Which one is more the source of truth, especially given that I have checked some of these "valid" pages and noticed they're not even indexed?
-
I've seen and heard a lot about city-specific landing pages for businesses with multiple locations, but what about city-specific landing pages for nearby cities where you aren't actually located? Is it OK to create landing pages for nearby cities?
I asked here https://www.google.com/moderator/#7/e=adbf4 but figured I'd ask the Moz community as well! Is it actually best practice to create landing pages for nearby cities if you don't have an actual address there, even if your target customers are there? For example, if I am in Miami but have a lot of customers who come from nearby cities like Fort Lauderdale, is it okay to create those landing pages? I've heard this described as best practice, but I'm beginning to question whether Google sees it that way.
-
Restructuring/Removing 301 Redirects Due To Newly Optimized Keywords
Just to be clear, this is for one unique page on a website. Let's say that a page's URL was originally /original. You optimize the page for a new keyword (keyword 1) and therefore change the URL to /keyword-1. A 301 redirect would then be placed: /original > /keyword-1. However, let's say 6 months down the road you realize that the keyword you optimized the page for (keyword 1) just isn't working. You research a new keyword and come up with keyword 2, so you'd like to rename the page's URL to /keyword-2. After placing a redirect from the current page (keyword 1) to the new page (keyword 2), it would look like this: /original > /keyword-1 > /keyword-2. We know that making a server go through more than one redirect slows load time, and even more link juice is lost in translation. Because of this, would it make sense to remove the original redirect and instead place redirects like this? /original > /keyword-2 and /keyword-1 > /keyword-2. To me, this would make the most sense for preserving SEO. However, I've read that removing 301 redirects can cause user issues due to browsers caching the now-removed redirect. Even if this is ideal for SEO, could it be more work than it's worth? Does anyone have any experience/input on this? If so, I greatly appreciate your time!
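Collapsing chains like this is mechanical enough to script. A small sketch in Python, using the question's own URLs; the function resolves every source URL through the chain so each old URL can 301 straight to its final destination:

```python
def flatten_redirects(redirects):
    """Resolve chains such as /original -> /keyword-1 -> /keyword-2 so
    every source URL maps directly to its final destination."""
    flat = {}
    for src in redirects:
        dst, seen = redirects[src], {src}
        while dst in redirects and dst not in seen:  # 'seen' guards against redirect loops
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

print(flatten_redirects({"/original": "/keyword-1", "/keyword-1": "/keyword-2"}))
# {'/original': '/keyword-2', '/keyword-1': '/keyword-2'}
```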
-
Getting individual website pages to rank for their targeted terms instead of just the home page
Hi everyone, there is a pattern I have noticed when trying to get individual pages to rank for their allocated target terms when I execute an SEO campaign, and I would be keen on anyone's thoughts on how they have effectively addressed this. Let me try to explain by going through an example: let's say I am a business coach and already have a website that includes several of my different coaching services. For this SEO campaign, I'm looking to improve exposure for the client's "business coaching" services. I have a quick look at analytics and rankings and notice that the website already ranks fairly well for that term, but from the home page and not the service page. I go through the usual process of optimising the site (on-page: content, meta data, internal linking) as well as a link-building campaign over the next couple of months. However, this results in either just the home page improving, or the business page improving while the home page's existing ranking suffers, so the site does not benefit overall. My question: if a term already ranks or receives a decent amount of traffic from the home page and not from the page it's supposed to, why do you think that is the case, and what would be your approach to try to shift the traffic to the individual page without impacting the site too much? Note: in addition, the home page's target keyword term would have been updated. Thanks, Vahe
-
Duplicated Pages and Forums
Does duplicate content hurt that particular duplicated content, or the entire site? There are some parts of my site where I don't care about getting high rankings in search engines. For example, I have a forum, and there are certain links that only logged-in people can see. If you aren't logged in, they take you to a page that tells you to log in. Google, obviously not logged in, interprets this as lots and lots of copies of the same duplicated page. Should I just leave it alone, since I don't care whether those pages make it into search engines? Will it hurt the entire site? For example, can my home page's search rankings decrease? That leads to my next question: what is the best way to optimize a forum? Whenever someone posts a new post, it seems another URL for the same forum thread is created. In other words, if 20 people post on a thread, I believe my site adds 20 URLs for that page. Does anyone know how to fix this?
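On the one-thread-many-URLs problem: the usual fix, assuming the forum software lets you edit its page template, is a canonical link on every URL variant pointing at the thread's primary URL. A generic example, not specific to any forum platform:

```html
<link rel="canonical" href="https://example.com/forum/thread-12345">
```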
-
Blocking Pages via Robots: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx. I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this: User-agent: googlebot Disallow: /community/photos/ Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... is this possible? Thanks! Leona
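Assuming the goal is exactly as described, the robots.txt would look something like the below. One caveat worth verifying against Google's current documentation: Googlebot-Image falls back to the Googlebot group only when it has no group of its own, so the explicit Allow group is what keeps the images crawlable:

```
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```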
-
Pages On Subfolder Not Ranking
Pages in one subdirectory/folder on our website don't seem to rank for any keywords, while the same type of pages targeting keywords at the same competition level rank perfectly fine. For a while the pages weren't getting indexed but were crawled regularly. Can't seem to figure the problem out.