Is it bad practice to create pages that 404?
-
We have member pages on our site that are initially empty until the member does some activity. Currently, since all of these pages would be soft 404s, we return a 404 for them, and all internal links to them are JS links (i.e., not links as far as bots are concerned). As soon as a page has content, we switch it to 200 and make the links into regular hrefs.
After doing some research, I started thinking that this is not the best way to handle this situation. A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be real links.
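To make the idea concrete, here's a rough sketch in Python (the function and values are hypothetical, just to illustrate the decision, not our actual code):

```python
def profile_response(profile_content):
    """Pick (status_code, robots_meta) for a member profile page.

    Instead of 404ing empty profiles, serve them with 200 plus a
    noindex,follow robots meta tag, so internal links can stay real hrefs
    and bots can still follow links out of the page.
    """
    if profile_content:
        # Populated profile: indexable as normal.
        return 200, "index, follow"
    # Empty profile: keep it out of the index, but let bots follow links.
    return 200, "noindex, follow"
```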
I'd love to hear input and feedback from fellow Mozzers. What are your thoughts?
-
Yair,
See the infographic on this page regarding rel=nofollow tags in links, and when you may want to consider using them. Specifically, see the part about User Generated Content:
http://searchengineland.com/infographic-nofollow-tag-172157
However, Google can decide to crawl whatever they want to crawl, whether it is a nofollowed link, a link on a page with a nofollow meta tag, or a JavaScript link. If you really want to keep Google out of those portions of the site, you should use a robots.txt disallow statement, as I mentioned in your other thread, or use the X-Robots-Tag as described here.
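For reference, the X-Robots-Tag is just an HTTP response header, so it can be set for any resource, not only HTML. A minimal, framework-agnostic sketch (the function name and structure are mine, not from any particular library):

```python
def robots_headers(indexable):
    """Build response headers for a page. The X-Robots-Tag header applies
    robots directives at the HTTP level, so it also covers non-HTML
    resources where a meta tag isn't possible."""
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if not indexable:
        # Keep the page out of the index while still letting bots
        # follow its outbound links.
        headers["X-Robots-Tag"] = "noindex, follow"
    return headers
```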
-
Thanks Everett,
As far as I know, nofollows don't conserve crawl budget. The bots will still crawl the link; they just won't transfer any PR.
-
I'm sure Jane meant that it would block indexation of the page.
In my opinion you should probably noindex,follow (robots meta tag) the pages and make the internal links just normal links, possibly with a rel="nofollow" link attribute, until the user fills out the profile.
I will go look into your other question as well.
PS: The drawback of this solution is that bots will still be spending crawl budget crawling those URLs if you are linking to them internally. [Edited]
-
Thanks for the clear and concise answer, Jane. You hit the nail right on the head! I appreciate your input.
One question, though. You say that noindex will block bot access to these pages. I'm pretty sure the bots will still crawl the pages (if they find them); it's just that they won't be indexed, and presumably they won't be "counted against us" like 404 pages. Is that what you meant?
If you have a minute, maybe you can help me out with this question next: http://moz.com/community/q/internal-nofollows
(Side note: Er_Maqul was referring to the original version of the question (before I edited it) where I had mistakenly written that we nofollow the links.)
-
Hi there,
A 404 certainly isn't the best way to handle a new URL / page before it is populated with content. It is good that Google isn't finding these pages yet (as you state in a later comment), but keep in mind that it could - the pages aren't linked to, but there is never any particular guarantee about what Google will and won't find. It's highly unlikely if you don't link to them, but still, it's not worth taking the risk. As you also say, there's no stopping anyone else from linking to the pages, and/or Google going on an exploratory mission of its own.
As a note, internal 404s / 410s ("Gone") are perfectly okay if they're appropriate for the situation, i.e. a page has been removed. Not every removed resource has to be 301ed elsewhere. This isn't the case here, though.
To my mind, blocking bot access to these pages while they are empty is a better option, and noindex / follow would achieve this. I believe Er_Maqul has misunderstood what you were saying here - there is no "nofollow" in this situation.
-
Got it, I see.
Well, let's see here. I will state I am no expert in this realm. This is much more of a job for the likes of EGOL or RobertFischer, EGOL in particular with his intimate knowledge of NOINDEX:
http://www.mattcutts.com/blog/google-noindex-behavior/
(Scroll down and look at the first post on Matt Cutts' blog there.)
That being said, I still have a few thoughts.
I think you certainly could continue to do what you are doing. I also think that Er_Maqul brought up a point that I touched on as well. Obviously the best scenario is to avoid a 404 and to have original content. Unfortunately, if you don't have any content to write (or the person doesn't, in this case), that sets you up for thin content and a lot of duplicate content.
It seems to me there is no way to avoid having the extra pages without some sort of script or coding which houses the profiles somewhere other than on a separate page (or on a flux capacitor somewhere). So, going off that, you could create a generic "no profile" page that gets published, and use the rel=canonical tag.
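Roughly, the idea is that every empty profile canonicalizes to one generic page. A sketch of what that could look like (the URL paths and function name here are made up for illustration):

```python
def canonical_tag(has_content, profile_url, generic_url="/members/no-profile"):
    """Emit a rel=canonical link tag for a profile page. Empty profiles
    all point at a single generic "no profile yet" page, so any thin
    duplicates consolidate onto one URL."""
    target = profile_url if has_content else generic_url
    return '<link rel="canonical" href="%s">' % target
```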
I take back my prior statement about "ANY" content. Thin, pointless content is thin and pointless, and won't benefit you at all. I hope that wasn't interpreted that way.
Again, I think this one is somewhat out of my scope of help, and it might even be worth calling in an SEO professional who specializes in forums for a second opinion. It's like having surgery: you've got to go to another doctor to verify your diagnosis.
Sorry I couldn't give a better answer!
-
Wow - thanks for the thorough response, HashtagHustler!
Let me explain a little better...
We get hundreds of signups a day. Each new member has a profile page, which is empty until they do something. Sometimes they never do. So we don't link to the empty pages and they return a 404. As soon as the page has some content, we do link to it and it returns a 200.
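In code terms, the current behavior is roughly this (a simplified sketch, with made-up markup and names, not our actual templates):

```python
def render_member_link(slug, name, has_content):
    """Render a real href only once the profile has content; otherwise
    emit a JS-only element that bots shouldn't treat as a crawlable link.
    (The page itself returns 404 until it has content.)"""
    if has_content:
        return '<a href="/members/%s">%s</a>' % (slug, name)
    # Empty profile: no crawlable href at all.
    return '<span class="js-link" data-member="%s">%s</span>' % (slug, name)
```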
Google is not reporting 404s for these pages because they are not linked to. In the past, when we did link to them, Google reported them as soft 404s.
The current system is working fine.
My question is simply whether it makes more sense to let Google find these pages (i.e., link to them) but noindex them while they have no content (and are considered soft 404s by Google), or to continue doing it the way we do today. The current way makes me a little uncomfortable, since we are creating thousands of pages (which theoretically may be linked to by other sites) that return 404s.
-
Yes... 404s can hurt your SEO campaign, because even if they don't count against you directly (I'm not sure about that), at the very least the spider may ignore more of your site or crawl it more slowly. For this reason, you should get rid of every 404 error you possibly can.
Remember, an empty page can still have something on it. You can use some default text, use a noindex tag while the page is empty, or simply make a standard profile page and point all the empty pages to it with rel=canonical (I think that's the best option). Having more pages, even with a low quantity of text, is better than not having the pages at all, and much better than having 404 errors.
Also, consider one more thing: the more often your pages change, the more they get indexed and the shorter the time between crawls. That's another reason to have the pages working even before they have any custom info: creating the page is one change, and customizing it later is another.
-
Good Morning Yair!
A 404 is a 404, plain and simple. And if Google was able to report it, then it was able to get to it, which basically means someone else could too. I'm not sure what platform you are currently using, but there are plenty of easy options for a quick fix.
If you wanted a simple option, you could just throw up a 302 if you are planning on putting content up soon. A 302 differs from a 301 in that it signals the move is temporary: Google has to keep evaluating whether it should hold on to the old page, so in this case, when you are planning on launching the page, you might be able to use that to your advantage.
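For instance, something like this (a hypothetical sketch; the holding-page URL is made up):

```python
def profile_route(profile_ready, profile_url, holding_url="/members/coming-soon"):
    """302 (temporary redirect) to a holding page while the profile is
    empty. Unlike a 301, a 302 tells Google the original URL should stick
    around, which fits a page you plan to launch soon."""
    if profile_ready:
        return 200, profile_url
    return 302, holding_url
```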
A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be follow links.
I was unsure exactly what you meant by this. The only difference between follow and nofollow links is that nofollow tells Google not to let your link juice carry over to the page you are linking to (you are linking to the page without vouching for it). My apologies if you already knew this, but I was slightly confused by your sentence. Google will still go to that page to check it out.
Another option: change your schedule and don't put up pages that far in advance of when you plan to publish them. I edit everything offline. Google likes new stuff, especially content. Of course they take longevity into account as well, but as far as making a big splash, putting a website up piecemeal is like people showing up at your birthday shouting "Surprise!" when you answer the door, even though you were the one who invited them. The notion is nice, but it just doesn't have the same effect. Not only in the Google realm, but even more so in social media.
My favorite option, and my personal recommendation would be to play the cards you were dealt. Get rid of the 404 and the 200 and embrace that you have a new page! Go on and write a profile piece for the member. Write some sort of Biographical data that can act as a placeholder. At this point it doesn't even need to be stuffed with keywords and amazing seo phrases.
On Second Thought: I'm not sure exactly what your forum is for, and if this issue is specific to a few members or if you are referring to bulk membership, but I have a few ideas on how you might be able to extrapolate some of the signup data into even a simple post to avoid getting a 404! Even if you parsed some of the forms from signup and created a simple little one page that displayed, I think that could help.
At the end of the day, Google loves forums that are strong and authoritative. They also understand that not every single person on a forum is going to post and interact. So depending on what kind of forum you have, and what exactly you are doing, some of the forum issues will just have to be accepted. I think it would be more valuable to clean up negative linking, analyze internal issues, and build forum recognition etc. than to fix soft 404s coming from a handful of users. Again, all of this depends on the size of your pool.
Also, you could just track their usage: if someone is logging in all the time and not posting, then fine, I would deal with it, because who knows what they are doing in the real world. If someone made an account in 1999, then threw their computer out the window in Y2K and never bought another one because they still believe everything crashed, well, then maybe it's time to fix that 404.
Hope that helps!
Sorry if my train of thought is a little off this morning.... not enough coffee!
-
Thanks for your quick response, Er.
You are correct about the 404s and I realized that what I wrote in the question was a mistake. We don't have any internal links to these pages (not even nofollow). Until there is content on the page, we make all links to the page into js links. I corrected this in the question now.
Concerning what you said about the pages being useful for SEO even without content: I don't think this is correct. Before we started 404ing the empty profile pages, Webmaster Tools was reporting them as soft 404s. Doesn't this mean that they were hurting us (especially since we have many of them)?
-
Any 404 is bad for SEO. Make a little page for the profile even without data, and you can gain a lot of SEO value even if your pages don't have much content.
Even if a link has a nofollow, Google follows it to see what's on the other side. For this reason, ALWAYS avoid linking to a 404 page whenever you can.