Will frequently adding and removing pages from my site hinder our SEO?
-
Hi Guys,
Just looking through our crawl diagnostics, we have a ton of errors (well over 5,000, actually) on 404 pages that cannot be accessed.
Our website runs a lot of "Hot Offers" that are time bound, so they expire at the end of each month and we remove the page via our CMS.
It's making the crawl diagnostics look bad, but will this hinder our SEO and Google rankings, given that Google is finding thousands of 404 errors?
Any advice would be greatly appreciated!
Website: www.vospers.com
Lee Greenhill
-
If you name each page after the specific deal, then once the deal expires your link to /ford-mondeo-from-6995 would no longer exist.
Instead, call the page something evergreen like /offers.
Keeping even one inbound link will offset the advantage of having the keyword in the URL.
-
Thanks for the advice Alan.
Good idea.
If we had an offer page with a URL of, say, /ford-mondeo-from-6995, I could just rename it to /ford-mondeo-offer so that the new offer overwrites the old page. I see what you mean.
Cheers.
-
No, not just for having 404s, but you could be using these pages a lot better.
Let's say you have an offer and someone links to that page; if you then delete it, you have wasted a link. You are better off reusing the page, so that anyone who links to it helps your next offer rank.
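For links already pointing at deleted offer pages, the usual pattern is to 301-redirect the retired URLs to the evergreen page so that inbound link value keeps flowing. A minimal sketch, assuming an Apache server with mod_alias enabled and hypothetical offer paths:

```apache
# .htaccess (or virtual-host config)
# Hypothetical expired-offer URLs; substitute your real paths.
# A 301 tells Google the move is permanent and passes most link value.
Redirect 301 /ford-mondeo-from-6995 /offers
Redirect 301 /fiesta-from-4995 /offers
```

Going forward, keeping one reusable URL per offer type avoids needing these rules at all.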
Related Questions
-
What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
Hi, We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically, and I'm not sure how Google will index the B and C variations of these pages. As far as I know, the page URL won't change and won't have parameters. Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index.

If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty, because users would get different content than search robots. Is it better to have URL parameters for versions B and C of the content? For example:

/page for the default content
/page?id=2 for the B version
/page?id=3 for the C version

The dynamic content comes from the server side, so not all page copy variations are in the default HTML. I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
Technical SEO | | Gyorgy.B1 -
How do I find which pages are being deindexed on a large site?
Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!
Technical SEO | | DA20130 -
Duplicate Page Title for multilingual wordpress site
Hello all, I have received my first crawl reports and I see a lot of duplicate page title errors. On this WordPress site I use the qTranslate plugin to serve the site in two languages, and the Yoast SEO plugin to set the title, description, and keywords for each page.

Looking more deeply into the duplicate page title errors, I believe the problem is that every web page uses the same SEO title for both languages, but I am not 100% sure. I tried using some qTranslate shortcodes, like ABOUT [:en]About, to set different titles per language for one web page, but that doesn't seem to work.

Has anybody here experienced the same problem? Do you have any suggestions about how to resolve the duplicate page titles? I can give you the URL of the website if you need it to have a look. Thank you in advance for your help; I really appreciate it. Regards, Lenia
Technical SEO | | tevag0 -
Adding parameters in URLs and linking to a page
Hi, Here's a fairly technical question: We would like to implement a badge feature where websites linking to us via a badge would use URLs such as:

domain.com/page?state=texas&city=houston
domain.com/page?state=nevada&city=lasvegas

Important note: the parameters change the information and layout of the base page, domain.com/page. Would those two URLs, with their extra parameters, be considered the same page as domain.com/page by Google's crawler? We're considering adding the "state" and "city" parameters to the Google WMT URL parameter tool to tell Google how to handle them. Any feedback or comments are appreciated! Thanks in advance. Martin
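One common way to answer the "same page or not" question explicitly is a rel=canonical hint: consolidate the parameterized variants into the base URL if they should count as one page, or self-reference if each variant should rank on its own. A sketch with the placeholder URLs from the question:

```html
<!-- On domain.com/page?state=texas&city=houston -->

<!-- Option A: treat all state/city variants as one page -->
<link rel="canonical" href="https://domain.com/page">

<!-- Option B: let each variant stand alone (self-referencing canonical),
     appropriate if the content differs substantially per state/city -->
<link rel="canonical" href="https://domain.com/page?state=texas&amp;city=houston">
```

Either way, the canonical and the WMT parameter settings should tell Google the same story.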
Technical SEO | | MartinH0 -
SEO problems from moving several pages into one accordion
I've read other posts saying that using an accordion is not detrimental to SEO, and for conversion optimization we want to combine several of our existing pages into one accordion. But what will this do to SEO and duplicate content as I redirect the old pages to anchors in the accordion? I would think this would be a duplicate-content problem, since www.oldinfo1 and www.oldinfo2 will now have their content on the same page, but I will be redirecting them to www.newpage#oldinfo1 and www.newpage#oldinfo2. Is there a way around duplicate content problems?
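If the old URLs are 301-redirected, they stop serving content at all, so there is nothing left to duplicate. A sketch of the redirects, assuming Apache with mod_alias and the hypothetical paths from the question:

```apache
# Hypothetical paths; adjust to your own URLs.
# The old pages return 301s instead of content, so no duplicate-content issue.
Redirect 301 /oldinfo1 /newpage#oldinfo1
Redirect 301 /oldinfo2 /newpage#oldinfo2
```

Note that fragments (#oldinfo1) are never sent to the server and are generally ignored by crawlers, so Google should consolidate everything onto /newpage; browsers, however, will usually honor the fragment in the Location header and scroll to the right accordion section.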
Technical SEO | | JohnBerger0 -
SEO On-Page Planning - Wire Framing
This is open to discussion. I'm interested in getting help/opinions from others on how they plan their next SEO Project. What methods do you use for laying out On-Page Keyword targeting? Do you use any specific wire framing tools for laying out a large website? With larger websites, this stage is very important so I'd find it very useful to get help in this area. If you know of any useful threads covering this area, share them here.
Technical SEO | | Nick-SEOSpark0 -
International Site, flow of page rank?
OK. I'm working on an international site. The site is set up with folders for UK, US, and AU, e.g. www.site.com/UK/index.aspx. The root (non-folder) is the international version of the site, e.g. www.site.com/index.aspx, and it has the lion's share of links. Therefore, the pages linked directly from www.site.com/index.aspx have PageRank distributed between them.

My UK, US, and AU home pages are linked from a country selector on www.site.com/index.aspx via an .aspx redirect page that 301s to the appropriate country home page. The UK, US, and AU home pages are therefore receiving some of the 'juice' coming into www.site.com/index.aspx (but only a fraction, via the redirect links).

Am I right in thinking that pages on the international version of the site will have much more potential to rank (because of their 'juice') than the pages on the UK, US, and AU versions of the site? If so, am I right in thinking that these will tend to rank over the equivalent UK, US, and AU versions of the pages in each country's version of Google, despite having set directory-level geo-targeting in GWT?
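Separate from how the juice flows, folder-based international setups are usually annotated with hreflang so Google knows which version to serve in each country's index. A sketch using the site's folder structure; the language-region codes here are assumptions:

```html
<!-- Placed in the <head> of every version of the page -->
<link rel="alternate" hreflang="en-GB" href="https://www.site.com/UK/index.aspx">
<link rel="alternate" hreflang="en-US" href="https://www.site.com/US/index.aspx">
<link rel="alternate" hreflang="en-AU" href="https://www.site.com/AU/index.aspx">
<!-- x-default: the international root, shown when no country version matches -->
<link rel="alternate" hreflang="x-default" href="https://www.site.com/index.aspx">
```

Each country page carries the full set of annotations, pointing at all its siblings and at itself.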
Technical SEO | | QubaSEO1 -
Is robots.txt a must-have for 150 page well-structured site?
By looking in my logs I see dozens of 404 errors each day from different bots trying to load robots.txt. I have a small site (150 pages) with clean navigation that allows the bots to index the whole site (which they are doing). There are no secret areas I don't want the bots to find (the secret areas are behind a Login so the bots won't see them). I have used rel=nofollow for internal links that point to my Login page. Is there any reason to include a generic robots.txt file that contains "user-agent: *"? I have a minor reason: to stop getting 404 errors and clean up my error logs so I can find other issues that may exist. But I'm wondering if not having a robots.txt file is the same as some default blank file (or 1-line file giving all bots all access)?
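A minimal permissive robots.txt (a two-line sketch) would silence those 404s without restricting any bot; an empty Disallow rule means nothing is blocked:

```
User-agent: *
Disallow:
```

This is effectively the same as having no robots.txt at all, except that bots get a 200 instead of a 404, which keeps the error logs clean.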
Technical SEO | | scanlin0