Site with fewer than 20 pages shows 1,400+ pages when crawled
-
Hello! I’m new to SEO, and have been soaking up as much as I can. I really love it, and feel like it could be a great fit for me – I love the challenge of figuring out the SEO puzzle, plus I have a copywriting/PR background, so I feel like that would be perfect for helping businesses get a great jump on their online competition.
In fact, I was so excited about my newfound love of SEO that I offered to help a friend who owns a small business with his site. Once I started, though, I found myself hopelessly confused.
The problem comes when I crawl the site. It was designed in WordPress, and is really not very big (part of my goal in working with him was to help him get some great content added!).
Even though there are only 11 pages – and 6 posts – for the entire site, when I use Screaming Frog to crawl it, it sees HUNDREDS of pages. It stops at 500, because that is the limit for their free version. In the campaign I started here at SEOmoz, it says over 1,400 pages have been crawled…with something like 900 errors.
Not good, right?
So I've been trying to figure out the problem...when I look closer in Screaming Frog, I can see that some things are being repeated over and over. If I sort by the Title, the URLs look like they’re stuck in a loop somehow - one line will have /blog/category/postname…the next line will have /blog/category/category/postname…and the next line will have /blog/category/category/category/postname…and so on, with another /category/ added each time.
So, with that, I have two questions:
- Does anyone know what the problem is, and how to fix it?
- Do professional SEO people troubleshoot this kind of stuff all of the time? Is this the best place to get answers to questions like that? And if not, where is it?
Thanks so much in advance for your help! I’ve enjoyed reading all of the posts that are available here so far; it seems like a really excellent and helpful community...I'm looking forward to the day when I can actually answer the questions!!
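To illustrate the loop a bit: a repeating path like /blog/category/category/postname is typically what a crawler produces when some link on the page is written relative (no leading slash), so every page it discovers resolves the same link one directory deeper. Here's a minimal Python sketch of that mechanism - the URLs are made up, and this is not a diagnosis of this particular site:

```python
from urllib.parse import urljoin

# Hypothetical: suppose a theme or plugin prints a link as
# href="category/postname" (relative, no leading slash) instead of
# href="/blog/category/postname". A crawler resolves that link against
# whatever page it found it on, so each hop lands one directory deeper.
relative_link = "category/postname"

page = "http://example.com/blog/category/postname"
for _ in range(3):
    page = urljoin(page, relative_link)
    print(page)

# Prints:
# http://example.com/blog/category/category/postname
# http://example.com/blog/category/category/category/postname
# http://example.com/blog/category/category/category/category/postname
```

If that is what's happening, the usual fix is to make the offending links absolute (or root-relative) in whatever template or plugin generates them - WordPress core generally outputs absolute permalinks, which is one reason plugins and theme code are reasonable first suspects.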
-
Thanks, Irving! I've tried turning the plugins off and on - the person who designed the site used a WP Boxer plugin and a Multiple Content Blocks plugin, and that is how the homepage is designed (feeding info from pages/posts), so I was wondering if that could be part of it...but turning them off and on doesn't seem to help. So I'm trying the other plugins too (there are just a couple), and if that doesn't work, I'll try a fresh install!
I also tried changing the permalink structure to just /sample-post/ and that didn't seem to work either...but I'm going to keep working on it!
I haven't tried the Twitter approach yet - because I don't actually have a Twitter account (I'm trying to keep social media from taking over my life) - but if that's where the answers are, I guess I need to get on there!
-
Did you install any plugins that might have caused the issue? I would deactivate all plugins and see if that has an effect, then turn them back on one at a time to see if you can isolate the issue.
If the plugins are not the issue, it might make sense to back up the DB and do a fresh install of WP, which isn't hard.
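If the host offers shell access with WP-CLI installed (an assumption - nothing in this thread says it does), the backup and plugin-isolation steps can be scripted instead of clicked through. The plugin slug below is a placeholder:

```bash
# From the WordPress root: export the database before touching anything.
wp db export backup-before-debugging.sql

# Deactivate every plugin, then re-crawl to see if the looping URLs disappear.
wp plugin deactivate --all

# Re-enable plugins one at a time, re-crawling after each, to isolate the culprit.
wp plugin activate some-plugin-slug
wp plugin list --status=active
```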
-
I don't think the site moved hosts - I'm not the person who created it, but his business is relatively new, so if there was a change it would have been done with very little content on the site.
The permalink structure is custom and looks like this: /blog/%year%/%monthnum%/%day%/%postname%/
Would something else be better? Let me know! Thanks!!
-
Hey K,
If you could post a screenshot of the Settings > Permalinks screen in the WordPress dashboard, or just copy and paste whatever is written in there in a reply, that might help diagnose the issue. Also, do you know if the site has moved hosts recently and was re-installed using the WordPress export & import feature?
-
Thanks, Alan! I'll try contacting those guys!
-
Related Questions
-
Migrating 1 page to https
Hi there, Although we do plan to migrate our entire site to https, we currently have only one page on our site which we want to migrate to https, as it has passwords and wotnot on it. Let's say it is httpS://www.example.com/login/ I am not really sure how to go about migrating just one page though. I am guessing that we need to:
- make sure that httpS://www.example.com/login/ is the only page that exists on httpS://
- replace any link to the httpS://www.example.com/login/ version on the http:// version
- remove httpS://www.example.com/login/ from the http://www.example.com/sitemap.xml
- create a httpS://www.example.com/sitemap.xml on httpS:// which only references the one page (httpS://www.example.com/login/)
- 301 http://www.example.com/login/ to https://www.example.com/login/
- submit both sitemaps to Google so they know what's up
- fetch http://www.example.com/login/ so that Google finds the redirect
Anything else?? :S Not too sure about this one. Many thanks for your help.
Intermediate & Advanced SEO | unirmk
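For the 301 step in that list, a minimal sketch of an .htaccess rule could look like the following - this assumes the site runs on Apache with mod_rewrite enabled and that /login/ is the only page meant to live on https:

```apache
# Send the one login page to HTTPS (assumption: Apache + mod_rewrite; adjust paths to your setup)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^login/?$ https://www.example.com/login/ [R=301,L]

# Optionally push every other HTTPS request back to HTTP so only /login/ is served over https
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/login/
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```
-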
Would it work to place H1 (or important page keywords) at the top of your page in HTML and move lower on page with CSS?
I understand that the H1 tag is no longer heavily correlated with stronger ranking signals, but that it is more important for keywords or keyphrases to appear toward the top of a page. My question is: if I just put my important keyword (or H1) toward the top of my page in the HTML and move it toward the middle/lower portion of the page with CSS positioning, will this still be viewed by Googlebot as important keywords toward the top of my page?
Intermediate & Advanced SEO | Jonathan.Smith
-
Do search engines crawl links on 404 pages?
I'm currently in the process of redesigning my site's 404 page. I know there are all sorts of best practices from a UX standpoint, but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl. Does putting links to "recent posts" or something along those lines allow the bot to continue on its way, or does the crawl stop at that point because the 404 HTTP status code is thrown in the header response?
Intermediate & Advanced SEO | brad-causes
-
Can we retrieve all 404 pages of my site?
Hi, Can we retrieve all 404 pages of my site? Is there any syntax I can use in Google search to list just the pages that give a 404? Or a tool/site that can scan all pages in Google's index and give me this report? Thanks
Intermediate & Advanced SEO | mtthompsons
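One way to approach this without relying on Google search syntax is to check your own URL list directly. Below is a minimal Python sketch that assumes the site publishes a sitemap.xml (the URL is a placeholder); it only catches 404s among URLs you already know about, so it complements the crawl errors report in Google Webmaster Tools rather than replacing it:

```python
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # placeholder

# Pull every <loc> entry out of the sitemap.
sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
urls = [loc.text for loc in root.iter(ns + "loc")]

# Request each URL and report anything that comes back as a 404.
for url in urls:
    response = requests.get(url, timeout=10, allow_redirects=False)
    if response.status_code == 404:
        print("404:", url)
```
-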
How do I find the links on my site that link to another one of my pages?
I ran the IIS SEO Toolkit and it found about 40 pages, and I have no idea how they exist. What tool can I use to find out which internal links point to them, so I can fix them or get rid of them?
Intermediate & Advanced SEO | EcommerceSite
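Screaming Frog's Inlinks view will show the source URL for each page it discovers, but if you'd rather check it yourself, here is a rough Python sketch that crawls a site and records the first page each internal URL was linked from. The start URL is a placeholder, it needs the requests and beautifulsoup4 packages, and it does no rate limiting:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "http://www.example.com/"  # placeholder
domain = urlparse(START).netloc

found_on = {}           # url -> first page it was linked from
queue = deque([START])
seen = {START}

while queue:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        url = urljoin(page, a["href"]).split("#")[0]
        if urlparse(url).netloc != domain or url in seen:
            continue
        seen.add(url)
        found_on[url] = page
        queue.append(url)

for url, source in found_on.items():
    print(url, "<- first linked from", source)
```
-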
How would you handle 12,000 "tag" pages on a WordPress site?
We have a WordPress site where /tag/ pages were not set to "noindex" and they are driving 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. We can't just do nothing, or a Panda update will come by and ding us for duplicate content one day (surprised it hasn't already). What would you do?
Intermediate & Advanced SEO | M_D_Golden_Peak
-
Duplicate content: is it possible to write a page, delete it and use it for a different site?
Hi, I have a simple question. Some time ago I built a site and added pages to it. I have since found out that the site was penalized by Google, and I have neglected it. The problem is that I had written well-optimized pages on that site which I would like to use on another website. Thus, my question is: if I delete a page I had written on site 1, can I use it on site 2 without being penalized by Google due to duplicate content? Please note: site one would still be online. I will simply delete some pages and use them on site 2. Thank you.
Intermediate & Advanced SEO | salvyy
-
Handling Similar page content on directory site
Hi All, SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and wanted to know the best way to go about this. I was thinking I could add rel="nofollow" to all the links to those pages, but I'm not sure that is the correct way to do this. Since the folders are deep within the site and not under one main folder, it would mean doing a disallow for many folders if I did this through robots.txt. The other thing I am considering is a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these dup pages from my SEO report and from the search engine index? Thanks!
Intermediate & Advanced SEO | cchhita
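If the meta tag route wins out, the line the programmer would need to add to the head of each of those city pages is short (shown generically here, nothing site-specific assumed); a robots.txt Disallow, by contrast, only blocks crawling and can leave already-indexed URLs sitting in the results:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```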