Please recommend a tool to list pages on my site.
-
I have taken a major hit from the latest update. The site has been online for 10 years, white-hat SEO all the way, but I do have some legacy pages where I would duplicate the title or the description on a new page.
Things are just disorganized currently, and I'm trying to find the best approach to organizing what I already have, as well as tracking new content.
I would like a tool that would basically extract a list of my current pages, with the title tags and the descriptions, into an Excel file. Not sure how the pros organize the SEO on a site, but my bright idea is that I can have a large Excel file with the pages listed so I can detect duplicate info.
Site only has about 300 pages. Just regular php pages, no CMS.
Thanks in advance!
-
I had to come back and thank you for the great suggestion on the Screaming Frog software tool, this is exactly what I was looking for plus some additional tools that are invaluable.
-
Based on this post, I just tried Screaming Frog and it is an awesome tool/resource! Thanks SEOmoz crew!
-
I've always gone with Xenu, though I've heard good things about the already-mentioned Screaming Frog.
Dr. Pete did a nice comparison last year of the two:
http://www.seomoz.org/blog/crawler-faceoff-xenu-vs-screaming-frog
-
You could use the SEO Spider Tool at http://www.screamingfrog.co.uk/seo-spider/. The free version is limited to 500 URIs, but since you said you have about 300 pages, you should be fine with it.
I hope that helps.
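If you'd rather skip third-party software, the extraction step is also easy to script. Here's a rough sketch in Python (standard library only) of pulling the title tag and meta description out of a page; the crawl loop and the CSV filename in the comment are placeholders, so adapt them to your own URL list:

```python
import csv
from html.parser import HTMLParser

class SEOFieldParser(HTMLParser):
    """Collects the <title> text and the meta description from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_seo_fields(html):
    parser = SEOFieldParser()
    parser.feed(html)
    return {"title": parser.title.strip(), "description": parser.description.strip()}

# To build the spreadsheet, fetch each URL (e.g. from your sitemap) and
# write one CSV row per page -- Excel opens CSV files directly:
# with open("pages.csv", "w", newline="") as f:
#     w = csv.writer(f)
#     w.writerow(["url", "title", "description"])
#     for url in urls:
#         html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
#         row = extract_seo_fields(html)
#         w.writerow([url, row["title"], row["description"]])
```

With 300 plain PHP pages this runs in seconds, and duplicate titles/descriptions are then a simple conditional-formatting check in Excel.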
Related Questions
-
I have over 3000 4xx errors on my site for pages that don't exist! Please help!
Hello! I have a new blog that is only 1 month old, and I already have over 3,000 4xx errors, which I've never had on my previous blogs. I ran a crawl on my site and it's showing my social media links as being indexed as pages. For example, my blog post link is:
https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/
My site is then creating a link like the below:
https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/twitter.com/aliciajthomps0n
But these are not real pages, and I have no idea how they got created. I then paid someone to index the links because I was advised by Moz, but it's still not working. All the errors are the same: it's indexing my Twitter account and my Pinterest. Can someone please help? I'm really at a loss with it.
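One likely cause (an assumption, since I can't see your theme's markup): the social icons are output with schemeless hrefs like href="twitter.com/...", which browsers and crawlers resolve relative to the current page. A quick Python sketch shows how exactly that phantom URL gets built, and how adding the scheme fixes it:

```python
from urllib.parse import urljoin

page = "https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/"

# A schemeless href is treated as a *relative* path, so it is resolved
# against the current page -- producing the phantom URL from the crawl:
broken = urljoin(page, "twitter.com/aliciajthomps0n")
print(broken)
# -> https://www.thebloggersincentive.com/blogging/get-past-a-creative-block-in-blogging/twitter.com/aliciajthomps0n

# With the scheme present, the link resolves to Twitter as intended:
fixed = urljoin(page, "https://twitter.com/aliciajthomps0n")
print(fixed)
# -> https://twitter.com/aliciajthomps0n
```

If that matches your markup, fixing the hrefs in the theme (and letting the bad URLs return 404 or redirecting them) should clear the errors over time.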
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
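One thing worth ruling out before blaming Search Console: small mismatches (trailing slashes, http vs https, case) between the indexed URL and the sitemap's `<loc>` entries can make Google treat a URL as "not submitted". A rough Python sketch for checking a URL against a sitemap file, with the comparison normalising the trailing slash (the domain in the test data is a placeholder):

```python
import xml.etree.ElementTree as ET

# The sitemaps.org namespace used by standard sitemap files.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def in_sitemap(url, xml_text):
    # Compare after stripping the trailing slash; a slash mismatch is a
    # common reason a URL appears unsubmitted even though it "is" there.
    norm = lambda u: u.rstrip("/")
    return norm(url) in {norm(u) for u in sitemap_urls(xml_text)}
```

Running this over the indexed URLs from GSC's export against your submitted sitemap XML should show whether the mismatch is real or cosmetic.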
Inner pages of a directory site won't index
I have a business directory site that's been around a long time but has always been split into two parts: a subdomain and the main domain. The subdomain has been used for listings for years, but just recently I've opened up the main domain and started adding listings there.
The problem is that none of the listing pages seem to be getting indexed in Google. The main domain is indexed, as is the category page and all its pages below that, e.g. /category/travel, but the actual business listing pages below that will not index. I can, however, get them to index if I request Google to crawl them in Search Console.
A few other things:
I have nothing blocked in the robots.txt file.
The site has a DA over 50 and a decent amount of backlinks.
There is a sitemap set up also.
Any ideas?
Site splitting the value of our pages across multiple variations. How can I fix this with the least impact?
Just started at a company recently, and there is a preexisting problem that I could use some help with. Somebody please tell me there is a low-impact fix for this:
My company's website is structured so all of the main links used on the nav are listed as .asp pages. All the canonical stuff. However, for "SEO purposes," we have a number of similar (not exact) pages in .html on the same topics on our site.
So, for example, let's say we're a bakery. The main URL, as linked in the nav, for our chocolate cakes would be http://www.oursite.com/chocolate-cakes.asp. This differentiates the page from our other cake varieties, such as http://www.oursite.com/pound-cakes.asp and http://www.oursite.com/carrot-cakes.asp. Alas, fully indexed in Google with links existing only in our sitemap, we also have:
http://www.oursite.com/chocolate-cakes.html
http://www.oursite.com/chocolatecakes.html
http://www.oursite.com/cakes-chocolate.html
This seems CRAZY to me, because wouldn't this split our search results four ways? Am I right in assuming this is destroying the rankings of our canonical pages?
I want to change this, but the problem is, none of the content is the same on any of the variants, and some of these pages rank really well, albeit mostly for long-tail keywords instead of the good, solid keywords we're after.
So, what I'm asking you guys is: how do I burn these .html pages to the ground without completely destroying our rankings for the other keywords? I want to 301 those pages to our canonical nav URLs but, because of the wildly different content, I'm afraid that we could see a heavy drop in search traffic. Am I just being overly cautious? Thanks in advance!
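If you do go the 301 route, keeping an explicit redirect map makes it auditable. A sketch of generating the rules in Python, assuming an Apache host (mod_alias's `Redirect 301` directive); since the site serves .asp pages it may well be IIS instead, where the equivalent lives in web.config rewrite rules. The URL mapping here is hypothetical, taken from the bakery example:

```python
# Hypothetical mapping from the legacy .html variants to the canonical
# .asp pages; the real list would come from the sitemap.
REDIRECTS = {
    "/chocolate-cakes.html": "/chocolate-cakes.asp",
    "/chocolatecakes.html": "/chocolate-cakes.asp",
    "/cakes-chocolate.html": "/chocolate-cakes.asp",
}

def htaccess_rules(redirects):
    """Emit one Apache 'Redirect 301' line per legacy URL, sorted for stable diffs."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(redirects.items())
    )

print(htaccess_rules(REDIRECTS))
```

Redirecting in batches (a few pages at a time, watching traffic for a couple of weeks) is one way to limit the blast radius the poster is worried about.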
Paging pages and SEO meta tag questions
Hi, I am using paging (pagination) on my website. There are lots of products, and the paging runs to 1,000 pages in total. What title tag should I add for every paginated page? Or is there a good way to tell search engines that all the pages belong to the same series?
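The usual convention is to give each paginated page a unique, numbered title rather than repeating one title 1,000 times. A minimal sketch of the pattern in Python, with placeholder names:

```python
def paginated_title(base, page, total):
    """Build a unique title per paginated page; page 1 keeps the base title."""
    if page == 1:
        return base
    return f"{base} - Page {page} of {total}"

# Hypothetical category name for illustration:
print(paginated_title("Construction Products", 37, 1000))
# -> Construction Products - Page 37 of 1000
```

Alongside unique titles, a self-referencing canonical on each paginated URL (rather than canonicalising every page to page 1) is the commonly recommended setup.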
Webmaster Tools/Time spent downloading a page
Hi! Is it preferable for the "time spent downloading a page" in Google Webmaster Tools to be high or low? I've noticed that this metric rapidly decreased after I moved my site to WP Engine, and I'm trying to figure out if it's a good or bad thing. Thanks! Jodi
Will having a big list of cities for areas a client services help or damage SEO on a page?
We have a client we inherited that has a flat text list of all the cities and counties they service on their contact page. They service the entire Southeast, so the list just looks ridiculous. Example: "South Carolina: Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc." The question is: will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but perhaps the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
Our Development team is planning to make our website nearly 100% AJAX and JavaScript. My concern is crawlability or lack thereof. Their contention is that Google can read the pages using the new #! URL string. What do you recommend?
Discussion around AJAX implementations and if anybody has achieved high rankings with a full AJAX website or even a partial AJAX website.
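For context, the #! approach relies on Google's AJAX crawling scheme, which mapped "hashbang" URLs to an `_escaped_fragment_` query parameter that the server answered with a rendered snapshot. Google has since deprecated that scheme, so building a new site around it is risky; real URLs via the History API or server-side rendering are the safer bet. A sketch of the mapping the old scheme performed (the example URL is hypothetical):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a #! ('hashbang') URL to the URL Google's now-deprecated
    AJAX crawling scheme would fetch from the server instead."""
    if "#!" not in url:
        return url  # not a hashbang URL; crawled as-is
    base, _, fragment = url.partition("#!")
    sep = "&" if "?" in base else "?"
    # The fragment is percent-encoded when moved into the query string.
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(escaped_fragment_url("https://example.com/products#!category=cakes"))
# -> https://example.com/products?_escaped_fragment_=category%3Dcakes
```

The server then had to return a fully rendered HTML snapshot for that `_escaped_fragment_` URL, which is exactly the maintenance burden that led Google to abandon the scheme.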