Doing large scale visual link/content analysis
-
Hi, I currently have a list of about 5,000 URLs I want to check visually and quickly, to identify decent content.
I'm currently opening 200 at a time with Firefox; beyond 200 it gets really choppy and slow, as you would expect.
I was wondering if anyone knew any other ways of opening a large number of web pages. It would be sweet if there were a tool that could scan a list, add the web pages to a PDF/PowerPoint, and send them back to you for analysis.
Kind Regards,
Chris
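For what it's worth, the batch-screenshot tool described above can be roughly approximated with headless Chromium. A minimal sketch (assuming a `chromium` binary is installed and supports the `--headless` and `--screenshot` flags; the binary name and output paths are illustrative):

```python
# Sketch: generate headless-Chromium shell commands that screenshot a list
# of URLs, one PNG per page. Assumes a "chromium" binary supporting
# --headless and --screenshot (adjust the name for your install).
import shlex


def screenshot_command(url, index, outdir="shots"):
    """Build one shell command that saves a PNG of `url`."""
    out = f"{outdir}/page_{index:04d}.png"
    return (f"chromium --headless --disable-gpu "
            f"--window-size=1280,1024 "
            f"--screenshot={shlex.quote(out)} {shlex.quote(url)}")


def commands_for(urls, outdir="shots"):
    """One command per URL, numbered in input order."""
    return [screenshot_command(u, i, outdir) for i, u in enumerate(urls, 1)]
```

Piping the generated commands to `sh` would produce one thumbnail-able PNG per URL; skimming those in an image viewer (or stitching them into a PDF) is much faster than opening 200 tabs.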
-
Looking at a screenshot of a website is a very poor way to determine content quality.
-
It can be solved if you have a well-configured system, such as a MacBook Air; then you can open as many pages as you need. The server also matters for how soon your pages become visible.
-
Have you considered Screaming Frog SEO Spider? You can let it crawl your entire site and then start with the content that has a very low word count. That would be a signal that the page is too thin and needs to be adjusted. Depending on the site, that might cut down quite a bit on the manual analysis.
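The low-word-count signal described above can also be approximated with a short script; a minimal sketch (stdlib only, with a hypothetical threshold of 300 words) that strips tags and flags thin pages:

```python
# Sketch: flag "thin" pages by visible word count, as a rough first-pass
# content filter before manual review.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect text content, skipping <script> and <style> blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def word_count(html):
    """Count whitespace-separated words in the page's visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.parts).split())


def is_thin(html, threshold=300):
    """Pages under `threshold` words are candidates for manual review."""
    return word_count(html) < threshold
```

The 300-word cutoff is an assumption; the right threshold depends on the site and page type.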
Related Questions
-
Content suggestions
Hi, in Moz Pro you get content suggestions. I was wondering if you can still rank if the topics you cover for a specific keyword on your page are not listed there? I guess the key is that all the topics covered are related to each other, correct? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
What link would be better?
Hi Guys, Just wondering what would be better in this instance: 1) finding an old post (with good authority) and getting a link from that old article, or 2) creating a brand new article and adding the link to that. Both naturally link out to the page you want a link to. To me, number 1, as the page already has authority; but then again number 2, since Google might place some weight on recency. Any thoughts? Cheers.
Intermediate & Advanced SEO | spyaccounts14
-
Cleaning up user generated nofollow broken links in content.
We have a question/answer section on our website, so it's user-generated content. We've programmed all user-generated links to be nofollow. Over time, we have accumulated many broken links, and some are even structurally invalid. Ex. 'http:///.'. I want to go in and clean up the links to improve user experience, but how do I justify it from an SEO standpoint, and is it worth it?
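One stdlib-only way to find structurally invalid links like the example above; a sketch (not the site's actual code), using a minimal structural check of scheme plus host:

```python
# Sketch: detect structurally invalid URLs such as 'http:///.'
# (a scheme with an empty host) using only the standard library.
from urllib.parse import urlsplit


def is_valid_link(url):
    """Minimal structural check: http(s) scheme plus a non-empty host."""
    try:
        parts = urlsplit(url)
    except ValueError:
        return False  # urlsplit rejects some malformed inputs outright
    return parts.scheme in ("http", "https") and bool(parts.netloc)


def broken_links(urls):
    """Return the subset of `urls` that fail the structural check."""
    return [u for u in urls if not is_valid_link(u)]
```

This only catches structural breakage; dead-but-well-formed links would still need an HTTP check.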
Intermediate & Advanced SEO | mysitesrock
-
Unnatural Links Removal - are GWMT links enough?
Hi, When working on an unnatural links penalty, is removing and disavowing the links shown in GWMT enough, or should the list be broadened to include OSE, Majestic, etc.? Thanks
Intermediate & Advanced SEO | BeytzNet
-
On-site links
Hi everybody, There's a lot of information about getting sitewide backlinks, but so little about on-site optimization. Is there a maximum number of links to put on a page? Is there a maximum number of links that a page should receive? So, what is the optimal strategy? And I'm only concerned about on-page and on-site links, not backlinks coming from other sites. Thanks
Intermediate & Advanced SEO | DavidPilon
-
Is there an optimal ratio of external links to a page vs. internal links originating at that page?
I understand that multiple links from a site dilute link juice. I also understand that external links to a specific page with relevant anchor text help ranking. I wonder if there is an ideal ratio of these two items.
Intermediate & Advanced SEO | Apluswhs
-
Links from tumblr
I have two links from hosted tumblr blogs which are not on tumblr.com. So, website1 has a tumblr blog: tumblr.website1.com. Another site, website2.com, also uses the A-record/custom domains option from Tumblr, but not on a subdomain, as described here: http://www.tumblr.com/docs/en/custom_domains Does this mean that all links from such sites count as coming from the same IP in Google's eyes? Or is there value in getting links from multiple sites because the A record doesn't affect SEO in a negative way? Many thanks, Mike.
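The same-IP question can be checked empirically by resolving each linking hostname and grouping by address. A sketch (the resolver is injected; in practice something like `socket.gethostbyname` could be passed in, and the hostnames/IPs below are purely illustrative):

```python
# Sketch: group link-source hostnames by resolved IP address, to see which
# Tumblr-hosted custom domains actually share a host.
from collections import defaultdict


def group_by_ip(hostnames, resolve):
    """Map each resolved IP to the hostnames that point at it.

    `resolve` is a callable hostname -> IP string; hosts that fail to
    resolve (OSError) are collected under the key None.
    """
    groups = defaultdict(list)
    for host in hostnames:
        try:
            groups[resolve(host)].append(host)
        except OSError:
            groups[None].append(host)
    return dict(groups)
```

If two custom domains resolve to the same address, they share hosting infrastructure even though the domains differ; whether Google treats that as "same source" is the open question in the post.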
Intermediate & Advanced SEO | team74
-
Link Architecture - Xenu Link Sleuth Vs Manual Observation Confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.

a) Manual observation: Within the catalogue view, I loaded the page source, hit Ctrl-F, and searched for "href"; it turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge tells me this is non-optimal.

b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this:

Level | Pages
0     | 1
1     | 42
2     | 860
3     | 3268

Now this looks more like a pyramid. I think this is because Link Sleuth can only read one layer of the nav bar at a time; it doesn't "hover" and read the rest of the nav bar (unlike searching for "href" in the page source, which finds everything). Question: how are search spiders going to read the site, as in (a) or as in (b)? Thank you!
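The manual Ctrl-F "href" count described in (a) can be reproduced with a parser, which avoids counting "href" occurrences in scripts or comments. A stdlib-only sketch that counts `<a href=...>` tags in a page's HTML source:

```python
# Sketch: count <a href=...> links in raw HTML, replicating the manual
# Ctrl-F "href" check but restricted to actual anchor tags.
from html.parser import HTMLParser


class LinkCounter(HTMLParser):
    """Increment a counter for every anchor tag carrying an href."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1


def count_links(html):
    counter = LinkCounter()
    counter.feed(html)
    return counter.count
```

Running this over each crawled page would show whether the ~750-link figure holds site-wide; a parser-based count is closer to what a spider reading the full source sees than a one-layer-at-a-time crawl.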
Intermediate & Advanced SEO | DigitalLeaf