Doing large-scale visual link/content analysis
-
Hi, I currently have a list of about 5,000 URLs that I want to check visually, to quickly identify decent content.
I'm currently opening 200 at a time in Firefox; beyond 200 it gets really choppy and slow, as you would expect.
I was wondering if anyone knew of other ways to open a large number of web pages. It would be sweet if there were a tool that could scan a list, capture the pages into a PDF/PowerPoint, and send it back to you for analysis.
Kind Regards,
Chris
-
Looking at a screenshot of a website is a very poor way to determine content quality.
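That said, if you do still want a quick visual skim, headless Chrome can capture one screenshot per URL from a list. A minimal sketch, assuming a chromium-browser binary is on your PATH (the binary name, file names, and window size here are placeholders, not a recommendation):

```python
import subprocess
from pathlib import Path

def screenshot_cmd(url, outfile, chrome="chromium-browser"):
    # Build the headless-Chrome invocation that writes a screenshot
    # of `url` to `outfile` and then exits.
    return [
        chrome,
        "--headless",
        "--disable-gpu",
        "--window-size=1280,2000",
        f"--screenshot={outfile}",
        url,
    ]

def capture_all(url_file, out_dir="shots"):
    # One short-lived browser process per URL keeps memory flat,
    # unlike holding 200 tabs open at once.
    Path(out_dir).mkdir(exist_ok=True)
    for i, line in enumerate(Path(url_file).read_text().splitlines()):
        url = line.strip()
        if url:
            subprocess.run(screenshot_cmd(url, f"{out_dir}/{i:04d}.png"),
                           timeout=30)

# Usage (one URL per line in urls.txt):
# capture_all("urls.txt")
```

You'd still have to flip through 5,000 images, though, which is why a content-based filter is usually the better first pass.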
-
A machine with a good hardware configuration (a MacBook Air, for example) will let you open more pages at once. The servers involved also matter, as they determine how quickly your pages become visible.
-
Have you considered Screaming Frog SEO Spider? You can let it crawl your entire site and then start with the content that has a very low word count. That would be a signal that the page is too thin and needs to be adjusted. Depending on the site, that might cut down quite a bit on the manual analysis.
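If you'd rather script that word-count pass yourself, a stdlib-only Python sketch could look like this (the 250-word threshold and the helper names are illustrative, not a standard):

```python
import re
from html.parser import HTMLParser
from urllib.request import urlopen

class _TextExtractor(HTMLParser):
    # Collect visible text, skipping <script> and <style> blocks.
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    # Rough visible-word count for one page's HTML.
    p = _TextExtractor()
    p.feed(html)
    return len(re.findall(r"\w+", " ".join(p.parts)))

def flag_thin_pages(urls, threshold=250):
    # Yield (url, words) for pages under `threshold` visible words.
    for url in urls:
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        words = word_count(html)
        if words < threshold:
            yield url, words
```

Feed it the same URL list and you only need to look manually at the pages it flags as thin.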
Related Questions
-
Links
Hi, 64% of our links come from a .com website and only 30% from a .co.uk one. We only do business in the UK; should I continue with the .com links, as they are easier to source? Does this hurt my SEO efforts?
Intermediate & Advanced SEO | Caffeine_Marketing
-
Tumblr links
I have several Tumblr blogs. Created when Tumblr links were worth more, and now primarily for my amusement. But, I'd like to get whatever link juice I can out of them. I thought only the footer links were do follow, but when I check Moz it's showing all links as do follow. Any idea which is true?
Intermediate & Advanced SEO | julie-getonthemap
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner
-
Duplicated content multi language / regional websites
Hi Guys, I know this question has been asked a lot, but I wanted to double-check, since I just read a comment by Gianluca Fiorelli (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) on this topic which made me doubt my research. The case: a Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website, since Dutch is spoken in large parts of both countries. They are willing to implement the following changes:
- Hreflang tags
- Possibly a local phone number
- Possibly a local translation of the menu
- A language meta tag (for Bing)
Optionally, they are willing to take the following steps:
- Cross-linking every page through a language flag or similar navigation in the header
- Investing in gaining local .be backlinks
- Changing the server location of both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant)
The content on the websites will be at least 95% duplicated. They would like the .be version to rank in Belgium and the .nl version in the Netherlands. Are these steps enough to make sure .be is shown for queries from Belgium and .nl for queries from the Netherlands? Or would this cause a duplicate-content issue that results in one version being filtered out? If that's the case, we should use the canonical tag, and then we can't rank the .be version of the website. Note: this company is looking for a quick conversion-rate win. They won't invest in rewriting every page and/or blog post. The less effort they have to put into this, the better (I know that's cursing when talking about SEO). Gaining local backlinks, for example, would bring a lot of costs with it. I would love to hear from you guys. Best regards, Bob van Biezen
Intermediate & Advanced SEO | Bob_van_Biezen
-
Back links Building and article/blog posting
Hi all, I have been researching the best way to build backlinks, and I would like to ask a few questions before I start.
- Which of these tools would you recommend for backlink diagnostics: www.linkrisk.com or www.linkdetox.com?
- What would be the best procedure for starting to build healthy backlinks?
- Would looking at my competitors' backlinks help me?
- What is the recommended number of backlinks to create per week?
- Also, how many blog entries should we aim to create per week?
The website I'm working on is manvanlondon.co.uk. If you guys have any further suggestions, please let me know. Many thanks for your time.
Intermediate & Advanced SEO | monicapopa
-
Duplicate content within sections of a page but not full page duplicate content
Hi, I am working on a website redesign. The client offers several services, and some elements of those services cross over with one another. For example, they offer a service called Modelling, and when you click onto that page, several elements that build up that service are featured, in this case 'mentoring'. Mentoring is common to other services and will therefore feature on other service pages. Each page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One idea we have come up with is taking the user through to a unique page that hosts all the shared content; however, some features do not warrant a dedicated page. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | J_Sinclair
-
One Way Links vs Two Way Links
Hi, I was speaking to a client today and got asked how damaging two-way links are, i.e. domaina.com links to domainb.com and domainb.com links back to domaina.com. I need a nice, simple layman's explanation of whether and how damaging they are compared to one-way links. And please don't answer with "you lose link juice", as I have enough of a job explaining link juice... I am explaining things to a non-techie! Thank you!!
Intermediate & Advanced SEO | JohnW-UK
-
Best procedure for distributing identical content about your company/site for affiliates to use?
When dealing with affiliate websites, where you send them a stock-standard bio or info on your company for them to use on their sites, what is best practice? Is it OK to have multiple websites all linking to you with pages that contain the same content? Should I ask them to implement canonical or noindex tags on those particular pages? Should I ask them to rewrite the content (which may be impractical, or which they may be unwilling to do)? Thanks
Intermediate & Advanced SEO | Martin_S