Doing large-scale visual link/content analysis
-
Hi, I currently have a list of about 5,000 URLs that I want to check visually and quickly, to identify decent content.
I'm currently opening 200 at a time in Firefox; beyond 200 it gets really choppy and slow, as you would expect.
I was wondering if anyone knew of other ways to open a large number of web pages. It would be great if there were a tool that could scan a list, capture each page into a PDF/PowerPoint, and send the result back for analysis.
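As a rough sketch of what such a tool could look like (my own illustration, not something from this thread): headless Chromium can capture one screenshot per URL in batch. This assumes a `chromium` binary on your PATH (the name varies by platform, e.g. `chrome` or `google-chrome`) and a hypothetical `urls.txt` file with one URL per line.

```python
# Batch-screenshot a list of URLs with headless Chromium, one PNG per URL.
# Assumptions: a `chromium` binary on PATH and a urls.txt input file.
import re
import subprocess
from pathlib import Path

def slug(url: str) -> str:
    """Turn a URL into a filesystem-safe file name."""
    # Drop the scheme, then replace anything unsafe with underscores.
    return re.sub(r"[^A-Za-z0-9._-]+", "_", url.split("://", 1)[-1]).strip("_")

def screenshot(url: str, out_dir: Path) -> Path:
    """Capture one page with headless Chromium; returns the PNG path."""
    out = out_dir / f"{slug(url)}.png"
    subprocess.run(
        ["chromium", "--headless", "--disable-gpu",
         f"--screenshot={out}", "--window-size=1280,900", url],
        check=True, timeout=60,
    )
    return out

if __name__ == "__main__" and Path("urls.txt").exists():
    out_dir = Path("shots")
    out_dir.mkdir(exist_ok=True)
    for url in Path("urls.txt").read_text().split():
        try:
            screenshot(url, out_dir)
        except Exception as exc:
            print(f"FAILED {url}: {exc}")
```

The resulting folder of PNGs can then be flipped through in any image viewer, or stitched into a PDF with a separate tool, which is much faster than opening tabs.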
Kind Regards,
Chris
-
Looking at a screenshot of a website is a very poor way to determine content quality.
-
This can be solved if you have a well-configured system, such as a MacBook Air; then you can open as many pages as you need. The server also matters for how soon your pages become visible.
-
Have you considered Screaming Frog SEO Spider? You can let it crawl your entire site and then start with the content that has a very low word count. That would be a signal that the page is too thin and needs to be adjusted. Depending on the site, that might cut down quite a bit on the manual analysis.
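The low-word-count idea above can be sketched as a standalone check (my own illustration, not Screaming Frog's method), assuming the pages have already been downloaded as HTML. The 300-word threshold is an arbitrary assumption to tune per site.

```python
# Flag "thin" pages by visible word count, using only the stdlib.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def word_count(html: str) -> int:
    """Count whitespace-separated words in the page's visible text."""
    p = TextExtractor()
    p.feed(html)
    return len(" ".join(p.parts).split())

def is_thin(html: str, threshold: int = 300) -> bool:
    """True if the page's visible text falls below the word threshold."""
    return word_count(html) < threshold
```

Sorting 5,000 saved pages by `word_count` and reviewing the thinnest first would cut down the manual pass the same way the Screaming Frog report does.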
Related Questions
-
Redirect closed shop to main shop, or keep the domain and content alive and use it for link building?
Hello, We used to have two shops selling our products: a small shop with a small selection of only our best-quality products (domain smallshop.com), and a big shop with everything (bigshop.com). It used to make sense (without going into full detail), but it's not relevant anymore, so we decided to stop maintaining the small shop because it was time-consuming and not worth it. There are some really good links pointing to smallshop.com, and the content is original (the product descriptions differ between the two shops). So far, we have just switched the "add to cart" button on the small shop into a link to the same product on the big shop, and added links from the small shop to the big shop on category pages as well. So the question is: in your opinion, is it better to keep the small shop and its content alive and use it to build links to our big shop, or to set up 301 redirects and shut the small shop down completely? Thanks for your opinion!
Intermediate & Advanced SEO | Colage
-
First Link on Page Still Only Link on Page?
Bruce Clay and others did some research and found that the first link on the page is the most important one and is the link that gets credited; any other links on the page to the same URL mean nothing. Is this still true? And in that case, on an ecommerce site with category links in the top navigation (which sits high in the code), is it not useful to link to categories in the content of the page, because the category is already linked to on that page? Thank you, Tyler
Intermediate & Advanced SEO | tylerfraser
-
Spam Links? -115 Domains Sharing the Same IP Address, to Remove or Not Remove Links
Out of 250 domains that link to my site, about 115 are low-quality directories published by the same company and hosted on the same IP address. Examples of these directories are www.keydirectory.net, www.linkwind.com, www.sitepassage.com, www.ubdaily.com, and www.linkyard.org. A recent site audit from a reputable SEO firm identified 125 toxic links; I assume these are those toxic links. They also identified about another 80 suspicious domains linking to my site. The audit concluded that my site is suffering a partial Penguin penalty due to low-quality links. My question is whether it is safe to remove these 125 links from the low-quality directories. I am concerned that removing this quantity of links all at once will cause a drop in ranking, because the link profile will be thin with only about 125 domains remaining that point to the site. Granted, those 125 domains should be of somewhat better quality. Am I playing with fire by having these removed? I urgently need advice, as the webmaster has initiated steps to remove the 125 links. Thanks everyone!!! Alan
Intermediate & Advanced SEO | Kingalan1
-
How hard would it be to take a well-linked site, completely change the subject matter & still retain link authority?
So, this would be taking a domain with a domain authority of 50 (200 root domains, 3500 total links) and, for fictitious example, going from a subject matter like "Online Deals" to "The History Of Dentistry"... just totally unrelated new subject for the old/re-purposed domain. The old content goes away entirely. The domain name itself is a super vague .com name and has no exact match to anything either way. I'm wondering, if the DNS changed to different servers, it went from 1000 pages to a blog, ownership/contacts stayed the same, the missing pages were 301'd to the homepage, how would that fare in Google for the new homepage focus and over what time frame? Assume the new terms are a reasonable match to the old domain authority and compete U.S.-wide... not local or international. Bonus points for answers from folks who have actually done this. Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
Copying my Facebook content to website considered duplicate content?
I write career advice on Facebook on a daily basis. On my homepage, users can see the most recent 4-5 posts (via the FB social media plugin). I am thinking of creating a page on my website where visitors can see all my previous FB posts. Would this be considered duplicate content if I copy and paste the text, whereas using the Facebook social media plugin would not be? I am working on increasing the content on my website and feel incorporating the FB posts would make sense. Thank you
Intermediate & Advanced SEO | knielsen
-
Need help with duplicate content. Same content; different locations.
We have 2 sites that will have duplicate content (e.g., one company that sells the same products under two different brand names for legal reasons). The two companies are in different geographical areas, but the client will put the same content on each page because they're the same product. What is the best way to handle this? Thanks a lot.
Intermediate & Advanced SEO | Rocket.Fuel
-
How to get the 'show map of' tag/link in Google search results
I have 2 clients that have apparently random examples of the 'show map of' link in Google search results. The maps/addresses are accurate and for airports. They are both aggregators that service the airports, e.g. lax airport shuttle (not an actual example), BUT they DO NOT have Google Place listings for these pages (either manual or auto-populated by Google), and they DO NOT have the map or address info on the pages that are returned in the search results with the map link. Does anyone know how this is the case? It's great that this happens for them, but I'd like to know how/why so I can replicate it across all their appropriate pages. My understanding was that for this to happen you HAD to have Google Place pages for the appropriate pages (which they can't do, as they are aggregators). Thanks in advance, Andy
Intermediate & Advanced SEO | AndyMacLean
-
Should I do something about this duplicate content? If so, what?
On our real estate site we have our office listings displayed. The listings are generated from a scraping script that I wrote. As such, all of our listings have the exact same description snippet as every other agent in our office. The rest of the page consists of site-wide sidebars and a contact form. The title of the page is the address of the house and so is the H1 tag. Manually changing the descriptions is not an option. Do you think it would help to have some randomly generated stuff on the page such as "similar listings"? Any other ideas? Thanks!
Intermediate & Advanced SEO | MarieHaynes