What is the best way to scrape SERPs for targeted keyword research?
-
I want to use search operators such as "KEYWORD inurl:blog" to identify potential link targets, then download the target URL, domain and keyword into an Excel file, and use SEOTools to evaluate the URLs from the list. I see the Link Acquisition Assistant in the Moz lab, but the listed operators are limited.
Appreciate any suggestions on doing this at scale,
thanks!
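To sketch the export step described above: assuming you pull SERP results from an API or rank-tracking export (scraping Google's results directly is against its terms of service), combining each keyword with the operator and writing keyword/URL/domain rows to a CSV that Excel and SEOTools can open might look like this. `fetch_serp` is a placeholder for whatever data source you use, not a real API:

```python
import csv
from urllib.parse import urlparse

def build_queries(keywords, operator="inurl:blog"):
    """Pair each seed keyword with a search-operator query."""
    return {kw: f"{kw} {operator}" for kw in keywords}

def fetch_serp(query):
    """Placeholder: return result URLs for a query from your SERP
    data source (an API or rank-tracking export)."""
    return []

def export_targets(keywords, path="link_targets.csv"):
    """Write keyword / URL / domain rows to a CSV for Excel/SEOTools."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["keyword", "url", "domain"])
        for kw, query in build_queries(keywords).items():
            for url in fetch_serp(query):
                writer.writerow([kw, url, urlparse(url).netloc])
```

Swap `fetch_serp` for your actual source and the rest scales to as many keywords as you can feed it.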
-
Thanks a bunch, Lavellester. Just finished Tom's post - exactly what I needed.
I appreciate your help,
H
-
Hi,
If I remember correctly, this post by Tom Critchlow about agile SEO hacks should get you moving in the right direction.
Hope it helps.
L
Related Questions
-
Really struggling with SERPs...
My website is only a week old or so, but I have no pages showing in the SERPs, or at least not in the first few hundred results! I have many other hobby websites that had pages in the top 100 results instantly, and the niche of this new website is tiny and not saturated, so it should be up there already! All pages are indexed but none are showing in the results. It feels like I have been penalised or something, but I don't see how or why. My website is www.magnet-fishing.co.uk if anyone can see anything obvious that I am missing. Regards, Andy
Intermediate & Advanced SEO | Onlytopheadsets
-
Two identical eCommerce stores, one for the US and one for CA: what's the best way to SEO them?
Hello everyone! I have an SEO question that I cannot solve given the parameters of the project, and I was wondering if someone could provide me with the next best alternative to my situation. Thank you in advance.

The problem: Two eCommerce stores are completely identical (structure, products, descriptions, content), but they are on separate domains for currency and targeting purposes: www.website-can.com is for Canada and www.website-usa.com is for the US. Due to exchange rate issues, we are unable to combine the two domains into one store and optimize.

What's been done? I have optimized the Canadian store with unique meta titles and descriptions for every page and every product, but I have left the US store untouched. I would like to gain more visibility for the US store, but it is very difficult to create unique content considering the products are identical. I have evaluated using canonicals, but that would ask Google to only look at either the Canadian or the US store (correct me if I'm wrong). I am looking for the next best solution given the challenges, and I was wondering if someone could provide me with some ideas.
Intermediate & Advanced SEO | Snaptech_Marketing
-
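A note on the technique usually recommended for this exact setup: rather than cross-domain canonicals (which, as the question says, would consolidate signals onto one store), hreflang annotations tell Google the two URLs are regional alternates of each other, so each domain can rank in its own country even with identical content. A minimal sketch using the example domains from the question; both stores would carry the same pair of tags on matching pages, and the x-default choice here is an assumption:

```html
<!-- In the <head> of www.website-can.com/product-x,
     and mirrored on the matching US page: -->
<link rel="alternate" hreflang="en-ca" href="https://www.website-can.com/product-x" />
<link rel="alternate" hreflang="en-us" href="https://www.website-usa.com/product-x" />
<link rel="alternate" hreflang="x-default" href="https://www.website-usa.com/product-x" />
```

The annotations must be reciprocal (each page lists both itself and its alternate) or Google ignores them.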
Benefit of Targeting Low/No Volume Keyword Phrases
Hi Folks, I was having a discussion with a friend and colleague of mine yesterday about the pros and cons of targeting keyword phrases that have very little, if any, search volume. I was of the opinion that if the keyword phrases (whether they were local or not) did not have any search volume as indicated by Google's Keyword Planner tool, then they had little if any value. Would this be a correct assumption? Or is there merit to targeting these phrases in order to begin to build a picture of a site's overall subject matter and to help it rank in local search? For example, say there is a phrase like 'second hand clothing slough' (just a random phrase) which has no search volume, but 'second hand clothing' has 2400 searches a month. Would it be worth targeting the phrase with no volume to build a better local profile, so that if someone in Slough searches for 'second hand clothing' the site shows up for that keyword? Thanks in advance guys! Gareth
Intermediate & Advanced SEO | PurpleGriffon
-
Best way to fix 404 crawl errors caused by Private blog posts in WordPress?
Going over the Moz crawl error report and WMT's crawl errors for a new client site, I found 44 high-priority crawl errors (404 Not Found). Those 44 blog pages were set to Private mode in WordPress, causing the 404s.

I was reviewing the content of those 44 pages to see why those 2010 blog posts were set to private mode, and I noticed that all 44 were pretty much copied from other external blog posts. So I'm thinking the previous agency placed those pages under private mode to avoid getting hit for duplicate content issues. All blog posts published after 2011 looked like unique, non-scraped content.

So my question to all is: what is the best way to fix the issue caused by these 44 pages?

A. Remove the 44 blog posts that used verbatim scraped content from other external blogs.
B. Update the content on each of the 44 blog posts, then set them to Public mode instead of Private.
C. ? (open to recommendations)

I didn't find any external links pointing to any of those 44 blog pages, so I was considering removing them, but I'm not sure if that will affect the site in any way. Open to recommendations before making a decision. Thanks

Intermediate & Advanced SEO | SEOEND
-
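Before deciding between the options above, it can help to confirm at scale what those URLs actually return. A minimal sketch using only Python's standard library; the URL list would come from your Moz/WMT export (no real file name assumed here):

```python
import urllib.request
import urllib.error

def check_status(url, timeout=10):
    """Return the HTTP status code for a URL; 404 means the
    private post is still invisible to visitors and crawlers."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def triage(statuses):
    """Group URLs by status code so the 404s can be removed or
    redirected and anything already 200 can be left alone."""
    buckets = {}
    for url, code in statuses.items():
        buckets.setdefault(code, []).append(url)
    return buckets
```

Run `check_status` over the 44 URLs, pass the results to `triage`, and you have a clean worklist for whichever option (A, B, or C) you settle on.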
Keyword Phrases - Can You Break Them Up?
Can you break up a search query across a sentence and have Google still recognize which query you are targeting? Let's say I'm trying to rank a page for the phrase "best haircuts calgary". Is Google's algorithm advanced enough to look at the page title "Best Haircuts - Where To Get Them In Calgary" and know it's targeting the query "best haircuts calgary"? If it can't do this right now, I could see it advancing to this at some point in the future, which would change the game quite a bit in terms of how creatively you can build pages for queries.
Intermediate & Advanced SEO | reidsteven75
-
Dropping dramatically in keyword rankings
One of my clients has always ranked well for this keyword (janitorial services, nh). Then within a week, they dropped out of the top fifty in Google, and now in other major search engines as well.

My first thought is that the drop in rankings is due to duplicate listings within online directories. The previous marketing person on staff listed the company more than once in these directories, and it wasn't discovered until later in the link building process. Sometimes the company was listed with "janitorial services" as part of the company name, and then listed again with "carpet cleaning" as part of the company name, sometimes with a duplicate address, or using the PO box instead, as if they were two companies.

The odd thing in all this is that while they dropped in ranking for this keyword, they still usually come in 1st in Google Places for it, with 12 excellent reviews. And yet when I check their Google Places account, it says that it needs to be reverified; again, it doesn't meet the terms. (The company is a family owned business of over 30 years; they have a lot of potential.)

So all this duplication needs to be fixed, but how serious are duplicate listings on places like Manta, YellowPages, SuperPages, and also Yahoo Business Local and Bing Business Directory? And now that "forensics" seems to be my task, any suggestions on how to start? Any processes I should go through with Google Webmaster Tools? _Cindy

And, if I could add: the site itself ranks very poorly for this keyword, and while I have provided recommendations, and they understand the onsite issue, they have yet to go forward with implementation, making this a little more difficult.
Intermediate & Advanced SEO | CeCeBar
-
What's the best way to phase in a complete site redesign?
Our client is in the planning stages of a site redesign that includes moving platforms. The new site will be rolled out in phases over a period of a year. They are planning to put the redesign on a subdomain (i.e. www2.website.com) during the rollout, then switch the new site back over to the www subdomain once all the phases are complete.

We're afraid that having the new site on the www2 subdomain will hurt SEO. For example, if the first phase is rolling out a new system to customize a product design, and this system is hosted at www2.website.com/customize, then when a customer picks a product to customize they'll be linked to www2.website.com/customize instead of the original www.website.com/customize. The old website will be phased out as more of the new website is completed and users are directed to www2. Once the redesign is complete, the old platform can be removed and the new website moved back to the www subdomain.

Is there a better way of rolling out a website redesign in phases without hosting it on a different subdomain?
Intermediate & Advanced SEO | BlueAcorn
-
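One mitigation worth noting whichever route is taken: when the finished site finally moves from www2 back to www, a blanket per-URL 301 preserves whatever equity the interim subdomain picked up during the phased rollout. A minimal Apache sketch, assuming mod_rewrite is available and using the example hostnames from the question:

```apache
# Redirect every www2 URL to its www equivalent, path preserved
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www2\.website\.com$ [NC]
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
```

The same one-rule mapping works because the URL paths are identical on both subdomains; if any paths change at cutover, those need individual redirects instead.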
Has anyone found a way to get sitelinks in the SERPs?
I want to get some sitelinks in the SERPs to increase the size of my "space". Has anyone found a way of getting them? I know Google says it's automatic and only generated if they feel it would benefit searchers, but there must be a rule of thumb to follow. I was thinking along the lines of a tight categorical system implemented throughout the site that is clearly related to the content (how it should be anyway, I guess)... Any comments or suggestions welcome.
Intermediate & Advanced SEO | CraigAddyman