User Intent - Office Chairs & Content Writing
-
Hi
I'm trying to come up with content ideas for office chairs. Any tips on where to start with user intent for this type of search query? I'm using AnswerThePublic to gather some ideas.
Some of the ideas are good, but I can't actually find any search volume for the phrases, so I'm unsure whether to devote time to writing anything. Surely most people just want to find a supplier and buy the chair; they don't want a huge informational piece on how to buy one.
Our competitors are the likes of Amazon and a load of other huge companies with high DA, so I'm looking for types of content we could write that people are interested in reading about chairs and that are less competitive... I'm not sure that exists...
Any help is appreciated
-
Yup, you can take the same approach with Screaming Frog. You can run it in List Mode, which will let you upload the list of URLs to crawl, and you can set up a custom extraction to separate out the h3s (Configuration > Custom > Extraction).
Paul
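For anyone who'd rather script the extraction step, here's a minimal sketch of the same "pull the h3s out of a page" idea in plain Python, using only the standard library. This is just an illustration of the concept, not how Screaming Frog or Netpeak Spider work internally, and the HTML snippet is made up:

```python
# Collect the text of every <h3> element from an HTML string,
# mirroring a custom-extraction rule like //h3.
from html.parser import HTMLParser

class H3Collector(HTMLParser):
    """Accumulates the text content of each <h3> element."""
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True
            self.headings.append("")  # start a new heading

    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False

    def handle_data(self, data):
        if self.in_h3:
            self.headings[-1] += data  # append text inside the open <h3>

parser = H3Collector()
# Made-up snippet standing in for a downloaded SERP page.
parser.feed("<h3>Buy Office Chairs Online</h3><p>ad</p><h3>Office Chair Buying Guide</h3>")
print(parser.headings)  # ['Buy Office Chairs Online', 'Office Chair Buying Guide']
```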
-
Not sure how to do it with Frog, but I'm pretty sure you can. You can always start a chat with the guys from Netpeak Spider's support; they'll show you how to set everything up. Just send them a link to this thread.
-
Can you do the same thing with Screaming Frog? I'm not sure how you'd do it.
-
Wow thanks, I'll give this a go
-
Here's what I would do if I needed keywords about chairs with commercial intent.
Step 1: Get as many keywords related to this topic as possible.
Step 2: Paste your list of keywords into Notepad++
Step 3: Use the replace feature (Ctrl+H) to prepend "google.com/search?q=" to the start of each line: https://i.imgur.com/sJ1g854.png
This turns your list of keywords into a list of search queries. The ^ symbol stands for the beginning of the line in Notepad++, so set N++ to replace ^ with that Google prefix. For keywords with more than one word, replace all spaces with a + sign to get google.com/search?q=best+chairs
Step 4: Use a crawler tool (Netpeak Spider has a 2-week free trial), set it up to crawl your list of "Google" URLs from Notepad++, and fetch the h3's. This is where page titles are nested in SERPs; there will be 10 h3's for every keyword you used.
Example: https://i.imgur.com/x5UMGSU.png
Step 5: Paste the page titles with their Google URLs into a spreadsheet and use conditional formatting to highlight titles where words like buy, for sale, price, delivery, etc. are used. Use words that indicate commercial intent.
Step 6: Count the number of commercial titles per SERP for every keyword to see whether the SERP is commercial or informational.
Step 7: Separate the commercial queries from the informational ones.
Step 8: Analyze the informational keywords only and build your content strategy around them; they will be easier to rank for with articles and similar content.
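The Notepad++ and spreadsheet steps above can also be sketched in a few lines of Python. This is a rough illustration of Steps 2-3 and 5-6; the keyword list, the sample titles, and the set of commercial-intent words are placeholders to tune for your niche, not data from any real SERP:

```python
# Steps 2-3: turn keywords into Google search URLs (spaces become +),
# then Steps 5-6: flag and count commercial-intent titles per keyword.
from urllib.parse import quote_plus

keywords = ["best office chairs", "ergonomic chair guide", "office chairs for sale"]

# Steps 2-3: build one search URL per keyword.
urls = ["google.com/search?q=" + quote_plus(kw) for kw in keywords]

# Example words that indicate commercial intent (extend for your niche).
COMMERCIAL = {"buy", "sale", "price", "delivery", "cheap", "shop"}

def is_commercial(title):
    """True if any commercial-intent word appears in the title."""
    words = title.lower().replace(",", " ").split()
    return any(w in COMMERCIAL for w in words)

# Stand-in for the h3 titles your crawler fetched for each query.
titles = {
    "best office chairs": ["10 Best Office Chairs", "Buy Office Chairs Online"],
    "ergonomic chair guide": ["How to Choose an Ergonomic Chair"],
}

# Step 6: count commercial titles per SERP for every keyword.
commercial_counts = {
    kw: sum(is_commercial(t) for t in serp_titles)
    for kw, serp_titles in titles.items()
}
print(urls[0])            # google.com/search?q=best+office+chairs
print(commercial_counts)  # {'best office chairs': 1, 'ergonomic chair guide': 0}
```

From there, Steps 7-8 are just a matter of sorting keywords by their commercial count and keeping the low scorers for informational content.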
Related Questions
-
Medical / Health Content Authority - Content Mix Question
Greetings, I have an interesting challenge for you. Well, I suppose "interesting" is an understatement, but here goes. Our company is a women's health site. However, over the years our content mix has grown to nearly 50/50 between unique health / medical content and general lifestyle/DIY/well being content (non-health). Basically, there is a "great divide" between health and non-health content. As you can imagine, this has put a serious damper on gaining ground with our medical / health organic traffic. It's my understanding that Google does not see us as an authority site with regard to medical / health content since we "have two faces" in the eyes of Google. My recommendation is to create a new domain and separate the content entirely so that one domain is focused exclusively on health / medical while the other focuses on general lifestyle/DIY/well being. Because health / medical pages undergo an additional level of scrutiny per Google - YMYL pages - it seems to me the only way to make serious ground in this hyper-competitive vertical is to be laser targeted with our health/medical content. I see no other way. Am I thinking clearly here, or have I totally gone insane? Thanks in advance for any reply. Kind regards, Eric
Intermediate & Advanced SEO | Eric_Lifescript
-
Pagination & SEO
Hi In one of my other Q&A's someone mentioned I may need to look at pagination. For instance, are these pages counted as 'new' pages in Google's eyes when clicking on pagination? http://www.key.co.uk/en/key/plastic-storage-boxes http://www.key.co.uk/en/key/plastic-storage-boxes#productBeginIndex:30&orderBy:5&pageView:list& Does anyone have any advice on what I could do? It's not something I have had much experience with. Thank you Becky
Intermediate & Advanced SEO | BeckyKey
-
Duplicate Content for Deep Pages
Hey guys, For deep, deep pages on a website, does duplicate content matter? The pages I'm talking about are image pages associated with products and will never rank in Google, which doesn't concern me. What I'm interested to know, though, is whether the duplicate content would have an overall effect on the site as a whole? Thanks in advance Paul
Intermediate & Advanced SEO | kevinliao
-
What are your thoughts on Content Automation?
Hi, I want to ask forum members’ opinion on content automation. And before I raise the eyebrows of many of you with this question, I’d like to state that I am creating content and doing SEO for my own website, so I’m not looking to cut corners with spammy tactics that could hurt my website from an organic search perspective. The goal is to automate pages in the areas of headings, Meta Titles, Meta Descriptions, and perhaps a paragraph of content. More importantly, I’d like these pages to add value to the user's experience, so the question is: how do I go about automating the pages, and more specifically, how are meta titles, meta descriptions, etc. automated? I’d also like to hear from people who recommend steering clear of any form of content automation. I hope my question isn’t too vague, and I look forward to hearing from other Mozzers. Regards, Russell in South Africa
Intermediate & Advanced SEO | Shamima
-
Duplicate (Basically) H1 & H2
We're about to relaunch one of our ecommerce sites and have a question regarding H1 & H2 tags. We use our primary keyword for each category in that category page's H1. We also include a block of text at the bottom of the page explaining the benefits of the products, the various styles we offer, personalization options, gift packaging, etc. We were planning on having an H2 at the beginning of that text that read 'About [keyword:]', but the question of duplicate H1 & H2 tags has come up. Is penalization possible for having them almost the same? It's not like they're not relevant: the H1 refers to the category itself and the H2 references our explanation of the category. Just curious what the best way to approach this would be.
Intermediate & Advanced SEO | Kingof5
-
Is an RSS feed considered duplicate content?
I have a large client with satellite sites. The large site produces many news articles and they want to put an RSS feed on the satellite sites that will display the articles from the large site. My question is, will the rss feeds on the satellite sites be considered duplicate content? If yes, do you have a suggestion to utilize the data from the large site without being penalized? If no, do you have suggestions on what tags should be used on the satellite pages? EX: wrapped in tags? THANKS for the help. Darlene
Intermediate & Advanced SEO | gXeSEO
-
Duplicate Content Question
Brief question - SEOMOZ is telling me that I have duplicate content on the following two pages: http://www.passportsandvisas.com/visas/ and http://www.passportsandvisas.com/visas/index.asp The default page for the /visas/ directory is index.asp, so it's effectively the same page, but apparently SEOMOZ and, more importantly, Google, etc. treat these as two different pages. I read about 301 redirects etc., but in this case there aren't two physical HTML pages - so how do I fix this?
Intermediate & Advanced SEO | santiago23
-
Google indexing flash content
Hi Would Google's indexing of Flash content count towards page content? For example, I have over 7000 Flash files, with 1 unique Flash file per page followed by a short 2-paragraph snippet. Would Google count the Flash as content towards the overall page? At the moment I've applied an X-Robots-Tag with noindex, nofollow and noarchive to prevent them from appearing in the search engines. I'm just wondering whether, if the Google bot visits and accesses the Flash file, it'll get the X-Robots-Tag noindex, nofollow and then stop processing. I think this may be why the Panda update also had an effect. thanks
Intermediate & Advanced SEO | Flapjack