What sort of content for 'non-niche' website?
-
Hey guys, I had a question with regards to content production. We run a store called Yellow Octopus in Australia and we've literally got thousands of products (4,500 SKUs at last count). We've got everything from novelty mugs to kitchen accessories to gag gifts, t-shirts and tech gadgets.
I've read a lot of material on creating awesome content to attract backlinks and we are ready to craft our content strategy. We've got a team in place - graphic designer, illustrator and writers to execute that strategy.
It's just a matter of formulating the strategy! Broadly speaking, I have an idea of the quality of content required because I look at a lot of it. The real issue is: what type of content is right for us? Most of the articles I have read focus on niche industries, e.g. SEO, piano sales or health foods. Right off the bat I can come up with hundreds of content pieces built around those niches.
However, with such a diverse range of products, I'm unsure of what our niche really is; in fact, not having a niche is almost our niche. Of course we could do gift guides like '30 Unbelievable Gifts for Foodies' (and we do create those). However, they aren't really the type of posts that are likely to attract backlinks.
Is the best strategy to split the content into categories? What sort of content pieces would you suggest for a company such as ours?
Many thanks in advance!
-
From looking at your site, your niche is "quirky gifts", so that is a decent umbrella topic to drive the content you're producing.
I think the "10 Best ______" format is one you should keep doing regardless of the links. Over time they could get links from people referencing gift occasions and things like that.
The next content types I'd be focused on would be topic related. "Star Wars", "Wine Lovers", "Science", etc. These don't have to be posts about gifts for those audiences, just anything related to those topics that fans would want to read. These types of posts will be your broader strategy for building awareness with your target audiences. The better your production value and promotion efforts, the more likely you'll see links from this type of content.
Your overall blog may or may not attract subscribers. You're more likely to get them if you focus on an angle, e.g. "nerdy stuff" like www.thinkgeek.com. That's probably the best route to take in order to build a following over time; otherwise you'll always be trying to build traffic without a developed fanbase. And your blog can speak to 60% of the categories you sell and still be successful; you don't have to publish about every product category on the same blog.
-
Hi Matt,
I am also writing for a shop with a large range of products and categories. A topic type that works well for me is "How to...". This attracts good-quality traffic, and there are always people who see the article and then, instead of doing it themselves, decide to buy the product because it looks so good.
Good luck!
Related Questions
-
We are redirecting the http and non-www versions of our website. Should all versions, http (non-www and www) and https (non-www), each have just one redirect to the https www version? That way all forms of the website point to one version?
Intermediate & Advanced SEO | Caffeine_Marketing
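For what it's worth, the single-hop rule this question describes is usually written at the server level. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled and using example.com as a placeholder domain:

```apache
# Send every http/non-www variant straight to https://www. in one hop
RewriteEngine On

# Match any request that is either not HTTPS or not on the www host
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
# 301 to the canonical https://www. version, preserving the path and query string
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```

Because both conditions are checked before a single RewriteRule fires, http://example.com/page, http://www.example.com/page and https://example.com/page each redirect once, directly to https://www.example.com/page, with no chaining.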
Should I redirect a domain we control but which has been labeled 'toxic' or just shut it down?
Hi Mozzers: We recently launched a site for a client, which involved bringing in and redirecting content that had formerly been hosted on different domains. One of these domains still exists, and we have yet to bring over the content from it. It has also been flagged as a suspicious/toxic backlink source to our new domain. Would I be wise to redirect this old domain, or should I just shut it down? None of its pages seem to have particular equity as link sources. Part of me is asking, 'Why would we redirect a domain deemed toxic? Why not just shut it down?' Thanks in advance, dave
Intermediate & Advanced SEO | Daaveey
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, this is my first time posting here. I'm looking for feedback on an indexation issue we have with a client, and on any next steps or items I may have overlooked.

For background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are five different business units, so this duplicate content scenario exists across all the microsites. It is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.

The issue: about six weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com). Over a period of 2-3 weeks, pretty much all our terms dropped out of the rankings and search visibility dropped to essentially zero. I can see that pages from the websites are still indexed, but oddly it is the duplicate content pages: bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx is still indexed, and similarly on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites.

Logging into Webmaster Tools, I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in the technicalseo.com robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory of both microsites. After this we saw a small pickup in site visibility, and a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. The error message in GSC also persisted.

Steps taken so far after that:

1. In Google Search Console, confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to microsites BU1/BU2 have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Saw that some people had similar issues when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client. I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.

So it seems to me the issue could be something server-side, but I'm at a bit of a loss for next steps. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
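As a point of reference, the "basic robots.txt" added to each microsite root only needs a couple of lines to permit full crawling; a sketch like the following would resolve the fetch error without blocking anything (the Sitemap URL is a hypothetical placeholder):

```text
# robots.txt at https://bu1.corewebsite.com/robots.txt
# An empty Disallow means nothing is blocked for any crawler
User-agent: *
Disallow:

Sitemap: https://bu1.corewebsite.com/sitemap.xml
```

The key point is that the file must return a 200 (or a clean 404); a timeout or 5xx on /robots.txt causes Googlebot to postpone crawling the whole host, which matches the symptoms described.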
We're planning a major website redevelopment - SEO Considerations?
We're currently planning a website overhaul, with a new site to be designed and implemented on our existing Drupal 7 platform. I've outlined the following areas to consider:

- Listing out top content by traffic, conversions, rankings and bounce rates, to ensure top content continues to get relevant links throughout the site (in particular, high internal PA links!)
- Maintaining a specific keyword target for each page
- Ensuring on-page SEO fundamentals remain in place (i.e. img alt tags, headings and page titles)
- Keeping page load times low
- Ensuring the site architecture is built around our keyword methodology

What else do I need to be aware of? I'm predicting a drop in traffic, as this tends to follow redesigns, but I'm looking to make it as minimal as possible. Sam
Intermediate & Advanced SEO | Sam.at.Moz
Is a Rel Canonical Sufficient or Should I 'NoIndex'
Hey everyone, I know there is literature about this, but I'm always frustrated by technical questions and prefer a direct answer or opinion. Right now, we've got rel canonicals set up to deal with parameters caused by filters on our ticketing site. An example is that this: http://www.charged.fm/billy-joel-tickets?location=il&time=day rel canonicals to... http://www.charged.fm/billy-joel-tickets My question is whether this is good enough to deal with the duplicate content, or if the pages should be de-indexed. Assuming so, is the best way to do this via robots.txt? Or do you have to individually 'noindex' these pages? This site has 650k indexed pages, and I'm thinking the majority of these are caused by URL parameters. While they're all canonicaled to the proper place, I'm thinking it would be best to have these de-indexed to clean things up a bit. Thanks for any input.
Intermediate & Advanced SEO | keL.A.xT.o
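For reference, the two mechanisms being weighed here both live in the head of the filtered URL. A sketch using the example URLs from the question (note that a page blocked in robots.txt can't be crawled at all, so crawlers would never see either tag on it):

```html
<!-- In the <head> of
     http://www.charged.fm/billy-joel-tickets?location=il&time=day -->

<!-- Option 1: keep the page crawlable but consolidate ranking
     signals onto the clean URL (the current setup) -->
<link rel="canonical" href="http://www.charged.fm/billy-joel-tickets">

<!-- Option 2: explicitly drop the parameter URL from the index
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

This is why robots.txt is generally the wrong tool for de-indexing pages that are already indexed: once blocked, Googlebot can no longer fetch the page to read the canonical or noindex directive.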
Merging websites
My company (A) is about to merge with another company (B). The long-term plan is not to keep their brand or website. In terms of the merge process, I have been doing a bit of research, and this is how I'm thinking about doing it so far (though I'm open-minded about changing it):

- On the homepage of company B, do a 302 redirect to an information page on the same website which details the merger. This will only be for a year.
- After a year has passed, do a 301 redirect to the homepage of company A.
- Do 301 redirects from all other pages to similar pages on company A. For pages that don't correspond, either do a 302 to the 'merger detail page', or do a 301 to the homepage of company A.
- Bring across any content that is effective at driving traffic.
- Contact all high-authority websites that link to company B and request that the links be updated.

Any tips/corrections appreciated. Stu
Intermediate & Advanced SEO | Stuart26
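A hedged sketch of how the first phase of the plan above could be expressed in Apache config on company B's server (companya.com and companyb.com are placeholder domains, and the page paths are hypothetical):

```apache
# Phase 1 (first year): homepage temporarily points at the merger notice.
# RedirectMatch with ^/$ is used so only the homepage matches, not every path.
RedirectMatch 302 ^/$ https://www.companyb.com/merger-info

# Pages with a close equivalent on company A get permanent one-to-one redirects.
# Note: Redirect matches by path prefix, so list more specific paths first.
Redirect 301 /products/widgets https://www.companya.com/products/widgets
Redirect 301 /about https://www.companya.com/about-us

# Phase 2 (after a year): swap the homepage rule for a permanent redirect.
# RedirectMatch 301 ^/$ https://www.companya.com/
```

Keeping the page-level 301s separate from the temporary homepage rule makes the later 302-to-301 switch a one-line change.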
Moving Content To Another Website With No Redirect?
I've got a website that has lots of valuable content and tools but it's been hit too hard by both Panda and Penguin. I came to the conclusion that I'd be better off with a new website as this one is going to hell no matter how much time and money I put in it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today. I'd like to move some of that content to this other domain but I don't want to do 301 redirects as I don't want to pass bad link juice. I know I'll lose all links and visitors to the original website but I don't care. My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and wait until they don't appear in Google's index. Then I'd move them over to the new domain to be indexed again. Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!
Intermediate & Advanced SEO | sbrault74
What would be the ideal method to handling auto-generated product content across network of dealership websites?
We have recently started work with a dealership group that operates ~20 separate dealerships (different locations and brands) with an individual website for each. The group also operates two umbrella websites for the group brand that show the inventory across all 20 dealerships. All the websites use basically the same template, all product listings come from the same data source (same back-end system), and all websites are currently hosted on the same IP address.

Typically we work with clients to rectify duplicate content issues and work towards having just one version of any piece of content. However, this is a unique situation in that each dealership has a legitimate brand and marketing need for its own website. It is also not realistic to ask the client to create unique content for the same product listing 22x.

We understand there are numerous options to consider, but I would appreciate hearing any advice/feedback from individuals who have dealt with similar situations. If you know of any good resources on such a scenario, that would also help verify our thinking.

NOTE: the duplicate content for product inventory is not across all 22 sites, but usually between 3-4 sites for each product. Often each product listing is shown on 1 or 2 dealership sites and the 2 umbrella sites (one is the main group site and the other a used/clearance product site). Currently we can see multiple domains indexed for the same product listings.
Intermediate & Advanced SEO | BryanSmith
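One option often discussed for exactly this inventory-syndication pattern is a cross-domain rel canonical: each duplicate listing points at whichever copy is chosen as the primary version, so the brand sites stay live for visitors while search engines consolidate signals onto one URL. A hypothetical sketch with placeholder domains and paths:

```html
<!-- In the <head> of the duplicate listings, e.g. on
     https://dealership-a.example/inventory/2018-f150-xlt and on
     https://group-site.example/inventory/2018-f150-xlt -->

<!-- Both point at the copy chosen as the primary version
     (here, the selling dealership's own page) -->
<link rel="canonical" href="https://dealership-b.example/inventory/2018-f150-xlt">
```

Cross-domain canonicals are treated as a hint rather than a directive, so the per-product choice of primary URL should be consistent across all 3-4 copies for it to be honored reliably.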