What is the best way to take advantage of this keyword?
-
Hi SEOs!
I've been checking out Webmaster Tools (screenshot attached) and noticed that we're getting loads of long-tail searches around the query 'arterial and venous leg ulcers' - as a side note, we're a nursing organisation, so excuse the subject matter of the search!
The trouble is that Google is indexing a PDF which we give out as a freebie:
http://www.nursesfornurses.com.au/admin/uploads/5DifferencesBetweenVenousAndArterialLegUlcers1.pdf
This PDF is a couple of years old and needs updating, but it's got a few links pointing to it.
OK, so down to the nitty-gritty - we've just launched a blog:
http://news.nursesfornurses.com.au/Nursing-news/
We have a whole wound care category in which this content belongs, and I'm trying to find the best way to take advantage of the search, so I was thinking:
- Create an article of about 1000 words
- Update the PDF and re-upload it to the main domain (not the subdomain news.nursesfornurses.com.au)
- Attach the PDF to the article on the blog
OR would it be better to host this on the blog and set up a 301 redirect to this page?
I just need some advice on how best to take advantage of this opportunity; our blog isn't getting much search traffic at the moment (despite having 300+ articles!) and I'm looking into how we can change that.
I look forward to your response and suggestions.
Thanks!
-
On another side note, the authors often upload PDFs from other sources like government bodies. I've told them to set the links to these PDFs to nofollow, which they're now doing. Is there a quick way to apply nofollow to all the PDFs uploaded to the site, or will I need to do it manually?
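If the uploads sit on an Apache server (an assumption - check with your host), one quick blanket option is an X-Robots-Tag header on every PDF response, rather than touching each link by hand. A minimal sketch:

```apache
# .htaccess in the uploads directory (or site root) - assumes Apache with mod_headers enabled.
# Every PDF served from here gets "noindex, nofollow", so search engines drop the files
# from the index and don't follow the links inside them.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```

Note this applies the directives to the PDF files themselves; if you specifically want the links pointing at them to carry nofollow, the authors would still add rel="nofollow" to those anchors, but the header covers every PDF in one place.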
They also sometimes repost (copy and paste) articles from their partners. Is there an easy (and free) way to scan the whole site and see which pages are duplicates of content from another site and need canonical URLs?
What is the best way to fix the canonical domain issue? Were you referring to the blog (news.nur...) or the main site?
-
Thank you for your response @ATP - it's good to know people are in the same boat and have had success.
@Micheal / @Andy, I really didn't want to be on a subdomain (there's actually an earlier discussion I've had about ways around this), but the dev we're working with can't host PHP on the server, so he's forced us to use a subdomain if we want to use WordPress...
We definitely haven't been ranking that well at all, and I know there's a lot that needs to be done to improve the on-page SEO. I did just add Yoast and updated the /tag/ and /category/ settings.
For the current issue with the PDF, I'm thinking of doing the following:
- Update the PDF, to make it more current
- Create a unique article.
- Then upload the new PDF to the blog, and set up a 301 redirect from the old PDF
Do you think this would pass the link juice to the subdomain?
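If it helps, here's a minimal sketch of what that 301 could look like, assuming the main site runs on Apache and you can edit its .htaccess (both assumptions - your dev would know), and with a made-up permalink for the new article:

```apache
# .htaccess on www.nursesfornurses.com.au - assumes Apache with mod_alias.
# Permanently redirect the old PDF so its existing links pass equity to the new post.
# The destination URL below is illustrative only - swap in the article's real permalink.
Redirect 301 /admin/uploads/5DifferencesBetweenVenousAndArterialLegUlcers1.pdf http://news.nursesfornurses.com.au/Nursing-news/arterial-vs-venous-leg-ulcers/
```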
-
And that is fine. After having done a lot of testing on this, I can say it makes no difference. Create a site, create a subdomain, and that subdomain will then carry the PR from the primary. I have had subdomains ranking in a matter of days (and still there 12 months later) with not a single link to them.
I have yet to see anything that would suggest otherwise.
Edit-- FYI, I much prefer not to use a subdomain myself, unless there is a reason. I prefer to keep everything under the primary domain.
-Andy
-
I don't totally agree that it truly doesn't matter. I'm strongly siding with Rand on this one - https://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday
-
"Your first issue is that the blog is a sub-domain, which means that as far as Google is concerned it is a new website which will have to be optimised and grow independently of the main site."
This isn't the case now, Michael, and hasn't been for some time. Your subdomain will carry all the benefits as if it were a normal part of the link structure. Google says, "Just use whichever one is easier for you."
Matt Cutts talks about them here... https://www.youtube.com/watch?v=_MswMYk05tk
-Andy
-
"I just need some advice on how best to take advantage of this opportunity, our blog isn't getting much search traffic at the moment (despite having 300+ articles!!) and i'm looking into how we can change that. "
Your first issue is that the blog is a sub-domain which means as far as Google is concerned a new website which will have to be optimised and grow independent of the main site.
Within the blog you need to use the Yoast plugin to not index stuff like /tag/, /category/, possibly /author/ - any of these categories that is potentially duplicating your content. You have 1810 URLs indexed.
As a side issue you need to resolve your canonical domain - you have both www and non www.
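For the www/non-www issue, here is a minimal sketch of a server-level fix, assuming the site is on Apache with mod_rewrite available (if not, most hosts or CMSs offer an equivalent setting):

```apache
# .htaccess - force the www hostname so only one version of each URL gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^nursesfornurses\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.nursesfornurses.com.au/$1 [R=301,L]
```

Whichever version you pick, keep it consistent in your sitemap and internal links.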
-
Hi John,
We are in the process of doing a similar thing. We had 15 articles, all PDFs, with one in particular attracting 5k visitors a month to the website (65-80% of our website traffic each month). Our problem was that the PDF didn't encourage them to interact with the rest of the site, and it was a nightmare to update.
After discussion and seeking advice here on the Moz forum, we decided the best solution was to launch a blog (in process), update all the old PDF articles and release each one as a blog post, then redirect the old PDFs to the relevant blog posts, and finally attach the new updated PDF as a download via a nofollow link, so we don't get duplicate content and our customers can still take away the content. So far we have only done it on one article as a test whilst we get the blog set up correctly, and about 1/5 of the 5,000 monthly viewers now view additional pages across our site instead of just the PDF.
If you do it correctly and get the same success with the PDFs that are attracting traffic, you may find you get the boost in traffic your blog is after.
SIDE NOTE: However, if your blog has 300+ articles and you're not getting traffic, I would suggest that moving traffic from other parts of the site to your blog in this manner isn't the answer you seek. Instead, you need to address the problem directly by looking at marketing strategies for your content, as well as reviewing the content in these articles to see if there may be a reason it's not attracting visitors or ranking.