Posts made by dohertyjf
-
RE: If I change Tags and Categories in Wordpress blog post, will it negatively affect SEO and cause 404s?
-
RE: PDF ranking higher than HTML pages, solution?
Hi there! Great question and one that has perplexed SEOs for years.
You are indeed right that one of the best ways to rank the page over the PDF is to put some of the PDF content on your page and apply other SEO best practices (e.g. build links to the page, give it a better-optimized title than the PDF itself). It is also possible to serve a rel=canonical for the PDF via its HTTP headers - https://moz.com/blog/how-to-advanced-relcanonical-http-headers
Finally, if you don't want your actual PDF showing up in the search results, consider a) blocking it in the robots.txt and then b) requesting the URL be removed from the search index via Search Console.
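If you go the HTTP header route from that Moz post, it's worth double-checking that the header is actually coming back on the PDF. Here's a minimal Python sketch of that check - the URL is hypothetical, so swap in your own PDF:

    # Quick check that a rel=canonical Link header is being served for the PDF.
    # The URL below is hypothetical - substitute your own PDF.
    import requests

    pdf_url = "https://www.example.com/guides/whitepaper.pdf"
    response = requests.head(pdf_url, allow_redirects=True)

    # A correctly configured server returns something like:
    # Link: <https://www.example.com/guides/whitepaper>; rel="canonical"
    print(response.status_code)
    print(response.headers.get("Link", "no Link header found"))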
Good luck!
-
RE: Sitemaps for landing pages
Absolutely no harm at all. Do you have an index sitemap that lists all of the sub-sitemaps? If not, you should set one up as well, just for the sanity of sitemap management.
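In case it helps, an index sitemap is just a small XML file that points at each sub-sitemap. Here's a rough Python sketch that writes one out - the sub-sitemap filenames are only examples, use whatever split fits your site:

    # Write a sitemap index file that points at each sub-sitemap.
    # The filenames below are illustrative only.
    from datetime import date

    sub_sitemaps = [
        "https://www.example.com/landing-pages-sitemap.xml",
        "https://www.example.com/blog-sitemap.xml",
        "https://www.example.com/products-sitemap.xml",
    ]

    today = date.today().isoformat()
    entries = "".join(
        "<sitemap><loc>{}</loc><lastmod>{}</lastmod></sitemap>".format(url, today)
        for url in sub_sitemaps
    )

    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + entries + "</sitemapindex>"
    )

    with open("sitemap.xml", "w") as f:
        f.write(xml)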
-
RE: Sitemaps for landing pages
Hi there! Good question.
First, each individual XML sitemap should only have a maximum of 50k URLs in it. At the scale of millions of pages, I always recommend splitting out your sitemaps by type so that you can monitor indexation by section of the site.
If I were you, I'd create a separate sitemap for landing pages and exclude the PPC landing pages, unless those are the same pages you've created for SEO.
Cheers!
-
RE: Impact of Medium blog hosted on my subdomain
David -
Thanks for your question, and it's one I see often. I would say this is a much bigger question than "subdomain vs. subfolder" - it's really about your ability to control your own SEO.
In direct answer to your questions:
- Since it's on your subdomain, yes. Make sure you have that subdomain verified in Search Console, with sitemaps submitted, parameters controlled, and so on. Also link between your main domain and your subdomain to pass link equity back and forth.
- If Medium changes in the future so that the blog no longer points to your subdomain, and there is no way for you to reclaim your content and republish it on a blog you host yourself, then yes. However, I don't really see that happening anytime soon.
Point 2 brings up the bigger question of whether you should host your blog on Medium at all. While it is indeed a beautiful platform and writing on it is a joy (I actually do a lot of blog drafting in their editor), you don't have control over a lot of things, such as:
- Internal linking within sidebars/top navs to other important places on your own website
- Full branding. I do recognize that you can add a top banner and branding at the top of blogs hosted on Medium, but overall it still looks like a Medium blog (their typeface, their styles, etc.), not like your own brand
If you are concerned about the SEO implications (as you seem to be and should be), I'd definitely recommend investigating a self-hosted blog platform like WordPress instead of Medium.
Good luck!
-
RE: Can't generate a sitemap with all my pages
I definitely agree with Logan. The max for an XML sitemap for Search Console is 50,000 URLs, so you won't be able to fit all of yours into one.
That being the case, divide them into different sitemaps by category or type, then list all of those in one sitemap index file and submit that. That way you can monitor indexation by page type on your website.
Finally, I have to ask why you are doing this with a third-party tool and creating a static sitemap, as opposed to creating a dynamic one that updates automatically when you publish new content. If your site is static and you're not creating new pages, then your approach might be OK, but otherwise I'd recommend investigating how to build a dynamic XML sitemap that updates with new content.
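As a rough illustration of the dynamic approach, the core logic is just: pull the current list of live URLs, split it into files of 50,000 or fewer, and regenerate on a schedule (cron, a publish hook, etc.). The helper that fetches URLs in this Python sketch is a hypothetical stand-in for your own CMS or database query:

    # Split a full URL list into sitemaps of at most 50,000 URLs each.
    MAX_URLS_PER_SITEMAP = 50000

    def get_all_published_urls():
        # Hypothetical stand-in - query your own CMS or database here.
        return ["https://www.example.com/page-{}".format(n) for n in range(1, 120001)]

    def write_sitemap(filename, urls):
        entries = "".join("<url><loc>{}</loc></url>".format(u) for u in urls)
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + entries + "</urlset>"
        )
        with open(filename, "w") as f:
            f.write(xml)

    all_urls = get_all_published_urls()
    for i in range(0, len(all_urls), MAX_URLS_PER_SITEMAP):
        chunk = all_urls[i:i + MAX_URLS_PER_SITEMAP]
        write_sitemap("sitemap-{}.xml".format(i // MAX_URLS_PER_SITEMAP + 1), chunk)

Each generated file would then get listed in your sitemap index.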
Cheers!
-
RE: Duplicate content on recruitment website
Hi Issa -
Great question here. Seems your client is potentially in a tough spot with this!
There is a ton to unpack here and it is hard to know specifics without the site (feel free to private message it to me), but to your specific questions:
- Re: whether it is a problem that the jobs have the same title, that is only something you can answer with the analytics data you have access to. It usually is not a problem, but in this sort of situation I'd also ask whether you have category pages for those terms (e.g. 20 Growth Hacker jobs in SF a day, but also a "Growth Hacker Jobs in SF" category page that all of those individual jobs link back up to).
- Regarding syndication of content, this can cause an issue if not done correctly. You'd have to see where they lost traffic (hopefully you already know), but if the syndicated listings are losing traffic and the non-syndicated ones are not, that points to syndication as the issue. What I've often done is either get the site you are syndicating to to implement a canonical back to your listing, or get a followed link from their version back to yours (there's a rough way to check this sketched after this list). Also, you can be selective about what you syndicate so that it's a partial duplication rather than a complete one, and make your own pages more robust while syndicating only the necessary info if possible.
- Website usability issues can hurt you with Panda, especially if bounce rates are really high. Check those and see if they are high. If they are, you should fix them anyway, because you'll get better conversions. I've also heard of cases where sites made themselves "stickier" and bounced back from Panda.
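On the syndication point, a quick way to audit whether a partner's copy actually canonicals back to your listing is to fetch their page and look for the canonical tag. Here's a rough Python sketch - both URLs are hypothetical, and the regex is a quick-and-dirty check rather than a full HTML parser:

    # Check whether a syndicated copy canonicals back to the original listing.
    import re
    import requests

    original = "https://www.example.com/jobs/growth-hacker-san-francisco-12345"
    syndicated = "https://partner.example.org/listings/growth-hacker-sf"

    html = requests.get(syndicated, timeout=10).text
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )

    if match and match.group(1).rstrip("/") == original.rstrip("/"):
        print("Canonical points back to the original listing.")
    else:
        print("No canonical back to the original - worth raising with the partner.")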
It's hard to know if Panda is still rolling out, but from everything I have heard, it is. I assume this was not just a one-time drop on a single day, but rather a slow leak of traffic? If it's the latter, that makes it harder to investigate.
Good luck!
John
-
RE: Should I submit a sitemap for a site with dynamic pages?
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. Therefore, by not putting your new coupons in the sitemap, you are not giving the search engines one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site, and therefore the processing time, you could regenerate it hourly, every 4 hours, or something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should include how frequently the URLs are updated (the changefreq value), and keep static URLs even for your coupons if possible. This will be a big win for you.
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
RE: Should I delete 100s of weak posts from my website?
This is a very valid question, in my opinion, and one that I have thought about a lot. I even did this on a site before, on a UGC section where there were about 30k empty questions, many of which were a reputation nightmare for the site. We used these criteria:
- Over a year old
- Has not received an organic visit in the past year
We 410d all of them as they did not have any inbound links and we just wanted them out of the index. I believe they were later 301d, and that section of the site has now been killed off.
Directly after the pages were removed, we saw a lift of ~20% in organic traffic to that section of the site. That lift held, and over time that section of the site started getting more organic visits as well.
I saw it as a win and went through with it because:
- They were low quality
- They already didn't receive traffic
- By removing them, we'd get more pages that we wanted crawled, crawled.
I think Gary's answer of "create more high quality content" is too simplistic. Yes, keep moving forward in the direction you are, but if you have the time or can hire someone else to do it, and those pages are not getting traffic, then I'd say remove them. If they are getting traffic, maybe do a test of going back and making them high quality to see if they drive more traffic.
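If you want to pull together a candidate list using the same two criteria we used, here is a rough Python sketch. The CSV export and its column names are hypothetical - adapt them to however your analytics tool exports landing-page data:

    # Find pages that are over a year old and had zero organic visits in the past year.
    import csv
    from datetime import datetime, timedelta

    one_year_ago = datetime.now() - timedelta(days=365)
    candidates = []

    with open("organic-landing-pages.csv") as f:
        for row in csv.DictReader(f):
            published = datetime.strptime(row["publish_date"], "%Y-%m-%d")
            organic_visits = int(row["organic_visits_last_365_days"])
            if published < one_year_ago and organic_visits == 0:
                candidates.append(row["url"])

    print("{} pages match the prune criteria".format(len(candidates)))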
Good luck!
-
RE: Pagination parameters and canonical
Hi Teconsite, this is a great question.
I would not recommend marking the "p" parameter in Search Console. Instead, I'd leave it as "Let Google Decide" and use your pagination SEO implementation to guide the search engines.
There is still a lot of debate around pagination as it relates to SEO. The way I have always implemented it is:
- Every paginated page canonicals to itself, because you do not want the search engines to start ignoring your paginated pages, which are there partly for users but also for SEO.
- Use rel next/prev to help Google understand that they are in pagination, which will also help them rank the beginning of pagination for the terms you are trying to rank for.
- Use noindex/follow on pages 2-N to be sure they stay out of Google's index.
- Use the numbers showing how long pagination is to drive the search engines deep into your pagination and get all of your products (or whatever you list) indexed. This is often done by linking to page 1, the last page, and a few pages on either side of the page you are currently on. So page 7 of 20 would link to page 1, pages 5-9, and page 20 (there's a small sketch of this logic below).
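To make that last bullet concrete, here is a small Python sketch of the page-number logic - the window of two pages on either side is just an example:

    # Which page numbers to link from the current page:
    # always page 1 and the last page, plus a window around the current page.
    def pagination_links(current_page, total_pages, window=2):
        pages = {1, total_pages}
        for p in range(current_page - window, current_page + window + 1):
            if 1 <= p <= total_pages:
                pages.add(p)
        return sorted(pages)

    print(pagination_links(7, 20))  # [1, 5, 6, 7, 8, 9, 20]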
The reason most people say to canonical pages 2-N to the base page is to preserve any link equity pointing to these pages and help the first page rank. However, I have almost never seen a deep paginated page with links, and if you have architected pagination correctly then the equity going into pages 2-N will also flow to page 1, just like product pages linking to category pages.
Hope this helps!
-
RE: Warnings, Notices, and Errors- don't know how to correct these
Hi Gina -
Great questions here. Some of these you should worry about, others are just notices and not necessarily an issue.
Fix the 4XX errors if those pages have links pointing at them; otherwise, make sure you have a helpful 404 page that directs users onward. 404s are not always bad, but if the user isn't supposed to end up there (e.g. your product page has expired), then redirect.
Don't worry about the duplicated meta descriptions on archive pages, but do think about whether these pages are needed at all. Ayima had a good post on pagination recently - http://www.ayima.com/seo-knowledge/conquering-pagination-guide.html
Same as above with the title tags on paginated archives.
Rel-canonicals are fine. Once again, just notices that they are there.
Did you implement those 301s? Moz notifies you of them because they might pass less link equity than straight links, but 301s are not bad.
What do you mean by "These are blog articles but only some of the posts. Don’t know why we are generating this for some and not all. And half of them are for the exact same page so there are really only 4 originals on this list. The others are dupes."? It seems this may have been implemented manually on your side, though I don't know how All In One SEO Pack handles it (I use Yoast).