Best posts made by RuthBurrReedy

-

RE: SEOmoz's On-page Checker up to date?

Yes, the on-page optimization tool is up to date with Google's recent updates. You'll notice that the tool will call out when you have over-optimized or keyword-stuffed content. Because the on-page tool doesn't look at inbound links, however, getting an A+ in the tool won't mean you're safe from Penguin penalties - make sure to check your inbound link profile as well.
-
RE: Can a Hosting provider that also hosts adult content sites negatively affect our SEO rankings on a non-adult site hosted on same platform?
Your site should not be affected by other sites hosted by the same provider. Think about huge nationwide hosting sites like GoDaddy - if sharing hosting with websites with questionable content was a problem, a LOT of websites would have a problem!
-
RE: I have a GoDaddy website and have multiple homepages
Jaume is correct that the best solution would be to 1) change that 302 redirect to a 301 (which lets Google know that the redirect is permanent, rather than temporary) and 2) also use a 301 redirect to point http://www.ecuadorvisapros.com/home.html to http://www.ecuadorvisapros.com, since http://www.ecuadorvisapros.com is the home page URL that you want. However, this can be difficult to do if you don't have developer experience, and how easy it is also depends on the website system you're using. Here's an article on the godaddy.com site about how to do it: https://www.godaddy.com/help/using-301-page-redirects-234
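If the site's hosting allows .htaccess rules, the redirect might look something like this - a hypothetical sketch, assuming Apache with mod_rewrite enabled (GoDaddy's site builder may not expose this level of control):

```apache
# Hypothetical .htaccess sketch: permanently redirect /home.html to the root home page
RewriteEngine On
RewriteRule ^home\.html$ http://www.ecuadorvisapros.com/ [R=301,L]
```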
If you don't want to mess with redirects, or can't get the page redirecting properly, you will want to use the canonical tag instead. In the `<head>` section of http://www.ecuadorvisapros.com/home.html, put the tag `<link rel="canonical" href="http://www.ecuadorvisapros.com/" />`. This will let Google know which version of the page is the "official" version. You can learn more about the canonical tag in the Moz learn section, here: https://moz.com/learn/seo/canonicalization.
-
RE: 403s: Are There Instances Where 403's Are Common & Acceptable?
I agree that it's probably not a huge problem, but still something to clean up if you can - it would be best if crawlers weren't trying to access these pages.
-
RE: Should blog tags be location specific?
Tags aren't going to give you a ton of SEO value on specific posts, so I would use them in whatever way makes the most sense for you to organize your content. It shouldn't be necessary to add your location to tags, though, unless you have multiple locations and are posting about the different communities you're in. Even then, I would only add location tags to posts that have some kind of location-related content. If your site overall is doing a good job of saying "San Francisco" (you've got your address prominently featured on your home page and Contact Us pages; you've used schema.org markup to mark it up; you add your city as a suffix to your title tags; you've claimed your local business listings in places like Google My Business, Bing Local, Localeze, etc.) you shouldn't need tags for Google to understand where you are and rank you for location-specific queries accordingly.
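As a sketch of the kind of schema.org markup mentioned above - the business name, address, and phone number here are placeholders - a LocalBusiness block in JSON-LD might look like:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-415-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "San Francisco",
    "addressRegion": "CA",
    "postalCode": "94105"
  }
}
```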
-
RE: Really, is there much difference between an unnatural links warning and Penguin?
The main difference between the two is that a reconsideration request is more likely to work with a link warning than with a regular Penguin hit. Penguin is algorithmic, whereas the link warnings were usually triggered by/resulted in manual penalties. Either way, it's a good idea to try to get as many spammy links removed/updated as possible, as well as build some new, non-spam links to increase the percentage of your links that are not spammy.
I wouldn't suggest building more spammy links to drown out the Penguin-targeted links - why not spend that time and effort building some natural links? They will last longer and if you do have to do a reconsideration request you're not running the risk that Google will also see your brand-new spam links.
-
RE: I have a GoDaddy website and have multiple homepages
You are encountering something that happens from time to time in communication about websites: as far as GoDaddy is concerned there is only one "page" - one document that is your home page - but that page lives at several different URLs, which is what's causing Google and Moz tools to recognize it as several different pages. It's more of a terminology mismatch than anything (although a sarcastic response is never in good taste - not the best customer service there)!
As far as website content management systems go, I always recommend WordPress. It's very customizable, but there are many great out-of-the-box templates that work well for SEO too. There are several great plugins for WordPress that help make SEO changes easier - two good ones are WordPress SEO by Yoast and the All-in-One SEO pack. If you want to change your website's hosting as well, do a bit of research - many hosting companies will help you with step-by-step instructions for moving your site and installing WordPress.
-
RE: Disappearing Links Black Hat ?
The competitor should have no way to disavow links to your client's site. Even if they had somehow managed to disavow the links, the links themselves would still be in place. Have you checked the linking pages? Do they still exist? Can you tell why the links were removed? Do you have a good enough relationship with the owners of any of the linking sites to be able to reach out to them and see if you can get the links put back, or at least see why they were removed?
-
RE: Snippet showing as domain name with apostrophe, instead of page title when searching for the domain name.
I have definitely seen this before - it's been happening more frequently in the last ~3 years. Here's a piece from Search Engine Land a few years back on it: http://searchengineland.com/google-title-wrong-157819.
-
RE: Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
Is the problem that the page isn't appearing in the index, or that it isn't ranking for its target terms?
If the page has a lot of images but doesn't otherwise have much copy, it may be that Google is determining it to be too similar to other pages on your site and so is not displaying it. If it's not being indexed at all (doesn't show up in a site: search or when you search for a block of copy in quotations), double-check that your robots.txt isn't blocking it and that you don't have a meta robots noindex tag on the page. The suggestion of running Screaming Frog on your site to make sure a crawler can find the page is a good one - Screaming Frog will also tell you if the page is returning a weird HTTP status or is blocked by robots.
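For reference, the two blocking mechanisms to check look like this (the path in the robots.txt comment is just an example):

```html
<!-- A meta robots noindex tag in the page's <head> keeps it out of the index -->
<meta name="robots" content="noindex">
```

In robots.txt, a line like `Disallow: /section/` under a `User-agent:` block would stop crawlers from fetching that path at all.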
-
RE: How to check if an individual page is indexed by Google?
This is what I would do, too: search for a large chunk of text from the page and see if the page comes up. Site: is not always 100% accurate.
-
RE: Disappearing Links Black Hat ?
I would recommend taking a look at your historical referral data, to see what sites have sent traffic to this site historically that aren't anymore - that will give you an idea of where the links were that went away. Even if you can't get the old link back, you may be able to form a new relationship with those sites' owners, which could result in future link opportunities.
It sounds like, whatever the cause, the site has had a bunch of inbound links to it disappear. This is frustrating, but without knowing where the specific links were, it's going to be hard to get them back. I would focus on getting some new, high-quality, high-authority links to the site. Since you've got this new aggressive competitor, you could use Open Site Explorer to see where they're getting links from, and see if any of the sites that link to them would be good candidates for link outreach from your team as well.
-
RE: Keyword stuffing in the meta keywords tag
Here's a video from Google Webmaster Tools confirming that the meta keywords tag is not used for ranking: http://googlewebmastercentral.blogspot.com/2009/09/google-does-not-use-keywords-meta-tag.html
About the only use for a meta keywords tag these days is giving your competitors a way to scrape your tags and see which keywords you're targeting on which page!
-
RE: Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
Ah OK, thanks for the clarification!
That problem, to me, sounds like you need some links! In general when Google is ranking your home page for a term, instead of the page that is actually about that term, it's because they recognize that your site has some topical relevance for that term, but the individual page doesn't seem that important based on how many links are pointing to it. Are there ways you can flow some additional internal link juice to that page? Are there sites that are linking to your home page right now that are very closely related to the topic of the page in question, that you could ask to point to that page instead? Are there topically-related sites that don't link to that page right now that you could possibly get a link from? All of these will beef up your page authority, which should help.
In terms of your copy being too far down on the page - if you don't think it will negatively impact your user experience, you could try moving it up, or integrating it into the section with the images, but I don't know how much that will help. You also may need more copy on the page - if your page is 300 lines of code long, and only 5 of those lines are unique copy, it's hard to send a strong enough signal of relevance. Can you expand what you say on the page to make it a better resource on the topic at hand?
-
RE: Site Migration from One Dev. and Server to Another Dev. and Server
You're right to be against the subdomain idea - that's not a good call for SEO at all.
What I would do is start out with a full database export of all the URLs on the current site, and figure out what URL each of those resources is ultimately going to live at. This can be daunting with a large site, but it goes faster than you think it's going to - once you figure out that all the pages in example.com/old-folder/pagename are now going to live at example.com/new-folder/pagename you can figure out the URL structure for large chunks of the site at a time. Since it sounds like there won't be any changes to the overall design and structure of the site, just some possible URL changes, that will make it easier, too. I did this for the SEOmoz.org -> Moz.com transition and it took about a month to map out 65,000 pages alongside my other SEO duties (but that was with a lot of major changes in site structure, too).
Once everyone (you, the client, both dev teams) has agreed on the new structure, it's simply a matter of:
- moving the pages in each "chunk" from their old URLs to their new URLs
- 301 redirecting the old URLs to the new URLs on a page-to-page level
- doing a database find+replace on the old site and the new one to update internal links to those pages
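The mapping step above lends itself to automation. As a sketch - the folder names and output format here are hypothetical - you could generate your redirect rules directly from the old-URL → new-URL map:

```python
# Hypothetical sketch: turn an old-URL -> new-URL map into Apache "Redirect 301" rules
url_map = {
    "/old-folder/pagename": "/new-folder/pagename",
    "/old-folder/other-page": "/new-folder/other-page",
}

def build_redirects(mapping):
    """Emit one page-to-page 301 rule per entry, sorted for a stable config file."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in sorted(mapping.items())]

for line in build_redirects(url_map):
    print(line)
```

Keeping the map in a spreadsheet or database and regenerating the rules makes it easy to move the site one "chunk" at a time.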
Be really, really careful with managing expectations for this. It's very common to see pages take a hit in rankings and traffic immediately after they move to a new URL; this drop is usually temporary and reversible. But you don't want the client taking that dip as proof that the migration isn't working and abandoning ship. To help matters along, take a look at their best inbound links and the linking sites with whom you have the best relationships, and as those pages move to their new addresses, reach out to those sites to try to get the links updated.
Does that answer your question? Happy to discuss further if not.
-
RE: Anchor name URLs & anchor blocks: how Google sees them?
Google is probably showing "Jump To" not because it shows up anywhere on the page, but because they understand that that is the function of the anchor links on-page. So you probably won't be able to change the page to get it to say something besides "Jump to" in the SERP - and that's OK, because your section head "is an umbrella company the only option I have?" is showing up in the snippet, and that is more important.
Here's a piece on Search Engine Land about this phenomenon: http://searchengineland.com/google-jump-to-links-within-search-snippets-26603, and Google Webmaster Central on it: http://googlewebmastercentral.blogspot.com/2009/09/using-named-anchors-to-identify.html. It looks like Google includes "Jump to" in the snippet to let users know that they will be taken to a point in the middle of the page (through the anchor link) instead of to the top of the page.
-
RE: Help! The website ranks fine but one of my web pages simply won't rank on Google!!!
Don't forget that every keyword is different - how you rank depends on what you're doing compared to other sites targeting that term, not just what you're doing on your own site. So some keywords just take a larger, higher-authority link profile to rank for than others. A good place to start with getting links for that page would be to look at the backlinks that other pages that rank for that term have - you may be able to get some links from the same or similar sites.
-
RE: Adding Reviews to JSON Product Schema Markup
AH! OK, gotcha. In that case, Martijn was right - you'll need to add the Review type. Required fields for the Review type are:
- reviewBody (text)
- reviewRating (of type: Rating)
- author (of type: Person or Organization)
So the markup would look something like this:
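(The example that originally followed isn't preserved here, but a minimal sketch of a Review nested inside Product markup - product name, review text, and reviewer are all placeholders - would be:)

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "review": {
    "@type": "Review",
    "reviewBody": "Great product, works as described.",
    "reviewRating": {
      "@type": "Rating",
      "ratingValue": "5"
    },
    "author": {
      "@type": "Person",
      "name": "Jane Doe"
    }
  }
}
```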
-
RE: Moz not crawling opencart product pages
Hi Marlon,
There are a lot of different issues that may cause Rogerbot to be unable to crawl your site. For troubleshooting tips, I suggest reading http://moz.com/help/pro/why-can-t-rogerbot-crawl-my-site. Hope that helps!
-
RE: Dealing with Omitted Page
Usually a page is omitted because it's very similar to another page on your site, and Google has determined that the other page is more important. Since the category page is probably higher up in your site architecture, it would make sense that Google would rank the other page. Now that you've put the effort into making your landing page more unique, I would give that a chance to rank. One thing you can do is find additional places on your site to point some internal links to your landing page, and try to build some links from other sites to that page as well. That will help you tell Google "hey, this page is important." Good luck!
-
RE: Moving to a new domain name - 301 redirect NOT an option
Using the Change of Address feature in Google Webmaster Tools and having your host point the DNS at the new site will help Google understand that the site has moved. However, you will probably see URLs from the old site outranking URLs from the new site for a while. If you can't create entirely new content for the new site, it would still be worth refreshing and rewriting the content for at least your highest-traffic pages, so they aren't exact duplicates.
Reclaiming as many links as possible by reaching out to get them pointed to the new site, rather than the old site, is definitely worth doing. You should also start thinking strategically about how you're going to market the new site to earn more links once it launches, to make up for the links you can't get re-pointed to the site.
It will probably take a few months for the new site to start performing well and outranking the old site, but it's definitely possible.
-
RE: Good robots txt for magento
This is fine, as long as you don't want to exclude robots from crawling any part of your site.
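(The robots.txt file under discussion isn't quoted here, but a file that permits all crawling - which is what "this is fine" implies - looks like this:)

```
User-agent: *
Disallow:
```

An empty `Disallow:` value means nothing is blocked; adding a path after it, e.g. `Disallow: /checkout/`, would exclude that section from crawling.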
-
RE: Effects of a long-term holding page/503 http code whilst site is being rebranded?
Hi Matt,
I think setting up a 503 HTTP code on the holding page and then using 302 redirects to point all pages to that page is a viable option. You could also consider having every page return a 503 (make sure your robots.txt file returns a 503 as well, as that will keep the search engines from continuing to crawl). The pages on the site will most likely fall out of the index while you're returning a 503, but that's OK since there won't be anything for your users to find anyway.
The key here is to add a Retry-After header with the GMT date and time your site will be available. That lets Google know when to come back and that the site isn't actually down/returning a 503 forever. Yoast has a great post on this at http://yoast.com/http-503-site-maintenance-seo/ which I'd recommend checking out.
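A hypothetical sketch of that setup on Apache - assuming mod_headers and mod_rewrite are available; the holding-page path and date are placeholders:

```apache
# Serve the holding page for every request, returning 503 with a Retry-After header
ErrorDocument 503 /holding-page.html
Header always set Retry-After "Sat, 01 Jun 2024 12:00:00 GMT"
RewriteEngine On
# Let the holding page itself load normally so the 503 body can be served
RewriteCond %{REQUEST_URI} !^/holding-page\.html$
RewriteRule ^ - [R=503,L]
```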
-
RE: WMT data vs. Analytics
In terms of the correlations between rankings and traffic, one indicator is to look at how individual pages rank and then how much organic traffic each page gets - if it's ranking for some high-traffic terms, you can safely assume that those terms are driving some traffic. As for reporting in general, Bill Sebald just wrote a great post on this for the Moz blog: https://moz.com/blog/are-your-analytics-telling-the-right-story Hope that helps!
-
RE: Delay release of content or fix after release
I would definitely at least clean up the article HTML and structure before launching the pages, since you don't want people who might land on them before they're updated to have a weird experience. As far as optimizing them for SEO, I think you could go ahead and make the pages live and roll out edits as you make them. Prioritizing the pages based on highest-traffic/best-converting first is the way to go. If switching your platform is going to make your site easier to crawl, you definitely want to do that sooner rather than later - plus, having the new pages live will allow them to start accumulating some links even before you make keyword-related changes.
In general with a major change like this I recommend changing as few other things as possible simultaneously. It's OK to make more gradual changes, and it gives Google fewer things to get used to at one time.
-
RE: Some results disappeared after a while
It's fairly common for Google to "test" ranking a brand-new page pretty high for a relevant query, and then after a little while, for that page to settle into where it will be ranking long-term. The good news is that Google is finding your new pages and understanding their targeted queries; the bad news is that they're not sending enough of the signals Google needs for them to rank that well long-term. Some things that will help:
- Work to boost your organic click-through rate through better calls to action in your title and description tags.
- Look at what happens when people click through to the page from organic search, during those first few days when the page ranks well. Are they spending a decent amount of time on the page? Are they moving deeper into the site? Or are they bouncing right away? It's likely that Google is using user data such as click-through rate, pogo-sticking and dwell time to determine whether or not the page is a good result for that query.
- Make sure the pages you're creating are robust and provide a complete answer to the query, and, if possible, point users to a next step in their research or decision process.
- Brooks is correct that building up your domain authority over time will also help you rank better, so that should definitely be part of your long-term plans.
-
RE: WMT data vs. Analytics
In my opinion, monitoring every phrase your page ranks for isn't a sustainable goal - there are too many variations and long-tail phrases being searched for. In terms of rank tracking, I tend to focus on the one or two higher-traffic terms I'm really targeting for a page, trusting that that ranking will translate into a halo effect of longer-tail terms and topical authority.
Rather than focusing on finding all of the phrases driving traffic to your page, try focusing on how much organic traffic each landing page is getting, and whether or not that traffic is doing what you want people to do when they reach that page. This will help you discover if the page matches the intent of the terms people are searching for. You should also take a look at the query data in analytics/search console to see if there are topics that your home page is appearing on page 2 or 3 for that you could create a more specific piece of content around - it's a sign that Google already understands your site is related to a topic, and just doesn't have a good page to send people to.
-
RE: Javascript onclick redirects / porn sites...
Hi Marcy,
If the sites are using your brand name and/or other brand terms, and your brand is copyrighted, you may be able to file a Digital Millennium Copyright Act takedown request with Google: https://www.google.com/webmasters/tools/dmca-notice?pli=1. As they note in the description on the tool, be very clear about whether the other site's actions actually constitute a violation of your copyright before filing the request.
I think it's unlikely that these new sites are impacting your site's performance in search - I was a little unclear about the JavaScript redirect, though (I'm at work and don't want to click on the links you posted on my work computer). Is it redirecting from their site to your site, or from their site to another site that is the porn/junk site? If it's the latter, that shouldn't be affecting your site at all. If it's the former, you may want to file disavow requests at the domain level for those sites just in case.
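For reference, a domain-level disavow file (the domains here are placeholders) is just a plain text file uploaded through Google's disavow tool - one `domain:` line per site:

```
# Disavow every link from these sites at the domain level
domain:junk-example-one.com
domain:junk-example-two.net
```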
If your drop in rankings was caused by these new sites, I would expect to see a drop in performance across the board, rather than for specific queries, so I recommend that you keep digging on other reasons for the drop. I would take a look at the sites that are ranking now for the terms you've lost rankings for. How are they different from your site? What sites are ranking now that weren't ranking when you were on top? It may be that Google has decided that your site doesn't fulfill the search intent for those keywords, so taking a look at the sites that rank now will give you some insight into the kinds of pages that Google wants to rank for these terms. Since these were highly-converting terms for you, consider investing in PPC ads for these terms while you work to regain your organic presence. Good luck!
-
RE: Schema markup in tag manager for multiple locations not registering in tester tool or search console
Having one Organization tag and then marking up each location with LocalBusiness is how I would do it. It's hard to say what might be going on without knowing a bit more about your implementation. Is all this markup on one page (e.g. an "Our Locations" page), or is each tag firing on a separate location page? You mentioned that some of the markup is showing up in GSC and some isn't - do the entities/markup that are showing up in GSC have anything in common? Are you using static JSON-LD markup in Google Tag Manager, or using variables to generate it dynamically?
-
RE: Branded vs non branded keyword question
Recently Google has been taking a closer look at anchor text, because exact keyword anchor text isn't really how "natural" links (i.e. links you didn't build) look. Ben has a great point that natural link growth usually means anchor text that is either your brand name or your URL. Rob Kerry recently did a Whiteboard Friday on the Penguin update that may shed some more light on your predicament: http://www.seomoz.org/blog/the-penguin-update-whiteboard-friday
A majority of your inbound links should not have exact keyword match anchor text. Instead, focus more on building overall link volume, both for your domain and individual pages. Try to get links from high-authority sites, and create pieces of content that people want to share and link to - these are link building strategies that will be more successful than a hyper-focus on anchor text, whether branded, keyword or "click here."
I think the steps Ben outlines above are a great start.
-
RE: Duplicate Page Titles For Paginated Topics In Blog
Hi John,
As long as you've implemented rel=prev and rel=next markup on your paginated content, having duplicate title tags on these pages shouldn't be a problem at all. If you're really concerned about it, you could update the title tags on subsequent pages to say e.g. "Page 2" at the end, but if you've already marked it up as paginated content it should be fine. Hope that helps!
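For reference, that pagination markup goes in the `<head>` of each page in the series - the URLs here are hypothetical:

```html
<!-- On page 2 of a paginated blog archive -->
<link rel="prev" href="https://example.com/blog/tag/widgets/">
<link rel="next" href="https://example.com/blog/tag/widgets/page/3/">
```

The first page in the series carries only `rel="next"`, and the last page only `rel="prev"`.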
-
RE: What is Hub Linking
The basic idea behind hub linking is that certain websites are "hubs" for information - one place you can go to get resources on a topic. These domains often link out a lot, because they want to showcase other websites with good information on the topic, and they often have good page authority because people link back to them.
One thing to look out for when engaging someone to do hub linking: make sure the sites you're getting links from are actually relevant resources that real people use. A lot of "hub" pages are created solely for SEO purposes and tend to have thin content and too many links, which can signal to Google that the link is a shady one. So it's important to be judicious in selecting a hub page to target for links, and I would advise against trying to create one yourself unless you have a long-term strategy for how to continue adding content and making it useful.
-
RE: Why Isn't Product Schema Showing Up for my Ecom Site?
Hi! So my first question would be: Are other websites in the same SERP displaying rich product snippets? If not, it may be that Google has decided that other forms of rich content are a better fit for those queries.
One frustrating thing about the structured data testing tool is that it doesn't get updated nearly as often as the algorithm itself does, so occasionally something will validate just fine in the tool but be running up against stricter requirements in the algo itself. The tool was updated pretty recently so I don't know for sure that that's the case but it is a thing that can happen.
If you'd feel comfortable sharing a URL or two (or want to PM them to me), I can take a look.
-
RE: Page links, header links, footer links
If you have more than one link to the same page on a given page, it won't be considered duplicate content necessarily - but it also won't pass any additional page equity.
-
RE: Irrelevant backlinks - will 301 redirect cleanse the relationship?
Cleaning up the backlinks is by far the best option, and regardless of what else you do I recommend setting that in motion - but it sounds like a 100% success rate is pretty unlikely (and it usually is).
If you don't need any pages in the problem directories to be indexed, I'd definitely consider noindexing the /event/ directory and any other directories that are causing problems. You may also want to disavow the old backlinks on a domain level, which will take less time than doing it on a link-by-link basis. If you are going to do this, be warned that it has the potential to hurt your rankings - these links may be causing a penalty risk now, but they may also be passing value to your domain that, once removed, will cause the domain to slip. If you do decide to go that route, I recommend coupling it with a concerted link building effort - have a plan for several months of link-worthy content and a solid promotion plan to get new, more-relevant links to the domain.
-
RE: Secondary related keywords
"Wine tasting" is a very broad topic. Do a Google search for "wine tasting" in an incognito or private window, and you will see that Google serves up local results for places to go wine tasting in your area, meaning that Google understands the term to have local intent - people who search "wine tasting" are looking for places nearby to go wine tasting.
For a term with local intent, you want to make sure that your Local SEO is strong - that your website clearly says your name, address, and phone number; and that your local listings are claimed using a tool like Moz Local. By targeting the term "wine tasting in Bordeaux," you are effectively targeting the term "wine tasting" for the people you care about. You wouldn't want to rank for "wine tasting Chicago" since people from Chicago won't come to you for wine tasting.
For a very broad term, it's always best to narrow it down to a related keyword. You can certainly look at "wine tasting bordeaux" in the keyword tool and see what comes up. The volume for the keywords will be smaller, but it will be much easier to rank for that term and you will know that people who search for it are the people you want to reach.
If you are targeting a more specific term like "wine tasting cellars," you will also be adding the term "wine tasting" to your page a lot - since you can't say "wine tasting cellars" without saying "wine tasting"! Focus less on using the exact keyword phrase every time, and more on using natural variations of the term. This will make your content look more natural to both users and search engines. If you were writing a page targeting "wine tasting in bordeaux" and "wine tasting cellars," and were just writing about the topic without thinking much about keyword use, you would find yourself using variations of the terms, like "bordeaux wine," "wine cellars," "wine tasting," "wine tasting in southern France," etc. That is an OK thing to do, and even a good thing to do! Google expects to see these types of related terms on the page. You just want to make sure you're using your target keywords in prominent places on the page such as your title, heading tags, and the first few paragraphs, and that you're creating great content on the topic you've chosen.
Think about people who are searching for those terms. What kind of information would they want to see? This might be things like: What time is the tasting room open? What sort of wines can they taste there? Is there a tasting fee? Where is the winery located? What is the best way to get there? Including this sort of information provides a complete answer to their query, which is what Google is looking for.
I hope that is helpful!
-
RE: Why Isn't Product Schema Showing Up for my Ecom Site?
The first thing that stands out to me on these pages is that you have all of the other products that the page links to under "Related Items" marked up as well. So there are multiple values for Product and for Offer on the page. The first thing I would test would be removing this markup for the Related items, so that all the Product markup on the page applies only to the product that page is about. I suspect that Google may have gotten stricter about requirements for this, or otherwise changed the way they generate price/review snippets. So that's what I would test first. It can work to have multiple Product schemas on a category page, but on a page that is specifically about one product I would try to have all of the Product markup on that page be about that product.
The other thing you may want to update is your height and width properties. It looks like you're currently using the type "Intangible" for these, when their expected types would be either QuantitativeValue or Distance, so I would update those too.
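As a sketch, switching those properties from Intangible to QuantitativeValue might look like this - the values and the UN/CEFACT unit code (CMT = centimeters) are placeholders:

```json
{
  "height": {
    "@type": "QuantitativeValue",
    "value": 30,
    "unitCode": "CMT"
  },
  "width": {
    "@type": "QuantitativeValue",
    "value": 20,
    "unitCode": "CMT"
  }
}
```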
You may also find this blog post useful: https://moz.com/blog/search-marketers-guide-to-itemref-itemid
Good luck!
-
RE: Domain switch planned - new domain accessible - until the switch: redirect from new to old domain with 307?
Andy's suggestion would work just fine - but you might also consider building some kind of landing page for the new site to preview the change (if it won't wreck your brand launch strategy to do so). When we did the switch from seomoz.org to moz.com, we had a page up teasing the new brand and even collected email addresses for our community to be notified when the new site launched. Not only would this solve your current issue, it would also allow Google to start crawling your new site sooner. You could even do some PR to build some links to the new domain in advance of the switch!
-
RE: Schema for Product Categories
Hi Mike,
You're correct that the Product markup is really intended for individual items, not a category of items. Under "Multiple Entities on the Same Page" at https://developers.google.com/structured-data/policies, Google suggests that you mark up each item on the page individually. Other than that, yeah, not much else to mark up. Hope that helps!
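As a rough sketch of what "mark up each item individually" can look like on a category page (the product names and URLs here are made up), each listing gets its own scoped Product entity in microdata:

```html
<!-- Hypothetical category page: each listed product gets its own Product markup -->
<ul>
  <li itemscope itemtype="https://schema.org/Product">
    <a itemprop="url" href="/widgets/red-widget">
      <span itemprop="name">Red Widget</span>
    </a>
  </li>
  <li itemscope itemtype="https://schema.org/Product">
    <a itemprop="url" href="/widgets/blue-widget">
      <span itemprop="name">Blue Widget</span>
    </a>
  </li>
</ul>
```

Because each `itemscope` starts a new entity, the properties inside one list item can't leak into another product's data.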
-
RE: Search engine blocked by robots-crawl error by moz & GWT
When was your last crawl date in Google Webmaster Tools/Search Console? It may be that your site was crawled with some kind of problem with the robots.txt and hasn't been re-crawled since.
-
RE: Google ignoring Canonical and choosing its own
It's important to remember that Google in general takes canonical tags as more of a suggestion than a rule; they may decide that another page deserves to rank instead. Take a look at the version of the page that ranks: does it have more external or internal links pointing to it? You may be able to build up your canonical page by directing some additional link juice that way.
If it's all the same to you which version ranks, it might be easier to just take the hint and make the ranking page the canonical page; otherwise, it may take some time to build up those off-page signals to get that version to rank.
-
RE: Robots.txt and redirected backlinks
A noindexed page can still accumulate and pass link equity, although results vary on whether or not some of that link juice "evaporates" along the way. I'm inclined to agree with Chris, though, that there's probably no need to noindex a page that redirects to a page that you do want indexed.
-
RE: Aggregate review schema
Hi Nico,
Tim's example should get you what you need to mark up your aggregate ratings correctly. I wanted to take a minute to address your other questions.
Anything between the opening and closing tags in an example is something you should customize to your content. So in the example above, `<span property="name">Super Book</span>`, you would replace "Super Book" with whatever the name of the product being reviewed is. For any example of Schema markup, if the example includes information that isn't on your page, you can just delete those properties.
For Publisher markup, the "publisher" isn't the person who wrote the review, it's the website as a whole (that's you) that is publishing the content.
In terms of whether or not Google will include the ratings snippet since it can't verify whether ratings are real: in my experience they generally will, especially if you have a good volume of reviews.
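For reference, a minimal aggregate-rating sketch in the same RDFa style as the example above might look like this (the product name, rating value, and review count are placeholders, not real data):

```html
<!-- Hypothetical example: name, rating, and review count are placeholders -->
<div vocab="https://schema.org/" typeof="Product">
  <span property="name">Super Book</span>
  <div property="aggregateRating" typeof="AggregateRating">
    Rated <span property="ratingValue">4.5</span> out of
    <span property="bestRating">5</span> based on
    <span property="reviewCount">27</span> reviews.
  </div>
</div>
```

Note that the AggregateRating entity is nested inside the Product, which is what ties the rating to the item being reviewed.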
-
RE: Canonical Query
Google can definitely choose to ignore the canonical tag, especially if they think that the page in question is a better solution to a query. I agree with the other respondents that the best possible solution would be to fix this at a code level, so the duplicate content isn't an issue on your site anymore. In the meantime, some things to try:
- Make sure that your internal hierarchy makes the canonical versions more important than the duplicate versions, i.e. they appear farther up in your site nav and have more internal links pointing to them.
- Try building some external links to those pages as well, where you can.
- Make sure that the pages your canonical tags point to are very similar to the pages the tags are on - if they're too different, Google may decide they both need to be indexed.
Are any of the duplicate pages receiving organic search traffic? If not, it may be that Google has indexed them but understands they're not as important. Again, though, the best possible solution would be to fix this at a code level.
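For completeness, the canonical tag itself is a single line in the `<head>` of each duplicate page, pointing at the preferred URL (the URL below is a placeholder):

```html
<!-- Hypothetical URL: place this in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```

Remember that this is a hint, not a directive, which is why the on-site and link-based signals above still matter.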
-
RE: Hammered by Spam links
Hi Matt,
To clarify - these are internal links on your own site that have had the anchors go all weird? Or are they external links?
-
RE: 301 vs 410 for subdirectory that was moved to a new domain, 2-years later
Google is adding and removing URLs from its index fairly slowly right now, and it's not uncommon for changes to take several weeks to filter up into the index, especially for site: searches. This is very annoying (even more so for people who are trying to launch brand-new sites), but not a huge deal since, to Laura's point, these URLs are most likely not showing up for any searches, they just haven't filtered out of the index. I would give it another week or two and see what happens. You may also want to do a Fetch+Submit in Search Console for a few of the subdirectory URLs, to make sure that Google revisits them and registers that they are 410s now - if they've been redirecting for 2 years, Google may just not be crawling them that frequently.
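If the site happens to run on Apache (an assumption; other servers have equivalents), serving an explicit 410 for the retired subdirectory can be done with a mod_rewrite rule along these lines, with the path below being a placeholder:

```apache
# Hypothetical path: return 410 Gone for everything under the retired subdirectory
RewriteEngine On
RewriteRule ^old-subdirectory/ - [G,L]
```

The `[G]` flag tells Apache to respond with 410 Gone, which signals to Google that the removal is deliberate and permanent rather than a temporary error.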
-
RE: Legacy domains
Hi Dan,
It's important to remember that Domain Authority is a Moz-specific metric, measuring the power that inbound links to a domain are passing to it. It's based on a lot of research and information about how Google measures links, but isn't necessarily a reflection of how Google is actually perceiving and rating the inbound links to your site (since no outside party knows for sure exactly how Google is measuring link signals at any given time). Since those links still point to the legacy domains, they are still coming up in the Moz tools as having high DA. It's probable that Google is interpreting those signals passing through your redirects correctly and passing that link value on to your main domain, but if those redirects go away, that may change.
URLs will often remain in the index when they have links pointing to them, because Google is indexing the presence of that link - but a URL showing up in a site: search doesn't necessarily mean that it's ever showing up for any query. Is your main domain still getting much (or any) traffic from your legacy domains through those redirects? How much traffic is still getting to those legacy domains (whether via organic search traffic to still-indexed pages or, more likely, from people clicking on inbound links to the old pages) will dictate whether or not you need to keep the redirects live.
One thing I would recommend doing, that I always recommend when a domain is moved, is reaching out to the sites that link to your old domain and seeing if you can get them to update the links to the new domain. You won't have anywhere close to a 100% success rate with this, but it can decrease the number of links that are passing value through redirects and increase the number that pass value directly.
If you do decide to decommission the server and not re-host the redirects elsewhere, I would recommend planning that move in conjunction with a link building and promotion campaign for the new site, to attract new links to make up for any link juice that is lost from the old domains' redirects. I hope that helps!