Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Posts made by Everett
-
RE: Paginated Pages Which Shouldnt' Exist..
You will also have to get those URLs out of the index once you fix the rel="next"/"prev" issue. To do that effectively, they should return a 404 or 410 status code in the HTTP header so Google knows they no longer exist (even though they never really did in the first place). Otherwise you create what is known as a "soft 404": the page doesn't really exist, but it returns a 200 (OK) status code, which is confusing to Google if you don't want those URLs indexed.
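A minimal sketch of returning 410 for phantom paginated URLs like these, assuming an Apache server and a made-up URL pattern (adjust the regex to match the URLs actually stuck in the index):

```apache
# Return 410 Gone for paginated URLs that never really existed,
# so Google drops them from the index instead of seeing soft 404s.
RedirectMatch 410 ^/category/page/[0-9]+/?$
```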
-
RE: Moving pages to new domain
Hello James,
I would advise you to make the new pages as close to the old ones as you can, meaning that you would use the same content and try to keep a similar file structure, including product names. Then make sure every one of your old product URLs has a 301 redirect to the new one on the new site. Also redirect the home page, category pages, and other indexed pages of the old site to their corresponding pages on the new site. Follow that up with some sort of catch-all redirect that ensures every other possible URL someone might end up on from the old site goes to the new site.
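As a sketch of that catch-all, assuming Apache with mod_rewrite and made-up domains: specific product-URL 301s go first in the old site's .htaccess, and anything left over falls through to a path-preserving redirect:

```apache
RewriteEngine On
# Specific mapping first (old product URL -> its new home):
RewriteRule ^old-product-name/?$ https://www.new-domain.com/new-product-name [R=301,L]
# Catch-all: every other old-site URL goes to the same path on the new domain.
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```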
How to do a successful domain migration is beyond the scope of this Q&A format. You should read the following, which will give you a better understanding of the process and things to look out for. And yes, despite what Kevin said, you can move rankings from one domain to another. I've done it many times. There will be a lag of a few weeks to a month, even if you do it right, but eventually Google figures out what has happened.
https://moz.com/blog/website-migration-guide
https://searchengineland.com/site-migration-seo-checklist-dont-lose-traffic-286880
http://www.seerinteractive.com/blog/website-migration-seo-checklist/
-
RE: Former tenant Google Map listing still displays
Hello Kevin,
I assume you've already tried the "suggest an edit" feature on the maps?
If not, that would be a good place to start.
Do you have a Google MyBusiness listing? If not, you should set one up. This may help.
Do you have your address clearly marked on your website? The more you associate your business with that exact address, as typed, the more you help Google figure this mess out.
Do you have your address marked-up using Schema or JSON-LD on your website? See above.
Are you the custom signs and displays company or another company?
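For the Schema question, a minimal JSON-LD sketch you could adapt (the business name, URL, and address here are made up; use your real details exactly as they appear on the map listing):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Signs & Displays",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
```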
-
RE: 302 > 302 > 301 Redirect Chain Issue & Advice
Thank you for sharing the GoDaddy response with the Moz community Andrew.
How many (and which) pages/links is this affecting? Once I know that I should be able to help a little more with prioritization. If this is the way your navigation menu works, for example, then it's a 10. If it's just happening on one page that doesn't have a lot of external backlinks it's a 1.
Google says they follow redirects at least five levels deep and that they treat 302s and 301s the same. In my humble opinion after seeing numerous examples otherwise, this is B.S. It can depend on the response times, how the redirects are implemented, how much trust Google has in your site, and many other things. Long story short, fix it if you can, but I doubt it's going to require switching hosts.
-
RE: I want to load my ecommerce site xml via CDN
Hello Micey123,
That sounds good except you should put the sitemap reference for xyz.abcd.com within that subdomain's robots.txt file as well: xyz.abcd.com/robots.txt, as each subdomain should have its own robots.txt file.
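For example (hypothetical subdomain and file names), the subdomain's own robots.txt would carry its own Sitemap line:

```
# Served at xyz.abcd.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://xyz.abcd.com/sitemap.xml
```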
-
RE: How to track google auto search suggestion click?
Hello Micey123,
Unfortunately, as I mentioned, there is no easy or cheap way to do this. Even with log files, most of the time the keyword data is not going to be available in the referral string.
Jumpshot would be your best bet, but it's not going to be cheap.
-
RE: How to track google auto search suggestion click?
Hello Micey123,
Yes that is what I was talking about above by using the parameters in the URL to pull out the queries. However, that post was written in 2013 and Google is not passing that information through the referral header anymore so I would be surprised if you get anything. Most of it will be (not provided).
-
RE: How to track google auto search suggestion click?
Well, that's where it might get difficult. I forgot for a moment that we live in a (not provided) world where queries don't pass through referrers.
Being on a secure site would help, and you might look into jumpshot.com for clickstream data, though you're probably talking tens of thousands of dollars.
I think the auto-suggestions themselves are a good indication of what people are trying to search for in relation to the other searches. People Also Ask boxes are another one.
Tools like SEMrush, Moz, and STAT Search Analytics can help you see the types of results you're showing up for, and they track SERP features like People Also Ask suggestions.
I know none of this easily solves your problem or answers your question. I'll let you know if I find out something that does. In the meantime the question is still open for others to answer.
-
RE: How to track google auto search suggestion click?
Hello Micey123,
I think the link provided went to a phenomenal post, but there may have been a misunderstanding about what the post was instructing. From what I could tell, it was about tracking your own internal site predictive search, and not Google's.
Assuming you can get the full referrer path, including query string, in GA or via log files, I think one way to approach it would be to separate the queries from last to first, and you'll see the last is probably the original query that was "assisted" (or "interrupted", depending on how you look at it) and the first one in the URL was the auto-complete suggestion that was chosen. Here are a few examples.
This is the URL from my address bar while typing "I'm searching for" on Google, without quotes, and selecting the autocomplete suggestion "I'm searching for something":
First query in the URL string (I'm searching for something):
q=i%27m+searching+for+something
(%27 is the URL-encoded apostrophe, so q=i%27m reads "I'm")
Second query in the URL string (I'm searching for):
q=i%27m+searching+for
This is the URL from my address bar while searching for "baby pandas are" without quotes, and selecting the autocomplete suggestion "baby pandas are ugly". I agree, Google. Hideous creatures.
First query in the URL string:
q=baby+pandas+are+ugly
Second query in the URL string:
q=baby+pandas+are
Same pattern while typing "typing in full" and selecting the autocomplete suggestion "typing in full sentences", and again while typing "this is the first day of my life" and selecting the suggestion "this is the first day of my life lyrics":
First query:
this is the first day of my life lyrics
Second query:
this is the first day of my life
I hope this helps. But there may be an easier way. I'll ask around for you if you'd like, but I want to make sure I understand your needs first. Do I?
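If you can capture those URLs in logs or a referrer field, pulling the two queries apart is straightforward. A sketch in JavaScript, assuming the URL carries the chosen suggestion in `q=` and the typed-so-far query in `oq=`, as in the address-bar examples above:

```javascript
// Split a Google search URL into the autocomplete suggestion that was
// clicked (q=) and the query the user had typed when they clicked it (oq=).
function splitAutocompleteQueries(url) {
  const params = new URL(url).searchParams; // decodes %27, + etc.
  return {
    selected: params.get("q"),
    typed: params.get("oq"),
  };
}

// The "baby pandas" example from above:
const hops = splitAutocompleteQueries(
  "https://www.google.com/search?q=baby+pandas+are+ugly&oq=baby+pandas+are"
);
// hops.selected -> "baby pandas are ugly"; hops.typed -> "baby pandas are"
```

The same function handles the URL-encoded apostrophe cases, since `URLSearchParams` decodes percent-escapes and plus signs for you.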
-
RE: Reviews on Product Page or Separated
I like Logan's answer.
You may, however, consider allowing a single "endlessly scrolling" Reviews page for each product, which:
1. Would NOT have the first X reviews shown on the Product page (sorted in whatever way works best for you).
2. Would perform the role of Review Pagination, though without taking anyone off the product page unless they REALLY wanted to see more reviews by clicking the (see all reviews) link. I think this is pretty similar to how Amazon does it, and they know what they're doing when it comes to maximizing conversions.
3. Would be indexable and optimized for "Product Name Reviews".
4. Would never be "built" unless there are at least X reviews for the product, necessitating pagination.
-
RE: How to exclude URL filter searches in robots.txt
Unless you're specifically calling out Bing or Baidu (with their own user-agent groups) in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
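As a sketch (the paths here are hypothetical): a crawler uses the most specific user-agent group that matches it and falls back to the `*` group otherwise, so Bingbot and Baiduspider read the same rules as Googlebot unless you name them:

```
# Applies to Googlebot, Bingbot, Baiduspider, and everyone else:
User-agent: *
Disallow: /*?filter=

# Only if you call out a crawler does it get its own rules instead:
# User-agent: Baiduspider
# Disallow: /something-else/
```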
-
RE: Is it possible to have organization markup schema for sub domain ? and how should it look like ?
An organization can have multiple brands and websites. If the subdomain is home to a different organization than the root or www. domain you should use the subdomain URL and org name. If it's the same organization, but a different brand, you would use the top level domain and org name. Make sense?
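A JSON-LD sketch of the second case, with made-up names: same organization, different brand living on the subdomain, so the org markup points at the top-level domain:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Organization",
  "url": "https://www.example.com/",
  "brand": {
    "@type": "Brand",
    "name": "Example Sub-Brand",
    "url": "https://brand.example.com/"
  }
}
```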
-
RE: Should I add no-follow tags to my widget links?
Hello Pascal,
Like Hillary Clinton, I have a public opinion and a private opinion when it comes to stuff like this. The public opinion is also Google's, and that is to use a rel="nofollow" attribute on widget links. They are considered links that webmasters use to manipulate the search engine rankings. Yes, that video is old, but the rule still stands as far as Google is concerned.
My private opinion is that widgets are a form of branding, and it is not a webmaster's responsibility to do anything other than get their brand discovered far and wide. You created a widget that, if people are using it, probably provides some value to them. Why should you get any less credit for this than you would get from someone linking to the widget on your site?
If you are going to keep the links followable, my advice is to keep the anchor text branded and the href pointing to your home page. This is the least likely to seem like link-graph manipulation. Avoid deep-links, unless they go to the widget download page, and avoid optimized anchor text. Use "YourDomain.com" or "Your Brand" instead.
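A sketch of both options with a made-up domain (the embed markup surrounding the link will vary by widget):

```html
<!-- Google's guideline: nofollow the widget credit link -->
<a href="https://www.yourdomain.com/" rel="nofollow">YourDomain.com</a>

<!-- Followable variant: branded anchor text, pointing at the home page -->
<a href="https://www.yourdomain.com/">YourDomain.com</a>
```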
I'll leave the question open for more input since this isn't a question that necessarily has a single "right" or "wrong" answer.
-
RE: Using Google to find a discontinued product.
Yes, that would make for a better internet indeed. A lot of these occur because the merchants keep the discontinued product URL in their feed and/or the page returns a 200 status code. Technically this should be considered a "soft 404" since what the user was looking for isn't there.
-
RE: Rankings rise after improving internal linking - then drop again
"My theory is that the uplift generated by the internal linking is subsequently mitigated by other algorithmic factors relating to content quality or site performance or..."
I think your initial analysis of the situation is right. Look to improve user-experience, conversion rates and interactions on those pages and try your experiment again.
I don't like using bounce rate as a metric for this, for several reasons. Instead, look at Time on Site, Pages per Visit, or tracked interactions, such as when visitors scroll past 50% of the page or click a button. There are plenty of ways to gauge whether your changes are providing a better experience for visitors from search results, which in turn should be roughly the same thing that pleases the algorithm.
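A minimal sketch of the "scrolled past 50%" interaction: the threshold check is a pure function, and the wiring below assumes a browser page with the standard GA4 `gtag()` snippet already loaded (the event name is made up; use whatever fits your reporting):

```javascript
// True once the bottom of the viewport has passed half the page height.
function scrolledPastHalf(scrollY, viewportHeight, pageHeight) {
  return (scrollY + viewportHeight) / pageHeight >= 0.5;
}

if (typeof window !== "undefined") {
  let fired = false; // report the interaction once per pageview
  window.addEventListener("scroll", () => {
    if (!fired && scrolledPastHalf(window.scrollY, window.innerHeight,
                                   document.body.scrollHeight)) {
      fired = true;
      gtag("event", "scroll_50_percent"); // hypothetical event name
    }
  });
}
```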