According to John Mueller, the answer is no (at least in the long term)
https://www.seroundtable.com/google-long-term-noindex-follow-24990.html
Job Title: Product Manager
Company: uncommon goods
Favorite Thing about SEO
Problem Solving, Dynamic Environment, Knowing secrets of the web
Hi Moz Community,
Are Bing/Yahoo crawlers different from Google's crawler in how they process client-side JavaScript, and especially content/data loaded by client-side JavaScript?
Thanks,
Hi Moz Community,
I have a question about content personalization: can we serve personalized content without being penalized for serving different content to robots vs. users? If the content starts in the same initial state for all users, including crawlers, is it safe to assume there will be no SEO impact, since personalization won't happen for anyone until there is some interaction?
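To be concrete about the pattern I have in mind (a minimal sketch; the element IDs and endpoint below are made up for illustration, not our real setup), the initial markup is identical for everyone, and personalization only runs after a real user interaction:

```javascript
// Minimal sketch: every visitor, including crawlers, receives the same initial
// markup; personalized content is only swapped in after a user interaction.
// The element IDs and the /api/personalized-recs endpoint are hypothetical.
document.getElementById('recs-toggle').addEventListener('click', async () => {
  const res = await fetch('/api/personalized-recs');
  const items = await res.json();
  const list = document.getElementById('recs-list');
  list.innerHTML = items.map(item => `<li>${item.name}</li>`).join('');
});
```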
Thanks,
Hi Moz Community,
Is there a proper way to do a SPA (client-side rendered) and a PWA without a negative impact on SEO? Our dev team is currently trying to convert most of our pages to an Angular single-page application, client-side rendered. I told them we should use a prerendering service, or server-side rendering instead, since that would ensure most web crawlers can render and index all the content on our pages even with the heavy JS use (and it would also cover users who have JS disabled). Is there an even better way to do this, or some best practices?
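To illustrate the prerendering route I suggested, something along these lines is roughly what I had in mind (a rough sketch only; the Express setup, bot pattern, and prerender endpoint are placeholders, not our actual stack):

```javascript
// Rough sketch of user-agent-based dynamic rendering.
// The bot pattern and prerender endpoint are illustrative placeholders.
const express = require('express');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

app.use(async (req, res, next) => {
  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    try {
      // Crawlers get a prerendered/server-rendered snapshot of the page.
      const snapshot = await fetch(
        `https://prerender.example.com/render?url=${encodeURIComponent(req.originalUrl)}`
      );
      return res.send(await snapshot.text());
    } catch (err) {
      return next(err);
    }
  }
  next(); // regular users get the client-side rendered Angular app
});

app.use(express.static('dist')); // the built SPA for everyone else
app.listen(3000);
```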
In terms of the PWA they want to add along with the SPA change, I told them it's pretty much separate from the SPA work because the two aren't dependent on each other; adding a manifest and a service worker to our site would just be an enhancement. Also, if we do a complete PWA, with JS populating the content/data within the app shell (not just the header and footer, but making the body a dynamic JS template as well), would that affect our SEO in any way? Any best practices here as well?
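For reference, the enhancement I mean is small; registering a service worker and linking a manifest looks roughly like this (file names and paths are placeholders):

```javascript
// Minimal PWA enhancement sketch (file names are placeholders).
// The manifest is referenced from the page <head>:
//   <link rel="manifest" href="/manifest.webmanifest">
// and the service worker is registered once the page has loaded:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/service-worker.js')
      .then(reg => console.log('Service worker registered, scope:', reg.scope))
      .catch(err => console.error('Service worker registration failed:', err));
  });
}
```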
Thanks!
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs being rendered in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Angular Universal, but they said the lift was too great, so we're testing to see if this works.
I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed.
https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl them using Screaming Frog, but that generally shows what a crawler should be able to crawl, not what Googlebot will actually be able to render and index.
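One partial workaround I'm considering is diffing the raw HTML response against what a headless browser renders, for example with Puppeteer (rough sketch below; the QA URL is a placeholder and the login/auth handling is omitted):

```javascript
// Rough sketch: compare raw HTML (what a non-rendering crawler sees)
// with the DOM after client-side JavaScript has run.
// The QA URL is a placeholder; real login/auth handling is omitted.
const puppeteer = require('puppeteer');

(async () => {
  const url = 'https://qa.example.com/product/some-item';

  // 1. Raw HTML as served.
  const raw = await (await fetch(url)).text();

  // 2. Rendered DOM via headless Chrome.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const rendered = await page.content();
  await browser.close();

  console.log('Raw HTML length:', raw.length);
  console.log('Rendered DOM length:', rendered.length);
  // Spot-check that titles, descriptions, breadcrumbs, and links exist in `rendered`.
})();
```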
Any thoughts on this, is this concern valid?
Thanks!
Thanks for your help on this, Nigel.
Hey Nigel,
These parameters are already configured in my Search Console, but Moz is still picking them up as duplicates.
Hi Logan,
I've seen your responses on several pagination threads now and they are spot on, so I wanted to ask you my question. We're an eCommerce site and we're using the rel=next and rel=prev tags to avoid duplicate content issues. We've gotten rid of a lot of duplicate issues this way in the past, but we recently changed our site: we now have the option to view 60 or 180 items at a time on a landing page, which is causing more duplicate content issues.
For example, page 2 of the 180-item view is similar to page 4 of the 60-item view (URL examples below). Each view version has its own rel=next and rel=prev tags. Wondering what we can do to get rid of this issue besides just removing the 180- and 60-item view options.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Thoughts, ideas or suggestions are welcome. Thanks!
Hi Nigel,
Thanks for the response and the post. I've actually read the article before and have used rel=next and rel=prev to fix some pagination-related duplicate content issues in the past.
Right now, rel=next and rel=prev isn't solving my duplication problem because pagination isn't really the issue, so to speak. The duplication occurs because I have two page types (one showing 60 items at a time and one showing 180, kind of like a filter). Each view (60 & 180) has its own set of pagination rules, but page 4 of the 60-item view ends up duplicating page 2 of the 180-item view, if that makes sense.
It becomes really tricky here to try and find a solution.
Hi Moz Community,
We're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
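To make the idea concrete, this is roughly how I picture the head tags being generated for each page in both series (sketch only; the helper function and the exact view-all URL are illustrative guesses, not our actual implementation):

```javascript
// Illustrative sketch: every page in both the n=60 and n=180 series would
// canonicalize to the single view-all page, while keeping its own
// rel=prev/next pointers within its series.
// The view-all URL and helper function are hypothetical, not our real code.
function paginationTags(baseUrl, n, p, totalPages) {
  const tags = [`<link rel="canonical" href="${baseUrl}?view=all">`];
  if (p > 1) {
    tags.push(`<link rel="prev" href="${baseUrl}?view=all&n=${n}&p=${p - 1}">`);
  }
  if (p < totalPages) {
    tags.push(`<link rel="next" href="${baseUrl}?view=all&n=${n}&p=${p + 1}">`);
  }
  return tags.join('\n');
}

// e.g. page 2 of the 180-item view:
console.log(paginationTags('https://www.example.com/gifts/for-the-couple', 180, 2, 5));
```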
Thoughts, ideas or suggestions are welcome. Thanks
If you're a tourist business and the tourists know it as 144, I would stick with that. However, you may be able to get further insight from the AdWords Keyword Tool or Traffic Estimator. You can try some searches and see what the traffic looks like to gauge which search pulls more volume.
Try deleting the subpage content. It might be too keyword-stuffed and you might have been penalized. I'd get in touch with Google via Webmaster Tools.
Thanks for being so thorough with your answer!
I'll have our tech team take a look at the markup. We used to have our review content embedded directly in the HTML, and I've heard that increases crawl frequency and is also easy for search engines to understand. Now I think we might be loading it via AJAX, which apparently causes confusion for crawl bots.
I am starting to miss the old OSE. I've found that for a lot of the pages on our site, the new OSE is showing WAY more links and most of them are garbage nonsense links from China, Russia, and the rest of the internet Wild West.
For instance, in the old OSE, this page used to show 9 linking domains:
http://www.uncommongoods.com/gifts/by-recipient/gifts-for-him
It now shows 454 links. Some of the new links (about 5 of them) are legitimate. The other 400+ are garbage: some are porn sites, and most of them don't even open a web page, they just initiate some shady download. I've seen this for other sites as well (like Urban Outfitters). This is making it much harder for me to do backlink analysis because I have no clue how many "normal" links they have. Is anyone else having this problem? Any way to filter all this crap out? See attached screenshot of the list of links I'm getting from OSE.
Hi Moz Community,
I wanted to know how reliable the average position data is for queries in the Google Analytics Search Console report. I know this report is fairly new this year and the numbers are calculated a bit differently than they were in the old Search Engine Optimization report.
I want to know what the biggest differences are between this Search Console report and the old SEO report in GA. I'm also pretty confused about how GA calculates the average position. Obviously it's an average over whatever date range you choose, but, for instance, if your site shows multiple landing pages for one search query, will it roll them all into the average or just take the landing page that ranks higher? Does the average position take into account video or photo SERP results, and is it averaged across mobile, desktop, and tablet?
This number has always been a bit of a guess since it's sampled data, but I want to know how accurate it is. I read an article about this in 2014 (linked below), but I'm not sure it all still applies now that the data may be presented differently.
Any answers or discussions would be great here.
Thanks
Would you rather have someone +1 your homepage or add you to their Circles? Which one will help you out more in the SERPs?
Hey Ben, thanks for your feedback.
We did change the extensions somewhat recently. We didn't use .html, but we did use .jsp before a site redesign in October. So, for instance, our item page URLs used to be:
www.uncommongoods.com/item/item.jsp?itemId=13271
and are now:
www.uncommongoods.com/product/pee-pee-teepees
We never had an extension on the homepage; for category pages (non-product pages), we made the switch around June 2010, and for the product pages and all other pages we made the switch in October 2010.
What would you suggest we do as a solution to get our page rank back?
Hi all,
I've been looking around at some of our competitors' websites and I've noticed huge amounts of keyword stuffing throughout the pages, often grouped at the bottom of the page. From what I've been taught, it's not a good thing to do and you can be penalized for it. What's everyone else's take on keyword stuffing and how it's looked upon in 2017? Is there a maximum number of keyword repetitions you should have on a page?
Here are a few URLs to the websites and pages I'm talking about.
https://www.walmart.com/cp/personalized-gifts/133224 - Keyword stuffing in the bottom group text for the word "personalized"
http://www.personalcreations.com/unique-groomsmen-gifts-pgrmsmn - Keyword stuffing in bottom group text for "groomsmen"
http://www.groovygroomsmengifts.com/ - keyword stuffing throughout page for "groomsmen"
When we post links to product pages on our Google Plus account, the image never shows up in the post; only the product name and our domain name appear. Has anyone experienced this problem or have ideas on how to fix it?
2/23/2012
A while back, I went to a Distilled meetup here in NYC. SEER Interactive's Mark Lavoritano did some cool slides on the seasonality of keywords. Basically, his presentation made the point that you should not only think about which keywords you want to rank for but also WHEN they are most valuable.
I began studying SEO for my own website, Yodelscope, and am now working in-house at http://www.uncommongoods.com (online retail). I love the challenge of SEO and am always up for sharing thoughts and ideas with others.