Thanks Jim. When I said drop, I meant average page position, which I have been tracking through Google Search Console's Search Analytics. We have an adaptive site that detects the user's device; the URLs are the same in both environments. When I look at desktop, the average position dropped 4 places, taking us off the front page, while the average position on mobile dropped only half a position. Does that help? Thanks
Posts made by merch_zzounds
-
RE: Google Search Analytics: desktop site is losing page position compared to the mobile version of the site
-
Google Search Analytics: desktop site is losing page position compared to the mobile version of the site
I'm looking at Google Search Analytics page position by device, and the desktop version has seen a dramatic drop in the last 60 days compared to the mobile site. Could this be caused by mobile-first indexing? Has Google had any releases that might have caused this?
-
RE: Does Google add parameters to the URL Parameters tool in Webmaster Tools?
Thanks again Chris,
If these items are not parameters, should I do something about them, or just leave them alone?
-
RE: Does Google add parameters to the URL Parameters tool in Webmaster Tools?
Thanks for the quick response Chris,
I understand what the parameters are for.
My issue is that I am seeing new parameters in the list that I did not enter. Will Google itself insert new parameters it thinks it has found?
Or does this have to be someone with access to the Webmaster Tools account keying them in?
Jim
-
Does Google add parameters to the URL Parameters tool in Webmaster Tools?
I am seeing new parameters added (and sometimes removed) from the URL Parameter tool. Is there anything that would add parameters to the tool? Or does it have to be someone internally?
FYI - they always have no date in the Configured column, no Effect set, and Crawl set to "Let Googlebot decide."
-
Is putting a manufacturer's product manual on my site as a PDF duplicate content?
I add the product manuals to our product pages to provide additional product information to our customers. Is this considered duplicate content? Is there a best way to do this so that I can offer the information to my customers without getting penalized for it? Should the PDFs be indexable? If not, how do I control that?
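For reference, keeping the PDFs crawlable and linkable while excluding them from the index is usually done with an `X-Robots-Tag: noindex` HTTP header on the PDF responses (a robots meta tag can't be embedded in a PDF). A minimal, framework-free sketch — the function name and shape are illustrative, not any particular server's API:

```python
# Sketch: response headers for a product-manual PDF that should be
# downloadable by users but not indexed by search engines.

def pdf_response_headers(filename):
    """Build response headers for a manual PDF that should not be indexed."""
    return {
        "Content-Type": "application/pdf",
        "Content-Disposition": 'inline; filename="%s"' % filename,
        # Tells crawlers not to index the PDF itself; the product page
        # that links to it is unaffected.
        "X-Robots-Tag": "noindex",
    }

headers = pdf_response_headers("owners-manual.pdf")
# headers["X-Robots-Tag"] == "noindex"
```

The same header can typically be set once for all `*.pdf` responses in the web server configuration rather than in application code.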
-
RE: GWT Soft 404 count is climbing. Important to fix?
Thank you for your responses!
-
GWT Soft 404 count is climbing. Important to fix?
In GWT I am seeing my mobile site's soft 404 count slowly rise from 5 two weeks ago to over 100 as of today. If I do nothing I expect it will continue to rise into the thousands. This is due to there being followed links on external sites to thousands of discontinued products we used to offer. The landing page for these links simply says the product is no longer available and gives links to related areas of our site.
I know I can address this by returning a 404 for these pages, but doing so will cause these pages to be de-indexed. Since these pages still have utility in redirecting people to related, available products, I want these pages to stay in the index and so I don't want to return a 404.
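For context, the trade-off above comes down entirely to the status code the discontinued-product URL returns. A minimal sketch of that decision — the product dict and function name are hypothetical stand-ins, not our actual code:

```python
# Sketch of the status-code decision for a discontinued-product URL.
# A 200 with thin "no longer available" text is what Google flags as a
# soft 404; a real 404 (or 410 "Gone") drops the page from the index.

def status_for_product(product, keep_in_index):
    """Pick an HTTP status for a product page.

    product       -- dict describing the product, or None if the URL is unknown
    keep_in_index -- True to keep discontinued pages indexed (risking a
                     soft-404 flag), False to de-index them
    """
    if product is None:
        return 404                      # URL never existed
    if product.get("discontinued"):
        return 200 if keep_in_index else 410
    return 200                          # normal, available product

# Keeping the page indexed means accepting the soft-404 report:
status = status_for_product({"discontinued": True}, keep_in_index=True)
# status == 200
```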
Another way of addressing this is to add more useful content to these pages so that Google no longer classifies them as soft 404. I have images and written content for these pages that I'm not showing right now, but I could show if necessary.
But before investing any time in addressing these soft 404s, does anyone know the real consequences of not addressing them? Right now I'm getting 275k pages indexed and historically crawl budget has not been an issue on my site, nor have I seen any anomalous crawl activity since the climb in soft 404s began. Unchecked, the soft 404s could climb to 20,000ish. I'm wondering if I should start expecting effects on the crawl, and also if domain authority takes a hit when there are that many soft 404s being reported.
Any information is appreciated.
-
When Mobile and Desktop sites have the same page URLs, how should I handle the 'View Desktop Site' link on a mobile site to ensure a smooth crawl?
We're about to roll out a mobile site. The mobile and desktop URLs are the same. User Agent determines whether you see the desktop or mobile version of the site. At the bottom of the page is a 'View Desktop Site' link that will present the desktop version of the site to mobile user agents when clicked.
I'm concerned that when the mobile crawler crawls our site it will crawl our entire mobile site, then click 'View Desktop Site' and crawl our entire desktop site as well. Since mobile and desktop URLs are the same, the mobile crawler will end up crawling both mobile and desktop versions of each URL. Any tips on how to make sure the mobile crawler either doesn't access the desktop site, or at least knows which version of each page is the mobile one?
We could simply not show the 'View Desktop Site' to the mobile crawler, but I'm interested to hear if others have encountered this issue and have any other recommended ways for handling it. Thanks!
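For reference, Google's general guidance for this setup (dynamic serving: same URL, different HTML chosen by User-Agent) is to send a `Vary: User-Agent` response header so crawlers and caches know the body differs by device. A rough sketch — the UA pattern and function names are illustrative only; real mobile-detection lists are much longer:

```python
import re

# Deliberately oversimplified mobile-UA check for illustration.
MOBILE_UA = re.compile(r"Mobile|Android|iPhone", re.IGNORECASE)

def render_page(user_agent):
    """Return (template, headers) for one URL served adaptively by UA."""
    template = "mobile.html" if MOBILE_UA.search(user_agent) else "desktop.html"
    headers = {
        # Signals that the response body depends on the requesting
        # User-Agent, so the same URL can legitimately serve two versions.
        "Vary": "User-Agent",
    }
    return template, headers

tpl, hdrs = render_page("Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X)")
# tpl == "mobile.html", hdrs["Vary"] == "User-Agent"
```

With the header in place, the 'View Desktop Site' override can be carried in a cookie rather than the URL, so crawlers (which don't persist cookies) never follow it into the desktop version.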
-
RE: Why is this store getting hurt in SERPs when they removed duplicate content?
Hi Ryan,
Thank you for your thoughtful reply.
We didn't make any changes to any of the titles or any overall site changes. We measured the changes using internal tracking tools to get the daily traffic by search engine to each product detail page. We also used Google Webmaster Tools to estimate SERP positional data.
The updated pages do have about half as many total words as the control, so that could definitely explain it.
For followup testing, I'm thinking of trying:
- test versus control, same word count, test is 100% original, control is 0% original
- test versus control, same word count, test is 50% original, control is 0% original
- test versus control, same word count, test is 25% original, control is 0% original
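To score "percent original" consistently across test and control pages, one simple (hypothetical) metric is word-shingle overlap between the page copy and the manufacturer copy:

```python
def shingles(text, n=3):
    """Set of n-word shingles from a text (lowercased)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_ratio(page_text, manufacturer_text, n=3):
    """Fraction of the page's shingles that also appear in the manufacturer copy.

    0.0 means fully original relative to the manufacturer text; 1.0 means
    every shingle on the page also occurs in the manufacturer copy.
    """
    page = shingles(page_text, n)
    if not page:
        return 0.0
    return len(page & shingles(manufacturer_text, n)) / len(page)

# A page copied verbatim from the manufacturer scores 1.0.
ratio = duplicate_ratio("great blue widget with case",
                        "great blue widget with case")
# ratio == 1.0
```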
-
Why is this store getting hurt in SERPs when they removed duplicate content?
I work with an e-commerce client who got hit hard by Panda. They are very cautious, and want small-scale tests to prove each hypothesis before committing to larger changes.
Recently, we reworked content on 30 product detail pages. Before, these product pages featured some original content mixed with some manufacturer content. The change we made was to remove the manufacturer content completely from the product page, leaving about 300 words of high-quality, original content--all of which was written by subject matter experts.
I assumed that Google viewed this manufacturer text as duplicate content. However, when these 30 modified pages were compared to the control, they performed significantly worse.
Question 1: Does anyone have any idea why these pages would perform worse than the control?
Question 2: Do you have any tips for convincing this client to try another test or get the buy-in to make the larger changes that--in theory--need to happen?
FWIW, this client has about 10,000 product detail pages--the vast majority of which contain just manufacturer content.
I appreciate your thoughts.