Thank you for the reminder! I believe these have all been switched over, but I'll give it another look to be sure!
Kaylie
Hmmm, I don't appear to have that icon. Perhaps because sampling is not occurring on my report?
I actually felt that the transition went really well, which is why I was surprised by the data. I have a feeling, however, that I indeed just need to give it a few more days and keep checking the traffic/search information.
I'll keep you posted on how things pan out over the next week or so!
Thanks for the response!
Sorry about the timeout error. Not sure what happened there either. I have been unable to replicate it.
You were partially right about the dates. So now I am looking at the same date range for Analytics and GWT (the 21st-27th) and I see a huge "drop" on the 27th in GWT. However, if I extend GA through today, I see a minor drop on Friday the 26th and then it bumps back up to normal levels. Is there a chance that GWT is reporting search/clicks from part way through the 27th rather than the full day of data, creating a false sense of alarm?
I wanted to see a larger set of data that might tell a more complete story. So, looking in GA, I only see a change of 5% in traffic when comparing the 23rd-29th with the 16th-22nd. That is a 4% decrease in Google organic, but I don't feel like these numbers are the cause for alarm that GWT graphs initially indicate. Thoughts?
Thanks for your input!
Kaylie
My URL is: https://www.seattlecoffeegear.com

We implemented https across the site on Friday. Saturday and Sunday search traffic was normal/slightly higher than normal (in analytics) and slightly down in GWT. Today, it has dropped significantly in both, to about half of normal search traffic. From everything we can see, we implemented this correctly.

We also use a CDN (though I don't think that impacts anything) and have had no customer issues with accessing or using the website since the transition. Is there anything else I might be missing that could correlate to a drop in search impressions, or is this just a waiting game of a few days to let Google sort through the change we've made and reindex everything (it dropped to 0 indexed for a day and is now up to 1,744 of our 2,180 pages indexed)?

Thank you so much for any input!

Kaylie
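In case it helps anyone checking the same thing, a quick spot-check of the redirects might look something like this (a rough sketch, not our exact process; assumes Python with the requests library installed, and the example URLs are placeholders):

```python
# Rough sketch: verify that old http:// URLs 301 directly to their https:// equivalents.
# Assumes the `requests` library is installed; the URL list below is just an example.
import requests

# Placeholder sample of old URLs to check (swap in real paths from your sitemap).
old_urls = [
    "http://www.seattlecoffeegear.com/",
    "http://www.seattlecoffeegear.com/espresso-machines",
]

for url in old_urls:
    # Don't follow redirects automatically so we can see the first hop.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.startswith("https://")
    print(f"{url} -> {resp.status_code} {location} {'OK' if ok else 'CHECK THIS'}")
```

A single 301 hop straight to the https version (rather than a 302 or a chain of redirects) is what you'd want to see on each line.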
EGOL,
Thank you for your response.
I would first like to clarify that I don't think I deserve to have a higher ranking. We have worked for years on our content and building out unique and creative information (a couple thousand pages of it). As a result, over the last year, our site has been on the first page for most of our key terms. The reason I posted was just to get some feedback on why the rankings changed so drastically (Google update?) and how to recover.
As a retail site, it really isn't feasible for us to build out our category and home pages like bestespressomachine.org has. I do, however, see what you are saying about the amount of information their site (the ranking page, in particular) contains and how optimized it is for those terms.
To address the duplicate content, please see the reply I had posted above:
"We don't use stock information from manufacturers and actually put a lot of time and effort into creating informational, clever, and unique descriptions. Unfortunately, a lot of people copy our product descriptions.
Do you have any advice on how to deal with others copying our content? I'd love an automated way of receiving a report on duplicate text (as we have thousands of pages on our site). I could then contact those sites and at least ask them to remove the content. Also, is there any way of notifying Google of sites copying our content?"
I would appreciate any feedback you could provide in that regard.
Best,
Kaylie
Also of note are the sites that have taken the first page/top-of-second-page results for the "espresso machine" term.
The first page has 2 relevant sites, 1 site that's not very built out, and the rest are major retailers (Amazon, Williams Sonoma, Macy's, Best Buy). The second page has eBay in the top spot.
Sure, the retail sites sell espresso machines. But some of them only sell a few espresso machines and, on all of them, there isn't a lot of content to support these products. It seems to me that they are ranking on this term because they are large sites in general, which means searchers are not being served up the most relevant results. Instead, they are being served mass retail sites.
sigh
Keri,
Thanks for clearing that up. Have you heard of others experiencing a drop in rankings recently or any insight into recent Google updates?
Thanks for your response, Jeff!
I have searched around a bit in the Q&A but haven't found anyone else experiencing this particular issue.
Thanks for your response!
We started guest blogging as a way of building out actual links and recovering from the spammy link building a previous company had done on our behalf. We only guest blog on sites that relate to our products (coffee sites, essentially). We write unique content for each blog article and only include a few, very relevant links in each one.
I don't know the exact date but I believe they started dropping in mid-January-ish.
We don't use stock information from manufacturers and actually put a lot of time and effort into creating informational, clever, and unique descriptions. Unfortunately, a lot of people copy our product descriptions.
Do you have any advice on how to deal with others copying our content? I'd love an automated way of receiving a report on duplicate text (as we have thousands of pages on our site). I could then contact those sites and at least ask them to remove the content. Also, is there any way of notifying Google of sites copying our content?
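For the automated report side of it, even something as simple as generating exact-match Google searches from a snippet of each description could work as a first pass (a rough sketch; assumes Python, and the snippets would really come from a product export rather than being hard-coded):

```python
# Rough sketch: build exact-match ("quoted") Google search URLs from description snippets
# so copied text can be checked by hand. The snippets below are placeholders; in practice
# they would be pulled from a product export.
from urllib.parse import quote_plus

descriptions = {
    "/some-espresso-machine": "A distinctive sentence pulled from that product's description.",
    "/some-grinder": "Another unique sentence from a different product page.",
}

for page, snippet in descriptions.items():
    # Quote the snippet so Google looks for the exact phrase, and exclude our own site.
    query = f'"{snippet}" -site:seattlecoffeegear.com'
    print(f"{page}: https://www.google.com/search?q={quote_plus(query)}")
```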
My company has experienced a significant drop in rankings in the last month or so. How significant? Across our top 11 keywords, we've dropped an average of 6 positions. Some have dropped less (1 or 2) and some have dropped way more (38).
We work really hard to provide great content on our site and have been building out our link profile with guest blogging on relevant sites (dailyshotofcoffee.com, for example). No major changes have been made to the URL structure or content on the pages that are ranking for our key terms, so I am not sure where the drop is coming from.
For example: One of our key terms is "espresso machine" and we went from #9 to #22 in the last few weeks. We have not made any changes to the main content on the page that is ranking since September. Our on-site page report for this page has us nailing all of the critical factors, most of the high importance factors (the exceptions being "exact keyword in page titles" - we use "espresso machines," as that is one of our other terms - and "avoid keyword stuffing" - unfortunately, our products are also named with "espresso machine," and I can't very well not link to those products or remove the term from their names), all of the moderate importance factors, and most of the low importance factors. It has been reporting this way since September.
Most of our key terms are this way (haven't had content changed in the last several weeks to months and have great on-page grades). We don't engage in spammy link building (though I did some work this fall on cleaning up bad links a previous SEO company had built out). I'm just really taken aback by the sudden drop in rankings across the board.
Any insight or advice anyone can give me would be greatly appreciated!
Last June, we transitioned our site to the Magento platform. When we did so, we naturally got an increase in 404 errors for URLs that were not redirected (for a variety of reasons: we hadn't carried the product for years, Google no longer got the same string when it did a "search" on the site, etc.). We knew these would be there and were completely fine with them.
We also got many 404s due to the way Magento had implemented their site map (putting in products that were not visible to customers, including all the different file paths to get to a product even though we use a flat structure, etc.). These were frustrating, but we did custom work on the site map and let Google resolve those many, many 404s on its own.
Sure enough, a few months went by and GWT started to clear out the 404s. All the poor, nonexistent links from the site map and missing links from the old site - they started disappearing from the crawl notices and we slowly went from some 20k 404s to 4k 404s. Still a lot, but we were getting there.
Then, in the last 2 weeks, all of those links started showing up again in GWT and reporting as 404s. Now we have 38k 404s (way more than ever reported). I confirmed that these bad links are not showing up in our site map or anything and I'm really not sure how Google found these again.
I know, in general, these 404s don't hurt our site. But it just seems so odd. Is there any chance Google bots just randomly crawled a big ol' list of outdated links it hadn't tried for awhile? And does anyone have any advice for clearing them out?
Thanks!
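One thing that might help before marking anything as fixed: exporting the crawl errors list and re-checking what each URL actually returns today, so you at least know which ones are plain 404s versus ones that have since been redirected (a rough sketch; assumes Python with the requests library installed, and the CSV filename/format is just whatever the GWT export gets saved as):

```python
# Rough sketch: re-check URLs from a GWT crawl-errors export and group them by the
# status code they return today. Assumes `requests` is installed and that the export
# has been saved as crawl_errors.csv with the URL in the first column (adjust as needed).
import csv
import requests

results = {}
with open("crawl_errors.csv", newline="") as f:
    for row in csv.reader(f):
        url = row[0].strip()
        if not url.startswith("http"):
            continue  # skip the header row or blank lines
        try:
            resp = requests.head(url, allow_redirects=False, timeout=10)
            results.setdefault(resp.status_code, []).append(url)
        except requests.RequestException:
            results.setdefault("error", []).append(url)

for status, urls in sorted(results.items(), key=lambda kv: str(kv[0])):
    print(f"{status}: {len(urls)} URLs")
```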
I have decided to do this and am turning off a few parameters at a time to ensure it doesn't cause any crawling or indexing issues.
I'm getting into the URL parameters in Google Webmaster Tools and I was just wondering if anyone that uses Magento has used this functionality to make sure filter pages aren't being indexed.
Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content - narrowing. I was just wondering what you choose after you tell Google what the parameter's function is.
For narrowing, it gives the following options:
Which URLs with this parameter should Googlebot crawl?
- Let Googlebot decide (Default)
- Every URL (the page content changes for each value)
- Only URLs with value (may hide content from Googlebot)
- No URLs

I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
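Before changing anything, it might be worth spot-checking a handful of filtered URLs to confirm they already carry a canonical (or noindex) pointing back to the main category page, so telling Googlebot to skip them wouldn't orphan anything that's actually indexed. Something along these lines (a rough sketch; assumes Python with requests and BeautifulSoup installed, and the example URLs are placeholders for real filter pages):

```python
# Rough sketch: for a few filtered category URLs, report the rel=canonical target and
# whether a noindex robots meta tag is present. Assumes `requests` and `beautifulsoup4`
# are installed; the URLs below are placeholders.
import requests
from bs4 import BeautifulSoup

filtered_urls = [
    "https://www.example.com/espresso-machines?manufacturer=123",
    "https://www.example.com/espresso-machines?price=200-300",
]

for url in filtered_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  canonical:", canonical["href"] if canonical else "none")
    print("  robots:   ", robots["content"] if robots else "none")
```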
Yes, the page is modified based on selected options and it is done via Javascript.
So those individual pages never actually appear on the site. They are just created to allow Magento to pull inventory on those items from that configurable product, which is why I'm not sure how Google is finding them.
For example, if I go to mysite.com/white-t-shirt, I would get a 404 (and if I searched for it, nothing would come up) because as far as the world outside of Magento admin is concerned, that URL doesn't exist.
We recently moved our website over to the Magento eCommerce platform. Magento has functionality to make certain items not visible individually so you can, for example, take 6 products and turn them into 1 product where a customer can choose their options. You then hide all the individual products, leaving only that one product visible on the site and reducing duplicate content issues.
We did this. It works great and the individual products don't show up in our site map, which is what we'd like. However, Google Webmaster Tools has all of these individual product URLs in its Not Found Crawl Errors.
For example:
White t-shirt URL: /white-t-shirt
Red t-shirt URL: /red-t-shirt
Blue t-shirt URL: /blue-t-shirt
All of those are not visible on the site and the URLs do not appear in our site map. But they are all showing up in Google Webmaster Tools.
Configurable t-shirt URL: /t-shirt
This product is the only one visible on the site, does appear on the site map, and shows up in Google Webmaster Tools as a valid URL.
Do you know how it found the individual products if it isn't in the site map and they aren't visible on the website? And how important do you think it is that we fix all of these hundreds of Not Found errors to point to the single visible product on the site? I would think it is fairly important, but don't want to spend a week of man power on it if the returns would be minimal.
Thanks so much for any input!
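If we do decide to clean these up, one option might be to generate the redirect rules from a mapping file rather than touching hundreds of URLs by hand, something like this (a rough sketch; assumes Python, that the site runs on Apache where a plain `Redirect 301` line works, and that the mapping would really come from a Magento product export):

```python
# Rough sketch: turn a "hidden simple product -> visible configurable product" mapping
# into 301 redirect lines that could be pasted into an Apache config or .htaccess file.
# The mapping here is a placeholder; it would normally come from a product export.
mapping = {
    "/white-t-shirt": "/t-shirt",
    "/red-t-shirt": "/t-shirt",
    "/blue-t-shirt": "/t-shirt",
}

with open("redirects.conf", "w") as out:
    for old_path, new_path in mapping.items():
        out.write(f"Redirect 301 {old_path} {new_path}\n")

print(f"Wrote {len(mapping)} redirect rules to redirects.conf")
```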
I am working in Magento to build out a large e-commerce site with several thousand products. It's a great platform, but I have run into the issue of what it does to URLs when you put a product into multiple categories.
Basically, "a book" in two categories would make two URLs for one product: 1) /books/a-book 2) /author-name/a-book
So, I need to come up with a solution for this. It seems I have two options:
1) Turn off categories in product URLs from the Magento admin. This would solve the issue and be a quick fix, but I think it's a double-edged sword, because then we lose the SEO value of our well-named categories being in the URL.
2) Add canonical tags to the duplicate URLs. To be fair, I'm not even sure this is possible. Even though it is creating different URLs and, thus, poses a risk of "duplicate content" being crawled, there really is only one page on the admin side. So, I can't go to all of the "duplicate" pages and put a canonical tag, because those duplicate pages don't really exist on the back-end. Does that make sense?
After typing this out, it seems like the best thing to do probably will be to just turn off categories in the URL from the admin side. However, I'd still love any input from the community on this.
Thanks!
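One thing that might be worth checking either way: whether Magento is already emitting the same rel=canonical on both versions of the product URL, which would make the duplicate-URL issue mostly moot. A quick check could look like this (a rough sketch; assumes Python with requests and BeautifulSoup installed, and the two URLs are the placeholder paths from the example above):

```python
# Rough sketch: fetch both category-path versions of the same product and compare their
# rel=canonical targets. Assumes `requests` and `beautifulsoup4` are installed; the URLs
# are placeholders matching the /books/a-book example above.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/books/a-book",
    "https://www.example.com/author-name/a-book",
]

def canonical_of(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag["href"] if tag else None

canonicals = {url: canonical_of(url) for url in urls}
for url, target in canonicals.items():
    print(f"{url} -> canonical: {target}")

if None not in canonicals.values() and len(set(canonicals.values())) == 1:
    print("Both URLs point to the same canonical, so duplicate content is less of a worry.")
```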
I am in the process of recreating my company's website and, in addition to the normal retail pages, we are adding a "learn" section with user manuals, reviews, manufacturer info, etc. etc.
It's going to be a lot of content and there will be linking to these "learn" pages from both products and other "learn" pages.
I read in an SEOmoz blog post that too much internal linking with optimized anchor text can trigger a down-ranking penalty from Google. Well, we're talking about having 6-8 links to "learn" pages from product pages and interlinking many times within the "learn" pages, like Wikipedia does. And I figured they would all have optimized text, because I think that is usually best for the end user (I personally like to know that I am clicking on "A Review of the Samsung XRK1234" rather than just "A Review of Televisions").
What is best practice for this? Is there a suggested limit to the number of links or how many of them should have optimized text for a retail site with thousands of products?
Any help is greatly appreciated!
The content definitely serves the reader. We strive to provide unbiased, in-depth reviews of all of our products. So, for example, each product page has: a basic description, a video review, an overview of how the product works, an overview of its functionality, pros/cons, etc.
That's why I was hoping to add the bulk of the keywords to headings, to break up the vast amounts of content and give the pages some SEO value without altering the writing style that our customers enjoy and get so much use out of.
I'll go through and write up a page as it would appear on the site with all the tabs and then do some analysis to look at keyword density.
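A first pass at that analysis could be as simple as counting how often the keyword shows up in headings versus the body text (a rough sketch; assumes Python with requests and BeautifulSoup installed, and the URL and keyword are placeholders):

```python
# Rough sketch: count keyword occurrences in h2/h3 headings and in the full page text,
# as a crude keyword-density check. Assumes `requests` and `beautifulsoup4` are installed;
# the URL and keyword below are placeholders.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-espresso-machine"
keyword = "espresso machine"

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

headings = [h.get_text(" ", strip=True) for h in soup.find_all(["h2", "h3"])]
heading_hits = sum(keyword.lower() in h.lower() for h in headings)

page_text = soup.get_text(" ", strip=True).lower()
body_hits = page_text.count(keyword.lower())

print(f"h2/h3 headings containing the keyword: {heading_hits} of {len(headings)}")
print(f"total keyword occurrences on the page: {body_hits}")
print(f"approximate word count: {len(page_text.split())}")
```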
Also, I won't be the one writing most of the content (hence the guides). Do you have any advice on how to convey best practices with keyword density without getting overly technical?
Thanks so much for your help!
I am in the process of drawing up content templates to guide my company's marketing team in creating SEO-optimized content as we move our retail website over to a new platform. On each product page, we will have multiple tabs that are crawlable, each one containing different chunks of information on the products.
Within each tab, I was thinking of breaking up the content and adding SEO value by using headers (h2 or h3) that have a keyword included. So, for example: "How The PRODUCT NAME Works" and "User Manuals for your PRODUCT NAME."
Between the multiple tabs, in headers alone, the main keyword for the product (which will usually be the product name) will be on the page 7 times. Between this and the keywords that are part of the actual content (ex: product description), is this too many keyword instances?
I know headers are often skimmed or skipped when used to simply break up the content, so I don't think they will impact user experience too much. However, I would love some feedback on if you agree with that and if you think I should cut down on the number of keywords or if I am headed in the right direction.
Thanks!