What's going on with my organic traffic from Google?
-
I am working on the eCommerce website Vista Stores. My site's organic traffic is dropping, and after some research my assumption is that the auto-generated content I added to a few product pages is the cause.
Please see the attachment for more detail on the current traffic situation.
-
Are you talking about internal link structure or external links? I have done a lot of work to define proper categorization and internal link structure.
-
All in all, your store looks the part. Your link profile is healthy, and site speed looks good (75/100).
I'd tackle your content issues and see what happens from there.
-
I am going to follow similar steps for my website. Duplicate content is one issue for my site, and I will resolve it very soon. Can you see any additional reasons behind the drop in rankings?
-
Remove the auto-generated and duplicated content and create unique, relevant, and interesting content (free of grammatical/spelling errors). Perhaps consider hiring a writer to rewrite your product descriptions.
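If it helps to prioritize which pages to rewrite first, here's a rough sketch (not a full audit tool) of one way to flag the product descriptions that overlap the most, assuming you can export them as URL-to-description pairs. The sample data and the 0.8 threshold below are just placeholders.

```python
# Hypothetical sketch: flag near-duplicate product descriptions so the
# most heavily duplicated pages can be rewritten first.
from difflib import SequenceMatcher
from itertools import combinations

# Assumed input: {url: description text}, e.g. exported from your store.
descriptions = {
    "/product-a": "9 ft aluminum market umbrella with crank lift ...",
    "/product-b": "9 ft aluminum market umbrella with crank lift ...",
    "/product-c": "Hand-finished teak patio table, seats six ...",
}

SIMILARITY_THRESHOLD = 0.8  # arbitrary cutoff; tune for your catalog

for (url_a, text_a), (url_b, text_b) in combinations(descriptions.items(), 2):
    ratio = SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
    if ratio >= SIMILARITY_THRESHOLD:
        print(f"{url_a} and {url_b} are {ratio:.0%} similar - consider rewriting")
```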
-
Honestly, your reply didn't give me a precise answer, but I understand that grammatical mistakes are one factor. Can you give me any alternative suggestions that may help me sustain or recover my organic traffic?
I am going to remove all auto-generated content from the website.
-
I'm guessing that Google is taking a harsh stance on the grammar used in the product description. The description doesn't read well; there are many grammatical mistakes.
In addition to that, the only other chunk of content on the page appears to be duplicate content. See the Google search below on a piece of the California Umbrellas text.
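If you want to run that same check across more of the copy, a sketch along these lines might help, assuming you set up a Google Custom Search API key and engine ID (both placeholders below). It simply searches a snippet as an exact-match phrase and lists every URL already ranking for it.

```python
# Hypothetical sketch: check whether a snippet of product copy appears on
# other sites, using the Google Custom Search JSON API (requires your own
# API key and search engine ID - both placeholders here).
import requests

API_KEY = "YOUR_API_KEY"      # placeholder
ENGINE_ID = "YOUR_ENGINE_CX"  # placeholder
snippet = "a distinctive sentence copied from one of your product pages"

resp = requests.get(
    "https://www.googleapis.com/customsearch/v1",
    params={"key": API_KEY, "cx": ENGINE_ID, "q": f'"{snippet}"'},
    timeout=10,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["link"])  # every URL already ranking for that exact phrase
```
-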
Related Questions
-
What's the best way of crawling my entire site to get a list of NoFollow links?
Hi all, hope somebody can help. I want to crawl my site to export an audit showing: all nofollow links (what links, from which pages) and all external links broken down by follow/nofollow. I had thought Moz would do it, but that's not in Crawl info. So I thought Screaming Frog would do it, but unless I'm not looking in the right place, that only seems to provide this information if you manually click down each link and view "Inlinks" details. Surely this must be easy?! Hope someone can nudge me in the right direction... Thanks.
Intermediate & Advanced SEO | rl_uk0
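One rough way to get this without clicking through every link is a small crawler script, something like the sketch below. The start URL is a placeholder and this is a bare-bones sketch rather than a production crawler; it prints every link marked rel="nofollow" together with the page it was found on.

```python
# Rough sketch: crawl one domain and list every link marked rel="nofollow",
# along with the page it was found on. Start URL is a placeholder.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"   # placeholder
DOMAIN = urlparse(START).netloc
queue, seen = [START], {START}

while queue:
    page = queue.pop()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"])
        rels = a.get("rel") or []          # rel is a multi-valued attribute
        if "nofollow" in rels:
            print(f"{page} -> {link} (nofollow)")
        if urlparse(link).netloc == DOMAIN and link not in seen:
            seen.add(link)                 # only queue internal pages
            queue.append(link)
```
-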
After hack and remediation, thousands of URLs still appearing as 'Valid' in Google Search Console. How to remedy?
I'm working on a site that was hacked in March 2019, and in the process nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report, and many of them are still hack-related URLs listed as indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out from the Search Console reports why the jump happened or which new URLs were added; the only sort mechanism is last crawled, and they don't show up there. How long can I expect it to take for these remaining URLs to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the index doubled over one weekend?
Intermediate & Advanced SEO | rickyporco0
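Beyond the removals tool and resubmitting sitemaps there is no guaranteed way to force this, but one thing worth checking is what those leftover URLs actually return now; a 410 (Gone) is sometimes reported to drop out faster than a 404. A quick sketch to bulk-check status codes, assuming you export the leftover URLs from Search Console into a text file (placeholder filename below):

```python
# Sketch: bulk-check what the leftover spam URLs actually return now.
# "spam_urls.txt" is a placeholder export of the URLs still reported
# as indexed in Search Console.
import requests

with open("spam_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    print(f"{status}\t{url}")
```
-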
What's the best way to use redirects on a massive site consolidation?
We are migrating 13 websites into a single new domain, and with that we have certain pages that will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and it's way too many to write custom 301s for. One idea was to use domain forwarding or a wildcard redirect so that all the pages would be redirected to the same folder path on the new URL. The problem this creates, though, is that we would then need to build the custom 301s for content that is moving to a new folder path, hence creating two redirects on these pages (one for the domain forwarding, and then a second for the custom 301 pointing to a new folder). Any ideas on a better solution to this?
Intermediate & Advanced SEO | MJTrevens0
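A single hop is usually achievable even with a catch-all, as long as the custom rules are evaluated before the wildcard so the specific destination wins on the first redirect. Once the rules are in place, a sketch like the one below (placeholder input file) can confirm that a sample of old URLs really does resolve in one 301 rather than chaining through two:

```python
# Sketch: confirm each old URL reaches its destination in a single 301 hop
# (i.e. the custom rule fires before the catch-all, not after it).
# "old_urls.txt" is a placeholder list of URLs on the retiring domains.
import requests

with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = len(resp.history)          # one hop is ideal; more means a chain
    flag = "OK" if hops == 1 else "CHECK"
    print(f"{flag}\t{hops} hop(s)\t{url} -> {resp.url}")
```
-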
eCommerce Replatforming URLs
We are in the process of re-platforming our eCommerce site to Magento 2. For the most part, the majority of site content will remain the same. Unfortunately, on our current platform we have been inconsistent with the use of .html as a URL suffix. As a result, our category and product pages are half and half: /stainless-steel-hardware.html and /stainless-steel-hardware. We are considering taking the opportunity to clean up and standardize our URLs (drop the .html from all URLs on the new site and 301 redirect these to the same URL without the .html). Our concern is that many of the .html pages are good categories with strong page rank, and I've read many articles about page rank loss from 301 redirects. We are debating internally whether it really makes sense to take an SEO hit for something as seemingly small as dropping the .html from the URL. It would be a no-brainer if we were taking the opportunity to change to more SEO-friendly natural-language URLs. However, currently our URLs appear acceptable with the exception of the inconsistent suffix. Thanks in advance for any insight on how you would approach this!
Intermediate & Advanced SEO | BoatOutfitters
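For what it's worth, Google has indicated in recent years that 301s no longer lose PageRank, so the main cost is usually a temporary recrawl period rather than a permanent hit. If you do standardize, a small sketch like this (placeholder input file) can build the one-to-one map so every .html URL gets a single redirect to its clean counterpart:

```python
# Sketch: build a one-to-one redirect map that strips the ".html" suffix,
# so each old URL gets a single 301 to its clean counterpart.
# "crawl_export.txt" is a placeholder list of current URLs.
with open("crawl_export.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

redirect_map = {
    url: url[: -len(".html")]
    for url in old_urls
    if url.endswith(".html")
}

for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```
-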
Brackets vs Encoded URLs: The "Same" in Google's eyes, or dup content?
Hello, this is the first time I've asked a question here, but I would really appreciate the advice of the community - thank you, thank you! Scenario: internal linking is pointing to two different versions of a URL, one with brackets [] and the other version with the brackets encoded as %5B%5D.
Version 1: http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all
Version 2: http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all
Question: Will search engines view these as duplicate content? Technically there is a difference in characters, but it's only because one version encodes the brackets and the other does not (see: http://www.w3schools.com/tags/ref_urlencode.asp). We are asking the developer to encode ALL URLs because this seems cleaner, but they are telling us that Google will see zero difference. We aren't sure if this is true, since engines can get so hung up on even one single difference in character. We don't want to unnecessarily fracture the internal link structure of the site, so again - any feedback is welcome, thank you. 🙂
Intermediate & Advanced SEO | mirabile
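A quick way to sanity-check the developer's claim: both forms decode to exactly the same query parameters, as the sketch below shows (URLs taken from the question, stray formatting removed). That said, crawlers store the literal URL string, so picking one form for all internal links, or setting a canonical, is still the safer route.

```python
# Sketch: show that the bracketed and percent-encoded internal-link
# variants decode to identical URLs and identical query parameters.
from urllib.parse import unquote, parse_qsl

v1 = "http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all"
v2 = "http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all"

print(unquote(v2) == v1)                 # True: same URL once decoded
print(parse_qsl(v1.split("?")[1]) ==
      parse_qsl(v2.split("?")[1]))       # True: identical query parameters
```
-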
My landing page changed in Google's SERP. I used to have a product page; now I have a PDF?
I have been optimizing this page for a few weeks now and have seen our page move up from 23rd to 11th in the SERPs. I came to work today and not only have I dropped to 15th, but I've also had my relevant product page replaced by this page. Not to mention the second page is a PDF! I am not sure what happened here, but any advice on how I could fix this would be great. My site is www.mynaturalmarket.com and the keyword I'm working on is Zyflamend.
Intermediate & Advanced SEO | KenyonManu3-SEOSEM0
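One common way to stop a PDF competing with (or replacing) its HTML product page is to serve the PDF with an `X-Robots-Tag: noindex` response header, or at least a canonical pointing at the product page. A small sketch to check what the file currently returns; the PDF URL below is a placeholder.

```python
# Sketch: verify whether a PDF URL is served with an X-Robots-Tag header.
# Serving "X-Robots-Tag: noindex" at the server level is one common way to
# keep a PDF from competing with the HTML product page.
import requests

pdf_url = "https://www.example.com/files/product-info.pdf"  # placeholder

headers = requests.head(pdf_url, allow_redirects=True, timeout=10).headers
print(headers.get("X-Robots-Tag", "no X-Robots-Tag header set"))
```
-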
.com outranking my ccTLDs, and I cannot figure out why.
So I have a client that has a number of sites for a number of different countries with their specific ccTLDs. They also have a .com in the US. The problem is that the UK site hardly ranks for anything, while the .com ranks for a ton in the UK. I have set up GWT for the UK site and the .com to be specific to their geographic locations. So I have the ccTLD, and I have GWT showing where I want these sites to rank. The problem is it apparently is not working... Any clues as to what else I could do?
Intermediate & Advanced SEO | DRSearchEngOpt0
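Assuming the geotargeting in GWT is already set, the usual missing piece in this kind of setup is reciprocal hreflang between the .com and each ccTLD, so Google knows which edition to serve in which country. A hypothetical sketch of the tags each page variant would carry; the domains and paths below are placeholders, not the client's real sites.

```python
# Sketch: generate reciprocal hreflang tags for each country edition.
# Every page variant should carry the full set (including itself); the
# mapping below is a placeholder - swap in the real country sites.
variants = {
    "en-us": "https://www.example.com/widgets",
    "en-gb": "https://www.example.co.uk/widgets",
    "de-de": "https://www.example.de/widgets",
}

for lang, url in variants.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
print(f'<link rel="alternate" hreflang="x-default" href="{variants["en-us"]}" />')
```
-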
What's the best theme for SEO if you are going to use Yoast anyway?
I am going to edit my theme myself, so I don't need something like Thesis for that. But people say that the Thesis framework is amazing for SEO, and it's hard to edit it manually. Does using the Thesis theme do anything for you if you are going to use Yoast anyway? Thanks, William
Intermediate & Advanced SEO | willie790