Hello Nadir,
We generally don't remove Q&A questions unless they are spam or not TAGFEE. Other community members have taken the time to answer the question, and deleting it removes their MozPoints; it also discourages people from answering questions if their answers keep being removed.
-
I think it would be in your best interest to have only one shop rather than two with exactly the same products. You should also provide a unique title and description for each product and page.
There is no guarantee that you will see great results just by deleting the 3k-item shop, but if the content and products are exactly the same, the two shops are most likely competing with each other.
-
Hi there,
I can't guarantee a huge improvement, but I think you're right - this is probably having a negative effect on the site.
You want your titles and descriptions to be as unique as possible. I'd also change the theme to make it look less suspicious if possible. Images can actually be kept the same - it's not uncommon for thousands of shops to use stock product images, and Google recognises this.
Should you change all the written content to be unique, it's safe to say your performance will probably improve - or at the very least, you'll be able to improve much faster than before, now that the content is unique.
Hope this helps.
Related Questions
-
How to remove spammy backlinks from my webpage?
Hi Professionals, Someone has created spammy backlinks to my website. How do I remove spammy backlinks from my community "Sewways" company website? Please guide me in solving this problem, because my website has been de-ranked because of those backlinks. Thanks!
Technical SEO | Smartlanjabdul
-
Removing site subdomains from Google search
Hi everyone, I hope you're having a good week. My website has several subdomains that I shut down some time back, and pages on these subdomains are still appearing in the Google search result pages. I want all the URLs from these subdomains to stop appearing in Google's search results, and I was hoping someone could help me with this. The subdomains are no longer under my control, as I don't have web hosting for these sites (so the subdomain sites just show a default hosting server page). Because of this, I cannot verify them in Search Console and submit a URL/site removal request to Google. In total, there are about 70 pages from these subdomains showing up in Google at the moment, and I'm concerned that these pages may have a negative impact on my SEO. Thanks for taking the time to read my post.
Technical SEO | QuantumWeb62
-
Should you use the Google URL remover if older indexed pages are still being kept?
Hello, A client did a redesign a few months ago, reducing 700 pages to 60, mostly due to a Panda penalty and low interest in the products on those pages. Google is still indexing a good number of the old pages (around 650) when we only have 70 on our sitemap. The thing is, Google indexes our site on average for 115 URLs when we only have 60 URLs that need indexing and only 70 on our sitemap. I would have thought these URLs would be crawled and marked as not found, but it is taking a very long time. Our rankings haven't recovered as much as we'd hoped, and we believe the indexed older pages are causing this. Would you agree, and would you say removing those old URLs via the remover tool is the best option? It would mean using the URL remover tool for 650 pages. Thank you in advance
Technical SEO | Deacyde
-
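Since the question above boils down to "which indexed URLs are no longer in the sitemap", a short script can produce the candidate list for the removal tool. A sketch, with placeholder URL lists standing in for a `site:` export and a parsed sitemap.xml:

```python
def stale_urls(indexed_urls, sitemap_urls):
    """Return URLs still in the index that are absent from the sitemap."""
    return sorted(set(indexed_urls) - set(sitemap_urls))

# Placeholder data: in practice, collect indexed URLs from a site: search
# or Search Console, and parse sitemap.xml for the current URL set.
indexed = [
    "https://example.com/",
    "https://example.com/old-product-1",
    "https://example.com/contact",
]
sitemap = ["https://example.com/", "https://example.com/contact"]

print(stale_urls(indexed, sitemap))
# -> ['https://example.com/old-product-1']
```

The resulting list is exactly the set of URLs worth either removing via the tool or leaving to drop out of the index on their own.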
Is it good practice to update your disavow file after a penalty is removed?
I was wondering if you could keep adding to the disavow file even after your site has recovered from a partial site penalty. As a recurring SEO procedure, we are always looking at links pointing to our website, and we then identify those links that are clearly of no value. To clean these up, would it be good practice to add more of these domains to your disavow file? Or is the disavow file just used for penalty issues, to alert Google of the work you have done? (We have had a penalty in the past but are fine now.) Would this method help keep high-quality links to the fore by removing low-quality links from Google's eyes? I would welcome your comments.
Technical SEO | podweb
-
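As background for the question above: the disavow file is just a plain UTF-8 text file with one entry per line, so adding newly found low-value domains to it over time is mechanically simple. A minimal sketch, with hypothetical domains:

```text
# disavow.txt - lines beginning with # are comments
# Disavow all links from an entire domain
domain:spammy-directory.example
domain:cheap-links.example

# Disavow one specific page
http://low-quality-blog.example/paid-links.html
```

The whole file is re-uploaded each time, replacing the previous version, so keeping one continuously maintained master file works well for a recurring procedure like the one described.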
Removing a URL from Search Results
I recently renamed a small photography company, so I transferred the content to the new website, put a 301 redirect on the old website URL, and turned off hosting for that website. But when I search for certain branded terms that the old URL used to rank highly for, the old URL still shows up. The old URL is "www.willmarlowphotography.com", and when you type in "Will Marlow" it often appears in 8th and 9th place on a SERP. So, I have two questions: First, since the URL no longer has a hosting account associated with it, shouldn't it just disappear from SERPs? Second, is there anything else I should have done to make the transition to the new URL smoother? Thanks for any insights you can share.
Technical SEO | williammarlow
-
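One note on the question above: a 301 only keeps working while something is serving it, so cancelling hosting on the old domain usually breaks the redirect. On an Apache host it is typically a few lines of .htaccess; a sketch, with the new domain as a placeholder:

```apache
# .htaccess on the OLD domain (willmarlowphotography.com)
# This redirect is served by the old host, so it stops working
# as soon as hosting for the old domain is cancelled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?willmarlowphotography\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.example/$1 [R=301,L]
```

Keeping the old domain's hosting (or at least a minimal redirect service) alive is what lets Google follow the 301 and transfer the rankings to the new URL.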
Remove Bad Links Or Build New
Hello, After deeply assessing our backlinks, we have come to the conclusion that we have too many links that have been devalued, as well as some spammy-looking links. Our next question is: do we remove these bad links and start afresh, or do we just build new white-hat links? Thanks, Scott
Technical SEO | ScottBaxterWW
-
Subdomain Removal in Robots.txt with Conditional Logic??
I would like to see if there is a way to add conditional logic to the robots.txt file so that when we push from DEV to PRODUCTION and the robots.txt file is pushed, we don't have to remember NOT to push the robots.txt file, OR to edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com; I'd like these to be removed from Google's index, as they are causing duplicate content. Should I: a) add 2 new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership - if I do this, then when the files are pushed to LIVE, won't they contain the VERIFY META CODE for the DEV version even though it's now LIVE? (hope that makes sense) b) write a robots.txt file that specifies "DISALLOW: DEV.website.com/" - is that possible? I have only seen examples of DISALLOW with a "/" at the beginning... Hope this makes sense - I can really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
Technical SEO | ErnieB
-
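As background for the question above: robots.txt has no conditional syntax and is always read per host, so a directive like `DISALLOW: DEV.website.com/` is not valid; each subdomain must serve its own file from its own root. A common approach is to keep two files in the repo and have the deploy step copy the right one into place; a sketch:

```text
# robots-dev.txt - deployed as /robots.txt on dev.website.com and new.website.com
User-agent: *
Disallow: /

# robots-prod.txt - deployed as /robots.txt on www.website.com
User-agent: *
Disallow:
```

Note that disallowing crawling does not by itself remove already-indexed URLs; verifying the subdomains in Webmaster Tools and submitting removal requests (option a in the question) is still needed for that.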
Removing inbound Spam Links
Hello, Last February one of my client's websites was delisted. It turns out that some time ago they had attempted to launch a social network along the lines of Ning. The project had fallen apart, but the site was still up. At some point spammers found it and started using it as part of a link farm. Once this was discovered, the subdomain it was hosted on was removed, and the website returned to search within 2 weeks. Last week, the website disappeared again. OSE shows that in the last 2 months the website has gained 2,000 additional spam links (there are about 16,000 total spam links) now pointing at the root domain. On top of that, Google Webmaster Tools is reporting about 15,000 404 errors. I have blocked Google from crawling the path where the spam pages used to be. Is there a way to block the thousands of inbound spam links?
Technical SEO | Simple_Machines