Should I Remove My Articles From Article Directories?
-
I have been submitting articles to directories for about 3 years. With the Panda update, it seems that these directories are now obsolete. So, if there is no link value from these articles: 1) Should I remove these articles (at least the better ones) and place them on my site/blog? 2) If not, would there be any benefit in pointing some bookmarks at these old links to maybe get some juice out of them?
-
You are correct that they have little value.
It would be a good idea to place them on your own site instead; the catch is that they have probably been copied onto many other sites by screen scrapers by now, so you would not get credit for being the original.
You can try Copyscape to see if they have been copied: http://copyscape.com/
No, don't point links at them; point the links at your own site instead.
Related Questions
-
Old subdomain removal and deletion of content
There are two questions here. I have waited for over 2-3 weeks now and they are still not resolved.

1. An old subdomain (blog.nirogam.com) is still indexed on Google, even though all of its pages have been redirected or 404'd to the main domain. There is no Webmaster Tools access and no authority over this old subdomain; hosting for it might still exist, but the subdomain itself has been deleted and does not exist (we own the main domain only). How do I de-index and remove the pages for good? (Around ~1,000 pages.) I am trying this public tool; any better approaches? Even after removing pages and submitting them via the tool, 600 pages are still indexed after 2-3 weeks!

2. We deleted a lot of thin content/duplicate pages from the domain (nirogam.com) in WordPress, and all of these pages are still in Google's index. They are in the Trash folder now, which is causing an increase in 404s in Webmaster Tools. I have served a 410 header (using a WordPress plugin) on all of these pages, as they should not redirect to anything. However, Google does not always fully understand a 410, and the pages still show up in Webmaster Tools, as covered in this detailed post. All of these pages are still indexed.

How do I de-index these pages? Any other approach to stop the 404s and remove these pages for good? Any feedback/approach will be highly appreciated.
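For the 410 approach in the second question, serving the status at the web-server level is one hedge against plugin quirks. A minimal Apache sketch (the paths here are placeholders, not the site's real URLs):

```apache
# .htaccess: answer removed pages with "410 Gone" rather than 404,
# which tells crawlers the removal is permanent
Redirect gone /old-thin-page/
Redirect gone /another-deleted-article/

# Or mark a whole removed section as gone via mod_rewrite
RewriteEngine On
RewriteRule ^deleted-section/ - [G,L]
```

Either way, the URLs can also be submitted through Google's removal tool to speed things up; the 410 is what keeps the removal permanent once the temporary removal expires.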
Intermediate & Advanced SEO | | pks3330 -
How to remove seemingly untouchable link spam
Hey Mozzers, I have been struggling with this issue and I am hoping someone can help. I have a number of bad/spammy links to my site. We have never engaged in "bad SEO," but an old subdomain received a number of spammy blog comments, and everything seemed to escalate from there. We removed the subdomain that received all of the bad links from our DNS settings (about a year ago), but these links still show up when using Ahrefs or Majestic SEO. I don't think we have been penalized for these links, but I would like to clean them up because, well, it's the right thing to do. How does one do this when these sites seem so untouchable? They are from China, Russia, or Denmark, or were abandoned in 2009, etc. When I look for someone to contact, I can't seem to find anyone to even email. Suggestions?
Intermediate & Advanced SEO | | evan890 -
What happens if one removes the disavow file from a non-penalised site?
What happens if one removes the disavow file from a site that has not received a manual penalty from Google, although the site did suffer a drop in traffic and rankings?
Intermediate & Advanced SEO | | Taiger0 -
Are articles still beneficial, and how best to promote them?
Hello, I'm trying to promote a new site and am doing things differently moving forward, if needed, in order to avoid getting Google-slapped while being as efficient as possible. We have a main site which manufactures materials. We also have a blog on blogger.com; every week someone in our office writes an article about something related to our area of work, with a varied keyword or two embedded within it. My questions are as follows:

1. Should we change our blog address from oursite.blogger.com to blog.oursite.com?
2. Would it be beneficial to have a link from our main site to oursite.blogger.com?
3. We also have an ezine account. Would it be beneficial to also post this same article, perhaps with some minor changes, to our ezine account so that it would start to get more visibility from other sites, or is this now possibly a no-no?
4. Should we now be using nofollow links in our articles? If we do use nofollow links, aren't we losing the benefit?

Any suggestions would be greatly appreciated.
Intermediate & Advanced SEO | | Robdob2013
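On question 1 above: Blogger can serve the blog from your own subdomain via its custom-domain setting plus a DNS record at your host. A sketch of the DNS side (this assumes a typical Blogger setup; take the exact target hostname and any verification record from Blogger's own settings page rather than from here):

```text
; DNS zone sketch: point the blog subdomain at Google's servers
blog.oursite.com.    CNAME    ghs.google.com.
```

Once the custom domain is active, Blogger redirects the old oursite.blogger.com URLs to blog.oursite.com, so existing links keep working.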
Handling Similar page content on directory site
Hi All, SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but they are very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages indexed and want to know the best way to go about this. I was thinking I could put rel="nofollow" on all the links to those pages, but I am not sure that is the correct way to do it. Since the folders are deep within the site and not under one main folder, I would have to disallow many folders if I did this through robots.txt. The other option I am considering is a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these duplicate pages from my SEO report and from the search engine index? Thanks!
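The meta-tag option described at the end of the question is a single element in the head of each city page; noindex keeps the page out of the index, while follow lets crawlers keep passing through its links (a sketch, not tied to any particular platform):

```html
<!-- On each city/state directory page you don't want indexed -->
<meta name="robots" content="noindex, follow" />
```

One caveat worth knowing: for the noindex to be seen at all, the page must not be blocked in robots.txt, because a disallowed page is never fetched in the first place.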
Intermediate & Advanced SEO | | cchhita0 -
Free Directories - Yes or No?
Clearly, we see that high-authority directories like DMOZ.org are effective, even if this big monster is practically dead because of how unresponsive it is. What about other free directories? Is it worth obtaining as many listings as possible in the free directories? What about the paid ones? Is this still a good SEO strategy if the directories have at least a PR 3-4, or in many cases higher? I'm asking this for an established site, so I understand that it won't help with deep linking and anchor text, but will it help anyway to get links from these? If you like this post, help me out by giving me a Big Thumbs Up!
Intermediate & Advanced SEO | | applesofgold0 -
XML sitemap advice for a website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles; in our case we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? If that is true, and I have 12 categories, would the total number of URLs be 12? If so, how do you suggest handling our home page, where the latest articles are displayed regardless of their category? I.e., the spiders will find links to a given article both on the home page and in the category it belongs to. We are using canonical tags. Thanks, Jarrett
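One nuance on the question above: a sitemap doesn't instruct spiders how many levels to crawl; it is simply a list of URLs you want discovered. So each per-category sitemap would list every article URL in that category (up to 50,000 per file), not just the category's top-level URL, and a sitemap index file ties them together. A sketch with invented file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one child sitemap per news category -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-politics.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-sports.xml</loc>
  </sitemap>
  <!-- ...one entry per category... -->
</sitemapindex>
```

Duplicate discovery paths (home page plus category page) are fine; the canonical tag on each article settles which URL gets indexed.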
Intermediate & Advanced SEO | | jarrett.mackay0 -
Removing Duplicate Content Issues in an Ecommerce Store
Hi All, OK, I have an ecommerce store and there is a load of duplicate content, which is pretty much the norm with ecommerce store setups. E.g., this is my problem:

http://www.mystoreexample.com/product1.html
http://www.mystoreexample.com/brandname/product1.html
http://www.mystoreexample.com/appliancetype/product1.html
http://www.mystoreexample.com/brandname/appliancetype/product1.html
http://www.mystoreexample.com/appliancetype/brandname/product1.html

All of the above lead to the same product. I also want to keep the breadcrumb path to the product. Here's my plan:

1. Add a canonical URL to the product page, e.g. http://www.mystoreexample.com/product1.html. This way I have a short product URL.
2. Noindex all duplicate pages but do follow the internal links so the pages are spidered.

What are the other options available and recommended? Does that make sense? Is this what most people are doing to remove duplicate content pages? Thanks 🙂
Intermediate & Advanced SEO | | ChriSEOcouk
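The canonical step in the plan above comes down to one element on every duplicate variant (a sketch reusing the example URLs from the question):

```html
<!-- In the <head> of each duplicate path, e.g.
     /brandname/product1.html and /appliancetype/brandname/product1.html -->
<link rel="canonical" href="http://www.mystoreexample.com/product1.html" />
```

Search engines treat rel=canonical as a strong hint rather than a directive, so it's worth spot-checking which variant actually ends up indexed; combining it with noindex on the same pages sends conflicting signals and is usually avoided.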