Industry News Page Best Practices
-
Hi,
We have created an industry news page which automatically curates articles from specific news sources within our sector.
Currently, I have the news index page set to be indexed and followed by robots, and the article pages set to noindex, nofollow, since these are not original content.
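For reference, that means each curated article page currently carries a robots meta tag along these lines (assuming it is done with the meta tag rather than an X-Robots-Tag header), with the news index page set to the opposite:

<!-- On each curated article page -->
<meta name="robots" content="noindex, nofollow">

<!-- On the news index page -->
<meta name="robots" content="index, follow">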
Is this the best practice or do you recommend another configuration?
Thanks!
-
As for your original question, your nofollow and noindex settings are fine.
I guess I don't understand the point of building out pages with content that isn't unique. I come from the camp that starting out right is always the best practice. People follow you because they want your point of view, and I don't think that takes much time: on most CMS platforms you can schedule your posts, so you can spend one day writing and spread the posts out over the week. I would see if someone in your company likes to write and whether they want to champion it.
But, like I said before, if you want to go the route you are currently taking, then the noindex, nofollow on those pages is fine.
-
Thanks Darin.
That's definitely a good strategy, and hopefully we'll do that at some point. But it also takes a lot more time and effort.
You have to start somewhere, and that's where we are.
What would you recommend for this stage of the campaign?
-
I use a slightly different approach: I like to write an article about the topic of the news and provide a link to the original story. For example, if you are a dentist and a new cleaning practice came out, I would write an article referencing the piece that was published, provide quality content about how the change will affect the industry, and link to that original post. That way you can use the rel=author element and possibly have those pages show up in the SERPs. I'm not a real big fan of putting other people's content on my site unless they have created it for me or I have something to add to it.
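The rel=author markup I'm talking about is just a link from the article to your author profile, roughly like this (the profile URL below is only a placeholder):

<!-- Somewhere in the article body; swap in your own author profile URL -->
Written by <a href="https://plus.google.com/your-profile-id" rel="author">Your Name</a>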
Darin.
-
Related Questions
-
What should I do with all these 404 pages?
I have a website that I'm currently working on that has been fairly dormant for a while and has just been given a facelift and brought back to life. I have some questions below about dealing with 404 pages. In Google WMT/Search Console there are reports of thousands of 404 pages going back some years. It says there are over 5k in total, but it seems I am only able to download 1k or so from WMT. I also ran a crawl test with Moz, and the report it sent back only had a few hundred 404s in it; why is that? I'm not sure what to do with all the 404 pages either. I know that both Google and Moz recommend a mixture of leaving some as 404s and redirecting others, and I'd like to know what the community here suggests. The 404s are a mix of the following: blog posts and articles that have disappeared (some of these have good backlinks too); URLs that look like they used to belong to users (the site used to have a forum), which were deleted when the forum was removed (some of them look like they were removed for spam reasons too, e.g. /user/buy-cheap-meds-online and others like that); and other URLs like /node/4455 (or some other random number). I'm thinking I should permanently redirect the blog posts to the homepage or the blog, but I'm not sure what to do about all the others. Surely having so many 404s like this is hurting my crawl rate?
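For the blog posts, I'm assuming each permanent redirect would just be a one-line rule in .htaccess, something like this (made-up paths, obviously):

# Send an old, now-missing blog post to the blog index with a 301
Redirect 301 /blog/old-article-that-disappeared/ http://www.example.com/blog/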
Technical SEO | linklander
-
How can I get Google to consider my News pages?
How can I get Google to consider my news pages as News and place them in the Google News section? Is there some syntax I need to include on all my news pages?
Technical SEO | AlexisWithers
-
Best online tool for checking indexed pages (or just for a Mac)
Hey guys, I'm on a Mac, which is why I can't use the usual PC software for checking whether my links have been indexed. Here's the deal: I ordered some guest posts. The guest poster wrote them for me and placed my backlinks. Now I want to quickly check which pages (with my backlinks) have been indexed. I have a lot of guest posts, so I need something that can check whether those pages have been indexed by Google. I need an online tool or something that will work on my Mac. Help. 🙂
Technical SEO | VinceWicks
-
Index page
To the SEO experts, this may well seem a silly question, so I apologise in advance; I try not to ask questions I probably already know the answer to, but clarity is my goal.

I have numerous sites and, as standard practice, through the .htaccess I will always set up non-www to www and redirect the index page to www.mysite.com. All straightforward; I have never questioned this practice and have always been advised it is the best practice to avoid duplicate content.

Today I was looking at a CMS service for a customer's website. The website is already built and is static, so CMS integration was going to mean a full rewrite. Speaking to a friend on another forum, he told me about a service called Simple CMS. I had a look, and it looks perfect for the customer. I went to set it up on the client's site, and here is the problem: for the CMS software to work, it MUST access the index page. Because my index page is redirected to www.mysite.com, it won't work, as it can't find the index page (obviously).

I questioned this with the software company, and they informed me that it must access the index page. I explained that it won't be able to, and why (because I have my index page redirected to avoid duplicate content). To my astonishment, the person there told me that duplicate content is a huge no-no with Google (that's not the astonishing part), but that it is not relevant to the index and non-index versions of a page. This goes against everything I thought I knew. The person also reassured me that they have worked in SEO for 10 years.

As I am a subscriber to SEOmoz and no one here has anything to gain but offering advice, is this true? Will search engines not view it as duplicate content when both the index page and the root URL resolve? Or is this SEO expert talking bull, which I suspect but cannot be sure of?

Any advice would be greatly appreciated; it would make my life a lot easier for the customer to use this CMS software, but I would be doing it at the risk of tarnishing the work they and I have done on their ranking status.

Many thanks in advance,
John
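P.S. The .htaccess rules I'm referring to are just the standard ones, roughly along these lines:

# Force non-www requests over to www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

# Send direct requests for the index file back to the root URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.(php|html?)\ HTTP/
RewriteRule ^index\.(php|html?)$ http://www.mysite.com/ [R=301,L]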
Technical SEO | Johnny4B
-
Best WordPress Robots.txt Sitemap Practice?
Alright, my question comes directly from this article by SEOmoz: http://www.seomoz.org/learn-seo/robotstxt. Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and now I want to add the location of our site's sitemaps. Does that mean I erase everything in the robots.txt right now and replace it with the following?

User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml

I ask because WordPress comes with some default disallows like wp-admin, trackback, and plugins. I have also read these other questions, but was wondering whether this is the correct way to add a sitemap to a WordPress robots.txt:
http://www.seomoz.org/q/robots-txt-question-2
http://www.seomoz.org/q/quick-robots-txt-check
http://www.seomoz.org/q/xml-sitemap-instruction-in-robots-txt-worth-doing

I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Hmm, that doesn't sound right. My current robots.txt looks like this:

User-agent: *
Disallow:
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-login.php
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /comments

Do I erase everything and change it to this?

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml

Technical SEO | joony2008
-
Getting More Pages Indexed
We have a large e-commerce site (Magento-based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages seems to fluctuate, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice to be considered important enough by Google to be indexed yet? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | Mattchstick
-
Local SEO best practices for multiple locations
When dealing with local search for a business with multiple locations, I've always created an individual page for each location. Aside from the address and business name being in there, I also like to make sure the title tag and other important markup feature the state/city/suburb, or, in the case of hyper-local, hyper-competitive markets, information more specific than that. It's worked very well so far. But the one thing you can always count on with local is that the game keeps changing. So I'd like to hear what you think:
How do you deal with multiple locations these days?
Has Google (and others, of course) advanced far enough not to mess things up if you put multiple locations on the same page? (Do I hear snickers? Be nice now.)
How does Schema.org fit into your tactics in this area, if at all?
Cheers (Edit: dear SEOmoz, stop eating my line breaks)
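P.S. The kind of Schema.org markup I have in mind for each location page is something like this (a rough sketch with placeholder business details):

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Dental - Springfield</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-0123</span>
</div>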
Technical SEO | BedeFahey
-
Page not being indexed
Hi all, On our site we have a lot of bookmaker reviews, and we are ranking pretty well for most bookmaker names as keywords; however, a single bookmaker seems to have been shunned by Google. For the search "betsafe" in Denmark, this page does not appear among the top 50: http://www.betxpert.com/bookmakere/betsafe All of our other review pages rank in the top 10-20 for the bookmaker name as a keyword. What should we do if Google has "banned" a page? Best regards, Rasmus
Technical SEO | rasmusbang