Does integrating supplementary data from external sources help or hurt Google's perception of content quality? (e.g. weather info, climate tables, population figures, or currency exchange data pulled in via APIs or open-source databases)
-
We just lost over 20% of our traffic after the Google algorithm update on June 26.
In SEO forums, people guess it was likely a Phantom update, or maybe Panda. The most common advice I found was to add more unique content. We already have unique proprietary content on all our pages and plan to add more, but I was also considering adding content from external sources. Our site is travel related, so I thought about enriching each city page with external data such as weather, climate, and currency exchange data via APIs, plus data such as population figures from open-source databases or statistical information we would research on the web.
I believe this data would be useful to visitors. I understand that purely original content would be ideal, and we will keep working on that as well.
Any thoughts? Do you think external data is more likely to help or hurt how Google perceives our content quality?
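If we do go this route, my rough plan is to fetch the external data server-side and cache it, so page loads stay fast and we don't hammer the provider on every view. A minimal sketch of what I have in mind, using the free Open-Meteo weather API as an example (the exact endpoint parameters and cache policy are my assumptions, not a tested integration):

```python
import json
import time
import urllib.request

def fetch_open_meteo(lat, lon):
    """Fetch current weather from the free Open-Meteo API (no key required)."""
    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={lat}&longitude={lon}&current_weather=true"
    )
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["current_weather"]

# Simple in-process cache so a city page does not hit the API on every view.
_cache = {}

def get_city_weather(lat, lon, ttl=3600, fetcher=fetch_open_meteo):
    """Return (possibly cached) current weather for a city's coordinates."""
    key = (round(lat, 2), round(lon, 2))
    hit = _cache.get(key)
    if hit and time.time() - hit[0] < ttl:
        return hit[1]
    data = fetcher(lat, lon)
    _cache[key] = (time.time(), data)
    return data
```

The injectable `fetcher` also makes it easy to swap in a currency or population source later without changing the caching logic.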
-
Everett, thanks so much. The link to the Quality Rater Guidelines was also very interesting and useful.
-
iCourse,
It used to be that Google told its Quality Raters to look for "Supplementary Content". This was recently removed from their handbook for Quality Raters, and you can learn more about it here: http://www.thesempost.com/updated-google-quality-rater-guidelines-eat/ .
That said, they probably removed it either because people were showing unrelated supplementary content, or because raters were marking pages with lots of supplementary content and very little unique body content as "High Quality", which they are not.
In your case, all of the ideas you presented sound like useful added information for someone on a local vacation or real estate page.
-
Hi Patrick, thanks, these are very useful links for an audit. The Barracuda tool is great, too.
In our case, we are already quite confident that our focus should be adding more content to our roughly 1,000 city category pages.
My core doubt right now is really this: as a quick first step, should I add the external data mentioned above to the city pages now, or might it hurt in the eyes of Google? For visitors it would be useful. -
Hi there
What I would do is take a look at the algorithm updates and line up your analytics with the dates. Barracuda actually has a great tool to make this easy on you. Note which pages dropped the most. From there, I would look at the following resources:
- How To Do a Content Audit (Moz)
- Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
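Lining your analytics up with update dates can also be scripted. A rough stdlib-only sketch (the update dates, window, and drop threshold here are placeholders to adapt to your own data export):

```python
from datetime import date, timedelta

# Known algorithm-update dates you want to check against (examples only).
UPDATE_DATES = {date(2015, 6, 26)}

def flag_drops(daily_sessions, window=7, threshold=0.15):
    """Flag dates where traffic fell more than `threshold` below the
    trailing `window`-day average, and note nearby algorithm updates.

    `daily_sessions` maps datetime.date -> session count.
    Returns a list of (date, landed_near_update) tuples.
    """
    days = sorted(daily_sessions)
    flags = []
    for i in range(window, len(days)):
        avg = sum(daily_sessions[d] for d in days[i - window:i]) / window
        today = days[i]
        if daily_sessions[today] < avg * (1 - threshold):
            near_update = any(abs((today - u).days) <= 2 for u in UPDATE_DATES)
            flags.append((today, near_update))
    return flags
```

A drop that lines up within a day or two of a known update date is the signal to dig into which pages lost the most.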
I am not so much worried about tools and plugins (as long as they are credible and you're not abusing them) as I am about a pattern common to travel sites that have to cover a lot of cities: reusing the same content and simply swapping the city names out. I would review duplicate content best practices and make sure you're not inadvertently falling into this tactic.
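One quick way to sanity-check whether your city pages really are the same template with names swapped is a pairwise similarity pass over the page body text. A rough sketch (how you load the page text is up to you; the 0.9 threshold is a starting assumption, not a Google number):

```python
from difflib import SequenceMatcher
from itertools import combinations

def templated_pairs(pages, threshold=0.9):
    """Return page pairs whose body text is nearly identical, which
    suggests boilerplate with only the city name swapped out.

    `pages` maps page name -> body text.
    """
    dupes = []
    for (name_a, text_a), (name_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            dupes.append((name_a, name_b, round(ratio, 2)))
    return dupes
```

Note this is O(n²) over pages, so for ~1,000 pages you'd want to sample or compare within groups rather than run every pair.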
Let me know if this helps, happy to help where I can! Good luck!
Patrick
Related Questions
-
Many pages with small unique content vs. one page with big content
Dear all, I am redesigning some areas of our website, eurasmus.com, and it is not clear to us which is the best option to follow. On our site we have a city area, i.e. www.eurasmus.com/en/erasmus-sevilla, which we are going to redesign, and a guide area where we explain about the city, etc.: http://eurasmus.com/en/erasmus-sevilla/guide/, all with unique content. The thing is that at this point, due to lack of resources, our guide is not really deep, and we believe that as it stands it does not add extra value for users, creating a page with 500 characters of text for every area (transport, etc.). It is not really user friendly either. On the other hand, these pages are getting some long-tail results, though not for our target keyword (i.e. "transport in sevilla"); our target keyword would be "erasmus sevilla". When redesigning the city area, we have to choose between: a) www.eurasmus.com/en/erasmus-sevilla with all the content on one page, about 2,500 characters, unique; or b) www.eurasmus.com/en/erasmus-sevilla with a better amount of content and a nice redesign, but keeping the guide pages. What would you choose? Let me know what you think. Thanks!
Intermediate & Advanced SEO | Eurasmus.com
-
Blocking Certain Site Parameters from Google's Index - Please Help
Hello, so we recently used Google Webmaster Tools in an attempt to block certain parameters on our site from showing up in Google's index. One of our site parameters is essentially for user location and accounts for over 500,000 URLs. This parameter does not change page content in any way, and there is no need for Google to index it. We edited the parameter in GWT to tell Google that it does not change site content and should not be indexed. However, after two weeks, all of these URLs are still definitely getting indexed. Why? Maybe there's something we're missing here. Perhaps there is another way to do this more effectively. Has anyone else run into this problem? The path we used to implement this action: Google Webmaster Tools > Crawl > URL Parameters. Thank you in advance for your help!
Intermediate & Advanced SEO | Jbake
-
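One thing worth noting for readers of this question: the GWT URL Parameters setting is only a hint to Google, not a directive. A stronger signal is a rel="canonical" on each parameterized URL pointing at the clean URL. A sketch of building that canonical server-side (the parameter name here is hypothetical, standing in for the poster's location parameter):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Parameters that do not change page content (hypothetical name).
IGNORED_PARAMS = {"user_location"}

def canonical_url(url):
    """Build the canonical URL by dropping content-neutral parameters,
    for use in a <link rel="canonical" href="..."> tag."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

Every parameterized variant then declares the clean URL as canonical, which consolidates the 500,000 variants rather than waiting on the parameter hint.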
Duplicate content within sections of a page, but not full-page duplicate content
Hi, I am working on a website redesign. The client offers several services, and some elements of those services cross over with one another. For example, they offer a service called Modelling, and when you click onto that page, several elements that make up that service are featured, in this case 'mentoring'. Now, mentoring is common to other services and will therefore feature on other service pages. Each page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One idea we have come up with is to take the user through to a unique page hosting all the content; however, some features do not warrant a page being created for them. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | J_Sinclair
-
Are Dated News Posts Considered Low-Quality Content?
I've got lots of news posts on my blog. There is nothing wrong with the news posts themselves, but older posts do get lower CTR and higher bounce rates. I was considering moving the older news to a subdomain (i.e. archive.mywebsite.com) and doing a 302 redirect for each post. What do you think?
Intermediate & Advanced SEO | sbrault74
-
Will changing a Google Places address hurt rankings?
I have a client transferring ownership of their service business (photo booth rental). The current listed address will change, so my main concern is preserving rankings during the transition. Should I change the Google Local listing to a new physical address, or change it to "serves a surrounding area"? Setting it to "serves a surrounding area" seems best, but I know Google is really touchy about local listing changes. I've seen and heard about countless listings falling completely off the map after being updated. Any advice appreciated.
Intermediate & Advanced SEO | Joes_Ideas
-
Indexation by Google of content on internal (registration-only) pages
Hello, we have quite a large amount of content on internal pages that can only be accessed as a registered member. What are the different options to get this content indexed by Google? In certain cases we might be able to show a preview to visitors; in other cases this is not possible for legal reasons. Somebody told me there is an option to send the content of pages directly to Google for indexation, but unfortunately he couldn't give me more details. I only know that this is possible for URLs (via a sitemap). Is there really a way to do this for the entire content of a page without giving Google access to crawl that page? Thanks, Ben
Intermediate & Advanced SEO | guitarslinger
-
Best way to de-index content from Google but not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we were affected by Panda), but not from Bing. What is the best way to go about doing this?
Intermediate & Advanced SEO | nicole.healthline
-
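One option readers sometimes use for this: Google documents a user-agent-scoped form of the X-Robots-Tag response header (`X-Robots-Tag: googlebot: noindex`), which targets Googlebot specifically while other crawlers should ignore the scoped token. A sketch of the header logic (the URL prefixes are hypothetical; verify current engine behavior before relying on this):

```python
# Hypothetical thin-content sections to de-index from Google only.
AFFECTED_PREFIXES = ("/tag/", "/archive/")

def extra_headers(path):
    """Return response headers asking Google (via the googlebot-scoped
    X-Robots-Tag token) to drop the page, while leaving Bing alone."""
    if path.startswith(AFFECTED_PREFIXES):
        return [("X-Robots-Tag", "googlebot: noindex")]
    return []
```

The same effect can be had with a `<meta name="googlebot" content="noindex">` tag in the page head if editing templates is easier than editing server config.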
Google.ca vs. Google.com Rankings
I have a site I would like to rank high for particular keywords in Google.ca searches, and I don't particularly care about Google.com searches (it's a Canadian service). I have logged into Google Webmaster Tools and targeted Canada. Currently my site is ranking on the third page for my desired keywords on Google.com, but on the 20th page on Google.ca. Previously this change happened quite quickly, within 4 weeks, but it doesn't seem to be taking here (12 weeks out and counting). My optimization seems to be fine, since I'm ranking well on Google.com; I'm not sure why it's not translating to Google.ca. Any help or thoughts would be appreciated.
Intermediate & Advanced SEO | seorm