Does integration of external supplementary data help or hurt Google's perception of content quality? (e.g. weather info, climate tables, population figures, or currency exchange data via APIs or open-source databases)
-
We just lost over 20% of our traffic after the Google algorithm update on June 26.
In SEO forums, people guess that it was likely a Phantom update, or maybe a Panda update. The most common advice I found was to add more unique content. While we already have unique proprietary content on all our pages and plan to add more, I was also considering adding some content from external sources. Our site is travel-related, so I thought about adding external data to each city page, such as weather, climate, and currency exchange data via APIs, plus data such as population figures from open-source databases or statistical information we would research on the web.
I believe this data would be useful to visitors. I understand that purely original content would be ideal, and we will work on that as well.
Any thoughts? Do you think the external data is more likely to help or hurt how Google perceives our content quality?
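To make it concrete, here is a rough sketch of the kind of integration I have in mind. This is just an illustration, not our actual implementation: it uses the free Open-Meteo forecast endpoint as an example weather source, and the coordinates are placeholders for whichever city a page covers.

```python
import requests

def get_current_weather(latitude: float, longitude: float) -> dict:
    """Fetch current weather for one city page from a free weather API."""
    response = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": latitude,
            "longitude": longitude,
            "current_weather": "true",
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("current_weather", {})

# Example: the weather block for a Paris city page (placeholder coordinates)
print(get_current_weather(48.86, 2.35))
```

The currency and population data would be pulled the same way from their respective sources and rendered alongside our own editorial content.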
-
Everett, thanks so much. Also, the link to the Quality Rater Guidelines was very interesting and useful.
-
iCourse,
It used to be that Google told their Quality Raters to look for "Supplementary Content". This has recently been removed from their handbook for Quality Raters, and you can learn more about it here: http://www.thesempost.com/updated-google-quality-rater-guidelines-eat/
That said, they probably removed it because people were showing unrelated supplementary content, or because QRs were marking pages with lots of supplementary content and very little unique body content as "High Quality", which they are not.
In your case, all of the ideas you presented sounded like useful added information for someone on a local vacation or real estate page.
-
Hi Patrick, thanks, these are very useful links for an audit. The Barracuda tool is also great.
In our case, we are already quite confident that our focus should be adding more content to our roughly 1,000 city category pages.
My core doubt right now is really: should I, as a quick first step, add the external data mentioned above to the city pages now, or might it hurt us in Google's eyes? For visitors it would be useful.
-
Hi there
What I would do is take a look at the algorithm updates and line up your analytics with the dates (see the sketch after the resource list below for one way to do this). Barracuda actually has a great tool to make this easy on you. Note which pages dropped the most. From there, I would look at the following resources:
- How To Do a Content Audit (Moz)
- Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
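On lining up your analytics with the update dates, here is a minimal sketch of one way to do it programmatically. Assumptions: you have a CSV export of daily sessions from your analytics, and the update date and label in the dictionary are placeholders; pull the real list from a maintained algorithm update history.

```python
import pandas as pd

# Placeholder update dates -- replace with the real algorithm update history
ALGO_UPDATES = {
    "2015-06-26": "Suspected quality (Phantom/Panda) update",
}

# daily_traffic.csv: exported from your analytics, with columns date, sessions
traffic = pd.read_csv("daily_traffic.csv", parse_dates=["date"]).set_index("date")

for date_str, label in ALGO_UPDATES.items():
    update = pd.Timestamp(date_str)
    # Average daily sessions in the two weeks before vs. after the update
    before = traffic.loc[update - pd.Timedelta(days=14):update, "sessions"].mean()
    after = traffic.loc[update:update + pd.Timedelta(days=14), "sessions"].mean()
    print(f"{label}: {before:.0f} -> {after:.0f} sessions/day "
          f"({(after - before) / before:+.1%})")
```

If one update date lines up cleanly with the drop, that tells you which algorithm (and therefore which audit) to focus on first.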
I am not so much worried about tools and plugins (as long as they are credible and you're not abusing them) as I am about the fact that travel sites covering a lot of cities often reuse the same content, simply switching the city names out. I would review duplicate content best practices and make sure you're not inadvertently abusing this tactic.
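As a rough way to spot that pattern, something like this sketch can flag city pages that are near-identical once the city name is stripped out. The page text, city list, and 90% threshold here are all made up for illustration:

```python
from difflib import SequenceMatcher

# Placeholder page bodies keyed by city name -- in practice, load the
# rendered text of your city category pages here.
pages = {
    "paris": "Visit Paris for its museums, cafes and nightlife...",
    "lyon": "Visit Lyon for its museums, cafes and nightlife...",
}

def body_without_city(city: str, text: str) -> str:
    # Remove the city name so only the reused template text is compared
    return text.lower().replace(city, "")

cities = list(pages)
for i, a in enumerate(cities):
    for b in cities[i + 1:]:
        ratio = SequenceMatcher(
            None, body_without_city(a, pages[a]), body_without_city(b, pages[b])
        ).ratio()
        if ratio > 0.9:  # nearly identical apart from the city name
            print(f"{a} vs {b}: {ratio:.0%} similar -- review for duplication")
```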
Let me know if this helps, happy to help where I can! Good luck!
Patrick
Related Questions
-
Does content revealed by a 'show more' button get crawled by Google?
I have a div on my website with around 500 words of unique content in it. When the page is first visited, the div has a fixed height of 100px, showing a couple of hundred words and fading out to white, with a 'show more' button which, when clicked, increases the height to show the full content. My question is: does Google crawl the content in that div when it renders the page, or disregard it? It's all in the source code. Or worse, do they consider this cloaking or hidden content? It is only there to make the site more usable for customers, so I don't want to get penalised for it. Cheers
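One quick way to sanity-check this (a minimal sketch; the URL and phrase are placeholders) is to confirm the collapsed text really is present in the raw HTML the server returns, since that is what Google fetches before any rendering:

```python
import requests

# Placeholder URL and snippet: use a distinctive phrase from inside the
# collapsed div. If it appears in the raw source, it is served to crawlers
# and is not being injected only on click.
html = requests.get("https://www.example.com/page-with-show-more").text
snippet = "a distinctive phrase from inside the collapsed div"
print("Snippet found in source HTML:", snippet in html)
```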
Intermediate & Advanced SEO | SEOhmygod
-
Could this be seen as duplicate content in Google's eyes?
Hi, I'm an in-house SEO and we've recently seen Panda-related traffic loss along with some of our main keywords slipping down the SERPs. Looking for possible Panda-related issues, I was wondering if the following could be seen as duplicate content. We've got some very similar holidays (travel company) on our website. While they are different, I'm concerned it may be seen as creating content that is too similar: http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/kenya/suggested-holidays/the-wildlife-and-beaches-of-kenya.aspx http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/kenya/suggested-holidays/ultimate-kenya-wildlife-and-beaches.aspx http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/kenya/suggested-holidays/wildlife-and-beach-family-safari.aspx They do all have unique text but, as you can see from the titles, they are very similar (note from an SEO point of view the tabbed content is all within the same page at source level). At the top level of the holiday pages we have a filtered search: http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/kenya/suggested-holidays.aspx These pages have a unique introduction, but the content snippets being pulled into the boxes are drawn from each of the individual holiday pages. I'm just concerned that these could be introducing some duplication issues. Any thoughts?
Intermediate & Advanced SEO | KateWaite
-
Question regarding geo-targeting in Google Webmaster Tools.
I understand that it's possible to target both domains/subdomains and subfolders to different geographical regions in GWT. However, I was wondering about the effect of targeting the domain to a single country, say the UK, then targeting subfolders to other regions (say the US and France). e.g.
www.domain.com -> UK
www.domain.com/us -> US
www.domain.com/fr -> France
etc. Would it be better to leave the main domain without a geographical target but set geo-targeting for the subfolders? Or would it be best to set geo-targeting for both the domain and the subfolders?
Intermediate & Advanced SEO | TranslateMediaLtd
-
If a website trades internationally and simply translates its online content from English to French, German, etc., how can we avoid duplicate content penalties and still maintain SEO performance in each territory?
Most of the international sites are as below: example.com, example.de, example.fr. But some countries are on unique domains, such as example123.rsa.
Intermediate & Advanced SEO | Dave_Schulhof
-
Is Automated Quality Content Acceptable Even Though It Looks Similar Across Pages?
I have some advanced statistics modules implemented on my website, which are very high-level added value for users. However, the wording is similar across 1,000+ pages, with the difference being the statistical findings.
Page Ex 1: http://www.honoluluhi5.com/oahu/honolulu-condos/
Page Ex 2: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you can see, the same wording is used ("Median Sales Price per Year", "$ Volume of Active Listings", etc.), with the difference being that the findings/results are obviously different. Questions: are search engines smart enough to recognize the quality in this, or do they see similar wording across 1,000+ pages and potentially consider the pages low-quality content because they are unable to identify the high-level added value and complexity in pulling such quality data? If that may be the case, does that mean I ought to make the pages more "unique" by including a little piece of writing about each page, even though it is not of value to users?
Intermediate & Advanced SEO | khi5
-
Help!!! Am I being Attacked???
Hello, I do not believe so much in spammy link attacks, and I definitely do not believe my site is worth attacking. However, I'm seeing new links pointing to my site that I have no idea where they come from. I just spotted three articles on a poor, crappy article site with exact-match keywords pointing to me. The articles are completely unique (I ran them through Copyscape) and, according to the site's time stamps, they were posted during Oct and Nov 2012. (They also appear in the WMT recently discovered links from more or less the same time.) What to do (besides disavowing this domain)? Thanks
Intermediate & Advanced SEO | BeytzNet
-
Google Places
If you rank in Google Places, I have noticed that you do not rank on the front page as well. I have a site that ranks on the front page for its keywords; however, because they are #1 in Google Places, they don't show up when someone is localized to that area. They show up in Google Places but not on the front page. If you turn off localization, they are first in the SERPs. How can I get around this? Two separate sites? One for Google+ (Places) and one for the SERPs?
Intermediate & Advanced SEO | JML1179
-
Site: on Google
Hello, people. I have a quick question regarding search in Google. I use the search operator [site:url] to see the indexing status of my site. Today, I was checking indexing status and I found that Google shows different numbers of indexed pages depending on the search settings. 1. At the default setting (10 results per page), I get about 150 pages indexed by Google. 2. I set 100 results per page and tried again, and I get about 52 pages indexed by Google. Of course I used the same page URL. I really want to know which figure is accurate. Please help, people!!
Intermediate & Advanced SEO | Artience