Does integration of external supplementary data help or hurt Google's perception of content quality? (e.g., weather info, climate tables, population info, currency exchange data via API or open-source databases)
-
We just lost over 20% of our traffic after the Google algorithm update on June 26.
In SEO forums, people guess that it was likely a Phantom update or maybe a Panda update. The most common advice I found was to add more unique content. While we already have unique proprietary content on all our pages and plan to add more, I was also considering adding some content from external sources. Our site is travel related, so for each city page I thought about adding external data such as weather, climate, and currency exchange data via APIs, plus data such as population figures from open-source databases or statistical information we would research on the web.
I believe this data would be useful to visitors. I understand that purely our own content would be ideal, and we will work on that as well.
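To make it concrete, the kind of server-side integration I have in mind looks roughly like the sketch below, so the data is rendered into the HTML that gets crawled rather than injected client-side. The weather API endpoint, parameters, and response fields are placeholders, not any particular provider's real API:

```python
# Hypothetical sketch: pull current weather for a city page server-side.
# The endpoint, parameters, and response fields below are placeholders.
import requests

WEATHER_API = "https://api.example-weather.com/v1/current"  # placeholder URL


def get_weather_snippet(city_name: str, api_key: str) -> str:
    """Return a short, human-readable weather line for a city page."""
    try:
        resp = requests.get(
            WEATHER_API,
            params={"q": city_name, "key": api_key},
            timeout=5,
        )
        resp.raise_for_status()
        data = resp.json()
        # Assumed response shape: {"temp_c": 21, "condition": "Sunny"}
        return f"Current weather in {city_name}: {data['condition']}, {data['temp_c']} °C"
    except (requests.RequestException, KeyError):
        # Fail quietly: the page's own editorial content should not depend
        # on the third-party feed being available.
        return ""


if __name__ == "__main__":
    print(get_weather_snippet("Barcelona", api_key="YOUR_KEY"))
```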
Any thoughts? Do you think the external data is more likely to help or hurt how Google perceives our content quality?
-
Everett, thanks so much. The link to the Quality Rater Guidelines was also very interesting and useful.
-
iCourse,
It used to be that Google told their Quality Raters to look for "Supplementary Content". This has recently been removed from their handbook for Quality Raters, and you can learn more about it here: http://www.thesempost.com/updated-google-quality-rater-guidelines-eat/
That said, they probably removed it because people were showing unrelated supplementary content, or because QRs were marking pages with lots of supplementary content and very little unique body content as "High Quality", which they are not.
In your case, all of the ideas you presented sounded like useful added information for someone on a local vacation or real estate page.
-
Hi Patrick, thanks, these are very useful links for an audit. The Barracuda tool is great as well.
In our case, we are already quite confident that our focus should be on adding more content to our roughly 1,000 city category pages.
My core question right now is: as a quick first step, should I add the external data mentioned above to the city pages now, or might it hurt in Google's eyes? For visitors it would be useful.
-
Hi there
What I would do is take a look at the algorithm updates and line up your analytics with the dates. Barracuda actually has a great tool to make this easy on you. Note which pages dropped the most. From there, I would look at the following resources:
- How To Do a Content Audit (Moz)
- Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz)
I am not so much worried about tools and plugins (as long as they are credible and you're not abusing them) as I am about the fact that travel sites covering a lot of cities often reuse the same content, simply switching the city names out. I would review duplicate content best practices and make sure you're not inadvertently relying on this tactic.
Let me know if this helps, happy to help where I can! Good luck!
Patrick
Related Questions
-
Content From API - Remove or Redirect?
Hi Guys,
I am working on a site at the moment. The previous developer used an API to pull in healthcare content (HSE), so the API basically generates landing pages within the site along with their content. To date it has generated over 2,000 pages.
Some actually rank organically and some don't. New site being launched: a new site is being launched, and the "health advice" section where this content used to live will not be included in the new site, so this content will not have a place to be displayed. My query: would you let the old content die off in the migration process and just become 404s,
Or
would you 301 redirect all of the pages, or only the ranking ones, to the homepage? Other considerations: the site will be moved to https://, so it will be resubmitted to Search Console and re-indexed by Google. I would love to hear if anyone has had a similar situation or suggestions.
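To make the second option concrete, a minimal sketch of redirecting only the ranking pages while letting the rest 404 could look like this (assuming a Flask-style app; the example paths and targets are placeholders, not the real site's URLs):

```python
# Sketch only: 301 the API-generated pages that still rank, let the rest 404.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical map of old API-generated URLs that rank -> best new target.
RANKING_REDIRECTS = {
    "/health-advice/asthma": "/",    # placeholder: could point to a relevant new page
    "/health-advice/diabetes": "/",
}


@app.route("/health-advice/<path:slug>")
def old_health_advice(slug):
    old_path = f"/health-advice/{slug}"
    target = RANKING_REDIRECTS.get(old_path)
    if target:
        return redirect(target, code=301)  # permanent redirect for ranking pages
    abort(404)  # everything else is allowed to die off


if __name__ == "__main__":
    app.run()
```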
Best Regards
Pat
-
Does Google ignore content styled with 'display:none'?
Do you know if an H1 within a div that has a 'display: none' style applied will still be crawled and evaluated by Google? We have that situation on this page, on line 136: view-source:https://www.junk-king.com/services/items-we-take/foreclosure-cleanouts Of course we also have an H1 up at the top of the page and are concerned that the second one will interfere with our SEO efforts. I've seen conflicting and inconclusive information online - not sure. Thanks for any help.
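For reference, a quick way to confirm that both H1s really are in the raw HTML that crawlers fetch (regardless of CSS display rules) is a small check along these lines; this is a sketch using the requests library, so swap in your own URL:

```python
# Sketch: list the H1 tags present in the served HTML of a page.
import re
import requests

html = requests.get(
    "https://www.junk-king.com/services/items-we-take/foreclosure-cleanouts",
    timeout=10,
).text

# Find every <h1>...</h1> in the raw source, hidden or not.
h1s = re.findall(r"<h1[^>]*>(.*?)</h1>", html, flags=re.IGNORECASE | re.DOTALL)
print(f"Found {len(h1s)} H1 tag(s):")
for text in h1s:
    print("-", re.sub(r"<[^>]+>", "", text).strip())
```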
-
Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
Hey guys. Wondering if someone can help diagnose a problem for me. Here's our site: https://www.flagandbanner.com/ We have a fairly large e-commerce site - roughly 23,000 URLs according to crawls using both Moz and Screaming Frog. I have created an XML sitemap (using Screaming Frog) and uploaded it to Webmaster Tools. WMT is only showing about 2,500 URLs indexed. Further, WMT is showing that Google is indexing only about half (approx. 11,000) of the URLs. Finally (to add even more confusion), a site: search on Google shows only about 5,400 URLs found. The numbers are all over the place! Here's the robots.txt file:
User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml
Anyone have any thoughts as to what our problems are??
Mike
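One quick sanity check worth running is to confirm which URL patterns the robots.txt above actually blocks, for example with Python's standard-library robotparser; the test paths below are placeholders, not real URLs from the site:

```python
# Sketch: verify whether specific URLs are blocked by the live robots.txt.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.flagandbanner.com/robots.txt")
rp.read()

test_urls = [
    "https://www.flagandbanner.com/products/american-flags",  # placeholder path
    "https://www.flagandbanner.com/search_keyword?q=flag",     # should match a Disallow rule
]
for url in test_urls:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked by robots.txt")
```
-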
Is it necessary to use Google's structured data markup or an alternative for my B2B site?
Hi, we are in the process of going through a redesign of our site. I am trying to understand whether we need to use some sort of structured data, either Google's structured data markup or schema.org?
-
Does content revealed by a 'show more' button get crawled by Google?
I have a div on my website with around 500 words of unique content in it. When the page is first visited, the div has a fixed height of 100px, showing a couple of hundred words and fading out to white, with a 'show more' button which, when clicked, increases the height to show the full content. My question is: does Google crawl the content in that div when it renders the page, or disregard it? It's all in the source code. Or worse, do they consider this cloaking or hidden content? It is only there to make the site more usable for customers, so I don't want to get penalised for it. Cheers
-
Blog content and Panda?
If we create a blog to keep in front of our customers (via email and posting on social) and the posts are around 300-1,000 words, like this site http://www.solopress.com/blog/, are we asking for a Panda slap, given that posts would get very few shares and little traction after the first day or two? Also, would Panda only affect the posts that are poor quality if we mix in a couple of really good posts, or would it affect those as well, and possibly even the whole site? Any help would be appreciated.
-
April Google Update?
Since April 16 (during Passover), Google has hurt one of our clients badly. They are a well-known and beloved brand with hundreds of employees and locations across the USA.
I can't see any signal of an organic update or a penalty (nor in Google Places). There is no message in GWT, and nothing has been changed on or off site. All keyword rankings are looking like this. All tools show good analysis: Moz, Barracuda, MajesticSEO. Content is good and not duplicated, etc. Are any of you aware of a significant Google update? What do you think/suggest?
-
Removing content from Google's Indexes
Hello Mozers. My client asked a very good question today. I didn't know the answer, hence this question. When you submit a 'Removing content for legal reasons' report (https://support.google.com/legal/contact/lr_legalother?product=websearch), will the person(s) owning the website containing this inflammatory content receive any communication from Google? My clients have already had the offending URL removed by a court order, which was sent to the offending company. However, the site has now been relocated and the same content is glaring out at them (and their potential clients) with the title "Solicitors from Hell + Brand name" immediately under their SERP entry. I'm going to follow the advice of the forum and try to get the URL removed via Google's report system, as well as take the rearguard action of increasing my clients' SERP entries via social and content. However, I need to be able to firmly tell my clients the implications of submitting a report. They are worried that if they rock the boat, this URL (with open access for reporting of complaints) will simply become more inflammatory. By rocking the boat, I mean Google informing the owners of this "Solicitors from Hell" site that they have been reported for hosting defamatory content. I'm hoping that Google wouldn't inform such a site, and that the only indicator would be an absence of visits. Is this the case, or am I being too optimistic?