Yoast platinum, news plug-in & meta properties
I have followed some of the discussions about Yoast and the news plug-in, but I have not found specific information about the use of meta properties. One of our competitors is successfully using about 15 meta properties to gain news rankings, and they list the publisher as Facebook. Is this markup part of the Yoast package, or is it hard-coded? As an example:
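(For illustration, here is a purely hypothetical sketch of the kind of Open Graph and news-related meta properties Yoast typically outputs; every value below is a placeholder. Note the article:publisher property: it points at a Facebook page URL, which is likely why the publisher appears to be listed as "Facebook".)
<code><!-- Hypothetical values throughout; representative of Yoast-style output -->
<meta property="og:locale" content="en_US" />
<meta property="og:type" content="article" />
<meta property="og:title" content="Example headline" />
<meta property="og:description" content="Example summary of the story." />
<meta property="og:url" content="http://www.example.com/news/example-story/" />
<meta property="og:site_name" content="Example News" />
<meta property="article:publisher" content="https://www.facebook.com/examplenews" />
<meta property="article:published_time" content="2014-06-01T09:00:00+00:00" />
<meta property="article:modified_time" content="2014-06-01T10:30:00+00:00" />
<meta name="news_keywords" content="example, news, keywords" />
</code>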
-
That part is probably generated by the Yoast plugin, as it looks pretty much the same as the setup that we have.
Related Questions
-
Tool to identify if meta descriptions are showing?
Hi, we have an ecommerce client with thousands of meta descriptions, and we have noticed that some of them are not showing properly. We want to pull them and see which ones are actually showing in Google's SERP results. You can use tools like Screaming Frog to pull the meta description from a page, but we want to see whether it is shown for certain keywords. Any ideas on how to automate this? Cheers.
Intermediate & Advanced SEO | brianna0
-
Canonical & rel=NOINDEX used on the same page?
I have a real estate company, www.company.com, with approximately 400 agents. When an agent gets hired, we allow them to pick a URL, which we then register and manage, for example www.AGENT1.com. We then take this agent domain and 301 redirect it to a subdomain of our main site: Agent1.com 301's to agent1.company.com. We have each page on the agent subdomain canonicalized back to the corresponding page on www.company.com; for example, agent1.company.com canonicalizes to www.company.com.
What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then later crawled the main URL. At that point in time, the two pages actually looked quite different from one another, so Google did not recognize/honor the canonical. For example:
Agent1.company.com/category1 gets crawled on day 1
Company.com/category1 gets crawled 5 days later
The content (recently listed properties for sale) on these category pages changes every day. If Google crawled the pages (both the subdomain and the main domain) on the same day, the content on the subdomain and the main domain would look identical. If the URLs are crawled on different days, the content will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site "fixed". We have seen a small decrease in organic traffic from Google to our main site since blocking crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%. After a couple of months we have now got our main site mostly "fixed", and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is basically being wasted with the robots.txt block in place.
Here is my question: if we put a robots rel=NOINDEX on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically, I want the link juice from the subdomains to pass to our main site, but I do not want the pages competing for a spot in the search results with our main site. Another thought I had was to place the NOINDEX tag only on the category pages (the ones that seem to change every day) and leave it off the product pages (property detail pages, which rarely ever change). Thank you in advance for any insight.
Intermediate & Advanced SEO | EasyStreet
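For reference, the combination being asked about, a robots NOINDEX plus a cross-domain canonical in the head of each subdomain page, would look roughly like the sketch below (hostnames follow the question's placeholders):
<code><!-- Hypothetical head of agent1.company.com/category1 -->
<meta name="robots" content="noindex, follow" />
<link rel="canonical" href="http://www.company.com/category1" />
</code>
One caveat on this design: Google has described noindex combined with rel=canonical as sending mixed signals, since the canonical asserts the pages are duplicates while noindex asks for the page's removal, so link equity may not pass as reliably as it would through a clean 301 redirect.
-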
Product Pages & Panda 4.0
Greetings, Moz Community: I operate a real estate website in New York City (www.nyc-officespace-leader.com). Of its 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped sharply since mid-May, around the time of the new Panda update, and I suspect it has something to do with those 350 very short listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options:
1. Setting them to "noindex". But I am concerned that removing product pages sends the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Rewriting 350 listings would be a real project, but if it is necessary to recover, I will bite the bullet.
What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content: information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty. Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty? Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
Meta tags - are they case sensitive?
I just ran the Wordtracker tool and noticed something interesting: the tool didn't pick up our meta description. It's strange, as our meta descriptions appear in organic search results and Moz never reported missing meta descriptions. After reviewing other pages, I noticed our meta description tag is written as: name="Description" content=" I never thought about this, but are meta tags case sensitive? Should it be written as: name="description" content=" Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs
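For what it's worth, the HTML spec generally treats meta name values case-insensitively and Google reads both forms, but some crawlers and tools do a strict case-sensitive string match and will only find the lowercase variant. Written out in full (the content value here is a hypothetical placeholder), the two variants are:
<code><!-- Both forms are equivalent to browsers and Google; strict tools may only match the second -->
<meta name="Description" content="Example description text." />
<meta name="description" content="Example description text." />
</code>
-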
Organic Rankings for the US & Australia
I have a site that is ranking well for competitive keywords in the US, but I would like to have it rank in Australia as well. Although there's no direct correlation, I'm running large AdWords campaigns in both countries. I've read that I should write localized content for each region, but I'm not sure this is as effective as it used to be. I've also read to use location markup and microformats. Any feedback would be greatly appreciated. Thank you in advance.
Intermediate & Advanced SEO | NickMacario
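One concrete form of the "location markup" mentioned above is hreflang annotations, which tell Google which URL targets which country. A hypothetical sketch for a US/Australia pair (the URLs are placeholders, assuming the Australian content lives under an /au/ path, and the pair should appear on both versions of the page):
<code><!-- Hypothetical hreflang pair for US- and Australia-targeted versions of a page -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com/au/" />
</code>
-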
WWW & non-WWW versions problem
I currently have both the WWW and non-WWW versions of my site live on the web (http://nile-cruises-4u.co.uk and http://www.nile-cruises-4u.co.uk), and I want all non-WWW requests to point to WWW; I want just the WWW version of the site live. I'm wondering the best way to resolve this. Is it normally dealt with in the .htaccess file? I think it looks like this, but I'm not sure:
<code># Redirect any non-www host to its www equivalent with a permanent 301
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]</code>
Intermediate & Advanced SEO | NileCruises
-
How do I reduce internal links & cannibalisation from primary navigation?
SEOmoz tools are reporting that each page on our site contains in excess of 200 internal links, mostly from our primary navigation menu, which it says is too many. This also causes cannibalisation on the word "towels", which I would like to avoid if possible. Is there a way to reduce the number of internal links whilst maintaining a good structure that allows link juice to filter through the site, and also reduce cannibalisation?
Intermediate & Advanced SEO | Towelsrus
-
Magento Hidden Products & Google Not Found Errors
We recently moved our website over to the Magento eCommerce platform. Magento has functionality to make certain items not visible individually, so you can, for example, take 6 products and turn them into 1 product where a customer can choose their options. You then hide all the individual products, leaving only that one product visible on the site and reducing duplicate content issues. We did this. It works great, and the individual products don't show up in our site map, which is what we'd like. However, Google Webmaster Tools has all of these individual product URLs in its Not Found crawl errors. For example:
White t-shirt URL: /white-t-shirt
Red t-shirt URL: /red-t-shirt
Blue t-shirt URL: /blue-t-shirt
All of those are not visible on the site, and the URLs do not appear in our site map, but they are all showing up in Google Webmaster Tools.
Configurable t-shirt URL: /t-shirt
This product is the only one visible on the site, does appear in the site map, and shows up in Google Webmaster Tools as a valid URL. Do you know how Google found the individual products if they aren't in the site map and aren't visible on the website? And how important do you think it is that we fix all of these hundreds of Not Found errors to point to the single visible product on the site? I would think it is fairly important, but I don't want to spend a week of manpower on it if the returns would be minimal. Thanks so much for any input!
Intermediate & Advanced SEO | Marketing.SCG
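A hedged sketch of one way to handle the errors described above: if the hidden simple-product URLs were ever linked or crawled (old external links, the previous platform, or Magento's own internal references), 301-redirecting each one to the visible configurable product clears the Not Found errors and preserves any link equity. In .htaccess terms, using the paths from the question (the mapping itself is an assumption):
<code># Hypothetical 301 redirects from hidden simple products to the visible
# configurable product; paths are taken from the question above.
Redirect 301 /white-t-shirt /t-shirt
Redirect 301 /red-t-shirt /t-shirt
Redirect 301 /blue-t-shirt /t-shirt</code>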