Does anyone know how to fix this structured data error in Search Console? Invalid value in field "itemtype"
-
I'm getting the same structured data error in Search Console for most of my websites: Invalid value in field "itemtype"
I removed all the structured data, but I'm still having this problem. According to Search Console it is a syntax problem, but I can't find what is causing it.
Any guesses, suggestions, or solutions?
-
Hi,
Thanks for your answer.
As I said before, I'm having this problem even when I remove all the structured data from Tag Manager. I'm not sure what is generating that schema. Any ideas?
Regards,
-
Hi,
Thanks for your answer, but I'm having this problem on most of my websites, and it started even before I added any JSON schema through Tag Manager. I tried disabling all the schema in Tag Manager and the same thing keeps happening. All these sites are on WordPress, so I'm starting to think there is a problem in the code or something.
You can have a look at the page https://alexanders.co.nz/ and let me know if you have any suggestions.
I ran a test on https://search.google.com/structured-data/testing-tool and it highlights this problem, but as I said before, it does so even after I took all the schema out of Tag Manager, so I'm not sure what is generating that code or schema.
-
Hi,
Have you incorporated structured data into your website using HTML code (JSON-LD, schema markup, etc.)?
Maybe some of that data is incomplete.
Regards,
-
Hi,
Are you able to provide a link to the page?
From experience, it's combining the wrong schema types that causes a page not to validate. Search Console is usually about 3 days behind in collecting data, so to sort it out immediately, test the page here: https://search.google.com/structured-data/testing-tool
If after all this you're still having problems, check out a website that serves similar content and look at how they've marked up their page.
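Before reaching for Google's testing tool, you can also run a quick sanity check on any JSON-LD block yourself. The "Invalid value in field itemtype" message typically points at a type or context value that isn't a full schema.org URL. The sketch below is a minimal pre-check, not a substitute for Google's validator, and the snippet it checks is purely illustrative (it is not taken from the site in question):

```python
import json

# Hypothetical JSON-LD block for illustration only.
JSON_LD = """
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/"
}
"""

def check_json_ld(raw):
    """Run basic sanity checks on a JSON-LD block before testing
    the full page in Google's structured data testing tool."""
    data = json.loads(raw)  # raises ValueError on a literal syntax error
    problems = []
    # @context must be the schema.org URL, not a bare word like "schema".
    if data.get("@context") not in ("https://schema.org", "http://schema.org"):
        problems.append("@context should be the schema.org URL")
    # Every JSON-LD entity needs a @type.
    if "@type" not in data:
        problems.append("missing @type")
    return problems

print(check_json_ld(JSON_LD))  # an empty list means the basics are present
```

If `json.loads` throws, the block has a literal syntax error, which matches what Search Console reports. An empty list only means the basics are present, not that the markup is fully valid, so still run the page through the testing tool afterwards.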
Related Questions
-
Google Search Console indexes website for www but images for non www.
In Google Search Console, the website data is all showing for www.promierproducts.com. The images, however, are indexed on the non-www version. I'm not sure why.
Intermediate & Advanced SEO | | MikeSab1 -
How much does "sub-domain SEO optimisation" improve website ranking?
Let's say there is a website (domain) and a couple of sub-domains (around 6). If we optimise all the sub-domains for the "keyword" we want our website to rank for (for example, using the "keyword" across all sub-domain page titles and anywhere else it looks natural, such as brand mentions), will this help the website rank better for that same "keyword"? How much do these sub-domains really influence the website's rankings? And if the sub-domains have broken links, will that affect the website's SEO efforts?
Intermediate & Advanced SEO | | vtmoz0 -
Does integration of external supplementary data help or hurt Google's perception of content quality? (e.g. weather info, climate tables, population info, currency exchange data via API or open-source databases)
We just lost over 20% of our traffic after the Google algorithm update on June 26.
Intermediate & Advanced SEO | | lcourse
In SEO forums, people guess that it was likely a Phantom update or maybe a Panda update. The most common advice I found was to add more unique content. While we already have unique proprietary content on all our pages and plan to add more, I was also considering adding some content from external sources. Our site is travel related, so I thought about adding, for each city page, external data such as weather, climate data, and currency exchange rates via APIs from external sources, as well as data such as population figures from open-source databases or statistical info we would find on the web. I believe this data would be useful to visitors. I understand that purely our own content would be ideal, and we will work on that as well. Any thoughts? Do you think the external data is more likely to help or to hurt how Google perceives our content quality?0 -
What's the best possible URL structure for a local search engine?
Hi Mozzers, I'm working at AskMe.com, which is a local search engine in India, i.e. if you're standing somewhere and looking for the pizza joints nearby, we pick up your current location and share a list of nearby pizza outlets along with ratings, reviews, etc. Right now, our URL structure looks like www.askme.com/delhi/pizza-outlets for city-specific category pages (here, "Delhi" is the city name and "Pizza Outlets" is the category) and www.askme.com/delhi/pizza-outlets/in/saket for a category page in a particular area (here "Saket") in a city. The URL looks a little different if you're searching for something which is not a category (or not mapped to a category, in which case we 301 redirect you to the category page); it looks like www.askme.com/delhi/search/pizza-huts/in/saket if you're searching for pizza huts in Saket, Delhi, as "pizza huts" is neither a category nor mapped to any category. We're also dealing in ads and deals, along with our very own e-commerce brand AskMeBazaar.com, to create a better user experience and a one-stop shop for our customers. Now we're working on a URL restructuring project, and my question to all you SEO rockstars is: what is the best possible URL structure we can have? Assume we have kick-ass developers who can manage any given URL structure on the backend.
Intermediate & Advanced SEO | | _nitman0 -
Google Webmaster smartphone errors fix
I have certain URLs that I have fixed before in Google Webmaster Tools. With the smartphone addition, they have started appearing again. How can I fix the Google Webmaster errors for smartphones?
Intermediate & Advanced SEO | | csfarnsworth0 -
Add or not add "nofollow" to duplicate internal links?
Hello everyone. I have searched these forums for an answer to my concerns, and although I found many discussions and questions about applying or not applying "nofollow" to internal links, I couldn't find an answer specific to my particular scenarios. Here is my first scenario: I have an e-commerce site selling digital sheet music, and on my category pages our products are typically shown in the following format: PRODUCT TITLE link that goes to the product page Short description text "more info" link that goes to the same product page again As you may notice, the "more info" link goes to the very same page as the PRODUCT TITLE link. So, my question is: is there any benefit to "nofollowing" the "more info" link to tell SEs to "ignore" that link? Or should I leave it the way it is and let the SEs figure it out? My biggest concern with leaving the "nofollow" out is that the generic, repetitive "more info" anchor text could dilute or "compete" with the keyword content of the PRODUCT TITLE anchor text... but maybe that doesn't really matter! Here is a typical category page from my site: http://www.virtualsheetmusic.com/downloads/Indici/Guitar.html My second scenario: on our product pages, we have several different links that go to the very same "preview page" of the product we sell. Each link has a different anchor text, and some of the links are just images, all pointing to the same page. Here are the anchor texts or ALT texts of these identical links: "Download Free Sample" (text link) "Cover of the [product title]" (ALT image text) "Look inside this title" (ALT image text) "[product title] PDF file" (ALT image text) "This item contains one high quality PDF sheet music file ready to download and print." (ALT image text) "PDF" (text link) "[product title] PDF file" (ALT image text) So, I have 7 links on the same product page taking the user to the same "product preview page", which is, by the way, canonicalized to the "main" product page we are talking about.
Here is an example of a product page on my site: http://www.virtualsheetmusic.com/score/Moonlight.html My instinct is to tell SEs to take into account just the links with the "[product title] PDF file" anchor text, and then add a "nofollow" to the other links... but might that hurt in some way? Is it irrelevant? Does it matter? What should I do? Just ignore this issue and let the SEs figure it out? Any thoughts are very welcome! Thank you in advance.
Intermediate & Advanced SEO | | fablau0 -
Avoiding 301 on purpose; Landing homepage linking to another domain with "Click here to go" and 5 sec meta refresh
Hello, some users, when they search for our site using the "ourbrand" keyword, ignore the first result (we will call it ourbrand.de here; not its real name) and look for ourbrand.com instead. Even though we also have that domain name registered (indeed, it also has high ranking power), we are doing a 301 from the dot-com to the dot-de. What we want to do is index the homepage of the dot-com, that is http://www.ourbrand.com, as a secondary result, while 301ing any other internal URL of the dot-com to the dot-de. Yes, we will lose link juice for the main domain, but at least we will not lose visits from brand traffic (which is our main traffic). So the question is: would Google index ourbrand.com if we show just a landing page with our logo, a "Click here to go to ourbrand.de" link to http://www.ourbrand.de, and a meta refresh of 6 seconds to that URL? Additionally, a cookie would be set for first-time visitors, so the next time they would be redirected automatically. PS: The 6 seconds is to avoid search engines treating it like a 301, as they do with short meta refreshes (not sure what the minimum time is to avoid being treated as a 301). Any other suggestions on how to deal with this problem are welcome.
Intermediate & Advanced SEO | | Zillo0 -
To "Rel canon" or not to "Rel canon" that is the question
Looking for some input on an SEO situation that I'm struggling with. I guess you could say it's a usability vs. Google situation. The situation is as follows: on a specific shop (let's say it's selling t-shirts), the products are organized like this: each t-shirt has a master and x number of variants (a color). We have a product listing, and in this listing all the different colors (variants) are shown. When you click one of the t-shirts (e.g. blue), you get redirected to the product master, where some code on the page tells the master that it should change the color selectors to the blue color. The page gets this information from a query string in the URL. Now, I could let Google index each URL for each color and sort it out that way, except for the fact that the text doesn't change at all. The only thing that changes is the product image, and that is changed with Ajax in such a way that Google most likely won't notice, ergo producing "duplicate content" problems. OK! So I could sort this problem with a "rel canon", but then we are in a situation where the only thing that tells Google we are talking about a blue t-shirt is the link to the master from the product listing. We end up in a situation where the master is the only page getting indexed; not a problem, except when people come from Google directly to the product, where I have no way of telling what color the customer is looking for and hence won't know what image to serve her. Now, I could tell my client that they have to write unique text for each variant, but with hundreds of thousands of variant combinations this is not realistic or a really good solution. I kinda need a new idea; any input, idea, or brain wave would be very welcome. 🙂
Intermediate & Advanced SEO | | ReneReinholdt0