Spammy Structured Data Markup Removal
-
Hi There,
I'm in a weird situation and I am wondering if you can help me.
Here we go: we had some of our developers implement structured data markup on our site, and they clearly did not know what they were doing. They badly messed up our results in the SERPs, and we ended up with a manual penalty. We removed the markup and got the penalty lifted (phew), but we are still stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer versions of the same pages. However, two things have happened:
a) For some reason, two of the old dead pages still show up in the Google SERPs, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old URLs anywhere on our site.
b) Those two old URLs are showing up in the SERPs with the old spammy markup. There is nowhere to remove the markup from, because those pages no longer exist, so the markup code is no longer anywhere on the site.
We need a solution for getting the markup out of the SERP.
We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index them. Hopefully that will get Google to de-index those pages.
Is this a good idea? If so, is there anything I should know about or watch out for? Or do you have a better one for me?
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one. You can call it 301-sitemap.xml and submit it to Google Webmaster Tools. Keep it separate from your full sitemap: once those pages have been removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
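The separate dead-URL sitemap described above can be generated with a few lines of code. This is a minimal sketch: the two URLs are hypothetical placeholders for your real old URLs, and the filename `301-sitemap.xml` is just the name suggested in this thread.

```python
# Generate a standalone "301-sitemap.xml" listing only the dead URLs,
# so Google re-crawls them and discovers the redirects.
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical placeholders; substitute your real old URLs.
DEAD_URLS = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

def build_301_sitemap(urls):
    """Return a sitemap XML string containing only the given (dead) URLs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = url
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + tostring(urlset, encoding="unicode")

with open("301-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_301_sitemap(DEAD_URLS))
```

Because it lives in its own file, deleting it later leaves your main sitemap untouched, exactly as the answer suggests.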
-
Thanks for your answer.
Should I create a sitemap containing only the dead pages, and then have two sitemaps?
Let me know, please.
-
Hi Yosepgr,
One thing I would like to clarify: in my opinion, developers need SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the developers to do it. I'm not saying that's your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap containing just the dead URLs. That way I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, make sure your redirect is actually a permanent redirect (301). You can check this easily with Screaming Frog by crawling those URLs in list mode, or by getting the Ayima Redirect Path plugin for Chrome and visiting the URLs to see what the header responses look like. Make sure the redirect is a 301 and, if possible, a single hop.
It may take a while for Google to digest the change, but you shouldn't be worried about the schema: when Google penalizes a site for spammy markup, it penalizes only the pages containing that markup, and those pages are now dead and removed from the site.
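The "301 in a single hop" check above can also be sketched programmatically. This is an illustrative helper, not a replacement for Screaming Frog or Redirect Path: it validates a recorded chain of (status, location) hops, and the sample chains below are hypothetical.

```python
# Validate a redirect chain recorded while visiting an old URL.
# A healthy chain is exactly one hop, and that hop is a permanent 301.
def validate_redirect_chain(hops):
    """hops: list of (status_code, location) tuples, one per redirect hop.
    Returns True only for a single permanent (301) hop with a target."""
    if len(hops) != 1:
        return False  # chained redirects waste crawl budget; collapse to one hop
    status, location = hops[0]
    return status == 301 and bool(location)

# Hypothetical chains, like what a crawler or the Chrome plugin might report:
good = [(301, "https://www.example.com/new-page")]
chained = [(302, "https://www.example.com/tmp"),
           (301, "https://www.example.com/new-page")]

print(validate_redirect_chain(good))     # single permanent hop
print(validate_redirect_chain(chained))  # two hops, first one temporary
```

If the function returns False, fix the redirect at the server before expecting Google to drop the old URL.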
I hope this helps!
e
-
Hey there,
It's definitely not a good idea to re-create the old URLs. Have you submitted the site to be re-indexed? Make sure you update your sitemap if needed (and/or your robots.txt) and resubmit them to Google. Then wait. Any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
If you still need help with the schema code, drop me a PM.
Have a great day
Andy