Spammy Structured Data Markup Removal
-
Hi There,
I'm in a weird situation and I am wondering if you can help me.
Here we go: we had some of our developers implement structured data markup on our site, and they clearly did not know what they were doing. They messed up our results in the SERPs big time, and we ended up with a manual penalty for it. We removed the markup and got the penalty lifted (phew), but we are still stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer versions of the same pages. However, two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERPs, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old versions of the URLs anywhere on our site.
b) Those two old URLs are showing up in the SERPs with the old spammy markup. There is nowhere to remove the markup from, because those pages no longer exist, so the markup code isn't anywhere on the site anymore.
We need a solution for getting the markup out of the SERPs.
We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index them, in the hope that this gets Google to de-index those pages.
Is this a good idea? If yes, is there anything I should know about or watch out for? Or do you have a better idea for me?
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one; you can call it 301-sitemap.xml and submit it to Google Webmaster Tools. Keep it separate from your full sitemap: once those pages have been removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
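For illustration only, here is a minimal sketch of how such a dead-URLs-only sitemap could be generated with Python's standard library. The two URLs are hypothetical placeholders standing in for your old, redirected pages.

# Minimal sketch: build a separate "301-sitemap.xml" that lists only the dead
# URLs, so Google recrawls them and sees the permanent redirects.
# The URLs below are hypothetical placeholders; substitute your own.
from xml.etree.ElementTree import Element, SubElement, ElementTree

DEAD_URLS = [
    "https://www.example.com/old-page-1/",
    "https://www.example.com/old-page-2/",
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in DEAD_URLS:
    entry = SubElement(urlset, "url")
    SubElement(entry, "loc").text = url

ElementTree(urlset).write("301-sitemap.xml", encoding="utf-8", xml_declaration=True)

Once Google has dropped the old URLs from its index, the file can be deleted without touching the main sitemap, as described above.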
-
Thanks for your answer.
Should I create a sitemap with only the dead pages, and then have two sitemaps?
Please let me know.
-
Hi Yosepgr,
One thing I would like to clarify: in my opinion, the dev team needs SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the devs to do it. I'm not saying that is your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap including just the dead URLs. That way I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, make sure your redirect is actually a permanent redirect (301). You can check that easily with Screaming Frog by crawling those URLs in list mode, or get the Ayima plugin for Chrome and visit the URLs so you can see what the header responses look like. Make sure the redirect is a 301 and, if possible, a single hop.
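If you prefer to script that check instead of using Screaming Frog or the Ayima plugin, here is a small sketch using Python's requests library (my own assumption, not something mentioned in the thread). It prints every hop in the redirect chain so you can confirm there is exactly one 301. The URLs are hypothetical placeholders.

# Minimal sketch: verify each old URL answers with a single-hop 301 redirect.
# The URLs are hypothetical placeholders; replace them with your own.
import requests

OLD_URLS = [
    "https://www.example.com/old-page-1/",
    "https://www.example.com/old-page-2/",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = resp.history  # intermediate responses in the redirect chain
    print(url)
    for hop in hops:
        print(f"  {hop.status_code} -> {hop.headers.get('Location')}")
    print(f"  final: {resp.status_code} {resp.url}")
    if len(hops) != 1 or hops[0].status_code != 301:
        print("  WARNING: expected exactly one 301 hop")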
It may take a while for Google to digest the change, but you shouldn't be worried about the schema: if Google penalizes a site for spammy markup, it only penalizes the pages containing that markup, and those pages are now dead and removed from the site.
I hope this helps!
e
-
Hey there,
It's definitely not a good idea to re-do the old URLs. Have you submitted the site to be reindexed? Make sure you update your sitemap if needed (and/or robots.txt) and resubmit these to Google, then wait. Any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
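How those 301s get set up depends entirely on the stack, which the thread doesn't mention. Purely as an illustration, here is a rough sketch of wiring single-hop permanent redirects in a small Python/Flask app; the paths are hypothetical placeholders, and Apache, Nginx, or a CMS plugin would achieve the same thing in its own way.

# Rough sketch: map each old path to its new one with a permanent (301) redirect.
# Hypothetical paths; replace with your real URLs.
from flask import Flask, redirect

app = Flask(__name__)

REDIRECT_MAP = {
    "/old-page-1/": "/new-page-1/",
    "/old-page-2/": "/new-page-2/",
}

def make_redirect(target):
    # code=301 marks the redirect as permanent, which is the signal Google
    # needs to drop the old URL in favour of the new one.
    return lambda: redirect(target, code=301)

for i, (old_path, new_path) in enumerate(REDIRECT_MAP.items()):
    app.add_url_rule(old_path, endpoint=f"legacy_redirect_{i}",
                     view_func=make_redirect(new_path))

if __name__ == "__main__":
    app.run()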
If you still need help with the schema code, drop me a PM.
Have a great day
Andy
-
Related Questions
-
Schema Markup Warning "Missing field "url" (optional)"
Hello Moz Team, I hope everyone is doing well. I need a bit of help with schema markup: I am facing an issue with my schema markup, specifically on my blog posts, where the majority of posts show the warning "Missing field "url" (optional)".
Technical SEO | JoeySolicitor
As this schema is generated by the Yoast plugin, I haven't applied any custom steps. I recently published a post, https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/, and tested it on two schema testing platforms: 1. Validator.Schema.org
2. Search.google.com/test/rich-results. The validator shows no error (screenshot: Schema without error.PNG), but the Google rich results test (screenshot: Schema with error.PNG) gives me the warning "Missing field "url" (optional)". Is this really going to be an issue for my ranking? Please help, thanks!
-
Website url structure after redesign and 301 redirect chains - Looking for advice
OK, I've been trying to piece together what is best practice for someone I'm working with, so here goes: the website was redesigned, and the URLs changed from URL A to URL B, with 301s put in place. However, the new URL structure is not optimal. It's an e-commerce store, and all products are now placed in the root folder: www.website.com/product-name A better, more organized URL structure would be: www.website.com/category/product-name I think we can all agree on that. However, I'm torn on whether it's worth changing everything again, and on how to handle things in terms of redirects. The way I see it, it would result in a redirect chain, which is not great and would reduce link equity. Keeping the products in the root with a poor structure going forward doesn't feel great either. What to do? Any thoughts on this would be much appreciated!
Technical SEO | Tomasvdw0
-
Several Items in the Organization schema structured file
Hi Moz community! Could you please help me with an issue? I have implemented Organization schema on the website, but because of the page structure I cannot mark the data up just once, so I now have two items of the Organization schema on a single page. The questions are: 1. Does Google consider both of them? 2. Is it OK to have multiple items of one schema type on a page? Thank you
Technical SEO | juicefromtheraw10
-
Is Schema markup inappropriate for ?
Is Schema(.org) markup meant specifically to be used on text, or can you use it in a similar way to the Open Graph Protocol? For example, for a while I've been using something like this on my site: Because it's in the head section, it appears on every page. On review, this seems to be an incorrect use? Should I only be using Schema to mark up specific text? If not, what are the consequences of using Schema like this?
Technical SEO | eglove0
-
How to remove the duplicate page title
Hi everyone, I saw many posts related to this query, but I couldn't find a solution for my error. Here is my question: I got 575 duplicate page title and 600 duplicate page content errors. My site is related to real estate. I created page titles from the same sentence, differing only by locality name, e.g. "Land for sale - Kandy property", "Land for sale - Galle property". Only the locality name differs; I have created the meta titles and content like this. Can anyone let me know how to solve this error ASAP?
Technical SEO | Rajesh.Chandran0
-
We have duplicate page titles on the footer menu section of our site. Is this considered spammy?
When our new site was in the development stages, our digital agency convinced me that we should have duplicate menu links in the footer section of the site, the general justification being that the menu links are keyword relevant. I have since received opposing opinions from SEO advisers indicating that these duplicate menu links could be considered 'spammy'. I would appreciate some views on this, please.
Technical SEO | saints0
-
Search optimal Tab structure?
Good day, We are in the process of starting a website redesign/development. We will likely be employing a tabbed structure on our home page and would like to be able to capitalize on the keyword content found across the various tabs. The tab structure will be similar to how this site achieves tabs: http://ugmo.com/ I've uploaded a screen grab of this page as the Googlebot user agent ({61bfcca1-5f32-435e-a311-7ef4f9b592dd}_tabs_as_Googlebot.png). The text "Soil Intelligence for professional Turf Managers" clicks through to this page: http://ugmo.com/?quicktabs_1=1#quicktabs-1 So I'm thinking there could be some keyword dilution there. That said, Google is very much aware that the text on the quicktabs-1 page is related to the home page content: http://www.google.com/search?q=Up+your+game+with+precise+soil+moisture%2C+salinity+and+temperature+measurements.+And+in+the+process%2C+save+water%2C+resources%2C+money.+inurl%3Augmo.com&sourceid=ie7&rls=com.microsoft:en-us:IE-SearchBox&ie=&oe= Is this the best search-optimal way to add keyword density on a home page with a tab structure? Or is there a better means of achieving this?
Technical SEO | Hershel.Miller0
-
Are (ultra) flat site structures better for SEO?
Noticed that a high-profile site uses a very flat structure for their content. It essentially places most landing pages right under the root domain folder. So a more conventional site might use this structure: www.widgets.com/landing-page-1/ www.widgets.com/landing-page-1/landing-page-2/ www.widgets.com/landing-page-1/landing-page-2/landing-page-3/ The site in question, a successful one, deploys the same content like this: www.widgets.com/landing-page-1/ www.widgets.com/landing-page-2/ www.widgets.com/landing-page-3/ So when you're clicking deeper into the nav options, the clicks always roll up to the top level. Top-level pages are given more weight by search engines, but conventional directory structures are also seen as beneficial, if not ideal. Why would a site take the plunge and organize content in this way? What was the clincher?
Technical SEO | DisneyFamily1