Spammy Structured Data Markup Removal
-
Hi There,
I'm in a weird situation and I am wondering if you can help me.
Here we go: some of our developers implemented structured data markup on our site, and they clearly did not know what they were doing. They messed up our results in the SERPs big time, and we wound up with a manual penalty. We removed the markup and got the penalty lifted (phew); however, we are still stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer versions of the same content. However, two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERPs, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old URLs anywhere on our site.
b) Those two old URLs are showing up in the SERPs with the old spammy markup. There is nowhere left to remove the markup from, because those pages no longer exist, so the markup code isn't anywhere on the site.
We need a way to get the markup out of the SERPs.
We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index them. Hopefully that will get Google to de-index those pages.
Is this a good idea? If so, is there anything I should know or watch out for? Or do you have a better one for me?
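For what it's worth, the "tell Google not to index these pages" part of the idea above would normally be expressed as a robots meta tag in the page head. A minimal sketch of how you could verify that a resurrected page actually carries that tag (pure HTML parsing, no network; the sample HTML snippets are made up, not the poster's real pages):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Scan an HTML document for <meta name="robots" content="...noindex...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta"
                and a.get("name", "").lower() == "robots"
                and "noindex" in a.get("content", "").lower()):
            self.noindex = True

def has_noindex(html):
    """Return True if the HTML contains a robots noindex meta tag."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex

print(has_noindex('<head><meta name="robots" content="noindex, nofollow"></head>'))  # True
print(has_noindex('<head><title>Page</title></head>'))                               # False
```

This only checks the page source; whether re-creating the dead URLs at all is wise is debated in the answers below in the thread.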
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one you can call 301-sitemap.xml and submit it to Google Webmaster Tools. Keep it separate from your full sitemap; once those pages are removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
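A minimal sketch of generating such a one-off sitemap with the Python standard library, assuming you have the list of dead URLs at hand (the URLs below are placeholders, not the poster's real pages):

```python
import xml.etree.ElementTree as ET

def build_301_sitemap(dead_urls):
    """Build a sitemap XML string listing only the dead, redirected URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in dead_urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

xml_out = build_301_sitemap([
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
])
print(xml_out)
```

Save the output as 301-sitemap.xml, submit it alongside (not instead of) the normal sitemap, and delete it once the old URLs drop out of the index.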
-
thanks for your answer,
Should I create a sitemap with only the dead pages, and then have two sitemaps?
Please let me know.
-
Hi Yosepgr,
One thing I would like to clarify is that developers need SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the devs to do it. I'm not saying that's your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap containing just the dead URLs. This way I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, ensure that your redirect is actually a permanent redirect (301). You can check that easily with Screaming Frog by crawling those URLs in list mode, or get the Ayima plugin for Chrome and visit the URLs to see what the header responses look like. Ensure that the redirect is a 301 and, if possible, a single hop.
It may take a while for Google to digest this, but you shouldn't worry about the schema: if Google penalizes a site for spammy markup, it penalizes only the pages containing that markup, which in your case are now dead and removed from the site.
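If you'd rather script the "301, single hop" check than use Screaming Frog or the Ayima plugin, a rough sketch with the Python standard library is below. `fetch_chain` follows redirects one hop at a time; `is_clean_301` is a pure helper you can also run on recorded chains without touching the network. The example chains are illustrative, not the poster's real URLs.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from auto-following redirects so each hop is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def fetch_chain(url, max_hops=5):
    """Follow redirects manually, returning a list of (status, location)."""
    opener = urllib.request.build_opener(NoRedirect)
    chain = []
    for _ in range(max_hops):
        try:
            resp = opener.open(url)
            chain.append((resp.status, None))  # terminal non-redirect response
            return chain
        except urllib.error.HTTPError as e:
            if e.code in (301, 302, 307, 308):
                location = e.headers.get("Location")
                chain.append((e.code, location))
                url = location
            else:
                chain.append((e.code, None))
                return chain
    return chain

def is_clean_301(chain):
    """True if the chain is exactly one permanent redirect ending in 200."""
    return (len(chain) == 2
            and chain[0][0] == 301
            and chain[1][0] == 200)

# Works on recorded chains, no network needed:
print(is_clean_301([(301, "https://example.com/new"), (200, None)]))  # True
print(is_clean_301([(302, "https://example.com/new"), (200, None)]))  # False: temporary
print(is_clean_301([(301, "/a"), (301, "/b"), (200, None)]))          # False: two hops
```

For the live check you would call `fetch_chain("https://www.example.com/old-page")` and pass the result to `is_clean_301`.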
I hope this helps!
e
-
Hey there,
It's definitely not a great idea to re-create the old URLs. Have you submitted the site to be reindexed? Make sure you update your sitemap if needed (and/or robots.txt) and resubmit these to Google. Then wait. Any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
If you still need help with the schema code drop me a PM.
Have a great day
Andy