Spammy Structured Data Markup Removal
-
Hi There,
I'm in a weird situation and I am wondering if you can help me.
Here's the background: we had some of our developers implement structured data markup on our site, and they clearly did not know what they were doing. They messed up our results in the SERPs big time, and we wound up with a manual penalty. We removed the markup and got the penalty revoked (phew), but now we are still stuck with two issues.
We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer versions of the same pages. However, two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERPs, even though it has been over six weeks since we changed the URLs. We made sure we aren't linking to the old URLs anywhere on our site.
b) Those two old URLs are showing up in the SERPs with the old spammy markup. There is nowhere left to remove the markup from, because those pages no longer exist, so the markup code isn't anywhere on the site anymore.
We need a solution for getting the markup out of the SERPs.
We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index them. Hopefully that will get Google to de-index those pages.
Is this a good idea? If yes, is there anything I should know about or watch out for? Or do you have a better suggestion?
Thanks so much
-
Thanks so much
I'll try that right away
-
Yes, just create one; you can call it 301-sitemap.xml and submit it to Google Webmaster Tools. Keep it separate from your full sitemap: once those pages have been removed from the Google SERPs, you can simply delete it without affecting your normal sitemap.
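For illustration, here's a minimal sketch of generating that standalone sitemap, assuming you'd rather script it than hand-write the XML (the URLs below are hypothetical placeholders):

```python
# Sketch: build a standalone 301-sitemap.xml that lists only the dead,
# redirected URLs, so Google re-crawls them and discovers the 301s.
# The URLs are hypothetical placeholders.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_301_sitemap(dead_urls):
    """Return sitemap XML (as a string) containing only the old, dead URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for dead_url in dead_urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = dead_url
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    dead = [
        "https://www.example.com/old-page-1",
        "https://www.example.com/old-page-2",
    ]
    print(build_301_sitemap(dead))
```

Once the old URLs drop out of the index, the whole file can be deleted without touching the main sitemap.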
-
thanks for your answer,
Should I create a sitemap with only the dead pages, and then have two sitemaps?
let me know, please.
-
Hi Yosepgr,
One thing I would like to clarify: in my opinion, devs need SEO guidance on how to implement schema. Sometimes people just request a schema implementation and then wait for the devs to do it. I'm not saying this is your case, but we, as SEOs, should provide technical guidance on how to implement it correctly.
That being said, I had a similar case in the past, and what I did was create a sitemap containing just the dead URLs. That way I forced Google to crawl them and see that they now redirect to the new versions.
After doing so, ensure that your redirect is actually a permanent redirect (301). You can check that easily with Screaming Frog by crawling those URLs in list mode, or by installing the Ayima Redirect Path plugin for Chrome and visiting the URLs to see what the header responses look like. Make sure the redirect is a 301 and, if possible, resolves in a single hop.
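If you prefer to script that check, here's a rough sketch of the same redirect-chain verification; the `fetch` callable is injected (and hypothetical) so the chain-walking logic stands alone, while a real run would issue HTTP HEAD requests with whatever client you like:

```python
# Sketch of a redirect-chain check: each old URL should answer with a
# single 301 hop straight to a page that returns 200. `fetch` is an
# injected callable returning (status_code, Location header or None),
# so the logic here is independent of any particular HTTP client.

def trace_redirects(url, fetch, max_hops=10):
    """Follow redirects from `url` and return the list of (url, status) pairs."""
    chain = []
    for _ in range(max_hops):
        status, location = fetch(url)
        chain.append((url, status))
        if status not in (301, 302, 307, 308) or location is None:
            return chain
        url = location
    raise RuntimeError("Too many redirects starting from %s" % chain[0][0])

def is_clean_301(chain):
    """True when the chain is exactly one 301 followed by a 200."""
    return len(chain) == 2 and chain[0][1] == 301 and chain[1][1] == 200
```

A clean result for one of the stubborn URLs would look like `[(old_url, 301), (new_url, 200)]`; anything longer means an extra hop worth collapsing.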
It may take a while for Google to digest the change, but you shouldn't be worried about the schema: if Google penalizes a site for spammy markup, it penalizes only the pages containing that markup, and those pages are now dead and removed from the site.
I hope this helps!
e
-
Hey there,
It's definitely not a good idea to recreate the old URLs. Have you submitted the site to be reindexed? Make sure you update your sitemap if needed (and/or robots.txt) and resubmit it to Google, then wait; any additional changes might confuse Google even more. Make sure to 301 the old pages to the new ones.
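For the 301s themselves, a minimal sketch of what the server rules might look like, assuming an Apache server and hypothetical paths (the exact mechanism depends on your stack):

```apache
# Hypothetical example, assuming Apache with mod_alias enabled:
# permanently redirect each dead URL to its replacement.
Redirect 301 /old-page-1 https://www.example.com/new-page-1
Redirect 301 /old-page-2 https://www.example.com/new-page-2
```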
If you still need help with the schema code, drop me a PM.
Have a great day
Andy