Search Console Incorrectly Identifies WordPress Version and Recommends Update
-
Howdy, Moz fans,
Today I received four emails from Google Search Console recommending I update WordPress. The message reads, "Google has detected that your site is currently running WordPress 3.3.1, an older version of WordPress. Outdated or unpatched software can be vulnerable to hacking and malware exploits that harm potential visitors to your site. Therefore, we suggest you update the software on your site as soon as possible."
This is incorrect, however; I've been on 4.3.1 for a while. Version 3.3.1 was never even installed, since this site was created in September 2015, so the initial WP Engine install was likely 4.3.
What's interesting is that it doesn't list the root URL as the problem source. The email states that it found that issue on a URL that is set up via WP Engine to 301 to a different site, which doesn't use WordPress. I also have other redirects set up to different pages on the second site that aren't listed in the Search Console email.
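That redirect detail may matter. Here is a minimal sketch (using a hypothetical `responses` table of made-up URLs in place of live HTTP traffic) of how a scanner that follows 301s ends up inspecting the destination site's HTML while still attributing the finding to the original URL it started from:

```python
def resolve_redirects(url, responses, max_hops=10):
    """Follow 301/302 Location headers in a simulated response table.

    `responses` maps URL -> (status_code, location_or_html). Returns the
    final URL and its HTML body. The *starting* URL is what a report
    would typically name, even though the HTML scanned belongs to the
    destination site.
    """
    seen = []
    while url not in seen and len(seen) < max_hops:
        seen.append(url)
        status, payload = responses[url]
        if status in (301, 302):
            url = payload  # hop to the redirect target
        else:
            return url, payload  # terminal response: return its body
    raise RuntimeError("redirect loop or too many hops")


# Hypothetical example: the flagged URL 301s to a second site.
table = {
    "http://old.example/": (301, "http://new.example/"),
    "http://new.example/": (200, '<meta name="generator" content="WordPress 3.3.1">'),
}
final_url, html = resolve_redirects("http://old.example/", table)
```

If something like this is happening, the version Google reports could belong to whatever the redirect target serves, not to the site named in the email.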
Anyone have any ideas as to what's causing this misidentification of WP versions? I am afraid that Google sees this as a vulnerability and is penalizing my site accordingly.
Thanks in advance!
-
I saw this for a client as well, who I know for sure isn't running WordPress at all. Personally, I think it's a Google mistake.
-
Thanks for that info, but I actually don't see a trace of 3.3.1 anywhere in my source code, so I'm still confused as to how it came up with that info. I do have a meta generator tag but it just contains a credit to Visual Composer.
The site is http://foam-roller.com.
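For what it's worth, the generator tag is the most common fingerprint a crawler reads. A minimal sketch of that check (a simplification and an assumption on my part; Google's actual detection method isn't documented, and real scanners also look at things like readme.html and asset version query strings):

```python
import re


def detect_generator(html):
    """Return the content of a <meta name="generator"> tag, or None.

    Simplified: assumes the name attribute precedes content, which is
    how WordPress emits the tag.
    """
    m = re.search(
        r'<meta\s+name=["\']generator["\']\s+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return m.group(1) if m else None


def wordpress_version(generator):
    """Extract the version number if the generator string is WordPress's."""
    if generator and generator.lower().startswith("wordpress"):
        parts = generator.split()
        return parts[1] if len(parts) > 1 else None
    return None
```

On a page whose generator tag only credits Visual Composer, both checks come back empty, which is what makes the 3.3.1 report so puzzling here.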
-
Thanks for the response. It's interesting to me that Google doesn't penalize for vulnerabilities; you'd think it would have some effect, since it's in Google's best interest not to serve potentially insecure or malicious websites, just as SSL has a positive effect on rankings.
-
Peter is right. I also wouldn't worry that you might get a penalty because of this. Google is very concerned about the security issues that websites can have, which is why it alerts webmasters through Search Console when it detects one.
-
I also get these notifications.
On the first site, there was an HTML file in wp-content/uploads with this in its header:
so the check works almost perfectly; the flagged file had simply been downloaded from another author.
On the second site, Joomla was identified as version 1.5 or lower:
and this is correct, but the site hasn't been hacked since its creation 5-6 years ago.
I think this is part of Google's ongoing notifications about updates, pushing CMSes across the internet toward the latest versions. This isn't their first such email, nor will it be the last. Do you remember the wp-timthumb notification? The Fancybox notification? The Revolution Slider notification? What do all these cases have in common? One vulnerability puts over 100k sites at risk. And the bad guys know this and exploit such vulnerabilities for black-hat SEO.