GWT Message - CMS Update Available
-
Howdy Moz,
Just received a message in Google Webmaster Tools about a CMS update:
"Joomla Update Available
As of the last crawl of your website, you appear to be running Joomla 1.5. One or more of the URLs found were:
http://www.website/custom-url/article5034
Google recommends that you update to the latest release. Older or unpatched software may be vulnerable to hacking or malware that can hurt your users. To download the latest release, visit the Joomla download page. If you have already updated to the latest version of Joomla, please disregard this message.
If you have any additional questions about why you are receiving this message, Google has provided more background information in a blog post about this subject."
I read through the associated blog post. According to the post, Joomla adds a generator meta tag to pages that notes the CMS version. Here's the oddity:
The site was on Joomla 1.5 over two years ago. A year ago it was updated to Joomla 2.5. About a week ago it was converted completely to WordPress. According to GWT, the last date Googlebot accessed the site was the day before the email (5/1/14).
I went through the code, the CSS/HTML, and the database and found no reference to Joomla 1.5.
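In case it helps anyone checking the same thing, here's a rough Python sketch of how the live pages can be double-checked for a leftover generator tag (the URL is just a placeholder, not the real one from the GWT message):

```python
import re
import urllib.request

# Placeholder URL -- swap in one of the pages Google flagged.
url = "http://www.example.com/custom-url/article5034"

req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Joomla and WordPress both add a <meta name="generator" ...> tag by default,
# which is what the blog post says Google keys the CMS version off.
tags = re.findall(r'<meta[^>]+name=["\']generator["\'][^>]*>', html, re.IGNORECASE)
print(tags if tags else "No generator meta tag found.")
```

If nothing shows up on the live pages, the notice is almost certainly based on an older crawl.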
Has anyone seen this message? If so, how did you rectify it? Were there any adverse effects on rankings?
-
Just wanted to add: no, I don't think there would be any adverse effects on rankings unless the site was compromised somehow. Since you are on a completely different system, you should be fine with a resubmission.
On a side note, since you are on WordPress now, make sure you have the right file permissions and that shell access is turned off. Hackers love a WordPress site that is unprepared.
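If it's useful, here's a rough Python sketch (the install path is an assumption) that walks a WordPress directory and flags anything writable by group or other, which is looser than the commonly recommended 644 for files and 755 for directories:

```python
import os
import stat

# Assumed path to the WordPress install -- adjust for your host.
WP_ROOT = "/var/www/html"

for root, dirs, files in os.walk(WP_ROOT):
    for name in dirs + files:
        path = os.path.join(root, name)
        mode = stat.S_IMODE(os.lstat(path).st_mode)
        # Group- or world-writable files are an easy target on shared hosting.
        if mode & 0o022:
            kind = "dir " if os.path.isdir(path) else "file"
            print(f"{kind} {oct(mode)}  {path}")
```

Separately, the built-in theme/plugin editor can be switched off with the DISALLOW_FILE_EDIT constant in wp-config.php, which removes one common attack surface.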
Best of luck with the new site!
-
Sounds like they have a super old page cached in their system. Do you have caching turned off in the new WordPress site? Most people turn it on, but I have found that if you have gzip compression and CSS compiling turned on, it's not needed. To me, caching can create more headaches than it solves (personal opinion).
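If you want to confirm gzip is actually being served (not just switched on in a plugin), a quick check like this Python sketch works; the URL is a placeholder:

```python
import urllib.request

# Placeholder -- point this at your own homepage.
url = "http://www.example.com/"

req = urllib.request.Request(url, headers={
    "Accept-Encoding": "gzip, deflate",
    "User-Agent": "Mozilla/5.0",
})
with urllib.request.urlopen(req) as resp:
    # If compression is on, the server should answer with Content-Encoding: gzip.
    print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
```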
You could also go into your webmaster account and do a fresh "Fetch as Google" for the root domain and all linked pages. That way they will have the latest version of your site in their database. Google downloads your site so it can reference the content quickly for search queries, and it could be that they are still working from an old download. Have you resubmitted the site since you rebuilt it?
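Before resubmitting, it's also worth confirming that every URL in the new sitemap resolves cleanly, so you're not feeding Google dead links. A rough sketch (the sitemap URL is a placeholder):

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder -- use the sitemap the new WordPress site generates.
SITEMAP = "http://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

# Check each <loc> entry and print its HTTP status.
for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(urllib.request.Request(url, method="HEAD")) as r:
            status = r.status
    except urllib.error.HTTPError as err:
        status = err.code
    print(status, url)
```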
-
You can check Google's cached version of that page to see what they have in the index: is it a recent version of the page, or something pretty old that they haven't reindexed in a while?
Either way, I think these notices are more recommendations and helpful tidbits they think will assist webmasters than crucial information that will influence rankings. So if the message isn't relevant anymore, I would ignore it and move on to building a great website for your visitors!
Related Questions
-
During a major update, do ranking updates seem to be on pause?
Hello, I have read in the past that during a major update Google puts all its resources into the update, and it seems that they don't update search results anymore. Has anyone noticed that too? How long does it take for an update to be rolled out fully and for everything to get back to normal? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
HTTPS Update - 1 Category Dropped Out of Google
Hi, we updated to HTTPS last week. We haven't had any major issues and most categories on the site are OK, apart from one: we have completely dropped out of the Google rankings for our Dollies section: https://www.key.co.uk/en/key/dollies-load-movers-door-skates We've always ranked well on the first page for a number of keywords; now we're out of the top 100. I am trying to hunt for an issue but I can't seem to find one. Can anyone advise? Thanks 🙂
Intermediate & Advanced SEO | BeckyKey
-
I'm noticing that URLs that were once indexed by Google are suddenly getting dropped without any error messages in Webmaster Tools. Has anyone seen issues like this before? Here's an example:
http://www.thefader.com/2017/01/11/the-carter-documentary-lil-wayne-black-lives-matter
Intermediate & Advanced SEO | nystromandy
-
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT.
Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link).
With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with broken links on the site on a more regular basis.
So here’s where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.
Thoughts? If you think I’m off base, please set me straight. 🙂
Intermediate & Advanced SEO | ufmedia
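For the "implement 301s for really important pages" step in that question, one way to avoid hand-editing htaccess is to generate the redirect lines from a small CSV covering only the mission-critical pages. A rough sketch (the file name and column names are made up):

```python
import csv

# Assumed input: "old_path,new_path" rows for important pages only,
# not for all 2.3 million 404s.
INPUT = "important_redirects.csv"
OUTPUT = "redirects.htaccess"

with open(INPUT, newline="") as src, open(OUTPUT, "w") as out:
    for row in csv.DictReader(src):
        # Redirect is a standard mod_alias directive; one line per page.
        out.write(f"Redirect 301 {row['old_path'].strip()} {row['new_path'].strip()}\n")
```

Pages that are genuinely gone can simply keep returning a 404 (or a 410), which keeps the htaccess file from growing with every data refresh.
-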
To recover from the Penguin update, should I remove the links or disavow them?
Hi, one of our websites was hit by the Penguin update and I now know where the links are coming from. I have the chance to get the links removed from those linking sites, but I am a little confused: should I just have the links removed, or disavow them? Thanks
Intermediate & Advanced SEO | Rubix
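For reference, the disavow file Google expects is plain text: lines starting with # are comments, domain: prefixes cover whole domains, and full URLs cover individual pages. A small sketch of building one (the domains listed are made up):

```python
# Made-up examples of links you could not get removed manually.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/low-quality-guest-post"]

with open("disavow.txt", "w") as f:
    f.write("# Removal requested, no response received\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # drops every link from that domain
    for url in bad_urls:
        f.write(f"{url}\n")            # drops a single linking page
```

The usual advice is to attempt removal first and treat the disavow tool as the fallback for links you cannot get taken down.
-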
Using the WP All Import CSV plugin for WordPress to update products daily on a large ecommerce site: category naming and other issues
We have just got an automated solution working to upload about 4,000 products daily to our site. We get a CSV file from the wholesaler's server each day, and the way they have named products and categories is not ideal. Although most of the products remain the same (and don't need to be overwritten), some will go out of stock or prices may change. The problem is that we have no control over the CSV file, so we need to keep the categories they have given us. Might we be able to create new categories and have products listed under multiple categories? If anyone has used WP All Import or has knowledge in this area, please let me know. I have plenty more questions, but this should start the ball rolling! Thanks in advance, Mozzers
Intermediate & Advanced SEO | weebro
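One way to keep a 4,000-product daily import light is to diff today's wholesaler CSV against yesterday's and only feed the changed rows to the import. A rough Python sketch; the column names ("sku", file names) are assumptions, since the real feed isn't shown:

```python
import csv

def load(path):
    # Assumes a "sku" column uniquely identifies each product.
    with open(path, newline="") as f:
        return {row["sku"]: row for row in csv.DictReader(f)}

yesterday = load("feed_yesterday.csv")
today = load("feed_today.csv")

changed = [row for sku, row in today.items() if yesterday.get(sku) != row]
removed = [sku for sku in yesterday if sku not in today]  # likely out of stock

# Write only the new/updated rows for the importer to process.
with open("feed_delta.csv", "w", newline="") as f:
    if changed:
        writer = csv.DictWriter(f, fieldnames=changed[0].keys())
        writer.writeheader()
        writer.writerows(changed)

print(f"{len(changed)} rows to re-import, {len(removed)} products gone from the feed")
```
-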
A First For Me: Client Wants New Website & Completely Updated Content
A client I worked with on another project has approached me asking if I can handle the transition from his old website to his new one. He already had the new website designed, and the URL structure is completely different from the old one. Normally this would not be a problem: just 301 redirect each old page to its new page. However, this client has COMPLETELY redone his website from the ground up, including navigation, pages, and page content. The old website has been around for 8 years and is ranking for some good keywords. The reason he decided to build the new site is that the URL of his old domain is very long and for whatever reason he didn't like it (I'm assuming the issue was misspellings from people trying to get back to his website and long email addresses, but he didn't clarify). I have never dealt with such a drastic change before and wanted the SEOmoz community's input on the best way to pass authority/link juice from the old domain to the new one. Thanks in advance for your help.
Intermediate & Advanced SEO | Bo-Jangles
-
Will our PA be retained after URL updates?
Our web hosting company recently applied an SEO update to our site to deal with canonicalization issues and also rewrote all URLs to lower case. As a result, our PA is now 1 on all the pages it affected. I took this up with them and they had this to say: "I must confess I’m still a bit lost, however I can assure you our consolidation tech uses a 301 permanent redirect for transfers. This should ensure any back link equity isn’t lost. For instance this address:
http://www.towelsrus.co.uk/towels-bath-sheets/aztex/egyptian-cotton-Bath-sheet_ct474bd182pd2731.htm
redirects to this page:
http://www.towelsrus.co.uk/towels-bath-sheets/aztex/egyptian-cotton-bath-sheet_ct474bd182pd2731.htm
and the redirect returns a 301 header response – as discussed in your attached forum thread extract."
Firstly, is canonicalization still working, given that the number of duplicate pages shot up last week? And will we get our PA back? Thanks, Craig
Intermediate & Advanced SEO | Towelsrus
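For migrations like the last two questions (a brand-new URL structure in one case, a lower-case rewrite in the other), it's easy to spot-check that the old addresses really do return a single 301 hop to the expected new URL. A small sketch with a made-up URL pair:

```python
import http.client
from urllib.parse import urlsplit

# Made-up (old URL, expected new URL) pairs to verify.
checks = [
    ("http://www.example.co.uk/aztex/Egyptian-Cotton-Bath-Sheet.htm",
     "http://www.example.co.uk/aztex/egyptian-cotton-bath-sheet.htm"),
]

for old, expected in checks:
    parts = urlsplit(old)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")   # http.client never follows redirects
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    ok = resp.status == 301 and location == expected
    print(resp.status, old, "->", location, "OK" if ok else "CHECK")
    conn.close()
```

If the old URLs answer with a single 301 to the right target, link equity should consolidate over time; redirect chains and 302s are the things to fix.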