GWT Message - CMS Update Available
-
Howdy Moz,
Just received a message in Google Webmaster Tools about a CMS update:
"Joomla Update Available
As of the last crawl of your website, you appear to be running Joomla 1.5. One or more of the URLs found were:
http://www.website/custom-url/article5034
Google recommends that you update to the latest release. Older or unpatched software may be vulnerable to hacking or malware that can hurt your users. To download the latest release, visit the Joomla download page. If you have already updated to the latest version of Joomla, please disregard this message.
If you have any additional questions about why you are receiving this message, Google has provided more background information in a blog post about this subject."
I read through the associated blog post. According to the post, Joomla adds a generator meta tag to pages that notes the CMS version. Here's the oddity:
The site was on Joomla 1.5 over two years ago. A year ago it was updated to Joomla 2.5. About a week ago it was converted completely to WordPress. According to GWT, Googlebot last accessed the site the day before the email (5/1/14).
I went through the code, the CSS/HTML, and the database and found no reference to Joomla 1.5.
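For anyone else chasing this down: the generator tag is just a line in the page's head section, so it can be confirmed (or ruled out) with a simple scan of the rendered HTML. A minimal sketch in Python, standard library only; the regex assumes the common name-before-content attribute order, so treat it as a quick check rather than a full HTML parser:

```python
import re

def find_generator(html):
    """Return the content of the first <meta name="generator"> tag, if any."""
    m = re.search(
        r'<meta\s+name=["\']generator["\']\s+content=["\']([^"\']+)["\']',
        html,
        re.IGNORECASE,
    )
    return m.group(1) if m else None

# Joomla 1.5 typically emitted a tag like this in <head>:
page = '<head><meta name="generator" content="Joomla! 1.5 - Open Source Content Management" /></head>'
print(find_generator(page))  # Joomla! 1.5 - Open Source Content Management
```

Run this against the live page's HTML (as Googlebot would fetch it), not just the theme files, since a plugin or old cached template could reinsert the tag.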
Has anyone seen this message? If so, how did you rectify it? Were there any adverse effects on rankings?
-
Just wanted to add: no, I don't think there would be any adverse effects on rankings unless the site was compromised somehow. Since you are on a completely different system, you should be fine after a resubmission.
On a side note, since you are on WordPress now, make sure you have the right file permissions and shell access turned off. Hackers love an unprepared WordPress site.
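To make the file-permission advice concrete: the usual WordPress baseline is 755 for directories and 644 for files, and adding `define('DISALLOW_FILE_EDIT', true);` to wp-config.php disables the built-in theme/plugin editor. Here is a rough sketch (assuming a Unix host and Python) that walks an install directory and flags anything world-writable:

```python
import os
import stat

def world_writable(root):
    """Walk a directory tree and list paths writable by 'other' (mode o+w)."""
    flagged = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            if os.stat(path).st_mode & stat.S_IWOTH:
                flagged.append(path)
    return flagged
```

Anything this flags inside a WordPress install is worth a second look; world-writable files are a common foothold for malware injection.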
Best of luck with the new site!
-
Sounds like they have a super old page cached in their system. Do you have caching turned off in the new WordPress site? Most people turn it on, but I have found that if you have gzip compression and CSS compiling turned on, it's not needed. To me, caching can create more headaches than it solves (personal opinion).
You could also go into your Webmaster Tools account and do a fresh "Fetch as Google" for the root domain and all linked pages. That way they will have the latest version of your site in their index. Google downloads your site so it can reference the content quickly for search queries, so it could be that they are still working from an old copy. Have you resubmitted the site since you rebuilt it?
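Since the advice above leans on gzip and caching behavior, a quick way to see what a page is actually serving is to inspect its response headers. A small sketch of the checks in plain Python, with the headers passed in as a dict (e.g. from `requests.head(url).headers`):

```python
def audit_headers(headers):
    """Flag missing compression or cache hints in HTTP response headers."""
    h = {k.lower(): v.lower() for k, v in headers.items()}
    notes = []
    if "gzip" not in h.get("content-encoding", ""):
        notes.append("response is not gzip-compressed")
    if "cache-control" not in h and "expires" not in h:
        notes.append("no caching headers set")
    return notes

# Example: a bare response with neither compression nor cache headers
print(audit_headers({"Content-Type": "text/html"}))
```

This only inspects what the server sends; whether Google's copy is stale is a separate question, which is where Fetch as Google comes in.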
-
You can check Google's cached version of that page (e.g. with the cache: search operator) to see what they have stored: is it a recent version of the page, or something pretty old that hasn't been reindexed in a while?
Either way, I think these notices are more recommendations and helpful tidbits for webmasters than crucial information that will influence rankings. So if the message isn't relevant anymore, I would ignore it and move on to building a great website for your visitors!
Related Questions
-
When the site's entire URL structure changed, should we update the inbound links built pointing to the old URLs?
We're changing our website's URL structure, which means all our site URLs will change. After this is done, do we need to update the old inbound external links to point to the new URLs? Yes, the old URLs will be 301 redirected to the new URLs too. Many thanks!
Intermediate & Advanced SEO | Jade1
-
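On the question above: the 301s generally carry the link equity, so updating every external link isn't strictly necessary, but it is worth updating the links you control so visitors and bots don't hop through redirect chains (which can happen when an old redirect now points at a URL that itself redirects). A minimal sketch, with hypothetical paths, for sanity-checking a redirect map before launch:

```python
def resolve(url, redirects, max_hops=10):
    """Follow a redirect map from url to its final target; return (target, hops)."""
    hops = 0
    seen = {url}
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:  # guard against redirect loops
            break
        seen.add(url)
    return url, hops

# Two generations of URLs chained together: /older-a -> /old-a -> /new-a
redirects = {"/old-a": "/new-a", "/older-a": "/old-a"}
print(resolve("/older-a", redirects))  # ('/new-a', 2)
```

Anywhere `hops` comes back greater than 1 is a chain worth flattening so every old URL redirects directly to its final destination.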
Check website update frequency?
Are there any tools out there that can check how frequently a website is updated with new content/products? I'm trying to do an SEO analysis comparing two websites. Thanks in advance, Richard
Intermediate & Advanced SEO | seoman100
-
Possible problem with new site (GWT no queries/very low index vs. submitted)
Hi everyone, I recently launched a new website for a small business loan company in the Dallas area. The site has been live for roughly a month and a half. I submitted everything to GWT as usual, including my sitemap. I am not sure what's going on with the site, as there is no activity in GWT for impressions or queries. Submitted vs. indexed is 24/3 (and hasn't moved), and the queries graph on the overview stops at 3/18/2015. On another note, when I go to Crawl > Sitemaps, it shows pages were indexed during the month of March, and then on April 3 the count drops from 17 to 2 and never increases. Google says there are no errors or issues found, but I feel like there's something wrong. When I do a site: search, my URLs do pop up, which makes me believe there's just a problem with my GWT. With that being said, I'm not happy THINKING there's something wrong; I need to actually know what the problem is. The only thing I can think of that I have done is purchase SSL for the site, but when I search what pages are indexed using www., it shows all the HTTPS URLs, so that would tell me the site is getting indexed without a problem? Does anyone have a clue as to what might be happening? I have attached some screenshots so that you can get a better idea.
Intermediate & Advanced SEO | jameswesleyhunt0
-
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with broken links on the site on a more regular basis. So here’s where my thought process is leading: Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis. Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time. We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I’m off base, please set me straight. 🙂
Intermediate & Advanced SEO | ufmedia0
-
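One note on the .htaccess bloat concern in the question above: at this scale the usual pattern is a lookup table rather than millions of individual rules. In Apache that role is played by `RewriteMap`, which consults an external (optionally hashed) map file; in application code it is simply a dictionary keyed on the old path. A rough sketch of the routing logic, with hypothetical paths:

```python
# Hypothetical lookup tables, regenerated on each data refresh.
MOVED = {"/listing/acme-co": "/listing/acme-company"}  # URL changed, page lives on
GONE = {"/listing/defunct-biz"}                        # dropped from the data feed

def route(path):
    """Return (status, location): 301 for moved pages, 410 for gone ones."""
    if path in MOVED:
        return 301, MOVED[path]
    if path in GONE:
        return 410, None  # 410 signals the page was removed deliberately
    return 200, None
```

Splitting the deleted listings out as 410s rather than generic 404s also tells Google the URLs are gone on purpose, which should help the GWT error count trend down after a refresh.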
Duplicate content URLs from bespoke ecommerce CMS - what's the best solution here?
Hi Mozzers Just noticed this pattern on a retail website... This URL product.php?cat=5 is also churning out products.php?cat=5&sub_cat= (same content as product.php?cat=5 but from this different URL - this is a blank subcat - there are also unique subcat pages with unique content - but this one is blank) How should I deal with that? and then I'm seeing: product-detail.php?a_id=NT001RKS0000000 and product-detail.php?a_id=NT001RKS0000000&cont_ref=giftselector (same content as product-detail.php?a_id=NT001RKS0000000 but from this different URL) How should I deal with that? This is a bespoke ecommerce CMS (unfortunately). Any pointers would be great 🙂 Best wishes, Luke
Intermediate & Advanced SEO | McTaggart0
-
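For the duplicate-URL pattern above, the standard fix is a rel=canonical tag on the parameter variants (or parameter handling in Webmaster Tools), but the duplicates can also be reduced at the source by normalizing query strings before generating links. A sketch in Python, which drops blank values like the empty `sub_cat` and strips `cont_ref` on the assumption that it is a tracking/navigation parameter rather than content-bearing:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

TRACKING = {"cont_ref"}  # assumed to be a tracking/navigation parameter

def canonical_url(url):
    """Drop empty and tracking query parameters, and sort the rest."""
    parts = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
              if v and k not in TRACKING]
    return urlunsplit(parts._replace(query=urlencode(sorted(params))))

print(canonical_url("product.php?cat=5&sub_cat="))  # product.php?cat=5
```

Whatever this function returns is what belongs in the rel=canonical tag, so all the parameter variants consolidate onto one URL.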
Update content or create a new page for a year related blog post?
I have a page called 'video statistics 2013' which ranks really well for video stat searches and drives a lot of traffic to the site. Am I best to just change the title etc. to 2014 and update the content, or create a totally new page? The page also has 2013 in the URL, which may be a problem if I just update it?
Intermediate & Advanced SEO | JonWhiting0
-
301s Creating Soft 404s in GWT
Hi, We redid a section of a site and got rid of hundreds of pages of no-longer-relevant content. We 301'd the URLs to the category homepage. Now GWT calls these soft 404s. a) Should we have done something differently instead of 301ing? b) Are these hundreds of soft 404 errors a big problem or a threat to how Google sees us for SEO? c) Should we correct this in some way? Thanks... Darcy
Intermediate & Advanced SEO | 945010
-
Best method to update navigation structure
Hey guys, We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me. I am setting up 1:1 301 redirects for the new navigation structure to handle getting incoming links where they need to go, but what I'm wondering is the best way to make sure the SERPs are updated quickly without trashing my domain quality, while ensuring my page and domain authority are maintained. The old links won't be anywhere on the new site. We're swapping the DNS record to the new site, so the only way for the old URLs to be hit will be incoming links from other sites. I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one. Currently we don't have one (kind of starting from the bottom with SEO). Also, we could use the old URLs for a few weeks on the new site to ensure they all get updated as well. It'd be a bit of work, but may be worth it. I read this article and most of that seems to be covered, but just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us. http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well Am I getting into trouble if I do any of the above, or is this the way to go? PS: I should also add that we are not changing our domain. The site will remain on the same domain, just with a completely new navigation structure.
Intermediate & Advanced SEO | CodyWheeler0