Have I set up my Joomla sitemap correctly? Google is not picking up articles
-
Hi, here is my sitemap: http://www.in2town.co.uk/sitemap-xml?sitemap=1
I am concerned that I have not set up my Joomla sitemap correctly. The reason I think this is that I have a lot of articles on the site, and Google is not picking them up in my Google Webmaster Tools.
On my old site, Google was quick to pick up all the articles.
Can anyone offer me any help with this, and let me know whether my sitemap should be set to show all of my articles?
-
It seems to me you'll need to spend some time on this. What I think you want to do is:
1. Discover all the old URLs that were bringing traffic.
2. 301 redirect them to the relevant new URLs on the updated site using .htaccess (assuming you're on Apache).
3. Optional: submit a sitemap to Google containing the old URLs, to encourage the crawler to see that they're now redirected.
4. Create a sitemap for the new site (if you're not doing step 3, it might be worth doing this first). I have trouble with Xmap so I don't use it; I'd love to, since it automates things, and I know many people like it. Otherwise try creating one manually with, say, Xenu or Integrity.
5. Submit the new sitemap to Bing and Google.
6. Sit back, fingers crossed.
7. Keep tracking 404 errors in GWT to identify URLs you've missed that can be added to the 301 list.
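If the Joomla extension isn't cooperating, the sitemap file itself is simple enough to generate from a flat list of URLs. Here's a minimal sketch; the URLs are placeholders, and in practice you'd feed in the article URLs exported from Joomla or collected by a crawler like Xenu:

```python
# Minimal sketch: build a sitemap.xml document from a list of URLs.
# The URLs below are illustrative placeholders, not real article URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml document listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

urls = [
    "http://www.example.co.uk/",
    "http://www.example.co.uk/news/article-1",
]
print(build_sitemap(urls))
```

Save the output as sitemap.xml at the site root and submit that URL in Webmaster Tools.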
There are posts on SEOmoz and elsewhere that cover how to do 301 redirects in .htaccess, how to use Xenu, etc.
Good luck!
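For reference, .htaccess 301 rules look something like this (the paths here are made up for illustration; substitute your real old and new URLs):

```apache
# Redirect a single old URL to its new home (example paths only)
Redirect 301 /old-article.html http://www.example.co.uk/news/new-article

# Or, with mod_rewrite enabled, redirect a whole old section in one rule
RewriteEngine On
RewriteRule ^old-section/(.*)$ http://www.example.co.uk/news/$1 [R=301,L]
```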
--
EDIT: One other thing to remember (particularly if you've suffered data loss in the past) is backups: search for Akeeba Backup for Joomla. It's free and works brilliantly.
-
Hi, how would I add the old URLs to the site? There are lots of URLs from the old site that I have had to change.
What I mean by this is: when we changed the site, we asked the hosting company to move all the data over. They made a huge error, and because of it we lost thousands of pages of content and thousands of pages of links.
Could you explain how to add the old URLs, and whether this is possible?
What I had on the old Xmap was that it showed all the content I had, which made the sitemap look very long. But by doing this, Google was indexing it all, and it showed in my Google Webmaster Tools. Now it shows that only 13 pages have been listed by Google, instead of all the content pages that we have.
On the old site, doing this showed thousands of content pages listed in Google Webmaster Tools, but at the moment it is only showing our main sections.
With reference to Coronation Street: that's because we are adding it over the weekend.
Any help would be great. It has been a nightmare rebuilding the new site after the hosting company lost all of our data.
-
Have you changed the URLs from the old site to the new one? If so, I'd be inclined to create a sitemap with everything and submit it, to encourage the pages to be crawled.
Some of the categories in the sitemap don't have any content, which looks a bit odd (e.g. Coronation Street).
Also, if the URL structure is all new, have you 301'd the old pages?
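Given that thousands of URLs changed, writing redirect rules by hand isn't realistic. One approach, sketched below with made-up paths, is to keep a CSV mapping each old path to its new URL (the old paths can come from GWT's 404 report) and generate the .htaccess lines from it:

```python
# Sketch: bulk-generate 301 redirect rules from an old-path -> new-URL
# mapping. The paths are illustrative examples, not real site URLs.
import csv
import io

def redirect_rules(csv_text):
    """Read 'old_path,new_url' rows and return Redirect 301 lines for .htaccess."""
    rows = csv.reader(io.StringIO(csv_text))
    return "\n".join(f"Redirect 301 {old} {new}" for old, new in rows)

mapping = """/old-article.html,http://www.example.co.uk/news/new-article
/old-section/page2.html,http://www.example.co.uk/news/page2"""
print(redirect_rules(mapping))
```

Paste the generated lines into .htaccess, then spot-check a few old URLs in a browser to confirm they 301 to the right pages.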