Sitemap.xml Question
-
I am pretty new to SEO and I have been creating new pages for our website for niche terms. Should I include ALL pages on our website in the sitemap.xml, or should I only have our "main" pages listed in the sitemap.xml file?
Thanks
-
All pages. This might help http://www.xml-sitemaps.com/
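For illustration, a minimal sketch of what a sitemap covering all pages looks like when built from an explicit URL list (the URLs are placeholders, and Python's standard library is just one of many ways to produce the file):

```python
# A minimal sitemap.xml built from an explicit list of pages.
# The URLs below are placeholders; substitute your own canonical URLs.
import xml.etree.ElementTree as ET

pages = [
    "https://www.example.com/",
    "https://www.example.com/main-page/",
    "https://www.example.com/niche-term-page/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page  # one <url><loc> entry per page

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Crawler-based generators like xml-sitemaps.com produce the same structure automatically by following your internal links; building the file from a list you control just guarantees that every page you care about is included.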
Related Questions
-
XML sitemap generator only crawling 20% of my site
Hi guys, I am trying to submit the most recent XML sitemap, but the sitemap generator tools are only crawling about 20% of my site. The site has around 150 pages and only 37 show up on tools like xml-sitemaps.com. My goal is to get all the important URLs we care about into the XML sitemap. How should I go about this? Thanks
Intermediate & Advanced SEO | TyEl
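A likely cause: crawler-based generators only find pages that are reachable by following internal links, so orphaned pages, pages behind JavaScript-only navigation, nofollowed links, or robots.txt blocks never make it in. A hedged sketch of one way to see exactly what is missing, assuming you can export a complete URL list from your CMS (file names here are placeholders):

```python
# Diff the crawler-generated sitemap against a full URL inventory to find
# the pages the generator never reached. File names are placeholders.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.parse("generated-sitemap.xml").getroot()
crawled = {loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)}

with open("all-urls.txt") as f:  # one URL per line, e.g. a CMS export
    known = {line.strip() for line in f if line.strip()}

for orphan in sorted(known - crawled):
    print(orphan)  # candidates: orphaned, blocked, or JS-only linked pages
```

Once you know which URLs the crawl missed, you can either fix the internal linking or simply build the sitemap directly from the inventory instead of from a crawl.
-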
XML Sitemap Questions For Big Site
Hey Guys, I have a few questions about XML sitemaps. For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep updating your profile sitemaps using third-party software (sitemapwriter)? If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this? Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it? Thank you!
Intermediate & Advanced SEO | keywordwizzard
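On the sitemap mechanics in this question, the usual pattern for a large, constantly growing URL set like profiles is a sitemap index: split the profiles into child sitemaps (the sitemaps.org protocol caps each file at 50,000 URLs), regenerate the children from the profiles database on a schedule so new accounts are picked up automatically, and list the children in one index file. A sketch with hypothetical domain and file naming:

```python
# Sketch of a sitemap index pointing at per-batch profile sitemaps.
# The domain and file names are hypothetical; in practice each child file
# would be regenerated from the profiles table on a schedule.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
index = ET.Element("sitemapindex", xmlns=NS)
for n in range(1, 4):  # however many 50,000-URL batches you need
    child = ET.SubElement(index, "sitemap")
    ET.SubElement(child, "loc").text = f"https://www.example.com/sitemaps/profiles-{n}.xml"

ET.ElementTree(index).write("sitemap-index.xml", encoding="utf-8", xml_declaration=True)
```

For profiles that opt out of indexing, the usual approach is to drop them from the child sitemaps and serve a noindex robots meta tag on the profile page itself, so deindexing happens automatically the next time the page is crawled.
-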
I have a question regarding parking a good-value domain.
A client has a great website 'A' with a PageRank of 5 and a lot of traffic. They want to change the URL and redesign the site, so they have parked the domain 'A' and will redirect it to the new domain in a month's time. My question is: by parking the old domain 'A', will they have lost its SEO value, or will it be passed to the new URL once they place a 301 redirect on it? Also, would it not have been better not to park domain 'A', but to keep it live and just redirect it once the new domain goes live, notifying Google in Webmaster Tools?
Intermediate & Advanced SEO | OrangeGuys
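On the mechanics, a domain-level 301 is normally a one-line rule in the web server configuration. The sketch below expresses the same mapping as a tiny Flask app purely to make the logic explicit (the target domain is a placeholder); the key contrast is that a parked page returns a 200 for placeholder content and passes nothing, while a live 301 tells Google where each old URL now lives:

```python
# A catch-all 301 from the old domain to the new one, preserving the path so
# each old URL points at its direct equivalent. Normally this lives in
# Apache/nginx config; Flask is used here only to illustrate the logic.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def moved_permanently(path):
    return redirect(f"https://www.new-domain.example/{path}", code=301)
```
-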
Followup question to rand(om) question: Would two different versions (mobile/desktop) on the same URL work well from an SEO perspective and provide a better overall end-user experience?
We read today's rand(om) question on responsive design. This is a topic we have been thinking about, and we ultimately landed on a different solution. Our opinion is that the best user experience is two versions (desktop and mobile) that live on one URL. For example, a non-mobile visitor to http://www.tripadvisor.com/ will see the desktop (non-responsive) version. However, if a mobile visitor (e.g. iOS) visits the same URL, they will see a mobile version of the site, but it is still on the same URL. There is no separate subdomain or URL - instead, the page dynamically changes based on the end user's user agent. It looks like they are accomplishing this by using JavaScript to change the physical layout of the page to match the user's device. This is what we are considering doing for our site. It seems this would simultaneously solve the problems mentioned in the rand(om) question and provide an even better user experience. By using this method, we can create a truly mobile version of the website that is similar to an app. Mobile and desktop users have very different expectations and behaviors when interacting with a webpage. I'm interested to hear the negative side of developing two versions of the site and using JavaScript to serve the "right" version on the same URL. Thanks for your time!
Intermediate & Advanced SEO | davidangotti
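What's described here is usually called dynamic serving: one URL, different HTML chosen by user agent. One commonly cited caveat with doing the switch purely in client-side JavaScript is that crawlers have to execute the script to see the mobile layout; if the switch is done server-side instead, Google's guidance for dynamic serving is to send a Vary: User-Agent response header so crawlers and caches know the response differs by device. A minimal sketch of the server-side variant, with hypothetical template names and a deliberately simplified UA check:

```python
# Dynamic serving sketch: pick a template by user agent on one URL, and
# send "Vary: User-Agent" so caches and crawlers know the response differs
# by device. Template names and the UA check are simplified placeholders.
from flask import Flask, make_response, render_template, request

app = Flask(__name__)

MOBILE_HINTS = ("iphone", "android", "ipod", "blackberry")

@app.route("/")
def home():
    # Simplified device detection; real UA parsing is more involved.
    ua = (request.headers.get("User-Agent") or "").lower()
    template = "home_mobile.html" if any(h in ua for h in MOBILE_HINTS) else "home_desktop.html"
    resp = make_response(render_template(template))
    resp.headers["Vary"] = "User-Agent"  # response varies by user agent
    return resp
```
-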
Google Places Question: Two Businesses, Same Address
I am working with a client who runs a personal training business. He shares a fitness studio with another personal trainer to minimise costs. My issue is that the other guy has 'taken' the Google Places listing address as his business, rather than my client's. The gym itself is not a business; it is simply a workspace shared by two personal trainers - in the same way as a shared office space might be the address of several businesses. This presents a bit of a problem with Google Places verification. Is it best to:
1. 'Alter' the address slightly so it appears to be a separate premises (e.g. 51 Something Street --> 51A Something Street), then use that address in all my citations, or
2. Leave the address itself the same, but rely on the fact that there are separate domains, phone numbers and business names?
Any thoughts on this?
Intermediate & Advanced SEO | Pretty-Klicks
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot, so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl it and confirm the content was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This made a lot of sense to me and also struck a personal chord... Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way... I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are, still trying to help our site recover. We feel our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of it? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
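To make the 410 route from that thread concrete: the idea is that removed URLs answer with an explicit 410 Gone, so a recrawl can positively confirm the content is deliberately dead rather than moved (301) or accidentally missing (404). A minimal sketch, with a placeholder set standing in for however you actually track cut pages:

```python
# Serve an explicit 410 Gone for removed URLs so a recrawl can confirm the
# removal was deliberate. REMOVED is a placeholder for however cut pages
# are tracked in practice (database flag, config file, etc.).
from flask import Flask, abort

app = Flask(__name__)

REMOVED = {"/thin-article-1", "/thin-article-2"}

@app.route("/<path:path>")
def page(path):
    if f"/{path}" in REMOVED:
        abort(410)  # Gone: removed on purpose, don't expect it back
    return f"Content for /{path}"
```

A 404 can look accidental and may be retried for a while, and a 301 claims the content lives on elsewhere; for pages cut for quality reasons, the 410 is arguably the cleanest signal.
-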
Google Sitemap only indexing 50%. Is that a problem?
We have about 18,000 pages submitted in our Google sitemap and only about 9,000 of them are indexed. Is this a problem? We have a script that creates the sitemap daily, and it is resubmitted daily. Am I better off only doing it once a week? Is this why I never get to the full 18,000 indexed?
Intermediate & Advanced SEO | EcommerceSite
-
Redirect on exact match domain to Brand domain question :)
Hi, if I have a website with the domain crazysocks.co.uk and a title tag 'black socks', would I see any benefit in redirecting blacksocks.co.uk to crazysocks.co.uk to give my keyword 'black socks' a boost in the search engines from the exact-match domain (EMD)? I see it loads where an EMD is indexed for its term, but when you click the result it redirects to a branded domain. I personally can't see this being true but wanted to double-check.
Intermediate & Advanced SEO | activitysuper