XML Sitemap Questions For Big Site
-
Hey Guys,
I have a few questions about XML sitemaps.
-
For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml).
-
If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep regenerating your profile sitemaps with third-party software (SitemapWriter)?
-
If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this?
-
Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it?
Thank you!
-
-
Thanks for the input guys!
I believe Twitter and Facebook don't run sitemaps for their profiles; what they have is a directory of all their profiles (Twitter: https://twitter.com/i/directory/profiles, Facebook: https://www.facebook.com/find-friends?ref=pf) and they use that to get their profiles crawled. However, I feel the best approach is through XML sitemaps, and Google Plus actually does this with their profiles (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). Quite frankly, I would rather follow Google than FB or Twitter... I'm just now wondering how the hell they upkeep that monster! Does it create a new sitemap every time one hits 50k? How often do they update their sitemaps: daily, weekly, or monthly, and how?
One other question I have is whether there are any penalties for getting a lot of pages crawled at once, meaning one day we have 10 pages and the next we have 10,000 or 50,000 pages...
Thanks again guys!
-
I guess the way I was explaining it was for scalability on a large site. You have to consider that a site like FB or Twitter, with hundreds of millions of users, still has the limitation of only 50k records per sitemap file. So if they are running sitemaps, they have hundreds of them.
-
I'm not a web developer, so this may be wrong, but I feel like it might be easier to just add every user to the XML sitemap and then add a noindex robots meta tag on the pages of users who don't want their profiles indexed.
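For what it's worth, that tag is just one line in the profile page's head, emitted conditionally based on the user's privacy setting. A minimal sketch of what the opted-out page would serve:

```html
<!-- Rendered only when this user has opted out of indexing;
     indexable profiles simply omit the tag. -->
<meta name="robots" content="noindex">
```

Note that crawlers still have to be able to fetch the page to see the tag, so the URL shouldn't also be blocked in robots.txt.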
-
If it were me and someone were asking me to design a system like that, I would design it in a few parts.
First I would create an application that handled the sitemap minus profiles: just your TOS, sign-up pages, and whatever other static pages like that.
Then I would design a system that handled the actual profiles. It would be pretty complex and resource-intensive as the site grew, but the main idea flows like this:
Start generation, grab the user record with ID 1 in the database, check to see if it's indexable (move to the next if not), see what pages are connected to it, write them to the XML file, then loop back and continue with record #2.
There is one constraint you have to respect: you need to keep track of the number of records in a file and start another file when it fills up, because you can only have 50k records in one sitemap file.
The way I would handle the process in total for a large site would be this: sync the required tables via a weekly or daily cron to another instance (server). Call the PHP script (because that is what I use) that creates the first sitemap for the normal site-wide pages. At the end of that sitemap, put the location of the user-profile sitemap, then at the end of the script, execute the user-profile sitemap generating script. At the end of each sitemap file, put the location of the next one, because as you grow it might take 2 to 10,000 sitemap files.
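As an aside, rather than chaining a pointer at the end of each file, the sitemaps.org protocol already defines a sitemap index file that lists every sitemap in one place, and that is what you submit to the engines. A minimal index (URLs are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-profiles-1.xml</loc>
  </sitemap>
</sitemapindex>
```

The generating script only has to append one `<sitemap>` entry to the index each time it rotates to a new profile file.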
One thing I would make sure to do is get a list of crawler IP addresses and set up an allow/deny rule in your .htaccess. That way you can make the sitemaps visible only to the search engines.
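A sketch of that rule in Apache 2.2 syntax (the IP range is a placeholder, not a verified crawler address; static crawler IP lists go stale, so verifying bots via reverse DNS is generally safer than maintaining one):

```apache
# Serve sitemap files only to whitelisted crawler addresses.
<FilesMatch "^sitemap-.*\.xml$">
    Order Deny,Allow
    Deny from all
    Allow from 66.249.0.0/16
</FilesMatch>
```

Also note the sitemap URLs don't need to be secret at all; search engines don't penalize publicly readable sitemaps, so this is optional hardening rather than a requirement.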