How does Tripadvisor ensure all their user reviews get crawled?
-
Tripadvisor has a LOT of user-generated content. Searching for a random hotel always seems to return a paginated list of 90+ pages. However, once the first page is clicked and "#REVIEWS" is appended to the URL, the URL never changes with any subsequent clicks of the pagination links.
How do they ensure that all this review content gets crawled?
Thanks,
linklater
-
Domain authority is the key here: not only do their links get crawled, but they also get ranked, usually in the top 10.
-
The review pages actually do change. These are 2 URLs for 2 reviews on the same property:
http://www.tripadvisor.com.au/ShowUserReviews-g255060-d255650-r265622033-Hilton_Sydney-Sydney_New_South_Wales.html#CHECK_RATES_CONT
http://www.tripadvisor.com.au/ShowUserReviews-g255060-d255650-r265582073-Hilton_Sydney-Sydney_New_South_Wales.html#CHECK_RATES_CONT
They don't change after the hash - they change before it. And the ones you're talking about:
http://www.tripadvisor.com.au/Hotel_Review-g255060-d255650-Reviews-or10-Hilton_Sydney-Sydney_New_South_Wales.html#REVIEWS
http://www.tripadvisor.com.au/Hotel_Review-g255060-d255650-Reviews-or20-Hilton_Sydney-Sydney_New_South_Wales.html#REVIEWS
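In other words, the pagination offset (-or10-, -or20-) sits in the path before the hash, so every page of reviews has its own distinct, crawlable URL; the #REVIEWS fragment is purely client-side and is never sent to the server. A minimal Python sketch of that distinction, using the example URLs above:

```python
from urllib.parse import urldefrag

# Example URLs modelled on the pagination links above; only the "-orNN-"
# offset differs between pages of reviews.
paginated = [
    "http://www.tripadvisor.com.au/Hotel_Review-g255060-d255650-Reviews-or10-"
    "Hilton_Sydney-Sydney_New_South_Wales.html#REVIEWS",
    "http://www.tripadvisor.com.au/Hotel_Review-g255060-d255650-Reviews-or20-"
    "Hilton_Sydney-Sydney_New_South_Wales.html#REVIEWS",
]

for url in paginated:
    # urldefrag() splits the URL at the "#": the fragment is only used by the
    # browser, so what the server (and Googlebot) actually sees is the part
    # before it - and that part is different for every page of reviews.
    crawlable_url, fragment = urldefrag(url)
    print(crawlable_url)
    print("  client-side fragment:", fragment)
```

Because each page of reviews resolves to a distinct URL, a crawler that follows the pagination links can reach all of the review content.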
-
Hello Linklater,
The short answer: internal site structure + domain authority.
For the long answer, could you provide me with an example URL? I tried to reproduce your scenario but didn't find the pages you describe.
-
Related Questions
-
Duplicate without user-selected canonical excluded
We have PDF files uploaded in the media library of WordPress and used on our website. As these PDFs are duplicate content of the original publishers, we have marked links to these PDF URLs as nofollow. These pages are also disallowed in robots.txt. Now, Google Search Console has shown these pages excluded as "Duplicate without user-selected canonical". As it turns out, we cannot use a canonical tag with PDF pages to point to the original PDF source. If we embed a PDF viewer in our website and fetch the PDFs by passing the URLs of the original publisher, would the PDFs still be read as text by Google and again create a duplicate content issue? Another thing: when a PDF expires and is removed, it leads to a 404 error. And if we direct our users to the third-party website, it adds to our bounce rate. What would be the appropriate way to handle duplicate PDFs? Thanks
Intermediate & Advanced SEO | | dailynaukri1 -
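A note on the mechanics described above: a robots.txt Disallow stops Googlebot from crawling a PDF at all, which also means it cannot read the file's content or any canonical signal served with it, so the duplicate question can never be resolved in the original publisher's favour. A small standard-library sketch, assuming hypothetical example.com URLs in place of the real site, to confirm which PDF URLs are actually blocked:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs for illustration - substitute the real domain and a real
# PDF path from the WordPress media library.
robots_txt_url = "https://www.example.com/robots.txt"
pdf_url = "https://www.example.com/wp-content/uploads/2023/10/whitepaper.pdf"

parser = RobotFileParser()
parser.set_url(robots_txt_url)
parser.read()  # fetch and parse the live robots.txt

if parser.can_fetch("Googlebot", pdf_url):
    print("Googlebot may crawl this PDF and read its content.")
else:
    print("Googlebot is blocked from this PDF - it cannot read the content, "
          "so it cannot consolidate it with the original publisher's copy.")
```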
Start a new site to get out of Google penalties?
Hey Moz, I have several questions about whether I should start a new, second site to save my online presence after a series of Google penalties. The main questions being: Is this the best way to spend my time/resources? If I'm forced to jump my company over to the new site, can Google see that and transfer the penalty? I plan on all new content (no link redirects, no duplicate content), so do I need to kill the original site? Are there any pros/cons I am missing? Summary of my situation: Looking at analytics, it appears I was hit with both Penguin 2.0 and 2.1, each cutting my traffic in half, despite a link remediation campaign in the summer of 2013. There was also a manual penalty imposed on the site in the fall of 2013, which was released in early 2014. With Penguin 3.0's release at the end of 2014, the site saw a slight uptick in organic traffic, improving from essentially nothing to next to nothing. Most of the site's issues revolved around cheap $5 links from India in the 2006-09 time frame. This link building was abandoned and replaced with nothing but "letting them happen naturally" from 2010 through the 2013 penalties. Since 2013 we have done a small amount of quality articles on a monthly basis to promote the site, plus social media and continuous link remediation. In addition, the whole site has been redesigned, optimized for speed/mobile, secured, and completely rewritten. Given all of this, the site has really only recovered to pages 2 and 3 of the SERPs for our keywords. Even after a highly circulated piece appeared on an authority site (97 DA) a few months ago, there was zero movement. It appears we have an anvil tied around our leg until Penguin 4.0. With all of the above, and no sign of when the next Penguin will be released, I ask: is it time to start investing in a new site? With no movement in 2.5 years, it's impossible to know where my current site stands, so I don't know what else I can do to improve it. I am considering slowly building a new site that is a high-quality informational site. My thought process is that it will take a year for a new site to gain any traction with Google. If by that time my main site has not recovered, I can jump to the new site, add a commercial component, and use it as a lifeboat for my company. If I have recovered, then I have a future asset. Thanks in advance!
Intermediate & Advanced SEO | | TheDude0 -
How to Get Google to Recognize Your Pages Are Gone
Here's a quick background of the site and issue. A site lost half of its traffic over 18 months ago, and it's believed to be a Panda penalty. Many, many items have already been taken care of and crossed off the list, but here's something that was recently brought up. There are 30,000 pages indexed in Google, but there are only about 12,000 active products. Many of the pages in the index are out-of-stock items. A site visitor cannot find them by browsing the site unless he/she had bookmarked an item before, was given the link by a friend, read about it, etc. If they get to an old product because they had a link to it, they will see an out-of-stock graphic and will not be able to make the purchase. So, efforts were made about 1 month ago to 301 old products to something similar, if possible, or 410 them. Google has not been removing them from the index. My question is: how do we make sure Google sees that these pages are no longer there and removes them from the index? Some of the items have links to them, which will help Google see them, but what about the items which have 0 external / internal links? Thanks in advance for your assistance. I'm working on a site which has about 10,000 items available for sale. Looking in G
Intermediate & Advanced SEO | | ABK7170 -
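One thing that is easy to verify from the outside: whether those retired product URLs really return a 301 or 410, rather than a 200 with an "out of stock" template (a soft 404, which gives Google no reason to drop the URL). A rough audit sketch, assuming hypothetical example.com URLs and the third-party requests library:

```python
import requests

# Hypothetical retired product URLs - replace with real ones to audit.
retired_urls = [
    "https://www.example.com/products/old-item-1",
    "https://www.example.com/products/old-item-2",
]

for url in retired_urls:
    # allow_redirects=False so a 301 is reported as a 301 rather than as the
    # status code of the redirect target.
    response = requests.head(url, allow_redirects=False, timeout=10)
    status = response.status_code
    if status == 200:
        # A 200 plus an "out of stock" message is a soft 404.
        print(f"{url} -> 200: Google has no signal that the page is gone")
    elif status in (301, 308, 404, 410):
        print(f"{url} -> {status}: Google can drop or consolidate this URL")
    else:
        print(f"{url} -> {status}: check manually")
```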
SEO site Review
Does anyone have suggestions on places that provide in-depth site/analytics reviews for SEO?
Intermediate & Advanced SEO | | Gordian0 -
User profile page optimisation - tips required
Hello, we have developed a network of medical professionals, and our main SEO goal is to rank for user names. I would like to use a profile [h**p://goo.gl/bUwFWW] I built in cooperation with my client as a sample, and I would appreciate any tips to improve the ranking position of a user's profile page when someone searches for his name. Right now we rank on the 2nd page of Google. I would like to know about any specific tips/advice I may have missed in on-page optimisation. Thanks in advance, C
Intermediate & Advanced SEO | | HaCos0 -
Getting Your Website Listed
Do you have any suggestions? I do not know of local websites where I can get some easy backlinks. I guess a record in Google Places would be great as well. Any sound suggestions will be appreciated. Thanks!
Intermediate & Advanced SEO | | stradiji0 -
How do we get individual products to rank?
Hi, We have a site that sells music, and we have been researching SEO and things we can do to help our SERPs. We have started on link building and have added links to the footer of our page. We have friendly URLs and meta descriptions added to all products. My question is: yes, we can work on getting keywords such as "buy cds" to rank better in Google, but when it comes to individual products those keywords and results are useless, because people search for a CD by artist or title, which most do, as they know what they are looking for. How do I get better results for all these unique products? One or more of our competitors constantly shows up in the first few results for nearly any CD search by artist or title, yet we can't seem to get anywhere near this type of result. Thanks Chris
Intermediate & Advanced SEO | | PressPlayMusic0 -
Best way to block a search engine from crawling a link?
If we have one page on our site that is only linked to by one other page, what is the best way to block crawler access to that page? I know we could set the link to "nofollow" and that would prevent the crawler from passing any authority, and we can set the page to "noindex" to prevent it from appearing in search results, but what is the best way to prevent the crawler from accessing that one link?
Intermediate & Advanced SEO | | nicole.healthline0
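For reference on that last question: rel="nofollow" on the link and a noindex directive both still allow the crawler to request the page - nofollow only withholds link equity, and noindex only keeps the crawled page out of the results. Stopping the crawler from fetching the page at all generally takes a robots.txt Disallow rule (or putting the page behind authentication). A small standard-library sketch, assuming a hypothetical example.com page, that reports which index-level directives a URL currently sends:

```python
import urllib.request
from html.parser import HTMLParser

# Hypothetical page whose crawl/index directives we want to inspect.
url = "https://www.example.com/private-page"


class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")


response = urllib.request.urlopen(url)
# X-Robots-Tag behaves like the meta robots tag but lives in the HTTP
# response header, so it also works for non-HTML files such as PDFs.
print("X-Robots-Tag header:", response.headers.get("X-Robots-Tag"))

finder = RobotsMetaFinder()
finder.feed(response.read().decode("utf-8", errors="replace"))
print("meta robots directives:", finder.directives)
```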