Content change within the same URL/Page (UX vs SEO)
-
Context:
I'm asking my client to create city pages so he can present all of his apartments in a specific sector, giving me a page that can rank for "apartments for rent in [sector]". The page will present a map of all the sectors so that, after landing on the page, the user can navigate and choose the sector he wants.
Question:
The UX team is asking whether we absolutely need to reload the sector page when the user clicks a location on the map, or whether they can switch the content within the same page/URL once the user is on the landing page.
My concern:
1. Could this be flagged as duplicate content if Google can crawl within the JavaScript app, or does Google only analyze its "first view" of the page?
2. Do you consider it preferable to keep the "page change" so that I increase the number of pages viewed?
-
Google can read dynamic content, so I would not do this. In terms of UX, why don't you come up with something really cool for users to see while they wait? How many seconds does the change take? As a rule, I would keep the sections as separate pages with their own content, precisely because Google can read dynamic content.
-
So, for example, if the user is on the "New York" page and clicks the "Boston" location within the app, is there any problem with changing the content inside the New York page and presenting all the Boston listings, without switching to the Boston URL, just by swapping the content inside the New York page?
-
Hi Cristian,
I totally agree with what you're telling me. My plan is to keep separate pages for every sector so I have pages that rank for each sector. My question was more about when the user is already on-site, for example on a specific location: is there any problem if the content changes to a new location within the web app without changing static pages?
-
Hello. You really need separate pages if you want to rank with all of them. Think of the title tag, for example: how do you want to index a specific region if you have only one page? How should Googlebot understand that you have multiple pieces of content, and which content/section to show when a person makes a specific query? Escaped fragments could have been used in the past, but that was never a great solution and it has been discontinued (https://webmasters.googleblog.com/2015/10/deprecating-our-ajax-crawling-scheme.html). So I would try to provide separate pages with as much quality content as possible and with strong internal linking.
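One way to give the UX team their no-reload map while still keeping a distinct, indexable URL per sector is the History API: swap the listings in place on click, but update the address bar with `history.pushState` so every sector keeps its own URL (each of which should also be directly server-renderable for crawlers). This is a minimal sketch, not something from the thread above; the function names (`sectorUrl`, `onSectorClick`), the URL pattern, and the `loadContent` callback are all illustrative assumptions.

```javascript
// Hypothetical sketch: each map sector keeps its own crawlable URL
// even though clicking a sector swaps the content without a reload.

function sectorUrl(sector) {
  // Build a clean, keyword-bearing path segment from the sector name,
  // e.g. "New York" -> "/apartments-for-rent/new-york/".
  const slug = sector
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")   // collapse non-alphanumerics to "-"
    .replace(/^-+|-+$/g, "");      // trim stray leading/trailing hyphens
  return `/apartments-for-rent/${slug}/`;
}

function onSectorClick(sector, loadContent) {
  const url = sectorUrl(sector);
  // Swap the visible listings without a full page reload...
  loadContent(url);
  // ...but still move the browser to the sector's own URL, so the
  // sector page stays bookmarkable, shareable, and indexable.
  // (Guarded so the function also runs outside a browser.)
  if (typeof history !== "undefined" && history.pushState) {
    history.pushState({ sector }, "", url);
  }
  return url;
}
```

With this pattern the user who clicks "Boston" on the New York page ends up on the Boston URL, so there is no single page trying to rank for every sector; each sector URL must also return its full content when requested directly, or Googlebot has nothing to index at it.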