Advanced Question on Synonym Variation Pages!
-
Hi,
This is quite an advanced question, so I'll go through it in detail - please bear with me!
I launched the new version of our website exactly a week ago, and all the key metrics are moving in the right direction: Pages/Visit +5%, Time on Site +25%, Bounce Rate -1%.
I work in an industry where our primary keyword has 4 synonyms and our long-tail keywords are location-related.
So as an example, I have primary synonyms like: Holiday, Vacation, Break, Trip (not actually these, but they are good enough as an example). Add the pluralised versions and you have 8 in total.
So my long-tail keywords are like:
Las Vegas Vacation / Las Vegas Vacations
Las Vegas Holiday / Las Vegas Holidays
Las Vegas Trip / Las Vegas Trips
Las Vegas Break / Las Vegas Breaks
All these synonyms effectively mean the same thing, so my thinking on the new website was to specifically target each of these synonyms with its own unique page, optimising the meta and page titles to those exact words.
To make these pages truly unique, I got a bunch of copywriters to write about 600 unique words for every long-tail synonym (well over 750,000 words in total!).
So at this point I have my page "Las Vegas Holidays" with 600 unique words of content, "Las Vegas Vacations" with 600 words of unique content, and so on.
The problem is, when users search for these words, their primary goal is not to read 600 words of content on "Las Vegas Holidays" - their primary goal is to get a list of Las Vegas holidays that they can search, view, and purchase (they may want to read 600 words of content, but it is not their primary goal).
So this puts me in a dilemma - I need to display the nuts and bolts (i.e. the actual holidays in Las Vegas) as the primary content on whichever synonym page the customer lands on. But to make sure these pages are unique, I also need to have the unique content on that page.
So here's what I did:
- On every synonym version of the page I display exactly the same information. However, each page has an "Information" link, and on click this pops up a layer containing my unique content for that page. To further optimise with exact-match anchors in this content pop-up, I have cross-linked the synonym pages (totally naturally) - i.e. on my "Las Vegas Holidays" page, the content may contain the words "Las Vegas Breaks", and this would be linked to the "Las Vegas Breaks" synonym page.
In theory I don't think there is anything wrong with what I am doing in the eyes of the customer - but I have a big concern that this may well look "fishy" to SEs. The pages are almost identical to the user except for this pop-up layer of unique content, plus the titles and meta. We know that Google, at least, can tell exactly what the user sees when they land on a page (from their "Preview") and can distinguish between user-visible and hidden text. So even though I think we are making a page that is perfect for the user experience (they get the list of vacations etc. as the primary content, and can read the information if they want by clicking a button), I am concerned that SEs are going to say: hold on a minute, there are loads of pages here that are identical except for a chunk of text that is not visible to the user by default (even though it becomes visible if they click the "Information" button), and this content cross-links to a load of almost identical pages doing the same thing.
Today I checked our rankings, and we have taken a fair whack from Google. I'm not overly concerned at the moment, as I expected big fluctuations in rankings for the first few weeks - but I'd be a lot more confident if they were fluctuating in the right direction!
So what do I do?
As far as I can see, my options break down as follows:
Content Display:
1/. Keep it as it is, and hope the SEs don't see it as spammy. Even though I think what we are doing is best for the customer experience, I'm concerned SEs won't.
2/. On every synonym page, below the list of products, packages etc. that the customer wants to see, display the unique content as a block of subtext that is visible by default. This, however, could make the page a bit ugly.
3/. Display a visible snippet of the unique content below all the packages, with a "more" button that expands the rest - i.e. a partly visible layer. This is slightly better for display, but I'm still only showing a portion of the content, and the rest will still be flagged as "hidden" by default to the SEs.
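For clarity, the kind of markup I have in mind for option 3 would be something like this (a rough sketch - the IDs, class names and copy are made up for illustration):

```html
<!-- Visible snippet plus an expandable remainder; all names here are illustrative -->
<div class="synonym-content">
  <p>Las Vegas holidays are ... (first ~100 words, visible by default)</p>
  <div id="more-content" style="display: none;">
    <p>... the remaining ~500 words of unique copy ...</p>
  </div>
  <a href="#" onclick="document.getElementById('more-content').style.display = 'block'; return false;">
    Read more
  </a>
</div>
```

The full text is present in the page source on load; the button only toggles CSS visibility, which is exactly why the collapsed part may still be treated as "hidden".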
Cross Linking within the content:
1/. Keep it as it is, where synonym keywords link to the synonym version of the page.
2/. Alter it so that every synonym keyword links to the "primary" synonym version of the page - e.g. if "Las Vegas Holidays" is now my main keyword, then the "Las Vegas Vacations" keyword would not link to my "Las Vegas Vacations" page as it does currently, but to my "Las Vegas Holidays" page.
I apologise for the in-depth question, but it requires a lot of explanation to get across clearly.
I would be grateful for any of your thoughts.
Many thanks in advance.
-
The new synonym pages are not ranking anywhere significant for their exact synonym - it is the primary synonym page ranking for them.
To me this would be proof that Google is seeing the synonym pages as duplicates, so you may be better off focusing on one page per keyword. Although if these pages are only a week old, they could take more time to get picked up. As Ryan mentioned, changes need time to settle in properly - you won't see final results in a week.
As you mention in point 3 you are definitely going to decrease the strength of each location if it has to share link juice between 8 similar pages, so you are making it harder for them to rank well.
-
Google is not liking the fact that I have numerous synonym pages, where the only difference is title and meta variations, along with a chunk of content in a layer not visible on page load to the user.
I would highlight this issue as a major concern. We know for sure Google does not like duplicate pages, nor hidden content so these are going to be your issues.
What Google wants to see is that each page offers unique content, and that the content is prominently displayed to users up front. Burying the unique content in the footer, hiding it with JavaScript, or any other form of shell game will leave you quite unhappy, because sooner or later you will wind up on this Q&A asking what happened to your rankings.
As for the rest, try to keep your linking natural and don't force it.
As you mentioned, it takes time for any adjustments to truly settle into the rankings. Some differences might be obvious the next day; others may take weeks to appear. Google samples your site, and the web, in pieces. Site owners often make improvements to their site, see something they don't like in the SERPs, panic, and then make other changes as a result. Don't panic.
You need to understand SEO at a level where you have confidence in the changes you are making. Make the proper adjustments, then just watch. Don't touch your site for 30 days, then check the results. If they are not to your liking, look for other adjustments. The other alternative is to hire an SEO who can do the work for you. If you do hire an SEO, understand we don't have magic wands or buddies at Google. The same rules apply to us as well.
-
OK, I've done some more testing, and here are some other points:
From answers above:
1/. Each synonym set of content is totally unique - I used different copywriters for every version to make sure there was no duplication. It's also very useful content for anyone looking for more in-depth info.
2/. Each synonym set of content was only lightly scattered with keywords - the synonym words make up about 1-2% of the total.
3/. The content is crawlable by SEs (it's loaded in a layer that is hidden by default and opened on click of an info button). I've also checked by searching for specific unique strings from the content - Google is indexing the content and the page containing the text.
4/. Our "primary" synonym content pages are still ranking decently, but have dropped - I'm talking drops from positions 1-3 to 6-7. The new synonym pages are not ranking anywhere significant for their exact synonym - it is the primary synonym page ranking for them.
Possibilities:
1/. Google is not liking the fact that I have numerous synonym pages, where the only difference is title and meta variations, along with a chunk of content in a layer not visible on page load to the user.
2/. Google is not liking the fact that this content contains synonym keywords that link to other synonym pages doing the same thing.
3/. Google is re-distributing link juice that previously went to the primary synonym page amongst the 8 synonym pages (since all synonym pages now have links pointing to other synonym pages), so the link juice to the primary synonym page is diluted, resulting in a drop in ranking.
4/. Google is just adjusting our ranking factors in its algorithm, and as with any new site you would expect some big fluctuations in the first couple of weeks, so there is nothing to worry about?
On point 4, has any actual study been done on what happens to the rankings of a totally redesigned site, and over what timeline?
Any further thoughts appreciated.
For now I am working on changing:
1/. Content is displayed visible by default
2/. All links within any synonym content point to the "primary" synonym content page.
All advice much appreciated.
Thanks.
-
James,
It sounds like you suspect your content may be a bit spammy. If that is the case, it probably is, and won't be seen well by SEs.
I would suggest ensuring your content on each page is truly unique and helpful to users. Here is how I might approach your challenge with respect to the main content:
Phrase "Las Vegas Vacation": Write an article titled "Las Vegas Vacations for your family". The article can talk about the experience from a family perspective: shows, shopping, etc.
Phrase "Las Vegas Trip": Write an article titled "My Las Vegas Trip" and focus on gambling: blackjack, poker, etc.
Phrase "Las Vegas Holidays": Write an article titled "Las Vegas Holidays for couples", written from a romantic couple's point of view. Cover shows, spas, etc.
I realize you already have content, but I want to ensure it is truly unique and not the same words rearranged. The content should also be helpful and something people want to link to, +1, tweet about, etc. This content should be at the top of your page. It isn't something you would bury at the bottom for Google, nor hide in any manner.
The other concern you expressed was "hiding" content and how to deal with your duplicate content. One thought: right-click on your current page and choose "View page source". Do a simple search for the text you don't wish to be seen by SEs. If you can see the text in the HTML code, so can search engines.
Another thought would be to place a call-to-action button at the top of the page - "Book a Las Vegas Vacation" - that takes users to a single page with the content, so you don't have to duplicate it throughout your site. This could be a win-win, benefiting both your site and your users.
If you don't like any of the above options and are determined to keep the current approach, you can serve the duplicated content in an overlay as a noindex, follow page. Another alternative would be to display it in an iframe, whose content is not crawled as part of the parent page. These last options would work fine, but I would recommend trying the other approaches first.
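To sketch those last two options (the URL and titles here are hypothetical, just to show the shape):

```html
<!-- Hypothetical standalone page served into the overlay,
     e.g. /content/las-vegas-vacations-info -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>About Las Vegas Vacations</title>
</head>

<!-- Or, on the synonym page itself, the iframe alternative
     pointing at that same standalone URL -->
<iframe src="/content/las-vegas-vacations-info" title="More information"></iframe>
```

Either way the duplicated block lives at a single URL that search engines are told not to index, instead of being baked into every synonym page.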
-
Hi James,
Firstly, it may be worth checking whether Google is actually ranking every version of a page - it will exclude some of the variations if it sees them as duplicates. Try doing a site search in Google, e.g.
- Try site:domain.com Las Vegas Vacation - see if the vacations page is listed
- Then try site:domain.com Las Vegas Holidays - see if the holidays page is listed, etc.
- If you find that you only have one page listed for Las Vegas, then you could have a duplicate content issue.
In terms of the hidden text, I don't think the search engines are quite as skeptical of hidden content any more, because most sites these days rely heavily on hiding and showing sections through JavaScript. It would be a problem if the hidden text was unrelated to the rest of the page content or was 'keyword stuffed', but an extra information section should not be a problem. The important question is how the information is loaded. Since search engines do not execute JavaScript, you need to make sure the info paragraphs are loaded first and then hidden, so they are part of the HTML. If your info section is empty and the text is only loaded into the container when you click a button, then they will not see this text.
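To illustrate the distinction (hypothetical markup and URLs - the point is where the text lives, not the exact code):

```html
<!-- Pattern A: the text is in the initial HTML and merely hidden with CSS.
     Crawlers see it in the page source. -->
<div id="info" style="display: none;">
  <p>The unique 600-word article lives here, in the page source on load ...</p>
</div>

<!-- Pattern B: the container starts empty and is filled on click via Ajax.
     Invisible to crawlers that do not execute JavaScript. -->
<div id="info-ajax"></div>
<script>
  // Illustrative only: fetches the article and injects it on demand
  function loadInfo() {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
      document.getElementById('info-ajax').innerHTML = xhr.responseText;
    };
    xhr.open('GET', '/info/las-vegas-holidays');
    xhr.send();
  }
</script>
```

You can verify which pattern you have with the "view page source" check mentioned above: if the article text appears in the raw source, you are in pattern A.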
There will be many different opinions, but if it were me, I would only have one page for each location, try to use all the synonyms on that page, and then vary the anchor text when creating backlinks to it. Google is clever enough to understand synonyms, so you don't really need different pages for them; however, I'm sure there are many people who have had success with this strategy.
They always say that a key guideline is to build your website for users, not search engines. So is it useful to the user to have many versions of each page? Probably not, and it might confuse them. It may also confuse the search engines, as they will not be sure which version is the important one and could end up showing only one in their rankings.