I show different versions of the same page to crawlers and users, but want to stop
-
Hello,
Back when Google could not read JavaScript, I created two versions of the same page: one for humans and one for Google. Now I no longer want to serve different content to the search engine, but I worry that I will lose my traffic. What is the best way to make this change without any loss? Can you help me?
-
Hi there
Ideally, create one page that serves both search engines and users: you want users to find your page via search engines, and you want search engines to be able to crawl your content. Google is thought to be getting better at crawling JavaScript, but you still need to make sure your text and content are readable in a text-based browser, or visible to Google with JavaScript turned off. Here's a resource for you.
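One quick way to sanity-check that advice is to look at a page the way a JavaScript-free crawler would: take the raw HTML, strip out the script blocks, and see whether the key copy is still there. A minimal sketch, assuming a simple regex-based approach (the function name and sample markup are my own illustration, not a Moz tool):

```python
import re

def visible_without_js(html: str, required_text: str) -> bool:
    """Return True if required_text appears in the raw HTML once
    <script> blocks are removed -- i.e. a text-based browser or a
    crawler with JavaScript off could still read it."""
    # Remove script blocks so text injected only by JS doesn't count.
    static_html = re.sub(r"<script\b.*?</script>", "", html,
                         flags=re.IGNORECASE | re.DOTALL)
    # Strip remaining tags to approximate what a text browser renders.
    text = re.sub(r"<[^>]+>", " ", static_html)
    return required_text in text

# A page whose copy exists only inside JavaScript fails the check,
# while the same copy in plain markup passes it.
js_only = '<body><script>document.write("Armani polo shirts")</script></body>'
static = "<body><h1>Armani polo shirts</h1></body>"
```

If the check fails for your main content, that content is effectively invisible to a crawler that doesn't execute JavaScript.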
That being said, focus on having one page for the content you're trying to create, so you can put more SEO effort into building equity in that page. You can also build other pages around variations of that topic that link back to it, and link to those new pages from the main topic page as well. This helps build your site from a topical standpoint and passes link equity throughout your site.
Let me know if this makes sense or helps. Best of luck!
Patrick
Related Questions
-
Best SEO benefit location (main page text or h1, h2)?
I have learned that h1 has more value than h2, and h2 more than h3. But let's say I want to place my keywords somewhere: should I include them in the main body, or should I take advantage of the header tags?
White Hat / Black Hat SEO | Sam09schulz
Backlinks from forums we've never used, with irrelevant product anchor text, pointing to pages on our site that don't exist
Hi, I have a recurring issue that I can't find a reason for. I have a website with over 7k backlinks that I monitor quite closely. Each month, additional links appear on third-party forums that have no relevance to the site or subject matter and are, as a result, toxic. Our client's site is a training site, yet these links appear on third-party sites like http://das-forum-der-musik.de/mineforum/ with anchor text such as "UGG boots for sale", pointing to pages on our URL listed as /mensuggboots.html that obviously don't exist. Each month I try to contact the site owners and then add them to Google using the disavow tool. Two months later they are gone, only to be replaced with new backlinks on a number of different forum websites. Quite random, but always relating to UGG boots, and at least 100 extra links each month. Can anyone suggest why this is happening? Has anyone seen this kind of activity before? Is it possibly black hat SEO being performed by a competitor? I just don't understand why our URL is listed. To be fair, other websites are linked to using the same terms that aren't ours and are of a different theme, so I don't understand what the "spammer" is trying to achieve. Any help would be appreciated.
Kind regards, Steve
White Hat / Black Hat SEO | rufo
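Since the cleanup described above keeps recurring monthly, part of it can at least be automated: collect the spam source domains each month and regenerate the disavow file in the domain-level format Google's disavow tool accepts. A rough sketch (the second forum domain is made up for illustration):

```python
# Hypothetical monthly spam-link report: (source_domain, anchor_text) pairs.
spam_links = [
    ("das-forum-der-musik.de", "UGG boots for sale"),
    ("another-forum.example", "UGG boots for sale"),
]

def disavow_file(links) -> str:
    """Build a disavow file in Google's format: one 'domain:' line
    per unique spam domain, with '#' lines allowed as comments."""
    domains = sorted({domain for domain, _anchor in links})
    lines = ["# Spam forum links pointing at non-existent /mensuggboots.html pages"]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines)
```

Regenerating the file from the full, cumulative list each month (rather than uploading only the new domains) keeps the disavow complete, since each upload replaces the previous file.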
Google Analytics shows wrong position
For a particular targeted keyword, Google Analytics shows an average position of 20 for 10 impressions. On the other hand, tools like Rank Tracker and Authority Labs show no ranking at all, and when I manually check Google, that page is not listed anywhere in the top 400 results for that keyword. The page was optimized for the keyword 30-40 days ago, with no keyword stuffing (about 2% density) and nothing black hat. The keyword has just 300+ searches per month. Moz and other tools like MyWebsite Auditor show no major issues with on-page SEO, with a good overall score. Does anyone have any ideas as to why this happens, or has it happened to anyone before? Thanks
White Hat / Black Hat SEO | Aus007
Noindexed Pages with External Links Pointing to Them: Does the juice still pass through?
I have a site with many, many pages that have very thin content, yet they are useful for visitors. Those pages also have many external links pointing to them from reputable and authoritative websites. If I were to noindex/follow these pages, would the juice/value from the external links still pass through, just as if the pages didn't have the noindex tag? Please let me know!
White Hat / Black Hat SEO | juicyresults
IT want to do a name server redirect
Hi, I am in a little bit of a pickle, and hope that you clever people can help me... A little background: in April this year we relaunched one of our brands as a standalone business. I set up page-to-page 301 redirects from the old website to the new branded domain. From an SEO perspective the relaunch went amazingly smoothly: we only lost around 10% of traffic, and that was just for a couple of months. We now get more traffic than ever before. Basically, it's all going swimmingly. I noticed yesterday that the SSL certificate on the old domain has expired, so I asked IT to repurchase one so we can maintain the 301 redirects. IT are saying that they would prefer to do a name server redirect instead, which would remove all the page-to-page 301s. They claim this would maintain the SEO; as far as I am aware, it wouldn't. Please can someone help me put together a polite but firm response to basically say no? Thanks, I really welcome and appreciate your help on this! Amelia
White Hat / Black Hat SEO | CommT
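One concrete way to make the case to IT is to enumerate exactly what a nameserver-level redirect would throw away: the explicit old-URL-to-new-URL mapping. A rough sketch of maintaining that map and emitting Apache-style 301 rules from it (the paths and domain are hypothetical illustrations, not the actual site config):

```python
# Hypothetical page-to-page redirect map preserved from the relaunch.
# A nameserver redirect would collapse all of these to the new homepage.
redirect_map = {
    "/old-brand/about": "/about",
    "/old-brand/courses": "/training/courses",
}

def to_htaccess(mapping: dict) -> str:
    """Emit one Apache 'Redirect 301' line per old path, so each old
    URL keeps pointing at its specific new counterpart."""
    return "\n".join(
        f"Redirect 301 {old} https://newbrand.example{new}"
        for old, new in sorted(mapping.items())
    )
```

The point of the sketch is the distinction itself: a per-page 301 preserves the link equity each old URL earned, while a domain-level redirect sends everything to one place.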
How/why is this page allowed to get away with this?
I was doing some research on a competitor's backlinks in Open Site Explorer and noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited that page and found that it, carrying a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page: 618 links, zero nofollow tags, PR7. On top of that, there's a link at the top right corner that says "Want to Join?", which lists the requirements for getting your link on that page. One of these is to create a reciprocal link from your site back to theirs. I'm one of those white-hat SEOs who actually listens to Matt Cutts and the more recent stuff from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and that if you're going to do them, you should use a nofollow tag. I've read that pages, or directories, like these are being penalized by Google, and possibly that the websites linking to them could be penalized as well. I've read that sites exactly like these have been deindexed in bunches over the past couple of years. My real question is: how is this page allowed to get away with this? And how is it rewarded with such high PageRank? There's zero content aside from 618 links, all followed. Is this just a case of "Google just hasn't gotten around to finding and penalizing this site yet", or am I just naive enough to actually listen to and believe anything that comes out of Matt Cutts' videos?
White Hat / Black Hat SEO | Millermore
Help with E-Commerce Product Pages
Hi, I need to find the best way to put my products on our e-commerce website. I have researched and researched, but I thought I'd gather a range of ideas in here. Basically I have the following fields:
Product Title
Product Description
Product Short Description
SEO Title
Focus Keyword(s) (this is a feature of our CMS)
Meta Description
The problem we have is a lot of duplicate content, e.g. 10 Armani polos where each one is a different colour (but the model number is the same). I don't want to miss out on rankings because of this. What would you say is the best way to do this? My idea is this:
Product Title: Armani Jeans Polo Shirt Blue
Product Description: Armani Jeans Polo Shirt in Blue. Made from 100% cotton. Armani Jeans Polo with short sleeves, pique collar and button-up collar. Designer Boutique Menswear are official stockists of Armani Jeans Polos.
Short Description: Blue Armani Jeans Polo
SEO Title: Armani Jeans Polo Shirt Blue MA001 | Designer Boutique Menswear
Focus Keywords: Armani Jeans Polo Shirt
Meta Description: Blue Armani Jeans Polo Shirt. Made from 100% cotton. Designer Boutique Menswear are official stockists of Armani Polos.
What are people's thoughts on this? I would then run the same format across each of the different colours. Another question: should the product title and SEO title be exactly the same? And does it matter if I put the colour at the beginning or end of the title? Any help would be great.
White Hat / Black Hat SEO | YNWA
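One way to run the same format across every colour variant without hand-writing near-duplicate copy is to template the fields from the product attributes, so each variant's title and meta description stay distinct. A sketch along those lines (the function and parameter names are my own illustration, not a CMS feature):

```python
def product_meta(brand: str, item: str, colour: str, model: str,
                 site: str = "Designer Boutique Menswear") -> dict:
    """Build the SEO fields for one colour variant from its attributes,
    mirroring the field layout described in the question above."""
    title = f"{brand} {item} {colour}"
    return {
        "title": title,
        "short_description": f"{colour} {brand} {item.split()[0]}",
        "seo_title": f"{title} {model} | {site}",
        "meta_description": (f"{colour} {brand} {item}. Made from 100% cotton. "
                             f"{site} are official stockists of {brand} {item}s."),
    }
```

Feeding in "Blue" versus "Red" then yields fields that differ everywhere the colour appears, which reduces (though does not eliminate) the duplicate-content overlap between variants.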
Same content, different target area SEO
So, OK, I have a gambling site that I want to target for Australia, Canada, the USA and England separately, while still keeping the .com for worldwide (or not, read further). The website's content basically stays the same for all of them, perhaps with small changes of layout and information order (a different order for the top 10 gambling rooms). My first question: how should I mark the content for Google and other search engines so that it is not considered "duplicate content"? As mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far:
1. A separate Webmaster Tools account for every domain, where we set up geo-targeting to the specific country.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains; more info: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: should I target the .com at the US market, or are there other options? (We are not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
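Point 2 in the list above can be sketched concretely: each country version of a page would carry the full set of hreflang alternates, including a self-reference, plus an x-default for the worldwide .com. A minimal illustration (the example.* domains are placeholders for the real TLD variants):

```python
def hreflang_tags(variants: dict) -> str:
    """Build the <link rel="alternate" hreflang=...> tags that every
    country variant of the page would include in its <head>."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    )

# Hypothetical set of country-targeted domains for one page.
variants = {
    "en-gb": "https://example.co.uk/",
    "en-au": "https://example.com.au/",
    "en-ca": "https://example.ca/",
    "en-us": "https://example.com/",
    "x-default": "https://example.com/",
}
```

The key detail is that the identical block of tags goes on all of the variant pages, so the annotations are reciprocal; one-sided hreflang annotations are ignored.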