I show different versions of the same page to crawlers and users, but don't want to anymore
-
Hello,
Back when Google could not read JavaScript, I created two versions of the same page: one for humans and another for Google. Now I no longer want to serve different content to the search engine, but I worry that I will lose my traffic. What is the best way to make the change without a loss? Can you help me?
-
Hi there
Ideally, create one page that serves both search engines and users: you want users to find your page via search engines, and you want search engines to be able to crawl your content. Google is thought to be getting better at crawling JavaScript, but you should still make sure your text and content are readable in a text-based browser, or visible to Google with JavaScript turned off. Here's a resource for you.
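One way to sanity-check the switch is to compare the visible text of the version your users get against the version you were serving to crawlers: once both show the same text with scripts stripped out, you're no longer cloaking. Here's a minimal sketch using only the standard library; the HTML snapshots below are made up for illustration.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks,
    roughly approximating what a text-based browser would show."""

    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)


# Hypothetical snapshots of the two versions of the same page
user_version = (
    "<html><body><h1>Blue Widgets</h1>"
    "<script>render()</script>"
    "<p>Our blue widgets ship free.</p></body></html>"
)
crawler_version = (
    "<html><body><h1>Blue Widgets</h1>"
    "<p>Our blue widgets ship free.</p></body></html>"
)

# If both versions expose the same visible text, the cloaking is gone
print(visible_text(user_version) == visible_text(crawler_version))
```

In practice you'd fetch the live page twice (once with a regular browser user-agent, once with Googlebot's) and run both responses through a check like this.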
That being said, focus on having one page for the content you're trying to create, so you can put more SEO effort into building equity in that page. You can also build other pages around variations of that topic that link back to it, and link to those new pages from the main topic page as well. This helps build your site topically and passes link equity throughout your site.
Let me know if this makes sense or helps. Best of luck!
Patrick
Related Questions
-
My site is in 2 pages
My site is in 2 pages; how can I rank with these keywords: "dubai legal translation" and "legal translation in Dubai"?
White Hat / Black Hat SEO | | saharali150 -
IT wants to do a name server redirect
Hi, I am in a little bit of a pickle and hope that you clever people can help me... A little background: in April this year we relaunched one of our brands as a standalone business. I set up page-to-page 301 redirects from the old website to the new branded domain. From an SEO perspective the relaunch went amazingly smoothly: we only lost around 10% of traffic, and that was just for a couple of months. We now get more traffic than ever before. Basically it's all going swimmingly. I noticed yesterday that the SSL certificate on the old domain has expired, so I asked IT to repurchase one so we can maintain the 301 redirects. IT would prefer to do a name server redirect instead, which would remove all the page-to-page 301s; they are saying this would maintain the SEO. As far as I am aware it wouldn't. Please can someone help me put together a polite but firm response to basically say no? Thanks, I really welcome and appreciate your help on this! Amelia
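The difference between the two approaches can be sketched in a few lines: page-to-page 301s keep a per-URL mapping so each old page passes its equity to its specific successor, while a bare name-server change with no per-page rules effectively sends every old URL to one place. The domains and paths below are made up for illustration.

```python
# Hypothetical mapping from old-domain paths to their new-brand counterparts
redirect_map = {
    "/widgets/blue": "https://newbrand.example/products/blue-widget",
    "/about": "https://newbrand.example/about-us",
}


def page_to_page(path):
    """Per-URL 301: each old page redirects to its specific successor,
    falling back to the homepage only for unmapped paths."""
    return 301, redirect_map.get(path, "https://newbrand.example/")


def blanket_redirect(path):
    """What a name-server change with no per-page rules amounts to:
    every old URL lands on the same destination, losing the mapping."""
    return 301, "https://newbrand.example/"
```

With the blanket version, a deep link to `/widgets/blue` no longer reaches the matching product page, which is exactly the per-page equity a page-to-page setup preserves.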
White Hat / Black Hat SEO | | CommT0 -
Is it wrong to have the same page represented twice in the Nav?
Hi Mozzers, I have a client that has 3 pages represented twice in the nav. They are not duplicates, since both menu entries land on the same URL. It seems odd, but I guess it makes sense for my client to have them represented twice, since these pages could fall into multiple categories. Is this bad practice for SEO, or is it a waste to have them in the nav twice? Should I ask them to eliminate the extras? Thanks!
White Hat / Black Hat SEO | | Ideas-Money-Art0 -
Authorship photo not showing. Have done all the checks, but the photo still isn't appearing
Can someone suggest why the authorship photo is not showing? I have asked this earlier too but did not get much response. Site URLs: http://www.mycarhelpline.com/index.php?option=com_easyblog&view=entry&id=93&Itemid=91 and http://www.mycarhelpline.com/index.php?option=com_latestnews&view=detail&n_id=479&Itemid=10. Google+: https://plus.google.com/109551624336693902828/posts. Checks done: ?rel=author at the end of the profile URL on the site - yes; profile discovery option on in Google+ - yes; contributor link in Google+ - yes; email validation done - yes; photo fitted to size - yes; rich snippet tool showing authorship established with photo - yes. Still, the photo has not appeared for the last 6 months. Any suggestions, please? Even when searching the name 'Gagan Modi', the photo does show in search results for the Google+ profile, but the rich snippet author photo does not show for the site.
White Hat / Black Hat SEO | | Modi0 -
Website Vulnerability Leading to Doorway Page Spam. Need Help.
Keywords he is ranking for: houston dwi lawyer, houston dwi attorney, etc. The client was acquired in June, and since then we have done nothing but build high-quality links to the website. None of our clients were dropped, dinged, or impacted by the Panda/Penguin updates in 2012 or earlier Google updates, which shows we do quality SEO work. We went ahead and started duplicating links that worked for other legal clients, and 5 months later this client is either dropping or staying flat in local maps results, and we are performing very badly in organic results. Some more history: when he first engaged our company, we switched his website from a CMS called Plone to WordPress. During the move I ran some searches to figure out which pages we needed to 301, and we came across many profile or member pages created on the client's CMS (Plone). These pages were very spammy and linked to other Plone sites using car make/model/year type keywords (e.g. "jeep cherokee dealerships"). I went through these sites to see if they were linking back and could not find any backlinks to my client's website. Obviously nobody authorized these pages; they all looked very hackish, and it seemed as though there was a vulnerability in his Plone installation which nobody caught. Fast forward 5 months, and the newest OSE update is showing me a good 50+ backlinks with unrelated anchor text. These anchor text links are the same color as the background and can only be found if you hover your mouse over certain areas of the site. All of these sites are built on Plone, and a lot of them belong to other businesses or community websites. These websites obviously have no clue they have been hacked or are being used for black hat purposes. There are dozens of unrelated anchor text links on external websites pointing back to our client's website.
Examples: autex Isuzu, Toyota service department ratings, die cast BMW, etc. Obviously the first step is to use the disavow link tool, which will be completed this week. The second step is to get some feedback from the SEO community. It seems like these pages are automatically created using some type of bot, and it will be very tedious if we have to continually remove these links. I hope there is a way to notify Google that these websites are all Plone and have a vulnerability which black hats are using to harm the innocent... If I cannot get Google to handle this, then the only other option is to start fresh with a new domain name. What would you do in this situation? Your help is greatly appreciated. Thank you
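Since the disavow tool comes up here: the file you upload is plain text where lines starting with "#" are comments and each directive is either a full URL or a "domain:" line. A small sketch that builds one from a list of hacked domains (the domain names below are made up):

```python
# Hypothetical list of hacked Plone domains found in the backlink report
spammy_domains = ["plone-site-one.example", "plone-site-two.example"]


def disavow_lines(domains, comment="Hacked Plone sites with hidden links"):
    """Build the contents of a disavow file in the format Google's
    disavow tool accepts: '#' comment lines and 'domain:' directives.
    Duplicates are dropped and the list is sorted for readability."""
    lines = [f"# {comment}"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines)


print(disavow_lines(spammy_domains))
```

Disavowing at the domain level (rather than listing individual URLs) is usually the right call for hacked sites, since the bot can keep generating new pages on the same domains.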
White Hat / Black Hat SEO | | waqid0 -
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read the Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at SearchEngineJournal. When I initially read this article, I was happy. It was confirming something that I believed, and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, but not through any sort of penalization, appears in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names that I recognized were taking the opposite side of the issue. There seem to be a few different opinions: The SEOmoz opinion that bad links can't hurt except when they get devalued. The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility. The idea that both manual and algorithmic penalties are a factor. Now, I know that SEOmoz preaches a link building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here - they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test, rank a keyword for an arbitrary term, and go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | | AnthonyMangia0 -
Multiple links to different pages from same page
Hey, I have an opportunity to get listed in a themed directory page that has a high mozRank of 4+ and a high mozTrust of 5+. Would it be better to just have one link from that page going to one of my internal product category pages, or to take advantage of the 'sitelinks' they offer, which allow me to have an additional 5 anchor-text links to 5 other pages? I've attached an example. sitelinks.jpg
White Hat / Black Hat SEO | | JerDoggMckoy0