Can a hidden menu damage a website page?
-
Website (A) has a landing page offering courses.
Website (B) (a different organisation) has a link to Website A. Clicking that link takes you to Website A's Courses page, which is already a popular page with visitors who search for it or come directly to Website A.
Owners of Website A want to ADD an extra menu item to the MENU BAR on their Courses page to offer some specific courses to visitors who come from Website (B) to Website (A) - BUT the additional MENU ITEM is ONLY TO BE DISPLAYED if you arrive having clicked the link on Website (B).
Both parties intend to track this link.
However, if you come to the Courses landing page on Website (A) directly from a search engine, or by typing in the URL of the landing page, you will not see this EXTRA menu item with its link to courses; it only appears if you visit Website (A) having come from Website (B).
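For what it's worth, the mechanism described above is usually implemented client-side, something like the sketch below. This is NOT the actual site's code: the `?ref=site-b` tracking parameter, the `website-b.example` hostname, and the `extra-menu-item` element id are all hypothetical, and using a query parameter rather than the HTTP referrer is an assumption (browsers often strip or shorten referrers). Note the menu item is present in the HTML for every visitor, including crawlers, and is merely un-hidden for this segment.

```javascript
// Decide whether the extra menu item should be revealed for this visitor.
// Pure function so the logic is testable outside a browser.
function shouldShowExtraMenuItem(search, referrer) {
  const params = new URLSearchParams(search);
  // Preferred signal: the tracked link carries ?ref=site-b (hypothetical).
  if (params.get('ref') === 'site-b') return true;
  // Fallback: inspect the HTTP referrer (unreliable; often stripped).
  return referrer.includes('website-b.example');
}

// Browser-only wiring: reveal the (hypothetical) #extra-menu-item element.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => {
    if (shouldShowExtraMenuItem(window.location.search, document.referrer)) {
      const item = document.getElementById('extra-menu-item');
      if (item) item.hidden = false;
    }
  });
}
```

Because the markup served to Googlebot and to users is identical here (only a CSS visibility toggle differs), this is a weaker case for "cloaking" than serving two different HTML documents would be.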
The above approach is making me twitch, because what the programmer wants to do looks to me like a form of 'cloaking'. What I am not understanding is that Website (A)'s landing page URL outwardly shows Google a menu bar that appears normal, yet if I come to the same URL from Website (B) I end up seeing an ADDITIONAL MENU ITEM.
How will Google look at this LANDING PAGE? Surely it must see the coding instructions sitting behind the page that serve up, in effect, TWO VERSIONS of it, when the URL itself does not change.
What should I advise the developer? I don't want the landing page of Website (A), which is doing fine right now, to end up with some sort of penalty from the search engines through this exercise.
Many thanks in advance of answers from the community.
-
Great book.
-
Kurt thanks for your further contribution to this question.
I refer back to a book I once read by Steve Krug - Don't Make Me Think.
And I am very focused on parachuting the visitor into the right page with the right information, targeted at the end user you then want to 'convert' - and, as you say, there is no confusion about who the page is for. This way it is also more measurable in analytics.
-
I'd say having a unique landing page just for that specific segment is a very good idea for the user experience. Even though I don't think you'd have an SEO issue with their original idea, this would certainly remove all doubt.
-
Thanks to both William and Kurt for taking the time to respond to my question. I agree, this situation is unusual. The web developer I am working with is not a marketer but very much a programmer; his skill set is normally centred around bringing together end-to-end ecommerce solutions, and he leaves the marketing to me and my team.
What we are dealing with here are actually two academic websites, with academics who are not marketers at the centre of the requirements as to what 'they' want. So my developer partner has to work to what the client wants, and the client is required to satisfy an external third-party website - the one referred to in my question as Website (B).
My personal thought was: why not just create a specific landing page targeted at this audience coming from Website (B), with a deal tailored for them on that page? The call to action could carry something very specific (a voucher code or similar) unique to that audience taking up the offer, and so not interfere with my very public-facing page, which is already a popular landing page that I really don't want disturbed.
If you guys or anybody else has any further thoughts on this, I'd very much appreciate it.
-
Hi Brian,
I haven't come across this exact situation before, but I don't think it's anything to be concerned about. If you are just adding a single extra menu item to a navigation menu, I don't think it's enough to raise any flags.
I disagree with William, though, about adding the noindex and nofollow. It sounds like this is not a temporary test and you are getting traffic to the page from the search engines. So, I wouldn't sacrifice that traffic for extra caution.
I think you'll be fine adding the menu item.
Kurt Steinbrueck
OurChurch.Com -
It's not cloaking.
However, I'd suggest adding a noindex, nofollow to that landing page, so there is no confusion about which is the original. At the company I currently work for, we noindex, nofollow all of our testing pages that have a different linking structure, and it works well for us. Our test pages are activated under certain criteria, like new user, second-time user, etc.
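For reference, the noindex, nofollow suggested above is typically applied with a robots meta tag in the page's `<head>` (a generic snippet, not code from either site):

```html
<!-- Ask search engines not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

The same directive can also be sent as an `X-Robots-Tag` HTTP header if editing the page template is awkward.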
Related Questions
-
Website penalized: never the same again
In February 2015 I received an email from Google saying the site had been penalized for thin content with little or no added value. I resolved the situation with Google, and the site was reconsidered two months later. The problem is that since then the site has never been the same: it now gets only 10% of its former visits and has shown no growth since. I'm deciding whether to give up on the site; everyone who has had the same problem tells me to forget about it and work on something new. What do you think? Give up the site and get a new one? In addition, during this period I rewrote the entire site, made it responsive and mobile-friendly, improved it overall, and migrated it to WordPress. www.acervoamador.com.br (Warning: adult content) Thank you for your attention, and have a nice day.
White Hat / Black Hat SEO | | stroke0 -
SEO for career sites and sub-pages
For main job categories: we manage career pages for several clients, but the competition for the main keywords (even several long-tail ones) comes from big names like Indeed and similar job boards.
White Hat / Black Hat SEO | | rflores
What would you recommend? For job posts: since the job posts our clients publish are short-lived (80% live less than a month), would it still be wrong to purchase backlinks, or is it always a big no? Thanks for your help. And if a similar question has been asked, I would appreciate it if you could point me to it - I could not find one.
I'm Getting Attacked, What Can I Do?
I recently noticed a jump in my Crawl Errors in Google Webmaster Tools. Upon further investigation I found hundreds of the most spammy web pages I've ever seen pointing to my domain (although all going to 404 errors): http://blurchelsanog1980.blog.com/ http://lenitsky.wordpress.com/ These are all created within the last week. A. What the hell is going on? B. Should I be very concerned? (because they are 404 errors) C. What should my next steps be? Any help would be greatly appreciated.
White Hat / Black Hat SEO | | CleanEdisonInc0 -
Duplicate content for product pages
Say you have two separate pages, each featuring a different product. They have so many common features, that their content is virtually duplicated when you get to the bullets to break it all down. To avoid a penalty, is it advised to paraphrase? It seems to me it would benefit the user to see it all laid out the same, apples to apples. Thanks. I've considered combining the products on one page, but will be examining the data to see if there's a lost benefit to not having separate pages. Ditto for just not indexing the one that I suspect may not have much traction (requesting data to see).
White Hat / Black Hat SEO | | SSFCU0 -
Google admits it can take up to a year to refresh/recover your site after it is revoked from Penguin!
I found myself in an impossible situation where I was getting information from various people who seemed to be know-it-alls, but everything in my heart was telling me they were wrong about the issues my site was having. I have been on a few Google Webmaster Hangouts and found answers to many questions I thought had caused my Penguin penalty.
After taking much of the advice, I submitted my reconsideration request for the 9th time (might have been more) and finally got the "revoke" I was waiting for on the 28th of May. What was frustrating was that on May 22nd there was a Penguin refresh, which, as far as I knew, was what was needed to get a site back up in the organic SERPs. My disavow had been submitted in February, and only a handful of links were missing between then and the time we received the revoke.
We patiently waited for the next Penguin refresh, assured by John Mueller from Google that we were heading in the right direction (btw, John is a great guy and really tries to help where he can). The next update came on October 4th, and our rankings actually got worse! I spoke with John and he was a little surprised but did not go into any detail. At this point you have to start to wonder WHAT exactly is wrong with the website. Is this where I should rank? Is there a much deeper Panda issue? We were on the verge of removing almost all content from the site, or even changing domains, despite the fact that it was our brand name.
I then created a tool that checked the last cached date of every link in our disavow file. The thinking was that Google had not re-crawled all the links, so they were not factored into the last refresh. This proved to be incorrect: all the links had been re-cached in August and September, nothing earlier that would indicate a problem. I spoke to many so-called experts who all said the issue was that we had very few good links left, content issues, etc.
Blah blah blah - I'd heard it all before, and having been in this game since the late 90s, I knew the site could not rank this badly unless there was an actual penalty, as spam sites ranked above us for most of our keywords. So just as we were about to demolish the site, I asked John Mueller one more time if he could take a look at it, and this time he actually took the time to investigate, which was very kind of him. He came back to me in a Google Hangout in late December, and what he said was both disturbing and a relief at the same time: the site STILL had a Penguin penalty, despite the disavow file being submitted in February, over 10 months earlier, and the revoke in May.
I wrote this to give everyone here who has an authoritative site, or just an old one, hope that not all is lost just yet if you are still waiting to recover in Google. My site is 10 years old and is one of the leaders in its industry. Sites that are only a few years old and have had unnatural link-building penalties have recovered much faster in this industry, which I find ridiculous, as most of the time the older authoritative sites are the big, trustworthy brands. This explains why Google SERPs have been so poor for the last year: the big sites take much longer to recover from penalties, letting the smaller, less trustworthy sites prevail.
I hope to see my site recover in the next Penguin refresh, with the comfort of knowing that it is currently still being held back by the Penguin refresh situation. Please feel free to comment below on anything you think is relevant.
White Hat / Black Hat SEO | | gazzerman10 -
Domain Structure For A Network of Websites
To achieve this we need to set up a new architecture of domains and sub-websites to effectively build this network. We want to make sure we follow the right protocols for setting up the domain structures to achieve good SEO for the primary domain and the local websites.
Today our core website is at www.doctorsvisioncenter.com, which will ultimately become dvceyecarenetwork.com. That website will serve as the core web presence that can be custom branded for hundreds. For example, today you can go to www.doctorsvisioncenter.com/pinehurst; note that when you start there and click around, it stays branded for Pinehurst, or Spectrum Eye Care.
So the burning question(s): if I am an independent doc at www.newyorkeye.com, I could do domain forwarding, but Google does not index forwarded domains, so that is out. I could do a 301 permanent redirect to my page www.doctorsvisioncenter.com/newyorkeye, and then put a rule in the .htaccess file that says if the request is for newyorkeye.com, serve www.doctorsvisioncenter.com/newyorkeye but have the domain show up as www.newyorkeye.com. Another way to do that is to point the newyorkeye DNS to doctorsvisioncenter.com rather than using a 301 redirect, with the same basic rule in the .htaccess file. That means that, theoretically, every sub-page would show up as, for example, www.newyorkeye.com/contact-lens-center, which is actually www.doctorsvisioncenter.com/contact-lens-center. It also means, theoretically, that it would be seen as an individual domain, but one pointing at all the same content, just like potentially hundreds of others. The goal is: build once, manage once, benefit many.
If we do something like the above, each domain will essentially be a separate domain, but will Google see it that way, or as duplicative content? While it is easy to answer "yes, it would be duplicative," that is not necessarily the case just because the content sits on separate domains.
Is this a good way to proceed, or does anyone have another recommendation for us?
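For the DNS-pointing variant described above, the .htaccess rule would look something like the sketch below, assuming Apache with mod_rewrite. The hostnames and paths are taken from the example in the question; whether Google then treats the result as one site or as duplicate content across domains is exactly the open question here.

```apache
RewriteEngine On
# Requests arriving on the partner domain...
RewriteCond %{HTTP_HOST} ^(www\.)?newyorkeye\.com$ [NC]
# ...that are not already under the sub-site path...
RewriteCond %{REQUEST_URI} !^/newyorkeye/
# ...are internally rewritten to the shared content. No external redirect
# is issued, so the visible URL stays www.newyorkeye.com/...
RewriteRule ^(.*)$ /newyorkeye/$1 [L]
```

An internal rewrite like this (as opposed to an `[R=301]` redirect) is what makes the same content answer on many domains, which is also why a canonical strategy would need to be decided alongside it.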
White Hat / Black Hat SEO | | JessTopps0 -
Do shady backlinks actually damage ranking?
That is, it looks like a whole bunch of sites got smacked around the penguin/panda updates, but is this by virtue of actually being hurt by google's algorithms, or by virtue of simply not being helped "as much"? That is, was it a matter of the sites just not having any 'quality' backlinks, having relied on things google no longer liked, which would result in not having as much to push them to the top? That is, they would have been in the same position had they not had those shoddy practices? Or was google actively punishing those sites? That is, are they worse off for having those shoddy practices? I guess the reason I ask is I'm somewhat terrified of going "out there" to get backlinks -- worst case scenario: would it just not do much to help, or would it actually hurt? Thanks!
White Hat / Black Hat SEO | | yoni450 -
Does Google Penalize for Managing multiple Google Places from the same IP Address? Can you manage from same google account or separate? Or does it matter since it's created from the same IP?
I manage a number of clients' Google Places listings from the same IP and have heard this is not a good thing. Are there do's and don'ts when managing multiple Google Places listings? Should I create a separate Google account for each, or can I use the same account?
White Hat / Black Hat SEO | | Souk0