Can a hidden menu damage a website page?
-
Website (A) - has a landing page offering courses
Website (B) - (a different organisation) has a link to Website A. Clicking that link takes you to Website A's Courses page, which is already a popular page with visitors who search for it or come directly to Website A.
Owners of Website A want to ADD an extra menu item to the MENU BAR on their Courses page, offering some specific courses to visitors who come from Website (B) to Website (A) - BUT the additional MENU ITEM is ONLY to be displayed if you arrived by clicking the link on Website (B).
Both parties intend to track this link.
However, if you come to the Courses landing page on Website (A) directly from a search engine, or by typing in the URL of the landing page, you will not see this EXTRA menu item with its link to courses - it only appears if you visit Website (A) having come from Website (B).
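For clarity, here is a minimal sketch of the kind of client-side check the developer is probably proposing (an assumption on my part - the actual implementation may differ, and `PARTNER_HOST` and the `ref` parameter are hypothetical names):

```javascript
// Show the extra menu item only to visitors arriving from Website (B).
// PARTNER_HOST and the "ref" query parameter are hypothetical names.
const PARTNER_HOST = "website-b.example.com";

function cameFromPartner(referrer, queryString) {
  // Browsers sometimes strip the referrer, so a tagged link
  // (e.g. ?ref=site-b appended to the link on Website B's side)
  // is the more reliable of the two signals.
  const params = new URLSearchParams(queryString);
  return params.get("ref") === "site-b" || referrer.includes(PARTNER_HOST);
}

// In the page itself, something like:
//   if (cameFromPartner(document.referrer, window.location.search)) {
//     reveal the extra menu item
//   }
```

Because this runs in the visitor's browser and a crawler fetching the URL directly has no partner referrer or tag, the crawler would see the same default menu as any direct visitor - which is why this kind of thing is usually discussed as personalization rather than cloaking.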
The above approach is making me twitch, because to me this looks like a form of 'cloaking'. What I am not understanding is that Website (A)'s landing page URL outwardly shows Google a menu bar that appears normal, yet if I come to the same URL ADDRESS from Website (B) I end up seeing an ADDITIONAL menu item.
How will Google look at this LANDING PAGE? Surely it must see the coding instructions sitting behind this page that serve up, in effect, TWO VERSIONS of the page when the URL itself does not change.
What should I advise the developer? I don't want the landing page of Website (A), which is doing fine right now, to end up with some sort of penalty from the search engines through this exercise.
Many thanks in advance for answers from the community.
-
Great book.
-
Kurt thanks for your further contribution to this question.
I refer back to a book that I once read by Steve Krug - Don't Make Me Think.
And I am very focused on parachuting the visitor into the right page with the right information, targeted towards the end user you want to 'convert' - and, as you say, there is no confusion about who the page is for. This way it is also more measurable in analytics.
-
I'd say having a unique landing page just for that specific segment is a very good idea for the user experience. Even though I don't think you'd have an SEO issue with their original idea, this would certainly remove all doubt.
-
Thanks to both William and Kurt for taking the time to respond to my question. I agree, this situation is unusual. The web developer I am working with is not a marketer; he is very much a programmer whose skill set is normally centred around bringing together end-to-end ecommerce solutions, and he leaves the marketing to me and my team.
What we are dealing with here are actually two academic websites, with academics who are not marketers at the centre of the requirements as to what 'they' want. So my developer partner is having to work on what the client wants, and the client is required to satisfy an external third-party website - the one referred to in my question as Website (B).
My personal thought was: why not just create a specific landing page targeted at the audience coming from Website (B), with a deal tailored for them on that page? The call to action could have something very specific behind it (a voucher code or similar), unique to that audience, so they can take up the offer without interfering with my very public-facing page, which is already a popular landing page that I really don't want disturbed.
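The "unique voucher" idea can be sketched very simply - for example, issuing codes for the Website (B) campaign under a shared prefix so redemptions can be attributed to that audience in analytics (the prefix and naming convention below are purely hypothetical):

```javascript
// Hypothetical convention: vouchers issued for the Website (B)
// campaign all carry the same prefix, so redemptions can be
// attributed to that audience without touching the main page.
const CAMPAIGN_PREFIX = "SITEB-"; // assumed naming convention

function isPartnerVoucher(code) {
  return code.toUpperCase().startsWith(CAMPAIGN_PREFIX);
}
```

This keeps the popular public Courses page untouched while still giving both parties a measurable signal for the partnership.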
If you guys or anybody else has any further thought on this I very much appreciate it.
-
Hi Brian,
I haven't come across this exact situation before, but I don't think it's anything to be concerned about. If you are just adding a single extra item to a navigation menu, I don't think it's enough to raise any flags.
I disagree with William, though, about adding the noindex and nofollow. It sounds like this is not a temporary test, and you are getting traffic to the page from the search engines, so I wouldn't sacrifice that traffic for extra caution.
I think you'll be fine adding the menu item.
Kurt Steinbrueck
OurChurch.Com
-
It's not cloaking.
However, I'd suggest adding noindex, nofollow to that landing page, so there is no confusion about which is the original. At the company I currently work for, we noindex, nofollow all of our testing pages that have a different linking structure, and it works well for us. Our test pages are activated under certain criteria, like new user, second-time user, etc.