Any known impact of using a JS onload event?
-
Has anyone had a negative experience using a JS onload event to show a particular tab to organic search visitors versus direct ("front door") traffic?
Any other thoughts on the subject are appreciated.
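For context, the pattern being asked about might look something like this minimal sketch: an onload handler that checks the referrer and activates a different tab for search-engine traffic. The tab ids and the referrer test are hypothetical, not from the asker's site.

```javascript
// Sketch of the pattern in question (hypothetical tab ids):
// decide which tab to activate based on the referrer string.
function tabForReferrer(referrer) {
  // Treat visits referred by a major search engine as "organic".
  const organic = /\b(google|bing|yahoo)\./i.test(referrer);
  return organic ? "organic-tab" : "default-tab";
}

// In the page this would run on load, e.g.:
// window.onload = () => {
//   document.getElementById(tabForReferrer(document.referrer))
//     .classList.add("active");
// };

console.log(tabForReferrer("https://www.google.com/search?q=widgets")); // organic-tab
console.log(tabForReferrer(""));                                        // default-tab
```

Note that this amounts to serving different content depending on traffic source, which is the crux of the question.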
-
What is the purpose of this? Do you have all your content on one page, with 'tabs' that merely indicate the current 'page'?
My feeling is that if your home page is www.example.com and you're showing organic search visitors different content than visitors who type in www.example.com directly, then you are misleading both search engines and visitors, and you need to stop doing that.
You would be better off setting up separate pages, each well targeted to different keywords/terms. Using JS to present different content is not a good idea.
Related Questions
-
Should you bother with an "impact links" manual action
I have a couple of sites that have these, and I have done a lot of work to get them removed, but there seems to be very little if any benefit from doing this. In fact, sites where we have done nothing after these penalties seem to be doing better than ones where we have done link removal and the reconsideration request. Google says: "If you don't control the links pointing to your site, no action is required on your part. From Google's perspective, the links already won't count in ranking. However, if possible, you may wish to remove any artificial links to your site and, if you're able to get the artificial links removed, submit a reconsideration request. If we determine that the links to your site are no longer in violation of our guidelines, we'll revoke the manual action." I would guess a lot of people with this penalty don't even know they have it, and it sounds like leaving it alone really doesn't hurt your site. It seems to me that simply ignoring this and building better links and higher-quality content should help improve your site rankings versus worrying about trying to get all these links removed/disavowed. What are your thoughts? Is it worth trying to get this manual action removed?
Intermediate & Advanced SEO | netviper -
MedicalEntity Schemas: Examples of Sites Using Them?
Anyone know of any medical or health-related sites that have widely implemented medical schema types? For example: MedicalCode, MedicalTest, MedicalSignOrSymptom, etc., and others listed here: http://schema.org/MedicalEntity I've reviewed the examples on schema.org, but it would be helpful to see some live examples in the wild. Thanks!
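Live examples are hard to point to, but for reference, markup using one of these types might look like the following sketch, built here as a JavaScript object and serialized to the JSON-LD you would embed in a script tag. All names and values are hypothetical, not taken from a real site.

```javascript
// Hypothetical MedicalCondition markup (a schema.org MedicalEntity subtype),
// built as a plain object and serialized to JSON-LD for embedding in a
// <script type="application/ld+json"> tag.
const medicalCondition = {
  "@context": "https://schema.org",
  "@type": "MedicalCondition",
  name: "Seasonal allergic rhinitis",
  signOrSymptom: [
    { "@type": "MedicalSignOrSymptom", name: "Sneezing" },
    { "@type": "MedicalSignOrSymptom", name: "Nasal congestion" },
  ],
};

// Serialize for embedding in the page's HTML.
console.log(JSON.stringify(medicalCondition, null, 2));
```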
Intermediate & Advanced SEO | Allie_Williams -
Using Webmaster Tools to Redirect Domain to Specific Page on Another Domain
Hey Everyone, we redirected an entire domain to a specific URL on another domain (not the homepage). We used a 301 Redirect, but I'm also wondering if I should use the Google Webmaster Tools "Change of Address" section to redirect. There is no option to redirect the old domain to the specific URL on the new domain within the "Change of Address" section. Thoughts?
Intermediate & Advanced SEO | M_D_Golden_Peak -
Bad use of the rel="canonical" tag
Google is currently ranking my category page instead of our homepage for our key term and we would rather have our homepage rank for the term. Would it be a bad idea to rel="canonical" our category page to our homepage? Our homepage is optimized to rank for the keyword and has more PR than our category page. However, I don't really know if this will have negative repercussions. Thanks, Jason
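For reference, the tag under discussion would sit in the head of the category page and point at the homepage; something like this sketch (URL hypothetical):

```html
<!-- Hypothetical: placed in the <head> of the category page -->
<link rel="canonical" href="https://www.example.com/" />
```

Note that rel="canonical" is intended for duplicate or near-duplicate pages, so whether it is appropriate here is exactly what the question is asking.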
Intermediate & Advanced SEO | Jason_342 -
Use of subdomains, subdirectories or both?
Hello, I would like your advice on a dilemma I am facing. I am working on a new project that is going to launch soon: a network of users with personal profiles separated into categories. For example, let's say the categories are colors. Say I am a member who belongs to the red category, and I have a page where I maintain my personal information/CV/resume as well as a personal blog. The main site lets users search for members by color. My first idea is that every user should own a subdomain (and this is how it is developed so far); since the domain name is really short (just 3 letters), I believe a subdomain is worthwhile because each personal site will be easy to remember. My dilemma: should each user get a subdomain, a subdirectory, or both, and if both, which one should be canonical? Since search engines are said to treat subdomains as separate stand-alone sites, what is best for the main site: search results showing profiles on subdomains or in subdirectories? What if I use both, i.e., search results use the subdirectory URL for each profile while each profile also has its own subdomain? If so, which one should be canonical? Thanks in advance, C
Intermediate & Advanced SEO | HaCos -
Should 301 redirects be used only across domains or also internally?
In the following video, Cutts explains a bit more about 301 redirects, but he only talks about cross-site redirects: http://youtu.be/r1lVPrYoBkA What about redirecting internally from a non-existing product in a store to a new, similar existing product?
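For the internal case described, a single-product redirect is typically a one-line server rule. A minimal sketch on Apache might look like this (the paths are hypothetical, and the exact mechanism depends on the server and store platform):

```apache
# Hypothetical: permanently redirect a discontinued product page
# to the closest existing equivalent on the same domain.
Redirect 301 /store/old-widget /store/new-widget
```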
Intermediate & Advanced SEO | BeytzNet -
Can Google read the content of an iframe and use it for PageRank?
Beginner's question: when a website has its content inside iframes, will Google read it and consider it for PageRank?
Intermediate & Advanced SEO | Naghirniac -
Negative impact on crawling after uploading robots.txt file for HTTPS pages
I experienced a negative impact on crawling after uploading a robots.txt file for our HTTPS pages. You can find both URLs as follows. Robots.txt file for HTTP: http://www.vistastores.com/robots.txt Robots.txt file for HTTPS: https://www.vistastores.com/robots.txt I have disallowed all crawlers for HTTPS pages with the following syntax: User-agent: * Disallow: / Does that matter here? If I have done anything wrong, please give me more guidance on how to fix this issue.
Intermediate & Advanced SEO | CommercePundit