Can JavaScript affect Google's index/ranking?
-
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would see some rank drop, but not one this big: our homepage used to rank at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure, which is now more user-friendly, with much better organized information, and we added a slider presenting our main services. About 80% of the content on the homepage sits inside the slideshow and 3 tabs, but all of these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop?
I used the Webmaster Tools' Fetch as Googlebot tool, and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I wasn't worried until now, when I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website.
All main URLs are the same; we removed some old ones that we no longer need, so we kept all the inbound links. The 301 redirects are properly set. Still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file disallows some folders like images, modules, and templates (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website.
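Since robots.txt disallows several template folders (a common Joomla setup), it's worth confirming that none of the blocked paths cover pages Google needs to reach. A quick sketch with Python's standard `urllib.robotparser`; the rules below mirror the Joomla-style disallows described, and the domain and paths are illustrative, not the real site's:

```python
from urllib.robotparser import RobotFileParser

# Rules modeled on the Joomla-style disallows mentioned above.
# In practice, point at the live file with rp.set_url(...) and rp.read().
rules = """\
User-agent: *
Disallow: /images/
Disallow: /modules/
Disallow: /templates/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Content pages should stay crawlable; template assets will be blocked.
for path in ("/", "/chicago-limo-service", "/templates/main.css"):
    allowed = rp.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")
```

One thing this surfaces: if the slideshow's script or CSS files live under a disallowed folder like `/templates/`, Googlebot may see the page differently than Fetch as Googlebot suggests.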
Any advice would be much appreciated, thank you!
-
Thanks for looking into this. The new site was uploaded almost a month ago. Since then, Google has crawled our website at least four times; nevertheless, the image snippet is still from the old template. Although, as you pointed out, there are some minor issues, they are certainly not the reason we dropped so badly.
As Rand pointed out to me about 2 weeks ago, sometimes Google works in mysterious ways that even the best in SEO cannot understand.
I'll wait for Google to show the snippet image of my homepage, just to make sure it has been completely crawled. Thanks again for your help!
-
Thank you for sharing. The primary concern I have is that Google's cache of your home page is from Dec 3rd and looks different than the current site: "This is Google's cache of http://www.echolimousine.com/. It is a snapshot of the page as it appeared on Dec 3, 2011 10:23:14 GMT."
The next step is to wait for Google to recrawl your home page. Your home page has a PA of 47, so it should be updated very soon; normally your home page is the most crawled page of your site. Your other changed pages need to be recrawled as well before the changes are fully captured.
A few other minor items I noticed while looking at the page:
-
As you shared, the page does not validate. I would recommend correcting the errors.
-
You are presently using table-based layout in your HTML. That style of coding fell out of use years ago; I would recommend switching to table-less code.
-
Your meta keywords tag is very long and offers no value to search engines. Unless you use it for another purpose you should remove it.
-
There is no value in adding an "index, follow" meta tag, since that is the default behavior; it just adds unnecessary code to the page. I recommend removing it.
-
(echo limousine dot com)
keywords - Chicago limo service, Chicago limousine service, Chicago limo | Chicago car service
As we redesigned the website, we decided to focus more on the first three keywords to better achieve the positions we wanted. We removed "car service" from the title to better target the other keywords; I wasn't sure whether keeping it there would dilute the first part, so I took it out. We ranked on the first page for that keyword, and at the very top of page two for the first two; the third one is very competitive, and there are a couple of very old websites on the first page with partial-match domains, which still seems to work in their favor, so it will take longer to beat them.
-
All of your responses are fine except one:
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
I disagree with your assessment that the missing pages are unimportant. If pages are missing from your site but your internal links still point to them, then PR flows to those pages via those links and zero PR flows back to your site.
Based on your responses, you have taken the right actions and investigated the right areas. In order to offer further assistance, I would need the URL involved along with a couple of examples of keywords you used to rank for and have since dropped. I would also need to know the URLs of the pages you expect to rank for each keyword.
-
Thanks for your quick reply, Ryan.
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
"Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed" - it is indexed 100%
“I know you shared the 301 redirects are properly set, but you may wish to check them again” – all ok
“robots.txt is blocking the page” – all ok
“the page is tagged with noindex” – all ok
“the page is not properly linked to internally or externally” – internal links are all in place, as are the external ones
“the page does not have unique content” – the content on the homepage was written about a week after the website was uploaded and it is 100% unique.
“Check your new site's log for 404 errors where links are broken” – there are 20 broken links (according to Xenu) coming from the template's CSS, but those relate to images that were never published (some were part of old presets from the original template). Users cannot reach those broken links from inside our website, and they have not been indexed anyway.
“Crawl the new site checking for issues” – in our SEOmoz campaign there are only 2 warnings, and they relate to the Title element being too short on the Contact page and Sitemap page.
“Perform a site: search for content that is in JS to see if it is indexed” – we did; every single page has been indexed, and that shows in GWT as well.
Other issues:
- Homepage content is 90% new; all keywords have been kept in place and optimized.
- Title has been changed once
- all other webpages have kept their metrics
- we have dropped for every single keyword we were targeting on our homepage (echo limousine dot com), but all metrics are the same (PA, DA, mR, mT...)
In my opinion, it is an issue with our homepage. I'm not sure what to check anymore; the JavaScript slideshow on the homepage generates a lot of code. We used a third-party company to upload the new template, and I am not sure if there is an issue with that; CSS validation returned 3 errors and 354 warnings, and W3C returned 31 errors and 2 warnings. They told us these errors are somewhat normal because the pages are not pure HTML and will never validate. They might be right, but at this point I am taking everything into account. I have verified everything I know of and cannot find an answer.
-
One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website.
When you migrate from an old site to a new site, your rankings won't be solid until the entire new site is crawled. Most sites use the same navigation and footer across the entire site; the links on each and every page are important, as they control the flow of a large amount of PageRank. If you remove a group of pages, the PR flow of your site is greatly disrupted.
Before you can begin to troubleshoot any ranking issues, your new site should be 100% indexed. If 30 days have passed and that has not happened, there is a problem. I know you shared that the 301 redirects are properly set, but you may wish to check them again. Thirty days after a site migration, you should search Google's index for every page. Any pages not indexed should be listed, and that list reviewed to determine the root cause. Some possibilities: the old page is not properly 301'd, robots.txt is blocking the page, the page is tagged with noindex, the page is not properly linked to internally or externally, the page does not have unique content, etc.
Once your site is fully indexed, the next step is to work to improve rankings. Adjust any backlinks which you control so they link directly to the new site without the need for a 301 redirect. Check your new site's log for 404 errors where links are broken. Crawl the new site checking for issues. Continue the normal process of earning links.
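Scanning the access log for 404s is also easy to automate. A sketch for Apache/Nginx-style log lines; the sample entries are invented for illustration:

```python
import re

# Matches the request path and status code in a combined-format log line.
LOG_LINE = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+" (\d{3})')

def broken_paths(log_lines):
    """Return each distinct path that returned a 404, with a hit count."""
    counts = {}
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(2) == "404":
            path = m.group(1)
            counts[path] = counts.get(path, 0) + 1
    return counts

sample = [
    '1.2.3.4 - - [05/Dec/2011:10:00:00 +0000] "GET /old-page.html HTTP/1.1" 404 512',
    '1.2.3.4 - - [05/Dec/2011:10:00:01 +0000] "GET / HTTP/1.1" 200 8192',
    '5.6.7.8 - - [05/Dec/2011:10:00:02 +0000] "GET /old-page.html HTTP/1.1" 404 512',
]
print(broken_paths(sample))  # {'/old-page.html': 2}
```

Paths that 404 repeatedly are usually old URLs that were missed by the 301 map, or broken links somewhere on the new templates.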
With respect to JavaScript, Google has demonstrated the ability to read some JS. It is safest to present all your content in HTML and use JS only for functions such as controlling which content is displayed and when. If you do use JS to display content, I always recommend checking whether Google has picked it up: perform a site: search for content that is in JS to see if it is indexed.
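A complementary check is to look at the page's raw HTML source, before any script runs, and confirm the key phrases are present there; that is roughly what a crawler that doesn't execute JS would see. A minimal sketch; the markup and phrases are illustrative:

```python
import urllib.request

def phrases_in_raw_html(html, phrases):
    """Report which phrases appear in the raw HTML source -- i.e. content
    visible without executing any JavaScript."""
    return {p: (p in html) for p in phrases}

# In practice you would fetch the live page, e.g.:
# html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "replace")

# Two illustrative cases: content present in the markup itself,
# versus content that only exists after an external script runs.
html_in_markup = '<div class="slide">Chicago limo service since 1995</div>'
html_js_only = '<script src="/templates/slideshow.js"></script><div id="slides"></div>'

print(phrases_in_raw_html(html_in_markup, ["Chicago limo service"]))
print(phrases_in_raw_html(html_js_only, ["Chicago limo service"]))
```

If a phrase is missing from the raw source, it only exists after script execution, and its indexing is at the mercy of how well Google handles that particular JS.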