Can JavaScript affect Google's index/ranking?
-
We changed our website template about a month ago and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would see some rank drop, but not one this big. We used to rank at the top of the second page with the homepage, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much better organized information, and we added a slider presenting our main services. About 80% of the homepage content sits inside the slideshow and three tabs, but all of these elements are built with JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop?
I used the Fetch as Googlebot tool in Webmaster Tools and it looks like Google reads everything inside the JavaScript slideshow perfectly, so I was not worried until I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
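One way to approximate what a non-rendering crawler sees is to fetch the raw HTML without executing any JavaScript and search for a phrase that only appears inside the slideshow. A rough sketch of that check (the URL and phrase are placeholders):

```python
# Fetch the page the way a plain HTTP client would (no JavaScript execution)
# and check whether slider-only text is present in the raw markup.
import requests

url = "http://www.example.com/"                # placeholder for the homepage URL
phrase = "text that lives inside the slider"   # placeholder slideshow phrase

html = requests.get(url, timeout=10).text
if phrase.lower() in html.lower():
    print("Phrase found in the raw HTML - visible without JavaScript.")
else:
    print("Phrase NOT in the raw HTML - it only appears after JavaScript runs.")
```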
One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is still from the old website.
All main URLs are the same; we only removed some old ones that we no longer need, so we kept all the inbound links. The 301 redirects are properly set. Still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file disallows some folders like images, modules, and templates (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website.
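To double-check whether those disallowed folders block anything the homepage actually needs, a quick sketch like the one below can test individual resources against robots.txt (the resource URLs are placeholders for our Joomla paths):

```python
# Test whether specific CSS/JS/image URLs are blocked for Googlebot by robots.txt.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # placeholder domain
rp.read()

resources = [
    "http://www.example.com/templates/mytemplate/css/template.css",
    "http://www.example.com/templates/mytemplate/js/slideshow.js",
    "http://www.example.com/images/slider/service-1.jpg",
]

for url in resources:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:8s} {url}")
```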
Any advice would be much appreciated, thank you!
-
Thanks for looking into this. The new site was uploaded almost a month ago. Since then, Google has crawled our website at least four times; nevertheless, the image snippet is still from the old template. Although, as you pointed out, there are some minor issues, they are certainly not the reason we dropped so badly.
As Rand pointed out to me about 2 weeks ago, sometimes Google works in mysterious ways that even the best in SEO cannot understand.
I'll wait for Google to show the snippet image of my homepage, just to make sure it has been completely crawled. Thanks again for your help!
-
Thank you for sharing. The primary concern I have is that Google's cache of your home page is from Dec 3rd and looks different from the current site: "This is Google's cache of http://www.echolimousine.com/. It is a snapshot of the page as it appeared on Dec 3, 2011 10:23:14 GMT."
The next step is to wait for Google to recrawl your home page. Your home page has a PA of 47, so it should be updated very soon; normally the home page is the most frequently crawled page of a site. The other pages you changed need to be recrawled as well before the changes are fully reflected.
A few other minor items I noticed while looking at the page:
-
As you shared, the page does not validate. I would recommend correcting the errors.
-
You are presently using tables for layout in your HTML. That style of coding fell out of use years ago; I would recommend moving to table-less (CSS-based) markup.
-
Your meta keywords tag is very long and offers no value to search engines. Unless you use it for another purpose, you should remove it.
-
There is no value in adding an "index, follow" meta tag since that is the default behavior; it just adds unnecessary code to the page. I recommend removing it.
-
-
(echo limousine dot com)
Keywords – Chicago limo service, Chicago limousine service, Chicago limo | Chicago car service
As we redesigned the website, we decided to focus more on the first three keywords to better achieve the positions we want, so we removed "car service" from the title to better target the others. I was not sure whether keeping it there would dilute the first part, so I took it out; we ranked on the first page for that keyword and at the very top of page two for the first two. The third one is very competitive, and there are a couple of very old websites on the first page with partial-match domains that still seem to work in their favor, so it will take longer to beat them.
-
All of your responses are fine except one:
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low-end pages that did not gain any external links and also did not play any role in the internal linking process.
I disagree with your assessment that the missing pages are not important. If pages are missing from your site but internal links still point to them, PR flows to those pages through your internal links and none flows back to your site.
Based on your responses, you have taken the right actions and investigated the right areas. In order to offer further assistance I would need the URL involved, along with a couple of examples of keywords you used to rank for and have since dropped for, and the URLs of the pages you expect to rank for those keywords.
-
Thanks for your quick reply, Ryan.
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low-end pages that did not gain any external links and also did not play any role in the internal linking process.
"Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed" - it is indexed 100%
_“_I know you shared the 301 redirects are properly set, but you may wish to check them again” - all ok
_“_robots.txt is blocking the page – all ok _“the page is tagged with noindex, all ok
the page is not properly linked to internally or externally -_ internal links are all in place as well the external ones the page does not have unique content – the content on the homepage has been written about a week after the website has been uploaded and it is 100% unique.
“Check your new site's log for 404 errors where links are broken” – there are 20 broken links (according to Xenu), all coming from the template's CSS, but they point to images that were never published (some of those images belonged to old presets from the original template). Users cannot reach those broken links from inside our website and they have not been indexed anyway (a quick way to re-check them is sketched after this list).
“Crawl the new site checking for issues” – in our SEOmoz campaign there are only two warnings, and they relate to the title element being too short on the Contact and Sitemap pages.
“Perform a site: search for content that is in JS to see if it is indexed” – we did, every single page has been indexed and that shows in GWT as well.
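For the CSS-driven 404s mentioned above, a rough sketch like this can pull the url(...) references out of the template's stylesheet and flag the ones that return 404 (the stylesheet URL is a placeholder):

```python
# Extract url(...) references from a stylesheet and report any that 404.
import re
from urllib.parse import urljoin
import requests

css_url = "http://www.example.com/templates/mytemplate/css/template.css"  # placeholder
css = requests.get(css_url, timeout=10).text

for ref in sorted(set(re.findall(r'url\(([^)]+)\)', css))):
    ref = ref.strip(' \'"')
    if ref.startswith("data:"):
        continue  # skip inline data URIs
    full_url = urljoin(css_url, ref)
    status = requests.head(full_url, timeout=10).status_code
    if status == 404:
        print(f"404  {full_url}")
```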
Other issues:
- Homepage content is 90% new; all keywords have been kept in place and optimized.
- Title has been changed once
- all other webpages have kept their metrics
- we have dropped for every single keyword we were targeting on our homepage (echo limousine dot com), but all metrics are the same (PA, DA, mR, mT...)
In my opinion, it is an issue with our homepage, and I'm not sure what to check anymore; the JavaScript slideshow on the homepage generates a lot of code. We used a third-party company to upload the new template and I am not sure if there is an issue with that: CSS validation returned 3 errors and 354 warnings, and the W3C HTML validator returned 31 errors and 2 warnings. They told us these errors are somewhat normal because the templates are not pure HTML and will never fully validate. They might be right, but at this point I am taking everything into account. I have verified everything I know to check and I cannot find an answer.
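To keep an eye on whether those markup errors change as we clean things up, the check can be re-run programmatically. A sketch is below; the endpoint parameters (doc=<url>&out=json) are my assumption based on the public W3C Nu checker, so treat it as illustrative:

```python
# Re-run the W3C Nu HTML checker for a page and summarize errors vs. other messages.
import requests

page = "http://www.example.com/"  # placeholder for the homepage URL
resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page, "out": "json"},
    headers={"User-Agent": "markup-check-sketch/0.1"},
    timeout=30,
)
messages = resp.json().get("messages", [])
errors = [m for m in messages if m.get("type") == "error"]
print(f"{len(errors)} errors, {len(messages) - len(errors)} other messages")
for m in errors[:10]:
    print(f"line {m.get('lastLine')}: {m.get('message')}")
```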
-
“One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage), the picture snippet is from the old website.”
When you migrate from an old site to a new one, your rankings won't be stable until the entire new site has been crawled. Most sites use the same navigation and footers across the entire site, and the links on each and every page are important because they control the flow of a large amount of PageRank. If you remove a group of pages then the PR flow of your site is greatly disrupted.
Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed. If 30 days have passed and that has not happened, there is a problem. I know you shared the 301 redirects are properly set, but you may wish to check them again. 30 days after a site migration you should search Google's index for every page. Any pages not indexed should be listed, and that list should be reviewed. Determine the root cause of the problem. Some possibilities: the old page is not properly 301'd, robots.txt is blocking the page, the page is tagged with noindex, the page is not properly linked to internally or externally, the page does not have unique content, etc.
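If you want to spot-check the redirects in bulk, a minimal sketch along these lines will do it; the old/new URL pairs are placeholders for your own redirect map:

```python
# Verify that each old URL returns a single 301 pointing at the expected new URL.
import requests

redirect_map = {  # placeholder URL pairs - swap in your real old/new mapping
    "http://www.example.com/old-services.html": "http://www.example.com/services/",
    "http://www.example.com/old-fleet.html": "http://www.example.com/fleet/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code != 301:
        print(f"{old_url}: expected 301, got {resp.status_code}")
    elif location.rstrip("/") != expected.rstrip("/"):
        print(f"{old_url}: 301 points to {location}, expected {expected}")
    else:
        print(f"{old_url}: OK")
```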
Once your site is fully indexed, the next step is to work to improve rankings. Adjust any backlinks which you control so they link directly to the new site without the need for a 301 redirect. Check your new site's log for 404 errors where links are broken. Crawl the new site checking for issues. Continue the normal process of earning links.
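For the log check, something like this minimal sketch surfaces the most-hit 404s; the log path and format are assumptions, so adjust them for your server:

```python
# Count 404 responses per requested path in an Apache/Nginx combined access log.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # assumed location - adjust as needed
pattern = re.compile(r'"[A-Z]+ (\S+) HTTP/[^"]*" (\d{3})')

not_found = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = pattern.search(line)
        if match and match.group(2) == "404":
            not_found[match.group(1)] += 1

for path, hits in not_found.most_common(20):
    print(f"{hits:5d}  {path}")
```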
With respect to JavaScript, Google has demonstrated the ability to read some JS. It is safest to present all your content in HTML and use JS to perform functions such as controlling which content is displayed and when. If you decide to use JS to display content, I would always recommend checking Google for that content. Perform a site: search for content that is in JS to see if it is indexed – for example, site:yoursite.com "a sentence that appears only in the slideshow".