Can JavaScript affect Google's index/ranking?
-
We changed our website template about a month ago and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would see some rank drop, but not one this big. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much better organized information, and we added a slider presenting our main services. About 80% of the content on the homepage sits inside the slideshow and 3 tabs, but all of these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop?
I used the Webmaster Tools' Fetch as Googlebot tool and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I was not worried until now, when I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is still from the old website.
All main URLs are the same; we removed some old ones that we no longer need, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file disallows some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website.
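For reference, those disallowed folders come from Joomla's stock robots.txt; ours looks roughly like this (a trimmed sketch — the exact entries vary by Joomla version):

```text
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /modules/
Disallow: /templates/
Disallow: /tmp/
```

Note that blocking /images/ and /templates/ also keeps Googlebot from fetching pictures and template CSS, which may affect how the page is rendered and previewed.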
Any advice would be much appreciated, thank you!
-
Thanks for looking into this. The new site went live almost a month ago. Since then, Google has crawled our website at least 4 times; nevertheless, the image snippet is still from the old template. Although, as you pointed out, there are some minor issues, they are certainly not the reason we dropped so badly.
As Rand pointed out to me about 2 weeks ago, sometimes Google works in mysterious ways that even the best in SEO cannot understand.
I'll wait for Google to show the snippet image of my homepage, just to make sure it has completely crawled it. Thanks again for your help!
-
Thank you for sharing. The primary concern I have is that Google's cache of your home page is from Dec 3rd and looks different from the current site: "This is Google's cache of http://www.echolimousine.com/. It is a snapshot of the page as it appeared on Dec 3, 2011 10:23:14 GMT."
The next step is to wait for Google to recrawl your home page. Your home page has a PA of 47, so it should be updated very soon; normally the home page is the most crawled page of a site. Your other changed pages need to be recrawled as well to fully capture the changes.
A few other minor items I noticed while looking at the page:
-
As you shared, the page does not validate. I would recommend correcting the errors.
-
You are presently using table-based layout in your HTML. That style of coding fell out of use years ago; I would recommend moving to table-less markup.
-
Your meta keywords tag is very long and offers no value to search engines. Unless you use it for another purpose you should remove it.
-
There is no value in adding an "index, follow" meta tag since that is the default behavior; it just adds unnecessary code to the page. I recommend removing it.
-
-
(echo limousine dot com)
keywords: Chicago limo service, Chicago limousine service, Chicago limo | Chicago car service
As we redesigned the website, we decided to focus more on the first three keywords to better achieve the positions we want. We removed "car service" from the title to better target the others; I was not sure whether keeping it would dilute the first part, so I removed it. We ranked on the first page for that keyword and at the very top of page two for the first two; the third is very competitive, and a couple of very old websites on the first page have partial-match domains, which still seems to work in their favor, so it will take longer to beat them.
-
All of your responses are fine except one:
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
I disagree with your assessment that the missing pages are not important. If pages are missing from your site, PR still flows to those pages via your internal links, but zero PR flows back to your site.
Based on your responses, you have taken the right actions and investigated the right areas. In order to offer further assistance I would need the URL involved along with a couple examples of keywords which you used to rank for and have dropped. I would also need to know the URL of the pages which you expect to rank for the given keyword.
-
Thanks for your quick reply, Ryan.
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
"Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed" - it is indexed 100%
“I know you shared the 301 redirects are properly set, but you may wish to check them again” – all OK
“robots.txt is blocking the page” – all OK
“the page is tagged with noindex” – all OK
“the page is not properly linked to internally or externally” – internal links are all in place, as are the external ones
“the page does not have unique content” – the content on the homepage was written about a week after the website went live and is 100% unique.
“Check your new site's log for 404 errors where links are broken” – there are 20 broken links (according to Xenu), which come from the template's CSS, but those relate to images that were never published (some of those images were part of old presets from the original template). Users cannot reach those broken links from inside our website, and they have not been indexed anyway.
“Crawl the new site checking for issues” – in our SEOmoz campaign there are only 2 warnings, and they relate to the Title element being too short on the Contact and Sitemap pages.
“Perform a site: search for content that is in JS to see if it is indexed” – we did; every single page has been indexed, and that shows in GWT as well.
Other issues:
- Homepage content is 90% new; all keywords have been kept in place and optimized.
- Title has been changed once
- all other webpages have kept their metrics
- we have dropped for every single keyword we were targeting on our homepage (echo limousine dot com), but all metrics are the same (PA, DA, mozRank, mozTrust...)
In my opinion, it is an issue with our homepage. I'm not sure what to check anymore; the JavaScript slideshow on the homepage contains a lot of code. We used a third-party company to upload the new template and I am not sure whether there is an issue with that; CSS validation returned 3 errors and 354 warnings, and the W3C validator returned 31 errors and 2 warnings. They told us these errors are somewhat normal because the pages are not pure HTML and will never validate. They might be right, but at this point I am taking everything into account. I have verified everything I know and cannot find an answer.
-
One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is still from the old website.
When you migrate from an old site to a new site, your rankings won't be solid until the entire new site is crawled. Most sites use the same navigation and footers for the entire site. The links on each and every page are important, as they control the flow of a large amount of PageRank. If you remove a group of pages then the PR flow of your site is greatly disrupted.
Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed. If 30 days have passed and that has not happened, there is a problem. I know you shared the 301 redirects are properly set, but you may wish to check them again. 30 days after a site migration you should search Google's index for every page. Any pages not indexed should be listed, and that list should be reviewed. Determine the root cause of the problem. Some possibilities: the old page is not properly 301'd, robots.txt is blocking the page, the page is tagged with noindex, the page is not properly linked to internally or externally, the page does not have unique content, etc.
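To spot-check the 301s programmatically, a rough sketch like the following can help. The URLs and the response map here are purely illustrative; in practice you would populate the map with real status codes and Location headers from an HTTP client.

```python
def follow_redirects(url, responses, max_hops=5):
    """Walk redirects for `url` using a url -> (status, location) map.
    Returns (final_url, hop_count, statuses). A healthy migration
    redirect is exactly one hop: a single 301 ending in a 200."""
    hops, statuses = 0, []
    while hops < max_hops:
        status, location = responses.get(url, (200, None))
        statuses.append(status)
        if status in (301, 302) and location:
            url, hops = location, hops + 1
        else:
            break
    return url, hops, statuses

# Illustrative data: one clean 301, one redirect chain (bad), one 404.
responses = {
    "http://example.com/old-page": (301, "http://example.com/new-page"),
    "http://example.com/old-a":    (301, "http://example.com/old-b"),
    "http://example.com/old-b":    (301, "http://example.com/new-b"),
    "http://example.com/gone":     (404, None),
}

final, hops, statuses = follow_redirects("http://example.com/old-page", responses)
assert (final, hops) == ("http://example.com/new-page", 1)  # clean single 301
```

Chained redirects (old-a above takes two hops) and 404s are the patterns worth flagging: both leak PR compared to a direct 301.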
Once your site is fully indexed, the next step is to work to improve rankings. Adjust any backlinks which you control so they link directly to the new site without the need for a 301 redirect. Check your new site's log for 404 errors where links are broken. Crawl the new site checking for issues. Continue the normal process of earning links.
With respect to JavaScript, Google has demonstrated the ability to read some JS. It is safest to present all your content in HTML and use JS to perform functions such as control which content is displayed and when. If you decide to use JS to display content, I would always recommend checking Google for the content in JS. Perform a site: search for content that is in JS to see if it is indexed.
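As a quick sanity check alongside Fetch as Googlebot, you can test whether a phrase exists in the raw HTML (what a crawler that does not execute JavaScript sees) rather than only inside script blocks. A minimal, regex-based sketch with a made-up page; a real check would use a proper HTML parser:

```python
import re

def visible_in_static_html(html, phrase):
    """True if `phrase` appears in the markup outside <script> blocks,
    i.e. is available without executing any JavaScript."""
    stripped = re.sub(r"(?is)<script\b.*?</script>", "", html)
    return phrase in stripped

# Illustrative page: one sentence in plain HTML, one injected by JS.
page = """
<div id="intro"><p>Chicago limo service since 1995.</p></div>
<script>
  document.getElementById('intro').innerHTML +=
    '<p>Luxury airport transfers.</p>';
</script>
"""

assert visible_in_static_html(page, "Chicago limo service")          # in HTML
assert not visible_in_static_html(page, "Luxury airport transfers")  # JS-only
```

Content that fails this check is exactly the content you should move into plain HTML and merely show/hide with JS.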