Can JavaScript affect Google's index/ranking?
-
We changed our website template about a month ago and have since experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We expected some rank drop, but not one this huge: the homepage used to rank at the top of the second page, and we have now lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much better organized information, and we added a slider presenting our main services. About 80% of the homepage content sits inside the slideshow and 3 tabs, and all of these elements rely on JavaScript. The content is unique and SEO optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop?
I used the Webmaster Tools' Fetch as Googlebot feature and it looks like Google reads everything inside the JavaScript slideshow perfectly, so I was not worried until now, when I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ... "
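As an extra sanity check, one rough approximation of what a crawler that does not execute JavaScript sees is to strip the script blocks from the raw HTML and search for the copy. This is only a sketch, not what Fetch as Googlebot actually does, and the helper and markup strings below are invented for illustration:

```javascript
// Rough approximation of what a non-JS crawler sees: remove <script> blocks
// from the raw HTML and check whether a phrase of the copy survives.
// (Hypothetical helper and markup, for illustration only.)
function visibleWithoutJs(html, phrase) {
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, "");
  return withoutScripts.includes(phrase);
}

// Content present in the markup itself: still there when scripts are stripped.
const staticSlide = '<div class="slide"><h2>Chicago limo service</h2></div>';

// Content that only exists inside a script: gone when scripts are stripped.
const jsOnlySlide =
  '<div id="slider"></div>' +
  '<script>document.getElementById("slider").innerHTML =' +
  ' "<h2>Chicago limo service</h2>";</script>';

console.log(visibleWithoutJs(staticSlide, "Chicago limo service")); // true
console.log(visibleWithoutJs(jsOnlySlide, "Chicago limo service")); // false
```

If the phrase only survives while scripts are present, the content is being injected by JavaScript rather than served in the HTML.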
One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is still from the old website.
All main URLs are the same; we removed some old ones that we no longer need, but we kept all the inbound links, and the 301 redirects are properly set. Still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file disallows some folders like images, modules, and templates (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website.
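For context, the kind of rules in question look roughly like the stock Joomla robots.txt defaults (shown for reference, not a copy of the site's actual file); note that disallowing /templates/ and /images/ keeps crawlers away from the CSS and images used to render the page:

```
User-agent: *
Disallow: /administrator/
Disallow: /images/
Disallow: /modules/
Disallow: /templates/
```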
Any advice would be much appreciated, thank you!
-
Thanks for looking into this. The new site went up almost a month ago. Since then, Google has crawled our website at least 4 times; nevertheless, the image snippet is still from the old template. Although, as you pointed out, there are some minor issues, they are certainly not the reason we dropped so badly.
As Rand pointed out to me about 2 weeks ago, sometimes Google works in mysterious ways that even the best in SEO cannot understand.
I'll wait for Google to show the snippet image of my homepage, just to make sure it has been completely crawled. Thanks again for your help!
-
Thank you for sharing. The primary concern I have is that Google's cache of your home page is from Dec 3rd and looks different from the current site: "This is Google's cache of http://www.echolimousine.com/. It is a snapshot of the page as it appeared on Dec 3, 2011 10:23:14 GMT."
The next step is to wait for Google to recrawl your home page. Your home page has a PA of 47, so it should be updated very soon; normally the home page is the most crawled page of a site. Your other changed pages need to be recrawled as well before all the changes are fully captured.
A few other minor items I noticed while looking at the page:
-
As you shared, the page does not validate. I would recommend correcting the errors.
-
You are presently using tables for layout. That style of coding fell out of use years ago; I would recommend table-less (CSS-based) layout instead.
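A minimal illustration of the difference (class names invented), using the float-based CSS that was idiomatic at the time:

```html
<!-- Layout table to avoid: -->
<!-- <table><tr><td>Main content</td><td>Sidebar</td></tr></table> -->

<!-- Table-less equivalent: -->
<div class="row">
  <div class="col">Main content</div>
  <div class="col">Sidebar</div>
</div>
<style>
  .row { overflow: hidden; }          /* contains the floated columns */
  .col { float: left; width: 50%; }   /* two equal columns */
</style>
```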
-
Your meta keywords tag is very long and offers no value to search engines. Unless you use it for another purpose, you should remove it.
-
There is no value in adding an "index, follow" meta tag, since that is the default behavior; it just adds unnecessary code to the page. I recommend removing it.
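Concretely, both of the tags discussed above can simply be deleted from the page head (the keywords value is abbreviated here, not filled in):

```html
<!-- Redundant: "index, follow" is the crawler's default behavior anyway -->
<meta name="robots" content="index, follow">

<!-- Ignored by Google for ranking; dead weight unless used for another purpose -->
<meta name="keywords" content="Chicago limo service, Chicago limousine service, ...">
```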
-
-
(echo limousine dot com)
keywords: Chicago limo service, Chicago limousine service, Chicago limo | Chicago car service
As we redesigned the website, we decided to focus on the first 3 keywords to better achieve the positions we wanted, and we removed "car service" from the Title because I was not sure whether keeping it there would dilute the others. We ranked on the first page for that keyword, and at the very top of page two for the first two. The third one is very competitive; a couple of very old websites on the first page have partial-match domains, which still seems to work in their favor, so it will take longer to beat them.
-
All of your responses are fine except one:
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
I disagree with your assessment that the missing pages are unimportant. If pages are missing from your site, PR flows to them via your internal links, but zero PR flows back to your site.
Based on your responses, you have taken the right actions and investigated the right areas. In order to offer further assistance I would need the URL involved along with a couple examples of keywords which you used to rank for and have dropped. I would also need to know the URL of the pages which you expect to rank for the given keyword.
-
Thanks for your quick reply, Ryan.
“If you remove a group of pages then the PR flow of your site is greatly disrupted” – No important pages have been removed, just some low end pages that did not gain any external links and also did not play any role in the internal linking process.
"Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed" - it is 100% indexed
“I know you shared the 301 redirects are properly set, but you may wish to check them again” – all OK
“robots.txt is blocking the page” – all OK
“the page is tagged with noindex” – all OK
“the page is not properly linked to internally or externally” – internal links are all in place, as well as the external ones
“the page does not have unique content” – the homepage content was written about a week after the website went up and is 100% unique.
“Check your new site's log for 404 errors where links are broken” – there are 20 broken links (according to Xenu), all coming from the template's CSS; they relate to images that were never published (some of those images were part of old presets from the original template). Users cannot reach those broken links from inside our website, and they have not been indexed anyway.
“Crawl the new site checking for issues” – in our SEOmoz campaign there are only 2 warnings, both about the Title element being too short on the Contact and Sitemap pages.
“Perform a site: search for content that is in JS to see if it is indexed” – we did, every single page has been indexed and that shows in GWT as well.
Other issues:
- Homepage content is 90% new; all keywords have been kept in place and optimized.
- Title has been changed once
- all other webpages have kept their metrics
- we have dropped for every single keyword we were targeting on our homepage (echo limousine dot com), but all metrics are the same (PA, DA, mR, mT...)
In my opinion, it is an issue with our homepage, and I'm not sure what to check anymore; the JavaScript slideshow on the homepage contains a lot of code. We used a third-party company to upload the new template, and I am not sure if there is an issue with that: CSS validation returned 3 errors and 354 warnings, and the W3C HTML validator returned 31 errors and 2 warnings. They told us these errors are somewhat normal because the templates are not pure HTML and will never validate. They might be right, but at this point I am taking everything into account. I have verified everything I know and cannot find an answer.
-
"One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage), the picture snippet is from the old website."
When you migrate from an old site to a new site, your rankings won't be stable until the entire new site is crawled. Most sites use the same navigation and footers across the entire site; the links on each and every page are important, as they control the flow of a large amount of PageRank. If you remove a group of pages, the PR flow of your site is greatly disrupted.
Before you can begin to troubleshoot any ranking issues your new site should be 100% indexed. If 30 days have passed and that has not happened, there is a problem. I know you shared the 301 redirects are properly set, but you may wish to check them again. 30 days after a site migration you should search Google's index for every page. Any pages not indexed should be listed, and that list should be reviewed. Determine the root cause of the problem. Some possibilities: the old page is not properly 301'd, robots.txt is blocking the page, the page is tagged with noindex, the page is not properly linked to internally or externally, the page does not have unique content, etc.
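Rechecking a long list of redirects by hand is tedious, so the status code and Location header observed for each old URL (gathered with a HEAD request, for example) can be run through a small classifier. This is a minimal sketch with my own simplified categories, not an official rule set:

```javascript
// Hypothetical helper: classify an old URL's HTTP response during a migration.
// status: numeric HTTP status observed for the old URL
// location: value of the Location header (or null if absent)
// expectedNewUrl: where the permanent redirect should point
function classifyRedirect(status, location, expectedNewUrl) {
  if (status === 301) {
    return location === expectedNewUrl ? "ok" : "301-to-wrong-target";
  }
  if (status === 302 || status === 307) return "temporary-redirect"; // should be 301 for a permanent move
  if (status === 200) return "no-redirect"; // old URL still serving content
  if (status === 404 || status === 410) return "gone"; // link equity lost
  return "check-manually";
}

console.log(classifyRedirect(301, "http://example.com/new", "http://example.com/new")); // "ok"
console.log(classifyRedirect(302, "http://example.com/new", "http://example.com/new")); // "temporary-redirect"
```

Anything that comes back as other than "ok" is a candidate root cause for a page missing from the index.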
Once your site is fully indexed, the next step is to work to improve rankings. Adjust any backlinks which you control so they link directly to the new site without the need for a 301 redirect. Check your new site's log for 404 errors where links are broken. Crawl the new site checking for issues. Continue the normal process of earning links.
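The 404 check can also be scripted. Here is a hedged sketch that assumes the common Apache/Nginx combined log format (the sample lines are invented); adjust the pattern to match your server's actual log layout:

```javascript
// Minimal sketch: tally 404 responses per requested path from access-log
// lines, so broken inbound links surface quickly.
function count404s(logLines) {
  const counts = {};
  for (const line of logLines) {
    // Matches e.g.: "GET /old-page.html HTTP/1.1" 404
    const m = line.match(/"(?:GET|POST|HEAD) (\S+) HTTP\/[\d.]+" (\d{3})/);
    if (m && m[2] === "404") counts[m[1]] = (counts[m[1]] || 0) + 1;
  }
  return counts;
}

const sample = [
  '1.2.3.4 - - [03/Dec/2011:10:23:14 +0000] "GET /old-page.html HTTP/1.1" 404 512',
  '1.2.3.4 - - [03/Dec/2011:10:23:15 +0000] "GET /index.php HTTP/1.1" 200 8192',
  '5.6.7.8 - - [03/Dec/2011:10:24:01 +0000] "GET /old-page.html HTTP/1.1" 404 512',
];
console.log(count404s(sample)); // { '/old-page.html': 2 }
```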
With respect to JavaScript, Google has demonstrated the ability to read some JS, but it is safest to present all your content in HTML and use JS only for functions such as controlling which content is displayed and when. If you do decide to use JS to display content, I would always recommend checking Google for that content: perform a site: search for the content that is in JS to see if it is indexed.
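One common pattern that follows this advice (IDs and copy invented for illustration): keep every tab's text in the markup, where a crawler can read it, and let the script only toggle visibility:

```html
<!-- All content lives in the HTML, so it survives with JS disabled -->
<div class="tab" id="tab-services"><h2>Our services</h2><p>...</p></div>
<div class="tab" id="tab-fleet"><h2>Our fleet</h2><p>...</p></div>
<script>
  // JS only controls which tab is shown; it never injects the content itself
  function showTab(id) {
    var tabs = document.getElementsByClassName("tab");
    for (var i = 0; i < tabs.length; i++) {
      tabs[i].style.display = tabs[i].id === id ? "block" : "none";
    }
  }
  showTab("tab-services");
</script>
```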