Javascript(0) extension causing an excess of 404s
-
For some reason I am getting a duplicate version of my URLs with /javascript(0) at the end, and these are creating an abundance of 404 errors. I know I am not supposed to block JS files, so what is the best way to block these?
Ex:
http://www.jasonfox.me/infographics/page/8/javascript(0) is a 404
http://www.jasonfox.me/infographics/page/8/ is not
Thank you.
-
That's correct.
In your HTML you have code like:
-
<a href="javascript(0);"> ... </a>
This is the search icon. Because "javascript(0);" has no URL scheme, some bots treat it as a relative path to a file named javascript(0) in the current folder. So if the bot is at:
http://www.jasonfox.me/infographics/page/9/
it appends that relative path, and the full URL becomes:
http://www.jasonfox.me/infographics/page/9/javascript(0)
and this is how the 404s are made. The correct fix is to replace "javascript(0);" with "javascript:void(0)" or with "#". Just this one patch in your WordPress theme (look around header.php) will stop the 404s.
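The relative-resolution behavior can be sketched with the WHATWG URL parser (Node.js's built-in `URL`), which follows the same resolution rules crawlers apply; the page URL below is taken from the question, and the href values are illustrative:

```javascript
// A crawler resolving the bad href against the page it was found on.
// "javascript(0)" has no scheme, so it parses as a relative path and
// gets appended to the current directory of the page URL.
const page = "http://www.jasonfox.me/infographics/page/9/";

const bad = new URL("javascript(0)", page).href;
console.log(bad); // "http://www.jasonfox.me/infographics/page/9/javascript(0)"

// With the fix, "javascript:" is an absolute scheme, so nothing is
// resolved against the page URL and there is no crawlable path.
const fixed = new URL("javascript:void(0)", page);
console.log(fixed.protocol); // "javascript:"
```

This is why the fix is a one-line change in the theme: once the href is a scheme-qualified URL (or "#"), there is no relative path left for a bot to append.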
-
Related Questions
-
What's the best SEO tactics when you have a dedicated web address pointing to a page on a different site?
Hope someone can help with a question I've got about sorting out some duplicate content issues. To simplify the question, imagine there is a website a.com which has a page a.com/newslettersignup. In addition to the a.com domain, there is also a different web address, ashortcut.com, which points to a.com/newslettersignup. ashortcut.com is the web address that is advertised in marketing material etc. So what is the best way then to tell Google etc. that ashortcut.com is the preferred URL for the page which sits at a.com/newslettersignup? The advice I've read about the canonical tag, for example, doesn't cover this exact scenario so although it can support cross-domain information, I'm not sure if that's the best route to follow. Thanks!
On-Page Optimization | | Nobody15755058948220 -
Does name of town in title tag help if queries don't include the town name?
Hi. If you're targeting local traffic online and the search volume for KWs in the area does not include the local names (according to KW Planner), does it still help to keep the town names in the title tag? Does Google deliver local results based on location names in the title tag if the query didn't mention them?
On-Page Optimization | | Morris770 -
ECommerce & Reviews when generated by 3rd Party uses Javascript
Hi all, I am trying to optimize our product pages and I know one of the important factors is showing customer reviews. While we have plenty of reviews to show, they are collected by a third party (Shopper Approved) and the way we have been told to display them on our pages is via JavaScript. My question is: is this sufficient for search engines to be able to crawl and interpret the JavaScript, or are we missing out on user-generated content since it is displayed via JavaScript? If so, are there best practices or recommendations to help us? Thank you! Dinesh
http://www.MyFairyTaleBooks.com <- this is the site in question if it helps.
On-Page Optimization | | MyFairyTaleBooks2 -
Using example.info when example.com is a link farm. Ok? Bad? Doesn't matter?
Second question of the day: I'm helping a friend with his law firm site. He is using example.info because example.com is being used by a link farm. Is this hurting his search efforts? Thanks
On-Page Optimization | | ahossom0 -
Does 'XXX' in Domain get filtered by Google
I have a friend that has xxx in their domain; they are a religion-based sex/porn addiction company, but they don't show up for the queries they are optimized against. They have a 12+ year old domain and all good health signs: quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls', not the users they are looking for. Has anyone experienced domain word filtering, and do you have a workaround or solution? I posted in the Google Webmaster help forums and that community seems a little 'high on their horses' and is trying too hard to be cool. I am not too religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic that I have never encountered. Here is the URL: xxxchurch.com Thanks, Brian
On-Page Optimization | | Add3.com0 -
Internal Linking Question(s)
Is it unwise to link internally to a page more than once on the homepage? I am reading that it is considered spammy. I am also reading that it passes PR twice to the internal page instead of just once... Which is it? Is there a way to stop passing PR to the "contact us" page? I watched an older video in which Matt Cutts suggested a nofollow. Now I read that this strategy is a no-no? Which is it? Thanks! 🙂
On-Page Optimization | | JML11790 -
Would adding a line break tag into the product name affect SEO ranking and Google's ability to read the entire title?
Our client would like to include a line break so that part of the product name always shows up on a second line. Would this affect how Google's bots crawl the product name? Would it also affect how Google would show the product name in a search result page? Thanks!
On-Page Optimization | | BrandLabs0 -
How would you deal with Blog TAGS & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category are the same as every other category listing's. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page, using the tag and/or category and post date in each Meta field to make it more "unique". But my question is .... in the REAL WORLD will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, BING, etc. are constantly tweaking their algorithms, as of now having duplicate content primarily means that this content won't get indexed and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions we could 'no-follow' these links (for tag and category pages) or just not use these within our blogs. I am confused about this. Any insight others have about this and recommendations on what action you would take is greatly appreciated.
On-Page Optimization | | RoyMcClean0