CSS and JavaScript files - website redesign project
-
UPDATED: We ran a crawl of the old website and have a list of CSS and JavaScript links that are part of the old website content. As the website was redesigned from scratch, I don't think these old CSS and JavaScript files are being used for anything on the new site. I've read elsewhere online that you should redirect "all" content files when launching/migrating to a new site. We are debating whether this is needed for CSS and JavaScript files. Examples
-
Hey there,
One thing is for sure: JS and CSS files have no value to your site's SEO. If your site is looking good and working well under the new design, there's no point in worrying about this.
Good luck!!
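To make the "redirect everything?" debate concrete: one way to act on a crawl export is to split it into content URLs worth redirecting and static assets that can simply return 404/410. A minimal sketch in Python (the URL list and extension set are illustrative assumptions, not from the actual crawl):

```python
# Sketch: split a crawl export of old-site URLs into "worth redirecting"
# (content pages) and "safe to let 404" (static assets). The URLs and
# extension set below are hypothetical examples.
from urllib.parse import urlparse

ASSET_EXTENSIONS = {".css", ".js", ".map", ".woff", ".woff2"}

def is_static_asset(url: str) -> bool:
    """Return True for CSS/JS-style assets that carry no SEO value."""
    path = urlparse(url).path.lower()
    return any(path.endswith(ext) for ext in ASSET_EXTENSIONS)

def split_crawl(urls):
    """Partition crawled URLs into (pages_to_redirect, assets_to_drop)."""
    pages, assets = [], []
    for url in urls:
        (assets if is_static_asset(url) else pages).append(url)
    return pages, assets

crawl = [
    "https://old.example.com/services/",
    "https://old.example.com/css/main.css",
    "https://old.example.com/js/menu.js",
    "https://old.example.com/about-us",
]
pages, assets = split_crawl(crawl)
print(pages)   # content URLs -> 301 to their new equivalents
print(assets)  # old CSS/JS -> fine to return 404/410
```

The point of the split: redirects preserve link equity, and CSS/JS files attract no links, so they have nothing to preserve.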
-
Hi Andy!
Thanks for reading. We ran a crawl of the old website, and those links are part of the old website content. As the website was redesigned from scratch, I don't think these old CSS and JavaScript files are being used for anything on the new site. I've read elsewhere online that you should redirect "all" content files when launching/migrating to a new site. We are debating whether this is needed for CSS and JavaScript files. Thanks!
-
Hi there,
Sorry, but I'm not sure what you mean by that. Why would you redirect them? What exactly are you looking to achieve?
Thanks!
Related Questions
-
Hreflang and canonical for multi-language website
Hi all, We're about to launch a new website in different languages and locations, which will replace the existing one. Let's say the domain name is example.com: the US version will be example.com/en-us/ and the UK version will be example.com/en-uk/. Some of the pages on both versions share the same content, so to handle that, we're about to use hreflang on each page plus a canonical tag which will always use the US address as the canonical address.

My question is: since we are using a canonical tag along with hreflang, is there a possibility that a user who is searching on Google.co.uk will get the canonical US address instead of the UK address? Or will the search engine know to display the right localized (UK) address since I've been using hreflang? It is really important for me to know, because I'm afraid we will lose the high rankings that we currently have on Google.co.uk. Thanks in advance!
Technical SEO | Operad-LTD
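For what it's worth, the pattern Google's hreflang documentation describes for localized duplicates is reciprocal hreflang annotations with a self-referencing canonical on each version; pointing the UK page's canonical at the US URL asks Google to drop the UK URL from the index, which works against the hreflang. A sketch using the URLs from the question (the page path is a placeholder, and note the UK hreflang value is `en-gb` even if the folder is /en-uk/):

```html
<!-- On https://example.com/en-us/some-page/ -->
<link rel="canonical" href="https://example.com/en-us/some-page/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-uk/some-page/" />

<!-- On https://example.com/en-uk/some-page/ -->
<link rel="canonical" href="https://example.com/en-uk/some-page/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-uk/some-page/" />
```

With self-referencing canonicals, hreflang can swap in the right localized URL for searchers on Google.co.uk.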
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags. This is due to the way pagination is set up on their site, for example: domain.com/books-in-english?page=1 // domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂
Technical SEO | J-Banz
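One common fix for the duplicate titles themselves is simply to make each paginated URL's title unique. A minimal sketch (the "- Page N" pattern is an illustrative convention, not official Google guidance):

```python
# Sketch: generate distinct title tags for paginated listings so
# ?page=N URLs stop sharing one title. Names are illustrative.
from urllib.parse import urlparse, parse_qs

def paginated_title(base_title: str, url: str) -> str:
    """Append ' - Page N' for every page after the first."""
    query = parse_qs(urlparse(url).query)
    page = int(query.get("page", ["1"])[0])
    return base_title if page <= 1 else f"{base_title} - Page {page}"

print(paginated_title("Books in English", "https://domain.com/books-in-english?page=1"))
# Books in English
print(paginated_title("Books in English", "https://domain.com/books-in-english?page=4"))
# Books in English - Page 4
```

This clears the duplicate-title warning without hiding any pages from crawlers.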
Is there any value in having a blank robots.txt file?
I've read an audit where the writer recommended creating and uploading a blank robots.txt file; there was no current file in place. Is there any merit in having a blank robots.txt file? What is the minimum you would include in a basic robots.txt file?
Technical SEO | NicDale
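For context: an empty robots.txt (or no file at all) is equivalent to allowing everything. If a file is created anyway, a minimal explicit version that allows all crawling and advertises the sitemap (the sitemap URL is a placeholder) looks like:

```txt
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line means "disallow nothing", i.e. crawl everything; the main practical benefit of having the file is avoiding 404s in the server logs and giving crawlers the sitemap location.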
Development Website Duplicate Content Issue
Hi, We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.

In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.

Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site, and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this:

Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.

This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain. When I visit "Remove URLs", I enter dev.rollerbannerscheap.co.uk, but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a subdomain, not a page. Can anyone help please?
Technical SEO | SO_UK
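A commonly suggested way out of this situation (a sketch, not verified against this exact server setup): the robots.txt block itself keeps Google from recrawling the dev URLs, so it never sees the 301s, which is why the listings linger with the "blocked by robots.txt" description. Removing the Disallow and then letting Google fetch either the 301s or a noindex header allows the pages to drop out. For the noindex variant, assuming Apache with mod_headers on the dev vhost:

```apache
# dev.rollerbannerscheap.co.uk vhost only (assumed Apache + mod_headers):
# allow crawling in robots.txt, but tell engines not to index anything here.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note you would use either the site-wide 301s or the noindex header, not both, since a redirected response's headers are never indexed; either way, crawling must be unblocked for Google to see it.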
Page authority old and new website
Dear all, I tried to find this question using the search option but cannot find the exact same problem. This is the thing: I launched a new website in January, replacing our old website that did pretty well in the SERPs. The old website is still running on a subdomain, old.website.com, and the new website is on www.website.com (www.denhollandsche.nl). Both sites are indexed by Google right now, but I'm not sure if that's a good thing.

For our main keyword, the page on the new website has a page authority of 23 and the exact same page (with some minor differences) on the old website still has an authority of 30. Both are currently on the second page of Google, while some time ago they were still in position 2/3/4. My question is: if I take down the old website and make a 301 redirect from the old page with PA 30 to the new page with PA 23, will the PA of the new page take over the PA of the old page? What effects can I expect?

The reason the old website is still running is that Google Images still shows images from old.domain.com instead of images from the new website... Thanks for your help guys!
Technical SEO | stepsstones
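On the mechanics of that redirect: a host-wide 301 is usually done in the old subdomain's server config so every old URL maps to its equivalent path on the new www site, consolidating link equity in one place. A sketch assuming Apache with mod_rewrite (the hostnames follow the question's example):

```apache
# In the old.website.com vhost / .htaccess (assumed Apache + mod_rewrite):
# 301 every old URL to the same path on the new www site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old\.website\.com$ [NC]
RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]
```

Path-to-path redirects like this (rather than redirecting everything to the new home page) give each old page the best chance of passing its authority to its direct replacement.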
Are multiple sites needed to rank one website?
My SEO guy for a mortgage website says that we should have 30 websites with about 250 pages on each site, plus 50 blogs, in order to even think of ranking for mortgage keywords. Is that correct?
Technical SEO | simermeet
How best to optimise a website for more than one location?
I have a client who is an acupuncturist and operates clinics in both Chester and Knutsford in Cheshire. The site performs well for Chester-based terms such as "Chester acupuncture"; this is the primary location the client wishes to focus efforts on, but they would also like to improve rankings for the Knutsford clinic and area. I have set up local Places pages for each clinic and registered each on different local directories. Both clinic addresses appear on each page of the website, and there is a map to each on the contact page.

Most of the on-page SEO elements such as page titles, descriptions and on-page keywords mainly focus on the term "Chester" over "Knutsford". Is it advisable to target both locations in these page elements, or will local search have an effect on this and reduce/dilute overall rankings for the Chester clinic? I haven't set up a separate page for each clinic location; this might help in terms of SEO for improving rankings for both locations, but from a user point of view it would just duplicate the same content for a different location and would also create duplicate content issues. Any advice/experience on this matter would be greatly appreciated.
Technical SEO | Bristolweb
Sitemap for dynamic website with over 10,000 pages
If I have a website with thousands of products, is it a good idea to create a sitemap for this website for the search engines where you show maybe 250 products on a page, so it makes it easy for the search engine to find the part and also puts that part closer to the home page? It seems like Google likes pages that are the closest to the home page (the fewer clicks, the better).
Technical SEO | roundbrix
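On the mechanics: for thousands of URLs, an XML sitemap index pointing at chunked sitemap files is the usual structure (the sitemap protocol allows up to 50,000 URLs per file, but smaller chunks are fine). A sketch that chunks product URLs into files of 250 each (the domain and URL count are placeholders):

```python
# Sketch: split a large product URL list into sitemap files of up to
# 250 URLs each, plus the entries for a sitemap index that references them.
CHUNK = 250

def build_sitemaps(urls, base="https://www.example.com"):
    """Return ({filename: [urls in that file]}, [index entry URLs])."""
    files = {}
    for i in range(0, len(urls), CHUNK):
        files[f"sitemap-{i // CHUNK + 1}.xml"] = urls[i:i + CHUNK]
    index = [f"{base}/{name}" for name in files]
    return files, index

urls = [f"https://www.example.com/product/{n}" for n in range(1, 1001)]
files, index = build_sitemaps(urls)
print(len(files))   # 4
print(index[0])     # https://www.example.com/sitemap-1.xml
```

One caveat: an XML sitemap helps discovery but does not change click depth; the "closer to the home page" benefit comes from on-site HTML links (e.g. category and HTML sitemap pages), not from the XML file itself.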