Script tags and SEO
-
Hi,
I have a page on my site with a Google Maps embed, and a path drawn on the map. The path is made from a long string of coordinates. For ease, I have placed the coordinates in a script tag at the foot of the page, amongst my JavaScript.
My question is: will this script tag hurt the SEO of the page? I've read that inline JS and 'data islands' can be bad, so I've been careful to keep the coordinates out of the main body of the page. Thanks, any help appreciated!
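(For reference, here is a minimal sketch of the kind of setup being described. The element id, variable names and coordinates are made up, YOUR_KEY is a placeholder, and it assumes the standard Google Maps JavaScript API with initMap as the loader callback.)

    <div id="map"></div>

    <!-- at the foot of the page, amongst the rest of the JavaScript -->
    <script>
      // Hypothetical route data: the long string of coordinates, kept out of the main body
      var routeCoords = [
        { lat: 51.5074, lng: -0.1278 },
        { lat: 51.5081, lng: -0.1290 }
        // ...many more points...
      ];

      // Called by the Maps API script once it has loaded
      function initMap() {
        var map = new google.maps.Map(document.getElementById('map'), {
          zoom: 14,
          center: routeCoords[0]
        });
        // Draw the route as a polyline over the map
        new google.maps.Polyline({ path: routeCoords, map: map });
      }
    </script>
    <script src="https://maps.googleapis.com/maps/api/js?key=YOUR_KEY&callback=initMap" async defer></script>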
-
Inline scripts aren't bad per se; search engines just can't always understand them. Worst-case scenario: you have extra code that Google has to crawl but doesn't understand, which uses up some crawl bandwidth and doesn't add value. But it won't lower your rankings.
So do whatever you need to do to deliver the best user experience you can with this map and its route, and assume that Google will ignore the script (Google is trying to understand it, though, so it may be handled properly in the long run). Then, for search engines, include some text content describing the map and the route so that search engines can send the right searchers to your page.
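(As a rough illustration of that last point, with purely hypothetical wording, the crawlable description can simply sit in the normal page body next to the map container:)

    <h2>Riverside walking route</h2>
    <p>A 5 km waymarked path along the river, from the old bridge to the harbour,
       passing the market square and the lighthouse.</p>
    <div id="map"></div>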
Good luck!
Kristina
-
Okay, great. That's very helpful.
What if I wanted to have multiple scripts, say for points of interest along the route, and had multiple (20+) script tags at the bottom of the page? Would this be an ugly way of doing it, or is it considered totally okay in the eyes of Google?
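(For context, this is roughly the markup in question: twenty-odd page-specific blocks like these at the foot of the page, one per point of interest. The names and data here are hypothetical.)

    <script>
      var poi1 = { name: 'Old Bridge', lat: 51.5074, lng: -0.1278 };
    </script>
    <script>
      var poi2 = { name: 'Market Square', lat: 51.5081, lng: -0.1290 };
    </script>
    <!-- ...and so on for each point of interest... -->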
-
Yes, that's still an inline script (putting it at the top or the bottom of the page makes no difference), but as I said, if only one page uses that script you're good to go. There's nothing wrong with inline scripts as long as they aren't going to be reused on other pages as well.
-
Thanks, Federico.
As my script is being called at the bottom of the page, I had assumed it didn't count as 'inline'?
Yes, the scripts are only being used once, on specific pages.
-
Inline scripts are bad if you are repeating them on every page; if that's the case, just move them into an external script file so users don't need to download the same code EVERY time they view a page.
But if the inline script is used only on a specific page and not reused, there's no reason to load it as an external file. In my opinion, that would just add an extra server call to fetch code that only works on that page.
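(To illustrate the trade-off, with a hypothetical file name: an inline, page-specific script ships with the page itself, while an external file costs one extra request but can be cached by the browser and shared across pages.)

    <!-- Inline: no extra request; sensible when the code is only used on this page -->
    <script>
      var routeCoords = [ /* this page's coordinates */ ];
    </script>

    <!-- External: one extra request, but cached and reusable on other pages -->
    <script src="/js/route-data.js"></script>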
Hope that helps!
Related Questions
- Have Your Thoughts Changed Regarding Canonical Tag Best Practice for Pagination? - Google Ignoring rel=next/prev Tagging
- Bing Indexation and handling of X-ROBOTS tag or AngularJS
- Too Many Outbound Links on the Home Page - Bad for SEO?
- Does having a Blog link in the top level navigation provide any better SEO value, or would having it in a footer or top navigation work just as good?
- Other tags inside an H1 tag
- SEO Issues From Image Hotlinking?
- SEO and CSS media queries
- Does Google follow links inside a <noscript> tag?