Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Is having an image lightbox with content on a web page SEO friendly?
-
This website is built in a CMS. Will having a lightbox pop up with content be SEO friendly?
If you go to the web page and click on the images at the bottom of the page, lightboxes will display information. Will this lightbox content be crawled by Google? Will it be considered content for the URL http://jennlee.com/portfolio/bran..
Thanks,
John
-
Hi Dale,
Really stupid question: how do I look at the CSS to identify that? I've viewed the source but can't see that information anywhere on the page.
If you wouldn't mind, could you point me in the right direction to some information about this issue? I'd be interested in understanding it better, but until you brought it to my attention, I had no idea to even look for it.
J
-
Ryan and James,
Take a closer look at the div class of the lightbox (class="contact"). In the CSS for the page in question we find the following:
div.contact {
    display: none;
    visibility: hidden;
}
In my opinion, you're asking the wrong question. This isn't about lightboxes or DA at all; it's about the display: none; and visibility: hidden; declarations.
There is no shortage of information about that here on SEOmoz or in the Google Webmaster Forums.
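To make the point above concrete, here is a rough Python sketch (my own illustration with hypothetical names, not how any crawler actually works) of checking whether a text snippet only appears inside a container that the page's CSS hides, such as the `div.contact` blocks on this page:

```python
import re

def snippet_only_in_hidden_div(html, snippet, hidden_class="contact"):
    """Naive check: does the snippet appear ONLY inside divs of a class
    that the stylesheet hides (e.g. class="contact" with display:none)?
    Assumes simple, non-nested markup."""
    pattern = r'<div class="%s">(.*?)</div>' % re.escape(hidden_class)
    hidden_blocks = re.findall(pattern, html, flags=re.DOTALL)
    in_hidden = any(snippet in block for block in hidden_blocks)
    # Strip the hidden blocks and see whether the snippet survives elsewhere.
    visible_html = re.sub(pattern, "", html, flags=re.DOTALL)
    return in_hidden and snippet not in visible_html

page = ('<p>Visible copy.</p>'
        '<div class="contact">Call us today! 401-885-3200</div>')
print(snippet_only_in_hidden_div(page, "Call us today!"))  # True
print(snippet_only_in_hidden_div(page, "Visible copy."))   # False
```

A real lightbox plugin's markup will vary, so treat this as a starting point; viewing the stylesheet directly, as suggested above, is the authoritative check.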
-
Interesting supposition. I've got absolutely no idea whether a stronger page changes which specific parts of a page are parsed.
Shouldn't be too difficult to work out though:
If we work on the logic that an exact-match search result indicates the text is being read and used by Google, you can then compare JavaScript parsing across strong and weak pages.
Another way would be to look at the cached text-only version across pages and see if there is any difference, although I think I prefer the first suggestion.
Seems simple, although it probably isn't.
j
-
I agree with your assessment James.
Before I accept this information, I would like to ask if you are aware of any other similar examples of lightbox use on a page with better stats? The DA of this page is only 31, and the PA is 1. I would like to rule out the idea that Google may crawl deeper if the page were deemed more important.
-
James is correct. Your lightbox content is not visible to Googlebot.
You can see from an exact match search of some text from the page that Google has indexed the visible text: http://bit.ly/nDQLlM
The only place that the exact text from the lightbox appears in the Google index is on this thread: http://bit.ly/mRQICc
-
Sorry for butting in on an old(ish) post, but I have a different opinion on this...
Correct, the text used in the example does show up in the source code as HTML, but I don't think that indicates that Google is reading that text.
For me there are two ways to check whether Google is reading text:
1. Do an exact-match (quotation-marked) search in Google.
2. Look at the text-only version of Google's cached copy of the page.
By both checks, the lightbox data is not showing up, and for me that indicates the text is not being read.
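The two checks above can be sketched as a couple of hypothetical helper functions (Python, names my own; the text-only cache URL format is the one Google exposed at the time and isn't guaranteed to stay stable):

```python
from urllib.parse import quote_plus

def exact_match_query_url(snippet):
    # Quotation-marked search: if Google returns the page for this
    # query, the text has been indexed.
    return "https://www.google.com/search?q=%s" % quote_plus('"%s"' % snippet)

def text_only_cache_url(page_url):
    # strip=1 requests the text-only cached version, which shows
    # what text Google actually stored for the page.
    return ("https://webcache.googleusercontent.com/search?q=cache:%s"
            "&strip=1" % page_url)

print(exact_match_query_url("Call us today!"))
print(text_only_cache_url("jennlee.com/portfolio"))
```

If the lightbox snippet returns no results via the first URL and is absent from the second, that supports the conclusion that the text is not being read.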
Also, an interesting point to note: according to Search Engine Land, 'Fetch as Googlebot' should not be used as a method of identifying what text is being parsed: http://searchengineland.com/see-what-googlebot-sees-on-your-site-27623
Feel free to prove me wrong!
thanks
james
-
I have read that article before. Keep in mind it is from 2008. Technology and Google have advanced substantially in the past 3 years.
100% of the text in all your lightboxes is fully viewable by Google at present. William and I both looked, and we see the text in your HTML source code. That means Google can see it as well.
-
Those are not issues on your site.
Your lightbox images are fully crawlable. Google sees all of the images and the text descriptions. You definitely want to add alt descriptions; otherwise you are in great shape.
-
Thanks for all the responses, guys.
My thought was that most of the time it depends on the script, because some scripts hide data from viewers while showing the same data to search engines, which creates a cloaking issue on the website. That could prove very dangerous for the site.
It also seems that Google does not crawl images as often as normal web pages, because hiding the content can make a site look inauthentic.
-
Sure thing brother!
-
Thank you William. Somehow I missed it during my review of the source code.
-
Hi Ryan,
Yes, I just did a search for the text I found in the lightbox description for the Coco & Max logo. It's right there. I've attached a couple of images to show what I found.
Is this text within the JavaScript? I'd be interested to learn about the differences between the various scripts, as I see myself building sites and would like to use the most SEO-beneficial one.
-
Hi William.
Thanks for the feedback. I did look at the HTML, and the real text is NOT visible. I am fairly sure that Google can read it even in the JavaScript, but I am not certain, so I did not wish to offer that conclusively. If I knew which version was in use, such as Highslide, I could check and offer confirmation.
The first image shared is the Coco and Max logo. If you click on that image the Lightbox will appear with a description that says "The Jenn Lee Group developed photography, business cards, expo-banner plus an ecommerce website for Coco and Max using a logo they had already developed. The Jenn Lee Group can pick up the ball at whatever stage you are currently in towards your marketing and advertising initiatives. Call us today! 401-885-3200"
I do not see that text snippet anywhere in the page's source code. Also, there are a total of 7 pictures offered in a group with that first image, each with its own text.
If you have any additional information, I would love to learn as well.
-
Lightbox should have zero negative impact on SEO, provided you have effectively labeled your photos. I love the look of it, and although it has a similar effect to Flash, the two have nothing to do with each other when it comes to negative SEO.
-
Hey Ryan,
The original poster is actually talking about the text descriptions of each logo that is listed.
The easy way to figure this out is to look in the HTML. If it's real text, then Google can crawl it. In your case it is.
So the content you have will be indexed. And you can do as Ryan suggested and add an alt attribute to each image; that will help as well.
-
The biggest gap I see on your site is that your images are all missing alt attributes. Search engines don't see images the way people do. By providing an alt attribute, you can offer a description of each image. For example, your first image's alt attribute might be "logo Coco & Max Doggie Distinctions".
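As a quick illustration (a hypothetical audit script of my own, not anything Moz or Google provides), Python's standard html.parser can flag images that are missing alt text:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects src values of <img> tags with no alt attribute.
    An empty alt="" is also treated as missing here."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

finder = MissingAltFinder()
finder.feed('<img src="coco-max.jpg">'
            '<img src="logo.png" alt="logo Coco &amp; Max Doggie Distinctions">')
print(finder.missing)  # ['coco-max.jpg']
```

Running something like this over each portfolio page would quickly surface every image that still needs a description.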
There are many JavaScript packages which implement Lightbox, so if you want a more definite answer you would need to take a look at your specific package. Highslide and Suckerfish are two examples of Lightbox JavaScript packages. For additional research you can check out this article.
Another note: I would recommend changing your meta description to readable text, not a list of keywords. Your meta description is what people will see as your listing in search engines; it will not affect your search result ranking.