Flickr Gallery Effect on Page Ranking
-
Hello there
We are working on a redesign of our site, and our business is very image-intensive (we're a sign company).
On a typical product page, we place 5 images directly in the site, optimized to try to rank them in image search.
We also have about 30-50 sets of images, with 3-5 images each, hosted on Flickr, that we display as galleries on the page (the user clicks, a lightbox opens to view the set, etc.).
Here is the page: http://impactsigns.ugmade.com/sample-page/
If you look at the page code, you will see that the Flickr gallery ("additional examples") section adds a LOT of code to the page (lines 498 to 837).
My question is: does adding that Flickr gallery block negatively impact the page's SEO, all else being equal?
It seems like a lot of lines of code, and I don't want it to seem spammy to the search engines.
Thanks for your help and advice!
-
Oh, it's 2015 and you're still using "timthumb.php"?
https://blog.sucuri.net/2014/06/timthumb-webshot-code-execution-exploit-0-day.html
https://www.binarymoon.co.uk/2014/07/dont-use-timthumb-instead/
https://www.binarymoon.co.uk/2014/09/timthumb-end-life/
Also, I think the lack of alt text on your images can be terrible for your page. If you're still looking for a modern gallery, you can use the PhotoSwipe script; it is SEO-friendly and semantic.
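To find which gallery images are missing alt text before worrying about anything else, a quick audit script helps. Here's a minimal sketch using Python's standard-library `html.parser` — the sample markup below is made up for illustration; point it at the real page source instead:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects the src of every <img> tag lacking a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt absent, or present but empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Hypothetical gallery markup -- substitute the actual page HTML.
sample = """
<div class="gallery">
  <img src="/signs/lobby-sign.jpg" alt="Brushed metal lobby sign">
  <img src="/signs/monument-sign.jpg">
  <img src="/signs/channel-letters.jpg" alt="">
</div>
"""

auditor = AltTextAuditor()
auditor.feed(sample)
print(auditor.missing_alt)  # ['/signs/monument-sign.jpg', '/signs/channel-letters.jpg']
```

Run it over the rendered page and fix every image it flags; descriptive alt text is what gives those 30-50 Flickr sets a chance to rank in image search at all.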
Related Questions
-
Heavy rank drop post-migration
Our website was migrated from Joomla to WordPress at the end of 2015, and we lost 20% of our traffic. A year later, at the end of 2016, we relaunched the website on the same WordPress install with a new theme. Again we lost rankings and traffic. I would say rankings, because mostly people land on our website by searching for our brand. Now we have gone almost invisible for the keywords we've been targeting. We have checked all the possibilities, like duplicate content, redirection, alt tags, speed, canonicals, backlinks, etc., and couldn't find what is hitting us. What could be such a strong factor hitting us?
Web Design | vtmoz
URL Structure's Effect on SEO
Hello all, I have a client who currently has a very poor URL structure. As it stands, their URLs are formatted in the following manner: http://www.domain.com/category/subcategory/page. In all my years of SEO, however, I have always tried to implement the following format: http://www.domain.com/category/page. The web designer for this particular project has been very reluctant to change the structure for obvious reasons, but I'm convinced that by modifying the URL structure, SEO will improve. Am I correct in thinking this? Likewise, if I am able to get the URL structure changed, what do I need to look out for to make sure we don't lose any traction for our keyword terms? Any and all insight/suggestions are greatly appreciated. Thanks for reading!
Web Design | maxcarnage
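The main thing to look out for when flattening a structure like that is to 301-redirect every old URL to its new counterpart so existing link equity follows the move. A minimal sketch of building that mapping, assuming the /category/subcategory/page pattern from the question (domain and paths are placeholders):

```python
from urllib.parse import urlsplit

def flatten_url(url: str) -> str:
    """Map /category/subcategory/page to /category/page (hypothetical scheme)."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    if len(segments) == 3:
        segments = [segments[0], segments[2]]  # drop the middle (subcategory) segment
    return f"{parts.scheme}://{parts.netloc}/" + "/".join(segments)

old = "http://www.domain.com/category/subcategory/page"
print(flatten_url(old))  # http://www.domain.com/category/page
```

Feed the full old-URL list through a function like this to generate the server's 301 redirect rules, then crawl the old URLs afterwards to confirm none return 404.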
Joomla Core Pages Delisted
Hey Everyone, This query may be more for a Joomla developer or someone that has had a similar issue. I'm not really looking for answers like, "check Google Search Console" or anything like that. We have a client who recently had all of their core pages delisted in Google but the blog is still being displayed in search results. For example, if you search "company name" they have a blog post that ranks #13 or so organically. I tested Google Search Console and Google is saying that the site is temporarily unavailable. We haven't made any changes or updates to Joomla's core structure so I'm unsure as to where this change is coming from. Here are some items we've checked: 1. Site searches within Google, resulted in seeing core pages are not indexed but blog pages are 2. Google Search Console - looked for manual actions (none found), looked at sitemap errors (nothing mentioned), looked at robots.txt (no issues here), attempted to fetch the site as Google (temporarily unavailable). 3. Called the hosting company (Rackspace) to discuss potential issues. They were extremely helpful but we were unable to find anything. The blog is actually a Module that was added so I'm thinking something has changed to block Google bots from the core Joomla structure but it hasn't blocked them from the blog structure. Without putting the company name or url on blast, has anyone heard of or experienced anything like this? Any help or insights would be much appreciated!
Web Design | Leadhub
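One offline check worth adding to the list above: feed the site's robots.txt into Python's standard-library `urllib.robotparser` and ask whether Googlebot may fetch the core pages versus the blog paths. A sketch with hypothetical robots.txt content — substitute the site's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with the site's actual file.
robots_txt = """\
User-agent: *
Disallow: /administrator/
Disallow: /component/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/", "/blog/some-post", "/component/search"]:
    print(path, rp.can_fetch("Googlebot", "http://example.com" + path))
```

If the core pages come back `False` while the blog paths come back `True`, that would explain exactly the split-indexing pattern described, without any Joomla change at all.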
Facebook is now only allowing owners of FB pages (not admins) to create keys for a WP blog post syndication. Is there a way around this?
I hired a contractor to configure a WP plugin to syndicate FB, G+, Twitter and standard WP posts. He is using NextScripts: Social Networks Auto-Poster. He came back to me saying that FB is now only allowing direct owners (not admins) of FB pages to create keys. This means I have to give my client's personal FB access to a third party contractor. I'm not comfortable asking my client to do this. Does anybody know of a way around this? Is there a way to create a FB key with just admin access? Thanks
Web Design | RosemaryB
Question #1: Does Google index https:// pages? I thought they didn't because....
generally, the difference between https:// and http:// is that the "s" (stands for secure, I think) is usually reserved for payment pages and other similar types of pages that search engines aren't supposed to index (like any page where private data is stored). My site that all of my questions are revolving around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one. The site was hardcoded to have all MENU internal links (which was 90% of our internal links) lead to https://www.example.com/example-page/ instead of http://www.example.com/example-page/. To double-check that this was causing a loss in link juice, I jumped over to OSE. Sure enough, the internal links were not being indexed; only the links that were manually created and set to NOT include the https:// were being indexed. So if OSE wasn't counting the links, and based on the general ideology behind secure HTTP access, that would infer that no link juice is being passed... Right?? Thanks for your time. Screens are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being indexed. The problem is... is this a Volusion problem? Should I switch to WordPress? Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress): http://www.uncommonthread.com/
Web Design | TylerAbernethy
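Whatever the answer on indexing, the practical fix is to pick one canonical scheme and rewrite every internal link to it (the question wants http; today https would be the usual choice). A small normalizer can do the rewrite — a sketch using only the standard library, with a placeholder URL:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize_scheme(url: str, scheme: str = "http") -> str:
    """Force a URL onto one canonical scheme, leaving everything else intact."""
    parts = urlsplit(url)
    # SplitResult is a 5-tuple: (scheme, netloc, path, query, fragment)
    return urlunsplit((scheme,) + tuple(parts[1:]))

print(canonicalize_scheme("https://www.example.com/example-page/"))
# http://www.example.com/example-page/
```

Running the hardcoded menu links through a pass like this (or fixing the template that generates them) keeps all internal links on a single scheme, so link equity isn't split across two URL variants.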
Is it common to have errors/warnings (currency duplicates, redirects, etc.) on most websites that rank well?
Hi, could anybody give me some ideas on on-page optimisation? Currently in my campaign I have around 3,000+ errors, 14,000+ warnings, and 7,000+ notices for the following reasons:
Overly-Dynamic URL
Temporary Redirect
Title Element Too Long (> 70 Characters)
Duplicate Page Title
etc.
First of all, I know these have a negative effect on SEO. Now, fixing those issues involves a lot of work and time. At the same time, most of our important keyword/URL rankings have not changed over the last 12 months. Does that mean the above has only a limited negative effect? I just want to know whether it is worth investing the man-hours/money to clean up those issues, as it involves decent development time. Is it common to have some errors/warnings on most websites that rank well? (e.g. I've seen many big websites with duplicate titles/meta descriptions on their currency-variant pages)
Web Design | LauraHT
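Two of the warning types above (titles over 70 characters, duplicate titles) are easy to triage in bulk before committing development time. A rough sketch, with made-up crawl data standing in for a real export:

```python
from collections import Counter

# Hypothetical crawl export: (url, title) pairs -- replace with real data.
pages = [
    ("/red-widgets", "Red Widgets | Example Store"),
    ("/red-widgets?currency=eur", "Red Widgets | Example Store"),
    ("/blue-widgets", "Blue Widgets and Other Very Fine Products You Can Buy From Our Example Store"),
]

MAX_TITLE = 70  # the threshold the warning uses

# URLs whose title exceeds the length threshold
too_long = [url for url, title in pages if len(title) > MAX_TITLE]

# Titles shared by more than one URL (e.g. currency-variant pages)
dupes = [t for t, n in Counter(title for _, title in pages).items() if n > 1]

print(too_long)
print(dupes)
```

Counting how many warnings cluster on a handful of templates (like currency variants sharing one title) usually shows whether the fix is a few template edits or a genuinely large project.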
Sudden dramatic drops in SERPs along with no snippet and no cached page?
We are a very stable, time-tested domain (over 15 years old) with thousands of stable, time-tested inbound links. We are a large catalog/e-commerce business, and our web team has over a decade's experience with coding, SEO, etc. We do not engage in link exchanges, buying links, etc., and adhere strictly to best white-hat SEO practices. Our SERPs have generally been very stable for years and years. We continually update content, leverage user-generated content, etc., and stay abreast of important algorithm and policy changes on Google's end. On Wednesday, Jan 18th, we noticed dramatic, disturbing changes to our SERPs. Our formerly very stable positions for thousands of core keywords dropped. In addition, there is no snippet in the SERPs and no cached page for these results. Webmaster Tools shows our sitemap most recently successfully downloaded by Google on Jan 14th. Over the weekend and Monday the 16th, our cloud-hosted site experienced some downtime here and there. I suspect that the sudden issues we are seeing are being caused by one of three possibilities: 1. Google came to crawl when the site was unavailable. However, there are no messages in the account or crawl issues otherwise noted to indicate this. 2. There is a malicious link spam or other attack on our site. 3. The last week of December 2011, we went live with Schema.org rich tagging on product-level pages. The testing tool validates all but the breadcrumb, which it says is not supported by Schema. Could Google be hating our Schema.org microtagging and penalizing us? I sort of doubt it, because category/subcategory pages that have no such tags are among those suffering. What's odd is that ever since we went live with Schema.org, Google has started preferring very thin content pages like video pages and articles over our product pages. This never happened in the past. The site is: www.jamestowndistributors.com. Any help or ideas are greatly, greatly appreciated. Thank you, DMG
Web Design | jamestown
Advice on migrating from .com to .co.uk without dropping in rank?
I have a retail business in the UK whose website has a .com address; it has taken 3 years to reach a PageRank of 3. We are building an updated site which will have a completely new URL structure and is optimized for SEO. We are considering launching the new site at a .co.uk address, as we understand this will have advantages in local search and ranking, since we are primarily targeting UK traffic. Does anyone have comments on .com vs .co.uk, and/or any advice on how to handle the migration while minimizing any drop in traffic and ranking?
Web Design | brian.james