Alt tag best practices for a multi-gallery site with hundreds of images
-
I have neglected to add alt tags to one of my sites and am ready to tackle the project. I want to make sure I don't do something that will have a negative impact on rankings, and I haven't been able to find info that fits my situation.
The pics are all about a product I make and sell through the site. I have a free gallery section with about 10 galleries of roughly 20 pics each. Each gallery page shows a different model and/or context of how the product might be used. These are not sales pages directly, just thumbnail galleries linked to larger images for the viewer's enjoyment.
I have 10 or so keyword phrases that would be good to use, with the intent of getting listed in Google Images and picking up other ranking benefits.
Can I choose one keyword phrase as the alt tag for a whole gallery, give each individual large pic in that gallery the same alt tag, and then use a different phrase for the next gallery's pics, and so on?
Or is that considered stuffing, meaning I would have to come up with a different keyword phrase for each pic?
I hope that makes sense.
Thanks
Galen
-
Hi, thanks for the info.
Is it correct to assume, acknowledging all the cautions given, that it would be most advantageous to optimize just one pic per gallery rather than trying to optimize every pic in every gallery, and to spread my different keyword phrases across those single pictures, one per gallery?
Thanks
Galen
-
My suggestion is to go back to the basic purpose of image alt tags and let that guide you; you will find the right path yourself. In my opinion, alt tags exist to describe the image and give search engines a hint about what it shows. If a keyword genuinely fits an image, use it; if not, leave it out. Using the same alt tag on every image will push you into Google's red zone, and they may treat you as a spammer for keyword stuffing within your content.
Try not to get into that red zone; play it safe!
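To illustrate (a rough sketch only; the product name and file names below are made up, not taken from your site), this is the kind of repetition that tends to look like stuffing:

<!-- Risky: the exact same keyword phrase repeated on every image in a gallery -->
<img src="gallery1/photo-01.jpg" alt="handmade leather dog collar">
<img src="gallery1/photo-02.jpg" alt="handmade leather dog collar">
<img src="gallery1/photo-03.jpg" alt="handmade leather dog collar">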
Hope this helps!
-
Alt tags are intended to describe the image. You can include keywords in them, but if you use the same keywords everywhere without actually describing the images, it will be seen as stuffing, and I strongly suggest you don't do that. Instead, focus on writing a real description for each image and work a keyword in where it fits naturally, without forcing it. If a picture genuinely supports a keyword in its alt tag, great; if not, just leave it out.
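For example (the product, model, and file names below are hypothetical placeholders, just to show the idea), descriptive alt text that works a keyword phrase in only where it fits might look like:

<!-- Each alt describes what is actually in that picture; the keyword phrase
     "handmade leather dog collar" appears only on the image it genuinely describes -->
<img src="gallery2/photo-01.jpg" alt="Model fitting a handmade leather dog collar on a golden retriever">
<img src="gallery2/photo-02.jpg" alt="Close-up of the collar's brass buckle and stitching">
<img src="gallery2/photo-03.jpg" alt="Golden retriever wearing the collar at the beach">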
Hope that helps!