Image URLs changed 3 times after using a CDN - How to Handle for SEO?
-
Hi Mozzers,
Hoping for your advice on how to handle the SEO effects of an image URL change - the URL changed 3 times during the course of setting up a CDN over a month period, as follows:

- (URL 1) - Original image URL before CDN: www.mydomain.com/images/abc.jpg
- (URL 2) - First CDN URL (without a CNAME alias - using WPEngine and their own CDN): username.net-dns.com/images/abc.jpg
- (URL 3) - Second CDN URL (with a CNAME alias - applied 3 weeks later): cdn.mydomain.com/images/abc.jpg
When we changed to URL 2, our image rankings in the Moz Pro Rankings tool (the report with the little photo icons) dropped from 80% to 5%.
So my questions for recovery are:
- Do I need to add a 301 redirect/canonical tag from the old image URLs 1 & 2 to URL 3, or something else?
- Do I need to change my image sitemap to use cdn.mydomain.com/images/abc.jpg instead of www.?
Thanks in advance for your advice.
-
Sorry I missed this follow-up earlier. Within the sitemap you'll want to change the http://www URLs to http://cdn for these image files. The www version of your site and the CDN server are on two different IPs/servers, and you want images to be served from the CDN one.
For 2, if you do use 301 redirection, I'd recommend scripting it so that the script inspects whether or not the request is for an image file and then applies the CDN change. A pro in your area who works with regex and .htaccess will be able to guide you through that.
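Something like this, roughly (an untested sketch only - it assumes an Apache server with mod_rewrite enabled and uses the example www/cdn hostnames from your question; adjust the hostnames and extensions to your setup, and make sure your CDN's origin pulls don't get caught in a redirect loop):

```apache
# .htaccess sketch: 301-redirect image requests from the www host to the CDN host
RewriteEngine On
# Only act on requests arriving via the www hostname
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
# Match common image extensions and send them to the CDN hostname
RewriteRule ^(.+\.(?:jpe?g|png|gif))$ http://cdn.mydomain.com/$1 [R=301,L,NC]
```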
The username.net-dns.com thing... that's not your server, is it? You can't apply redirects on servers outside of your control. Cheers!
-
Hi Ryan,
Thanks for your answer - sorry, I didn't mean the URL for the location of the sitemap. I think my question wasn't clear, so may I rephrase it:

(1) Inside my image sitemap, the URLs serve off the www. subdomain, as bolded in the example below (not cdn.). I'm assuming this setup is correct as it was auto-generated by an image sitemap generator - does the image:loc below look correct to you?
<url>
<loc>http://www.bosphorusyacht.com/yachts/</loc>
<image:image>
<image:loc>http://**WWW.**bosphorusyacht.com/wp-content/uploads/2010/09/istanbul-boat-rental.jpg</image:loc>
</image:image>
</url>

(2) For a 301 image redirect, would I set it up like this:

Redirect 301 /wp-content/uploads/2010/09/istanbul-boat-rental.jpg http://**WWW.**bosphorusyacht.com/wp-content/uploads/2010/09/istanbul-boat-rental.jpg

OR

Redirect 301 /wp-content/uploads/2010/09/istanbul-boat-rental.jpg http://**CDN.**bosphorusyacht.com/wp-content/uploads/2010/09/istanbul-boat-rental.jpg

And how would I 301 this one: username.net-dns.com/images/abc.jpg?
Hope you can advise one last time - thank you!
-
Right. Not everything is going to be served from the CDN. It's most likely set up for your images, so your sitemap will still reside on www. Make sure to point to the front-end files though, as those are the publicly accessible ones.
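For illustration, an entry in your image sitemap would then look roughly like this (a sketch based on your example URLs - I'm assuming cdn.bosphorusyacht.com is the CNAME alias you set up; the page <loc> stays on www while the <image:loc> points at the CDN):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page itself still lives on www -->
    <loc>http://www.bosphorusyacht.com/yachts/</loc>
    <image:image>
      <!-- The image points at the front-end (CDN) URL that visitors actually load -->
      <image:loc>http://cdn.bosphorusyacht.com/wp-content/uploads/2010/09/istanbul-boat-rental.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```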
-
Hi Ryan,
Thanks for your reply and advice. I've read the guidelines and will follow those. But I wonder if you can clarify an implementation issue that isn't answered there: on my site, the images in the 'Backend' (edit/admin/code view) start with WWW.mydomain... and in the 'Frontend' (the actual published view in the browser) they start with CDN.mydomain...
So my question is: do I use the Backend or Frontend URL (www. or cdn.) in both the image sitemap and as the 301 redirect's final destination?
My current sitemap, for example, seems to be using www rather than cdn: http://www.bosphorusyacht.com/sitemap-image.xml
Thanks for your help!
-
You're on it. Redirecting to the new image source and submitting a new sitemap pointing to the URL 3 location for your images will be big steps in the right direction. Be sure to follow the instructions here for your sitemap: https://support.google.com/webmasters/answer/178636, as well as reviewing the image publishing guidelines: https://support.google.com/webmasters/answer/114016. Cheers!