Best posts made by RyanPurkey
-
RE: Image URLs changed 3 times after using a CDN - How to Handle for SEO?
You're on it. Redirecting to the new image source and submitting a new sitemap pointing to the third URL location for your images will be big steps in the right direction. Be sure to follow the instructions here for your sitemap: https://support.google.com/webmasters/answer/178636 as well as reviewing the image publishing guidelines: https://support.google.com/webmasters/answer/114016. Cheers!
-
RE: Image URLs changed 3 times after using a CDN - How to Handle for SEO?
Right. Not everything is going to be served from the CDN. It's most likely set up for your images, so your sitemap will still reside on www. Make sure to point to the front-end files, though, as those are the publicly accessible ones.
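For what it's worth, here's a rough sketch of what that image sitemap can look like -- a minimal Python example that writes a single <url> entry with image tags, using made-up placeholder URLs (a www page, CDN-hosted images) rather than anyone's real ones:

```python
# Minimal sketch of an image sitemap entry for CDN-hosted images.
# All URLs below are hypothetical placeholders.
from xml.sax.saxutils import escape

PAGE_URL = "https://www.example.com/widgets/"                # page stays on www
IMAGE_URLS = [
    "https://cdn.example.com/images/widget-front.jpg",       # images now on the CDN
    "https://cdn.example.com/images/widget-back.jpg",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
         '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">',
         '  <url>',
         f'    <loc>{escape(PAGE_URL)}</loc>']
for img in IMAGE_URLS:
    lines += ['    <image:image>',
              f'      <image:loc>{escape(img)}</image:loc>',
              '    </image:image>']
lines += ['  </url>', '</urlset>']

# The sitemap file itself still lives on (and is submitted from) the www host;
# only the image locations point at the CDN.
with open("image-sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```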
-
RE: Asking a site to remove a "nofollow" on a link to our client
I wouldn't worry too much about the nofollow link, especially since a complete lack of nofollow links in a big profile would be a warning sign of link manipulation. Still, Google also knows when a site with really high trust and authority uses nofollow links to perhaps too high a degree--like Wikipedia. That said, nofollow links can also bring search value. See: http://moz.com/blog/the-hidden-power-of-nofollow-links and the first comments at the end of the post. Cheers!
-
RE: Should I use sessions or unique visitors to work out my ecommerce conversion rate?
Matthew makes great points. I'd add that having conversions tied to membership data makes it all the more person-specific. This is why you'll hear numbers like a 74% conversion rate for Amazon Prime members (see: https://www.internetretailer.com/2015/06/25/amazon-prime-members-convert-74-time). Aside from better tracking, you can begin to see the value for Amazon in having members...
- Similar to Facebook, they're collecting user data per person and building a massive user base aside from just sales.
- Better tracking.
- Higher conversion rates.
- Top of mind branding.
- Upselling.
- And so on...
You get the idea. That's why when you go to Amazon.com the only pop-up or animated prompt you'll see on the home page is to "sign-in". Obviously, this could be something out of scope for your project currently, but food for thought down the road.
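To make the sessions-versus-visitors distinction concrete, here's a quick sketch with made-up numbers (nothing from anyone's actual analytics) showing how the same order count yields two different conversion rates depending on the denominator:

```python
# Hypothetical month of ecommerce data -- illustrative numbers only.
orders = 50
sessions = 4_000          # total sessions (visits)
unique_visitors = 2_500   # distinct people, averaging ~1.6 sessions each

session_rate = orders / sessions          # 0.0125
visitor_rate = orders / unique_visitors   # 0.0200

print(f"Conversion rate per session: {session_rate:.2%}")   # 1.25%
print(f"Conversion rate per visitor: {visitor_rate:.2%}")   # 2.00%

# The per-visitor rate is higher whenever people visit more than once,
# so pick one denominator and report it consistently.
```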
-
RE: Badges For a B2b site
Several in our space come to mind: Google's Certifications (AdWords, Analytics, etc.). Eloqua is also offering an Accreditation program to customers and planning to open it up to the public: http://www.eloqua.com/services/eloqua_university/Eloqua_Accreditation.html
The shopping badges are also popular: VeriSign, HackerSafe, etc. The most successful B2B badges all seem to really represent something -- knowledge, security, etc. -- instead of being a badge for badge's sake.
-
RE: Seeing lots of 0 seconds session duration from AdWords clicks
Like Mike requests, being able to view the landing page plus keyword combination is pretty key to diagnosing the low session duration. Knowing what the ad copy is would help as well. Currently, though, you're dealing with way too small a sample size at 14 users. Something like 140 users would be more indicative of trends, while 1,400 users would be even better.
Aside from the small sample size, the basic reason for low time on site is that people are expecting something different from what they're getting, so they leave rather quickly. Specific reasons it could be happening: slow-loading pages, poor design, poor matching of keyword or ad copy to landing page content, a poor user-to-content match, accidental clicks, etc. Cheers!
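As a back-of-the-envelope illustration of why 14 users is too small to read anything into -- hypothetical numbers only, not your actual data -- here's the rough 95% margin of error on an observed proportion (say, the share of 0-second sessions) at different sample sizes:

```python
import math

# Hypothetical observed share of 0-second sessions; 0.5 is the worst case
# for uncertainty, so the margins below are upper bounds.
p = 0.5

for n in (14, 140, 1400):
    # Approximate 95% margin of error for a proportion (normal approximation).
    margin = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"n = {n:>4}: +/- {margin:.1%}")

# n =   14: +/- 26.2%
# n =  140: +/- 8.3%
# n = 1400: +/- 2.6%
```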
-
RE: Pages are Indexed but not Cached by Google. Why?
You're welcome, Teddy. Something that often goes unmentioned when SEOs run very precise tests of specific on-page changes is that they're typically doing them on completely brand-new domains with nonsense words and phrases, because of the chance that their manipulations might get the site blacklisted. There's no loss to them if that happens other than unanswered questions. If the site does survive for a bit, maybe they'll learn a few new insights. This level of granular, on-site testing isn't a practical method for legitimate, public-facing sites.
When it comes to sites that serve a business function, rather than existing to test possible granular ranking changes, you're going to be much better served by measuring your changes against user interaction instead of search engine rankings. In that vein, design and test for the users on your site, not the search engines. If your site is getting visits but none of your business goals are being met, go nuts on testing those factors. Split test, iterate, and improve things with the focus of better conversions. Dive deep into services like Optimizely and research by the likes of Conversion Rate Experts. Use focus groups and usability testing to see how the minutiae of your changes affect interaction. You can go as extreme as you want to in that regard.
Most importantly, the bulk of search engine ranking strength comes from external factors: the number and variety of sites linking to your site, the quality of sites linking to your site, the trust and high reputation of sites linking to your site, the semantic agreement of sites linking to your site, etc. In most cases these factors are going to have many times greater influence on your rankings than your on-site tweaks. If your site is functional and complies with Google's own guidelines (http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf), you've covered the bulk of what you need to do on site. Focus instead on off-site factors.
The site: search function exists mostly to give searchers the ability to find a piece of information on a given domain. For example, if a reporter wants to cite an NBA stat from NBA.com, they'd use "stat thing" site:nba.com as a search. For users, that's useful in searching specifics, and for Google that makes them look all the better at "categorizing the world's information." Cache demonstrates the amount of information Google has archived and how quickly it's available. Back in the day--story time--Google used to advertise how quickly and how broadly they indexed things. In fact, they still do! If you look at a search result you'll notice a light gray statistic at the very top that says something like, "About 125,000,000 results (0.50 seconds)" for a search about hot dogs, for example. This is Google saying, "We're BIG and FAST." The precise details of your site are way down the list compared to Google's own story of being big and fast.
If you focus your efforts on off-site optimization, linking with other reputable sites, and building your network, you'll be far better served because you'll be getting referral traffic as well as lift in Google. Cheers!
-
RE: Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
I think the perspective is a little skewed on this... If you look at it from the angle that a link from a spammy site is a bad thing (hence the need to disavow), that includes the anchor text being bad too, even if it's targeted anchor text. What I mean is, if the site is considered spam and the link juice from it is negative, why wouldn't the logical conclusion be that the anchor text isn't going to count either, or could even be a negative ranking factor for that anchor text?
Within the RFP I'd err on the side of caution (under-promise, over-deliver) and say that we're going to disavow X number of links and start targeting quality. If for some strange reason you do get an anchor text boost somehow, it's a bonus on top of the above-board work you're doing moving forward.
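As an aside, the disavow file itself is just a plain text list (one URL or domain: entry per line, with # for comments). Here's a minimal sketch of writing one in Python -- the entries are made-up placeholders, not a suggestion of what anyone should actually disavow:

```python
# Minimal sketch: the disavow file is plain UTF-8 text, one URL or
# "domain:" entry per line, with "#" for comments. Entries are placeholders.
disavow_entries = [
    "# Spammy inbound links identified in the backlink audit",
    "domain:spammy-directory.example",               # disavow an entire domain
    "domain:low-quality-link-farm.example",
    "http://blog.example.net/old-widget-post.html",  # or one specific URL
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(disavow_entries) + "\n")
```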
-
RE: Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
Because the disavow tool is mainly used for restoring sites that have been penalized for having spammy inbound links, while unpenalized sites freely use nofollow. Also, sites that have been linked to via nofollow aren't penalized because of it, and often see positive effects from it. A study on that: http://www.socialseo.com/blog/an-experiment-nofollow-links-do-pass-value-and-rankings-in-google.html, "Google may not "count" the link as a weighted backlink but this doesn't mean they ignore the anchor text being used or the authoratative status of the website being linked from."
Further, nofollow links can still engage active readers and provide tremendous lift--a Moz example--while spam en masse is usually found on sites that have very little real-world presence. Google has a pretty good idea of many sites that are worthy of a disavow...
For your precise situation you're going to have to run your own tests to get your own data and your own numbers that specifically back up what you believe, but my advice is not to let your client expect a substantial--if any--lift from the past links they're planning to disavow.
P.S. Top secret... It's over 9000.
-
RE: How valuable is a link with a DA 82 but a PA of 1?
I agree with Travis. In short, yes, it's an excellent link. Like Travis mentions, getting caught up in the numbers can be misleading at times, and as a shorthand for the sites and people you want to work with, it's better to think of them as relationships. In this case, being connected to an official site that's reputable, spam-free, and exclusive is an excellent connection.
-
RE: Double hyphen in URL - bad?
Hyphens are a very common convention in folder names, and while too many can possibly be a negative in a domain name (it-is-a-really-hyphenated-domain.com, for example), they're an accepted practice in folder and file names.
One thing I'd ask, though, is whether someone who had a hyphen in their folder name to begin with would end up with something like /double-----dash--becomes--quintuple--dash/. If so, I'd ask them to try a little harder to get the dashes down to a minimum, just for the sake of keeping the URL shorter overall.
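If someone wanted to enforce that automatically, a slug cleanup like the sketch below (my own illustration, not something any particular CMS does) collapses runs of hyphens down to a single one:

```python
import re

def clean_slug(slug: str) -> str:
    """Collapse runs of hyphens into one and trim any leading/trailing hyphens."""
    return re.sub(r"-{2,}", "-", slug).strip("-")

print(clean_slug("double-----dash--becomes--quintuple--dash"))
# -> double-dash-becomes-quintuple-dash
```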
-
RE: Does a subdomain benefit from being on a high authority domain?
Rand recently did a Whiteboard Friday on this very thing: http://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday, with the pertinent part for your question being:
You're asking, "Should I put my content on a subdomain, or should I put it in a subfolder?" Subdomains can be kind of interesting sometimes because there's a lot less technical hurdles a lot of the time. You don't need to get your engineering staff or development staff involved in putting those on there. From a technical operations perspective, some things might be easier, but from an SEO perspective this can be very dangerous. I'll show you what I mean.
So let's say you've got blog.yoursite.com or you've got www.yoursite.com/blog. Now engines may indeed consider content that's on this separate subdomain to be the same as the content that's on here, and so all of the links, all of the user and usage data signals, all of the ranking signals as an entirety that point here may benefit this site as well as benefiting this subdomain. The keyword there is "may."
I can't tell you how many times we've seen and we've actually tested ourselves by first putting content on a subdomain and then moving it back over to the main domain with Moz. We've done that three times over the past two years. Each time we've seen a considerable boost in rankings and in search traffic, both long tail and head of the demand curve to these, and we're not alone. Many others have seen it, particularly in the startup world, where it's very popular to put blog.yourwebsite.com, and then eventually people move it over to a subfolder, and they see ranking benefits.
If at all possible, make it part of the domain in a subfolder.
-
RE: Good or bad adding keywords in Pinterest description?
You'll want to avoid creating keyword-stuffed pages that then point back to your site, as you'd be creating a page that could become a negative signal as an inbound link to your site, similar to a spammy link on a different domain. Moz covers this pretty well in their Search Engine Myths and Misconceptions chapter here: http://moz.com/beginners-guide-to-seo/myths-and-misconceptions-about-search-engines, specifically, "One of the most obvious and unfortunate spamming techniques, keyword stuffing, involves littering keyword terms or phrases repetitively on a page in order to make it appear more relevant to the search engines. As discussed above, this strategy is almost certainly ineffectual.
Scanning a page for stuffed keywords is not terribly challenging, and the engines' algorithms are all up to the task. You can read more about this practice, and Google's views on the subject, in a blog post from the head of their web spam team: SEO Tip: Avoid Keyword Stuffing." Even with one degree of separation, it's still not a benefit and not a best, or safe, practice.
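To illustrate how easily a stuffed description stands out, here's a toy sketch -- my own example, not how Pinterest or Google actually score anything -- that just measures how much of the text the single most repeated word takes up:

```python
import re
from collections import Counter

def top_term_share(text: str) -> float:
    """Fraction of all words accounted for by the single most repeated word."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    _, count = Counter(words).most_common(1)[0]
    return count / len(words)

natural = "Hand-thrown ceramic mug with a speckled glaze, dishwasher safe."
stuffed = "ceramic mug ceramic mugs best ceramic mug buy ceramic mug cheap ceramic mugs"

print(f"natural description: {top_term_share(natural):.0%}")  # every word appears roughly once
print(f"stuffed description: {top_term_share(stuffed):.0%}")  # one term dominates the text
```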