Best posts made by RyanPurkey
-
RE: Does backlinks in images equal naked backlinks?
Hi Eslam. A link via an image is perfectly acceptable, especially if it's a natural link. Some sites tend to have an abundance of these if they're things like image-sharing sites, online comics, meme generators, etc., and many of those sites do perfectly well with a high percentage of image links. The bigger concern is that the site isn't spammy; see Moz's new tool that's part of OSE: http://moz.com/blog/spam-score-mozs-new-metric-to-measure-penalization-risk and http://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday Cheers!
-
RE: Is there any benefit of having a .tv tld instead of a .com for a video centric website?
None that I know of. Most of the optimization tools available can be used on any domain. See: https://support.google.com/webmasters/answer/80472.
Video content includes web pages which embed video, URLs to players for video, or the URLs of raw video content hosted on your site. If Google cannot discover video content at the URLs you provide, those records will be ignored by Googlebot.
A .tv won't enhance the ability for Google to find your videos, but proper tagging and sitemapping will. Cheers!
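To give a sense of what "proper tagging and sitemapping" means here, this is a minimal sketch of a video sitemap entry in Google's sitemap-video format; the page, thumbnail, and video URLs are all hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The page that embeds the video (hypothetical URL) -->
    <loc>http://example.com/videos/my-video-page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://example.com/thumbs/my-video.jpg</video:thumbnail_loc>
      <video:title>My Video Title</video:title>
      <video:description>A short description of the video.</video:description>
      <!-- Point Google at the raw video file, or use video:player_loc for a player URL -->
      <video:content_loc>http://example.com/videos/my-video.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

This works the same on a .com or a .tv; the sitemap, not the TLD, is what tells Googlebot where the videos live.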
-
RE: Links from Google Books
Great question Neil! Google uses optical character recognition (OCR, more specifically OCRopus) to convert visible print into searchable text; hence, you're able to find terms in Google Books via Google search. Link text is also recognizable due to the standard 'http' format, so even though you'd never be able to click it in an old book at the library (who knows what new ones will do!), Google Books is still able to recognize the link and treat it as such in the digital, Internet realm. Now, a website that is being mentioned in books has a high likelihood of having a robust backlink profile, but that notwithstanding, I'd bet that Google would give a high amount of trust to a link that makes it into its OCR database.
As for Street View, that is pushing it! Who knows though; there's merit in giving a website online exposure for the offsite work it does via billboards, storefronts, etc. I think you and I both would love to know the people who could truly answer that one, huh?
-
RE: How do you build your pre-sales seo audit
These guides should serve you pretty well: http://moz.com/pages/search_results?q=checklist A lot of what you do depends on the situation.
-
RE: Case Sensitive URLs, Duplicate Content & Link Rel Canonical
In addition to using rel=canonical to point to the lowercase version of your URLs, you should also consider implementing redirection from uppercase to lowercase. A regex expert should be able to write the redirect script you'll need to add to your .htaccess file in order to change upper to lowercase. Cheers!
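As a rough illustration of what that redirect might look like, here's a sketch assuming Apache with mod_rewrite; note that the RewriteMap directive itself must live in the server or virtual host config (it isn't valid in .htaccess), though the rules can then reference it:

```apache
# In the virtual host config (RewriteMap is not allowed in .htaccess):
RewriteMap lowercase int:tolower

RewriteEngine On
# If the requested path contains any uppercase letters...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the all-lowercase version.
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```

With the redirect in place, the rel=canonical on the lowercase pages acts as a belt-and-suspenders signal for any uppercase links that slip through.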
-
RE: SEO impact of the anatomy of URL subdirectory structure?
I see. In that case, sure, any short folder would be fine. Maybe even 'a', as it reads a little nicer: website.com/a/us-en/store/product-name.html. It reads like, "Website, a US, English-language store with the product named X." Someone seeing the link would have a pretty good idea of what it's going to be.
-
RE: Tool/Method to find users on Twitter from a CSV file
Since you're looking for users and not necessarily tweets, you can collect the company names you're interested in via a spreadsheet, and then append each unique name to this URL: http://twitter.com/#!/search/users/
Just getting rid of duplicate company names should cut down the list some, but it's still going to be a tedious process. At least with the spreadsheet you can add further columns to prioritize your work and go after the companies that are most applicable.
To cut the tedium, Mechanical Turk could then process the results for you fairly quickly.
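If the list is long, a short script can handle the dedupe-and-append step; here's a minimal sketch assuming a one-column companies.csv of company names (the filename and column position are hypothetical):

```python
import csv
from urllib.parse import quote

SEARCH_URL = "http://twitter.com/#!/search/users/"

# Collect unique company names from the first column of the CSV.
seen = set()
with open("companies.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        if not row:
            continue
        name = row[0].strip()
        if name and name.lower() not in seen:
            seen.add(name.lower())
            # Append the URL-encoded company name to the user-search URL.
            print(SEARCH_URL + quote(name))
```

The output is one search URL per unique company, ready to paste back into the spreadsheet or hand off to Mechanical Turk.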
-
RE: Screaming frog Advice
Hi Andy. There are quite a few settings you can adjust to make the server load less while the crawl is running. These can be found with descriptions here: http://www.screamingfrog.co.uk/seo-spider/user-guide/configuration/
For example, by not checking Images, CSS, SWF, and JavaScript you'll be able to lessen the load substantially, or if you'd like to crawl just a portion of the site, you can set it to not check links outside of the start folder.
To have even more control over the crawl, you can use regular expressions to exclude certain pages, or sections that match a given pattern. The page above is fairly robust, so it should help you dial back the crawler to be friendlier to your server. Cheers!
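For instance, the Exclude configuration takes one regex per line, matched against the full URL; two hypothetical patterns (one skipping a whole folder, one skipping any URL with a price parameter) might look like:

```
http://www.example.com/do-not-crawl/.*
.*\?price=.*
```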
-
RE: Multilingual Site and 301 redirection
For the other languages, it's going to be a conditional redirect, which is best handled by a 302. Here it is from Google (http://googlewebmastercentral.blogspot.com/2014/05/creating-right-homepage-for-your.html):
A third scenario would be to automatically serve the appropriate HTML content to your users depending on their location and language settings. You will either do that by using server-side 302 redirects or by dynamically serving the right HTML content.
Remember to use x-default rel-alternate-hreflang annotation on the homepage / generic page even if the latter is a redirect page that is not accessible directly for users.
Note: Think about redirecting users for whom you do not have a specific version. For instance, French-speaking users on a website that has English, Spanish and Chinese versions. Show them the content that you consider the most appropriate.
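In practice those annotations might look like this; a minimal sketch assuming a site with English, Spanish, and Chinese versions plus a generic language-selector homepage (all URLs hypothetical):

```html
<!-- On every version of the page, list all language alternates: -->
<link rel="alternate" hreflang="en" href="http://example.com/en/" />
<link rel="alternate" hreflang="es" href="http://example.com/es/" />
<link rel="alternate" hreflang="zh" href="http://example.com/zh/" />
<!-- x-default catches users matching none of the above, e.g. French speakers: -->
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```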
Cheers!