You're on it. Redirecting to the new image source and submitting a new sitemap pointing to the URL 3 location for your images will be big steps in the right direction. Be sure to follow the instructions here for your sitemap: https://support.google.com/webmasters/answer/178636 as well as reviewing image publishing guidelines: https://support.google.com/webmasters/answer/114016. Cheers!
Best posts made by RyanPurkey
-
RE: Image URLs changed 3 times after using a CDN - How to Handle for SEO?
-
RE: Value of Outbound Links
Reviewing old content is always a good idea, as you can trigger positive signals like freshness of content, higher specificity towards searches, and better linking. Too many people think of links as a one-way street, trying to gain links to their site without giving much thought to who they should link out to. For example, say you have two articles on two sites that are identical (forget about duplicate content for a moment) and have the same numbers when it comes to inbound link strength, PA, DA, etc. The article that links out to better resources (high-trust sites, specific examples, sites with frequently updated information) is going to rank higher than the article that doesn't link to any external resources. In other words, Google can understand not only the content of your page, but also the quality of that content, through its two-way link associations.
-
RE: Splitting Google analytics data
It sounds like you'd want three different views: one that filters out business IP(s) 1 and business IP(s) 2 and is thus "Public", another including only business IP(s) 2 and is thus Business 2, and a third including only business IP(s) 1 and is thus Business 1. This can be done by creating three views, named something like Business 1, Business 2, and Public, then applying the applicable filters in the admin panel: Filters >> Create New >> Name >> Predefined >> Include Only >> traffic from IP addresses >> etc.
-
RE: Google Penalize?
In most cases there's certainly less value for the second link (even more so for the target page, since it's going to the exact same destination and the source page would have less link juice after the first link anyway). And yes, Google can measure several aspects of a keyword: popularity, commercial intent, search frequency, buzz/trend, etc. But I don't think one instance of this on a site would penalize an entire domain. A high instance of it would dilute a site in the hub/authority sense: http://en.wikipedia.org/wiki/HITS_algorithm, as it's only pointing to one resource.
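For the curious, the hub/authority idea behind that HITS link can be sketched in a few lines. The tiny link graph here is hypothetical; it just shows that a page linking out to good resources earns hub strength, while the resources it points at earn authority.

```python
def hits(graph, iterations=50):
    """graph: dict mapping a page to the list of pages it links to.
    Returns (hubs, authorities), each a dict of normalized scores."""
    nodes = set(graph) | {t for targets in graph.values() for t in targets}
    hubs = {n: 1.0 for n in nodes}
    auths = {n: 1.0 for n in nodes}
    for _ in range(iterations):
        # A page's authority is the sum of the hub scores linking into it.
        auths = {n: sum(hubs[src] for src, out in graph.items() if n in out)
                 for n in nodes}
        norm = sum(auths.values()) or 1.0
        auths = {n: v / norm for n, v in auths.items()}
        # A page's hub score is the sum of the authority scores it links out to.
        hubs = {n: sum(auths[t] for t in graph.get(n, [])) for n in nodes}
        norm = sum(hubs.values()) or 1.0
        hubs = {n: v / norm for n, v in hubs.items()}
    return hubs, auths

graph = {
    "article_a": ["resource_1", "resource_2"],  # links out to resources
    "article_b": [],                            # links out to nothing
    "resource_1": [],
    "resource_2": [],
}
hubs, auths = hits(graph)
```

A site whose pages all point at one resource and nothing else ends up with a thin, lopsided hub profile, which is the dilution described above.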
-
RE: Duplicate Product Pages On Niche Site
I think a bit more would go into the consideration of whether or not to 301 the site: branding, social activity, conversion rate differences, and market segments served. If everything folds into the main site better than keeping it separate in the niche, then I'd go for it. Here's a nice case study of a company that had an issue similar to yours: http://moz.com/blog/2-become-1-merging-two-domains-made-us-an-seo-killing. Reading through that post will help with seeing the things they looked at in order to consider merging the two domains. Cheers!
-
RE: Image URLs changed 3 times after using a CDN - How to Handle for SEO?
Right. Not everything is going to be served from the CDN. It's most likely set up for your images, so your sitemap will still reside on www. Make sure to point to the front-end files though, as those are the publicly accessible ones.
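To illustrate, here's a minimal sketch of building such an image sitemap, with hypothetical www/CDN URLs standing in for your own. The tag layout follows Google's image sitemap extension (the support article linked earlier).

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: dict mapping a public page URL to the image URLs it embeds."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        # <loc> is the page on www; the images point at the CDN.
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for img in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")

xml = build_image_sitemap({
    "https://www.example.com/widgets": [
        "https://cdn.example.com/images/widget-front.jpg",
    ],
})
```

The sitemap file itself would still be hosted and submitted from the www domain, as noted above.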
-
RE: Asking a site to remove a "nofollow" on a link to our client
I wouldn't worry too much about the nofollow link, especially since a complete lack of nofollow links in a big profile would be a warning sign of link manipulation. Google also knows when a site with really high trust and authority uses nofollow links to perhaps too high a degree--Wikipedia, for example. That said, nofollow links can still bring search value. See: http://moz.com/blog/the-hidden-power-of-nofollow-links and the first comments at the end of the post. Cheers!
-
RE: Google Analytics, new property or views?
I'd avoid creating multiple properties for the same website for the reasons you outlined above. A new view would allow the previous time comparisons, and those comparisons could be made more meaningful by applying and removing a few segments that end up doing similar things as your filters.
-
RE: Questions about canonicals
This might help your copywriter: have them incorporate neighborhoods from those other cities as well as testimonials from past customers based in those locations to help produce different content. Typically cities have their own separate areas--an industrial part of town, a hip district, a business sector, etc.--that interact with service companies in different ways. Using your knowledge of how you've served people in those areas not only adds a lot more familiarity and localized referrals to the page, it should differentiate your content too.
-
RE: What is your experience with markups (schema.org) in terms of SEO and best practice learnings?
Like you say, there are going to be different things to emphasize per vertical, but here are three resources to help get you started.
- http://moz.com/learn/seo/schema-structured-data Moz's intro to Structured Data.
- http://moz.com/ugc/getting-the-most-out-of-schemaorg-microformats A nice Youmoz post on implementation
- http://www.stateofdigital.com/schema-org-best-practices/ Some best practices
The schema.org markups are designed to increase and diversify over time, so you'll likely have to search through them to find the best fits for your clients. Cheers!
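As a starting point, here's a hedged sketch of emitting one of those markups as JSON-LD. The Product type and the "Acme Widget" fields are placeholder assumptions; swap in whatever schema.org type fits the client's vertical.

```python
import json

# Hypothetical Product markup; every field here is illustrative.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Widget",
    "description": "An example widget used to illustrate the markup.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "19.99",
    },
}

# JSON-LD goes into the page inside a script tag.
json_ld = ('<script type="application/ld+json">'
           + json.dumps(product, indent=2)
           + "</script>")
```

Generating the dict in code rather than hand-writing the markup makes it easier to keep the structured data in sync with the product database as schema.org types evolve.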
-
RE: If your company name is the same as a famous person/movement/celebrity what kind of options do you have in building your knowledge graph relevance, and improving your SERP?
Hello Lora. Yes, it is certainly possible, although the SERPs are going to vary based on current news, personalization, localization, and so on. Each type of site is going to have its own strengths and weaknesses, and the search engines are going to try to match their results to search intent as much as possible. The less you try to shoehorn into phrases that have a clear intent of looking for the celebrity / pop-culture item, and instead play to the strengths of the global company, the better.
-
RE: Benefit of Guest Blogging with weak relevancy
Rand recently did a WBF on this, and one of the comments by Matt in the post has an interesting back of the napkin result for his tests that were similar to what you're describing above here: http://moz.com/blog/are-on-topic-links-important-whiteboard-friday#comment-325645, mainly, "Category 3 (Off Topic, Strong Links) actually started taking sites DOWN with it. If we had a strong but off topic/spammy link, it hurt the site more than helped. We tested a few "generic PBN" sites that had decent effect on some sites, negative effect on others. It seemed to stem from relevance with what went up & down."
If it's not something that's likely (high percentage chance of interest) to help a user visiting the page, I'd avoid it. Cheers!
-
RE: Should I use sessions or unique visitors to work out my ecommerce conversion rate?
Matthew makes great points. I'd add that having conversions tied to membership data makes it all the more person-specific. This is why you'll hear numbers like a 74% conversion rate for Amazon Prime members (see: https://www.internetretailer.com/2015/06/25/amazon-prime-members-convert-74-time). Aside from better tracking, you can begin to see the value for Amazon in having members...
- Similar to Facebook they're collecting user data per person and building a massive user base aside from just sales.
- Better tracking.
- Higher conversion rates.
- Top of mind branding.
- Upselling
- And so on...
You get the idea. That's why when you go to Amazon.com the only pop-up or animated prompt you'll see on the home page is to "sign-in". Obviously, this could be something out of scope for your project currently, but food for thought down the road.
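The sessions-versus-people distinction above can be shown with some made-up numbers: the same 30 transactions look very different depending on the denominator.

```python
def conversion_rates(transactions, sessions, users):
    """Return (per-session, per-user) conversion rates as percentages."""
    per_session = 100.0 * transactions / sessions
    per_user = 100.0 * transactions / users
    return per_session, per_user

# Illustrative figures only: 1,000 people generate 1,500 sessions and 30 sales.
per_session, per_user = conversion_rates(transactions=30,
                                         sessions=1500,
                                         users=1000)
```

Per-session the rate is 2%, per-person it's 3%, because one shopper often takes several sessions to buy. Tying conversions to a logged-in member, as Amazon does, makes the per-person number the natural one to report.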
-
RE: Page Speed or Size?
Page speed is a factor. The video / moving banner thing has been a popular design feature of late--see https://www.paypal.com/home as an example, and wait a bit for the banner image to move--but even a company with bandwidth like PayPal doesn't make the animation overly long. They also achieve their look and feel prior to the animation. Here's that video as a standalone file: https://www.paypalobjects.com/webstatic/en_US/mktg/wright/videos/OneTouchCity.mp4. It's only 10 seconds long and 2.1MB vs 21.2MB. You might be able to use this example with your client as a company that is highly focused on quickly loading pages, and caution against including an element that would cause the whole page to suffer as a result. Also, making watching the video an event in Analytics would tell you how many people are really interested in seeing it, versus those visitors who are not.
-
RE: Google's Mobile Update: What We Know So Far (Updated 3/25)
Another consideration: if someone links to a responsive site--whether it was from mobile or non--the link would remain the same. That should help as well. That way there's less need to prompt users to "Switch to Desktop" or vice versa.
-
RE: What are some powerful reviews websites for online-only businesses?
resellerratings.com is one, but there aren't a huge number of these. Sites that are online presence only usually get reviews via industry relevant blogs on the front-end, i.e. a blogger buys and reviews the product and places the review on their own site. Google will definitely notice these as they're almost always linking back to the company / online store.
-
RE: Second anchor text on page. Does it count?
This is also a question of usability / user experience. If your page incorporating in-article anchor links is laid out well and has engaged users clicking those links to explore the different sections of the page, you're likely getting a lower bounce rate and better numbers in terms of users bouncing back to the search results from your page.
In ATP's case, those links were likely causing navigational confusion and/or potential keyword-stuffing flags. Cleaning them up served his site better.
In either case, I don't think it's a simple question of solely how many links count, especially in real world usage. Cheers!
-
RE: Badges For a B2b site
Several in our space come to mind: Google's Certifications (AdWords, Analytics, etc.). Eloqua is also offering an Accreditation program to customers and planning to open it up to the public: http://www.eloqua.com/services/eloqua_university/Eloqua_Accreditation.html
The shopping badges are also popular: VeriSign, HackerSafe, etc. The most successful B2B badges all seem to really represent something--knowledge, security, etc.--instead of being a badge for its own sake.
-
RE: Can i use "nofollow" tag on product page (duplicated content)?
Hi Franci. You might consider using rel=canonical for category pages... There was a YouMoz post written about this topic a few years back that still has lots of applicable tips that you might consider: http://moz.com/ugc/guide-to-ecommerce-facets-filters-and-categories. Without knowing the details of how your site is using categorization, filtering, faceting, etc it's hard to be precise as to what you should do. That article should help you quite a bit though.
-
RE: Google's Mobile Update: What We Know So Far (Updated 3/25)
I think these changes have the backing of tons of user data--way beyond what we would have access to individually--and a lot of sites are completely in the dark still about mobile usability. The majority of sites on the web seem happy with the status quo as long as it's working for them.
People do, however, look at search traffic, income, etc. When they see a sudden drop, they'll come around asking why. In the long view, many more sites will become much more user-friendly after this change.
-
RE: Seeing lots of 0 seconds session duration from AdWords clicks
As Mike requests, being able to view the landing page plus keyword combination is pretty key to diagnosing the low session duration. Knowing what the ad copy is would help as well. Currently, though, you're dealing with way too small a sample size at 14 users. Something like 140 users would be more indicative of trends, while 1,400 would be even better.
Aside from the low sample size, the basic reason low time on site happens is because people are expecting something different than what they're getting from a site so they leave rather quickly. Specific reasons it could be happening: slow loading pages, poor design, poor matching keyword or ad copy to landing page content, poor user to content match, accidental clicks, etc. Cheers!
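A rough way to see why 14 users is too small: the margin of error on any rate you measure shrinks with the square root of the sample. A quick sketch (the 0.5 proportion and the 95% z-score are illustrative assumptions):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p over n users."""
    return z * math.sqrt(p * (1 - p) / n)

# Suppose half of the visitors show 0-second sessions:
moe_14 = margin_of_error(0.5, 14)     # roughly +/- 26 points
moe_140 = margin_of_error(0.5, 140)   # roughly +/- 8 points
moe_1400 = margin_of_error(0.5, 1400) # roughly +/- 3 points
```

At 14 users the "true" rate could plausibly be anywhere from a quarter to three quarters of traffic, which is why it's worth waiting for more data before changing the campaign.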
-
RE: How risky is temporarily moving high value SEO pages to a brand new domain?
Ouch. That's a tricky one. A 302 is likely your best bet if you HAVE to move to the new domain, followed by a 301 when you move the pages back to new URLs on the old domain. But why do the middle step of moving these pages to the new domain at all? If you're going to put them on new URLs on the old domain, why not just create those URLs and move the pages there? Any competent developer should be able to keep pages on a domain live while development takes place in the background. Really, the only instance where I could see having to part with your domain for a few months is if you lost ownership of it. Other than that, I'd do what I could to avoid bouncing around between domains, especially as far as a brand is concerned.
-
RE: Pogo-Stick or not?
Nope. The way the search engines might "catch on" as Rand mentions in his WBF is that the search engines know when the person leaves to go look at your site and when they come back. So while you might still show a bounce because they had no interaction other than staying for 2 minutes, the search engine sees an exit >> 2 minutes >> a return. A pogo-stick behavior would be exit >> 1.2 seconds >> return, and for a LOT of users doing so.
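That exit >> time >> return pattern could be sketched as a toy classifier. The 10-second threshold is purely an assumption for illustration; search engines don't publish the dwell times they actually use.

```python
def classify_return(seconds_before_return, threshold=10.0):
    """Label a search visit by how quickly the user bounced back to the results.

    A fast round-trip to the SERP looks like pogo-sticking; a long stay
    before returning reads as a satisfied (if technically "bounced") visit.
    """
    return "pogo-stick" if seconds_before_return < threshold else "dwell"

label_quick = classify_return(1.2)  # back on the results almost instantly
label_long = classify_return(120)   # stayed two minutes before returning
```

The signal only means much in aggregate: one quick return is noise, but a LOT of users doing exit >> 1.2 seconds >> return is the pattern described above.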
-
RE: 301'ing old (2000), high PR, high pages indexed domain
Hi Steve. EGOL would be the expert on this question for sure. And just to clarify it was only a site that triggered removal and not you as an individual or company? Beyond that, here's the official guide on how to get a site reinstated based on appeal:
We're always willing to work with you to resolve any issues you might have. If you feel that this decision was made in error, and if you can maintain in good faith that the invalid activity was not due to the actions or negligence of you or those for whom you are responsible, you may appeal the disabling of your account.
To do so, please contact us only through our appeal form.
Once we receive your appeal, we'll do our best to inform you quickly and will proceed with appropriate action as necessary. Please understand, however, that there is no guarantee that your account will be reinstated.
Once we've reached a decision on your appeal, further appeals may not be considered, and you might not receive any further communication from us.
From: https://support.google.com/adsense/answer/57153#q2. Chances are you've already tried this and the appeal was rejected. Did you go over each and every point of their reasoning behind the invalid activity? Did you go full kimono? You could try once more, but I'd preface that as a FINAL appeal, then let it rest and look for different advertisers.
If you do pursue linking the old, banned account to a new one, you'd probably run afoul of, "My account was disabled for being related to another disabled account. Can you tell me more about this relation?" from that same page... Good luck!
-
RE: Pages are Indexed but not Cached by Google. Why?
You're welcome Teddy. Something that goes undermentioned when SEOs run very precise tests on specific page-side changes is that they're typically doing them on completely brand new domains with nonsense words and phrases, because of the chance that their manipulations might get the site blacklisted. There's no loss to them if that happens other than unanswered questions. If the site does survive for a bit, maybe they'll learn a few new insights. This level of granular, on-site testing isn't a practical method for legitimate, public-facing sites.
When it comes to sites that serve a business function aside from testing possible granular ranking changes, you're going to be much better served by measuring your changes against your user interaction instead of your search engine rankings. In that vein, design and test for the users on your site, not the search engines. If your site is getting visits but none of your business goals are being met, go nuts on testing those factors. Split test, iterate, and improve things with the focus of better conversions. Dive deep into services like Optimizely and research by the likes of Conversion Rate Experts. Use focus groups and usability testing to see how the minutiae of your changes affects interaction. You can go as extreme as you want to in that regard.
Most importantly, the bulk of search engine ranking strength comes from external factors: the number and variety of sites linking to your site, the quality of sites linking to your site, the trust and high reputation of sites linking to your site, the semantic agreement of sites linking to your site, etc. These factors are going to have many times greater influence in your ranking than your onsite tweaks in most cases. If your site is functional and complies with Google's own guidelines (http://static.googleusercontent.com/media/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf) you've covered the bulk of what you need to do on site. Focus instead on off site factors.
The site: search function exists mostly to provide searchers the ability to find a piece of information on a given domain. For example, let's say a reporter wants to cite an NBA stat from NBA.com, they'd use "stat thing" site:nba.com as a search. For users, that's useful in searching specifics, and for Google that makes them look all the better at "categorizing the world's information." Cache demonstrates the amount of information Google has archived and how quickly it's available. Back in the day--story time--Google used to advertise how quickly and how broadly they indexed things. In fact, they still do! If you look at a search result you'll notice a light gray statistic at the very top that says something like, "About 125,000,000 results (0.50 seconds)" for a search about hot dogs for example. This is Google saying, "We're BIG and FAST." The precise details of your site are way down the list to Google's own story of being big and fast.
If you focus your efforts in off site optimization, linking with other reputable sites, and building your network you'll be way better served because you'll be getting referral traffic as well as lift in Google. Cheers!
-
RE: An article we wrote was published on the Daily Business Review, we'd like to post it on our site. What is the proper way?
Hi Pete. Using rel=canonical would be a better implementation, as your site showing up for a search on these articles is perfectly acceptable since they're about your site. There are also several other ways, design-wise, in which you can link back to the originally published article...
- Annotation. Instead of republishing the entire article you can quote bits from it and highlight what service/product/thing your company does in relation to the quote. It could perhaps be an expansion like, "We also make this in custom colors..." a clarification, "This is now a permanent service..." or any other applicable detail really.
- Screen cap. Some sites churn through articles so an archived screen grab of the article is nice to show the press you got. Photos are especially handy for when you show up in print.
- A brand scroll. Lots of sites add the logos of well-known brands that have written about them, titled something like "What people are saying", showing the logos of various sites--The Verge, Wired, TechCrunch, etc.--and linking to each article via its logo.
So I'd get rid of the noindex tag. Me finding your site as a result next to the Daily Business Review site would make my user experience better as the search is returning the correlation even before I click through to read the sources.
-
RE: Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
I think the perspective is a little skewed on this... If you look at it from the angle that a link from a spammy site is a bad thing (hence the need to disavow), that includes the anchor text being bad too, even if it's targeted anchor text. If the site is considered spam and the link juice from it is negative, the logical conclusion is that the anchor text won't count either, or could even be a negative ranking factor for that anchor text.
Within the RFP I'd err on the side of caution (under promise, over deliver) and say that we're going to disavow X number of links and start targeting quality. If for some strange reason you do get an anchor text boost somehow, it's extra on top of the above-board work you're doing moving forward.
-
RE: Duplicate pages or note? Variations just due to language changes?
Hi Simon. Don has given you some good guidance. Here's a recent Moz Dev Blog post on the subject: https://moz.com/devblog/near-duplicate-detection/. Note their images explaining much of what Don described. Two pages having enough shared phrases (because of the header, footer, nav, etc) can trigger the duplicate warning. While the latter part of the dev blog post certainly gets technical, it should explain why you might be getting duplicate content warnings even further if that's your bent.
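The shared-phrase idea from that dev blog post can be sketched with word shingles and Jaccard similarity (a simplified stand-in for the production approach the post describes):

```python
def shingles(text, k=3):
    """The set of k-word phrases ("shingles") appearing in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of the two pages' word shingles, 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical pages that share only boilerplate header/footer text:
page_a = "acme tours home about contact unique paris itinerary all rights reserved"
page_b = "acme tours home about contact unique berlin itinerary all rights reserved"
score = similarity(page_a, page_b)
```

Because the header, footer, and nav repeat on every page, short unique content can leave the shared shingles dominating the score, which is exactly how thin pages trip a duplicate warning.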
Since each tool is a bit different you can also check your pages with other tools, such as: http://www.webconfs.com. Cheers!
-
RE: Does Disavowing Links Negate Anchor Text, or Just Negates Link Juice
Because the disavow tool is mainly used for restoring sites that have been penalized for having spammy inbound links while unpenalized sites freely use nofollow. Also sites that have been linked to via nofollow aren't penalized because of it, and often see positive effects from it. A study on that: http://www.socialseo.com/blog/an-experiment-nofollow-links-do-pass-value-and-rankings-in-google.html, "Google may not "count" the link as a weighted backlink but this doesn’t mean they ignore the anchor text being used or the authoratative status of the website being linked from."
Further, nofollow links can still engage with active readers and provide tremendous lift--a moz example--while spam en masse is usually found on sites that have very little real world presence. Google has a pretty good idea of many sites that are worthy of a disavow...
For your precise situation you're going to have to run your own tests to get your own data and your own numbers that specifically back up what you believe, but my advice is that you don't let your client expect to get a substantial--if any--lift from their past links that they are planning to disavow.
P.S. Top secret... It's over 9000.
-
RE: Google Plus One button and https migration
Hi Zach. Since your URL isn't changing dramatically keeping the URL set to the HTTP version in the +1 is probably the best bet at this point. That way if Google does update their code in the future to recognize and allow transfer of likes, especially in a change from HTTP to HTTPS, you'll be ready to make the switch then to just the +1 portion and everything else will already have been in place.
-
RE: Google Cookies - Organic vs PPC visitors
You'll want your IT team to look at the referrer strings within your server logs identifying which ones are from Google. Just subtract out the ones that are from paid placement and you'll be on your way.
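As a rough sketch of that subtraction: paid AdWords clicks normally carry a gclid parameter on the landing URL when auto-tagging is on, so each log line could be bucketed like this (the URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def classify_google_hit(referrer, landing_url):
    """Bucket a log entry as paid, organic, or non-Google traffic.

    Assumes AdWords auto-tagging is enabled, so paid clicks arrive with
    a gclid query parameter on the landing URL.
    """
    host = urlparse(referrer).hostname or ""
    if "google." not in host:
        return "other"
    params = parse_qs(urlparse(landing_url).query)
    return "paid" if "gclid" in params else "organic"

bucket = classify_google_hit("https://www.google.com/",
                             "https://example.com/?gclid=abc123")
```

Count the buckets over the log window and the organic figure is the Google total minus the paid one, as described above.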
-
RE: Google Plus One button and https migration
If the transfer from HTTP to HTTPS is done well with 301 redirects, updating the various webmaster tool locations, maybe a little press around the change (new links, social, etc.) then it should transition well.
-
RE: Where do you find an individual/freelance SEO?
Since you're hiring part-time, you might want to make the work more piecemeal in relation to your site, i.e. hire for specific tasks aimed at achieving your overall goals. When you're hiring full-time you have more options, as you can compete with agencies for their talent. (Sorry for mentioning that, agency friends!) As Sean and Ray mentioned above, finding people here at Moz with a proven track record helps, as does some outreach via PM.
-
RE: My Home Page meta title on Google isn't what it should be
Search engines at times do not use the provided title tag (and meta description). See: http://moz.com/learn/seo/title-tag and the note, "Keep in mind that search engines may choose to display a different title than what you provide in your HTML. Titles in search results may be rewritten to match your brand, the user query, or other considerations."
-
RE: Confused about SEO for Forums (Status Code 200)
Hi Peiken. Status Code 200 is actually a good thing. That's the HTTP status code that just means the page is OK: https://en.wikipedia.org/wiki/List_of_HTTP_status_codes#2xx_Success
Within the duplicate content checker on Moz.com you'll get a better view of the duplicate content issues by downloading the CSV using the "Download CSV" button. Once that's open you'll have a better look into the pages with duplicate content. My guess is that a root page for a thread or discussion is being compared to "duplicate" ones deeper in the thread. Take a look at the CSV though and see if that begins to make more sense. Cheers!
-
RE: How valuable is a link with a DA 82 but a PA of 1?
I agree with Travis. In short, yes it's an excellent link. Like Travis mentions, getting caught up in the numbers can be misleading at times, and for a short hand of the sites and people you want to work with it's better to think of them as relationships. In this case, being connected to an official site that's reputable, spam-free, and exclusive is an excellent connection.
-
RE: Internal Links Query - What should be use as anchor text
Part of this is a layout question. Are the More / View All links opening up new pages or are they expanding content on the page via a # mechanism? If a user is going to a different page a common practice is to title the link to that page (a nice way to get your keywords there naturally), have a follow up blurb, and then link with a "more..." as well. Usually this is done for content / articles more so than on page expansion. The "Our articles contain valuable..." section on CRE's home page is a good example: http://www.conversion-rate-experts.com.
This can be done on page as well depending on how you create your layout. An old blog post, but discussion around linking via # anchors can be found here: http://moz.com/blog/the-first-link-counts-rule-and-the-hash-sign. Broadly, if your content is grouped together well both in the code and content it's easier for the search engines to define and parse. If links identify it as such, all the better.
-
RE: SEO of Social Media Pages
Hi Timothy,
Several factors could be coming into play... On the page side: age of the page, performance across the domain, links you're missing, and so on. Since it is older, this makes sense. Imagine it this way: when Google first came to Twitter and saw the profile page, it was x% of Twitter's total. That share has since decreased against the whole, but the page made a presence for itself early on via "more tweets". The newer site hasn't had as much time to do the same.
That's just the page side, on the search result side, you could be seeing shuffling results, personalized results, or any combination of factors resulting in what you're getting. Is the difference between 1 and 2 mission critical or more just a personal curiosity?
-
RE: Double hyphen in URL - bad?
Hyphens are a very common convention in folder names, and while too many can possibly be a negative in a domain name (it-is-a-really-hyphenated-domain.com for example) they're an accepted practice in folder / file names.
One thing I'd ask though is if someone had a hyphen in their folder name to begin with, would that cause something like /double-----dash--becomes--quintuple--dash/ ? If so I'd ask them to try a little harder to get the dashes down to a minimum, just for the sake of keeping the URL shorter overall.
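If they did want to clean those up programmatically, collapsing hyphen runs is a one-liner; the slug below is the hypothetical one from above.

```python
import re

def tidy_slug(slug):
    """Collapse runs of hyphens and trim stray leading/trailing ones
    from a URL folder name."""
    return re.sub(r"-{2,}", "-", slug).strip("-")

cleaned = tidy_slug("double-----dash--becomes--quintuple--dash")
```

Running something like this when the folder names are generated keeps the URLs short without anyone having to police hyphens by hand.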
-
RE: Homepage indexation issue
If you can, get that 301 redirection issue solved first, as a 301 is definitely not the type of redirect you want for this behavior. Google specifically recommends, "... to automatically serve the appropriate HTML content to your users depending on their location and language settings. You will either do that by using server-side 302 redirects or by dynamically serving the right HTML content." From here: http://googlewebmastercentral.blogspot.com/2014/05/creating-right-homepage-for-your.html. They go further into the hreflang tags here: https://support.google.com/webmasters/answer/189077?hl=en.
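A sketch of the server-side 302 behavior Google describes, with a hypothetical example.com locale map (a real setup would also honor an explicit user choice over any automatic detection):

```python
def homepage_redirect(accept_language):
    """Pick a localized homepage and a temporary (302) status code.

    The locale-to-URL map and the domain are made up for illustration.
    """
    locales = {
        "fr": "https://www.example.com/fr/",
        "de": "https://www.example.com/de/",
    }
    # Take the primary language tag, e.g. "fr-FR,fr;q=0.9" -> "fr".
    primary = (accept_language or "en").split(",")[0].split("-")[0].lower()
    target = locales.get(primary, "https://www.example.com/en/")
    # 302, not 301: the destination depends on who is asking, so it must
    # not be cached as a permanent move.
    return 302, target

status, url = homepage_redirect("fr-FR,fr;q=0.9")
```

Pairing this with hreflang annotations on each localized version gives Google the full picture of which page serves which audience.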
After getting the 301 cleaned up, for finding indexed pages it's better to use the "site:" search based operator in Google, Google Webmaster Tools, and Analytics. But really, get that 301 changed. Cheers!
-
RE: Should I buy a keyword rich domain to prevent competitors from buying it
If there's been a history of that from other competitors, it's a consideration. Several companies also buy common typo domains--gooogle.com for example--partly as a user-friendliness feature, partly as a competitor buffer. Domains are low cost, so holding a few is pretty low-risk.
-
RE: Does a subdomain benefit from being on a high authority domain?
Rand recently did a whiteboard Friday on this very thing: http://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday, the pertinent part on your question being:
You're asking, "Should I put my content on a subdomain, or should I put it in a subfolder?" Subdomains can be kind of interesting sometimes because there's a lot less technical hurdles a lot of the time. You don't need to get your engineering staff or development staff involved in putting those on there. From a technical operations perspective, some things might be easier, but from an SEO perspective this can be very dangerous. I'll show you what I mean.
So let's say you've got blog.yoursite.com or you've got www.yoursite.com/blog. Now engines may indeed consider content that's on this separate subdomain to be the same as the content that's on here, and so all of the links, all of the user and usage data signals, all of the ranking signals as an entirety that point here may benefit this site as well as benefiting this subdomain. The keyword there is "may."
I can't tell you how many times we've seen and we've actually tested ourselves by first putting content on a subdomain and then moving it back over to the main domain with Moz. We've done that three times over that past two years. Each time we've seen a considerable boost in rankings and in search traffic, both long tail and head of the demand curve to these, and we're not alone. Many others have seen it, particularly in the startup world, where it's very popular to put blog.yourwebsite.com, and then eventually people move it over to a subfolder, and they see ranking benefits.
If at all possible, make it part of the domain in a subfolder.
-
RE: Now that Google will be indexing Twitter, are Twitter backlinks likely to effect website rank in the SERPs?
He also said the same thing about Google+ (see: http://moz.com/blog/google-plus-correlations) with the applicable quote being, "This post caused quite a bit of controversy. Matt Cutts of Google responded to this thread on Hacker News to imply +1s aren't used directly in Google's algorithm.
While I take Matt at his word that Google doesn't use raw +1s to rank webpages, the evidence seems to suggest Google+ posts do pass other SEO benefits not found easily in other social platforms. If this is not the case, I'm hoping Google will clarify."
Right below that, Cyrus pulled in Mark Traphagen's comment, "It is not the +1's themselves that are causing the high rankings of posts but the fact that most +1's on a site result in a shared post on Google+, which creates a followed link back to the post. It's instant organic link building." While the mechanism on Twitter is different, it's somewhat similar, and content that earns a large number of tweets is likely to attract a large number of backlinks as well.
-
RE: What about re-life expired domain ?
There might be some, but if the domain is off brand and not a name you want to use, you're probably better off universally redirecting it to your present domain. It's important to investigate the history of the domain, though. Is it associated with spam, either as a source of spam email or as a spammy link profile? Try to get as much insight in that regard as you can.
-
RE: Art website is being spammed for NFL Jerseys - should I disavow?
Hi Joe. Marie Haynes wrote a nice article on this a couple of months ago (http://moz.com/blog/preparing-for-negative-seo), and example #6 seems to be most applicable:
This was not a competitor trying to hurt my rankings. In fact, the tens of thousands of spammy links that were pointing at my site were actually helping my rankings at that point. What had happened here is that someone had taken advantage of a vulnerability in a Wordpress plugin that had not been updated. They were able to hack into the site and create a whole bunch of new pages. They then pointed huge numbers of spammy links at these pages and redirected them to their Michael Kors affiliate sites.
In this situation, we removed the offending pages, found and fixed the access point, AND I also disavowed all of those links. According to Google, if you get hacked and have bad links pointing to you, you can probably ignore them because their algorithms are good at picking up and just discounting this sort of thing. However, it concerned me that these bad links actually were helping this site. If Google was just discounting them then they should have had no effect. I am 99% sure that I would have been ok to leave them, especially since the pages they pointed to had been removed (which also removes the link pointing to that page), but just to be absolutely sure that something odd didn't affect me with the next Penguin update, I disavowed them all at the domain level.
The whole article is pretty good, so take a look. Best of luck getting it all squared away.
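For reference, the disavow file Google accepts is a plain text file with one URL or `domain:` entry per line and `#` for comments. A minimal hypothetical example following Marie's domain-level approach (the domains below are placeholders, not real spam sources):

```text
# Spammy links created during the hack — disavowed at the domain level
domain:cheap-jerseys-example.com
domain:spam-link-network-example.net
# Individual URLs can also be listed if you don't want to disavow a whole domain
http://directory-example.com/spammy-page.html
```

The file is uploaded through Google's disavow links tool in Search Console rather than placed on your own server.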
-
RE: What about re-life expired domain ?
Ok. If you decide that you're going to buy the domain, once you do, use a 301 redirect to send all former links and references from it to your present domain. Here's a guide on that: http://moz.com/learn/seo/redirection.
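As a sketch of what that looks like on an Apache server (the domain names are placeholders; on nginx or IIS the syntax differs), a site-wide 301 from the purchased domain can be as simple as:

```apache
# .htaccess on the old/purchased domain (hypothetical example)
RewriteEngine On
# Send every request, with its path intact, to the present domain
RewriteRule ^(.*)$ https://www.your-present-domain.com/$1 [R=301,L]
```

The `R=301` flag is what signals a permanent move to search engines; a temporary (302) redirect would not pass the same signals.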
-
RE: Best method for blocking a subdomain with duplicated content
It's very strange for the admin portion of a site not to be behind password protection. If the current developer is unable to add it, I'd press on that, because once you put the admin portion behind password protection, even if it's still indexed in Google it will rank much lower.
Are you able to contact someone at CMS Source to get some support? That might help resolve this as well as provide guidance on getting the robots.txt uploaded and being able to add noindex to admin only.
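As a sketch of that combination, assuming an Apache server and an `/admin/` path (both assumptions — the CMS may use different paths), password protection plus a noindex header could look like:

```apache
# .htaccess inside the /admin/ directory (hypothetical paths)
AuthType Basic
AuthName "Restricted Admin Area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user

# Also tell crawlers not to index anything served from here
# (requires mod_headers to be enabled)
Header set X-Robots-Tag "noindex, nofollow"
```

One caveat worth knowing: if robots.txt blocks `/admin/` entirely, crawlers never fetch the pages and so never see a noindex signal, which is why the header approach above can work better than a robots.txt Disallow for getting already-indexed admin URLs removed.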
-
RE: Improving SEO Structure of a Page
What kind of performance is the page getting currently? Any traffic via google/organic?
-
RE: Good or bad adding keywords in Pinterest description?
You'll want to avoid creating pages that are keyword stuffed that then point back to your site as you'll be creating a page that could become a negative signal as an inbound link to your site, similar to a spammy link on a different domain. Moz covers this pretty well in their Search Engine Myths and Misconceptions here: http://moz.com/beginners-guide-to-seo/myths-and-misconceptions-about-search-engines, specifically, "One of the most obvious and unfortunate spamming techniques, keyword stuffing, involves littering keyword terms or phrases repetitively on a page in order to make it appear more relevant to the search engines. As discussed above, this strategy is almost certainly ineffectual.
Scanning a page for stuffed keywords is not terribly challenging, and the engines' algorithms are all up to the task. You can read more about this practice, and Google's views on the subject, in a blog post from the head of their web spam team: SEO Tip: Avoid Keyword Stuffing." Even with one degree of separation, it's still no benefit, and it's neither a best practice nor a safe one.
-
RE: Representing categories on my site
Hi Mark. Rand's comments here still hold true: http://moz.com/blog/11-best-practices-for-urls, especially these in relation to your question: "Fewer Folders" and "Keep it Short." It looks like you'll be hitting both of those while still maintaining an appropriate amount of keyword usage. Cheers!