I would consider the ROI of the redirect. If a specific page flows a lot of link juice and ranks on terms important to the business, then redirect it to the best matching page. Past that point, the rest won't return enough value for your client to be charged at your hourly rate.
Latest posts made by sprynewmedia
-
RE: 301 redirect recommendations
-
RE: How to Recover From Unstable Site Structure History
I have had direct experience with this issue. Here are a few of the things I have done:
- Make sure the current URLs are rock solid and can be long lived.
- Ensure all links to the old structure are completely purged from the content. No good comes from propping up the old patterns.
- Get a clear picture of the off-site back links. No sense worrying about pages that will have no value. If they changed that fast, there won't be many to worry about.
- For those that have good back links, make a direct 301 redirect to the new page.
- At the point of low ROI, redirect the rest with pattern matches. There could be a couple of double jumps here, but they won't mean much anyway given #3. (Side note: double jumps leak extra link equity, so they should be avoided.)
- Ensure your entire site can be fully and easily spidered - before resorting to xml sitemaps.
- Ensure you have a helpful, even compelling, 404 page that returns the proper status code. 410 errors are reportedly a bit faster at getting content removed, so use them if you can.
- Remove any restrictions you have in the robots.txt file to the old structure until the 404 and 301s take full effect.
- Submit removal requests for pages and folders. This is particularly important if the site is very large (compared to its domain authority) and SEs won't get a full picture of the changes for weeks or months due to a low crawl allowance.
Doing these got my sites back on track within a couple weeks.
EDIT: forgot a couple...
- remove any old xml sitemaps
- submit multiple sitemaps for different sections of the site. This makes it easier to narrow down problem spots.
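The pattern-matching redirect step above can be sketched in Python. The URL patterns and mappings here are hypothetical placeholders, not the structure of any real site; direct mappings for pages with valuable back links take priority over pattern rules, which keeps those URLs from taking a double jump through an intermediate pattern target:

```python
import re

# Hypothetical old-to-new URL patterns; replace with your site's real structure.
REDIRECT_PATTERNS = [
    (re.compile(r"^/old-blog/(\d+)/(.+)$"), r"/blog/\2"),
    (re.compile(r"^/products\.php\?id=(\d+)$"), r"/product/\1"),
]

# Hand-picked 301 targets for pages with good back links (the direct-301 step).
DIRECT_REDIRECTS = {"/old-blog/17/launch-news": "/news/launch"}

def resolve(old_url):
    """Return the final destination for an old URL, or None (serve a 404/410).

    Direct mappings win over pattern matches, so the hand-picked pages
    always redirect in a single hop.
    """
    if old_url in DIRECT_REDIRECTS:
        return DIRECT_REDIRECTS[old_url]
    for pattern, target in REDIRECT_PATTERNS:
        if pattern.match(old_url):
            return pattern.sub(target, old_url)
    return None
```

A `None` result is the signal to fall through to the 404/410 page rather than redirect everything to the home page.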
-
Study regarding Font Size widgets
Has anyone seen credible evidence about the impact of font size widgets? Do/did people use them? Are they moot in a world of full-page zoom functions?
-
RE: What do you when the wrong site links to you?
I have to disagree with the monetization sentiment. One presumes a business plan is based on servicing the core topics of the business. Monetizing off-topic traffic may give you some short-term money, but it does nothing to serve your target audience or long-term growth. EPV has nothing to do with voice overs - unless one of their actors is a ghost. Any pages on his site about EPV would only water down his messaging and confuse his audience. It would be distracting and disingenuous to try to monetize this traffic.
As to the first part, again: outcome of cost/benefit analysis (which should include opportunity cost).
-
RE: What do you when the wrong site links to you?
Which would be the answer to your cost/benefit analysis.
If he's getting traffic from SEs on the wrong topic, the benefit of resolving a few of the more authoritative back links would be worth the time.
-
RE: What do you when the wrong site links to you?
I agree but only after doing a link profile and cost/benefit analysis. Assuming away spam, if a significant number of sites are linking to him on the wrong topic, that would be something I would want to correct.
-
RE: What do you when the wrong site links to you?
This case is a little tricky in that there are degrees of 'wrong'. I'm assuming these are not spam links, just off-topic ones. Here you have to weigh the value of the general link equity against potentially confusing SEs about your site's topic.
Rand just finished a Whiteboard on how important topic is becoming.
If you decide they are hurting, I suggest contacting the site to outline how the link isn't serving their visitors. Even provide a couple of alternative sites that would be better (i.e., make it worth their time).
I would hesitate to use the disavow tool. It's supposed to be a last resort for spam links and that doesn't really apply here.
-
RE: My website has no links to it at all :(
SEOmoz has a number of public tools you can use. After that, a basic membership allows for up to 5 campaigns, so you and your friends could share a membership and get all the benefits of Pro.
A word of caution: don't interlink with your friends' sites unless they are of direct benefit to your customers. Even then, use such links sparingly. The absolute last thing you want to do is anything that feels like cheating. Link exchanges/purchases are frowned upon.
-
RE: My website has no links to it at all :(
Hey James. Further to Colin's message, with a fresh site launch it's time to outline an ongoing content and marketing strategy. It isn't about some magic number of links but about a methodical way you can earn those links on an ongoing basis.
Another easy start is asking partner businesses to give you a bit of link love. You could also consider a press release announcing the new site particularly if it has something special for customers.
Your local phone directory may have a plan that gets you a back link worth your time.
Head over to getlisted.org (now owned by seomoz) and make sure your local profile is as strong as it can be. A car repair shop will really benefit from a strong local showing. Keep on top of any reviews.
After that, Youmoz is full of link building strategies. Select a few and give them a daily or weekly time allowance.
-
RE: If I check in at my sushi joint, am I going to affect their rankings, and end up in a crowded sushi dive?
The phrasing made me chuckle...
Setting the obvious marketing impact of normal conversations aside, one would guess that more positive social interactions would lead to a higher profile.
It wouldn't be a smart business strategy for these sites to keep people from finding hot spots and just continue to order listings by an arbitrary string of letters. (When you think about it, alphabetical order is a really strange UI.)
Given schema.org has clear markup for ratings, I would further assume this data impacts the SERPs, particularly in that location. (This requires that the SE can see the data and that the site has sufficient trust/authority.)
Testing this would be tricky, however, since you are talking about brick-and-mortar places. It would be pretty elaborate, and potentially cruel, to try making a fake restaurant outrank a real one. I would expect check-ins from a single account to have limited effect, or people would be gaming their listings.
Best posts made by sprynewmedia
-
RE: Is a site map necessary or recommended?
I may get chastised for this, but I believe the value of sitemaps is overstated.
All things being equal, I feel they are crutches and band-aids for poor web design/production.
Your site should:
- be easily indexed by all engines
- expose all pages within four to five links of the home page(s)
- utilize thoughtful linking to promote important content in an organic manner
- expose new content on a high value, frequently indexed page (ie the home page) long enough to be found
- be consistent enough that the site will seem similar after one or two passes by Googlebot.
I like sitemaps when big structural changes occur, as the sites heal faster. They're good when lots of pages are only exposed via a long pagination scheme. I also use them to break down parts of a site to expose problem areas (e.g., when a sitemap has 50 links but Google only indexes 25 of them).
But they can be detrimental if they are not maintained properly. If anything changes in the structure, it should be immediately reflected in the sitemap. Lots of automated generators don't consider the robots.txt file, which can cause problems.
For SEOs, adding a sitemap is an easy way to ensure everything is at least looked at without having to touch the actual site.
Advice: yes, use them, but only if you can maintain them properly or can't fix current indexing issues. Over the long haul, however, you should force yourself to act as if they weren't there.
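As a sketch of "maintaining them properly": generate the sitemap from the one list of URLs you actually want indexed, so it can never drift from the real structure. The URLs below are hypothetical, and the output follows the standard sitemaps.org schema:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string from an iterable of canonical URLs.

    Feed this only URLs that are live, canonical, and allowed by robots.txt,
    so the sitemap never contradicts what the spiders actually see.
    """
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(entry, "{%s}loc" % SITEMAP_NS)
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs for illustration.
xml_out = build_sitemap(["https://example.com/", "https://example.com/about"])
```

Regenerating this on every structural change (rather than hand-editing a static file) is what keeps the sitemap from becoming the liability described above.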
-
RE: Moz Rank Moz Trust and Authority higher than my competitor but still getting outranked
Two things jump out -
First, they have far higher authority and trust, which is worth more than rank these days.
Second, these metrics say nothing about the optimization of specific terms. I can outrank a 100 MozTrust/MozRank site on terms it doesn't use.
Start with a comparison of where they get their links for the terms in question then see about earning them yourself. I'm guessing they have a couple really good links from other trusted sites.
-
RE: Footer Links And Link Juice
A no-follow link, in terms of juice, would actually hurt your goals: the link still gets allocated its portion of the juice, but that juice doesn't flow through. **Each no-follow link will siphon off a little juice.**
See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=96569
The effects of navigational links are diminished somewhat, as Google treats them differently compared to content links. To help solidify this, surround the footer with <nav></nav> tags.
Review: http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links #5
Generally, remove any site wide links that aren't always needed and place them on page where users would like the details. For instance, use a search form instead of a link to the search page.
-
RE: Block an entire subdomain with robots.txt?
Fact is, the robots file alone will never work (the link has a good explanation why - short form: all it does is stop the bots from crawling the pages again; it doesn't remove what is already indexed).
Best to request removal then wait a few days.
-
RE: Duplicate Content - What's the best bad idea?
This applies to Google Mini or Search Appliance which are custom search tools for an individual website.
They allow site owners to sculpt the indexing of their private setups.
Adwords also has something to help indicate the important content for determining the page topic for relating ads.
However, they don't apply to Googlebot spidering as mentioned above.
-
RE: How to find page with the link that returns a 404 error indicated in my crawl diagnostics?
If you download the crawl diagnostics CSV report, you can see one referrer for each problem URL.
I have better success using Google Webmaster Tools. You get the information "straight from the horse's mouth" as it were.
- Log in to https://www.google.com/webmasters/
- Go to Health > Crawl Errors
- Click "Not Found"
- For each error listed, click the URL
- In the pop-up window, the third tab lists the pages that link in.
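For the CSV route, a minimal sketch of grouping broken URLs by the page that links to them; the column names here are assumptions, so check them against your actual export before relying on this:

```python
import csv
import io
from collections import defaultdict

def group_404s_by_referrer(csv_text):
    """Map each referring page to the broken URLs it links to.

    Assumes columns named 'URL' and 'Referrer'; real exports may differ.
    """
    broken_by_referrer = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        broken_by_referrer[row["Referrer"]].append(row["URL"])
    return dict(broken_by_referrer)

# Tiny hypothetical export for illustration.
sample = "URL,Referrer\n/gone,/blog/post-1\n/missing,/blog/post-1\n"
report = group_404s_by_referrer(sample)
```

Sorting the result by the number of broken links per referrer tells you which pages to fix first.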
-
RE: My website has no links to it at all :(
Hey James. Further to Colin's message, with a fresh site launch it's time to outline an ongoing content and marketing strategy. It isn't about some magic number of links but about a methodical way you can earn those links on an ongoing basis.
Another easy start is asking partner businesses to give you a bit of link love. You could also consider a press release announcing the new site particularly if it has something special for customers.
Your local phone directory may have a plan that gets you a back link worth your time.
Head over to getlisted.org (now owned by seomoz) and make sure your local profile is as strong as it can be. A car repair shop will really benefit from a strong local showing. Keep on top of any reviews.
After that, Youmoz is full of link building strategies. Select a few and give them a daily or weekly time allowance.
-
RE: ‘80-90% of SEO already done for you in Wordpress’ Am I missing something?
WordPress can get very fast once you properly configure a cache plugin. You could even use CloudFlare.com to enjoy some great CDN enhancements for very cheap.
This assumes you have reliable and speedy hosting - a constant of all websites. It also assumes you are careful about optimizing the images and don't load the page with megabytes of JS libraries/plugins.
(edit note: In my experience cloudflare makes a significant impact if you are on a slower, shared hosting plan. On a pro level host, it isn't as beneficial)
-
RE: Block an entire subdomain with robots.txt?
Sounds like (from other discussions) you may be stuck requiring a dynamic robots.txt file which detects which domain the bot is on and changes the content accordingly. This means the server has to run all .txt files as (I presume) PHP.
Or, you could conditionally rewrite the /robots.txt URL to a different file according to sub-domain:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$
RewriteRule ^robots\.txt$ robots-subdomain.txt
Then add:
User-agent: *
Disallow: /
to the robots-subdomain.txt file.
(untested)
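The dynamic-robots.txt idea can also be sketched application-side, returning different content per hostname instead of rewriting at the web-server level. The hostnames here are placeholders:

```python
# robots.txt bodies: block everything vs. allow everything.
BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

def robots_for_host(host):
    """Return the robots.txt body for a request's Host header.

    Blocks all crawling on the staging subdomain (placeholder name);
    allows it everywhere else.
    """
    if host.lower() == "subdomain.website.com":
        return BLOCK_ALL
    return ALLOW_ALL
```

Whatever framework serves the site would route /robots.txt to this function and return the string with a text/plain content type.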
-
RE: Block an entire subdomain with robots.txt?
Option 1 could come with a small performance hit if you have a lot of .txt files being served.
There shouldn't be any negative side effects to option 2 if the rewrite is clean (i.e., not accidentally a redirect) and the content of the two files is robots compliant.
Good luck