I would consider the ROI of each redirect. If a specific page flows a lot of link juice, on terms important to the business, then redirect it to the best matching page. Past a certain point, the rest won't return enough value to justify billing your client at your hourly rate.
Posts made by sprynewmedia
-
RE: 301 redirect recommendations
-
RE: How to Recover From Unstable Site Structure History
I have had direct experience with this issue. Here are a few of the things I have done:
- Make sure the current URLs are rock solid and can be long lived.
- Ensure all links to the old structure are completely purged from the content. There is no good in propping up the old patterns.
- Get a clear picture of the off-site back links. No sense worrying about pages that will have no value. If they changed that fast, there won't be many to worry about.
- For those that have good back links, make a direct 301 redirect to the new page.
- At the point of low ROI, redirect the rest on pattern matches. There could be a couple of double jumps here but they won't mean that much anyway given #3. (Side note: double jumps leak extra link equity, so they should be avoided.)
- Ensure your entire site can be fully and easily spidered - before resorting to xml sitemaps.
- Ensure you have a helpful, even compelling, 404 page that returns the proper status code. 410 responses are reportedly a bit faster at getting content removed, so use them if you can.
- Remove any restrictions you have in the robots.txt file on the old structure until the 404s and 301s take full effect.
- Submit removal requests for pages and folders. This is particularly important if the site is very large (compared to domain authority) and SEs won't get a full picture of the changes for weeks or months due to a low crawl allowance.
Doing these got my sites back on track within a couple weeks.
EDIT: forgot a couple...
- remove any old xml sitemaps
- submit multiple sitemaps for different sections of the site. This makes it easier to narrow down problem spots (see the sketch below).
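For example, a minimal sitemap index (file names and domain are hypothetical) keeps the sections separate while giving the engines a single entry point:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical example: one child sitemap per major section of the site -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>

Comparing submitted vs. indexed counts for each child sitemap in GWMT shows you which section is struggling.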
-
Study regarding Font Size widgets
Has anyone seen credible evidence about the impact of font-size widgets? Do (or did) people actually use them? Are they moot in a world of full-page zoom functions?
-
RE: What do you when the wrong site links to you?
I have to disagree with the monetization sentiment. One presumes a business plan is based on servicing the core topics of the business. Monetizing off-topic traffic may give you some short-term money, but it would do nothing to serve your target audience and long-term growth. EPV has nothing to do with voice overs - unless one of their actors is a ghost. Any pages on his site about EPV would only water down his messaging and confuse his audience. It would be distracting and disingenuous to try to monetize this traffic.
As to the first part, again: outcome of cost/benefit analysis (which should include opportunity cost).
-
RE: What do you when the wrong site links to you?
Which would be the answer to your cost/benefit analysis.
If he's getting traffic from SEs on the wrong topic, the benefit of sorting out a few of the more authoritative back links would be worth the time.
-
RE: What do you when the wrong site links to you?
I agree but only after doing a link profile and cost/benefit analysis. Assuming away spam, if a significant number of sites are linking to him on the wrong topic, that would be something I would want to correct.
-
RE: What do you when the wrong site links to you?
This case is a little tricky in that there are degrees of 'wrong'. I'm assuming these are not spam links, just off-topic ones. Here you have to weigh the value of the general link equity against potentially confusing SEs about your site's topic.
Rand just finished a Whiteboard on how important topic is becoming.
If you decide they are hurting, I suggest contacting the site to outline how the link isn't serving their visitors. Even provide a couple of alternative sites that would be a better fit (i.e., make it worth their time).
I would hesitate to use the disavow tool. It's supposed to be a last resort for spam links and that doesn't really apply here.
-
RE: My website has no links to it at all :(
SEOmoz has a number of public tools they can use. After that, a basic membership allows for up to 5 campaigns so you and your friends could share a membership and get all the benefits of Pro.
A word of caution: don't interlink with your friends' sites unless they are of direct benefit to your customers. Even then, use sparingly. The absolute last thing you want to do is anything that feels like cheating. Link exchanges/purchases are frowned upon.
-
RE: My website has no links to it at all :(
Hey James. Further to Colin's message, with a fresh site launch it's time to outline an ongoing content and marketing strategy. It isn't about some magic number of links but a methodical way you can earn those links on an ongoing basis.
Another easy start is asking partner businesses to give you a bit of link love. You could also consider a press release announcing the new site particularly if it has something special for customers.
Your local phone directory may have a plan that gets you a back link worth your time.
Head over to getlisted.org (now owned by seomoz) and make sure your local profile is as strong as it can be. A car repair shop will really benefit from a strong local showing. Keep on top of any reviews.
After that, Youmoz is full of link building strategies. Select a few and give them a daily or weekly time allowance.
-
RE: If I check in at my sushi joint, am I going to affect their rankings, and end up in a crowded sushi dive?
The phrasing made me chuckle...
Setting aside the obvious impact of normal conversations on marketing, one would guess more positive social interactions would lead to a higher profile.
It wouldn't be a smart business strategy for these sites to keep people from finding the hot spots and just continue ordering listings by an arbitrary string of letters. (When you think about it, alphabetical order is a really strange UI.)
Given schema.org has clear markup for ratings, I would further assume this would impact the SERPs, particularly in that location. (This requires that the SE can see the data and that the site has sufficient trust/authority.)
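As a rough sketch of what I mean (microdata syntax, all values hypothetical), a listing page could expose its check-in ratings like this:

<div itemscope itemtype="http://schema.org/Restaurant">
  <span itemprop="name">Hypothetical Sushi Bar</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    from <span itemprop="reviewCount">27</span> reviews
  </div>
</div>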
Testing this would be tricky, however, since you are talking about brick-and-mortar places. It would be pretty elaborate and potentially cruel to try making a fake restaurant outrank a real one. I would expect check-ins from a single account to have limited effect, or people would be gaming their listings.
-
RE: How can I find and fix my broken links
Funny, there was just a question about this a few hours ago (hint, hint SEOmoz team). Here was my advice...
The SEOmoz report will give you one referrer for each 404 error. I've had better success and more complete information using Google Webmaster Tools. You get the information "straight from the horse's mouth" as it were.
- Log in to https://www.google.com/webmasters/
- Go to Health > Crawl Errors
- Click "Not Found"
- For each listed, click the URL
- In the pop up window, the third tab will list the pages that link in.
-
RE: How to find page with the link that returns a 404 error indicated in my crawl diagnostics?
If you download the crawl diagnostics CSV report, you can see one referrer for each problem URL.
I've had better success using Google Webmaster Tools. You get the information "straight from the horse's mouth" as it were.
- Log in to https://www.google.com/webmasters/
- Go to Health > Crawl Errors
- Click "Not Found"
- For each listed, click the URL
- In the pop up window, the third tab will list the pages that link in.
-
RE: How can I change the page title "two" (artigos/page/2.html) in each category ?
Page two correctly uses:
rel="prev" href="http://www.buffetdomicilio.com/category/artigos" />
Which indicates to Google it is part of a pagination scheme thus it handles it differently to proper content.
You really don't want Google sending traffic to these pages but instead to the actual articles. In that case, optimizing the title or reducing the duplication isn't such a big deal. The title is just one small part of otherwise different pages, so you won't get hit with a penalty.
I wouldn't use the canonical link suggestion, as this will effectively remove the pages from the index. I assume you want these pages for spidering, so anything that removes them could be an issue.
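For reference, a minimal sketch of the pagination markup in the head of page two (the rel="next" URL is assumed from your /page/ pattern, so adjust it to the real page-three address):

<link rel="prev" href="http://www.buffetdomicilio.com/category/artigos" />
<link rel="next" href="http://www.buffetdomicilio.com/category/artigos/page/3" />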
-
RE: Https subdomain campaign
HTTP vs. HTTPS is immaterial from a campaign point of view. The 's' only means the session is encrypted, not that the content is different.
Rogerbot's purpose is to mimic Googlebot, Bingbot etc. It will have the same access as the other public bots.
The only time it will have an issue is if it gets 804 - HTTPS (SSL) errors due to an improper setup. But you'll know about that well before the campaign gets started.
Now, you could be doing some server-side tricks to make the content different once the session is secure. In that case, link and/or redirect to the secure URL and the bots will follow.
-
RE: Is a site map necessary or recommended?
This is a strong indicator something is up and deserves deeper investigation.
Perhaps you have content duplication issues, low-value content (Panda), spammy back links (Penguin) or another indexing issue. See if there is a pattern to the missing pages; perhaps one of the directories is the cause. How old is the site, and how is the domain trust/authority coming along?
-
RE: Duplicate Content
Before you delete anything, submit a ticket to the help desk. They may be able to fix it up or point to why it's happening.
-
RE: Duplicate Content
Double check the redirection here: http://redirectcheck.com. Ensure it's a nice HTTP/1.1 301 Moved Permanently followed by HTTP/1.1 200 OK. Do this for a couple of pages just to be sure.
Next, what settings did you use in the campaign?
-
RE: Is a site map necessary or recommended?
A stale or poorly created sitemap can hurt in the following ways:
- long-lived 404 pages - deleted pages continue to be indexed if they are not removed from the sitemap.
- using up Google's crawl allowance - if 404 and low-value pages are included, Googlebot will use up valuable crawl allowance on them instead of covering more of your important content.
- links to private areas - depending on how the map is created, the tool may not be smart enough to exclude administration or community pages that you don't want in the index.
- inclusion of noindex pages - a couple of methods (such as a robots.txt update after a sitemap is created) will include noindexed pages, which is a technical problem. I'm not 100% sure of the impact, but I could see this being a quality indicator.
- creating distracting work - maintaining sitemaps, particularly semi-manual ones from Xenu etc., sucks up time better spent improving your indexability or earning back links.
However, all of these are easily avoidable with a solid approach and/or good server side tools.
-
RE: Moz Rank Moz Trust and Authority higher than my competitor but still getting outranked
Wait, I misread your question and didn't go far enough to the right. Please ignore the trust/rank statement.
However, this still doesn't reflect on the specific keyword optimization. Many other things are at play there.
-
RE: Moz Rank Moz Trust and Authority higher than my competitor but still getting outranked
Two things jump out -
First, they have far higher authority and higher trust, which are worth more than rank these days.
Second, this says nothing about the optimization of specific terms. I can outrank a 100 Mt/Mr site on terms they don't use.
Start with a comparison of where they get their links for the terms in question then see about earning them yourself. I'm guessing they have a couple really good links from other trusted sites.
-
RE: Is a site map necessary or recommended?
I may get chastised for this, but I believe the value of sitemaps is overstated.
All things being equal, I feel they are crutches and band-aids for poor web design/production.
Your site should:
- be easily indexed by all engines
- expose all pages within four or five links of the home page(s)
- utilize thoughtful linking to promote important content in an organic manner
- expose new content on a high value, frequently indexed page (ie the home page) long enough to be found
- be consistent enough that the site will seem similar after one or two passes by Googlebot.
I like sitemaps when big structural changes occur, as the site heals faster. They're good when lots of pages are only exposed via a long pagination scheme. I also use them to break down parts of a site to expose problem areas (i.e., when a sitemap has 50 links but Google only indexes 25 of them).
But they can be detrimental if they are not maintained properly. If anything changes in the structure, it should be immediately reflected in the sitemap. Lots of automated generators don't consider the robots.txt file, which can cause problems.
For SEOs, adding a sitemap is an easy way to ensure everything is at least looked at without having to touch the actual site.
Advice: yes, use them, but only if you can use them properly or can't fix current indexing issues. Over the long haul, however, you should force yourself to think of them as not being there.
-
RE: ‘80-90% of SEO already done for you in Wordpress’ Am I missing something?
Wordpress can get very fast once you properly configure a cache plugin. You could even use CloudFlare.com to enjoy some great CDN enhancements very cheaply.
This assumes you have reliable and speedy hosting - a constant for all websites. It also assumes you are careful about optimizing the images and don't load the page with megabytes of JS libraries/plugins.
(edit note: In my experience cloudflare makes a significant impact if you are on a slower, shared hosting plan. On a pro level host, it isn't as beneficial)
-
RE: ‘80-90% of SEO already done for you in Wordpress’ Am I missing something?
Out of the box, WP handles on-page optimization fairly well, especially for the "SEO unaware". Titles and descriptions are customized from article content and the entire site can be easily indexed. The pages tend to be lightweight and thus load very fast. From there, it depends on the theme and plugins.
But that is far from 80-90% of SEO. It isn't even 80-90% of on-page optimization, which is highly dependent on the content and content category.
As a blanket statement, it is false. If all they are saying is the sites will be easily indexed, sure.
-
RE: No internal links showing up in OpenSiteExplorer report
The Mozbar Page Attributes indicates:
| Internal Followed Links | 276 |
| Internal Nofollow Links | 60 |
| External Followed Links | 14 |
| External Nofollow Links | 14 |
But Link Data says 0 internal links. Curious.
PS, cute robots.txt file.
-
RE: Why are the bots still picking up so many links on our page despite us adding nofollow?
Whoa! Your view state is HUGE (That's what she said).
I couldn't decode it, but somewhere along the line the programmer didn't turn off session management and, likely, an entire copy of the page is encoded in the view state. This is causing load speed issues.
-
RE: Why are the bots still picking up so many links on our page despite us adding nofollow?
Your meta tags are in more trouble than your link count:
<meta id="MetaDescription" name="DESCRIPTION" content="Page Details" />
AND
<meta name="Description" />
I see you are using DNN: what version and what module are you using? There are a ton of things one can do to DNN to make it SEO enhanced.
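At a minimum, you want a single description tag with real copy, something along these lines (the wording is just a placeholder):

<meta name="description" content="A one or two sentence summary of this page, written for searchers rather than 'Page Details'." />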
-
RE: Why are the bots still picking up so many links on our page despite us adding nofollow?
My suggestion is to try AJAXing the tabs. If the outbound links are more of a concern than the keywords of the links, AJAX loading of the tab content would remove them from consideration. Google won't index content pulled in from an external source.
However, be careful to put a rel="nofollow" on the link that loads the content as you don't want SEs indexing the source.
Do not put a meta nofollow in the head, it will kill all links on the page and seriously mess up your link flow. Your use of rel="nofollow" is correct in the context of the specific link tags.
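To be explicit, the per-link form is the one you want; the page-wide meta version is the one to avoid (the /tab-content URL is just a placeholder):

<!-- Fine: nofollow only the link that pulls in the tab content -->
<a href="/tab-content" rel="nofollow">Specifications</a>

<!-- Avoid: this nofollows every link on the page and wrecks link flow -->
<meta name="robots" content="nofollow" />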
I wouldn't sweat the sheer number of links - the 100 count is a leftover from the days when spiders only downloaded 100k from a page. It has since risen to the point that the practical limitations of over 100 links are more pressing (i.e., do your visitors actually value and use that many links?).
If each link is valuable and usable, no need to worry. If not, perhaps there is a structural way to reduce the count.
Also, load the footer by AJAX onscroll or on demand. Assuming all of the pages can be found in the top navigation, the bottom links are just exacerbating your issues. Primarily, this section is giving far too much weight to secondary or auxiliary pages.
For instance, your Privacy Policy only needs to be linked to where privacy is a concern (ie the contact form). Good to put it on the home or about pages too if you have a cookie policy.
-
RE: 23000 Links are not found- Should I redirect them?
Track the back links to the pages. Any with good links should be 301 redirected to a closely matched page.
For the rest, clean up the server response and make sure it returns a firm 404 or 410 (Gone).
Hopefully these are grouped in a directory; then you can use GWMT to request that the directories be removed.
-
RE: Duplicate Sub-domains Being Indexed
If you don't want these indexed, first put a noindex tag on all pages. Leave the follow alone as the engine still needs to find the pages to change the index status.
Add the domain to GWMT, then request removal of all the pages.
Allow this to take effect then add a robots disallow to the entire sub-domain.
Your sub-domain will then be cleaned from the index and the duplication won't be an issue.
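Step one in markup form is a single tag on every page of the duplicate sub-domain, something like:

<meta name="robots" content="noindex, follow" />

Only add the robots.txt disallow once those pages have dropped out; if you block crawling first, the engines never get to see the tag.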
-
RE: Two homepage urls
If you can use server side programming to change the homepage, this would be better for your users and optimization. Just be sure to expose enough content on each for Google to spider and index properly.
Yes, 301 redirect the old night home page. In fact, redirect all homepage requests to just http://www.domain.com/
The canonical link can be removed if on the domain root.
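If a server-side 301 isn't possible for one of the variants, a canonical on that variant pointing at the root is the usual fallback, e.g.:

<link rel="canonical" href="http://www.domain.com/" />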
-
RE: Is there such a thing as to many 301 redirects?
Chaining multiple 301s would be an issue (what I worry about above) but this sounds like the right process in migrating platforms. If the old pages didn't have back links, you will be able to remove the 301s.
You could create a million redirect rules but search engines will only care about the ones they find through back links.
Perhaps your new platform is capable of simpler URLs? Keywords in URLs and domains have lost their power, so it might be better to create shorter, more stable URLs. A format like:
/products/product-name
would be ideal depending on category attributes. If a product can be in two categories, this URL won't create content duplication issues.
-
RE: Is there such a thing as to many 301 redirects?
301 redirects dissipate link equity in the same way a link does. That means each one is slowly eating into the equity (PR).
Any that are created systemically would be bad design but it sounds more like this occurs when you rename a product?
In that case, why are the names changing so often? Perhaps there is a better business process you could follow to have longer-lasting naming? Change the page title all you like, but the URL should be very long-lasting by design.
-
RE: Pin It Button, Too Many Links, & a Javascript question...
This test shed a little light on what typically gets indexed: http://www.seomoz.org/ugc/can-google-really-access-content-in-javascript-really
-
RE: Pin It Button, Too Many Links, & a Javascript question...
Loading links via JS is a fairly standard technique. (See http://sharethis.com/ or http://www.addthis.com/.) Google will index some JS-created content, so you may have to delay the link tag creation until a mouseenter event to get the desired effect.
Added bonus: using well written JS code can lighten the code weight of the page allowing it to load faster. Currently, each Pin icon contains a div, a link and an image tag. If you use prototyping, JS can replicate all this content from the attributes of the primary image tag very quickly. (I see you load jQuery so this task is very easy to accomplish)
Also, move the rel="words" on the link into the img tag as an alt attribute. Currently the images lack alt attributes, which isn't the best. Using keywords in the rel attribute isn't correct: it is supposed to mark up the relationship between items, and "Stacked Stone Panels" isn't a relationship. You may have been thinking of the title attribute.
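In other words, something along these lines (the file name and pin URL are placeholders):

<a href="http://pinterest.com/pin/create/button/" title="Pin it">
  <img src="stacked-stone-panels.jpg" alt="Stacked Stone Panels" />
</a>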
Next, you are loading WAY too many resource files (mainly JS), and a few of them twice. Try combining them into a few minified files. There is a lot of work that could be done to speed up the site: http://www.webpagetest.org/result/130320_PT_12RV/ shows over 25 seconds to load.
Think about making a sprite of the images, it would save a ton of requests and downloads. Also, pagination, if done correctly, could save a lot of time.
-
RE: 404's in WMT are old pages and referrer links no longer linking to them.
How long ago did you switch platforms? It can take months for Google to come back around to a page that linked to your site. Pages on your site will stay in the cache for a few passes.
When you switched, did you do any 301 redirects? Examine the back links to your domain - any that come from good pages should be redirected to the new URL. If not, they will be scooped up by active SEOs (finding 404 links is a popular link-building technique).
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633
If you know the links will be dead forever, try using a 410 response as it is supposed to make search engines drop the page faster.
http://www.seroundtable.com/404-410-google-15225.html (bottom)
Have you requested Google remove old directories/pages? If the content is gone and has no back links, try a removal request.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
RE: Should We Switch from Several Exact Match URLs to Subdomains Instead?
In the video, he directly clarifies 301 redirects dissipate PR.
However, I understand the site move tool is supposed to mitigate this faster.
http://www.youtube.com/watch?v=UU3xyhCXP9Q
I haven't found much credible evidence one way or the other however as it's probably very difficult to test.
To be a paranoid conservative, I'd guess it doesn't transfer 100% PR and any domain migrations must have very strong justifications as there will be an SEO cost.
-
RE: Product Schema - Not supported if you don't actually sell the product online...
I took this to mean that they won't show a snippet unless they find all the ingredients - not that you "can't" so much. I still did the mark up exercise because at least it was a start in communicating the nature of the page content.
I'd love to hear what others think however.
-
RE: Do contextual links hold more weight?
Ironically, I still have this resource in my CnP memory - this should help you understand link values better:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
To answer more directly: probably yes given my assumptions of the placement and coding of the byline.
-
RE: #1 with grade F page optimization
"Domain authority 82"
There's your answer right there - that's a fantastic link. One to cultivate and track. Hopefully you can be a resource to the domain again sometime. Perhaps the topic can be updated year by year etc.
-
RE: #1 with grade F page optimization
How do the back link profiles compare? One really good link for keyword 1 could be enough to propel it to the top. Alternatively, perhaps it gets better treatment by your internal link structure.
Another thought: competition is usually a measure of how many pages are returned for a search but it doesn't get into how well optimized those pages are for a given term. It also doesn't measure the authority and trust of the domains. It could be the competition for the first keyword is low value domains and unoptimized pages.
-
RE: Most significant link building factor
This should help you:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
All things being equal, more domains are better than more links. But I agree with Kevin, it really depends on the trust and authority of the domains. Two links from an authoritative and trusted site are going to be worth 100 low-value domains, perhaps more.
-
RE: Duplicate Content - What's the best bad idea?
This applies to Google Mini or Search Appliance which are custom search tools for an individual website.
They allow site owners to sculpt the indexing of their private setups.
Adwords also has something to help indicate the important content when determining the page topic for related ads.
However, they don't apply to Googlebot spidering as mentioned above.
-
RE: Are the menus created by Locu crawlable?
Out of interest, I tried Bing and the searches failed.
One more reason to add a plain text version in the noscript tag.
-
RE: Are the menus created by Locu crawlable?
Could be. They could also be linked to on those phrases from other sites.
So I tested a different string from both menus:
"goat bucheret, carmody, dry aged jack, pt." -> success
and
"Satur Farms Green Salad" -> success
Perhaps you can confirm with your own test but it appears the claim is true.
However, as a backup, it couldn't hurt to include noscript content since that's literally the purpose of the tag. Just remember to maintain the content (see the sketch below).
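A minimal sketch of that fallback (the container id is a placeholder, and the list should mirror your real menu text):

<div id="locu-menu">
  <!-- Locu's JS widget renders the menu here -->
</div>
<noscript>
  <ul>
    <li>Satur Farms Green Salad</li>
    <li>Goat bucheret, carmody, dry aged jack</li>
  </ul>
</noscript>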
-
RE: Are the menus created by Locu crawlable?
If they claim it, ask them to back it up with a real example or two. Then copy what they did (i.e., a noscript link, perhaps?).
-
RE: Should We Switch from Several Exact Match URLs to Subdomains Instead?
You would have to employ the GWMT's Change of Address feature to not have link equity dissipate through the 301s, no?
-
RE: Biggest Benefit for Footer Links "Created by ___?"
Another tactic one client used was a website credits page. It had full-priority content dedicated to the authors, designers and developers of the site. This link could be treated as higher priority than small footer text. Bonus: the page would likely be more topical to your business. Not for everyone, however.
-
RE: Do 410 show in the 404 not found section in Google Webmaster Tools?
Given the spec, a 410 for truly gone pages will be better regardless of reporting or Google.
http://googlewebmastercentral.blogspot.ca/2011/05/do-404s-hurt-my-site.html
Sadly, from that page: "Currently Google treats 410s (Gone) the same as 404s (Not found), so it’s immaterial to us whether you return one or the other."
This is direct non-compliance, which I can only guess is due to people unwittingly using the code incorrectly. Then again, they have allowed little errors to completely wipe out a site before.
There has been a suggestion from other engineers that it will reduce the number of Googlebot retries, and 410'd pages will take longer to be re-indexed if they do reappear. http://productforums.google.com/d/msg/webmasters/i70G2ZAhLmQ/neKEH4spacUJ
EDIT: oops, your question. It seems Google reports all the errors it finds, but I personally cannot attest to seeing a 410. Here is the list of errors in the GWMT help section: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=40132
410 is on the list.
-
RE: Duplicate Content - What's the best bad idea?
If the descriptions are very technical then likely there is a fair amount of repetition in the sentence pattern, diction etc. I'd recommend playing with regex to help transform content into something original.
For instance, you could search for industry abbreviations like CW and replace them with the long form: Clockwise (CW). Maybe they overuse an adjective that you could change to your own voice.
Also, perhaps the stock descriptions have blocks of useless content you could strip out in the meantime?
The DB probably has a few other fields (name, product attributes etc) so be sure to find a unique way of assembling the meta description, title and details.
If you find enough to change, I'd think having the description would be better than having a page that is too light on words.
Be sure to mark it up with http://schema.org/Product so SEs understand the nature of the content (a bare-bones sketch is below).
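Something like this microdata, with the values assembled from your DB fields (all text and file names are placeholders):

<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Product name from the DB</h1>
  <img itemprop="image" src="product-photo.jpg" alt="Product name from the DB" />
  <p itemprop="description">The reworked, de-duplicated description goes here.</p>
</div>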
EDIT: I have used the regex technique to enhance the content of a database by adding inline tooltips, diagrams or figures, and glossary links. However, with Penguin, I would be careful with automated links. You would only want to create a handful using the same anchor text.
EDIT2: I forgot - MAKE FREQUENT BACKUPS. Regex is super powerful and can tank a database really fast. Make a backup of the original and of every successful iteration - it will take a little longer, but it will save your butt when things go bad.
-
RE: Footer Links And Link Juice
A nofollow, in terms of juice, would actually hurt your goals, as the link still gets allocated its portion of juice but doesn't pass it on. **Each nofollow link will siphon off a little juice.**
See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=96569
The effects of navigational links are diminished somewhat as Google treats them differently compared to content links. To help solidify this, surround the footer with
<nav></nav>
tags.
Review: http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links #5
Generally, remove any site-wide links that aren't always needed and place them on the pages where users would want the details. For instance, use a search form instead of a link to the search page (sketched below).
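Putting those two suggestions together, a trimmed footer might look something like this (URLs and labels are placeholders):

<nav>
  <form action="/search" method="get">
    <input type="search" name="q" placeholder="Search the site" />
    <button type="submit">Search</button>
  </form>
  <a href="/contact">Contact</a>
  <!-- The Privacy Policy link lives on the contact page instead of site-wide -->
</nav>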