Hi Kristina - sorry for the long delay!
I definitely wouldn't go with sub.subdomain.rootdomain.edu - I'd go with a subfolder and hopefully, something as close to the root/main domain as possible.
There are thousands of bloggers who write about jewelry: http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=jewelry+inurl%3Ablog
There's cool stuff like this:
http://www.huffingtonpost.com/2010/01/20/geeky-jewelry-the-nerdies_n_429408.html
http://mashable.com/2010/08/05/geek-jewelry/
Videos with hundreds of thousands of views: http://www.youtube.com/results?search_type=videos&search_query=jewelry&search_sort=video_view_count&suggested_categories=26%2C10%2C24
There's no niche, IMO, that's too boring or too stale for great content.
Fastest way?
Create content (research, data, tools, apps, infographics, presentations, video, etc.) that's so amazingly remarkably badass good that everyone on the Internet can't help but share it everywhere.
Think like these guys:
What they have in common is that they're all content/resource hubs. It's hard to spend 10 minutes browsing them without finding something you want to share with people you know.
That kind of content creation and then the finding/building of a community who loves it is the essence of great web marketing AND the fastest way I know to build domain authority.
Richard - most likely, you're correct. PA, however, is an amalgamation of a lot of metrics, combined via machine learning against Google's SERPs to produce the metric that correlates most highly with rankings. Thus, it could well be the case that this move doesn't boost PA (and PA itself recalculates/recalibrates with each index).
However, I wouldn't be too worried about PA - I would be worried about potential rankings and traffic, and it sounds like your solution should help with that (if you're combining like/duplicate content).
Certainly possible that it was a small algo shift, but unless lots of other sites fell in the rankings, too, I'd say the fresh boost/loss is the more likely culprit in this case.
BTW - you might want to check out this slide deck for some ideas on growing content/links/inbound marketing: http://www.slideshare.net/randfish/inbound-marketing-for-startups-in-2011
"Links" is a bit narrow - I'd say citations of any type. We've seen examples where having a site's brand name mentioned in a news article led to a flurry of crawling and higher rankings. It's also the case that adding new content to your site, making it a better-converting site, improving usability, design, navigation, etc. can all have positive impacts on your ultimate goals and on SEO, too - http://www.seomoz.org/blog/the-next-generation-of-ranking-signals
No - definitely not. I'm saying that for competitive search results, having a good site and good content isn't enough. You need to show the engines that you're also an important, well-referenced resource and that means citations (links, tweets, shares, mentions, news articles, local citations, etc).
Links are important, but link building without the basics of a good site and business (which Bitabliss.com seems to have) is folly.
Hi Shawn - it looks to me like the biggest issue is a lack of quality links. I think what may have happened is that your on-site work, relaunch and initial link building gave you a short-term boost that's now ending. I looked through your links at Yahoo! and with OpenSiteExplorer (the latter is usually slower in finding new ones), and there's only a very small number, and from only a few sites.
I'd work on earning those citations from quality sources and over time, you'll likely regain some of those rankings. Many SEOs have, in the past, observed something akin to a "fresh boost" and I suspect that's what happened here.
BTW - this link building tip in particular might be useful - http://www.seomoz.org/blog/one-dead-simple-tactic-for-better-rankings-in-google-local - in addition to the local links, you can likely get lots of good direct links, too.
If the content is simply duplicated, then moving it over with a 301 redirect is the best way to go. If, however, you've got separate, unique content, you may need to export those posts (or copy + paste) from the old WordPress install into the new one before the shift, or you could lose old content from the prior install.
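For the redirect piece, something like this in the old site's .htaccess usually does the trick - just a minimal sketch, and the paths here are placeholders rather than your actual URLs:

```apache
# Hypothetical paths - swap in your real old/new URL structure.
# Goes in the .htaccess file at the root of the old install (Apache).

# Permanently (301) redirect a single moved post:
Redirect 301 /old-blog/my-post/ http://www.example.com/blog/my-post/

# Or catch the whole old blog directory with one pattern:
RedirectMatch 301 ^/old-blog/(.*)$ http://www.example.com/blog/$1
```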
I'd probably recommend contracting a WordPress developer you trust or who has some experience (many contractors can be found on sites like oDesk, Freelancer, etc.) and asking them to help with the file move if the content isn't duplicated.
Best of luck!
I would be very hesitant to do something like this. If you want to link to something, use a followed link. If you don't trust it, don't link to it. Worrying about "reciprocity" of links in this fashion shouldn't be a concern for users or for search engines. Websites and pages link back and forth to each other all the time - it's a natural activity. It only gets troublesome when you intentionally try to manipulate rankings using "reciprocal link lists" (and I think that's why the word "reciprocal" in linking/SEO has become so maligned and misunderstood).
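For reference, the only difference between a followed and a nofollowed link is the rel attribute (example.com here is just a placeholder):

```html
<!-- A normal, followed link - passes link equity: -->
<a href="http://www.example.com/">Example Site</a>

<!-- A nofollowed link - asks engines not to pass equity: -->
<a href="http://www.example.com/" rel="nofollow">Example Site</a>
```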
This WB Friday might help - http://www.seomoz.org/blog/whiteboard-friday-sitewide-reciprocal-and-directory-links
It's a good suggestion - mostly it's just a resource limit on our side (retrieving lots of rankings for lots of customers from lots of engines is challenging). We'll certainly look into it, though.
It really depends on your goals from here. Are you looking to boost traffic from social sites? If so, you're going to want to learn what your audience wants and start creating it, sharing it and participating in conversations on these platforms to earn their trust and engagement.
If your goal is to provide customer service and reputation tracking/management, you can set up alerts through services like Google Alerts (www.google.com/alerts) and many other services (this Quora thread is quite helpful on tools that can assist on that front: http://www.quora.com/Are-there-any-free-tools-available-to-track-Twitter-mentions-and-ideally-sentiment)
If you're seeking to boost SEO rankings using social media, you'll need to build up an engaged contingent of friends/followers who will share/like/tweet your content.
Some good resources include:
Best of luck!
Dejan's solution is solid, but you may also want to try writing a redirect rule, using mod_rewrite for Apache (or ISAPI_rewrite in IIS). More info here - http://www.seomoz.org/learn-seo/redirection and here - http://www.seomoz.org/blog/url-rewrites-and-301-redirects-how-does-it-all-work
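As a rough sketch (the URL patterns below are hypothetical - adjust them to your actual structure), a mod_rewrite rule in .htaccess looks something like:

```apache
# Requires Apache's mod_rewrite module to be enabled.
RewriteEngine On
# Send requests for the old URLs to the new structure, preserving
# the trailing slug and returning a permanent (301) redirect:
RewriteRule ^old-section/(.+)$ /new-section/$1 [R=301,L]
```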
I hear you - it's a frustration, but it's the reality of a system that needs to crawl 30K+ sites each week without hitting anyone's server too hard, bothering any SysAdmins or falling behind a massively complex, ever-changing queue.
That said, we do have a custom crawl tool you can fire anytime and usually get data back within just a few hours! It's here - http://www.seomoz.org/labs/cc
The intent/value behind the crawl inside the campaign is much less about a specific, one-time crawl, and more about having data every week, with historical information showing progress, updates and warnings (in case something goes wrong). There are lots of good free tools as well for single-purpose crawls, e.g. http://www.seomoz.org/blog/xenu-link-sleuth-more-than-just-a-broken-links-finder
Also, just to be totally clear - the system running the crawls for the web app in your campaigns is different from the Linkscape web index (which only updates every 3-4 weeks). Eventually, the two might merge, but we didn't want to bias any crawling inside Linkscape when we launched the web app last September.
Three things I'd recommend checking out:
As Joel pointed out, the on-page optimization tool that's part of PRO can also be helpful, but it's good to have an understanding of the what and why, too.
This PowerPoint deck might also be helpful if you like PPT format as a way to run through things - http://www.seomoz.org/blog/a-comprehensive-intro-to-seo-powerpoint-slide-deck-
Wish you the best of luck!
In the PRO Web App, the CSV export of your crawl includes the list of sources for all 404s/500s/etc., which can help. You'll also find these in Google Webmaster Tools for your site.
There are a few popular ones:
They all use large-scale results scraping to get data (except Compete, which uses ISP/clickstream info), so make sure to check the quality of data sources.
Hi David - I'm looking at the top pages report and don't see those (but I only browsed through a few pages, so they could be further down). I suspect what's happening is some sort of misconfigured redirects or malformed links to those pages. If you use the PRO Web App's crawling functionality, you can create a campaign for kpmginstitutes.com and see not only where those errors happen, but download a CSV and get the sources of the 404s, too.
You might also find this information in Google Webmaster Tools.
It isn't YouTube, but we did cover http://www.wistia.com, which allows you to make embeds that point back to your site, automatically creates Video XML Sitemaps for you and generally rocks.
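If you're curious what those sitemaps contain, a single entry looks roughly like this (all the URLs and text below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The page on YOUR site that embeds the video: -->
    <loc>http://www.example.com/videos/how-to-clean-jewelry</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/cleaning.jpg</video:thumbnail_loc>
      <video:title>How to Clean Jewelry</video:title>
      <video:description>A short how-to on cleaning jewelry at home.</video:description>
      <video:player_loc>http://www.example.com/player?video=cleaning</video:player_loc>
    </video:video>
  </url>
</urlset>
```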
http://www.seomoz.org/blog/video-seo-basics-whiteboard-friday-11080 and http://www.seomoz.org/blog/creating-online-video-strategy are both good posts on the topic.
Pamela - I'd also recommend using http://www.whitespark.ca/local-citation-finder/ to help identify some good sources AND definitely make sure all the information is up to date on old listings in the local results for your client. Google Places relies on these citations to do the rankings, so you can seriously help them by being on top of those listings.
Yeah - htaccess is the way to go for rewrite rules. Check out http://www.seomoz.org/learn-seo/redirection for more depth/detail, too.
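One of the most common rewrite rules, just as an illustration (example.com is a placeholder for your own domain), canonicalizes the non-www hostname to www:

```apache
# Requires mod_rewrite; place in .htaccess at the site root.
RewriteEngine On
# If the request came in without "www.", 301 it to the www version:
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```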
If that's the timeframe, it's most likely not the recent Farmer Update. I'd guess it's a penalty of some kind (manipulative linking practices, cloaking and thin/duplicate/scraped content are the most likely culprits). You might check out these posts for more detail:
http://www.seomoz.org/blog/pagelevel-algorithmic-penalties-on-the-rise-from-google
It really depends on the situation. If you've got a Facebook page that can get an extremely high number of likes/shares, but the webpage/domain version of that same page couldn't earn similar quantities of links/social mentions, it's possible the Facebook page would outperform. That said, due to the limitations and restrictions of the Facebook platform, and because it's a property someone else owns and controls (Facebook), I'd never go that route by default unless I knew it was a very specific, short-term campaign.
I don't think I could get by without the mozBar. It makes my SEO browsing so much faster and I always feel like a power-SEO in front of others when I can show it off, particularly the page overlay vs. having to click "view source" or "inspect element" to find issues on a page.
Hi Kristina - I actually wrote an in-depth post about this and did a Whiteboard Friday on the topic:
Basically, the answer is that "sometimes" subdomains will be connected with the root domain and other subdomains in the eyes of the engines, but other times, this isn't the case.
In the specific one you're describing, my guess is that much of the root domain authority and ranking power will pass to the subdomain, so long as there are good links passing between the two (indicating a relationship, rather than a new, separate site).
best,
Rand
You probably don't need to worry about the noindex tag, just the rel=canonical should be enough to get the engines recognizing the right page (and I'm not 100% sure how the noindex might interact).
Hmm... I'm not sure I like that as much as getting the product page indexed and known by the engines as the canonical version. Perhaps you could produce the RSS feed/blog with the reviews, but use rel="canonical" on those pages to point over to the product pages which include the reviews? That would be a way to potentially have your cake and eat it, too.
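If you go that route, the tag is just a single line in the head of each review/feed page - the URL below is a placeholder for the matching product page:

```html
<!-- Tells engines the product page is the canonical version
     of this content (swap in the real product URL): -->
<link rel="canonical" href="http://www.example.com/products/silver-ring" />
```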
Hi Antonio - a lot of sites, particularly in the e-commerce field, face precisely this issue. What I've seen be most effective is what Amazon, BestBuy and many others do, which is to create a single page for any product and include editorial/user reviews and more detailed information when it's available; when it's not, leave that area open for future additions of content. This way, you have a single version of any given page, and you set the expectation - for crawlers and humans alike - that some/much/most of your content/products will eventually get a good, rich description.
You can also use Saibose's suggestion in combination if you'd prefer having this content in separate, embedded "tabs" on the page that all resolve to the same URL. Check out a code sample and example of this in action here - http://dhtmlkitchen.com/scripts/tabs/tutorial/navigation.jsp
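A bare-bones version of that pattern looks something like this - all the IDs and copy here are placeholders, and the linked tutorial above has a fuller example:

```html
<!-- All tab content lives in one HTML document (one URL), so
     crawlers see everything on a single page: -->
<ul>
  <li><a href="#description" onclick="showTab('description'); return false;">Description</a></li>
  <li><a href="#reviews" onclick="showTab('reviews'); return false;">Reviews</a></li>
</ul>
<div id="description">Editorial product description goes here...</div>
<div id="reviews" style="display:none;">User reviews go here...</div>
<script>
  function showTab(id) {
    var tabs = ['description', 'reviews'];
    for (var i = 0; i < tabs.length; i++) {
      document.getElementById(tabs[i]).style.display =
        (tabs[i] === id) ? 'block' : 'none';
    }
  }
</script>
```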
Best of luck!
Rand