Posts made by dunklea
-
RE: If UGC on my site also exists elsewhere, is that bad? How should I properly handle it?
Definitely -- would still be nice to hear another opinion or two, but Todd was very specific and thorough. I've got some great, actionable takeaways from this conversation.
-
RE: If UGC on my site also exists elsewhere, is that bad? How should I properly handle it?
Really helpful thoughts all around. It sounds like we'll continue to noindex the content for now. We assume the program descriptions are duplicates, since there's no efficient way for us to check whether people uploading listings are simply copy-pasting from their own websites -- the reviews are usually the unique piece of content that lets us move a page from noindexed to indexed.
Also, thanks for bringing the leave-a-review URLs to our attention -- I'll bring it up with my team this week!
Thanks again, Todd!
-
RE: If UGC on my site also exists elsewhere, is that bad? How should I properly handle it?
Understood -- however:
1. It provides value and matches common user queries (e.g. "University of Michigan study abroad program reviews" -- I just made that exact query up, though).
2. It's not thin content; we only consider indexing these pages once there are enough in-depth reviews (i.e. paragraphs, not a line or two).
3. I understand that's usually the problem with duplicate content, but our competitors are ranking for these pages because we're being cautious about potential future algorithm updates and they aren't -- so even though we're putting the content up first, they're getting the credit.
My question is more along these lines: is review content / user-generated content treated differently than your traditional duplicate blog post? Is it OK for us to index a page with reviews even though our competitor has already done so?
-
If UGC on my site also exists elsewhere, is that bad? How should I properly handle it?
I work for a reviews site, and some of the reviews that get published on our website also get published on other review websites. It's exact duplicate content -- all user generated. By default, the pages the reviews live on are noindexed and followed, and we only manually switch a page to indexed if its reviews aren't duplicated elsewhere; pages whose reviews also live elsewhere on the web stay noindexed. Is this the right way to handle it? Or would it be OK to index these pages even though, technically, the exact same UGC exists elsewhere?
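For context, the tags we're toggling boil down to something like this (a simplified sketch -- the real pages obviously have more going on):

<!-- default state for a page whose reviews also exist elsewhere -->
<meta name="robots" content="noindex, follow">

<!-- what we switch to once a page has enough unique reviews -->
<meta name="robots" content="index, follow">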
-
Rich-Snippets with Sitelinks. How-to?
Quick question with a hopefully easy answer (and to get away from all this Penguin 2.0 talk). Does anyone have any tips for activating Sitelinks for landing pages other than your homepage?
I've attached a screenshot to better illustrate what I'm hoping to achieve, but basically I would like our SERP listing to show the same extra links under our meta description as the #2 result. I haven't been able to find any direct information on how to achieve this. Is there markup involved, or is it another one of those things that Google just does at its own discretion?
The keyword in question is "teaching jobs abroad," FYI.
Anything that would point me in the right direction would be very helpful!
Andrew
-
RE: NOINDEX listing pages: Page 2, Page 3... etc?
We've taken the approach of adding rel="canonical" to all paginated content, with the link pointing to the first page of results. We keep everything indexed and followed. We also help Google identify the URL parameter created for pagination in our Webmaster Tools settings.
This has worked well for us, but another approach would be to add rel="next" and rel="prev" to the <head> of each paginated page. This is the method Google openly supports for handling pagination, but it might be a little tricky to set up in your CMS.
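To illustrate, both options come down to a line or two in the <head> of each paginated page (the example.com URLs are just placeholders):

<!-- Option 1 (our approach): every paginated page canonicals back to page 1 -->
<!-- e.g. in the <head> of /listings?page=2 -->
<link rel="canonical" href="http://www.example.com/listings" />

<!-- Option 2 (the Google-documented approach): rel="next"/"prev" -->
<!-- e.g. in the <head> of /listings?page=2, out of 3 pages -->
<link rel="prev" href="http://www.example.com/listings?page=1" />
<link rel="next" href="http://www.example.com/listings?page=3" />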
Good luck!
Andrew
-
RE: 301 - should I redirect entire domain or page for page?
I snooped around in the Google Webmaster Tools help section and it seems like a lot of other people have faced the same problem with no solution offered. Shame on Google!
As much as possible, I would go back to all the sites that were linking to your old domain and ask them to update their links. 301s pass most of the link juice, but not all of it, so it's worthwhile to preserve as much of it as possible. It also helps Google start to ignore your old site and focus on your new one.
This is all probably a lot of work, but I hope it works out! Good luck.
Andrew
-
RE: 301 - should I redirect entire domain or page for page?
A 301 redirect basically tells search engines that content has moved from an old URL to a new one. For it to work effectively, the content should no longer exist at the old URL (if the 301 weren't there, the old URL would return a 404).
And yes, you should redirect each page individually. This ensures that whatever PageRank the old content had is passed on to the correct new content. Redirecting everything to your homepage means the rest of your new site is effectively starting from scratch.
If you're trying to redirect thousands of URLs, you might want to look into writing some custom Apache rewrite rules to map everything together. It's a little tricky, but it can save you a lot of time!
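For example, a handful of one-to-one redirects in the old domain's .htaccess might look roughly like this (the URLs are made up, and I'm assuming a standard Apache setup with mod_alias/mod_rewrite available):

# one-to-one 301s for pages whose paths change on the new domain
Redirect 301 /old-page http://www.newdomain.com/new-page
Redirect 301 /programs/old-name http://www.newdomain.com/programs/new-name

# if the paths stay exactly the same on the new domain, one rule can cover everything
RewriteEngine On
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]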
Good luck,
Andrew
P.S. Remember to update your Google Webmaster Tools settings (the Change of Address tool) to let Google know that you've moved your content to a new domain.
-
RE: Implementing Schema.org Metadata for Reviews
Thanks, guys! I'll keep this thread updated with our implementation and offer advice for future schema implementers.
Cheers,
Andrew
-
Implementing Schema.org Metadata for Reviews
Does anyone have much experience implementing Schema.org metadata for reviews? I run and operate a website that reviews study abroad programs and we've started the process of implementing this code to receive rich SERP snippets.
We're going to use the framework used here: http://schema.org/Review
My main question: how long does it generally take to see results? I'd also like to hear from people who implemented this markup but ran into problems, and how they overcame them.
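For anyone curious, here's roughly the kind of markup we're planning to wrap around each review (a simplified sketch using schema.org microdata -- the property names come from the Review spec, but the values are placeholders):

<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Great semester abroad</span>
  <div itemprop="itemReviewed" itemscope itemtype="http://schema.org/Thing">
    <span itemprop="name">Example Study Abroad Program</span>
  </div>
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    <span itemprop="name">Jane D.</span>
  </span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <meta itemprop="worstRating" content="1">
    <span itemprop="ratingValue">9</span> out of <span itemprop="bestRating">10</span>
  </div>
  <p itemprop="reviewBody">The program was well organized and the staff were helpful...</p>
</div>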
Any other tips and advice would be greatly appreciated!
Cheers,
Andrew
-
Resolving duplicate text issues with a duplicate image?
We are a listings site for programs overseas. Many of our listings are inherently the same content, because in many cases the exact same information applies. We have resolved duplicate content issues to some extent by making some of the content in these listings unique. However, for the rest of the content, which is going to be the same across about 100 pages, we were wondering whether it's better to use an image in place of the duplicate text (essentially an image of the text in question). We know this is still a problem, because it's inherently duplicate content as well -- only it's a duplicate image instead of duplicate text. So what's the best solution here? Is a duplicate image just asking for trouble, or might it actually be a good idea?
-
RE: Seeking URL Advice
Thanks again for your feedback!
These pages are all pretty new (4 weeks tops), but we are aware of the link juice issue. Our proposed solution has been to create a page that lists all of our country pages (approx. 125) and link to it from our homepage. Hopefully Google can then get from our homepage to every country page in two jumps.
Sound reasonable?
-
RE: Seeking URL Advice
Thanks, Anthony, for your response. Just to clarify how our CMS works: internships-abroad is actually a subdirectory of gooverseas.com, so removing the slash isn't possible in this situation.
Does this alter your advice?
Thanks!
-
Seeking URL Advice
Hey Moz Community,
I'm looking for some URL structure advice for a new directory of a website. We're trying to rank for the term 'internships abroad in [country]'. We have roughly 100 pages targeting specific countries.
Right now the URL structure is www.gooverseas.com/internships-abroad/china, but some of my colleagues believe this structure would be better: www.gooverseas.com/internships-abroad/intern-in-china. I personally prefer the shorter structure, but we couldn't come to an agreement, so we thought we'd pose the question to the community.
Any thoughts?
Thanks!
-
RE: About duplicate content
I don't think some of the responses in this thread have given you adequate information to solve your problem. 301s and rel=canonical exist to solve two very different problems, and when used correctly, each can clear up a lot of SEO issues.
In your example you have two URLs which I'm going to assume have the exact same information on them -- a classic duplicate content situation. Ideally, you would delete one of these pages and set up a 301 to redirect any users and links to the other page. This focuses all of your content and links onto a single page, and your PR and rankings will rise. I would keep the page with the better keywords in the URL, and no, it doesn't matter whether the URL ends in .html -- either way, the actual keywords in the URL matter more.
The use of rel="canonical" has a very different purpose. Say that, for whatever reason, you want to keep both of your URLs even though they have the exact same content (testing conversion rates, for example). In this case you would add a rel="canonical" tag to the page you don't want to rank in the search engines, pointing to the page you do want to rank.
On http://www.mysite/blue/index.html, for example, you would add this tag in the <head>: <link rel="canonical" href="http://www.mysite/blue/" />. eCommerce sites have to do this a lot.
Rel canonical should not be used when you're trying to move content from one URL to another. That's what 301s are for.
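And if you do go the 301 route described above, the redirect on an Apache server could be as simple as this (a sketch based on the example URLs above, assuming a standard .htaccess setup):

# permanently send the retired URL to the page you're keeping
Redirect 301 /blue/index.html http://www.mysite/blue/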
-
RE: Nofollowing to boost internal page rankings.
What's your PageRank? I have seen evidence that websites with high PR can get away with a higher number of links on their homepage. Just look at the New York Times or similar sites: they have hundreds of links on their front page, but Google will crawl all of them because of the site's authority.
But to address your question: no, nofollowing internal links has been shown to be ineffective and is frowned upon by Google (see the link in the other reply). Follow everything, and really think about the value of each link and whether it needs to be there.
Good luck!
Andrew
-
RE: Latest OSE Update - Strange Numbers and Links
Yep, we noticed the same thing. A lot of these new links turned out to be weird files. I'm hoping this is just a minor glitch with the latest Linkscape update and that it will start to die off in the months to come.
Good luck!
Andrew
-
RE: Incoming Spam link removal
Do you have a link to that post? We've been noticing the same thing on our domain and it's got us worried as well.
Thanks!
Andrew
-
RE: Domain authority decrease after open site explorer update. Reasons?
We noticed the same decrease on three of our websites as well, and on some of the competitors we track. One of our websites also went up dramatically (with many new, questionable links). I believe this latest update of the link graph included some major changes that altered domain authority scores across a lot of websites. These numbers are all relative to everyone else, so hopefully this isn't an accurate reflection of how our websites are actually performing.
Good luck! Happy optimizing
Andrew
-
Latest OSE Update - Strange Numbers and Links
Hey everyone. I just checked our website in OSE with the latest update and noticed some strange numbers. The number of linking domains skyrocketed from 300 to 785 which definitely caught me off guard. Like all good SEOers I work hard to get links, but in a typical month I usually get between 40 and 50. To suddenly jump up by over 300 seems odd.
Digging deeper, I noticed that some of the incoming link domains could be considered less than trustworthy (pages filled with links), while others I couldn't find at all on the page in question. I also never ever asked for any of these links, nor did anyone on my team.
Should I be worried? I've heard of websites that purposely point bad links at their competitors in order to penalize them. Is there an easy way to find out if this is going on? Could this just be a problem with OSE reporting non-existent links?
Thanks for your feedback!
Andrew
BTW, the website in question is www.gooverseas.com
-
Why does SEOMoz use Wistia? Why would you not use YouTube as well? Does Vimeo factor into the equation at all?
I am wondering what the best options are for video hosting (YouTube, Vimeo, Wistia, etc.). Which host has the best player and interface for embedding on your own site? Any advice on optimizing the videos on our site would be much appreciated. Thanks! And our YouTube channel is here: http://www.youtube.com/user/gooverseas
-
RE: Page Authority vs. Domain Authority
I've since realized that links from pages with a PA of less than 20 are pretty worthless and not worth spending too much time chasing. If you can get them with minimal effort, then go right ahead, but a link from a page with a PA of 50+ will pay dividends in the long run.
-
RE: Pagination and links per page issue.
Rel canonical doesn't tell engines not to crawl the page (robots directives handle that); it just asks them to index the first page of your paginated results in place of the page carrying the tag. This helps reduce duplicate content issues and consolidates your PA onto a single URL. I could be wrong, but I imagine you would also prefer organic visitors to land on the first page of results rather than, say, page 6.
The result is that engines will still crawl your paginated pages and find your listings (and index those individual listings), but only the first page of the listing series gets indexed.
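To make the difference concrete, these are the two tags in question (simplified, with a placeholder URL):

<!-- robots meta tag: keeps a page out of the index and/or stops its links from being followed -->
<meta name="robots" content="noindex, follow">

<!-- rel="canonical": the page can still be crawled, but engines are asked to index the first page of the series instead -->
<link rel="canonical" href="http://www.example.com/listings" />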
I hope that helps to clarify things!
Andrew
-
RE: Duplicate content across multiple domains
Worth a shot. Crawl bots usually work by following links from one page to the next. If links to those pages no longer exist, Google will have a tough time finding them and de-indexing them in favor of the correct pages.
Good luck!
-
RE: Page Authority vs. Domain Authority
Thanks for your reply! I'll keep plugging away and aiming for higher PA links
Cheers,
Andrew
-
RE: Duplicate content across multiple domains
As long as the duplicate content pages no longer exist and you've set up the 301 redirects properly, this shouldn't be a long-term problem. It can sometimes take Google a while to crawl through thousands of pages and index the correct ones. You might want to include these pages in a Sitemap to speed up the process, particularly if there are no longer any links to them from anywhere else. Are you using canonical tags? They might also help point Google in the right direction.
I don't think a no-cache meta tag would help. Once the page is crawled, Google should follow the 301 and cache the destination page anyway.
Hope this helps! Let me know how the situation progresses.
Andrew
-
Page Authority vs. Domain Authority
Like many of you, I've spent a lot of time in Open Site Explorer analyzing our links and our competitors', and looking for more linking opportunities. Recently we've been focusing our SEO efforts on gaining links from high-value .edu domains, and so far we've been very successful. The DA of these domains is high (80+), but the links are coming through on low-PA pages (typically 10-35). Are these links still worthwhile? At what point does a link stop being worth it?
-
RE: What link building techniques do you teach to new hires with no SEO experience?
I guess it depends on your niche. I have three interns working on three separate topics on my site, so I haven't been too worried about this. I guess you could keep a master spreadsheet for your interns to reference while they're link building?