That's exactly what I'm looking at, thank you Jane.
@Robert, the "referral attributes" would not be passed through as parameters, but maybe as session data instead, therefore providing a stronger SEO benefit.
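To illustrate the idea, here's a minimal sketch of splitting a shared URL into the referral id (to be stashed in the session or a cookie server-side) and the clean URL to 301 the visitor to. The parameter name "ref" and the URL shape are assumptions for illustration.

```javascript
// Split a shared link into the referral id and the clean canonical URL.
// The server would store `ref` in session data, then redirect to `cleanUrl`.
function splitReferral(rawUrl) {
  const url = new URL(rawUrl);
  const ref = url.searchParams.get("ref");
  url.searchParams.delete("ref");
  return { ref, cleanUrl: url.origin + url.pathname + url.search };
}
```

That way crawlers and other users only ever see the clean URL, while the referral credit survives in the session.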
Very thorough, thank you Jane
When looking at links and how search engines treat "social signals," does it matter if a link is a 301 vs a 302?
In addition to that, if I build out my own short-URL system that's used for link redirects that include referral attributes, would/could I get penalized if I use a 301 instead of a 302?
Here's something more interesting.
Bitly vs tiny.cc
I used http://web-sniffer.net/ to grab the headers of both. With bit.ly links I see an HTTP response status of 301 followed by body content, but with tiny.cc links I only see the header redirect.
Two links I'm testing:
Bitly response:
<title>bit.ly</title>
<a href="https://twitter.com/KPLU">moved here</a>
I wonder why you're seeing a 403, I still see a 200.
I looked at all 3 redirects and they all showed a single 301 redirect to a 200 destination for me. Do you recall which one was a 403?
Looking at my original comment in the question, last month bit.ly had 10M results and now I'm seeing 70M results, which means there was a [relatively] huge increase with indexed shortlinks.
I also see 1,000+ results for "mz.cm", which doesn't seem that strange, since mz.cm is just a CNAME to the bitly platform.
I found another URL shortener with some activity, http://scr.im/, and I only saw the destination pages being indexed by Google, not the short links. I wonder if the indexing is particular to bitly and/or the IP subnet behind bitly links.
I looked at another one, bit.do, and their shortlinks are being indexed. Back to square one.
I spot checked a few and I noticed some are only single 301 redirects.
And looking at the results for site:bit.ly, some even have breadcrumbs ironically enough.
Here are a few examples:
bit.ly/M5onJO
None of these should be indexed, but for some reason they are.
Presently I see 70M pages indexed for "bit.ly"
I see almost 600,000 results for "bitly.com"
I did a quick search for "site:bit.ly" and it returns more than 10 million results.
Given that bit.ly links are 301 redirects, why are they being indexed in Google and ranked according to their destination?
I'm working on a similar project to bit.ly and I want to make sure I don't run into the same problem.
Similar to the eBay situation with "expired" content, what is the best way to approach this?
Here are a few examples.
On an e-commerce site with a seasonal category like "Christmas," what's the best way to handle that category page after it's no longer valid? A 404? A 301? Leave it as-is and date it by year?
Another example: if I have an RSS feed of videos from a big provider, say Vevo, what should happen when Vevo tells me to "expire" a video because it's no longer available?
Thank you!
I've been looking at some bigger enterprise sites and noticed some of them used HTML like this:
<a <="" span="">data-href="http://www.otherodmain.com/" class="nofollow" rel="nofollow" target="_blank"></a>
<a <="" span="">Instead of a regular href=""
Does using data-href and some javascript help with shaping internal links, rather than just using a strict nofollow?</a>
I did a search for "car" vs "cars" and I see a drastically different number of results.
3.3B vs 1.5B, respectively.
Do you have any research to support your response? Just curious where you're getting your information from.
I'm building out a specific section of our site and I want to make sure I target it correctly.
Is there a rule of thumb for knowing when to use "car" vs. "cars" (as an example)?
Is there a specific way to research the right approach?
thank you!
That might work.
The way our tracking links are set up, it's domain.com?ref=1111&goto=/this/path
then it redirects the user with a 302 to domain.com/this/path
Would changing it to a 301 be damaging because it includes the referral link and isn't a natural backlink?
We have a website with a large member base where users share content a lot. The SEO problem I have is that every share needs to carry a "referral link," so that if a new user signs up, the original sharer receives referral credit for that sign-up.
I really want to be able to take advantage of our huge member base and their sharing, but I'm finding it difficult to do so because each user shares a link that has their referral in it.
Can anyone think of an SEO friendly method of sharing that both gives our site SEO benefits, as well as providing the referral credit back to the user?
After doing some further research, I looked at comments on big sites like Techcrunch.com and Mashable.com, only to find that their comments don't appear to be indexed. But comments on SEOmoz.org/blog/ are indeed being indexed. So it leads me to believe that JavaScript comments are still a bad idea, no matter how they're loaded in, and that making comment data available directly in the HTML is the way to go.
Does anyone have any data to back up my assumption?
Likely not slowing it down, that's just how the system was originally built (SEO was not taken into account).
Just curious if anyone had a better method to do it.
Agreed. It can only help to set a canonical. Google is smart enough to discard those parameters, as they're Google's own parameters. But you could also set those parameters to be ignored in GWT.
The way our in-house comments system was built, it uses AJAX to call comments as the page is loaded.
I'm working on a set of requirements to convert the system over to be more SEO-friendly.
Today we show a "load more comments" button after the first 20 comments; clicking it calls the server and loads more comments.
This is what I'm trying to figure out. Should we load all the comments into the page behind the scenes and then lazy-load them, or keep the same "load more" button and just reveal what was already loaded behind the scenes? Or does anyone have a better suggestion for making the comments crawlable by Google?
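One way to sketch the "first page in the HTML" approach: render the first 20 comments directly into the page so crawlers see them, and only show the "load more comments" button (which triggers the AJAX call) when more exist. The comment shape ({author, text}) and the markup here are assumptions for illustration.

```javascript
// Server-side rendering of the first page of comments.
const PAGE_SIZE = 20;

function renderComments(comments) {
  const items = comments
    .slice(0, PAGE_SIZE)
    .map(c => `<li class="comment"><b>${c.author}</b>: ${c.text}</li>`)
    .join("\n");
  // Only emit the button when there's more to fetch via AJAX.
  const loadMore = comments.length > PAGE_SIZE
    ? `<button data-offset="${PAGE_SIZE}">load more comments</button>`
    : "";
  return `<ol id="comments">\n${items}\n</ol>\n${loadMore}`;
}
```

The crawlable content lives in the initial HTML, and the AJAX call stays a progressive enhancement rather than the only path to the comments.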
I've looked at Google CSE yes. It doesn't provide enough control over result styling and formatting, and isn't as up to date with URLs as I would like.
I'm looking at integrating a better search engine solution on our site and looking for some input on what other people use.
I'm aware of SLI Search and Lucene by Apache, do you have any other suggestions for us to look at?
Thanks
I see evttag= used on realtor.com, what looks to be for click tracking purposes.
Does anyone know if this is an official standard or something they made up?
Thanks, that's more accurate to the question I was asking. Appreciate the response.
Hi,
I've read that document already, it still doesn't answer my specific question -- that's why I asked it here.
I agree, using that Keyword | Brand format is best. However, that doesn't answer my question above... please re-read the question about the character limit and how to implement the <title>.
I have a Title on a page that has more than 70 chars in it, and I also have to include the page type and brand in the title.
Is it better to concatenate the title server-side and truncate it with ".." instead, like this?
Type: This is the page title of the long title and I need to.. | Brand
Or, should I put the full Title of the page so that the keywords are available for Google? example,
Type: This is the page title of the long title and I need to add more content | Brand
I'm trying to keep to the 70 char limit for Google.
Which is the more ideal solution?
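For what it's worth, here's a minimal sketch of the server-side truncation option: trim only the middle (page title) segment so the type prefix and brand suffix always survive, keeping the whole string within the ~70-character limit. The limit and the ".." marker come from the question above; everything else is an assumption.

```javascript
// Build "Type: Title | Brand", truncating only the title segment.
const MAX_TITLE = 70;

function buildTitle(type, title, brand) {
  const prefix = `${type}: `;
  const suffix = ` | ${brand}`;
  const budget = MAX_TITLE - prefix.length - suffix.length;
  const middle = title.length > budget
    ? title.slice(0, budget - 2).trimEnd() + ".."  // leave room for ".."
    : title;
  return prefix + middle + suffix;
}
```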
Thanks all
jonathan