Person shares a link from our site. Someone else sees it in their feed in TweetDeck and clicks. That click-through is recorded as a direct.
Any way to combat that issue?
Well, we have not done a comprehensive job of using tracking parameters, so I am assuming that's part of the slippage. Also, we have share links on millions of pages that our visitors use, so folks clicking through from TweetDeck and the like via those links would show up as directs.
Thanks. We do use parameters, so we're covered there, but we are seeing an increasing number of directs in our traffic sources, which I assume is from TweetDeck et al. Thanks for the clarification!
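For anyone else reading the thread, this is the kind of tagging we mean: campaign parameters appended to the share URL so the click keeps its attribution even when it's opened from a client like TweetDeck. A rough sketch, with placeholder source/medium/campaign values:

```typescript
// Rough sketch: append Google Analytics campaign parameters to the URLs our
// share buttons emit, so a click pasted into TweetDeck still carries a source.
// The utm values below are placeholders, not anyone's real tags.
function tagShareUrl(pageUrl: string): string {
  const params = new URLSearchParams({
    utm_source: "twitter",
    utm_medium: "social",
    utm_campaign: "share-button",
  });
  const separator = pageUrl.includes("?") ? "&" : "?";
  return `${pageUrl}${separator}${params.toString()}`;
}

// e.g. http://www.example.com/article?utm_source=twitter&utm_medium=social&utm_campaign=share-button
console.log(tagShareUrl("http://www.example.com/article"));
```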
Hey, how do shortened links show up in GA? So if I tweet about something and use Bitly, does Twitter get the referral? I'm thinking not. I have never seen Bitly show up as a referrer, but we get lots of clicks from those links. Hmmmm. Anyone?
E
Hello there,
My site is basically an Ajax application. We assume lots of people link into deep pages on the site, but bots won't be able to read past the hashmarks, meaning all links appear to go to our home page.
So, we have decided to make our Ajax indexable. And so many questions remain.
First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude, more to manage.
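For context, my rough understanding of the scheme we'd be implementing: Googlebot rewrites our hash-bang URLs into _escaped_fragment_ requests, and our server has to answer those with an HTML snapshot. A sketch of the mapping (the URL is made up):

```typescript
// Sketch of how the AJAX crawling scheme maps a #! URL to the
// _escaped_fragment_ URL the crawler actually requests. Example URL is made up.
function escapedFragmentUrl(hashBangUrl: string): string {
  const [base, fragment = ""] = hashBangUrl.split("#!");
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}

// "http://www.example.com/#!/listings/123" becomes
// "http://www.example.com/?_escaped_fragment_=%2Flistings%2F123"
console.log(escapedFragmentUrl("http://www.example.com/#!/listings/123"));
```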
1. How do others deal with the differences here?
2. If we have indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made.
3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs?
So many more questions, but I'll stop there.
Curious if anyone here has thoughts (or experience) on the matter.
Erin
Oh, duh. Missed the dropdowns. Not your UI, just me multitasking. Thank you!
Does SEOMoz have a tool that allows me to see backlinks to a specific URL sans redirected URLs?
So, the URL is www.website.com/search.
www.website.com redirects to www.website.com/search, and I want to see how many people have linked to /search, the resulting URL.
Thanks for any help!
Erin
How is the toolbar different from a SERP? I assume they can track the same stats in both. Curious.
YSlow for Firefox
and good old Google Webmaster Tools
Hey guys, thanks for the info, but I'm really interested in the following: do you believe (based on facts or testing) that Google takes bounce rate into account in its algorithms, and if so, how do they do it? There should be a separation between Google Analytics and the search engine, so do they determine it by time?
I have believed for years that a high bounce rate (from search) could lower your rankings over time. Makes sense; if users bounce right back to search after looking at your page, Google should think that page wasn't very useful and push you down the SERPs.
But how do they determine this? If a user comes back after 30 seconds, is that a bounce?
Or is my premise incorrect and Google does not take bounce into account?
Erin
Hmm, ok.
I understand that linking to deep content will help that content, but we are trying to run discrete tests. My company was founded by a computer scientist, so everything must be scientific, even if my inclination is to simply follow all the best practices and hope for the best. The test sets are too large and not in separate directories, so going through logs would be . . . horrifying.
Here's the issue I'm curious what peeps think about:
What if I set up an additional GA profile and have two sets of GA code on each test page? Then I am getting credit for views in my main GA profile and can easily see if test sets are being viewed in the second profile.
Or, can someone give me info about how to set up a 1x1 pixel and see those stats in GA?
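To make the two-profile idea concrete, here is roughly what I'm picturing for the two sets of GA code, using the classic async syntax with made-up UA numbers (this would sit alongside the normal ga.js loader snippet):

```typescript
// Classic async Google Analytics (ga.js) supports multiple named trackers on
// one page, so the same pageview can report to two profiles.
// Both UA numbers below are placeholders.
declare const _gaq: Array<Array<string>>;

// Main profile: the default (unnamed) tracker already running site-wide.
_gaq.push(["_setAccount", "UA-0000000-1"]);
_gaq.push(["_trackPageview"]);

// Second profile: a named tracker ("test.") added only to the test pages,
// so the experiment profile records nothing but test-set pageviews.
_gaq.push(["test._setAccount", "UA-0000000-2"]);
_gaq.push(["test._trackPageview"]);
```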
Basically, we are looking to get more of our pages indexed. So rather than apply everything to every page (we have tens of millions of pages), we are testing ideas on subsets. Whichever test gets indexed faster and earns more click-throughs, we will roll out to all of the pages.
What is CRO?
We are making changes to a subset of our pages that are crawlable by Google, but we aren't linking to them in a special way, so I am unsure how the URL builder will help us. But I may be missing something!
Trying to get some best practices on testing SEO changes.
We are going to make a bunch of changes on subsets of pages, testing about five different on-page changes.
Originally we were going to submit separate Sitemaps to GWT and see if our test sets get indexed, how quickly, etc. But we noticed that GWT says some pages in our Sitemaps aren't indexed even though we know they are (what gives?).
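To be clear, the per-test Sitemaps would just be a plain XML sitemap per subset, roughly like this sketch (the URLs are made up):

```typescript
// Rough sketch: build one XML sitemap per test subset so each subset can be
// submitted to GWT and its indexation tracked separately. URLs are placeholders.
function buildSitemap(urls: string[]): string {
  const entries = urls
    .map((url) => `  <url>\n    <loc>${url}</loc>\n  </url>`)
    .join("\n");
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    entries,
    "</urlset>",
  ].join("\n");
}

// e.g. a sitemap for one test subset
console.log(buildSitemap([
  "http://www.example.com/listings/1001",
  "http://www.example.com/listings/1002",
]));
```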
So we thought, for each test, let's put a unique code on the page so we can see how many get indexed by Google.
But that doesn't solve the other issue: how many people clicked on our test pages. So we are thinking of putting a tracking pixel on the test pages, specific to each test. But then I am thinking, why not just create a separate Google Analytics profile and place that code on the test pages (with goals set up to track visits per test, since we aren't going to change the actual URLs)?
and on and on
This is where you come in. What kind of tracking do you implement when you set up tests?
Advice appreciated!
E