Hi there!
It's important to me that the reports from GA arrive at a specific hour of the day.
Does anyone know how to make the reports come at 6am, for example?
Thanks.
Hello World!
Can you guys tell me whether I can trust BT Buckets' data, and whether it's a good solution?
We're integrating it into our website, but the API seems out of date, as if the owner abandoned it back in 2010.
Thanks.
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code, which is supposed to do it automatically, right?
Sometimes it shows one or two results, but almost everything is ZERO.
It's confusing me.
I'm trying to track the page load time of visits to my site, but GA only tells me it's zero, and the page load sample is always zero too.
I've done some research, and found that GA is supposed to track page load time automatically. Isn't it?
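In case it helps anyone seeing the same zeros: with the classic asynchronous snippet, Site Speed is only sampled for about 1% of visits by default, so a low-traffic site can easily report nothing. A minimal sketch of raising the sample rate (the `UA-XXXXX-Y` property ID is a placeholder):

```javascript
// Classic asynchronous Google Analytics queue (ga.js era).
// _gaq is a plain array until ga.js loads and replaces it.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // placeholder property ID
// Site Speed is sampled for ~1% of visits by default; raise it so
// page load timings actually show up on a low-traffic site.
_gaq.push(['_setSiteSpeedSampleRate', 10]); // sample 10% of visits
_gaq.push(['_trackPageview']);
```

Give it a day or two after the change before expecting numbers in the Site Speed report.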
I share your opinion about Google seeing it as something suspicious.
Does anyone else have something to say about it?
Hi there!
There's a new platform on the web that claims it can crawl the web for the best content from social media and blogs, so that the developer of website "x" can publish that content on their own site (with all rights reserved, which makes me believe it automatically generates rel="canonical" tags).
http://techcrunch.com/2012/04/18/scribit-launch/
I've always heard that all kinds of automatic-content applications suck, and my first impression of this one was the same.
But my business partner found it really interesting, mostly because we have a small team for generating constant great content.
So I told him that automatically generated content usually sucks in any case, and that Google will someday find that our blog has a million rel="canonical" tags on every post and might consider it suspect.
Anyway, he asked me to research this specific platform further, and I want to know what you guys think about it.
Forgive any language errors; English is not my native language.
Regards.
Sorry, I didn't understand your question.
Let's say you have to enter your needs, a specific date, and your zipcode. These steps are in the middle of the funnel. I want to track whether the user abandons the funnel at any of these steps, and it's still important to understand what people are typing.
Can I set up a text box submission, without any clicks, as an event to track in the funnel?
Hi there!
In our website, we have a few text boxes that users need to use to complete the goal.
The boxes aren't search boxes, but it's still important to us to track what people type into them.
I'm looking for a way to track this data through the "event" feature in Google Analytics, but it seems that events can only capture clicks, video views, etc.
Does anyone know how to do it?
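For what it's worth, one approach is to fire the event when the visitor leaves the field (on blur) rather than on a click. A rough sketch using the classic async queue; `textFieldEvent`, `trackTextField` and the category/action names are mine, not GA built-ins:

```javascript
var _gaq = _gaq || [];

// Build the event for a completed text field. Sending the field *name*
// as the label avoids pushing what the user typed into GA, which could
// contain personal data.
function textFieldEvent(fieldName) {
  return ['_trackEvent', 'Funnel', 'field-completed', fieldName];
}

// Browser wire-up: record the event when the user leaves a non-empty field.
function trackTextField(input) {
  input.addEventListener('blur', function () {
    if (input.value) { _gaq.push(textFieldEvent(input.name)); }
  });
}
```

You would then call `trackTextField(document.getElementById('zipcode'))` for each box. One caveat: classic GA goal funnels are URL-based, so if you need these as actual funnel steps you may have to push a virtual pageview (e.g. `_gaq.push(['_trackPageview', '/funnel/zipcode-done'])`) instead of an event.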
So...
We are running a blog that was supposed to have great content.
After working on SEO for a while, I discovered there was too much keyword stuffing, plus some shady WordPress SEO tricks that were supposed to make it rank better.
In fact, that worked, but I'm not taking the risk of getting slapped by Google's Panda.
So we decided to restart our blog from zero and make a better attempt.
The thing is, every page was already ranking in Google.
SEOmoz hasn't crawled it yet, but I'm pretty sure the crawler will report a lot of 404 errors.
My question is: can I avoid these errors with some tool in Google Webmaster Tools, like sitemaps, or should I set up rel=canonical tags or 301 redirects?
Does Google penalize me for that? It seems kind of obvious to me that the answer is YES.
Please, help
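In case it helps while you wait for answers: the usual fix for a restructured blog is a 301 from each old URL to its closest new equivalent, so links and rankings follow instead of hitting 404s. A toy sketch of the mapping idea, in JavaScript purely for illustration (the paths are made up; on a WordPress/Apache setup this would live in .htaccess or a redirect plugin):

```javascript
// Hypothetical old-to-new URL map for the restructured blog.
var redirects = {
  '/old-keyword-stuffed-post': '/blog/rewritten-post',
  '/another-old-post':         '/blog/another-new-post'
};

// Decide what to answer for a requested path: a 301 with a Location
// header when we know the new home, otherwise a plain 404.
function resolve(path) {
  return redirects.hasOwnProperty(path)
    ? { status: 301, location: redirects[path] }
    : { status: 404, location: null };
}
```

Old URLs with no reasonable new home can simply be left to 404; that by itself is not a penalty.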
That's all?
Just "don't worry about it"?
No sitemap changes, nothing?
Hi, everybody!
We're starting up a local services website in Brazil. Something like redbeacon.com or thumbtack.com, but obviously different.
So we're developing version 2.0 of the site, and I want to put microdata on every provider's page to mark up users' ratings of that particular provider, plus geographic information about them. We want to use microdata on several pages, but the provider pages are the most important.
This data (geo and rating) will be generated dynamically from our database.
On Schema.org, I only found examples that build microdata from static data.
My question is: do Google, Bing, Yahoo, etc. index dynamically generated data? Is there something in sitemaps.xml or robots.txt I can do to get my data indexed by search engines? Our front-end guy handles the HTML, and our codemaster codes in pure PHP.
Thanks!
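For what it's worth: crawlers only see the HTML your server actually returns, so microdata rendered from a database is indexed exactly like hand-written markup. A sketch of the idea, in JavaScript just to keep it short (your PHP templates would emit the same markup; the itemprop names come from Schema.org's LocalBusiness, AggregateRating and GeoCoordinates types, but the `providerMicrodata` helper and sample values are mine):

```javascript
// Render schema.org microdata for one provider page from dynamic data.
function providerMicrodata(p) {
  return [
    '<div itemscope itemtype="http://schema.org/LocalBusiness">',
    '  <span itemprop="name">' + p.name + '</span>',
    '  <div itemprop="aggregateRating" itemscope',
    '       itemtype="http://schema.org/AggregateRating">',
    '    <span itemprop="ratingValue">' + p.rating + '</span>',
    '    (<span itemprop="reviewCount">' + p.reviews + '</span> reviews)',
    '  </div>',
    '  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">',
    '    <meta itemprop="latitude" content="' + p.lat + '">',
    '    <meta itemprop="longitude" content="' + p.lng + '">',
    '  </div>',
    '</div>'
  ].join('\n');
}
```

No special sitemaps.xml or robots.txt work is needed for this; you can paste a rendered page into Google's Rich Snippets Testing Tool to check that the markup parses.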
Hi!
I think Google might assume that your site is trying to rank better using duplicated content.
Maybe you could use rel="canonical" tags on all this content to show Google where it originally came from and which site is the primary one. Then you can create new content for the sites you're optimizing now.
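To make that concrete, each duplicated page would carry a tag like this in its `<head>`, pointing at the original (the URL is a made-up example):

```html
<link rel="canonical" href="http://www.example.com/original-article">
```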