Couldn't it be A/B/C testing of sorts?
Latest posts made by pqdbr
-
RE: How often does Google crawl old pages, and is it worth getting links from them?
Your answer was excellent, but let's take the discussion a step further: what if the article is only 2-3 months old? Is that link "as worthless" as one from a 2-year-old blog post?
-
How often does Google crawl old pages, and is it worth getting links from them?
Let's say there's a blog post in my industry niche that's 2 years old. Is it worth getting a link from that blog post? How do I know Google will actually re-crawl that page, and how long can that process take?
-
Using MOZ Keyword Rankings with Google Analytics in Excel to track paid keywords
Hi,
I sorted my keywords in GA from highest incoming traffic to lowest, exported the list, and then imported those roughly 280 keywords into SEO Moz.
The problem is that the SEO Moz Keyword Rankings tool appears to pull traffic data (in the traffic column) only for NON-PAID traffic.
So keywords that bring a LOT of traffic to my site (because of AdWords) sometimes show zero traffic in SEO Moz, because I'm on page 3 of the SERPs and literally got no non-paid traffic that week.
The thing is: I want to improve my rankings for keywords I'm already performing well on, even if they are paid keywords, because they already bring a lot of traffic to my website. The way this works now isn't helping me, because when I sort by the traffic column, very important keywords end up in last place.
Questions:
- Why does SEO Moz pull traffic data for non-paid keywords only? Shouldn't there at least be an option for this, or am I missing something here?
- Until they change it, could any Excel gurus help me export SEO Moz ranking data, import it into Excel, and MERGE in a new "paid traffic" column that I export from Google Analytics? How do I assign each keyword line the paid-traffic value I'm getting from GA?
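The merge described in the second question can also be done outside Excel. Here is a minimal Python sketch: it joins two CSV exports on the keyword column and fills in a paid-traffic value for each ranking row. The column names ("Keyword", "Paid Traffic", "Rank") are assumptions standing in for whatever headers your actual SEO Moz and GA exports use; the inline CSV text stands in for the two exported files.

```python
import csv
import io

def merge_paid_traffic(rankings_rows, ga_rows,
                       key="Keyword", paid_col="Paid Traffic"):
    """Add a paid-traffic column from GA export rows to SEO Moz ranking
    rows, matching on the keyword (case-insensitive, whitespace-trimmed)."""
    paid = {r[key].strip().lower(): r[paid_col] for r in ga_rows}
    merged = []
    for r in rankings_rows:
        row = dict(r)  # copy so the caller's rows aren't mutated
        row[paid_col] = paid.get(r[key].strip().lower(), "0")
        merged.append(row)
    return merged

# Demo with inline CSV text standing in for the two exported files
# (hypothetical keywords and column names).
rankings_csv = "Keyword,Rank,Traffic\nbeach house rental,25,0\ncondo rio,4,12\n"
ga_csv = "Keyword,Paid Traffic\nBeach House Rental,310\n"

rankings = list(csv.DictReader(io.StringIO(rankings_csv)))
ga = list(csv.DictReader(io.StringIO(ga_csv)))
merged = merge_paid_traffic(rankings, ga)
# Keywords with no paid-traffic row in the GA export default to "0".
```

In a real workflow you would read both files with `csv.DictReader`, merge, and write the result back out with `csv.DictWriter` for Excel to open; the join itself is just the dictionary lookup above (the equivalent of a VLOOKUP on the keyword column).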
Thanks!
Felipe
-
RE: How not to get penalized by having a Single Page Interface (SPI)?
John, thanks for the quick reply.
I had already read the "make your AJAX pages indexable" documentation, but unfortunately it came too late in product development, and our programmers convinced us it would mean redoing the entire backend to make it work.
So we already have a workaround in place so crawlers can reach all these listings: below the search panel (which has AJAX pagination and loads the ads on the same page with JavaScript) we have a standard HTML list of links, so the crawlers can reach each property's individual page. In other words, we comply with the rule "make each of your pages reachable by at least one internal link".
But my question was more focused on how Google "sees" the navigation pattern of my users. I know the crawler is reaching those pages, but since the majority of users use the search panel (which loads the properties via JavaScript/AJAX) rather than the static links below it, it might appear that users only viewed one page on our site.
-
How not to get penalized by having a Single Page Interface (SPI)?
Guys, I run a real estate website where my clients pay me to advertise their properties.
The thing is, from the beginning I had this idea of a user interface that stays entirely on the same page. On my site the user filters the properties in the left panel, and the listings (4 properties at a time) are refreshed on the right side, where there is pagination.
So when the user clicks on a property ad, the ad is loaded by AJAX below the search panel on the same page, and there's a "back up" button the user clicks to return to the search panel and click on another property.
People love our implementation and the user experience, so I simply can't let go of this UI "innovation" just for SEO; it really is something that makes us stand out from our competitors.
My question, then, is: how do I avoid getting penalized in SEO for having this Single Page Interface, given that in the eyes of Google my users might not appear to be browsing my site deeply enough?
-
Alternatives for having fewer than 100 links per page
Guys, I'm aware of the recommendation to have fewer than 100 links per page.
The thing is, I'm running a vacation rental website (my clients pay me to advertise their properties on my website). We use an AJAX interface with pagination to show the properties, and I have cities with 400+ properties. The pagination works fine, but Google can't crawl through it (there is a Google doc about making AJAX systems crawlable, but that would involve a huge rewrite of our code, and I don't understand how it helps SEO).
So my question is: how do I keep at least one link pointing to each property while also keeping the number of links on each page under 100? Any suggestions?
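One common answer to both constraints is a chain of plain HTML pagination pages: split the 400+ property links into pages of fewer than 100 links each, with each page linking to its neighbours so a crawler can walk the whole chain. A minimal Python sketch of that chunking, with hypothetical `/property/...` and `/city/page/...` URL paths standing in for real ones:

```python
def paginate_links(property_urls, per_page=90):
    """Split property links into crawlable static pages of at most
    `per_page` property links each. Every page also links to its
    neighbours, so the total link count stays under 100 per page and
    a crawler can reach every property by following plain <a> links."""
    chunks = [property_urls[i:i + per_page]
              for i in range(0, len(property_urls), per_page)]
    html_pages = []
    for n, chunk in enumerate(chunks):
        links = [f'<a href="{u}">{u}</a>' for u in chunk]
        if n > 0:
            links.append(f'<a href="/city/page/{n}">Previous</a>')
        if n < len(chunks) - 1:
            links.append(f'<a href="/city/page/{n + 2}">Next</a>')
        html_pages.append("\n".join(links))
    return html_pages

# 400 properties -> 5 pages of at most 92 links (90 properties + prev/next).
urls = [f"/property/{i}" for i in range(400)]
pages = paginate_links(urls)
```

With `per_page=90` a city of 400 properties needs 5 pages, and even the middle pages carry only 92 links (90 properties plus Previous/Next), which keeps each page comfortably under the 100-link guideline while every property still has at least one internal link pointing to it.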