This is true. Rewrites using [R=301,L] are great when you need a regex to perform the 301s with fewer entries in your .htaccess.
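For example, here is a minimal sketch of that approach. The old-blog/blog paths are hypothetical, and it assumes mod_rewrite is available on the server:

RewriteEngine On
# Hypothetical example: 301 every /old-blog/<slug>.html to /blog/<slug>
# with a single regex rule instead of one Redirect 301 line per page.
RewriteRule ^old-blog/(.+)\.html$ /blog/$1 [R=301,L]

One pattern like this can stand in for dozens of individual Redirect 301 entries.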
coneh34d
@coneh34d
Job Title: Head Nerd What's In Charge
Company: TSH
Favorite Thing about SEO: The depth and challenge
Latest posts made by coneh34d
- RE: Apache Rewrite Verse Redirect
- RE: Apache Rewrite Verse Redirect
A rewrite passes no link juice from the old page to the new, whereas a 301 does. In .htaccess the code is simple:
Redirect 301 /oldpage.html /newpathto/newpage.html
Cheers!
- RE: Avoiding duplicate content on internal pages
It is very important that you ensure the content is unique. Make sure it passes a test like Copyscape before publishing.
The primary reason is this: duplicate content will not get indexed. All that is necessary is to have the content written really well, either by yourself or by a service like Textbroker.com.
I run into this sort of thing all the time for my clients that offer the same service in multiple cities. If you just churn out similar content, Google will not index it. Of course, if it's not in the index, it isn't getting found.
Cheers!
- RE: Unoriginal Content Shared By Web Firm with Industry Niche
Resolving issues with a site whose content is not unique is not as simple as adding good, unique content.
Panda addresses this and the following is a good read: http://searchengineland.com/why-google-panda-is-more-a-ranking-factor-than-algorithm-update-82564.
In a nutshell, Panda "tags" your site as offending. To quote the author of the article, "Panda has been described as more of a ranking “signal” as opposed to an actual algorithm change. This means that when Panda is run it applies a “tag” to an offending site so that when the “normal” Google algo comes around, it uses that “tag” as a ranking signal (which basically tells the “normal” algo – “do NOT rank any page on this site well”)."
Read more: http://www.potpiegirl.com/2011/10/how-google-makes-algorithm-changes/#ixzz2FMiCcJR3
Now, what to do? You can:
- Replace all of the duplicate content and get all on-page and off-page SEO in line. This may allow the site to self-correct over time, or you can submit a reconsideration request to Google once it is clean (this is working more and more, even for sites hit by algos).
- Build a new, white-hat SEO site side by side with the original, using great content.
I am going through a similar situation right now, and whether you can come out of this type of algorithmic penalty cannot be predicted with 100% accuracy. It pays to hedge your bets in this case.
Cheers!
- RE: SERP Rankings: Breadcrumb appears near URL
This is expected behavior. Check out http://youtu.be/-LH5eyufqH0 from Google Webmaster Help.
Cheers!
- RE: How do i target keywords locally
The way I have handled geographic terms with little data is as follows:
- Find the pivotal seed keywords first. These are the high-traffic keywords without any geography attached. In your case I would start with the obvious (e.g., cosmetic surgeon, to be paired with a city name later).
- Next, sort the cities your client does business in by population (i.e., by likely Google traffic).
- Then create your long-tail keywords using the seed keyword and city name.
If the number is manageable, I recommend creating a targeted landing page dedicated to each keyword (e.g., Breast Augmentation Tarrytown, NY).
Now, here is where it is very important to take heed: you absolutely must have unique pages. Breast Augmentation Tarrytown and Breast Augmentation Staten Island should pass a Copyscape check when compared against each other. Otherwise, the page won't get indexed.
I do this type of work quite a bit, and presently am working with a client that services a little over one thousand cities.
Unique content is a must.
- RE: Best way to structure a service area page with many locations to maximize internal links to them?
No. What matters is that the page for each area that you service gets indexed. Unique content is necessary for this to happen.
- RE: Best way to structure a service area page with many locations to maximize internal links to them?
For the purpose of feeding the crawlers, I recommend an HTML sitemap organized by county. Make sitemap.html with links to pages for Airports, Rail, and Counties. Then make an HTML page for each of these with the individual links to the pages.
Your user experience should be driven by what is simple. I'd want to punch in my zip code to determine if you service my area. There are likely many other good ideas for the end user experience. Why not ask some current customers?
Thanks,
Eric K. Cone