Best posts made by WilliamKammer
-
RE: Can someone interpret this entry in my htaccess file into english so that I can understand?
I'm determined to master htaccess and regex... feel free to PM me the file.
RE: Why is my website providing no feedback on the Open Site Explorer?
The website is not even a month old. This doesn't give a lot of time to collect data on links, or for links to be created to the website.
From the looks of things, this domain has very few links. Data collectors like Moz will eventually find them when they refresh their data, but these kinds of tools are not real-time and won't find every single link to your domain. So if there's only one link pointing to this domain, it's possible it'll take a while for bots to discover it.
-
RE: Is it okay to copy and paste on page content into the meta description tag?
I also do this. The meta description is supposed to have a nice sentence or so that is relevant to the page and makes people click. If the content on your page can't do that, you have a bigger problem than meta descriptions.
-
RE: Can someone interpret this entry in my htaccess file into english so that I can understand?
Sorry to disappoint, Travis. Nothing too complicated, looks like it was just a botched www cleanup. A good old:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^legacytravel\.com$ [NC]
RewriteRule ^(.*)$ http://www.legacytravel.com/$1 [L,R=301]
That will do the trick.
-
RE: SEO and page redirects from a high ranking site quandary
It depends on what you're trying to accomplish. I'd go with option #2 if you're looking to transfer your rankings from the old site to the new. Once Google reindexes everything, you should have most of your ranks back.
If you're keeping the same content on the new site and the old, then option #1 isn't really an option, unless you canonicalize the old pages, which will have the same effect on ranks as the 301s.
So then it comes down to what's better for users. Do you think they would rather click through the old site and its calls to action, or would they prefer to be automatically redirected to the new property?
I would think that you would want the new website to rank instead of the old, since having the old site continue to rank would just create an extra step for users.
-
RE: Better to use specific cities or counties for SEO geographics?
Wait... the town and county have the same name? Then there's no issue.
People don't search by county and rarely put in the state when searching a city geo. Your money term is variations of, "cleaning services in Winona". Even though this phrase and others like it don't have a high search volume, experience and years of data tell me this would be the way to go. Unless you wanted to focus on carpet cleaning, which is a different ball game.
To placate your client, maybe discuss a local SEO play with G+. Then you can define the exact area you'd like to cover, which would include both, no problem.
-
RE: Manual Action - When requesting links be removed, how important to Google is the address you're sending the requests from?
Google doesn't care where the email comes from to request a link removal. I've never seen a disavow report where the email of the requester is even mentioned. All Google wants to see in a disavow report is which links you want to disavow, and how much of an effort you made to get them removed manually.
The reason your SEO is requesting an email address at your domain is likely because he's using software to request link removals, and that software requires the email. Services like Rmoov are great for streamlining the disavow process, but in order to use Rmoov, you have to prove you're part of the company, which requires the email address.
-
RE: Duplicate Content... Really?
I think you're already in Panda territory. The content can't get much thinner. It seems like all those sub-pages that are linked to on the page you just shared are unnecessary, no? Couldn't you just have the one page, build it out with the cars it works in, maybe a diagram or instruction on how to put it in, and make a really valuable page?
What's the thought process of creating a bunch of new pages, even though it's the same product, just referred to differently by different companies? Just for the unique URLs and titles?
Consolidating all of that would eliminate thin content and likely strengthen your landing page exponentially.
-
RE: Disallow statement - is this tiny anomaly enough to render Disallow invalid?
The additional asterisk shouldn't do you any harm, although standard practice seems to be just putting the "/".
Does it seem like Google is still crawling this subdomain when you look at the crawl stats in Webmaster Tools? While the Disallow directive in robots.txt will usually stop bots from crawling, it doesn't prevent them from indexing pages, or from keeping pages indexed that were indexed before the disallow was put in place. If you want these pages removed from the index, you can request it through Webmaster Tools and also use meta robots noindex instead of the robots.txt file. Moz has a good article about it here: http://moz.com/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
If you're just worried about bots crawling the subdomain, it's possible they've already stopped crawling it, but continue to index it due to history or additional indicators suggesting they should index it.
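To make the difference concrete, here's a minimal sketch of the two approaches (the paths and file placement are hypothetical):

```text
# robots.txt at the subdomain root — stops crawling, but already-indexed pages can stay indexed
User-agent: *
Disallow: /

<!-- meta robots in the <head> of each page — lets bots crawl the page and tells them to drop it from the index -->
<meta name="robots" content="noindex">
```

Note that the two don't combine well: if robots.txt blocks a page, bots never get to see its noindex tag, which is exactly the conflict the Moz article covers.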
-
RE: Using disavow tool for 404s
I wouldn't recommend using the disavow tool for this. The disavow tool is used to clean up spammy links that were not gained naturally.
A better solution is to use 301 redirects and redirect the 404'd pages to the new pages that work on your website. That way users will land where they should if they click the links, and Google will still give you juice from those links.
Here's a place to get started on how to do that: https://support.google.com/webmasters/answer/93633?hl=en
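As a rough sketch, in an Apache .htaccess file the redirects might look like this (all URLs and paths are placeholders):

```apache
# One-off redirect for a single 404'd page
Redirect 301 /old-page.html http://www.example.com/new-page/

# Pattern redirect for a whole folder of moved pages
RedirectMatch 301 ^/old-blog/(.*)$ http://www.example.com/blog/$1
```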
-
RE: Fake Links indexing in google
Looks like a hack. A hacker somehow got in at some point, dropped a bunch of Ugg Boot affiliate marketing pages and left. Not sure why they are 404ing unless someone already discovered these when they happened and cleaned them up. That could've happened months and months ago.
The 404s shouldn't affect your SEO, but the hack has the potential to if it hasn't been cleaned up properly. Do you see a spike in search queries if you look back over the last year or two? That may indicate when the hack occurred and was cleaned up. It's important to know how the hack was cleaned up, so you can ensure that the vulnerabilities have been resolved. If they haven't been, your site is still open to additional attacks, and spam like that can hurt your SEO.
For Wordpress, it's important to keep not only Wordpress itself up to date, but also your plugins (and only use well established plugins, and do a little research on them to make sure people aren't screaming about hacking issues). Hackers search for vulnerabilities in all sorts of places.
-
RE: Please share best practices for subfolders and paths in a domain name
It depends on a lot of factors, but in general, I don't like to put category subdirectories in unless they are necessary. For example, if a category page is useful to your audience, then it might be a good idea, but in this case it doesn't sound like one would be.
Also, consider how your audience will react to the URL: domain.com/attorneys looks a lot better to people, I think, than domain.com/industries/attorneys, which implies you're marketing to a bunch of different industries and may lead them to believe this is solely a marketing play and you don't actually understand their needs. A page right off the root seems more exclusive and like you care more about their industry.
These are just my opinions. From a strictly SEO perspective, you're fine either way, but I'd still go right off the root.
-
RE: SEO impact difference between a URL Rewrite and 301 redirect
To Google, there is no difference. The R=301 at the end of the rewrite rule defines it as a 301 redirect, so it's practically the same thing. For a one-off redirect I wouldn't use the rewrite format; it's usually for when you need to grab big chunks of URLs and redirect them all at once. Still, if it works this way, there's really nothing wrong with it from a redirecting standpoint. If there were, it wouldn't work when people use it for large quantities of redirects either.
-
RE: Migrating Reviews from Old SIte
Are you able to access the .htaccess file on the old website? If so, it sounds like the best thing to do would be to 301 redirect from the old review pages to the new ones. With these 301s in place, Google can figure out that the site has moved, and you aren't just spamming a ton of new reviews and manipulating dates.
If that isn't possible, implement canonical tags on the old pages.
In either case, make sure to mark up the new pages in Schema.
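The canonical option is a one-line tag in the head of each old page, pointing at its new equivalent (the URL here is hypothetical):

```html
<link rel="canonical" href="http://www.newsite.com/reviews/some-product-review/">
```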
-
RE: Best way to create content in Google Plus to help SEO
I agree with the other responses on the point that the quality content you generate should not go into a G+ post. Put your good stuff on your good site and share it through G+; it is still, technically, social media.
With that said, there is a great way to generate content and increase the ranks of your G+ listing with content on G+, and that's by relying on others. Get more reviews, link your G+ in places people will see so they follow you, and engage with them on G+.
Don't think of G+ as another site for you to put content on. Think of it as another social media site where you can attract an audience and be engaging. By doing that, you are allowing others to produce content for you on G+ while you provide value by engaging them.
-
RE: When the Plural has more traffic, but the singular makes much more sense. What to do?
Ha, yeah. This job would be so easy if only clients weren't a factor. The ones that listen are always the ones that have more success.
Good luck convincing your client. Keep your cool; it can be frustrating when clients force you to let them shoot themselves in the foot, because once their foot is bleeding, they're going to blame you for the pain.
-
RE: Identifying Bad Domains
There are tools out there you can pay for to do this work for you; things like linkdetox.com can be helpful for identifying the bad links. Something like Rmoov is great for then beginning the removal process.
If doing it on your own, you'll need to check out each questionable domain manually. You can rule out big domains that you know are not spammy, and focus on the ones you have no idea about. When looking at a domain, ask yourself the following:
-
Is this site indexed in Google?
-
Could I easily create a followed link from this domain to mine? OR pay for one?
-
Does this site seem to be scraping my information from other sources?
-
Is this site in English (or whichever language your site is in)?
-
Is it a crappy directory site? Article submission site? Other easily exploitable kind of site?
These are the sorts of things to look out for. Google doesn't want to count links for you if you made them for SEO purposes, or paid someone else to. If you look at a site and it appears their business model is building links, or they are spammy in any way, you know it's a red flag.
Also, I'm learning more and more that when you're on the fence about a domain, disavow it.
-
RE: Does the position of an author byline on a page affect authorship?
In my experience, it doesn't matter where the byline is. I've dealt with sites with it almost everywhere, and recently had a site with it in the footer with no issue. Maybe not the ideal place to have it, but still an option, it seems.
-
RE: Migrating Reviews from Old SIte
Is it possible to host the domain somewhere cheaply once it's shut down, just to have an .htaccess file on it? Then it can still be done, just make sure everything redirects somewhere in the .htaccess.
An .htaccess file on your new domain can't control web pages from the old domain to redirect them; that .htaccess file has to go on the old domain.
Correct, you cannot canonical tag if the old site is shut down.
Schema generators are to taste, pick one you feel comfortable using and double check the markup.
-
RE: Language Specific Social Account
Yes, and have each one direct to the proper language on the website. If I'm a social media user, why would I want to engage with an entity that only speaks my language a third of the time?
-
RE: Is it worth disavowing scrapers' links?
I make sure to disavow these as well. No need to disavow each link individually, just knock out the whole domain in the disavow.
This became common practice for me once I had to clean up a manual action penalty that wouldn't go away until the scrapers were disavowed.
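For reference, the disavow file Google expects is plain text, one entry per line; the domain: prefix knocks out the whole domain rather than a single URL (the domains below are made up):

```text
# scraper domains, removal requested but no response
domain:scraper-example.com
domain:another-scraper.net
```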
-
RE: Why is my website providing no feedback on the Open Site Explorer?
I'm referring to inbound links to your website. That's what Open Site Explorer shows you, but since your domain doesn't have much yet and it's all pretty new, it's not displaying yet.
-
RE: A good review schema markup tutorial?
Depending on the amount of reviews you need to mark up, this tool may be all you need: http://schema-creator.org/review.php
Just embed the generated code and you'll be good to go, or it will at least give you a good start.
There are also Wordpress plugins that allow this sort of generation in the backend.
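Generators like that one typically output schema.org microdata along these lines (every name and value below is a made-up placeholder):

```html
<div itemscope itemtype="http://schema.org/Review">
  <div itemprop="itemReviewed" itemscope itemtype="http://schema.org/Thing">
    <span itemprop="name">Acme Widget</span>
  </div>
  <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span>
  </div>
  <p itemprop="reviewBody">Solid widget; does exactly what it says.</p>
</div>
```

Whichever way you generate it, run the result through Google's structured data testing tool before going live.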
Hope that helps.
-
RE: Site with no ads hit by Page Layout update?
I wouldn't attribute this to a page layout penalty.
Make sure to keep track of your rankings, so you can check whether there was a dip in ranks that correlates with the dip in traffic. Keep in mind that ranks do fluctuate, especially if someone else is actively focusing on the terms.
This could very well be other people gaining ranks above you, or search volume being a little low for a bit. The data isn't large enough to determine trends. A few dozen impressions here and there could be anything. Take larger time-spans of data to determine trends to figure out why things are happening, but this isn't a display penalty in my opinion.
-
RE: Claiming Google+ URLs?
I do sometimes see hijacked G+ pages, or other shady G+ activities, but in my experience flagging it or calling Google resolves the issue quickly. You're always going to have people trying to game the system, just make sure to know how to defend yourself against it.
As for +TacoBellCom, first you would have to have the business named "Taco Bell" and get that verified (good luck there). Then, you'd have to hope that Google lets you add characters to the end of the URL instead of deciding it for you. Then, you would have to hope you were never reported, flagged, or otherwise looked at by a human on Google's end. So, odds are it wouldn't work for too long.
That's not even getting into the legality of trademark infringement.
-
RE: Sitemap indexed pages dropping
Try to determine when the drop off started, and try to remember what kinds of changes the website was going through during that time. That could help point to the reason for the drop in indexing.
There are plenty of reasons why Google may choose not to index pages, so this will take some digging. Here are some places to start the search:
-
Check your robots.txt to ensure those pages are still crawlable
-
Check to make sure the content on those pages isn't duplicated somewhere else on the Web.
-
Check to see if there were any canonical changes on the site around when the drop started
-
Check to make sure the sitemap currently on the site matches the one you submitted to Webmasters, and that your CMS didn't auto-generate a new one
-
Make sure the quality of the pages is worth indexing. You said your traffic didn't really take a hit, so it's not de-indexing your quality stuff.
-
RE: Had SEO Firm tell me to Start Over - pros and cons help please
You can recover, but I wouldn't recommend hiring one of those companies that said you couldn't.
-
RE: Chinese Sites Linking With Bizarre Keywords Creating 404's
Check this post out, it may help https://support.google.com/webmasters/answer/93713?hl=en
EGOL knows what he/she/it's talking about. You can report and disavow if you'd like, but worry about yourself. These people don't go away and are better off ignored if they aren't directly affecting your marketing efforts.
-
RE: SEO Consulting for HUGE Website. How Big Is TOO Big Of A Change?
Sounds like the site is big enough that you have the luxury of taking a nice little chunk of pages, doing your tests, seeing what happens, and then deciding whether to make the change site-wide. Take a good sample of pages across your site to test on, make sure you know their baseline ranks and the traffic to those pages, make the changes, monitor, test some more, etc. This way, no guesswork.
-
What should go in a "Link Juice" cocktail?
The office just got some of those fancy Google Partner bottles, and I want to fill them up with a tasty cocktail that is link juice themed.
Any ideas on how to tie Google and/or SEO into a fancy cocktail?
-
RE: Can new content be added to a url which has a 301 redirect?
Moving the content over and 301ing the old URL will do that. You'll see a dip in ranking, then a recovery period (usually weeks as opposed to months). That's your best bet when you are forced to make a move like that. Ideally, you'd find a way to keep the content where it is, but if that's not possible, then the 301 will do.
-
RE: Merging two pages into one - bad seo done previously
The page 2 URL looks spammy, and if it provides the same kind of content as page 1, then I would agree that 301ing page 2 is a good option.
-
RE: Same Alt tag on the images
It's OK to have his name, just add more to it to differentiate the photos.
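For example (the filenames and wording are hypothetical):

```html
<img src="photo-1.jpg" alt="John Smith speaking at an industry conference">
<img src="photo-2.jpg" alt="John Smith headshot for the about page">
```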
-
RE: Too Many Links on One Page - What to Do?!
Google updated their guidelines a while ago, and no longer suggests the 100 or less links per page. Now the guideline simply states, "Keep the links on a given page to a reasonable number," which is subjective. https://support.google.com/webmasters/answer/35769?hl=en
With a site like yours, full of different kinds of forms and such, it's logical to consider having 100+ links per page. There are other options for you as well, if you believe these links are hurting, but according to Google they likely are not.
If you wanted to try something different, you could think about building out detailed category pages for each sections of things you offer on the site, and make those the pages that rank for your terms. This way, the number of links on your main page is dramatically reduced, and the user experience might improve, since things aren't quite as condensed.
-
RE: Correct approach to a business website with separate content for personal and business customers
Depending on the size of the company, it might be better to leave subdomains out of it. Subdomains are like brand new sites, so you'd be working a lot harder to get things ranking if you went that route. You don't have a ton of pages per category, so everything might be more manageable under the single domain with good URL structure.
Websites that aren't AT&T need all the help they can get, and keeping things in subfolders instead of subdomains will help keep all your authority under a single domain.
So, site.com/business, site.com/personal, etc.
-
RE: What can I do to stop ranking for a keyword that has nothing to do with the companies website?
What do you call a porn-addicted Mozzer? A person with a lot of link juice.
Yeah, I heard Moz gets a lot of porn traffic. I also heard they get a lot of unauthorized backdoor entries.
Something something DDOS attack. Something something too many partners.
I'll be here all week, or until Moz bans me.
I'm sorry. I'll go back to spreadsheets.
-
RE: Will subdomains with duplicate content hurt my SEO? (solutions to ranking in different areas)
Subdomains are seen as completely different websites. This means you would be diluting your authority. Using duplicate content on these subdomains would also definitely be an issue.
Your best bet would be to have different pages for each location, and make sure each of those pages has unique content. This way it's all on the same website, with fresh content on each page, and you still accomplish what you were looking to do.
You could also go one step further and link each of the location pages to individual G+ pages and take advantage there as well.
-
RE: Google Webmaster Tools Search Traffic->Manual Actions shows "No Manual WebSpam Actions Found" but is it right?
While the manual action may have been removed, it's possible you're still suffering from a different, non-manual penalty (one the algorithm dishes out without manual (human) involvement). So, it's good news that you no longer have the manual action, but it doesn't mean you have no penalty at all.
-
RE: Merging domains into subdomains
This is really a case-by-case basis, and a lot of different factors should be taken into consideration:
- I think the first thing to consider is: why? Why do you want to consolidate the sites? If they are all somewhat different, all get traffic, and are doing their jobs, why change it? Is it broken? If not, why fix it?
- Does the brand of the strongest domain make sense as an umbrella to the other websites? Don't try to force domains together that don't make sense for the visitor.
- If it is strong enough, are the sites similar enough to where you could fold them into a single domain and brand (subfolders instead of subdomains)? I would recommend subfolders to subdomains if it still makes sense for the website.
Trying to consolidate multiple established domains into one domain, or a domain and its subdomains, is possible but risky. Ask yourself if it's worth the risk to do all this, or if this is just one person's whim to organize domains without knowing what could happen.
In my experience, 301ing one domain to another (if done properly) should take about 2-4 weeks to recover the majority of organic traffic. Rarely have I seen a domain recover 100% of its organic traffic after a 301, but you will recover the majority of your ranks. So, if you are 301ing multiple domains, that would mean you are losing a sliver of traffic on each of them, and that doesn't count the downtime during the 301 transition.
And of course, the above is assuming all 301s are done properly, which could be a pretty big undertaking from the sound of it.
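As a rough sketch, one of those per-domain 301s in the old domain's .htaccess might look like this (the domains and folder name are placeholders):

```apache
RewriteEngine On
# Send everything from olddomain.com (www or not) into its new subfolder
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.maindomain.com/oldbrand/$1 [L,R=301]
```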
-
RE: 301 for a Very Long URL
You can always go in and edit the .htaccess file yourself to put in the redirect. Here is some help with that; you'll likely find what you need under "301 Redirects in Apache": http://moz.com/learn/seo/redirection
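For a single URL, however long, the plain Redirect directive is usually the simplest route, since the path is matched as plain text with no regex to escape (the paths here are placeholders):

```apache
Redirect 301 /category/sub/a-very-long-product-name-with-many-words.html http://www.example.com/products/short-name/
```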
-
RE: Is my GeoSiteMap correct?
Nope, you're good. This is the style I use, and I've had good success with it. As long as your KML file is on-point, this should be fine. You're just telling the bots where the KML file is.
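For reference, a geo sitemap in that style is just a small XML file pointing bots at the KML (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
  <url>
    <loc>http://www.example.com/locations.kml</loc>
    <geo:geo>
      <geo:format>kml</geo:format>
    </geo:geo>
  </url>
</urlset>
```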
-
RE: Optimizing Webpages for Keywords- Using Text Links to Distribute Internal Page Rank
Charging on-page SEO at a per-page price is a bit out of the norm in my experience. Most SEO companies will charge a monthly fee and will then work on what they deem more important to the campaign.
Keeping that in mind, answering your question is difficult without the price points, but if it were me, internal linking would be part of the on-page service. If it's strictly a content company, they won't do this, but if it's an SEO company, internal linking is a given.
Once the content is complete, the internal linking that is involved is not a large undertaking in comparison. It takes some time and strategy, but a fraction of the content building efforts that hopefully took place.
-
RE: Merging domains into subdomains
If consolidating the domains into subdomains is inevitable, then 301s are going to be the way to go. Just be careful to redirect everything to where it should go in the subdomain, and do it in one chunk if you can. You don't want to 301 some stuff, then re-301 or do a second round later.
Be as clear as you can to Google. Once you are ready to move the site, do it all at once, so Google can clearly see that everything from the old domain is now at the new one. So yeah, wholesale switch; just make sure the client understands what he's getting himself into.
There is no exact number with regards to traffic loss, chance of recovery, risk of things not going smoothly, etc. Make sure they are aware of all the risks and how to best lower the possibility of something bad happening.
-
RE: How can I fix this home page crawl error ?
This was brought up a little while ago, hopefully Chiaryn's answer here can help: http://moz.com/community/q/without-robots-txt-no-crawling
-
RE: New un-natural links to my website that i didnt create.. and lots of them!
A disavow report is a must. Also make sure there's no manual action message in Webmasters that may require a reconsideration request.
Definitely contact all the webmasters to request removal and document the requests, they will be useful for the reconsideration if needed.
If it was a bot, it seems like there would be a lot more than 5 links a week. A number that low looks more like someone manually adding things, which could be a sign of negative SEO tactics by a competitor (if it's not an SEO company you hired that is just bad at their job).
-
RE: Optimizing Webpages for Keywords- Using Text Links to Distribute Internal Page Rank
I hate to step on another SEO's business model, but for that price they should absolutely include the internal linking. I would be wary of an SEO that is trying to break down every single process into its smallest parts in order to charge you for each one.
At an average reputable SEO company, that $7k would go a lot further, I feel.
This is just my opinion. When it comes to pricing services, everyone does it differently and charges for different things. So, in my opinion, you are getting nickeled and dimed.
-
RE: Taking Back Google+ Page Managed By Advertiser
This is the process I use when I need to get control of the map listing:
-
Go to the listing page and click on "manage this page." If the button isn't there, click on "edit details" below the details section of the business.
-
This should bring you to a "Request for manager access" page, where you can fill out the information telling Google you are the owner, not the advertisers.
-
Now, if you don't want to wait a million years for anything to happen with the page, you can click on "contact us" at the top right corner of the page and have Google call you.
-
Tell a human at Google the problem and it's possible they will expedite the process.
-
Follow up with the human every few days. The Google human should give you a way to contact them directly through email.
-
RE: How can I fix this home page crawl error ?
Yeah, your robots.txt seems fine, but the answer sounded like the error code could be misleading, so maybe you're looking in the wrong area for the root of the problem. Wish I could be of more help.
-
RE: Hidden H1 Tags
I have seen this done in order to help screen readers, but I definitely do not recommend this. They may argue that semantic rules of HTML 5 dictate that this is an okay practice, and it is for HTML 5, but not for Google.
Google doesn't adopt new rules or semantic changes right away. Changes have to take hold and become common practice. This isn't common enough, and hidden H1s do open you up for potential penalties.
-
RE: Why doesn't Google show my site in the results when searching for my exact URL ?
So the website itself is being redirected? That has the potential to get it removed from the index. A 301 redirect tells search engines that the site has moved permanently, which means you're recommending they not index it, since it's not there anymore.
Also make sure to submit a sitemap to Google Webmasters to see if there are any errors there.