I would like to know: is it possible to track a keyword's past ranking history? We would like to check the ranking history of some potential vendors, but we only get data from the point I add them to Alabs, Moz, etc. I'd like to go back a year to see the trend in their rankings. Is this possible, and what tool or process can be used to achieve it? Thanks in advance.
anthonytjm
@anthonytjm
Job Title: Senior Web Developer
Company: TJM Promotions
Website Description
We design and manufacture promotional products for awareness, promotions, and custom applications.
Favorite Thing about SEO
Getting Ranked!
Latest posts made by anthonytjm
-
Is it possible to track rank history
-
RE: How often does the WMT incoming links gets updated?
I'm not sure the inbound links report in GWT is ever accurate. I've worked with several sites that had links removed over a year ago, and they still show up in GWT, even when the linking site no longer exists. I guess such is the penalty for sites that once violated Google's TOS.
I think this would also delay the fixing of an algorithmic penalty if GWT hasn't recognized the newly revised, more accurate set of inbound links.
-
Landing pages rank higher than home page
I've been tasked with working on several sites for a client that had some shady SEO work done in the past: weak content, tons of useless links all pointing to the home page, etc. I've gone through each site, done a total rewrite, and worked on removing and disavowing links.
The majority of the sites' home pages used to rank high on the first page of Google. Now the sites are closer to position 100 than the first page, and the odd thing is that the home pages can't be found, but pages like the terms and conditions or disclaimer are ranking. These pages have hardly any content compared to the home pages, and while they sit 80 to 100 positions back, they are at least ranking while the home pages can't be found at all.
We have tried changing anchor text to "click here" and "for more info" instead of the keyword, and nothing much has changed yet. Competitors still have pretty much all keyword anchor text and worthless links, and they still rank well.
Am I wasting time with the rewrites and link disavows, or should I continue and eventually get the sites' home pages back on the radar? We have added a social media presence and a blog that we post to regularly, and after six months of work we have hardly made a dent in rankings.
Am I on the right track or spinning my wheels? Any advice is appreciated.
-
Boatload of 301 Redirects Question
We have a client that recently did a site makeover. Previously they had all their pages in the root directory, including 75+ spammy article pages. In the makeover they moved all the article pages into a directory and added 301 redirects. In going over their site we noticed they have redundant articles, like blue-marble-article.htm and blue-marbles-article.htm, playing on singular versus plural with duplicate content for the most part, the only real difference being the plural.
Of their 75 articles, I'd say a third are actually somewhat original content. I would like to 301 redirect the other two thirds to better, rewritten article pages, but that would add a whole lot more 301 redirects.
We would then have a 301 redirect from the root directory to the article directory, then another 301 redirect from the spammy article to the new rewritten article.
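Just to illustrate, in .htaccess terms the difference between the chain and a single hop would look roughly like this (the file names are only examples, and the rewritten "-guide" target is hypothetical):
# Two hops: old root-level URL -> articles directory -> rewritten article
Redirect 301 /blue-marbles-article.htm /articles/blue-marbles-article.htm
Redirect 301 /articles/blue-marbles-article.htm /articles/blue-marble-guide.htm
# Versus one hop: point the old root-level URL straight at the final rewritten article
Redirect 301 /blue-marbles-article.htm /articles/blue-marble-guide.htm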
- My question is, would this be too many redirects for Googlebot to sort through, and would it be too confusing or send bad signals?
- Or should I create a new directory with all the good articles and just redirect the entire old articles directory to the new one?
- Or should I just delete the redirects and the old spammy directory and let those URLs fall on a 404 error page?
I'd hate to lose 50-75 pages, but I think those spammy pages could in fact be why the site fell from the top of the first page of Google to the third page, and now the 10th page, in a year's time.
I know, I'm confused just typing this out. I hope it makes enough sense to get some good feedback and advice. Thanks.
-
Should a subdomain blog have www or non-www?
My main site has an .htaccess rewrite condition to force www.
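(It's the usual force-www pattern, roughly like the following. I'm using domain.com as a stand-in, and this is a simplified sketch rather than my exact rules.)
RewriteEngine On
# Send any request for the bare domain to the www version
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]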
I just added a blog on a subdomain and have it at www.blog.domain.com.
Is this set up correctly, or should it be http://blog.domain.com, or does it even make a difference? A colleague just pointed this out and said it should be non-www for the blog or any subdomains.
Thanks for any clarification.
-
Month-old site already ranks #3 for competitive keyword
I know this individual does this with several sites and then offers them for sale to his competitors. It's obviously spammy through and through, but how can Google reward a site that's not even two months old and has 1,900+ links with a #3 ranking for a highly competitive keyword?
Please don't post the actual name or URL of the website, as we don't want to give him any more credit, but this blows my mind, as he has done this several times with other sites and never gets penalized.
Any ideas as to how he can accomplish this, other than the almost 2,000 links in less than two months? How is that even remotely natural?
I know his other sites have been reported to Google, but they never did anything about it.
Thanks for any feedback.
-
RE: Does Google have a problem crawling SSL sites?
Thanks for the replies. I think we have the http issue fixed and will work on the footer area next. Thanks again for the heads up.
-
Does Google have a problem crawling SSL sites?
We have a site that was ranking well and recently dropped in traffic and rankings. The whole site is https, not just the shopping pages; that's the way the server is set up, the host makes the whole site https.
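(If it helps to picture it, forcing a whole site onto https is typically done with the kind of rule below. This is a generic sketch, not our actual config.)
RewriteEngine On
# Redirect any plain http request to the https version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]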
My manager thinks the drop in rankings is due to Google not crawling https. I think otherwise, but would like some feedback on this. Site is here
-
RE: Link directories question
Thanks, Matt.
The SEO agency came into play post-Penguin/Panda. Flags and bells are going off in my head like a four-alarm fire as I review their monthly reports. All the links use the main keyword phrase and point to the home page. There is no variety in phrase or landing page, and I suspect that's why the client is not ranking well and got an SEO agency involved.
Said agency claims to use white-hat practices, but I just can't find any, as stated above.
Before I post my findings, I'd like to make sure my approach and suspicions are valid and backed up by others in this fine community.
-
Link directories question
I'm looking over a client's site, and they have a bunch of link directory links that seem very suspect to me, but the MozRank and authority on the directories' home pages seem to be OK.
One directory is addlinkzfree, and it has the same template and layout as a few other directories this client has links on. The link page itself has no juice whatsoever, but the home page has PA 54 and MR 5.04, and the root domain is DA 45.
At first glance these would appear to be respectable numbers, right? But the name of the directory and the multitude of links lead me to think it's nothing but a link farm.
Should I advise the client to run and try to remove the links from these types of sites even though the home page has decent scores? I'm of the mindset that any directory with "links", "free", "partners", etc. in the title needs to be avoided.
I would appreciate any backup on this, or am I just being paranoid?
Best posts made by anthonytjm
-
RE: Does anyone have any opinions on whether to use a pipe | in your title tags? Or commas or just nothing at all?
I think the way you are suggesting would come across as keyword stuffing. A proper title should read more like a short statement, such as the following:
Men's leather boots and footwear at bootsjeansandleathers.com
The above has a more natural flow of text in statement form while still encapsulating your keywords. Try to target only one or two keyword phrases per web page; otherwise you're diluting your page and losing its focus.
-
RE: Is purchasing internal page links from directories worth it?
I think Google's stance is that any paid link for search engine ranking purposes is bad; purchased links for advertising purposes are OK. If you're purchasing internal links from directories for the sole purpose of rankings, I think Google will pick up on that and end up penalizing you sooner or later.
-
RE: Could you use a robots.txt file to disalow a duplicate content page from being crawled?
Well, the answer would be yes and no. A robots.txt file would stop the bots from crawling the page, but links from other pages to that blocked page could still get its URL indexed. As posted in Google Webmaster Tools:
"You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file (not even an empty one).
While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results."
I think the best way to avoid any conflict is applying the rel="canonical" tag to each duplicate page that you don't want indexed.
You can find more info on rel canonical here
Hope this helps out some.
-
RE: Are these 'not found' errors a concern?
Are they internal missing links or external missing links?
If internal, then that would be a big problem. You can click on the link, which will show you three options: error details, in sitemap, or linked from. Make sure none are from the sitemap.
If external, it's not so much a problem, because you can't control who tries to link to your site. My Webmaster Tools shows missing links all the time from outside sources we have nothing to do with. I usually just contact the source and ask them to remove the link, or, if it's from a semi-quality site, I'll ask them to fix the link and give them the accurate URL.
I would make sure you have a 404 error page set up for any bad URLs pointing to your site. Hope this helps.
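If the site is on Apache, a custom 404 page can be set with a single line in .htaccess (a sketch; /404.html is just an example path):
# Serve a custom page for any URL on the site that doesn't exist
ErrorDocument 404 /404.html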
-
RE: Google Releases Penguin Update 1.1
We had two different product sites that ranked number 1 in Google for their respective keywords. They basically had the same outside SEO company strategy applied to both. Right before Penguin we did a site-wide makeover: we noticed some articles were duplicates or too similar, so we removed the dupes and applied 301 redirects. We also added the .htaccess code for non-www to www. Penguin comes along and wham: one fell like a rock, the other is still ranking strong at number 1.
Now how can that be? They have almost the same number of pages and similar inbound links and linking footprints.
The company that did the SEO work closed down overnight, so I am now left having to contact the questionable inbound link sources and try to get the links removed. Other than that, I cannot see why one flourished and the other dropped.
-
RE: Changing domain names... Bad idea?
I think you have the right idea and are on the right track. I believe that with 301 redirects to the new site you will transfer most if not all of the link juice and page authority to the new domain name. And you should see a boost in rankings from having the keyword in there as well. So I think it's a win-win all the way around for you.
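If the old site is on Apache, the domain-wide 301 usually looks something like this (a sketch with olddomain.com and newdomain.com as placeholders; adjust to the real domain names):
RewriteEngine On
# Send every request on the old domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]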
-
RE: Internal Site Structure Question (URL Formation and Internal Link Design)
If you're planning on adding several different article directories, why not have a main Articles link in the main navigation, then have drop-down menus for the subdirectory articles? So it would look something like this:
/articles/
/articles/buying-guide/
/articles/software/
/articles/hardware/
etc, etc, etc.
Or you could have your main articles landing page list all the links under various subcategories, with H-tag titles separating the directories.
-
RE: Why does GWT fine duplicate descriptions where none exist?
The problem you're having is that you have duplicate title, description, and keyword tags in your source code. If you're using a CMS application, check your template meta settings. I think the template metadata is being added to every page, and then you are also adding unique tags on each page, so both sets end up in the source.
<title>Hearts Pest Control Service of Rancho Bernardo</title>
-
RE: What are your favorite tactics for getting links to money-pages?
We worked on a website that sells silicone wristbands, not an exciting product anymore by any means. What we did to help create a buzz was partner with various awareness causes during a given month and post polls and fundraising results on landing pages. Then we had the awareness causes we worked with link to those landing pages, where visitors could see the amount of money raised, read more about the cause, and even place an order to support it.
If the client has a mailing list of any significant size, you can send out email blasts offering discounts, free upgrades, or other incentives that are further explained on landing pages, and build traffic and possibly links to those pages as well.
I'm also looking forward to hearing some strategies others may have to offer.
-
RE: How to move domain content w Penguin Penalty?
Per Google Webmaster Tools:
Removing an entire directory or site
In order for a directory or site-wide removal to be successful, the directory or site must be disallowed in the site's robots.txt file. For example, in order to remove the http://www.example.com/secret/ directory, your robots.txt file would need to include:
User-agent: *
Disallow: /secret/
It isn't enough for the root of the directory to return a 404 status code, because it's possible for a directory to return a 404 but still serve out files underneath it. Using robots.txt to block a directory (or an entire site) ensures that all the URLs under that directory (or site) are blocked as well. You can test whether a directory has been blocked correctly using either the Fetch as Googlebot or Test robots.txt features in Webmaster Tools.
Only verified owners of a site can request removal of an entire site or directory in Webmaster Tools. To request removal of a directory or site, click on the site in question, then go to Site configuration > Crawler access > Remove URL. If you enter the root of your site as the URL you want to remove, you'll be asked to confirm that you want to remove the entire site. If you enter a subdirectory, select the "Remove directory" option from the drop-down menu.
I first got started in web design back in 1999 working as a freelancer. Seven years ago I moved to Ocala, Florida and started working for TJM Promotions. We manufacture and design just about every promotional product under the sun.