Can anyone show me a site that has followed the SEOmoz SEO rules?
-
Hi, I have been reading the SEO information on here, which is very interesting, and I would like to know if anyone can point to any sites that have followed the rules and advice.
It is great to read the info and rules, but I feel it is even better to see a site that has followed them, and to hear from people who have put the information into practice and can explain what results they got.
I am currently building the following website: http://www.womenlifestylemagazine.com
It would be great to see a site that has followed all the rules, with an explanation of whether they worked or not.
-
I am not sure that it helps rankings yet, but it cannot hurt them, and it will help at some point in the future. As search engines try harder to understand what sites are about, using contextual markup will help. These changes to HTML, along with schema.org rich snippets, will be used in the future. SEO is evolutionary, as are development practices, and it is all about staying ahead of the competition when search engines change the playing field.
I do know that following the guidelines here, getting strong relevant links built, and ensuring a fast user experience helps the sites we built rank. Does it give us an advantage over the competition? Maybe; only time will tell.
-
Hi, this is a good site. So does this help with rankings in Google by showing what the articles on the site are? At the moment I use sef404, so I would have to add this to the site. sef404 uses h1 and h2 tags, etc.
-
We are setting up a site at http://www.dreambuilders.com.au which uses all those tags to separate articles from navigation and the aside. It is still in development, but the HTML5 tags are set up.
Brett
-
Can you give me an example of a page where you put this into place, please?
-
Sure Diane, thanks. In HTML5 there are specific tags to denote the type of content:
-
<article> - means that the content between these tags is the main content</article>
-
<nav> - is for the navigation links</nav>
-
<aside> - is subsidiary content, such as ad content and general information</aside>
This allows for separation of concerns and lets your site have a logical flow while still providing contextual information about the content. If you look at our markup, you will see content wrapped in these tags.
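To illustrate, a minimal page skeleton using these three elements might look like the sketch below. The element names come from the HTML5 standard; the headings, links, and text are invented placeholders, not taken from any real site mentioned in this thread:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example Article Page</title>
</head>
<body>
  <nav>
    <!-- navigation links: clearly separated from the main content -->
    <a href="/">Home</a>
    <a href="/articles">Articles</a>
  </nav>
  <article>
    <!-- the main content search engines should focus on -->
    <h1>Article Headline</h1>
    <p>The body of the article goes here.</p>
  </article>
  <aside>
    <!-- subsidiary content: ads, related links, general information -->
    <p>Advertisement or related information.</p>
  </aside>
</body>
</html>
```

The point is that the markup itself now says which block is the article, which is navigation, and which is subsidiary, rather than leaving a crawler to guess from generic divs.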
-
-
Very impressed with your site. Can you explain what you mean by the following: "We tag to HTML5 where it is clear what an article or main section is and what navigation or subsidiary links are"?
-
Hi, I am a developer and hire an external SEO to do the link building, but we do the site optimization ourselves, following the guidelines pointed out here and using the Moz tools, for our site www.oznappies.com
We tag to HTML5 where it is clear what an article or main section is and what navigation or subsidiary links are, as these are defined in the standard. This means we have full control over the content that Google will index. I also noticed that Google is including site speed in their beta analytics, so we optimise for performance, using best practices and a CDN for JS libraries. It is worth running your site through www.gtmetrix.com to see where you have performance issues that will affect rank in the near future, as Google is aiming at a 5-second load time for user experience.
We are a new site (3 months old) and have moved from 100+ to page 1 for all our targeted key phrases, including the most competitive ones. We have in-house content authors writing original content every couple of days and posting on relevant forums and in blog comments. We are now in the process of tagging with schema.org rich snippets, to prepare for search engines factoring this in.
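As a rough sketch of what that schema.org tagging looks like, an article can be marked up with microdata properties. The property names (`headline`, `author`, `datePublished`, `articleBody`) come from the schema.org Article type; the headline, author name, and date here are invented examples, not content from the actual site:

```html
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example Article Headline</h1>
  <!-- author and publication date as machine-readable properties -->
  <span itemprop="author">Jane Smith</span>
  <time itemprop="datePublished" datetime="2011-06-01">1 June 2011</time>
  <div itemprop="articleBody">
    <p>The article text goes here.</p>
  </div>
</article>
```

The microdata attributes sit on top of the normal HTML5 markup, so they add machine-readable context without changing how the page renders.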
-
Oh yes, content is god.
You need content first.
For links, refer to http://www.seomoz.org/beginners-guide-to-seo/growing-popularity-and-links
-
-
Sorry, what I mean regarding the info is: the SEO info, as far as I understand it, is saying that content should come first rather than links to other pages, as otherwise Google sees the page as a link page: http://www.seomoz.org/beginners-guide-to-seo/basics-of-search-engine-friendly-design-and-development
But I cannot see any sites that follow this rule.
The other question is about the links that you gain from other sites: do you join link-swapping sites, contact websites to request a link exchange, write articles to gain links, or use other methods to gain good-quality links to your site?
-
I was talking about links from other sites to your site.
Actually, I don't understand your question.
-
Just looked at your site, which is a great site. I have a question. The info I have been reading basically says that if the links come first, Google just thinks it is a link page, and that the content should come first so Google understands what the site is about. But, saying this, you have your links first according to the cache and the recommended tool, and 99 per cent of sites also have the links first, which is understandable. Now this confuses me, because naturally you want your menu bar at the top or at the side, so I do not understand why the SEO information is basically saying that you need your description and content before your links, unless I am misreading it.
-
Thank you. I am just about to look at your site. What do you look for when you search for quality links? Do you join a link exchange programme, ask for links, or look for sites where you can gain a link?
-
Hi there. Well, I have pretty much followed the advice on here for my website (at least I have tried).
Feel free to take a look. Garden Beet performs very well in natural search, but then I spend oodles of time optimising, both onsite and offsite.
The best offsite SEO strategy is getting high-quality links that point to your domain. I have seen my website perform well; the problem is getting the right anchor text pointing to the correct page. That kind of task is what will give you a position in the top 3.
The best onsite SEO tasks are all the meta tags, unique content, etc. BUT your site has to look good to get people to share or bookmark it. While a sexy site is overlooked by the robots, it's what draws people in.
What I find amazing is the gap in knowledge between SEO people and developers. Don't believe developers who say they know all about SEO; unless they are actually doing it regularly, I fail to see how they can keep up.
If you have a very large website, sometimes developers do not understand the key terms you are optimising a given page for, and inadvertently apply an h tag to a non-optimised sub-heading. You need to watch the developers for little issues like this to ensure your SEO objectives are being achieved for each page. Even templates sometimes get assigned the wrong h tags, duplicating the error site-wide. It is difficult to know whether fixing errors such as these has a direct impact on search straight away.
But I do know that when all my meta was incorrectly copied over to a new site, I lost over 50 key search positions; as soon as they were restored, my positions were reinstated. Let me say that was not fun, but it clearly demonstrates the importance of correct meta titles and descriptions.
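For anyone unsure where those live, the meta title and description sit in the page head. A quick sketch (the title and description text here are invented placeholders, not the actual Garden Beet tags):

```html
<head>
  <!-- the meta title: shown as the clickable headline in search results -->
  <title>Vertical Garden Planters | Example Shop Name</title>
  <!-- the meta description: often shown as the snippet under the headline -->
  <meta name="description" content="A short, unique summary of this page,
    written around the key phrase it is optimised for.">
</head>
```

Each page needs its own unique title and description; copying them over incorrectly site-wide is exactly the kind of error described above.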