Need to know best practices for Search Engine Optimization in 2013
-
I want to know the best practices for Search Engine Optimization in 2013, and I also need the best possible sources.
Thanks
-
I found this article useful: http://www.searchenginejournal.com/seo-in-2013-7-surprisingly-simple-factors-that-will-take-the-lead/57092/
Hope it helps,
Best regards,
Holger
-
Agreed, I shouldn't have left them off.
-
Great recommendations, Brad.
I'd add searchengineland.com as well, and look into presentations from the SMX conferences as sources.
-
My short answer would be to use the SEOmoz toolset to uncover some of your basic issues and work to correct them. From there I would start working on developing and building content on your site that can earn links. Earning links will require a significant amount of outreach to raise awareness about your content. I would also encourage you to focus on the user and what is good for them above your own thoughts about SEO. It is a delicate balance but building things just for SEO is never a good solution.
Here are some things I would encourage you to look at:
- All Whiteboard Fridays
http://www.seomoz.org/blog/category/whiteboard-friday
- All recent Matt Cutts videos from the Google Webmaster Help YouTube channel
http://www.youtube.com/user/GoogleWebmasterHelp/videos?view=0
- If it is in the budget, grab a ticket to MozCon in July
Hope this helps.
Brad
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. Nofollow Tags
Hi there, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page, Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/), to focus mainly on the keywords 'Anaesthetist Jobs'. But I have recognized that there are ongoing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ and https://wave.com.au/asa/

We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This then leads me to deal with the duplicate pages with either a canonical link (content manageable), or maybe alternatively adding a nofollow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use Google Index in the sitemap to tell Google that these are of less value?

What is the best approach? Should I add a canonical link to the landing pages pointing them to the category page? Or alternatively, should I use the Google Index? Or even another approach? Any advice would be greatly appreciated. Thanks!
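Given the stated goal of consolidating the landing pages into the core category page, the usual signal is a canonical link in the head of each near-duplicate, pointing at the page you want to rank. A sketch using the URLs from the question:

```html
<!-- In the <head> of https://wave.com.au/anaesthetists/ and https://wave.com.au/asa/ -->
<link rel="canonical" href="https://wave.com.au/job-specialties/anaesthetics/" />
```

Unlike a robots.txt block, this still lets crawlers reach the duplicates and consolidate their signals onto the canonical target.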
Intermediate & Advanced SEO | Wavelength_International
-
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing that. For reference, the reason we are looking for methods that could speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although theoretically that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate for our site.
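One option worth knowing about for bulk cases like this: the noindex directive can be served as an HTTP response header instead of a meta tag, which works even when the HTML itself is mis-formatted. A hedged nginx sketch; the location path is a placeholder, to be scoped to wherever the 500,000 URLs actually live:

```nginx
# Serve "X-Robots-Tag: noindex" for the broken section only.
# "/broken-section/" is hypothetical; match it to the real URL pattern.
location /broken-section/ {
    add_header X-Robots-Tag "noindex";
}
```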
Intermediate & Advanced SEO | teddef
-
What is the best practice for "sorting" URLs to prevent indexing and preserve link juice?
We are now introducing 5 links on all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000.

Until now Google has been indexing our site well, but I would like to prevent the "sorting URLs" from leading to less complete crawling of our core pages, especially since we are planning further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking into consideration link juice optimization? On a technical level the sorting is implemented in a way that the whole page is reloaded, for which there may be better options as well.
Intermediate & Advanced SEO | lcourse
-
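For the sorting question above, one common pattern is to let the sorted variants be crawled but keep them out of the index while still passing link equity, via a robots meta tag served only on the parameterized views. A sketch; the sort parameter name here is hypothetical:

```html
<!-- Output on e.g. /category?sort=price_asc, but NOT on /category itself -->
<meta name="robots" content="noindex, follow">
```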
Best practices for robots.txt -- allow one page but not the others?
So, we have a page, like domain.com/searchhere, but search results are being crawled (and shouldn't be); result URLs look like domain.com/searchhere?query1. If I block /searchhere?, will it block crawlers from the single page /searchhere (because I still want that page to be indexed)? What is the recommended best practice for this?
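Per Google's documented robots.txt matching, the longest matching rule wins and Allow beats Disallow on ties, so `Disallow: /searchhere?` blocks only the query-string variants while /searchhere itself stays crawlable. A minimal Python sketch of that evaluation (an illustration of the matching logic, not Google's actual parser):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    # robots.txt wildcards: '*' matches any run of characters,
    # a trailing '$' anchors the pattern to the end of the path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.match("^" + regex + ("$" if anchored else ""), path) is not None

def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    # Longest matching pattern wins; "Allow" wins ties; no match means allowed.
    best_len, allowed = -1, True
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            if len(pattern) > best_len or (len(pattern) == best_len and directive == "Allow"):
                best_len, allowed = len(pattern), (directive == "Allow")
    return allowed

rules = [("Disallow", "/searchhere?")]
print(is_allowed("/searchhere", rules))         # True: the page itself stays crawlable
print(is_allowed("/searchhere?query1", rules))  # False: result URLs are blocked
```

Note that robots.txt only controls crawling, not indexing: an already-indexed /searchhere?query1 URL can linger in the index even after being blocked.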
Intermediate & Advanced SEO | nicole.healthline
-
Need Reviews on my new website
Hi, I recently developed this website: http://goo.gl/fl5a5 and started link building to it, getting some very good links so far. So far so good, but I would ask some of the experienced folks here to post reviews and share suggestions so that I can rank better. It's been a month since I started link building to this site. PS: I have cloned my competitor's site with unique content. Will this become an issue? You can check my competitor's site by Googling my site's entire title. Please let me know your thoughts on this.
Intermediate & Advanced SEO | Vegit
-
Canonical issue, need help
Hi, does my site have any issue with duplicate pages within the site, and have I defined my canonical tag properly? Can anyone advise? Please help. childrensfunkyfurniture.com
Intermediate & Advanced SEO | conversiontactics
-
Best linking practice for international domains
SEOmoz team, I am wondering whether, in the days of Panda and Penguin, SEOs have an opinion on how best to link between the international domains of a web property. Let's say you have brandname.DE (German site), brandname.FR (French site), and brandname.CO.UK (British site). Right now we are linking from each site to the other two language sites to make users aware of the translated versions, which obviously makes these site-wide links, something Google seems to have discouraged lately. Does anyone out there have ideas on how to strategically interlink international domains that represent language versions of a web site? /PP
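One crawler-facing complement (or alternative) to visible site-wide links is hreflang annotations, which declare the language versions directly to search engines rather than through followed links. A sketch for the three domains named in the question:

```html
<!-- In the <head> of every page; the same set goes on all three domains -->
<link rel="alternate" hreflang="de" href="https://brandname.de/" />
<link rel="alternate" hreflang="fr" href="https://brandname.fr/" />
<link rel="alternate" hreflang="en-gb" href="https://brandname.co.uk/" />
```

Each page should reference its own translated equivalents (not just the homepages), and the annotations must be reciprocal across all three sites to be honored.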
Intermediate & Advanced SEO | tomypro
-
Need a trained eye to help with a quick search to see if there’s a poison pill buried somewhere on my site!
This is an e-commerce site that I've worked on and run for 5 years, which ranks from middle to top in just about all of the quality analytics scores when compared to the top 10 competitors in Google, yet this site can hardly stay on the 3rd page, let alone the 1st. The only weakness in metrics that I see is that I need more linking root domains and traffic. Any suggestions will be greatly appreciated. Lowell
Intermediate & Advanced SEO | lwnickens