SEO Developers
-
I have a team of inexperienced SEO developers, and the argument we continually have is that it's a marketing role. However, most of it is technical: no view state, no JS, page load time, CSS sprites, meta tags, frequency of updates to the server, duplicate content via coding methodology, loading content prior to ads, not spidering ads (IT says impossible, yet Google says required), etc.
I looked at your referrals for developers and couldn't find any that recognized SEO as part of their skill set. Do you believe there are developers that specialize in this?
Thanks,
Michelle
-
One option is to have all ad code in separate files that get loaded on page view through an include. If you then place all the ad files in their own folder, you can disallow the entire folder in the robots.txt file.
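As a minimal sketch, assuming the ad include files live in a folder named /ads/ (the folder name is illustrative), the robots.txt rule would look like:

```text
# Keep crawlers out of the folder that holds the ad include files
User-agent: *
Disallow: /ads/
```

Note that Disallow blocks crawling rather than indexing as such; a URL that is heavily linked from elsewhere can still appear in results without a snippet.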
Engineering and marketing go hand-in-hand regarding SEO. What engineers call "impossible" regarding SEO usually stems from a lack of specific methodology, and from the human tendency to conclude that if they have never done something, never seen it done, and never been shown how, then it must be impossible.
This mindset is common among people who are extremely logic-oriented. Rather than arguing, it's much more efficient to do the research and find the answers. Only at that point is it wise to go back and say, "Here's a way it can be done." Show them links to actual blog articles or discussion forum threads where the topic is discussed, if need be.
It comes down to understanding that it's a teaching moment, with no ego involved: purely educating others so everyone can work together for the common good.
-
It sounds like they want you to disallow those components in your robots.txt file to keep them from getting indexed by search engines. Here's what Google Webmaster Help says about robots.txt. If the ads are in an iframe, you can disallow the page the iframe points to. If it's a Flash file, for example, and the link is in the Flash, you can block robots from crawling the ads by putting all of them in their own directory. For ads that will get indexed (if they're in the HTML), putting rel="nofollow" on the links is, I believe, considered sufficient by search engines.
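As a hypothetical illustration (the URL and link text are made up), a nofollowed ad link in the HTML would look like:

```html
<!-- rel="nofollow" tells search engines not to follow this link or pass credit through it -->
<a href="http://ad.example.com/click?id=123" rel="nofollow">Advertiser name</a>
```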
For page speed, there are a few free tools to help. In Chrome, you can install Page Speed; you can install the add-on in Firefox as well if you've installed Firebug first. Once it's installed, you can have it test any page on your site, and it'll give you a list of things to do to improve performance. Another similar Firefox add-on, which I haven't had much experience with, is YSlow.
-
DART is the industry-standard software used to serve ads on a site. Google Webmaster Tools indicates that ads should not be crawled, using the robots.txt mechanism.
I administer most of this software as a marketer, but several items, such as page speed, are out of my range of skills.
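Before relying on a Disallow rule like the one discussed in this thread, it can be sanity-checked; here is a small sketch using Python's standard-library robotparser (the /ads/ path is an assumption for illustration, not something DART itself requires):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed the parser the same lines we would serve from /robots.txt
rp.parse([
    "User-agent: *",
    "Disallow: /ads/",
])

# Anything under /ads/ is blocked for all crawlers; the rest of the site is not
print(rp.can_fetch("*", "http://example.com/ads/banner.js"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))     # True
```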
-
Sorry, but what is DART code? Looked around a bit but couldn't find any info about it.
Depending on what you want IT and marketing to do... mostly I monitor the tools and tell marketing and IT what needs to be done. I don't think IT would need access; you should be able to tell them what changes need to be made without them digging through data and reports. Marketing could use the tools to find good keywords to target and, especially if they do link building, to find opportunities for that.
-
Thanks. Do you know how to keep the robots from crawling the DART code? My developer is saying that's not possible. Also, would you expect IT and marketing to access your tools, or marketing only?
Thanks again,
Michelle
-
Good developers should do a lot of these things by default: optimize page load time, use sprites, avoid duplicate content, load page content prior to ads, etc. A good SEO should be aware of all of these things and, when things need to change, should be able to communicate those changes to a developer. Identifying these issues is more on the SEOs themselves, not the developers. In my experience, most tasks are front-end tasks and a few are back-end, so depending on what your developer does, they should be able to handle the tasks within their niche if you point them out.
I don't think they need to put "SEO" in their skill-set.
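As one illustration of the sprite technique mentioned above (file and class names are hypothetical), several small icons are combined into one image and selected with background-position, which reduces the number of HTTP requests:

```css
/* One combined sprites.png replaces many individual icon requests */
.icon {
  background-image: url("sprites.png");
  background-repeat: no-repeat;
  display: inline-block;
  width: 16px;
  height: 16px;
}
/* Each icon is a 16px-wide slice of the sprite */
.icon-search { background-position: 0 0; }
.icon-cart   { background-position: -16px 0; }
```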
Related Questions
-
Express.js and SEO?
Hi fellow Mozzers, I have been tasked with providing some SEO recommendations for a website that is to be built using Express.js and Angular. I wondered whether anyone has had any experience with such a framework? On checking a website built with this, and viewing it as Googlebot using the following tools, it appears as though most of the content is invisible: http://www.webconfs.com/search-engine-spider-simulator.php http://www.browseo.net/ Obviously this is a huge issue, and I wonder if there are any workarounds or recommendations to assist (even if it means moving away from this framework; I would love to hear about it)
Technical SEO | | musthavemarketing2 -
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.

In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed the robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.

Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site, and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this:

Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.

This is really affecting our client's SEO plan, and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain. When I visit Remove URLs, I enter dev.rollerbannerscheap.co.uk, but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk.
I want to remove a subdomain, not a page. Can anyone help please?
Technical SEO | | SO_UK0 -
Writing URL query strings to be SEO friendly
I understand the basic concepts of URL rewriting and creating inbound and outbound rules, including creating rules to rewrite URL query strings so that they're readable and SEO friendly. It's simple when dealing with a small number of pages and database records (Microsoft Server, ASP.NET 4.0, IIS 7). However, I need to understand the concept to handle this:

We have a database of 10,000+ establishments, 650+ cities, and 400+ suburbs. Each establishment can be searched for by country, province, city and suburb. The search results show establishments that match the search criteria. Each establishment has its own unique id, and each establishment in the search results table has a link to the establishment's detailed profile aspx page. The link is a query string such as http://www.ubuntustay.com/detailed.aspx?id=4, which opens the establishment's profile. We need to rewrite the URL to be something like http://www.ubuntustay.com/detailed.aspx/capetown/westerncape/capetown/campsbay/diamondhouse, which should still open the same establishment profile as the query string above.

I can manually create a rule for this one example without a problem. But there are over 10,000 establishments, all in different provinces, cities and suburbs. Surely we don't manually generate a rewrite rule for each establishment? The resulting rule set would be rather large(?!) Therefore my questions are: How do I create URL rewrite rules for dynamic query strings that originate from a large dataset? How do I translate the id number into the equivalent <country>/<province>/<city>/<suburb>/<establishment> syntax? Do I have to wire up global.asax so that every incoming request extracts the country, province, city and suburb based on the establishment id, which seems a bit cumbersome(?)

If you're wondering how I currently do it (it works, but it's not very portable or efficient): for each establishment included in the search results I simply construct the link URL as http://www.ubuntustay.com/detailed.aspx/4/Diamond%20House/Camps%20Bay/Cape%20Town. On the detailed.aspx page load I simply extract the record id (4 in the example above) from the query string and select that record from the db. Claude, what I'm looking for is advice on the best approach to creating these rewrite rules, and I would be grateful if you could have one of your SEO friends lend their advice and experience. Any web resources that show the above techniques would be great. I'm not really looking for simple web links to URL rewriting overviews; I have plenty of those. It's the detail on the specific requirement above that I need, please.
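A sketch of the single-generic-rule approach, assuming the IIS URL Rewrite module (a web.config fragment, not .htaccess) and using the page and segment names from the example URLs above; one pattern handles all 10,000+ establishments, so no per-record rules are needed:

```xml
<!-- Rewrite /detailed.aspx/{country}/{province}/{city}/{suburb}/{slug}
     back to the query-string form the page already understands -->
<rewrite>
  <rules>
    <rule name="EstablishmentProfile" stopProcessing="true">
      <match url="^detailed\.aspx/([^/]+)/([^/]+)/([^/]+)/([^/]+)/([^/]+)$" />
      <action type="Rewrite"
              url="detailed.aspx?country={R:1}&amp;province={R:2}&amp;city={R:3}&amp;suburb={R:4}&amp;slug={R:5}" />
    </rule>
  </rules>
</rewrite>
```

The page would then look the establishment up by its slug (or keep the numeric id as one of the segments) rather than relying on one rewrite rule per record.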
Technical SEO | | claudeSteyn0 -
Advice on improving this content page for SEO and Google
Hi, I use Joomla and I am looking for some help to find out what I should be doing to make my content pages better for SEO and Google. I would be grateful if people would look at the following page as an example: http://www.in2town.co.uk/trip-advisor/top-american-ski-resorts-for-over-50s and let me know what I should be doing to make it better, so people can find the page. I am using the above page as an example so I can learn from it. I would be grateful if people could look at the source code for the page to see if there is anything that should be in there that is not, and whether I should be looking at any Joomla plugins to improve the SEO of my content pages. Any help would be great. Many thanks
Technical SEO | | ClaireH-1848860 -
Videos for SEO & Profits
Hello, I'm in the middle of developing a website that will be a tutorial site for SEO, http://universityofseo.com. My plan is to do video tutorials and blog posts to help entry-level SEOs and SMB owners become familiarized with SEO through quick and easy-to-watch videos. I eventually want to turn this into a revenue stream through advertisements. For both SEO and profit reasons, I want to know if I should host the videos on YouTube and then embed them on my site, or do something like Bits on the Run / Wistia and put ads in the videos that way? I'm not overly obsessed with monetizing the site, but it would be nice; first and foremost I'm concerned with optimizing the site and having great, actionable content, then monetizing it. I'd appreciate any help on this matter, Zach
Technical SEO | | Zachary_Russell0 -
Can local SEO harm national rankings?
Today I met with a firm called Localeze that provides local directory submissions. I understand the importance of this service if your site is competing locally; however, I'm not sure of the effects of local SEO on a national brand. Our firm gets most of its traffic from across the country, not just one location, and our business is scattered (which is a good thing). We rank for service-related keywords that are not tied to a location. We do not show up in local results, so our business in our immediate location is weak. We would like to increase our local presence in search engines, but I want to make sure this will not take away from our national presence. Will optimizing a site for local search negatively affect general rankings? Thanks
Technical SEO | | KevinBloom1 -
What are the SEO related negative aspects of having a blog on a subdomain?
Just double-checking this one. Is it less than ideal to have a blog on a subdomain rather than in a subfolder on a domain?
Technical SEO | | PerchDigital0