Best posts made by donford
-
RE: Two days since the supposed update
I read the topic the other way around. I wasn't aware of the proposed Moz update date.
-
RE: H2 Tags- Can you have more than 1 H2 tag
You may use more than one of any Hx tag. There has been some debate about risking penalties for using more than one H1, but with the way HTML5 sites are going, resistance is waning and more people are doing it. Before that debate, I don't recall much conflict about using H2 tags more than once. I would just be careful about it and use them appropriately.
-
RE: Keyword Targeting with Dynamic Pages
Hi Niners,
Most current CMSs (content management systems) operate in a dynamic capacity, which means the issues you are facing are not unique to .net applications. This is actually good news: the more people who face the issues, the better the chance there is some help out there.
I have built or worked on over a dozen large-scale eCommerce applications in my time, and the one thing they all have in common is that even though they are dynamic in nature, they all required extensive time and content development. The ones that have done the best had the most time spent on creating content and working on SEO. Since .net is not an actual language but a framework, I think most people would be hard pressed to provide more targeted info. The main key is using a CMS that accommodates user-specific tweaking so you can differentiate yourself from the other sites using the same platform. In the end, all of this requires time and effort.
Hope that helps
-
RE: Can white text over images hurt your SEO?
Hi Thomas,
The last I heard, Google / Bing / MSN / Yahoo have no automatic way to know if you are obfuscating text. The way sites are built now, layers on layers or divs inside divs, it would be pretty difficult to decipher all the code just to check for hidden text. However, if a competitor catches you doing it, reports it, and then the search engines do a manual check, you're likely going to get dinged.
I haven't seen anything new on this subject in a year or more but looking at your site I don't think this is your problem. In fact our corporate site uses white text on an image on every single page and we have no issues.
-
RE: Why is Moz not crawling my backlinks
Hi Ashish,
How much time has elapsed since you've been tracking with Moz? Typically Moz does an update once a month.
-
RE: Google Not Pulling The Right Title Tag & Meta Description
Just want to echo Chris's response and also point you to an article about Matt Cutts (head of Google's webspam team) talking about just this subject. http://searchengineland.com/googles-matt-cutts-look-title-match-query-190039
Hope it helps!
Don
-
RE: When Site:Domain Search Run on Google, SSL Error Appears on One URL, Will this Harm Ranking
Hi Kingalan1,
A couple things I hope will help.
First, the suffix at the end of the URL, :2082, indicates a port; the typical configuration is that hitting port 2082 redirects to the site's cPanel. This URL should never be indexed and never displayed in any SERP (Search Engine Results Page). I'm not sure how this URL got submitted to Google or indexed, but you certainly don't want it there. It could be in an auto-created XML sitemap, or if you used the host's site-submit tool, they could have done it. Something you want to look into.
Second, looking at the site nyc-officespace-leader.com, it does not appear to use an SSL certificate, or the certificate is improperly configured. If you go to any of the site's URLs and change HTTP to HTTPS you'll notice this error: (Error code: ssl_error_rx_record_too_long). There is a helpful post on stackoverflow.com that deals exactly with this error.
Hope this info helps,
Don
-
RE: Ranking dynamic landing pages
Hi Mark,
There really isn't a big secret to ranking dynamic pages. Provided they all have unique URLs, it is just a matter of populating the content based off of some parameter. I can't say what the best way to do this is, but I can give you an example of how I would handle, let's say, a site like LinkedIn.
So we have a search box that requires some input; let's say we use a name.
Run a query in a database to see if we have a match. If we have multiple matches, then query basic information (Location, Title, Image), show them to the user, and let them select the correct one.
Once a user is selected, use that information to populate the page. I would set as many variables as I could based off of our database information. Say we queried my name (Don Ford), found multiple matches, displayed them to the user, and they selected me.
$person_fname=Don
$person_lname=Ford
$person_job=Web Developer
$person_company=Columbia Engineered Rubber
etc...
Then I would use these variables to generate my title, description, H1 and H2 tags, and content, and serve it up. Provided the page has a unique URL, you get my page on LinkedIn. It can be included in a sitemap, indexed by search engines, and found online.
Example:
<title>$person_fname . ' ' . $person_lname . ' Professional ' . $person_job . ' At ' . $person_company</title>
The real trick is how much unique content you can store in your database and return to the user. The more you have, the more unique content you can display.
Since it's dynamic, you make everything possible dynamic, including all the SEO basics. It's really that simple.
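The idea above can be sketched in a few lines. This is a minimal illustration in Python (standing in for the PHP-style variables in the post); the record and field names are hypothetical, and in practice they would come from the database query described earlier.

```python
# Illustrative record, standing in for the result of the database lookup.
person = {
    "fname": "Don",
    "lname": "Ford",
    "job": "Web Developer",
    "company": "Columbia Engineered Rubber",
}

def build_title(p):
    # Mirrors the $person_* title pattern from the example above.
    return f"{p['fname']} {p['lname']} Professional {p['job']} At {p['company']}"

def build_meta_description(p):
    # The description can reuse the same fields in a longer pattern.
    return f"{p['fname']} {p['lname']} is a {p['job']} at {p['company']}."
```

The same dictionary can feed the H1, H2, and body content, so every unique URL gets unique SEO fields for free.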
-
RE: Is it important to keep your website home index page simple to rank better?
Hi Alan,
I will focus a tad on the general question rather than an overview of your site.
"Should my homepage be simple?"
I tend to think of it this way. A home page should be like meeting a new person, or a first date if you like. The idea is pretty simple,
Hi my name is Don
I enjoy online games, website development, and cooking.
I work at Columbia Engineered Rubber, Inc
My job includes; web development, IT development, and business to business relations
I live in Vandalia, Ohio
I'm 38 years old
It was nice to meet you. Do you want to know more about anything I said above? Click for more info...
That would be an outline for my homepage. For a business it is the same: tell me what you do, where you're located, and give me options to explore. When I type a brand into Google, what do I expect to see? I expect to see general information about the brand. Most of the time Google accomplishes this by serving up the home page for said brand as the top SERP result.
So in short, yes your homepage should be simple. You don't want to hear about my 2 year bout with athletes foot on the first date do you?
-
RE: Canonical tag - link juice to the frontpage
As HTML, both are correct; the /> is a standard self-closing tag, however browsers and crawlers do not require the "/" to know the tag is closed.
In XML you would close the element with a standard closing tag.
That should answer the technical question.
I also want to mention something about what you wrote: "I want to use it on pages that rank together with the frontpage in Google, but I only want the frontpage to rank alone and to have the link juice from the other 2 sites directed to the frontpage."
You should understand that the purpose of canonical tags is to keep pages from competing with each other. A good example is if you have a page with info but also want to give users a printer-friendly version (you don't want those pages competing).
If you have two pages with stronger content than your home page for "Keyword X", you could be doing yourself a disservice by trying to rank a weaker page.
-
RE: How does a search engine bot navigate past a .PDF link?
Hi Dana
I think your question has been dodged a tad. I was always led to understand that a .pdf, or any page that opens in a new tab and does not link back to the original site (a dangling page), is not a problem. The reason is that crawlers don't really care how a page is opened. The crawler forks at every link and crawls each new page from each fork; when it finds an orphan or dangling page it just stops. This isn't an issue, since the crawler has already forked at every other link.
So the question is how a search engine treats .pdfs rather than how it treats orphan pages. Maybe somebody who works with crawlers can confirm or educate us both on how they work.
Don
-
RE: Is it important to keep your website home index page simple to rank better?
Hi Alan,
If I understand you correctly, it sounds like you're afraid that by moving content off of your homepage you may lose rank?
If this is the case I would not worry; you can easily create applicable pages and get them ranking with the same content. I am trying not to critique your site, as I'm sure you have already gotten a wide range of advice on what to do and what not to do. What I would do is create an introduction about your company, list a few of your best properties' highlights, and redo the navigation. Robert hit on many things that can be addressed. To give you an example of a small website focused on one thing, you can view a similar site I made here: http://rubberprototyping.com/
The idea is to create a flow (with clicks, not scrolling) to present the user the information they are looking for without too much searching around. A side note: while having more pages can be harder to maintain without a CMS (content management system), it is actually beneficial for SEO purposes; larger sites have a tendency to attract more keywords and rank better than smaller sites.
Hope this helps
Don
-
RE: 804 HTTPS (SSL) error
Hello Happy,
Okay, so you have content being served as http on your https page. When you reference an image or script, you need to make sure it is a relative reference or an https reference, otherwise you will get these types of warnings.
See Mozilla Facts here
Also see the image attached.
Also, the SSL isn't misconfigured, it is missing. To configure one properly you need to contact your host and ask them to install an SSL cert (most hosts will not allow users to do this themselves). If you have not yet purchased an SSL cert you will need to do so. SSL certs also require dedicated IP addresses, which most hosts also charge for.
In summary, if you purchase a dedicated IP and an SSL certificate, your problem should go away unless you specifically declare content as http.
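For illustration (the file paths and domain are hypothetical), the difference between a reference that triggers a mixed-content warning and ones that do not looks like this:

```html
<!-- Triggers a mixed-content warning on an https page: -->
<img src="http://www.example.com/images/logo.png">

<!-- Safe: a relative reference inherits the page's protocol -->
<img src="/images/logo.png">

<!-- Safe: an explicit https reference -->
<img src="https://www.example.com/images/logo.png">
```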
Hope this helps,
Don
-
RE: What is better for web ranking? A domain or subdomain?
I think the answer is basically the same: they will be treated differently. You won't see a direct benefit from latching on to the main domain.
In some cases it does make sense to use sub-domains, for example forums.domain.com. This gives the user both keywords, forums & the domain name (an indirect benefit), and it makes it easier to manage content in a CMS by having the forums on a sub-domain. Heck, Google does this for many of its products, as does Amazon. In most cases these are services offered by the main domain. For example: Google->AdWords, Google->Translate, Amazon->Seller Central, etc.
However, if they are completely different items or services, I would just use a different domain entirely. I would point to my work as an example. I work for a custom rubber manufacturer whose primary focus is new product development and manufacturing for OEMs. But some of the products on our website were generating a lot of interest and phone calls from smaller opportunities. Instead of ignoring this option and potential revenue, we built a distribution site (on a different domain) to handle these smaller opportunities. Had we done this on a sub-domain, we could have confused larger OEM clients into thinking we are a retailer or middle man, when in fact we are a manufacturer.
I hope that clears things up. Really evaluate the reason for the sub-domain: if it is a sub-product / service of the main domain and there is no clear conflict, then making it a sub-domain is okay; otherwise, and in most cases, it will likely make more sense to use a different domain entirely.
These are my thoughts, I hope they help,
Don
-
RE: Canonicial redirect non-www to WWW in Magento
Magento has its own little system for redirection. If you cannot access a non-www page via http or https, then you are okay.
Here is some further help for other users:
http://kb.siteground.com/how_to_redirect_magento_to_open_through_www/
-
RE: To Genesis or not to Genesis that is the question
I am not a big WP user, but my experience with osCommerce may help here. When you take an open-source item and lay on it any set of modifications that are not open source, you run the risk of becoming dependent on the modification's developers for help with issues.
Perhaps you should also browse the WordPress forums to see what others are asking and saying. I know on osCommerce, as soon as you mention a non-open-source add-on or modification, your thread can be deleted and people clam up or tell you to go ask the developer.
As a business, if you are building sites with another party's software, just make sure your clients know about it, and about the differences between a vanilla installation and one with the third-party framework. I don't think there is anything wrong with using third-party programs, as long as you have time to become familiar with the differences and are able to relay help to your clients if need be.
I am always leery of getting stuck relying on other businesses for help when a serious problem arises. This is why I love open-source software and its communities.
Good luck with whatever you choose!
Don
-
RE: 804 HTTPS (SSL) error
I just ran a crawl and did not see any 804's
You can view the results here:
You may want to contact Moz directly to see if one of the Moz staff can help you further.
-
RE: "Hot Desk" type office space to establish addresses in multiple locations
Thanks for the reply Miriam. I was interested to see where this topic went.
Nice links!
-
RE: Duplicate Content - Captcha on Contact Form
The rel=canonical tag should fix the issue. Something like this in the head section of the page (with your own URL in the href; example.com here is a placeholder):
<link rel="canonical" href="https://www.example.com/contact/" />
Here is the Google Article about Rel Canonical.
Hope this helps,
Don
-
RE: Duplicate Content Issue: Mobile vs. Desktop View
Hi Dino,
Before I said too much I had to look at Visual Composer. I spent about 10 minutes there and didn't really see how the code turns out. Perhaps you could post a link to the webpage, or just message me if you don't want it public. I'll be happy to review the source and offer a thumbs up or any suggestions I can.
Good luck,
Don
-
RE: 804 HTTPS (SSL) error
Yes, you need to work with Moz support to get the issue fixed.
-
RE: Training Website Improvements...
Hi Gaz,
I'm glad you found it helpful. I can certainly concede the point about the Google layout; I am sure more people find that type of layout pleasant than not, or it wouldn't be so popular.
I think you really have a firm grasp on what to do, and it certainly looks like it is going to require some thought. I kind of wish more people had given you some opinions to help you with your future development.
Best of luck,
Don
-
RE: Block Domain in robots.txt
Hi Philipp,
I have not heard of Google going rogue like this before; however, I have seen it with other search engines (Baidu).
I would first verify that the robots.txt is configured correctly, and verify there are no links anywhere to the domain. The reason I mentioned this earlier was due to this official notification from Google: https://support.google.com/webmasters/answer/156449?rd=1
While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results.
My next thought would be: did Google start crawling the site before the robots.txt blocked it from doing so? This may have caused Google to start the indexing process, which is not instantaneous; then the new URLs appear after the robots.txt went into effect. The solution is to add the noindex meta tag, or put an explicit block on the server as I mention above.
If you are worried about duplicate content issues, you may be able to at least canonical the subdomain URLs to the correct URL.
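For reference, the noindex tag mentioned above is a one-liner in each page's head section:

```html
<meta name="robots" content="noindex">
```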
Hope that helps and good luck
-
RE: Duplicate Content Issue: Mobile vs. Desktop View
Hi Dino,
I don't see any issues. It is okay to use multiple H1 tags for reasons such as this; Google has confirmed multiple H1 tags are okay.
My example above was probably more alarming to you than I realized. My effort was to point out a simple case of how to use CSS for multiple device types. In your case, having different text is for the benefit of the user, which is exactly as it should be.
Good job,
Don
-
RE: Open site explorer is giving me strange redirect message.
Hello,
Sorry for not getting back to you sooner. Weekend and all..
Okay the problem is still there. You can check the header response codes yourself here:
http://tools.seobook.com/server-header-checker
The URL http://www.a-fotografy.co.uk/ 302 redirects to https://www.a-fotografy.co.uk/ which 301 redirects to https://a-fotografy.co.uk/
There are two possible problems I can think of: 1) the code to redirect http://www.a-fotografy.co.uk/ is still in the htaccess file, before the code I gave you; or 2) the host has a domain redirect in place that executes on the server before the htaccess is read.
For me to help you further please post the contents of your htaccess file and I'll see if there is something I can pick up on.
Don
-
RE: More bad links
Hi David,
Sorry to hear about your experience with a bad SEO company. Unfortunately, it is a common occurrence. A positive is that you're now engaged in the SEO process more intimately.
Your question about new links from disavowed domains, Google keeps your disavow list on file so any new links from previously disavowed domains should have no impact (positive or negative). Remember you can only have one disavow file so make sure you keep it updated.
Building positive links requires research and time. Moz has a beginner's guide to link building, and there are a few good YouMoz posts about it. I will share my approach as a quick recap.
- Research competitors (note their backlinks). The Moz Competitive Link Finder is a great tool!
- Research my demographic's posting habits. That is, find the best forums / blogs where your customers are posting.
- Create accounts on these sites
- Engage! Find topics about your types of products or services and create compelling, informative content
- Build the links! When you're posting on these forums or blogs, link to relevant content on your site where appropriate. Use a proper title and keywords when linking, and make sure the page you link to is absolutely beneficial to the readers. If it is not, you will lose credibility and possibly be viewed as a spammer.
- Not every post you make needs a link. Establish yourself as an authority and make sure your site has content to back that up. Done correctly, when you do drop a link, people will pay attention, take note, and possibly help you in future posts by linking to your site themselves.
Hope this helps,
Best of luck,
Don
-
RE: Dev Site Was Indexed By Google
Hi Tyler,
You definitely don't want to battle yourself over duplicate content. If the current sub-domains have little link juice (inbound links), I would simply block the domain from being further indexed. If there are a couple of pages of high value, it may be worth the time to use a 301 redirect to prevent losing any links / juice.
Using robots.txt or noindex meta tags may work, but in my personal experience the easiest and most efficient way to block indexing is simply to use .htaccess / .htpasswd. This will prevent anybody without credentials from even viewing your site, effectively blocking all spiders / bots and unwanted snoopers.
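A rough sketch of that .htaccess approach (the paths and realm name are hypothetical; the password file is generated with Apache's htpasswd utility and should live outside the web root):

```apache
# .htaccess in the dev site's document root
AuthType Basic
AuthName "Development Site"
AuthUserFile /home/user/.htpasswd
Require valid-user
```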
-
RE: Duplicate Content Issue: Mobile vs. Desktop View
Hi Dino,
Is your code something (basic) like this: the same content ("I love lamp!") duplicated in two containers, one for the desktop view and one for the mobile view, with a switch to determine which container to show?
If so, the correct way would be to use the switch to select which CSS file to load instead. That way a single piece of markup ("I love lamp!") can use the same class but display differently based on the user's view.
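The switch idea can be sketched server-side. A minimal Python illustration, assuming a deliberately naive user-agent check (real code would use a proper device-detection library) and hypothetical stylesheet names:

```python
def pick_stylesheet(user_agent):
    """Return which CSS file to serve based on a naive user-agent check."""
    mobile_markers = ("Mobile", "Android", "iPhone", "iPad")
    if any(marker in user_agent for marker in mobile_markers):
        return "mobile.css"
    return "desktop.css"
```

The markup stays identical for every visitor; only the stylesheet changes, so there is no duplicated content for crawlers to see.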
Here is a nice article about switching CSS based on views.
Hope that helps,
Don
-
RE: Why would so many links be appearing in the source code of this page - but not on the page itself?
It looks like navigation that is not working correctly. This is based on the fact that the links that aren't showing are nested inside the navbar.
The jQuery file being referenced is old (jquery-1.4.2.js), so upgrading it may fix the problem; at the time of writing the current version is 1.10.2: http://jqueryui.com/download/
For the bonus I'm guessing that is a http://www.aspdotnetstorefront.com/default.aspx store.
Good luck,
Don
-
RE: Duplicated Meta Descriptions on Dynamic Paginated Pages
Hi Andy,
I have two options. First, the easiest: canonical the pages to the main page. Pretty simple, but possibly not the best.
The second solution would be to make the descriptions & titles dynamic as well. For example (I don't have all the info, so bear with me):
Base Description: A selection of the best Silversea cruise deals taking in over 800 destinations across all 7 continents.
On Category Selection:
$category Silversea Cruise Deals $silversea_deal_1
Which would give you these 4 initial descriptions:
- Recommended Silversea Cruise Deals
- Cruise Price Silversea Cruise Deals
- Sail Date Silversea Cruise Deals
- Best Value Silversea Cruise Deals
Then you define the deals..
$silversea_deal_1 = 4 Day 3 Night Key West To Bahama's $499
Now you append the variable to the description:
- Recommended Silversea Cruise Deals 4 Day 3 Night Key West To Bahama's $499
- Cruise Price Silversea Cruise Deals 4 Day 3 Night Key West To Bahama's $499
- Sail Date Silversea Cruise Deals 4 Day 3 Night Key West To Bahama's $499
- Best Value Silversea Cruise Deals 4 Day 3 Night Key West To Bahama's $499
That is how I typically handle multiple pages that need to rank SEO-wise but have dynamic content. You just find a pattern you can use and plug in the variables, just like any of the content, but you do it inside the SEO parts.
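The pattern above can be sketched in a few lines. A minimal Python illustration (standing in for the $-variable pseudocode; the category and deal values are taken from the example):

```python
def build_description(category, deal=None):
    # Pattern from the example: "$category Silversea Cruise Deals $silversea_deal_1"
    text = f"{category} Silversea Cruise Deals"
    if deal:
        text += f" {deal}"
    return text

categories = ["Recommended", "Cruise Price", "Sail Date", "Best Value"]
deal = "4 Day 3 Night Key West To Bahama's $499"

# One unique meta description per paginated/sorted view.
descriptions = [build_description(c, deal) for c in categories]
```

The same function works for titles; you just swap in a different base pattern.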
Hope that makes sense and helps,
Don
-
RE: Are pages not included in navigation given less "weight"
Great answer Dirk and I completely agree.
-
RE: Hyphens vs Underscores
Hi Logan,
I was faced with a similar question a couple of years ago when I started with my current company.
The short answer is no, do not change a url that is currently using underscores to hyphens if it is well indexed.
If you're making a new page, then you should probably use hyphens instead of underscores.
-
RE: Traffic from organic grew significantly. But why?
Hi Yannick,
Sounds like you got mentioned in a few posts, perhaps on a DIY blog or the like; these tend to drive traffic fairly well. Check your backlink profile and see if you notice any new sites popping up. A few months ago customer service said, "What is up with all these new teflon o-ring sales lately?" I did the research, and it turns out we were mentioned on two forums with links to our teflon o-ring section.
Hope this helps,
Don
-
RE: Why my website sudden gone down its ranking?
Given the recency of the change and the content I did see, it is likely applicable. Even if you never got a notice about it being in violation does not mean that Google's algorithm update didn't affect you. Perhaps others may see something I missed.
-
RE: Clients Slow to Publish Content
Hi Kyle,
The process you described is similar to what happens here. I had an updated website (theme and functionality) done as of December 2011, and it is still being approved through various channels. Technical information must be accurate and approved by engineering, legal wants protection, and marketing wants SEO godliness.
I'm not sure what particular job you are trying to fill within this company, whether it's SEO, web development, content / copywriting, or all three. I think the best solution is to help your client understand exactly what you do and what results are obtained. If they don't understand the process of publishing content, then you may be responsible for teaching them, so you can actually function without them harassing you.
Me, I'm mostly the web developer here, but I also do some marketing work. When we get technical mumbo jumbo back from the engineers, we are responsible for staying true to the jargon while still using our keywords, optimizing the information for SEO, and presenting it in a pleasing manner to our customers.
The short answer is your client must understand the process and what your role is. They must also understand that your service is contingent on certain needs that you don't control. They have to share the responsibility of making things happen.
-
RE: How does Google treat anchor tags on badges after penguin update?
I've actually seen this question before (I looked but couldn't find the discussion), sorry.
What I've seen some folks offer as advice is to try to rotate your anchor text if possible. Nobody gave a solid yes or no, but most cautioned that it seemed risky.
My thought is to try to use some dynamic anchor text. About the only thing that will differ from site to site is the domain name, so...
Website Design of XXX.com By DonFord.com — just use PHP or ASP to pull in the domain name for the anchor text.
Furthermore, it may be prudent to create a page for the client on your site and point the link there. A simple outline of their service and your service, a few images, and maybe ask the client for a small snippet about the work you did for them. This would prevent thousands of links going to the same page, and still bring in traffic from people who like the site you designed. Most people can probably figure out to hit your logo or something to go to your homepage. The added benefit is it makes it easy to see how much traffic each one of these sites sends your way.
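The rotation idea can be sketched simply. A minimal Python illustration (standing in for the PHP/ASP the post mentions; the domains are placeholders):

```python
def badge_anchor_text(client_domain, designer_domain="DonFord.com"):
    """Vary the badge's anchor text per client by pulling in the client's domain."""
    return f"Website Design of {client_domain} By {designer_domain}"
```

Each client site then emits a slightly different anchor text, instead of the identical phrase repeated across every badge.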
My 2 cents, hope it helps.
-
RE: URL Parameters as a single solution vs Canonical tags
Hi Ivor,
This is a very good place for canonical tags. If you put the canonical tag on the root page, then you should be okay: when the page=2 or sort=Az parameters are added, it will still canonical to the root page. There is nothing wrong with a page's canonical tag pointing to itself, so there is little to worry about.
Fixing parameters in Google covers only one search engine; the other crawlers won't know what Google sees, so it is best to fix it for everybody.
The other option would be to use an exclude in your robots.txt so the pages are not seen as duplicates, but I would advise using canonical first.
User-agent: *
Disallow: /*page
Disallow: /*sort
For example (note both Disallow rules go under a single User-agent record; crawlers generally only honor one record per user agent).
Hope this helps
-
RE: 301 Redirect keep html files on server?
Hi Heiko,
When you use a 301 you can remove the page from the server. 301 redirects are done in server config files, like Linux/Unix's .htaccess.
example
Redirect 301 /old-page.html http://www.adomain.com/new-page.php
Furthermore, it is good practice to remove / update all of your website's links to this page.
In theory you are correct: if you leave the page up, it will likely never get viewed, as the redirect happens at the server level. But the general rule would be to remove it.
-
RE: Why Did Our Site Disappear for 6 Months?
Hi Kelley
This response would alarm me: "The reason they gave was Penguin, Panda, and Hummingbird". If they are citing this as a reason, it is 99.89% likely they were doing something wrong to begin with.
Having crappy directory links isn't necessarily bad unless they make up the majority of the links you have, although you should never spend time trying to get listed in crappy directories; it's a waste of time. Just look at the link and ask, would I "ever" visit this site for anything, ever? If the answer is no, you likely have no reason to be listed there.
Peter pointed out one possible reason; another could be canonical tags pointing at or looping on themselves. In general it sounds like the company used was horrible, and it is a boon that you're back.
Hope all is well with your husband, and welcome back.
Don
-
RE: Duplicate Page Content - 404's or 301's?
Hi Braunna,
It sounds like you deleted pages, then remade them. It is great that you're keeping up with the freshness of the site, but for search engine purposes you should have simply updated the current page with fresh content, or remade the page and then 301'd the old page to the new page.
In general you should try to avoid deleting pages or remaking the same page with a new URL unless there is a reason greater than content driving the decision, such as a new CMS (content management system: Joomla, WP, OSC, etc.), switching server-side scripting (PHP to ASP), or overhauling navigation and architecture.
If there were a case of the page being completely useless and you were removing it entirely, then a 404 would be correct. If you are seeing any duplicate content issues in this case, it is likely because the search engines have not de-indexed the old page or it is still in their cache. Google can help you with removing cached versions and forcing de-indexing.
I hope that hit the correct answer for you.
Happy New Year
-
RE: 301 redirects for a redesign.
Hi Richard,
I think you may have stumbled into a bit of a messy situation here. If you do not have access to the old site, how can you 301 redirect to the new site? The 301 redirects are put into the .htaccess file of the old site. Think of it like a "We Moved" sign you would place on a retail store: you pull into the parking lot and see a sign that says, "Hey, we moved, we are now located at...". If you're not allowed to use the parking lot, you have nowhere to put this sign.
You need to make sure your client is allowed to, or at least that the company hosting the site is willing to let you put in the redirects. To answer your question, you need to put the redirects in the .htaccess file on the OLD site, pointing to the new site.
You do not have to be concerned with matching URLs, only that you format the .htaccess file correctly. An example would be (again, this must be on the old site's server):
- Redirect 301 /oldpage.html http://www.NewSite.com/newpage.html
- Redirect 301 /oldfolder/somepage.html http://www.YourNewSite.com/newfold/newpage.html
If you choose to keep the same file structure you can do a blanket 301 redirect using mod rewrite (it must be enabled on the old server). Like this.
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^OldDomain.com$ [NC]
RewriteRule ^(.*)$ http://www.NewDomain.com/$1 [R=301,L]
You can read more about redirects here at Moz.
If you do not have the ability to put up an .htaccess file, or the client's hosting provider is unwilling to help (they may for a charge), then you're in a bad situation with no great solutions. The only thing I can think of is having the client's old host create a page that says the site has moved and provides a link to the new site. Our company did this many, many years ago at http://www.columbiaindustrial.com/ when they changed their name and host (before my time). Then you need to recommend how long the client should pay for this new page to stay live... I'd say at least 3-6 months.
Hope this helps
Don
-
RE: ADA, WCAG, Section 508 Accessibility and hidden text
Wow, interesting question. I am with you; I would definitely worry about obfuscated-text penalties (keyword stuffing) employing that particular method. I have no experience with these guidelines, but I am interested in what others have to say about the matter.
My initial thought would be something like an image whose alt text carries the directions,
under the assumption that a speech reader would read the alt text since users wouldn't see the image. And of course the image could be something completely simple, like an arrow or bullet point.
I will wait to see what others may say,
Good luck,
Don
-
RE: Is it reasonable to not give an SEO access to our CMS?
I kind of have the feeling that there is something missing from the story. This is one of the challenges that happens when dealing with multiple hands in the kitty.
I wouldn't really buy the SEO's excuse that what they do is secret. What they really mean is, "we don't want to train another web company to do what we do."
As a web developer, I would understand why an SEO would want access; it could make things easier and faster. Having to go through Company A to get a change submitted for Company C may not be exactly the service Company A purchased.
As an SEO, I understand why a web developer would be skittish about giving access to a company they had no hand in hiring. The web devs do a lot of hard work, some of which can actually be proprietary. From an SEO perspective, I would of course be willing to work within those constraints if need be.
In the end it becomes Company A's issue. They need to find a compromise between Company B and Company C: either spend extra money to get access for the SEO, or figure out which one they feel is the most valuable, fire the other, and find a replacement that is willing to work within the constraints they lay out.
My thoughts..
-
Does Google's Information Box Seem Shady to you?
So I just had this thought: Google returns information boxes for certain search terms. Recently I noticed one-word searches usually return a definition.
For example, if you type in the word "occur" or "happenstance" or "frustration" you get a definition information box. But what I didn't see is a reference to where they are getting, or have gotten, this information.
Now, it could very well be that they built their own database of definitions, and if they did, great. But here is where it seems a bit grey to me... Did Google hire a team of people to populate the database, or did they just write an algorithm to comb a dictionary website and stick the information in their own database? The latter seems more likely.
If that is what happened, then Google basically stole the information from somebody to claim as its own, which makes me worry: if you coin a term, let's say "lumpy stumpy", and it goes mainstream (which would entail a lot of marketing and luck), would Google just add it to its database and forgo giving you credit for its creation?
From a user perspective I love these information boxes, but just like Google expects us webmasters to do, they should be giving credit where credit is due... don't you think?
I'm not plugged in to the happenings of Google, so maybe they bought the rights, or maybe they bought or hold a majority of shares in some definition-type company (they have the cash), but it just struck me as odd not seeing a reference to a site. What are your thoughts?
-
RE: 301 redirects for a redesign.
If you're not switching domain names, just hosts, then the situation is much better because now you have control.
When you switch hosts, your DNS (Domain Name System) records will change, and it will likely take a few days for the change to propagate through the internet, so expect some initially slow traffic days.
Once the DNS has changed to your new site you have control, so you will want to put the 301s on the new site for any of the old pages that don't have a URL with a new equivalent, just as if you were making new pages on the site. Even if they don't have any inbound links, you should still 301 any decent-ranking URLs to the new ones you make, since they are likely indexed or bookmarked by search engines and users.
Hope this helps,
Don