That makes sense... I didn't realize you were talking about top level categories; I thought it was more of a product page...
Post and let everyone know how it turns out...
Without seeing the actual 301s and tracing it all on the server, my guess is that the server rules in .htaccess are forcing a user to go through multiple "hops" before the redirect is complete.
This is a simplified version, but what might be happening is something like this:
/store/widget-group1/ --> /store/widget-group1/index.php --> /store/widget-group1/index.html --> /store/index.php?widget-group1
If this is the case, I'd try to create a 301 that reduces the number of hops.
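As a sketch (the paths mirror the hypothetical chain above; the exact rule depends on your cart's URL scheme), a single-hop rule in .htaccess might look like:

```apache
# Hypothetical single-hop redirect: send the old category URL straight
# to its final destination in one step.
RewriteEngine On
RewriteRule ^store/widget-group1/?$ /store/index.php?widget-group1 [R=301,L]
```

Rule order matters in .htaccess, so a specific redirect like this should sit above any catch-all index.php rewrites, so the server never walks through the intermediate hops.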
This page discusses how Matt Cutts of Google shows that after a few 301 hops, Google may give up crawling:
http://www.mktdojo.com/matt-cutts-discusses-301-redirect-limits/
Hope this helps...
Bryan -
I just did a quick look at your site. You have a crazy amount of Google +1s.
Anyway, Google seems to have about 5,260 pages indexed, according to a quick search (see screenshot).
Google is most interested these days in creating an amazing user experience. I wouldn't worry about 100 pages; a 301 is going to be a better quality signal to Google than 100 sudden 404 errors showing up across the site.
If you're really worried, though, and want to reduce the number, you could do a backlink check on the pages you're about to remove and see if anyone has linked in to those pages directly. 301 those, and then perhaps don't worry about the others?
Bryan -
The best practice is to 301 those pages to a relevant top-level category page, so that Google doesn't see that links into the site from social media or other pages are broken.
You can simply let them be a 404, but that doesn't send the solid signal to Google that you're managing the pages on your site well.
Yes, once the pages are removed, I'd recommend re-generating the XML-based sitemap and resubmitting it to Google, so they don't look for those pages.
But 301s for those removed pages will eliminate any future issues with people who bookmark a page or link to it from another site.
Just my $0.02...
Clifford-
Industry.com looks to me like a directory site where you are encouraged to pay for advertising.
On many pages, it appears that a link like "Label Printer" is just displaying Google Adsense, and trying to get people to click on ads. (See screenshot.)
My $0.02 is that the industry.com site appears to be a directory, and the kind that Google penalizes sites for using to build links. I would avoid this, as it probably won't do a lot to help, and could wind up hurting the site with spammy-looking links.
That said, Industry.com has a 39 domain authority, and 162 linking root domains coming into the site.
Unless it's a particularly amazing domain name, this is an issue that's going to follow you for quite some time. The disavow link system can be helpful in circumstances like this, but it's always been promoted by Google as a tool of last resort.
Not knowing much about the site, my $0.02 would probably vote for a clean slate with a new domain name, and shut down the older site.
Sharla - That is indeed problematic. I think that you have a couple of options:
1. Through an attorney, send a certified letter asking for the removal of the information from the job history, followed up if necessary with a defamation claim for the harm done to the business by the fraudulent employment claim.
2. You can Report a Violation of the Facebook Terms of Service, even if you don't have an account by using the form here: https://www.facebook.com/help/contact/274459462613911
3. You can go to the person's profile page, click on the gear icon next to Message, and click on Report. You can use this feature even if you're not friends with the person on Facebook. This might fall into the bullying realm, too.
Hope this helps
David -
I just spoke to an SEO person who applied for a position at our company. He previously worked at a PPC company. His experience was that:
For every 1,000 calls (and they used a fairly qualified telemarketing list of companies), 10-20 people would agree to a web demo (about 1%). Of those that viewed the demo, about 10% converted into a paying customer (six-month contract). So, probably 1-2 new clients from 1,000 calls. If you spend 5 minutes per call (to quickly look at their website), that's 83 hours of time to make 1,000 calls. At $15 per hour (roughly $30k per year), that's $1,245 to bring in a new client or two each week. Given the size of a contract, it might be worth it though...
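The back-of-envelope math above can be sanity-checked in a few lines (all of the rates are the anecdotal estimates from that conversation, not benchmarks):

```python
# Rough check of the cold-calling funnel described above.
calls = 1000
demos = calls * 0.01            # ~1% agree to a web demo (10-20 in practice)
clients = demos * 0.10          # ~10% of demo viewers sign a contract
hours = calls * 5 / 60          # 5 minutes per call
labor_cost = round(hours) * 15  # caller paid $15/hour
print(demos, clients, round(hours), labor_cost)
```

So roughly 83 hours and about $1,245 in caller time per 1-2 new clients, which matches the figures quoted.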
But to me, it seems like quite a grind. He said they struggled, especially to keep motivated and positive, and about 20% of customers churned after their initial 6 month contract was up.
I've heard of telemarketing success from companies targeting specific vertical industries, like dentists or doctors, but usually with low-cost "packages."
I haven't found telemarketing to be particularly effective for higher-end SEO consulting... but that's my experience.
Love to hear from others who have heard more, or have "cracked the code" on how to make this work
Jorge-
Can you check the age of each domain name? I've seen instances where sites that have been alive for a long, long time can outshine other sites with better statistics, because they've withstood the test of time with Google. That's one place I'd look to see if there is a substantial difference.
Olivier -
A couple of quick ideas to make sure you are ranking locally for SEO:
1. Have you verified ownership of your Google Places Page?
2. Have you added store hours, verified your address, business category, etc?
3. Does your website list your store address in the footer of each page? That can certainly help. I'd also recommend putting it into an RDFa or microformat tag, so that it's easier for search engines to read.
For example, our corporate address is:
Customer Paradigm
5353 Manhattan Circle #103
Boulder, CO 80303
Phone: 303.473.4400
Google and other search engines can figure this out, but if you use a vCard/hCard-type format, it's easier for them to structure the data properly.
Here's an example:
Customer Paradigm
5353 Manhattan Circle
Suite 103
Boulder CO, 80303
303.473.4400
4. Make sure you add a few photos onto your Google + page, uploaded by you, the owner. This should help with ranking.
5. Try to get some of your customers (perhaps in a second day email, after a product has arrived) to do a review on your Google+ page. More google reviews, especially with star ratings, will help.
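Going back to point 3 for a moment: as a sketch, here's how the address above could be marked up with schema.org LocalBusiness microdata, one of the structured formats search engines read:

```html
<!-- Sketch: business address marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Customer Paradigm</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">5353 Manhattan Circle #103</span>
    <span itemprop="addressLocality">Boulder</span>,
    <span itemprop="addressRegion">CO</span>
    <span itemprop="postalCode">80303</span>
  </div>
  Phone: <span itemprop="telephone">303.473.4400</span>
</div>
```

The page looks the same to a visitor, but the itemprop attributes tell search engines exactly which part is the street, city, ZIP and phone number.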
Hope this helps!
-- Jeff
One SEO implication of redirecting all of your 404 errors to the home page is that it could lead Google and other search engines to flag your site for having 'soft 404 errors.'
Best practice is to have a custom 404 page that returns a 404 status code and then helps the end user find what they're looking for: a list of top categories, a search box, or a phone number to call tech support.
Here's what Google says about why you shouldn't do this:
https://support.google.com/webmasters/answer/181708?hl=en
Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted (also, you probably don’t want your site to rank well for the search query [File not found]).
We recommend that you always return a 404 (Not found) or a 410 (Gone) response code in response to a request for a non-existing page. You can improve the user experience by configuring your site to display a custom 404 page when returning a 404 response code. For example, you could create a page containing a list of your most popular pages, or a link to your home page, or a feedback link. You can also use the Webmaster Tools Custom 404 widget to add a search box and more site search options to your site. But it’s important to remember that it’s not enough to just create a page that displays a 404 message. You also need to return the correct 404 or 410 HTTP response code.
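On Apache, for example, a custom 404 page that still returns a real 404 status code can be as simple as one line in .htaccess (the /404.html path is a placeholder for your own page):

```apache
# Serve a custom error page while keeping the 404 HTTP status code.
# A local path preserves the status; a full URL would turn it into a redirect.
ErrorDocument 404 /404.html
```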
Jeff-
The slider is certainly a bit nicer from a design perspective, but the client does have quite a few images that have been indexed by Google images (see screenshot below).
That said, I ran both pages through a spider simulator, and the non-slider (thumbnails) does index / link to an image in each case.
The slider version shows the image name as plain text, but doesn't recognize the images as links. So the slider, in my opinion, isn't as helpful for SEO as the thumbnail version. It might be a more engaging consumer experience though.
One option: on the slider, allow someone to click on the image to view on a larger scale. That link would likely display...
Hope this helps...
Attached: 1st-impressions-non-slider-spider-simulator.jpg, 1st-impressions-slider-spider-simulator.jpg, 1st-impressions-google-images-results.jpg
Danny - We use Nginx on our WordPress site, and it's pretty quick and easy. We translated our .htaccess rewrite rules into Nginx's own syntax (Nginx doesn't read .htaccess files directly), and for the most part, there's very little downside. You do want to make sure that your site isn't going to break before you launch it on Nginx, so I'd test it with a test URL first before you push it live.
We're also running Varnish as a caching system, and it takes the page from a slowwww load time to a really fast 1.5-second load time.
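For reference, the usual WordPress permalink rule looks like this in Nginx's own syntax (a sketch; your server block will have more in it):

```nginx
# Standard WordPress "pretty permalink" handling in Nginx:
# serve the file or directory if it exists, otherwise hand off to index.php.
location / {
    try_files $uri $uri/ /index.php?$args;
}
```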
Hope this helps...
Felipe -
Cars used to be sold via classified ads in newspapers, until Craigslist came around. Craigslist removed a huge source of revenue - classified ads - away from newspapers. That, and the fact that nobody receives newspapers anymore.
I would say that any classified car site, even one focused on Utah, has to compete with Craigslist. How to do that? Craigslist has the audience of people looking to buy cars, and a ton of people who want to post cars for sale.
Craigslist, due to its minimal, almost text-only design, loads really fast. It allows people to upload images and list a price. It makes it really easy to find cars.
Whatever site you're building for your client needs to do everything that Craigslist can do, and more.
One thing I don't love about Craigslist is that you can't search specifically by car model, year, or type to home in on the results.
A better search option would be really helpful.
That said, if you're just starting out, you'll likely face an inventory volume problem - so you don't want to have too many empty categories.
Another thing that doesn't work as well for Craigslist is that the quality of car images tends to not be great. Probably this is my bias, because I'm a professional photographer. But an iPhone shot of a car just doesn't work well to sell a several thousand dollar car.
Perhaps one idea to launch the site: anyone who wants to list a car can drive it to a location in Salt Lake City (or somewhere else) and have someone on the classifieds team take professional photos against a nice backdrop, along with detailed macro close-ups of the cars and trucks for sale, including the open engine compartment, the odometer, etc.
One other idea for functionality: allow users who come to the site to get an email alert when a car matching the description of what they're looking for is listed (i.e. "I want to buy a Toyota 4Runner with a manual transmission").
You'll obviously want to do tie-ins to Facebook, Twitter and Pinterest, to allow people to pin something they want, and share when they list something.
That said, posting on Craigslist is free. And it's tough to compete with free. (Craigslist makes their money from job postings, and everything else is free.)
Yes, using a date structure in URLs can help search engines understand the date context of the information.
For example, CNN uses a date-based system where the date comes right after the domain name: /yyyy/mm/dd/.
ABC News uses a similar structure:
http://abcnews.go.com/blogs/politics/2013/11/the-notes-must-reads-for-wednesday-november-6-2013/
As does the NY Daily News:
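One nice side effect of the /yyyy/mm/dd/ pattern is that the date is trivial to parse out of the URL; here's an illustrative helper (the function name and example URL are made up):

```python
import re

# Pull a publication date out of a /yyyy/mm/dd/ style path, if present.
DATE_RE = re.compile(r"/(\d{4})/(\d{1,2})/(\d{1,2})/")

def extract_date(url):
    match = DATE_RE.search(url)
    return tuple(int(g) for g in match.groups()) if match else None

print(extract_date("http://example.com/2013/11/06/some-article/"))
```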
Hope this helps...
Gina -
I've attached a few screenshots showing how the site displays in different widths. The site performs pretty well in the different widths, as you can see. The portfolio page works well, as does the contact information page.
I'd recommend putting in a footer navigation, as a mobile user at the bottom of the page might want to navigate that way instead of scrolling back up to the top of the page.
The menu seems to work well at the tablet size and the smart phone size as well.
Personally, I think that the new responsive site layout is much better than the existing site. The live site has huge, dense blocks of text that make my eyes gloss over, overwhelmed by the volume of content.
Hope this helps!
-- Jeff
Attached: fat-eyes-desktop-version.jpg, fat-eyes-tablet-version.jpg, fat-eyes-iPhone-Version.jpg, fat-eyes-phone-layout-menu.jpg
Have you checked the age of your domain name compared to the age of their domain name (both how long it has been registered and how far out the registration extends)? That could factor into the ranking analysis.
It's possible they bought likes, but it's likely that the people running the other organization are spending a lot of money sponsoring Facebook posts that get people to like their site.
This might be a case where the social component of the ranking outweighs a site with otherwise high Moz metrics (Google doesn't use PA and DA; those are Moz's own estimates).
It could be that their site content is more recent and topical, so the site is ranking for fresh, new content, instead of a page that has withstood the test of time.
I'm happy to take a look at both your site and the competitor's site to see additional differences, and look at search results, if you want to post them here...
Thanks,
-- Jeff
We've recently launched a number of responsive designs for eCommerce companies.
I'd love to tell you that there is zero risk in launching the new site from a rankings perspective. There isn't, but because you kept all of the content and the URL structure the same, you've mitigated a lot of the risk.
Google has come out in the past and said that it prefers a responsive design framework, as opposed to an m-dot mobile site + a desktop site, as it doesn't have to worry about duplicate content.
That said, most of the risk is going to be based on how well the responsive site actually performs when a user is on a desktop, tablet and phone.
If the design works well, and isn't confusing to the end user, then go for it.
But if the design is buggy or looks a lot worse (due to the limitations of responsive design), maybe do a bit more testing.
If you'd like, post a link to the site and I'm happy to take a look at how it looks on different devices...
Yes, if you do a View Source on the page for the Gray Shadow Financial, you'll see under the
FYI, in this screenshot, I can see the "About", "additional info", "contact", and "media" pages in the Google cached version of the site. But I do need to click on those tabs to make the content appear.
To Google and other search engines, these are not separate pages, but content that is served within the same page. The URL doesn't change at all. If you wanted to have those pages indexed, I'd recommend creating them as separate pages, with links that open up in a new page.
That said, you might run into duplicate content issues if the separate pages repeat all of the same content that's also served within the single page.
Another idea would be to keep the left hand navigation for the About, Additional Info, Contact and Media, but have all of the content display on the page; just link to the content from the top.
The way you have it built does limit the page length, but the user experience may be confusing to some, especially on a touchscreen tablet.
I've been working on a research study for Local search results, and submitted it to Moz last week. So I was able to take a look and see a few things that correlate with the results.
Here's what I'm seeing as far as search rankings (for me... might be different for you; I'm in Boulder, Colorado)
Google search for: "Los Angeles Criminal Defense Attorney"
1. James Blatt
1st ranked site has 1 additional photograph on their Google+ page, uploaded by the Google+ page owner.
2. Gregory Caplan
2nd ranked site has 4 additional photographs on their Google+ page, uploaded by the Google+ page owner.
Also has 27 Google Reviews and a 4.8 star rating.
3. Stephen Rodriguez
3rd ranked site has no images uploaded to their Google+ page; the photo displayed is scraped by Google from their home page.
4. Marks & Brooklier
4th ranked site has no images uploaded to their Google+ page; the photo displayed is scraped by Google from their home page.
5. California Criminal Defense Center
The 5th ranked site has one image, but it was not uploaded by the Google+ page owner.
That said, this result has a domain name match to the keywords, plus 5 Google reviews and a 4.7 star rating.
Not that this is the end-all, be-all; other things like PageRank, inbound links, age of the domain, and so on matter too.
But if it were me and my client, I'd highly recommend getting control over the page and adding appropriate photos... even if they are from the Website.
Plus, I'd try to get a few Google reviews from past clients. Especially those who didn't wind up in jail.
The great guru Seth Godin wrote about this on Saturday:
http://sethgodin.typepad.com/seths_blog/2013/11/tenacity-is-not-the-same-as-persistence.html
Seth wrote:
"Persistence is doing something again and again until it works. It sounds like 'pestering' for a reason.
"Tenacity is using new data to make new decisions to find new pathways to find new ways to achieve a goal when the old ways didn't work.
"Telemarketers are persistent, Nike is tenacious."
My $0.02 is that if someone doesn't respond to the first 3-4 phone calls, it's unlikely that you're going to "wear them down" by trying to be persistent. You're probably just bugging them.
Instead, try to find a new way to get their attention and link to your site.
If it's a really valuable note, send a package with cookies, and a message tying in your pitch. Send flowers. Do something out of the box to get through the clutter...
Hope this helps...
-- Jeff
As the owner of an SEO and Web development firm, the most important thing I do each and every day is make sure that our customers are happy, feel they are being treated fairly, and want to pay us for the valuable work that we do.
Everything else (blogging, writing content, sending and answering email, working with project managers, designers, search marketing people, and the billing department) follows from this.
It doesn't matter if we did a great technical job on a project, if a customer doesn't feel like what we produced was what they wanted. It doesn't matter if we drive lots of traffic to a site, if it's the wrong traffic, or the bounce rate shows people don't like what they are seeing, or if nobody converts.
Hope this helps
We ran into this in the past, and one thing that we think happened is that links to the dev site were sent via email to several Gmail accounts. We think this is how Google then indexed the site, as there were no inbound links posted anywhere.
I think that the main issue is how it's perceived by the client, and if they are freaking out about it. In that case, using an access control password to prevent anyone from coming to the site will limit anyone from seeing it.
The robots.txt file should flush it out of the index, but yes, it takes a little bit of time.
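For the access-control piece, HTTP Basic Auth via .htaccess is usually enough to keep a dev site private (the file path and realm name here are placeholders):

```apache
# Password-protect the dev site; crawlers can't index what they can't fetch.
AuthType Basic
AuthName "Development Site"
AuthUserFile /home/example/.htpasswd
Require valid-user
```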
The site URL looks fine, and here is a link to a previous page:
I don't see anything really bad with a quick look, but it might be good to do a bit more deep diving to make sure...
FYI, I would take a look at the Wayback Machine to see cached versions of the site, and make sure that the content on the site wasn't spammy or objectionable, which could affect future search results. If the site was penalized, or your gut says it's not good, it might hurt your efforts.
On the other hand, if the age of the domain name is really old, it might help out with the new site's relative rankings...
Your web developers are hiding behind a poorly designed content management system and their own ignorance of ALT tags for images.
I am also a professional photographer (as well as a Google-certified photographer for panoramic business photos, too).
ALT tags were developed for the visually impaired, so that a screen reader can read a description of the image to someone who can't see it.
In some cases, sites are required by their brand standards or internal guidelines to comply with ADA requirements and have ALT tags in place for images.
I would tell the web developers that the interface needs to have ALT tags available, and if not, I'd try to move to a different platform that supports this.
The meta information is nice, but ALT tags are critical. Current versions of Dreamweaver, for example, won't let you add an image to a Web page without putting in an ALT tag for the image.
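For reference, an ALT tag is just one attribute on the image element (the filename and text here are made up):

```html
<!-- The alt text is what a screen reader speaks, and what search
     engines index, when the image itself can't be seen. -->
<img src="red-widget-front.jpg"
     alt="Red widget with chrome handle, front view"
     width="300" height="200">
```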
Jon -
If you have the different websites running in different languages (e.g. .com is English, .nl is Dutch, .fr is French, etc.), then you should probably have a separate sitemap for each site.
If they are all the same language, and you just have the site loading with .com / .nl / .fr, then Google will see this as duplicate content and you should likely make changes to keep them a bit more separate...
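As a sketch, one way to make the language editions explicit to Google is with hreflang annotations in the XML sitemap (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/page/</loc>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.example.nl/page/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.fr/page/"/>
  </url>
</urlset>
```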
Thanks,
-- Jeff
It looks like your site might be using a combination of Flash and iFrames to deliver results.
You do have a Page Authority of 33 on your home page, and a domain authority of 27, with 12 linking root domains. Hope this helps...
Without looking at a sample product page, my initial thought is that the UI change might be conflicting with the CSS portion for the rich snippet. Did you rework the CSS on the page in a way that is different than before? Without knowing more, that's where I'd probably start the troubleshooting process.
Bob -
I'm assuming that your blog promotes the custom design and printing business in the UK?
If that's the case, I'd recommend putting up articles that are potentially interesting to end users, including:
Post a link to the blog so we can take a look...
If you switch to the "Classic" View of Google Maps, you'll be able to see the Google + page for the Ticket King Inc site:
https://plus.google.com/+TicketKingInc/about?gl=us&hl=en
On this page, I don't see that you have hours listed for your Google Places page. Other companies often have this listed.
If you have the ability to log into this page and edit it, that might help. It could be that it was a Places page that was converted into a Google+ page, and is now not using the classic map function.
When I drill all the way down on the editing of the Google+ page, I do see that you have your hours listed correctly, with just the 1 pm time listed as the closing time for Saturday. (M-F is listed as 8:30 am - 6:00 pm).
It's also listed correctly on Facebook and Yellow pages.
I might recommend going through this form / system and updating this again, as it might (hopefully) override the 1:00 pm closing time on the Google result:
Attached: ticket-king-inc-google-plus-page.jpg, suggest-changes-for-ticket-king-inc-google-plus-page.jpg, google-results-ticket-king-inc-google-plus-page.jpg
I know that there can be penalties if you have a navigation item in your categories that links to another site (root domain) with the same design as the original site.
Are there any negative SEO implications to having a subdomain with the same design as the root domain (i.e. root.com vs. store.root.com), where the navigation within it will have links to both store.root.com and root.com directories?
Thanks in advance.