Hello MrPenguin,
Your code should look similar to this:
Redirect 301 /product/product-name http://domain.com/product-name
Hi bigrat95,
If you'd like to use a new domain with the same content as the old domain, you need to 301 the old domain to the new one. Leaving the content live on the old domain while using the same content on the new domain will be flagged as duplicate content, so redirect the old domain to the new domain.
You can keep the old domain's home page up and write some unique content for it, without redirecting that one page. I suggest using that home page to explain your new brand/domain and the value the new site brings to your users, and to suggest that visitors bookmark the new site.
Eventually, your new domain will outrank your old domain and you won't be as worried about keeping it around in the long-term.
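If it helps, here's a minimal .htaccess sketch of that setup on the old domain (assuming Apache with mod_rewrite; newdomain.com is a placeholder). It 301s every path to the same path on the new domain, except the home page, which stays live with its unique content:

RewriteEngine On
# Leave the home page alone so it can carry its own unique content
RewriteCond %{REQUEST_URI} !^/$
# Send everything else to the matching path on the new domain
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]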
You could also check out the Content Grouping feature. This would allow you to group all your old pages (old url structure) into a category and compare those metrics to the group of new pages.
Google link: https://support.google.com/analytics/answer/2853423?hl=en
Hi RosemaryB,
AFAIK, no - you cannot implement a 301 redirect from an old FB/Twitter profile to your new one.
The business should minimize the old profile (remove any unnecessary content) and put up a description pointing to the new company's handle/page. You could also send out an email blast with the updated profile, and keep an eye on the old accounts for any stragglers who find the old profile and need to be pointed to the new one.
First steps would be to connect with Google Webmaster Tools and see if there are any manual actions against the domain. Correct any they notify you about.
Update the site to be a value-added website, i.e. remove all thin-content pages and duplicate content, optimize the site navigation architecture, optimize the internal link architecture, clean up your sitemap, etc.
Examine the site's backlink profile and disavow any domains that are spammy and non-value-added (link farms, blogs with duplicate content, excess comment spam, thin content, etc.).
That should give you plenty to start with and you should see results once completed.
Everything is as expected - in time, Google will adjust appropriately and drop the old URLs from the index in favor of the new ones (which the old URLs are 301'ed to).
The 404 error may be a common issue experienced with Yoast sitemaps: http://kb.yoast.com/article/77-my-sitemap-index-is-giving-a-404-error-what-should-i-do
The first step is to reset the permalink structure; that alone may resolve the 404 error you're seeing. You definitely want to resolve the sitemap 404 so you can submit a crawlable sitemap to Google.
In WordPress, go to the Yoast plugin and locate the sitemap URL / settings. Plug the sitemap URL into your browser and make sure that it renders properly.
Once you have that exact URL, drop it into Google Webmaster Tools and let it process. Google will let you know if they found any errors that need correcting. Once submitted, you just need to wait for Google to update its index and reflect your site's meta description.
Yoast has a great blog that goes in depth about its sitemap features: https://yoast.com/xml-sitemap-in-the-wordpress-seo-plugin/
Yoast sets up a pretty efficient sitemap. Make sure the sitemap URL settings are correct, load it up in the browser to confirm, and submit your sitemap through GWT - that will help get a new crawl of the site and hopefully an update to the index so your meta descriptions begin to show in the SERPs.
Ha, that's exactly what I did.
I'm not showing any restrictions in your robots.txt file and the meta tag is assigned appropriately.
Have you tried to fetch the site with the Webmaster Tools 'fetch as googlebot' tool? If there is an issue, it should be apparent there. Doing this may also help get your page re-crawled more quickly and the index updated.
If everything is as it should be and you're only waiting on a re-index, that usually takes no longer than two weeks (for very infrequently indexed websites). Fetching with the Google bot may speed things up and getting an external link on a higher trafficked page could help as well.
Have you tried resubmitting a sitemap through GWT as well? That could be another trick to getting the page re-crawled more quickly.
What was the exact search term you used to bring up those SERPs?
When I search 'atlastpartners' and 'atlastpartners.com', your site comes up with a meta description.
Many times, WordPress users (and users of similar CMSs) noindex their media categories to prevent duplicate content. I suggest noindexing the media category if you have a main page for the videos.
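For reference, noindexing those pages just means a robots meta tag in each media page's <head> - something like this ('follow' keeps their links crawlable):

<meta name="robots" content="noindex, follow">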
Unfortunately, Google Analytics does not provide a 'What's Changed' report, but you can build one fairly easily by exporting your keyword and traffic data to Excel and comparing your top terms over time.
I wouldn't suggest looking at all 1,600 keywords. Instead, identify the keywords that sent 80% of your traffic and see what's changed among those. Were there a few specific keywords that sent huge amounts of traffic and no longer do? Or can you identify a general decrease across all keywords (maybe signalling a seasonal change)?
Content that adds value to the user experience is never bad - more content, more reasons to rank, better information for the user to take action.
Ecommerce websites are more difficult to handle though - the goal is usually to convert to a sale and sometimes content can get in the way of that happening. If you've A/B tested and found that conversions are greater without it, then good for you! You've identified a barrier to your visitors and eliminated it for better conversions.
What needs to be weighed is the number of conversions - does the content bring in more new traffic that eventually converts? I.e. is the conversion volume greater with the content than without it? If not, stick with the A/B test results.
Putting content down the page may help you rank, but is it helping your visitors? You don't want to shove content onto a page just because you think it belongs there - is it beneficial to the user? If no, can you put that content somewhere else where it is beneficial to the user?
Amazon is a common SERP competitor to beat; let's take a look at one of their category pages: http://www.amazon.com/Outlet/b/ref=sv_gb_3?ie=UTF8&node=517808
We see that they do include a small snippet of text at the top, to describe the category. Then they give the user what is expected, a lot of products to view; and at the bottom they have a longer category description (sometimes reviews and related category information). This is a common way to structure category pages. But, your market may be different so it may be worthwhile to brainstorm how your page structure would most benefit your users.
If you'd like to instruct Google not to crawl the URLs with that parameter, select 'No URLs' in the Google Webmaster Tools > URL Parameters section.
Still, this is a guide for Google to follow, and it does not always follow the rules you set. The robots.txt method ensures that Google will not crawl those URLs.
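As a sketch, the robots.txt approach looks like this (assuming 'sessionid' stands in for your actual parameter name - Google honors the * wildcard here, though not every crawler does):

User-agent: *
# Block any URL containing the (hypothetical) sessionid parameter
Disallow: /*?*sessionid=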
Are the pages useful to the user? Do you expect users to actively use these pages on your site? Do you want users to be able to find these pages when they search for their issues through Google?
If you've answered 'yes' to any of these questions, I wouldn't suggest removing them from Google. Instead, take your time and set a schedule to optimize each of these pages.
If these pages are not valuable to the user, unnecessary for Google to index, locked behind a membership gate, duplicate, or thin content - then those are good reasons to noindex them from all search engines.
Given your scenario, there could be many areas to check to see what has changed to affect your rankings. Have you done a 'what's changed' analysis to see exactly where your traffic dropped?
Month-to-month, look at the keywords that were sending you traffic and identify which keywords have reduced their visits. Then track those keywords in Moz, see where you can improve on-page, and see what your competitors have done to outrank you.
Once you know why the change has happened, you can adjust to get that traffic back that you've lost.
No, generally a 301 redirect will transfer 99%+ of the authority to the new URL. I would anticipate waiting about two weeks before the new URLs rank as high as the old URLs did.
If you change the URLs back to the old structure, those rankings should return more quickly. However, if your new structure is more user-friendly, it may be a good idea to make the change now rather than later, once the old URLs have gained even more authority.
AFAIK there is no magic number of files per directory for maximum crawl effectiveness. If your folder legitimately warrants 5k HTML pages in a directory, then Google will crawl all the pages. Make sure to create value-added pages with high-quality content - Google will recognize them and crawl them as appropriate.
If you have the option, use your Google Webmaster Tools account to adjust crawl settings. Once your site reaches a certain size, Google will take over the crawl-rate settings for you.
Make sure to 301 the old URLs to the new URLs - this will pass along 99% of the authority and you should see an increase in the SERPs for your new URLs.
Only add a backlink if it adds value to the user / article. Do not add a bunch of comment backlinks thinking it will help you rank in the SERPs.
For example, you find an article that helped you do X. Write an article expanding on X to XYZ. Then link back to your new article from the old article's comments, explaining how you were inspired by the original article and added more to X.
Hopefully, your new content adds enough value that the original article author uses it as a source and/or the original article readers find your new article and maybe link to that from their blogs too.
Check how people are arriving at the goal (conversion) page. Are they going directly to that page somehow, through a link or direct traffic? That would cause an increase in goal conversions, and since those visitors did not start from step 1, they are not shown at the beginning of the funnel.
I wouldn't worry about the location of the IP too much. What's most important is that you're getting quality, value-added links to your website for your niche.
This is conjecture on my part, but I say it would be more important to receive a link from a .es domain than from a website with an IP in Spain (for specifically ranking in the .es SERPs). However, unless the terms you're trying to rank for are very competitive, I would concentrate on just receiving quality links, regardless of location. I'm not sure that there would be that much incremental benefit from the IP address location.
Hi vijayvasu,
Maintaining a healthy link profile is necessary; and being proactive, like your question implies, is a great skill to exercise.
Start by using Moz's link analysis tool. Export the links to Excel, remove duplicate domains, then identify from that list the domains that may be spammy (low DA/PA).
You'll need to visit each site to know how spammy it is and whether or not you should proactively shun that link. Use Google's disavow tool to remove them from your link profile.
I'd love some tips for helping them rank better without building out an entire site.
All you can do then is ensure that all on-page SEO is optimized and try to create word-of-mouth traction for external links.
However, if any competitor comes into your niche and does 'build out' the website, they will have an easy time outranking you.
If the domain is a known spam property or you feel that the entire domain adds little to no value, then go ahead and disavow the entire domain.
Hi alrockn,
If you feel that these sites are in fact spam related and do not provide any value to the user, then I suggest getting proactive and disavowing those properties before any penalty happens.
I would be concerned, but not too much - use the disavow tool and your website will be fine.
If you give me a few of your direct competitors I will do a search to see if they have any inbound affiliate links to their website.
Usually live (production) websites have a counterpart, their development website, e.g. dev.domain.com.
This subdomain can easily be setup as its own profile and allow for company IP addresses to accrue data. I usually apply my test scenarios to the development environment first, test the goals, and then move the changes into the production environment.
There aren't any very effective free solutions that I'm aware of - hopefully someone can add one here so I become aware of one.
However, if you're interested in controlling your affiliate program, then HasOffers is a great product with a low entry cost. You can also look into the popular affiliate networks, such as Neverblue and Commission Junction.
Also, look at your competitors' affiliate programs and see what they are using.
Are you using a WordPress theme or a custom site theme?
Let's use this flow as an example:
User comes to your site and views a product
User submits a review to the product
Manager approves the review through the CMS
The review shows up on the product page
When the review shows up on the product page, your site generates that HTML once the review is approved by a manager. That generated HTML should include the schema markup for the review.
Same with your products: if you push a new product into the site, it generates a product page. Within that page's template, the markup should be auto-generated (if it is built into the theme).
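As a rough sketch, the HTML your template generates for an approved review might look like this (microdata syntax, nested inside the product's itemscope; the names and values are placeholders):

<div itemprop="review" itemscope itemtype="http://schema.org/Review">
  <span itemprop="author">Jane D.</span> rated it
  <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of <span itemprop="bestRating">5</span>
  </span>
  <p itemprop="reviewBody">Great product, arrived quickly.</p>
</div>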
Does that help? You can PM me your site and I will take a look to be more specific.
Make sure you have GWT set to the www version.
Make the www version the canonical (and 301 redirect target) for the non-www version.
Allow the dev version to canonicalize to itself. I also suggest noindexing your development environment so Google doesn't index it, competitors won't as easily see it, and your customers don't stumble upon it.
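A minimal sketch of the redirect and noindex side, assuming Apache (mod_rewrite and mod_headers) and placeholder hostnames:

# On the live site: 301 the non-www host to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

# On dev.domain.com only: keep every page out of the index
Header set X-Robots-Tag "noindex, nofollow"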
Design for what is important to your company, first. Are you getting a large portion of traffic through a new device? That may warrant putting resources into the design for that specific device.
If not, then do not worry about it so much. Monitor the performance and traffic from those devices and adjust as necessary.
If you want to position yourself as proactive rather than reactive, begin to consider a truly responsive (fluid) design. This way you're working towards a site that looks good at all device dimensions. Even so, when a specific device takes priority (largest portion of traffic), resources should go into the UI/UX for that specific device.
Your numbers do not need to be manually updated, but they do need to be wrapped in the correct schema in order to be read and updated by Google.
Your site's theme should include the schema markup and whenever a new review is entered the theme would take care of the code behind the scenes.
If you're adding markup manually now, then your theme probably doesn't have the markup built into its code/functionality. In that case, you'd need to adjust your site's theme to include the markup automatically or add a plugin to do so. You can also use Google Webmaster Tools' Data Highlighter to try and have it update automatically (works well if your site is clearly structured).
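For reference, the 'wrapped' numbers usually mean an aggregateRating block like this (microdata; the values are placeholders your theme or plugin would fill in automatically):

<div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
  <span itemprop="ratingValue">4.3</span> stars from
  <span itemprop="reviewCount">27</span> reviews
</div>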
It's true that it wouldn't matter to the user, the way 'above the fold' real estate does. But the bots do pay attention to the parts invisible to the user, and those should be optimized too.
Yeah, maybe not. But 'above the fold' is understood to be better real estate on a web page - why not higher up in the document too?
Does the location of the title tag, meta tags, and any structured data have any influence with respect to SEO and search engines? Put another way, could we benefit from moving the title tag up to the top?
Yes, location does matter.
Let's consider this extreme scenario: A competitor and you are competing for the same term and have the following...
Basically, you and your competitor have the same internal/external optimizations - so all other factors are equal aside from the <title> location.
Pages are rendered from top to bottom. Crawlers read pages from top to bottom. Your competitor's <title> tag is higher on the page than yours. When Google crawls both sites, it understands this (the location of the title tag in relation to the page). How will it decide between your page and your competitor's page? Your competitor puts the title tag higher on the page than you do, so it must be more important.
Now, this is a very extreme scenario that is super difficult to replicate (you'd need to control both sites to do it properly). But using this extreme can show why the location of the <title> tag matters. It may be a very slight difference, but sometimes that is all that is needed.
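To make the idea concrete, here's a sketch of a <head> with the <title> as close to the top as the markup allows (the charset declaration stays first, since browsers expect it early):

<head>
  <meta charset="utf-8">
  <title>Primary Keyword Phrase | Brand</title>
  <!-- stylesheets, scripts, and remaining meta tags follow -->
</head>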
Try creating a custom report in Google Analytics to view all of the Hostnames that have your GA code.
> Custom Reports
> Metrics Groups (your choice [sessions])
> Dimension Drilldowns (Hostname)
This will give you a report of all the domains that have your GA code - are there any surprises here?
The sidebar widget will help the page rank, but it is at a disadvantage. Google's understanding of the sidebar makes the content in a sidebar less effective (my opinion, please do your due diligence).
Most sidebars include links, content around advertisements, and generic information. This has resulted in the understanding that content in the sidebar should carry less weight for rankings.
I suggest building up a strong intralinking website architecture. Build those landing pages with unique content, link back to the home page for your parent keyword phrases. Make the slideshow as HTML friendly as possible (can it be done completely in HTML/CSS?) and fill in your alt texts.
Ideally, your home page would have content to help its rank. Can you show content through tabs, modals, or other interaction methods? This way the content is not a focus to the user, but does exist on page for those who want it.
Send me a private message with some questions you have and I'd be happy to provide a clear, honest answer.
Next steps:
Identify the keywords you'd like to improve your ranking for
Identify the pages you'd like to rank for those keywords
Go into the Moz grade page tool and grade each page for the assigned keywords
Optimize the pages for an 'A' rating
Make sure to add the keywords to the Moz tracker to see how your SERPs change over time
Another website has your GA code?
I agree to avoid a domain such as:
domain.com/kidney-dialysis-ckd-dialysis/article
Your article URL will probably include a similar keyword, and the whole URL starts to get too long and keyword-stuffed.
I also agree that you want to set up an internal structure as described: Category > Article Name.
Where I differ is in including the category in the URL; I believe it is not needed. Instead, let the URL be the article name. Then structure your website so that you have a strong category page for your main keyword phrase and include links to these articles (and vice-versa) as appropriate.
Your internal link structure will tell Google just how important the main category page is for the main term and your supporting articles will be organized into categories through your UI & navigation structure.
Setting it up this way will inform Google how each piece of content is related and still allow the article title to be the main term in the URL structure.
However, this is just a preference. You can include the category in the main URL structure and it may even be a big benefit to your site. I prefer the more direct URLs and enforcing the structure through UI & internal link design - I think it allows for more flexibility and attention on the article's terms.
I'd suggest dropping the /articles from the URL and just going with a friendly URL such as http://www.domain.com/name-of-article
You could include the categories, such as /kidney-dialysis, but I suggest keeping the URL short and focusing on the article title as the main portion of the URL.
Create efficient category pages for your main terms and try to get them to rank as well.
http://www.weddingrings.com/www.yoy-search.com
This is probably due to a relative URL being used in the site code. When relative URLs are used improperly, you usually see the URL appended to the end of the correct http URL. Make sure to use an absolute http:// URL in your anchor tag.
i.e. <a href="http://www.yoy-search.com"> and not <a href="www.yoy-search.com">
What do you mean exactly by 'generate schema'?
You would want to build the schema code into your template to integrate it properly. As long as it is in your site template then it will be added to your product pages accordingly.
The best way to do that is by adding and validating it manually. However, you wouldn't need to add it manually to every single product, since building it into your site's template would do just that.
Correct, thanks for that correction. I do not think this would be considered blackhat.
Let me see if I understand this correctly...
Your team developed .swf games and distributed them to flash game sites. In the .swf file you included a link back to your original website. Now, you're wondering if those links will be considered spam? No, I do not think so.
In fact, this sounds like a good way to build backlinks. If you're the original developer and legit websites want to distribute your game, let them and be happy about the link juice.
I suggest ensuring that you can track the links back to each game, watch to see which games are distributed most, and which games generate the most clicks back to your site. Then develop similar games and reap the rewards!
Unless someone else in the community has direct experience with this being black-hat, I do not think it is.
Hi Shaqd,
I have not personally implemented a thumbs up/down rating system, but have implemented a 5 star rating system.
Since your rating only has two values, either 1 or 2 (1 being thumbs down, 2 being thumbs up), you'll need to specify the bestRating attribute so Google understands you only have two options.
This is because, by default, Google assumes a 5 star rating system (1-5) if the attribute is not specified.
On this page, http://schema.org/Rating, you'll see how to add the bestRating attribute.
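A sketch of what that might look like in microdata, assuming thumbs-up = 2 and thumbs-down = 1 as described above:

<span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
  <meta itemprop="worstRating" content="1">
  <meta itemprop="bestRating" content="2">
  Thumbs up (<span itemprop="ratingValue">2</span>)
</span>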
It would be great if someone could give an example of how that looks in the SERPs too.