Yes, I'm afraid of that too...
Let's say you copy all the default content to a UK section:
Will the hreflang tag solve the duplicate content problem, or does the content need to be unique?
Hi Andy!
Thanks for your answer, I've been thinking about this tag as well. Do you know if you can have two tags on the same page, saying the page is for both "en-gb" and "en-us"?
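Something like this, I guess (hypothetical URLs, assuming the default content stays at the root and the UK copy lives under /uk/)?
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<!-- or, if one URL should serve both markets, the same href listed with both hreflang values -->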
Do you guys have any tips to increase the visibility in both Google.com and Google.co.uk?
The site has good visibility in the USA today, but it's poor in the UK...
Information:
Do we need to add a specific section for the UK (uk.site.com or site.com/uk/) and specify the region in GWT to make sure Google handles this the right way? It's a lot of work to rewrite all the content for another section, which is also in English...
To make use of people who "borrow" your content, add a link back to your site inside the content. When they publish the content on their site, you've got yourself a backlink.
If you use WordPress, the Yoast SEO plugin has a feature for this and it's very easy to use.
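The idea, as a minimal sketch (the URL is made up; the Yoast option adds something similar to your RSS feed automatically):
<!-- appended to each post in the feed, so scrapers publish the link too -->
<p>This article first appeared on <a href="http://www.example.com/original-post/">Example.com</a>.</p>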
Good luck!
Some thoughts:
- Worse user experience because of the amount of text? High bounce rate?
- Remove those spammy meta keywords.
- Make a more user-friendly meta description (see the example after this list).
- lifestylemonthly is very active on Facebook, with shares and likes.
- Check your exact-match anchor texts and remove links from spammy sites. It's better to have few links than many bad links.
- Remove all the details that make the website look over-optimized.
- Improve the usability.
- Check for bad links.
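As an example of the meta tag points above (illustrative values only, not the site's actual tags):
<!-- drop the keyword-stuffed tag entirely -->
<!-- <meta name="keywords" content="lifestyle, monthly, tips, news, best, free"> -->
<!-- write a short, human-friendly description instead -->
<meta name="description" content="Fresh lifestyle articles on health, travel and home, published every month.">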
Hope this will help you!
Could you give an example of the keywords you used for the comparison between those sites?
Hi!
I've got a solution for this issue:
It's the gzip compression that causes the 302 redirect! Enabling compression makes Googlebot get redirected to cookie_usage.php, so Google believes it is visiting cookie_usage.php even though it came through another URL. Disabling compression fixes the problem.
I need to investigate the gzip plugin for osCommerce and see what I find.
Thanks for your help!
Hi!
I have a problem with a webshop that runs osCommerce. Google reports thousands of URLs as "Not followed" in Webmaster Tools, and the number is increasing every day.
When you use Fetch as Google you get a 302 to ../cookie_usage.php, but if you fetch the page in your browser you get 200 OK.
Why does Google get a 302 but users get a 200? We've checked for malware and restored old backups; nothing helps.
Thoughts about this?
Hi Paul!
This is the first time I've seen someone use this technique.
It is not a problem for the search engines; they see all the content of the page. The URL changes in HTML5 just help users identify specific sections of the site, much like #news (HTML anchors).
My recommendation: do not create one super-huge single page with all the content. Create separate subpages to define a clear hierarchy, which is good for both users and crawlers.
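To illustrate the difference (the URLs are hypothetical): in-page anchors keep everything on one URL, while subpages give each section its own crawlable URL.
<!-- one-page approach: sections are just anchors on a single URL -->
<a href="#news">News</a> <a href="#products">Products</a>
<!-- recommended: separate subpages, each with its own URL, title and content -->
<a href="/news/">News</a> <a href="/products/">Products</a>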
Hope this will help you!
Hi Panini!
I don't think there is a problem if you remove them all at once; you still have 10,000 pages left with great content. This is what Google wants with the Penguin update: to clean up the web and make it better for all users.
Good luck!
Thanks for your support
I've checked the sitemap; you can find it at ....se/sitemap_index.xml
It's the Yoast plugin, BTW.
Sure, note that it's in Swedish.
An example of a page that Google reports a 404 for is our contact page with /feed/ at the end,
Hi Aran, thanks for your time!
Meta and sitemaps are clear. I also checked the source code of the specific pages that Google lists in its reports, but found no such link.
I've also googled the automatic feed functions in WP, but found no problem there either.
Hi!
In Google Webmaster Tools I find *.../feed/ as a 404 page in the crawl errors. The problem is that none of these pages exist, and they have no inbound links (except the start page).
FYI, it's a WordPress site.
Example:
Does Google look for /feed/ by default, or why do I keep getting these 404s every day?
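For what it's worth, WordPress themes can print feed autodiscovery links in the <head> via wp_head(), which could be one place Google picks these /feed/ URLs up from; typical output looks roughly like this (placeholder site URL):
<link rel="alternate" type="application/rss+xml" title="Example Site &raquo; Feed" href="http://www.example.se/feed/" />
<link rel="alternate" type="application/rss+xml" title="Example Site &raquo; Comments Feed" href="http://www.example.se/comments/feed/" />
<!-- individual pages and posts can also get a per-page comments feed, e.g. .../contact/feed/ -->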
Hi Xoffie!
The RSS feed might help Google find all your pages, but my recommendation for speeding up the indexing process is this:
Good luck!
Hi Diane!
Upgrading your server will help you a few steps, but if you want big improvements you should take a look at these recommendations:
http://developer.yahoo.com/performance/rules.html
Good luck!
Hi Diane!
There is a difference between www.yourdomain.com and yourdomain.com. Because of that, you have to make sure Google and others see the version of your choice, with or without www.
My recommendation:
Choose www.yourdomain.com or yourdomain.com and then add a 301 redirect to make sure that visitors and search engines always see the same version of your site.
More reading about 301 redirects:
http://www.webconfs.com/how-to-redirect-a-webpage.php
Good luck!
Hi Cygnis!
If you want to get good results in different countries, there is no quick fix.
My recommendation:
or
Then on to one of the most important things for great rankings: unique content.
Good luck!
Hi Joel!
Google's recommendation for this is a 301 redirect:
If you need to change the URL of a page as it is shown in search engine results, we recommend that you use a server-side 301 redirect.
More reading: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633
Otherwise, if you don't have access to do it server-side, a canonical is better than nothing.
Good luck!
Hi Roel!
I've spoken to people in many different countries about this, and they all confirm that the Penguin update has rolled out everywhere.
Google said this:
_The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice. _
More reading: http://insidesearch.blogspot.se/2012/04/another-step-to-reward-high-quality.html
Hope this will help you!
Hi Highland!
I know that relative URLs are anything but good, especially when you also use URL rewriting.
The only question is how Google will react to this.
Thanks for your answer!
Hi Cyrus and thanks for your answer!
The client is using the base tag on all pages on the site, but with different URLs. For example:
Root page: <base href="http://domain.com/1.0.1.0/2/1/">
Subpage:
<base href="http://domain.com/1.0.1.0/5/1/"> OR
<base href="http://domain.com/1.0.1.0/13/1/">
Productpage:
<base href="http://domain.com/1.0.1.0/14/1/">
As you can see, they are using a lot of different base locations, and unfortunately we are unable to change the base URL and test.
We have problems with both broken links and rankings. Whenever a new version of the system is released, all the base URLs change, which means old links may remain and end up broken.
What do you think, Cyrus, can this hurt us from an SEO perspective? It must be confusing for Google with all the strange base URLs?
I think the best would be to rebuild the structure and remove the base tag!
A customer is using the <base> tag in an odd way:
<base href="http://domain.com/1.0.0/1/1/">
My own theory is that the subfolders are added as the root because of revision control.
CSS, images and internal links are used like this:
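(Hypothetical paths for illustration; relative references like these resolve against the base URL:)
<link rel="stylesheet" href="css/style.css">
<img src="images/logo.png" alt="Logo">
<a href="products/index.html">Products</a>
<!-- e.g. "images/logo.png" resolves to http://domain.com/1.0.0/1/1/images/logo.png -->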
I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say if it is due to the base tag.
I have read that the base tag may cause problems in some browsers, but is this usage of the base tag bad from an SEO perspective? I have a lot of problems with this customer and I want to know if the base tag is part of them.
Hi!
Interesting thoughts, I have never tried that.
Google says:
_We're experimenting with using this data to help people find content from great authors in our search results. _
More reading:
http://googlewebmastercentral.blogspot.se/2011/06/authorship-markup-and-web-search.html
http://support.google.com/webmasters/bin/search.py?hl=en&query=rel+author&ctx=en:searchbox
Hi!
What black hat stuff did you do?
My recommendations are:
Unfortunately, when you've broken the Google guidelines they won't fix everything overnight; you have to keep up the good work to persuade Google that you are the good guys and that you deserve to rank again.
Good luck!
Hi Max!
Facebook still dominates the market and it gives good ranking signals as well.
I would recommend you to use Facebook as the primary network; you will reach the most people through that channel. Then, as a complement, I would recommend you to start using Google+. I've seen a lot of discussions about Google+ and I think it's going to be a very important ranking factor (it already is).
Google Google = true
Then we have Twitter, an easy way to create buzz around a subject, a customer or whatever you want. It all creates positive signals to Google: you're a hot subject in social media -> ranking signals.
Good luck!
Use internal links; I've seen a lot of pages use their internal power to increase their rankings. Make sure the links point to the same page and that they look natural (no over-optimizing).
I would also recommend you to start a blog: write posts with good content for your keywords and voilà, you have content that drives traffic to your website. And you can also use the blog for internal linking.
In the long term, I would recommend you to rebuild the site and use Flash for design and user experience, and ordinary HTML for navigation and presenting content.
Good luck!
One important thing to keep in mind is that you need a great connection between both parties. If you have good communication, understand each other and make progress, then it's good (even small SEO agencies can deliver).
I would suggest you try a "pilot project": give them a shot and let them help you with one smaller problem that you have. Then evaluate.
Good luck!
Hi!
See the answers in this thread: http://www.seomoz.org/q/does-server-location-impact-seo
See this video with Matt Cutts: http://www.youtube.com/watch?v=hXt23AXlJJU (What impact does server location have on rankings?)
Hope this will help you!
Hi!
My recommendation is absolutely WordPress. You get access to hundreds of great SEO plugins and it's very easy to work with.
Reading and plugins for WordPress: http://yoast.com/wordpress/
Simple platform = more time for blogging.
Good Luck!
Did some googling and found this:
Maybe this will help you!
Strange, I get to the page I click on, as expected.
Try clearing your cookies in the browser, then do the search + click again.
Hi!
I've followed your instructions but I can't recreate your issue.
Which other website do you end up on?
Hi!
An easier way to fix the problem is with canonical tags (if you're not familiar with .htaccess or server-side scripts).
You can find Rand Fishkin's amazing article about it here:
http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
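A minimal sketch of the tag itself (the URL is just a placeholder): put it in the <head> of the duplicate versions and point it to the version you want indexed.
<link rel="canonical" href="http://www.example.com/preferred-page/" />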
Good luck!
Hi Sofia!
See this video with Matt Cutts: http://www.youtube.com/watch?v=hXt23AXlJJU (What impact does server location have on rankings?)
Good luck!
As Jeffrey said, don't try to reinvent the wheel; check out the SEOmoz Keyword Difficulty tool.
Otherwise you can look at these parameters:
Combine all of these parameters to get an idea of the difficulty.
Good luck!
I can recommend reading Wil Reynolds' latest presentation from Linklove London 2012:
http://www.slideshare.net/mobile/wilreynolds/stalking-for-links
He is a very smart guy, enjoy reading!
I would recommend you to contact your web agency and see if they can fix this for you. The best solution is to have one single unique URL for the index page.
Quite often the web agency can fix it; I've worked with a lot of customers who have had the same issue.
Good luck!
Add a canonical URL that points to the URL that you want to use.
or
Some reading:
Hi Andre!
This is a common problem, and I know several people who have run into it.
Tips for fixing this problem:
1. Make sure that your preferred landing page is relevant for the keyword: title, H1 and content.
2. Use the power of internal links: if you mention the keyword on other pages, link to the preferred page with that keyword (naturally, sometimes, not always). It gives Google a signal about which page you want to rank for the targeted keyword (see the sketch after this list).
3. Check incoming links in OSE; if you find "good" links pointing to the wrong URL, make contact and ask them to update the URL.
4. Create more signals by pointing social media links (G+, Facebook and Twitter) to the preferred URL.
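As an illustration of point 2 (the URL and anchor text are made up):
<!-- internal link from a related page, using the keyword as anchor text -->
<p>Read more in our guide to <a href="http://www.example.com/blue-widgets/">blue widgets</a>.</p>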
Hope this will help you!
Hi Pete!
Take a look at Google's SEO Starter Guide, page 26: http://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en//webmasters/docs/search-engine-optimization-starter-guide.pdf
Good luck!
If he chooses option C he will lose the traffic as well; visitors will just hit a dead end.
If you put the visitor in the front seat, I would suggest redirecting the old subpage to a corresponding page on the new site, and if you don't have one, as he mentions, I would suggest option A.
Hi Andre!
I would suggest option A. If you can't find a corresponding URL on the new site, then you should redirect it to the new start page. That way you take care of the visitors who might find the old pages and send them to your new site, plus get the most out of the old subpage's PR.
Recommendations from Google:
http://www.google.com/support/webmasters/bin/answer.py?answer=83105
Good luck!