Hi there,
I'm using Firecheckout on a few projects, and it is really easy to use. (M 1.9.3.x)
Oh, sorry. Somehow I didn't get a notification of your reply.
For IIS, you can do this in the web.config of your website. The rule will look something like:
<rule name="Force WWW and SSL" enabled="true" stopProcessing="true">
  <match url="(.*)" />
  <conditions logicalGrouping="MatchAny">
    <add input="{HTTP_HOST}" pattern="^(?!www\.)" />
    <add input="{HTTPS}" pattern="off" />
  </conditions>
  <action type="Redirect" url="https://www.domainname.com/{R:1}" appendQueryString="true" redirectType="Permanent" />
</rule>
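As a sanity check: the host condition has to match any host that does not start with "www.", which calls for a negative lookahead like ^(?!www\.) (IIS URL Rewrite uses ECMAScript-compatible regex, so a character class will not work here). A quick Python sketch to verify the pattern; the helper name is just illustrative:

```python
import re

# Negative lookahead: matches only when the host does NOT start with "www."
NON_WWW = re.compile(r"^(?!www\.)")

def needs_www_redirect(host):
    """True if this host should be redirected to the www variant."""
    return NON_WWW.match(host) is not None

print(needs_www_redirect("domainname.com"))      # True
print(needs_www_redirect("www.domainname.com"))  # False
```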
Hi Sammy,
If I understand your question correctly, you need help with .htaccess code to force both https and www with the same rule? If so, this might be what you are looking for:
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.domainname\.com$ [NC]
RewriteRule ^(.*)$ https://www.domainname.com/$1 [L,R=301]
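To make the combined logic explicit, here is a small Python sketch (purely illustrative, not part of Apache) that mimics the two conditions joined with [OR]: a request is redirected if it is not HTTPS or if the host is not the canonical www host.

```python
from urllib.parse import urlsplit

TARGET_HOST = "www.domainname.com"  # canonical host from the rule above

def redirect_target(url):
    """Mimic the two RewriteCond lines joined with [OR]: return the 301
    target if a redirect is needed, or None if the URL is already canonical."""
    parts = urlsplit(url)
    if parts.scheme != "https" or parts.netloc != TARGET_HOST:
        suffix = f"?{parts.query}" if parts.query else ""
        return f"https://{TARGET_HOST}{parts.path or '/'}{suffix}"
    return None  # already https + www, no redirect

print(redirect_target("http://domainname.com/page"))       # https://www.domainname.com/page
print(redirect_target("https://www.domainname.com/page"))  # None
```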
Hi there,
Will the URL structure remain the same? If so, in the .htaccess file of the subdomain, add the following after RewriteEngine On:
RewriteCond %{HTTP_HOST} ^shop\.domain\.co\.uk$ [NC]
RewriteRule (.*) https://www.domain.co.uk/$1 [R=301,L]
This should do the trick: it redirects https://shop.domain.co.uk/product-category/great-merchandise/?product_order=desc to https://www.domain.co.uk/product-category/great-merchandise/?product_order=desc
I hope this helped.
You have my details on my profile. And after we resolve it, we should post the solution here without domain-specific information, so it helps others in the future (if you don't mind).
Hi there,
What is probably happening is that your plugins are not optimized for redirects: the plugin probably adds the redirects to your .htaccess file, but they are not optimized. You should address it directly in the .htaccess file. If you can give me access, I can help you out.
Hi James,
As far as I can see, you have the following architecture:
Since the listing-page pagination is blocked in robots.txt, only the first 15 job postings are available to crawl via a normal crawl.
I would say you should remove the blocking from robots.txt and focus on implementing correct pagination. Which method you choose is your decision, but allow the crawler to access all of your job posts. Check https://yoast.com/pagination-seo-best-practices/
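If you want to verify what a crawler is (and isn't) allowed to fetch before and after the change, Python's standard urllib.robotparser can simulate it. The robots.txt content and URLs below are hypothetical stand-ins for your setup:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks listing-page pagination:
robots_lines = [
    "User-agent: *",
    "Disallow: /jobs/page/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

print(rp.can_fetch("*", "https://www.example.com/jobs/page/2/"))  # False: paginated posts unreachable
print(rp.can_fetch("*", "https://www.example.com/jobs/"))         # True: only page 1 is crawlable
```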
Another thing I would change is to make the job post title the anchor text for the link to the posting (currently every single job is linked with "Find out more").
Also, if possible, create a separate sitemap.xml for your job posts and submit it in Search Console; this way you can keep track of any indexing anomalies.
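A minimal job-posts sitemap can even be generated with a few lines of Python's standard library; the URLs below are hypothetical placeholders:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml string for the given (job post) URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical job-post URLs:
sitemap_xml = build_sitemap(["https://www.example.com/jobs/seo-manager/"])
print(sitemap_xml)
```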
Last but not least, focus on the quality of your content (just as Matt proposed in the first answer).
Good luck!
In my experience it will help the overall site, but still... do not expect a huge impact from these. URLs are shared, but I don't believe people will start to link to them outside of private conversations.
This is a technical question that they need to tackle from the database side. It can be implemented, but it needs a few extra development hours, depending on the complexity of your website architecture, the CMS used, etc. Anyway, you are changing the URLs, so don't forget about the best practices for them. Good luck!
Hi there,
I believe the most logical implementation would be to use "noindex, follow" meta robots on these pages.
I wouldn't use a canonical because it does not serve this purpose. Also make sure these pages are not blocked via robots.txt.
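For reference, the directive itself is just <meta name="robots" content="noindex, follow"> in the page head. Here is a small Python sketch for spot-checking that a page carries it; the parser class and the sample page are illustrative:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of the first <meta name="robots"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name", "").lower() == "robots" and self.robots is None:
            self.robots = d.get("content")

# Hypothetical page carrying the suggested directive:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.robots)  # noindex, follow
```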
Hi there,
This is one of the reasons why the hreflang tag was created. You can learn more about this here: https://moz.com/learn/seo/hreflang-tag
Also I would advise you to read Dave's post about it: https://moz.com/blog/hreflang-behaviour-insights
I hope this will clear it a little bit.
Keszi
Hi Russel,
My advice would be to go and check the webinar section. There was quite a good webinar last Thursday pointing out the dos and don'ts for Google Places.
I believe it will answer all of your questions.
I hope that will help,
Istvan
Hi Chris, Maybe opensiteexplorer.org could be a solution? You could export the info and check anchor texts and links as well. Gr., Istvan
Hi Salvyy,
I always told my team that links we gain from bad websites are harmful in some way, BUT linking out to such sites can flag us. (Matt Cutts made the same point: you cannot influence who links to you, but you can choose where you link to.)
Therefore my advice would be: if it is not in your niche, then don't link to these sites.
Gr.,
Istvan
Hi,
You can block RogerBot from Robots.txt
Check for further instructions on: http://www.seomoz.org/dp/rogerbot
"Please note: Adding this code will prevent our crawl test tool from being able to crawl your website."
Gr.,
Istvan
What I would add to Lewis's answer, just to make it clear, is that canonicals do pass link juice. If you are interested, you can check this article from Dr. Pete: http://moz.com/blog/an-seos-guide-to-http-status-codes
But for sure the 301 would be the best practice in your case, because there would be no good use in having the pictures on 2 URLs (the keyword and non-keyword versions).
I hope this helps.
Keszi
Hi Jackie,
What I'd advise you to do is create a Screaming Frog crawl so you can compare the two sets of crawl data. If both show missing meta descriptions on the same pages, then there might be something on your side.
It is always good to double-check the information you are provided, so you can be sure the issue isn't a bug.
Gr., Keszi
P.S. I made a quick crawl with Screaming Frog on the first domain: sundancevacationsblog.com and found quite a big number of Missing Meta Titles.
Hi,
I'd quote from Moz: http://moz.com/learn/seo/domain-authority
"How do I influence this metric?
Unlike other SEO metrics, Domain Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank, MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well.
The best way to influence this metric is to improve your overall SEO. In particular, you should focus on your link profile—which influences MozRank and MozTrust—by getting more links from other well-linked-to pages."
So, answering your question: yes, you can improve the DA while working on your SEO.
In general, in my belief, the 301 redirect could have influenced the score, but I would rather check the rankings and organic evolution of the website after the redirect instead of checking only the DA. If the organic results improved, I wouldn't worry too much about the DA itself. (Maybe it is just a temporary glitch until Moz recalculates its value based on the new information they get.)
Keszi
Hi there,
Let me quote Google for this:
Average position: The average top position of your site on the search results page for that query.
To calculate average position, we take into account the top ranking URL from your site for a particular query. For example, if Jane’s query returns your site as the #1 and #2 result, and David’s query returns your site in positions #2 and #7, your average top position would be 1.5.
Source: https://support.google.com/webmasters/answer/35252?hl=en#details
I believe that they also take the local results into consideration (but that is my opinion).
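Google's example boils down to: take the best (top) position your site gets for each query, then average those. A tiny Python sketch of that calculation:

```python
def average_top_position(positions_per_query):
    """Average of the best (lowest-numbered) position the site gets per query."""
    tops = [min(positions) for positions in positions_per_query]
    return sum(tops) / len(tops)

# Google's example: Jane's query -> #1 and #2; David's query -> #2 and #7.
print(average_top_position([[1, 2], [2, 7]]))  # 1.5
```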
Gr., Keszi
Hey Miranda,
I would go for one of the following:
1. use canonicals (link rel="canonical" pointing to domain.com/jobs-in-london/)
2. the dev team could use cookies to keep the tracking (I am not a developer, but this issue has been solved many times with cookies) - developers, please correct me if I am wrong on this.
I hope it helped,
Istvan