
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi, if I have a homepage which is available at both www.homepage.com and www.homepage.com//, should I 301 the // version to the first version? I'm curious as to whether slashes are taken into consideration. Thanks in advance

    | TheZenAgency
    0
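If the double-slash version does need a 301, a hypothetical Apache .htaccess sketch (assuming mod_rewrite is available) could collapse the repeated slashes like this:

```apache
# Hypothetical sketch: 301-redirect requests whose raw request line
# contains "//" in the path to the collapsed single-slash URL.
# mod_rewrite has already collapsed slashes in the matched path,
# so /$1 is the clean version.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s[^?]*//
RewriteRule ^(.*)$ /$1 [R=301,L]
```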

  • Hi all. I've heard that it's good to put a rel canonical link in your header even when there is no other preferred version of that URL. If you take a look at moz.com and view the source, you'll see that they put <link rel="canonical" href="http://moz.com" /> pointing at the same URL! But if you go to http://moz.com/products/pricing, for example, they have no canonical there! Why? Thanks in advance!

    | Tintanus
    0

  • We are having some issues with response codes on our product pages on our new site. It first came to my attention with the Mozbot crawl, which was picking up thousands of 302 redirects, but when I checked them manually there was no redirect (and even the Moz toolbar was giving a 200 status). I then checked with this tool: http://tools.seobook.com/server-header-checker/?page=single&url=https%3A%2F%2Fwww.equipashop.ie%2Fshop-fittings-retail-equipment%2Fgridwall%2Fgridwall-shelves%2Fflat-gridwall-shelf.html&useragent=2&typeProtocol=11
    It's showing two responses, a 302 and a 200 (but the same tool under the Googlebot setting only shows the 200 status). I'm also getting no warnings about it in WMT. Does anyone know what's happening here, and how worried should I be about it, as it seems Google is using only the 200 status? BTW, the developer thinks it's something to do with how the browser is handling the canonical link, but I'm not convinced. Thanks

    | PaddyDisplays
    0

  • Hi, I know the general consensus is to stay away from the non-established domain suffix types and concentrate on .coms, .co.uks, etc. But I have an aged .info domain that has some content on it related to an online newspaper I have on that subject (currently in a sub-folder on the newspaper provider's domain), which I want to focus more time on and put on its own dedicated domain. So I want to upload it to this aged .info domain. However, it's a waste of time if .info domains are bad for SEO. Does anyone have any experience of .info domains doing well in SERPs, or should I totally scrap the idea and try to find a new .com-type domain? My .info has been live with related content for 7 years, so I'm hoping that should count for something 🙂 All the best,
    Dan

    | Dan-Lawrence
    0

  • Hi all. Wanting some advice. I have a client which has a number of individual centres that are part of an umbrella organisation. Each individual centre has its own web site, and some of these sites have similar (not duplicate) content, products and services. Currently the individual centres are subdomains of the umbrella organisation, i.e. the umbrella organisation is www.organisation.org.au and the individual centres are subdomains, i.e. www.centre1.organisation.org.au, www.centre2.organisation.org.au, etc. I'm feeling that perhaps this setup might be affecting the rankings of the individual sites because they are subdomains. Would love to hear some thoughts or experience on this, and whether it's worth going through the process of migrating the individual centre domains. Thanks, Ian

    | iragless
    0

  • Howdy Moz community, I had a question regarding canonicals. I help a business with their SEO, and they are a service company. They have one physical location, but they serve multiple cities in the state. My question is in regards to canonicals and unique content. I hear that slightly differing content on each page won't matter as much if most of the content is essentially the same. This business wants to create service pages for at least 10 other cities they service. The site currently only has pages targeting one city. I was wondering if it would be beneficial to use a template page for each city and then put a canonical on it to say that it is identical to the main city page? Example: our first city was San Francisco; we want to create city pages for Santa Rosa, Novato, San Jose, etc. If the content for the 2nd, 3rd, and 4th cities were the same as the 1st city, but with the slight change of the city name, would that hurt? Would putting a canonical help this issue, if I signal that it is the same as the 1st page? The reason I want to do this is that I have been getting concerns from my copywriter that after the 5th city, they can't seem to make the service pages that much different from the first 4 cities in terms of the wording of the content and its structure. I want to know: is there a simpler way to target multiple cities for local SEO (geo-targeted terms) without having to think of a completely new way to write out the same thing for each city service page? This is very time-consuming on my end. Main questions: Will making template service pages, changing the city name to target different geographic locations, and putting a canonical tag on the new pages referring back to the main city page be effective for ranking in multiple cities? Will doing this tell Google my content is thin, or be considered duplicate? Will this hurt my rankings? Thanks!

    | Ideas-Money-Art
    0

  • Hi, I've inserted some markup on my homepage for a vacation flat. As there is not just one price for the whole season, I used minPrice and maxPrice to define it. Unfortunately the Google testing tool (https://developers.google.com/webmasters/structured-data/testing-tool/) says that the value "price" also has to be in the code, but I have no clue what value I should give it. Does anyone have advice? Thank you very much! Best regards, André

    | Andre-S
    0
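One common way to satisfy the testing tool without inventing a single price is to express the range through a PriceSpecification; a hedged JSON-LD sketch (the values and currency below are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Offer",
  "priceCurrency": "EUR",
  "priceSpecification": {
    "@type": "PriceSpecification",
    "minPrice": "80",
    "maxPrice": "140",
    "priceCurrency": "EUR"
  }
}
</script>
```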

  • Hi guys,
    When working with advanced modern websites, it often means that in order to achieve the desired look and feel we end up with pages that have almost 1,000 lines of code or more. In some cases it is impossible to avoid if we are to meet the client's visual and technical specifications. Say the page is 1,000 lines of code and our content only starts at line 450 onwards: will that have an impact on Google crawlability, and hence affect our SEO, making it harder to rank? Thoughts? Dan.

    | artdivision
    0

  • Hi all, I recently relaunched a site on a brand new URL - www.boardwarehouse.co.uk. I've spent the last couple of weeks building some backlinks as well as developing a basic content strategy. We've started ranking for a few of our less competitive keywords, which is great; however, there's a strange site which either redirects to or is mirroring our content. I'm at a complete loss as to what's causing this to happen and what I can do to stop it. On the attachment, my content is top and second; the fourth result is the offending site. Any help/advice would be most welcome! Thanks in advance, Alick

    | Alick300
    0

  • My site serves a consumer-focused industry that has about 15-20 well recognized categories, which act as a pretty obvious way to segment our content. Each category supports its own page (with some useful content) and a series of articles relevant to that category. In short, the categories are pretty focal to what we do. I am moving from DNN to WordPress as my CMS/blog, and I am taking the opportunity to review and fix SEO-related issues as I migrate. One such area is my URL structure. On my existing site (on DNN), I have the following types of pages for each topic: /<topic> (essentially the landing page for the topic, linking to articles) and /<topic>/articles/<article-name> (topics have 3-15 articles with this URL structure). With WordPress, I am considering moving articles under the root. So, an article on (making this up) how to make a widget would be under /how-to-make-a-widget, instead of /widgets/article/how-to-make-a-widget. I will be using WordPress categories to reflect the topics taxonomy, so I can flag my articles using standard WordPress concepts. Anyway, I'm trying to get my head around whether it makes sense to "flatten" my URL structure such that the URLs for each article no longer include the topic (the article page will link to the topic page though). Thoughts?

    | MarkWill
    1

  • Hi all, I run an ecommerce website - not a greatly ranked site - however I want to try and improve the product detail pages. To do this, I am first going to focus on one page (this one: http://goo.gl/eS62SU). If I type the product code directly into google.co.uk search, I am on the 8th page (see https://www.google.co.uk/#q=hac-hfw2220r-z&start=70), which is a bit poor to say the least. I see this kind of thing for a lot of my products. Hence, I am going to see if over the next month or two I can get this one page moving up the rankings purely with on-page optimisation. I would like to ask a couple of things: 1. Is there anything that jumps out at you as to why that product detail page could NOT ever rank well, i.e. some code / page setup that prevents Google ranking it? 2. Any advice you could give that might improve that page's rankings for its product code? FYI - I cannot change the dynamic URL; I only have control over things such as product name / summary / features / spec etc. Any advice welcome

    | isntworkdull
    0

  • Hey Everyone: We are currently implementing hreflang tags on our site, and we have many parameter pages with hreflang tags; however, I am afraid these may be counted as duplicate content without canonical tags. example.com/?utm_source=tpi has:
    <link rel="alternate" hreflang="de" href="http://example.com/de" />
    <link rel="alternate" hreflang="nl" href="http://example.com/nl" />
    <link rel="alternate" hreflang="fr" href="http://example.com/fr" />
    <link rel="alternate" hreflang="it" href="http://example.com/it" />
    I have two questions: 1. Do I need a canonical tag pointing to example.com? 2. On the homepage without the parameter, should I add self-referencing hreflang tags? (href="http://example.com/" hreflang="es") Thanks so much for your help! Kyle

    | TeespringMoz
    0
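For reference, a hypothetical head section for a parameter page like the one in the question might combine a canonical to the clean URL with a self-referencing hreflang set (using the example.com placeholders from the question):

```html
<!-- On example.com/?utm_source=tpi: canonicalise to the clean URL -->
<link rel="canonical" href="http://example.com/" />
<!-- Self-referencing hreflang plus the alternates -->
<link rel="alternate" hreflang="es" href="http://example.com/" />
<link rel="alternate" hreflang="de" href="http://example.com/de" />
<link rel="alternate" hreflang="nl" href="http://example.com/nl" />
<link rel="alternate" hreflang="fr" href="http://example.com/fr" />
<link rel="alternate" hreflang="it" href="http://example.com/it" />
```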

  • Hi Mozzers, I'd like your feedback on the following: the test/development domain where our site builder works got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently still visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko

    | Yarden_Uitvaartorganisatie
    0
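Worth noting for readers with the same problem: robots.txt has no reliable noindex directive, and blocking crawling can actually keep already-indexed pages in the SERPs, because Google never gets to see the removal signal. A minimal page-level sketch for a development domain:

```html
<!-- Served on every page of the dev domain; the pages must remain
     crawlable (not blocked in robots.txt) or this directive is never seen. -->
<meta name="robots" content="noindex, nofollow" />
```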

  • I work with a unique situation where we have a site that gets tons of free traffic from internal free resources. We do make revenue from this traffic, but due to its nature, it has a high bounce rate. Data shows that once someone from this source does click a second page, they are engaged, so they either bounce or click multiple pages. After testing various landing pages, I've determined that the best solution would be to create a landing page on a separate domain and hide it from the search engines (to prevent duplicate content and the appearance of link farming). The theory is that once they click through to the site, they will bounce at a lower rate and improve the stats of the website. The landing page would essentially filter out this bad traffic. My question is, how sound is this theory?  Will this cause any issues with Google or any other search engines?

    | jhacker
    0

  • On my site "example.com" I have set up the following in the header: The problem is that the tags are universal across the site, so every page has these tags, leading obviously to "no return tag" errors. I.e. the page www.example.ca/testing.html still has the tags: not tags with "testing.html" in them. How bad is this? Does it matter?

    | absoauto
    0

  • Hi guys! I keep reading conflicting information on this and it's left me a little unsure. Am I right in thinking that a website with a subdomain of shop.sitetitle.com will share the same robots.txt file as the root domain?

    | Whittie
    0

  • Hello Moz community, I'd like to know if you build a "pre-sales SEO audit" when selling your services to a prospect. I think the main idea of a pre-sales audit is to show your prospect that you: understand their industry (trends & competition), understand the opportunities, and know the roadblocks on their website. If so, I'd be interested in discussing the information you put into your pre-sales audit and how you organise it. If you know resources I should read regarding mini SEO audits / pre-sales SEO audits, just paste the link 🙂 Thanks for your answers

    | Sindicic_Alexis
    0

  • I was wondering, does Google make an exception for news stories where duplicate content is concerned? After all, depending on the story, there can be a lot of quotes and bulk blocks of the same details. Is Google intelligent enough to distinguish between general website content and actual news stories? Also, like a lot of big firms, we publish news stories on our website, but then they get passed on to other websites in the form of PR and published there too. So if we put a story on our website and, within a few hours or the same day, other websites publish the story (literally copied and pasted) - how does this affect our website in terms of duplicate content? Will Google know automatically that we published it first? Thanks!

    | Brabian
    0

  • Just wondering if anyone has any experience with the WordPress Simple Firewall plugin. I have a client who is concerned about security as they've had issues in that realm in the past and they've since installed this plugin: https://wordpress.org/support/view/plugin-reviews/wp-simple-firewall?filter=4 Problem is, even with a proper robots file and appropriate settings within the firewall, I still cannot crawl the site with site crawler tools. Google seems to be accessing the site fine, but I still wonder if it is in anyway potentially hindering search spiders.

    | BrandishJay
    0

  • We have several brand-related domains which are parked and pointing to our main website. Some of these are redirecting using a 302 (don't ask, that's a whole other story), but these are being changed. But it shouldn't matter what type of redirect they are, no? Since there has never been any traffic and they are not indexed. But it seems that one of them was indexed: exotravel.vn. A search for our brand name or the previous brand name (exotravel and exotissimo) brings up this parked domain first! How can that be? The domain has never been used and has no backlinks. exotravel.vn is redirecting, and I submitted a change of address weeks ago to Google, but it's still coming up first in all brand name searches for exotissimo or exotravel.

    | Exotissimo
    0

  • Hello. 20 days ago, I changed my domain from uclasificados.net to uclasificados.com, doing 301 redirects for all URLs, and I started to lose rankings from that moment. I was wondering if changing it back could be the solution, but some experts recommended that I not do that, because it could make things worse. Right now I receive almost 50% of the traffic I used to receive before, and I have tried a lot of link-building strategies to recover, but nothing has worked so far. Even though I notified Google of this change and submitted my new sitemap, I haven't seen my situation improve in any respect, and I still see search stats in Webmaster Tools from my last website (the website that used to be uclasificados.com before the change). What should I do to recover faster?

    | capmartin85
    0

  • Hi there, I know there are answers all over the web to this type of question (and in Webmaster Tools); however, I think I have a specific problem that I can't really find an answer to online. The site is: www.lizlinkleter.com. Firstly, the site has been live for over 2 weeks. I have done everything from adding analytics, to submitting a sitemap, to adding the site to Webmaster Tools, to fetching each individual page as Googlebot and then submitting it to the index via Webmaster Tools. I've checked my robots files and code elsewhere on the site, and the site is not blocking search engines (as far as I can see). There are no security issues in Webmaster Tools or Moz. Google says it has indexed 31 pages in the 'Index Status' section, but on the site dashboard it says only 2 URLs are indexed. When I do a site:www.lizlinketer.com search, the only results I get are pages that are excluded in the robots file: /xmlrpc.php & /admin-ajax.php. Now, here's where I think the issue stems from: I developed the site myself for my wife and I am new to doing this, so I developed it on the live URL (I now know this was silly). I did block the content from search engines and have the site password-protected, but I think Google must have crawled the site before I did this. The issue was that I had pulled in the WordPress theme's dummy content to make the site easier to build - so lots of nasty dupe content. The site took me a couple of months to construct (working on it on and off), and I eventually pushed it live and submitted it to Analytics and Webmaster Tools (obviously it was all original content at this stage). But this is where I made another mistake: I submitted an old sitemap that had quite a few old dummy-content URLs in it. I corrected this almost immediately, but it probably did not look good to Google.
My guess is that Google is punishing me for having the dummy content on the site when it first went live - fair enough, I was stupid - but how can I get it to index the real site?! My question is, with no tech issues to clear up (I can't resubmit the site through Webmaster Tools), how can I get Google to take notice of the site and have it show up in search results? Your help would be massively appreciated! Regards, Fraser

    | valdarama
    0

  • Our site is an online Marketplace for Services. Naturally, we have a lot of unique content in the form of:
    a) Job posts
    b) Profiles of service providers. We also have two very important pages:
    a) The job listing page
    b) The service provider page. The listing pages have very valuable H1 titles, but everything else is duplicate content. To capture the keywords currently in those H1s, we have created a different landing page for each category page, and we'll optimize around that, so these H1s are not that big of a deal any more. These landing pages are the key to our SEO strategy, and we are building new content every day to help them rank. I want to make the listing pages noindex, follow. This way they pass juice to the jobs and profiles (which have unique content) without being indexed themselves. Is this a bad idea? I have been thinking about doing this for over a year, but it never felt important enough to be worth the risk of accidentally screwing something up. We'll soon do a new on-page flow optimization, and that's why I am considering this again. Thank you so much in advance, Argyris

    | Ideas2life
    0
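The directive being discussed here is the page-level meta robots tag; a minimal sketch for the listing pages:

```html
<!-- Keep the listing page out of the index, but let crawlers follow
     its links to the job and profile pages. -->
<meta name="robots" content="noindex, follow" />
```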

  • I'd like your advice on re-lifing an expired domain (assume PA 55). How do I get the benefit from it when I use it for new, related content?

    | zant
    0

  • Hey there Mozzers. I want to change the URL of a certain page on my website. Example: I want to change www.example.com/poker-face to www.example.com/poker-faces. Should I create a new page and 301 the old one? Does a 301 pass all the link juice to the new page, or do I have to add a rel=canonical as well?

    | Angelos_Savvaidis
    0
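A hedged Apache .htaccess sketch for this kind of single-URL move (assuming mod_alias is available; RedirectMatch with anchors avoids the new slug accidentally matching the old rule as a prefix):

```apache
# 301 the old slug to the new one. Anchored so /poker-faces itself
# is not re-matched and redirected in a loop.
RedirectMatch 301 ^/poker-face$ /poker-faces
```

Generally the 301 alone carries the signal; once the old URL no longer serves a page, there is nothing left to put a rel=canonical on.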

  • My website has a forum that uses the title of each post as its meta description. The problem is that when a post becomes long and is split across pages, Google tells me I have duplicate meta description issues, because the 2nd and 3rd pages are using the same meta description. What is the best course of action here?

    | Angelos_Savvaidis
    0

  • Hello, a client has asked us today to quote how much it would cost them to get a microsite built. A Google employee has told them that because their current URL doesn't include .co.uk or .com (it is simply brandname.word), it will be harder for them to get their website to rank. My understanding is that microsites aren't a good solution for any problem, as Google doesn't like them. Would it be better for them to buy a .co.uk URL (they are a UK company) and then redirect it to their current website, or is there a better solution? Many thanks

    | mblsolutions
    0

  • If you've got a genuine blog mentioning your website, is it wise to accept a trackback to the site? The site is WordPress.

    | Cocoonfxmedia
    1

  • What software do you guys use to generate a sitemap?

    | Angelos_Savvaidis
    0
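If no off-the-shelf tool fits, a basic sitemap is simple enough to generate yourself; a minimal Python sketch using only the standard library (the page list below is a made-up placeholder):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemaps.org-compliant XML string for the given URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list for illustration
pages = ["https://www.example.com/", "https://www.example.com/about"]
print(build_sitemap(pages))
```

Write the result to sitemap.xml at the site root and reference it in robots.txt or submit it in Webmaster Tools.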

  • I think the title sums up the question, but does a new subdomain get any ranking benefits from being on a pre-existing high authority domain. Or does the new subdomain have to fend for itself in the SERPs?

    | RG_SEO
    0

  • I would like some third-party input, partly for my sanity and also for my client. I have a client who runs a large online bookstore. The bookstore runs on Magento, and the developers are also apparently the web host. (They actually run the servers; I do not know if they are sitting under someone's desk or are actually in a data center.) Their server has been slowed down by local and foreign bots. They are under the impression my SEO services are sending spam bots to crawl and slow down their site. To fix the problem they disallowed all bots - everything: Google, Yahoo, Bing. They also banned my access to the site. My client's organic traffic instantly took a HUGE hit (almost 50% of their traffic is organic, and over 50% is organic + AdWords, mostly from Google). Their keyword rankings are taking a quick dive as well. Could someone please verify the following as true, to help me illustrate to my client that this is completely unacceptable behavior on the part of the host. I believe: 1.) You should never disallow ALL robots from your site as a solution for spam. As a matter of fact, most of the bad bots ignore robots.txt anyway; it is a way to limit where Google searches (which is obviously a technique to be used). 2.) On-site SEO work, as well as link building etc., is not responsible for foreign bots and scrapers putting a heavy load on the server. 3.) Their behavior will ultimately lead to a massive loss of rankings (already happening) and a huge loss of traffic (already happening), and ultimately, since almost half the traffic is organic, the client can expect to lose a large sum of revenue from purchases made by organic traffic, since it will disappear. Please give your input and thoughts. I really appreciate it!

    | JoshuaLindley
    1

  • Hi, I have a new client whose first MA crawl report is showing lots of duplicate content. The main batch of these are all the homepage URL with an 'attachment' part at the end, such as: www.domain.com/?attachment_id=4176. As far as I can tell it's some sort of slideshow, just showing a different image in the main frame of each page, with no other content. Each one does have a unique meta title & H1, though. What's the best thing to do here? Not a problem, so leave as is? Use the parameter handling tool in GWT? Canonicalise, referencing the homepage? Or another solution? Many thanks, Dan

    | Dan-Lawrence
    0

  • Hi, I am adding a bunch of similar category stickers, and I am not aiming for great SEO on these, since there will be hundreds of them coming; I just want to include the relevant keywords that people might use in Google image search to take them to our site. They are all related to JDM (Japanese Domestic Motors), so I decided to include JDM at the end of all the SEO titles. I am writing totally different short descriptions for all of these stickers, and the related products change as well. I just want to achieve something like Amazon or eBay listings do - not perfect SEO, since I cannot spend too much time optimizing each sticker, but I don't want to NOINDEX, FOLLOW them either - hence the different related products for all items and also the unique short descriptions. If you check one of the pages - http://www.redrockdecals.com/rising-sun-wakaba-leaf-sticker-red-black-jdm - do you think I'm on the safe side so I don't hurt my overall SEO? Thanks!!

    | speedbird1229
    0

  • Hi everyone, when doing 301 redirects for a large site, if a page has 0 inbound links, would you still redirect it or just leave it? I'm just curious about the best practice for this. Thanks in advance

    | TheZenAgency
    0

  • I'm working on a project where a site has gone through a rebrand and is therefore also moving to a new domain name. Some pages have been merged on the new site so it's not a lift and shift job and so I'm writing up a redirect plan. Their IT dept have asked if we want redirects done by DNS redirect or IIS redirect. Which one will allow us to have redirects on a page level and not a domain level? I think IIS may be the right route but would love your thoughts on this please.

    | Marketing_Today
    1

  • Our county's website has a news blog, and they want to do an article about an award we won. We're definitely going to do it, and we're happy about the link. However, all the other news articles they have only have a PA of 1. The DA is 82, and the link is completely white hat. It's a government site in our locale; however, with such a terrible PA, I don't think the link is really all that great from an SEO standpoint. Am I right or wrong (or is it some dreadful murky grey area, like everything else in this industry, which I'm thankful to be a part of 🙂)? Thanks so much for any insights! Ruben

    | KempRugeLawGroup
    0

  • I am currently working on a small site with approx 50 web pages.  In the crawl error section in WMT Google has highlighted over 10,000 page not found errors for pages that have nothing to do with my site.  Anyone come across this before?

    | Pete4
    0

  • I am making major changes to my website. Some of my old URL pages I don't want indexed or submitted in the sitemap; some other old pages I want to keep; and there are new pages too. Can anyone give me hints on what I should do? Also, I have thousands of pages on my website and I don't want to submit them all; I want to submit only the best pages to Google in the sitemap. That's why I want to resubmit new sitemaps.

    | Jamalon
    0

  • Dear All,
    We have a website where we show some products. It often happens that we remove a product and add it again later. On our site, we have a product list page and a detail page. So if any product gets deleted and a client hits the detail page at the deleted product's URL, we return a 404. When that product becomes available again, our server will return a 200. I have two questions: 1. Is this the right way to deal with deleted products? 2. After deploying this, we observed that our keyword rankings are going down; could this really be the cause? Thanks,
    Om

    | omverma
    0

  • Dear All,
    We have some websites. How can we check if a site got affected by Penguin / Panda? We have observed a few things over the last few days, like impressions going down and keyword rankings going down too. Any tools, or any steps to detect it, would help us.
    Thanks,
    Om

    | omverma
    1

  • If I want to override a Disallow directive in robots.txt with an Allow command, do I put the Allow command before or after the Disallow command? Example:
    Allow: /models/ford///page*
    Disallow: /models////page

    | irvingw
    0
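For what it's worth, Google documents that order does not decide precedence in robots.txt: the most specific (longest) matching rule wins, so the Allow can sit before or after the Disallow. A sketch with simplified, hypothetical paths:

```
User-agent: *
# The longer, more specific Allow wins for anything under /models/ford/,
# regardless of where it appears relative to the Disallow.
Disallow: /models/
Allow: /models/ford/
```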

  • My link metrics have decreased for several campaigns, with link losses ranging from 500-1500. Why is that? Is there a new update or something that would cause this issue? I find it weird that three separate campaigns would decrease in external links when we have not done anything to remove links on our end.

    | rap79
    1

  • A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. Problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content. Many of the product pages lack a description, and some are variants of each other (color, size, etc.). So a solution could be to write unique descriptions and use rel=canonical to concentrate the color/size variations on one product page. There is, however, no capacity to do this on short notice. So now I'm wondering if the following would be effective: exclude all product pages via noindex / robots.txt, in the same way as you can with search pages. The only pages left for indexation are the homepage and 200-300 category pages. We then write unique content and work on the ranking of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category. My worry is the loss of ranking for the product pages, although their ranking is minimal currently. My second worry is the high number of links on category pages that lead to product pages excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?

    | oeroek
    0

  • Hi there, while doing some research on the indexation status of a client I ran into something unexpected. I have my hypothesis on what might be happening, but would like a second opinion on this. The query 'site:example.org inurl:index.php' returns about 18,000 results. However, when I hover my mouse over these results, no index.php shows up in the URL. So, Google seems to think these (then duplicate content) URLs still exist, but a 301 has changed the actual target URL? A similar thing happens for inurl:page. In fact, all the 'index.php' and 'page' parameters were removed over a year back, so there shouldn't be any of those left in the index by now. The dates next to the search results are 2005, 2008, etc. (i.e. far before 2013); these dates accurately reflect the times these forum topics were created. Long story short: are these ~30,000 'phantom URLs' in the index, out of a total of ~100,000 indexed pages, hurting the search rankings in some way? What do you suggest to get them out? Submitting a 100% coverage sitemap (just a few days back) doesn't seem to have any effect on these phantom results (yet).

    | Theo-NL
    0

  • So I have some chums setting up their own digital outfit. When discussing SEO, naturally domain names came into play. They were looking at 'Gray Digital'. Initially they jumped to the conclusion that they ought to buy 'graydigital.com' and the .co.uk variant. But a best practice post - http://moz.com/learn/seo/domain - leads me to think that 'gray-digital.com' may be the better option as far as readability is concerned. Then of course you start thinking, 'should we just make it gray-digital-marketing.com instead?' From your experience, what would you ladies and gents do? Kind regards, John. (EDIT: Having read more around the subject, I realise more than one dash is a bad idea. So instead, would you bother with the singular hyphen?)

    | Muhammad-Isap
    0

  • Hey guys, my website is http://www.oxfordmeetsfifth.com. According to SEOcentro, my website's title should appear to Google as "Fashion Tips for Women | Oxford Meets Fifth". I have used the Yoast plugin and force-rewrote titles to ensure that is the homepage meta title, and it also appears correctly in the browser. Could anyone advise why this is the case? Thanks in advance!

    | OxfordMeetsFifth
    0

  • Hi, I have editors / copyreaders who write Twitter headlines & meta descriptions in Excel cells. Do you know a way, in Google Docs or Excel, to highlight the text of a cell in red if there are more than 140 characters (or another number to be defined) in the cell, so the editor can see if the tweet / description is too long?

    | Autoschieber
    0
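In both Google Sheets and Excel this can be done with conditional formatting driven by a custom formula; a sketch assuming the text sits in column A starting at A1:

```
=LEN(A1)>140
```

Select the column, add a rule of type "Custom formula is" (Google Sheets) or "Use a formula to determine which cells to format" (Excel), paste the formula, and pick a red fill; the relative A1 reference adjusts per row automatically.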

  • I have a site that has gotten quite a few Google +1's. I am currently migrating the site to https, and it seems I will lose all the +1's? Per the documentation, it seems I can set the URL to the normal http:// version, but that would then allow people to continue +1'ing the old URL. Is there a way around this? Thanks!

    | plahpoy
    0

  • Ladies and gents! We're building a new site. We have a list of 28 professions, and we're wondering whether or not to include them all on one long and detailed page, or to keep them on their own separate pages. Thinking about the flow of domain authority - I could see 28 pages diluting it quite heavily - but at the same time, I think having the separate pages would be better for the user. What do you think?

    | Muhammad-Isap
    1

  • My product is laptops and, of course, I'd like to rank high for the keyword "laptop". Do any of you know if the search engines tend to rank a front page higher than a landing page? E.g. www.brand.com vs. www.brand.com/laptop

    | Debitoor
    0
