
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, we submitted an image sitemap in Search Console/Webmaster Tools (http://www.sillasdepaseo.es/sillasdepaseo/sitemap-images.xml); it contains only the indexed products and all images on those pages. We also claimed the CDN in Search Console (http://media.sillasdepaseo.es/). It has been 2 weeks now; Google indexes the pages, but not the images. What can we do? (A sample sitemap entry is sketched below.) Thanks in advance. Dieter Lang

    | Storesco
    0
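    A minimal sketch of one image sitemap entry, using hypothetical product and image paths on the domains mentioned above. The main thing worth double-checking is that each <image:loc> points at the CDN host that was claimed in Search Console; beyond that, image indexing often simply lags page indexing by more than a couple of weeks:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
        <url>
          <!-- hypothetical product URL on the main domain -->
          <loc>http://www.sillasdepaseo.es/sillasdepaseo/producto-ejemplo</loc>
          <image:image>
            <!-- image served from the CDN host verified in Search Console -->
            <image:loc>http://media.sillasdepaseo.es/images/producto-ejemplo.jpg</image:loc>
            <image:title>Producto ejemplo</image:title>
          </image:image>
        </url>
      </urlset>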

  • Hi everyone. It was hard to find any actual evidence that some of the attributes that can be declared in a sitemap have a real impact.
    In particular, I'm interested in these two: <changefreq></changefreq> and <image:title></image:title>. I've used them in a few cases just to check their effect and couldn't see any. (A sample entry using both is sketched below.)
    Do you have any experience with these? Or any other attribute that might be helpful in order to create a more accurate and effective sitemap? Also, this could be a great topic for a new Moz Blog post; the one about sitemaps is 8 years old.

    | Gaston Riera
    0
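    For reference, a minimal sitemap fragment (hypothetical URLs) showing where both attributes sit. Google has indicated it largely ignores <changefreq> (and <priority>), while <image:title> is optional metadata for image search, which would be consistent with seeing no measurable effect:

      <!-- the <urlset> needs xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" declared -->
      <url>
        <loc>https://www.example.com/some-page/</loc>
        <changefreq>weekly</changefreq>
        <image:image>
          <image:loc>https://www.example.com/images/some-image.jpg</image:loc>
          <image:title>Descriptive image title</image:title>
        </image:image>
      </url>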

  • Hello Moz World! I have a client that has an events page that they update every week. They conduct weekly demos with current and potential customers for their software. They post dates, times and topics for each demo. I'd like to enable event rich snippets for their website (see attached image for an example), but I am unsure exactly how to do that. Do I just need to set up Event schema tags (a sample is sketched below)? Does it need to be updated manually every week? Is there a software solution? Thanks ahead of time for all the great responses! Cheers, Will H.

    | MarketingChimp10
    0
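    A minimal JSON-LD sketch of schema.org Event markup, with entirely hypothetical names, dates and URLs. In practice each weekly demo gets its own Event entry, and most sites generate the markup automatically from their events CMS or calendar plugin rather than editing it by hand every week:

      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "Event",
        "name": "Weekly Software Demo",
        "startDate": "2016-07-06T14:00",
        "endDate": "2016-07-06T15:00",
        "url": "https://www.example.com/events/weekly-demo/",
        "location": {
          "@type": "Place",
          "name": "Online webinar",
          "address": "Hosted online"
        },
        "description": "Live walkthrough of the software for current and prospective customers."
      }
      </script>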

  • Hi guys, I know this question has been asked a lot, but I wanted to double-check this since I just read a comment by Gianluca Fiorelli (https://moz.com/community/q/can-we-publish-duplicate-content-on-multi-regional-website-blogs) about this topic which made me doubt my research. The case: a Dutch website (.nl) wants a .be version for conversion reasons. They want to duplicate the Dutch website since Dutch is spoken in large parts of both countries. They are willing to implement the following changes: hreflang tags (sketched below); possibly a local phone number; possibly a local translation of the menu; a language meta tag (for Bing). Optionally they are willing to take the following steps: cross-linking every page through a language flag or similar navigation in the header; investing in gaining local .be backlinks; changing the server location for both websites so they match their country (not necessary in my opinion, since the ccTLD should make this irrelevant). The content on the websites will be at least 95% duplicated. They would like to rank with the .be in Belgium and with the .nl in the Netherlands. Are these steps enough to make sure the .be gets shown for queries from Belgium and the .nl for queries from the Netherlands? Or would this cause a duplicate content issue resulting in one version being filtered out? If that's the case we should use the canonical tag, and then we can't rank the .be version of the website. Note: this company is looking for a quick conversion rate win. They won't invest in rewriting every page and/or blog post. The less effort they have to put into this the better (I know that's cursing when talking about SEO). Gaining local backlinks would bring a lot of costs with it, for example. I would love to hear from you guys. Best regards, Bob van Biezen

    | Bob_van_Biezen
    0
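    A minimal hreflang sketch for this setup, using hypothetical domains; the same set of annotations would need to appear on (or be declared via sitemap for) both country versions of each page:

      <link rel="alternate" hreflang="nl-nl" href="https://www.voorbeeld.nl/pagina/" />
      <link rel="alternate" hreflang="nl-be" href="https://www.voorbeeld.be/pagina/" />
      <!-- optional fallback for users whose language/region matches neither annotation -->
      <link rel="alternate" hreflang="x-default" href="https://www.voorbeeld.nl/pagina/" />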

  • I'm having a problem: the wrong page is ranking in Google, for search phrases that are not on that page. Explained: on a website I developed, I have four products. For example's sake, we'll say these four products are: Sneakers (search phrase: sneakers), Boots (search phrase: boots), Sandals (search phrase: sandals), High heels (search phrase: high heels). Error: what is going wrong is that when the search phrase "high heels" is searched in Google, my "Sneakers" page is returned instead (and ranking very well, like #2). The page that SHOULD be returned is the "High heels" page (not the Sneakers page – this is the wrong search phrase, and it's not even on that product page – not in the URL, not in H1 tags, not in the title, not in the page text – nowhere, except for in the top navigation link). Clue #1: this same error is ALSO happening for my other search phrases, in exactly the same manner, i.e. the search phrase "sandals" is ALSO returning my "Sneakers" page in Google. Clue #2: this error is NOT happening with Bing (the proper pages are correctly returned for the proper search phrases in Bing). Note 1: Moz has given all my product pages an "A" grade for optimization. Note 2: This is a WordPress website. Note 3: I recently migrated (3 months ago) most of this new website's page content (but not the "Sneakers" page – this page is new) from an old, existing website (not mine), which had been ranking OK for these search phrases. Note 4: 301 redirects were used for all of the OLD website pages to the new website. I have tried everything I can think of to fix this, over a period of more than 30 days. Nothing has worked. I think the "clues" (it ranks properly in Bing) are useful, but I need help. Thoughts?

    | MG_Lomb_SEO
    0

  • Does anyone know if it is possible/recommended/not recommended to use hreflang in image or video XML sitemaps? This had not crossed my mind until recently, but a client asked me this question and I couldn't find any information on the topic. (A sketch of how the markup could be combined is below.)

    | ChrisKing
    0
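    For what it's worth, hreflang annotations in a sitemap describe the page URL in <loc>, not the media files themselves, so they can sit in the same <url> entry as image (or video) extensions. A hedged sketch with hypothetical URLs:

      <!-- the <urlset> needs both xmlns:xhtml="http://www.w3.org/1999/xhtml"
           and xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" declared -->
      <url>
        <loc>https://www.example.com/en/page/</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
        <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page/" />
        <image:image>
          <image:loc>https://www.example.com/images/photo.jpg</image:loc>
        </image:image>
      </url>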

  • Hey guys, we all know that relevancy largely trumps DA nowadays. What I am wondering is if there is a DA 'level' at which relevancy doesn't really matter – you probably still want a backlink from that site... For example, sites with a DA of 100 we probably want backlinks from. So where do you draw the line? What I mean is: for a high-DA 'non-relevant' site, what DA is 'acceptable', where you start to disregard relevancy? I'm thinking something like 70 and above, but would like some other thoughts... Obviously you would still be building relevant links too, developing content to do so and all that good stuff. I am just wondering what DA I should focus on for building non-relevant links ALONGSIDE relevant links 🙂 Thanks

    | GTAMP
    0

  • I hope this one-liner question explains what I'm looking for.

    | SEOEnthusiast
    0

  • Hi all, Can I use only an HTML sitemap, or should I use both versions?
    How much would I lose if I had only an HTML sitemap, without an XML sitemap? Thank you.

    | Tormar
    0

  • I'm not quite sure what I'm seeing here. It's a site that uses Angular JS (version 1) and the crawl is showing infinite 302 redirects, but the redirects are all to the same URL? Here's an example: https://www.razoo.com/us/story/Armco-Park-Foundation Has anyone seen this before? What causes it and how do I counsel the client on how to fix it?

    | KatherineWatierOng
    0

  • Hi amazing Moz community 🙂 Couldn't find this question anywhere, and knew this was the place to ask! We are helping a client redirect an m-dot website to a responsive design website. We want to retain our mobile rankings for keywords. Three questions: (1) We should use 301 redirects from the m-dot website to the new website, correct (not 302s)? (2) How long does it take for Google to understand that we have launched a responsive website? (3) Can we remove the 301 redirects after a few days (if the m-dot website interferes with/breaks the new responsive website)? We have a verified Google Search Console account for the m-dot website, along with a mobile sitemap that has been submitted and verified. What should we do with this m-dot GSC account? Just delete it? Or keep it and upload the NEW XML sitemap with the new WWW links (because the website is responsive)? THANK YOU!

    | accpar
    0

  • Hi team, Our new e-commerce website has launched and I've noticed both http and https protocols are being indexed. www.mountainjade.co.nz Our old website was http with only the necessary pages running https (cart, checkout etc). No https pages were indexed and you couldn't access an https page if you manually typed it into the browser. We outrank our competition by a mile, so I'm treading carefully here and don't want to undo the progress we made on the old site, so I have a few questions: 1. How exactly do we remove one protocol from the index? (A canonical-tag sketch is below.) We are running on Drupal. We tried a hard redirect from https to http and excluded the relevant pages (cart, login etc from the redirect), but found that you could still access https pages if you were in the cart (https) and then pressed the back button on the browser, for example. At that point you could browse the entire site on https. 2. Is the safer option to emulate what we had in place on the old website, e.g. http with only the necessary pages being https, rather than making the switch to sitewide https? I've been struggling with this one, so any help would be much appreciated. Jake S

    | Jacobsheehan
    0
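    One common piece of the fix (a sketch, not Drupal-specific advice) is a self-referencing canonical emitted on every page, always using whichever protocol you decide to keep, so that any pages that remain reachable on the other protocol point back to the preferred version:

      <!-- path is hypothetical; switch to https:// here if you later move to sitewide HTTPS -->
      <link rel="canonical" href="http://www.mountainjade.co.nz/example-product/" />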

  • Hi all, So, I have studied multilingual and multiregional websites. As soon as possible, we will expand the website's languages to English and Spanish. The URLs will be like this: http://example.com/pt-br
    http://example.com/en-us
    http://example.com/es-ar Thereby, the hreflang tags will be set up accordingly (a sample set is sketched below). But my doubt is: for /es-ar/, will indexing only target Spanish speakers in Argentina? What about the other countries that speak the same language, like Spain, Mexico, etc.? I don't know if it will be possible to develop a Spanish version especially for each region. Should I build a multiregional website or only a multilingual one? How does Google see this case? Thanks for any advice!!

    | mobic
    1
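    A hedged sketch of hreflang annotations for the URL structure described above (paths based on the example.com URLs in the question). The regional code es-ar targets Spanish speakers in Argentina, while an additional generic es annotation can act as the fallback for Spanish speakers elsewhere, so a separate version per Spanish-speaking country isn't required:

      <link rel="alternate" hreflang="pt-br" href="http://example.com/pt-br/" />
      <link rel="alternate" hreflang="en-us" href="http://example.com/en-us/" />
      <link rel="alternate" hreflang="es-ar" href="http://example.com/es-ar/" />
      <!-- generic Spanish fallback for Spain, Mexico, etc. -->
      <link rel="alternate" hreflang="es" href="http://example.com/es-ar/" />
      <link rel="alternate" hreflang="x-default" href="http://example.com/en-us/" />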

  • Thought I'd ask this question to confirm what I already think. I'm curious: if we're publishing something in two languages, and both are verified in the Publisher Center, would the group recommend publishing two separate Google News sitemaps (one per language) or a single sitemap covering both? (A sample entry is sketched below.)

    | mattdinbrooklyn
    0
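    Either way, a Google News sitemap declares the language per URL via <news:language>, so a single sitemap can in principle carry both languages; a hedged sketch with hypothetical URLs:

      <!-- the <urlset> needs xmlns:news="http://www.google.com/schemas/sitemap-news/0.9" declared -->
      <url>
        <loc>https://www.example.com/en/article-name/</loc>
        <news:news>
          <news:publication>
            <news:name>Example Publication</news:name>
            <news:language>en</news:language>
          </news:publication>
          <news:publication_date>2016-06-01</news:publication_date>
          <news:title>Example article title</news:title>
        </news:news>
      </url>
      <!-- an entry for the other-language edition would use its own <news:language> code -->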

  • Hi All, An SEO and Google guidelines question. We've recently purchased several local businesses that have websites. Legally, we've put up a disclaimer saying we've purchased those businesses; the question is whether we should link from those sites to our main site. Will this bring a manual action from Google? It's legitimate that we'd like visitors from those websites to come to our main site, because those businesses are no longer named the way they were. So, is it OK to link from these sites to ours? Will this violate Google's guidelines regarding backlinking? Should we even link, and if so, should we add rel="nofollow"? Thanks!

    | OrendaLtd
    2

  • I thought it was generally said that Google will favour 1 page per domain for a particular SERP, but I have seen examples where that is not the case (i.e. Same domain is ranking 2 different pages on the 1st page of the SERPs...) Are there any "tricks" to taking up 2 first page SERP positions, or am I mistaken that this doesn't always happen?

    | Ullamalm
    0

  • Scenario: So imagine if LinkedIn turned off their main navigation/header if you landed on your personal profile via a search engine or via an external link. But if you were on LinkedIn when you found it, the navigation remains the same.

    | mysitesrock
    0

  • I am working with a company that has a bi-monthly print magazine that has several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc - will be published online. Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities). My concern is whether, once we prepare to go live, we should ensure that ALL historical content is published at once, and if so, whether back-dates should be applied to each content piece (even if dating isn't relevant), or whether we should have a strategy in place in terms of creating a publishing schedule and releasing content over time - albeit content that is older but isn't necessarily time-sensitive (e.g. a drink recipe). Going forward, all newly-created content will be published around the print issue release. Are there pitfalls I should avoid in terms of pushing out so much back content at once?

    | andrewkissel
    0

  • I'm working on an event-related site where every blog post starts with an introductory header about the event and then a Call To Action at the end which gives info about the Registration Deadline. I'm wondering if there is something we can and should do to avoid duplicative content penalties. Should these go in a widget or is there some way to No Index, No Follow a section of text? Thanks!

    | Spiral_Marketing
    0

  • Good morning, This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below. Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP. Expected cause: We did not add a noindex/nofollow robots meta tag to the dev pages (a sketch is below). We also did not disallow the subdomain in robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall. Solution: Add the robots meta tag, update robots.txt on the subdomain and disallow crawl/index. Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain to get us back to where we were before Saturday. If the removal tool in WMT just removes the link completely, then is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again? Thank you for your time, Chase

    | chiplab
    0
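    A minimal sketch of the robots meta tag for the dev subdomain; noindex is the directive that actually removes pages from the index, while nofollow only concerns links. One caveat worth hedging: if robots.txt disallows crawling of dev.chiplab.com at the same time, Googlebot may never re-fetch the pages to see the noindex, so already-indexed dev URLs can linger longer:

      <!-- served on every page of dev.chiplab.com -->
      <meta name="robots" content="noindex, nofollow">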

  • I saw that big sites often don't put a title attribute on the product links on their category pages. Does the title attribute have any advantage on internal links? (A small markup example is below.)

    | Tormar
    0
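    For clarity, this is about the title attribute on an anchor, not the page's <title> tag; a tiny hypothetical example:

      <a href="/products/example-product/" title="Example product – short description">Example product</a>

    Google has indicated it gives little or no ranking weight to the title attribute, so descriptive anchor text is generally the part worth optimising.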

  • Our site (www.nyc-officespace-leader.com) markets commercial real estate for lease in New York City. Is there any potential negative impact, in terms of ranking and traffic, from using our blog in an unconventional manner? I am considering publishing a weekly post describing the latest commercial listings for lease. The post would be formatted to resemble classified advertising appearing in such newspapers as The New York Times. The ads are concise and appealing. Property listings drive a high click-through rate, so I believe blog posts based on property listings and formatted like old newspaper ads might really improve visitor engagement. Each ad could have a link to a corresponding listing page. Would using the blog in this manner every week have a detrimental effect, or could it prove beneficial? Thoughts?

    | Kingalan1
    0

  • Hi y'all, I'm looking for a company or independent who can transition our website from http to https. I want to make sure they know what they're doing with a WordPress website. More importantly, I want to make sure they don't lose any SEO equity from external sources, and that nothing gets broken internally. Anyone have any good recommendations? You can reply back or DM me. Best, Shawn

    | Shawn124
    0

  • Good morning, I am just looking for a little bit of advice. I ran a crawl report on our website www.swiftcomm.co.uk. I have resolved most of the issues myself; however, I have two questions. Screenshot image: http://imgur.com/VlFEiZ2 Highlighted blue: we have two homepages, www.swiftcomm.co.uk and www.swiftcomm.co.uk/, both set with a rel-canonical target of www.swiftcomm.co.uk/. Will this cause me any SEO issues and/or other potential issues? If this may cause an issue, how would I go about resolving it? Highlighted yellow: our contact and referral-form pages are showing as duplicate title and meta description. Both of these pages have separate titles and meta descriptions, which Google does seem to be detecting – if I search for the page in Google it returns the correct title and meta description. The only common denominator behind these pages is that both have PHP pages behind them for the contact form. Do you think that the Moz crawl may be detecting the PHP page over the HTML? Could this cause any issues when search engines crawl the site? Kind regards, Jonathan Mack

    | JMack986
    0

  • I am looking for input on best practices for the following scenario. Scenario: I have a basic product A (e.g. Yamaha Keyboard Blast). There are 3 SKUs of product A that deserve their own page content (e.g. Yamaha Keyboard Blast 350, Yamaha Keyboard Blast 450, Yamaha Keyboard Blast 550). Objective: I want to consolidate the authority of potential links to the 3 SKU pages into one destination/URL. Possible solutions I can think of: query parameters (e.g. /yamaha-keyboard-blast?SKU=550) and telling Google to ignore the SKU parameter when indexing; a canonical tag (set the canonical tag of the SKU pages all to one destination URL – sketched below); a hash fragment (e.g. /yamaha-keyboard-blast#SKU=550), loading SKU-dependent content through JavaScript so Google only sees the URL without the hash. Am I missing any solutions? Which solution makes the most sense and will allow me to consolidate authority? Thank you for your input.

    | french_soc
    0
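    A minimal sketch of the canonical option, with hypothetical URLs; each SKU page keeps its own content but declares the parent product URL as canonical, the common way to consolidate link equity when the variant pages are close to duplicates:

      <!-- placed on /yamaha-keyboard-blast-350, -450 and -550 (hypothetical SKU URLs) -->
      <link rel="canonical" href="https://www.example.com/yamaha-keyboard-blast" />

    The trade-off is that canonicalised SKU pages generally stop ranking on their own, so this only fits if the parent page is meant to be the sole search landing page.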

  • Hey there Mozzers, I have purchased a very amazing Social Media Related Plugin. I already have a business website about digital marketing which pretty much falls in the same category. I am thinking of transferring that plugin into a subfolder of my own website. Is there anything I should keep in mind when I do that?

    | AngelosS
    1

  • Hi everyone, I'm hoping a few of you can help me out... We're an online-only retailer and we're currently looking at rebranding.
    This is for commercial reasons: our current name is difficult for customers to spell; it's not wholly representative of what we now offer; and we want to push offline and social marketing to help increase our DA. In a nutshell, our current name implies 'cheap' and we're moving more upmarket.
    Our DA is only 10, and a rebrand will make our brand more marketable.
    A stronger brand and DA will help us climb up the rankings quickly – last year we ranked No. 1 for a relatively competitive term before dropping a few places. In terms of current traffic: 30% is via SEO (we have a low DA but rank OK for certain phrases) and 70% is via AdWords. We had our website redesigned last year and it performs well.
    The idea is to have a new brand logo and colours and move to a new domain.
    We will keep all our existing products and content. Please could anyone let me know the implications of this move?
    What are the potential pitfalls, and what will we need to do to alert Google?
    I have read about 301 redirects – would these be required? As always, any help is very much appreciated. Many thanks, Abs

    | piazza
    0

  • Hi folks! We had a question come in from a client who needs assistance with their robots.txt file. Metadata for their homepage and select other pages isn't appearing in SERPs. Instead they get the usual message "A description for this result is not available because of this site's robots.txt – learn more". At first glance, we're not seeing the homepage or these other pages as being blocked by their robots.txt file: http://www.t2tea.com/robots.txt. Does anyone see what we can't? Any thoughts are massively appreciated! P.S. They used wildcards to ensure the rules were applied for all locale subdirectories, e.g. /en/au/, /en/us/, etc.

    | SearchDeploy
    0

  • We rank really well for a brand in Bing (#2 behind manufacturer, and it's a competitive name) but are in about 15th place in Google. Any suggestions on what could be hurting us in Google are welcome!

    | absoauto
    1

  • Hi all, I've read a lot about 301s vs 404s and 410s, but this case is pretty unique so I decided to get some feedback from you. Both websites are travel related, but we had one destination as a subdirectory of the other one (two neighboring countries, where more than 90% of business was related to the 'main' destination and the rest to the 'satellite'). This was obviously bad practice and we decided to move the satellite destination to its own domain. Everything was done 2 years ago and we opted for 301s to the new domain, as we had some good links pointing to satellite content. (All of the moved content is destination-specific and still relevant.) A few weeks back we figured out that Google still shows our subdirectory when doing a specific 'site:' search, and looking further into it, we realized we still get traffic for the satellite destination through the main website via links acquired before the move. Not a lot of hits, but they still sporadically occur. A decision was made (rather hastily) to 410 the pages and see if that would make the satellite subdirectory pages stop showing in Google searches. So 3 weeks in, 410 errors are climbing in GWMT, but the satellite subdirectory still shows in Google searches. One part of the team is pushing to put the 301s back in place. The other part of the team is concerned with the 'health' of the main website, as those pages are not relevant for it, and wants them gone. What would you do?

    | halloranc
    0

  • In the last year traffic to our site has dropped by half and ranking has dropped significantly. Very little to no content has been added in that time. We would now like to improve ranking by adding new content. Two domains effectively exist for the site. The existing domain is www.nyc-officespace-leader.com, but www.metro-manhattan.com redirects to www.nyc-officespace-leader.com. Our company is Metro Manhattan Office Space, Inc. We registered www.metro-manhattan.com and created the redirect to www.nyc-officespace-leader.com in 2012; www.nyc-officespace-leader.com was registered in 2006. Many links to the site show www.metro-manhattan.com, and I believe this may be a source of confusion for Google. Would it be best to make the domain consistent at this time by redirecting once and for all, and to do so before adding new content? If this is done correctly, can we avoid taking a hit on ranking? Note: www.nyc-officespace-leader.com is the old domain.
    www.metro-manhattan.com is the new domain, but it has existed since 2012 and has been redirecting to the old domain since then.
    The company name is Metro Manhattan Office Space (similar in branding to the new domain). Am I correct in assuming that having the 2 domains may be causing issues with Google involving domain authority? Should we change the domain before adding content, or add content first?

    | Kingalan1
    0

  • Intermittent DNS errors are showing up in GSC for our fashion portal www.AJIO.com. Our IP team doesn't find any issues at our end. Every time I write to them, they come back saying 'DNS is resolving fine in all servers'. How do we resolve this? Please help.

    | AJIOreliance
    0

  • Hi, we have several ecommerce sites. We want to do an image sitemap, as we have lots of attractive images. The question is, can you put images for non-indexed products there as well, or does that conflict with the normal sitemap (the images would be indexed, the products not)? Thanks in advance. Dieter Lang

    | Storesco
    0

  • Please see this link: https://www.dropbox.com/s/thgy57zmmwzodcp/Screenshot 2016-05-31 13.25.23.png?dl=0 – you can see my domain is getting tons of Chinese spam. I have 410'd the page but it still keeps coming.

    | mattguitar99
    0

  • When we moved our WordPress site from one host to another, we had this weird redirect insertion happen. We 410'd the page cgi-sys/movingpage.cgi, but it hit us hard in the anchors. If you go to Ahrefs, our anchor text is literally all Asian. Anybody have any suggestions? Thank goodness it looks like it finally stopped. I am looking for creative ways to repopulate our back end with the right stuff. Any thoughts would be great! Here's an example: allartalocaltours.com/tumi-tote-401.html ↳customerbloom.com/cgi-sys/movingpage.cgi ↳www.customerbloom.com/cgi-sys/movingpage.cgi ↳lockwww.customerbloom.com/cgi-sys/movingpage.cgi

    | mattguitar99
    0

  • Hi there! I'm getting LOTS of "duplicate content" pages flagged, but the thing is they are different pages. My website essentially is a niche video hosting site with embedded videos from YouTube. I'm working on adding personal descriptions to each video but keeping the same video title (should I re-word it from the original also?). Any help?

    | sarevme
    0

  • At the end of April we changed the url structure of most of our pages and 301 redirected the old pages to the new ones. The xml sitemaps were also updated at that point to reflect the new url structure. Since then Google has not indexed the new urls from our xml sitemaps and I am unsure of why. We are at 4 weeks since the change, so I would have thought they would have indexed the pages by now. Any ideas on what I should check to make sure pages are indexed?

    | ang
    0

  • Hi All, We manage a number of auto dealer websites which have their new & used inventory listed on them. There's a separate page for new, used, and CPO inventory, and on most web platforms any filtered inventory subpages are canonicalized back to one of the main inventory pages. Our question is: should we install unique copy on these top-level inventory pages? We're already installing unique meta and H1s, and feel like copy could help these rank for more searches, but we have a couple of hesitations: most big retailers like Amazon, Zappos, etc. don't have copy on these types of pages, and putting the copy above the inventory would distract from shopping behavior, while installing it at the bottom of the page would hurt relevance. We'd appreciate anyone's insight or past experience here! Is it worth taking the time to write unique copy for these pages? Thanks everyone.

    | ReunionMarketing
    0

  • We recently implemented a plugin to minify our code on all of our homepages (800+ websites). The hope was that this would improve our page load speed. Unfortunately, the results are showing that page speed has slowed down across the board for all the homepages. I am very perplexed by this. Any insight into why this might have happened and what steps to take from here would be much appreciated!

    | chrisvogel
    0

  • I'm creating a new Amazon affiliate site. I've researched other successful sites. I've noticed that they are ranking for thousands of keywords, but many of these long-tail keywords are redirected back to a main page. I can see how this can reduce the overall number of content pages on the site. How are you able to rank for the keyword in the first place if the page is redirected?

    | lkomontt76
    0

  • We have our JS tag and iframe tag being used by over 100 leading websites. What would be the SEO impact if we added a follow link in the iframe? Would it have any negative impact? Vivek

    | kvivek05
    0

  • I am working on a website, https://linkedinforbusiness.net, and a ton of 999 HTTP errors have only now surfaced. I would venture, from reading the "Request denied" error in the log, that LinkedIn means to block the BLC's attempts to check those links. It might be because the site has a lot of LinkedIn links; maybe they find it suspicious that the server is sending a lot of requests for their links. How do you see this? Any fixes? What level of harm do you think it brings to the site? I have removed all similar links to LinkedIn from my own site to avoid this (https://www.hillwebcreations.com). However, this isn't so easily done for LinkedIn For Business, as her work is all about helping businesses and individuals optimize their use of LinkedIn.

    | jessential
    0

  • One of our employees took an SEO class recently. She was told that having too many 301 redirects can hurt SEO. I have never heard of 301 redirects as having a negative impact. Any thoughts?

    | Smart_Start
    0

  • Hey there Mozzers, What would be preferable to use instead of a 404 on an ecommerce website? Can I use a 301 redirect to the main category of the product? So, for example, if I have a t-shirt that is not available anymore, can I use a 301 to redirect the traffic to the clothing category?

    | AngelosS
    0

  • We implemented canonical tags (months back) on product pages to avoid duplicate content issues, but Google still picks up the URL variations and the duplicate page title errors keep increasing in Search Console. Original URL: www.example.com/first-product-name-123456 (canonical tag present). Variation 1: www.example.com/first-product--name-123456 (canonical tag present). Variation 2: www.example.com/first-product-name-sync-123456 (canonical tag present). Kindly advise the right solution to fix the issue.

    | SDdigital
    0

  • I'm struggling with a client website that's massively failing to rank. It was published in Nov/Dec last year – not optimised or ranking for anything – and it's about 20 pages. I came onboard recently, and 5-6 weeks ago we added new content, did the on-page work and finally changed from the non-www to the www version in .htaccess and WP settings (while setting www as preferred in Search Console). We then did a press release and since then have acquired about 4 partial-match contextual links on good websites (before this, it had virtually none, save for social profiles etc.). I should note that just before we added the (about 50%) new content and optimised, my developer accidentally published the dev site of the old version of the site and it got indexed. He immediately added it correctly to robots.txt, and I assumed it would therefore drop out of the index fairly quickly and we need not be concerned. Now it's about 6 weeks later, and we're still not ranking anywhere for our chosen keywords. The keywords are around "egg freezing," so only moderate competition. We're not even ranking for our brand name, which is 4 words long and pretty unique. We were ranking in the top 30 for this until yesterday, but it was the press release page on the old (non-www) URL! I was convinced we must have a duplicate content issue after realising the dev site was still indexed, so last week we went into Search Console to remove all of the dev URLs manually from the index. The next day, they were all removed, and we suddenly began ranking (~83) for "freezing your eggs," one of our keywords! This seemed unlikely to be a coincidence, but once again the positive sign was dampened by the fact that it was the non-www page that was ranking, which made me wonder why the non-www pages were still even indexed. When I do site:oursite.com, for example, both non-www and www URLs are still showing up... Can someone with more experience than me tell me whether I need to give up on this site, or what I could do to find out if I do? I feel like I may be wasting the client's money here by building links to a site that could be under a very weird penalty 😕

    | Ullamalm
    0

  • Hi, If I use a template that maybe 50 other websites use but customise it my way will I still rank or will it hurt my ranking because other websites have the same template (even though they are in a different industry). Thanks,

    | seoanalytics
    0

  • In WordPress you normally show your blog posts on: your home page, or your "posts page" (configurable in the Reading Settings). I want to do neither and have a third option instead: assign a parent category called "blog" for all posts, and show the latest posts on that category's archive page. For readers, the experience will be 100% the same as a regular "posts page". The UI, permalinks, and breadcrumbs will be 100% the same. But I have heard that the "posts page" is important for Google for indexing and understanding your blog. So is it smarter SEO-wise to use a "posts page" instead of a parent category named "blog"? What negative effects might there be if I have no "posts page" and just use the parent category "blog" instead?

    | NikolasB
    0

  • Duplicate title tags due to /?escaped_fragment= in the SERPs; it looks like this: http://www.site.com/25621/post-name/#! http://www.site.com/25621/post-name/ Does anyone know what the best fix is for this, and does it affect the site's performance? Regards, T

    | Taiger
    0

  • Problem: Our organization publishes maps for public viewing using Google Maps. We are currently getting limited value from these links. We need to separate our public and private maps for infrastructure purposes, and are weighing up the strengths and weaknesses of separating by domain or subdomain with regard to SEO and infrastructure. Current situation: maps.mycompany.com currently has a page authority of 30 and mycompany.com has a domain authority of 39. We are currently only getting links from 8 maps which are shared via social media, whereas most people embed our maps on their website using an iframe, which I believe doesn't do us any favours with SEO. We currently have approx. 3K public maps. Question: What SEO impact can you see if we move our public maps from the subdomain maps.mycompany.com to mycompanypublicmaps.com? Thanks in advance for your help, and happy to give more info if you need it!

    | eSpatial
    0


