
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I know Moz has a directory of recommended companies, and I've found that very useful. However, we're really looking for an individual (who, of course, keeps up with the latest best practices and trends in SEO) to optimize our site while we put our time into client sites. We've done Craigslist ads, but those seldom pan out. Have any of you had luck finding part-time SEOs? Where did you find them? Thanks!

    | ScottImageWorks
    0

  • The problem is that for a long time we had a website, m.imones.lt, but it was blocked with robots.txt.
    Now, after all that time, we want Google to index it. We unblocked it a week or so (8 days) ago, but Google still does not recognize it: when I type site:m.imones.lt it says the site is still blocked by robots.txt. What should the process be to make Google crawl this mobile version faster? Thanks!

    | FCRMediaLietuva
    0
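
    For reference on the situation above, a minimal robots.txt that allows full crawling might look like this (a sketch; the Sitemap line assumes a sitemap actually exists at that path on m.imones.lt):

```text
# Allow all crawlers everywhere: an empty Disallow blocks nothing
User-agent: *
Disallow:

# Optional: point crawlers at the sitemap to speed rediscovery
Sitemap: http://m.imones.lt/sitemap.xml
```

    Re-submitting the URL via Fetch as Google in Webmaster Tools can also prompt a faster recrawl; the site: operator can lag the real index state by days.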

  • I recently created a forum for my website (let's say my site is xyz.com, so my forum is forum.xyz.com) and I've been getting duplicate-content warnings for tons of my forum pages (all returning status code 200), especially for threads that have a ton of interaction. Is there a way to fix this? I want my forum to be indexed and not get penalized, but if I put a canonical tag on my main forum page, wouldn't that take away the value of the awesome interactions my users are having? Can anyone with forum experience shed some light on this? Thank you so much!

    | kentien93
    0

  • Is it when someone does an image search? Or does it also count in a regular search that has images in it? On an image search, does the picture actually have to be viewed on the screen, or can it be further down in the infinite scroll?

    | EcommerceSite
    0

  • My website www.dealwithautism.com is a 3-month-old website. It currently has 50+ quality pages that are keyword-targeted and on-page optimized (usually grade A on the Moz page grader). Over the next 12 to 15 months, I plan to add a total of 300 to 400 keyword-targeted pages to strive for topical authority. I am launching my first product (an ebook) in the next couple of months and would eventually move into a membership subscription model in the next 15 months. I want to invest in a long-term SEO strategy with a reputed and trusted SEO firm. Being just a one-person show at the moment, my budget is small (about $250 a month), but over time, as I acquire more revenue, I will increase my SEO budget accordingly. I believe that if I get traffic, my content has the guts to absorb engagement. From analytics, any page that has received organic traffic and was not bounced (fewer than 10 visits per day, though) has an average time on page of more than 12 minutes, so my content seems to be doing its bit for now. My question: Is now a good time to invest in SEO for my budget? I need a long-term and natural SEO strategy, no quick wins; I'm happy to play by the CPC model for my money pages till I see organic growth. Or should I wait 5-6 more months to let my site age a bit? By that time I should also have 150+ quality pages, so the authority should be greater.

    | DealWithAutism
    0

  • I have a wiki (wiki 1) where many of the pages are well indexed in Google. Because of a product change, I had to create a new wiki (wiki 2) for the new version of my product. Now that most of my customers are using the new version, I'd like to redirect users from wiki 1 to wiki 2. An example redirect could be from wiki1.website.com/how_to_build_kitchen to wiki2.website.com/how_to_build_kitchen. Because of a technical issue, the URL I redirect to needs to carry a parameter marker ("?"), so the example becomes wiki2.website.com/how_to_build_kitchen? Will the search engines see it as if I have two pages with the same content:
    wiki2.website.com/how_to_build_kitchen
    and
    wiki2.website.com/how_to_build_kitchen? And will the SEO juice from wiki1.website.com/how_to_build_kitchen be transferred to wiki2.website.com/how_to_build_kitchen?

    | Debitoor
    0
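
    For the duplicate-URL concern above, the usual safeguard is a self-referencing canonical served on both the clean and the "?" variant, so the two collapse into one indexed page (a sketch using the wiki2 URL from the question):

```html
<!-- Served on both .../how_to_build_kitchen and .../how_to_build_kitchen? -->
<link rel="canonical" href="http://wiki2.website.com/how_to_build_kitchen" />
```

    The 301 from the wiki1 URL is still what passes the link equity; the canonical only tidies up the parameter duplicate.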

  • Here's an example: I get a 404 error for this: http://webcache.googleusercontent.com/search?q=cache:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all But a search for qjamba restaurant coupons gives a clear result, as does this: site:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all What is going on? How can this page be indexed but not in the Google cache? I should make clear that the page is not showing up with any kind of error in Webmaster Tools, and Google has been crawling pages just fine. This particular page was fetched by Google yesterday with no problems, and was even crawled again twice today by Google. Yet, no cache.

    | friendoffood
    2

  • Do they crawl backlinks from an iframe example from a Youtube video embedded in a blog post? TIA!

    | zpm2014
    0

  • I was referred to this plugin and have found it to be the most irritating and poorly designed plugin in the world. I want to be able to set my titles without it changing my page headers as well. For instance - If I set my title to be "This is my article name | site name" it will make my H1 tag read the same. I do not want or desire this nonsense. Why would they think this is something wise? Why would I want my site name on every single H1 tag on my site? How can I fix this? I only want my title to be my title. I want my H1 tag to remain the post/page name that I define in wordpress.

    | Atomicx
    0

  • How detrimental is this in the overall SEO scheme of things? Having checked 3 of our main competitors, they too seem to have similar issues... I am trying to look at a solution but it is proving very difficult! Thanks Andy

    | TomKing
    0

  • Hi, We are selling about 1000 sticker products in our online store and would like to expand a large part of our products lineup to eBay as well. There are pretty good modules for this as I've heard. I'm just wondering if there will be duplicate content problems if I sync the products between Magento and eBay and they get uploaded to eBay with identical titles, descriptions and images? What's the workaround in this case? Thanks!

    | speedbird1229
    0

  • Hello there, Can anyone recommend how to go about finding a good seo company?

    | edward-may
    0

  • Hello Community, I would like to know if I'm doing something wrong here... I have set up keywords for my Google ranking using Yoast SEO (http://imgur.com/BCWTifV), but my Google Webmaster Tools shows this: http://imgur.com/V1texto What am I doing wrong?

    | dawgroup
    0

  • Hi I was hoping to get some thoughts and opinions on our blog. It is part of our main site (not on a subdomain) but performs very badly, pulling in very little organic traffic (only accounting for 0.6% of our organic traffic). Every page of the blog is listed in our sitemap, and using Screaming Frog I've done spot checks of several pages to see if they are indexed, which they have been. Looking at Google's text cache, all the content is visible. Pages are often well shared on social media (for example): http://www.naturalworldsafaris.com/blog/2014/10/antarctica-photography-safari-2014-updates.aspx I'm aware that we do need more links coming into the blog but I still feel that it should be performing better than it is. Any suggestions would be appreciated!

    | KateWaite
    0

  • Hello Mozzers, I'm noticing increasing numbers of clients' competitors getting physical addresses and phone numbers in multiple locations, no doubt partly for SEO purposes. These are little more than ghost presences (in hot desk style office space) and the phone numbers are simply diverted. Do such physical addresses put them at an SEO advantage (over and above those who don't have hot desk style space and location phone numbers). Or does Google weed out hot desk type office spaces where they can? Your thoughts/experience would be very welcome! Thanks in advance, Luke

    | McTaggart
    0

  • Here is my current sitemap for my site: http://www.yakangler.com/index.php?option=com_xmap&view=xml&tmpl=component&id=1 I have some questions about its current settings. I have a component called JReviews for which Xmap produces a separate link for each review entry, e.g.:
    http://www.yakangler.com/fishing-kayak-review/265-2013-hobie-mirage-adventure-island (lastmod 2014-09-03T20:46:25Z, changefreq monthly, priority 0.4)
    http://www.yakangler.com/fishing-kayak-review/266-2012-wilderness-systems-tarpon-140 (lastmod 2014-06-03T15:49:00Z, changefreq monthly, priority 0.4)
    http://www.yakangler.com/fishing-kayak-review/343-wilderness-systems-tarpon-120-ultralite (lastmod 2013-11-25T06:39:05Z, changefreq monthly, priority 0.4)
    Whereas my other articles are only linked by their content category, e.g.:
    http://www.yakangler.com/news (changefreq monthly, priority 0.4)
    http://www.yakangler.com/tournaments (changefreq monthly, priority 0.4)
    http://www.yakangler.com/kayak-events (changefreq monthly, priority 0.4)
    http://www.yakangler.com/spotlight (changefreq monthly, priority 0.4)
    Which option is better?

    | mr_w
    0
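
    For context on the question above, the flattened values (lastmod, changefreq, priority) come from standard sitemap `<url>` entries; the first review URL re-expanded would look roughly like this:

```xml
<url>
  <loc>http://www.yakangler.com/fishing-kayak-review/265-2013-hobie-mirage-adventure-island</loc>
  <lastmod>2014-09-03T20:46:25Z</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.4</priority>
</url>
```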

  • We had a Penguin unnatural-link-building penalty that was lifted last June. Since then there has been no significant recovery in rankings. That said, why do we still rank on page 1 on Bing and Yahoo but rank nowhere in Google? Any suggestions? Thanks, David

    | Archers
    0

  • Hi, After a recent disastrous dalliance with a rogue SEO company, I disavowed quite a few domains (links he had gained) for which I was receiving a penalty of about 23 places. I cleaned up the site, added meta descriptions where missing, and deleted duplicate titles and pages. This gained me another 5 places. In the meantime I have been getting a few links from wedding blogs, Adobe forums and other relevant sites, so I was expecting upward momentum. Since the high point of the bottom of page 1, I have slowly slid back down to near the bottom of page 2 for my main keywords. I've just checked my latest links in Webmaster Tools and another 4 domains have appeared (gained by the dodgy SEO): domain:erwinskee.blog.co.uk domain:grencholerz.blog.co.uk domain:valeriiees.blog.co.uk domain:gb.bizin.eu They all look bad, so I am going to disavow them, and I expect to find an improvement when I do. As I have said, I have started using the Open Site Explorer tool to check my competitors' backlinks and am getting some low-level links (I'm a wedding photographer) like forum comments, blog comments and good directories. I know there is much more than this to SEO and I plan on raising my game as time progresses. I have also gained more links from the domains I disavowed on the 8th of January, mostly from www.friendfeed.com. Will Webmaster Tools ignore any new links from previously disavowed domains? Like I have said, I know there are better ways to get links, but are these links (forum comments, blog comments and respectable directories) one way of raising my rankings? To be honest, that is all my competitors have got, other than some of the top boys having a photograph or two on another site with a link. No one has a decent article or review anywhere (which is my next stage of getting links). Thanks! David.

    | WallerD
    0

  • Hey guys, One of our sites was penalised a while ago; we used to rank on page 1 for our keywords and now we can't be found. Is there a way to recover from this? Also, I came across Rand's video about site architecture. We previously generated backlinks with spammy anchor text; what would your suggestion be to recover from that? Any ideas? Here is the site: http://free-love-psychic.com/ Any suggestions?

    | edward-may
    0

  • Hi, basically I want to find sites which mention a specific exact keyword on the page, e.g. "BMW", where that same keyword "BMW" is not contained in the title tag of the page. Is there an advanced search query to do this? I did try “BMW” Intitle:"-bmw" with no luck. I also have ScrapeBox, if there is a way to do this through that. Cheers, Mark

    | Mikey008
    0
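
    On the operator syntax in the question above: the exclusion sign goes in front of the operator, not inside its quotes. A query in the spirit of what is being attempted (hedged: Google's handling of combined operators changes over time) would be:

```text
"BMW" -intitle:bmw
```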

  • Fellow Mozzers, we need your help. We have a situation where a customer has two websites, one for each country: flowtracksurf.be → Belgium flowtracksurf.nl → Netherlands They used to have very good keyword rankings in the SERPs in BE & NL: flowtracksurf.nl had good rankings in Google.nl and flowtracksurf.be in Google.be.
    Recently there has been a change: flowtracksurf.nl is not showing up in Google.nl anymore, and it seems that all the rankings from flowtracksurf.nl have been switched to flowtracksurf.be. .BE is doing very well; .NL is suffering. The data shows us that for .NL: In the first two weeks of December 2014, we see a massive drop in traffic (GA). In those same weeks we see a drop in search queries (Webmaster Tools). We see the exact opposite for .BE (growing strongly in those weeks). When we look at the cache of flowtracksurf.nl, we see only a reference to flowtracksurf.be. Is that a hint of what was going on? On the same date that we see a massive drop in traffic on .NL, we see a peak in indexation of .BE. We see that the Moz pages crawled dropped in that same week for .NL. We're also seeing that all the traffic from Google.nl is now going to flowtracksurf.be. Some keywords we were scoring #1-2 for are: surfvakanties, surfvakantie, surfcamp mimizan, surfcamp frankrijk, surfcamp spanje, surfen frankrijk. We just can't figure out the hard evidence in the data.
    Can you help us with that?

    | Jacobe
    0

  • Hi all,
    We are running a classifieds website. Due to technical limitations, we will probably not be able to list or search expired ads, but we can still show the ad details page if you land on an expired ad from an external page (or Google search results). Our concern is: if the ad page still exists but is totally isolated from the website (i.e. not findable via the site's search and with no internal links pointing to it), will Google remove it from the index? Thanks, T

    | Tarek_Lel
    0

  • I noticed in a recent Q&A response that along with Bing and Google web master tools there was a reference to Yandex. What is the relevance of this webmaster tool set? Is there a cost associated with it?  If so is it worth it? I would love to hear what the community thinks.

    | Ron_McCabe
    0

  • A bit of a catch-22 position here that I could use some advice on, please! We look after a few car dealership sites that have daily (some three times a day) stock feeds that add and remove cars from the site, which in turn creates/removes pages for each vehicle. We all know how much search engines like sites whose content is updated regularly, but the frequency at which it happens on our sites means we are left with lots of indexed pages that are no longer there. Now, my question: should I nofollow/disallow robots on all the vehicle detail pages, meaning the list pages will still be updated daily for "new content", or allow Google to index everything and manage the errors by redirecting to relevant pages? Is there a "best practice" way to do this, or is it really personal preference?

    | ben_dpp
    0

  • I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollow links absolutely still pass anchor text value, and disavowing links is supposed to be akin to nofollowing them. I wonder because there's a potential client I'm working on an RFP for: they have tons of spammy directory links, all using keyword-rich anchor text, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into, and if I just disavow those links, I'm thinking it won't help the anchor text ratio issues. Can anyone confirm?

    | MiguelSalcido
    0

  • This company page redirects their external client links: https://www.coinbase.com/clients QUESTION: What effect does this type of redirection have on the SEO flowing to these client pages, and on their clients' websites?

    | mstpeter
    0

  • Hello All, I have an eCommerce site and have implemented rel="prev" and rel="next" for page pagination. However, we also have a View All page which shows all the products, but we currently don't have a canonical tag pointing to it, as I don't believe showing the user a page with loads of products on it is actually a good user experience, so we haven't done anything with this page. I have a sample URL from one of our categories which may help: http://goo.gl/9LPDOZ This is obviously causing me duplication issues as well. Also, the main category pages have historically been the pages which rank better, as opposed to page 2, page 3, etc. I am wondering what I should do about the View All page; has anyone else had this same issue, and how did they deal with it? Do we just get rid of the View All even though Google says it prefers you to have it? I also want to concentrate my link juice on the main category pages rather than having it diluted between all my paginated pages. Does anyone have any tips on how best to do this, and have you seen any ranking improvement from it? Any ideas greatly appreciated. Thanks, Peter

    | PeteC12
    0
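
    As a reference for the pagination setup described above, Google's documented pattern at the time was either canonicalising component pages to the View All page, or chaining them with rel="prev"/rel="next" and letting each page canonical to itself. A sketch of the second option for page 2 of a category (URLs are placeholders, not the poster's):

```html
<!-- On /category?page=2 -->
<link rel="canonical" href="http://www.example.com/category?page=2" />
<link rel="prev" href="http://www.example.com/category" />
<link rel="next" href="http://www.example.com/category?page=3" />
```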

  • Hi everyone, I have had two pages ranking on page two of Google for a while now for the same term. I have tried dedicating a page to it but as the other has a url with the search term in Google is ranking both it seems. How can I without deindexing one of the pages help better tell Google which one to rank? I imagine if it only ranked one page I would get a higher result rather than 2 weaker ones? On-site has been done and so has links to the homepage, but the innerpage still ranks also as it has the search term in its url. Would a canonical tag be worth it here? the page is however getting some traffic itself for other terms so I am reluctant to do that. Any help much appreciated.

    | tdigital
    0

  • I have seen it index one of my pages within 5 minutes of fetching, but have also read that it can take a day. I'm on day 2 and it appears that it has still not re-indexed 15 pages that I fetched. I changed the meta description in all of them and added content to nearly all of them, but none of those changes are showing when I do a site:www.site/page search. I'm trying to test changes in this manner, so it is important for me to know WHEN a fetched page has been indexed, or at least IF it has. How can I tell what is going on?

    | friendoffood
    0

  • I'm working on a very conventional-type site with a home page (why come to us), methods we use, pricing, reviews, FAQs and contact us. After reading the Moz case study at (http://www.conversion-rate-experts.com/seomoz-case-study/), I have been working on a conversion-optimised home page that consolidates much of content in all these pages. At the bottom of the home page, I then plan to add a list of blog posts "Want to read more? We have a lot of useful information on our blog. Here are the most popular articles:" with articles that explain more about the methods we use for example (content that was formerly on our methods page). Obviously this new blog will also have more interesting information (but a lot that could actually be converted into pages) This radically changes the site into just a home page full of selling points and calls-to-action and a blog. I have some questions about this strategy: How do we keep our search engine ranking for keywords such as "[our service] prices" or "[a particular method] London". We rank quite well on Google for these and it goes straight to the relevant page. Shall we keep the pages active somewhere even though the information is also on the home page? Is a blog actually necessary here (SEO wise)? The things I'm planning to write could easily be made into more pages. Am I going about this completely wrong by trying using the CRO guide? Should this sort of page be reserved for landing pages? The reason why I'm considering making a conversion-generating home page is because we only sell one service pretty much (although there are differences in how we do it on children vs. adults) and because we are quite niche so most of our traffic comes from organic sources. Thank you

    | LondonAli
    0

  • I read some months back that Google was indexing app content to display in its SERPs. Does anyone have any updates on this recently? I'd be very interested to know more about it 🙂

    | JoomGeek
    0

  • Hello everybody,
    I started a new website 22 days ago, at the beginning of this month, and I have long articles. I think this should make the site appear in search results for long-tail keywords even if they are not very relevant, but as you can see in the attached image from my Webmaster Tools, the impression count suddenly increased to 100 and then significantly decreased again, even when I cancel the "filter" option. Is this normal for a 3-week-old website? Or is there something I should check? Thanks. cLMa04l.jpg

    | mtmaster
    0

  • Looking to get a bit of clarity on redirects: We're getting ready to launch a new website with a simplified URL structure (we're consolidating pages & content) and I already know that I'll have to employ 301 redirects from the old URL structure to the new. What I'm not clear about is how specific I should be. Here's an example of my file structure:
    Old: www.website.com → New: www.website.com
    Old: www.website.com/vacations → New: www.website.com/vacations
    Old: www.website.com/vacations/costa-rica → New: www.website.com/vacations/central-america
    Old: www.website.com/vacations/costa-rica/guanacaste → New: www.website.com/vacations/central-america
    Old: www.website.com/vacations/mexico → New: www.website.com/vacations/central-america
    Old: www.website.com/vacations/mexico/cancun → New: www.website.com/vacations/central-america
    Old: www.website.com/vacations/bolivia → New: www.website.com/vacations/south-america
    Old: www.website.com/vacations/bolivia/la-paz → New: www.website.com/vacations/south-america
    Do I need to redirect each and every page, or would redirecting just the folder be enough to keep my SEO juice? Many thanks in advance for any help!

    | JSimmons17
    0
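
    One way to reason about the folder question above: a single wildcard (prefix) rule per old folder covers every child page, so per-page rules are only needed for URLs that don't follow the pattern. A minimal sketch in Python of that prefix logic (paths are taken from the question; the function only models what a wildcard redirect rule would match):

```python
# Old-folder prefix -> new destination, per the mapping in the question.
REDIRECT_PREFIXES = {
    "/vacations/costa-rica": "/vacations/central-america",
    "/vacations/mexico": "/vacations/central-america",
    "/vacations/bolivia": "/vacations/south-america",
}

def map_old_to_new(path: str) -> str:
    """Return the 301 target for an old path; the longest matching prefix wins."""
    for prefix in sorted(REDIRECT_PREFIXES, key=len, reverse=True):
        if path == prefix or path.startswith(prefix + "/"):
            return REDIRECT_PREFIXES[prefix]
    return path  # URLs that didn't move need no redirect

print(map_old_to_new("/vacations/costa-rica/guanacaste"))  # /vacations/central-america
print(map_old_to_new("/vacations"))                        # /vacations
```

    In Apache terms, each dictionary entry corresponds to one rule of the form `RedirectMatch 301 ^/vacations/costa-rica(/.*)?$ /vacations/central-america`, so eight folder rules can stand in for hundreds of per-page rules.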

  • The BingPreview crawler, which I think exists in order to take snapshots of mobile-friendly pages, crawled my pages last night for the first time. However, it is adding a trailing slash to the end of each of my dynamic pages. The result is that my program serves the wrong page, since it is not expecting a trailing slash at the end of the URLs. It was 160 pages, but I have thousands of pages it could do this to. I could try doing a mod_rewrite, but that seems like it should be unnecessary. ALL the other crawlers are crawling the proper URLs, and none of my hyperlinks have the slash on the end. I have written to Bing to tell them of the problem. Is anyone else having this issue? Any other suggestions for what to do? The user agent is: Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53 BingPreview/1.0b

    | friendoffood
    0
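
    If the trailing-slash requests can't be stopped at the source, a server-side rewrite is the standard fallback. A sketch for Apache (assumes mod_rewrite is enabled and that real directories should be left alone):

```apache
RewriteEngine On
# 301 /some/page/ back to /some/page, but don't touch actual directories
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```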

  • Hi all, I help run a website for a history-themed podcast and we just moved it to its second domain in 7 years. We've had very good SEO up until last week, and I'm wondering if I screwed up the way I redirected the domains. It's like this: Originally the site was hosted at "first.com", and it acquired inbound links. However, we then started to host the site on blogger, so we... Redirected the site to "second.blogspot.com". (Thus, 1 --> 2) It stayed here for about 7 years and got lots of traffic. Two weeks ago we moved it off of blogger and into Wordpress, so we 301 redirected everything to... third.com. (Thus, 1 --> 2 --> 3) The redirects worked, and when we Google individual posts, we are now seeing them in Google's index at the new URL. My question: What about the 1--> 2 redirect? There are still lots of links pointing to "first.com". Last week I went into my GoDaddy settings and changed the first redirect, so that first.com now points to third.com. (Thus 1 --> 3, and 2-->3) I was correct in doing that, right? The drop in Google traffic I've seen this past week makes me think that maybe I screwed something up. Should we have kept 1 --> 2 --> 3? (Again, now we have 1-->3 and 2-->3) Thanks for any insights on this! Tom

    | TomNYC
    1

  • A lot of websites, by virtue of practicality, will link to wikipedia articles to explain certain concepts.  Would it be worthwhile to reach out to those websites and ask them to change the link to a different resource if that resource is a much better alternative than the wikipedia article?  And how would you approach this? Thanks!

    | mack-ayache
    0

  • Hello, I'm hoping to find some advice on how to proceed with a site I started last year, Quotery. I've spent a ton of time and money on it, tried my best to add value to what is typically a copycat niche, and adhered to Google's SEO recommendations (I believe, anyway). Yet subpar sites rank well above mine. Now I'm wondering if I should find a buyer for the site and cut my losses. Does anyone have any suggestions on what the next steps might be in order to rank higher? Here is what we've tried/accomplished so far: We were mentioned on Netted. We built a well-designed and easy-to-navigate site that works on all devices. We've added topic descriptions and images, unique author descriptions and pictures, and exclusive picture quotes. We built a well-designed and useful WordPress plugin, and kept the backlinks nofollow (note: our main competitor also has a subpar WordPress plugin with dofollow backlinks, yet they don't get penalized for it). We've published curated blog posts, along with infographics that have been shared millions of times (#1, #2, #3). We've built up significant social media profiles on Facebook, Twitter, and elsewhere. We even tried hiring 97thFloor (highly recommended on Moz) for 4 months, although most of their efforts seemed spammy and/or very basic to me (and cost a fortune), so we decided to take SEO into our own hands afterwards. We've added sources, pictures, and relevant information for thousands of quotes (e.g. the example seen here). We started working on user profiles, and had plans for much more down the road. However, despite these efforts, sites like BrainyQuote dominate Google's rankings. So is it truly value that earns you rankings... or is it still all about gaming the system? Of course, any suggestions in my case specifically would be much appreciated.

    | JasonMOZ
    0

  • Hi guys, Not sure what the correct terminology is; I think it's called the knowledge box. Screenshot: https://lh3.googleusercontent.com/-ZRtixuPHr9c/VDdv12o9_HI/AAAAAAAAMVg/mlvjJf8ph5M/w800-h800/go-home-google-knowledge-box-youre-drunk.jpg Anyway, I'm referring to the box which sometimes appears above the organic listings and either explains the definition of something (like the case above) or provides information regarding the searcher's query. I was wondering if anyone knows how to get the knowledge box to appear when it's not being displayed for a query it should be, or how to influence what is contained in the box? Has anyone seen any good write-ups about this? Cheers, Mark

    | Mikey008
    0

  • One of our clients is in the cosmetic surgery business (bodevolve.com), and individuals most likely to purchase a cosmetic procedure only search for two things: 'before & after photos' and 'cost'. That being said, we've worked extremely hard to optimize all 500+ before-and-after photos, and to our great disappointment they still aren't being indexed. We are testing a few things, but any feedback would be greatly appreciated! All photos are in the 'attachment' sitemap: http://bodevolve.com/sitemap_index.xml I'm also testing a few squeeze pages like this one: http://bodevolve.com/tummy-tuck-before-and-after-photos/ Thanks so much, Brit

    | BritneyMuller
    0

  • Hi Mozzers, I am working for a client that hasn't been penalized but has lots of junk SEO directory links that I would like to disavow. My question is: should I try reaching out to webmasters first (if they exist) and show proof to Google, or should I just go ahead and submit the file without any outreach? Will it still work? Thanks!

    | Ideas-Money-Art
    0

  • We recently collapsed an existing site in order to relaunch it as a much smaller, much higher-quality site. In doing so, we're facing some indexation issues, in that a large number of our old URLs (301'd where appropriate) still show up in a site:domain search. Some relevant notes: We transitioned the site from SiteCore to WordPress to allow for greater flexibility. The WordPress CMS went live on 11/22 (same legacy content, but in the new CMS). The new content (and all required 301s) went live on 12/2. The site's total number of URLs is currently 173 (confirmed by Screaming Frog). As of posting this question, a site:domain search shows 6,110 results. While it's a very large manual effort, is there any reason to believe that submitting removal requests through Google Webmaster Tools would be helpful? We simply want all indexation of old pages and content to disappear, and for Google to treat the site as a new site on the same old domain.

    | d50-Media
    0

  • How do you guys go about getting sustainable links from high-authority sites? In some markets, like say SEO, it can be as easy as writing great content and "people will share it", because there are a ton of SEO websites on the internet and all of them are talking about SEO and want to share great SEO content with you. But as you know, there are markets that aren't as well developed online; where do you look for backlinks in those markets? I'm working on a project and trying to put together a good backlinking strategy. Part of it will be chasing backlinks from university websites (relevant to my market). What I'm wondering here is whether it's OK by Google to barter for links. Say you have an online store and you give the university a "student's discount" in exchange for a link (I don't know if this would be appealing enough, but is it fair game as far as Google is concerned)?

    | mack-ayache
    0

  • Hi Mozzers, I'm doing an audit on a website and detected over 60 301s of this nature: www.example.com/help 301s to www.example.com/help/. I believe these are completely useless and increase page load time. Am I right? Should I kill those 301s? Thanks

    | Ideas-Money-Art
    0

  • Hi, We have a website which lists both upcoming and past events. Currently everything is indexed by Google, with no real issues (usually it finds the most up-to-date events), and we have deprioritised the past events in the sitemap. Do I need to go one step further and noindex events which are past, or just leave it as-is? They don't really hold much value, but sometimes they have a number of incoming links and social media shares pointing to them. We want to keep the pages active for visitors; I'm just wondering about Google (there's no real link between past and future events either, so it's difficult to 'point' to a newer version of an event). We have approx 1M 'past' events and growing, so it's a big change. Also, would you keep them in the sitemap with lower priority, or just remove them? EDIT: I've just seen a Matt Cutts post from 2014 which indicates that an 'unavailable_after' meta tag might be best?

    | benseb
    0
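
    The meta tag referenced in the edit above takes an expiry timestamp after which Google drops the page from results; per Google's original announcement, it looks roughly like this (date and format follow their published example; the timestamp here is illustrative):

```html
<meta name="googlebot" content="unavailable_after: 25-Jun-2015 15:00:00 GMT" />
```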

  • Hoping someone can answer this for me, as I have spent a ton of time researching with no luck... Is there anything misleading/wrong with using multiple rel="alternate" tags on a single webpage to reference multiple alternate versions? We currently use this tag to specify a mobile-equivalent page (mobile site served on an m. domain), but would like to expand so that we can cover another domain for desktop (and possibly mobile in the future). In essence: the main domain would get the rel="alternate" tags, and the "other domain" would then use a canonical to point back to the main site. To clarify, this implementation idea is for an e-commerce site that maintains the same product line across 2 domains. One is homogeneous with furniture & home decor, which is a subset of the products on our "main" domain that includes lighting, furniture & home decor. Any feedback or guidance is greatly appreciated! Thanks!

    | LampsPlus
    0

  • Hi I have a number of highly ranked category pages. However, at times these contain no products for a few weeks or so. They are being flagged as duplicate content because they are just stub pages when they have no products, with the same "No products found" message. I don't want to risk 'noindex'ing the pages though, because as soon as they have products in, they become valuable pages again and I would hate to lose a good ranking. Should I just leave them as-is and ignore the duplicate content warnings?
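    One middle-ground option between "noindex permanently" and "ignore the warnings" is a conditional robots meta tag that only applies while the category is empty. A hypothetical template-helper sketch (the function and tag values are my own illustration, and note that re-ranking after the noindex is lifted still depends on a recrawl, so it is not instant):

    ```python
    def robots_meta(product_count: int) -> str:
        """Hypothetical template helper: noindex a category page only
        while it is empty, but keep 'follow' so its links still pass
        value; once products return, the next crawl sees a normal
        indexable page again."""
        if product_count == 0:
            return '<meta name="robots" content="noindex, follow">'
        return '<meta name="robots" content="index, follow">'
    ```

    The template would call this with the live product count when rendering each category page.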

    | benseb
    0

  • Hi, Hope you're doing well. I am sure you guys are aware that Google has updated their webmaster technical guidelines, saying that users should allow access to their CSS and JavaScript files where possible. It used to be that Google would render web pages only as text; now it claims it can read the CSS and JavaScript. According to their own terms, not allowing access to CSS files can result in sub-optimal rankings: "Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." http://googlewebmastercentral.blogspot.com/2014/10/updating-our-technical-webmaster.html We have allowed access to our CSS files, and Googlebot now sees our webpages more as a normal user would (tested in GWT). Anyhow, this is my dilemma, and I am sure a lot of other users face the same situation, like any other e-commerce company/website: we have a lot of images. Our CSS files used to sit inside our images folder, so I have allowed access to that. Here's the robots.txt --> http://www.modbargains.com/robots.txt Right now we are blocking the images folder, as it is very large and very heavy, and some of the images are very high-res. The reason we are blocking it is that we feel Googlebot might spend almost all of its time trying to crawl that "images" folder alone and not have enough time to crawl other important pages, not to mention a very heavy load on Google's servers and ours. We do have good, high-quality, original pictures, and we feel we are losing potential rankings by blocking images. I was thinking of allowing ONLY the Google image bot access to it, but I still fear Google might spend a lot of time doing that. I was wondering whether Google decides, say, to spend 10 minutes on the Google image bot and 20 minutes on the Google mobile bot, or whether it has separate "time spending" allocations for each of its bot types. I want to unblock the images folder for the Google image bot only, but at the same time I fear it might drastically hamper indexing of our important pages, as mentioned before, because of having tons of images and Google already spending enough time just crawling that folder. Any advice? Recommendations? Suggestions? Technical guidance? Plan of action? Pretty sure I answered my own question, but I need confirmation from an expert that I am right in saying: allow only the Google image bot access to my images folder. Sincerely, Shaleen Shah
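    For reference, the "image bot only" idea described above would look roughly like this in robots.txt (the /images/ path is taken from the post; the rest is a sketch). Crawlers obey the most specific matching User-agent group, so Googlebot-Image would follow its own group and ignore the generic Disallow:

    ```
    # generic crawlers: keep the heavy images folder blocked
    User-agent: *
    Disallow: /images/

    # Google's image crawler gets its own group, which permits /images/
    User-agent: Googlebot-Image
    Allow: /images/
    ```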

    | Modbargains
    1

  • We are about to modify the structure of our dynamic URLs and I wonder what the latest and greatest is in terms of SEO-friendly dynamic URLs. Our thinking so far is to do something like: www.domain.com/products/state/city/first-search-parameter+second-parameter+third-parameter+any-additional-keywords — that is, using '+' to separate search parameters and hyphens to separate words. An example might be www.homes.com/listings/ca/san-francisco/single-family-home+3-bedrooms+2-bathrooms+swimming-pool-garden-wood-exterior I'm not an SEO expert, so any help would be appreciated. Thanks
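    The scheme described above can be sketched as a small slug builder (the function name and /listings/ prefix are illustrative, not from the post). One caveat worth checking: '+' is a reserved character that decodes to a space in query strings, so it is worth verifying that your server and analytics treat it literally when it appears in the path:

    ```python
    def build_listing_url(state: str, city: str, params: list[str]) -> str:
        """Hypothetical sketch of the proposed scheme: hyphens join
        words within a value, '+' separates the values themselves."""
        def slug(s: str) -> str:
            # lowercase and join words with hyphens
            return "-".join(s.lower().split())
        return "/listings/{}/{}/{}".format(
            slug(state), slug(city), "+".join(slug(p) for p in params))
    ```

    For example, `build_listing_url("CA", "San Francisco", ["single family home", "3 bedrooms"])` yields `/listings/ca/san-francisco/single-family-home+3-bedrooms`.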

    | lln22
    0

  • I currently have my mobile site set up as an m-dot site. I have designed a new responsive/adaptive version of my desktop site that I would like to start using. When I search from Google on mobile, my website is indexed as the m-dot site. When I make the switch this will no longer be the case, as I will have only one URL for both mobile and desktop, and the m-dot URLs will no longer work. Are there any SEO consequences from making this shift?
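    The usual mitigation for the "m-dot URLs will no longer work" part is a one-to-one 301 from every m-dot URL to its responsive equivalent, so the indexed mobile URLs and any links pointing at them carry over instead of 404ing. A hedged nginx sketch (hostnames are placeholders, and this assumes the m-dot and www paths match one-to-one):

    ```nginx
    server {
        listen 80;
        server_name m.example.com;
        # 301 each m-dot path to the same path on the responsive site
        return 301 https://www.example.com$request_uri;
    }
    ```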

    | mikeylong7
    0

  • Hello Everyone - I maintain a site with over a million distinct pages of content. Each piece of content can be thought of like a node in a graph database, or an entity. While there is a bit of natural hierarchy, every entity can be related to one or more other entities. The conceptual structure of the entities is like so: Agency - A top-level business unit (~100 pages/URLs) Office - A lower-level business unit, part of an Agency (~5,000 pages/URLs) Person - Someone who works in one or more Offices (~80,000 pages/URLs) Project - A thing one or more People are managing (~750,000 pages/URLs) Vendor - A company that is working on one or more Projects (~250,000 pages/URLs) Category - A descriptive entity, defining one or more Projects (~1,000 pages/URLs) Each of these six entities has a unique URL and content. For each page/URL, there are internal links to each of the related entity pages. For example, if a user is looking at a Project page, there will be internal links to one or more Agencies, Offices, People, Vendors, and Categories. Also, a Project will have links to similar Projects. The same holds true for all other entities: People pages link to their related Agencies, Offices, Projects, Vendors, etc. If you start to do the math, there are tons of internal links leading to pages with tons of internal links leading to pages with tons of internal links. While our users enjoy navigating this world according to these relationships, I am curious whether we should force a stricter hierarchy for SEO purposes. Essentially, does it make sense to "nofollow" all of the horizontal internal links for a given entity page? For search engine indexing purposes, we have legit sitemaps that give a simple vertical hierarchy, but I am curious whether all of this internal linking should be hidden via nofollow...? Thanks in advance!
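    For reference, the markup being weighed above would look like this (URLs are hypothetical). One caveat worth factoring in: Google stated back in 2009 that PageRank through nofollowed internal links is discarded rather than redistributed to the followed links, so nofollow is generally no longer a sculpting tool:

    ```html
    <!-- a "horizontal" cross-entity link marked nofollow -->
    <a href="/vendors/example-vendor" rel="nofollow">Example Vendor</a>

    <!-- a hierarchical link left followed -->
    <a href="/offices/example-office">Example Office</a>
    ```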

    | jhariani
    2
