
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I work for a company that has over 100 physical locations. We are working to update our Google Place icon. My question is: do we get any SEO benefit from having a unique icon for each location, or is there no benefit and it's better to focus on having the best single icon possible? Note: we are going to add unique images to each Place listing; this question specifically refers to the main icon shown in Google results.

    | NickConfer
    0

  • Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com), or a .co with the full name as the URL (chicagorealestate.co)? Is there an accepted best practice regarding hyphenated URLs, and/or decent results regarding the effectiveness of the .co? Thank you in advance!

    | joechicago
    0

  • Hello, I would like to know if there is any certificate we can buy to improve the SEO of my website. Thanks very much for your time.

    | Cartageno
    0

  • I have a hosting account that is ancient. So is its cPanel, its way of operation (I have to call in to change the zone file), and its hardware and software (it can't even recognize WordPress as a user, so I have to change permissions to change anything). I plan on moving the site, but I want to prepare for any changes that may happen. Currently the site ranks between #1 and #3 for quite a few very valuable words, and it is also in season for this business. I know changing hosting data or servers can cause Google to temporarily drop rankings. Does anyone have experience with this, or know how long the depressed rankings can last? Or whether it's even true?

    | MarloSchneider
    0

  • If I am targeting a specific keyword, from an SEO perspective is it better to create a subfolder on a URL that has some authority, or is it better to use the exact-match domain with no authority? For example, if I want to target the word 'widgets', which is the better choice and why? Choice 1: www.domainwithauthority.com/widgets (this domain has 1,000 links to it). Choice 2: www.widgets.com (this is a brand-new domain with 0 links).

    | mnipko
    0

  • We're currently in the IA and design phase of rolling out a complete overhaul of our main site. In the meantime I've been doing some SEO triage, but I wanted to start making a longer term plan for SEO during and after the new site goes up. We have a pretty decent domain authority, and some quality backlinks, but we're just getting creamed in the SERPs. And so on to my question: How would you fix this site? What SEO strategy would you employ? http://www.adoptionhelp.org Thanks!

    | AdoptionHelp
    0

  • Background: My e-commerce site uses a lot of layered navigation and sorting links. While this is great for users, it results in a lot of URL variations of the same page being crawled by Google. For example, a standard category page, www.mysite.com/widgets.html, which uses a "Price" layered-navigation sidebar to filter products by price, also produces the following URLs, which all lead to the same page: http://www.mysite.com/widgets.html?price=1%2C250 http://www.mysite.com/widgets.html?price=2%2C250 http://www.mysite.com/widgets.html?price=3%2C250 There are literally thousands of these URL variations being indexed, so I'd like to use robots.txt to disallow them. Question: Is this a wise thing to do? Or does Google take layered navigation links into account by default, so I don't need to worry? To implement this, I was going to put the following in robots.txt: User-agent: * Disallow: /*? Disallow: /*= ...which would prevent any dynamic URL containing a '?' or '=' from being indexed. Is there a better way to do this, or is this a good solution? Thank you!

    | AndrewY
    1
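
    For reference, the directives the asker proposes, commented; note the * wildcard is a Google/Bing extension rather than part of the core robots.txt standard:

        User-agent: *
        # Block any URL containing a query string or a parameter assignment
        Disallow: /*?
        Disallow: /*=

    Two caveats worth knowing: disallowing crawling does not remove URLs that are already indexed, and a rel="canonical" on the filtered pages pointing at www.mysite.com/widgets.html consolidates the variations instead of hiding them, which is often the gentler fix.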

  • I tend to work on the on-page changes first, following keyword research. Then I take a look at internal linking, set up a WordPress blog on /blog or a subdomain, and get my copywriter to start adding regular content. The next stage is link building: old-fashioned email requests, blog comments, and looking through existing sites we own for relevant places. Then ongoing analysis once positions change.

    | onlinemediadirect
    0

  • What is the best way to manage 404 errors for pages that are no longer on the server? For example, a client deletes the old site from the server and replaces it with a new site. Webmaster Tools is reporting 100+ 404 errors from the old site. I've blocked the 404 pages with robots.txt, requested removal in Google Webmaster Tools, and created a custom 404 page: http://www.tvsphoto.com/missingurlexample Is there anything else I can do?

    | SEOProPhoto
    0
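
    One caution on the approach described above: blocking the missing URLs in robots.txt stops Google from re-crawling them, so it never sees the 404 status and the errors can linger in reports. A minimal .htaccess sketch, assuming Apache (the /old-site-directory path is a hypothetical placeholder):

        # Serve the custom 404 page for anything that no longer exists
        ErrorDocument 404 /missingurlexample
        # Return "410 Gone" for sections known to be permanently removed,
        # which tells crawlers to drop them faster than a plain 404
        Redirect gone /old-site-directory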

  • A couple of weeks ago I was having problems with my real estate local listing. I made some changes (like removing anything remotely like keyword stuffing and a few other things). Then, we re-emerged. But now, instead of having 4 citations we have 221. It looks like Google has merged our listing with those of all the other agents in our office. So now, if you type into Google "ABC Realty in OurCity", the very first listing is a 1-box with our listing: Jane and John Doe, Sales Representatives, ABC Realty, and our phone number. We actually rank higher than the ABC Realty office's own web page. We are getting phone calls from people who think they are calling the main office but instead reach us. (This is not at all bad for business... but perhaps there is an ethical issue?) My problem is that if you click on our Places listing, there is one photo on there of a realtor who is not us. Additionally, we lost the two reviews that we had, but we have one review for another realtor who is not us. The rest of the listing is totally ours: our photos, our description, our website, our phone number. If I go to edit the listing, the option to remove that photo is not there. So now we have a conundrum. On one hand, it's great to have this boost. We are appearing #1 for searches for our office and this brings us business. But I want to be ethical. Realtors can be nasty, and I don't want other realtors thinking that I have done bad, manipulative stuff to steal other people's business. Can anything be done? What would you do?

    | MarieHaynes
    0

  • I am working on a project that is basically a site to list apartments for rent (similar to apartments.com or rent.com). We want to add a bunch of amenity pages, price pages, etc., increasing the page count on the site and giving users more pages relevant to their searches and long-tail phrases. So, for example, "Denver apartments with a pool" would be one page and "Seattle apartments under 900" would be another, and so on. By doing this we will take the site from about 14,000 pages to over 2 million by the time we add a list of amenities for every city in the US. My question is: should I worry about timing the release? Meaning, do you think we would get penalized for launching that many pages overnight, or over the course of a week? How fast is too fast to increase the content on your site? The site is about a year old and we are not trying to game anything, just looking to improve site functionality and page volume. Any advice?

    | ioV
    0

  • I apologize first if this comes across as extremely novice, but I realized I really didn't know the answer, and so - here I am. 🙂 Is anyone familiar with tracking Google Places traffic in Google Analytics? Is it possible? I'd love to know how many of our visitors are coming from our Google Places listings (we have several locations throughout the state). Much gratitude in advance ~ Alicia

    | Aaronetics
    0
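
    One workable sketch, since Analytics cannot natively split Places traffic from ordinary organic clicks: set the website URL on each Places listing to a campaign-tagged address using standard Google Analytics parameters (the domain and campaign values below are hypothetical):

        http://www.example.com/?utm_source=google&utm_medium=local&utm_campaign=places-downtown

    With a distinct utm_campaign per location, the Campaigns report then breaks visits out listing by listing.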

  • I could do with some help if anyone's got a minute. We've got this one client; no matter what we did (and we worked very hard on this site), nothing would really move. You'd get the usual fluctuations, and maybe some very small progress at times. This went on for an age... much, much longer than usual (and it wasn't even that competitive for keywords). Then suddenly, "Bam!", it shot up like a rocket for all its main keywords and has stayed there since, more or less (and this was over a year ago). It was as if all the work we'd been doing was building up behind a door and then the door flew open so it could take effect. Anyway... it seems to be happening again, just with a different client and a different website (at least I hope that's what's happening, or it might just stay unaffected by anything we do forever). We've checked everything. There are no crawling problems, again it's not all that competitive, the site already has some pretty good trust and authority, and it already ranks well for a bunch of stuff. The site and pages have plenty of age behind them too. Any ideas?

    | SteveOllington
    0

  • I've been wrestling with this one for a while. Take a standard small website navigation with nav links for Products, Solutions, Support, and Learning Center. I believe having drop-downs to show the sub-pages of each category provides a better user experience, but it also bloats my links per page in the navigation from 4 to 24. Most of the additional links are useful for user experience, but not for search purposes. So, two years after Google's change in how it treats nofollows (which used to be the easy answer to this question), what is considered best practice? A) Go ahead and add the full 24 nav links on each page. The user experience outweighs the SEO benefits of fewer links, and Google doesn't worry too much about nav links relative to main body links. B) Stick to only 4 nav options. Having 20 additional links on every page is a big deal, and removing them is worth the user experience hit. I can still get to all levels of this small site within 2-3 clicks and do cross-category linking to mitigate silos. C) Use some technical voodoo with JS links or iframes to hide the nav links from Google and get the best of both worlds. D) Do something that is not one of the first three choices. Does anyone feel strongly about any of the above options, or is this a user-preference situation where it doesn't make much difference which option you choose on a small 100-200 page site? I'm really looking forward to everyone's thoughts on this. -DV

    | dvansant
    0

  • What have been your proudest moments in SEO? What would you consider your best piece of work, and in what industry was it achieved? What are the reasons you count this as your biggest achievement?

    | onlinemediadirect
    0

  • Beloved community: I'm about to optimize a reasonably large website that has been developed with ASP.NET. My crawl diagnostics do not paint a pretty picture: overly dynamic URLs, loads of duplicate content, and 302 temporary redirects. I found a helpful IIS extension on Scott Guthrie's blog that eliminates a lot of the above issues. But looking ahead, I need a solution for creating a "category"-organized, flat site architecture. What steps should I take with my development team in order to implement a site architecture that is highly crawlable and user-friendly? Any ASP.NET gurus out there? Thanks in advance!

    | jsturgeon
    0
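
    A minimal web.config sketch of the kind of friendly-URL rule the IIS URL Rewrite module (the extension covered on Scott Guthrie's blog) supports; the rule name, URL pattern, and target page here are hypothetical:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Map /widgets/blue-widget to the underlying dynamic page -->
                <rule name="CategoryFriendlyUrls" stopProcessing="true">
                  <match url="^widgets/([a-z0-9-]+)/?$" />
                  <action type="Rewrite" url="Products.aspx?item={R:1}" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    Pairing rules like this with 301s from the old dynamic URLs, and swapping the 302s for 301s, addresses most of the crawl diagnostics listed above.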

  • We have taken over a site and now find ourselves looking at a homepage that has hidden scrolling text, an old-school way of adding text without leaving loads of paragraphs. I have also removed all links to the index.htm page, but visitors are still coming to this page in their droves. I am considering using a canonical URL tag, but I would rather nip it in the bud. I'd love some feedback from other experts; here is the site: http://www.radiatorcentre.com You never stop learning in SEO, and maybe we can all learn from this example. Thanks

    | onlinemediadirect
    0
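
    If the goal is to consolidate index.htm into the root URL rather than canonicalize it, a common mod_rewrite idiom (assuming Apache) looks like this; the THE_REQUEST condition keeps Apache's internal DirectoryIndex subrequest from triggering a redirect loop:

        RewriteEngine On
        # Only fire on an explicit client request for /index.htm
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.htm [NC]
        RewriteRule ^index\.htm$ http://www.radiatorcentre.com/ [R=301,L]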

  • We are preparing to launch a newly designed (and much improved) website in the next few months. I want to be very careful to ensure we do not mess up any rankings (and hopefully actually improve them) when switching over the site. I'm particularly concerned about one key phrase that our homepage currently ranks on. After the redesign it would be more appropriate for one of our subpages to rank for that term, but I'd rather have our homepage rank (less relevant for this keyword than the subpage) than nothing at all. I know about 301 redirects, and we are planning on creating a few comprehensive diagrams to ensure we redirect old pages to the correct new pages. Beyond that, what can I do to preserve our rankings? Thanks! -Ryan

    | RyanD.
    0
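
    For what it's worth, once the old-to-new diagrams exist, the redirects themselves are one line per retired URL in .htaccess, assuming Apache (the paths below are hypothetical placeholders):

        # Map each old page to its closest new equivalent
        Redirect 301 /old-services.html http://www.example.com/services/
        Redirect 301 /old-about.html http://www.example.com/company/about/

    Redirecting page-to-page rather than pointing everything at the homepage is what preserves page-level rankings like the key phrase described above.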

  • Does anyone have experience of how Google deals with slight character variations, e.g. Facade v Façade? From an SEO perspective, are these treated as two completely separate words or is Google clever enough to determine the intent of the searcher & the site?

    | bjalc2011
    0

  • I recently came across what is to me a new SEO problem. A site I consult with has some thin pages with a handful of ads at the top, some relevant local content sourced from a third party beneath that... and a bunch of inbound links to said pages. Not just any links, but links from powerful news sites. My impression is that said links are paid (sidebar links, anchor text... a nice number of footprints). Short version: they may be getting juice from these links. A preliminary lookup for one page's keywords in the title finds it in the top 100 on Google. I don't want to lose that juice, but I do think the thin pages they link to can incur Panda's filter. They've got the same blurb for lots of [topic x] in [city y], plus the sourced content (not original...). So I'm thinking about noindexing said pages to avoid Panda filters. Also, as a future pre-emptive measure, I'm considering figuring out what they did to get these links and aiming to have them removed if they were really paid for. If it was a biz dev deal, I'm open to leaving them up, but that possibility seems unlikely. What would you do? One of the options I laid out above, or something else? Why? p.s. I'm asking this on my blog (seoroi.com/blog/) too, so if you're up for me to quote you (and link to your site), do say so. You aren't guaranteed to be quoted if you answer here, but it's one of the easier ways you'll get a good-quality link. p.p.s. Related note: I'm looking for intermediate to advanced guest posts for my blog, which has 2000+ RSS subs. Email me at gab@ my site if you're interested. You can also PM me here on SEOmoz, though I don't log in as frequently.

    | Gab-Goldenberg
    0
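
    The noindex option being weighed above is a one-line addition to the <head> of each thin page; "follow" keeps the page passing link equity from those news-site links even while it stays out of the index:

        <meta name="robots" content="noindex, follow">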

  • For my informational site I have a lot of URLs that are way too long. When I first created the site, I wrote a script that takes the common words of a post and fashions a URL. So, for example, if the first few words of a question were: "Hi there, I have a question about back pain. I'm wondering what drugs would be good for relief and how I can get some help?" then my URL might be: www.mydomain.com/question?id=123-question-back-pain-wondering-drugs-good-relief-how-get-some-help Once I started learning about SEO I realized that these URLs were too long, but I never did anything about them. Should I be shortening these, or is my time best spent doing something else?

    | MarieHaynes
    2
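
    If the URLs do get shortened, each long form should 301 to its short form. A mod_rewrite sketch, assuming Apache and a hypothetical /question/123 short format:

        RewriteEngine On
        # Capture the numeric id at the front of the old query string
        RewriteCond %{QUERY_STRING} ^id=(\d+)-
        # Redirect to the short form; the trailing ? strips the old query string
        RewriteRule ^question$ /question/%1? [R=301,L]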

  • We recently revived a slew of industry-specific widgets that we had created on Widgetbox.com, largely for SEO purposes. Each widget is remote-hosted and just a static HTML cache page showing information, so all Widgetbox does is wrap that HTML page in JavaScript to show it. However, I realized that when JavaScript is disabled there's no link back to our website, just 4 or 5 back to Widgetbox. When JavaScript is enabled we have a simple link back to our website at the bottom of the widget. I know Google has the ability to crawl JavaScript now, but what do you think: will these links count? I could find virtually no information about it elsewhere. I'm thinking I may just add several widgets to another website to see if Google picks up the links.

    | ACann
    0
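
    A sketch of the fallback being discussed, with a hypothetical URL and anchor text; whether Google assigns full weight to links that only appear in <noscript> (or are injected by JavaScript) is exactly the open question here, so treat this as the experiment, not a guarantee:

        <noscript>
          <a href="http://www.example.com/">Descriptive anchor text</a>
        </noscript>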

  • In what order do you carry out your SEO? Do you start with keyword research or competitor analysis? Or is it on-page and then on to link building? Do you concentrate on article submissions and then move on to social networking? List your step-by-step process here.

    | IPIM
    0

  • Hi. We have a mobile site which is a subfolder within our site. Our desktop site is www.mysite.com and the mobile version is www.mysite.com/m/. All URLs for specific pages are the same, with the exception of /m/ in them for the mobile version. The mobile version has user-agent detection. I never saw this as duplicate content initially, as I did some research and found the following links:
    http://www.youtube.com/watch?v=mY9h3G8Lv4k
    http://searchengineland.com/dont-penalize-yourself-mobile-sites-are-not-duplicate-content-40380
http://www.seroundtable.com/archives/022109.html What I am finding now is that when I look in Google Webmaster Tools, Google shows that there are two pages with the same page title, and therefore I'm concerned that Google sees this as duplicate content. The reason the page title and meta description are the same is simply that the content on the two versions is exactly the same; only the layout changes for handheld browsing. Are there any specific precautions I could take, or best practices to ensure that Google does not see the mobile pages as duplicates of the desktop pages? Does anyone know solid best practices for running an identical mobile version of your main site?

    | peterkn
    1
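
    One precaution in wide use around the time of this thread: a rel="canonical" on each /m/ page pointing at its desktop twin, so the identical titles and descriptions consolidate rather than compete (the page URL is hypothetical):

        <!-- In the <head> of http://www.mysite.com/m/some-page/ -->
        <link rel="canonical" href="http://www.mysite.com/some-page/" />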

  • I just don't understand.  If a question or an answer is decent, I give people a thumbs up.  I don't think I've really seen anyone else give a thumbs up.  Why not?

    | alhallinan
    6

  • Hi. Which URL should I use for my website, and why? 1: http://www.test.com/how-are-you.html 2: http://www.test.com/how are you.html Thanks

    | nipponx
    0

  • I want to target a set of keywords, but I want to know which type of title tag structure or wording is most effective. Here are my target keywords: CMMS, CMMS Software, EAM Software, Maintenance Management Software. Do you think using the exact keyword terms is most effective? For example: Title tag: CMMS, CMMS Software, EAM Software, Maintenance Management Software Or: Title tag: CMMS Software, EAM Maintenance Management Software The same goes for keyword use in content and H1 tags. Your thoughts? Thanks, John

    | VizionSEO99
    0

  • Sitemaps create a table of contents for web crawlers and users alike, and because of how PageRank is passed, HTML sitemaps play a critical role in how Googlebot and other crawlers spider and catalog content. I get asked this question a lot, and in most cases it's easy to categorize sitemaps and create 2-3 category-based maps that can be linked to from the global footer. However, what do you do when a client has 40 categories with 200+ pages of content under each category? How do you segment your HTML sitemap in a case like this?

    | stevewiideman
    0

  • I am finding it a real uphill task with a few of our clients whose product or category pages compete against other sites on main keywords. The sites' category or product-specific pages are in direct competition with other sites' homepages, and I am finding it increasingly difficult to break into positions. What are other people's experiences with this? Do you feel the priority assigned to pages within the XML sitemap could also be a factor in how they are ranked?

    | onlinemediadirect
    0

  • A while back I created a new website. Somehow my "scratch" copies of the site got indexed even though I hadn't built links to them. (In the future I will use noindex tags when I am playing around with designs.) Now I have three versions of the site online... let's call them TheRealSite.com, Practice1.com, and Practice2.com. Practice1.com and Practice2.com now rank #1 for their main keyword (it's a relatively uncompetitive niche). TheRealSite.com is somewhere lower than page 20 despite having an exact-match keyword domain name. I'm assuming that Google considered it duplicate content, as it is exactly the same as Practice1 and 2. I had considered simply removing Practice1 and 2 from the server, but I was worried that if I did that, I would lose my #1 rankings if TheRealSite didn't recover. So what I've done is 301 redirect Practice1 and Practice2 to TheRealSite. I'm guessing that over time TheRealSite will come back to #1, and then I can just remove the files from Practice1 and Practice2. Is this the best way to handle this situation?

    | MarieHaynes
    1
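
    For reference, the path-preserving form of that 301 on the practice domains, assuming Apache (the domain name is taken from the question's placeholders):

        RewriteEngine On
        # Send every path on Practice1/Practice2 to the same path on the real site
        RewriteRule ^(.*)$ http://www.therealsite.com/$1 [R=301,L]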

  • On our real estate site we have our office listings displayed.  The listings are generated from a scraping script that I wrote.  As such, all of our listings have the exact same description snippet as every other agent in our office.  The rest of the page consists of site-wide sidebars and a contact form.  The title of the page is the address of the house and so is the H1 tag. Manually changing the descriptions is not an option. Do you think it would help to have some randomly generated stuff on the page such as "similar listings"? Any other ideas? Thanks!

    | MarieHaynes
    0

  • About 3 weeks ago Google created a duplicate listing for our law firm on Google Maps. In building links I have tried very hard to ensure that our address and company name were always listed identically. Our correct firm name and address is Feldman Feldman & Associates, PC, 2221 Camino Del Rio South, Suite 201; inevitably the new listing stated Camino Del Rio S, Ste 201. All of our reviews moved over to this new profile, so I claimed it, changed it to match ours, and reported it to Google, and Google merged them. Now Google has created another profile; this time the firm name and address match ours exactly (South and Suite both spelled out), but all of the reviews have moved over except for the most recent one(s). I have claimed it again, reported it to Google, and changed the address. Google then created another listing. Our rankings for keywords have been hurt by this. Any idea why this keeps happening? Any suggestions? Here are the two pages. This is our original listing: http://maps.google.com/maps/place?hl=en&cid=468564492130231259 This is the new one Google created itself, which stole all our reviews but ranks very poorly for the keyword searches: http://maps.google.com/maps/place?&cid=468564492130231259

    | jfeld222
    0

  • Hi. Would Google's indexing of Flash content count towards page content? For example, I have over 7,000 Flash files, with one unique Flash file per page followed by a short two-paragraph snippet. Would Google count the Flash as content towards the overall page? At the moment I've X-Robots-tagged the files with noindex, nofollow, and noarchive to prevent them from appearing in the search engines. I'm just wondering whether, if Googlebot visits and accesses a Flash file, it'll get the X-Robots-Tag noindex, nofollow and then stop processing. I think this may be why the Panda update also had an effect. Thanks

    | Flapjack
    0
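
    For reference, the header setup described above typically looks like this in Apache (requires mod_headers); it attaches the directives to the .swf responses themselves:

        <FilesMatch "\.swf$">
          Header set X-Robots-Tag "noindex, nofollow, noarchive"
        </FilesMatch>

    Because the directives ride on the Flash file's own HTTP response, they keep the .swf out of the index but do not by themselves stop Google from indexing the HTML page that embeds it.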

  • Hi. The Panda update tickled, slapped, and then punched us. I'm now weeding through crawl errors, and one of them is a malformed URL with stray anchor markup attached: http://www.flasharcadegamessite.com/26226-paper-airplane-flight.html">paper airplane flight I tried to 301 this to its proper page using: Redirect 301 /26226-paper-airplane-flight.html">paper airplane flight http://www.flasharcadegamessite.com/26226-paper-airplane-flight.html ...but the server starts spitting out 500 errors. How does one add that rule to .htaccess? Thanks

    | Flapjack
    0
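
    The 500s are very likely a parse problem: mod_alias's Redirect expects exactly two arguments after the status code, and the stray quote and spaces in that malformed URL split the line into several. A mod_rewrite sketch that catches any junk trailing the real filename (assuming Apache):

        RewriteEngine On
        # Match the real filename plus at least one trailing character,
        # so the clean URL itself never matches (no redirect loop)
        RewriteRule ^(26226-paper-airplane-flight\.html). http://www.flasharcadegamessite.com/$1 [R=301,L]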

  • Although most newspaper comment sections are a no-follow zone, I have noticed that some comments I have posted with links end up being followed. The comments are participatory, and the links are relevant and even add to the conversation. My theory is that some comments are monitored, and if the editors are looking to encourage discussion and don't feel like you're spamming, why not take the nofollow off? I do plan on doing some testing with poor, spammy comments on the same papers, but I am encouraged and would like to know what other people have found.

    | phogan
    0

  • Google advises that sites should have no more than around 100 links per page. I realise there is some flexibility around this, which is highlighted in this article: http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru One of Google's justifications for this guideline is that a page with several hundred links is likely to be less useful to a user. However, these days web pages are rarely two-dimensional and usually include CSS drop-down navigation and tabs to different layers, so that even though a user may only see 60 or so links, the source code actually contains hundreds; i.e., the page is actually very useful to a user. I think there is a concern amongst SEOs that if there are more than 100-ish links on a page, search engines may not follow links beyond those, which may lead to indexing problems. This is a long-winded way of getting round to my question, which is: if there are 200 links on a page but many of them point to the same URL (let's say half the links are simply second occurrences of other links on the page), will Google count 200 links on the page or 100?

    | SureFire
    0

  • Hi All, We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter available products. These filters are passed as HTTP GET arguments, and the number of possible filter URLs is virtually limitless. In order to prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow, and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it. However, it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin

    | footsteps
    0
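
    One caveat before anything else: noindex only works if Googlebot can fetch the page to read it, so a robots.txt block and a noindex directive are mutually exclusive for a given URL. If crawl volume is the real problem, a robots.txt sketch (the parameter names are hypothetical stand-ins for the site's actual filter arguments):

        User-agent: *
        # Keep crawlers out of filtered result pages
        Disallow: /*?*merk=
        Disallow: /*?*maat=
        Disallow: /*?*prijs=

    Google Webmaster Tools' parameter-handling settings offer a softer alternative: they tell Googlebot which GET arguments to ignore without blocking the pages outright.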

  • I run a site where I answer questions. As I answer each question I choose a title for the page. I have been trying to get good keywords into my titles, but now I am wondering if I have been keyword stuffing them and perhaps I should be more succinct. So, let's say I had a question about a sore back. Here is the title tag I would use: Why is my back sore? I have spinal pain and need relief and help. | My Main Keyword That's a fictitious example, but the idea is that I would be trying to get the keywords "back", "sore", "spinal", "pain", "relief", "help", and my main website keyword into the title. As I'm writing this I'm seeing the folly in it. I think it would likely be much better to simply have a title of: Why is my back sore? So, I have three questions: 1. Is it better to have a succinct title targeting one keyword/keyword phrase than to cram lots of keywords into my title? 2. Should I be putting my main keyword after each of my titles? Shortly after doing this on 1,700+ pages I was #1 for my main keyword, but I was also doing other things to boost my presence for this keyword. 3. If I decide to use more succinct titles, how would you suggest I run a test to see which is better? Looking forward to your responses! Thanks!

    | MarieHaynes
    0

  • Throughout our site, canonical tags have been added where needed. However, the canonical tag is also included on the canonical page itself. For example, on www.askaquestion.com, a canonical tag pointing to www.askaquestion.com has been added. Will this have a negative impact, or does it not really matter that there is such a loop?

    | kbbseo
    0
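
    For reference, the pattern in question is simply this, and it is harmless: Google treats a self-referential canonical as confirmation of the preferred URL, not as a loop.

        <!-- In the <head> of http://www.askaquestion.com/ -->
        <link rel="canonical" href="http://www.askaquestion.com/" />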

  • After reading multiple posts on dealing with multilingual sites (I also checked http://www.google.com/support/forum/p/Webmasters/thread?tid=12a5507889c20461&hl=en), I still haven't got an answer to a very specific question I have. Please allow me to give some background:
I'm working for the official Belgian Yellow Pages (part of Truvo), and as you might know, in Belgium we have to deal with 3 official languages (BE-nl, BE-fr, BE-de | the latter is out of scope for this question), and on top of that we also have a large international audience (BE-en). Furthermore, Belgium is very small, meaning that someone living in the French part of Belgium (e.g. Liège) might easily look for information in the Dutch part of Belgium (e.g. Antwerpen) without having to switch websites/language. Since 1968 (http://info.truvo.be/en/our-company/) we have established 3 different brands, each adapted to a language, each with a clear language-specific connotation:
    for the BE-nl market: we have the brand "gouden gids"
    for the BE-fr market: we have the brand "pages dor"
for the BE-en market: we have the brand "golden pages" Logically, this results in 3 websites: www.goudengids.be, www.pagesdor.be, www.goldenpages.be, each serving a specific language and containing language-specific messages and functionality but, of course, sharing a part of the content that is the same across all websites regardless of the language.
So we have the following links, e.g.:
    http://www.goudengids.be/united-consultants-nv-antwerpen-2000/
    http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/
http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/ When I want to stick with the separate brands for the same content, how do I make sure that Google shows the desired URL when searching in, respectively, google.be (Dutch), google.be (French), and google.be (English)? Kind regards

    | TruvoDirectories
    0
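
    A sketch of the rel="alternate" hreflang annotations Google introduced around the time of this thread for exactly this cross-language, cross-domain case; each of the three pages would carry the full set (URLs taken from the question):

        <link rel="alternate" hreflang="nl-BE" href="http://www.goudengids.be/united-consultants-nv-antwerpen-2000/" />
        <link rel="alternate" hreflang="fr-BE" href="http://www.pagesdor.be/united-consultants-nv-antwerpen-2000/" />
        <link rel="alternate" hreflang="en-BE" href="http://www.goldenpages.be/united-consultants-nv-antwerpen-2000/" />

    With the annotations in place, Google can show the goudengids.be URL to Dutch searchers and the pagesdor.be URL to French searchers, rather than picking one brand for everyone.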

  • Until 2004 I worked for Sprint in several positions, my last one being a Corporate Account Manager for Fortune 1000 customers. In 2004 I left Sprint after the Nextel merger and created an eCommerce site called thesprintstore.net as a Sprint Nextel preferred partner. I used my inner working knowledge of Sprint to my wonderful advantage and began making 3x my original salary. My desire for more business turned to greed, and I began leaking information that consumers loved, i.e. phone release dates, price points, warehouse stock levels, and tricks of the trade. This garnered me thousands of links from big sites (I had no idea at the time), and eventually my site was issued a cease-and-desist order from Sprint's corporate headquarters. I recently realized one evening that I had a GEM of a domain with powerful backlinks that I could redirect to my current site, TECHeGO.com [staff removed hyperlink]. (Some of the backlinks are from Engadget, Engadget Mobile, Rimmarkable, and even one from Sprint.) The redirection has been in place for months now, and I have confirmed that all that sweet link nectar is flowing through! I have found it interesting, however, that my backlink and referring-domain counts have never increased, leading me to believe that in a 301 redirect, existing links become what can only be described as 'Dark Matter Links', i.e. the links are there, simply invisible. Dark matter definition: dark matter is matter that is inferred to exist from gravitational effects on visible matter and background radiation, but is undetectable by emitted or scattered electromagnetic radiation. Dark matter links: dark matter links are visible links that have passed through a 301 redirect and are now inferred to exist but are no longer visible to crawlers. Is there a better definition that could be applied to the term 'Dark Matter Links'?

    | TECHeGO
    1

  • Let's say I have a nice page of content on Domain A, which is a strong domain.  That page has a nice number of links from other websites and ranks on the first page of the SERPs for some good keywords. However, I would like to move that single page of content to Domain B using a 301 redirect.  Domain B is a slightly weaker domain, however, it has better assets to monetize the traffic that visits this page of content. I expect that the rankings might slip down a few places but I am hoping that I will at least keep some of the credit for the inbound links from other websites. Has anyone ever done this?  Did it work as you expected?  Did the content hold its rankings after being moved? Any advice or philosophical opinions on this? Thank you!

    | EGOL
    2

  • Imagine you have a large site on an aged and authoritative domain. For commercial reasons the site has to be moved to a new domain, and in the process is going to be revamped significantly. Not an ideal starting scenario, obviously, to be biting off so much all at once, but unavoidable. The plan is to run the new site in beta for about 4 weeks, giving users the opportunity to play with it and provide feedback. After that there will be a hard cut-over, with all URLs permanently redirected to the new domain. The hard cut-over is necessary for business continuity reasons and because of real complexity in trying to maintain complex UI and client reporting over multiple domains. Of course we'll endeavour to mitigate the impact of the change by telling Google about it in Webmaster Central and ensuring we monitor crawl errors, etc. My question is whether we should allow the new site to be indexed during the beta period. My gut feeling is yes, for the following reasons: It's only 4 weeks, and until such time as we start redirecting the old site, the new domain won't have much whuffie, so there's next to no chance the site will rank for much. It gives Googlebot a head start on indexing a lot of URLs, so they won't all be new when we cut over the redirects. Is that sound reasoning? Is the duplication during that 4-week beta period likely to have some negative impact that I am underestimating?

    | Charlie_Coxhead
    0

  • I have a website that runs a desktop wallpaper script. People can come and upload hundreds of wallpapers to share with the community. This is where the problem comes in. Files are normally called 27636dark.jpg or whatever, and come with no description. This leads to two things: no text content that Google can use to know what the page/image is about, and meta descriptions and URLs that just look like spam. Example: /car-wallpapers/7636dark.jpg Even if a text description were added, it would still only be something like "Green trees in the distance", and as you may guess, with thousands of wallpapers, a lot of descriptions would end up the same. Is there any advice for sites that focus on image-driven content?

    | rhysmaster
    0
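
    One concrete option for image-driven sites: Google's image sitemap extension, which lets you attach a caption to each image even when the hosting page is thin (the URLs and caption below are hypothetical):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
          <url>
            <loc>http://www.example.com/car-wallpapers/green-trees</loc>
            <image:image>
              <image:loc>http://www.example.com/car-wallpapers/green-trees.jpg</image:loc>
              <image:caption>Green trees in the distance</image:caption>
            </image:image>
          </url>
        </urlset>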

  • I'm working on a new real estate site and I've got some great ideas to generate links.  One of these ideas is a pretty neat contest.  (I'd tell you what it is, but really...this is one of those ideas where people will go, 'Man!  I wish I thought of that', so I want to do it first before anyone else steals my idea.  🙂  ) The contest involves readers submitting photos and other readers voting on them.  I anticipate it will generate some good press both locally and internationally if I can get enough submissions.  But, the question is, how do I get contest submissions when I don't have good traffic yet? Once I've got some good submissions, I'll contact some media, spread it amongst my personal fb friends, etc. Any ideas are welcomed!

    | MarieHaynes
    1

  • I have a genuine reason for temporarily moving some pages from my site onto a subdomain for a few weeks. If I do this using a 302, what happens to any in-links to the destination page once the 302 reverses back to the original URL? Is any Google juice from those links lost?

    | StuartAnderton
    1

  • Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? If so, with 12 categories the total number of URLs would be just 12? If this is true, how do you suggest handling our home page, where the latest articles are displayed regardless of their category? I.e., the spiders will find links to a given article both on the home page and in the category it belongs to. We are using canonical tags. Thanks, Jarrett

    | jarrett.mackay
    0
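
    To clear up one point above: a sitemap lists every URL you want crawled, not just entry points; there is no "levels" instruction for spiders to follow. With 100,000+ articles, the standard pattern is one sitemap per category plus a sitemap index that ties them together (the file names are hypothetical):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-politics.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-sports.xml</loc>
          </sitemap>
          <!-- ...one child sitemap per category, each holding up to 50,000 URLs -->
        </sitemapindex>

    An article linked from both the home page and its category is not a sitemap problem: the sitemap lists each article URL once, and the canonical tags already in place handle the duplicate paths to it.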

  • We have a very large ecommerce store with little fresh content being added, except through a blog on a subdomain. We are thinking of moving over our blog, which is on another domain entirely and has a lot of active users. But first I want to make sure it will actually help the domain's rankings, and second, I'm concerned about duplicate content on the old forum if we move it to the main domain. Should we just copy over all the content and 301 the old forum URLs to the new ones? Thanks much!

    | iAnalyst.com
    0
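
    If the move goes ahead, the duplicate-content risk is handled with path-preserving 301s on the old blog domain, e.g. in its .htaccess, assuming Apache (both domain names below are hypothetical):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?oldblogdomain\.com$ [NC]
        # Carry each old post across to its new home on the main domain
        RewriteRule ^(.*)$ http://www.mainstore.com/blog/$1 [R=301,L]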

  • I've been meaning to try out the ETag header for a while now. ETags seem like a great way to tell a bot when to fetch your content and when not to. Furthermore, it is implied that proxy servers can make use of them and help your site load faster by not fetching a new copy when none is available. This is not something that is easy to test on a small site, and implementation on bigger sites is, in my case, a one-way road and a few weeks in hell with the developers on staff. Will ETags take some load off a site with a lot of traffic and dynamically generated content? Is this a good practice as far as search engines are concerned?

    | Svetoslav
    0
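
    For the mechanics: an ETag is a validator the server attaches to each response; on revalidation the client echoes it back in If-None-Match, and a 304 answer lets the server (or an intermediate proxy) skip resending the body. A sketch of the exchange:

        # First request: full response, plus a validator
        GET /page HTTP/1.1
        Host: www.example.com

        HTTP/1.1 200 OK
        ETag: "686897696a7c876b7e"

        # Later revalidation: the client echoes the validator back
        GET /page HTTP/1.1
        Host: www.example.com
        If-None-Match: "686897696a7c876b7e"

        HTTP/1.1 304 Not Modified

    The catch for dynamically generated pages is that the server still has to compute the ETag (for example, a hash of the rendered body) before it can answer 304, so the saving is mostly bandwidth rather than rendering work. For search engines it is ordinary HTTP caching; correct validators simply make conditional crawling cheaper, with no known ranking effect either way.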


