
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I had a website that got hacked and had malware added to it. I have since closed that website down, but I still have the domain name. Prior to the malware, that domain ranked incredibly well for its niche, had a good range of high-quality links pointing to it, and a domain age of 6 years. I'm now creating a new website similar to the old one (the same, but with a different platform and layout). Is it a good or bad idea to redirect the old domain name to the new website? (Redirect sketch below.)

    | james.rose
    0
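
    A minimal sketch of that redirect, assuming Apache and placeholder domains (and assuming the old domain's link profile wasn't itself penalised during the hack - worth checking before forwarding it):

        # .htaccess at the root of the old, now-closed domain.
        # Prefix-matching "Redirect 301 /" maps every old path onto the
        # same path on the new site, so deep links keep resolving.
        Redirect 301 / http://www.new-site-example.com/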

  • Somehow, with our site architecture, Google is crawling URLs for products we no longer carry (there are no links to those pages, so I am still trying to figure out how Google is finding them). Those URLs are being redirected to our invalid-product page. That invalid-product page returns a 200 OK code, but according to Google it should be a 404, so we get a soft 404 error, and Google sees all of the URLs that redirect to that page as soft 404s as well. The first solution I can think of is to create a custom 404 page that looks just like our site, says we don't have the page/product they are looking for, has a search bar, sends a 404 code, etc. Is this the right way to go? And since it will probably take some time to implement, is there a quick fix we could do first (sketch below)?

    | ntsupply
    0
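
    As a stop-gap while the custom 404 page is built, the invalid-product URL can be made to answer with a real 404 status. A sketch, assuming Apache and a hypothetical /invalid-product path:

        # Serve a genuine 404 instead of 200 so Google stops flagging
        # soft 404s; replace the pattern with your real invalid-product URL.
        RewriteEngine On
        RewriteRule ^invalid-product$ - [R=404,L]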

  • A client likes the control of Vimeo Pro for embedding videos on site, but for search purposes would like to create a YouTube channel with the same videos, perhaps with altered titles and descriptions. This is the same video content in two places - will we run into duplicate content issues? Thank you, Stephen

    | PerfectPitchConcepts
    0

  • Hi, I've just found two sitemaps - one of them is .php and represents part of the site structure on the website. The second is a .txt file which lists every page on the website. The .txt file is blocked via robots exclusion protocol (which doesn't appear to be very logical as it's the only full sitemap). Any ideas why a developer might have done that?

    | McTaggart
    0

  • Hi, I just encountered the following article on Digital Trends: http://www.digitaltrends.com/mobile/lg-z-rumored/ This is a huge and respected site. Notice that whenever the word "smartphone" or "smartphones" is mentioned, there is a link to Sprint. Needless to say, Sprint has nothing to do directly with the article's subject (a new LG smartphone that may be coming soon). So, is this a paid placement? Is this legit? Does it help Sprint even though the article is not really related? Should I pursue these types of articles (links) for my site, or can only HUGE companies get away with it? Any thoughts?

    | BeytzNet
    0

  • Quick question: can Googlebot (or other search engines) follow meta refresh tags? Does a meta refresh work anything like a 301 in terms of passing value to the new page? (Server-side alternative sketched below.)

    | kchandler
    1
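
    Googlebot has generally been reported to treat an instant (0-second) meta refresh much like a redirect, but a server-side 301 is the documented, reliable way to pass value. A minimal sketch of the 301 alternative, with hypothetical paths:

        # Preferable to a meta refresh whenever you control the server.
        RewriteEngine On
        RewriteRule ^old-page$ /new-page [R=301,L]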

  • Hello All, I have a situation with my site where a vendor created a local directory of locations on a sub-domain of my site. This sub-domain has approximately 2,000 pages. It has a PR of 3 and a good backlink profile (not many links; mostly citations; not spammed). It gets decent traffic, but 80% of the traffic is driven by PPC. We have created a new local section on the main site, and we are trying to weigh the benefit of redirecting all of those pages on the old sub-domain (sketch below). We anticipate that this new section will begin to replace the old sub-domain in the SERPs. Additionally, when our deal with the company that manages this sub-domain ends in three months, the pages will no longer exist. Is it worth redirecting the pages (you might need more information to give good insight into that)? Also, if we do implement approximately 2,000 redirects, what effect will that have on the main site from an SEO perspective? Is it possible that Google might ignore this large-scale redirect effort? Will the value also be limited by the fact that the redirects might only be live for a month before the original pages are deleted? Any help/insight with this would be greatly appreciated. Thanks!

    | ResponseMine
    0
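
    If the sub-domain pages map 1:1 onto pages in the new local section, a single pattern can cover all ~2,000 URLs. A sketch for the sub-domain's document root, with placeholder host names:

        # Send every page on the old locations sub-domain to the matching
        # page in the new /local/ section of the main site.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^local\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/local/$1 [R=301,L]

    Pages without a true equivalent are usually better pointed at the section index than at the home page.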

  • Hi Everyone, Has anyone had experience using the sitelinks demotion tool within GWT after a site redesign?  How long does it take for the demotion to go into effect? We just went live with a new version of a site for one of our clients, and the generated sitelinks are no longer valid.  I have demoted the urls associated with the old sitelinks, and it's been over a week with no change.  Does anyone have experience in a similar situation? Also, for anyone looking for a time frame on changes to a meta description, after resubmitting an updated sitemap, it took 2 days for Google to display an updated meta description. Please advise, Chris Wilson

    | Chris_CM
    0

  • Hi all, For discussion... I am painstakingly working my way through a link profile, highlighting 'unnatural links' and contacting webmasters to try and get the links removed - I haven't got as far as a 'disavow' or a 'reconsideration request'. I have found a large number (around 150) of links from http://www.bookmarks4you.com, and when I attempted to contact the site for link removals, I received a payment request in order to do so. Now, the amount being requested is low, so it may be worthwhile; however, I wondered what the consensus was with regard to this sort of demand? I know I could simply add the links to my disavow list, but for the sake of a small payment I could get rid of them much quicker! Also, the majority of sites that I am contacting only have a contact form as opposed to an email address that I can use directly - what I am doing is taking a screen print of each contact form in order to have proof that I am actually doing the 'hard graft' as opposed to simply adding sites to a disavow list - is this a worthwhile exercise? Many thanks Andy

    | TomKing
    0

  • I have a site, www.firewall-cs.com, that I have been working on for 4 months. For the first 2 months the keywords were going up, and then... they dropped like a rock! We didn't build the website, but it is a WordPress site, so I can make some changes. The keywords that have "IT" in them haven't been able to recover. It's like Google isn't even reading the home page. The home-page slider contains the H1 - and the page has 3 of them. I have told this to the client. Plus, there isn't a lot of content on the page. Is the H1 issue enough to keep "IT Support Orlando" from ranking? Any suggestions would help! Thank you.

    | ClickIt
    0

  • Google has updated with the new algorithm. Did you see any effects? And since they are not revealing technically how it works, what's your opinion?

    | Esaky
    0

  • I run a local business in New York City, a commercial real estate brokerage. My firm has both a web site and Google+ accounts - one Google+ account for me personally and one for my business. Under address, my Google+ account is showing New York, NY. It is not showing a street address. Similarly, when my business name is entered in the Google search bar, my web site is the first result, but under address (directly to the right of a black dot with a grey circle around it), "New York, NY" appears with the phone number beneath it. No sign of my street address. My business is registered under Google Places, and we have entered the correct street address. Any ideas on how I can get Google to display our street address? This is obviously very, very detrimental for local SEO. Thanks,
    Alan

    | Kingalan1
    0

  • A newbie to this forum... hope I have put the question the right way. What is a good way/source to find which directories are suitable for a business? And how do you identify directories that are more localised?

    | grovermohit
    0

  • If you have a site with a few thousand high-quality, authoritative pages, plus tens of thousands of search-results and tag pages with thin content, and you noindex,follow all of the thin-content pages at once, will Google see this as a good or a bad thing? I am only trying to do what Google's guidelines suggest, but since I have so many pages indexed on my site, will throwing the noindex tag on ~80% of the thin-content pages negatively impact my site? (Header-based approach sketched below.)

    | WebServiceConsulting.com
    0
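
    One way to apply noindex,follow in bulk is the X-Robots-Tag HTTP header, which avoids editing thousands of templates. A sketch, assuming Apache with mod_headers and hypothetical /search/ and /tag/ URL patterns:

        # Flag thin search-results and tag pages, then send the robots
        # header only for flagged requests.
        RewriteEngine On
        RewriteRule ^(search|tag)/ - [E=THINPAGE:1]
        Header set X-Robots-Tag "noindex, follow" env=THINPAGE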

  • Hi, I have a site with a FeedBurner feed that has been in place for 5+ years. I am considering getting rid of the feed, or starting a new one, to combat content scraping. Google continues to rank thieves' sites ahead of mine; Bing has no issue and always gets it right. I use WordPress and have the PubSubHubbub plugin, but that is no guarantee. Nonetheless, there is no monetary value in my subscribers, whereas the content not being credited to me takes money out of my pocket, as my model is advertising. Is there any SEO issue if I do either of the following: delete the feed and not have one? Change the feed address and drop all subscribers? Attachments: DMCA dashboard; example of being outranked by scrapers. My site: www.furniturefashion.com Thanks for your time, and hopefully I did not vent too much.

    | will2112
    1

  • I'm targeting short-head and chunky-middle keywords for generating traffic to an ecommerce website. I guess I have two options, both with great content: 1) blog posts, or 2) category pages with content (essentially the blog post). On the basis that it is great content that earns links, I would hope to garner links into the heart of the ecommerce website by going with option 2, category pages. Any thoughts on blog posts vs. ecommerce category pages for targeting keywords?

    | BruceMcG
    0

  • I've set up a filter to remove bot traffic from Analytics. I relied on regular expressions posted in an article that eliminates what appears to be most of them. However, there are other bots I would like to filter, and I'm having a hard time determining the regular expressions for them. How do I work out the regular expression for additional bots so I can apply them to the filter? I read an Analytics "how to", but it's over my head, and I'm hoping for some "dumbed down" guidance (see the sketch below). 🙂

    | AWCthreads
    1
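
    The usual recipe: copy the bot's name exactly as it appears in your reports, escape regex metacharacters such as dots with a backslash, and join the names with |. The same alternation pattern works in a filter field or, as in the blacklist further down this page, in .htaccess. A sketch with made-up bot names:

        # Block (or, in a GA filter, exclude) any of the listed agents;
        # "examplebot" and "fake-crawler" are placeholders.
        RewriteCond %{HTTP_USER_AGENT} (examplebot|fake-crawler|spider\.example\.com) [NC]
        RewriteRule ^. - [F,L]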

  • One of my newish hobby sites has begun to attract some crappy links - as per Google Webmaster Tools' Links To Your Site report. The typical .ru and .pl kind of crap that seems to seep into all somewhat successful sites' link profiles. I have not received any notifications or penalties, BUT I am considering proactively disavowing these, and wanted to bounce this idea off some other SEOs before proceeding. Cheers!

    | David_ODonnell
    0

  • How do I implement both Schema and Rich Snippets, and how beneficial are they to SEO and the SERPs?

    | bronxpad
    0

  • Hi All, Key to a successful website is quality content - so the Gods of Google tell me. Embrace your audience with quality, feature-rich articles on your products or services, hints and tips, how-tos, etc. So you build your article page with all the correct criteria: long-tail keyword or phrase in the URL, heading, first sentence, etc. My question is this:
    Let's say you have 30 articles. Where would you place the 30 articles for SEO purposes and user experience? My thoughts are:
    1] on the home page, create a column with a clear heading "Useful articles" and populate the column with links to all 30 articles;
    or
    2] throughout your website, create link references to the articles as part of the natural information flow;
    or
    3] create a banner or impact logo on all pages to entice your audience to click through to a dedicated "articles page". Thanks Mark

    | Mark_Ch
    0

  • What is the best practice for removing a product page from an ecommerce site, if a 301 is not available and the page has already been crawled by the search engine? A. Block it out in robots.txt. B. Let it 404. (A third option is sketched below.)

    | Bryan_Loconto
    0
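
    A third option worth weighing: if the product is gone for good and no close substitute exists, a 410 (Gone) tells engines the removal is deliberate. Blocking the URL in robots.txt is usually counter-productive here, because engines can then never see the 404/410 at all. A sketch with a hypothetical path:

        # Answer 410 Gone for the retired product page.
        RewriteEngine On
        RewriteRule ^products/discontinued-widget$ - [G]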

  • Say you've got a Magento e-commerce site and you want to add Schema.org microdata to it to take advantage of Google's Rich Snippets feature. Would the markup be part of the page's HTML title... or somewhere in the bare-bones description (usually wrought by inputting data into separate fields in the CMS), e.g.: Item: Something
    Price: $00.00
    Short Description: blah, blah, blah. Or hidden somewhere in the header? Or can it be marked up somewhere beneath my lengthy (and Panda-friendly) content and subsequently extracted by Google and highlighted in the SERPs? Admittedly, I'm more than a bit late to the Schema.org party, and I'm a content guy anyway, not much good at under-the-hood stuff. I figure I'd better get my chops together now. I've searched Moz.com's Q&A as well as Google's and Schema.org's, and haven't yet come up with an answer that doesn't require learning a whole new vocabulary.

    | RScime25
    0

  • I have a customer who has a website, 8 years old. The business has changed, and he has launched a new website (and sub-business) to handle a particular service. As such, the main website will no longer be handling that service. For purposes of example: the service in question had its own area set aside on his website, so what we have done is to 301 that part of the site (a single URL) to the homepage of his new website (sketch below). Old Business Site:
        Service 1
        Service 2 (301 to new site)
        Service 3
    New Business Site. This worked well, and within a week his new site was gaining traffic for the service keyword. However, we have now had an unnatural link warning in Webmaster Tools. The old page on the old site had minimal links to it (around 400). It had a page authority of 42 and 142 linking domains. The new website has been live a few weeks now and has had 3 links to it, all genuine. He was on page one for the new business name, and is now on page 6. Has anyone else ever seen this happen, and how should we deal with it? We could of course remove the 301 redirect and put in a reconsideration request, but the 301 seems like the right thing to have done, and is genuine. Any advice greatly appreciated.

    | makeusawebsite
    0
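
    For reference, the single-URL, cross-domain redirect described above might look like this in the old site's .htaccess (the path and domain are placeholders):

        # Forward just the old service page to the new business site.
        Redirect 301 /services/service-2 http://www.new-business-site.com/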

  • I am quite new to international SEO. I have a customer who wants to use the same website content on various domains targeting different countries, such as: xxxx.hk - same content targeting Hong Kong; xxxx.co.uk - same content targeting the UK; xxxx.de - same content targeting Germany. I found that this could be possible with Google's suggested hreflang, without any duplicate content problem. Is that true? Could someone explain this for me (sketch below)? Another question: if the above is true, do we need to make other adjustments as well, such as any adjustments in Google Webmaster Tools (for each domain)? Server location - does that really make a difference? Can we host all of the domains on the same server, or should we separate them and host each one in the country it is targeting? Thanks in advance!

    | stradiji
    0
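
    Hreflang can be declared in the HTML head, in an XML sitemap, or via HTTP headers; every version must reference all the others, itself included. A sketch of the header method on the Hong Kong server, assuming Apache with mod_headers (the domains are the poster's own placeholders):

        # Each response advertises all language/region alternates.
        Header add Link "<http://xxxx.hk/>; rel=\"alternate\"; hreflang=\"zh-HK\""
        Header add Link "<http://xxxx.co.uk/>; rel=\"alternate\"; hreflang=\"en-GB\""
        Header add Link "<http://xxxx.de/>; rel=\"alternate\"; hreflang=\"de-DE\""

    On hosting: the ccTLDs themselves are the strong geotargeting signal, so server location generally matters far less, and one shared server is usually fine.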

  • I'm not a business lawyer but has Moz been in the marketplace long enough to take issue with this? Volusion's new mid-to-enterprise level eCommerce platform is http://www.mozu.com/

    | AWCthreads
    0

  • Like AdWords, is there any good way of advertising a business location on Google Maps? If the answer is yes, can you please take me through the process and give me a rough idea of the cost?

    | csfarnsworth
    0

  • I have been looking for a while for a good and clear step-by-step guide for moving a site from an old domain to a new one... so I guess a good discussion here could help many webmasters have a smooth transition. So in your opinion, besides the obvious, what are the most important steps you must take? Here is what I do: 1. 301 the old site to the new one and TEST (sketch below).
    2. Check internal links - double-check for 404's.
    3. Update your social profiles with the new URL.
    4. Notify GWT and BWT of the change and request a crawl.
    5. Contact as many webmasters as you possibly can to point their links to your new domain. What's missing? What have you found helpful and/or effective?

    | dhidalgo1
    0
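
    For step 1, a whole-site pattern keeps every path intact on the new domain (host names hypothetical):

        # Catch any request arriving on the old host and 301 it to the
        # same path on the new domain.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]

    One step that seems to be missing from the list: Google Webmaster Tools has a Change of Address feature designed for exactly this scenario.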

  • We have several microsites (by microsite I mean sites that are basically top-level departments of our main ecommerce site). We continue to run these, without much support, and they do generate a few sales, but we simply don't have the resources to grow them or manage them effectively. We have kicked around the idea of 301 redirecting them to our main ecommerce site, on the theory that any additional SEO value would be greater than the few sales they currently generate. All products on our microsites can be found on our main ecommerce site, so we can redirect each microsite product to the exact product on our main site. How would you treat these sites? Would you 301 redirect them? If so, how would you do it (sketch below)? What would be some considerations if we decide to 301 redirect? Microsite example: http://www.drinkingstuff.com/ Main site: http://www.prankplace.com/ I would greatly appreciate any tidbits the community could provide us. Thanks!

    | Istoresinc
    0
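
    Since every microsite product exists on the main site, a per-product map plus a category fallback is one way to sketch it (the product paths are hypothetical; the domains are the poster's own):

        # Exact product-to-product redirects take priority...
        Redirect 301 /funny-mug http://www.prankplace.com/funny-mug.htm
        Redirect 301 /whoopee-cushion http://www.prankplace.com/whoopee-cushion.htm
        # ...and anything unmapped falls back to the closest category page.
        RedirectMatch 301 ^/(.*)$ http://www.prankplace.com/drinking-gifts.htm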

  • For a large jobs site, what would be the best way to handle job adverts that are no longer available? Ideas that I have include: 1) keep the URL live with the original content and display current similar job vacancies below - this has the advantage of continually growing the number of indexed pages; 2) 301 redirect old pages to parent categories - this has the advantage of concentrating any acquired link juice where it is most needed. Your thoughts much appreciated.

    | cottamg
    0

  • In other words, does title tag change frequency hurt SEO ? After changing my title tags, I have noticed a steep decline in impressions, but an increase in CTR and rankings. I'd like to once again change the title tags to try and regain impressions. Is there any penalty for changing title tags too often? From SEO forums online, there seems to be a bit of confusion on this subject...

    | Felix_LLC
    0

  • We have a site that is suffering from a duplicate content problem. To help resolve this, we intend to reduce the number of landing pages within the site. There is a HUGE number of pages. We have identified the potential to reduce the pages by half at first, by combining the top-level directories, as we believe they are semantically similar enough that they no longer warrant being separated.
    For instance: Mobile Phones & Mobile Tablets (it's not "mobile devices"). We want to remove one directory path and 301 those pages to the other, then rewrite the content to cover both phones and tablets on the same landing page (pattern sketch below). Question: would a massive number of 301s (over 100,000) cause any harm to the general health of the website? Would it affect the authority? We are also considering just severing them from the site, leaving them indexed but not crawlable from the site, to try and maintain a smooth transition. We don't want traffic to tank. Has anyone performed anything similar? I'd be interested to hear all opinions. Thanks!

    | Silkstream
    0
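
    If the tablet URLs mirror the phone URLs, one pattern rule can stand in for the 100,000 individual redirects (directory names are illustrative):

        # Collapse the whole tablets directory into the phones directory.
        RewriteEngine On
        RewriteRule ^mobile-tablets/(.*)$ /mobile-phones/$1 [R=301,L]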

  • Three months ago we updated our site design, and as a result lots of page URLs changed. At the time we 301 redirected about 100 pages (all pages are on the same domain - 301 redirects like .com/about-us/company to .com/company). Anyhow, my question is: should I leave these redirects active indefinitely, or kill them on the assumption that the value has passed through by now? Your thoughts are welcomed. Thanks, Glen.

    | AdvanceSystems
    0

  • VTEX is one of the best e-commerce platforms in Brazil, and I've just found out they transform every 404 page into a search page. Polishop (http://www.polishop.com.br/) is one of their clients, and if you request any page it will never return a 404 error, because they convert any URL into a search. Example: http://www.polishop.com.br/12345678 returns HTTP/1.1 200 OK (it does not return a 404 code). I'm a little confused about whether this is good or not... what do you think, Moz experts?

    | SeoMartin1
    0

  • I was wondering if anybody can shed some light on any recent changes to the Google algorithm in Australia. A competitor, www.manwithavan.com.au, has always been number 1 for the most competitive search term in our industry, "removalists melbourne". However, in the last week they have fallen out of the SERPs and are now (according to Moz) ranking outside the top 50. As far as I can tell, they have a really well-optimized site with good structure, great text, and updated content. They are very active within social media circles and have some really good external links. Can anybody tell me why they would have been hit so badly? The reason I ask is that I want to make sure we don't make the same mistake. Any feedback would be greatly appreciated.

    | RobSchofield
    1

  • Hi there, I'm going through my link profile and I noticed I have a few links that are from <10 DA sites.  One has a DA of 6.  Should I remove these? Aside from any referral traffic I receive from these links (I know there is none), are these links hurting me?
    What should I look out for in a site I may guest post on? Thanks!
    Travis

    | Travis-W
    0

  • Hi guys (first time posting). I'm involved in many different marketing activities on an ecommerce site and don't always get a lot of time to focus on SEO (although I appreciate its importance). What are your tips for the most effective SEO tasks to focus on given these time constraints? Think 80/20 applied to SEO. Thanks. Paul

    | kevinliao
    0

  • Hello, Our client allows users to create free-trial subdomains, and once a trial expires, all of those subdomains show the same page. If people stick around, their own websites are hosted on the subdomain. Since all these expired-trial subdomains have the same content and link back to the homepage, should those links be nofollowed? Has anyone dealt with something similar? Thanks very much in advance,

    | SCAILLE
    0

  • Using the redirect function in cPanel, I am able to create the 301 redirect that I need to resolve duplicate content issues in Moz. However, the issue now is that when I try to log in at domain.com/login, it redirects to domain.com/index.php?q=admin, which is not a page on the site, and I can no longer log in. I have checked the htaccess file, and the entry appears to be correct (I originally thought the cPanel redirect was not writing the rule correctly). I am not sure if there is a small detail that I am missing. So my main question is: how do I redirect my site to remove duplicate content errors while retaining the login at domain.com/admin, without being redirected to domain.com/index.php?q=admin? (Exclusion sketch below.) Thank you ahead of time for your assistance.

    | Highline_Ideas
    0
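
    Without seeing the actual rule cPanel wrote, a common fix is to add an exclusion condition so the admin and login paths are never rewritten. A hypothetical sketch (the old-path/new-path rule stands in for whatever rule cPanel generated):

        # Skip the duplicate-content rewrite for the login/admin paths.
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/(login|admin) [NC]
        RewriteRule ^old-path/(.*)$ /new-path/$1 [R=301,L]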

  • Hi folks, I am responsible for an e-commerce website. Our website is doing very well, but I believe that our product pages should be ranking more highly than they currently are. When taking over my current role, it became clear that a number of changes would need to be made to try and boost the underperforming product pages. Amongst other things, I therefore implemented the following: New product content - we have placed a massive focus on reworking all product content so that it is unique and offers value to the reader; the new content includes videos, images, and text that is all keyword-rich but (I hope) not seen as overly spammy. Duplicate content - the CMS was creating multiple versions of the same page; I addressed this by implementing 301 redirects and adding canonical links, ensuring there is now only one version of each page. Parameters - I instructed Google not to index certain URLs containing specific parameters. Internal links - I have tried to increase the number of links to the products from relevant key category pages. My question is: although some of the changes have only been in place for a month, what else can I do to ensure that the product pages rank as highly as possible? As an e-commerce website with so many products, it is very difficult to link to these product pages directly, so any tips or suggestions would be welcome! Here's an example of a product page: http://www.directheatingsupplies.co.uk/pid_37440/100180/Worcester-Greenstar-29CDi-Classic-Gas-Combi-Boiler-7738100216-29-Cdi.aspx

    | DHS_SH
    0

  • I've noticed recently that a number of content scrapers are linking to one of our websites and have the duplicate content on their web pages. Can content scrapers affect the original website's ranking? I'm concerned that having duplicated content, even if hosted by scrapers, could be a bad signal to Google. What are the best ways to prevent this happening? I'd really appreciate any help as I can't find the answer online!

    | RG_SEO
    0

  • Hi all. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top-5 ranking for the term 'bird hides' with this page - http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all of the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13). This shows the issue pretty clearly: https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8 I've removed the link, redirected the bad URL, updated the sitemap, and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there. Any ideas? I'm stumped!

    | Blink-SEO
    0

  • Our company is setting up a store on eBay. Is it okay to duplicate our product descriptions on our eBay store, with a link going back to our website? Or could this potentially hurt us in search?

    | hfranz
    0

  • Hey guys! I asked this question a few months ago, and now that we are seeing even more implicit information determining search results, I want to ask it again... in two parts. Is it STILL best practice for on-page SEO to add the city name to your titles, H1s, content, etc.? It seems this will eventually be an outdated tactic, right? If there is a decent amount of search volume for a query without any city name (e.g., "storefront signs") but no search volume for the phrase when specific cities are added (e.g., "storefront signs west palm beach"), is it worth trying to rank and optimize for that term for a company in West Palm Beach? We can assume that if there are 20,000 monthly searches for the non-location-specific term, SOME of them are fairly local, so do we optimize the page without the city name and trust Google to detect local intent... thereby showing our client's site in the SERPs when someone searches "sign company" while they are IN West Palm Beach? If there is any confusion, please just ask me to clarify! I think this would be a great Whiteboard Friday topic for Rand!

    | RickyShockley
    0

  • Hi there, I'm working on getting a large e-commerce website indexed and I am having a lot of trouble.
    The site is www.consumerbase.com. We have about 130,000 pages and only 25,000 are getting indexed. I use multiple sitemaps so I can tell which product pages are indexed, and we need our "Mailing List" pages the most - http://www.consumerbase.com/mailing-lists/cigar-smoking-enthusiasts-mailing-list.html I submitted a sitemap a few weeks ago of a particular type of product page and about 40k/43k of the pages were indexed - GREAT!  A week ago Google de-indexed almost all of those new pages.  Check out this image, it kind of boggles my mind and makes me sad.  http://screencast.com/t/GivYGYRrOV While these pages were indexed, we immediately received a ton of traffic to them - making me think Google liked them. I think our breadcrumbs, site structure, and "customers who viewed this product also viewed" links would make the site extremely crawl-able. What gives?
    Does it come down to our site not having enough Domain Authority?
    My client really needs an answer about how we are going to get these pages indexed.

    | Travis-W
    0

  • Hello! We host our PDFs, images, and CSS on a sub-domain. For the question, let's call this sub.cyto.com. I've noticed a particular PDF doing really well; in fact, it has gathered valuable external links from highly authoritative sites. To top it off, it gets good visits. I've been going back and forth with our developers about moving this PDF to a subfolder structure.
    For example: www.cyto.com/document/xxxx.pdf. In my view, if I move this and set up a permanent redirect, then all the external links the PDF gathered, the link juice, and future visits will be attributed to the main website. While the PDF lives on the subdomain, I can't even track direct visits or get the link juice. It appears in the top position of Google as well. My developer says it is better to keep images, PDFs, and CSS on the subdomain. I see his point, and an idea I have is to: convert the PDF to a webpage; set up a 301 redirect from the existing subdomain URL to this webpage; and upload the PDF with a new name and link to it from the webpage, so users can download it if they choose to. This should give me the existing rank juice. However, my question is whether you can set up a 301 redirect for just a single subdomain URL to a folder-structure URL: sub.cyto.com/xxx.pdf to www.cyto.com/document/xxxx.pdf? (Sketch below.)

    | Bio-RadAbs
    0
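
    Yes - a single sub-domain URL can be 301'd on its own. In the .htaccess at the root of sub.cyto.com (the PDF filename here is the poster's placeholder):

        # Forward just this one file; everything else on the sub-domain
        # is left untouched.
        Redirect 301 /xxx.pdf http://www.cyto.com/document/xxxx.pdf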

  • Hi Guys, I have a website which has approximately 15 million pages indexed. We are planning to change the URL structure of 99.99% of the pages, but they would remain on the same domain. E.g., old URL: xyz.com/nike-shoes; new URL: xyz.com/shopping/nike-shoes. A benefit we would get is adding a related and important keyword to the URL. We also gain other technical benefits in identifying the page type beforehand, which can reduce the time taken to serve the pages (as per our tech team). For the old URLs, we are planning to do a 301 redirect (pattern sketch below). While this seems to be the correct thing to do as per Google, we do see a very large number of cases where people have suffered significantly after doing something like this. Here are our questions: Will all PageRank value be passed to the new URLs (i.e., will there be 100% passing of PR/link juice)? Can it lower my rank for keywords (currently we have pretty good rankings, 1-5, on many keywords)? If there is an impact on rankings, will it be only on specific keywords, or will we see a sitewide impact? Assuming we take a hit on traffic, how much time would it take for traffic to return to normal, and if traffic goes down, by what percentage might it drop and for how long (best-case, average-case, and worst-case scenarios)? Is there anything I should keep in mind while doing this? I understand that there are no clear answers to these questions, but we would like to evaluate worst-case/best-case situations. Just to give context: even a 10-day downtime in terms of rank drops is extremely detrimental to our business.

    | Myntra
    0
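
    A sketch of the pattern redirect, with a guard so migrated URLs cannot loop (the pattern is illustrative; the real rule set depends on exactly which paths move):

        # Add the /shopping/ prefix to single-segment product URLs.
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/shopping/
        RewriteRule ^([a-z0-9-]+)$ /shopping/$1 [R=301,L]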

  • I've read some stuff about expired content here, but have yet to find an answer, so I thought I would post my question, which concerns expired-content issues on a news-based site. My site does recaps and previews for sporting events. Eventually the content becomes irrelevant, as nobody cares about a prediction after the game is done. What would be the best method to deal with this? Should I just leave the pages there, or 301 redirect them to more relevant games? The reason I'm asking is that after I added a more recent game, such as New York vs. Boston, and searched for that keyword in Google, the page Google would show would be something like Atlanta vs. L.A. Thanks in advance!

    | ravashjalil
    0

  • Big city A is the target optimization for services. Suburb city B is the location of the business. Will the NAP of the business in the footer negatively impact on-page optimization for Big city A?

    | AWCthreads
    0

  • # Begin HackRepair.com Blacklist
    RewriteEngine on
    # Abuse Agent Blocking
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Default\ Browser\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^eCatch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetRight [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^GrabNet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} GT::WWW [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} heritrix [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ia_archiver [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} id-search [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} id-search.org [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ISC\ Systems\ iRc\ Search\ 2.1 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^JetCar [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Link [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} LinksManager.com_bot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} linkwalker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} lwp-trivial [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Maxthon$ [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} MFC_Tear_Sample [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^microsoft.url [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Missigua\ Locator [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mozilla.NEWT [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} panscient.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^pavuk [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^PeoplePal [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^RealDownload [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} SBIder [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Toata\ dragostea\ mea\ pentru\ diavola [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} urllib [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} User-Agent [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} webalta [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} WebCollage [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} Wells\ Search\ II [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} WEP\ Search [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWW-Mechanize [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^(.)Zeus.Webster [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC]
    RewriteRule ^. - [F,L]
    # Abuse bot blocking rule end
    # End HackRepair.com Blacklist

    | esiow2013
    1

  • Hi Mozzers, Happy Friday! I have a client that has created some really nice pages from their old content, and we want to redirect the old ones to the new pages. The way the web developers have built these new pages is to use hashbang URLs, for example www.website.co.uk/product#newpage. My question is: can I redirect URLs to this kind of page? Would it be done using the .htaccess file? (Sketch below.) Thanks in advance, Karl

    | KarlBantleman
    0
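
    You can redirect TO a fragment URL from .htaccess; the catch is the NE (no-escape) flag, without which Apache encodes the # as %23. A sketch with hypothetical paths:

        # 301 an old URL to a hashbang page, keeping the # literal.
        RewriteEngine On
        RewriteRule ^old-page$ /product#newpage [R=301,NE,L]

    Note the reverse is impossible server-side: browsers never send the fragment to the server, so a hashbang URL itself cannot be matched by a redirect rule.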
