Questions created by LindsayDayton
-
Can Google index the text content in a PDF?
I really, really thought the answer was always no. There are plenty of other things you can do to improve search visibility for a PDF, but I thought the nature of the file type made the content itself not parsable by search engine crawlers... But now, my client's competitor is ranking for my client's brand name with a PDF that contains comparison content. The thing is, my client's brand isn't in the title, the alt text, the URL... it's only in the actual text of the PDF. Did I miss a major update? Did I always have this wrong?
Technical SEO | | LindsayDayton0 -
I'm stumped!
I'm hoping to find a real expert to help out with this. TL;DR: Our visibility in search has started tanking and I cannot figure out why.

The whole story: In fall of 2015 I started working with Convention Nation (www.conventionnation.com). The client is trying to build a resource for convention and tradeshow attendees that would help them identify the events that will help them meet their goals (learning, networking, sales, whatever). They had a content team overseas that spent their time copy/pasting event information into our database. At the time, I identified several opportunities to improve SEO:

- Create and submit a sitemap
- Add meaningful metas
- Fix crawl errors
- On-page content uniqueification and optimization for the most visible events (largest audience likely to search)
- Regular publishing and social media

Over nine months, we did these things and saw search visibility, average rank and CTR all double or better. There was still one problem, and it is created by our specific industry. I'll use a concrete example: MozCon. This event happens once a year, and there are enough things that are the same about it every year (namely, the generalized description of the event, attendees and outcomes) that the 2015 page was getting flagged as a duplicate of the 2016 page. The event content for most of our events was pretty thin anyway, and much of it was duplicated from other sources, so we implemented a feature that grouped recurring events. My thinking was that this would reduce the perception of duplicate or obsolete content and links and provide a nice backlink opportunity. I expected a dip after we deployed this grouping feature; that's been consistent with other bulk content changes we've made to the site. But we are not recovering from the dip. In fact, our search visibility and traffic are dropping every week.
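One alternative sometimes used for recurring events like the MozCon example (a hypothetical sketch — the event URL paths here are invented for illustration, not the site's actual structure) is to keep a page per year but point older years at the current one with rel=canonical, so the near-duplicate pages consolidate rather than compete:

```html
<!-- On the hypothetical 2015 event page, inside <head>: -->
<link rel="canonical" href="https://www.conventionnation.com/events/mozcon-2016">
```

Under this approach each year's page still exists for visitors, but search engines are told which version should carry the ranking signals.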
So, the current state of things is this:

- Clean crawl reports: no errors reported by Moz or Google
- Moz domain authority: 20; spam score: 2/17
- We're a little thin on incoming links, but steady growth in both social media and backlinks
- Continuing to add thin/duplicate content for unique events at the rate of 200 pages/mo
- Adding solid, unique strategic content at the rate of 15 pages/mo

I just cannot figure out where we've gone astray. Is there anything other than the thin/copied content that could be causing this? It wasn't hurting us before we grouped the events... What could possibly account for this trend? Help me, Moz Community, you're my only hope! Lindsay
Intermediate & Advanced SEO | | LindsayDayton0 -
Change URL or use Canonicals and Redirects?
We just completed a conclusive A/B test on a client's landing page. The new page saw a 30% bump in conversions, yay! Now what?

Option 1: Change the URL of the new page to that of the old page, and retire the old page.
Option 2: 301 redirect the old page (and anything that was pointing to it) to the new page, and make the new page the canonical.

I'm afraid of option 1 because I think Google's WTF penalty will be a bit harsher than with option 2, but I wanted to sanity check that here. Any thoughts or experienced advice would be very appreciated!
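For what it's worth, option 2 can be as small as a single server rule (a hypothetical .htaccess sketch using Apache's mod_alias — the paths are invented placeholders, not the client's real URLs):

```apache
# Hypothetical paths -- substitute the client's actual landing pages.
# Permanently redirect the losing page (and anything linking to it) to the winner.
Redirect 301 /landing-page-old /landing-page-new
```

The new page would then carry a self-referencing canonical tag in its head, so the winning URL is unambiguous to crawlers.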
Technical SEO | | LindsayDayton0 -
URL Changes Twice in the Same Year
I've got a new client with a great site, great off-page optimization, and some scars and a hangover from a bad developer relationship. I'd be so grateful for your thoughts on this situation.

Some time in the not-too-distant past, the website was established and new content was posted. We'll call this URL structure Alpha. In April 2015, the client migrated to WordPress, implementing 301 redirects on every content page because of the capitalization issues of the old CMS. That means Alpha URLs are redirecting to Betas. Problem is, the new Beta WordPress URLs use the permalink structure /%year%/%monthnum%/%postname%/ and update by default when the page content is updated, meaning that any update to existing content causes another 301. It's my belief that for evergreen content, dates in the URL do nothing to help you and might even hurt from a user-experience standpoint, if not a search engine one. So, naturally, I'd like to move to the simple /%postname%/ structure, which would be Gamma.

So, here's how I think we should fix it:

Step 1: Update the sitemap and navigation and make the desired URL structure (Gamma) the default and the canonical.
Step 2: Change the Alpha -> Beta redirects to Alpha -> Gamma.
Step 3: Add Beta -> Gamma redirects.

Anyone done this in the past? Anyone have any problems with it?
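The heart of steps 2 and 3 is collapsing redirect chains so nothing hops Alpha -> Beta -> Gamma in two steps. A minimal sketch of that cleanup, with hypothetical URLs (this illustrates the mapping logic only, not a WordPress plugin configuration):

```python
def flatten_redirects(redirects):
    """Collapse redirect chains so every source URL 301s straight to its
    final destination (e.g. Alpha -> Beta -> Gamma becomes Alpha -> Gamma).
    `redirects` maps source URL -> target URL."""
    flat = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        # Follow the chain until we hit a URL that redirects nowhere,
        # guarding against accidental redirect loops.
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

# Hypothetical URLs: Alpha (old CMS), Beta (dated WordPress), Gamma (clean).
redirects = {
    "/Posts/My-Article": "/2015/04/my-article/",   # Alpha -> Beta
    "/2015/04/my-article/": "/my-article/",        # Beta  -> Gamma
}
print(flatten_redirects(redirects))
```

After flattening, both the Alpha and Beta URLs point directly at the Gamma URL, so no visitor or crawler ever passes through two redirects.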
Intermediate & Advanced SEO | | LindsayDayton0