SERoundtable is usually pretty good at covering algo updates (and potential updates) - this is the only August post he has on a potential update.
You can also check out these volatility tools (which track changes in search engine results):
You should use og:locale:alternate and have og:locale set to the country you want to present it to. See details here.
Overall, since it's English in both scenarios, I don't think you need to worry about it.
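If you do want to set it, a minimal sketch of the markup (assuming en_US is the primary locale and en_GB the alternate - swap in whichever locales actually apply):

```html
<!-- Primary locale of the page -->
<meta property="og:locale" content="en_US" />
<!-- Other locales this page is also intended for -->
<meta property="og:locale:alternate" content="en_GB" />
```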
So I would recommend using mod_rewrite to serve an SEO-friendly URL for your docs search results, e.g. newsites.com/resources/search/technical-documents/, and redirect to that.
If that's not possible, I would still simply redirect to the search results page. Users want to see those docs, so redirecting them anywhere else will just cause them to bounce.
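If mod_rewrite is an option, a rough .htaccess sketch (the underlying search URL and parameter names here are placeholders, since I don't know your actual setup):

```apache
RewriteEngine On

# 301 the old parameterized search URL to the friendly one (checking THE_REQUEST
# keeps this from firing on internal rewrites and looping)
RewriteCond %{THE_REQUEST} \s/search\?category=technical-documents [NC]
RewriteRule ^search$ /resources/search/technical-documents/? [R=301,L]

# Internally serve the friendly URL from the existing search results page
RewriteRule ^resources/search/technical-documents/?$ /search?category=technical-documents [L]
```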
Check out their official training: https://analyticsacademy.withgoogle.com/
Also check out http://cutroni.com/ (Google's Analytics advocate)
Backlinks: depends on the quantity and quality of the backlink profiles for your competitors. If they have a much higher DA/PA/Trust, then you probably need to build that up to compete.
Content: unique, long-form content works best. Aim to be share-worthy... if people share your link on social media, they will probably link back to your website as well.
Can we see example of the search console error?
Here is a news site that does it well so maybe study their code? http://qz.com/723397/the-incredible-things-that-had-to-go-just-right-for-juno-to-reach-jupiter/
1 visit, 3 pageviews
From an SEO perspective, each page is considered part of the same article. So the difference is really that you get more pageviews if you split it up (given that users choose to go to the next page).
Yep, they are being indexed.
I also see your work for https://www.google.com/search?q=Indianapolis+Baby+Photography&source=lnms&tbm=isch
It could just be an authority issue... you need to strengthen your website authority (get more backlinks) in order to rank your pictures higher.
I think his question is more "why nofollow the icon? google only considers the first link to a domain per page (which in this case is the nofollow icon) which cancels out the 2nd dofollow link"
I don't think this is the case but he cited a French case study that supposedly affirms that statement. Based on the translated version of the article, I don't think there was anything to support it but I could be misunderstanding.
Any French SEOs want to delve in?
Yes, it may be over-optimized. First off, make sure the images are being indexed (search for site:yourdomain.com + keyword in image search).
Since most of your images are so similar, you should add more descriptive terms (clothing, scenery, lighting, ethnicity, baby synonyms (toddler, infant, newborn, etc.)) so that Google doesn't think you are just stuffing keywords.
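For example (file names and descriptions here are made up, just to show the idea):

```html
<!-- Before: every image targets the same phrase -->
<img src="session-012.jpg" alt="Indianapolis baby photography">

<!-- After: descriptive, varied alt text that still supports the topic -->
<img src="session-012.jpg"
     alt="Newborn girl in a knit bonnet sleeping on a cream blanket - Indianapolis newborn photography">
```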
Note: Googlebot crawlers all originate in California
Don't expect the old site to rank (since Google won't follow the redirect) but new site should be fine.
Make sure...
Don't worry about exif/meta data.
I'm checking out the results and it looks like page C doesn't rank for any of the link anchor texts, so I'm not sure how they came to that conclusion.
From what I understand...
3 pages each have 2 links to page C. Each link has a unique anchor text that isn't found on page C:
Then searching for the specific anchor texts that were linked should show Page C in search results. I don't see page C ranking for any of the anchor texts tested.
Am I missing something?
Why wouldn't the second link be followed? Do you have any studies to cite "tests show that in this case the second link is not taken into account"?
Theoretically, if you paginate it correctly, it shouldn't make a difference (each paginated page would be treated as part of a whole). If you don't, it might hurt you for having more thin pages vs. fewer quality pages (it doesn't sound like you will be targeting any additional keywords).
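Paginating it "correctly" would mean rel prev/next link tags in the head of each part, along the lines of this sketch (URLs are placeholders):

```html
<!-- On page 2 of a hypothetical 3-part article -->
<link rel="prev" href="https://example.com/article/page-1">
<link rel="next" href="https://example.com/article/page-3">
```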
View the source on the cached version of the page: http://webcache.googleusercontent.com/search?q=cache:https://www.oliverbonas.com/jewellery/silver-honeycomb-bee-necklace-29494 - reviews are there
Search for a quote from the review and see if that page ranks: https://www.google.com/search?q="Like+bird+on+a+wire+very+pretty+and+unusual"
Ran another crawler and didn't see any issues so I think you should be good technically.
I also ran a stress test and there were times when the VU couldn't access the page (although generally this would throw up a different error code, it could be the reason for the 406).
You can put anything into your meta description and it won't mess up any NAP inconsistencies. It's used purely as the text to show in SERPs - not related to SEO, link building, local map pack, etc.
Differences in NAP in the actual website content and various business citations could cause issues, but not in the meta description.
Check to see if they are both in the index (or 1 is in supplemental index). If so, it's an issue.
In general, capitalization does mean a different URL (think of all of the link shorteners that use a mix of upper- and lowercase letters in order to differentiate).
1 and/or 3, not 2.
If there is a lot of volume and the keywords are competitive, #3 would help you build sites that are more focused on specific topics (which are ranking well now). Note that this is harder to manage/maintain and will require a larger investment in marketing than option 1.
I also think you should have 1 "corporate/umbrella" website that does what your option 1 says which links to your option 3 sites. Hopefully you can have both sites ranking for your terms.
Not sure I totally understand the question, but overall... meta descriptions have no direct effect on rankings. The goal of a meta description is to entice people to select your link instead of the others on SERPs - use whatever it takes to make that happen.
If someone is looking for a specific office, it would make sense to use that specific office's info in the meta description.
Currently, Google only looks at the desktop version of the page for its index, so collapsing for mobile would have no effect on rankings.
In general, Google says that hidden/collapsible content is given less weight than visible content since it's not considered as important for users to see.
If English is your default language, keep that at the root (which I see you've already implemented - nice!) and use subfolders for the remaining languages. Be sure to use hreflang markup to let crawlers know which pages are translated versions of which (see the sketch below).
Check out http://searchengineland.com/the-ultimate-guide-to-multilingual-and-multiregional-seo-157838 (+ sources at the bottom)
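The hreflang annotations would look roughly like this in the head of every version of a page (example.com and the language folders are placeholders):

```html
<link rel="alternate" hreflang="en" href="https://example.com/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/">
<!-- Fallback for users whose language you don't target -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```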
Here is what Google sees and indexes for that page: http://webcache.googleusercontent.com/search?q=cache:https://www.whichledlight.com/t/gu10-led-bulbs&num=1&strip=0&vwsrc=1
Be sure you follow all of the steps on https://support.google.com/webmasters/answer/6033049?hl=en
your "support.at-net.net" subdomain hasn't been redirected
You have an extra 301 redirect which doesn't help. Why aren't you using your root domain as site index?
http://at-net.net --> https://www.expertip.net/ --> https://www.expertip.net/en/computer-support-cloud-services-cybersecurity-solutions/
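A rough .htaccess sketch for collapsing that into a single hop (assuming the rules live on the old domain's server and equivalent paths exist on the new domain - otherwise map the important URLs individually):

```apache
RewriteEngine On
# Catch the root, www, and support subdomains of the old domain
RewriteCond %{HTTP_HOST} ^(www\.|support\.)?at-net\.net$ [NC]
# Send visitors straight to the final destination in one 301
RewriteRule ^(.*)$ https://www.expertip.net/$1 [R=301,L]
```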
We need more info. Site? URL the 404 had an inbound link from?
While going through your site, the crawler encountered a link that it followed which resulted in the server returning a 404 status code (not found).
I respectfully disagree. While it may not be considered duplicate content, it can be. In fact, look at Google's own suggestions on https://support.google.com/webmasters/answer/66359?hl=en:
Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.
--
So sure, Google says you shouldn't worry about it, but if 10% of your pages' content is repeated on each page, that may or may not be an issue... why take that chance? If your actual products don't have much unique content describing them, the boilerplate content reduces the relevance of the page.
Your suggestion of a "pop-up/modal window on a single page linked from the product pages" is the best way to minimize boilerplate content and avoid any chance of worse rankings.
If you could keep the .aspx pages at their old locations, you can add aspx redirect code: http://www.rapidtables.com/web/tools/redirect-generator.htm
Here are Google's official recommendations for website testing. According to them, no amount of cloaking is okay. Try using one of the other methods suggested.
I'm seeing difficulty of 30-40 for cheap phones (depending on country): https://moz.com/explorer/overview?q=cheap+phones (watch the video that pops up from Moz's chat helper)
Check out https://moz.com/help/guides/keyword-explorer for more in depth info on how the scores are calculated.
Keyword Difficulty takes into account the Page Authority (PA) and Domain Authority (DA) scores of the results ranking on the first page of Google for the given query, as well as modifying intelligently for projected click-through-rate of a given page (putting more weight on higher-ranking, more visible pages and less on lower-ranking, less visible pages). The formula also accounts for newer pages on powerful domains that may have DA scores but have not yet been assigned PA values.
Opportunity Score is designed to calculate the relative Click-through-Rate (CTR) of the organic web results in any given Google Search Engine Result Page (SERP). Google SERPs that have very few non-traditional ranking features and are more similar to the classic “ten blue links” only model will have very high Opportunity Scores. SERPs that have many features - like images, ads, news results, answer boxes, knowledge graph panels, etc. - will have much lower Opportunity scores. We use an averaged CTR model derived from our anonymized clickstream data to build this useful metric and apply it based on the features we see in Google’s results.
Yep, your canonicals are incorrect, which is likely causing the issue. You have all of the office locations canonicalized to https://www.hrblock.com/tax-offices/local-offices/index.html#!/en/office-locator/ and Google is determining that these are separate pages that should be indexed individually. That's probably why Google adds extra text to the titles to differentiate the pages.
Instead, on http://www.hrblock.com/tax-offices/local-offices/#!/en/office-profile/3704?otppartnerid=9192&campaignid=pw_mcm_9192_5138 you should have a canonical pointing to http://www.hrblock.com/tax-offices/local-offices/#!/en/office-profile/3704 (the one listing I'm seeing that loads correctly has that style of URL in SERPs).
Try fixing your canonical URLs, then wait for Google to recrawl/reindex the pages.
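As a sketch of what that canonical tag could look like on the office profile page mentioned above:

```html
<!-- On the office profile page (including any tracking-parameter variants),
     point the canonical at that office's own clean URL rather than the office-locator index -->
<link rel="canonical" href="http://www.hrblock.com/tax-offices/local-offices/#!/en/office-profile/3704">
```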
Where is Google finding these URLs? I don't see the variables when I browse through the website.
If the pages are already indexed and you want them to be completely removed, you need to allow the crawlers in robots.txt and noindex the individual pages.
So if you just block the site with robots.txt (and I recommend blocking via folders or variables, not individual pages) while the pages are indexed, they will continue to appear in search results but with a meta description along the lines of "this page is blocked by robots.txt". They will continue to rank and appear because of the cached data.
If you add the noindex tags to your pages instead, the next time crawlers visit the pages they will see the new tag and remove the page from the search index (meaning it won't show up at all). However, make sure your robots.txt isn't blocking the crawlers from seeing this updated code.
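A minimal sketch of the noindex approach (the folder path in the comment is just a placeholder):

```html
<!-- On each page you want dropped from the index -->
<meta name="robots" content="noindex, follow">

<!-- Important: hold off on any robots.txt rule like "Disallow: /private-folder/"
     until the pages have actually fallen out of the index, otherwise crawlers
     will never see this tag -->
```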
Here are the 4 you should submit (if you don't have the site in https, you can just do the first two).
The http:// is the default protocol, so there's no need to submit both http:// and a version without any protocol.
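For reference, the four property versions would presumably be (using example.com as a stand-in for your domain):
http://example.com
http://www.example.com
https://example.com
https://www.example.com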
Yes, that would help. Since the content is identical for each of these products, there should only be 1 URL with all of the variations of that product in order to consolidate all of the authority. If you want to keep all of the variations in search, look into creating anchor links that point to the same "master" URL, e.g. http://www.prams.net/easywalker-mini-buggy-lightweight-union-jack-b can be linked as http://www.prams.net/easywalker-mini#union-jack
That way, the URL structure is more SEO-friendly but aesthetically the site is identical.
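A rough sketch of that setup (the second variation and the section markup are made up for illustration):

```html
<!-- Variation links point at fragments of the single "master" product URL -->
<a href="http://www.prams.net/easywalker-mini#union-jack">Union Jack</a>
<a href="http://www.prams.net/easywalker-mini#black">Black</a>

<!-- On the master page, each variation lives in a section with a matching id -->
<section id="union-jack"> ... </section>
<section id="black"> ... </section>
```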
Search engines won't index anchor URLs so that's not an option.
Does the staff category page have the bios loaded into the HTML on page load? If so, redirect to the staffers category page. If not, you need to come up with another solution (where the bio content is on a page(s) that can be indexed).
Check out the link intersect tool: https://moz.com/researchtools/ose/opportunities/link-intersect?site=fillinyoursite.com
You enter a bunch of your competitors and it tells you which pages have backlinks pointing to them. The more of your competitors a page links to, the higher the chance you can get your site added there as well.
I don't believe they currently do this. So reorganizing, rewording + adding additional info/links would make the resource pages even better. Just be sure to give credit where it's due.
Yeah, if you search for a Facebook page on Majestic, it will return backlinks to that specific page. You're right about that not being the case for OSE.
Majestic provides data for FB pages.
To your point, a backlink from a high authority FB page may be weighted more than one from a new/fake one.
I recommend developing dedicated pages for those keywords along with the town name (btw, you forgot to edit it out of your last example). In addition, make sure your homepage contains all of those terms.
Just because you had your manual penalty lifted, doesn't mean the algo penalty (mainly penguin) was lifted as well. Be sure to check up on your anchor distribution to make sure it isn't over optimized.
That being said, we had a site in a similar scenario and ended up switching to another domain after a lack of results for a while. Depending on the extent of the negative SEO, it might be the easiest solution.
Sitemap looks fine. Click on the URLs in GSC and see the referring links. That should give you some more insight.
Sounds like a forum script would work for you.
He also mentioned 500 backlinks to the Facebook page, which I will assume aren't crappy. If they are legit, I would try to avoid changing the Facebook URL. Maybe you create a separate FB page and outreach to the webmasters and ask them to update the URL (or better yet, link back to your website instead). Later on, you can merge the two pages.
Create question "category" pages that groups up multiple answers into larger, more encompassing pages.
Amazon, Zappos, Walmart, Microsoft Store - many ecommerce sites don't use tabs.
A common workaround seems to be to have "tabbed navigation" but instead of toggling visibility, it scrolls down to the corresponding section.
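A bare-bones sketch of that pattern (section names are placeholders):

```html
<!-- The "tabs" are just in-page anchor links; all of the content stays visible and indexable -->
<nav class="tabs">
  <a href="#description">Description</a>
  <a href="#specs">Specs</a>
  <a href="#reviews">Reviews</a>
</nav>

<section id="description"> ... </section>
<section id="specs"> ... </section>
<section id="reviews"> ... </section>

<style>
  /* Optional: smooth scroll instead of an instant jump */
  html { scroll-behavior: smooth; }
</style>
```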
Agreed, with one note... it's a really bad idea if you are already ranking for the terms you'd like to rank for. There will definitely be a period of time where you lose your rankings, and although they should return to their previous positions, Google is fickle.
So if you aren't ranking in the top 50 for several terms, simply changing the URL won't get you to the first page (it may get you into the top 50 though). You would need to update the content and backlinks (internal + external) to see worthwhile movement. Otherwise you just risk losing ranking for the terms you currently do well for.