What are your opinions on the Google News vs. Spanish government issue?
-
Greg Sterling said: "Governments across Europe are justifiably alarmed by the declining fortunes of their respective newspaper industries. However punitive or parasitic taxation measures targeting Google, masquerading as copyright protections, are not the answer."
Do you agree?
-
If the Spanish government is smart, it will reverse or revise these rules. If it doesn't, Spanish publishers will be starved of revenue, and with that their quality must drop - if they don't go out of business altogether.
-
Thanks for the explanation and your answer!
-
The reason I mention it's a good idea to post the article is that not everyone will know the author or the context; posting it gives them the chance to read it and, if they want, comment with their opinion, which is what you are asking for.
The comments in that thread are interesting. There is also https://www.seroundtable.com/google-news-spain-down-19580.html
I find it amusing that this has somewhat backfired on the Spanish government, just as it did in Germany, and now they are faced with doing far more harm than good.
The term "justifiably alarmed" is just the author's opinion; as far as I am aware, no governments are actually alarmed. In an age where people want instant news, at little or no cost and usually via smartphone, it's not hard to see why papers are going downhill. The main problem is that many outlets that publish "news" are not professionals and don't normally cite sources, which can land them in some very bad situations - but I digress.
If you want to see a prediction, check this out - http://searchengineland.com/avoid-liability-google-reduces-news-content-germany-headlines-204811
If the government wants to throw its toys out of the pram, Google has no problem upping and leaving until they come to their senses, in my opinion.
Lastly, remember this doesn't come into force until January 2015, so there is still time for change.
-
By the way, I didn't ask the question because I read it on Search Engine Land; I'm Spanish, I live in Spain, and this affects me. I had read a couple more Spanish articles before that one. I just quoted it because I thought it summarized the whole thing well.
-
I quoted the name of the man who wrote that paragraph, and I'm asking for your opinion - have you got one?
Thanks for posting the link.
-
Psst - if you are going to ask a question, it's a great idea to cite where it's from - http://searchengineland.com/spanish-newspapers-want-government-force-google-keep-news-spain-210860