Google place page Images
-
Is there any real difference between uploading an image directly to your Google Places page and linking to an image from another site?
I have heard that you get better results if you upload a photo to Photobucket, then to Insider Pages, and then post that link to your Google Places page. To me it just seems a bit odd to do things this way. I get that it's supposed to give you more backlinks, but I don't think it would necessarily be relevant or useful for the user.
Any thoughts?
-
I'm with David: there are so many other more important things to consider. I can't believe this has any influence.
-
I really don't think it would make a difference. Worry instead about providing high-quality, engaging content that delivers value. That'll benefit you far more!
Related Questions
-
Does Google push a site down for not ranking top for its branded keywords?
Hi all, Websites usually rank for their branded keywords, but sometimes third-party websites overtake them. If there are many queries where a website is not ranking at the top for its branded keywords, does Google push the website down in its overall rankings? Is there any correlation? Thanks
Algorithm Updates | vtmoz
-
Is it wise to conduct a link building campaign to a Google+ Local page?
Is it wise, while doing a link building campaign, to focus not only on the main website's target page but also on the Google+ Local page? Here are two strategies I was thinking of using:
1. Conduct a city-specific link building campaign that directs traffic to the location-specific page on the main website AND the Google+ Local page.
2. Use the main website to direct traffic to each city's specific Google+ Local page.
Does it make sense to drive links to a Google+ Local page? It does to me, but I haven't seen anything written about it yet... or perhaps I've just missed it along the way. I'd love to hear the community's thoughts. Thanks! Doug
Algorithm Updates | DougHoltOnline
-
Google doesn't index my Google+ profile
Hey guys! I know it sounds like a novice question, but I have checked ALL THE BOXES THAT TELL GOOGLE TO INDEX MY GOOGLE+ PROFILE. It is visible for search, 100%. It's been three weeks since I opened the Google+ profile and it still hasn't been indexed for its name. Any guesses what's going on? (It's not under this name, so don't try to google me.)
Algorithm Updates | Yoav_Vilner
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Google bot from indexing our site. In our .htaccess file, someone recently added the following statements:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]

It's almost funny, because it ended up as a rewrite that redirects back to itself. We found in Webmaster Tools that the site could not be indexed by the Google bot because it could not detect the robots.txt file. We didn't have one before, as there wasn't much that needed to be excluded, though we have added one now for good measure. The robots.txt file was never the real problem, though; it was the rewrite statement above that was blocking the bot. Not knowing what the deal was, we tested the site by going to Webmaster Tools > Health and selecting "Fetch as Google", our way of manually requesting that the site be re-indexed so we could see what was happening. The status came back as:

HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mysite.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close

<title>301 Moved Permanently</title> Moved Permanently. The document has moved here.

We have since fixed the screwed-up rewrite in the .htaccess file, but now all of our pages have been severely penalized: they rank far below where they were just before the incident. We are freaking out because we don't know the real consequences, or if and how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9 and 40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on the drop in traffic we anticipate. This sucks, because these are people's lives, but a business must make money, and if you sell less you have to cut overhead, and the easiest cut is payroll. I'm on a team with three other people working to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects could be significant if Google doesn't restore matters. My question is: what would you guys do? Is there any way to contact Google about such a matter? I've never seen such a thing. I'm sure the pages missing from the index will make their way back in, but what will their rank look like next time, and has that type of rewrite permanently affected every page site-wide, including those still in the index but severely affected? We would love to see things bounce back quickly, but neither I nor my counterparts know what to expect. Thanks for any speculation, suggestions, or insights of any kind!
Algorithm Updates | David_C
-
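Whether a rule like that loops comes down to the anchoring of the RewriteCond pattern. A minimal sketch using Python's re module (close enough to Apache's PCRE for this simple pattern; mysite.com is the placeholder domain from the post, and the unanchored variant is a hypothetical sloppy version, not the rule as quoted):

```python
import re

# The condition as posted: anchored at both ends, so only the bare host matches
# and the redirect to the www host fires exactly once.
anchored = re.compile(r"^mysite\.com$", re.IGNORECASE)

# A common sloppy variant with no leading anchor: it also matches the www host,
# so the redirect to www fires again on every request, producing a loop.
unanchored = re.compile(r"mysite\.com$", re.IGNORECASE)

for host in ("mysite.com", "www.mysite.com"):
    print(host,
          "anchored:", bool(anchored.search(host)),
          "unanchored:", bool(unanchored.search(host)))
```

The anchored pattern matches only the bare host, so the rule as quoted should redirect each request once and stop; if the live rule was missing the anchor or differed in some other detail, every request to the www host would 301 back to itself, which is consistent with the looping response above.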
Google has indexed a lot of test pages/junk from the development days.
With hindsight I understand that this could have been avoided if robots.txt had been configured properly. My website is www.clearvisas.com, and it is indexed both with the www subdomain and without. When I run site:clearvisas.com in Google I get 1,330 results, all junk from the development days. But when I run site:www.clearvisas.com I get 66 results, all post-development and more in line with what I wanted indexed. Will 1,330 junk pages hurt my SEO? Is it possible to de-index them, and should I? If the answer to any of these is yes, how should I proceed? Kind regards, Fuad
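To keep leftover development pages out of the crawl going forward, a robots.txt rule can be verified programmatically before deploying it. A minimal sketch with Python's standard urllib.robotparser (the /dev/ path and URLs are hypothetical examples, not the site's actual junk paths):

```python
import urllib.robotparser

# Hypothetical robots.txt blocking an old development directory.
rules = [
    "User-agent: *",
    "Disallow: /dev/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Junk development URLs are blocked from crawling...
print(rp.can_fetch("*", "http://www.clearvisas.com/dev/old-test-page"))  # False
# ...while the live pages remain crawlable.
print(rp.can_fetch("*", "http://www.clearvisas.com/visas"))              # True
```

Note that robots.txt only stops crawling; pages already in the index are better removed with a noindex meta tag or Google's URL removal tool, since a crawl block alone leaves existing entries in place.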
Algorithm Updates | Fuad_YK
-
How much does Google take social into account in SERPs?
I have been reading and trying to learn more about how Google takes social media into account when ranking sites, but I can't seem to find a definitive answer: does it make a big difference, or does it not really matter?
Algorithm Updates | MiracleCreative
-
How did my Page Authority and Page Rank disappear?
I've hit a problem. A couple of days ago my site's Page Authority was 51 and its PageRank was 3; now they're 1 and 0 respectively. The developer did adjust some of the site's code in the past couple of days, but that shouldn't have affected this. It was last cached by Google on the 5th. Can anyone offer some good advice? If it helps, the page is www.duracard.com
Algorithm Updates | Andrea.G
-
Google and Content at Top of Page Change?
We always hear that Google made this change or that change to their algorithm this month. Sometimes it's true, and other times it's just a rumor. This week I was speaking with someone in the SEO field who said that a change occurred at Google this week, and will become more prevalent: merchant sites that place content above the fold on product pages will get better placement than sites that put products at the top with some content beneath them at the bottom of the page. Any comments on this?
Algorithm Updates | applesofgold