Bay Area SEO Agency
-
Hi,
Can anyone recommend good SEO agencies based in the Bay Area with a history of working with gaming or adult brands that have been hit hard by ranking drops in the past 12 months, which we suspect are due to Penguin?
Thanks
-
Moz does have a list of recommended agencies, which is a good place to start. If I were you, I would look for Bay Area companies and then call them with your questions about these specific types of sites.
Related Questions
-
Should I noindex shop page and blog page for SEO?
I have about 15 products in my store. Should I noindex the shop and blog pages for SEO? I ask because I have seen someone suggest noindexing archive pages, and the shop page is a product archive while the blog page is an archive too. So should I choose index or noindex? Thanks!
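For reference, noindexing a page is usually done with a robots meta tag in that page's head (a minimal sketch; where this template lives depends entirely on your shop platform):

```html
<!-- Keeps the page out of the index, but "follow" still lets crawlers
     pass through to the product/post links the archive contains. -->
<meta name="robots" content="noindex, follow">
```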
White Hat / Black Hat SEO | Helloiamgood
-
SEO Links in Footer?
Hi, One of my clients uses a pretty powerful SEO tool; I won't mention the name. They now have a "link equity" tool, which they are using on a lot of their clients' sites, including tons of Fortune 500 companies. It involves adding footer links to your site that change based on the content of the page they are on. The machine learning tries to figure out the most related pages and links to them, using the heading tag of each page as the anchor text. Initially this sounds very spammy to me. But then, it seems a lot like the "related products" tools that many companies use. The goal of this tool is to build up internal linking, especially for deeper pages on their site. They have over 10,000 pages currently. What are everyone's thoughts on this strategy?
White Hat / Black Hat SEO | vetofunk
-
Are links on sites that require PAD files good or bad for SEO?
I want to list our product on a number of sites that require PAD files, such as Software Informer and Softpedia. From an SEO perspective, is it a good idea to have links on these pages?
White Hat / Black Hat SEO | SnapComms
-
What if White Hat SEO does not get results?
If company A is paying $5k a month, and some of that budget is going toward buying links or content that might be in the gray area, but it is ranking higher than company B, which is following the "rules," paying the same, and not showing up at all, what is company B supposed to do?
White Hat / Black Hat SEO | EmarketedTeam
-
Yet another Negative SEO attack question.
I need help reconciling two points of view on spammy links. On one hand, Google seems to say, "Don't build spammy links to your website - it will hurt your ranking." We've seen the consequences of this from the Penguin update: those who built bad links got whacked. After the Penguin update, there was lots of speculation about negative SEO attacks. On that front, Google is saying, "We're smart enough to detect a negative SEO attack," e.g.: http://youtu.be/HWJUU-g5U_I So it seems like Google is saying, "Build spammy links to your own website in an attempt to game rank, and you'll be penalized; build spammy links to a competitor's website, and we'll detect it and not let it hurt them." To me, it doesn't seem like Google can have it both ways, can they? Really, I don't understand why Competitor A doesn't just go to Fiverr and buy a boatload of crappy exact-match anchor links to Competitor B in an attempt to hurt them. Sure, Competitor B can disavow those links, but that still takes time and effort. Furthermore, the analysis needed could be daunting for an unsophisticated webmaster. Your thoughts? Can Google have their cake and eat it too?
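For anyone weighing the disavow option mentioned above, the file Google expects is plain text: one URL or `domain:` rule per line, with `#` lines as comments. A minimal sketch (the domains here are made up for illustration):

```text
# Spammy exact-match anchor links we did not build; site owner unresponsive
domain:cheap-link-farm-example.com
# A single bad page rather than a whole domain
http://spam-blog-example.net/page-linking-to-us.html
```

The file is then uploaded per-site through Google's disavow links tool; it is not something crawlers read from your server.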
White Hat / Black Hat SEO | ExploreConsulting
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe". Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more An explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino." Basically, on the backend of our site we would detect the user-agent of all traffic, and once we found a search bot, serve our pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page.
When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
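The user-agent detection step being described can be sketched roughly like this, assuming a Node.js backend; the regex, function names, and bot list are illustrative, not part of the Dust.js API:

```javascript
// Illustrative user-agent sniff: decide whether templates are rendered
// server-side (for crawlers that can't run JavaScript) or client-side.
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isSearchBot(userAgent) {
  // Treat a missing user-agent header as a normal browser.
  return BOT_PATTERN.test(userAgent || "");
}

function chooseRenderMode(req) {
  // Same template, same data -- only the render location differs,
  // which is the whole argument that this isn't classic cloaking.
  return isSearchBot(req.headers["user-agent"]) ? "server" : "client";
}
```

Whether Google treats this as cloaking hinges on the content being byte-for-byte equivalent, which is exactly the concern raised in the question.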
White Hat / Black Hat SEO | Bodybuilding.com
-
Any e-commerce users recommend an SEO company for link building?
I manage an e-commerce site. I wanted to know if anyone has worked with an SEO company for link building that they would recommend. I DO NOT want article directories, bookmarks, etc. I want real link building from credible/related sites. If you could give me an idea of the results or the general process they use, I would greatly appreciate it. Thank you in advance.
White Hat / Black Hat SEO | inhouseseo
-
SEO and style="display: none;"?
I want to have a function which shortens text in the category view of my shop. Apple is doing this in their product configurator; see the "learn more" button at the right side: http://store.apple.com/us/configure/MC915LL/A Apple does this by adding dynamic content, but I want it to be more SEO-friendly by leaving the content indexable by Google. I know from a search that this technique was used in past years by black-hat SEOs to cover up keyword stuffing, and I also read an article at Google about it. I believe that was years ago and keyword stuffing is completely not an option anymore, so I believe Google would just recognize it the way it's meant to be used. But if I were sure, I would not ask here 🙂 What do you think?
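The technique being asked about can be sketched as follows: the full text ships in the HTML (so it stays indexable), and only its visibility is toggled. The class names and inline handler are illustrative, not from the Apple page:

```html
<!-- Full text is present in the markup; display:none only hides it
     visually, so crawlers that read the HTML still see it. -->
<p>
  Short category intro text.
  <span class="more-text" style="display: none;">
    The rest of the longer description, hidden until requested.
  </span>
</p>
<a href="#" onclick="this.previousElementSibling.querySelector('.more-text').style.display = 'inline'; this.style.display = 'none'; return false;">learn more</a>
```

Note that Google has said hidden-by-default text may be given less weight even when it is not treated as spam, so the safest version keeps the most important copy visible.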
White Hat / Black Hat SEO | kynop