Posts made by ReneReinholdt
-
RE: Restaurant menu SEO: PDF or HTML?
For SEO I would recommend HTML, with the option to download a PDF. That being said, you should also consider how much time is spent doing it in HTML. Could the company earn more money by spending that time on something else? I mean, how often do people actually search for a specific dish when they are about to go out for dinner? My bet is that they are more likely to do a search like "Italian restaurant in downtown Berlin" rather than "broccoli slightly roasted in white wine". Do you see what I mean?
-
RE: Tracking twitter traffic in Google Analytics
Yes, you have understood it right, it seems. I would recommend Google's own shortener though, since it will help Google understand your tweet.
-
RE: Tracking twitter traffic in Google Analytics
Both the URLs you posted seem identical to me.
1) Nope.
2) Since they are both just the original URL, yes.
3) Both the parameters I wrote were examples. You could call them BigBearBunga or whatever you like. Call them whatever makes sense to you.
-
RE: Tracking twitter traffic in Google Analytics
Sorry mate, I have been busy with work and am going to be for the greater part of today. I will, however, do my best to find time to answer you later today. So fear not, I have not forgotten about you.
-
RE: Tracking twitter traffic in Google Analytics
The easy way for you to do it is like this:
The URL you want to tweet: www.yourdomain.com/yourart.html
Now you need to add some parameters to the end of the URL, for example:
medium=twitter and date=08043011
So the URL will look like this:
www.yourdomain.com/yourart.html?medium=twitter&date=08043011
Pass it through a URL shortener like bit.ly or Google's own, and tweet it.
In GA you should now be able to filter your search so you can check the traffic the tweet generated, because you can track the URL and see how many came to your article by that URL. Next time you tweet something, just change the date and you can track that specific URL. The same thing goes for Facebook, email, etc.
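If you would rather not glue those URLs together by hand, here is a minimal Python sketch of the same idea, using only the standard library. The parameter names (medium, date) are just the example ones from above, not anything GA requires:

```python
# Append tracking parameters to a URL, keeping any query string it already has.
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_url(url, **params):
    """Return the URL with the given tracking parameters added."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update(params)
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_url("http://www.yourdomain.com/yourart.html",
              medium="twitter", date="08043011"))
# -> http://www.yourdomain.com/yourart.html?medium=twitter&date=08043011
```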
Actually, there's a WAY easier way to do it:
Grab your link and pass it through bit.ly. bit.ly will generate a URL for you; use this for the tweet. You can now track traffic (in bit.ly) from Twitter, Facebook, etc.
Clearly the easiest way... you don't even have to think.
-
RE: Tracking twitter traffic in Google Analytics
"I suppose this can be done in Traffic sources > All traffic sources. Am i right ?" sounds about right or in other words, yes
"I really, really would like to know what specific tweet traffic was generated."
Each time you make a new tweet where you want to link to an article on your site, you should run through steps 1-7 and then use the link generated. If I were you, though, I would read some tutorials on URLs and query strings to get the know-how behind what you're doing. That way you won't be shooting in the dark and will be able to write those URLs by hand.
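To make that less abstract, here is a tiny Python sketch of what a query string actually is, pulling apart the tagged URL from earlier in this thread (the parameter names are still just the hypothetical examples):

```python
# Split a tagged URL into the pieces you can filter on in GA.
from urllib.parse import urlparse, parse_qs

url = "http://www.yourdomain.com/yourart.html?medium=twitter&date=08043011"
parts = urlparse(url)
print(parts.netloc)           # www.yourdomain.com  -> the host
print(parts.path)             # /yourart.html       -> the page being linked to
print(parse_qs(parts.query))  # {'medium': ['twitter'], 'date': ['08043011']}
```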
When doing SEO, SEM and SMM, it's a good idea to have a basic understanding of HTML/CSS and of how URIs/URLs work, and to know the Analytics/AdWords tool set by heart.
To be a good SEO: you should be able to read and write HTML, CSS and JavaScript, plus have a basic understanding of DNS records and domains.
To be a great SEO: you should be able to do all of the above, write some basic PHP/ASP/ASP.NET, and have at least a basic understanding of servers.
Hope it helps. If there is something you need me to elaborate on, please tell me what and I'll try to do so.
-
RE: How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well... just to be on the safe side.
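For reference, the basic robots.txt mechanism for pointing crawlers at a sitemap is the Sitemap directive; a quick sketch with a hypothetical domain:

```
# robots.txt for yoursite.dk (hypothetical domain)
User-agent: *
Disallow:

# The Sitemap directive must be a full, absolute URL.
Sitemap: http://yoursite.dk/sitemap.xml
```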
-
RE: How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
You can't submit a sitemap in GA, so I'm guessing you mean GWT (Google Webmaster Tools).
Whether or not you put it in the robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be to have a web within a web (a folder for each site on the server) and then have the .htaccess redirect to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do in them. For a how-to guide, ask Google; that's what I always do when I need to goof around in the .htaccess. I hope it made sense.
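For what it's worth, a rough sketch of the kind of host-based redirect described above, assuming Apache with mod_rewrite (the folder name /dk is hypothetical):

```apache
RewriteEngine On

# Send requests that arrive on the .dk domain into the /dk folder,
# unless they already point there (avoids a rewrite loop).
RewriteCond %{HTTP_HOST} ^(www\.)?yoursite\.dk$ [NC]
RewriteCond %{REQUEST_URI} !^/dk/
RewriteRule ^(.*)$ /dk/$1 [L]
```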
-
RE: Tracking twitter traffic in Google Analytics
If you don't care which tweet the visitor comes from, then you can just see it in GA. But if you want to know which specific tweet the traffic was generated by, then yes, you need to do something like what the article stated.
You need to build a new URL for each tweet, or you won't be able to track the specific tweet.
I hope it made sense; otherwise, ask me to clarify and I'll do so.
-
RE: Tracking twitter traffic in Google Analytics
You could use a URL shortener like bit.ly, but GA can tell you this without the use of all that, so why not just check the traffic sources in GA?
-
RE: How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools. There you have each domain registered and can submit sitemaps for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set it up in the .htaccess to test for the entrance domain and make sure to redirect to the right file. Any bot will enter a site just like a browser, so it needs to obey the server; if the server tells it to go somewhere, it will.
The robots.txt can't, by itself, do what you want. The server can, however. But in my opinion, using Bing and Google webmaster tools should do the trick.
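As a rough sketch of what "test for the entrance domain and redirect to the right file" could look like, again assuming Apache with mod_rewrite (the file names sitemap-dk.xml and sitemap-com.xml are hypothetical):

```apache
RewriteEngine On

# Serve a language-specific sitemap depending on which domain
# the crawler entered through.
RewriteCond %{HTTP_HOST} ^(www\.)?yoursite\.dk$ [NC]
RewriteRule ^sitemap\.xml$ /sitemap-dk.xml [L]

RewriteCond %{HTTP_HOST} ^(www\.)?yoursite\.com$ [NC]
RewriteRule ^sitemap\.xml$ /sitemap-com.xml [L]
```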