Auto-Generated META tag on 404 Page
-
I'm currently creating a 404 error page for my site, and I noticed that a similar site uses some sort of code to automatically generate a meta title. Is this useful?
For instance, type in electrolux.com/john.
That page does not exist, but in the title you'll see "John | Electrolux".
How can I do this on my site?
-
I'm not 100% sure what you are asking, but I do have pages where the title tag or meta description is generated dynamically by a PHP script. It's not that hard to do.
In my case, I pull information from a database and build the title from what comes back. Once I have my title, I insert it in place of the title tag:
<title><?php echo $title; ?></title>
If you're trying to do what the Electrolux page does, you can use data from PHP's $_SERVER array: http://php.net/manual/en/reserved.variables.server.php
But I'm not sure what purpose it would serve.
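To make that concrete, here is a minimal sketch of how an Electrolux-style 404 page might build its title from $_SERVER. This is my own illustration, not Electrolux's actual code: the site name, file name, and the ucfirst formatting are assumptions.

```php
<?php
// 404.php — build a page title from the requested URL, e.g. /john -> "John | My Site"
http_response_code(404); // keep returning a real 404 status so these pages aren't indexed

// Grab the last path segment of the requested URL, e.g. "/john" -> "john"
$path    = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$segment = basename(trim($path, '/'));

// Escape it before echoing into HTML, and fall back to a generic title
$segment = htmlspecialchars($segment, ENT_QUOTES, 'UTF-8');
$title   = ($segment !== '') ? ucfirst($segment) . ' | My Site'
                             : 'Page Not Found | My Site';
?>
<!DOCTYPE html>
<html>
<head>
  <title><?php echo $title; ?></title>
</head>
<body>
  <h1>Sorry, that page doesn't exist.</h1>
</body>
</html>
```

Note the http_response_code(404) call: whatever you put in the title, the page should still return a 404 status, otherwise search engines may treat every mistyped URL as a real, indexable page.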
-
Hi Tyler,
From an SEO point of view this is useless; you'd be better off spending the time elsewhere.