How to make SEF URLs for a PHP/MySQL web site
-
Hi mozzers!
I'm fairly new to the SEO topic, but I'm learning fast because of all of you, so please accept my warm thanks first!
The problem: I have a web site based on PHP/MySQL that has no SEF addresses. It was built with an unknown CMS, so I cannot use any ready-made extensions or modules; I have to write my own SEF extension.
The question: Could you please suggest an article or an approach for making my URLs search engine friendly? What is best to use: .htaccess or something else?
This is the aforementioned web site: www.nortrak.bg
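To make the question concrete, this is roughly the kind of .htaccess rule I imagine I would need. It is only a sketch: the "page" and "slug" parameter names are placeholders, since I don't yet know what the CMS actually reads.

```apache
# Turn the rewrite engine on (needs Apache's mod_rewrite enabled)
RewriteEngine On
RewriteBase /

# Leave real files and directories (images, CSS, JS) alone
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# Map a friendly URL like /products/blue-widget to the internal
# query-string URL; "page" and "slug" are placeholder parameter names
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?page=product&slug=$1 [L,QSA]
```

The CMS side would then need to look records up by slug, which I suppose means adding a slug column to the MySQL table and changing the site's internal links to emit the friendly form.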
Thanks a lot,
Kolio
-
Hello Everyone,
I'm new here and just getting back into SEO (a little bit) after not doing anything 'myself' for a couple of years. My question is along the same lines. Currently my individual URLs show as: https://www.example.com/index.php?l=product_detail&p=107 (dynamic responsive site).
I can switch it to a static site, so the individual product pages read as: https://www.example.com/catalog/category name/product name-107.html
It's still a long URL, but it would be keyword rich. Some of my current dynamic pages are indexed, and due to an upgrade I had to do several months back, I already have some 301 redirects from my .php URLs to the format listed above. This is my long explanation for my following questions:
-
Does having a dynamic or static site matter when ranking in search engines?
-
I already have some redirects coming from my older site to this dynamic site, so I would have to make more redirects from the dynamic site to my static site. Is this okay to do?
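To illustrate what I mean, the new round of redirects would look roughly like this. It's a sketch only; the category and product names are made up, and mod_rewrite has to test the query string in a RewriteCond because RewriteRule can't see it:

```apache
RewriteEngine On

# 301 the old dynamic product URL to its static replacement
# (category/product names here are illustrative)
RewriteCond %{QUERY_STRING} (^|&)l=product_detail(&|$)
RewriteCond %{QUERY_STRING} (^|&)p=107(&|$)
RewriteRule ^index\.php$ /catalog/widgets/blue-widget-107.html? [R=301,L]
```

The trailing ? strips the old query string from the target. And rather than chaining old URL to dynamic URL to static URL, I assume I should update the existing 301s to point straight at the final static address.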
I'm really at a loss. A couple of years ago I ranked 1-3 (on page 1) on Google for all my keywords (all white-hat work), and now I'm in the great abyss of no-man's-land of the internet (ranked on page 3+).
Thank you for any and all help from everyone!
~Sandra
-
-
Thank you Saibose!
What should I do with old URLs that are already in the SERPs? Should I make redirection rules for them separately? I've found two more articles that clarify the SEF topic:
Explanation of the problem and a tool for generating .htaccess rules
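For my own notes, the pattern for redirecting an old indexed query-string URL seems to be something like this (a sketch; "id" is a placeholder for whatever parameter the CMS really uses):

```apache
RewriteEngine On

# Permanently redirect an old dynamic URL that is still in the SERPs
# to its new SEF address; %2 is the id captured by the RewriteCond
RewriteCond %{QUERY_STRING} (^|&)id=([0-9]+)(&|$)
RewriteRule ^index\.php$ /page/%2? [R=301,L]
```

The %2 backreference comes from the last matched RewriteCond, and the trailing ? drops the old query string from the redirect target.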
-
I would suggest that, depending on the number of pages, you do either a .htaccess-based redirect or an Apache mod_rewrite rule.
Here is an article which could help you.
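To sketch the two options (the URLs here are illustrative): mod_alias is simplest for a handful of known pages, while mod_rewrite is better when many URLs follow one pattern.

```apache
# mod_alias: one line per known page
Redirect 301 /old-page.html https://www.example.com/new-page.html

# mod_rewrite: one rule covering a whole pattern of URLs
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```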
Related Questions
-
How to make Google index your site? (Blocked with robots.txt for a long time)
The problem is that for a long time we had a website, m.imones.lt, but it was blocked with robots.txt.
Intermediate & Advanced SEO | FCRMediaLietuva
But after a long time, we now want Google to index it. We unblocked it 1 week or 8 days ago, but Google still does not recognize it. I type site:m.imones.lt and it says it is still blocked with robots.txt. What should be the process to make Google crawl this mobile version faster? Thanks!
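For what it's worth, an unblocked robots.txt for the mobile host need only be this minimal (the Sitemap line is an assumption about where the sitemap lives):

```
# robots.txt for m.imones.lt: allow all crawlers everywhere
User-agent: *
Disallow:

# Sitemap location is an assumption
Sitemap: https://m.imones.lt/sitemap.xml
```

An empty Disallow line permits everything; the old blocking file presumably had "Disallow: /" instead.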
New site causes massive drop-off in rankings; old site restored, how long to recover?
Hello, We launched an updated version of our site, mainly design changes and some functionality. 3 days after the launch we vanished from the rankings; previous page-one results were now out of the top 100. We have identified some of the issues with the new site and chose to restore the old, well-ranking site. My question is: how long might it take for the rankings to come back, if at all? The drop happened on the third day and the site was restored on the third day. We are now on day 6. In GWT we have used Fetch as Google and resubmitted the sitemap. Any help would be gladly received. Thanks, James
Intermediate & Advanced SEO | JamesBryant
(Australia) Changing .net.au to .com.au - web dev is refusing to do a 301 redirect and wants to run two sites?
After years using a .net.au site, my client has purchased the .com.au version of the same domain. I've now set up a new, responsive website using a WordPress template with new content, but used a similar page structure. I've asked their web developer to do a 301 permanent redirect from each old page on the .net.au site to its new .com.au page, but he has refused, saying it would be bad for long-term SEO. Instead, he says they should run both sites (which I thought would cause duplicate content issues). Both domains are hosted with the same company. I thought as long as the 301 redirects were done on a page-by-page basis, there were no issues? I'm no SEO expert (which he claims to be), so I just wanted to get another opinion on what best practice would be in this instance.
Intermediate & Advanced SEO | carolineraad
Spammy sites that link to a site
Hello, What is the best and quickest way to identify spammy sites that link to a website, and then remove them (Google disavow?)? Thank you, dear Moz community; I appreciate your help 🙂 Sincerely, Vijay
Intermediate & Advanced SEO | vijayvasu
Philosophy & Deep Thoughts On Tag/Category URLs
Hello, SEO Gurus! First off, my many thanks to this community for all of your past help and perspective. This is by far the most valuable SEO community on the web, and it is precisely because of all of you being here. Thanks! I've recently kicked off a robust niche biotech news publishing site for a client, and in the first 6 weeks we've generated 15K+ views and 9,300 visits. The site is built on the WordPress platform. I'm well aware that a best practice is to noindex tag and category pages, as I've heard SEOs say that they potentially lead to duplicate content issues. We're using tags and categories heavily, and to date we've had just 282 visits from tag and category pages. So that's 2.89% of our traffic; the vast majority of traffic has landed on the homepage or article pages (we are using author markup). Here's my question, though, and it's more philosophical: do these pages really cause a duplicate content issue? Isn't Google able to determine that a given page is a tag page, and thus not worthy of duplicate content penalties? If not, then why not? To me, tag/category pages are sometimes better pages to have ranked than article pages, since, for news especially, they potentially give searchers a better search result (particularly for short-tail keywords). For example, if I write articles all the time about the "Mayo Clinic," I'd rather have my evergreen "Mayo Clinic" tag page rank on page one for the keyword "mayo clinic" than just one specific article that very quickly drops out of the news cycle. Know what I mean? So, to summarize: 1. Are doindexed tag/category pages really a duplicate content problem, and if so, why the heck? 2. Is there a strategy for ranking tag/category pages for news publishing sites ahead of article pages? Thanks as always for your time and attention. Kind Regards, Mike
Intermediate & Advanced SEO | RCNOnlineMarketing
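For context, the difference between a doindexed and a noindexed tag page comes down to one robots meta tag in the page head (illustrative markup; in WordPress an SEO plugin normally sets this per archive type rather than it being hand-edited):

```html
<!-- Tag archive kept out of the index, but with its links still crawled -->
<meta name="robots" content="noindex, follow">

<!-- Tag archive deliberately left indexable -->
<meta name="robots" content="index, follow">
```

Keeping "follow" in the noindex variant lets the tag page continue passing crawlers through to the articles it lists.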
SEO Overly-Dynamic URL Website with thousands of URLs
Hello, I have a new client who has a Diablo 3 database. They have created a very interesting site in which every "build" is its own URL. Every page is a list of weapons and gear for the gamer. The reader may love this, but it's a nightmare for SEO. I have pushed for a blog to help generate inbound links and traffic, but overall I feel the main feature of their site is a headache to optimize. They have thousands of pages indexed in Google, but none is really its own page. There is no strong content, no H tags, or any real substance at all. With a lack of definition for each page, Google sees this as a huge ball of mess, with duplicate page titles and too many on-page links. The first thing I did was tell them to add a canonical link, which seemed to drop the errors down by 12K, leaving only 2,400 left... which is a nice start, but the remaining errors are still a challenge. I'm thinking about seeing if I can either find a way to give each page its own blurb and H tag, or simply noindex the nav bar and all the links in the database. That way the site is left with only a handful of URLs plus the blog and forum. Thoughts?
Intermediate & Advanced SEO | MikePatch
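The canonical link mentioned above is a single line in each page's head, pointing every variant of the same build listing at one preferred URL (the URL here is illustrative):

```html
<!-- Placed in the <head> of every variant of the same build listing -->
<link rel="canonical" href="https://www.example.com/builds/barbarian-whirlwind/">
```

That is why the error count dropped: the parameter-laden variants stop competing as duplicates once they all declare the same canonical address.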
New web site - 404 and 301
Hello, I have spent a lot of time on the forum trying to work out how to deal with my client's situation. I will tell you my understanding of the strategy to apply, and I would appreciate it if you could tell me whether the strategy will be okay. CONTEXT: I am working on a project where our client wants to replace their current web site with a new one. The current web site has at least 100,000 pages. The new web site will replace all the existing pages of the current site. The strategy the client wants to adopt is to retire each old page and 301 redirect it; every page would be redirected to a page that makes sense on the new web site. But after reading other answers and the following comment, I am starting to be concerned: '(4) Be careful with a massive number of 301s. I would not 301 100s of pages at once. There's some evidence Google may view this as aggressive PR sculpting and devalue those 301s. In that case, I'd 301 selectively (based on page authority and back-links) and 404 the rest.' I have also read about performance issues... QUESTION: So, if we suppose that we can manage to map each of the old site's pages to a page on the new web site, is it a problem to do so? Do you see a performance issue or a potential devaluation issue? If it is a problem, please comment on the strategy I might consider suggesting: identify the pages for which I gain links; from that group, identify the pages that give me most of my juice; 301 redirect them; and for the others, create a really great 404... Thanks! Nancy
Intermediate & Advanced SEO | EnigmaSolution
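At 100,000+ pages, one RewriteRule per page is unmanageable; Apache's RewriteMap keeps the old-to-new mapping in a single lookup file instead. Note that it has to live in the server or vhost config, not in .htaccess, and the file paths below are made up for illustration:

```apache
# In the vhost config (RewriteMap is not permitted in .htaccess):
RewriteEngine On
RewriteMap oldnew "txt:/etc/apache2/redirect-map.txt"

# If the requested URI appears in the map, 301 it to the mapped target
RewriteCond ${oldnew:%{REQUEST_URI}} !=""
RewriteRule ^ ${oldnew:%{REQUEST_URI}} [R=301,L]

# redirect-map.txt holds one "old new" pair per line, e.g.:
#   /old/page-1.html /new/page-a/
```

This also answers the performance worry to a degree: the map is a single file lookup per request rather than thousands of rules evaluated in sequence.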
One site or five sites for geo targeted industry
OK, I'm looking to generate traffic from people searching for accommodation. I'm a big believer in the quality of the domain used for SEO, both for the direct benefit of having keywords in it and for the effect a good domain can have on CTR. So I'm considering these options: 1. Build a single site using the best, broad keyword-rich domain I can get within my budget, something like CheapestHotelsOnline.com. Advantages: just one site to manage/design, one site to SEO/market, and better potential to resell the site for a few million bucks. 2. Build 5 sites, each catering to a different region, using 5 matching domains within my budget, such as CheapHotelsEurope.com, CheapHotelsAsia.com, etc. Advantages: I can use domains that are many times 'better' by adding a geo-qualifier, which should help with CTR and search, and I can be more targeted with SEO and marketing. So hopefully you see the point. Is it worth the dilution of SEO and marketing activities to get the better domain names? I'm chasing long-tail searches whatever I do, so I'll be creating 5K+ pages, each targeting a specific area. These would be pages like CheapestHotelsOnline.com/Europe/France/Paris or CheapHotelsEurope.com/France/Paris to target search terms for hotels in Paris. So with that thought, is SEO even 100% diluted? Say, a link to the homepage of the first option would end up passing 1/5000th of its value through to the Paris page, while a link to the second option would pass 1/1000th of the link juice through to the Paris page. By that logic, one only needs to do 1/5th of the work for each of the 5 sites... that implies the total SEO work would be the same? Thanks as always for any help! David
Intermediate & Advanced SEO | OzDave