Lorenzo Alberton's Blog:
PEAR::Pager Tutorials
Sep 19, 2006 @ 07:31:52

Lorenzo Alberton has posted a tutorial today about using the PEAR::Pager package to create "pretty links" with a little help from mod_rewrite.

Most PHP pager classes can work just fine with GET parameters, correctly forwarding them through the pages. Few of them let you control the navigation links they create, though. This can be particularly annoying when you have some nice URLs (thanks to some mod_rewrite rules or to your hand-crafted front controller) and the pager class can't respect them, showing the real, ugly links to the world.

If the above scenario is not new to you, then you should probably have a look at PEAR::Pager. It's a fully customizable package that should satisfy all your needs, including your preferred link format.

In his examples, he provides the mod_rewrite rules to use and a sample PHP script that would normally use the $_GET values (from an ugly URL) to paginate the results. He also handles the case where the page number is part of the path rather than just tacked onto the end of the file name.
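The core idea can be sketched without the package itself. Here is a minimal hand-rolled version of what the rewrite rule plus pager combination does; the /news/page/3 path format and the function names are illustrative, not PEAR::Pager's actual API:

```php
<?php
// Assumed mod_rewrite rule feeding this script (illustrative):
//   RewriteRule ^news/page/([0-9]+)/?$ /news.php?pageID=$1 [L]

// Read the page number that mod_rewrite extracted from the pretty URL.
function current_page(array $get): int
{
    $page = isset($get['pageID']) ? (int)$get['pageID'] : 1;
    return max(1, $page);
}

// Build navigation links in the same pretty format, so the pager
// never leaks the real, ugly URL to the world.
function page_link(int $page): string
{
    return sprintf('/news/page/%d', $page);
}

echo page_link(current_page($_GET)), "\n";
```

PEAR::Pager achieves the same effect declaratively, via its link-format options, which is what makes it play nicely with rewritten URLs.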

tagged: pear package pager tutorial mod_rewrite rules get page number

CodeSnipers.com:
Building Clean URLs Into a Site
Jun 27, 2006 @ 15:00:14

On CodeSnipers.com today, Peter Harkins talks about a method, using regular expressions and Apache to turn ugly, GET-laden URLs in your application into clean, search engine friendly URLs without altering the underlying scripts.

So we have two goals. First, requests for the new URL are internally rewritten to call the existing scripts without users ever knowing they exist. Second, requests for the old URLs get a 301 redirect to the new URLs so that search engines and good bookmarks immediately switch to the new URLs.

He starts with a sample .htaccess file, showing a simple RewriteRule to take in requests and remap them back to the old PHP script's input format. He works through a few more changes, noting issues along the way (in case you hit them too), and ends up with a simple, much easier way to achieve clean URL bliss.
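The two goals map onto a pair of rules; the paths and parameter names here are illustrative, not the article's exact example:

```apache
RewriteEngine On

# Goal 1: internally rewrite the clean URL onto the existing script,
# so visitors never see article.php.
RewriteRule ^articles/([0-9]+)/?$ /article.php?id=$1 [L,QSA]

# Goal 2: permanently redirect old-style requests to the clean URL.
# THE_REQUEST matches the original client request line, so the
# internal rewrite above cannot trigger a redirect loop.
RewriteCond %{THE_REQUEST} \?id=([0-9]+)
RewriteRule ^article\.php$ /articles/%1? [R=301,L]
```

The trailing `?` in the redirect target strips the old query string from the new URL.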

tagged: clean url mod_rewrite rewriterule search engine

Hasin Hayder's Blog:
FeedPHP is going to be the largest PHP News Source
Jun 15, 2006 @ 06:04:11

Hasin Hayder has a new note on his blog today about a new site he's developed to aggregate the feeds from other PHP-related sites and blogs, along with a few on other topics.

FeedPHP is a free RSS aggregator which collects feeds, parses them, and displays the titles with a link back to the original source. The whole site is a single page driven by mod_rewrite tricks.

For performance, the contents are cached for several minutes, otherwise my site would be blamed for consuming excess bandwidth from those sites. The site would also be extremely slow from fetching remote content so frequently. Output buffering is used to serve contents quickly.
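The caching approach described can be sketched as a simple file-based cache with a time-to-live; the function names and the use of the system temp directory are illustrative, not FeedPHP's actual code:

```php
<?php
// Cache the result of an expensive callback (e.g. fetching a remote
// feed) in a temp file for $ttl seconds, so repeated page views
// don't hammer the remote sites.
function cached_fetch(string $key, int $ttl, callable $fetch): string
{
    $file = sys_get_temp_dir() . '/feedcache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);   // fresh enough: serve from disk
    }
    $content = $fetch();                   // slow path: fetch the real content
    file_put_contents($file, $content);
    return $content;
}

// Usage sketch: the second call within the TTL is served from cache.
$calls = 0;
$feed  = function () use (&$calls) { $calls++; return 'feed-content'; };
$key   = uniqid('feed-demo-', true);      // unique key so the demo starts cold
cached_fetch($key, 300, $feed);
echo cached_fetch($key, 300, $feed), "\n"; // $feed ran only once
```

A real aggregator would wrap the remote feed request in the callback and use a stable cache key per feed URL.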

There is definitely tons of content here, but my only complaint is that it seems to load each section a bit slowly, even for cached data. Other than that, it's a great resource for an overview of what's happening in the PHP (and related) communities.

tagged: feed community resource rss aggregator mod_rewrite

Simplesem.com:
4 Steps to Make Your PHP Site Indexed Properly
May 12, 2006 @ 05:46:15

In a (very) brief post on Simplesem.com today, there's some suggestions to help you and your site be properly noticed by Google and other search engine spiders out there.

It's a common tendency for Search Engine Optimization specialists to avoid the use of dynamic URLs, and not without reason: search engine spiders don't index URLs overloaded with dynamic parameters.

So if your site is PHP-based and resides on an Apache Server then you might consider carrying out these four simple steps to boost your traffic.

The steps are basic, but they are a good place to start if you're looking at getting started with "search engine optimization". The main suggestion is to use an Apache rewrite rule to turn URL parameters into part of the path (and vice-versa). Obviously, it's not the solution for everyone, as you'd need mod_rewrite enabled (and, depending on your host, access to the server's configuration) to use it.
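As a sketch of that main suggestion (the paths and parameter names here are hypothetical, not the post's own example), a single rule maps path segments back onto the query string the script expects:

```apache
RewriteEngine On

# Turn /category/widgets/page/2 into /list.php?category=widgets&page=2,
# so spiders see a static-looking path while the script is unchanged.
RewriteRule ^category/([a-z0-9-]+)/page/([0-9]+)/?$ /list.php?category=$1&page=$2 [L]
```

The application then only has to emit links in the new path format; the rewrite happens internally on every request.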

tagged: search engine index properly rewrite rule apache mod_rewrite

phpRiot.com:
Creating search engine friendly URLs in PHP
Jan 11, 2006 @ 07:03:54

PHPRiot.com has a new tutorial today dealing with the creation of "search engine friendly" URLs for your site.

One of the major reasons for using a server-side language such as PHP is for the ability to generate dynamic content. Often this will lead to single scripts that produce their content based on the input parameters (that is, the variables in the URL).

This article covers various techniques and methods for representing these parameters in the URL in a clean and "friendly" manner, as well as how to read those parameters back out in your script.

They start off with some examples of what the URLs look like, and move right into how to use Apache's mod_rewrite functionality to take in the URL parameters and map them back to a PHP script. They also use the ForceType directive in Apache to get the server to parse the URL string correctly. They then wrap it all up with the creation of a custom 404 page to handle the errors that might come up, and a summary of the whole project.
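The ForceType trick boils down to reading the extra path segments yourself. A minimal sketch (the parameter names and parse_path_params() are illustrative, not the article's own code):

```php
<?php
// With "ForceType application/x-httpd-php" applied to an extensionless
// file named "article", a request for /article/123/my-title reaches
// that script with "/123/my-title" in $_SERVER['PATH_INFO'].
function parse_path_params(string $pathInfo): array
{
    // Split the path and drop the empty segments from leading/trailing slashes.
    $segments = array_values(array_filter(explode('/', $pathInfo), 'strlen'));
    return array(
        'id'   => isset($segments[0]) ? (int)$segments[0] : null,
        'slug' => isset($segments[1]) ? $segments[1] : null,
    );
}

$params = parse_path_params('/123/my-title');
echo $params['id'], ' ', $params['slug'], "\n"; // 123 my-title
```

When a segment is missing or malformed, the script can fall back to the custom 404 page the article builds at the end.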

tagged: search engine friendly URL apache mod_rewrite ForceType
