Problems in Perl Filters

 by b0iler
  lecture given on May 17th in irc.unixhideout.com #bsrf
 Written for : - my site full of other cool tutorials - a legendary site full of original tutorials

--- b0iler has changed the topic to: Blacksun Research Facility - - lecture in progress: "Problems in Perl Filters" - msg questions during lecture to b0iler.
--- b0iler sets modes [#bsrf +m]

<b0iler> Common Filtering Problems in Perl.
<b0iler> --intro
<b0iler> This lecture may also be helpful to other languages, but the exact syntax and ideas are for perl.
<b0iler> This lecture will be +m and all questions will be msg'd to me. Msg me questions anytime and I will either answer them right away or save them till the end.
<b0iler> This lecture may go at a slow pace, this is so I can think things out.. since my planning of this lecture was pretty weak. If you get bored, too bad XD~
<b0iler> --
<b0iler> The Main Concepts in Evading Perl Filters Are:
<b0iler> Complete lack of filters.
<b0iler> Filters that forget characters.
<b0iler> Filters that are in the wrong order.
<b0iler> Filters that filter previous filters (or filter themselves! I'll explain later).
<b0iler> And multiple inputted variables forgotten in the filters.
<b0iler> --
<b0iler> The What and Why of Perl Filters:
<b0iler> Let's get into what perl filters are and why they are so important in terms of security.
<b0iler> Filters are how perl programmers stop bad things from happening. It's my term for anything that changes or checks input.. most of the time it is just a regex.
<b0iler> For example, if you do:
<b0iler> $blah = `cat $ENV{'QUERY_STRING'}`;
<b0iler> Then attackers can easily input something like:
<b0iler> script.cgi?/etc/passwd
<b0iler> Which would read your password file, or they could be even more tricky and do something to this effect:
<b0iler> script.cgi?file.txt;rm -rf anything/
<b0iler> (need to url encode some characters)
<b0iler> So perl programmers filter out characters which can do bad things.
<b0iler> This is a pretty good idea, and almost every script should have some kind of filtering system set up. Although even one flaw in a filtering system can lead to a lot of security headaches.
<b0iler> --
<b0iler> Types Of Filtering:
<b0iler> There are two main types of filtering, they are:
<b0iler> input
<b0iler> output
<b0iler> The input filtering is the most used, and is usually the most serious for security.
<b0iler> It comes before any action is taken on the user input. This will stop any bad characters from affecting the actions of the script.
<b0iler> Many people make the mistake of only filtering input. Although not always necessary, output filtering is very useful in stopping file reading vulnerabilities, cross site scripting, and other attacks.
<b0iler> Output filtering is filtering things right before they are outputted to the client, database, file, or other outputs.
<b0iler> Sometimes the output filtering may look pointless, but data may have been changed throughout the script's execution, so checking to make sure nothing bad is outputted can be a good idea even when you are fairly sure nothing bad can be.
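To make output filtering concrete, here is a minimal sketch (my own illustration, not from any particular script) that escapes html metacharacters right before printing, no matter what earlier code did to the data:

```perl
#!/usr/bin/perl
# Minimal output-filter sketch (illustrative, not from a real script):
# escape html metacharacters at the last moment, right before output.
use strict;
use warnings;

sub escape_html {
    my ($s) = @_;
    $s =~ s/&/&amp;/g;    # & must go first, or we double-escape the rest
    $s =~ s/</&lt;/g;
    $s =~ s/>/&gt;/g;
    $s =~ s/"/&quot;/g;
    return $s;
}

my $data = '<script>alert("xss")</script>';
print escape_html($data), "\n";
# prints &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

The order of the substitutions matters: escaping & last would mangle the entities the earlier lines just produced.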
<b0iler> --
<b0iler> Ways of Filtering:
<b0iler> There are also two main ways of filtering:
<b0iler> Filtering bad input
<b0iler> Allowing good input
<b0iler> There are also other possible ways to filter, such as length checking, pattern checking, and other odd ones.
<b0iler> Filtering bad input is the most used, and the one with the most mistakes by the programmers.
<b0iler> When filtering bad input it is extremely easy to forget something or not know of a feature in perl or a feature in an external program which your script uses.
<b0iler> These forgotten filters can lead to vulnerabilities very easily. It just takes the attacker a while to think up creative ways to evade the filters or to do something a different way - if one way is filtered, do it a different way which isn't filtered.
<b0iler> Allowing good input is the preferred way of stopping bad input from becoming a security problem.
<b0iler> This is because you are only letting the good characters get by, and stopping all the possible bad combinations which would have been hard to catch by filtering bad input.
<b0iler> Sometimes allowing input is almost impossible if you want to give users any flexibility. You cannot always hold them to a set of characters, so you need to make a decision on how much importance should go into security and how much into usability.
<b0iler> Here is an example of filtering bad input:
<b0iler> $blah = $ENV{'QUERY_STRING'};
<b0iler> $blah =~ s/\;//;
<b0iler> print `cat $blah`;
<b0iler> This will stop people from doing `cat file.txt;touch file2.txt` (using the ; to issue another command).
<b0iler> But if you read my "Hacking CGI - Security and Exploitation" tutorial ( ) then you will see the many different ways to do things in system commands.
<b0iler> It is extremely hard to stop all the possible combinations of bad input individually. So instead let's take a look at only allowing good input:
<b0iler> $blah = $ENV{'QUERY_STRING'};
<b0iler> $blah =~ s/[^a-zA-Z0-9\.\-_]//g;
<b0iler> print `cat $blah`;
<b0iler> This will stop anything that might not be good in a filename from being cat'd. But there is another method of allowing good input that I prefer.
<b0iler> This method is denying access to anything if the user inputs a character not allowed:
<b0iler> $blah = $ENV{'QUERY_STRING'};
<b0iler> if($blah =~ m/[^a-zA-Z0-9\.\-_]/){ die "bad characters, only allowed a-zA-Z0-9 . - and _\n"; }
<b0iler> print `cat $blah`;
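As a side-by-side sketch of those two allow-good-input styles (the function names are mine, purely for illustration; real code would die or abort instead of returning undef):

```perl
#!/usr/bin/perl
# Side-by-side sketch of the two allow-good-input styles above.
# Function names are illustrative, not from the lecture's scripts.
use strict;
use warnings;

# Style 1: silently strip everything outside the allowed set.
sub strip_bad {
    my ($in) = @_;
    $in =~ s/[^a-zA-Z0-9.\-_]//g;
    return $in;
}

# Style 2: refuse the input entirely if anything disallowed appears.
sub allowed_or_undef {
    my ($in) = @_;
    return undef if $in =~ /[^a-zA-Z0-9.\-_]/;   # real code would die here
    return $in;
}

print strip_bad('file.txt;rm -rf /'), "\n";   # prints file.txtrm-rf
```

Note that stripping can quietly turn one valid-looking name into another, which is one reason the deny-outright style is usually safer.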
<b0iler> Ok, enough with the background info. Let's learn how to break these things, shall we?
<b0iler> --
<b0iler> My Exploiting Filters Process:
<b0iler> The way I do it is to first look for common mistakes in filters. I don't even really think about what should or shouldn't be allowed in.
<b0iler> I first see if they are filtering bad input or only allowing good input. If it is only allowing good input then my testing will most likely be short.
<b0iler> I will see what I can do with the allowed characters, most likely not much. Then I will go over the rest of the script and see if anything in particular can be harmed with any of the allowed characters. These are mostly design flaws which the filters allow to be exploited.
<b0iler> If they are filtering bad input, then things get a bit more fun =)
<b0iler> Before even looking at what the script does I will see if they forgot any of the basics:
<b0iler> Did they forget any bad characters?
<b0iler> Can the filter be evaded with character insertion?
<b0iler> Do they filter in the correct order?
<b0iler> Did they forget to filter any user input?
<b0iler> I make notes of any possible problems and then I go look over what the script does in detail. After I look over the design of the script I look for specific calls which can be abused.
<b0iler> I then go back to the filters and see if any combo of allowed characters can abuse these calls.
<b0iler> If nothing pops out at me I'll go sit and think about ways to evade the filter. Sometimes I'll have to do a lot of testing in order to see how the filter works in certain situations and whether anything can be slipped by.
<b0iler> I learned a lot from testing filters for holes when I was new to perl auditing.
<b0iler> Just brainstorming about possible ways to defeat filters is about the best advice I can give.
<b0iler> It is also a very good idea to know a lot about the other factors at play. Many times you can find a little-known feature in something which the script does not filter for. If the programmer didn't know a character did something then they will probably not filter for it.
<b0iler> --
<b0iler> Actually Exploiting Filters:
<b0iler> This will hopefully be the bulk of the lecture, and where you learn the most important bits of information. This is in no way a complete list or anything, I hope some of you find new methods of evading filters and share them with me ( )
<b0iler> I do not wish to feed the script kiddies out there by giving away exact ways to exploit filters, but it's a necessary evil in order for others to learn security.
<b0iler> I'll start with directory traversal filters. The basics being these:
<b0iler> $blah =~ s/\.\.//g;
<b0iler> $blah =~ s/\.\.\///g;
<b0iler> $blah =~ s/\.//g;
<b0iler> $blah =~ s/[^\w\._\-]//g;
<b0iler> There are more, but these are a few of them. The first one filters for the string '..'. So to evade this one you can do something like:
<b0iler> $blah = '.\./';
<b0iler> Which will get by the filter and still go back a directory. This trick also works for the next one, which filters for '../'
<b0iler> But this one has another problem. It takes out any string that matches '../'; this means 'ab../cd' would turn into 'abcd', and something like:
<b0iler> $blah = '.../...//';
<b0iler> Now after the filter removes every '../' from $blah it becomes '../'. To fix this we need to add a loop to the filter, deny the user input, or replace the string with something other than nothing.
<b0iler> A loop example would be:
<b0iler> while($blah =~ /\.\.\//){ $blah =~ s/\.\.\///; }
<b0iler> Denying user input example:
<b0iler> if($blah =~ /\.\.\//){ die "illegal string in input.\n"; }
<b0iler> replacing the string with something example:
<b0iler> $blah =~ s/\.\.\//_/g;
<b0iler> All of these would stop the '.../...//' attack. The last one would turn '.../...//' into '._._/'
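Here is the whole trick as a runnable snippet: a single pass of the /g filter reassembles a '../' from the leftovers, while the looped version removes everything.

```perl
#!/usr/bin/perl
# The '.../...//' trick as a demo: one pass of the /g filter
# reassembles '../' from the leftovers; looping it removes everything.
use strict;
use warnings;

my $attack = '.../...//';

my $single = $attack;
$single =~ s/\.\.\///g;                              # one pass leaves '../'
print "one pass: '$single'\n";    # prints one pass: '../'

my $looped = $attack;
$looped =~ s/\.\.\///g while $looped =~ /\.\.\//;    # repeat until gone
print "looped:   '$looped'\n";    # prints looped:   ''
```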
<b0iler> Now the 3rd filter works well for getting rid of any reverse directory traversals ('../'), but it cannot be used if the '.' is needed for input. There isn't really any way around this unless you specify a full pathname starting at root, e.g. '/etc/passwd', which has no '.' in it.
<b0iler> The 4th is the best of these filters; it is only allowing what we know is good. No surprises that we didn't think of can get in there. It might be a good idea to also filter for '..' with this one just in case.
<b0iler> Let's move on from directory traversal filters and discuss the main techniques used to defeat filters.
<b0iler> +Complete lack of filters
<b0iler> This is a pretty obvious way of getting past filters, if they don't exist they cannot stop you. ;)
<b0iler> Many times programmers who have no clue about security don't put any filters in, or they forget one or two needed filters.
<b0iler> Always take the time to think about everything the user inputs and whether you should filter it or not. When in doubt I would say at least filter down to the allowed characters just to be safe.
<b0iler> +Filters that forget characters
<b0iler> This is close to the complete lack of filters one, but this time the programmer was trying to be secure and just forgot a key string/character.
<b0iler> Most of the time it is because the programmer does not know that the string they missed can be used to cause damage.
<b0iler> For instance, a newbie linux user might not know that && can be used to issue additional commands. So they forget this filter even when filtering for | and ;
<b0iler> sorry, *nix user
<b0iler> print `cat /etc/passwd && less /etc/hosts`; #to demonstrate how multiple commands can be used with &&
<b0iler> In order to deal with any type of shell commands I would suggest being an expert in the shell, learning a lot about the different ways perl can open a shell, and filtering really well (only allowing what is good).
<b0iler> Check my paper: for more examples of this technique.
<b0iler> +Filters that are in the wrong order
<b0iler> It is a good idea to take time out and think of what order your filters should come in. If you filter something in the wrong order problems can occur.
<b0iler> The most widely available example of this would be something like:
<b0iler> $blah =~ s/aa//g;
<b0iler> $blah =~ s/%([a-fA-F0-9][a-fA-F0-9])/pack("C", hex($1))/eg; #convert url encoding to ascii
<b0iler> A way to defeat this 'aa' filter would be to url encode your 'aa' as '%61%61'. This next filtering problem is almost the same as this one.
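As a quick runnable check of that ordering bug (the payload here is a harmless placeholder):

```perl
#!/usr/bin/perl
# Wrong-order demo: the 'aa' filter runs before url-decoding, so
# '%61%61' slips through and only becomes 'aa' afterwards.
use strict;
use warnings;

my $blah = 'bl%61%61h';
$blah =~ s/aa//g;                                            # nothing to remove yet
$blah =~ s/%([a-fA-F0-9][a-fA-F0-9])/pack("C", hex($1))/eg;  # decode: now it's 'aa'
print "$blah\n";    # prints blaah
```

The fix is simply to decode first and filter second, so the filter sees what the rest of the script will see.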
<b0iler> +Filters that filter previous filters
<b0iler> This is a strange one I found pretty early in my perl journey. This one is similar to the '.../...//' trick described earlier.
<b0iler> What happens is that one filter is looking for a string, and another filter changes the string later.
<b0iler> So an attacker can get a string to pass the first filter, and then have the next filter(s) change the string into a dangerous one.
<b0iler> An example is easier to follow:
<b0iler> $blah =~ s/<!--(.|\n)*-->//g;
<b0iler> $blah =~ s/javascript//ig;
<b0iler> Above we have the most commonly used filter to prevent ssi, and a filter for 'javascript'.
<b0iler> The way to get around this is to use the 'javascript' filter in order to change a string into ssi. So if:
<b0iler> $blah = '<javascript!-- #exec cmd="rm -rf /home/you/www" -->';
<b0iler> Then $blah will pass the first filter without getting changed, and then be turned into '<!-- #exec cmd="rm -rf /home/you/www" -->' which is a bad string, and would have been filtered by the first filter.
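The same thing as a runnable demo; I've swapped the destructive payload for a harmless `id` command:

```perl
#!/usr/bin/perl
# One filter manufacturing the string another filter was meant to stop.
# The payload is swapped for a harmless `id` here.
use strict;
use warnings;

my $blah = '<javascript!-- #exec cmd="id" -->';
$blah =~ s/<!--(.|\n)*-->//g;    # no literal '<!--' yet, so no match
$blah =~ s/javascript//ig;       # this splices the ssi comment together
print "$blah\n";    # prints <!-- #exec cmd="id" -->
```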
<b0iler> This technique can also be used with only one filter. You've already seen this in the '.../...//' example I talked about.
<b0iler> +Multiple inputted variables forgotten in the filters
<b0iler> This technique is used more on the output side of things (html, flat databases, etc..), or when dealing with filepaths. But I've seen it turn up in odd places, so keep an eye open for it.
<b0iler> The deal with this problem is that your filters may work perfectly; they may filter out every possible bad string, and all the other techniques used to evade them are stopped. But.. when the bad string is split between two variables your filters do not find it.
<b0iler> Here is an example with filepaths (pretend all directory traversal is stopped):
<b0iler> $blah =~ s/\.\.//;
<b0iler> $file =~ s/\.\.//;
<b0iler> open(FILE, "/home/user/${blah}$file");
<b0iler> If $blah = '.'; and $file = './anotheruser/file'; the whole filename becomes: '/home/user/../anotheruser/file'
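Runnable version of the split-variable problem, with the open() left out so it just shows the path that would be opened:

```perl
#!/usr/bin/perl
# Split-variable demo: each half passes the '..' filter on its own,
# but the concatenated path walks up a directory.
use strict;
use warnings;

my $blah = '.';
my $file = './anotheruser/file';
$blah =~ s/\.\.//;                 # no '..' in either piece alone
$file =~ s/\.\.//;
my $path = "/home/user/${blah}$file";
print "$path\n";    # prints /home/user/../anotheruser/file
```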
<b0iler> This problem is seen a lot when filters try to stop ssi or cross site scripting. If two variables are printed to html, then you need to make sure together they aren't evading your filters.
<b0iler> A fairly hard problem to fix indeed. Perhaps putting all output into one variable and then filtering that output is the best solution.
<b0iler> --
<b0iler> That's pretty much it for the lecture. I will just throw in one more evasion technique for a poor perl filter that is used all the time.
<b0iler> $blah =~ s/<!--(.|\n)*-->//g;
<b0iler> I've already said that this filter is used to stop ssi. But from my testing there is a way around this filter. That is if:
<b0iler> $blah = '<!-- #exec cmd="rm -rf /home/you/www" - ->';
<b0iler> Notice the space in '- ->' at the end.
<b0iler> Now $blah will pass the filter and will still get parsed by ssi. At least from my testing; if you can, please try this on your box and email me whether '- ->' works (try it with include, not exec, as you might have exec disabled).
<b0iler> These kinds of character-insertion attacks can be a big problem if you don't know all the pieces of the system and the details of how they work. You can also find new ways to exploit very common filters this way, so experiment a lot with what is and isn't possible.
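Whether a given server's ssi parser accepts '- ->' you will have to test yourself, but it is easy to show that the string sails past the filter (the filename here is just a placeholder):

```perl
#!/usr/bin/perl
# Whether a server's ssi parser accepts '- ->' is server-dependent,
# but the string provably sails past the comment-stripping filter,
# since the regex needs a literal '-->' to match.
use strict;
use warnings;

my $blah = '<!--#include virtual="somefile" - ->';
(my $filtered = $blah) =~ s/<!--(.|\n)*-->//g;
print $filtered eq $blah ? "passed untouched\n" : "stripped\n";   # prints passed untouched
```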
<b0iler> --
<b0iler> Now time for discussion. I will try to answer any questions (if you have any) or help you better understand anything. This part might be edited in the log posted to the bsrf website.
<b0iler> Please try to refrain from talking unless you have a question or are adding to the discussion. Do not talk about other things which aren't regarding perl filters.
--- b0iler sets modes [#bsrf -m]
<Strider> nice lec b0iler =)
<b0iler> no questions worries me. either it was good or people don't care much.
<luCky> is this gonna be on a website
<b0iler> luCky: yes it will be on tonight and maybe if they upload it
<Vegas> pvt now
<dksk8> lol
<Strider> LMAO!
<Strider> rofl
<dksk8> roflmao
<dksk8> no questions just <Vegas> b0iler YOU ARE A FUCKING BASTARD
<b0iler> To conclude the lecture I would like to comment on how obese cyrus is.
<b0iler> <yank> Cyrus: eat a duck.
<b0iler> <yank> I mean dick.
<b0iler> <b0iler> he'll eat both. and then a chicken.

About 4 hours after my lecture I reread it and noticed that I completely forgot to mention a couple of things which my "Hacking CGI - Security and Exploitation" paper covers.  The first is how perl's greedy regex matching finds strings: it will match the first part (starting at the beginning), and then find the last part by backtracking from the end.  This is dangerous if you are trying to correct any string.  Read the paper for a better explanation, but here is a quick example:

$blah = '<script><script>alert("this is not filtered well");</script></script>';
$blah =~ s/<script>(.*)<\/script>/[script]$1[\/script]/ig;

This will result in the s/// finding the first <script> and then going to the end and finding the last </script>.  It will only replace these two with the []'s; it will leave the middle ones untouched.  So $blah will be: [script]<script>alert("this is not filtered well");</script>[/script]
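The same greedy-match example, shortened so it runs on its own:

```perl
#!/usr/bin/perl
# Runnable version of the greedy-match example: .* grabs everything
# between the FIRST <script> and the LAST </script>, so the inner
# pair of tags survives inside the replacement.
use strict;
use warnings;

my $blah = '<script><script>alert("x");</script></script>';
$blah =~ s/<script>(.*)<\/script>/[script]$1[\/script]/ig;
print "$blah\n";    # prints [script]<script>alert("x");</script>[/script]
```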

The other is quite simple: the assumption that .* will match any character.  Or at least that's what many scripts seem to think.  Without the /s switch, perl's . matches any character up until a newline, so the pattern fails across lines unless /s is given or the newline is specifically matched.  This means trouble for things like:

$blah = "<script\n>alert('unfiltered');</script\n>";
$blah =~ s/<(.*)>/[$1]/g;

Now this will match the first <, then it will match some characters, but then it runs into a newline before the closing >.  So the pattern does not match and the filter does nothing, but the html tags still work (at least in the browsers I tested).
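Here it is as a runnable check, including the /s fix:

```perl
#!/usr/bin/perl
# Runnable version of the newline trick: without /s, .* stops at \n,
# so the filter never matches and the tags pass through untouched.
# Adding /s (or matching \n explicitly) closes the hole.
use strict;
use warnings;

my $blah = "<script\n>alert('unfiltered');</script\n>";
(my $broken = $blah) =~ s/<(.*)>/[$1]/g;     # no match: each '>' is past a \n
(my $fixed  = $blah) =~ s/<(.*)>/[$1]/gs;    # /s lets . cross newlines
print $broken eq $blah ? "filter missed it\n" : "filtered\n";   # prints filter missed it
```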

If you find any more problems in perl filters please email me about them with a clear description and some examples.

[-----]  - is my homepage

I've got tons of tutorials, mini-tutorials, advisories, and code written by me there.  Come check out what I'm up to and possibly learn a bit.  This lecture was originally given for but anyone has permission to mirror it as long as it is mirrored in whole and proper credit is given to the author.  Also a link to would be nice.