Focus on Linux
Re: Linux Hardening Oct 21 2007 11:32AM
Liran Cohen (theog rct co il) (2 replies)
Re: Linux Hardening Oct 23 2007 01:40AM
Greg Metcalfe (metcalfegreg qwest net)
On Sunday 21 October 2007 04:32:18 am Liran Cohen wrote:
> I completely agree, providing you have the time and don't have a couple
> dozen Linux machines to maintain daily. In many cases you have to
> make a sensible choice about what is worth more, or in other words assess
> where the risk is higher and invest most of your effort there.
Unless you're in a very strange environment, you shouldn't be having too much
trouble maintaining a couple of dozen Linux machines. When you get a chance,
you might want to look through USENIX archives (maybe more specifically SAGE
papers), etc. It's not uncommon for a small group to maintain hundreds of
Unixy machines. Automation and a solid infrastructure are your friends.
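The automation needn't be fancy, either. As a minimal sketch (the hostnames are
made up), a dry-run loop that would dispatch one command to every machine over
ssh:

```shell
#!/bin/sh
# Hypothetical host list; substitute your own machines.
HOSTS="web1 web2 db1"

run_everywhere() {
    # Print the command we'd dispatch; drop the echo to actually run it
    # over ssh once you trust the script.
    for h in $HOSTS; do
        echo "ssh $h -- $1"
    done
}

run_everywhere "uptime"
```

With key-based authentication in place, the same loop scales from a couple
dozen machines to hundreds.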

> Ajai Khattri wrote:
> > On Wed, 17 Oct 2007, Liran Cohen wrote:
> >> what is the machine's location on your network (LAN/DMZ, etc.)? What is
> >> the machine's role? You should ask yourself some questions before
> >> approaching hardening. I would not put the same effort into a machine
> >> located on my LAN as I would into making sure that DMZ
> >> machines are protected.
> >
Spot on.

> > I believe even machines on internal networks should all run local
> > firewalls at the very least. There's always some Windoze user using
> > Outlook and clicking on an email attachment they shouldn't click on...
*Nothing is always*. Sorry, but that's a *very* bad mind-set to propagate on a
security mailing list.

What you're referring to is probably quite appropriate on an Ethernet of mixed
Windows and Linux systems. But in some cases you can increase efficiency and
security by subnetting. A few machines doing continuous builds, for instance,
probably don't need more than ssh access. If you have a retired machine, use
it as a gateway into a build-farm subnet.
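As a rough sketch, a default-deny ruleset on that gateway admitting only ssh
might look like the following (the admin subnet address is hypothetical;
adjust for your own addressing):

```shell
# Sketch only -- run as root on the gateway box.
# Default-deny everything inbound and forwarded.
iptables -P INPUT DROP
iptables -P FORWARD DROP
# Keep replies to established connections flowing.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
# Only ssh from the (hypothetical) admin subnet reaches the gateway.
iptables -A INPUT -p tcp -s 192.168.1.0/24 --dport 22 -j ACCEPT
```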

Firewalls do burn CPU cycles. How much depends upon the environment, what your
rule set looks like, whether you're doing centralized logging, etc. It always
pays to test, if for no other reason than that you always learn things, whether
it's the ins and outs of optimizing packet-filtering rules, regular expressions
useful for parsing log files, setting up NTP so that your logs are synced
up, or whatever.
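For instance, a throwaway regular expression for pulling failed ssh logins out
of a syslog-style auth log (the sample lines below are invented):

```shell
#!/bin/sh
# Two invented auth.log lines: one failed login, one successful one.
sample='Oct 21 04:32:18 web1 sshd[2211]: Failed password for root from 10.0.0.5 port 4122 ssh2
Oct 21 04:33:02 web1 sshd[2214]: Accepted publickey for greg from 10.0.0.9 port 4130 ssh2'

# grep -o prints only the matching part, so the output is just the
# user and source address of each failure.
printf '%s\n' "$sample" | grep -Eo 'Failed password for [^ ]+ from [0-9.]+'
```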

That's what you tell management, anyway. The real reason you do it is that
automating the daily trivia away is in itself a learning experience, is tons
of fun, and a source of huge leverage. With automation (often just sets of
bash scripts) harried admins can often get out from behind the 8-ball, and
start having serious fun.

More time means you can learn more, and you'll do a far better job of
hardening Linux than running a script (Bastille) from a group of people who
have a history (over several years) of periodically halting development. They
probably had their reasons--things do come up. But it still argues against
depending upon a third-party tool to secure your nets and nodes. The
threatscape evolves--sometimes quite rapidly. In the final analysis, there's
no substitute for local knowledge.

We're just lucky that it's so much frapping *fun*.

Re: Linux Hardening Oct 22 2007 06:13PM
Ansgar -59cobalt- Wiechers (bugtraq planetcobalt net)