Disabling NTLM in your Windows environment

NTLM (NT LAN Manager) has been around for quite some time and is a source of problems for network defenders, as there are a number of issues with this form of authentication. NTLM credentials are usually stored in memory and can easily be extracted by an attacker using a tool like Mimikatz, and the credentials can also be used in pass-the-hash attacks. Tools like Responder can harvest NTLM credentials over the network simply by pretending to be the network share a user tried to access. So, getting rid of NTLM should be a priority for many, but where do you start?

Even though Kerberos is now the default authentication protocol, most companies and organisations can't simply turn off NTLM support. A lot of applications and systems still rely on NTLM authentication to function properly.

So, what can you do as a security administrator to move away from NTLM? The number one thing is to get the facts on who and what is actually using NTLM for authentication. In Active Directory, there is a specific GPO setting that lets you audit all the NTLM requests that would be blocked if NTLM were not allowed. Those events are logged and can be viewed in the Event Viewer on your domain controllers and member servers. By enabling this audit trail you can start to see what is actually using NTLM for authentication. If you have a large environment with legacy applications, my guess is that there will be a lot of entries in that particular log. This GPO, as well as the other two GPOs mentioned later in this post, is located at:

Computer Configuration\Windows Settings\Security Settings\Local Policies\Security Options
Network Security: Restrict NTLM: Audit NTLM authentication in this domain

You can set the value to audit only domain accounts or all accounts. If you use local accounts, make sure to set the value to all accounts for a complete log of NTLM use in your environment.

Once the GPO is active, NTLM authentication requests are logged to the operational log located under Applications and Services Logs\Microsoft\Windows\NTLM on every server where the GPO applies.
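
If you want to collect these events with a script instead of clicking through the Event Viewer, something like the following should work. The log name Microsoft-Windows-NTLM/Operational is my assumption of how the log is exposed to Get-WinEvent, so verify it with the first command before relying on the second:

PS1> Get-WinEvent -ListLog "*NTLM*"
PS1> Get-WinEvent -LogName "Microsoft-Windows-NTLM/Operational" -MaxEvents 50 | Select TimeCreated, Id, Message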

Second, once you know who and what uses NTLM in your environment, see if you can migrate them to Kerberos instead, perhaps by registering the Service Principal Names (SPNs) the applications need. From personal experience, even really big commercial products sometimes have yet to migrate from NTLM to Kerberos, and some products require an upgrade or configuration changes. It all comes down to knowing which applications rely on NTLM authentication, which you now have a way of finding out.
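
For an application to use Kerberos, the service needs an SPN registered on the account it runs as. As a rough sketch (the host name and service account below are made up for the example), registering an SPN for a web application could look like this:

PS1> setspn -S HTTP/intranet.example.com EXAMPLE\svc-intranet

The -S switch checks for duplicate SPNs before adding the new one, which is what you want, since duplicate SPNs break Kerberos authentication.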

Some applications will probably not be possible to migrate, and if you have to keep them running, you can make an exception for just those applications. That is the third step. By adding an exception, you can allow a particular server to use NTLM for authentication even if NTLM has been disabled in your domain. This is very useful for keeping your NTLM footprint to a minimum. Look at the GPO listed below.

Network security: Restrict NTLM: Add server exceptions for NTLM authentication in this domain

Now that you know what uses NTLM, and have either migrated those applications or made exceptions for them, you can finally disable NTLM altogether by setting this GPO.

Network security: Restrict NTLM: NTLM authentication in this domain

This is the final step required to disable NTLM for your domain altogether, except for the exceptions you are forced to make for legacy applications. Hopefully you do not have to make any exceptions at all.

Be aware though that if you have missed something, there is a good chance that “something” will break. Also, this is not something you configure on a Friday morning and expect to be done by Friday afternoon; it takes time. However, once complete, you have made it harder for the attackers, as NTLM is no longer available as an attack vector, or at least it is severely reduced. Kerberos is not without issues either, but that will be discussed in another blog post.

WMI filters in Group Policy give me errors

When I started working with WMI filters to use in Group Policy, I was struck by the error “Either the namespace entered is not a valid namespace on the local computer or you do not have access to this namespace on this computer”, the namespace in question being root\CIMv2.

After researching this issue on the Internet, I realised that I was not alone by any means, it seemed like a great deal of people had these problems. They felt their WMI queries were correct but Windows told them differently every time they saved the query. So, what to do about this?

First of all, make sure that you can list the root\CIMv2 namespace. You can do this by using Powershell with the following command:

PS1> (gwmi -namespace "root" -class "__Namespace" | Select Name)

You should see CIMv2 listed; otherwise you have bigger problems and have to head into troubleshooting WMI, perhaps even repairing your WMI repository. You can read more about this at lansweeper.com. Microsoft also has a guide for WMI troubleshooting as well as a specific tool called the WMI Diagnosis Utility.
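
A quick first check of the repository, before you bring out the heavier tools, is the built-in winmgmt command. Note that this only verifies the consistency of the repository, it does not fix anything by itself:

PS1> winmgmt /verifyrepository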

If you do see it, chances are that the error produced in GPMC (Group Policy Management Console) is actually just an irritating bug. I found two tools that can help you build your queries: the PowerShell “gwmi” command and WMI Code Creator from Microsoft. The latter tool can produce code for querying WMI, but the reason I use it is that it easily lets me find all classes and parameters, as well as query the properties on the machine I run it on. It can also query a remote machine. This allows me to check my WMI filters to see that they target the right type of computers, or whatever the case may be.

The “gwmi” command is quite useful as it has a query flag which can be used like this:

PS1> gwmi -Query 'Select * from Win32_OperatingSystem where Version like "6.1%"'

This allows you to run your WMI query and check the output. The example above should produce output if you are running Windows 7. A very neat resource of WMI queries for different operating systems can be found on nogeekleftbehind.com.

WMI filters can be very powerful when employed in Group Policy. Instead of having to link a GPO to every organisational unit that contains workstations running Windows 7, one WMI filter targeting all Windows 7 machines can be used instead. However, a word of caution: it is easy to make a mistake with your WMI filter and end up targeting other machines than the ones intended. This can produce some strange problems, so do test your WMI queries once or twice before deploying them.
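
As an example of how easy it is to target too much: the query earlier in this post matches every machine running a 6.1 kernel, which includes Windows Server 2008 R2, not just Windows 7. Adding the ProductType property (1 means workstation) narrows it down, something along these lines:

PS1> gwmi -Query 'Select * from Win32_OperatingSystem where Version like "6.1%" and ProductType = 1'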

I still get the same error as I started this post with when I try to save a WMI filter in GPMC, but it works nevertheless. The domain controller is fully patched, but that has not resolved the issue. For the time being, it seems that I have to live with this, but as long as it still works, I can deal with it.

Tired of users who spread malware using USB devices?

Then perhaps you should get them a USB device that will teach them a serious lesson. No, I am just kidding; let me explain.

A Russian security researcher nicknamed Dark Purple seems to be building a killer USB device, or rather a computer-frying USB device. It is an interesting way to use a USB device, that's for sure. You can read more about it here. It is not for sale, at least not yet, but it is quite fascinating to read about.

If you wish to employ a little less drastic counter-measures, there are some.

  • Use Active Directory to simply deny the use of USB devices

I know, it sounds impossible, but it is not. It all depends on whether you want to take on the administrative burden of managing exceptions or not. Yes, in a large organisation it will most likely be close to impossible. Even so, it is worth knowing that Active Directory can mitigate the threat from USB devices; the policy paths involved are listed at the end of this post.

  • Malware Protection Engines

Much the same as the above actually: they rely on the class IDs and serial numbers of the USB devices to decide whether to allow or deny access. The same administrative burden awaits.

I am not even going to suggest using superglue on the USB ports, since that is almost never an option. Instead, the most important things you can do about USB devices are training your users and NOT allowing them to do their day-to-day work with local admin privileges. Then make sure you disable autorun and, if possible, never allow code execution on removable devices; the relevant policy paths are sketched below. Stick with those and the USB threat is at least mitigated. Unfortunately, the USB threat is here to stay and will remain a threat to most organisations for a long time.
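
For reference, and from memory so do verify the exact names in your own GPMC, the settings I have in mind live here:

Computer Configuration\Administrative Templates\Windows Components\AutoPlay Policies
Turn off AutoPlay

Computer Configuration\Administrative Templates\System\Removable Storage Access
Removable Disks: Deny execute access (and, if you go the full blocking route mentioned above, All Removable Storage classes: Deny all access)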

SHIPS have set sail, part II

SHIPS, as I wrote about in my last post, is a system for rotating admin passwords developed by TrustedSec. It handles both Linux and Windows systems, which is great. I have read the documentation but have yet to try it; I just want to point out a few things that I noticed when reading it.

1. The initial installation of SHIPS consists of a number of manual steps

This is actually quite necessary, as it also builds understanding of how SHIPS works, which is good. The downside is that it does take some time to get everything up and running. On the other hand, it solves a problem that has bugged sysadmins for years, so I can definitely live with it.

2. Dependencies

The required Ruby packages are easily installed on most Linux distributions, but there is one thing that is perhaps a bit understated, and that is the need for a working PKI (Public Key Infrastructure). The reason is that SHIPS uses SSL when communicating between the client and the server, and the client needs to trust the server's SSL certificate. Your company or organisation should have a CA which can sign the SHIPS server SSL certificate so that it is trusted by your clients. If you don't have this, you will have a harder time setting up the clients. It is possible to work around it by trusting individual self-signed SSL certificates on individual clients, but that is not recommended. There is also an option to run curl (which the Linux clients use to communicate with the SHIPS server) in insecure mode, thereby not validating the SHIPS server SSL certificate. Do not use this; instead, if you plan to use SHIPS, make sure you have a PKI to support it and everything will work more smoothly.

3. Idents

Idents are used for managing objects within SHIPS, such as validating which users can log in to SHIPS to retrieve and set passwords, as well as managing which clients are authorized to connect. Authentication idents can be /etc/shadow, SQLite (database), or an external directory queried over the LDAP protocol for users and computers. The clients themselves are handled in arrays unless you are either using LDAP or simply allowing any client to connect to SHIPS. Allowing any client to connect without validating the client's name is not recommended, as it opens up a possible DoS attack against the SHIPS server database. In an enterprise, most would probably go with the LDAP option, and most enterprises rely on Active Directory. Active Directory Domain Services does speak LDAP out of the box, the domain controllers answer LDAP queries on port 389 by default, but you still need to plan the integration: which account SHIPS should bind with, where in the directory it should search, and how to protect the traffic (more on that below). It is not difficult by any means, but it is an additional step that you should be aware of if you are to implement SHIPS using LDAP as the ident store.
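
A quick way to convince yourself that your domain controllers already answer LDAP queries is to run a directory search from any domain-joined machine, for example with the built-in ADSI searcher in PowerShell (the account name below is made up):

PS1> ([adsisearcher]'(sAMAccountName=jdoe)').FindOne()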

As a last note about idents, it is possible to develop your own ident and integrate it with SHIPS.

4. LDAP

LDAP uses port 389, and one should remember that this is a clear text protocol. If you use LDAP as the ident store for managing the SHIPS interface, there is a possibility of sniffing data between the SHIPS server and the LDAP server. This might not be a very big problem, but it is possible to use LDAP over TLS on port 636, which would have been better. This is something I would like to see added to SHIPS if it is doable. The authentication between the SHIPS administrator's client and the SHIPS server uses SSL, so that part is protected, but not the LDAP requests from SHIPS to the LDAP server.
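
If you want to check whether your domain controllers are even listening for LDAP over TLS before planning around it (they need a server certificate installed for port 636 to be active), a quick port test will tell you; the host name below is obviously made up:

PS1> Test-NetConnection -ComputerName dc01.example.com -Port 636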

Summary

I think SHIPS is a great solution, quite capable even though perhaps a bit tricky to get up and running. Sysadmins who expect a simple installer will be disappointed, but as I stated in the beginning, the manual steps add to one's understanding of how SHIPS really works, and that is very important. I will try to test this once I have everything set up to really support SHIPS, not just something thrown together to get by. SHIPS looks like a quality tool and a great project, and I want to test it in the best way I can. I will write about my experience testing SHIPS later on.

SHIPS have set sail

I was most pleased when I saw the release of the SHIPS software from TrustedSec. The problem of managing local admin accounts could be a thing of the past with this tool, and the best thing about it is that it is open source. The idea behind SHIPS is to rotate the local admin password with a randomly generated one. It is client and server based: the server part holds the passwords in encrypted form, and the client part sets the actual password for the local admin user on every box where the SHIPS client is installed. It can be installed on laptops, desktops and servers; it does not really matter, as long as it is running Windows. The communication between the SHIPS server and client relies on HTTPS, so nothing is transmitted or stored in clear text.

The most used solution today, a tool called AdmPwd, does not support encryption in the publicly available version; passwords are stored in clear text in Active Directory. Not everyone can read that attribute, but it would feel better knowing those passwords were indeed encrypted. With SHIPS, that problem is solved.

This looks like a great boost for everyone on a blue team, as this has been and still is a real hassle. It will definitely make life harder for any penetration tester. I can't wait to try this out. Thanks to TrustedSec for releasing this tool, awesome job; now I just wait for a Linux version!