Easy-to-use encryption tools

Last week I attended Next Generation Threats in Stockholm, where a hot topic was privacy. There are a number of encryption tools out there, but which ones should we really use? This easy question unfortunately has no easy answer. In my last post I showed how to use PGP to encrypt email messages, but PGP is not easy to use with webmail providers. PGP works, but one can't call it user friendly, and user friendliness matters a great deal.

Most users do not care how technology works; they simply want to use it. If it is secure, great, but security is not something most users are too concerned with. And even if you are concerned with security, the more user friendly a tool is, the better. The problem is: when somebody claims their software is secure, how can you trust them? As a speaker at the conference pointed out, the more buzzwords you put in, the more likely you are to attract users. "NSA-proof" was one such buzzword, a term I would never use myself, but it seems very popular among software vendors these days.

Personally, I prefer open source tools over closed source ones. Why? Not because I am proficient enough to check the source code and validate the claim of being secure myself, but because I trust the community to help me with exactly that. We are all good at different things, and by going open source, these companies give us, as a community, a chance to validate their efforts to keep our communication secure. One obvious example is Open Whisper Systems, which currently has two apps that let me, as a user, communicate privately. The reason I use their software instead of any number of closed source products is simply that their source code is available to everyone. I am not an expert on cryptography, nor am I a great programmer, but the Internet community as a whole has plenty of people who are good at these things. I put my trust in their ability, rather than in a closed source project that can claim whatever it wants without ever having to prove it. Another great project is Tor, with its browser bundle as an example, which is also open source. Tor is mainly about anonymity rather than encryption, but the idea of enabling secure communication is basically the same. Secure communication can mean a lot of things, but for me, both of these projects are at the core of private, anonymous and secure communication.

If you are a user who does not care about online privacy or about your data, perhaps you should think again. The idea behind "I have got nothing to hide" may not protect you in the future. If governments and other parties are able to obtain your data and eavesdrop on your communications, that power can be abused. It has happened before, and it will happen again. Remember that your online history and communications can be stored away unless you protect them, and some day in the future that data could come back to haunt you. So I will leave you with a simple piece of advice, the same advice one of the speakers offered.

If your device offers encryption, use it; simple as that. Many of the devices you use have encryption available, and some even have it enabled by default. Encryption is available to most of you, so start using it.

Encrypting email sent with webmail

Webmail is very popular and there are a lot of services out there: Gmail, Outlook.com, and many others. Your email is easily accessible from any computer, so no wonder so many people use it. However, when you need to send email securely, most webmail providers have no solution. I am not talking about using SSL to encrypt your webmail session, because that has zero impact on the email you send and receive unless the messages themselves are encrypted. SSL only protects your browser traffic, nothing else. Your emails are basically postcards, visible to anyone who happens to be listening, either on your network or along the path the email travels. So don't use plain email for trade secrets, simple as that.

If you are going to send trade secrets by email, do use encryption tools. I use a MacBook at home, and I also use Gmail. So how can I send and receive encrypted email, and digitally sign my messages? I use a free toolkit called GPGTools to help me.

One thing is extremely important to remember when using encryption with webmail: never write cleartext in the webmail interface. If you do, the webmail provider will most likely save or cache the information in cleartext, even if you encrypt it in a later step.

Instead, use a text editor like TextWrangler to write your email, then select all the text and choose Application Menu -> Services -> OpenPGP: Encrypt.


You will then have to select your recipient's key to encrypt the message, and only after the text has been encrypted do you copy the contents into your webmail compose window. That way, no cleartext is ever submitted to the webmail provider; all the encryption happens on your client. If you do not see OpenPGP in your text editor's menu, open System Preferences -> Keyboard -> Shortcuts -> Services and make sure OpenPGP is enabled for text handling.

This is of course not ideal, but if you have information worth protecting, it is worth the extra effort to encrypt your message offline. To learn how to set up your public/secret key pair and other things related to GPGTools, visit their knowledge base.
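If you prefer the command line over the Services menu, the same encrypt-before-pasting workflow can be done with the gpg binary that GPGTools installs. A minimal sketch, using a throwaway keyring and a hypothetical recipient address (requires GnuPG 2.1 or later; in real use you would import your correspondent's public key instead of generating one):

```shell
# Work in a temporary keyring so we do not touch the real one
export GNUPGHOME="$(mktemp -d)"

# Generate a throwaway key pair standing in for the recipient's key
gpg --batch --passphrase '' --quick-generate-key "alice@example.com" default default

# Write the message in a local file, never in the webmail window
printf 'meet me at noon' > message.txt

# Encrypt to the recipient; --armor gives ASCII output you can paste into webmail
gpg --batch --yes --armor --recipient "alice@example.com" --encrypt -o message.asc message.txt

# The recipient decrypts with their secret key; prints the original plaintext
gpg --batch --quiet --decrypt message.asc
```

The armored contents of message.asc are what you paste into the webmail composer, so the provider only ever sees ciphertext.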

Using virtualized domain controllers only

For a long time there has been a question of whether best practice requires at least one physical server acting as a domain controller even when you are running a virtualized environment. Some say you should have one, but if you are looking at this question without legacy software to deal with, I think you can do with virtual domain controllers only.

This of course requires a few things. First, your domain controllers must run Windows Server 2012, which can be an issue for many. Second, you need a virtualized environment that is distributed, preferably using HA and DRS in VMware; you can do without HA and DRS, they simply speed things up a bit in case of failure. I do not cover Hyper-V in this post, but the idea is basically the same.

Third, your vSphere hosts must use NTP to sync their time against a reliable source. Time syncing and time drift have been major concerns in virtualized environments for years. One could easily think that you should use VMware Tools to sync your domain controllers with vSphere, but you should not. Instead, set the domain controller that holds the PDC emulator role to sync its time using NTP (w32time, basically) against the same time source as your vSphere hosts, and let your other domain controllers sync against the PDC. Since the PDC emulator role can move between domain controllers, use a WMI filter to make sure that whichever domain controller currently holds the role always syncs its time. Thanks to Markus Lassfolk for teaching me this, good tip.
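The time configuration above can be sketched with w32tm on the PDC emulator. This is a configuration fragment run in an elevated cmd prompt on Windows, and pool.ntp.org is a placeholder; substitute whatever NTP source your vSphere hosts actually use:

```shell
:: Run on the DC currently holding the PDC emulator role.
:: Point w32time at the same external source as the vSphere hosts.
w32tm /config /manualpeerlist:"pool.ntp.org" /syncfromflags:manual /reliable:yes /update

:: Restart the time service and force a resync
net stop w32time && net start w32time
w32tm /resync

:: All other DCs should simply follow the domain hierarchy instead:
:: w32tm /config /syncfromflags:domhier /update
```

To scope the GPO carrying this NTP configuration to whichever domain controller holds the PDC emulator role at the moment, the usual WMI filter is `Select * from Win32_ComputerSystem where DomainRole = 5` (DomainRole 5 means PDC emulator).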

There are two other things I would like to cover in this post. The first is domain controller cloning, which allows for safe cloning of domain controllers instead of going through the usual installation and dcpromo routine. A domain controller may run other software and carry specific settings that are nice to simply keep when deploying an additional domain controller. Again, your Windows Server version must be 2012 for this to work. There are a few steps involved in cloning a domain controller and I will not cover them here; I just want to point out that the possibility exists. For more details about any of this, see the link at the bottom of the post.

Last but certainly not least is the question of what happens when you restore a domain controller from a snapshot. In the past, before Windows Server 2012, this could cause serious issues. Active Directory keeps track of every change it makes, such as adding users, but when a domain controller is restored to an earlier date, that logic fails. It fails because the restored domain controller, simply put, reuses change numbers that the other domain controllers have already marked as used. The domain controllers end up out of sync with each other, which is not good. So what changed in Windows Server 2012? Microsoft introduced the VM-Generation ID, a way to track the state of the virtual machine, such as whether it has been cloned or restored from a snapshot. Active Directory uses this information to determine whether the domain controller is up to date before it processes transactions. That way, a domain controller restored from a snapshot will first sync all the changes from the other domain controllers before attempting a write operation. This is a great feature that will make life a lot simpler.

As for VMware, you must be running at least vSphere and vCenter 5.0 Update 2 to support this. This is just a brief overview of the concept; please read the Virtualizing Active Directory Domain Services On VMware vSphere guide for all the details.

No logon servers available to service the logon request with auto logon

This can happen for a number of reasons, but the main culprit seems to be DNS settings. However, if you have auto logon enabled with a domain account, simply wait. When the message "No logon servers available to service the logon request" appears on the screen, Windows will automatically try again after two minutes. I have had this issue, and it resolved itself after the two-minute wait.

Of course this is irritating, so what do you do if you are sure your DNS settings are correct? Try these two GPO settings:

1. Computer configuration -> Administrative templates -> System -> Logon -> “Always wait for the network at computer startup and logon”

2. Computer configuration -> Administrative templates -> System -> Group Policy -> “Startup policy processing wait time”

Set this value to something like 120, which gives two minutes of waiting time.
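If you would rather script these settings than click through the GPO editor, both policies map to well-known registry values. A configuration fragment for an elevated cmd prompt, setting the local-policy equivalents that group policy would otherwise write for you:

```shell
:: 1. "Always wait for the network at computer startup and logon"
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\CurrentVersion\Winlogon" /v SyncForegroundPolicy /t REG_DWORD /d 1 /f

:: 2. "Startup policy processing wait time" -- 120 seconds
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\System" /v GpNetworkStartTimeoutPolicyValue /t REG_DWORD /d 120 /f
```

In a domain you would normally deploy these through a GPO rather than per machine, but the registry form is handy for testing on a single client.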

Do the above, and your domain auto logon error should disappear.

MDT Dirty Environment Found

In Microsoft Deployment Toolkit (MDT) you can sometimes run into something called a dirty environment, which basically means that something went wrong. Either an unexpected error occurred during your task sequence, most likely a forced reboot that MDT was not prepared for, or there are leftovers from a previous task sequence.

You can of course go through the logs for the machine being deployed, but sometimes it is better to sit in front of its screen and watch what happens. In my case, I was deploying IE11 as an application along with some prerequisite packages, which installed fine but caused trouble later on. I noticed that the application step had been running for quite some time, and sure enough, when I got to the screen: Dirty environment found. Watching the task sequence, I saw that nothing happened right after IE11 was installed, but a few applications later something caused Windows to reboot. MDT did not expect this, which triggered the dirty environment error. For me, the simple solution was to edit the IE11 application and tick the box to reboot after install. Once that was done, the error disappeared. MDT rebooted after IE11, which in turn configured the installed updates (the IE11 prerequisite packages), and then simply continued installing the rest of my applications.

As for leftovers, do check out the Final Configuration script by Johan Arwidmark; it really rocks. You can read his blog post about it on Deployment Research.