Replacing a PHP Contacts Directory with a .Net one (both using LDAP)

The university has an excellent and well implemented email and user account system. When I started at the university in 2004 it was possible to access every single member of staff and current student from an email client and see the staff member’s telephone number, which is more impressive than it sounds, especially for back then. In fact it is so well implemented that when I left to work for a blue chip company in 2008 I was surprised that they didn’t have the same capabilities and they were surprised at what we had here. Even now that student accounts have moved to Office365 the capabilities seamlessly remained the same – it really is well set up and the Network Services team continue to do a great job. *

As a public body the university is required to make all the contact details for its members of staff publicly accessible. This was implemented as a PHP form which queried the contacts directory via LDAP. When we migrated the university website to a .Net cloud-hosted CMS this was one of the parts which we didn’t know how to replicate, and it was left behind. With the impending retirement of the old server, two years of .Net development experience and, possibly most importantly, experience of .Net Web Applications within the confines of our particular CMS, we have now replaced it with a .Net solution within the CMS.

The Good

Connecting to an LDAP service is simple in PHP and it is also straightforward in C#. Besides the benefit of being able to store helper classes and functions in the App_Code directory, the built-in classes in System.DirectoryServices.Protocols are well documented. It’s not that PHP can’t be written in a clear and concise way – it can, just look at Symfony – but C# makes clean code the path of least resistance (you can still end up with monolithic functions if you want, though). I also find the code formatting in Visual Studio saves me having to decide on a code style: just Ctrl-K, Ctrl-D.
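For flavour, here is a minimal sketch of a search using System.DirectoryServices.Protocols. The hostname, base DN and attribute names are placeholders, not our real directory’s values:

```csharp
using System;
using System.DirectoryServices.Protocols;

public class LdapSearchSketch
{
    // Build the filter separately so it can be tested without a directory
    public static string SurnameFilter(string surname)
    {
        return "(sn=" + surname + ")";
    }

    // Hypothetical host and base DN -- substitute your own
    public static SearchResponse SearchBySurname(string surname)
    {
        using (LdapConnection connection = new LdapConnection("ldap.example.ac.uk"))
        {
            connection.AuthType = AuthType.Anonymous;
            connection.Bind();

            SearchRequest request = new SearchRequest(
                "ou=staff,dc=example,dc=ac,dc=uk",   // base DN
                SurnameFilter(surname),              // LDAP filter
                SearchScope.Subtree,
                "cn", "telephoneNumber", "mail");    // attributes to return

            return (SearchResponse)connection.SendRequest(request);
        }
    }
}
```

The PHP equivalent (ldap_connect, ldap_bind, ldap_search) is comparably short; the difference is in how certificates and assemblies are handled, as below.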

The Bad

System.DirectoryServices.Protocols is not included with .Net Web Applications by default. It also isn’t included in the Web Application which our CMS Front end servers are built around. If you were developing a Web Application from scratch this wouldn’t be a problem, and it also wouldn’t be much of a hassle if we were hosting it ourselves on a single machine. Unfortunately we have a cloud-hosted, proprietary Web Application which we do not update. This means that if something in App_Code references an assembly which hasn’t been installed on a CMS Front end server, that server will fail to serve any pages at all – in other words, it will bring the website down.

We used to have a bespoke piece of functionality which our vendor had written for us and installed as a custom DLL. This had caused issues when the CMS was updated: if that DLL wasn’t copied to the correct folder while a reference to it remained in the Web.config, any Razorviews which used it would crash. We took the decision that nothing in App_Code (which brings the whole site down if it fails to compile) could reference a DLL which had the possibility of not being there.

This isn’t a show-stopper, but it was annoying not to be able to write a ContactsDirectory class in App_Code to perform all the heavy lifting, because it would need to reference System.DirectoryServices.Protocols. Instead we have a half-way house where common functions which do not connect to LDAP are stored in App_Code, but the actual connection handling is performed in each of the Razorviews. This means that if the DLL becomes unavailable the failures will be limited to Razorviews on pages instead of the whole site.
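As an illustration of that half-way house (the class and method names here are hypothetical, not our actual helper): pure string handling is safe in App_Code because it references nothing beyond the base framework, while the LdapConnection code stays in the Razorviews.

```csharp
using System;
using System.Text.RegularExpressions;

// Safe for App_Code: no reference to System.DirectoryServices.Protocols,
// so a missing assembly on a Front end server cannot take the whole site down
public static class ContactsHelper
{
    // Tidy a user-supplied search term before a Razorview builds a filter from it
    public static string NormaliseTerm(string term)
    {
        if (string.IsNullOrEmpty(term)) return string.Empty;
        // Trim, and collapse runs of whitespace to a single space
        return Regex.Replace(term.Trim(), @"\s+", " ");
    }
}
```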

The Ugly

Previously the PHP server and LDAP server were both on the same network; our CMS is in the cloud. Firstly that meant opening a hole in our firewall to allow the CMS servers access to the LDAP servers, which was straightforward. Then I hit a snag: a lot of university servers use a self-signed SSL certificate, especially ‘inward-facing’ ones. That’s not a problem if you control the connecting server – just install the university root CA certificate (and again whenever it expires) – but if you don’t have direct access to the server, or new ones may be spun up at peak times, it’s not that easy.

I did find a good way to get around this: store the fingerprint of the university root CA certificate, then write a delegate verification method which builds a certificate chain and accepts any certificate that has the root CA as the final element in the chain.

private bool DelegateVerify(Object sender, X509Certificate certificate) {
    // Thumbprint of your self-signed root CA certificate
    string rootThumbprint = "<your root CA thumbprint>";

    X509Certificate2 certToTest = new X509Certificate2(certificate);
    X509Chain chain = new X509Chain();

    // Normal verification
    if (chain.Build(certToTest)) {
        return true;
    }

    // If the normal check failed then try to allow certificates signed by the Brighton Root CA cert
    chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;
    chain.ChainPolicy.VerificationFlags = X509VerificationFlags.AllowUnknownCertificateAuthority;
    if (chain.Build(certToTest)) {
        // Accept only if the final certificate in the chain matches the stored root CA thumbprint
        X509Certificate2 root = chain.ChainElements[chain.ChainElements.Count - 1].Certificate;
        return root.Thumbprint == rootThumbprint;
    }

    // If that failed then definitely fail
    return false;
}
This is a good way to handle the eventuality that the certificate changes to one signed by a recognised authority, or that the old certificate is replaced (but still signed by the same root). Unfortunately, despite working locally, it would not work on our CMS servers. After some Googling I chased this down to a proxy configuration issue. At that point I realised I would just have to hardcode all the certificates I wanted to accept and resign myself to updating this code if they changed.
NB I changed the signature from bool(LdapConnection, X509Certificate) to bool(Object, X509Certificate). This is something I really only want to write and change once, so it had to go in App_Code – but LdapConnection is in System.DirectoryServices.Protocols.
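Since chain building failed on the CMS servers, the fallback is a callback that accepts only an explicit list of thumbprints. A sketch, with a placeholder thumbprint rather than a real one:

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography.X509Certificates;

public static class CertificatePinning
{
    // Placeholder thumbprints -- replace with those of the certificates you trust,
    // and update this set whenever a certificate changes
    private static readonly HashSet<string> AcceptedThumbprints =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            "0000000000000000000000000000000000000000"
        };

    // Same App_Code-friendly signature: Object rather than LdapConnection
    public static bool DelegateVerify(Object sender, X509Certificate certificate)
    {
        X509Certificate2 certToTest = new X509Certificate2(certificate);
        return AcceptedThumbprints.Contains(certToTest.Thumbprint);
    }
}
```

Cruder than chain building, but it has no dependency on the server being able to reach a CA, which is exactly what the proxy configuration was preventing.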

Putting it all together

With the assembly and SSL certificate issues resolved, the rest of the Contacts Directory search came together very quickly. The delegate verification and all other methods to do with connecting to LDAP which didn’t use anything from System.DirectoryServices.Protocols went into a helper class in App_Code.

There are 3 pages:

  1. A search form which pulls the list of departments from LDAP **
  2. A results form which uses the query string parameter to construct an LDAP filter to search the directory and displays the results in a table.
  3. A details page which uses the query string to pull the details of a given user.
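For the results form, here is a sketch of turning query-string parameters into an LDAP filter. The parameter names are illustrative, but user input should always be escaped (per RFC 4515) before it goes into a filter:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

public static class FilterBuilder
{
    // AND together (attribute, value) pairs from the query string,
    // e.g. sn=Smith & ou=Maths -> "(&(sn=Smith)(ou=Maths))"
    public static string Build(IEnumerable<KeyValuePair<string, string>> terms)
    {
        StringBuilder sb = new StringBuilder("(&");
        foreach (KeyValuePair<string, string> term in terms)
        {
            sb.Append("(").Append(term.Key).Append("=")
              .Append(Escape(term.Value)).Append(")");
        }
        return sb.Append(")").ToString();
    }

    // Escape LDAP filter special characters (RFC 4515)
    public static string Escape(string value)
    {
        return value.Replace("\\", "\\5c").Replace("*", "\\2a")
                    .Replace("(", "\\28").Replace(")", "\\29");
    }
}
```

The details page does the same thing with a single attribute (typically a unique identifier from the query string) rather than a compound filter.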

Each page had one Razorview written for it which references System.DirectoryServices.Protocols to make the actual LDAP connection. So there is some duplication of code, but not as much as I feared.

Final thoughts

When I first started working on this I thought it would be easy. Then I saw how differently PHP and .Net handled the connection. When I understood .Net’s method I thought it would be easy. When I realised System.DirectoryServices.Protocols would cause issues I thought it was difficult. When I realised you could assign a delegate to handle certificate validation I thought it would be easy. By the time I realised the CMS servers wouldn’t build a chain I was fed up with dealing with certificates entirely.

The majority of the time spent on this was down to our self-signed certificates; apart from that, the rest was straightforward. One advantage of bringing the search inside the CMS is that we can now use caching (both Varnish and Context Caching). It’s also now possible to bookmark search terms. It should look virtually the same as the old search, so if I’ve done my job well maybe no one will notice.

* Like Continuity Editors, nobody really notices if they do their job well – it’s only apparent when something goes wrong.

** Using caching – departments do change, but not every day.

What was left behind when we moved the website to a CMS

We used to host the university website on a university server using PHP. We didn’t use a Content Management System (CMS) because MySQL databases were, and still are, unsupported by our IT department, and back then most of the available (free) CMSs were PHP & MySQL (and still are). So the old situation was individually edited pages, with some aspects such as navigation handled through PHP includes. It’s an understatement to say that this was not ideal. We did experiment with concrete5, which is a lovely CMS / framework with an MVC architecture and fantastic on-page editing. Unfortunately its workflow at the time wasn’t able to achieve what we thought we would want from a workflow (if we were to have one), and it too is PHP & MySQL.

In 2014 we re-launched the university website on an externally hosted .Net CMS (Contensis). The process of transferring content and functionality over was difficult but did give us the opportunity to re-evaluate content and the information architecture (IA) of the website. We were able to replace nearly all of the old site with the relaunch, however there were a few parts which for various reasons could not be replaced:

  • School sites
  • Prospectus ordering
  • Contacts directory and academic staff profiles
  • 360 Views for campus locations

School sites

These were in a mix of PHP pages and concrete5 installs. We’ve been slowly migrating the school sites from the old university server and into the CMS since the main site launched. That process now has a hard deadline of March 2017 which should be achievable.

Prospectus ordering

The prospectus ordering comprises a complex PHP form and more importantly an automated fulfilment system which also interacts with our email management system. There have been several suggested replacements over the years which have been investigated until they hit walls. It could be replaced by a CRM or enquiry management system and colleagues are investigating those options.

Contacts directory and academic staff profiles

The Contacts directory was replaced on 2 February – see Replacing a PHP contacts directory with a .Net one.

Academic staff profiles are awaiting a research information system (CRIS) which can store and output staff profiles. Some work has been done replacing the current individually edited PHP pages with individually edited CMS pages.

* In fact this post was meant to be about that, but I spent too long explaining the current situation, so I decided it should be about that instead. Whilst writing this I came up with another idea for replacing the prospectus ordering.