wall had its own perimeter space, a DMZ, a demilitarised zone with a public interface (probably to some packet-switching network). In the DMZ files came in, and files were dropped for external systems to pick up, probably using FTP. This was fine when there was little interconnectivity between organisations, but even before the public internet there was a need to exchange data, and a whole set of protocols like EDI grew up to do it (somewhat arthritically). When the public internet and the Web did arrive, the same model was reused, with web servers sitting in the DMZ outside the trusted area.

This is when trust started to leak into the hard perimeter. The web servers needed content and data, and had to send transactions into the internal systems. Sure, copies or abstracts of databases could be staged out in the DMZ, but data still had to flow at some point, and the means of security remained the same: once across the firewall almost everything was addressable, and where access was controlled at all it was by account and password. On the whole, once in a network space, every device was potentially accessible. The combination of web browsers, web services and thin or non-existent interfaces between service and data led to exploits ranging from the minor to the magnificently catastrophic. It still does.

Considerable effort went into network architectures that try to separate applications, data and control, and that works at an infrastructure level: the virtual networks can be kept separate. But there is still the problem of identity and authorisation. Relying on IP addresses and routing rules, whether in firewalls, routers, switches or servers, does not scale, and the same problem is easily repeated in soft networking on cloud architectures.
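To make that last point concrete, here is a minimal Python sketch of the address-based access control pattern described above. The network ranges, addresses and the is_allowed helper are invented for illustration; no particular firewall or cloud product is implied.

```python
import ipaddress

# Trust expressed purely as network location: any host that can present
# a source address inside these ranges is considered "inside".
# These ranges are hypothetical examples.
TRUSTED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/24"),     # e.g. the application tier
    ipaddress.ip_network("192.168.1.0/24"),  # e.g. the database tier
]

def is_allowed(src_ip: str) -> bool:
    """Allow a connection based only on where it appears to come from."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in TRUSTED_NETWORKS)

# The rule says nothing about who or what is connecting: a compromised
# host at 10.0.0.42 is exactly as trusted as the web server next to it,
# and every new service or subnet means another entry in every such list.
print(is_allowed("10.0.0.42"))    # True  - inside the perimeter
print(is_allowed("203.0.113.7"))  # False - outside it
```

The sketch shows the two failures the paragraph names: identity is conflated with network position, and the rule set grows with every subnet, service and peering relationship rather than with the set of actual principals.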