The Evolution of Web Filtering
In the past, blocking website content by URL and ports 80/443 was all you needed to do. Employees accessed the internet at office locations by connecting directly to the corporate LAN, so managing internet access with legacy web filters and their basic block/allow rules worked fine.
However, as the internet evolved, so did the types of websites and applications employees accessed both on-site and away from the office. The rapid shift of functionality to the server sparked the meteoric rise of cloud-based applications and changed the way content is delivered over the web.
"Network administrators need next-generation web filtering to effectively stop the internet traffic they don’t want."
This change exposed the inflexibility of legacy web filters and rendered their basic block/allow rules ineffective. Today, websites and applications proactively hop ports to improve their reliability and speed. Growing functionality means parts of a website or application are essential for one group of users but not for another. And as more traffic moves to HTTPS, an ever-larger share of it is invisible to legacy web filters and simply passes through unchecked.
Network administrators need next-generation web filtering to effectively stop the internet traffic they don’t want on their network (like bandwidth-hogging games, media streaming and torrent downloads), while optimizing performance of mission-critical, cloud-based apps. Next-generation web filtering solves these problems by giving network administrators complete visibility into and control over all content traversing the network.
Network Visibility and Control
Bandwidth control is another requirement that most legacy solutions fail to address. With content quality now typically high definition or above, controlling how much bandwidth websites and applications consume is key to maintaining reliable internet access. Left unchecked, Netflix, YouTube, social media and other recreational traffic can quickly overwhelm network performance. Using Layer 7 deep packet inspection (DPI) and SSL inspection, Untangle identifies recreational traffic even when users attempt to bypass controls, and applies traffic shaping to prevent network bottlenecks.
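To illustrate the shaping idea in the abstract (this is not Untangle's implementation, just a common textbook mechanism), traffic shapers are often built on a token bucket: recreational flows are allowed short bursts but are held to a sustained rate, preventing them from crowding out mission-critical traffic. A minimal sketch, with hypothetical rate and capacity values:

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: permits bursts up to `capacity` bytes,
    refilling at `rate` bytes per second. Packets that exceed the available
    tokens would be queued or dropped by the enforcement layer."""

    def __init__(self, rate, capacity):
        self.rate = rate              # sustained rate, bytes/sec
        self.capacity = capacity      # maximum burst size, bytes
        self.tokens = capacity        # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes):
        """Return True if a packet of `nbytes` may be sent now."""
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False

# Example: cap a recreational flow at 1000 bytes/sec with 2000-byte bursts.
bucket = TokenBucket(rate=1000, capacity=2000)
```

A first 1500-byte packet drains most of the bucket, so a second one immediately after is held back until the bucket refills; that back-pressure is what smooths recreational traffic into a bounded share of the link.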
Control doesn’t stop at managing bandwidth usage for the web and applications; managing what kind of content users can access is just as important. Some users need access to certain applications or web resources, while others must be restricted from them to meet regulatory compliance or data privacy requirements. Session viewers and event logs identify, in real time, individual users and the specific applications that are negatively impacting the network. Once identified, network administrators can stop users from streaming videos on Netflix, downloading illegal content with P2P clients, or hiding their activities with apps like UltraSurf. If needed, they can also control specific application functions. For example, one group of users may require access to all of Facebook, while another group must be restricted from certain functionalities like Facebook games or videos.
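The per-group, per-function control described above can be thought of as a first-match-wins rule table. Untangle's actual policy engine is configured through its administration interface; the sketch below is purely illustrative, with hypothetical group, application, and function names:

```python
# Hypothetical policy table: each rule scopes an action to a user group,
# an application, and an application function ("*" matches any function).
# Rules are evaluated in order; the first match wins, and the default
# action for unmatched traffic is "allow".
RULES = [
    {"group": "marketing", "app": "facebook", "function": "games", "action": "block"},
    {"group": "marketing", "app": "facebook", "function": "video", "action": "block"},
    {"group": "staff",     "app": "netflix",  "function": "*",     "action": "block"},
]

def decide(group, app, function):
    """Return 'block' or 'allow' for a flow already classified by DPI."""
    for rule in RULES:
        if (rule["group"] == group
                and rule["app"] == app
                and rule["function"] in ("*", function)):
            return rule["action"]
    return "allow"
```

Under this table, marketing users keep general Facebook access but lose games and video, while a different group with no matching rule sees all of Facebook unchanged.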