Thursday, March 26, 2015

Firewalls, Firewalls and More Firewalls

CIO: Jimmy, I was at a conference last week, and the common theme there was that firewalls are a thing of the past.  Trends suggest we should start looking at a Next-Gen Firewall to replace what we have.

Engineer: Sir, we just added a UTM solution to our network a month ago???

CIO: I understand, Jimmy.  However, the threat landscape is getting serious, and I need to make sure that our network is as secure as possible.  I need some recommendations for a Next-Gen Firewall by the end of the week.

The exchange above reflects a situation that seems all too common in the IT workplace these days.  As IT engineers, we tend to speak our own jargon of acronyms and catchy phrases.  We’ve become so accustomed to it that at times we forget our audience and just assume that everyone knows what we’re talking about.  It even happens amongst ourselves from time to time.  However, the business side of the house usually has its own set of definitions, many of them shaped by third-party organizations like Gartner.  The misunderstanding between the CIO and the Engineer above presents an opportunity to clear up some of the confusion surrounding the firewall.  Since upper management in many organizations lends significant weight to Gartner’s Magic Quadrant, we’ll use Gartner's definitions to compare and contrast what a Firewall, Next-Gen Firewall, and Unified Threat Management really are.

Firewall
“A firewall is an application or an entire computer (e.g., an Internet gateway server) that controls access to the network and monitors the flow of network traffic. A firewall can screen and keep out unwanted network traffic and ward off outside intrusion into a private network. This is particularly important when a local network connects to the Internet. Firewalls have become critical applications as use of the Internet has increased.” -Gartner 2013

According to Gartner, a firewall is just an application or an entire computer that controls access to the network.  While technically correct, this definition lacks a good amount of clarity.  When most IT engineers think of a firewall on a network, they don't typically picture a software application or a desktop PC/server as the device controlling access in and out of the network.  What comes to mind (at least for me) is a dedicated appliance with a number of ASICs (Application-Specific Integrated Circuits) designed to process network traffic by comparing it against a defined set of rules at Layers 3 and 4 of the OSI model, maintaining stateful sessions, and providing some form of NAT (Network Address Translation).  Sure, there are other services that are common on a lot of firewalls, such as the various VPN (Virtual Private Networking) technologies, but the truth of the matter is that not every firewall supports them.  Similarly, IPS/IDS (Intrusion Prevention/Intrusion Detection Systems) are common services found on a number of firewalls, and, like VPN, there are plenty of firewalls that do not support these technologies either.  So the question becomes: if these services aren't standard on a firewall, and adding them enhances the level of security a firewall can provide, couldn't that make firewalls that do provide them "Next-Generation" firewalls?  After all, Dictionary.com defines "next-generation" as "pertaining to the next generation in a family; also, pertaining to the next stage of development or version of a product, service, or technology" (Dictionary.com, 2014).  One could easily deduce that having VPN and integrated IPS/IDS as part of a firewall would be the next stage of development in the firewall product line, and thereby make it a "Next-Generation Firewall."  Or so one would think, but alas we would be wrong, because Gartner has coined the Next-Generation Firewall phrase with a different meaning.
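
To make that traditional picture concrete, here is a minimal Python sketch of Layer 3/4 rule matching with a stateful session table.  Everything in it (the packet fields, the rules, the session tuple) is a simplified illustration of the concept, not any vendor's implementation:

```python
# Minimal sketch of a classic Layer 3/4 stateful firewall (illustrative only).
from dataclasses import dataclass

@dataclass(frozen=True)
class Packet:
    src_ip: str
    dst_ip: str
    protocol: str   # "tcp" or "udp"
    src_port: int
    dst_port: int

# Ordered rule set: (protocol, destination port, action)
RULES = [
    ("tcp", 443, "allow"),   # permit outbound HTTPS
    ("tcp", 25,  "deny"),    # block outbound SMTP
]
DEFAULT_ACTION = "deny"

# Session table of flows already permitted -- the "stateful" part.
sessions = set()

def filter_packet(pkt: Packet) -> str:
    # Return traffic for a known session is allowed without re-checking rules.
    if (pkt.dst_ip, pkt.src_ip, pkt.protocol, pkt.dst_port, pkt.src_port) in sessions:
        return "allow"
    for proto, port, action in RULES:
        if pkt.protocol == proto and pkt.dst_port == port:
            if action == "allow":
                sessions.add((pkt.src_ip, pkt.dst_ip, pkt.protocol,
                              pkt.src_port, pkt.dst_port))
            return action
    return DEFAULT_ACTION

out = Packet("10.0.0.5", "93.184.216.34", "tcp", 50000, 443)
print(filter_packet(out))    # -> "allow" (matches the HTTPS rule)
back = Packet("93.184.216.34", "10.0.0.5", "tcp", 443, 50000)
print(filter_packet(back))   # -> "allow" (return traffic for a known session)
```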

Next-Generation Firewall
“Next-generation firewalls (NGFWs) are deep-packet inspection firewalls that move beyond port/protocol inspection and blocking to add application-level inspection, intrusion prevention, and bringing intelligence from outside the firewall. An NGFW should not be confused with a stand-alone network intrusion prevention system (IPS), which includes a commodity or nonenterprise firewall, or a firewall and IPS in the same appliance that are not closely integrated.”  -Gartner 2013

To me, this definition sounds a lot like what was just discussed.  However, there are a couple of key requirements in this definition in order for a firewall to be coined "Next-Generation": application-level inspection is the first, and bringing intelligence from outside the firewall is the second.  These are the two additional requirements Gartner has added before one can coin their box a "Next-Generation Firewall."  So what is application-level inspection?  Simply put, it's the firewall's ability to classify traffic by application (Facebook, Outlook, Adobe Flash, etc.) and to write policies based on those classifications.  The second requirement is quite vague, because "bringing intelligence from outside the firewall" could technically be any service not originating from inside the firewall.  One could argue that a firewall downloading IPS/IDS signatures is "bringing intelligence from outside the firewall."  What I find troubling is that some time ago a particular vendor met all of Gartner's requirements as per the definition but wasn't allowed to compete in the Magic Quadrant, because the product was already a leader in the UTM (Unified Threat Management) category and exceeded the requirements of the Gartner definition (Tam, et al., 2012).  There are, however, a number of other products that exceed the required capabilities as per the definition, and those products still appear in the Magic Quadrant.  For example, the definition above makes no mention of offering VPN services, and yet a plethora of devices in the quadrant do.
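
To see what application-level inspection adds, here is a hedged sketch contrasting port-based classification with payload-based classification.  The port map and signature strings are invented for illustration; real NGFWs rely on far deeper, continuously updated analysis:

```python
# Invented, simplified signatures for illustration only.
PORT_MAP = {80: "http", 443: "https"}      # all a port-based rule can see

APP_SIGNATURES = {                          # what application inspection keys on
    b"facebook.com": "Facebook",
    b"outlook.office365.com": "Outlook",
}

def classify_by_port(dst_port: int) -> str:
    # Classic Layer 4 view: every flow on port 443 looks the same.
    return PORT_MAP.get(dst_port, "unknown")

def classify_by_payload(payload: bytes) -> str:
    # NGFW-style view: peek into the payload (e.g., an HTTP Host header or
    # TLS SNI) so policy can say "block Facebook" rather than "block 443".
    for signature, app in APP_SIGNATURES.items():
        if signature in payload:
            return app
    return "unclassified"

print(classify_by_port(443))                                             # -> https
print(classify_by_payload(b"GET / HTTP/1.1\r\nHost: facebook.com\r\n"))  # -> Facebook
```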

Unified Threat Management
“Unified threat management (UTM) is a converged platform of point security products, particularly suited to small and midsize businesses (SMBs). Typical feature sets fall into three main subsets, all within the UTM: firewall/intrusion prevention system (IPS)/virtual private network, secure Web gateway security (URL filtering, Web antivirus [AV]) and messaging security (anti-spam, mail AV).” -Gartner 2013

When we look at Gartner's definition of Unified Threat Management, there are a few additional features that a UTM provides.  On top of the requirements laid out for the Next-Generation Firewall, a UTM provides VPN technologies, Web filtering, Web antivirus, anti-spam, and e-mail AV.  Let's not forget that all of these services have to be offered on the same box.  However, you will note that there is no mention of the UTM having to bring any intelligence into the platform from an outside source.
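
The "converged platform of point security products" idea can be pictured as a single ordered inspection pipeline running on one box.  The sketch below is purely illustrative; the engine names and their checks are hypothetical placeholders, not any real product's feature set:

```python
# Hypothetical placeholder engines; a real UTM's modules are far more involved.
KNOWN_MALWARE = {"deadbeef"}          # pretend hash list of bad attachments
BLOCKED_URLS = {"badsite.example"}

def firewall_check(msg):  return msg.get("dst_port") in {25, 80, 443}
def ips_check(msg):       return b"exploit" not in msg.get("payload", b"")
def url_filter(msg):      return msg.get("url", "") not in BLOCKED_URLS
def mail_av_check(msg):   return msg.get("attachment_hash") not in KNOWN_MALWARE

# One box, one ordered pipeline: traffic must pass every point product.
PIPELINE = [firewall_check, ips_check, url_filter, mail_av_check]

def utm_inspect(msg: dict) -> str:
    for engine in PIPELINE:
        if not engine(msg):
            return f"blocked by {engine.__name__}"
    return "allowed"

print(utm_inspect({"dst_port": 443, "payload": b"hello",
                   "url": "ok.example", "attachment_hash": "cafe"}))
# -> allowed
print(utm_inspect({"dst_port": 443, "payload": b"exploit kit",
                   "url": "ok.example", "attachment_hash": "cafe"}))
# -> blocked by ips_check
```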

Conclusion
The waters have been tremendously muddied by third parties such as Gartner making up their own definitions for these platforms.  At the end of the day, they're all just firewalls with enhanced feature sets.  Unfortunately, because many key decision makers follow Gartner quite religiously, it becomes necessary to understand Gartner's definitions, as well as how a manufacturer positions its product, in order to avoid the kind of communication breakdown illustrated above.

References 
Dictionary.com. (2014). next-generation. Retrieved from Dictionary.com: http://dictionary.reference.com/browse/next-generation
Gartner. (2013). IT Glossary. Retrieved from gartner.com: http://www.gartner.com/it-glossary/
Tam, K., Salvador, M. H., McAlpine, K., Basile, R., Matsugu, B., & More, J. (2012). UTM Security with Fortinet. Waltham, MA: Syngress.

Thursday, March 19, 2015

Veeam and the Misunderstood SysAdmin

I've worked with many Systems Administrators throughout my career.  Each of them is undoubtedly unique, but many share some observable traits.  They often see themselves as unsung heroes: the frequent saviors of productivity, the mechanics of the business engine, or even a kind of Atlas supporting not just the weight of the world, but the fate and destiny of the entire company's collective future.  It's no secret that many of these brave individuals are the kings of ad-hoc learning, often trailblazing new software frontiers with nothing more in their toolbox than a unique ability to absorb complex integration practices, often learned through experience alone.

The most stereotypical trait of Systems Administrators is disorganization.  Hollywood and television have convinced us that behind the fogged glass door of the IT department lie workstation and printer graveyards, boxes of random cables resembling spaghetti, and a growing collection of fast food wrappers.  On the surface, the eyesore of the traditional IT habitat rarely convinces anyone that actual work is performed.  In their defense, this environment is the result of a position that rarely has strict boundaries of responsibility.  Over time, they are asked to evolve beyond Information Technology administration into supporting almost anything that requires electricity.

The Systems Admin is usually trusted with one of the most important tasks on any network: data backups.  At any time, the combined day-to-day toil of millions of bytes of information from their co-workers could disappear in a heartbeat when an urban squirrel decides to go looking for nuts in the local utility pole transformer.  Thanks to the ever-improving resilience of storage technology, the risk of a complete loss of data (aside from the rapid fission of Uranium-235 within a 1.6 km radius of your data center) is thankfully lower than ever.  But a common saying rings true for data backups: it's better to have it and not need it than to need it and not have it.

Veeam is rapidly becoming a favorite tool of these IT-basement dwellers because it provides data protection and high-speed recovery while maintaining a friendly, easily learned interface.  Virtual servers and workstations have rapidly become the new normal for many office environments, which eases many of the old support challenges of physical devices but makes reliable backups even more important now that everything is centralized in the data center.

I began using Veeam products in 2007, when the free Veeam Reporter tool provided an easy method of documenting a VMware virtual environment.  Back then, I was also an over-tasked IT Systems Administrator, except that I was supporting not one company, but many.  Like many of my basement-dwelling brethren with messy offices, I didn't have time to spend on training to learn complex software.  I had to be the best ad-hoc support engineer I could be while slaving against the almighty billable hour.  That often meant learning and supporting IT products as I configured them, and Veeam products allowed me to spend less time figuring them out and more time using them for their intended purpose.


The free Veeam Reporter has since matured and is now part of the Veeam ONE suite, which provides an extremely granular dashboard for visibility into your entire virtual infrastructure.  It offers real-time monitoring, optimization, capacity planning, and over 25 new reports for showing your work, instead of letting your messy office do the talking for you.  Pairing this tool with the set-it-and-forget-it style of management in Veeam's Backup & Replication frees your time for other tasks… like fixing mobile phones, aging fax machines, and printers.

Thursday, March 12, 2015

Does Virtualization = Better Performance?

I've had the opportunity to work on many VMware virtualization projects since fully embracing the technology back with the release of vSphere 4.0.  One of the common misconceptions I ran into is that when converting physical servers to run as virtual servers, we should expect to see better performance.  Otherwise, why would we do it, right?  “My buddy at Acme Co. said they can reboot all their virtual Windows servers in under 10 minutes – that used to take them over an hour!”

I’ll be the first to admit, I fell victim to this expectation as well back in the day (cue Wayne’s World dream dissolve):

I remember setting up my first ‘virtual machine’.  I was not sure what to expect when I chose to ‘Power On’ this device.  I watched in absolute awe as the BIOS screen I had grown so accustomed to flashed before my eyes for a split second, POST completed, and it was ready to load an OS…


Wait – what!?!

For a while I had a hobby of building my own high-end gaming desktops at home, and they had never completed POST that quickly.  What trickery is this?  Surely once I installed Windows XP on this VM it would bog down… nope, it still beat the pants off any physical machine I had worked with.

Well, that turned out to be the first lesson in virtualization I can remember: virtualization involves layers of separation between the VM and the underlying hardware.  One benefit of this separation is that the virtual BIOS has almost no physical hardware to probe, so no lengthy hardware checks are required after powering on a virtual machine!

It wasn't until later that I realized one of the key benefits of going virtual: utilization.  Better performance is often a perceived side effect, and oftentimes it's very real, especially for companies migrating off old hardware onto new hosts and storage.  But consolidating a 30-server environment down to just 3 or 4 hosts is where the true value surfaces.  Think of the savings on power, cooling, battery backup... even noise pollution!  The quick sketch below puts rough numbers on it.
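
Here is a back-of-the-envelope sketch of that consolidation math.  Every figure in it (server wattage, host wattage, electricity rate) is an assumption chosen for illustration; substitute your own measurements:

```python
# Back-of-the-envelope consolidation math. All figures are assumptions.
physical_servers = 30
watts_per_server = 400        # assumed average draw of an aging 1U server
virtual_hosts = 4
watts_per_host = 750          # assumed draw of a beefier virtualization host
kwh_rate = 0.12               # assumed electricity cost in $/kWh

def annual_power_cost(count: int, watts: int) -> float:
    """Annual electricity cost for `count` machines drawing `watts` each."""
    return count * watts / 1000 * 24 * 365 * kwh_rate

before = annual_power_cost(physical_servers, watts_per_server)
after = annual_power_cost(virtual_hosts, watts_per_host)
print(f"Before: ${before:,.0f}/yr  After: ${after:,.0f}/yr  "
      f"Saved: ${before - after:,.0f}/yr")
# With these assumptions: Before: $12,614/yr  After: $3,154/yr  Saved: $9,461/yr
# ...and that's before counting the matching reduction in cooling load.
```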

If a service or application requires a certain level of performance, it may be best to keep it on a physical server.  That gives it dedicated resources; it's not sharing compute or storage with other machines!  Can this be accomplished within a virtual environment?  Absolutely, and with it would come the benefits of being a virtual machine.  But for a true, performance-centric setup, dedicated resources are the way to go.

So when do you go virtual and when do you stay physical?  What kind of performance do you need?  Which is more cost effective?  Which is easier to manage?  Which offers more protection against failures?  How are backup and recovery handled?

As part of the Professional Services Team at GLC, these are just a handful of questions I work with every day and the answers vary greatly.  We work with you to discover your business needs and help prioritize them, provide perspective when needed, and set realistic expectations when working together on starting and finishing your IT project.

Thursday, March 5, 2015

Data Security a Top Priority for Retail CIOs in 2015

In our increasingly connected world, company data breaches are widely publicized and discussed across multiple media and social outlets. This is especially true when a breach takes place at a business in the retail industry. Particularly when sensitive information such as credit card numbers is involved, consumers can become extremely upset with a business that puts their personal information at risk. A data breach not only hurts a company’s image, but also drives customer loss and distrust. It inevitably takes a great deal of time and money for a company to rectify the situation in all aspects.

A new survey revealed that after data breaches at firms like Home Depot and Target, retail CIOs are almost unanimous in agreeing that data security is the top priority for 2015. The survey, produced by Forrester Research and the National Retail Federation (NRF), concluded that 97 percent of retail CIOs rank efforts to increase cybersecurity defense among their top five items to accomplish this year. This comes as no surprise, since Forrester projects that at least 60 percent of businesses will experience a data breach that exposes personal information this year alone.

Additionally, many CIOs are taking the high-profile data breaches of 2014 as a lesson that no business will be left untargeted. The survey also found that improving corporate governance is a top concern, with 78 percent of retail CIOs stating that this issue is among their top five internal priorities.

Budget will continue to be a challenge for any CIO, as monetary restrictions are still limiting efforts to fight cybercrime.  Of the retail CIOs surveyed, 40 percent stated that they expect to work with the same or an even lower budget this year, while 34 percent expect increases of less than 10 percent. Many CIOs struggle to use existing investments to address new business requirements and lead new innovation, and find it difficult to pursue new technology initiatives while working within a tight budget.

The survey also identified other key issues of top concern amongst retail CIOs, including integrating multiple channels of commerce, spending too much money on maintaining legacy systems, retaining and hiring quality IT staff, and tapping into big datasets to glean useful business insights. Considering all of these factors, it is important for retail CIOs in particular to become more responsive and committed to safeguarding consumer data than ever before.