FIREWALL

COMPARATIVE ANALYSIS
Performance

2013 Thomas Skybakmoen, Francisco Artes, Bob Walder, Ryan Liles

Tested Products
Barracuda F800, Check Point 12600, Cyberoam CR2500iNG, Dell SonicWALL NSA 4500, Fortinet FortiGate-800c,
Juniper SRX550, NETASQ NG1000-A, NETGEAR ProSecure UTM9S, Palo Alto Networks PA-5020, Sophos UTM 425,
Stonesoft FW-1301, WatchGuard XTM 1050

NSS Labs

Firewall Comparative Analysis - Performance

Overview
Implementation of a firewall can be a complex process with multiple factors affecting the overall performance of
the solution.
Each of these factors should be considered over the course of the useful life of the solution, including:

1. Deployment use cases:
   a. Will the firewall be deployed to protect servers, desktop clients, or both?

2. What does the traffic look like?
   a. Concurrency and connection rates.
   b. Connections per second and capacity with different traffic profiles.
   c. Latency and application response times.

There is usually a trade-off between security effectiveness and performance; a product's security effectiveness
should be evaluated within the context of its performance (and vice versa). This ensures that new security
protections do not adversely impact performance, and that security shortcuts are not taken to maintain or improve
performance.
Sizing considerations are absolutely critical, since vendor performance claims can vary significantly from actual
throughput with protection turned on. In Figure 1 below, farther to the right indicates higher rated throughput,
and higher up indicates more connections per second. Products with a low connections/throughput ratio run the
risk of running out of connections before they reach their maximum potential throughput.
[Figure 1 is a scatter chart plotting NSS Rated Throughput (Mbps, x-axis, 2,000 to 12,000) against Maximum TCP
Connections per Second (y-axis, up to 250,000) for all twelve tested products.]

Figure 1: Throughput and Connection Rates

© 2013 NSS Labs, Inc. All rights reserved.


Table of Contents
Overview ................................................................................................................................ 2
Analysis .................................................................................................................................. 4
UDP Throughput & Latency .......................................................................................................................... 5
Connection Dynamics: Concurrency and Connection Rates ................................................................. 7
HTTP Connections per Second and Capacity ................................................................................................ 9
HTTP Connections per Second and Capacity (Throughput) ....................................................................... 9
Application Average Response Time - HTTP (at 90% Max Capacity) ...................................................... 12
Real-World Traffic Mixes ............................................................................................................................ 13
Test Methodology ................................................................................................................. 14
Contact Information .............................................................................................................. 14


Table of Figures
Figure 1: Throughput and Connection Rates ................................................................................................ 2
Figure 2: Vendor Claimed vs NSS Rated Throughput in Mbps ...................................................................... 4
Figure 3: UDP Throughput by Packet Size (I) ................................................................................................ 5
Figure 4: UDP Throughput by Packet Size (II) ............................................................................................... 6
Figure 5: UDP Latency by Packet Size (Microseconds) .................................................................................. 6
Figure 6: Concurrency and Connection Rates (I) ........................................................................................... 7
Figure 7: Concurrency and Connection Rates (II) .......................................................................................... 8
Figure 8: Maximum Throughput Per Device With 44Kbyte Response .......................................................... 9
Figure 9: Maximum Throughput Per Device With 21Kbyte Response ........................................................ 10
Figure 10: Maximum Throughput Per Device With 10Kbyte Response ...................................................... 10
Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response ..................................................... 11
Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response ..................................................... 11
Figure 13: Maximum Connection Rates Per Device With Various Response Sizes ...................................... 12
Figure 14: Application Latency (Milliseconds) Per Device With Various Response Sizes ........................... 12
Figure 15: Real-World Performance by Device ........................................................................................... 13


Analysis
NSS Labs research indicates that the majority of enterprises will deploy traditional firewalls in front of their
datacenters and at the core of the network to separate unrelated traffic. Because of this, NSS rates product
performance based upon the average of three traffic types: 21KB HTTP response traffic, a mix of perimeter traffic
common in enterprises, and a mix of internal core traffic common in enterprises. Details of these traffic mixes
are available in the Firewall Test Methodology (www.nsslabs.com).
Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and
performance, as would be the aim of a typical customer deploying the device in a live network environment. This
provides readers with the most useful information on key firewall security effectiveness and performance
capabilities based upon their expected usage.
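The rating described above is a simple arithmetic mean of the three traffic-type measurements. The sketch below illustrates the calculation; the function name and sample figures are hypothetical and not taken from the report:

```python
# Sketch of the NSS throughput rating described above: the mean of the
# 21KB HTTP response result and the two real-world mixes (perimeter, core).
# The figures below are illustrative, not taken from the report.

def nss_rated_throughput(http_21kb_mbps: float,
                         perimeter_mix_mbps: float,
                         core_mix_mbps: float) -> float:
    """Average of the three traffic-type measurements, in Mbps."""
    return (http_21kb_mbps + perimeter_mix_mbps + core_mix_mbps) / 3

# Hypothetical device: strong on large HTTP responses, weaker on the core mix.
print(nss_rated_throughput(9000, 7500, 4500))  # -> 7000.0
```

A device that excels only on large HTTP responses is pulled down by a weak core-mix result, which is the point of averaging the three.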

[Figure 2 is a bar chart comparing, for each of the twelve tested products, the vendor-stated throughput against
the NSS throughput rating, in Mbps.]

Figure 2: Vendor Claimed vs. NSS Rated Throughput in Mbps

The results presented in the chart above show the difference between the NSS performance rating and vendor
performance claims, which are often based on ideal or unrealistic conditions. Where vendors quote multiple
figures, NSS selects those that relate to TCP performance with protection enabled, rather than the more
optimistic UDP-only or large-packet-size figures often quoted.
Even so, NSS rated throughput is typically lower than that claimed by the vendor, often significantly so, since it is
more representative of how devices will perform in real-world deployments.


UDP Throughput & Latency


The aim of this test is to determine the raw packet processing capability of each in-line port pair of the device only.
This traffic does not attempt to simulate any form of real-world network condition. No TCP sessions are created
during this test, and there is very little for the detection engine to do in the way of protocol analysis. However,
this test is relevant since vendors are forced to perform inspection on UDP packets as a result of VoIP, video, and
other streaming applications.
[Figure 3 is a line chart of UDP throughput (Megabits per Second, up to 20,000) against UDP packet size (64 to
1514 bytes) for each product, with the Fortinet FortiGate-800c, Stonesoft FW-1301, Cyberoam CR2500iNG, and
Barracuda F800 at the top and the NETGEAR ProSecure UTM9S at the bottom.]

Figure 3: UDP Throughput by Packet Size (I)

Fortinet's FortiGate-800c was the only device to demonstrate anything close to line-rate capacity with packet sizes
from 1514 bytes all the way down to 64 bytes. In addition, it was the only device to consistently demonstrate
latency of less than 10 microseconds. The Juniper SRX550 demonstrated the second-best latency, at 12-16
microseconds.
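One reason small packets are so punishing is per-packet overhead: on the wire, every Ethernet frame carries roughly 20 extra bytes (8-byte preamble plus 12-byte inter-frame gap), so a link's theoretical packet rate and frame throughput fall sharply at 64 bytes. A quick sketch of the arithmetic; the 10 Gbps link speed is illustrative and is not a statement about the test bed's per-port or aggregate capacity:

```python
# Theoretical Ethernet packet rate and frame throughput at a given frame size.
# Each frame occupies frame_bytes + 20 bytes on the wire
# (8-byte preamble/SFD + 12-byte inter-frame gap).

def line_rate_pps(link_bps: float, frame_bytes: int) -> float:
    wire_bytes = frame_bytes + 20
    return link_bps / (wire_bytes * 8)

def frame_throughput_mbps(link_bps: float, frame_bytes: int) -> float:
    # Throughput counted over the frame bytes only, as test gear reports it.
    return line_rate_pps(link_bps, frame_bytes) * frame_bytes * 8 / 1e6

# 64-byte frames on a 10 Gbps link: ~14.88M packets/s, ~7,619 Mbps of frames.
print(round(line_rate_pps(10e9, 64)))          # -> 14880952
print(round(frame_throughput_mbps(10e9, 64)))  # -> 7619
```

At 1514 bytes the same link carries close to its nominal rate, which is why small-packet results separate the products far more than large-packet results do.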



Product                        64 Byte   128 Byte   256 Byte   512 Byte   1024 Byte   1514 Byte
Barracuda F800                     780      2,000      3,700      6,500      12,600      18,500
Check Point 12600                1,900      3,400      6,400     11,700      12,100      12,200
Cyberoam CR2500iNG               3,300      9,300      9,300     11,230      16,200      19,300
Dell SonicWALL NSA 4500            120        400        400        890       1,700       2,600
Fortinet FortiGate-800c         18,900     19,500     19,600     19,700      19,800      19,900
Juniper SRX550                     400      1,450      1,450      2,800       5,500       8,000
NETASQ NG1000-A                    350        600      1,200      2,300       4,800       7,300
NETGEAR ProSecure UTM9S             12         12         20         64         102         n/a
Palo Alto Networks PA-5020       5,092      9,400      9,400      9,597       9,785       9,935
Sophos UTM 425                     640      2,450      2,450      4,600       6,000       6,000
Stonesoft FW-1301                2,600      4,400      8,300     14,900      19,100      19,900
WatchGuard XTM 1050                346        925        925      2,000       4,100       6,700

All values are UDP throughput in Mbps (n/a: value not available).

Figure 4: UDP Throughput by Packet Size (II)

In-line security devices that introduce high levels of latency are unacceptable, especially where multiple security
devices are placed in the data path. The chart below reflects the latency (in microseconds) as recorded during the
UDP throughput tests at 90% of maximum load. Lower values are preferred.

Product                        64 Byte   128 Byte   256 Byte   512 Byte   1024 Byte   1514 Byte
Barracuda F800                     273        163        108        104         103         109
Check Point 12600                   75        124         99         82         102         109
Cyberoam CR2500iNG               1,185        845        452        385         302         270
Dell SonicWALL NSA 4500             30         31         32         33          37          42
Fortinet FortiGate-800c            n/a        n/a        n/a        n/a         n/a         n/a
Juniper SRX550                      12         12         12         13          14          16
NETASQ NG1000-A                     36         36         43         46          47          36
NETGEAR ProSecure UTM9S            232        237        243        255         337         603
Palo Alto Networks PA-5020          15         19         22         26          33          38
Sophos UTM 425                      59         61         60         63          92         169
Stonesoft FW-1301                   50         84         51         54          82          81
WatchGuard XTM 1050                136        156        182        269         460         705

All values are latency in microseconds (µs) at 90% of maximum load (n/a: value not available).

Figure 5: UDP Latency by Packet Size (Microseconds)


Connection Dynamics: Concurrency and Connection Rates


These tests stress the detection engine to determine how the sensor copes with increasing rates of TCP
connections per second, application layer transactions per second, and concurrent open connections. All packets
contain valid payload and address data and these tests provide an excellent representation of a live network at
various connection/transaction rates.
Note that in all tests, the following critical breaking points (where the final measurements are taken) are used:

- Excessive concurrent TCP connections: latency within the firewall is causing an unacceptable increase in open
connections on the server side.
- Excessive response time for HTTP transactions/SMTP sessions: latency within the firewall is causing excessive
delays and increased response time to the client.
- Unsuccessful HTTP transactions/SMTP sessions: normally, there should be zero unsuccessful transactions. Once
these appear, it is an indication that excessive latency within the firewall is causing connections to time out.
The following are the key connection dynamics results from the performance tests.

Product                        Max Concurrent   Max Concurrent     TCP Conns    HTTP Conns   HTTP Trans.
                               TCP Conns        TCP Conns w/Data   per Second   per Second   per Second
Barracuda F800                 1,000,000        1,000,000          134,200      117,000      356,000
Check Point 12600              1,500,000        1,500,000          52,000       113,000      391,700
Cyberoam CR2500iNG             3,100,000        2,999,000          215,000      179,000      360,000
Dell SonicWALL NSA 4500        400,000          400,000            15,600       10,300       25,000
Fortinet FortiGate-800c        6,000,000        4,400,000          180,000      180,000      397,000
Juniper SRX550                 533,000          542,000            18,500       18,000       80,000
NETASQ NG1000-A                1,280,000        1,200,000          57,000       43,500       102,300
NETGEAR ProSecure UTM9S        16,700           15,000             760          480          6,400
Palo Alto Networks PA-5020     1,000,000        1,000,000          36,000       36,000       348,000
Sophos UTM 425                 480,000          2,000,000          12,000       31,800       270,000
Stonesoft FW-1301              6,000,000        2,900,000          57,000       51,000       328,600
WatchGuard XTM 1050            2,600,000        2,600,000          19,000       18,000       136,000

Figure 6: Concurrency and Connection Rates (I)



Beyond overall throughput of the device, connection dynamics can play an important role in sizing a security
device that will not unduly impede the performance of a system or an application. Maximum connection and
transaction rates help size a device more accurately than simply focusing on throughput. By knowing the
maximum connections per second, it is possible to predict maximum throughput based upon the traffic mix in a
given enterprise environment. For example, if the device's maximum HTTP CPS is 2,000, and the average traffic
size is 44KB such that 2,500 CPS = 1 Gbps, then the tested device will achieve a maximum of
(2,000 / 2,500) x 1,000 Mbps = 800 Mbps.
Maximum concurrent TCP connections and maximum TCP connections per second rates are also useful metrics
when attempting to size a device accurately. Products with low connection/throughput ratio run the risk of
exhausting connections before they reach their maximum potential throughput. By knowing the maximum
connections per second, it is possible to predict when a device will fail in a given enterprise environment.
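The worked example above generalizes into a simple sizing helper; the function name and figures are illustrative:

```python
# Predict maximum HTTP throughput from a device's measured connections per
# second, per the worked example above: if 2,500 CPS of 44KB responses
# equals 1 Gbps, a device maxing out at 2,000 CPS tops out at
# (2,000 / 2,500) x 1,000 Mbps = 800 Mbps.

def predicted_throughput_mbps(device_max_cps: float,
                              cps_per_gbps: float) -> float:
    """Scale 1 Gbps by the ratio of achievable CPS to the CPS that fills it."""
    return device_max_cps / cps_per_gbps * 1000

print(predicted_throughput_mbps(2000, 2500))  # -> 800.0
```

The `cps_per_gbps` figure depends on the average object size in the environment's traffic mix, so it should be recalculated for each deployment rather than reused from this example.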
[Figure 7 is a scatter chart plotting Maximum Concurrent / Simultaneous TCP Connections (x-axis, up to
7,000,000) against Maximum TCP Connections per Second (y-axis, up to 250,000) for all twelve tested products.]

Figure 7: Concurrency and Connection Rates (II)

Higher up indicates increased connections per second capacity. Farther to the right indicates increased concurrent
/ simultaneous connections. Products with low concurrent connection / connection per second ratio run the risk
of exhausting connections (sessions) before they reach their maximum potential connection rate.
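Read another way, the ratio of concurrent connections to connection rate is a time budget: a device opening connections at its maximum rate while none close will exhaust its session table in (maximum concurrent connections) / (connections per second) seconds. A minimal sketch with hypothetical figures:

```python
# Worst-case time to exhaust the session table: connections opened at the
# device's maximum rate, with none closing. Figures below are hypothetical,
# not taken from the report's results.

def seconds_to_exhaustion(max_concurrent: int, conns_per_second: int) -> float:
    return max_concurrent / conns_per_second

# A device supporting 1,000,000 concurrent connections at 50,000 CPS has
# only a 20-second budget under a flood of non-closing connections.
print(seconds_to_exhaustion(1_000_000, 50_000))  # -> 20.0
```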


HTTP Connections per Second and Capacity


In-line firewall devices exhibit an inverse correlation between security effectiveness and performance. The more
deep-packet inspection is performed, the fewer packets can be forwarded. Furthermore, it is important to consider
a real-world mix of traffic that a device will encounter.
NSS tests aim to stress the HTTP detection engine in order to determine how the sensor copes with detecting and
blocking under network loads of varying average packet size and varying connections per second. By creating
genuine session-based traffic with varying session lengths, the sensor is forced to track valid TCP sessions, thus
ensuring a higher workload than for simple packet-based background traffic.
Each transaction consists of a single HTTP GET request and there are no transaction delays (i.e. the web server
responds immediately to all requests). All packets contain valid payload (a mix of binary and ASCII objects) and
address data. This test provides an excellent representation of a live network (albeit one biased towards HTTP
traffic) at various network loads.
HTTP Connections per Second and Capacity (Throughput)
As previously stated, NSS research has found that there is usually a trade-off between security effectiveness and
performance. Because of this, it is important to judge a product's security effectiveness within the context of its
performance (and vice versa). This ensures that new security protections do not adversely impact performance and
security shortcuts are not taken to maintain or improve performance. The following charts compare maximum
connection rate (HTTP CPS), maximum rated throughput (Mbps), and average application latency (average HTTP
response time in milliseconds) across a range of HTTP response sizes.
Maximum throughput in Mbps with 44Kbyte HTTP responses:

Barracuda F800                 10,000
Check Point 12600              10,000
Cyberoam CR2500iNG             10,000
Dell SonicWALL NSA 4500        960
Fortinet FortiGate-800c        10,000
Juniper SRX550                 2,400
NETASQ NG1000-A                3,080
NETGEAR ProSecure UTM9S        209
Palo Alto Networks PA-5020     4,120
Sophos UTM 425                 3,000
Stonesoft FW-1301              3,320
WatchGuard XTM 1050            2,520

Figure 8: Maximum Throughput Per Device With 44Kbyte Response



Maximum throughput in Mbps with 21Kbyte HTTP responses:

Barracuda F800                 8,780
Check Point 12600              10,000
Cyberoam CR2500iNG             10,000
Dell SonicWALL NSA 4500        820
Fortinet FortiGate-800c        10,000
Juniper SRX550                 1,880
NETASQ NG1000-A                2,720
NETGEAR ProSecure UTM9S        112
Palo Alto Networks PA-5020     4,160
Sophos UTM 425                 3,000
Stonesoft FW-1301              2,940
WatchGuard XTM 1050            2,200

Figure 9: Maximum Throughput Per Device With 21Kbyte Response



Maximum throughput in Mbps with 10Kbyte HTTP responses:

Barracuda F800                 6,490
Check Point 12600              7,460
Cyberoam CR2500iNG             8,491
Dell SonicWALL NSA 4500        520
Fortinet FortiGate-800c        7,700
Juniper SRX550                 1,260
NETASQ NG1000-A                2,010
NETGEAR ProSecure UTM9S        58
Palo Alto Networks PA-5020     3,280
Sophos UTM 425                 3,000
Stonesoft FW-1301              2,910
WatchGuard XTM 1050            1,610

Figure 10: Maximum Throughput Per Device With 10Kbyte Response




Maximum throughput in Mbps with 4.5Kbyte HTTP responses:

Barracuda F800                 4,390
Check Point 12600              4,660
Cyberoam CR2500iNG             5,650
Dell SonicWALL NSA 4500        280
Fortinet FortiGate-800c        6,850
Juniper SRX550                 770
NETASQ NG1000-A                1,360
NETGEAR ProSecure UTM9S        32
Palo Alto Networks PA-5020     1,680
Sophos UTM 425                 2,600
Stonesoft FW-1301              2,200
WatchGuard XTM 1050            965

Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response



Maximum throughput in Mbps with 1.7Kbyte HTTP responses:

Barracuda F800                 2,603
Check Point 12600              2,710
Cyberoam CR2500iNG             3,675
Dell SonicWALL NSA 4500        240
Fortinet FortiGate-800c        3,925
Juniper SRX550                 440
NETASQ NG1000-A                920
NETGEAR ProSecure UTM9S        17
Palo Alto Networks PA-5020     1,150
Sophos UTM 425                 1,400
Stonesoft FW-1301              1,050
WatchGuard XTM 1050            525

Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response





The following table shows the number of HTTP connections per second required to achieve the rated throughput.
Product                        44Kbyte   21Kbyte   10Kbyte   4.5Kbyte   1.7Kbyte
Barracuda F800                 25,000    43,900    64,900    87,800     104,100
Check Point 12600              25,000    50,000    74,600    93,200     108,400
Cyberoam CR2500iNG             25,000    50,000    84,908    113,000    147,000
Dell SonicWALL NSA 4500        2,400     4,100     5,200     5,600      9,600
Fortinet FortiGate-800c        25,000    50,000    77,000    137,000    157,000
Juniper SRX550                 6,000     9,400     12,600    15,400     17,600
NETASQ NG1000-A                7,700     13,600    20,100    27,200     36,800
NETGEAR ProSecure UTM9S        522       562       582       642        661
Palo Alto Networks PA-5020     10,300    20,800    32,800    33,600     46,000
Sophos UTM 425                 7,500     15,000    30,000    52,000     56,000
Stonesoft FW-1301              8,300     14,700    29,100    44,000     42,000
WatchGuard XTM 1050            6,300     11,000    16,100    19,300     21,000

Figure 13: Maximum Connection Rates Per Device With Various Response Sizes
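A connection rate converts to throughput via the response size, which is how the connection-rate and throughput views tie together. A rough sketch of that conversion, ignoring HTTP and TCP header overhead, so the result is approximate:

```python
# Approximate throughput implied by an HTTP connection rate: each connection
# carries one GET whose response body dominates the bytes moved. Header
# overhead is ignored, so this understates the wire rate slightly.

def implied_throughput_mbps(conns_per_second: float,
                            response_kbytes: float) -> float:
    return conns_per_second * response_kbytes * 1024 * 8 / 1e6

# 50,000 CPS of 21KB responses moves roughly 8.6 Gbps of response payload.
print(round(implied_throughput_mbps(50_000, 21)))  # -> 8602
```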


Application Average Response Time - HTTP (at 90% Max Capacity)
The following table details the average application response time (application latency) with various traffic sizes at
90% of maximum capacity (throughput). The lower the number, the better (improved application response time).
The Juniper SRX550 demonstrated the lowest (best) application response times, while the NETGEAR ProSecure
UTM9S introduced the most application latency.

Product                        44Kbyte   21Kbyte   10Kbyte   4.5Kbyte   1.7Kbyte
Barracuda F800                 1.7       1.1       0.9       0.3        0.3
Check Point 12600              1.81      1.08      1.1       0.35       0.28
Cyberoam CR2500iNG             1.61      0.83      0.7       0.22       0.24
Dell SonicWALL NSA 4500        1.99      1.02      0.04      0.04       0.07
Fortinet FortiGate-800c        0.94      0.78      0.53      0.4        0.34
Juniper SRX550                 0.14      0.04      0.04      0.01       0.03
NETASQ NG1000-A                1.42      1.25      0.53      0.12       0.16
NETGEAR ProSecure UTM9S        2.55      1.39      2.43      0.93       0.65
Palo Alto Networks PA-5020     1.01      0.63      0.27      0.1        0.08
Sophos UTM 425                 1.98      0.99      0.75      0.06       0.03
Stonesoft FW-1301              1.1       0.9       0.4       0.2        0.08
WatchGuard XTM 1050            5.1       3.01      2.14      0.85       0.77

All values are average HTTP response time in milliseconds (ms).

Figure 14: Application Latency (Milliseconds) Per Device With Various Response Sizes


Real-World Traffic Mixes


The aim of these tests is to measure the performance of the device under test (DUT) in a real world environment
by introducing additional protocols and real content, while still maintaining a precisely repeatable and consistent
background traffic load. In order to simulate real use cases, different protocol mixes are utilized to model
placement of the DUT within various locations on a corporate network. For details about real world traffic protocol
types and percentages, see the NSS Labs Firewall Test Methodology, available at www.nsslabs.com.
[Figure 15 is a bar chart showing, for each product, throughput in Mbps under the Real-World Protocol Mix
(Perimeter) and the Real-World Protocol Mix (Core).]

Figure 15: Real-World Performance by Device

Most vendors perform better in the real-world protocol mix (perimeter), a protocol mix typically seen at an
enterprise perimeter. However, the Fortinet FortiGate-800c was the only device to scale equally well in the
real-world protocol mix (core), a protocol mix typical of that seen in a large datacenter or at the core of an
enterprise network.


Test Methodology
Methodology Version: Firewall v4
A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com


Contact Information
NSS Labs, Inc.
206 Wild Basin Rd, Suite 200A
Austin, TX 78746 USA
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

v2013.02.07


This and other related documents available at: www.nsslabs.com. To receive a licensed copy or report misuse,
please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.

© 2013 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval
system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.

2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not
guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any
damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED
WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND
EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL OR INDIRECT
DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE
POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or
software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no
errors or defects in the products or that the products will meet the reader's expectations, requirements, needs, or
specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned
in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of
their respective owners.
