COMPARATIVE ANALYSIS
Performance 2013

Thomas Skybakmoen, Francisco Artes, Bob Walder, Ryan Liles

Tested Products
Barracuda F800, Check Point 12600, Cyberoam CR2500iNG, Dell SonicWALL NSA 4500, Fortinet FortiGate-800c, Juniper SRX550, NETASQ NG1000-A, NETGEAR ProSecure UTM9S, Palo Alto Networks PA-5020, Sophos UTM 425, Stonesoft FW-1301, WatchGuard XTM 1050
NSS Labs
Overview

Implementation of a firewall can be a complex process, with multiple factors affecting the overall performance of the solution. Each of these factors should be considered over the course of the useful life of the solution.
There is usually a trade-off between security effectiveness and performance; a product's security effectiveness should be evaluated within the context of its performance (and vice versa). This ensures that new security protections do not adversely impact performance, and that security shortcuts are not taken to maintain or improve performance.
Sizing considerations are absolutely critical, since vendor performance claims can vary significantly from actual throughput with protection turned on.
Farther to the right indicates higher rated throughput; higher up indicates more connections per second. Products with a low connections/throughput ratio run the risk of running out of connections before they reach their maximum potential throughput.

Figure 1: Throughput and Connection Rates
Table of Contents

Overview
Analysis
    UDP Throughput & Latency
    Connection Dynamics - Concurrency and Connection Rates
    HTTP Connections per Second and Capacity
    HTTP Connections per Second and Capacity (Throughput)
    Application Average Response Time - HTTP (at 90% Max Capacity)
    Real-World Traffic Mixes
Test Methodology
Contact Information
Table of Figures

Figure 1: Throughput and Connection Rates
Figure 2: Vendor Claimed vs. NSS Rated Throughput in Mbps
Figure 3: UDP Throughput by Packet Size (I)
Figure 4: UDP Throughput by Packet Size (II)
Figure 5: UDP Latency by Packet Size (Microseconds)
Figure 6: Concurrency and Connection Rates (I)
Figure 7: Concurrency and Connection Rates (II)
Figure 8: Maximum Throughput Per Device With 44Kbyte Response
Figure 9: Maximum Throughput Per Device With 21Kbyte Response
Figure 10: Maximum Throughput Per Device With 10Kbyte Response
Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response
Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response
Figure 13: Maximum Connection Rates Per Device With Various Response Sizes
Figure 14: Application Latency (Microseconds) Per Device With Various Response Sizes
Figure 15: Real-World Performance by Device
Analysis

NSS Labs research indicates that the majority of enterprises will deploy traditional firewalls in front of their datacenters and at the core of the network to separate unrelated traffic. Because of this, NSS rates product performance based upon the average of three traffic types: 21KB HTTP response traffic, a mix of perimeter traffic common in enterprises, and a mix of internal core traffic common in enterprises. Details of these traffic mixes are available in the Firewall Test Methodology (www.nsslabs.com).
Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key firewall security effectiveness and performance capabilities based upon their expected usage.
Figure 2: Vendor Claimed vs. NSS Rated Throughput in Mbps
The results presented in the chart above show the difference between the NSS performance rating and vendor performance claims, which are often made under ideal or unrealistic conditions. Where vendors quote multiple figures, NSS selects those that relate to TCP, or "with protection enabled," performance expectations, rather than the more optimistic UDP-only or large-packet-size figures often quoted. Even so, NSS rated throughput is typically lower than that claimed by the vendor, often significantly so, since it is more representative of how devices will perform in real-world deployments.
Figure 3: UDP Throughput by Packet Size (I)
Fortinet's FortiGate-800c was the only device to demonstrate anything close to line-rate capacity with packet sizes from 1514 bytes all the way down to 64 bytes. In addition, it was the only device to consistently demonstrate latency of less than 10 microseconds. The Juniper SRX550 demonstrated the second-best latency, at 12-16 microseconds.
Figure 4: UDP Throughput by Packet Size (II), in Mbps

Product                      64B      128B     256B     512B     1024B    1514B
Barracuda F800               780      2,000    3,700    6,500    12,600   18,500
Check Point 12600            1,900    3,400    6,400    11,700   12,100   12,200
Cyberoam CR2500iNG           3,300    9,300    9,300    11,230   16,200   19,300
Dell SonicWALL NSA 4500      120      400      400      890      1,700    2,600
Fortinet FortiGate-800c      18,900   19,500   19,600   19,700   19,800   19,900
Juniper SRX550               400      1,450    1,450    2,800    5,500    8,000
NETASQ NG1000-A              350      600      1,200    2,300    4,800    7,300
NETGEAR ProSecure UTM9S      12       12       20       64       102
Palo Alto Networks PA-5020   5,092    9,400    9,400    9,597    9,785    9,935
Sophos UTM 425               640      2,450    2,450    4,600    6,000    6,000
Stonesoft FW-1301            2,600    4,400    8,300    14,900   19,100   19,900
WatchGuard XTM 1050          346      925      925      2,000    4,100    6,700
In-line security devices that introduce high levels of latency are unacceptable, especially where multiple security devices are placed in the data path. The chart below reflects the latency (in microseconds) as recorded during the UDP throughput tests at 90% of maximum load. Lower values are preferred.
Figure 5: UDP Latency by Packet Size (Microseconds)

Product                      64B     128B    256B    512B    1024B   1514B
Barracuda F800               273     163     108     104     103     109
Check Point 12600            75      124     99      82      102     109
Cyberoam CR2500iNG           1,185   845     452     385     302     270
Dell SonicWALL NSA 4500      30      31      32      33      37      42
Juniper SRX550               12      12      12      13      14      16
NETASQ NG1000-A              36      36      43      46      47      36
NETGEAR ProSecure UTM9S      232     237     243     255     337     603
Palo Alto Networks PA-5020   15      19      22      26      33      38
Sophos UTM 425               59      61      60      63      92      169
Stonesoft FW-1301            50      84      51      54      82      81
WatchGuard XTM 1050          136     156     182     269     460     705
Product                      Max Concurrent   Max Concurrent      TCP Conns   HTTP Conns   HTTP Trans
                             TCP Conns        TCP Conns w/Data    Per Sec     Per Sec      Per Sec
Barracuda F800               1,000,000        1,000,000           134,200     117,000      356,000
Check Point 12600            1,500,000        1,500,000           52,000      113,000      391,700
Cyberoam CR2500iNG           3,100,000        2,999,000           215,000     179,000      360,000
Dell SonicWALL NSA 4500      400,000          400,000             15,600      10,300       25,000
Fortinet FortiGate-800c      6,000,000        4,400,000           180,000     180,000      397,000
Juniper SRX550               533,000          542,000             18,500      18,000       80,000
NETASQ NG1000-A              1,280,000        1,200,000           57,000      43,500       102,300
NETGEAR ProSecure UTM9S      16,700           15,000              760         480          6,400
Palo Alto Networks PA-5020   1,000,000        1,000,000           36,000      36,000       348,000
Sophos UTM 425               480,000          2,000,000           12,000      31,800       270,000
Stonesoft FW-1301            6,000,000        2,900,000           57,000      51,000       328,600
WatchGuard XTM 1050          2,600,000        2,600,000           19,000      18,000       136,000
Beyond the overall throughput of the device, connection dynamics can play an important role in sizing a security device that will not unduly impede the performance of a system or an application. Maximum connection and transaction rates help size a device more accurately than simply focusing on throughput.

By knowing the maximum connections per second, it is possible to predict maximum throughput based upon the traffic mix in a given enterprise environment. For example, if the device's maximum HTTP CPS is 2,000, and the average traffic size is 44KB such that 2,500 CPS = 1 Gbps, then the tested device will achieve a maximum of (2,000 / 2,500) x 1,000 Mbps = 800 Mbps.
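The worked example above can be expressed as a small helper. This is a minimal sketch; the function name and signature are ours, not part of the NSS methodology.

```python
# Estimate the maximum achievable throughput from a device's HTTP
# connections-per-second (CPS) limit. The reference rate is the CPS
# needed to saturate 1 Gbps at the given average object size.

def max_throughput_mbps(device_max_cps: float, reference_cps_per_gbps: float) -> float:
    """Throughput (Mbps) = (device CPS / CPS needed for 1 Gbps) * 1,000 Mbps."""
    return device_max_cps / reference_cps_per_gbps * 1000.0

# Worked example from the text: a 2,000 CPS device limit with 44KB
# responses, where 2,500 CPS saturates 1 Gbps, yields 800 Mbps.
print(max_throughput_mbps(2_000, 2_500))  # 800.0
```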
Maximum concurrent TCP connections and maximum TCP connections-per-second rates are also useful metrics when attempting to size a device accurately. Products with a low connection/throughput ratio run the risk of exhausting connections before they reach their maximum potential throughput. By knowing the maximum connections per second, it is possible to predict when a device will fail in a given enterprise environment.
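The exhaustion risk described above can be sketched with Little's law (concurrent sessions = arrival rate x average session lifetime). The helper names and the sample figures below are illustrative assumptions, not NSS test data.

```python
# Predict whether a device runs out of concurrent-connection (session)
# capacity before it reaches its rated connection rate.

def concurrent_connections_needed(cps: float, avg_duration_s: float) -> float:
    """Little's law: concurrent sessions = arrival rate * average lifetime."""
    return cps * avg_duration_s

def binding_limit(max_cps: float, max_concurrent: float, avg_duration_s: float) -> str:
    """Report which device limit is hit first for a given session lifetime."""
    needed = concurrent_connections_needed(max_cps, avg_duration_s)
    return "concurrency" if needed > max_concurrent else "connection rate"

# Illustrative example: a device rated at 57,000 CPS with a 1,200,000
# session table. With 30-second average sessions it would need
# 1,710,000 concurrent slots, so the session table is exhausted first.
print(binding_limit(57_000, 1_200_000, 30.0))  # concurrency
```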
[Chart: connections per second vs. concurrent connections, per device]
Higher up indicates increased connections-per-second capacity; farther to the right indicates increased concurrent/simultaneous connections. Products with a low concurrent connection / connection per second ratio run the risk of exhausting connections (sessions) before they reach their maximum potential connection rate.
Figure 8: Maximum Throughput Per Device With 44Kbyte Response (Mbps)

Barracuda F800               10,000
Check Point 12600            10,000
Cyberoam CR2500iNG           10,000
Dell SonicWALL NSA 4500      960
Fortinet FortiGate-800c      10,000
Juniper SRX550               2,400
NETASQ NG1000-A              3,080
NETGEAR ProSecure UTM9S      209
Palo Alto Networks PA-5020   4,120
Sophos UTM 425               3,000
Stonesoft FW-1301            3,320
WatchGuard XTM 1050          2,520
Figure 9: Maximum Throughput Per Device With 21Kbyte Response (Mbps)

Barracuda F800               8,780
Check Point 12600            10,000
Cyberoam CR2500iNG           10,000
Dell SonicWALL NSA 4500      820
Fortinet FortiGate-800c      10,000
Juniper SRX550               1,880
NETASQ NG1000-A              2,720
NETGEAR ProSecure UTM9S      112
Palo Alto Networks PA-5020   4,160
Sophos UTM 425               3,000
Stonesoft FW-1301            2,940
WatchGuard XTM 1050          2,200
Figure 10: Maximum Throughput Per Device With 10Kbyte Response (Mbps)

Barracuda F800               6,490
Check Point 12600            7,460
Cyberoam CR2500iNG           8,491
Dell SonicWALL NSA 4500      520
Fortinet FortiGate-800c      7,700
Juniper SRX550               1,260
NETASQ NG1000-A              2,010
NETGEAR ProSecure UTM9S      58
Palo Alto Networks PA-5020   3,280
Sophos UTM 425               3,000
Stonesoft FW-1301            2,910
WatchGuard XTM 1050          1,610
Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response (Mbps)

Barracuda F800               4,390
Check Point 12600            4,660
Cyberoam CR2500iNG           5,650
Dell SonicWALL NSA 4500      280
Fortinet FortiGate-800c      6,850
Juniper SRX550               770
NETASQ NG1000-A              1,360
NETGEAR ProSecure UTM9S      32
Palo Alto Networks PA-5020   1,680
Sophos UTM 425               2,600
Stonesoft FW-1301            2,200
WatchGuard XTM 1050          965
Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response (Mbps)

Barracuda F800               2,603
Check Point 12600            2,710
Cyberoam CR2500iNG           3,675
Dell SonicWALL NSA 4500      240
Fortinet FortiGate-800c      3,925
Juniper SRX550               440
NETASQ NG1000-A              920
NETGEAR ProSecure UTM9S      17
Palo Alto Networks PA-5020   1,150
Sophos UTM 425               1,400
Stonesoft FW-1301            1,050
WatchGuard XTM 1050          525
The following table shows the number of HTTP connections per second required to achieve the rated throughput.
Product                      44KB      21KB      10KB      4.5KB     1.7KB
Barracuda F800               25,000    43,900    64,900    87,800    104,100
Check Point 12600            25,000    50,000    74,600    93,200    108,400
Cyberoam CR2500iNG           25,000    50,000    84,908    113,000   147,000
Dell SonicWALL NSA 4500      2,400     4,100     5,200     5,600     9,600
Fortinet FortiGate-800c      25,000    50,000    77,000    137,000   157,000
Juniper SRX550               6,000     9,400     12,600    15,400    17,600
NETASQ NG1000-A              7,700     13,600    20,100    27,200    36,800
NETGEAR ProSecure UTM9S      522       562       582       642       661
Palo Alto Networks PA-5020   10,300    20,800    32,800    33,600    46,000
Sophos UTM 425               7,500     15,000    30,000    52,000    56,000
Stonesoft FW-1301            8,300     14,700    29,100    44,000    42,000
WatchGuard XTM 1050          6,300     11,000    16,100    19,300    21,000

Figure 13: Maximum Connection Rates Per Device With Various Response Sizes
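As a rough cross-check of figures like those above, the CPS required for a target throughput can be approximated from the average response size. This sketch (the function name is ours) ignores HTTP and TCP/IP header overhead, so it only approximates the measured numbers.

```python
# Approximate the HTTP connections per second needed to sustain a
# target throughput, assuming one response body per connection and
# ignoring protocol overhead.

def required_cps(throughput_mbps: float, response_kbyte: float) -> float:
    """CPS such that cps * response size (in bits) equals the target rate."""
    bits_per_connection = response_kbyte * 1024 * 8
    return throughput_mbps * 1_000_000 / bits_per_connection

# Example: sustaining 10 Gbps of 44KB responses needs roughly 27,700 CPS
# before overhead is accounted for.
print(round(required_cps(10_000, 44)))  # 27743
```

The gap between this idealized figure and a measured one reflects per-connection overhead (handshakes, headers, ACKs), which is exactly why NSS measures CPS rather than deriving it.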
Application Average Response Time - HTTP (at 90% Max Capacity)

The following table details the average application response time (application latency) with various traffic sizes at 90% of maximum capacity (throughput). The lower the number, the better (improved application response time). The Juniper SRX550 demonstrated the lowest (best) application response times, while the NETGEAR ProSecure UTM9S introduced the most application latency.
Product                      44KB (ms)   21KB (ms)   10KB (ms)   4.5KB (ms)   1.7KB (ms)
Barracuda F800               1.7         1.1         0.9         0.3          0.3
Check Point 12600            1.81        1.08        1.1         0.35         0.28
Cyberoam CR2500iNG           1.61        0.83        0.7         0.22         0.24
Dell SonicWALL NSA 4500      1.99        1.02        0.04        0.04         0.07
Fortinet FortiGate-800c      0.94        0.78        0.53        0.4          0.34
Juniper SRX550               0.14        0.04        0.04        0.01         0.03
NETASQ NG1000-A              1.42        1.25        0.53        0.12         0.16
NETGEAR ProSecure UTM9S      2.55        1.39        2.43        0.93         0.65
Palo Alto Networks PA-5020   1.01        0.63        0.27        0.1          0.08
Sophos UTM 425               1.98        0.99        0.75        0.06         0.03
Stonesoft FW-1301            1.1         0.9         0.4         0.2          0.08
WatchGuard XTM 1050          5.1         3.01        2.14        0.85         0.77

Figure 14: Application Latency (Microseconds) Per Device With Various Response Sizes
Figure 15: Real-World Performance by Device
Most vendors perform better in the real-world protocol mix (perimeter), a protocol mix typically seen at an enterprise perimeter. However, the Fortinet FortiGate-800c is the only device to scale equally well in the real-world protocol mix (core), a protocol mix typical of that seen in a large datacenter or at the core of an enterprise network.
Test Methodology

Methodology Version: Firewall v4

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.
Contact Information

NSS Labs, Inc.
206 Wild Basin Rd, Suite 200A
Austin, TX 78746 USA
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

v2013.02.07
This and other related documents are available at www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.
© 2013 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.
Please note that access to or use of this report is conditioned on the following:
1. The information in this report is subject to change by NSS Labs without notice.
2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products, or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.