NSS Labs
Distributed Denial-of-Service Prevention Test Report Arbor Networks APS 2800 v5.8.1
Overview
NSS Labs performed an independent test of the Arbor Networks APS 2800 v5.8.1. The product was subjected to
thorough testing at the NSS facility in Austin, Texas, based on the Distributed Denial-of-Service (DDoS) Prevention
Test Methodology v2.0 available at www.nsslabs.com. This test was conducted free of charge and NSS did not
receive any compensation in return for Arbor Networks' participation.
While the companion Comparative Reports on security, performance, and total cost of ownership (TCO) will
provide information about all tested products, this Test Report provides detailed information not available
elsewhere.
During NSS testing, vendors tune their devices to create a performance baseline based on normalized network
traffic. Devices are further tuned for accuracy as needed. The performance baseline traffic is 40% of rated
throughput and consists of a mix of web application traffic to provide readers with relevant security effectiveness
and performance dimensions based on their expected usage.
Product: Arbor Networks APS 2800 v5.8.1

                         Attack Mitigation    Baseline Impact
Volumetric               82.5%                0.3%
Protocol                 90.0%                0.0%
Application¹             100.0%               1.0%
Overall                  90.8%                0.4%

NSS-Tested Throughput:      20,000 Mbps
3-Year TCO (List Price):    NA
3-Year TCO (Street Price):  $274,600

Figure 1 Overall Test Results
Using the tuned policy, the APS 2800 provided 90.8% overall attack mitigation, with a 0.4% impact on the
overall baseline traffic. The device also passed all stability and reliability tests.
The APS 2800 is rated by NSS at 20,000 Mbps, which is in line with the vendor-claimed performance; Arbor
licensed the device under test at 20 Gbps. NSS-Tested Throughput is calculated as an average of all of the real-world protocol mixes and the 21 KB HTTP response-based capacity test.
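As an illustrative sketch of the averaging just described (the routine and variable names are ours, not NSS tooling; the input values are the 20,000 Mbps results reported for the real-world mixes and the 21 KB capacity test in this report), the NSS-Tested Throughput figure reproduces as a simple mean:

```python
# Sketch of the NSS-Tested Throughput calculation described above.
# Inputs: measured throughputs (Mbps) for the real-world protocol
# mixes and the 21 KB HTTP response-based capacity test.
real_world_mixes_mbps = [20_000, 20_000, 20_000, 20_000]  # all mixes reached rated throughput
http_21kb_capacity_mbps = 20_000

samples = real_world_mixes_mbps + [http_21kb_capacity_mbps]
nss_tested_throughput = sum(samples) / len(samples)
print(f"NSS-Tested Throughput: {nss_tested_throughput:,.0f} Mbps")  # 20,000 Mbps
```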
Arbor Networks declined to provide list pricing for the device tested.
¹ Two of the tested devices use an HTTP redirect approach (a valid mitigation technique), which resulted in an incompatibility with the DDoS
Prevention test harness. For this reason, the HTTP GET Flood test score is not included in either the Application scores or the Overall scores.
Table of Contents
Overview ................................................... 2
Security Effectiveness ..................................... 5
    Volumetric Attacks ..................................... 5
    Protocol Attacks ....................................... 6
    Application Attacks .................................... 6
Performance ................................................ 7
    HTTP Capacity with No Transaction Delays ............... 7
    Application Average Response Time (HTTP) ............... 8
    Real-World Traffic Mixes ............................... 8
Table of Figures
Figure 1  Overall Test Results ............................. 2
Figure 2  Volumetric Attacks ............................... 5
Figure 3  Protocol Attacks ................................. 6
Figure 4  Application Attacks .............................. 6
Figure 5  HTTP Capacity with No Transaction Delay .......... 7
Figure 6  Average Application Response Time (Milliseconds) . 8
Figure 7  Real-World Traffic Mixes ......................... 8
Figure 8  Stability and Reliability Results ................ 9
Figure 9  Sensor Installation Time (Hours) ................. 11
Figure 10 List Price 3-Year TCO ............................ 12
Figure 11 Street Price 3-Year TCO .......................... 12
Figure 12 Detailed Scorecard ............................... 14
Security Effectiveness
This section verifies that the DDoS prevention device under test (DUT) can detect and mitigate DDoS attacks
effectively. Both the legitimate network traffic and the DDoS attack are executed using a shared pool of IP
addresses, which represents a worst-case scenario for the enterprise. Since legitimate traffic uses the same IP
addresses as the attacker, no product can simply block or blacklist a range of IP addresses. This test can reveal the
DUT's true ability to mitigate attacks effectively.
NSS analysis is conducted first by testing every category of DDoS attack individually to determine that the DUT can
successfully detect and mitigate each attack. Once a baseline of security effectiveness is determined, NSS builds
upon this baseline by adding multiple DDoS attacks from different categories in an attempt to overwhelm the DUT
and allow attack leakage to occur. At each point during testing, NSS validates that legitimate traffic is still
allowed and is not inadvertently blocked by the DUT.
In all security effectiveness tests, a mix of HTTP traffic is run to establish a baseline of legitimate traffic. This traffic is
run at 40% of the device's rated bandwidth and is used to ensure that legitimate traffic is not affected during
mitigation. The baseline impact percentage is listed as the amount of baseline traffic that is affected by the device
while the device is mitigating an attack. As an example, if the baseline impact score is 5%, then 5% of the known
baseline traffic was inadvertently blocked during attack mitigation.
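The baseline-impact arithmetic described above can be sketched as follows; the function name and the traffic counts (aside from the 5% case taken from the text) are illustrative, not NSS tooling:

```python
def baseline_impact(baseline_sent, baseline_delivered):
    """Percentage of known-good baseline traffic inadvertently
    blocked by the device while it is mitigating an attack."""
    blocked = baseline_sent - baseline_delivered
    return 100.0 * blocked / baseline_sent

# The example from the text: 5% of baseline traffic blocked
# during mitigation (counts here are hypothetical).
print(baseline_impact(baseline_sent=10_000, baseline_delivered=9_500))  # 5.0
```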
After the baseline traffic has been stabilized at 40% of the rated bandwidth, an attack is started. These attacks are
intended to saturate the network link in terms of either bandwidth or packet rate. To calculate the percentage of
each attack that is being mitigated, the known amount of attack traffic being injected into the device is compared
with the amount of attack traffic that is allowed to pass through the device. This percentage is calculated
separately from the baseline impact score, which allows for a scenario where a device may mitigate an attack very
well, but at the same time cause an unintended impact to legitimate services.
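The mitigation percentage is the complementary calculation: injected attack traffic compared against attack traffic that leaks through the device. A minimal sketch (function name and counts are ours, for illustration):

```python
def mitigation_pct(attack_injected, attack_passed):
    """Percentage of injected attack traffic the DUT dropped."""
    return 100.0 * (attack_injected - attack_passed) / attack_injected

# Because this score is computed separately from baseline impact,
# a device can mitigate an attack well yet still harm legitimate
# services. Counts below are hypothetical.
print(mitigation_pct(attack_injected=50_000, attack_passed=2_500))  # 95.0
```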
Volumetric Attacks
Volumetric attacks consume all of a target's available bandwidth. An attacker can use multiple hosts, for example,
a botnet, to generate a large volume of traffic that causes network congestion between the target and the rest of
the Internet, and leaves no available bandwidth for legitimate users. There are many types of packet floods used in
volumetric attacks, including regular and malformed ICMP, regular and malformed UDP, and spoofed IP.
Type of Volumetric Attack    Mitigation    Baseline Impact
—                            100%          0%
—                            100%          0%
—                            100%          0%
—                            95%           0%
—                            0%            0%
UDP Flood                    100%          2%

Figure 2 Volumetric Attacks
Protocol Attacks
Protocol DDoS attacks exhaust resources on the target or on a specific device between the target and the Internet,
such as routers and load balancers. After the DDoS attack has consumed enough of the device's resources, the
device cannot open any new connections because it is waiting for old connections to close or expire. Examples of
protocol DDoS attacks include SYN floods, ACK floods, RST attacks, and TCP connection floods.
Type of Protocol Attack      Mitigation    Baseline Impact
ACK Flood                    60%           0%
RST Flood                    100%          0%
udp-flood-frag.sh            100%          0%
SYN Flood                    100%          0%

Figure 3 Protocol Attacks
Application Attacks
An application attack takes advantage of vulnerabilities in the application layer protocol or within the application
itself. This style of DDoS attack may require, in some instances, as little as one or two packets to render the target
unresponsive. Application DDoS attacks can also consume application layer or application resources by slowly
opening up connections and then leaving them open until no new connections can be made. Examples of
application attacks include HTTP floods, HTTP resource exhaustion, and SSL exhaustion.
Type of Application Attack   Mitigation    Baseline Impact
—                            100%          0%
—                            100%          0%
LOIC                         100%          0%
—                            100%          0%
—                            100%          5%
—                            100%          0%

Figure 4 Application Attacks
Performance
There is frequently a trade-off between security effectiveness and performance. Because of this trade-off, it is
important to judge a product's security effectiveness within the context of its performance and vice versa. This
ensures that new security protections do not adversely impact performance and that security shortcuts are not
taken to maintain or improve performance.
This section measures the performance of the system using various traffic conditions that provide metrics for real-world performance. Individual implementations will vary based on usage; however, these quantitative metrics
provide a gauge as to whether a particular device is appropriate for a given environment. Network traffic was
passed through the inspection engine, but no mitigation rules were in place. Performance metrics were measured
to see if the device was able to perform at the advertised rate.
The net difference between the baseline (without the device) and the measured capacity of the device is recorded
for each of the following tests.
HTTP Capacity with No Transaction Delays

HTTP Response Size    Connections per Second (CPS)
2,880 KB              800
768 KB                3,000
192 KB                12,000
44 KB                 50,000
21 KB                 100,000
10 KB                 200,000
1.7 KB                582,100

Measured throughput reached the rated 20,000 Mbps in all but one of the response-size tests (14,553 Mbps).

Figure 5 HTTP Capacity with No Transaction Delay
Application Average Response Time (HTTP)

Average application response times measured (milliseconds): 0.310, 2.000, 0.700, 1.100, 0.430, 0.580, 0.700.

Figure 6 Average Application Response Time (Milliseconds)
Real-World Traffic Mixes

All real-world traffic mixes, including Real-World (Financial) and Real-World (ISP), measured 20,000 Mbps.

Figure 7 Real-World Traffic Mixes
In NSS testing, the APS 2800 performed in line with the throughput claimed by the vendor for all real-world
traffic mixes.
Stability and Reliability

Test                         Result
—                            PASS
—                            PASS
Power Fail                   PASS
Persistence of Data          PASS
Figure 8 Stability and Reliability Results
These tests also determine the behavior of the state engine under load. All DDoS prevention devices must choose
whether to risk denying legitimate traffic or risk allowing malicious traffic once they run low on resources. A DDoS
prevention device will drop new connections when resources (such as state table memory) are low, or when traffic
loads exceed its capacity. In theory, this means the DUT will block legitimate traffic but maintain state on existing
connections (and prevent attack leakage).
General Management and Configuration: How easy is it to install and configure devices, and how easy is it to
deploy multiple devices throughout a large enterprise network?
Policy Handling: How easy is it to create, edit, and deploy complicated security policies across an enterprise?
Alert Handling: How accurate and timely is the alerting, and how easy is it to drill down to locate critical
information needed to remediate a security problem?
Reporting: How effective is the reporting capability, and how readily can it be customized?
For the purposes of this report, capital expenditure (capex) items are included for a single device only (the cost of
acquisition and installation).
Installation Hours
This table depicts the number of hours of labor required to install each device using only local device management
options. The table accurately reflects the amount of time that NSS engineers, with the help of vendor engineers,
needed to install and configure the device to the point where it operated successfully in the test harness, passed
legitimate traffic, and blocked and detected prohibited or malicious traffic. This closely mimics a typical enterprise
deployment scenario for a single device.
The installation cost is based on the time that an experienced security engineer would require to perform the
installation tasks described above. This approach allows NSS to hold constant the talent cost and measure only the
difference in time required for installation. Readers should substitute their own costs to obtain accurate TCO
figures.
Product                              Installation (Hours)
Arbor Networks APS 2800 v5.8.1       8

Figure 9 Sensor Installation Time (Hours)
Product                          Purchase    Maintenance/Year    Year 1 Cost    Year 2 Cost    Year 3 Cost    3-Year TCO
Arbor Networks APS 2800 v5.8.1   NA          NA                  NA             NA             NA             NA

Figure 10 List Price 3-Year TCO
Year 1 Cost is calculated by adding installation costs (US$75 per hour fully loaded labor x installation time) +
purchase price + first-year maintenance/support fees.
Year 2 Cost consists only of maintenance/support fees.
Year 3 Cost consists only of maintenance/support fees.
Product                          Purchase    Maintenance/Year    Year 1 Cost    Year 2 Cost    Year 3 Cost    3-Year TCO
Arbor Networks APS 2800 v5.8.1   $175,000    $33,000             $208,600       $33,000        $33,000        $274,600

Figure 11 Street Price 3-Year TCO
Year 1 Cost is calculated by adding installation costs (US$75 per hour fully loaded labor x installation time) +
purchase price + first-year maintenance/support fees.
Year 2 Cost consists only of maintenance/support fees.
Year 3 Cost consists only of maintenance/support fees.
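As an illustrative sketch of this arithmetic using the street-price figures above (the 8-hour installation time is inferred from the $600 installation line item at US$75 per hour; variable names are ours):

```python
# Year-cost and 3-year TCO arithmetic from the notes above.
HOURLY_LABOR = 75            # US$ per hour, fully loaded
install_hours = 8            # inferred: $600 installation / $75 per hour
purchase = 175_000           # street purchase price
maintenance_per_year = 33_000

# Year 1 = installation labor + purchase price + first-year maintenance.
year1 = install_hours * HOURLY_LABOR + purchase + maintenance_per_year
year2 = maintenance_per_year  # maintenance/support only
year3 = maintenance_per_year  # maintenance/support only
tco_3yr = year1 + year2 + year3

print(year1, tco_3yr)  # 208600 274600
```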
For additional TCO analysis, including for the CMS, refer to the TCO Comparative Report.
Security Effectiveness               Mitigation    Baseline Impact
Volumetric
  —                                  100%          0%
  —                                  100%          0%
  —                                  100%          0%
  —                                  95%           0%
  —                                  0%            0%
  UDP Flood                          100%          2%
Protocol
  ACK Flood                          60%           0%
  RST Flood                          100%          0%
  udp-flood-frag.sh                  100%          0%
  SYN Flood                          100%          0%
Application
  —                                  100%          0%
  —                                  100%          0%
  LOIC                               100%          0%
  —                                  100%          0%
  —                                  100%          5%
  —                                  100%          0%
Performance
HTTP Capacity with No Transaction Delays (CPS)
  2,880 KB Response                  800
  768 KB Response                    3,000
  192 KB Response                    12,000
  44 KB Response                     50,000
  21 KB Response                     100,000
  10 KB Response                     200,000
  1.7 KB Response                    582,100
Application Average Response Time (Milliseconds)
  0.310, 2.000, 0.700, 1.100, 0.430, 0.580, 0.700
Real-World Traffic (Mbps)
  Real-World Protocol Mix (Financial)                                           20,000
  Real-World Protocol Mix (ISP)                                                 20,000
  Real-World Protocol Mix (Data Center Web Based Applications And Services)     20,000
  Real-World Protocol Mix (Data Center Internet Service Provider (ISP) Mix)     20,000
Stability and Reliability
  —                                  PASS
  —                                  PASS
  Power Fail                         PASS
  Persistence of Data                PASS
Total Cost of Ownership
  Expected Costs (List Price)
    Initial Purchase (hardware as tested)    NA
    Maintenance/Year                         NA
    Year 1 Cost                              NA
    Year 2 Cost                              NA
    Year 3 Cost                              NA
    3-Year TCO                               NA
  Expected Costs (Street Price)
    Initial Purchase (hardware as tested)    $175,000
    Installation                             $600
    Maintenance/Year                         $33,000
    —                                        $0
    Year 1 Cost                              $208,600
    Year 2 Cost                              $33,000
    Year 3 Cost                              $33,000
    3-Year TCO                               $274,600
  Other cost metrics: See Comparative Report

Figure 12 Detailed Scorecard
Test Methodology
Distributed Denial-of-Service (DDoS) Prevention Test Methodology v2.0
A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.
Contact Information
NSS Labs, Inc.
206 Wild Basin Road
Building A, Suite 200
Austin, TX 78746 USA
info@nsslabs.com
www.nsslabs.com
This and other related documents are available at: www.nsslabs.com. To receive a licensed copy or report misuse,
please contact NSS Labs.
© 2016 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, copied/scanned, stored on a retrieval
system, e-mailed, or otherwise disseminated or transmitted without the express written consent of NSS Labs, Inc. ("us" or "we").
Please read the disclaimer in this box because it contains important information that binds you. If you do not agree to these
conditions, you should not read the rest of this report but should instead return the report immediately to us. "You" or "your"
means the person who accesses this report and any entity on whose behalf he/she has obtained this report.
1. The information in this report is subject to change by us without notice, and we disclaim any obligation to update it.
2. The information in this report is believed by us to be accurate and reliable at the time of publication, but is not guaranteed.
All use of and reliance on this report are at your sole risk. We are not liable or responsible for any damages, losses, or expenses
of any nature whatsoever arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY US. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT, ARE HEREBY DISCLAIMED AND EXCLUDED
BY US. IN NO EVENT SHALL WE BE LIABLE FOR ANY DIRECT, CONSEQUENTIAL, INCIDENTAL, PUNITIVE, EXEMPLARY, OR INDIRECT
DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE
POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or
software) tested or the hardware and/or software used in testing the products. The testing does not guarantee that there are
no errors or defects in the products or that the products will meet your expectations, requirements, needs, or specifications, or
that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned
in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of
their respective owners.