
100 Gbps Test Link Sets Pace for Faster Trans-Atlantic Data Transfers

April 4, 2014

Contact: Jon Bashor, jbashor@lbl.gov, 510-486-5849

Data transfers from the Large Hadron Collider at CERN in Switzerland to sites in the U.S. have historically taken different paths – 15 in all – via 10 gigabit per second (Gbps) links separately managed by three research networks in the U.S. and Europe. So what would happen if those massive datasets were instead transferred using a single 100 Gbps connection?

That was the thinking behind an experiment that began in early March as the Department of Energy’s ESnet, Internet2, CANARIE, GÉANT, NORDUnet and SURFnet – the leading research and education networks in the U.S., Canada, Europe and Scandinavia – collaborated with CERN to use a leased 100 Gbps connection between Amsterdam (the NetherLight Open Exchange) and New York. The four-week test was conducted in collaboration with LHCONE, the LHC Open Network Environment.

The results of the initial test were impressive.

“Using test data, we ran a 10-minute saturation test at 99.9 percent utilization with no loss, no errors,” said ESnet network engineer Mike O’Connor, co-chair of the LHCONE Operations Group. “Then we ran a 24-hour test at 50 Gbps, passing over 540 terabytes of data with no loss and no errors. This successful testing will help pave the way to production use of the connection for data from LHC experiments.”
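Those figures are internally consistent: a flow held at 50 Gbps for 24 hours moves roughly 540 terabytes. A minimal check, assuming decimal (SI) units throughout, with 1 Gbps = 10^9 bits per second and 1 TB = 10^12 bytes:

```python
# Sanity-check the 24-hour test figure: 50 Gbps sustained for one day.
# Assumes SI units: 1 Gbps = 1e9 bits/s, 1 TB = 1e12 bytes.
rate_bps = 50e9              # sustained rate in bits per second
duration_s = 24 * 60 * 60    # one day in seconds

total_bytes = rate_bps * duration_s / 8   # bits -> bytes
print(f"{total_bytes / 1e12:.0f} TB")     # -> 540 TB
```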

The 100 Gbps link, called the Advanced North Atlantic 100G Pilot project, was launched by the participating organizations last June with the first transatlantic 100 Gbps demonstrations. The year-long project is being used for engineering and for testing applications, resources, monitoring techniques and advanced technologies such as software-defined networking.

The test was important because each of the partners operates its own 100 Gbps network but then had to hand data off to 10 Gbps links to cross the Atlantic Ocean, as none of them had access to a 100 Gbps submarine link – until now. Having an end-to-end 100 Gbps connection matters because data from the LHC is sent to two sites in the U.S. – Brookhaven National Laboratory in New York and Fermilab in Illinois – and then made available to participating researchers at national labs and universities around the country.

“We created a ‘walled garden’ Internet, or a private network with no direct routing in or out,” O’Connor said. “You have to be a member to use this network, which allows us to treat the data going through the network as science data so we can get a very good idea of the type of load it would place on the system.”

But the test was not just about bandwidth. Each of the existing 10 Gbps links is managed by a different network with its own policies, and those policies can leave some links saturated while others sit idle.

“We wanted to see if the 10 Gbps lines have constrained the utilization as the load was distributed according to policies created by six different networks,” O’Connor said. “For the 100 Gbps link, we had to collaborate on policy, so it brings us together in both a physical and a policy sense.”

To assess the performance of the network, the team used perfSONAR, a tool for end-to-end monitoring and troubleshooting of multi-domain network performance. perfSONAR itself is a collaboration, developed over the past 10 years by ESnet, Fermilab, SLAC, Georgia Tech, Indiana University, Internet2, the University of Delaware, the GÉANT project in Europe and RNP in Brazil.
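For a sense of how perfSONAR results are typically retrieved, here is a minimal sketch that queries a perfSONAR measurement archive (esmond) over its REST API for recent throughput tests. The archive host is a hypothetical placeholder, and endpoint details can differ between deployments; the article does not describe the team's own setup:

```python
# Minimal sketch: list recent throughput tests from a perfSONAR
# measurement archive (esmond) via its REST API.
# The archive URL below is a hypothetical placeholder.
import requests

ARCHIVE = "https://ps-archive.example.org/esmond/perfsonar/archive/"

params = {
    "event-type": "throughput",  # only throughput measurements
    "time-range": 86400,         # look back 24 hours (in seconds)
}
resp = requests.get(ARCHIVE, params=params, timeout=30)
resp.raise_for_status()

# Each entry describes one measured source/destination pair.
for test in resp.json():
    print(test["source"], "->", test["destination"])
```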

The only hiccups in the project occurred early on, before the test was started. The underwater cable was cut twice in the North Sea, apparently by ships dragging their anchors.

According to O’Connor, the approach has a lot of potential for future applications: it will help ensure that data from the LHC can reach the scientists participating in the experiments and will provide extra capacity as the experimental facilities at CERN are upgraded.

“A single 100 Gbps connection will provide plenty of bandwidth for what we need to do now, but with traffic doubling every 18 months on ESnet, we can see the day when we will need to add even more capacity,” O’Connor said.
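To put that doubling rate in perspective, here is an illustrative back-of-the-envelope projection. The starting utilization is an assumption made for the sake of the example, not a figure from the article:

```python
import math

# ESnet traffic doubles roughly every 18 months (per O'Connor above).
DOUBLING_MONTHS = 18
start_utilization = 0.25   # hypothetical: link 25% utilized today

# Solve start_utilization * 2^(t / DOUBLING_MONTHS) = 1 for t.
months_to_saturation = DOUBLING_MONTHS * math.log2(1 / start_utilization)
print(f"~{months_to_saturation:.0f} months until the link is full")  # -> ~36
```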