ESnet’s FasterData Site Is Popular International Destination in Data-Driven World
Contact: Jon Bashor, 510-486-5849, email@example.com
For the past 15 years or so, ESnet’s Brian Tierney has maintained FasterData.es.net, an online repository of tips and tricks for improving network performance. Last year, 83,649 unique visitors landed at FasterData, making 120,776 visits and viewing 213,151 pages.
Although the site comprises around 120 pages, about half of the page views went to the pages on tuning Linux hosts for better performance on network paths faster than 1 gigabit per second. “There are a few settings that aren’t defaults and if you use them, they can gain you a lot in terms of performance,” Tierney said, cautioning that the same settings can actually degrade performance on slower networks or home routers.
The site contains a lot of arcane details that are hard to memorize, Tierney said, so users can go to a page, copy the settings, and paste them into their own systems. According to an analysis of traffic to the site, those systems are located around the world. The United States is home to more than 43,000 of those visitors, with users in the United Kingdom, India, Germany, Russia, Canada, France, China, Japan and Brazil rounding out the top 10 countries visiting the site.
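The copy-and-paste settings in question are largely Linux kernel sysctl parameters. The fragment below is an illustrative sketch of that style of tuning for a host on a fast path; the specific values are examples chosen for this article, not FasterData’s current recommendations, so consult the site itself before applying anything.

```shell
# Illustrative /etc/sysctl.conf fragment for a Linux host on a >1 Gbps path.
# Values are examples only -- see fasterdata.es.net for current guidance.

# Raise the ceiling on socket buffer sizes so TCP can keep a
# long, fast path full of in-flight data
net.core.rmem_max = 67108864
net.core.wmem_max = 67108864

# min / default / max TCP buffers in bytes; the kernel's
# autotuning picks a value within this range per connection
net.ipv4.tcp_rmem = 4096 87380 33554432
net.ipv4.tcp_wmem = 4096 65536 33554432

# Probe the path MTU so larger frames survive paths that
# drop ICMP, avoiding black-holed connections
net.ipv4.tcp_mtu_probing = 1
```

Settings like these take effect after `sysctl -p` (run as root). Tierney’s caveat applies here: oversized buffers that help on a fast research network can hurt on slow links or home routers.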
“I’ve also gotten requests for permission to translate the site into Russian, Chinese, Portuguese and even Armenian,” Tierney said. In the spirit of collaboration, he’s said yes.
Those users also occasionally send in corrections or suggestions, said Tierney, who maintains the site with help from ESnet network engineer Eli Dart. Other users, including some of his friends who work in industry in Silicon Valley, also weigh in, telling him they find the site useful.
Recent additions to the site include information on building a Science DMZ, with extensive advice on building and tuning a “data transfer node.” The Science DMZ model, described at http://fasterdata.es.net/science-dmz/, is quickly gaining recognition as an excellent way to move huge scientific data sets between sites by creating an environment tailored to the needs of high-performance science applications.