OARnet Regional Cyberinfrastructure Plan - 2014


The OARnet network consists of more than 1,850 miles of fiber-optic backbone. The network blankets the state, providing connectivity to Ohio's colleges and universities, K-12 schools, public broadcasting stations, academic medical centers, and state, federal and partnering research organizations. In addition to Ohio, the fiber assets extend into Kentucky, West Virginia, Pennsylvania and Michigan. OARnet also owns fiber assets in the metropolitan Chicago area.

OARnet uses the Cisco ONS 15454 Dense Wavelength Division Multiplexing (DWDM) platform for optical services. OARnet currently runs 100 Gbps, 10 Gbps and 1 Gbps waves across the infrastructure, with 43 Points-of-Presence (POPs) across the state and 21 regeneration sites.

For Ethernet and Internet Protocol (IP) routing services, OARnet uses the Juniper MX series of routers, primarily the MX960 and MX480. OARnet also provides Customer-Premises Equipment (CPE) gateway devices for most campuses.

OARnet provides a full suite of services: optical transport, IPv4, IPv6, Multicast, Jumbo Frames, MPLS and Ethernet services.

Research and Education Campus Needs

OARnet member campuses range in size from very large research institutions such as The Ohio State University (OSU) and Case Western Reserve University (CWRU), to medium-sized campuses such as Wright State University (WSU), the University of Dayton (UD) and Ohio University (OU), to very small campuses such as Denison and Wittenberg Universities. Campuses have specialized needs, such as Science DMZ connections to the backbone, and OARnet works with each campus to engineer interconnections that meet its requirements. The two basic needs all campuses share are big bandwidth in support of big data, and reliable connectivity in support of their academic missions. To those ends, OARnet has consistently planned strategically; current examples include the 100Gbps project and redundant connectivity to Internet2 (I2).

Details of Plans Currently Under Way

In early 2012, OARnet received funding from the Governor's office to construct one of the first 100Gbps regional networks in the U.S. Although the grant was generous and all of the major links have been incorporated into the 100Gbps infrastructure, there are pockets where redundancy was not possible under the project budget; so while the big-bandwidth requirement is met in a large part of the network, reliability remains an issue. Figure 1 depicts the backbone 100Gbps project. Note the links and sites not yet enabled at the higher bandwidth, such as the span in southern Ohio between Portsmouth and Athens.
In addition to the backbone, OARnet operates two metro rings in the Columbus metro area; one of the rings serves OSU. The 100Gbps connection to OSU is made at the State of Ohio Computer Center (SOCC). The ring runs between the SOCC and the Neilston POP, where the metro ring attaches to the backbone. The east side of the ring is 100Gbps, but the west side, which provides the redundancy for OSU's campus today, runs at 20 Gbps, as depicted in Figure 2.
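The bandwidth asymmetry matters because ring protection is bounded by the slower side: if either side of the ring fails, all traffic must reroute over the survivor. A minimal sketch of this arithmetic (the link speeds come from the text; the model itself is ours):

```python
# Toy model of a two-sided metro ring: when one side fails, traffic
# reroutes over the surviving side, so the protected (survivable)
# capacity is limited by the slower of the two sides.

def protected_capacity_gbps(east_gbps: float, west_gbps: float) -> float:
    """Bandwidth that survives the loss of either side of the ring."""
    return min(east_gbps, west_gbps)

# Columbus metro ring today: 100 Gbps east side, 20 Gbps west side.
today = protected_capacity_gbps(east_gbps=100, west_gbps=20)

# After the planned west-side build-out to 100 Gbps:
planned = protected_capacity_gbps(east_gbps=100, west_gbps=100)

print(today, planned)  # 20 100
```

In other words, until the west side is built out, a failure of the east side leaves OSU with at most 20 Gbps of protected capacity.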

In addition to the metro ring situation, the second connection to I2 has not yet been built.

To address the issues described, two projects are currently under way and budgeted; the current budget expires on June 30, 2014. First, OARnet will build out the west side of the metro ring to 100 Gbps capacity and add a second MX960 at the SOCC interconnect point to provide full redundancy to OSU. OARnet will also add capacity and another MX960 at the Neilston POP to complete the other end of the ring's west side and enhance the connectivity options attaching to the backbone.

Second, OARnet will complete phase one of the I2 connection in Cincinnati. OARnet has acquired fiber from Cincinnati Bell Technology Services (CBTS) between the current OARnet CenturyLink POP in Cincinnati and the CyrusOne Data Center, and has also acquired fiber from CyrusOne to Level 3 to facilitate the connection between OARnet and I2. In phase one, OARnet will complete the construction and integration of the CyrusOne POP into the OARnet backbone, setting the stage for phase two: the interconnection of OARnet into a second, redundant I2 port.

Finally, in the current budget cycle, OARnet will relocate one of its Cleveland POPs. Today OARnet has a POP at the Terminal Tower and a POP at the meet-me point on Euclid Ave. Connectivity at the Terminal Tower location is constrained, so by relocating to a different floor within the Euclid facility, OARnet can improve connectivity options for Cleveland-area campuses. The same facility also offers data center services and will be on the OARnet backbone, creating options for campuses seeking Disaster Recovery (DR) sites; for example, a campus can leverage its current bandwidth costs without incurring additional overhead. A similar opportunity exists at the CyrusOne facility previously mentioned. These data centers are approximately 300 miles apart, significantly lowering the risk of a single event damaging both.

In the next budget cycle, OARnet has made capital budget requests in support of its core mission of serving higher education institutions in Ohio. As of this writing, these requests appear to be on track for approval, but final legislative action is pending.

Backbone Infrastructure

As part of OARnet's ongoing operations, new equipment must be purchased to replace aging gear. Additionally, with increased reliance on OARnet infrastructure, improvements are needed to meet increasingly demanding uptime requirements.

Specifically, the proposed funding will support:

  1. A disaster recovery mechanism for the Columbus metro area. Today, all of the metro rings connect into the Neilston POP. OARnet does have disaster recovery capability for a number of rings, but due to fiber and geographical constraints there are failure vectors: in the rare event that the Neilston facility were completely destroyed, some sections of the network would be rendered inoperable.
  2. Upgrades to the optical transmission equipment's Timing Control Cards (TCC), the “brains” of the system. The current generation of cards is reaching the end of its useful life due to processing and memory exhaustion.
  3. Upgrades to the backbone side of client university connections to 10G to permit those connection speeds. (See the Customer-Premises Equipment (CPE) Devices/Optics section below.)
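The single-point-of-failure concern in item 1 can be checked mechanically by computing articulation points, i.e., nodes whose loss disconnects the network. The sketch below uses a hypothetical topology; the site names are borrowed from the text, but the links are invented purely for illustration and do not describe OARnet's actual fiber map:

```python
# Sketch: flag single points of failure in a POP topology by finding
# articulation points (nodes whose removal disconnects the graph),
# using the standard DFS low-link method.

from collections import defaultdict

def articulation_points(edges):
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)

    disc, low, cut = {}, {}, set()
    counter = [0]

    def dfs(u, parent):
        counter[0] += 1
        disc[u] = low[u] = counter[0]
        children = 0
        for v in graph[u]:
            if v == parent:
                continue
            if v in disc:                       # back edge
                low[u] = min(low[u], disc[v])
            else:                               # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    cut.add(u)
        if parent is None and children > 1:     # root with 2+ subtrees
            cut.add(u)

    for node in list(graph):
        if node not in disc:
            dfs(node, None)
    return cut

# Hypothetical topology: metro rings that all attach at Neilston.
edges = [
    ("SOCC", "Neilston"), ("SOCC", "OSU"), ("OSU", "Neilston"),
    ("Neilston", "Backbone-East"), ("Neilston", "Backbone-West"),
    ("Backbone-East", "Backbone-West"),
]
print(articulation_points(edges))  # {'Neilston'}
```

In this toy map each ring is internally redundant, yet the hub joining them is still a single point of failure, which is exactly the situation the disaster recovery mechanism in item 1 is meant to address.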

100-Gigabit Completion

OARnet deployed 100 Gigabit per second service last year over the majority of its footprint. Phase two will complete that project by providing full redundancy on ring 4, the ring serving OU. Today the university has 100G service, but only 10G recovery on the back side of the ring.

Additionally, three universities, OSU, CWRU and the University of Cincinnati (UC), are participating in the I2 Innovation Platform initiative. The initiative requires each school to commit to 100G connectivity, a Science DMZ and Software Defined Networking (SDN). Of the three, UC is having the greatest difficulty. OARnet is currently working with UC to meet at the CyrusOne Data Center to enable its 100G connectivity, and is leveraging that activity to complete the lighting of the second I2 port. The Cincinnati connection to I2 will primarily be an Advanced Layer 2 Service (AL2S) port, while the Cleveland port will carry Layer 3 research traffic and other Layer 3 production services. However, both ports can be leveraged to provide redundancy for all services.

CPE Upgrades

Currently OARnet provides backbone, Internet and advanced network services to 90 universities and ITC sites throughout the state. The vast majority connect at one Gigabit of physical capacity, and many are peaking at their maximum and need to upgrade to higher bandwidth. Much of this growth is driven by students' and staff's use of tablets and portable computers, along with increasing use of eServices and eLearning across all educational sectors as Ohio strives to remain competitive in the world. OARnet plans to spend $1.6 million to replace the vast majority of campus CPE devices to make them 10 Gbps capable.
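The upgrade triage described above amounts to comparing each site's peak utilization against its physical link capacity. A minimal sketch, with illustrative site names and an assumed 80% flagging threshold (neither is OARnet data):

```python
# Sketch of the capacity-planning logic: flag campus links whose peak
# utilization approaches the 1 Gbps physical limit as candidates for
# the 10 Gbps CPE upgrade. Names and thresholds are illustrative.

def upgrade_candidates(peaks_mbps, link_mbps=1000, threshold=0.80):
    """Return sites whose peak meets or exceeds threshold * capacity."""
    return sorted(site for site, peak in peaks_mbps.items()
                  if peak >= threshold * link_mbps)

peaks = {"Campus-A": 950, "Campus-B": 400, "Campus-C": 820}
print(upgrade_candidates(peaks))  # ['Campus-A', 'Campus-C']
```

A real deployment would draw the peak figures from interface counters (e.g., SNMP polling) rather than a hand-built dictionary; the comparison itself is the whole of the triage rule.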