Evolving IT Strategies Require Flexible Architectures

Last week we launched a special report series exploring how high-speed fiber networks can future-proof distributed data centers, specifically looking at how data center growth has created complexity. This week, we’ll look at how those market dynamics are driving IT leaders to evolve their data processing strategies toward more flexible architectures.


Taken together, the market dynamics discussed in our last article demand that IT leaders evolve their data processing strategies with an eye toward more flexible architectures that move processing closer to the point of value. Future strategies are likely to include a combination of on-site infrastructure, public and private clouds, colocation services, edge data centers in the field or inside telecommunication carrier sites, and unattended smart devices.

The location and type of computing that is used will be driven by the characteristics of the workload. For example, applications like autonomous vehicles and streaming video delivery demand sub-five-millisecond response times. In such cases, the processing is best distributed across multiple tiers, with stream processing at the edge, a mid-tier control plane managing multiple devices, and cloud servers aggregating and analyzing data at a high level.
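The tiering decision above can be pictured as a simple routing rule keyed on the workload's latency budget. This is a minimal illustrative sketch, not anything from the report; the tier names and thresholds are assumptions chosen to mirror the edge / mid-tier / cloud split described here.

```python
# Hypothetical sketch: route a workload to a processing tier based on its
# latency budget. Thresholds are illustrative assumptions, not from the report.

def select_tier(latency_budget_ms: float) -> str:
    """Return the lowest tier that can meet the workload's latency budget."""
    if latency_budget_ms < 5:        # e.g. autonomous vehicles, live video
        return "edge"                # stream processing next to the device
    if latency_budget_ms < 50:       # regional coordination workloads
        return "mid-tier"            # control plane managing many devices
    return "cloud"                   # high-level aggregation and analytics

print(select_tier(2))    # edge
print(select_tier(20))   # mid-tier
print(select_tier(200))  # cloud
```

In practice the rule would also weigh cost, data gravity, and sovereignty, but latency is usually the hard constraint that fixes the tier.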

This environment may include a combination of owned infrastructure, co-location services, carrier services, and public and private cloud. Operations and maintenance may be provided by dedicated staff and a network of service providers and contractors.

Other applications, such as real-time ad delivery and securities trading, require high-speed interconnection of the type provided by co-location services.


Networks will need to be segmented to allocate dedicated bandwidth to latency-sensitive processes and multitiered compute fabrics will be deployed based upon required response times, workload characteristics, security needs, and other factors.
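The segmentation idea above amounts to reserving guaranteed capacity for latency-sensitive segments first, with the remainder falling to best-effort traffic. The following is a minimal sketch of that bookkeeping; the segment names and figures are illustrative assumptions, not values from the report.

```python
# Hypothetical sketch: carve dedicated bandwidth out of a shared link for
# latency-sensitive segments; whatever is left becomes best-effort capacity.
# Segment names and numbers are illustrative assumptions.

LINK_CAPACITY_GBPS = 100

# Latency-sensitive segments receive guaranteed reservations first.
reservations = {
    "trading":       25,  # securities-trading interconnect
    "stream-ingest": 30,  # edge stream-processing backhaul
}

def allocate(reservations: dict, capacity: int) -> dict:
    """Reserve bandwidth per segment; leftover goes to best-effort."""
    reserved = sum(reservations.values())
    if reserved > capacity:
        raise ValueError("reservations exceed link capacity")
    plan = dict(reservations)
    plan["best-effort"] = capacity - reserved
    return plan

print(allocate(reservations, LINK_CAPACITY_GBPS))
# {'trading': 25, 'stream-ingest': 30, 'best-effort': 45}
```

Real deployments would enforce these reservations with QoS mechanisms on the network itself, but the planning arithmetic is the same.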

While traditional online transaction processing workloads are likely to remain in the data center or private cloud, data may also traverse the network for such uses as analytics, reporting, and sharing with business partners. For companies that operate internationally, this will create new demands on network infrastructure as well as processing considerations driven by data sovereignty regulations and cost.

Download the full report, Future-Proofing the Distributed Data Center with High-Speed Fiber Networks, courtesy of Belden, to learn more. In our next article, we’ll explore three key considerations for IT leaders seeking to future-proof their infrastructure while creating those flexible architectures: resiliency, security, and performance. Catch up on the previous article here.
