Surveillance Capitalism and DCIM

2019

In her book “The Age of Surveillance Capitalism,” the Harvard scholar Shoshana Zuboff describes how some software and service providers collect vast amounts of data with the goal of tracking, anticipating, shaping and even controlling the behavior of individuals. She sees this as a threat to individual freedom, to business and to democracy.

Zuboff outlines the actions, strategies and excesses of Facebook, Google and Microsoft in some detail. Much of this is well-known, and many legislators have been grappling with how they might limit the activities of some of these powerful companies. But the intense level of this surveillance extends far beyond these giants to many other suppliers, serving many markets. The emergence of the internet of things (IoT) accelerates the process dramatically.

Zuboff describes, for example, how a mattress supplier uses sensors and apps to collect data on sleepers’ habits, even after they have opted out; how a doll listens to and analyzes snippets of children’s conversations; and, nearer to home for many businesses, how Google’s home automation system Nest is able to harvest and exploit data about users’ behavior (and anticipated behavior) from their use of power, heating and cooling. Laws such as Europe’s General Data Protection Regulation offer theoretical protection but are mostly worked around by fine print: Privacy policies, Zuboff says, should be renamed surveillance policies.

All this is made possible, of course, because of ubiquitous connected devices; automation; large-scale, low-cost compute and storage; and data centers. And there is an irony — because many data center operators are themselves concerned about how software, service and equipment suppliers are collecting (or hope to collect) vast amounts of data about the operation of their equipment and their facilities. At one recent Uptime Institute customer roundtable with heavy financial services representation, some attendees strongly expressed the view that suppliers should (and would) not be allowed to collect and keep data regarding their data centers’ performance. Others, meanwhile, see the suppliers’ request to collect data, and to leverage the insights from that data, as benign and valuable, leading to better availability. If the supplier benefits in the process, they say, it’s a fair trade.

Of all the data center technology suppliers, Schneider Electric has moved furthest and fastest on this front. Its EcoStruxure for IT service is a cloud-based data center infrastructure management (DCIM) product (known as data center management as a service, or DMaaS) that pulls data from its customers’ many thousands of devices, sensors and monitoring systems and pools it into data lakes for analysis. (By using the service, customers effectively agree to share their anonymized data.) With the benefit of artificial intelligence (AI) and other big-data techniques, it is able to use this anonymized data to build performance models, reveal hitherto unseen patterns, make better products and identify optimal operational practices. Some of the insights are shared back with customers.
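The value of pooling data in this way comes from comparing any one site against the whole fleet. As a purely illustrative sketch (the function names, sensor values and three-sigma threshold are all hypothetical, not Schneider's actual method), a pooled set of anonymized readings can yield a fleet-wide baseline against which a single site's reading is judged:

```python
import statistics

def fleet_baseline(readings):
    """Compute a fleet-wide baseline (mean, stdev) from pooled,
    anonymized sensor readings gathered across many sites."""
    return statistics.fmean(readings), statistics.stdev(readings)

def is_anomalous(value, mean, stdev, z_threshold=3.0):
    """Flag a single site's reading that deviates from the pooled
    baseline by more than z_threshold standard deviations."""
    return abs(value - mean) > z_threshold * stdev

# Hypothetical pooled (anonymized) UPS inlet temperatures, in degrees C
pooled = [24.1, 23.8, 24.5, 24.0, 23.9, 24.3, 24.2, 24.0, 23.7, 24.4]
mean, stdev = fleet_baseline(pooled)
print(is_anomalous(31.0, mean, stdev))  # prints True: a clear outlier vs the fleet
```

No single customer could build such a baseline alone; the detection power comes entirely from the pooled data, which is the trade at the heart of the DMaaS model.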

Schneider acknowledges that some potential customers have proven resistant and suspicious, primarily for security reasons (some prefer an air gap, with no direct internet connections for equipment). But it also says that take-up of its DCIM/DMaaS products has risen sharply since it began offering the low-cost, cloud-based monitoring services. Privacy concerns are not so great that they deter operators from taking advantage of a service they like.

Competitors are also wary. Some worry about competitive advantage, that a big DMaaS company will have the ability to see into a data center as surely as if its staff were standing in the aisles — indeed, somewhat better. And it is true: a supplier with good data and models could determine, with a fairly high degree of certainty, what will likely happen in a data center tomorrow and probably next year — when it might reach full capacity, when it might need more cooling, when equipment might fail, even when more staff are needed. That kind of insight is hugely valuable to the customer — and to any forewarned supplier.

To be fair, these competitive concerns aren’t exactly new: services companies have always had early access to equipment needs, for example, and remote monitoring and software as a service are now common in all industries. But the ability to pool data, divine new patterns, and predict and even shape decisions, almost without competition … this is a newer trend and arguably could stifle competition and create vendor lock-in in a way not seen before. With the benefit of AI, a supplier may know when cooling capacity will need to be increased even before the customer has thought about it.

Uptime has discussed the privacy (surveillance?) issues with executives at several large suppliers. Unsurprisingly, those who are competitively at most risk are most concerned. For others, their biggest concern is simply that they don’t have enough data to do this effectively themselves.

Schneider, which has a big market share but is not, arguably, in a position of dominance, says that it addressed both privacy and security fears when it designed and launched EcoStruxure. It says that the way data is collected and used is fully under customer control. The (encrypted) machine data that is collected by the EcoStruxure DMaaS is seen only by a select number of trusted developers, all of whom are under nondisclosure agreements. Data is tagged to a particular customer via a unique identifier to ensure proper matching, but it is fully segregated from other customers’ data and anonymized when used to inform analytics. These insights from the anonymized models may be shared with all customers, but neither Schneider nor anyone else can identify particular sites.
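The tag-then-segregate scheme described here is a standard pseudonymization pattern. The following sketch is only an illustration of that general pattern, not Schneider's implementation; the key handling, field names and record layout are all assumptions. A keyed hash gives each customer a stable, opaque tag for matching, while identifying fields are stripped before records reach analytics:

```python
import hmac
import hashlib

# Secret held by the service operator; a hypothetical stand-in for
# whatever key management a real DMaaS provider would use.
SECRET_KEY = b"service-side-secret"

def pseudonymize(customer_id: str) -> str:
    """Derive an opaque, stable tag from a customer identifier with a
    keyed hash, so records can be matched per customer without exposing
    the identity to analytics code."""
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

def anonymize_record(record: dict) -> dict:
    """Replace identifying fields with the pseudonymous tag before the
    record is pooled for analytics."""
    cleaned = {k: v for k, v in record.items()
               if k not in ("customer", "site_name")}
    cleaned["tag"] = pseudonymize(record["customer"])
    return cleaned

raw = {"customer": "acme-dc", "site_name": "Chicago-1", "ups_load_pct": 61.5}
print(anonymize_record(raw))  # identity replaced by an opaque tag
```

The design choice matters: a keyed hash (rather than a plain one) means an outsider who knows a customer's name still cannot compute its tag, though, as Zuboff's critics of anonymization note, such tags can sometimes still be correlated back to an identity.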

Using Schneider’s services, customers can see their own data, and see it in context — they see how their data center or equipment compares with the aggregated pool of data, providing valuable insights. But still, outside of the small number of trusted developers, no one but the customer sees it — unless, that is, the customer appoints a reseller, advisor or Schneider to look at the data and give advice. At that point, the supplier does have an advantage, but the duty is on the customer to decide how to take that advice.

None of this seems to raise any major privacy flags, but it is not clear that Zuboff would be entirely satisfied. For example, it might be argued that the agreement between data center operators and their suppliers is similar to the common practice of the “surveillance capitalists.” These giant consumer-facing companies offer a superior service/product in exchange for a higher level of data access, which they can use as they like; anyone who denies access to the supplier is simply denied use of the product. Very few people ever deny Google or Apple access to their location, for example, because doing so will prevent many applications from working.

While DCIM is unusually wide in its scope and monitoring capability, this is not just about software tools. Increasingly, a lot of data center hardware, such as uninterruptible power supplies, power distribution units or blade servers, requires access to the host for updates and effective monitoring. Denying this permission reduces the functionality — probably to the point where it becomes impractical.

And this raises a secondary issue that is not well covered by most privacy laws: Who owns what data? It is clear that the value of a supplier’s AI services grows significantly with more customer data. The relationship is symbiotic, but some in the data center industry are questioning the balance. Who, they ask, benefits the most? Who should be paying whom?

The issue of permissions and anonymity can also get muddy. In theory, an accurate picture of a customer (a data center operator) could be built (probably by an AI-based system) using data from utilities, cooling equipment, network traffic and on-site monitoring systems — without the customer ever actually owning or controlling any of that data.

Speaking on regulation and technology at a recent Uptime meeting held in the United States, Alex Howard, a Washington-based technology regulation specialist, advised that customers could not be expected to track all this, and that more regulation is required. In the meantime, he advised businesses to take a vigilant stance.

Uptime’s advice, reflecting clients’ concerns, is that DMaaS provides a powerful and trusted service — but operators should always consider worst cases and how a dependency might be reversed. Big data provides many tempting opportunities, and anonymized data can, in theory, be breached and sometimes de-anonymized. Businesses, like governments, can change over time, and armed with powerful tools and data, they may cut corners or breach walls if the situation — or a new owner or government — demands it. This is now a reality in all business.

For specific recommendations on using DMaaS and related data center cloud services, see our report Very smart data centers: How artificial intelligence will power operational decisions.
