Time to consider MFT to mitigate the risk of data breaches and non-compliance with regulatory mandates

This is a guest post by Saurabh Sharma, a Senior Analyst with Ovum IT and a member of Ovum’s Infrastructure Solutions team.

File transfer may seem a rather innocuous process from the perspective of business users, but IT worries about where the associated data will ultimately reside and who will be able to access it. Enterprise mobility and the emergence of cloud services have driven the uptake of ad hoc approaches to file transfer, which suit the “working style” of business users. An often-neglected aspect is the lack of data security, governance, and reporting capabilities, which renders file transfer protocol (FTP) and other traditional approaches to file transfer obsolete, especially in light of customer service-level agreements (SLAs) and stringent regulatory compliance mandates.

A recent Ovum primary research survey revealed some interesting and not-so-innocuous trends and figures:

  • On average, 32% of business-critical processes involve file transfers and about 4% of FTP-based file transfers fail.
  • The average total cost of a data loss/breach incident is $350 per breached record, or about $3 million per incident overall.
  • About a quarter of survey respondents revealed that their organization failed a security audit in the last 3 years.
  • 17% indicated “no confidence” in passing a compliance audit with their existing file transfer solutions.
  • There is little inclination to shift towards a “cloud-only” model for delivery of file transfer capabilities, with only 11% relying on software-as-a-service-based (SaaS-based) file transfer solutions for all of their file transfer needs.

These figures reveal the extent of vulnerability enterprises are exposed to when mission-critical data is knowingly or unknowingly shared with external parties without any appropriate data security and governance provisions. Clearly, to mitigate data breaches, enterprises need a more robust approach to file transfer. Managed file transfer (MFT) fits the bill for such requirements and can effectively meet stringent regulatory compliance mandates.
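The gap between plain FTP and managed file transfer can be made concrete. The sketch below is illustrative only and does not depict any particular MFT product; it shows two of the capabilities the survey findings point to: end-to-end integrity verification and an audit record for every transfer. All names are hypothetical, and a local copy stands in for the real transport (SFTP, AS2, and so on).

```python
import hashlib
import logging
import shutil
from pathlib import Path

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("mft-sketch")


def sha256_of(path: Path) -> str:
    """Compute a SHA-256 digest, used here for integrity verification."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def audited_transfer(src: Path, dst: Path) -> str:
    """Move a file, verify its checksum end to end, and log an audit record.

    shutil.copy2 stands in for the actual transport layer (SFTP, AS2, ...).
    """
    digest_before = sha256_of(src)
    shutil.copy2(src, dst)
    digest_after = sha256_of(dst)
    if digest_before != digest_after:
        log.error("transfer FAILED integrity check: %s", src.name)
        raise IOError(f"checksum mismatch for {src.name}")
    # The audit log is what lets an organization answer a compliance auditor:
    # what was sent, when, and whether it arrived intact.
    log.info("transferred %s sha256=%s", src.name, digest_before)
    return digest_before
```

Bare FTP provides neither step: a failed or tampered transfer simply goes unnoticed, which is precisely the visibility gap the survey figures describe.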

One might argue that with 80% of the IT budget being used for “keeping the lights on,” it is rather difficult to secure funding for a comprehensive MFT solution. However, given the level of business risk associated with data breaches and non-compliance with regulatory mandates, IT can effectively build a strong business case that specifies how a shift to MFT will add business value.

A comprehensive MFT solution will provide off-the-shelf integration with common middleware platforms and security products, along with end-to-end visibility into, and monitoring of, file transfers. It will help ensure rapid onboarding of new customers and partners, and help govern interactions with trading partners. Clearly, there are “more than enough” reasons to abandon “islands” of file transfer infrastructure and shift to a comprehensive MFT solution.

Many IT leaders have so far failed to see the big picture and do not pay due attention to the need to govern the flow of data within, at, and beyond the edge of the enterprise. What enterprises need is a central governance layer on top of the different components of the existing middleware stack, and this could be realized with a suitable combination of MFT, B2B integration, and API management solutions. For IT leaders, there is a clear call to action to safeguard mission-critical data against unauthorized access, irrespective of the means used to transfer this data, both within and outside the enterprise. Moreover, it is never too late to start bridging the gaps between enterprise integration infrastructure and data security and governance frameworks.

Saurabh is a Senior Analyst with Ovum IT and a member of Ovum’s Infrastructure Solutions team. His research covers integration infrastructure and enterprise integration strategies that span application-to-application (A2A), B2B, and cloud service integration. He also focuses on other associated disciplines such as API management, integration and solution architectures, and communications integration.

To review the whitepaper titled “The Imperative for Effective Data Flow Governance in Response to Data Security, Risk Mitigation, and Compliance Requirements,” please click here.

The CIO’s priorities for purchasing services and applications in 2015

This is a transcript of The Axway Podcast of the same name.

ANNOUNCER: From Phoenix, Arizona, this is The Axway Podcast. Here’s your host, Mike Pallagi.

PALLAGI: In November, CIO.com published an article by Kevin Corbin that reviewed a survey of the members of the National Association of State Chief Information Officers (NASCIO). The survey aimed to identify state CIOs’ goals for the coming year and their priorities for purchasing services and applications. The three priorities: cybersecurity, adopting cloud services, and optimizing and consolidating resources and services. So I asked Rob Meyer, Axway’s vice president of the API Management Business Line, to tell me a little about their significance.


MEYER: Regarding the first priority around cybersecurity: if you look at SafeNet or other sites that track the number of breaches, we’ll probably end up close to two billion records breached this year. And it’s going to continue to get worse. What’s really behind it is that most of these systems were designed assuming they were behind a firewall.

PALLAGI: But a lot of the initiatives that are taking state and federal agencies toward digital business — meaning becoming digital agencies — like the cloud, like adopting mobile devices, or even just integrating more with other agencies and companies — they all break that assumption.

MEYER: It’s like trying to defend against modern bombers, paratroopers, and helicopters with a drawbridge and a moat. It just doesn’t work. And so, to deal with cybersecurity today, companies, agencies, and state and local organizations have to build up a new layer: the people, processes, and technology that replace the traditional firewall, the moat, with the right approaches and best practices for cybersecurity today.

PALLAGI: Then there’s the second priority, adopting cloud services. Corbin writes that, “State CIOs named the adoption of cloud services their second priority for 2015, with survey respondents identifying strategy, provider selection and governance models, among others, as key areas of focus. Unsurprisingly, CIOs said that security is also a top concern as they consider moving services and applications to the cloud.”

MEYER: If you do anything with cloud, something needs to ground the cloud. It needs to connect it to all the on-premise systems. And that layer just doesn’t exist. If it did, it really wouldn’t be a priority today. People would be able to just hook up their on-premise applications and systems with cloud services. So there needs to be a secure integration layer — some people even call this a “service virtualization layer” or a “service control layer” — that surfaces and exposes the inside services in the systems, turns them into outside secure APIs or web services or even files that are hardened, auditable, secure, and documented, and that can be discovered and consumed through self-service by people outside the firewall.
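As a rough illustration of that service virtualization idea (not Axway’s implementation), the sketch below fronts a hypothetical internal service with a gateway that authenticates an API key, routes an external path to the inside service, and writes an audit record for every call. All routes, keys, and service names here are invented for the example.

```python
import hmac
import logging
from typing import Callable, Dict

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("gateway-audit")


# Hypothetical internal, behind-the-firewall service.
def _internal_orders(_payload: dict) -> dict:
    return {"orders": ["A-1001", "A-1002"]}


# External route -> internal handler: the "service virtualization" mapping
# that exposes inside services as outside APIs.
ROUTES: Dict[str, Callable[[dict], dict]] = {
    "/api/v1/orders": _internal_orders,
}

# Keys would be issued through partner onboarding; this one is illustrative.
API_KEYS = {"partner-key-123"}


def gateway(path: str, api_key: str, payload: dict) -> dict:
    """Expose an internal service externally: authenticate, route, audit."""
    # Constant-time comparison avoids leaking key contents via timing.
    if not any(hmac.compare_digest(api_key, k) for k in API_KEYS):
        audit.warning("DENIED %s", path)
        return {"status": 401, "error": "invalid API key"}
    handler = ROUTES.get(path)
    if handler is None:
        audit.warning("NOT FOUND %s", path)
        return {"status": 404, "error": "unknown API"}
    audit.info("ALLOWED %s", path)
    return {"status": 200, "body": handler(payload)}
```

The point of the layer is that the internal service never faces the outside world directly: every external call is authenticated, mapped, and audited at the boundary.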

PALLAGI: In short, Rob said, that’s what’s truly needed if people are going to be successful with cloud today. To optimize, they need something that grounds it. And that brings us to the last priority: optimizing and consolidating resources and services. What was Rob’s take?

MEYER: Regarding the last priority around optimizing and consolidating resources and services: it’s important to consolidate the hardware and software and all the boxes being used at what many thought was about ten percent utilization. And so, you saw a lot of data center consolidation initiatives at the federal, state, and local levels. But these CIOs do have bigger issues that come out once they start consolidating.

PALLAGI: So when CIOs start putting up all these applications, whether they’re legacy or packaged or custom, they start to realize that the hardware and the software was only about 25 percent of the cost of the app.

MEYER: And when you end up with several of the same types of apps running in the same data center, you might ask, “Should I consolidate these, and how?” Or executives outside of IT ask about improving the processes we have: “What kinds of changes should we make to the business processes we have in place that are going to make the agency, the government, more efficient?” That triggers bigger changes. That triggers business process improvement or some other kind of consolidation.

PALLAGI: And what you don’t want to do after all of this work is end up making the same integration mistakes all over again, just under the covers, so to speak.

MEYER: If you do what you did on premise in the cloud, you could call it spaghetti as a service. And that’s not where you want to end up. You want to invest now in a center of excellence, exactly the way we did with SOA, to help govern these end-to-end flows and integrations that cross the firewall between the agencies and the outside corporations involved, to reuse the work, to monitor and manage all these integrations, and to avoid having another integration nightmare.

To read Kevin Corbin’s article, please click here.

To learn more about cloud integration, please click here.