Is your MFT ready for the IBM z13?

IBM recently announced the availability of a “new” mainframe dubbed the z13. Flying in the face of the “mainframe is dead” mantra, IBM is introducing a mainframe that is poised to address the needs of mobile, analytics (big data), cloud, and security. According to IBM, it can handle 2.5 billion transactions a day. It’s a bit more impressive when it’s written out fully: 2,500,000,000 transactions a day. This represents the total number of transactions of 100 cyber-malls … for every day of the year!

It’s clear that IBM sees value in what many pundits deem legacy technology. In fact, with this introduction, IBM is stating that legacy infrastructure can (and should) be updated and invested in. “Legacy” technology is often the heart of a business and cannot simply be ripped out and replaced. It can, however, be “upgraded,” as IBM is showing here. IBM is also acknowledging that legacy technology has concerns that must be addressed (note the focus on security: real-time encryption of data), in addition to adding new functionality.

So the mainframe is being updated — what does this mean for other infrastructure? Well, I’d argue it means that everything considered legacy should be open to updating. Just because it was working (mainframes worked) doesn’t mean it meets the current business environment’s needs around security, visibility, and integration.

This brings us to managed file transfer (MFT) technology. It’s often considered legacy, and it’s often assumed it should be left in place “because it does the job.” Whether your file transfer technology is some version of SFTP or a purpose-built MFT solution, the introduction of the z13 should prompt you to take the time to reevaluate your legacy technology.

We at Axway recommend that you review your legacy technology, including your MFT technology, and ask the question, “Is my MFT ready for the z13?”

Time to consider MFT to mitigate the risk of data breaches and non-compliance with regulatory mandates

This is a guest post by Saurabh Sharma, a Senior Analyst with Ovum IT and a member of Ovum’s Infrastructure Solutions team.

File transfer may seem a rather innocuous process from the perspective of business users, but IT worries about where the associated data will ultimately reside and who will be able to access it. Enterprise mobility and the emergence of cloud services have driven the uptake of ad hoc approaches to file transfer, which suit the “working style” of business users. An often-neglected aspect is the lack of data security, governance, and reporting capabilities, which renders file transfer protocol (FTP) and other traditional approaches to file transfer obsolete, especially from the perspective of customer service-level agreements (SLAs) and stringent regulatory compliance mandates.

A recent Ovum primary research survey revealed some interesting and not-so-innocuous trends and figures:

  • On average, 32% of business-critical processes involve file transfers and about 4% of FTP-based file transfers fail.
  • The average total cost of a data loss/breach incident is $350 per breached record (or $3 million, on an overall basis).
  • About a quarter of survey respondents revealed that their organization failed a security audit in the last 3 years.
  • 17% indicated “no confidence” in passing a compliance audit with the existing file transfer solutions.
  • There is little inclination to shift towards a “cloud-only” model for delivery of file transfer capabilities, with only 11% relying on software-as-a-service-based (SaaS-based) file transfer solutions for all of their file transfer needs.

These figures reveal the extent of vulnerability enterprises are exposed to when mission-critical data is knowingly or unknowingly shared with external parties without any appropriate data security and governance provisions. Clearly, to mitigate data breaches, enterprises need a more robust approach to file transfer. Managed file transfer (MFT) fits the bill for such requirements and can effectively meet stringent regulatory compliance mandates.

One might argue that with 80% of the IT budget being used for “keeping the lights on,” it is rather difficult to secure funding for a comprehensive MFT solution. However, given the level of business risk associated with data breaches and non-compliance with regulatory mandates, IT can build a strong business case that specifies how a shift to MFT will add business value.

A comprehensive MFT solution will provide off-the-shelf integration with common middleware platforms and security products, along with end-to-end visibility into, and monitoring of, file transfers. It will help ensure rapid onboarding of new customers and partners, and it will govern interactions with trading partners. Clearly, there are “more than enough” reasons to abandon “islands” of file transfer infrastructure and shift to a comprehensive MFT solution.

Many IT leaders have so far failed to see the big picture and do not pay due attention to the need to govern the flow of data within, at, and beyond the edge of the enterprise. What enterprises need is a central governance layer on top of the different components of the existing middleware stack, and this could be realized with a suitable combination of MFT, B2B integration, and API management solutions. For IT leaders, there is a clear call to action: safeguard mission-critical data against unauthorized access, irrespective of the means used to transfer it, both within and outside the enterprise. Moreover, it is never too late to start bridging the gaps between enterprise integration infrastructure and data security and governance frameworks.

Saurabh is a Senior Analyst with Ovum IT and is a member of Ovum’s Infrastructure Solutions team. His research covers integration infrastructure and enterprise integration strategies that span across application-to-application (A2A), B2B, and cloud service integration. He also focuses on other associated disciplines such as API management, integration and solution architectures, and communications integration.

To review the whitepaper titled “The Imperative for Effective Data Flow Governance in Response to Data Security, Risk Mitigation, and Compliance Requirements,” please click here.