The Imperative for Effective Data Flow Governance

This is a transcript of the Axway podcast of the same name.

ANDREWS: One of the things that really comes through is that the status quo, while it's the easiest thing for IT organizations to deal with, is problematic. The reason why is that we don't see technology remaining status quo in any other environment or any other part of our lives, so why should it be that way for IT departments?

ANNOUNCER: From Phoenix, Arizona, this is The Axway Podcast. Here’s your host, Mike Pallagi.

PALLAGI: Recently, Axway and Ovum — a leading global technology research and advisory firm — announced the results of a global study that examined data security, governance, and integration challenges facing organizations. The study highlighted how the growing complexity of governance and compliance initiatives challenge IT integration and C-level executives, and how isolation between IT integration and corporate governance forms economic and reputational risks. Of the 450 respondents from North America, Asia Pacific, and EMEA, 23 percent said their company failed a security audit in the last three years, while 17 percent either didn’t believe or didn’t know if they would pass a compliance audit today. The study also revealed that the average overall cost of a data breach was $3 million. So to learn more about the significance of these numbers, I caught up with John Andrews, Axway’s Director of Solution Marketing for Managed File Transfer. Here’s John again.

ANDREWS: Business environments are changing, and external threats are changing. If you don't look at your key technology and adapt it to these upcoming threats and business environments, that exposes problems that may have already been there, or it can create new problems as the environment changes. So what we really want people to understand is that using things like FTP and other traditional file transfer approaches, staying status quo, will ultimately lead to problems in their overall environment.

PALLAGI: The Ovum report pointed out that a comprehensive MFT solution is essential for meeting increasingly complex data security governance requirements.

ANDREWS: It used to be enough to just secure transfers, and what organizations worried about was the delivery. Now, that's not enough. You have to be able to deliver, track, and audit every transfer that you make. And it has to be done securely, not only for the transport, but for the data as well. The only way you can really do this is by implementing a comprehensive governance solution that not only manages policies and configurations, but is also easy to update as business environments and external threats change. So this ties into that initial point: staying status quo doesn't really help organizations.

PALLAGI: And what about community management and how MFT simplifies it?

ANDREWS: This is probably one of the biggest challenges companies experience. It used to be that interactions were with just a handful of external partners. You had key people that you worked with, and you worked with them consistently all year round. With the way business environments have changed, you still have your core partners, but you also have seasonal or transitory partners. This is because of the diversification of the way people do business, cost competition in supplying parts and products that companies use, and sometimes it's just timing.

PALLAGI: So, for example, your core partner may not have something that you need at the right time so you’ll engage for a short period of time with a new partner. The problem is that the onboarding process has always been somewhat complex and cumbersome, so if there’s a way to simplify that through a governance process, then THAT can create significant business opportunities for organizations by giving them flexibility on how they interact with different partners.

ANDREWS: Then the Ovum report moves on to talk about a consolidation strategy for file transfer infrastructure, used to reduce footprints and maintenance and support costs. What we've seen in the past two to three years is that IT budgets have rebounded somewhat, and the data breaches at companies like Target and, most recently, JPMorgan Chase have really exposed how serious the security concerns are. The problem is that money isn't simply being thrown at these problems, so IT departments need to come up with a cohesive and comprehensive strategy that allows them to address these security needs. And when you dig into the facts, it's clear that file transfer technology has to be part of this. Companies still using FTP are at greater risk of data breaches, and because of this, file transfer technology is probably even more crucial to businesses than it has been in the past.

PALLAGI: Also, the report indicated that there’s a transition going on from governance silos to a central governance layer.

ANDREWS: This is something that, at Axway, we've been talking about for a while: companies need a comprehensive strategy for their file transfer technology. If you look at one of the aerospace companies that we work with, they have both military and civilian divisions. While the civilian division does not have the same kind of security requirements as the military division, they're both within the same organization. And if one of the divisions has a more lax environment, it could cause trouble for the military division, because they are interconnected.

PALLAGI: That organization took a centralized approach to the way that they do file transfers and made sure that each of the divisions actually adhered to the policies that they laid out.

ANDREWS: This centralized vision and control of information is then independent of the divisional silos, and it makes sure that the data moving in and around the environment, and externally, is secured. Finally, the Ovum report identifies that there needs to be a more thought-out approach to MFT, B2B integration, and API management solutions. The way I summarize this is: just as you don't want corporate silos in your organization, you don't want technology silos within your IT organization, meaning a separate MFT group, a separate B2B group, and a separate API group. Those technologies, the technologies that span firewalls, need to be managed more holistically. This means they need policies that can be applied across all the technologies with the same level of security, and you should be able to govern them centrally so that you can ensure consistency in the way data moves in and out of your environments.
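The central-governance idea above, one policy evaluated against every silo rather than per-silo rules, can be sketched in a few lines. This is an illustrative sketch only; the field names, policy keys, and endpoint configs are hypothetical, not an Axway API.

```python
# Hypothetical central policy applied uniformly to MFT, B2B, and API endpoints.
POLICY = {
    "min_tls": (1, 2),           # require TLS 1.2 or newer
    "require_auth": True,        # no unauthenticated access
    "audit_all_transfers": True, # every transfer must be logged
}

def violations(endpoint: dict) -> list:
    """Return the list of central-policy violations for one endpoint config."""
    problems = []
    if tuple(endpoint.get("tls_version", (0, 0))) < POLICY["min_tls"]:
        problems.append("TLS version below policy minimum")
    if POLICY["require_auth"] and not endpoint.get("authenticated", False):
        problems.append("unauthenticated access allowed")
    if POLICY["audit_all_transfers"] and not endpoint.get("audit_log", False):
        problems.append("transfers are not audited")
    return problems

# The same check runs against every technology silo, exposing the lax one:
endpoints = {
    "mft_gateway": {"tls_version": (1, 2), "authenticated": True, "audit_log": True},
    "legacy_ftp":  {"tls_version": (0, 0), "authenticated": True, "audit_log": False},
}
for name, cfg in endpoints.items():
    print(name, violations(cfg) or "compliant")
```

The design point is that the policy lives in one place; a division cannot quietly relax it in its own silo.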

To review the whitepaper titled “The Imperative for Effective Data Flow Governance in Response to Data Security, Risk Mitigation, and Compliance Requirements,” please click here.

To listen to the podcast on YouTube (audio only), please click here.

The Home Depot data breach and why hackers love FTP

This is a transcript of the Axway podcast of the same name.

ANDREWS: Security can always be breached. It's that visibility piece, more about detection, that really would have helped people at Home Depot understand that something unusual was happening in their environment. That is, if the attackers were able to get past the security portions of the APIs anyway.

ANNOUNCER: From Phoenix, Arizona, this is The Axway Podcast. Here’s your host, Mike Pallagi.

PALLAGI: In early September, The Home Depot’s banking partners and law enforcement notified them of unusual activity connected to their payment systems. The Home Depot’s IT security team immediately began working with leading IT security firms, their banking partners, and the Secret Service to investigate. That investigation confirmed that a breach of The Home Depot’s payment card systems occurred. Since then, they’ve fixed it, but I had a question: What could they have done to prevent it? Here’s what John Andrews, Axway’s Director of Solution Marketing for Managed File Transfer, had to say.

ANDREWS: FTP is a very old technology. The original specification for it was published in April of 1971. The specification actually pre-dates TCP as a major means of communication between computers. It was created more for academic research: how do academics share information electronically rather than putting their research into an envelope, mailing it, and hoping it gets there? It got used very quickly in computer science circles because it was a way to share information.

PALLAGI: Andrews said that FTP was never designed with security in mind and because of that, it’s become one of the favorite venues for hackers looking to get into a corporate network.

ANDREWS: They have built security on top of it. However, the secure variants, while they provide protection for logging in and protect the data in motion, don't provide any audit or log capability. So thinking back to the API example, if you can't track or have visibility into what's being done, it becomes harder to detect unusual patterns. That lack of traceability, visibility, and auditability makes FTP a very insecure piece of software. If you think along a CSI type of metaphor, there isn't a lot of evidence left by FTP to trace back or identify who actually committed the crime. And that's why hackers love it.

PALLAGI: So when it comes to logging into an FTP server, you just need a username and password, and there's nothing to authenticate that the username and password belong to the person who's trying to log in. Also, when a person logs in, it's not actually recorded, so there's no audit trail.

ANDREWS: If a file is moved, there's no log of it. There are plenty of examples of hackers using FTP to gain user credentials. If you Google some of these, you'll find them. And I have a bunch of examples. In 2001, Yale University had 43,000 user IDs exposed because the database with all that user information was stored on an FTP server. In the same year, 40,000 Acer customers had their details stolen, again because the information was stored on the company FTP server. More recently, the credentials of 7,000 FTP sites were circulated in underground forums; that was discovered by a company called Hold Security, probably just about a year ago. If you look at all the other breaches that have happened recently, while it hasn't been clearly stated, FTP could be a primary suspect in allowing hackers to get into systems.

PALLAGI: Is there anything a business can do to track that activity? Some set of improvised actions? A best practice?

ANDREWS: Once somebody gets access to an FTP server, there's no log of that activity. Now, you can write manual scripts to try and track it, but it's far from foolproof and often requires a lot of maintenance, so you're never quite sure if you're getting all the information. In fact, if you Google for it, you can find a script written in Python that will scroll through a range of IP addresses and tell you whether there's an FTP server on each machine, whether it's working, and whether anonymous login is available on that server.
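The kind of scanner Andrews describes takes only a few lines with Python's standard ftplib. This is a hedged illustration of how low the bar for such reconnaissance is; the IP range in the usage example is the reserved TEST-NET documentation block, not a real target.

```python
import socket
from ftplib import FTP, error_perm
from ipaddress import ip_address

def ip_range(start, end):
    """Return every IPv4 address from start to end, inclusive."""
    a, b = int(ip_address(start)), int(ip_address(end))
    return [str(ip_address(i)) for i in range(a, b + 1)]

def check_ftp(host, timeout=2.0):
    """Return (reachable, anonymous_ok) for the FTP service on host:21."""
    ftp = FTP()
    try:
        ftp.connect(host, 21, timeout=timeout)
    except OSError:
        return (False, False)            # no FTP server answering
    try:
        ftp.login()                      # ftplib defaults to user "anonymous"
        anonymous_ok = True
    except error_perm:
        anonymous_ok = False             # server refused anonymous login
    finally:
        ftp.close()
    return (True, anonymous_ok)

if __name__ == "__main__":
    # TEST-NET-1 (192.0.2.0/24) is reserved for documentation examples.
    for host in ip_range("192.0.2.1", "192.0.2.5"):
        reachable, anon = check_ftp(host, timeout=0.5)
        if reachable:
            print(f"{host}: FTP open, anonymous login {'ALLOWED' if anon else 'refused'}")
```

An attacker sweeping a corporate address block this way gets an instant inventory of exposed, possibly anonymous, FTP servers.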

PALLAGI: What else is it about FTP that makes it so attractive to hackers?

ANDREWS: There are a number of things about FTP that make it highly susceptible to hacking. For example, it can be used in a brute force attack: checking every single port on an IP address to see if it has an FTP server exposed. You can do bounce attacks: using the PORT command, you try to use the FTP server as a way to connect to another system. You also have packet capture: if you know which port the FTP server is listening on, you can listen on that port and analyze the packets as they go to the server. And while the secure versions of FTP can address this, you still don't have traceability. Ultimately, an FTP server sits on top of a file system that is usually connected to your internal network. So once you get access to the FTP server, you have access to the internal file system. Once you have access to the internal file system, you can access databases and LDAP stores. If you know what you're doing and where to look, FTP is that proverbial back door into a network environment, where you can find almost anything you're looking for.
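The packet-capture weakness comes down to FTP's plaintext control channel: the USER and PASS commands cross the wire verbatim, so anyone with a capture of port 21 traffic can pull credentials out with a trivial parser. A minimal sketch (the capture text below is fabricated for illustration):

```python
import re

def extract_credentials(capture: str):
    """Pull USER/PASS pairs out of a plaintext FTP control-channel capture."""
    users = re.findall(r"^USER (\S+)", capture, re.M)
    passwords = re.findall(r"^PASS (\S+)", capture, re.M)
    return list(zip(users, passwords))

# What a sniffer on port 21 sees during a login (fabricated example):
capture = """220 ftp.example.com ready
USER mike
331 Password required
PASS hunter2
230 Login successful"""

print(extract_credentials(capture))  # [('mike', 'hunter2')]
```

FTPS and SFTP encrypt this exchange, which is why the passage above notes that the secure versions address the capture problem while still leaving the traceability gap.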

PALLAGI: What can organizations do to reduce their reliance on FTP and secure information in motion?

ANDREWS: First and foremost, two things really come to mind. The first is a higher level of security, meaning that username and password aren't always going to be enough to connect to an MFT solution. While we can mimic FTP functionality, access may require not only a username and password but additional authentication, so we can raise the level of security needed for people trying to get in. We also abstract the physical file system away from the attackers so they don't have easy access to the back-end network. Most importantly, we track and audit all of the interactions. So if you log in as Mike Pallagi, we will see that login, exactly what protocol you were using, and what you tried to access. That audit log is hugely beneficial, especially in diagnostic and troubleshooting situations, and it provides a level of visibility that FTP doesn't have. With an MFT solution, not only are you going to log that activity, you're going to be able to see how frequently and who's trying to gain access. That visibility allows for preventive measures rather than reactive measures.
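The per-transfer audit trail described here, who did what, when, over which path, and to which data, can be illustrated with a small sketch. The record fields below are hypothetical, chosen to show the kind of evidence plain FTP never produces; they are not Axway's actual log format.

```python
import hashlib
import json
import time

def audit_record(user: str, action: str, path: str, data: bytes) -> dict:
    """Build a per-transfer audit entry of the kind an MFT solution logs.

    Field names are illustrative. The content hash lets an investigator
    later verify exactly which bytes moved, something FTP cannot answer.
    """
    return {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "user": user,
        "action": action,   # e.g. "login", "upload", "download"
        "path": path,
        "bytes": len(data),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

rec = audit_record("mpallagi", "upload", "/inbound/report.csv", b"col1,col2\n1,2\n")
print(json.dumps(rec, indent=2))
```

With every transfer producing a record like this, the "CSI" problem from the first half of the interview disappears: there is evidence to trace back, and unusual access patterns become visible before they become breaches.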

To download the first two parts of Axway’s three-part MFT Survival Guide series, please click here.

To view the video blog on YouTube, please click here.