
Microsoft’s free secure FTP server for Windows Server 2008

The dirty little secret in information security is that any person or company using FTP to transfer files is probably violating every security compliance requirement under the sun, and most companies are guilty of it.  The FTP protocol transmits both authentication credentials and payload completely unencrypted, in the clear.  If those authentication credentials are shared with other access controls in the organization, then far more than the FTP server is at stake: a sniffed FTP password can lead to a much larger security compromise.

While HTTPS (HTTP over SSL) has solved the problem for data distribution (users downloading), it doesn’t solve the data collection problem (users uploading).  FTP is primarily used to let users upload files to the server, and if any form of access control is implemented on the FTP server, the user has to authenticate in clear text.  If this is done over an insecure connection such as a wireless hotspot, or if an attacker snoops a wired connection by other means, then both the user credentials and the data are completely exposed.

While a secure version of FTP called “FTPS” (FTP over SSL or TLS) has existed for years, it’s simply not commonly used because there is no bundled FTPS client in Windows or Internet Explorer, which means most people are only ever exposed to plain FTP.  On the server side, FTPS has been available in various commercial packages, but it didn’t come out of the box until now.  Microsoft has published a free FTP server add-on for Windows Server 2008 that supports FTP over SSL/TLS, and I’ve included the links below.

On the client end, there are few reputable free FTPS clients that I am aware of.  The closest thing to a good free FTPS client is SmartFTP, but it’s only free for personal, educational, or non-profit use.  Kevin in the comment section recommended FileZilla, which appears to be an Open Source client.

To deploy FTPS on the server side, you’re going to need a digital certificate that’s trusted by the client.  I would recommend reading an article I wrote in 2007, “How to implement SSL or TLS secure communications“.  The easiest way to get a trusted certificate is to buy one from a publicly trusted Certificate Authority, and the cheapest one I’m aware of is GoDaddy.com SSL at $30/year per certificate.
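
For reference, here is a minimal sketch of generating the private key and certificate signing request (CSR) you would submit to the CA, assuming OpenSSL is at hand and using a hypothetical file/host name (IIS has its own certificate request wizard that accomplishes the same thing):

$>openssl genrsa -out www.formortals.com.key 2048
$>openssl req -new -key www.formortals.com.key -out www.formortals.com.csr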

Important note: there’s no need to get a $300 certificate from a name-brand SSL company because THERE IS NO DIFFERENCE.  Even if you insist on buying a $300 certificate from one of those name-brand security companies, any compromise at GoDaddy.com will still affect you and everyone else in the world.  If you buy a certificate at GoDaddy.com and there is a compromise at VeriSign (this has happened before), then that also compromises everyone.  This is the trust model in commercial PKI and there’s nothing you can do about it.  What you can do is refuse to overpay hundreds of dollars for a “name brand” digital certificate and make sure you implement best practice.

I know so many “security experts” in corporations who refuse to buy anything but name-brand certificates.  Then, because they don’t have the budget to buy all the name-brand certificates they need, they use home-grown certificates or expired certificates and ask their users to bypass the warning, which conditions users for easy exploitation later.  The lesson here is that security shouldn’t be about brand names and ego.

When you’re buying a certificate, it is possible to use the same certificate for multiple servers and services if they share a common host name.  So if I buy a certificate with a common name of www.ForMortals.com, I could use it for HTTPS or FTPS.  That means https://www.ForMortals.com and ftps://www.ForMortals.com would both be valid, because the certificate is bound only to the host name and not to the protocol.  If I load-balanced across 10 servers, I could copy the same certificate to all 10 servers and that would be perfectly valid.  But if I wanted to host an FTPS site at ftp.ForMortals.com, then I would not be able to share the certificate with www.ForMortals.com.
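
A quick way to confirm which host name a certificate is actually bound to is to pull the subject off a live server.  A minimal sketch using OpenSSL against my own site:

$>echo | openssl s_client -connect www.ForMortals.com:443 2>/dev/null | openssl x509 -noout -subject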

  1. May 11th, 2008 at 13:21 | #1

    Folks, George’s solution is excellent, but I have another: Cygwin

    Go here:

    http://www.cygwin.com/

    Included in the package bundles is OpenSSH Secure Shell, which includes ssh, sftp, and scp.

    ssh is the most widely used Internet server administration tool for good reason: security.
    Be sure that your sshd service’s sshd_config uses Protocol 2 only and sets PermitRootLogin to no.
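
    The relevant directives in sshd_config (typically /etc/sshd_config under Cygwin) look like this:

    Protocol 2
    PermitRootLogin no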

    You can also set up passwordless access with DSA public/private key authentication.
    Once set up, you open your Cygwin terminal window (just like a DOS cmd window) and type:

    $>sftp username@hostname

    And you are connected with sftp over ssh, fully encrypted and secure.
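
    A rough one-time key setup looks something like this (stock OpenSSH tools; username@hostname is a placeholder):

    $>ssh-keygen -t dsa
    $>ssh-copy-id username@hostname

    If ssh-copy-id isn’t in your Cygwin install, just append the generated ~/.ssh/id_dsa.pub to ~/.ssh/authorized_keys on the server by hand.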

    Want to SOCKS5 proxy your web browser activity to your remote server? No problem:
    $>ssh -D 8080 username@hostname

    Set your browser to use proxy of 127.0.0.1 and port 8080 and you are off!

    Don’t have the newest Windows Terminal Server with encryption and want a secure RDP session? No problem, just tunnel your RDP over ssh:

    $>ssh -o TCPKeepAlive=yes -L 3389:localhost:3389 -f -N -l remote_user_name remote_host

    This sets up the tunnel.

    Then, type into your XP Remote Desktop Connection ‘computer name’ field: localhost

    Your Windows XP rdp (port 3389) session is now safely encrypted.

    No certificates required for securing your ftp.
    Out of pocket cost: $0

    I could go on about ssh, but I will restrain myself. :)
    Plenty of info on it. Google is your friend.

    Thanks George.

  2. May 11th, 2008 at 20:47 | #2

    We’re not talking about secure RDP here. Besides, there are ways to secure RDP with certificate-based authentication and FIPS grade encryption. http://articles.techrepublic.com.com/5100-10878_11-6166676.html

  3. May 11th, 2008 at 20:51 | #3

    "No certificates required for securing your ftp. Out of pocket cost: $0"

    SSH without certificates is pretty easy to attack from the middle. A lot of people use it and just blindly accept whatever host key the server throws up. That’s ripe for a man-in-the-middle attack. The whole point of certificates for HTTPS and FTPS is that you’re making a server available to the general public that is already trusted. You cannot expect people to manually compare public keys and hashes, and that’s something that OSS types need to understand.

  4. May 11th, 2008 at 22:39 | #4

    >"SSH without certificates is pretty easy to attack from the middle".

    As I wrote: "You can also set up passwordless access with DSA public/private key authentication."

    Reading the footnote at ssh.com for generating keys:

    http://www.ssh.com/support/documentation/online/ssh/adminguide/32/Generating_the_Host_Key.html

    " Note: Administrators that have other users connecting to their sshd2 daemon should notify the users of the host-key change. If you do not, the users will receive a warning the next time they connect, because the host key the users have saved on their disk for your server does not match the host key now being provided by your sshd2 daemon. The users may not know how to respond to this error. You can run the following to generate a fingerprint for your new public host key which you can provide to your users via some unalterable method (such as digitally signed email):

    ssh-keygen2 -F hostkey.pub

    When the users connect and receive the error message about the host key having changed, they can compare the fingerprint of the new key with the fingerprint you have provided in your email, and ensure that they are connecting to the correct sshd2 daemon. Inform your users to notify you if the fingerprints do not match, or if they receive a message about a host-key change and do not receive a corresponding message from you notifying them of the change.

    **************************************************************
    This procedure can help ensure that you do not become a victim of a man-in-the-middle attack, as your users will notify you if the host-key fingerprints do not match. You will also be aware if the users encounter host-key change messages when you have not regenerated your host key pair.
    **************************************************************
    It is also possible to send the public host key to the users via an unalterable method. The users can save the key to the ~/.ssh2/hostkeys directory as key_22_machinename.pub. In this case, manual fingerprint check is not needed. "
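
    For stock OpenSSH (as opposed to ssh.com’s ssh2 shown above), the equivalent fingerprint command would be something like the following, assuming the default host key location:

    $>ssh-keygen -l -f /etc/ssh/ssh_host_dsa_key.pub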

    "Security is a process, not a product."

    Thanks George.

  5. May 12th, 2008 at 01:49 | #5

    "Note: Administrators that have other users connecting to their sshd2 daemon should notify the users of the host-key change."

    How do you notify the entire world, Dietrich? We’re talking about a publicly accessible server with potentially millions of clients. That’s the whole point of having a PKI where the trust relationship is already in place.

  6. May 12th, 2008 at 02:35 | #6

    For a free SFTP client, try FileZilla. You can find it on SourceForge.net.

  7. May 12th, 2008 at 02:38 | #7

    Thanks for the recommendation, Kevin. FileZilla supports both SFTP and FTPS:

    SFTP = SSH File Transfer Protocol (file transfer over an SSH connection)
    FTPS = FTP over SSL or TLS

    http://filezilla-project.org/download.php?type=client

  8. May 12th, 2008 at 06:22 | #8

    ssh is just fine for most situations and a reasonably safe no-cost solution for secure FTP that will work across ALL OS platforms, be they servers or PCs.

    Your recommendation, although good, is a narrow-market, Windows Server 2008-only solution.

    I’ll stick with ssh, thank you very much.

  9. May 12th, 2008 at 11:37 | #9

    An open source solution for multiple platforms, including Linux:
    http://en.wikipedia.org/wiki/CrossFTP_Server

    FileZilla works on the client side for Linux too.

  10. May 12th, 2008 at 15:29 | #10

    http://www.exogenesis.org/notes/ssh.html

    My all-time favorite is the ‘Reverse ssh’ tunnel.

    Thanks George.

  11. May 12th, 2008 at 15:35 | #11

    Will check it out.

  12. May 12th, 2008 at 21:36 | #12

    Web services over SSL allow PUT operations to upload files. There are home-grown Web services APIs (such as del.icio.us) which support file access through a REST HTTP GET method. Of course SSL supports uploading files; where did you get the idea that it did not?
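
    For example, a single-file upload over HTTPS PUT is a one-liner with curl (hypothetical URL and credentials):

    $>curl -T report.zip --user name:password https://www.example.com/uploads/report.zip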

    While Web services expose their own security problems, anything that is based on a web server (such as IIS) is much safer than any random FTP server. FTP servers simply shouldn’t be used in 2008. SSH ports should never be open to the Internet; they should probably be protected with port-knocking or similar measures.

    Sure, organizations use legacy FTP all the time, even under the auspices of compliance. In almost all compliance situations (SOX and PCI come to mind first), FTP servers are under a compensating control. This means that there is a specific exception in place and that it is monitored with other prevention, detection, and monitoring controls.

  13. May 13th, 2008 at 01:16 | #13

    Agreed, Dre, but it’s not as easy as an FTPS application. A web-based solution requires some custom programming and it’s not as easy to use as an FTP interface. You can’t drag a whole bunch of folders into the client to copy some files.

  14. May 13th, 2008 at 22:00 | #14

    For corporate FTPS-like access you may use WebDAV over TLS with a firewall (say, ISA) doing SSL bridging plus pre-authentication plus credentials delegation, maybe something like here:
    http://www.carbonwind.net/ISA/WebDav/WebDav1.htm
    That is way more secure than passing FTPS through a typical Swiss cheese firewall which can’t understand FTPS …
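
    As a rough sketch (hypothetical server name, and assuming the Windows WebClient service is running), the client side can be as simple as mapping the WebDAV folder to a drive letter:

    net use Z: https://www.example.com/dav /user:name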

  15. May 13th, 2008 at 22:00 | #15

    FileZilla also has a good secure FTP server product. We have used both the FileZilla server and client for years and like them.

  16. May 13th, 2008 at 23:04 | #16

    "Agreed Dre, but it’s not as easy as an FTPS application. A web based solution requires some custom programming and it’s not as easy to use as an FTP interface. You can’t drag a whole bunch of folders in to the client to copy some files"

    I don’t want people dragging folders and copying files into FTP servers that serve the data. This is typically known as a way to increase risk to the security of your systems/network/apps and the privacy of said data. FTP doesn’t support role-based access control or declarative controls. It’s typically not a workable model.

    Yes, it requires more thought to have a web form required to access a file for download. But it’s more accessible from a web browser, right? The analytics and customer/user information you get from the web are better than FTP, right?

    FTP just doesn’t have the preventative, monitoring, and detective controls necessary to support a workable model for the risk of threats and vulnerabilities.

    Remember that upgrading a 300GB disk to a 600GB disk doubles the privacy risk to your data.

  17. May 13th, 2008 at 23:37 | #17

    Dre: "I don’t want people dragging folders and copying files into FTP servers that serve the data. This is typically known as a way to increase risk to the security of your systems/network/apps and the privacy of said data. FTP doesn’t support role-based access-control or declarative controls. It’s typically not a workable model."

    Don’t confuse obscurity with security, Dre. You’re saying you oppose a technology simply because it’s easier to use. Just because you force someone to go through a more cumbersome web UI doesn’t make it more secure.

    Dre: "Yes, it requires more thought to have a web-form required to access a file to download. But it’s more accessible from a web browser, right? The analytics and customer/user information you get from the web is better than FTP, right?"

    We’re talking about file transfer, not information gathering. A web UI is very simple for downloading but more cumbersome for uploading, especially for multi-file uploads.

    Dre: "FTP just doesn’t have the preventative, monitoring, and detective controls necessary to support a workable model for the risk of threats and vulnerabilities."

    We’re talking about files coming in, not going out.

    Dre: "Remember than upgrading a 300G disk to a 600G disk doubles the risk of the privacy for your data."

    So what? Living has a risk of dying, so why live? One of the key rules of security (they teach CISSPs this) is availability. Cutting off half the capacity in the name of security is destroying security. This is why no one implements the ultimate form of Internet security, which is to cut off your access to the outside world.

  18. May 14th, 2008 at 00:31 | #18

    I somehow think that FTP is more obscure in the world of HTTP and the web.

    As far as the CIA model goes, there are better models that apply to the new school of information security. While I agree that reliability is tied to security and vice versa, the scalability of data in an organization can be restricted by IdM or user-based/server-based/application-based quotas as a preventative measure against creating "too much data" while still keeping systems up and running. Bigger isn’t always better or as the quote goes, "The bigger they are; the harder they fall".

  19. May 14th, 2008 at 01:25 | #19

    Uploading files is absolutely more burdensome with a web UI. That’s why various social networking sites use special applications that users can download to upload their files, or they use ActiveX-type applications that allow uploading multiple files. Those ActiveX applications don’t always work even in an all-Windows environment, much less a non-Windows environment.

    "Bigger isn’t always better or as the quote goes, "The bigger they are; the harder they fall"."

    You have a knack for misapplying quotations and misunderstanding theory.

  20. May 14th, 2008 at 05:12 | #20

    "You have a knack for misapplying quotations and misunderstanding theory"

    Why all of the personal attacks? I’m merely stating that Web services have already taken over FTP, and that people should move to using Web services. Recommending FTP in 2008 is just not kosher anymore, dood — even if it is SSH-based, SFTP, FTPS, or SSLv3/TLS1 over FTP, et al.

  21. May 14th, 2008 at 09:28 | #21

    "Recommending ActiveX is a sure sign you don’t know anything about security"

    1. I didn’t recommend ActiveX. I said many websites use it to provide more functionality that people want; the sort of rich GUI functionality a web form UI can’t provide.
    2. ActiveX is not a single application; it’s a web-development platform. Saying ActiveX is insecure is sort of like saying C++ is insecure.
    3. ActiveX applications have bugs but it’s nowhere near as bad as something like Apple QuickTime.
    4. Your attack on my knowledge about security is ridiculous and doesn’t have a leg to stand on.

    "Why all of the personal attacks? I’m merely stating that Web services have already taken over FTP, and that people should move to using Web services."

    You stated that we don’t need bigger hard drives because bigger isn’t always better. That’s a ridiculous conclusion IMO.

    You also stated that FTP isn’t necessary and is insecure. I say FTPS is good for some things and you cannot make the security argument against FTPS. I think your conclusion is ridiculous.

    You also stated that all router manufacturers ought to be sued under a class action just because they tack on 1ms (2ms at most) of delay per router. Based on that, you say QoS wouldn’t be necessary if this problem were fixed. That’s one of the most ridiculous things I’ve ever heard.

    So based on these three things you said and believe, which appear to me to be a trend, I think you "have a knack for misapplying quotations and misunderstanding theory". My negative assessment of your understanding of theory is well grounded.

  22. May 14th, 2008 at 10:36 | #22

    "Recommending ActiveX is a sure sign you don’t know anything about security
    1. I didn’t recommend ActiveX."

    Yeah I know you didn’t. This quote only applies to anyone who does. Why are you so defensive? I’m not trying to attack you. Chill out, man!

    The cost of securing ActiveX outweighs any of the benefits to writing applications with it. Comparing ActiveX to C++ is not a conversation that I want to get into. We’ll just settle and agree that they’re totally different monsters. Your comparison of ActiveX to QuickTime is completely unfair considering you just stated that ActiveX is not an application — but rather a development platform. I’m not too familiar with QuickTime internals since I don’t work for Apple, but on the surface it appears as an application and not a computer programming language. You’re breaking your own logic.

    I didn’t say "we don’t need bigger hard drives". I said that organizations should think twice before doubling the size of their drives universally because of the inherent privacy risks. Without controls in place, increasing the size of local filesystems or file shares just means more data that can be compromised.

    FTP is less secure (not insecure) than Web services because it doesn’t and can’t support any authorization beyond the basics. There are plenty of other reasons, but this was the only one that I could glean from the points above that I tried to make about the differences.

    The QoS thread is totally different. I also didn’t say those things… you must have misread what I said. I also think that your theories are just that, theories, and nothing more. If you want to talk QoS / router vendors / etc., I’ll go back to that thread and we can re-hash our discussion and bring it back to center.

  23. May 14th, 2008 at 14:08 | #23

    "You’re breaking your own logic."

    If you read what I said, you’d understand. I compared an ActiveX APPLICATION to QuickTime, which is also an APPLICATION. The comparison is valid because there is no single ActiveX application that is as habitually buggy as Apple QuickTime, Firefox, or Internet Explorer. You’re the one who brought up the entire ActiveX platform and made a blanket statement that it’s insecure, when it’s developers making mistakes with ActiveX that are the problem, just like it’s developers who make coding mistakes in any language.

    "The QoS thread is totally different. I also didn’t say those things"

    Don’t be a liar, Dre, when your comments are in writing. Your most recent post in the other thread on Net Neutrality continues to recklessly call for class-action lawsuits against router manufacturers because routers add a millisecond of delay. You sound like one of those people pushing the theory that the US Government brought down the towers. That’s not what this site is about, and I’ve run out of patience trying to clean up your mess.

  24. May 14th, 2008 at 18:31 | #24

    I have nothing left to respond to except ad hominem attacks. I’ll take it that you agree with my assessment of Web services and FTP?

    Here, I’ll try to bring this back to some sort of semblance of a subject. I would prefer that you stay on-topic and answer questions when they are directed towards you. I would prefer that you respond to the meat of my arguments instead of one-liners that are unrelated to the argument(s).

    "You’re the one who brought up the entire ActiveX platform and made a blanket statement that it’s insecure when it’s developers making mistakes with ActiveX that are the problem just like it’s the developers that make coding mistakes in any language"

    I don’t want to talk about ActiveX. I do want to discuss software security safe guards.

    ActiveX is a good topic for software security, but it’s not unique. Maybe "insecure" is the wrong word (note to self: did I even use that terminology or are you putting words in my mouth again?).

    The correct wording would be "increases the attack surface to an unmanageable level, thus increasing risk to the point where preventative measures and other controls create an unbalanced equation".

    Developers only make mistakes when you give them enough rope to hang themselves. Rope can be a useful tool for going places (you can’t climb a mountain without it), however — 2008 is not the time to be climbing. If you look at the rate and severity of data breaches today compared to 1-2 and 2-5 years ago… things are in really bad shape. Crimeware is extremely prolific, and yes — some of it is ActiveX-based. ActiveX is not immune to the Internet disease that is spreading at unprecedented rates.

    We need to pull back on the developer reins. This is mostly a process issue — something that a Secure Development Lifecycle would seek to solve. It is also a people issue. We need people doing the right things (process), and on the second iteration — we need to improve the tools (i.e. technology) that they use, both on the code itself, as well as with the process.

    If you know anything about the software development lifecycle or have read Boehm’s proofs (and later Jaquith’s, Soo Hoo’s, and Geer’s work related to security bugs), you’ll note that over 80% of bugs are created during the requirements phase, and that the cost to fix these bugs is as much as 200 times more during the maintenance phase.

    Saying "developer mistakes" is the same thing as saying "pilot error". We need a better process, and continuous improvement of process/people/technology, to move on this issue.

    What does this have to do with FTP and Web services? Well, FTP does not meet the baseline security requirements when considering it does not implement a modern AAA stack to help with your precious CIA model. Web services, when properly designed and implemented through a Secure SDLC, will rise to the occasion. It will also have problems, maybe even more in the short term. But as a scalable, long-term solution to reduce risk — it is worth the investment now. I do not believe in a complex Web services stack although this could easily happen. Web services should be about as complex as FTP when implemented for this purpose and not much more.

  25. May 14th, 2008 at 22:54 | #25

    You believe that FTPS is an unacceptable risk and should be banned. You don’t understand that it is not your place to decide what type of technology gets used, and that businesses and users don’t care for your arrogant attitude. You do not own the data or the business and it’s not your place to say "no".

  26. May 14th, 2008 at 23:36 | #26

    I make recommendations like this to the Global 2000 as a trusted advisor on risk and security issues.

    It’s definitely my place, and the businesses/users do care.

  27. May 19th, 2008 at 21:09 | #27

    Grow up.
    We’re talking FTPS… not —

  28. July 18th, 2008 at 04:43 | #28

    After years of working with email usage analysis and then file transfer solutions along with seeing the pains of ftp, I have put together http://www.EzFileSend.com . I represent multiple companies that offer solutions for ftp replacement and email attachment delivery. All of my solutions are completely secure and are offered either as hosted or non-hosted solutions. They are Outlook or Lotus Notes plug-ins that are as easy to use as sending an email and can even be automated for size and file type. All with zero impact on your email infrastructure. We even have unlimited file size options. I have negotiated pricing with these companies for Enterprise Companies. 100% get usage, pay for only 10%.
    Thanks,
    Douglas
    Douglas@EzFileSend.com
    http://www.EzFileSend.com

  29. August 12th, 2009 at 15:21 | #29

    There is technically a difference between the less expensive and more expensive SSL certificates: the chain. All of the less expensive (GoDaddy, et al) certificates I am asked to install for my customers require at least one intermediate certificate. This is a certificate used to sign your SSL certificate, which itself is signed by a trusted root. Without the intermediate, your SSL certificate becomes invalid to the end-user because the chain is broken and there is no path to a trusted root.

    The more expensive certificates are often signed by trusted roots already installed in the client operating system, browser, or SSL toolkit. This eliminates the requirement for an additional certificate installation or configuration. Installing the intermediate certificate is fairly simple in Windows/IIS; some *nix programs have different file format requirements: each cert in a different file, or combined in a single file in some particular order.

    Certificates not signed by a trusted root can also be problematic for some mobile devices. Of course, even certs signed by trusted root may be problematic if the root is not part of the device’s root store and the device is locked by the carrier to prevent installation of new roots.

    I use GoDaddy’s multiple host name certificates (can’t think of the designation, but they use X509v3 Subject Alternative Name extensions) with little problem, even with the intermediate certificate requirements. The only issue is the missing signing root (ValiCert Class 2 Policy Validation Authority) on some mobile devices, which I mitigate by installing the root manually — again, not available for all devices.

    So there technically is a difference between them. While the differences may be transparent to the end-user, in certain cases they are certainly visible to the administrator.
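
    As a sketch with hypothetical file names, you can verify a chain with OpenSSL and build the combined file some *nix servers expect (server certificate first, then the intermediate):

    $>openssl verify -CAfile root.pem -untrusted intermediate.pem server.pem
    $>cat server.pem intermediate.pem > chain.pem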

    • August 12th, 2009 at 21:06 | #30

      This is not true. You need to install an intermediate certificate from GoDaddy only when you’re installing the certificate on a Windows Server 2008 machine. On every client I’ve ever tested, I’ve never had a problem with trusting an SSL certificate from GoDaddy.

  30. September 24th, 2009 at 06:27 | #31

    I have to say, SSH was the best thing they ever came out with. You can’t beat its security or its reliability.
